Neuron Digest Friday, 16 Jul 1993 Volume 11 : Issue 45
Today's Topics:
Administrivia - ND on holiday for two weeks
Neuroprose entry - extracting and learning "grammar"
Paper available in Neuroprose
TR - Models of Reading aloud
SUMMARY: From neurobiological to computational models - State of the art?
Re: SUMMARY: From neurobiological to computation
Post-doc in Neurophysiology...
PostDoc positions in Korea
Cambridge Neural Nets Summer School
POSITION AVAILABLE - STATISTICIAN
Research Opportunities in Neural Networks
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Administrivia - ND on holiday for two weeks
From: "Neuron-Digest Moderator, Peter Marvit" <neuron@cattell.psych.upenn.edu>
Date: Fri, 16 Jul 93 18:44:01 -0500
Dear readers,
Due to a last minute change of plans, Neuron Digest will go on holiday
for the next two weeks (a little earlier and longer than I had planned).
I will return on August 3, so do not get worried if you do not hear from
me until later during that first week in August.
Thanks to all, as always, for your continued readership and support.
Apologies to contributors who must wait.
-Peter
: Peter Marvit, Neuron Digest Moderator :
: Email: <neuron-request@cattell.psych.upenn.edu> :
: Courtesy of the Psychology Department, University of Pennsylvania :
: 3815 Walnut St., Philadelphia, PA 19104 w:215/898-6274 h:215/387-6433 :
------------------------------
Subject: Neuroprose entry - extracting and learning "grammar"
From: giles@research.nj.nec.com (Lee Giles)
Date: Tue, 18 Feb 92 13:49:26 -0500
[[ Editor's Note: Personal apologies to Lee for the slight (!) delay. -PM ]]
The following paper has been placed in the Neuroprose archive.
Comments and questions are invited.
*******************************************************************
--------------------------------------------
EXTRACTING AND LEARNING AN "UNKNOWN" GRAMMAR
WITH RECURRENT NEURAL NETWORKS
--------------------------------------------
C.L. Giles*, C.B. Miller, D. Chen, G.Z. Sun, H.H. Chen, Y.C. Lee
NEC Research Institute *Institute for Advanced Computer Studies
4 Independence Way Dept. of Physics & Astronomy
Princeton, N.J. 08540 University of Maryland
giles@research.nj.nec.com College Park, Md 20742
___________________________________________________________________
- -------------------------------------------------------------------
Abstract
--------
Simple second-order recurrent networks are shown to readily
learn small known regular grammars when trained with positive
and negative string examples. We show that similar methods
are appropriate for learning "unknown" grammars from examples
of their strings. The training algorithm is an incremental
real-time recurrent learning (RTRL) method that computes the
complete gradient and updates the weights at the end of each
string. After or during training, a dynamic clustering algorithm
extracts the production rules that the neural network has
learned. The methods are illustrated by extracting rules from
unknown deterministic regular grammars. For many cases the
extracted grammar outperforms the neural net from which it
was extracted in correctly classifying unseen strings.
(To be published in Advances in Neural Information Processing
Systems 4, J.E. Moody, S.J. Hanson and R.P. Lippmann (eds.)
Morgan Kaufmann, San Mateo, Ca 1992).
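[[ The second-order state update described in the abstract can be sketched
roughly as follows. This is an illustrative reconstruction, not the authors'
code; the weight tensor W is random and untrained, so the accept/reject
decision is meaningless until the network is trained. ]]

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run_second_order_rnn(W, string, n_states, n_symbols):
    """Forward pass of a second-order recurrent network.

    W has shape (n_states, n_states, n_symbols): the next state is
    s[j] = sigmoid(sum_{i,k} W[j, i, k] * s[i] * x[k]),
    where x is a one-hot encoding of the current input symbol.
    """
    s = np.zeros(n_states)
    s[0] = 1.0                      # designated start state
    for symbol in string:           # string is a sequence of symbol indices
        x = np.zeros(n_symbols)
        x[symbol] = 1.0
        s = sigmoid(np.einsum('jik,i,k->j', W, s, x))
    return s[0] > 0.5               # neuron 0 acts as the accept unit

rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(4, 4, 2))   # untrained, illustrative weights
print(run_second_order_rnn(W, [0, 1, 1], n_states=4, n_symbols=2))
```

In the paper's setting, the extracted grammar comes from clustering the
trajectories of the state vector s, with each cluster treated as a DFA state.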
********************************************************************
Filename: giles.nips91.ps.Z
- ----------------------------------------------------------------
FTP INSTRUCTIONS
unix% ftp archive.cis.ohio-state.edu (or 128.146.8.52)
Name: anonymous
Password: anything
ftp> cd pub/neuroprose
ftp> binary
ftp> get giles.nips91.ps.Z
ftp> bye
unix% zcat giles.nips91.ps.Z | lpr
(or whatever *you* do to print a compressed PostScript file)
- ----------------------------------------------------------------
^^^^^^^^^^^^^^^^^^^^^^^cut here^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C. Lee Giles
NEC Research Institute
4 Independence Way
Princeton, NJ 08540
USA
Internet: giles@research.nj.nec.com
UUCP: princeton!nec!giles
PHONE: (609) 951-2642
FAX: (609) 951-2482
------------------------------
Subject: Paper available in Neuroprose
From: arun@hertz.njit.edu (arun maskara spec lec cis)
Date: Mon, 16 Mar 92 13:46:08 -0500
The following paper is now available by ftp from the Neuroprose archive:
Forcing Simple Recurrent Neural Networks to Encode Context
Arun Maskara, New Jersey Institute of Technology,
Department of Computer and Information Sciences
University Heights, Newark, NJ 07102, arun@hertz.njit.edu
Andrew Noetzel, The William Paterson College,
Department of Computer Science, Wayne, NJ 07470
Abstract
The Simple Recurrent Network (SRN) is a neural network model that has been
designed for the recognition of symbol sequences. It is a back-propagation
network with a single hidden layer of units. The symbols of a sequence are
presented one at a time at the input layer. But the activation pattern in
the hidden units during the previous input symbol is also presented as an
auxiliary input. In previous research, it has been shown that the SRN
can be trained to behave as a finite state automaton (FSA) which accepts the
valid strings corresponding to a particular grammar and rejects the invalid
strings. It does this by predicting each successive symbol in the input string.
However, the SRN architecture sometimes fails to encode the context necessary to
predict the next input symbol. This happens when two different states in the FSA
generating the strings have the same output, and the SRN develops similar hidden
layer encodings for these states. The failure happens more often when the
number of units in the hidden layer is limited. We have developed a new
architecture,
called the Forced Simple Recurrent Network (FSRN), that solves this problem.
This architecture contains additional output units, which are trained to show
the current input and the previous context. Simulation results show that for
certain classes of FSA with $u$ states, the SRN with $\lceil \log_2 u \rceil$
units in the hidden layer fails, whereas the FSRN with the same number of
hidden layer units succeeds.
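[[ The SRN forward pass described above (current symbol plus the previous
hidden activation fed back as an auxiliary input, predicting the next symbol)
can be sketched as follows. The weights here are random and untrained, purely
for illustration. ]]

```python
import numpy as np

def srn_step(h_prev, x, W_xh, W_hh, W_hy):
    """One step of a Simple Recurrent (Elman) Network.

    The previous hidden activation h_prev acts as the auxiliary context
    input alongside the current one-hot symbol x; the output is a
    softmax prediction of the next symbol.
    """
    h = np.tanh(W_xh @ x + W_hh @ h_prev)      # mix context with current input
    y = np.exp(W_hy @ h)
    return h, y / y.sum()                      # new context, next-symbol probs

n_sym, n_hid = 3, 2                            # e.g. ceil(log2 u) units for u=4
rng = np.random.default_rng(1)
W_xh = rng.normal(size=(n_hid, n_sym))
W_hh = rng.normal(size=(n_hid, n_hid))
W_hy = rng.normal(size=(n_sym, n_hid))

h = np.zeros(n_hid)
for sym in [0, 2, 1]:                          # feed a symbol sequence
    x = np.eye(n_sym)[sym]
    h, probs = srn_step(h, x, W_xh, W_hh, W_hy)
print(probs)                                   # distribution over next symbols
```

The FSRN of the paper would add further output units trained to reproduce the
current input and previous context, forcing distinct hidden encodings for FSA
states that share an output.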
- -------------------------------------------------------------------------------
Copy of the postscript file has been placed in neuroprose archive. The
file name is maskara.fsrn.ps.Z
The usual instructions can be followed to obtain the file from the
directory pub/neuroprose from the ftp site archive.cis.ohio-state.edu
Arun Maskara
------------------------------
Subject: TR - Models of Reading aloud
From: Max Coltheart <mcolthea@laurel.ocs.mq.edu.au>
Date: Tue, 24 Mar 92 08:20:24 +1000
Models Of Reading Aloud: Dual-Route And Parallel-Distributed-Processing
Approaches
Max Coltheart, Brent Curtis and Paul Atkins
School of Behavioural Sciences
Macquarie University
Sydney NSW 2109
Australia
email: max@currawong.mqcc.mq.oz.au
Submitted for publication March 23, 1992.
Abstract
It has often been argued that various facts about skilled reading aloud cannot
be explained by any model unless that model possesses a dual-route
architecture: one route from print to speech that may be described as lexical
(in the sense that it operates by retrieving pronunciations from a mental
lexicon) and another route from print to speech that may be described as
non-lexical (in the sense that it computes pronunciations by rule, rather
than by retrieving them from a lexicon). This broad claim has been challenged
by Seidenberg and McClelland (1989, 1990). Their model has but a single route
from print to speech, yet, they contend, it can account for major facts about
reading which have hitherto been claimed to require a dual-route architecture.
We identify six of these major facts about reading. The one-route model
proposed by Seidenberg and McClelland can account for the first of these, but
not the remaining five: how people read nonwords aloud, how they perform
visual lexical decision, how two particular forms of acquired dyslexia can
arise, and how different patterns of developmental dyslexia can arise.
Since models with dual-route architectures can explain all six of these
basic facts about reading, we suggest that this remains the viable
architecture for any tenable model of skilled reading and learning to read.
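[[ As a caricature of the dual-route idea (not the authors' model): the
lexical route is a lookup table, while the non-lexical route assembles a
pronunciation by rule. The mini-lexicon and grapheme-phoneme rules below are
invented for illustration only. ]]

```python
# Hypothetical mini-lexicon and grapheme-phoneme rules, for illustration.
LEXICON = {"yacht": "/jɒt/", "have": "/hav/"}          # lexical route: lookup
GP_RULES = [("ave", "/eɪv/"), ("m", "/m/"), ("h", "/h/"),
            ("a", "/a/"), ("v", "/v/"), ("e", "")]      # non-lexical route

def nonlexical(word):
    """Assemble a pronunciation by rule, longest grapheme first."""
    out, i = "", 0
    while i < len(word):
        for graph, phon in GP_RULES:
            if word.startswith(graph, i):
                out += phon.strip("/")
                i += len(graph)
                break
        else:
            i += 1                                      # skip unknown letters
    return "/" + out + "/"

def read_aloud(word):
    # The lexical route wins for known (including irregular) words;
    # the rule route handles novel words and nonwords.
    return LEXICON.get(word) or nonlexical(word)

print(read_aloud("have"))   # known irregular word: retrieved from the lexicon
print(read_aloud("mave"))   # nonword: assembled by rule
```

The point of contention in the paper is whether such a second, rule-based
route is needed at all, or whether one trained network can cover both cases.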
Preprints available from MC at the above address.
------------------------------
Subject: SUMMARY: From neurobiological to computational models - State of the art?
From: massimo@cui.unige.ch (DEFRANCESCO Massimo)
Organization: University of Geneva, Switzerland
Date: 04 Jul 93 08:25:03 +0000
[[ Editor's Note: This is from the Neuroscience mailing list. -PM ]]
A couple of weeks ago I posted a request on the net for the state
of the art in neuromodelling, with emphasis on the computational side.
I received 13 answers, 4 of which asked for the summary.
Many thanks to the following people (hope I forgot nobody):
Richard E. Myers (rmyers@ics.uci.edu)
Jacques Brisson (X042@hec.ca)
Christopher Ian Connolly (connoly@cs.umass.edu)
Prashanth Kumar (akp@cns.nyu.edu)
Tobias Fabio Christen (tchriste@iiic.ethz.ch)
Joern Erbguth (jnerbgut@cip.informatik.uni-erlangen.de)
Drago Indjic (d.indjic@ic.ac.uk)
Thomas P. Vogl (vogl@helix.nih.gov)
German Cavelier (cavelier@smaug.cns.caltech.edu)
======================
Original question:
>I need your precious help to find out what is the state of the art
>in neuromodelling of the human brain AND derivation from the neurological
>model of "practical", computer-oriented, artificial neural network
>models. We are going to start a research that will a) study the
>behaviour of real neurons at the neurophysiological level, b) develop
>a theoretical (biological) model able to explain the observations,
>and c) develop from it an artificial neural network (ANN) model usable in
>practical applications.
>We are aware of at least one ANN model which was heavily derived from
>neurophysiological investigations of neurons in the hippocampus, i.e. the
>Dystal model (Alkon et al).
>We are heavily interested in references/pointers to any work of this kind.
>Email is preferred because it is faster. Feel free to post anyway.
>I will compile a summary of the answers that I'll receive privately.
=======================
Richard E. Myers recommends looking at the work of Gary Lynch and Richard
Granger. References included:
1. Gluck MA; Granger R.
Computational models of the neural bases of learning and memory.
Annual Review of Neuroscience, 1993, 16:667-706.
(UI: 93213082)
Pub type: Journal Article; Review; Review, Academic.
2. Ambros-Ingerson J; Granger R; Lynch G.
Simulation of paleocortex performs hierarchical clustering.
Science, 1990 Mar 16, 247(4948):1344-8.
ABSTRACT available. (UI: 90193697)
3. Granger R; Lynch G.
Higher olfactory processes: perceptual learning and memory.
Current Opinion in Neurobiology, 1991 Aug, 1(2):209-14.
ABSTRACT available. (UI: 92330264)
Pub type: Journal Article; Review; Review, Tutorial.
4. Anton PS; Lynch G; Granger R.
Computation of frequency-to-spatial transform by olfactory bulb glomeruli.
Biological Cybernetics, 1991, 65(5):407-14.
ABSTRACT available. (UI: 92075782)
=============================
Jacques Brisson (X042@hec.ca) suggests getting hold of the November 1992
issue of Trends in Neurosciences. It is a special issue on nervous system
modelling. [thanks Jacques, we did]
=============================
Christopher Ian Connolly (connoly@cs.umass.edu) suggests the following
article:
Connolly CI, Burns JB, "A Model for the Functioning of the Striatum",
Biological Cybernetics 68(6):535-544.
It discusses a method for robot control and a plausible correlate in
networks of medium spiny cells ("matrisomes") of the striatum.
=============================
Prashanth Kumar A.K. (akp@cns.nyu.edu) writes:
The work of Ken Miller might be of interest to you (E-Mail: ken@caltech.edu).
He worked with Mike Stryker and developed models for the development of ODC
and now for orientation selectivity.
=============================
Drago Indjic (d.indjic@ic.ac.uk) suggests taking a look at the work of
Kryukov et al. described in a few books from Manchester University Press
(1990): "Attention and Neural Networks" and "Stochastic Cellular Systems".
Kryukov is continuing the work of Ukhtomsky (a competitor of the Pavlov
reflex school), which is based on many decades of experimental work.
=============================
Thomas P. Vogl (vogl@helix.nih.gov) writes:
look at the paper by Blackwell et al in the July '92 issue of "Pattern
Recognition" ; also the paper by Werness et al (particularly some of the
references therein) in the December '92 issue of "Biological Cybernetics"
=============================
German Cavelier (cavelier@smaug.cns.caltech.edu) suggests asking David
Bilitch (dhb@smaug.cns.caltech.edu) for information about GENESIS.
============================
Joern Erbguth sent me a very interesting and long summary he had posted
some days before my request on neuromodelling. Since his summary has
already been posted to bionet.neuroscience, I won't include it here (but
will include it in my summary for comp.ai.neural-nets).
=============================
That's all. Thanks again for your help. The net is a real pleasure.
Massimo
_______________________________________________________________________
Massimo de Francesco email: massimo@cui.unige.ch
Research assistant
Computer Science Center
University of Geneva
Switzerland
------------------------------
Subject: Re: SUMMARY: From neurobiological to computation
From: massimo@cui.unige.ch (DEFRANCESCO Massimo)
Organization: University of Geneva, Switzerland
Date: 04 Jul 93 08:43:49 +0000
In article 28366@news.unige.ch, massimo@cui.unige.ch (DEFRANCESCO Massimo) writes:
>
>A couple of weeks ago I posted a request on the net for the state
>of the art in neuromodelling, with emphasis on the computational side.
>I received 13 answers, 4 of which were asking for the summary.
>
>Many thanks to the following people (hope I forgot nobody):
>
Well, I did. Here is the reply of Tobias Fabio Christen (tchriste@iiic.ethz.ch):
We are a team at Ciba SA in Basel that has members from the biological
side and the electrical side, and me as a computer scientist. Through
connections of my boss (Thomas Knoepfel) we have contact and personal
communication with people from the EPFZ (department for theoretical
physics) and the Brain Research Center at the UNI ZH. At the moment we
are working on a simulation of presynaptic modulation via auto- and
heteroreceptors (though we are still way apart from the definitive
model).
Depending on whether you want a purely electrophysiological model or
whether you want to bring in some biochemical aspects (especially for
synaptic transmission and second messenger gadgets), you need different
approaches and literature (these two worlds seem to avoid each other).
For the former case, a good introduction to the evolution from biological
to technical nets is:
1
AU - Koch C
AU - Segev I
TI - Methods in Neuronal Modeling
ED - Koch C
ED - Segev I
BK -
SO - MIT Press ,Cambridge, MA 1989;1:0-0
2
AU - Traub RD
AU - Miles R
TI - Neuronal Networks of the Hippocampus
ED - Traub RD
ED - Miles R
BK -
SO - Cambridge University Press ,New York 1991;1:0-0
On the biochemical side, little or nothing has been done on generalizing
models, and only a little literature is available (if you need it, feel
free to contact me).
toby
Thanks again!
Massimo
------------------------------
Subject: Post-doc in Neurophysiology...
From: pck@castle.ed.ac.uk (P C Knox)
Organization: Edinburgh University
Date: 06 Jul 93 08:20:45 +0000
---- LABORATORY FOR NEUROSCIENCE ----
University of Edinburgh
Applications are invited for a post-doctoral research post for
up to three years to join a group working on the physiology of the
control of gaze (see Donaldson & Knox, Neuroscience 38:145-161, 1990;
Knox & Donaldson, Proc.Roy.Soc.Lond.B. 246:243-250, 1991; Hayman et al,
J.Physiol (Lond) 459:458P, 1993).
Experience in single unit electrophysiology is essential and
some experience of anatomical neural tracing techniques using
transported markers would be an advantage. Salary on the AR1A scale
with placement according to age and experience. Initial salary up to
£17,379, which includes a Wellcome Trust supplement. Informal enquiries:
telephone 031-650-3526
Applications (please quote REF: NA 930215) including full CV and
the names and addresses of two academic referees, should be submitted
to:
The Personnel Office,
The University of Edinburgh,
1 Roxburgh Street,
Edinburgh EH8 9TB
The closing date is 31st July, 1993.
------------------------------
Subject: PostDoc positions in Korea
From: sbcho@gorai.kaist.ac.kr (Kang)
Organization: KTRC in Seoul, Korea
Date: 07 Jul 93 10:04:01 +0000
Please pass this advertisement to anyone who you think
might be interested.
Thanks in advance.
Sung-Bae Cho
=======================================================================
Korea Advanced Institute of Science and Technology
Computer Science Department
and
Center for Artificial Intelligence Research
Post Doctoral Researchers in AI & Pattern Recognition
Two one-year Post Doctoral Researcher positions are available with
the AI & Pattern Recognition Group. The main project is in the area
of 'Cursive handwriting recognition with hidden Markov model and/or
artificial neural networks,' and will aim to explore the feasibility
of psychological and cognitive scientific studies. Pattern recognition
issues such as pattern discrimination and modeling power will be
investigated. Researchers in other AI fields are also welcome to apply.
For further particulars and an application form, contact Dr. Jin H.
Kim, Computer Science Department, KAIST, 373-1, Koosung-dong,
Yoosung-ku, Taejeon 305-701, Republic of Korea. Phone 82 42 869 3517,
E-mail: jkim@cs.kaist.ac.kr.
The Center follows an equal opportunities policy.
=======================================================================
********
* ******* Sung-Bae Cho
** ** Computer Science Department
*** *** *** *** Korea Advanced Institute of Science and Technology
*** *** *** *** 373-1, Goosung-dong, Yoosung-ku,
*** *** *** *** Taejeon 305-701, South Korea
** **
******* * Phone : 82-42-869-3557
******** e-mail: sbcho@gorai.kaist.ac.kr
------------------------------
Subject: Cambridge Neural Nets Summer School
From: Richard Prager <rwp@eng.cam.ac.uk>
Date: Fri, 09 Jul 93 11:38:20 +0000
The Cambridge University Programme for Industry in Collaboration
with the Cambridge University Engineering Department Announce
their Third Annual Neural Networks Summer School.
3 1/2 day short course
13-16 September 1993
BOURLARD GEE HINTON JERVIS
JORDAN KOHONEN NARENDRA NIRANJAN
PECE PRAGER SUTTON TARRASENKO
Outline and aim of the course
The course will give a broad introduction to the application and design of
neural networks and deal with both the theory and with specific
applications. Survey material will be given, together with recent
research results in architecture and training methods, and applications
including signal processing, control, speech, robotics and human vision.
Design methodologies for a number of common neural network architectures
will be covered, together with the theory behind neural network
algorithms. Participants will learn the strengths and weaknesses of the
neural network approach, and how to assess the potential of the technology
in respect of their own requirements.
Lectures are being given by international experts in the field, and
delegates will have the opportunity of learning first hand the technical
and practical details of recent work in neural networks from those who are
contributing to those developments.
Who Should Attend
The course is intended for engineers, software specialists and other
scientists who need to assess the current potential of neural networks.
The course will be of interest to senior technical staff who require an
overview of the subject, and to younger professionals who have recently
moved into the field, as well as to those who already have expertise in
this area and who need to keep abreast of recent developments. Some,
although not all, of the lectures will involve graduate level mathematical
theory.
PROGRAMME
Introduction and overview:
Connectionist computing: an introduction and overview
Programming a neural network
Parallel distributed processing perspective
Theory and parallels with conventional algorithms
Architectures:
Pattern processing and generalisation
Bayesian methods in neural networks
Reinforcement learning neural networks
Communities of expert networks
Self organising neural networks
Feedback networks for optimization
Applications:
Classification of time series
Learning forward and inverse dynamical models
Control of nonlinear dynamical systems using neural networks
Artificial and biological vision systems
Silicon VLSI neural networks
Applications to diagnostic systems
Shape recognition in neural networks
Applications to speech recognition
Applications to mobile robotics
Financial system modelling
Applications in medical diagnostics
LECTURERS
DR HERVE BOURLARD is with Lernout & Hauspie Speech Products in
Brussels. He has made many contributions to the subject particularly in
the area of speech recognition.
MR ANDREW GEE is with the Speech, Vision and Robotics Group of
the Cambridge University Engineering Department. He specialises in the
use of neural networks for solving complex optimization problems.
PROFESSOR GEOFFREY HINTON is in the Computer Science Department
at the University of Toronto. He was a founding member of the PDP
research group and is responsible for many advances in the subject
including the classic back-propagation paper.
MR TIMOTHY JERVIS is with Cambridge University Engineering
Department. His interests lie in the field of neural networks and in
the application of Bayesian statistical techniques to learning control.
PROFESSOR MICHAEL JORDAN is in the Department of Brain & Cognitive Science
at MIT. He was a founding member of the PDP research group and he made
many contributions to the subject particularly in forward and inverse
systems.
PROFESSOR TEUVO KOHONEN is with the Academy of Finland and Laboratory of
Computer and Information Science at Helsinki University of Technology.
His specialities are in self-organising maps and their applications.
PROFESSOR K S NARENDRA is with the Center for Systems Science in the
Electrical Engineering Department at Yale University. His interests are
in the control of complex systems using neural networks.
DR MAHESAN NIRANJAN is with the Department of Engineering at Cambridge
University. His specialities are in speech processing and pattern
classification.
DR ARTHUR PECE is in the Physiological laboratory at the University of
Cambridge. His interests are in biological vision and especially neural
network models of cortical vision.
DR RICHARD PRAGER is with the Department of Engineering at Cambridge
University. His specialities are in speech and vision processing using
artificial neural systems.
DR RICH SUTTON is with the Adaptive Systems Department of GTE Laboratories
near Boston, USA. His specialities are in reinforcement learning,
planning and animal learning behaviour.
DR LIONEL TARRASENKO is with the Department of Engineering at the
University of Oxford. His specialities are in robotics and the hardware
implementation of neural computing.
COURSE FEES AND ACCOMMODATION
The course fee is 750 (UK pounds), payable in advance, and includes full
course notes, a certificate of attendance, and lunch and day-time
refreshments for the duration of the course. A number of heavily
discounted places are available for academics; please contact Renee Taylor
if you would like to be considered for one of these places. Accommodation
can be arranged for delegates in college rooms with shared facilities at
Wolfson College at 163 (UK pounds) for 4 nights to include bed and
breakfast, dinner with wine and a Course Dinner.
For more information contact: Renee Taylor, Course Development Manager
Cambridge Programme for Industry, 1 Trumpington Street, Cambridge CB2 1QA,
United Kingdom tel: +44 (0)223 332722 fax +44 (0)223 301122
email: rt10005@uk.ac.cam.phx
------------------------------
Subject: POSITION AVAILABLE - STATISTICIAN
From: Phil Goodman <goodman@unr.edu>
Date: Mon, 12 Jul 93 22:59:14 +0000
******************* Professional Position Announcement ******************
"STATISTICIAN for NEURAL NETWORK & REGRESSION DATABASE RESEARCH"
.- - - - - - - - - - - - - - OVERVIEW - - - - - - - - - - - - - - - - -.
| |
| THE LOCATION: |
| Nevada's Reno/Lake Tahoe region is an outstanding environment for |
| living, working, and raising a family. Winter skiing is world-class,|
| summer recreation includes many mountain and water sports, and |
| historical exploration and cultural opportunities abound. |
| |
| THE PROJECT: |
| The new CENTER FOR BIOMEDICAL MODELING RESEARCH recently received |
| federal funding to refine and apply a variety of artificial neural |
| network algorithms to large cardiovascular health care databases. |
| |
| THE CHALLENGE: |
| The predictive performance of neural nets will be compared to |
| advanced regression models. Other comparisons to be made include |
| handling of missing and noisy data, and selection of important |
| interactions among variables. |
| |
| THE JOB REQUIREMENT: |
| Masters-level or equivalent statistician with working knowledge |
| of the SAS statistical package and the UNIX operating system. |
| |
| THE SALARY : |
| Approximate starting annual salary: $42,000 + full benefits . |
| (actual salary will depend on experience and qualifications) |
._ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ .
POSITION: Research Statistics Coordinator for
NEURAL NETWORKS / HEALTH CARE DATABASE PROJECT
LOCATION: Center for Biomedical Modeling Research
Department of Internal Medicine
University of Nevada School of Medicine
Washoe Medical Center, Reno, Nevada
START DATE: September 1, 1993
CLOSING DATE: Open until filled.
DESCRIPTION: Duties include acquisition and translation of data
from multiple external national sources; data management and archiving;
performance of exploratory and advanced regression statistics;
performance of artificial neural network processing; participation
in scholarly research and publications.
QUALIFICATIONS: (1) M.S., M.A., M.P.H. or equivalent training in
statistics with experience in logistic and Cox regression analyses,
(2) ability to program in the SAS statistical language, and
(3) experience with UNIX computer operating systems.
Desirable but not mandatory are the abilities to use
(4) the S-PLUS data management system and (5) the C programming language.
SALARY: Commensurate with qualifications and experience.
(For example, with database experience, typical annual
salary would be approximately $42,000 + full benefits.)
APPLICATION: > Informal inquiry may be made to:
Phil Goodman, Director, Center for Biomedical Modeling Research
Internet: goodman@unr.edu Phone: 702-328-4867
> Formal consideration requires a letter of application,
vita, and names of three references sent to:
Philip Goodman, MD, MS
Director, Center for Biomedical Modeling Research
University of Nevada School of Medicine
Washoe Medical Center, Room H1-166
77 Pringle Way, Reno, NV 89520
The University of Nevada is an Equal Opportunity/Affirmative Action
employer and does not discriminate on the basis of race, color,
religion, sex, age, national origin, veteran's status or handicap
in any program it operates. University of Nevada employs only U.S.
citizens and aliens lawfully authorized to work in the United States.
************************************************************************
------------------------------
Subject: Research Opportunities in Neural Networks
From: rohwerrj <rohwerrj@cs.aston.ac.uk>
Date: Tue, 13 Jul 93 12:49:52 +0000
*****************************************************************************
RESEARCH OPPORTUNITIES in NEURAL NETWORKS
Dept. of Computer Science and Applied Mathematics
Aston University
*****************************************************************************
Funding has recently become available for up to 6 PhD studentships and
up to 3 postdoctoral fellowships in the Neural Computing Research
Group at Aston University. This group is currently undergoing a major
expansion with the recent appointments of Professor Chris Bishop
(formerly head of the Applied Neurocomputing Centre at AEA Technology,
Harwell Laboratory) and Professor David Lowe (formerly head of the
neural network research group at DRA, Malvern), joining Professor
David Bounds and lecturers Richard Rohwer and Alan Harget. In
addition, substantial funds are being invested in new computer
hardware and software and other resources, which will provide the
Group with extensive research facilities.
The research programme of the Group is focussed on the development of
neural computing techniques from a sound statistical pattern
processing perspective. Research topics span the complete range from
developments of the theoretical foundations of neural computing,
through to a wide range of application areas. The Group maintains
close links with several industrial organisations, and is
participating in a number of collaborative projects.
For further information, please contact me at the address below:
Richard Rohwer
Dept. of Computer Science and Applied Mathematics
Aston University
Aston Triangle
Birmingham B4 7ET
ENGLAND
Tel: (44 or 0) (21) 359-3611 x4688
FAX: (44 or 0) (21) 333-6215
rohwerrj@uk.ac.aston.cs
------------------------------
End of Neuron Digest [Volume 11 Issue 45]
*****************************************