Neuron Digest	Monday, 25 Jan 1993		Volume 11 : Issue 6 

Today's Topics:
Connectionist Models Summer School 1993
Degeneracy of weights in networks?
Biologically Plausible Dynamic Artificial Neural Networks
Re: Biologically Plausible Dynamic Artificial Neural Networks
Re: Biologically Plausible Dynamic Artificial Neural Networks
Re: Biologically Plausible Dynamic Artificial Neural Networks
Re: Biologically Plausible Dynamic Artificial Neural Networks
Job Offer - signal processing
Postdoc in neural signal processing theory
Computational Biology Postdoc
Job announcement - Siemens Corp.
Teaching post at U. Western Ontario


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Connectionist Models Summer School 1993
From: nan@central.cis.upenn.edu (Nan Biltz)
Organization: University of Pennsylvania
Date: 15 Dec 92 22:09:59 +0000


CONNECTIONIST MODELS SUMMER SCHOOL
University of Colorado
June 21 - July 3, 1993

The University of Colorado will host the 1993 Connectionist Models Summer
School from June 21 - July 3, 1993. The purpose of the summer school is
to provide training to promising young researchers in connectionism
(neural networks) by leaders of the field and to foster interdisciplinary
collaboration. This will be the fourth such program in a series that was
held at Carnegie Mellon in 1986 and 1988, and at UC San Diego in 1990.
Previous summer schools have been extremely successful and the organizers
look forward to the 1993 session with anticipation of another exciting
event.

The summer school will offer courses in many areas of connectionist
modeling, with emphasis on artificial intelligence, cognitive
neuroscience, cognitive science, computational methods, and theoretical
foundations. Visiting faculty (see list of invited faculty below) will
present daily lectures and tutorials, coordinate informal workshops, and
lead small discussion groups. The summer school schedule is designed to
allow for significant interaction among students and faculty. As in
previous years, a proceedings of the summer school will be published.

Applications will be considered only from graduate students currently
enrolled in Ph.D. programs. About 50 students will be accepted.
Admission is on a competitive basis. Tuition will be covered for all
students, and it is expected that scholarships will be available to
subsidize housing and meal costs, which will run approximately $300.
Applications will be accepted from foreign students, with the caveat that
students are responsible for their own travel expenses.

Applications should include the following materials:

* a one-page statement of purpose, explaining major areas of interest and
prior background in connectionist modeling and neural networks.

* a vita, including academic history, list of publications (if any), and
relevant courses taken with instructors' names and grades received.

* two letters of recommendation from individuals familiar with the applicant's
work; and

* if room and board is requested, a statement from the applicant
describing potential sources of financial support available
(department, advisor, etc.) and the estimated extent of need. It is
hoped that sufficient scholarship funds will be available to provide
room and board to all accepted students regardless of financial need.

Applications should be sent to:

Connectionist Models Summer School
Institute of Cognitive Science
University of Colorado
Boulder, CO 80309-0344

All application materials must be received by MARCH 1st, 1993. Decisions
about acceptance and scholarship awards will be announced around April
15. If you have further questions, write to the address above or send
electronic mail to: cmss@boulder.colorado.edu.

Organizing Committee:

Jeff Elman (UC San Diego)
Mike Mozer (Univ of Colorado)
Paul Smolensky (Univ of Colorado)
Dave Touretzky (Carnegie Mellon)
Andreas Weigend (Xerox PARC, Univ of Colorado)

Additional Faculty will include:

Andy Barto (Univ of Mass, Amherst)
Jack Cowan (Univ of Chicago)
David Haussler (UC Santa Cruz)
Geoff Hinton (Univ of Toronto)
John Kruschke (Indiana Univ)
Steve Nowlan (Salk Institute)
Ennio Mingolla (Boston Univ)
Jay McClelland (Carnegie Mellon)
Dave Plaut (Carnegie Mellon)
Jordan Pollack (Ohio State)
Dave Rumelhart (Stanford)
Terry Sejnowski (UC San Diego and Salk Inst)

The Summer School is sponsored in part by the American Association for
Artificial Intelligence and Siemens Research Center.

------------------------------

Subject: Degeneracy of weights in networks?
From: jjb@watson.ibm.com
Date: Wed, 30 Dec 92 10:45:37 -0500

Ref: Your append to NEURON DIGEST (FORUMS) at 22:24:23 on 92/12/28

(I hope this is the right address to ask questions of the
neurons-in-the-know.)

Before training a homogeneous artificial neural network, the expectation
values for the trainable weights are all identical. After training,
these weights will assume an organized pattern representing in some
way the information encoded in the network by the training. Most
of the artificial networks I have seen described are structurally
degenerate: a pattern of three weights 1,2,3 cannot give different
answers from 2,1,3 or 3,2,1. This degeneracy, a result of the
homogeneity, is not present in bio-nets: they are not homogeneous.

Am I wrong about this degeneracy? Does it not cause grief when numerically
training toward "optimal" weights?
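
As a concrete illustration of the symmetry in question, here is a minimal
sketch, assuming a small two-layer tanh network in NumPy (the architecture
and numbers are purely illustrative, not taken from the post above):
relabelling the hidden units together with their incoming and outgoing
weights leaves the network function unchanged.

# Hidden-unit permutation symmetry in a two-layer network (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 3, 2

W1 = rng.normal(size=(n_hidden, n_in))    # input -> hidden weights
b1 = rng.normal(size=n_hidden)
W2 = rng.normal(size=(n_out, n_hidden))   # hidden -> output weights
b2 = rng.normal(size=n_out)

def forward(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

x = rng.normal(size=n_in)
perm = np.array([2, 0, 1])                # relabel the hidden units

y_original = forward(x, W1, b1, W2, b2)
y_permuted = forward(x, W1[perm], b1[perm], W2[:, perm], b2)

print(np.allclose(y_original, y_permuted))   # True: same function, permuted weights

Every permutation of the hidden units therefore yields an equally good weight
vector, so the "optimal" weights are not unique.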

John.
John J. Barton 29-221 jjb at watson.ibm.com
P.O. Box 218 jjb at yorktown (on BITNET, probably)
IBM TJ Watson Research Center
Yorktown Heights NY 10598
(914) 945-2340 FAX (914)945-2141


------------------------------

Subject: Biologically Plausible Dynamic Artificial Neural Networks
From: paulf@manor.demon.co.uk (Paul Fawcett)
Organization: UDI
Date: 05 Jan 93 05:53:57 +0000


Biologically Plausible Dynamic Artificial Neural Networks.
-----------------------------------------------------------

A *Dynamic Artificial Neural Network* (DANN) [1]
possesses processing elements that are created and/or
annihilated, either in real time or as some part of a
development phase [2].

Of particular interest is the possibility of
constructing *biologically plausible* DANNs that
model developmental neurobiological strategies for
establishing and modifying processing elements and their
connections.

Work with cellular automata in modeling cell genesis and
cell pattern formation could be applicable to the design
of DANN topologies. Likewise, biological features that are
determined by genetic or evolutionary factors [3] would
also have a role to play.
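
One illustrative reading of the create/annihilate idea (a sketch only, not
the method of refs. [1] or [2]): a network of Gaussian hidden units that
creates a unit where the fit is worst whenever training progress stalls, and
annihilates units whose outgoing weights have become negligible. The unit
type, thresholds and toy problem below are assumptions made for the example.

# Illustrative grow-and-prune sketch (Python/NumPy); not taken from the cited work.
import numpy as np

rng = np.random.default_rng(1)

def hidden(X, centres, width=1.0):
    # Gaussian hidden units centred on stored data points (an assumed choice).
    d = X[:, None, :] - centres[None, :, :]
    return np.exp(-(d ** 2).sum(axis=2) / (2.0 * width ** 2))

def solve_output(H, y, ridge=1e-6):
    # Least-squares output weights for the current hidden layer.
    return np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ y)

# Toy regression target.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0])

centres = X[rng.choice(len(X), size=2, replace=False)]   # start with two hidden units
prev_err = np.inf
for step in range(15):
    H = hidden(X, centres)
    w = solve_output(H, y)
    err = float(np.mean((H @ w - y) ** 2))

    # Annihilate hidden units whose outgoing weight has become negligible.
    keep = np.abs(w) > 1e-3
    if keep.any():
        centres = centres[keep]

    # Create a hidden unit at the worst-fit point when progress has stalled.
    if err > 0.95 * prev_err:
        worst = int(np.argmax(np.abs(H @ w - y)))
        centres = np.vstack([centres, X[worst]])

    prev_err = err

print(f"hidden units: {len(centres)}, mse: {err:.4f}")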

Putting all this together with a view to constructing a
working DANN, possessing cognitive/behavioral attributes of
a biological system is a tall order; the modeling of nervous
systems in simple organisms may be the best approach when
dealing with a problem of such complexity [4].

Any comments, opinions or references in respect of the
above assertions would be most welcome.


Many thanks

Paul Fawcett.

University of Westminster


References.

1. Ross, M. D., et al. (1990); Toward Modeling a Dynamic
   Biological Neural Network, Mathl. Comput. Modelling,
   Vol. 13, No. 7, pp. 97-105.

2. Lee, Tsu-Chang (1991); Structure Level Adaptation for
   Artificial Neural Networks, Kluwer Academic Publishers.

3. Edelman, Gerald (1987); Neural Darwinism: The Theory of
   Neural Group Selection, Basic Books.

4. Beer, Randall D. (1990); Intelligence as Adaptive Behavior:
   An Experiment in Computational Neuroethology, Academic Press.

Paul Fawcett | Internet: paulf@manor.demon.co.uk
London, UK. | tenec@westminster.ac.uk

------------------------------

Subject: Re: Biologically Plausible Dynamic Artificial Neural Networks
From: andrick@rhrk.uni-kl.de (Ulf Andrick [Biologie])
Organization: University of Kaiserslautern, Germany
Date: 06 Jan 93 22:41:07 +0000

paulf@manor.demon.co.uk (Paul Fawcett) writes:
:
: Biologically Plausible Dynamic Artificial Neural Networks.
: -----------------------------------------------------------

Biologically Plausible Artificial Neural Network sounds to me a bit like
an oxymoron. I tend to consider any `Artificial Neural Network' as not
biologically plausible.

: Work with cellular automata in modeling cell genesis and
: cell pattern formation could be applicable to the design
: of DANN topologies. Likewise, biological features that are
: determined by genetic or evolutionary factors [3] would
: also have a role to play.


Cellular automata? One might feel reminded of the Game of Life, where the
cells change their state of being alive or dead according to the states
of the neighbouring cells. If something like that is being suggested, I am
somewhat skeptical as to whether it is of use. I thought that the main
issue of neurogenesis was the formation of synapses. That means, e.g., how
do the neuronal processes find their way to their targets through a
nascent entanglement of cells (not necessarily neurones, but also glia)?
How is synaptic coupling changed in response to some stimulus? So, are
your `cells' neurones, processes, synapses, or what?
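
For concreteness, the Game of Life rule mentioned above can be written as a
single synchronous update step; this is the standard formulation (here with
wrap-around boundaries), included only to make the reference explicit.

# One step of Conway's Game of Life on a toroidal grid.
import numpy as np

def life_step(grid):
    # grid: 2-D array of 0 (dead) and 1 (alive); returns the next generation.
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A live cell survives with 2 or 3 live neighbours; a dead cell
    # becomes alive with exactly 3 live neighbours.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

glider = np.zeros((8, 8), dtype=int)
glider[1, 2] = glider[2, 3] = glider[3, 1] = glider[3, 2] = glider[3, 3] = 1
print(life_step(glider))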

But perhaps you meant a concept of a cellular automaton so general that
one might consider the use of the word as nearly meaningless.

At least, the point seems a little moot to a person with only some
half-knowledge about cellular automata and neurogenesis.

: Putting all this together with a view to constructing a
: working DANN, possessing cognitive/behavioral attributes of
: a biological system is a tall order; the modeling of nervous
: systems in simple organisms may be the best approach when
: dealing with a problem of such complexity [4].

There seems to be enough work to be done simulating `static' Neural
Networks in simple organisms. An interesting question is, e.g., what role
the complex electrophysiological properties of the single neuron play in
the behaviour of the whole network. What are the effects of
neuromodulators? And these questions may also be of relevance in neural
development.

Artificial Neural Networks hardly play any role in that kind of research,
IMVHO, unless they have sophisticated neuronal properties of a kind most
information scientists never dream of; but then I wouldn't call such a
model an Artificial Neural Network, in order to distinguish it from the
much more primitive devices that might be appropriate for describing spin
glasses or whatever.

As you can see, my view is that Artificial Neural Network research is an
engineering discipline detached from natural paradigms, just like AI as a
whole. (As this is also crossposted to AI groups, I expect to have to put
on my flame-proof suit.)

Ulf R. Andrick                   andrick@rhrk.uni-kl.de
FB Biologie - Tierphysiologie
Universitaet Kaiserslautern      "What you do not know yourself, you must
D-W 6750 Kaiserslautern           explain to yourself." (Tegtmeier)

------------------------------

Subject: Re: Biologically Plausible Dynamic Artificial Neural Networks
From: doshay@ursa.arc.nasa.gov (David Doshay)
Organization: NASA-Ames Biocomputation Center
Date: 08 Jan 93 19:12:56 +0000

Because M.D. Ross was the first reference in the posting, and I work in
her lab at NASA Ames, I feel some need to post also. Dr. Ross is a
neuroanatomist and we study nerves in the vestibular macula at the
ultrastructural level.

First, I have never heard her refer to a DANN in the manner the post
does, and in the post she is referenced right after this acronym.

Second, with respect to Ulf's posting, we do use both artificial neural
net models as well as electrophysiological models here in the
Biocomputation Center. There are some questions that are best answered
with one of those models because they are too hard to answer with the
other type. We have far more time invested in the 'real' models than in
the artificial ones, but some network-level questions are still best posed in the
context of an artificial neural net just because of the huge
computational requirements for networks of more realistic nerves. We are,
however, scaling up our 'real' models to ask such questions. In those
cases we plan to burn several hundred hours of CRAY YMP time. Thank
goodness (and the US taxpayers) that NASA has the supercomputers.

These statements are mine, and not those of the Biocomputation Center

David doshay@soma.arc.nasa.gov

The thought police insist I tell you:
my thoughts, not NASA's

------------------------------

Subject: Re: Biologically Plausible Dynamic Artificial Neural Networks
From: andrick@sun.rhrk.uni-kl.de (Ulf Andrick [Biologie])
Organization: University of Kaiserslautern, Germany
Date: 11 Jan 93 14:01:08 +0000

doshay@ursa.arc.nasa.gov (David Doshay) writes:
: Second, with respect to Ulf's posting, we do use both artificial neural
: net models as well as electrophysiological models here in the
: Biocomputation Center. There are some questions that are best answered
: with one of those models because they are too hard to answer with the
: other type.

I just wanted to counter the view that Artificial Neural Networks (ANN)
are suitable to explain everything in the brain. Furthermore, the
posting I replied to referred to small neural systems in simple
organisms, and there I don't see much of a role for ANNs. I
think of the stomatogastric ganglion of the crab or the flight generator
of the locust when talking about small neural systems.

I'm aware that some work has been done with ANNs to simulate large
networks such as those in the human cortex.

But I didn't mind presenting a view that was perhaps a little one-sided,
because I wanted to see the reactions, especially from AI people. Somehow,
though, there was not much response; at least I saw no other replies in
bionet.neuroscience.

Ulf R. Andrick                   andrick@rhrk.uni-kl.de
FB Biologie - Tierphysiologie
Universitaet Kaiserslautern      "What you do not know yourself, you must
D-W 6750 Kaiserslautern           explain to yourself." (Tegtmeier)

------------------------------

Subject: Re: Biologically Plausible Dynamic Artificial Neural Networks
From: jdevlin@pollux.usc.edu (Joseph T. Devlin)
Organization: University of Southern California, Los Angeles, CA
Date: 11 Jan 93 19:42:25 +0000

Ulf Andrick writes:
>I just wanted to counter the view that Artificial Neural
>Networks (ANN) are suitable to explain everything in the brain.

I think I can be fairly confident when I say that no-one
really suggests that ANNs might explain "everything in the brain" -
that'd be a neat trick! It'd put us all out of jobs, however...

>Further more, the posting I answered to referred to small
>neural systems in simple organisms, and here, I don't see a
>field for the application of ANN. I think of the stomatogastric
>ganglion of the crab or the flight generator of the locust
>when talking about small neural systems.

I think this depends on what exactly you are referring to
when you say "Artificial Neural Net (ANN)". If you mean any
computational model of neural activity then certainly Selverston's work
at UCSD qualifies as a small ANN in a simple organism (the lobster
stomatogastric ganglion system).
If, on the other hand, you mean solely the more traditional
ANNs such as the models in McClelland and Rumelhart's PDP book then I
would agree. These types of models seem to provide no real insight into
detailed neural systems that are fairly well characterized biologically
but I don't believe they were intended to, either. PDP models are more
useful for modeling cognitive issues where the underlying biology is as
yet unknown but nonetheless the modeler would like to capture general
components of the biology - such as distributed representation, massive
parallelism, etc. As it stands there is certainly debate concerning the
usefulness of these models - see the ongoing McCloskey/Seidenberg debate -
but I like Seidenberg's arguments which I think are very elegant (but I
work in his lab so I'm biased. :-)

- Joe

*************************************************************************
Joseph Devlin * email: jdevlin@pollux.usc.edu
University of Southern California *
Department of Computer Science * "The axon doesn't think.
Los Angeles, CA 90089 * It just ax." George Bishop
*************************************************************************

McClelland and Rumelhart (1986) _Parallel Distributed Processing_, MIT Press.

McCloskey (1992) Networks and theories: The place of connectionism in
cognitive science. _Psychological Science_

Rowat & Selverston (1991) Learning algorithms for oscillatory networks with
gap junctions and membrane currents. _Networks 2_, 17-41.

Seidenberg (in press) [A response to McCloskey...] _Psychological Science_

Note: The references are from memory basically so I apologize for any
inaccuracies in advance. I just can't remember the title of the
Seidenberg paper - my copy doesn't have one.

------------------------------

Subject: Job Offer - signal processing
From: bouzerda@eleceng.adelaide.edu.au
Date: Tue, 05 Jan 93 13:23:13 +1100

POSTDOCTORAL OR RESEARCH FELLOW
in
Signal Processing and Neural Networks
**************************************

A postdoctoral or research fellow is sought to join, as soon as possible,
the Centre for Sensor Signal and Information Processing (CSSIP) and the
University of Adelaide EE Eng Department. The CSSIP is one of several
cooperative research centres awarded by the Australian Government to
establish excellence in research and development. The University of
Adelaide, represented by the EE Eng Dept, is a partner in this
cooperative research centre, together with the Defence Science and
Technology Organization (DSTO), four other Universities, and several
companies. The cooperative research centre consists of more than 50
effective full time researchers, and is well equipped with many UNIX
Workstations and a massively parallel machine (DEC MPP).

The aim is to develop and investigate principles of artificial neural
networks for sensor signal and image processing, classification and
separation of signals, and data fusion. The position is for one year with
a strong possibility of renewal.

DUTIES: In consultation with task leaders and specialist researchers to
investigate alternative algorithm design approaches, to design
experiments on applications of signal processing and artificial neural
networks, to prepare data and carry out the experiments, to prepare
software for testing algorithms, and to prepare or assist with the
preparation of technical reports.

QUALIFICATIONS: The successful candidate must have a Ph.D., a proven
research record, and a demonstrated ability in written and spoken
English.

PAY and CONDITIONS: will be in accordance with University of Adelaide
policies, and will depend on qualifications and experience.
Appointments may be made in scales A$ 36766 to A$ 42852 for a postdoc,
and A$ 42333 to A$ 5999 for a research fellow.

ENQUIRIES: Prof. R. E. Bogner, Electrical & Electronic Engineering Dept.,
The University of Adelaide, G.P.O. Box 498, Adelaide, South Australia
5001. Phone: (61)-08-228-5589, Fax: (61)-08-232-5720 Email:
bogner@eleceng.adelaide.edu.au

Dr. A. Bouzerdoum, Phone (61)-08-228-5464, Fax (61)-08-232-5720 Email:
bouzerda@eleceng.adelaide.edu.au


------------------------------

Subject: Postdoc in neural signal processing theory
From: Benny Lautrup <lautrup@connect.nbi.dk>
Date: Fri, 08 Jan 93 14:31:02 +0000

POST-DOC POSITION IN
NEURAL SIGNAL PROCESSING
THEORY

The Danish Computational Neural Network Center (CONNECT), announces a
one-year post-doc position in the theory of neural signal processing.
CONNECT is a joint effort with participants from the University of
Copenhagen, Risoe National Laboratory, and the Technical University of
Denmark. The position is available March 1, 1993, at the Electronics
Institute of the Technical University of Denmark.

The work of the neural signal processing group concerns generalization
theory, algorithms for architecture optimization, applications in time
series analysis, seismic signal processing, image processing and pattern
recognition.

The candidate must have a strong background in statistics or statistical
physics and have several years of experience in neural signal
processing. A candidate with proven abilities in generalization theory
of signal processing neural networks or in seismic signal processing
will be favoured.

Further information about the position can be obtained from:


Lars Kai Hansen, Phone: (+45) 45 93 12 22, ext 3889.
Electronics Institute B349, Fax: (+45) 42 87 07 17.
Technical University of Denmark, email: lars@eiffel.ei.dth.dk
DK-2800 Lyngby.

Applications containing CV, list of publications, and three letters of
recommendation should be mailed to

Benny Lautrup,
CONNECT
Niels Bohr Institute
Blegdamsvej 17
DK-2100 Copenhagen


Deadline: February 15, 1993

------------------------------

Subject: Computational Biology Postdoc
From: George Berg <berg@cs.albany.edu>
Date: Tue, 12 Jan 93 13:36:03 -0500



Postdoctoral Position in Computational Biology


A one-year postdoctoral position supported by an NSF grant is
available to study protein secondary and tertiary structure prediction
using artificial intelligence and other computational techniques.
Position is available starting in March, 1993, or later.

The successful applicant will have a strong background in the
biochemistry of protein structure. Ability to program is a must.
Experience with artificial neural networks is a definite plus. Preferred
candidates will have experience with C, UNIX, and molecular modeling.

For further information, contact either George Berg (Department of
Computer Science) or Jacquelyn Fetrow (Department of Biological Sciences)
by electronic mail at postdoc-info@cs.albany.edu.

To apply, please send curriculum vitae and three letters of
recommendation to:
Jacquelyn Fetrow
Department of Biological Sciences
University at Albany
1400 Washington Avenue
Albany, NY 12222


------------------------------

Subject: Job announcement - Siemens Corp.
From: Ellen Voorhees <ellen@sol.siemens.com>
Date: Fri, 15 Jan 93 09:14:40 -0500

The learning department of Siemens Corporate Research in Princeton, New
Jersey is looking to hire a researcher interested in statistical and
knowledge-based methods for natural language processing, text retrieval,
and text categorization. The position requires either a PhD (preferred)
or a masters degree with some experience in an appropriate field. The
main responsibility of the successful candidate will be to conduct
research in automatic information retrieval and (statistical) natural
language processing. Tasks include setting up and running experiments,
programming, etc.

People interested in the position should send a PLAIN ASCII resume to
ellen@learning.siemens.com or a hardcopy of the resume to:
Human Services
Department EV
Siemens Corporate Research, Inc.
755 College Road East
Princeton, NJ 08540
Siemens is an equal opportunity employer.


Ellen Voorhees
Member of Technical Staff
Siemens Corporate Research, Inc.


------------------------------

Subject: Teaching post at U. Western Ontario
From: Mel Goodale <22026_1672@uwovax.uwo.ca>
Date: Sat, 02 Jan 93 12:51:43 -0500

NEUROPSYCHOLOGY TEACHING POSITION

Two-year replacement teaching position, 1993-1995. Possibility of
research in existing laboratories with salary supplement. Active
programme in Psychobiology / Clinical Neuropsychology. Existing faculty
with neuropsychological interests include Mel Goodale, Elizabeth Hampson,
Doreen Kimura, and David Sherry.
We would especially like a person with expertise in human memory, but any
area of Neuropsychology / Cognitive Neuroscience is acceptable.

Basic salary is at a postdoctoral level. Send vita and references
to:

Mel Goodale, Department of Psychology
University of Western Ontario
London Ontario N6A 5C2
FAX (519) 661-3029



M.A. Goodale: Department of Psychology
University of Western Ontario
London, Ontario
Phone (519) 661-2070
GOODALE@UWO.CA


------------------------------

End of Neuron Digest [Volume 11 Issue 6]
****************************************
