Neuron Digest Thursday, 10 Jan 1991 Volume 7 : Issue 3
Today's Topics:
TR - Symbol Processing Systems, Connectionist Nets, etc.
Preprint: Stimulus Sampling & Distributed Representations
TR on the Modelling of Synaptic Plasticity
4 vs 3 layers -- Tech Report available from connectionists archive
Language, Tools and Brain: BBS Call for Commentators
Consciousness: BBS Call for Commentators
Full/Part-Time NN Research Assistant & Programmer Positions
POSTDOCTORAL POSITION IN NEW YORK AREA: Cognitive & Neural Models of Human Learning
4th NN Conference. Indiana-Purdue.
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: TR - Symbol Processing Systems, Connectionist Nets, etc.
From: honavar@iastate.edu
Date: Tue, 18 Dec 90 15:08:02 -0600
The following technical report is available in postscript form
by anonymous ftp (courtesy Jordan Pollack of Ohio State Univ).
Comments on the paper are welcome (please direct them to honavar@iastate.edu)
_________________________________________________________________
Symbol Processing Systems, Connectionist Networks, and
Generalized Connectionist Networks
Vasant Honavar Leonard Uhr
Department of Computer Science Computer Sciences Department
Iowa State University University of Wisconsin-Madison
Technical Report #90-24, December 1990
Department of Computer Science
Iowa State University, Ames, IA 50011
Abstract
Many authors have suggested that SP (symbol processing) and CN
(connectionist network) models offer radically, or even fundamentally,
different paradigms for modeling intelligent behavior (see Schneider,
1987) and the design of intelligent systems. Others have argued that CN
models have little to contribute to our efforts to understand
intelligence (Fodor & Pylyshyn, 1988).
A critical examination of the popular characterizations of SP and CN
models suggests that neither of these extreme positions is justified.
There are many advantages to be gained by a synthesis of the best of both
SP and CN approaches in the design of intelligent systems. The
generalized connectionist networks (GCN) (alternatively called
generalized neuromorphic systems (GNS)) introduced in this paper provide
a framework for such a synthesis.
______________________________________________________________________________
You will need a PostScript printer to print the file.
To obtain a copy of the report, use anonymous ftp from
cheops.cis.ohio-state.edu (here is what the transaction looks like):
% ftp
ftp> open cheops.cis.ohio-state.edu
Connected to cheops.cis.ohio-state.edu.
220 cheops.cis.ohio-state.edu FTP server (Version blah blah) ready.
Name (cheops.cis.ohio-state.edu:yourname): anonymous
331 Guest login ok, send ident as password.
Password: anything
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
250 CWD command successful.
ftp> bin
200 Type set to I.
ftp> get honavar.symbolic.ps.Z
200 PORT command successful.
150 Opening BINARY mode data connection for [[...]]
226 Transfer complete.
local: honavar.symbolic.ps.Z remote: honavar.symbolic.ps.Z
55121 bytes received in 1.8 seconds (30 Kbytes/s)
ftp> quit
221 Goodbye.
% uncompress honavar.symbolic.ps.Z
% lpr honavar.symbolic.ps
------------------------------
Subject: Preprint: Stimulus Sampling & Distributed Representations
From: gluck%psych@Forsythe.Stanford.EDU (Mark Gluck)
Date: Wed, 19 Dec 90 07:29:57 -0800
PRE-PRINT AVAILABLE:
Stimulus Sampling and Distributed Representations
in Adaptive Network Theories of Learning
Mark A. Gluck
Department of Psychology
Stanford University
[To appear in: A. Healy, S. Kosslyn, & R. Shiffrin (Eds.),
Festschrift for W. K. Estes. NJ: Erlbaum, 1991/in press]
ABSTRACT:
Current adaptive network, or "connectionist", theories of human
learning are reminiscent of statistical learning theories of the 1950's
and early 1960's, the most influential of which was Stimulus Sampling
Theory, developed by W. K. Estes and colleagues (Estes, 1959; Atkinson &
Estes, 1963). This chapter reviews Stimulus Sampling Theory, noting some
of its strengths and weaknesses, and compares it to a recent network
model of human learning (Gluck & Bower, 1986, 1988a,b). The network
model's LMS learning rule for updating associative weights represents a
significant advance over Stimulus Sampling Theory's more rudimentary
learning procedure. In contrast, Stimulus Sampling Theory's stochastic
scheme for representing stimuli as distributed patterns of activity can
overcome some limitations of network theories which identify stimulus
cues with single active input nodes. This leads us to consider a
distributed network model which embodies the processing assumptions of
our earlier network model but employs stimulus-representation assumptions
adopted from Stimulus Sampling Theory. In this distributed network,
stimulus cues are represented by the stochastic activation of overlapping
populations of stimulus elements (input nodes). Rather than replacing
the two previous learning theories, this distributed network combines the
best established concepts of the earlier theories and reduces to each of
them as special cases in those training situations where the previous
models have been most successful.
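As a rough illustration of the two ingredients compared above, the sketch
below (Python; the pool sizes, overlap, sampling probability, and learning
rate are illustrative assumptions, not values from the chapter) trains an
LMS network whose cues activate stochastically sampled, overlapping
populations of stimulus elements:

import random

N_ELEMENTS = 50      # shared pool of stimulus elements (input nodes)
SAMPLE_P   = 0.3     # chance each cue-linked element fires on a trial
LEARN_RATE = 0.1

# Hypothetical cues: each samples from an overlapping subset of the pool.
cue_pools = {"A": range(0, 30), "B": range(20, 50)}   # overlap on 20-29
weights = [0.0] * N_ELEMENTS

def sample_pattern(cue):
    # Stimulus sampling: stochastically activate the cue's elements.
    x = [0.0] * N_ELEMENTS
    for i in cue_pools[cue]:
        if random.random() < SAMPLE_P:
            x[i] = 1.0
    return x

def lms_trial(cue, target):
    # LMS (delta) rule: w_i <- w_i + lr * (target - output) * x_i
    x = sample_pattern(cue)
    output = sum(w * xi for w, xi in zip(weights, x))
    for i in range(N_ELEMENTS):
        weights[i] += LEARN_RATE * (target - output) * x[i]

# Cue A predicts the outcome; cue B predicts its absence.
for _ in range(500):
    lms_trial("A", 1.0)
    lms_trial("B", 0.0)
print("mean weight, A-only elements :", sum(weights[0:20]) / 20)
print("mean weight, shared elements :", sum(weights[20:30]) / 10)

Because the pools overlap, LMS ends up giving the shared elements less
weight than A's unique elements, a simple form of cue competition that
single-active-node stimulus representations cannot produce.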
_________________________________________________________________
To request copies, send email to: gluck@psych.stanford.edu
with your hard-copy mailing address.
Or mail to: Mark A. Gluck, Department of Psychology, Jordan Hall, Bldg. 420,
Stanford Univ., Stanford, CA 94305-2130
------------------------------
Subject: TR on the Modelling of Synaptic Plasticity
From: Patrick Thomas <thomasp@informatik.tu-muenchen.dbp.de>
Date: 27 Dec 90 13:27:38 +0100
The following technical report is now available:
BEYOND HEBB SYNAPSES:
BIOLOGICAL BUILDING BLOCKS FOR UNSUPERVISED LEARNING
IN ARTIFICIAL NEURAL NETWORKS
Patrick V. Thomas
Report FKI-140-90
Abstract
This paper briefly reviews the neurobiology of synaptic plasticity as
it is related to the formulation of learning rules for unsupervised
learning in artificial neural networks. Presynaptic, postsynaptic and
heterocellular mechanisms are discussed and their relevance to neural
modelling is assessed. These include a variety of phenomena of
potentiation as well as depression with time courses of action ranging
from milliseconds to weeks. The original notion put forward by Donald
Hebb, that synaptic plasticity depends on correlated pre- and
postsynaptic firing, is argued to be inadequate. Although postsynaptic
depolarization is necessary for associative changes in synaptic
strength to take place (which conforms to the spirit of the Hebbian
law), the association is understood as being formed between pathways
converging on the same postsynaptic neuron. The latter serves only as a
supporting device carrying signals between activated dendritic regions
and maintaining long-term changes through molecular mechanisms. It is
further proposed to restrict the interactions of synaptic inputs to
distinct compartments. The Hebbian idea that the state of the
postsynaptic neuron as a whole governs the sign and magnitude of
changes at individual synapses is dropped in favor of local mechanisms
which guide the depolarization-dependent associative learning process
within dendritic compartments. Finally, a framework for the modelling
of associative and non-associative mechanisms of synaptic plasticity at
an intermediate level of abstraction, the Patchy Model Neuron, is
sketched.
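To make the compartmental proposal concrete, here is a minimal sketch
(Python; the two-compartment layout, plasticity threshold, and learning
rate are illustrative assumptions, not the report's Patchy Model Neuron)
in which weight changes are gated by local dendritic depolarization rather
than by the state of the whole cell:

THRESHOLD  = 1.0    # local depolarization needed for plasticity (assumed)
LEARN_RATE = 0.05

# Hypothetical neuron: two dendritic compartments, two synapses each.
compartments = [
    {"weights": [0.5, 0.5]},   # compartment 0: synapses 0 and 1
    {"weights": [0.5, 0.5]},   # compartment 1: synapses 2 and 3
]

def update(inputs):
    # inputs: four presynaptic activities (0 or 1), two per compartment
    for c, comp in enumerate(compartments):
        local_in = inputs[2 * c : 2 * c + 2]
        # Local depolarization: weighted sum over THIS compartment only.
        depol = sum(w * x for w, x in zip(comp["weights"], local_in))
        if depol >= THRESHOLD:
            # Associative change confined to the compartment: co-active
            # pathways converging here are strengthened together.
            for i, x in enumerate(local_in):
                comp["weights"][i] += LEARN_RATE * x * depol

# Pair pathways 0 and 1 (same compartment); leave 2 and 3 silent.
for _ in range(20):
    update([1, 1, 0, 0])
print([c["weights"] for c in compartments])

Only compartment 0 changes; a globally gated Hebbian rule would instead
let the cell-wide state determine the change at every synapse.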
To obtain a copy of the technical report FKI-140-90 please send your physical
mail address to either "thomasp@lan.informatik.tu-muenchen.de" or Patrick V.
Thomas, Institute for Medical Psychology, Goethestr. 31, 8000 Munich 2, Germany.
------------------------------
Subject: 4 vs 3 layers -- Tech Report available from connectionists archive
From: sontag@control.RUTGERS.EDU
Date: Mon, 07 Jan 91 11:37:04 -0500
REPORT AVAILABLE ON CAPABILITIES OF FOUR-LAYER vs THREE-LAYER NETS
At the request of a few people at NIPS, I placed in the connectionists
archive the postscript version of my report describing why TWO hidden
layers are sometimes necessary when solving function-approximation types
of problems, a fact that was mentioned in my poster. (About 1/2 of the
report deals with the general question, while the other half is devoted
to the application to control that led me to this.) Below are the
abstract and instructions on ftp retrieval.
I would very much welcome any discussion of the practical implications --
if any -- of the result. If you want, send email to me and I can
summarize later for the net.
Happy palindromic year to all,
-eduardo
-----------------------------------------------------------------------------
Report SYCON-90-11, Rutgers Center for Systems and Control, October 1990
FEEDBACK STABILIZATION USING TWO-HIDDEN-LAYER NETS
This report compares the representational capabilities of three-layer
(that is, "one hidden layer") and four-layer ("two hidden layer") nets
consisting of feedforward interconnections of linear threshold units.
It is remarked that for certain problems four layers are required,
contrary to what might in principle be expected from the known
approximation theorems. The differences are not based on numerical
accuracy or number of units needed, nor on capabilities for feature
extraction, but rather on a much more basic classification into "direct"
and "inverse" problems. The former correspond to the approximation of
continuous functions, while the latter are concerned with approximating
one-sided inverses of continuous functions, and are often encountered
in the context of inverse kinematics determination or in control
questions.
A general result is given showing that nonlinear control systems can be
stabilized using four layers, but not in general using three layers.
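To fix the layer-counting convention, the sketch below (Python; the
geometry is an illustrative assumption, not the report's construction)
builds a four-layer net in this sense: input, two hidden layers of linear
threshold units, and a threshold output. Layer one computes half-plane
indicators, layer two ANDs them into squares, and the output unit ORs the
squares, recognizing a disconnected region:

def H(z):
    # Linear threshold (Heaviside) unit.
    return 1.0 if z >= 0 else 0.0

def square(x, y, x0, x1, y0, y1):
    # Hidden layer 1: four half-plane units; hidden layer 2: AND them.
    halfplanes = [H(x - x0), H(x1 - x), H(y - y0), H(y1 - y)]
    return H(sum(halfplanes) - 4)      # fires only if all four fire

def net(x, y):
    # Output unit: OR of the two square detectors (union of regions).
    a = square(x, y, 0, 1, 0, 1)       # unit square at the origin
    b = square(x, y, 2, 3, 2, 3)       # a second, disjoint square
    return H(a + b - 1)

for p in [(0.5, 0.5), (2.5, 2.5), (1.5, 1.5)]:
    print(p, "->", net(*p))            # in A, in B, outside both

The report's claim is that for inverse problems this second level of
composition is not merely convenient but in general unavoidable.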
-----------------------------------------------------------------------
To obtain copies of the postscript file, please use Jordan Pollack's
service:
Example:
unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get
(remote-file) sontag.twolayer.ps.Z
(local-file) twolayer.ps.Z
ftp> quit
unix> uncompress twolayer.ps.Z
unix> lpr -P(your_local_postscript_printer) twolayer.ps
----------------------------------------------------------------------------
If you have any difficulties with the above, please send e-mail to
sontag@hilbert.rutgers.edu. DO NOT "reply" to this message, please.
------------------------------
Subject: Language, Tools and Brain: BBS Call for Commentators
From: Stevan Harnad <harnad@clarity.Princeton.EDU>
Date: Thu, 20 Dec 90 22:55:40 -0500
Below is the abstract of a forthcoming target article to appear in
Behavioral and Brain Sciences (BBS), an international, interdisciplinary
journal that provides Open Peer Commentary on important and controversial
current research in the biobehavioral and cognitive sciences.
Commentators must be current BBS Associates or nominated by a current BBS
Associate. To be considered as a commentator on this article, to suggest
other appropriate commentators, or for information about how to become a
BBS Associate, please send email to:
harnad@clarity.princeton.edu or harnad@pucc.bitnet or write to:
BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771]
To help us put together a balanced list of commentators, please give some
indication of the aspects of the topic on which you would bring your
areas of expertise to bear if you are selected as a commentator.
____________________________________________________________________
Language, Tools, and Brain: The development and evolution of
hierarchically organized sequential behavior
Patricia Marks Greenfield
Department of Psychology
University of California, Los Angeles
Los Angeles, CA 90024-1563
electronic mail: rygreen@uclasscf.bitnet
Abstract: During the first two years of life a common neural substrate
(roughly, Broca's area) underlies the hierarchically organized
combination of elements in the development of both speech and manual
action, including tool use. The neural evidence implicates relatively
specific cortical circuitry underlying a grammatical "module."
Behavioral and neurodevelopmental data suggest that the modular
capacities for language and manipulation are not present at birth but
come into being gradually during the third and fourth years of life. An
evolutionary homologue of the common neural substrate for language
production and manual action during the first two years of human life is
hypothesized to have provided a foundation for the evolution of language
before the divergence of hominids and the great apes. Support comes from
the discovery of a Broca's area analogue in contemporary primates. In
addition, chimpanzees have an identical constraint on hierarchical
complexity in both tool use and symbol combination. Their performance
matches that of the two-year-old child who has not yet developed the
differentiated neural circuits for the relatively modularized production
of complex grammar and complex manual construction activity.
------------------------------
Subject: Consciousness: BBS Call for Commentators
From: Stevan Harnad <harnad%phoenix.Princeton.EDU@VM.TCS.Tulane.EDU>
Date: Fri, 21 Dec 90 12:55:58 -0500
Below is the abstract of a forthcoming target article to appear in
Behavioral and Brain Sciences (BBS), an international, interdisciplinary
journal that provides Open Peer Commentary on important and controversial
current research in the biobehavioral and cognitive sciences.
Commentators must be current BBS Associates or nominated by a current BBS
Associate. To be considered as a commentator on this article, to suggest
other appropriate commentators, or for information about how to become a
BBS Associate, please send email to:
harnad@clarity.princeton.edu or harnad@pucc.bitnet or write to:
BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771]
To help us put together a balanced list of commentators, please give some
indication of the aspects of the topic on which you would bring your
areas of expertise to bear if you are selected as a commentator. (The
article is retrievable by anonymous ftp from directory /pub/harnad as
file velmans.bbs on princeton.edu, however, please do not prepare a
commentary unless you have been formally invited to do so.)
____________________________________________________________________
IS HUMAN INFORMATION PROCESSING CONSCIOUS?
Max Velmans
Department of Psychology
Goldsmiths College
University of London
electronic mail: MLV@gold.lon.ac.uk
KEY WORDS: consciousness, information processing, brain, unconscious,
attention, mind, functionalism, reductionism, complementarity.
ABSTRACT: Investigations of the function of consciousness in human
information processing have focused mainly on two questions: (1) where
does consciousness enter into the information processing sequence and (2)
how does conscious processing differ from preconscious and unconscious
processing. Input analysis is thought to be initially "preconscious,"
"pre-attentive," fast, involuntary, and automatic. This is followed by
"conscious," "focal-attentive" analysis which is relatively slow,
voluntary, and flexible. It is thought that simple, familiar stimuli can
be identified preconsciously, but conscious processing is needed to
identify complex, novel stimuli. Conscious processing has also been
thought to be necessary for choice, learning and memory, and the
organization of complex, novel responses, particularly those requiring
planning, reflection, or creativity.
This target article reviews evidence that consciousness performs none of
these functions. Consciousness nearly always results from focal-attentive
processing (as a form of output) but does not itself enter into this or
any other form of human information processing. This suggests that the
term "conscious process" needs re-examination. Consciousness appears to
be necessary in a variety of tasks because they require focal-attentive
processing; if consciousness is absent, focal-attentive processing is
absent. From a first-person perspective, however, conscious states are
causally effective. First-person accounts are complementary to
third-person accounts. Although they can be translated into third-person
accounts, they cannot be reduced to them.
------------------------------
Subject: Full/Part-Time NN Research Assistant & Programmer Positions
From: gluck%psych@Forsythe.Stanford.EDU (Mark Gluck)
Date: Mon, 31 Dec 90 20:50:24 -0800
Two Full/Part Time Research Assistant Positions in:
---------------------------------------------------
COGNITIVE PSYCHOLOGY / NEURAL NETWORK MODELING
at
Rutgers University
Center for Molecular & Behavioral Neuroscience
195 University Avenue
Newark, NJ 07102
Two research positions are available for persons interested in
pursuing empirical and/or theoretical research in the cognitive and
neural sciences.
The positions are ideal for someone who has just graduated with an
undergraduate degree and would like a year or two of "hands on"
experience in research before applying to graduate school in one of
the cognitive sciences (e.g., neuroscience, psychology, computer
science).
We are looking for two people:
1. RESEARCH PROGRAMMER:
A person with strong programming skills to work in the development
of computational theories of the neural & cognitive bases of
learning. Familiarity with current PDP/neural-network algorithms and
research would be helpful, as would experience with C/Unix and Sun
computer systems. Work would focus on the development of network
models of human learning and/or biological-circuit models of the
neural bases of animal learning.
2. EXPERIMENTAL RESEARCH ASSISTANT:
A person with experience in running and designing human cognitive
psychology experiments to work in the design, execution, and data
analysis of behavioral studies of human categorization learning.
__________________________________________________________________________
Other Information:
FACILITIES: The Center is a new state-funded research center for
the integrated studies of cognitive, behavioral, and molecular
neuroscience. The Center has good computational resources and
experimental laboratories for behavioral and neural studies.
LOCATION: The Center is located in Newark, NJ, approximately 20 minutes
outside of Manhattan, New York (with easy train and subway access to
midtown and downtown NYC) and close to rural New Jersey countryside.
Numerous other research centers in the cognitive and neural sciences
are located nearby, e.g.: Cognitive Science Center, Rutgers/New
Brunswick; Centers for Cognitive & Neural Science, New York
University; Cognitive Science Center, Princeton Univ.; Columbia Univ.
& Medical School; Siemens Corporate Research, Princeton, NJ; NEC
Research Labs, Princeton, NJ; AT&T Labs; Bellcore; IBM T. J. Watson
Research Labs.
CURRENT FACULTY: E. Abercrombie, G. Buzsaki, I. Creese, M. Gluck,
H. Poizner, R. Siegel, P. Tallal, J. Tepper. Six additional faculty
will be hired. The center has a total of ten state-funded
postdoctoral positions and will direct, in collaboration with the
Institute for Animal Behavior, a graduate program in Behavioral and
Neural Sciences.
----------------------------------------------------------------------------
For more information on learning research at the CMBN/Rutgers or to
apply for these positions, please send a cover letter with
a statement of your research interests, a CV, copies of relevant
preprints, and the names & phone numbers of references to:
Dr. Mark A. Gluck Phone: (415) 725-2434
Dept. of Psychology <-[Current address to 4/91] FAX: (415) 725-5699
Jordan Hall; Bldg. 420
Stanford University email: gluck@psych.stanford.edu
Stanford, CA 94305-2130
------------------------------
Subject: POSTDOCTORAL POSITION IN NEW YORK AREA: Cognitive & Neural Models of Human Learning
From: gluck%psych@Forsythe.Stanford.EDU (Mark Gluck)
Date: Mon, 31 Dec 90 20:57:22 -0800
Postdoctoral Positions in:
--------------------------
COGNITIVE & NEURAL BASES OF LEARNING
at
Rutgers University
Center for Molecular & Behavioral Neuroscience
195 University Avenue
Newark, NJ 07102
Postdoctoral positions are available for recent Ph.D's in all areas of
Cognitive Science (e.g., Neuroscience, Psychology, Computer Science)
interested in pursuing empirical and/or theoretical research in the
following areas of cognitive and neural science:
1. COGNITIVE SCIENCE/ADAPTIVE "CONNECTIONIST" NETWORKS:
Experimental and theoretical (computational) studies of human
learning and memory.
2. COMPUTATIONAL NEUROSCIENCE / COGNITIVE NEUROSCIENCE:
Models of the neural bases of learning in animals and humans.
Candidates with any (or all) of the following skills are particularly
encouraged to apply: (1) familiarity with neural network algorithms and
models, (2) strong computational/analytic skills, and (3) experience with
experimental methods, experimental design, and data analysis in cognitive
psychology.
----------------------------------------------------------------------------
Other Information:
FACILITIES: The Center is a new state-funded research center for
the integrated studies of cognitive, behavioral, and molecular neuroscience.
The Center has good computational resources and experimental laboratories
for behavioral and neural studies.
LOCATION: The Center is located in Newark, NJ, approximately 20 minutes
outside of Manhattan, New York (with easy train and subway access to
midtown and downtown NYC) and close to rural New Jersey countryside.
Numerous other research centers in the cognitive and neural sciences
are located nearby including: Cognitive Science Center, Rutgers/New Brunswick;
Centers for Cognitive & Neural Science, New York University; Cognitive
Science Center, Princeton Univ.; Columbia Univ. & Medical School; Siemens
Corporate Research, Princeton, NJ; NEC Research Labs, Princeton, NJ;
AT&T Labs; Bellcore; IBM T. J. Watson Research Labs.
CURRENT FACULTY: E. Abercrombie, G. Buzsaki, I. Creese, M. Gluck,
H. Poizner, R. Siegel, P. Tallal, J. Tepper. Six additional faculty
will be hired. The Center has a total of ten state-funded postdoctoral
positions and will direct, in collaboration with the Institute for Animal
Behavior, a graduate program in Behavioral and Neural Sciences.
----------------------------------------------------------------------------
For more information on learning research at the CMBN/Rutgers or to apply
for these post-doctoral positions, please send a cover letter with a
statement of your research interests, a CV, copies of relevant preprints,
and the names & phone numbers of references to:
Dr. Mark A. Gluck Phone: (415) 725-2434
Dept. of Psychology <-[Current address to 4/91] FAX: (415) 725-5699
Jordan Hall; Bldg. 420
Stanford University email: gluck@psych.stanford.edu
Stanford, CA 94305-2130
------------------------------
Subject: 4th NN Conference. Indiana-Purdue.
From: SAYEGH%IPFWCVAX.BITNET@vma.CC.CMU.EDU
Date: Fri, 21 Dec 90 21:55:00 -0500
FOURTH CONFERENCE ON NEURAL NETWORKS
------------------------------------
AND PARALLEL DISTRIBUTED PROCESSING
-----------------------------------
INDIANA UNIVERSITY-PURDUE UNIVERSITY
------------------------------------
11-13 APRIL 1991
----------------
CALL FOR PAPERS
---------------
The Fourth Conference on Neural Networks and Parallel Distributed Processing
at Indiana University-Purdue University will be held on the Fort Wayne Campus,
April 11-13, 1991.
Authors are invited to submit a one-page abstract of current research in
their area of neural network theory or application before February 5,
1991. Notification of acceptance or rejection will be sent by February
28.
The proceedings of the third conference are now in press and will be
announced on the network in early January.
Conference registration is $20 and students attend free. Some limited
financial support might also be available to allow students to attend.
Abstracts and inquiries should be addressed to:
email: sayegh@ipfwcvax.bitnet
-----
US mail:
-------
Prof. Samir Sayegh
Physics Department
Indiana University-Purdue University
Fort Wayne, IN 46805
FAX: (219) 481-6880
Voice: (219) 481-6157
------------------------------
End of Neuron Digest [Volume 7 Issue 3]
***************************************