Neuron Digest Thursday, 13 Dec 1990 Volume 6 : Issue 71
Today's Topics:
tech rep available on evolving neural networks
learning a synaptic learning rule. TR available.
tech. rep. available on evolving trail-following organisms
Dissertation abstract & Technical report
Preprint about scaled conjugate gradient available.
graduate study in neural networks
TR on Children's Acquisition of Irregular Past Tense available
IJCNN-91-Seattle paper deadline is coming up!
reprints available
left-out info: IJCNN-91-Seattle
FTP archives; Technical report available
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: tech rep available on evolving neural networks
From: Dr Michael G Dyer <dyer@cs.ucla.edu>
Date: Wed, 21 Nov 90 09:35:03 -0800
Evolution of Communication in Artificial Organisms*
Gregory M. Werner
Michael G. Dyer
Tech. Rep. UCLA-AI-90-06
Abstract: A population of artificial organisms evolved simple
communication protocols for mate finding. Female animals in our
artificial environment had the ability to see males and to emit sounds.
Male animals were blind, but could hear signals from females. Thus, the
environment was designed to favor organisms that evolved to generate and
interpret meaningful signals. Starting with random neural networks, the
simulation produced a progression of generations that exhibited
increasingly effective mate-finding strategies. In addition, a number of
distinct subspecies, i.e. groups with different signaling protocols or
"dialects", evolved and competed. These protocols became a behavioral
barrier to mating that supported the formation of distinct subspecies.
Experiments with physical barriers in the environment were also
performed. A partially permeable barrier allowed a separate subspecies to
evolve and survive for indefinite periods of time, despite occasional
migration of, and contact with, members of other subspecies.
* To appear in: J. D. Farmer, C. Langton, S. Rasmussen & C. Taylor
(Eds.), Artificial Life II, Addison-Wesley, in press.
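For readers who want a concrete, runnable picture of this kind of simulation,
here is a minimal Python sketch that co-evolves a sighted, signalling "female"
and a blind, listening "male" in a one-dimensional world. It is not the
authors' code; the linear signalling nets, the genome layout, and every
constant are illustrative assumptions.

import random

GENES = 8                      # 4 female (signalling) weights + 4 male (listening) weights
POP, STEPS, GENERATIONS = 50, 20, 30

def new_genome():
    return [random.uniform(-1, 1) for _ in range(GENES)]

def fitness(genome):
    # One trial: the female sits at the origin of a 1-D world, the male starts elsewhere.
    male_x = random.uniform(-10, 10)
    fw, mw = genome[:4], genome[4:]
    for t in range(STEPS):
        # The female "sees" the male's position and emits a two-component signal.
        signal = (fw[0] * male_x + fw[1], fw[2] * male_x + fw[3])
        # The blind male "hears" the signal and takes a bounded step.
        step = mw[0] * signal[0] + mw[1] * signal[1] + mw[2]
        male_x += max(-1.0, min(1.0, step * mw[3]))
        if abs(male_x) < 0.5:          # the pair has met
            return STEPS - t           # earlier meetings score higher
    return 0.0

def offspring(a, b):
    cut = random.randrange(GENES)
    child = a[:cut] + b[cut:]          # one-point crossover
    if random.random() < 0.3:          # occasional mutation
        child[random.randrange(GENES)] += random.gauss(0, 0.2)
    return child

pop = [new_genome() for _ in range(POP)]
for gen in range(GENERATIONS):
    ranked = sorted(pop, key=fitness, reverse=True)
    parents = ranked[:POP // 2]
    pop = parents + [offspring(random.choice(parents), random.choice(parents))
                     for _ in range(POP - len(parents))]
    print("generation", gen, "best score", fitness(ranked[0]))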
For a copy of the above paper, please send a request for Tech. Rep.
UCLA-AI-90-06 to: valerie@cs.ucla.edu
------------------------------
Subject: learning a synaptic learning rule. TR available.
From: Yoshua BENGIO <yoshua@HOMER.MACH.CS.CMU.EDU>
Date: Wed, 21 Nov 90 19:41:36 -0500
The following technical report is now available by ftp from neuroprose:
Bengio Y. and Bengio S. (1990). Learning a synaptic learning rule.
Technical Report #751, Universite de Montreal, Departement d'informatique
et de recherche operationelle.
Learning a synaptic learning rule
Yoshua Bengio                          Samy Bengio
McGill University                      Universite de Montreal
School of Computer Science             Departement d'informatique
3480 University street                   et de recherche operationelle
Montreal, Qc, Canada, H3A 2A7          Montreal, Qc, Canada, H3C 3J7
yoshua@cs.mcgill.ca                    bengio@iro.umontreal.ca
An original approach to neural modeling is presented, based on the idea
of using learning methods to search for and tune a synaptic learning rule
that is biologically plausible and yields networks capable of learning to
perform difficult tasks. This method relies on the idea of
considering the synaptic modification rule DeltaW() as a parametric
function. This function has local inputs and is the same in many neurons.
Its parameters can be estimated with known learning methods. For this
optimization, we give particular attention to gradient descent and
genetic algorithms. Estimation of these parameters consists of a joint
global optimization of
(a) the synaptic modification function, and
(b) the networks that are learning to perform some tasks, using this
function.
We show how to compute the gradient of an optimization criterion with
respect to the parameters of DeltaW(). Both network architecture and the
learning function can be designed within constraints derived from
biological knowledge.
To prevent DeltaW() from becoming too specialized, this function is forced to be
the same for a large number of synapses, in a population of networks
learning to perform different tasks. To enforce efficiency constraints,
some of these networks should learn complex mappings (as in pattern
recognition). Others should learn to reproduce behavioral phenomena, such
as associative conditioning, and neurological phenomena, such as
habituation, recovery, dishabituation and sensitization. The architecture
of the networks reproducing these biological phenomena can be designed
based on well-studied circuits, such as those involved in associations in
Aplysia, Hermissenda, or the rabbit eyelid closure response. Multiple
synaptic modification functions allow for the diverse types of synapses
(e.g. inhibitory, excitatory). Models of pre-, epi- and post-synaptic
mechanisms can be used to bootstrap DeltaW(), so that it initially
consists of a combination of simpler modules, each emulating a particular
synaptic mechanism.
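As a much simplified illustration of a shared, parametric DeltaW() that is
itself optimized, the following Python sketch may help. The particular linear
form of the rule, the single-unit OR task, and the random-search outer loop
are assumptions chosen for brevity; the report itself considers gradient
descent and genetic algorithms for the outer optimization.

import math, random

def delta_w(theta, pre, post, w, err):
    # One rule shared by every synapse: a linear combination of local terms.
    a, b, c, d = theta
    return a * pre * err + b * pre * post + c * w + d

def sigmoid(x):
    x = max(-60.0, min(60.0, x))       # clip to avoid overflow for runaway rules
    return 1.0 / (1.0 + math.exp(-x))

def train_and_test(theta, epochs=30):
    # Inner loop: a single sigmoid unit learns an OR-like task with the rule.
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w = [random.uniform(-0.5, 0.5) for _ in range(3)]     # two inputs + bias
    for _ in range(epochs):
        for (x1, x2), target in data:
            pre = (x1, x2, 1.0)
            post = sigmoid(sum(wi * xi for wi, xi in zip(w, pre)))
            err = target - post
            w = [wi + delta_w(theta, xi, post, wi, err) for wi, xi in zip(w, pre)]
    # Outer fitness: negative squared error of the trained unit.
    loss = 0.0
    for (x1, x2), target in data:
        pre = (x1, x2, 1.0)
        post = sigmoid(sum(wi * xi for wi, xi in zip(w, pre)))
        loss += (target - post) ** 2
    return -loss

# Outer loop: crude random search over theta (the report considers gradient
# descent and genetic algorithms for this step).
best_theta, best_fit = None, -1e9
for _ in range(200):
    theta = [random.uniform(-1, 1) for _ in range(4)]
    fit = sum(train_and_test(theta) for _ in range(3)) / 3
    if fit > best_fit:
        best_theta, best_fit = theta, fit
print("best rule parameters:", [round(c, 2) for c in best_theta],
      "fitness:", round(best_fit, 3))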
---------------------------------------------------------------------------
Copies of the postscript file bengio.learn.ps.Z may be obtained from the
pub/neuroprose directory in cheops.cis.ohio-state.edu. Either use the
Getps script or do this:
unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Connected to cheops.cis.ohio-state.edu.
Name (cheops.cis.ohio-state.edu:): anonymous
331 Guest login ok, sent ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
ftp> binary
ftp> get bengio.learn.ps.Z
ftp> quit
unix-2> uncompress bengio.learn.ps.Z
unix-3> lpr -P(your_local_postscript_printer) bengio.learn.ps
Or, order a hardcopy by sending your physical mail address to
bengio@iro.umontreal.ca, mentioning Technical Report #751. Please do
this only if you cannot use the ftp method described above.
------------------------------
Subject: tech. rep. available on evolving trail-following organisms
From: Dr Michael G Dyer <dyer@CS.UCLA.EDU>
Date: Tue, 27 Nov 90 21:29:39 -0800
Evolution as a Theme in Artificial Life: The Genesys/Tracker System*
David Jefferson, Robert Collins, Claus Cooper
Michael Dyer, Margot Flowers, Richard Korf
Charles Taylor, Alan Wang
Abstract
Direct, fine-grained simulation is a promising way of investigating and
modeling natural evolution. In this paper we show how we can model a
population of organisms as a population of computer programs, and how the
evolutionarily significant activities of organisms (birth, interaction
with the environment, migration, sexual reproduction with genetic
mutation and recombination, and death) can all be represented by
corresponding operations on programs. We illustrate these ideas in a
system built for the Connection Machine called Genesys/Tracker, in which
artificial "ants" evolve the ability to perform a complex task. In less
than 100 generations a population of 64K "random" ants, represented
either as finite state automata or as artificial neural nets, evolve the
ability to traverse a winding broken "trail" in a rectilinear grid
environment. Throughout this study we pay special attention to
methodological issues, such as the avoidance of representational
artifacts, and to biological verisimilitude.
* To appear in J. D. Farmer, C. Langton, S. Rasmussen and C. Taylor (Eds.),
Artificial Life II, Addison-Wesley, in press.
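To make the finite-state-automaton representation concrete, here is a minimal
Python sketch of how one ant genome can be scored on a toy broken trail
(selection and crossover over such scores would then proceed as in any genetic
algorithm). The trail, grid, number of states, and time budget are illustrative
assumptions and bear no relation to the 64K-ant Connection Machine runs
described above.

import random

TRAIL = {(0, 1), (0, 2), (0, 3), (1, 3), (2, 3), (2, 4), (2, 5), (3, 5),
         (4, 5), (4, 6), (4, 7)}                  # a tiny hand-made broken trail
N_STATES, STEPS = 4, 60
ACTIONS = ("forward", "left", "right")
DIRS = [(0, 1), (1, 0), (0, -1), (-1, 0)]          # the four grid headings

def random_genome():
    # For each state and each sensory input (trail ahead or not): (action, next state).
    return [[(random.choice(ACTIONS), random.randrange(N_STATES)) for _ in range(2)]
            for _ in range(N_STATES)]

def score(genome):
    x, y, heading, state = 0, 0, 0, 0
    eaten, trail = 0, set(TRAIL)
    for _ in range(STEPS):
        dx, dy = DIRS[heading]
        ahead = 1 if (x + dx, y + dy) in trail else 0
        action, state = genome[state][ahead]
        if action == "left":
            heading = (heading - 1) % 4
        elif action == "right":
            heading = (heading + 1) % 4
        else:                                      # move forward, eating any trail cell
            x, y = x + dx, y + dy
            if (x, y) in trail:
                trail.discard((x, y))
                eaten += 1
    return eaten

pop = [random_genome() for _ in range(200)]
best = max(pop, key=score)
print("trail cells eaten by the best of 200 random ants:", score(best))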
For a copy of this tech. rep., please send e-mail
to: valerie@cs.ucla.edu
requesting "Jefferson et al. -- Evolution as Theme in ALife"
Be sure to include your USA mail address.
------------------------------
Subject: Dissertation abstract & Technical report
From: menon@cs.utexas.edu
Date: Wed, 28 Nov 90 12:34:21 -0600
Technical report announcement:
Dynamic Aspects of Signaling in Distributed Neural Systems
Vinod Menon
Dept. of Computer Sciences
Univ. of Texas at Austin
Austin, TX 78712
ABSTRACT
A distributed neural system consists of localized populations of
neurons -- neuronal groups -- linked by massive reciprocal connections.
Signaling between neuronal groups forms the basis of functioning of such
a system. In this thesis, fundamental aspects of signaling are
investigated mathematically with particular emphasis on the architecture
and temporal self-organizing features of distributed neural systems.
Coherent population oscillations, driven by exogenous and endogenous
events, serve as autonomous timing mechanisms and are the basis of one
possible mechanism of signaling. The theoretical analysis has,
therefore, concentrated on a detailed study of the origin and
frequency-amplitude-phase characteristics of the oscillations and the
emergent features of inter-group reentrant signaling.
It is shown that a phase shift between the excitatory and inhibitory
components of the interacting intra-neuronal-group signals underlies the
generation of oscillations. Such a phase shift is readily induced by
delayed inhibition or slowly decaying inhibition. Theoretical analysis
shows that a large dynamic frequency-amplitude range is possible by
varying the time course of the inhibitory signal.
Reentrant signaling between two groups is shown to give rise to
synchronization, desynchronization, and resynchronization (with a large
jump in frequency and phase difference) of the oscillatory activity as
the latency of the reentrant signal is varied. We propose that this
phenomenon represents a correlation dependent non-Boolean switching
mechanism. A study of triadic neuronal group interactions reveals
topological effects -- the existence of stabilizing (closed loop) and
destabilizing (open loop) circuits.
The analysis indicates (1) the metastable nature of signaling, (2) the
existence of time windows in which correlated and uncorrelated activity
can take place, and (3) dynamic frequency-amplitude-phase modulation of
oscillations. By varying the latencies, and hence the relative phases of
the reentrant signals, it is possible to dynamically and selectively
modulate the cross-correlation between coactive neuronal groups in a
manner that reflects the mapping topology as well as the intrinsic
neuronal circuit properties. These mechanisms, we argue, provide dynamic
linkage between neuronal groups thereby enabling the distributed neural
system to operate in a highly parallel manner without clocks, algorithms,
and central control.
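The first result -- oscillation arising from a phase shift between excitation
and delayed inhibition -- can be caricatured in a few lines of Python. The
response function, delay length, and coupling strength below are assumptions
chosen only so that the toy feedback loop oscillates; they are not parameters
from the thesis.

import collections, math

def f(x):                                      # steep population response function
    return 1.0 / (1.0 + math.exp(-8.0 * (x - 0.5)))

DELAY = 10                                     # inhibition mirrors excitation DELAY steps later
past_E = collections.deque([0.0] * DELAY, maxlen=DELAY)
E = 0.0
for t in range(120):
    inhibition = past_E[0]                     # oldest stored value = delayed feedback
    E = f(1.0 - 1.5 * inhibition)              # constant drive minus delayed inhibition
    past_E.append(E)
    if t % 5 == 0:
        print("t =", t, " E =", round(E, 3))

Weakening the delayed feedback (for example, replacing 1.5 by 0.3) lets the
same loop settle to a steady state instead.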
To obtain a copy of the technical report TR-90-36 please send $5 in US
bank check or international money order payable to "The University of
Texas" at the following address:
Technical Report Center
Department of Computer Sciences
University of Texas at Austin
Taylor Hall 2.124
Austin, TX 78712-1188 USA
-------------------------------------------------------------------------
Dept. of Computer Sciences          email: menon@cs.utexas.edu
University of Texas at Austin       tel:   512-343-8033
Austin, TX 78712                           471-9572
-------------------------------------------------------------------------
------------------------------
Subject: Preprint about scaled conjugate gradient available.
From: Martin Moller <fodslett@daimi.aau.dk>
Date: Fri, 30 Nov 90 19:48:14 +0100
*************************************************************
******************** PREPRINT announcement: ****************
A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning.
Martin F. Moller
Computer Science Dept.
University of Aarhus
Denmark
e-mail: fodslett@daimi.aau.dk
Abstract--
A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with
superlinear convergence rate is introduced. SCG uses second order
information from the neural network, but requires only O(N) memory usage.
The performance of SCG is benchmarked against the performance of the
standard backpropagation algorithm (BP) and several recently proposed
standard conjugate gradient algorithms. SCG yields a speed-up of at least
an order of magnitude relative to BP. The speed-up depends on the
convergence criterion, i.e., the greater the demanded reduction in error,
the greater the speed-up. SCG is fully automated, has no user-dependent
parameters, and avoids the time-consuming line search that other
conjugate gradient algorithms use to determine a good step size.
Incorporating problem-dependent structural information in the
architecture of a neural network often lowers the overall complexity. The
smaller the complexity of the neural network relative to the problem
domain, the greater the chance that the weight space contains long
ravines characterized by sharp curvature. While BP is inefficient in
these ravines, SCG handles them effectively.
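Two ingredients mentioned in the abstract -- second-order information obtained
from an O(N) Hessian-vector estimate, and a scaling parameter that takes the
place of a line search -- can be sketched in Python/NumPy. This is a simplified
caricature on a toy objective, not Moller's full algorithm; the objective, the
constants, and the crude accept/reject rule are assumptions.

import numpy as np

def loss(w):                                   # toy non-quadratic objective
    return np.sum((w ** 2 - np.arange(1.0, len(w) + 1)) ** 2)

def grad(w):
    return 4 * w * (w ** 2 - np.arange(1.0, len(w) + 1))

w = np.full(5, 0.5)
g = grad(w)
p = -g                                         # initial search direction
lam = 1e-4                                     # scaling parameter (stands in for the line search)
for k in range(200):
    if np.linalg.norm(g) < 1e-8:               # converged
        break
    if g @ p >= 0:                             # restart if p is no longer a descent direction
        p = -g
    sigma = 1e-4 / (np.linalg.norm(p) + 1e-12)
    s = (grad(w + sigma * p) - g) / sigma      # Hessian-vector product from two gradient calls
    delta = p @ s + lam * (p @ p)              # scaled curvature estimate along p
    if delta <= 0:                             # raise lam until the estimate is positive
        lam = 2 * (lam - delta / (p @ p))
        delta = p @ s + lam * (p @ p)
    alpha = -(g @ p) / delta                   # step size, computed without a line search
    w_trial = w + alpha * p
    if loss(w_trial) < loss(w):                # accept the step: relax lam, update direction
        w = w_trial
        g_new = grad(w)
        beta = (g_new @ g_new) / (g @ g + 1e-12)
        p, g = -g_new + beta * p, g_new        # Fletcher-Reeves style conjugate direction
        lam = max(lam * 0.5, 1e-8)
    else:                                      # reject the step: keep w, increase lam
        lam *= 4.0
print("final loss:", float(loss(w)))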
The paper is available by ftp in the neuroprose directory under the name:
moller.conjugate-gradient.ps.Z
Any questions or comments on this announcement or on the preprint would
be very much appreciated.
Martin M.
------------------------------
Subject: graduate study in neural networks
From: caroly@park.bu.edu
Date: Fri, 30 Nov 90 15:13:20 -0500
(please post)
***********************************************
* *
* GRADUATE PROGRAM IN *
* COGNITIVE AND NEURAL SYSTEMS (CNS) *
* AT BOSTON UNIVERSITY *
* *
***********************************************
Gail A. Carpenter & Stephen Grossberg, Co-Directors
The Boston University graduate program in Cognitive and Neural Systems
offers comprehensive advanced training in the neural and computational
principles, mechanisms, and architectures that underlie human and animal
behavior, and the application of neural network architectures to the
solution of outstanding technological problems.
Applications for Fall 1991 admission and financial aid are now being
accepted for both the MA and PhD degree programs.
To obtain a brochure describing the CNS Program and a set of application
materials, write or telephone:
Cognitive & Neural Systems Program
Boston University
111 Cummington Street, Room 240
Boston, MA 02215
(617) 353-9481
or send a mailing address to: caroly@park.bu.edu
Applications for admission and financial aid should be received by the
Graduate School Admissions Office no later than January 15.
Applicants are required to submit undergraduate (and, if applicable,
graduate) transcripts, three letters of recommendation, and Graduate
Record Examination (GRE) scores. The Advanced Test should be in the
candidate's area of departmental specialization. GRE scores may be waived
for MA candidates and, in exceptional cases, for PhD candidates, but
absence of these scores may decrease an applicant's chances for admission
and financial aid.
Description of the CNS Program:
The Cognitive and Neural Systems (CNS) Program provides advanced training
and research experience for graduate students interested in the neural
and computational principles, mechanisms, and architectures that underlie
human and animal behavior, and the application of neural network
architectures to the solution of outstanding technological problems.
Students are trained in a broad range of areas concerning cognitive and
neural systems, including vision and image processing; speech and
language understanding; adaptive pattern recognition; associative
learning and long-term memory; cognitive information processing;
self-organization; cooperative and competitive network dynamics and
short-term memory; reinforcement, motivation, and attention; adaptive
sensory-motor control and robotics; and biological rhythms; as well as
the mathematical and computational methods needed to support advanced
modeling research and applications. The CNS Program awards MA, PhD, and
BA/MA degrees.
The CNS Program embodies a number of unique features. Its core curriculum
consists of eight interdisciplinary graduate courses each of which
integrates the psychological, neurobiological, mathematical, and
computational information needed to theoretically investigate fundamental
issues concerning mind and brain processes and the applications of neural
networks to technology. Each course is taught once a week in the evening
to make the program available to qualified students, including working
professionals, throughout the Boston area. Students develop a coherent
area of expertise by designing a program that includes courses in areas
such as Biology, Computer Science, Engineering, Mathematics, and
Psychology, in addition to courses in the CNS core curriculum.
The CNS Program prepares Ph.D. students for thesis research with
scientists in one of several Boston University research centers or
groups, and with Boston-area scientists collaborating with these centers.
The unit most closely linked to the Program is the Center for Adaptive
Systems. The Center for Adaptive Systems is also part of the Boston
Consortium for Behavioral and Neural Studies, a Boston-area
multi-institutional Congressional Center of Excellence. Another
multi-institutional Congressional Center of Excellence focussed at Boston
University is the Center for the Study of Rhythmic Processes. Other
research resources include distinguished research groups in dynamical
systems within the mathematics department; in theoretical computer
science within the Computer Science Department; in biophysics and
computational physics within the Physics Department; in sensory robotics,
biomedical engineering, computer and systems engineering, and
neuromuscular research within the Engineering School; and in
neurophysiology, neuroanatomy, and neuropharmacology at the Medical
School.
------------------------------
Subject: TR on Children's Acquisition of Irregular Past Tense available
From: Steve Pinker <steve@psyche.mit.edu>
Date: Fri, 30 Nov 90 19:07:05 -0500
The following technical report is now available:
MIT CENTER FOR COGNITIVE SCIENCE
OCCASIONAL PAPER #41
Overregularization
Gary F. Marcus
Michael Ullman
Steven Pinker
Michelle Hollander
T. John Rosen
Fei Xu
MIT
ABSTRACT
Children's overregularization errors such as 'comed' bear on three
issues: "U"-shaped development where children get worse over time because
of an interaction between memory and rule-governed processes; the
unlearning of grammatical errors in the absence of parental negative
feedback; and whether cognitive processes are computed by rules or by
parallel distributed processing (connectionist) networks. We remedy the
lack of quantitative data on overregularization by exhaustively analyzing
the 11,500 irregular past tense utterances in the transcribed spontaneous
speech of 69 children, and by reviewing the naturalistic and experimental
literature. We found: (1) overregularization errors are relatively rare
(median 2.5% of irregular past tense forms), suggesting that there is no
qualitative defect in children's grammars that must be unlearned. (2)
Overregularization occurs at a roughly constant low rate from the late
twos into the school-age years, affecting most irregular verbs. (3)
Though there is no stage where overregularization errors predominate, one
other aspect of U-shaped development was confirmed: an extended period of
correct performance before the first overregularization. (4) No support
was found for Rumelhart & McClelland's (1986) hypothesis that
overregularization is caused by increases in the number or proportion of
regular verbs in the input to the past tense system (either parents'
tokens, children's tokens, or children's types). Thus the traditional
account in which a memory system operates before a rule system cannot be
replaced by a connectionist alternative in which a single network
displays rotelike or rulelike behavior in response to changes in input
statistics. (5) The onset of overregularization is best predicted by the
onset of *obligatoriness*: the errors appear when children stop leaving
verbs in past tense contexts unmarked (e.g., 'Yesterday I come'). (6) The
more often a parent uses an irregular past tense form of a verb, the less
often the child overregularizes it. (7) Verbs are protected from
overregularization by neighborhoods of similar-sounding irregulars, but
are not attracted to overregularization by neighborhoods of
similar-sounding regulars. This suggests that the associative properties
of connectionist networks may help explain performance with irregulars
(via the memory system in which they are stored) but not with regulars. A
simple hypothesis explains these phenomena. Children, like adults,
obligatorily mark tense, using one of two mechanisms: memory for
irregulars, and an affixation rule that can generate a regular past tense
form for any verb. Retrieval of an irregular blocks the rule, but
children's memory traces for irregulars are not strong enough to
guarantee perfect retrieval. When retrieval fails, the rule is applied,
and overregularization results.
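The blocking account in the final paragraph is easy to caricature in Python:
tense marking is obligatory, retrieval of a stored irregular blocks the rule,
and the rule fires only when retrieval fails. The retrieval-probability curve
and the frequencies below are invented for illustration, not estimates from
the paper; the sketch merely reproduces, qualitatively, finding (6) that more
parental uses of an irregular mean fewer overregularizations of it.

import random

def retrieval_probability(exposures):
    # Memory strength grows with how often the child has heard the irregular form.
    return exposures / (exposures + 20.0)

def overregularization_rate(parental_frequency, uses=10000):
    errors = 0
    for _ in range(uses):
        # Marking is obligatory: retrieve the irregular, or apply the -ed rule.
        if random.random() >= retrieval_probability(parental_frequency):
            errors += 1            # retrieval failed; the rule applies ("comed")
    return errors / uses

for freq in (5, 20, 80, 320):
    print("heard", freq, "times -> overregularized",
          round(100 * overregularization_rate(freq), 1), "% of uses")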
---------------------------------------------------------------------------
Copies of the postscript file overreg.ps.Z may be obtained
electronically from psyche.mit.edu as follows:
unix-1> ftp psyche.mit.edu (or ftp 18.88.0.85)
Connected to psyche.mit.edu.
Name (psyche:): anonymous
331 Guest login ok, sent ident as password.
Password: yourname
230 Guest login ok, access restrictions apply.
ftp> cd pub
250 CWD command successful.
ftp> binary
200 Type set to I.
ftp> get overreg.ps.Z
200 PORT command successful.
150 Opening data connection for overreg.ps.Z (18.88.0.154,1500) (253471 bytes).
226 Transfer complete.
local: overreg.ps.Z remote: overreg.ps.Z
253471 bytes received in 4.1 seconds (61 Kbytes/s)
ftp> quit
unix-2> uncompress overreg.ps.Z
unix-3> lpr -P(your_local_postscript_printer) overreg.ps
Or, order a hardcopy by sending your physical mail address to Eleanor
Bonsaint (bonsaint@psyche.mit.edu), asking for Occasional Paper #41.
Please do this only if you cannot use the ftp method described above.
------------------------------
Subject: IJCNN-91-Seattle paper deadline is coming up!
From: Don Wunsch <dwunsch@blake.u.washington.edu>
Date: Mon, 03 Dec 90 11:47:46 -0800
IJCNN '91 Seattle
Call for Papers
The International Neural Network Society (INNS) and the
Institute of Electrical and Electronics Engineers (IEEE) invite all
persons interested in the field of Neural Networks to submit papers for
possible presentation at the Conference.
Papers must be RECEIVED by February 1, 1991. Submissions
received after February 1, 1991 will be returned unopened. All
submissions will be acknowledged by mail. International authors should
submit their work via Air Mail or Express courier so as to ensure timely
arrival.
Eight copies (one original and seven copies) are required for
submission. Do not fold or staple the original, camera-ready copy. Do
not number the pages on the front of the camera-ready copy. Papers of no
more than six pages, including figures, tables and references, should be
written in English, and only complete papers will be considered.
Papers must be submitted camera-ready on 8 1/2" by 11" white
bond paper with 1" margins on each of the top, bottom, left and right
sides, and un-numbered. They should be prepared by a typewriter or
letter-quality printer in one-column format, single-spaced, in Times or
similar style font of 10 point or larger, and should be printed on one
side of the paper only. FAX submissions are not acceptable.
Centered at the top of the first page should be the complete
title, author name(s), affiliation(s) and mailing address(es). This is
followed by a space and then the abstract, up to 15 lines, followed by
the text. In an accompanying letter, the following must be included:
Corresponding Author:
Name, mailing address, telephone and FAX numbers
Technical Area
(Neurobiology, applications, electronic implementations,
optical implementations, image processing, vision, speech,
network dynamics, optimization, robotics and control,
learning and generalization, neural network architectures,
or other)
Presentation Format Preferred:
Oral or Poster
Presenter:
Name, mailing address, telephone and FAX numbers
If an oral talk is requested, include any special audio/video
requests. Special audio/video requests beyond 35mm slide and
overhead transparency projectors will be honored only if there
are sufficient requests to justify them. If you have special
audio/video needs, please contact Sarah Eck at conference
management for more information.
Send Papers to:
IJCNN '91 Seattle
Conference Management,
Attn. Sarah Eck, MS/GH-22, Suite 108
5001 25th Ave. NE, Seattle, WA 98195
Tel (206) 543-0888 FAX (206) 685-9359
DEADLINE FEBRUARY 1, 1991
Submissions received after this date will be returned unopened.
Registration for IJCNN-91-Seattle costs $195.00 for INNS and IEEE
members, $295.00 for non-members, and $50.00 for students at the early
rate (deadline March 1, 1991). Late registration, honored until June 1,
1991, is $295.00 for members, $395.00 for non-members, and $75.00 for
students. On-site registration is $395.00, $495.00, and $95.00
respectively.
Tutorials will be offered at an additional cost of $195.00, or
$295.00 for tutorial registration on site.
Exhibitors will present the latest in neural networks, including
neurocomputers, VLSI neural networks, implementations, software systems
and applications at IJCNN-91-SEATTLE. IJCNN-91-SEATTLE is the neural
network industry's largest trade show.
Hope to see you there!
Don
------------------------------
Subject: reprints available
From: Dr Michael G Dyer <dyer@CS.UCLA.EDU>
Date: Mon, 03 Dec 90 20:23:12 -0800
reprints available:
Dyer, M. G. Distributed symbol formation and processing
in connectionist networks. Journal of Experimental and Theoretical
Artificial Intelligence. Vol. 2, 215-239, 1990.
Abstract: Distributed connectionist (DC) systems offer a set of
processing features which are distinct from those provided by traditional
symbol processing (SP) systems. In general, the features of DC systems
are derived from the nature of their distributed representations. Such
representations have a microsemantics -- i.e. symbols with similar
internal representations tend to have similar processing effects. In
contrast, the symbols in SP systems have no intrinsic microsemantics of
their own; e.g. SP symbols are formed by concatenating ASCII codes that
are static, human engineered, and arbitrary. Such symbols possess only a
macrosemantics -- i.e. symbols are placed into structured relationships
with other symbols, via pointers, and bindings are propagated via
variables. The fact that DC and SP systems each provide a distinct set
of useful features serves as a strong research motivation for seeking a
synthesis. What is needed for such a synthesis is a method by which
symbols can dynamically form their own microsemantics, while at the same
time entering into structured, recursive relationships with other
symbols, thus also developing a macrosemantics. Here, we describe a
general method, called symbol recirculation, for allowing symbols to form
their own microsemantics. We then discuss three techniques for
implementing variables and bindings in DC systems. Finally, we describe
a number of DC systems, based on these techniques, which perform a
variety of high-level cognitive tasks.
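The contrast between a microsemantics and an arbitrary symbol code can be made
concrete with a toy comparison; the vectors below are invented for
illustration and are not drawn from any of the systems described in the paper.

import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# SP-style symbols: arbitrary codes, so "dog" is no closer to "wolf" than to "car".
one_hot = {"dog": [1, 0, 0, 0], "wolf": [0, 1, 0, 0], "car": [0, 0, 1, 0]}
# DC-style symbols: distributed patterns over micro-features, so related items overlap.
distributed = {"dog":  [0.9, 0.8, 0.1, 0.0],
               "wolf": [0.8, 0.9, 0.2, 0.1],
               "car":  [0.0, 0.1, 0.9, 0.8]}

for table, name in ((one_hot, "symbolic (SP)"), (distributed, "distributed (DC)")):
    print(name, " dog~wolf:", round(cosine(table["dog"], table["wolf"]), 2),
          " dog~car:", round(cosine(table["dog"], table["car"]), 2))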
requests for reprints should be sent to: valerie@cs.ucla.edu
------------------------------
Subject: left-out info: IJCNN-91-Seattle
From: Don Wunsch <dwunsch@blake.u.washington.edu>
Date: Mon, 03 Dec 90 23:29:44 -0800
>Hi,
>The date of the conference seems to be missing. Would you please post it on
>the network again? Thanks.
>Regards,
>Dr. Dit-Yan Yeung
Thanks, Dr. Yeung, for pointing out my oversight. I don't want to use up
all my permitted postings, so I'll just post the few most critical lines,
this time including the date.
IJCNN '91 Seattle, July 8-12, 1991
Papers must be RECEIVED by February 1, 1991.
Send Papers to:
IJCNN '91 Seattle
Conference Management,
Attn. Sarah Eck, MS/GH-22, Suite 108
5001 25th Ave. NE, Seattle, WA 98195
Tel (206) 543-0888 FAX (206) 685-9359
Registration for IJCNN-91-Seattle costs $195.00 for INNS and IEEE
members, $295.00 for non-members, and $50.00 for students at the early
rate (deadline March 1, 1991). Late registration, honored until June 1,
1991, is $295.00 for members, $395.00 for non-members, and $75.00 for
students. On-site registration is $395.00, $495.00, and $95.00
respectively.
Finally, another addition: anyone interested in volunteering should
contact me at dwunsch@blake.u.washington.edu as soon as possible.
Don
------------------------------
Subject: FTP archives; Technical report available
From: David Chalmers <dave@cogsci.indiana.edu>
Date: Tue, 04 Dec 90 19:17:03 -0500
(1) Following many requests, the bibliography that I have compiled on the
philosophy of mind/cognition/AI is now available by anonymous ftp from
cogsci.indiana.edu (129.79.238.6). It is contained in 5 files
chalmers.bib.* in the directory "pub". Also contained in this archive
are various articles by members of the Center for Research on Concepts
and Cognition. Instructions for retrieval are given below.
(2) The following technical report is now available.
THE EVOLUTION OF LEARNING: AN EXPERIMENT IN GENETIC CONNECTIONISM
David J. Chalmers
Center for Research on Concepts and Cognition
Indiana University
CRCC-TR-47
This paper explores how an evolutionary process can produce systems that
learn. A general framework for the evolution of learning is outlined,
and is applied to the task of evolving mechanisms suitable for supervised
learning in single-layer neural networks. Dynamic properties of a
network's information-processing capacity are encoded genetically, and
these properties are subjected to selective pressure based on their
success in producing adaptive behavior in diverse environments. As a
result of selection and genetic recombination, various successful
learning mechanisms evolve, including the well-known delta rule. The
effect of environmental diversity on the evolution of learning is
investigated, and the role of different kinds of emergent phenomena in
genetic and connectionist systems is discussed.
A version of this paper appears in _Proceedings of the 1990 Connectionist
Models Summer School_ (Touretzky, Elman, Sejnowski and Hinton, eds).
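A minimal sketch of this framework, with many assumed details, fits in a page
of Python: a genome encodes the coefficients of a local update rule for a
single-layer network, fitness is how well networks using that rule learn
randomly generated linearly separable tasks, and a small genetic algorithm
searches rule space. The particular terms in the rule, the task generator, and
the GA settings are assumptions for illustration; a genome whose weight falls
mainly on the x*(t - o) term is essentially the delta rule.

import random

TERMS = 5                      # coefficients for the local terms: x, o, t, x*o, x*(t - o)

def rule(genome, x, o, t):
    return 0.1 * (genome[0] * x + genome[1] * o + genome[2] * t +
                  genome[3] * x * o + genome[4] * x * (t - o))

def make_task():
    # A random linearly separable task over 4 binary inputs (boundary through the origin).
    true_w = [random.uniform(-1, 1) for _ in range(4)]
    data = []
    for _ in range(12):
        x = [random.choice([0.0, 1.0]) for _ in range(4)]
        t = 1.0 if sum(wi * xi for wi, xi in zip(true_w, x)) > 0 else 0.0
        data.append((x, t))
    return data

def output(w, x):
    return 1.0 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0.0

def fitness(genome, n_tasks=5, epochs=10):
    correct = 0
    for _ in range(n_tasks):
        data = make_task()
        w = [0.0] * 4
        for _ in range(epochs):
            for x, t in data:
                o = output(w, x)
                w = [wi + rule(genome, xi, o, t) for wi, xi in zip(w, x)]
        correct += sum(output(w, x) == t for x, t in data)
    return correct

pop = [[random.uniform(-1, 1) for _ in range(TERMS)] for _ in range(30)]
for gen in range(20):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(survivors, 2)
        cut = random.randrange(TERMS)
        child = a[:cut] + b[cut:]                                 # one-point crossover
        child[random.randrange(TERMS)] += random.gauss(0, 0.1)    # mutation
        children.append(child)
    pop = survivors + children
print("best evolved rule coefficients:", [round(c, 2) for c in pop[0]])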
-----------------------------------------------------------------------------
This paper may be retrieved by anonymous ftp from cogsci.indiana.edu
(129.79.238.6). The file is chalmers.evolution.ps.Z, in the directory
pub. To retrieve, do the following:
unix-1> ftp cogsci.indiana.edu # (or ftp 129.79.238.6)
Connected to cogsci.indiana.edu
Name (cogsci.indiana.edu:): anonymous
331 Guest login ok, sent ident as password.
Password: [identification]
230 Guest login ok, access restrictions apply.
ftp> cd pub
ftp> binary
ftp> get chalmers.evolution.ps.Z
ftp> quit
unix-2> uncompress chalmers.evolution.ps.Z
unix-3> lpr -P(your_local_postscript_printer) chalmers.evolution.ps
The file is also available from the Ohio State neuroprose archives by the
usual methods. If you do not have access to ftp, hardcopies may be
obtained by sending e-mail to dave@cogsci.indiana.edu.
------------------------------
End of Neuron Digest [Volume 6 Issue 71]
****************************************