Neuron Digest Sunday, 1 Dec 1991 Volume 8 : Issue 12
Today's Topics:
special deal for Connection Science
Graduate study in Cognitive & Neural Systems at Boston University
IJNS announcement & CFP
THINKNET NEWSLETTER ANNOUNCEMENT
MUSIC AND CONNECTIONISM Book Announcement
MIT NSL Reports on Adaptive Neurocontrol
NIPS Workshop Announcement (and CFP)
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: special deal for Connection Science
From: Lyn Shackleton <lyn@dcs.exeter.ac.uk>
Date: Fri, 11 Oct 91 13:56:49 +0000
********** CONNECTION SCIENCE SPECIAL ISSUE ******************
CONNECTIONIST MODELLING OF PSYCHOLOGICAL PROCESSES
VOLUME 3.2 (out now)
EDITOR
Noel Sharkey
SPECIAL BOARD
Jim Anderson
Andy Barto
Thomas Bever
Glyn Humphreys
Walter Kintsch
Dennis Norris
Kim Plunkett
Ronan Reilly
Dave Rumelhart
Antony Sanford
CONTENTS
J R Levenick:NAPS: a connectionist implementation of cognitive maps.
A Pouget & S J Thorpe: Connectionist models of orientation
identification.
D R Shanks: A connectionist account of base-rate biases in
categorization.
A J O'Toole, K Deffenbacher, H Abdi & J Bartlett: Simulating the
"Other-race effect" as a problem in perceptual learning.
S Kaplan, M Sonntag & E Chown: Tracing recurrent activity of cognitive
elements (TRACE): a model of temporal dynamics in a cell assembly.
Research Notes:
A H Kawamoto & S N Kitzis: Time course of regular and irregular pronunciations.
A VERY SPECIAL DEAL FOR MEMBERS OF THE CONNECTIONISTS MAILING LIST.
Prices for members of this list will now be:
North America 44 US Dollars (reduced from 126 dollars)
Elsewhere and U.K. 22 pounds sterling.
(Sterling checks must be drawn on a UK bank)
These rates start from 1st January 1992 (volume 4).
Conditions:
1. Personal use only (i.e. non-institutional).
2. Must subscribe from your private address.
You can receive a subscription form by emailing direct to the publisher:
email: carfax@ibmpcug.co.uk
Mark your message for the attention of David Green and mention the
CONNECTIONISTS MAILING LIST.
noel
------------------------------
Subject: Graduate study in Cognitive & Neural Systems at Boston University
From: dlukas@park.bu.edu (David Lukas)
Date: Thu, 17 Oct 91 12:58:42 -0500
(please post)
***********************************************
* *
* DEPARTMENT OF *
* COGNITIVE AND NEURAL SYSTEMS (CNS) *
* AT BOSTON UNIVERSITY *
* *
***********************************************
Stephen Grossberg, Chairman
The Boston University Department of Cognitive and Neural Systems offers
comprehensive advanced training in the neural and computational
principles, mechanisms, and architectures that underlie human and animal
behavior, and the application of neural network architectures to the
solution of outstanding technological problems.
Applications for Fall, 1992 admissions and financial aid are now being
accepted for both the MA and PhD degree programs.
To obtain a brochure describing the CNS Program and a set of application
materials, write or telephone:
Department of Cognitive & Neural Systems
Boston University
111 Cummington Street, Room 240
Boston, MA 02215
(617) 353-9481
or send a mailing address to: kellyd@cns.bu.edu
Applications for admission and financial aid should be received by
the Graduate School Admissions Office no later than January 15.
Applicants are required to submit undergraduate (and, if applicable,
graduate) transcripts, three letters of recommendation, and Graduate
Record Examination (GRE) scores. The Advanced Test should be in the
candidate's area of departmental specialization. GRE scores may be waived
for MA candidates and, in exceptional cases, for PhD candidates, but
absence of these scores may decrease an applicant's chances for admission
and financial aid.
Description of the CNS Department:
The Department of Cognitive and Neural Systems (CNS) provides advanced
training and research experience for graduate students interested in the
neural and computational principles, mechanisms, and architectures that
underlie human and animal behavior, and the application of neural network
architectures to the solution of outstanding technological problems.
Students are trained in a broad range of areas concerning cognitive and
neural systems, including vision and image processing; speech and
language understanding; adaptive pattern recognition; cognitive
information processing; self-organization; associative learning and
long-term memory; cooperative and competitive network dynamics and
short-term memory; reinforcement, motivation, and attention; adaptive
sensory-motor control and robotics; and biological rhythms; as well as
the mathematical and computational methods needed to support advanced
modeling research and applications. The CNS Department awards MA, PhD,
and BA/MA degrees.
The CNS Department embodies a number of unique features. It has developed
a core curriculum that consists of ten interdisciplinary graduate courses
each of which integrates the psychological, neurobiological,
mathematical, and computational information needed to theoretically
investigate fundamental issues concerning mind and brain processes and
the applications of neural networks to technology. Additional advanced
courses, including research seminars, are also offered. Each course is
typically taught once a week in the evening to make the program available
to qualified students, including working professionals, throughout the
Boston area. Students develop a coherent area of expertise by designing a
program that includes courses in areas such as Biology, Computer Science,
Engineering, Mathematics, and Psychology, in addition to courses in the
CNS core curriculum.
The CNS Department prepares students for thesis research with scientists
in one of several Boston University research centers or groups, and with
Boston-area scientists collaborating with these centers. The unit most
closely linked to the department is the Center for Adaptive Systems. The
Center for Adaptive Systems is also part of the Boston Consortium for
Behavioral and Neural Studies, a Boston-area multi-institutional
Congressional Center of Excellence. Another multi-institutional
Congressional Center of Excellence focused at Boston University is the
Center for the Study of Rhythmic Processes. Other research resources
include distinguished research groups in neurophysiology, neuroanatomy,
and neuropharmacology at the Medical School and the Charles River campus;
in sensory robotics, biomedical engineering, computer and systems
engineering, and neuromuscular research within the Engineering School; in
dynamical systems within the mathematics department; in theoretical
computer science within the Computer Science Department; and in
biophysics and computational physics within the Physics Department.
1991 FACULTY and STAFF of CNS and CAS:
Daniel H. Bullock        Nancy Kopell
Gail A. Carpenter        John W.L. Merrill
Michael A. Cohen         Ennio Mingolla
H. Steven Colburn        Alan Peters
Paolo Gaudiano           Adam Reeves
Stephen Grossberg        James T. Todd
Thomas G. Kincaid        Allen Waxman
------------------------------
Subject: IJNS announcement & CFP
From: BRUNAK@nbivax.nbi.dk
Date: Mon, 21 Oct 91 14:51:00 +0000
INTERNATIONAL JOURNAL OF NEURAL SYSTEMS
The International Journal of Neural Systems is a quarterly journal
which covers information processing in natural and artificial neural
systems. It publishes original contributions on all aspects of this
broad subject which involves physics, biology, psychology, computer
science and engineering. Contributions include research papers, reviews
and short communications. The journal takes a fresh, undogmatic
attitude towards this multidisciplinary field, with the aim of being a
forum for novel ideas and improved understanding of collective and
cooperative phenomena with computational capabilities.
ISSN: 0129-0657 (IJNS)
----------------------------------
Contents of Volume 2, issue number 3 (1991):
1. D.G. Stork:
Sources of neural structure in speech and language processing.
2. L. Xu, A. Krzyzak and E. Oja:
Neural nets for the dual subspace pattern recognition method.
3. P.J. Zwietering, E.H.L. Aarts and J. Wessels:
The design and complexity of exact multi-layered perceptrons.
4. M.M. van Hulle:
A goal programming network for mixed integer linear programming:
A case study for the job-shop scheduling problem.
5. J-X. Wu and C. Chan:
A three-layered adaptive network for pattern density estimation
and classification.
6. L. Garrido and V. Gaitan:
Use of neural nets to measure the tau-polarisation and its
Bayesian interpretation.
7. C.M. Bishop:
A fast procedure for retraining the multilayer perceptron.
8. V. Menon and D.S. Tang:
Population oscillations in neuronal groups.
9. V. Rodrigues and J. Skrzypek:
Combining similarities and dissimilarities in supervised learning.
----------------------------------
Editorial board:
B. Lautrup (Niels Bohr Institute, Denmark) (Editor-in-Charge)
S. Brunak (Technical Univ. of Denmark) (Assistant Editor-in-Charge)
D. Stork (Stanford) (Book review editor)
Associate editors:
B. Baird (Berkeley)
D. Ballard (University of Rochester)
E. Baum (NEC Research Institute)
S. Bjornsson (University of Iceland)
J. M. Bower (CalTech)
S. S. Chen (University of North Carolina)
R. Eckmiller (University of Dusseldorf)
J. L. Elman (University of California, San Diego)
M. V. Feigelman (Landau Institute for Theoretical Physics)
F. Fogelman-Soulie (Paris)
K. Fukushima (Osaka University)
A. Gjedde (Montreal Neurological Institute)
S. Grillner (Nobel Institute for Neurophysiology, Stockholm)
T. Gulliksen (University of Oslo)
D. Hammerstrom (Oregon Graduate Institute)
D. Horn (Tel Aviv University)
J. Hounsgaard (University of Copenhagen)
B. A. Huberman (XEROX PARC)
L. B. Ioffe (Landau Institute for Theoretical Physics)
P. I. M. Johannesma (Katholieke Univ. Nijmegen)
M. Jordan (MIT)
G. Josin (Neural Systems Inc.)
I. Kanter (Princeton University)
J. H. Kaas (Vanderbilt University)
A. Lansner (Royal Institute of Technology, Stockholm)
A. Lapedes (Los Alamos)
B. McWhinney (Carnegie-Mellon University)
M. Mezard (Ecole Normale Superieure, Paris)
J. Moody (Yale, USA)
A. F. Murray (University of Edinburgh)
J. P. Nadal (Ecole Normale Superieure, Paris)
E. Oja (Lappeenranta University of Technology, Finland)
N. Parga (Centro Atomico Bariloche, Argentina)
S. Patarnello (IBM ECSEC, Italy)
P. Peretto (Centre d'Etudes Nucleaires de Grenoble)
C. Peterson (University of Lund)
K. Plunkett (University of Aarhus)
S. A. Solla (AT&T Bell Labs)
M. A. Virasoro (University of Rome)
D. J. Wallace (University of Edinburgh)
D. Zipser (University of California, San Diego)
--------------------
CALL FOR PAPERS
Original contributions consistent with the scope of the journal are
welcome. Complete instructions as well as sample copies and
subscription information are available from
The Editorial Secretariat, IJNS
World Scientific Publishing Co. Pte. Ltd.
73, Lynton Mead, Totteridge
London N20 8DH
ENGLAND
Telephone: (44)81-446-2461
or
World Scientific Publishing Co. Inc.
687 Hardwell St.
Teaneck
New Jersey 07666
USA
Telephone: (1)201-837-8858
or
World Scientific Publishing Co. Pte. Ltd.
Farrer Road, P. O. Box 128
SINGAPORE 9128
Telephone (65)382-5663
------------------------------
Subject: THINKNET NEWSLETTER ANNOUNCEMENT
From: Kent D Palmer <palmer@world.std.com>
Date: Wed, 23 Oct 91 02:25:10 -0500
[[ Editor's Note: Again, this "publication" may be at the fringe of
Neural Networks, but may appeal to some of the Digest's readership. -PM ]]
|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
||||||| PLEASE POST ----- NEWSLETTER ANNOUNCEMENT ||||||||
|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
/| ....... .. .. . . . .
.==|........ ... .. .... . .... ..
._____. . * . . / ===|_ _. ..______________________________......
| | | | |\ | / ======== |\ ...| .... |.THINKNET:An Electronic....
| |---| | | \ |< ========== |. \ .|---- . |.Journal Of Philosophy,...
| | | | | \| \ ======== |... \| ..... |.Meta-Theory, And Other..
| | | | | | \ ====== |.... |____.. |.Thoughtful Discussions....
.==| ........ .. .... .. ... .. .
\| .... ... .. .. . . .. . .
-------------------------------------------------------------------------------
OCTOBER 1991 ISSUE 001 VOLUME 1 NUMBER 1
-------------------------------------------------------------------------------
This is an announcement for Thinknet, an on-line magazine forum dedicated
to thoughtfulness in the cybertime environment. Thinknet covers
philosophy, systems theory, and meta-theoretical discussions within
disciplines. It is your interdisciplinary window onto the significant
information sources available to foster thought-provoking discussion.
*CONTENTS*
Publication Data
Scope of newsletter.
Rationale for newsletter.
Subscriptions and Submittals address.
Bulletin Boards where it may be found.
Services offered by newsletter.
Staff of this edition.
Coda: call for participation.
About Thinknet
Discussion of goals of Thinknet Newsletter.
Prospect for Philosophy and Systems Theory in Cybertime
Is there a possibility for a renaissance for philosophy?
The Philosophy Category on GEnie
Review by Gordon Swobe with list of topics.
Philosophy on the WELL
Review by Jeff Dooley with list of topics.
Origin Conference on the WELL
Review by Bruce Schuman with list of topics
Internet Philosophy Mailing Lists
A review of all known philosophy-oriented mailing lists by Stephen Clark.
Books Of Note
THE MATRIX
!%@:: A DIRECTORY OF ELECTRONIC MAIL ADDRESSING & NETWORKS
Other Publications
BOARDWATCH MAGAZINE
SOFTWARE ENGINEERING FOUNDATIONS [a work in progress]
Books, Electronic Newsletters, and Cyber-Artifacts Received
ARTCOM NEWSLETTER
FACTSHEET FIVE
Protocols for Meaningful Discussions: ARTICLE by Kent Palmer
A consideration of how philosophy discussions might be made more
useful and their history accessible by using a voluntary protocol.
Thoughtful Communications: EDITORIAL
Closing remarks.
<<<<<<<<<<<<Thinknet Electronic Newsletter (c) 1991 Kent Palmer.>>>>>>>>>>>>>
-------------------------------------------------------------------------------
HOW TO GET YOUR COPY kdp
-------------------------------------------------------------------------------
*Price*
The electronic form is FREE.
Hardcopies cost money for reproduction, postage, and handling.
*Subscriptions*
Send an e-mail message to the following address:
thinknet@world.std.com
Your message should be of the following form:
SEND THINKNET TO YourFullName AT YourEmailAddress
Some mailing lists do not include your return address if you use the
reply function of your mail reader, so make sure your return e-mail
address appears in the body of your message.
The Thinknet file is long: about 1113 lines, 7136 words, 51795 bytes.
You will be added to the thinknet subscription list. You will get all
further issues unless you unsubscribe.
*Bulletin Boards*
Thinknet will be posted in the WELL philosophy conference in a topic.
The WELL
27 Gate Five Road, Sausalito, CA 94965
modem 415-332-6106
voice 415-332-4335
Also on GEnie in the Philosophy category under the Religion and Ethics
Bulletin Board.
GEnie Client Services 1-800-638-9636
*PHILOS-L Listserver*
You will eventually be able to get the thinknet newsletter from a listserver.
Send the message 'GET THINKNET DOC' to 'LISTSERV@LIVERPOOL.AC.UK'.
If you get an error message try the regular thinknet address.
*Or if all else fails*
THINKNET
PO BOX 8383
ORANGE CA 92664-8383
UNITED STATES
------------------------------
Subject: MUSIC AND CONNECTIONISM Book Announcement
From: todd@galadriel.stanford.edu
Date: Fri, 25 Oct 91 14:50:47 -0800
BOOK ANNOUNCEMENT:
MUSIC AND CONNECTIONISM
edited by
Peter M. Todd and D. Gareth Loy
MUSIC AND CONNECTIONISM is now available from MIT Press. This 280-pp.
book contains a wide variety of recent research in the applications of
neural networks and other connectionist methods to the problems of
musical listening and understanding, performance, composition, and
aesthetics. It consists of a core of articles that originally appeared
in the Computer Music Journal, along with several new articles by
Kohonen, Mozer, Bharucha, and others, and new addenda to the original
articles describing the authors' most recent work. Topics covered range
from models of psychological processing of pitches, chords, and melodies,
to algorithmic composition and performance factors. A wide variety of
connectionist models are employed as well, including back-propagation in
time, Kohonen feature maps, ART networks, and Jordan- and Elman-style
networks. We've also included a discussion generated by the Computer
Music Journal articles on the use and place of connectionist systems in
artistic endeavors. A more detailed description of the book is provided
below (from the jacket text), along with the complete table of contents.
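To give a flavor of one of the model families mentioned above, here is a
minimal sketch, in Python, of an Elman-style (simple recurrent) network
trained to guess the next pitch in a short melody. The toy melody, the
one-hot pitch coding, and every size and constant are invented for
illustration and are not taken from the book; only the output weights are
adapted, for brevity, so the random input and recurrent weights act as a
fixed "reservoir".

import numpy as np

# Minimal Elman-style (simple recurrent) network for next-pitch prediction.
# Toy melody and all dimensions are illustrative, not taken from the book.

rng = np.random.default_rng(1)
melody = [0, 2, 4, 5, 4, 2, 0, 2, 4, 5, 7, 5, 4, 2, 0]   # made-up pitch indices
n_pitch, n_hidden, lr = 8, 16, 0.1

def one_hot(i):
    v = np.zeros(n_pitch)
    v[i] = 1.0
    return v

W_in = rng.normal(0.0, 0.5, (n_hidden, n_pitch))
W_rec = rng.normal(0.0, 0.5, (n_hidden, n_hidden))
W_out = rng.normal(0.0, 0.1, (n_pitch, n_hidden))

for epoch in range(500):
    h = np.zeros(n_hidden)                               # context units start at zero
    for cur, nxt in zip(melody[:-1], melody[1:]):
        h = np.tanh(W_in @ one_hot(cur) + W_rec @ h)     # previous h is the context input
        z = W_out @ h
        p = np.exp(z - z.max())
        p /= p.sum()                                     # softmax guess at the next pitch
        W_out -= lr * np.outer(p - one_hot(nxt), h)      # output-weight gradient step only

# How often does the trained net pick the actual next pitch in the toy melody?
h, correct = np.zeros(n_hidden), 0
for cur, nxt in zip(melody[:-1], melody[1:]):
    h = np.tanh(W_in @ one_hot(cur) + W_rec @ h)
    correct += int(np.argmax(W_out @ h) == nxt)
print(f"next-pitch accuracy on the toy melody: {correct}/{len(melody) - 1}")
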
We hope this book will be of use to a wide variety of readers, including
neural network researchers interested in a broad, challenging, and fun
new area of application, cognitive scientists and music psychologists
looking for robust new models of musical behavior, and artists seeking to
learn more about a potentially very useful technology.
MUSIC AND CONNECTIONISM can be found in bookstores that carry MIT Press
publications, or can be purchased directly from MIT Press by calling
their toll-free order number, 1-800-356-0343, and giving the operator
this catalog number: 1CSAT 503, and this book code: TODMH. By phone and
mail-order, the price is $39.95; in stores, it will probably be $45
(there is some confusion with the publisher on this point, so I wanted to
give out the detailed information for phone orders to save people some
money).
Please drop me a line if you have any questions, and especially if you
take up the gauntlet and pursue research or applications in this area!
cheers,
peter todd
*****************************************************************************
Music and Connectionism
edited by Peter M. Todd and D. Gareth Loy
As one of our highest expressions of thought and creativity, music has
always been a difficult realm to capture, model, and understand. The
connectionist paradigm, now beginning to provide insights into many
realms of human behavior, offers a new and unified viewpoint from which
to investigate the subtleties of musical experience. Music and
Connectionism provides a fresh approach to both fields, using
techniques of connectionism and parallel distributed processing to look
at a wide range of topics in music research, from pitch perception to
chord fingering to composition.
The contributors, leading researchers in both music psychology and neural
networks, address the challenges and opportunities of musical
applications of network models. The result is a current and thorough
survey that advances our understanding of musical perception, cognition,
composition, and performance and of the design and analysis of networks.
Music and Connectionism is based on a core of articles originally
appearing as two special issues of the Computer Music Journal. These
have been augmented with addenda covering more recent research by the
authors. The book opens with tutorial chapters introducing neural
networks in a musical context and relevant aspects of previous computer
music research, making this a self-contained text. There are many new
chapters, along with new section introductions, summaries of related
work, and a final debate on the artistic implications of connectionist
methods.
Peter M. Todd is a doctoral candidate in the PDP Research Group of the
Psychology Department at Stanford University. Gareth Loy DMA is an
award-winning composer, member of the Board of Directors of the Computer
Music Association, lecturer in the Music Department of UC San Diego, and
member of the technical staff of Frox Inc.
Contents:
Preface and Introduction
Peter M. Todd and D. Gareth Loy
Part 1: Background
Machine Tongues XII: Neural Networks
Mark Dolson
Connectionism and Musiconomy
D. Gareth Loy
Part 2: Perception and Cognition
A Neural Net Model for Pitch Perception
Hajime Sano and B. Keith Jenkins
Connectionist Models for Tonal Analysis
Don L. Scarborough, Ben O. Miller, and Jacqueline A. Jones
The Representation of Pitch in a Neural Net Model of Chord Classification
Bernice Laden and Douglas H. Keefe
Pitch, Harmony, and Neural Nets: A Psychological Perspective
Jamshed J. Bharucha
The Ontogenesis of Tonal Semantics: Results of a Computer Study
Marc Leman
Modeling the Perception of Tonal Structure with Neural Nets
Jamshed J. Bharucha and Peter M. Todd
Using Connectionist Models to Explore Complex Musical Patterns
Robert O. Gjerdingen
The Quantization of Musical Time: A Connectionist Approach
Peter Desain and Henkjan Honing
Part 3: Applications
A Connectionist Approach to Algorithmic Composition
Peter M. Todd
Connectionist Music Composition Based on Melodic, Stylistic, and
Psychophysical Constraints
Michael C. Mozer
Creation By Refinement and the Problem of Algorithmic Music Composition
J.P. Lewis
A Nonheuristic Automatic Composing Method
Teuvo Kohonen, Pauli Laine, Kalev Tiits, and Kari Torkkola
Fingering for String Instruments with the Optimum Path Paradigm
Samir I. Sayegh
Part 4: Conclusions
Letter from Otto Laske
Responses to Laske by Todd and Loy
Further Research and Directions
Peter M. Todd
List of Author Addresses
------------------------------
Subject: MIT NSL Reports on Adaptive Neurocontrol
From: Rob Sanner <rob@tlon.mit.edu>
Date: Mon, 11 Nov 91 17:00:15 -0500
The following are the titles and abstracts of three reports we
have uploaded to the neuroprose archive. Due to a large number of
recent requests for hardcopy reprints, these reports have now been
made available electronically. They can also be obtained (under their
NSL reference number) by anonymous ftp at tlon.mit.edu in the pub
directory.
These reports describe the results of research conducted at
the MIT Nonlinear Systems Laboratory during the past year into
algorithms for the stable adaptive tracking control of nonlinear
systems using gaussian radial basis function networks.
These papers are potentially interesting to researchers in
both adaptive control and neural network theory. The research
described starts by quantifying the relation between the network size
and weights and the degree of uniform approximation accuracy a trained
network can guarantee. On this basis, it develops a _constructive_
procedure for networks which ensures the required accuracy. These
constructions are then exploited for the design of stable adaptive
controllers for nonlinear systems.
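As a rough, self-contained illustration of the kind of network these
reports analyze, the sketch below builds a gaussian radial basis function
network with regularly spaced centers and fits its output weights to a
smooth test function by least squares. The grid spacing, width, and test
function are arbitrary choices made here for illustration; the reports
themselves derive the network size and spacing from assumed smoothness
properties of the function to be approximated.

import numpy as np

# Minimal sketch of a gaussian radial basis function (RBF) network on [-1, 1].
# Grid spacing, width, and test function are arbitrary illustrative choices.

centers = np.linspace(-1.0, 1.0, 21)          # regularly spaced RBF centers
sigma = 0.15                                  # common width for all gaussians

def phi(x):
    """Vector of gaussian basis activations at scalar input x."""
    return np.exp(-(x - centers) ** 2 / (2.0 * sigma ** 2))

def f_target(x):
    """Example smooth function to approximate (made up for this sketch)."""
    return np.sin(3.0 * x) + 0.5 * x ** 2

# Fit the output weights by least squares on a dense sample of the domain.
xs = np.linspace(-1.0, 1.0, 400)
Phi = np.array([phi(x) for x in xs])          # design matrix, shape (400, 21)
w, *_ = np.linalg.lstsq(Phi, f_target(xs), rcond=None)

print("max |approximation error| on [-1, 1]:",
      np.max(np.abs(Phi @ w - f_target(xs))))
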
Any comments would be greatly appreciated and can be sent to
either rob@tlon.mit.edu or jjs@athena.mit.edu.
Robert M. Sanner and Jean-Jacques E. Slotine
------------------------------------------------------------------------------
on neuroprose: sanner.adcontrol_9103.ps.Z
(NSL-910303, March 1991)
Also appears: Proc. American Control Conference, June 1991.
Direct Adaptive Control Using Gaussian Networks
Robert M. Sanner and Jean-Jacques E. Slotine
Abstract:
A direct adaptive tracking control architecture is proposed
and evaluated for a class of continuous-time nonlinear dynamic systems
for which an explicit linear parameterization of the uncertainty in
the dynamics is either unknown or impossible. The architecture
employs a network of gaussian radial basis functions to adaptively
compensate for the plant nonlinearities. Under mild assumptions about
the degree of smoothness exhibited by the nonlinear functions, the
algorithm is proven to be stable, with tracking errors converging to a
neighborhood of zero.
A constructive procedure is detailed, which directly
translates the assumed smoothness properties of the nonlinearities
involved into a specification of the network required to represent the
plant to a chosen degree of accuracy. A stable weight adjustment
mechanism is then determined using Lyapunov theory.
The network construction and performance of the resulting
controller are illustrated through simulations with an example system.
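For readers new to this style of controller, the following toy simulation
(invented here, not the paper's algorithm or example system) shows the
general shape of such a scheme: a first-order plant with an unknown smooth
nonlinearity tracks a reference trajectory, a gaussian network compensates
for the nonlinearity, and its weights are adjusted online with a
Lyapunov-motivated gradient law. The plant, gains, and network parameters
are all illustrative assumptions.

import numpy as np

# Toy direct adaptive tracking controller in the spirit of the abstract above.
# Plant:      x_dot = f(x) + u, with f unknown to the controller.
# Control:    u = xd_dot - k*e - w_hat . phi(x), where e = x - xd.
# Adaptation: w_hat_dot = gamma * e * phi(x)   (Lyapunov-motivated gradient law).

centers = np.linspace(-2.0, 2.0, 25)
sigma = 0.25
phi = lambda x: np.exp(-(x - centers) ** 2 / (2.0 * sigma ** 2))

f_true = lambda x: -np.sin(2.0 * x) + 0.3 * x ** 2   # invented plant nonlinearity
xd = lambda t: np.sin(t)                             # reference trajectory
xd_dot = lambda t: np.cos(t)

k, gamma, dt = 5.0, 10.0, 1e-3
x, w_hat = 0.5, np.zeros_like(centers)

for step in range(int(20.0 / dt)):                   # 20 seconds of Euler integration
    t = step * dt
    e = x - xd(t)
    u = xd_dot(t) - k * e - w_hat @ phi(x)           # cancel the estimate of f(x)
    x += dt * (f_true(x) + u)                        # plant update
    w_hat += dt * gamma * e * phi(x)                 # online weight adjustment

print("tracking error after 20 s:", abs(x - xd(20.0)))
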
-----------------------------------------------------------------------------
on neuroprose: sanner.adcontrol_9105.ps.Z
(NSL-910503, May 1991)
Gaussian Networks for Direct Adaptive Control
Robert M. Sanner and Jean-Jacques E. Slotine
Abstract:
This report is a complete and formal exploration of the ideas
originally presented in NSL-910303; as such it contains most of
NSL-910303 as a subset.
We detail a constructive procedure for a class of neural
networks which can approximate to a prescribed accuracy the functions
required for satisfaction of the control objectives. Since this
approximation can be maintained only over a finite subset of the plant
state space, to ensure global stability it is necessary to introduce
an additional component into the control law, which is capable of
stabilizing the dynamics as the neural approximation degrades. To
unify these components into a single control law, we propose a novel
technique of smoothly blending the two modes to provide a continuous
transition from adaptive operation in the region of validity of the
network approximation, to a nonadaptive operation in the regions where
this approximation is inaccurate. Stable adaptation mechanisms are
then developed using Lyapunov stability theory.
Section 2 describes the setting of the control problem to be
examined and illustrates the structure of conventional adaptive
methods for its solution. Section 3 introduces the use of
multivariable Fourier analysis and sampling theory as a method of
translating assumed smoothness properties of the plant nonlinearities
into a representation capable of uniformly approximating the plant
over a compact set. This section then discusses the conditions under
which these representations can be mapped onto a neural network with a
finite number of components. Section 4 illustrates how these networks
may be used as elements of an adaptive tracking control algorithm for
a class of nonlinear systems, which will guarantee convergence of the
tracking errors to a neighborhood of zero. Section 5 illustrates the
method with two examples, and finally, Section 6 closes with some
general observations about the proposed controller.
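To make the blending idea concrete, here is a small invented example (not
the report's construction) of a smooth modulation factor that equals one
deep inside the region where the network approximation is trusted, zero
well outside it, and varies continuously in between; it interpolates
between an adaptive term and a fallback stabilizing term.

import numpy as np

# Invented example of smoothly blending an adaptive control term (trusted only
# where the network approximation is valid, here |x| <= 1) with a fallback
# stabilizing term used outside that region.

def modulation(x, r_inner=1.0, r_outer=1.5):
    """1 inside |x| <= r_inner, 0 outside |x| >= r_outer, smooth in between."""
    d = np.clip((abs(x) - r_inner) / (r_outer - r_inner), 0.0, 1.0)
    return 0.5 * (1.0 + np.cos(np.pi * d))           # cosine taper: continuous blend

def blended_control(x, u_adaptive, u_fallback):
    m = modulation(x)
    return m * u_adaptive + (1.0 - m) * u_fallback

for x in [0.0, 0.9, 1.2, 1.6]:                       # arbitrary example terms below
    print(f"x = {x}: m = {modulation(x):.3f}, "
          f"u = {blended_control(x, u_adaptive=-1.0, u_fallback=-5.0):.3f}")

The cosine taper is just one convenient smooth choice; any continuous
function that is one inside the trusted region and zero outside it would
serve the same illustrative purpose.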
-----------------------------------------------------------------------------
on neuroprose: sanner.adcontrol_9109.ps.Z
(NSL-910901, Sept. 1991)
To appear: IEEE Conf. on Decision and Control, Dec. 1991.
Stable Adaptive Control and Recursive Identification
Using Radial Gaussian Networks
Robert M. Sanner and Jean-Jacques E. Slotine
Abstract:
Previous work has provided the theoretical foundations of a
constructive design procedure for uniform approximation of smooth
functions to a chosen degree of accuracy using networks of gaussian
radial basis functions. This construction and the guaranteed uniform
bounds were then shown to provide the basis for stable adaptive
neurocontrol algorithms for a class of nonlinear plants.
This paper details and extends these ideas in three
directions: first some practical details of the construction are
provided, explicitly illustrating the relation between the free
parameters in the network design and the degree of approximation error
on a particular set. Next, the original adaptive control algorithm is
modified to permit incorporation of additional prior knowledge of the
system dynamics, allowing the neurocontroller to operate in parallel
with conventional fixed or adaptive controllers. Finally, it is shown
how the gaussian network construction may also be utilized in
recursive identification algorithms with similar guarantees of
stability and convergence. The identification algorithm is
evaluated on a chaotic time series and demonstrates the predicted
convergence properties.
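As a loose illustration of the identification result, the sketch below
fits a gaussian radial basis function predictor to a chaotic map one
sample at a time with a normalized gradient update. The logistic map,
network width, and gain are stand-ins chosen here for illustration, not
the series or algorithm used in the paper.

import numpy as np

# Illustrative recursive one-step-ahead identification of a chaotic map with a
# gaussian RBF predictor (the logistic map is a stand-in chosen for this sketch).

centers = np.linspace(0.0, 1.0, 30)
sigma = 0.05
phi = lambda x: np.exp(-(x - centers) ** 2 / (2.0 * sigma ** 2))

logistic = lambda x: 3.9 * x * (1.0 - x)             # chaotic regime of the logistic map

w = np.zeros_like(centers)
x, eta, errors = 0.3, 0.5, []
for _ in range(2000):
    x_next = logistic(x)                             # observe the next sample
    p = phi(x)
    err = x_next - w @ p                             # one-step prediction error
    w += eta * err * p / (1.0 + p @ p)               # normalized gradient (LMS-style) update
    errors.append(abs(err))
    x = x_next

print("mean |prediction error| over the last 200 steps:", np.mean(errors[-200:]))
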
------------------------------
Subject: NIPS Workshop Announcement (and CFP)
From: David Cohn <pablo@cs.washington.edu>
Date: Tue, 19 Nov 91 09:42:41 -0800
--------------------------------------------------------------
NIPS Workshop on Active Learning and Control
Announcement (and call for participation)
--------------------------------------------------------------
organizers: David Cohn, University of Washington
Don Sofge, MIT AI Lab
An "active" learning system is one that is not merely a passive
observer of its environment, but instead plays an active role in
determining its inputs. This definition includes classification
networks that query for values in "interesting" parts of their domain,
learning systems that actively "explore" their environment, and
adaptive controllers that learn how to produce control outputs to
achieve a goal.
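As one concrete, entirely illustrative instance of the first kind of
system, a learner that queries in "interesting" parts of its domain, the
sketch below maintains a small committee of polynomial models and requests
the label of the candidate point on which the committee disagrees most.
The target function, the polynomial committee, and all sizes are invented;
this is a generic query-by-disagreement heuristic, not a method proposed
by any of the workshop speakers.

import numpy as np

# Illustrative active-learning loop: request the label of the candidate point
# on which a small committee of models disagrees most.  Everything here is a
# made-up stand-in for whatever learner and domain are actually of interest.

rng = np.random.default_rng(0)
target = lambda x: np.sin(4.0 * x)                   # "oracle" the learner may query
pool = list(np.linspace(0.0, 1.0, 200))              # candidate query locations

xs = list(rng.uniform(0.0, 1.0, 3))                  # small initial labeled set
ys = [target(x) for x in xs]

for _ in range(15):
    # Committee: polynomial fits of several different degrees to the same data.
    max_deg = min(4, len(xs) - 1)
    committee = [np.polyfit(xs, ys, d) for d in range(1, max_deg + 1)]
    preds = np.array([np.polyval(c, pool) for c in committee])
    disagreement = preds.std(axis=0)                 # high where the models differ
    x_query = pool.pop(int(np.argmax(disagreement))) # query the most contentious point
    xs.append(x_query)
    ys.append(target(x_query))

print("points queried by the learner:", np.round(sorted(xs), 3))
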
Common facets of these problems include building world models in
complex domains, exploring a domain safely and efficiently, and
planning future actions based on one's model.
In this workshop, our main focus will be on key unsolved problems that may
be holding up progress, rather than on presenting polished, finished
results. Our hope is that unsolved problems in one field may draw on
insights from research in other fields.
Each session of the workshop will begin with introductions to specific
problems in the field by researchers in each area, with the second half
of each session reserved for discussion.
---------------------------------------------------------------------------
Current speakers include:
Chris Atkeson, MIT AI Lab
Tom Dietterich, Oregon State Univ.
Michael Jordan, MIT Brain & Cognitive Sciences
Michael Littman, BellCore
Andrew Moore, MIT AI Lab
Jurgen Schmidhuber, Univ. of Colorado, Boulder
Satinder Singh, UMass Amherst
Rich Sutton, GTE
Sebastian Thrun, Carnegie-Mellon University
A few open slots remain, so if you would be interested in discussing
your "key unsolved problem" in active learning, exploration, planning
or control, send email to David Cohn (pablo@cs.washington.edu) or Don
Sofge (sofge@ai.mit.edu).
---------------------------------------------------------------------------
Friday,   12/6   Morning     Active Learning
                 Afternoon   Learning Control
Saturday, 12/7   Morning     Active Exploration
                 Afternoon   Planning
---------------------------------------------------------------------------
------------------------------
End of Neuron Digest [Volume 8 Issue 12]
****************************************