Neuron Digest Monday, 30 Jan 1989 Volume 5 : Issue 6
Today's Topics:
Administrivia
Tech Report Announcement
Connectionist Concepts: BBS Call for Commentators
ICSI talk - Two Decades of Applied Kolmogorov Complexity
Stanford Adaptive Networks Colloquium
ICSI talk - Structured Representations
CALL FOR PARTICIPATION
ICSI talk - Efficient associative storage
Call for papers for IJCNN
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
ARPANET users can get old issues via ftp from hplpm.hpl.hp.com (15.255.16.205).
------------------------------------------------------------
Subject: Administrivia
From: "Your local Moderator" <neuron-request@hplabs.hp.com>
Date: Mon, 31 Jan 89 20:00:00 -0800
[[This issue is devoted to talk announcements, calls for participation and
so on. Astute readers will note that some of the dates are rather old.
I've been experiencing some local mailer difficulties, plus the usual hectic
schedule at the beginning of a new semester, which combined to delay this
issue. At this rate, I may never catch up!
As usual, I include talk announcements, even though they are past, since
they let readers know who's doing what in the field.
For those of you who have requested back issues by mail... you haven't been
forgotten. I appreciate your patience. For those who access back issues
via ftp... please do it in the extreme off hours or on Mon/Wed/Fri mornings;
you are accessing my personal workstation and ftp gets pretty low priority
when I'm doing something else.
Finally, if you are missing issues, please let me know. Vol4 ended
(arbitrarily) at #34.
-Peter Marvit ]]
------------------------------
Subject: Tech Report Announcement
From: eric@mcc.com (Eric Hartman)
Date: Mon, 02 Jan 89 16:02:06 -0600
The following MCC Technical Report is now available.
Requests may be sent to
eric@mcc.com
or
Eric Hartman
Microelectronics and Computer Technology Corporation
3500 West Balcones Center Drive
Austin, TX 78759-6509
U.S.A.
-----------------------------------------------------------------
Explorations of the Mean Field Theory Learning Algorithm
Carsten Peterson* and Eric Hartman
Microelectronics and Computer Technology Corporation
3500 West Balcones Center Drive
Austin, TX 78759-6509
MCC Technical Report Number: ACA-ST/HI-065-88
Abstract:
The mean field theory (MFT) learning algorithm is elaborated and explored
with respect to a variety of tasks. MFT is benchmarked against the back
propagation learning algorithm (BP) on two different feature recognition
problems: two-dimensional mirror symmetry and eight-dimensional statistical
pattern classification. We find that while the two algorithms are very
similar with respect to generalization properties, MFT normally requires a
substantially smaller number of training epochs than BP. Since the MFT
model is bidirectional, rather than feed-forward, its use can be extended
naturally from purely functional mappings to a content addressable memory.
A network with N visible and N hidden units can store up to approximately 2N
patterns with good content-addressability. We stress an implementational
advantage for MFT: it is natural for VLSI circuitry. Also, its inherent
parallelism can be exploited with fully synchronous updating, allowing
efficient simulations on SIMD architectures.
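The synchronous settling dynamics the abstract alludes to can be sketched
compactly. The following toy is illustrative only (not code from the report);
all names, sizes, and parameters are invented. It implements the standard
deterministic mean field update v = tanh(W v / T), applied to all units at
once, which is the fully synchronous scheme mentioned above:

```python
import numpy as np

def mft_settle(W, v, T=1.0, n_steps=50):
    """Deterministic mean-field settling: each stochastic binary unit
    is replaced by its expected value, updated fully synchronously
    as v = tanh(W @ v / T)."""
    for _ in range(n_steps):
        v = np.tanh(W @ v / T)   # one synchronous sweep over all units
    return v

rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.3, size=(n, n))
W = (W + W.T) / 2            # symmetric weights, Boltzmann-machine style
np.fill_diagonal(W, 0.0)     # no self-connections
v = mft_settle(W, rng.uniform(-0.1, 0.1, size=n), T=0.5)
print(v)                     # settled mean activations, each in (-1, 1)
```

The synchronous sweep is what maps naturally onto SIMD hardware: each unit's
new value depends only on the previous state vector, so all units can be
updated in parallel.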
*Present Address: Department of Theoretical Physics
University of Lund
Solvegatan 14A, S-22362 Lund, Sweden
------------------------------
Subject: Connectionist Concepts: BBS Call for Commentators
From: harnad@Princeton.EDU (Stevan Harnad)
Date: Wed, 04 Jan 89 10:12:06 -0500
Below is the abstract of a forthcoming target article to appear in
Behavioral and Brain Sciences (BBS), an international, interdisciplinary
journal that provides Open Peer Commentary on important and controversial
current research in the biobehavioral and cognitive sciences. Commentators
must be current BBS Associates or nominated by a current BBS Associate. To
be considered as a commentator on this article, to suggest other appropriate
commentators, or for information about how to become a BBS Associate, please
send email to:
harnad@confidence.princeton.edu or write to:
BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771]
____________________________________________________________________
THE CONNECTIONIST CONSTRUCTION OF CONCEPTS
Adrian Cussins, New College, Oxford
Keywords: connectionism, representation, cognition, perception,
nonconceptual content, concepts, learning, objectivity, semantics
Computational modelling of cognition depends on an underlying theory of
representation. Classical cognitive science has exploited the
syntax/semantics theory of representation derived from formal logic. As a
consequence, the kind of psychological explanation supported by classical
cognitive science is "conceptualist": psychological phenomena are modelled
in terms of relations between concepts and between the sensors/effectors and
concepts. This kind of explanation is inappropriate according to Smolensky's
"Proper Treatment of Connectionism" [BBS 11(1) 1988]. Is there an
alternative theory of representation that retains the advantages of
classical theory but does not force psychological explanation into the
conceptualist mold? I outline such an alternative by introducing an
experience-based notion of nonconceptual content and by showing how a
complex construction out of nonconceptual content can satisfy classical
constraints on cognition. Cognitive structure is not interconceptual but
intraconceptual. The theory of representational structure within concepts
allows psychological phenomena to be explained as the progressive emergence
of objectivity. This can be modelled computationally by transformations of
nonconceptual content which progressively decrease its
perspective-dependence through the formation of a cognitive map.
Stevan Harnad ARPA/INTERNET harnad@confidence.princeton.edu
harnad@princeton.edu harnad@mind.princeton.edu srh@flash.bellcore.com
harnad@elbereth.rutgers.edu CSNET: harnad%mind.princeton.edu@relay.cs.net
UUCP: harnad@princeton.uucp BITNET: harnad@pucc.bitnet harnad1@umass.bitnet
Phone: (609)-921-7771
------------------------------
Subject: ICSI talk - Two Decades of Applied Kolmogorov Complexity
From: baker%icsi.Berkeley.EDU@berkeley.edu (Paula Ann Baker)
Date: Wed, 04 Jan 89 15:37:43 -0800
The International Computer Science Institute
is pleased to present a talk:
Friday, January 13, 1989 2:00 p.m.
Paul M.B. Vitanyi
Centrum voor Wiskunde en Informatica, Amsterdam
and
Department of Mathematics and Computer Science
University of Amsterdam
"Two Decades of Applied Kolmogorov Complexity"
This talk is an introduction to the main ideas of Kolmogorov
complexity and surveys the wealth of useful applications of this elegant
notion. Topics include notions of randomness; a version of Goedel's
incompleteness theorem; lower bound arguments in formal language theory,
complexity of computation, and electronic chips; Bayesian inference and a
priori probability with applications ranging from foundational issues to the
theory of learning and inductive inference in Artificial Intelligence;
resource-bounded Kolmogorov complexity with applications ranging from
NP-completeness and the P versus NP question to cryptography. (We will
cover as many of these subjects, in as much depth, as time permits.) This is
joint work with Ming Li, and is a selection from the textbook ``An
Introduction to Kolmogorov Complexity and its Applications'' we are
preparing.
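One concrete way to get a feel for the central notion (a toy illustration,
not part of the talk): Kolmogorov complexity itself is incomputable, but the
length of any lossless compression of a string is a computable upper bound on
it, and even that crude bound already separates regular strings from random
ones:

```python
import os
import zlib

def compression_length(s: bytes) -> int:
    """Length of a lossless compression of s: a crude, computable
    upper bound on the (incomputable) Kolmogorov complexity K(s)."""
    return len(zlib.compress(s, 9))

regular = b"ab" * 500          # short description: "ab repeated 500 times"
random_ish = os.urandom(1000)  # incompressible with high probability
print(compression_length(regular), compression_length(random_ish))
```

A truly random 1000-byte string has no description much shorter than itself,
which is exactly the compression-based notion of randomness the talk surveys.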
This talk will be held in the Main Lecture Hall at ICSI.
1947 Center Street, Suite 600, Berkeley, CA 94704
(On Center between Milvia and Martin Luther King Jr. Way)
------------------------------
Subject: Stanford Adaptive Networks Colloquium
From: netlist@psych.Stanford.EDU (Mark Gluck)
Date: Tue, 10 Jan 89 06:43:16 -0800
Stanford University Interdisciplinary Colloquium Series:
ADAPTIVE NETWORKS
AND THEIR APPLICATIONS
Co-sponsored by the Departments of Psychology and Electrical Engineering
Winter Quarter 1989 Schedule
----------------------------
Jan. 12th (Thursday, 3:30pm):
-----------------------------
STEVEN PINKER CONNECTIONISM AND
Department of Brain & Cognitive Sciences THE FACTS OF HUMAN LANGUAGE
Massachusetts Institute of Technology
email: steve@psyche.mit.edu (with commentary by David Rumelhart)
Jan. 24th (Tuesday, 3:30pm):
----------------------------
LARRY MALONEY LEARNING BY ASSERTION:
Department of Psychology CALIBRATING A SIMPLE VISUAL SYSTEM
New York University
email: ltm@xp.psych.nyu.edu
Feb. 9th (Thursday, 3:30pm):
----------------------------
CARVER MEAD VLSI MODELS OF NEURAL NETWORKS
Moore Professor of Computer Science
California Institute of Technology
Feb. 21st (Tuesday, 3:30pm):
----------------------------
PIERRE BALDI ON SPACE AND TIME IN NEURAL COMPUTATIONS
Jet Propulsion Laboratory
California Institute of Technology
email: pfbaldi@caltech.bitnet
Mar. 14th (Tuesday, 3:30pm):
----------------------------
ALAN LAPEDES NONLINEAR SIGNAL PROCESSING WITH NEURAL NETS
Theoretical Division - MS B213
Los Alamos National Laboratory
email: asl@lanl.gov
Additional Information
----------------------
The talks (including discussion) last about one hour and fifteen minutes.
Following each talk, there will be a reception. Unless otherwise noted, all
talks will be held in room 380-380F, which is in the basement of the
Mathematical Sciences buildings. To be placed on an electronic-mail
distribution list for information about these and other adaptive network
events in the Stanford area, send email to netlist@psych.stanford.edu. For
additional information, contact: Mark Gluck, Department of Psychology, Bldg.
420, Stanford University, Stanford, CA 94305 (phone 415-725-2434 or email to
gluck@psych.stanford.edu). Program Committee: Bernard Widrow
(E.E.), David Rumelhart, Misha Pavel, Mark Gluck (Psychology). This series
is supported by the Departments of Psychology and Electrical Engineering and
by a gift from the Thomson-CSF Corporation.
Coming this Spring: D. Parker, B. McNaughton, G. Lynch & R. Granger
------------------------------
Subject: ICSI talk - Structured Representations
From: baker%icsi.Berkeley.EDU@berkeley.edu (Paula Ann Baker)
Date: Wed, 11 Jan 89 16:10:45 -0800
The International Computer Science Institute
is pleased to present a talk:
Wednesday, January 18, 1989 12:00 p.m.
"Structured representations and connectionist models"
Jeff Elman
Departments of Cognitive Science and Linguistics
University of California, San Diego
Recent descriptions of connectionist models have argued that connectionist
representations are unstructured, atomic, and bounded (e.g., Fodor &
Pylyshyn, 1988). I will describe several sets of results with recurrent
networks and distributed representations which contest these claims. The
simulations address the type/token distinction, the representation of
hierarchical categories in language, and the representation of grammatical
structure; the results suggest that connectionist networks are able to learn
representations which are indeed richly structured and open-ended.
I will also discuss the notion that trajectories through state space provide
a useful representational dimension which is available to connectionist
models. A method is presented for analyzing the dynamic aspects of the
representations which arise in recurrent networks. The method involves
rotating the dimensions of the hidden unit space in order to extract
meaningful dimensions and constructing phase state portraits of projections
of the hidden unit time series along selected dimensions. These phase state
portraits are then analyzed in terms of linguistic (grammatical) structure.
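The rotation-and-projection method described above corresponds to a principal
component analysis of the hidden-unit activations. A minimal sketch follows
(illustrative only, not Elman's code; the random-walk toy data merely stands
in for a real network's hidden-state time series):

```python
import numpy as np

def principal_projection(hidden_states, k=2):
    """Rotate the hidden-unit space via PCA (an eigen-rotation of the
    covariance matrix) and project the state trajectory onto the top-k
    dimensions, suitable for a phase-portrait plot."""
    X = hidden_states - hidden_states.mean(axis=0)
    cov = X.T @ X / len(X)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]           # descending variance
    return X @ eigvecs[:, order[:k]]            # (T, k) trajectory

# stand-in data: 100 time steps of a 10-unit hidden layer
rng = np.random.default_rng(1)
states = np.cumsum(rng.normal(size=(100, 10)), axis=0)
traj = principal_projection(states, k=2)
print(traj.shape)   # a 2-D trajectory through state space
```

Plotting the two columns of `traj` against each other gives the phase state
portrait the abstract describes.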
This talk will be held in the Main Lecture Hall at ICSI.
1947 Center Street, Suite 600, Berkeley, CA 94704
(On Center between Milvia and Martin Luther King Jr. Way)
------------------------------
Subject: CALL FOR PARTICIPATION
From: "Joerg Kindermann" <unido!gmdzi!joerg@uunet.UU.NET>
Date: Thu, 12 Jan 89 08:30:50 -0100
Workshop ``DANIP''
Distributed Adaptive Neural Information Processing.
24-25 April 1989
Gesellschaft fuer Mathematik und Datenverarbeitung mbH
Sankt Augustin
Neural information processing is attracting increasing attention in
many scientific areas. As a consequence the first ``Workshop
Konnektionismus'' at the GMD was organized in February 1988. It gave an
overview of research activities in neural networks and their applications to
Artificial Intelligence. Now, almost a year later, the time has come to
focus on the state of neural information processing itself.
The aim of the workshop is to discuss TECHNICAL aspects of information
processing in neural networks on the basis of personal contributions in one
of the following areas:
- new or improved learning algorithms (including evaluations)
- self organization of structured (non-localist) neural networks
- time series analysis by means of neural networks
- adaptivity, e.g. the problem of relearning
- adequate coding of information for neural processing
- generalization
- weight interpretation (correlative and other)
Presentations which report on ``work in progress'' are encouraged. The size
of the workshop will be limited to 15 contributions of 30 minutes in length.
A limited number of additional participants may attend the workshop and take
part in the discussions.
To apply for the workshop as a contributor, please send information about
your contribution (1-2 pages in English or a relevant publication).
If you want to participate without giving an oral presentation, please
include a description of your background in the field of neural networks.
Proceedings on the basis of workshop contributions will be published after
the workshop.
SCHEDULE:
28 February 1989: deadline for submission of applications
20 March 1989: notification of acceptance
24 - 25 April 1989: workshop ``DANIP''
31 July 1989: deadline for submission of full papers
to be included in the proceedings
Applications should be sent to the following address:
Dr. Joerg Kindermann or Alexander Linden
Gesellschaft fuer Mathematik
und Datenverarbeitung mbH
- Schloss Birlinghoven -
Postfach 1240 D-5205 Sankt Augustin 1
WEST GERMANY
e-mail: joerg@gmdzi al@gmdzi
------------------------------
Subject: ICSI talk - Efficient associative storage
From: baker%icsi.Berkeley.EDU@BERKELEY.EDU (Paula Ann Baker)
Date: Thu, 12 Jan 89 17:19:04 -0800
The International Computer Science Institute
is pleased to present a talk:
Thursday, January 19th, 1989 2:00 p.m.
Joachim Buhmann
University of Southern California
"Efficient associative storage of static
patterns and pattern sequences."
Low activity neural networks are proposed for efficient storage and recall
of static patterns and pattern sequences. One of the most promising
extensions of the concept of associative memories is storage of sequences of
patterns. The neural network proposed stores a set of sparse coded patterns
and the transitions between subsequent patterns. Transition times are
controlled by the level of noise in the system, which provides a universal
mechanism to control the recall speed. Patterns are stored in the network
by a sparse coding scheme, i.e., only a few neurons are firing and most of the
neurons are quiet. This coding scheme turns out to yield a much higher
storage capacity than storage of random patterns as in the popular Hopfield
model. By introducing global inhibition in the network all spurious states
can be suppressed.
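The ingredients named in the abstract (sparse binary patterns, Hebbian
storage, and global inhibition) can be sketched as follows. This is an
illustrative toy, not the network of the talk: storage uses a clipped
(Willshaw-style) Hebb rule, and global inhibition is realized as a
k-winners-take-all recall step.

```python
import numpy as np

def store(patterns):
    """Clipped Hebbian storage of sparse binary patterns: a connection
    exists iff some stored pattern coactivates the pair of neurons."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W = np.maximum(W, np.outer(p, p))
    np.fill_diagonal(W, 0.0)   # no self-connections
    return W

def recall(W, cue, k):
    """One recall step with global inhibition: only the k most strongly
    driven neurons fire, keeping network activity sparse."""
    drive = W @ cue
    out = np.zeros_like(cue)
    out[np.argsort(drive)[-k:]] = 1.0
    return out

rng = np.random.default_rng(2)
n, k = 100, 5                          # 100 neurons, only 5 active per pattern
patterns = np.zeros((5, n))
for p in patterns:                     # 5 random sparse patterns
    p[rng.choice(n, size=k, replace=False)] = 1.0
W = store(patterns)
noisy = patterns[0].copy()
noisy[np.flatnonzero(noisy)[0]] = 0.0  # damage the cue: silence one neuron
print(np.array_equal(recall(W, noisy, k), patterns[0]))
```

Sparse coding is what buys the higher capacity: with only k of n neurons
active, each stored pattern touches few weights, so patterns interfere far
less than dense random patterns do in the standard Hopfield model.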
This talk will be held in the Main Lecture Hall at ICSI.
1947 Center Street, Suite 600, Berkeley, CA 94704
(On Center between Milvia and Martin Luther King Jr. Way)
------------------------------
Subject: Call for papers for IJCNN
From: pwh@ece-csc.ncsu.edu (Paul Hollis)
Date: Fri, 13 Jan 89 17:28:39 -0500
[[Editor's note: PLEASE NOTE IMMEDIATE DATE FOR SUBMISSION! -PM]]
NEURAL NETWORKS
CALL FOR PAPERS
International Joint Conference on Neural Networks
June 18-22, 1989
Washington, D.C.
The 1989 IEEE/INNS International Joint Conference on Neural Networks
(IJCNN-89) will be held at the Sheraton Washington Hotel in Washington,
D.C., USA from June 18-22, 1989. IJCNN-89 is the first conference in a new
series devoted to the technology and science of neurocomputing and neural
networks in all of their aspects. The series replaces the previous IEEE
ICNN and INNS Annual Meeting series and is jointly sponsored by the IEEE
Technical Activities Board Neural Network Committee and the International
Neural Network Society (INNS). IJCNN-89 will be the only major neural
network meeting of 1989 (IEEE ICNN-89 and the 1989 INNS Annual Meeting have
both been cancelled). Thus, it behooves all members of the neural network
community who have important new results for presentation to prepare their
papers now and submit them by the IJCNN-89 deadline of 1 FEBRUARY 1989. The
Conference Proceedings will be distributed AT THE REGISTRATION DESK to all
regular conference registrants as well as to all student registrants. The
conference will include a day of tutorials (June 18), the exhibit hall (the
neurocomputing industry's primary annual trade show), plenary talks, and
social events. Mark your calendar today and plan to attend IJCNN-89 -- the
definitive annual progress report on the neurocomputing revolution!
DEADLINE FOR SUBMISSION OF PAPERS for IJCNN-89 is FEBRUARY 1, 1989. Papers
of 8 pages or less are solicited in the following areas:
-Real World Applications               -Associative Memory
-Supervised Learning Theory            -Image Analysis
-Reinforcement Learning Theory         -Self-Organization
-Robotics and Control                  -Neurobiological Models
-Optical Neurocomputers                -Vision
-Optimization                          -Electronic Neurocomputers
-Neural Network Architectures & Theory -Speech Recognition
FULL PAPERS in camera-ready form (1 original on Author's Kit forms and 5
reduced 8 1/2" x 11" copies) should be submitted to Nomi Feldman, Conference
Coordinator, at the address below. For more details, or to request
your IEEE Author's Kit, call or write:
Nomi Feldman, IJCNN-89 Conference Coordinator
3770 Tansy Street, San Diego, CA 92121
(619) 453-6222
------------------------------
End of Neuron Digest
*********************