Neuron Digest Sunday, 16 Feb 1992 Volume 9 : Issue 7
Today's Topics:
Paper available: Neural mechanisms for steering in visual motion
Announcement-submission to neuron-digest
TR-Temporal adaptation in a silicon auditory nerve
Proceedings Announcement (Indiana)
NIPS preprint - Adaptive elastic models for hand-printed chars
NIPS preprint - predictions with discontinuities
NIPS paper - Weight decay improves generalization
Contents of NETWORK - Vol. 3, no 1 Feb 1992
Neural Computation 4:1
YANIPSPPI (Yet Another NIPS PrePrint by Internet)
papers available --- ann, genetic algorithms
TR - Expert networks
Adding noise to training -- A psychological perspective (Preprint)
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Paper available: Neural mechanisms for steering in visual motion
From: Jonathan Marshall <marshall@cs.unc.edu>
Date: Thu, 09 Jan 92 14:31:05 -0500
The following paper is available via ftp from the neuroprose archive
at Ohio State (instructions for retrieval follow the abstract).
Challenges of Vision Theory:
Self-Organization of Neural Mechanisms for Stable Steering
of Object-Grouping Data in Visual Motion Perception
Jonathan A. Marshall
Department of Computer Science, CB 3175, Sitterson Hall
University of North Carolina, Chapel Hill, NC 27599-3175, U.S.A.
919-962-1887, marshall@cs.unc.edu
Invited paper, in Stochastic and Neural Methods in Signal Processing,
Image Processing, and Computer Vision, Su-Shing Chen, Ed., Proceedings
of the SPIE 1569, San Diego, July 1991, pp.200-215.
ABSTRACT
Psychophysical studies on motion perception suggest that human visual
systems perform certain nonlocal operations. In some cases, data
about one part of an image can influence the processing or perception
of data about another part of the image, across a long spatial range.
In others, data about nearby parts of an image can fail to influence
one another strongly, despite their proximity. Several types of
nonlocal interaction may underlie cortical processing for accurate,
stable perception of visual motion, depth, and form:
o trajectory-specific propagation of computed moving stimulus
information to successive image locations where a stimulus is
predicted to appear;
o grouping operations (establishing linkages among perceptually
related data);
o scission operations (breaking linkages between unrelated data);
and
o steering operations, whereby visible portions of a visual group or
object can control the representations of invisible or occluded
portions of the same group.
Nonlocal interactions like these could be mediated by long-range
excitatory horizontal intrinsic connections (LEHICs), discovered in
visual cortex of several animal species. LEHICs often span great
distances across cortical image space. Typically, they have been
found to interconnect regions of like specificity with regard to
certain receptive field attributes, e.g., stimulus orientation.
It has recently been shown that several visual processing mechanisms
can self-organize in model recurrent neural networks using
unsupervised "EXIN" (excitatory+inhibitory) learning rules. Because
the same rules are used in each case, EXIN networks provide a means to
unify explanations of how different visual processing modules acquire
their structure and function. EXIN networks learn to multiplex (or
represent simultaneously) multiple spatially overlapping components of
complex scenes, in a context-sensitive fashion. Modeled LEHICs have
been used together with the EXIN learning rules to show how visual
experience can shape neural mechanisms for nonlocal, context-sensitive
processing of visual motion data.
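As a rough intuition pump only (the actual EXIN equations are given in
the paper), an "excitatory+inhibitory" unsupervised update can be
sketched numerically as below. The activation rule, the normalization,
and all constants here are illustrative assumptions, not Marshall's:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_out = 16, 8
    W = rng.uniform(0.0, 0.1, (n_out, n_in))  # excitatory feedforward weights
    V = np.zeros((n_out, n_out))              # lateral inhibitory weights
    lr_exc, lr_inh = 0.05, 0.01

    for _ in range(1000):
        x = rng.random(n_in)                  # a random input pattern
        e = W @ x                             # feedforward excitation
        y = np.clip(e - V @ e, 0.0, None)     # activity after lateral inhibition
        W += lr_exc * np.outer(y, x)          # Hebbian excitatory growth
        W /= W.sum(axis=1, keepdims=True)     # keep each unit's fan-in normalized
        V += lr_inh * np.outer(y, y)          # inhibition grows between
        np.fill_diagonal(V, 0.0)              # co-active units (no self-inhibition)
        V = np.clip(V, 0.0, 1.0)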
To get a copy of the paper, do the following:
unix> ftp archive.cis.ohio-state.edu
login: anonymous
password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get marshall.steering.ps.Z
ftp> quit
unix> uncompress marshall.steering.ps.Z
unix> lpr marshall.steering.ps
If you have trouble printing the file on a Postscript-compatible
printer, send me e-mail (marshall@cs.unc.edu) with your postal
address, and I'll have a hardcopy mailed to you (may take several
weeks for delivery, though).
------------------------------
Subject: Announcement-submission to neuron-digest
From: ingber@umiacs.UMD.EDU (Lester Ingber)
Date: Thu, 09 Jan 92 18:42:04 -0500
The following paper has been accepted for publication in
Physical Review A as a Rapid Communication.
Generic mesoscopic neural networks based on
statistical mechanics of neocortical interactions
Lester Ingber
A series of papers has developed a statistical mechanics of
neocortical interactions (SMNI), deriving aggregate behavior of
experimentally observed columns of neurons from statistical
electrical-chemical properties of synaptic interactions,
demonstrating its capability in describing large-scale properties of
short-term memory and electroencephalographic (EEG) systematics.
This methodology also defines an algorithm to construct a mesoscopic
neural network (MNN), based on realistic neocortical processes and
parameters, to record patterns of brain activity and to compute the
evolution of this system. Furthermore, this new algorithm is quite
generic, and can be used to similarly process information in other
systems, especially, but not limited to, those amenable to modeling
by mathematical physics techniques alternatively described by
path-integral Lagrangians, Fokker-Planck equations, or Langevin rate
equations. This methodology is made possible and practical by a
confluence of techniques drawn from SMNI itself, modern methods of
functional stochastic calculus defining nonlinear Lagrangians, Very
Fast Simulated Re-Annealing (VFSR), and parallel-processing
computation.
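For readers unfamiliar with VFSR: its hallmark is a per-parameter
annealing temperature that decays exponentially in k^(1/D) for a
D-dimensional problem, far faster than Boltzmann annealing. A minimal
sketch of the schedule (the constants below are illustrative
assumptions, not values from the paper):

    import math

    def vfsr_temperature(k, T0=1.0, c=1.0, D=4):
        """Temperature at annealing step k for a D-dimensional problem."""
        return T0 * math.exp(-c * k ** (1.0 / D))

    for k in (0, 10, 100, 1000):
        print(k, vfsr_temperature(k))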
The above preprint, in PostScript-compressed format, can be obtained
by anonymous ftp. To obtain this file from your local machine:
local% ftp ftp.umiacs.umd.edu
[local% ftp 128.8.120.23]
Name (ftp.umiacs.umd.edu:yourloginname): anonymous
Password (ftp.umiacs.umd.edu:anonymous): [type in yourloginname]
ftp> cd pub/ingber
ftp> binary
ftp> get mnn.ps.Z
ftp> quit
local% uncompress mnn.ps.Z
local% lpr [-P..] mnn.ps [to your PostScript laserprinter]
This prints out 12 pages.
If you do not have access to ftp, then send me an email request,
and I will email you a PostScript-compressed-uuencoded ascii file with
instructions on how to produce laserprinted copies, just requiring the
additional first step of 'uudecode file'. Sorry, but I cannot
take on the task of mailing out hardcopies of this paper.
------------------------------------------
| Prof. Lester Ingber |
| ______________________ |
| Science Transfer Corporation |
| P.O. Box 857 703-759-2769 |
| McLean, VA 22101 ingber@umiacs.umd.edu |
------------------------------------------
------------------------------
Subject: TR-Temporal adaptation in a silicon auditory nerve
From: John Lazzaro <lazzaro@boom.CS.Berkeley.EDU>
Date: Tue, 14 Jan 92 13:25:14 -0800
An announcement of a NIPS-4 preprint on the neuroprose server ...
Temporal Adaptation in a Silicon Auditory Nerve
John Lazzaro, CS Division, UC Berkeley
Abstract
--------
Many auditory theorists consider the temporal adaptation of the
auditory nerve a key aspect of speech coding in the auditory
periphery. Experiments with models of auditory localization and pitch
perception also suggest temporal adaptation is an important element of
practical auditory processing. I have designed, fabricated, and
successfully tested an analog integrated circuit that models many
aspects of auditory nerve response, including temporal adaptation.
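To see the behavior being modeled (a caricature, not Lazzaro's circuit
equations), temporal adaptation can be sketched as a transmitter pool
that depletes while the stimulus drives it and recovers otherwise,
giving an onset peak that decays to a sustained rate; the time
constants below are assumptions:

    import numpy as np

    dt, tau = 1e-3, 0.05          # 1 ms steps; 50 ms recovery time constant
    stim = np.concatenate([np.zeros(100), np.ones(300)])  # step onset at 100 ms
    pool, rates = 1.0, []         # normalized "transmitter" pool
    for s in stim:
        rates.append(s * pool)    # output scales with the remaining pool
        pool += dt * ((1.0 - pool) / tau - 40.0 * s * pool)  # deplete when driven
    # rates jumps to 1.0 at onset, then adapts toward a sustained ~0.33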
To retrieve ...
>ftp cheops.cis.ohio-state.edu
>Name (cheops.cis.ohio-state.edu:lazzaro): anonymous
>331 Guest login ok, send ident as password.
>Password: your_username
>230 Guest login ok, access restrictions apply.
>cd pub/neuroprose
>binary
>get lazzaro.audnerve.ps.Z
>quit
%uncompress lazzaro.audnerve.ps.Z
%lpr lazzaro.audnerve.ps
--john lazzaro
lazzaro@boom.cs.berkeley.edu
------------------------------
Subject: Proceedings Announcement (Indiana)
From: SAYEGH@CVAX.IPFW.INDIANA.EDU
Date: Wed, 15 Jan 92 20:52:40 -0500
The Proceedings of the Fourth Conference on Neural Networks and Parallel
Distributed Processing at Indiana University-Purdue University at Fort
Wayne, held April 11-13, 1991, are now available.
They can be ordered ($6 + $1 U.S. mail cost) from:
Ms. Sandra Fisher, Physics Department
Indiana University-Purdue University at Fort Wayne
Fort Wayne, IN 46805-1499
FAX: (219) 481-6880 Voice: (219) 481-6306 OR 481-6157
email: proceedings@ipfwcvax.bitnet
The following papers are included in the Proceedings:
Optimization and genetic algorithms:
J.L. Noyes, Wittenberg University
Neural Network Optimization Methods
Robert L. Sedlmeyer, Indiana University-Purdue University at Fort
Wayne
A Genetic Algorithm to Estimate the Edge-Integrity of Halin
Graphs
Omer Tunali & Ugur Halici, University of Missouri/Rolla
A Boltzmann Machine for Hypercube Embedding Problem
William G. Frederick and Curt M. White, Indiana University-Purdue
University at Fort Wayne
Genetic Algorithms and a Variation on the Steiner Point
Problem
Network analysis:
P.G. Madhavan, B. Xu, B. Stephens, Purdue University, Indianapolis
On the Convergence Speed and the Generalization Ability of
Tri-state Neural Networks
Mohammad R. Sayeh, Southern Illinois University at Carbondale
Dynamical-System Approach to Unsupervised Classifier
Samir I. Sayegh, Indiana University-Purdue University at Fort Wayne
Symbolic Manipulation and Neural Networks
Zhenni Wang, Ming T. Tham & A.J. Morris, University of Newcastle
upon Tyne
Multilayer Neural Networks: Approximated Canonical
Decomposition of Nonlinearity
M.G. Royer & O.K. Ersoy, Purdue University, West Lafayette
Classification Performance of PSHNN with Backpropagation
Stages
Sean Carroll, Tri-State University
Single-Hidden-Layer Neural Nets Can Approximate B-Splines
G. Allen Pugh, Indiana University-Purdue University at Fort Wayne
Further Design Considerations for Back Propagation
Biological aspects:
R. Manalis, Indiana University-Purdue University at Fort Wayne
Short Term Memory Implicated in Twitch Facilitation
Edgar Erwin, K. Obermayer, University of Illinois
Formation and Variability of Somatotopic Maps with Topological
Mismatch
T. Alvager, B. Humpert, P. Lu, and C. Roberts, Indiana State
University
DNA Sequence Analysis with a Neural Network
Christel Kemke, DFKI, Germany
Towards a Synthesis of Neural Network Behavior
Arun Jagota, State University of New York at Buffalo
A Forgetting Rule and Other Extensions to the Hopfield-Style
Network Storage Rule and Their Applications
Applications:
I.H. Shin and K.J. Cios, The University of Toledo
A Neural Network Paradigm and Architecture for Image Pattern
Recognition
R.E. Tjia, K.J. Cios and B.N. Shabestari, The University of Toledo
Neural Network in Identification of Car Wheels from Gray Level
Images
M.D. Tom and M.F. Tenorio, Purdue University, West Lafayette
A Neuron Architecture with Short Term Memory
S. Sayegh, C. Pomalaza-Raez, B. Beer and E. Tepper, Indiana
University-Purdue University at Fort Wayne
Pitch and Timbre Recognition Using Neural Network
Jacek Witaszek & Colleen Brown, DePaul University
Automatic Construction of Connectionist Expert Systems
Robert Zerwekh, Northern Illinois University
Modeling Learner Performance: Classifying Competence Levels
Using Adaptive Resonance Theory
Tutorial lectures:
Marc Clare, Lincoln National Corporation, Fort Wayne
An Introduction to the Methodology of Building Neural Networks
Ingrid Russell, University of Hartford
Integrating Neural Networks into an AI Course
Arun Jagota, State University of New York at Buffalo
The Hopfield Model and Associative Memory
Ingrid Russell, University of Hartford
Self Organization and Adaptive Resonance Theory Models
Note: Copies of the Proceedings of the Third Conference on NN&PDP
are also available and can be ordered from the same address.
------------------------------
Subject: NIPS preprint - Adaptive elastic models for hand-printed chars
From: Geoffrey Hinton <hinton@ai.toronto.edu>
Date: Thu, 16 Jan 92 10:07:26 -0500
The following paper is available as hinton.handwriting.ps.Z in neuroprose
ADAPTIVE ELASTIC MODELS FOR HAND-PRINTED CHARACTER RECOGNITION
Geoffrey E. Hinton, Christopher K. I. Williams and Michael D. Revow
Department of Computer Science, University of Toronto
ABSTRACT
Hand-printed digits can be modeled as splines that are governed by about
8 control points. For each known digit, the control points have
preferred "home" locations, and deformations of the digit are generated
by moving the control points away from their home locations. Images of
digits can be produced by placing Gaussian ink generators uniformly along
the spline. Real images can be recognized by finding the digit model
most likely to have generated the data. For each digit model we use an
elastic matching algorithm to minimize an energy function that includes
both the deformation energy of the digit model and the log probability
that the model would generate the inked pixels in the image. The model
with the lowest total energy wins. If a uniform noise process is
included in the model of image generation, some of the inked pixels can
be rejected as noise as a digit model is fitting a poorly segmented
image. The digit models learn by modifying the home locations of the
control points.
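To make the energy function concrete, here is a hedged sketch of what
is being minimized: deformation energy of the control points plus the
negative log probability of the inked pixels under Gaussian ink
generators spaced along the model. The linear "spline", the uniform
mixture, and all constants are illustrative assumptions rather than
the authors' implementation:

    import numpy as np

    def total_energy(ctrl, home, inked, sigma=1.0, k=1.0, n_beads=30):
        """ctrl, home: (8, 2) control points; inked: (N, 2) pixel coords."""
        deform = k * np.sum((ctrl - home) ** 2)   # elastic deformation term
        # Gaussian "ink generators" (beads) placed uniformly along the
        # model; the spline is simplified to linear interpolation here.
        t = np.linspace(0.0, len(ctrl) - 1.0, n_beads)
        i = np.minimum(t.astype(int), len(ctrl) - 2)
        beads = ctrl[i] + (t - i)[:, None] * (ctrl[i + 1] - ctrl[i])
        d2 = ((inked[:, None, :] - beads[None, :, :]) ** 2).sum(axis=-1)
        p = np.exp(-d2 / (2 * sigma**2)).mean(axis=1) / (2 * np.pi * sigma**2)
        return deform - np.log(p + 1e-12).sum()   # lowest total energy wins

A recognizer would adjust ctrl to minimize this energy for each digit
model in turn and report the model with the lowest minimum.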
------------------------------
Subject: NIPS preprint - predictions with discontinuities
From: becker@ai.toronto.edu
Date: Thu, 16 Jan 92 14:44:36 -0500
The following paper is available as becker.prediction.ps.Z in neuroprose:
LEARNING TO MAKE COHERENT PREDICTIONS IN DOMAINS WITH DISCONTINUITIES
Suzanna Becker and Geoffrey E. Hinton
Department of Computer Science, University of Toronto
ABSTRACT
We have previously described an unsupervised learning procedure that
discovers spatially coherent properties of the world by maximizing the
information that parameters extracted from different parts of the sensory
input convey about some common underlying cause. When given random dot
stereograms of curved surfaces, this procedure learns to extract surface
depth because that is the property that is coherent across space. It
also learns how to interpolate the depth at one location from the depths
at nearby locations (Becker and Hinton, 1992). In this paper, we propose
two new models which handle surfaces with discontinuities. The first
model attempts to detect cases of discontinuities and reject them. The
second model develops a mixture of expert interpolators. It learns to
detect the locations of discontinuities and to invoke specialized,
asymmetric interpolators that do not cross the discontinuities.
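A rough sketch of the second model's idea (in the paper both the
experts and the gating are learned; the fixed weights below are
illustrative assumptions): several interpolators predict the central
depth from its neighbours, and a gate mixes them, so an asymmetric
expert that does not cross the discontinuity can take over near one:

    import numpy as np

    def mixture_predict(depths, gate_logits):
        """depths: neighbours [d(-2), d(-1), d(+1), d(+2)]; gate_logits: (3,)."""
        experts = np.array([
            np.array([-1.0, 4.0, 4.0, -1.0]) / 6.0 @ depths,  # symmetric cubic
            np.array([-1.0, 2.0, 0.0, 0.0]) @ depths,  # left-only extrapolator
            np.array([0.0, 0.0, 2.0, -1.0]) @ depths,  # right-only extrapolator
        ])
        g = np.exp(gate_logits - gate_logits.max())
        g /= g.sum()              # softmax gating over the three experts
        return g @ experts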
------------------------------
Subject: NIPS paper - Weight decay improves generalization
From: Anders Krogh <krogh@cse.ucsc.edu>
Date: Fri, 17 Jan 92 12:02:35 -0800
The following paper has been placed in the Neuroprose archive
Title: A Simple Weight Decay Can Improve Generalization
Authors: Anders Krogh and John A. Hertz
Filename: krogh.weight-decay.ps.Z
(To appear in proceedings from NIPS 91)
Abstract:
It has been observed in numerical simulations that a weight decay can
improve generalization in a feed-forward neural network. This paper
explains why. It is proven that a weight decay has two effects in a
linear network. First, it suppresses any irrelevant components of the
weight vector by choosing the smallest vector that solves the learning
problem. Second, if the size is chosen right, a weight decay can
suppress some of the effects of static noise on the targets, which
improves generalization quite a lot. It is then shown how to extend
these results to networks with hidden layers and non-linear units.
Finally the theory is confirmed by some numerical simulations using the
data from NetTalk.
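The mechanism itself fits in a few lines: gradient descent on the
error with an extra term shrinking every weight toward zero, whose
strength plays the role of the "size" mentioned above. A minimal
sketch (the data and constants are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 20))
    true_w = np.zeros(20); true_w[:3] = 1.0      # only 3 relevant components
    y = X @ true_w + 0.3 * rng.normal(size=100)  # targets with static noise

    w, lr, lam = np.zeros(20), 0.01, 0.1
    for _ in range(500):
        grad = X.T @ (X @ w - y) / len(y)        # mean-squared-error gradient
        w -= lr * (grad + lam * w)               # "+ lam * w" is the weight decay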
FTP INSTRUCTIONS
unix> ftp archive.cis.ohio-state.edu (or 128.146.8.52)
Name: anonymous
Password: anything
ftp> cd pub/neuroprose
ftp> binary
ftp> get krogh.weight-decay.ps.Z
ftp> bye
unix> zcat krogh.weight-decay.ps.Z | lpr
(or however you uncompress and print postscript)
------------------------------
Subject: Contents of NETWORK - Vol. 3, no 1 Feb 1992
From: David Willshaw <david@cns.edinburgh.ac.uk>
Date: Tue, 21 Jan 92 09:10:08 +0000
CONTENTS OF NETWORK - COMPUTATION IN NEURAL SYSTEMS
Volume 3 Number 1 February 1992
Proceedings of the First Irish Neural Networks Conference held at The
Queen's University of Belfast, 21 June 1991.
Proceedings Editor: G Orchard
PAPERS
1 Sharing interdisciplinary perspectives on neural networks: the
First Irish Neural Networks Conference
G ORCHARD
5 On the proper treatment of eliminative connectionism
S MILLS
15 Cervical cell image inspection - a task for artificial neural
networks
I W RICKETTS
19 A geometric interpretation of hidden layer units in feedforward
neural networks
J MITCHELL
27 Comparison and evaluation of variants of the conjugate gradient
method for efficient learning in feedforward neural networks with
backward error propagation
J A KINSELLA
37 A connectionist technique for on-line parsing
R REILLY
47 Are artificial neural nets as smart as a rat?
T SAVAGE & R COWIE
61 The principal components of natural images
P J B HANCOCK, R J BADDELEY & L S SMITH
71 Information processing and neuromodulation in the visual system of
frogs and toads
P R LAMING
89 Neurodevelopmental events underlying information acquisition and
storage
E DOYLE, P NOLAN, R BELL & C M REGAN
95 ABSTRACTS SECTION
97 BOOK REVIEWS
Network welcomes research Papers and Letters where the findings have
demonstrable relevance across traditional disciplinary boundaries.
Research Papers can be of any length, if that length can be justified
by content. Rarely, however, is it expected that a length in excess
of 10,000 words will be justified. 2,500 words is the expected limit
for research Letters.
Articles can be published from authors' TeX source codes. Macros can
be supplied to produce papers in the form suitable for refereeing and
for IOP house style. For more details contact the Editorial Services
Manager at IOP Publishing, Techno House, Redcliffe Way, Bristol BS1
6NX, UK. Telephone: 0272 297481 Fax: 0272 294318 Telex: 449149
INSTP G Email Janet: IOPPL@UK.AC.RL.GB
Subscription Information
Frequency: quarterly
Subscription rates: Institution 125.00 pounds (US$220.00)
Individual (UK) 17.30 pounds
(Overseas) 20.50 pounds (US$37.90)
A microfiche edition is also available at 75.00 pounds (US$132.00)
------------------------------
Subject: Neural Computation 4:1
From: Terry Sejnowski <terry@jeeves.UCSD.EDU>
Date: Tue, 21 Jan 92 01:41:45 -0800
Neural Computation Volume 4, Issue 1, January 1992
Review:
Neural Networks and the Bias/Variance Dilemma
Stuart Geman, Elie Bienenstock, and Rene Doursat
Article:
A Model for the Action of NMDA Conductances in the Visual Cortex
Kevin Fox and Nigel Daw
Letters:
Alternating and Synchronous Rhythms in Reciprocally
Inhibitory Model Neurons
Xiao-Jing Wang and John Rinzel
Feature Extraction Using an Unsupervised Neural Network
Nathan Intrator
Speaker-Independent Digit Recognition Using a Neural
Network with Time-Delayed Connections
K. P. Unnikrishnan, J. J. Hopfield, and D. W. Tank
Local Feedback Multilayered Networks
Paolo Frasconi, Marco Gori, and Giovanni Soda
Learning to Control Fast-Weight Memories: An Alternative
to Dynamic Recurrent Networks
Jurgen Schmidhuber
SUBSCRIPTIONS - VOLUME 4 - BIMONTHLY (6 issues)
______ $40 Student
______ $65 Individual
______ $150 Institution
Add $12 for postage and handling outside USA (+7% for Canada).
(Back issues from Volumes 1-3 are available for $17 each.)
MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142.
(617) 253-2889.
------------------------------
Subject: YANIPSPPI (Yet Another NIPS PrePrint by Internet)
From: Barak Pearlmutter <bap@james.psych.yale.edu>
Date: Fri, 24 Jan 92 16:45:02 -0500
Because of the large number of requests for preprints of "Gradient
descent: second-order momentum and saturating error", I am following the
trend and making it available by FTP. I apologize to those who left
their cards, but the effort and expense of distribution are prohibitive.
If you cannot access the paper in this fashion but really must have a
copy before the proceedings come out, please contact me.
FTP INSTRUCTIONS:
ftp JAMES.PSYCH.YALE.EDU
user anonymous
password state-your-name
cd pub/bap/asymp
binary
get nips91.PS.Z
quit
zcat nips91.PS.Z | lpr
Maybe next year, instead of contracting out the proceedings, we can
require PostScript from all contributors and everyone will print
everything out at home. Money saved will be used to purchase giant
staplers.
Barak Pearlmutter
Yale University
Department of Psychology
11A Yale Station
New Haven, CT 06520-7447
pearlmutter-barak@yale.edu
------------------------------
Subject: papers available --- ann, genetic algorithms
From: Stefan Bornholdt <T00BOR%DHHDESY3.BITNET@BITNET.CC.CMU.EDU>
Date: Mon, 27 Jan 92 15:54:45 -0500
papers available, hardcopies only.
GENERAL ASYMMETRIC NEURAL NETWORKS AND
STRUCTURE DESIGN BY GENETIC ALGORITHMS
Stefan Bornholdt
Deutsches Elektronen-Synchrotron DESY, Notkestr. 85, 2000 Hamburg 52
Dirk Graudenz
Institut für Theoretische Physik, Lehrstuhl E, RWTH 5100 Aachen,
Germany.
A learning algorithm for neural networks based on genetic algorithms is
proposed. The concept leads in a natural way to a model for the
explanation of inherited behavior. Explicitly we study a simplified
model for a brain with sensory and motor neurons. We use a general
asymmetric network whose structure is solely determined by an
evolutionary process. This system is simulated numerically.
It turns out that the network obtained by the algorithm
reaches a stable state after a small number of sweeps.
Some results illustrating the learning capabilities are presented.
[to appear in Neural Networks]
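A toy sketch of the general recipe (the operators, rates, and the
placeholder fitness below are illustrative assumptions, not the
authors' setup): each genome is a binary connection mask for an
asymmetric network, and the usual select/crossover/mutate cycle
evolves the structure:

    import numpy as np

    rng = np.random.default_rng(2)
    N, POP = 8, 20                              # 8 neurons, population of 20

    def fitness(mask):
        # Placeholder: a real fitness would wire up the masked network
        # and score it on a sensory-motor task.
        return -abs(mask.sum() - N)

    pop = rng.integers(0, 2, (POP, N * N))      # binary connection masks
    for generation in range(50):
        scores = np.array([fitness(g) for g in pop])
        pop = pop[np.argsort(scores)[::-1]][:POP // 2]   # keep the best half
        pairs = pop[rng.integers(0, len(pop), (POP // 2, 2))]
        cut = rng.integers(1, N * N)                     # one-point crossover
        children = np.concatenate([pairs[:, 0, :cut], pairs[:, 1, cut:]], axis=1)
        children = children ^ (rng.random(children.shape) < 0.01)  # mutation
        pop = np.concatenate([pop, children])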
preprints available from:
Stefan Bornholdt, DESY-T, Notkestr. 85, 2000 Hamburg 52, Germany.
Email: t00bor@dhhdesy3.bitnet (hardcopies only, all rights reserved)
------------------------------
Subject: TR - Expert networks
From: Chris Lacher <lacher@NU.CS.FSU.EDU>
Date: Wed, 29 Jan 92 14:36:23 -0500
The following paper has been placed in the neuroprose archives under the
name 'lacher.rapprochement.ps.Z'. Retrieval, uncompress, and printing
have been successfully tested.
Expert Networks:
Paradigmatic Conflict, Technological Rapprochement^\dagger
R. C. Lacher
Florida State University
lacher@cs.fsu.edu
Abstract. A rule-based expert system is demonstrated to have both a
symbolic computational network representation and a sub-symbolic
connectionist representation. These alternate views enhance the
usefulness of the original system by facilitating introduction of
connectionist learning methods into the symbolic domain. The
connectionist representation learns and stores metaknowledge in highly
connected subnetworks and domain knowledge in a sparsely connected expert
network superstructure. The total connectivity of the neural network
representation approximates that of real neural systems which may be
useful in avoiding scaling and memory stability problems associated with
some other connectionist models.
Keywords. symbolic AI, connectionist AI, connectionism, neural networks,
learning, reasoning, expert networks, expert systems, symbolic models,
sub-symbolic models.
^\dagger Paper given to the symposium "Approaches to Cognition", the
fifteenth annual Symposium in Philosophy held at the University of North
Carolina, Greensboro, April 5-7, 1991.
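As a flavor of how a single rule can be viewed as a tiny network (an
illustration only, not Lacher's construction), take "IF a AND b THEN c
with certainty 0.8": a conjunction node combines the antecedent
certainties, and the connection weight carries the rule's strength:

    def and_node(*certainties):
        return min(certainties)        # a common MYCIN-style conjunction

    def rule_node(weight, activation):
        return weight * activation     # attenuate by the rule's certainty

    cf_a, cf_b = 0.9, 0.7
    cf_c = rule_node(0.8, and_node(cf_a, cf_b))   # -> 0.56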
------------------------------
Subject: Adding noise to training -- A psychological perspective (Preprint)
From: Mark Gluck <gluck@pavlov.Rutgers.EDU>
Date: Wed, 13 Nov 91 09:29:58 -0500
In a recent paper we have discussed the role of stochastic noise in
training data for adaptive network models of human classification
learning. We have shown how the incorporation of such noise (usually
modelled as a stochastic sampling process on the external stimuli)
improves generalization performance, especially with deterministic
discriminations which underconstrain the set of possible
solution-weights. The addition of noise to the training biases the
network to find solutions (and generalizations) which more closely
correspond to the behavior of humans in psychological experiments. The
reference is:
Gluck, M. A. (1991, in press). Stimulus sampling and distributed
representations in adaptive network theories of learning. In A. Healy, S.
Kosslyn, & R. Shiffrin (Eds.), Festschrift for W. K. Estes. New Jersey:
Lawrence Erlbaum Associates.
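The core idea fits in a short sketch: rather than presenting the same
deterministic patterns every epoch, each presentation resamples the
stimulus with noise, which biases an otherwise underconstrained
discrimination toward small-norm (ridge-like) solutions. The network
and noise model below are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(3)
    # Two stimuli with a redundant third feature, so many weight vectors
    # solve the discrimination exactly.
    X = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
    y = np.array([1.0, 0.0])
    w, lr = np.zeros(3), 0.05
    for _ in range(5000):
        i = rng.integers(2)
        x = X[i] + 0.2 * rng.normal(size=3)  # stochastic stimulus sampling
        w += lr * (y[i] - w @ x) * x         # LMS / delta-rule update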
Copies can be obtained by emailing:
______________________________________________________________________
Dr. Mark A. Gluck
Center for Molecular & Behavioral Neuroscience
Rutgers University
197 University Ave.
Newark, New Jersey 07102
Phone: (201) 648-1080 (Ext. 3221)
Fax: (201) 648-1272
Email: gluck@pavlov.rutgers.edu
------------------------------
End of Neuron Digest [Volume 9 Issue 7]
***************************************