Neuron Digest Wednesday, 18 Aug 1993 Volume 11 : Issue 48
Today's Topics:
ANNs for Noise Filtering, Edge Detect. and Signature Extraction
Paper available: `Statistical Aspects of Neural Networks'
Preprint available: A network to velocity vector-field correction
Several papers (Simulated Annealing, Review, NP-hardness)
TR - VISUAL ATTENTION AND INVARIANT PATTERN RECOGNITION
2 TRs - Iterated Function Systems, Approximations to Functions
Book - The Global Dynamics of CA
IJNS contents vol. 3 issues 2 and 3
Preprint available: Synchronization and label-switching
VLSI Neural Network Application in High Energy Physics
Preprint available: A network to velocity vector-field correction
Genetic Synthesis of Unsupervised Learning Algorithms
Preprint Available: Random-Walk Learning
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: ANNs for Noise Filtering, Edge Detect. and Signature Extraction
From: speeba@cardiff.ac.uk (Eduardo Bayro)
Date: Sat, 11 Jul 92 19:24:12 +0000
/****NEURAL COMPUTING****IMAGE PROCESSING****NEURAL COMPUTING*********/
Journal of Systems Engineering (1992) 2, Springer-Verlag
NEURAL COMPUTING FOR NOISE FILTERING, EDGE DETECTION
AND SIGNATURE EXTRACTION
D.T. Pham and E.J. Bayro-Corrochano
This paper describes two applications of neural computing to low-level
image processing. The first application concerns noise filtering and
edge detection. A neural processor employing backpropagation
multi-layer perceptrons is presented which has been shown
quantitatively to perform better than well-known conventional edge
detectors. The second application is in feature extraction. A mask set
has been designed for picking up basic geometrical details of
skeletonised contours. The use of the masks in a net which implements
the n-tuple contour analysis technique is reported.
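As a rough illustration of this style of processor (not the authors'
implementation), a small multi-layer perceptron can be applied as a
sliding-window operator over pixel neighbourhoods. The Python sketch
below uses an assumed 3x3 window and network size; training by
backpropagation is omitted.

import numpy as np

rng = np.random.default_rng(0)

def mlp_init(n_in=9, n_hid=8, n_out=1):
    return {"W1": rng.normal(0, 0.5, (n_in, n_hid)), "b1": np.zeros(n_hid),
            "W2": rng.normal(0, 0.5, (n_hid, n_out)), "b2": np.zeros(n_out)}

def mlp_forward(net, x):
    h = np.tanh(x @ net["W1"] + net["b1"])
    return 1.0 / (1.0 + np.exp(-(h @ net["W2"] + net["b2"])))

def edge_map(net, image):
    """Slide the perceptron over every 3x3 neighbourhood of the image."""
    out = np.zeros_like(image, dtype=float)
    for r in range(1, image.shape[0] - 1):
        for c in range(1, image.shape[1] - 1):
            patch = image[r - 1:r + 2, c - 1:c + 2].reshape(-1)
            out[r, c] = mlp_forward(net, patch)[0]
    return out

net = mlp_init()                        # training by backpropagation omitted here
edges = edge_map(net, rng.random((32, 32)))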
Correspondence and offprint requests to:
D.T. Pham
e-mail: phamdt@uk.ac.cardiff
E.J. Bayro-Corrochano
e-mail: speeba@uk.ac.cardiff
Intelligent Systems Research Laboratory, School of
Electrical, Electronic and Systems Engineering,
University of Wales, College of Cardiff, P.O. Box 904,
Cardiff CF1 3YH, U.K.
/****NEURAL COMPUTING****IMAGE PROCESSING****NEURAL COMPUTING*********/
------------------------------
Subject: Paper available: `Statistical Aspects of Neural Networks'
From: ripley@statistics.oxford.ac.uk (Prof. Brian Ripley)
Date: Mon, 20 Jul 92 11:46:51 +0000
[This corrects a message sent an hour or so ago. We have re-organized
into a more logical directory.]
A paper, aimed principally at statisticians, entitled
Statistical Aspects of Neural Networks
is available by anonymous ftp from
markov.stats.ox.ac.uk (192.76.20.1 or 129.67.1.190)
at pub/neural/papers/ripley.ps.Z (336kB), with abstract ripley.abstract as
follows:
Neural networks have been a much-publicized topic of research in the
last five years, and are now beginning to be used in a wide range of
subject areas traditionally thought by statisticians to be their
domain. This paper explores the basic ideas of neural networks from the
point of view of a statistician, and compares some of their
applications with those of traditional and modern methods of statistics
and pattern recognition.
Neural networks are mainly used as non-linear approximations to
multivariable functions or as classifiers. They are non-parametric in
character in that no subject-domain knowledge is incorporated in the
modelling process, and the parameters are estimated using algorithms
which at least in principle can be computed on loosely-coupled parallel
computers. We argue that the modelling-based approach traditional in
statistics and pattern recognition can be at least as effective, and
often more so. This is illustrated by data on the areas in Zimbabwe
environmentally suitable for Tsetse flies.
Invited lectures for SemStat (S\'eminaire Europ\'een de
Statistique), Sandbjerg, Denmark, 25-30 April 1992. To appear in the
proceedings to be published by Chapman & Hall in January 1993.
.----------------------------------------------------.
| Prof. Brian D. Ripley |
| Dept. of Statistics, |
| University of Oxford, |
| 1 South Parks Road, |
| Oxford OX1 3TG, UK |
| |
| ripley@uk.ac.ox.stats (JANET) |
| ripley@stats.ox.ac.uk (Internet) |
`----------------------------------------------------'
------------------------------
Subject: Preprint available: A network to velocity vector-field correction
From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Mon, 20 Jul 92 14:24:33 +0100
The following paper has been accepted for publication in the
proceedings of the International Conference on
Artificial Neural Networks '92 in Brighton:
Relaxation in 4D state space - A competitive network
approach to object-related velocity vector-field correction
by Helmut Gluender
   Institut fuer Medizinische Psychologie, Ludwig-Maximilians-Universitaet
   Goethestrasse 31, D-8000 Muenchen 2, Germany
and Astrid Lehmann
   Lehrstuhl fuer Nachrichtentechnik, Technische Universitaet Muenchen
   Arcisstrasse 21, D-8000 Muenchen 2, Germany
ABSTRACT:
A standard principle of (energy-)minimization is applied to the
problem of visual motion analysis. In contrast to well-known
mathematical optimization procedures and universal optimizing
networks it is proposed to use a problem-adapted network
architecture. Owing to the bilocal coincidence-type motion
detector considered here the task of object-related motion
analysis appears as a geometric correspondence problem. Hence,
the correct spatio-temporal correspondences between elements in
consecutive images must be selected from all possible ones. This
is performed by neighborhood operations that are repeatedly
applied to the instantaneous signal representation in the
space/velocity-domain until an estimate of the actual flow-field
is reached.
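The abstract describes the approach only in general terms. As a very
rough sketch of the general idea (relaxation by repeatedly applied
neighbourhood operations in a space/velocity domain), one might write
the following; the update rule, normalization and periodic boundaries
are illustrative assumptions, not the authors' network.

import numpy as np

def relax(field, iterations=20, alpha=0.5):
    """field[x, y, vx, vy] holds the initial evidence for each candidate
    displacement.  Spatially neighbouring cells with the same velocity
    lend support; competing velocities at one location are suppressed by
    normalization.  Periodic boundaries via np.roll keep the sketch short."""
    f = field.astype(float).copy()
    for _ in range(iterations):
        support = (np.roll(f, 1, axis=0) + np.roll(f, -1, axis=0) +
                   np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1)) / 4.0
        f = (1.0 - alpha) * f + alpha * support
        norm = f.sum(axis=(2, 3), keepdims=True)
        f = np.where(norm > 0, f / norm, f)
    return f

evidence = np.random.default_rng(0).random((16, 16, 5, 5))   # toy detector output
flow_estimate = relax(evidence)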
Hardcopies of the paper are available. Please send requests
to the following address in Germany:
Helmut Gluender
Institut fuer Medizinische Psychologie
Ludwig-Maximilians-Universitaet
Goethestrasse 31, D-8000 Muenchen 2, Germany
or via email to:
alfred@lnt.e-technik.tu-muenchen.de
communicated by Alfred Nischwitz
------------------------------
Subject: Several papers (Simulated Annealing, Review, NP-hardness)
From: Xin Yao <Xin.Yao@dbce.csiro.au>
Organization: CSIRO, Div. Building Constr. and Eng'ing, Melb., Australia
Date: Fri, 24 Jul 92 14:27:37 -0500
The following papers have been put in neuroprose archive. Thanks to Jordan
Pollack. Limited number of hard copies can be obtained by sending a note,
specifying the author and title, to:
Smail: Ms. Cathy Bowditch, The Editor
CSIRO Division of Building, Construction and Engineering
PO Box 56, Highett, Vic 3190, Australia
Email: cathy@mel.dbce.csiro.au
(1) X. Yao, "A Review of Evolutionary Artificial Neural Networks," Accepted by
International Journal of Intelligent Systems, to appear.
Filename in neuroprose: yao.eann.ps.Z
(2) X. Yao, "Finding Approximate Solutions to NP-hard Problems by Neural
Networks Is Hard," Information Processing Letters, 41:93--98, 1992.
Filename: yao.complex.ps.Z
(3) X. Yao, "Simulated Annealing with Extended Neighbourhood," International
Journal of Computer Mathematics, 40:169--189, 1991.
Filename: yao.sa_en.ps.Z
ftp Instructions:
unix> ftp archive.cis.ohio-state.edu (or 128.146.8.52)
Name: anonymous
Password: (your email address)
ftp> cd pub/neuroprose
ftp> binary
ftp> get yao.filename.ps.Z (where filename is one of the above three)
ftp> quit
unix> uncompress yao.filename.ps.Z
unix> lpr yao.filename.ps (or whatever you use to print .ps files)
- --
| Xin Yao CSIRO Division of Building, Construction and Engineering |
| Post Office Box 56, Highett, Victoria, Australia 3190 |
| Internet: xin@mel.dbce.csiro.au Fax: +61 3 252 6244 |
| Tel: +61 3 252 6000 (switchboard) +61 3 252 6374 (office) |
|_____________________________________________________________________________|
------------------------------
Subject: TR - VISUAL ATTENTION AND INVARIANT PATTERN RECOGNITION
From: bruno@cns.caltech.edu (Bruno Olshausen)
Date: Fri, 07 Aug 92 22:51:01 -0800
The following technical report has been archived for public ftp:
- ----------------------------------------------------------------------
A NEURAL MODEL OF VISUAL ATTENTION AND INVARIANT PATTERN RECOGNITION
Bruno Olshausen, Charles Anderson*, and David Van Essen
Computation and Neural Systems Program
Division of Biology, 216-76
and
*Jet Propulsion Laboratory
California Institute of Technology
Pasadena, CA 91125
CNS Memo 18
Abstract. We present a biologically plausible model of an attentional
mechanism for forming position- and scale-invariant object
representations. The model is based on using control neurons to
dynamically modify the synaptic strengths of intra-cortical
connections so that information from a windowed region of primary
visual cortex, V1, is routed to higher cortical areas while preserving
information about spatial relationships. This paper describes details
of a neural circuit for routing visual information and provides a
solution for controlling the circuit as part of an autonomous
attentional system for recognizing objects. The model is designed to
be consistent with known neurophysiology, neuroanatomy, and
psychophysics, and it makes a variety of experimentally testable
predictions.
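A toy illustration of the routing idea, reduced to one dimension:
"control" parameters (window centre and scale) set the effective
connection weights so that a window of the input array is mapped onto a
smaller output array while spatial order is preserved. The linear
interpolation scheme below is an assumption for illustration, not the
circuit described in the memo.

import numpy as np

def routing_weights(n_in, n_out, center, scale):
    """Connection matrix that reads n_out samples out of an n_in input,
    spaced 'scale' apart around 'center', using linear interpolation so
    that spatial relationships are preserved."""
    W = np.zeros((n_out, n_in))
    for i in range(n_out):
        pos = center + scale * (i - n_out / 2.0)
        lo = int(np.floor(pos))
        frac = pos - lo
        if 0 <= lo < n_in:
            W[i, lo] += 1.0 - frac
        if 0 <= lo + 1 < n_in:
            W[i, lo + 1] += frac
    return W

v1_activity = np.sin(np.linspace(0.0, 6.0, 128))   # stand-in for a 1-D "V1" array
# the control neurons correspond to the (center, scale) parameters here
attended = routing_weights(128, 32, center=64.0, scale=1.5) @ v1_activity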
- ----------------------------------------------------------------------
Obtaining the paper via anonymous ftp:
1. ftp to kant.cns.caltech.edu (131.215.135.31)
2. login as 'anonymous' and type your email address as the password
3. cd to pub/cnsmemo.18
4. set transfer mode to binary (type 'binary' at the prompt)
5. get either 'paper-apple.tar.Z' or 'paper-sparc.tar.Z'. The first
will print on the Apple LaserWriter II, the other on the SPARCprinter.
(They may work on other PostScript printers too, but I can't guarantee it.)
6. quit from ftp, and then uncompress and detar the file on your
machine by typing
uncompress -c filename.tar.Z | tar xvf -
7. remove the tarfile and print out the three postscript files
(paper1.ps, paper2.ps and paper3.ps), beginning with paper3.ps.
If you don't have an appropriate PostScript printer, then send a
request for a hardcopy to bruno@cns.caltech.edu.
------------------------------
Subject: 2 TRs - Iterated Function Systems, Approximations to Functions
From: rdj@demos.lanl.gov (Roger D. Jones)
Date: Mon, 10 Aug 92 10:35:07 -0700
TECHNICAL REPORTS AVAILABLE
A RECURRENT NETWORK FOR THE SOLUTION TO THE INVERSE PROBLEM
OF ITERATED FUNCTION SYSTEMS
O. L. Bakalis, R. D. Jones, Y. C. Lee, and B. J. Travis
ON THE EXISTENCE AND STABILITY OF CERTAIN TYPES OF NEUROMORPHIC
APPROXIMATIONS TO FUNCTIONS
R. K. Prasanth, R. D. Jones, and Y. C. Lee
Please send surface mail address to rdj@lanl.gov
or
Roger D. Jones
MS-F645
Los Alamos National Laboratory
Los Alamos, New Mexico 87545
------------------------------
Subject: Book - The Global Dynamics of CA
From: Andrew Wuensche <100020.2727@CompuServe.COM>
Date: 13 Aug 92 09:06:07 -0500
I would like to announce the following book, now available.
thanks
Andy Wuensche
wuensch@santafe.edu
THE GLOBAL DYNAMICS OF CELLULAR AUTOMATA
An Atlas of Basin of Attraction Fields of
One-Dimensional Cellular Automata.
Andrew Wuensche
Mike Lesser
Foreword by Chris Langton
Diskette included for PC-compatible computers.
Santa Fe Institute Studies in the Sciences of Complexity
Reference Vol 1
Addison-Wesley Publishing Co. Reading MA, phone:(800) 447 2226
ISBN 0-201-55740-1 price: about $54
Abstract:
The Global Dynamics of Cellular Automata introduces a new global
perspective for the study of discrete dynamical systems, analogous to
the phase portrait in continuous dynamical systems.
As well as looking at the unique trajectory of the system's future,
an algorithm is presented that directly computes the multiple merging
trajectories that may have constituted the system's past. A given set
of cellular automata parameters will, in a sense, crystallize state
space into a set of basins of attraction that will typically have the
topology of branching trees rooted on attractor cycles. The explicit
portraits of these mathematical objects are made accessible. The Atlas
presents two complete classes of such objects: for the 3-neighbour
rules (elementary rules) and for the 5-neighbour totalistic rules.
The book looks in detail at CA architecture and rule systems, and
the corresponding global dynamics. It is shown that the evolution of CA
with periodic boundary conditions is bound by general principles
relating to symmetries of the circular array. The rule numbering system
and equivalence classes are reviewed. Symmetry categories, rule
clusters, limited pre-image rules, and the reverse algorithm are
introduced. The Z parameter (depending only on the rule table) is
introduced, reflecting the degree of pre-imaging, or the convergence of
dynamical flow in state space evident in the basin of attraction
field. A relationship between the Z parameter, basin field topology,
and rule behaviour classes is proposed. A genotype-phenotype analogy
looks at the effect of mutating the rule table to produce mutant basin
fields.
The accompanying software is an interactive research tool capable of
generating basins of attraction for any of the 2^32 CA rules in
5-neighbour rule space (for a range of array sizes), as well as
pre-images, space-time patterns and mutation. The operating
instructions are contained in the book.
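For readers who want to experiment before the diskette arrives, the
transition graph of a small elementary CA can be computed by brute
force, from which pre-images and basins of attraction can be read off.
This is not the book's reverse algorithm (which finds pre-images
directly); the sketch simply enumerates all 2^N states of an N-cell
ring, so it is only feasible for small N.

from collections import defaultdict

def step(state, rule, n):
    """One synchronous update of an n-cell ring under an elementary
    (3-neighbour, k=2) rule, with the state packed into an integer."""
    nxt = 0
    for i in range(n):
        left = (state >> ((i + 1) % n)) & 1
        centre = (state >> i) & 1
        right = (state >> ((i - 1) % n)) & 1
        if (rule >> (4 * left + 2 * centre + right)) & 1:
            nxt |= 1 << i
    return nxt

def transition_graph(rule, n):
    succ = {s: step(s, rule, n) for s in range(2 ** n)}
    pre = defaultdict(list)            # pre-images of every state
    for s, t in succ.items():
        pre[t].append(s)
    return succ, pre

succ, pre = transition_graph(110, 8)   # rule 110 on an 8-cell ring
garden_of_eden = [s for s in range(2 ** 8) if not pre[s]]   # states without pre-images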
* * * * * *
------------------------------
Subject: IJNS contents vol. 3 issues 2 and 3
From: BRUNAK@nbivax.nbi.dk
Date: 30 Oct 92 10:56:54 +0100
Begin Message:
- -----------------------------------------------------------------------
INTERNATIONAL JOURNAL OF NEURAL SYSTEMS
The International Journal of Neural Systems is a quarterly journal
which covers information processing in natural and artificial neural
systems. It publishes original contributions on all aspects of this
broad subject which involves physics, biology, psychology, computer
science and engineering. Contributions include research papers, reviews
and short communications. The journal takes a fresh, undogmatic
attitude towards this multidisciplinary field, with the aim of being a
forum for novel ideas and improved understanding of collective and
cooperative phenomena with computational capabilities.
ISSN: 0129-0657 (IJNS)
- ----------------------------------
Contents of Volume 3, issue number 2 (1992):
1. H.C. Card & C.R. Schneider:
Analog CMOS Neural Circuits - In situ Learning.
2. M.W. Goudreau & C.L. Giles:
Routing in Random Multistage Interconnection Networks:
Comparing Exhaustive Search, Greedy and Neural Network Approaches.
3. P.J. Zwietering, E.H. L. Aarts & J. Wessels:
Exact Classification with Two-Layered Perceptrons.
4. D. Saad & R. Sasson:
Examining the CHIR Algorithm Performance for
Multilayer Networks and Continuous Input Vectors.
5. I. Ginzberg & D. Horn:
Learning the Rule of a Time Series.
6. H.J. Chang, J. Ghosh & K. Liano:
A Macroscopic Model of Neural Ensembles:
Learning-Induced Oscillations in a Cell.
7. S. Hejazi, S.M. Bauer & R.A. Spangler:
Neural Network Analysis of Thermal Image Data.
8. K.T. Sun & H.C. Fu:
A Neural Network Implementation for the Traffic
Control Problem on Crossbar Switch Networks.
Contents of Volume 3, issue number 3 (1992):
1. J. Reynolds & L. Tarassenko:
Spoken Letter Recognition with Neural Networks.
2. Z. Li:
Different Retinal Ganglion Cells have Different Functional Goals.
3. O. Shagrir:
A Neural Net with Self-Inhibiting Units for the N-Queens Problem.
4. L. Xu, S. Klasa & A. Yuille:
Recent Advances on Techniques of Static Feed-forward Networks
with Supervised Learning.
5. M-Y. Chow & S.O. Yee:
A Measure of Relative Robustness for Feedforward
Neural Networks Subject to Small Input Perturbations.
6. F.L. Chung & T. Lee:
A Node Pruning Algorithm for Backpropagation Networks.
7. S. Tan, J. Hao & J. Vandewalle:
Pattern Storage and Hopfield Neural Associative
Memory with Hidden Structure.
- ----------------------------------
Editorial board:
B. Lautrup (Niels Bohr Institute, Denmark) (Editor-in-charge)
S. Brunak (Technical Univ. of Denmark) (Assistant Editor-in-Charge)
D. Stork (Stanford) (Book review editor)
Associate editors:
J. Alspector (Bellcore)
B. Baird (Berkeley)
D. Ballard (University of Rochester)
E. Baum (NEC Research Institute)
S. Bjornsson (University of Iceland)
J. M. Bower (CalTech)
S. S. Chen (University of North Carolina)
R. Eckmiller (University of Dusseldorf)
J. L. Elman (University of California, San Diego)
M. V. Feigelman (Landau Institute for Theoretical Physics)
F. Fogelman-Soulie (Paris)
K. Fukushima (Osaka University)
A. Gjedde (Montreal Neurological Institute)
S. Grillner (Nobel Institute for Neurophysiology, Stockholm)
T. Gulliksen (University of Oslo)
D. Hammerstrom (Oregon Graduate Institute)
D. Horn (Tel Aviv University)
J. Hounsgaard (University of Copenhagen)
B. A. Huberman (XEROX PARC)
L. B. Ioffe (Landau Institute for Theoretical Physics)
P. I. M. Johannesma (Katholieke Univ. Nijmegen)
M. Jordan (MIT)
G. Josin (Neural Systems Inc.)
I. Kanter (Princeton University)
J. H. Kaas (Vanderbilt University)
A. Lansner (Royal Institute of Technology, Stockholm)
A. Lapedes (Los Alamos)
B. McWhinney (Carnegie-Mellon University)
M. Mezard (Ecole Normale Superieure, Paris)
J. Moody (Yale, USA)
A. F. Murray (University of Edinburgh)
J. P. Nadal (Ecole Normale Superieure, Paris)
E. Oja (Lappeenranta University of Technology, Finland)
N. Parga (Centro Atomico Bariloche, Argentina)
S. Patarnello (IBM ECSEC, Italy)
P. Peretto (Centre d'Etudes Nucleaires de Grenoble)
C. Peterson (University of Lund)
K. Plunkett (University of Aarhus)
S. A. Solla (AT&T Bell Labs)
M. A. Virasoro (University of Rome)
D. J. Wallace (University of Edinburgh)
D. Zipser (University of California, San Diego)
- ----------------------------------
CALL FOR PAPERS
Original contributions consistent with the scope of the journal are
welcome. Complete instructions as well as sample copies and
subscription information are available from
The Editorial Secretariat, IJNS
World Scientific Publishing Co. Pte. Ltd.
73, Lynton Mead, Totteridge
London N20 8DH
ENGLAND
Telephone: (44)81-446-2461
or
World Scientific Publishing Co. Inc.
Suite 1B
1060 Main Street
River Edge
New Jersey 07661
USA
Telephone: (1)201-487-9655
or
World Scientific Publishing Co. Pte. Ltd.
Farrer Road, P. O. Box 128
SINGAPORE 9128
Telephone (65)382-5663
- -----------------------------------------------------------------------
End Message
------------------------------
Subject: Preprint available: Synchronization and label-switching
From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Wed, 08 Apr 92 19:20:43 +0100
The following paper has been accepted for publication in the
proceedings of the International Conference on
Artificial Neural Networks '92 in Brighton:
SYNCHRONIZATION AND LABEL-SWITCHING IN NETWORKS OF
LATERALLY COUPLED MODEL NEURONS
by Alfred Nischwitz, Peter Klausner and Andreas von Oertzen
   Lehrstuhl fuer Nachrichtentechnik, Technische Universitaet Muenchen
   Arcisstrasse 21, D-8000 Muenchen 2, Germany
and Helmut Gluender
   Institut fuer Medizinische Psychologie, Ludwig-Maximilians-Universitaet
   Goethestrasse 31, D-8000 Muenchen 2, Germany
ABSTRACT:
Necessary conditions for impulse synchronization in non-oscillating
networks of laterally coupled 'integrate-and-fire' model neurons are
investigated. The behaviour of such networks under homogeneous
stimulation as well as for differently stimulated subpopulations is
studied. In the first case, synchronization accurate to fractions of
the impulse duration can be achieved by either lateral inhibition or
lateral excitation; in the second case, good and independent
synchronization is obtained within subpopulations if they are separated
by unstimulated neurons.
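A minimal discrete-time simulation of laterally coupled
'integrate-and-fire' neurons conveys the flavour of such a setup; the
parameters, nearest-neighbour excitatory coupling and homogeneous drive
below are illustrative assumptions, not those used in the paper.

import numpy as np

rng = np.random.default_rng(1)
N, T = 30, 500
dt, tau, v_th = 1.0, 20.0, 1.0
drive = 0.06                            # homogeneous stimulation
w_lat = 0.02                            # excitatory coupling to nearest neighbours

v = rng.uniform(0.0, v_th, N)           # random initial membrane potentials
spikes = np.zeros((T, N), dtype=bool)
for t in range(T):
    fired = v >= v_th
    spikes[t] = fired
    v[fired] = 0.0                      # reset after an impulse
    lateral = w_lat * (np.roll(fired, 1) + np.roll(fired, -1))
    v += dt / tau * (-v) + drive + lateral
# With w_lat = 0 the neurons keep their independent phases; with enough
# lateral coupling the impulses of neighbouring neurons tend to line up.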
Hardcopies of the paper are available. Please send requests via
email or to the following address in Germany:
Alfred Nischwitz
Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, F.R.Germany
email: alfred@lnt.e-technik.tu-muenchen.de
Alfred Nischwitz
------------------------------
Subject: VLSI Neural Network Application in High Energy Physics
From: LINDSEY@FNAL.FNAL.GOV
Date: Mon, 13 Apr 92 14:05:53 -0600
For those interested in hardware neural network applications,
copies of the following paper are available via mail or fax. Send requests to
Clark Lindsey at BITNET%"LINDSEY@FNAL".
REAL TIME TRACK FINDING IN A DRIFT CHAMBER WITH A
VLSI NEURAL NETWORK*
Clark S. Lindsey (a), Bruce Denby (a), Herman Haggerty (a),
and Ken Johns (b)
(a) Fermi National Accelerator Laboratory, P.O. Box 500, Batavia,
Illinois 60510.
(b) University of Arizona, Dept of Physics, Tucson, Arizona 85721.
ABSTRACT
In a test setup, a hardware neural network determined track parameters
of charged particles traversing a drift chamber. Voltages proportional
to the drift times in 6 cells of the 3-layer chamber were inputs to the
Intel ETANN neural network chip which had been trained to give the
slope and intercept of tracks. We compare network track parameters to
those obtained from off-line track fits. To our knowledge this is the
first on-line application of a VLSI neural network to a high energy
physics detector. This test explored the potential of the chip and the
practical problems of using it in a real world setting. We compare chip
performance to a neural network simulation on a conventional computer.
We discuss possible applications of the chip in high energy physics
detector triggers.
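The following is a rough software analogue of the setup, not the ETANN
chip itself: a small feed-forward network is trained by plain batch
backpropagation to map six drift-time values to a track's slope and
intercept, with the "off-line fit" parameters as targets. The chamber
geometry, network size and training data are simplified assumptions.

import numpy as np

rng = np.random.default_rng(2)
layers_y = np.array([0.0, 1.0, 2.0])      # assumed layer heights
cells_x = np.array([-1.0, 1.0])           # assumed wire positions, two cells per layer

def drift_times(slope, intercept):
    """Distance of the track from each of the 6 wires, standing in for
    voltages proportional to drift time."""
    x_at_layer = intercept + slope * layers_y
    return np.abs(x_at_layer[:, None] - cells_x[None, :]).reshape(-1)

params = rng.uniform(-0.3, 0.3, (2000, 2))           # (slope, intercept) of training tracks
X = np.array([drift_times(s, b) for s, b in params])
Y = params                                           # targets from the assumed "fit"

W1 = rng.normal(0, 0.3, (6, 10)); b1 = np.zeros(10)
W2 = rng.normal(0, 0.3, (10, 2)); b2 = np.zeros(2)
lr = 0.1
for epoch in range(500):                             # plain batch backpropagation
    H = np.tanh(X @ W1 + b1)
    P = H @ W2 + b2
    dP = (P - Y) / len(X)
    dH = (dP @ W2.T) * (1.0 - H ** 2)
    W2 -= lr * H.T @ dP; b2 -= lr * dP.sum(0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)

estimate = np.tanh(drift_times(0.2, -0.1) @ W1 + b1) @ W2 + b2   # network's (slope, intercept)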
Accepted by Nuclear Instruments and Methods, Section A
* FERMILAB-Pub-92/55
------------------------------
Subject: Preprint available: A network to velocity vector-field correction
From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Thu, 30 Apr 92 09:49:47 +0100
The following paper has been accepted for publication in the
proceedings of the International Conference on
Artificial Neural Networks '92 in Brighton:
Relaxation in 4D state space - A competitive network
approach to object-related velocity vector-field correction
by Helmut Gluender
   Institut fuer Medizinische Psychologie, Ludwig-Maximilians-Universitaet
   Goethestrasse 31, D-8000 Muenchen 2, Germany
and Astrid Lehmann
   Lehrstuhl fuer Nachrichtentechnik, Technische Universitaet Muenchen
   Arcisstrasse 21, D-8000 Muenchen 2, Germany
ABSTRACT:
A standard principle of (energy-)minimization is applied to the
problem of visual motion analysis. In contrast to well-known
mathematical optimization procedures and universal optimizing
networks it is proposed to use a problem-adapted network
architecture. Owing to the bilocal coincidence-type motion
detector considered here the task of object-related motion
analysis appears as a geometric correspondence problem. Hence,
the correct spatio-temporal correspondences between elements in
consecutive images must be selected from all possible ones. This
is performed by neighborhood operations that are repeatedly
applied to the instantaneous signal representation in the
space/velocity-domain until an estimate of the actual flow-field
is reached.
Hardcopies of the paper are available. Please send requests
to the following address in Germany:
Helmut Gluender
Institut fuer Medizinische Psychologie
Ludwig-Maximilians-Universitaet
Goethestrasse 31, D-8000 Muenchen 2, Germany
or via email to:
alfred@lnt.e-technik.tu-muenchen.de
communicated by Alfred Nischwitz
------------------------------
Subject: Genetic Synthesis of Unsupervised Learning Algorithms
From: dasdan@trbilun.bitnet (Ali Dasdan)
Date: Mon, 19 Jul 93 11:33:44 +0200
The following 25-page paper is available via anonymous ftp.
Genetic Synthesis of Unsupervised Learning Algorithms
Ali DASDAN and Kemal OFLAZER
Department of Computer Engineering and Information Science
Bilkent University
06533 Bilkent, Ankara, TURKEY
Email : dasdan@bcc.bilkent.edu.tr
Abstract
This paper presents new unsupervised learning algorithms that have been
synthesized using a genetic approach. A set of such learning algorithms
has been compared with the classical Kohonen algorithm for the
Self-Organizing Map and has been found to score better on a performance
measure. This study indicates that there exist many unsupervised
learning algorithms that lead to an organization similar to that of
Kohonen's algorithm, and that genetic algorithms can be used to search
for optimal algorithms and optimal architectures for unsupervised
learning.
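As a very loose sketch of the general approach (a genetic search over
the coefficients of a parameterized unsupervised weight-update rule,
scored by how well the trained map quantizes the data), one might write
the following. The rule terms, fitness measure and GA settings are
assumptions for illustration, not the representation used in the paper.

import numpy as np

rng = np.random.default_rng(3)
data = rng.uniform(0, 1, (300, 2))

def train_map(coeffs, n_units=16, epochs=2):
    """Update rule: dw = lr * (c0*(x - w) + c1*x + c2*w), applied to the
    winning unit and, with a smaller rate, to its two ring neighbours."""
    w = rng.uniform(0, 1, (n_units, 2))
    for _ in range(epochs):
        for x in data:
            win = int(np.argmin(((w - x) ** 2).sum(1)))
            for unit, lr in ((win, 0.1), ((win - 1) % n_units, 0.03),
                             ((win + 1) % n_units, 0.03)):
                delta = (coeffs[0] * (x - w[unit]) + coeffs[1] * x
                         + coeffs[2] * w[unit])
                w[unit] = np.clip(w[unit] + lr * delta, -5.0, 5.0)
    return w

def fitness(coeffs):
    w = train_map(coeffs)
    return -np.mean([((w - x) ** 2).sum(1).min() for x in data])   # -quantization error

pop = rng.uniform(-1, 1, (20, 3))                    # population of rule coefficients
for gen in range(8):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-10:]]          # keep the better half
    children = parents[rng.integers(0, 10, 10)] + rng.normal(0, 0.1, (10, 3))
    pop = np.vstack([parents, children])
best_rule = pop[int(np.argmax([fitness(c) for c in pop]))]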
To obtain an electronic copy:
Either #1 :
- --------
ftp archive.cis.ohio-state.edu
login: anonymous
password: <your email address>
cd /pub/neuroprose
binary
get dasdan.gen-unsup.ps.Z
quit
Then at your system:
uncompress dasdan.gen-unsup.ps.Z
Or #2 :
- --------
ftp firat.bcc.bilkent.edu.tr
login: anonymous
password: <your email address>
cd /pub/Neural/Papers
binary
get gen-unsup.ps.z
quit
Then at your system:
uncompress gen-unsup.ps.z
Kemal Oflazer e-mail: ko@hattusas.cs.bilkent.edu.tr
Bilkent University : ko@cs.bilkent.edu.tr
Computer Engineering Department : ko@trbilun.bitnet (BITNET/EARN)
Bilkent, ANKARA, 06533 TURKIYE tel: (90) 4 - 266-4133
fax: (90) 4 - 266-4126
- ----------------------------------------------------------------------------
------------------------------
Subject: Preprint Available: Random-Walk Learning
From: rwa@spine.lanl.gov (Russell W. Anderson)
Date: Fri, 23 Jul 93 08:08:01 -0700
PREPRINT AVAILABLE:
"Biased Random-Walk Learning:
A Neurobiological Correlate to Trial-and-Error"
(In press: Progress in Neural Networks)
Russell W. Anderson
Los Alamos National Laboratory
Abstract: Neural network models offer a theoretical testbed for the study
of learning at the cellular level. The only experimentally verified
learning rule, Hebb's rule, is extremely limited in its ability to train
networks to perform complex tasks. An identified cellular mechanism
responsible for Hebbian-type long-term potentiation, the NMDA receptor,
is highly versatile. Its function and efficacy are modulated by a wide
variety of compounds and conditions and are likely to be directed by
non-local phenomena. Furthermore, it has been demonstrated that NMDA
receptors are not essential for some types of learning. We have shown
that another neural network learning rule, the chemotaxis algorithm, is
theoretically much more powerful than Hebb's rule and is consistent with
experimental data. A biased random-walk in synaptic weight space is a
learning rule immanent in nervous activity and may account for some types
of learning -- notably the acquisition of skilled movement.
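The chemotaxis-style rule itself is simple to state: take a random step
in weight space, keep stepping in the same direction while the error
improves, and draw a fresh random direction when it does not. Below is
a minimal sketch on an assumed toy task and network (not the
simulations in the preprint).

import numpy as np

rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, (200, 2))
Y = (X[:, 0] * X[:, 1] > 0).astype(float)     # assumed XOR-like toy task

def error(w):
    W1 = w[:8].reshape(2, 4); b1 = w[8:12]
    W2 = w[12:16].reshape(4, 1); b2 = w[16:17]
    h = np.tanh(X @ W1 + b1)
    z = (h @ W2 + b2)[:, 0]
    p = 1.0 / (1.0 + np.exp(-z))
    return np.mean((p - Y) ** 2)

w = rng.normal(0, 0.5, 17)                    # all weights of a tiny 2-4-1 network
step = 0.05
direction = rng.normal(0, step, w.size)
e = error(w)
for _ in range(5000):
    e_new = error(w + direction)
    if e_new < e:                             # biased: keep going while it helps
        w, e = w + direction, e_new
    else:                                     # otherwise draw a new random direction
        direction = rng.normal(0, step, w.size)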
- ------------------------------------------
Electronic copy available, excluding 2 figures.
For hardcopies of the figures, please
contact me by email or slow mail.
To obtain a postscript copy:
%ftp mhc.lanl.gov
login: anonymous
password: <your email address>
ftp> cd pub
ftp> binary
ftp> get bias.ps.Z
ftp> quit
%uncompress bias.ps.Z
%lpr bias.ps
E-mail:
send request to rwa@temin.lanl.gov
Slow mail:
Russell Anderson
Theoretical Division (T-10)
MS K710
Los Alamos National Laboratory
Los Alamos, NM 87545
USA
(505) 667-9455
- -----------------------------------------------------
------------------------------
End of Neuron Digest [Volume 11 Issue 48]
*****************************************