Neuron Digest   Thursday, 15 Jul 1993                Volume 11 : Issue 44 

Today's Topics:
Administrivia - Paper announcements
Preprint available: A network to velocity vector-field correction
New Book and Videotape on Genetic Programming
Subject: Preprint available: On spike synchronization
Preprint: Computational Models of the Neural Bases of Learning and Memory
paper available - the Ghost in the Machine
thesis available
PhD dissertation available


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Administrivia - Paper announcements
From: "Neuron-Digest Moderator, Peter Marvit" <neuron@cattell.psych.upenn.edu>
Date: Thu, 15 Jul 93 16:27:10 -0500

Dear readers,

Many of you have noticed a lack of paper and technical report
announcements in the Neuron Digest for some time. As moderator, I have
tended to give priority to time-sensitive conference announcements and
general personal discussion (including job announcements). As a result, I
have accumulated a significant backlog of papers and TRs, which I will
now start working through. Unfortunately, many of these "pre-prints" will
by now be available in journals or conference proceedings.

As a reminder, if someone advertises hard copy, *PLEASE* do not
indiscriminately ask for copies. A deluge of requests from the net can
put a strain on a researcher's time and budget. Carefully read the
abstract and then decide whether it is worth someone's time to make a
copy, address an envelope, put on lots of stamps, and mail it to you.
Please also note that many hard-copy adverts include specific mailing
instructions plus a publication fee. Please read the directions
carefully.

This is also a note to encourage electronic distribution of manuscripts,
either in Postscript form with figures or plain text.

Cheers and happy reading!
Peter

: Peter Marvit, Neuron Digest Moderator :
: Email: <neuron-request@cattell.psych.upenn.edu> :
: Courtesy of the Psychology Department, University of Pennsylvania :
: 3815 Walnut St., Philadelphia, PA 19104 w:215/898-6274 h:215/387-6433 :

------------------------------

Subject: Preprint available: A network to velocity vector-field correction
From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Mon, 31 Aug 92 14:14:10 +0100


The following paper has been accepted for publication in the
proceedings of the International Conference on
Artificial Neural Networks '92 in Brighton:

Relaxation in 4D state space - A competitive network
approach to object-related velocity vector-field correction


by Helmut Gluender Institut fuer Medizinische Psychologie
Ludwig-Maximilians-Universitaet
Goethestrasse 31, D-8000 Muenchen 2, Germany

and Astrid Lehmann Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, Germany

ABSTRACT:

A standard principle of (energy-)minimization is applied to the
problem of visual motion analysis. In contrast to well-known
mathematical optimization procedures and universal optimizing
networks, it is proposed to use a problem-adapted network
architecture. Owing to the bilocal coincidence-type motion
detector considered here, the task of object-related motion
analysis appears as a geometric correspondence problem. Hence,
the correct spatio-temporal correspondences between elements in
consecutive images must be selected from all possible ones. This
is performed by neighborhood operations that are repeatedly
applied to the instantaneous signal representation in the
space/velocity-domain until an estimate of the actual flow-field
is reached.
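For readers who want to experiment, the repeated neighborhood operations described in the abstract can be caricatured in a few lines of Python. This toy works on a 1-D field with several candidate velocities per position, not the authors' 4-D space/velocity domain, and every parameter below is an invented illustration:

```python
# Toy caricature of iterative neighborhood relaxation on a velocity field.
# Each position has several candidate velocities (ambiguous matches); each
# pass keeps the candidate best supported by the neighbors' current
# estimates. (Illustrative only; the paper works in a 4-D domain.)

def relax(candidates, passes=10):
    # start from an arbitrary (here: the last) candidate at each position
    field = [c[-1] for c in candidates]
    for _ in range(passes):
        new = []
        for i, cands in enumerate(candidates):
            left = field[max(i - 1, 0)]
            right = field[min(i + 1, len(field) - 1)]
            target = (left + right) / 2.0          # neighborhood estimate
            new.append(min(cands, key=lambda v: abs(v - target)))
        if new == field:                           # converged
            break
        field = new
    return field

# every position could match the "true" velocity 1.0 or a spurious one
cands = [[1.0, 5.0], [1.0, -3.0], [1.0, 7.0], [1.0, 0.0], [1.0, 4.0]]
print(relax(cands))  # -> [1.0, 1.0, 1.0, 1.0, 1.0]
```

Each pass keeps, at every position, the candidate velocity closest to the average of the neighboring estimates, so the spatially coherent interpretation wins once the neighbors agree.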

Hardcopies of the paper are available. Please send requests
to the following address in Germany:

Helmut Gluender
Institut fuer Medizinische Psychologie
Ludwig-Maximilians-Universitaet
Goethestrasse 31, D-8000 Muenchen 2, Germany

or via email to:
alfred@lnt.e-technik.tu-muenchen.de

communicated by Alfred Nischwitz


------------------------------

Subject: New Book and Videotape on Genetic Programming
From: John Koza <koza@CS.Stanford.EDU>
Date: Sun, 15 Nov 92 16:40:57 -0800



BOOK AND VIDEOTAPE ON GENETIC PROGRAMMING

A new book and a one-hour videotape (in VHS NTSC, PAL, and SECAM
formats) on genetic programming are now available from the MIT
Press.

NEW BOOK...

GENETIC PROGRAMMING: ON THE PROGRAMMING OF COMPUTERS BY
MEANS OF NATURAL SELECTION

by John R. Koza, Stanford University

The recently developed genetic programming paradigm provides a
way to genetically breed a computer program to solve a wide variety
of problems. Genetic programming starts with a population of
randomly created computer programs and iteratively applies the
Darwinian reproduction operation and the genetic crossover (sexual
recombination) operation in order to breed better individual
programs. The book describes and illustrates genetic programming
with 81 examples from various fields.

840 pages. 270 Illustrations. ISBN 0-262-11170-5.

Contents...

1 Introduction and Overview
2 Pervasiveness of the Problem of Program Induction
3 Introduction to Genetic Algorithms
4 The Representation Problem for Genetic Algorithms
5 Overview of Genetic Programming
6 Detailed Description of Genetic Programming
7 Four Introductory Examples of Genetic Programming
8 Amount of Processing Required to Solve a Problem
9 Nonrandomness of Genetic Programming
10 Symbolic Regression - Error-Driven Evolution
11 Control - Cost-Driven Evolution
12 Evolution of Emergent Behavior
13 Evolution of Subsumption
14 Entropy-Driven Evolution
15 Evolution of Strategy
16 Co-Evolution
17 Evolution of Classification
18 Iteration, Recursion, and Setting
19 Evolution of Constrained Syntactic Structures
20 Evolution of Building Blocks
21 Evolution of Hierarchies of Building Blocks
22 Parallelization of Genetic Programming
23 Ruggedness of Genetic Programming
24 Extraneous Variables and Functions
25 Operational Issues
26 Review of Genetic Programming
27 Comparison with Other Paradigms
28 Spontaneous Emergence of Self-Replicating and Self-Improving
Computer Programs
29 Conclusions

Appendices contain simple software in Common LISP for
implementing experiments in genetic programming.

ONE-HOUR VIDEOTAPE...

GENETIC PROGRAMMING: THE MOVIE

by John R. Koza and James P. Rice, Stanford University

The one-hour videotape (in VHS NTSC, PAL, and SECAM formats)
provides a general introduction to genetic programming and a
visualization of actual computer runs for 22 of the problems
discussed in the book GENETIC PROGRAMMING: ON THE PROGRAMMING
OF COMPUTERS BY MEANS OF NATURAL SELECTION. The problems
include symbolic regression, the intertwined spirals, the artificial
ant, the truck backer upper, broom balancing, wall following, box
moving, the discrete pursuer-evader game, the differential pursuer-
evader game, inverse kinematics for controlling a robot arm,
emergent collecting behavior, emergent central place foraging, the
integer randomizer, the one-dimensional cellular automaton
randomizer, the two-dimensional cellular automaton randomizer,
task prioritization (Pac Man), programmatic image compression,
solving numeric equations for a numeric root, optimization of lizard
foraging, Boolean function learning for the 11-multiplexer, co-
evolution of game-playing strategies, and hierarchical automatic
function definition as applied to learning the Boolean even-11-
parity function.

---------------------------ORDER FORM----------------------

PHONE: 800-326-4471 TOLL-FREE or 617-625-8569
MAIL: The MIT Press, 55 Hayward Street, Cambridge, MA 02142
FAX: 617-625-9080

Please send
____ copies of the book GENETIC PROGRAMMING: ON THE
PROGRAMMING OF COMPUTERS BY MEANS OF NATURAL SELECTION by
John R. Koza (KOZGII) (ISBN 0-262-11170-5) @ $55.00.
____ copies of the one-hour videotape GENETIC PROGRAMMING: THE
MOVIE by John R. Koza and James P. Rice in VHS NTSC format
(KOZGVV) (ISBN 0-262-61084-1) @$34.95
____ copies of the videotape in PAL format (KOZGPV) (ISBN 0-262-61087-6) @$44.95
____ copies of the videotape in SECAM format (KOZGSV) (ISBN 0-262-61088-4) @$44.95.

Name __________________________________

Address_________________________________

City____________________________________

State_________________Zip________________

Country_________________________________

Phone Number ___________________________

$ _______ Total
$ _______ Shipping and Handling ($3 per item. Outside U.S. and
Canada, add $6 per item for surface rate or $22 per item for airmail)
$ _______ Canada - Add 7% GST
$ _______ Total due MIT Press

__ Payment attached (check payable to The MIT Press in U.S. funds)
__ Please charge to my VISA or MASTERCARD credit card

Number ________________________________
Credit Card Expires _________________________________
Signature ________________________________




------------------------------

From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Wed, 13 Jan 93 14:08:00 +0100
Subject: Preprint available: On spike synchronization

The following paper will be published in
"Brain Theory - Spatio-Temporal Aspects of Brain Function"
edited by A.Aertzen & W. von Seelen, Elsevier, Amsterdam:

ON SPIKE SYNCHRONIZATION

by Helmut Gluender Institut fuer Medizinische Psychologie
Ludwig-Maximilians-Universitaet
Goethestrasse 31, D-8000 Muenchen 2, Germany

and Alfred Nischwitz Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, Germany
ABSTRACT:

We start with historically founded reflections on the relevance
of synchronized activity for the neural processing of information, and we
propose to differentiate between synchrony at the emitting and the
receiving side. In the main part we introduce model networks which consist
of chains of locally coupled and noisy spiking neurons. In the case of
lateral excitation without delay, as well as for delayed lateral inhibition,
these basic structures can turn homogeneous stimulations into synchronized
activity. The synchrony is maintained under temporally varying stimulations,
thus evoking aperiodic spike fronts. Although we present some hypotheses,
the question of how the nervous system deals with this network property
remains to be answered.
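To give readers a feel for the phenomenon, a chain of noisy leaky integrate-and-fire neurons with undelayed nearest-neighbour excitation can be simulated in a few lines of Python. Every constant below (drive, noise level, coupling strength, threshold) is an invented illustration, far simpler than the models in the paper:

```python
import random

# Chain of noisy leaky integrate-and-fire neurons with undelayed
# nearest-neighbour excitatory coupling. All constants are invented
# illustrations; the paper's models are richer.

def spike_time_spread(n=20, coupling=0.0, steps=2000, dt=0.01, seed=1):
    """Standard deviation of the neurons' first spike times."""
    rng = random.Random(seed)
    v = [rng.uniform(0.0, 0.5) for _ in range(n)]     # membrane potentials
    first = [None] * n
    for t in range(steps):
        kicks = [0.0] * n
        for i in range(n):
            # leaky integration of a common supra-threshold drive, plus noise
            v[i] += dt * (1.5 - v[i]) + rng.gauss(0.0, 0.01)
        for i in range(n):
            if v[i] >= 1.0:                           # threshold crossed
                if first[i] is None:
                    first[i] = t * dt
                v[i] = 0.0                            # reset after the spike
                for j in (i - 1, i + 1):              # excite the neighbours
                    if 0 <= j < n:
                        kicks[j] += coupling
        v = [v[i] + kicks[i] for i in range(n)]
    times = [s for s in first if s is not None]
    mean = sum(times) / len(times)
    return (sum((s - mean) ** 2 for s in times) / len(times)) ** 0.5

print(spike_time_spread(coupling=0.0))   # spread without lateral excitation
print(spike_time_spread(coupling=0.2))   # spread with lateral excitation
```

With the excitatory kicks switched on, a neuron that fires pulls its neighbours over threshold sooner, so the spread of spike times tends to shrink; the abstract reports that delayed lateral inhibition can produce synchronization as well.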

Hardcopies of the paper are available. Please send requests via
email or to the following address in Germany:

Alfred Nischwitz
Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, F.R.Germany
email: alfred@lnt.e-technik.tu-muenchen.de

Alfred Nischwitz


------------------------------

Subject: Letter available: Spike-Synchronization mechanisms
From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Tue, 19 Jan 93 10:56:51 +0100

The following short letter is published in the
German edition of Scientific American (in German):

SPIKE-SYNCHRONISATIONS MECHANISMEN

by Alfred Nischwitz Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, Germany
and
Helmut Gluender Institut fuer Medizinische Psychologie
Ludwig-Maximilians-Universitaet
Goethestrasse 31, D-8000 Muenchen 2, Germany
ABSTRACT:

Synchronization and desynchronization mechanisms for inhibitorily
and excitatorily coupled 'leaky integrate and fire' model neurons
are explained by means of drawings.

Hardcopies of the paper are available. Please send requests via
email or to the following address in Germany:

Alfred Nischwitz
Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, F.R.Germany
email: alfred@lnt.e-technik.tu-muenchen.de

Alfred Nischwitz


------------------------------

Subject: Preprint: Computational Models of the Neural Bases of Learning and Memory
From: Mark Gluck <gluck@pavlov.rutgers.edu>
Date: Wed, 03 Feb 93 09:13:20 -0500


For (hard copy) preprints of the following article:


Gluck, M. A. & Granger, R. C. (1993). Computational models of the neural
bases of learning and memory. Annual Review of Neuroscience. 16: 667-706

ABSTRACT: Advances in computational analyses of parallel processing have
made computer simulation of learning systems an increasingly useful tool
in understanding complex aggregate functional effects of changes in
neural systems. In this article, we review current efforts to develop
computational models of the neural bases of learning and memory, with a
focus on the behavioral implications of network-level characterizations
of synaptic change in three anatomical regions: olfactory (piriform)
cortex, cerebellum, and the hippocampal formation.

____________________________________

Send US-mail address to: Mark Gluck (Center for Neuroscience, Rutgers-Newark)
gluck@pavlov.rutgers.edu



------------------------------

Subject: paper available - the Ghost in the Machine
From: Andrew Wuensche <100020.2727@CompuServe.COM>
Date: 03 Jun 93 11:38:43 -0500

The Ghost in the Machine
========================
Cognitive Science Research Paper 281, University of Sussex.

The following paper describes recent work on the basins of attraction of
random Boolean networks, and its implications for memory and learning.
Currently only hard copies are available. To request a copy, email

andywu@cogs.susx.ac.uk, or write to

Andy Wuensche, 48 Esmond Road, London W4 1JQ, UK
giving a surface mail address.


A B S T R A C T
---------------
The Ghost in the Machine
Basins of Attraction of Random Boolean Networks

This paper examines the basins of attraction of random Boolean networks,
a very general class of discrete dynamical systems, in which cellular
automata (CA) form a special sub-class. A reverse algorithm is presented
which directly computes the set of pre-images (if any) of a network's
state. Computation is many orders of magnitude faster than exhaustive
testing, making the detailed structure of random network basins of
attraction readily accessible for the first time. They are portrayed as
diagrams that connect up the network's global states according to their
transitions. Typically, the topology is branching trees rooted on
attractor cycles.
The homogeneous connectivity and rules of CA are necessary for the
emergence of coherent space-time structures such as gliders, the basis of
CA models of artificial life. On the other hand, random Boolean networks
have a vastly greater parameter/basin field configuration space capable
of emergent categorisation.
I argue that the basin of attraction field constitutes the network's
memory; but not simply because separate attractors categorise state space
- in addition, within each basin, sub-categories of state space are
categorised along transient trees far from equilibrium, creating a
complex hierarchy of content-addressable memory. This may answer a basic
difficulty in explaining memory by attractors in biological networks,
where transient lengths are probably astronomical.
I describe a single step learning algorithm for re-assigning
pre-images in random Boolean networks. This allows the sculpting of their
basin of attraction fields to approach any desired configuration. The
process of learning and its side effects are made visible. In the context
of many semi-autonomous weakly coupled networks, the basin field/network
relationship may provide a fruitful metaphor for the mind/brain.
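For readers who want to see such basins concretely, a toy random Boolean network can be analysed by brute force in Python. The sketch below enumerates all 2^n global states for n = 3, inverts the transition map to get pre-images, and groups states into basins; this exhaustive testing is exactly what the paper's reverse algorithm avoids, and the network here is an arbitrary illustration:

```python
from itertools import product
import random

# Basins of attraction of a tiny random Boolean network (n = 3), found by
# exhaustive enumeration of all 2**n global states. The paper's reverse
# algorithm computes pre-images directly and scales far better; brute
# force like this is only feasible for toy sizes.

random.seed(4)
n = 3
# each node updates by a random Boolean function of the whole network state
rules = [{s: random.randint(0, 1) for s in product((0, 1), repeat=n)}
         for _ in range(n)]

def step(state):
    """One synchronous update of the whole network."""
    return tuple(rules[i][state] for i in range(n))

states = list(product((0, 1), repeat=n))

# pre-images: invert the global transition map
preimages = {s: [] for s in states}
for s in states:
    preimages[step(s)].append(s)
eden = [s for s in states if not preimages[s]]   # garden-of-Eden states

def attractor(state):
    """Follow transitions until a cycle repeats; label the cycle canonically."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return min(seen[seen.index(state):])

basins = {}
for s in states:
    basins.setdefault(attractor(s), []).append(s)

for label, members in basins.items():
    print('attractor', label, 'basin size', len(members))
print('states with no pre-image:', len(eden))
```

The basins partition the whole state space, and the states with no pre-image are the leaves of the transient trees the paper portrays.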





------------------------------

Subject: thesis available
From: "Egbert J.W. Boers" <boers@WI.leidenuniv.nl>
Date: Thu, 01 Jul 93 15:41:41 +0100


FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/boers.biological-metaphors.ps.Z

The file boers.biological-metaphors.ps.Z (104 pages) is now available for
copying from the Neuroprose repository:

Biological metaphors
and the design of modular
artificial neural networks

Egbert J.W. Boers, Herman Kuiper
Leiden University
The Netherlands

ABSTRACT: In this thesis, a method is proposed with which good modular
artificial neural network structures can be found automatically using a
computer program. A number of biological metaphors are incorporated in
the method. It will be argued that modular artificial neural networks
have a better performance than their non-modular counterparts. The human
brain can also be seen as a modular neural network, and the proposed
search method is based on the natural process that resulted in the brain:
Genetic algorithms are used to imitate evolution, and L-systems are used
to model the kind of recipes nature uses in biological growth.

A small number of experiments have been done to investigate the
possibilities of the method. Preliminary results show that the method
does find modular networks, and that those networks outperform 'standard'
solutions. The method looks very promising, although the experiments
done were too limited to draw any general conclusions. One drawback is
the large amount of computing time needed to evaluate the quality of a
population member, and therefore in chapter 9 a number of possible
improvements are given on how to increase the speed of the method, as
well as a number of suggestions on how to continue from here.
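The combination of genetic algorithms and L-systems that the abstract describes can be hinted at with a toy Python sketch. Here a genotype is a set of L-system productions, the "grown" phenotype is just a string whose length is scored, and every detail (alphabet, fitness, rates) is an invented stand-in for the thesis's far richer network-growing recipes:

```python
import random

# Toy sketch of the thesis's idea: a genetic algorithm searches over
# L-system production rules, and each rule set "grows" a structure whose
# quality is scored. Everything here is an invented illustration; the
# thesis grows modular network architectures, not strings.

ALPHABET = 'AB'

def grow(rules, axiom='A', steps=4):
    """Apply the L-system productions in parallel for a few steps."""
    s = axiom
    for _ in range(steps):
        s = ''.join(rules.get(c, c) for c in s)
    return s

def fitness(rules, target_len=16):
    """Score a genotype by how close the grown string is to a target size."""
    return abs(len(grow(rules)) - target_len)

def random_rules(rng):
    return {c: ''.join(rng.choice(ALPHABET)
                       for _ in range(rng.randint(1, 3)))
            for c in ALPHABET}

def mutate(rules, rng):
    child = dict(rules)
    c = rng.choice(ALPHABET)
    child[c] = ''.join(rng.choice(ALPHABET) for _ in range(rng.randint(1, 3)))
    return child

rng = random.Random(2)
pop = [random_rules(rng) for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness)                         # selection...
    pop = pop[:10] + [mutate(rng.choice(pop[:10]), rng)
                      for _ in range(10)]         # ...and variation

best = min(pop, key=fitness)
print(best, fitness(best))
```

A genotype such as A -> AB, B -> BA doubles the string each step, so four rewriting steps grow a 16-symbol phenotype; the GA only has to discover such a compact recipe, not the phenotype itself, which is the point of the biological metaphor.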



Unfortunately, I'm not in the position to distribute paper-copies of this
thesis. Questions and remarks are most welcome.


Egbert Boers
Leiden University
The Netherlands
boers@wi.LeidenUniv.nl


------------------------------

Subject: PhD dissertation available
From: SCHOLTES@ALF.LET.UVA.NL
Date: Mon, 08 Feb 93 11:36:00 +0700

===================================================================

Ph.D. DISSERTATION AVAILABLE

on

Neural Networks, Natural Language Processing, Information Retrieval

===================================================================

A copy of the dissertation "Neural Networks in Natural Language Processing
and Information Retrieval" by Johannes C. Scholtes can be obtained at
cost price with fast airmail delivery for US$ 25.

Payment by major credit cards (VISA, AMEX, MC, Diners) is accepted and
encouraged. Please include the name on the card, the number, and the
expiration date. Your credit card will be charged Dfl. 47,50.

Within Europe one can also send a Euro-Cheque for Dfl. 47,50 to:

University of Amsterdam
J.C. Scholtes
Dufaystraat 1
1075 GR Amsterdam
The Netherlands

Do not forget to mention a surface shipping address. Please allow 2-4
weeks for delivery.


Abstract

1.0 Machine Intelligence

For over fifty years the two main directions in machine intelligence
(MI), neural networks (NN) and artificial intelligence (AI), have been
studied by various people with many different backgrounds. NN and AI
seemed to conflict with many of the traditional sciences as well as with
each other. The lack of a long research history and well-defined
foundations has always been an obstacle to the general acceptance of
machine intelligence by other fields.

At the same time, traditional schools of science such as mathematics and
physics developed their own tradition of new or "intelligent"
algorithms. Progress made in the field of statistical reestimation
techniques such as Hidden Markov Models (HMMs) started a new phase in
speech recognition. Another application of the progress of mathematics
can be found in the application of the Kalman filter to the
interpretation of sonar and radar signals. Many more examples of such
"intelligent" algorithms can be found in the statistical classification
and filtering techniques of the study of pattern recognition (PR).


Here, the field of neural networks is studied with that of pattern
recognition in mind. Although only global qualitative comparisons are
made, the importance of the relation between them is not to be
underestimated. In addition, it is argued that neural networks do indeed
add something to the fields of MI and PR, instead of competing or
conflicting with them.

2.0 Natural Language Processing

The study of natural language processing (NLP) is even older than
that of MI. As early as the beginning of this century, people tried to
analyse human language with machines. However, serious efforts had to
wait until the development of the digital computer in the 1940s, and
even then, the possibilities were limited. For over 40 years, symbolic
AI has been the most important approach in the study of NLP. That this
has not always been the case may be concluded from the early work on NLP
by Harris. As a matter of fact, Chomsky's Syntactic Structures was an
attack on the lack of structural properties in the mathematical methods
used in those days. But, as the latter's work remained the standard in
NLP, the former was almost completely forgotten until recently. As the
scientific community in NLP devoted all its attention to the symbolic
AI-like theories, the only useful practical implementations of NLP
systems were those that were based on statistics rather than on
linguistics. As a result, more and more scientists are redirecting their
attention towards the statistical techniques available in NLP. The
field of connectionist NLP can be considered a special case of these
mathematical methods in NLP.

More than one reason can be given to explain this turn in approach. On
the one hand, many problems in NLP have never been addressed properly by
symbolic AI. Some examples are robust behavior in noisy environments,
disambiguation driven by different kinds of knowledge, commonsense
generalizations, and learning (or training) abilities.
On the other hand, mathematical methods have become much stronger and
more sensitive to specific properties of language such as hierarchical
structures.

Last but not least, the relatively high degree of success of mathematical
techniques in commercial NLP systems might have set the trend towards the
implementation of simple, but straightforward algorithms.

In this study, the implementation of hierarchical structures and
semantic features in mathematical objects such as vectors and matrices
is given much attention. These vectors can then be used in models such as
neural networks, but also in sequential statistical procedures
implementing similar characteristics.

3.0 Information Retrieval

The study of information retrieval (IR) was traditionally related to
libraries on the one hand and military applications on the other.
However, as PCs grew more popular, most common users lost track of the
data they had produced over the last couple of years. This, together with
the introduction of various "small platform" computer programs, made the
field of IR relevant to ordinary users.

However, most of these systems still use techniques that were
developed over thirty years ago and that implement nothing more than a
global surface analysis of the textual (layout) properties. No deep
structure whatsoever is incorporated in the decision whether or not to
retrieve a text.

There is one large dilemma in IR research. On the one hand, the data
collections are so incredibly large that any method other than a global
surface analysis would fail. On the other hand, such a global analysis
could never implement a contextually sensitive method to restrict the
number of possible candidates returned by the retrieval system.
As a result, all methods that use some linguistic knowledge exist only
in laboratories and not in the real world. Conversely, all methods that
are used in the real world are based on technological achievements from
twenty to thirty years ago.

Therefore, the field of information retrieval would be greatly indebted
to a method that could incorporate more context without slowing down. As
computers are only capable of processing numbers within reasonable time
limits, such a method should be based on vectors of numbers rather than
on symbol manipulations. This is exactly where the challenge is: on the
one hand keep up the speed, and on the other hand incorporate more
context. If possible, the data representation of the contextual
information should not be restricted to a single type of media. It should
be possible to incorporate symbolic language as well as sound, pictures
and video concurrently in the retrieval phase, although one does not know
exactly how yet...

Here, the emphasis is more on real-time filtering of large amounts of
dynamic data than on document retrieval from large (static) databases.
By incorporating more contextual information, it should be possible to
implement a model that can process large amounts of unstructured text
without providing the end-user with an overkill of information.

4.0 The Combination

As this study is a very multi-disciplinary one, the risk exists that it
remains restricted to a surface discussion of many different problems
without analyzing one in depth. To avoid this, some central themes,
applications and tools are chosen. The themes in this work are
self-organization, distributed data representations and context. The
applications are NLP and IR; the tools are (variants of) Kohonen feature
maps, a well-known model from neural network research.

Self-organization and context are more related to each other than one may
suspect. First, without the proper natural context, self-organization
will not be possible. Next, self-organization enables one to discover
contextual relations that were not known before.

Distributed data representation may solve many of the unsolved problems
in NLP and IR by introducing a powerful and efficient knowledge
integration and generalization tool. However, distributed data
representation and self-organization trigger new problems that should be
solved in an elegant manner.

Both NLP and IR work on symbolic language. Both have properties in common,
but both focus on different features of language. In NLP, hierarchical
structures and semantic features are important. In IR, the amount of
data sets the limitations of the methods used. However, as computers grow
more powerful and the data sets get larger and larger, both approaches
gain more and more common ground. By using the same models for both
applications, a better understanding of both may be obtained.

Both neural networks and statistics would be able to implement
self-organization, distributed data and context in the same manner. In
this thesis, the emphasis is on Kohonen feature maps rather than on
statistics. However, it may be possible to implement many of the
techniques used with regular sequential mathematical algorithms.

So, the true aim of this work can be formulated as the understanding of
self-organization, distributed data representation, and context in NLP
and IR, through an in-depth analysis of Kohonen feature maps.
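Since Kohonen feature maps carry the weight of the analysis, a minimal self-organizing map may help fix ideas. The sketch below trains a 1-D chain of units on random 2-D points; the sizes, learning rate, and fixed neighbourhood radius are arbitrary simplifications (practical SOMs shrink both over time), and the thesis maps linguistic feature vectors, not toy points:

```python
import random

# Minimal 1-D Kohonen self-organizing map trained on random 2-D points.
# Fixed learning rate and neighbourhood radius for brevity; practical
# SOMs shrink both over time.

def train_som(data, n_units=10, epochs=60, lr=0.3, radius=2, seed=0):
    rng = random.Random(seed)
    w = [[rng.random(), rng.random()] for _ in range(n_units)]  # unit weights
    for _ in range(epochs):
        for x in data:
            # best-matching unit: nearest weight vector to the input
            bmu = min(range(n_units),
                      key=lambda i: (w[i][0] - x[0]) ** 2 + (w[i][1] - x[1]) ** 2)
            # move the winner and its chain neighbours towards the input
            for i in range(max(0, bmu - radius), min(n_units, bmu + radius + 1)):
                h = lr * (1.0 - abs(i - bmu) / (radius + 1.0))
                w[i][0] += h * (x[0] - w[i][0])
                w[i][1] += h * (x[1] - w[i][1])
    return w

rng = random.Random(1)
data = [[rng.random(), rng.random()] for _ in range(200)]
weights = train_som(data)
print(weights[0])   # weight vector of the first unit after training
```

After training, units that are neighbours on the chain tend to hold similar weight vectors: the topology-preserving, self-organizing behaviour the thesis exploits for NLP and IR.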

------------------------------

End of Neuron Digest [Volume 11 Issue 44]
*****************************************
