Neuron Digest Friday, 2 Feb 1990 Volume 6 : Issue 9
Today's Topics:
Parallelism, Real vs. Simulated: A Query
Network for coordinate transformations
need help for Prof. Gallant's e-mail address
NNs on Transputers
Re: Data Complexity
Our experience with NWorks by NeuralWare
Emperor's New Mind: BBS Call for Commentators
VLSI hardware for Artificial Neural Nets
ghost in the hippocampus
graph matching
Bibliography (followup)
ridiculous price
Publishers of NN Journals
Request For Info
A "half-baked" Question...
Re: Neuron Digest V6 #1
Signature verification
request to neuron-digest readers
address/bibliography
Re: ND V6 #2
Help!
Job Opening - Please Post
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: Parallelism, Real vs. Simulated: A Query
From: harnad@clarity.Princeton.EDU (Stevan Harnad)
Date: Thu, 05 Oct 89 23:36:54 -0400
I have a simple question: What capabilities of PDP systems do and do not
depend on the net's actually being implemented in parallel, rather than
just being serially simulated? Is it only speed and capacity parameters,
or something more?
Stevan Harnad
Psychology Department
Princeton University
harnad@confidence.princeton.edu
[[ Editor's note: Interesting question. I would suspect certain
time-related aspects of biological parallelism may escape serial
modeling, but I seem to recall general assertions that *any* parallel
process can be modelled sequentially. Of course, the reverse may be a
bit tricky, as I'm discovering in my current literature search for
connectionist models of serial processing. Readers, any comments? -PM ]]
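[[ Editor's postscript: The construction behind "any parallel process
can be modelled sequentially" is, for a synchronous net, just double
buffering: compute every unit's new activation from the *old*
activation vector before overwriting anything, so update order cannot
matter. A minimal sketch (present-day Python/numpy; the sizes and the
tanh transfer function are my assumptions):

  import numpy as np

  rng = np.random.default_rng(0)
  W = rng.normal(size=(8, 8))   # recurrent weight matrix, size assumed
  a = rng.normal(size=8)        # current activation vector

  # One synchronous "parallel" step, executed serially: a_new is a
  # separate buffer, so every unit sees the same old state.
  a_new = np.tanh(W @ a)
  a = a_new

The semantics are preserved exactly; only speed and capacity differ,
which is precisely the question Stevan raises. -PM ]]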
------------------------------
Subject: Network for coordinate transformations
From: RM5I%DLRVM.BITNET@CUNYVM.CUNY.EDU
Date: Mon, 23 Oct 89 15:35:16 -0500
Hi,
I'm trying to implement the cartesian/polar and cartesian/sphere
coordinate transformation in a backpropagation network. I was successful
with cartesian/polarcoordinate transformation by using a two layer
backpropagation network in which the first hidden layer has a tanh
transfer function and the second layer is split into two halfs of which
one has a sin transfer function the the other half has a sigmoid transfer
function.
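[[ Editor's note: For concreteness, here is a sketch of the forward
pass of the architecture Roland describes (present-day Python/numpy;
layer sizes and weight initialization are my assumptions, not details
from the post):

  import numpy as np

  n_in, n_hid, n_out = 2, 10, 2   # (x, y) -> (r, theta); sizes assumed

  rng = np.random.default_rng(0)
  W1 = rng.normal(scale=0.1, size=(n_hid, n_in))
  b1 = np.zeros(n_hid)
  W2 = rng.normal(scale=0.1, size=(n_out, n_hid))
  b2 = np.zeros(n_out)

  def sigmoid(a):
      return 1.0 / (1.0 + np.exp(-a))

  def forward(xy):
      h = np.tanh(W1 @ xy + b1)       # first hidden layer: tanh
      a = W2 @ h + b2
      half = n_out // 2
      out = np.empty(n_out)
      out[:half] = np.sin(a[:half])   # one half of output layer: sin
      out[half:] = sigmoid(a[half:])  # other half: sigmoid
      return out

  print(forward(np.array([0.3, 0.4])))  # untrained, illustrative only

The weights would of course be trained by backpropagation, with the
derivatives of sin and the sigmoid substituted into the usual delta
rule. -PM ]]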
Is anyone else interested in this, and can anyone give me some ideas on
the more complex Cartesian/spherical coordinate transformation?
Regards..Roland Luettgens (rm5i@dlrvm) Bitnet
------------------------------
Subject: need help for Prof. Gallant's e-mail address
From: HCFU%TWNCTU01.BITNET@CUNYVM.CUNY.EDU
Date: Wed, 15 Nov 89 13:57:00 +0800
I am currently reading a paper written by Dr. Stephen I. Gallant in the
Feb. 1988 issue of CACM. He is supposed to be with the College of
Computer Science of Northeastern University. In addition, I am also
interested in reading and collecting papers on neural nets in connection
with expert systems. Would you please give some suggestions, such as
names of proceedings, TRs, special issues or models, real systems, etc.?
Thanks in advance.
Hsin Chia Fu
------------------------------
Subject: NNs on Transputers
From: M Norman <mgn%castle.edinburgh.ac.uk@NSFnet-Relay.AC.UK>
Date: Fri, 01 Dec 89 20:02:06 +0700
Herve Frylander (hfry@alize.imag.fr) requests information about
parallelisation of NNs. The Edinburgh Concurrent Supercomputer project
has some experience in this area, and indeed has a backprop simulator
called Rhwydwaith which runs in parallel on transputers.
A useful reference is
Norman, Radcliffe, Richards, Smieja, Wallace, Collins, Hayward and
Forrest, "Neural Network Applications in the Edinburgh Concurrent
Supercomputer Project", in Neuro Computing: Algorithms, Applications and
Architectures, eds. F. Fogelman Soulie and J. Herault, to appear (1989).
Edinburgh Preprint No. 89/462.
The contact point for Rhwydwaith is Nick Radcliffe
(njr@uk.ac.ed.castle).
Mike Norman
Dept of Physics
University of Edinburgh
------------------------------
Subject: Re: Data Complexity
From: chaos%gidrah.Berkeley.EDU@jade.berkeley.edu (Jim Crutchfield)
Date: Mon, 04 Dec 89 13:28:05 -0800
An element of the set "nonlinear dynamics people" would like to
draw your attention to the following papers that address nonlinear model
reconstruction and complexity in dynamical systems.
Many of the questions brought up in the recent postings
concerning "Data Complexity" receive constructive and quantitative
answers when addressed to modeling nonlinear time-dependent processes.
o "Equations of Motion from a Data Series", JPC and B.
McNamara, Complex Systems 1 (1987) 417.
o "Inferring Statistical Complexity", JPC and K. Young,
Physical Review Letters 63 (1989) 105.
o "Computation at the Onset of Chaos", JPC and K. Young,
in Entropy, Complexity, and the Physics of Information,
W. Zurek, editor, Addison-Wesley, Reading, Massachusetts
(1989) in press.
o "Information and its Metric", JPC, in Nonlinear Structures
in Physical Systems - Pattern Formation, Chaos, and Waves, L.
Lam and H. C. Morris, editors, Springer-Verlag, Berlin (1990)
in press.
o "Inferring the Dynamic, Quantifying Physical Complexity",
JPC, in Quantitative Measures of Dynamical Complexity in Nonlinear Systems,
A. M. Albano, N. B. Abraham, P. E. Rapp, and A. Passamante, eds., Plenum
Press, New York (1989) in press.
Also, I would like to point out that in the context of dynamical
systems theory it is a theorem that the Kolmogorov-Chaitin complexity,
based on the computational model of deterministic Turing machines, of a
typical orbit of a chaotic dynamical system is degenerate with the
system's metric entropy. The latter is based on Shannon information
theory as introduced into dynamics by Kolmogorov and Sinai. Thus, from
the viewpoint of stationary dynamical systems the K-C complexity is not
of much interest. One might as well use Shannon information.
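[[ Editor's note: In symbols, the theorem Jim cites is Brudno's theorem
(the notation here is my gloss, not his): for an ergodic invariant
measure $\mu$ and a $\mu$-typical point $x$, the Kolmogorov-Chaitin
complexity of the length-$n$ symbolic itinerary (with respect to a
generating partition) grows at the metric entropy rate,

  \lim_{n \to \infty} \frac{K(x_1 x_2 \cdots x_n)}{n} = h_\mu(T),

so per-symbol K-C complexity and the Kolmogorov-Sinai entropy coincide
asymptotically. -PM ]]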
In the above work we define a different complexity based on
Turing machines with a random register (Bernoulli-Turing machines). This
leads to a quantitative measure of physical complexity that is
complementary to information-based measures of the degree of randomness.
This is a new invariant for dynamical processes. One result is
that the space of dynamical systems appears to be organized into
informational phases (gas, liquid, and solid) the boundaries of which
support high levels of computation. Physicists call the boundaries phase
transitions. This result appears to hold not only for continuous
dynamical systems, but also for discretized dynamical systems such as
cellular automata.
Note that although the K-C complexity is in general
noncomputable, requiring the minimal Turing machine representation of
given data, in the case of chaotic dynamical systems it can be readily
estimated! (See J. P. Crutchfield and N. H. Packard, "Symbolic Dynamics
of One-Dimensional Maps: Entropies, Finite Precision, and Noise",
International Journal of Theoretical Physics 21, nos. 6/7 (1982) 433.)
So much for advertising. What does this have to do with neural
networks?
Good question.
Jim Crutchfield
Physics Department
University of California
Berkeley, California 94720
(415) 642-1287
------------------------------
Subject: Our experience with NWorks by NeuralWare
From: RM5I%DLRVM.BITNET@CUNYVM.CUNY.EDU
Date: Mon, 04 Dec 89 17:16:33 -0500
We use a neural network simulator called NWorks, made by NeuralWare. It
includes 13 neural net models, such as BP and ART. You can either design
your own net with the ADD LAYER and ADD PROCESSING ELEMENT features or
use one of the standard nets with INSTANET or LOAD NETWORK.
This software runs on several machines, including the IBM PC and, I
think, the Sun. It has a graphics interface: you can view the network at
learning intervals and see graphically how the weights get adjusted.
The software costs a few hundred dollars, but it is worth buying if you
can. From my point of view it is a very useful simulator with a powerful
user interface and environment.
If you have questions feel free to call me.
Roland Luettgens (rm5i@dlrvm) Bitnet
------------------------------
Subject: Emperor's New Mind: BBS Call for Commentators
From: harnad@clarity.Princeton.EDU (Stevan Harnad)
Date: Tue, 05 Dec 89 01:01:46 -0500
[[Editor's Note: For those of you with access to USENET, there was quite
a debate raging over Penrose's book, as well as the recent Scientific
American articles by Searle and the Churchlands. I don't know if any of
the mailing lists are carrying the discussions. I highly recommend the
Scientific American articles, especially since the Churchlands are
succinct about the viability of connectionist models with respect to
their capabilities and potential for "thinking." -PM ]]
Below is the synopsis of a book that will be accorded a multiple book
review (20 - 30 multidisciplinary reviews, followed by the author's
response) in Behavioral and Brain Sciences (BBS), an international,
interdisciplinary journal that provides Open Peer Commentary on important
and controversial current research in the biobehavioral and cognitive
sciences. Reviewers must be current BBS Associates or nominated by a
current BBS Associate. To be considered as a reviewer for this book, to
suggest other appropriate reviewers, or for information about how to
become a BBS Associate, please send email to:
harnad@confidence.princeton.edu or write to:
BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771]
____________________________________________________________________
THE EMPEROR'S NEW MIND:
CONCERNING COMPUTERS, MINDS AND THE LAWS OF PHYSICS
Roger Penrose
Rouse Ball Professor of Mathematics
University of Oxford
The Emperor's New Mind is an attempt to put forward a scientific
alternative to the viewpoint of "Strong AI," according to which mental
activity is merely the acting out of some algorithmic procedure. John
Searle and other thinkers have likewise argued that mere calculation does
not, of itself, evoke conscious mental attributes, such as understanding
or intentionality, but they are still prepared to accept that the action
of the brain, like that of any other physical object, could in principle
be simulated by a computer. In my book I go further than this and suggest
that the outward manifestations of conscious mental activity cannot even
be properly simulated by calculation. To support this view I use various
arguments to show that the results of mathematical insight, in
particular, do not seem to be obtained algorithmically. The main thrust
of this work, however, is to present an overview of the present state of
physical understanding and to show that an important gap exists at the
point where quantum and classical physics meet, and to speculate on how
the conscious brain might be taking advantage of whatever new physics is
needed to fill this gap, in order to achieve its non-algorithmic effects.
------------------------------
Subject: VLSI hardware for Artificial Neural Nets
From: Hsin Chia Fu <HCFU%TWNCTU01.BITNET@CUNYVM.CUNY.EDU>
Date: Sat, 09 Dec 89 09:30:00 +0800
I am interested in VLSI hardware implementations of ANNs. Would someone
out there give me a list of individuals or organizations working on
this? I would very much appreciate it if the list could include both
postal and electronic addresses. In addition, could someone pass along
addresses for Dr. Federico Faggin and/or Synaptics, Inc.?
Thanx
Hsin Chia Fu
[[ Editor's Note: I immediately think of Carver Mead's recent book on
Analog VLSI. He describes the artificial retina and cochlea he's been
working on for years; in biological systems, these structures are often
seen simply as "brain extensions" rather than innervated organs. Thus
they are about as "neural net" as one can get. -PM ]]
------------------------------
Subject: ghost in the hippocampus
From: arti6!tony@relay.EU.net (Tony Bell)
Date: Mon, 11 Dec 89 01:25:59 +0100
Thank you, Barry Kort, for the interesting account of the talks you heard
at MIT. I was particularly intrigued to hear that Terry Sejnowski has
reported that:
"The Hebbs Synapse would seem to be the foundation
for superstitious learning."
So Nancy Reagan can blame it all on those NMDA receptors.
------------------------------
Subject: graph matching
From: roysam@ecse.rpi.edu (Roysam)
Date: Tue, 12 Dec 89 15:49:22 -0500
I have a citation for a paper on this subject that I would like to read,
but I don't know how to obtain it.
"Matching of Attributed and non-attributed graphs by use of the Boltzmann
Machine algorithm," Kuner, Siemens, West Germany.
This paper was on the list of abstracts for a recent conference.
Badri (roysam@ecse.rpi.edu)
------------------------------
Subject: Bibliography (followup)
From: chuck@utkux1.utk.edu (chuck)
Date: Tue, 12 Dec 89 17:26:14 -0500
Friends,
The flurry of requests for references prompts me to mention
several resources of a general nature which may be of interest. I hope
these will be useful.
A large bibliography, assembled and maintained by Eugene Miya, is
available by anonymous ftp from icarus.riacs.edu (128.102.16.8) in the
/pub/bib directory. It is primarily a bibliography of parallel and
supercomputing references, but it does contain quite a bit of other
'connectionist' material. It is in 'refer' format. Eugene Miya continues
to maintain the list, and requests that additions and corrections be
sent to him (instructions are in the files in the same directory, along
with several tools for converting the format to Scribe or TeX).
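[[ Editor's note: For anyone who has not scripted an anonymous ftp
session before, a minimal sketch in present-day Python; do a LIST
first, since the file name below is hypothetical:

  from ftplib import FTP

  ftp = FTP('icarus.riacs.edu')       # or FTP('128.102.16.8')
  ftp.login()                         # anonymous login
  ftp.cwd('/pub/bib')
  ftp.retrlines('LIST')               # see what is actually there
  with open('bib.refer', 'wb') as f:  # 'bib.refer' is a made-up name
      ftp.retrbinary('RETR bib.refer', f.write)
  ftp.quit()

-PM ]]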
I hope, in the near future, to contribute references to this
bibliography which will include announcements and lists appearing on the
internet in various places --including the archives of Neuron Digest.
These entries will be primarily 'connectionist' references.
Our library reports the acquisition of a reference book entitled
"The 1989 Neuro-Computing Bibliography," by Casimir Klimasauskas. I have
not had the opportunity to examine it, however, and cannot offer any
comment at this time.
Season's Greetings!
Chuck Joyce chuck@cs.utk.edu
------------------------------
Subject: ridiculous price
From: "Rolf Pfeifer" <pfeifer@ifi.unizh.ch>
Date: 17 Dec 89 12:07:00 +0100
I ordered the IJCNN-89 proceedings from a book store in Zurich,
Switzerland. They charged me the ridiculous price of
SFr. 520 (approximately US $330).
This is clearly not meant to enhance the process of scientific
communication. I wonder who gets how much of this.
--Rolf Pfeifer
AI Lab, Computer Science Department
University of Zurich
Zurich, Switzerland
------------------------------
Subject: Publishers of NN Journals
From: Song Y. Yan <munnari!ysy@uunet.UU.NET>
Date: Sun, 07 Jan 90 00:48:41 +1100
Does anyone know the publishers of the following two NN journals,
1. Journal of Neural Network Computing
2. Neural Computation
and their addresses?
Thank you very much. I look forward to hearing from you soon.
Fondest Regards,
Song Y. Yan
S. Y. Yan, Dept of Computer Science, University of Melbourne,
Parkville, Victoria 3052, Australia
Phone: 61 3 344-6807
E-mail: ysy@cs.mu.oz.au
------------------------------
Subject: Request For Info
From: "DAVE MCKEE" <mckee@tisss.radc.af.mil>
Date: 09 Jan 90 14:08:00 -0400
For readers of the Neural Network E-Mail List:
I am an engineer at Rome Air Development Center, Griffiss AFB, Rome NY.
I am interested in collecting information on reliability issues in
neural network designs. Below is an edited version of a RADC Weekly
Activity Report issued throughout RADC for information references. If
you are interested in, or have information pertaining to, these issues,
please send e-mail to the address below:
MCKEE@TISSS.RADC.AF.MIL
Thank you very much. David T. McKee
Neural Network Reliability Characterization:
Rome Air Development Center (RADC) at Griffiss AFB, Rome, NY 13441-5700
has initiated a library search for reliability-oriented work in this
area, hoping also to identify potential customers and sponsors. There is
increasing worldwide interest in modeling, software simulation (using
conventional and modified architecture machines), and direct silicon
implementation of neural network emulations. (IEEE MICRO, Dec. 1989).
DARPA is mega-funding research, and Ford, GTE, and even the USAF are
pursuing real world applications to product debug, lifetime estimation
based on complex process control data, and field failure prediction based
on experience data bases. At this point we need to be alert to
reliability implications being made: extensive interconnect may dominate
reliability of some implementations; some network chips with many
processing defects still work fairly well, without repair;
optoelectronics technology may be used; analog CMOS (with its potential
ESD and latch-up weaknesses) may dominate; etc. We also need to be alert
to how they might be useful to us. It seems clear that researchers hope
that networks will eventually offer competitive solutions to complex
control and optimization problems (perhaps related to IC design,
processing, inspection, and screening?). Please help us collect
references.
------------------------------
Subject: A "half-baked" Question...
From: "DAVE MCKEE" <mckee@tisss.radc.af.mil>
Date: 09 Jan 90 15:29:00 -0400
I am inquiring, under the heading of "half-baked ideas," what work, if
any, has been done concerning neural networks that take into account
quantum effects, morphogenic fields, or random processes in which "true"
random noise is used (as opposed to pseudo-random algorithms)?
I have been thinking about how the neurons in the brain are affected by
these processes, and I wonder how much of biological neural behavior is
in fact affected by them.
[[ Editor's note: See also Penrose's book, mentioned above. In this
context, though, I'm not sure about the practical distinction between
"true" and "pseudo" randomness, given the inherent limitations of
computational algorithms. -PM ]]
------------------------------
Subject: Re: Neuron Digest V6 #1
From: Daniel Abramovitch <danny@hpldya>
Date: Wed, 10 Jan 90 07:59:22 -0800
> From: worden@ut-emx.UUCP (worden)
> Organization: The University of Texas at Austin, Austin, Texas
> Date: 21 Oct 89 08:06:41 +0000
>
> Lyapunov functions and stability criteria are one of the mainstays of
> control theory (aka, linear systems theory).
Linear systems theory is a subset of control theory. Much of control
theory deals with nonlinear systems. In particular, Lyapunov functions
are useful because they allow the engineer to establish the stability of
nonlinear systems. (For linear systems, there are quite a few other
methods of establishing the stability of a system.)
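[[ Editor's note: A one-line illustration of the nonlinear case (my
example, not Danny's): for $\dot{x} = -x^3$, whose linearization at the
origin is $\dot{x} = 0$ and hence inconclusive, take the candidate
function $V(x) = \tfrac{1}{2}x^2$. Then

  \dot{V} = x \dot{x} = -x^4 < 0  for  x \neq 0,

so $V$ is a Lyapunov function and the origin is asymptotically stable,
with no linearization needed. -PM ]]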
The best (most intuitive) discussion of Lyapunov functions that I have
come across is in a book by Ogata, "Modern Control Engineering,"
published by Prentice Hall. One chapter (about 20 pages) is devoted to
Lyapunov functions.
Danny Abramovitch
danny%hpldya@hplabs.hp.com
------------------------------
Subject: Signature verification
From: rmurrays%computer-science.strathclyde.ac.uk@NSFnet-Relay.AC.UK
Date: Fri, 12 Jan 90 11:26:35 +0000
Dear World,
I'm doing some research into the verification of handwritten signatures.
The hardware has already been built and this supplies the verification
system with 36 parameters, describing features of the signature. From
this hardware a large database of hundreds of valid signatures has been
developed.
I am hoping to use the back-propagation algorithm for the verification of
the signatures. The main difficulty will be in deciding how to mark the
boundary between the valid decision area and the invalid area, so that
the algorithm has two classes to decide on. We have to create the samples
for the second class, the invalid decision area.
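[[ Editor's note: The two-class construction Roderick describes might
be sketched as below (present-day Python/numpy; the perturbation scale
and the use of jittered vectors as stand-in forgeries are my
assumptions -- other writers' genuine signatures, or real forgeries,
would be better):

  import numpy as np

  rng = np.random.default_rng(0)
  # Placeholder for the 36-parameter vectors from the hardware.
  valid = rng.normal(size=(200, 36))

  # Class 1: genuine signatures of the target writer.
  X_pos = valid
  # Class 0 (assumed): perturbed copies standing in for forgeries.
  X_neg = valid + rng.normal(scale=0.5, size=valid.shape)

  X = np.vstack([X_pos, X_neg])
  y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_neg))])
  # X, y can now feed a back-propagation net (one sigmoid output,
  # thresholded at 0.5) or any traditional two-class discriminant.

-PM ]]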
Has anyone else done work in this area? Are there well-established
traditional multivariate statistical methods that I am ignorant of? Are
there algorithms more suitable than back-propagation?
I also hope to implement a recognition system, which will take the
description of the signature and perform a sort of hashing function on
it, to return the name of the author from the database. Any ideas or
experience with this problem will also be carefully studied.
Any help on either of these topics will be gratefully received and passed
on to anyone else interested.
Roderick Murray-Smith
------------------------------
Subject: request to neuron-digest readers
From: UNNI%RCSMPB@gmr.com
Date: Fri, 12 Jan 90 14:43:00 -0500
I am involved in organizing a workshop on neural networks for graduate
students in India. Does anybody have public domain software for
simulation of various networks on garden variety IBM PCs? This would be
used by a group of about 30 students. Will any of the commercial software
developers be willing to donate a simulation package for this workshop?
unni@gmr.com
------------------------------
Subject: address/bibliography
From: levine@antares.mcs.anl.gov
Date: Mon, 15 Jan 90 12:11:07 -0600
Hi, I'm not sure what e-mail address to use to ask this question, and
the question may be dumb, but...
In the recently posted bibliography, how is one to tell the magazine or
journal name? E.g., I recognize the entry below as Scientific American,
but if I hadn't...? Thanks --dave levine
%A A. K. Dewdney
%D 1985
%T Computer Recreations: Exploring the field of genetic algorithms in a
primordial computer sea full of flibs
%P 21-32
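[[ Editor's note: In 'refer' format the journal name lives in a %J
field, which this entry happens to lack; that is why it is hard to
tell. The common field codes (from memory; see refer(1)) are:

  %A author          %J journal name
  %T title           %B title of book containing item
  %D date            %V volume
  %P pages           %N issue number
  %E editor          %I publisher (issuer)

A complete version of the entry above would carry "%J Scientific
American". -PM ]]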
David Levine levine@mcs.anl.gov
Mathematics and Computer Science {alliant,sequent,rogue}!anlams!levine
9700 Cass Avenue South (708) 972-6735
Argonne National Laboratory (708)-972-5986 (FAX)
Argonne, Illinois 60439
------------------------------
Subject: Re: ND V6 #2
From: gt0228b@prism.gatech.edu (gt0228b FALCO,VINNIE)
Date: Tue, 23 Jan 90 00:06:11 -0500
In reference to the first article in the aforementioned 'Digest: I am
wondering about the importance of specification (8), that all symbol
systems are SEMANTICALLY INTERPRETABLE. Surely there are symbol systems
that have no semantic interpretation, yet are self-consistent. The
constraint of semantic interpretability upon symbol systems seems to
rule out a whole class of systems that may be important, while not
having an implicit, intrinsic semantic interpretation.
- Vinnie Falco gt0228b@prism.gatech.edu
------------------------------
Subject: Help!
From: Sanjeev Sharma <sharma@hpihoed>
Date: Fri, 26 Jan 90 17:26:31 -0800
Hello fellow netters,
I have a pattern-recognition and classification problem for which I am
trying to determine the "best" ANN algorithm and architecture to use.
Being a relative newcomer to the neural network field, I would
appreciate any suggestions from the ANN gurus out there.
The problem I'm trying to solve is described below:
The input data is a set of real-valued 2-dimensional patterns with
unknown temporal and spatial probability distributions. The input to the
neural network is an n-tuple obtained from sampling these patterns at
n-locations. Typically, n = 45.
There are m binary (ON/OFF) outputs, any combination of which can be ON
at one time (each output represents a certain condition detected in the
input). However, generally, at most 3 of the outputs will be ON at one
time. Typically, m = 10.
At time k, the output vector W(k) is a function of the present and past
input vectors V(k), V(k-1), ..., V(k-p). For the sake of simplicity, let
p = 5.
The network will be trained in a supervised-learning environment, with
the expected output known for each input training vector. After the
training period, the network will be used to classify input patterns P.
These patterns P are similar to the training set T to the extent that
both P and T are the products of the same underlying process. Note that
the network will be required to exhibit the properties of rotational and
translational invariance.
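[[ Editor's note: One standard way to hand a feedforward net the
history V(k), ..., V(k-p) is a tapped delay line ("time-delay") input:
concatenate the p+1 most recent input vectors into one long vector,
here of length 45 x 6 = 270. A minimal sketch (present-day
Python/numpy, purely illustrative):

  import numpy as np

  n, p = 45, 5                       # from the problem statement

  def tdnn_input(V_seq, k):
      """Stack V(k), V(k-1), ..., V(k-p) into one (p+1)*n vector."""
      window = [V_seq[k - i] for i in range(p + 1)]
      return np.concatenate(window)  # shape ((p+1)*n,) = (270,)

  V_seq = np.random.default_rng(0).normal(size=(100, n))  # dummy data
  print(tdnn_input(V_seq, k=10).shape)                    # (270,)

The rotational and translational invariance requirement is harder; it
argues for weight sharing over the 2-d sampling grid (a la Fukushima's
Neocognitron) or for invariant preprocessing (e.g., moment invariants)
ahead of the net. -PM ]]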
Any suggestions, pointers, related work, etc. will be highly appreciated.
I hope I've been clear in the description of my problem. If not, please
feel free to drop me a line.
Thanks.
Sanjeev Sharma
sharma%hpda@hplabs.hp.com
------------------------------
Subject: Job Opening - Please Post
From: "Fabio Idrobo" <psy9a3n@BUACCA.BU.EDU>
Date: Sat, 27 Jan 90 10:22:41 -0500
JOB ANNOUNCEMENT
The Boston University Department of Psychology seeks an
Experimental Psychologist with interests in human cognition
(attention or memory), beginning Fall 1990. The Department
seeks candidates with demonstrated excellence in research
and teaching. The appointment will be tenure-track at the
assistant professor level.
Candidates should submit a vita, representative reprints
and a statement outlining research interests and teaching
experience. In addition, they should have three letters of
recommendation forwarded. Applications should arrive by
February 19, but later applications may be considered.
Please write to:
Cognitive Search Committee,
Department of Psychology
Boston University
64 Cummington Street
Boston, MA 02215
U. S. A.
Boston University is an Affirmative Action/Equal Opportunity
Employer. Women and minority candidates are encouraged to
apply.
------------------------------
End of Neuron Digest [Volume 6 Issue 9]
***************************************