NEURON Digest	Wed Oct 28 09:35:59 CST 1987  Volume 2 / Issue 25 
Today's Topics:

Time-averaging of neural events/firings
neuro sources
Using Hopfield Nets to solve the Traveling Salesman Problem
references
code for comprehensive backprop simulator
Re: Learning with Delta Rule
Re: Carver Mead's book
The BAM example in Byte
Source of Hopfield TSP solution
1988 summer school announcement
Neural Net References
Announcing Neural Network Review
Speech Recognition Using Connectionist Networks (UNISYS)
Tech. report abstract

----------------------------------------------------------------------

Date: Fri, 16 Oct 87 08:36:52 EST
From: "Peter H. Schmidt" <peter@mit-nc.mit.edu>
Subject: Time-averaging of neural events/firings

Is anyone out there investigating neural nets that don't use McCulloch-Pitts
1-bit-quantized neurons? In other words, is anyone investigating the
possibility that all the information is *not* conveyed merely by the (high or
low) frequency of neural firings?

Thanks,

Peter H. Schmidt
peter%nc@mc.lcs.mit.edu (ARPANET)
peter%nc%mc.lcs.mit.edu@cs.net.relay (CSNET)


------------------------------

Date: Tue 20 Oct 87 21:07:39-EDT
From: "John C. Akbari" <AKBARI@cs.columbia.edu>
Subject: neuro sources

Anyone have the source code for either of the following?

Kosko, Bart. Constructing an Associative Memory. _Byte_, Sept. 1987.

Jones, W.P. & Hoskins, J. Back-Propagation. _Byte_, Oct. 1987.

Any help would be appreciated.

John C. Akbari

PaperNet 380 Riverside Drive, No. 7D
New York, New York 10025 USA
SoundNet 212.662.2476 (EST)
ARPANET & Internet akbari@CS.COLUMBIA.EDU
BITnet akbari%CS.COLUMBIA.EDU@WISCVM.WISC.EDU
UUCP columbia!cs.columbia.edu!akbari

------------------------------

Date: Tue, 27 Oct 87 14:05 EST
From: Fausett@radc-multics.arpa
Subject: Using Hopfield Nets to solve the Traveling Salesman Problem

Can anyone give me a reference to the use of Hopfield nets in solving
the traveling salesman problem (TSP)?
Can this approach be used to solve TSPs that have local and/or global
constraints?

------------------------------

Date: Thu, 15 Oct 87 14:52:09 CDT
From: simpson@nosc.mil
Subject: references

>Date: 15 Oct 87 10:30:00 EST
>From: "NRL::MAXWELL" <maxwell%nrl.decnet@nrl3.arpa>
>Subject: REFERENCES
>
>DEAR PATRICK,
>
> YOUR MESSAGE WAS UNCLEAR. WAS THE PRICE ON THE REFERENCE LIST
>$3.00 plus OR including $3.00 postage?
> MAXWELL@NRL.ARPA
------


Dr. Maxwell,

I apologize for the lack of clarity in the message. The $3.00 covers the cost
of postage and handling; that is the ONLY charge. I am simply covering the
cost of copies, envelopes, and postage with the $3.00.

Patrick K. Simpson
9605 Scranton Road
Suite 500
San Diego, CA 92121


------------------------------

Date: 19 Oct 87 19:15:51 GMT
From: Andrew Hudson <PT.CS.CMU.EDU!andrew.cmu.edu!ah4h+@cs.rochester.edu>
Subject: code for comprehensive backprop simulator


This is in response to a query for connectionist simulator code.
Within a month, one of the most comprehensive back-propagation
simulators will be available to the general public.
Jay McClelland and David Rumelhart's third PDP publication, Exploring
Parallel Distributed Processing: A Handbook of Models, Programs, and
Exercises, will be available from MIT Press. C source code for the
complete backprop simulator, as well as others, is supplied on two
MS-DOS format 5 1/4" floppy disks. The simulator, called BP, comes with
the necessary files to run encoder, xor, and other problems. It supports
multiple-layer networks, constrained weights, and sender-to-receiver
options. It also has nicely laid out and nicely parsed menu options for
every parameter you could ever imagine.
The handbook and source code can be ordered from MIT Press at the
address below. The cost for both is less than $30. Why spend thousands
more for second best?

The MIT Press
55 Hayward Street
Cambridge, MA 02142

Another version of the BP simulator which is not yet generally available
to the public has been modified to take full advantage of the vector
architecture of the Convex mini-supercomputer. For certain applications
this gives speed increases of 30 times that of a VAX 11/780. A study is
underway to see how well BP will perform on a CRAY XMP-48.
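
For readers who have not seen what such a simulator computes, here is a
minimal sketch in C of the kind of calculation BP performs on the xor
problem. It is purely illustrative and is not taken from the handbook's
code; the layer sizes, learning rate, initialization, and epoch count
are assumptions made for the example.

/* A minimal illustration of backprop on the xor problem.  NOT the
   handbook code; sizes, learning rate, and epochs are assumptions. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define NIN    2
#define NHID   4
#define ETA    0.5
#define EPOCHS 20000

static double squash(double x) { return 1.0 / (1.0 + exp(-x)); }
static double rnd(void)        { return (double)rand() / RAND_MAX - 0.5; }

int main(void)
{
    static double in[4][NIN] = {{0,0},{0,1},{1,0},{1,1}};
    static double tgt[4]     = {0, 1, 1, 0};
    double wh[NHID][NIN + 1];     /* hidden weights; last slot is the bias */
    double wo[NHID + 1];          /* output weights; last slot is the bias */
    double hid[NHID], dhid[NHID], net, out, dout;
    int i, j, p, e;

    srand(1);
    for (i = 0; i < NHID; i++)
        for (j = 0; j <= NIN; j++) wh[i][j] = rnd();
    for (i = 0; i <= NHID; i++) wo[i] = rnd();

    for (e = 0; e < EPOCHS; e++)
        for (p = 0; p < 4; p++) {
            /* forward pass */
            for (i = 0; i < NHID; i++) {
                net = wh[i][NIN];
                for (j = 0; j < NIN; j++) net += wh[i][j] * in[p][j];
                hid[i] = squash(net);
            }
            net = wo[NHID];
            for (i = 0; i < NHID; i++) net += wo[i] * hid[i];
            out = squash(net);

            /* backward pass: error terms for the output and hidden units */
            dout = (tgt[p] - out) * out * (1.0 - out);
            for (i = 0; i < NHID; i++)
                dhid[i] = dout * wo[i] * hid[i] * (1.0 - hid[i]);

            /* gradient-descent weight changes */
            for (i = 0; i < NHID; i++) wo[i] += ETA * dout * hid[i];
            wo[NHID] += ETA * dout;
            for (i = 0; i < NHID; i++) {
                for (j = 0; j < NIN; j++) wh[i][j] += ETA * dhid[i] * in[p][j];
                wh[i][NIN] += ETA * dhid[i];
            }
        }

    for (p = 0; p < 4; p++) {     /* report what the trained net computes */
        for (i = 0; i < NHID; i++) {
            net = wh[i][NIN];
            for (j = 0; j < NIN; j++) net += wh[i][j] * in[p][j];
            hid[i] = squash(net);
        }
        net = wo[NHID];
        for (i = 0; i < NHID; i++) net += wo[i] * hid[i];
        printf("%g xor %g -> %.3f\n", in[p][0], in[p][1], squash(net));
    }
    return 0;
}

The full simulator adds to this the menu system, constrained weights,
and the other options described above.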

- Andrew Hudson

ah4h@andrew.cmu.edu.arpa
Department of Psychology
Carnegie Mellon
412-268-3139

Bias disclaimer: I work for Jay, and I've seen the code.

------------------------------

Date: 19 Oct 87 09:37:41 PDT (Mon)
From: creon@ORVILLE.ARPA
Subject: Re: Learning with Delta Rule


The problem is, what if you do not know the invariances beforehand,
and thus cannot "wire them in"? What I would like is for the net to
discover the invariances. We tried and tried this, using both first-
order and second-order (correlated) three-layer nets. We had the
computer randomly choose a pattern, shift it, and present it to the
net. Then it would correct the net (if necessary) in the standard
backprop way. We could not get the net to learn the invariances by
itself, and the net did not have the capacity to learn all possible
shifts of each pattern explicitly, which is not what we wanted anyway.
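
For concreteness, here is a small sketch in C of the presentation-and-
correction loop described above: pick a stored pattern at random, apply
a random circular shift, present it, and correct with the delta rule.
The patterns and sizes are invented, and a single-layer unit is used
only to keep the sketch short; the experiments above used three-layer
backprop nets.

/* Sketch of the shifted-pattern training loop.  Patterns, sizes, and
   the single-layer classifier are assumptions for illustration. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N     8                /* input length                    */
#define NPAT  2                /* stored patterns, one per class  */
#define ETA   0.1
#define STEPS 20000

static double squash(double x) { return 1.0 / (1.0 + exp(-x)); }

int main(void)
{
    static int pat[NPAT][N] = {
        {1,1,0,0,0,0,0,0},     /* class 0: two adjacent bits   */
        {1,0,1,0,0,0,0,0}      /* class 1: two separated bits  */
    };
    double w[N + 1];           /* weights plus bias for one output unit */
    double x[N], net, out, delta;
    int i, t, p, s;

    srand(1);
    for (i = 0; i <= N; i++) w[i] = 0.0;

    for (t = 0; t < STEPS; t++) {
        p = rand() % NPAT;                         /* random pattern        */
        s = rand() % N;                            /* random circular shift */
        for (i = 0; i < N; i++) x[i] = pat[p][(i + s) % N];

        net = w[N];                                /* present the input     */
        for (i = 0; i < N; i++) net += w[i] * x[i];
        out = squash(net);

        delta = ((double)p - out) * out * (1.0 - out);   /* delta-rule correction */
        for (i = 0; i < N; i++) w[i] += ETA * delta * x[i];
        w[N] += ETA * delta;
    }

    /* test over all shifts of both patterns; a net of this kind does not
       end up responding to class identity independently of the shift */
    for (p = 0; p < NPAT; p++)
        for (s = 0; s < N; s++) {
            net = w[N];
            for (i = 0; i < N; i++) net += w[i] * pat[p][(i + s) % N];
            printf("class %d, shift %d -> %.2f\n", p, s, squash(net));
        }
    return 0;
}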

Are there any results on non-trivial spontaneous generalization in
back-propagation nets? They are good at recalling the previous input
that has the minimum Hamming distance from the current input, but can't
they do more than this?


------------------------------

Date: Thu, 22 Oct 87 09:48:01 EST
From: Manoel F Tenorio <tenorio@ee.ecn.purdue.edu>
Subject: Re: Carver Mead's book

>> Carver Mead's book on analog VLSI


Manoj,
I have talked to the publisher (Addison-Wesley), and it won't be out till the
Spring. If you get your hands on the notes, I would appreciate receiving a
copy.

--ft.

School of Electrical Engineering
Purdue University
W. Lafayette, IN 47907

------------------------------

Date: Fri, 23 Oct 87 13:17 N
From: SCHOMAKE%HNYKUN53.BITNET@wiscvm.wisc.edu
Subject: The BAM example in Byte

Apart from the alignment dependency, which is a general characteristic of
most simple neural net simulation implementations, there may be more problems.
I tried to build the bidirectional associative memory (BAM) program from the
recipe (Listing 1), and I have something that works, but...: I find the
program's capacities rather disappointing, compared to, e.g., the (admittedly
more complex) Siloam. Recognizing more than two pairs is often difficult.
The network converges all right, but it may be to a meaningless state instead
of reverberating the "best matching pair". Now there are two possibilities.

1) I missed some important point while coding (I don't think so, since
the simple examples with two stored 6-bit pairs work all right).
or:
2) The author was very lucky in selecting three pairs of character
bitmaps that resulted in good recognition in his example (;-).

Also, I noticed someone interpreting Figure 2 in the article as:

>...in the Byte article they demonstrate correct recall of an image
>corrupted by randomly flipping a number of bytes, simulating "noise"...
>Greg Corson, ...seismo!iuvax!ndmath!milo

They do not. Figure 2 shows the recognition process in a kind of slow motion,
by randomly choosing weights that are allowed to be updated during the
iteration (asynchronous recall). This randomness is not in the data; it is in
the recall process itself. This tells us that the BAM does not have to
be a synchronous technical machine but _could_ be a model for some kind of
biological neural memory. In fact, when an association is strong, it would come
up in only one to three synchronous iterations. The input pair <S>-<E> of the
example is _not_ corrupted!

To tell the truth, I am a little bit skeptical about BAMs. From systems
theory and signal processing theory I know that you can reconstruct
a single input signal from a cross-correlation function (here: the matrix
of synaptic weights) that is based on several input and output sweeps if, and
only if, the spectral contributions of the sweeps are significantly different.
Adding the (I/O)-(O/I) iteration will enhance the capacity of such a system,
but it will always suffer from an inability to deal with many-to-one
mappings or many-to-(many-similars) mappings.
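
For concreteness, here is a minimal sketch in C of a BAM along the lines
of the recipe in the article: bipolar pairs, a weight matrix formed as
the sum of outer products, and synchronous back-and-forth recall. The
stored pairs and layer sizes are toy assumptions of mine, not the
character bitmaps from the article; with only two nearly orthogonal
pairs it recalls correctly, which is consistent with the capacity
problems described above showing up only for larger or more similar
pattern sets.

/* Minimal BAM sketch: correlation-sum weights, bidirectional recall.
   Stored pairs and sizes are toy assumptions for illustration. */
#include <stdio.h>

#define NA     6              /* units in the A layer */
#define NB     4              /* units in the B layer */
#define NPAIRS 2

/* threshold; on a tie the unit keeps its previous state */
static int thresh(int sum, int old) { return sum > 0 ? 1 : (sum < 0 ? -1 : old); }

int main(void)
{
    static int A[NPAIRS][NA] = {
        { 1,  1,  1, -1, -1, -1},
        { 1, -1,  1, -1,  1, -1}
    };
    static int B[NPAIRS][NB] = {
        { 1,  1, -1, -1},
        { 1, -1,  1, -1}
    };
    int w[NA][NB], a[NA], b[NB];
    int i, j, p, iter, sum;

    /* weight matrix: sum over pairs of the outer product A[p] x B[p] */
    for (i = 0; i < NA; i++)
        for (j = 0; j < NB; j++) {
            w[i][j] = 0;
            for (p = 0; p < NPAIRS; p++) w[i][j] += A[p][i] * B[p][j];
        }

    /* recall: start from A[0] with its first bit flipped and let the
       activity bounce between the two layers until it settles */
    for (i = 0; i < NA; i++) a[i] = A[0][i];
    a[0] = -a[0];
    for (j = 0; j < NB; j++) b[j] = 1;

    for (iter = 0; iter < 10; iter++) {
        for (j = 0; j < NB; j++) {                    /* A -> B */
            sum = 0;
            for (i = 0; i < NA; i++) sum += a[i] * w[i][j];
            b[j] = thresh(sum, b[j]);
        }
        for (i = 0; i < NA; i++) {                    /* B -> A */
            sum = 0;
            for (j = 0; j < NB; j++) sum += b[j] * w[i][j];
            a[i] = thresh(sum, a[i]);
        }
    }

    printf("recalled A:");
    for (i = 0; i < NA; i++) printf(" %2d", a[i]);
    printf("\nrecalled B:");
    for (j = 0; j < NB; j++) printf(" %2d", b[j]);
    printf("\n");
    return 0;
}
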
Lambert Schomaker
SCHOMAKE@HNYKUN53.BITNET
Nijmegen, The Netherlands.
Reference:

Kosko, B. (1987). Constructing an Associative Memory. Byte: the Small Systems
Journal, Vol. 12 (10), pp. 137-144.

------------------------------

Date: Wed, 7 Oct 87 13:36:16 EST
From: "Peter H. Schmidt" <peter@mit-nc.mit.edu>
Subject: Source of Hopfield TSP solution

The article "Computing With Neural Circuits: A Model", Science, Vol. 233,
8-8-86, pp. 625-632, by Hopfield and Tank, describes the application of a
Hopfield net using graded-response neurons to the TSP, and to a simple
analog-binary computation. It's very readable. N.B. The circuit described
doesn't "solve" the TSP in the sense of finding *the* optimum solution;
rather, it converges quickly to one of the 10^7 best solutions out of a
possible ~10^30 tours in, say, a 30-city problem. The advantage over
conventional computational techniques is that the Hopfield net needs only
900 neurons, while a comparable-time solution would require a "microcomputer
having 10^4 times as many devices." (ibid., p. 632) This comparison seems a
little beside the point to me.
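
As a rough illustration of the approach, the following C sketch runs
graded-response dynamics of the Hopfield-Tank type on a tiny problem:
one unit per (city, position) pair, penalty terms that push the
activations toward a permutation matrix, a distance term for tour
length, and Euler integration of du/dt = -u/tau - dE/dV with
V = (1 + tanh(u/u0)) / 2. The city coordinates, penalty weights a, b,
c, d, and the step size are placeholders, not the values used in the
paper, and without careful tuning the net is not guaranteed to settle
on a valid tour.

/* Sketch of Hopfield-Tank style dynamics for a tiny TSP.  Parameters
   and coordinates are placeholders, not taken from the paper. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N     5               /* cities; the network has N*N units */
#define DT    1e-5
#define STEPS 3000

int main(void)
{
    double cx[N] = {0.1, 0.9, 0.5, 0.2, 0.8};
    double cy[N] = {0.1, 0.2, 0.9, 0.7, 0.6};
    double dist[N][N], u[N][N], v[N][N];
    double a = 500.0, b = 500.0, c = 200.0, d = 500.0;
    double u0 = 0.02, tau = 1.0;
    double allsum, rowsum, colsum, datasum;
    int x, y, i, j, t;

    for (x = 0; x < N; x++)
        for (y = 0; y < N; y++)
            dist[x][y] = hypot(cx[x] - cx[y], cy[x] - cy[y]);

    srand(1);                               /* small random initial state */
    for (x = 0; x < N; x++)
        for (i = 0; i < N; i++) {
            u[x][i] = 0.001 * ((double)rand() / RAND_MAX - 0.5);
            v[x][i] = 0.5 * (1.0 + tanh(u[x][i] / u0));
        }

    for (t = 0; t < STEPS; t++) {
        allsum = 0.0;                       /* total activation, for the c term */
        for (x = 0; x < N; x++)
            for (i = 0; i < N; i++) allsum += v[x][i];

        for (x = 0; x < N; x++)
            for (i = 0; i < N; i++) {
                rowsum = colsum = datasum = 0.0;
                for (j = 0; j < N; j++)
                    if (j != i) rowsum += v[x][j];   /* same city, other positions  */
                for (y = 0; y < N; y++)
                    if (y != x) colsum += v[y][i];   /* same position, other cities */
                for (y = 0; y < N; y++)              /* neighbouring tour positions */
                    datasum += dist[x][y] *
                        (v[y][(i + 1) % N] + v[y][(i + N - 1) % N]);

                /* one Euler step of du/dt = -u/tau - dE/dV */
                u[x][i] += DT * (-u[x][i] / tau
                                 - a * rowsum
                                 - b * colsum
                                 - c * (allsum - N)
                                 - d * datasum);
            }

        for (x = 0; x < N; x++)             /* new activations from the new u */
            for (i = 0; i < N; i++)
                v[x][i] = 0.5 * (1.0 + tanh(u[x][i] / u0));
    }

    /* print the activation matrix: rows are cities, columns tour positions */
    for (x = 0; x < N; x++) {
        for (i = 0; i < N; i++) printf(" %5.2f", v[x][i]);
        printf("\n");
    }
    return 0;
}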

Peter H. Schmidt
||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Work: MIT 20A-002, Cambridge, MA 02139, (617) 253-3264
Home: 3 Colonial Village, #3, Arlington, MA 02174, (617) 646-2215
ARPANET: peter%nc@mc.lcs.mit.edu
||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||


------------------------------

Date: Wed 14 Oct 87 03:29:58-EDT
From: Dave.Touretzky@C.CS.CMU.EDU
Subject: 1988 summer school announcement


THE 1988 CONNECTIONIST MODELS SUMMER SCHOOL

ORGANIZER: David Touretzky

ADVISORY COMMITTEE: Geoffrey Hinton, Terrence Sejnowski

SPONSORS: The Sloan Foundation; AAAI; others to be announced.

DATES: June 17-26, 1988

PLACE: Carnegie Mellon University, Pittsburgh, Pennsylvania

PROGRAM: The summer school program is designed to introduce young neural
network researchers to the latest developments in the field. There will be
sessions on learning, theoretical analysis, connectionist symbol processing,
speech recognition, language understanding, brain structure, and neuromorphic
computer architectures. Students will have the opportunity to informally
present their own research and to interact closely with some of the leaders of
the field.

PARTIAL LIST OF FACULTY:

Yaser Abu-Mostafa (Caltech)
Dana Ballard (Rochester)
Andrew Barto (U. Mass.)
Gail Carpenter (Boston U.)
Scott Fahlman (Carnegie Mellon)
Geoffrey Hinton (Toronto)
George Lakoff (Berkeley)
Yann Le Cun (Toronto)
James McClelland (Carnegie Mellon)
David Rumelhart (Stanford)
Terrence Sejnowski (Johns Hopkins)
Paul Smolensky (UC Boulder)
David Tank (AT&T Bell Labs)
David Touretzky (Carnegie Mellon)
Alex Waibel (ATR International)
others to be announced

EXPENSES: Students are responsible for their meals and travel expenses,
although some travel assistance may be available. Free dormitory space will be
provided. There is no tuition charge.

WHO SHOULD APPLY: The summer school's goal is to assist young researchers who
have chosen to work in the area of neural computation. Participation is
limited to graduate students (masters or doctoral level) who are actively
involved in some aspect of neural network research. Persons who have already
completed the Ph.D. are not eligible. Applicants who are not full-time
students will still be considered, provided that they are enrolled in a
doctoral degree program. A total of 50 students will be accepted.

HOW TO APPLY: By March 1, 1988, send your curriculum vitae and a copy of one
relevant paper, technical report, or research proposal to: Dr. David Touretzky,
Computer Science Department, Carnegie Mellon University, Pittsburgh, PA, 15213.
Applicants will be notified of acceptance by April 15, 1988.
-------

------------------------------


Date: 12-OCT-1987 12:16
From: simpsonp@nosc.mil
Subject: Neural Net References



NEURAL NETWORK REFERENCES AVAILABLE

750+ neural net references in a 50-page reference list. A comprehensive ANS
reference list from 1938 to the present; it includes every major ANS
researcher and their earliest publications.

Send $3.00 to cover postage and handling to:
Patrick K. Simpson
Verac, Inc.
9605 Scranton Road
Suite 500
San Diego, CA 92121


------------------------------

Date: Mon, 26 Oct 87 18:00:32 EST
From: Craig Will <csed-1!will@hc.dspo.gov>
Subject: Announcing Neural Network Review


Announcing a new publication
NEURAL NETWORK REVIEW

The critical review journal
for the neural network community


Neural Network Review is intended to provide a forum
for critical analysis and commentary on topics involving
neural network research, applications, and the emerging
industry. A major focus of the Review will be publishing
critical reviews of the neural network literature, including
books, individual papers, and, in New York Review of Books
style, groups of related papers.

The Review will also publish general news about events
in the neural network community, including conferences,
funding trends, and announcements of new books, papers,
courses, and other media, and new hardware and software
products.

The charter issue, dated October, 1987, has just been
published, and contains a review and analysis of 11 articles
on neural networks published in the popular press, a report
on the San Diego conference, a report on new funding
initiatives, and a variety of other information, a total of 24
pages in length. The next issue, due in January, 1988, will
begin detailed reviews of the technical literature. Neural
Network Review is aimed at a national audience, and will be
published quarterly. It is published by the Washington
Neural Network Society, a nonprofit organization based in
the Washington, D.C. area.

Subscriptions to Neural Network Review are $10.00 for
4 issues, or $2.50 for a single copy. International rates
are slightly higher. Rates for full-time students are $5.00
for 4 issues. (Checks should be payable to the Washington
Neural Network Society). Subscription orders and inquiries
for information should be sent to:

Neural Network Review
P. O. Box 427
Dunn Loring, VA 22027

For more information on Neural Network Review, send your
physical, U. S. Postal mail address in a message to
will@hc.dspo.gov (Craig Will).


------------------------------

Date: Tue 27 Oct 87 20:33:41-PST
From: finin@bigburd.PRC.Unisys.COM (Tim Finin)
Subject: Speech Recognition Using Connectionist Networks (UNISYS)


AI Seminar
UNISYS Knowledge Systems
Paoli Research Center
Paoli PA


SPEECH RECOGNITION USING CONNECTIONIST NETWORKS

Raymond Watrous
Siemens Corporate Research
and
University of Pennsylvania


The thesis of this research is that connectionist networks are
adequate models for the problem of acoustic phonetic speech
recognition by computer. Adequacy is defined as suitably high
recognition performance on a representative set of speech recognition
problems. Six acoustic phonetic problems are selected and discussed
in relation to a physiological theory of phonetics. It is argued that
the selected tasks are sufficiently representative and difficult to
constitute a reasonable test of adequacy.

A connectionist network is a fine-grained parallel distributed
processing configuration, in which simple processing elements are
interconnected by simple links. A connectionist network model for
speech recognition, called the TEMPORAL FLOW MODEL, has been defined.
The model incorporates link propagation delay and internal feedback to
express temporal relationships.

It has been shown that temporal flow models can be 'trained' to
perform some speech recognition tasks successfully. A method of
'learning' using techniques of numerical nonlinear optimization has
been demonstrated for the minimal pair "no/go", and for voiced stop
consonant discrimination in the context of various vowels. Methods for
extending these results to new problems are discussed.
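
The abstract does not give the model's equations, but the two mechanisms
it names can be illustrated with a toy unit whose inputs arrive over
links with different propagation delays and which also feeds its own
previous output back to itself. Everything in the sketch below (weights,
delays, bias, input signal) is invented for illustration and is not
taken from the Temporal Flow Model itself.

/* Toy unit with delayed input links and a self-recurrent (feedback)
   connection.  All values are invented for illustration. */
#include <stdio.h>
#include <math.h>

#define T      20          /* number of time steps             */
#define NLINKS 3           /* delayed input links to the unit  */

static double squash(double x) { return 1.0 / (1.0 + exp(-x)); }

int main(void)
{
    double in[T];                        /* a single input stream          */
    double w[NLINKS]     = {0.8, 0.5, 0.3};
    int    delay[NLINKS] = {1, 2, 3};    /* link propagation delays        */
    double wself = 0.6;                  /* internal feedback weight       */
    double y[T];
    double net;
    int t, k;

    for (t = 0; t < T; t++)              /* a short input burst            */
        in[t] = (t >= 5 && t < 9) ? 1.0 : 0.0;

    for (t = 0; t < T; t++) {
        net = 0.0;
        for (k = 0; k < NLINKS; k++)     /* inputs arrive after their delay */
            if (t - delay[k] >= 0)
                net += w[k] * in[t - delay[k]];
        if (t > 0)                       /* internal feedback from t-1      */
            net += wself * y[t - 1];
        y[t] = squash(net - 1.0);        /* -1.0 is an assumed fixed bias   */
        printf("t=%2d  in=%.0f  y=%.3f\n", t, in[t], y[t]);
    }
    return 0;
}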

10:00am Wednesday, November 4, 1987
Cafeteria Conference Room
Unisys Paoli Research Center
Route 252 and Central Ave.
Paoli PA 19311

-- non-UNISYS visitors who are interested in attending should --
-- send email to finin@prc.unisys.com or call 215-648-7446 --

------------------------------

Date: 15 Oct 87 18:22:16 GMT
From: A Buggy AI Program <speedy!honavar@speedy.wisc.edu>
Subject: Tech. report abstract


Computer Sciences Technical Report #717, September 1987.
--------------------------------------------------------


RECOGNITION CONES: A NEURONAL ARCHITECTURE FOR
PERCEPTION AND LEARNING


Vasant Honavar, Leonard Uhr

Computer Sciences Department
University of Wisconsin-Madison
Madison, WI 53706. U.S.A.


ABSTRACT

There is currently a great deal of interest and activity in developing
connectionist, neuronal, brain-like models, in both Artificial
Intelligence and Cognitive Science. This paper specifies the main
features of such systems, argues for the need for, and usefulness of,
structuring networks of neuron-like units into successively larger
brain-like modules, and examines "recognition cone" models of
perception from this perspective, as examples of such structures.
Issues addressed include architecture, information flow, and the
parallel-distributed nature of processing and control in recognition
cones, and their use in perception and learning.


-----
Vasant Honavar
honavar@speedy.wisc.edu


------------------------------

End of NEURON-Digest
********************
