Neuron Digest   Thursday, 19 Dec 1991                Volume 8 : Issue 14 

Today's Topics:
many-to-many association in multi-layer back propagation??
Industrial (real-life) uses of ANN for visual inspection??
Presentation
ANN and time series forecasting
Response to request for info
Neuron simulation - who's doing it?
Re: Neuron simulation - who's doing it?
Code for neural network with time-delays
Kohonen SOM X11 Simulator
Re: The XOR problem
Post-Doc at Rockefeller
Re: Neuron Digest V8 #13
Job Announcement
re: Connection Machine Speedup
RE: Neuron Digest V8 #7
Search for a paper
Re: Smalltalk Neural Network Program
Re: "Invariant Pattern Recognition with ANNs":


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: many-to-many association in multi-layer back propagation??
From: mr794339%cs.nthu.edu.tw@CUNYVM.CUNY.EDU (mr794339)
Date: Sat, 26 Oct 91 21:58:17 -0800

Hello, all,

I am a master's student at National Tsing Hua University (NTHU) in Taiwan.
I have a problem with multi-layer back propagation.

This is my problem...

If a pattern contains don't-care bits (marked * below), how can I do
the following?

Input Pattern                 Output Pattern

A: 1 * * * * 0
B: * * 0 * 1 *
--------------------------------------------
C: 1 * 0 * 1 0

If the neural network learns Rule A and Rule B, can it then recall
Rule C correctly?

I think this is a many-to-many association problem.

All suggestions and help are welcome, and thanks in advance.

Yu-Min, Marfada , Chang
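
[[ Editor's Note: One common way to handle don't-care bits in
back-propagation is to mask the error so that don't-care output
positions contribute no gradient; don't-care input bits are often
encoded as an intermediate value such as 0.5. A minimal sketch in
modern Python/NumPy notation (the array values below are invented for
illustration, not taken from the poster's patterns): ]]

  import numpy as np

  def logistic(net):
      return 1.0 / (1.0 + np.exp(-net))

  # Target with don't-care positions; care[j] = False means "don't care".
  target = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])
  care   = np.array([True, False, True, False, True, True])

  output = logistic(np.random.randn(6))   # stand-in for a network's output

  # Masked delta for the output layer: don't-care bits produce zero
  # error and therefore zero weight updates during back-propagation.
  delta = (output - target) * care * output * (1.0 - output)
  error = 0.5 * np.sum(((output - target) * care) ** 2)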


------------------------------

Subject: Industrial (real-life) uses of ANN for visual inspection??
From: eba@computing-maths.cardiff.ac.uk (Eduardo Bayro)
Date: Mon, 28 Oct 91 08:42:31 +0000

Dear Netters!

Recently many neuro-chips have been designed and built, but few authors
have presented neural net applications for industrial inspection. Two
authors presented applications of the back-propagation multi-layer
perceptron, and another the use of the Reduced Coulomb Energy neural net
for classification purposes. All of them used accelerator boards (ANZA
or Balboa) to run fast simulated nets. Can anybody give me their point
of view on ways of using neural computing for industrial visual
inspection, considering cost and current trends? Are there any neural
computers working on the factory floor for classification and rejection
of faulty objects? Is this a cost-effective machine vision system? Any
references would be highly appreciated. Results will be posted back to
the list. Thanking you in advance, Eduardo.

------------------------------

Subject: Presentation
From: neuron%esemetz.ese-metz.fr@esemetz.uucp (Neuron Mailing list)
Date: Mon, 28 Oct 91 13:58:50 +0000

We are a team of engineers working and teaching in the Computer Science
Department at Supelec (a French graduate school of electrical
engineering). Our field of research is neural networks. We deal with the
main issues in pattern recognition based on neural networks, especially
handwriting recognition. We also work on signal waveform recognition
with neural networks. In the near future we will try to implement a
parallelized version of the backpropagation algorithm on a
Transputer-based machine.

------------------------------

Subject: ANN and time series forecasting
From: MSANDRI%IVRUNIV.BITNET@ICINECA.CINECA.IT
Date: Tue, 29 Oct 91 13:21:33 -0100

Here are some references on ANNs and (chaotic) time series forecasting:

*Casdagli M. (1989) "Nonlinear Prediction of Chaotic Time Series",
Physica D 35, pp. 335-356.

Day S.P. and Davenport M.R. (1991) "Continuous-Time Temporal
Back-Propagation with Adaptable Time Delays", preprint submitted to
IEEE Transactions on Neural Networks. Email: shawnd@ee.ubc.ca (Day),
davenpo@physics.ubc.ca (Davenport)

*Farmer J.D. and Sidorowich J.J. (1989) "Exploiting Chaos to Predict
the Future and Reduce Noise", in Lee Y.C. (ed.), Evolution, Learning
and Cognition, World Scientific, Singapore.

Frye R.C., Rietman E.A. and Wong C.C. (1991) "Back-Propagation Learning
and Nonidealities in Analog Neural Network Hardware", IEEE Transactions
on Neural Networks, Vol. 2, No. 1.

Gallant A.R. and White H. (1991) "On Learning the Derivatives of an
Unknown Mapping with Multilayer Feedforward Networks", preprint.
Email: arg@ccvr1.cc.ncsu.edu (Gallant)

Jones R.D., Lee Y.C. et al. (1990) "Function Approximation and Time
Series Prediction with Neural Networks", Los Alamos LA-UR-90-21.
Email: rdj@lanl.gov (Jones)

Jones R.D., Lee Y.C. et al. (1990) "Nonlinear Adaptive Networks: A
Little Theory, a Few Applications", Los Alamos LA-UR-91-273.
Email: rdj@lanl.gov (Jones)

Lapedes A. and Farber R. (1989) "How Neural Nets Work", in Lee Y.C.
(ed.), Evolution, Learning and Cognition, World Scientific, Singapore.
Email: rmf@t13.lanl.gov (Farber), asl@t13.lanl.gov (Lapedes)

Mead W.C., Jones R.D. et al. (1991) "Prediction of Chaotic Time Series
Using CNLS-Net -- Example: The Mackey-Glass Equation", Los Alamos
LA-UR-91-720. Email: rdj@lanl.gov (Jones)

*Meyer T.P. and Packard N.H. (1991) "Local Forecasting of High
Dimensional Chaotic Dynamics", Center for Complex Systems Research,
Beckman Institute, Univ. of Illinois at Urbana, TR CCSR-91-1.
Email: n@complex.ccsr.uiuc.edu (Packard)

*Packard N.H. (1989) "A Genetic Learning Algorithm for the Analysis of
Complex Data", Center for Complex Systems Research, Beckman Institute,
Univ. of Illinois at Urbana, TR CCSR-89-10.
Email: n@complex.ccsr.uiuc.edu (Packard)

* = useful readings, but not direct applications of ANNs to time
series forecasting

I hope this list is helpful.

=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=
++++ ////// ****** ! ! MARCO SANDRI
++ / ** ! ! UNIVERSITY OF VERONA
++ ///// **** ! ! ISTITUTO DI SCIENZE ECONOMICHE
++ / ** ! ! VIA DELL'ARTIGLIERE 19
++++ ///// ****** !! 37129 VERONA -- ITALY
MSANDRI@IVRUNIV.BITNET FAX NO. 45-8098292
=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=:=


------------------------------

Subject: Response to request for info
From: avlab::mrgate::"a1::raethpg"%avlab.dnet@wrdc.af.mil
Date: Thu, 31 Oct 91 07:05:37 -0500

[[ Editor's Note: This was from a request for "beginner's software." I
certainly assume Raeth has no connection to NeuralWare except as a
satisfied customer. Do any of you readers have favorite commercial
software packages? -PM ]]


From: NAME: Major Peter G. Raeth
TEL: AV-785 513-255-7854 <RAETHPG AT A1 AT AVLAB>

The attached email was my attempt to respond to a request for information
contained in Neuron Digest. Our system could not make the proper
connections. I'm sending it on to you since you sometimes put reader
responses in the digest itself.

Best.

Pete.

Subject: Exploring Neural Networks
To: "rey@esrf.uucp"@labddn@mrgate@avlab,
NAME: Mr Vincent Rey Bakaikoa <PAPER MAIL AT A1 AT AVLAB>

The best tool I am aware of for exploring neural networks and their
applications to specific problems is NeuralWorks Explorer, produced by
NeuralWare, Inc. If you find the package worthwhile, it can be upgraded
to NeuralWorks Professional II. Contact:

Mr Kevin Coleman or Mr Casey Klimasauskas
NeuralWare, Inc
Penn Center West, Bldg IV, Suite 227
Pittsburgh, PA 15276 USA
(Voice: 412-787-8222, Fax: 412-787-8220)

Good luck in your continuing research.


------------------------------

Subject: Neuron simulation - who's doing it?
From: amtw@cl.cam.ac.uk (Adrian Wrigley)
Organization: U of Cambridge Computer Lab, UK
Date: 02 Nov 91 00:22:42 +0000

Sent to comp.ai.neural-nets but without any response:

> What is the current state-of-the-art in simulation of biologically
> plausible neural networks?
>
> By biologically plausible, I mean ones that propagate impulses
> that affect membrane potentials and synaptic weights by some
> nonlinear, perhaps time-variant law. (i.e. NOT back/prop etc.!).
>
> In particular, how many synapses/neurons are being simulated, at what
> rate, by the best workstations and by supercomputers, with or without
> an 'accelerator'?
>
> I am interested in systems that could model part of the brain at the
> same rate as a live brain, i.e. a few hundred pulses/sec at most.
>
> Adrian Wrigley

This was probably a dumb question to ask anyway on c.ai.nn, so I'll
try another (no boredom flames please - I'm new to this game and don't
know the jargon!). . .

Who out there has a simulation of groups of neurons operating with pulses
and local adaptation algorithms? I have been thinking of setting up a
system using 1000+ neurons operating on speech spectra in real time, and
was wondering how successful simple (modified) Hebbian learning would be
at recognising phonemes/words. (Which is the most appropriate newsgroup
for this?)

Adrian Wrigley
Cambridge University Computer Laboratory, England.

------------------------------

Subject: Re: Neuron simulation - who's doing it?
From: ratnam@leipzig.ks.uiuc.edu (Rama Ratnam)
Organization: University of Illinois at Urbana
Date: 02 Nov 91 22:58:53 +0000


I seriously doubt whether a simulator for biological networks exists
that combines learning with the modelling. I may be ignorant, but I
think this would be fairly difficult to develop.

I do have some info for you (if it is of any help):

1. There is a simulator from Caltech which I modified to suit our
research requirements. The simulator was originally developed at Caltech
by a group of people working on the cat visual cortex. It runs on the
Connection Machine.

I routinely simulate 24,000 neurons and approximately 5 million
synapses. This is pretty large and ties up the CM for quite a while
(about 1.5 hours per 8k CM sequencer). (I believe this is currently the
largest-scale simulation going around.)
simulation going around)

2. Another simulator is BIOSIM, developed by the SANS group at the Royal
Institute of Technology in Sweden. This is a really detailed model based
on compartments and the Hodgkin-Huxley equations. It also runs on the
CM, but a serial version called SWIM (also developed by the group) is
useful for small groups of neurons. Both simulators model channels.

3. GENESIS, also developed at Caltech, is another simulator that runs on
serial machines. It has an X interface. I have never tried it out, but a
couple of my group members use it.

The Caltech simulator (the CM one) works on populations and pathways,
does not simulate dendritic compartments, doesn't care too much about
channels, and uses a leaky integrate-and-fire model.
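
[[ Editor's Note: for readers unfamiliar with the term, a leaky
integrate-and-fire neuron integrates its input current, leaks back
toward a resting potential, and emits a spike (then resets) when a
threshold is crossed. A minimal single-neuron sketch in modern Python
notation; all constants are illustrative and are not taken from the
Caltech simulator: ]]

  import numpy as np

  dt, tau = 0.1, 10.0                          # time step, membrane tau (ms)
  v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0    # potentials (arbitrary units)

  v, spikes = v_rest, []
  for step in range(1000):                     # simulate 100 ms
      i_syn = 0.15 + 0.05 * np.random.randn()  # noisy input current
      v += dt * (-(v - v_rest) / tau + i_syn)  # leak toward rest + integrate
      if v >= v_thresh:                        # threshold crossing: spike
          spikes.append(step * dt)
          v = v_reset                          # reset after firing
  print(len(spikes), "spikes in 100 ms")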

Both (the Caltech simulator and BIOSIM) have their plus points.
Simulating biological neural networks is a major headache. The design
issues are messy, so if you are interested, let's correspond by email. I
plan to write my own simulator soon (for the Connection Machine). My
suggestion: if you are trying to simulate a few hundred neurons with no
more than a few thousand synapses, you will already be stretching a
serial machine pretty hard. Even 1000 neurons is quite a bit. And if you
want to add an adaptation module - phew!

-ratnam

==============================================================
Rama Ratnam, Theoretical Biophysics, Beckman Institute
Dept of Biophysics, University of Illinois--Urbana-Champaign
ratnam@lisboa.ks.uiuc.edu
==============================================================

------------------------------

Subject: Code for neural network with time-delays
From: kunnikri@neuro.cs.gmr.com (K.P.Unnikrishnan CS/50)
Date: Sat, 02 Nov 91 19:02:01 -0500


The simulation code (mostly in FORTRAN) for the neural network with
time-delays described in the following two papers is now available via
anonymous ftp.

K.P. Unnikrishnan
unni@neuro.cs.gmr.com

References:

Unnikrishnan, K.P., Hopfield, J.J., and Tank, D.W. 1991. Connected-digit
speaker-dependent speech recognition using a neural network with
time-delayed connections. IEEE Transactions on Signal Processing. vol.
39, pp. 698-713.

Unnikrishnan, K.P., Hopfield, J.J., and Tank, D.W. 1991.
Speaker-independent digit recognition using a neural network with
time-delayed connections. Neural Computation. vol. 4, pp. 108-119 (in
press).


Directions for anonymous ftp:

unix> ftp neuro.cs.gmr.com (or ftp 129.124.8.44)
Name: anonymous
Password: <your id>
ftp> cd /pub
ftp> get README

This README file gives details of the directories and programs.


------------------------------

Subject: Kohonen SOM X11 Simulator
From: Michael Arras <arras@icase.edu>
Date: Tue, 05 Nov 91 16:40:28 -0500


I have written a simple Kohonen network simulator that runs under
X11R4-5. It comes with very little online documentation, and no
information on the particular models used. This simulator was written
for someone who knows what he/she is doing; it is not a good
introduction to SOMs. It has been tested on a Sun3 running SunOS 4.0, a
Sun4 running SunOS 4.1, an Iris running IRIX 4.0, and a DECStation 5000
running Ultrix 4.2. This simulator is a first attempt at simulating a
Kohonen network and writing an X Window application, so the code is far
from optimal (in fact it is terrible). I can make no guarantees
regarding the correctness of this code. If you ask, I will mail you the
C code. If you have any questions, please refer to `Self-Organization
and Associative Memory', 2nd Ed., T. Kohonen.
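
[[ Editor's Note: the core of any Kohonen SOM simulator is a simple
two-step update: find the best-matching unit, then pull it and its grid
neighbours toward the input. A minimal sketch in modern Python/NumPy
notation (map size, learning rate and neighbourhood width are
illustrative; this is not Arras's code): ]]

  import numpy as np

  def som_step(weights, x, lr=0.1, sigma=1.5):
      """One Kohonen update. weights: (rows, cols, dim) codebook; x: (dim,)."""
      rows, cols, _ = weights.shape
      # 1. Best-matching unit: smallest Euclidean distance to x.
      d = np.linalg.norm(weights - x, axis=2)
      bi, bj = np.unravel_index(np.argmin(d), d.shape)
      # 2. Gaussian neighbourhood around the winner; lr and sigma
      #    are normally decreased over the course of training.
      ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
      h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
      weights += lr * h[:, :, None] * (x - weights)
      return weights

  weights = np.random.rand(8, 8, 3)        # 8x8 map of 3-D codebook vectors
  weights = som_step(weights, np.random.rand(3))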

Mike Arras
arras@icase.edu
ICASE/NASA Langley


------------------------------

Subject: Re: The XOR problem
From: young%bunyip.cc.uq.oz.au%munnari.oz.au@tcgould.tn.cornell.edu (Steven Young)
Organization: Prentice Centre, University of Queensland
Date: 08 Nov 91 06:50:08 +0000

kattan@JANE.UH.EDU (Mike Kattan) writes:
>In article <1991Nov4.150407.6573@menudo.uh.edu>, kattan@JANE.UH.EDU
>>(Mike Kattan) writes:
>>One of the limitations of the perceptron was that it could not solve the
>>XOR problem. A backpropagating neural network can. Can someone show me
>>what the weights could be, assuming a 2-2-1 network and the following data:
>> X1 X2 Y
>> -- -- -
>> 0 0 0
>> 0 1 1
>> 1 0 1
>> 1 1 0
>Although I've gotten some helpful responses, I need to do a better job of
>explaining my question. I'm assuming a logistic transfer function
>[1/(1+e^-sum)] at the hidden and output layer neurons, and a bias neuron
>in the input and hidden layers if needed. Based on that, I still don't
>have a solution.
>Mike Kattan

You may find the following reference helpful: E. K. Blum, "Approximations
of Boolean Functions by Sigmoidal Networks: Part I: XOR and Other
Two-Variable Functions", Neural Computation, vol. 1, pp. 532--540, 1989.

This reference considers solutions in weight space for a 2-2-1 network,
and actually proves the existence of a manifold of exact solutions and
also the existence of a manifold of local minima of mean squared error.
The results are given with `symmetry' constraints on the weights.

This paper seems to be an answer to your problem as posed.
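
[[ Editor's Note: for the impatient, here is one concrete,
hand-constructed 2-2-1 solution with logistic units and bias weights
(not taken from Blum's paper; any sufficiently large scaling of these
weights also works). Hidden unit 1 acts like OR, hidden unit 2 like
AND, and the output computes OR-and-not-AND, i.e. XOR: ]]

  import numpy as np

  def logistic(net):
      return 1.0 / (1.0 + np.exp(-net))

  W_hid = np.array([[20.0, 20.0],    # weights into h1 (OR-like)
                    [20.0, 20.0]])   # weights into h2 (AND-like)
  b_hid = np.array([-10.0, -30.0])   # h1 fires on >= 1 input, h2 on both
  w_out = np.array([20.0, -40.0])    # h1 excites the output, h2 vetoes it
  b_out = -10.0

  for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
      h = logistic(W_hid @ np.array([x1, x2]) + b_hid)
      y = logistic(w_out @ h + b_out)
      print(x1, x2, round(float(y), 3))   # prints ~0, ~1, ~1, ~0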

There are two other papers that you may find helpful. P.J.G. Lisboa and
S.J. Perantonis, "Complete solution of the local minima in the XOR
problem", Network, vol. 2, pp. 119--124, 1991.

Lisboa and Perantonis consider the 2-2-1 network and also the single
hidden node network with the input units and the hidden node connected to
the output node, but with a different cost or error function attributed
to Wallace (Wallace, D.J., "Neural network models: a physicist's primer",
in "Computational Physics", ed. R.D. Kenway and G.S. Pawley, Edinburgh:
Scottish Universities Summer Schools in Physics, pp. 168--211), which in
TeX is

$$ E = -\sum_i \sum_k \ln \left( (O_k^i)^{t_k^i} (1-O_k^i)^{1-t_k^i} \right) $$

where $i$ ranges over the patterns and $k$ ranges over the output nodes.
They state that this error function doesn't exhibit stationary points at
the extreme values of the sigmoid (i.e. 0 and 1). (Anybody got any clues
as to how this might compare with the use of relative entropy as an error
measure presented by Solla, Levin and Fleisher, "Accelerated Learning in
Layered Neural Networks", Complex Systems, 2, pp. 625--640, 1988?)
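
[[ Editor's Note: expanding the logarithm puts Wallace's error function
in the more familiar cross-entropy form

$$ E = -\sum_i \sum_k \left[ t_k^i \ln O_k^i
       + (1 - t_k^i) \ln (1 - O_k^i) \right] $$

and for binary (0/1) targets the relative-entropy measure of Solla,
Levin and Fleisher reduces to exactly this expression, since the
entropy of a 0/1 target is zero. ]]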

and

Mirta B. Gordon and Pierre Peretto, "The statistical distribution of
Boolean gates in two-inputs, one-output multilayered neural networks",
Journal of Physics A: Mathematical and General, 23, 3061--3072, 1990.

Gordon and Peretto consider the probability of a Perceptron network (no
hidden layer) and of multilayer networks implementing the various
two-input, one-output Boolean functions, given uniform (over a bounded
range) and Gaussian weight distributions. One interesting observation
they make is that the XOR (and EQUIVALENCE) function has a lower
probability of being implemented than the linearly separable functions.

So I have a general request to the net: has there been any other work on
analysis of the error surface in weight space, and on the capabilities
of networks in the Boolean function domain, beyond just knowing that
Perceptrons can't implement nonlinearly separable functions, in the same
vein as Gordon and Peretto's work?

Thanks,

Steven Young.

Steven Young, (young@s2.elec.uq.oz.au)
Dept of Electrical Engineering,
University of Queensland. 4072, Australia.
"My CPU is a neural network processor---a learning machine" - Arnie, T2.


------------------------------

Subject: Post-Doc at Rockefeller
From: Zhaoping Li <zl@guinness.ias.edu>
Date: Fri, 22 Nov 91 16:56:17 -0500


POSTDOCTORAL POSITIONS IN COMPUTATIONAL NEUROSCIENCE
AT
ROCKEFELLER UNIVERSITY, NEW YORK

We anticipate the opening of one or two positions in
computational neuroscience at Rockefeller University. The positions are
at the postdoctoral level for one year, starting in September 1992, with
the possibility of renewal for a second year. Interested applicants
should send a CV including a statement of their current research
interests, and arrange for three letters of recommendation to be sent as
soon as possible directly to Prof. Joseph Atick, Institute for Advanced
Study, Princeton, NJ 08540. (Note that applications are to be sent to the
Princeton address.)

------------------------------

Subject: Re: Neuron Digest V8 #13
From: Jonathan Marshall <marshall@cs.unc.edu>
Date: Mon, 09 Dec 91 13:59:47 -0500

> Subject: Invariant pattern recognition with ANNs
> From: evol@infko.uni-koblenz.de
> Date: 05 Dec 91 14:34:33 +0000
>
> I am interested in translation, rotation and scaling invariant pattern
> recognition with ANNs. I already know about Fukushima's Neocognitron and
> Hubel & Wiesel's work on the visual cortex. I also read some
> papers about complex logarithmic mapping and the Fast Fourier Transform
> (H. Haken, H. J. Reitboeck & J. Altmann, H. Wechsler & G. L. Zimmerman).

You should contact Dr. Al Nigrin (nigrin@chopin.uucp); he's done some
good work in this area.

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
= =
= Jonathan A. Marshall marshall@cs.unc.edu =
= Department of Computer Science =
= CB 3175, Sitterson Hall =
= University of North Carolina Office 919-962-1887 =
= Chapel Hill, NC 27599-3175, U.S.A. Fax 919-962-1799 =
= =
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

------------------------------

Subject: Job Announcement
From: WARREN%BROWNCOG.BITNET@BITNET.CC.CMU.EDU
Date: Mon, 09 Dec 91 16:41:00 -0500


JOB OPENING IN HIGHER-LEVEL COGNITION
DEPT. OF COGNITIVE AND LINGUISTIC SCIENCES
BROWN UNIVERSITY

The Department of Cognitive and Linguistic Sciences at Brown University
invites applications for a tenure-track, Assistant Professor position in
higher-level cognition, beginning July 1, 1992. Preference will be given
to the areas of concepts, memory, attention, problem solving, knowledge
representation, and the relation between cognition and language.
Applicants should have a strong experimental research program and broad
teaching ability in the field of cognition and cognitive science. Women
and minorities are especially encouraged to apply. Send C.V., three
letters of reference, copies of publications, and statement of research
interests by January 1, 1992, to:

Dr. William H. Warren, Chair
Cognitive Search Committee
Dept. of Cognitive and Linguistic Sciences, Box 1978
Brown University
Providence, RI 02912

Brown University is an Equal Opportunity/Affirmative Action Employer.

NOTE: This is a separate position from the one recently advertised
in the Psychology Department at Brown. The deadline has been
extended until Jan. 1, 1992.


------------------------------

Subject: re: Connection Machine Speedup
From: David Kanecki <kanecki@cs.uwp.edu>
Date: Mon, 09 Dec 91 22:07:56 -0600

The CM speedup is due to the following mathematical relation:

If a matrix M, composed of X input and X output units, is subdivided
by four, the time required to process M multiplied by a vector of
length X is:

(X/4)*(X/4) = X^2/16

Thus, the processing time is 16 times faster.

When the matrix M is subdivided by N, the time required is:

(X/N)*(X/N) = X^2/N^2

Thus, the processing time is N^2 times faster than on a sequential
processing machine.

This is one aspect where multiprocessing can aid in neural network
research.
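
[[ Editor's Note: a worked example of the arithmetic, in modern
Python/NumPy notation. Note the N^2 figure counts only the
per-processor multiply-add work; summing the partial results across
blocks costs extra and is ignored here, as in the argument above. ]]

  import numpy as np

  X, N = 1024, 4
  M, v = np.random.rand(X, X), np.random.rand(X)
  s = X // N                                    # each block is s x s

  y = np.zeros(X)
  for i in range(N):                            # on a parallel machine all
      for j in range(N):                        # N*N blocks run concurrently
          y[i*s:(i+1)*s] += M[i*s:(i+1)*s, j*s:(j+1)*s] @ v[j*s:(j+1)*s]

  assert np.allclose(y, M @ v)
  # Sequential work: X^2 = 1,048,576 multiply-adds.
  # Per-block work:  (X/N)^2 = 65,536, the claimed N^2 = 16x speedup.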


David H. Kanecki, Bio. Sci., A.C.S.
kanecki@cs.uwp.edu

------------------------------

Subject: RE: Neuron Digest V8 #7
From: livingston_d%frgen.dnet@smithkline.com (David Livingstone Med. Chem. ext 2088)
Date: Tue, 10 Dec 91 08:51:44 -0500

There was a request in the above digest from Paul Bakker concerning
forthcoming conferences. I sent him details of a workshop in France
next year, and wondered if it might be of interest to other readers.
The details are:

"First International Workshop on Neural Networks applied to
Chemistry and Environmental Sciences"


To be held in Lyon, France on the 8th - 10th of June 1992

The closing date has passed, but that may just be for submissions.

The contact address is: Dr. J. Devillers, CTIS, 21 rue de la Banniere,
69003 Lyon, France. FAX No. (33) 78 62 99 12.

------------------------------

Subject: Search for a paper
From: mcsun!uknet!cf-cm!news@uunet.uu.net (Daniel K R McCormack)
Organization: University of Wales College of Cardiff, Cardiff, Wales, UK.
Date: 11 Dec 91 11:43:50 +0000

Hello,

I am aware that it is a well-known fact that a one-layer
backpropagation network is guaranteed to converge given enough time,
assuming the problem is solvable by a one-layer network. Has this been
proven mathematically (I assume it has), and if so, where can I find
it?

Many thanks in advance.

Daniel.

Quote: I am not responsible for my actions, I only perform them.


------------------------------

Subject: Re: Smalltalk Neural Network Program
From: Joerg Rade <JRADE1%DGOGWDG1.bitnet@CUNYVM.CUNY.EDU>
Date: Thu, 12 Dec 91 16:13:29 -0500


On Thu, 21 Nov 91 08:51:00 -0600, Bob Snyder, SNYDERR@randd.abbott.com
asked:
>
>Does anyone have or heard of a neural network program written in the
>Smalltalk language? Thanks in advance.
>
>Bob Snyder <snyderr@randb.abbott.com>

I've translated a backpropagation program from Turbo Pascal to
Smalltalk/V 286. The translation isn't a masterpiece of object-oriented
design (i.e. it doesn't take much advantage of message passing and
Smalltalk data structures). It's about 5 times slower than compiled
Turbo Pascal, but it works, is free, and should be portable to other
dialects. I can mail it to you if you're interested. There is also a NN
in Smalltalk Goodies #2 from Digitalk; it retails for something in the
range of $49-99, I forget exactly.

Joerg Rade
Institut fuer Wirtschafts-
und Sozialpsychologie _ o
Universitaet Goettingen _ \ \ _ How can I know what I say,
DW-3400 Goettingen (_)/ (_) before I hear what I think ?


------------------------------

Subject: Re: "Invariant Pattern Recognition with ANNs":
From: apr1@homxc.att.com (Anthony P Russo)
Organization: AT&T Bell Laboratories
Date: Mon, 16 Dec 91 11:48:24 -0500

Concerning "Invariant Pattern Recognition with ANNs":

I've done much applied work in this area, and can give you the
following reference for a successful ANN application to size- and
translation-invariant object recognition:


Title: "Constrained Neural Networks for Recognition of Passive Sonar
Signals Using Shape"


Author: Anthony P. Russo, AT&T Bell Laboratories

Source: Proceedings of the IEEE Conference on Neural Networks for
Ocean Engineering (August 15-17, 1991 Washington, DC)

Date of Publication: August 15, 1991

Pages: 69-76


I hope it helps.


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~ Tony Russo " Surrender to the void." ~
~ AT&T Bell Laboratories ~
~ apr@cbnewsl.ATT.COM or apr1@homxc.att.com ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~



------------------------------

End of Neuron Digest [Volume 8 Issue 14]
****************************************
