Neuron Digest   Friday,  9 Jul 1993                Volume 11 : Issue 43 

Today's Topics:
job openings
Help: Research on Neural Robot Systems that Learn to Behave?
Cultured Neural Nets
NN and sismo. : results
Reinforcement Learning Mailing List
Kolmogorov's Theorem, real world applications


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: job openings
From: John Ostrem <cic!john!ostrem@unix.sri.com>
Date: Tue, 22 Jun 93 13:48:32 -0800

Communication Intelligence Corporation (CIC) is a leader in handwriting
recognition and other pen input technologies. We currently market
recognizers for English, Western European, and Asian languages on a variety
of platforms (e.g., DOS, Windows, Macintosh, and so on). These systems enable
the pen to serve as the sole input and control device, combining the functions
of both keyboard and mouse, and adding new capabilities.

Advanced development is directed toward integrated discrete/cursive
recognizers, and future integration with voice recognition, OCR, and
similar technologies.

CIC was founded in 1981 in conjunction with SRI International (formerly
Stanford Research Institute). CIC is headquartered in Redwood Shores,
California, and has an international subsidiary, CIC Japan, Inc., in
Tokyo, Japan.

CIC currently has immediate openings for the following positions:

- -----------------------------------------------------------------------------

POSITION: Software Engineer

QUALIFICATIONS:

1. 3-5 years experience in designing and coding for large software projects
in a UNIX environment
2. Good communication skills; works well with other people
3. Expert C programmer (at least 3-5 years experience)
4. BS or MS in Computer Science or the equivalent
5. Experience in graphics programming and user interfaces a plus
6. The following are additional pluses:

a. Experience in handwriting recognition (on-line or off-line)
b. Linguistic experience (particularly statistical linguistics)
c. Experience planning/executing complex projects
d. Experience in commercial companies
e. Experience in SunOS system administration

JOB DESCRIPTION:

1. Work with and support researchers working on handwriting and speech
recognition
2. Design, implement, and support data collection software and analysis
tools


- -----------------------------------------------------------------------------

POSITION: Pattern Recognition Specialist/project leader

QUALIFICATIONS:

1. Strong background in statistics, pattern recognition, algorithm development
2. Experience in OCR a plus
3. 3-5 years experience in designing and coding for large software projects
in a UNIX environment
4. Good communication skills; works well with other people
5. Expert C programmer (at least 3-5 years experience)
6. Ph.D. or substantial experience in Computer Science, Electrical
Engineering or the equivalent
7. The following are additional pluses:

a. Experience in handwriting recognition (on-line or off-line)
b. Linguistic experience (particularly statistical linguistics)
c. Experience planning/executing complex projects
d. Experience in commercial companies

JOB DESCRIPTION

1. Work with a team of researchers on the next generation of handwriting
recognition systems (both off-line and on-line) for the commercial
market
2. Develop into a project leader/manager

- -----------------------------------------------------------------------------

Please reply to cic!ostrem@unix.sri.com (or cic\!ostrem@unix.sri.com in
command mode), or write or fax to

John S. Ostrem
Communication Intelligence Corporation
275 Shoreline Drive, 6th Floor
Redwood Shores, CA 94065-1413
Fax: (415) 802-7777




------------------------------

Subject: Help: Research on Neural Robot Systems that Learn to Behave?
From: ashley@cs.unsw.oz.au (Ashley Aitken)
Date: Mon, 05 Jul 93 19:29:16 +0900


G'day,

I am interested in finding out about research labs around the world that are
actively working in the following research area,

Real (or, more likely, simulated) robots, or parts thereof, that are capable
of learning sensory-motor maps and complex behaviours, and that are to some
degree based on neuroscience (i.e. biologically plausible neural networks).

Two researchers that come to mind are Gerald Edelman (Neural Darwinism)
and Michael Kuperstein (Neural Model of Adaptive Hand-Eye Coordination).

If you could please e-mail me any details I would be most grateful. If there
is enough interest I will post a summary.

Thanks in advance,
Ashley Aitken.

- --
E-MAIL : ashley@cse.unsw.edu.au AARNet

Schools of EE and CS&E (AI Lab), University of New South Wales,
Box 1, PO KENSINGTON, N.S.W. 2033, AUSTRALIA.

c/o Basser College (Flat 7A), The Kensington Colleges,
Box 24, PO KENSINGTON, N.S.W. 2033, AUSTRALIA.



------------------------------

Subject: Cultured Neural Nets
From: Steve Potter <spotter@darwin.bio.uci.edu>
Date: Fri, 02 Jul 93 12:26:17 -0800

Below I present a bibliography of all of the researchers I know of that
are growing neurons in culture on multielectrode substrates. A belated
thank-you is due to several connectionists who responded to my request
posted a couple of years ago. This is a
surprisingly small list. If you know of someone I have missed, please
send me email (spotter@darwin.bio.uci.edu).

I believe that approaches such as these are likely to close the gap
between the engineering and biological camps of neural network research.
With long-term, multi-site monitoring of real (though simple) networks, we
may learn which aspects of real neural processors must be included in our
simulations if we hope to emulate the accomplishments of Mother Nature.

If you are involved in this research and I have not contacted
you already, please email me; I am looking for a post-doctoral position.

Steve Potter
Psychobiology dept.
UC Irvine
Irvine, CA 92717
spotter@darwin.bio.uci.edu


CULTURED NETS ON MULTI-ELECTRODE SUBSTRATES:
(Recent or representative publications are listed)
Steve Potter 7-2-93
spotter@darwin.bio.uci.edu


Masuo Aizawa

(Layman's article)
Freedman, D.H. (1992). If he only had a brain. Discover : 54-60.


Robert L. Dawes, Martingale Research (Texas)

(Proposal--Never followed up?)
Dawes, R.L. (1987). Biomasscomp: A procedure for mapping the architecture
of a living neural network into a machine. IEEE ICNN proceedings 3: 215-225.


Mitch D. Eggers, MIT

(Any subsequent work with this device?)
Eggers, M.D., Astolfi, D.K., Liu, S., Zeuli, H.E., Doeleman, S.S., McKay,
R., Khuon, T.S., and Ehrlich, D.J. (1990). Electronically wired petri
dish: A microfabricated interface to the biological neuronal network. J.
Vac. Sci. Technol. B 8: 1392-1398.


Peter Fromherz, Ulm University (Germany)

Fromherz, P., Offenhausser, A., Vetter, T., and Weis, J. (1991). A
neuron-silicon junction: a Retzius cell of the leech on an insulated-gate
field-effect transistor. Science 252: 1290-3.


Guenter W. Gross, U. of N. Texas

Gross, G.W. and Kowalski, J. (1991) Experimental and theoretical analysis
of random nerve cell network dynamics, in Neural Networks: Concepts,
applications, and implementations (P. Antognetti and B Milutinovic, Eds.)
Prentice-Hall: NJ. p. 47-110.


Vera Janossy, Central Research Inst. for Physics (Hungary)

Janossy, V., Toth, A., Bodocs, L., Imrik, P., Madarasz, E., and Gyevai, A.
(1990). Multielectrode culture chamber: a device for long-term recording
of bioelectric activities in vitro. Acta Biol Hung 41: 309-20.


Akio Kawana, NTT (Japan)

(News article)
Koppel, T. (1993). Computer firms look to the brain. Science 260: 1075-1077.


Jerome Pine, Caltech

Regehr, W.G., Pine, J., Cohan, C.S., Mischke, M.D., and Tank, D.W. (1989).
Sealing cultured invertebrate neurons to embedded dish electrodes
facilitates long-term stimulation and recording. J Neurosci Methods 30:
91-106.


David W. Tank, AT&T Bell Labs

(Abstract)
Tank, D.W. and Ahmed, Z. (1985). Multiple site monitoring of activity in
cultured neurons. Biophys. J. 47: 476a.


C. D. W. Wilkinson, U. of Glasgow (Scotland)

Connolly, P., Clark, P., Curtis, A.S., Dow, J.A., and Wilkinson, C.D.
(1990). An extracellular microelectrode array for monitoring electrogenic
cells in culture. Biosens Bioelectron 5: 223-34.

Curtis, A.S., Breckenridge, L., Connolly, P., Dow, J.A., Wilkinson, C.D.,
and Wilson, R. (1992). Making real neural nets: design criteria. Med Biol
Eng Comput 30: CE33-6.


ACUTE PREPS (NOT CULTURED):

Bruce C. Wheeler, U. of Illinois

(Hippocampal slice)
Boppart, S.A., Wheeler, B.C., and Wallace, C.S. (1992). A flexible
perforated microelectrode array for extended neural recordings. IEEE Trans
Biomed Eng 39: 37-42.

Novak, J.L. and Wheeler, B.C. (1986). Recording from the Aplysia abdominal
ganglion with a planar microelectrode array. IEEE Trans Biomed Eng 33:
196-202.


Markus Meister, Harvard

Meister, M., Wong, R.O., Baylor, D.A., and Shatz, C.J. (1991). Synchronous
bursts of action potentials in ganglion cells of the developing mammalian
retina. Science 252: 939-43.

Litke, A. and Meister, M. (1991). The retinal readout array. Nuclear
Instruments and Methods in Physics Research A310: 389-394.






------------------------------


Subject: NN and sismo. : results
From: slablee@mines.u-nancy.fr
Date: Tue, 06 Jul 93 10:29:02 +0700


I posted a request for help in February 1993 and promised to
sum up the answers as soon as possible.
Well, I really did get many answers, and I have finally found a little
time to give you the results.

My problem was:
I am trying to use a NN to detect the start of a sampled signal
within noise.
My first attempts with a BackProp network (7520x150x70x1 !!!)
were unsuccessful, because of network "paralysis" (as described
by Rumelhart) and local minima. The network always stopped
learning with a rather high error.

After receiving all the answers, I decided to use
Scott Fahlman's Cascade-Correlation, which worked very well...

Let me first thank all the following people for having replied
to my request:

Rick Alan (IEEE Neural Network Council, USA)
- 70324.1625@compuserve.com -
Frederic Alexandre (CRIN (Computer Research Center), Nancy, France)
- falex@loria.crin.fr -
Bill Armstrong (University of Alberta, Canada)
- arms@cs.ualberta.ca -
Paul Bakker (University of Queensland, St Lucia, Australia)
- bakker@cs.uq.oz.au -
Bart Bartholomew (US National Computer Security Center, Meade, Maryland, USA)
- hcbarth@afterlife.ncsc.mil -
Istvan Berkeley (University of Alberta, Canada)
- istvan@psych.ualberta.ca -
Weiren Chang (University of Texas, USA)
- wrchang@ccwf.cc.utexas.edu -
Terry Chung (Queen's University, Canada)
- terry@solar.me.queensu.ca -
Michel Ianotto (Supelec, Metz, France)
- mi@ese-metz.fr -
Charles W.Lee (Bolton Institute, Bolton, UK)
- (helped me by mail (I mean "snail-mail" !),
I didn't find any e-mail address...) -
Stan Malyshev (University of California, Berkeley, USA)
- stas@sting.berkeley.edu -
William A. Rennie (University at Albany, New York, USA)
- wr5908@csc.albany.edu -
George Rudolph (Brigham Young University, Provo, Utah, USA)
- george@axon.cs.byu.edu -
Soheil Shams (Hughes Research Labs, Malibu, California, USA)
- shams@maxwell.hrl.hac.com -

- ----------------------------------------------------------------

This project was led by INERIS (Institut National de
l'Environnement Industriel et des Risques = National Institute
of Industrial Environment and Risks), and help was offered by
the Earth Science Department (Earth Mechanics Laboratories) of the
Ecole des Mines de Nancy (Nancy, France).

It led, among other things, to a study of the possibility of
using NNs to detect the start of a seismic wave (P-wave)
within a very noisy signal.
This study showed that using NNs can bring better results on this
P-wave detection problem.
So we built a connectionist system called SCOP (Systeme Connexionniste
de Pointage).
This system is going to be improved and tested soon.
It uses the Cascade-Correlation algorithm (from Scott E. Fahlman, Carnegie
Mellon University, Pittsburgh, PA).
I hope further studies will show other uses of this system (in the same
field).

- -------------------------------------------------------------------------

I worked on an HP9000-720 computer under UNIX.
All parts of the system were developed in ANSI C.

I pre-processed the signal with the Gabor method (a "sliding" FFT),
which gives a three-dimensional representation of the spectrum
(a time-frequency diagram); a sketch of this preprocessing follows.
We may later study the results obtained with a wavelet preprocessing.
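
Roughly, the preprocessing works like this (a minimal Python/NumPy
sketch; the window length, hop size, and Hann taper are illustrative
assumptions, not the values we actually used):

import numpy as np

def sliding_fft(signal, window=128, hop=32):
    """Return a (frames x frequencies) magnitude spectrogram."""
    taper = np.hanning(window)
    frames = []
    for start in range(0, len(signal) - window + 1, hop):
        segment = signal[start:start + window] * taper
        # rfft keeps only the non-negative frequencies of a real signal
        frames.append(np.abs(np.fft.rfft(segment)))
    return np.array(frames)

# Example: a noisy trace with an "arrival" halfway through
t = np.arange(4096)
trace = np.random.randn(4096) * 0.5
trace[2048:] += np.sin(0.3 * t[2048:])      # crude stand-in for a P-wave
spectrogram = sliding_fft(trace)            # the time-frequency diagram
print(spectrogram.shape)                    # (frames, frequencies)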

The NN takes a 400-point input (each point is the spectral value at a
given time for a given frequency).
So, after learning, the NN has about 410 units.

There are 5 outputs. Each output unit represents a time window that may
contain the start of the P-waves. In the learning patterns there were four
zeros and one 1; the output unit with the value 1 gives the window which
must contain the start of the P-waves. (A sketch of this encoding follows.)
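
The target encoding can be sketched as follows (Python; the equal-width
window grid is an assumption for illustration):

import numpy as np

N_WINDOWS = 5

def encode_target(onset_time, t_start, t_end):
    """One-hot vector marking which of 5 equal windows holds the onset."""
    width = (t_end - t_start) / N_WINDOWS
    index = min(int((onset_time - t_start) / width), N_WINDOWS - 1)
    target = np.zeros(N_WINDOWS)
    target[index] = 1.0
    return target

print(encode_target(onset_time=3.1, t_start=0.0, t_end=6.5))
# -> [0. 0. 1. 0. 0.]  (the onset falls in the third 1.3-second window)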

The results were 89.8% correct answers for a 1.3-second window
(versus 83% with the "classical" (non-NN) algorithms).
The problem was the lack of data for learning: the network learned
with about 3000 patterns and was tested with about 500 patterns.

- --------------------------------------------------------------------------

If you want more details about this project, please e-mail to :

slablee@mines.u-nancy.fr
(BE CAREFUL : until August 1st only)
or
gueniffe@mines.u-nancy.fr after August 1st
(please explain that the message is for me !)
- --------------------------------------------------------------------------

SOME OF THE ANSWERS
====================

Terry Chung advised me to use Scott Fahlman's Cascade-Correlation,
because the results I had might already be the best performance obtainable
with the BackProp structure: backprop NNs have a fixed structure, so
we can only "fine tune" an initial "guess"...

Bart Bartholomew spent a long time explaining to me his experience with
the pre-processing problems. He noted that the zeroth component is
actually the D.C. bias of the input terms and can normally be discarded
here. Another idea would be to use the differences of the frequencies
rather than the frequencies themselves; I have not yet had enough time to
try this (both ideas are sketched below). Bart also spoke of filters.
Sorry Bart, again I didn't find enough time!
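
For the record, Bart's two ideas amount to something like this (Python
sketch, assuming the (frames x frequencies) spectrogram layout of the
earlier sliding-FFT sketch):

import numpy as np

def drop_dc(spectrogram):
    # Bin 0 of each FFT frame is the mean (D.C. bias) of the windowed
    # segment; it carries no frequency information and can be dropped.
    return spectrogram[:, 1:]

def frequency_differences(spectrogram):
    # Feed the net differences between adjacent frequency bins
    # instead of the raw magnitudes.
    return np.diff(spectrogram, axis=1)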

Weiren Chang spoke about simulated annealing and perturbations of the
weights in order to avoid local minima. The problem is that the larger
the network, the more easily you run into a local minimum. And my first
network was really huge!

Rick Alan said that the key to getting a net to learn is preprocessing
the data properly. I fully agree with this opinion: the REAL problem
in NN learning is preprocessing.

Paul Bakker spoke about the "relativity" of the error (the word is mine!).
What he means is that I said I had a 5% error, but this could be of
no importance if the result is 0.95 for a 1 target and 0.5 for a 0 target.
The problem was that with back-prop I was using a real number as output,
between 0 and 1, and not a binary answer (like bits). I have now
changed my outputs to bits, so Paul's point holds.

William A. Rennie had the same idea as Paul Bakker about binary outputs.
He also spoke of OVERTRAINING. It is perhaps the thing that helped me
the most, because overtraining is a real problem that I hadn't seen.
The problem is that the net can start to memorize the training data
rather than generalizing. William said that my training set would
probably have to contain over half a million cases to prevent
memorization (he was speaking about my 7520x150x70x1 backprop net,
not about the later one). There is a way to avoid this: compute
the error on the testing set at each iteration, and stop the
training when the error on the testing data begins to climb
while the error on the training set keeps falling (that is
the beginning of overtraining); see the sketch below.
I trained my net like this, and... IT WORKED!
The performance was really better.
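
The training loop I ended up with looks roughly like this (Python
sketch; train_one_epoch and error_on are hypothetical stand-ins for
whatever your simulator provides):

def train_with_early_stopping(net, train_set, test_set,
                              max_epochs=1000, patience=10):
    best_error = float("inf")
    bad_epochs = 0
    for epoch in range(max_epochs):
        train_one_epoch(net, train_set)        # hypothetical helper
        test_error = error_on(net, test_set)   # hypothetical helper
        if test_error < best_error:
            best_error = test_error
            bad_epochs = 0       # test error still improving
        else:
            bad_epochs += 1      # test error climbing: overtraining?
            if bad_epochs >= patience:
                break            # stop before the net memorizes
    return net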

Bill Armstrong proposed to use the Atree Adaptive Logic Network
Simulation Package (available via ftp from menaik.cs.ualberta.ca
[129.128.4.241] in pub/atre27.exe).

Soheil Shams thinks (like others) that my network was huge and needed
a really large database of samples. He also says that it is important
to look at the net input sum to each neuron to make sure it is not
saturated (a small check of this kind is sketched below).
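
A simple way to check this (Python sketch; the |sum| > 4 threshold for
sigmoid units is an illustrative assumption):

import numpy as np

def saturated_fraction(inputs, weights, bias, threshold=4.0):
    """Fraction of units whose net input lies in the flat sigmoid tails."""
    net_input = inputs @ weights + bias    # pre-activation sums per unit
    return np.mean(np.abs(net_input) > threshold)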

Istvan Berkeley also said that my net was huge, and that my learning
set might simply be unlearnable...

Stan Malyshev proposed using QuickProp instead of BackProp.
Well, the Cascade-Correlation algorithm I used after these answers
does use QuickProp.
He also advised putting my units into a single layer, for faster learning.
I did it... and he was right!

Michel Ianotto advised me to be careful in the choice of the activation function.

For George Rudolph, the problem is not necessarily BackProp itself, but
the purpose and data of my network, and the way I framed my problem.

And Charles W. Lee sent me a paper of his (to appear in 'Neural Networks')
called 'Learning in neural networks by using tangent planes to constraint
surfaces'. (I couldn't find his e-mail address, but I have his postal
address and phone number.)

- ----------------------------------------------------------------------------

Thanks everybody !

The Cascade-Correlation Learning Architecture was developed by:
Scott E. Fahlman - School of Computer Science - Carnegie Mellon University -
5000 Forbes Avenue - Pittsburgh, PA 15213.
Internet: sef+@cs.cmu.edu Phone: 412 268-2575
C code for Cascade-Correlation and some papers about it can be found
by ftp at ftp.cs.cmu.edu [128.2.206.173] in afs/cs/project/connect/code.
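
For readers who just want the flavour of the algorithm, here is a
compressed toy sketch in Python. It is NOT Fahlman's code: the real
package uses QuickProp and pools of candidate units and maximizes a
correlation measure, where this toy uses plain gradient steps, one
candidate at a time, and a simple covariance objective.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_outputs(X, y, epochs=500, lr=0.5):
    # Train only the output weights; inputs and frozen hidden units
    # feed the output directly, so this is a single-layer problem.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        err = sigmoid(X @ w) - y
        w -= lr * (X.T @ err) / len(X)
    return w

def train_candidate(X, residual, epochs=500, lr=0.5):
    # Grow one candidate unit: push its weights to increase the
    # covariance between its activation and the residual error.
    v = np.random.randn(X.shape[1]) * 0.1
    centered = residual - residual.mean()
    for _ in range(epochs):
        a = np.tanh(X @ v)
        grad = X.T @ (centered * (1.0 - a ** 2)) / len(X)
        v += lr * grad
    return v

def cascor(X, y, n_hidden=3):
    X = np.hstack([X, np.ones((len(X), 1))])         # bias column
    for _ in range(n_hidden):
        w = train_outputs(X, y)
        residual = y - sigmoid(X @ w)
        v = train_candidate(X, residual)             # freeze this unit
        X = np.hstack([X, np.tanh(X @ v)[:, None]])  # cascade its output
    w = train_outputs(X, y)                          # final output pass
    return X, w

# XOR is unsolvable with no hidden units; with a few cascaded units
# the outputs should move toward [0 1 1 0] (results vary by seed).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
Xh, w = cascor(X, y)
print(np.round(sigmoid(Xh @ w), 2))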

- ----------------------------------------------------------------------------

Stephane Lablee

slablee@mines.u-nancy.fr (until 01/08/93)
gueniffe@mines.u-nancy.fr (after 01/08/93)

Ecole des Mines de Nancy
Parc de Saurupt
54042 Nancy Cedex
France

- ----------------------------------------------------------------------------
------------------------------

Subject: Reinforcement Learning Mailing List
From: Matthew McDonald <mafm@cs.uwa.edu.au>
Date: Thu, 08 Jul 93 13:34:17 +0700


This message is to announce an informal mailing list
devoted to reinforcement learning. The list is intended to provide an
informal, unmoderated forum for discussing subjects relevant to
research in reinforcement learning; in particular, discussion of
problems, interesting papers and that sort of thing is welcome.
Announcements and other information relevant to researchers in the
field are also welcome. People are encouraged to post abstracts of
recent papers or reports.

If you'd like to join the list, please send mail to
`reinforce-request@cs.uwa.edu.au'

Cheers,
- --
Matthew McDonald mafm@cs.uwa.oz.au
Nothing is impossible for anyone impervious to reason.


------------------------------

Subject: Kolmogorov's Theorem, real world applications
From: CMRGW@staffordshire.ac.uk
Date: Fri, 09 Jul 93 10:40:00 +0000


In issue number 37, K. Maguire asks about
> any real-world applications of Kolmogorov's Theorem <

In this month's issue of the Journal of Chemical Information and Computer
Sciences I have a paper titled "Predicting Phosphorus NMR Shifts Using
Neural Networks". In essence, this paper demonstrates that nets can be
used to represent the continuous mapping that occurs between the
parameters of molecular structure (a la graph theory) and the NMR shift of
a central resonating atom. Currently there are several non-net methods for
predicting these shifts from the structure, but these are generally slow
and have a high manual-analysis component, often taking weeks for any
one subclass of compounds. On an IRIS wd34, the net can find a decent
generalisation in several hours. The material in the JCICS paper was
submitted last June. To date our results indicate that we are very close
to the current state of the art in terms of prediction performance. Since
new compounds are continuously being discovered in chemistry, the
prediction methods currently used "go off" when new compounds appear in
the various subclasses, and the manual derivation process must then be
repeated. The ability of the net to redo this automatically is a major
step forward. We anticipate that in the next few years our net-based
method will find quite a lot of use. (A toy sketch of this kind of
structure-to-shift mapping follows.)
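
For the curious, the general shape of such a mapping can be imitated in
a few lines (Python/scikit-learn sketch; the descriptors and data below
are invented placeholders, not the parameterization used in the paper):

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 12))  # 12 structure parameters/compound
shifts = descriptors @ rng.normal(size=12) + rng.normal(scale=0.1, size=200)

net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000)
net.fit(descriptors[:150], shifts[:150])   # learn the structure-to-shift map
print("held-out R^2:", net.score(descriptors[150:], shifts[150:]))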

Geoff West




------------------------------

End of Neuron Digest [Volume 11 Issue 43]
*****************************************
