Neuron Digest Monday, 4 Mar 1991 Volume 7 : Issue 12
Today's Topics:
Neurosimulators
Commercial Simulators
Computists International
A book on Self-Organization
request for information on real time applications of NNs
Re: Looking for Phoneme Data
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: Neurosimulators
From: MURRE%rulfsw.LeidenUniv.nl@BITNET.CC.CMU.EDU
Date: Tue, 12 Feb 91 16:38:00 +0700
Dear connectionist researchers,
We are compiling a list of neurosimulators for inclusion in a review
paper. The table below presents the 45 simulators that we have been able
to track down so far. We have not been able to find out all the details.
We would, therefore, appreciate it if users or developers could fill
us in on the gaps in the list (or point out any mistakes). Also,
if anyone knows of other simulators that should be included, please,
drop us a note. We would especially welcome any (pointers to) papers
describing neurosimulators. This would enable us to refine and extend
the list of features.
Thanks!
Jaap Murre Steven Kleynenberg
E-mail: MURRE@HLERUL55.Bitnet
Surface mail: Jacob M.J. Murre
Unit of Experimental and Theoretical Psychology
Leiden University
P.O. Box 9555
2300 RB Leiden
The Netherlands
To save precious bytes, we have configured the table below in a 132
column format. It may be easier to send the file to a line printer than
to read it at the terminal. (On a VAX use: set term /width=132.)
TABLE: NEUROSIMULATORS
Name                Manufacturer           Language Models  Hardware         Reference     Price ($)
=---------------------------------------------------------------------------------------------------
ADAPTICS            Adaptic                                                  [AZEMA]
ANNE                Oregon Grad. Cent.     HLL              Intel hypercube  [AZEMA]
ANSE                TRW                                     TRW              [AZEMA]
ANSIM               SAIC                            several IBM              [COHEN]         495.00
ANSKIT              SAIC                            several many             [BARGA][BYTE]
ANSPEC              SAIC                   HLL      many    IBM,MAC,SUN,VAX                  995.00
AWARENESS           Neural Systems                          IBM              [BYTE]          275.00
AXON                HNC                    HLL      many    HNC neurocomp.   [AZEMA][BYTE]  1950.00
BOSS                                                                         [REGGIA]
BRAIN SIMULATOR     Abbot, Foster & Hauser                  IBM                               99.00
BRAINMAKER          Cal.Scient.Software             bp      IBM              [BYTE]          195.00
CABLE                                                       VAX              [MILLER]
CASENET                                    Prolog                            [DOBBINS]
COGNITRON           Cognitive Software     Lisp     many    MAC,IBM          [ZEITV][BYTE]   600.00
CONE                IBM Palo Alto          HLL              IBM              [AZEMA]
CONNECTIONS                                         hopf    IBM              [BYTE]           87.00
CORTEX                                                                       [REGGIA]
DESIRE/NEUNET                                               IBM              [KORN]
EXPLORENET 3000     HNC                    HLL      many    IBM,VAX          [BYTE][COHEN]
GENESIS             Neural Systems                          IBM              [MILLER]       1095.00
GRADSIM             Univ. of Penns.        C        several
GRIFFIN             Texas Instruments                                        [AZEMA]
HYPERBRAIN          Neurix                                  MAC              [BYTE]          995.00
MACBRAIN            Neurix                          many    MAC              [BYTE]          995.00
MACTIVATION         Univ. of Colorado?
METANET             Leiden University      HLL      many    IBM,VAX          [MURRE]
MIRROR 2                                   HLL      several                  [REGGIA]
N-NET               AIWare                 C        bp      IBM,VAX          [BYTE]          695.00
N1000               Nestor                                  IBM,SUN          [BYTE]        19000.00
N500                Nestor                                  IBM              [BYTE]
NEMOSYS                                                     IBM RS/6000      [MILLER]
NESTOR DEV. SYSTEM  Nestor                                  IBM,MAC                         9950.00
NET                                                                          [REGGIA]
NETSET 2            HNC                             many    IBM,SUN,VAX                    19500.00
NETWURKZ            Dair Computer                           IBM              [BYTE]           79.95
NEURALWORKS         NeuralWare             HLL      many    IBM,MAC,SUN      [BYTE][COHEN]  1495.00
NEUROCLUSTERS                                               VAX              [VIBERT]
NEURON                                                                       [MILLER]
NEUROSHELL          Ward Systems Group              bp      IBM              [BYTE]          195.00
NEUROSOFT           HNC
NEUROSYM            NeuroSym                        many    IBM                              179.00
NEURUN              Dare Research                   bp      IBM
NN3                 GMD Bonn               HLL      many                     [LINDEN]
NNSIM                                                                        [NIJHUIS]
OWL                 Olmsted & Watkins               many    IBM,MAC,SUN,VAX  [BYTE]         1495.00
P3                  UCSD                   HLL      many    Symbolics        [ZIPSER]
PABLO                                                                        [REGGIA]
PLATO/ARISTOTLE     NeuralTech                                               [AZEMA]
PLEXI               Symbolics              Lisp     bp,hopf Symbolics
PREENS              Nijmegen University    HLL      many    SUN
PYGMALION           Esprit                 C        many    SUN,VAX          [AZEMA]
RCS                 Rochester University   C        many    SUN              [AZEMA]
SFINX               UCLA                   HLL                               [AZEMA]
Explanation of abbreviations and terms:
Languages:  HLL = High Level Language (i.e., a network definition language;
            if specific programming languages are mentioned, networks can
            be defined using high-level functions in those languages)
Models:     several = a fixed number of models is (and will be) supported;
            many = the system can be (or will be) extended with new models;
            bp = backpropagation; hopf = Hopfield (if specific models are
            mentioned, these are the only ones supported)
References: see list below (We welcome any additional references.)
[AZEMA] Azema-Barac, M., M. Heweston, M. Recce, J. Taylor, P. Treleaven,
M. Vellasco (1990). Pygmalion, neural network programming
environment.
[BARGA] Barga, R.S., R.B. Melton (1990). Framework for distributed artificial
neural system simulation. Proceedings of the IJCNN-90-Washington
DC, 2, 94-97.
[BYTE] Byte (product listing) (1989). BYTE, 14(8), 244-245.
[COHEN] Cohen, H. (1989). How useful are current neural network software
tools? Neural Network Review, 3, 102-113.
[DOBBINS] Dobbins, R.W., R.C. Eberhart (1990). Casenet, computer aided
neural network generation tool. Proceedings of the
IJCNN-90-Washington DC, 2, 122-125.
[KORN] Korn, G.A. (1989). A new environment for interactive neural
network experiments. Neural Networks, 2, 229-237.
[LINDEN] Linden, A., Ch. Tietz (in prep.). Research and development
software environment for modular adaptive systems. Technical
Report NN3-1, GMD Birlinghoven, Sankt Augustin, Germany.
[MILLER] Miller, J.P. (1990). Computer modelling at the single-neuron level.
Nature, 347, 783-784.
[MURRE] Murre, J.M.J., S.E. Kleynenberg (submitted). Extending the
MetaNet Network Environment: process control and machine independence.
[NIJHUIS] Nijhuis, J., L. Spaanenburg, F. Warkowski (1989). Structure
and application of NNSIM: a general purpose neural network
simulator. Microprocessing and Microprogramming, 27, 189-194.
[REGGIA] Reggia, J.A., C.L. D'Autrechy, G.C. Sutton III, S.M. Goodall
(1988). A general-purpose simulation environment for developing
connectionist models. Simulation, 51, 5-19.
[VIBERT] Vibert, J.F., N. Azmy (1990). Neuro_Clusters: a biologically
plausible neural network simulator tool.
[ZEITV] Zeitvogel, R.K. (1989). Cognitive Software's Cognitron 1.2
(review). Neural Network Review, 3, 11-16.
[ZIPSER] Zipser, D., D.E. Rabin (1986). P3: a parallel network
simulation system. In: D.E. Rumelhart, J.L. McClelland (1986).
Parallel distributed processing. Volume 1. Cambridge MA: MIT
Press.
------------------------------
Subject: Commercial Simulators
From: Bill Mammel <CHRMWCM%engvms.unl.edu@CUNYVM.CUNY.EDU>
Date: Tue, 12 Feb 91 19:02:00 -0500
Here's a list of commercial neural network simulators that I have
compiled. Each listing gives the company name, address, & phone and
their product(s). Indented under each product is its hardware
environment. Finally, there are two products listed at the end--I have
only seen these referenced in the literature so I don't know exactly
where to get them.
All information is correct to the best of my knowledge. Product/Company
inclusion is not an endorsement or recommendation. Neither is exclusion.
Corrections or additions are welcome. If a product is not listed or is
incorrect, please e-mail to me at "chrmwcm@engvms.unl.edu"--
Neural Network Software Simulators
=----------------------------------
AI Ware, Inc. 11000 Cedar Avenue, Suite 212
(216) 421-2380 Cleveland, OH 44106
N-NET
PCAT, VAX/VMS
California Scientific Software 160 E. Montecito Avenue, Suite E
(818) 355-1094 Sierra Madre, CA 91024
Brain Maker Professional 2.0
PCAT
Cognitive Software, Inc. 703 East 30th Street
(317) 924-9988 Indianapolis, IN 46205
Cognitron
PCAT, Mac, transputers
DAIR Computer Systems 3440 Kenneth Drive
(415) 494-7081 Palo Alto, CA 94303
Net Wurkz
PCAT
HNC, Inc. 5501 Oberlin Drive
(619) 546-8877 San Diego, CA 92121
ExploreNet 3000
MSDOS, Sun workstations
Neural coprocessor boards
Korn Industrial Consultants 6801 Opatas Street
(602) 298-7054 Tucson, AZ 85745
Desire/NeuNet
PCAT
Nestor, Inc. 1 Richmond Square
(401) 331-9640 Providence, RI 02906
NDS 1000
Neural Computer Systems Limited 79 Olney Road, Emberton
(0) 234-713298 Olney, Bucks, England MK46 5BU
Neurun
Neurun Light
Neural coprocessor boards
Neural Systems, Inc. 2827 West 43rd Avenue
(604) 263-3667 Vancouver, BC V6N 3H9
Awareness
Genesis
PCAT
NeuralWare, Inc. Penn Center West Building IV, Suite 227
(412) 787-8222 Pittsburgh, PA 15276
NeuralWorks Professional II Plus
PCAT, Mac II, SE,
Sun-3,4,386i,
NeXT, transputers
NeuralWorks Designer Pack
PCAT, Sun
NeuralWorks Explorer
PCAT
Neurix One Kendall Square, Suite 2200
(617) 577-1202 Cambridge, MA 02139
MacBrain
Macintosh Plus, SE, II
SAIC 10260 Campus Point Drive
(619) 546-6290 Mail Stop 71
San Diego, CA 92121
ANSim 2.1
PCAT
The Software Tailors Co. 1295 N. Providence Road, Suite B103
(215) 565-4705 Media, PA 19063
Neural Network Simulation Program
PCAT
Ward Systems Group, Inc. 228 West Patrick Street
(301) 662-7950 Frederick, MD 21701
NeuroShell
PCAT
NeuroBoard coprocessors
FORTRAN-77 Neural Network Simulator. FORTRAN-77 Neural Network Simulator
(F77NNS) User Guide. Mission Support Directorate, Mission Planning and
Analysis Division, National Aeronautics and Space Administration, Lyndon
B. Johnson Space Center, Houston, TX. COSMIC Program #MSC-21638. June,
1989.
CaseNet as referenced in Eberhart, R.C., Dobbins, R.W., and Webber,
W.R.S. CaseNet: A neural network tool for EEG waveform classification.
Proceedings IEEE Symposium on Computer Based Medical Systems,
Minneapolis, MN, 60-68, 1989.
------------------------------
Subject: Computists International
From: Ken Laws <LAWS@ai.sri.com>
Date: Tue, 26 Feb 91 22:54:02 -0800
*** PLEASE POST ***
This is to announce Computists International, a new
"networking" association for computer and information scientists.
Hi! I'm Ken Laws. If this announcement interests you, contact
me at internet address laws@ai.sri.com. If you can't get through,
my mail address is: Dr. Kenneth I. Laws; 4064 Sutherland Drive,
Palo Alto, CA 94303; daytime phone (415) 493-7390.
I'm back from two years at the National Science Foundation.
I used to run AIList, and I miss it. Now I'm creating a broader
service for anyone interested in information (or knowledge),
software, databases, algorithms, or doing neat new things with
computers. It's a career-oriented association for mutual
mentoring about grant and funding sources, information channels,
text and software publishing, tenure, career moves, institutions,
consulting, business practices, home offices, software packages,
taxes, entrepreneurial concerns, and the sociology of work. We
can talk about algorithms, too, with a focus on applications.
Toward that end, I'm going to edit and publish a weekly+
newsletter, The Computists' Communique. The Communique will be
tightly edited, with carefully condensed news and commentary.
Content will depend on your contributions, but I will filter,
summarize, and generally act like an advice columnist. (Ann
Landers?) I'll also suggest lines of discussion, collect
"common knowledge" about academia and industry, and help track
people and projects. As a bonus, I'll give members whatever
behind-the-scenes career help I can.
Alas, this won't be free. The charter membership fee for
Computists will depend in part on how many people respond to this
notice. The Communique itself will be free to all members, FOB
Palo Alto; internet delivery incurs no additional charge. To
encourage participation, there's a full money-back guarantee
(excluding postage). Send me a reply to find out more.
-- Ken
Computists International and The Computists' Communique are
service marks of Kenneth I. Laws. Membership in professional
organizations may be a tax-deductible business expense.
------------------------------
Subject: A book on Self-Organization
From: R14502%BBRBFU01.BITNET@BITNET.CC.CMU.EDU
Date: Wed, 27 Feb 91 16:16:14 +0100
To all interested in the concept of self-organization
------------------------------------------------------
The subject matter of self-organization drew a lot of attention
from physicists, chemists, and theoretical biologists before becoming so
popular with neural network researchers.
Anybody interested in the field with a few hours to spare may find the
basics and typical examples of self-organization of complex systems
in an elementary book which I wrote a few years ago. The title is:
"Molecules, Dynamics and Life:
An introduction to self-organization of matter",
Wiley, New York, 1986.
A. Babloyantz
University of Brussels
" Dr. Babloyantz has produced an engaging and earnest introduction to the
field of self-organization in chemical and biological systems.
Dr. Babloyantz proves herself to be a pleasant, practical and reliable
guide to new territory which is still largely uncharted and inhospitable
to tourists. Her style falls halfway between that found in a popular
account and that of a textbook. She tells her story in a chatty,
down-to-earth way, while also giving serious scientific consideration to
fundamental issues of the self-organization of matter." (Nature)
" The issue of self-organization has at the center of a larger
theoretical revolution in physics - the belief that the fundamental laws
of nature are irreversible and random, rather than determinstic and
reversible. The concepts and processes underlying this new way of
thinking are formidable.
Molecules, Dynamics and Life makes these concepts and processes
accessible, for the first time, to students and researchers in physics,
chemistry, biology, and the social sciences." (Physics Briefs)
" In Molecules, Dynamics and Life, Dr. Agnes Babloyantz develops a clear
and easy to read presentation of this developing field of knowledge.
Because only a few advanced research treatises are available so far, this
book is especially welcomed. It offers an excellent introduction to an
interdisciplinary domain, involving physics and biology, chemistry and
mathematics. Obviously, putting together all these topics and making
them readable to a large audience was really a challenge." (BioSystems)
" With this fine book Agnessa Babloyantz has provided a successful and
welcome summary of what has been accomplished so far in the study of
self-organization of matter according to the Prigogine school in
Brussels.
Dr. Babloyantz's book can be highly recommended to all those interested
in self-organization in the fields of chemistry and biochemistry."
(Bull. Math. Biology)
------------------------------
Subject: request for information on real time applications of NNs
From: Paulo V Rocha <P.Rocha@cs.ucl.ac.uk>
Date: Thu, 28 Feb 91 10:26:46 +0000
I am benchmarking a neurocomputer and need data on the typical load
generated by real-time, real-world NN applications.
Virtually any model and any application details are welcome, but I
suppose the heaviest loads are generated by speech processing and signal
(radar?) and image processing. Am I right?
I would appreciate any information or pointers to literature containing
these data. This is more or less a case of life or death for my thesis.
:-)
P.
+-----------------------------+---------------------------------------------+
Paulo Valverde de L. P. Rocha | JANET:procha@uk.ac.ucl.cs
Department of Computer Science| BITNET:procha%uk.ac.ucl.cs@UKACRL
University College London |Internet:procha%cs.ucl.ac.uk@nsfnet-relay.ac.uk
Gower Street | ARPANet:procha@cs.ucl.ac.uk
London WC1E 6BT | UUCP:...!mcvax!ukc!ucl-cs!procha
England | tel: +44 (071) 387 7050 x 3719
| fax: +44 (071) 387 1397
+-----------------------------+---------------------------------------------+
------------------------------
Subject: Re: Looking for Phoneme Data
From: Giorgos Bebis <bebis@csi.forth.gr>
Date: Thu, 28 Feb 91 15:56:49 +0200
In comp.ai.neural-nets you write:
>I am looking for speech data that I can use as input into a phoneme
>recognition neural network. I am working on alternative neural network
>models that have improved learning rates in terms of time and I need to
>test these algorithms with speech data used with traditional
>implementations of neural network based speech recognition packages. Any
>information on where and how to get this speech input data would be
>greatly appreciated. Thanks.
>Gunhan H. Tatman
>Computer Engineering Dept.
>The University of South Carolina
>Columbia, SC 29201
>e-mail: gunhan@otis.hssc.scarolina.edu
> (gunhan@129.252.1.2)
I suggest you look at Scott Fahlman's benchmark collection.
Here is a message that he posted some time ago:
George Bebis,
Dept. of Computer Science,
University of Crete,
PO BOX 1470, Iraklion, Crete, GREECE E-mail : bebis@csi.forth.gr
=-----------------------------------------------------------------------------=
Subject: CMU Benchmark collection
From: Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU
Date: Mon, 03 Sep 90 22:15:32 -0400
[[ Editor's Note: My thanks to Scott for this complete and thoughtful
reply. The subject of benchmarking arises periodically, hence my last
reference to Scott's valiant efforts. As always, I assume readers will
scan this message carefully and follow the directions. If anyone
volunteers to be an additional repository for the files, especially if
they are willing to make up tapes or diskettes and/or provide UUCP
access, please contact me or Scott. -PM ]]
Since the topic of the CMU benchmark collection was raised here, let me
post this information as a way of avoiding dozens of individual questions
and answers.
The benchmark collection is available via internet FTP -- directions for
how to access the collection are included below. I have hesitated to
advertise it to this newsgroup because so many people out on usenet have
no FTP access. As a rule, I don't have time to E-mail these files to
individuals (some are quite large and would have to be chopped up), and
we certainly are not in a position to send out copies on mag tape or
floppy disk. However, any of you who are able to access this material
via FTP are welcome to do so.
I set up the collection a couple of years ago as part of my own empirical
research on neural network learning algorithms. An important question is
how to measure one algorithm against another, even when they deal with
problem domains of similar size and shape. The typical paper in this
field describes some new algorithm and then presents an experiment or two
comparing the new algorithm vs. vanilla backprop. Unfortunately, no two
researchers seem to run the same problem in the same way. Not
surprisingly, everyone beats backprop by at least an order of magnitude,
and usually more. Of course, backprop is very sensitive to the choice of
training parameters, so with a new problem there is always the question
of whether backprop was given a fair chance. The more interesting
question of how a new algorithm stacks up against other post-backprop
algorithms is seldom addressed at all.
So my goal has been to collect a variety of benchmark problems, including
some small, artificial ones (e.g. parity) and some larger, more realistic
ones (e.g. nettalk). For each of these, the collection contains a
problem description, data sets for testing and training (or an algorithm
for generating the same), and a summary of results that people have
obtained on the problem in question using various algorithms. These
results make it possible for people with new algorithms to compare them
against the best results reported to date, and not just against vanilla
backprop. This material is provided solely for the benefit of
researchers; we have no desire to become the "Guinness Book of World
Records" for neural networks. Since my goal is to compare learning
algorithms, not machines, these results are expressed in epochs or
floating-point operations rather than "seconds on a Cray Y/MP" or
whatever.
There is a mailing list for frequent users of this collection and others
interested in benchmarking issues. It is named "nn-bench@cs.cmu.edu"
(Internet address). Mail to this address goes to a couple of hundred
places worldwide, so "add me" requests and other administrative messages
should not go there. Instead they should go to
"nn-bench-request@cs.cmu.edu".
Unfortunately, not too many people have contributed problems to this
collection, and I have been too busy with other things to spend a lot of
time promoting this effort and combing the literature for good problems.
Consequently, the collection and the mailing list have been dormant of
late. I am enclosing a list of files currently in the collection. I
have a couple of other data sets that need some work to get them into
usable form. I hope to find some time for this in the near future, but
that is hard to predict. If someone with lots of time and energy, lots
of online file storage, and good connections to the Internet wants to
take over this effort and run with it, please contact me and we can
discuss this.
Scott Fahlman
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
Internet: fahlman@cs.cmu.edu
.......................................................................
Current contents of Neural Net Benchmark directory.
First number is file size in bytes.
8094 Mar 15 1989 FORMAT
11778 Aug 13 1989 GLOSSARY
13704 Dec 5 1989 nettalk
541269 Jul 15 17:55 nettalk.data
7382 Oct 16 1989 nettalk.list
5570 Apr 16 1989 parity
1911 Oct 16 1989 protein
14586 Aug 22 1989 protein.test
73489 Aug 22 1989 protein.train
5872 Dec 23 1989 sonar
49217 Dec 23 1989 sonar.mines
43052 Dec 23 1989 sonar.rocks
7103 Feb 27 22:20 two-spirals
16245 Mar 4 23:01 vowel
61542 Apr 23 1989 vowel.data
6197 Apr 15 1989 xor
.......................................................................
FTP access instructions:
For people (at CMU, MIT, and soon some other places) with access to the
Andrew File System (AFS), you can access the files directly from
directory "/afs/cs.cmu.edu/project/connect/bench". This file system uses
the same syntactic conventions as BSD Unix: case sensitive names, slashes
for subdirectories, no version numbers, etc. The protection scheme is a
bit different, but that shouldn't matter to people just trying to read
these files.
For people accessing these files via FTP:
1. Create an FTP connection from wherever you are to machine
"pt.cs.cmu.edu". The internet address of this machine is
128.2.254.155, for those who need it.
2. Log in as user "anonymous" with no password. You may see an error
message that says "filenames may not have /.. in them" or something
like that. Just ignore it.
3. Change remote directory to "/afs/cs/project/connect/bench". Any
subdirectories of this one should also be accessible. Parent
directories should not be.
4. At this point FTP should be able to get a listing of files in this
directory and fetch the ones you want.
5. The directory "/afs/cs/project/connect/code" contains public-domain
programs implementing the Quickprop and Cascade-Correlation
algorithms, among other things. Access it in the same way.
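For readers who have not used anonymous FTP before, steps 1 through 4 might
look like the session sketched below. The prompts and exact command names
vary from one FTP client to another, and the file names fetched here are
just examples taken from the directory listing above:

```
% ftp pt.cs.cmu.edu                  (or: ftp 128.2.254.155)
Name (pt.cs.cmu.edu:you): anonymous
Password:                            (no password; just press return, and
                                      ignore any "/.." error message)
ftp> cd /afs/cs/project/connect/bench
ftp> ls                              (list the available benchmark files)
ftp> get nettalk                     (fetch the problem description)
ftp> get nettalk.data                (fetch the data set)
ftp> quit
```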
I've tested this access method, but it's still possible that some of our
ever vigilant protection demons will interfere with access from out in
net-land. If you try to access this directory by FTP and have trouble,
please contact me.
The exact FTP commands you use to change directories, list files, etc.,
will vary from one version of FTP to another.
=----------------------------------------------------------------------------
Good luck,
Bye, George.
------------------------------
End of Neuron Digest [Volume 7 Issue 12]
****************************************