Neuron Digest Sunday, 27 Aug 1989 Volume 5 : Issue 35
Today's Topics:
Science centre exhibit on neural nets
Re: Science centre exhibit on neural nets
Schools for AI&Neural-nets
Schools for AI&Neural-nets
Re: Schools for AI&Neural-nets
Schools for Neural Networks and a M.S.
Schools for AI/Neural-Nets : Addendum!!
Call for papers
Tech Report Available: Symbol Grounding Problem
ICNC Conference Announcement March 1990 in Germany
Fame
Problems with the Neural Net as System Model
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: Science centre exhibit on neural nets
From: tap@ai.toronto.edu (Tony Plate)
Organization: Department of Computer Science, University of Toronto
Date: Fri, 21 Jul 89 21:53:45 +0000
[[ Editor's Note: For any readers who have not been to the Toronto Science
Center, it is a "must see" if you're in the area. A sort of classy
Exploratorium (for those familiar with San Francisco). -PM ]]
Does anyone have any good ideas for a science center exhibit on neural
networks which will give the average visitor some idea of what neural
networks are about?
I'm helping to design a neural networks exhibit for the Ontario Science
Center, a science "museum" in Toronto.
The exhibit will probably be interactive - with a few buttons or knobs the
visitor can play with. The exhibit should be something which a person with a
high school education can understand. The concepts involved in it should be
simple, concrete and interesting - thus things like truth tables are out
because they are both abstract and incredibly boring.
One problem is that many simple neural networks don't do very surprising or
difficult things. XOR, for example, is not really a very difficult problem;
it is only interesting to researchers because we know that a perceptron
cannot do it.
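[[ Editor's Note: For readers curious why the perceptron fails here, this
short C sketch (mine, not Tony's; all constants are illustrative) trains a
single-layer perceptron on XOR with the classic perceptron learning rule
and never converges. ]]

/* A single-layer perceptron trained on XOR never converges, since
   XOR is not linearly separable. */
#include <stdio.h>

int main(void)
{
    /* XOR truth table: inputs and targets */
    int x[4][2] = { {0,0}, {0,1}, {1,0}, {1,1} };
    int t[4]    = {  0,     1,     1,     0   };
    double w[2] = { 0.0, 0.0 }, bias = 0.0, rate = 0.1;
    int epoch, i, errors = 0;

    for (epoch = 0; epoch < 1000; epoch++) {
        errors = 0;
        for (i = 0; i < 4; i++) {
            int out = (w[0]*x[i][0] + w[1]*x[i][1] + bias > 0.0) ? 1 : 0;
            int err = t[i] - out;
            if (err != 0) {
                errors++;
                /* perceptron learning rule */
                w[0] += rate * err * x[i][0];
                w[1] += rate * err * x[i][1];
                bias += rate * err;
            }
        }
        if (errors == 0) break;   /* never happens for XOR */
    }
    printf("after %d epochs: %d of 4 patterns still wrong\n", epoch, errors);
    return 0;
}

(Change the targets to AND and the same loop converges in a few epochs.)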
The network has to be simple because the computers available are not very
powerful, and we would like it to learn while the visitor is with the
exhibit, i.e. in 2 to 5 minutes. They are using Amigas, which seem to be
able to do about 1200 link updates per second for back-propagation training
(but maybe that could be improved by using integer arithmetic and table
lookup).
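[[ Editor's Note: Tony's aside about integer arithmetic and table lookup is
a standard trick. A rough C sketch follows (my illustration; the table size
and fixed-point scaling are arbitrary choices, not figures from the
posting). ]]

/* Replace floating-point sigmoid evaluation with a precomputed lookup
   table and fixed-point (integer) activations. */
#include <stdio.h>
#include <math.h>

#define TABLE_SIZE 1024
#define RANGE      8.0        /* table covers net inputs in [-8, +8] */
#define SCALE      256        /* 8.8 fixed point: value = int / 256  */

static int sigmoid_table[TABLE_SIZE];

void build_table(void)
{
    int i;
    for (i = 0; i < TABLE_SIZE; i++) {
        double x = -RANGE + 2.0 * RANGE * i / (TABLE_SIZE - 1);
        sigmoid_table[i] = (int)(SCALE / (1.0 + exp(-x)));
    }
}

/* net is a fixed-point net input (value * SCALE); returns fixed point */
int sigmoid_fixed(int net)
{
    int index = (net + (int)(RANGE * SCALE)) * (TABLE_SIZE - 1)
                / (int)(2 * RANGE * SCALE);
    if (index < 0) index = 0;
    if (index >= TABLE_SIZE) index = TABLE_SIZE - 1;
    return sigmoid_table[index];
}

int main(void)
{
    build_table();
    /* sigmoid(0) should be about 0.5, i.e. roughly SCALE/2 */
    printf("sigmoid(0) = %d/%d\n", sigmoid_fixed(0), SCALE);
    return 0;
}

On a 68000-class machine this avoids a transcendental call per activation;
accuracy depends only on the table size you can afford.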
We already have a few ideas, but none of them are truly inspirational, so I
shan't bore you with them.
So if you have, or have heard of, any good ideas for an interesting and
understandable neural networks exhibit, please mail them to me. If there is
enough interest I will post a summary to comp.ai.neural-nets.
More context:
The exhibit is to be part of a rather large show on Psychology, which will
stay on show in Toronto for approximately 8 months, and then move to a
museum in the U.S. The organizers want a section on AI, but have decided
that this section will consist of one exhibit on neural networks...
I'm not getting paid for this, so neither will you if you contribute a great
idea. However, I will do my best to make sure you get credited. My
connection with the Ontario Science Center is entirely informal, so if you
don't want to give away all rights to your ideas, then don't mail them to
me.
Tony Plate ---------------------- tap@ai.utoronto.ca -----
Department of Computer Science, University of Toronto,
10 Kings College Road, Toronto,
Ontario, CANADA M5S 1A4
------------------------------
Subject: Re: Science centre exhibit on neural nets
From: mcvax!ukc!strath-cs!nott-cs!ucl-cs!M.Nigri@uunet.uu.net
Date: 27 Jul 89 12:03:26 +0000
From: "Meyer E. Nigri" <M.Nigri@uk.ac.ucl.cs>
Tony,
One simple demonstration is to show how a nn can recognise patterns. For
example, one can present a series of characters for the nn to learn. After
the learning phase, one can recover the correct character by presenting a
corrupted version. With a good graphical interface, I think this example can
be interesting.
If the graphical interface is a little bit better, a true image could be
presented to the nn (like a picture of a car, house, horse, tree, etc.).
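[[ Editor's Note: Meyer doesn't name an architecture. One minimal way to
realize this demo is a Hopfield-style autoassociative memory, sketched
below in C; the choice of net, the 5x5 bitmap, and the single stored
pattern are my illustrative assumptions. ]]

/* Patterns are 5x5 bitmaps stored with the Hebb rule; a corrupted
   pattern is cleaned up by repeated threshold updates. */
#include <stdio.h>

#define N 25   /* 5x5 bitmap, unit states are +1/-1 */

int W[N][N];

void store(const int *p)             /* Hebbian storage */
{
    int i, j;
    for (i = 0; i < N; i++)
        for (j = 0; j < N; j++)
            if (i != j) W[i][j] += p[i] * p[j];
}

void recall(int *s, int sweeps)      /* repeated update sweeps */
{
    int k, i, j;
    for (k = 0; k < sweeps; k++)
        for (i = 0; i < N; i++) {
            int net = 0;
            for (j = 0; j < N; j++) net += W[i][j] * s[j];
            s[i] = (net >= 0) ? 1 : -1;
        }
}

int main(void)
{
    /* a crude "T" as the stored pattern */
    int T[N] = {  1, 1, 1, 1, 1,
                 -1,-1, 1,-1,-1,
                 -1,-1, 1,-1,-1,
                 -1,-1, 1,-1,-1,
                 -1,-1, 1,-1,-1 };
    int probe[N], i;

    store(T);
    for (i = 0; i < N; i++) probe[i] = T[i];
    probe[0]  = -probe[0];           /* corrupt a few pixels */
    probe[7]  = -probe[7];
    probe[24] = -probe[24];

    recall(probe, 5);
    for (i = 0; i < N; i++) {
        printf("%c", probe[i] > 0 ? '#' : '.');
        if (i % 5 == 4) printf("\n");
    }
    return 0;
}

With a handful of stored letters and an on-screen grid, a visitor could
flip pixels and watch the net clean them up.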
Meyer.
+--------------------------+-----------------------------------------------+
|Meyer Elias Nigri | JANET:mnigri@uk.ac.ucl.cs |
|Dept. of Computer Science | BITNET:mnigri%uk.ac.ucl.cs@UKACRL |
|University College London |Internet:mnigri%cs.ucl.ac.uk@nsfnet-relay.ac.uk|
|Gower Street | ARPANet:mnigri@cs.ucl.ac.uk |
|London WC1E 6BT | UUCP:...!mcvax!ukc!ucl-cs!mnigri |
+--------------------------+-----------------------------------------------+
------------------------------
Subject: Schools for AI&Neural-nets
From: ramamurb@turing.cs.rpi.edu (Badrinath Ramamurthy)
Organization: RPI CS Dept.
Date: Tue, 01 Aug 89 20:19:32 +0000
[[ Editor's Note: Following this message is the set of responses. Remember
that "best" is terribly subjective and folks interested in these programs
should note the particular biases and strengths of each individual school.
For example, while Stanford's CS program is highly rated, participants
report that the graduate AI track is very logic oriented -- quite different
from the Cognitive Science approach. -PM ]]
A friend of mine finished his MS and intends to work for his PhD
in Neural Networks and AI. Any suggestions which schools he should
be applying to ?
You can email me on this address: ramamurb@turing.cs.rpi.edu
Thanx in advance.
-Badri
------------------------------
Subject: Schools for AI&Neural-nets
From: ramamurb@turing.cs.rpi.edu (Badrinath Ramamurthy)
Organization: RPI CS Dept.
Date: Tue, 08 Aug 89 14:29:31 +0000
Hello friends !
I posted a request for names of schools with good PhD programs for Neural
Nets and AI, for a friend of mine. A lot of people were enthusiastic and
shared their thoughts. Here is what I received:
Many people named the following Schools:
Boston U
Caltech
CMU
MIT
Rochester
U of Toronto
UCSB
Univ of Southern California at LA (USC at LA)
UCSD
Yale
Many people chose to mention names and elaborate, and here I reproduce
the text from these responses. Some mention newly formed groups and
some give interesting information on literature to look up.
------------------ ------------------------------ ----------
Received: from ucsd.edu by turing.cs.rpi.edu (4.0/1.2-RPI-CS-Dept)
id AA16856; Tue, 1 Aug 89 23:06:16 EDT
Subject: Re: Schools for AI/Neural nets
I am satisfied with UCSD - there are a LOT of people doing work in the area,
not only in Computer Science but in Cognitive Science, Physics, Linguistics,
Economics and Biology (as well as others, no doubt). Available coursework
includes Hecht-Nielsen's full year course, an intro course by Gary Cottrell
in Computer Science, seminars on Machine Learning & more. Ongoing seminars
include the PDP research group in Cognitive Science, authors & source of the
three volume PDP books.
Other schools doing interesting things:
Boston University & Northeastern have the Center for Adaptive Systems or
some such, under Stephen Grossberg. The program is new and I don't know much
about it, but Grossberg's work (while hard to plow through) is very
important.
USC has Michael Arbib, Bart Kosko and Christoph von der Malsburg. I have
heard from a couple of their grad students that contact with them is
neither easy nor productive. I think very highly of von der Malsburg,
however. He is on the right track...
Carnegie-Mellon has a major effort.
Colorado-Boulder has Mike Mozer & Paul Smolensky. I don't agree with
everything they say, but their work is excellent.
Mike Jordan is now at M.I.T., he would be excellent to work for in
applications of neural networks to robotics problems.
Toronto & Rochester have some good people as well.
And Stanford now has Rumelhart as well as Mark Gluck, a recent PhD who is
extremely prolific. Most of their work is in Psych, or Signal Processing
(Bernie Widrow).
This should be a good core of schools to look at. I'd recommend getting a
hold of recent proceedings from IJCNN, or the last two ICNN conferences,
plus the NIPS conference and take a look at papers. If your friend sees
something exciting, then he might want to contact that school & see what's
up. Lots of research now will pay off in a happy PhD program...
Dave DeMers
demers@cs.ucsd.edu
----------------------- xx xx xx xx xx xx xx -------------------
From enorris@gmuvax2.gmu.edu Wed Aug 2 13:20:17 1989
Your friend would do well to consider George Mason University, located near
Washington, D.C. There is a novel Ph.D. program in Information Technology
which provides a broad base in computer science, software engineering,
OR/STAT, etc. The CS Department has a flourishing and well-funded AI Center
with particular interests in Machine Learning. Other CS faculty are active
in neural networks, expert systems, genetic algorithms, natural language
processing, etc. Assistantships are available. For further information
write to
Dr. James Palmer
School of Information Technology & Engineering
George Mason University
Fairfax, VA 22030
(703) 323-2939
Eugene Norris
CS Dept, GMU
--------------------------------------------------------------
From honavar@cs.wisc.edu Wed Aug 2 14:05:34 1989
Organization: U of Wisconsin CS Dept
CMU, UCSD, Yale, Brown, MIT, UWisc-Madison, Boston U, Stanford, Berkeley,
Rochester, UCLA, Maryland, ..
He should look at the recent proceedings of IJCAI, NIPS, as well as AI and
NNet journals.
----------------------------------------------------------------------
From heirich@cs.UCSD.EDU Thu Aug 3 00:18:21 1989
Organization: EE/CS Dept. U.C. San Diego
You may consider me biased, given my location, but I think the best schools
for a graduate program in neural nets are, without a doubt:
Univ. Calif. San Diego
Cal Tech
Univ. Southern Cal.
The other significant places would be: U. Toronto; Stanford; U.C. Boulder;
Boston U.
There are certainly other places, but these should be at the top of any list
because of the faculty there.
Alan Heirich Comp. Sci. & Eng., Cognitive Science
C-014 University of California, San Diego 92093
heirich@cs.ucsd.edu
aheirich@ucsd.bitnet
--------------------------------------------------------------------
From ck@rex.cs.tulane.edu Thu Aug 3 02:10:00 1989
At the Computer Science Department of Tulane University we have a PhD
program and a group working on Neural Nets.
Dr. Koutsougeras
------------------------------------------------------------------------
From plong@saturn.ucsc.edu Fri Aug 4 14:36:21 1989
Subject: Grad Schools for Neural Nets
The University of California at Santa Cruz has an excellent program in
Computational Learning Theory, if your friend is interested in studying the
theoretical properties of neural nets.
David Haussler and Manfred Warmuth, two of the "Four Germans" who wrote a
key paper in this area, continue to be central figures, and both are a
pleasure to work with. A third faculty member, Dave Helmbold, who has wide
ranging interests, has contributed papers recently. Scholars visiting
during the summer for joint research projects include Andrzej Ehrenfeucht,
Michael Kearns, Nick Littlestone, Rob Schapire and Bob Sloan.
The atmosphere here is very relaxed and noncompetitive. The campus is truly
beautiful, nestled in the redwoods in the hills overlooking Santa Cruz, with
much of the campus having a view of the ocean. Computing facilities are
excellent, and a vast majority of graduate students receive some sort of
assistantship.
I strongly recommend UC Santa Cruz for studying the theory of neural nets.
Phil Long
P.S. Some other schools with a lot of activity in Computational Learning
Theory are MIT, Harvard, Technion (Israel), Penn, Illinois,
Illinois-Chicago, and Pittsburgh.
----------------------------------------------------------------------------
Thanks to all the people who gave me this information.
In case I receive more responses, I'll post the addenda (if that's
the right word).
-Badri
...............
Badrinath Ramamurthy ( ramamurb@turing.cs.rpi.edu )
...............
------------------------------
Subject: Re: Schools for AI&Neural-nets
From: Lloyd Lim <iris!lim@UCDAVIS.UCDAVIS.EDU>
Organization: U.C. Davis - Department of Electrical Engineering and Computer Science
Date: 08 Aug 89 18:22:45 +0000
In a previous article, ramamurb@turing.cs.rpi.edu (Badrinath Ramamurthy) writes:
>
> Univ of Southern California at LA (USC at LA)
I just thought I'd better clarify so others don't get confused. There is no
school that is known as USC at LA.
There is the University of Southern California (USC) which is a private
school and one of a kind.
Then there is the University of California at Los Angeles (UCLA) which is a
branch of the University of California system. UCSB, UCSD, and UCD are also
UC branches.
I suppose that there were responses for both USC and UCLA, thus the
confusion. I don't understand how people can get confused. After all,
California is THE center of the world. :-) :-D :-)
+++
Lloyd Lim Internet: lim@iris.ucdavis.edu (128.120.57.20)
Compuserve: 72647,660
US Mail: 146 Lysle Leach Hall, U.C. Davis, Davis, CA 95616
------------------------------
Subject: Schools for Neural Networks and a M.S.
From: mikeoro@hubcap.clemson.edu (Michael K O'Rourke)
Organization: Clemson University, Clemson, SC
Date: Fri, 11 Aug 89 13:10:21 +0000
I followed with interest the recent discussion on which schools are good for
neural nets and AI. I am about to get my B.S. and will enter grad school to
get an M.S. in Aug '90. Does everyone pretty much agree that the schools
that are good for a PhD are also good for an M.S. in neural nets? Can anyone
suggest other schools that might be better for an M.S. than a PhD? (I am
also interested in Networking and Communications, if someone can suggest a
school that has these as well as neural nets.)
Thanks,
Michael O'Rourke
Clemson University
------------------------------
Subject: Schools for AI/Neural-Nets : Addendum!!
From: ramamurb@turing.cs.rpi.edu (Badrinath Ramamurthy)
Organization: RPI CS Dept.
Date: Tue, 22 Aug 89 13:05:33 +0000
Hi Folks,
Here I am, back with a few more responses I received for "Schools for
AI/Neural-Nets". Thanks to all the people who have replied.
========================================================================
From: patil@a.cs.okstate.edu (Patil Rajendra Bha)
Subject: Re: Schools for AI&Neural-nets
Date: 11 Aug 89 04:32:47 GMT
Organization: Oklahoma State Univ., Stillwater
The information about the schools is given in one of the Neural Network
Society journals of 1988. Some of them are:
The University of Tennessee, Center for Neural Engineering
Boston University, Center for Adaptive Systems
Brown University, Rhode Island
Johns Hopkins University
CALTECH
Carnegie Mellon, Psychology Dept.
MIT
There are many others.
Last month I posted the same message and got the same replies you got.
Then I looked into the research center directory, and I am now awaiting
replies from the universities.
I am a graduate student in computer science and plan to go for a Ph.D.
in neural networks. I would appreciate it if you could mail the list of
universities to me. I will let you know about the replies that I receive.
Thank you
Patil Rajendra
patil@a.cs.okstate.edu
-----------------------------------------------------------------------------
Date: Tue, 8 Aug 89 16:54:23 PDT
From: ash@cs.UCSD.EDU (Tim Ash)
Subject: Re: Schools for AI&Neural-nets
The top Neural Net graduate programs are at the following schools:
U.C. San Diego
U.S.C.
C.M.U.
Stanford U.
Boston U.
Some others to consider (not as good as the above):
U.C. Boulder
C.I.T.
University of Toronto
Northeastern U.
All of the above schools have people who are active and well known in the
field. U.C. San Diego is acknowledged to be the strongest school in the
field (both in terms of numbers of people, and the variety of academic
departments involved in the work). Many of the top researchers at other
universities passed through U.C.S.D. at one point or another. Good luck in
your friend's search. If you need specific information about U.C.S.D., send
me e-mail.
Tim Ash (CSE Dept. U.C.S.D.)
ash@ucsd.edu
-------------------------------------------------------------------------
From: gary@cs.ucsd.edu (Gary Cottrell)
How about: University of California San Diego:
Researchers in PDP, AI and related fields include:
(active research, not department, in parens)
Henry Abarbanel (Dynamical systems)
Elizabeth Bates (Brain and Language)
Rik Belew (Genetic Algs, PDP and AI)
Shankar Chatterjee (Vision and simulated annealing)
Patricia Churchland (Philosophy of Computational Neuroscience)
Gary Cottrell (PDP)
Francis Crick (Neuroscience)
Jeff Elman (PDP, NLP and GA)
Clark Guest (Optical neurocomputing)
Paul Kube (Vision)
Marta Kutas (NLP and Neuroscience)
David Kirsh (AI)
Helen Neville (Brain and Language)
Mohan Paturi (Learning theory)
Ramachandran (Human Vision)
Walt Savitch (NLP)
Terry Sejnowski (PDP and Computational Neuroscience)
Marty Sereno (PDP and Neuroscience)
Hal White (PDP and theory)
David Zipser (PDP and Computational Neuroscience)
gary cottrell 619-534-6640
Computer Science and Engineering C-014
UCSD,
La Jolla, Ca. 92093
gary%cs@ucsd.edu (ARPA)
{ucbvax,decvax,akgua,dcdwest}!sdcsvax!gary (USENET)
gcottrell@ucsd.edu (BITNET)
===================================================================
Hope it helps all souls searching for a way into neural-nets/AI!
-Badri ( ramamurb@turing.cs.rpi.edu )
------------------------------
Subject: Call for papers
From: margaux!bouguett@BOULDER.COLORADO.EDU
Organization: University of Colorado, Boulder
Date: 22 Jul 89 08:15:26 +0000
FIRST MAGHREBIN CONFERENCE ON ARTIFICIAL INTELLIGENCE
AND SOFTWARE ENGINEERING
Constantine, Algeria, September 24-27, 1989
CALL FOR PAPERS
TOPICS
The Conference Program will include both invited and contributed papers.
Authors from the Maghreb are particularly encouraged to submit. The topics
addressed include, but are not limited to:
- Algebraic Specification
- Program Construction and Proving
- Expert Systems
- Knowledge and Data Bases
- Communication Protocols
- Distributed Systems
- Object Oriented Programming
TERMS OF PRESENTATION OF PAPERS :
Papers should be in English, French or Arabic and meet the following
requirements :
1- Papers should not exceed 20 pages, including abstract, tables, figures
and references.
2- Papers should be typed double-spaced on single-sided A4 pages.
3- The full name(s) of the author(s), and the institute and country where
the research was conducted, should be written on the title page, with an
abstract of no more than 300 words.
4- Four copies of the paper should be sent to the chairman of the organizing
committee.
DEADLINE FOR SUBMISSION OF PAPERS :
The closing date for acceptance of papers is 10 August 1989. Those
whose papers are accepted will be informed by 4th September 1989.
ORGANIZED BY :
Laboratory of Knowledge Bases and Distributed Systems, Computer Science
Institute, Constantine University, with the participation of LRI ORSAY - FRANCE.
GUEST SPEAKER :
Eric G. Wagner, Research staff member IBM Watson Research Center (USA)
CORRESPONDENCE :
All correspondence should be addressed to :
Dr. BETTAZ Mohamed
Institut d'Informatique
Universite de Constantine
Constantine 25000
ALGERIA
Telephone : (213) (4) 69.21.39
Telex : 92436 UNCZL
------------------------------
Subject: Tech Report Available: Symbol Grounding Problem
From: harnad@clarity.Princeton.EDU (Stevan Harnad)
Date: Sat, 05 Aug 89 01:21:57 -0400
THE SYMBOL GROUNDING PROBLEM
Stevan Harnad
Department of Psychology
Princeton University
ABSTRACT: There has been much discussion recently about the scope and limits
of purely symbolic models of the mind and about the proper role of
connectionism in cognitive modeling. This paper describes the "symbol
grounding problem" for a semantically interpretable symbol system: How can
its semantic interpretation be made intrinsic to the symbol system, rather
than just parasitic on the meanings in our heads? How can the meanings of
the meaningless symbol tokens, manipulated solely on the basis of their
(arbitrary) shapes, be grounded in anything but other meaningless symbols?
The problem is analogous to trying to learn Chinese from a Chinese/Chinese
dictionary alone.
A candidate solution is sketched: Symbolic representations must be grounded
bottom-up in nonsymbolic representations of two kinds: (1) iconic
representations, which are analogs of the proximal sensory projections of
distal objects and events, and (2) categorical representations, which are
learned and innate feature-detectors that pick out the invariant features of
object and event categories from their sensory projections. Elementary
symbols are the names of these object and event categories, assigned on the
basis of their (nonsymbolic) categorical representations. Higher-order (3)
symbolic representations, grounded in these elementary symbols, consist of
symbol strings describing category membership relations ("An X is a Y that
is Z").
Connectionism is one natural candidate for the mechanism that learns the
invariant features underlying categorical representations, thereby
connecting names to the proximal projections of the distal objects they
stand for. In this way connectionism can be seen as a complementary
component in a hybrid nonsymbolic/symbolic model of the mind, rather than a
rival to purely symbolic modeling. Such a hybrid model would not have an
autonomous symbolic "module," however; the symbolic functions would emerge
as an intrinsically "dedicated" symbol system as a consequence of the
bottom-up grounding of categories' names in their sensory representations.
Symbol manipulation would be governed not just by the arbitrary shapes of
the symbol tokens, but by the nonarbitrary shapes of the icons and category
invariants in which they are grounded.
Preprint Available:
Stevan Harnad JVNET: harnad@confidence.princeton.edu harnad@princeton.edu
srh@flash.bellcore.com harnad@elbereth.rutgers.edu
CSNET: harnad%confidence.princeton.edu@relay.cs.net UUCP: harnad@princeton.uucp
BITNET: harnad@pucc.bitnet harnad1@umass.bitnet Phone: (609)-921-7771
------------------------------
Subject: ICNC Conference Announcement March 1990 in Germany
From: ECKMILLE@DD0RUD81.BITNET (Rolf Eckmiller, Duesseldorf, FRG)
Date: Fri, 18 Aug 89 17:06:00 +0700
CALL FOR PAPERS 8/89
International Conference on:
PARALLEL PROCESSING IN NEURAL SYSTEMS AND COMPUTERS (ICNC)
- 10th Cybernetics Congress of the DGK -
19. - 21. March, 1990, Heinrich-Heine-Universitaet Duesseldorf (FRG)
Organizing Committee: R. Eckmiller (chair), Duesseldorf (FRG)
G. Hartmann, Paderborn (FRG)
G. Hauske, Muenchen (FRG)
C. v.d. Malsburg, Los Angeles (USA)
W. v. Seelen, Bochum (FRG)
Conference Language: English
Topics: 1) New Concepts in Neuroscience and Computational Neuroscience
2) Massively Parallel Computers (e.g. SUPRENUM, Transputer Systems)
3) Structure and Function of Biological Neural Systems
4) Self Organization versus Programming in Parallel Computers
5) Optical Computers and Molecular Computers
6) Parallel Processing in Artificial Intelligence
Activities: *) Invited Lectures by American and European Scientists on the
listed Topics
*) Oral Presentations (15 + 5 min.)
*) Poster Presentations
*) Exhibition of Books, Neural Systems, and Computers
Invited Speakers include:
J.R Barker/Glasgow (UK) G. Carpenter/Boston (USA)
J. Feldman/Berkeley (USA) A. Cremers/Dortmund (FRG)
H. Haarer/Bayreuth (FRG) K. Fukushima/Osaka (Japan)
T. Kohonen/Espoo (Finland) H. Haken/Stuttgart (FRG)
D. Psaltis/Pasadena (USA) W. Reichardt/Tuebingen (FRG)
U. Trottenberg/Bonn (FRG)
Deadline for Submission of Papers: >> 15 November, 1989 << !!!!!!!
(4 camera-ready pages per contribution)
Publication: Conference Proceedings will be available at the conference
as a hardcover book (Elsevier Science Publ.)
Registration Fees: Before 1 October, 1989 = 150,- DM including Proceedings
After 1 October, 1989 = 200,- DM " "
Students = 100,- DM " "
Students = 50,- DM without Proceedings
ICNC-Conference Secretariat:
Dr. R. Eckmiller
Heinrich-Heine-Universitaet Duesseldorf
Div. Biocybernetics
Universitaetsstrasse 1 Tel.: (211) 311-5204
D-4000 Duesseldorf (FRG) e-mail: ECKMILLE@DD0RUD81.BITNET
Please complete and mail the request for the 2nd Announcement to the
ICNC-Conference Secretariat.
cut here - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
To ICNC-Conference Secretariat
Dr. R. Eckmiller
Heinrich-Heine-Universitaet Duesseldorf
Div. Biocybernetics
Universitaetsstrasse 1
D-4000 Duesseldorf (FRG)
Request for 2nd Announcement of ICNC
19. - 21. March, 1990, Duesseldorf (FRG)
Last Name:
___________________________________________________________________
First Name:
__________________________________________________________________
Affiliation:
_________________________________________________________________
Street:
______________________________________________________________________
City:
________________________________________________________________________
Country:
_____________________________________________________________________
Tel.: e-mail:
______________________________ _________________________________
I intend to participate at the ICNC ( ) as student (yes / no)
I intend to submit a contribution on the topic No. ( )
as oral presentation ( )
as poster presentation ( )
as oral paper or poster ( )
Please, send the 2nd Announcement of ICNC to my above address.
Date: Sincerely,
_________________________ ___________________________________
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
------------------------------
Subject: Fame
From: kingsley_morse%01%hp5003@hplabs.hp.com
Date: 23 Aug 89 09:20:00 -0800
Want a chance to be famous?
Would you like the chance to have your neural net program become a
standard of comparison? The major computer companies are cooperating to
collect new benchmark programs that will be distributed far and wide.
"The prevailing set of synthetic benchmark standards are clearly
inadequate to measure today's advanced computer systems," said John Mashey
of MIPS Computer Systems. "Small, synthetic benchmarks like Dhrystones and
Whetstones can no longer be used to gauge the performance of systems that
now take advantage of mainframe and supercomputer design concepts. Today's
workstations and servers deliver high performance by utilizing heavy
instruction pipelining, multiple execution units working in parallel, large
caches, fast memory systems, and optimizing compilers. Some of the existing
benchmarks easily fit within the caches of these machines, and the
performance results are completely unrealistic. It's like trying to measure
the speed of a bullet with a stopwatch."
The major computer company cooperative, called SPEC (for Systems
Performance Evaluation Cooperative) is collecting programs that are better
benchmarks for today's systems. Its members include IBM, DEC, HP, Sun,
MIPS, Data General, and Motorola. The programs chosen for this suite of
benchmark programs will be an industry standard.
If you're interested in sharing the limelight, here's what you should
do:
1.) Select a neural net training program and training data whose source
code you won't mind putting in the public domain.
2.) Check that your program and training data take 1 to 10 minutes to
train on an 8 mips machine; see the timing sketch below. (My understanding
is that an HP 835 is about 14 mips, a Sun SPARCstation 9 mips, a DEC 11/780
1 mips, and a DECstation 3100 11 mips.)
3.) Copy the source code for your neural net program and data to a floppy
disc or magnetic tape and mail it by October 15th, 1989 to:
Kingsley G. Morse Jr.
1039 Continentals Way, #301
Belmont, CA 94002
I encourage you to submit back-propagation applications written in C or
FORTRAN, and to call me at (415) 691-3221 during the day if you have any
questions.
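[[ Editor's Note: A minimal sketch (not part of the SPEC announcement) of
how a submitter might check the 1-to-10-minute criterion in item 2: wrap
the training run in a wall-clock timer. The train() stub is a placeholder
for your own back-propagation routine. ]]

#include <stdio.h>
#include <time.h>

void train(void)                  /* placeholder: substitute your net */
{
    volatile double x = 0.0;
    long i;
    for (i = 0; i < 10000000L; i++)
        x += 1.0 / (1.0 + (double)i);
}

int main(void)
{
    time_t start = time(NULL);
    train();
    printf("training took about %ld seconds\n",
           (long)(time(NULL) - start));
    return 0;
}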
Sincerely,
Kingsley G. Morse Jr.
(415) 691-3221
sun.com!hpda!hp5003!01!kingsley_morse
------------------------------
Subject: Problems with the Neural Net as System Model
From: fishwick@fish.cis.ufl.edu (Paul Fishwick)
Organization: UF CIS Department
Date: Mon, 14 Aug 89 14:34:11 +0000
For a pre-print of this article, please send a note with your address to:
paulette@bikini.cis.ufl.edu
NEURAL NETWORK MODELS IN SIMULATION:
A COMPARISON WITH TRADITIONAL MODELING APPROACHES
Paul A. Fishwick
Department of Computer and Information Science
University of Florida
Bldg. CSE, Room 301
Gainesville, FL 32611
INTERNET: fishwick@fish.cis.ufl.edu
To be presented at:
The Winter Simulation Conference, Dec. 1989
ABSTRACT
Neural models are enjoying a resurgence in systems research primarily due to
a general interest in the connectionist approach to modeling in artificial
intelligence and to the availability of faster and cheaper hardware on which
neural net simulations can be executed. We have experimented with using a
multi-layer neural network model as a simulation model for a basic
ballistics model. In an effort to evaluate the efficiency of the neural net
implementation for simulation modeling, we have compared its performance
with traditional methods for geometric data fitting such as linear
regression and surface response methods. Both of the latter approaches are
standard features in many statistical software packages. We have found that
the neural net model appears to be inadequate in most respects and we
hypothesize that accuracy problems arise, primarily, because the neural
network model does not capture the system structure characteristic of all
physical models. We discuss the experimental procedure, issues and problems,
and finally consider possible future research directions.
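[[ Editor's Note: For readers unfamiliar with the "traditional" side of this
comparison, the following C sketch (mine; the paper's actual models and data
are not reproduced here) fits sampled ballistic data with a quadratic
least-squares regression solved via the normal equations. ]]

#include <stdio.h>
#include <math.h>

#define NPTS 20

int main(void)
{
    /* sample a drag-free trajectory y(t) = v*t - 0.5*g*t^2 */
    double g = 9.81, v = 50.0, t[NPTS], y[NPTS];
    double S[5] = {0}, T[3] = {0};     /* power sums for normal eqns */
    double A[3][4], c[3];
    int i, j, k;

    for (i = 0; i < NPTS; i++) {
        t[i] = 0.5 * i;
        y[i] = v * t[i] - 0.5 * g * t[i] * t[i];
    }

    /* build sums S[k] = sum t^k and T[k] = sum y * t^k */
    for (i = 0; i < NPTS; i++) {
        double p = 1.0;
        for (k = 0; k < 5; k++) {
            S[k] += p;
            if (k < 3) T[k] += y[i] * p;
            p *= t[i];
        }
    }

    /* 3x3 normal equations A c = T, A[j][k] = S[j+k]; eliminate */
    for (j = 0; j < 3; j++) {
        for (k = 0; k < 3; k++) A[j][k] = S[j + k];
        A[j][3] = T[j];
    }
    for (j = 0; j < 3; j++)
        for (i = j + 1; i < 3; i++) {
            double f = A[i][j] / A[j][j];
            for (k = j; k < 4; k++) A[i][k] -= f * A[j][k];
        }
    for (j = 2; j >= 0; j--) {         /* back-substitution */
        c[j] = A[j][3];
        for (k = j + 1; k < 3; k++) c[j] -= A[j][k] * c[k];
        c[j] /= A[j][j];
    }
    printf("fit: y = %.3f + %.3f t + %.3f t^2 (expect 0, %.1f, %.3f)\n",
           c[0], c[1], c[2], v, -0.5 * g);
    return 0;
}

The regression recovers the trajectory's coefficients exactly in closed
form, which is the kind of structural fit the abstract contrasts with a
network that must discover everything from samples.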
+------------------------------------------------------------------------+
| Prof. Paul A. Fishwick.... INTERNET: fishwick@bikini.cis.ufl.edu |
| Dept. of Computer Science. UUCP: gatech!uflorida!fishwick |
| Univ. of Florida.......... PHONE: (904)-335-8036 |
| Bldg. CSE, Room 301....... FAX is available |
| Gainesville, FL 32611..... |
+------------------------------------------------------------------------+
------------------------------
End of Neurons Digest
*********************