NEURON Digest Tue Feb 2 13:50:22 CST 1988 Volume 3 / Issue 3
Today's Topics:
Re: Commercial Neural Nets
Neural Net Study Group
'87 neural nets proceedings
Misc.
Re: Commercial Neural Nets
Announcing a Connectionist/Neuroscience Symposium
Seminar
Tech report available...
Reading List Suggestions?
Fault Tolerance & Neural Networks
----------------------------------------------------------------------
Date: Fri, 22 Jan 88 08:17:46 PST
From: Daniel Abramovitch <danny@ford-wdl1.arpa>
Subject: Reading List Suggestions?
Can anyone recommend a good starter text on neural nets? I come
from the adaptive-control world, and neural nets seem to be the
hot buzzword in industry.
Thanks in advance,
Danny
------------------------------
Date: 24 Jan 88 16:54:00 GMT
From: codas!novavax!hcx1!brian@bikini.cis.ufl.edu
Subject: Re: Commercial Neural Nets
I have heard of a neural net product called MacBrain (I don't know
who makes it) which runs on a Macintosh. I would be interested in
hearing from people who use this product in the "real world".
------- -------
Brian M. Leach "The lasers are in the lab
Harris Computer Systems The old man is dressed in white clothes
2101 W. Cypress Creek Rd. #161 Everybody says he's mad
Ft. Lauderdale, FL 33309 No one knows the things he knows
brian@harris.com No one knows."
Neil Young, "Sedan Delivery"
------------------------------
Date: 25 Jan 88 13:43:30 PST (Monday)
Subject: Neural Net Study Group
From: Rockwell.HENR801c@xerox.com
I'm trying to find members of Xerox in Rochester, NY interested in
joining/forming a neural net/connectionist study group. Interested parties
should reply to ROCKWELL:HENR801C:XEROX.
------------------------------
Date: 25 Jan 88 08:13:36 GMT
From: Mike MacGregor <ihnp4!alberta!macg@ucbvax.berkeley.edu>
Subject: '87 neural nets proceedings
Does anyone have an address for obtaining the proceedings from last summer's
neural nets conference? Thanks in advance.
uucp: macg@alberta Innately analog: (403)432-3978
ean: macg@pembina.alberta.cdn
disclaimer: I'm saving all my opinions for my thesis.
------------------------------
Date: Thu 21 Jan 88 09:32:56-PST
From: Ken Laws <LAWS@iu.ai.sri.com>
Subject: Misc.
Kaiti Riley asked about neural models of timing and phase. I would
suggest the extensive recent literature on optic flow in the computational
vision conferences and journals. You may want to reduce the techniques
to one spatial dimension for NN experimentation, but the mathematics
of spatiotemporal derivatives should be the same.
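(For reference, the brightness-constancy constraint behind most optic-flow
methods is I_x u + I_y v + I_t = 0, where u and v are image velocities and
subscripts denote partial derivatives; dropping the y term gives the 1-D
version u = -I_t / I_x, so the same spatiotemporal-derivative estimates
carry over to a one-dimensional NN experiment.)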
A question was asked about Hopfield networks, which reminded me of the
following. Hopfield nets are often hyped as a solution to the traveling
salesman problem. Even allowing for the fact that only approximate
solutions are found, I am not convinced that this is a good approach.
Optimal paths are always nonintersecting loops, but Hopfield networks
can give solutions that cross themselves. Non-NN postprocessing can
correct for this by breaking and reconnecting the crossed links, but
why should this be necessary? Has anyone built planarity into the
constraint functions that drive the network? If this can't be done
elegantly, shouldn't we prefer Karmarkar's linear programming approach
or search for some other feedback solution that is more flexible?
(Beam search is an AI technique that permits multiple solutions to
compete, whereas Hopfield networks can only track one trajectory at
a time. Perhaps the ability to consider multiple sets of solutions
is critical.)
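For concreteness, here is a minimal sketch (Python/NumPy, purely
illustrative; the function name and constants are assumptions, with the
penalty weights of the order used in the 1985 Hopfield-Tank paper) of the
kind of network being criticized. V[x, i] is the activation of the unit
asserting "city x is visited at tour position i":

    import numpy as np

    def hopfield_tsp(dist, A=500.0, B=500.0, C=200.0, D=500.0,
                     steps=2000, dt=1e-5, seed=0):
        """dist: symmetric (n, n) city-distance matrix, zero diagonal.
        Returns the (n, n) activations V; ideally a permutation matrix.
        Step size and iteration count will need tuning per instance."""
        n = dist.shape[0]
        rng = np.random.default_rng(seed)
        u = rng.uniform(-0.1, 0.1, size=(n, n))      # unit input potentials
        for _ in range(steps):
            V = 0.5 * (1.0 + np.tanh(u / 0.02))      # sigmoid activations
            row = V.sum(axis=1, keepdims=True) - V   # each city: one position
            col = V.sum(axis=0, keepdims=True) - V   # each position: one city
            glob = V.sum() - n                       # exactly n units active
            # tour-length term: couple each city to the cities placed at
            # the adjacent (cyclic) tour positions
            Vadj = np.roll(V, -1, axis=1) + np.roll(V, 1, axis=1)
            u += dt * (-u - A * row - B * col - C * glob - D * (dist @ Vadj))
        return 0.5 * (1.0 + np.tanh(u / 0.02))

The point to notice is exactly the one above: every term in the gradient
enforces either the permutation constraints or short total length; nothing
penalizes a tour that crosses itself, so planarity would have to be added
as a further constraint term, if it can be expressed in this quadratic
form at all.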
Julian Dow suggested that Neuronal and Neural systems be distinguished.
Unfortunately, the meanings were reversed from those already used by
the neuroscientists. I refer specifically to an article in the new
Daedalus issue by Jacob T. Schwartz, which mentions Neural systems
as the biological ones and Neuronal as the artificial ones.
While I'm on the subject, it turns out that the Daedalus AI issue
is really more of a Connectionist issue. Each paper deals with
either the potential of neuronal systems or the history of the
symbolic vs. holistic split in AI -- all written in philosophical
rather than engineering style. I particularly enjoyed Hillis'
argument that each human brain requires only a gigabit of memory
-- about the size of a Connection Machine! -- and his discussion
of the emergent properties of water. I haven't quite finished
the journal yet, but I recommend it to this audience. You can
get copies for $5 (plus $1 outside the U.S.) from
daedalus%amcad.uucp@husc6.harvard.edu or from
DAEDALUS Business Office
P.O. Box 515
Canton, MA 02021
-- Ken
------------------------------
Date: 27 Jan 88 15:25:14 GMT
From: Dave Hampton <tikal!phred!daveh@beaver.cs.washington.edu>
Subject: Re: Commercial Neural Nets
Regarding Commercial Neural Net Packages:
At the AAAI convention last summer, I ordered a copy of a neural
networks simulator for the IBM-PC called NeuralWorks Professional,
from NeuralWare, Inc. The package cost just over $100 at the
time, although I believe that it's selling for about $400 now.
It consists of NWorks, a Neural Network construction and simulation
environment, and two demonstration packages, Networks I and II.
The distribution of the programs was delayed, and I received the
original disks in October. It didn't work at all. A follow-up
(Version 1.01) arrived in December at no cost, and I have been
able to install it and get some of the simple networks running.
I haven't been able to explore the package thoroughly yet, but it
seems complete and is worth considering.
Contact: Casimir "Casey" Klimasauskas, President
NeuralWare, Inc.
103 Buckskin Court
Sewickley, PA 15143
(412) 741-5959
------------------------------
From: @C.CS.CMU.EDU:JOSE@FLASH.BELLCORE.COM
Subject: Announcing a Connectionist/Neuroscience symposium
Connectionist Modeling and Brain Function:
The Developing Interface
February 25-26, 1988
Princeton University
Lewis Thomas Auditorium
This symposium explores the interface between connectionist modeling
and neuroscience by bringing together pairs of collaborating speakers
or researchers working on related problems. Each set of speakers will
provide a sample of the kind of successful interaction characterizing
the rapidly developing field of computational modeling of brain function.
Thursday -- Associative Memory and Learning

 9:00 am  Introductory Remarks
          Professor G. A. Miller

 9:15 am  Olfactory Process and Associative Memory: Cellular and
          Modeling Studies
          Professor A. Gelperin
          AT&T Bell Laboratories / Princeton University

10:30 am  Simple Neural Models of Classical Conditioning
          Dr. G. Tesauro
          Center for Complex Systems Research

Noon      Lunch

 1:30 pm  Brain Rhythms and Network Memories: I. Rhythms Drive
          Synaptic Change
          Professor G. Lynch
          University of California, Irvine

 3:00 pm  Brain Rhythms and Network Memories: II. Rhythms Encode
          Memory Hierarchies
          Professor R. Granger
          University of California, Irvine

 4:30 pm  General Discussion

 5:30 pm  Reception, Green Hall, Langfeld Lounge

Friday -- Sensory Development and Plasticity

 9:00 am  Preliminaries and Announcements

 9:15 am  Role of Neural Activity in the Development of the Central
          Visual System: Phenomena, Possible Mechanisms and a Model
          Professor Michael P. Stryker
          University of California, San Francisco

10:30 am  Towards an Organizing Principle for a Perceptual Network
          Dr. R. Linsker, Ph.D., M.D.
          IBM Watson Research Lab

Noon      Lunch

 1:30 pm  Biological Constraints on a Dynamic Network: Somatosensory
          Nervous System
          Dr. T. Allard
          University of California, San Francisco

 3:00 pm  Computer Simulation of Representational Plasticity in
          Somatosensory Cortical Maps
          Professor Lief H. Finkel
          Rockefeller University / The Neuroscience Institute

 4:30 pm  General Discussion

 5:30 pm  Reception, Green Hall, Langfeld Lounge
Organizers:
    Stephen J. Hanson
    Carl R. Olson
    George A. Miller, Princeton U.

Sponsored by:
    Bellcore & Department of Psychology
    Princeton U. Cognitive Science Laboratory
    Princeton U. Human Information Processing Group
Travel Information
Princeton is located in central New Jersey, approximately 50 miles
southwest of New York City and 45 miles northeast of Philadelphia. To
reach Princeton by public transportation, one usually travels through
one of these cities. We recommend the following routes:
By Car
From NEW YORK -- New Jersey Turnpike to Exit #9, New Brunswick; Route
18 West (approximately 1 mile) to U.S. Route #1 South, Trenton. From
PHILADELPHIA -- Interstate 95 to U.S. Route #1 North. From
WASHINGTON -- New Jersey Turnpike to Exit #8, Hightstown; Route 571.
Princeton University is located one mile west of U.S. Route #1. It
can be reached via Washington Road, which crosses U.S. Route #1 at the
Penns Neck Intersection.
By Train
Take Amtrak or New Jersey Transit train to Princeton Junction, from
which you can ride the shuttle train (known locally as the "Dinky")
into Princeton. Please consult the Campus Map below for directions on
walking to Lewis Thomas Hall from the Dinky Station.
For any further information concerning the conference please
contact our conference planner:
Ms. Shari Landes
Psychology Department
Princeton University, Princeton, NJ 08540
Phone: 609-452-4663
Elec. Mail: shari@mind.princeton.edu
------------------------------
Date: Thu, 28 Jan 88 08:28:44 CST
From: @RELAY.CS.NET:UNICORN!LUSE@NOSC.MIL
Subject: Seminar
ACM SIGANS
presents
"Economic Prediction
using
Neural Networks"
Dr. Halbert White
Professor of Economics at UCSD
Tuesday February 23
6-8 pm
General Dynamics
CRA Pavilion
For more information call Dave Holden at 592-5026. The GD CRA
Pavilion is located in Missile Park, just east of Route 163 off
Clairemont Mesa Blvd. (Thomas Bros. map page 45, F6.)
------------------------------
Date: Wed 20 Jan 88 12:15:45-CST
From: Jim Anderson <ANDERSON%MAXIMILLION.CP.MCC.COM@mcc.com>
Subject: Tech report available...
MCC-EI-287-87
Neural Networks and NP-complete Optimization Problems:
A Performance Study on the Graph Bisection Problem
Carsten Peterson and James R. Anderson
Microelectronics and Computer Technology Corporation
3500 West Balcones Center Drive
Austin, TX 78759-6509
Abstract:
The performance of a mean field theory (MFT) neural network technique for
finding approximate solutions to optimization problems is investigated for
the case of the minimum cut graph bisection problem, which is NP-complete.
We address the issues of solution quality, programming complexity, convergence
times and scalability. Both standard random graphs and more structured
geometric graphs are considered. We find very encouraging results for all
these aspects for bisection of graphs with sizes ranging from 20 to 2000
vertices. Solution quality appears to be competitive with other methods,
and the effort required to apply the MFT method is minimal. Although the
MFT neural network approach is inherently a parallel method, we find that
the MFT algorithm executes in less time than other approaches even when it
is simulated in a serial manner.
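For the curious, the mean-field recipe is compact. The sketch below is
illustrative only, not the MCC implementation; the function name, annealing
schedule, synchronous updates, and choice of the balance weight alpha are
all assumptions. Each vertex gets a "soft spin" v_i in (-1, 1); the signs
at the end assign the two halves:

    import numpy as np

    def mft_bisect(W, alpha, T0=2.0, Tmin=0.05, cool=0.95, sweeps=30, seed=0):
        """W: symmetric 0/1 adjacency matrix; alpha: balance-penalty weight.
        Returns a +/-1 label per vertex (the two halves of the bisection)."""
        n = W.shape[0]
        rng = np.random.default_rng(seed)
        v = rng.uniform(-0.1, 0.1, size=n)   # soft spins, near-zero start
        T = T0
        while T > Tmin:
            for _ in range(sweeps):
                # local field: pull toward neighbors, push against imbalance
                h = W @ v - alpha * (v.sum() - v)
                v = np.tanh(h / T)           # mean-field update at temp. T
            T *= cool                        # anneal the temperature
        return np.sign(v)

This approximately minimizes E = -1/2 sum_ij W_ij s_i s_j +
alpha/2 (sum_i s_i)^2, i.e. cut size plus a balance penalty, which is one
standard way to cast minimum-cut bisection as a spin problem. Tuning of
alpha (e.g. something like the average degree, W.sum()/n) and of the
cooling schedule is graph-dependent; the performance claims in the
abstract refer to the authors' own implementation, not this sketch.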
---------------------------------------------------------------------------
Requests for copies should include your name and postal address.
Please send requests to HINER@MCC.COM.
------------------------------
Date: Tue, 26 Jan 88 11:00:35 CST
From: @C.CS.CMU.EDU:CECI@BOULDER.COLORADO.EDU
Subject: Re: Reading list suggestions?
Here's a suggestion. It's a short work, mostly conceptual, and it concerns
some of the problems connectionist models can be expected to encounter
when trying to do natural language processing.
David L. Waltz. Connectionist Models: Not Just a Notational
Variant, Not a Panacea.
Abstract:
Connectionist models inherently include features and exhibit
behaviors which are difficult to achieve with traditional logic-
based models. Among the more important characteristics are:
(1) the ability to compute nearest match rather than requiring
unification or exact match; (2) learning; (3) fault tolerance
through the integration of overlapping modules, each of which
may be incomplete or fallible; and (4) the possibility of scaling
up such systems by many orders of magnitude, to operate more
rapidly or to handle much larger problems, or both. However,
it is unlikely that connectionist models will be able to learn
all of language from experience, because it is unlikely that a
full cognitive system could be built via learning from an initially
random network; any successful large-scale connectionist learning
system will have to be to some degree "genetically" prewired.
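Point (1) of the abstract is easy to make concrete. The following toy
sketch (illustrative, not Waltz's) scores a probe against stored prototype
patterns by inner product and lets the best-scoring unit win, so a noisy
input still retrieves the closest stored pattern instead of failing for
want of an exact match:

    import numpy as np

    # Two stored +/-1 feature patterns ("prototypes").
    prototypes = np.array([[1, -1, 1, -1, 1, -1],
                           [1, 1, 1, -1, -1, -1]], dtype=float)

    def nearest_match(probe):
        """Winner-take-all over inner-product scores."""
        scores = prototypes @ np.asarray(probe, dtype=float)
        return int(np.argmax(scores))        # index of closest prototype

    # A probe with one bit flipped still maps to prototype 0.
    print(nearest_match([1, -1, 1, -1, 1, 1]))   # -> 0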
The paper is only seven pages long, including references, so you can get an
idea of the depth of analysis. It is, however, very clearly written
and sets out the major obstacles connectionist language learning will
have to overcome. Unfortunately, I don't have the complete source citation;
it's a technical report, but I don't remember who put it out. Waltz
listed his credentials as "Thinking Machines Corporation and Brandeis
University," so perhaps one of those two institutions will know the exact
citation.
I'd be very interested in the reading list you compile.
Cheers,
Lou Ceci
Dept. of Journalism and Mass Communications
Univ. of Northern Colorado
Greeley, CO 80639
(303) 351-2726.
home: 3065 30th St. #6
Boulder, CO 80301
(303) 449-7839
------------------------------
Date: Fri, 29 Jan 88 23:49:42 pst
From: "Andrew J. Worth" <worth@iris.ucdavis.edu>
Subject: Fault Tolerance & Neural Networks
I would like to thank those who responded to my query and re-post this
query for information on:
- the inherent fault-tolerance in neural networks
- determining the fault-tolerance capabilities of neural networks
- increasing fault tolerance in neural networks
- using neural networks for traditional fault tolerance applications
Results of this query so far follow. Due to address problems, I am
re-posting this request for information. Anyone with additional
information on the above subjects is encouraged to respond via one of
my addresses given below. Thanks in advance.
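Before the replies, one toy illustration of the first item (a sketch for
orientation, not taken from any of the replies below). Because a Hebbian
associative memory stores every pattern across all weights, cutting a
random fraction of the connections degrades recall gradually rather than
abruptly:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 64, 4                                # 64-bit patterns, 4 memories
    P = rng.choice([-1.0, 1.0], size=(m, n))    # random stored patterns
    W = P.T @ P / m                             # Hebbian outer-product weights
    np.fill_diagonal(W, 0.0)

    for frac in (0.0, 0.2, 0.4, 0.6):
        mask = rng.random((n, n)) > frac        # cut a fraction of weights
        recalled = np.sign((W * mask) @ P[0])   # one-step recall of pattern 0
        print(f"{frac:.0%} weights cut -> "
              f"{(recalled == P[0]).mean():.0%} bits correct")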
-----------------------------------------------------------------------
From: kurt@bach.csg.uiuc.edu (Kurt)
T. Hogg and B. Huberman, "Understanding Biological Computation: Reliable
Learning and Recognition," Proceedings of the National Academy of Sciences,
November 1984, pp. 6871-6875.
-----------------------------------
From: ee.worden@a20.cc.utexas.edu (Sue J. Worden)
C. R. Legendy, "On the scheme by which the human brain
stores information," Math. Biosci., vol. 1, pp. 555-597, 1967.
J. J. Hopfield, "Neural networks and physical systems with
emergent collective computational abilities," Proc. Natl.
Acad. Sci. USA, vol. 79, no. 8, pp. 2554-2558, 1982.
J. A. Anderson, "Cognitive and psychological computation
with neural models," IEEE Trans. Syst., Man, Cybern., vol. SMC-13,
no. 5, pp. 799-815, 1983.
S. S. Venkatesh, "Epsilon capacity of neural networks,"
Neural Networks for Computing, AIP Conf. Proc. 151, J. S. Denker,
ed., pp. 440-445, 1986.
OTHER POSSIBILITIES:
-----------------------------------
see PDP ch. 12, p. 472; ch. 7, p. 303; and ch. 22, p. 413.
PDP = Rumelhart, D., and McClelland, J., Parallel Distributed
Processing: Explorations in the Microstructure of Cognition,
Volumes 1 and 2, Bradford Books/MIT Press, Cambridge, 1986.
-----------------------------------
A mention of using Hopfield nets for FT applications:
correcting serial transmissions? (see Lippmann, Richard P.,
"An Introduction to Computing with Neural Nets," IEEE ASSP
Magazine, April 1987, p. 8.)
-----------------------------------
Simpson references Cottrell about graceful degradation of brains:
Cottrell, G. and Small, S., "Viewing Parsing as Word Sense
Discrimination: A Connectionist Approach," Computational Models
of Natural Language Processing, Bara, B. and Guida, G. (Eds.),
Elsevier Science Publishers B.V.: North-Holland (1984).
ONGOING RESEARCH:
-----------------------------------
Sue J. Worden, U. Texas, Austin.
-fault tolerance of a neural network architecture based on
compacta theory (see ref. above to C. R. Legendy)
Michael J. Carter, U. New Hampshire.
-a quantitative theory of fault tolerance for neural networks,
initially for multi-layer perceptrons.
-----------------------------------------------------------------------
-Andy "everyone just says they are fault tolerant and that's all"
worth@iris.ucdavis.edu
worth%iris.ucdavis.edu@relay.cs.net
worth@clover.ucdavis.edu
1421 H Street Apt 4, Davis, CA, 95616-1128
(916) 753-9910
------------------------------
End of NEURON-Digest
********************