Neuron Digest	Sunday,  9 Oct 1988		Volume 4 : Issue 12 

Today's Topics:
Re: Excaliber's Savvy
What is MX-1/16?
Pulse coded neural networks
PDP under Turbo C
Commercial Uses of Neural Nets: Survey Summary
Washington Neural Network Society Meeting Announcement
Hector Sussmann to speak on formal analysis of Boltzmann Machine
Josh Alspector to speak on a neuromorphic learning chip

Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"

------------------------------------------------------------

Subject: Re: Excaliber's Savvy
From: bstev@pnet12.cts.com (Barry Stevens)
Organization: People-Net [pnet12], Del Mar, Ca.
Date: 01 Oct 88 14:56:24 +0000

With reference to Savvy:

Die-hard neural network purists say that Savvy has nothing to do with
neural nets. The explanation I got was that the system has no "connection"
to recognized neural network techniques, such as back propagation.

I asked the original author of the program what was under the covers. He
wouldn't answer me directly, but suggested that I look up papers on
something called "n-tuple pattern recognition", and that they were, among
other places, found in the Sandia Labs library. I called Sandia, but since
I'm not an employee, I couldn't get a copy.

I'd like to do some reading on this. Does anyone know where I can find
papers on "n-tuple pattern recognition"?

I have used Savvy. It has a very friendly front end that becomes as
smart about English as the synonyms you build into it, an extremely
powerful parsing capability that has let me experiment with extracting
production rules from text, and very fast nearest-neighbor
classification of the words you want to look up. Whether or not this
qualifies as a neural network capability...
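For reference, here is a minimal sketch of classical n-tuple pattern
recognition (the Bledsoe-Browning / WISARD family of methods), offered
purely as an illustration of the technique being asked about, not as a
claim about what is actually under Savvy's covers. Each class owns a
bank of lookup tables; every table samples the same fixed, randomly
chosen n-tuple of input bits, remembers during training which tuple
values it has seen for that class, and at classification time the class
whose tables recognize the most tuple values wins. All sizes below are
arbitrary illustrative choices.

    /* ntuple.c -- sketch of n-tuple pattern recognition (illustration only).
       A pattern is an array of NBITS ints, each 0 or 1.                      */
    #include <stdlib.h>

    #define NBITS    64                 /* bits in a binary input pattern */
    #define NTUPLE   4                  /* bits sampled per tuple         */
    #define NTABLES  (NBITS / NTUPLE)   /* one lookup table per tuple     */
    #define NCLASSES 2

    static int map[NTABLES][NTUPLE];    /* which input bit feeds each tuple slot */
    static unsigned char mem[NCLASSES][NTABLES][1 << NTUPLE];   /* "RAM" tables  */

    /* Pack the bits of tuple t into a small integer address. */
    static int tuple_value(const int *pat, int t)
    {
        int i, v = 0;
        for (i = 0; i < NTUPLE; i++)
            v = (v << 1) | (pat[map[t][i]] & 1);
        return v;
    }

    /* Randomly, but permanently, assign each input bit to one tuple slot. */
    void ntuple_init(void)
    {
        int i, j, tmp, perm[NBITS];
        for (i = 0; i < NBITS; i++) perm[i] = i;
        for (i = NBITS - 1; i > 0; i--) {
            j = rand() % (i + 1);
            tmp = perm[i]; perm[i] = perm[j]; perm[j] = tmp;
        }
        for (i = 0; i < NBITS; i++) map[i / NTUPLE][i % NTUPLE] = perm[i];
    }

    /* Training: mark every tuple value seen for this class. */
    void ntuple_train(const int *pat, int cls)
    {
        int t;
        for (t = 0; t < NTABLES; t++)
            mem[cls][t][tuple_value(pat, t)] = 1;
    }

    /* Classification: the class whose tables recognize the most tuples wins. */
    int ntuple_classify(const int *pat)
    {
        int c, t, score, best = 0, best_score = -1;
        for (c = 0; c < NCLASSES; c++) {
            score = 0;
            for (t = 0; t < NTABLES; t++)
                score += mem[c][t][tuple_value(pat, t)];
            if (score > best_score) { best_score = score; best = c; }
        }
        return best;
    }

Training is a single pass with no arithmetic beyond table indexing,
which is one reason the method was attractive for fast lookup long
before back propagation became popular.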

UUCP: {crash ncr-sd}!pnet12!bstev
ARPA: crash!pnet12!bstev@nosc.mil
INET: bstev@pnet12.cts.com

------------------------------

Subject: What is MX-1/16?
From: ghosh@ut-emx.UUCP (Joydeep Ghosh)
Organization: UTexas Computation Center, Austin, Texas
Date: 03 Oct 88 16:39:35 +0000

The DARPA Executive Summary on Neural Networks mentions a neural network
simulation system called MX-1/16 with a projected storage capacity of 50M
interconnects and processing speed of 120M interconnects/sec.

Could someone shed light on who is building this machine, what network
models it supports, its system architecture, and its stage of development?
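A rough reading of those figures, assuming they describe total storage
and a sustained update rate (which the summary does not make explicit):
one update sweep over a network filling the projected store would take

    50M interconnects / 120M interconnects per second  =  about 0.4 seconds

per full pass.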

Thanks,

Joydeep Ghosh

Internet: ghosh@ece.utexas.edu
University of Texas, Austin
(512)-471-8980

[[ Maybe this is the fabled bee's brain (*smile*). Note that Joydeep's
return address may need to be ghosh%ece.utexas.edu@cs.utexas.edu due to U.
Texas internal rerouting. -PM]]

------------------------------

Subject: Pulse coded neural networks
From: gill@nc.MIT.EDU (Gill Pratt)
Date: Mon, 03 Oct 88 13:21:39 -0500


Our lab is still pursuing research on time interval coded nets,
and we would enjoy corresponding with others doing the same.

Gill Pratt

gill@mc.lcs.mit.edu

------------------------------

Subject: PDP under Turbo C
From: uunet!otago.ac.nz!R_OSHEA
Date: 04 Oct 88 09:50:13 -0800

I am trying to recompile the PDP models under Turbo C version 1.5. I do
not currently have a curses.h header file for Turbo C, and I do not know
Turbo C's graphics primitives well enough to translate the curses header
file I do have. Could anyone be of any assistance here?
Yours
Aaron.V.Cleavin
University Of Otago
P.O. Box 56
New Zealand.

Replies should go via Robert O'Shea

uucp: r_oshea%otago.ac.nz@uunet.uu.net or
...!uunet!otago.ac.nz!r_oshea
internet: r_oshea@otago.ac.nz
csnet: R_OSHEA%OTAGO.AC.NZ@RELAY.CS.NET or
R_OSHEA%OTAGO.AC.NZ%waikato.ac.nz@RELAY.CS.NET

[[ I assume Aaron means Rumelhart & McClelland's PDP Vol3, "Explorations".
I remember a request about Macintosh versions of this simulator software.
Any help on either count from readers? -PM]]
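A possible stopgap, offered only as an untested sketch and not part of
the original request: assuming Turbo C 1.5's conio.h provides the usual
Borland console routines (clrscr, gotoxy, cprintf, putch, highvideo,
normvideo), a tiny curses-compatibility header can cover the handful of
screen calls the PDP simulators are most likely to make.

    /* mini-curses.h -- minimal, hypothetical curses shim for Turbo C.
       Only the calls listed here are covered; the PDP sources will
       almost certainly need more.                                     */
    #include <conio.h>

    #define initscr()   clrscr()                  /* start with a clear screen     */
    #define endwin()                              /* nothing to shut down          */
    #define clear()     clrscr()
    #define erase()     clrscr()
    #define refresh()                             /* conio writes the screen now   */
    #define move(y,x)   gotoxy((x) + 1, (y) + 1)  /* curses 0-based, conio 1-based */
    #define addch(c)    putch(c)
    #define addstr(s)   cprintf("%s", (s))
    #define standout()  highvideo()
    #define standend()  normvideo()
    /* getch() already exists in conio.h with compatible behavior. */

Anything the shim does not cover (windowing, printw, input modes and so
on) would still have to be translated by hand against the Turbo C
routines.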

------------------------------

Subject: Commercial Uses of Neural Nets: Survey Summary
From: bstev@pnet12.cts.com (Barry Stevens)
Organization: People-Net [pnet12], Del Mar, Ca.
Date: 04 Oct 88 22:36:13 +0000

Two weeks ago, I posted a message saying that I had surveyed companies
in search of applications suitable for neural networks, and asking
whether there was any interest. Since then, responses have been coming
back by email almost daily; just over 30 of them so far.

Accordingly, I have prepared a summary of that study for posting in
comp.neural-nets. The summary appears as a comment to this message, and is
approximately 200 lines in length.

The original report can't be released, since it contains some proprietary
material. The summary, however, contains material that is available from
numerous public sources, if one knows where and when to look. To keep the
posting to a reasonable length, I had to exclude all of the technical detail that usually
accompanies discussion about neural networks. I describe the applications
that were identified, and what, if anything, has been done about them.
Period.

Barry Stevens


UUCP: {crash ncr-sd}!pnet12!bstev
ARPA: crash!pnet12!bstev@nosc.mil
INET: bstev@pnet12.cts.com

[[ I've talked to Barry and his background is the proverbial engineer's.
Several issues ago, someone asked what Neural Nets are "good for" and how
they are really being used. Barry found quite a bit. Due to mailer
problems, he couldn't send me his survey directly; I'm taking it from his
USENET posting and it will appear next issue. -PM]]

------------------------------

[[ Editor's Note: I will try to put conference and talk announcements at
the end of the Digest so that folks can parse the contents of the Digest
more easily. Let me know what you think. -PM ]]

Subject: Washington Neural Network Society Meeting Announcement
From: weidlich@ludwig.scc.com (Bob Weidlich)
Date: Thu, 29 Sep 88 23:18:58 -0400

The Washington Neural Network Society

First General Meeting
October 12, 1988 7:00 PM

Speaker: Fred Weingard
Booz, Allen & Hamilton, Inc.
Arlington, Virginia.


Neural Networks: Overview and Applications


Neural networks and neurocomputing provide a novel and promising
alternative to conventional computing and artificial intelligence.
Conventional computing is characterized by the use of algorithms to
solve well-understood problems. Artificial intelligence approaches are
generally characterized by the use of heuristics to obtain good, but
not necessarily best, solutions to problems whose solution steps are
not so well understood. In both approaches, the knowledge
representations or data structures used to solve the problem must be
worked out in advance, and a problem domain expert is essential. These
approaches result in systems that are brittle to unexpected inputs,
cannot adapt to a changing environment, and cannot easily take
advantage of parallel hardware architectures. Neural network systems,
in contrast, can learn to solve a problem by exposure to examples, are
naturally parallel, and are "robust" to novelty. In this talk Fred
Weingard will give a general overview of neural networks that covers
many of the most promising neural network models, and discuss the
application of such models to three difficult real-world problems:
radar signal processing, optimal decision making, and speech
recognition.

Fred Weingard heads the Neural Network Design and Applications Group
at Booz, Allen & Hamilton. Prior to joining Booz, Allen, Mr. Weingard
was a senior intelligence analyst at the Defense Intelligence Agency.
He has degrees in engineering from Cornell University and is
completing his doctorate in computer science / artificial intelligence
at George Washington University.

The meeting will be held in the Contel Plaza Building Auditorium at
Contel Federal Systems in Fairfax, Virginia, at the southwest edge of
the Fair Oaks mall. Directions from the 495 Beltway: take Route 66
westbound (toward Front Royal) and get off at Route 50 heading west
(Exit 15, Dulles/Winchester). Go 1/4 mile on Route 50, then follow the
sign to the "shopping center". Stay in the right lane and merge onto
the service road that circles the shopping center. Take the driveway
from the service road to the Contel building. The address is 12015 Lee
Jackson Highway. The Contel building is across the shopping parking
lot from Lord and Taylor, near Sears. For further information call
Billie Stelzner at (703) 359-7685. Host for the meeting is the
recently-established Contel Technology Center. Dr. Alan Salisbury,
Director of the Technology Center, will present a brief introduction
to the plans for research and application of technology at the Contel
laboratory, including work in artificial intelligence and man-machine
interface design.

Schedule:
7:00 - 7:15 Welcoming (Alan Salisbury)
7:15 - 8:15 Speaker (Fred Weingard)
8:15 - 8:30 Report on Neural Network Society (Craig Will)
8:30 - 9:30 Reception, informal discussion

------------------------------

Subject: Hector Sussmann to speak on formal analysis of Boltzmann Machine
From: pratt@zztop.rutgers.edu (Lorien Y. Pratt)
Organization: Rutgers Univ., New Brunswick, N.J.
Date: Fri, 30 Sep 88 20:51:11 +0000

Fall, 1988
Neural Networks Colloquium Series
at Rutgers

On the theory of Boltzmann Machine Learning
-------------------------------------------

Hector Sussmann
Rutgers University Mathematics Department

Room 705 Hill center, Busch Campus
Friday October 14, 1988 at 11:10 am
Refreshments served before the talk

Abstract

The Boltzmann machine is an algorithm for learning in neural networks,
involving alternation between a ``learning'' and a ``hallucinating'' phase.
In this talk, I will present a Boltzmann machine algorithm for which it
can be proven that, for suitable choices of the parameters, the weights
converge so that the Boltzmann machine correctly classifies all the
training data. This is because the evolution of the weights follows
very closely, with very high probability, an integral trajectory of the
gradient of the likelihood function, whose global maxima are exactly the
desired weight patterns.
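For readers new to the model: the standard Boltzmann machine learning
rule of Ackley, Hinton and Sejnowski (which the abstract alludes to but
does not spell out, and of which the talk may well analyze a variant)
adjusts each weight by the difference between pairwise co-occurrence
statistics gathered in the clamped ("learning") and free-running
("hallucinating") phases, in LaTeX notation

    \Delta w_{ij} = \eta \left( \langle s_i s_j \rangle_{\mathrm{clamped}}
                              - \langle s_i s_j \rangle_{\mathrm{free}} \right)

This difference is, in expectation, proportional to the gradient of the
log of the likelihood function mentioned in the abstract.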
--
-------------------------------------------------------------------
Lorien Y. Pratt                        Computer Science Department
pratt@paul.rutgers.edu                 Rutgers University
                                       Busch Campus
(201) 932-4634                         Piscataway, NJ 08854

------------------------------

Subject: Josh Alspector to speak on a neuromorphic learning chip
From: pratt@zztop.rutgers.edu (Lorien Y. Pratt)
Organization: Rutgers Univ., New Brunswick, N.J.
Date: 04 Oct 88 17:58:21 +0000


Fall, 1988
Neural Networks Colloquium Series
at Rutgers

Electronic Models of Neuromorphic Networks
------------------------------------------

Joshua Alspector
Bellcore, Morristown, NJ 07960

Room 705 Hill center, Busch Campus
Piscataway, NJ

Friday October 21, 1988 at 11:10 am
Refreshments served before the talk


Abstract

We describe how current models of computation in the brain can be
physically implemented using VLSI technology. This includes modeling
of sensory processes, memory, and learning. We have fabricated a test
chip in 2 micron CMOS that can perform supervised learning in a manner
similar to the Boltzmann machine. The chip learns to solve the XOR
problem in a few milliseconds. Patterns can be presented to it at
100,000 per second. We also have demonstrated the capability to do
unsupervised competitive learning as well as supervised learning.
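A rough reading of those figures (an interpretation, not a claim from
the abstract): at 100,000 pattern presentations per second, "a few
milliseconds" of learning corresponds to only

    100,000 patterns/second x 3 ms (say)  =  about 300 pattern presentations

that is, on the order of a few hundred presentations, or several dozen
passes through the four XOR training patterns.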
--
-------------------------------------------------------------------
Lorien Y. Pratt                        Computer Science Department
pratt@paul.rutgers.edu                 Rutgers University
                                       Busch Campus
(201) 932-4634                         Piscataway, NJ 08854

------------------------------

End of Neurons Digest
*********************
