Neuron Digest	Thursday, 13 Oct 1988		Volume 4 : Issue 14 

Today's Topics:
Anza neuronal coprocessor
Intelligence / Consciousness Test for Machines (Neural-Nets)???
Neural Networks for Temporal Recognition
Occam
discrete vs. analog weights
Re: Rumelhart's generalized cost metric.
Technical Report Announcement
Washington (DC) Neural Network Society
Neural Computation Subscriptions


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"

------------------------------------------------------------

Subject: Anza neuronal coprocessor
From: daniel@vmucnam.UUCP (Daniel Lippmann)
Organization: C.N.A.M, Paris, France
Date: 05 Oct 88 16:10:07 +0000

Does anyone know details about the neural coprocessor called ANZA Plus
for Sun/VMEbus (hardware and software)? All I know is the name of the
manufacturer:
-Hecht-Nielsen Neurocomputers (HNC)
But I have no idea of their location or of their postal or electronic
address.
Any help will be welcome.
daniel

--
daniel ( ...!mcvax!inria!vmucnam!daniel)
----------------------------------------------
C.N.A.M Laboratoire d'Informatique          Daniel Lippmann
292 rue Saint-Martin, 75141 Paris cedex 3, FRANCE

------------------------------

Subject: Intelligence / Consciousness Test for Machines (Neural-Nets)???
From: mician@usfvax2.EDU (Rudy Mician)
Organization: University of South Florida at Tampa
Date: 05 Oct 88 17:49:38 +0000


I have a question that I know has been addressed in the past (and undoubtedly
continues to be addressed):

When can a machine be considered a conscious entity?

For instance, if a massive neural net were to start from a stochastic state
and learn to interact with its environment in the same way that people do
(interact, not think), how could one tell that such a machine thinks or
exists (in the same sense as Descartes's "COGITO ERGO SUM" / "DUBITO ERGO
SUM" argument)? That is, how could one tell whether or not an "I" exists
for the machine?

Furthermore, would such a machine have to be "creative"? And if so, how would
we measure the machine's creativity?

I suspect that the Turing Test is no longer an adequate means of judging
whether or not a machine is intelligent.


If anyone has any ideas, comments, or insights into the above questions or any
questions that might be raised by them, please don't hesitate to reply.

Thanks for any help,

Rudy


--

Rudy Mician mician@usfvax2.usf.edu
Usenet: ...!{ihnp4, cbatt}!codas!usfvax2!mician

------------------------------

Subject: Neural Networks for Temporal Recognition
From: Andreas Herz <BY9%DHDURZ1.BITNET@CUNYVM.CUNY.EDU>
Date: Thu, 06 Oct 88 17:14:27 +0000

I just read your message concerning temporal recognition; we have done some
work on that problem:

Neural Networks for Temporal Recognition
Within a generalized Hopfield scheme it is possible to learn and represent
spatio-temporal objects. The representability of such objects is guaranteed
by a broad distribution of signal delays within the network. Learning is
achieved by the Hebb rule. Our numerical and analytical results clearly
show the robustness of the model, for sequential as well as for parallel
update, both for deterministic and stochastic dynamics. This work will be
published in Europhysics Letters 6 (1988); biological implications are
pointed out in an extended version, submitted to Biol. Cybern.
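
As a rough illustration of the mechanism described above, here is a
minimal sketch in Python/NumPy of Hebbian sequence storage in a
Hopfield-style net with transmission delays. The one-coupling-matrix-
per-delay setup, the variable names, and the retrieval loop are my own
simplifications for illustration, not the authors' actual model.

import numpy as np

rng = np.random.default_rng(0)
N, T, D = 64, 10, 3                      # units, sequence length, max delay
seq = rng.choice([-1, 1], size=(T, N))   # bipolar spatio-temporal pattern

# One coupling matrix per delay tau = 1..D, a crude stand-in for a
# "broad distribution of signal delays" within the network.
J = np.zeros((D, N, N))

# Hebb rule: correlate the current state with the state tau steps back,
# so each delay line stores the transitions it can "see".
for t in range(T):
    for tau in range(1, D + 1):
        J[tau - 1] += np.outer(seq[t], seq[(t - tau) % T]) / N

# Deterministic parallel update: cue with the first D states of the
# sequence, then let the delayed fields replay the rest.
history = [seq[t].copy() for t in range(D)]
for _ in range(2 * T):
    h = sum(J[tau - 1] @ history[-tau] for tau in range(1, D + 1))
    history.append(np.where(h >= 0, 1, -1))

# Overlap with the pattern the sequence should have reached by now
# (close to 1.0 means the net replays the stored sequence).
m = history[-1] @ seq[(len(history) - 1) % T] / N
print(f"final overlap: {m:.2f}")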

Andreas Herz

------------------------------

Subject: Occam
From: dave@umbc3.UMD.EDU (David A Freeman)
Organization: University of Maryland, Baltimore County
Date: 07 Oct 88 00:02:01 +0000



Does anyone know of a neural net written in Occam?
(Occam is a parallel programming language for transputers.)


Thanks to all who reply,
dave


dave@umbc3.umd.edu
dave@umbc2 :bitnet
dave@umbc1 :bitnet

------------------------------

Subject: discrete vs. analog weights
From: glockner%cs@ucsd.edu (Alexander Glockner)
Date: Fri, 07 Oct 88 14:44:43 -0700


A recent dissertation which discussed, among other topics, the discrete vs.
analog difference is:

Tomlinson, M. S. "Implementing Neural Networks" Ph.D.
dissertation, University of California San Diego, 1988.

I'm told that copies of this are available only through University
Microfilms (those !@#$ in Ann Arbor who charge an arm and a leg). Dr.
Tomlinson himself can be reached at stan%cs@ucsd.edu.

Alexander Glockner


------------------------------

Subject: Re: Rumelhart's generalized cost metric.
From: Matt Heffron <BEC.HEFFRON@ECLA.USC.EDU>
Date: Fri, 07 Oct 88 18:33:49 -0700

I did write it down... (I hope I wrote it correctly :-) )

The generalized cost function is:

cost = L * <the-regular-sum-squared-error>
       + (1 - L) * (K1 * <cost-per-weight> + K2 * <cost-per-unit>)

<the-regular-sum-squared-error> is the error metric for backprop that we
all know and love.

                        w_ij^2
<cost-per-weight> = --------------
                     1 + w_ij^2

and

                         w_ij^2
<cost-per-unit> = --------------------
                   1 + sum_k (w_ik^2)

The K1 and K2 are for weighting the relative importance of reducing weights
vs. units. (therefore: K1+K2=1.0)

The L is for trading off accuracy of the answer vs. complexity.

Normally, L is started at (approximately) 1.0 and reduced by simulated
annealing during learning: as the net gets good at minimizing the
sum-of-squares error, reduce L; if the complexity term then increases the
cost too much, increase L again.
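
For concreteness, here is a minimal sketch of this cost in Python/NumPy.
It assumes the two complexity terms are summed over all the weights of a
single matrix W; the names and the toy schedule for L are illustrative,
not Rumelhart's actual annealing procedure.

import numpy as np

def generalized_cost(sse, W, L, K1=0.5, K2=0.5):
    # W[i, k] is the weight from unit k into unit i, so the sum over k
    # in <cost-per-unit> is a row sum of W squared.
    w2 = W ** 2
    cost_per_weight = np.sum(w2 / (1.0 + w2))
    cost_per_unit = np.sum(w2 / (1.0 + w2.sum(axis=1, keepdims=True)))
    return L * sse + (1.0 - L) * (K1 * cost_per_weight + K2 * cost_per_unit)

# Toy schedule for L: start near 1.0 and lower it as the error term
# shrinks, so complexity reduction gradually takes over.
W = np.random.default_rng(1).normal(size=(4, 8))
L = 0.99
for epoch in range(5):
    sse = 1.0 / (epoch + 1)       # stand-in for the real training error
    print(epoch, round(generalized_cost(sse, W, L), 4))
    if sse < 0.5:
        L = max(0.5, L - 0.1)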

-Matt Heffron BEC.HEFFRON@ECLA.USC.EDU

P.S. Any terms above that aren't explained are ones whose definitions I
didn't write down. I hope they are obvious to someone, because at 6:30pm
on a Friday they aren't to me. -MH
-------

------------------------------

Subject: Technical Report Announcement
From: <ANDERSON%BROWNCOG.BITNET%MITVMA.MIT.EDU@MITVMA.MIT.EDU>
Date: 05 Oct 88 15:23:05 -0800


A technical report is available from the Brown University
Department of Cognitive and Linguistic Sciences:

Technical Report 88-01
Department of Cognitive and Linguistic Sciences
Brown University

Representing Simple Arithmetic in Neural Networks

Susan R. Viscuso, James A. Anderson and Kathryn T. Spoehr

This report discusses neural network models of qualitative
multiplication. We review past research in magnitude representation
and cognitive arithmetic. We then develop a framework for
building neural network models that exhibit behaviors that
mimic the empirical results. The simulations show that neural
net models can carry out qualitative multiplication given
an adequate representation of magnitude information. It is
possible to model a number of interesting psychological
effects such as associative interference, practice effects,
and the symbolic distance effect. However, this set of
simulations clearly shows that neural networks are not
satisfactory as devices for doing accurate arithmetic. It is
possible to spend many hours of supercomputer CPU time teaching
multiplication to a network, and still have a system that makes
many errors. If, however, instead of accuracy we view this
simulation as developing a very simple kind of `number sense,'
with the formation and use of internal representations
of sizes of numbers, then the simulation is more interesting.
When real mathematicians and real physicists think about
mathematics and physics, they rarely use logic or formal
reasoning, but use past experience and their intuitive understanding
of the complex systems they work on. We suspect a useful
goal for network models may be to develop similar qualitative
intuition in complex problem solving domains.

This Technical Report can be obtained by sending an email
message to Anderson@BROWNCOG (BITNET) or a request to:

James A. Anderson
Department of Cognitive and Linguistic Sciences
Box 1978
Brown University
Providence, RI 02912

Make sure you give your mailing address in your message.

------------------------------

Subject: Washington (DC) Neural Network Society
From: will@ida.org (Craig Will)
Date: Thu, 06 Oct 88 14:40:26 -0400



Washington DC - Virginia - Maryland Area

Washington Neural Network Society


The Washington Neural Network Society (WNNS) is a local group for those
interested in neural networks. WNNS sponsors monthly lectures and
publishes a journal, Neural Network Review, which has a national and
international audience.

Upcoming lectures scheduled include the following:

October 12: Fred Weingard, Booz Allen Hamilton
"Neural Networks: Overview and Applications"
7 pm at Contel in Fairfax, VA
Call Billie Stelzner at (703) 359-7685 for info.

November 14: C. Lee Giles, Air Force Office of
Scientific Research, Washington, DC.
"High Order Neural Networks", evening
At MITRE in McLean, Virginia

Also of interest is a talk by John Cater sponsored by
the IEEE local signal processing chapter:
October 19: John Cater, Digital Signal Corp.
"Pulse Frequency Activated Neural Networks"
7:30 pm at George Mason University, Fairfax, VA.
Student Union, Rooms 3&4.

To be put on the ARPANET mailing list to receive announcements of such
events, please send a message with your ARPANET address to Craig Will at
will@ida.org. To be put on the mailing list to receive physical
announcements of such events, please send your name and US postal address
in a message to Craig Will at will@ida.org, or write to Washington Neural
Network Society, P.O. Box 427, Dunn Loring, VA 22027.

To receive an information sheet describing Neural Network Review, a
review journal distributed nationally and internationally (quarterly,
$24.00 for 4 issues), please send your name and US postal address in a
message or US mail to the above ARPA or US postal address.

-- Craig Will

------------------------------

Subject: Neural Computation Subscriptions
From: Terry Sejnowski <terry@cs.jhu.edu>
Date: Thu, 06 Oct 88 17:03:55 -0400

Subscriptions to Neural Computation are available for:

$45.00 Individual
$90.00 Institution
(add $9.00 surface mail or $17.00 postage for outside US and Canada)

Available from:

MIT Press Journals
55 Hayward Street
Cambridge, MA 02142

(617) 253 2889 for credit card orders

Terry

-----

------------------------------

End of Neuron Digest
*********************
