Neuron Digest Tuesday, 5 Sep 1989 Volume 5 : Issue 37
Today's Topics:
Position available at U. of Dortmund
INNS email address
Convergence rate of the Hopfield net?
When does a Hopfield net converge?
Re: When does a Hopfield net converge?
Re: When does a Hopfield net converge?
Re: When does a Hopfield net converge?
Good book on Neural Nets.
Re: Good book on Neural Nets.
Re: Good book on Neural Nets.
Re: Good book on Neural Nets.
Intro to Neural Nets info wanted
Intro to Neural Net texts
Re: looking for work on text `interest score' computation
medical applications of neural nets
NN researchers in Pakistan?
Fuzzy Logic in NN Sources
Re: Fuzzy Logic in NN Sources
Methods of simple object recognition
M&P
New Magazine: NEURAL NETWORK COMPUTING
Re: New Magazine: NEURAL NETWORK COMPUTING
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: Position available at U. of Dortmund
From: UAP001%DDOHRZ11.BITNET@CUNYVM.CUNY.EDU
Date: Tue, 29 Aug 89 09:35:36 +0700
A vacancy exists at the Institut fuer Arbeitsphysiologie
at the University of Dortmund for a
University Professor (C 4)
To be Director of the Department 'Arbeitsphysiologie - Schwerpunkt
Arbeitspsychologie' (successor to Prof. J. Rutenfranz). The Institute is
funded by the State of North Rhine-Westphalia and by the Federal Republic of
Germany. It is mandated to conduct research in basic and applied physiology
for the purpose of the welfare and protection of the working population.
The successful candidate will be appointed professor at the University of
Dortmund and will be seconded to perform research at the Institute. This
research should deal with applied experimental psychology and
psychophysiology, utilizing experimental, epidemiological, and other
empirical methods.
The position can be filled by a scientist with a background in medicine,
physiology, or psychology; the chief requirement is a record of research in
the areas named above. Willingness to collaborate with the other
departments of the Institute (environmental physiology, toxicology,
ergonomics, and sensory and neurophysiology) is a requirement for
appointment.
Applications should be directed by 15 October 1989 to the
Geschaeftsfuehrenden Direktor des Instituts fuer Arbeitsphysiologie an der
Universitaet Dortmund, Ardeystr. 67, D-4600 Dortmund 1, FRG (Current
research in the department deals with problems of night and shift work,
circadian rhythms, the effects of combined stresses, and motivation; but the
incoming director will have considerable scope in developing new areas of
research.)
------------------------------
Subject: INNS email address
From: mcvax!fesa.es!seco@uunet.UU.NET (Jose Gonzalez Seco)
Date: Tue, 29 Aug 89 12:44:42 -0600
Do you know if the INNS (or one of its members) has an email address I can
write to if I want to renew my membership?
Thanks,
Jose Gonzalez-Seco
seco@fesa.es
[[ Editor's note: I assume Jose looked at the masthead of the INNS Journal
and found nothing. I know that AAAI's magazine lists e-mail addresses and
most conference papers do as well. I would like to see similar courtesies
in other publications and journals. -PM ]]
------------------------------
Subject: Convergence rate of the Hopfield net?
From: changei@liberia.crd.ge.com (eric i chang)
Organization: General Electric Corp. R&D, Schenectady, NY
Date: Thu, 24 Aug 89 15:17:19 +0000
I am trying to use the Hopfield net to solve optimization problems,
and I am wondering whether there has been any study of the convergence rate
of the Hopfield net.
Specifically, how does the convergence rate of the Hopfield net
change with the size of the problem? Hopfield and Tank mention that the
net typically converges in a few cycles. But if one increased the
size of the net tenfold, would the convergence time stay the same?
Any comments and references regarding this problem would be
greatly appreciated. I will summarize the responses to the net.
Thank you!
Eric Chang
changei@crd.ge.com
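[[ Editor's note: one empirical way to probe the scaling question is to count
update sweeps to convergence on random symmetric nets of increasing size.
Here is a minimal sketch in Python with NumPy (an anachronistic choice, and
the random-weight setup is this note's assumption, not Eric's actual
optimization problem):

    import numpy as np

    def sweeps_to_converge(n, rng, max_sweeps=1000):
        # Random symmetric weights with a zero diagonal, so convergence of
        # asynchronous updates is guaranteed (see the next thread).
        w = rng.standard_normal((n, n))
        w = (w + w.T) / 2
        np.fill_diagonal(w, 0.0)
        s = rng.choice([-1.0, 1.0], size=n)
        for sweep in range(1, max_sweeps + 1):
            changed = False
            for i in rng.permutation(n):   # asynchronous, random order
                new = 1.0 if w[i] @ s >= 0 else -1.0
                if new != s[i]:
                    s[i], changed = new, True
            if not changed:
                return sweep
        return max_sweeps

    rng = np.random.default_rng(0)
    for n in (50, 100, 500):
        runs = [sweeps_to_converge(n, rng) for _ in range(10)]
        print(n, sum(runs) / len(runs))

Note that sweeps are not wall-clock time: each sweep costs O(n^2) connection
evaluations, so even a flat sweep count means quadratic growth in work. ]]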
------------------------------
Subject: When does a Hopfield net converge?
From: Xinzhi Li AG Richter <mcvax!unido!uklirb!xinzhi@uunet.uu.net>
Organization: University of Kaiserslautern, W-Germany
Date: 10 Jul 89 15:43:16 +0000
When does a Hopfield net converge to a steady state? If it converges, how
many steps does it take to reach the steady state? I tried to answer these
questions using methods of linear algebra (i.e., eigenvalue-related methods),
but I always ran into trouble with the non-linearity caused by the threshold
function. Does anyone know a method to overcome this difficulty? Does anyone
know any theorems in this direction?
I have read an assertion in an article by Lippmann (Apr. 1987 IEEE ASSP) that
a Hopfield net always converges provided the net is symmetric and
completely connected. But I cannot find the proof of this assertion. Does
anyone know the paper proving it? Which methods are used there?
Thank you in advance.
xinzhi@uklirb.uucp
Xinzhi Li, University of Kaiserslautern, West Germany
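[[ Editor's note: for readers joining the thread, the discrete model under
discussion updates one unit at a time by the threshold rule

    $s_i \leftarrow \mathrm{sgn}\Bigl(\sum_j w_{ij}\, s_j - \theta_i\Bigr)$

(standard notation, not Xinzhi's; $w_{ij}$ are the weights, $\theta_i$ the
thresholds, and $s_i = \pm 1$ the unit states). It is precisely the sgn
nonlinearity that defeats a purely linear eigenvalue analysis; the replies
below give the standard convergence argument. ]]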
------------------------------
Subject: Re: When does a Hopfield net converge?
From: artzi@cpsvax.cps.msu.edu (Ytshak Artzi - CPS)
Organization: Michigan State University, Computer Science Department
Date: Tue, 11 Jul 89 11:35:16 +0000
>When does a Hopfield net converge to a steady state?
The Hopfield model is totally unpredictable. Moreover, its behavior depends
on the particular instance of the particular problem you are trying to solve,
which in turn depends on the initial parameters you choose for your equations.
If the parameters are not wisely (??) chosen, the network WON'T converge at all.
Itzik.
------------------------------
Subject: Re: When does a Hopfield net converge?
From: thomae@cslvs5.ncsu.edu (Doug Thomae)
Organization: North Carolina State University
Date: Tue, 11 Jul 89 17:59:58 +0000
> The Hopfield model is totally unpredictable. Moreover, its behavior depends
It has been proven (in the mathematical sense) by Hopfield and Grossberg
(and perhaps others) that Hopfield networks which have 1) symmetric
connections (the weight from neuron i to neuron j is the same as the weight
in the reverse direction) and 2) zero self-connections (the weight from a
neuron to itself is zero) will always converge, in the sense that they will
settle down and not oscillate forever. That does not mean they will settle
into the state desired by the user, just that they will settle into some
state. The basic idea behind the proof is to show that a Lyapunov function
exists for the network, and then to apply a theorem from control theory
which says that if such a function exists, the system is stable. A good
text on control theory has all the gory details. The two papers
usually referenced in the neural net community for all this are:
M.A. Cohen and S. Grossberg, "Absolute Stability of Global Pattern Formation
and Parallel Memory Storage by Competitive Neural Networks", IEEE
Transactions on Systems, Man and Cybernetics, pp. 815-826, 1983
J.J. Hopfield, "Neurons with Graded Response Have Collective Computational
Properties Like Those of Two-State Neurons", Proceedings of the National
Academy of Sciences, Vol. 81, pp. 3088-3092, May 1984
I have heard mention of more general theorems showing that a Hopfield
network will also converge under some conditions when the connections are
not symmetric, but I don't know the reference for them.
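[[ Editor's note: to make the proof sketch concrete, the Lyapunov (energy)
function in question is the standard one,

    $E = -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j + \sum_i \theta_i s_i.$

Writing $h_k = \sum_j w_{kj} s_j - \theta_k$, a flip of unit $k$ under the
threshold rule changes the energy by $\Delta E = 2 s_k h_k$, using symmetry
($w_{ij} = w_{ji}$) and the zero diagonal; and a flip occurs only when
$\mathrm{sgn}(h_k) \neq s_k$, i.e. when $s_k h_k \leq 0$. So $E$ never
increases, and since it takes only finitely many values over the $2^n$
binary states, the flipping must stop after finitely many steps. ]]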
------------------------------
Subject: Re: When does a Hopfield net converge?
From: laird@ptolemy.arc.nasa.gov (Philip Laird)
Organization: NASA Ames Research Center, Moffett Field, CA
Date: Tue, 11 Jul 89 20:37:58 +0000
In article <3376@ncsuvx.ncsu.edu> thomae@cslvs5.UUCP (Doug Thomae) writes:
>... I have heard mention of more general theorems showing that a Hopfield
>network will also converge under some conditions when the connections
>are not symmetric, but I don't know the reference for them.
Convergence (or lack thereof) is rather well understood theoretically. Two
recent references are:
Bruck and Sanz, International Journal of Intelligent Systems, 3, pp. 59-75, 1988.
Bruck and Goodman, IEEE Trans. on Information Theory, IT-34 (Sept. 88).
Phil Laird, NASA Ames Research Center, Moffett Field, CA 94035
(415)-694-3362 LAIRD@PLUTO.ARC.NASA.GOV
------------------------------
Subject: Good book on Neural Nets.
From: aj3u@uvacs.cs.Virginia.EDU (Asim Jalis)
Organization: U.Va. CS Department, Charlottesville, VA
Date: Sat, 19 Aug 89 02:17:14 +0000
I am looking for a good introductory text on Neural Nets. I am especially
interested in the physics-related aspects of the theory. The book should not
unduly emphasize rigor in mathematical proofs and could take an intuitive
approach to the subject. I realize that all of the above guidelines may not
be satisfiable at once; nevertheless, suggestions would be very helpful.
Asim.
[[ Editor's Note: Ah the eternal question! In a future issue, I'll collect
all the "where can I get sowftare" questions as well. -PM ]]
------------------------------
Subject: Re: Good book on Neural Nets.
From: Joseph Brady <brady@LOUIE.UDEL.EDU>
Organization: University of Delaware
Date: 20 Aug 89 23:26:49 +0000
In article (Asim Jalis) writes:
>I am looking for a good introductory text on Neural Nets.
Wasserman's recent book (I believe entitled "Neural Networks" )
is a very good introductory treatment.
[[ Editor's Note: I have also heard that the Wasserman book (citation
unknown to me) is very good, especially for engineers who want a quick
"how-to" survey rather than a theoretical treatise. I just picked up
Anderson & Rosenfeld's "Neurocomputing", which is a lovely historical
compendium of the seminal papers, starting with William James (1890!) through
Hebb (1949) to Sivilotti, Mahowald and Mead (1987); a very good overview,
though not an introduction. Read "Neurocomputing" when you've read one
too many back-prop papers! -PM ]]
------------------------------
Subject: Re: Good book on Neural Nets.
From: krisk@tekigm2.MEN.TEK.COM (Kristine L. Kaliszewski)
Organization: Tektronix Inc., Beaverton, Or.
Date: Wed, 23 Aug 89 20:09:24 +0000
A good intro book is From Neuron To Brain by Kuffler, Nicholls and Martin.
It goes into a lot of background without much math. Other books are written
by Dr. R. MacGregor (I don't remember the titles) from the Univ. of Colorado.
I attended CU and took classes from him and others in the field on this
subject, so let me know if you have any further questions.
Kristine
[[ Editor's Note: Kuffler is *the* best intro to neurobiology I know.
It has nothing about artificial nets, though. -PM ]]
------------------------------
Subject: Re: Good book on Neural Nets.
From: death@neuro.usc.edu
Organization: University of Southern California, Los Angeles, CA
Date: Wed, 30 Aug 89 08:47:21 +0000
>> I am looking for a good introductory text on Neural Nets.
NO ONE BOOK CAN DO THE JOB -- YOU NEED TO READ AND READ AND READ BOOKS,
JOURNAL ARTICLES, REVIEW ARTICLES, CONFERENCE PROCEEDINGS, ETC. BUT A
GOOD STARTING POINT (FOR THE SERIOUS RESEARCHER) FOLLOWS:
Koch, Christof and Idan Segev. Methods in Neuronal Modeling: From Synapses
to Networks. MIT Press:Cambridge 1989. This is an excellent collection of
papers from the Woods Hole Neural Network Modeling Course in August, 1988.
The first five chapters develop a set of mathematical modeling tools that
would help anyone with a minimal background in calculus, linear algebra and
differential equations to begin a serious modeling project.
Rumelhardt, McClelland & PDP Research Group. Parallel Distributed Processing
(and workbook: Explorations in PDP with floppy disks containing source code
in C for the IBM PC) MIT Press:1986. Everyone has read it. It you haven't
you need to read it to be able to talk to everyone else.
MacGregor, Ronald J. Neural and Brain Modeling. Academic Press:1987. This
book has lotsa FORTRAN listings for hackers and such. But serious research
people should use it as an initial summary of past neural network experiments
designed to simulate specific subsystems of the brain. Read the summaries --
THEN READ THE PAPERS REFERENCED IN THE FOOTNOTES of your favorite section.
There are errors in some of the programs ... so recheck the code carefully
... and understand the purpose of every parameter, variable and statement.
Kandel, Eric R. and James H. Schwartz. Principles of Neural Science. 2nd ed.
Elsevier:New York 1985. A solid overview of how the nervous system works.
Hille, Bertil. Ionic Channels of Excitable Membranes. Sinauer Associates,
Inc: Sunderland 1984. Everything you ever wanted to know about how nerves
and synapses work. Essential for building blocks for real neural networks.
Researchers should design abstract neurons for their systems based on the
behaviour of detailed compartmental models of such realistic neurons.
Carpenter, Malcolm B. Human Neuroanatomy. 7th ed. The Williams and Wilkins
Company: Baltimore 1976. Overview of the gross structural divisions of the
nervous system. Pay particular attention to the chapters on the central
nervous system (medulla, pons, mesencephalon, cerebellum, diencephalon,
hypothalamus, basal ganglia, olfaction, hippocampus, amygdala, cerebral ctx).
This book is one of the better arguments for building hierarchical neural
network models in order to capture biological neural network behaviours.
Brodal, A. Neurological Anatomy. 3rd ed. Oxford Univ Press:New York 1981.
Contains many detailed chapters about pathways and information transmission
among major regions of the brain. Read this one to learn a little about the
vast amount already known about information transmission pathways and the
actual functions performed by those pathways in real brains.
Purves, Dale and Jeff W. Lichtman. Principles of Neural Development.
Sinauer Associates Inc:1985. You could learn a lot about how to build an
artificial neural network by studying how natural neural networks develop.
McGeer, Patrick L., Sir John Eccles, and Edith McGeer. Molecular Neurobiology
of the Mammalian Brain. 2nd ed. Plenum Press:1987. There are a lot more
signaling systems in the brain than electrical or chemical synapses. Study
this book to learn about long term wide area signaling systems modulating
natural brain function.
Markowitsch, Hans J. Information Processing in the Brain. Hans Huber
Publishers:Toronto 1988. A good introduction to developing the ability
to critically read and understand papers from real neurophysiology journals
like Experimental Brain Research, Journal of Neuroscience & Neurosurgery.
------------------------------
Subject: Intro to Neural Nets info wanted
From: robertc@cognos.UUCP (Rob Craig)
Organization: Cognos Inc., Ottawa, Canada
Date: Thu, 22 Jun 89 22:48:59 +0000
I am going to be writing an article on neural networks in the near future
and am looking for some background information. I have been following
comp.ai.neural-nets for a while now but have not seen any references to good
introductory-level articles/books on neural nets. I'm not sure how much is
available yet, since it is a relatively new field; however, I would
appreciate any information you could give me. Please respond through e-mail
and I will be happy to summarize what I receive to the net (if there are any
articles available through the net (Usenet/Bitnet/whatevernet), please let
me know).
I have read a bit about a particular NN 'application' called NETtalk.
Any further information on it, or anything similar would also be appreciated
(NETtalk seems like a good illustration of some of the capabilities of NNs).
Thanks, Rob.
| Robert Craig | 3755 Riverside Dr. | UUCP: |
| Cognos Incorporated | P.O. Box 9707, Ottawa, | uunet!mitel!sce! |
| (613) 738-1338 x5150 | Ontario, Canada K1G 3N3 | cognos!robertc |
|The above opinions are my own and do not reflect those of anyone else|
------------------------------
Subject: Intro to Neural Net texts
From: robertc@antares.UUCP (Rob Craig)
Organization: Cognos Inc., Ottawa, Canada
Date: Wed, 12 Jul 89 22:30:18 +0000
A few weeks ago I asked about introductory articles/books on Neural Nets.
Below is a summary of what I received, thanks to all who responded.
AI Expert magazine, 'Neural Network Primer', last year or so (series)
Dr. Dobb's Journal of Software Tools, Jan 89, Apr 87
IEEE ASSP Magazine, 'An Introduction to Computing with Neural Nets',
by Richard P. Lippmann, Apr 87 issue, pp. 4-22
-- "good clear introduction with many references"
IEEE Neural Networks (publication)
'An Introduction to Neural Computing', Neural Networks V1 N1, pp. 3-16,
by Teuvo Kohonen, 1988
'Parallel Distributed Processing: Explorations in the Microstructure of
Cognition', 3 volumes, by D.E. Rumelhart and J.L. McClelland,
MIT Press 1986
-- "THE book, excelent"
'Perceptrons', by M. Minsky and S. Papert, MIT Press 1969; contains
criticism of single-layer networks
'Real Brains - Artificial Minds', by J. Casti and A. Karlqvist, Elsevier
Science Publishing Company, New York, 1987
| Robert Craig | 3755 Riverside Dr. | UUCP: |
| Cognos Incorporated | P.O. Box 9707, Ottawa, | uunet!mitel!sce! |
| (613) 738-1338 x5150 | Ontario, Canada K1G 3N3 | cognos!robertc |
|The above opinions are my own and do not reflect those of anyone else|
------------------------------
Subject: Re: looking for work on text `interest score' computation
From: david@banzai.UUCP (David Beutel)
Organization: People's Computer Company, Williston, VT
Date: Wed, 26 Jul 89 15:31:19 +0000
In article eric@snark.uu.net (Eric S. Raymond) writes:
>Please help me save USENET from uselessness! ;-)
>
>I'm looking for methods for filtering news articles for `interest level'
>according to preferences expressed by the user, sort of a generalization
>of the kill-files concept.
>
[...]
>I would like to hear about any such work, whether statistical or knowledge-
>based and *no matter how slow the method is or what language it's in!*
I think the statistical approach would be best: given the complexity of
natural language, a knowledge-based system couldn't hope to understand a
news article, so there is no win for the extra complexity of programming a
logical system.
Specifically, I think a neural network of some type would work best.
The reader would train his/her filter by rating each article s/he reads.
The rating could be a simple 0..9 score of how interested the reader was in
the article, or several different categories (content, style, subject...)
could be rated.
The network would train itself by taking the article as the input vector,
producing the rating it thinks the reader would give it, comparing its
rating to the real rating the reader gave, and adjusting itself until the
rating it produces matches the real rating. After a learning period,
the NN would provide the reader with an accurate rating -- a forecast of how
the reader would rate an article. When the NN is wrong, the reader can give
it the correct rating -- so, if the reader's tastes or interests change, the
NN changes too.
The benefit of the system is that it would be objective and passive, as
opposed to a keyword system, which makes demands upon the author. It would
also be personalized to the reader, as opposed to a moderator who, albeit
incomparably smarter than a NN, may not share the individual reader's tastes.
In a sophisticated newsreader, the NN could show the score near the subject
and keywords of an article, and the reader could use this advice when
deciding what to read. The newsreader could also mask articles that have a
score less than 5, for instance.
I don't know how to make such a neural network system--there are two big
questions that I don't know how to answer:
1) How should the articles be pre-processed for vectorized input
to the network? I.e., what's the best way of looking
at an article to produce a rating for it? This
involves producing statistics as well as passing some
text straight through.
2) What sort of neural network is best for this?
I hope someone else will speculate! Maybe someone out there has tried
preparing text for generating statistical ratings, especially with a neural
network?
J. David Beutel_______________11011011________________People's Computer Company
"I am, therefore I am." `Revolutionary Programming'
...!uunet!uvm-gen!banzai!david
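[[ Editor's note: one conceivable realization of David's scheme, and a crude
answer to his question (1), is a hashed bag-of-words vector feeding a single
linear unit trained by the delta rule. Everything here (the 4096-slot hash
trick, the single unit) is this note's assumption, not David's design; a
minimal sketch in Python with NumPy:

    import zlib
    import numpy as np

    DIM = 4096   # fixed-size "bag of words" (an arbitrary choice)

    def vectorize(text):
        # Hash each word into a fixed-length count vector, then normalize.
        v = np.zeros(DIM)
        for word in text.lower().split():
            v[zlib.crc32(word.encode()) % DIM] += 1.0
        return v / max(1.0, float(np.linalg.norm(v)))

    w = np.zeros(DIM)          # the "network": one linear unit

    def predict(article):
        # Forecast the reader's 0..9 interest rating.
        return float(np.clip(w @ vectorize(article), 0.0, 9.0))

    def train(article, reader_rating, lr=0.1):
        # Delta rule: nudge weights toward the rating actually given.
        global w
        x = vectorize(article)
        w += lr * (reader_rating - w @ x) * x

A real system would surely want a richer representation (question 1) and
perhaps hidden units (question 2), but even this degenerate "network"
implements the train-on-disagreement loop described above. ]]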
------------------------------
Subject: medical applications of neural nets
From: bhbst@unix.cis.pittsburgh.edu (Barry H. Blumenfeld)
Organization: Univ. of Pittsburgh, Comp & Info Services
Date: Thu, 24 Aug 89 12:01:23 +0000
I'm writing an article on the use of neural networks in medicine for an
upcoming issue of Computer News for Physicians. I'm aware of much of
what is being done in the field, but I'm sure there are plenty
of people out there doing good work with whom I'm not familiar. If you,
or anyone you know, are engaged in research utilizing a connectionist
paradigm in a medical domain I'd like to hear about your research. You
can contact me by Email, or send a copy of any recent papers to my US mail
address.
--thanks
Barry Blumenfeld MD
ARPA: bhb@med.pittsburgh.edu
Compuserve user # 70667,1124
Section of Medical Informatics
B50A Lothrop Hall
University of Pittsburgh
Pittsburgh, Pa 15261
Phone-412-648-3036
------------------------------
Subject: NN researchers in Pakistan?
From: ux1.cso.uiuc.edu!uxe.cso.uiuc.edu!burkie@UXC.CSO.UIUC.EDU
Date: 30 Aug 89 06:04:00 +0000
Does anyone know if there is anyone working on neural-nets or a related
field in Pakistan? I am planning to go back soon and I would be interested
in finding as many people like that as I can. Thanks for any help.
Salman Burkie
burkie@complex.ccsr.uiuc.edu
(217)-244-1744
------------------------------
Subject: Fuzzy Logic in NN Sources
From: slager@osiris.cso.uiuc.edu
Date: Sun, 30 Jul 89 23:16:12 +0000
I am looking for information on fuzzy logic (contradictory information)
in neural nets. Any references will be much appreciated.
Please send mail to:
chuck@cerl.cecer.army.mil
------------------------------
Subject: Re: Fuzzy Logic in NN Sources
From: russell@minster.york.ac.uk
Organization: Department of Computer Science, University of York, England
Date: Tue, 01 Aug 89 12:57:44 +0000
Hope this is of use ...
Russell.
____________________________________________________________
Russell Beale, Advanced Computer Architecture Group,
Dept. of Computer Science, University of York, Heslington,
YORK. YO1 5DD. UK. Tel: [044] (0904) 432762
russell@uk.ac.york.minster JANET connexions
russell%york.minster@cs.ucl.ac.uk ARPA connexions
..!mcvax!ukc!minster!russell UUCP connexions
russell@minster.york.ac.uk eab mail
____________________________________________________________
Refs follow:
%T Fuzzy Logic for Handwritten Numeral Character Recognition
%A Pepe Siy
%A C S Chen
%J IEEE Transactions on Systems, Man, and Cybernetics
%D November 1974
%P 570-575
%K directed graph nodes classification
%T Adaptive Inference in Fuzzy Knowledge Networks
%A Bart Kosko
%J IEEE First International Conference on Neural Networks
%V 2
%D June 21-24 1987
%P 261-268
%K logic matrix Lyapunov
%T On Designing Fuzzy Learning Neural-Automata
%A L. C. Shiue
%A R. O. Grondin
%J IEEE First International Conference on Neural Networks
%V 2
%D June 21-24 1987
%P 299-308
%T Estimation of Expert Weights Using Fuzzy Cognitive Maps
%A W.R. Taber
%A M.A. Siegel
%J IEEE First International Conference on Neural Networks
%V 2
%D June 21-24 1987
%P 319-326
------------------------------
Subject: Methods of simple object recognition
From: rao@enuxha.eas.asu.edu (Arun Rao)
Organization: Arizona State Univ, Tempe
Date: Wed, 03 May 89 16:49:48 +0000
I'm currently doing a performance evaluation of various methods of
geometric object recognition (neural and non-neural). The final objective is
to compare the performance of the best existing systems with a new approach
we are developing.
The pattern set to be recognized consists of simple 2-dimensional
contours. All objects are completely visible (no occlusion). The ASCII
character set is one example, but the system should be capable of handling
arbitrary shapes. It should be insensitive to translation, scaling and
rotation.
I know of the following methods:
(i) Hough transform methods (Dana Ballard).
(ii) Fourier/log-polar transform methods (Casasent and Psaltis,
Brousil and Smith, many others).
(iii) Neocognitron (Fukushima).
(iv) Invariance net (Widrow).
(v) Hierarchical structure coding (Hartmann).
(vi) Similitude invariant system (Prazdny).
I am specifically interested in how this problem is dealt with in
existing vision systems - most of the above are probably still confined to
the laboratory. How good, for example, are OCR systems? As I understand it,
they still require a very strictly defined character set and do not
tolerate misalignment.
Any comments would be appreciated. Thanks in advance.
Arun
Arun Rao
ARPANET: rao@enuxha.asu.edu BITNET: agaxr@asuacvax
950 S. Terrace Road, #B324, Tempe, AZ 85281
Phone: (602) 968-1852 (Home) (602) 965-2657 (Office)
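[[ Editor's note: one classical non-neural baseline worth adding to Arun's
list is moment invariants (Hu, IRE Trans. Information Theory, 1962), which
give translation-, scale-, and rotation-insensitive descriptors for exactly
this kind of unoccluded 2-D shape. A minimal sketch in Python with NumPy
(anachronistic here, and showing only the first four of Hu's seven
invariants):

    import numpy as np

    def hu_moments(img):
        # img: 2-D binary array; returns 4 invariant shape descriptors.
        y, x = np.nonzero(img)
        m00 = float(len(x))              # shape "mass"
        xb, yb = x.mean(), y.mean()      # centroid (translation invariance)

        def mu(p, q):                    # central moments
            return (((x - xb) ** p) * ((y - yb) ** q)).sum()

        def eta(p, q):                   # scale-normalized moments
            return mu(p, q) / m00 ** (1 + (p + q) / 2.0)

        n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
        n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
        return np.array([                # rotation-invariant combinations
            n20 + n02,
            (n20 - n02) ** 2 + 4 * n11 ** 2,
            (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2,
            (n30 + n12) ** 2 + (n21 + n03) ** 2,
        ])

Nearest-neighbour matching on these descriptors handles arbitrary unoccluded
shapes, though it is far less discriminating than the learned methods listed
above. ]]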
------------------------------
Subject: M&P
From: muttiah@cs.purdue.EDU (Ranjan Samuel Muttiah)
Organization: Department of Computer Science, Purdue University
Date: Tue, 01 Aug 89 16:23:11 +0000
In a paper in the Bulletin of Mathematical Biophysics (5:115-133) titled "A
logical calculus of the ideas immanent in nervous activity," W. S. McCulloch
and W. Pitts make extensive use of the logical calculus notation used by
Carnap and by Russell and Whitehead. If you have read this paper and are
familiar with its notation, I would like to hear from you by email!
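[[ Editor's note: stripped of the Carnap/Principia notation, the paper's
central construction is the threshold unit: a neuron fires iff the weighted
sum of its binary inputs reaches its threshold, and such units suffice to
realize the propositional connectives. A minimal sketch in Python (an
anachronism, obviously):

    def mp_neuron(inputs, weights, threshold):
        # McCulloch-Pitts unit: 1 iff weighted input sum >= threshold.
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

    AND = lambda a, b: mp_neuron((a, b), (1, 1), 2)
    OR  = lambda a, b: mp_neuron((a, b), (1, 1), 1)
    NOT = lambda a:    mp_neuron((a,), (-1,), 0)

    assert AND(1, 1) == 1 and AND(1, 0) == 0
    assert OR(0, 1) == 1 and OR(0, 0) == 0
    assert NOT(1) == 0 and NOT(0) == 1

The paper's absolute inhibitory inputs can be approximated here as large
negative weights. ]]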
------------------------------
Subject: New Magazine: NEURAL NETWORK COMPUTING
From: Mark Robert Thorson <portal!cup.portal.com!mmm@uunet.uu.net>
Organization: The Portal System (TM)
Date: 29 Aug 89 01:13:47 +0000
I just heard on a local PBS show, Computer Chronicles, that a new magazine
is going to be started, NEURAL NETWORK COMPUTING. The publisher is said to
be Auerbach, located in New York.
Does anyone have the address for these people? Does anyone have any
additional information on this subject?
Along the same lines, can someone post the address of INNS and the yearly
membership fee? What do last year's conference proceedings cost?
And are there any other periodicals devoted to neural networks?
------------------------------
Subject: Re: New Magazine: NEURAL NETWORK COMPUTING
From: "Robert J. Reed" <rochester!kodak!young@CU-ARPA.CS.CORNELL.EDU>
Organization: Eastman Kodak Co, Rochester, NY
Date: 30 Aug 89 14:12:19 +0000
I just received a copy of Neural Network Computing. The address is
Auerbach Publishers, a Division of Warren, Gorham & Lamont, Inc.
210 South Street
Boston, MA 02111-9990
The price is $145 per year, and the magazine is issued quarterly. Their
800 number is 1-800-950-1216.
The table of contents is:
Jet and Rocket Engine Fault Diagnosis in Real Time
The Significance of Sleep in Memory Retention and Internal Adaptation
Self-Organization, Pattern Retention and Internal Adaptation
A Desktop Neural Network for Dermatology Diagnosis
Applying, Acquiring, and Integrating Neural Networks
Neural Networks Bookshelf
Robotics Applications
My impression of the magazine is that it covers some practical applications
as well as new topics in NN computing. I am not an expert on the
subject, and I found it very understandable. The stated goal of the magazine
is to "provide a channel of communication for neural network practitioners".
This is not an endorsement of the journal.
[[ Editor's Note: $145/year shows that casual readers need not apply. -PM ]]
------------------------------
End of Neurons Digest
*********************