Neuron Digest Monday, 2 Mar 1992 Volume 9 : Issue 9
Today's Topics:
bibliography for knowledge retention
Positions available at Lawrence Livermore Labs
Lectureship in philosophy, University of Sussex
What is "Minimum Descriptional Complexity Estimate?"
Golden Section
Research Position -- GTE
Volunteers needed for IJCNN-92
supercomputer support/projects
neural network visualization
Open Letter - Response
bibliography for ANN visualization tools?
IEEE neural networks council seeks input
Reply to Pellionisz' "Open Letter"
Boston area ISSNNet meeting
NNs & NLP
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: bibliography for knowledge retention
From: ruizdeangulo%ispgro.cern.ch@BITNET.CC.CMU.EDU
Date: Tue, 04 Feb 92 14:27:33 +0100
>I am working on the retention of knowledge by a neural network. A neural
>network tends to forget its past training when it is trained on new data
>points. I'd be thankful if you could suggest some references...
Here is a little bibliography:
French, R. M. (1991). Using distributed representations to overcome
catastrophic forgetting in connectionist networks. CRCC Technical Report
51-1991, Center for Research on Concepts and Cognition, Indiana
University.
Hinton, G. E., & Plaut, D. C. (1987). Using fast weights to deblur old
memories. In Proceedings of the Ninth Annual Conference of the Cognitive
Science Society (pp. 177-186). Hillsdale, NJ: Erlbaum.
McCloskey, M., & Cohen, N. J. (1989). Catastrophic interference in
connectionist networks: The sequential learning problem. In G. H. Bower
(Ed.), The Psychology of Learning and Motivation. New York: Academic
Press.
Ratcliff, R. (1990). Connectionist models of recognition memory:
Constraints imposed by learning and forgetting functions. Psychological
Review, Vol. 97, No. 2, 235-308.
Ruiz de Angulo, V., & Torras, C. (1991). Minimally disturbing learning.
In A. Prieto (Ed.), Proceedings of IWANN 91 (International Workshop on
Artificial Neural Networks). Springer-Verlag.
Smieja, F. J. (1991). Hyperplane "spin" dynamics, network plasticity and
back-propagation learning. Technical Report, German National Research
Centre for Computer Science (GMD).
I don't have it in hand, but this article, announced to appear in the
IJNS, could also be related:
Bishop, C. M. (1991). A fast procedure for retraining the multilayer
perceptron. International Journal of Neural Systems, Vol. 2, No. 3.
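[The forgetting effect the question describes can be seen even in a
one-weight linear model. The following is a minimal sketch (toy tasks
invented purely for illustration, not code from any of the papers above):
a weight fit to one task drifts away as soon as training sees only a
second, incompatible task.]

```python
# Minimal sketch of catastrophic forgetting with a one-weight linear
# model y = w * x, trained by per-sample gradient descent on squared
# error. The tasks are hypothetical toy data, chosen for illustration.

def train(w, samples, lr=0.1, epochs=200):
    # Gradient of 0.5*(w*x - y)**2 with respect to w is (w*x - y)*x.
    for _ in range(epochs):
        for x, y in samples:
            w -= lr * (w * x - y) * x
    return w

def mse(w, samples):
    return sum((w * x - y) ** 2 for x, y in samples) / len(samples)

xs = [1.0, 2.0, 3.0]
task_a = [(x, 2.0 * x) for x in xs]   # task A: y = 2x
task_b = [(x, -1.0 * x) for x in xs]  # task B: y = -x

w = train(0.0, task_a)
err_a_before = mse(w, task_a)   # near zero: task A learned (w -> 2)

w = train(w, task_b)            # further training sees ONLY task B
err_a_after = mse(w, task_a)    # large: task A "forgotten" (w -> -1)

print(err_a_before, err_a_after)  # roughly 0.0 and 42.0
```

[Because the two tasks demand incompatible values of the single shared
weight, training on task B necessarily destroys performance on task A.
The papers above study the same interference in multi-weight networks,
where distributed representations overlap only partially, and propose
ways to reduce it.]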
------------------------------
Subject: Positions available at Lawrence Livermore Labs
From: jb@S1.GOV
Date: Wed, 05 Feb 92 18:30:52 -0800
*********************************************************************
*********** ***********
********** JOB OPPORTUNITIES IN NEURAL NETWORKS AT THE **********
********** LAWRENCE LIVERMORE NATIONAL LABORATORY **********
*********** ***********
*********************************************************************
The Institute of Scientific Computing Research at the Lawrence Livermore
National Laboratory expects to open one or two positions for postdoctoral
research scientists in the fiscal year 1993. We are seeking strongly
motivated, outstanding individuals with expertise in neural computation,
neural networks for vision and signal processing and/or modelling
biological neural systems. Candidates should have research experience in
at least one of the following areas and a commitment to studying the
others:
- neural network modeling and neural network learning
- retina modeling
- visual pattern recognition and pattern classification with
neural networks
- neural networks for signal processing, e.g., detection of
very dim visual stimuli
- adaptive signal processing
Applicants for the positions should have completed their PhD and have a
demonstrated ability to do independent innovative research in an area
listed above. Familiarity with the Unix computing environment and object
oriented programming is a plus.
The Lawrence Livermore National Laboratory has a broad spectrum of
applications for neural computation, ranging from automated pattern
classification and object recognition to signal detection in astronomy
and robotics applications. The successful applicant will work on
non-classified projects.
Interested individuals should send their curriculum vitae with a list
of their research interests, a list of publications and three references
to:
JOACHIM BUHMANN
Lawrence Livermore National Laboratory
Computational Physics Division Fax: (510) 423 9572
P.O. Box 808, L-270 email: jb@s1.gov
Livermore, CA 94550
------------------------------
Subject: Lectureship in philosophy, University of Sussex
From: Andy Clark <andycl@syma.sussex.ac.uk>
Date: Fri, 07 Feb 92 18:28:46 +0000
**** PERMANENT LECTURESHIP IN PHILOSOPHY, UNIVERSITY OF SUSSEX ****
Applications are invited from men and women for a permanent lectureship
in philosophy at the University of Sussex, England. The applicant should
be able to teach "core" philosophy courses (e.g. Descartes to Hume), and
should have a specialist interest in the philosophy of mind/cognitive
science and/or the philosophy of language.
Application-forms and Further Particulars can be obtained from:
Ms. Elinor Mitchenall
The Personnel Office
Sussex House
University of Sussex
Falmer
Brighton
BN1 9RH
Phone: (0273)- 678201
FAX: (0273) - 678335 (mark FAX "for the attention of the Personnel Dept.")
------------------------------
Subject: What is "Minimum Descriptional Complexity Estimate?"
From: "Dheeraj Khera" <khera@sed.eeel.nist.gov>
Date: 10 Feb 92 16:26:00 -0500
At the National Institute of Standards and Technology, we've been using
various machine-learning techniques, including neural nets, to interpret
and characterize the volumes of data obtained from semiconductor
manufacturing processes. Recently, I came across a term that I'm having
a little difficulty understanding:
Minimum Descriptional Complexity Estimate.
My colleague heard it at a talk by someone from TI's Computer Integrated
Manufacturing area. Could someone familiar with this help me out with a
brief explanation? Thanks.
Raj Khera
National Institute of Standards and Technology
B360, Bldg. 225
Gaithersburg, MD 20899
EMAIL: khera@sed.eeel.nist.gov
------------------------------
Subject: Golden Section
From: Jean-Bernard Condat <0005013469@mcimail.com>
Date: Tue, 11 Feb 92 07:12:00 +0000
[[ Editor's Note: This came as a bit of a surprise. While it may be
tangential to Neural Nets, I would be very interested in potential
connections (so to speak)! -PM ]]
Hallo!
I work on the golden section in the sciences and am looking for all
possible references and/or articles related to this subject. If you know
of one, could you please send me a copy and/or the reference?
Thank you very much for your kind help.
Jean-Bernard Condat
CCCF
B.P. 8005
69351 Lyon Cedex 08 France
Fax.: +33 1 47 87 70 70
Phone: +33 1 47 87 40 83
DialMail #24064
MCI Mail #501-3469
------------------------------
Subject: Research Position -- GTE
From: Rich Sutton <rich@gte.com>
Date: Wed, 12 Feb 92 09:55:26 -0500
The machine learning group at GTE Laboratories is seeking a researcher
for their connectionist machine learning project. The primary
requirement is a demonstrated ability to perform and publish world-class
research in computational models of learning, preferably within the
context of real-time control. Candidates should also be eager to pursue
applications of their research within GTE businesses. GTE is a large
communications company, with major businesses in local telephone
operations, mobile communications, and government systems. GTE Labs has
had one of the largest machine learning research groups in industry for
about eight years.
A doctorate in Computer Science, Computer Engineering or Mathematics is
required, and post-graduate experience is preferred. A demonstrated
ability to communicate effectively in writing and in technical and
business presentations is also required.
Please send resumes and correspondence to:
June Pierce
GTE Laboratories Incorporated
Mail Stop 44
40 Sylvan Road
Waltham, MA 02254
USA
------------------------------
Subject: Volunteers needed for IJCNN-92
From: "Nina a. Kowalski" <nak@src.umd.edu>
Date: Thu, 13 Feb 92 12:08:02 -0500
[[ Editor's Note: For the students and otherwise cash poor folks, this is
a wonderful opportunity to contribute to the cause, meet some great
people, as well as attend the meeting sporting the latest in Neural Net
fashion (i.e., those red shirts). I recommend the experience if you have
the time. Nina Kowalski has been organizing volunteers for several years
now and is a very capable and charming leader. -PM ]]
***************************************************************************
IJCNN 1992 - REQUEST FOR VOLUNTEERS
***************************************************************************
This is the first call for volunteers to help at the International Joint
Conference on Neural Networks (IJCNN), to be held at the Baltimore
Convention Center, Baltimore, MD from June 7 - June 11, 1992.
In exchange for 25 hours of work, volunteers will receive:
Full admittance and registration to all sessions of the conference
(excluding tutorial sessions).
Admittance to a tutorial session only if the volunteer is working that
session.
Complete set of proceedings.
A T-shirt that must be worn during work shifts (in order to make
volunteers visible).
The work shifts are in ~5 hour blocks; any combination of blocks totaling
about 25 hours can be used. The work ranges from helping at the
registration desk and guarding the entrances to sessions, to serving as a
room monitor (go-fer) in sessions and helping distribute the proceedings.
The conference begins Sunday, June 7 with the tutorial sessions and ends
June 11. We will be needing help starting Friday, June 5. The exact
schedule will not be determined until the session schedule is finalized.
I will be contacting you when this happens.
Those interested in signing up, please send me the following information:
Name
Address
phone number
email
Upon signing up, you will be sent a form with a more detailed description
of the positions and a request for shift and tutorial preferences.
Sign-ups will be based on the date of commitment.
You may contact me at:
University of Maryland
Systems Research Center
A.V. Williams Bldg.
College Park, MD 20742
(301) 405-6596 or email: nak@src.umd.edu
If you have further questions, please feel free to contact me.
Thank you,
Nina Kowalski
IJCNN-92 Volunteer Chair
------------------------------
Subject: supercomputer support/projects
From: goddard@icsi.berkeley.edu (Nigel Goddard)
Date: Wed, 19 Feb 92 17:02:48 -0800
The Pittsburgh Supercomputing Center has started a program of support
for and research in connectionist AI and neuroscience. We support
academic research around the country under an NSF/NIH program. We
have Crays, a CM-2 and a CM-5 (256 nodes). We plan to bring up
simulation software on the machines to support various kinds of
connectionist work: detailed neurobiological modeling; cognitive
modeling with PDP and structured nets; using neural nets to solve
scientific questions in other biomedical areas. At this stage we need
to find:
1. People who see a need in the near future for the kind of
computational resources we can provide.
2. Software that is needed to support these people on our machines.
If you see yourself falling in category 1, or can suggest others whom
we should contact, please let us know. We're particularly interested
in finding one or two research projects to work with intensively to
get this effort off the ground.
So far we are looking at the "Neuron" and "Genesis" simulators for
neuroscientists, Aspirin/Migraine and PlaNet for PDP, and probably the
Rochester simulator and ICSIM for structured nets. If you know of
other simulation software we should consider, please let us know.
Feel free to pass this on to anyone you think would be interested.
Nigel Goddard
ngoddard@psc.edu
------------------------------
Subject: neural network visualization
From: jones@cis.uab.edu (Warren Jones)
Date: Thu, 20 Feb 92 08:11:03 -0600
Our neural network research group is interested in getting more involved in
network visualization and perhaps developing some visualization strategies
in collaboration with the graphics and image processing faculty here. Is
there a comprehensive bibliography in this area?
Warren Jones
------------------------------
Subject: Open Letter - Response
From: cgq@ornl.gov
Date: Mon, 24 Feb 92 07:04:48 -0500
[[ Editor's Note: Here is one response to the recent issue. Compare this
with one later in this issue of the Digest. -PM ]]
Dear Neuron-Digest Moderator - Peter:
The "Open letter to Dr. Sun-Ichi Amari" section of the most recent digest
issue is an example of a very serious scientific problem that plagues
many fields. It is ABSOLUTELY UNETHICAL not to provide and acknowledge
references to prior, related research when publishing a paper, especially
when one is aware of the related work. Dr. Pellionisz is so correct when
he says:
"The main issue, that of intellectual priority and proper citation,
affects all of us in research and forms the foundation of the modern
scientific tradition."
As a researcher I hold this tradition above all others.
Thank you for bringing this subject to our attention.
Chuck Glover
cgq@ornl.gov
Oak Ridge National Lab
------------------------------
Subject: bibliography for ANN visualization tools?
From: jones@cis.uab.edu (Warren Jones)
Date: Mon, 24 Feb 92 09:28:49 -0600
Our neural network group is moving into a collaboration with local
graphics and image processing faculty for the development of a
visualization tool for neural network research. Does a general
bibliography exist for neural network visualization tools?
Warren Jones
jones@cis.uab.edu
------------------------------
Subject: IEEE neural networks council seeks input
From: Russel Eberhart <rce%babar@rti.rti.org>
Date: Mon, 24 Feb 92 14:52:55 -0500
[[ Editor's Note: The scientific community is served by various specialty
societies and committees within those societies. For them to serve us
better, we should let our opinions be known. I urge all IEEE members and
interested others to send a note to Russ and/or Pat. Let your dues go for
more than just the magazine. -PM ]]
February 24, 1992
Dear Neural Network Community:
In an effort to better serve the neural network community, we of the IEEE
Neural Networks Council (NNC) are asking you to respond to questions
concerning the role of a neural network society. We suggest that these
responses be made publicly, but we also understand the need for
confidentiality in some instances and respect your right to send e-mail
replies directly to either one or both of us. Pat and I are asking for
your input as individuals: this is not an officially sanctioned activity
of the NNC. We plan to present your initial responses to the NNC
Administrative Committee meeting on March 8, 1992, in San Diego, but we
hope the discussion continues long after that date.
The following questions are intended to begin a discussion, but they are
in no way meant to be comprehensive. We hope to establish a forum that
encourages you of the neural network community to voice your feelings
about neural network societies.
1. THE ROLE OF THE SOCIETY:
a. What role should a neural network society play?
b. In which of the following areas should a neural network
society become involved? (Why?)
- Journals - Conferences
- Employment Assistance - Lobbying for Funding
- Public Relations - Standards
- Newsletters - Magazines
- Local/regional chapters
c. What are some other areas where societies should or could
play a role? What areas should be avoided?
d. How should a neural network society work with other technical
communities such as fuzzy systems, AI, genetic algorithms,
evolutionary programming, and virtual reality?
e. How should a neural network society serve the needs of students?
Of faculty members?
2. CONFERENCE FORMATS:
a. What type(s) of conferences are most useful to you?
b. Are shorter conferences with a specific theme more appealing
to you than a large yearly meeting, or does each have its place?
c. How often should large meetings be held and where should
they be held?
d. What are your thoughts on the oral presentation format versus the
poster presentation format?
e. What can be done to improve conferences?
f. What should be eliminated from conferences?
g. How can a society get more people involved in the conferences?
We look forward to hearing from the neural network community (YOU!). It
is the intent of the IEEE Neural Networks Council to serve the
community's best interests and we want to know how we can improve.
Sincerely,
Russell C. Eberhart, President Patrick K. Simpson, Vice Pres.
IEEE Neural Networks Council IEEE Neural Networks Council
Research Triangle Institute ORINCON Corporation
Biomedical Engineering 9363 Towne Centre Drive
P.O. Box 12194 San Diego, CA 92121
Research Triangle, NC 27709 E-mail: xm8@sdcc12.ucsd.edu
E-mail: rce@rti.rti.org
------------------------------
Subject: Reply to Pellionisz' "Open Letter"
From: "Michael Arbib" <arbib@pollux.usc.edu>
Date: Tue, 25 Feb 92 13:59:15 -0800
[[ Editor's Note: Here is another response. I appreciate the candor and
encourage others to express their opinion in a reasoned and supported
fashion. While I cannot comment on the correctness of the points either
in the original "open letter" or the various responses due to lack of
appropriate background, I am encouraged by the debate itself as process
of public education. -PM ]]
One of my colleagues sent me a copy of the letter by A. Pellionisz
complaining that Amari had not cited his earlier papers on applying
tensor analysis. Since my reply may be of general interest, I reproduce
it here:
"Amari had sent me the original letter, and we had agreed it was a
KINDNESS to Pellionisz not to refer to his earlier work, since in doing
so Amari would have had to summarize our 1985 argument showing that
Pellionisz had misunderstood the mathematics of tensor analysis. Anyway,
the work of Amari is NOT a generalization (why generalize a flawed
theory?!) but is a totally different application. For Pellionisz, the
tensors are the inputs and outputs to a single NN. For Amari (applying
his work of many years on information geometry) the whole NN is an
element of the Riemannian space on which tensors are defined, and the
metric on that space is used as an information measure to guide inference
of network parameters to find a NN meeting specified criteria. My only
criticism of Amari's paper is that it relies too much on his previous
publications directed to statisticians, and so will be very hard for NN
workers to read. Finally, note that tensor analysis is a powerful branch
of mathematics with many applications. The idea of applying it to NNs
does not need citation any more than does say the use of linear algebra.
But if one makes use of a specific technique in a way close to the work
of others, then full citation is appropriate. The latter case does NOT
apply here."
Let me simply add that the Japan-bashing in Pellionisz's letter is both
distasteful and (as I need hardly add) totally without foundation. It is
Professor Amari, not Dr. Pellionisz, who deserves a public apology.
Michael Arbib
Center for Neural Engineering
University of Southern California
Los Angeles, CA 90089-2520
USA
------------------------------
Subject: Boston area ISSNNet meeting
From: issnnet@copley.bu.edu (Student Society Account)
Organization: Boston University Center for Adaptive Systems
Date: 26 Feb 92 16:31:53 +0000
Hello everyone,
After three months of near dormancy, we are trying to make a strong
push to reaffirm ISSNNet as the official student society in the field
of Neural Networks. The society has been suffering from a lack of
organizational infrastructure, due mostly to the diminished available
time of some of its founders. We have also been very busy with the
organization of student sponsorships for IJCNN-Seattle and
IJCNN-Singapore.
We are organizing a meeting in the Boston area for next Wednesday,
March 4, at 2:00pm. All students working in Neural Networks are
invited to attend. The meeting will focus on the achievements and
goals of ISSNNet, and a discussion of how the society can become more
permanent and better organized with your help.
You do not need to be a member of ISSNNet to attend, in fact you don't
have to be a student. The only requirement is that you be interested
in the role of students in the field of Neural Networks.
The meeting will be held at Boston University, in room 101 of the old
Biology Building, at 2 Cummington Street (right next to the
Nickelodeon movie theater). A map is included below for those
unfamiliar with the area.
If you can't attend the meeting but would like to find out more about
ISSNNet, please send mail to issnnet@park.bu.edu (however, replies to
general info requests may be slow until we have more people to help :-)).
See you there!
ISSNNet
- ------------------------------------------------------------------------
Directions to 2 Cummington Street:
|
|Blandford St. -- First
|B-line stop after Kenmore
|
......................."B" Green line.......|........................
=====Commonwealth Ave.===================================Kenmore===--> Downtown
| | |
| WARREN TOWER DORMS | |
| | |
------------------------------------------
Cummington St. ^ x
| \
Nickelodeon 2 Cummington
ISSNNet, Inc.
P.O. Box 15661
Boston, MA 02215 USA
------------------------------
Subject: NNs & NLP
From: Jean-Francois Jodouin <jfj%FRLIM51.BITNET@BITNET.CC.CMU.EDU>
Date: Thu, 27 Feb 92 14:20:53 +0100
[[ Editor's note: See previous issues of the Digest which announce
technical reports for instruction on how to access the Neuroprose archive
at archive.cis.ohio-state.edu. -PM ]]
At last, at long last, the neural network & natural language processing
bibliography I've been pestering you about is compiled. The thing, though
far from complete, is now in the neuroprose archive under the name
"jodouin.nlpbib.asc.Z". Any comments, additions, deletions, etc. are
welcome.
jfj
------------------------------
End of Neuron Digest [Volume 9 Issue 9]
***************************************