Neuron Digest Wednesday, 30 Jun 1993 Volume 11 : Issue 41
Today's Topics:
Post-doctoral research position in AI
Re: Kolmogorov's Theorem
edited collection of ANN papers; discount
neural networks
Request for Training Data
Informational Brochure from Tufts University
Neural Nets course
Edward C. Posner Memorial Fellowship Fund
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Post-doctoral research position in AI
From: Richard Lepage <lepage@gel.ulaval.ca>
Date: Thu, 17 Jun 93 09:32:56 -0500
Would you please post the following announcement:
Post-doctoral Research Position
Information Integration Laboratory
Department of Electrical Engineering
Ecole de technologie superieure
Montreal, Canada
A post-doctoral position in Computer Vision and Artificial Intelligence
is presently available at the Department of Electrical Engineering of the
Ecole de Technologie Superieure, Montreal (Canada). Candidates in all
domains of computer vision and AI are invited to apply. A Ph.D. in
electrical engineering or computer science is required. Experience with
expert systems and/or neural networks would be a definite asset.
Interest in knowledge representation and acquisition is also desirable.
The candidate will participate in a research project aimed at
developing a vision interface for the Soar cognitive architecture. This
interface will be based on conventional image processing functions as
well as image processing neural networks. The Department is young, but
successful in attracting research funding. It is small, but big enough to
support a significant range of research interests. The atmosphere is
friendly and informal. A one year position is offered, renewable for a
second year upon agreement by both parties. The starting date is on or
after September 1, 1993.
Informal enquiries may be made to Dr. Daniel Crevier by e-mail
(crevier@ele.etsmtl.ca) or phone (514 289-8800 #7506). Applications
should include a resume and the names of two referees and should be sent
to:
Dr. Daniel Crevier,
Department of Electrical Engineering,
Ecole de Technologie Superieure,
4750 Henri-Julien, Montreal (Quebec),
Canada, H2T 2C8.
------------------------------
Subject: Re: Kolmogorov's Theorem
From: n@predict.com (Norman Packard)
Date: Thu, 17 Jun 93 14:40:37 -0700
Kevin Maguire <psnkm1@uk.ac.stir.forth> (= psnkm1@stirling.ac.uk)
asks:
"Does anyone out there have anything on the application of Kolmogorov's
Theorem to real-world problems..."
Kolmogorov proved a lot of theorems; I wonder which you mean?
You might check out A. R. Barron and T. M. Cover, "Minimum Complexity
Density Estimation," IEEE Trans. on Information Theory, vol. 37, p. 1034
(1991) for a use of Kolmogorov's complexity theory.
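[[ Editor's Note: Two distinct results commonly go by this name in the
ANN literature, so a word of orientation may help. The reference above
concerns Kolmogorov complexity (minimum description length). The other
candidate, Kolmogorov's superposition theorem (1957), states that every
continuous function f on the n-dimensional unit cube can be written as

    f(x_1,\dots,x_n) = \sum_{q=0}^{2n} \Phi_q\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)

for continuous one-variable functions \Phi_q and \phi_{q,p}, where the
inner functions \phi_{q,p} can be chosen independently of f. This is the
result Hecht-Nielsen popularized as an existence argument for three-layer
networks, and it is usually what "Kolmogorov's Theorem" means in a neural
network context. -Ed. ]]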
------------------------------
Subject: edited collection of ANN papers; discount
From: Pankaj Mehra <mehra@ptolemy.arc.nasa.gov>
Organization: Recom Technologies, Inc. at NASA-Ames Research Center
Date: Mon, 21 Jun 93 15:58:56 -0800
Fellow Connectionists:
Some of you may have already seen ``Artificial Neural Networks:
Concepts and Theory,'' edited by [yours truly] and Ben Wah. It was
published by IEEE Computer Society Press in August 1992. The table of
contents is attached at the end of this message. The book is hardback
and has 667 pages, of which approximately 100 are chapter introductions
written by the editors. The list price is $70 [$55 for IEEE members].
My intent in sending this message is not so much to announce the
availability of our book as to bring to your notice the following
discount offer: if I place an order, I get an author's discount of 40%
off the list price; if a school bookstore places the order, it gets a 32%
discount. The IEEE order number for the book is 1997; call 1-800-CS-BOOKS. If
you are planning to teach a graduate course on neural networks, you will
probably find that our collection of papers as well as the up-to-date
bibliography at the end of each chapter introduction provide excellent
starting points for independent research.
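[[ Editor's Note: Assuming both percentages apply to the $70 nonmember
list price, the 40% author's discount works out to 0.60 x $70 = $42, and
the 32% bookstore discount to 0.68 x $70 = $47.60. -Ed. ]]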
-Pankaj Mehra
415/604-0165 mehra@ptolemy.arc.nasa.gov
NASA - Ames Research Center, M/S 269-3
Moffett Field, CA 94035-1000
USA
__________________________________________________________________________
TABLE OF CONTENTS: page
-----------------
Chapter 1: INTRODUCTION
Introduction by editors 1-12
An Introduction to Computing with Neural Nets, Lippmann 13-31
An Introduction to Neural Computing, Kohonen 32-46
Chapter 2: CONNECTIONIST PRIMITIVES
Introduction by editors 47-55
A General Framework for Parallel Distributed Processing,
Rumelhart, Hinton, & McClelland 56-82
Multilayer Feedforward Potential Function Network, Lee & Kil 83-93
Learning, Invariance, and Generalization in High-Order
Networks, Giles & Maxwell 94-100
The Subspace Learning Algorithm as a Formalism for Pattern
Recognition and Neural Networks, Oja & Kohonen 101-108
Chapter 3: KNOWLEDGE REPRESENTATION
Introduction by editors 109-116
BoltzCONS: Reconciling Connectionism with the Recursive
Nature of Stacks and Trees, Touretzky 117-125
Holographic Reduced Representations: Convolution Algebra for
Compositional Distributed Representations, Plate 126-131
Efficient Inference with Multi-Place Predicates and Variables
in a Connectionist System, Ajjanagadde and Shastri 132-139
Integrated Architectures for Learning, Planning, and Reacting
Based on Approximating Dynamic Programming, Sutton 140-148
Chapter 4: LEARNING ALGORITHMS I
Introduction by editors 149-166
Connectionist Learning Procedures, Hinton 167-216
30 Years of Adaptive Neural Networks: Perceptron, Madaline,
and Back-Propagation, Widrow and Lehr 217-244
Supervised Learning and Systems with Excess Degrees of
Freedom, Jordan 245-285
The Cascade-Correlation Learning Architecture, Fahlman 286-294
Learning to Predict by the Methods of Temporal Differences,
Sutton 295-330
A Theoretical Framework for Back-Propagation, le Cun 331-338
Two Problems with Backpropagation and other Steepest-Descent
Learning Procedures for Networks, Sutton 339-348
Chapter 5: LEARNING ALGORITHMS II
Introduction by editors 349-358
The Self-Organizing Map, Kohonen 359-375
The ART of Adaptive Pattern Recognition by a Self-Organizing
Neural Network, Grossberg 376-387
Unsupervised Learning in Noise, Kosko 388-401
A Learning Algorithm for Boltzmann Machines, Ackley, Hinton
& Sejnowski 402-424
Learning Algorithms and Probability Distributions in Feed-
forward and Feed-back Networks, Hopfield 425-429
A Mean Field Theory Learning Algorithm for Neural Networks,
Peterson & Anderson 430-454
On the Use of Backpropagation in Associative Reinforcement
Learning, Williams 455-462
Chapter 6: COMPUTATIONAL LEARNING THEORY
Introduction by editors 463-473
Information Theory, Complexity, and Neural Networks,
Abu-Mostafa 474-478
Geometrical and Statistical Properties of Systems of Linear
Inequalities with Applications in Pattern Recognition, Cover 479-487
Approximation by Superpositions of a Sigmoidal Function,
Cybenko 488-499
Approximation and Estimation Bounds for Artificial Neural
Networks, Barron 500-506
Generalizing the PAC Model: Sample Size Bounds From Metric
Dimension-based Uniform Convergence Results, Haussler 507-512
Complete Representations for Learning from Examples, Baum 513-534
A Statistical Approach to Learning and Generalization in
Neural Networks, Levin, Tishby & Solla 535-542
Chapter 7: STABILITY AND CONVERGENCE
Introduction by editors 543-550
Convergence in Neural Nets, Hirsch 551-561
Statistical Neurodynamics of Associative Memory, Amari &
Maginu 562-572
Stability and Adaptation in Artificial Neural Systems,
Schurmann 573-580
Dynamics and Architecture for Neural Computation, Pineda 581-610
Oscillations and Synchronizations in Neural Networks: An
Exploration of the Labeling Hypothesis, Atiya & Baldi 611-632
Chapter 8: EMPIRICAL STUDIES
Introduction by editors 633-639
Scaling Relationships in Back-Propagation Learning: Dependence
on Training Set Size, Tesauro 640-645
An Empirical Comparison of Pattern Recognition, Neural Nets, and
Machine Learning Classification Methods, Weiss & Kapouleas 646-652
Basins of Attraction of Neural Network Models, Keeler 653-657
Parallel Distributed Approaches to Combinatorial Optimization:
Benchmark Studies on Traveling Salesman Problem, Peterson 658-666
------------------------------
Subject: neural networks
From: corina@cs.tu-berlin.de (corina hartmann)
Organization: Technical University of Berlin, Germany
Date: 23 Jun 93 07:58:14 +0000
Hi neuro fans, I need some help!
I am working on constructing a neural network for smog forecasting.
1. Who has experience with forecasting time series using neural
networks and could help? I am interested in the following architectures:
- counterpropagation, forward-only
- backpropagation according to Wong
- Kohonen network
2. Who works with the tool NeuralWorks? I am looking for someone who
could advise me on this tool from time to time.
All suggestions are welcome.
Greetings corina :-)
..............................................................................
corina@cs.tu-berlin.de
Corina Hartmann, TU Berlin, Sekr. FR 5-1, Franklinstr. 28/29, W-1000 Berlin 10
..............................................................................
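[[ Editor's Note: For readers new to the time-series framing Corina
mentions, here is a minimal sketch of the usual sliding-window setup in
Python with NumPy. The synthetic data, window width of 8, and the small
one-hidden-layer backpropagation network are illustrative assumptions
only, not details of her project or of the Wong variant she names. -Ed. ]]

# Minimal sliding-window framing for time-series forecasting.
# Data, window width, and network size are illustrative assumptions.
import numpy as np

def make_windows(series, width):
    # Each input row holds `width` consecutive values; the target
    # is the value that immediately follows the window.
    X = np.array([series[i:i + width] for i in range(len(series) - width)])
    y = np.asarray(series[width:])
    return X, y

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0.0, 20.0, 400)) + 0.1 * rng.standard_normal(400)
X, y = make_windows(series, width=8)

# One-hidden-layer tanh network trained by full-batch gradient descent,
# a generic stand-in for the backpropagation variants mentioned above.
W1 = 0.1 * rng.standard_normal((8, 12)); b1 = np.zeros(12)
W2 = 0.1 * rng.standard_normal(12);      b2 = 0.0
lr = 0.05
for epoch in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden activations, shape (N, 12)
    pred = h @ W2 + b2                # linear output, shape (N,)
    err = pred - y                    # residuals
    # Gradients of the mean squared error, backpropagated by hand.
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h**2)
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final mse:", float((err**2).mean()))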
------------------------------
Subject: Request for Training Data
From: Allen Herron <aherron@vdoe386.vak12ed.edu>
Date: Wed, 23 Jun 93 16:57:25 -0500
I am looking for access to data to be used in training networks. I am
particularly interested in financial and sports data for building
networks. Do you know of any electronic resources that would provide
access to data for this purpose?
Allen Herron ...
------------------------------
Subject: Informational Brochure from Tufts University
From: cogstud@pearl.tufts.edu (Center for Cognitive Studies - Tufts)
Date: Wed, 23 Jun 93 17:00:02 -0500
Tufts University - Center for Cognitive Studies
Daniel C. Dennett, Director
11 Miner Hall
Medford, MA 02155-7059
(617) 627-3297
fax (617) 627-3952
email: cogstud@pearl.tufts.edu
"The Center for Cognitive Studies at Tufts University is a research unit,
offering no courses or degrees of its own, but providing an
administrative home for various research projects in cognitive studies
undertaken by the director and his associates. Marcel Kinsbourne,
Research Professor, has recently joined the Center on a permanent basis.
The Center hosts Visiting Fellows and Research Associates for varying
periods of time, ranging from a few weeks to a semester or academic year.
Visiting Fellows provide their own funding, while Research Associates
receive a modest stipend. Both are provided with library and clerical
support during their period at the Center. Research Associates are
typically advanced graduate students working on dissertations, or
engaging in related research, under the formal or informal supervision of
the Director or others at the Center."
The brochure states that the Center has preprints and reprints
available; requests for a listing of papers can be sent to the email
address listed above (electronic versions of some papers are also
available).
------------------------------
Subject: Neural Nets course
From: Joleen Barnhill <barnhill@Hudson.Stanford.EDU>
Date: Tue, 22 Jun 93 13:59:31 -0800
The Western Institute in Computer Science announces a one-week course in
Neural Networks to be held on the Stanford campus from August 16-20,
1993.
Aimed at those technical managers or engineers unfamiliar with Neural
Networks who are looking for a comprehensive overview of the field, the
course will provide an introduction and an understanding of basic
terminology. Included in the introduction will be several hands-on
sessions. The course will cover recent developments in the field and
their impact from an applications perspective. Students will be given
detailed coverage of the application of neural networks to scientific,
engineering, and commercial problems. Lastly, future directions of the
field will be discussed.
Throughout the course, students will have contact with leading
researchers, application engineers and managers experienced with the
technology.
INSTRUCTORS:
DR. DAVID BISANT received the Ph.D. from George Washington University and
is an Advanced Studies Fellow at Stanford University. He holds several
adjunct faculty positions and has held positions at International
Medical Corporation and with the Department of Defense. His research
involves neural network applications to image processing and sequence
analysis.
DR. DAVID RUMELHART is a Professor at Stanford University. He is a Fellow
of the NAS, the AAAS and the American Academy of Arts and Sciences. He
received a MacArthur Foundation Fellowship for his work on cognitive
modeling and he co-founded the Institute of Cognitive Science at UC-
San Diego. He received the Ph.D. from Stanford.
For a complete brochure of all WICS offerings, including this one, send
your name and mailing address to barnhill@hudson.Stanford.EDU or call
(916) 873-0575 from 8 a.m.-5 p.m. Pacific Time.
------------------------------
Subject: Edward C. Posner Memorial Fellowship Fund
From: Jim Bower <jbower@smaug.cns.caltech.edu>
Date: Thu, 24 Jun 93 11:17:20 -0800
Dear Colleague,
As you may be aware, last week Ed Posner was killed in a bicycle
accident in Pasadena. Those of us who knew him well feel the loss very
deeply, and Ed's strong commitment to the communities to which he
belonged makes his death all the more unfortunate.
Throughout his career, Ed was actively involved in academic education and
organization. For example, in recent years he was the first chairman
and principal organizer of the Neural Information Processing Systems
(NIPS) meeting. One of his many legacies to this meeting is a strong
commitment to student travel awards. He also spent much of the last year
of his life working hard to establish the NIPS Foundation so that NIPS as
well as other related meetings could be on sound financial and legal
footing.
In addition to his professional activities, Ed was also deeply
involved in educational activities at Caltech and JPL. His commitment to
students was legend at both institutions. Ed was particularly heavily
involved in the SURF program at Caltech through which undergraduates
(both from Caltech and from other institutions) carry out independent
research projects during the summer months. Ed was an active member of
the SURF Administrative Committee. He was also one of the most active
SURF research sponsors, having served as mentor to 13 students since
1984. Three more students were preparing to work with him this summer.
In addition, Ed co-founded the SURFSAT satellite program in 1986. In
this program, successive teams of SURF students are designing, building,
and testing a small communications satellite to support the research
objectives of NASA's Deep Space Network. Since its inception, 43
students have participated in SURFSAT.
Ed's persistent commitment to the scientific education of young
people stretches far and touches many. Just a few days prior to his
death, for example, Ed had begun to work with the Dean of Graduate
Education at Caltech to organize yet another educational program, in this
case to increase the number of underrepresented minorities in
engineering.
Given Ed's strong interest in science education, research, and
students, Mrs. Posner has asked that memorial gifts be designated to
support Caltech's SURF Program. It is our hope that gifts might
ultimately endow an Edward C. Posner SURF Fellowship. Once funded,
the Posner SURF Fellowship would annually support an underrepresented
minority student for a research project in a field related to Ed's own
professional interests.
Those individuals or institutions interested in making a
contribution to the Edward C. Posner SURF Fellowship Fund in his memory
should contact:
Dore Charbonneau
Director of Special Gifts
Development, mail code 105-40
Caltech
Pasadena, CA 91125
(818) 356-6285
Thank you for your interest and we hope to hear from you.
Carolyn Merkel
Director, SURF program
James M. Bower
Computation and Neural Systems Program-Caltech
------------------------------
End of Neuron Digest [Volume 11 Issue 41]
*****************************************