Neuron Digest Friday, 26 Feb 1993 Volume 11 : Issue 14
Today's Topics:
neural net applications to fixed-income security markets
connectionist models summer school -- final call for applications
RE: Neuron Digest V11 #8 (discussion + reviews + jobs)
Re: Speaker normalization and adaptation
A computationally efficient squashing function
BP network paralysis
Re: pattern recognition (practical database considerations)?
Re: pattern recognition (practical database considerations)
Computational Biology Degree Programs
postdoctoral traineeships available
Postdoc position in computational/biological vision (learning)
Position for Programmer/Analyst with Neural Networks (YALE)
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: neural net applications to fixed-income security markets
From: danlap@internet.sbi.com (Dan LaPushin)
Date: Mon, 08 Feb 93 18:26:43 -0500
[[ Editor's Note: Once again, neural nets have reached Wall Street, but
this time from quite a different angle. A cursory search in past issues
of the Digest turned up nothing of relevance. Perhaps one of our
faithful readers might either lend a helpful ear or get interested
in this as a new project via email! -PM ]]
To The Editor,
I am new to the field of neural networks, but I have a strong background
in mathematics and economics, and some computer programming experience.
I work at a large Wall St. firm and am interested in applying neural
network technology to the field of fixed-income research. Such
instruments include bonds, mortgage-backed securities, and the like.
There seems to be, as far as I can tell, little research into neural net
applications to such markets. I suspect this is because the data is hard
to come by for those not in the field, but I'm not sure. Could you
direct me to any research in this area so that I don't inadvertently
reinvent the wheel? Thanks for your help!
Dan LaPushin
I'm on your mailing list as danlap@sp_server.sbi.com
------------------------------
Subject: connectionist models summer school -- final call for applications
From: "Michael C. Mozer" <mozer@dendrite.cs.colorado.edu>
Date: Thu, 11 Feb 93 22:10:05 -0700
FINAL CALL FOR APPLICATIONS
CONNECTIONIST MODELS SUMMER SCHOOL
The University of Colorado will host the 1993 Connectionist
Models Summer School from June 21 to July 3, 1993. The purpose
of the summer school is to provide training to promising young
researchers in connectionism (neural networks) by leaders of the
field and to foster interdisciplinary collaboration. This will
be the fourth program in the series; earlier sessions were held
at Carnegie-Mellon in 1986 and 1988 and at UC San Diego in 1990.
Previous summer schools have been extremely successful, and we
look forward to the 1993 session in anticipation of another
exciting event.
The summer school will offer courses in many areas of
connectionist modeling, with emphasis on artificial intelligence,
cognitive neuroscience, cognitive science, computational methods,
and theoretical foundations. Visiting faculty (see list of
invited faculty below) will present daily lectures and tutorials,
coordinate informal workshops, and lead small discussion groups.
The summer school schedule is designed to allow for significant
interaction among students and faculty. As in previous years, a
proceedings of the summer school will be published.
Applications will be considered only from graduate students
currently enrolled in Ph.D. programs. About 50 students will be
accepted. Admission is on a competitive basis. Tuition will be
covered for all students, and we expect to have scholarships
available to subsidize housing and meal costs, but students are
responsible for their own travel arrangements.
Applications should include the following materials:
* a vita, including mailing address, phone number, electronic
mail address, academic history, list of publications (if any),
and relevant courses taken with instructors' names and grades
received;
* a one-page statement of purpose, explaining major areas of
interest and prior background in connectionist modeling and
neural networks;
* two letters of recommendation from individuals familiar with
the applicant's work (either mailed separately or in sealed
envelopes); and
* a statement from the applicant describing potential sources of
financial support available (department, advisor, etc.) for
travel expenses.
Applications should be sent to:
Connectionist Models Summer School
c/o Institute of Cognitive Science
Campus Box 344
University of Colorado
Boulder, CO 80309
All application materials must be received by March 1, 1993.
Admission decisions will be announced around April 15. If you
have specific questions, please write to the address above or
send e-mail to "cmss@cs.colorado.edu". Application materials
cannot be accepted via e-mail.
Organizing Committee
Jeff Elman (UC San Diego)
Mike Mozer (University of Colorado)
Paul Smolensky (University of Colorado)
Dave Touretzky (Carnegie Mellon)
Andreas Weigend (Xerox PARC and University of Colorado)
Additional faculty will include:
Yaser Abu-Mostafa (Cal Tech)
Sue Becker (McMaster University)
Andy Barto (University of Massachusetts, Amherst)
Jack Cowan (University of Chicago)
Peter Dayan (Salk Institute)
Mary Hare (Birkbeck College)
Cathy Harris (Boston University)
David Haussler (UC Santa Cruz)
Geoff Hinton (University of Toronto)
Mike Jordan (MIT)
John Kruschke (Indiana University)
Jay McClelland (Carnegie Mellon)
Ennio Mingolla (Boston University)
Steve Nowlan (Salk Institute)
Dave Plaut (Carnegie Mellon)
Jordan Pollack (Ohio State)
Dean Pomerleau (Carnegie Mellon)
Dave Rumelhart (Stanford)
Patrice Simard (AT&T Bell Labs)
Terry Sejnowski (UC San Diego and Salk Institute)
Sara Solla (AT&T Bell Labs)
Janet Wiles (University of Queensland)
The Summer School is sponsored by the American Association for
Artificial Intelligence, the National Science Foundation, Siemens
Research Center, and the University of Colorado Institute of
Cognitive Science.
Colorado has recently passed a law explicitly denying protection
for lesbians, gays, and bisexuals. However, the Summer School
does not discriminate in admissions on the basis of age, sex,
race, national origin, religion, disability, veteran status, or
sexual orientation.
------------------------------
Subject: RE: Neuron Digest V11 #8 (discussion + reviews + jobs)
From: rkeller@academic.cc.colorado.edu
Date: Sun, 14 Feb 93 12:18:03 -0700
James L. McClelland and David E. Rumelhart provide code for a Boltzmann
Machine in "Explorations in Parallel Distributed Processing: A Handbook
of Models, Programs, and Exercises". The code is written for the PC. It
is a Bradford Book, available from The MIT Press, and was designed to
provide exercises to accompany their two-volume "Parallel Distributed
Processing."
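
[[ Editor's note: for readers without the handbook, here is a minimal
sketch of the stochastic update rule and annealing loop that a Boltzmann
Machine simulator implements. This is illustrative Python of my own,
with made-up weights; it is not the book's PC code. ]]

    import numpy as np

    rng = np.random.default_rng(0)

    def boltzmann_sweep(s, W, b, T):
        """One asynchronous sweep: turn each unit on with probability
        1 / (1 + exp(-delta_E / T)), where delta_E is its energy gap."""
        for i in rng.permutation(len(s)):
            delta_e = W[i] @ s + b[i]    # energy gap for unit i on vs. off
            s[i] = rng.random() < 1.0 / (1.0 + np.exp(-delta_e / T))
        return s

    # Annealing: sweep while lowering the temperature T so the network
    # settles into a low-energy (high-probability) state.
    n = 8
    W = rng.normal(size=(n, n))
    W = (W + W.T) / 2.0                  # weights must be symmetric
    np.fill_diagonal(W, 0.0)             # and have no self-connections
    b = rng.normal(size=n)
    s = (rng.random(n) < 0.5).astype(float)
    for T in (4.0, 2.0, 1.0, 0.5, 0.25):
        s = boltzmann_sweep(s, W, b, T)
    print(s)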
------------------------------
Subject: Re: Speaker normalization and adaptation
From: Yaakov Stein <stein@galaxy.huji.ac.il>
Date: Wed, 17 Feb 93 07:51:07 +0200
Nico Weymaere (WEYMAERE@lem.rug.ac.be) asked for references on speaker
normalization and adaptation. While the idea of exploiting the self
organization and learning capabilities of neural networks for this task
seems quite natural, I have not seen much in the proceedings of NN
conferences. The related questions of speaker independent recognition and
speaker identification / verification have been far more thoroughly
treated in this literature. In the speech and signal processing
conferences more has appeared. A quick search through my article file
turned up the following :
Bridle JS and Cox SJ, RecNorm: Simultaneous Normalization and
Classification Applied to Speech Recognition, NIPS-3, 234-40 (1991)
Cox SJ and Bridle JS, Simultaneous Speaker Normalization and Utterance
Labelling Using Bayesian/Neural Net Techniques,
ICASSP-90 article S3.8, vol 1, 161-4 (1990)
Hampshire JB II and Waibel AH, The Meta-Pi Network: Connectionist Rapid
Adaptation for High Performance Multi-Speaker Phoneme Recognition,
ICASSP-90 article S3.9, vol 1, 165-8 (1990)
Fukuzawa K and Komori Y, A Segment-based Speaker Adaptation Neural Network
Applied to Continuous Speech Recognition, ICASSP-92, I 433-6 (1992)
Huang X, Speaker Normalization for Speech Recognition, ICASSP-92,
I 465-8 (1992)
Iso K, Asogawa M, Yoshida K and Watanabe T, Speaker Adaptation Using
Neural Network, Proc. Spring Meeting of Acoust. Soc. of Japan,
I 6-16 (March 1989) [in Japanese, quoted widely but I don't have it]
Konig Y and Morgan N, GDNN: A Gender-Dependent Neural Network for
Continuous Speech Recognition, IJCNN-92(Baltimore), II 332-7 (1992)
Montacie C, Choukri K and Chollet G, Speech Recognition using Temporal
Decomposition and Multilayer Feedforward Automata,
ICASSP-89 article S8.6, vol 1, 409-12 (1989)
Nakamura S and Akabane T, A Neural Speaker Model for Speaker Clustering,
ICASSP-91 article S13.6, vol 2, 853-856 (1991)
Nakamura S and Shikano K, Speaker Adaptation Applied to HMM and
Neural Networks, ICASSP-89 article S3.3, vol 1, 89-92 (1989)
Nakamura S and Shikano K, A Comparative Study of Spectral Mapping for
Speaker Adaptation, ICASSP-90 article S3.7, vol 1, 157-160 (1990)
Schmidbauer O and Tebelskis J, An LVQ Based Reference Model for Speaker
Adaptive Speech Recognition, ICASSP-92, I 441-4 (1992)
Witbrock M and Hoffman P, Rapid Connectionist Speaker Adaptation,
ICASSP-92, pp. I 453-6 (1992)
Hope this helps.
Yaakov Stein
------------------------------
Subject: A computationally efficient squashing function
From: "Michael P. Perrone" <mpp@cns.brown.edu>
Date: Thu, 18 Feb 93 15:42:34 -0500
Recently on the comp.ai.neural-nets bboard, there has been a discussion
of more computationally efficient squashing functions. Some colleagues
of mine suggested that many members of this mailing list may not have
access to the comp.ai.neural-nets bboard; so I have included a summary
below.
Michael
- ------------------------------------------------------
David L. Elliott mentioned using the following neuron activation function:

    f(x) = x / (1 + |x|)

He argues that this function has the same qualitative properties as the
hyperbolic tangent function but is faster to calculate in practice.

I have suggested a similar speed-up for radial basis function networks:

    f(x) = 1 / (1 + x^2)

which avoids the transcendental calculation associated with Gaussian RBF
nets.
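
[[ Editor's note: a minimal numeric check of the claims above; the
function names are mine. Note that the derivative of x/(1+|x|) is
1/(1+|x|)^2, so the backward pass avoids transcendentals as well. ]]

    import numpy as np

    def fast_tanh(x):
        # Elliott's squashing function: sigmoid-shaped with range (-1, 1)
        # like tanh, but needs only an absolute value and one divide.
        return x / (1.0 + np.abs(x))

    def fast_tanh_deriv(x):
        # d/dx [x / (1 + |x|)] = 1 / (1 + |x|)^2 -- also exp-free, which
        # matters because backprop evaluates the derivative constantly.
        return 1.0 / (1.0 + np.abs(x)) ** 2

    def fast_rbf(r):
        # The RBF speed-up: peaks at 1 when r = 0 and falls off like a
        # Gaussian bump, but with heavier tails.
        return 1.0 / (1.0 + r ** 2)

    x = np.linspace(-5.0, 5.0, 101)
    # Same sign, monotonicity, and asymptotes as tanh, though it
    # saturates more slowly; the worst-case gap on [-5, 5] is about 0.31.
    print(np.max(np.abs(fast_tanh(x) - np.tanh(x))))
    print(np.max(np.abs(fast_rbf(x) - np.exp(-x ** 2))))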
I have run simulations using the above squashing function in various
backprop networks. The performance is comparable (sometimes worse,
sometimes better) to that of the usual hyperbolic tangent training. I
also found that network performance varied very little when the
activation functions were switched (i.e., two networks with identical
weights but different activation functions have comparable performance
on the same data). I tested these results on two databases: the NIST OCR
database (preprocessed by Nestor Inc.) and the Turk and Pentland human
face database.
- --------------------------------------------------------------------------------
Michael P. Perrone Email: mpp@cns.brown.edu
Institute for Brain and Neural Systems Tel: 401-863-3920
Brown University Fax: 401-863-3934
Providence, RI 02912
------------------------------
Subject: BP network paralysis
From: slablee@mines.u-nancy.fr
Date: Fri, 19 Feb 93 13:55:35 +0700
Dear Netters,
My English is a bit frenchy (!), so please excuse some poor sentences!

I'm trying to use a NN to detect the start of a sampled signal.
I'm using a 7520 x 150 x 70 x 1 backpropagation network.
My problem is: whatever parameters I choose (learning rate,
momentum...), the network stops learning with a rather high error
(about 0.4 for each unit, on a [0,1] range!).

I thought of two possible problems:
- network "paralysis" (as described by Rumelhart), caused by weights
that are too large. This leads to activations near 0 or 1, preventing
the weights from changing, since the changes are proportional to
a(1-a); see the sketch after this list. But the weights of my network
always have moderate values...
- a local minimum. But a large learning rate seems to change nothing,
and adding noise to the input units brought no success. I've also tried
changing the number of hidden units, but the local minima are always
there, even if lower.
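
[[ Editor's note: a small numeric illustration of the a(1-a) effect
mentioned in the first point. The factor peaks at 0.25 when the net
input is 0 and collapses toward zero as |net| grows; with 7520 input
units even modest weights can yield a large net input, so rescaling the
inputs or shrinking the initial weights may be worth trying. ]]

    import numpy as np

    def logistic(net):
        return 1.0 / (1.0 + np.exp(-net))

    for net in (0.0, 2.0, 5.0, 10.0):
        a = logistic(net)
        print("net = %5.1f   a = %.5f   a*(1-a) = %.6f"
              % (net, a, a * (1 - a)))

    # net =   0.0   a = 0.50000   a*(1-a) = 0.250000
    # ...
    # net =  10.0   a = 0.99995   a*(1-a) = 0.000045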
Who can help me escape from this problem?
--------------------------------------------------
| Stephane Lablee |
| ***** |
| Ecole des Mines de Nancy |
| Parc de Saurupt |
| 54042 Nancy Cedex |
| France |
| ***** |
| E-mail : slablee@mines.u-nancy.fr |
--------------------------------------------------
------------------------------
Subject: Re: pattern recognition (practical database considerations)?
From: gray@itd.nrl.navy.mil (Jim Gray)
Date: Fri, 19 Feb 93 09:33:19 -0500
Duane A. White writes:
> I am interested in pattern recognition. In my particular application I
> would like to compare a 2D monochrome bitmap image with those in a
> database. The program should determine if there is a match, and if not
> then add it to the database. Most of the literature I've read on pattern
> matching networks use a relatively small set of classification patterns
> (such as letters of the alphabet, numbers). In my case it wouldn't seem
> practical to train a single network to identify every entry in the
> database (on the order of hundreds or thousands of entries). Is there
> something fundamental in the approach that I'm missing?
You might try looking at Adaptive Resonance Theory (ART).
A good place to start is the book:
Carpenter and Grossberg, eds., Pattern Recognition by Self-Organizing
Neural Networks, The MIT Press, Cambridge, MA (1991)
ISBN 0-262-03176-0
I'm not sure whether ART networks can be applied to "thousands of entries"
in practice, but the basic operation is as you describe: the network
determines if there is a match, and if not, then adds it to the database.
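
[[ Editor's note: to make the match/reset/commit cycle concrete, here
is a minimal ART1-style sketch using the standard choice and vigilance
computations for binary inputs; the parameter values and names are my
own, not from the book. ]]

    import numpy as np

    def art1_present(pattern, prototypes, rho=0.8, beta=0.5):
        """Present a binary pattern; return the index of the category
        that accepts it, committing a new category if none does."""
        pattern = np.asarray(pattern, dtype=bool)
        assert pattern.any(), "input needs at least one active bit"
        # Rank existing categories by the choice function T_j.
        scores = [np.sum(pattern & w) / (beta + np.sum(w))
                  for w in prototypes]
        for j in np.argsort(scores)[::-1]:
            w = prototypes[j]
            # Vigilance test: what fraction of the input does w cover?
            if np.sum(pattern & w) / np.sum(pattern) >= rho:
                prototypes[j] = pattern & w   # learn by intersection
                return j
            # otherwise "reset" this category and try the next-best one
        prototypes.append(pattern.copy())     # novelty: commit category
        return len(prototypes) - 1

    # A growing database of binary images, flattened to vectors:
    prototypes = []
    for img in (np.eye(3), np.eye(3), np.ones((3, 3))):
        print(art1_present(img.flatten() > 0, prototypes))
    # -> 0, 0, 1: the repeat matches its category, the novel one is added

Since memory grows by one prototype per novel input, scaling to
thousands of entries is mainly a storage-and-search question rather
than a retraining one, which may speak to the original concern about
training one fixed network on every entry.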
> Also the program should to a small degree be rotation and translation
> invariant.
I'm not sure whether ART networks have been applied to this type of
problem, but you might try looking at:
Hinton, "A Parallel Computation that assigns Canonical Object-Based
Frames of Reference", in Proceedings of the International Joint
Conference on Artificial Intelligence, 1981, pp. 683-685.
Jim Gray.
------------------------------
Subject: Re: pattern recognition (practical database considerations)
From: shsbishp@reading.ac.uk
Date: Tue, 23 Feb 93 11:26:45 +0000
>I am interested in pattern recognition. In my particular application I
>would like to compare a 2D monochrome bitmap image with those in a
>database. The program should determine if there is a match, and if not
>then add it to the database. Most of the literature I've read on pattern
>matching networks use a relatively small set of classification patterns
>(such as letters of the alphabet, numbers). In my case it wouldn't seem
>practical to train a single network to identify every entry in the
>database (on the order of hundreds or thousands of entries). Is there
>something fundamental in the approach that I'm missing?
>
>Also the program should to a small degree be rotation and translation
>invariant.
Having just perused today's Neuron Digest (Vol. 11, No. 12), I noticed
the above plea for help. Being unable to email the sender directly, I
enclose the following information for the list.
As part of my doctoral research I developed a neural architecture (the
Stochastic Search Network) for use on this type of problem: Anarchic
Techniques for Pattern Classification, PhD thesis, University of
Reading, UK, 1989. A recent reference on this work is: Bishop, J.M. &
Torr, P., in Lingard, R., Myers, D.J. & Nightingale, C. (eds), Neural
Networks for Vision, Speech & Natural Language, Chapman Hall,
pp. 370-388.
For further information please email to (shsbishp@uk.ac.rdg) or write to
Dr. J.M.Bishop, Department of Cybernetics, University of Reading,
Berkshire, UK.
------------------------------
Subject: Computational Biology Degree Programs
From: georgep@rice.edu (George Phillips)
Organization: Rice University
Date: 05 Feb 93 15:38:31 +0000
The W.M. Keck Center for Computational Biology offers studies in
Computational Biology through three partner institutions: Rice
University, Baylor College of Medicine, and the University of Houston.
Science and engineering are being transformed by the power of new
computing technologies. Our goal is to train a new kind of
scientist--one poised to seize the advantages of the nation's
computational prowess in solving important problems in biology.
The program emphasizes algorithm development, computation, and
visualization in biology, biochemistry, and biophysics. The program
draws on the intellectual and technological resources of the Center for
Research on Parallel Computation at Rice, the Human Genome Center at
Baylor College of Medicine, and the Institute for Molecular Design at
the University of Houston, among others.
The research groups involved in the W.M. Keck Center for Computational
Biology are at the forefronts of their respective areas, and their
laboratories are outstanding settings for the program.
A list of participating faculty and application information can be
obtained by sending email to georgep@rice.edu.
======================================+=======================================
Prof. George N. Phillips, Jr., Ph.D. | InterNet: georgep@rice.edu
Dept. of Biochemistry and Cell Biology|
Rice University, P.O. Box 1892 | Phone: (713) 527-4910
Houston, Texas 77251 | Fax: (713) 285-5154
------------------------------
Subject: postdoctoral traineeships available
From: "John K. Kruschke" <KRUSCHKE@ucs.indiana.edu>
Date: Tue, 09 Feb 93 09:45:45 -0500
POST-DOCTORAL FELLOWSHIPS AT INDIANA UNIVERSITY
Postdoctoral Traineeships in MODELING OF COGNITIVE PROCESSES
Please call this notice to the attention of all interested parties.
The Psychology Department and Cognitive Science Programs at Indiana
University are pleased to announce the availability of one or more
Postdoctoral Traineeships in the area of Modeling of Cognitive
Processes. The appointment will pay rates appropriate for a new PhD
(about $18,800), and will be for one year, starting after July 1,
1993. The duration could be extended to two years if a training grant
from NIH is funded as anticipated (we should receive final
notification by May 1).
Post-docs are offered to qualified individuals who wish to further
their training in mathematical modeling or computer simulation
modeling, in any substantive area of cognitive psychology or Cognitive
Science.
We are particularly interested in applicants with strong
mathematical, scientific, and research credentials. Indiana University
has superb computational and research facilities, and faculty with
outstanding credentials in this area of research, including Richard
Shiffrin and James Townsend, co-directors of the training program, and
Robert Nosofsky, Donald Robinson, John Castellan, John Kruschke,
Robert Goldstone, Geoffrey Bingham, and Robert Port.
Trainees will be expected to carry out original theoretical and
empirical research in association with one or more of these faculty
and their laboratories, and to interact with other relevant faculty
and the other pre- and postdoctoral trainees.
Interested applicants should send an up-to-date vita, a personal
letter describing their specific research interests, relevant
background, goals, and career plans, and reference letters from two
individuals. Relevant reprints and preprints should also be sent.
Women, minority group members, and handicapped individuals are urged
to apply. PLEASE NOTE: The conditions of our anticipated grant
restrict awards to US citizens, or current green card holders. Awards
will also have a 'payback' provision, generally requiring awardees to
carry out research or teach for an equivalent period after termination
of the traineeship. Send all materials to:
Professors Richard Shiffrin and James Townsend,
Program Directors
Department of Psychology, Room 376B
Indiana University
Bloomington, IN 47405
We may be contacted at:
812-855-2722;
Fax: 812-855-4691
email: shiffrin@ucs.indiana.edu
Indiana University is an Affirmative Action Employer
------------------------------
Subject: Postdoc position in computational/biological vision (learning)
From: "John G. Harris" <harris@ai.mit.edu>
Date: Tue, 16 Feb 93 18:50:28 -0500
One (or possibly two) postdoctoral positions are available for one or
two years in computational vision, starting September 1993 (flexible).
The postdoc will work in Lucia Vaina's laboratory at Boston University,
College of Engineering, conducting research on the learning of
direction in global motion. The researchers currently involved in this
project are Lucia M. Vaina, John Harris, Charlie Chubb, Bob Sekuler,
and Federico Girosi.
Requirements are a PhD in CS or a related area, with experience in
visual modeling or psychophysics. Knowledge of biologically relevant
neural models is desirable. The stipend ranges from $28,000 to $35,000,
depending upon qualifications. The deadline for application is March 1,
1993. Two letters of recommendation, a description of current research,
and an up-to-date CV are required.
In this research we combine computational psychophysics, neural network
modeling, and analog VLSI to study visual learning, specifically
applied to direction in global motion. The global motion problem
requires estimation of the direction and magnitude of coherent motion
in the presence of noise. We are proposing a set of psychophysical
experiments in which the subject, or the network, must integrate noisy,
spatially local motion information from across the visual field in
order to generate a response. We will study the classes of neural
networks which best approximate the pattern of learning demonstrated in
psychophysical tasks. We will explore Hebbian learning, multilayer
perceptrons (e.g. backpropagation), cooperative networks, and radial
basis function and hyper basis function networks. The various
strategies and their implementations will be evaluated on the basis of
their performance and their biological plausibility.
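
[[ Editor's note: an illustrative toy version (my construction, not the
lab's actual stimulus protocol) of the integration problem described
above: only a fraction of the local motion signals is coherent, yet
pooling them across the field recovers the global direction. ]]

    import numpy as np

    rng = np.random.default_rng(0)

    def global_motion_estimate(true_dir, coherence=0.2, n=500):
        """A fraction `coherence` of the dots moves in direction
        true_dir (radians); the rest move in random directions."""
        k = int(coherence * n)
        angles = np.concatenate([np.full(k, true_dir),
                                 rng.uniform(0.0, 2.0 * np.pi, n - k)])
        # Integrate across the "visual field" by averaging unit vectors.
        return np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())

    print(global_motion_estimate(true_dir=1.0, coherence=0.2))
    # close to 1.0 although 80% of the local signals are pure noise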
For more details, contact Prof. Lucia M. Vaina at vaina@buenga.bu.edu or
lmv@ai.mit.edu.
------------------------------
Subject: Position for Programmer/Analyst with Neural Networks (YALE)
From: Anand Rangarajan <rangarajan-anand@CS.YALE.EDU>
Date: Thu, 18 Feb 93 13:18:40 -0500
Programmer/Analyst Position
in Artificial Neural Networks
The Yale Center for Theoretical
and Applied Neuroscience (CTAN)
and the
Department of Computer Science
Yale University, New Haven, CT
We are offering a challenging position in software engineering in support of
new techniques in image processing and computer vision using artificial neural
networks (ANNs).
1. Basic Function:
Designer and programmer for computer vision and neural network
software at CTAN and the Computer Science department.
2. Major duties:
(a) To implement computer vision algorithms using Khoros or a similar
environment.
(b) To use the aforementioned tools and environment to run and analyze
computer experiments in specific image processing and vision application
areas.
(c) To facilitate the improvement of neural network algorithms and
architectures for vision and image processing.
3. Position Specifications:
(a) Education:
BA, including linear algebra, differential equations, and calculus;
mathematical optimization is helpful.
(b) Experience:
Programming experience in C++ (or C) under UNIX, plus some of the
following: neural networks, vision or image processing applications,
scientific computing, workstation graphics, image processing
environments, parallel computing, computer algebra, and object-oriented
design.
Preferred starting date: March 1, 1993.
For information or to submit an application, please write:
Eric Mjolsness
Department of Computer Science
Yale University
P. O. Box 2158 Yale Station
New Haven, CT 06520-2158
e-mail: mjolsness-eric@cs.yale.edu
Any application must also be submitted to:
Jeffrey Drexler
Department of Human Resources
Yale University
155 Whitney Ave.
New Haven, CT 06520
-Eric Mjolsness and Anand Rangarajan
(prospective supervisors)
------------------------------
End of Neuron Digest [Volume 11 Issue 14]
*****************************************