Neuron Digest Wednesday, 8 Dec 1993 Volume 12 : Issue 23
Today's Topics:
book announcement
Reference list
Beginner question: ANNs and cellular signal transduction?
ANNs and robotics
Matrix Back Prop (MBP) available
Cognitive Neuroscience Archives via ftp
Job openings
Conferences Aug-Sep 1994
Adaptive Control and Neural Networks applied to Power Systems
Cellular Neural Networks mailing list
ANNs for DEC Alpha/PC-Solaris
please help - Lattice Corpus training sets requested
Lectureship in St Andrews
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: book announcement
From: "Michael C. Mozer" <mozer@dendrite.cs.colorado.edu>
Date: Wed, 17 Nov 93 10:12:13 -0700
In case you don't already have enough to read, the following volume is now
available:
Mozer, M., Smolensky, P., Touretzky, D., Elman, J., & Weigend, A. (Eds.).
(1994). _Proceedings of the 1993 Connectionist Models Summer School_.
Hillsdale, NJ: Erlbaum Associates.
The table of contents is listed below.
For prepaid orders by check or credit card, the price is $49.95 US.
Orders may be made by e-mail to "orders@leanhq.mhs.compuserve.com", by
fax to (201) 666 2394, or by calling 1 (800) 926 6579.
Include your credit card number, type, expiration date, and refer to
"ISBN 1590-2".
- -------------------------------------------------------------------------------
Proceedings of the 1993 Connectionist Models Summer School
Table of Contents
- -------------------------------------------------------------------------------
NEUROSCIENCE
Sigma-pi properties of spiking neurons / Thomas Rebotier and Jacques Droulez
Towards a computational theory of rat navigation /
Hank S. Wan, David S. Touretzky, and A. David Redish
Evaluating connectionist models in psychology and neuroscience / H. Tad Blair
VISION
Self-organizing feature maps with lateral connections: Modeling ocular
dominance / Joseph Sirosh and Risto Miikkulainen
Joint solution of low, intermediate, and high level vision tasks by global
optimization: Application to computer vision at low SNR /
Anoop K. Bhattacharjya and Badrinath Roysam
COGNITIVE MODELING
Learning global spatial structures from local associations /
Thea B. Ghiselli-Crippa and Paul W. Munro
A connectionist model of auditory Morse code perception / David Ascher
A competitive neural network model for the process of recurrent choice /
Valentin Dragoi and J. E. R. Staddon
A neural network simulation of numerical verbal-to-arabic transcoding /
A. Margrethe Lindemann
Combining models of single-digit arithmetic and magnitude comparison /
Thomas Lund
Neural network models as tools for understanding high-level cognition:
Developing paradigms for cognitive interpretation of neural network models /
Itiel E. Dror
LANGUAGE
Modeling language as sensorimotor coordination
F. James Eisenhart
Structure and content in word production: Why it's hard to say dlorm
Anita Govindjee and Gary Dell
Investigating phonological representations: A modeling agenda
Prahlad Gupta
Part-of-speech tagging using a variable context Markov model
Hinrich Schutze and Yoram Singer
Quantitative predictions from a constraint-based theory of syntactic ambiguity
resolution
Michael Spivey-Knowlton
Optimality semantics
Bruce B. Tesar
SYMBOLIC COMPUTATION AND RULES
What's in a rule? The past tense by some other name might be called
a connectionist net
Kim G. Daugherty and Mary Hare
On the proper treatment of symbolism--A lesson from linguistics
Amit Almor and Michael Rindner
Structure sensitivity in connectionist models
Lars F. Niklasson
Looking for structured representations in recurrent networks
Mihail Crucianu
Back propagation with understandable results
Irina Tchoumatchenko
Understanding neural networks via rule extraction and pruning
Mark W. Craven and Jude W. Shavlik
Rule learning and extraction with self-organizing neural networks
Ah-Hwee Tan
RECURRENT NETWORKS AND TEMPORAL PATTERN PROCESSING
Recurrent networks: State machines or iterated function systems?
John F. Kolen
On the treatment of time in recurrent neural networks
Fred Cummins and Robert F. Port
Finding metrical structure in time
J. Devin McAuley
Representations of tonal music: A case study in the development of temporal
relationships
Catherine Stevens and Janet Wiles
Applications of radial basis function fitting to the analysis of
dynamical systems
Michael A. S. Potts, D. S. Broomhead, and J. P. Huke
Event prediction: Faster learning in a layered Hebbian network with memory
Michael E. Young and Todd M. Bailey
CONTROL
Issues in using function approximation for reinforcement learning
Sebastian Thrun and Anton Schwartz
Approximating Q-values with basis function representations
Philip Sabes
Efficient learning of multiple degree-of-freedom control problems with
quasi-independent Q-agents
Kevin L. Markey
Neural adaptive control of systems with drifting parameters
Anya L. Tascillo and Victor A. Skormin
LEARNING ALGORITHMS AND ARCHITECTURES
Temporally local unsupervised learning: The MaxIn algorithm for maximizing
input information
Randall C. O'Reilly
Minimizing disagreement for self-supervised classification
Virginia R. de Sa
Comparison of two unsupervised neural network models for redundancy reduction
Stefanie Natascha Lindstaedt
Solving inverse problems using an EM approach to density estimation
Zoubin Ghahramani
Estimating a-posteriori probabilities using stochastic network models
Michael Finke and Klaus-Robert Muller
LEARNING THEORY
On overfitting and the effective number of hidden units
Andreas S. Weigend
Increase of apparent complexity is due to decrease of training set error
Robert Dodier
Momentum and optimal stochastic search
Genevieve B. Orr and Todd K. Leen
Scheme to improve the generalization error
Rodrigo Garces
General averaging results for convex optimization
Michael P. Perrone
Multitask connectionist learning
Richard A. Caruana
Estimating learning performance using hints
Zehra Cataltepe and Yaser S. Abu-Mostafa
SIMULATION TOOLS
A simulator for asynchronous Hopfield models
Arun Jagota
An object-oriented dataflow approach for better designs of neural
net architectures
Alexander Linden
------------------------------
Subject: Reference list
From: Jacob Sparre Andersen <sparre@connect.nbi.dk>
Date: Thu, 18 Nov 93 16:13:05 +0700
We're writing a paper on learning strategic games with neural nets and
other optimization methods.
We've collected some references, but we hope that we can get some help
improving our reference list.
Regards,
Jacob Sparre Andersen and Peer Sommerlund
Here's our list of references (some not complete):
Justin A. Boyan (1992): "Modular Neural Networks for Learning
Context-Dependent Game Strategies", Department of Engineering and
Computer Laboratory, University of Cambridge, 1992, Cambridge, England
Bernd Bruegmann (1993): "Monte Carlo Go", unpublished?
Herbert Enderton (1989?): "The Golem Go Program"
B. Freisleben (1992): "Teaching a Neural Network to Play GO-MOKU,"
Artificial Neural Networks 2, proceedings of ICANN '92, editors: I.
Aleksander and J. Taylor, pp. 1659-1662, Elsevier Science Publishers,
1992
W.T.Katz and S.P.Pham (1991): "Experience-Based Learning Experiments using
Go-moku", Proc. of the 1991 IEEE International Conference on Systems,
Man, and Cybernetics, 2: 1405-1410, October 1991.
M. Kohle & F. Schonbauer (19??): "Experience gained with a neural network
that learns to play bridge", Proc. of the 5th Austrian Artificial
Intelligence meeting, pp. 224-229.
Kai-Fu Lee and Sanjoy Mahajan (1988): "A Pattern Classification Approach to
Evaluation Function Learning", Artificial Intelligence, 1988, vol 36,
pp. 1-25.
Barney Pell (1992?): ""
Pell has done some work on machine learning for Go.
Article available by ftp.
A.L. Samuel (1959): "Some studies in machine learning using the game of
checkers", IBM journal of Research and Development, vol 3, nr. 3, pp.
210-229, 1959.
A.L. Samuel (1967): "Some studies in machine learning using the game of
checkers 2 - recent progress", IBM journal of Research and Development,
vol 11, nr. 6, pp. 601-616, 1967.
David Stoutamire (19??):
has written a thesis on machine learning applied to Go.
G. Tesauro (1989): "Connectionist learning of expert preferences by
comparison training", Advances in NIPS 1, 99-106 1989
G. Tesauro & T.J. Sejnowski (1989): "A Parallel Network that learns to play
Backgammon", Artificial Intelligence, vol 39, pp. 357-390, 1989.
G. Tesauro & T.J. Sejnowski (1990): "Neurogammon: A Neural
Network Backgammon Program", IJCNN Proceedings, vol 3, pp. 33-39, 1990.
Machine Learning carries an article in which Tesauro discusses
temporal difference learning (i.e. training a net from scratch by
having it play a copy of itself). The program he developed is called
"TD-gammon":
G. Tesauro (1991): "Practical Issues in Temporal Difference
Learning", IBM Research Report RC17223(#76307 submitted) 9/30/91; see
also the special issue on Reinforcement Learning of the Machine
Learning Journal 1992, where it also appears.
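The self-play scheme just mentioned can be sketched roughly as follows. This is not Tesauro's code: the toy "race to 15" game, the bit-vector features, and the greedy policy are all invented purely to illustrate TD(0) updates driven by a network playing a copy of itself.

```python
# Toy TD(0) self-play sketch (illustration only, not TD-gammon).
N_FEATURES = 4
weights = [0.0] * N_FEATURES
ALPHA = 0.1  # learning rate

def features(state):
    # Hypothetical encoding: the bits of the state index.
    return [float((state >> i) & 1) for i in range(N_FEATURES)]

def value(state):
    # Linear value estimate V(s) = w . phi(s).
    return sum(w * f for w, f in zip(weights, features(state)))

def td_update(state, next_state, reward, done):
    # TD(0): nudge V(state) toward reward + V(next_state);
    # at a terminal state the target is the reward alone.
    global weights
    target = reward + (0.0 if done else value(next_state))
    error = target - value(state)
    weights = [w + ALPHA * error * f
               for w, f in zip(weights, features(state))]

def greedy_move(state):
    # Both "players" use the same value function: the net plays a
    # copy of itself, so the training signal comes from self-play.
    moves = [(state + step) % 16 for step in (1, 2)]
    return max(moves, key=value)

def self_play_episode():
    state = 0
    for _ in range(20):
        next_state = greedy_move(state)
        done = next_state == 15          # reaching 15 wins
        td_update(state, next_state, 1.0 if done else 0.0, done)
        state = next_state
        if done:
            return

for _ in range(50):
    self_play_episode()
```

After training, states closer to the terminal state acquire higher estimated values, purely from the self-generated temporal differences.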
He Yo, Zhen Xianjun, Ye Yizheng, Li Zhongrong (1990): "Knowledge
acquisition and reasoning based on neural networks - the research of a
bridge bidding system", INNC '90, Paris, vol 1, pp. 416-423.
The annual computer olympiad involves tournaments in a variety of
games. These publications contain a wealth of interesting articles:
Heuristic Programming in Artificial Intelligence -
the first computer olympiad
D.N.L. Levy & D.F. Beal eds.
Ellis Horwood ltd, 1989.
Heuristic Programming in Artificial Intelligence 2 -
the second computer olympiad
D.N.L. Levy & D.F. Beal eds.
Ellis Horwood, 1991.
Heuristic Programming in Artificial Intelligence 3 -
the third computer olympiad
H.J. van den Herik & L.V. Allis eds.
Ellis Horwood, 1992.
- --------------------------------------------------------------------------
Jacob Sparre Andersen, Niels Bohr Institute, University of Copenhagen.
E-mail: sparre@connect.nbi.dk - Fax: (+45) 35 32 04 60
- --------------------------------------------------------------------------
Peer Sommerlund, Department of Computer science, University of Copenhagen.
E-mail: peso@connect.nbi.dk
- --------------------------------------------------------------------------
------------------------------
Subject: Beginner question: ANNs and cellular signal transduction?
From: slaumas@world.std.com (sandeep laumas)
Date: Thu, 18 Nov 93 23:30:12 -0500
Hi Peter,
I am a medical student with a background in biomedical research. I have
worked in the area of signal transduction, i.e. receptors, ligands
(growth factors etc.), protein kinases, G proteins and transcription
factors.
I am interested in applying some of the information theory and neural
network concepts to signal transduction in the cell. Sound crazy? Well,
I have not been able to find anyone who works directly in this area.
Maybe your group might help.
You see, the information transmitted from the membrane to the nucleus
goes through a variety of pathways and finally converges on the
nucleus to turn on a set of target genes, which might signal the cell to
divide, move, ruffle the membrane, and a myriad of other things. There is
more I can tell you, but is the above specific enough, do you think? Please
let me know. Thanks. Sandeep.
------------------------------
Subject: ANNs and robotics
From: mosi@cca.pue.udlap.mx (Mosi Tatupu)
Date: Fri, 19 Nov 93 12:01:42 -0600
Hello.
I am looking for papers about applications of
neural networks in robotics (or to robots).
- Do you know of any places (or people) that work on this combination?
- Can you send me the names of places (ftp sites or similar) where papers or information can be found?
Thank you very much.
Please send information to:
Angel Rico Guzman
arico@cca.pue.udlap.mx
arico@rico.pue.udlap.mx
------------------------------
Subject: Matrix Back Prop (MBP) available
From: Davide Anguita <anguita@dibe.unige.it>
Date: Sat, 20 Nov 93 16:14:48 -0500
Matrix Back Propagation v1.1 is finally available.
This code implements (in C language) the algorithms described in:
D.Anguita, G.Parodi, R.Zunino - An efficient implementation of BP on RISC-
based workstations. Neurocomputing, in press.
D.Anguita, G.Parodi, R.Zunino - Speed improvement of the BP on current
generation workstations. WCNN '93, Portland.
D.Anguita, G.Parodi, R.Zunino - YPROP: yet another accelerating technique
for the bp. ICANN '93, Amsterdam.
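The "matrix" formulation these papers exploit amounts to running the forward and backward passes over a whole batch as matrix products, so the inner loops vectorize well. As a rough illustration of that idea only (this is not MBP's code; the tiny two-layer network, toy data, and parameter choices are invented for this sketch):

```python
# Batch back-propagation written as matrix products (toy sketch).
import math
import random

random.seed(1)

def matmul(A, B):
    # Plain-Python matrix product; in MBP-style code this is the hot
    # spot that an optimized matrix routine would handle.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def sigmoid(v):
    v = max(-30.0, min(30.0, v))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-v))

# Batch of 4 input patterns (rows): 2 inputs, 3 hidden units, 1 output.
X = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
T = [[0.0], [1.0], [1.0], [0.0]]
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
W2 = [[random.uniform(-1, 1)] for _ in range(3)]
lr = 0.5

for _ in range(200):
    # Forward pass over the whole batch: two matrix products.
    H = [[sigmoid(v) for v in row] for row in matmul(X, W1)]
    Y = [[sigmoid(v) for v in row] for row in matmul(H, W2)]
    # Backward pass: output deltas, back-propagated hidden deltas,
    # and weight gradients are again matrix products.
    dY = [[(y - t) * y * (1.0 - y) for y, t in zip(ry, rt)]
          for ry, rt in zip(Y, T)]
    dH = [[v * h * (1.0 - h) for v, h in zip(rv, rh)]
          for rv, rh in zip(matmul(dY, transpose(W2)), H)]
    G2 = matmul(transpose(H), dY)
    G1 = matmul(transpose(X), dH)
    W2 = [[w - lr * g for w, g in zip(rw, rg)] for rw, rg in zip(W2, G2)]
    W1 = [[w - lr * g for w, g in zip(rw, rg)] for rw, rg in zip(W1, G1)]
```

The point is that every step is a matrix product, so replacing `matmul` with a tuned library routine speeds up the whole algorithm.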
To retrieve the code:
ftp risc6000.dibe.unige.it <130.251.89.154>
anonymous
<your address as password>
cd pub
bin
get MBPv1.1.tar.Z <or get MBPv11.zip for DOS machines>
quit
uncompress MBPv1.1.tar.Z <or pkunzip mbpv11.zip>
tar -xvf MBPv1.1.tar
Then print the file mbpv11.ps (PostScript).
Send comments (or flames) to the address below.
Good luck.
Davide.
========================================================================
Davide Anguita DIBE
Phone: +39-10-3532192 University of Genova
Fax: +39-10-3532175 Via all'Opera Pia 11a
e-mail: anguita@dibe.unige.it 16145 Genova, ITALY
From Dec. 1st I will be at ICSI, Berkeley, USA, but I should have no problem
answering mail from here.
========================================================================
------------------------------
Subject: Cognitive Neuroscience Archives via ftp
From: Phil Hetherington <het@blaise.psych.mcgill.ca>
Date: Sat, 20 Nov 93 19:05:04 -0500
****************************
*** General Announcement ***
****************************
The Cognitive Neuroscience monthly archives are now available via anonymous
ftp. Monthly archives are located in two directories, /pub/cogneuro/1992 and
/pub/cogneuro/1993.
Instructions for downloading:
ftp ego.psych.mcgill.ca (or 132.206.106.211)
username: anonymous
password: (full email address)
ftp> cd pub/cogneuro
ftp> cd 1992 (or 1993 for the 1993 archives)
To get all files in directory:
ftp> prompt
ftp> bin
ftp> mget *
To get one file in directory:
ftp> bin
ftp> get cns.may.92.Z
ftp> quit
All files are compressed using the standard UNIX compression algorithm.
To uncompress an archive:
unix: uncompress cns.may.92.Z
=============================================================================
There are no restrictions on transfer times. Please write me
(het@blaise.psych.mcgill.ca) if any further assistance is required.
* Special thanks to our System Operator, Shelly Feran, for making this service
available.
Phil A. Hetherington
Department of Psychology
McGill University
------------------------------
Subject: Job openings
From: SMOMARA@vax1.tcd.ie
Date: Wed, 24 Nov 93 09:15:00 +0000
We have one postdoctoral fellowship and one studentship available
immediately. The fellowship and studentship are both tenable for 3 years.
The project involves the use of patch clamp techniques in the study
of synaptic transmission, focused on LTP and the role of the
metabotropic receptors therein, in the rat hippocampal slice. The
studentship (for a PhD) is in the same area. Send a resume and the names
of two referees to Dr Michael Rowan or Dr Roger Anwyl, Department of
Pharmacology, University of Dublin, Trinity College, Dublin 2,
Ireland. Phone: +353-1-702 1567;
Fax: +353-1-671 3507.
------------------------------
Subject: Conferences Aug-Sep 1994
From: ERI@FRCU.EUN.EG
Date: Wed, 24 Nov 93 10:33:16 +0000
Can anybody please send me, or tell me where I can find, calls for papers
for conferences held during the months of August & September 1994?
Thank you.
My E-MAIL is ERI@EGFRCUVX.BITNET
------------------------------
Subject: Adaptive Control and Neural Networks applied to Power Systems
From: outros%labspot.ufsc.br@UICVM.UIC.EDU (Visitantes)
Date: Wed, 24 Nov 93 09:07:26 -0500
I'm looking for information and public domain software concerning
experiments with electrical power systems,
using methods such as ADAPTIVE CONTROL and NEURAL NETWORKS.
All information covering the area above will be appreciated.
Bernardino de Sena Aires Amaral
outros@labspot.ufsc.br
------------------------------
Subject: Cellular Neural Networks mailing list
From: cells@tce.ing.uniroma1.it
Date: Thu, 25 Nov 93 20:54:50 +0100
************************************************************
* ANNOUNCING A NEW MAILING LIST ON *
* CELLULAR NEURAL NETWORKS: *
* cells@tce.ing.uniroma1.it *
************************************************************
Cellular Neural Networks (CNNs) are continuous-time dynamical systems,
consisting of a grid of processing elements (neurons, or cells) connected
only to neighbors within a given (typically small) distance. They
therefore form a class of recurrent neural networks whose particular
topology is well suited to integrated circuit realization. In fact, while
in typical realizations of other neural systems most of the silicon area
is taken up by connections, in this case the connection area is
negligible, so that processor density can be much higher.
Since their first definition by L.O. Chua and L. Yang in 1988, many
applications have been proposed, mainly in the field of image processing.
In most cases a space-invariant weight pattern is used (i.e. weights are
defined by a template, which repeats identically for all cells), and
neurons are characterized by simple first-order dynamics. However, many
different kinds of dynamics (e.g. oscillatory and chaotic) have also been
used for special purposes.
A recent extension of the model is obtained by integrating the analog
CNN with some simple logic components, leading to the realization of a
universal programmable "analogic" machine.
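As a rough illustration of the space-invariant template idea (not code from any of the papers below: the templates, grid, boundary condition, and discrete-time Euler stepping are all invented for this sketch; Chua and Yang's model is continuous-time):

```python
# Toy discrete-time sketch of cellular neural network dynamics.
def sat(v):
    # Standard CNN output nonlinearity: piecewise-linear saturation.
    return max(-1.0, min(1.0, v))

def conv_at(grid, tmpl, i, j):
    # Apply a 3x3 template centred on cell (i, j); cells outside the
    # grid contribute zero (a simple boundary condition).
    rows, cols = len(grid), len(grid[0])
    s = 0.0
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            ii, jj = i + di, j + dj
            if 0 <= ii < rows and 0 <= jj < cols:
                s += tmpl[di + 1][dj + 1] * grid[ii][jj]
    return s

def cnn_step(x, u, A, B, bias, dt=0.1):
    # One Euler step of dx/dt = -x + A*y + B*u + bias, with y = sat(x).
    # A (feedback) and B (control) repeat identically for every cell:
    # the space-invariant weight pattern described above.
    y = [[sat(v) for v in row] for row in x]
    out = []
    for i in range(len(x)):
        row = []
        for j in range(len(x[0])):
            dx = -x[i][j] + conv_at(y, A, i, j) + conv_at(u, B, i, j) + bias
            row.append(x[i][j] + dt * dx)
        out.append(row)
    return out

# Made-up templates: A is a self-feedback loop, B a Laplacian-like mask.
A = [[0, 0, 0], [0, 2, 0], [0, 0, 0]]
B = [[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]]
u = [[1.0, 1.0, 0.0],
     [1.0, 1.0, 0.0],
     [0.0, 0.0, 0.0]]
x = [[0.0] * 3 for _ in range(3)]
for _ in range(50):
    x = cnn_step(x, u, A, B, bias=-0.5)
y = [[sat(v) for v in row] for row in x]
```

With the self-feedback weight above 1, each cell is bistable and the output settles to a binary (+1/-1) pattern, which is why such networks are natural image-processing arrays.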
Essential bibliography:
1) L.O. Chua & L. Yang, "Cellular Neural Networks: Theory", IEEE Trans. on
Circ. and Systems, CAS-35(10), p. 1257, 1988
2) -----, "Cellular Neural Networks: Applications", ibid., p. 1273
3) Proc. of IEEE International Workshop on Cellular Neural Networks and
their Applications (CNNA-90), Budapest, Hungary, Dec. 16-19, 1990
4) Proc. of IEEE Second International Workshop on Cellular Neural Networks
and their Applications (CNNA-92), Munich, Germany, Oct. 14-16, 1992
5) International Journal of Circuit Theory and Applications, vol.20, no. 5
(1992), special issue on Cellular Neural Networks
6) IEEE Transactions on Circuits and Systems, parts I & II, vol.40, no. 3
(1993), special issue on Cellular Neural Networks
7) T. Roska, L.O. Chua, "The CNN Universal Machine: an Analogic Array
Computer", IEEE Trans. on Circ. and Systems, II, 40(3), 1993, p. 163
8) V. Cimagalli, M. Balsi, "Cellular Neural Networks: a Review", Proc. of
Sixth Italian Workshop on Parallel Architectures and Neural Networks,
Vietri sul Mare, Italy, May 12-14, 1993. (E. Caianiello, ed.), World
Scientific, Singapore.
Our research group at the "La Sapienza" University of Rome, Italy, has
been involved in CNN research for several years, and will host the next
IEEE International Workshop on Cellular Neural Networks and their
Applications (CNNA-94), to be held in Rome, December 18-21, 1994.
We are now announcing the start of a new mailing list dedicated to
Cellular Neural Networks. It will provide an opportunity to discuss
current research, exchange news, and submit questions. Due to a
shortage of storage, we are currently unable to offer an archive
service, and we hope that some other group will be able to volunteer
to establish this means of fast distribution of recent reports and
papers.
The list will not be moderated, at least as long as the necessity
does not arise.
THOSE INTERESTED IN BEING INCLUDED IN THE LIST SHOULD SEND A MESSAGE
to Marco Balsi (who will be supervising the functioning of the list) at
address mb@tce.ing.uniroma1.it (151.100.8.30). This is the address to
which any communication not intended to go to all subscribers of the list
should be sent.
We would also appreciate it if you would let us know the addresses of
colleagues who might be interested in the list (rather than just
forwarding the announcement directly), so that we can send them this
announcement and keep track of those who have been contacted, avoiding
duplication.
TO SEND MESSAGES TO ALL SUBSCRIBERS PLEASE USE THE FOLLOWING ADDRESS:
cells@tce.ing.uniroma1.it (151.100.8.30)
We hope that this service will encourage communication and foster
collaboration among researchers working on CNNs and related topics.
We are looking forward to your comments, and to subscriptions to the
list!
Yours,
Prof. V. Cimagalli
Dipartimento di Ingegneria Elettronica
Universita' "La Sapienza" di Roma
via Eudossiana, 18, 00184 Roma Italy
fax: +39-6-4742647
------------------------------
Subject: ANNs for DEC Alpha/PC-Solaris
From: "T.P Harte" <tph1001@cus.cam.ac.uk>
Date: Tue, 30 Nov 93 14:36:08 +0000
Does anyone know of *neural network packages* which run on:
<<<<<<<< DEC Alpha and/or PC Solaris>>>>>>>>
FTP site addresses etc. would be most appreciated.
Cheers,
Thomas.
------------------------------
Subject: please help - Lattice Corpus training sets requested
From: E S Atwell <eric@scs.leeds.ac.uk>
Date: Wed, 01 Dec 93 12:11:20 +0000
Please reply direct to Dan Modd, csxdtm@scs.leeds.ac.uk - not me. Thanks.
- Eric Atwell
- ---------------------------------------------------------------------------
My Computer Science final year project involves collecting together a wide
range of word-hypothesis recognition lattices, as output from large-vocabulary
speech and handwriting recognition systems. These word-candidate lattices look
something like this:
stephen stiffen stiffens
left lift
school scowl scull
lest last
yearn your year
The collected lattices will constitute a standard Lattice Corpus which,
hopefully, could be used as an evaluation resource for research in linguistic
constraint models for English speech and handwriting recognition systems.
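For illustration only, one hypothetical way to represent the word-candidate lattice shown above is as a list of candidate sets, one per word position; real lattice formats typically also carry acoustic scores and timings, which this toy form omits.

```python
# Toy "confusion network" representation of a word-candidate lattice.
from itertools import product

lattice = [
    ["stephen", "stiffen", "stiffens"],
    ["left", "lift"],
    ["school", "scowl", "scull"],
    ["lest", "last"],
    ["yearn", "your", "year"],
]

# Every full sentence hypothesis is one path through the lattice.
hypotheses = [" ".join(words) for words in product(*lattice)]

n_paths = 1
for slot in lattice:
    n_paths *= len(slot)

print(n_paths)        # number of distinct hypotheses
print(hypotheses[0])  # one example path
```

A linguistic constraint model's job is then to rank these paths and pick the most plausible English sentence.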
Initially, I need to compare the range of word-lattice formats used by
language modelling researchers to arrive at a standard representation
format.
If your research is in this area, I would be very grateful if you could send
me one or more example lattices (preferably as an ASCII text file). Any
information about the format of the lattices (e.g. documentation,
references, etc.) would also be welcome.
Thanks for your help,
Dan Modd
Centre for Computer Analysis of Speech and Language,
School of Computer Studies, University of Leeds.
csxdtm@scs.leeds.ac.uk
------------------------------
Subject: Lectureship in St Andrews
From: Peter Foldiak <pf2@st-andrews.ac.uk>
Date: Thu, 02 Dec 93 11:24:27 +0000
UNIVERSITY OF ST ANDREWS
LECTURESHIP IN THE SCHOOL OF PSYCHOLOGY
Applications are invited for the above post, which is made
available following the retirement of Professor MA Jeeves.
Individuals with a strong research record in an area of
psychology related to existing research strengths in the School
are encouraged to apply. The successful candidate will be
expected to make a significant contribution to the School's
research activity, and to contribute to undergraduate and
graduate teaching programmes.
The appointment is available from 1 September 1994. The
University is prepared to consider appointing at the senior
lectureship or readership level in light of an assessment of the
quality of the field available to it.
The salary shall be on the appropriate point on the Academic
payscale GBP 13601 - GBP 29788 per annum.
Application forms and further particulars are available from
Personnel Services, University of St Andrews, College Gate, St
Andrews KY16 9AJ, U.K. (tel: +44 334 62562, out of hours +44 334
62571 or by fax +44 334 62570), to whom completed forms accompanied
by a letter of application and CV should be returned to arrive not
later than 10 December 1993.
PLEASE QUOTE REFERENCE NUMBER SL/APS0001.
The University operates an Equal Opportunities Policy.
------------------------------
End of Neuron Digest [Volume 12 Issue 23]
*****************************************