Neuron Digest   Thursday, 19 Dec 1991                Volume 8 : Issue 15 

Today's Topics:
Administrivia
resend Algorithms for Principal Components Analysis
Multilayer hebbian network
Postdoc at Hebrew University
Public domain LVQ-programs released


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Administrivia
From: "Neuron-Digest Moderator, Peter Marvit" <neuron@cattell.psych.upenn.edu>
Date: Thu, 19 Dec 91 17:46:32 -0500

Thanks to those who wrote in regarding the "campaign" material. From the
few responses (n=12), I got a real mix of reactions, split fairly evenly
on the appropriateness of the message. I appreciate the feedback, which
I will use to guide future editing decisions.

This is a small issue to purge "discussion" entries. The issues over the
next few days will all be calls for papers and conference/symposia
announcements. I will try to weed out any for which deadlines have
passed (with apologies to the organizers).

Neuron Digest will then go on vacation until after the beginning of 1992,
at which time a new volume number will begin. I will then send out a
series of issues announcing new technical reports and publications (which
have been queueing for far too long).

Again, I give the usual plea to ease my burden: If your address will
change or your account will disappear, *please* let me know before mail
starts to bounce. If you do not receive the Digest for more than two
weeks, please send me a note; there may be trouble getting electronic
mail to you. As always, I solicit and appreciate comments and
submissions. This is *your* Digest, after all, and can only be as useful
as the readers wish.

My best wishes for the holidays and hope for a prosperous New Year...

: Peter Marvit, Neuron Digest Moderator
: Courtesy of Psychology Department, University of Pennsylvania
: neuron-request@cattell.psych.upenn.edu


------------------------------

Subject: resend Algorithms for Principal Components Analysis
From: FZI-mmdfmail <mmdf@gate.fzi.de>
Date: Sun, 24 Nov 91 16:31:07 +0000


Over the past few years there has been a great deal of interest in
recursive algorithms for finding eigenvectors or linear combinations of
them. Many of these algorithms are based on the Oja rule (1982) with
modifications to find more than a single output. As might be expected,
so many people working on a single type of algorithm has led to a certain
amount of duplication of effort. Following is a list of the papers I
know about, which I'm sure is incomplete. Anyone else working on this
topic should feel free to add to this list!
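
For concreteness, here is a minimal sketch (in Python with NumPy -- my
own illustration, not code from any of the papers below) of Sanger's
generalized Hebbian algorithm; with a single output unit it reduces to
the Oja rule:

  import numpy as np

  def gha_step(W, x, lr=0.001):
      """One step of the Generalized Hebbian Algorithm.
      W: (m, n) weights, one row per output; x: (n,) zero-mean sample.
      Rows of W converge toward the top m eigenvectors of the input
      covariance matrix."""
      y = W @ x                                  # outputs y_i = w_i . x
      # The lower-triangular term deflates each unit by the units above
      # it, so unit i learns the component not captured by units 1..i-1.
      W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
      return W

  # Usage sketch: recover the top 3 principal directions of random data.
  rng = np.random.default_rng(0)
  data = rng.normal(size=(5000, 10)) @ rng.normal(size=(10, 10))
  data -= data.mean(axis=0)
  W = rng.normal(scale=0.1, size=(3, 10))
  for x in data:
      W = gha_step(W, x)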

Cheers,
Terry Sanger



@article{sang89a,
author="Terence David Sanger",
title="Optimal Unsupervised Learning in a Single-Layer Linear
Feedforward Neural Network",
year=1989,
journal="Neural Networks",
volume=2,
pages="459--473"}

@incollection{sang89c,
author="Terence David Sanger",
title="An Optimality Principle for Unsupervised Learning",
year=1989,
pages="11--19",
booktitle="Advances in Neural Information Processing Systems 1",
editor="David S. Touretzky",
publisher="Morgan Kaufmann",
address="San Mateo, {CA}",
note="Proc. {NIPS'88}, Denver"}

@article{sang89d,
author="Terence David Sanger",
title="Analysis of the Two-Dimensional Receptive Fields Learned
by the Generalized {Hebbian} Algorithm in Response to
Random Input",
year=1990,
journal="Biological Cybernetics",
volume=63,
pages="221--228"}

@misc{sang90c,
author="Terence D. Sanger",
title="Optimal Hidden Units for Two-layer Nonlinear
Feedforward Neural Networks",
year=1991,
note="{\it Int. J. Pattern Recognition and AI}, in press"}

@inproceedings{broc89,
author="Roger W. Brockett",
title="Dynamical Systems that Sort Lists, Diagonalize Matrices,
and Solve Linear Programming Problems",
booktitle="Proc. 1988 {IEEE} Conference on Decision and Control",
publisher="{IEEE}",
address="New York",
pages="799--803",
year=1988}

@ARTICLE{rubn90,
AUTHOR = {J. Rubner and K. Schulten},
TITLE = {Development of Feature Detectors by Self-Organization},
JOURNAL = {Biol. Cybern.},
YEAR = {1990},
VOLUME = {62},
PAGES = {193--199}
}

@INCOLLECTION{krog90,
AUTHOR = {Anders Krogh and John A. Hertz},
TITLE = {Hebbian Learning of Principal Components},
BOOKTITLE = {Parallel Processing in Neural Systems and Computers},
PUBLISHER = {Elsevier Science Publishers B.V.},
YEAR = {1990},
EDITOR = {R. Eckmiller and G. Hartmann and G. Hauske},
PAGES = {183--186},
ADDRESS = {North-Holland}
}

@INPROCEEDINGS{fold89,
AUTHOR = {Peter Foldiak},
TITLE = {Adaptive Network for Optimal Linear Feature Extraction},
BOOKTITLE = {Proc. {IJCNN}},
YEAR = {1989},
PAGES = {401--406},
ORGANIZATION = {{IEEE/INNS}},
ADDRESS = {Washington, D.C.},
MONTH = {June}
}

@MISC{kung90,
AUTHOR = {S. Y. Kung},
TITLE = {Neural networks for Extracting Constrained Principal
Components},
YEAR = {1990},
NOTE = {submitted to {\it IEEE Trans. Neural Networks}}
}

@article{oja85,
author="Erkki Oja and Juha Karhunen",
title="On Stochastic Approximation of the Eigenvectors and
Eigenvalues of the Expectation of a Random Matrix",
journal="J. Math. Analysis and Appl.",
volume=106,
pages="69--84",
year=1985}

@book{oja83,
author="Erkki Oja",
title="Subspace Methods of Pattern Recognition",
publisher="Research Studies Press",
address="Letchworth, Hertfordshire UK",
year=1983}

@inproceedings{karh84b,
author="Juha Karhunen",
title="Adaptive Algorithms for Estimating Eigenvectors of
Correlation Type Matrices",
booktitle="Proc. 1984 {IEEE} Int. Conf. on Acoustics, Speech,
and Signal Processing",
publisher="{IEEE} Press",
address="Piscataway, {NJ}",
year=1984,
pages="14.6.1--14.6.4"}

@inproceedings{karh82,
author="Juha Karhunen and Erkki Oja",
title="New Methods for Stochastic Approximation of Truncated
{Karhunen-Lo\`{e}ve} Expansions",
booktitle="Proc. 6th Int. Conf. on Pattern Recognition",
year=1982,
publisher="Springer-{Verlag}",
address="{NY}",
month="October",
pages="550--553"}

@inproceedings{oja80,
author="Erkki Oja and Juha Karhunen",
title="Recursive Construction of {Karhunen-Lo\`{e}ve} Expansions
for Pattern Recognition Purposes",
booktitle="Proc. 5th Int. Conf. on Pattern Recognition",
publisher="Springer-{Verlag}",
address="{NY}",
year=1980,
month="December",
pages="1215--1218"}

@inproceedings{kuus82,
author="Maija Kuusela and Erkki Oja",
title="The Averaged Learning Subspace Method for Spectral
Pattern Recognition",
booktitle="Proc. 6th Int. Conf. on Pattern Recognition",
year=1982,
publisher="Springer-{Verlag}",
address="{NY}",
month="October",
pages="134--137"}

@phdthesis{karh84,
author="Juha Karhunen",
title="Recursive Estimation of Eigenvectors of Correlation Type
Matrices for Signal Processing Applications",
school="Helsinki Univ. Tech.",
year=1984,
address="Espoo, Finland"}

@techreport{karh85,
author="Juha Karhunen",
title="Simple Gradient Type Algorithms for Data-Adaptive Eigenvector
Estimation",
institution="Helsinki Univ. Tech.",
year=1985,
number="TKK-F-A584"}

@misc{ogaw86,
author = "Hidemitsu Ogawa and Erkki Oja",
title = "Can we Solve the Continuous Karhunen-Loeve Eigenproblem
from Discrete Data?",
note = "Proc. {IEEE} Eighth International Conference on Pattern Recognition,
Paris",
year = "1986"}

@article{leen91,
author = "Todd K Leen",
title = "Dynamics of learning in linear feature-discovery networks",
journal = "Network",
volume = 2,
year = "1991",
pages = "85--105"}

@incollection{silv91,
author = "Fernando M. Silva and Luis B. Almeida",
title = "A Distributed Decorrelation Algorithm",
booktitle = "Neural Networks, Advances and Applications",
editor = "Erol Gelenbe",
publisher = "North-Holland",
year = "1991",
note = "to appear"}




------------------------------

Subject: Multilayer hebbian network
From: JJ Merelo <jmerelo@ugr.es>
Date: Thu, 19 Dec 91 10:43:01 +0200

I am presently working on a multilayer network with Hebbian learning and
a single hidden layer. I have scanned through the literature, but I have
found nothing about this type of network -- some things about Hebbian
learning, or about multilayer networks with backpropagation, but nothing
else.

I am interested in an algorithm for unsupervised learning in artificial
life, and I thought this one would be quite suitable, as it resembles the
learning that actually takes place in the first stages of life (see,
e.g., von der Malsburg's papers, or many physiologically-oriented papers).
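
For reference, a minimal sketch of the kind of network I mean (Python
with NumPy; the nonlinearity and normalization are my own choices, not
taken from any paper): each layer applies a local, Oja-style Hebbian
update to its own input/output pair, so learning stays unsupervised.

  import numpy as np

  def hebb_layer_step(W, x, lr=0.01):
      """Oja-style Hebbian update for one layer (local, unsupervised)."""
      y = np.tanh(W @ x)                      # layer output
      # Hebbian growth plus a decay term that keeps the weights bounded.
      W += lr * (np.outer(y, x) - (y**2)[:, None] * W)
      return W, y

  # Two-layer network (one hidden layer), trained with purely local rules.
  rng = np.random.default_rng(1)
  W1 = rng.normal(scale=0.1, size=(8, 16))    # input -> hidden
  W2 = rng.normal(scale=0.1, size=(4, 8))     # hidden -> output
  for _ in range(1000):
      x = rng.normal(size=16)
      W1, h = hebb_layer_step(W1, x)
      W2, y = hebb_layer_step(W2, h)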

I would be grateful to get any pointer to literature on this kind of
network, together with e-mail or real addresses to ask for them. If I get
enough, I'll post the result.

Thanx
JJ

////////////////////// Spain is different \\\\\\\\\\\\\\\\\\\\\\\\\\\\\\


------------------------------

Subject: Postdoc at Hebrew University
From: Haim Sompolinsky <haim@galaxy.huji.ac.il>
Date: Thu, 19 Dec 91 15:06:59 +0200



POSTDOCTORAL POSITIONS
AT
THE CENTER FOR NEURAL COMPUTATION
THE HEBREW UNIVERSITY, JERUSALEM


A postdoctoral position in the areas of neural network theory and
computational neuroscience is available at the Racah Institute of
Physics. The research is to be conducted within the framework of the
newly established interdisciplinary Center for Neural Computation at the
Hebrew University, which encompasses research in neurophysiology,
theoretical and applied physics, computer science, and psychology.

The position is at the postdoctoral level for one year, starting
in October 1992, with the possibility of renewal for a second year.
Interested applicants should submit a CV and a statement of current
research interests, and arrange for three letters of recommendation to be
sent as soon as possible to:

Prof. Haim Sompolinsky
The Racah Institute of Physics
The Hebrew University
Jerusalem, Israel, 91904

haim@galaxy.huji.ac.il, haims@hujivms.BITNET
fax: 972-2-584437, 972-2-666804
tel: 972-2-584563


------------------------------

Subject: Public domain LVQ-programs released
From: Kari Torkkola <karit@spine.hut.fi>
Date: Thu, 19 Dec 91 14:53:28 +0200

************************************************************************
*                                                                      *
*                               LVQ_PAK                                *
*                                                                      *
*                                 The                                  *
*                                                                      *
*                     Learning Vector Quantization                     *
*                                                                      *
*                           Program Package                            *
*                                                                      *
*                     Version 1.0 (December, 1991)                     *
*                                                                      *
*                           Prepared by the                            *
*                     LVQ Programming Team of the                      *
*                  Helsinki University of Technology                   *
*            Laboratory of Computer and Information Science            *
*                 Rakentajanaukio 2 C, SF-02150 Espoo                  *
*                               FINLAND                                *
*                                                                      *
*                          Copyright (c) 1991                          *
*                                                                      *
************************************************************************

Public-domain programs for Learning Vector Quantization (LVQ) algorithms
are available via anonymous FTP on the Internet.

"What is LVQ?", you may ask --- See the following reference, then: Teuvo
Kohonen. The self-organizing map. Proceedings of the IEEE,
78(9):1464-1480, 1990.

In short, LVQ is a group of methods applicable to statistical pattern
recognition, in which the classes are described by a relatively small
number of codebook vectors, properly placed within each class zone such
that the decision borders are approximated by the nearest-neighbor rule.
Unlike normal k-nearest-neighbor (k-nn) classification, the original
samples are not used as codebook vectors; instead, they are used to tune
the codebook vectors. LVQ is thus concerned with the optimal placement
of these codebook vectors within the class zones.
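
As an illustration only (Python, written for this description -- it is
not code from the package), one step of the basic LVQ1 rule moves the
nearest codebook vector toward a correctly classified sample and away
from a misclassified one:

  import numpy as np

  def lvq1_step(codebooks, labels, x, x_label, alpha=0.05):
      """One LVQ1 update; only the codebook vector nearest to x moves.
      codebooks: (k, d) array; labels: (k,) class of each codebook;
      x: (d,) training sample with true class x_label."""
      c = np.argmin(np.linalg.norm(codebooks - x, axis=1))
      if labels[c] == x_label:
          codebooks[c] += alpha * (x - codebooks[c])   # pull toward sample
      else:
          codebooks[c] -= alpha * (x - codebooks[c])   # push away
      return codebooks

Classification afterwards is by the nearest-neighbor rule over the
trained codebook vectors.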

This package contains all the programs necessary for the correct
application of certain LVQ algorithms in an arbitrary statistical
classification or pattern recognition task. Two particular variants of
the algorithms, LVQ1 and LVQ2.1, have been selected for this package.

This is the very first release of the package, and updates will be
available as soon as bugs are found and fixed. This code is distributed
without charge on an "as is" basis. There is no warranty of any kind by
the authors or by Helsinki University of Technology.

In implementing the LVQ programs we have tried to keep the code as
simple as possible, so the programs should compile on various machines
without any machine-specific modifications. All programs have been
written in ANSI C. The programs are available in two archive formats,
one for the UNIX environment and the other for MS-DOS. Both archives
contain exactly the same files.

These files can be accessed via FTP as follows:

1. Create an FTP connection from wherever you are to machine
"cochlea.hut.fi". The internet address of this machine is
130.233.168.48, for those who need it.

2. Log in as user "anonymous" with your own e-mail address as password.

3. Change remote directory to "/pub/lvq_pak".

4. At this point you should be able to get a listing of the files in
   this directory with DIR and fetch the ones you want with GET. (The
   exact FTP commands depend on your local FTP program.) Remember to
   use binary transfer mode for compressed files.

The lvq_pak program package includes the following files:

- Documentation:
    README              short description of the package
                        and installation instructions
    document.ps         documentation in (c) PostScript format
    document.ps.Z       same as above but compressed
    document.txt        documentation in ASCII format

- Source file archives (which contain the documentation, too):
    lvq_p1r0.exe        Self-extracting MS-DOS archive file
    lvq_pak-1.0.tar     UNIX tape archive file
    lvq_pak-1.0.tar.Z   same as above but compressed


An example of FTP access is given below:

unix> ftp cochlea.hut.fi (or 130.233.168.48)
Name: anonymous
Password: <your email address>
ftp> cd /pub/lvq_pak
ftp> binary
ftp> get lvq_pak-1.0.tar.Z
ftp> quit
unix> uncompress lvq_pak-1.0.tar.Z
unix> tar xvfo lvq_pak-1.0.tar

See file README for further installation instructions.

All comments concerning this package should be
addressed to lvq@cochlea.hut.fi.

************************************************************************


------------------------------

End of Neuron Digest [Volume 8 Issue 15]
****************************************
