
Neuron Digest   Thursday,  9 Dec 1993                Volume 12 : Issue 24 

Today's Topics:
Lecturer in Cognitive Psychology
Brain imaging scholarships
PCA Neural networks
Frog-Net Announcement
New Book Announcement
Book Review
PCA algorithms, continued.
Searching...
fifth neural network conference proceedings...
First IEEE Conference on Image Processing


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Lecturer in Cognitive Psychology
From: plunkett (Kim Plunkett) <@prg.ox.ac.uk:plunkett@dragon.psych>
Date: Thu, 02 Dec 93 15:04:26 +0000

Lecturer in Cognitive Psychology
University of Oxford
Department of Experimental Psychology

Job Specification


The successful applicant will be required to assume special
responsibility for teaching the Final Honours School paper
"Memory and Cognition" which covers the following topics to
be published in Examination Decrees and Regulations:

Basic processes and varieties of human memory.
Memory retrieval and interference; recognition and
recall; short- and long-term memory; working memory;
sensory memory; priming; acquisition of cognitive
and motor skills; modality-specific and
material-specific varieties of coding in memory;
mnemonics; everyday memory; mathematical and
computational models of learning and memory;
impairment of learning and memory.

The representation and use of knowledge. Topics
such as: semantic memory; inference; concept
formation; encoding of similarities and differences;
concepts, prototypes, and natural categories;
schemata; imagery; problem solving;
decision-making; heuristics and biases;
cross-cultural differences in cognition.

This is one of four papers in cognitive psychology offered
in Final Honours.

The appointed lecturer will be expected to pursue active
research in an area of cognitive psychology. Although
interests in higher mental functions, cognitive
neuropsychology, language, or artificial intelligence
would be an advantage, it should be stressed to potential
applicants that there is no restriction on area of interest.


Further details can be obtained from:

Professor S.D. Iversen,
Head of Department
Department of Experimental Psychology
South Parks Road
Oxford OX1 3UD

or email:

Jane Brooks - brooks@psy.ox.ac.uk


------------------------------

Subject: Brain imaging scholarships
From: stokely@atax.eng.uab.edu (Ernest Stokely)
Date: Thu, 02 Dec 93 14:20:18 -0600

Please post the following notice in Neuron-Digest. Thank you.

Ernest Stokely
Chair, Department of Biomedical Engineering
University of Alabama at Birmingham
-------------------------------------------------------------

Brain Imaging Scholarships

The Department of Biomedical Engineering at the University of Alabama at
Birmingham announces a Ph.D. program specialized in the area of functional
and structural imaging of the brain. This program will be funded by a
Whitaker Foundation Special Opportunities Award. Applicants should be
prepared to receive formal training not only in the physics and engineering
aspects of medical imaging, but also in neurobiology and aspects
of clinical diagnostic imaging. Dissertation research topics can be
selected from a broad spectrum of opportunities including (but not limited
to) the areas of magnetic resonance imaging (including spectroscopic
imaging), MR coil design, various applications of functional brain imaging
using SPECT and MRI, and issues of clinical imaging in neurology and
psychiatry.

The stipend for these scholarships is $15,000 per year, plus paid tuition
and an allowance for travel and materials. The program will be small and
highly focused, and will seek applications from motivated
individuals who are interested in this new area of biomedical engineering.
Four applicants will be chosen for the fall term, 1994. Successful
applicants will have a B.S. or M.S. degree in biomedical or electrical
engineering, physics, or computer science. Competitive applicants will
have excellent GRE scores and GPAs. Of particular interest are those
students who will have completed their M.S. degree in one of these areas by
the fall of 1994, but would like to pursue a Ph.D. in brain imaging.

For more information contact Dr. Ernest Stokely (stokely@atax.eng.uab.edu)
or Dr. Don Twieg (twieg@atax.eng.uab.edu), Department of Biomedical
Engineering, University of Alabama at Birmingham, BEC 256, Birmingham,
Alabama 35294-4461.




------------------------------

Subject: PCA Neural networks
From: Erkki Oja <oja@dendrite.hut.fi>
Date: Fri, 03 Dec 93 16:45:27 +0200

%
% *** A LIST OF REFERENCES RELATED TO PCA NEURAL NETWORKS ***
%
We offer a fairly extensive collection of references on
Principal Component Analysis (PCA) neural networks and learning
algorithms, available by anonymous ftp. The list also contains
references on extensions and generalizations of such networks
and some basic references on PCA and related matters.
You may copy the list freely, at your own responsibility.

The original list has been compiled by Liu-Yue Wang, a graduate
student of Erkki Oja, and updated by Juha Karhunen, all from
Helsinki University of Technology, Finland.
The list should cover the field of PCA networks fairly well.
Although it is not complete, and possibly contains some errors
and nonuniform notation, the reference collection should
already be useful in its present form to people interested
in PCA neural networks.
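
As a flavor of what the list covers, here is a minimal sketch
(our own illustration in Python/NumPy, not an item from the list)
of the simplest algorithm in this family: Oja's single-unit rule,
which drives the weight vector of one linear neuron toward the
first principal component of its input. The toy data, dimensions,
and learning rate are arbitrary choices for the example.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: zero-mean samples with one dominant direction of variance.
X = rng.normal(size=(1000, 5))
X[:, 0] *= 3.0
X -= X.mean(axis=0)

w = rng.normal(size=5)
w /= np.linalg.norm(w)            # start from a random unit vector
eta = 0.01                        # learning rate (an arbitrary choice)

for x in X:
    y = w @ x                     # output of the single linear neuron
    w += eta * y * (x - y * w)    # Oja's rule: Hebbian term with decay

w /= np.linalg.norm(w)
print(w)                          # approximates the leading eigenvector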

To get the list, connect by ftp to dendrite.hut.fi and
give "anonymous" as the user id. Then proceed according to
the instructions.

Erkki Oja, Liu-Yue Wang, Juha Karhunen

% ************************************************************



------------------------------

Subject: Frog-Net Announcement
From: liaw@rana.usc.edu (Jim Liaw)
Date: Fri, 03 Dec 93 16:00:47 -0800



*********************************************************************
** **
** o/\o Frog-Net o/\o **
** \| |/ \| |/ **
** | | An electronic forum for researchers | | **
** -- engaged in the study of the behavior -- **
** \/ \/ and the underlying neural mechanisms \/ \/ **
** in amphibians **
** **
*********************************************************************


This mailing list has been set up to facilitate communication and
interaction among researchers interested in the behavior and the
underlying neural mechanisms of amphibians.

If you would like to send email to all members of the list, address it to

"frog-net@rana.usc.edu"

If you want to subscribe to the mailing list, please send an email to

"liaw@rana.usc.edu"



:::::::::::::::::::::::::::::::::::

Jim Liaw
Center for Neural Engineering
Univ. of Southern California
Los Angeles, CA 90089-2520
(213) 740-6991
liaw@rana.usc.edu



------------------------------

Subject: New Book Announcement
From: "F. Ventriglia" <LC4A%ICINECA.BITNET@BITNET.CC.CMU.EDU>
Date: Tue, 07 Dec 93 15:09:35 +0700


Dear fellow Connectionists,

The following book has appeared as part of Studies in Neuroscience
Series, and may be of interest to you.

Best,

Francesco Ventriglia
Neurodynamics Department
Cybernetics Institute, CNR
Arco Felice (NA), Italy

*****************************************************************
Neural Modeling and Neural Networks
F. Ventriglia editor - Pergamon Press

Research in neural modeling and neural networks has escalated
dramatically in the last decade, acquiring along the way terms and
concepts, such as learning, memory, perception, and recognition, which
are the basis of neuropsychology. Nevertheless, for many, neural
modeling remains controversial in its purported ability to describe
brain activity. The difficulties in modeling are various, but arise
principally in identifying those elements that are fundamental for the
expression (and description) of higher neural activity. This is
complicated by our incomplete knowledge of neural structures and
functions, at the cellular and population levels. The first step towards
an enhanced appreciation of the value of neural modeling and neural
networks is to be aware of what has been achieved in this
multidisciplinary field of research. This book sets out to create such
awareness. In twelve chapters, leading experts develop the key topics of
neural structures and functions, dynamics of single neurons,
oscillations in groups of neurons, randomness and chaos in neural
activity, (statistical) dynamics of neural networks, learning, memory,
and pattern recognition.

Contents: Preface. Contributors.
Anatomical bases of neural network modeling (J. Szentagothai)
Models of visuomotor coordination in frog and monkey (M.A. Arbib)
Analysis of single-unit activity in the cerebral cortex (M. Abeles)
Single neuron dynamics: an introduction (L.F. Abbott)
An introduction to neural oscillators (B. Ermentrout)
Mechanisms responsible for epilepsy in hippocampal slices predispose the
brain to collective oscillations (R.D. Traub, J.G.R. Jefferys)
Diffusion models of single neurones' activity and related problems (L.M.
Ricciardi)
Noise and chaos in neural systems (P. Erdi)
Qualitative overview of population neurodynamics (W.F. Freeman)
Towards a kinetic theory of cortical-like neural fields (F. Ventriglia)
Psychology, neurobiology and modeling: the science of Hebbian reverberations
(D.J. Amit)
Pattern recognition with neural networks (K. Fukushima)
Bibliography. Author index. Subject index.

Publication date: November 1993
Approx. 300 pages; price US$ 125.00

Available from:
Pergamon Press Inc.
660 White Plains Road
Tarrytown
NY 10591-5153
USA
Phone +1-914-524-9200
Fax +1-914-333-2444


------------------------------

Subject: Book Review
From: ai@hpmoeott.canada.hp.com
Date: Tue, 07 Dec 93 15:17:01 -0500

Review of "Forecasting with Neural Networks" (a technical report)

I recently obtained a copy of a technical report called
"Forecasting with Neural Networks" through a mail-order advertisement
in PC AI magazine.

I thought I would share my observations of this report.

Here goes ...

(1) Target Audience

The report notes that it is intended for those with some exposure
to calculus, linear algebra and computer programming. I would
add here that some knowledge of statistics and time series analysis
would also be appropriate.

(2) Introduction

The report presents an overview of neuron physiology (basic but
adequate) followed by a brief history of the field of neural
networks.

(3) Theory

Neural Networks
The author starts essentially from scratch (tedious for those of
us who are already familiar with neural networks) and ends up
deriving the backprop and counterprop models. These are used
later on for forecasting. The math is all there for those
who like to see it, and there are lots of diagrams as well.

Forecasting
Forecasting is recast as an attempt to predict the short-term
behavior of a chaotic time series. The crux of the matter is that
since chaotic behavior is nonlinear, a neural network with a
nonlinear transfer function is well suited to the problem (e.g.
backprop with a sigmoid transfer function).
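
To make this concrete, here is a rough sketch in Python/NumPy of
the kind of setup the report describes: a small backprop network
with sigmoid hidden units trained to predict the next value of a
chaotic series from a window of past values. The logistic-map
series, the network sizes, and the learning rate are my own
assumptions, not the report's.

import numpy as np

rng = np.random.default_rng(1)

# A chaotic series: the logistic map x_{t+1} = 4 x_t (1 - x_t).
series = np.empty(500)
series[0] = 0.3
for t in range(499):
    series[t + 1] = 4.0 * series[t] * (1.0 - series[t])

window = 4                        # predict the next value from the last 4
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

H = 8                             # hidden units (an arbitrary choice)
W1 = rng.normal(scale=0.5, size=(window, H))
W2 = rng.normal(scale=0.5, size=H)
eta = 0.5

for epoch in range(2000):
    h = sigmoid(X @ W1)           # hidden-layer activations
    err = h @ W2 - y              # linear output unit, prediction error
    # Backprop: gradients of the squared error (constants folded into eta).
    gW2 = h.T @ err / len(y)
    gh = np.outer(err, W2) * h * (1.0 - h)
    W2 -= eta * gW2
    W1 -= eta * (X.T @ gh / len(y))

print("final MSE:", np.mean((sigmoid(X @ W1) @ W2 - y) ** 2))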

(4) Practice

The report describes an application area (predicting stock prices)
and goes through the steps involved in setting up a neural network
to do the job. The interesting bit here is the treatment of each
of the network parameters (learning rate, momentum factor, number
of hidden neurons, etc.).

The most useful information is a technique called "validation".
In this methodology, the training set is split into two subsets of
input/output pairs. The first subset is used to train the network in
the normal fashion. The second subset is used every so often to
test how well the network is performing. The network
weights are never adjusted after presenting an input/output pair
from the second subset. The idea behind this is that the network
will start off by learning important features in the data.
During this time, the performance on both subsets of data will improve.
Eventually, however, the network will exhaust the main features
and begin to model the noise in the data. At this point,
performance on the first subset will continue to improve but
performance on the second subset will actually deteriorate.
That's when you stop training.
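
In outline, the procedure looks like this (a sketch in Python/NumPy
under my own naming; train_step and val_error stand in for whatever
network and task are at hand, and the patience threshold is my own
addition, since the report just says to stop once validation
performance deteriorates):

import numpy as np

def train_with_validation(train_step, val_error, max_epochs=10000, patience=20):
    # train_step(): one pass of weight updates on the first subset.
    # val_error(): error on the second subset; NO weights are adjusted here.
    best, bad = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()
        e = val_error()
        if e < best:
            best, bad = e, 0
        else:
            bad += 1                  # validation error has stopped improving
        if bad >= patience:           # that's when you stop training
            break
    return best

# Tiny demonstration: fit y = 2x + noise with a single weight.
rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=0.3, size=200)
x_tr, y_tr = x[:150], y[:150]         # subset 1: used to adjust weights
x_va, y_va = x[150:], y[150:]         # subset 2: only ever measured

w = np.array([0.0])

def train_step():
    w[0] -= 0.01 * 2 * np.mean((w[0] * x_tr - y_tr) * x_tr)

def val_error():
    return float(np.mean((w[0] * x_va - y_va) ** 2))

print("best validation MSE:", train_with_validation(train_step, val_error))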

(5) References

The most useful reference is to a paper called "Predicting the
Future: A Connectionist Approach" by Weigend et al. (International
Journal of Neural Systems, 1990). I dug it up and found a
detailed analysis of using neural networks to predict sunspot activity
(another popular time series).

(6) Summary

Overall, I would have liked to see ...

- a bit more detail in the history section (I like history)

- a more sophisticated model like cascade correlation

(7) Source

I obtained the report through an ad placed by a company called
"Bellwood Research" in the May-June issue of PC AI magazine.

Regards,
Winslow


------------------------------

Subject: PCA algorithms, continued.
From: "Terence D. Sanger" <tds@ai.mit.edu>
Date: Tue, 07 Dec 93 18:40:47 -0500

In response to my previous message, many people have sent me new references
to PCA algorithms, and these have been included in the BibTeX database
pca.bib. (Also note Wang's more extensive pclist.tex file announced
recently on this net.)

Erkki Oja has been kind enough to forward copies of some of his
recent papers on the "Weighted Subspace Algorithm" and "Nonlinear PCA".
Looking at these carefully, I think both algorithms are closely related to
Brockett's algorithm, and probably work for the same reason. I have
created another short derivation "oja.tex" which is available along with
the updated pca.bib by anonymous ftp from ftp.ai.mit.edu in the directory
pub/sanger-papers.

One could invoke some sort of transitivity property to claim that since
Oja's algorithms are related to Brockett's, Brockett's are related to GHA,
and GHA does deflation, then Oja's algorithms must also do deflation. This
would imply that Oja's algorithms also satisfy the hypothesis:

"All algorithms for PCA which are based on a Hebbian learning rule must
use sequential deflation to extract components beyond the first."

But I must admit that the connection is becoming somewhat tenuous.
Probably the hypothesis should be interpreted as a vague description of a
motivation for the computational mechanism, rather than a direct
description of the algorithm. However, I still feel that it is important
to realize the close relationship between the many algorithms which use
Hebbian learning to find exact eigenvectors.
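
For concreteness, here is a small Python/NumPy sketch of GHA itself,
where the deflation is explicit in the update: the lower-triangular
term makes each output learn from the input with the reconstructions
of the preceding components subtracted off. The toy data and step
size are illustrative choices only.

import numpy as np

rng = np.random.default_rng(3)

# Zero-mean toy data with a few dominant directions of variance.
X = rng.normal(size=(2000, 6)) @ np.diag([5.0, 3.0, 1.0, 0.5, 0.2, 0.1])
X -= X.mean(axis=0)

k = 3                             # number of components sought
W = rng.normal(scale=0.1, size=(k, 6))
eta = 0.002

for _ in range(10):               # several passes over the data
    for x in X:
        y = W @ x                 # outputs of the k linear neurons
        # The lower-triangular term is the deflation: output i learns
        # from the input minus the reconstructions of outputs 0..i.
        W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

print(np.round(W @ W.T, 2))       # close to the identity if converged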

As always, comments/suggestions/counterexamples/references are welcomed!

Terry Sanger


Instructions for retrieving latex documents:

ftp ftp.ai.mit.edu
login: anonymous
password: your-net-address
cd pub/sanger-papers
get pca.bib
get oja.tex
quit
latex oja
lpr oja.dvi



------------------------------

Subject: Searching...
From: martino@tiete.cepel.br (Marcello B. Martino)
Date: Thu, 09 Dec 93 17:11:24 -0400

Mr. Marvit,

I'm searching for information about neural network models that
satisfy most of the following conditions:
- hetero-associative,
- recursive,
- supervised learning,
- one-shot, evolutive, or incremental learning.
I would be glad if you could help me to find some of these models.

Yours sincerely,
Marcello de Martino.


------------------------------

Subject: fifth neural network conference proceedings...
From: Pulin <sampat@CVAX.IPFW.INDIANA.EDU>
Date: Mon, 29 Nov 93 12:47:51 -0500

The Proceedings of the Fifth Conference on Neural Networks and Parallel
Distributed Processing at Indiana University-Purdue University at Fort Wayne,
held April 9-11, 1992, are now available. They can be ordered ($9 + $1 U.S.
mail cost; make checks payable to IPFW) from:

Secretary, Department of Physics
Indiana University Purdue University Fort Wayne
Fort Wayne, IN 46805-1499
Voice: (219)481-6306 or 481-6157
FAX: (219)481-6880
email: proceedings@ipfwcvax.bitnet

The following papers are included in the Proceedings of the Fifth Conference:

Tutorials

Phil Best, Miami University, Processing of Spatial Information in the Brain
William Frederick, Indiana-Purdue University, Introduction to Fuzzy Logic
Helmut Heller and K. Schulten, University of Illinois, Parallel Distributed
Computing for Molecular Dynamics: Simulation of Large Heterogeneous
Systems on a Systolic Ring of Transputers
Krzysztof J. Cios, University of Toledo, An Algorithm Which Self-Generates
Neural Network Architecture - Summary of Tutorial

Biological and Cooperative Phenomena Optimization

Ljubomir T. Citkusev & Ljubomir J. Buturovic, Boston University, Non-
Derivative Network for Early Vision
M.B. Khatri & P.G. Madhavan, Indiana-Purdue University, Indianapolis, ANN
Simulation of the Place Cell Phenomenon Using Cue Size Ratio
J. Wu, M. Penna, P.G. Madhavan, & L. Zheng, Purdue University at
Indianapolis, Cognitive Map Building and Navigation
J. Wu, C. Zhu, Michael A. Penna & S. Ochs, Purdue University at
Indianapolis, Using the NADEL to Solve the Correspondence Problem
Arun Jagota, SUNY-Buffalo, On the Computational Complexity of Analyzing
a Hopfield-Clique Network

Network Analysis

M.R. Banan & K.D. Hjelmstad, University of Illinois at Urbana-Champaign,
A Supervised Training Environment Based on Local Adaptation,
Fuzzyness, and Simulation
Pranab K. Das II & W.C. Schieve, University of Texas at Austin, Memory in
Small Hopfield Neural Networks: Fixed Points, Limit Cycles and Chaos
Arun Maskara & Andrew Noetzel, Polytechnic University, Forced Learning in
Simple Recurrent Neural Networks
Samir I. Sayegh, Indiana-Purdue University, Neural Networks Sequential vs
Cumulative Update: An * Expansion
D.A. Brown, P.L.N. Murthy, & L. Berke, The College of Wooster, Self-
Adaptation in Backpropagation Networks Through Variable
Decomposition and Output Set Decomposition
Sandip Sen, University of Michigan, Noise Sensitivity in a Simple Classifier
System
Xin Wang, University of Southern California, Complex Dynamics of Discrete-
Time Neural Networks
Zhenni Wang and Christine di Massimo, University of Newcastle, A Procedure
for Determining the Canonical Structure of Multilayer Feedforward
Neural Networks
Srikanth Radhakrishnan and C. Koutsougeras, Tulane University, Pattern
Classification Using the Hybrid Coulomb Energy Network

Applications

K.D. Hooks, A. Malkani, & L. C. Rabelo, Ohio University, Application of
Artificial Neural Networks in Quality Control Charts
B.E. Stephens & P.G. Madhavan, Purdue University at Indianapolis, Simple
Nonlinear Curve Fitting Using the Artificial Neural Network
Nasser Ansari & Janusz A. Starzyk, Ohio University, Distance Field Approach
to Handwritten Character Recognition
Thomas L. Hemminger & Yoh-Han Pao, Case Western Reserve University, A
Real-Time Neural-Net Computing Approach to the Detection and
Classification of Underwater Acoustic Transients
Seibert L. Murphy & Samir I. Sayegh, Indiana-Purdue University, Analysis of
the Classification Performance of a Back Propagation Neural Network
Designed for Acoustic Screening
S. Keyvan, L. C. Rabelo, & A. Malkani, Ohio University, Nuclear Diagnostic
Monitoring System Using Adaptive Resonance Theory





------------------------------

Subject: First IEEE Conference on Image Processing
From: icip@pine.ece.utexas.edu (International Conf on Image Processing Mail Box)
Date: Tue, 30 Nov 93 13:31:27 -0600



PLEASE POST PLEASE POST PLEASE POST PLEASE POST

***************************************************************

FIRST IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING
November 13-16, 1994
Austin Convention Center, Austin, Texas, USA


CALL FOR PAPERS


Sponsored by the Institute of Electrical and Electronics
Engineers (IEEE) Signal Processing Society, ICIP-94 is the
inaugural international conference on theoretical, experimental
and applied image processing. It will provide a centralized,
high-quality forum for presentation of technological advances and
research results by scientists and engineers working in Image
Processing and associated disciplines such as multimedia and
video technology. Also encouraged are image processing
applications in areas such as the biomedical sciences and
geosciences.

SCOPE:

1. IMAGE PROCESSING: Coding, Filtering, Enhancement,
Restoration, Segmentation, Multiresolution Processing,
Multispectral Processing, Image Representation, Image Analysis,
Interpolation and Spatial Transformations, Motion Detection and
Estimation, Image Sequence Processing, Video Signal Processing,
Neural Networks for image processing and model-based compression,
Noise Modeling, Architectures and Software.

2. COMPUTED IMAGING: Acoustic Imaging, Radar Imaging,
Tomography, Magnetic Resonance Imaging, Geophysical and Seismic
Imaging, Radio Astronomy, Speckle Imaging, Computer Holography,
Confocal Microscopy, Electron Microscopy, X-ray
Crystallography, Coded-Aperture Imaging, Real-Aperture Arrays.

3. IMAGE SCANNING DISPLAY AND PRINTING: Scanning and Sampling,
Quantization and Halftoning, Color Reproduction, Image
Representation and Rendering, Graphics and Fonts, Architectures
and Software for Display and Printing Systems, Image Quality,
Visualization.

4. VIDEO: Digital video, Multimedia, HD video and packet video,
video signal processor chips.

5. APPLICATIONS: Application of image processing technology to
any field.

PROGRAM COMMITTEE:

GENERAL CHAIR: Alan C. Bovik, U. Texas, Austin
TECHNICAL CHAIRS: Tom Huang, U. Illinois, Champaign and
John W. Woods, Rensselaer, Troy
SPECIAL SESSIONS CHAIR: Mike Orchard, U. Illinois, Champaign
EAST EUROPEAN LIAISON: Henri Maitre, TELECOM, Paris
FAR EAST LIAISON: Bede Liu, Princeton University

SUBMISSION PROCEDURES
Prospective authors are invited to propose papers for lecture or
poster presentation in any of the technical areas listed above.
To submit a proposal, prepare a summary of the paper using no
more than 3 pages including figures and references. Send five
copies of the paper summary along with a cover sheet stating the
paper title, technical area(s) and contact address to:
John W. Woods
Center for Image Processing Research
Rensselaer Polytechnic Institute
Troy, NY 12180-3590, USA.

Each selected paper (five-page limit) will be published in the
Proceedings of ICIP-94, using high-quality paper for good image
reproduction. Style files in LaTeX will be provided for the
convenience of the authors.

SCHEDULE
Paper summaries/abstracts due*: 15 February 1994
Notification of Acceptance: 1 May 1994
Camera-Ready papers: 15 July 1994
Conference: 13-16 November 1994

*For an automatic electronic reminder, send a "reminder please"
message to: icip@pine.ece.utexas.edu


CONFERENCE ENVIRONMENT
ICIP-94 will be held in the recently completed state-of-the-art
Convention Center in downtown Austin. The Convention Center is
situated two blocks from Town Lake and is only 12 minutes
from Robert Mueller Airport. It is surrounded by many modern
hotels that provide comfortable accommodation for $75-$125 per
night.

Austin, the state capital, is renowned for its natural hill-
country beauty and an active cultural scene. Within walking
distance of the Convention Center are several hiking and jogging
trails, as well as opportunities for a variety of aquatic sports.
Live bands perform in various clubs around the city and at night
spots along Sixth Street, offering a range of jazz, blues,
country/Western, reggae, swing and rock music. Day temperatures
are typically in the upper sixties in mid-November.

An exciting range of EXHIBITS, TUTORIALS, SPECIAL PRODUCT
SESSIONS, and SOCIAL EVENTS will be offered.

For further details about ICIP-94, please contact:

Conference Management Services
3024 Thousand Oaks Drive
Austin, Texas 78746
Tel: 512/327/4012; Fax:512/327/8132
or email: icip@pine.ece.utexas.edu







------------------------------

End of Neuron Digest [Volume 12 Issue 24]
*****************************************
