Neuron Digest   Monday,  1 Apr 1991                Volume 7 : Issue 16 

Today's Topics:
Genetic algorithms + fractals or L-systems?
USPS address for alternate site
Rigorous results on Fault Tolerance and Robustness (Are there any?)
learned discriminations based upon intensity
Utah's First Annual Cognitive Science Lecture
Cognitive Science at Birmingham
POSITIONS IN NEURAL NETWORKS
Johns Hopkins' search for applications to aid disabled
Postdoc at U of Edinburgh
Summer/Winter Fellowships at DEC/Europe
AI and NN: industrial applications
Applications of ANN in Finance and Banking
Boolean Models(GSN)


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Genetic algorithms + fractals or L-systems?
From: Kingsley Morse <kingsley@hpwrce.hp.com>
Date: Mon, 25 Mar 91 19:34:47 -0800


Is anyone out there mixing genetic algorithms with fractals or L-systems?
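
One minimal sketch of such a combination, offered purely as an
illustration (the alphabet, the toy fitness function and the GA
parameters below are my own assumptions, not something reported by a
Digest contributor): the single production rule of a bracketed L-system
is treated as a genome and evolved with a small genetic algorithm.

# Toy GA over L-system production rules (illustrative sketch only).
import random

ALPHABET = "F+-[]"          # turtle-graphics style symbols
AXIOM = "F"

def expand(axiom, rule, depth):
    """Apply the single production F -> rule for `depth` iterations."""
    s = axiom
    for _ in range(depth):
        s = "".join(rule if c == "F" else c for c in s)
    return s

def fitness(rule):
    """Toy fitness: brackets must balance; prefer expansions near 200 symbols."""
    s = expand(AXIOM, rule, depth=3)
    balance = 0
    for c in s:
        balance += (c == "[") - (c == "]")
        if balance < 0:
            return 0.0
    if balance != 0:
        return 0.0
    return 1.0 / (1.0 + abs(len(s) - 200))

def mutate(rule):
    """Point mutation: replace one symbol of the rule at random."""
    i = random.randrange(len(rule))
    return rule[:i] + random.choice(ALPHABET) + rule[i + 1:]

random.seed(0)
population = ["".join(random.choice(ALPHABET) for _ in range(8))
              for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                 # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]
population.sort(key=fitness, reverse=True)
print("best rule:", population[0], " fitness:", round(fitness(population[0]), 4))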



------------------------------

Subject: USPS address for alternate site
From: elsberry@evax.uta.edu (Wesley R Elsberry)
Date: Wed, 27 Mar 91 00:02:02 -0500

Re: Announcement in ND V7 I:15

In informing Neuron Digest of the existence of Central Neural System BBS,
I managed to omit a critical piece of information for those wishing to
send disks for obtaining files: a snail-mail address. Here it is, with
my apologies for the inconvenience this mistake may have caused:

Wesley R. Elsberry
Sysop, Central Neural System BBS
528 Chambers Creek Dr. S.
Everman, TX 76140


[[ Editor's Note: Thanks again, Wesley! -PM ]]

------------------------------

Subject: Rigorous results on Fault Tolerance and Robustness (Are there any?)
From: Dario Ringach <dario%techunix.bitnet@TAUNIVM.TAU.AC.IL>
Date: Thu, 28 Mar 91 07:43:25 +0200

It is usually claimed that one of the advantages of networks of simple
computing elements is their fault tolerance (or robustness) to
connectivity and node failures. However, I have found it difficult to
locate *rigorous* results on this topic so far. Can anyone provide me
with references on this subject?

Thanks in advance. -Dario

Dario Ringach
Technion, Israel Institute of Technology
Dept. of Electrical Engineering, 32000 Haifa, Israel
dario@techunix.bitnet | dario@techunix.technion.ac.il
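
While the question asks for rigorous results, a small empirical sketch
of the claim itself may still be useful for orientation: train a tiny
network, then zero out an increasing fraction of its weights at random
and observe how gracefully the accuracy degrades. The task, network
sizes and training details below are arbitrary assumptions chosen only
for illustration.

# Empirical fault-tolerance probe (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)     # XOR-like quadrant task

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2 inputs -> 20 tanh hidden units -> 1 sigmoid output, plain batch backprop.
W1 = rng.normal(0, 1, (2, 20)); b1 = np.zeros(20)
W2 = rng.normal(0, 1, (20, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)[:, 0]
    d2 = (p - y)[:, None]                     # cross-entropy gradient at output
    d1 = (d2 @ W2.T) * (1 - h ** 2)           # backpropagated to hidden layer
    W2 -= lr * h.T @ d2 / len(X); b2 -= lr * d2.mean(axis=0)
    W1 -= lr * X.T @ d1 / len(X); b1 -= lr * d1.mean(axis=0)

def accuracy(W1f, W2f):
    p = sigmoid(np.tanh(X @ W1f + b1) @ W2f + b2)[:, 0]
    return ((p > 0.5) == (y > 0.5)).mean()

# Knock out random fractions of the weights ("connection failures").
for frac in (0.0, 0.1, 0.25, 0.5):
    mask1 = rng.random(W1.shape) >= frac
    mask2 = rng.random(W2.shape) >= frac
    print("fault fraction %.2f: accuracy %.2f"
          % (frac, accuracy(W1 * mask1, W2 * mask2)))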


------------------------------

Subject: learned discriminations based upon intensity
From: Jonathan Schull <J_SCHULL@ACC.HAVERFORD.EDU>
Date: Fri, 29 Mar 91 17:42:00 -0500

We want to build a neural net which takes the activation level of one
sensory neuron as the indicator of the intensity of an external stimulus,
and learns to make different responses depending upon the intensity.
Does anyone know the standard architecture and learning rule for this
kind of problem?


[[ Editor's Note: This may be my misunderstanding, but just defining an
"input" node as a sensory neuron does the job. That is, the level of the
stimulus becomes, by definition in a simple model, the output of that
first node. Readers, can you enlighten me as to the distinction here? -PM ]]
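
As one concrete (and purely illustrative) reading of the question and
the note above: the sketch below assumes a single input unit whose
activation is simply the stimulus intensity, a small sigmoid hidden
layer, and one output unit per response, trained with ordinary
backpropagation to respond "weak" below an assumed threshold of 0.5 and
"strong" above it. All sizes, thresholds and learning parameters are my
own assumptions rather than a standard recipe.

# Intensity-discrimination sketch: one sensory input unit, two responses.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Training set: intensity in [0, 1]; target response "weak" or "strong".
intensity = rng.uniform(0.0, 1.0, size=(200, 1))
target = np.hstack([(intensity < 0.5).astype(float),
                    (intensity >= 0.5).astype(float)])

# 1 input -> 4 hidden -> 2 output units.
W1 = rng.normal(0, 1, (1, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 2)); b2 = np.zeros(2)
lr = 0.5

for epoch in range(3000):
    h = sigmoid(intensity @ W1 + b1)          # hidden activations
    y = sigmoid(h @ W2 + b2)                  # response activations
    d2 = y - target                           # cross-entropy gradient at output
    d1 = (d2 @ W2.T) * h * (1 - h)            # backpropagated to hidden layer
    W2 -= lr * h.T @ d2 / len(intensity); b2 -= lr * d2.mean(axis=0)
    W1 -= lr * intensity.T @ d1 / len(intensity); b1 -= lr * d1.mean(axis=0)

test = np.array([[0.1], [0.9]])               # a weak and a strong stimulus
print(sigmoid(sigmoid(test @ W1 + b1) @ W2 + b2))   # rows ~[1, 0] and ~[0, 1]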

------------------------------

Subject: Utah's First Annual Cognitive Science Lecture
From: Jerome Soller <soller@cs.utah.edu>
Date: Mon, 18 Mar 91 22:55:06 -0700

The speaker at the First Annual Utah Cognitive Science Lecture is
Dr. Andreas Andreou of the Johns Hopkins University Electrical
Engineering Department. His topic is "A Physical Model of the Retina in
Analog VLSI That Explains Optical Illusions". This provides a contrast
to Dr. Carver Mead of Caltech, who spoke earlier this year in Utah at the
Computer Science Department's Annual Organick Lecture.
The First Annual Cognitive Science Lecture will be held on Tuesday,
April 2nd, at 4:00 P.M. in room 101 EMCB (next to the Merrill
Engineering Building), University of Utah, Salt Lake City, Utah. A small
reception (refreshments) will be available. This event is cosponsored by
the Sigma Xi Research Fraternity. Dr. Dick Normann, Dr. Ken Horch, Dr.
Dick Burgess, and Dr. Phil Hammond were extremely helpful in organizing
this event.
For more information on this event and other Cognitive Science related
events in the state of Utah, contact me at (801) 581-4710 or by e-mail
(preferred) at soller@cs.utah.edu. We have a 130-person electronic
mailing list within the state of Utah announcing these kinds of events.
We are also finishing up this year's edition of the Utah Cognitive
Science Information Guide, which contains 80 faculty, 60 graduate
students, 60 industry representatives, 32 courses, and 25 research
groups from the U. of Utah, BYU, Utah State and local industry. A rough
draft can be copied by anonymous ftp from /usr/spool/ftp/pub/guide.doc
on the cs.utah.edu machine. A final draft in plain text and a Macintosh
version (better format) will be on the ftp site in about 2 weeks.

Sincerely,

Jerome B. Soller
Ph. D. Student
Department of Computer Science
University of Utah


------------------------------

Subject: Cognitive Science at Birmingham
From: PetersonDM@computer-science.birmingham.ac.uk
Date: Thu, 21 Mar 91 15:18:34 +0000

============================================================================

University of Birmingham

Graduate Studies in COGNITIVE SCIENCE

============================================================================

The Cognitive Science Research Centre at the University of Birmingham
comprises staff from the Departments/Schools of Psychology, Computer
Science, Philosophy and Linguistics, and supports teaching and research
in the inter-disciplinary investigation of mind and cognition. The Centre
offers both MSc and PhD programmes.

MSc in Cognitive Science

The MSc programme is a 12-month conversion course, including a 4-month
supervised project. The course places a particular stress on the relation
between biological and computational architectures.
Compulsory courses: AI Programming, Overview of Cognitive
Science, Knowledge Representation Inference and Expert Systems, General
Linguistics, Human Information Processing, Structures for Data and
Knowledge, Philosophical Questions in Cognitive Science, Human-Computer
Interaction, Biological and Computational Architectures, The Computer and
the Mind, Current Issues in Cognitive Science.
Option courses: Artificial and Natural Perceptual Systems, Speech
and Natural Language, Parallel Distributed Processing.
It is expected that students will have a good degree in
psychology, computing, philosophy or linguistics.
Funding is available through SERC and HTNT.

PhD in Cognitive Science

For 1991 there are 3 SERC studentships available for PhD level research
into a range of topics including:

o computational modelling of emotion
o computational modelling of cognition
o interface design
o computational and psychophysical approaches to vision


Computing Facilities

Students have access to ample computing facilities, including networks of
Apollo, Sun and Sparc workstations in the Schools of Computer Science and
Psychology.

Contact

For further details, contact: Dr. Mike Harris CSRC, School of Psychology,
University of Birmingham, PO Box 363, Edgbaston, Birmingham B15 2TT, UK.

Phone: (021) 414 4913

Email: HARRIMWG@ibm3090.bham.ac.uk


------------------------------

Subject: POSITIONS IN NEURAL NETWORKS
From: Benny Lautrup <LAUTRUP@nbivax.nbi.dk>
Date: Thu, 28 Mar 91 11:22:00 +0100


POSITIONS AVAILABLE IN NEURAL NETWORKS


Recently, the Danish Research Councils funded the setting up of a
Computational Neural Network Centre (CONNECT). There will be some
positions as graduate students, postdocs, and more senior visiting
scientists available in connection with the centre. Four of the junior
(i.e. student and postdoc) positions will be funded directly from the
centre grant and have been allotted to the main activity areas as
described below. We are required to fill these very quickly to get the
centre up and running according to the plans of the program under which
it was funded, so the deadline for applying for them is very soon,
APRIL 25. If there happen to be exceptionally qualified people in the
relevant areas available right now, they should inform us immediately.

We are also sending this letter because there may be other positions
available in the future. These will generally be externally funded.
Normally the procedure would be for us first to identify the good
candidate and then to apply to research councils, foundations and/or
international programs (e.g. NATO, EC, Nordic Council) for support.
This requires some time, so if an applicant is interested in coming
here from the fall of 1992, the procedure should be underway in the
fall of 1991.

The four areas for the present positions are:

Biological sequence analysis

Development of new theoretical tools and computational methods for
analyzing the macromolecular structure and function of biological
sequences. The focus will be on applying these tools and methods to
specific problems in biology, including pre-mRNA splicing and
similarity measures for DNA sequences to be used in constructing
phylogenetic trees. The applicant is expected to have a thorough
knowledge of experimental molecular biology, coupled with experience in
mathematical methods for describing complex biological phenomena. This
position will be at the Department of Structural Properties of
Materials and the Institute for Physical Chemistry at the Technical
University of Denmark.

Analog VLSI for neural networks

Development of VLSI circuits in analog CMOS for the implementation of
neural networks and their learning algorithms. The focus will be on the
interaction between network topology and the constraints imposed by
VLSI technology. The applicant is expected to have a thorough knowledge
of CMOS technology and analog electronics. Experience with the
construction of large systems in VLSI, particularly combined
analog-digital systems, is especially desirable. This position will be
in the Electronics Institute at the Technical University of Denmark.

Neural signal processing

Theoretical analysis and implementation of new methods for optimizing
architectures for neural networks, with applications in adaptive signal
processing, as well as ``early vision''. The applicant is expected to
have experience in mathematical modelling of complex systems using
statistical or statistical mechanical methods. This position will be
jointly in the Electronics Institute at the Technical University of
Denmark and the Department of Optics and Fluid Dynamics, Risoe National
Laboratory.

Optical neural networks

Theoretical and experimental investigation of optical neural networks.
The applicant is expected to have a good knowledge of applied
mathematics, statistics, and modern optics, particularly Fourier
optics. This position will be in the Department of Optics and Fluid
Dynamics, Risoe National Laboratory.

In all cases, the applicant is expected to have some background in
neural networks and experience in programming in high-level languages.
An applicant should send his or her curriculum vitae and publication
list to

Benny Lautrup
Niels Bohr Institute
Blegdamsvej 17
DK-2100 Copenhagen
Denmark

Telephone: (45)3142-1616
Telefax: (45)3142-1016
E-mail: lautrup@nbivax.nbi.dk

before April 25.

He/she should also have two letters of reference sent separately by
people familiar with his/her work by the same date.


------------------------------

Subject: Johns Hopkins' search for applications to aid disabled
From: Russ Eberhart <RCE1%APLVM.BITNET@CUNYVM.CUNY.EDU>
Date: Thu, 28 Mar 91 11:58:17 -0500


Neuron Digest Announcement

JOHNS HOPKINS LAUNCHES NATIONAL SEARCH
FOR COMPUTING APPLICATIONS TO ASSIST PERSONS WITH DISABILITIES

The Johns Hopkins University is now conducting a nationwide search for
Computing Applications to Assist Persons with Disabilities which will run
through February 1992. The search is a competition for ideas, systems,
devices and computer programs designed to help the more than 25 million
Americans with physical or learning disabilities. Systems and devices
using neural network technology are obvious candidates. The search is
open to all residents of the United States. Amateurs, computer
professionals and students are invited to compete for hundreds of prizes
and awards including a $10,000 Grand Prize. Entries may address any
physical, mental, or learning disability and are due by August 23, 1991.

Regional events, competitions and exhibits will be held across the
country from now through December 7 of this year, with progress reports
and announcements being made through local and national media. Regional
winners will compete for the grand prize at the national exhibit and
awards ceremony in Washington, D.C. February 1 and 2, 1992.

The primary goal for the search is putting ingenuity and technology to
work for _people_. To obtain a flier giving details of the competition
and how you can participate, write to:

Computing to Assist Persons with Disabilities
Johns Hopkins National Search
P. O. Box 1200
Laurel, MD 20723

or email your request to rce1@aplvm.bitnet.


------------------------------

Subject: Postdoc at U of Edinburgh
From: D J Wallace <egnp46@castle.edinburgh.ac.uk>
Date: Fri, 29 Mar 91 14:50:51 +0700

POSTDOCTORAL POSITION IN NEURAL NETWORK MODELS AND APPLICATIONS

PHYSICS DEPARTMENT, UNIVERSITY OF EDINBURGH


Applications are invited for a postdoctoral research position in the
Physics Department, University of Edinburgh funded by a Science and
Engineering Research Council grant to David Wallace and Alastair Bruce.
The position is for two years, from October 1991.

The group's interests span theoretical and computational studies of
training algorithms, generalisation, dynamical behaviour and
optimisation. Theoretical techniques utilise statistical mechanics and
dynamical systems. Computational facilities include a range of systems
in Edinburgh Parallel Computing Centre, including a 400-node 1.8Gbyte
transputer system, a 64-node 1Gbyte Meiko i860 machine and AMT DAPs, as
well as workstations and graphics facilities.

There are strong links with researchers in other departments, including
David Willshaw and Keith Stenning (Cognitive Science), Richard Rohwer
(Speech Technology), Alan Murray (Electrical Engineering) and Michael
Morgan and Richard Morris (Pharmacology), and we are in two European
Community Twinnings. Industrial collaborations have included
applications with British Gas, British Petroleum, British Telecom,
National Westminster Bank and Shell.

Applications supported by a cv and two letters of reference should be
sent to

D.J. Wallace
Physics Department,
University of Edinburgh,
Kings Buildings,
Edinburgh EH9 3JZ,
UK

Email: ADBruce@uk.ac.ed and DJWallace@uk.ac.ed
Tel: 031 650 5250 or 5247

to arrive if possible by 30th April. Further particulars can be
obtained from the same address.




------------------------------

Subject: Summer/Winter Fellowships at DEC/Europe
From: <pau@yippee.enet.dec.com>
Date: Fri, 29 Mar 91 07:16:32 -0800


STUDENT SUMMER (or WINTER) FELLOWSHIPS AT DIGITAL EQUIPMENT EUROPE's
European technical center, Sophia Antipolis, France

Digital Equipment Europe is actively pursuing neural processing research
and project work worldwide. As part of these activities, the DEC Europe
European technical center hosts, on a continuing basis, student fellows
at the MSc or PhD level, for work mostly on:
- neural processing in text and image information retrieval
- signal understanding for instrumentation and process control
- neural processing for banking and financial applications
- hybrid systems (neural + expert systems), e.g. for transportation scheduling
- embedded NN classifiers in machine vision applications

Fellowships last a minimum of 3 months and a maximum of 6 months, or
less as set by visa or work permit obligations; preference will go to
European (but also Asian) applicants; the working language is primarily
English. DEC cannot cover travel expenses, but pays a fixed monthly
trainee allowance covering housing and food, dependent on diplomas and
work experience. Applications can be sent at any time to: Dr. L. F. Pau,
Technical Director, DEC Europe, P.O. Box 129, F-06561 Valbonne, France.
An application must include: a full C.V., photos, a copy of the last
diploma, a 2-page statement of experience/courses taken in NN, proof of
student registration, and preferably 2 letters of recommendation by
faculty. It can be arranged, in some cases, for the project work reports
to be used in part to satisfy project requirements and grading at the
home institution. The working environment requires familiarity with
workstations and the VMS, UNIX or DOS operating systems.


------------------------------

Subject: AI and NN: industrial applications
From: <pau@yippee.enet.dec.com>
Date: Fri, 29 Mar 91 07:16:32 -0800

AI and NN: industrial applications

In response to a few requests posted on this BB, this is to bring to
your attention the following paper, covering extensive industrial
projects and the lessons learned:

L. F. Pau, F. S. Johansen, "Neural network signal understanding for
instrumentation", IEEE Trans. on Instrumentation and Measurement,
Vol. 39, No. 4, August 1990, pp. 558-564.

Abstract: This paper reports on the use of neural signal interpretation
theory and techniques for the purpose of classifying the shapes of a set
of instrumentation signals, in order to calibrate devices, diagnose
anomalies, generate tuning/settings, and interpret the measurement
results. Neural signal understanding research is surveyed, and the
selected implementation is described with its performance in terms of
correct classification rates and robustness to noise. Formal results on
neural net training time and sensitivity to weights are given. A theory
for neural control is given using functional link nets, and an
explanation technique is designed to help neural signal understanding.
The results are compared to those of a knowledge-based signal
interpretation system within the context of the same specific instrument
and data.

------------------------------

Subject: Applications of ANN in Finance and Banking
From: <pau@yippee.enet.dec.com>
Date: Fri, 29 Mar 91 07:16:32 -0800

BOOK INCLUDING APPLICATIONS OF NN IN FINANCE AND BANKING

The following book has appeared which features chapters on the use of
neural networks, as well as inductive learning, in about 10 different
financial applications, besides detailing the most advanced
knowledge-based techniques in those areas:

L. F. Pau, C. Gianotti, Economic and Financial Knowledge Based Systems,
Springer Verlag, New York and Heidelberg, 1990.


------------------------------

Subject: Boolean Models(GSN)
From: ecdbcf@ukc.ac.uk
Date: Mon, 25 Feb 91 17:07:30 +0000

Dear Connectionists,

Most people who read this mail will probably be working with
continuous/analogue models. There is, however, a growing interest in
Boolean neuron models, and some readers might be interested to know that
I have recently successfully completed a Ph.D. thesis which deals with a
particular kind of Boolean neuron. Some brief details are given below,
together with some references to more detailed material.

=-----------------------------------------------------------------------
Abstract

This thesis is concerned with the investigation of Boolean neural
networks based on a novel RAM-based Goal-Seeking Neuron (GSN). Boolean
neurons are particularly suited to the solution of Boolean or logic
problems such as the recognition and associative recall of binarised
patterns.

One main advantage of Boolean neural networks is the ease with which they
can be implemented in hardware. This can result in very fast operation.
The GSN has been formulated to ensure this implementation advantage is
not lost.

The GSN model operates through the interaction of a number of local low
level goals and is applicable to practical problems in pattern
recognition with only a single pass of the training data (one-shot
learning).

The thesis explores different architectures for GSNs (feed-forward,
feedback and self-organising) together with different learning rules, and
investigates a wide range of alternative configurations within these
three architectures. Practical results are demonstrated in the context
of a character recognition problem.
=-----------------------------------------------------------------------

Highlights of GSNs, Learning Algorithms, Architectures and Main
Contributions

The main advantage of RAM-based neural networks in comparison with
networks based on sum-of-products functions is the ease with which they
can be implemented in hardware. This derives from their essentially
logical rather than continuous nature.
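
To make the general RAM-based idea concrete for readers who have not met
it, here is a small illustrative sketch of a classical n-tuple
(WISARD-style) discriminator built from RAM nodes. It is emphatically
not the GSN model itself, whose goal-seeking behaviour is richer; the
tuple size, pattern size and example patterns are arbitrary assumptions.

# n-tuple RAM-node discriminator (illustrative sketch, not the GSN).
import random

N_TUPLE = 4          # address width: input bits per RAM node
PATTERN_BITS = 16    # size of the binary input pattern

random.seed(1)
# Fixed random assignment of input bits to RAM nodes.
mapping = random.sample(range(PATTERN_BITS), PATTERN_BITS)
groups = [mapping[i:i + N_TUPLE] for i in range(0, PATTERN_BITS, N_TUPLE)]

def make_discriminator():
    return [dict() for _ in groups]           # one (sparse) RAM per node

def address(pattern, group):
    return tuple(pattern[i] for i in group)

def train(discriminator, pattern):
    """One-shot write: set the addressed location of every RAM node to 1."""
    for ram, group in zip(discriminator, groups):
        ram[address(pattern, group)] = 1

def score(discriminator, pattern):
    """Count how many RAM nodes recognise their sub-pattern."""
    return sum(ram.get(address(pattern, group), 0)
               for ram, group in zip(discriminator, groups))

# Two classes, one noiseless training example each (one-shot learning).
class_a = [1] * 8 + [0] * 8
class_b = [0] * 8 + [1] * 8
disc_a, disc_b = make_discriminator(), make_discriminator()
train(disc_a, class_a)
train(disc_b, class_b)

probe = class_a[:]
probe[0] ^= 1        # flip one bit: a noisy version of class A
print("A-score:", score(disc_a, probe), " B-score:", score(disc_b, probe))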

The GSN model has a natural propensity to solve the main problems
associated with other RAM-based neurons. Specific classes of
computational activity can be more appropriately realised by using a
particular goal seeking function, and different kinds of goal seeking
functions can be sought in order to provide a range of suitable
behaviour, creating effectively a family of GSNs.

The main experimental results have demonstrated the viability of the
one-shot learning algorithms: partial pattern association,
quasi-self-organisation, and self-organisation. The one-shot learning is
only possible because of the GSN's ability to validate the
possibility of learning a given input pattern using a single
presentation.

The partial pattern association and the quasi-self-organising learning
have been applied in feed-forward architectures. These two kinds of
learning have given similar performance, though the quasi-self-organising
learning gives slightly better results when a small training size is
considered.

The work reported has established the viability and basic effectiveness
of the GSN concept. The GSN proposal provides a new range of
computational units, learning algorithms, architectures, and new concepts
related to the fundamental processes of computation using Boolean
networks. In all of these ideas further modifications, extensions, and
applications can be considered in order fully to establish Boolean neural
networks as a strong candidate for solving Boolean-type problems. A
great deal of additional research can be identified for immediate
investigation as follows.

One of the most important contributions of this work is the idea of
flexible local goals in RAM-based neurons which allows the application of
RAM-based neurons and architectures to a wider range of problems.

The definition of the goal seeking functions for all the GSN models used
in the feed-forward, feedback and self-organising architectures are
important because they provide local goals which try to maximise the
memory capacity and to improve the recall of correct output patterns.

Although the supervised pattern association learning is not the kind of
learning most suitable for use with GSN networks, because it demands
multiple presentations of the training set and causes fast saturation of
the neurons' contents, the variety of solutions presented to the problem
of conflict of learning can help to achieve correct learning with a
relatively small number of activations compared to the traditional way of
erasing a path without taking care to keep the maximum number of stored
patterns.

The partial pattern association, quasi-self-organising, and the
self-organising learning have managed to break away from the traditional
necessity for many thousands of presentations of the training set, and
instead have concentrated on providing one-shot learning. This is made
possible by the propagation of the undefined value between the neurons in
conjunction with the local goal used in the validating state.

Due to the partial coverage area and the limited functionality of the
pyramids, which can cause an inability to learn particular patterns, it
is important to change the desired output patterns in order to be able to
learn these classes. The network produces essentially self-desired
output patterns which are similar to the desired output patterns, but not
necessarily the same. The differences between the desired output
patterns and the self-desired output patterns can be observed in the
learning phase by looking at the output values of each pyramid and the
desired output values.

The definition of the self-desired and the learning probability recall
rules have provided a way of sensing the changes in the desired output
patterns, and of achieving the required pattern classification.

The principles of low connectivity and partial coverage area make possible
more realistic VLSI implementations in terms of memory requirements and
overall connection complexity associated with the traditional problem of
fan-in and fan-out for high connectivity neurons.

The feedback architecture is able to achieve associative recall and
pattern completion, demonstrating that it is possible to have a cascade
of feedback networks that incrementally increases the similarity between
a prototype and the output patterns. The utilisation of the freeze
feedback operation has given a high percentage of correct convergences
and fast stabilisation of the output patterns.

The analysis of the saturation problem has demonstrated that the
traditional way of using uniform connectivity for all the layers impedes
the advance of the learning process and many memory addresses remain
unused. This is because saturation is not at the same level for each of
the layers. Thus, a new approach has been developed to assign a varied
connectivity to the architecture which can achieve a better capacity of
learning, a lower level of saturation and a smaller residue of unused
memory.

In terms of architectures and learning, an important result is the design
of the GSN self-organising network which incorporates some principles
related to the Adaptive Resonance Theory (ART). The self-organising
network contains intrinsic mechanisms to prevent the explosion of the
number of clusters necessary for self-stabilising a given training
pattern set. Several interesting properties are found in the GSN
self-organising network, such as attention, discrimination,
generalisation, self-stabilisation, and so on.

References

@conference{key210,
author = "D L Bisset And E C D B C Filho And M C Fairhurst",
title = "A Comparative study of neural network structures for
practical application in a pattern recognition environment",
publisher= "IEE",
booktitle= "Proc. First IEE International Conference on
Artificial Neural Networks",
address = "London, UK",
month = "October",
pages = "378-382",
year = "1989"
}

@conference{key214,
author = "E C D B C Filho And D L Bisset And M C Fairhurst",
title = "A Goal Seeking Neuron For {B}oolean Neural Networks",
publisher= "IEEE",
booktitle= "Proc. International Neural Networks Conference",
address = "Paris, France",
month = "July",
volume = "2",
pages = "894-897",
year = "1990"
}

@article{key279,
author = "E C D B C Filho And D L Bisset And M C Fairhurst",
title = "Architectures for Goal-Seeking Neurons",
journal= "International Journal of Intelligent Systems",
publisher= "John Wiley & Sons, Inc",
note = "To Appear",
year = "1991"
}

@article{key280,
author = "E C D B C Filho And M C Fairhurst And D L Bisset",
title = "Adaptive Pattern Recognition Using Goal-Seeking Neurons",
journal= "Pattern Recognition Letters",
publisher= "North Holland",
month = "March"
year = "1991"
}

All the best,

Edson ... Filho

-- After 10-Mar-91 -----------------------------------------------------------
! Universidade Federal de Pernambuco ! e-mail: edson@di0001.ufpe.anpe.br !
! Departamento de Informatica ! Phone: (81) 2713052 !
! Av. Prof. Luis Freire, S/N ! !
! Recife --- PE --- Brazil --- 50739 ! !
------------------------------------------------------------------------------

------------------------------

End of Neuron Digest [Volume 7 Issue 16]
****************************************
