Neuron Digest   Monday, 29 Jul 1991                Volume 7 : Issue 42 

Today's Topics:
new cluster tool available
SUMMARY: GA&NN
Re: Stability proofs for recurrent networks
NetTools - a package of tools for NN analysis
Student Conference
ENNS information
RE: Neuron Digest V7 #39
TR - Autoregressive Backpropagation Algorithm
TR - Experiments with the Cascade-Correlation Algorithm
TR available: learning phrase structure


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

----------------------------------------------------------------------

Subject: new cluster tool available
From: stolcke@ICSI.Berkeley.EDU
Date: Mon, 15 Jul 91 13:00:20 -0600


Dear Connectionists:

After several months of testing I'm releasing a slightly revised version
of my enhanced cluster utility. A major memory allocation glitch was fixed
and support for System V curses pads was added.

I should note that for viewing the graph output of cluster, the original
version of the xgraph program is not sufficient, since it cannot handle
labeled data points. An enhanced version that works well with cluster can
be retrieved by ftp from the same location as cluster (see below).

Andreas


HOW TO GET CLUSTER

cluster is available via anonymous ftp from icsi-ftp.berkeley.edu
(128.32.201.55). To get it use FTP as follows:

% ftp icsi-ftp.berkeley.edu
Connected to icsic.Berkeley.EDU.
220 icsi-ftp (icsic) FTP server (Version 5.60 local) ready.
Name (icsic.Berkeley.EDU:stolcke): anonymous
Password (icsic.Berkeley.EDU:anonymous):
331 Guest login ok, send ident as password.
230 Guest login Ok, access restrictions apply.
ftp> cd pub/ai
250 CWD command successful.
ftp> binary
200 Type set to I.
ftp> get cluster-2.2.tar.Z
200 PORT command successful.
150 Opening BINARY mode data connection for cluster-2.2.tar.Z (15531 bytes).
226 Transfer complete.
15531 bytes received in 0.08 seconds (1.9e+02 Kbytes/s)
ftp> quit
221 Goodbye.

HOW TO BUILD CLUSTER

Unpack in an empty directory using

% zcat cluster-2.2.tar.Z | tar xf -

Read the README and especially the man page (cluster.man) for information.
Check the Makefile for any compile time flags that might need adjustment.
Then compile with

% make

After making the appropriate adjustments in the Makefile you can

% make install



------------------------------

Subject: SUMMARY: GA&NN
From: Bernd Rosauer <rosauer@fzi.uka.de>
Date: Tue, 16 Jul 91 22:47:52 +0000

Some weeks ago I posted a request concerning the combination of genetic
algorithms and neural networks. Below is a summary of the references I
received. This summary is preliminary and the references have not been
completely reviewed; I may post an annotated version at the end of the
year, once I have all of this year's relevant proceedings.

I would like to make some general comments in advance. First of all, two
surveys have already been published which cover the literature up to 1990:

Rudnick, Mike. "A Bibliography of the Intersection of Genetic
Search and Artificial Neural Networks."
Technical Report CS/E
90-001, Department of Computer Science and Engineering, Oregon
Graduate Institute, January 1990.

Weiss, Gerhard. "Combining Neural and Evolutionary Learning:
Aspects and Approaches."
Report FKI-132-90, Institut fuer
Informatik, Technische Universitaet Muenchen, May 1990.

As one of my trustworthy informants told me, the proceedings of ICGA'91
and NIPS'91 (will) contain a great deal of material on this topic.
Finally, there is a mailing list on "neuro-evolution". Because the
administrator has not yet answered my request, I do not know whether this
list is still active. Anyway, try

<neuro-evolution-request@cse.ogi.edu>

for further information.

Now, here is the summary. Many thanks to everyone who responded. Feel
free to send me further references.

Bernd

- -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

Ackley, D. H., and M. S. Littman. "Learning from natural selection
in an artificial environment."
Proceedings of the International
Joint Conference on Neural Networks Washington, D.C., January 1990.
Ackley, D. H., and M. L. Littman. "Interactions between learning and
evolution."
Artificial Life 2. Ed. Chris Langton. New York:
Addison-Wesley, in press.
Belew, R. K. "Evolution, learning and culture: computational metaphors
for adaptive search."
Complex Systems 4.1 (1990): 11-49.
Belew, R. K., J. McInerney and N. N. Schraudolph. "Evolving networks:
Using the Genetic Algorithm with connectionist learning."
Proc.
2nd Artificial Life Conference. New York: Addison-Wesley, in press.
Belew, R. K., J. McInerney and N. Schraudolph. "Evolving Networks:
Using the Genetic Algorithm with Connectionist Learning."
Technical
Report CS90-174, University of California at San Diego, 1990.
Hinton, G. E., and S. J. Nowlan. "How Learning Guides Evolution."
Complex Systems 1 (1987): 495-502.
Ichikawa, Y. "Evolution of Neural Networks and Applications to Motion
Control."
Proc. of IEEE Int. Work. on Intelligent Motion Control,
Vol.1, 1990.
Keesing, Ron, and David Stork. N.t. NIPS-3, 1990.
Kitano, Hiroaki. "Empirical Study on the Speed of Convergence of Neural
Network Training using Genetic Algorithms."
Proceedings of AAAI-90.
Kitano, Hiroaki. "Designing Neural Networks with Genetic Algorithms using
Graph Generation System."
Complex Systems 4.4 (1990).
Kouchi, Masahiro, Hiroaki Inayoshi and Tsutomu Hoshino. "Optimization of
Neural-Net Structure by Genetic Algorithm with Diploidy and Geographical
Isolation Model."
Inst. of Engineering Mechanics, Univ. Tsukuba, Ibaraki
305, Japan.
Menczer, F., and D. Parisi. "`Sexual' reproduction in neural networks."
Technical Report PCIA-90-06, Institute of Psychology, C.N.R., Rome,
1990.
Menczer, F., and D. Parisi. "Evidence of hyperplanes in the genetic
learning of neural networks."
Technical Report PCIA-91-08, Institute
of Psychology, C.N.R., Rome, 1991.
Miglino, O., and D. Parisi. "Evolutionary stable and unstable strategies
in neural networks."
Technical Report PCIA-91-09, Institute of
Psychology, C.N.R., Rome, 1991.
Mjolsness, Eric, David H. Sharp and Bradley K. Alpert. "Scaling, Machine
Learning, and Genetic Neural Nets."
Advances in Applied Mathematics 10
(1989): 137-163.
Montana, David J., and Lawrence Davis. "Training Feedforward Neural
Networks using Genetic Algorithms."
Proceedings of the 11th Intern.
Joint Conference on Artificial Intelligence, 1989, pp. 762-767.
Muehlenbein, H., and J. Kindermann. "The Dynamics of Evolution and
Learning - Towards Genetic Neural Networks."
Connectionism in
Perspective. Ed. R. Pfeifer et. al. Elsevier, 1989. pp. 173-197.
Nolfi, S., J. Elman and D. Parisi. "Learning and Evolution in Neural
Networks."
CRL Technical Report 9019, University of California at
San Diego, 1990.
Nolfi, S., and D. Parisi. "Auto-teaching: neural networks that develop
their own teaching input."
Technical Report PCIA-91-03, Institute of
Psychology, C.N.R., Rome, 1991.
Parisi, D., F. Cecconi and S. Nolfi. "Econets: Neural Networks that Learn
in an Environment."
Network 1 (1990): 149-168.
Parisi, D., S. Nolfi, and F. Cecconi. "Learning, Behavior, and Evolution."
Technical Report PCIA-91-14, Institute of Psychology, C.N.R., Rome, 1991.
Radcliffe, Nick. "Genetic Neural Networks on MIMD Computers." Ph.D.
Thesis, University of Edinburgh.
Radcliffe, Nick. "Equivalence Class Analysis of Genetic Algorithms."
Complex Systems, in press.
Radcliffe, Nick. "Forma Analysis and Random Respectful Recombination."
Proceedings of ICGA91, in press.
Todd, P. M., and G. F. Miller. "Exploring adaptive agency II: simulating
the evolution of associative learning."
From Animals to Animats. Eds.
J. A. Meyer and S. W. Wilson. Cambridge, MA: MIT, 1991.



------------------------------

Subject: Re: Stability proofs for recurrent networks
From: Gary Cottrell <gary@cs.UCSD.EDU>
Date: Mon, 22 Jul 91 15:42:37 -0700

Hal White has shown convergence conditions for learning in recurrent
nets. Try writing him for reprints.

He is:
Hal White
Dept. of Economics
UCSD
La Jolla, CA 92093

gary


------------------------------

Subject: NetTools - a package of tools for NN analysis
From: stevep@cs.uq.oz.au
Date: Wed, 24 Jul 91 18:41:43 +1000


NetTools is a package of analysis tools and a technical report
demonstrating two of these techniques.


Analysis Tools for Neural Networks.

by Simon Dennis and Steven Phillips

Abstract - A large volume of neural net research in the 1980's involved
applying backpropagation to difficult and generally poorly understood
tasks. Success was sometimes measured by the ability of the network to
replicate the required mapping. The difficulty with this approach, which
is essentially a black box analysis, is that we are left with little
additional understanding of the problem or the way in which the neural
net has solved it. Techniques which can look inside the black box are
required. This report focuses on two statistical analysis techniques
(Principal Components Analysis and Canonical Discriminant Analysis) as
tools for analysing and interpreting network behaviour in the hidden unit
layers.




Net Tools

The following package contains three tools for network analysis:

gea - Group Error Analysis
pca - Principal Components Analysis
cda - Canonical Discriminants Analysis

TOOL DESCRIPTIONS

Group Error Analysis (gea)

Gea counts errors. It takes an output file and a target file and
optionally a groups file. Each line in the output file is an output
vector and the lines in the targets file are the corresponding correct
values. If all values in the output file are within the criterion of
those in the target file then the pattern is considered correct. Note
that this is a more stringent measure of correctness than the total sum
of squares. In particular it requires the outputs to be either high or
low rather than taking some average intermediate value. If a groups file
is provided then gea will separate the error count into the groups
provided.
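
As a rough illustration of the all-outputs-within-criterion rule described
above (this is not the gea program's actual file-based interface; the
function name, criterion value, and in-memory arrays below are made up for
the example), here is a short Python sketch:

import numpy as np

def group_error_count(outputs, targets, criterion=0.4, groups=None):
    # A pattern is correct only if every output value is within `criterion`
    # of the corresponding target value (stricter than total sum of squares).
    outputs = np.asarray(outputs, dtype=float)
    targets = np.asarray(targets, dtype=float)
    correct = np.all(np.abs(outputs - targets) <= criterion, axis=1)
    if groups is None:
        return int(np.sum(~correct))                  # overall error count
    errors = {}                                       # error count per group
    for g, ok in zip(groups, correct):
        errors[g] = errors.get(g, 0) + (0 if ok else 1)
    return errors

# Three patterns with two output units each, split into two groups.
outs = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
tgts = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
print(group_error_count(outs, tgts, criterion=0.3, groups=["A", "A", "B"]))
# -> {'A': 1, 'B': 0}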

Principal Components Analysis (pca)

Principal components analysis takes a set of points in a high dimensional
space and determines the major components of variation. The principal
components are labeled 0-(n-1) where n is the dimensionality of the space
(i.e. the number of hidden units). The original points can be projected
onto these vectors. The result is a low dimensional plot which has
hopefully extracted the important information from the high dimensional
space.
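
A minimal Python sketch of the projection step described above, assuming
the hidden-unit activations have already been collected into a
patterns-by-units matrix (the random data below is only a stand-in, and
this is not the pca program itself):

import numpy as np

# Stand-in for a matrix of recorded activations: 100 patterns, 8 hidden units.
rng = np.random.default_rng(0)
activations = rng.random((100, 8))

centered = activations - activations.mean(axis=0)
# Rows of vt are the principal directions, ordered by decreasing variance.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ vt[:2].T        # coordinates for a 2-D plot

print(projected.shape)                 # (100, 2)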

Canonical Discriminants Analysis (cda)

Canonical discriminant analysis takes a set of grouped points in a high
dimensional space and determines components along which points within a
group form tight clusters. These components are called the canonical variates
and are labeled 0-(n-1) where n is the dimensionality of the space (i.e.
the number of hidden units). The original points can be projected on to
these vectors. The result is a low dimensional plot which has clustered
the points belonging to each group.
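
A hedged Python sketch of the same idea, computing canonical variates from
within-group and between-group scatter matrices (again an illustration, not
the cda program shipped with NetTools):

import numpy as np

def canonical_variates(X, labels, k=2):
    # Directions that maximize between-group scatter relative to
    # within-group scatter: eigenvectors of pinv(Sw) @ Sb.
    X = np.asarray(X, dtype=float)
    overall_mean = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    Sb = np.zeros_like(Sw)
    for g in np.unique(labels):
        Xg = X[labels == g]
        mg = Xg.mean(axis=0)
        Sw += (Xg - mg).T @ (Xg - mg)
        Sb += len(Xg) * np.outer(mg - overall_mean, mg - overall_mean)
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(evals.real)[::-1]
    return X @ evecs[:, order[:k]].real   # low-dimensional plot coordinates

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))              # e.g. activations of 8 hidden units
labels = np.repeat(np.array([0, 1, 2]), 40)
print(canonical_variates(X, labels).shape) # (120, 2)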

TECHNICAL REPORT

Reference: Simon Dennis and Steven Phillips.
Analysis Tools for Neural Networks.
Technical Report 207,
Department of Computer Science,
University of Queensland,
Queensland, 4072
Australia
May, 1991

NetTools.ps is a technical report which demonstrates the results which
can be obtained from pca and cda. It outlines the advantages of each and
points out some interpretive pitfalls which should be avoided.

TUTORIAL

The directory tute contains a tutorial designed at the University of
Queensland by Janet Wiles and Simon Dennis to introduce students to
network analysis. It uses the iris data first published by Fisher in
1936. The backpropagation simulator is tlearn, developed at UCSD by
Jeffrey Elman and colleagues. In addition, the tutorial uses the
hierarchical clustering program, cluster, which was written by Yoshiro
Miyata and modified by Andreas Stolcke.

These tools can be obtained as follows


$ ftp crl.ucsd.edu
Connected to crl.ucsd.edu.
220 crl FTP server (SunOS 4.1) ready.
Name (crl.ucsd.edu:mav): anonymous
331 Guest login ok, send ident as password.
Password:
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuralnets
250 CWD command successful.
ftp> bin
200 Type set to I.
ftp> get NetTools.tar.Z
200 PORT command successful.
150 Binary data connection for NetTools.tar.Z (130.102.64.15,1240) (185900 bytes).
226 Binary Transfer complete.
local: NetTools.tar.Z remote: NetTools.tar.Z
185900 bytes received in 1.9e+02 seconds (0.97 Kbytes/s)
ftp> quit
221 Goodbye.
$ zcat NetTools.tar.Z | tar -xf -

Shalom
Simon and Steven

=-----------------------------------------------------------------------------
Simon Dennis               Address: Department of Computer Science
Email: mav@cs.uq.oz.au              University of Queensland
                                    QLD 4072
                                    Australia
=-----------------------------------------------------------------------------



------------------------------

Subject: Student Conference
From: Masud Cader <CADER%AUVM.BITNET@CUNYVM.CUNY.EDU>
Date: Thu, 25 Jul 91 01:22:07 -0400

Research Experience for Undergraduates
Third Annual Student Conference
on
Intelligent Systems

Dept. of Computer Science and Information Systems
American University
4400 Massachusetts Avenue, N.W., Washington, D.C. 20016


8:45am-12:30pm, 25 July 1991

Ward Building, Room 220


The Third Annual NSF sponsored Research Experience for Undergraduates
Student Conference focusing on Intelligent Systems will be held on the
campus of the American University in Ward Building, room 220.

The student research conference will be split into five main sessions:
Automation of the RITES survey, Neural Approaches for survey analysis,
Hybrid architectures, Distributed AI, and Intelligent system design.

Individual talks will focus on such topics as Knowledge Based Systems,
Fuzzy Petri Nets, DAI, Neural systems, and Self-Organizing systems.

Sessions begin at 9am and proceed for approximately 30 minutes each.


The public is invited to attend. There is NO registration fee.

For further details please contact Dr. Larry Medsker (medsker@auvm.bitnet)
or phone at (202) 885-1470.
=------------------------------------------------------------------------
Masud Cader          cader@auvm.bitnet
Dept. CSIS           (202) 885-2767
Dept. Math & Stat    (202) 885-2717
=------------------------------------------------------------------------


------------------------------

Subject: ENNS information
From: R14502%BBRBFU01.BITNET@CUNYVM.CUNY.EDU
Date: 25 Jul 91 17:14:57

Bruxelles, le 24 July 1991

Concerns : Information about ENNS.

Thanks to all of you who have shown interest in ENNS. Your name will be
put on a list and you will receive available information in due course.

Here are answers to the questions you all asked.

To join, please send a cheque for 50 ECUs (European Currency Units)
or the equivalent in any currency to:

Prof. John Taylor
King's College London
Dept. of Mathematics
University of London
Strand, London WC2R 2LS
England

The special interest groups have not yet been formed and will materialize
during ICANN 1992 in Brighton. Professor Igor Aleksander is one of the
organizers. His address is:

I. Aleksander
Imperial College of Science and Technology
Dept. of Computing
180 Queen's Gate
London SW7 2B2
U.K.

The question of bylaws will be settled by 1992.

A. Babloyantz
Secretary ENNS


------------------------------

Subject: RE: Neuron Digest V7 #39
From: avlab::mrgate::"a1::raethpg"%avlab.dnet@wrdc.af.mil
Date: Fri, 26 Jul 91 13:20:25 -0400

From: NAME: Major Peter G. Raeth
FUNC: WRDC/AAWP-1
TEL: AV-785 513-255-7854 <RAETHPG AT A1 AT AVLAB>
To: NAME: VMSMail User "neuron <"neuron@hplpm.hpl.hp.com"@LABDDN@MRGATE>


In the last issue someone was asking about the First International
Conference on AI Applications on Wall Street.

Information and registration forms for the conference are available by
contacting Ms Mary Bianchi (Voice: 718-260-3760, Fax: 718-260-3136).




------------------------------

Subject: TR - Autoregressive Backpropagation Algorithm
From: Russell Leighton <russ@oceanus.mitre.org>
Date: Mon, 15 Jul 91 13:39:41 -0400

The following paper is available in the neuroprose library (leighton.ar-backprop.ps.Z).


The Autoregressive Backpropagation Algorithm

{To appear in the Proceedings of the International Joint
Conference on Neural Networks, 1991}

Russell R. Leighton and Bartley C. Conrath
The MITRE Corporation
7525 Colshire Drive, McLean, VA 22102


This paper describes an extension to error backpropagation
that allows the nodes in a neural network to encode state information
in an autoregressive ``memory.'' This neural model gives such
networks the ability to learn to recognize sequences and
context-sensitive patterns. Building upon the work of Wieland
concerning nodes with a single feedback connection, this paper
generalizes the method to $n$ feedback connections and addresses
stability issues. The learning algorithm is derived, and a few
applications are presented.
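
The paper contains the exact node equations and learning rule; as a hedged
guess at the general idea only, here is a small Python sketch of a single
node whose response depends on an n-tap autoregressive memory of its own
past activations (all weights and the input sequence are made up):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ar_node_response(inputs, w, a):
    # `w` weights the current input vector; `a` holds n feedback
    # coefficients applied to the node's n most recent activations,
    # so the response depends on the sequence, not just the current input.
    n = len(a)
    history = np.zeros(n)                  # most recent activation first
    outputs = []
    for x in inputs:
        net = np.dot(w, x) + np.dot(a, history)
        y = sigmoid(net)
        history = np.concatenate(([y], history[:-1]))
        outputs.append(y)
    return np.array(outputs)

# One node with three input lines and two feedback taps, driven by a
# short input sequence.
seq = [np.array([1.0, 0.0, 0.0]),
       np.array([0.0, 1.0, 0.0]),
       np.array([0.0, 0.0, 1.0])]
print(ar_node_response(seq, w=np.array([0.5, -0.3, 0.8]), a=np.array([0.6, 0.2])))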


To get the paper:

1. ftp 128.146.8.62
2. cd pub/neuroprose
3. binary
4. get leighton.ar-backprop.ps.Z
5. quit
6. uncompress leighton.ar-backprop.ps.Z
7. lpr leighton.ar-backprop.ps

Russ


INTERNET: russ@dash.mitre.org

Russell Leighton
MITRE Signal Processing Lab
7525 Colshire Dr.
McLean, Va. 22102
USA



------------------------------

Subject: TR - Experiments with the Cascade-Correlation Algorithm
From: Jihoon Yang <yang@judy.cs.iastate.edu>
Date: Fri, 19 Jul 91 11:46:13 -0500

> ------------------------------------------------------------------------
>
> The following tech report is now available as a compressed postscript
> file "yang.cascor.ps.Z" through anonymous ftp from the neuroprose archive
> (directory pub/neuroprose on cheops.cis.ohio-state.edu - Thanks to
> Jordan Pollack of Ohio State University).
>
> Experiments with the Cascade-Correlation Algorithm
> Technical report # 91-16 (July 1991)
> Jihoon Yang & Vasant Honavar
> Department of Computer Science
> Iowa State University
>
> -----------------------------------------------------------------------
> Jihoon Yang
> yang@judy.cs.iastate.edu
>



------------------------------

Subject: TR available: learning phrase structure
From: George Berg <berg@cs.albany.edu>
Date: Tue, 23 Jul 91 18:08:59 -0400


The following paper is available:


Learning Recursive Phrase structure:
Combining the Strengths of PDP and X-Bar Syntax

George Berg
Department of Computer Science
Department of Linguistics and Cognitive Science
State University of New York at Albany



ABSTRACT

In this paper we show how a connectionist model, the XERIC Parser,
can be trained to build a representation of the syntactic structure of
sentences. One of the strengths of this model is that it avoids
placing a priori restrictions on the length of sentences or the depth
of phrase structure nesting. The XERIC architecture uses X-Bar
grammar, an "unrolled" virtual architecture reminiscent of Rumelhart
and McClelland's back-propagation through time, recurrent networks and
reduced descriptions similar to Pollack's RAAM. Representations of
words are presented one at a time, and the parser incrementally builds
a representation of the sentence structure. Along the way it does
lexical and number/person disambiguation. The limits on the current
model's performance are consistent with the difficulty of encoding
information (especially lexical information) as the length and
complexity of the sentence increases.
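
As a rough, hedged illustration of the "one word at a time, incrementally
built representation" idea only, here is a generic recurrent-encoder sketch
in Python; it is not the XERIC architecture itself (which additionally uses
X-Bar structure and RAAM-style reduced descriptions), and the weights are
random stand-ins rather than trained values:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def incremental_encode(words, w_in, w_rec, state_size=10):
    # Combine each incoming word vector with a fixed-width state so that a
    # sentence of any length is compressed into one state vector.
    state = np.zeros(state_size)
    for word in words:
        state = sigmoid(w_in @ word + w_rec @ state)
    return state

rng = np.random.default_rng(0)
words = [rng.random(20) for _ in range(5)]            # five 20-dim word vectors
w_in = rng.normal(size=(10, 20))
w_rec = rng.normal(size=(10, 10))
print(incremental_encode(words, w_in, w_rec).shape)   # (10,)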


This paper is to be presented at the IJCAI-91 Workshop on Natural
Language Learning, and is also available as SUNY Albany Computer
Science Department technical report TR 91-5.

===============================================================================

This paper is available three ways. Please DO NOT write me for a
copy (among other reasons, because I'll be out of town most of the
rest of the Summer). I will, however, be happy to answer questions and
otherwise discuss the paper.


First Way: anonymous ftp via the neuroprose archive:

The file is available via anonymous ftp from
cheops.cis.ohio-state.edu as the file berg.phrase_structure.ps.Z in
the pub/neuroprose directory. It is a compressed postscript file.
Below is the log of a typical ftp session to retrieve the file:

yourprompt> ftp cheops.cis.ohio-state.edu
Connected to cheops.cis.ohio-state.edu.
220 cheops.cis.ohio-state.edu FTP server (Version 5.49 Tue May 9 14:01:04 EDT 1989) ready.
Name (cheops.cis.ohio-state.edu:you): anonymous
331 Guest login ok, send ident as password.
Password:
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
250 CWD command successful.
ftp> binary
200 Type set to I.
ftp> get berg.phrase_structure.ps.Z
200 PORT command successful.
150 Opening BINARY mode data connection for berg.phrase_structure.ps.Z (107077 bytes).
226 Transfer complete.
local: berg.phrase_structure.ps.Z remote: berg.phrase_structure.ps.Z
107077 bytes received in 7.3 seconds (14 Kbytes/s)
ftp> quit
221 Goodbye.
yourprompt>

Then uncompress the file and print it in your local fashion for
printing postscript.



Second Way: anonymous ftp via SUNY Albany

The file is available via anonymous ftp from ftp.cs.albany.edu
(128.204.2.32) as the file tr91-5.ps.Z in the pub directory. It is a
compressed postscript file. Below is the log of a typical ftp session
to retrieve the file:

yourprompt> ftp ftp.cs.albany.edu
Connected to karp.albany.edu.
220 karp.albany.edu FTP server (SunOS 4.1) ready.
Name (ftp.cs.albany.edu:you): anonymous
331 Guest login ok, send ident as password.
Password:
230 Guest login ok, access restrictions apply.
ftp> cd pub
250 CWD command successful.
ftp> binary
200 Type set to I.
ftp> get tr91-5.ps.Z
200 PORT command successful.
150 Binary data connection for tr91-5.ps.Z (128.204.2.36,2116) (107077 bytes).
226 Binary Transfer complete.
local: tr91-5.ps.Z remote: tr91-5.ps.Z
107077 bytes received in 1 seconds (1e+02 Kbytes/s)
ftp> quit
221 Goodbye.
yourprompt>

Then uncompress the file and print it in your local fashion for
printing postscript.


Third Way: SUNY Albany Computer Science Department Technical Reports
Secretary.

A copy of the paper may be requested by writing:

Technical Reports Secretary
Computer Science Department, LI-67A
State University of New York at Albany
Albany, New York 12222
USA

and requesting a copy of Technical Report TR 91-5 ("Learning Recursive
Phrase structure: Combining the Strengths of PDP and X-Bar Syntax" by
George Berg). As I do not wish to make an enemy of the technical reports
secretary, please only request a copy if you are unable to get one by
ftp.

- ------------------------------------------------------------------------------
| George Berg        | Computer Science Dept. | If you want wit in 15 words  |
| berg@cs.albany.edu | SUNY at Albany, LI 67A | or less, go check Bartlett's |
| (518) 442 4267     | Albany, NY 12222 USA   | quotations -- I'm busy.      |
- ------------------------------------------------------------------------------



------------------------------

End of Neuron Digest [Volume 7 Issue 42]
****************************************
