Neuron Digest	Thursday, 25 Jan 1990		Volume 6 : Issue 7 

Today's Topics:
Research assistant (Artificial Neural Nets)
Call for papers -- COLT '90
tech report available: GA/ANN
Technical report available
workshop on intelligent diagnostic and control sys. for manufacturing
Short Course at AU
Graduate course in ANNs by H. Szu
call for paper
Computational Metabolism on the Connection Machine and Other Stories...
Technical Report LA-UR-90-21
(New Tech. Report) From Simple Associations to Systematic Reasoning
Senior Electrical Engineer and Electrical Engineer
Preprint Available


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Research assistant (Artificial Neural Nets)
From: marwan@extro.ucc.su.oz.au (Marwan Jabri)
Organization: University Computing Service, Uni. of Sydney, Australia.
Date: 13 Jan 90 03:46:18 +0000

(This is a re-advertisement with additional info. If you have sent your
CV, then please read the "method of application" below and forward your
application as specified.)


Research Assistant

Reference No. 02/07

Microelectronic Implementations of Artificial Neural Networks

Systems Engineering and Design Automation Laboratory
Sydney University Electrical Engineering


Applications are invited from enthusiastic persons to work on the
development and implementation of analog/digital microelectronic building
blocks and simulation models for artificial neural networks. The project
aims at developing a multi-level simulation environment (mathematical,
structural and physical) for artificial neural networks.

Applicants should have an electrical engineering degree or equivalent.
The appointee may apply for enrollment towards a postgraduate degree
(part-time).

Preference will be given to applicants who have experience in artificial
neural networks, MOS analog or digital integrated circuit design or VLSI
computer aided design.

The appointment is for one year with prospect of renewal.

Salary range (according to qualifications)
Research Assistant Grade I ($22,585-$25,271)

For further information contact:

Dr M.A. Jabri,
Sydney University Electrical Engineering
NSW 2006 Australia
Tel: (+61-2) 692-2240
Fax: (+61-2) 692-3847
Email: marwan@ee.su.oz.au

Closing date: 25 January, 1990.

Method of application: Applications quoting reference no. 02/07,
including a curriculum vitae, list of publications and the names, addresses
and fax numbers of three referees should be sent to:

The Registrar
Staff Office
University of Sydney
NSW 2006
Australia

The university reserves the right not to proceed with any appointment for
financial or other reasons.

Equal Opportunity is University Policy
--
Marwan Jabri E-mail: marwan@ee.su.oz
Systems Engineering and Design Automation Laboratory Fax: (+61-2) 692 3847
Sydney University Electrical Engineering
NSW 2006 Australia

------------------------------

Subject: Call for papers -- COLT '90
From: fulk@cs.rochester.edu (Mark Fulk)
Organization: U of Rochester, CS Dept, Rochester, NY
Date: 13 Jan 90 19:23:23 +0000



CALL FOR PAPERS
COLT '90
Third Workshop on Computational Learning Theory
Rochester, NY
August 6 - August 8, 1990

The third workshop on Computational Learning Theory will be held in
Rochester, NY. The conference will be jointly sponsored by SIGACT and
SIGART, and is expected to be similar in style to the previous such
workshops held at MIT and UC/Santa Cruz. Registration at COLT '90 is
open.

It is expected that most papers will present rigorous formal analyses of
theoretical issues in Machine Learning. Possible topics include, but are
not limited to: resource and robustness analysis of learning algorithms,
general learnability and non-learnability results in new and existing
computational learning models, theoretical comparisons among learning
models, and papers that connect learning theory with work in robotics,
neural nets, pattern recognition and cryptography. R. Freivalds (Latvian
State University, Riga) has agreed to present an invited talk; the
program committee may schedule additional invited talks.

Authors should submit an extended abstract that consists of:

A) cover page with title, authors' names,
(postal and e-mail) addresses, and a 200-word summary.

B) body not longer than 10 pages in twelve-point font.

Be sure to include a clear definition of the model used, an overview of
the results, and some discussion of their significance, including
comparison to other work. Proofs or proof sketches should be included in
the technical section. Authors should send 10 copies of their abstract
to

John Case
COLT '90
Department of Computer and Information Sciences
103 Smith Hall
University of Delaware
Newark, DE 19716.

The deadline for receiving submissions is April 9, 1990. This deadline
is FIRM. Authors will be notified by May 22, final camera-ready papers
will be due June 18, and this deadline is ABSOLUTE. The proceedings will
be published by Morgan-Kaufmann. For further information about
submissions contact John Case (telephone: 302-451-2711, email:
case@cis.udel.edu).

Chair and local arrangements: Mark A. Fulk (U. Rochester).

Program committee:

J. Case (Delaware, chair),
D. Angluin (Yale),
E. Baum (NEC Research, Princeton)
S. Ben David (Technion, Israel),
M. Fulk (U. Rochester),
D. Haussler (UC Santa Cruz),
L. Pitt (U. Illinois),
R. Rivest (MIT),
C. Smith (Maryland),
S. Weinstein (U. Pennsylvania).

Note: papers that have appeared in journals or that are being submitted
to other conferences are not appropriate for submission to COLT with the
exception of papers submitted to the IEEE 30th Symposium on Foundations
of Computer Science (FOCS).

A joint submission policy coordinated with FOCS permits authors to send a
paper to both conferences; in the event that both conferences accept the
paper, it will be published in the FOCS proceedings, the authors will be
invited to give a talk at both conferences, and a short (one-page)
abstract will be printed in the COLT proceedings.

As the FOCS decisions may be quite late, authors of dual submissions will
be asked to send the abstract with their final copy, so as to allow the
publisher to substitute the abstract upon receiving word of the FOCS
decision.

It is, of course, required that authors notify both committees of the
dual submission by including a note in the cover letters.

------------------------------

Subject: tech report available: GA/ANN
From: rudnick@ogicse.ogc.edu (Mike Rudnick)
Organization: Oregon Graduate Institute (formerly OGC), Beaverton, OR
Date: 16 Jan 90 01:00:38 +0000

The following tech report/bibliography is available:


A Bibliography of the Intersection of
Genetic Search and Artificial Neural Networks

Mike Rudnick
Department of Computer Science and Engineering
Oregon Graduate Institute

Technical Report No. CS/E 90-001
January 1990

This is a fairly informal bibliography of work relating artificial neural
networks (ANNs) and genetic search. It is a collection of books, papers,
presentations, reports, and the like, which I've come across in the
course of pursuing my interest in using genetic search techniques for the
design of ANNs and in operating an electronic mailing list on GA/ANN.
The bibliography contains no references which I feel relate solely to
ANNs or GAs (genetic algorithms).


To receive a copy, simply request the report by name and number; send
email to kelly@cse.ogi.edu or smail to:

Kelly Atkinson
Department of Computer Science and Engineering
Oregon Graduate Institute
19600 NW Von Neumann Dr.
Beaverton, OR 97006-1999

Mike Rudnick CSnet: rudnick@cse.ogi.edu
Oregon Graduate Institute UUCP: {tektronix,verdix}!ogicse!rudnick
19600 N.W. von Neumann Dr. (503) 690-1121 X7390
Beaverton, OR. 97006-1999 OGI used to be OGC ... progress!

------------------------------

Subject: Technical report available
From: Mary Hare <hare@amos.ucsd.edu>
Date: Tue, 16 Jan 90 10:28:35 -0800



"The Role of Similarity in Hungarian Vowel Harmony:
A connectionist account"


Technical Report CRL-9004

Mary Hare
Department of Linguistics
&
Center for Research in Language

Over the last 10 years, the assimilation process referred to as vowel
harmony has served as a test case for a number of proposals in
phonological theory. Current autosegmental approaches successfully
capture the intuition that vowel harmony is a dynamic process involving
the interaction of a sequence of vowels; still, no theoretical analysis
has offered a non-stipulative account of the inconsistent behavior of the
so-called "transparent", or disharmonic, segments.

The current paper proposes a connectionist processing account of the
vowel harmony phenomenon, using data from Hungarian. The strength of
this account is that it demonstrates that the same general principle of
assimilation which underlies the behavior of the "harmonic" forms
accounts as well for the apparently exceptional "transparent" cases,
without stipulation.

The account proceeds in three steps. After presenting the data and
current theoretical analyses, the paper describes the model of sequential
processing introduced by Jordan (1986), and motivates this as a model of
assimilation processes in phonology. The paper then presents the results
of a series of parametric studies that were run with this model, using
arbitrary bit patterns as stimuli. These results establish certain
conditions on assimilation in a network of this type. Finally, these
findings are related to the Hungarian data, where the same conditions are
shown to predict the correct pattern of behavior for both the regular
harmonic and irregular transparent vowels.
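
For readers without Jordan (1986) at hand, here is a minimal NumPy sketch
of the general shape of a Jordan sequential network: a fixed "plan" input
drives a hidden layer, while state units feed a decaying trace of past
outputs back in, letting earlier elements of a sequence influence later
ones. The layer sizes, decay constant mu, and random weights are
illustrative assumptions, not the configuration used in the report.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 8, 12, 8      # hypothetical layer sizes
    mu = 0.5                              # state-unit decay (assumed)

    W_ih = rng.normal(0, 0.1, (n_hidden, n_in))    # plan input -> hidden
    W_sh = rng.normal(0, 0.1, (n_hidden, n_out))   # state units -> hidden
    W_ho = rng.normal(0, 0.1, (n_out, n_hidden))   # hidden -> output

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def run_sequence(plan, n_steps):
        # State units hold a decaying trace of earlier outputs, so each
        # output is conditioned on what came before in the sequence.
        state = np.zeros(n_out)
        outputs = []
        for _ in range(n_steps):
            hidden = sigmoid(W_ih @ plan + W_sh @ state)
            out = sigmoid(W_ho @ hidden)
            state = mu * state + out      # recurrent state update
            outputs.append(out)
        return outputs

    seq = run_sequence(rng.random(n_in), n_steps=4)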

----------------------------------------
Copies of this report may be obtained by sending an email request for
TR CRL-9004 to 'yvonne@amos.ucsd.edu', or surface mail to the Center for
Research in Language, C-008; University of California, San Diego;
La Jolla CA 92093.



------------------------------

Subject: workshop on intelligent diagnostic and control sys. for manufacturing
From: unccvax!billchu@mcnc.org (Tseng Bill Chu)
Organization: Univ. of NC at Charlotte, Charlotte, NC
Date: 18 Jan 90 22:50:10 +0000

[[ Editor's Note: This workshop is by invitation only. You must present
to attend. -PM ]]

Workshop on Intelligent Diagnostic and Control Systems for
Manufacturing

Time and Place: July 29 -30, 1990, Boston, MA

Program Committee

Bei-Tseng Bill Chu (Chair), Department of Computer Science, University of
North Carolina, Charlotte, NC 28223 (704-547-4568,
billchu@unccvax.uncc.edu).

Mary Emrich, Director of the Center of Intelligent Systems, Oak Ridge
National Laboratory, P.O.Box X, Oak Ridge, TN 37830

Gary Kahn, Carnegie Group Inc., Five PPG Place, Pittsburgh, PA 15222
(gsk%cgi.com@relay.cs.net).

Hossein Nivi, Ford Motor Company, Manufacturing Systems and Operations
Engineering Department, 24500 Glendale Ave. Detroit, MI 48239
(313-592-2356).

Emanuel Sachs, Department of Mechanical Engineering, MIT, Cambridge, MA
02139 (619-253-5381) sachs@cas.mit.edu.


Focus of the workshop

This workshop is intended to provide a forum for practitioners and
researchers to exchange ideas and report successful techniques to
intelligently diagnose and control manufacturing processes. The emphasis
of this workshop is on examining how AI techniques can complement
traditional techniques such as statistical process control, digital
control, optimization theory, and operations research in helping
manufacturing facilities. We are looking for participants working in the
following areas:

1. Developing functioning intelligent diagnostic/control systems in
manufacturing facilities.

2. Architectures and methodologies that specifically address issues of
intelligent diagnostic and control systems for manufacturing.

The workshop is by invitation only. In order to facilitate discussion,
the workshop will be limited to no more than 50 participants. All
extended abstracts will be reviewed by the program committee, and full
papers will be published in a proceedings volume.


Milestones:

March 26, 1990: three copies of an extended abstract (maximum four pages)
received by Bill Chu.

April 23, 1990: acceptance invitations mailed out.

May 18, 1990: full paper received by Bill Chu for workshop proceedings.

No exceptions can be granted for the above deadlines. Authors are
encouraged to include e-mail addresses with their submissions to speed up
communication.


Preliminary agenda

Day 1.
Morning:

Opening statements by Bill Chu (UNCC)

Paper presentations on functioning intelligent diagnostic and control systems
for manufacturing.

Poster session.

Lunch break

Afternoon:

Invited presentation by Emanuel Sachs (MIT)

Break.

Paper presentations on architectures for intelligent diagnostic and
control systems for manufacturing.


Day 2.
Morning:

Breakout into workgroups on topics of interest.

Lunch break.

Reports of workgroup discussions.

Panel discussion between program committee members
and participants.

Concluding remarks by Mary Emrich (ORNL)

------------------------------

Subject: Short Course at AU
From: Masud Cader <CADER%AUVM.BITNET@CORNELLC.cit.cornell.edu>
Date: Fri, 19 Jan 90 14:10:57 -0500

Applied Neural Networks Computing

March 12 - 14, 1990

Overview

Beginning with a brief introduction, this course analyzes various
existing neural network models in terms of their underlying principles
(namely nonlinearity, nonlocality, nonstationarity, and nonconvexity);
different types of neurons (fine-, medium-, and large-grained processing
elements); and the enriched temporal behaviors of synapses (bursting,
oscillating, graded, and delayed transmission). The fundamental learning
models are grouped according to known applications.

The course applies neural networks to practical problems, covering the
rule of divide and conquer, prototype design, and integration with
traditional technology components (e.g., rule-based artificial
intelligence and conventional optical character recognition as used in
bank-check reading and zip-code sorting). Different neural network
learning algorithms are matched with different input/output formats of
different solution spaces (e.g., efficient fixed-point learning by
hetero-associative memory in a matrix space and fine-tuning by various
constrained optimization algorithms in a linear vector space). Generic
neural networks are simulated using today's technology, with emphasis on
upward compatibility in the number of actual vs. virtual neurons per
layer and the number of physical vs. virtual layers. Reconfigurable nets
are explored for efficiency.

Participants are encouraged to bring their own problems to be formulated
in terms of neural networks. Evening computer laboratories are a unique
feature of this course, and provide participants with a hands-on
opportunity for discussion and consultation. Designated groups oriented
toward particular applications also meet during lunch to discuss mutual
problems and interests.

Lecturers

Harold Szu, Ph.D., Research Physicist, Washington, D.C., and Adjunct
Professor in the Department of Computer Science and Information Systems
at The American University. Dr. Szu's current research is on handwritten
character recognition and constrained optimization implementable on a
superconducting optical neural network computer, and on a sixth-generation
computer based on the confluence of neural network, optical switching,
and new high-temperature superconducting materials technologies. He holds
five patents and has published about 100 technical papers, plus two
books. Dr. Szu is on the governing board of the International Neural
Network Society, is Editor-in-Chief of the Journal of Neural Network
Computing, and is an editor for the journal Neural Networks.

Larry Medsker, Ph.D., Chair and Associate Professor, Department of
Computer Science and Information Systems at The American University,
Washington, D.C. Dr. Medsker's research is on expert systems, neural
networks, and the potential synergism of the two for practical
applications. Previously, Dr. Medsker was at Bell Laboratories, the New
Jersey Institute of Technology, and the Purdue School of Science. He has
published over 90 technical papers in computer science and physics. He
is on the editorial board of the Journal of Neural Network Computing and
is Guest Editor for a special issue of Expert Systems with Applications:
An International Journal.







Schedule of Events

Monday, March 12

Overview of Neural Networks
Definitions
Single-layer Hopfield and Carpenter/Grossberg models (ART)
Backward error propagation models
Multiple-layer associative memory model
Applications-image processing, pattern recognition
Optimization
Fast simulated annealing
Statistical methods (Cauchy, Boltzmann)

Tuesday, March 13

Neural Networks Theory
Mapping data to intelligent behavior
Ability to abstract and generalize
Application - Handwritten character recognition
Architecture Design
Learning architectures
Determining topology
Temporal behavior vs. spatial topology
Preprocessor design
Application driven designs
Reconfigurable networks

Wednesday, March 14

Neural Nets - Image Processing and Pattern Recognition
Multiple channel model from a neural network model
Construction of Liapunov function
Simulation strategies based on invariance
Feature extraction/filtering
Implementation examples
Developments in Optical Neurocomputing

Monday - Wednesday Evenings

Hands-on work in computer laboratories
Participants work in small groups on specific problems

General Information

Prerequisite
BS in engineering or science, or equivalent experience in or relating to
course topic.

Enrollment
Early enrollment is recommended. Enrollment in the course is limited.
Participants may enroll by telephone, by mail, or in person at the address
listed below for the University Programs Advisement Center.

The fee for the course includes lecture notes and computer time. It does not
include parking, lodging, or meals.

Questions
- For technical questions, call Dr. Szu at (202) 767-1493 or Dr.
Medsker at (202) 885-1470.
- For registration questions, call the University Programs Advisement
Center at (202) 885-2500.

To Register

By Phone- Call (202) 885-2500 with a valid VISA or MasterCard.

By Mail- Complete and mail the attached registration form with a check,
money order, or training authorization (made payable to The
American University), or valid VISA or MasterCard
information, to the address indicated below for the University Programs
Advisement Center.

In Person- Bring check, money order, training authorization (made payable
to The American University) or valid VISA or MasterCard
information to the address indicated below for the University
Programs Advisement Center.

Registration is complete only when full payment is verified or an authorized
training or purchase order has been received. Cash is not accepted. Payment
returned for insufficient funds will be subject to a $10 handling charge.

Registration Location

University Programs Advisement Center (202) 885-2500
McKinley Building, Room 153
The American University
4400 Massachusetts Avenue, N.W.
Washington, D.C. 20016

Registration Hours

Monday-Thursday, 9:00am to 8:30pm
Friday, 9:00am to 5:00pm
Saturday, 9:00am to 12:30pm

Confirmations and Location

Written confirmation of registration and the location of classes will be sent
upon registration. Questions on the status of the registration should be
referred to (202) 885-2500.

Cancellations

The American University reserves the right to cancel a scheduled course due
to low enrollment or other unavoidable reasons. Every effort will be made
to contact the participants registered for the course in advance of any
cancellations.

Withdrawals

Participants withdrawing prior to February 19, 1990 are entitled to a full
refund of fees paid. Any cancellations made after that date will be eligible
for a refund, less a $100 service charge.

Cost of the Course

The cost of the course is $750. A deposit of $100 is due by February 19,
1990. The balance of the cost, $650, will be due on the first morning of
the conference, Monday, March 12, 1990.


Registration Form

Name


Title


Business or Organization


Business Address


City State Zip


Business phone ( )


Names of additional attendees:

1. Title


2. Title


(please list other participants on a supplemental sheet)

__ My check or purchase order made out to The American University is
enclosed.

__ Please charge to my credit card ___ VISA ___ MasterCard

Card number Exp. date


Cardholder's name


Signature


Mail this form and payment to:
University Programs Advisement Center
The American University, McKinley Room 153
4400 Massachusetts Ave, N.W.
Washington, D.C. 20016
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

------------------------------

Subject: Graduate course in ANNs by H. Szu
From: masud cader <GJTAMUSC%AUVM2.BITNET@CORNELLC.cit.cornell.edu>
Date: Fri, 19 Jan 90 18:41:07 -0400

3 Credit graduate course in ANNs

During the spring semester, Dr. H. Szu will be conducting a graduate
course in Artificial Neural Networks (number 690). Students in the DC
area consortium of universities are invited to participate. Please
contact your consortium advisor or see the trailer for further info. If
time permits, I shall post a tentative syllabus soon. The first class
will be held on 01/22/90.


________________________________________________________________________________
Dept. of Computer Science & Information Systems
College of Arts & Sciences
Washington DC (202) 885-1470
cader@auvm.bitnet

------------------------------

Subject: call for paper
From: U1O@PSUVM.BITNET
Organization: Penn State University
Date: 23 Jan 90 19:04:14 +0000

*********************************************************************
******                                                          *****
******                     CALL FOR PAPERS                      *****
******           Journal of Intelligent Manufacturing           *****
******                    SPECIAL ISSUE ON                      *****
******            Neural Networks In Manufacturing              *****
******                                                          *****
*********************************************************************

Guest Editors
Professor Soundar Kumara, Department of Industrial and Management
Systems Engineering, The Pennsylvania State
University, University Park, PA 16802, U.S.A.
Professor Setsuo Ohsuga, Research Center for Advanced Science and
Technology, University of Tokyo, Tokyo, Japan
Professor Yung C. Shin, Department of Industrial and Management
Systems Engineering, The Pennsylvania State
University, University Park, PA 16802, U.S.A.

This special issue is dedicated to the enhancement of advanced
manufacturing research through the exploration of the applicability of
Artificial Neural Networks.
Topics of interest include, but are not limited to:

* Survey and Tutorial based articles on Neural Networks and Their
Applicability to Manufacturing
* Neural Nets and Associative Memory in Engineering Design
* Knowledge Representation Via Neural Networks
* Connectionist Architecture for Manufacturing Control
* Neural Networks in Robotics
* Example Applications

Authors are encouraged to submit four copies of completed manuscripts to

Professor Soundar R. T. Kumara
Department of Industrial and Management Systems Engineering
The Pennsylvania State University
207 Hammond Building
University Park, PA 16802, U.S.A

Manuscript Due Date: May 1, 1990

For any questions, feel free to contact Professor Soundar Kumara at
(814) 863-2359.

------------------------------

Subject: Computational Metabolism on the Connection Machine and Other Stories...
From: Marek Lugowski <marek@iuvax.cs.indiana.edu>
Date: Tue, 23 Jan 90 10:44:14 -0500

Indiana University Computer Science Departmental Colloquium

Computational Metabolism on a Connection Machine and Other Stories...
---------------------------------------------------------------------
Elisabeth M. Freeman, Eric T. Freeman & Marek W. Lugowski
graduate students, Computer Science Department
Indiana University

Wednesday, 31 January 1990, 7:30 p.m.
Ballantine Hall 228
Indiana University campus, Bloomington, Indiana


This is work in progress, to be shown at the Artificial Life Workshop II,
Santa Fe, February 5-9, 1990. The Connection Machine (CM) is a supercomputer
for massive parallelism. Computational Metabolism (ComMet) is such a
computation: a tiling in which tiles swap places with neighbors or change
their state upon noticing their neighbors. ComMet is a programmable digital
liquid.

Reference: M. Lugowski, "Computational Metabolism: Towards Biological
Geometries for Computing," in Artificial Life, C. Langton, ed., pp. 343-368,
Addison-Wesley, Reading, MA: 1989, ISBN 0-201-09356-1 (paperbound).

Emergent mosaics:
-----------------
This class of ComMet instances arises from generalizing the known
ComMet solution of Dijkstra's Dutch Flag problem. This has implications
for cryptology and noise-resistant data encodings. We observed deterministic
and nondeterministic behavior intertwined, apparently as a function of
geometry.
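
As a rough illustration of the flavor of such tilings, and not the
authors' actual rule set, the following NumPy sketch implements one toy
local rule: tiles carry one of three states and swap with a randomly
chosen right-hand neighbor whenever the pair is out of order. Repeated
purely local swaps pull each row into sorted color bands, a small-scale
cousin of the Dutch-Flag-style patterns mentioned above.

    import numpy as np

    rng = np.random.default_rng(1)
    grid = rng.integers(0, 3, size=(16, 16))  # 3 tile states, 16x16 tiling

    def step(grid):
        # One update: many random adjacent pairs compare states and swap
        # when out of order; each tile sees only one neighbor at a time.
        n, m = grid.shape
        for _ in range(n * m):
            i, j = rng.integers(0, n), rng.integers(0, m - 1)
            if grid[i, j] > grid[i, j + 1]:
                grid[i, j], grid[i, j + 1] = grid[i, j + 1], grid[i, j]
        return grid

    for _ in range(200):
        step(grid)
    # Rows now tend toward sorted color bands: a global pattern arising
    # from purely local compare-and-swap interactions.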

A preliminary computational theory of metaphor:
-----------------------------------------------
We are working on a theory of metaphor as transformations within ComMet.
Metaphor is loosely defined as expressing one entity in terms of another,
and so it must underlie categorization and perception. We postulate that
well-defined elementary events capable of spawning an emergent computation
are needed to encode the process of metaphor. We use ComMet to effect this.

A generalization of Prisoner's Dilemma (PD) for computational ethics:
---------------------------------------------------------------------
The emergence of cooperation in iterated PD interactions is well known.
We propose a further generalization of PD into a communication between two
potentially complex agents that are not necessarily aware of each other.
These agents are expressed as initial configurations of ComMet, spatially
arranged to allow communication through tile propagation and tile state
change.

Connection Machine (CM) implementation:
---------------------------------------
We will show a video animation of our results, obtained on a 16k-processor
CM, including the emergent mosaics, thus confirmed after we had predicted
them theoretically. Our CM program computes in 3 minutes what took 7 days on
a Lisp Machine. Our output is a 128x128 color pixel map. Our code will
run in virtual mode, if need be, with up to 32 ComMet tiles per CM processor,
yielding a 2M-tile tiling (over 2 million tiles) on a 64k-processor CM.

------------------------------

Subject: Technical Report LA-UR-90-21
From: rdj%kite@LANL.GOV (Roger D. Jones)
Date: Tue, 23 Jan 90 12:09:34 -0700

The following technical report is available upon request to
rdj@lanl.gov or Roger D. Jones, MS-E531, Los Alamos National
Laboratory, Los Alamos, New Mexico 87545.

FUNCTION APPROXIMATION AND TIME SERIES PREDICTION
WITH NEURAL NETWORKS

by

R. D. Jones, Y. C. Lee, C. W. Barnes, G. W. Flake,
K. Lee, P. S. Lewis and S. Qian

ABSTRACT
Neural nets are examined in the context of function approximation and the
related field of time series prediction. A natural extension of radial basis
function nets is introduced. It is found that the use of adaptable gradients
and normalized basis functions can significantly reduce the amount of data
necessary to train the net while maintaining the speed advantage of a net that
is linear in the weights. The local nature of the network permits the use of
simple learning algorithms with short memories of earlier training data. In
particular, it is shown that a one-dimensional Newton's method is quite fast
and reasonably accurate.
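
As a rough illustration of why normalization plus linearity in the
weights makes training cheap, the following NumPy sketch fits a toy
function with a normalized Gaussian basis and a single least-squares
solve. The centers, width, and target function are illustrative
assumptions; the report's extension (adaptable gradients) is not
implemented here.

    import numpy as np

    rng = np.random.default_rng(0)
    centers = np.linspace(-1, 1, 10)   # basis-function centers (assumed)
    width = 0.3                        # common width parameter (assumed)

    def phi(x):
        # Gaussian basis activations, normalized to sum to one per input.
        g = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))
        return g / g.sum(axis=1, keepdims=True)

    # Toy problem: approximate sin(pi x) from 50 noise-free samples.
    x = rng.uniform(-1, 1, 50)
    y = np.sin(np.pi * x)

    # The net is linear in the weights, so one least-squares solve trains
    # it: the "speed advantage" the abstract refers to.
    w, *_ = np.linalg.lstsq(phi(x), y, rcond=None)

    x_test = np.linspace(-1, 1, 5)
    print(phi(x_test) @ w)             # approximately sin(pi * x_test)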

------------------------------

Subject: (New Tech. Report) From Simple Associations to Systematic Reasoning
From: shastri@central.cis.upenn.edu
Date: Tue, 23 Jan 90 09:10:20 -0500


The following report may be of interest to some of you. Please direct e-mail
requests to: dawn@central.cis.upenn.edu


From Simple Associations to Systematic Reasoning:

A connectionist representation of rules, variables and dynamic bindings


Lokendra Shastri and Venkat Ajjanagadde
Computer and Information Science Department
University of Pennsylvania
Philadelphia, PA 19104

December 1989

Human agents draw a variety of inferences effortlessly, spontaneously,
and with remarkable efficiency --- as though these inferences are a
reflex response of their cognitive apparatus. The work presented in
this paper is a step toward a computational account of this remarkable
reasoning ability. We describe how a connectionist system made up of
simple and slow neuron-like elements can encode millions of facts and
rules involving n-ary predicates and variables, and yet perform a
variety of inferences within hundreds of milliseconds. We observe that
an efficient reasoning system must represent and propagate, dynamically,
a large number of variable bindings. The proposed system does so by
propagating rhythmic patterns of activity wherein dynamic bindings are
represented as the in-phase, i.e., synchronous, firing of appropriate
nodes. The mechanisms for representing and propagating dynamic bindings
are biologically plausible. Neurophysiological evidence suggests that
similar mechanisms may in fact be used by the brain to represent and
process sensorimotor information.
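
To make the binding idea concrete, here is a toy rendering in Python of
binding-by-synchrony: the oscillation cycle is divided into a few
discrete phases, each entity fires in its own phase, and a role
(variable) is bound to an entity by firing in that entity's phase, so a
rule can propagate bindings simply by copying phases. The predicate,
entities, and phase count are invented for illustration and are not the
paper's network.

    # Phases available per oscillation cycle (an assumption).
    n_phases = 4

    # Each entity fires in its own phase slot.
    entity_phase = {"John": 0, "book": 1, "Mary": 2}

    # give(giver, object, recipient): roles are bound to entities by
    # firing in the entity's phase.
    give_bindings = {"giver": "John", "object": "book", "recipient": "Mary"}
    role_phase = {r: entity_phase[e] for r, e in give_bindings.items()}

    # Applying give(x, y, z) -> own(z, y) propagates bindings by copying
    # phases: 'owner' inherits the recipient's phase, 'owned' the object's.
    own_phase = {"owner": role_phase["recipient"],
                 "owned": role_phase["object"]}

    for t in range(n_phases):
        firing = [e for e, p in entity_phase.items() if p == t]
        firing += [r for r, p in role_phase.items() if p == t]
        firing += [r for r, p in own_phase.items() if p == t]
        print(f"phase {t}: {firing}")   # in-phase nodes share one binding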


------------------------------

Subject: Senior Electrical Engineer and Electrical Engineer
From: munnari.oz.au!cluster!metro!new (Marwan Jabri)
Organization: University Computing Service, Uni. of Sydney, Australia.
Date: 25 Jan 90 11:54:57 +0000


Sydney University Electrical Engineering



Senior Electrical Engineer
and
Electrical Engineer


Microelectronic Implementation of Neural
Networks based Devices for the Analysis and
Classification of Medical Signals


Applications are invited from enthusiastic persons to work on an
advanced neural network application project in the medical area.
The project is being funded jointly by the Australian Government
and a high-technology manufacturer of medical products.

The project involves the research and development of different network
architectures to be implemented on ASICs. The chips are to be used for the
analysis and classification of medical signals.

Senior Engineer:
----------------
Applicants for the Senior Engineer position should have an
electrical engineering degree or equivalent, and either a PhD
degree or a minimum of five years' experience in a related
field.

Engineer:
---------
Applicants for the Engineer position should have an
electrical engineering degree or equivalent and a minimum of three
years' experience in a related field.


The appointees may apply for enrollment towards a postgraduate degree
(part-time).

Preference will be given to applicants who have experience in artificial
neural networks, MOS analog or digital integrated circuit design.

The appointment is initially for one year with the possibility of renewal.
Salary range (according to qualifications): Senior Engineer, $36,000 p.a.;
Engineer, $30,000 p.a.

Method of application:
----------------------
Applications including a curriculum vitae, list of publications and the
names, addresses and fax numbers of three referees should be sent to:

Dr M.A. Jabri,
Sydney University Electrical Engineering
J03
NSW 2006 Australia
Tel: (+61-2) 692-2240
Fax: (+61-2) 692-3847
Email: marwan@ee.su.oz.au

Further information may be obtained from the above address.

The University reserves the right not to proceed with any appointment for
financial or other reasons.

Equal Opportunity is University Policy.

Marwan Jabri E-mail: marwan@ee.su.oz
Systems Engineering and Design Automation Laboratory Fax: (+61-2) 692 3847
Sydney University Electrical Engineering
NSW 2006 Australia

------------------------------

Subject: Preprint Available
From: russella@garnet.berkeley.edu
Date: Thu, 25 Jan 90 15:34:48 -0800

AN ALTERNATIVE TO BACK-PROPAGATION:
A SIMPLE RULE OF SYNAPTIC MODIFICATION FOR NEURAL NET
TRAINING AND MEMORY.


by Hans J. Bremermann and Russell W. Anderson

Division of Biophysics, Department of Molecular and Cell Biology
and Department of Mathematics
University of California, Berkeley 94720
and
Graduate Group in Bioengineering University of California
Berkeley 94720 and San Francisco 94143.

Report Number: PAM-483 (Center for Pure and Applied Mathematics)


ABSTRACT.

Back-propagation is widely used in computer simulations of neural nets,
but has been criticized as neurobiologically implausible because of the
complexity of its weight adjustment rules. We propose algorithms for
neural net training which suggest a simple, neurobiologically feasible
rule for adjusting synaptic weights and/or connections between groups of
neurons for memorization and training. Efficiency is compared on
selected test problems with that of the back-propagation algorithm. Our
algorithm does not require modification for novel activation functions
(such as step functions or conjunctive inputs) or for dynamical
('recurrent') networks; in this sense, it is a general learning rule.
Our algorithm also suggests a way in which cortical maps can be
established, maintained, and expanded.
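
The abstract does not spell the rule out, so the sketch below is
emphatically not the authors' algorithm; it is a generic random
weight-perturbation rule in the same spirit, showing why such schemes
are considered neurobiologically simpler than back-propagation: only the
scalar error is consulted, never its gradient, so even a step activation
(which has no usable derivative) poses no problem.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([0, 1, 1, 1], float)       # toy OR problem

    def step(v):
        # Step activation: no usable derivative, so back-propagation
        # would need modification, but a perturbation rule does not.
        return (v > 0).astype(float)

    w, b = rng.normal(0, 0.1, 2), 0.0

    def error(w, b):
        return np.mean((step(X @ w + b) - y) ** 2)

    sigma = 0.1                             # perturbation scale (assumed)
    for _ in range(500):
        dw, db = rng.normal(0, sigma, 2), rng.normal(0, sigma)
        # Keep a random perturbation only if the error does not increase.
        if error(w + dw, b + db) <= error(w, b):
            w, b = w + dw, b + db

    print(step(X @ w + b))   # typically matches y on this separable problem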

For copies of this paper, contact:
Russell Anderson
Dept. MCB-BCP
108A Donner Lab
U.C.Berkeley
Berkeley, CA 94720
russella@garnet.berkeley.edu

------------------------------

End of Neuron Digest [Volume 6 Issue 7]
***************************************
