Neuron Digest	Thursday,  5 Jan 1989		Volume 5 : Issue 2 

Today's Topics:
ALVINN: An Autonomous Land Vehicle in a Neural Network - Dean Pomerleau
SBIR on parallel processing
GRADSIM network simulator available
Binary Back Prop question
Position available - ANNs and Signal Processing
rewiring eyes to ears
rewiring eyes to ears
A better back propagation activation function
Learning Rates and Neural Plasticity
neural nets for Star Wars
Connectionist simulator - full.c


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"

------------------------------------------------------------

Subject: ALVINN: An Autonomous Land Vehicle in a Neural Network
From: MVILAIN@G.BBN.COM (Marc Vilain)
Organization: The Internet
Date: 06 Dec 88 18:38:16 +0000


[[ Editor's Note: I found this *after* the talk had been given. It's an
intriguing application. What else is being done in this area? -PM ]]

BBN Science Development Program
AI Seminar Series Lecture

ALVINN: AN AUTONOMOUS LAND VEHICLE IN A NEURAL NETWORK

Dean Pomerleau
Carnegie-Mellon University
(Dean.Pomerleau@F.GP.CS.CMU.EDU)

BBN Labs
10 Moulton Street
2nd floor large conference room
10:30 am, Tuesday December 13


In this talk I will describe my current research on autonomous navigation
using neural networks.

ALVINN (Autonomous Land Vehicle In a Neural Network) is a 3-layer
back-propagation network designed for the navigational task of road
following. Currently ALVINN is designed to take images from a camera and a
laser range finder as input and produce as output the direction the vehicle
should travel in order to follow the road.
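
[[ Editor's Note: For readers unfamiliar with the setup, here is a rough C
sketch of the kind of three-layer forward pass such a network performs. The
layer sizes and the winner-take-all steering output are illustrative
assumptions of mine, not the actual ALVINN architecture. -PM ]]

#include <math.h>

#define N_IN  64  /* flattened camera + range-finder pixels (assumed size) */
#define N_HID 16  /* hidden units (assumed size) */
#define N_OUT  9  /* candidate steering directions (assumed coding) */

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

/* One forward pass; returns the index of the most active output unit,
   read off as the direction the vehicle should steer. */
int steer(const double in[N_IN],
          const double w1[N_HID][N_IN],  const double b1[N_HID],
          const double w2[N_OUT][N_HID], const double b2[N_OUT])
{
    double hid[N_HID];
    double best_val = -1.0;
    int i, j, best = 0;

    for (j = 0; j < N_HID; j++) {                 /* input -> hidden */
        double s = b1[j];
        for (i = 0; i < N_IN; i++)
            s += w1[j][i] * in[i];
        hid[j] = sigmoid(s);
    }
    for (j = 0; j < N_OUT; j++) {                 /* hidden -> output */
        double s = b2[j];
        for (i = 0; i < N_HID; i++)
            s += w2[j][i] * hid[i];
        s = sigmoid(s);
        if (s > best_val) { best_val = s; best = j; }
    }
    return best;
}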

Training has been conducted using simulated roads. Recent successful tests
on the Carnegie-Mellon NAVLAB, a vehicle designed for autonomous land
vehicle research, indicate that the network can be quite effective at road
following under certain field conditions. I will be showing a videotape of
the network controlling the vehicle and presenting current directions and
extensions I hope to make to this research.

------------------------------

Subject: SBIR on parallel processing
From: ohare@itd.nrl.navy.mil (John O'Hare)
Date: Wed, 14 Dec 88 09:03:04 -0500

[[ Editor's note: The close date is embarrassingly close. I'll try to rush
deadline articles ahead of others in the future. However, are proposal
deadlines usually so quick (3 weeks)? -PM ]]

1. Researchers in small businesses (fewer than 500 people) might be
interested in participating in a research program on acoustic classification
with parallel-processing networks. Awards are $50K for a 6-month definition
phase and, in a later competition, up to $250K for each of two years in the
work phase. Close date is 6 Jan 89.

2. The topic is #N89-003 (pg. 87) in the DoD program solicitation entitled
FY-89 Small Business Innovation Research (SBIR) Program. The general contact
is: Mr. Bob Wrenn, SBIR Coordinator, OSD/SADBU, US Dept of Defense,
Pentagon, Rm. 2A340, Washington, DC 20301-3061. Phone: (202) 697-1481.

------------------------------

Subject: GRADSIM network simulator available
From: watrous@linc.cis.upenn.edu (Raymond Watrous)
Date: Thu, 15 Dec 88 21:25:26 -0500

*************************************************************************
*                                                                       *
*                               GRADSIM:                                *
*                                                                       *
*               CONNECTIONIST NETWORK OPTIMIZATION PACKAGE              *
*                                  FOR                                  *
*                           TEMPORAL FLOW MODEL                         *
*                                                                       *
*************************************************************************
Version 1.6 of the GRADSIM Connectionist Network Simulator is being
released. The Simulator was specifically designed for experiments with the
temporal flow model, which is characterised by delay links and unrestricted
network connectivity. The simulator accepts network descriptors and
experiment descriptors in a very simple format. The simulator efficiently
computes the complete gradient of recurrent networks and can be configured
for several gradient descent methods.

The simulator was written to handle speech data in a particular format and
must be modified for other data formats. The code is in C, and is modular,
so the required changes should be fairly localized. This simulator is not
recommended for Boolean problems, since it is based on a parameterized
target function which is oriented toward sampled continuous processes.

The simulator is not supported; no one at Penn is available to answer questions
about modifications or applications. I would be interested to know of its use,
and may be able to answer simple questions on a time-available basis.


Distribution:

GRADSIM is being distributed "as is" through the following channels:

Anonymous ftp:

Host: linc.cis.upenn.edu

Directory: ~ftp/pub

File: gradsim.tar.Z

Login as ftp or anonymous; use your name as password. The file is a
COMPRESSED Unix tar archive consisting of about twenty files and a brief
explanatory note called "DISTR".
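
[[ Editor's Note: For those new to anonymous ftp, a typical session might
look like the following; your client's prompts may differ. -PM ]]

% ftp linc.cis.upenn.edu
Name: anonymous
Password: <your name>
ftp> binary
ftp> cd pub
ftp> get gradsim.tar.Z
ftp> quit
% uncompress gradsim.tar.Z
% tar xf gradsim.tar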

Mag tape:

One-half inch, 9-track Unix tar format, 1600 bpi; available for a $150
distribution fee from:

Technical Report Facility
Room 269/Moore Building
Department of Computer Science
University of Pennsylvania
200 South 33rd Street
Philadelphia, PA 19104


DOCUMENTATION:

The simulator is described in the University of Pennsylvania
Technical Report:

GRADSIM: A Connectionist Network Simulator
Using
Gradient Optimization Techniques

MS-CIS-88-16

by Raymond L. Watrous March, 1988

The tech report is available from the Technical Report Facility at the
address above. The postage-paid cost for the report is $3.13. The report is
included at no charge with mag tape orders.


------------------------------

Subject: Binary Back Prop question
From: u-jmolse%sunset.utah.edu@wasatch.UUCP (John M. Olsen)
Organization: University of Utah, Computer Science Dept.
Date: 15 Dec 88 18:04:53 +0000


I'm designing some software, and would like to know if this sort of thing
has been done before. I'm using a 64 x 64 array of binary inputs, about 5
levels of nodes (each 64 x 64), and an output layer of the same size. Each
node has 9 inputs, each with a bias that either passes or inverts the
binary value of the source node, giving sums in the set {-9, -7, -5, -3,
-1, 1, 3, 5, 7, 9}; positive sums generate a value of 1, and negative sums
generate zero.
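
[[ Editor's Note: As I read the description, each node computes something
like the following C fragment, where the pass/invert flags play the role of
weights; this is my paraphrase, not Mr. Olsen's code. -PM ]]

/* One node: 9 binary inputs, each passed or inverted by a fixed flag;
   {0,1} maps to {-1,+1} before summing.  The fan-in is odd, so the
   sum can never be zero. */
int binary_node(const int in[9], const int invert[9])
{
    int sum = 0, k;
    for (k = 0; k < 9; k++) {
        int v = invert[k] ? !in[k] : in[k];
        sum += v ? 1 : -1;
    }
    return sum > 0;    /* positive sum -> 1, negative -> 0 */
}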

1. Is this a brain-dead way of doing things?
2. Will it be good for anything? I was thinking in terms of image filters.

The reason I want to do this is that once it's out of learn mode, I will
probably be able to process about 50,000 to 150,000 of these binary nodes
per second on my home PC (Amiga) by using one of its custom chips.

John M. Olsen, 1547 Jamestown Drive, Salt Lake City, UT 84121-2051
u-jmolse%ug@cs.utah.edu or ...!utah-cs!utah-ug!u-jmolse
"A full mailbox is a happy mailbox"

------------------------------

Subject: Position available - ANNs and Signal Processing
From: kruschke@cogsci.berkeley.edu (John Kruschke)
Date: Fri, 16 Dec 88 18:32:24 -0800


SENIOR RESEARCH ENGINEER -- signal processing, pattern recognition,
NEURAL NETWORKS.

Working on a new program as the lead R&D Algorithm Engineer, you will have
the opportunity to shape the future technological directions of our
Advanced Development Group. Your initial assignments will include
developing and applying NEURAL NETWORK algorithms for pre-classification
and pattern recognition of image and speech data.

You should demonstrate an in-depth theoretical understanding of NEURAL
NETWORK learning and of pattern recognition algorithm development,
including feature extraction, segmentation, stochastic processes, temporal
data analysis, pre-classification and clustering analysis. The position
also requires a PhD/EE, or an MS/BSEE with equivalent experience, including
a minimum of 6 years' research in pattern recognition. Recent work in
NEURAL NETWORKS is essential. Experience in vision, image or speech
recognition is very desirable.

To apply, please send your resume in confidence to

Ford Aerospace, Command and Control Group
Western Development Laboratories Division
Pat Fitzgerald
Dept. PF-SJ1218
220 Henry Ford II Drive
San Jose, CA 95134

An equal opportunity employer. U.S. citizenship may be required. Tell them
you heard it from Bernard Hodes, Palo Alto CA, via electronic mail.

[[ Editor's Note: Or better yet, tell them you read it in Neuron Digest! -PM ]]

------------------------------

Subject: rewiring eyes to ears
From: Dave.Touretzky@B.GP.CS.CMU.EDU
Date: Fri, 16 Dec 88 19:28:49 -0500


SCIENCE NEWS: 12/10/88, p. 374

FERRETS, LOOKING LOUDLY, HEAR THE LIGHT

In a series of unusual experiments, scientists have
rewired the brains of newborn ferrets so the animals,
in a sense, hear things they would normally see. The
research provides the strongest confirmation yet for a
theory of brain function that deems the visual,
auditory and other "higher" parts of the brain as
fundamentally alike in computational function--
resembling, at least in early stages of development,
interchangeable parts.

Moreover, the research supports the notion that these
higher, or cortical, parts of the brain "learn" how to
perform many of their sensory or motor functions from
early cues in the environment. While that theory is
not new, the experiments appear to underline the
importance of sensory experiences before birth and
during infancy in determining an individual's ability
to process information later in life.

Mriganka Sur and his co-workers at the Massachusetts
Institute of Technology in Cambridge rerouted retinal
neurons--which normally send sensory data from the eyes
to the visual cortex in the brain--in 16 ferrets so
that the data went instead to the animals' auditory
cortex. Cortical areas process raw bits of data into
more useful "patterns" of information. The researchers
studied the response patterns of cells in the auditory
cortex while showing the ferrets various visual cues.

"The basic issue is: Does all cortex perform basically
the same operation, and do the different outcomes only
depend on putting different inputs in?"
says Jon Kaas,
an experimental psychologist at Vanderbilt University
in Nashville, Tennessee. "Functionally, each area of
the cortex is doing something quite different. But is
each area somehow doing the same sort of calculations
with whatever input it gets?"


The answer appears to be yes, the MIT researchers
report in the Dec. 9 SCIENCE. They found that some
cells in the auditory cortex "transform" raw data into
"oriented rectangular receptor fields"--a type of
patterned response to stimuli that has until now been
clearly identified only in the visual cortex.

The finding is somewhat surprising, Sur and others say,
since auditory information processing--which includes
calculations of frequency changes and phase shifts to
locate sound in space--seems in some respects quite
different from the operations required to sense visual
patterns. So while the finding supports the theory
that all cortical tissue organizes information
similarly, Sur says it also suggests that whatever
detailed differences may exist among auditory, visual
and other cortical operations are "learned"
differences--the result of specific neural wiring
patterns somehow programmed by early sensory inputs.

"This means there is nothing intrinsic about the
auditory cortex that makes it auditory,"
Sur says. "It
depends on what kind of input it gets"
early in life.
The finding, he adds, could help explain the enormous
capacity of the young brain for recovery of function
(SCIENCE NEWS: 4/30/88, p. 280). "So if early in life
there are...lesions in some part of the brain, other
parts of the brain have the capacity to sort of chip in
or help in the recovery of function."


Moreover, Kaas says, the research has potential
significance for learning theory. "As we understand the
role of the environment in the developing nervous
system, we'll understand how to modify [prenatal and
early childhood experiences] in ways that are
desirable, or perhaps more importantly to prevent
stimuli that are undesirable."
-R. Weiss

# # #

------------------------------

Subject: rewiring eyes to ears
From: aboulang@WILMA.BBN.COM
Date: Sat, 17 Dec 88 13:53:28 -0500

The full Science reference is:

"Experimentally Induced Visual Projections into Auditory Thalamus and Cortex"
Sur, Mriganka, Preston E. Garraghty, & Anna W. Roe
Science, 9 Dec. 88, Vol 242, 1437-1441

Albert Boulanger
BBN Systems & Technologies Corp.
aboulanger@bbn.com

------------------------------

Subject: A better back propagation activation function
From: Donald Tveter <ucbvax!uwvax!chinet.chi.il.us!drt>
Date: Sun, 18 Dec 88 08:17:43 -0600

Someone recently asked if there is a better back propagation activation
function than 1 / (1 + exp(-x)). Yes there is. You can use a piece-wise
linear approximation to this function. That is, a straight line from x = 0
to x = 1, another line from x = 1 to x = 2, another from 2 to 3, another
from 3 to 5. Above 5, let the function be 1. Similarly for negative x.
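
[[ Editor's Note: Here is one such approximation in C, interpolating the
sigmoid at x = 0, 1, 2, 3 and 5; the breakpoint values below are the
sigmoid's own, rounded to three places, so Mr. Tveter's actual table may
differ. -PM ]]

/* Piecewise-linear stand-in for 1 / (1 + exp(-x)). */
double pl_act(double x)
{
    int neg = (x < 0.0);
    double y;
    if (neg) x = -x;              /* use the symmetry f(-x) = 1 - f(x) */
    if      (x <= 1.0) y = 0.500 + x * 0.231;
    else if (x <= 2.0) y = 0.731 + (x - 1.0) * 0.150;
    else if (x <= 3.0) y = 0.881 + (x - 2.0) * 0.072;
    else if (x <= 5.0) y = 0.953 + (x - 3.0) * 0.0235;
    else               y = 1.0;
    return neg ? 1.0 - y : y;
}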

With 64 bit real weights this function requires a few percent more
iterations to solve a problem, but then each iteration is much faster.
Better still, with this activation function you can use all integer
arithmetic throughout the program. It will work with 16 bit integer
weights. Rounding when calculating weight changes seems to be important. I
use this function and update the weights immediately. I recently tried to
use it with the version of back propagation where you only update the
weights after the whole set of patterns has been presented and it didn't
seem to work very well, although I must admit I did not spend an adequate
amount of time looking for bugs in this version.
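
[[ Editor's Note: The integer version might look something like this. The
1/1024 fixed-point scale is my own assumption, chosen only so that weights
fit in 16 bits; note the round-half-away-from-zero step, per the remark
about rounding. -PM ]]

typedef short weight_t;    /* 16-bit weight, in units of 1/SCALE */
#define SCALE 1024L        /* assumed fixed-point scale */

/* Apply one weight change.  Here grad is learning_rate * error_delta *
   activation, computed in integer arithmetic and carrying one extra
   factor of SCALE; it is divided back out with rounding, not truncation. */
void update_weight(weight_t *w, long grad)
{
    long dw = (grad >= 0) ? (grad + SCALE / 2) / SCALE
                          : -((-grad + SCALE / 2) / SCALE);
    *w += (weight_t)dw;
}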

Also, on the subject of learning arbitrary real-valued functions: I, too,
have had trouble getting back propagation to learn sin(x). However, it is
quite easy if you use thermometer code. For instance, let 6.28 be
represented by the 40 values:

1111111111 1111111111 1111111111 1111111111

and then 3.14 can be:

1111111111 1111111111 0000000000 0000000000

A two-layer network with one output unit is easy to train. Also, you can
train a two-layer network to go from a single real input to thermometer
code. The two networks can be pieced together and trained some more.

Also, larger numbers can be coded in a decimal form, using a ten's place and
a one's place. For instance, 46 can be coded as:

1111000000 1111110000
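
[[ Editor's Note: Both codings reduce to one small routine in C; the
rounding rule is my guess at the obvious one. -PM ]]

#include <stdio.h>

/* Set the first round(x / max * n) of the n bits, clear the rest. */
void thermometer(double x, double max, int bits[], int n)
{
    int on = (int)(x / max * n + 0.5), k;
    for (k = 0; k < n; k++) bits[k] = (k < on);
}

int main(void)
{
    int code[40], tens[10], ones[10], k;

    thermometer(3.14, 6.28, code, 40);      /* 20 ones, then 20 zeros */
    for (k = 0; k < 40; k++) putchar('0' + code[k]);
    putchar('\n');

    thermometer(46 / 10, 10.0, tens, 10);   /* decimal form of 46:    */
    thermometer(46 % 10, 10.0, ones, 10);   /* 1111000000 1111110000  */
    for (k = 0; k < 10; k++) putchar('0' + tens[k]);
    putchar(' ');
    for (k = 0; k < 10; k++) putchar('0' + ones[k]);
    putchar('\n');
    return 0;
}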

Donald R. Tveter
5228 N. Nashville Ave.
Chicago, Illinois 60656

uucp: {mcdchg,nucsrl}!chinet!drt

------------------------------

Subject: Learning Rates and Neural Plasticity
From: "Walter L. Peterson" <wlp@calmasd.GE.COM>
Organization: Calma Co. & West Coast Univ.
Date: 19 Dec 88 23:42:08 +0000


I am looking for references to recent work on learning rates and neural
plasticity in both artificial and biological neural networks, for research
that I am doing for my master's thesis. The most recent works I have that
address learning rates at all are in the proceedings of the 1988
Connectionist Models Summer School and some recent work on neural
plasticity in the visual cortex that was reported in last month's
Scientific American.

Can anyone "out there" point me toward anything else as recent or more so?
(I have access to excellent library facilities, so don't worry if it is a
bit obscure.)

Also, if anyone out there is or has recently been working in this area, I
would like to hear from you (any contributions will, of course, be
acknowledged as references in the paper).

Thanks in advance,

Walter L. Peterson

------------------------------

Subject: neural nets for Star Wars
From: Dave.Touretzky@B.GP.CS.CMU.EDU
Date: Mon, 19 Dec 88 22:21:45 -0500

From the December 1988 issue of Communications of the ACM, p. 1369:

SDIO EYES NEURAL NETS

A two-year, $389,000 contract to research the uses of neural computers to
detect nuclear warheads in space has been awarded by the SDIO to
Hecht-Nielsen Neurocomputer Corp., San Diego. A company spokesman says the
task will require a level of computational power in the neighborhood of a
billion arithmetic operations per second. The contract follows a DARPA
study that calls for an eight-year, $400 million research effort to develop
neural computers for military scanning purposes.

If only Frank Rosenblatt were alive today...

------------------------------

Subject: Connectionist simulator - full.c
From: Barak.Pearlmutter@F.GP.CS.CMU.EDU
Date: 18 Dec 88 15:27:00 -0500

The way to retrieve the "full" temporally continuous fully connected
recurrent network simulator has changed slightly. Ftp from host
DOGHEN.BOLTZ.CS.CMU.EDU (128.2.222.37), user "anonymous", any password, use
binary mode, and get the file "full/full.tar.Z". The CD command should work
now.

A short paper describing the network model, titled "Learning State Space
Trajectories in Recurrent Neural Networks," can be found in "Proceedings of
the 1988 Connectionist Models Summer School," published by Morgan Kaufmann.

------------------------------

End of Neuron Digest
*********************
