Neuron Digest   Monday, 30 Oct 1989                Volume 5 : Issue 43 

Today's Topics:
INNC-90-PARIS
DOD Small Business Innovation Research Program
Job Posting
position offered
Job Announcement
Neural Networks for Industry: A two-day tutorial


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: INNC-90-PARIS
From: Francoise Fogelman <ff%FRLRI61.BITNET@CUNYVM.CUNY.EDU>
Date: Thu, 12 Oct 89 15:30:12 +0100

- ---------------------------------------------------------------------------
INNC 90 PARIS
- ---------------------------------------------------------------------------
INTERNATIONAL NEURAL NETWORK CONFERENCE
JULY 9-13, 1990 PALAIS DES CONGRES PARIS FRANCE
- ---------------------------------------------------------------------------
Co-chairmen of the Conference:
B. Widrow (Stanford University)
B. Angeniol (Thomson-CSF)
Program committee
chairman: T. Kohonen (Helsinki University)
members:
I. Aleksander (Imperial College)
S.-I. Amari (Univ. of Tokyo)
L. Cooper (Brown Univ.)
R. Eckmiller (Univ. of Dusseldorf)
F. Fogelman (Univ. of Paris 11)
S. Grossberg (Boston Univ.)
D. Rumelhart (Stanford Univ.) *: to be confirmed
P. Treleaven (University College London)
C. von der Malsburg (Univ. of Southern California)
- -----------------------------------------------------------------------------
Members of the international community are invited to submit original
papers to INNC-90-PARIS by January 20, 1990, in English,
on scientific and industrial developments in the following areas:
A-APPLICATIONS B-IMPLEMENTATIONS C-THEORY D-COMMERCIAL
- -----------------------------------------------------------------------------
THE CONFERENCE will include
one day of tutorials
four days of conference
poster sessions
prototype demonstrations
A forum with workshop sessions: specific interest groups, product
sessions, deal sessions.
- ----------------------------------------------------------------------------
For information, contact:
Nina THELLIER NTC
INNC-90-PARIS
19 rue de la Tour
75116 PARIS-FRANCE
Tel: (33-1) 45 25 65 65
Fax: (33-1) 45 25 24 22
- -----------------------------------------------------------------------------
Francoise Fogelman

------------------------------

Subject: DOD Small Business Innovation Research Program
From: will@ida.org (Craig Will)
Date: Thu, 12 Oct 89 17:39:39 -0400

SMALL BUSINESS INNOVATION RESEARCH PROGRAM

Department of Defense


The U.S. Department of Defense has announced its fiscal year
1990 solicitation for the Small Business Innovation Research (SBIR)
Program.

The SBIR program provides for research contracts for small
businesses in various program areas designated by DoD component
agencies, including the Army, Navy, Air Force, Defense Advanced
Research Projects Agency (DARPA), and Strategic Defense Initiative
Organization (SDIO).

This year there are 16 topics specifically targeting
neural networks, and another 13 topics that specifically
mention neural networks as possible approaches that might be
used. This compares with 4 topics in the 1989 solicitation
that were specifically for neural network research, and 7 in
which neural network approaches were mentioned as possible.

The program is in three phases. Phase I awards are essentially
feasibility studies of 6 months with a dollar amount of about $50,000,
intended for a one-half person-year effort. Phase I contractors
compete for Phase II awards of 2 years in length and up to $500,000,
intended for 2 to 5 person-years of effort. Phase III is the
commercial application phase of the research.

Proposals must be no longer than 25 pages in length,
including the cover sheet, summary, cost proposal, resumes
and any attachments. Deadline for proposals is January 5,
1990. Principal investigators must be employees (50% or
more time) of small business firms. The program encourages
small businesses to make use of university-based and other
consultants when appropriate.

A brief description of each of the 29 neural network-related
topics has been published as a 4-page Special Supplementary Issue of
Neural Network Review. Copies of the special issue are available upon
request by sending a message to pinna@ida.org on milnet, or U.S.
postal mail to Neural Network Review, P.O. Box 427, Dunn Loring, VA
22027. (Note that subscription orders and requests for information or
samples of regular issues should go to Lawrence Erlbaum Associates,
Inc., Journal Subscription Department, 365 Broadway, Hillsdale, NJ
07642.)

For more details on the SBIR program and forms necessary for
submitting a proposal, obtain a copy of the SBIR Program Solicitation
book (438 pages in length) from the Defense Technical Information
Center:
Attn: DTIC/SBIR, Building 5, Cameron Station, Alexandria, VA
22304-6145. Telephone: Toll-free, (800) 368-5211. For
Virginia, Alaska, Hawaii: (202) 274-6902.

Craig Will
Institute for Defense Analyses
will@ida.org

------------------------------

Subject: Job Posting
From: Jordan B Pollack <pollack@cis.ohio-state.edu>
Date: Fri, 13 Oct 89 10:48:47 -0400

My dept is recruiting a couple of faculty in areas which might be
of interest to this group. The advertisement for COMPUTATIONAL MODELS
of NEURAL INFO. PROCESSING, going out to press, is enclosed below.

Since the area is quite large and vague, we have two
subareas in mind, but quality will overrule discipline.

The first subarea is "Biologically Realistic Connectionism", and
would deal with working models of neurons, organs, or small creatures.
The second potentially skips over biology and goes right to math and
physics: "Non-Linear Cognition", or the study of complex dynamical
systems related either to brain or mind (e.g. self-organizing
circuitry, cellular automata (reversibility?), chaos and complexity
theory, fractal patterns in speech/music, and so on).

We are also recruiting on a separate billet in SPEECH
PROCESSING, which could easily be in neural networks as well.

Please contact me if you want to discuss it, or know of
anybody good. Columbus is an especially nice place to live.

Jordan
pollack@cis.ohio-state.edu

- --------------------------------------------------------------------
Laboratory for Artificial Intelligence Research
Department of Computer and Information Science
and The Center for Cognitive Science at the
The Ohio State University

Position Announcement in Computational Neuroscience

A tenure-track faculty position at the Assistant Professor level
is expected to be available in the area of Computational Neuroscience.
We are seeking outstanding applicants who have a strong background and
research interest in developing computational models of neural
information processing. A Ph.D. in computer science, or in some other
appropriate area with a sufficiently strong background in computation,
is required. The candidate will be a regular faculty member in the
Department of Computer & Information Science, and will promote
interactions among cognitive science, computer science and brain
science through the Center for Cognitive Science.

The LAIR has strong symbolic and connectionist projects underway,
the Department has wide interests in parallel computation, and the
University has the major facilities in place to support the
computational neuroscience enterprise, including several parallel
computers, a Cray Y-MP, and a full range of brain imaging systems in
the medical school.

Applicants should send a resume along with the names
and addresses of at least three professional references to

Prof. B. Chandrasekaran
Department of Computer & Information Science
Ohio State University
2036 Neil Ave.
Columbus, OH 43210

The Ohio State University is an Equal Opportunity
Affirmative Action Employer, and encourages applications
from qualified women and minorities.





------------------------------

Subject: position offered
From: ted@nmsu.edu (Ted Dunning)
Organization: NMSU Computer Science
Date: 13 Oct 89 17:33:23 +0000


GRADUATE STUDY IN COMPUTER SCIENCE AT NEW MEXICO STATE UNIVERSITY

Computer Science Department & Computing Research Laboratory
- -----------------------------------------------------------

We are looking for able new students to join the Master's and Doctoral
programs in the Computer Science Department, with possible involvement in
projects at the Computing Research Laboratory (CRL). Areas of interest in
the Department and Laboratory include Artificial Intelligence, Parallel
Processing (software and architectures), Programming Languages, Interfaces,
Databases, Computer Security and Theory. Interdisciplinary research is
encouraged: there are interdisciplinary MS and PhD programs, and CRL
includes faculty and students from several departments apart from CS,
notably Psychology, Mathematics and Electrical Engineering.

The University is the prime research university in New Mexico, and is in
the Carnegie R1 research category. An MS program in computer science has
existed since 1966 and the CS department's Doctoral program was set up in
1980. CRL is a Center of Excellence created in 1983 with funding from the
New Mexico state legislature, and is now self-supporting through a variety
of federal and industrial grants and contracts. CRL is engaged largely in
Artificial Intelligence and Cognitive Science research. Its AI research
includes work on natural language processing, knowledge representation,
model-based problem solving, neural and connectionist networks and computer
vision. There is also a variety of research on other topics, such as
genome classification and atmospheric analysis. There are fertile working
relationships with the Sandia and Los Alamos national laboratories.

The CS Department and CRL are housed, together with the psychology and
mathematics departments, in a new, well-appointed building with special
facilities for local computer networks. The working environment is superb,
making it pleasurable to come in early and stay late. CS/CRL equipment
includes a large Sun network (including several Sparc workstations),
various other workstations, a 64-node Intel Hypercube, a new, 8-node IBM
ACE machine, and image processing equipment. There are high-quality links
to regional and national networks, allowing convenient access to Connection
Machines and other computers elsewhere in the state and the country.

The university is in Las Cruces, a pleasant, inexpensive, uncrowded,
medium-sized town (although it is one of the fastest-growing cities in the
country). The area features clean air, very low humidity, and moderate
winters. There is good Mexican food, and Mexico itself is only an hour's
drive away. New Mexico as a whole benefits from a mixed
Anglo/Hispanic/Indian culture, well reflected in its architecture, art and
activities. The state is renowned for the highly variegated beauty of its
scenery. It is one of the larger states in the Union but has one of the
smallest populations. Las Cruces is in a partly mountainous, semi-arid
region, but is blessed with lush pecan orchards and a surprising variety of
other greenery, and with being only an hour and a half's drive from forests
and ski areas. The spectacular White Sands National Monument and Gila
Wilderness are also within easy reach.

Enquiries should be directed to either:

John Barnden,
Graduate Committee Chair,
Computer Science Dept,
Box 30001/3CU,
New Mexico State University,
Las Cruces, NM 88003-0001.
(505) 646-6108

OR

Yorick Wilks, Director,
Computing Research Laboratory,
Box 30001/3CRL,
New Mexico State University,
Las Cruces, NM 88003-0001.
(505) 646-5466

E-mail enquiries should go to jbarnden@nmsu.edu.

(In any type of enquiry please state where you saw this announcement and what
research areas you're interested in.)

- --
ted@nmsu.edu
The poet felt so much at home
In Schilda's dear oak grove!
There I wove my tender rhymes
From violet scent and moonlight.

------------------------------

Subject: Job Announcement
From: Steve Hanson <jose@neuron.siemens.com>
Date: Thu, 19 Oct 89 07:21:40 -0400



Learning & Knowledge Acquisition

Siemens Corporate Research, Inc., the US research branch of
Siemens AG with sales in excess of $30 billion worldwide, has
research openings in the Learning and Knowledge Acquisition Group
for research staff scientists. The group does basic and applied
studies in the areas of Learning (Connectionist and AI), adaptive
processes, and knowledge acquisition.

Above and beyond laboratory facilities, the group has a network
of Sun workstations (SPARCs), file and compute servers, Lisp
machines and a mini-supercomputer, all managed by a group systems
administrator/research programmer.

Connections exist with our sister laboratory in Munich, Germany,
as well as with various leading universities including MIT, CMU
and Princeton University, in the form of joint seminars, shared
postdoctoral positions, and collaborative research.

The successful candidate should have a Ph.D. in Computer Science,
Electrical Engineering, or any other AI-related or Cognitive
Science field. Areas that we are soliciting for presently are in
Neural Computation or Connectionist Modeling, especially related
to
Learning Algorithms,
Novel Architectures,
Dynamics,
Biological Modeling,

and including any of the following application areas

Pattern Classification/Categorization,
Speech Recognition,
Visual Processing,
Sensory Motor Control (Robotics),
Problem Solving,
Natural Language Understanding,

Siemens is an equal opportunity employer. Please send your resume
and a reference list to

Stephen J. Hanson
Learning and Knowledge Acquisition Group
Siemens Corporate Research, Inc.
755 College Road East
Princeton, NJ 08540

jose@tractatus.siemens.com
jose@clarity.princeton.edu

------------------------------

Subject: Neural Networks for Industry: A two-day tutorial
From: itrctor@csri.toronto.edu (Ron Riesenbach)
Organization: University of Toronto, CSRI
Date: 23 Oct 89 20:29:56 +0000






INFORMATION TECHNOLOGY RESEARCH CENTRE

and

TELECOMMUNICATIONS RESEARCH INSTITUTE OF ONTARIO

are pleased to sponsor:


A Two-Day Tutorial on

N E U R A L N E T W O R K S F O R I N D U S T R Y


Presented by:
Dr. Geoffrey Hinton


Regal Constellation Hotel
900 Dixon Road (near Pearson International Airport)
Toronto, Ontario
December 12 and 13, 1989






Why Neural Networks?

Serial computation has been very successful at tasks that can be
characterized by clean logical rules, but it has been much less
successful at tasks like real-world perception or common sense reasoning
that typically require a massive amount of uncertain evidence to be
combined to reach a reliable decision. The brain is extremely good at
these computations and there is now a growing consensus that massively
parallel "neural" computation may be the best way to solve these problems.

The resurgence of interest in neural networks has been fuelled by
several factors. Powerful new search techniques such as simulated
annealing and its deterministic approximations can be embodied very
naturally in these networks, so parallel hardware implementations promise
to be extremely fast at performing the best-fit searches required for
content-addressable memory and real-world perception. Recently, new
learning procedures have been developed which allow networks to learn from
examples. The learning procedures automatically construct the internal
representations that the networks require in particular domains, and so
they may remove the need for explicit programming in ill-structured tasks
that contain a mixture of regular structure, partial regularities and
exceptions.
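As an illustrative aside (the digest itself contains no code), the kind of learning-from-examples procedure described above can be sketched in a few lines of modern Python: a small network with one layer of hidden units is trained by gradient descent on the XOR task. The network size, learning rate, and number of passes here are arbitrary choices for the sketch, not values taken from the tutorial.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: not linearly separable, so it needs adaptive hidden units.
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]  # input (+bias) -> hidden
w2 = [random.uniform(-1, 1) for _ in range(H + 1)]                  # hidden (+bias) -> output

def forward(x):
    xi = x + [1.0]  # constant bias input
    h = [sigmoid(sum(w * v for w, v in zip(ws, xi))) for ws in w1]
    y = sigmoid(sum(w * v for w, v in zip(w2, h + [1.0])))
    return xi, h, y

def mse():
    return sum((forward(x)[2] - t) ** 2 for x, t in data) / len(data)

error_before = mse()
lr = 0.5
for epoch in range(10000):
    for x, t in data:
        xi, h, y = forward(x)
        hb = h + [1.0]
        # Error signal at the output, then back-propagated to hidden units.
        dy = (y - t) * y * (1.0 - y)
        dh = [dy * w2[j] * h[j] * (1.0 - h[j]) for j in range(H)]
        for j in range(H + 1):
            w2[j] -= lr * dy * hb[j]
        for j in range(H):
            for k in range(3):
                w1[j][k] -= lr * dh[j] * xi[k]
error_after = mse()
print(error_before, "->", error_after)
```

The hidden units end up constructing an internal representation of the input (here, something like AND and OR detectors) that was never specified by the programmer, which is the point the paragraph above makes.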

There has also been considerable progress in developing ways of
representing complex, articulated structures in neural networks. The
style of representation is tailored to the computational abilities of the
networks and differs in important ways from the style of representation
that is natural in serial von Neumann machines. It allows networks to be
damage resistant, which makes it much easier to build massively parallel
networks.




Who Should Attend

This tutorial is directed at Industry Researchers and Managers who
would like to understand the basic principles underlying the recent
progress in neural network research. Some impressive applications of
neural networks to real-world problems already exist, but there are also
many over-enthusiastic claims and it is hard for the non-expert to
distinguish between genuine results and wishful thinking. The tutorial
will explain the main learning procedures and show how these are used
effectively in current applications. It will also describe research in
progress at various laboratories that may lead to better learning
procedures in the future.

At the end of the tutorial attendees will understand the current
state-of-the-art in neural networks and will have a sound basis for
understanding future developments in this important technology. Attendees
will also learn the major limitations of existing techniques and will thus
be able to distinguish between real progress and grandiose claims. They
will then be in a position to make informed decisions about whether this
technology is currently applicable, or may soon become applicable, to
specific problems in their area of interest.



Overview of the Tutorial


EARLY NEURAL NETWORKS & THEIR LIMITATIONS

Varieties of Parallel Computation; Alternative Paradigms for Computation

A Comparison of Neural Models and Real Brains: The Processing Elements and
the Connectivity

Major Issues in Neural Network Research

The Least Mean Squares Learning Procedure: Convergence Rate, Practical
Applications and Limitations

The Perceptron Convergence Procedure and the Limitations of Perceptrons

The Importance of Adaptive "Hidden Units"
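To make the last two points above concrete, here is a minimal illustrative sketch (in Python, not part of the original announcement) of the perceptron convergence procedure on the linearly separable AND task; the task and the threshold convention are invented for the example. The same loop never terminates on XOR, which is exactly the limitation that adaptive hidden units remove.

```python
# Perceptron convergence procedure on a linearly separable task (AND).
# Whenever the output is wrong, add (or subtract) the input vector to the
# weights; for separable data this terminates after finitely many updates.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

w = [0.0, 0.0, 0.0]  # two input weights plus a bias weight

def out(x):
    xi = x + [1.0]  # constant bias input
    return 1 if sum(wi * v for wi, v in zip(w, xi)) > 0 else 0

converged = False
while not converged:
    converged = True
    for x, t in data:
        y = out(x)
        if y != t:
            converged = False
            for k, v in enumerate(x + [1.0]):
                w[k] += (t - y) * v

print(w, [out(x) for x, _ in data])
```

Replacing the targets with XOR's makes `converged` never come true, no matter how long the loop runs: no single weight vector separates the two classes.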



BACK-PROPAGATION LEARNING: THE THEORY & SIMPLE EXAMPLES

The Back-Propagation Learning Procedure

The NetTalk example

Extracting the Underlying Structure of a Domain: The Family Trees Example

Generalizing from Limited Training Data: The Parity Function

Theoretical guarantees on the generalization abilities of neural nets

Improving generalization by encouraging simplicity



SUCCESSFUL APPLICATIONS OF BACK-PROPAGATION LEARNING

Sonar Signal Interpretation

Finding Phonemes in Spectrograms Using Time-Delay Nets

Hand-written character recognition

Bomb detection

Adaptive interfaces for controlling complex physical devices

Promising Potential Applications




IMPROVEMENTS, VARIATIONS & ALTERNATIVES TO BACK-PROPAGATION

Ways of Optimizing the Learning Parameters for Back-Propagation

How the Learning Time Scales with the Size of the Task

Back-Propagation in Recurrent Networks for Learning Sequences

Using Back-Propagation with Complex Post-Processing

Self-Supervised Back-Propagation

Pre-Processing the Input to Facilitate Learning

Comparison with Radial Basis Functions



UNSUPERVISED LEARNING PROCEDURES

Competitive Learning for discovering clusters

Kohonen's Method of Constructing Topographic Maps: Applications to Speech
Recognition

Linsker's method of learning by extracting principal components

Using spatio-temporal coherence as an internal teacher

Using spatial coherence to learn to recognize shapes
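As an illustrative sketch of the simplest of these ideas, competitive learning for discovering clusters fits in a few lines of Python: two units compete for each input, and only the winner moves toward it. The data, initial weights, and learning rate are invented for the example, not taken from the tutorial.

```python
import random

random.seed(2)

# Two Gaussian clusters of 2-D points, centred at (0,0) and (1,1).
points = ([(random.gauss(0.0, 0.1), random.gauss(0.0, 0.1)) for _ in range(50)] +
          [(random.gauss(1.0, 0.1), random.gauss(1.0, 0.1)) for _ in range(50)])

units = [[0.0, 0.5], [1.0, 0.5]]  # initial weight vectors of the two units
lr = 0.1
for epoch in range(20):
    for x, y in points:
        # The unit whose weight vector is nearest the input wins...
        d = [(u[0] - x) ** 2 + (u[1] - y) ** 2 for u in units]
        win = d.index(min(d))
        # ...and moves a fraction lr of the way toward that input.
        units[win][0] += lr * (x - units[win][0])
        units[win][1] += lr * (y - units[win][1])

print(units)  # each unit ends up near one cluster centre
```

No teacher ever labels the points; the cluster structure is discovered purely from the competition, which is what distinguishes these procedures from back-propagation.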




ASSOCIATIVE MEMORIES, HOPFIELD NETS & BOLTZMANN MACHINES

Linear Associative Memories: Inefficient One-Pass Storage Versus Efficient
Iterative Storage

Early Non-Linear Associative Memories: Willshaw Nets

Coarse-coding and Kanerva's Sparse Distributed Memories

Hopfield Nets and their Limitations

Boltzmann Machines, Simulated Annealing and Stochastic Units

Relationship of Boltzmann Machines to Bayesian Inference
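As a small illustrative sketch (in Python, with patterns invented for the example), a Hopfield net stores patterns in symmetric weights via the Hebbian outer-product rule and recalls them by repeatedly updating units toward the sign of their summed input; each such update can only lower (never raise) the network's energy, which is why the net settles into a stable state near a stored pattern.

```python
# Hopfield net: Hebbian storage and iterative recall (illustrative sketch).
N = 6
patterns = [[1, 1, 1, -1, -1, -1], [1, -1, 1, -1, 1, -1]]  # +1/-1 patterns

# Outer-product (Hebbian) storage: symmetric weights, no self-connections.
W = [[0.0] * N for _ in range(N)]
for p in patterns:
    for i in range(N):
        for j in range(N):
            if i != j:
                W[i][j] += p[i] * p[j] / N

def energy(s):
    # Non-increasing under the unit updates below.
    return -0.5 * sum(W[i][j] * s[i] * s[j] for i in range(N) for j in range(N))

# Recall: start from the first pattern with one bit flipped, then sweep the
# units, setting each to the sign of its summed input.
s = list(patterns[0])
s[2] = -s[2]
e0 = energy(s)
for sweep in range(3):
    for i in range(N):
        s[i] = 1 if sum(W[i][j] * s[j] for j in range(N)) >= 0 else -1

print(s, energy(s) <= e0)
```

A Boltzmann machine replaces the deterministic sign rule with a stochastic one whose randomness is gradually reduced (simulated annealing), which lets it escape the poor local energy minima that limit plain Hopfield nets.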




MEAN FIELD NETWORKS

Appropriate Languages and Computers for Software Simulators

Predictions of Future Progress in the Theory and Applications of Neural Nets



GUEST LECTURE

Neural Signal Processing, by Dr. Simon Haykin, Director, Communications
Research Laboratory, McMaster University, Hamilton, Ontario.

In this talk Dr. Haykin will present the results of neural signal
processing research applied to radar-related problems. The algorithms
considered include (a) the backpropagation algorithm, (b) the Kohonen
feature map, and (c) the Boltzmann machine. The radar databases used in
the study include ice-radar as encountered in the Arctic, and air traffic
control primary radar. The neural processing is performed on the Warp
systolic machine, which is illustrative of a massively parallel computer.



Seminar Schedule

Tuesday, December 12, 1989

8:00 a.m.  Registration and Coffee
9:00       Opening words: Mike Jenkins, Exec. Director, ITRC, and
           Peter Leach, Exec. Director, TRIO
9:15       Tutorial Session #1
10:30      Break
11:00      Tutorial Session #2
12:30 p.m. Lunch
2:00       Tutorial Session #3
3:30       Break
4:00       Tutorial Session #4
5:30       Wine and Cheese reception

Wednesday, December 13, 1989

8:00 a.m.  Coffee
9:00       Tutorial Session #5
10:30      Break
11:00      Tutorial Session #6
12:30 p.m. Lunch
2:00       Tutorial Session #7
3:30       Break
4:00       Guest lecture: Dr. Simon Haykin, "Neural Signal Processing"
5:00       Closing words





Registration and Fees:

The tutorial fee is $100 for employees of companies who are members of
ITRC's Industrial Affiliates Program or whose companies are members of
TRIO. Non-member fees are $375/person. Payment can be made by Visa,
MasterCard, AMEX or by cheque (payable to: "Information Technology
Research Centre"). Due to limited space, ITRC and TRIO members will
have priority in case of over-subscription. ITRC and TRIO reserve the
right to limit the number of registrants from any one company.

Included in the fees are a copy of the course notes and
transparencies, coffee and light refreshments at the breaks, a luncheon
each day as well as an informal wine and cheese reception Tuesday evening.
Participants are responsible for their own hotel accommodation,
reservations and costs, including hotel breakfast, evening meals and
transportation. PLEASE MAKE YOUR HOTEL RESERVATIONS EARLY:

Regal Constellation Hotel
900 Dixon Road
Etobicoke, Ontario
M9W 1J7
Telephone: (416) 675-1500
Telex: 06-989511
Fax: (416) 675-1737

Registrations will be accepted up to and including the day of the
event; however, due to limited space, attendees who register by December
6th will have priority over late registrants. All cancellations after
December 6th will result in a $50 withdrawal fee.

To register, complete the registration form attached to the end of
this message then mail or fax it to either one of the two sponsors.




Dr. Geoffrey E. Hinton

Geoffrey Hinton is Professor of Computer Science at the University of
Toronto, a fellow of the Canadian Institute for Advanced Research and a
principal researcher with the Information Technology Research Centre. He
received his PhD in Artificial Intelligence from the University of
Edinburgh. He has been working on computational models of neural networks
for the last fifteen years and has published 55 papers and book chapters on
applications of neural networks in vision, learning, and knowledge
representation. These publications include the book "Parallel Models of
Associative Memory" (with James Anderson) and the original papers on
distributed representations, on Boltzmann machines (with Terrence
Sejnowski), and on back-propagation (with David Rumelhart and Ronald
Williams). He is also one of the major contributors to the recent collection
"Parallel Distributed Processing" edited by Rumelhart and McClelland.

Dr. Hinton was formerly an Associate Professor of Computer Science at
Carnegie-Mellon University where he created the connectionist research
group and was responsible for the graduate course on "Connectionist
Artificial Intelligence". He is on the governing board of the Cognitive
Science Society and the governing council of the American Association for
Artificial Intelligence. He is a member of the editorial boards of the
journals Artificial Intelligence, Machine Learning, Cognitive Science,
Neural Computation and Computer Speech and Language.

Dr. Hinton is an expert at explaining neural network research to a
wide variety of audiences. He has given invited lectures on the research
at numerous international conferences and workshops, and has twice
co-organized and taught at the Carnegie-Mellon "Connectionist Models
Summer School". He has given three three-day industrial tutorials in the
United States for the Technology Transfer Institute. He has also given
tutorials at AT&T Bell Labs, at Apple, and at two annual meetings of the
American Association for Artificial Intelligence.


Dr. Simon Haykin

Simon Haykin received his B.Sc. (First-Class Honours) in 1953, Ph.D.
in 1956, and D.Sc. in 1967, all in Electrical Engineering from the
University of Birmingham, England. In 1980, he was elected Fellow of the
Royal Society of Canada. He is co-recipient of the Ross Medal from the
Engineering Institute of Canada and the J.J. Thomson Premium from the
Institution of Electrical Engineers, London. He was awarded the McNaughton
Gold Medal, IEEE (Region 7), in 1986. He is a Fellow of the IEEE.

He is presently Director of the Communications Research Laboratory
and Professor of Electrical and Computer Engineering at McMaster
University, Hamilton, Ontario. His research interests include image
processing, adaptive filters, adaptive detection, and spectrum estimation
with applications to radar.





----------------------------- Registration Form -----------------------------


Neural Networks for Industry
Tutorial by Geoffrey Hinton
December 12-13, 1989
Regal Constellation, 900 Dixon Rd.



Name _________________________________________

Title _________________________________________

Organization _________________________________________

Address _________________________________________

_________________________________________

_________________________________________

Postal Code _______________________

Telephone __________________ Fax ___________________

E-mail _______________________


Registration Fee (check one):


_ ITRC/TRIO Members - $100
_ Non-members - $375


Method of Payment (check one):

_ Cheque (Make cheques payable to "Information
  Technology Research Centre")

_ VISA Card Number _________________________
_ MasterCard ==> Expiration Date _____________________
_ American Express Surname _____________________________
Signature ___________________________

Please note: There will be a $50 cancellation charge after December 6/89.


Please fax or mail your registration to ITRC or TRIO:

ITRC, Rosanna Reid
203 College St., Suite 303
Toronto, Ontario, M5T 1P9
Phone (416) 978-8558
Fax (416) 978-8597

TRIO, Debby Sullivan
300 March Rd., Suite 205
Kanata, Ontario, K2K 2E2
Phone (613) 592-9211
Fax (613) 592-8163


PRIORITY REGISTRATION DEADLINE: DECEMBER 6/89.



------------------------------

End of Neuron Digest
*********************
