Neuron Digest   Tuesday, 30 Oct 1990                Volume 6 : Issue 63 

Today's Topics:
Music
Info request: Neural Net Application Tools
Re: Neuron Digest V6 #62
A Short Course in Neural Networks and Learning Theory
Neural Network Simulation Service
Neuron Digest V6 #62
PRE-PRINT availability
New Book on Neural Networks (PC Tools)
info on a workshop


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: MUSIC
From: Niall Griffith <ngr@cs.exeter.ac.uk>
Date: Thu, 18 Oct 90 17:26:59 +0100


I am working at the Connection Science lab at Exeter, and I am writing a
review of connectionist research on music. It would be really useful if
you could send me as many references as you have on the subject.

I will of course make these publicly available.

Niall Griffith
Centre for Connection Science    JANET:  ngr@uk.ac.exeter.cs
Dept. Computer Science
University of Exeter             UUCP:   ngr@expya.uucp
Exeter EX4 4PT
DEVON                            BITNET: ngr@cs.exeter.ac.uk@UKACRL
UK

------------------------------

Subject: Info request: Neural Net Application Tools
From: lambert@cod.nosc.mil (David R. Lambert)
Date: Fri, 26 Oct 90 09:54:59 -0700

I would like recommendations for neural net application tools which are
reliable and easy for students to learn and use. I need to detect and
recognize patterns in data consisting of a dozen or so variables, each
with a fairly small number of discrete values. My platforms are IBM-PC,
VAX, and Macintosh. I am currently beginning to look at Brainmaker,
MacBrain, and NeuralWare Explorer.

David R. Lambert, PhD
Email: lambert@nosc.mil

------------------------------

Subject: Re: Neuron Digest V6 #62
From: howard@aic.hrl.hac.com
Date: Fri, 26 Oct 90 18:42:09 -0700

> I'm interested in any references dealing with the use of
> neural nets for the real time control or simulation of movement,
> especially locomotion. Ultimately, the group I'm working
> with wants to understand gait (locomotion) disorders in
> humans - to me this means simulation.

Look for Nigel Goddard's work (student of Jerry Feldman). Goddard is
finishing his PhD at U. Rochester on an NN for recognizing gait. He uses
"moving light displays" created by placing a light on each joint and
filming movements.

----------Mike Howard, Hughes Research Labs


------------------------------

Subject: A Short Course in Neural Networks and Learning Theory
From: john@cs.rhbnc.ac.uk
Date: Sun, 28 Oct 90 12:45:21 +0000

-------------------------------------
A SHORT COURSE
IN
NEURAL NETWORKS AND LEARNING THEORY
7th and 8th January, 1991
-------------------------------------

Dr John Shawe-Taylor,
Department of Computer Science,
Royal Holloway and Bedford New College,
University of London,
Egham,
Surrey TW20 0EX UK

The two-day course will give an introduction to Neural Networks and
Learning Theory. It will be an opportunity to hear of recent results
which place the subject of Connectionism on a firm theoretical
foundation, by linking it with Computational Learning Theory. John
Shawe-Taylor has himself contributed to recent developments in applying
results of Computational Learning Theory to Neural Networks.

A key feature of the course will be its hands-on practical flavour. Both
days will include sessions where participants will have an opportunity to
test out ideas in practical working examples. This will highlight the
real problems in network design and training. However, by applying
the theoretical results of Computational Learning Theory, many
difficult problems will become better understood, and in some cases
tractable solutions will suggest themselves.

There follows a summary of the two days.

Day 1: Connectionism and Neural Networks
----------------------------------------
The day starts with an overview of connectionism stressing the main
strengths and weaknesses of the approach. Particular emphasis will be
given to areas where the techniques will find industrial applications in
the near future. At the same time, the areas where major problems
remain to be solved will be outlined, and an indication of current
trends in research, as well as implementation techniques, will be
given.

Particular emphasis will be placed on Feedforward Neural Networks. Such
networks will be discussed in more detail. This will be followed by an
opportunity to gain first-hand experience of the problems involved in
designing and training networks. In a concluding session a summary will
put the practical experiences in perspective with particular reference to
current research.

Day 2: Learning Theory for Feedforward Networks
-----------------------------------------------
During the second day the focus will be on Computational Learning Theory
and its application to the problems of training feedforward Neural
Networks. The day begins with an overview of the field of Computational
Learning Theory. This is followed by a discussion of contributions that
the theory has made in understanding connectionist architectures. The
results range from "negative", such as the fact that certain training
problems will be difficult or infeasible, to "positive", such as the
existence of methods for estimating the size of training sample needed to
give good generalisation with high confidence. The practical sessions of
the day will involve applying these insights to the problems of designing
and training feedforward Neural Networks.
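
(For illustration only, and not necessarily the formulation used in
the course: one classical "positive" result of this type is the
sample-complexity bound of Blumer, Ehrenfeucht, Haussler and
Warmuth. For a hypothesis class of VC dimension d, any hypothesis
consistent with a random training sample of size m has error at
most \epsilon with probability at least 1 - \delta, provided that,
for a universal constant c,

    m \ge \frac{c}{\epsilon}
          \left( d \ln\frac{1}{\epsilon} + \ln\frac{1}{\delta} \right)

in LaTeX notation. For feedforward nets, the VC dimension grows with
the number of weights, which ties the size of a network to the
amount of training data it needs to generalise well.)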

It is possible to register for just one of the two days. For more details
and registration information, please write to:

Dr Penelope Smith,
Industrial Liaison Officer,
RHBNC,
Egham, Surrey TW20 0EX

or email your postal address to:
john@cs.rhbnc.ac.uk

------------------------------

Subject: Neural Network Simulation Service
From: David Kanecki <kanecki@vacs.uwp.wisc.edu>
Date: Sun, 28 Oct 90 17:49:05 -0600

CONNECT/LINK
------------

NEURAL NETWORK/ HYPOTHESIS CORRELATION SERVICES


To introduce people to neural networks and hypothesis correlation
services, a limited-time offer is available under which one can
have neural network/hypothesis correlation analysis of information
performed at no charge.

For example, given a series of results, one can use the service to
aid in prediction based upon new or old circumstances. To use the
service, send the information via e-mail, or via regular mail with
a stamped, self-addressed envelope included, and use the
information format sheet below. Because no fee is charged, a limit
of 5 requests per organization or individual will be accepted. My
e-mail address is kanecki@vacs.uwp.wisc.edu, and my regular mail
address is:

David H. Kanecki, Bio. Sci., A.C.S.
P.O. Box 93
Kenosha, WI 53141
USA

or (414) 654-8710 after 7 PM CST



1. DATA FROM EXPERIMENT OR NOTEBOOK:

Example:

A system has two states of events and one state of action.
Based upon testing, the following information was obtained:

Observation 1    Observation 2    Action 1
-------------    -------------    ------------------
No Stimulus      No Stimulus      No Action occurred
No Stimulus      Stimulus         Action 1 occurred
Stimulus         No Stimulus      Action 1 occurred
Stimulus         Stimulus         No Action occurred

From this data, a request sheet was prepared as in the example
below:

--- Or send a verbal description so that I can do the coding, if need be ---

2. DATA FROM NOTEBOOK TRANSCRIBED FROM EXPERIMENT TO REQUEST FORM:


REQUEST for Neural Network/Hypothesis Correlation Service
----------------------------------------------------------

Name: __________________________________________________
Organization:___________________________________________
Address: _______________________________________________
E-Mail Address:_________________________________________
Phone No: ______________________________________________
FAX No: ________________________________________________

Data Coding Sheet:
Setup:

Number of Stimulus States: 2 (Observations 1 and 2)
Number of Action States:   1 (Action 1)

          Stimulus Active      Action Active
          ----------------     -------------
Case 1:   None                 None
Case 2:   Stimulus 2           Action 1
Case 3:   Stimulus 1           Action 1
Case 4:   Stimulus 1 and 2     None

Hypothesis Query: (What action will occur if a stimulus is active)
Stimulus
---------
1). Stimulus 1
2). Stimulus 2
3). Stimulus 1 & 2

---------------------------------


Based upon the data sent in and the queries asked, I will use the
neural network/hypothesis correlation program to generate a
response to each hypothesis query, using the data specified in the
setup section of the form.


3. RESULTS OF ANALYSIS

Results of Analysis Form
-------------------------

Percent of Information learned from sample data: 100%
Special Coding used by Operator to increase retention: YES


Active Stimulus Given    Action Predicted
---------------------    ----------------------------
1. None                  No action, Action 1 inactive
2. Stimulus 2            Action 1 active
3. Stimulus 1            Action 1 active
4. Stimulus 1 and 2      No action, Action 1 inactive

4. Allow 6 to 14 days for Return/Reply

May this service help you and your associates understand
simulation better for enrichment and creativity. Your comments
and feedback are welcome.
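
For readers who want to experiment on their own, note that the four
cases in the example above form the classic exclusive-or problem.
The sketch below (my own illustration in Python/NumPy, not Mr.
Kanecki's actual program, whose method is not described here) shows
how a small feedforward net trained by back-propagation can learn
the same table:

import numpy as np

rng = np.random.default_rng(1)

# The four cases from the coding sheet: inputs are (Stimulus 1,
# Stimulus 2); the target is whether Action 1 occurs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A 2-4-1 network trained by gradient descent on squared error.
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(20000):
    h = sigmoid(X @ W1 + b1)               # hidden activations
    out = sigmoid(h @ W2 + b2)             # network output
    d_out = (out - y) * out * (1.0 - out)  # output-layer gradient
    d_h = (d_out @ W2.T) * h * (1.0 - h)   # hidden-layer gradient
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

# Query the trained net (convergence depends on the random start).
out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
for stimuli, p in zip(X, out.ravel()):
    print(stimuli, "-> Action 1", "active" if p > 0.5 else "inactive")

With enough iterations this reproduces the Results of Analysis
table above: Action 1 is active exactly when one stimulus is
present.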


------------------------------

Subject: Neuron Digest V6 #62
From: JJ Merelo <jmerelo@ugr.es>
Date: 29 Oct 90 17:07:00 +0200

Here are some references for the Kohonen network:

[Koh82]  Kohonen, T.: "Self-organized formation of topologically
         correct feature maps"; Biological Cybernetics 43: 59-69,
         1982.
[KOH82]  Kohonen, T.: "Clustering, taxonomy, and topological maps
         of patterns"; in: Proceedings of the 6th Int. Conf. on
         Pattern Recognition; IEEE Computer Society Press; 1982.
[KOH84]  Kohonen, T.; Mäkisara, K.; Saramäki, T.: "Phonotopic maps
         - insightful representation of phonological features for
         speech recognition"; in: Proceedings of the 7th Int. Conf.
         on Pattern Recognition; Montreal (Canada); pp. 182-185;
         1984.
[KOH88a] Kohonen, T.: "Self-Organization and Associative Memory";
         Springer-Verlag; 1st edn. 1984; 2nd edn. 1988.
[KOH88b] Kohonen, T.: "The 'Neural' Phonetic Typewriter"; IEEE
         Computer; Vol. 21, No. 3; pp. 11-22; 1988.

You can get them from Kohonen himself by writing to him. I have not
found many other references. And speech recognition people do not
seem to think very highly of Kohonen's self-organizing map.
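
For the curious, the heart of the map in [Koh82] is a simple
competitive update rule. Below is a minimal self-organizing map
sketch (my own illustration in Python/NumPy, not code from the
references): a 1-D chain of units learns to cover 2-D inputs, with
the winning unit and its chain neighbours pulled toward each sample.

import numpy as np

rng = np.random.default_rng(0)

n_units, dim, n_steps = 10, 2, 2000
w = rng.random((n_units, dim))      # codebook (weight) vectors

for t in range(n_steps):
    x = rng.random(dim)                              # input sample
    winner = np.argmin(((w - x) ** 2).sum(axis=1))   # best-matching unit
    frac = 1.0 - t / n_steps
    alpha = 0.5 * frac                               # decaying learning rate
    radius = max(1.0, 3.0 * frac)                    # shrinking neighbourhood
    d = np.abs(np.arange(n_units) - winner)          # distance along the chain
    hnb = np.exp(-(d ** 2) / (2.0 * radius ** 2))    # neighbourhood weights
    w += alpha * hnb[:, None] * (x - w)              # Kohonen update

print(w)  # neighbouring units end up with nearby weight vectors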

JJ Merelo
Granada University ( Spain )
Electronics and Computer Tech. Dept.

------------------------------

Subject: PRE-PRINT availability
From: P.Refenes@cs.ucl.ac.uk
Date: Mon, 29 Oct 90 17:42:25 +0000

The following pre-print (SPIE-90, Boston, Nov. 5-9, 1990) is available.
(Write or e-mail to A. N. Refenes at UCL.)

AN INTEGRATED NEURAL NETWORK SYSTEM for HISTOLOGICAL IMAGE UNDERSTANDING


A. N. REFENES, N. JAIN & M. M. ALSULAIMAN
Department of Computer Science,
University College London,
Gower Street,
London WC1E 6BT, UK.

ABSTRACT


This paper describes a neural network system whose
architecture was designed to enable the integration of
heterogeneous sub-networks for performing specialised
tasks. Two types of network are integrated: a) a
low-level feature extraction network for sub-symbolic
computation, and b) a high-level network for decision
support.

The paper describes a non-trivial application from
histopathology, and its implementation using the Integrated
Neural Network System. We show that with careful network
design, the backpropagation learning procedure is an
effective way of training neural networks for histological
image understanding. We evaluate the use of symmetric and
asymmetric squashing functions in the learning procedure
and show that symmetric functions yield faster convergence
and 100% generalisation performance.
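
(A terminological note, with my guess at the functions involved,
since the abstract does not name them: a "symmetric" squashing
function is odd about the origin, while the standard logistic
function is not. A typical pair, in LaTeX notation, is

    \sigma(x) = \frac{1}{1 + e^{-x}} \in (0, 1), \qquad
    \tanh(x) = 2\,\sigma(2x) - 1 \in (-1, 1).

A symmetric function such as tanh keeps unit activations
zero-centred, which is commonly credited with faster
back-propagation convergence, consistent with the authors'
finding.)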

------------------------------

Subject: New Book on Neural Networks (PC Tools)
From: Russ Eberhart <RCE1%APLVM.BITNET@CORNELLC.cit.cornell.edu>
Date: Tue, 30 Oct 90 08:19:51 -0500


Announcing a new book on neural networks:

NEURAL NETWORK PC TOOLS:
A PRACTICAL GUIDE

Edited by Russell Eberhart and Roy Dobbins
The Johns Hopkins University Applied Physics Laboratory

Published by Academic Press: ISBN 0-12-228640-5

TABLE OF CONTENTS

Foreword (by Bernard Widrow)

Introduction (by Eberhart & Dobbins)
a. Myths versus realities
b. Purpose of book
c. Organization of book
d. Main neural network use categories

Chapter 1 - Background and History (by Eberhart & Dobbins)
a. Introduction
1. What is a neural network?
2. What is a neural network tool?
b. Biological basis for neural network tools
1. Introduction
2. Neurons
3. Differences between biological structures and NNT's
4. Where did neural networks get their name?
c. Neural network development history
1. Introduction
2. The Age of Camelot
3. The Dark Age
4. The Renaissance
5. The Neoconnectionist Age

Chapter 2 - Implementations (by Eberhart & Dobbins)
a. Introduction
b. Supervised training: The back-propagation model
1. Introduction
2. Topology and notation
3. Network input
4. Feedforward calculations
5. Training by error back-propagation
6. Running the back-propagation NNT
c. Unsupervised training: Self-organization and associative
memory
1. Introduction
2. Topology and notation
3. Network initialization and input
4. Training calculations
5. Running the self-organization NNT

Chapter 3 - Systems (by Eberhart & Dobbins)
a. Specification of the task
b. How to optimize the use of the neural network tool
c. How to choose the proper neural network tool
d. The importance of preprocessing
1. Use NN's wisely...don't try to do everything with them
2. Design for overall optimal system performance
e. Relationship to other areas including expert systems
f. Problem categories appropriate for neural networks
1. How to out-expert expert systems
2. Don't invent a Cadillac when a VW will do
3. Pattern recognition
4. Biopotential waveform analysis and classification

Chapter 4 - Software Tools (by Dobbins & Eberhart)
a. Introduction
b. Implementing neural networks on the PC
1. Using C and assembly language
2. Back-propagation networks
3. Vector and matrix operations
c. Running neural networks
1. Getting data into and out of the network
2. Setting attributes
3. What's it doing?
d. Implementation issues

Chapter 5 - Development Environments (by Dobbins & Eberhart)
a. Introduction
b. What is a neural network development environment?
1. Desirable characteristics of development environments
2. Why a development environment?
3. A brief survey of neural network development systems
c. Introduction to network modeling languages
d. Specifying neural network models
1. Specifying network architecture
2. Activation functions
3. Learning rules
4. Specifying the environment
5. Update rules
6. Neural network paradigms
e. CASENET: A neural network development environment

Chapter 6 - Hardware Implementations (by D. Gilbert Lee, Jr.)
a. When do you really need hardware assistance?
b. What's the deal about accelerator boards?
c. Transputers: when transputing is a cost-effective
approach
d. What's possible to implement on computers smaller than
a Cray
e. Mini-Case Study: Ship Pattern Recognition

Chapter 7 - Performance Metrics (by Eberhart, Dobbins, & Hutton)
a. Introduction
b. Percent correct
c. Average sum-squared error
d. Normalized error
e. Receiver operating characteristic curves
f. Recall and precision
g. Sensitivity, specificity, positive predictive value and
false alarm rate
h. Chi-square test

Chapter 8 - Network Analysis (by Vincent Sigillito & Russ Eberhart)
a. Introduction
b. Network analysis
1. Introduction
2. The "divide by three" Problem
3. Other considerations
4. The "square within a square" problem
5. Distributions of hidden neurode activity levels
6. Analyzing weights in trained networks

Chapter 9 - Expert Networks (by Maureen Caudill)
a. Introduction
b. Rule-based expert systems
c. Expert networks
1. Fuzzy logic
2. Fuzzy cognitive maps
3. An expert bond-rating network
4. A hierarchical expert network
5. Knowledge in an expert network
d. Expert network characteristics
e. Hybrid expert networks
1. Explanation by confabulation
2. Rule extraction
3. True hybrid expert

Chapter 10 - Case Study I - EEG Waveform Classification (by Eberhart
and Dobbins)
a. System specifications
b. Background
c. Data preprocessing and categorization
d. Test results

Chapter 11 - Case Study II - Radar Signal Processing (by Vincent
Sigillito and Larrie Hutton)
a. The radar system
b. Methods
c. Implementation
d. Conclusion

Chapter 12 - Case Study III - Technology in Search of a Buck (by
Tom Zaremba)
a. Introduction
b. Markets to watch and markets to trade
c. Futures market forecasts
d. Statistical futures market data
e. Sources and value of character-of-market data
f. Model description
g. Are neural nets suited to implementing technical analysis
models?
h. What was tried with the multilayer perceptron model?
i. How and why was the multilayer perceptron implemented in
EXCEL?
j. What was learned, what remains to be done and has any
money been made?

Chapter 13 - Case Study IV - Optical Character Recognition (by Gary
Entsminger)
a. Summary of the problem
b. System configuration
c. Scanner interfacing
d. Objects in Pascal
e. Notes and conclusions

Chapter 14 - Case Study V - Making Music (by Eberhart & Dobbins)
a. Introduction
b. Representing music for neural network tools
c. Network configurations
d. Stochasticity, variability and surprise
e. Playing your music with MIDI
f. Now what?

Glossary

References

Appendix A - Batchnet BP NNT code, with pattern, weight,
run and demo files

Appendix B - Self-Organizing NNT code, with pattern
run and demo files

Appendix C - Turbo Pascal code for optical character
recognition shell

Appendix D - Source code for music composition files

Appendix E - Additional NNT resources
a. Organizations/societies
b. Conferences/symposia
c. Journals/magazines/newsletters
d. Bulletin boards
e. Computer data bases

Appendix F - Matrix multiplication code for transputers

Index

------------------------------

Subject: info on a workshop
From: salam@frith.egr.msu.edu
Date: Wed, 24 Oct 90 12:17:30 -0400

This is a submission regarding information on a workshop that is
organized and managed by the Lifelong Education Program at MSU. I
hope it will be of interest to some people.

A Tutorial Workshop on Neural Nets: Theory, Design, and (Electronic)
Implementation



November 12-13, 1990



Michigan State University
East Lansing, Michigan 48824




Overview:
It is recognized that no matter how fast conventional digital
computers become, it is unlikely that they will outdo human
performance in tasks such as pattern recognition or associative
memory. It is only logical, therefore, that for such tasks the
architecture of the microprocessor or the computer should emulate
that of the brain. Many workers have proposed various architectures
that model some aspects of the highly interconnected nervous system
and the brain. These architectures, often referred to as neural
nets, basically consist of a large number of simple processors (or
neurons) which are highly interconnected and work asynchronously
and in parallel. Some design procedures have also been proposed by
many researchers; some procedures have been based on intuitive
arguments and physical reasoning alone, however. Consequently,
although the proposed neural devices have worked well in some case
studies, they have been found to fail in numerous other cases as
well. It is essential, therefore, to lay a foundation for the
proper design of neural processing devices and to develop effective
learning algorithms. It is equally essential that the designed
architectures and the learning algorithms lend themselves naturally
to the chosen medium of implementation. In essence, one has to
accommodate the prevalent technologies and pursue a methodology
that would ultimately balance the hypotheses of theoretical models
with the constraints of the media of implementation.




Objectives:
This workshop provides an in-depth introduction to recent
formulations of neural networks, spanning modeling, theory,
applications, and (electronic) silicon implementation. It
introduces the basic principles and mechanisms behind the present
designs of neural nets; it identifies the advantages and the
limitations of the existing design methodologies for specific
applications. The course presents novel learning schemes and
explains what makes them work and (if and) when they might fail.
From a practical viewpoint, the course will also focus on
implementations utilizing CMOS VLSI technologies. Recent design
implementations on VLSI chips, resulting from the research
activities of the instructors, will also be described.

Who Should Attend:
This workshop is designed for those who wish to learn about recent
developments in neural nets, their current uses, their methods of
implementation, and their potential impact on science and
technology.

Prerequisite:
At least a Bachelor's degree in engineering, physics, mathematics,
science, or equivalent. Background in circuits and systems is
helpful.



Faculty:

Anthony N. Michel: received the Ph.D. degree in electrical
engineering from Marquette University and the D.Sc. degree in
applied mathematics from the Technical University of Graz, Austria.
He has seven years of industrial experience. From 1968 to 1984 he
was at Iowa State University. At present he is Frank M. Freimann
Professor of Engineering and Dean of the College of Engineering,
University of Notre Dame, Notre Dame, IN. He is author and coauthor
of three texts and several other publications. Dr. Michel received
the 1978 Best Transactions Paper Award of the IEEE Control Systems
Society (with R. D. Rasmussen), the 1984 Guillemin-Cauer Prize
Paper Award of the IEEE Circuits and Systems Society (with R. K.
Miller and B. H. Nam), and an IEEE Centennial Medal. He is a former
Associate Editor and a former Editor of the IEEE Transactions on
Circuits and Systems and a former Associate Editor of the IEEE
Transactions on Automatic Control. He was the Program Chairman of
the 1985 IEEE Conference on Decision and Control. He was the
general chairman of the 1990 International Symposium on Circuits
and Systems, and he is presently an associate editor of the IEEE
Transactions on Neural Networks.


Fathi M. A. Salam (program chairman): is an associate professor of
electrical engineering at MSU. He received his B.S. and Ph.D. in
electrical science from the University of California-Berkeley, and
holds master's degrees in both mathematics and electrical
engineering. The author or coauthor of more than 70 technical
papers, he was associate editor of the IEEE Transactions on
Circuits and Systems (CAS) for nonlinear circuits and systems from
1985-87. He was cochair of the Engineering Foundation Conference on
Qualitative Methods for Nonlinear Dynamics in June 1986. He is the
co-editor of the book Dynamical Systems Approaches to Nonlinear
Problems in Circuits and Systems, SIAM, January 1988. He is
presently an associate editor of both the IEEE Transactions on
Neural Networks and the Journal of Circuits, Systems, and
Computers. His research interests include nonlinear phenomena in
circuits and systems, analysis and design of neural networks,
adaptive systems, and robotics.


Timothy Grotjohn: received his B.S. and M.S. degrees from the
University of Minnesota in 1982 and 1984, respectively. He then
continued his studies at Purdue University with an American
Electronics Association-Hewlett Packard Faculty Development
Fellowship, completing his Ph.D. degree in 1986. He joined the
Department of Electrical Engineering at MSU in 1987. His research
area is the simulation, modeling and characterization of
semiconductor devices and processes. He has done consulting at AT&T
Bell Laboratories and has worked two summers at AT&T Bell
Laboratories. He has also been a Visiting Researcher at the
Institute of Microelectronics in Stuttgart, West Germany.


Summary

Date: November 12-13, 1990
Days: Monday-Tuesday
Registration: 8:30 a.m. - 9:00 a.m., Monday, November 12
Session Time: 8:30 a.m. - 5:00 p.m. daily
Place: The Kellogg Center for Continuing Education
Fee: $395.00 per person
Credit: 1.5 CEU




Daily Schedule:
Sessions meet from 8:30 a.m. to 5:00 p.m. each day.

Monday, November 12
8:30 a.m. Session I, Room 104

Artificial Neural Nets - an introduction

- neural nets, or processing networks of the brain: a different
  architecture for engineering technology
- biological neuronal networks and their architectures; the
  synaptic weight and its models
- advantages of neural network processors: fault-tolerance,
  parallel processing, asynchronous processing
- mathematical models: the feedforward and feedback models,
  multilayered models, the Hopfield model, the Grossberg model,
  the Hoppensteadt model, and newly introduced models
- the discrete models vs. the analog models
- a mathematical formulation: gradient systems

Lunch, Centennial Room

1:15 p.m. Session II, Room 104

Engineering Design & Applications

- Basic analysis results
- Conditions for proper design
- Speed of convergence
- Four design schemes: discrete/continuous
- Lower block triangular form
- Versatile function generator design
- A/D converters
- Resistor sorters
7:30 - 9:30 p.m.
Optional tour of neural net research facility at Michigan State.
Performance of PC-interfaced Artificial Neural Net chips will be
demonstrated. Dr. Salam will be available for informal questions
and review.



Tuesday, November 13
8:30 a.m. Session III, Room 104

Programming the network, learning, or how to store memories

- Learning: supervised and unsupervised
- The back-propagation algorithm: when it works and why it might
  fail; continuous-time (analog) form; extensions and improvements
- The outer product (the Hebb) rule: theoretical justifications as
  well as limitations; the discrete sum vs. the integral form;
  practical experience with the rule; modifications (a minimal
  sketch of this rule follows the daily schedule below)
- New (1990) learning rule(s) that store data for feedback neural
  nets

Noon Lunch, Centennial Room
1:15 p.m. Session IV, Room 104

Implementation via simulation, electronics, and electro-optics

- Implementation media: software vs. hardware
- Hardware: electro-optics vs. electronics
- Electronics: digital vs. analog
- Advantages of analog silicon VLSI for (artificial) neural nets
- Basic elements of analog VLSI
- Designed/implemented analog MOS neural network VLSI chips

5:00 p.m. Adjournment
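
As promised in the Session III outline above, here is a minimal
sketch of the outer-product (Hebb) storage rule for a feedback net
of the Hopfield type (my own illustration in Python/NumPy; the
workshop itself may present the rule differently):

import numpy as np

# Store two +/-1 patterns in a weight matrix built from their outer
# products, with self-connections (the diagonal) zeroed.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]], dtype=float)
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0.0)

# Recall: start from a corrupted copy of the first pattern and
# iterate the sign rule (synchronous updates for brevity; the
# classical net updates units asynchronously).
state = np.array([1, -1, 1, -1, -1, -1], dtype=float)
for _ in range(5):
    state = np.sign(W @ state)
    state[state == 0] = 1.0      # break ties consistently

print(state)  # recovers the first stored pattern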



General Information

Program Fee: The program fee of $395.00 includes tuition, program
materials, refreshments, and lunch.

Group Discounts: Group discounts of 20% are available for three or
more participants registered together in advance from the same
company.

Registration: Don't delay. Register today. In order to maintain
reasonable class sizes, registrations are accepted on a first
come, first served basis until an optimum number is achieved. The
final deadline is May 31, 1990. Mail the registration form with
payment today. Allow one week for your return confirmation. For
immediate confirmation of registration, use your VISA/MasterCard
by telephone (800) 447-3549 [in Michigan (800) 462-0846] or FAX
(517) 353-3900.

Changes and Cancellations: Michigan State University reserves the
right to make changes in program speakers or presenters if
unforeseen circumstances so dictate. Michigan State University also
reserves the right to cancel programs when enrollment criteria
are not met, or where conditions beyond its control prevail.
Every effort will be made to contact each enrollee when a program
is cancelled. All program fees will be refunded when a program is
cancelled by Michigan State University. Any additional costs
incurred by the enrollee for cancelled programs are the
responsibility of the enrollee. Written cancellations by preregistrants
postmarked ten or more days prior to the seminar will be fully
refunded, except for a $25.00 processing fee. No refund will be
allowed for withdrawal postmarked less than ten days prior to the
seminar. If you fail to attend the program and do not notify
Engineering Lifelong Education, you are liable for the entire fee.


Continuing Education Units (CEU): The Continuing Education Unit is
defined as "Ten contact hours of participation in an organized
continuing education experience under responsible sponsorship,
capable direction, and qualified instruction."
Michigan State
University maintains a permanent record of all CEU's issued.
Individuals may use transcripts as evidence of participation in
continuing education programs. This program carries 1.5 CEU's.

Housing: Housing is the responsibility of each participant.
Housing will be available at The Kellogg Center for Continuing
Education. Room rates are in the range of $50 to $80. Room rates are
subject to change. To make your reservations, please call
517-355-5090 or complete the form attached.

How to Reach Kellogg Center: Kellogg Center is located on campus,
on Harrison Avenue. For motorists, exit from US-127, or I-496 at
Trowbridge Road. When Trowbridge ends, turn left on Harrison to
the center (east side of Harrison). Lansing's Capital City
Airport has limousine and taxi service to the center. The center is
approximately one mile from the East Lansing train station served
by Amtrak. For further information, contact Dr. Anthony Rigas,
Director, Engineering Lifelong Education, A394 Engineering
Building, Michigan State University, East Lansing, MI 48824-1226.
Telephone (800) 447-3549; or, in Michigan, (800) 462-0846.

****************************************************************
****************************************************************
Please return this preregistration form if you
plan to attend.

Neural Nets
Engineering Lifelong Education
A-394 Engineering Building
Michigan State University
East Lansing, MI 48824-1226

Name______________________________________________

Title_______________________________________________

Institution/Company__________________________________

Address____________________________________________

City____________________State________ZIP____________

Daytime phone ( )______________________
Yes, I wish to receive CEU's.

My S.S.# is_______-_______-_______

Conference Registration Fee: $395.00 pre-registered

Enclosed total:__________
Make check for fee and meals payable to Michigan State
University. To pay by VISA/MC card, please complete the following:

Exp. date___________________ __Visa __ MasterCard

Number__________________________________________

Signature________________________________________
(Engineering Lifelong Education cannot accept other credit
cards.)


November 12-13, 1990



Overnight Housing Reservation

Arrival date___________ Departure date___________

Estimated arrival time___________________________

__ Single occupancy
__ Shared occupancy (half twin); person you wish to share
   with_____________________________
__ Regular room; if none available, please book a room in another
   nearby facility at comparable rate.
__ Regular room; if none available, please book a deluxe room.
__ Deluxe room.

Late Arrival Guarantee: __ Visa __ MasterCard __ AMEX

Exp. date__________________________

Charge card no.________________________________

Signature_____________________________________

Workshop on Neural Nets
Engineering Lifelong Education
A-394 Engineering Building
Michigan State University
East Lansing, MI 48824-1226

------------------------------

End of Neuron Digest [Volume 6 Issue 63]
****************************************
