Neuron Digest   Wednesday, 13 Feb 1991                Volume 7 : Issue 9 

Today's Topics:
Preprints: Speech Recognition Using a Neural Network with Time Delays
Technical report available
preprint - Cooperation of Learning Algorithms
Tech Reports AND Position Available at USC
CFP - Analog VLSI Neural Networks
A Short Course in Neural Networks and Learning Theory
Symposium on Models of Human Identification and Categorization
Symposium and Forum Announcements


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Preprints: Speech Recognition Using a Neural Network with Time Delays
From: unni@neuro.cs.gmr.com (K.P.Unnikrishnan)
Date: Wed, 06 Feb 91 11:05:57 -0500



THE FOLLOWING PREPRINTS ARE NOW AVAILABLE:


Speaker-Independent Digit Recognition using a
Neural Network with Time-Delayed Connections


K.P. Unnikrishnan        J.J. Hopfield        D.W. Tank
GM Research Labs         Caltech              AT&T Bell Labs
Warren, MI               Pasadena, CA         Murray Hill, NJ


ABSTRACT

The capability of a small neural network to perform speaker-independent
recognition of spoken digits in connected speech has been investigated.
The network uses time-delays to organize rapidly changing outputs of
symbol detectors over the time scale of a word. The network is
data-driven and unclocked. In order to achieve useful accuracy in a
speaker-independent setting, many new ideas and procedures were
developed. These include improving the feature detectors,
self-recognition of word ends, reduction in network size, and dividing
speakers into natural classes. Quantitative experiments based on Texas
Instruments digit data bases are described.
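
To make the time-delay idea concrete, here is a minimal sketch in
Python (illustrative only -- this is not the authors' network, and all
sizes, weights, and signals below are invented) of a single unit that
combines time-delayed copies of feature-detector outputs, so that
evidence spread over the duration of a word is accumulated without an
external clock:

import random

NUM_FEATURES = 4      # hypothetical number of symbol/feature detectors
NUM_DELAYS   = 5      # hypothetical number of tapped delays per feature

# One weight per (feature, delay) pair; random placeholders here, since
# a real network's weights would come from training.
weights = [[random.uniform(-1, 1) for _ in range(NUM_DELAYS)]
           for _ in range(NUM_FEATURES)]

def unit_output(history):
    # history[f][d] is the output of feature detector f, d steps ago.
    # Summing each detector over its delay taps is what lets rapidly
    # changing detector outputs be organized over the time scale of a
    # word, with no segmentation of the input stream.
    s = 0.0
    for f in range(NUM_FEATURES):
        for d in range(NUM_DELAYS):
            s += weights[f][d] * history[f][d]
    return max(0.0, s)    # simple rectifying nonlinearity, for illustration

# Fabricated input: a short buffer of past detector outputs.
history = [[random.random() for _ in range(NUM_DELAYS)]
           for _ in range(NUM_FEATURES)]
print(unit_output(history))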

_____________________


Connected-Digit Speaker-Dependent Speech Recognition using a
Neural Network with Time-Delayed Connections
[To appear in: IEEE Trans. on Signal Proc., March 1991]


K.P. Unnikrishnan        J.J. Hopfield        D.W. Tank
GM Research Labs         Caltech              AT&T Bell Labs
Warren, MI               Pasadena, CA         Murray Hill, NJ


ABSTRACT


An analog neural network that can be taught to recognize stimulus
sequences has been used to recognize the digits in connected speech. The
circuit computes in the analog domain, using linear circuits for signal
filtering and nonlinear circuits for simple decisions, feature extraction
and noise suppression. An analog perceptron learning rule is used to
organize the subset of connections used in the circuit that are specific
to the chosen vocabulary. Computer simulations of the learning algorithm
and circuit demonstrate recognition scores >99% for a single speaker
connected-digit database. There is no clock; the circuit is data-driven,
and there is no necessity for endpoint detection or segmentation of the
speech signal during recognition. Training in the presence of noise
provides noise immunity up to the trained level. For the speech problem
studied here, the circuit connections need only be accurate to about 3
bits digitization depth for optimum performance. The algorithm used maps
efficiently onto analog neural network hardware: single chip
microelectronic circuits based upon this algorithm can probably be built
with current technology.
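
The claim that about 3 bits of connection accuracy suffice is easy to
probe in miniature. The following toy sketch (not the paper's analog
circuit or digit data; the task, sizes, and quantization grid are all
invented) trains an ordinary perceptron and then rounds each weight to
one of 8 levels, i.e. 3 bits:

import random
random.seed(0)                             # reproducible toy run

def train_perceptron(samples, dim, epochs=50, lr=0.1):
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, target in samples:          # target is +1 or -1
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if y != target:                # classic perceptron update
                w = [wi + lr * target * xi for wi, xi in zip(w, x)]
                b += lr * target
    return w, b

def quantize(w, bits=3):
    # Round each weight onto 2**bits uniformly spaced levels spanning
    # [-m, m] -- a crude stand-in for limited analog connection accuracy.
    levels = 2 ** bits
    m = max(abs(wi) for wi in w) or 1.0
    step = 2 * m / (levels - 1)
    return [round(wi / step) * step for wi in w]

# Fabricated linearly separable problem in 5 dimensions.
true_w = [random.uniform(-1, 1) for _ in range(5)]
samples = []
for _ in range(200):
    x = [random.uniform(-1, 1) for _ in range(5)]
    samples.append((x, 1 if sum(t * xi for t, xi in zip(true_w, x)) > 0 else -1))

w, b = train_perceptron(samples, dim=5)
wq = quantize(w)
errors = sum(1 for x, t in samples
             if (1 if sum(wi * xi for wi, xi in zip(wq, x)) + b > 0 else -1) != t)
print("errors after 3-bit quantization:", errors, "of", len(samples))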

_____________________

For copies of preprints, send your mailing address to:

unni@neuro.cs.gmr.com

or

K.P.Unnikrishnan, Computer Science Department,
GM Research Labs, Warren, MI 48090-9055

------------------------------

Subject: Technical report available
From: stefano nolfi <IP%IRMKANT.BITNET@VMA.CC.CMU.EDU>
Date: Wed, 06 Feb 91 13:24:05 -0400


The following technical report is now available. The paper has been
submitted to ICGA-91.

Send requests to stiva@irmkant.Bitnet.
E-mailed comments and related references are appreciated.


AUTO-TEACHING:
NETWORKS THAT DEVELOP THEIR OWN TEACHING INPUT

Stefano Nolfi        Domenico Parisi
Institute of Psychology
CNR - Rome
E-mail: stiva@irmkant.Bitnet


ABSTRACT

Back-propagation learning (Rumelhart, Hinton and
Williams, 1986) is a useful research tool but it has a
number of undesirable features such as having the
experimenter decide from outside what should be
learned. We describe a number of simulations of
neural networks that internally generate their own
teaching input. The networks generate the teaching
input by transforming the network input through
connection weights that are evolved using a form of
genetic algorithm. What results is an innate (evolved)
capacity not to behave efficiently in an environment
but to learn to behave efficiently. The analysis of
what these networks evolve to learn shows some
interesting results.
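
The following minimal sketch (an illustration of the general scheme
only, not the authors' simulations; the task, sizes, and fitness
measure are invented) shows the two nested loops involved: an inner
"lifetime" in which a network learns toward targets produced by its
own teaching weights, and an outer genetic algorithm that evolves
those teaching weights by the fitness of the resulting behavior:

import random
random.seed(1)                  # reproducible toy run

DIM = 3                         # hypothetical input/output size

def forward(w, x):              # one linear map with elementwise weights
    return [wi * xi for wi, xi in zip(w, x)]

def lifetime_fitness(teach_w, trials=30, lr=0.2):
    # One "lifetime": the network learns toward self-generated targets,
    # then its behavior is scored against an arbitrary environmental
    # goal (here, producing outputs near 1).
    learn_w = [random.uniform(-0.1, 0.1) for _ in range(DIM)]
    for _ in range(trials):
        x = [random.random() for _ in range(DIM)]
        target = forward(teach_w, x)          # self-generated teaching input
        output = forward(learn_w, x)
        learn_w = [w + lr * (t - o) * xi      # delta-rule step toward target
                   for w, t, o, xi in zip(learn_w, target, output, x)]
    x = [random.random() for _ in range(DIM)]
    return -sum((1.0 - o) ** 2 for o in forward(learn_w, x))

# Outer loop: a bare-bones genetic algorithm over the teaching weights.
# Note that what is selected is not behavior itself but the capacity of
# the teaching weights to produce useful learning.
pop = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(20)]
for generation in range(10):
    pop.sort(key=lifetime_fitness, reverse=True)
    survivors = pop[:10]
    pop = survivors + [[w + random.gauss(0, 0.1) for w in p]  # mutated copies
                       for p in survivors]
print("best fitness:", lifetime_fitness(pop[0]))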


references

Rumelhart, D.E., Hinton G.E., and Williams, R.J. (1986).
Learning internal representations by error propagation. In
D.E. Rumelhart, and J.L. McClelland, (eds.), Parallel
Distributed Processing. Vol.1: Foundations. Cambridge,
Mass.: MIT Press.

------------------------------

Subject: preprint - Cooperation of Learning Algorithms
From: leon%FRLRI61.BITNET@CUNYVM.CUNY.EDU
Date: Thu, 07 Feb 91 12:02:44 +0100


The following paper has been placed in the neuroprose archives at Ohio
State University:




A Framework for the Cooperation
of Learning Algorithms


Leon Bottou & Patrick Gallinari
Laboratoire de Recherche en Informatique
Universite de Paris XI
91405 Orsay Cedex - France


Abstract

We introduce a framework for training architectures composed of several
modules. This framework, which uses a statistical formulation of learning
systems, provides a single formalism for describing many classical
connectionist algorithms as well as complex systems where several
algorithms interact. It makes it possible to design hybrid systems that
combine the advantages of connectionist algorithms with those of other
learning algorithms.
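
As a rough illustration of the modular idea (not the paper's formalism;
the modules, cost, and data below are invented), any component exposing
a forward pass and a backward gradient pass can be chained with others
and trained against a single global cost:

class Scale:
    # A trainable gain -- stands in for any parametric learning module.
    def __init__(self, gain=0.5):
        self.gain = gain
    def forward(self, x):
        self.x = x
        return self.gain * x
    def backward(self, grad, lr=0.05):
        grad_in = grad * self.gain        # gradient w.r.t. this module's input
        self.gain -= lr * grad * self.x   # gradient step on own parameter
        return grad_in

class Square:
    # A fixed nonlinearity -- a module with no parameters at all.
    def forward(self, x):
        self.x = x
        return x * x
    def backward(self, grad, lr=None):
        return grad * 2 * self.x

modules = [Scale(), Square(), Scale()]    # a heterogeneous chain
for step in range(100):
    x, target = 1.0, 2.0
    y = x
    for m in modules:                     # forward through every module
        y = m.forward(y)
    grad = 2 * (y - target)               # d(cost)/d(output), squared error
    for m in reversed(modules):           # backward through every module
        grad = m.backward(grad)

y = 1.0
for m in modules:
    y = m.forward(y)
print("output after training:", round(y, 3))   # should approach the target 2.0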



This paper will appear in the NIPS-90 proceedings. To retrieve it by
anonymous ftp, do the following:

unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get bottou.cooperation.ps.Z
ftp> quit
unix>
unix> zcat bottou.cooperation.ps.Z | lpr -P<your postscript printer>


------------------------------

Subject: Tech Reports AND Position Available at USC
From: jannaron@charlie.ece.scarolina.edu (Prof. Robert Jannarone)
Date: Mon, 11 Feb 91 08:39:16 -0500



T E C H N I C A L   R E P O R T S   A V A I L A B L E

Hu, Y., Ma, K., & Jannarone, R. J. (1991). Real-time pattern
recognition, II: visual conjunctoid neural networks. Research Report
#NNETS91-1, Electrical & Computer Engineering Department, University of
South Carolina.

Jannarone, R. J. (1991). Automated data processing prospects: some
examples and suggestions. Research Report #NNETS91-5, Electrical &
Computer Engineering Department, University of South Carolina.

Mallya, S., & Jannarone, R. J. (1991). Real-time pattern
recognition, I: neural network algorithms for normal models. Research
Report #NNETS91-4, Electrical & Computer Engineering Department,
University of South Carolina.

Tatman, G., & Jannarone, R. J. (1991). Real-time pattern
recognition, III: alternative neural networks for speech recognition.
Research Report #NNETS91-3, Electrical & Computer Engineering Department,
University of South Carolina.

Mehta, P., & Jannarone, R. J. (1991). Real-time neural networks,
IV: conjunctoid parallel implementation. Research Report #NNETS91-5,
Electrical & Computer Engineering Department, University of South
Carolina.

Please mail requests to:

Robert J. Jannarone
Electrical and Computer Eng. Dept.
University of South Carolina
Columbia, SC 29208
(803) 777-7930
(803) 777-8045, FAX
jannaron@ece.scarolina.edu

___________________________________________________________________________
___________________________________________________________________________


JOB OPENING ELECTRICAL & COMPUTER ENGINEERING, TENURE TRACK,
UNIVERSITY OF SOUTH CAROLINA (USC)

The USC EECE Department invites applications for tenure track faculty
positions. Particular areas of interest include quantum and physical
electronics, computer architecture, and computer vision. PERSONS OF HIGH
CALIBER IN OTHER AREAS WILL BE CONSIDERED (the EECE Department houses a
highly active research laboratory in neurocomputing). Appointment will
be at the Assistant or Associate Professor level with a competitive
salary and rank commensurate with qualifications. Tenured appointments
at the Professor level are also possible for uniquely qualified
individuals. USC, as the flagship university of the state, seeks
candidates having a strong commitment to excellence in both education and
research. Candidates for Associate Professor are expected to have
significant research records. Candidates for Assistant Professor are
expected to show strong research potential. Positions may be filled as
early as January, 1991 but will remain open until suitable candidates are
found. Applicants should send resumes, including names of at least three
references, to Professor Etan Bourkoff, Chair, Department of Electrical
and Computer Engineering, Swearingen Engineering Center, University of
South Carolina, Columbia, SC 29208. The University of South Carolina is
an equal opportunity/affirmative action employer.

------------------------------

Subject: CFP - Analog VLSI Neural Networks
From: takefuji@axon.eeap.cwru.edu (Yoshiyasu Takefuji)
Date: Wed, 06 Feb 91 12:47:28 -0500

CALL FOR PAPERS
Journal Analog Integrated Circuits and Signal Processing (AICSP)
Special Issue
on
Analog VLSI Neural Networks

Papers are solicited for a special issue of the Journal Analog Integrated
Circuits and Signal Processing (AICSP) covering the growing area of
artificial neural networks to be published in September 1992. The
special issue will cover all aspects of analog VLSI neural networks.
Topics of interest include, but are not limited to, the following:

*VLSI analog/digital systems
*Tradeoffs of analog/digital systems
*Novel applications in signal processing, optimization, and others
*Wafer scale integrated systems
*Artificial neuron/synaptic models and implementations
*On-line learning hardware.

Six copies of complete manuscripts should be sent to Yoshiyasu Takefuji
by December 15, 1991.

Guest Editor: Prof. Yoshiyasu Takefuji
Dept. of Electrical Engineering
Case Western Reserve University
Cleveland, OH 44106, USA
Phone: 216-368-6430
Fax: 216-368-2668
Internet: takefuji@axon.eeap.cwru.edu

Instructions for authors can be obtained from the guest editor or by
contacting Kluwer at the following address.

Karen S. Cullen
Kluwer Academic Publishers
101 Philip Drive
Norwell, MA 02061, USA
Tel. (617) 871-6300 fax (617) 871-6528
Email Karen@world.std.com


------------------------------

Subject: A Short Course in Neural Networks and Learning Theory
From: john@cs.rhbnc.ac.uk
Date: Thu, 07 Feb 91 12:15:43 +0000

____________________________________
A SHORT COURSE
IN
NEURAL NETWORKS AND LEARNING THEORY
10th and 11th April, 1991
____________________________________

Dr John Shawe-Taylor,
Department of Computer Science,
Royal Holloway and Bedford New College,
University of London,
Egham,
Surrey TW20 0EX UK


Neural networks offer the exciting prospect of training computers to
perform tasks by example rather than explicit programming. They are
finding applications across a broad spectrum of tasks including
explosives detection, credit risk, machine vision, etc. But how reliable
are such techniques? Can we guarantee that a machine that is programmed
by example will necessarily perform adequately in novel situations? And
are the techniques practical for large scale applications? These
questions are currently being addressed by research in the area of
Computational Learning Theory. This theory provides invaluable insights
for assessing the risks involved in relying on a limited number of
examples as well as providing a framework for estimating the efficacy of
training methods.
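
As one concrete example of the kind of result such a course covers
(the standard PAC bound for a finite hypothesis class, stated here as
an illustration; the course's actual syllabus may differ): if a
hypothesis consistent with m examples is drawn from a class of |H|
hypotheses, then m >= (1/epsilon)(ln|H| + ln(1/delta)) examples
suffice for true error at most epsilon with probability at least
1 - delta. In Python:

import math

def pac_sample_bound(hypothesis_count, epsilon, delta):
    # Standard PAC bound for a finite hypothesis class: any hypothesis
    # consistent with this many examples has true error <= epsilon with
    # probability >= 1 - delta.
    return math.ceil((math.log(hypothesis_count) + math.log(1 / delta)) / epsilon)

# Example: ~2**30 distinguishable networks, 5% error, 95% confidence.
print(pac_sample_bound(2 ** 30, epsilon=0.05, delta=0.05))   # about 476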

The course will cover the main results of this theory which are needed
for the practitioner. They will permit those who are developing and using
Neural Network applications to place their performance in perspective and
realistically assess how networks will scale and how accurately they are
likely to respond to new data.

A key feature of the course will be its hands-on practical flavour. It
will include sessions where participants will have an opportunity to test
out ideas in practical working examples.

The course covers two days:

Day 1: Connectionism and Neural Networks
________________________________________
An overview of connectionism stressing the main strengths and
weaknesses of the approach. Particular emphasis will be given to areas
where the techniques are finding industrial application. At the same
time the areas where major problems remain to be solved will be
outlined and an indication of current trends in research will be
given.


Day 2: Learning Theory for Feedforward Networks
_______________________________________________
The focus will be on applying recent advances in Computational Learning
Theory to Feedforward Neural Networks. An overview of the field of
Computational Learning Theory will be given. This theory puts training
problems in perspective and suggests effective solutions. It also
speaks to the question of generalisation and allows predictions of
performance to be made. The practical sessions will involve applying
these insights to the training problems of Day 1.


Who should attend?
__________________
- Those who are involved in designing Neural Network systems or will
be required to make decisions about their application and who wish to
acquire expertise enabling them to make informed judgements about
Neural Network performance.

- Those who wish to benefit from recent advances in the theoretical
understanding of Neural Networks with a view to isolating useful areas
of current research.


Each day stands alone and delegates can enrol for either one or both
days. For more details and registration information, please write to:

Dr Penelope Smith,
Industrial Liaison Officer,
RHBNC,
Egham, Surrey TW20 0EX

or email to:

john@cs.rhbnc.ac.uk


------------------------------

Subject: Symposium on Models of Human Identification and Categorization
From: Tony Marley <INAM%MUSICB.MCGILL.CA@BITNET.CC.CMU.EDU>
Date: Sun, 10 Feb 91 16:27:24 -0500

Department of Psychology
McGill University
1205 Avenue Dr Penfield
Montreal
PQ H3A 1B1
Canada

February 10, 1991

MODELS OF HUMAN IDENTIFICATION AND CATEGORIZATION


Symposium at the Twenty-Fourth Annual Mathematical Psychology
Meeting
A. A. J. Marley, Symposium Organizer

The Society for Mathematical Psychology
Sponsored by Indiana University, Bloomington, Indiana
August 10-13, 1991


At each of its Annual Meetings, the Society for Mathematical Psychology
has one or more symposia on topics of current interest. I believe that
this is an opportune time to have the proposed session since much
exciting work is being done, plus Robert Nosofsky is an organizer of the
conference at Indiana, and he has recently developed many empirical and
theoretical ideas that have encouraged others to (re)enter this area.

Each presenter in the symposium will have 20 to 30 minutes available to
them, plus there will be time scheduled for general discussion. This
meeting is a good place to present your theoretical ideas in detail,
although simulation and empirical results are naturally also welcome.
Remember, the Cognitive Science Society is meeting at the University of
Chicago August 7-10, i.e. just prior to this meeting; thus, by splitting
your material in an appropriate manner between the two meetings, you will
have an excellent forum within a period of a week to present your work in
detail.

If you are interested in participating in this symposium, please contact
me with a TITLE and ABSTRACT. I would also be interested in suggestions
of other participants with (if possible) an email address for them.

To give you a clearer idea of the kind of work that I consider of direct
relevance, I mention a few researchers and some recent papers. This list
is meant to be illustrative, so please don't be annoyed if I have omitted
your favourite work (including your own).

REFERENCES

AHA, D. W., & MCNULTY, D. (1990). Learning attribute relevance in context
in instance-based learning algorithms. In Proceedings of the Twelfth
Annual Conference of the Cognitive Science Society. Hillsdale, NJ:
Erlbaum.

ASHBY, F. G. (Ed.). (in press). Probabilistic Multidimensional Models of
Perception and Cognition. Hillsdale, NJ: Erlbaum.

ESTES, W. K., CAMPBELL, J. A., HATSOPOULOS, N., & HURWITZ, J. B. (1989).
Base-rate effects in category learning: A comparison of parallel network
and memory storage-retrieval models. Journal of Experimental Psychology:
Learning, Memory, and Cognition, 15, 556-571.

GLUCK, M. A., & BOWER, G. H. (1988). Evaluating an adaptive network model
of human learning. Journal of Memory and Language, 27, 166-195.

HURWITZ, J. B. (1990). A hidden-pattern network model of category
learning. Ph. D. Thesis, Department of Psychology, Harvard.

KRUSCHKE, J. K. (1990). ALCOVE: A connectionist model of category
learning. Research Report 19, Cognitive Science, Indiana University.

LACOUTURE, Y., & MARLEY, A. A. J. (1990). A connectionist model of choice
and reaction time in absolute identification. Manuscript, Universite
Laval & McGill University.

NOSOFSKY, R. M., & GLUCK, M. A. (1989). Adaptive networks, exemplars, and
classification rule learning. Thirtieth Annual Meeting of the Psychonomic
Society, Atlanta, Georgia.

RATCLIFF, R. (1990). Connectionist models of recognition memory:
Constraints imposed by learning and forgetting functions. Psychological
Review, 97, 285-308.

SHEPARD, R. N. (1989). A law of generalization and connectionist
learning. Plenary Session, Eleventh Annual Conference of the Cognitive
Science Society, University of Michigan, Ann Arbor.


Regards

Tony

A. A. J. Marley
Professor
Director of the McGill Cognitive Science Centre

------------------------------

Subject: Symposium and Forum Announcements
From: jannaron@charlie.ece.scarolina.edu (Prof. Robert Jannarone)
Date: Mon, 11 Feb 91 14:05:52 -0500


S Y M P O S I U M   A N N O U N C E M E N T

The 23rd Southeastern Symposium on System Theory will be held at the
University of South Carolina, Columbia, on March 10 through March 12,
1991. Contributed papers include topics in computer structures, optics,
robotics, neural networks, pattern recognition, estimation & reliability,
circuit systems, control systems, signal processing, electromagnetics,
parallel processing, communication systems, power systems, VLSI, and
computer networks. Invited speakers include Robert Korgon from Johns
Hopkins U. (A History of Industry/University/Government Links in
Technology Transfer), Charles Daniels from AT&T (Fiber-Optic Components
for LANs and Data Communications), and Richard Edwards from Westinghouse
Savannah River Laboratories (Automatic Data Analysis Prospects for
Nuclear Waste Vitrification).

_______________________________________________________________________________
_______________________________________________________________________________

F O R U M   A N N O U N C E M E N T


NEUROCOMPUTING and AUTOMATIC DATA PROCESSING
PROSPECTS for INTELLIGENT MANUFACTURING:

A Collection of Presentations from Industry,
Government, and University Scientists, with Discussion
(in conjunction with the Twenty-Third
Southeastern Symposium on System Theory)

Date: March 12, 1991.
Time: 2 PM until 9 PM.
Location: The University of South Carolina (USC) Swearingen Engineering
Center

Participants:

Russ Beckmeyer, Westinghouse Savannah River Company;
Cihan H. Dagli, University of Missouri-Rolla;
Paul Huray, USC, U.S. Office of Science and Technology Policy;
Robert J. Jannarone, USC;
Steven Kirk, Digital Equipment Corporation;
Omid Omidvar, University of the District of Columbia;
William Ranson, USC, Southeastern Manufacturing and Technology Center;
Harold Szu, U.S. Naval Research Laboratories;
Debra Wince-Smith, U.S. Department of Commerce;
Paul Werbos, U.S. National Science Foundation;
David White, McDonnell Douglas Corporation.

Registration fee: $85 (dinner included)

For further information, contact: Robert Jannarone
Department of Electrical and
Computer Engineering
University of South Carolina
Columbia, SC 29208
(803) 777-7930
(803) 777-8045 (FAX)
jannaron@ece.scarolina.edu


------------------------------

End of Neuron Digest [Volume 7 Issue 9]
***************************************
