Neuron Digest   Wednesday,  6 Jun 1990                Volume 6 : Issue 39 

Today's Topics:
Re: Neural Nets and forecasting
Introduction to some work at NASA
Symbol Train Processing
submission to net: Time-Frequency Distributions & Neural Nets
Networks for stereopsis
Recent trends of applying NNs in digital signal processing
Implementations of ART2 wanted.
ART2 Source Code
Final call HICSS
UCLA-SFINX NN Simulator


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Re: Neural Nets and forecasting
From: zt@beach.cis.ufl.edu (tang)
Organization: UF CIS Department
Date: 30 May 90 22:12:56 +0000

In article <3820@discg1.UUCP> ilo0005@discg1.UUCP (cherie homaee) writes:
>
>Has anyone used neural nets for forecasting? If so have you used any
>other neural paradigm other than back-propagation?

We have done some experiments with time series forecasting using back-
propagation. Our results show that neural nets perform well compared with
traditional methods, especially for long term forecasting. Our initial
report will appear in the proceedings of the "First Workshop on Neural
Networks, Auburn, 1990".

See also "Neural Networks as Forecasting Experts: An Empirical Test",
Proceedings of the IJCNN Meeting, Washington, 1990, by Sharda and Patil.
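The setup described in the reply above, lagged values of a series fed to a
network trained with back-propagation, can be sketched as follows. This is
an illustrative reconstruction, not the authors' code; the network size,
learning rate, and sine-wave test series are all assumptions.

```python
# Hypothetical sketch: forecast the next value of a series from its last
# few values with a one-hidden-layer network trained by back-propagation.
import math, random

random.seed(0)

def make_patterns(series, lags):
    """Turn a series into (input window, next value) training pairs."""
    return [(series[i:i + lags], series[i + lags])
            for i in range(len(series) - lags)]

def train(series, lags=4, hidden=6, epochs=1500, lr=0.1):
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    w1 = [[random.uniform(-0.5, 0.5) for _ in range(lags)]
          for _ in range(hidden)]                        # input -> hidden
    w2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]  # hidden -> output
    for _ in range(epochs):
        for x, target in make_patterns(series, lags):
            h = [sig(sum(w * xi for w, xi in zip(row, x))) for row in w1]
            y = sum(w * hi for w, hi in zip(w2, h))      # linear output unit
            err = target - y
            for j in range(hidden):                      # back-propagate
                for i in range(lags):
                    w1[j][i] += lr * err * w2[j] * h[j] * (1 - h[j]) * x[i]
                w2[j] += lr * err * h[j]
    def predict(window):
        h = [sig(sum(w * xi for w, xi in zip(row, window))) for row in w1]
        return sum(w * hi for w, hi in zip(w2, h))
    return predict

series = [math.sin(0.3 * t) for t in range(60)]   # a toy, noiseless series
predict = train(series)
one_step_error = abs(predict(series[-4:]) - math.sin(0.3 * 60))
print(one_step_error)  # small for this easy series
```

Long-term forecasts of the kind the post mentions would be produced by
feeding each prediction back in as the newest input.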

------------------------------

Subject: Introduction to some work at NASA
From: "Eric Bobinsky" <cabobin@earth.lerc.nasa.gov>
Date: 01 Jun 90 11:38:00 -0400

Salutations! I am a recent subscriber to the Digest, and I thought it
might be appropriate to briefly describe our neural network research
program to elicit a response from other readers with similar interests
and inclinations. I am with the NASA Lewis Research Center in Cleveland
(yes, Cleveland!), which is NASA's lead center for satellite
communications research. In the past two years we began a program with
the goal of trying to apply neural network technology to the problems of
enhancing the operational capabilities and service lifetimes of advanced
satellite communication systems.

To date, we have been working-- either directly or through university
grants-- in the areas of applying neural nets to advanced satellite
switching controllers, video image data compression, signal processing
(particularly high-speed demodulation and decoding), and autonomous
communication network control. In addition, our neural net program is
tied into a much broader program in the development of advanced digital
technology for high-rate modulation and coding.

I'd be pleased to hear from anyone out there working in these or similar
areas with whom we haven't already made acquaintance! My physical (as
opposed to logical) address is:

Eric Bobinsky
MS 54-8
Space Communications Division
NASA Lewis Research Center
Cleveland, Ohio 44135

Tel: 216-433-3497
FAX: 216-433-6371


------------------------------

Subject: Symbol Train Processing
From: coopere@rocky2.rockefeller.edu (Ellis D. Cooper)
Organization: The Rockefeller University, NY, NY 10021
Date: 04 Jun 90 17:49:33 +0000

Symbol Train Processing
Ellis D. Cooper
June 1, 1990

The goal of neuroscience is to understand the
fundamental principles of the brain. Communication,
analysis and simulation of mental models of the
underlying molecular, neuronal and network mechanisms
could benefit from a standardized graphical programming
language. Symbol train processing (STP) is a vivid
modeling language with the added advantage of not
presuming that neurons and other brain structures
communicate with numbers, e.g., the activation levels
of connectionism. Instead, symbol train processing
assumes that brain structures communicate at all levels
by emitting and absorbing sequences of symbols, only
some of which might be numbers. A computer program,
ChemiKine, for simulating a wide range of chemical
kinetic systems using symbol train processing is
available. For general symbol train processing,
however, the Mathematica STP Notebook provides an
object-oriented interpreter.

Most neuroscientists believe that understanding the
principles of the brain depends on developing theories
of biological phenomena occurring on spatial scales
from 0.1 meter to 1.0 thousandth of a meter in
networks of neurons connected across electro-chemical
synapses. Physically, a biological neural network is a
dynamical system whose state space has an extremely
large number of dimensions, not just because a
biological neural network has a large number of
synapses and neurons, but also because each synapse and
neuron has many characteristic electrical potential and
chemical concentration variables. Intractably complex
phenomena inevitably generate diverse inquiries based
on simplifying assumptions. Each inquiry hopes to
provide new scientific illumination or technological
applications.

One technologically fruitful model of biological neural
networks has been the connectionist network model. Its
adequacy for neuroscientific understanding is more
controversial. I am particularly interested in
assumptions relating to the character and significance
of spike trains. In connectionist networks the spike
train is reduced to a single continuous state variable,
the activation level of an abstract neuron's output.
A large corpus of neuroscience research is based on
essentially the same abstraction of a spike train.
There is also a large corpus of research in which this
assumption is rejected. In fact, complex temporal
patterns of action potentials are taken by many
researchers to define the information produced by
biological neurons.

In connectionist networks the abstract neurons are
passive, non-linear integrators of their inputs, whose
properties are determined by coefficients in linear
expressions - the weights. By contrast, biological
neurons are active units with variable operating modes,
including oscillator and resonator behavior. It is also
implicit in the connectionist model that the individual
spikes occurring in a spike train must all be
identical. Biological neurons actually produce spikes
of different shapes.
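The contrast drawn in the last two paragraphs can be made concrete. In the
connectionist abstraction an entire spike train collapses to one number,
computed by a passive integrator like the following (a generic
illustration, not any particular simulator's code; the weights and inputs
are invented):

```python
# The connectionist abstraction: a unit's activation is a fixed
# non-linearity applied to a weighted sum of its inputs.
import math

def unit_activation(weights, inputs, bias=0.0):
    """Passive non-linear integrator: squash the weighted input sum."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))   # logistic non-linearity

# whatever the upstream spike trains were, only this one number survives
a = unit_activation([0.5, -1.0, 2.0], [1.0, 0.0, 0.25])
print(a)  # 0.7310585786... a single continuous activation level in (0, 1)
```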

The choice of simplifying assumptions to model a real
system must be governed by criteria of verisimilitude,
mathematical tractability, and computability. It can
happen that it is expedient to give the latter two
criteria greater emphasis at the expense of the first.
This leads to arguments against using such abstract
models in biology, but the successful use of ideal
models in physics cannot be ignored. My purpose is to
advance a new system of simplifying assumptions for
model building in neuroscience which attempts to
provide a superior balance between the three
aforementioned criteria.

STP units for building models communicate by emitting and
absorbing formal symbols which can stand for spikes of
different shapes, or for changes in levels of hormones,
or for changes in other biologically meaningful state
variables such as voltage across a membrane or current
through a channel.

STP units sum their simultaneous input signals and
attempt to match the instantaneous sum against built-in
state transition trigger symbols.

STP units have intrinsic timing properties which endow
them with oscillatory and resonance properties.

STP units undergo both automatic and triggered
transitions of state which may radically alter their
signal processing properties.

STP concepts were chosen specifically to apply not just
at the neural network level, but also at the higher
speed, smaller spatial scale of ion channels and
molecular biochemistry, and at the lower speed, larger
spatial scale of neuronal groups and clusters of
groups.

The timers of STP units are easily set to random
timeouts, providing a single mechanism for modeling
temperature at the chemical kinetics level or
stochastic firing rates at the neuronal level.
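A minimal rendering of the trigger-matching behavior described above: a
unit absorbs symbols, matches them against per-state trigger symbols, and
changes state (possibly emitting a symbol of its own) when a trigger
fires. The states, symbols, and transition table below are invented for
illustration; they do not come from ChemiKine or the Mathematica STP
Notebook.

```python
# Toy STP-style unit: a symbolic state machine that absorbs and emits
# formal symbols rather than numbers.
class STPUnit:
    def __init__(self, transitions, state):
        # transitions: {(state, trigger symbol): (next state, emitted symbol)}
        self.transitions = transitions
        self.state = state

    def absorb(self, symbol):
        """Match an incoming symbol against the current state's triggers."""
        key = (self.state, symbol)
        if key in self.transitions:
            self.state, emitted = self.transitions[key]
            return emitted        # the unit emits a symbol of its own
        return None               # no trigger matched; state unchanged

# a unit that must be primed by one "excite" before a second "excite"
# makes it fire; unmatched symbols (here "inhibit") are simply ignored
unit = STPUnit({("rest", "excite"): ("primed", None),
                ("primed", "excite"): ("rest", "spike")}, state="rest")
out = [unit.absorb(s) for s in ["excite", "inhibit", "excite", "excite"]]
print(out)  # [None, None, 'spike', None]
```

Automatic (timed) transitions would add a timer that forces a transition
when it expires, which is where the random timeouts mentioned above come in.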

Computational neuroscience assumes that biological
neural networks implement algorithms for processing
information. I believe there is a theoretical need in
neuroscience for a computer tool with which to simulate
the brain's algorithms for symbol train processing at
all time scales.

------------------------------

Subject: submission to net: Time-Frequency Distributions & Neural Nets
From: Don Malkoff <dmalkoff@ANDREW.dnet.ge.com>
Date: Mon, 04 Jun 90 16:18:31 -0400

I am writing a review on the use of time-frequency distributions of
signals as inputs to classification algorithms. The review will appear
in a book "New Methods in Time-Frequency Signal Analysis" to be published
by Longman & Cheshire.

I am particularly (but not solely) interested in schemes where the
classification mechanism is that of a neural network.

I would appreciate any inputs from the net as to appropriate references.
All applications are relevant. I would like to see this review be
comprehensive and adequately represent the contributions of neural nets.
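As a point of reference, the simplest time-frequency distribution a
classifier might consume is the short-time Fourier magnitude
(spectrogram). A stdlib-only sketch; the window length and hop size are
arbitrary choices, and no tapering window is applied for brevity.

```python
# Spectrogram sketch: per-frame DFT magnitudes of a sliding window.
import cmath, math

def spectrogram(signal, win=16, hop=8):
    """Return a list of per-frame magnitude spectra (win//2+1 bins each)."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        # plain DFT of one frame, keeping only non-negative frequencies
        spec = [abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win)
                        for n in range(win)))
                for k in range(win // 2 + 1)]
        frames.append(spec)
    return frames

# a pure tone at bin 2 of a 16-point frame concentrates energy in that bin
tone = [math.cos(2 * math.pi * 2 * t / 16) for t in range(64)]
frames = spectrogram(tone)
peak_bin = max(range(len(frames[0])), key=lambda k: frames[0][k])
print(peak_bin)  # 2
```

The resulting time-by-frequency array of magnitudes is the kind of input
the review's classification schemes would operate on.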

Please reply to "dmalkoff@atl.dnet.ge.com"
____________________________________
Donald B. Malkoff
General Electric Company
Advanced Technology Laboratories
Moorestown Corporate Center
Bldg. 145-2, Route 38
Moorestown, N.J. 08057
(609) 866-6516


------------------------------

Subject: Networks for stereopsis
From: WOLPERT@VAX.OXFORD.AC.UK
Organization: Physiology Department, Oxford University, UK
Date: Tue, 05 Jun 90 17:33:49 +0000

I am interested in any pointers to current research/literature on
neural networks for stereopsis. In particular any references to networks
that solve random dot stereograms.
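For readers new to the topic, a random dot stereogram pairs two identical
random images except that a central patch is shifted horizontally in one
of them, so the patch's depth is visible only through binocular matching.
A crude generator follows; the sizes and disparity are arbitrary, and a
proper stereogram would also fill the uncovered strip with fresh random
dots, omitted here for brevity.

```python
# Crude random-dot stereogram generator (illustrative only).
import random

def random_dot_stereogram(w=20, h=20, patch=8, disparity=2, seed=1):
    rng = random.Random(seed)
    left = [[rng.randint(0, 1) for _ in range(w)] for _ in range(h)]
    right = [row[:] for row in left]          # start with identical images
    top, lft = (h - patch) // 2, (w - patch) // 2
    for r in range(top, top + patch):         # shift the central patch
        for c in range(lft, lft + patch):
            right[r][c - disparity] = left[r][c]
    return left, right

left, right = random_dot_stereogram()
# monocularly both images look random; they differ only near the patch
diff = sum(left[r][c] != right[r][c] for r in range(20) for c in range(20))
print(diff)
```

A network solving such stereograms must recover the shifted patch from the
two images alone, with no monocular cues.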

Thanks in advance

Piers Cornelissen.

Reply to STEIN@UK.AC.OXFORD.VAX

------------------------------

Subject: Recent trends of applying NNs in digital signal processing
From: Hazem.Abbas@QueensU.CA
Date: Tue, 05 Jun 90 14:07:00 -0400

Is anybody involved in the applications of neural networks in the area
of digital signal processing (filter realization, adaptive filtering,
image enhancement, restoration and compression)? I would appreciate
getting acquainted with the relevant topics and bibliography as well.
Actually I need this in the process of finding a research topic for my
Ph.D.
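One concrete bridge between the two fields asked about above: the LMS
adaptive filter (Widrow-Hoff) is mathematically the delta rule used to
train a single linear neuron. A minimal system-identification sketch; the
unknown filter coefficients and step size below are arbitrary choices.

```python
# LMS adaptive filter identifying an unknown FIR system.
import random

random.seed(0)
unknown = [0.6, -0.3, 0.1]            # FIR system to be identified
taps = 3
w = [0.0] * taps                      # adaptive weights (a linear neuron)
mu = 0.05                             # LMS step size

x_hist = [0.0] * taps
for _ in range(5000):
    x = random.uniform(-1, 1)
    x_hist = [x] + x_hist[:-1]        # delay line of recent inputs
    d = sum(h * xi for h, xi in zip(unknown, x_hist))   # desired output
    y = sum(wi * xi for wi, xi in zip(w, x_hist))       # filter output
    e = d - y
    w = [wi + mu * e * xi for wi, xi in zip(w, x_hist)] # delta-rule update

print([round(wi, 2) for wi in w])  # close to the unknown coefficients
```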

------------------------------

Subject: Implementations of ART2 wanted.
From: RM5I%DLRVM.BITNET@CUNYVM.CUNY.EDU
Date: Tue, 05 Jun 90 17:09:31 -0500

Hello,

Does someone have an implementation of ART2 written in a common language
like Pascal or C?

Thanks for any help finding this.


Regards Roland Luettgens

German Aerospace Research Establishment
8031 Wessling
West Germany

rm5i@dlrvm Bitnet

------------------------------

Subject: ART2 Source Code
From: <GANKW%NUSDISCS.BITNET@CUNYVM.CUNY.EDU>
Date: Wed, 06 Jun 90 17:53:00 -0800

I discovered recently that the Adaptive Resonance Theory (ART) proposed
by Carpenter & Grossberg is similar in its operations to the traditional
MacQueen's k-means clustering method with coarsening and refining
parameters (see ref 1). I intend to make a comparative study of these two
methods.

Is there anybody who can share with me his/her ART2 source code, or
inform me how to get a copy of it? (ART1 is not suitable because my test
data are real number vectors.) I am most willing to release my findings
to the network once I get the results.
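The k-means side of the comparison above is easy to prototype in
MacQueen's original sequential form: each incoming vector is assigned to
the nearest centroid, which then moves toward it by a running-average
step. The data below are made up for illustration; this is not an ART2
implementation.

```python
# MacQueen-style sequential k-means (online running-average updates).
def macqueen_kmeans(points, k):
    centroids = [list(p) for p in points[:k]]   # seed with first k points
    counts = [1] * k
    for p in points[k:]:
        # nearest centroid by squared Euclidean distance
        j = min(range(k),
                key=lambda i: sum((a - b) ** 2
                                  for a, b in zip(centroids[i], p)))
        counts[j] += 1
        # running-average update: centroid is the mean of its points so far
        centroids[j] = [c + (x - c) / counts[j]
                        for c, x in zip(centroids[j], p)]
    return centroids

data = [(0.1, 0.0), (5.0, 5.1), (0.0, 0.2),
        (0.2, 0.1), (5.2, 4.9), (4.9, 5.0)]
cents = macqueen_kmeans(data, 2)
print(sorted(round(c[0]) for c in cents))  # [0, 5]: one centroid per cluster
```

The ART-style coarsening/refining parameters mentioned in the post would
correspond here to a vigilance test deciding whether a vector is close
enough to any centroid or should start a new one.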

Thanks in advance.

Reference

1. Anderberg, Cluster Analysis for Applications, Academic Press 1973.

My bitnet address is : gankw@nusdiscs.bitnet

Kok Wee Gan

------------------------------

Subject: Final call HICSS
From: Okan K Ersoy <ersoy@ee.ecn.purdue.edu>
Date: Mon, 04 Jun 90 13:28:45 -0500

FINAL CALL FOR PAPERS AND REFEREES
HAWAII INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES - 24
NEURAL NETWORKS AND RELATED EMERGING TECHNOLOGIES
HAWAII - JANUARY 9-11, 1991

The Neural Networks Track of HICSS-24 will contain a special set of
papers focusing on a broad selection of topics in the area of Neural
Networks and Related Emerging Technologies. The presentations will
provide a forum to discuss new advances in learning theory, associative
memory, self-organization, architectures, implementations and
applications.

Papers are invited that may be theoretical, conceptual, tutorial or
descriptive in nature. Those papers selected for presentation will
appear in the Conference Proceedings which is published by the Computer
Society of the IEEE. HICSS-24 is sponsored by the University of Hawaii
in cooperation with the ACM, the Computer Society, and the Pacific
Research Institute for Information Systems and Management (PRIISM).

Submissions are solicited in:

Supervised and Unsupervised Learning
Issues of Complexity and Scaling
Associative Memory
Self-Organization
Architectures
Optical, Electronic and Other Novel Implementations
Optimization
Signal/Image Processing and Understanding
Novel Applications

INSTRUCTIONS FOR SUBMITTING PAPERS

Manuscripts should be 22-26 typewritten, double-spaced pages in length.
Do not send submissions that are significantly shorter or longer than
this. Papers must not have been previously presented or published, nor
currently submitted for journal publication. Each manuscript will be put
through a rigorous refereeing process. Manuscripts should have a title
page that includes the title of the paper, full name of its author(s),
affiliation(s), complete physical and electronic address(es), telephone
number(s) and a 300-word abstract of the paper.

DEADLINES

Six copies of the manuscript are due by June 25, 1990.
Notification of accepted papers by September 1, 1990.
Accepted manuscripts, camera-ready, are due by October 3, 1990.

SEND SUBMISSIONS AND QUESTIONS TO

O. K. Ersoy
Purdue University
School of Electrical Engineering
W. Lafayette, IN 47907
(317) 494-6162
E-Mail: ersoy@ee.ecn.purdue.edu

------------------------------

Subject: UCLA-SFINX NN Simulator
From: Edmond Mesrobian <edmond@CS.UCLA.EDU>
Date: Mon, 04 Jun 90 13:25:47 -0700

Recently, there was a posting concerning SFINX. The information was a bit
incorrect. To obtain the simulator, one must first sign a license
agreement. FTP instructions will then be sent to the licensee. More
information concerning the simulator is presented below.

hope this helps,
Edmond Mesrobian
UCLA Machine Perception Lab
3531 Boelter Hall
Los Angeles, CA 90024

============================================================================


UCLA-SFINX (Structure and Function In Neural connections) is an
interactive neural network simulation environment designed to provide
the investigative tools for studying the behavior of various neural
structures. It was designed to easily express and simulate the highly
regular patterns often found in large networks, but it is also general
enough to model parallel systems of arbitrary interconnectivity.

UCLA-SFINX is not based on any single neural network paradigm such as
Backward Error Propagation (BEP) but rather enables users to simulate a
wide variety of neural network models. UCLA-SFINX has been used to
simulate neural networks for the segmentation of images using textural
cues, architectures for color and lightness constancy, script character
recognition using BEP, and others.

It is all written in C, includes an X11 interface for visualizing
simulation results (8 bit displays), and it has been ported to HP 9000
320/350 workstations running HP-UX, Sun workstations running SunOS 3.5,
IBM RT workstations running BSD 4.3, Ardent Titan workstations running
Ardent UNIX Release 2.0, and VAX 8200's running Ultrix 2.2-1. To get
UCLA-SFINX source code and documentation (in LaTeX format) follow the
instructions below:


1. To obtain UCLA-SFINX via the Internet:

Sign and return the enclosed UCLA-SFINX License Agreement to
the address below. We will send you a copy of the signed
license agreement along with instructions on how to FTP a
copy of UCLA-SFINX. If you have a PostScript printer, you
should be able to produce your own copy of the manual. If
you wish to obtain a hardcopy of the manual, return a check
for $30 along with the license.

2. To obtain UCLA-SFINX on tape:

Sign and return the enclosed UCLA-SFINX License Agreement to
the address below. Return a check for $100 along
with the license, for a hardcopy of the manual and a copy of
UCLA-SFINX on 1/4 inch cartridge tape (in tar format) readable
by a Sun 3 workstation. We will also send you a copy
of the signed license agreement.

Checks should be made payable to the Regents of the University
of California. If you have questions regarding any of the
information discussed above send electronic mail to
sfinx@retina.cs.ucla.edu or US mail to: UCLA Machine Perception
Laboratory, Computer Science Department, 3532 Boelter Hall, Los
Angeles, CA. 90024, USA.


>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cut here for license <<<<<<<<<<<<<<<<<<<<<<<<<




THE REGENTS OF THE UNIVERSITY OF CALIFORNIA

LOS ANGELES CAMPUS

UCLA-SFINX LICENSE AGREEMENT


This Agreement is entered into this___________of
____________________, 199__, by and between THE REGENTS OF THE
UNIVERSITY OF CALIFORNIA, a California corporation, hereinafter
called "University", and ________________________________________
_____________________________________, hereinafter called
"Licensee." This Agreement is made with reference to the
following:


1. DEFINITION

"UCLA-SFINX" is a set of copyrighted, source code computer
programs and any future modifications thereof delivered by
University to Licensee, and any accompanying documentation
provided by University. UCLA-SFINX is a general purpose
software system for the development and evaluation of
connectionist models. UCLA-SFINX is written for and operates
on UNIX systems.


2. GRANT OF RIGHTS


A. University grants to Licensee and Licensee accepts a
non-exclusive, non-transferable license to use UCLA-
SFINX solely for Licensee's non-commercial purposes.

B. Such use may include the making of sufficient copies of
UCLA-SFINX for the reasonable purposes of Licensee
hereunder. All copies of UCLA-SFINX made by Licensee,
in whole or in part, regardless of the form in which
the Licensee may subsequently use it, and regardless of
any modification which the Licensee may subsequently
make to it are the property of University and no title
to or ownership of such materials are transferred to
Licensee hereunder. Licensee shall include on any such
copies labels containing the name UCLA-SFINX, the
University's copyright notice, and any other
proprietary or restrictive notices appearing on the label
of the copy of UCLA-SFINX furnished to Licensee by
University.

C. Such use may include the modification of UCLA-SFINX by
Licensee. Such modified versions of UCLA-SFINX shall
remain the property of University.

D. Such use shall not include further distribution, or any
action which may be construed as selling or licensing
UCLA-SFINX to any person or entity.


3. ACKNOWLEDGMENT

A. Licensee acknowledges that UCLA-SFINX has been
developed for research purposes only.

B. Licensee shall require its employees and students to
acknowledge in writing their use of UCLA-SFINX when
reporting any research resulting from such use. The
following notice should be used: "UCLA-SFINX from UCLA
MACHINE PERCEPTION LABORATORY."



4. WARRANTIES AND INDEMNIFICATION

A. University warrants that it is the owner of all right,
title, and interest in and to UCLA-SFINX, including all
copyright pertaining thereto and subsisting therein.

B. UCLA-SFINX is licensed "AS IS," and University
disclaims all warranties, express and implied, including
but not limited to, the implied warranties of
merchantability and fitness for a particular purpose.
In no event will University be liable for any business
expense, machine down time, loss of profits, any
incidental, special, exemplary or consequential damages,
or any claims or demands brought against Licensee. The
entire risk as to the quality and performance of
UCLA-SFINX is with Licensee.

C. Licensee agrees to indemnify, hold harmless, and
defend University, its officers, employees and
agents, against any and all claims, suits, losses,
damages, costs, fees, and expenses resulting from or
arising out of any use of UCLA-SFINX by Licensee.


5. TECHNICAL SUPPORT AND FEEDBACK

A. University shall have no obligation to install,
support, maintain, or correct any defects in UCLA-SFINX.

B. Licensee agrees to notify University of any errors,
functional problems, and any defects in performance
discovered in UCLA-SFINX and of any fixes made by
Licensee. Such notice will contain a full description
of the problem, indicating in what circumstances it
originated, and how it manifested itself. Technical
matters and errors discovered in UCLA-SFINX may be
communicated as provided in Article 9 below or via
electronic mail to: sfinx@retina.cs.ucla.edu.


6. TERM AND TERMINATION

A. The term of this Agreement is perpetual and shall be
effective from the date of its signing by a duly
authorized official of Licensee.

B. Any failure of Licensee to comply with all terms and
conditions of this Agreement shall result in its
immediate termination.


7. SEVERABILITY

If any of the provisions or portions of this Agreement are
invalid under any applicable statute or rule of law, they
are to the extent of such invalidity severable and shall not
affect any other provision of this Agreement.


8. APPLICABLE LAW

This Agreement shall be governed by the laws of the State of
California.


9. NOTICE

A. Any notice under this Agreement shall be in writing and
mailed to the appropriate address given below:

To University regarding this Agreement:

The Regents of the University of California
Office of Contract and Grant Administration
University of California, Los Angeles
405 Hilgard Avenue
Los Angeles, California 90024-1406

Attention: Dr. Enrique Riveros-Schafer

B. To University regarding technical matters:

UCLA Machine Perception Laboratory
3532 Boelter Hall
Computer Science Department
Los Angeles, California 90024

Attention: Prof. Josef Skrzypek


C. To Licensee:

____________________________________________________
____________________________________________________
____________________________________________________
____________________________________________________
____________________________________________________


10. ENTIRETY

This Agreement supersedes any previous communication and,
when signed by both parties, constitutes the complete
understanding of the parties. No modification or waiver of any
provisions hereof shall be valid unless in writing and
signed by both parties.

IN WITNESS THEREOF, the parties hereto have caused this
Agreement to be executed.

LICENSEE THE REGENTS OF THE UNIVERSITY
OF CALIFORNIA

By: _______________________________ By: ______________________________
NAME: _______________________________ Wade A. Bunting, Ph.D.
Title: _______________________________ Intellectual Property Officer
Date: _______________________________ Date: ____________________________






>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cut here for license <<<<<<<<<<<<<<<<<<<<<<<<<

------------------------------

End of Neuron Digest [Volume 6 Issue 39]
****************************************
