Neuron Digest Friday, 7 May 1993 Volume 11 : Issue 31
Today's Topics:
MUME version 0.6 is available
Research Posts
Collaboration for financial modeling?
Richmond, VA Area ANN SIG
Subgraph Isomorphism with Hopfield nets anyone?
Brain usage
brain usage
Re: Re: Neuron Digest V11 #25 (software, jobs, discussion, etc.)
Neural Network Model for Kindling of Epilepsy
Introduction of a newcomer
some recent papers which may be of interest to you.
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: MUME version 0.6 is available
From: Multi-Module Environment <mm@sedal.sedal.su.OZ.AU>
Date: Sat, 24 Apr 93 10:57:50 -0500
MUME 0.6 IS NOW AVAILABLE
The Multi-Module Neural Computing Environment (MUME) version 0.6 (sources
only) is now available.
MUME-0.6 compiles on a variety of Unix machines, as well as on the Fujitsu
VP2200 and on PCs (MSDOS 5.0 or higher, using DJGCC).
HOW TO GET IT
-------------
It can be acquired by fetching the licence file:
file: license.ps (PostScript file)
machine: 129.78.13.39
directory: /pub
login: anonymous
password: your email address
having it signed by an authorised person, and then sending or faxing it to:
MUME 0.6
SEDAL
Sydney University Electrical Engineering
NSW 2006 Australia
Fax: (+61-2) 660-1228
The machine/account/passwd where you can ftp the sources will then be
communicated to you.
*** PLEASE DO NOT FORGET TO WRITE YOUR EMAIL CONTACT ON THE FAXED LICENSE ***
PC USERS
--------
If you don't have the DJGCC compiler, you can write to the address above
with a signed license and a cheque for A$150 (for media/doc/postage) and
we will forward the software and binaries to you. Do not forget to clearly
specify the media (3.5" or 5.25") and your surface mail address. Note that MUME
compiled under DJGCC will not run under Microsoft Windows.
MAILING LIST
------------
Once you have the software, you can ask to be included on a mailing list by
sending your email address to
mume-request@sedal.su.oz.au
MORE INFO ABOUT MUME OR CHANGES
-------------------------------
If you don't know what MUME is, you can fetch the file /pub/mume-overview.ps.Z
from 129.78.13.39 (login as anonymous).
Otherwise here is a copy of the CHANGES file (from version 0.5 to 0.6):
o A detailed basic tutorial has been written (directory tutes/tut0)
o To simplify interconnection statements between nets, MUME
now generates default "iface"s. For example, for an MLP called
john, MUME automatically generates the interfaces john.in (input
layer) and john.out (output layer). This applies to most nets.
The interconnection semantics have been simplified
even further by introducing a "base" index, which simplifies
neuron references. All information about interfaces is now
described in a separate manual page called IFACE.5.
o The configuration files can make use of symbols which can be set
in the file or on the command line of the front-end program (see
man pages SYMBOLS.5 and MMN.5). Some nets now also define their
own symbols (eg. "mlp" net).
o The specification of the neuron index for the "nfun" keyword has been
enhanced to allow easier indexing (see NET.5).
o All front ends now default to a "test" mode. To train, the switch
"-train" is required.
o Data reading routines of ENV net have been optimised
o Data normalisation statements in ENV have been modified (see man
pages ENV.5 and NORM.5)
o MUME now supports the use of a validation set during training. The
main purpose of a validation set is to prevent overtraining, as the
error on both the training and validation sets can be tracked as
training progresses.
To use the validation set, set the optional "Validate" flag in the
system definition section to 1 (using the statement "Validate 1;")
and specify a validation data set in all ENV modules (using
the statement "data Validate <FileName>;"). The error on the
validation set will now be logged as a 3rd column along with the
epoch number and training error.
See the ENV.5 and MMN.5 man pages for more information.
o MUME now catches more system signals when possible and exits
after saving the net states upon receiving them.
The signals are: SIGINT, SIGTERM, SIGXCPU, and SIGXFSZ.
o Logging output is more consistent under the control of the
"-verbose" switch.
o the following learning algorithms have been added:
stochastic error descent for limited precision training
reinforcement learning
conjugate gradient
simplex based methods
o the following net classes have been added:
resource allocation nets (class RAN)
NeTtalk postprocessing module (class N2K)
o the class RBPTT has been renamed WPANG.
o the WZ class (continuously running recurrent net) now has
what are called pins, which have zero propagation delay
o and of course, many bugs were fixed.
The behaviour of the learning algorithms has not changed between 0.5 and
0.6. All configuration files should still run under 0.6, except for
normalisation statements in the ENV class. We are sure that the new
statements make declarations much easier.
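The validation-set facility described in the CHANGES list above is a standard early-stopping technique. The sketch below (plain Python, not MUME code; the one-parameter toy model, data and stopping rule are invented purely for illustration) shows the bookkeeping involved: log epoch number, training error and validation error each epoch, and stop once the validation error turns upward, which signals overtraining.

```python
# Early-stopping sketch: track training and validation error per epoch
# and stop when validation error starts rising. The "model" is a single
# parameter fit by gradient descent, standing in for a neural net.

def train_with_validation(train, val, lr=0.1, max_epochs=100):
    w = 0.0                 # single weight; model predicts y = w * x
    log = []                # rows of (epoch, train_error, val_error)
    best_w, best_val = w, float("inf")
    for epoch in range(max_epochs):
        # one gradient step on mean squared error over the training set
        grad = sum(2 * (w * x - y) * x for x, y in train) / len(train)
        w -= lr * grad
        tr_err = sum((w * x - y) ** 2 for x, y in train) / len(train)
        va_err = sum((w * x - y) ** 2 for x, y in val) / len(val)
        log.append((epoch, tr_err, va_err))
        if va_err < best_val:
            best_w, best_val = w, va_err      # still improving
        elif va_err > best_val * 1.05:        # validation error rising
            break                             # stop: overtraining
    return best_w, log

train_set = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
val_set = [(1.5, 3.0), (2.5, 5.1)]
w, log = train_with_validation(train_set, val_set)
```

The three-column log mirrors the format MUME describes (epoch number, training error, validation error); returning the weights with the lowest validation error, rather than the final ones, is the usual companion to this bookkeeping.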
mume-request@sedal.su.OZ.AU
------------------------------
Subject: Research Posts
From: Jon Shapiro <jls@computer-science.manchester.ac.uk>
Date: Wed, 28 Apr 93 14:04:04 +0000
Two Research Posts
Department of Psychology, University of Lancaster
Department of Computer Science, University of Manchester
Short-term memory for verbal sequences: psychological experiments
and connectionist modelling.
Applications are invited for two posts on a research project investigating
short-term memory mechanisms for processing verbal information. The first
post is for a postdoctoral researcher to work with Dr. Jonathan Shapiro
at Manchester on connectionist modelling and analysis. The second post is
for a graduate researcher to assist Professor Graham Hitch with the
psychological experiments.
Both posts can begin as soon as possible and run through December 1995.
Applicants for the modelling post should have expertise in computational
and mathematical aspects of connectionism and a Ph.D. in a relevant subject.
The salary range is 15,221 - 16,629 U.K. pounds based on age and experience.
Applicants for the experimental post should have a first degree in psychology,
and interests in memory and cognition. The salary range is 13,632 - 15,221 U.K.
pounds. To apply for either post, send a curriculum vitae and the names and
addresses of two professional referees to the address below.
Closing Date: 14th of May.
Jonathan Shapiro
Computer Science Dept
University of Manchester
Oxford Road
Manchester M13 9PL
United Kingdom.
Phone: 44-(0)61 275 6253
Fax: 44-(0)61 275 6236
E-mail: jls@cs.man.ac.uk
------------------------------
Subject: Collaboration for financial modeling?
From: MARTIN DUDZIAK <DUDZIAK@vms.cis.pitt.edu>
Date: Thu, 29 Apr 93 23:40:00 -0500
Announcement
Please Circulate to Faculty and Researchers in Physics, Computer Science,
Economics, Mathematics, Business Schools
There are some interesting opportunities for university/industry
collaboration in the area of applying non-linear dynamical systems,
including but not limited to connectionist networks, to problems in
financial modeling within the banking sector.
The projects for which I am responsible include modeling of
fixed-income securities, forecasting of demand deposit accounts,
improvements to traditional duration/market value models, forecasting of
select futures/commodities, and analysis of interest rate risks. We are
somewhat less interested in the traditional stock-price forecasting
applications for which there has been so much visibility of late and are
more interested in the use of techniques such as NNs, fuzzy logic, chaos
models and genetic algorithms for the identification and discrimination
of useful features and parameters that can be applied to forecasting some
rather indeterminate quantities.
Our work is not a short-order task and we are looking for an academic
partner with whom we might establish a long-term partnership. There are
plenty of precedents for good collaborations of this sort on which we
might model our joint effort. We envision building a set of tools that
can be refined in the course of studying one or two particular problems
from the above set and which as a toolset could be made available for use
on other problems besides those of interest to those in finance and
economics.
Our problems are such that they are both real-world, down-to-earth
applications in need of practical solutions and research-worthy in that
not too much has been done in some of these topics. There is good thesis
material, and good opportunity for breaking new ground. For instance, I
see strong parallels between the currency balances and flows that we are
trying to model and fluid dynamics... Cellular automata and parallel
processing are two other methods that could be applied to some of these
problems - it isn't just a matter of time series prediction (not to
imply that the latter is trivial).
A prior research track in the financial area is not as critical as an
ability to work together on some practical problems (but without losing
the spirit of scientific exploration and innovation). Being able to
cooperatively show results and demonstrations of practical success is
going to be very important.
This is an introductory announcement of interest on our part and I
welcome responses from individuals and groups that feel they have a
common interest and something to contribute. I believe that a highly
synergetic and mutually rewarding partnership is possible with the right
team.
For further information, please contact:
Martin Dudziak
(804) 782-5708
FAX (804) 782-5100
dudziak@vms.cis.pitt.edu
------------------------------
Subject: Richmond, VA Area ANN SIG
From: MARTIN DUDZIAK <DUDZIAK@vms.cis.pitt.edu>
Date: Thu, 29 Apr 93 23:40:00 -0500
ANNOUNCEMENT
Richmond VA Area SIG on NNs and NLDs Forming
An informal special interest group on neural and non-linear dynamical
systems is forming, oriented to people in the Richmond and east/central
Virginia area. There is currently no organizational affiliation and what
directions this group takes depends upon the response and participation
of interested persons.
The intent here is to bring together people who may otherwise be somewhat
isolated by virtue of working in diverse industries or centers, academic
or corporate, and who could benefit by reason of verbal and electronic
exchanges. My initial sense is that there is more going on in this part
of the country and state than appears at first glance, and while Richmond
may seem close to the Baltimore/Washington D.C. technology arena, it
really isn't.
If you are interested, please contact me and we will see where this goes.
One idea is to have a monthly seminar with an invited talk; convenient
facilities can be easily provided.
Martin Dudziak
Crestar Bank
(804) 782-5708
dudziak@vms.cis.pitt.edu
------------------------------
Subject: Subgraph Isomorphism with Hopfield nets anyone?
From: chen@kuri.ces.kyutech.ac.jp (Chen Ke)
Date: Fri, 30 Apr 93 18:15:35 +0800
Hi, everybody.
Has anyone ever tried to solve the subgraph isomorphism problem
with Hopfield Nets?
This problem is formulated as follows:
Given: a graph G and a fixed set SG of subgraphs.
Question:
Can the vertices of G be partitioned into disjoint sets S1, S2, ..., Sn
such that the subgraph induced by each set Si is isomorphic to a
graph in SG?
I would appreciate any pointers to or information about this problem.
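For what it's worth, a standard way to put such matching problems on a Hopfield net is the Hopfield-Tank style quadratic energy used for TSP: a binary unit V[a][i] means "pattern vertex a is assigned to graph vertex i", and the energy penalizes non-one-to-one assignments and pattern edges that land on non-edges of G. The sketch below (Python; the toy graphs, penalty weights A, B, C and the simple asynchronous descent are illustrative choices, not taken from any particular paper) shows the encoding for embedding a single subgraph:

```python
# Hopfield-style energy for embedding a pattern graph P into a graph G.
# Binary unit V[a][i] = 1 means "vertex a of P is mapped to vertex i of G".
# E = 0 exactly when the assignment is an isomorphic embedding of P in G.

def energy(V, P_edges, G_edges, nP, nG, A=2.0, B=2.0, C=1.0):
    E = 0.0
    for a in range(nP):          # each P vertex maps to exactly one G vertex
        E += A * (sum(V[a][i] for i in range(nG)) - 1) ** 2
    for i in range(nG):          # each G vertex receives at most one P vertex
        s = sum(V[a][i] for a in range(nP))
        E += B * s * (s - 1) / 2
    for (a, b) in P_edges:       # edges of P must land on edges of G
        for i in range(nG):
            for j in range(nG):
                if i != j and (i, j) not in G_edges and (j, i) not in G_edges:
                    E += C * V[a][i] * V[b][j]
    return E

def settle(V, P_edges, G_edges, nP, nG, sweeps=20):
    # asynchronous descent: flip any single unit whose flip lowers the energy
    for _ in range(sweeps):
        changed = False
        for a in range(nP):
            for i in range(nG):
                before = energy(V, P_edges, G_edges, nP, nG)
                V[a][i] ^= 1
                if energy(V, P_edges, G_edges, nP, nG) >= before:
                    V[a][i] ^= 1     # revert: no improvement
                else:
                    changed = True
        if not changed:
            break
    return V

# toy instance: embed a triangle into a triangle with one pendant vertex
P_edges = {(0, 1), (1, 2), (0, 2)}
G_edges = {(0, 1), (1, 2), (0, 2), (2, 3)}
V = settle([[0] * 4 for _ in range(3)], P_edges, G_edges, 3, 4)
```

As with the TSP formulation, plain descent tends to get stuck in local minima on larger instances, so annealing or mean-field variants are usually needed; on this toy instance the descent does reach a zero-energy (valid) embedding.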
Regards,
Ke Chen, chen@kuri.ces.kyutech.ac.jp
------------------------------
Subject: Brain usage
From: Patrick Martin <patrick@brahma.anatomy.ucl.ac.uk>
Date: Fri, 30 Apr 93 10:18:58 +0000
Cristina Sarantino mentioned a fellow at Oxford who had hydrocephalus.
His brain was reduced to a thin film applied to the inside of his skull.
Nevertheless he was of average or above average intelligence.
Perhaps this man is an example of a human lesion study in the style of
Lashley. Systems were knocked out in a non-specific way, so other
systems were there to pick up the slack and the whole organism moved
along unimpeded.
Perhaps there is extra brain provided by all-wise evolution for the day
when a predator claws your head open. We only use a fraction of our
brain, the rest is for emergencies.
Patrick
------------------------------
Subject: brain usage
From: cera@cortex.health.ufl.edu
Date: Fri, 30 Apr 93 09:36:09 -0500
> Conclusion: Ten percent seems about right for intellectually active
> people, one percent for the intellectually lazy. Whether the brain
> could hold more is moot since the input channel is so limited. If it
> is true
Please correct me if I have read this wrong, but it seems that you should
say "10% of the brain can be devoted to thought". To conclude that
evolution would create something like a brain and then make it only 10%
efficient is not viable. The brain makes enormous demands on
the body: it accounts for 20% of total oxygen consumption at rest and 15% of
the cardiac output, yet represents approximately 3% of total body
weight. With such demands, how could the body suffer it to be 10%
efficient?
You are neglecting several other functions that the brain performs. You
mention sensory input as being a limiting factor in learning (something
that you accept as fact, which I doubt), but you do not allot any
brain capacity for this. What about movement? There has to be brain
function set aside for movement. What about brain capacity for memory?
> that the more you know, the more you can learn, and if knowledge is
> "continuous" and "everywhere dense," there is no meaningful limit on
> brain capacity.
I am just beginning to learn about neural networks, and the first one
that I looked at could NOT store numbers more compactly than if they were
stored as a binary representation. From this first experience I would
insist that there is a limit to any neural network.
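For at least one standard model, the limit the writer ran into is well characterized: a binary Hopfield net with Hebbian (outer-product) storage reliably recalls only about 0.14N random patterns for N neurons, so its storage is indeed far less compact than a binary code. A small sketch of the textbook model, operating well inside that capacity (this is the standard Hopfield net, not necessarily the network the writer examined):

```python
# Standard binary Hopfield net with Hebbian (outer-product) storage.
# Capacity is bounded: recall degrades once the number of stored random
# patterns exceeds roughly 0.14 * N. Here we stay well below that limit,
# so a corrupted pattern is cleaned up to the stored original.
import numpy as np

rng = np.random.default_rng(0)
N = 200                                         # neurons
patterns = rng.choice([-1, 1], size=(3, N))     # 3 random +-1 patterns

# Hebb rule: W = (1/N) * sum_mu p_mu p_mu^T, with no self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(x, steps=5):
    # synchronous sign updates; converges quickly near a stored pattern
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x

# corrupt the first pattern in 10 of 200 positions, then recover it
probe = patterns[0].copy()
probe[:10] *= -1
recovered = recall(probe)
```

With 3 patterns in 200 neurons the crosstalk between patterns is tiny, so the stored patterns are stable fixed points; pushing toward 28 or so patterns (0.14 * 200) is where recall would start to fail.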
take care
tim cera
P.S. my opinions are my own.
tim cera cera@cortex.health.ufl.edu
computer operations manager box 100244, university of florida
department of neuroscience gainesville, fl, 32610-0244
uf brain institute (904) 392-7088
------------------------------
Subject: Re: Re: Neuron Digest V11 #25 (software, jobs, discussion, etc.)
From: eytan@dpt-info.u-strasbg.fr (Michel Eytan, LILoL)
Date: Fri, 30 Apr 93 16:21:52 +0000
>Subject: Re: Neuron Digest V11 #25 (software, jobs, discussion, etc.)
>From: eytan@dpt-info.u-strasbg.fr (Michel Eytan, LILoL)
>Date: Sun, 18 Apr 93 10:25:57 +0000
>
>>From: David Bradbury <D.C.Bradbury@open.ac.uk>
>>Date: 31 Mar 93 10:44:25 +0800
>>
>>Does anyone know where I can get software that can be used to
>>build/model/ simulate neural networks and/or genetic algorithms that will
>>run on an apple mac or a sun workstation? I am a first year Ph.D student
>>looking at modular neural networks.
>
>For the Mac, dunno. However for the Sun (and other workstations), try out
>Aspirin/Migraines:
Sh**, what a (revealing, Dr. Freud?) slip of the tongue: my message should
have read "I *have* a first-year Ph.D. student..." instead of "I *am* a
first-year Ph.D. student..."
As for the rest, yes I know about Aspirin/Migraines. Trouble is I cannot
compile it on my unix box since it's a diskless sparc1 that comes without
ANSI C -- and I do not have enough space to download a freeware ANSI C.
Thanks anyway
~=michel
------------------------------
Subject: Neural Network Model for Kindling of Epilepsy
From: cdgupta@physics.iisc.ernet.in (Chandan Dasgupta)
Organization: Dept. of Physics, Indian Institute of Science
Date: Fri, 30 Apr 93 20:06:42 -0500
We have recently developed a neural network model for the process of
kindling - generation of epilepsy in laboratory animals by repeated
electrical stimulation of certain parts of the brain. A paper describing
the model and the basic mechanism we propose is scheduled to appear in
the February issue of Biological Cybernetics. The abstract of this paper
is appended to this message. We were told by a referee of our paper that
we should look into the work of Jun Wada, Frank Morrell and Christopher
Zeeman on the modeling of kindling. In our literature survey, we have not
come across any paper by these authors on the modeling of the kindling
process. Can anyone provide any information on the work of these people?
We would also appreciate receiving comments on our work and references to
similar work. Preprints of our paper are available on request. Please
send your request to the e-mail address given above, or to the address
mayank@mercury.HUJI.AC.IL. Thanks.
*******************************ABSTRACT*************************************
A Neural Network Model for Kindling of Focal Epilepsy:
Basic Mechanism
by
M. R. Mehta, C. Dasgupta and G. R. Ullal
A simple neural network model is proposed for kindling - the phenomenon
of generating epilepsy in laboratory animals by means of repeated
electrical stimulation. The model satisfies Dale's hypothesis,
incorporates a Hebb-like learning rule and has low periodic activity in
the absence of shocks. Several experimental observations, such as the
existence of an afterdischarge threshold, an initial rapid rise followed
by eventual saturation of the afterdischarge amplitude and duration, and
insensitivity of the kindling rate to the amplitude of the electrical
stimulation are reproduced in computer simulations of the model. Some new
experiments are also suggested. It is proposed that the main reason for
kindling is the formation of a large number of excitatory synaptic
connections due to learning.
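To make the abstract's ingredients concrete, here is a generic toy illustration (pure Python; the network size, sign assignments, rates and activity pattern are invented, and this is not the authors' model) of a Hebb-like rule constrained by Dale's hypothesis: each neuron is permanently excitatory or inhibitory, so learning can only strengthen a weight in the sign direction fixed by the presynaptic neuron, and repeated coactivation (as under repeated stimulation) steadily accumulates excitatory strength.

```python
# Toy Hebb-like update under Dale's hypothesis: all outgoing weights of a
# neuron share that neuron's sign (excitatory or inhibitory), and
# learning never lets a weight cross zero.

n = 6
sign = [1, 1, 1, 1, -1, -1]   # neurons 0-3 excitatory, 4-5 inhibitory
# w[i][j]: weight onto neuron i from neuron j; sign fixed by sender j
w = [[0.1 * sign[j] if i != j else 0.0 for j in range(n)] for i in range(n)]

def hebb_step(w, pre, post, eta=0.05):
    # coactive pre/post pairs strengthen the synapse in the allowed direction
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if pre[j] and post[i]:
                w[i][j] += eta * sign[j]
            # clip so the weight keeps (or stays at) the sender's sign
            w[i][j] = max(w[i][j], 0.0) if sign[j] > 0 else min(w[i][j], 0.0)
    return w

# repeated "stimulation": the same activity pattern presented many times
activity = [1, 1, 0, 1, 1, 0]
for _ in range(10):
    w = hebb_step(w, activity, activity)
```

In the mechanism the abstract proposes, it is this one-way accumulation of excitatory synaptic connections under repeated stimulation that ultimately underlies kindling.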
------------------------------
Subject: Introduction of a newcomer
From: Spencer Rutledge III <7270P%NAVPGS.BITNET@cmsa.Berkeley.EDU>
Date: Fri, 30 Apr 93 23:55:24 -0800
I am a graduate student at the Naval Postgraduate School and am currently
working on my thesis, which consists mainly of constructing and running
simulation models which are representative of individual decision-making
processes in a dynamic environment. The decision-making process is
represented by the use of various heuristics and non-linear equations
with randomized input data. The intent is to incorporate various
learning algorithms into the simulation models to see if that enhances the
results. The algorithms that will be used are multiple regression,
genetic algorithms, machine learning and neural networks. For this
purpose I am soliciting any information that might help me to construct
one or several neural networks to accomplish this task. My knowledge of
neural networks is very limited and any help would be greatly
appreciated.
******************************************************************************
* SPENCER RUTLEDGE III HOME: 408-646-8574 *
* CAPTAIN, USMC OFFICE: 408-656-2174 *
* SMC #: 2508 FAX: 408-656-3273 (OFFICIAL ONLY) *
* NAVAL POSTGRADUATE SCHOOL BITNET: 7270P@NAVPGS *
* MONTEREY, CA 93943-5100 INTERNET: RUTLEDGE@MASTER.AS.NPS.NAVY.MIL *
******************************************************************************
------------------------------
Subject: some recent papers which may be of interest to you.
From: "Laveen N. Kanal" <kanal@cs.UMD.EDU>
Date: Sat, 01 May 93 23:40:06 -0500
Laveen N. Kanal, "On pattern, categories, and alternate realities",
published in Pattern Recognition Letters, vol 14, pp. 241-255, March
1993, Elsevier/North-Holland.
This is the text of the talk given by the author at The Hague, The
Netherlands, when he received the King-Sun Fu award of the International
Association for Pattern Recognition.
Contents:
Preamble
Pattern
Some sketches from the current pattern recognition scene
Artificial neural networks
Hybrid systems
"Where's the AI?"
Categorization
Alternate realities
Prospects
Concluding remarks
"Time goes, you say? Ah, no!
Alas Time stays, we go;"
Pierre de Ronsard
The Paradox of Time
(Austin Dobson, tr)
Other Recent Papers:
R. Bhatnagar & L.N. Kanal, "Structural and Probabilistic Knowledge for
Abductive Reasoning",IEEE Trans. on Pattern Analysis and
Machine Intelligence, special issue on Probabilistic Reasoning,
March 1993.
L. Kanal & S. Raghavan, "Hybrid Systems - A Key to Intelligent Pattern
Recognition", IJCNN-92, Proc. Int. Joint. Conf on Neural Networks,
June 1992.
B.J. Hellstrom & L.N. Kanal, "Asymmetric Mean-Field Neural Networks for
Multiprocessor Scheduling", Neural Networks, Vol. 5, pp 671-686,
May 1992.
L.N. Kanal & G.R. Dattatreya, "Pattern Recognition", in S. Shapiro (ed),
Encyclopedia of Artificial Intelligence, 2nd edition John
Wiley 1992.
R. Bhatnagar & L.N. Kanal, "Reasoning in Uncertain Domains - A Survey and
Commentary", in A. Kent & J.G. Williams (eds), Encyclopedia of
Computer Science and Technology, pp. 297-316 (also in Encyclopedia
of Microcomputers, Marcel Dekker, Inc., 1992).
Laveen N. Kanal
Prof. of Computer Science
A.V. Williams Bldg.
Univ. of Maryland
College Park, MD 20742
USA
kanal@cs.umd.edu
------------------------------
End of Neuron Digest [Volume 11 Issue 31]
*****************************************