Neuron Digest Volume 09 Number 03

Neuron Digest   Wednesday, 29 Jan 1992                Volume 9 : Issue 3 

Today's Topics:
Neural Networks in Physics -- Reinventing the Wheel?
summer program in Mathematical Physiology: update
Market Survey request
Information Request (mortality prediction)
MA in Philosophy of Cognitive Science at Sussex University
Who I am. My request...
Time Series Forecasting - Comparison
Having a machine perform a simple task: The reason why neural networks and artificial intelligence need each other
DuPont Neural Computation Program - Job Opening
Reducing images for NN input???


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Neural Networks in Physics -- Reinventing the Wheel?
From: "(Mike Mehl)" <mehl@irrmasg5.epfl.ch>
Date: Thu, 02 Jan 92 11:12:00 +0000

Hello everyone. First of all, I'd like to thank all of you for the
information provided in the Neuron Digest and the archives.

The problem I'm interested in concerns the use of a neural network to do
interpolation in equation of state data. I haven't seen this discussed
anywhere. People familiar with neural networks tell me it ought to be
possible, so I've made some preliminary tests.

One of the concerns of computational condensed matter physics is
the prediction of equilibrium crystal structures. This is relatively
easy for simple crystals, but becomes difficult quickly. To illustrate, here
is a sample problem in the Aluminum - Lithium system. The diagrams I'm
drawing are 1-dimensional, but the actual crystals are in 3-d, of course.

Start with a lattice of Aluminum atoms, which we'll represent by


Al Al Al Al Al Al Al Al Al Al
|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|

where a is the lattice constant, the spacing between the atoms. Using
almost any kind of computer, it is possible to calculate the energy of
this system as a function of the lattice constant, E[Al,a]. The location
of the minimum of this energy functional gives the equilibrium structure.
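As a concrete illustration of this step, the minimum of E[Al,a] can be
located by fitting a low-order polynomial to a few computed points. A
sketch in Python (the energy values below are invented for illustration,
not real Al data):

```python
import numpy as np

# Hypothetical E(a) samples near the minimum (eV per atom) -- these are
# illustrative values only, not real Al data.
a = np.array([3.8, 3.9, 4.0, 4.1, 4.2])
E = np.array([-3.35, -3.41, -3.43, -3.42, -3.38])

# Fit E(a) ~ c2*a^2 + c1*a + c0 and take the analytic minimum.
c2, c1, c0 = np.polyfit(a, E, 2)
a_eq = -c1 / (2.0 * c2)                  # equilibrium lattice constant
E_eq = np.polyval([c2, c1, c0], a_eq)    # energy at the minimum
print(a_eq, E_eq)
```

With real data one would fit an equation-of-state form such as
Birch-Murnaghan rather than a parabola, but the minimum-finding step is
the same.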

It is also easy to do the same calculations with Li atoms, getting E[Li,a]:

Li Li Li Li Li Li Li Li Li Li
|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|

It's a little harder to do the ordered Al-Li compound and get E[AlLi,a]

Al Li Al Li Al Li Al Li Al Li
|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|

It is even harder to calculate Al2 Li (E[Al2Li,a]):

Al Al Li Al Al Li Al Al Li Al
|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|

and so on. It's even worse to calculate the actual alloy structure,
where the Al and Li atoms are distributed more or less randomly. In
addition, in the mixed Al Li systems, the equilibrium spacing between Al
and Li atoms is not the same as between two Al or two Li atoms, so the
problem is even more difficult. After a certain point (10 atoms in a
unit cell) the problem requires a Cray, and beyond that (100 atoms or so)
it cannot be done with present-day equipment during a present-day lifespan.

Now for the interpolation part: The energy of a metallic crystal depends
mostly on the _local_ structure of the crystal. This means that in Al3 Li:

Al Al Al Li Al Al Al Li Al Al
|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|<- a ->|

the central Al atoms mostly look like atoms in pure Al, the Li atoms look
like atoms in AlLi, and the Al atoms between an Al and a Li atom look
more or less like Al atoms in Al2Li. Actually, you probably need to look
at 1st, 2nd, 3rd and maybe 4th neighbors to get good results, but this
is the basic idea.

It seemed to me that some form of a neural network would be able to
interpolate between these structures, using information about the local
structure around each atom. I built a network which did this, using a
variation of the Cascade architecture proposed by Scott Fahlman and
Christiane Lebiere. I wrote it in Fortran, the language of physics [ :-)
]. For my initial test case I used the equations of state for
face-centered cubic Al, face-centered cubic Li, and Al3Li in the Cu3Au
structure. I trained the network to match these equations of state, and
used the final result to calculate the equation of state of an ordered
form of the alloy Al7Li. The results are quite encouraging. I looked at
the output of several networks, varying the number of hidden nodes and
the amount of information supplied (location of 2nd neighbors, 3rd
neighbors, etc.). The predicted E[Al7Li,a] is pretty much independent of
the choice of network, and it is not that far from the computed equation
of state. It is not perfect, but I didn't expect it to be.
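To make the approach concrete, here is a minimal sketch in Python
(rather than the Fortran actually used) of the train-on-known-compositions,
predict-an-intermediate-one pattern. It uses a fixed two-layer
backpropagation net instead of Cascade-Correlation, and the "equation of
state" is an invented parabola, so this shows only the shape of the
idea, not the author's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an equation of state: a parabola in the lattice
# constant a whose minimum shifts with Li fraction x. Entirely made up;
# it only mimics the qualitative shape of real E[.,a] curves.
def energy(x, a):
    return (a - (4.0 - 0.6 * x)) ** 2 - 3.0 - 0.5 * x

# Train on "known" compositions (pure Al, AlLi-like, pure Li) ...
a_grid = np.linspace(3.0, 4.5, 7)
X_train = np.array([[x, a] for x in (0.0, 0.5, 1.0) for a in a_grid])
y_train = energy(X_train[:, 0], X_train[:, 1])

# ... and hold out an intermediate composition, as with Al7Li.
X_test = np.array([[0.25, a] for a in a_grid])
y_test = energy(X_test[:, 0], X_test[:, 1])

# Normalize inputs and targets for stable training.
mu, sd = X_train.mean(0), X_train.std(0)
ym, ys = y_train.mean(), y_train.std()
Xn, yn = (X_train - mu) / sd, (y_train - ym) / ys

# One tanh hidden layer, linear output, full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(30000):
    h = np.tanh(Xn @ W1 + b1)
    err = (h @ W2 + b2).ravel() - yn
    dh = (err[:, None] * W2.T) * (1 - h ** 2)   # backprop through tanh
    W2 -= lr * (h.T @ err[:, None]) / len(yn)
    b2 -= lr * err.mean()
    W1 -= lr * (Xn.T @ dh) / len(yn)
    b1 -= lr * dh.mean(0)

def predict(X):
    h = np.tanh(((X - mu) / sd) @ W1 + b1)
    return (h @ W2 + b2).ravel() * ys + ym

train_mse = np.mean((predict(X_train) - y_train) ** 2)
test_mae = np.mean(np.abs(predict(X_test) - y_test))
print(train_mse, test_mae)
```

Because the energy surface varies smoothly with composition, the net's
prediction at the held-out composition lands close to the true curve,
which is the property the interpolation scheme relies on.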

Now this is only one test case. Before it is publishable, I must
construct networks in several systems. This requires generating huge
amounts of data (various equations of state) or searching data bases for
the required information. That's a lot of work, and I don't want to do
it unless it is worthwhile. So my question to all of you is, has this
been done before? Am I reinventing the wheel? If so, I would appreciate
references. As you can see from my signature below, I'm in Switzerland at
the moment. I don't speak French, so getting a librarian to help me
search a database is difficult. However, with a reference I should be
able to find the articles in question.

Thanks in advance for the help.

--------------------------------------------------------------
| Don't call my bosses | Michael J. Mehl |
| On the phone. | mehl@irrmasg6.epfl.ch |
| The thoughts in here | On Sabbatical from NRL, |
| Are all my own. | Washington DC USA |
| Burma Shave | to IRRMA%EPFL@Lausanne.Switzerland |
|(Gee, that dates me.) | |
--------------------------------------------------------------



------------------------------

Subject: summer program in Mathematical Physiology: update
From: Ken Miller <ken@cns.caltech.edu>
Date: Mon, 06 Jan 92 09:14:59 -0800

I'm writing with respect to my previous posting on the summer program in
Mathematical Physiology at MSRI (Mathematical Sciences Research
Institute, Berkeley, CA). The first two weeks of this program, July
6-17, are on "Neurons in Networks", as described in that posting.

The new information is:
(1) The application deadline has been pushed back to Feb. 1
(2) Applications may be sent by e-mail. Send applications to
abaxter@msri.org, and address the correspondence to Nancy Kopell and Michael
Reed
(3) All expenses will be covered for those who attend, *except*: there is
a $450 limit on travel expenses. So, those wishing to attend from overseas
should indicate in their application whether they will be able to attend
with that limit on travel support.
(4) As mentioned before, women and minorities are encouraged to apply. If
you *are* a member of a minority, or if you are a woman and it might not be
obvious to us from your name, please be sure to note this in your
application.

I'll repeat here the basic information about applications:

To apply to participate and for financial support or to obtain more
information about the topics of the workshops, please write to:

Nancy Kopell and Michael Reed
Summer Program in Mathematical Physiology
MSRI
1000 Centennial Drive
Berkeley, CA 94720

Applicants should state clearly whether they wish to be long term
participants or workshop participants and which workshops they wish
to attend. Students should send a letter explaining their
background and interests and arrange for one letter of
recommendation to be sent. Researchers should indicate their
interest and experience in mathematical biology and include a
current vita and bibliography. Women and minorities are encouraged
to apply.

Ken


------------------------------

Subject: Market Survey request
From: sylee@eekaist.kaist.ac.kr ( Soo Young Lee )
Date: Mon, 06 Jan 92 11:08:47 -0800


Request for Neural Networks Market Survey


We are trying to organize a big national project on intelligent
computers, in which neural networks are expected to play an important
role. To make the proposal more competitive I would like to have a
market survey of neural network technology and related products for the
next 10 years. If you have any data I may use, or information on a
published survey book, please let me know.

Prof. Soo-Young Lee
Computation and Neural Systems Lab.
Dept. of EE, KAIST
Fax: +82-42-829-3410
E-mail : sylee@eekaist.kaist.ac.kr


------------------------------

Subject: Information Request (mortality prediction)
From: SchwartzM@DOCKMASTER.NCSC.MIL
Date: Fri, 10 Jan 92 10:28:00 -0500


My company is involved in researching predictive methodologies for
patient outcomes (typically mortality) for certain surgical procedures.
We are currently using a Bayesian method for this process. I would like
to review available literature on Neural Networks and Fuzzy Set Theory to
see if this can be applied. Based upon what I have seen recently in some
of the medical literature it would seem very appropriate. I have
purchased a book by Bart Kosko on the subject, but would like other
references to review. Are there good basic introductions to this subject
matter that can be recommended to me? I am looking for one that might
also have examples of software implementation and coding examples
(preferably in C), such as the Kosko book.
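As a hedged illustration of the Bayesian side of this kind of outcome
prediction (not the method actually in use at Summit, and in Python
rather than the requested C), a naive Bayes predictor over synthetic
binary risk factors might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic cohort: three binary risk factors and a mortality outcome.
# Purely illustrative data with no clinical meaning.
n = 2000
risk = rng.random((n, 3)) < 0.3            # presence of each factor
p_death = 0.02 + 0.15 * risk.sum(axis=1)   # risk rises with factor count
died = rng.random(n) < p_death

# Naive Bayes with Laplace smoothing: assume the factors are
# conditionally independent given the outcome.
def fit(X, y):
    prior = y.mean()
    p1 = (X[y].sum(0) + 1) / (y.sum() + 2)        # P(factor | died)
    p0 = (X[~y].sum(0) + 1) / ((~y).sum() + 2)    # P(factor | survived)
    return prior, p1, p0

def predict_proba(x, prior, p1, p0):
    l1 = prior * np.prod(np.where(x, p1, 1 - p1))
    l0 = (1 - prior) * np.prod(np.where(x, p0, 1 - p0))
    return l1 / (l1 + l0)

prior, p1, p0 = fit(risk, died)
low = predict_proba(np.array([0, 0, 0], bool), prior, p1, p0)
high = predict_proba(np.array([1, 1, 1], bool), prior, p1, p0)
print(low, high)
```

The estimated mortality probability rises with the number of risk
factors present, as intended; real systems would of course need far more
careful modeling of factor interactions.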

Thanks ahead of time for any help that might come my way!

Marc Schwartz
Director, Technical Services
Summit Medical Systems, Inc.
Minneapolis, MN 55447
Voice: 1-800-767-0766
E-Mail: SchwartzM at DOCKMASTER.NCSC.MIL




------------------------------

Subject: MA in Philosophy of Cognitive Science at Sussex University
From: Andy Clark <andycl@syma.sussex.ac.uk>
Date: Fri, 10 Jan 92 16:22:57 +0000


UNIVERSITY OF SUSSEX, BRIGHTON, ENGLAND
SCHOOL OF COGNITIVE AND COMPUTING SCIENCES

M.A. in the PHILOSOPHY OF COGNITIVE SCIENCE

This is a one-year course which aims to foster the study of foundational
issues in Cognitive Science and Computer Modelling. It is designed for
students with a background in Philosophy, although offers may be made to
exceptional students whose background is in some other discipline related
to Cognitive Science. Students combine work towards a 20,000 word
philosophy dissertation with subsidiary courses concerning aspects of
A.I. and the other Cognitive Sciences.


General Information.

The course is based in the School of Cognitive and Computing Sciences.
The School provides a highly active and interdisciplinary environment
involving linguists, cognitive psychologists, philosophers and A.I.
researchers. The kinds of work undertaken in the school range from highly
practical applications of new ideas in computing to the most abstract
philosophical issues concerning the foundations of cognitive science. The
school attracts a large number of research fellows and distinguished
academic visitors, and interdisciplinary dialogue is encouraged by
several weekly research seminars.

Course Structure of the MA in Philosophy of Cognitive Science

TERM 1

Compulsory Course: Philosophy of Cognitive Science
Topic: The Representational Theory of Mind:
From Fodor to Connectionism.

and one out of :

Artificial Intelligence Programming (Part I)
Knowledge Representation
Natural Language Syntax
Psychology I
Computer Science I
Modern Analytic Philosophy (1)
Modern European Philosophy (1)
Artificial Intelligence and Creativity


TERM 2

Compulsory Course: Philosophy of Cognitive Science (II)
Topic: Code, Concept and Process:
Philosophy, Neuropsychology and A.I.


and one out of:


Artificial Intelligence Programming (Part II)
Natural Language Processing
Computer Vision
Neural Networks
Intelligent Tutoring Systems
Psychology II
Computer Science II
Social Implications of AI
Modern Analytic Philosophy (2)
Modern European Philosophy (2)




TERM 3
Supervised work for the Philosophy
of Cognitive Science dissertation
(20,000 words)



Courses are taught by one-hour lectures, two-hour seminars and one-hour
tutorials.

Choice of options is determined by student preference and content of first
degree. Not all options will always be available and new options may be added
according to faculty interests.

CURRENT TEACHING FACULTY for the MA


Dr A. Clark Philosophy of Cognitive Science I and II
Mr R.Dammann Recent European Philosophy
Dr M.Morris Recent Analytic Philosophy
Dr S Wood and Mr R Lutz AI Programming I
Dr B Katz Knowledge Representation
Neural Networks
Dr N Yuill Psychology I
Dr M. Scaife Psychology II
Prof M Boden Artificial Intelligence and Creativity
Social Implications of AI
Dr L Trask Natural Language Syntax & Semantics
Dr S Easterbrook Computer Science I & II
Dr D Weir Logics for Artificial Intelligence
Dr D Young Computer Vision
Dr B Keller Natural Language Processing
Dr Y Rogers &
Prof B du Boulay Intelligent Tutoring Systems

ENTRANCE REQUIREMENTS
These will be flexible. A first degree in Philosophy or one of the Cognitive
Sciences would be the usual minimum requirement.

FUNDING
U.K.students may apply for British Academy funding for this course in the usual
manner. Overseas students would need to be funded by home bodies.

CONTACT
For an application form, or further information, please write to Dr
Allen Stoughton at the School of Cognitive and Computing Sciences,
University of Sussex, Falmer, Brighton BN1 9QH, or phone him on (0273)
606755 ext. 2882, or email - allen@cogs.sussex.ac.uk.


------------------------------

Subject: Who I am. My request...
From: A215%C53000.PETROBRAS.ANRJ.BR@UICVM.uic.edu
Date: Fri, 10 Jan 92 15:04:00 -0200

Dear Peter Marvit,

My name is Marcelo Pereira Melo.

I am an engineer and work for PETROBRAS, the Brazilian national company
in charge of petroleum exploration, production and refining in Brazil.
PETROBRAS is the major company in Brazil and often appears in FORTUNE
magazine's annual list of the 100 largest companies in the world.

Recently I received an M.Sc. in Computer Science - Artificial Intelligence
from the Catholic University at Rio de Janeiro, Brazil. The title of my
dissertation was "Artificial Neural Networks: An Application to Oil
Products Price Forecasting". In that work I investigated the applicability
of the backpropagation method for predicting the price of oil products in
the international market. Aspects of the project, difficulties and
suggestions were discussed. In short-term prediction I found performance
better than that obtained by the market experts. I also suggested a simple
way to combine forecasts and do analysis of scenarios. I have also written
papers on the subject that were published at TIMS 1991 (Rio de Janeiro)
and ICNN 1991 (Nimes, France).

Now I am also investigating the use of ANNs for credit analysis. So I
would be very grateful to receive any information about using ANNs for
forecasting and financial analysis.

My address,
PETROBRAS S.A.
Av. Chile 65, sala 1606
CEP: 20.035
Rio de Janeiro, Brasil
fax: (011) (55) (021) 220-2686

bitnet: a215@c53000.petrobras.anrj.br

Sincerely yours,
Marcelo.



------------------------------

Subject: Time Series Forecasting - Comparison
From: David Kanecki <kanecki@cs.uwp.edu>
Date: Sun, 12 Jan 92 16:28:20 -0600

In the November issue of Simulation there was a paper that compared
long-term versus short-term forecasting using neural networks and
difference equations. For short-term forecasting the neural network was
better than the difference equation. For long-term forecasting the neural
network and difference equations provided equal accuracy.
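To illustrate the difference-equation baseline in such a comparison
(with synthetic data, not the data from the Simulation paper), one can
fit x[t+1] = a*x[t] + b by least squares and contrast one-step-ahead
forecasts with a long iterated forecast:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic series generated by x[t+1] = 0.8*x[t] + 2 + noise
# (made-up parameters, for illustration only).
T = 200
x = np.empty(T); x[0] = 5.0
for t in range(T - 1):
    x[t + 1] = 0.8 * x[t] + 2.0 + rng.normal(0, 0.3)

# Fit the difference equation x[t+1] = a*x[t] + b by least squares
# on the first 150 points.
A = np.vstack([x[:149], np.ones(149)]).T
a, b = np.linalg.lstsq(A, x[1:150], rcond=None)[0]

# Short term: one-step-ahead forecasts over the held-out tail.
one_step = a * x[150:-1] + b
rmse_short = np.sqrt(np.mean((one_step - x[151:]) ** 2))

# Long term: iterate the fitted equation 50 steps from x[150];
# the forecast settles at the fixed point b/(1-a), losing all
# memory of the starting value.
f = x[150]
for _ in range(50):
    f = a * f + b
print(a, b, rmse_short, f)
```

The one-step error is essentially the noise level, while the iterated
forecast collapses to the series mean -- the usual reason short-term and
long-term comparisons come out differently.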

Also, the author noted the same observation I had made about neural
network training times: the shorter or more precise the coding, the
shorter the learning time. Thus, an imprecise or flowery coding will
yield a long learning time.

--------------
Neural Network Adaptation and Prediction Applications


A conference is being scheduled by the simulation society in 1993 that
will discuss neural network applications in environments where adaptation
and prediction are required.


If you would like the title of the article or the conference, contact me
for the references at:

kanecki@cs.uwp.edu



------------------------------

Subject: Having a machine perform a simple task: The reason why neural networks and artificial intelligence need each other
From: David Kanecki <kanecki@cs.uwp.edu>
Date: Sun, 12 Jan 92 17:11:30 -0600

In the December issue of Scientific American, an article described an
attempt to build a robot to butter a piece of toast. In the article,
this task was not as easy as it seems. For example, if the toast was
moved to a different spot, a new set of rules and procedures was needed.
Also, if the toast was oddly shaped, a new set of rules and procedures
was needed. The solution to the problem was to build a mini network to
adapt to changes.

Rules as used in artificial intelligence work in tasks that are fixed and
unchanging, while neural networks are useful in changing environments.
In the natural world, each task may follow a set of rules. Also, the
rules may need to be altered slightly due to the environment. Thus, for a
machine to function in the natural world it needs to have both an
artificial intelligence and a neural network basis. One movie that showed
this connection was "Star Wars". In "Star Wars" there were two robots,
R2-D2 and C-3PO. The R2-D2 robot represented the artificial intelligence
basis, while C-3PO represented the neural network basis. Lastly, in
complex tasks both had to work together.

The development of self-learning games is one way to develop a program
that utilizes an artificial intelligence and neural network basis. In
my experience, learning- and task-based programs still cannot break away
from the Lady Lovelace argument -- "A machine can only do what it is
programmed to do" -- or the neural network corollary, "A machine can only
learn what it is taught by a teacher".

Thus, the goal of my study is to try to develop a program that can teach
and solve problems itself by using techniques of ai, neural networks, and
biology.


David H. Kanecki

kanecki@cs.uwp.edu


------------------------------

Subject: DuPont Neural Computation Program - Job Opening
From: elaine@central.cis.upenn.edu (Elaine Benedetto)
Organization: University of Pennsylvania
Date: 15 Jan 92 16:05:27 +0000



Job Opening in
Modelling/Simulation of Neural Systems
********************************************


The DuPont Neural Computation Program invites applications for a
computational modelling and simulation position. Applicants should have
experience in modelling techniques and be interested in general
problems of sensory/motor integration and/or single neuron computation.
The program constructs biophysical model neurons reflecting
experimentally recorded neuronal dynamics, and then assembles these
neurons into networks reflecting the organization of a biological
cardio-respiratory control system. Familiarity with the UNIX/C/C++/
X-windows computing environment is desirable. Interaction with ongoing
experimental work will be required. We are open to considering various
levels of academic training and work achievement, but imagine that the
ideal candidate might hold a B.S. in a technical subject while having
exposure to or interest in biological subjects such as biophysics and
neurobiology. The opening is immediate, but a later start date will be
considered. Competitive salary commensurate with experience.

Contact: Dr. James Schwaber or Prof. Lyle Ungar
Email: schwaber@eplrx7.es.duPont.com ungar@cis.upenn.edu
Telephone: (302) 695-7136 (215) 898-7449

DuPont Neural Computation Program
DuPont Experimental Station E352 Room 253
Wilmington, DE 19880-0352



------------------------------

Subject: Reducing images for NN input???
From: nobody@Kodak.COM (Corinne Weirich)
Date: Fri, 17 Jan 92 09:02:42 -0500


Hi, I have just been added to this mailing list. I just started a
research project on using neural networks for image classification. I am
using a commercial NN package called NWorks on the Macintosh. My dilemma
right now is how to input large (greyscale) images into a neural net. Is
there a standard method to reduce images to a manageable number of
inputs? Does anyone know of any articles that might be helpful?
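One common first answer is block averaging: partition the image into
non-overlapping k x k tiles and feed the tile means to the network. A
sketch in Python with NumPy (the 512x512 size and reduction factor of 8
are arbitrary choices for illustration):

```python
import numpy as np

# Reduce a greyscale image by averaging non-overlapping k x k blocks.
def block_reduce(img, k):
    h, w = img.shape
    assert h % k == 0 and w % k == 0, "dimensions must divide evenly"
    # Reshape into (rows, k, cols, k) tiles and average within each tile.
    return img.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

# Synthetic ramp image standing in for real greyscale data.
img = np.arange(512 * 512, dtype=float).reshape(512, 512)
small = block_reduce(img, 8)
print(small.shape)   # 64x64 -> 4096 inputs instead of 262144
```

Averaging preserves coarse intensity structure while cutting the input
count by a factor of k squared; more sophisticated options such as
Gaussian pyramids or PCA on patches build on the same idea.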

Thanks, Corinne (corinne@eksignal.kodak.com)



------------------------------

End of Neuron Digest [Volume 9 Issue 3]
***************************************
