Neuron Digest Monday, 8 Jun 1992 Volume 9 : Issue 25
Today's Topics:
Re: HELP TO START
Post-Doc in Neural Processes in Cognition for Fall 92
DMCC-7 Info request
Research Position
backprop
Re: Neuron Digest V9 #22 (discussion + job)
Do you need faster/bigger simulations?
neural programmer/analyst job opening
Financial forecasting using ANN?
IEEE NNC Standards Committee
How to run a big net?
Sunnet
Re: Fuzzy-logic newsletter via e-mail?
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Re: HELP TO START
From: M_FATTAH%EGFRCUVX.BITNET@pucc.Princeton.EDU
Date: Tue, 12 May 92 12:02:00 +0000
[[ Editor's Note: M. Fattah's original note stated something to the
effect of "please tell me about neural nets"; I asked for him to be a bit
more specific about his background and interests. Here is the result. I
hope someone can offer some beginning guides, books, or papers. One
suggestion might be "Analog VLSI" by Carver Mead (Addison-Wesley) which
has a chapter on the retina and its realization in silicon. -PM ]]
Dear Sir,
I have asked for help getting started in the subject of Neural Networks.
I indicated that my field of interest is the use of vision science in
illuminating engineering. Though you kindly announced my request, I got
no response. So I would like to put it another way: I would like to know
of introductory and tutorial papers on modelling the visual system using
Neural Networks. Thanks a lot in advance,
Yours,
Mohamed Abdel Fattah
------------------------------
Subject: Post-Doc in Neural Processes in Cognition for Fall 92
From: SCHNEIDER@vms.cis.pitt.edu
Date: Tue, 12 May 92 16:17:00 -0500
Postdoctoral Training in
NEURAL PROCESSES IN COGNITION
For Fall 1992
The Pittsburgh Neural Processes in Cognition program, now in its second
year, provides interdisciplinary training in the brain sciences. The
National Science Foundation has established an interdisciplinary program
investigating the neurobiology of cognition utilizing neuroanatomical,
neurophysiological, and computer simulation procedures. Students
perform original research investigating cortical function at multiple
levels of analysis. State-of-the-art facilities include: computerized
microscopy, human and animal electrophysiological instrumentation,
behavioral assessment laboratories, MRI and PET brain scanners, the
Pittsburgh Supercomputing Center, and access to human clinical
populations. This is a joint program between the University of
Pittsburgh, its School of Medicine, and Carnegie Mellon University.
Postdoctoral applicants MUST HAVE UNITED STATES RESIDENT'S STATUS.
Applications are requested by June 20, 1992. Applicants must have a
sponsor from the training faculty for the Neural Processes in Cognition
Program. Application materials will be sent upon request. Each student
receives full financial support, travel allowances, and computer
workstation access. Last year's class included mathematicians,
psychologists, and neuroscience researchers. Applications are encouraged
from students with interest in biology, psychology, engineering, physics,
mathematics, or computer science. For information contact: Professor
Walter Schneider, Program Director, Neural Processes in Cognition,
University of Pittsburgh, 3939 O'Hara St, Pittsburgh, PA 15260, call
412-624-7064 or Email to NEUROCOG@VMS.CIS.PITT.BITNET
------------------------------
Subject: DMCC-7 Info request
From: emsca!intevep!jjr@Sun.COM (Juan J. Rodriguez Moscoso)
Date: Wed, 13 May 92 10:55:55 +0700
Would anyone help me find information regarding the upcoming Distributed
Memory Computer Conference, now in its 7th meeting (DMCC-7)? Or, if
possible, could you offer hints on where to redirect this request? Thank
you very much,
--Juan J. Rodriguez
Advanced Technologies Group, INTEVEP S.A., Caracas, Venezuela.
------------------------------
Subject: Research Position
From: Joe Levy <joe@cogsci.edinburgh.ac.uk>
Date: Wed, 13 May 92 11:10:26 +0000
Research Position
University of Edinburgh, UK
Centre for Cognitive Science
Applications are invited for an RA position, at pre- or post-doctoral
level (up to 16,755 pounds), working on ESRC-funded research into
connectionist models of human language processing. The research involves
using recurrent neural networks trained on speech corpora to model
psycholinguistic data. Experience in C programming, neural networks and
human language processing would be valuable. An early start date is
preferred and the grant will run until 31 March 1994. Applications
consisting of a CV and the names and addresses of two referees should be
sent by Tuesday 2 June to Dr Richard Shillcock, Centre for Cognitive
Science, University of Edinburgh, 2 Buccleuch Place, Edinburgh EH8 9LW,
UK to whom informal enquiries should also be addressed (email
rcs%cogsci.ed.ac.uk@nsfnet-relay.ac.uk; tel +44 31 650 4425).
------------------------------
Subject: backprop
From: pomper@ee.ic.ac.uk
Date: Fri, 22 May 92 10:10:35 -0200
Hello to the experts!
For my studies I need a universal (?) backpropagation program.
This means that I would like to have
- different state functions
- different learning rate behaviours
- different error criteria
- flexible connections/inputs
- different update functions (steepest descent, ...)
for each layer/neuron
My question is: Does such a program (package) already exist, and where
can I get it? I need it for a UNIX environment (Sun SPARC, OpenWindows),
or I would like to get the C source.
Greetings
(neurons get the difference)
Chris
pomper@ee.ic.ac.uk
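[[ Editor's Note: Several packages mentioned elsewhere in this issue
(NETS, Aspirin/MIGRAINES, PlaNet) provide parts of this flexibility. For
readers who end up rolling their own in C, the fragment below is one
common way to get per-layer choices of state function, learning rate,
and update rule: keep function pointers in a layer structure. It is only
an illustrative sketch under made-up names (Layer, logistic_fn,
sgd_update), not code from any of the packages discussed. -PM ]]

    /* Sketch only: per-layer flexibility via function pointers.  The
     * training loop calls layer->activation and layer->update without
     * knowing which state function or update rule was chosen. */
    #include <stdio.h>
    #include <math.h>

    typedef double (*act_fn)(double);                  /* state function */
    typedef void   (*update_fn)(double *w, double grad,
                                double lrate);         /* update rule    */

    typedef struct {
        int       n_units;
        act_fn    activation;  /* e.g. logistic, tanh, linear            */
        update_fn update;      /* e.g. plain steepest descent, momentum  */
        double    lrate;       /* per-layer learning rate                */
    } Layer;

    static double logistic_fn(double x) { return 1.0 / (1.0 + exp(-x)); }

    static void sgd_update(double *w, double grad, double lrate)
    {
        *w -= lrate * grad;    /* steepest descent on a single weight    */
    }

    int main(void)
    {
        Layer hidden = { 4, logistic_fn, sgd_update, 0.1 };
        double w = 0.5, grad = 0.2;

        hidden.update(&w, grad, hidden.lrate);
        printf("activation(0.3) = %f, updated weight = %f\n",
               hidden.activation(0.3), w);
        return 0;
    }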
------------------------------
Subject: Re: Neuron Digest V9 #22 (discussion + job)
From: John R. Purvis III <purvis@acad.stedwards.edu>
Date: Wed, 27 May 92 08:49:55 -0600
In reply to the request for info on simulators that generate code, I
suggest looking at the NETS package, which was developed by NASA and is
available through COSMIC at the University of Georgia. Information can
be obtained via Internet at service@cossack.cosmic.uga.edu. I have used
the package some, and a student has used it in an undergraduate research
project. It seems pretty good, the price is low (around $150 if I
remember correctly), and the product is supported.
-- Regards,
_______
/ /
/ ___ /__ /__
/ / / / / / /
\_/ /__/ / / / /
John R. Purvis, P.E.
Assistant Professor (512) 448-8465
Department of Computer and FAX (512) 448-8492
Information Sciences purvis@acad.stedwards.edu
St. Edward's University
3001 So. Congress Ave.
Austin, TX 78704
------------------------------
Subject: Do you need faster/bigger simulations?
From: ngoddard@carrot.psc.edu
Date: Thu, 28 May 92 11:39:29 -0500
The Pittsburgh Supercomputing Center (PSC) encourages applications for
grants of supercomputing resources from researchers using neural
networks. Our Cray YMP is running the NeuralShell, Aspirin and PlaNet
simulators. The CM-2 (32k nodes) currently runs two neural network
simulators, one being a data-parallel version of the McClelland and
Rumelhart PDP simulator. These simulators are also available on the 256
node CM-5 installed at the PSC (currently without vector units). Users
can run their own simulation code or modify our simulators; there is some
support for porting code.
PSC is primarily funded by the National Science Foundation and there is
no explicit charge to U.S.-based academic researchers for use of its
facilities. International collaboration is encouraged, but each proposal
must include a co-principal investigator from a U.S. institution. Both
academic and industry researchers are encouraged to apply.
The following numbers give an idea of the scale of experiment that can be
run using PSC resources. The bottom line is that in a day one can obtain
results that would have required months on a workstation. For a large
backpropagation network (200-200-200) the Cray simulators reach
approximately 20 million online connection-weight updates per second
(MCUPS) on a single CPU. This is about 100 times the speed of a
DECstation 5000/120 and about 30 times the speed of an IBM 730 on the
same problem. It could be increased by a factor of 8 if all of the Cray
YMP's 8 CPUs were dedicated to the task. The McClelland & Rumelhart
simulator on the CM-2 achieves about 20 MCUPS (batch) using 8k processors
or about 80 MCUPS using 32k processors. The Zhang CM-2 backpropagation
simulator has been benchmarked at about 150 MCUPS using all 32k
processors. Current CM-5 performance is around 35 MCUPS (batch) per
128-node partition for the McClelland & Rumelhart simulator. CM-5
performance should improve dramatically once vector units are installed.
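[[ Editor's Note: To translate the MCUPS figures above into wall-clock
terms, the quick calculation below uses only the numbers quoted in this
posting: a 200-200-200 net has roughly 200*200 + 200*200 = 80,000
weights, so 20 MCUPS works out to about 250 pattern presentations per
second, and 150 MCUPS to nearly 1,900. It is just a convenience
calculator, not part of any PSC simulator. -PM ]]

    /* Back-of-the-envelope: pattern presentations per second implied by
     * a given MCUPS rate for a 200-200-200 backprop net (biases
     * ignored).  The MCUPS values are the ones quoted above. */
    #include <stdio.h>

    int main(void)
    {
        double n_in = 200, n_hid = 200, n_out = 200;
        double weights = n_in * n_hid + n_hid * n_out;   /* 80,000 */
        double mcups[] = { 20e6, 80e6, 150e6 };
        int i;

        for (i = 0; i < 3; i++)
            printf("%5.0f MCUPS -> about %.0f presentations/second\n",
                   mcups[i] / 1e6, mcups[i] / weights);
        return 0;
    }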
A service unit on the Cray YMP corresponds to approximately 40 minutes of
CPU time using 2 MW of memory; on the CM-2 it is one hour's exclusive use
of 8k processors. Grants of up to 100 service units are awarded every
two weeks; larger grants are awarded quarterly. Descriptions of the
types of grants available and the application form can be obtained as
described below.
NeuralShell, Aspirin and PlaNet run on various platforms including
workstations and are available by anonymous ftp as described below. The
McClelland & Rumelhart simulator is available with their book
"Explorations in Parallel Distributed Processing", 1988. Documentation
outlining the facilities provided by each simulator is included in the
sources. The suitability of the supercomputer version of each simulator
for different types of network sizes and topologies is discussed briefly
in PSC documentation that can be obtained by anonymous ftp as described
below. It should be possible to develop a neural network model on a
workstation and later conduct full-scale testing on the PSC's
supercomputers without substantial changes.
Instructions for how to use the anonymous ftp facility appear at the end
of this message. Further inquiries concerning the Neural and
Connectionist Modeling program should be sent to Nigel Goddard at
ngoddard@psc.edu (Internet) or ngoddard@cpwpsca (Bitnet) or the address
below.
How to get PSC grant information and application materials
-----------------------------------------------------------
A shortened form of the application materials in printer-ready postscript
can be obtained via anonymous ftp from ftp.psc.edu (128.182.62.148). The
files are in "pub/grants". The file INDEX describes what is in each of
the other files. More detailed descriptions of PSC facilities and
services are only available in hardcopy. The basic document is the
Facilities and Services Guide.
Hardcopy materials can be requested from:
grants@psc.edu
or
(412) 268 4960 - ask for the Allocations Coordinator
or
Allocations Coordinator
Pittsburgh Supercomputing Center
4400 Fifth Avenue
Pittsburgh, PA 15213
How to get Aspirin/MIGRAINES
----------------------------
The software is available from two FTP sites, CMU's simulator collection
(pt.cs.cmu.edu or 128.2.254.155 in directory
/afs/cs/project/connect/code) and from UCLA's cognitive science machines
(polaris.cognet.ucla.edu or 128.97.50.3 in directory "alexis"). The
compressed tar file is a little less than 2 megabytes and is called
"am5.tar.Z".
How to get PlaNet
-----------------
The software is available from FTP site boulder.colorado.edu
(128.138.240.1) in directory "pub/generic-sources", filename
PlaNet.5.6.tar.Z
How to get NeuralShell
----------------------
The software is available from FTP site quanta.eng.ohio-state.edu
(128.146.35.1) in directory "pub/NeuralShell", filename
"NeuralShell.tar".
Generic Anonymous FTP instructions.
------------------------------------
1. Create an FTP connection to the ftp server. For example, you
would connect to the PSC ftp server "ftp.psc.edu" (128.182.62.148)
with the command "ftp ftp.psc.edu" or "ftp 128.182.62.148".
2. Log in as user "anonymous" with password your username@your-site.
3. Change to the requisite directory, usually /pub/somedir, by
typing the command "cd pub/somedir"
4. Set binary mode by typing the command "binary" ** THIS IS IMPORTANT **
5. Optionally look around by typing the command "ls".
6. Get the files you want using the command "get filename" or
"mget *" if you want to get all the files.
7. Terminate the ftp connection using the command "quit".
8. If the file ends in .Z, uncompress it with the command
"uncompress filename.Z" or "zcat filename.Z > filename".
This uncompresses the file and removes the .Z from the filename
9. If the files end in .tar, extract the tar'ed files with
the command "tar xvf filename.tar".
10. If a file ends in .ps, you can print it on a Postscript printer
using the command "lpr -s -Pprintername filename.ps"
------------------------------
Subject: neural programmer/analyst job opening
From: Eric Mjolsness <mjolsness-eric@CS.YALE.EDU>
Date: Fri, 29 May 92 12:51:56 -0500
Programmer/Analyst Position
in Artificial Neural Networks
The Yale Center for Theoretical
and Applied Neuroscience (CTAN)
and the
Computer Science Department
(Yale University, New Haven CT)
We are offering a challenging position in software engineering in support
of new mathematical approaches to artificial neural networks, as
described below. (The official job description is very close to this and
is posted at Human Resources.)
1. Basic Function: Designer, programmer, and consultant for artificial
neural network software at Yale's Center for Theoretical and Applied
Neuroscience (CTAN) and the Computer Science Department.
2. Major duties:
(a) To extend and implement a design for a new neural net compiler
and simulator based on mathematical optimization and computer algebra,
using serial and parallel computers.
(b) To run and analyse computer experiments using this and other
software tools in a variety of application projects, including
image processing and computer vision.
(c) To support other work in artificial neural networks at CTAN,
including the preparation of software demonstrations.
3. Position Specifications:
(a) Education:
BA, including calculus, linear algebra, differential equations.
helpful: mathematical optimization.
(b) Experience:
programming experience in C under UNIX.
some of the following: C++ or other object-oriented language,
symbolic programming, parallel programming,
scientific computing, workstation graphics,
circuit simulation, neural nets, UNIX system
administration
(c) Skills:
High-level programming languages
medium to large-scale software engineering
good mathematical literacy
Preferred starting date: July 1, 1992.
For information or to submit an application (your resume and any
supporting materials), please write:
Eric Mjolsness
Yale Computer Science Department
P.O. Box 2158 Yale Station
New Haven CT 06520
(mjolsness@cs.yale.edu)
Any application must also be sent to Human Resources, with the position
identification "C 7-20073", at this address:
Jeffrey Drexler
Department of Human Resources
Yale University
155 Whitney Ave.
New Haven, CT 06520
------------------------------
Subject: Financial forecasting using ANN?
From: NERAMOS <neramos@unmdp.edu.ar>
Date: Sat, 30 May 92 18:16:47 -0700
I would be very grateful to receive some information concerning the use
of ANNs for forecasting and financial analysis. I have heard about a very
interesting work,
"Financial applications of neural networks".
Could someone send me a copy of this or some related works?
Thank you very much,
Dr. Nestor E. Ramos
University of Mar del Plata
P.O. Box 701 Correo Central
7600 Mar del Plata
Argentina
E-mail : NERAMOS@UNMDP.EDU.AR
P.S.: I have economic data from my country; if someone needs it, just
contact me.
------------------------------
Subject: IEEE NNC Standards Committee
From: Emile Fiesler <efiesler@idiap.ch>
Date: Mon, 01 Jun 92 08:34:02 +0100
On behalf of the IEEE Neural Network Council Standards Committee I am
composing a database of people who are interested in, and would like to
contribute to, the establishment of neural network standards.
The Committee consists of the following Working Groups:
A. the glossary working group
concerning neural network nomenclature
B. the paradigms working group
concerning neural network (construction) tools
C. the performance evaluation working group
concerning neural network benchmarks
D. the interfaces working group
concerning neural network hardware and software interfaces
(This working group is still tentative.)
People who are interested, and would like to be on our mailing list,
are invited to send me the following information:
1. Name
2. Title / Position
3. Address
4. Telephone number
5. Fax number
6. Electronic mail address
7. Interest:
A short statement expressing your interests in neural network
standards including which working group(s) you are interested
in (A, B, C, D).
E. Fiesler
IDIAP
Case postale 609
CH-1920 Martigny
Switzerland / Suisse
Tel.: +41-26-22-76-64
Fax.: +41-26-22-78-18
E-mail: EFiesler@IDIAP.CH (INTERNET)
------------------------------
Subject: How to run a big net?
From: <LUCA%CZHETH5A.BITNET@pucc.Princeton.EDU>
Date: Tue, 02 Jun 92 08:34:00 +0100
Dear Friends, I was trying to train a 2-layer net (input, hidden,
output) with classical backpropagation. At first I wrote my own program,
in Fortran, and ran it on a VAX 8000. It worked OK on a small scale, but
when I brought it up to the full project (15000 input, 5000 hidden,
15000 output) it would not run because of excessive memory requirements.
I then tried NeurDS 3.1 from DEC and ran into the same problem (apart
from the difficulties of converting the data to its hexadecimal format
and going above the intrinsic 5000-unit limit). I then experimented with
RCS, Aspirin/MIGRAINES and PlaNet, first on an Iris 4D/35 and then on a
Sun4 :-< same result, a crash due to excessive memory requirements. In
the meantime I have learned to program in C, which let me study the
source that more or less all of these meta-compilers produce from the
starting grammar, and I found that hardly any of them use dynamic
allocation. I then found a short program called Batchnet, written by
R. W. Dobbins and R. C. Eberhart, that makes extensive use of dynamic
allocation, and I hope that with it I will get closer to a solution.
My QUESTION: WHAT DO YOU USE TO RUN BIG NETS? (15000*5000*15000 I*H*O)
Of course one can address it from a classical point of view and optimize
the code by buffering at the expense of I/O ... and that will be my
last-minute option ....
Best regards
Luca I.G.Toldo
Email: Luca@czheth5a.bitnet
THANKS !
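[[ Editor's Note: The arithmetic explains the crashes: a
15000-5000-15000 net has 15000*5000 + 5000*15000 = 150 million weights,
or roughly 600 MB at 4 bytes per single-precision weight, before any
gradient or momentum arrays are counted. Simulators with statically
dimensioned arrays give up long before that. The fragment below is only
a generic illustration of the dynamic-allocation approach the poster
mentions (allocate the weight matrices at run time and check whether the
machine can supply the memory); it is not code from Batchnet or any
other package named above. -PM ]]

    /* Sketch: allocate backprop weight matrices at run time instead of
     * declaring fixed-size arrays, and fail gracefully when the memory
     * is not there.  Sizes are the ones from the posting above. */
    #include <stdio.h>
    #include <stdlib.h>

    static float *alloc_weights(long rows, long cols)
    {
        size_t bytes = (size_t) rows * (size_t) cols * sizeof(float);
        float *w = (float *) malloc(bytes);

        if (w == NULL)
            fprintf(stderr, "cannot allocate %ldx%ld weights (%.0f MB)\n",
                    rows, cols, bytes / 1e6);
        return w;
    }

    int main(void)
    {
        long n_in = 15000, n_hid = 5000, n_out = 15000;
        float *w_ih = alloc_weights(n_in,  n_hid);   /* ~300 MB */
        float *w_ho = alloc_weights(n_hid, n_out);   /* ~300 MB */

        if (w_ih == NULL || w_ho == NULL)
            return 1;                  /* too big for this machine */
        printf("allocated %ld weights\n", n_in * n_hid + n_hid * n_out);
        free(w_ih);
        free(w_ho);
        return 0;
    }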
------------------------------
Subject: Sunnet
From: Mark Evans <mre1@it-research-institute.brighton.ac.uk>
Date: Tue, 02 Jun 92 12:43:13 +0000
> Subject: looking for SunNet Ref. Manual
> From: Pablo Dominguez <fibces04@esaii.upc.es>
> Date: Wed, 06 May 92 19:07:50 +0100
>
> We are working with the Sunnet neurosimulator. We'd like to get the
> Reference Manual. If someone has it, please, let us know.
>
> Thanks in advance
>
> Pablo Dominguez
> Dept ESAII (UPC)
> Barcelona, Spain
> E-mail: fibces04@esaii.upc.es
You are looking for PlaNet; SunNet was the old name. Details may be
obtained from the author Yoshiro Miyata, miyata@jp.ac.chukyo-u.sccs.
Regards,
Mark
#################################################
# #
# Mark Evans mre1@itri.bton.ac.uk #
# Research Assistant itri!mre1 #
# #
# ITRI, #
# Brighton Polytechnic, #
# Lewes Road, #
# BRIGHTON, #
# E. Sussex, #
# BN2 4AT. #
# #
# Tel: +44 273 642915/642900 #
# Fax: +44 273 606653 #
# #
#################################################
------------------------------
Subject: Re: Fuzzy-logic newsletter via e-mail?
From: Ulf Rimkus <RIMKUS_U%DMRHRZ11.BITNET@vm.gmd.de>
Date: Tue, 02 Jun 92 22:54:30 +0700
In reply to James Rash <jim@class.gsfc.nasa.gov>: I have found the
following two e-mail lists dealing with fuzzy logic. The info is
extracted from the "interest-group-list" available at "ftp.nisc.sri.com"
(netinfo/interest-groups). Hope this helps, Ulf Rimkus.
---------------------------CUT HERE ------------------------------------
CYBSYS-L@BINGVAXU.CC.BINGHAMTON.EDU
CYBSYS-L%BINGVMB.BITNET@CUNYVM.CUNY.EDU
CYBSYS-L@BINGVMB.BITNET
The Cybernetics and Systems mailing list is an open list serving those
working in or just interested in the interdisciplinary fields of Systems
Science, Cybernetics, and related fields (e.g. General Systems Theory,
Complex Systems Theory, Dynamic Systems Theory, Computer Modeling and
Simulation, Network Theory, Self-Organizing Systems Theory, Information
Theory, Fuzzy Set Theory). The list is coordinated by members of the
Systems Science department of the Watson School at SUNY-Binghamton, and is
affiliated with the International Society for the Systems Sciences (ISSS)
and the American Society for Cybernetics (ASC).
To subscribe, send the following command to LISTSERV@BINGVMB via mail or
interactive message:
SUB CYBSYS-L your_full_name
where "your_full_name" is your name. For example: SUB CYBSYS-L Joan Doe
Non-BitNet users can subscribe by sending the text:
SUB CYBSYS-L your_full_name
in the body of a message to LISTSERV@BINGVAXU.CC.BINGHAMTON.EDU or
LISTSERV%BINGVMB.BITNET@CUNYVM.CUNY.EDU.
To unsubscribe send the following command:
UNSUB CYBSYS-L
Coordinator: Cliff Joslyn <vu0112@BINGVAXU.CC.BINGHAMTON.EDU>
-----
IRLIST <IR-L%UCCVMA.BITNET@VM1.NODAK.EDU>
IRList is open to discussion of any topic (vaguely) related to Information
Retrieval. Certainly, any material relating to ACM SIGIR (the Special
Interest Group on Information Retrieval of the Association for Computing
Machinery) is of interest. The field has close ties to artificial
intelligence, database management, information and library science,
linguistics, etc. A partial list of suitable topics is:
Information Management/Processing/Science/Technology
AI Applications to IR        Hardware aids for IR
Abstracting                  Hypertext and Hypermedia
CD-ROM/CD-I/...              Indexing/Classification
Citations                    Information Display/Presentation
Cognitive Psychology         Information Retrieval Applications
Communications Networks      Information Theory
Computational Linguistics    Knowledge Representation
Computer Science             Language Understanding
Cybernetics                  Library Science
Data Abstraction             Message Handling
Dictionary analysis          Natural Languages, NL Processing
Document Representations     Optical disc technology and applications
Electronic Books             Pattern Recognition, Matching
Evidential Reasoning         Probabilistic Techniques
Expert Systems in IR         Speech Analysis
Expert Systems use of IR     Statistical Techniques
Full-Text Retrieval          Thesaurus construction
Fuzzy Set Theory
Contributions may be anything from tutorials to rampant speculation. In
particular, the following are sought:
Abstracts of Papers, Reports, Dissertations    Address Changes
Bibliographies                                 Conference Reports
Descriptions of Projects/Laboratories          Half-Baked Ideas
Humorous, Enlightening Anecdotes               Histories
Questions                                      Requests
Seminar Announcements/Summaries                Research Overviews
Work Planned or in Progress
The only real boundaries to the discussion are defined by the topics of
other mailing lists. Please do not send communications to both this list
and AIList or the Prolog list, except in special cases. The Moderator
tries not to overlap much with NL-KR, except when both lists receive
materials from contributors or from some bulletin board or researchers.
There is no objection to distributing material that is destined for
conference proceedings or any other publication. The Coordinator is
involved in SIGIR Forum and, unless submitters request otherwise, may
include submissions in whole or in part in future paper versions of the
FORUM. Indeed, this is one form of solicitation for FORUM contributions!
Both IRList and the FORUM are unrefereed, and opinions are always those of
the author and not of any organization unless there are other indications.
Copies of list items should credit the original author, not necessarily the
IRList.
The IRLIST Archives will be set up for anonymous FTP, and the address will
be announced in future issues.
To subscribe send the following command to LISTSERV@UCCVMA.BITNET:
SUB IR-L your_full_name
where "your_full_name" is your real name, not your login Id.
Non-BitNet users can join by sending the above command as the only line in
the text/body of a message to LISTSERV%UCCVMA.BITNET@VM1.NODAK.EDU.
Moderator: IRLUR%UCCMVSA.BITNET@VM1.NODAK.EDU
Editorial Staff: Clifford Lynch <lynch@POSTGRES.BERKELEY.EDU>
<calur@UCCMVSA.BITNET>
Mary Engle <engle@CMSA.BERKELEY.EDU>
<meeur@UCCMVSA.BITNET>
Nancy Gusack <ncgur@UCCMVSA.BITNET>
-------------------------- THE END -------------------------------------
INTERNET: rimkus_u@dmrhrz11.hrz.uni-marburg.de
Hey stranger, my best friends are strangers, but you have never been here!
BITNET: rimkus_u@dmrhrz11
------------------------------
End of Neuron Digest [Volume 9 Issue 25]
****************************************