Neuron Digest Saturday, 27 Feb 1993 Volume 11 : Issue 15
Today's Topics:
Applying Standards to Neural Networks
Sheet of neurons simulation
Re: Sheet of neurons simulation
Re: Sheet of neurons simulation
NNET model choice.
Handbook of Neural Algorithms
COMPUTER STANDARDS & INTERFACES addendum
Position Available at JPL
lectureship
Industrial Position in Artificial Intelligence and/or Neural Networks
lectureship in cognitive science
Microsoft Speech Research
Neural Computation & Cognition: Opening for NN Programmer
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Applying Standards to Neural Networks
From: erwin@trwacs.fp.trw.com (Harry Erwin)
Organization: TRW Systems Division, Fairfax VA
Date: 12 Feb 93 16:54:57 +0000
I was asked to review a proposal concerning the standardization of
vocabulary for machine learning and neural networks. This is being
distributed by the U.S. Technical Advisory Group to ANSI (JTC1 TAG). X3K5
is coordinating and developing a recommended position to JTC1 TAG for
approval for submission to ISO/IEC JTC 1. This recommendation has to be
returned to the JTC1 TAG Administrator no later than 1 March, 1993. The
contact person is the
JTC1 TAG Administrator
Computer and Business Equipment Manufacturers Association (CBEMA)
1250 Eye Street NW, Suite 200
Washington, DC 20005-3922
phone: 202-737-8888 (Press 1 Twice)
The terms whose definitions are being standardized include:
"knowledge acquisition"
"learning strategy"
"concept"
"concept learning"
"conceptual clustering"
"taxonomy formation"
"machine discovery"
"connectionist model"
"massively parallel processing"
"connection machine"
"connection system"
"neural network"
"connectionist network"
"neurocomputer"
"learning task"
"concept description"
"chunking"
"discrimination network"
"characteristic description"
"discriminant description"
"structural description"
"concept formation"
"partially learned concept"
"version space (of a concept)"
"description space"
"instance space (of a concept)"
"(concept) generalization"
"consistent generalization"
"constraint-based generalization"
"similarity-based generalization"
"complete generalization"
"specialization"
"caching (in machine learning)"
"concept validation"
"confusion matrix"
"rote learning"
"adaptive learning"
"advice taking"
"learning by being told"
"learning from instruction"
"incremental learning"
"supervised learning"
"inductive learning"
"learning from induction"
"deductive learning"
"analytic learning"
"explanation-based learning"
"operationalization"
"learning by analogy"
"associative learning"
"learning from observation and discovery"
"learning without a teacher"
"unsupervised learning"
"learning from examples"
"positive example"
"negative example"
"near-miss"
"credit/blame assignment"
"causal analysis"
"unit (in neural networks)"
"link (in neural networks)"
"stable coalition"
"hidden layer"
"back propagation"
"transfer function"
For example, a "neural network" or "connectionist network" is defined as
"A network of neuron-like processors each of which performs some simple
logical function, typically a logic threshold function. NOTE A neural
network completes a computation when its units have finished exchanging
messages and updating their potential, and settle into a stable state."
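[Editorial note: for readers unfamiliar with the "settling" behaviour that
definition alludes to, here is a small illustrative sketch -- my own, not
part of the proposal -- of a Hopfield-style network of binary threshold
units updated asynchronously until no unit changes state.]

```python
import numpy as np

# Illustrative only: binary threshold units exchange "messages"
# (weighted sums) and update until the network settles.
rng = np.random.default_rng(0)

# Symmetric weights with zero diagonal (Hopfield-style), which
# guarantees a stable state exists under asynchronous updates.
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

state = rng.choice([-1, 1], size=n)

changed = True
while changed:
    changed = False
    for i in range(n):  # asynchronous update, one unit at a time
        new = 1 if W[i] @ state >= 0 else -1
        if new != state[i]:
            state[i] = new
            changed = True

print(state)  # a stable state: no unit wants to flip
```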
A "hidden layer" is defined as "An object-oriented software layer which
contains the method of instruction delivery among different programs run
by different types of data. NOTE Every processor is told to block out any
program that does not apply to the data object stored in it. From the
user's point of view however it appears that different types of processors
run different programs."
My recommendation on this proposal to the TRW representative to this
standardization body is to vote no, since it is highly premature to
standardize terminology while the underlying concepts remain the subject
of such active research.
Cheers,
Harry Erwin
Internet: erwin@trwacs.fp.trw.com
------------------------------
Subject: Sheet of neurons simulation
From: fburton@nyx.cs.du.edu (Francis Burton)
Organization: University of Denver, Dept. of Math & Comp. Sci.
Date: 18 Feb 93 18:11:27 +0000
On behalf of a colleague, I am looking for software that can be used to
simulate large networks of connected neurons. The individual elements
would have fairly unsophisticated (possibly identical) input/output
properties. The topology of the network would be a flat sheet with random
local interconnections, but later he may want to extend it to several
layers. The program should run on a PC - preferably freeware, but he
would be willing to pay for a commercial product (though I don't imagine
there would be much of a market for such a program).
I suspect that typical programs for neural-nets are not well suited to
this particular problem -- please correct me if I am mistaken.
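[Editorial note: the kind of simulation being asked for can be sketched in
a few lines. Everything below -- sheet size, connection radius, probability,
and the sigmoid update rule -- is invented purely for illustration: a flat
sheet of identical simple units with random local interconnections.]

```python
import numpy as np

# Illustrative sketch of a flat sheet of identical units with
# random local interconnections; all parameters are invented.
rng = np.random.default_rng(1)

H, Wd = 20, 20   # sheet dimensions
R = 2            # connection radius ("local")
P = 0.3          # probability a nearby pair is connected

n = H * Wd
xs, ys = np.meshgrid(np.arange(Wd), np.arange(H))
pos = np.column_stack([xs.ravel(), ys.ravel()])

# Connect pairs of units within radius R with probability P.
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
weights = np.where((d > 0) & (d <= R) & (rng.random((n, n)) < P),
                   rng.normal(0.0, 0.5, (n, n)), 0.0)

act = rng.random(n)             # initial activity
for _ in range(10):             # synchronous updates, sigmoid units
    act = 1.0 / (1.0 + np.exp(-(weights @ act)))

print(act.reshape(H, Wd).round(2))
```

Extending this to several layers would amount to stacking such sheets and
adding a second weight matrix between consecutive sheets.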
Thank you for any pointers or advice.
Francis Burton Physiology, Glasgow University, Glasgow G12 8QQ, Scotland.
041 339 8855 x8085 | JANET: F.L.Burton@glasgow.ac.uk !net: via mcsun & uknet
"A horse! A horse!" | INTERNET: via nsfnet-relay.ac.uk BITNET: via UKACRL
------------------------------
Subject: Re: Sheet of neurons simulation
From: hunter@work.nlm.nih.gov (Larry Hunter)
Organization: National Library of Medicine
Date: 18 Feb 93 22:48:18 +0000
Francis Burton asks:
On behalf of a colleague, I am looking for software that can be used to
simulate large networks of connected neurons.
Well, there are many public domain (or nearly so) neural network
simulators out there that can do arbitrary topologies and update rules,
at least with a little bit of programming. IMHO, by far the best, both
in terms of what comes with the system and how easy it is to program to
meet specific needs, is the Xerion system from University of Toronto. It
has wonderful graphical interfaces (X windows) and runs on practically
any Unix/X platform. It was originally designed for use in machine
learning and on artificial neural nets, but I think it offers a good
possibility for adaptation to natural neural network simulation. Also,
the author of the program, Drew van Camp, is pretty accessible.
It is available by anonymous ftp from the host ai.toronto.edu in the
directory /pub/xerion.
Here's a snippet from the README file:
Xerion is a Neural Network simulator developed and used by the
connectionist group at the University of Toronto. It contains libraries of
routines for building networks, and graphically displaying them. As well
it contains an optimization package which can train nets using several
different methods including conjugate gradient. It is written in C and
uses the X window system to do the graphics. It is being given away free
of charge to Canadian industry and researchers. It comes with NO warranty.
This distribution contains all the libraries used to build the simulators
as well as several simulators built using them (Back Propagation,
Recurrent Back Propagation, Boltzmann Machine, Mean Field Theory, Free
Energy Manipulation, Kohonen Net, Hard and Soft Competitive Learning).
Also included are some sample networks built for the individual
simulators.
There are man pages for the simulators themselves and for many of the C
language routines in the libraries. As well, xerion has online help
available once the simulators are started. There is a tutorial on using
Xerion in the 'doc' directory.
I hope this does what you want.
Larry
Lawrence Hunter, PhD.
National Library of Medicine
Bldg. 38A, MS-54
Bethesda, MD 20894 USA
tel: +1 (301) 496-9300
fax: +1 (301) 496-0673
internet: hunter@nlm.nih.gov
encryption: PGP 2.1 public key via "finger hunter@work.nlm.nih.gov"
------------------------------
Subject: Re: Sheet of neurons simulation
From: senseman@lucy.brainlab.utsa.edu (David M. Senseman)
Organization: University of Texas at San Antonio
Date: 19 Feb 93 13:32:04 +0000
In article <HUNTER.93Feb18144818@work.nlm.nih.gov> Hunter@nlm.nih.gov writes:
>IMHO, by far the best, both in terms of what
>comes with the system and how easy it is to program to meet specific needs,
>is the Xerion system from University of Toronto.
The original posting asked for something to run "on a PC." This sounds
unlikely to run on a PC even if it were running an X server.
However, if you can get a hold of a UNIX based workstation, (Sparc, SGI,
HP, IBM, etc), you might want to check out the Caltech Neurosimulator
called "GENESIS". GENESIS also sports a very nice X-windows based
front-end called "XODUS" (what else :).
Unlike Xerion which was primarily designed for "non-biological" neural
networks (i.e. back-propagation, etc.), GENESIS was designed from the
beginning to model REAL neurons. In fact GENESIS has a group of commands
that generates "sheets of neurons" and synaptically connects them to
other sheets. Real HHK action potentials, calcium channels, dendritic
spines, etc., etc.
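[Editorial note: as a rough illustration of the kind of "real neuron" model
GENESIS is built around -- this is not GENESIS code, and the classic
squid-axon parameters are assumed -- a single-compartment Hodgkin-Huxley
cell can be integrated with forward Euler as follows.]

```python
import math

# Single-compartment Hodgkin-Huxley neuron, classic squid-axon
# parameters, forward Euler integration.  Illustrative only.
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3   # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387       # mV

def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + math.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)

V, m, h, n = -65.0, 0.05, 0.6, 0.32      # resting state
dt, I = 0.01, 10.0                       # ms step, uA/cm^2 injected
peak = -100.0
for _ in range(int(50 / dt)):            # 50 ms of simulated time
    INa = gNa * m**3 * h * (V - ENa)
    IK = gK * n**4 * (V - EK)
    IL = gL * (V - EL)
    V += dt * (I - INa - IK - IL) / C
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    peak = max(peak, V)

print(round(peak, 1))  # peak membrane voltage (mV); above 0 => spike
```

GENESIS scales this idea up to multi-compartment cells and whole sheets of
them with built-in channel and synapse objects.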
I'm at home so I don't have all the details here, but if any
one is interested, they can contact me by E-Mail.
Again this program MUST be run on a UNIX box that supports
X-Windows. If all you have is a PC, then this isn't for you.
David M. Senseman, Ph.D. | Imagine the Creator as a low
(senseman@lonestar.utsa.edu) | comedian, and at once the world
Life Sciences Visualization Lab | becomes explicable.
University of Texas at San Antonio | H.L. Mencken
------------------------------
Subject: NNET model choice.
From: "Don" <schiewer@pa310m.inland.com>
Date: Fri, 19 Feb 93 14:34:56 -0600
I need some help selecting a NNET model to use for a classification
problem which involves looking continuously at 20 thermocouples over a
period of 10-20 samples. The idea is to respond to a fault condition
(fault/no-fault).
I am considering Grossberg's STC (spatio-temporal classifier) model.
We will be implementing on NeuralWare's Neuralmaker PRO II.
Does anyone know of other models or have info on how best to make this
work?
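[Editorial note: one common baseline for such spatio-temporal fault
detection is to flatten the window of sensor readings into a single
feature vector and train an ordinary classifier. The sketch below uses
fabricated data and a single logistic unit purely to show the shapes
involved; nothing here is specific to the poster's plant or tooling.]

```python
import numpy as np

# Baseline sketch: flatten 20 channels x 10 samples into a 200-d
# feature vector, train a logistic unit on synthetic fault data.
rng = np.random.default_rng(2)
channels, window = 20, 10

def make_batch(size, fault):
    x = rng.normal(25.0, 1.0, (size, channels, window))  # baseline temps
    if fault:
        x[:, :5, :] += np.linspace(0, 8, window)  # drift on 5 channels
    return x.reshape(size, -1)                    # flatten the window

X = np.vstack([make_batch(100, False), make_batch(100, True)])
y = np.concatenate([np.zeros(100), np.ones(100)])  # 0=no-fault, 1=fault
mu, sd = X.mean(0), X.std(0) + 1e-9
X = (X - mu) / sd                                  # standardize features

w, b = np.zeros(X.shape[1]), 0.0
for _ in range(200):                               # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * (p - y).mean()

acc = (((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
print(acc)  # training accuracy on the synthetic data
```

A spatio-temporal model like the STC would instead process the samples as
a stream, but the flattened-window baseline is a useful sanity check.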
Thanks in advance.
Don Schiewer | Internet schiewer@pa881a.inland.com | Onward Great
Inland Steel | UUCP: !uucp!pa881a.inland!schiewer | Stream...
------------------------------
Subject: Handbook of Neural Algorithms
From: "Sean Pidgeon" <pidgeon@a1.relay.upenn.edu>
Date: Thu, 25 Feb 93 11:58:01 -0500
I would like to thank all those who took the trouble to respond to the
questionnaire posted in the 23 September 1992 issue by my colleague Tamara
Isaacs-Smith. The level of interest in our proposed Handbook has been
gratifying. A focus group was convened in Philadelphia on February 23 to
discuss the best way forward for the project, and our editorial plan is now
quite well developed.
All those interested in learning more about the Handbook project are
invited to contact me directly or to visit the IOP Publishing booth at the
World Congress on Neural Networks in Portland. Again, thanks for your
support.
------------------------------
Subject: COMPUTER STANDARDS & INTERFACES addendum
From: John Fulcher <john@cs.uow.edu.au>
Date: Fri, 26 Feb 93 13:55:36 -0500
COMPUTER STANDARDS & INTERFACES (North-Holland)
Forthcoming Special Issue on ANN Standards
ADDENDUM TO ORIGINAL POSTING
Prompted by enquiries from several people regarding my original Call for
Papers posting, I felt I should offer the following additional information
(clarification).
By ANN "Standards" we do not mean exclusively formal standards (in the ISO,
IEEE, ANSI, CCITT etc. sense), although naturally enough we will be
including papers on activities in these areas.
"Standards" should be interpreted in its most general sense, namely as
standard APPROACHES (e.g. the backpropagation algorithm & its many
variants). Thus if you have a paper on some (any?) aspect of ANNs,
provided it is prefaced by a summary of the standard approach(es) in that
particular area, it could well be suitable for inclusion in this special
issue of CS&I. If in doubt, post, fax, or email a copy by April 30th to:
John Fulcher,
Department of Computer Science,
University of Wollongong,
Northfields Avenue,
Wollongong NSW 2522,
Australia.
fax: +61 42 213262
email: john@cs.uow.edu.au
------------------------------
Subject: Position Available at JPL
From: Padhraic Smyth <pjs@bvd.Jpl.Nasa.Gov>
Date: Thu, 18 Feb 93 11:49:36 -0800
We currently have an opening in our group for a new PhD graduate
in the general area of signal processing and pattern recognition.
While the job description does not mention neural computation per
se, it may be of interest to some members of this mailing list.
For details see below.
Padhraic Smyth, JPL
RESEARCH POSITION AVAILABLE
AT THE
JET PROPULSION LABORATORY,
CALIFORNIA INSTITUTE OF TECHNOLOGY
The Communications Systems Research Section at JPL has an immediate
opening for a permanent member of technical staff in the area of
adaptive signal processing and statistical pattern recognition.
The position requires a PhD in Electrical Engineering or a closely
related field and applicants should have a demonstrated ability
to perform independent research.
A background in statistical signal processing is highly desirable.
Background in information theory, estimation and detection, advanced
statistical methods, and pattern recognition, would also be a plus.
Current projects within the group include the use of hidden Markov
models for change detection in time series, and statistical methods
for geologic feature detection in remotely sensed image data. The
successful applicant will be expected to perform both basic and
applied research and to propose and initiate new research projects.
Permanent residency or U.S. citizenship is not a strict requirement
- however, candidates not in either of these categories should be
aware that their applications will only be considered in
exceptional cases.
Interested applicants should send their resume (plus any supporting
background material such as recent relevant papers) to:
Dr. Stephen Townes
JPL 238-420
4800 Oak Grove Drive
Pasadena, CA 91109.
(email: townes@bvd.jpl.nasa.gov)
------------------------------
Subject: lectureship
From: Tony_Prescott <tony@aivru.shef.ac.uk>
Date: Fri, 19 Feb 93 10:59:46 +0000
LECTURESHIP IN COGNITIVE SCIENCE
University of Sheffield, UK.
Applications are invited for the above post tenable from 1st October 1993
for three years in the first instance but with expectation of renewal.
Preference will be given to candidates with a PhD in Cognitive Science,
Artificial Intelligence, Cognitive Psychology, Computer Science, Robotics,
or related disciplines.
The Cognitive Science degree is an integrated course taught by the departments
of Psychology and Computer Science. Research in Cognitive Science was highly
evaluated in the recent UFC research evaluation exercise, special areas of
interest being vision, speech, language, neural networks, and learning. The
successful candidate will be expected to undertake research vigorously.
Supervision of programming projects will be required, hence considerable
experience with Lisp, Prolog, and/or C is essential.
It is expected that the appointment will be made on the Lecturer A scale
(13,400-18,576 pounds(uk) p.a.) according to age and experience but enquiries
from more experienced staff able to bring research resources are welcomed.
Informal enquiries to Professor John P Frisby 044-(0)742-826538 or e-mail
jpf@aivru.sheffield.ac.uk. Further particulars from the director of Personnel
Services, The University, Sheffield S10 2TN, UK, to whom all applications
including a cv and the names and addresses of three referees (6 copies of all
documents) should be sent by 1 April 1993.
Short-listed candidates will be invited to Sheffield for interview for which
travel expenses (within the UK only) will be funded.
Current permanent research staff in Cognitive Science at Sheffield include:
Prof John Frisby (visual psychophysics),
Prof John Mayhew (computer vision, robotics, neural networks)
Prof Yorick Wilks (natural language understanding)
Dr Phil Green (speech recognition)
Dr John Porrill (computer vision)
Dr Paul McKevitt (natural language understanding)
Dr Peter Scott (computer assisted learning)
Dr Rod Nicolson (human learning)
Dr Paul Dean (neuroscience, neural networks)
Mr Tony Prescott (neural networks, comparative cog sci)
------------------------------
Subject: Industrial Position in Artificial Intelligence and/or Neural Networks
From: Jerome Soller <soller@asylum.cs.utah.edu>
Date: Fri, 19 Feb 93 14:09:43 -0700
I have just been made aware of a job opening in artificial
intelligence and/or neural networks in southeast Ogden, UT. This
company maintains strong technical interaction with existing industrial,
U.S. government laboratory, and university strengths in Utah. Ogden
is a half hour to 45 minute drive from Salt Lake City, UT.
For further information, contact Dale Sanders at 801-625-8343 or
dsanders@bmd.trw.com . The full job description is listed below.
Sincerely,
Jerome Soller
U. of Utah Department of Computer Science
and VA Geriatric, Research, Education and
Clinical Center
Knowledge engineering and expert systems development. Requires
five years formal software development experience, including two years
expert systems development. Requires experience implementing
at least one working expert system. Requires familiarity with expert
systems development tools and DoD specification practices. Experience with
neural nets or fuzzy logic systems may qualify as equivalent experience
to expert systems development. Familiarity with Ada, C/C++, database design,
and probabilistic risk assessment strongly desired. Requires strong
communication and customer interface skills. Minimum degree: BS in
computer science, engineering, math, or physical science. M.S. or Ph.D.
preferred. U.S. Citizenship is required. Relocation funding is limited.
------------------------------
Subject: lectureship in cognitive science
From: Martin Cooke <M.Cooke@DCS.SHEFFIELD.AC.UK>
Date: Tue, 23 Feb 93 12:54:29 +0000
To Dan: thanks, and all the best for the auditory list.
To the list: a job possibility
Martin
- ------------------------------
LECTURESHIP IN COGNITIVE SCIENCE
University of Sheffield, UK.
Applications are invited for the above post tenable from 1st October
1993 for three years in the first instance but with expectation of
renewal. Preference will be given to candidates with a PhD in
Cognitive Science, Artificial Intelligence, Cognitive Psychology,
Computer Science, Robotics, or related disciplines.
The Cognitive Science degree is an integrated course taught by the
departments of Psychology and Computer Science. Research in Cognitive
Science was highly evaluated in the recent UFC research evaluation
exercise, special areas of interest being vision, speech, language,
neural networks, and learning. The successful candidate will be
expected to undertake research vigorously. Supervision of programming
projects will be required, hence considerable experience with Lisp,
Prolog, and/or C is essential.
It is expected that the appointment will be made on the Lecturer A
scale (13,400-18,576 pounds(uk) p.a.) according to age and experience
but enquiries from more experienced staff able to bring research
resources are welcomed.
Informal enquiries to Professor John P Frisby 044-(0)742-826538 or
e-mail jpf@aivru.sheffield.ac.uk. Further particulars from the
director of Personnel Services, The University, Sheffield S10 2TN,
UK, to whom all applications including a cv and the names and
addresses of three referees (6 copies of all documents) should be
sent by 1 April 1993.
Short-listed candidates will be invited to Sheffield for interview
for which travel expenses (within the UK only) will be funded.
Current permanent research staff in Cognitive Science at Sheffield
include:
Prof J P Frisby (visual psychophysics),
Prof J E W Mayhew (computer vision, robotics, neural networks)
Prof Y Wilks (natural language understanding, from June 93)
Dr P D Green (speech recognition)
Dr J Porrill (computer vision)
Dr P McKevitt (natural language understanding)
Dr P Scott (computer assisted learning)
Dr R I Nicolson (human learning)
Dr P Dean (neuroscience, neural networks)
Dr M P Cooke (auditory modelling)
Dr G J Brown (auditory modelling)
Mr A J Prescott (neural networks, comparative cog sci)
------------------------------
Subject: Microsoft Speech Research
From: Xuedong Huang <xueh@microsoft.com>
Date: Tue, 23 Feb 93 22:19:47 -0800
As you may know, I've started a new speech group here at Microsoft. For
your information, I have enclosed the full advertisement we have been
using to publicize the openings. If you are interested in joining MS,
I strongly encourage you to apply and we will look forward to following
up with you.
- ------------------------------------------------------------
THE FUTURE IS HERE.
Speech Recognition. Intuitive Graphical Interfaces.
Sophisticated User Agents. Advanced Operating Systems.
Robust Environments. World Class Applications.
Who's Pulling It All Together?
Microsoft. We're setting the stage for the future of
computing, building a world class research group and
leveraging a solid foundation of object based technology
and scalable operating systems.
What's more, we're extending the recognition
paradigm, employing advanced processor and RISC-based
architecture, and harnessing distributed networks to
connect users to worlds of information.
We want to see more than just our own software
running. We want to see a whole generation of users
realize the future of computing.
Realize your future with a position in our
Speech Recognition group.
Research Software Design Engineers, Speech Recognition.
Primary responsibilities include designing and developing
User Interface and systems level software for an advanced
speech recognition system. A minimum of 3 years demonstrated
microcomputer software design and development experience
in C is required. Knowledge of Windows programming, speech
recognition systems, hidden Markov model theory, statistics,
DSP, or user interface development is preferred. A BA/BS
in computer science or related discipline is required. An
advanced degree (MS or Ph.D.) in a related discipline is
preferred.
Researchers, Speech Recognition.
Primary responsibilities include research on stochastic
modeling techniques to be applied to an advanced speech
recognition system. A minimum of 4 years demonstrated
research excellence in the area of speech recognition
or spoken language understanding systems is required.
Knowledge of Windows and real-time C programming for
microcomputers, hidden Markov model theory, decoder
systems design, DSP, and spoken language understanding
is preferred. A MA/MS in CS or related discipline is
required. A PhD degree in CS, EE, or related discipline
is preferred.
Make The Most of Your Future.
At Microsoft, our technical leadership and strong
Software Developers and Researchers stay ahead of the
times, creating vision and turning it into reality.
To apply, send your resume and cover letter, noting
"ATTN: N5935-0223" to:
Surface:
Microsoft Recruiting
ATTN: N5935-0223
One Microsoft Way
Redmond, WA 98052-6399
Email:
ASCII ONLY
y-wait@microsoft.com.us
Microsoft is an equal opportunity employer working to
increase workforce diversity.
------------------------------
Subject: Neural Computation & Cognition: Opening for NN Programmer
From: gluck@pavlov.rutgers.edu (Mark Gluck)
Date: Mon, 22 Feb 93 08:04:28 -0500
POSITION AVAILABLE: NEURAL-NETWORK RESEARCH PROGRAMMER
At the Center for Neuroscience at Rutgers-Newark, we have an opening
for a full or part-time research programmer to assist in developing
neural-network simulations. The research involves integrated
experimental and theoretical analyses of the cognitive and neural bases
of learning and memory. The focus of this research is on understanding
the underlying neurobiological mechanisms for complex learning
behaviors in both animals and humans.
Substantial prior experience and understanding of neural-network
theories and algorithms is required. Applicants should have a high
level of programming experience (C or Pascal), and familiarity with
Macintosh and/or UNIX. Strong English-language communication and
writing skills are essential.
*** This position would be particularly appropriate for a graduating
college senior who seeks "hands-on" research experience prior to
graduate school in the cognitive, neural, or computational sciences ***
Applications are being accepted now for an immediate start-date or for
starting in June or September of this year. NOTE TO N. CALIF.
APPLICANTS: Interviews for applicants from the San Francisco/Silicon
Valley area will be conducted at Stanford in late March. The
Neuroscience Center is located 20 minutes outside of New York City in
northern New Jersey.
For further information, please send an email or hard-copy letter
describing your relevant background, experience, and career goals to:
______________________________________________________________________
Dr. Mark A. Gluck
Center for Molecular & Behavioral Neuroscience
Rutgers University
197 University Ave.
Newark, New Jersey 07102
Phone: (201) 648-1080 (Ext. 3221)
Fax: (201) 648-1272
Email: gluck@pavlov.rutgers.edu
------------------------------
End of Neuron Digest [Volume 11 Issue 15]
*****************************************