Neuron Digest   Monday,  7 Jun 1993                Volume 11 : Issue 36 

Today's Topics:
Invitation to game (submitted to Neuron Digest)
Post-Doc in Neural Nets and Time Series
Neural net & fuzzy software (& reports) available
neural simulators for PC
Volunteers needed for ANN conference
FAQ: NN archive and ftp sites


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Invitation to game (submitted to Neuron Digest)
From: dawei@nsdssd.lbl.gov (Dawei Dong)
Date: Thu, 20 May 93 13:40:22 -0800

You are cordially invited to play a game!

The name of the game:

"Find Neutrinos in a Noisy Background".

In a physics experiment, neutrinos generate photons which are
detected by 10,000 phototubes on the surface of a sphere. The
pattern of such a neutrino event is the kind that human eyes can learn to
identify with some training. That is one of the conventional methods
used by physicists.

Recently, an artificial neural network has been developed, using the
organization principles found in the low-level mammalian visual system, to
perform the same pattern recognition task, with the actual 3D detector
geometry and the duration of the acceptance time window incorporated.

In order to compare human performance to the network's performance,
this game is designed to allow the human player to learn from the
same visual events that were used to train the network. It has been
played by more than ten physics Ph.D.s. Nobody has beaten the network
(yet)!

The related paper:

Neural Network for Recognizing Cerenkov Radiation Patterns

Dawei W. Dong and Yuen-Dat Chan

Nuclear Science Division
Lawrence Berkeley Laboratory
University of California
Berkeley, CA 94720

A three-layer feed-forward neural network is developed to distinguish
between true and false Cerenkov radiation signals that are present in
certain nuclear reactions. The network consists of an input layer of
phototubes on a spherical surface, two hidden layers of neurons, and
an output neuron. The first-layer connections are developed by
principal component analysis, i.e., the receptive field connections of
each neuron are one of the leading components of the input correlation
matrix. The connections of the second and third layers are learned
by back error propagation on the first layer's output. It is shown
that the temporal information is very important for identifying
different patterns and that the network performs the recognition task with
a high degree of accuracy.
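
A minimal sketch of the architecture described above, assuming invented
sizes and synthetic data (the real detector has roughly 10,000 phototubes;
the layer sizes, labels, and learning rate below are illustrative, not the
authors' configuration): a fixed first layer built from leading principal
components of the input correlation matrix, followed by two layers trained
with backpropagation on its output.

# Sketch only -- not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_pc, n_hidden = 200, 16, 8         # assumed sizes
X = rng.normal(size=(1000, n_inputs))         # placeholder "phototube" patterns
y = (X[:, :n_inputs // 2].sum(axis=1) > 0).astype(float)   # placeholder labels

# First layer: leading components of the input correlation matrix (fixed).
C = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
W1 = eigvecs[:, np.argsort(eigvals)[::-1][:n_pc]]          # n_inputs x n_pc

# Second and third layers: trained by backpropagation on the PCA output.
W2 = rng.normal(scale=0.1, size=(n_pc, n_hidden))
W3 = rng.normal(scale=0.1, size=(n_hidden, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(200):
    H1 = X @ W1                        # fixed PCA projection
    H2 = sigmoid(H1 @ W2)              # trainable hidden layer
    out = sigmoid(H2 @ W3).ravel()     # output neuron: true/false event score
    d_out = (out - y) * out * (1 - out)            # squared-error gradient
    d_H2 = (d_out[:, None] @ W3.T) * H2 * (1 - H2)
    W3 -= lr * H2.T @ d_out[:, None] / len(X)
    W2 -= lr * H1.T @ d_H2 / len(X)

print("training accuracy:", ((out > 0.5) == y).mean())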

To get the paper:

ftp nsdssb.lbl.gov
user anonymous
cd /pub/dawei
mget sno.{abs,dvi,ps}

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!BUT YOU DON'T HAVE TO READ THE PAPER OR UNDERSTAND THE PHYSICS BEHIND IT!!
!!Among the current top three human players, two did not read the paper and!!
!!did not even try to understand the physics behind it at all!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

To start the game:

yourmachine has to be running X11 with at least 16 RGB colors available.

xhost nsdssb.lbl.gov
telnet nsdssb.lbl.gov
login: sno
Password: nutrino
X11 Display: yourmachine:0.0
Player name: your name or alias

Maybe you can beat the network! Good luck!

Sincerely,

Dawei Dong

---------Interested to play the game? Some tips for you----------

How to play the game:

1) during learning or playing, an event is presented on the spherical
surface. You can choose a different viewing angle by typing "x", "y", or "z"
to view from the X, Y, or Z direction, or by using the left mouse button to
rotate around the Z-axis and the right mouse button to tilt the Z-axis
in/out. But the most important thing is to concentrate on the lower-left
corner of the window, where both front and back views of the event are plotted.

2) after you make up your mind, type "+" or "t" if you think it is a
true event, type "-" or "f" if you think it is a false event.

3) a new player needs to learn the game first (type "l" or use the mouse
selection to start learning the game). During learning, feedback is
provided. A beep means you just made a mistake; the event you saw before
is plotted again in the lower-right corner and labeled with its event type.
Another feedback is the statistics in the upper-right corner, such as
"E=4.0% T:120/127=94% F:168/173=97%", which means 4.0 percent total error
from getting 120 of 127 true events and 168 of 173 false events correct
(see the short sketch after these tips). But the most important thing is
to concentrate on minimizing the number of beeps.

4) type "p" or use mouse selection to start playing a game. During
playing, there is no feedback (unless you type "s" or use mouse
selection to show current game score, but this should be used as a
way to show your score at the end of the game. If you are not
confident enough to play the game, go to learn for some more events).

5) the neural network's performance is 2.7% error, which is better than the
best human player so far (4.0% error). If you achieve a good
performance, please do type "w" or use the mouse selection to write your
score for the record!

6) if you quit the game by typing "w" or "q" or using the mouse selection, a
full record of the game session will be saved in a file (if you want the
details of your learning and playing record, send me an email).
If you don't want to leave any record at all, type control-c in the
window where you started the game, or kill the game window with the X11
window manager.
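
The feedback statistics in tip 3 combine per-class counts into a single
error figure; this trivial sketch (using just the counts from the example
string) shows how the numbers fit together.

# Reproduce the example feedback string "E=4.0% T:120/127=94% F:168/173=97%".
true_correct, true_total = 120, 127
false_correct, false_total = 168, 173

errors = (true_total - true_correct) + (false_total - false_correct)
total = true_total + false_total

print(f"E={100 * errors / total:.1f}% "
      f"T:{true_correct}/{true_total}={100 * true_correct / true_total:.0f}% "
      f"F:{false_correct}/{false_total}={100 * false_correct / false_total:.0f}%")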

Have fun!


------------------------------

Subject: Post-Doc in Neural Nets and Time Series
From: John Moody <moody@chianti.cse.ogi.edu>
Date: Fri, 21 May 93 14:45:56 -0800


--------- POSITION AVAILABLE ----------

Adaptive Systems Research Group
Department of Computer Science and Engineering

Oregon Graduate Institute of Science & Technology


A post-doctoral position will be available for Fall 1993 to work
collaboratively on learning in dynamical contexts. The position will be
for one year with a possibility for renewal. The research involves the
application of neural networks and nonparametric statistical paradigms to
problems in time series analysis and forecasting. The work will likely
include theoretical analysis, development and implementation of new learning
algorithms, and empirical studies of algorithm performance on challenging
applications.

A Ph.D. in Computer Science, Electrical Engineering, Physics, Mathematics,
Statistics, or Economics, strong mathematical skills, and proficiency in a
UNIX/C programming environment are required. Experience with Mathematica,
S-Plus, object-oriented programming, or building large-scale software systems
is also helpful.

Interested applicants should send a letter describing their background
and interests, a CV, a few relevant publications, and the names of three
references (with addresses, phone numbers, and email addresses) to:

Prof. John Moody
Computer Science & Engineering
Oregon Graduate Institute
PO Box 91000
Portland, OR 97291-1000

moody@cse.ogi.edu
(503)690-1554

Email submissions of CV's, etc. are encouraged.

The Oregon Graduate Institute is an equal opportunity/ affirmative action
employer and encourages applications from qualified women and minorities.


+++++++++++++++++++++++++++++++++++++++++++++++++++++++

Oregon Graduate Institute of Science & Technology
Department of Computer Science and Engineering

Research Interests of Faculty in Adaptive & Interactive Systems
(Neural Networks, Learning, Speech, Language, Vision, and Control)




Etienne Barnard (Research Assistant Professor):

Etienne Barnard is interested in the theory, design and implementation
of pattern-recognition systems, classifiers, and neural networks. He is
also interested in adaptive control systems -- specifically, the design
of near-optimal controllers for real-world problems such as robotics.


Ron Cole (Professor):

Ron Cole is director of the Center for Spoken Language Understanding at
OGI. Research in the Center currently focuses on speaker-independent
recognition of continuous speech over the telephone and automatic language
identification for English and ten other languages. The approach combines
knowledge of hearing, speech perception, acoustic phonetics, prosody and
linguistics with neural networks to produce systems that work in the real
world.


Mark Fanty (Research Assistant Professor):

Mark Fanty's research interests include continuous speech recognition for
the telephone; natural language and dialog for spoken language systems;
neural networks for speech recognition; and voice control of computers.


Dan Hammerstrom (Associate Professor):

Based on research performed at the Institute, Dan Hammerstrom and
several of his students have spun out a company, Adaptive Solutions
Inc., which is creating massively parallel computer hardware for the
acceleration of neural network and pattern recognition applications.
There are close ties between OGI and Adaptive Solutions. Dan is
still on the faculty of the Oregon Graduate Institute and continues
to study next generation VLSI neurocomputer architectures.


Todd K. Leen (Associate Professor):

Todd Leen's research spans the theory of neural network models, architecture
and algorithm design, and applications to speech recognition. His theoretical
work is currently focused on the foundations of stochastic learning, while
his work on algorithm design is focused on fast algorithms for non-linear
data modeling.


Uzi Levin (Senior Research Scientist):

Uzi Levin's research interests include neural networks, learning systems,
decision dynamics in distributed and hierarchical environments, dynamical
systems, Markov decision processes, and the application of neural networks
to the analysis of financial markets.


John Moody (Associate Professor):

John Moody does research on the design and analysis of learning algorithms,
statistical learning theory (including generalization and model selection),
optimization methods (both deterministic and stochastic), and applications
to signal processing, time series, and finance.


David Novick (Assistant Professor):

David Novick conducts research in interactive systems, including
computational models of conversation, technologically mediated
communication, and human-computer interaction. A central theme of
this research is the role of meta-acts in the control of interaction.
Current projects include dialogue models for telephone-based
information systems.


Misha Pavel (Associate Professor, visiting from NYU and NASA Ames):

Misha Pavel does mathematical and neural modeling of adaptive behaviors
including visual processing, pattern recognition, visually guided motor
control, categorization, and decision making. He is also interested in
the application of these models to sensor fusion, visually guided
vehicular control, and human-computer interfaces.



------------------------------

Subject: Neural net & fuzzy software (& reports) available
From: "Uwe R. Zimmer, AG vP" <uzimmer@informatik.uni-kl.de>
Date: Tue, 25 May 93 15:29:10 +0000

- --- Neural net & fuzzy software (& reports) available! ---

We have installed an experimental FTP server, which will hold parts of our
current work in the form of software and technical reports.
As a first step we have published two of our projects: a neural fuzzy
decision system and an unsupervised clustering system.
For detailed information please see below.

Please send all comments, remarks, etc. to:
uzimmer@informatik.uni-kl.de

- --- What is published here? ---

As a result of our current projects (MOBOT, SPIN, ALBATROSS), our group
produces a large amount of code and programs, so we would like to share
some of the sources. We have put only those projects on our FTP server
which might be of some general interest beyond the scientific results
published in the associated papers.

- --- How to obtain the associated papers? ---

There is a common FTP server for the whole university on our campus, which
holds parts of our current work.
The FTP-information is:

University of Kaiserslautern FTP-Server is : ftp.uni-kl.de
Mode is : binary
Directory is : reports_uni-kl/computer_science/mobile_robots/...

Subdirectory is : 1993/papers
File name is : Zimmer.learning_surfaces.ps.Z

Subdirectory is : 1992/papers
File name is : Zimmer.rt_communication.ps.Z

Subdirectory is : 1991/papers
File names are : Edlinger.Pos_Estimation.ps.Z
Edlinger.Eff_Navigation.ps.Z
Knieriemen.euromicro_91.ps.Z
Zimmer.albatross.ps.Z

Submitted papers and technical reports may be found on:

FTP-Server is: ag_vp_file_server.informatik.uni-kl.de
Mode is : binary
Directory is : Neural_Networks/Reports

- --- What are the dedicated machines? ---

Most of our projects are written for the Apple Macintosh world, so the
ready-to-run programs will require a Macintosh! On the other hand, the
sources are written in Pascal, so if you like Pascal code you may benefit
from the source code even without a complete, running application.

- --- What sources are published? ---

We make only the "kernel" sources from the specific projects available,
because these might be of public interest. If you are interested in the
sources for user-interface handling, etc., please let me know.

- --- Is the project documentation in English? ---

Sorry - not for all of the projects. Some of the technical documentation is
in German!

-------------------------------------------
--- List of currently available projects ---
-------------------------------------------

- ---------------------------------------------------------------------------
- --- Neural Fuzzy Decision System (Joerg Bruske):
- ---------------------------------------------------------------------------

- --- Associated report is (english):

FTP-Server is: ag_vp_file_server.informatik.uni-kl.de
Mode is : binary
Directory is : Neural_Networks/Reports
File name is : Zimmer.NFDS.ps.Z

SPIN-NFDS
Learning and Preset Knowledge for Surface Fusion
- A Neural Fuzzy Decision System -

Jorg Bruske, Ewald von Puttkamer & Uwe R. Zimmer

The problem to be discussed in this paper may be characterized in short by
the question: "Do these two surface fragments belong together (i.e., belong
to the same surface)?" The presented techniques try to benefit
from some predefined knowledge as well as from the possibility to refine
and adapt this knowledge according to a (changing) real environment,
resulting in a combination of fuzzy decision systems and neural networks.
The results are encouraging (fast convergence speed, high accuracy), and
the model might be used for a wide range of applications. The general frame
surrounding the work in this paper is the SPIN project, where the emphasis
is on sub-symbolic abstractions based on a 3-d scanned environment.

- --- Source code and technical documentation (english):

FTP-Server is: ag_vp_file_server.informatik.uni-kl.de
Mode is : binary
Directory is : Neural_Networks/Software/Neural_Fuzzy_Decision

This documentation consists of five chapters:
In Chapter 1, the author presents his approach towards implementing fuzzy
decision systems (FDS) by means of neural nets, leading to his NFDS. In
order to train (optimize) the NFDS, a slightly modified version of the
backpropagation algorithm is introduced.
In Chapter 2, the FuzNet project and its modules are described in detail.
FuzNet implements the NFDS described in Chapter 1 on Apple Macintosh
computers and has been developed as an easily integrable SW component for
larger SW projects.
In Chapter 3, we will be concerned with the details of the integration of
FuzNet into other SW projects, taking SurfaceExtractor as an example.
However, the reader need not know the SurfaceExtractor project (which
currently is not supplied via ftp) in order to understand the details of
integrating FuzNet into their own projects.
In Chapter 4, the FuzTest application is described. FuzTest is a very
primitive application intended to familiarize the user with FuzNet.
In Chapter 5, the reader will find the syntax diagram for fuzzy data and
rule bases as accepted by FuzNet. The file "brakingFDS" contains such a
fuzzy data and rule base.
A reference list concerning literature about neural nets, fuzzy logic, and
neural fuzzy decision systems is appended to this documentation. In
particular, [Bruske93] is recommended for a detailed discussion of neural
fuzzy decision systems and [BruPuttZi93] as a short introduction to the NFDS
and one of its applications in the Research Group v. Puttkamer.
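
The chapters above describe the NFDS only at the level of a table of
contents. As a rough, generic illustration of the underlying idea -- fuzzy
membership functions and rule weights realized as network parameters and
tuned by gradient descent -- here is a minimal sketch; it is not Bruske's
NFDS or the FuzNet code, and the membership shapes, rule structure, toy
data, and training loop are all assumptions.

# Generic neural fuzzy decision sketch -- not the NFDS/FuzNet implementation.
import numpy as np

rng = np.random.default_rng(1)

# Two inputs (say, a distance and an angle between two surface fragments),
# one decision output ("belong together" vs. "do not").
X = rng.uniform(-1, 1, size=(500, 2))
y = (np.abs(X[:, 0]) + np.abs(X[:, 1]) < 1.0).astype(float)   # toy target

n_rules = 6
centres = rng.uniform(-1, 1, size=(n_rules, 2))   # rule centres (stand-in for preset knowledge)
widths = np.full((n_rules, 2), 0.5)               # membership widths
rule_w = rng.normal(scale=0.1, size=n_rules)      # rule -> decision weights

def forward(X):
    # Gaussian membership of each input to each rule, combined by product (fuzzy AND).
    d = (X[:, None, :] - centres) / widths
    firing = np.exp(-0.5 * d ** 2).prod(axis=2)    # (N, rules)
    return firing, 1.0 / (1.0 + np.exp(-(firing @ rule_w)))

lr = 0.5
for _ in range(500):
    firing, out = forward(X)
    grad_out = (out - y) * out * (1 - out)          # squared-error backprop signal
    rule_w -= lr * firing.T @ grad_out / len(X)     # adapt rule weights (simplest case)

_, out = forward(X)
print("training accuracy:", ((out > 0.5) == y).mean())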

- ---------------------------------------------------------------------------
- --- Dynamic unsupervised feature maps (Herman Keuchel):
- ---------------------------------------------------------------------------

- --- Associated report is (english):

FTP-Server is: ftp.uni-kl.de
Mode is : binary
Directory is : reports_uni-kl/computer_science/mobile_robots/1993/papers
File name is : Zimmer.learning_surfaces.ps.Z

SPIN - Learning and Forgetting Surface Classifications
with Dynamic Neural Networks

Herman Keuchel, Ewald von Puttkamer & Uwe R. Zimmer

This paper addresses the problem of adaptability over an infinite
period of time with regard to dynamic networks. A never-ending flow of
examples has to be clustered, based on a distance measure. The
developed model is based on the self-organizing feature maps of
Kohonen [6], [7] and some adaptations by Fritzke [3]. The problem
of dynamic surface classification is embedded in the SPIN project,
where sub-symbolic abstraction, based on a 3-d scanned environment,
is being done.
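
As a rough illustration of the kind of model referred to above -- an online,
Kohonen-style winner update combined with a Fritzke-style growth rule that
adds a cell where the accumulated error is largest -- here is a minimal
sketch; the growth criterion, rates, and data stream are invented for
illustration and do not reproduce the model described in the paper.

# Dynamic clustering sketch -- not the model of the paper.
import numpy as np

rng = np.random.default_rng(2)

dim = 4                                   # assumed dimensionality of a surface vector
cells = [rng.normal(size=dim)]            # start with a single reference vector
errors = [0.0]                            # accumulated quantisation error per cell
lr, grow_every = 0.05, 200

def stream():
    # A never-ending flow of example vectors drawn from a few noisy clusters.
    prototypes = rng.normal(size=(3, dim))
    while True:
        yield prototypes[rng.integers(3)] + 0.1 * rng.normal(size=dim)

for t, x in zip(range(5000), stream()):
    dists = [np.linalg.norm(x - c) for c in cells]
    win = int(np.argmin(dists))           # winner by the distance measure
    cells[win] += lr * (x - cells[win])   # move the winner towards the example
    errors[win] += dists[win] ** 2        # accumulate its quantisation error
    if (t + 1) % grow_every == 0:         # growth step: split near the worst cell
        worst = int(np.argmax(errors))
        cells.append(cells[worst] + 0.01 * rng.normal(size=dim))
        errors = [0.0] * len(cells)

print("number of cells after the run:", len(cells))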

- --- Source code and technical documentation (german):

FTP-Server is: ag_vp_file_server.informatik.uni-kl.de
Mode is : binary
Directory is : Neural_Networks/Software/Dynamic_Unsup_Feature_Map

NetSim

A network model for the classification of surfaces by neural
networks with a problem-adaptive cell structure

by Herman Keuchel

The network simulator NetSim and this documentation (i.e., what you find
here is only a part of the complete documentation, though the larger part)
were developed as part of a project whose goal was to determine to what
extent neural networks following Bernd Fritzke's model are suited to the
classification of surfaces. For a precise description of Bernd Fritzke's
model, see [Fritzke91a & Fritzke91b]. The network model is a further
development of one of the most successful models in the field of
unsupervised learning, the self-organising feature maps of T. Kohonen
[Kohonen84]. Kohonen uses a rigid cell structure with a fixed number of
cells. In Fritzke's model, the cell structure and the number of cells
adapt dynamically to the vectors to be learned. The number of cells
depends on the desired classification accuracy.

The surfaces to be learned are represented as m-dimensional vectors of
real numbers. The terms `surface` and `vector` are therefore often used
synonymously in the following. For detailed information about the vector
representation of surfaces, see [Zimmer91].

Chapter 1 explains some extensions of the model that became necessary in
order to adapt it to the conditions of the SPIN project [Zimmer91]. In
this project there is no separation into a learning phase and a working
phase; the two phases run in parallel during the entire runtime of the
system, so that the network is able to adapt to changing conditions at
any time.

Chapter 2 goes into more detail on the learning parameters, sensible
default values, and the effects of parameter changes. Chapter 3 presents
the simulator and its operation, and Chapter 4 describes the programming
interface of the simulation software. Finally, Chapter 5 discusses
problems that remain open and contains some concluding remarks.

The software was implemented on Apple Macintosh computers in THINK Pascal.
The network simulator NetSim requires a mathematical coprocessor. The
application has been tested and runs on the Macintosh II, IIxi, IIcx,
IIfx, and Quadra 950.


Questions, comments, and suggestions can be directed to

Herman Keuchel
Parkstrasse 27b
67655 Kaiserslautern

or to

Uwe R. Zimmer (see below).

- ----------------------------------------------------------------
Uwe R. Zimmer
University of Kaiserslautern - Computer Science Department
Research Group Prof. v. Puttkamer
6750 Kaiserslautern - Germany
P.O.Box: 3049 | Phone: +49 631 205 2624 | Fax: +49 631 205 2803


------------------------------

Subject: neural simulators for PC
From: "Roger Haggard" <RLH0750@tntech.edu>
Date: Tue, 25 May 93 16:07:03 -0600

[[Editor's Note: See last message for at least part of the answer... -PM]]

Does anyone know of an FTP site for neural simulation code and/or demos
for a PC? I'm new to this field and need some examples.

Thanks.

Dr. Roger L. Haggard, Assistant Professor
EE Department, Box 5004
Tennessee Tech University Email: RLH0750@TNTECH.EDU
Cookeville, TN 38505 Phone: (615)372-3453



------------------------------

Subject: Volunteers needed for ANN conference
From: barga@cse.ogi.edu (Roger S. Barga)
Organization: Oregon Grad. Inst. Computer Science and Eng., Beaverton
Date: 26 May 93 21:29:53 +0000


***************************************************************************
** CALL FOR VOLUNTEERS **

1993 IEEE WORLD CONFERENCE ON NEURAL NETWORKS

July 11 - July 15, 1993
Portland Convention Center
Portland, Oregon USA
***************************************************************************
Volunteer positions are available to help at the World Conference on Neural
Networks (WCNN) conference, to be held at the Portland Convention Center,
Portland, OR from July 11-July 15, 1993.

In exchange for 20 hours of work, volunteers will receive:

Full admittance and registration to all sessions of the conference
(excluding tutorial sessions).

Admittance to a tutorial session only if the volunteer is working that
session.

Complete set of proceedings.

A T-shirt that must be worn during work shifts (in order to make
volunteers visible).

The work shifts are in ~5 hour blocks; any combination of blocks totalling
about 20 hours can be used. The work shifts range from helping at the
registration desk, guarding the entrances to sessions, and serving as room
monitors (go-fers) in sessions, to packaging proceedings and helping with
the distribution of proceedings.

The conference begins Sunday, July 11 with the tutorial sessions and ends
July 15. We will be needing help starting Friday, July 9. We are currently
working out the final staffing requirements so information on exact schedules
and positions will be available soon.

If you are interested in signing up, please send me the following information:

Name (last name, first name)
Address
Country
phone number
email

Upon signing up, you will be sent a form with a more detailed description of
the positions and a request for shift and tutorial preferences. Sign-ups
will be based on the date of commitment. There will be no funding
available for volunteers' travel and lodging expenses. PLEASE RESPOND
TO:

barga@cse.ogi.edu OR

Roger S. Barga
Oregon Graduate Institute of Science and Technology
Department of Computer Science and Engineering
19600 N.W. Von Neumann Drive
Beaverton, OR 97006-1999


If you have further questions, please feel free to contact me.

Thank you,
Roger S. Barga
Volunteer Chair WCNN-93
barga@cse.ogi.edu

------------------------------

Subject: FAQ: NN archive and ftp sites
From: steriti@dragon.cpe.ulowell.edu (Ron Steriti)
Date: Thu, 27 May 93 12:13:23 -0500

[[ Editor's Note: hplpm.hpl.hp.com has not existed for over two years.
Cattell.psych.upenn.edu is the home of Neuron Digest and its archives
now. I therefore suggest caution in trying out some of these sites, but
many thanks to the compiler of this list. -PM ]]

FAQ Neural Network Archive Sites

This is a list of archive sites for neural network material.
Almost all use anonymous ftp
(log in as anonymous; the password is your email address).
The list is in alphabetical order based on email addresses.

Good luck, Ron Steriti steriti@dragon.ulowell.edu

- -------------------------------------------------------------------------

amazonas.cs.columbia.edu (128.59.16.72):
/pub/learning.ps

archive.cis.ohio-state.edu
Neuroprose:
/pub/neuroprose

b.gp.cs.cmu.edu (128.2.242.8)
Connectionists
/usr/connect/connectionists/archives and /bibliographies
Inquiries to: connectionists-request@cs.cmu.edu

archive.afit.af.mil
pub/NeuralGraphics Directory with Neural Network preprints
and Artificial NN Simulator Software

b.gp.cs.cmu.edu (or 128.2.242.8) Name: ftpguest Password: cmunix
/connectionists/archives

boulder.colorado.edu (128.138.240.1): /pub/generic-sources
PlaNet (SunNet)
(PlaNet updates nowadays from tutserver.tut.ac.jp)

beligica.stat.washington.edu (128.95.17.57):
???

cattell.psych.upenn.edu (128.91.2.173)
/pub/Neuron-Digest
Inquiries to: neuron-request@hplabs.hp.com
neuron-request@cattell.psych.upenn.edu
Contact: Peter Marvit

cochlea.hut.fi (130.233.168.48), ftp as 'anonymous': /pub/lvq-pak
lvq-pak

cogsci.indiana.edu (129.79.238.6):
Cognitive science papers, bibliographies, etc.

cs-archive.uwaterloo.ca (129.97.140.24)
Technical reports from the University of Waterloo
/cs-archive
Inquiries to: cs-archivist@cs-archive.uwaterloo.ca
compdoc-techreports-request@ftp.cse.ucsc.edu
(the latter is Richard Golding)

cs.ucsd.edu:
/pub/GAucsd: GAucsd12.sh.Z, GAguide12.ps.Z dpe-tr.ps.Z

cs.rochester.edu (192.5.53.209): /pub/simulator
RCS - Rochester Connectionist Simulator

crl.ucsd.edu (132.239.63.1):
/pub/neuralnets

dartvax.dartmouth.edu (129.170.16.4): /pub/mac
dartnet.sit.hqx - A simulator for Macintosh

external.nj.nec.com NEC Research Institute ftp archive
pub/giles/papers

funic.funet.fi
Finnish database (incl. Neuroprose, slower updates)
/pub/sci/neural
Contacts: maints@nic.funet.fi postmaster@nic.funet.fi


ftp.cse.ucsc.edu
Technical reports from the University of California Santa Cruz
/pub/tr
Inquiries to: compdoc-techreports-request@ftp.cse.ucsc.edu
(that is, Richard Golding)

ftp.sei.cmu.edu
Software Eng. Inst., Carnegie Mellon Univ.
/pub/documents

ftp-bionik.fb10.tu-berlin.de
pub/papers/Other

funic.funet.fi (128.214.6.100)
/pub/sci/neural/04Neural_FTP_Sites

genesis.cns.caltech.edu (131.215.135.64):
GENESIS - GEneral NEural network SImulation System
Register first with telnet to genesis.cns.caltech.edu, login as genesis

galba.mbfys.kun.nl (131.174.82.73):
ART1, ART2, Cognitron, Cascade Correlation, neuro-intro,
Quickprop, newton-backprop

hope.caltech.edu:
/pub/mackay Many articles, etc.

hplpm.hpl.hp.com (15.255.176.205) /pub
/pub/Neuron-Digest: Older issues of Neuron Digest
/pub/Neuron-Software: hst.README hst.tar.Z
mactivation.3.3.sit.hqx pdp.tar.Z

hustat.harvard.edu:
???

iraun1.ira.uka.de (129.13.10.90)
Neural simulators from Karlsruhe
/pub/neuron
Inquiries to: blaeser@i32fs2.ira.uka.de
Note: with ftp on iraun1 it is possible to read the READMEs
interactively, using: ls "-G filename"

ics.uci.edu
"UCI Repository Of Machine Learning Databases and Domain Theories"
/pub/machine-learning-databases
-- also has Fahlman's "connectionist benchmarks"
(nettalk, protein-folding, vowels, sonar).

ifi.informatik.uni-stuttgart.de (129.69.211.1): /pub/SNNS
SNNS, Stuttgart Neural Network Simulator, many kinds of nets, X11

iubio.bio.indiana.edu (129.79.1.101)
/biology, /molbio, /chemistry, /science


kauri.vuw.ac.nz:
/pub/ms-dos/AI

lip.ens-lyon.fr (140.77.1.11)
LIP-IMAG , ENS - Lyon
/pub/LIP
Note: README, lip.RR.contents, lip.TR.contents, lip.PhD.contents
better not to fetch anything between 8:00 and 18:00
e-mail: Catherine.Pequegnat@lip.ens-lyon.fr
or better: text-lip@lip.ens-lyon.fr

markov.stats.ox.ac.uk
pub/neural/papers

minster.york.ac.uk
reports

me.uta.edu (129.107.2.20):
/pub/neural/annsim


menaik.cs.ualberta.ca (129.128.4.241): /pub
atree* - An Adaptive Logic Network Kit (unix) + MSWindows 3.x version



pt.cs.cmu.edu
NN-Benchmark collection
/afs/cs/project/connect/bench


princeton.edu:
/pub/harnad: crain.bbs velmans.bbs greenfield.bbs
and more...

pt.cs.cmu.edu (128.2.254.155): /afs/cs/project/connect/code
Cascade-correlation sims: cascor1.lisp cascor1.c
Aspirin/MIGRAINE, a neural programming platform + graphical interface

peabody.llnl.gov:
/neural *something weird in here !!! Everything invisible!*

polaris.cognet.ucla.edu (128.97.50.3): /alexis
am* - Aspirin/MIGRAINES



retina.cs.ucla.edu (131.179.16.6), username "sfinxftp", password "joshua"
/pub/README, sfinx_v2.0.tar.Z
*** For some reason, the login fails ***

svr-ftp.eng.cam.ac.uk
Cambridge University
Engineering Department, Trumpington Street, Cambridge CB2 1PZ, England
/reports
Note: ABSTRACTS file

tut.cis.ohio-state.edu 128.146.8.60:
/pub/condela/condela.tar

tutserver.tut.ac.jp (133.15.240.3), ftp as 'anonymous': /pub/misc
PlaNet5.7.tar.Z

xanth.cs.odu.edu (128.82.8.1):
*** Where's all the promised neural stuff? ***

yallara.cs.rmit.edu.au:
*** doesn't connect! ***

(16.1.0.2):
/pub/DEC
NeurDS - Simulator for DEC systems supporting VT100 terminals

128.123.001.032:
/gm (???)

129.26.8.90: /pub/gmd
Extension of the PDP simulator to the Connection Machine CM-2
a few papers


------------------------------

End of Neuron Digest [Volume 11 Issue 36]
*****************************************
