Neuron Digest Wednesday, 9 Jun 1993 Volume 11 : Issue 37
Today's Topics:
Neural Networks for Image Enhancement and Image Compression
Neuron Digest V11 #32 - public domain
NN in EXCEL
Kolmogorov's Theorem
Free 'NevProp' BP Software
problem with NN programs written in C++
conferences calendar available in Neuroprose archive
LETs FACE CHAOS through NON-LINEAR DYNAMICS - Summer School-one more try
re: commercial software
Neural Networks in C++
Adaptive Simulated Annealing (ASA) Version 1.26
Request for revenue figures on NN in financial markets
Quick survey: Cascor and Quickprop
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Neural Networks for Image Enhancement and Image Compression
From: Wakarewarewa-I <SXV8812@ritvax.isc.rit.edu>
Date: Thu, 27 May 93 12:43:47 -0500
Hi,
I would like to find some information on the applications
of neural networks to Image Enhancement and Image Compression. If there
are books, journals, papers, etc. in this field, please let me know about
them.
Thank you for your time,
Shyam
sxv8812a@ultb.isc.rit.edu
Ph.D. Imaging Science
Rochester Institute of Technology
------------------------------
Subject: Neuron Digest V11 #32 - public domain
From: herrell@cps.msu.edu
Date: Fri, 28 May 93 11:56:21 -0500
Sorry. One more try.
--------------------------------------------------------
Does anyone know where I can FTP a good readable
C++ Neural Network Simulation? I don't need anything
fancy, just something that I can understand and
build on.
Sincerely,
Richard Herrell
------------------------------
Subject: NN in EXCEL
From: Mary Scott <71270.465@CompuServe.COM>
Date: 31 May 93 08:02:50 -0500
I've been working with a neural net from Neural Network PC Tools - A
Practical Guide, by Eberhart and Dobbins (Chapter 12), on predicting the
futures market. The model was developed by Thomas Zaremba. I have it
coded up in Excel as explained in the chapter, but it seems to have a
problem. I am looking for others who may have gotten it to work, or for
Thomas Zaremba himself, to get comments. Any help and/or suggestions will
be appreciated.
Thanks very much,
M. Scott
------------------------------
Subject: Kolmogorov's Theorem
From: K Maguire <psnkm1@stirling.ac.uk>
Date: Tue, 01 Jun 93 06:31:10 +0000
Does anyone out there have anything on the application of Kolmogorov's
Theorem to real-world problems? Does anyone actually use a NN to
approximate a function, in the mathematical sense? Has anyone
even tried, say, for 'Special Functions'?
Kevin Maguire < psnkm1@uk.ac.stir.forth >
------------------------------
Subject: Free 'NevProp' BP Software
From: goodman@unr.edu (Phil Goodman)
Date: Tue, 01 Jun 93 18:02:50 +0000
If you feel this is appropriate, please include it as an announcement.
- - - - - - - -
******* FREE 'NevProp' GENERAL PURPOSE BACK-PROPAGATION SOFTWARE *******
o VERSION 1.15: UNIX (C code), DOS & Macintosh (executable)
o SUITABLE FOR BOTH NOVICE AND EXPERT NEURAL NET USERS
NevProp is a general purpose back-propagation program written in C for
UNIX, Macintosh, and DOS. The original version was Quickprop 1.0 by
Scott Fahlman, as translated from Common Lisp into C by Terry Regier.
The quickprop algorithm itself was not substantively changed, but we
added options to force plain gradient descent (per-epoch or per-pattern),
along with generalization testing, automatically stopped training, the
c index, and interface enhancements.
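For readers unfamiliar with the per-epoch versus per-pattern distinction,
here is a minimal, generic C++ sketch of the two update schemes on a single
toy weight. It is illustrative only and is not NevProp source code; all
names are invented.

#include <cstdio>
#include <vector>

int main() {
    /* Toy data: learn a single weight w so that w*x approximates t. */
    std::vector<double> x, t;
    x.push_back(1.0); t.push_back(2.0);
    x.push_back(2.0); t.push_back(4.0);
    x.push_back(3.0); t.push_back(6.0);
    double lr = 0.05;                                 /* learning rate */

    /* Per-epoch (batch) updating: accumulate the error gradient over
       all patterns, then apply one weight change per epoch.          */
    double w_epoch = 0.0;
    for (int epoch = 0; epoch < 200; ++epoch) {
        double grad = 0.0;
        for (int p = 0; p < (int)x.size(); ++p)
            grad += (w_epoch * x[p] - t[p]) * x[p];   /* dE/dw for pattern p */
        w_epoch -= lr * grad;
    }

    /* Per-pattern (stochastic) updating: change the weight immediately
       after each pattern is presented.                                */
    double w_pattern = 0.0;
    for (int epoch = 0; epoch < 200; ++epoch)
        for (int p = 0; p < (int)x.size(); ++p)
            w_pattern -= lr * (w_pattern * x[p] - t[p]) * x[p];

    std::printf("per-epoch w = %f   per-pattern w = %f\n", w_epoch, w_pattern);
    return 0;
}

Both schemes converge to w = 2 here; with larger learning rates the summed
per-epoch gradient takes bigger steps and can overshoot, which is one reason
a per-pattern option can be useful.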
Phil Goodman (goodman@unr.edu)
David Rosen
Allen Plummer
University of Nevada Center for Biomedical Modeling Research
Washoe Medical Center, H-166, 77 Pringle Way, Reno, NV 89520
702-328-4867 FAX: 328-4111
************************************************************************
FEATURES of NevProp version 1.15
o UNLIMITED (except by machine memory) number of input PATTERNS;
o UNLIMITED number of input, hidden, and output UNITS;
o Arbitrary CONNECTIONS among the various layers' units;
o Clock-time or user-specified RANDOM SEED for initial random weights;
o Choice of regular GRADIENT DESCENT or QUICKPROP;
o Choice of LOGISTIC or TANH activation functions;
o Choice of PER-EPOCH or PER-PATTERN (stochastic) weight updating;
o GENERALIZATION to a test dataset;
o AUTOMATICALLY STOPPED TRAINING based on generalization;
o RETENTION of best-generalizing weights and predictions;
o Simple but useful bar GRAPH to show smoothness of generalization;
o SAVING of results to a file while working interactively;
o SAVING of weights file and reloading for continued training;
o PREDICTION-only on datasets by applying an existing weights file;
o In addition to RMS error, the concordance, or c index, is displayed.
The c index shows the correctness of the RELATIVE ordering
of predictions AMONG the cases; i.e., it considers all possible PAIRS
of vectors. This statistic is identical to the area under the
receiver operating characteristic (ROC) curve, widely used in
technology assessment. (See below for more info.)
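As a rough illustration of how such a concordance index can be computed for
binary targets, here is a small generic C++ sketch (again, not NevProp's own
implementation; names are illustrative):

#include <cstdio>
#include <vector>

/* Fraction of (positive, negative) pairs in which the positive case
   received the higher prediction; tied predictions count one half.  */
double c_index(const std::vector<double>& pred, const std::vector<int>& target) {
    double concordant = 0.0;
    long pairs = 0;
    for (int i = 0; i < (int)pred.size(); ++i)
        for (int j = 0; j < (int)pred.size(); ++j)
            if (target[i] == 1 && target[j] == 0) {
                ++pairs;
                if (pred[i] > pred[j])       concordant += 1.0;
                else if (pred[i] == pred[j]) concordant += 0.5;
            }
    return pairs > 0 ? concordant / pairs : 0.0;
}

int main() {
    std::vector<double> pred;  std::vector<int> target;
    pred.push_back(0.9); target.push_back(1);
    pred.push_back(0.7); target.push_back(0);
    pred.push_back(0.4); target.push_back(1);
    pred.push_back(0.2); target.push_back(0);
    std::printf("c index = %f\n", c_index(pred, target));  /* 3 of 4 pairs -> 0.75 */
    return 0;
}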
************************************************************************
.- - - - - - - - - - - - - - - QUICKSTART - - - - - - - - - - - - - - .
| |
| UNIX: |
| 1. ftp the 'npxxx.shar' file (xxx depends on current version) |
| 2. type 'sh npxxx.shar' (creates 'NevProp' directory - cd to it) |
| 3. type 'make' (should create an executable 'np' program) |
| 4. type 'np' |
| 5. at prompt for .net filename, type 'iris' (supplied) |
| 6. explore! |
| |
| MAC & PCs: |
| 1. execute the program (assumes you got it somehow) |
| 2. at prompt for .net filename type 'iris' (supplied) |
| 3. explore! |
._ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ .
3 SOURCES FOR NevProp:
1. The most updated version of NevProp will be made available by
anonymous ftp from the University of Nevada, Reno:
a. Create an FTP connection from wherever you are to machine
"unssun.scs.unr.edu". The internet address of this machine
is 134.197.10.128 for those who need it.
b. Log in as user "anonymous" with your own ID as password.
c. Change remote directory to "pub/goodman/nevpropdir".
d. At this point FTP should be able to get a listing of files
in this directory with "dir" and fetch the ones you want with "get".
(The exact FTP commands depend on your local FTP server.)
2. NevProp and the original Quickprop are available as C code
by anonymous ftp from CMU:
a. Create an FTP connection from wherever you are to machine
"ftp.cs.cmu.edu". The internet address of this machine is
128.2.206.173, for those who need it.
b. Log in as user "anonymous" with your own ID as password.
You may see an error message that says "filenames may not
have /.. in them" or something like that. Just ignore it.
c. Change remote directory to "/afs/cs/project/connect/code".
NOTE: You must do this in a single operation. Some of the
super directories on this path are protected against outside
users.
d. At this point FTP should be able to get a listing of files
in this directory with "dir" & fetch the ones you want with "get".
(The exact FTP commands depend on your local FTP server.)
3. MACINTOSH and DOS versions of NevProp 1.xx:
May be available at the UNR anonymous ftp site above. If you are unable
to transfer the files, just send a formatted 3.5 inch floppy disk
(specifying Mac or DOS) with a self-addressed stamped return
envelope to Phil Goodman at the CBMR address above, including
a brief description of your research.
************************************************************************
Many happy iterations,
Phil
___________________________
___________________________ Phil Goodman,MD,MS goodman@unr.edu
| __\ | _ \ | \/ || _ \ Associate Professor & CBMR Director
|| ||_// ||\ /||||_// Cardiovascular Studies Team Leader
|| | _( || \/ ||| _(
||__ ||_\\ || |||| \\ CENTER for BIOMEDICAL MODELING RESEARCH
|___/ |___/ || |||| \\ University of Nevada School of Medicine
Washoe Medical Center H166, 77 Pringle Way,
Reno, NV 89520 702-328-4867 FAX:328-4111
------------------------------
Subject: problem with NN programs written in C++
From: BKIRKLAND@rivendell.otago.ac.nz
Date: Wed, 02 Jun 93 16:46:00 +1200
Dear Sir/Madam,
I currently have some neural network programs on my computer. They are
copied from the disk included with Adam Blum's "Neural Networks in C++: An
Object-Oriented Framework for Building Connectionist Systems".
I decided to choose the digit recognition application (TESTIMAG) as that
seemed to be relevant to the task of recognising written/typed numbers on
tax returns which the New Zealand tax department is looking at.
I am doing a fourth year dissertation titled "Imaging systems in tax
administration" for my commerce with honours degree (I already have a
bachelor of science degree). The dissertation focuses on the
feasibility of an imaging system being used in the revenue collection
process.
Problem:
========
I am encountering some problems when attempting to run TESTIMAG.EXE. It
initially reported corrupt-scanline errors, but I remedied those by
replacing the corrupt .PCX files with new, non-corrupted ones.
I have successfully run TESTIMAG using the IMAGECPN.FCT file, but I think
this is not the right .FCT file for this application (digit recognition).
I am afraid I am still learning C++ (I've had 7-8 months with the
language) and am even less familiar with neural networks!
I cannot find the CVTPCX facility for converting .PCXs to a .FCT file.
There is IMAGECPN.FCT for the IMAGECPN application but there is no
TESTIMAG.FCT for the TESTIMAG application.
Can anyone send me instructions that would enable me to run TESTIMAG.EXE
properly with the right data files, e.g. the right .FCT file? I'd
appreciate it very much!
You can send such material to me at the email address shown at the bottom of
this message.
Thank you for your attention,
Yours faithfully,
Barry Kirkland, B.Sc.
BKIRKLAND@OTAGO.AC.NZ
------------------------------
Subject: conferences calendar available in Neuroprose archive
From: fmurtagh@eso.org
Date: Thu, 03 Jun 93 09:42:02 +0100
FTP-host: archive.cis.ohio-state.edu
FTP-file: pub/neuroprose/murtagh.calendar.txt.Z
The file murtagh.calendar.txt.Z is available for copying from the Neuroprose
repository.
It is a CALENDAR of forthcoming conferences and workshops in the neural net
and related fields. It is about 1300 lines in length, consists of brief
details (date, title, location, contact), and is valid from mid-May 1993
onwards. The intention is to update it in about 3 months.
F. Murtagh (fmurtagh@eso.org)
------------------------------
Subject: LETs FACE CHAOS through NON-LINEAR DYNAMICS - Summer School-one more try
From: summerschool.chaos@uni-lj.si
Date: Thu, 03 Jun 93 17:43:24 +0100
INTERNATIONAL SUMMER SCHOOL AT THE UNIVERSITY OF LJUBLJANA
"LETs FACE CHAOS through NON-LINEAR DUNAMICS"
September 26 - October 4, 1993 Ljubljana and Portoroz, Slovenia
***** PLEASE DISTRIBUTE *****
DEAR FRIEND,
I am sure you are wondering:
"WHAT'S IN IT FOR ME?"
If you are already working in this field, you can present your work
in a lecture or prepare a workshop. In that case, please send us the
material you would like to lecture on, your CV, and a bibliography as
soon as possible. All the material presented in the summer school's
lectures and workshops will be printed in a book that will
be available two months after the end of the summer school.
If you are an undergraduate or graduate student, the knowledge you
can acquire by taking part in our summer school is useful in almost
any field in which you are majoring.
In addition, facing chaos through non-linear dynamics is not as
distant from our senses as are the other two scientific revolutions
of the twentieth century - the theory of relativity and quantum
mechanics.
As you will learn if you join us, all you need is pencil and
paper or a computer, and of course, KNOWLEDGE. The latter will
surely be enriched: for this very purpose, experts from all
around the world have been invited to lecture to us!
"ANY SOCIAL EVENTS?"
The first day's lectures will take place in Portoroz, on the
Adriatic coast. On the way back to Ljubljana, where the rest of
the program continues, we will stop at the Lipica horse stables
and Postojna Cave. We have also prepared a tour through Ljubljana,
the capital of Slovenia.
Further special events will take place in the evenings: concerts,
plays, etc. There will also be a weekend away to give you a
chance to get to know the Alpine region of Slovenia.
"AND HOW TO APPLY?"
NOTE: THE APPLICATION DEADLINE IS JULY 26, 1993!
DO NOT HESITATE TO CONTACT US IF YOU NEED ANY FURTHER INFORMATION,
AND THEN JOIN US IN SEPTEMBER AT OUR SUMMER SCHOOL HERE IN LJUBLJANA
AND PORTOROZ.
WE WILL MAKE SURE THAT THIS SUMMER SCHOOL IS A MEMORABLE EDUCATIONAL
EVENT WITH LOTS OF FUN!
YOUR ORGANISING COMMITTEE!
PRELIMINARY PROGRAMME:
1. INTRODUCTION
Synergetic approach to self-organising systems
Discrete versus continuous representation of dynamic systems
Mathematical background
Physical background
Open systems
Information dynamics
Dissipative systems
Evolutionary approach
Fractal graphics - geometry of chaos
2. APPLICATIONS
Qualitative and quantitative analysis of time series
Modelling and simulation of system dynamics
Qualitative modelling - problems and perspectives
Artificial intelligence and system dynamics
Prediction of chaotic dynamics with neural networks
Applications in:
Engineering:
- Electrical circuits
- Chemical reactions
- Architecture
- Working processes
- Meteorology
Physiology:
- EEG
- EKG
- Blood flow
- Fractal development of lung capillaries
- Ion channels
- Calcium oscillations through the membrane
Physics:
- Quantum physics
- Fluid dynamics
- Plasma
Ecological modelling
Economics:
- Share prices
- Financial systems
GENERAL INFORMATION ABOUT THE SUMMER SCHOOL
Participants: Undergraduate and postgraduate students
and others interested in the topic.
Educational requirements:
Basic knowledge of differential equations desired.
Theory & Applications:
The participants will be given the opportunity to
extend the theoretical knowledge acquired
at the lectures through practical work at workshops
which are also a part of the summer school programme.
Visits:
- Laboratories at the University of Ljubljana
- Jozef Stefan Institute
- Some sponsoring companies.
Certificate of Attendance: students will receive a
Certificate of Attendance.
Book: A book on the summer school topic will be available
two months after the end of the summer school.
Scientific advisers:
Dr. Aneta Stefanovska, Faculty of Electrical and
Computer Engineering, University of Ljubljana
Prof. Dr. Marko Robnik, Center for Theoretical
Physics and Applied Mathematics, University of
Maribor
Prof. Dr. Igor Grabec, Faculty of Mechanical
Engineering, University of Ljubljana
ORGANISING COMMITTEE:
Maja Malus, President
Matija Golner, Alenka Kavkler, Suzana Domjan
Mateja Forstnaric, Anton Kos, Marko Krek
Peter Groselj, Alenka Lamovec, Spela Nardoni
Natasa Petre, Martin Raic, Vlado Stankovski
Peter Ribaric, Alexander Simonic, Robert Zerjal
DEPARTURE NOTE
(CAN BE HANDED TO US UPON YOUR ARRIVAL!)
Departure date:______________________________(dd-mm-yy)
Time:________________________________________
From: Ljubljana AIRPORT
Ljubljana TRAIN station
Ljubljana BUS station
Do you need to confirm your flight: YES NO
Deadline:____________________________________
APPLICATION FORM
1. Name:____________________
2. Surname:____________________
3. Home address or mailing address:
__________________
__________________
__________________
__________________
4. Phone number:_____________________
5. Fax:____________________
6. E-mail:____________________
7. Country:_____________ 8. Passport number:___________
9. Sex: F M 10. Birth date:__________
11. University:_________________________________
12. Field of study:______________________________
13. Year :____________
14. Would you prefer lodging with the family of one of the students on
our Organising Committee (IN THIS CASE THE
APPLICATION FEE IS 100 ECU INSTEAD OF 150 ECU):
HOTEL HOME
15. Are you vegetarian: YES NO
16. Are you a Smokin' Joe: YES NO
17. Other special wishes, e.g. visa requirements, etc.:
PLEASE INCLUDE A NOTE OF A FEW LINES
EXPLAINING WHY YOU ARE APPLYING!
YOU CAN COVER THE REGISTRATION FEE ON ARRIVAL!
ARRIVAL NOTE
1. Name:___________________________________
2. Country:_________________________________
3. Arrival date:________________ (dd-mm-yy)
4. Arrival time:________________
5. At: Ljubljana AIRPORT
Ljubljana BUS STATION
Ljubljana TRAIN STATION
OUR OFFICE
6. By: Plane Flight No:____________
Train
Bus
Car
Bike
OUR ADDRESS:
IAESTE LC Ljubljana and BEST Ljubljana
Mednarodna pisarna SOU
Kersnikova 4
61000 Ljubljana
Tel.: + 38 61 318 564
Fax: + 38 61 319 448
E_mail: summerschool.chaos@uni-lj.si
------------------------------
Subject: re: commercial software
From: David Bradbury <D.C.Bradbury@open.ac.uk>
Date: 04 Jun 93 10:43:42 +0800
First of all, many thanks to all of you who sent me details about public
domain neural net simulators.
Now on to the serious stuff: my department is looking to buy a commercial
neural net simulator package. Can anybody give me any ideas about where
to get details of such packages, or recommend one in particular?
Thanks!
David (d.c.bradbury@open.ac.uk)
------------------------------
Subject: Neural Networks in C++
From: Rickardo Benn <rbenn@class.gsfc.nasa.gov>
Date: Fri, 04 Jun 93 09:29:40 -0500
We are trying to find neural network software written
in C++. We are especially interested in seeing a system where
individual neural network software modules can be interchanged.
We are currently trying to construct a neural network
to suit our applications.
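To make the idea of interchangeable modules concrete, a minimal C++ sketch
might look like the following; the class names are hypothetical and not
taken from any existing package.

#include <cmath>
#include <cstdio>
#include <vector>

/* Common interface: any module that maps a vector to a vector in place. */
class Layer {
public:
    virtual void forward(std::vector<double>& v) = 0;
    virtual ~Layer() {}
};

class SigmoidLayer : public Layer {              /* one interchangeable module */
public:
    void forward(std::vector<double>& v) {
        for (int i = 0; i < (int)v.size(); ++i)
            v[i] = 1.0 / (1.0 + std::exp(-v[i]));
    }
};

class TanhLayer : public Layer {                 /* a drop-in replacement */
public:
    void forward(std::vector<double>& v) {
        for (int i = 0; i < (int)v.size(); ++i)
            v[i] = std::tanh(v[i]);
    }
};

int main() {
    std::vector<Layer*> net;                     /* the network is a pipeline of modules */
    net.push_back(new SigmoidLayer());
    net.push_back(new TanhLayer());              /* swap modules without touching the rest */
    std::vector<double> x(2);
    x[0] = 0.5;  x[1] = -1.0;
    for (int i = 0; i < (int)net.size(); ++i) net[i]->forward(x);
    std::printf("output: %f %f\n", x[0], x[1]);
    for (int i = 0; i < (int)net.size(); ++i) delete net[i];
    return 0;
}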
Any example systems that have been coded using C++ would be
appreciated. Thank you in advance for addressing these issues.
Please send all responses to:
<rbenn@class.gsfc.nasa.gov> or
<jim@class.gsfc.nasa.gov>
------------------------------
Subject: Adaptive Simulated Annealing (ASA) Version 1.26
From: Lester Ingber <ingber@alumni.cco.caltech.edu>
Date: Sun, 06 Jun 93 12:27:29 -0800
========================================================================
Adaptive Simulated Annealing (ASA) Version 1.26
To get on or off blind-copy ASA e-mailings, just send an e-mail to
ingber@alumni.caltech.edu with your request.
________________________________________________________________________
Since the last announcement of version 1.9 on 14 May 93, some new
algorithms have been added. Two new Program Options, QUENCH_PARAMETERS
and QUENCH_COST, permit some "quenching." This can be useful together
with the SELF_OPTIMIZE Program Option, especially in large parameter
spaces. E.g., you can first determine a good set of Program Options
for a sample dimension, and then use these with the QUENCH options
in the larger space. The README[.ps] file explains these options.
In the draft of "Simulated annealing: Practice versus theory,"
sa_pvt.ps.Z in the archive, the SELF_OPTIMIZE, QUENCH_[] and
ACTIVATE_REANNEAL Program Options are applied to the difficult test
problem in the code, for dimensions n=4 and n=8, containing 10^(5n)
minima. Relative to previously published ASA/VFSR studies that were
faster and more accurate than other global optimization algorithms,
it is demonstrated how the use of these options can speed up the
search (number of cost_function calls) by as much as a factor of 50,
without losing accuracy in finding the global minimum.
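For readers new to the term, "quenching" in generic simulated-annealing
usage means accelerating the cooling schedule, trading the theoretical
convergence guarantee for speed. The following is a simplified, stand-alone
C++ illustration of that idea; it is not the ASA code and does not reproduce
the actual QUENCH_PARAMETERS/QUENCH_COST semantics.

#include <cmath>
#include <cstdio>
#include <cstdlib>

/* Toy multimodal cost function; its global minima lie near x = +/-1. */
double cost(double x) { return x * x + 10.0 * std::cos(3.0 * x); }

double unit_rand() { return (double)std::rand() / RAND_MAX; }

int main() {
    std::srand(12345);
    double x = 5.0, best = x;
    double T0 = 10.0;
    double quench = 2.0;   /* quench > 1 steepens the cooling exponent */
    for (int k = 1; k <= 20000; ++k) {
        double T = T0 * std::exp(-0.001 * std::pow((double)k, quench * 0.5));
        double cand = x + (unit_rand() - 0.5) * 4.0;        /* random step */
        double dE = cost(cand) - cost(x);
        /* Accept downhill moves always, uphill moves with prob. exp(-dE/T). */
        if (dE < 0.0 || std::exp(-dE / T) > unit_rand())
            x = cand;
        if (cost(x) < cost(best)) best = x;
    }
    std::printf("best x = %f   cost = %f\n", best, cost(best));
    return 0;
}

Setting quench back to 1.0 in this sketch recovers the slower, more
conservative cooling schedule.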
The latest Adaptive Simulated Annealing (ASA) code (typically
not yet extensively tested) and some related (p)reprints in
compressed PostScript format can be retrieved via anonymous ftp from
ftp.caltech.edu [131.215.48.151] in the pub/ingber directory.
Interactively: ftp ftp.caltech.edu, [Name:] anonymous, [Password:]
your_email_address, cd pub/ingber, binary, ls; get file_of_interest,
quit. The latest version of ASA is asa-x.y.Z (x and y are version
numbers), linked to asa.Z. For the convenience of users who do
not have any uncompress utility, there is a file asa which is an
uncompressed copy of asa-x.y.Z/asa.Z; if you do not have sh or shar,
you still can delete the first-column X's and separate the files at
the END_OF_FILE locations. There are patches asa-diff-x1.y1-x2.y2.Z
up to the present version; these may be concatenated as required
before applying. The INDEX file contains an index of the other files.
If you do not have ftp access, get information on the FTPmail service
by: mail ftpmail@decwrl.dec.com, and send only the word "help" in
the body of the message.
If any of the above are not convenient, and if your mailer can handle
large files (please test this first), the code or papers you require
can be sent as uuencoded compressed files via electronic mail.
If you have gzip, resulting in smaller files, please state this.
Sorry, I cannot assume the task of mailing out hardcopies of code
or papers.
Lester
========================================================================
|| Prof. Lester Ingber ||
|| Lester Ingber Research ||
|| P.O. Box 857 EMail: ingber@alumni.caltech.edu ||
|| McLean, VA 22101 Archive: ftp.caltech.edu:/pub/ingber ||
------------------------------
Subject: Request for revenue figures on NN in financial markets
From: javega@zac1.zac.itesm.mx (Juan Antonio Vega)
Date: Mon, 07 Jun 93 10:59:26 -0700
I am documenting an application of neural networks to the prediction
of the Stock Market. I would like to include some revenue figures but
I haven't found any. In particular I am looking for:
+ Estimated profit of neural net companies from selling their products
to financial firms.
+ Estimated revenue of financial firms from investing in neural net
technology.
As far as I remember, some time ago a person in this digest mentioned a
$200 million revenue figure, but I don't remember what that quantity refers to.
Any help will be appreciated,
Thanks
Prof. J.A.Vega
javega@zac1.zac.itesm.mx
------------------------------
Subject: Quick survey: Cascor and Quickprop
From: Scott_Fahlman@SEF1.SLISP.CS.CMU.EDU
Date: Wed, 09 Jun 93 13:46:48 -0500
Distributing code by anonymous FTP is convenient for everyone with decent
internet connections, but it has the disadvantage that it is hard to keep
track of who is using the code. Every so often we need to justify our
existence to someone and need to show them that there are a non-trivial
number of real users out there.
If you are now using, or have recently used, any of my neural net
algorithms or programs (Quickprop, Cascade-Correlation, Recurrent
Cascade-Correlation), I would very much appreciate it if you would send me
a quick E-mail message with your name, organization, and (if it's not a
secret) just a few words about what you are doing with it. (For example:
"classifying textures in satellite photos".)
For those of you who don't know about the availability of this code (and
related papers), I enclose below some instructions on how to get these
things by anonymous FTP.
Thanks,
Scott
===========================================================================
Scott E. Fahlman Internet: sef+@cs.cmu.edu
Senior Research Scientist Phone: 412 268-2575
School of Computer Science Fax: 412 681-5739
Carnegie Mellon University Latitude: 40:26:33 N
5000 Forbes Avenue Longitude: 79:56:48 W
Pittsburgh, PA 15213
===========================================================================
Public-domain simulation programs for the Quickprop, Cascade-Correlation,
and Recurrent Cascade-Correlation learning algorithms are available via
anonymous FTP on the Internet. This code is distributed without charge on
an "as is" basis. There is no warranty of any kind by the authors or by
Carnegie-Mellon University.
Instructions for obtaining the code via FTP are included below. If you
can't get it by FTP, contact me by E-mail (sef+@cs.cmu.edu) and I'll try
*once* to mail it to you. Specify whether you want the C or Lisp version.
If it bounces or your mailer rejects such a large message, I don't have
time to try a lot of other delivery methods.
HOW TO GET IT:
For people (at CMU, MIT, and soon some other places) with access to the
Andrew File System (AFS), you can access the files directly from directory
"/afs/cs.cmu.edu/project/connect/code". This file system uses the same
syntactic conventions as BSD Unix: case sensitive names, slashes for
subdirectories, no version numbers, etc. The protection scheme is a bit
different, but that shouldn't matter to people just trying to read these
files.
For people accessing these files via FTP:
1. Create an FTP connection from wherever you are to machine
"ftp.cs.cmu.edu". The internet address of this machine is 128.2.206.173,
for those who need it.
2. Log in as user "anonymous" with your own ID as password. You may see an
error message that says "filenames may not have /.. in them" or something
like that. Just ignore it.
3. Change remote directory to "/afs/cs/project/connect/code". NOTE: You
must do this in a single operation. Some of the super directories on this
path are protected against outside users.
4. At this point FTP should be able to get a listing of files in this
directory with DIR and fetch the ones you want with GET. (The exact FTP
commands you use depend on your local FTP server.)
Partial contents:
quickprop1.lisp Original Common Lisp version of Quickprop.
quickprop1.c C version by Terry Regier, U. Cal. Berkeley.
backprop.lisp Overlay for quickprop1.lisp. Turns it into backprop.
cascor1.lisp Original Common Lisp version of Cascade-Correlation.
cascor1.c C version by Scott Crowder, Carnegie Mellon
rcc1.lisp Common Lisp version of Recurrent Cascade-Correlation.
rcc1.c C version, trans. by Conor Doherty, Univ. Coll. Dublin
nevprop1.15.shar Better quickprop implementation in C from U. of Nevada.
- ---------------------------------------------------------------------------
Tech reports describing these algorithms can also be obtained via FTP.
These are Postscript files, processed with the Unix compress/uncompress
program.
unix> ftp ftp.cs.cmu.edu (or 128.2.206.173)
Name: anonymous
Password: <your user id>
ftp> cd /afs/cs/project/connect/tr
ftp> binary
ftp> get filename.ps.Z
ftp> quit
unix> uncompress filename.ps.Z
unix> lpr filename.ps (or however you print postscript files)
For "filename", sustitute the following:
qp-tr Paper on Quickprop and other backprop speedups.
cascor-tr Cascade-Correlation paper.
rcc-tr Recurrent Cascade-Correlation paper.
precision Hoehfeld-Fahlman paper on Cascade-Correlation with
limited numerical precision.
- ---------------------------------------------------------------------------
The following are the published conference and journal versions of the
above (in some cases shortened and revised):
Scott E. Fahlman (1988) "Faster-Learning Variations on Back-Propagation: An
Empirical Study" in (\it Proceedings, 1988 Connectionist Models Summer
School}, D. S. Touretzky, G. E. Hinton, and T. J. Sejnowski (eds.),
Morgan Kaufmann Publishers, Los Altos CA, pp. 38-51.
Scott E. Fahlman and Christian Lebiere (1990) "The Cascade-Correlation
Learning Architecture", in {\it Advances in Neural Information Processing
Systems 2}, D. S. Touretzky (ed.), Morgan Kaufmann Publishers, Los Altos
CA, pp. 524-532.
Scott E. Fahlman (1991) "The Recurrent Cascade-Correlation Architecture" in
{\it Advances in Neural Information Processing Systems 3}, R. P. Lippmann,
J. E. Moody, and D. S. Touretzky (eds.), Morgan Kaufmann Publishers, Los
Altos CA, pp. 190-196.
Marcus Hoehfeld and Scott E. Fahlman (1992) "Learning with Limited
Numerical Precision Using the Cascade-Correlation Learning Algorithm" in
IEEE Transactions on Neural Networks, Vol. 3, no. 4, July 1992, pp.
602-611.
------------------------------
End of Neuron Digest [Volume 11 Issue 37]
*****************************************