NEURON Digest Tue Jun 21 10:56:02 CDT 1988 - Volume 3 / Issue 12
Today's Topics:
Where to get The Rochester Simulation Package?
MACIE
Connection Machine vs. Neural Networks
Traveling Salesman Problem (a request)
pattern analysis
Genetic algorithms
Re: Genetic algorithms
Short Course: Artificial Neural Nets (Schwartz' Part)
Short Course: Artificial Neural Nets (Kosko's Part)
----------------------------------------------------------------------
Date: 20 Jun 88 13:33:39 GMT
From: Jeroen Raaymakers <mcvax!tnosoes!jeroen@uunet.uu.net>
Subject: Where to get The Rochester Simulation Package?
A few months ago there was some mention of a 'Rochester
Simulation Package' for the simulation of neural nets that
runs on a SUN machine under Suntools.
I am interested in this package and would like to know
where I can buy it and whom to contact at Rochester
(full name/address, please).
Dr. Jeroen G.W. Raaijmakers
TNO Institute for Perception
P.O. Box 23
3769 ZG Soesterberg
The Netherlands
e-mail: tnosoes!jeroen@mcvax.uucp
or tnosoes!jeroen@mcvax.cwi.nl
------------------------------
Date: 2 Jun 88 13:22:48 GMT
From: "uh2%psuvm.BITNET" <@RELAY.CS.NET,@host.bitnet:uh2%psuvm.BITNET@jade.berkeley.edu.user (Lee Sailer)>
Subject: MACIE
Is it possible to obtain MACIE, the neural-net Expert System described
in the Feb. issue of CACM?
Can someone offer me a pointer to the author, Stephen Gallant, at
Northeastern U?
thanks.
------------------------------
Date: 2 Jun 88 00:27:05 GMT
From: Tom Holroyd <uflorida!novavax!proxftl!tomh@UMD5.UMD.EDU>
Subject: Connection Machine vs. Neural Networks
Is anybody doing any connectionist-type work on a Connection Machine?
Seems like a silly question. How fast is it? Do you compute inner
products in O(lg(N)) time? Do you have little robots running around
doing handsprings?
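For what it's worth, here is a minimal sketch (plain sequential Python, not
Connection Machine code) of the log-depth reduction the O(lg(N)) question
alludes to: form all the elementwise products in one parallel step, then
combine partial sums pairwise, so a machine with one processor per element
needs roughly ceil(log2(N)) combining steps. The example vectors are only
placeholders.

    def inner_product(x, y):
        # Step 1: elementwise products -- one parallel step given N processors.
        partial = [a * b for a, b in zip(x, y)]
        # Step 2: pairwise (tree) summation -- about log2(N) combining steps
        # on a parallel machine; simulated here with a sequential loop.
        while len(partial) > 1:
            if len(partial) % 2:          # pad odd-length levels with a zero
                partial.append(0)
            partial = [partial[i] + partial[i + 1]
                       for i in range(0, len(partial), 2)]
        return partial[0]

    print(inner_product([1, 2, 3, 4, 5], [5, 4, 3, 2, 1]))   # prints 35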
Other massively parallel architectures are also of interest.
E-mail to me and I'll summarize.
Tom Holroyd
UUCP: {uunet,codas}!novavax!proxftl!tomh
The white knight is talking backwards.
------------------------------
Date: Fri, 27 May 88 12:00:20 EDT
From: "Charles S. Roberson" <csrobe@ICASE.ARPA>
Subject: Traveling Salesman Problem (a request)
Greetings,
I am currently doing some work on the TSP, and I would like
help from the net in obtaining two items:
(1) a standard algorithm that currently performs well on the TSP,
and
(2) maps of cities that are used in classical/pathological cases.
In particular, we would like the code used by S. Lin and B. W. Kernighan
in "An Effective Heuristic Algorithm for the Traveling-Salesman Problem"
published in _Operations_Research_ (1973), Vol 21, pp. 498-516. For the
cities, we would like problems with 20 to 100 cities given in x-y coordinates,
if possible.
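Not the Lin-Kernighan code requested above, but here is a hedged sketch of a
much simpler edge-exchange heuristic (2-opt) on x-y coordinates, which may
serve as a quick baseline for comparison; the 20 random cities are only
placeholders for a real test map, and none of this comes from the Lin and
Kernighan paper.

    import math, random

    def tour_length(tour, cities):
        # total Euclidean length of the closed tour
        return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(cities):
        # Start from the identity tour, then repeatedly reverse any segment
        # whose reversal shortens the tour, until no improvement remains.
        tour = list(range(len(cities)))
        improved = True
        while improved:
            improved = False
            for i in range(1, len(tour) - 1):
                for j in range(i + 1, len(tour)):
                    candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                    if tour_length(candidate, cities) < tour_length(tour, cities):
                        tour, improved = candidate, True
        return tour

    cities = [(random.random(), random.random()) for _ in range(20)]
    tour = two_opt(cities)
    print(tour, tour_length(tour, cities))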
Of course, *any* tidbit of information that someone is willing to share
will be greatly appreciated.
Thanks,
-c
+-------------------------------------------------------------------------+
|Charles S. Roberson ARPANET: csrobe@icase.arpa |
|ICASE, MS 132C BITNET: $csrobe@wmmvs.bitnet |
|NASA/Langley Rsch. Ctr. UUCP: ...!uunet!pyrdc!gmu90x!wmcs!csrobe|
|Hampton, VA 23665-5225 Phone: (804) 865-4090 |
+-------------------------------------------------------------------------+
------------------------------
Date: 25 May 88 17:59:14 GMT
From: Daniel Lippmann <mcvax!inria!vmucnam!daniel@uunet.uu.net>
Subject: pattern analysis
Does anybody out there have knowledge or experience of neural nets applied
to graphical pattern analysis of text?
Any pointers to books and PD or experimental software will be welcome.
thanks for any help
daniel (...!mcvax!inria!vmucnam!daniel)
------------------------------
Date: 23 May 88 15:04:12 GMT
From: "Rev. Steven C. Barash" <mmlai!barash@uunet.uu.net>
Subject: Genetic algorithms
A while back someone posted an extended definition of "Genetic algorithms".
If anyone still has that, or has their own definition, could you please
e-mail it to me? (There's probably lots of room for opinions here;
I'm interested in all perspectives).
I would also appreciate any pointers to literature in this area.
Also, if anyone wants me to post a summary of the replies, let me know.
Thanks in advance!
Steve Barash
--
Steve Barash @ Martin Marietta Labs
ARPA: barash@mmlai.uu.net
UUCP: {uunet, super, hopkins!jhunix} !mmlai!barash
------------------------------
Date: 30 May 88 16:46:17 GMT
From: Bill Pi <pollux.usc.edu!pi@OBERON.USC.EDU>
Subject: Re: Genetic algorithms
In article <317@mmlai.UUCP> barash@mmlai.UUCP (Rev. Steven C. Barash) writes:
>
>A while back someone posted an extended definition of "Genetic algorithms".
>If anyone still has that, or has their own definition, could you please
>e-mail it to me? (There's probably lots of room for opinions here;
>I'm interested in all perspectives).
>
>I would also appreciate any pointers to literature in this area.
To date, two conferences have been held on Genetic Algorithms:
Proceedings of the First International Conference on Genetic Algorithms and
Their Applications, ed. J. J. Grefenstette, 1985.
Genetic Algorithms and Their Applications: Proceedings of the Second
International Conference on Genetic Algorithms, ed. J. J. Grefenstette, 1987.
They can be ordered from:
Lawrence Erlbaum Associates, Inc.
365 Broadway
Hillsdale, NJ 07642
(201) 666-4110
The most recent collection of research notes on GAs is
Genetic Algorithms and Simulated Annealing, ed. L. Davis, 1987, Morgan Kaufmann
Publishers, Inc., Los Altos, CA.
Also, a mailing list exists for Genetic Algorithms researchers. For more info,
send mail to "GA-List-Request@NRL-AIC.ARPA".
Jen-I Pi :-) UUCP: {sdcrdcf,cit-cav}!oberon!durga!pi
Department of Electrical Engineering CSnet: pi@usc-cse.csnet
University of Southern California Bitnet: pi@uscvaxq
Los Angeles, Ca. 90089-0781 InterNet: pi%durga.usc.edu@oberon.USC.EDU
------------------------------
Date: 25 May 88 06:17:00 GMT
From: bill coderre <bc@MEDIA-LAB.MEDIA.MIT.EDU>
Subject: Re: Genetic algorithms
In article <317@mmlai.UUCP> barash@mmlai.UUCP (Rev. Steven C. Barash) writes:
>A while back someone posted an extended definition of "Genetic algorithms".
>I would also appreciate any pointers to literature in this area.
Well, let's start talking about it right here. It would make a change from
the usual rhetoric.
The classic (Holland) Genetic Algorithm work involves a pool of rules
that look like ASCII strings, whose left sides are preconditions and
whose right sides are assertions. Attached to each rule is a probability
of firing.
When the clock ticks, all the rules whose left sides match are gathered,
and one is probabilistically selected to fire.
There is also an "evaluator" that awards "goodness" to rules that are
in the chain of producing a good event. This goodness usually results
in a greater probability of firing. (Of course, one could also use
punishment strategies.)
Last, there is a "mutator" that makes new rules out of old. Some
heuristics that are used:
* randomly change a substring (usually one element)
* "breed" two rules together, by taking the first N of one and the
last M-N of another.
The major claim is that this approach avoids straight hill-climbing's
tendency to get stuck on local peaks, by using some "wild" mutations,
like reversing substrings of rules. I'm not gonna guess whether this
claim is true.
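To make the cycle above concrete, here is a minimal sketch of the
select/breed/mutate loop over plain bit strings rather than full
condition-action rules; the fitness function (count of 1s plus one), the
string length, and the mutation rates are all invented for illustration and
are not Holland's actual classifier-system machinery.

    import random

    ALPHABET = "01"

    def fitness(rule):
        # Hypothetical stand-in for the "evaluator": reward rules with many 1s.
        return rule.count("1") + 1

    def select(pool):
        # Probabilistic selection weighted by accumulated "goodness".
        return random.choices(pool, weights=[fitness(r) for r in pool], k=1)[0]

    def breed(a, b):
        # Take the first N elements of one rule and the rest of another.
        n = random.randrange(1, len(a))
        return a[:n] + b[n:]

    def mutate(rule):
        # Usually change a single element; occasionally do a "wild" mutation
        # by reversing a substring, as mentioned above.
        chars = list(rule)
        if random.random() < 0.1:
            i, j = sorted(random.sample(range(len(chars)), 2))
            chars[i:j] = reversed(chars[i:j])
        else:
            i = random.randrange(len(chars))
            chars[i] = random.choice(ALPHABET)
        return "".join(chars)

    def step(pool):
        # One tick: breed and mutate children from selected parents,
        # then keep only the fittest rules.
        children = [mutate(breed(select(pool), select(pool))) for _ in pool]
        return sorted(pool + children, key=fitness, reverse=True)[:len(pool)]

    pool = ["".join(random.choice(ALPHABET) for _ in range(16)) for _ in range(20)]
    for _ in range(50):
        pool = step(pool)
    print(pool[0])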
I have met Stewart Wilson of the Rowland Institute here in Cambridge,
and he has made simple critters that use the above strategy. They
start out with random rulebases, and over the course of a few million
ticks develop optimal ones.
>>>>>>>>>>
Of particular interest to me are genetic-LIKE algorithms that
use more sophisticated elements than ASCII strings and simple numeric
scorings.
My master's research is an attempt to extend Genetic AI in just that
way. I wanna use genetic AI's ideas to cause a Society of Mind to
learn.
It appears that using Lenat-like ideas is the right way to make the
mutator, but the evaluator seems like a difficult trick. My hunch is
to use knowledge frames ala Winston, but this is looking less likely.
??????????
So does anybody know about appropriately similar research?
Anybody got any good ideas?
appreciamucho....................................................bc
------------------------------
Date: 10 Jun 88 21:10:59 GMT
From: Chuck Stein <agate!saturn!saturn.ucsc.edu!chucko@UCBVAX.BERKELEY.EDU>
Subject: Short Course: Artificial Neural Nets (Schwartz' Part)
The University of California
Eighteenth Annual
INSTITUTE IN COMPUTER SCIENCE
presents courses in:
* Scientific Visualization * Fault Tolerant Computing
* Parallel Computation * Image Engineering
* Data Compression * Machine Learning
at
Techmart, Santa Clara
and
on campus in Santa Cruz
Following is a course description for:
-------------------------------------------------------------------------
Expert Systems and Artificial Neural Systems:
Technology, Prototyping, Development and Deployment
in the Corporate Environment
July 13-15
Instructor: TOM J. SCHWARTZ, MSEE, MBA.
X421 Computer Engineering (2)
For programmers, engineers, engineering managers, and corporate
technology managers. This course will introduce participants to two of
today's most advanced computing technologies for the corporate
environment: expert systems and artificial neural systems. It will
prepare the attendees to evaluate the technology and current
commercial product offerings; to choose appropriate problems to which
the technology can be applied; to gain program support from
management; to complete a prototype; to compose the project plan and
to see the project through from system development to deployment.
Overview
The course presents a systematic introduction to the strategic use of
expert systems and artificial neural systems within the corporate
project environment, from technology introduction and history through
project plan, prototype, project development and deployment. Founded
on the concept that new technology never replaces old technology (it
merely reconfigures it), the course will focus on introducing these
technologies within the context of current methods and products. A
clear focus on productivity and improvement of the bottom line is the
goal.
Recently both expert systems and artificial neural systems have been
receiving tremendous attention as cutting edge technologies capable of
enhancing existing products and offering means to solve complex
problems which have defied conventional technology. Both
technologies offer the ability to distribute knowledge and expertise.
Expert systems require the human articulation of knowledge, which is then
captured in the system. Artificial neural systems can extract knowledge
from example sets. The course will also examine the possibilities of
merging these technologies and integrating them into a firm's existing
technology base.
Wednesday
Morning: Overview of Artificial Intelligence and Expert Systems.
This will cover definitions and composition, history, philosophical
foundations, and the "Great Schism" between expert systems and
artificial neural systems. This will be followed by an introduction to
expert systems, the basics of knowledge representation and control
structure, the Language-Shell Continuum and methods of control.
Afternoon: Introduction to Artificial Neural Systems and Generic
Technology Issues. This section will consist of an introduction to
artificial neural systems, the basics of supervised and unsupervised
learning, and the modeling continuum. We will then turn to
considerations common to both technologies, including I/O, the basics of
problem selection, hardware, "Hooks, Hacks & Ports", validation issues,
and the "Explanation Debate".
Thursday
Morning: This section will cover where these technologies have succeeded
and failed in the areas of diagnostics, planning, pattern recognition,
and the extraction of knowledge from data.
Afternoon: Project Selection: In this section attendees will have the
opportunity to examine what they have learned and select a proposed
project. During the rest of the course, each person will be able to
match that selection with the other issues and complete an initial
project plan. Issues to be examined will include winning management
support, development strategies, deployment strategies, and budgeting.
Friday
Morning: Planning for Change: At this time, attendees will examine the
impact that the existing environment, hardware, software, cultural,
business, stakeholder, and legal considerations will have on their
selected project. After this, we will examine a project plan and consider
the question of "what constitutes success, and what is its impact?"
Afternoon: Build or buy, vendor selection and wrap-up: For the final
session, we will consider the "build or buy" issue and available
software and hardware. There will be a summary of currently available
hardware, languages, and tools. Also examined will be the use of
consultants. This will be followed by a course summary with time for
further questions and comments.
Instructor: TOM J. SCHWARTZ, MSEE, MBA, is the founder of Tom
Schwartz Associates of Mountain View, California.
Fee: Credit, $895 (EDP C6035)
Dates: Three Days, Wed.-Fri., Jul. 13-15, 9 a.m.-5 p.m.
Place: Techmart, 5201 Great America Pkwy., Santa Clara
-----------------------------------------------------------------------
RESERVATIONS:
Enrollment in these courses is limited. If you wish to attend a course
and have not pre-registered, please call (408) 429-4535 to ensure that
space is still available and to reserve a place.
DISCOUNTS:
Corporate, faculty, IEEE member, and graduate student discounts and
fellowships are available. Please call Karin Poklen at (408) 429-4535
for more information.
COORDINATOR:
Ronald L. Smith, Institute in Computer Science, (408) 429-2386.
FOR FURTHER INFORMATION:
Please write Institute in Computer Science, University of California
Extension, Santa Cruz, CA 95064, or phone Karin Poklen at (408) 429-4535.
You may also enroll by phone by calling (408) 429-4535. A packet of
information on transportation and accommodations will be sent to you
upon receipt of your enrollment.
------------------------------
Date: 10 Jun 88 21:24:32 GMT
From: Chuck Stein <agate!saturn!saturn.ucsc.edu!chucko@UCBVAX.BERKELEY.EDU>
Subject: Short Course: Artificial Neural Nets (Kosko's Part)
The University of California
Eighteenth Annual
INSTITUTE IN COMPUTER SCIENCE
presents courses in:
* Scientific Visualization * Fault Tolerant Computing
* Parallel Computation * Image Engineering
* Data Compression * Machine Learning
at
Techmart, Santa Clara
and
on campus in Santa Cruz
Following is a course description for:
-------------------------------------------------------------------------
Artificial Neural Networks
August 1-3
Instructor: BART KOSKO
X415 Computer & Information Sciences (2)
This course offers a rigorous introduction to the mechanics of
artificial neural networks. It is aimed at an interdisciplinary audience
with emphasis on engineering and artificial intelligence. Designed as
an active process, the course will oblige participants to undertake
assignments including written work. Upon completion, attendees will
have a working knowledge of several state-of-the-art neural network
technologies.
Overview:
Artificial neural networks are programmable dynamical systems. Their
global properties can often be designed to carry out practical
information processing--pattern storage, robust recall, fuzzy
association, distributed prediction, inductive inference, and
combinatorial optimization. Artificial neural networks are especially
well suited for realtime pattern recognition and nearest neighbor
matching in large databases. Some continuous and diffusion networks
can perform global optimization. Some networks can learn complex
functional mappings simply by presenting them with input-output
pairs. Some fuzzy knowledge networks can represent, propagate, and
infer uncertain knowledge in contexts where traditional AI decision-
tree graph search cannot be applied.
Prerequisite: Background in calculus, matrix algebra, and some
probability theory.
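As a hedged illustration of the "learn from input-output pairs" idea in the
overview (and of the least-mean-square rule listed for Wednesday), here is a
minimal single-linear-unit sketch; the target function, learning rate, and
epoch count are made up for the example and are not course material.

    def lms_train(pairs, dim, rate=0.05, epochs=200):
        # Least-mean-square (Widrow-Hoff) rule: nudge the weights (plus a bias)
        # toward each target after every presented example.
        w = [0.0] * (dim + 1)
        for _ in range(epochs):
            for x, target in pairs:
                xb = list(x) + [1.0]                        # append bias input
                y = sum(wi * xi for wi, xi in zip(w, xb))   # unit's output
                err = target - y
                w = [wi + rate * err * xi for wi, xi in zip(w, xb)]
        return w

    # learn y = 2*x1 - x2 + 0.5 from nine example pairs
    pairs = [((x1, x2), 2 * x1 - x2 + 0.5)
             for x1 in (0.0, 0.5, 1.0) for x2 in (0.0, 0.5, 1.0)]
    print(lms_train(pairs, dim=2))   # approaches [2.0, -1.0, 0.5]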
Schedule
Monday:
*Associative Memory
symbolic vs. subsymbolic processing
preattentive and attentive processing
global stability
bidirectional associative memories (BAM)
optical BAMs
error-correcting decoding
temporal associative memory, avalanches
optimal linear associative memory
Tuesday:
*Global Stability and Unsupervised Learning
continuous BAMs and the Cohen-Grossberg Theorem
neurocircuits for combinatorial optimization
Hebb, differential Hebb, and competitive learning
adaptive BAMs
Grossberg Theory
adaptive resonance theory
adaptive vector quantization
counter-propagation
Wednesday:
*Supervised Learning and Fuzzy Knowledge Processing
least-mean-square algorithm
backpropagation
simulated annealing
Geman-Hwang theorem for Brownian diffusions
Cauchy vs. Boltzmann machines
fuzzy entropy and conditioning
fuzzy associative memories (FAMs)
fuzzy cognitive maps (FCMs) and learning FCMs
Instructor: BART KOSKO, Assistant Professor of Electrical
Engineering at the University of Southern California
Fee: Credit, $895 (EDP J2478)
Dates: Three days, Mon.-Wed., Aug. 1-3, 9 a.m.-5 p.m.
Place: Techmart, 5201 Great America Pkwy., Santa Clara
-----------------------------------------------------------------------
RESERVATIONS:
Enrollment in these courses is limited. If you wish to attend a course
and have not pre-registered, please call (408) 429-4535 to ensure that
space is still available and to reserve a place.
DISCOUNTS:
Corporate, faculty, IEEE member, and graduate student discounts and
fellowships are available. Please call Karin Poklen at (408) 429-4535
for more information.
COORDINATOR:
Ronald L. Smith, Institute in Computer Science, (408) 429-2386.
FOR FURTHER INFORMATION:
Please write Institute in Computer Science, University of California
Extension, Santa Cruz, CA 95064, or phone Karin Poklen at (408) 429-4535.
You may also enroll by phone by calling (408) 429-4535. A packet of
information on transportation and accommodations will be sent to you
upon receipt of your enrollment.
------------------------------
End of NEURON-Digest
********************