Neuron Digest	Friday, 13 Apr 1990		Volume 6 : Issue 26 

Today's Topics:
Administrivia
cellular automata and neural networks
Re: cellular automata and neural networks
Neural net chip company omission--Neural Semiconductor, Inc.
info on back propagation requested
Neural Network Journals
Hinton reference needed
EE Times reference
Chemical Applications
A theoretical question: multiple patterns and synchrony
Re: A theoretical question: multiple patterns and synchrony
Re: A theoretical question: multiple patterns and synchrony
NADC ??
Spectroscopic data
Maximum likelihood estimators for weights in feed-forward nets
Re: Maximum likelihood estimators for weights in feed-forward nets
Is there a clock in the brain?
Re: Is there a clock in the brain?
The AI'90 CALL FOR PAPERS


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Administrivia
From: "Neuron-Digest Moderator -- Peter Marvit" <neuron@hplabs.hpl.hp.com>
Date: Fri, 13 Apr 90 18:43:00 -0700

Greetings Neuron Digest readers!

Just a few notes of usual administrative stuff. The mailing list for the
Digest is now at 851 addresses, many of which are redistribution points.
As many of you know, the Digest is also gatewayed to the USENET group
comp.ai.neural-nets. This makes total readership... well lots!

Please remember to let me know if your mail address changes, if your
account is going to disappear, or if you receive multiple copies.
Especially as the spring quarter ends, many academics move, graduate, and
otherwise change addresses. Notification makes my life easier.

Submissions directly to the Digest have been quite heavy recently and so
I have not published much from the USENET group. This issue and the next
several rectify this. I've tried to select articles of general interest
or requests for information with which readers may be able to help.
If you respond directly to an author, consider if your message might be
of general interest to Digest readers.

Cheers,
Peter

: Peter Marvit, Neuron Digest Moderator
: Courtesy of Hewlett-Packard Labs in Palo Alto, CA 94304 (415) 857-6646
: neuron-request@hplabs.hp.com OR {any backbone}!hplabs!neuron-request


------------------------------

Subject: cellular automata and neural networks
From: baker2@husc6.harvard.edu (James Baker)
Organization: /sc9a/cs163/baker2/.organization
Date: 11 Apr 90 18:02:07 +0000

[[ Editor's Note: This message and its companion was lifted from the
Cellular Automata mailing list. -PM ]]

Has anyone constructed ``cellular automata'' that learn?

There seem to be some good reasons to explore this possibility:

1. Arbitrary connections make analyzing neural networks difficult, if
not just impossible.

2. Cellular automata models are more readily simulated on hypercube
parallel architectures than conventional neural networks.

Since one would train these models, they would not be cellular automata
in the strict sense; for example, they might use some global reward
signal or noise, in addition to receiving input and target data.

-- Jim
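
[[ Editor's note: a minimal sketch, in Python, of the kind of scheme Jim
speculates about: a cellular automaton whose rule table is adapted by a
global reward signal. The target pattern and the hill-climbing rule are
illustrative assumptions, not anything proposed in the message. ]]

    # Hypothetical sketch: adapt a 1-D binary CA's rule table by hill
    # climbing on a single global reward signal.
    import random

    N_CELLS = 64
    RULE_BITS = 8                    # one output bit per 3-cell neighborhood

    def step(state, rule):
        """Apply the rule table once, with wraparound neighborhoods."""
        return [rule[(state[i - 1] << 2) | (state[i] << 1)
                     | state[(i + 1) % N_CELLS]]
                for i in range(N_CELLS)]

    def reward(state, target):
        """Global reward: fraction of cells matching a target pattern."""
        return sum(s == t for s, t in zip(state, target)) / N_CELLS

    random.seed(0)
    target = [i % 2 for i in range(N_CELLS)]        # arbitrary target
    rule = [random.randint(0, 1) for _ in range(RULE_BITS)]
    init = [random.randint(0, 1) for _ in range(N_CELLS)]

    best = reward(step(init, rule), target)
    for trial in range(2000):
        i = random.randrange(RULE_BITS)
        rule[i] ^= 1                                # flip one rule bit
        r = reward(step(init, rule), target)
        if r >= best:
            best = r                                # keep the change
        else:
            rule[i] ^= 1                            # revert it
    print("final reward:", best)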

------------------------------

Subject: Re: cellular automata and neural networks
From: Manoel Fernando Tenorio <tenorio@ee.ecn.purdue.edu>
Date: Thu, 12 Apr 90 13:02:01 -0500


> Has anyone constructed ``cellular automata'' that learn?

I think that before we can answer this question, it should be preceded
by a definition of what is and is not a CA. Certainly, learning has been
done and is possible on CA-like machines.

> There seem to be some good reasons to explore this possibility:
>
> 1. Arbitrary connections make analyzing neural networks difficult, if
> not just impossible.

I don't understand why that is so. We have published an algorithm to do
that in the IEEE Trans. on NN. This is how the brain seems to form
certain subcircuits, especially the ones that are experience-based.

> 2. Cellular automata models are more readily simulated on hypercube
> parallel architectures than conventional neural networks.

I don't understand why that is so. When I think of CAs, I have images of
highly regular structures, but that might not be necessary; then again,
the same can be said about NNs.

> Since one would train these models, they would not be cellular automata
> in the strict sense; for example, they might use some global reward
> signal or noise, in addition to receiving input and target data.

There have been a number of works on "local" learning rules for NNs that
can be applied here. You could take a look at a CA-like machine in the
NIPS'87 proceedings: "Nondeterministic Adaptive Logic Elements" by
Windecker.



------------------------------

Subject: Neural net chip company omission--Neural Semiconductor, Inc.
From: aglen@sim.berkeley.edu (Glen Richard Anderson)
Date: Fri, 13 Apr 90 16:43:11 -0800


A major omission was made in your list of companies in the neural net
field, namely:

Neural Semiconductor, Inc.
Carlsbad, CA
(619) 931-7600

Their product is a chip using a proprietary algorithm to do a fully
parallel 32 x 32 multiplication at a rate of 100,000 patterns/sec. The
chip is entirely digital, supposedly easy to use, gangable for larger
nets, and available within this quarter (if not already).

Glen Anderson
University of California, Berkeley EECS


[[ Editor's Note: Thanks to Glen for the addition. What else is missing? ]]

------------------------------

Subject: info on back propagation requested
From: crook@nssdcs (Sharon Crook (IDM))
Organization: NSSDC Greenbelt Md.
Date: 30 Mar 90 21:11:23 +0000

I am a student who is also working with a small research group at
NASA/GSFC. We are using back propagation to train a net to classify
pixels from landsat images into land use categories. We are also using
some standard statistical image processing techniques and will probably
end up with an expert system that uses a combination of techniques.

We have tried to do a lot of background research and have a good start on
the implementation. However, we have some problems that someone may be
able to help us solve. Because of the nature of the task, convergence in
the weight dynamics is a problem. Specifically we need more of the
following:

- References describing similar applications to large, noisy data
  sets with lots of "overlap" in input.
- References on weight convergence for BP.
- References describing methods of automating the training process,
  including deciding when to halt training if the weights don't
  converge (one possible halting criterion is sketched below).
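
[[ Editor's note: a minimal Python sketch of one possible halting
criterion of the sort the last item asks about. The training and
validation routines (train_one_epoch, validation_error) are assumed
placeholders, not anything from this project. ]]

    # Hypothetical halting rule: stop when validation error stagnates
    # for `patience` epochs or when the weights stop moving.
    import numpy as np

    def train_with_halting(weights, train_one_epoch, validation_error,
                           max_epochs=10000, patience=50, tol=1e-6):
        best_err, best_w, stale = np.inf, weights.copy(), 0
        for epoch in range(max_epochs):
            old = weights.copy()
            weights = train_one_epoch(weights)      # one backprop pass
            err = validation_error(weights)
            if err < best_err:
                best_err, best_w, stale = err, weights.copy(), 0
            else:
                stale += 1
            if stale >= patience:                   # error stagnant
                return best_w, "halted: no improvement"
            if np.max(np.abs(weights - old)) < tol: # weights converged
                return weights, "converged"
        return best_w, "halted: epoch limit"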

Any suggestions for ways of improving our performance are welcome.
Respond via email to: crook@nssdcs.gsfc.nasa.gov
                      crook@nssdca.gsfc.nasa.gov

Thanks,
Sharon Crook

[For more info see Campbell, Hill, and Cromp, "Automatic Labeling and
Characterization of Objects Using Artificial Neural Networks," in
Telematics and Informatics, Vol. 6, No. 3-4, 1989.]

------------------------------

Subject: Neural Network Journals
From: plonski@aerospace.aero.org (Mike Plonski)
Organization: The Aerospace Corporation
Date: 31 Mar 90 02:48:59 +0000

I would like to compile a list of all relevant neural network and related
journals, along with comments from readers. My primary reason for
requesting this information is to decide which new journals to subscribe
to; however, I realize that there will probably be considerable interest
in such a compilation, so I will post a summary of all responses. I would
like to gather complete information on these journals, such as number of
issues per year, size of issues, publisher with address, cost, scope of
topics, types of articles (correspondence, review, in-depth), and your
personal views on the journal. Any information will be appreciated,
though. Hopefully many people will send their comments so that I can
compile a fairly complete list. In order to cut down net traffic, please
send your comments directly to me and I will post a summary.

Thanks.

-----------------------------------------------------------------------------
. . .__.                        The opinions expressed herein are solely
|\./| !__!  Michael Plonski     those of the author and do not represent
| | |       "plonski@aero.org"  those of The Aerospace Corporation.
_____________________________________________________________________________

------------------------------

Subject: Hinton reference needed
From: ralph@cs.ubc.ca (Scott Ralph)
Organization: UBC Department of Computer Science, Vancouver, B.C., Canada
Date: 03 Apr 90 18:30:15 +0000


G. Hinton gave a talk at the Connectionist conference in Vancouver in
January 1990 in which he mentioned the use of unsupervised learning
techniques for solving the binocular disparity problem in random dot
diagrams. More specifically, he demonstrated this learning technique for
capturing generalizations of spatial coherence under 3-d perspective
transformations. Can anyone direct me to the particular paper on
binocular disparity, or to anything they feel is relevant?

Thanks.

Scott Ralph

------------------------------

Subject: EE Times reference
From: thomasp@lan.informatik.tu-muenchen.dbp.de (Patrick Thomas)
Organization: Inst. fuer Informatik, TU Muenchen, W. Germany
Date: 05 Apr 90 16:18:46 +0000

[[ Editor's Note: Any European readers want to help this fellow? -PM ]]

I'm having trouble finding out who publishes Electronic Engineering
Times and whether it is available in Europe too. As I understand it, a
fellow named Colin Johnson has been reporting about neural nets in the
EE Times.

Anyone have information about the publisher's address and EE Times'
subscription rates?

Thanx a lot,

Patrick

------------------------------

Subject: Chemical Applications
From: wipke@secs.ucsc.edu (W. Todd Wipke)
Organization: UCSC Molecular Engineering Laboratory
Date: 08 Apr 90 09:23:23 +0000

A symposium in print is focussing on chemical applications of neural
networks. I would appreciate any leads to research or practical
applications in chemistry (specifics of who is doing the work).

Secondly, I am interested in compiling a list of software that is
available on Mac or IBM PC for neural network applications. If there are
people using or offering packages, let me know what you are using.

The information may have been placed on this list already, so just send
it directly to me at wipke@secs.ucsc.edu, thanks. Commercial software
should indicate the price. Also indicate if a demo disk is available.

wipke@secs.ucsc.edu

------------------------------

Subject: A theoretical question: multiple patterns and synchrony
From: mcsun!inria!mirsa!luong@uunet.uu.net (Tuan Luong)
Organization: INRIA, Sophia-Antipolis (Fr)
Date: 09 Apr 90 13:05:21 +0000

Consider a neural network with several local minima of the energy
function, each of which represents a different pattern, with different
slopes and depths.

Now consider an image containing several of the patterns represented in
that network. Each of these will be recognized, but the different local
minima will be reached at different times. This means that vision is a
temporal process: object 1 is perceived, then object 2...

So, how does one create a representation of a composed object made of an
assembly of different patterns? One can say that the temporal synchrony
of the units is a representation of the global identity, but how does
one obtain this synchrony between the different parts of the image?
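
[[ Editor's note: a minimal Hopfield-style sketch, in Python, of the
setup the question assumes: stored patterns sit at local minima of
E = -1/2 x^T W x, and asynchronous updates descend toward the nearest
one. The pattern count and sizes are illustrative assumptions. ]]

    # Hypothetical sketch: Hebbian storage and energy descent.
    import numpy as np

    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(3, 50))    # three stored patterns
    W = sum(np.outer(p, p) for p in patterns) / 50.0
    np.fill_diagonal(W, 0)

    def energy(x):
        return -0.5 * x @ W @ x

    x = patterns[0].copy()
    x[:10] *= -1                                    # corrupt one pattern
    for _ in range(200):                            # asynchronous updates
        i = rng.integers(50)
        x[i] = 1 if W[i] @ x >= 0 else -1
    print("recovered pattern 0:", np.array_equal(x, patterns[0]),
          " energy:", energy(x))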

------------------------------

Subject: Re: A theoretical question: multiple patterns and synchrony
From: wipke@secs.ucsc.edu (W. Todd Wipke)
Organization: UCSC Molecular Engineering Laboratory
Date: 10 Apr 90 07:35:46 +0000

Chemists routinely minimize the energy of three-dimensional networks
which can have several local minima. Each minimum represents a low
energy state of the molecule, in effect, the "personality" of the
molecule. I wonder if there are any interesting conclusions one can
draw.

On another but related topic, chemists successfully used learning
machines for chemical pattern recognition in work done in 1969-71. They
made very systematic studies of learning rate versus feature scaling,
number of parameters, type of feedback, size and diversity of the
training set, etc. Peter Jurs of Penn State wrote a book on it and many
papers showing one can predict mass spectra, determine molecular formula
from mass spectra, classify drugs, etc. From the literature I have seen,
this work has gone unnoticed by the computer science community. Since
the data sets are well defined, they would provide a reproducible
standard against which you could all compare your methods or black
boxes. To my knowledge there is no such standard in use today. I would
be very interested to see whether today's methods are better than
earlier ones. I have not seen systematic studies like those Jurs did,
but would like to see some.

=======================================================================
W. Todd Wipke                      wipke@secs.ucsc.edu
Molecular Engineering Laboratory   wipke@ucscd.ucsc.edu
Thimann Laboratories               wipke@ucscd.bitnet
University of California           BBS 408 429-8019
Santa Cruz, CA 95064               FAX 408 459-4716
=======================================================================

------------------------------

Subject: Re: A theoretical question: multiple patterns and synchrony
From: park@usceast.UUCP (Kihong Park)
Organization: University of South Carolina, Columbia
Date: 11 Apr 90 18:57:11 +0000

Why make the assumption that segmentable "parts" of an image are stored
as different local minima on the same network, i.e., its energy
landscape? You may want to view your image processing system as
consisting of a number of relatively "independent" modules, each with a
different functionality. Then temporal synchronicity can possibly be
achieved. That is, in the simplest case, where each module encodes among
other things one "part" of the image, the simultaneous convergence to
local minima in each module may bring forth a synchronized convergence
of the total system to a global minimum. Hence no temporal sequencing.
The above explanation is of course too simplistic, but it should
nevertheless illustrate that in any nontrivial system, modularization is
a key factor. How to achieve this is a big problem.

Kihong Park. (park@cs.scarolina.edu)

------------------------------

Subject: NADC ??
From: george@minster.york.ac.uk
Organization: Department of Computer Science, University of York, England
Date: 10 Apr 90 13:51:34 +0000

Does anyone have this paper or know where I can get hold of it? In
particular, what is the "NADC", and where is it?

> Wann, Mien, "Neural Networks for Multisensor Data Fusion",
> IP Proposal, Code 703, NADC, 1989.

Many thanks in advance,

George Bolt

____________________________________________________________
George Bolt, Advanced Computer Architecture Group,
Dept. of Computer Science, University of York, Heslington,
YORK. YO1 5DD. UK. Tel: [044] (0904) 432771

george@uk.ac.york.minster JANET
george%minster.york.ac.uk@nsfnet-relay.ac.uk ARPA
george!mcsun!ukc!minster!george UUCP
george@minster.york.ac.uk eab mail
____________________________________________________________

------------------------------

Subject: Spectroscopic data
From: ma86kbl@cc.brunel.ac.uk (Kam B Lo)
Organization: Brunel University, Uxbridge, UK
Date: 10 Apr 90 15:36:32 +0000

I am trying to train a three-layer neural network (one input layer, one
hidden layer and one output layer) to classify two groups of chemical
compounds by their NIR spectroscopic data using the back-propagation
technique. Each NIR spectrum consists of 700 values in the range -0.03
to 0.03.

My problem is what preprocessing I need to do to this data to get it
into an acceptable form for entering into the neural network. I have
tried setting up a network with 700 input cells and assigning them to
the corresponding values on the spectrum (is this sensible?), but
training always got stuck in a local minimum.
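
[[ Editor's note: a minimal Python sketch of one common preprocessing
step for inputs this small in magnitude: standardize each of the 700
channels over the training set, so the sigmoid units are not confined
to a tiny near-linear region with correspondingly tiny gradients. The
array names are illustrative assumptions. ]]

    # Hypothetical sketch: per-channel standardization of NIR spectra.
    import numpy as np

    def standardize(spectra, eps=1e-12):
        """spectra: (n_samples, 700) array of raw NIR values."""
        mean = spectra.mean(axis=0)
        std = spectra.std(axis=0) + eps             # avoid divide-by-zero
        return (spectra - mean) / std, mean, std

    # Later spectra must be scaled with the *training* statistics:
    #   test_scaled = (test_spectrum - mean) / std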

Any help with this problem will be greatly appreciated,
Kam.

------------------------------

Subject: Maximum likelihood estimators for weights in feed-forward nets
From: spackman@ohsuhcx.ohsu.edu (Dr. Kent Spackman)
Organization: Oregon Health Sciences University, Portland
Date: 10 Apr 90 19:54:22 +0000


I'm looking for references to articles that have anything to do with the
relationship between back-propagation and maximum likelihood estimation,
such as is done by logistic regression.

I have programmed a maximum likelihood estimator that trains a nnet by
sort-of back-propagating likelihoods, resulting in a "multi-layer"
logistic regression. It works well for simple problems (X-OR, etc.)
involving multiple layers and reproduces the results of single-layer
logistic regression.

Has anyone done this before? Do you know of any references?

Some background information:

Logistic regression is a commonly used statistical method in medicine.
It can be described as a method for maximum likelihood estimation of the
weights of a single-layer feed-forward neural network. It uses the
logistic transfer function just like "ordinary" back-propagation.
Back-propagation does not do maximum likelihood estimation. Minimizing
the sum of squared errors is not the same as maximum likelihood when
using the logistic transfer function.
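
[[ Editor's note: a small Python sketch of the distinction drawn above,
for a single logistic unit: maximum likelihood for binary targets means
minimizing cross-entropy, whose weight gradient is x^T (y - t), whereas
squared error contributes an extra y(1 - y) factor. Illustrative only,
not Dr. Spackman's program. ]]

    # Hypothetical sketch: the two gradients differ by a y*(1-y) factor.
    import numpy as np

    def logistic(z):
        return 1.0 / (1.0 + np.exp(-z))

    def grad_cross_entropy(w, x, t):    # maximum-likelihood gradient
        y = logistic(x @ w)
        return x.T @ (y - t)

    def grad_squared_error(w, x, t):    # "ordinary" backprop gradient
        y = logistic(x @ w)
        return x.T @ ((y - t) * y * (1 - y))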

Reply to the net or directly via email to me: spackman@ohsu.edu
Kent A. Spackman
Biomedical Information Communication Center
Oregon Health Sciences University
Portland, OR 97201

------------------------------

Subject: Re: Maximum likelihood estimators for weights in feed-forward nets
From: mgv@usceast.UUCP (Marco Valtorta)
Organization: University of South Carolina, Columbia
Date: 11 Apr 90 15:41:10 +0000

You may want to read "A Statistical Approach to Learning and
Generalization in Layered Neural Networks," by Levin, Tishby, and Solla,
in the COLT '89 Proceedings, pp. 245-260. The Proceedings are published
by Morgan Kaufmann.

Marco Valtorta                   usenet: ...!ncrcae!usceast!mgv
Department of Computer Science   internet: mgv@cs.scarolina.edu
University of South Carolina     tel.: (1)(803)777-4641
Columbia, SC 29208               tlx: 805038 USC
U.S.A.                           fax: (1)(803)777-3065
usenet from Europe: ...!mcvax!uunet!ncrlnk!ncrcae!usceast!mgv

------------------------------

Subject: Is there a clock in the brain?
From: markh@csd4.csd.uwm.edu (Mark William Hopkins)
Organization: University of Wisconsin-Milwaukee
Date: 11 Apr 90 04:59:54 +0000


In order to reconstruct the underlying architecture of the brain, the
ultimate goal of AI, you need to perform tests to determine its
functional behavior. In that regard, the brain is just like any other
physical system and should be treated as such. It's surprising how far
one can go with simple "black box" experiments to test the brain like
one would test a transistor or circuit. All you're really doing in the
process is a kind of "debugging", only here you don't have the source
code or specification. That process is called reverse engineering:
deriving the design of a system from its functional behavior.

We're all aware of the various cycles that we experience (such as the
sleep-wake cycle). And based on such experiences we may easily conclude
that there is some kind of biological clock in our bodies.

One time in my Biology class in high school, our teacher wanted to
prove to us that our sense of time was unreliable. What he did was have
the entire class time off 300 seconds upon hearing the "GO!" from him.
He simultaneously timed off 300 seconds from his watch. The students
were asked to raise their hand upon reaching 300.

What he suspected was that most everyone would raise their hand
prematurely, as was the case. On that basis he would demonstrate just
how inaccurate we innately were.

There was one interesting exception though. When I reached 300 and
raised my hand it occurred at exactly the time the teacher said "STOP!".
The impulse to raise it actually occurred just BEFORE the teacher said
stop, the hand went up just after. To this day I still don't know
whether he ignored me or just didn't notice...

When I thought more about it, it seemed to me that I must have timed
off 5 minutes right down to the last 1/10th of a second. This would not
be surprising considering that I had had lots of practice doing this
(with feedback from an actual clock).

So that prompted a couple questions:

* Is there a clock in the human brain? I mean a clock which behaves
like a counter/timer peripheral as it would for a CPU.

and
* Can you *program* a clock into your brain?

Now these questions can be fairly easily resolved by a few simple
experiments. I set up such experiments by configuring a microprocessor
to clock input signals from a push-button (which had already been
installed for another project). The microprocessor was capable of
clocking the interval between two push-button signals down to the last
instruction cycle (1 microsecond). The signal delay was a small fraction
of an instruction cycle. The only non-repeatable deviation lay in my
internal clock and in the control of my hand, so I had complete
"control" over the deviation.

This is a very simple experiment to set up: it only took a few minutes
to program the machine in question and to whip up a simple driver program
to process statistics, display results and drive the process. If you
have a processor at hand, it is something you may want to try yourself.

The first experiment was to determine how well the brain could
resolve time. The means to do this was to try and clock off intervals
from the push button as close to a predetermined number of machine cycles
(600,000) as possible. The time was kept small for this experiment
(600,000 cycles is about 0.6 seconds).

A feedback loop was established by displaying the error (actual time
clocked off minus the target time); a running average and RMS deviation
were also continually displayed.

The goal was both to determine how well time could be resolved (using
the RMS deviation as an indicator), and to determine how well the brain
could be TRAINED to resolve time (using the feedback).

The averages and RMS deviations were computed as "decaying" averages:
the effect of each trial decayed exponentially with subsequent trials,
with the "time constant" set so that the "half-life" would be 10 trials.
This enabled me to observe the trend in the RMS deviation (and average)
while still allowing the averaging to take place in a meaningful manner.
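
[[ Editor's note: a minimal Python sketch of the decaying statistics
described above; the decay factor is chosen so that a trial's influence
halves after 10 subsequent trials. A reconstruction under stated
assumptions, not the original microprocessor code. ]]

    # Hypothetical sketch: exponentially decaying mean and RMS deviation.
    import math

    DECAY = 0.5 ** (1.0 / 10.0)          # half-life of 10 trials

    def update(mean, mean_sq, error):
        """Fold one trial's error into the decaying statistics."""
        mean = DECAY * mean + (1 - DECAY) * error
        mean_sq = DECAY * mean_sq + (1 - DECAY) * error * error
        return mean, mean_sq, math.sqrt(mean_sq)    # mean, mean_sq, RMS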

The result: initially I was able to clock off intervals with an RMS
deviation of around 40,000 cycles (microseconds). With about an hour's
practice I managed to establish control to within a 25,000 microsecond
deviation.

This value varied widely and heavily depended on my degree of
concentration. It was very easy to "let go" and let the value slip back
up to the 40,000 to 50,000 range. However, there were times in which I
could (somewhat inconsistently) establish control to within 5000
microseconds.

The tentative conclusion of this part of the experiment was that (1) the
brain could be trained to resolve time, and that (2) with enough
practice this resolution could ultimately be carried down to a
plus-or-minus 5000 microsecond figure, at which point one runs into a
biological limit.

That is consistent with the assumption that there is a 100 Hz clock in
the brain: such a clock ticks once every 10,000 microseconds, and
quantizing an interval to the nearest tick gives errors of up to about
half a period, i.e., on the order of 5000 microseconds.

The second experiment was to determine how well the brain could clock
units of time over a longer period. The cycles were set to 1 second, and
the period in question was set to 5 seconds. The goal was to train an
internal second counter to achieve the accuracy of a clock. This would
establish that there is an oscillator in the brain which can be used to
drive a clock.

The same feedback control and statistical technique was used here as
in the previous experiment. This experiment was cut off before
completion (it is difficult to establish that many trials with a 5 second
cycle time), so the results are tentative.

Initially, by counting to 5, I could establish control to within about
500,000 cycles (1/2 second). With practice, I managed to get this RMS
deviation down to around 150,000 cycles. Maximum concentration allowed
me to tentatively establish control to under 50,000 cycles, but this was
difficult to maintain for any length of time.

There were definite subjective experiences that correlated with the
increased control. Gradually, a sense of being able to "see" intervals
as spans of spatial distance emerged. Also, the ability to "hear" the
interval elapsing (a continuous sound that lasted a given duration
marked an interval) evolved. It is interesting that I could achieve much
better control when using these indicators and "feeling the wait",
instead of counting off the cycles.

OTHER QUESTIONS:

There are interesting experiences of mine that have prompted other
related issues about internal counter/timers in the brain. I've
classified them as follows:

(1) CROSS-TALK:
How many counters (not necessarily running at the same period) can you
have "running" in your brain at once without interference? I've
succeeded in getting two to run at once on different occasions.

This brings up a related point. There are often times when I may be
thinking of a musical sequence when suddenly a song appears on TV or the
radio. All of a sudden, I find myself completely incapable of continuing
the sequence I had in mind originally due to the interference.

How many strands of music can one have running in the mind at once
without mutual interference? How far can this skill be developed? As
you know, this is what is involved in learning to play both the upper and
lower registers on a piano; it's what is involved in composing intricate
harmonies, like those of Bach.

(2) SUBLIMATION:
There have been occasions in which I have "set up" an internal counter
in my mind and I would forget all about it. Moments later I'd suddenly
become aware of going "...756, 757, ..." and realise that I forgot to
"shut off" the timer.
If you can sublimate that counter/timer like this, then what's controlling
the process when it's completely unconscious? Where's the "oscillator"?
And just how far can this "sublimation" be carried out?

------------------------------

Subject: Re: Is there a clock in the brain?
From: dmocsny@minerva.che.uc.edu (Daniel Mocsny)
Organization: University of Cincinnati, Cin'ti., OH
Date: 12 Apr 90 19:46:24 +0000

In article <3377@uwm.edu> markh@csd4.csd.uwm.edu (Mark William Hopkins) writes:
> * Is there a clock in the human brain? I mean a clock which behaves like a
>counter/timer peripheral as it would for a CPU.
>
>and
>
> * Can you *program* a clock into your brain?

Ask any good drummer. A good drummer can not only play at a requested
tempo, measured in beats per minute, but can also delay and/or advance
particular hits by milliseconds to give a desired "feel" to the music.

I think if you are interested in brains and timing, you should be talking
to your musician friends. Or get yourself a MIDI sequencer and play
around with it. You can *easily* hear the difference when you adjust
tempo of a song slightly, or adjust timing of individual notes by not
very many milliseconds.

Dan Mocsny                                Snail:
Internet: dmocsny@minerva.che.uc.edu      Dept. of Chemical Engng. M.L. 171
          dmocsny@uceng.uc.edu            University of Cincinnati
513/751-6824 (home) 513/556-2007 (lab)    Cincinnati, Ohio 45221-0171

------------------------------

Subject: The AI'90 CALL FOR PAPERS
From: chii@ee.su.oz.au (Liang Chii)
Organization: Electrical Engineering, The University of Sydney, Australia
Date: 28 Mar 90 08:38:03 +0000

AI'90 CALL FOR PAPERS ACS
=================

4TH AUSTRALIAN JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE
21-23 NOVEMBER, 1990
HYATT REGENCY, PERTH, WESTERN AUSTRALIA

This conference is a major regional forum for the presentation of recent
research on the theory and practical applications of Artificial
Intelligence. It acts as a catalyst to stimulate further research and
cooperation in this important area within the Australasia and
India-Pacific region. The theme of this year's conference aims to
encourage and promote AI techniques and tools for solving everyday
problems.

TOPICS INCLUDE (BUT ARE NOT LIMITED TO)
=======================================

o Logic and Reasoning
o Knowledge Representation and Acquisition
o Machine Learning
o Artificial Neural Networks
o Computer Vision and Robotics
o Natural Language and Speech Recognition
o Expert Systems and development tools
o Applied AI in Civil, Electrical, Electronic and Mechanical Engineering
o Knowledge Engineering in Business Applications
o Applications in Government and Mining

CRITERIA FOR ACCEPTANCE
=======================

This conference welcomes high quality papers which make a significant
contribution to the theory or practice of AI. Papers in the application
areas will be judged by the novelty of the application, its formulation,
the application of new AI techniques, and the success of the application
project.

REQUIREMENT FOR SUBMISSION
==========================

Authors must submit FOUR copies of their FULL papers to AI'90 Program
Committee by 11th May 1990. Submissions after the deadline may be
returned without being opened. Notification of acceptance and format of
the camera ready copy will be posted by the 27th July 1990. The camera
ready final paper will be due on 24th August 1990.

PAPER FORMAT FOR REVIEW
=======================

Papers should be about 5000 words in length, typed with at least
one-and-a-half line spacing, and clearly legible. Authors should try to
limit their papers to no more than 15 pages, not including diagrams.
Each paper must include a title and an abstract of about 100 words, but
no other identifying marks. The abstract, with the title, authors'
names, and correspondence address, should accompany the submission on a
separate page.

PUBLICATION
===========

All papers accepted in the conference will be published in the conference
proceedings. Following the tradition of this conference, effort will
also be made to publish selected papers from the conference in book form
for wider circulation.

SUBMIT PAPERS TO
================

AI'90 Program Committee
c/o Department of Computer Science
University of Western Australia
Nedlands, WA 6009
Western Australia, Australia

Enquiries :
===========

Dr. C.P. Tsang, AI'90 Programme Chair
Tel : +61 9 380 2763
Fax : +61 9 382 1688
email : ai90paper@wacsvax.oz.au

------------------------------

End of Neuron Digest [Volume 6 Issue 26]
****************************************
