
Neuron Digest   Monday, 23 Jul 1990                Volume 6 : Issue 44 

Today's Topics:
Cascade-Correlation simulator in C
Two papers on neural nets and RNA/DNA
Re: Distinguishing "Normal" from "Abnormal" Data
Paper citation requested
Looking for a paper
Distinguishing "Normal" from "Abnormal" Data
Re: Distinguishing "Normal" from "Abnormal" Data
From Standards Committee
AI Forum Meeting
NEURAL COMPUTATION 2:2
New tech report
ASE91 Conference announcement please distribute
Re: Technology Transfer Mailing List


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Cascade-Correlation simulator in C
From: Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU
Date: Sun, 17 Jun 90 01:02:35 -0400


Thanks to Scott Crowder, one of my graduate students at Carnegie Mellon,
there is now a C version of the public-domain simulator for the
Cascade-Correlation learning algorithm. This is a translation of the
original simulator that I wrote in Common Lisp. Both versions are now
available by anonymous FTP -- see the instructions below. Before anyone
asks, we are *NOT* prepared to make tapes and floppy disks for people.

Since this code is in the public domain, it is free and no license
agreement is required. Of course, as a matter of simple courtesy, we
expect people who use this code, commercially or for research, to
acknowledge the source. I am interested in hearing about people's
experience with this algorithm, successful or not. I will maintain an
E-mail mailing list of people using this code so that I can inform users
of any bug-fixes, new versions, or problems. Send me E-mail if you want
to be on this list.

If you have questions that specifically pertain to the C version, contact
Scott Crowder (rsc@cs.cmu.edu). If you have more general questions about
the algorithm and how to run it, contact me (fahlman@cs.cmu.edu). We'll
try to help, though the time we can spend on this is limited. Please use
E-mail for such queries if at all possible. Scott Crowder will be out of
town for the next couple of weeks, so C-specific problems might have to
wait until he returns.

The Cascade-Correlation algorithm is described in S. Fahlman and C.
Lebiere, "The Cascade-Correlation Learning Architecture" in D. S.
Touretzky (ed.) _Advances_in_Neural_Information_Processing_Systems_2_,
Morgan Kaufmann Publishers, 1990. A tech report containing essentially
the same information can be obtained via FTP from the "neuroprose"
collection of postscript files at Ohio State. (See instructions below.)
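[Ed. note: for readers who want the flavor of the algorithm before fetching the simulator, here is a heavily simplified toy sketch of my own -- it is NOT taken from the cascor1 sources. The real simulator trains weights with quickprop and uses a pool of candidate units; this sketch uses plain least squares for the output weights and simple gradient ascent on the candidate's correlation with the residual error, on an XOR toy problem.]

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -50.0, 50.0)))

def fit_outputs(H, y):
    # Fit the output weights by least squares (the real simulator
    # trains them with quickprop instead).
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def train_candidate(H, err, steps=3000, lr=1.0, seed=0):
    # Gradient ascent on the covariance between the candidate unit's
    # activation and the residual error, as in Fahlman & Lebiere.
    rng = np.random.default_rng(seed)
    v = rng.normal(scale=0.5, size=H.shape[1])
    e = err - err.mean()
    for _ in range(steps):
        a = sigmoid(H @ v)
        s = np.sign(np.sum((a - a.mean()) * e))
        s = s if s != 0 else 1.0
        v += lr * (H.T @ (s * e * a * (1.0 - a))) / len(e)
    return v

# Toy problem: XOR, which the bare linear output unit cannot solve.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
H = np.hstack([X, np.ones((4, 1))])          # inputs plus a bias column

for _ in range(2):                           # install two hidden units
    w = fit_outputs(H, y)
    residual = H @ w - y
    v = train_candidate(H, residual)
    # Freeze the candidate's input weights and widen the network; later
    # units see earlier units' outputs, which gives the "cascade".
    H = np.hstack([H, sigmoid(H @ v)[:, None]])

w = fit_outputs(H, y)
print(np.round(H @ w, 2))
```

Adding a frozen unit and refitting the output weights can never increase the least-squares error, which is the essence of why the cascade grows until the task is solved.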

Enjoy,

Scott E. Fahlman
School of Computer Science
Carnegie-Mellon University
Pittsburgh, PA 15217

---------------------------------------------------------------------------
To FTP the simulation code:

For people (at CMU, MIT, and soon some other places) with access to the
Andrew File System (AFS), you can access the files directly from
directory "/afs/cs.cmu.edu/project/connect/code". This file system uses
the same syntactic conventions as BSD Unix: case sensitive names, slashes
for subdirectories, no version numbers, etc. The protection scheme is a
bit different, but that shouldn't matter to people just trying to read
these files.

For people accessing these files via FTP:

1. Create an FTP connection from wherever you are to machine
"pt.cs.cmu.edu".

2. Log in as user "anonymous" with no password. You may see an error
message that says "filenames may not have /.. in them" or something like
that. Just ignore it.

3. Change remote directory to "/afs/cs/project/connect/code". Any
subdirectories of this one should also be accessible. The parent
directories may not be.

4. At this point FTP should be able to get a listing of files in this
directory and fetch the ones you want. The Lisp version of the
Cascade-Correlation simulator lives in the file "cascor1.lisp". The C
version lives in "cascor1.c".

If you try to access this directory by FTP and have trouble, please
contact me.

The exact FTP commands you use to change directories, list files, etc.,
will vary from one version of FTP to another.
---------------------------------------------------------------------------
To access the postscript file for the tech report:

unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62)
Name: anonymous
Password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get fahlman.cascor-tr.ps.Z
ftp> quit
unix> uncompress fahlman.cascor-tr.ps.Z
unix> lpr fahlman.cascor-tr.ps (use the flag your printer needs for PostScript)

------------------------------

Subject: Two papers on neural nets and RNA/DNA
From: BRUNAK@nbivax.nbi.dk
Date: Mon, 16 Jul 90 13:29:00 +0200



Two papers on neural nets and RNA/DNA:

``Cleaning up gene databases'', S. Brunak, J. Engelbrecht and S. Knudsen,
Nature, vol. 343, p. 123, 1990.

``Neural networks detect errors in the assignment of mRNA splice sites'',
S. Brunak, J. Engelbrecht and S. Knudsen, Nucleic Acids Research, 1990
(to appear).

Preprints are available.

Soren Brunak
Department of Structural Properties of Materials
Building 307
The Technical University of Denmark
DK-2800 Lyngby
Denmark

------------------------------

Subject: Re: Distinguishing "Normal" from "Abnormal" Data
From: Don Malkoff <dmalkoff@42.21.decnet>
Date: Mon, 16 Jul 90 09:01:06 -0400

Loren Petrich asked about the use of neural networks to distinguish
"normal" and "abnormal" phenomena:

The "Stochasm" neural network is used for signal detection and
classification in the domains of sonar and radar [1,2,3]. It makes and
dynamically maintains a model of "normal" background which resides in the
input layer and serves as a discriminator between "normal" and
"abnormal". Patterns of interest are passed to the second layer which
then performs classification. The network might be described as a
back-to-back radial basis function classifier.
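[Ed. note: the background-model idea can be illustrated with a small sketch of my own -- this is not Stochasm's actual code. Store the "normal" samples seen so far, estimate a radial-basis (Gaussian-kernel) density over them, and flag any input whose density falls below a threshold calibrated on the background itself; only the flagged patterns would be passed on to a second, classifying stage.]

```python
import numpy as np

rng = np.random.default_rng(1)

# The "normal" background: samples the system has seen so far.
background = rng.normal(0.0, 1.0, size=(500, 2))

def density(x, data, width=0.5):
    # Mean of Gaussian (radial basis) kernels centred on stored samples.
    d2 = np.sum((data - x) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2.0 * width ** 2)))

# Set the threshold so that about 99% of the background itself
# still scores as "normal".
scores = np.array([density(x, background) for x in background])
threshold = np.quantile(scores, 0.01)

def is_abnormal(x):
    return density(x, background) < threshold

print(is_abnormal(np.array([0.1, -0.2])))   # a point near the background
print(is_abnormal(np.array([6.0, 6.0])))    # a point far from anything seen
```

A dynamically maintained background model would additionally age out old samples as the background drifts; the fixed sample set here is the simplest static stand-in.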

1. Malkoff, D.B., "Detection and Classification by Neural Networks and
Time-Frequency Distributions," the Second International Symposium on
Signal Processing and Its Applications, Time-Frequency Signal Analysis
Workshop, Gold Coast, Australia, August 1990. To be published in
"Time-Frequency Signal Analysis - Methods, Algorithms & Application,"
Longman and Cheshire, Australia, 1990.

2. Malkoff, D.B. and L. Cohen, "A Neural Network Approach to the
Detection Problem Using Joint Time-Frequency Distributions," presented
at IEEE 1990 International Conference on Acoustics, Speech, and Signal
Processing, Albuquerque, New Mexico, April 3-6, 1990. Published in
Proceedings.

3. Malkoff, D.B., "A Neural Network for Real-Time Signal Processing,"
"Advances in Neural Information Processing Systems," ed. D. Touretzky,
Morgan Kaufmann Publishers, Inc., Volume 2, 1990.
____________________________________
Donald B. Malkoff
General Electric Company
Advanced Technology Laboratories
Moorestown Corporate Center
Bldg. 145-2, Route 38
Moorestown, N.J. 08057
(609) 866-6516
Email address: "dmalkoff@atl.dnet.ge.com"

------------------------------

Subject: Paper citation requested
From: stanley@visual1.tamu.edu (Stanley Guan)
Date: Mon, 16 Jul 90 16:46:03 -0500

Does anyone know of the following paper:
F. Girosi and T. Poggio. A theory of networks for approximation
and learning: part two. A. I. Memo, Artificial Intelligence
Laboratory, Massachusetts Institute of Technology, 1989 (?)
Is it published?


Any help will be highly appreciated.


-- Stanley Guan (stanley@visual1.tamu.edu)


------------------------------

Subject: Looking for a paper
From: roysam@ecse.rpi.edu (Roysam)
Date: Thu, 19 Jul 90 15:51:50 -0400


Hornik, K. M., M. Stinchcombe, and H. White, "Multilayer feedforward
networks are universal approximators," Neural Networks, 1989: to appear.

I'll appreciate any help in obtaining a copy of this paper.

Thanks

Badri Roysam
Assistant Professor
Department of ECSE
Rensselaer Polytechnic Institute,
Troy, NY 12180.

------------------------------

Subject: Distinguishing "Normal" from "Abnormal" Data
From: loren@tristan.llnl.gov (Loren Petrich)
Organization: Lawrence Livermore National Laboratory
Date: 13 Jul 90 22:37:11 +0000


I may have asked about this earlier, and I am asking about
this again. I hope to use Neural Nets to analyze astronomical data,
and for this purpose, it will be vitally important to distinguish
"normal" and "abnormal" phenomena. I mean by "normal" anything that is
very commonplace; "abnormal" anything that is relatively rare. Since
the "abnormal" phenomena are sometimes the most interesting ones, it
will be vital to pick them out. I even think it may be better to risk
misclassifying some "normal" phenomena as "abnormal" than the other
way around.

Has anyone else faced similar problems?

What is the most efficient way to solve such problems?

Is a backprop network a good thing to use, and if so, what
would be the most suitable type of training set? Would one use a
mixture of known "normal" inputs and randomly generated "abnormal"
inputs, with one output being a normal/abnormal indicator?
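[Ed. note: a toy illustration of the proposed training-set construction, from the editor -- not a recommendation, and deliberately degenerate: a single sigmoid output on one hand-crafted feature rather than a real multi-layer backprop net. Known "normal" inputs are mixed with randomly generated "abnormal" ones, and the single output is trained as the normal/abnormal indicator.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Known "normal" inputs (a tight clump) plus randomly generated
# "abnormal" inputs (spread over a much wider range), as proposed.
normal = rng.normal(0.0, 1.0, size=(200, 2))
abnormal = rng.uniform(-8.0, 8.0, size=(200, 2))
X = np.vstack([normal, abnormal])
t = np.concatenate([np.zeros(200), np.ones(200)])  # 1 = "abnormal"

# One sigmoid output fed by a single hand-crafted feature (squared
# distance from the training mean) -- the smallest trainable unit
# that can separate "near the clump" from "far away".
feat = np.sum((X - X.mean(axis=0)) ** 2, axis=1, keepdims=True)
feat = feat / feat.std()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W = np.zeros(1)
b = 0.0
for _ in range(500):                 # plain gradient descent
    p = sigmoid(feat @ W + b)
    g = p - t                        # dLoss/dz for cross-entropy loss
    W -= 0.1 * (feat.T @ g) / len(t)
    b -= 0.1 * g.mean()

accuracy = np.mean((sigmoid(feat @ W + b) > 0.5) == (t == 1))
print(accuracy)
```

Note the built-in hazard of random "abnormal" examples: some of them land inside the "normal" clump and are necessarily mislabeled, which caps the achievable accuracy.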

Loren Petrich, the Master Blaster
loren@sunlight.llnl.gov
One may need to route through any of:
    lll-lcc.llnl.gov
    lll-crg.llnl.gov
    star.stanford.edu
For example, use:
loren%sunlight.llnl.gov@star.stanford.edu

My sister is a Communist for Reagan

------------------------------

Subject: Re: Distinguishing "
Normal" from "Abnormal" Data
From: jgk@osc.COM (Joe Keane)
Organization: Object Sciences Corp., Menlo Park, CA
Date: 17 Jul 90 23:33:24 +0000

In article <64712@lll-winken.LLNL.GOV> loren@tristan.llnl.gov (Loren Petrich)
writes:
> I may have asked about this earlier, and I am asking about
>this again. I hope to use Neural Nets to analyze astronomical data,
>and for this purpose, it will be vitally important to distinguish
>"normal" and "abnormal" phenomena. I mean by "normal" anything that is
>very commonplace; "abnormal" anything that is relatively rare. Since
>the "abnormal" phenomena are sometimes the most interesting ones, it
>will be vital to pick them out. I even think it may be better to risk
>misclassifying some "normal" phenomena as "abnormal" than the other
>way around.
>
> Has anyone else faced similar problems?

Yup.

> What is the most efficient way to solve such problems?

This may be heresy in comp.ai.neural-nets, but this task seems ideally suited
to standard statistical analysis. Off the top of my head, it's hard to say
what sort of distribution you want. A multi-variate normal might work
sufficiently well, although you probably want something multi-mode.
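[Ed. note: a minimal sketch of that suggestion, added by the editor under the assumed single-Gaussian "normal" class. Fit a multivariate normal to the normal data, then flag points whose squared Mahalanobis distance exceeds a chi-squared cutoff; a multi-mode background would replace the single Gaussian with a mixture.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a body of "normal" observations.
normal_data = rng.multivariate_normal([0.0, 0.0],
                                      [[1.0, 0.3], [0.3, 2.0]],
                                      size=1000)

# Fit the multivariate normal: sample mean and covariance.
mu = normal_data.mean(axis=0)
cov = np.cov(normal_data, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis2(x):
    d = x - mu
    return d @ cov_inv @ d

# For 2-dimensional data the squared Mahalanobis distance of a normal
# point is chi-squared with 2 degrees of freedom, whose 99% point is
# -2 ln(0.01), about 9.21.
cutoff = -2.0 * np.log(0.01)

def is_abnormal(x):
    return mahalanobis2(x) > cutoff

print(is_abnormal(np.array([0.2, 0.1])))
print(is_abnormal(np.array([10.0, 10.0])))
```

The appeal over a trained network is that the whole "model" is a mean, a covariance, and one matrix inverse -- fitted in closed form, with a cutoff whose false-alarm rate is known in advance.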

> Is a backprop network a good thing to use, and if so, what
>would be the most suitable type of training set? Would one use a
>mixture of known "normal" inputs and randomly generated "abnormal"
>inputs, with one output being a normal/abnormal indicator?

Don't get me wrong, I think neural nets are very interesting, and they have
produced good results in some areas. But I see them being used where more
mundane methods would work quite well, and probably much faster.

It seems like NN is the newest trick, so people want to use it everywhere.
But in the process they don't hear about the old things, which is too bad. Is
it just me, or are others bothered by this trend?

------------------------------

Subject: From Standards Committee
From: ck@rex.cs.tulane.edu (Cris Koutsougeras)
Organization: Computer Science Dept., Tulane Univ., New Orleans, LA
Date: 16 Jul 90 16:57:06 +0000



REQUEST FOR HELP WITH NEURAL NETS TERMINOLOGY STANDARDS


The neural nets area is an interdisciplinary one. As such it has been
attracting researchers from various areas such as neurobiology, computer
science, controls, thermodynamics etc. Until the neural nets theme
brought researchers from such areas together, researchers in one area
took little interest in the results of research in another. So various
terms have been established for essentially the same concepts or
abstract entities. Unfortunately this variety of terms has been passed
on into neural nets terminology as it is used today. We have therefore
found ourselves dealing with a great diversity in terminology and
notation which can create misunderstanding and confusion for readers and
difficulties for writers. This problem is especially severe for persons
new to the field. You will find, for example, the terms node, neuron,
neurode, unit, processing element, and cell all used to refer to the
same entity. Another example: weight, synapse, connection strength,
propagation coefficient, etc.

To address this terminology/notation problem, the IEEE Neural Networks
Council has established an Ad Hoc Standards Committee. It is felt that
neural nets technology is still in such an actively evolving state that
an attempt to standardize terminology and notation must take precedence.
We would like to request that interested researchers in the area help
with the development of a list of terms for which there is felt to be a
need for a precise definition. All of you who have faced a problem with
the clarity of certain terms or concepts are invited to send us a list
of such terms. We also welcome suggestions for definitions which you
feel accurately and clearly describe the entity referred to by each term
or collection of terms.

A relevant article by R. Eberhart is published in the IEEE Transactions
on Neural Networks (Vol 1, No 2, June 90) which you may wish to consult.

Submit your contributions to Dr. E. Tzanakou, Bioengineering Dept.,
Rutgers University, Piscataway, NJ, or e-mail:
etzanako@elbereth.rutgers.edu



Cris Koutsougeras
Tulane University

------------------------------

Subject: AI Forum Meeting
From: kingsley@hpwrc02.hp.com
Date: Thu, 19 Jul 90 18:12:46 -0700

**************************************************************
*
*            A I   F O R U M   M E E T I N G
*
*  SPEAKER:  Jeffrey Canin
*  TOPIC:    Recent developments in the Supercomputer
*            Industry (Thinking Machines, N-Cube)
*  WHEN:     7 PM, Tuesday 7/24/90
*  WHERE:    Lockheed building 202, auditorium
*            3251 Hanover Street
*            Palo Alto, CA
*
*  AI Forum meetings are free, open and monthly!
*  Call (415) 594-1685 for more info
**************************************************************


------------------------------

Subject: NEURAL COMPUTATION 2:2
From: Terry Sejnowski <tsejnowski@UCSD.EDU>
Date: Fri, 13 Jul 90 18:06:21 -0700

NEURAL COMPUTATION
Table of Contents -- Volume 2:2

Visual Perception of Three-Dimensional Motion
David J. Heeger and Allan Jepson

Distributed Symbolic Representation of Visual Shape
Eric Saund

Modeling Orientation Discrimination at Multiple Reference Orientations
with a Neural Network
M. Devos and G. A. Orban

Temporal Differentiation and Violation of Time-Reversal Invariance in
Neurocomputation of Visual Information
D. S. Tang and V. Menon

Analysis of Linsker's Simulations of Hebbian Rules
David J. C. MacKay and Kenneth D. Miller

Applying Temporal Difference Methods to Fully Recurrent
Reinforcement Learning Networks
Jurgen Schmidhuber

Generalizing Smoothness Constraints from Discrete Samples
Chuanyi Ji, Robert R. Snapp, and Demetri Psaltis

The Upstart Algorithm: A Method for Constructing and Training
Feedforward Neural Networks
Marcus Frean

Layered Neural Networks with Gaussian Hidden Units As Universal
Approximations
Eric Hartman, James D. Keeler, and Jacek M. Kowalski

A Neural Network for Nonlinear Bayesian Estimation in Drug
Therapy
Reza Shadmehr and David D'Argenio

Analysis of Neural Networks with Redundancy
Yoshio Izui and Alex Pentland

Stability of the Random Neural Network Model
Erol Gelenbe

The Perceptron Algorithm Is Fast for Nonmalicious Distributions
Eric B. Baum


SUBSCRIPTIONS: Volume 2

______ $35 Student
______ $50 Individual
______ $100 Institution

Add $12 for postage outside USA and Canada (surface mail).
Add $18 for air mail.

(Back issues of volume 1 are available for $25 each.)

MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142.
(617) 253-2889.


------------------------------

Subject: New tech report
From: Subutai Ahmad <ahmad@ICSI.Berkeley.EDU>
Date: Tue, 17 Jul 90 13:30:06 -0700


The following technical report is available:

A Network for Extracting the Locations of Point Clusters
Using Selective Attention

by

Subutai Ahmad & Stephen Omohundro
International Computer Science Institute

ICSI Technical Report #90-011

Abstract

This report explores the problem of dynamically computing visual
relations in connectionist systems. It concentrates on the task of
learning whether three clumps of points in a 256x256 image form an
equilateral triangle. We argue that feed-forward networks for solving
this task would not scale well to images of this size. One reason for
this is that local information does not contribute to the solution: it
is necessary to compute relational information such as the distances
between points. Our solution implements a mechanism for dynamically
extracting the locations of the point clusters. It consists of an
efficient focus of attention mechanism and a cluster detection scheme.
The focus of attention mechanism allows the system to select any
circular portion of the image in constant time. The cluster detector
directs the focus of attention to clusters in the image. These two
mechanisms are used to sequentially extract the relevant coordinates.
With this new representation (locations of the points) very few
training examples are required to learn the correct function. The
resulting network is also very compact: the number of required weights
is proportional to the number of input pixels.
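[Ed. note: the sequential extract-then-check idea can be caricatured in a few lines. This is the editor's own simplification, not the report's mechanism: the report's focus of attention is a learned, constant-time circular selection, whereas this stand-in simply takes the brightest spot, records the centroid of a window around it, suppresses it, and repeats; the extracted coordinates then make the equilateral-triangle test trivial.]

```python
import numpy as np

def extract_clusters(img, k=3, radius=5):
    # Sequentially "attend" to the brightest spot, record the centroid
    # of a circular window around it, then suppress that window.
    img = img.copy()
    ys, xs = np.mgrid[:img.shape[0], :img.shape[1]]
    centers = []
    for _ in range(k):
        y, x = np.unravel_index(np.argmax(img), img.shape)
        mask = (ys - y) ** 2 + (xs - x) ** 2 <= radius ** 2
        w = img * mask
        centers.append(((ys * w).sum() / w.sum(),
                        (xs * w).sum() / w.sum()))
        img[mask] = 0.0              # inhibit; attention moves on
    return np.array(centers)

def is_equilateral(centers, tol=0.05):
    d = sorted(np.hypot(*(centers[i] - centers[j]))
               for i in range(3) for j in range(i + 1, 3))
    return (d[2] - d[0]) / d[2] < tol

# A 256x256 image with three point clumps at the corners of an
# equilateral triangle of side 100.
img = np.zeros((256, 256))
for (py, px) in [(60, 60), (60, 160), (60 + 100 * 3 ** 0.5 / 2, 110)]:
    img[int(py), int(px)] = 1.0

centers = extract_clusters(img)
print(is_equilateral(centers))
```

Once the relation is computed on three coordinates instead of 65,536 pixels, the learning problem shrinks accordingly, which is the report's scaling argument in miniature.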


Copies can be obtained in one of two ways:

1) ftp a postscript copy from cheops.cis.ohio-state.edu. The
file is ahmad.tr90-11.ps.Z in the pub/neuroprose directory. You can
either use the Getps script or follow these steps:

unix:2> ftp cheops.cis.ohio-state.edu
Connected to cheops.cis.ohio-state.edu.
Name (cheops.cis.ohio-state.edu:): anonymous
331 Guest login ok, send ident as password.
Password: neuron
230 Guest login ok, access restrictions apply.
ftp> cd pub/neuroprose
ftp> binary
ftp> get ahmad.tr90-11.ps.Z
ftp> quit
unix:4> uncompress ahmad.tr90-11.ps.Z
unix:5> lpr ahmad.tr90-11.ps


2) Order a hard copy from ICSI:

The cost is $1.75 per copy for postage and handling. Please enclose
your check with the order. Charges will be waived for ICSI sponsors
and for institutions that have exchange agreements with the Institute.
Make checks payable (in U.S. Dollars only) to "ICSI" and send to:

International Computer Science Institute
1947 Center Street, Suite 600
Berkeley, CA 94704

Be sure to mention TR 90-011 and include your physical address. For
more information, send e-mail to: info@icsi.berkeley.edu


--Subutai Ahmad
ahmad@icsi.berkeley.edu

------------------------------

Subject: ASE91 Conference announcement please distribute
From: Dan.Howard@na.oxford.ac.uk
Date: Thu, 19 Jul 90 18:42:06 +0100


Subject: ASE 91 Conference First Announcement

CALL FOR PAPERS
2nd International Conference


A S E 9 1

Application of Supercomputers
in Engineering



Sponsored by ISBE (International Society for Boundary Elements),
Wessex Institute of Technology, and NSF (pending)


BOSTON, MASSACHUSETTS
August 13-15 1991




ORGANIZATION AND EDITORIAL COMMITTEE:
-------------------------------------

Dr. Carlos Brebbia,
Director of W.I.T. and Computational Mechanics, UK.
Professor Avi Lin,
Temple University and ICOMP NASA, USA.
Dr. Daniel Howard,
Oxford University and Rolls Royce PLC, UK.
Dr. Alex Peters,
IBM Heidelberg, West Germany.


CONFERENCE OBJECTIVES:
----------------------

ASE91 aims to bring together computer scientists, computational engineers,
hardware engineers and mathematicians. The objective is to define the
proper roles for all these groups in the practical numerical simulation
of engineering problems. Illustrations of one-sidedness are the many
debates in computational dynamics over mesh/grid generation (structured
vs. unstructured) with little consideration for the computer science
role (for example, the development of a machine with fast indirect
memory addressing); meanwhile, hardware people do not usually take the
simulation end of things very much into account when designing a new
machine --- users are expected to respond to changes in hardware design.
Another example of one-sidedness is over-concern with algorithm CPU
time, while little attention is paid to the user-friendly aspects of
problem solving (the viewpoint of the practising engineer --- human
pre/post-processing simplicity, efficiency and cost). Finally, another
example is the academic use of mathematical error estimates for accuracy
measurement, ignoring the engineering estimate (the `estimate of
physical features'). ASE91 therefore hopes to clarify such issues and to
act as a forum for groups who do not normally meet. These groups will
hopefully influence each other and leave the venue with better
established views of their roles in the computational engineering
simulation world.


CONFERENCE THEMES:
-----------------

The first conference, ASE 89, took place at Southampton University and
resulted in the publication of two volumes of proceedings. At that
conference the emphasis was mainly on the impact of supercomputer
architectures on the engineering community and on the relevance of
benchmark tests for these computers. ASE89 attracted over one hundred
contributors and participants.

The themes of ASE 91 will have a stronger emphasis on parallel
algorithms for the efficient solution of partial differential
equations, on examples of large scale computation which have had an
impact on an engineering design, as well as on hardware and software
aspects of supercomputing which result in more efficient/fast indirect
memory addressing. Invited speakers on these and other relevant topics
will be disclosed in a future announcement.

Contributors should consider four main subject categories when submitting
an abstract:


(1) New and better algorithms for parallel engineering computation:

(a) multigrid and vector extrapolation schemes
(b) conjugate gradient methods
(c) operator split and domain decomposition
(d) the mathematics of parallel computation
(e) Finite and Boundary Element algorithms

(2) Examples of engineering applications on vector and parallel computers:

(a) structural dynamics, rock and ice mechanics
(b) fatigue, impact, and crash simulations
(c) Computational Fluid Dynamics, heat transfer and combustion
(d) turbulence and environmental modelling
(e) shallow water equations
(f) soil mechanics
(g) CAD and CIM interacting with Finite/Boundary Elements

(3) Three dimensional visualisation of engineering problems using the latest
algorithms, hardware configurations and distributed systems.

(4) Pre processors, and all engineering, algorithmic, and computer science
aspects of grid/mesh generation and data management.

Papers are invited on the topics outlined above and on other topics which
will fit within the general scope of the conference.


TIME SCHEDULE:
-------------

Submission of abstracts: **** 1st November 1990 (deadline) ****
Preliminary acceptance: 15th December 1990
Submission of final paper: 5th April 1991
Final acceptance: 17th May 1991
Conference: 13th-15th August 1991

ABSTRACTS should be no longer than 300 words and should clearly state
the purpose, results and conclusions of the work to be described in the
final paper. Final acceptance will be based upon review of the full
length paper.

ALL ABSTRACTS must be submitted to the Conference Secretary:

Liz Neuman,
Conference Secretary,
W.I.T., Ashurst Lodge
Ashurst, Southampton,
SO4 2AA, England, UK

Tel: 44-703-293223
FAX: 44-703-292853

For further information on ASE 91 please contact the Conference Secretary
above. Specific information on `themes' can be obtained from the following
E-mail addresses: howard@uk.ac.oxford.na
or avilin@euclid.math.temple.edu (if mailing from anywhere except UK)
avilin@edu.temple.math.euclid (if mailing from UK)



------------------------------

Subject: Re: Technology Transfer Mailing List
From: finin@PRC.Unisys.COM
Date: Wed, 11 Jul 90 01:21:30 -0400


This mailing list is a good idea. Here is an announcement of a
conference that is focused on technology transfer in AI:


The Seventh IEEE Conference on Artificial Intelligence Applications

Fontainebleau Hotel, Miami Beach, Florida
February 24 - 28, 1991

Call For Participation

Sponsored by The Computer Society of IEEE

The conference is devoted to the application of artificial intelligence
techniques to real-world problems. Two kinds of papers are appropriate:
case studies of knowledge-based applications that solve significant
problems and stimulate the development of useful techniques, and papers
on AI techniques and principles that underlie knowledge-based systems
and, in turn, enable ever more ambitious real-world applications. This
conference provides a forum for such synergy between applications and
AI techniques.

Papers describing significant unpublished results are solicited along
three tracks:

o "Scientific/Engineering" Applications Track. Contributions stemming
from the general area of industrial and scientific applications.

o "Business/Decision Support" Applications Track. Contributions stemming
from the general area of decision support applications in business,
government, law, etc.

Papers in these two application tracks must: (1) Justify the use
of the AI technique, based on the problem definition and an
analysis of the application's requirements; (2) Explain how AI
technology was used to solve a significant problem; (3) Describe
the status of the implementation; (4) Evaluate both the
effectiveness of the implementation and the technique used.

Short papers up to 1000 words in length will also be accepted for
presentation in these two application tracks.

o "Enabling Technology" Track. Contributions focusing on techniques
and principles that facilitate the development of practical knowledge
based systems that can be scaled to handle increasing problem
complexity. Topics include, but are not limited to: knowledge
representation, reasoning, search, knowledge acquisition, learning,
constraint programming, planning, validation and verification, project
management, natural language processing, speech, intelligent
interfaces, integration, problem-solving architectures, programming
environments and general tools.

Long papers in all three tracks should be limited to 5000 words and
short papers in the two applications tracks limited to 1000 words.
Papers which are significantly longer than these limits will not be
reviewed. The first page of the paper should contain the following
information (where applicable) in the order shown:

- Title.
- Authors' names and affiliation. (specify student status)
- Contact information (name, postal address, phone, fax and email address)
- Abstract: A 200 word abstract that includes a clear statement describing
the paper's original contributions and what new lesson is imparted.
- AI topic: one or more terms describing the relevant AI areas, e.g.,
knowledge acquisition, explanation, diagnosis, etc.
- Domain area: one or more terms describing the problem domain area,
e.g., mechanical design, factory scheduling, education, medicine, etc.
Do NOT specify the track.
- Language/Tool: Underlying programming languages, systems and tools used.
- Status: development and deployment status, as appropriate.
- Effort: Person-years of effort put into developing the particular
aspect of the project being described.
- Impact: A twenty word description of estimated or measured (specify)
benefit of the application developed.

Each paper accepted for publication will be allotted seven pages in
the conference proceedings. The best papers accepted in the two
applications tracks will be considered for a special issue of IEEE
EXPERT to appear late in 1991. An application has been made to
reserve a special issue of IEEE Transactions on Knowledge and Data
Engineering (TKDE) for publication of the best papers in the enabling
technologies track. IBM will sponsor an award of $1,500 for the
best student paper at the conference.

In addition to papers, we will be accepting the following types of
submissions:

- Proposals for Panel discussions. Provide a brief description of the
topic (1000 words or less). Indicate the membership of the panel and
whether you are interested in organizing/moderating the discussion.

- Proposals for Demonstrations. Submit a short proposal (under 1000
words) describing a videotaped and/or live demonstration. The
demonstration should be of a particular system or technique that
shows the reduction to practice of one of the conference topics.
The demonstration or videotape should be not longer than 15 minutes.

- Proposals for Tutorial Presentations. Proposals for three hour
tutorials of both an introductory and advanced nature are
requested. Topics should relate to the management
and technical development of useful AI applications. Tutorials
which analyze classes of applications in depth or examine
techniques appropriate for a particular class of applications are of
particular interest. Copies of slides are to be provided in advance to
IEEE for reproduction.

Each tutorial proposal should include the following:

* Detailed topic list and extended abstract (about 3 pages)
* Tutorial level: introductory, intermediate, or advanced
* Prerequisite reading for intermediate and advanced tutorials
* Short professional vita including presenter's experience in
lectures and tutorials.

- Proposals for Vendor Presentations. A separate session will be held
where vendors will have the opportunity to give an overview of
their AI-based software products and services.


IMPORTANT DATES

- August 31, 1990: Six copies of Papers, and four copies of all proposals
are due. Submissions not received by that date will be returned
unopened. Electronically transmitted materials will not be accepted.
- October 26, 1990: Author notifications mailed.
- December 7, 1990: Accepted papers due to IEEE. Accepted tutorial
notes due to Tutorial Chair.
- February 24-25, 1991: Tutorial Program of Conference
- February 26-28, 1991: Technical Program of Conference

Submit Papers and Other Materials to:

Tim Finin
Unisys Center for Advanced Information Technology
70 East Swedesford Road
PO Box 517
Paoli PA 19301
internet: finin@prc.unisys.com
phone: 215-648-2840; fax: 215-648-2288

Submit Tutorial Proposals to:

Daniel O'Leary
Graduate School of Business
University of Southern California
Los Angeles, CA 90089-1421
phone: 213-743-4092, fax: 213-747-2815

For registration and additional conference information, contact:

CAIA-91
The Computer Society of the IEEE
1730 Massachusetts Avenue, NW
Washington, DC 20036-1903
phone: 202-371-1013

CONFERENCE COMMITTEES

General Chair: Se June Hong, IBM Research
Program Chair: Tim Finin, Unisys
Publicity Chair: Jeff Pepper, Carnegie Group, Inc.
Tutorial Chair: Daniel O'Leary, University of Southern California
Local Arrangements: Alex Pelin, Florida International University, and
Mansur Kabuka, University of Miami
Program Committee:

  AT-LARGE:
    Tim Finin, Unisys (chair)
    Jan Aikins, AION Corp.
    Robert E. Filman, IntelliCorp
    Ron Brachman, AT&T Bell Labs
    Wolfgang Wahlster, German Res. Center for AI & U. of Saarlandes
    Mark Fox, CMU

  SCIENTIFIC/ENGINEERING TRACK:
    Chris Tong, Rutgers (chair)
    Sanjaya Addanki, IBM Research
    Bill Mark, Lockheed AI Center
    Sanjay Mittal, Xerox PARC
    Ramesh Patil, MIT
    David Searls, Unisys
    Duvurru Sriram, MIT

  ENABLING TECHNOLOGY TRACK:
    Howard Shrobe, Symbolics (chair)
    Lee Erman, Cimflex Teknowledge
    Eric Mays, IBM Research
    Norm Sondheimer, GE Research
    Fumio Mizoguchi, Tokyo Science Univ.
    Dave Waltz, Brandeis & Thinking Machines

  BUSINESS/DECISION SUPPORT TRACK:
    Peter Hart, Syntelligence (chair)
    Chidanand Apte, IBM Research
    Vasant Dhar, New York University
    Steve Kimbrough, U. of Pennsylvania
    Don McKay, Unisys


------------------------------

End of Neuron Digest [Volume 6 Issue 44]
****************************************
