Neuron Digest   Tuesday, 15 Nov 1988                Volume 4 : Issue 23 

Today's Topics:
Mark Jones to speak on neural nets and symbolic AI
colloquia
CVPR 89 submission deadline
Washington Neural Network Society
Technical report announcement
FINAL CALL FOR PAPERS
Report Available - Connectionist State Machines
BBS Call For Commentators: The Tag Assignment Problem


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"

------------------------------------------------------------

Subject: Mark Jones to speak on neural nets and symbolic AI
From: pratt@zztop.rutgers.edu (Lorien Y. Pratt)
Organization: Rutgers Univ., New Brunswick, N.J.
Date: 04 Nov 88 19:36:12 +0000


Fall, 1988
Neural Networks Colloquium Series
at Rutgers

Neural Nets and Symbolic AI
---------------------------

Mark Jones
AT&T Bell Laboratories

Room 705 Hill Center, Busch Campus
Friday November 11, 1988 at 11:10 am
Refreshments served before the talk


Abstract

I will present an overview of the rapidly developing area of neural
networks or connectionist networks and their application to problems in
Artificial Intelligence (AI). I will briefly discuss areas of
perception, feature discovery, associative memory and pattern
completion, but will concentrate on the relationship of neural networks
to classical symbolic AI domains such as natural language processing
and knowledge representation.


Lorien Y. Pratt
Computer Science Department, Rutgers University
Busch Campus, Piscataway, NJ 08854
pratt@paul.rutgers.edu
(201) 932-4634

------------------------------

Subject: colloquia
From: loui@wucs1.wustl.edu (Ron Loui)
Organization: Washington University, St. Louis, MO
Date: 04 Nov 88 21:26:44 +0000


COMPUTER SCIENCE COLLOQUIUM

Washington University
St. Louis

4 November 1988


TITLE: Why AI needs Connectionism? A Representation and Reasoning Perspective


Lokendra Shastri
Computer and Information Science Department
University of Pennsylvania



Any generalized notion of inference is intractable, yet we are capable of
drawing a variety of inferences with remarkable efficiency - often in a few
hundred milliseconds. These inferences are by no means trivial and support
a broad range of cognitive activity such as classifying and recognizing
objects, understanding spoken and written language, and performing
commonsense reasoning. Any serious attempt at understanding intelligence
must provide a detailed computational account of how such inferences may be
drawn with requisite efficiency. In this talk we describe some work within
the connectionist framework that attempts to offer such an account. We
focus on two connectionist knowledge representation and reasoning systems:

1) A connectionist semantic memory that computes optimal solutions to an
interesting class of inheritance and recognition problems extremely fast -
in time proportional to the depth of the conceptual hierarchy. In addition
to being efficient, the connectionist realization is based on an evidential
formulation and provides a principled treatment of exceptions, conflicting
multiple inheritance, as well as the best-match or partial-match
computation.

2) A connectionist system that represents knowledge in terms of multi-place
relations (n-ary predicates), and draws a limited class of inferences based
on this knowledge with extreme efficiency. The time taken by the system to
draw conclusions is proportional to the length of the proof, and hence,
optimal. The system incorporates a solution to the "variable binding"
problem and uses the temporal dimension to establish and maintain bindings.

We conclude that working within the connectionist framework is well
motivated: it helps in identifying interesting classes of limited
inference that can be performed with extreme efficiency, and it aids in
discovering the constraints that must be placed on the conceptual
structure in order to achieve such efficiency.
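
As a rough illustration of the second system, here is a toy Python
sketch (our own, not Shastri's implementation) of variable binding via
the temporal dimension: an argument role and the entity that fills it
are simply made to fire in the same phase of a repeating cycle, and a
binding is read back by finding the nodes that share a phase. The names
and the phase limit below are hypothetical.

NUM_PHASES = 5  # phases available in one oscillation cycle

class SynchronyNet:
    def __init__(self):
        self.phase_of = {}       # node name -> phase index
        self.next_phase = 0

    def bind(self, role, entity):
        """Give the role node and its filler a common firing phase."""
        if entity not in self.phase_of:
            if self.next_phase >= NUM_PHASES:
                raise RuntimeError("no free phases: binding capacity exceeded")
            self.phase_of[entity] = self.next_phase
            self.next_phase += 1
        self.phase_of[role] = self.phase_of[entity]

    def bound_entities(self, role):
        """Read a binding back: entities firing in phase with the role."""
        phase = self.phase_of.get(role)
        return [n for n, p in self.phase_of.items()
                if p == phase and n != role]

net = SynchronyNet()
# Represent give(John, Mary, book) by phase-locking roles to fillers.
net.bind("give.giver", "John")
net.bind("give.recipient", "Mary")
net.bind("give.object", "book")
print(net.bound_entities("give.recipient"))    # -> ['Mary']

Because a cycle holds only a few distinct phases, only a handful of
bindings can be active at once; limits of this kind are one example of
the constraints on conceptual structure mentioned above.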


host: Ronald Loui
___________________________________________________________________________

1988-89 AI Colloquium Series (through February)


Sep 16 Michael Wellman, MIT/Air Force
"The Trade-off Formulation Task in Planning under Uncertainty"
30 Kathryn Laskey, Decision Science Consortium
"Assumptions, Beliefs, and Probabilities"
Nov 4 Lokendra Shastri, University of Pennsylvania
"Why AI Needs Connectionism? A Representation and Reasoning
Perspective"

11 Peter Jackson, McDonnell Douglas Research Laboratories
"Diagnosis, Defaults, and Abduction"
18 Eric Horvitz, Stanford University (decision-theoretic control
of problem-solving)
Dec 2 Mark Drummond, NASA Ames (planning)
Feb 3 Fahiem Bacchus, University of Waterloo (uncertain reasoning)
10 Dana Nau, University of Maryland (TBA)

other speakers to be announced

____________________________________________________________________________

------------------------------

Subject: CVPR 89 submission deadline
From: wnm@uvacs.cs.Virginia.EDU (Worthy N. Martin)
Organization: U.Va. CS Department, Charlottesville, VA
Date: 07 Nov 88 01:50:02 +0000


The following call for papers has appeared before; however,
it is being reissued to remind interested
parties of the first deadline, namely:

---->
----> November 16, 1988 -- Papers submitted
---->

This deadline will be strictly enforced, with the submission
date determined by postmark.
Thank you for your interest in CVPR 89.
Worthy Martin


----------------------------------------------------------

CALL FOR PAPERS

IEEE Computer Society Conference
on
COMPUTER VISION AND PATTERN RECOGNITION

Sheraton Grand Hotel
San Diego, California
June 4-8, 1989.



General Chair


Professor Rama Chellappa
Department of EE-Systems
University of Southern California
Los Angeles, California 90089-0272


Program Co-Chairs

Professor Worthy Martin
Dept. of Computer Science
Thornton Hall
University of Virginia
Charlottesville, Virginia 22901

Professor John Kender
Dept. of Computer Science
Columbia University
New York, New York 10027


Program Committee

Charles Brown John Jarvis Gerard Medioni
Larry Davis Avi Kak Theo Pavlidis
Arthur Hansen Rangaswamy Kashyap Alex Pentland
Robert Haralick Joseph Kearney Roger Tsai
Ellen Hildreth Daryl Lawton John Tsotsos
Anil Jain Martin Levine John Webb
Ramesh Jain David Lowe



Submission of Papers

Four copies of complete drafts, not exceeding 25 double-spaced
typed pages, should be sent to Worthy Martin at the address
given above by November 16, 1988 (THIS IS A HARD DEADLINE).
Reviewing will be anonymous for both authors and reviewers: the
cover page will be removed before papers are distributed for
review. The cover page must contain the title, authors' names,
primary author's address and telephone number, and index terms
containing at least one of the topics listed below. The second
page of the draft should contain the title and an abstract of
about 250 words. Authors will be notified of acceptance by
February 1, 1989, and final camera-ready papers, typed on
special forms, will be required by March 8, 1989.


Submission of Video Tapes

As a new feature, there will be one or two sessions where
authors can present their work using video tapes only. For
information regarding the submission of video tapes for review
purposes, please contact John Kender at the address above.



Conference Topics Include:

-- Image Processing
-- Pattern Recognition
-- 3-D Representation and Recognition
-- Motion
-- Stereo
-- Visual Navigation
-- Shape from _____ (Shading, Contour, ...)
-- Vision Systems and Architectures
-- Applications of Computer Vision
-- AI in Computer Vision
-- Robust Statistical Methods in Computer Vision



Dates

November 16, 1988 -- Papers submitted
February 1, 1989 -- Authors informed
March 8, 1989 -- Camera-ready manuscripts to IEEE
June 4-8, 1989 -- Conference

------------------------------

Subject: Washington Neural Network Society
From: will@ida.org (Craig Will)
Date: Mon, 07 Nov 88 15:32:23 -0500

The Washington Neural Network Society
and
The IEEE Computer Society
Artificial Intelligence Subchapter

Joint Meeting
November 14, 1988 7:00 PM

Speaker: C. Lee Giles
Air Force Office of Scientific Research
Washington, D.C.

High Order Neural Networks


Conventional neural networks make use of interconnections
between neurons based on a single connection strength, or
weight, associated with each input to a neuron. Such networks
often require multiple layers of neurons to solve specific
problems, and their difficulties with learning and stability have
not been completely overcome.

An alternative approach is to increase the computational
power of each neuron in the network. One way to do this is to
use more complex interconnections that include not only a
single connection strength for each input to a neuron, but
also a weight representing the connection strength for each
pair of other neurons feeding into a unit, a weight for each
triple of other neurons feeding into an input, and so forth.
Such networks are called higher order networks, and appear to
have several advantages. They can efficiently construct
high-order internal representations that capture higher-order
regularities in complex, high-dimensional data. This allows a
network to encode invariances, such as the ability to
recognize a visual shape regardless of size, rotation, or a
vertical or horizontal shift. Higher order networks can
perform the same computations as conventional networks with
fewer layers, and usually learn faster by orders of magnitude.
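
To make the idea concrete, the following Python sketch (ours, not from
the talk) implements a single second-order unit under the usual
formulation y = f(sum_i w_i x_i + sum_{i<j} w_ij x_i x_j + bias). With
the hand-picked weights below, one such unit computes XOR, a task that
ordinarily requires a hidden layer of first-order units.

import itertools
import numpy as np

def second_order_unit(x, w1, w2, bias):
    """One unit with first-order weights w1 and pairwise weights w2."""
    x = np.asarray(x, dtype=float)
    linear = w1 @ x
    quadratic = sum(w2[i, j] * x[i] * x[j]
                    for i, j in itertools.combinations(range(len(x)), 2))
    return 1.0 if linear + quadratic + bias > 0 else 0.0

w1 = np.array([1.0, 1.0])
w2 = np.array([[0.0, -2.0],       # weight on the product x0 * x1
               [0.0,  0.0]])
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, second_order_unit(x, w1, w2, bias=-0.5))
# -> 0.0, 1.0, 1.0, 0.0: XOR from a single unit, with no hidden layer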

In this talk Lee Giles will present an introduction to
higher order neural networks and discuss their advantages over
conventional approaches. He will also discuss difficulties
with higher order networks, such as scaling up to large
systems, as well as potential solutions.


Dr. C. Lee Giles is with the Air Force Office of Scientific
Research, Bolling Air Force Base, Washington, D.C., where he is
a program officer responsible for basic research on the
computational aspects of neural networks.

The meeting will be held at MITRE in McLean, Virginia.
Take the Beltway (495) toward McLean, and get off at exit 11
(if coming from Virginia) or exit 11A (if coming from Maryland).
Take Route 123 north to the second traffic light (at Colshire
Drive). Go right onto Colshire Drive and follow it to MITRE's
Hayes Building at the top of the hill. From Washington, take
66 West to the Beltway and go north, then get off at exit 11 as
above. For more information call Diane Entner at
(703) 243-6996.


Schedule:
7:00 - 7:10 Introduction (Craig Will, Diane Entner)
7:10 - 7:20 Neural nets at MITRE (Alexis Wieland)
7:20 - 8:20 Speaker (Lee Giles)
8:20 - 9:20 Informal discussion with refreshments


------------------------------

Subject: Technical report announcement
From: David.Servan-Schreiber@A.GP.CS.CMU.EDU
Date: Wed, 09 Nov 88 00:05:00 -0500



The following technical report is available upon request:

ENCODING SEQUENTIAL STRUCTURE IN SIMPLE RECURRENT NETWORKS
David Servan-Schreiber, Axel Cleeremans & James L. McClelland
CMU-CS-88-183

We explore a network architecture introduced by Elman (1988) for
predicting successive elements of a sequence. The network uses
the pattern of activation over a set of hidden units from time-
step t-1, together with element t, to predict element t+1. When
the network is trained with strings from a particular finite-
state grammar, it can learn to be a perfect finite-state
recognizer for the grammar. When the net has a minimal number of
hidden units, patterns on the hidden units come to correspond to
the nodes of the grammar; however, this correspondence is not
necessary for the network to act as a perfect finite-state
recognizer. We explore the conditions under which the network can
carry information about distant sequential contingencies across
intervening elements. Such information is
maintained with relative ease if it is relevant at each
intermediate step; it tends to be lost when intervening elements
do not depend on it. At first glance this may suggest that such
networks are not relevant to natural language, in which
dependencies may span indefinite distances. However, embeddings
in natural language are not completely independent of earlier
information. The final simulation shows that long distance
sequential contingencies can be encoded by the network even if
only subtle statistical properties of embedded strings depend on
the early information.
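
For readers unfamiliar with the architecture, the Python sketch below
(ours, with untrained random weights rather than the trained networks
of the report) shows the forward pass the abstract describes: the
hidden state from step t-1 is combined with the input element at step t
to produce a prediction for the element at t+1.

import numpy as np

rng = np.random.default_rng(0)
n_symbols, n_hidden = 7, 3            # e.g. the symbols of a small grammar
W_xh = rng.normal(scale=0.5, size=(n_hidden, n_symbols))
W_hh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.5, size=(n_symbols, n_hidden))

def one_hot(i):
    v = np.zeros(n_symbols)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_sequence(symbols):
    """For each position t, return a distribution over the symbol at t+1."""
    h = np.zeros(n_hidden)                          # context units start at zero
    predictions = []
    for s in symbols:
        h = np.tanh(W_xh @ one_hot(s) + W_hh @ h)   # new hidden state
        predictions.append(softmax(W_hy @ h))       # guess at the next symbol
    return predictions

print(predict_sequence([0, 3, 5, 1])[-1])  # distribution over the next symbol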

Send surface mail to :

Department of Computer Science
Carnegie Mellon University
Pittsburgh, PA. 15213-3890
U.S.A

or electronic mail to Ms. Terina Jett:

Jett@CS.CMU.EDU (ARPA net)

Ask for technical report CMU-CS-88-183.

------------------------------


Subject: FINAL CALL FOR PAPERS
From: Julian Dow@UK.AC.GLASGOW.VME
Date: 3 Nov 88 13:47:55
To: CONNECT-BB@UK.AC.ED.EUSIP
Msg ID: < 3 Nov 88 13:47:55 GMT A10120@UK.AC.GLA.VME>

CALL FOR PAPERS: "BIOELECTRONICS AND BIOSENSORS"
SEB MEETING: EDINBURGH, APRIL 3-7th, 1989

The programme for the meeting is being drawn up at present; a
provisional list of speakers and titles is shown below:


Pickard (Cardiff) "Bee brains and biosensors".
Turner (Cranfield) "Biosensor design".
Clark (Glasgow) "Cell patterning by contact guidance on synthetic substrates".
Gross (Texas) "Simultaneous recording from multielectrode arrays".
Edell (MIT) "Long term recording from neuronal implants".
Pine (Caltech) "The silicon/neuron connection".
Birch (Unilever) "Electrochemical sensors based on the capillary fill device".
Kulys (USSR) "Biosensors based on organic metal electrodes".
Pethig (Bangor) "Sensors and switches based on biological materials".
Stanley (Scientific Generics) "Pitfalls and potential of the next
generation of commercial biosensors".
Cullen (Cambridge) "Immunosensors based on the excitation of surface plasmon
polaritons on diffraction gratings".
Thompson (Toronto) "The use of animal cell receptors in biosensors".

A session on "computers in neurobiology" and a "CED Users' Group meeting"
will run concurrently in Edinburgh. If you would like to submit a paper or
poster, or are not already included on the mailing list, please reply using
the form below. Please return the completed form to me at the address given
below by the deadline of 15th November, 1988. Successful applicants will be notified
shortly afterwards, and the final programme for the whole meeting, together
with registration forms and accommodation booking details, will be
circulated in February.

***Topics covered

Developmental biology: contact guidance, studies of cell behaviour on
patterned surfaces of controlled topography.

In vivo Neurobiology: measurement of electrical activity in intact nervous
tissue with microengineered electrode arrays.

In vitro Neurobiology: "real" neural networks of cultured neurons on arrays
of electrodes. Relevance to computer design.

Biomedical applications: use of electrode arrays as implantable sensors and
in prosthetics.

Biosensors: Fundamental problems. Application to biological problems in
research.

Technological problems: Choice of substrate, electrode materials.
Long-term stability in aqueous environments. Multiplexing and signal
treatment. Data reduction.


***Interested?

Please complete and return the attached form if you would like to be
included on the mailing list, or write to: Dr. Julian A.T. Dow,
Department of Cell Biology, University of Glasgow, Glasgow G12 8QQ,
Scotland.

Phone: (041) 330 4616
Telex: 777070 UNIGLA
Fax: (041) 330 4808
(Electronic mail address (JANET) : Julian_Dow@uk.ac.glasgow.vme)
------------------------------------------------
SEB Conference: April 1989


To:
Dr. Julian A.T. Dow,
Department of Cell Biology,
University of Glasgow,
Glasgow G12 8QQ,
Scotland.
Telephone (041) 330 4616

Name:

..............................................
Address:
..............................................
..............................................
..............................................
..............................................



Please include me on your mailing list for further announcements on
the Bioelectronics & Biosensors session of the 1989 SEB Spring
Meeting in Edinburgh, on the 6th and 7th of April.

I am interested in
giving an oral presentation
presenting a poster
attending.
The title of my poster presentation would be:
..........................................................
My main area of interest is
Cell biology
Developmental Biology
Neurobiology
Biosensors
Electronics
Computing
(Please circle)
I am / am not a member of the SEB
(you do not *need* to be a member, incidentally)


Please duplicate this form and pass on to any interested colleagues.
--- End of forwarded message

------------------------------

Subject: Report Available - Connectionist State Machines
From: rba@flash.bellcore.com (Robert B Allen)
Date: Thu, 10 Nov 88 16:52:24 -0500

Connectionist State Machines
Robert B. Allen
Bellcore, November 1988


The performance of sequential adaptive networks on a number of tasks
was explored. For example, the ability to respond to continuous
sequences was demonstrated first with a network trained to flag a
given subsequence and, in a second study, with one trained to generate
responses to transitions conditional upon previous transitions.
Another set of studies demonstrated that the networks are able to
recognize legal strings drawn from simple context-free grammars
and regular expressions. Finally, sequential networks were also
shown to be trainable to generate long strings. In some cases,
adaptive schedules were introduced to gradually extend the
network's processing of strings.
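
As an illustration of the kind of training material such studies often
use, the Python sketch below (our own example, not taken from the
report) generates legal strings by a random walk over a small, made-up
finite-state grammar and produces corrupted strings as negative
examples for a recognizer to reject.

import random

# transitions[state] = list of (symbol emitted, next state); None = accept.
transitions = {
    0: [("B", 1)],
    1: [("T", 2), ("P", 3)],
    2: [("S", 2), ("X", 3)],
    3: [("V", 3), ("E", None)],
}

def sample_string(rng=random):
    state, out = 0, []
    while state is not None:
        symbol, state = rng.choice(transitions[state])
        out.append(symbol)
    return "".join(out)

def corrupt(s, alphabet="BTPSXVE", rng=random):
    """Flip one symbol to make a (most likely) illegal negative example."""
    i = rng.randrange(len(s))
    return s[:i] + rng.choice([c for c in alphabet if c != s[i]]) + s[i + 1:]

positives = [sample_string() for _ in range(5)]
negatives = [corrupt(s) for s in positives]
print(positives)
print(negatives)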


Contact: rba@bellcore.com
Robert B. Allen
2A-367
Bellcore
Morristown, NJ 07960-1910


------------------------------

Subject: BBS Call For Commentators: The Tag Assignment Problem
From: harnad@confidence.Princeton.EDU (Stevan Harnad)
Date: Fri, 11 Nov 88 02:32:57 -0500

Below is the abstract of a forthcoming target article to appear in
Behavioral and Brain Sciences (BBS), an international, interdisciplinary
journal providing Open Peer Commentary on important and controversial
current research in the biobehavioral and cognitive sciences. To be
considered as a commentator or to suggest other appropriate commentators,
please send email to:

harnad@confidence.princeton.edu or write to:
BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771]
____________________________________________________________________
A SOLUTION TO THE TAG-ASSIGNMENT PROBLEM FOR NEURAL NETWORKS

Gary W. Strong
College of Information Studies
Drexel University
Philadelphia, PA 19104 USA

Bruce A. Whitehead
Computer Science Program
University of Tennessee Space Institute
Tullahoma, TN 37388 USA

ABSTRACT: Purely parallel neural networks can model object recognition in
brief displays -- the same conditions under which illusory conjunctions
(the incorrect combination of features into perceived objects in a stimulus
array) have been demonstrated empirically (Treisman & Gelade 1980; Treisman
1986). Correcting errors of illusory conjunction is the "tag-assignment"
problem for a purely parallel processor: the problem of assigning a spatial
tag to nonspatial features, feature combinations and objects. This problem
must be solved to model human object recognition over a longer time scale.
A neurally plausible model has been constructed which simulates both the
parallel processes that may give rise to illusory conjunctions and the
serial processes that may solve the tag-assignment problem in normal
perception. One component of the model extracts pooled features and another
provides attentional tags that can correct illusory conjunctions. Our
approach addresses two questions: (i) How can objects be identified from
simultaneously attended features in a parallel, distributed representation?
(ii) How can the spatial selection requirements of such an attentional
process be met by a separation of pathways between spatial and nonspatial
processing? Analysis of these questions yields a neurally plausible
simulation model of tag assignment, based on synchronization of neural
activity for features within a spatial focus of attention.

KEYWORDS: affordance; attention; connectionist network; eye movements;
illusory conjunction; neural network; object recognition; retinotopic
representations; saccades; spatial localization
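
The toy Python sketch below is our reading of the two stages, not the
authors' simulation; the feature maps and location names are
hypothetical. A parallel read-out pools features and loses their
locations, so illusory conjunctions are possible; a serial attentional
step then tags the features found at one attended location as belonging
to a single object.

import numpy as np

locations = {"left": 0, "right": 1}
feature_maps = {                       # where each feature was detected
    "red":    np.array([1, 0]),
    "square": np.array([1, 0]),
    "green":  np.array([0, 1]),
    "circle": np.array([0, 1]),
}

def pooled_features(maps):
    """Parallel stage: which features are present anywhere (location lost).
    At this stage a 'red circle' conjunction error is possible."""
    return {f for f, m in maps.items() if m.any()}

def tag_assignment(maps, attended):
    """Serial stage: features at the attended location share a tag and are
    bound into one object, correcting illusory conjunctions."""
    idx = locations[attended]
    return {f for f, m in maps.items() if m[idx]}

print(pooled_features(feature_maps))           # all four features, unbound
print(tag_assignment(feature_maps, "left"))    # {'red', 'square'}
print(tag_assignment(feature_maps, "right"))   # {'green', 'circle'}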

------------------------------

End of Neurons Digest
*********************
