AIList Digest Wednesday, 30 Nov 1988 Volume 8 : Issue 133
Seminars:
Why AI may need Connectionism - Lokendra Shastri
On the Semantics of Default Logic - Dr. Wiktor Marek
Complexity and Decidability of Terminological Logics - Peter F. Patel-Schneider
Parallel Computation of Motion - Heinrich Bulthoff
Planning and Plan Execution - Mark Drummond
Computational Complexity in Medical Diagnostic Systems - Gregory Cooper
Computation and Inference Under Scarce Resources - Eric Horvitz
Grammatical Categories and Shapes of Events - Alexander Nakhimovsky
----------------------------------------------------------------------
Date: Thu, 10 Nov 88 16:26:34 EST
From: dlm@allegra.att.com
Subject: Why AI may need Connectionism - Lokendra Shastri
Why AI may need Connectionism? A Representation and Reasoning Perspective
Lokendra Shastri
Computer and Information Science Department
University of Pennsylvania
Tuesday, November 15 at 10:00 am.
AT&T Bell Laboratories - Murray Hill Room 3D436
Any generalized notion of inference is intractable, yet we are capable
of drawing a variety of inferences with remarkable efficiency.
These inferences are by no means trivial and support a broad range
of cognitive activity such as classifying and recognizing objects,
understanding spoken and written language, and performing commonsense
reasoning. Any serious attempt at understanding intelligence must
provide a detailed computational account of how such inferences may be
drawn with requisite efficiency. In this talk we describe some work
within the connectionist framework that attempts to offer such an account.
We focus on two connectionist knowledge representation and reasoning systems:
1) A connectionist system that represents knowledge in terms of
multi-place relations (n-ary predicates), and draws
a limited class of inferences based on this knowledge with extreme
efficiency. The time taken by the system to draw conclusions is
proportional to the @i(length) of the proof, and hence, optimal.
The system incorporates a solution to the "variable binding" problem
and uses the temporal dimension to establish and maintain bindings.
2) A connectionist semantic memory that computes optimal solutions
to an interesting class of @i(inheritance) and @i(recognition) problems
extremely fast - in time proportional to the @i(depth) of the conceptual
hierarchy. In addition to being efficient, the connectionist realization
is based on an evidential formulation and provides a principled
treatment of @i(exceptions), @i(conflicting multiple inheritance),
as well as the @i(best-match) or @i(partial-match) computation.
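The inheritance behavior in (2) can be illustrated outside the connectionist setting. The sketch below (plain Python, with a hypothetical Tweety-style taxonomy; the talk's system is a parallel evidential network, not this serial walk) resolves exceptions by letting the most specific class win, so lookup cost grows with the depth of the hierarchy:

```python
# Toy IS-A hierarchy: child -> parent, plus properties asserted per class.
# An exception (penguins do not fly) overrides the inherited default
# because the walk starts at the most specific class. Cost is O(depth),
# echoing the "time proportional to depth" claim above.
PARENT = {"penguin": "bird", "canary": "bird", "bird": "animal"}
PROPS = {
    "bird": {"flies": True},
    "penguin": {"flies": False},   # exception to the bird default
    "animal": {"breathes": True},
}

def lookup(cls, prop):
    """Walk up the IS-A chain; the first (most specific) value wins."""
    while cls is not None:
        if prop in PROPS.get(cls, {}):
            return PROPS[cls][prop]
        cls = PARENT.get(cls)
    return None
```

Here lookup("penguin", "flies") yields False while lookup("canary", "flies") yields True, the standard exception pattern; the connectionist version computes this in parallel across the whole hierarchy.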
We conclude that working within the connectionist framework is well
motivated as it helps in identifying interesting classes of
limited inference that can be performed with extreme efficiency,
and aids in discovering constraints that must be placed on the
conceptual structure in order to achieve extreme efficiency.
Sponsor: Mark Jones - jones@allegra.att.com
------------------------------
Date: Tue, 15 Nov 88 18:03:12 EST
From: dlm@allegra.att.com
Subject: On the Semantics of Default Logic - Dr. Wiktor Marek
Dr. Wiktor Marek
Department of Computer Science
University of Kentucky
Monday, 11/21, 1:30 PM
AT&T Bell Laboratories - Murray Hill 3D-473
On the Semantics of Default Logic
At least two different types of structures have been
proposed as correct semantics for logic of defaults:
Minimal sets of formulas closed under defaults and fixed
points of Reiter's operator GAMMA. In some cases these
notions coincide, but generally this is not the case. In
addition Konolige identified Reiter's fixed points (so
called extensions) with a certain class of autoepistemic
expansions of Moore.
We identify another class of structures for defaults,
called weak expansions, and show a one-to-one correspondence
between Moore's autoepistemic expansions and weak expansions.
This functor extends Konolige's correspondence. We
show that the expressive power of autoepistemic logic (with
expansions as the intended structures) is precisely the same as
that of default logic with weak expansions.
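For the finite propositional case with literal-valued defaults, Reiter's fixed-point definition can be checked by brute force. The sketch below (plain Python; the Nixon-diamond example and all names are illustrative, not from the talk) enumerates candidate sets and keeps exactly those E with GAMMA(E) = E:

```python
from itertools import combinations

def neg(lit):
    """Negation of a literal, written with a leading '-'."""
    return lit[1:] if lit.startswith("-") else "-" + lit

def gamma(W, defaults, E):
    """Reiter's operator GAMMA: least set containing W and closed under
    every default whose justification is consistent with E."""
    S = set(W)
    changed = True
    while changed:
        changed = False
        for pre, just, concl in defaults:
            if pre in S and neg(just) not in E and concl not in S:
                S.add(concl)
                changed = True
    return S

def extensions(W, defaults):
    """Candidate E = W plus the conclusions of some subset of defaults;
    E is an extension iff it is a fixed point of gamma."""
    found = []
    for r in range(len(defaults) + 1):
        for subset in combinations(defaults, r):
            E = set(W) | {c for _, _, c in subset}
            if gamma(W, defaults, E) == E and E not in found:
                found.append(E)
    return found

# Nixon diamond: quakers are by default pacifists, republicans by
# default non-pacifists. Each default is (prerequisite, justification,
# conclusion). The theory has exactly two extensions.
W = {"quaker", "republican"}
D = [("quaker", "pacifist", "pacifist"),
     ("republican", "-pacifist", "-pacifist")]
```

The minimal-closed-set semantics mentioned in the abstract would accept different structures here; the fixed-point test is what rules out the set containing both conflicting conclusions.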
Sponsor: D. Etherington - ether@allegra.att.com
------------------------------
Date: Mon 21 Nov 88 17:40:09-EST
From: Marc Vilain <MVILAIN@G.BBN.COM>
Subject: Complexity and Decidability of Terminological Logics - Peter
F. Patel-Schneider
BBN Science Development Program
AI Seminar Series Lecture
COMPLEXITY AND DECIDABILITY OF TERMINOLOGICAL LOGICS
Peter F. Patel-Schneider
AI Principles Research Department
AT&T Bell Laboratories
(pfps@allegra.att.com)
BBN Labs
10 Moulton Street
2nd floor large conference room
10:30 am, Tuesday November 29
Terminological Logics are important formalisms for representing
knowledge about concepts and objects, and are attractive for use in
Knowledge Representation systems. However, Terminological Logics with
reasonable expressive power have poor computational properties, a fact
which has restricted their use and utility in Knowledge Representation
systems. This talk gives a brief description of Terminological Logics,
presents some results concerning their tractability and decidability,
and discusses the role of Terminological Logics in Knowledge
Representation systems.
------------------------------
Date: Tue, 22 Nov 88 17:55:47 EST
From: kgk@CS.BROWN.EDU
Subject: Parallel Computation of Motion - Heinrich Bulthoff
Parallel Computation of Motion: computation, psychophysics and physiology
Heinrich H. Bülthoff
Brown University
Department of Cognitive and Linguistic Sciences
Wednesday, November 30, 12PM.
Lubrano Conference Room
4th Floor, Center for Information Technology
Brown University
The measurement of the 2D field of velocities -- which is the
projection of the true 3D velocity field -- from time-varying
2-dimensional images is in general impossible. It is, however,
possible to compute suitable ``optical flows'' that are
qualitatively similar to the velocity field in most cases. We
describe a simple, parallel algorithm that computes successfully an
optical flow from sequences of real images, is consistent with human
psychophysics and suggests plausible physiological models. In
particular, our algorithm runs on a Connection Machine supercomputer
in close to real time. It shows several of the same ``illusions''
that humans perceive. A natural physiological implementation of the
model is consistent with data from cortical areas V1 and MT.
Regularizing optical flow computation leads to a formulation which
minimizes matching error and, at the same time, maximizes smoothness
of the optical flow. We develop an approximation to the full
regularization computation in which corresponding points are found
by comparing local patches of the images. Selection between
competing matches is performed using a winner-take-all scheme. The
algorithm accommodates many different image transformations
uniformly, with similar results, from brightness to edges. The
algorithm is easily implemented using local operations on a
fine-grained computer (Connection Machine) and experiments with
natural images show that the scheme is effective and robust against
noise. This work was done at the Artificial Intelligence Laboratory
at MIT in collaboration with Tomaso Poggio and James J. Little.
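The patch-matching step of the approach can be sketched serially (the talk's algorithm runs in parallel on a Connection Machine; this stdlib-Python version, with made-up image data, only illustrates winner-take-all selection over candidate displacements):

```python
def sad(img1, img2, y, x, dy, dx, r):
    """Sum of absolute differences between the (2r+1)^2 patch of img1
    centered at (y, x) and the patch of img2 displaced by (dy, dx)."""
    cost = 0
    for i in range(-r, r + 1):
        for j in range(-r, r + 1):
            cost += abs(img1[y + i][x + j] - img2[y + dy + i][x + dx + j])
    return cost

def flow_at(img1, img2, y, x, r=1, search=3):
    """Winner-take-all: the displacement with minimal matching cost wins."""
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # skip candidates whose displaced patch leaves the image
            if not (r <= y + dy < len(img2) - r and
                    r <= x + dx < len(img2[0]) - r):
                continue
            c = sad(img1, img2, y, x, dy, dx, r)
            if c < best_cost:
                best, best_cost = (dy, dx), c
    return best

# Synthetic frames: a bright 3x3 block moves down 1 and right 2.
frame1 = [[0] * 12 for _ in range(12)]
frame2 = [[0] * 12 for _ in range(12)]
for i in range(3, 6):
    for j in range(3, 6):
        frame1[i][j] = 9
        frame2[i + 1][j + 2] = 9
```

Querying flow_at(frame1, frame2, 4, 4) at the block's center recovers the displacement (1, 2). Because every candidate displacement is scored independently, the inner loops map naturally onto a fine-grained parallel machine, which is the point of the abstract's Connection Machine implementation.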
------------------------------
Date: 28 Nov 88 02:55:26 GMT
From: wucs1!loui@uunet.uu.net (Ron Loui)
Subject: Planning and Plan Execution - Mark Drummond
COMPUTER SCIENCE COLLOQUIUM
Washington University
St. Louis
2 December 1988
Planning and Plan Execution
Mark Drummond
NASA Ames
We are given a table on which to place three blocks (A, B, and C). We begin
in a state where all the blocks are available for placement; there is also an
unspecified means of transporting each block to its target location on the
table. We might imagine that there are an unlimited number of
interaction-free robot arms, or that each block may be levitated into place
once it is available. The exact means for moving the blocks does not matter:
given that a block is available it may be placed. The only constraint is that
B cannot be placed last. We call this the "B-not-last" problem.
We must produce a plan which is as flexible as possible. If a block can be
placed then our plan must so instruct the agent. If a block cannot be placed
according to the constraints then our plan must prevent the agent from
attempting to place the block. The agent must never lock up in a state from
which no progress is possible. This would happen, for instance, if A were on
the table, and C arrived and was placed. B could then not be placed last.
It takes four totally ordered plans or three partially ordered plans to deal
with the B-not-last problem. In either representation there is no one plan
that can be given to the assembly agent which does not overly commit to a
specific assembly strategy. Disjunction is not the only problem. Actions
will often fail to live up to the planner's expectations. An approach based
on relevancy analysis is needed, where actions are given in terms of the
conditions under which their performance is appropriate. The problem is even
harder when there can be parallel actions.
Our approach uses a modified Condition/Event system (Drummond, 1986a,b) as a
causal theory of the application domain. The C/E system is amenable to direct
execution by an agent, and can be viewed as a nondeterministic control
program. For every choice point in the projection, we synthesize a "situated
control rule" that characterizes the conditions under which action execution
is appropriate. This can be viewed as a generalization of STRIPS' algorithm
for building triangle tables from plan sequences (Nilsson, 1984).
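The B-not-last example can be made concrete by brute force: treat each set of placed blocks as a state and allow an action only if some completion with B not last remains. The sketch below (plain Python; the actual synthesis in the talk works over a Condition/Event projection, not this lookahead search) computes per-state rules in the spirit of situated control rules:

```python
BLOCKS = frozenset("ABC")

def safe_move(placed, block):
    """A placement is safe iff the run can still finish with B not last."""
    new = placed | {block}
    if new == BLOCKS:
        return block != "B"          # the final placement must not be B
    return any(safe_move(new, b) for b in BLOCKS - new)

def control_rule(placed):
    """Rule for one state: the set of placements the agent may attempt."""
    return {b for b in BLOCKS - placed if safe_move(placed, b)}
```

For instance, control_rule(frozenset("A")) is {"B"}: once A is on the table, placing C would force B to be last, so the rule forbids it, which is exactly the lock-up the abstract describes. The rules are conditioned on the state rather than on a position in a fixed sequence, so no disjunction of linear plans is needed.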
________________________________________________________________________________
5 December 1988
Coping with Computational Complexity in Medical Diagnostic Systems
Gregory Cooper
Stanford University/Knowledge Systems Laboratory
Probabilistic networks will be introduced as a representation for medical
diagnostic knowledge. The computational complexity of using general
probabilistic networks for diagnosis will be shown to be NP-hard. Diagnosis
using several important subclasses of these networks will be shown to be
NP-hard as well. We then will focus on some of the approximation methods
under development for performing diagnostic inference. In particular, we will
discuss algorithms being developed for performing diagnostic inference using a
probabilistic version of the INTERNIST/QMR knowledge base.
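The exponential cost the abstract refers to is visible even in a toy model: exact posterior computation enumerates every disease hypothesis. Below is a two-disease noisy-OR fragment in the general style of QMR-like models (all priors and link strengths are invented for illustration, and the real knowledge base has hundreds of diseases, which is precisely why this enumeration is infeasible there):

```python
from itertools import product

# Hypothetical diseases: prior probability and noisy-OR link strength
# (probability the disease alone produces the finding).
PRIOR = {"d1": 0.10, "d2": 0.05}
LINK = {"d1": 0.80, "d2": 0.60}
LEAK = 0.01  # probability the finding appears with no disease present

def posterior(finding_present=True):
    """Exact marginal posteriors by enumerating all 2^n disease states --
    this loop is the source of intractability in the general case."""
    diseases = sorted(PRIOR)
    total = 0.0
    mass = {d: 0.0 for d in diseases}
    for state in product([0, 1], repeat=len(diseases)):
        prior = 1.0
        p_absent = 1.0 - LEAK          # noisy-OR: finding absent iff
        for d, on in zip(diseases, state):   # every cause fails
            prior *= PRIOR[d] if on else 1.0 - PRIOR[d]
            if on:
                p_absent *= 1.0 - LINK[d]
        p_f = 1.0 - p_absent if finding_present else p_absent
        joint = prior * p_f
        total += joint
        for d, on in zip(diseases, state):
            if on:
                mass[d] += joint
    return {d: mass[d] / total for d in diseases}
```

With these numbers, observing the finding raises P(d1) from 0.10 to about 0.69; approximation methods of the kind the talk discusses aim to get such posteriors without the exhaustive loop.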
--------------------------------------------------------------------------------
Computation and Inference Under Scarce Resources
Eric Horvitz
Stanford University
Knowledge Systems Laboratory
I will describe research on Protos, a project focused on reasoning and
representation under resource constraints. The work has centered on building
a model of computational rationality through the development of flexible
approximation methods and the application of reflective decision-theoretic
control of reasoning. The techniques developed can be important for providing
effective computation in high-stakes and complex domains such as medical
decision making. First, work will be described on the decision-theoretic
control of problem solving for solving classical computational tasks under
varying, uncertain, and scarce resources. After, I will focus on
decision-theoretic reasoning under resource constraints. I will present work
on the characterization of partial results generated by alternative
approximation methods. The expected value of computation will be introduced
and applied to the selection and control of probabilistic inference. Plans
for extending the work to inference in a large internal-medicine knowledge
base will be described. Finally, I extend the techniques beyond the tradeoff
between computation time and quality of computational results to explore
issues surrounding complex reasoning under cognitive constraints.
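One very reduced instance of the expected-value-of-computation idea is a myopic stopping rule for an anytime procedure: keep computing while the expected improvement from one more step exceeds that step's cost. (The quality curve and cost below are invented for illustration; the framework described in the talk is decision-theoretic and far more general.)

```python
def quality(t):
    """Hypothetical anytime solution quality after t steps:
    diminishing returns, approaching 1."""
    return 1.0 - 0.5 ** t

def steps_to_take(cost_per_step, horizon=50):
    """Myopic stopping rule: continue while the marginal gain in
    quality from the next step exceeds the cost of that step."""
    t = 0
    while t < horizon and quality(t + 1) - quality(t) > cost_per_step:
        t += 1
    return t

# With cost 0.1 per step the marginal gains 0.5, 0.25, 0.125 justify
# three steps; the fourth would gain only 0.0625 and is declined.
```

As resources become scarcer (cost_per_step rises), the rule commits to fewer refinements, trading result quality for time in the way the abstract describes.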
________________________________________________________________________________
------------------------------
Date: Tue, 29 Nov 88 14:24:11 EST
From: rapaport@cs.Buffalo.EDU (William J. Rapaport)
Subject: Grammatical Categories and Shapes of Events - Alexander
Nakhimovsky
UNIVERSITY AT BUFFALO
STATE UNIVERSITY OF NEW YORK
GRADUATE GROUP IN COGNITIVE SCIENCE
PRESENTS
ALEXANDER NAKHIMOVSKY
Department of Computer Science
Colgate University
GRAMMATICAL CATEGORIES AND SHAPES OF EVENTS
Tuesday, December 13, 1988
4:30 P.M.
280 Park Hall, Amherst Campus
This talk traces recurrent patterns in two linguistic and two ontological
domains: (1) grammatical categories of the noun, (2) grammatical
categories of the verb, (3) shapes of visually perceived objects, and
(4) aspectual classes of events. Correspondences between noun
categories and visual properties of objects are shown by comparing the
semantics of noun classifiers in classifier languages with some
computational objects and processes of early and late vision.
Among grammatical categories of the verb, only those having to do with
aspect are discussed, and three kinds of phenomena identified: the
perfective-imperfective distinction, corresponding to the presence vs.
absence of a contour, at a given scale, in the object domain (and thus
to the count-mass distinction in the noun-categories domain); the
aspectual types of verb meanings (a.k.a. Aktionsarten); and coercion, or
nesting, of aspectual types. Unlike previous treatments, a distinction
is drawn between aspectual coercion within the word (i.e., in word
formation and inflection) and aspectual coercion above the word level, by
verb arguments and adverbial modifiers. This makes it possible to
define the notion of an aspectual classifier and (on analogy with
noun-classifier languages) the notion of an aspectual language. Several
properties of aspectual languages are identified, and a comparison is
made between the ways aspectual distinctions are expressed in aspectual
languages (e.g., Slavic languages), predominantly nominal languages
(e.g., Finnish, Hungarian), and a weakly typed language like English.
The similarities between the object-noun domains and the event-verb
domains point to a need for topological (rather than logical)
representations for aspectual classes, representations that could
support the notions of connectedness, boundary, and continuous function.
One such representation is presented and shown to explain several facts
about aspectual classes. Tentative proposals are made toward defining
the notion of an ``aspectually possible word''. In conclusion, I discuss
the implications of the presented material for the problem of
naturalistic explanation in linguistics and the modularity hypothesis.
There will be an evening discussion at Stuart Shapiro's house,
112 Park Ledge Drive, Snyder, at 8:15 P.M.
Contact Bill Rapaport, Dept. of Computer Science, 673-3193, for further details.
------------------------------
End of AIList Digest
********************