AIList Digest Wednesday, 19 Nov 1986 Volume 4 : Issue 260
Today's Topics:
Seminars - A Robust Approach to Plan Recognition (CMU) &
Object-Oriented DBMSs (UPenn) &
The Capacity of Neural Networks (UPenn) &
BoltzCONS: Recursive Objects in a Neural Network (CMU) &
Insight in Human Problem Solving (CMU) &
Analogical and Deductive Reasoning (UCB) &
Planning and Plan Recognition in Office Systems (Rutgers) &
Logic Programming and Circumscription (SU)
----------------------------------------------------------------------
Date: 11 Nov 86 17:51:22 EST
From: Steven.Minton@k.cs.cmu.edu
Subject: Seminar - A Robust Approach to Plan Recognition (CMU)
This week's speaker is Craig Knoblock. Usual time and place, 3:15 in
7220.
Title: A Robust Approach to Plan Recognition
Abstract:
Plan recognition is the process of inferring an agent's plans and goals from
his actions. Most of the previous work on plan recognition has approached
this problem by first hypothesizing a single goal and then attempting to
match the actions with a plan for achieving that goal. Unfortunately, there
are some types of problems where focusing on a single hypothesis will
mislead the system. I will present an architecture for plan recognition
that does not require the system to choose a single goal, but allows several
hypotheses to be considered simultaneously. This architecture uses an
assumption-based truth maintenance system to maintain both the observed
actions and the predictions about the agent's plans and goals.
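The core idea, several goal hypotheses maintained in parallel rather than a
single committed guess, can be sketched in a few lines. The goal library and
action names below are invented for illustration, and a plain Python table
stands in for the assumption-based truth maintenance system described above.

    # Sketch: keep every candidate goal alive, each labeled with the observed
    # actions that support it, instead of committing to one goal up front.
    # The goal library is illustrative only.
    GOAL_LIBRARY = {
        "cook-pasta": ["get-pot", "boil-water", "add-pasta"],
        "make-tea":   ["get-pot", "boil-water", "steep-tea"],
    }

    def recognize(observed_actions):
        """Return every goal hypothesis together with its supporting actions."""
        hypotheses = {}
        for goal, plan in GOAL_LIBRARY.items():
            support = [a for a in observed_actions if a in plan]
            if support:
                hypotheses[goal] = support
        return hypotheses

    # After two observations both goals are still live hypotheses; a third
    # action ("add-pasta" or "steep-tea") would discriminate between them.
    print(recognize(["get-pot", "boil-water"]))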
------------------------------
Date: Thu, 13 Nov 86 00:16 EST
From: Tim Finin <Tim@cis.upenn.edu>
Subject: Seminar - Object-Oriented DBMSs (UPenn)
DBIG Meeting
10:30 Friday November 14th
554 Moore School
University of Pennsylvania
DEVELOPMENT OF AN OBJECT-ORIENTED DBMS
David Maier
Oregon Graduate Center
and
Servio Logic Development Corp
GemStone is an object-oriented database server developed by Servio Logic
that supports a model of objects similar to that of Smalltalk. GemStone
provides complex objects with sharing and identity, specification of
behavioral aspects of objects, and an extensible data model. Those features
came with the choice of Smalltalk as a starting point for the data model and
its programming language, OPAL. However, Smalltalk is a single-user,
memory-based system, and requires significant modifications to provide a
multi-user, disk-based system with support for associative queries and objects
of arbitrary size.
This presentation begins with a summary of the requirements for a database
system to support applications such as CAD, office automation and knowledge
bases. I next introduce the Smalltalk language and its data model, showing
how they satisfy some of the requirements, and indicating which remain to be
satisfied. I will outline the approach Servio took on the remaining
requirements, describing the techniques used for storage management,
concurrency, recovery, name spaces and associative access, as time permits.
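Two of these requirements, object identity independent of state and
associative (query-by-content) access, can be illustrated with a toy sketch.
The names below are invented and bear no relation to OPAL or to GemStone's
actual storage structures.

    # Sketch: objects with identity and sharing, plus a simple associative
    # query over the store.  Illustrative only; not the GemStone data model.
    import itertools

    class ObjectStore:
        _ids = itertools.count(1)

        def __init__(self):
            self.objects = {}              # oid -> attribute dictionary

        def put(self, **attrs):
            oid = next(self._ids)          # identity is independent of state
            self.objects[oid] = attrs
            return oid

        def select(self, predicate):
            """Associative access: retrieve objects by content, not by oid."""
            return [(oid, o) for oid, o in self.objects.items() if predicate(o)]

    store = ObjectStore()
    dept = store.put(name="CAD group")
    store.put(name="alice", dept=dept)     # two objects share one sub-object
    store.put(name="bob",   dept=dept)
    print(store.select(lambda o: o.get("dept") == dept))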
------------------------------
Date: Thu, 13 Nov 86 23:12 EST
From: Tim Finin <Tim@cis.upenn.edu>
Subject: Seminar - The Capacity of Neural Networks (UPenn)
CIS Colloquium
University of Pennsylvania
3pm Tuesday November 18
216 Moore School
THE CAPACITY OF NEURAL NETWORKS
Santosh S. Venkatesh
University of Pennsylvania
Analogies with biological models of brain functioning have led to fruitful
mathematical models of neural networks for information processing. Models of
learning and associative recall based on such networks illustrate how
powerful distributed computational properties emerge as a collective
consequence of the interaction of a large number of simple processing
elements (the neurons). A particularly simple model of a neural network
composed of densely interconnected McCulloch-Pitts neurons is used in
this presentation to illustrate the capabilities of such structures. It is
demonstrated that while these simple constructs form a complete basis for
Boolean functions, the most cost-efficient utilization of these networks
lies in their subversion to a class of problems of high algorithmic
complexity. Specializing to the particular case of associative memory,
efficient algorithms are demonstrated for the storage of memories as stable
entities, or gestalts, and their retrieval from any significant subpart.
Formal estimates of the essential capacities of these schemes are shown. The
ultimate capability of such structures, independent of algorithmic
approaches, is characterized in a rigorous result. Extensions to more
powerful computational neural network structures are indicated.
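The associative-memory setting of the talk can be made concrete with the
generic Hopfield-style construction: memories stored by the outer-product
(Hebb) rule in a network of McCulloch-Pitts threshold units and recalled
from a corrupted subpart. This is a standard textbook sketch, not
necessarily the specific algorithms analyzed in the talk.

    # Sketch: outer-product storage of +/-1 patterns in a fully connected
    # network of threshold units, with recall from a corrupted cue.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 100, 5                              # n units, m stored memories
    memories = rng.choice([-1, 1], size=(m, n))

    W = memories.T @ memories / n              # Hebbian outer-product rule
    np.fill_diagonal(W, 0)                     # no self-connections

    def recall(cue, steps=10):
        s = cue.copy().astype(float)
        for _ in range(steps):                 # repeated threshold updates
            s = np.sign(W @ s)
            s[s == 0] = 1
        return s

    cue = memories[0].copy()
    cue[:30] = rng.choice([-1, 1], size=30)    # corrupt a subpart of memory 0
    print("overlap with stored memory:", int(recall(cue) @ memories[0]), "of", n)

With the number of stored patterns well below the roughly 0.14n capacity
usually quoted for this storage rule, the corrupted cue settles back to the
stored memory.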
------------------------------
Date: 12 November 1986 1257-EST
From: Masaru Tomita@A.CS.CMU.EDU
Subject: Seminar - BoltzCONS: Recursive Objects in a Neural Network
(CMU)
Time: 3:30pm
Place: WeH 5409
Date: 11/18, Tuesday
BoltzCONS: Representing and Transforming Recursive
Objects in a Neural Network
David S. Touretzky, CMU CSD
BoltzCONS is a neural network in which stacks and trees are implemented as
distributed activity patterns. The name reflects the system's mixed
representational levels: it is a Boltzmann Machine in which Lisp cons cell-like
structures appear as an emergent property of a massively parallel distributed
representation. The architecture employs three ideas from connectionist symbol
processing -- coarse coded distributed memories, pullout networks, and variable
binding spaces -- which first appeared together in Touretzky and Hinton's neural
network production system interpreter. The distributed memory is used to store
triples of symbols that encode cons cells, the building blocks of linked lists.
Stacks and trees can then be represented as list structures, and they can be
manipulated via associative retrieval. BoltzCONS' ability to recognize shallow
energy minima as failed retrievals makes it possible to traverse binary trees
of unbounded depth nondestructively without using a control stack. Its two
most significant features as a connectionist model are its ability to represent
structured objects, and its generative capacity, which allows it to create new
symbol structures on the fly.
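The triple encoding itself can be shown with a conventional, non-connectionist
sketch: cons cells stored as (tag, car, cdr) symbol triples in a flat memory
and manipulated purely by content-based retrieval. The coarse-coded
distributed representation and the Boltzmann Machine dynamics are the
substance of the actual model and are not reproduced here.

    # Sketch: list structure as (tag, car, cdr) triples in an associative
    # table.  BoltzCONS stores such triples as coarse-coded activity patterns;
    # this plain table only illustrates the encoding.
    memory = set()

    def cons(tag, car, cdr):
        memory.add((tag, car, cdr))
        return tag                             # the tag names the new cell

    def retrieve(tag):
        """Content-based retrieval of the triple whose first field is tag."""
        return next((t, a, d) for (t, a, d) in memory if t == tag)

    def to_list(tag):
        _, car, cdr = retrieve(tag)
        return [car] + (to_list(cdr) if cdr != "NIL" else [])

    # (John kissed Mary) as a three-cell list
    cons("c3", "Mary",   "NIL")
    cons("c2", "kissed", "c3")
    cons("c1", "John",   "c2")
    print(to_list("c1"))                       # ['John', 'kissed', 'Mary']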
A toy application for BoltzCONS is the transformation of parse trees from
active to passive voice. An attached neural network production system contains
a set of rules for performing the transformation by issuing control signals to
BoltzCONS and exchanging symbols with it. Working together, the two networks
are able to cooperatively transform ``John kissed Mary'' into ``Mary was kissed
by John.''
------------------------------
Date: 14 Nov 86 10:16:55 EST
From: Jeffrey.Bonar@isl1.ri.cmu.edu
Subject: Seminar - Insight in Human Problem Solving (CMU)
An Interdisciplinary Seminar of the Computer Science Department
and the Learning Research and Development Center
UNIVERSITY OF PITTSBURGH
AN INFORMATION PROCESSING ARCHITECTURE
TO EXPLAIN INSIGHT IN HUMAN PROBLEM SOLVING
STELLAN OHLSSON
10:00 AM TO 11:00, FRIDAY, JANUARY 9TH, 1987
LRDC AUDITORIUM, SECOND FLOOR
REFRESHMENTS FOLLOWING
There are currently four models of symbolic computation which are in
frequent use in Cognitive Science work: applicative programming, logic
programming, rule-based programming, and object-oriented (frame-based)
programming. Each of these exhibits some general properties of human
information processing but neglects others. For example, LISP contains a
model of the hierarchical structure of action, which Production Systems do not.
What is needed for the simulation of human cognition is a new architecture
which exhibits all of the properties which we know are characteristic of human
cognition, and which "has" them in a natural way. An attempt at defining such
an architecture will be presented. It has grown within a specific simulation
attempt, namely to understand formally what happens in so-called
"Aha"-experiences, moments of insight during problem solving. A theory has
been constructed which explains such events within the information processing
theory of problem solving as heuristic search. The theory is then implemented
within the architecture described. An example of a run of the system will be
described.
For more information, call Cathy Rupp at 624-3950.
------------------------------
Date: Mon, 17 Nov 86 13:55:23 PST
From: admin%cogsci.Berkeley.EDU@berkeley.edu (Cognitive Science
Program)
Subject: Seminar - Analogical and Deductive Reasoning (UCB)
BERKELEY COGNITIVE SCIENCE PROGRAM
Cognitive Science Seminar - IDS 237A
Tuesday, November 25, 11:00 - 12:30
2515 Tolman Hall
Discussion: 12:30 - 1:30
2515 Tolman Hall
``Analogical and Deductive Reasoning''
Stuart Russell
Computer Science
UC Berkeley
The first problem I will discuss is that of analogical reasoning, the
inference of further similarities from known similarities. Analogy has
been widely advertised as a method for applying past experience in new
situations, but the traditional approach based on similarity metrics has
proved difficult to operationalize. The reason for this seems to be that
it neglects the importance of relevance between known and inferred
similarities. The need for a logical semantics for relevance motivates
the definition of determinations, first-order expressions capturing the
idea of relevance between generalized properties. Determinations are
shown to justify analogical inferences and single-instance
generalizations, and to express an apparently common form of knowledge
hitherto neglected in knowledge-based systems. Essentially, the ability
to acquire and use determinations increases the set of inferences a
system can make from given data. When specific determinations are
unavailable, a simple statistical argument can relate similarity to the
probability that an analogical solution is correct, in a manner closely
connected to Shepard's stimulus generalization results.

The second problem, suggested by and subsuming the first, is to identify
the ways in which existing knowledge can be used to help a system to
learn from experience. I describe a simple method for enumerating the
types of knowledge (of which determinations are but one) that contribute
to learning, so that the monolithic notion of confirmation can be teased
apart. The results find strong echoes in Goodman's work on induction.
The application of a logical, knowledge-based approach to the problems
of analogy and induction indicates the need for a system to be able to
detect as many forms of regularity as possible in order to maximize its
inferential capability. The possibility that important aspects of common
sense are captured by complex, abstract regularities suggests further
empirical research to identify this knowledge.
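The way a determination licenses a single-instance analogical inference can
be sketched with the stock nationality-determines-language example; the
encoding below is invented and is only meant to make the inference pattern
concrete.

    # Sketch: a determination "P determines Q" licenses inferring Q(target)
    # from a single source instance that shares the target's P-value.
    facts = {
        ("nationality", "Pierre"):  "France",
        ("language",    "Pierre"):  "French",
        ("nationality", "Jacques"): "France",
        # the language of Jacques is unknown
    }

    determinations = [("nationality", "language")]    # P determines Q

    def analogize(target, q):
        for p, q2 in determinations:
            if q2 != q:
                continue
            p_val = facts.get((p, target))
            for (pred, source), val in facts.items():
                if pred == p and val == p_val and source != target:
                    answer = facts.get((q, source))
                    if answer is not None:
                        return answer          # justified by the determination
        return None

    print(analogize("Jacques", "language"))    # -> 'French'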
------------------------------
Date: 17 Nov 86 12:59:16 EST
From: BORGIDA@RED.RUTGERS.EDU
Subject: Seminar - Planning and Plan Recognition in Office Systems
(Rutgers)
Computer Science Department Colloquium
Date: Thursday November 20
Speaker: Professor Bruce Croft
Title: Planning and Plan Recognition in Office Systems
Affiliation: Department of Computer and Information Science,
University of Massachusetts, Amherst
Time: 10:00 a.m. [NOTE UNUSUAL TIME!!!]
Place: Hill 705
Note: Refreshments will be served at 9:50 a.m.
The office environment provides an ideal testbed for systems
that attempt to represent and support complex, semi-structured
and cooperative activities. It is typical to find a variety of
constraints at different levels of abstraction on activities,
objects manipulated by activities, and people that carry out the
activities. In this talk, we will discuss the use of planning and
plan recognition techniques to support an intelligent interface
for an office system. In particular, we emphasise the use of
object-based models, and the relationship between planning and
plan execution. The types of exceptions that can occur with
underconstrained plans will be described and some suggestions
made about techniques for handling them.
------------------------------
Date: 17 Nov 86 1037 PST
From: Vladimir Lifschitz <VAL@SAIL.STANFORD.EDU>
Subject: Seminar - Logic Programming and Circumscription (SU)
Commonsense and Non-Monotonic Reasoning Seminar
LOGIC PROGRAMMING AND CIRCUMSCRIPTION
Vladimir Lifschitz
Thursday, November 20, 4pm
MJH 252
The talk will be based on my paper "On the declarative semantics of
logic programs with negation". A few copies of the paper are available
in my office, MJH 362.
ABSTRACT. A logic program can be viewed as a predicate formula, and its
declarative meaning can be defined by specifying a certain Herbrand
model of that formula. For programs without negation, this model is
defined either as the Herbrand model with the minimal set of positive
ground atoms, or, equivalently, as the minimal fixed point of a certain
operator associated with the formula (Van Emden and Kowalski). These
solutions do not apply to general logic programs, because a program
with negation may have many minimal Herbrand models, and the corresponding
operator may have many minimal fixed points. Apt, Blair and Walker and,
independently, Van Gelder, introduced a class of general logic programs
which disallow certain combinations of recursion and negation, and showed
how to use the fixed point approach to define a declarative semantics for
such programs. Using the concept of circumscription, we extend the minimal
model approach to stratified programs and show that it leads to the same
semantics.
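For the negation-free case, the least-fixed-point construction mentioned
above can be written down directly for ground programs, and a stratified
program is then evaluated stratum by stratum so that negation only refers to
strata whose atoms are already settled. The ground program below is an
invented example, not one taken from the paper.

    # Sketch: minimal model of a ground, stratified logic program computed as
    # an iterated least fixed point of the immediate-consequence operator.
    # Each rule is (head, positive_body_atoms, negated_body_atoms).
    strata = [
        [("bird(tweety)", [], []),
         ("penguin(opus)", [], []),
         ("bird(opus)", ["penguin(opus)"], [])],
        # the second stratum may negate first-stratum predicates only
        [("flies(tweety)", ["bird(tweety)"], ["penguin(tweety)"]),
         ("flies(opus)",   ["bird(opus)"],   ["penguin(opus)"])],
    ]

    model = set()
    for stratum in strata:
        changed = True
        while changed:                    # least fixed point for this stratum
            changed = False
            for head, pos, neg in stratum:
                if (head not in model
                        and all(a in model for a in pos)
                        and all(a not in model for a in neg)):
                    model.add(head)
                    changed = True

    print(sorted(model))
    # ['bird(opus)', 'bird(tweety)', 'flies(tweety)', 'penguin(opus)']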
------------------------------
End of AIList Digest
********************