NL-KR Digest             (5/16/88 17:47:00)            Volume 4 Number 49 

Today's Topics:
Seminar - AURORA -- An Or-Parallel Prolog System (Unisys)
Seminar - Nonmonotonic Parallel Inheritance Networks (AT&T)
Seminar - Collative Semantics (AT&T)
BBN AI Seminar -- Steven Minton
Language & Cognition Seminar
Harvard AI seminar
Lang. & Cognition Seminar
From CSLI Calendar, May 5, 3:27
CSLI Reports
From CSLI Calendar, May 12, 3:28
Conference - AAAI Workshop on Plan Recognition

Submissions: NL-KR@CS.ROCHESTER.EDU
Requests, policy: NL-KR-REQUEST@CS.ROCHESTER.EDU
----------------------------------------------------------------------

Date: Sat, 23 Apr 88 12:40 EDT
From: Tim Finin <antares!finin@burdvax.prc.unisys.com>
Subject: Seminar - AURORA -- An Or-Parallel Prolog System (Unisys)

AI SEMINAR
UNISYS PAOLI RESEARCH CENTER

AURORA - An Or-parallel Prolog System

Andrzej Ciepelewski
Swedish Institute of Computer Science (SICS)


A parallel Prolog system has been constructed in a cooperative effort
among Argonne National Lab, the University of Manchester, and SICS. The
system is based on a state-of-the-art sequential Prolog. It runs
on multiprocessors with shared memory and is expected to perform
better, on e.g. a Sequent Symmetry, than the commercial Prolog
systems available today. The system executes "ordinary" Prolog
programs with cuts and side effects, keeping the semantics of
sequential execution. Programs written in Prolog extended with
parallel primitives like "cavalier" commit and unordered side effects
can also be executed. The system has been designed for portability and
modifiability: its main parts, the engine and the scheduler, are
cleanly interfaced, and two quite different schedulers have already
been tried. Some preliminary performance data has already been
collected, running mostly small search and parsing problems. The
largest programs run so far have been the parallelised SICStus Prolog
compiler and Chat-80. The figures from a Sequent Balance 8000 show about
20% parallel overhead in the one-processor case and close to linear
speed-ups. We are waiting with excitement for figures from the Sequent
Symmetry, to which the system has recently been ported. In my talk I will
mainly discuss implementation decisions and performance figures.
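
For readers unfamiliar with or-parallelism, the toy sketch below (in
Python, and in no way a description of the AURORA engine or its
schedulers) illustrates the basic idea: the alternatives at a choice
point are farmed out to separate workers, each branch is explored
independently, and the answers from all branches are collected, so the
set of solutions matches what sequential Prolog would find.

    # Toy illustration of or-parallel search (not the AURORA design):
    # each alternative at the top-level choice point is explored by a
    # separate worker, and all solutions are gathered afterwards.
    from concurrent.futures import ThreadPoolExecutor

    COLOURS = ["red", "green", "blue"]   # alternatives at the choice point
    SIZES = ["small", "large"]           # a conjunctive subgoal inside each branch

    def solve_branch(colour):
        # One worker explores one alternative binding sequentially.
        return [(colour, size) for size in SIZES]

    def or_parallel_solve():
        # One task per alternative of the choice point.
        with ThreadPoolExecutor() as pool:
            branches = list(pool.map(solve_branch, COLOURS))
        return [solution for branch in branches for solution in branch]

    print(or_parallel_solve())   # all six (colour, size) answers, as sequential Prolog would give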


2:00 pm Tuesday, April 26
Paoli Auditorium
Unisys Paoli Research Center
Route 252 and Central Ave.
Paoli PA 19311

-- non-Unisys visitors who are interested in attending should --
-- send email to finin@prc.unisys.com or call 215-648-7446 --
Tim Finin finin@prc.unisys.com
Paoli Research Center ..!{psuvax1,sdcrdcf,cbmvax,bpa}!burdvax!finin
Unisys Corporation 215-648-7446 (o)
PO Box 517, Paoli PA 19301 215-386-1749 (h)

------------------------------

Date: Mon, 25 Apr 88 12:30 EDT
From: dlm%research.att.com@RELAY.CS.NET
Subject: Seminar - Nonmonotonic Parallel Inheritance Networks (AT&T)

Speaker: Chiaki Sakama
ICOT, Japan.

Time: 10:30, May 2nd, 1988
Room: AT&T Bell Laboratories- Murray Hill 3D-436

Title: Nonmonotonic Parallel Inheritance Networks

This paper discusses a formalization of nonmonotonic inheritance
reasoning in semantic networks using Reiter's default theory. It enables us to
define inheritance rules apart from the data in a network, and improves the
readability and maintainability of a network compared with other approaches.
We also present a parallel inheritance algorithm based on this method,
which generates a set of properties for an input class. This algorithm is
easily realized in the parallel logic programming language GHC (Guarded Horn
Clauses), which was developed as the kernel language of the Fifth Generation
project at ICOT.
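
As background, the small sketch below (in Python, and only a sequential
caricature, not Sakama's default-theory formalization or the GHC
algorithm) shows the kind of behaviour nonmonotonic inheritance is
meant to capture: a default stated at a more specific class overrides
the one inherited from a more general class.

    # Toy nonmonotonic inheritance: "penguins don't fly" blocks the
    # bird-level default "birds fly" when collecting Tweety's properties.
    ISA = {"tweety": "penguin", "penguin": "bird", "bird": "animal"}
    DEFAULTS = {
        "bird":    {"flies": True, "has_wings": True},
        "penguin": {"flies": False},        # exception to the bird default
        "animal":  {"breathes": True},
    }

    def inherited_properties(cls):
        """Walk up the is-a chain; more specific classes win conflicts."""
        props = {}
        node = cls
        while node is not None:
            for key, value in DEFAULTS.get(node, {}).items():
                props.setdefault(key, value)   # keep the more specific value if already set
            node = ISA.get(node)
        return props

    print(inherited_properties("tweety"))
    # {'flies': False, 'has_wings': True, 'breathes': True}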


Sponsor: David Etherington
ether@research.att.com

------------------------------

Date: Mon, 25 Apr 88 12:30 EDT
From: dlm%research.att.com@RELAY.CS.NET
Subject: Seminar - Collative Semantics (AT&T)

Speaker: Dan Fass
Computing Research Laboratory, New Mexico State University

Title: Collative Semantics: A Semantics for Natural Language Processing

Date: Thursday, April 29, 10:30
Place: AT&T Bell Laboratories- Murray Hill 3D-560

Abstract:

The main semantic phenomena Collative Semantics (CS) addresses
are lexical ambiguity and what are referred to as "semantic
relations". Seven kinds of semantic relation investigated are
literal, metonymic, metaphorical, anomalous, novel, inconsistent,
and redundant relations. The talk will discuss CS and the three
main ideas behind it:

(1) a linguistic view of knowledge representation which shows how
the semantic primitives of a knowledge representation can
function like senses of words from natural language;

(2) a distinction made between knowledge and coherence; and

(3) a large-scale framework of four constructs, two
representations and two processes, in which coherence plays a
major organising role.

CS has four components which are instances of the four
constructs: sense-frames and semantic vectors are the two
representations, and collation and screening are the two
processes. Sense-frames represent lexical ambiguity: they embody
the linguistic view of knowledge representation. Collation is a
process which discriminates semantic relations: it matches the
sense-frames of two word senses and distinguishes semantic
relations between the word senses as a complex system of mappings
between their sense-frames. Semantic vectors represent semantic
relations: they are a "coherence representation" which records
the systems of mappings produced by collation. Screening is a
process which resolves lexical ambiguity: it selects between
pairs of semantic vectors. CS has been implemented in a sentence
analysis program called Meta5. Examples will be given of how
metaphors and metonymies are distinguished by Meta5.
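
To give a feel for the kind of discrimination involved, here is a
deliberately tiny caricature in Python of classifying a verb-subject
pairing by matching structured sense descriptions. It uses the familiar
"the car drank gasoline" style of example from the literature; the
sense-frames, collation, and screening machinery of CS and Meta5 are
far richer than this sketch.

    # Toy discrimination of semantic relations by matching simple
    # sense descriptions (illustrative only; not the CS/Meta5 machinery).
    SENSE_FRAMES = {
        "drink": {"agent": "animal", "object": "liquid"},
        "car":   {"category": "machine", "consumes": "liquid"},
        "man":   {"category": "animal"},
    }

    def relation(verb, subject):
        """Classify the verb--subject pairing as literal, metaphorical, or anomalous."""
        expected = SENSE_FRAMES[verb]["agent"]
        actual = SENSE_FRAMES[subject]["category"]
        if actual == expected:
            return "literal"          # "the man drank ..."
        if SENSE_FRAMES[subject].get("consumes") == SENSE_FRAMES[verb]["object"]:
            return "metaphorical"     # "the car drank gasoline"
        return "anomalous"

    print(relation("drink", "man"), relation("drink", "car"))   # literal metaphorical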

Sponsors: Bruce Ballard (bwb@research.att.com)
Ron Brachman (rjb@research.att.com)

------------------------------

Date: Wed, 27 Apr 88 14:23 EDT
From: Marc Vilain <MVILAIN@G.BBN.COM>
Subject: BBN AI Seminar -- Steven Minton

BBN Science Development Program
AI Seminar Series Lecture

LEARNING EFFECTIVE SEARCH CONTROL KNOWLEDGE:
AN EXPLANATION-BASED APPROACH

Steven Minton
Carnegie-Mellon University
(Steven.Minton@cad.cs.cmu.edu)

BBN Labs
10 Moulton Street
2nd floor large conference room
10:30 am, Tuesday May 3


In order to solve problems more effectively with accumulating
experience, a problem solver must be able to learn and exploit search
control knowledge. In this talk, I will discuss the use of
explanation-based learning (EBL) for acquiring domain-specific control
knowledge. Although previous research has demonstrated that EBL is a
viable approach for acquiring control knowledge, in practice EBL may not
always generate useful control knowledge. For control knowledge to be
effective, the cumulative benefits of applying the knowledge must
outweigh the cumulative costs of testing whether the knowledge is
applicable. Generating effective control knowledge may be difficult, as
evidenced by the complexities often encountered by human knowledge
engineers. In general, control knowledge cannot be indiscriminately
added to a system; its costs and benefits must be carefully taken into
account.

To produce effective control knowledge, an explanation-based learner
must generate "good" explanations -- explanations that can be profitably
employed to control problem solving. In this talk, I will discuss the
utility of EBL and describe the PRODIGY system, a problem solver that
learns by searching for good explanations. Extensive experiments testing
the PRODIGY/EBL architecture in several task domains will be discussed.
I will also briefly describe a formal model of EBL and a proof that
PRODIGY's generalization algorithm is correct with respect to this model.
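
The cost/benefit point in the abstract can be made concrete with a
back-of-the-envelope calculation (the exact bookkeeping used in PRODIGY
may differ; the numbers below are hypothetical): a learned control rule
is worth keeping only if the search time it saves when it applies
outweighs the match cost paid at every opportunity to apply it.

    # Sketch of the utility test for a learned control rule
    # (illustrative accounting only; numbers are made up).
    def rule_utility(avg_savings, application_freq, avg_match_cost):
        """Estimated net benefit per opportunity to apply the rule."""
        return avg_savings * application_freq - avg_match_cost

    # Saves 40 ms when it fires, fires on 5% of nodes, costs 3 ms to match
    # at every node -- a net loss, so the rule should be discarded.
    print(rule_utility(avg_savings=40.0, application_freq=0.05, avg_match_cost=3.0))  # -1.0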

------------------------------

Date: Fri, 29 Apr 88 08:51 EDT
From: Dori Wells <DWELLS@G.BBN.COM>
Subject: Language & Cognition Seminar

BBN Science Development Program
Language & Cognition Seminar Series


METAPHORS, MEMORIES AND MODALITIES: INSIGHTS FROM INFANTS

Sheldon H. Wagner
Department of Psychology
University of Rochester


BBN Laboratories Inc.
10 Moulton Street
Large Conference Room, 2nd Floor

10:30 a.m., Friday, May 6, 1988


Abstract: Human infants are linguistically and experientially immature, and
yet they show evidence of complex cognitive judgments, some of which might be
thought to lie solely in the province of language users. Examples of these are
"metaphorical" recognition of similarities between physically dissimilar
events and the recognition of objects presented separately to different
modalities. Evidence for these abilities and a putative amodal code that
subserves them will be presented along with a model of visual recognition
memory that can serve as a useful metric for quantifying the rate of
information processing in infants of varying ages. Concurrent validity for
the model will be examined by comparing performances of infants of varying
ages under different experimental conditions and by comparing these results
to those obtained from infants born under medically compromising conditions
such as birth asphyxia, intraventricular hemorrhage and severe prematurity.

------------------------------

Date: Fri, 29 Apr 88 13:48 EDT
From: Ehud Reiter <reiter@harvard.harvard.edu>
Subject: Harvard AI seminar

Monday, May 9, 1988
4 PM
Aiken 101 (Harvard University)
(Tea at 3:45 pm, Aiken Main Lobby)

Connectionist Representations and Linguistic Inference

David S. Touretzky
Computer Science Department
Carnegie Mellon University

DUCS is a neural network architecture for representing and manipulating
frame-like structures. Slot names and slot fillers are diffuse patterns of
activity spread over a collection of units. The choice of a distributed
representation gives rise to certain useful properties not shared by
conventional frame systems. One of these is the ability to retrieve a slot
even if the slot name is not known precisely. Another is the ability to encode
fine semantic distinctions as subtle variations on the canonical pattern for a
slot. DUCS combines the flexibility of parallel distributed processing with the
structured flavor of conventional formalisms, but it is only suggestive of the
sort of fluid knowledge representations connectionists are really after.
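
The retrieval property mentioned above can be illustrated with a very
small Python sketch (not the DUCS architecture itself): slot names are
distributed patterns, and a slot can still be found when the probe
pattern is only approximately right, by picking the stored key with the
highest similarity.

    # Toy content-addressable slot retrieval over distributed patterns.
    import math, random

    def similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

    random.seed(0)
    def pattern(n=32):
        return [random.choice((-1.0, 1.0)) for _ in range(n)]

    # A frame with two slots, each slot name encoded as a distributed pattern.
    slots = {"agent": pattern(), "instrument": pattern()}
    fillers = {"agent": "JOHN", "instrument": "HAMMER"}

    # Probe with a degraded copy of the "agent" pattern (an eighth of the units flipped).
    probe = [(-x if i % 8 == 0 else x) for i, x in enumerate(slots["agent"])]
    best = max(slots, key=lambda name: similarity(probe, slots[name]))
    print(best, fillers[best])   # still retrieves the agent slot: JOHN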

In the second half of the talk I will discuss some current problems in
connectionist natural language processing. Spreading activation/lateral
inhibition architectures are insufficient to handle many interesting linguistic
phenomena. For example, metonymy requires not only a rich knowledge
representation, but also a flexible inference mechanism. Future connectionist
models, employing more sophisticated network architectures, may provide
solutions to these difficulties.

------------------------------

Date: Tue, 3 May 88 15:34 EDT
From: Dori Wells <DWELLS@G.BBN.COM>
Subject: Lang. & Cognition Seminar

BBN Science Development Program
Language & Cognition Seminar Series

ON GLOSSING

Alton Becker
Dept. of Linguistics
University of Michigan

BBN Laboratories Inc.
10 Moulton Street
Large Conference Room, 2nd Floor

10:30 a.m., Thursday, May 5, 1988


Abstract: While linguists have become very sophisticated at parsing languages,
the other basic linguistic act, glossing, is almost always done uncritically,
even though glossing to a very great extent determines parsing. In this talk,
the pervasive double-headedness construction in Burmese, which I will argue
is untranslatable into single-headed English, will be used to illustrate how
uncritical glossing has obscured deep syntactic differences between languages.

Like all my talks, this one will be yet another plea for nonuniversality.

------------------------------

Date: Thu, 5 May 88 12:20 EDT
From: Emma Pease <emma@russell.stanford.edu>
Subject: From CSLI Calendar, May 5, 3:27

Reading: "Connectionism and Cognitive Architecture:
A Critical Analysis"

by Jerry Fodor and Zenon Pylyshyn
Cognition, March 1988
Discussion led by Adrian Cussins
(cussins.pa@xerox.com)
May 5

Fodor and Pylyshyn try to impale connectionism on the horns of a
dilemma: either connectionism is a mere implementation theory, or it
is a false theory of cognition because it cannot capture the
systematicity of thought. Once you adopt Fodor and Pylyshyn's
perspective, their argument is very powerful. I will run through
their argument, and then show how to adopt a different perspective on
connectionist modeling of cognition by showing how to deny their
assumption that any psychological theory must model cognition in terms
of its conceptual structure. Our question must be: Can we make sense
of connectionism's claim to model cognition subsymbolically?

--------------
NEXT WEEK'S CSLI TINLUNCH
Harman's and Cherniak's Leniency on Agent Rationality
Readings: "Change in View: Principles of Reasoning"
by Gilbert Harman, chaps. 2 and 3 (MIT Press, 1986)
and
"Minimal Rationality"
by Christopher Cherniak (MIT Press, 1986)
Discussion led by Ronald Loui
(loui@csli.stanford.edu)
May 12

In two short chapters of CHANGE IN VIEW, Gil Harman (1) argues against
the "special relevance" of logic to reasoning, and (2) dismisses
normative theories that require (probabilistic) degrees of belief.
Inconsistency is unavoidable, closure leads to clutter, and sometimes
implication is immediate, but not logical. There exists
all-or-nothing-belief, and "it is too complicated for mere finite
beings to make extensive use of probabilities."


Christopher Cherniak is more interested in "how stupid can you be"
while still being describable as a rational agent. But he has also
tried to let the agent off the normative hook. He argues that a
condition of MINIMAL RATIONALITY must be fashioned for any
satisfactory cognitive theorizing. Only inferences that are
"feasible" and "apparently appropriate" need be made.

I will focus on the Harman chapters, but will entertain Cherniak's
blurring of the descriptive/normative distinction as well.

--------------
NEXT WEEK'S CSLI SEMINAR
What is Logic Programming? What is a Logic?
Jose Meseguer
(meseguer@csl.sri.com)
May 12

During the past few years at CSLI, Joseph Goguen and I have proposed a
broad view of logic programming that is open to different logics and
rejects the identification of logic programming with one of its
instances. This point of view has led to the design and
implementation of the purely functional logic programming language
OBJ, to the design of the Eqlog language that unifies functional and
relational programming, and to the FOOPS and FOOPlog languages that
extend OBJ and Eqlog with object-oriented capabilities. The
development of natural language systems can be made much simpler by
adopting this broad view of logic programming. Joseph Goguen has
shown how the Goguen-Burstall theory of institutions can be applied to
formalize logic programming languages and yields design principles for
such languages.

In this talk, I will report on some recent work of mine that builds
on the previous joint work with Goguen and makes this view axiomatic
by giving general axioms that a language should satisfy in order to be
called a logic programming language. The question of what a logic
programming language is leads us to the more fundamental question
about how general logics should be axiomatized. Two main approaches
to this question are:

1. A model-theoretic approach that takes the satisfaction relation
between models and sentences as basic, and is exemplified by the
Barwise axioms for abstract model theory and the Goguen-Burstall
axioms for institutions.

2. A proof-theoretic approach that takes the entailment relation
between sets of sentences as basic and is exemplified by the work
of Tarski on consequence relations and the entailment axioms of
Scott.

Neither of these approaches is by itself sufficient to axiomatize
logic programming. In the talk I will present an axiomatization that
unifies both approaches and yields the desired axioms for logic
programming. I will also discuss how languages based on higher-order
type theory can be included in this framework.
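
As standard background on the two approaches (these are the textbook
formulations, not necessarily the axioms proposed in the talk): the
Goguen-Burstall satisfaction condition requires that truth be invariant
under change of notation along a signature morphism, while a
Tarski-style entailment relation is reflexive, monotone, and closed
under cut. In LaTeX:

    % Satisfaction condition for institutions, for a signature morphism
    % \phi : \Sigma \to \Sigma', a \Sigma-sentence e, and a \Sigma'-model M'
    % (M'|_{\phi} is the reduct of M' along \phi):
    \[ M' \models_{\Sigma'} \phi(e) \iff M'|_{\phi} \models_{\Sigma} e \]

    % Tarski-style axioms for an entailment relation \vdash between a set
    % of sentences \Gamma and a sentence e:
    \[ e \in \Gamma \;\Rightarrow\; \Gamma \vdash e \]                                   % reflexivity
    \[ \Gamma \vdash e \ \text{and}\ \Gamma \subseteq \Gamma' \;\Rightarrow\; \Gamma' \vdash e \]   % monotonicity
    \[ \Gamma \vdash d \ \text{for all}\ d \in \Delta, \ \text{and}\ \Delta \vdash e \;\Rightarrow\; \Gamma \vdash e \]  % transitivity (cut)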


------------------------------

Date: Tue, 10 May 88 01:09 EDT
From: Ron Nash <nash@csli.STANFORD.EDU>
Subject: CSLI Reports


The Spring 1988 catalog of reports published by The Center for the
Study of Language and Information at Stanford University is now
available online, in HyperCard format (for Macintosh computers).
Abstracts are included.

This is an update of (not a supplement to) the previous catalog.
So if you missed the last edition, this one contains the complete
list.

The file is available by anonymous ftp from csli.stanford.edu.
The relevant file is: pub/csli-abstracts.hqx
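
For those with Internet access, a minimal sketch of retrieving the file
programmatically is shown below, using Python's ftplib purely as an
illustration; the host and path are the ones given above and, of
course, date from 1988.

    # Anonymous FTP fetch of the catalog file named above (illustrative only).
    from ftplib import FTP

    with FTP("csli.stanford.edu") as ftp:
        ftp.login()                                    # anonymous login
        with open("csli-abstracts.hqx", "wb") as out:
            ftp.retrbinary("RETR pub/csli-abstracts.hqx", out.write)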

Those without internet access can send a 3.5" disk and a self-addressed
envelope to:

Publications
CSLI
Ventura Hall
Stanford University
Stanford, CA 94305-4115


(CSLI was founded in 1983 by researchers from Stanford University,
SRI International, and Xerox PARC to further research and development
of integrated theories of language, information, and computation.)

Ron Nash
Center for the Study of Language and Information
Stanford University
nash@russell.stanford.edu

------------------------------

Date: Thu, 12 May 88 13:00 EDT
From: Emma Pease <emma@russell.stanford.edu>
Subject: From CSLI Calendar, May 12, 3:28

Reading: "
Even"
by Paul Kay
Discussion led by Mark Gawron
(gawron@csli.stanford.edu)
May 19

This is one of several papers assuming the outlook of Construction
Grammar, which takes the analysis of the semantic and syntactic
properties of constructions as central, and takes the analysis of a
sentence to be the unification of the constructions in it. As the
primitive grammatical units, constructions here take on the role of
lexical items in other linguistic theories. This paper focuses on the
semantic and pragmatic properties of the "
construction" EVEN; Kay
invokes what he calls a scalar model, which he has elsewhere used to
explicate the notion of informativeness crucial for Grice's Maxim of
Quantity. If Kay's analysis is right, a correct account of the use of
EVEN in an utterance will make it an operation on the proposition
expressed by that utterance, another proposition Kay calls the context
proposition, and a particular scalar model invoked for that occasion.
This use of scalar models raises some issues about just what needs to
be part of the context that the meaning of a word (or by extension,
sentence) needs to operate on.


------------------------------

Date: Wed, 27 Apr 88 14:29 EDT
From: Drew Mcdermott <dvm@YALE-BULLDOG.ARPA>
Subject: Conference - AAAI Workshop on Plan Recognition

CALL FOR PARTICIPATION
WORKSHOP ON PLAN RECOGNITION

AAAI-88, Minneapolis, Minnesota, Wednesday, August 24.
Radisson-St. Paul Hotel

Plan recognition is a touchstone issue for Artificial Intelligence,
one that has generated thorny problems and theoretical results for years.
The class of problems we have in mind is to infer a goal-based
explanation of the behavior of one or more actors. This class can
be extended to closely related problems like inferring an author's
plans from a text, inferring a programmer's plans from his code, or
inferring explanations of new bug types from case histories.

Problems of this sort often seem to lie at the heart of intelligence,
because people can apparently select just the right explanatory
principles from large knowledge bases. For that reason, this problem
area has encouraged interest in nontraditional control structures such
as marker passing, parallelism, and connectionism. To date, however,
no decisive solutions have been obtained.

The workshop will aim at bringing together individuals working in all
the active areas related to plan recognition, as well as individuals
trying to exploit research results for practical applications. This
interaction should prove fruitful for both groups.

Contributors interested in participating in this workshop are requested
to submit a 1000-2000 word extended abstract of their work, describing
its relevance to the topic of plan recognition. The workshop attendance
will be limited to 35, and all participants will present their work,
either in an oral presentation, or in a poster session. Abstracts will
be refereed by the organizing committee. Copies of the chosen abstracts
will be sent to each participant prior to the workshop. Presenters shall
have the opportunity to expand their abstracts for inclusion in a workshop
proceedings to be published later.

Extended abstracts should be received prior to June 3, 1988. Mail them
to:
Jeff Maier
TASC
2555 University Blvd.
Fairborn, Ohio 45324
(513)426-1040

Authors will be notified of the status of their papers by July 8, 1988.

Organizing Committee:

Larry Birnbaum, Yale University
Doug Chubb, US Army, Center for Signals Warfare
Jeff Maier, TASC
Drew McDermott, Yale University
Bob Wilensky, UC Berkeley
Steve Williams, TASC

------------------------------

End of NL-KR Digest
*******************
