NL-KR Digest             (11/03/88 00:01:08)            Volume 5 Number 21 

Today's Topics:
Seminar - Knowledge Processing - Hewitt
Harvard AI colloquium
Seminar - Probabilistic Semantics - Pearl
SUNY Buffalo Linguistics Colloq: Zwicky
From CSLI Calendar, October 27, 4:6 (includes new publications)
From CSLI Calendar, November 3, 4:7

Submissions: NL-KR@CS.ROCHESTER.EDU
Requests, policy: NL-KR-REQUEST@CS.ROCHESTER.EDU
----------------------------------------------------------------------

Date: Sat, 22 Oct 88 23:19 EDT
From: Carl Hewitt <Hewitt@xx.lcs.mit.edu>
Subject: Seminar - Knowledge Processing - Hewitt

How an IC Fab is different from an Insect Colony
(The Importance of Keeping Insects Out of the Fab)

Carl Hewitt
Message Passing Semantics Group
MIT AI/LCS

Tuesday 25 October 1988
Seminar: 2:30-3:30pm
Toscanini's Ice Cream: 3:30-...

MIT AI Lab, NE43-8th floor Playroom
545 Tech Sq. Cambridge

Abstract

Interested in AI? -- Yes -->

    Knowledge Processing is a new approach that is informed by results
    from the sociology of science.  The result is an approach that
    challenges both the "scruffies" and the "neats."  Unlike the
    "scruffies", Knowledge Processing is becoming a principled approach
    with rigorous foundations and methods.  Unlike the "neats", Knowledge
    Processing takes conflict and contradictions to be the norm, thereby
    vitiating the most fundamental assumptions of the "neats".  Our
    approach incorporates and integrates the work of Howie Becker, Paul
    Feyerabend, Elihu Gerson, Bruno Latour, and Susan Leigh Star.  We are
    looking for students and staff to work with us to extend the Knowledge
    Processing paradigm, make it more rigorous, and apply it to challenging
    domains such as the ones discussed below.

Interested in CS? -- Yes -->

    Concurrent multiprocessor computers are the wave of the future.  Actors
    have become the de facto mathematical model for concurrent
    object-oriented programming languages (OOPSLA-88 Concurrency Workshop).
    Actors enjoy the theoretical property that they are "ultraconcurrent",
    which means that the available concurrency is limited only by the laws
    of physics.  Because they are ultraconcurrent, actors can be as fast
    as RPC on workstations, as fast as MULTILISP and QLISP on shared-address
    multiprocessors, and as fast as the Cosmic Kernel on distributed-memory
    multicomputers.  To make this theoretical result into a practical
    reality, we are looking for students and staff to join us to create
    high-performance ultraconcurrent prototypes on multiprocessor
    workstations running Mach and OS/2, on shared-address multiprocessors
    (Encore and Sequent), and on multicomputers (Ametek, Intel, Jellybean,
    and Mosaic).  Our goal for 1992 is to achieve a sustained rate of
    100 billion 32-bit-data-path instructions per second for knowledge
    processing applications such as the one described below.

Interested in IC Fab? -- Yes -->

    Current IC manufacturing technology lacks robustness, flexibility, and
    efficiency.  Supporting the staff and customers with computer
    organizations to mediate the work has great potential to dramatically
    improve the productivity of flexible IC manufacturing.  Each human
    organization (CAD group, physical plant, accounting dept., etc.) will
    have its own shadow computer organization to help organize and
    coordinate its work.  These computer organizations will be designed and
    managed according to the principles and methods of Knowledge Processing.

Interested in ICE CREAM? -- Yes --> ALL Come!
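
A minimal sketch, in present-day Python, of the actor abstraction the
announcement appeals to: asynchronous sends into per-actor mailboxes, with
each actor handling one message at a time, so that the available concurrency
grows with the number of actors.  It is illustrative only and is not the
group's ultraconcurrent implementation.

# Minimal actor sketch (illustrative only): each actor owns a mailbox and
# handles one message at a time, so its private state needs no locks.
import queue
import threading

class Actor:
    def __init__(self):
        self._mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message):
        """Asynchronous send: enqueue the message and return immediately."""
        self._mailbox.put(message)

    def join(self):
        """Wait until every message sent so far has been handled."""
        self._mailbox.join()

    def _run(self):
        while True:
            message = self._mailbox.get()
            self.receive(message)            # messages are handled serially
            self._mailbox.task_done()

    def receive(self, message):
        raise NotImplementedError

class Counter(Actor):
    """Toy actor whose only behavior is to count 'inc' messages."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def receive(self, message):
        if message == "inc":
            self.count += 1

if __name__ == "__main__":
    counter = Counter()
    for _ in range(1000):
        counter.send("inc")                  # no locks: only the actor touches count
    counter.join()
    print(counter.count)                     # 1000
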
----
Brad Miller U. Rochester Comp Sci Dept.
miller@cs.rochester.edu {...allegra!rochester!miller}

------------------------------

Date: Mon, 24 Oct 88 14:41 EDT
From: Ehud Reiter <reiter@harvard.harvard.edu>
Subject: Harvard AI colloquium

HARVARD UNIVERSITY
Center for Research in Computing Technology
Colloquium Series Presents


BAYESIAN AND DEMPSTER-SHAFER FORMALISMS FOR
EVIDENTIAL REASONING: A CONCEPTUAL ANALYSIS

Judea Pearl
Cognitive Systems Laboratory
Computer Science Department
University of California, Los Angeles.

Thursday, October 27, 1988
4 PM, Aiken Computation Lab. 101
(Tea: 3:30 pm, Aiken Basement Lobby)

ABSTRACT

Evidential reasoning is the process of drawing plausible conclu-
sions from uncertain clues and incomplete information. In
most AI applications (e.g., diagnosis, forecasting, vision,
speech recognition and language understanding), this process has
been handled by ad-hoc techniques, embedded in domain-
specific procedures and data structures. Recently, there has
been a strong movement to seek a more principled basis for evi-
dential reasoning, and the two most popular contenders that
have emerged are the Bayesian and the Dempster-Shafer (D-S) ap-
proaches.

The Bayesian approach is by far the more familiar of the two, resting
on the rich tradition of statistical decision theory, as well as on
excellent axiomatic and behavioral arguments.  Its three defining
attributes are (1) reliance on a complete probabilistic model of the
domain, (2) willingness to accept subjective judgments as an expedient
substitute for empirical data, and (3) the use of Bayes
conditionalization as the primary mechanism for updating beliefs in
light of new information.
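
For reference, the conditionalization in (3) is the familiar Bayes rule: on
observing evidence e, the prior belief P(H) in a hypothesis H is replaced by
the posterior

    \[  P(H \mid e) \;=\; \frac{P(e \mid H)\, P(H)}{P(e)},  \]

every term of which must be supplied by the complete probabilistic model
assumed in (1).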

D-S belief functions offer an alternative to Bayesian inference,
in that they do not require the specification of a complete
probabilistic model and, consequently, they do not (and cannot)
use conditionalization to represent the impact of new evidence.
Instead, belief functions compute PROBABILITY INTERVALS, whose
meaning has puzzled many researchers and has been a subject of
much confusion.
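
In the textbook Dempster-Shafer formulation (the notation here is the
standard one, not necessarily the speaker's), evidence is summarized by a
basic probability assignment m over subsets of a frame of discernment
\Theta, and the interval attached to a proposition A runs from belief to
plausibility:

    \[  \mathrm{Bel}(A) \;=\; \sum_{B \subseteq A} m(B),
        \qquad
        \mathrm{Pl}(A) \;=\; \sum_{B \cap A \neq \emptyset} m(B)
                       \;=\; 1 - \mathrm{Bel}(\lnot A),  \]

with Bel(A) <= Pl(A), so the interval in question is [Bel(A), Pl(A)].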

The main purpose of this talk is to offer a clear interpretation
of belief functions, thus facilitating a better appreciation of
their power and range of applicability vis a vis those of Baye-
sian inference. We view a belief function as the PROBABILITY-OF-
NECESSITY, namely, the probability that the uncertain constraints
imposed by the evidence, together with the steady constraints
which govern the environment, will be sufficient to compel the
truth of a proposition (by excluding its negation). We shall
demonstrate this interpretation on simple examples, then address
the more general issues of computational, epistemological and
semantic adequacies of the Bayesian and D-S approaches.

Host: Professor Barbara Grosz

------------------------------

Date: Mon, 24 Oct 88 15:41 EDT
From: annette@xx.lcs.mit.edu
Subject: Seminar - Probabilistic Semantics - Pearl


Date: Friday, October 28
Time: 9:30
Place: 8th floor playroom

PROBABILISTIC SEMANTICS FOR QUALITATIVE REASONING:
PRELIMINARY RESULTS AND OPEN QUESTIONS

Judea Pearl
Computer Science Department
University of California, Los Angeles

The prospect of attaching probabilistic semantics to con-
ditional sentences promises to provide current theories of
commonsense reasoning with useful norms of coherence. For exam-
ple, if we interpret the sentence "Birds fly" to mean "If x is a
bird, it is highly probable that x can fly", then the
logic of high probabilities (Adams, 1966) imposes some desir-
able disciplines on how default theories should behave -- it
posts requirements of consistency on default statements, it per-
mits the derivation of plausible conclusions that have been
missed by other formalisms and it is free of spurious exten-
sions. Using nonstandard analysis for infinitesimals
(Spohn, 1988), this logic can be further refined to represent
shades of likelihood, e.g., "likely", "very likely", "extremely
likely", etc.

However, shades of likelihood are not sufficient to capture
many plausible patterns of reasoning, and must be augmented with
assumptions invoking notions of independence and causation.
The maximum-entropy approach succeeds in emulating conventions
of independence, but it appears to have a basic clash with
human understanding of causation. I shall illustrate the na-
ture of these problems using the "Yale shooting problem" and
the "UCLA party problem".

----
Brad Miller U. Rochester Comp Sci Dept.
miller@cs.rochester.edu {...allegra!rochester!miller}

------------------------------

Date: Mon, 24 Oct 88 17:05 EDT
From: William J. Rapaport <rapaport@cs.Buffalo.EDU>
Subject: SUNY Buffalo Linguistics Colloq: Zwicky


UNIVERSITY AT BUFFALO
STATE UNIVERSITY OF NEW YORK

DEPARTMENT OF LINGUISTICS
GRADUATE GROUP IN COGNITIVE SCIENCE
and
GRADUATE RESEARCH INITIATIVE IN COGNITIVE AND LINGUISTIC SCIENCES

PRESENT

ARNOLD ZWICKY

Department of Linguistics, Ohio State University
Department of Linguistics, Stanford University

1. TOWARDS A THEORY OF SYNTACTIC CONSTRUCTIONS

The past decade has seen the vigorous development of frameworks for syn-
tactic description that not only are fully explicit (to the point of
being easily modeled in computer programs) but also are integrated with
an equally explicit framework for semantic description (and, sometimes,
with equally explicit frameworks for morphological and phonological
description). This has made it possible to reconsider the _construc-
tion_ as a central concept in syntax.

Constructions are, like words, Saussurean signs--linkages of linguistic
form with meanings and pragmatic values. The technical problem is to
develop the appropriate logics for the interactions between construc-
tions, both with respect to their form and with respect to their
interpretation. I am concerned here primarily with the formal side of
the matter, which turns out to be rather more intricate than one might
have expected. Constructions are complexes of categories, sub-
categories, grammatical relations, conditions on governed features, con-
ditions on agreeing features, conditions on phonological shape, condi-
tions on branching, conditions on ordering, _and_ specific contributory
constructions (so that, for example, the subject-auxiliary construction
in English contributes to several others, including the information
question construction, as in `What might you have seen?'). The schemes
of formal interaction I will illustrate are overlapping, or mutual
applicability; superimposition, or invocation; and preclusion, or over-
riding of defaults.
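
Purely as an illustration of what "fully explicit (to the point of being
easily modeled in computer programs)" can mean, a construction might be
rendered as a bundle of conditions plus pointers to the constructions it
contributes to.  The rendering below is hypothetical; the field names are
illustrative only, not Zwicky's formalism.

# Hypothetical rendering of two interacting constructions as condition
# bundles; field names are illustrative only, not Zwicky's formalism.
SUBJECT_AUXILIARY = {
    "categories":     ["AUX", "NP"],
    "ordering":       ["AUX", "NP"],          # auxiliary precedes the subject
    "agreement":      [("AUX", "NP")],        # auxiliary agrees with the subject
    "contributes_to": ["information_question", "yes_no_question"],
}

INFORMATION_QUESTION = {                       # `What might you have seen?'
    "categories": ["WH_PHRASE", "S"],
    "ordering":   ["WH_PHRASE", "S"],
    "invokes":    ["subject_auxiliary"],       # superimposition of the construction above
}
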

Thursday, November 3, 1988
5:00 P.M.
Baldy 684, Amherst Campus

There will be an evening discussion on Nov. 3, 8:00 P.M.,
at the home of Joan Bybee, 38 Endicott, Eggertsville.

=========================================================================

2. INFLECTIONAL MORPHOLOGY AS A (SUB)COMPONENT OF GRAMMAR

Friday, November 4, 1988
3:00 P.M.
Baldy 684, Amherst Campus

Wine and cheese to follow.

Call Donna Gerdts (Dept. of Linguistics, 636-2177) for further information.

------------------------------

Date: Wed, 26 Oct 88 20:10 EDT
From: Emma Pease <emma@csli.Stanford.EDU>
Subject: From CSLI Calendar, October 27, 4:6

What is Planning? What does it have to do with Language Processing?
Ray Perrault
(rperrault@ai.sri.com)
November 3

Various notions of `plan', or complex action, have been developed in
AI in the course of building programs that can automatically
construct courses of behavior to achieve certain goals.  Tradeoffs can
be made between the expressive power of the languages in which states
and actions are described and the computational difficulty of the
processes by which plans are constructed, i.e., `planning', or
inferred, i.e., `plan recognition'.
We will review some of the main lines of research on plans in AI,
as well as applications made of those notions to problems in language
understanding, including language generation, speech act theory, and
understanding of stories and dialogues.
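
As a concrete anchor for the expressiveness-versus-complexity tradeoff, the
following is a deliberately tiny STRIPS-style sketch in Python (illustrative
only; it corresponds to none of the richer notions of plan the talk will
survey): states are sets of atoms, actions carry preconditions and add/delete
effects, and planning is search for an action sequence that reaches a goal.

# Tiny STRIPS-style planner: breadth-first search over sets of atoms.
from collections import deque, namedtuple

Action = namedtuple("Action", "name pre add delete")

ACTIONS = [
    Action("pick-up-key", pre={"at-door"},            add={"has-key"},   delete=set()),
    Action("unlock-door", pre={"at-door", "has-key"}, add={"door-open"}, delete=set()),
    Action("go-inside",   pre={"door-open"},          add={"inside"},    delete={"at-door"}),
]

def plan(initial, goal, actions):
    """Return a shortest sequence of action names from initial to goal, or None."""
    start = frozenset(initial)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:                        # every goal atom already holds
            return steps
        for a in actions:
            if a.pre <= state:                   # action is applicable
                nxt = frozenset((state - a.delete) | a.add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [a.name]))
    return None

print(plan({"at-door"}, {"inside"}, ACTIONS))
# ['pick-up-key', 'unlock-door', 'go-inside']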

____________
NEXT WEEK'S CSLI SEMINAR
The Resolution Problem for Natural-Language Processing
Week 6: Knowledge-based Approaches to the Resolution Problem
Jerry Hobbs
(hobbs@warbucks.ai.sri.com)
November 3

We will continue examining various AI approaches to the resolution
problem, concentrating on those that try to make extensive use of
world knowledge and context. We will especially be looking at the
work of Hirst, Charniak, and approaches using abductive inference.

--------------
CSLI TALK
Proof Normalization with Nonstandard Objects
Shigeki Goto
Nippon Telegraph and Telephone Corporation
Monday, 31 October, 2:30
Cordura Conference Room 100

It is well known that formal proof systems can serve as programming
languages. A proof that describes an algorithm can be executed by
Prawitz's normalization procedure. This talk extends the
computational use of proofs to realize a lazy computation formally.
It enables computation of a proof over stream objects (infinitely long
lists) as in Concurrent Prolog.
To deal with infinitely long objects, we will extend the number
theory to incorporate infinite numbers. This is an application of
nonstandard analysis to computer programs. We will show that the rule
of mathematical induction can be extended to cover infinite numbers
with appropriate computational meaning.
The method of introducing nonstandard integers was independently
proposed by the speaker (Goto) and by Professor Per Martin-Lof of
Stockholm University. The speaker will briefly discuss Martin-Lof's
extension of his constructive type theory.
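
A loose present-day analogy to the lazy computation over streams described
above can be given with Python generators (an analogy only: the talk obtains
the behavior from proof normalization, not from an evaluator).

from itertools import islice

def naturals(start=0):
    """An 'infinitely long list' of numbers, produced one element at a time."""
    n = start
    while True:
        yield n
        n += 1

def add_streams(xs, ys):
    """Element-wise sum of two infinite streams, again computed lazily."""
    for x, y in zip(xs, ys):
        yield x + y

# Only the demanded prefix of the infinite result is ever computed:
print(list(islice(add_streams(naturals(), naturals(1)), 5)))   # [1, 3, 5, 7, 9]
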
--------------
TALK
Minds, Machines, and Searle
Stevan Harnad
(harnad@princeton.edu)
Thursday, 3 November, 10:00
Cordura Hall Conference Room

Searle's provocative "Chinese Room Argument" attempted to show that
the goals of "Strong AI" are unrealizable. Proponents of Strong AI
are supposed to believe that (i) the mind is a computer program, (ii)
the brain is irrelevant, and (iii) the Turing Test is decisive.
Searle's point is that since the programmed symbol-manipulating
instructions of a computer capable of passing the Turing Test for
understanding Chinese could always be performed instead by a person
who could not understand Chinese, the computer can hardly be said to
understand Chinese. Such "simulated" understanding, Searle argues, is
not the same as real understanding, which can only be accomplished by
something that "duplicates" the "causal powers" of the brain. In this
paper I make the following points:

1. Simulation versus Implementation
2. Theory-Testing versus Turing-Testing
3. The Convergence Argument
4. Brain Modeling versus Mind Modeling
5. The Modularity Assumption
6. The Teletype versus the Robot Turing Test
7. The Transducer/Effector Argument
8. Robotics and Causality
9. Symbolic Functionalism versus Robotic Functionalism
10. "
Strong" versus "Weak" AI

---------------
NEW PUBLICATIONS

The CSLI Publications Office is pleased to announce the publication of
three new titles.

---------------------------------------------
The second edition of Johan van Benthem's
"
A Manual of Intensional Logic"
(Revised and Expanded)

Intensional logic, as understood here, is based on the broad
presupposition that so-called intensional contexts in natural language
can be explained semantically by the idea of multiple reference. Van
Benthem reviews work on tense, modality, and conditionals and then
presents recent developments in intensional theory, including
partiality and generalized quantifiers. The text of the first edition
has been substantially revised, and three new chapters have been
added.
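
The "multiple reference" idea is the familiar possible-worlds truth clause;
for the modal fragment, for instance,

    \[  M, w \models \Box\varphi
        \quad\text{iff}\quad
        M, v \models \varphi \ \text{for every } v \text{ with } wRv,  \]

so that the truth of \Box\varphi at one world is determined by the values
of \varphi at many.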

Johan van Benthem is professor of mathematical logic at the University
of Amsterdam.

Cloth: $29.95 ISBN: 0-937073-30-X

Paper: $12.95 ISBN: 0-937073-29-6

---------------------------------------------
Tore Langholm's
"
Partiality, Truth and Persistence"

This book is a study in the theory of partially defined models.
Langholm compares in detail the various alternatives for extending the
definition of truth or falsity that holds with classical, complete
models to partial models. He also investigates the monotonicity of
truth and other inexpressible conditions. These discussions culminate
with a combined Lindstrom and persistence characterization theorem.
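
"Persistence" here is the usual monotonicity property with respect to the
information ordering on partial models (the book's exact definitions may
differ in detail): if M \sqsubseteq M', i.e., M' settles at least everything
M settles and in the same way, then

    \[  M \models \varphi \;\Longrightarrow\; M' \models \varphi.  \]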

Tore Langholm is a research fellow in mathematics at the University of
Oslo.

Cloth: $29.95 ISBN: 0-937073-35-0

Paper: $12.95 ISBN: 0-937073-34-2

---------------------------------------------
"
Papers from the Second International
Conference on Japanese Syntax"
(Edited by William Poser)

Cloth: $40.00 ISBN: 0-937073-39-3

Paper: $15.95 ISBN: 0-937073-38-5

---------------------------------------------

These titles are distributed by the University of Chicago Press and may
be purchased in academic or university bookstores or ordered directly
from the distributor at 5801 Ellis Avenue, Chicago, Illinois 60637.
(1-800) 621-2736.

------------------------------

Date: Wed, 2 Nov 88 20:28 EST
From: Emma Pease <emma@csli.Stanford.EDU>
Subject: From CSLI Calendar, November 3, 4:7

Higher-Level Lexical Structure and Parsing
Michael Tanenhaus
University of Rochester
(mtan@prodigal.psych.rochester.edu)
November 10

Sentences with long-distance dependencies (filler-gap sentences, e.g.,
`Which book did the teacher ask the student to read?', in which the
fronted phrase must be linked to a gap later in the sentence) present
interesting problems of ambiguity resolution. This paper will
present results from a series of experiments, using both behavioral
measures and brain-evoked potential measures, that provide a detailed
picture of how people use verb argument structure and verb control
information to posit and fill gaps. The results provide intriguing
suggestions about the interaction among syntactic, semantic, and
lexical information in parsing.


------------------------------

End of NL-KR Digest
*******************

