AIList Digest             Monday, 6 May 1985       Volume 3 : Issue 60 

Today's Topics:
Seminars - Frame Problem (UCB) &
The Sequential Nature of Unification (SU) &
Late-Bound Polymorphic Languages (IBM-SJ) &
Inclusive Predicate Existence (CMU) &
The LCF Programmable Theorem Provers (UTexas) &
Automated Programming (CMU) &
Mathematical Discovery and Hindsight (RU) &
Foundations of Horn Logic Programming (CMU) &
Property Theory and Second-Order Logic (CSLI)

----------------------------------------------------------------------

Date: Wed, 3 Apr 85 15:29:21 pst
From: chertok%ucbcogsci@Berkeley (Paula Chertok)
Subject: Seminar - Frame Problem (UCB)

BERKELEY COGNITIVE SCIENCE PROGRAM
Cognitive Science Seminar -- IDS 237B

TIME: Tuesday, April 9, 11 - 12:30
PLACE: 240 Bechtel Engineering Center

SPEAKER: Jerry Fodor, Philosophy Department, MIT and
UC Berkeley

TITLE: ``Modules, Frames, Fridgeons, Sleeping
Dogs and The Music of The Spheres''

This paper continues the discussion of the frame problem
from The Modularity of Mind. It's argued that outbreaks of the
frame problem are characteristic of the study of unencapsulated -
hence nonmodular - cognitive faculties, and that this explains
why the effects of the problem are felt much more strongly when
we consider cognition and problem solving than when the topic is
perceptual integration. Since unencapsulation is a leading
characteristic of any process of rational nondemonstrative infer-
ence, it is argued that the solution of the frame problem is not
dissociable from the general problem of understanding such
processes. This rather gloomy assessment is then compared with
views current in AI according to which resort to `sleeping dog'
strategies has already made it possible to circumvent the frame
problem. I argue that the proposed solution begs the problem.

------------------------------

Date: 3 Apr 85 0000 PST
From: Arthur Keller <ARK@SU-AI.ARPA>
Subject: Seminar - The Sequential Nature of Unification (SU)


CS Colloquium, April 9, 4:15pm, Terman Auditorium

ON THE SEQUENTIAL NATURE OF UNIFICATION

Cynthia Dwork
Massachusetts Institute of Technology


Unification of terms is a crucial step in resolution theorem proving, with
applications to a variety of symbolic computation problems. It will be
shown that the general problem is log-space complete for P, even if
infinite substitutions are allowed. Thus a fast parallel algorithm for
unification is unlikely. More positively, term matching, an important
subcase of unification, will be shown to have a parallel algorithm
requiring a number of processors polynomial in n, the size of the input,
and running in time poly-logarithmic in n.

This talk assumes no familiarity with unification or its applications.
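
[An editorial illustration, not from the talk: term matching is the
subcase where the subject term contains no variables, so each
argument pair can be processed independently, which is the intuition
behind the fast parallel algorithm. A minimal sketch in OCaml, with
names and representation chosen for exposition:

  type term =
    | Var of string               (* a variable; may occur in the pattern *)
    | Fn  of string * term list   (* a function symbol and its arguments  *)

  exception No_match

  (* [match_term subst pat t] extends [subst] so that [pat] instantiates
     to the ground term [t], or raises [No_match].  Unlike general
     unification, bindings never flow back into [t], so the argument
     pairs can be matched independently (and hence in parallel). *)
  let rec match_term subst pat t =
    match pat, t with
    | Var x, _ ->
        (match List.assoc_opt x subst with
         | Some t' when t' = t -> subst   (* repeated variable, consistent *)
         | Some _ -> raise No_match       (* repeated variable, conflict   *)
         | None -> (x, t) :: subst)
    | Fn (f, ps), Fn (g, ts)
      when f = g && List.length ps = List.length ts ->
        List.fold_left2 match_term subst ps ts
    | _ -> raise No_match

  (* Match f(x, g(y)) against f(a, g(b)), binding x and y. *)
  let () =
    let pat = Fn ("f", [Var "x"; Fn ("g", [Var "y"])]) in
    let t   = Fn ("f", [Fn ("a", []); Fn ("g", [Fn ("b", [])])]) in
    Printf.printf "bound %d variables\n" (List.length (match_term [] pat t))

]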

------------------------------

Date: Wed, 3 Apr 85 17:03:57 PST
From: IBM San Jose Research Laboratory Calendar
<calendar%ibm-sj.csnet@csnet-relay.arpa>
Reply-to: IBM-SJ Calendar <CALENDAR%ibm-sj.csnet@csnet-relay.arpa>
Subject: Seminar - Late-Bound Polymorphic Languages (IBM-SJ)

IBM San Jose Research Lab
5600 Cottle Road
San Jose, CA 95193


Mon., Apr. 8    Computer Science Seminar
10:30 A.M.      STATIC TYPE ANALYSIS IN LATE-BOUND POLYMORPHIC LANGUAGES
Aud. A

Late-bound languages are those which in some way
permit the user to delay the binding of meanings to
operation names until such time as the operation is
actually to be invoked. SIMULA is generally
considered to be the first such language and several
others, most notably Smalltalk, Zetalisp, and LOOPS,
have been introduced in recent years. These
languages gain great flexibility from the
programmer's ability to write procedures which work
on a wide variety of kinds of data, each of which
implements a small set of common operations in a
data-specific way. This kind of polymorphism,
coupled with an inheritance mechanism for sharing
code, has enabled the creation of powerful software
in remarkably little time and space. The price for
this flexibility, however, has until now been the
inability to formally reason about programs in these
languages. In particular, the lack of a decidable
and sound system of static typechecking has prevented
the application of even the simplest kinds of
compiler optimizations. In this way, late-binding
has acquired a reputation for inherent slowness and
impracticality. My talk will examine the issue of
static typechecking for these languages in detail.
The Smalltalk-80 language is studied first, leading
to an explication of both the major subtleties
inherent in the problem and the picayune but
crippling difficulties with Smalltalk itself which
render static analysis in that language ugly and,
ultimately, impracticable. An approach to the
problem and a new perspective on late-binding itself
eventually emerge and, in the final part of the talk,
these are applied to the design of a small,
late-bound, polymorphic language. I have constructed
formal semantics and a provably-sound system for
type-inference in this language. These in turn have
led to a type-assignment algorithm based on circular
unification.

P. Curtis, Cornell University
Host: J. Williams
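
[An editorial sketch, not the system described in the talk: OCaml, a
descendant of ML, statically types a late-bound "message send" via row
polymorphism, which gives one concrete picture of how this kind of
polymorphism can be checked at compile time. All names below are
illustrative.

  (* Two kinds of data, each implementing a small set of common
     operations in a data-specific way. *)
  let circle r = object
    method name = "circle"
    method area = 3.14159 *. r *. r
  end

  let rect w h = object
    method name = "rect"
    method area = w *. h
  end

  (* [describe] is inferred to have the open object type
     < area : float; name : string; .. > -> string, so it accepts any
     object offering [name] and [area]; the binding of those operation
     names to code is still delayed until the call. *)
  let describe shape =
    Printf.sprintf "%s: %f" shape#name shape#area

  let () =
    print_endline (describe (circle 1.0));
    print_endline (describe (rect 2.0 3.0))

]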

------------------------------

Date: 2 Apr 1985 1621-EST
From: Lydia Defilippo <DEFILIPPO@CMU-CS-C.ARPA>
Subject: Seminar - Inclusive Predicate Existence (CMU)

APPLIED LOGIC SEMINAR

Speaker: Ketan Mulmuley
Date: Monday, April 8, 1985
Time: 2:00 - 3:15 pm
Place: 2105 Doherty Hall
Title: A Mechanizable Theory For Inclusive Predicate Existence
ABSTRACT:

In denotational semantics inclusive predicates play a major role in showing
the equivalence of two semantics - one operational and one denotational, for
example - of a given language. Proving the existence of these inclusive
predicates is always the hardest part of any semantic equivalence proof.
Reynolds and Milne gave a general scheme for proving such existences.
However, because of the difficult nature of these proofs it soon became
desirable to have a mechanizable theory for such proofs so that the computer
could automate a large chunk of them. We shall present one such theory and
see how it was implemented on top of LCF.

Moreover, we shall prove that this existence issue is indeed nontrivial
through diagonalization.
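
[To see why existence is a genuine issue, here is the standard textbook
illustration (an editorial addition, not necessarily the talk's
example): relating a reflexive domain D, with D isomorphic to
(D -> D)_\bot, to the operational behaviour of untyped lambda-terms
calls for a predicate

  \[
  d \lesssim M \;\iff\; d = \bot \;\lor\;
    \forall d',\,N.\ \bigl( d' \lesssim N \;\Rightarrow\; d(d') \lesssim M\,N \bigr)
  \]

in which \lesssim occurs to the left of an implication, i.e. negatively.
The defining functional is therefore not monotone, so the Knaster-Tarski
theorem does not directly yield a fixed point; this is the gap that the
Reynolds and Milne scheme, and a mechanizable theory of it, must fill.]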

------------------------------

Date: Sun 7 Apr 85 13:23:27-CST
From: CL.SHANKAR@UTEXAS-20.ARPA
Subject: Seminar - The LCF Programmable Theorem Provers (UTexas)


Larry Paulson (Cambridge U.) on LCF, Tue April 9, 4-5pm, Pai 3.02

Abstract: The LCF project has produced a family of interactive,
programmable theorem-provers, particularly intended for verifying computer
hardware and software. The introduction sketches basic concepts: the
metalanguage ML, the logic PPLAMBDA, backwards proof, rewriting, and theory
construction. A historical section surveys some LCF proofs. Several
proofs involve denotational semantics, notably for compiler correctness.
Functional programs for parsing and unification have been verified.
Digital circuits have been proved correct, and some subsequently fabricated.

There is an extensive bibliography of work related to LCF. The most dynamic
issues at present are data types, subgoal techniques, logics of computation,
and the development of ML.
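
[For readers unfamiliar with the LCF design, its central idea fits in a
few lines of OCaml, itself a descendant of the LCF metalanguage ML. The
two rules below are illustrative stand-ins, not PPLAMBDA's rules:

  (* The LCF architecture in miniature: [thm] is an abstract type whose
     values can be built only by functions implementing inference rules,
     so every theorem is correct by construction.  Tactics, rewriting,
     and theory construction are ordinary ML programs layered on top. *)
  module Kernel : sig
    type form = Atom of string | Imp of form * form
    type thm                              (* abstract: cannot be forged  *)
    val concl     : thm -> form
    val axiom     : form -> thm           (* trusted axioms; sketch only *)
    val imp_intro : form -> thm -> thm    (* from |- B, infer |- A => B  *)
    val imp_elim  : thm -> thm -> thm     (* from |- A => B and |- A, infer |- B *)
  end = struct
    type form = Atom of string | Imp of form * form
    type thm = Thm of form
    let concl (Thm f) = f
    let axiom f = Thm f
    let imp_intro a (Thm b) = Thm (Imp (a, b))
    let imp_elim (Thm ab) (Thm a) =
      match ab with
      | Imp (a', b) when a' = a -> Thm b
      | _ -> failwith "imp_elim: conclusion does not match"
  end

]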

------------------------------

Date: 15 Apr 1985 0912-EST
From: Lydia Defilippo <DEFILIPPO@CMU-CS-C.ARPA>
Subject: Seminar - Automated Programming (CMU)

Speakers: Kent Petersson and Jan Smith
Dept. of Mathematics, Univ. of Goteborg, Sweden
Date: Tuesday, April 16
Time: 3:30 - 4:30
Place: 4605 Wean Hall
Topic: Program derivation in type theory

In Martin-Löf's type theory, programs can be formally derived from
specifications. When formulating a specification, predicate logic is
available through the interpretation of propositions as types. By
formulating the rules as tactics, programs are constructed top-down.
These ideas will be illustrated by a derivation of a program for the
partitioning problem.
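
[The interpretation mentioned above is the standard
propositions-as-types dictionary; in the usual notation:

  \[
  \begin{array}{lcl}
  A \land B            & \leftrightarrow & A \times B \\
  A \lor B             & \leftrightarrow & A + B \\
  A \supset B          & \leftrightarrow & A \to B \\
  \forall x{:}A.\,B(x) & \leftrightarrow & \Pi x{:}A.\,B(x) \\
  \exists x{:}A.\,B(x) & \leftrightarrow & \Sigma x{:}A.\,B(x)
  \end{array}
  \]

Under this reading a specification \forall x{:}A.\,\exists y{:}B.\,P(x,y)
is the type \Pi x{:}A.\,\Sigma y{:}B.\,P(x,y), a program meeting the
specification is exactly a term of that type, and deriving the program
top-down with tactics is constructing that term.]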

------------------------------

Date: 17 Apr 85 13:28:24 EST
From: John <Bresina@RUTGERS.ARPA>
Subject: Seminar - Mathematical Discovery and Hindsight (RU)


Machine Learning Colloquium

MATHEMATICAL DISCOVERY AND THE HINDSIGHT CRITERION

Michael Sims, Rutgers

Abstract

My dissertation research is an investigation into the nature of mathematical
discovery, and this talk will concern the portion of that work related to the
understanding of the relationship between changes in representation and a
discovery system's performance. More specifically, I would like to define new
concepts in such a way that the resulting representation language leads to
better efficiency in future discovery tasks. I will suggest a general
principle (the Hindsight Criterion) that leads to greater future efficiency in
concept-formation problems in an interestingness-driven discovery system. I
will also indicate why this is expected to be true.

The Hindsight Criterion says that it is useful to define a new concept when
doing so will make an (already known) interesting expression simpler. Although
this is a widely used criterion, my work indicates how Hindsight relates to the
complexity of the search for the original expression, as well as how it can
improve the complexity of future discoveries in an actual discovery system. I
will also discuss how Hindsight is being used to direct concept formation in
the mathematical discovery system, IL, which I am currently implementing.

DATE: Friday, April 19th
TIME: 11:00-12noon (talk); 12noon-12:30 (discussion)
PLACE: Room 423, Hill
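
[One minimal reading of the criterion, offered as an editorial sketch
in OCaml and not as the IL system's code: introduce a new concept as an
abbreviation for a subexpression, and accept the definition only when it
makes an already-known interesting expression strictly simpler.

  type expr =
    | Const of string
    | App of string * expr list        (* operator and its arguments *)

  (* Size of an expression, a crude measure of its complexity. *)
  let rec size = function
    | Const _ -> 1
    | App (_, args) -> 1 + List.fold_left (fun n e -> n + size e) 0 args

  (* Replace every occurrence of subexpression [d] in [e] by the newly
     defined concept name [c]. *)
  let rec abbreviate d c e =
    if e = d then Const c
    else match e with
      | Const _ -> e
      | App (f, args) -> App (f, List.map (abbreviate d c) args)

  (* The criterion: defining [c] := [d] pays off for the interesting
     expression [e] iff the abbreviated form is strictly smaller. *)
  let hindsight_accepts d c e = size (abbreviate d c e) < size e

]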

------------------------------

Date: 25 Apr 1985 0828-EST
From: Lydia Defilippo <DEFILIPPO@CMU-CS-C.ARPA>
Subject: Seminar - Foundations of Horn Logic Programming (CMU)


Speaker: Vijay Saraswat
Date: Monday, April 29, 1985
Time: 2:00 - 3:15
Place: Scaife Hall 324
Topic: On the foundations of (Horn) logic programming.


In this talk I present the classical (van Emden, Kowalski, Apt ...)
interpretation of Horn logic (without equality) as a programming language.
In this view (computation as deduction), definite clauses are read as
specifications of recursively defined procedures. The least fixed point of the
functional associated with these procedures (which can be captured by a form
of resolution known as SLD-resolution, i.e. non-deterministic procedure call
with parameter passing by unification) is then seen as precisely the extension
of the corresponding predicate in the initial model of the axioms.

Various operational notions of "negation" arise in this context, which can be
related to validity in certain Herbrand models, and I will
discuss them briefly. I will also discuss recent work by Nait-Abdallah and van
Emden on the semantics of a certain kind of infinitary computation using Horn
clauses.

Finally, I will touch upon some "real" "logic" programming languages such as
Prolog and Concurrent Prolog and show why the classical semantics is not
adequate. For Prolog it only provides a notion of partial correctness, and for
Concurrent Prolog it just bounds the space of correct answers to a query, while
not telling us anything about whether the query might fail or not terminate.
Answering such questions leads to the application of more conventional
techniques from semantics of programming languages to (Horn) logic programming.
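
[For reference, the least fixed point referred to above is the standard
van Emden and Kowalski construction: the immediate-consequence operator
on Herbrand interpretations I is

  \[
  T_P(I) \;=\; \{\, A \mid A \leftarrow B_1,\ldots,B_n
     \ \text{is a ground instance of a clause of}\ P
     \ \text{and}\ \{B_1,\ldots,B_n\} \subseteq I \,\}
  \]

and, since T_P is continuous, the least Herbrand model is
M_P = \mathrm{lfp}(T_P) = T_P\uparrow\omega, whose atoms are exactly
those derivable by SLD-resolution.]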

------------------------------

Date: Wed 24 Apr 85 17:10:00-PST
From: Emma Pease <Emma@SU-CSLI.ARPA>
Subject: Seminar - Property Theory and Second-Order Logic (CSLI)

[Excerpted from the CSLI Newsletter by Laws@SRI-AI.]


THURSDAY, May 2, 1985

2:15 p.m.       CSLI Seminar
Redwood Hall    ``Property Theory and Second-Order Logic''
Room G-19       Chris Menzel, CSLI


``Property Theory and Second-Order Logic''

Much recent work in semantics (e.g., Barwise and Perry, Chierchia,
Sells, Zalta) involves an extensive appeal to abstract logical
objects--properties, relations, and propositions. Such objects were
of course not unheard of in semantics prior to this work. What is
noteworthy is the extent to which the conception of these objects
differs from the prevailing conception in formal semantics, viz.,
Montague's. Two ways in which they differ (not necessarily common to
all recent accounts) stand out: first, these abstract objects are
metaphysically primitive, not set theoretic constructions out of
possible worlds and individuals; second, they are untyped--properties
can exemplify each other as well as themselves, relations can fall
within their own field, and so on.

With properties, relations, and propositions playing this more
prominent role in semantics (as well as in philosophy), it is
essential that there be a rigorous mathematical theory of these
objects. The framework for such a theory, I think, is second-order
logic; indeed, I will argue that second-order logic, rightly
understood, just IS the theory of properties, relations, and
propositions. To this end, building primarily on the work of Bealer,
Cocchiarella, and Zalta, I will present a second-order logic that is
provably consistent along with an algebraic intensional semantics
which yields significant insights into the structure of the abstract
ontology of logic and the paradoxes. Time permitting, I will apply the
logic to two issues, one in semantics and the other in the philosophy
of mathematics--specifically, to the analysis of noun phrases
involving terms of order like `fourth' and `last', and the question of
what the (ordinal) numbers are, to which I will give a logicist answer
adumbrated by Russell. --Chris Menzel
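
[A standard observation, not taken from the abstract, on why provable
consistency is worth advertising: in an untyped theory where properties
apply to themselves, unrestricted comprehension,

  \[
  \exists F\,\forall x\,\bigl( F(x) \leftrightarrow \neg\, x(x) \bigr),
  \]

reproduces Russell's paradox, since instantiating x with F gives
F(F) \leftrightarrow \neg F(F). Any property theory of the kind
described must restrict comprehension, or its semantics, enough to
block this.]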

------------------------------

End of AIList Digest
********************
