AIList Digest           Thursday, 30 Jan 1986      Volume 4 : Issue 16 

Today's Topics:
Journal Issue - Blackboard Models for AI in Engineering,
Seminars - Naive Physics: Knowledge in Pieces (UCB) &
Term Rewriting, Theorem Proving, Logic Programming (CSLI) &
The Algebra of Time Intervals (SRI) &
Machine Learning and Economics (RU) &
Semi-Applicative Programming (UPenn) &
Integrating Syntax and Semantics (Edinburgh) &
Feature Structures in Unification Grammars (UPenn)

----------------------------------------------------------------------

Date: Mon, 27 Jan 86 23:21:15 est
From: Michael Bushnell <mb@ohm.ECE.CMU.EDU>
Subject: Call for Papers - Blackboard Models for AI in Engineering


======================================================================

                      CALL FOR PAPERS for the

                     INTERNATIONAL JOURNAL FOR

               ARTIFICIAL INTELLIGENCE IN ENGINEERING
                   October, 1986 Special Issue

                          Guest Editors:
                   Pierre Haren, INRIA, France
              Mike Bushnell, Carnegie-Mellon University

              Manuscripts in US should be sent to:
                          Mike Bushnell
         Department of Electrical and Computer Engineering
                   Carnegie-Mellon University
                     Pittsburgh, PA 15213
                             USA
                  (ARPAnet: mb@ohm.ece.cmu.edu)
         Deadline for receiving manuscripts: April 1st, 1986

======================================================================


We are soliciting papers for a special issue of the International Journal
for AI in Engineering. This issue will focus on the AI Blackboard model, as
applied to engineering problems. Papers describing the application of the
Blackboard model to problems in the disciplines of Electrical Engineering,
Computer Engineering, Chemical Engineering, Civil Engineering, Mechanical
Engineering, Metallurgy, Materials Science, Robotics, and areas of Computer
Science are appropriate. Papers describing applications to other
disciplines may also be appropriate. In addition, papers discussing AI
tools that are particularly appropriate for Engineering applications are
most welcome, along with book reviews, letters to the editor, conference
reports, and other relevant news.
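For readers unfamiliar with the model, the essential idea can be sketched in a few lines of Python (all names here are hypothetical illustrations, not part of any system described in this issue): a globally visible blackboard holds the evolving solution state, and independent knowledge sources fire opportunistically whenever their trigger conditions match it.

```python
# Minimal blackboard sketch: knowledge sources watch a shared blackboard
# and post contributions whenever their trigger matches the current state.

class Blackboard:
    def __init__(self):
        self.entries = {}          # shared, globally visible state

class KnowledgeSource:
    def __init__(self, name, trigger, action):
        self.name, self.trigger, self.action = name, trigger, action

def run(blackboard, sources, max_cycles=10):
    """Control loop: fire any source whose trigger matches, until quiescent."""
    for _ in range(max_cycles):
        fired = False
        for ks in sources:
            if ks.trigger(blackboard.entries):
                ks.action(blackboard.entries)
                fired = True
        if not fired:
            break                  # no source can contribute: done
    return blackboard.entries

# Toy run: one source posts a hypothesis, a second refines it.
bb = Blackboard()
bb.entries["signal"] = 3
sources = [
    KnowledgeSource("propose",
                    lambda e: "signal" in e and "hypothesis" not in e,
                    lambda e: e.update(hypothesis=e["signal"] * 2)),
    KnowledgeSource("refine",
                    lambda e: e.get("hypothesis", 0) > 0 and "result" not in e,
                    lambda e: e.update(result=e["hypothesis"] + 1)),
]
print(run(bb, sources))   # {'signal': 3, 'hypothesis': 6, 'result': 7}
```

Real blackboard systems add a scheduler that rates competing knowledge-source activations; the loop above simply fires every eligible source each cycle.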

All submissions must be original papers written in English and will be
refereed. The copyright of published papers will be vested with the
publishers. Contributions will be classified as research papers and
research notes, of up to 5000 equivalent words, or as review articles of up
to 10,000 equivalent words. Authors wishing to prepare review articles
should contact the editors in advance. Manuscripts should be typed
double-spaced with wide margins, on one side of the paper only, and
submitted in triplicate. The article should be preceded by a summary of
not more than 200 words describing the entire paper. A list of key words is
also required. The article title should be brief and stated on a separate
page with the authors' names and addresses.

------------------------------

Date: Wed, 22 Jan 86 16:47:34 PST
From: admin%cogsci@BERKELEY.EDU (Cognitive Science Program)
Subject: Seminar - Naive Physics: Knowledge in Pieces (UCB)

BERKELEY COGNITIVE SCIENCE PROGRAM
Spring 1986
Cognitive Science Seminar - IDS 237B

Tuesday, January 28, 11:00 - 12:30
[NB. New Location] 2515 Tolman Hall
Discussion: 12:30 - 1:30 [location TBA]

``Knowledge in Pieces''
Andrea A. diSessa
Math Science and Technology, School of Education

Abstract
Naive Physics concerns expectations, descriptions and
explanations about the way the physical world works that people
seem spontaneously to develop through interaction with it. A
recent upswing in interest in this area, particularly
concerning the relation of naive physics to the learning of
school physics, has yielded significant interesting data, but
little in the way of a theoretical foundation. I would like to
provide a sketch of a developing theoretical frame together
with many examples that illustrate it.

In broad strokes, one sees a rich but rather shallow (in a
sense I will define), loosely coupled knowledge system with
elements that often originate as minimal abstractions of common
phenomena. Rather than a "change of theory" or even a shift in
content of the knowledge system, it seems that developing
understanding of classroom physics may better be described in
terms of a change in structure that includes selection and
integration of naive knowledge elements into a system that is
much less data-driven, less context-dependent, and more capable
of "reliable" (in a technical sense) descriptions and
explanations. In addition, I would like to discuss some
hypothetical changes at a systematic level that do look more
like changes of theory or belief. Finally, I would like to
consider the potential application of this work to other
domains of knowledge, and the relation to other perspectives on
the problem of knowledge.

------------------------------

Date: Wed 22 Jan 86 17:32:26-PST
From: Emma Pease <Emma@SU-CSLI.ARPA>
Subject: Seminar - Term Rewriting, Theorem Proving, Logic Programming (CSLI)

[Excerpted from the CSLI Newsletter by Laws@SRI-AI.]


CSLI ACTIVITIES FOR NEXT THURSDAY, January 30, 1986

2:15 p.m.    CSLI Seminar
             ``Term Rewriting Systems and Application to Automated
             Theorem Proving and Logic Programming''
             Helene Kirchner (Kirchner@sri-ai)
             Ventura Hall, Trailer Classroom


Term Rewriting Systems and Application to
Automated Theorem Proving and Logic Programming
Helene Kirchner

Term rewriting systems are sets of rules (i.e. directed equations)
used to compute equivalent terms in an equational theory. Term
rewriting systems are required to be terminating and confluent in
order to ensure that any computation terminates and does not depend on
the choice of applied rules. Completion of term rewriting systems
consists of building, from a set of non-directed equations, a
confluent and terminating set of rules that has the same deductive
power. After a brief description of these two notions, their
application in two different domains is illustrated:
  - automated theorem proving in equational and first-order logic,
  - construction of interpreters for logic programming languages
    mixing relational and functional features.
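As an illustration of the rewriting process described above (a minimal sketch, not Kirchner's system), the following Python fragment applies directed rules to terms until a normal form is reached; the rule set encodes Peano addition, and all function names are hypothetical:

```python
# Terms are nested tuples; rule variables are strings starting with '?'.

def match(pattern, term, env):
    """Try to extend env so that pattern, instantiated, equals term."""
    if isinstance(pattern, str) and pattern.startswith('?'):
        if pattern in env:
            return env if env[pattern] == term else None
        return {**env, pattern: term}
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and len(pattern) == len(term):
        for p, t in zip(pattern, term):
            env = match(p, t, env)
            if env is None:
                return None
        return env
    return env if pattern == term else None

def substitute(term, env):
    if isinstance(term, str) and term.startswith('?'):
        return env[term]
    if isinstance(term, tuple):
        return tuple(substitute(t, env) for t in term)
    return term

def rewrite(term, rules):
    """Apply rules innermost-first until no rule matches (normal form).
    Halts only for terminating rule sets, as the abstract notes."""
    if isinstance(term, tuple):
        term = tuple(rewrite(t, rules) for t in term)
    for lhs, rhs in rules:
        env = match(lhs, term, {})
        if env is not None:
            return rewrite(substitute(rhs, env), rules)
    return term

# Peano addition: add(0, y) -> y;  add(s(x), y) -> s(add(x, y))
rules = [(('add', '0', '?y'), '?y'),
         (('add', ('s', '?x'), '?y'), ('s', ('add', '?x', '?y')))]
two, one = ('s', ('s', '0')), ('s', '0')
print(rewrite(('add', two, one), rules))   # ('s', ('s', ('s', '0')))
```

Because this rule set is terminating and confluent, the normal form (here 2 + 1 = 3 in successor notation) is independent of the order in which rules are tried.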

------------------------------

Date: Thu 23 Jan 86 11:50:10-PST
From: LANSKY@SRI-AI.ARPA
Subject: Seminar - The Algebra of Time Intervals (SRI)

THE ALGEBRA OF TIME INTERVALS

Peter Ladkin (LADKIN@KESTREL)
Kestrel Institute

11:00 AM, MONDAY, January 27
SRI International, Building E, Room EJ228 (new conference room)

We build on work of James Allen (Maintaining Knowledge about Temporal
Intervals, CACM Nov 1983), who suggested a calculus of time intervals.
Allen's intervals are all convex (no gaps). We shall present a
taxonomy of *natural* relations between non-convex [i.e.,
non-contiguous] intervals, and illustrate the expressiveness of this
subclass, with examples from the domain of project management. In
collaboration with Roger Maddux, we have new mathematical results
concerning both Allen's calculus, and our own. We shall present as
many of these as time permits.

The talk represents work in progress. We are currently designing and
implementing a time expert for the Refine system at Kestrel Institute,
which will include the interval calculus.
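For reference, the thirteen basic relations of Allen's calculus between convex intervals can be computed directly from endpoints. This sketch (hypothetical names; convex case only, so the talk's non-convex taxonomy is not represented) shows that exactly one relation holds between any two proper intervals:

```python
def allen_relation(a, b):
    """Return the unique Allen relation between intervals
    a = (a1, a2) and b = (b1, b2), assuming a1 < a2 and b1 < b2."""
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:  return 'before'
    if b2 < a1:  return 'after'
    if a2 == b1: return 'meets'
    if b2 == a1: return 'met-by'
    if a1 == b1 and a2 == b2: return 'equals'
    if a1 == b1: return 'starts' if a2 < b2 else 'started-by'
    if a2 == b2: return 'finishes' if a1 > b1 else 'finished-by'
    if b1 < a1 and a2 < b2: return 'during'
    if a1 < b1 and b2 < a2: return 'contains'
    return 'overlaps' if a1 < b1 else 'overlapped-by'

print(allen_relation((0, 2), (2, 5)))   # meets
print(allen_relation((1, 4), (2, 6)))   # overlaps
print(allen_relation((2, 3), (1, 5)))   # during
```

Allen's calculus proper reasons over disjunctions of these relations without numeric endpoints; the endpoint computation above merely enumerates the basic relations the talk generalizes.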

------------------------------

Date: 22 Jan 86 09:18:06 EST
From: Tom <mitchell@RED.RUTGERS.EDU>
Subject: Seminar - Machine Learning and Economics (RU)

[Forwarded from the Rutgers bboard by Laws@SRI-AI.]


ML Colloquium talk

Title: Market Traders: Intelligent Distributed Systems
In an Open World
Speaker: Prof. Spencer Star
Laval University, Quebec
Date: Friday, Jan 24
Time: 11 am
Location: Hill 423

Professor Spencer Star is a computer scientist/economist who
works on simulating economic markets. He will be spending the coming
year on sabbatical at Rutgers to work on incorporating a machine
learning component into his current market simulations. He is
visiting now in order to meet the department and to get some feedback
on his current research ideas on learning. Below is part of an
abstract from his recent paper. [...]

-Tom Mitchell


Market Traders: Intelligent Distributed Systems In an Open World

Although markets are at the heart of modern microeconomics, there has
been relatively little attention paid to disequilibrium states and to
the decision-making rules used by traders within markets. I am
interested in the procedures that traders use to determine when and
how much they will bid, how they adapt their behaviour to a changing
market environment, and the effects of their adaptive behaviour on the
market's disequilibrium path. This paper reports on research to study
these questions with the aid of a computer program that represents a
market with interacting and independent knowledge-based traders. The
program is called TRADER.

In a series of experiments with TRADER I find that market efficiency
requires a minimum number of intelligent traders with a capacity to
learn, but when their knowledge is reflected in the market bids and
asks, naive traders can enter the markets and sometimes do better than
the expert traders. Moreover, the entrance of naive traders in a
market that is already functioning efficiently does not degrade the
market's performance. Since learning by independent agents appears to
be a key element in understanding and using open systems, the focus of
future research will be on studying learning and adaptive processes by
intelligent agents in open systems.
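The abstract does not describe TRADER's internal rules. As a rough illustration of the kind of mechanism such market simulations rest on, here is a minimal double-auction clearing sketch (all names hypothetical, not Star's program):

```python
# Minimal double-auction sketch: a trade executes whenever the best
# remaining bid meets or exceeds the best remaining ask, at the midpoint.

def clear_market(bids, asks):
    """bids/asks are lists of (trader, price); returns executed trades
    as (buyer, seller, price) triples."""
    bids = sorted(bids, key=lambda x: -x[1])   # highest bid first
    asks = sorted(asks, key=lambda x: x[1])    # lowest ask first
    trades = []
    while bids and asks and bids[0][1] >= asks[0][1]:
        (buyer, bid), (seller, ask) = bids.pop(0), asks.pop(0)
        trades.append((buyer, seller, (bid + ask) / 2))
    return trades

trades = clear_market(bids=[('b1', 10), ('b2', 7)],
                      asks=[('s1', 6), ('s2', 9)])
print(trades)   # [('b1', 's1', 8.0)]
```

In an adaptive setting of the sort Star studies, traders would revise the prices they submit from one clearing round to the next in light of past outcomes; the fixed prices above show only a single round.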

------------------------------

Date: Tue, 28 Jan 86 15:41 EST
From: Tim Finin <Tim%upenn.csnet@CSNET-RELAY.ARPA>
Subject: Seminar - Semi-Applicative Programming (UPenn)


SEMI-APPLICATIVE PROGRAMMING: AN EXAMPLE
N. S. Sridharan
BBN Labs, AI Department, Cambridge MA

3pm Thursday, January 30, 1986
216 Moore, University of Pennsylvania

Most current parallel programming languages are designed with a sequential
programming language as the base language and have added constructs that allow
parallel execution. We are experimenting with an applicative base language
that has implicit parallelism everywhere, and then we introduce constructs that
inhibit parallelism. The base language uses pure LISP as a foundation and
blends in interesting features of Prolog and FP. Proper utilization of
available machine resources is a crucial concern in functional programming. We
advocate several techniques of controlling the behavior of functional programs
without changing their meaning or functionality: program annotation with
constructs that have benign side-effects, program transformation and adaptive
scheduling. This combination yields us a semi-applicative programming language
and an interesting programming methodology.

In this talk we give some background information on our project, its aims and
scope and report on work in progress in the area of parallel algorithms for
context-free parsing.

Starting with the specification of a context-free recognizer, we have been
successful in deriving variants of the recognition algorithm of
Cocke-Kasami-Younger. One version is the CKY algorithm in parallel. The
second version includes a top-down predictor to limit the work done by the
bottom-up recognizer. The third version uses a cost measure over derivations
and produces minimal cost parses using a dynamic programming technique. In
another line of development, we arrive at a parallel version of the Earley
algorithm.
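For readers unfamiliar with the base algorithm, a sequential Cocke-Kasami-Younger recognizer for grammars in Chomsky normal form can be sketched as follows (an illustrative sketch, not the parallel derivations described above; names and the toy grammar are hypothetical):

```python
from itertools import product

def cky_recognize(words, lexical, binary, start='S'):
    """lexical: {word: {nonterminals}}; binary: {(B, C): {A}} for rules A -> B C."""
    n = len(words)
    # table[i][j] = set of nonterminals deriving words[i..j] inclusive
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(lexical.get(w, ()))
    for span in range(2, n + 1):              # increasing span length
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):             # split point
                for B, C in product(table[i][k], table[k + 1][j]):
                    table[i][j] |= binary.get((B, C), set())
    return start in table[0][n - 1]

lexical = {'she': {'NP'}, 'eats': {'V'}, 'fish': {'NP'}}
binary = {('V', 'NP'): {'VP'}, ('NP', 'VP'): {'S'}}
print(cky_recognize(['she', 'eats', 'fish'], lexical, binary))   # True
```

The cells of a given span length are mutually independent, which is what makes the algorithm a natural candidate for the parallel formulations the talk derives.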

------------------------------

Date: Wed, 29 Jan 86 10:19:18 GMT
From: Gideon Sahar <gideon%edai.edinburgh.ac.uk@cs.ucl.ac.uk>
Subject: Seminar - Integrating Syntax and Semantics (Edinburgh)

EDINBURGH AI SEMINARS

Date: 29th January 1986
Time: 2.00 p.m.
Place: Department of Artificial Intelligence
Seminar Room - F10
80 South Bridge
EDINBURGH.


Dr. Ewan Klein, Centre for Cognitive Studies, University of Edinburgh
will give a seminar entitled "Integrating syntax and semantics:
unification categorial grammar as a tool for natural language
processing".

This talk will report on work carried out at the Centre for Cognitive
Science by Henk Zeevat, Jo Calder and Ewan Klein as part of an ESPRIT
project on natural language and graphics interfaces to a knowledge base.

In recent years there has been a surge of interest in syntactic
parsers which exploit linguistically-motivated non-transformational
grammar formalisms: instances are the GPSG chart parser at
Hewlett-Packard, Palo Alto, and the PATR-II parser at SRI, Menlo Park.
By contrast, progress in the development of tractable, truth-conditional
semantic formalisms for parsing has lagged behind.

Unification categorial grammar (UCG) employs three resources which
significantly improve this situation. The first is Kamp's theory of
Discourse Representation: this is essentially a first-order calculus
which nevertheless provides a more elegant treatment of NL anaphora and
quantification than standard first-order logic.

Second, the grammar encodes both syntactic and semantic information in
the same data structures, namely directed acyclic graphs, and
manipulates them with the same operation, namely unification. Third, the
fundamental grammar rule is that of categorial grammar, namely
functional application. Since the grammar objects contain both
syntactic and semantic information, any rule application will
simultaneously produce syntactic and semantic results.

UCG translates readily into a PATR-like declarative formalism, for
which Calder has written a Prolog implementation called PIMPLE.
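The unification operation at the heart of the formalism can be illustrated with a simplified sketch that uses nested dictionaries in place of UCG's directed acyclic graphs (no structure sharing or reentrancy; all names hypothetical):

```python
# Feature-structure unification over nested dicts: combine two partial
# descriptions into their most specific common extension, failing when
# atomic values clash.

class UnificationFailure(Exception):
    pass

def unify(f, g):
    """Return the least structure subsumed by both f and g."""
    if isinstance(f, dict) and isinstance(g, dict):
        result = dict(f)
        for feat, val in g.items():
            result[feat] = unify(result[feat], val) if feat in result else val
        return result
    if f == g:
        return f
    raise UnificationFailure(f"{f!r} does not unify with {g!r}")

# Combining syntactic and semantic constraints in one step:
np = {'cat': 'np', 'agr': {'num': 'sg'}}
subj = {'agr': {'num': 'sg', 'per': 3}, 'sem': {'index': 'x1'}}
print(unify(np, subj))
# {'cat': 'np', 'agr': {'num': 'sg', 'per': 3}, 'sem': {'index': 'x1'}}
```

Because both syntax and semantics live in the same structures, a single call like the one above is what lets a UCG-style rule application produce syntactic and semantic results simultaneously.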

------------------------------

Date: Tue, 28 Jan 86 15:41 EST
From: Tim Finin <Tim%upenn.csnet@CSNET-RELAY.ARPA>
Subject: Seminar - Feature Structures in Unification Grammars (UPenn)


LOGICAL SPECIFICATIONS FOR FEATURE
STRUCTURES IN UNIFICATION GRAMMARS

William C. Rounds and Robert Kasper, University of Michigan

3pm Tuesday, February 4, 1986
216 Moore, University of Pennsylvania

In this paper we show how to use a simple modal logic to give a complete
axiomatization of disjunctively specified feature or record structures commonly
used in unification-based grammar formalisms in computational linguistics. The
logic was originally developed to explain the semantics of concurrency,
so this is a radically different application. We prove a normal
form result based on the idea of Nerode equivalence from finite automata
theory, and we show that the satisfiability problem for our logical formulas is
NP-complete. This last result is a little surprising since our formulas do not
contain negation. Finally, we show how the unification problem for
term-rewriting systems can be expressed as the satisfiability problem for our
formulas.

------------------------------

End of AIList Digest
********************
