AIList Digest            Tuesday, 9 Oct 1984      Volume 2 : Issue 134 

Today's Topics:
Seminars - AI Control Design & Fault Diagnosis & Composite Graph Theory,
Lectures - Logic and AI,
Program - Complexity Year at MSRI
----------------------------------------------------------------------

Date: Mon 8 Oct 84 09:31:31-PDT
From: Ken Laws <Laws@SRI-AI.ARPA>
Subject: Seminar - AI Control Design

From the IEEE Grid newsletter for the SF Bay Area:

Some very exciting new ideas on the role of expert systems in
control design will be presented at the Oct. 25 meeting of the Santa
Clara Valley Control Systems, Man and Cybernetics Society.

The talk, by Dr. Thomas Trankle and Lawrence Markosian of Systems
Control Technology, will report work in progress to develop an AI
system that implements a linear feedback control designer's expert
knowledge. This AI system is a planning expert system written in LISP,
and has knowledge of linear control design rules and an interface
with a control CAD package.

The LISP code represents the design rules as operators that have
goals, preconditions, and side effects. Higher-level operators
or "scripts" represent expert design procedures. The control
design process takes the form of a recursive goal-directed search,
aided by the expert designer's heuristics.
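The operator representation described above can be sketched as follows. This is a Python illustration rather than the original LISP, and all rule and goal names are hypothetical stand-ins for real control-design knowledge:

```python
# Sketch of design rules as operators with goals, preconditions, and
# effects, applied by a recursive goal-directed search. Names are
# illustrative only, not from the actual Systems Control Technology system.

class Operator:
    def __init__(self, name, goal, preconditions, effects):
        self.name = name
        self.goal = goal                    # fact this operator achieves
        self.preconditions = preconditions  # facts required before applying
        self.effects = effects              # side effects added to the state

def achieve(goal, state, operators, plan):
    """Recursively satisfy a goal: find an operator whose goal matches,
    achieve its preconditions first, then apply it and record the step."""
    if goal in state:
        return True
    for op in operators:
        if op.goal == goal:
            if all(achieve(p, state, operators, plan) for p in op.preconditions):
                state.add(op.goal)
                state.update(op.effects)
                plan.append(op.name)
                return True
    return False

ops = [
    Operator("choose-gains", goal="gains-selected",
             preconditions=["model-linearized"], effects=[]),
    Operator("linearize", goal="model-linearized",
             preconditions=[], effects=["operating-point-fixed"]),
]
state, plan = set(), []
achieve("gains-selected", state, ops, plan)
print(plan)  # ['linearize', 'choose-gains']
```

The search satisfies preconditions depth-first, so the emitted plan lists prerequisite steps before the operators that need them.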

Cocktails at 6:30 pm, dinner ($11) at 7:00, presentation at 8:00.
Rick's Swiss Chalet, 4085 El Camino, Palo Alto
Reservations by Oct. 24, Council Office, (415) 327-6622.

------------------------------

Date: Mon 8 Oct 84 09:48:09-PDT
From: Paula Edmisten <Edmisten@SUMEX-AIM.ARPA>
Subject: Seminar - Reasoning About Fault Diagnosis with LES

[Forwarded from the Stanford SIGLUNCH distribution by Laws@SRI-AI.]

DATE: Friday, October 12, 1984
LOCATION: Chemistry Gazebo, between Physical and Organic Chemistry
TIME: 12:05

SPEAKER: Walter Perkins
Lockheed Palo Alto Research & Development

ABSTRACT: Reasoning About Fault Diagnosis with LES

The Lockheed Expert System (LES) is a generic framework for helping
knowledge engineers solve problems in diagnosing, monitoring,
designing, checking, guiding, and interpreting. Many of the ideas of
EMYCIN were incorporated into its design, but it was given a more
flexible control structure. In its first "real" application, LES
was used to guide less-experienced maintenance personnel in the fault
diagnosis of a large electronic signal-switching network. LES used
not only the knowledge of the expert diagnostician (captured in the
familiar form of "IF-THEN" rules), but also knowledge about the
structure and function of the device under study to perform rapid
isolation of the module causing the failure. In this talk we show how
the topological structure of the device is modeled in a frame
structure and the troubleshooting rules of the expert are conveniently
represented using LES's case grammar format. We also explain how
"demons" are used to setup an agenda of relevant goals and subgoals.
The system was fielded in November 1983, and is being used by Lockheed
technicians. A preliminary evaluation of the system will also be
discussed. LES is being applied in a number of other domains which
include design verification, satellite communication,
photo-interpretation, and hazard analysis.
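The two knowledge sources the abstract describes, a frame-style model of device topology plus expert IF-THEN rules, can be illustrated with a minimal Python sketch. This is not Lockheed's implementation; the module names, symptoms, and rules are all hypothetical:

```python
# Frame-style topology: each module records which modules it feeds.
topology = {
    "input-stage":   {"feeds": ["switch-matrix"]},
    "switch-matrix": {"feeds": ["output-stage"]},
    "output-stage":  {"feeds": []},
}

# Expert IF-THEN rules: a condition on observed symptoms, and the module
# suspected when the condition holds.
rules = [
    (lambda s: "no-output" in s and "input-ok" in s, "switch-matrix"),
    (lambda s: "no-output" in s and "input-ok" not in s, "input-stage"),
]

def diagnose(symptoms):
    """Combine rule conclusions with structural knowledge: suspects named
    by the rules come first, followed (demon/agenda style) by the modules
    structurally downstream of them in the frame model."""
    suspects = [m for cond, m in rules if cond(symptoms)]
    agenda = list(suspects)
    for m in suspects:
        agenda.extend(topology[m]["feeds"])
    return agenda

print(diagnose({"no-output", "input-ok"}))  # ['switch-matrix', 'output-stage']
```

The point of combining the two sources, as in LES, is that structural knowledge narrows the search even where no expert rule applies directly.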

Paula

------------------------------

Date: Sat 6 Oct 84 15:26:34-PDT
From: Andrei Broder <Broder@SU-SCORE.ARPA>
Subject: Seminar - Composite Graph Theory

[Forwarded from the Stanford bboard by Laws@SRI-AI.]

AFLB talk
10/11/84 - Joan Feigenbaum (Stanford):

Recognizing composite graphs is equivalent to
testing graph isomorphism

In this talk I will explore graph composition from a
complexity-theoretic point of view. Given two graphs G1 and G2, we construct the
composition G = G1[G2] as follows: For each node in G2, insert a copy
of G1. If two copies correspond to nodes that are adjacent in G2,
then draw in all possible edges x -- y such that x is in one copy and
y is in the other. A graph that can be expressed as the composition
of two smaller graphs is called composite and one that cannot is
called irreducible.
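The construction just defined translates directly into code. A small Python sketch (my own, representing a graph as a vertex count plus a set of edge pairs):

```python
# Composition G1[G2] as defined in the abstract: one copy of G1 per node
# of G2, plus all cross edges between copies whose indices are adjacent
# in G2. A graph is a pair (num_vertices, set_of_edges).

def compose(g1, g2):
    n1, e1 = g1
    n2, e2 = g2
    edges = set()
    for v in range(n2):                 # a copy of G1 for each node of G2
        for (a, b) in e1:
            edges.add((v * n1 + a, v * n1 + b))
    for (u, v) in e2:                   # all edges between adjacent copies
        for a in range(n1):
            for b in range(n1):
                edges.add((u * n1 + a, v * n1 + b))
    return (n1 * n2, edges)

# Composing an edge (K2) with itself yields K4: 4 vertices, all 6 pairs.
n, e = compose((2, {(0, 1)}), (2, {(0, 1)}))
print(n, len(e))  # 4 6
```

The K2-with-K2 example already shows why recognition is nontrivial: the composite structure is not visible from local inspection of the result.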

Composite graphs have a great deal of structure and their abstract
mathematical properties have been studied extensively. In particular,
Harary and Sabidussi have characterized the relationships between the
automorphism groups of G1 and G2 and the automorphism group of their
composition. Graph composition has been used by Garey and Johnson and
Chv\'atal to study NP-complete problems. Garey and Johnson used it to
derive upper bounds on the accuracy of approximation algorithms for
graph coloring. Chv\'atal showed that the Hamiltonian circuit problem
remains NP-complete even if the input graph is known to be composite.
In this talk, I consider what seems to be a more basic question about
composite graphs; namely, how difficult are they to recognize?

The main result I will give is that testing whether a graph is
composite is equivalent to testing whether two graphs are isomorphic.
In the proof that recognizing composite graphs is no harder than
testing graph isomorphism, I will give an algorithm that either
declares a graph irreducible or finds a non-trivial decomposition.
This distinguishes graph decomposition from integer factorization,
where primality testing and factoring are not known to have the same
complexity. The inherent difficulty of the recognition problem for
composite graphs gives some insight into why some difficult graph
theoretic problems, such as Hamiltonian circuit, are no easier even if
the inputs are known to be composite. Furthermore, assuming P does
not equal NP, graph isomorphism is one of the most important problems
for which neither a polynomial time algorithm nor a proof that there
cannot be such an algorithm is known. Perhaps examining a problem
that is equivalent to it will yield insight into the complexity of the
graph isomorphism problem itself. For example, if all irreducible
graphs have succinct certificates, then graph isomorphism is in Co-NP.

If there is time, I will also show that for cartesian multiplication,
another way to construct product graphs, the recognition problem is in
P. This talk presents joint work with Alex Schaffer.

***** Time and place: October 11, 12:30 pm in MJ352 (Bldg. 460) ****

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Regular AFLB meetings are on Thursdays, at 12:30pm, in MJ352 (Bldg.
460).
- Andrei Broder

------------------------------

Date: Mon, 8 Oct 84 15:00:32 edt
From: minker@maryland (Jack Minker)
Subject: Lectures - Logic and AI at Maryland, Oct. 22-26


FINAL ANNOUNCEMENT
WEEK
of
LOGIC and its ROLE in ARTIFICIAL INTELLIGENCE
at
THE UNIVERSITY OF MARYLAND
OCTOBER 22-26, 1984


The Mathematics and Computer Science Departments at the
University of Maryland at College Park are jointly sponsoring
a Special Year in Mathematical Logic and Theoretical
Computer Science. The week of October 22-26 will be devoted
to Logic and its role in Artificial Intelligence. The
titles and abstracts of the five distinguished lectures that
are to be presented are as follows:


Monday, October 22:

RAYMOND REITER (University of British Columbia)

LOGIC FOR SPECIFICATION: DATABASES, CONCEPTUAL MODELS
AND KNOWLEDGE REPRESENTATION LANGUAGES.


AI systems and databases have a feature in common: they require
representations for various aspects of the real world. These
representations are meant to be queried and, in response to new
information about the world, modified in suitable ways. Typically,
these query and modification processes require reasoning using the
underlying representation of the world as premises. So, it appears
natural to use a suitable logical language for representing the
relevant features of the world, and proof theory for the reasoning.
This is not the normal practice in databases and AI. The
representations used assume a variety of forms, usually bearing
little or no resemblance to logic. In AI, examples of such
representation systems include semantic networks, expert systems,
and many different knowledge representation languages such as KRL,
KL-ONE, and FRL. In databases, example representation systems are
the relational data model and various conceptual or semantic models
like TAXIS and the entity-relationship model. The point of these
representation systems is that they provide their users with
computationally efficient ways of representing and using knowledge
about an application domain. The natural role of logic in databases
and AI is as a language for specifying representation systems. On
this view, one must distinguish between the abstract specification,
using logic, of the knowledge content of a database or AI
application, and its realization as a representation system. This
distinction has pleasant consequences:

1. The logical specification provides a rigorous
semantics for the representation system realizing the
specification.

2. One can prove the correctness of representation
systems with respect to their logical semantics.

3. By taking seriously the problem of logically
specifying an application, one discovers some rich and
fascinating epistemological issues, e.g. the centrality
of non-monotonic reasoning for representation systems.


Tuesday, October 23:

JOHN McCARTHY (Stanford University)

MATHEMATICS OF CIRCUMSCRIPTION


Circumscription (McCarthy 1980, 1984) is a method of non-monotonic
reasoning proposed for use in artificial intelligence. Let A(P) be
a sentence expressing the facts "being taken into account", where P
stands for a "vector" of predicates regarded as variable. Let
E(P,x) be a wff depending on a variable x and the Ps. The
circumscription of E(P,x) is a second-order formula in P expressing
the fact that P minimizes lambda x.E(P,x) subject to the facts
A(P). The non-monotonicity arises because augmenting A(P)
sometimes reduces the conclusions that can be drawn.
Circumscription raises mathematical problems similar to those that
arise in analysis in that it involves minimization of a functional
subject to constraints. However, its logical setting doesn't seem
to permit direct use of techniques from analysis. Here are some
open questions that will be treated in the lecture.

1. What is the relation between minimal models and the
theory generated by the circumscription formula?

2. When do minimal models exist?

3. The circumscription formula is second-order. When
is it equivalent to a first-order formula?

4. There are several variants of circumscription
including successive circumscriptions and prioritized
circumscription. What are the relations among these
variants?
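For reference, the second-order formula the questions refer to can be written out. This is my reconstruction following the schema of McCarthy (1980), with P' a predicate "vector" variable ranging alongside P:

```latex
% Circumscription of E(P,x) relative to the facts A(P): P satisfies A,
% and no P' satisfying A makes lambda x. E(P',x) strictly smaller.
A(P) \;\land\; \forall P'\,\bigl[\, A(P') \land \forall x\,\bigl(E(P',x) \rightarrow E(P,x)\bigr)
      \;\rightarrow\; \forall x\,\bigl(E(P,x) \rightarrow E(P',x)\bigr) \,\bigr]
```

Question 3 above asks when this genuinely second-order sentence collapses to a first-order one.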

References:

McCarthy, John (1980):
"Circumscription - A Form of Non-Monotonic Reasoning",
Artificial Intelligence, Volume 13, Numbers 1,2,
April.

McCarthy, John (1984):
"Applications of Circumscription to Formalizing Common
Sense Knowledge". This paper is being given at the
1984 AAAI conference on non-monotonic reasoning and is
being submitted for publication to Artificial Intelli-
gence.


Wednesday, October 24:

MAARTEN VAN EMDEN (University of Waterloo)

STRICT AND LAX INTERPRETATIONS OF RULES IN LOGIC PROGRAMMING


Under the strict interpretation, only what is explicitly
allowed by a rule is admitted. Under the lax interpretation,
only what is explicitly disallowed is excluded. This
distinction is important in mathematics and in law, for
example. Logic programs also are susceptible to both
interpretations. We discuss the use of fixpoint techniques
to determine Herbrand models of logic programs. We find
that least fixpoints and least models correspond to the
strict interpretation and characterize successful finite
computations of logic programs. Greatest fixpoints and
greatest models correspond to the lax interpretation and are
closely related to negations inferred by finite failure and
to terms constructed by certain infinite computations.
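The least/greatest fixpoint contrast can be made concrete with a minimal propositional sketch in Python (my own illustration; the program and atom names are invented). The immediate-consequence operator T_P adds every rule head whose body is already satisfied:

```python
def t_p(program, interp):
    """One application of T_P: the set of heads whose bodies hold in interp.
    A program is a list of (head, body) pairs over propositional atoms."""
    return {head for head, body in program if all(b in interp for b in body)}

def least_fixpoint(program):
    """Iterate T_P upward from the empty interpretation; the least fixpoint
    is the least Herbrand model -- the strict reading."""
    interp = set()
    while True:
        nxt = t_p(program, interp) | interp
        if nxt == interp:
            return interp
        interp = nxt

def greatest_fixpoint(program, atoms):
    """Iterate T_P downward from all atoms -- the lax reading. Atoms that
    survive are exactly those not inferred false by finite failure (for
    finite propositional programs this iteration reaches the gfp)."""
    interp = set(atoms)
    while True:
        nxt = t_p(program, interp)
        if nxt == interp:
            return interp
        interp = nxt

# p :- q.   q.   r :- r.   (r is supported only by itself)
prog = [("p", ["q"]), ("q", []), ("r", ["r"])]
print(sorted(least_fixpoint(prog)))                      # ['p', 'q']
print(sorted(greatest_fixpoint(prog, ["p", "q", "r"])))  # ['p', 'q', 'r']
```

The atom r separates the two readings: it is absent from the least model (no finite computation succeeds for it) yet present in the greatest one, mirroring the infinite computation r :- r :- r :- ...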


Thursday, October 25:

JON BARWISE (Stanford University)

CONSTRAINT LOGIC.


Constraint Logic is based on a semantics that grew out
of situation semantics, but on a syntax similar to that
of first-order logic. The semantics is not carried out
in set theory, as is usual in logic, but in a richer
theory I call Situation Theory, a theory about things
like situations, roles, conditions, types and
constraints. While the syntax is not so unusual
looking, the connection between the syntax and
semantics is much more dynamic than in traditional
logic, since the interpretation assigned to a given
*use* of some expression will depend on context, in
particular, on the history of the "session". For
example, variables are interpreted as denoting roles,
but different uses of a given variable x may denote
increasingly constrained roles as a session proceeds.
This is one feature that makes Constraint Logic
interesting with regard to AI in general and with
regard to non-monotonic logic in particular.


Friday, October 26:

LAWRENCE HENSCHEN (Northwestern University)

COMPILING CONSTRAINT-CHECKING PROGRAMS IN DEDUCTIVE DATABASES.


There are at least two kinds of formulas in the
intensional database which should always be satisfied
by the interpretations corresponding to the various
states of the database -- definitions and integrity
constraints. In our approach, formulas defining new
relations are used in response to queries to compute
portions of those defined relations; such formulas are
therefore automatically satisfied by the underlying
database state. On the other hand, integrity
constraints may need to be checked each time the
database changes. Of course, we believe there are
significant advantages in being able to express
integrity constraints in a non-procedural way, such as
with first-order logic. However, reevaluating an
entire first-order statement would be wasteful, as
normally only a small portion of the database needs to
be checked. We present (resolution-based) techniques
for developing from first-order statements efficient
tests for classes of updates. These tests can be
developed at database creation time, hence are
compiled, and can be applied before a proposed update
is made, so that failure does not require backing out.
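The compilation idea can be illustrated with a hypothetical Python sketch (not Henschen's actual resolution-based method). Take the first-order constraint "every works_in(x, d) tuple has a matching department(d)"; the test specialized to the update class "insert into works_in" inspects only the new tuple:

```python
# Toy relational state: relations as sets of tuples. Names are invented.
db = {"department": {("sales",)}, "works_in": set()}

def full_check(db):
    """Naive re-evaluation of the whole constraint: scans every tuple."""
    return all((d,) in db["department"] for (_, d) in db["works_in"])

def insert_works_in(db, emp, dept):
    """Compiled test for one update class: only the inserted tuple can
    violate the constraint, and it is checked BEFORE the update is
    applied, so a failure never requires backing the change out."""
    if (dept,) not in db["department"]:
        return False                    # update rejected
    db["works_in"].add((emp, dept))
    return True

print(insert_works_in(db, "alice", "sales"))   # True: applied
print(insert_works_in(db, "bob", "widgets"))   # False: rejected up front
print(full_check(db))                          # True: state still consistent
```

The compiled test costs one lookup per insert, versus a scan of the whole relation for the naive check, which is the efficiency gain the abstract claims.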


Lectures will be given at:

MWF 11:00 AM - 12:30 PM
TTH 10:00 AM - 11:30 AM

Location: Mathematics Building, 3rd Floor Room Y3206

The lectures are open to the public. If you plan to attend,
kindly notify us so that we can make appropriate plans for
space. We regret that all funds available to support junior
faculty and graduate students have been allocated. For
additional information contact:

Jack Minker
Department of Computer Science
University of Maryland
College Park, MD 20742
(301) 454-6119
minker@maryland

------------------------------

Date: Mon, 8 Oct 84 15:24:48 pdt
From: ann%ucbernie@Berkeley
Subject: Program - Complexity Year at MSRI

[Forwarded from the Univ. of Wisconsin by Udi@WISC-RSCH.]
[Forwarded from the SRI bboard by Laws@SRI-AI.]


COMPLEXITY YEAR AT
MATHEMATICAL SCIENCES RESEARCH INSTITUTE


A year-long research program in computational complexity will take
place at the Mathematical Sciences Research Institute, Berkeley, California,
beginning in August, 1985. Applications are solicited for memberships in
the Institute during this period. The Institute will award eight or more
postdoctoral fellowships to new and recent Ph.D.'s who intend to participate
in this program. These fellowships are generally for the entire year, but
half-year awards are also possible. It is hoped and expected that members
at the more senior level will come with partial or full support from
sabbatical leaves and other sources. Memberships for any period are possible,
although, for visits of less than three months, Institute support is limited
to awards to help offset living expenses.


The Program Committee for the complexity year consists of Richard Karp
and Stephen Smale (co-chairmen) and Ronald Graham. The program will emphasize
concrete computational problems of importance either within mathematics and
computer science or in the application of these disciplines to operations
research, numerical computation, economics and other fields. Attention will
be given both to the design and analysis of efficient algorithms and to the
inherent computational complexity of problems. Week-long workshops are planned
on topics such as complexity theory and operations research, complexity theory
and numerical analysis, algebraic and number-theoretic computation, and
parallel and distributed computation. Programs in Mathematical Economics
and in Geometric Function Theory will take place concurrently with the
Computational Complexity program.


Address inquiries and applications to:

Calvin C. Moore, Deputy Director
Mathematical Sciences Research Institute
2223 Fulton St., Room 603
Berkeley, California 94720

Applicants' files should be completed by January 1, 1985.

The Institute is committed to the principles of Equal Opportunity and
Affirmative Action.

------------------------------

End of AIList Digest
********************
