AIList Digest Volume 1 Issue 030

AIList Digest            Tuesday, 2 Aug 1983       Volume 1 : Issue 30 

Today's Topics:
Automatic Translation - Lisp to Lisp,
Language Understanding - EPISTLE System,
Programming Aids - High-Level Debuggers,
Databases - Request for Geographic Descriptors,
Seminars - Chess & Evidential Reasoning
----------------------------------------------------------------------

Date: Fri 29 Jul 83 15:53:59-PDT
From: Michael Walker <WALKER@SUMEX-AIM.ARPA>
Subject: Lisp Translators

[...]

There has been some discussion of Lisp translation programs over
the last couple of days. Another one to add to the list is the one
developed by Gord Novak at Sumex for translating Interlisp into Franz,
Maclisp, UCILisp, and Portable Standard Lisp. I suspect Gord would
have a pretty good idea about what else is available, as this seems to
be an area of interest of his.

Mike Walker

[Another resource might be the set of macros that Rodney Brooks
developed to run his Maclisp ACRONYM system under Franz Lisp.
The Image Understanding Testbed at SRI uses this package.
-- KIL]
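
As a rough illustration of what such translators do, the sketch
below rewrites s-expressions by table lookup. The dialect table is
hypothetical, and real systems (Novak's included) must also handle
macros, argument-order differences, and incompatible semantics.

```python
# Sketch of the name-mapping core of a Lisp-to-Lisp translator.
# The table is illustrative, not any real system's actual mapping.

INTERLISP_TO_FRANZ = {
    "ADD1": "1+",
    "SUB1": "1-",
    "GETPROP": "get",
}

def translate(form):
    """Recursively rewrite an s-expression (nested lists of strings)."""
    if isinstance(form, list):
        return [translate(f) for f in form]
    return INTERLISP_TO_FRANZ.get(form, form)

print(translate(["ADD1", ["SUB1", "N"]]))   # ['1+', ['1-', 'N']]
```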

------------------------------

Date: 30 Jul 1983 07:10-PDT
From: the tty of Geoffrey S. Goodfellow
Reply-to: Geoff@SRI-CSL
Subject: IBM Epistle.


TECHNOLOGY MEMO
By Dan Rosenheim
(c) 1983 Chicago Sun-Times (Independent Press Service)
IBM is experimenting with an artificial intelligence program that
may lead to machine recognition of social class, according to a
research report from International Resource Development.
According to the market research firm, the IBM program can
evaluate the style of a letter, document or memo and can criticize the
writing style, syntax and construction.
The program is called EPISTLE (Evaluation, Preparation and
Interpretation System for Text and Language Entities).
Although IBM's immediate application for this technology is to
highlight ''inappropriate style'' in documents being prepared by
managers, IRD researchers see the program being applied to determine
social origins, politeness and even general character.
Like Bernard Shaw's Professor Higgins, the system will detect
small nuances of expression and relate them to the social background
of the originator, ultimately determining sex, age, level of
intelligence, assertiveness and refinement.
Particularly intriguing is the possibility that the IBM EPISTLE
program will permit a response in the mode appropriate to the user and
the occasion. For example, says IRD, having ascertained that a letter
had been sent by a 55-year-old woman of Armenian background, the
program could help a manager couch a response in terms to which the
woman would relate.

------------------------------

Date: 01 Aug 83 1203 PDT
From: Jim Davidson <JED@SU-AI>
Subject: EPISTLE


There's a lot of exaggeration here, presumably by the author of the
Sun-Times article. EPISTLE is a legitimate project being worked on
at Yorktown, by George Heidorn, Karen Jensen, and others. [See,
e.g., "The EPISTLE text-critiquing system". Heidorn et al, IBM
Systems Journal, 1982] Its general domain, as indicated, is business
correspondence. Its stated (long-term) goals are

(a) to provide support for the authors of business letters--
critiquing grammar and style, etc.;

(b) to deal with incoming texts: "synopsizing letter contents,
highlighting portions known to be of interest, and
automatically generating index terms based on conceptual
or thematic characteristics rather than key words".

Note that part (b) is stated considerably less ambitiously than in
the Sun-Times article.

The current (as of 1982) version of the system doesn't approach even
these more modest goals. It works only on problems in class (a)--
critiquing drafts of business letters. The *only* things it checks
for are grammar (number agreement, pronoun agreement, etc.), and
style (overly complex sentences, inappropriate vocabulary, etc.)
Even within these areas, it's still very much an experimental system,
and has a long way to go.

Note in particular that the concept of "style" is far short of the
sort of thing presented in the Sun-Times article. The kind of style
checking they're dealing with is the sort of thing you find in a
style manual: passive vs. active voice, too many dependent clauses,
etc.
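
As a toy illustration of the kind of grammar check described
above, the sketch below flags possible number-agreement errors by
scanning adjacent word pairs. EPISTLE itself uses a full parser;
the word lists here are invented for the example.

```python
# Toy number-agreement check (illustrative only; EPISTLE parses
# full sentences rather than scanning word pairs).

SINGULAR_SUBJECTS = {"he", "she", "it", "this", "letter", "manager"}
PLURAL_VERBS = {"are", "were", "have", "write"}

def agreement_warnings(sentence):
    words = sentence.lower().rstrip(".").split()
    warnings = []
    for subj, verb in zip(words, words[1:]):
        if subj in SINGULAR_SUBJECTS and verb in PLURAL_VERBS:
            warnings.append(f"possible agreement error: '{subj} {verb}'")
    return warnings

print(agreement_warnings("The manager write the letter."))
```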

------------------------------

Date: 28 Jul 1983 05:25:43-PST
From: whm.arizona@Rand-Relay
Subject: Debugger Query--Summary of Replies

[Reprinted from Human-Nets.]

Several weeks ago I posted a query for information on debuggers. The
information I received fell into two categories: information about
papers, and information about actual programs. The information about
papers was basically subsumed by two documents: an annotated
bibliography, and soon-to-be-published conference proceedings. The
information about programs was quite diverse and somewhat lengthy. In
order to avoid clogging the digest, only the information about the
papers is included here. A longer version of this message will be
posted to net.lang on USENET.

The basic gold mine of current ideas on debugging is the Proceedings
of the ACM SIGSOFT/SIGPLAN Symposium on High-Level Debugging which was
held in March, 1983. Informed sources say that it is scheduled to
appear as vol. 8, no. 4 (1983 August) of SIGSOFT's Software
Engineering Notes and as vol. 18, no. 8 (1983 August) of SIGPLAN
Notices. All members of SIGSOFT and SIGPLAN should receive copies
sometime in August.

Mark Johnson at HP has put together a pair of documents on debugging.
They are:

"An Annotated Software Debugging Bibliography"
"A Software Debugging Glossary"

I believe that a non-annotated version of this bibliography appeared
in SIGPLAN in February 1982. The annotated bibliography is the basic
gold mine of "pointers" about debugging.

Mark can be contacted at:

Mark Scott Johnson
Hewlett-Packard Laboratories
1501 Page Mill Road, 3U24
Palo Alto, CA 94304
415/857-8719

Arpa: Johnson.HP-Labs@RAND-RELAY
USENET: ...!ucbvax!hplabs!johnson


Two books were mentioned that are not currently included in Mark's
bibliography:

"Algorithmic Debugging" by Ehud Shapiro. It has information
on source-level debugging, debuggers in the language being
debugged, debuggers for unconventional languages, etc. It
is supposedly available from MIT Press. (From
dixon.pa@parc-maxc)

"Smalltalk-80: The Interactive Programming Environment"
A section of the book describes the system's interactive
debugger. (This book is supposedly due in bookstores
on or around the middle of October. A much earlier
version of the debugger was briefly described in the
August 1981 BYTE.) (From Pavel@Cornell.)

Ken Laws (Laws@sri-iu) sent me an extract from "A Bibliography of
Automatic Programming" which contained a number of references on
topics such as programmer's apprentices, program understanding,
programming by example, etc.


Many thanks to those who took the time to reply.

Bill Mitchell
The University of Arizona
whm.arizona@rand-relay
arizona!whm

------------------------------

Date: Fri 29 Jul 83 19:32:39-PDT
From: Robert Amsler <AMSLER@SRI-AI.ARPA>
Subject: WANTED: Geographic Information Data Bases

I want to build a geographic knowledge base and wonder if
someone out there has small or large sets of foreign
geographic data. Something containing elements such as
(PARIS CITY FRANCE) composed of three items,
Geographic-Name, Superclass, and Containing-Geographic item.

I have already acquired a list of all U.S. cities and
their state memberships; but apart from that need other
geographic information for other U.S. features (e.g. counties,
rivers, mountains, etc.) as well as world-wide data.

I am not especially looking for numeric data (e.g. Longitude
and Latitude; elevations, etc.) nor numeric attributes such
as population, area, etc. -- I want symbolic data, names of
geographic entities.

Note: I do mean already machine-readable.
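
The requested triple scheme is simple enough to sketch in a few
lines; the entries and query helpers below are invented sample
data, not an existing database.

```python
# Sketch of the triple scheme described above:
# (Geographic-Name, Superclass, Containing-Geographic-Item).
# All entries are illustrative sample data.

TRIPLES = [
    ("PARIS", "CITY", "FRANCE"),
    ("LYON", "CITY", "FRANCE"),
    ("FRANCE", "COUNTRY", "EUROPE"),
    ("TUCSON", "CITY", "ARIZONA"),
    ("ARIZONA", "STATE", "USA"),
]

def members(superclass, container):
    """All names of a given superclass inside a container."""
    return [name for name, cls, parent in TRIPLES
            if cls == superclass and parent == container]

def contained_in(name):
    """Transitive chain of containment for a name."""
    lookup = {n: parent for n, _, parent in TRIPLES}
    chain = []
    while name in lookup:
        name = lookup[name]
        chain.append(name)
    return chain

print(members("CITY", "FRANCE"))   # ['PARIS', 'LYON']
print(contained_in("PARIS"))       # ['FRANCE', 'EUROPE']
```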

Bob Amsler
Natural-Language and Knowledge-Resource Systems Group
Advanced Computer Systems Department
SRI International
333 Ravenswood Ave
Menlo Park, CA 94025

------------------------------

Date: 1 August 1983 1507-EDT
From: Dorothy Josephson at CMU-CS-A
Subject: CMU Seminar, 8/9

[Reprinted from the CMU BBoard.]

DATE: Tuesday, August 9, 1983
TIME: 3:30 P.M.
PLACE: Wean Hall 5409
SPEAKER: Hans Berliner
TOPIC: "Ken Thompson's New Chess Theorem"

ABSTRACT

Among the not-quite-so-basic endgames in chess is the one of 2
Bishops versus Knight (no pawns). The value of a general position
in this domain has always been an open question. The
Bishops have a large advantage, but it was thought that a basic and
usually achievable position could be drawn. Thompson has just shown
that this endgame is won in the general case using a technique called
retrograde enumeration. We will explain what he did, how he did it,
and the significance of this result. We hope some people from Formal
Foundations will attend as there are interesting questions relating
to whether a construction such as this should be considered a
"proof."
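
Retrograde enumeration can be illustrated on a toy game graph:
label the terminal positions as losses for the side to move, then
propagate values backward until a fixed point is reached. The
graph below is invented and vastly smaller than any chess endgame
table; it only shows the shape of the technique.

```python
# Toy retrograde enumeration (not chess).  Nodes are positions,
# edges are legal moves, and a position with no moves is a loss
# for the side to move.

MOVES = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": [],        # terminal: side to move loses
    "d": [],        # terminal: side to move loses
    "e": ["a"],
}

def retrograde(moves):
    value = {}
    changed = True
    while changed:
        changed = False
        for pos, succs in moves.items():
            if pos in value:
                continue
            if not succs:
                value[pos] = "LOSS"
                changed = True
            elif any(value.get(s) == "LOSS" for s in succs):
                value[pos] = "WIN"    # some move puts opponent in a loss
                changed = True
            elif all(value.get(s) == "WIN" for s in succs):
                value[pos] = "LOSS"   # every move leaves opponent winning
                changed = True
    # anything still unlabeled would be a draw (unresolved cycle)
    return {p: value.get(p, "DRAW") for p in moves}

print(retrograde(MOVES))
```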

------------------------------

Date: 1 Aug 83 17:40:48 PDT (Monday)
From: murage.pa@PARC-MAXC.ARPA
Subject: HP Computer Colloquium, 8/4

[Reprinted from the SRI BBoard.]


JOHN D. LAWRENCE

Artificial Intelligence Center
SRI International


EVIDENTIAL REASONING:
AN IMPLEMENTATION FOR MULTI-SENSOR INTEGRATION


One common feature of most knowledge-based expert systems is that
they must reason based upon evidential information. Yet there is very
little agreement on how this should be done. Here we present our
current understanding of this problem and its solution as it applies
to multi-sensor integration. We begin by characterizing evidence as a
body of information that is uncertain, incomplete, and sometimes
inaccurate. Based on this characterization, we conclude that
evidential reasoning requires both a method for pooling multiple
bodies of evidence to arrive at a consensus opinion and some means of
drawing the appropriate conclusions from that opinion. We contrast
our approach, based on a relatively new mathematical theory of
evidence, with those approaches based on Bayesian probability models.
We believe that our approach has some significant advantages,
particularly its ability to represent and reason from bounded
ignorance. Further, we describe how these techniques are implemented
by way of a long term memory and a short term memory. This provides
for automated reasoning from evidential information at multiple
levels of abstraction over time and space.
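
The "relatively new mathematical theory of evidence" is presumably
the Dempster-Shafer theory, whose pooling step is Dempster's rule
of combination. The sketch below is a minimal illustration of that
rule, not SRI's implementation, and the sensor masses are invented.

```python
# Minimal sketch of Dempster's rule of combination.  A mass
# function maps frozensets of hypotheses to belief mass; mass on
# the whole frame of discernment represents ignorance.

def combine(m1, m2):
    """Pool two mass functions over the same frame of discernment."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb        # mass on disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    # renormalize by the non-conflicting mass
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Two sensors report on whether an object is a TANK or a TRUCK
# (invented numbers); mass on the whole frame models ignorance.
frame = frozenset({"TANK", "TRUCK"})
m1 = {frozenset({"TANK"}): 0.6, frame: 0.4}
m2 = {frozenset({"TANK"}): 0.5, frame: 0.5}
pooled = combine(m1, m2)
print(round(pooled[frozenset({"TANK"})], 2))   # 0.8
```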


Thursday, August 4, 1983 4:00 p.m.

5M Conference Room
1501 Page Mill Road
Palo Alto, CA 94304

NON-HP EMPLOYEES: Welcome! Please come to the lobby on time, so
that you may be escorted to the conference room.

------------------------------

End of AIList Digest
********************
