AIList Digest           Wednesday, 4 Apr 1984      Volume 2 : Issue 41 

Today's Topics:
Physiognomic Awareness - Request,
Waveform Analysis - IBM EKG Program Status,
Logic Programming - Prolog vs. Pascal & IBM PC/XT Prolog Benchmarks,
Fifth Generation - McCarthy Review
----------------------------------------------------------------------

Date: Sun, 1 Apr 1984 01:00 EST
From: RMS.G.DDS%MIT-OZ@MIT-MC.ARPA
Subject: Physiognomic Awareness and Ergonomic Design


Physiognomic awareness and its relation to ergonomic design: an inquiry

Does anyone know of any studies conducted on this exact topic, or on
one as close to it as possible?  I am interested in whether this has
been explored.

------------------------------

Date: 26 Feb 84 0:05:42-PST (Sun)
From: hplabs!sdcrdcf!akgua!mcnc!ecsvax!hsplab @ Ucb-Vax
Subject: computer EKG
Article-I.D.: ecsvax.2050

I would like to add a footnote to Jack Buchanan's note: IBM, which
helped support the original development of the Bonner program, has
announced that effective June 1984 it will close its Health Care
Division, which currently manufactures its EKG products.  Support for
existing products will continue for at least seven years after
product termination, however.

David Chou
Department of Pathology
University of North Carolina, Chapel Hill
...!mcnc!ecsvax!hsplab

------------------------------

Date: Mon 2 Apr 84 21:34:24-PST
From: Ken Laws <Laws@SRI-AI.ARPA>
Subject: Prolog vs. Pascal

The latest issue of IEEE Computer Graphics and Applications (March
'84) has an article comparing interpreted Prolog with compiled Pascal
for a graphics application.  The results are surprising to me.

J.C. Gonzalez, M.H. Williams, and I.E. Aitchison of Heriot-Watt
University report on the comparison in "Evaluation of the
Effectiveness of Prolog for a CAD Application." They
implemented a very simple set of graphical routines: much of
the code is given in the article. They were building a 2-D
entry and editing system where the polygons were stored as lists
of vertices and edges. The user could enter new points and
edit previously entered figures. This formed the front end
to a system for constructing 3-D models from orthogonal 2-D
orthographic projections (engineering drawings). Much of the
code has the flavor of "For each line (or point or figure)
satisfying given constraints, do the following ..." (Often only
one entity would satisfy the constraints.)

The authors report that the Prolog version (using assert and
retract to manipulate the database) was more concise, more readable,
and clearer than the Pascal version. The Prolog version also took
less storage, was developed more quickly, and was developed with
minimum error. What is more remarkable is that the interpreted
Prolog ran about 25% faster than the compiled Pascal.
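
The article's Prolog code is not reproduced here, but its flavor can
be suggested by a short sketch in Edinburgh-style Prolog.  The
predicate names and the point/line schema below are illustrative
guesses, not the authors' code:

    % Points and lines of the 2-D figures are held as facts in the
    % Prolog database (schema is illustrative).  The declarations are
    % needed by most modern Prologs; 1984 interpreters generally
    % allowed assert/1 without them.
    :- dynamic(point/3).            % point(Id, X, Y)
    :- dynamic(line/3).             % line(Id, PointId1, PointId2)

    add_point(Id, X, Y) :-          % enter a new point
        assert(point(Id, X, Y)).

    move_point(Id, NewX, NewY) :-   % edit a previously entered point
        retract(point(Id, _, _)),
        assert(point(Id, NewX, NewY)).

    % "For each line satisfying given constraints, do the following"
    % expressed as a failure-driven loop over the database:
    print_lines_through(P) :-
        line(Id, A, B),
        ( A == P ; B == P ),
        write(line(Id, A, B)), nl,
        fail.
    print_lines_through(_).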

They were using a PDP-11/34 with the NU7 Prolog interpreter
from Edinburgh and the VU Pascal compiler from the Vrije Universiteit
in Amsterdam.

------------------------------

Date: Fri 23 Mar 84 10:32:55-PST
From: Herm Fischer <HFischer@USC-ECLB>
Subject: IBM PC/XT Prolog Benchmarks

[Forwarded from the Prolog Digest by Laws@SRI-AI.]

[...]

IBM was kind enough to let us have PC/IX for today, and we
brought up UNSW Prolog.  With a minor exception, the code and
makefiles were compatible with PC/IX.  (They had frustrated me for a
whole year by being incompatible with every PCDOS "C" compiler from
Lattice onward.)

PC/IX and Prolog are neatly integrated; all Unix features, and
even shell calls, can be used from within the Prolog environment.
Even help files are included.  It is kind of nice to be able to trace
away and to browse and modify your Prolog code within the interpretive
environment, using the INed (nee Rand) editor and all the other
Unix stuff.

The 64 K limitation of PC/IX bothers me, more emotionally than
factually, because only one of my programs couldn't be run today.
I'm sure I will get really upset unless I find some hack around
this limitation.

A benchmark really surprises me. The Zebra problem (using
Pereira's solution) provides the following statistics:

    DEC-2040      6 secs   (compiled)           (timed on TOPS-20)
                 42 secs   (interpreted)        (timed on TOPS-20)

    VAX-11/780  204 secs   (interpreted, UNSW)  (timed on Unix Sys III)

    IBM PC/XT   544 secs   (interpreted, UNSW)  (timed on Unix Sys III)

The latter two times are wall-clock times with no other jobs or
users running, and the two Prologs were compiled from the same
source code and makefile!  The PC/IX system was CPU-bound, and its
disk never blinked during the execution of the test.
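
Pereira's program itself is not reproduced in the Digest.  For
readers who have not seen the benchmark, the Zebra (five-houses)
puzzle can be stated in roughly two dozen lines of Prolog along the
following lines; this is a generic illustrative formulation, not
Pereira's code:

    % Each house is a term h(Nationality, Colour, Pet, Drink, Smoke).
    % member/2 is assumed to be built in or supplied by a library,
    % as in most Prologs.

    right_of(R, L, [L, R | _]).
    right_of(R, L, [_ | T]) :- right_of(R, L, T).

    next_to(A, B, Hs) :- right_of(A, B, Hs).
    next_to(A, B, Hs) :- right_of(B, A, Hs).

    zebra(ZebraOwner, WaterDrinker) :-
        Hs = [h(norwegian,_,_,_,_), _, h(_,_,_,milk,_), _, _],
        member(h(english, red, _, _, _), Hs),
        member(h(spaniard, _, dog, _, _), Hs),
        member(h(_, green, _, coffee, _), Hs),
        member(h(ukrainian, _, _, tea, _), Hs),
        right_of(h(_,green,_,_,_), h(_,ivory,_,_,_), Hs),
        member(h(_, _, snails, _, old_gold), Hs),
        member(h(_, yellow, _, _, kools), Hs),
        next_to(h(_,_,_,_,chesterfields), h(_,_,fox,_,_), Hs),
        next_to(h(_,_,_,_,kools), h(_,_,horse,_,_), Hs),
        member(h(_, _, _, orange_juice, lucky_strike), Hs),
        member(h(japanese, _, _, _, parliaments), Hs),
        next_to(h(norwegian,_,_,_,_), h(_,blue,_,_,_), Hs),
        member(h(ZebraOwner, _, zebra, _, _), Hs),
        member(h(WaterDrinker, _, _, water, _), Hs).

    % ?- zebra(Z, W).   yields Z = japanese, W = norwegian.

The program is essentially pure search by unification and
backtracking over a five-element list, which is what makes it a
reasonable stress test of an interpreter's raw inference rate.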

-- Herm Fischer

------------------------------

Date: Wed 21 Mar 84 20:47:07-PST
From: Ramsey Haddad <HADDAD@SU-SCORE.ARPA>
Subject: fifth generation

[Forwarded from the Stanford bboard by Laws@SRI-AI.]

For anyone interested in these things, there is a review by John
McCarthy of Feigenbaum and McCorduck's "The Fifth Generation:
Artificial Intelligence and Japan's Computer Challenge to the World"
in the April 1984 issue of REASON magazine.


[The following is a copy of Dr. McCarthy's text, reprinted with
his permission. -- KIL]


The Fifth Generation - Artificial Intelligence and Japan's Computer
Challenge to the World - by Edward Feigenbaum and Pamela McCorduck,
Addison-Wesley Publishing Co.


Review of Feigenbaum and McCorduck - for Reason


Japan has replaced the Soviet Union as the world's second
place industrial power. (Look at the globe and be impressed).
However, many people, Japanese included, consider that this success
has relied too much on imported science and technology - too much for
the respect of the rest of the world, too much for Japanese
self-respect, and too much for the technological independence needed
for Japan to continue to advance at previous rates. The Fifth
Generation computer project is one Japanese attempt to break out of
the habit of copying and generate Japan's own share of scientific and
technological innovations.

The idea is that the 1990s should see a new generation of
computers based on "knowledge information processing" rather than
"data processing". "Knowledge information processing" is a vague term
that promises important advances in the direction of artificial
intelligence but is noncommittal about specific performance. Edward
Feigenbaum describes this project in The Fifth Generation - Artificial
Intelligence and Japan's Computer Challenge to the World, predicts
substantial success in meeting its goals, and argues that the U.S.
will fall behind in computing unless we make a similar coherent
effort.

The Fifth Generation Project (ICOT) is the brainchild of
Kazuhiro Fuchi of the Japanese government's Electro-Technical
Laboratory. ICOT, while supported by industry and government, is an
independent institution. Fuchi has borrowed about 40 engineers and
computer scientists, all under 35, for periods of three years, from
the leading Japanese computer companies. Thus the organization and
management of the project is as innovative as one could ask. With
only 40 people, the project is so far a tiny part of the total
Japanese computer effort, but it is scheduled to grow in subsequent
phases.

The project is planned to take about 10 years, during which
time participants will design computers based on "logic programming",
an invention of Alain Colmerauer of the University of Marseilles in
France and Robert Kowalski of Imperial College in London, and
implemented in a computer programming language called Prolog. They
want to use additional ideas of "dataflow" developed at M.I.T. and to
make machines consisting of many processors working in parallel. Some
Japanese university scientists consider that the project still has too
much tendency to look to the West for scientific ideas.
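
[For readers who have never seen a logic program, a few lines of
Prolog give the flavor: a program is a set of facts and rules, and
running it is a controlled form of deduction.  The example below is
the Digest's illustration, not part of Dr. McCarthy's review.

    parent(tom, bob).                      % facts
    parent(bob, ann).

    ancestor(X, Y) :- parent(X, Y).        % rules
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % ?- ancestor(tom, Who).
    % Who = bob ;
    % Who = ann.
]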

Making parallel machines based on logic programming is a
straightforward engineering task, and there is little doubt that this
part of the project will succeed. The grander goal of shifting the
center of gravity of computer use to the intelligent processing of
knowledge is more doubtful as a 10-year effort.  The level of
intelligence to be achieved is ill-defined. The applications are also
ill-defined. Some of the goals, such as common sense knowledge and
reasoning ability, require fundamental scientific discoveries that
cannot be scheduled in advance.

My own scientific field is making computer programs with
common sense, and when I visited ICOT, I asked who was working on the
problem. It was disappointing to learn that the answer was "no-one".
This is a subject to which the Japanese have made few contributions,
and it probably isn't suited to people borrowed from computer
companies for three years. Therefore, one can't be optimistic that
this important part of the project goals will be achieved in the time
set.

The Fifth Generation Project was announced at a time when the
Western industrial countries were ready for another bout of viewing
with alarm; the journalists have tired of the "energy crisis" - not
that it has been solved. Even apart from the recession, industrial
productivity has stagnated; it has actually declined in industries
heavily affected by environmental and safety innovations. Meanwhile
Japan has taken the lead in automobile production and in some other
industries.

At the same time, artificial intelligence research was getting
a new round of publicity that seems to go in a seven-year cycle. For
a while every editor wants a story on Artificial Intelligence and the
freelancers oblige, and then suddenly the editors get tired of it.
This round of publicity has more new facts behind it than before,
because expert systems are beginning to achieve practical results,
i.e. results that companies will pay money for.

Therefore, the Fifth Generation Project has received enormous
publicity, and Western computer scientists have taken it as an
occasion for spurring on their colleagues and their governments.
Apocalyptic language is used that suggests that there is a battle to
the death - only one computer industry can survive, theirs or ours.
Either we solve all the problems of artificial intelligence right away
or they walk all over us.

Edward Feigenbaum is the leader of one of the major groups
that has pioneered expert systems -- with programs applicable to
chemistry and medicine. He is also one of the American computer
scientists with extensive Japanese contacts and extensive interaction
with the Fifth Generation Project.

Pamela McCorduck is a science writer with a previous book,
Machines Who Think, about the history of artificial intelligence
research.

The Fifth Generation contains much interesting description
of the Japanese project and American work in related areas. However,
Feigenbaum and McCorduck concentrate on two main points. First,
knowledge engineering will dominate computing
by the 1990s. Second, America is in deep trouble if we don't
organize a systematic effort to compete with the Japanese in this
area.

While knowledge engineering will increase in importance, many
of its goals will require fundamental scientific advances that cannot
be scheduled to a fixed time frame. Unfortunately, even in the United
States and Britain, the hope of quick applications has lured too many
students away from basic research. Moreover, our industrial system
has serious weaknesses, some of which the Japanese have avoided. For
example, if we were to match their 40-engineer project according to
output of our educational system, our project would have 20 engineers
and 20 lawyers.

The authors are properly cautious about what kind of an
American project is called for. It simply cannot be an Apollo-style
project, because that depended on having a rather precise plan in the
beginning that could see all the way to the end and did not depend on
new scientific discoveries. Activities that were part of the plan
were pushed, and everything that was not part of it was ruthlessly
trimmed. This would be disastrous when it is impossible to predict
what research will be relevant to the goal.

Moreover, if it is correct that good new ideas are more likely
to be decisive in this field at this time than systematic work on
existing ideas, we will make the most progress if there is money to
support unsolicited proposals. The researcher should propose goals
and the funders should decide how he and his project compare with the
competition.

A unified government-initiated plan imposed on industry has
great potential for disaster. The group with the best political
skills might get their ideas adopted. We should remember that present
day integrated circuits are based on an approach rejected for
government support in 1960. Until recently, the federal government
has provided virtually the only source of funding for basic research
in computer technology. However, the establishment of
industry-supported basic research through consortia like the
Microelectronics and Computer Technology Corporation (MCC), set up in
Austin, Texas under the leadership of Admiral Bobby Inman, represents
a welcome trend--one that enhances the chances of making the
innovations required.

------------------------------

End of AIList Digest
********************
