AIList Digest Friday, 4 Oct 1985 Volume 3 : Issue 134
Today's Topics:
Queries - Micro LISP & DEC20 LISP & Expert System Tools,
LISP - Franz Functions,
Survey - AI Project Planning,
Bibliography - Connectionism and Parallel Distributed Processing
----------------------------------------------------------------------
Date: Thu 3 Oct 85 17:30:30-PDT
From: Robert Blum <BLUM@SUMEX-AIM.ARPA>
Subject: LISP on micros: need info and refs
I teach a tutorial every year at SCAMC (the largest medical computing
meeting in the US) on AI in medicine. My students invariably want to know
how to run LISP on their micros. I'd like to request information on
the various currently marketed LISPs for micros (cost, speed, quality,
and facilities). Pointers to review articles would also be helpful.
Thanks, Bob Blum (BLUM@sumex)
[Has anyone saved past AIList micro-LISP messages in a convenient
form? My archive is not in a convenient form for multimessage
topical searches. -- KIL]
------------------------------
Date: 4 Oct 1985 10:06-EST
From: (Steinar Kjærnsrød) <steinar@oslo-vax>
Subject: Common Lisp for DEC20/TOPS20
I have a paper in front of me dated July 1984 describing
available Common Lisps. I'm aware of several new implementations
since July 1984(!), and I'm especially interested in versions
for DEC20/TOPS20 and VAX/UNIX. Does anyone know *IF* and
*WHERE* these versions are available?
Steinar Kjaernsroed,
Institute of Informatics,
University of Oslo,
NORWAY
------------------------------
Date: Fri 4 Oct 85 08:19:08-PDT
From: Mark Richer <RICHER@SUMEX-AIM.ARPA>
Subject: AI software tools
Several months ago I wrote a report on commercial AI software tools
(KEE, ART, S.1, DUCK, and SRL+ ... now called Language Craft). The paper
was basically in two parts: 1) general criteria for evaluating tools,
2) a description and partial evaluation of the five tools. I am now
revising this paper and would welcome input. I am especially
interested in comments from people who have used any of these tools,
or another tool not described in my paper that you feel is a good
candidate for discussion.
------------------------------
Date: Wed, 2 Oct 85 10:36 EST
From: Atul Bajpai <bajpai%gmr.csnet@CSNET-RELAY.ARPA>
Subject: QUERY: Expert System Building Tools
Any ideas about the implications (advantages/problems) of using two
(or more) different Expert System Tools (e.g. KEE, ART, S.1,
Knowledge Craft, etc.) on a single application? Has anyone done this
before? Is it practical to try such a thing? Any and all comments are
welcome. Thanks.
--Atul Bajpai-- (bajpai%gmr.csnet@csnet-relay OR atul@sushi.arpa)
------------------------------
Date: Wed, 2 Oct 85 09:35:06 edt
From: Walter Hamscher <hamscher@MIT-HTVAX.ARPA>
Subject: Franz Functions
(1) (getenv '|USER|) will return the user ID (Franz manual section 6).
Then you can either write a C routine that uses _getpwent
(Unix manual section 3) and link it in using cfasl (Franz section
8.4), or write Franz code that opens and scans the file /etc/passwd.
(2) (status ctime). Franz manual section 6.
(3) `plist'. Section 2.
(4) Try closing the file after you write to it, then reopening
it in "a" (append) mode. Avoid filepos.
Incidentally, the mailing list "franz-friends@Berkeley.ARPA"
is a much more appropriate place for questions like this.
------------------------------
Date: 1 Oct 1985 15:12:53-EDT
From: kushnier@NADC
Subject: AI Project Planning Survey
AI PROJECT PLANNING SURVEY
In order to develop a better planning methodology for AI project
management, the Artificial Intelligence group at the Naval Air
Development Center is conducting a survey among readers of AILIST. The
answers to the survey will provide guidelines and performance metrics
that will help in "scoping out" future AI projects.
A summary of the results of this survey will be provided to the
respondents.
Although complete entries are desired, any information will help.
Response to this survey must be made at no cost to the
Government.
Please respond by 18 Oct 1985.
General Project Information
1. PROJECT NAME: (MYCIN, R1, PROSPECTOR, . . .)
2. NAME OF DEVELOPMENT GROUP:
3. NAME AND ADDRESS OF CONTACT:
4. SHORT DESCRIPTION OF PROJECT:
5. TYPE OF SYSTEM: (Interpretation, Diagnosis, Monitoring, Prediction,
Design, Planning, Control, Debugging, Repair, Instruction)
6. DEVELOPMENT DATES:
7. CONTACT FOR SOFTWARE AND DOCUMENTATION:
8. CURRENT LEVEL OF PROGRESS: (Fully successful, still developing,
demonstrated but not in use, . . .)
Implementation Specifics
9. METHOD OF KNOWLEDGE REPRESENTATION: (Frames, production rules,...)
10. IMPLEMENTATION LANGUAGE: (Lisp, Prolog, Ops5, . . .)
11. COMMERCIAL TOOLS USED (if any): (KEE, ART, M.1, . . .)
12. NUMBER OF MAN-YEARS INVESTED:
Knowledge representation:
Knowledge acquisition:
Implementation:
13. HOST COMPUTER: (Vax, Symbolics, IBM-PC, . . .)
14. NUMBER AND AVERAGE SIZE IN BYTES OF RULES, FRAMES, WFFs
OR OBJECTS (where applicable):
Performance Criteria
15. PROGRAM EXECUTION TIME: (Avg. rule firings/task, rules
fired/sec, . . .)
16. AMOUNT OF MEMORY IN BYTES FOR:
Short term memory:
Long term memory:
Procedural memory:
17. IN WHAT WAYS DID THE SCOPE OF THE DOMAIN CHANGE? (Full scope,
rescoping needed,...)
Knowledge Acquisition
18. KNOWLEDGE ACQUISITION EFFORT IN MAN-YEARS:
19. NUMBER OF EXPERTS USED:
20. EXPERT SELECTION METHOD:
21. EXPERT INTERVIEWING METHODOLOGY:
Please send your responses to:
Ralph Fink, rfink@NADC.ARPA
Code 5021
Naval Air Development Center
Warminster, PA 18974
------------------------------
Date: Tue, 1 Oct 85 00:51:02 est
From: "Marek W. Lugowski" <marek%indiana.csnet@CSNET-RELAY.ARPA>
Subject: A reading list for a "Connectionism and Parallel Dist. Proc." course
After months of brainstorming (er, inaction), here comes at last the
promised reading list for Indiana U.'s planned connectionism course,
78 items long--just right for a semester. IU's a liberal arts school,
after all. :-)
-- Marek
A list of sources for teaching a graduate AI course "Connectionism and
Parallel Distributed Processing" at Indiana University's Computer Science
Department. Compiled by Marek W. Lugowski (marek@indiana.csnet) and Pete
Sandon (sandon@wisc-ai.arpa), Summer 1985.
Ackley, D. H., "Learning Evaluation Functions in Stochastic Parallel
Networks," CMU Thesis Proposal, December 1984.
Amari, S-I., "Neural Theory of Association and Concept-Formation," Biological
Cybernetics, vol. 26, pp. 175-185, 1977.
Ballard, D. H., "Parameter Networks: Toward a Theory of Low-Level Vision,"
IJCAI, vol. 7, pp. 1068-1078, 1981.
Ballard, D. H. and D. Sabbah, "On Shapes," IJCAI, vol. 7, pp. 607-612, 1981.
Ballard, D. H., G. E. Hinton, and T. J. Sejnowski, "Parallel Visual
Computation," Nature, vol. 103, pp. 21-26, November 1983.
Ballard, D. H., "Cortical Connections: Structure and Function," University of
Rochester Technical Report #133, July 1984.
Block, H. D., "A Review of 'Perceptrons: An Introduction to Computational
Geometry'," Information and Control, vol. 17, pp. 501-522, 1970.
Bobrowski, L., "Rules of Forming Receptive Fields of Formal Neurons During
Unsupervised Learning Processes," Biological Cybernetics, vol. 43,
pp. 23-28, 1982.
Bongard, M., "Pattern Recognition", Hayden Book Company (Spartan Books), 1970.
Brown, C. M., C. S. Ellis, J. A. Feldman, T. J. LeBlanc, and G. L. Peterson,
"Research with the Butterfly Multicomputer," Rochester Research Review,
pp. 3-23, 1984.
Christman, D. P., "Programming the Connection Machine," MIT EECS Department
Masters Thesis, 1984.
Csernai, L. P. and J. Zimanyi, "Mathematical Model for the Self-Organization
of Neural Networks," Biological Cybernetics, vol. 34, pp. 43-48, 1979.
Fahlman, S. E., NETL: A System for Representing and Using Real-World Knowledge,
MIT Press, Cambridge, Massachusetts, 1979.
Fahlman, S. E., G. E. Hinton, and T. J. Sejnowski, "Massively Parallel
Architectures for AI: NETL, Thistle and Boltzmann Machines," Proceedings
of the National Conference on Artificial Intelligence, 1983.
Feldman, J. A., "A Distributed Information Processing Model of Visual Memory,"
University of Rochester Technical Report #52, December 1979.
Feldman, J. A., "Dynamic Connections in Neural Networks," Biological
Cybernetics, vol. 46, pp. 27-39, 1982.
Feldman, J. A. and D. H. Ballard, "Computing with Connections," in Human and
Machine Vision, J. Beck, B. Hope, and A. Rosenfeld (eds.), Academic
Press, New York, 1983.
Feldman, J. A. and L. Shastri, "Evidential Inference in Activation Networks,"
Rochester Research Review, pp. 24-29, 1984.
Feldman, J. A., "Connectionist Models and Their Applications: Introduction,"
Special Issue of Cognitive Science, vol. 9, p. 1, 1985.
Fry, G., ed., Nanotechnology Notebook, an unpublished collection of published
and unpublished material on molecular computing. Contact either the
editor (cfry@mit-prep.arpa) or Scott Jones (saj@mit-prep.arpa) at MIT
Artificial Intelligence Laboratory for distribution and/or bibliography
information. Contains material by Eric Drexler, Richard Feynman, Kevin
Ulmer and others.
Fukushima, K., "Cognitron: A Self-organizing Multilayered Neural Network,"
Biological Cybernetics, vol. 20, pp. 121-136, 1975.
Fukushima, K., "Neocognitron: A Self-organizing Neural Network Model for
a Mechanism of Pattern Recognition Unaffected by Shift in Position,"
Biological Cybernetics, vol. 36, pp. 193-202, 1980.
Hebb, D. O., The Organization of Behavior, Wiley, New York, 1949.
Hewitt, C., "Viewing Control Structures as Patterns of Passing Messages",
Artificial Intelligence: An MIT Perspective, Winston and Brown, Editors,
MIT Press, Cambridge, Massachusetts, 1979.
Hewitt, C. and P. de Jong, "Open Systems", MIT Artificial Intelligence Lab
Memo #691, December 1982.
Hewitt, C. and H. Lieberman, "Design Issues in Parallel Architectures for
Artificial Intelligence", MIT Artificial Intelligence Lab Memo #750,
November 1983.
Hewitt, C., an article on asynchronous parallel systems not simulable
by Turing Machines or Omega-order logic, BYTE, April 1985.
Hillis, W. D., "New Computer Architectures and Their Relationship to Physics
or Why Computer Science Is No Good", International Journal of
Theoretical Physics, vol. 21, pp. 255-262, 1982.
Hinton, G. E., "Relaxation and its Role in Vision," University of Edinburgh
Ph.D. Dissertation, 1977.
Hinton, G. E. and J. A. Anderson, Parallel Models of Associative Memory,
Lawrence Erlbaum Associates, Hillsdale, New Jersey, 1981.
Hinton, G. E. and T. J. Sejnowski, "Optimal Perceptual Inference," Proceedings
of the IEEE Computer Society Conference on CV and PR, pp. 448-453,
June 1983.
Hinton, G. E., T. J. Sejnowski, and D. H. Ackley, "Boltzmann Machines:
Constraint Satisfaction Networks that Learn," CMU Department of Computer
Science Technical Report No. 84-119, May 1984.
Hinton, G. E., "Distributed Representations", CMU Department of Computer
Science Technical Report No. 84-157, October 1984.
Hirai, Y., "A New Hypothesis for Synaptic Modification: An Interactive Process
between Postsynaptic Competition and Presynaptic Regulation," Biological
Cybernetics, vol. 36, pp. 41-50, 1980.
Hirai, Y., "A Learning Network Resolving Multiple Match in Associative
Memory," IJCPR, vol. 6, pp. 1049-1052, 1982.
Hofstadter, D. R., "Who Shoves Whom Inside the Careenium?, or, What is the
Meaning of the Word 'I'?", Indiana University Computer Science Department
Technical Report #130, Bloomington, Indiana, 1982.
Hofstadter, D. R., "The Architecture of Jumbo", Proceedings of the International
Machine Learning Workshop, Monticello, Illinois, June 1983.
Hofstadter, D. R., "The Copycat Project", MIT Artificial Intelligence Lab
Memo #755, Cambridge, Massachusetts, January 1984.
Hofstadter, D. R., Metamagical Themas, Basic Books, New York, 1985.
Hopfield, J.J., "Neural Networks and Physical Systems with Emergent Collective
Computational Abilities," Proceedings of the National Academy of
Sciences, vol. 79, pp. 2554-2558, 1982.
Jefferson, D. E., "Virtual Time", UCLA Department of Computer Scienced
Technical Report No. 83-213, Los Angeles, California, May 1983.
Jefferson, D. E. and H. Sowizral, "Fast Concurrent Simulation Using the Time
Warp Mechanism, Part 1: Local Control", Rand Corporation Technical Report,
Santa Monica, California, June 1983.
Jefferson, D. E. and H. Sowizral, "Fast Concurrent Simulation Using the Time
Warp Mechanism, Part 2: Global Control", Rand Corporation Technical Report,
Santa Monica, California, August 1983.
John, E. R., "Switchboard versus Statistical Theories of Learning and Memory,"
Science, vol. 177, pp. 850-864, September 1972.
Kandel, E. R., "Small Systems of Neurons," Scientific American, vol. 241,
pp. 67-76, September 1979.
Kanerva, P., Self-Propagating Search: A Unified Theory of Memory, Center
for Study of Language and Information (CSLI) Report No. 84-7 (Stanford
Department of Philosophy Ph.D. Dissertation), Stanford, California, 1984.
To appear as a book by Bradford Books (MIT Press).
Kanerva, P., "Parallel Structures in Human and Computer Memory", Cognitiva 85,
Paris, June 1985.
Kirkpatrick, S., C. D. Gelatt, Jr., and M. P. Vecchi, "Optimization by
Simulated Annealing", Science, vol. 220, no. 4598, 13 May 1983.
Kohonen, T., "Self-Organized Formation of Topologically Correct Feature Maps,"
Biological Cybernetics, vol. 43, pp. 59-69, 1982.
Kohonen, T., "Analysis of a Simple Self-Organizing Process," Biological
Cybernetics, vol. 44, pp. 135-140, 1982.
Kohonen, T., "Clustering, Taxonomy, and Topological Maps of Patterns," IJCPR,
vol. 6, pp. 114-128, 1982.
Kohonen, T., Self-Organization and Associative Memory, 2nd edition,
Springer-Verlag, New York, 1984.
Lieberman, H., "A Preview of Act1", MIT Artificial Intelligence Lab Memo #626.
Malsburg, C. von der, "Self-Organization of Orientation Sensitive Cells in the
Striate Cortex," Kybernetik, vol. 14, pp. 85-100, 1973.
McClelland, J. L., "Putting Knowledge in its Place: A Scheme for Programming
Parallel Processing Structures on the Fly," Cognitive Science, vol. 9,
pp. 113-146, 1985.
McClelland, J. L. and D. E. Rumelhart, "Distributed Memory and the
Representation of General and Specific Information," Journal of
Experimental Psychology: General, vol. 114, pp. 159-188, 1985.
McClelland, J. L. and D. E. Rumelhart, Editors, Parallel Distributed
Processing: Explorations in the Microstructure of Cognition, Volume 1,
Bradford Books (MIT Press), Cambridge, Massachusetts, 1985 (in press).
McCulloch, W. S., Embodiments of Mind, MIT Press, Cambridge, Massachusetts,
1965.
Minsky, M. and S. Papert, Perceptrons: An Introduction to Computational
Geometry, MIT Press, Cambridge, Massachusetts, 1969.
Minsky, M., "Plain Talk about Neurodevelopmental Epitemology", MIT Artificial
Intelligence Lab Memo #430, June 1977.
Minsky, M., "K-Lines: A Theory of Memory", MIT Artificial Intelligence Lab
Memo #516, June 1979.
Minsky, M., "Nature Abhors an Empty Vaccum", MIT Artificial Intelligence Lab
Memo #647, August 8, 1981.
Mozer, M. C., "The Perception of Multiple Objects: A Parallel, Distributed
Processing Approach", UCSD Thesis Proposal, Institute for Cognitive
Science, UCSD, La Jolla, California, August 1984.
Nass, M. M. and L. N. Cooper, "A Theory for the Development of Feature
Detecting Cells in Visual Cortex," Biological Cybernetics, vol. 19,
pp. 1-18, 1975.
Norman, "Categorization of Action Slips", Psychological Review, vol. 88, no. 1,
1981.
Palm, G., "On Associative Memory," Biological Cybernetics, vol. 36, pp. 19-31,
1980.
Plaut, D. C., "Visual Recognition of Simple Objects by a Connection Network,"
University of Rochester Technical Report #143, August 1984.
Rosenblatt, F., Principles of Neurodynamics, Spartan Books, New York, 1962.
Rumelhart, D. E. and D. Zipser, "Feature Discovery by Competitive Learning,"
Cognitive Science, vol. 9, pp. 75-112, 1985.
Sabbah, D., "Design of a Highly Parallel Visual Recognition System," IJCAI,
vol. 7, pp. 722-727, 1981.
Sabbah, D., "Computing with Connections in Visual Recognition of Origami
Objects," Cognitive Science, vol. 9, pp. 25-50, 1985.
Smolensky, P., "Schema Selection and Stochastic Inference in Modular
Environments", Proceedings of the AAAI-83, Washington, 1983.
Theriault, D. G., "Issues in the Design and Implementation of Act2", MIT
Artificial Intelligence Lab Technical Report #728, June 1983.
Touretzky, D. S. and G. E. Hinton, "Symbols Among the Neurons: Details of
a Connectionist Inference Architecture," IJCAI, vol. 9, pp. 238-243,
1985.
Uhr, L., "Recognition Cones and Some Test Results," in Computer Vision
Systems, ed. A. R. Hanson and E. M. Riseman, Academic Press, New York,
1978.
Uhr, L. and R. Douglass, "A Parallel-Serial Recognition Cone System for
Perception: Some Test Results," Pattern Recognition, vol. 11, pp. 29-39,
1979.
Van Lehn, K., "A Critique of the Connectionist Hypothesis that Recognition
Uses Templates, not Rules," Proceedings of the Sixth Annual Conference
of the Cognitive Science Society, 1984.
Willshaw, D. J., "A Simple Network Capable of Inductive Generalization,"
Proceedings of the Royal Society of London, vol. 182, pp. 233-247, 1972.
A list of contacts known to us who have taught or are interested in
teaching a similar course or seminar:
J. Barnden, Indiana U. Computer Science
G. Hinton, CMU Computer Science
D. Hofstadter, U. of Michigan Psychology
J. McClelland, CMU Psychology
D. Rumelhart, UCSD Psychology
P. Smolensky, U. of Colorado Computer Science
D. Touretzky, CMU Computer Science
------------------------------
End of AIList Digest
********************