Machine Learning List: Vol. 3 No. 17
Friday, Sept 27, 1991
Contents:
COBWEB in C?
PPSN-92: Call for Papers
Morgan Kaufmann ML-LIST special offer
The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
may be FTP'd from ics.uci.edu in pub/ml-list/V<X>/<N> or N.Z where X and N are
the volume and number of the issue; ID: anonymous PASSWORD: <your mail address>
------------------------------
From: "Ron D. Appel" <appel@cih.hcuge.ch>
Subject: Fisher's COBWEB in C
Date: Fri, 20 Sep 1991 09:51:38 +0200
Does a C implementation of Fisher's COBWEB exist?
Ron D. Appel
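[Note: For readers who have not seen it, COBWEB (Fisher, 1987) builds its
concept hierarchy incrementally and chooses among its operators by
maximizing the category utility of the resulting partition. The sketch
below shows only that core computation in C; it is not Fisher's
implementation, and the array sizes and names are made up for illustration.

    /* Category utility of a partition into NCLASS child classes:
     *
     *   CU = (1/NCLASS) * sum_k P(C_k) *
     *        [ sum_{i,j} P(A_i=V_ij | C_k)^2 - sum_{i,j} P(A_i=V_ij)^2 ]
     *
     * p_class[k]      = P(C_k)
     * p_cond[k][i][j] = P(A_i = V_ij | C_k)
     * p_marg[i][j]    = P(A_i = V_ij)
     */
    #define NCLASS 3   /* hypothetical number of child classes */
    #define NATTR  4   /* hypothetical number of attributes    */
    #define NVAL   5   /* hypothetical values per attribute    */

    double category_utility(const double p_class[NCLASS],
                            const double p_cond[NCLASS][NATTR][NVAL],
                            const double p_marg[NATTR][NVAL])
    {
        int i, j, k;
        double marg = 0.0, cu = 0.0;

        /* expected score for guessing attribute values with no class info */
        for (i = 0; i < NATTR; i++)
            for (j = 0; j < NVAL; j++)
                marg += p_marg[i][j] * p_marg[i][j];

        /* gain in expected score given each class, weighted by P(C_k) */
        for (k = 0; k < NCLASS; k++) {
            double within = 0.0;
            for (i = 0; i < NATTR; i++)
                for (j = 0; j < NVAL; j++)
                    within += p_cond[k][i][j] * p_cond[k][i][j];
            cu += p_class[k] * (within - marg);
        }
        return cu / NCLASS;
    }

COBWEB evaluates this score for each of its options (placing the new
instance in an existing class, creating a new class, merging, or
splitting) and keeps the partition that scores highest.]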
------------------------------
Date: Thu, 26 Sep 91 17:30:18 +0200
From: Bernard Manderick <bernard@arti1.vub.ac.be>
Subject: PPSN-92: Call for Papers
Call for Papers
PPSN 92
Parallel Problem Solving from Nature
Free University Brussels, Belgium
28-30 September 1992
The unifying theme of the PPSN conference is "natural computation",
i.e., the design, the theoretical and empirical understanding, and the
comparison of algorithms gleaned from nature, as well as their
application to real-world problems in science, technology, etc.
Characteristic of natural computation is the metaphorical use of
concepts, principles, and mechanisms that explain natural systems.
Examples are genetic algorithms, evolution strategies, algorithms
based on neural networks, immune networks, and so on. A first focus
of the conference is on problem solving in general, and learning and
adaptiveness in particular. Since natural systems usually operate in a
massively parallel way, a second focus is on parallel algorithms and
their implementations.
The conference scope includes but is not limited to the following topics:
- Physical metaphors such as simulated annealing,
- Biological metaphors such as evolution strategies, genetic
  algorithms, immune networks, classifier systems, and neural networks,
  insofar as problem solving, learning, and adaptability are concerned, and
- Transfer of other natural metaphors to artificial problem solving.
Conference Address:
PPSN - p/a D. Roggen - Dienst WEIN -
Vrije Universiteit Brussel - Pleinlaan 2 - B-1050 Brussels - Belgium
tel. +32/2/641.35.75
fax +32/2/641.28.70
email ppsn@arti.vub.ac.be
------------------------------
Date: Fri, 20 Sep 91 17:46:55 PDT
From: Morgan Kaufmann <morgan@unix.sri.COM>
Subject: Morgan Kaufmann ML-LIST special offer
[Note: If you know of other publishers who would be willing to
discount ML books to readers of ML-LIST, you can have them contact me.
An offer from Lawrence Erlbaum Associates will probably appear in
the next issue. - Mike]
MORGAN KAUFMANN PUBLISHERS
MACHINE LEARNING LIST SPECIAL OFFER
PRICES SHOWN REFLECT 10% DISCOUNT
GOOD THROUGH OCTOBER 31, 1991
MACHINE LEARNING: A THEORETICAL APPROACH, by Balas K. Natarajan
(Hewlett-Packard Laboratories), July 1991; ISBN 1-55860-148-1;
$38.66
This is the first comprehensive introduction to computational
learning theory. The author's uniform presentation of fundamental
results and their applications offers AI researchers a theoretical
perspective on the problems they study. The book presents tools
for the analysis of probabilistic models of learning, tools that
crisply classify what is and is not efficiently learnable. After
a general introduction to Valiant's PAC paradigm and the important
notion of the Vapnik-Chervonenkis dimension, the author explores
specific topics such as finite automata and neural networks. The
presentation is intended for a broad audience; the author's
ability to motivate and pace discussions for beginners has been
praised by reviewers. Each chapter contains numerous examples and
exercises, as well as a useful summary of important results. An
excellent introduction to the area, suitable either for a first
course, or as a component in general machine learning and advanced
AI courses. Also an important reference for AI researchers.
Introduction * Learning Concepts on Countable Domains * Time-
Complexity of Concept Learning * Learning Concepts on Uncountable
Domains * Learning Functions * Finite Automata * Neural Networks
* Generalizing the Learning Model * Conclusion * Notation *
Bibliography
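[As a sample of the flavor of the results the book covers (this is the
standard bound from the PAC literature, not a quotation from the book):
a concept class of Vapnik-Chervonenkis dimension d can be PAC-learned by
any algorithm that outputs a consistent hypothesis from a training
sample of size

    m = O\left(\frac{1}{\epsilon}\left(d \log\frac{1}{\epsilon}
        + \log\frac{1}{\delta}\right)\right),

where epsilon is the desired accuracy and delta the allowed failure
probability.]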
FOUNDATIONS OF GENETIC ALGORITHMS, Edited by Gregory J. E. Rawlins
(Indiana University), July 1991, ISBN 1-55860-170-8; $41.36
Genetic Algorithms (GAs) are becoming an important tool in machine
learning research. GAs have been applied to problems such as
design of semiconductor layout and factory control, and have been
used in AI systems and neural networks to model processes of
cognition such as language processing and induction. They are the
principal heuristic search method of classifier systems, and they
have been used on NP-hard combinatorial optimization problems.
Although much is known about their basic behavior, there are many
aspects of GAs that have not been rigorously defined or studied
formally. This book addresses the need for a principled approach
to understanding the foundations of genetic algorithms and
classifier systems as a way of enhancing their further development
and application. Each paper presents original research, and most
are accessible to anyone with general training in computer science
or mathematics. This book will be of interest to a variety of
fields including machine learning, neural networks, theory of
computation, mathematics and biology.
Contributors: Clayton L. Bridges, David E. Goldberg, Yuval
Davidor, Carol A. Ankenbrandt, Kalyanmoy Deb, Gilbert Syswerda,
Steven Y. Goldsmith, Delores M. Etter, Robert E. Smith, Thomas H.
Westerdale, Gunar E. Liepins, Lashon Booker, Rick Riolo, John R.
Koza, H. James Antonisse, Alden H. Wright, Ping-Chung Chi, Petrus
Handoko, Darrell Whitley, J. David Schaffer, Larry J. Eshelman,
Daniel Offutt, Piet Spiessens, Jon T. Richardson, David L. Battle,
Michael D. Vose, Stephanie Forrest, Melanie Mitchell, John
Grefenstette, Larry J. Eshelman, Barry R. Fox, Mary Beth McMahon,
Kenneth De Jong, William Spears, Heinz Muehlenbein
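[To make the description above concrete, here is a minimal generational
genetic algorithm in C on the toy "OneMax" problem (maximize the number
of 1-bits in a fixed-length bit string). It is only an illustrative
sketch; the operators shown (binary tournament selection, one-point
crossover, bit-flip mutation) and all parameter values are conventional
choices, not taken from the book.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define POP   40      /* population size        */
    #define LEN   32      /* chromosome length      */
    #define GENS  60      /* number of generations  */
    #define PMUT  0.02    /* per-bit mutation rate  */

    static int fitness(const char *c) {          /* count 1-bits (OneMax) */
        int i, f = 0;
        for (i = 0; i < LEN; i++) f += c[i];
        return f;
    }

    static int tournament(char pop[POP][LEN]) {  /* binary tournament selection */
        int a = rand() % POP, b = rand() % POP;
        return fitness(pop[a]) >= fitness(pop[b]) ? a : b;
    }

    int main(void) {
        char pop[POP][LEN], next[POP][LEN];
        int g, i, j;
        srand((unsigned) time(NULL));

        for (i = 0; i < POP; i++)                /* random initial population */
            for (j = 0; j < LEN; j++)
                pop[i][j] = rand() % 2;

        for (g = 0; g < GENS; g++) {
            for (i = 0; i < POP; i++) {
                int p1 = tournament(pop), p2 = tournament(pop);
                int cut = rand() % LEN;          /* one-point crossover */
                for (j = 0; j < LEN; j++)
                    next[i][j] = (j < cut) ? pop[p1][j] : pop[p2][j];
                for (j = 0; j < LEN; j++)        /* bit-flip mutation */
                    if ((double) rand() / RAND_MAX < PMUT)
                        next[i][j] = !next[i][j];
            }
            memcpy(pop, next, sizeof pop);       /* next generation replaces current */
        }

        {
            int best = 0;
            for (i = 1; i < POP; i++)
                if (fitness(pop[i]) > fitness(pop[best])) best = i;
            printf("best fitness after %d generations: %d/%d\n",
                   GENS, fitness(pop[best]), LEN);
        }
        return 0;
    }

Real applications replace the bit-counting fitness function with a
problem-specific evaluation, which is usually where most of the work
lies.]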
CONCEPT FORMATION: KNOWLEDGE AND EXPERIENCE IN UNSUPERVISED
LEARNING, Edited by Douglas Fisher (Vanderbilt University) and
Michael Pazzani (University of California, Irvine), July 1991; ISBN
1-55860-201-1; $35.96
Concept formation lies at the center of learning and cognition.
Unlike much work in machine learning and cognitive psychology,
research on this topic focuses on the unsupervised and incremental
acquisition of conceptual knowledge. Recent work on concept
formation addresses a number of important issues. Foremost among
these are the principles of similarity that guide concept learning
and retrieval in human and machine, including the contribution of
surface features, goals, and `deep' features. Another active area
of research explores mechanisms for efficiently reorganizing memory
in response to the ongoing experiences that confront intelligent
agents. Finally, methods for concept formation play an increasing
role in work on problem solving and planning, developmental
psychology, engineering applications, and constructive induction.
This book brings together results on concept formation from
cognitive psychology and machine learning, including explanation-
based and inductive approaches. Chapters from these differing
perspectives are intermingled to highlight the commonality of
their research agendas. In addition to cognitive scientists and
AI researchers, the book will interest data analysts involved in
clustering, philosophers concerned with the nature and origin of
concepts, and any researcher dealing with issues of similarity,
memory organization, and problem solving.
Computational Models of Concept Learning * An Iterative Bayesian
Algorithm for Categorization * Representational Specificity and
Learning * Discrimination Net Models of Concept Formation *
Concept Formation in Structured Domains * Theory-Driven Concept
Formation * Explanation-Based Learning as Concept Formation * Some
Influences of Instance Comparisons in Concept Formation * Harpoons
and Long Sticks: Theory and Similarity in Rule Induction * Concept
Formation over Problem-Solving Experiences * Concept Formation in
Context * The Formation and Use of Abstract Concepts in Design *
Learning to Recognize Movements * Representation Generation in an
Exploratory Learning System * Q-SOAR: A Computational Account of
Children's Learning About Number Conservation
GENETIC ALGORITHMS: PROCEEDINGS OF THE FOURTH INTERNATIONAL
CONFERENCE (1991), Edited by Richard K. Belew (Univ. of
California, San Diego), Lashon Booker (Naval Research Laboratory)
and J. David Schaffer (Philips Laboratories), July 1991;
ISBN 1-55860-208-9; $33.26
Also Available: Proceedings of the Third Int'l. Conference
(1989): Edited by J. David Schaffer (Philips Laboratories), 1989,
ISBN 1-55860-066-3; 452 pages; $31.46
This volume contains the papers presented at the Fourth
International Conference on Genetic Algorithms. The papers, each
written by a leading researcher in the field, reflect the growth
and diversity of this field. Topics include: Holland's Genetic
Algorithm and Classifier Systems, machine learning and
optimization using these systems, relations to other learning
paradigms (such as connectionist networks), parallel
implementations, related biological modeling issues and practical
applications.
MACHINE LEARNING: PROCEEDINGS OF THE EIGHTH INTERNATIONAL WORKSHOP
(1991), edited by Lawrence Birnbaum and Gregg Collins (Both of
Northwestern University), June 1991; ISBN 1-55860-200-3; $31.46
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS, 3, edited by
Richard P. Lippmann (MIT Lincoln Labs), John E. Moody (Yale
University), and David S. Touretzky (Carnegie Mellon University)
Volume 3: May 1991; 1200 pages; cloth; ISBN 1-55860-184-8; $44.96
Volume 2: 1990; 853 pages; cloth; ISBN 1-55860-100-7; $32.36
Volume 1: 1989; 819 pages; cloth; ISBN 1-55860-015-9; $32.36
Research in neural networks comprises results from many
disciplines. These volumes contain the collected papers from the
premier forum for neural networks research: the IEEE Conferences
on Neural Information Processing Systems - Natural and Synthetic,
held annually in Denver, Colorado. Papers are rigorously
evaluated for scientific merit and revised for publication.
Topics from these books include: rules and connectionist models,
speech, vision, neural network dynamics, neurobiology,
computational complexity issues, fault tolerance in neural
networks, benchmarking and comparing neural network applications,
architectural issues, fast training techniques, VLSI, control,
optimization, statistical inference, and genetic algorithms.
CONNECTIONIST MODELS: PROCEEDINGS OF THE 1990 SUMMER SCHOOL
WORKSHOP, edited by David S. Touretzky (Carnegie Mellon
University), Jeffrey L. Elman (University of California, San
Diego), Terrence J. Sejnowski (Salk Institute) and Geoffrey E.
Hinton (University of Toronto), 1990; 404 pages; paper; ISBN
1-55860-156-2; $26.96
1988 Proceedings also available: edited by David S. Touretzky
(Carnegie Mellon University), Geoffrey Hinton (University of
Toronto), and Terrence J. Sejnowski (Salk Institute), 1988; 527
pages; paper; ISBN 1-55860-035-3; $17.96
The Connectionist Models Summer Schools bring together
distinguished researchers and outstanding graduate students to
evaluate current research results at the forefront of
connectionist models in neural networks. The papers, rigorously
selected from faculty and student entries, have been updated and
revised to incorporate workshop discussions, as well as the
authors' and editors' interaction. The selections serve to
summarize a wide variety of efforts in VLSI design, optimization
methods, learning theory, vision, speech, neuroscience,
linguistics, and cognitive psychology. This collection, like its
successful predecessor, will be a valuable reference for
researchers and students.
CASE BASED REASONING: PROCEEDINGS OF THE 1991 DARPA WORKSHOP, May
1991; 500 pages; paper; ISBN 1-55860-199-6; $31.50
COLT 1991: Proceedings of the Fourth Annual Workshop on
Computational Learning Theory, edited by Leslie Valiant (Harvard
University) and Manfred Warmuth (University of California, Santa
Cruz), July 1991; 395 pages; Paper; ISBN 1-55860-146-5; $31.46.
MACHINE LEARNING: AN ARTIFICIAL INTELLIGENCE APPROACH, VOLUME III,
edited by Yves Kodratoff (French National Scientific Research
Council) and Ryszard Michalski (George Mason University). June
1990; 825 pages; Cloth; ISBN 1-55860-119-8 $44.96.
General Issues * Empirical Learning Methods * Analytical Learning
Methods * Integrated Learning Systems * Subsymbolic Learning
Systems * Formal Analysis
READINGS IN MACHINE LEARNING, edited by Jude Shavlik (University
of Wisconsin, Madison) and Thomas Dietterich (Oregon State
University). June 1990; 853 pages; Paper; ISBN 1-55860-143-0;
$35.96.
General Aspects of Machine Learning * Inductive Learning Using Pre-
Classified Training Examples * Unsupervised Concept Learning and
Discovery * Improving the Efficiency of a Problem Solver * Using
Pre-Existing Domain Knowledge Inductively * Explanatory/Inductive
Hybrids
COMPUTATIONAL MODELS OF SCIENTIFIC DISCOVERY AND THEORY FORMATION,
edited by Jeff Shrager (Xerox PARC) and Pat Langley (NASA Ames
Research Center). June 1990; 498 pages; Cloth; ISBN 1-55860-131-7;
$35.96.
Introduction * The Conceptual Structure of the Geologic Revolution
* A Unified Analogy Model of Explanation and Theory Formulation *
An Integrated Approach to Empirical Discovery * Deriving Basic Laws
by Analysis of Process and Equations * Theory Formation by
Abduction: Initial Results of a Case Study Based on the Chemical
Revolution * Diagnosing and Fixing Faults in Theories * Hypothesis
Formation as Design * Towards a Computational Model of Theory
Revision * Evaluation of KEKADA as an AI Program * Scientific
Discovery in the Lay Person * Designing Good Experiments to Test
Bad Hypotheses * On Finding the Most Probable Model * Commonsense
Perception and the Psychology of Theory Formation * Five Questions
for Computationalists
COMPUTER SYSTEMS THAT LEARN: CLASSIFICATION AND PREDICTION METHODS
FROM STATISTICS, NEURAL NETS, MACHINE LEARNING AND EXPERT SYSTEMS,
by Sholom Weiss and Casimir Kulikowski (Both of Rutgers
University). October 1990; approx. 250 pages; Cloth; ISBN
1-55860-065-5; $35.96.
Overview of Learning Systems * How to Estimate the True Performance
of a Learning System * Statistical Pattern Recognition * Neural
Nets * Machine Learning: Easily Understood Decision Rules * Which
Technique is Best? * Expert Systems
A GENERAL EXPLANATION-BASED LEARNING MECHANISM AND ITS APPLICATION
TO NARRATIVE UNDERSTANDING, by Raymond J. Mooney (University of
Texas, Austin). 1989; 243 pages; Paper; ISBN 1-55860-091-4,
$23.96.
EXTENDING EXPLANATION-BASED LEARNING BY GENERALIZING THE STRUCTURE
OF EXPLANATIONS, by Jude W. Shavlik (University of Wisconsin,
Madison). 1990; 219 pages; Paper; ISBN 1-55860-109-0, $26.96.
MATHEMATICAL FOUNDATIONS OF LEARNING MACHINES, by Nils Nilsson
(Stanford University), with a New Introduction by Terrence J.
Sejnowski (Salk Institute) and Hal White (University of California,
San Diego). 1990; 138 pages; Paper; ISBN 1-55860-123-6; $17.96.
Introduction by Terrence J. Sejnowski and Hal White * Trainable
Pattern Classifiers * Some Important Discriminant Functions: Their
Properties and Their Implementations * Parametric Training Methods
* Some Nonparametric Training Methods for Learning Machines *
Training Theorems * Layered Machines * Piecewise Linear Machines
* Appendix
=================================================================
MORGAN KAUFMANN PUBLISHERS
MACHINE LEARNING LIST SPECIAL OFFER
10% DISCOUNT THROUGH OCTOBER 31, 1991
ORDER FORM
Please send me the following books:
No. Copies Author/Title ISBN# Price
----------------------------------------------------------------
----------------------------------------------------------------
----------------------------------------------------------------
----------------------------------------------------------------
----------------------------------------------------------------
----------------------------------------------------------------
SUBTOTAL: ___________
CALIFORNIA RESIDENTS ADD APPROPRIATE SALES TAX: ___________
SHIPPING & HANDLING: ___________
U.S. -- add $3.50 for 1st book, $2.50 for each additional;
Foreign -- add $6.50 for 1st book; $3.50 for each additional:
TOTAL: _____________
Check enclosed _____ Charge my VISA _____ MasterCard _____
Account #__________________________________ Expires ___________
Signature ________________________________
Phone_______________________
Name as on card_____________________________________
Send books to:
__________________________________________________________
__________________________________________________________
__________________________________________________________
__________________________________________________________
Send form to:
Morgan Kaufmann, 2929 Campus Drive, Suite 260, Dept. E6, San Mateo,
CA 94403. Telephone Orders: (800) 745-7323 (US and Canada) or
(415) 578-9911; Fax: (415) 578-0672.
------------------------------
END of ML-LIST 3.17