Machine Learning List: Vol. 2 No. 21
Wednesday, Oct 31, 1990

Contents:
MLJ: Special issue; discovery/invention
MLJ: Special issue; Symbolic Learning and Robotics.
Availability of decision tree learning software
IEEE Expert Special Track on Applications of Machine Learning
Machine Learning Journal Book Reviews
Data

The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
may be FTP'd from ics.uci.edu in /usr2/spool/ftp/pub/ml-list/V<X>/<N> or N.Z
where X and N are the volume and number of the issue; ID & password: anonymous

------------------------------
Date: Tue, 9 Oct 90 23:29 CST
From: ZYTKOW@wsuiar.wsu.ukans.edu
Subject: ML special issue; discovery/invention

Dear Colleague: October 9, 1990


I would like to announce that the deadline for submission of
papers has been extended to November 27, 1990 (Tuesday). Because
"Machine Learning" cannot publish the call for papers due to a lack of
space, please make one more effort to reach potential authors. The new
deadline allows more time for people who have learned about the
special issue only recently.

You may submit a paper any time before the deadline. Please send one
copy to me and four copies to Karen Cullen (Machine Learning):
Jan Zytkow
Department of Computer Science
Wichita State University
Wichita, KS 67208

Mrs. Karen Cullen
Machine Learning
Kluwer Academic Publishers
101 Philip Drive
Assinippi Park
Norwell, MA 02061

------------------------------
Date: Mon, 22 Oct 90 08:28:40 EDT
From: John Laird <laird@caen.engin.umich.edu>
Subject: Special Issue Reminder.

This is a reminder that I am editing a special issue of the Machine Learning
journal on Symbolic Learning and Robotics. The deadline for submissions is
November 15, 1990. For more information, contact laird@caen.engin.umich.edu.

John Laird

------------------------------
Date: Wed, 31 Oct 90 16:09:40 PST
From: Wray Buntine <wray@ptolemy.arc.nasa.GOV>
Subject: availability of decision tree learning software


I'm just putting out feelers. I wrote a rather comprehensive
decision tree learning package for my thesis. It contains a
reimplementation of all the major algorithms (CART, C4, MML/MDL)
and an extensive experiment control suite (randomised trials, cross
validation, statistical significance, ...). It also contains my
Bayesian algorithm which includes "option trees" and "tree averaging",
approaches that usually outclass the others.
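The idea behind tree averaging can be sketched roughly as follows: instead of committing to a single tree, the classifier combines the class distributions predicted by several candidate trees, weighting each by an approximation of its posterior probability. This is a minimal illustrative sketch, not the package's actual C code; the toy trees, weights, and helper name are invented for the example.

```python
def average_predictions(trees, weights, example):
    """Mix per-tree class distributions, weighted by (approximate) posterior."""
    total = sum(weights)
    mixed = {}
    for tree, w in zip(trees, weights):
        dist = tree(example)          # each tree maps an example -> {class: prob}
        for cls, p in dist.items():
            mixed[cls] = mixed.get(cls, 0.0) + (w / total) * p
    return mixed

# Two toy "trees" that disagree on an example; the first carries 3x the weight:
t1 = lambda x: {"yes": 0.9, "no": 0.1}
t2 = lambda x: {"yes": 0.2, "no": 0.8}
print(average_predictions([t1, t2], [3.0, 1.0], None))
# -> {'yes': 0.725, 'no': 0.275}
```

The averaged distribution hedges between the candidate trees rather than following whichever single tree happened to score best, which is where the robustness of the approach comes from.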

It would be a shame to put this on the shelf since I've never seen
anything else that comes near it in functionality, and being written in C,
it's fast.

If enough people are interested in obtaining a copy for research purposes,
I should be able to make the package available. This will require my neatening
up a few things. So please drop me a line via email if you're interested.

Wray Buntine
wray@ptolemy.arc.nasa.gov
(415) 604 3389

[Here's one vote for making this available -- MP]
--------------------------

TREE PACKAGE SUMMARY

The package is written in C and runs on SUNs; it hasn't been compiled
on other UNIX machines. It comes with man entries, examples of use, etc.

The package integrates several previous algorithms:
- re-implementation of most of CART (Breiman, Friedman, Olshen and Stone)
- re-implementation of (earlier) C4 (Quinlan, IJMMS)
- re-implementation of much of Wallace's minimum message length algorithm
  (this is a "corrected" version of Quinlan & Rivest's MDL algorithm,
  and is rather like Rissanen's algorithm in US Patent 4,719,571)
- multiple trees (Kwok and Carter, Uncertainty in AI 4)
- the original and still the best (possibly, the only) implementation
  of Buntine's tree averaging and option trees

These are integrated into a system that offers:
- different splitting rules:
    GINI, information gain, MML, Bayes
    (but no subsetting, unfortunately)
- different pruning rules:
    cost complexity with test-set or cross validation
    Bayes averaging
    minimum message length
    minimum error pruning
- different methods for handling missing values
- ...
- a full-featured classifier that computes:
    actual and predicted error rates
    confusion matrix
    half-Brier score
    different cost functions (utilities) than minimum errors
    etc.
- an experiment control package:
    controlled, repeated trials
    report generation
    statistics
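The splitting rules listed above can be sketched concretely. The following is an illustrative example (again, not the package's C code) of two of them, GINI and information gain, computed from class counts at a node; the function names are chosen for the example.

```python
import math

def gini(counts):
    """GINI impurity: 1 - sum of squared class proportions."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def entropy(counts):
    """Class entropy in bits: -sum(p * log2(p))."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

def information_gain(parent, children):
    """Parent entropy minus the size-weighted entropy of the child nodes."""
    n = sum(parent)
    return entropy(parent) - sum(sum(ch) / n * entropy(ch) for ch in children)

# A balanced 16-example node split perfectly into two pure children
# gains a full bit of information:
print(gini([8, 8]))                                # 0.5
print(information_gain([8, 8], [[8, 0], [0, 8]]))  # 1.0
```

A splitting rule is just a score like these evaluated for every candidate test; the tree-growing loop picks the test that minimizes impurity (GINI) or maximizes gain, which is why the package can swap rules without changing the rest of the machinery.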


------------------------------
Date: Wed, 31 Oct 90 17:42:32 -0500
From: "Alberto M. Segre" <segre@cs.cornell.edu>
Subject: IEEE Expert Special Track on Applications of Machine Learning


Call for Papers
IEEE Expert Special Track on Applications of Machine Learning


The goal of this series of articles is to explore novel
applications of machine learning technology to real-world
problems. Of special interest are papers that clearly illustrate
how machine learning techniques can be profitably used to solve
problems of interest to AI practitioners.

Machine learning research has enjoyed enormous growth since
the early 1970's. Much of the early work was directed towards
understanding and developing inductive learning algorithms for
concept formation. While the technology developed in this era has
been transferred to commercial applications (e.g., many expert
system tools now include modules for inductive learning of
classification rules), little has been said about applications of
more recent research results in concept formation, explanation-
based learning, clustering, or discovery learning.

Audience

IEEE Expert's charter is to try to transfer to the user
community ideas and tools that come out of research and
development; clear, not overly formal, writing is essential. Its
readers are users, developers, managers, researchers, and
purchasers who are interested in databases, expert systems, and
artificial intelligence, with particular emphasis on
applications. They want to learn about the tools, techniques,
concepts, aids, and systems that have potential for real-world
applications.

Special Tracks

A special track is a collection of papers united by a theme;
it is just like a special issue except that the articles appear
over several issues. Other special tracks in progress cover AI
and object-oriented programming, AI applications in process
systems, functional modeling of devices, and connectionist
applications.

Submissions

All manuscripts submitted for publication should be
original. Articles published in other journals will not be
considered. Submissions should run 8 to 10 journal pages. For
full submission guidelines contact:

Alberto Maria Segre [segre@cs.cornell.edu]
Department of Computer Science
Cornell University
Upson Hall
Ithaca, NY 14853-7501

------------------------------
Date: Wed, 31 Oct 90 17:51:30 -0500
From: "Alberto M. Segre" <segre@cs.cornell.edu>
Subject: Machine Learning Journal Book Reviews

Announcement:
Machine Learning Journal Book Reviews


Machine Learning is soliciting volunteer reviewers for
recent monographs in machine learning. The Journal is currently
including short, archival-quality, reviews of current books in
the field; the first set of reviews will begin appearing in
Volume 1 Number 6.

The ideal reviewer is a machine-learning researcher, faculty
member, or advanced graduate student. Reviewers should have some
previous exposure to the field and will be asked to provide a
short statement relating their qualifications to review a
particular book. Once selected, reviewers will be supplied with
the book of their choosing, and will be expected to produce a
three to five page review for the Journal. Book authors will be
given an opportunity to read the review and respond in print.

Books selected for review should be recent (within the last
five years) releases relevant to the field of machine learning.
For more information, contact:

Alberto Maria Segre [segre@cs.cornell.edu]
Department of Computer Science
Cornell University
Upson Hall
Ithaca, NY 14853-7501

------------------------------
Date: Wed, 24 Oct 90 18:50:58 PDT
From: Pat Langley <langley@ptolemy.arc.nasa.GOV>
Subject: experimental joke

I thought all of you might appreciate this, which I just heard today:

Q: What's the singular form of the word "data"?

A: "anecdote".
------------------------------
END of ML-LIST 2.21
