Machine Learning List: Vol. 1 No. 10
Saturday, Oct 14, 1989

Contents:
Neurogammon wins Computer Olympiad
1990 Machine Learning Conference: Call for Papers
ML Journal
References on dealing with / using feature dependence in learning
5th Machine Learning Conference

The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
of Volume 1 may be FTP'd from /usr2/spool/ftp/pub/ml-list/V1/<N> or <N>.Z, where
N is the issue number. Host: ics.uci.edu; userid and password: anonymous.

----------------------------------------------------------------------
From: Rich Sutton <rich@gte.COM>
Subject: Neurogammon wins Computer Olympiad
Date: Wed, 4 Oct 89 11:59:37 EDT

Neurogammon 1.0 is a backgammon program which uses multi-layer
neural networks to make move decisions and doubling cube decisions.
The networks were trained by back-propagation on large expert
data sets. Neurogammon competed at the recently-held First
Computer Olympiad in London, and won the backgammon competition
with a perfect record of 5 wins and no losses. This is a victory
not only for neural networks, but for the entire machine learning
community, as it is apparently the first time in the history of
computer games that a learning program has ever won a tournament.
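
For readers who want a concrete picture of the general recipe (a feedforward
network trained by back-propagation on expert-labeled positions, with moves
chosen by scoring candidate successor positions), here is a minimal sketch in
Python/NumPy. It is not Neurogammon's actual design: the input encoding,
network size, and synthetic data below are illustrative assumptions only.

    # Minimal sketch of the general technique (not Tesauro's code): a one-hidden-
    # layer network trained by back-propagation to score positions, with moves
    # chosen by scoring each candidate successor position.
    import numpy as np

    rng = np.random.default_rng(0)

    N_IN, N_HID = 198, 20            # assumed input encoding size and hidden width
    W1 = rng.normal(0, 0.1, (N_HID, N_IN))
    b1 = np.zeros(N_HID)
    W2 = rng.normal(0, 0.1, N_HID)
    b2 = 0.0

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def score(x):
        """Forward pass: returns (hidden activations, scalar score in [0, 1])."""
        h = sigmoid(W1 @ x + b1)
        return h, sigmoid(W2 @ h + b2)

    def train_step(x, y, lr=0.1):
        """One back-propagation update toward the expert label y (1 = expert's move)."""
        global W1, b1, W2, b2
        h, out = score(x)
        d_out = (out - y) * out * (1 - out)      # squared-error loss, sigmoid output
        d_hid = d_out * W2 * h * (1 - h)
        W2 -= lr * d_out * h
        b2 -= lr * d_out
        W1 -= lr * np.outer(d_hid, x)
        b1 -= lr * d_hid

    # Synthetic stand-in for an expert data set: positions labeled 1 if the
    # expert chose the move leading to them, 0 otherwise.
    X = rng.integers(0, 2, (1000, N_IN)).astype(float)
    y = (X[:, :10].sum(axis=1) > 5).astype(float)
    for epoch in range(20):
        for xi, yi in zip(X, y):
            train_step(xi, yi)

    def choose_move(candidate_positions):
        """Pick the legal move whose resulting position the network scores highest."""
        return max(candidate_positions, key=lambda x: score(x)[1])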

A short paper describing Neurogammon and the Olympiad results will
appear in the next issue of Neural Computation. (This was inadvertently
omitted from Terry Sejnowski's recent announcement of the
contents of the issue.) The paper may also be obtained on-line in
plain text format by sending e-mail to TESAURO@ibm.com.

----------------------------------------------------------------------
From: "B. Porter and R. Mooney" <ml90@cs.utexas.EDU>
Date: Fri, 6 Oct 89 15:02:34 CDT
Subject: 1990 Machine Learning Conference: Call for Papers


SEVENTH INTERNATIONAL CONFERENCE ON MACHINE
LEARNING: CALL FOR PAPERS


The Seventh International Conference on Machine Learning will be
held at the University of Texas at Austin on June 21-23,
1990. Its goal is to bring together researchers from all
areas of machine learning. The conference will include
presentations of refereed papers, invited talks, and poster
sessions in all areas of machine learning.

REVIEW CRITERIA

In order to ensure high-quality papers, each submission will be
reviewed by two members of the program committee and judged on
clarity, significance, and originality. All submissions should
contain new work, new results, or major extensions to prior work.
If a paper describes a running system, it should explain that
system's representation of inputs and outputs, its performance
component, its learning methods, and its evaluation. In addition
to reporting advances in current areas of machine learning, authors
are encouraged to report results from exploring novel learning tasks.


SUBMISSION OF PAPERS

Each paper must have a cover page with the title, authors'
names, the primary author's address and telephone number, and an
abstract of about 200 words. The cover page should also give
three keywords that describe the research. Examples of keywords
include:

PROBLEM AREA            GENERAL APPROACH        EVALUATION CRITERIA

Concept learning        Genetic algorithms      Empirical evaluation
Learning and planning   Empirical methods       Theoretical analysis
Language learning       Explanation-based       Psychological validity
Learning and design     Connectionist
Machine discovery       Analogical reasoning

Papers are limited to 12 double-spaced pages (including figures
and references), formatted in a twelve-point font. The deadline
for submitting papers is February 1, 1990. Authors will be
notified of acceptance by Friday, March 23, 1990, and camera-ready
copy is due by April 23, 1990.


Send papers (3 copies) to:

    Machine Learning Conference
    Department of Computer Sciences
    Taylor Hall 2.124
    University of Texas at Austin
    Austin, Texas 78712-1188

For information, please contact:

    Bruce Porter or Raymond Mooney
    ml90@cs.utexas.edu
    (512) 471-7316


----------------------------------------------------------------------
Subject: ML Journal
Date: Tue, 10 Oct 89 13:33:30 -0700
From: Caroline Ehrlich <ehrlich@ICS.UCI.EDU>


MACHINE LEARNING JOURNAL

VOLUME 5 (1990)

CORRECT SUBSCRIPTION PRICE

In a recent brochure from the publisher, the individual subscription price
for the journal was incorrectly stated. The publisher regrets this error.

The correct individual subscription rate in North America for MACHINE LEARNING,
VOLUME FIVE (1990) is $50.

Volume 5 will include 2 special issues.

For further information, contact Carl Harris or Karen Cullen at:

Kluwer Academic Publishers
101 Philip Drive
Norwell, MA 02061

----------------------------------------------------------------------
Date: Fri, 13 Oct 89 17:34 EST
From: LEWIS@cs.umass.EDU
Subject: References on dealing with / using feature dependence in learning

I posted a query several issues ago seeking references on concept learning or
clustering methods that make use of a priori knowledge of dependencies between
features. The following is an edited list of the references and suggestions I
received, broken down by topics. Many thanks to Chuck Anderson, Larry Watanabe,
David Benjamin, John Stutz, and Bruce Croft for contributions.


David D. Lewis                                   ph. 413-545-0728
Information Retrieval Laboratory                 BITNET: lewis@umass
Computer and Information Science (COINS) Dept.   INTERnet: lewis@cs.umass.edu
University of Massachusetts, Amherst
Amherst, MA 01003
USA                                              UUCP: ...!uunet!cs.umass.edu!lewis@uunet.uu.net

********************************
BAYESIAN LEARNING:

...see Cheeseman et al. in ML-87 and AAAI-87...

...You might find G. Larry Bretthorst's Bayesian Spectrum Analysis and Parameter
Estimation, #48 in Springer-Verlag's Lecture Notes in Statistics, to be of
interest. Larry attacked the problem of learning the underlying frequencies in
time spectra by applying a full Bayesian analysis. He developed methods for
combinations of stationary frequencies, time-varying frequencies, and
exponentially decaying amplitudes. Although the subject area is outside what
most ML researchers study, the techniques are potentially of great value...

*******************************
FISHER/GENNARI SYSTEM:

...There are papers in this year's IJCAI and also in the Machine Learning
Workshop proceedings. The authors are Langley, Gennari, Fisher, and K. Thompson.
Fisher's thesis is available from the ICS department at UC Irvine, and he has
a paper in the 1987 volume of the Machine Learning Journal...

*******************************
MULTIPLE REGRESSION:

...I believe something could be gained by perusing the literature on
multiple regression. I have no experience with multiple regression, but
I believe it is the process of iteratively running regressions on sets of
variables that are changed between iterations by adding new variables;
these new variables are combinations of variables from the previous
regression that turn out to be correlated with the independent variable to
some degree. In other words, higher-order features are added that
encapsulate part of the regression function. This is a bit different from
what you are asking, in that variables that are correlated with the
independent variable, rather than with each other, are combined. Anyway,
the statistical literature on regression might be useful to you....
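
To make that idea concrete, here is one reasonable reading of the scheme as a
short sketch in Python/NumPy (it is not taken from any particular statistics
text): refit by least squares, then add the pairwise product of existing
variables that correlates most strongly with what the current fit fails to
explain. The data, the number of iterations, and the restriction to pairwise
products are illustrative assumptions.

    # Hedged sketch of iterative regression with constructed higher-order features.
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)
    n = 200
    X = rng.normal(size=(n, 3))
    y = 2 * X[:, 0] + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=n)  # hidden interaction

    def fit(design, target):
        """Ordinary least squares; returns coefficients and residuals."""
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        return beta, target - design @ beta

    features = [X[:, i] for i in range(X.shape[1])]
    for iteration in range(3):
        design = np.column_stack(features)
        beta, resid = fit(design, y)
        # Candidate higher-order features: pairwise products of current variables.
        candidates = [(i, j, features[i] * features[j])
                      for i, j in combinations(range(len(features)), 2)]
        # Add the candidate most correlated with what the current fit leaves unexplained.
        i, j, best = max(candidates,
                         key=lambda c: abs(np.corrcoef(c[2], resid)[0, 1]))
        features.append(best)
        print(f"iteration {iteration}: added x{i}*x{j}; "
              f"residual SS before adding it = {float(resid @ resid):.3f}")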


*******************************
CONSTRUCTIVE INDUCTION:

Larry Rendell - article in Machine Learning Journal (not many issues, so
shouldn't be hard to find).

R. S. Michalski & L. Watanabe - Constructive Closed-loop learning - Fundamental
ideas and examples (George Mason University Tech Report - AI Center)

Guiding Constructive Induction - L. Watanabe, IJCAI 87

George Drastal - some paper in IJCAI 89

PROTOS system - tech reports from the University of Texas at Austin - Bruce Porter and Weiss?

Katharina Morik - paper in Machine Learning Workshop 89

paper on MARVIN by Sammut & Banerji - Machine Learning V. II (book).

DUCE - paper by Steve Muggleton IJCAI 87

CIGOL - Stephen Muggleton - Machine Learning Workshop 88

In addition, there was an IJCAI workshop last year - 1st workshop on change of
representation and inductive bias

Chris Matheus - paper in IJCAI 89

********************************
INFORMATION RETRIEVAL

C. J. van Rijsbergen. (1977). A Theoretical Basis for the Use of
Co-occurrence Data in Information Retrieval. Journal of Documentation,
33(2), 106-119.

W. Bruce Croft. (1986). Boolean Queries and Term Dependencies in
Probabilistic Retrieval Models. Journal of the American Society for
Information Science, 37(2), 71-77.

David D. Lewis, W. Bruce Croft and Nehru Bhandaru (1989).
Language-Oriented Information Retrieval. International Journal of
Intelligent Systems.

********************************

TREE DEPENDENCE AND RELATED ISSUES

Wong, S.K.M.; Poon, F.C.S. "Comments on approximating discrete probability
distributions with dependence trees." IEEE Trans. Pattern Anal. Mach. Intell.
(USA) vol.11, no.3, pp. 333-335, March 1989.

Wang, D.C.C.; Wong, A.K.C. "Classification of Discrete Data with Feature
Space Transformation". IEEE Trans on Autom. Control. Vol AC-24, no. 3, pp
434-437, June 1979.

Wong, A.K.C.; Liu, T.S. "A Decision-Directed Clustering Algorithm for
Discrete Data" IEEE Trans. Comput. Vol. C-26, no. 1, pp. 75-82, Jan.
1977.

Wong, K.C.; Young, T.Y.; Liu, P.S. "Application of Pattern
Recognition Techniques to Discrete Clinical Data" Proceedings of the
1976 IEEE Conference on Decision and Control, pp. 158-161, 1976.

Chow, C.K.; Liu, C.N. "Approximating Discrete Probability Distributions
with Dependence Trees" IEEE Trans. on Information Theory. Vol. IT-14, No. 3,
May, 1968.
----------------------------------------------------------------------
From: cwi!uunet!watmath!alberta!pavan@PARIS.ICS.UCI.EDU
Subject: 5th Machine Learning Conference
Date: Thu, 5 Oct 89 17:33:52 MDT


I need the proceedings of the Fifth International Conference on Machine Learning,
held at Ann Arbor, Michigan, in 1988. How can I obtain a copy (publisher,
ordering information, etc.)?

Thanks,
Pavan

pavan@alberta.uucp
pavan@cs.ualberta.ca
[Since this is probably of general interest, I'll post one concise reply
in the next ML-LIST. - MJP]
----------------------------------------------------------------------

END of ML-LIST 1.10
