Machine Learning List: Vol. 4 No. 10
Saturday, April 25, 1992

Contents:
ML92 ONR Bursaries announcement
Ninth International Machine Learning Conference (ML92) REMINDER
Symposium on Goal-Driven Learning
Announcing the availability of a neural network hyperplane animator
Artificial Intelligence and Statistics Workshop
New NIST OCR Database
NIST database of fingerprints
NIST SPECIAL DATABASE 2



The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
may be FTP'd from ics.uci.edu in pub/ml-list/V<X>/<N> or N.Z where X and N are
the volume and number of the issue; ID: anonymous PASSWORD: <your mail address>
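
For readers unfamiliar with anonymous FTP, the retrieval described above can
be scripted roughly as follows (a minimal sketch using Python's standard
ftplib module; the volume and issue numbers shown are placeholders, and some
back issues are stored compressed as N.Z):

from ftplib import FTP

ftp = FTP("ics.uci.edu")                    # archive host named above
ftp.login("anonymous", "you@your.site")     # password: your mail address
ftp.cwd("pub/ml-list/V4")                   # V<X>: volume directory
with open("10", "wb") as f:                 # <N>: issue number
    ftp.retrbinary("RETR 10", f.write)
ftp.quit()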

----------------------------------------------------------------------

Date: Fri, 8 May 92 21:13:05 BST
From: Derek Sleeman <sleeman@computing-science.aberdeen.ac.UK>
Subject: ML92 ONR Bursaries announcement


ML92 CONFERENCE
ABERDEEN, JULY 1-4 1992

The US Office of Naval Research has indicated it will make a grant to assist
US Graduate students to attend this meeting. It is likely that each bursary
will be worth around 300-400 US dollars.

Priority for Bursaries will be given to those students who are presenting
papers and posters at the main Conference; second priority will be given to
those contributing to a Workshop; third priority will be given to other
students attending the Conference.


If you would like to be considered for a Bursary, please have your
advisor/supervisor write me a letter (no email please), or a fax followed by a
copy sent by airmail.


Bursaries will be assigned on a first-come, first-served basis. However, formal
notification of Bursaries will not be sent out until we have received the
appropriate authorization from ONR.



Applications & Enquiries to:

Prof Derek SLEEMAN (ONR Bursaries)
Computing Science Department
KING's COLLEGE
The University
ABERDEEN AB9 2UE
Scotland UK

Telephone +44 224 272288/95
FAX +44 224 273422

------------------------------

Date: Mon, 11 May 92 16:39:08
From: ML 92 <ml92@computing-science.aberdeen.ac.UK>
Subject: Ninth International Machine Learning Conference (ML92) REMINDER


** REMEMBER **

DEADLINE FOR NORMAL ML92 REGISTRATION FEE - MAY 29
******
(LATE FEE APPLIES AFTER THIS DATE)


------------------------------

Date: Thu, 7 May 92 14:06:42 EDT
From: Ashwin Ram <ashwin@cc.gatech.EDU>
Subject: Symposium on Goal-Driven Learning


Symposium Announcement

Goal-Driven Learning

To be held at the
1992 Conference of the Cognitive Science Society
Indiana University, Bloomington, Indiana

Conference dates: July 29-August 1
Symposium date: July 30 or August 1


Topic description:

In Artificial Intelligence, Psychology, and Education, a growing body of
research supports the view that learning is a goal-directed process.
Experimental studies show that people with different goals process
information differently; work in machine learning presents functional
arguments for goal-based focusing of learner effort. The symposium
brings together researchers from diverse research areas to discuss
issues in how learning goals arise, how they affect learner decisions of
when and what to learn, and how they guide the learning process.


Organizers:

David Leake, Indiana University (leake@cs.indiana.edu)
Ashwin Ram, Georgia Tech (ashwin@cc.gatech.edu)


Participants and areas:

Larry Barsalou, Psychology, University of Chicago
(Goals and tasks in category formation and learning)

David Leake, Computer Science, Indiana University
(Task-driven explanation)

Ryszard Michalski, Computer Science, George Mason University
(Learning goals in machine learning)

Evelyn Ng, Education, Simon Fraser University
(Goal orientation and the design of instruction)

Paul Thagard, Cognitive Science/Philosophy, Princeton University
(Goals in analogy and problem solving)

Ashwin Ram, Computer Science, Georgia Tech
(Meta-reasoning about knowledge goals for learning)


Format of the symposium:

The first hour will be spent on presentations of short position papers.
The remaining 40 minutes will be open discussion among the panelists
and with the audience.


Conference registration information:

Conference registration brochures are available from:
Candace Shertzer
Cognitive Science Program
Indiana University
(812) 855-4658
cshertze@silver.ucs.indiana.edu

------------------------------

Date: Tue, 5 May 92 16:24:21 EDT
From: pratt@cs.rutgers.EDU
Subject: Announcing the availability of a neural network hyperplane animator


Lori Pratt and Paul Hoeper
Computer Science Dept
Rutgers University

Understanding neural network behavior is an important goal of many
research efforts. Although several projects have sought to translate
neural network weights into symbolic representations, an alternative
approach is to understand trained networks graphically. Many
researchers have used a display of hyperplanes defined by the weights
in a single layer of a back-propagation neural network. In contrast to
some network visualization schemes, this approach shows both the
training data and the network parameters that attempt to fit those
data. At NIPS 1990, Paul Munro presented a video which demonstrated
the dynamics of hyperplanes as a network changes during learning. This
video was based on a program implemented for SGI workstations.

At NIPS 1991, we demonstrated an X-based hyperplane animator, similar
in appearance to Paul Munro's, but with extensions to allow for
interaction during training. The user may speed up, slow down, or
freeze animation, and set various other parameters. Also, since it
runs under X, this program should be more generally usable.

This program is now being made available to the public. The
remainder of this message contains more details of the hyperplane
animator and ftp information.


1. What is the Hyperplane Animator?

The Hyperplane Animator is a program that allows easy graphical display
of Back-Propagation training data and weights in a Back-Propagation neural
network.

Back-Propagation neural networks consist of processing nodes interconnected
by adjustable, or ``weighted'' connections. Neural network learning consists
of adjusting weights in response to a set of training data. The weights
w1, w2, ..., wn on the connections into any one node, together with the node's
bias term, can be viewed as the coefficients in the equation of an
(n-1)-dimensional plane. Each non-input node in the neural net is thus
associated with its own plane. These hyperplanes
are graphically portrayed by the hyperplane animator. On the same graph it
also shows the training data.
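
As a concrete illustration (a minimal sketch, not code taken from the
animator itself; the weights and data below are made up), a hidden unit with
two inputs, incoming weights w1 and w2, and bias b defines the line
w1*x1 + w2*x2 + b = 0, which can be drawn over the training points:

import numpy as np
import matplotlib.pyplot as plt

def plot_hyperplane(w, b, xlim=(-0.5, 1.5), **style):
    """Draw the line w[0]*x1 + w[1]*x2 + b = 0 over a 2-D input space."""
    x1 = np.linspace(xlim[0], xlim[1], 100)
    if abs(w[1]) > 1e-12:                    # general case: solve for x2
        plt.plot(x1, -(w[0] * x1 + b) / w[1], **style)
    else:                                    # degenerate case: vertical line
        plt.axvline(-b / w[0], **style)

# Hypothetical training data (XOR) and hand-picked weights, for illustration.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

plt.scatter(X[:, 0], X[:, 1], c=y)
plot_hyperplane(np.array([5.0, 5.0]), -2.5)   # one hidden unit's hyperplane
plot_hyperplane(np.array([5.0, 5.0]), -7.5)   # a second hidden unit's hyperplane
plt.show()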


2. Why use it?

As learning progresses and the weights in a neural net alter, hyperplane
positions move. At the end of the training they are in positions that
roughly divide training data into partitions, each of which contains only
one class of data. Observations of hyperplane movement can yield valuable
insights into neural network learning.
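
A crude stand-in for what the animator shows (again only a sketch, under the
assumption of a tiny 2-2-1 XOR network; matplotlib takes the place of the
animator's X display): train with back-propagation and redraw the
input-to-hidden hyperplanes every few hundred epochs so their movement can be
watched.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])                   # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)    # input -> hidden
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)    # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

plt.ion()
for epoch in range(5001):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)      # backward pass (squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    if epoch % 500 == 0:                     # periodically redraw hyperplanes
        plt.clf()
        plt.scatter(X[:, 0], X[:, 1], c=y.ravel())
        x1 = np.linspace(-0.5, 1.5, 50)
        for j in range(W1.shape[1]):         # one line per hidden unit
            if abs(W1[1, j]) > 1e-12:
                plt.plot(x1, -(W1[0, j] * x1 + b1[j]) / W1[1, j])
        plt.title("epoch %d" % epoch)
        plt.pause(0.1)

plt.ioff()
plt.show()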

3. How to install the Animator.

Although we've successfully compiled and run the hyperplane animator on
several platforms, it is still not a stable program. It also only
implements some of the functionality that we eventually hope to include.
In particular, it only animates hyperplanes representing input-to-hidden
weights. It does, however, allow the user to change some aspects of
hyperplane display (color, line width, aspects of point labels, speed of
movement, etc.), and allows the user to freeze hyperplane movement for
examination at any point during training.

How to install the hyperplane animator:

1. copy the file animator.tar.Z to your machine via ftp as follows:

ftp cs.rutgers.edu (128.6.25.2)
Name: anonymous
Password: (your ID)
ftp> cd pub/hyperplane.animator
ftp> binary
ftp> get animator.tar.Z
ftp> quit

2. Uncompress animator.tar.Z

3. Extract files from animator.tar with:
tar -xvf animator.tar

4. Read the README file there. It includes instructions for running
a number of demonstration networks that are included with this
distribution.

DISCLAIMER:
This software is distributed as shareware, and comes with no warranties
whatsoever for the software itself or systems that include it. The authors
deny responsibility for errors, misstatements, or omissions that may or
may not lead to injuries or loss of property. This code may not be sold
for profit, but may be distributed and copied free of charge as long as
the credits window, copyright statement in the ha.c program, and this notice
remain intact.


------------------------------

Date: Mon, 11 May 92 12:12:39 PDT
From: Wray Buntine <wray@ptolemy.arc.nasa.GOV>
Subject: Artificial Intelligence and Statistics Workshop

Historically, there's been good involvement by the machine learning
people in this workshop. I'm told Doug Fisher might be giving
a tutorial on AI for statisticians, and there will also be a
tutorial on statistics for AI people.

Wray Buntine


*****************************************************************

Call For Papers
Fourth International Workshop on

Artificial Intelligence
and
Statistics

January 3-6, 1993
Ft. Lauderdale, Florida

*****************************************************************

PURPOSE:
This is the fourth in a series of workshops which has
brought together researchers in Artificial Intelligence and in
Statistics to discuss problems of mutual interest. The result has
been an unqualified success. The exchange has broadened research
in both fields and has strongly encouraged interdisciplinary work.

This workshop will have as its primary theme:

``Selecting models from data''

Papers on other aspects of the interface between A.I. & Statistics
are *strongly* encouraged as well (see TOPICS below).

FORMAT:
To encourage interaction and a broad exchange of ideas, the
presentations will be limited to 18 discussion papers in single
session meetings over the three days. Focussed poster sessions
will provide the means for presenting and discussing the remaining
research papers.
Attendance at the workshop will *not* be limited.

The three days of research presentations will be preceded by a day
of tutorials. These are intended to expose researchers in each
field to the methodology used in the other field.

LANGUAGE:
The language will be English.

TOPICS OF INTEREST:

The fourth workshop has a primary theme of

``Selecting models from data''.

At least one third of the workshop schedule will be set aside for
papers with this theme. We particularly encourage papers
on the following topics:
- model selection
- model search
- model validation
- integrated man-machine modelling methods
- software tools and environments for the above.

Other themes will be developed according to the strength of the
papers in other areas of the interface between AI & Statistics.
We strongly encourage research papers on the following areas as
well:

- empirical discovery and statistical methods for knowledge
acquisition
- probability and search
- uncertainty propagation
- combined statistical and qualitative reasoning
- inferring causation
- quantitative programming tools and integrated software for
data analysis and modelling.
- discovery in databases
- meta data and design of statistical data bases
- automated data analysis and knowledge representation for
statistics
- machine learning
- clustering and concept formation.


SUBMISSION REQUIREMENTS:
Three copies of an extended abstract (up to four pages) should be
sent by air mail to

P. Cheeseman, Programme Chair
4th Int'l Workshop on AI & Stats
NASA Ames Research Center
MS 269-2
Moffett Field
CA 94035
USA

or electronically (latex documents preferred) to either

ai-stats@watstat.waterloo.edu
or
ai-stats@watstat.uwaterloo.ca

Submissions for discussion papers (and poster presentations) will
be considered if postmarked by June 30, 1992. If the submission
is electronic (e-mail), then it must be *received* by midnight
June 30, 1992.
Abstracts received after this date but *before* July 31, 1992,
will be considered for poster presentation *only*.

Please indicate which topic(s) your abstract addresses and include
an electronic mail address for correspondence.
Acceptance notices will be mailed by September 1, 1992.
Preliminary papers (up to 20 pages) must be returned by November 1,
1992. These preliminary papers will be copied and distributed at
the workshop.

PROGRAM COMMITTEE:

General Chair: R.W. Oldford U. of Waterloo, Canada

Programme Chair: P. Cheeseman NASA (Ames), USA

Members:
W. Buntine NASA (Ames), USA
Wm. DuMouchel USA
D.J. Hand Open University, UK
W.A. Gale AT&T Bell Labs, USA
D. Lubinsky AT&T Bell Labs, USA
M. McLeish U. of Guelph, Canada
E. Neufeld U. of Saskatchewan, Canada
J. Pearl UCLA, USA
D. Pregibon AT&T Bell Labs, USA
P. Shenoy U. of Kansas, USA
P. Smythe JPL, USA


SPONSORS:
Society for Artificial Intelligence And Statistics
International Association for Statistical Computing



------------------------------

Subject: New NIST OCR Database
From: Mike Garris x2928 <mdg@magi.ncsl.nist.gov>
Date: Fri, 10 Apr 92 08:23:11 -0500

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY

Announces a New Database

+-----------------------------+
| "NIST Special Database 3" |
+-----------------------------+

Binary Images of Handwritten Segmented Characters
(HWSC)


The NIST database of handwritten segmented characters contains 313,389
isolated character images segmented from the 2,100 full-page images
distributed with "NIST Special Database 1". The database includes the
2,100 pages of binary, black and white, images of hand-printed numerals
and text. This significant new database contains 223,125 digits, 44,951
upper-case, and 45,313 lower-case character images. Each character image
has been centered in a separate 128 by 128 pixel region and has been
assigned a classification which has been manually corrected so that the
error rate of the segmentation and assigned classification is less than
0.1%. The uncompressed database totals approximately 2.75 gigabytes of
image data and includes image format documentation and example software.

"NIST Special Database 3" has the following features:
+ 313,389 isolated character images including classifications
+ 223,125 digits, 44,951 upper-case, and 45,313 lower-case images
+ 2,100 full-page images
+ 12 pixel per millimeter resolution
+ image format documentation and example software

Suitable for automated hand-print recognition research, the database
can be used for:
+ algorithm development
+ system training and testing

The database is a valuable tool for training recognition systems on a
large statistical sample of hand-printed characters. The system
requirements are a 5.25" CD-ROM drive with software to read ISO-9660
format.

If you have any further technical questions please contact:

Michael D. Garris
mdg@magi.ncsl.nist.gov
(301)975-2928 (new number!)

If you wish to order the database, please contact:

Standard Reference Data
National Institute of Standards and Technology
221/A323
Gaithersburg, MD 20899
(301)975-2208
(301)926-0416 (FAX)


------------------------------

Subject: NIST database of fingerprints
From: Craig Watson <craig@magi.ncsl.nist.gov>
Date: Fri, 10 Apr 92 17:23:36 -0500

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY

Announces a New Database

+-----------------------------+
| "NIST Special Database 4" |
+-----------------------------+

8-Bit Gray Scale Images of Fingerprint Image Groups
(FIGS)


The NIST database of fingerprint images contains 2000 8-bit gray scale
fingerprint image pairs. Each image is 512 by 512 pixels with 32 rows of
white space at the bottom and is classified into one of the following five
classes: A=Arch, L=Left Loop, R=Right Loop, T=Tented Arch, W=Whorl. The
database is evenly distributed over each of the five classifications with
400 fingerprint pairs from each class. The images are compressed using a
modified JPEG lossless compression algorithm and require approximately
636 Megabytes of storage compressed and 1.1 Gigabytes uncompressed (1.6 :
1 compression ratio). The database also includes format documentation and
example software.
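
As a rough consistency check on those storage figures (illustrative
arithmetic only, assuming the 2,000 pairs mean 4,000 raw 512 by 512
single-byte images with no file headers):

images = 2000 * 2                  # 2,000 pairs -> 4,000 fingerprint images
raw_bytes = images * 512 * 512     # 8-bit pixels, 1 byte each
print(raw_bytes / 1e9)             # ~1.05, close to the quoted 1.1 gigabytes
print(raw_bytes / 636e6)           # ~1.65, close to the quoted 1.6:1 ratio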


"NIST Special Database 4" has the following features:

o 2000 8-bit gray scale fingerprint image pairs including
classifications
o 400 fingerprint pairs from each of the five classifications: Arch,
Left and Right Loops, Tented Arch, and Whorl
o each fingerprint pair consists of two completely different
rollings of the same fingerprint
o 19.6850 pixels per millimeter resolution
o image format documentation and example software


Suitable for automated fingerprint classification research, the database
can be used for:

o algorithm development
o system training and testing


The database is a valuable tool for evaluating fingerprint systems on a
statistical sample of fingerprints which is evenly distributed over the
five major classifications. The system requirements are a 5.25" CD-ROM
drive with software to read ISO-9660 format.


If you have any further technical questions please contact:

Craig I. Watson
craig@magi.ncsl.nist.gov
(301)975-4402


If you wish to order the database, please contact:

Standard Reference Data
National Institute of Standards and Technology
Bldg. 221/A323
Gaithersburg, MD 20899
(301)975-2208
(301)926-0416 (FAX)


------------------------------

Subject: NIST SPECIAL DATABASE 2
From: Darrin Dimmick X4147 <dld@magi.ncsl.nist.gov>
Date: Mon, 13 Apr 92 08:09:43 -0500


NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY

Announces a New Database

+-----------------------------+
| "NIST Special Database 2" |
+-----------------------------+

Structured Forms Reference Set
(SFRS)

The NIST database of structured forms contains 5,590 full page images of
simulated tax forms completed using machine print. THERE IS NO REAL TAX
DATA IN THIS DATABASE. The structured forms used in this database are 12
different forms from the 1988 IRS 1040 Package X. These include Forms
1040, 2106, 2441, 4562, and 6251, together with Schedules A, B, C, D, E, F
and SE. Eight of these forms contain two pages or form faces, making a
total of 20 form faces represented in the database.

Each image is stored in bi-level black and white raster format. The
images in this database appear to be real forms prepared by individuals,
but they have been automatically derived and synthesized using a
computer and contain no "real" tax data. The entry field values on the
forms have been automatically generated by a computer in order to make
the data available without the danger of distributing privileged tax
information.

In addition to the images the database includes 5,590 answer files, one
for each image. Each answer file contains an ASCII representation of the
data found in the entry fields on the corresponding image. Image format
documentation and example software are also provided.

The uncompressed database totals approximately 5.9 gigabytes of data.

"NIST Special Database 2" has the following features:
+ 5,590 full-page images
+ 5,590 answer files
+ 12 pixel per millimeter resolution
+ image format documentation and example software

Suitable for automated document processing system research and
development, the database can be used for:
+ algorithm development
+ system training and testing

The system requirements are a 5.25" CD-ROM drive with software to read
ISO-9660 format.

If you have any further technical questions please contact:


Darrin L. Dimmick
dld@magi.ncsl.nist.gov
(301)975-4147

If you wish to order the database, please contact:

Standard Reference Data
National Institute of Standards and Technology
221/A323
Gaithersburg, MD 20899
(301)975-2208
(301)926-0416 (FAX)


------------------------------

End of ML-LIST 4.10 (Digest format)
****************************************
