Machine Learning List: Vol. 5 No. 17
Friday, August 13, 1993

Contents:
New Additions to the UCI Machine Learning Repository
CFP: Machine Learning Special Issue
MOBAL 2.2 available via ftp
workshop announcement for ML-List
Genetic programming announcement
CLNL'93 Schedule



The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
may be FTP'd from ics.uci.edu in pub/ml-list/V<X>/<N> or <N>.Z where X and N are
the volume and number of the issue; ID: anonymous PASSWORD: <your mail address>
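The back-issue path scheme above can be sketched as a small helper (illustrative only; the host, login, and directory layout are exactly as stated above, but this function is not part of the list's tooling):

```python
# Illustrative helper encoding the back-issue path scheme described above:
# pub/ml-list/V<X>/<N>, with a compressed variant <N>.Z.

def back_issue_path(volume, number, compressed=False):
    """Return the repository path for a given volume and issue number."""
    name = f"{number}.Z" if compressed else str(number)
    return f"pub/ml-list/V{volume}/{name}"

print(back_issue_path(5, 17))        # pub/ml-list/V5/17
print(back_issue_path(5, 17, True))  # pub/ml-list/V5/17.Z
```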

----------------------------------------------------------------------

Subject: New Additions to the UCI Machine Learning Repository
Date: Sun, 08 Aug 1993 19:59:54 -0700
From: "Patrick M. Murphy" <pmurphy@focl.ICS.UCI.EDU>

The following is a list of databases that have recently been
added to the UCI Machine Learning Repository.

Any comments or donations would be greatly appreciated
(ml-repository@ics.uci.edu).

Patrick M. Murphy (Site Librarian)
David W. Aha (Off-Site Assistant)

1. Water Treatment Plant Data
(donated by Javier Bejar and Ulises Cortes)

This dataset comes from the daily measures of sensors in an urban waste
water treatment plant. The objective is to classify the operational
state of the plant in order to predict faults through the state
variables of the plant at each of the stages of the treatment process.
This domain has been characterized as an ill-structured domain.
38 numeric attributes; 527 instances; some missing values.

2. Challenger Space Shuttle O-Ring Data
(donated by David Draper)

5 attributes; 23 instances; 2 databases

Edited from (Draper, 1993):
The motivation for collecting this database was the explosion of the
USA Space Shuttle Challenger on 28 January, 1986. An investigation
ensued into the reliability of the shuttle's propulsion system. The
explosion was eventually traced to the failure of one of the three field
joints on one of the two solid booster rockets. Each of these six field
joints includes two O-rings, designated as primary and secondary, which
fail when phenomena called erosion and blowby both occur.
The night before the launch a decision had to be made regarding
launch safety. The discussion among engineers and managers leading to
this decision included concern that the probability of failure of the
O-rings depended on the temperature t at launch, which was forecast to
be 31 degrees F. There are strong engineering reasons based on the
composition of O-rings to support the judgment that failure
probability may rise monotonically as temperature drops. One other
variable, the pressure s at which safety testing for field joint leaks
was performed, was available, but its relevance to the failure process
was unclear.
Draper's paper includes a menacing figure graphing the number of field
joints experiencing stress vs. liftoff temperature for the 23 shuttle
flights previous to the Challenger disaster. No previous liftoff
temperature was under 53 degrees F. Although tremendous extrapolation
must be done from the given data to assess risk at 31 degrees F, even
a layman can "foresee the unacceptably high risk created by launching
at 31 degrees F." For more information, see
Draper (1993) or the other previous analyses.
The task is to predict the number of O-rings that will experience
thermal distress for a given flight when the launch temperature is
below freezing.
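As a rough illustration of this prediction task, one can fit a least-squares line of distress count against launch temperature. The numbers below are made up for the sketch and are not the actual Challenger data:

```python
# Illustrative sketch of the task above: fit y = a + b*x by ordinary
# least squares, where x is launch temperature (F) and y is the number
# of O-rings experiencing thermal distress. Data points are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical (temperature, distress count) pairs, for illustration only.
temps = [53, 57, 63, 70, 70, 75, 78, 81]
rings = [3, 1, 1, 1, 0, 0, 0, 0]

a, b = fit_line(temps, rings)
print(b < 0)  # True: distress count falls as temperature rises
```

Note that any such fit still extrapolates far below the observed temperature range to reach 31 degrees F, which is precisely the difficulty Draper's paper highlights.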

------------------------------

Date: Wed, 11 Aug 93 14:18:22 EDT
From: gordon@aic.nrl.navy.MIL
Subject: CFP: Machine Learning Special Issue

CALL FOR PAPERS

Special Issue on
BIAS EVALUATION AND SELECTION

Guest Editors: Marie desJardins and Diana Gordon

This special issue of Machine Learning will be devoted to
research on biases in machine learning, with an emphasis on
research that addresses both the evaluation and the selec-
tion of biases. By "evaluation" we mean empirical or
analytical methods for studying the impact of biases on
learning performance, based on problem space characteris-
tics. These characteristics might include background
knowledge, information about the source of the data, or pro-
perties of the learning agent. By "selection" we mean the
use of the evaluation results to choose, either statically
or dynamically, the most appropriate bias(es) for a given
problem.

Topics of interest include, but are not limited to:

o Determination of the regions of appropriateness for
"important" biases, i.e., biases that occur frequently in
machine learning systems. Two examples of "important"
biases are a preference for simpler hypotheses and a
preference for hypotheses of a particular form (e.g.,
decision trees or DNF or exemplars).

o Determination of when certain bias shifting methods,
which are themselves biases, are most appropriate.

o Novel methods for evaluating biases. For example, one
might use relatively unexplored problem space charac-
teristics (e.g., confidence levels in the source of
training data) to determine which bias to select.

o Discussion of bias interactions and their impact on
learning.

Inquiries may be addressed to Marie desJardins at
marie@erg.sri.com, or to Diana Gordon at
gordon@aic.nrl.navy.mil. Submissions must be received by
November 15, 1993, and must comply with the submission
guidelines published in Machine Learning. Shorter technical
notes are invited, as well as full-length papers of up to
12,000 words.

Two copies of the submitted manuscript should be mailed to:
Diana Gordon
Naval Research Laboratory, Code 5514
4555 Overlook Avenue S.W.
Washington D.C. 20375-5337 USA

In addition, an electronic PostScript version would be help-
ful.

Five additional copies of each submitted manuscript must be
mailed to:
Karen Cullen
Attn: Special Issue on Bias
MACHINE LEARNING Editorial Office
Kluwer Academic Publishers
101 Philip Drive
Assinippi Park
Norwell, MA 02061 USA

Papers will be subject to the standard review process.
Please forward this announcement to interested colleagues.


------------------------------

From: Werner Emde <werner.emde@gmd.de>
Subject: MOBAL 2.2 available via ftp
Date: Thu, 12 Aug 93 12:12:57 +0200


The knowledge acquisition and machine learning system MOBAL (release 2.2)
is available free for non-commercial academic use from the anonymous
ftp-server 'ftp.gmd.de' in the directory 'gmd/mlt/Mobal'. The system
requires a Sun SparcStation, SunOS 4.1, OpenWindows 2.0, and HyperNeWS
1.4; the latter can be obtained by sending mail to newsdev@turing.ac.uk.
By agreement with Turing Institute, HyperNeWS 1.4 is now also available
from our server in the directory 'gmd/mlt/HyperNeWS'.


About MOBAL
___________
Mobal is a sophisticated system for developing operational models of
application domains. It integrates a manual knowledge acquisition and
inspection environment, a powerful inference engine, machine learning
methods for automated knowledge acquisition, and a knowledge revision
tool.

By using Mobal's knowledge acquisition environment, you can incrementally
develop a model of your domain in terms of logical facts and rules. You
can inspect the knowledge you have entered in text or graphics windows,
augment the knowledge, or change it at any time. The built-in inference
engine can immediately execute the rules you have entered to show you the
consequences of your inputs, or answer queries about the current
knowledge. Mobal also builds a dynamic sort taxonomy from your inputs.
If you wish, you can use machine learning methods to automatically
discover additional rules based on the facts that you have entered, or to
form new concepts. If there are contradictions in the knowledge base due
to incorrect rules or facts, there is a knowledge revision tool to help
you locate the problem and fix it.


Changes since Mobal 2.0
_______________________
MOBAL release 2.2 offers some interesting new features and a number of
small improvements making it worth your while to replace older releases
of the system.

1. MOBAL as an ILP toolbox

Since the very first releases, MOBAL has been a kind of toolbox offering
different tools (including the Rule Discovery Tool RDT) to support the
modeling of domains. Now we have coupled MOBAL with some other well
known ILP systems. The new release contains interfaces to, and the code
of, the following learning systems:

- FOIL 5 (from J. Ross Quinlan),
- GOLEM (from S. Muggleton and C. Feng),
- mFOIL (from S. Dzeroski and I. Bratko),
- CILGG (from J.-U. Kietz), and
- INCY (from E. Sommer).

Many thanks to the authors of the programs for their kind permission to
include the systems in the new release!

The user can select any of these tools (and RDT, of course) within MOBAL.
The knowledge base is automatically translated into the format required
by the called algorithm and the resulting rules are translated back into
MOBAL's format. As the coupling is achieved by defining input/output
filters for each tool, third party implementations can be used without
modifications. Furthermore, MOBAL 2.2 is open to be coupled to other
learning systems.
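The filter idea described above can be sketched roughly as follows. The fact representation and Prolog-style syntax here are invented for illustration and are not MOBAL's actual interface:

```python
# Hypothetical sketch of filter-based coupling: each external learner
# gets an input filter (facts -> tool format) and an output filter
# (tool output -> facts), so the tool itself needs no modification.

def facts_to_prolog(facts):
    """Input filter: render (predicate, args) facts as Prolog-style atoms."""
    return [f"{p}({', '.join(args)})." for p, args in facts]

def prolog_to_facts(lines):
    """Output filter: parse Prolog-style atoms back into (predicate, args)."""
    out = []
    for line in lines:
        head, rest = line.rstrip(".").split("(", 1)
        out.append((head, [a.strip() for a in rest.rstrip(")").split(",")]))
    return out

facts = [("parent", ["tom", "ann"]), ("parent", ["ann", "sue"])]
encoded = facts_to_prolog(facts)
assert prolog_to_facts(encoded) == facts  # round-trip through the filters
```

The design point is that the translation lives entirely in the filter pair, which is why third-party implementations can be plugged in unchanged.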

2. Integrity Constraints

The new release offers the possibility to state integrity constraints.
MOBAL checks whether all constraints are satisfied either continuously or
on demand. Violations are placed on the system's agenda and can be
resolved by the user at a convenient time.
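A minimal sketch of on-demand constraint checking in this style (the API is invented for illustration, not MOBAL's): each constraint is a predicate over the fact base, and violated constraints land on an agenda for later resolution.

```python
# Illustrative on-demand integrity checking: run every constraint over
# the fact base and queue the names of violated ones on an agenda.

def check_constraints(facts, constraints):
    """Return the agenda of violated constraint names."""
    agenda = []
    for name, holds in constraints:
        if not holds(facts):
            agenda.append(name)
    return agenda

facts = {("parent", "tom", "ann"), ("parent", "ann", "tom")}
constraints = [
    ("parent_is_acyclic",
     # violated if parent ever holds in both directions
     lambda fs: not any(("parent", b, a) in fs for (_, a, b) in fs)),
]
print(check_constraints(facts, constraints))  # ['parent_is_acyclic']
```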

3. Programmer's Interface

MOBAL 2.2 includes a Programmer's Interface, which gives access to the
full range of MOBAL's knowledge representation and inference, knowledge
acquisition and learning facilities.


User Guide
__________
MOBAL's User Guide has been completely reworked and extended for the new
release. A compressed PostScript version can be found in the MOBAL
directory.

Acknowledgements
________________
Mobal 2.2 is a result of research funded by the European Community within
the type B ESPRIT Project 2154 "Machine Learning Toolbox" and the ESPRIT
Project "Inductive Logic Programming" (ILP, PE 6020) and is based on the
BLIP system developed in the project "Lerner" at the Technical University
Berlin funded by the German government (BMFT) under contract ITW8501B1.

Mobal is being developed with Quintus Prolog 3.1.1 on a Sun4. We would
like to thank the Quintus Corporation for their support in making this
runtime version of MOBAL possible.


Restrictions
____________
MOBAL is available in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of FITNESS FOR A PARTICULAR
PURPOSE.

MOBAL can be used free of charge for academic, educational, or
non-commercial uses. We do require, however, that you send us mail
(addresses below) so we know where MOBAL is going. This will also get
you access to MobalNews, our mailing list where we let all registered
MOBAL users know about updates, bug fixes, etc.


E-Mail:
mobal@gmdzi.gmd.de

Project MLT
GMD (German National Research Center for Computer Science)
AI Research Division (I3.KI)
Schloss Birlinghoven
D 53757 St. Augustin
Germany

Fax: +49/2241/14-2889

------------------------------

Subject: EuroSoar Workshop-7
Date: Mon, 9 Aug 93 15:19:29 BST
From: "Frank E. Ritter" <ritter@psychology.nottingham.ac.UK>

WORKSHOP ANNOUNCEMENT AND CALL FOR PAPERS/PARTICIPATION

EuroSoar Workshop-7

19th-21st November 1993
Department of Psychology, University of Nottingham
Nottingham, England

The Seventh European Workshop on Soar will be held in Nottingham,
England on the weekend of 19th-21st November 1993. The Workshop will
be of interest to researchers in Europe actively working with Soar or
interested in doing so. Soar is a unified theory of cognition
realized as a problem-space architecture in an AI production system
language. Events will include invited and submitted talks reporting
recent Soar research in Europe and North America, and demonstrations
of Soar work.

People travelling from outside Nottingham will probably want
to stay in Nottingham on the Friday night as well as the Saturday
night, as the workshop will start with dinner at 7pm on Friday night,
and finish at 3pm on Sunday.

Attendance will be limited to 50 participants. Preference will be
given to active Soar researchers and those who submit their forms
promptly.

An introductory tutorial will be held on the afternoon of Friday 19th
November, intended to enable newcomers to Soar to benefit from and
contribute to the main part of the workshop. The tutorial will
include sessions on the basic concepts of Soar, current research
issues, a demonstration of Soar in use, and a survey of current Soar
activity worldwide. Given a sufficient level of ability on the part of
the participants, it may attempt to teach very basic Soar programming.

For further information and a registration form contact before 10
September 1993:

Frank Ritter, EuroSoar-7
Dept. of Psychology
U. of Nottingham
Nottingham NG7 2RD
England

Tel: +44 (602) 515 292
Fax: +44 (602) 515 324
Email: Ritter@psyc.nott.ac.uk

------------------------------

Date: Thu, 12 Aug 93 13:57:23 PDT
From: John Koza <koza@cs.stanford.EDU>
Subject: Genetic programming announcement




CALL FOR PAPERS

GENETIC PROGRAMMING TRACK

June 26 (Sunday) to June 29 (Wednesday), 1994
Walt Disney World Dolphin Hotel
Lake Buena Vista, Florida

At the IEEE CONFERENCE ON
EVOLUTIONARY COMPUTATION

As part of the IEEE WORLD CONGRESS ON
COMPUTATIONAL INTELLIGENCE

Sponsored by the IEEE Neural Networks Council

Papers are being solicited for a 3-day Genetic Programming track at the
IEEE Conference on Evolutionary Computation. The IEEE Conference on
Evolutionary Computation will be held as part of the larger and longer
IEEE World Congress on Computational Intelligence sponsored by the IEEE
Neural Networks Council to be held from June 26 (Sunday) to July 2
(Saturday), 1994. The multi-conference will include the 1994 IEEE
Conference on Neural Networks and the FUZZ/IEEE '94 conference as well
as numerous tutorials on genetic programming, genetic algorithms,
evolutionary computation, neural networks, and fuzzy logic.

Topics:
Theoretical and applied aspects of genetic programming,
Tierra, and other systems for evolving computer programs.

Lee Altenberg, Duke University
Peter J. Angeline, Ohio State University
Robert J. Collins, U. S. Animation Inc.
Kenneth E. Kinnear, Jr., Sun Microsystems, Inc.
John Koza, Stanford University
Craig Reynolds, Electronic Arts Inc.
James P. Rice, Knowledge Systems Laboratory
Walter Alden Tackett, Univ of Southern California

__________________________________________________________________

IEEE CONFERENCE ON EVOLUTIONARY COMPUTATION

Zbigniew Michalewicz, General Chair
zbyszek@mosaic.uncc.edu
Topics:

Genetic algorithms (GA), genetic programming (GP), evolution strategies
(ES), evolution programming (EP), classifier systems, theory of
evolutionary computation, evolutionary computation applications,
efficiency and robustness comparisons with other direct search
algorithms, parallel computer applications, new ideas incorporating
further evolutionary principles, artificial life, evolutionary
algorithms for computational intelligence, comparisons between
different variants of evolutionary algorithms, machine learning
applications, evolutionary computation for neural networks, and fuzzy
logic in evolutionary algorithms.


__________________________________________________________________

FOR CONFERENCE REGISTRATION MATERIALS,
DETAILED PROGRAM INFORMATION,
HOTEL AND TRAVEL INFORMATION,
(AS AVAILABLE)
CONTACT
World Congress on Computational Intelligence
Meeting Management Inc.
5665 Oberlin Drive, Suite 110
San Diego, California 92121, USA
Telephone: 619-453-6222
FAX: 619-535-3880
E-MAIL: 70750.345@compuserve.com

__________________________________________________________________

INSTRUCTIONS TO AUTHORS

Papers must be received in San Diego by:

Friday December 10, 1993.

All accepted papers will be published in the Conference Proceedings.
Papers will be reviewed by senior researchers in the field, and all
authors will be informed of the decisions at the end of the review
process. Six copies (one original and five copies) of the paper must
be submitted. The original must be camera-ready, on 8.5 x 11-inch white
paper, in one-column format in Times or a similar font style, 10 points or
larger, with one-inch margins on all four sides. Do not fold or staple
the original camera-ready copy. Four pages are encouraged. The paper
must not exceed six pages including figures, tables, and references,
and should be written in English. Centered at the top of the first
page should be the complete title, author name(s), affiliation(s) and
physical mailing address(es), and electronic mailing address(es) if
available.

In the accompanying letter, the following information must be included:

1) Full title of paper,
2) Corresponding author's name, physical address, telephone,
electronic mail address (if available), and fax numbers,
3) First and second choices of technical session (specify
"GENETIC PROGRAMMING" here),
4) Preference for oral or poster presentation, and
5) Presenter's name, physical address, electronic mail address
(if available), telephone and fax numbers.

Mail papers and accompanying letter to (and/or obtain further
information from):

World Congress on Computational Intelligence,
Meeting Management
5665 Oberlin Drive, #110
San Diego, California 92121, USA

(e-mail: 70750.345@compuserve.com,
telephone: 619-453-6222).

__________________________________________________________________

For information on two other concurrent events:

IEEE CONFERENCE ON NEURAL NETWORKS
Steven K. Rogers, General Chair
rogers@afit.af.mil

FUZZ/IEEE '94
Piero P. Bonissone, General Chair
bonissone@crd.ge.ge.com

__________________________________________________________________



------------------------------

From: Russell Greiner <greiner@learning.siemens.COM>
Date: Mon, 9 Aug 93 15:00:24 EDT
Subject: CLNL'93 Schedule


***********************************************************
* CLNL'93 -- Computational Learning and Natural Learning *
* Provincetown, Massachusetts *
* 10-12 September 1993 *
***********************************************************

CLNL'93 is the fourth of an ongoing series of workshops designed to bring
together researchers from a diverse set of disciplines, including
computational learning theory, AI/machine learning,
connectionist learning, statistics, and control theory,
to explore issues at the intersection of theoretical learning research
and natural learning systems.

The schedule of presentations appears below, followed by logistics and
information on registration.

================ ** CLNL'93 Schedule (tentative) ** =======================

Thursday 9/Sept/93:
6:30-9:00 Ferry (optional): Boston to Provincetown
[departs Boston Harbor Hotel, 70 Rowes Wharf on Atlantic Avenue]

Friday 10/Sept/93 [CLNL meetings, at Provincetown Inn]
9 - 9:15 Opening remarks
9:15-10:15 Scaling Up Machine Learning: Practical and Theoretical Issues
Thomas Dietterich [Oregon State Univ]
(invited talk, see abstract below)

10:30-12:30 Paper session 1
What makes derivational analogy work: an experience report using APU
Sanjay Bhansali [Stanford]; Mehdi T. Harandi [Univ of Illinois]
Scaling Up Strategy Learning: A Study with Analogical Reasoning
Manuela M. Veloso [CMU]
Learning Hierarchies in Stochastic Domains
Leslie Pack Kaelbling [Brown]
Learning an Unknown Signalling Alphabet
Edward C. Posner, Eugene R. Rodemich [CalTech/JPL]

12:30- 2 Lunch (on own)

Unscheduled TIME
( Whale watching, beach walking, ... )
( Poster set-up time; Poster preview (perhaps) )

Dinner (on own)

7 - 10 Poster Session [16 posters]
(Hors d'oeuvres)
Induction of Verb Translation Rules from Ambiguous Training and a
Large Semantic Hierarchy
Hussein Almuallim, Yasuhiro Akiba, Takefumi Yamazaki, Shigeo Kaneda
[NTT Network Information Systems Lab.]
What Cross-Validation Doesn't Say About Real-World Generalization
Gunnar Blix, Gary Bradshaw, Larry Rendell [Univ of Illinois]
Efficient Learning of Regular Expressions from Approximate Examples
Alvis Brazma [Univ of Latvia]
Capturing the Dynamics of Chaotic Time Series by Neural Networks
Gustavo Deco, Bernd Schurmann [Siemens AG]
Learning One-Dimensional Geometrical Patterns Under One-Sided Random
Misclassification Noise
Paul Goldberg [Sandia National Lab]; Sally Goldman [Washington Univ]
Adaptive Learning of Feedforward Control Using RBF Network ...
Dimitry M Gorinevsky [Univ of Toronto]
A practical approach for evaluating generalization performance
Marjorie Klenin [North Carolina State Univ]
Scaling to Domains with Many Irrelevant Features
Pat Langley, Stephanie Sage [Siemens Corporate Research]
Variable-Kernel Similarity Metric Learning
David G. Lowe [Univ British Columbia]
On-Line Training of Recurrent Neural Networks with Continuous
Topology Adaptation
Dragan Obradovic [Siemens AG]
N-Learners Problem: System of PAC Learners
Nageswara Rao, E.M. Oblow [Engineering Systems/Advanced Research]
Soft Dynamic Programming Algorithms: Convergence Proofs
Satinder P. Singh [Univ of Mass]
Integrating Background Knowledge into Incremental Concept Formation
Leon Shklar [Bell Communications Research]; Haym Hirsh [Rutgers]
Learning Mental Models
Astro Teller [Stanford]
Generalized Competitive Learning and the Handling of Irrelevant Features
Chris Thornton [Univ of Sussex]
Learning to Ignore: Psychophysics and Computational Modeling of Fast
Learning of Direction in Noisy Motion Stimuli
Lucia M. Vaina [Boston Univ], John G. Harris [Univ of Florida]

Saturday 11/Sept/93 [CLNL meetings, at Provincetown Inn]
9:00-10:00 Current Tree Research
Leo Breiman [UCBerkeley]
(invited talk, see abstract below)

10:30-12:30 Paper session 2
Initializing Neural Networks using Decision Trees
Arunava Banerjee [Rutgers]
Exploring the Decision Forest
Patrick M. Murphy, Michael Pazzani [UC Irvine]
What Do We Do When There Are Outrageous Data Points in the Data Set? -
Algorithm for Robust Neural Net Regression
Yong Liu [Brown]
A Comparison of RBF and MLP Networks for Classification of
Biomagnetic Fields
Martin F. Schlang, Ralph Neuneier, Klaus Abraham-Fuchs [Siemens AG]

12:30- 2 Lunch (on own)

2:30- 3:30 TBA (invited talk)
Yann LeCun [AT&T]

4:00- 6:00 Paper session 3
On Learning the Neural Network Architecture: An Average Case Analysis
Mostefa Golea [Univ of Ottawa]
Fast (Distribution Specific) Learning
Dale Schuurmans [Univ of Toronto]
Computational capacity of single neuron models
Anthony Zador [Yale Univ School of Medicine]
Probabilistic Self-Structuring and Learning
A.D.M. Garvin, P.J.W. Rayner [Cambridge]

7:00- 9 Banquet dinner

Sunday 12/Sept/93 [CLNL meetings, at Provincetown Inn]
9 -11 Paper session 4
Supervised Learning from Real and Discrete Incomplete Data
Zoubin Ghahramani, Michael Jordan [MIT]
Model Building with Uncertainty in the Independent Variable
Volker Tresp, Subutai Ahmad, Ralph Neuneier [Siemens AG]
Supervised Learning using Unclassified and Classified Examples
Geoff Towell [Siemens Corp. Res.]
Learning to Classify Incomplete Examples
Dale Schuurmans [Univ of Toronto]; R. Greiner [Siemens Corp. Res.]

11:30 -12:30 TBA (invited talk)
Ron Rivest [MIT]

12:30 - 2 Lunch (on own)

3:30 - 6:30 Ferry (optional): Provincetown to Boston
Depart from Boston (on own)

______ ______
Scaling Up Machine Learning: Practical and Theoretical Issues

Thomas G. Dietterich
Oregon State University and
Arris Pharmaceutical Corporation


Supervised learning methods are being applied to an ever-expanding
range of problems. This talk will review issues arising in these
applications that require further research. The issues can be
organized according to the problem-solving task, the form of the
inputs and outputs, and any constraints or prior knowledge that must
be considered. For example, the learning task often involves
extrapolating beyond the training data in ways that are not addressed
in current theory or engineering experience. As another example, each
training example may be represented by a disjunction of feature
vectors, rather than a unique feature vector as is usually assumed.
More generally, each training example may correspond to a manifold of
feature vectors. As a third example, background knowledge may take
the form of constraints that must be satisfied by any hypothesis
output by a learning algorithm. The issues will be illustrated using
examples from several applications including recent work in
computational drug design and ecosystem modelling.

_______
Current Tree Research

Leo Breiman
Department of Statistics
University of California, Berkeley

This talk will summarize current research by myself and collaborators
into methods of enhancing tree methodology. The topics covered will be:

1) Tree optimization
2) Forming features
3) Regularizing trees
4) Multiple response trees
5) Hyperplane trees

These research areas are in a simmer. They have been programmed and
are undergoing testing. The results are diverse.

_______
_______

Programme Committee:
Andrew Barron, Russell Greiner, Tom Hancock, Steve Hanson, Robert Holte,
Michael Jordan, Stephen Judd, Pat Langley, Thomas Petsche, Tomaso Poggio,
Ron Rivest, Eduardo Sontag, Steve Whitehead

Workshop Sponsors:
Siemens Corporate Research and MIT Laboratory of Computer Science

================ ** CLNL'93 Logistics ** =======================

Dates:
The workshop begins at 9am Friday 10/Sept, and concludes by 3pm
Sunday 12/Sept, in time to catch the 3:30pm Provincetown--Boston ferry.

Location:
All sessions will take place in the Provincetown Inn (800 942-5388); we
encourage registrants to stay there. Provincetown Massachusetts is located
at the very tip of Cape Cod, jutting into the Atlantic Ocean.

Transportation:
We have rented a ship from The Portuguese Princess to transport CLNL'93
registrants from Boston to Provincetown on Thursday 9/Sept/93, at no charge
to the registrants. We will also supply light munchies en route. This ship
will depart from the back of Boston Harbor Hotel, 70 Rowes Wharf on Atlantic
Avenue (parking garage is 617 439-0328); tentatively at 6:30pm.
If you are interested in using this service, please let us know ASAP (via
e-mail to clnl93@learning.scr.siemens.com) and also tell us whether you will
be able to make the scheduled 6:30pm departure.

(N.b., this service replaces the earlier proposal, which involved the
Bay State Cruise Lines.)

The drive from Boston to Provincetown requires approximately two hours.
There are cabs, busses, ferries and commuter airplanes (CapeAir, 800 352-0714)
that service this Boston--Provincetown route.
The Hyannis/Plymouth bus (508 746-0378) leaves Logan Airport at 8:45am,
11:45am, 2:45pm, 4:45pm on weekdays, and arrives in Provincetown about
4 hours later; its cost is $24.25.
For the return trip (only), Bay State Cruise Lines (617 723-7800) runs a
ferry that departs Provincetown at 3:30pm on Sundays, arriving at
Commonwealth Pier in Boston Harbor at 6:30pm; its cost is $15/person, one way.

Inquiries:
For additional information about CLNL'93, contact
clnl93@learning.scr.siemens.com
or
CLNL'93 Workshop
Learning Systems Department
Siemens Corporate Research
755 College Road East
Princeton, NJ 08540--6632

To learn more about Provincetown, contact their
Chamber of Commerce at 508 487-3424.


================ ** CLNL'93 Registration ** =======================

Name: ________________________________________________
Affiliation: ________________________________________________
Address: ________________________________________________
________________________________________________
Telephone: ____________________ E-mail: ____________________

Select the appropriate options and fees:

Workshop registration fee ($50 regular; $25 student) ___________
Includes
* attendance at all presentation and poster sessions
* the banquet dinner on Saturday night; and
* a copy of the accepted abstracts.

Hotel room ($74 = 1 night deposit) ___________
[This is at the Provincetown Inn, assuming a minimum stay of
2 nights. The total cost for three nights is $222 = $74 x 3,
plus optional breakfasts.
Room reservations are accepted subject to availability.
See hotel for cancellation policy.]

Arrival date ___________ Departure date _____________
Name of person sharing room (optional) __________________
[Notice that $74/night corresponds to $37/person per
night double-occupancy, if two people share one room.]
# of breakfasts desired ($7.50/bkfst; no deposit req'd) ___

Total amount enclosed: ___________


If you are not using a credit card, make your check payable in U.S. dollars
to "Provincetown Inn/CLNL'93", and mail your completed registration form to
Provincetown Inn/CLNL
P.O. Box 619
Provincetown, MA 02657.
If you are using Visa or MasterCard, please fill out the following,
which you may mail to above address, or FAX to 508 487-2911.
Signature: ______________________________________________
Visa/MasterCard #: ______________________________________________
Expiration: ______________________________________________



------------------------------

End of ML-LIST (Digest format)
****************************************
