NL-KR Digest      Wed Oct 25 21:21:29 PDT 1995      Volume 14 No. 68 

Today's Topics:

CFP: AAAI96FP - Fall Symp. KR Sys. w/ NL, Fall 96, Boston
Position: Senior Professor of Cognitive / Neural Sys., Boston U.
Announcement: BU Conference on Language Development, Nov 95, Boston
CFP: ECAI-96: 12th Eur. Conf on AI - Workshops, Aug 96, Budapest
Announcement: UM-96 Travel Scholarships
Announcement: Grad Study: Johns Hopkins Language / Speech Center
Announcement: Link Grammar, version 2, CMU
Position: Postdoc in Chinese Corpora, Linguistics, Academia Sinica
Program: Wrkshp on Spoken Language Generation, Nov 95, Darmstadt
Position: Summer KR etc. Interns at CIRL, U Oregon
Announcement: Paper on Automated Deduction, Concept Languages, UNC

* * *

Subscriptions: listserv-style administrative requests to
nl-kr-request@ai.sunnyside.com.
Submissions, policy, questions: nl-kr@ai.sunnyside.com
To speed up processing of your submission, write to
listserv@ai.sunnyside.com with the message:
GET nl-kr style

Back issues:
FTP: ai.sunnyside.com:/pub/nl-kr/Vxx/Nyyy
/pub/nl-kr/Vxx/INDEX
Gopher: ai.sunnyside.com, Port 70, in directory /pub/nl-kr
Email: write to LISTSERV@AI.SUNNYSIDE.COM, omit subject, mail command:
GET nl-kr nl-kr_file_list
Web: http://ai.sunnyside.com/pub/nl-kr
Editors:
Al Whaley (al@ai.sunnyside.com) and
Chris Welty (weltyc@sigart.acm.org).

-----------------------------------------------------------------------

Date: Wed, 11 Oct 1995 16:20:16 -0400
From: Lucja Iwanska <lucja@CS.Wayne.EDU>
To: al@snyside1.sunnyside.com, weltyc@sigart.acm.org
Subject: CFP: AAAI96FP - Fall Symp. KR Sys. w/ NL, Fall 96, Boston


KNOWLEDGE REPRESENTATION SYSTEMS BASED ON NATURAL LANGUAGE

CFP for a AAAI Fall 1996 Symposium


WHERE: Boston/Cambridge
WHEN: Immediately preceding KR-96 (the exact dates to be announced later)

The Symposium addresses the theoretically and practically important
problem of designing knowledge representation (KR) systems that closely parallel
the representational and inferential characteristics of natural language.

The advantages of such NL-based KR systems would be enormous.
Among the arguments for the natural-language-as-KR-system approach are:

1. KR systems based on natural language would be easy for people to use,

2. Most human knowledge is encoded and communicated via natural language,
in the form of textual documents and (transcribed) interactions (dialogs).
A KR system based on natural language would more easily be able to create
and update its knowledge base automatically from natural language texts.
Additionally, the contents of this knowledge base, and the inferences supported by the
KR system, would parallel those of a natural language user.

3. Every day, a huge number of new textual documents becomes available on-line.
This creates a need for more sophisticated information retrieval based
on natural language processing (NLP) and KR techniques.

4. KR systems based on natural language would provide a uniform symbolic representation.
The same representational and inference mechanism could be used when utilizing previous
knowledge for processing new natural language inputs (natural language as both
meta-level and object-level language),

5. It is hard to match the expressiveness and precision of natural language,
particularly in domains that are not (well) formalized,

6. Many philosophers, linguists and cognitive scientists believe that the mental-level
representation of knowledge in the human mind is close in form to natural language.


While some AI researchers believe that it is feasible and necessary to
design KR systems closely mimicking natural language, others are
pessimistic about the success, or even the possibility, of designing such KR
systems. This might account for the general lack of interest in the
problems of NLP within the KR community; for example, only six of the
twenty-two KR systems presented in the "Special Issue on Implemented
Knowledge Representation and Reasoning Systems", Charles Rich, Editor
SIGART Bulletin, Vol. 2 (3), ACM Press, 1991, are driven by NLP concerns.

Among the arguments against the natural-language-as-KR-system approach are:

1. Natural language is (highly) ambiguous,

2. Natural language has (very) complex syntax, semantics, and pragmatics,

3. Natural language is non-systematic, non-algorithmic,

4. Natural language is (highly) context-dependent,

5. Natural language is (merely) an interface;
inferencing does not belong with natural language.



The goal of this Symposium is to address in-depth such arguments for
and against designing KR systems closely simulating natural language.

We invite papers that substantiate the view that natural language can
be viewed as a KR system with its own representational and inferential
machinery, and, as such, is a productive source of ideas for KR
formalisms and their practical implementations.

We are interested in papers discussing representations and inference
mechanisms that parallel a non-trivial or interesting subset of natural
language, and in systems whose expressiveness, semantics, information
packaging, reasoning, and computational tractability closely
correspond to those of natural language.

We are interested in automatic or semi-automatic methods of obtaining
taxonomies facilitating various NLP tasks such as anaphora resolution,
inferencing, and machine translation.

We are also interested in papers that discuss those aspects of natural
language that are not desirable in a KR system. We invite position
papers with supported arguments against the idea of designing KR
systems that mimic natural language.

- - - - - - - - - - - - - -

PAPER FORMAT:

Strongly preferred:
12 pt LaTeX article style
15 pages maximum, including title, abstract, figures,
but excluding references
The first page must include:
title
author's name(s)
affiliation
complete mailing address
e-mail address
phone/fax number(s)
abstract of 200 or so words
keywords


ELECTRONIC SUBMISSIONS are strongly preferred:

DIRECT:
anonymous ftp to ftp.cs.wayne.edu ~pub/nlkr directory

As a last resort, five hard copies of the paper
can be snail mailed to

Lucja Iwanska
Department of Computer Science
Wayne State University
Detroit, MI 48202, USA
(313) 577-1667 (phone)
(313) 577-2478 (secretary)
(313) 577-6868 (fax)


TIMETABLE:

1.01.96
intent to submit due
5.01.96:
submission deadline
7.01.96:
reviews completed
papers chosen
notification/comments/requests for changes sent out
?? 9.01.96:
final versions received
?? 11.02-11.05
Symposium takes place

- - - - - - - - - - - - - - -


PROGRAM COMMITTEE:


Syed S. Ali, Southwest Missouri State University
syali@sy.smsu.edu

Douglas Appelt, SRI International;
appelt@ai.sri.com

R.V. Guha, Apple Computer, Inc.
guha@taurus.apple.com

Sasa Buvac, Stanford University
buvac@sail.stanford.edu

Lucja Iwanska (Chair), Wayne State University
lucja@cs.wayne.edu

Douglas Lenat, CYC Corp.
lenat@mcc.com

David McAllester, AT&T Bell Labs
dmac@research.att.com

Len Schubert, University of Rochester
schubert@cs.rochester.edu

Stuart C. Shapiro, State University of New York at Buffalo
shapiro@cs.buffalo.edu

Wlodek Zadrozny, IBM TJ Watson Research Center
wlodz@watson.ibm.com

= = = = = = = = = = = = = = =

-----------------------------------------------------------------------

Date: Wed, 11 Oct 1995 15:49:31 -0400
From: cas-cns@PARK.BU.EDU (B.U. CAS/CNS)
Subject: Position: Senior Professor of Cognitive / Neural Sys., Boston U.
Reply-To: cas-cns@PARK.BU.EDU

NEW FACULTY
IN COGNITIVE AND NEURAL SYSTEMS
AT BOSTON UNIVERSITY

Boston University seeks an associate or full professor
for its graduate Department of Cognitive and Neural
Systems. Exceptionally qualified assistant professors
will also be considered. This department offers an
integrated curriculum of psychological,
neurobiological, and computational concepts, models,
and methods in the fields of computational
neuroscience, connectionist cognitive science, and
neural network technology in which Boston University
is a leader. Candidates should have an international
research reputation, preferably including extensive
analytic or computational research experience in
modeling a broad range of nonlinear neural networks,
especially in one or more of the areas: vision and
image processing, adaptive pattern recognition,
cognitive information processing, speech and language,
and neural network technology. Send a complete
curriculum vitae and three letters of recommendation
to Search Committee, Department of Cognitive and
Neural Systems, 677 Beacon Street, Boston University,
Boston, MA 02215. Boston University is an Equal
Opportunity/Affirmative Action Employer.

http://cns-web.bu.edu

-----------------------------------------------------------------------

Date: Tue, 10 Oct 95 20:43:24 -0400
From: langconf@louis-xiv.bu.edu (BU Conference on Language Development)
To: langconf-announce@louis-xiv.bu.edu
Subject: Announcement: BU Conference on Language Development, Nov 95, Boston

20th Annual Boston University Conference on Language Development

November 3, 4, 5, 1995


KEYNOTE SPEAKER: Lila Gleitman

PLENARY SPEAKER: Lydia White

Sessions include first and second language acquisition of syntax,
morphology, phonology, lexical and conceptual knowledge, discourse,
narrative and literacy, social and cultural aspects of language use, as
well as exceptional language, language processing, and bilingualism.
Ninety papers are scheduled to be presented.

All conference sessions will be held on the Boston University campus
in the George Sherman Union, 775 Commonwealth Avenue, Boston.

The Conference program, up-to-date (as of 9/28/95) hotel and travel
information, and the registration form, may be obtained by sending a
blank e-mail message to info@louis-xiv.bu.edu


If you have any questions, or if you would like to add your address
to our regular mailing list or inform us of a change in address, or if
you have any difficulties, please send e-mail to
langconf@louis-xiv.bu.edu, phone 617-353-3085, or write to:

Boston University Conference on Language Development
138 Mountfort Street
Brookline, MA 02146-4083

As we are a volunteer-run Conference, we ask your patience in that
it may take us a day or two to respond to a phone call or e-mail.

-----------------------------------------------------------------------

From: Elisabeth Andre <Elisabeth.Andre@dfki.uni-sb.de>
To: comp-phon@cogsci.ed.ac.uk, connectionists@cs.cmu.edu, acl@cs.columbia.edu,
Subject: CFP: ECAI-96: 12th Eur. Conf on AI - Workshops, Aug 96, Budapest
Date: Thu, 12 Oct 1995 18:42:59 +0100

-------------------------------------------
ECAI-96: SECOND CALL FOR WORKSHOP PROPOSALS
-------------------------------------------

12th European Conference on Artificial Intelligence
August 12-16, 1996
Budapest, Hungary

Deadline for Workshop Proposals: 1 November 1995
Workshop dates 12-13 August 1996

For more information, see
http://www.dfki.uni-sb.de/ecai96/call-for-workshops.html

-----------------------------------------------------------------------

To: comp-ai-nlang-know-rep@uunet.uu.net
From: 96 Jill um <um96@cis.udel.edu>
Subject: Announcement: UM-96 Travel Scholarships
Date: 12 Oct 1995 23:21:19 GMT


Travel Funds for Fifth International Conference on User Modeling

Funds have been received from AAAI for subsidizing student
travel to UM-96. For information about applying for a travel
grant, please contact David Chin at chin@uhics.ics.hawaii.edu

-----------------------------------------------------------------------

To: corpora@hd.uib.no, nl-kr@cs.rpi.edu
Subject: Announcement: Grad Study: Johns Hopkins Language / Speech Center
Date: Sun, 15 Oct 1995 16:46:23 -0400
From: "Eric Brill" <brill@crabcake.cs.jhu.edu>




CENTER FOR LANGUAGE AND SPEECH PROCESSING
Johns Hopkins University
Baltimore, Md.


The Center for Language and Speech Processing (CLSP) at Johns Hopkins
University encourages students interested in pursuing a graduate
degree in any aspect of language and speech processing to apply to
Johns Hopkins. Graduate students interested in language and speech
processing who want to conduct research at CLSP must first be admitted
to a graduate program in one of the various departments that have CLSP
faculty. Students must meet the requirements for admission and degree
completion in their home department. To obtain an application to any
of the affiliated departments, send mail to the address provided
below. Be sure to indicate which department(s) you wish to apply to.


Home Departments and Selected Faculty


Cognitive Science

MICHAEL BRENT, Ph.D., MIT, 1991. Computational models of language acquisition,
machine learning of natural language.
LUIGI BURZIO, Ph.D., MIT, 1981. Theories of syntax and phonology, rules versus
constraints in lexical organization.
ROBERT FRANK, Ph.D., University of Pennsylvania, 1992. Natural language syntax
and foundations of grammatical theory, tree adjoining grammars, computational
and empirical studies of language acquisition and language processing.
PAUL SMOLENSKY, Ph.D., Indiana University, 1981. Integration of
neural/connectionist and symbolic computation, soft constraints in universal
grammar, optimality theory, phonology and syntax.


Computer Science

ERIC BRILL, Ph.D., University of Pennsylvania, 1993. Natural language and
speech processing, machine learning, artificial intelligence.
SIMON KASIF, Ph.D., University of Maryland, 1985. Artificial intelligence,
parallel computation, machine learning, computational modeling.
STEVEN SALZBERG, Ph.D., Harvard, 1989. Machine learning, computational
biology, pattern recognition.
DAVID YAROWSKY, Ph.D., University of Pennsylvania, 1995. Natural language
processing and spoken language systems, machine translation, information
retrieval and machine learning.


Electrical and Computer Engineering

ANDREAS ANDREOU, Ph.D., Johns Hopkins University, 1986. Sensory
communication, acoustic processing for speech recognition using models of
audition and speech production, low power analog VLSI auditory models.
GERT CAUWENBERGHS, Ph.D., California Institute of Technology, 1994. Neural
networks, model free learning, low power integrated circuits for speech
encoding/decoding and acoustic signal classification.
FREDERICK JELINEK, Ph.D., MIT, 1962. Speech recognition, statistical methods
of natural language processing, information theory.


Biomedical Engineering

JOHN HEINZ, Sc.D., MIT, 1962. Speech communication, acoustics of speech and
swallowing.
MURRAY SACHS, Ph.D., MIT, 1966. Auditory neurophysiology and psychophysics.
ERIC YOUNG, Ph.D., Johns Hopkins University, 1972. Auditory neurophysiology,
neural modeling, sensory processes.


Mathematical Sciences

CAREY PRIEBE, Ph.D., George Mason University, 1993. Statistics, functional
estimation, discriminant analysis, change point analysis, image analysis.
COLIN WU, Ph.D., UC Berkeley, 1990. Statistics, semi-parametric models,
robustness.


Center Resources and Activities

World-class computational resources
Ample laboratory and office space for graduate students
Weekly academic year seminar series
Annual Speech Research Symposium
Six-week international summer research workshop

Affiliated Laboratories

Center for Hearing Sciences, Johns Hopkins School of Medicine
Communications Sciences Research Laboratory, Johns Hopkins Kennedy
Krieger Institute
Neural Encoding Laboratory, Johns Hopkins School of Medicine
Sensory Communication Laboratory, Johns Hopkins Whiting School of
Engineering

Selected graduate courses offered in NLP and related areas:


600.403 - Learning and Modeling
600.404 - Artificial Neural Networks
600.435 - Artificial Intelligence
600.440 - Advanced Topics in Artificial Intelligence
600.465 - Introduction to Natural Language Processing
600.466 - Advanced Topics in Natural Language Processing
600.489 - Automated Reasoning
600.661 - Machine Learning
600.676 - Statistical Methods of Natural Language Analysis

520.435 - Digital Signal Processing
520.447 - Introduction to Information Theory and Coding
520.475 - Processing and Recognition of Speech
520.476 - Information Extraction from Speech and Text
520.639 - Information Theory
520.641 - Communication Theory
520.735 - Sensory Information Processing

050.405 - Cognitive AI II: Learning
050.603 - Lexical Processing
050.606 - Cognitive Neuropsychology of Language
050.624 - Topics in Syntactic Theory
050.625 - Linguistic Semantics
050.627 - Lexicon Seminar
050.642 - Computational Language Acquisition
050.643 - Laboratory in Computational Language Acquisition
050.801 - Research Seminar in Cognitive Neuropsychology
050.802 - Research Seminar in Cognitive Processes



Center for Language and Speech Processing
Johns Hopkins University
3400 N. Charles Street / Barton Hall
Baltimore, MD 21218-2686
IMPORTANT: Indicate department(s) of interest
Tel (410) 516-4237 Fax (410) 516-5050
Electronic Access
e-mail: clsp@jhu.edu
WWW: http://cspjhu.ece.jhu.edu
Applications can also be requested via the web by filling out the on-line
form available at http://cspjhu.ece.jhu.edu/admission.html

The Johns Hopkins University does not discriminate on the basis of race, color,
sex, religion, homosexuality, national or ethnic origin, age, disability or
veteran status in any student program or activity administered by the
University or with regard to admission or employment.
Questions regarding Title VI, Title IX and Section 504 should be referred to
Yvonne M. Theodore, Affirmative Action Officer, 205 Garland Hall (410-516-8075).

-----------------------------------------------------------------------

From: sleator@BOBO.LINK.CS.CMU.EDU
To: nl-kr@snyside1.sunnyside.com
Subject: Announcement: Link Grammar, version 2, CMU
Date: Sat, 14 Oct 95 18:01:34 EDT

In Spring 1992, we released version 1 of our "link grammar parser". This
is a syntactic parser for English, based on an original theory of syntax
related to dependency grammar. Several hundred people took copies of
the parser, and a number of people reported that they were using it or
were planning to use it for various projects. However, the parser had a
number of weaknesses, and its coverage was not sufficient for it to be
of much use to people.

With the help of Dennis Grinberg and John Lafferty, we have now released
version 2, which is significantly better than version 1. Some of the
advantages are described below. We have also created a web site about
the system:

http://bobo.link.cs.cmu.edu/grammar/html/intro.html

This site contains a lot of information about the parser, and allows you
to experiment with it. The parser and its documentation are available
via anonymous ftp:

/afs/cs/user/sleator/public/link-grammar on host ftp.cs.cmu.edu

We think the parser could now be useful for a variety of applications
that involve recovering the syntactic structure of text. These might
include speech recognition, speech generation, grammar checking, machine
translation, and language understanding systems.

P.S. This message was sent to a mailing list consisting of 1230 people
who (1) have expressed interest in our work (as determined by a crude
analysis of our email, and other correspondence), or (2) are on mailing
lists whose subjects would indicate interest in this work. We have
attempted to eliminate all duplicates from the list. We apologize if
you receive this message twice, or if this information is not of
interest to you.

Davy Temperley and Daniel Sleator

.......................................................................

Daniel Sleator                   Office: 412-268-7563
Professor of Computer Science    Fax: 412-268-5576
Carnegie Mellon University       Home: 412-362-8675
Pittsburgh, PA 15213             sleator@cs.cmu.edu


IMPROVEMENTS

1. The new version is "robust". The old version could not assign any
syntactic structure to a sentence unless it could completely
interpret the sentence. The new version is able to skip over portions
of the sentence that it cannot understand, and assign some structure
to the rest of the sentence.

2. Quite apart from the "robustness" feature, the parser's coverage is
vastly improved. The old system could only fully parse about 30% of
sentences in a typical Wall Street Journal article. The new version
can find complete parses for 70-75% of such sentences.

3. The new version has a much larger dictionary. The old version had
about 25000 words; the new version has about 59000. (Here we count
individual forms of verbs and nouns: e.g., "chase", "chases",
"chased", and "chasing" are counted as separate words. The number of
"stem" words is probably about 30000.)

4. The new version has an "unknown word" feature. It has a general
syntactic category that it assigns to any word it does not
recognize. (In the process, it labels the unknown word as a noun,
verb, adjective, or adverb.)

5. The parser has a "two-stage" system. At the first stage, it considers
common syntactic constructions; the "stage one" coverage is roughly
comparable to the coverage of the earlier version. In the second
stage, it considers many less common constructions. Here are a few
examples of "stage-two" constructions:

Plural nouns acting as noun modifiers ("He was booked on a weapons
violations charge")

Adjectival nouns preceding adjectives ("City clerical workers went on
strike today")

Prepositional phrases modifying verbs, but preceding the direct object
("She sold for five dollars the ring her mother gave her").

Manner adverbs modifying adjectives ("The delicately quiet tone of the
cello blended well with the fiercely percussive piano chords")

Unusual cases of subject-verb inversion ("Also invited to the meeting
were several prominent scientists")

Auxiliaries without main verbs ("If you don't want to do it, you
should find someone who will")

Unusual uses of gerunds ("We have to talk about this sleeping in class
and girl chasing")

Noun-phrases introducing proper names ("The actress Whoopi Goldberg
and the singer Michael Jackson attended the ceremony")

Hyphenated expressions as noun-phrases ("The buy-out caused a
free-for-all in the mid-afternoon")

6. The post-processing system released with the earlier version has been
improved. There is now a "wild-card" character for post-processing,
allowing rules to be expressed much more parsimoniously.

7. The new version has greatly improved documentation. We have compiled
a "guide-to-links", describing every connector type and every
syntactic construction covered by the parser. The guide also contains
a complete description of post-processing. We also provide a general
introduction (in a file called "manual") to the system, describing
the general logic of link grammars and the post-processing system and
the notational system we use for expressing them, as well as a number
of special features of the parser. We hope this will allow people
to modify the system substantially if they wish, or to design their own
versions (e.g., dictionaries for other languages).

8. The dictionary now uses a different (and much more logical) link-naming
scheme.
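
To give a concrete feel for the formalism (without reproducing the parser's
actual dictionaries, algorithms, or code), here is a purely illustrative
Python sketch of the core idea behind a link grammar: every word carries
typed connector requirements pointing left or right, and a parse ("linkage")
is a set of typed links that satisfies every requirement, never crosses,
and connects the whole sentence. The five-word lexicon and the connector
labels D, S, O are invented for this example.

    # Toy link-grammar illustration (NOT the CMU parser's code or dictionary
    # format). Lexicon maps a word to (left connectors, right connectors).
    from itertools import combinations

    LEXICON = {
        "the":    ([],         ["D"]),   # determiner links rightward to a noun
        "a":      ([],         ["D"]),
        "cat":    (["D"],      ["S"]),   # noun: determiner on its left, subject link to its right
        "dog":    (["D", "O"], []),      # noun: determiner and object link on its left
        "chased": (["S"],      ["O"]),   # verb: subject on its left, object on its right
    }

    def candidate_links(words):
        """Every (i, j, label) where word i offers the label rightward and word j accepts it leftward."""
        links = []
        for i in range(len(words)):
            for j in range(i + 1, len(words)):
                _, right_i = LEXICON[words[i]]
                left_j, _ = LEXICON[words[j]]
                for label in right_i:
                    if label in left_j:
                        links.append((i, j, label))
        return links

    def satisfies_connectors(words, links):
        """Each word's left/right connector lists must be used exactly once each."""
        for k, w in enumerate(words):
            left_need, right_need = LEXICON[w]
            left_got = sorted(lab for i, j, lab in links if j == k)
            right_got = sorted(lab for i, j, lab in links if i == k)
            if left_got != sorted(left_need) or right_got != sorted(right_need):
                return False
        return True

    def planar(links):
        """No two links may cross (forbid i < k < j < l)."""
        for (i, j, _), (k, l, _) in combinations(links, 2):
            if i < k < j < l or k < i < l < j:
                return False
        return True

    def connected(n, links):
        """All n words must end up in one linked component (union-find)."""
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for i, j, _ in links:
            parent[find(i)] = find(j)
        return len({find(k) for k in range(n)}) == 1

    def parse(sentence):
        words = sentence.split()
        cands = candidate_links(words)
        return [subset
                for r in range(len(cands) + 1)
                for subset in combinations(cands, r)
                if satisfies_connectors(words, subset)
                and planar(subset)
                and connected(len(words), subset)]

    if __name__ == "__main__":
        for linkage in parse("the cat chased a dog"):
            print(linkage)
        # Single linkage: ((0, 1, 'D'), (1, 2, 'S'), (2, 4, 'O'), (3, 4, 'D'))

Note that this sketch enumerates link sets by brute force, which is only
workable for toy sentences; the real parser uses an efficient
dynamic-programming search over its much larger dictionaries.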

-----------------------------------------------------------------------

From: Phang Nyit Yin <pny@hp.iis.sinica.edu.tw>
Subject: Position: Postdoc in Chinese Corpora, Linguistics, Academia Sinica
To: CMP-LG@XXX.LANL.GOV, zzlsa@gallua.BITNET, weischedel@bbn.com,
Date: Wed, 18 Oct 95 14:27:54 EAT

Non-Tenure Track Research Position (Chinese Corpus Linguistics)
Institute of History and Philology, Academia Sinica,
Nankang, Taipei, Taiwan, ROC

Research Areas: I. Modern and Classical Chinese Corpora
II. Corpus-based Linguistic Research
III. Corpus-based Chinese NLP

Primary Fields: Linguistics (Syntax/Semantics/Morphology)

Subfields: Corpus Linguistics, Computational Linguistics,
Lexicography, Classical Chinese Grammar

Openings: 1

Term: November 1995 to June 1996
(renewable pending budgetary approval and performance)

Starting Salary: N.T. $52,780/month (roughly U.S. $1,950/month)

Requirements: (1) Ph.D. in Linguistics (before Oct. 1995)
(2) (near) Native Fluency in Mandarin Chinese

Location: Corpus Linguistics Research Group
Institute of History and Philology
Academia Sinica, Nankang, Taipei, Taiwan, ROC

Application:
I. BY October 22 (Sunday): Email (1) cover letter (2) C.V.
(including list of publications), and (3) research interests
and proposed research areas
TO: hschuren@ccvax.sinica.edu.tw
OR Fax to Chu-Ren Huang at 886-2-786-8834

II. BY October 28 (Saturday) [If Short-listed]
Send (1), (2), (3) research proposal, (4) Thesis, (5) proof of
a valid doctorate (received before October 1995), (6) relevant
publications, and (7) three letters of recommendation,
TO: Professor Chu-Ren Huang
Institute of History and Philology
Academia Sinica
Nankang, Taipei
Taiwan 115

III. Enquiries:
Send to the above email, snail mail addresses, or fax

-----------------------------------------------------------------------

Date: Tue, 10 Oct 1995 17:01:18 +0100
From: "Dr. John Bateman" <bateman@darmstadt.gmd.de>
To: nlpeople@cogsci.ed.ac.uk, siggen-members@indigo.cs.bgu.ac.il,
Subject: Program: Wrkshp on Spoken Language Generation, Nov 95, Darmstadt


2ND `SPEAK!' WORKSHOP: SPEECH GENERATION IN MULTIMODAL
INFORMATION SYSTEMS AND PRACTICAL APPLICATIONS

2nd-3rd November 1995
Darmstadt


The preliminary programme and details for registering for
this workshop are now available on the web.

URL: "http://www.darmstadt.gmd.de/programme1.html"


John Bateman.
GMD/IPSI.

-----------------------------------------------------------------------

To: comp-ai-nlang-know-rep@uunet.uu.net
From: ginsberg@dt.cirl.uoregon.edu (Matthew L. Ginsberg)
Subject: Position: Summer KR etc. Interns at CIRL, U Oregon
Date: 20 Oct 1995 17:27:48 GMT

CIRL, the Computational Intelligence Research Laboratory at the
University of Oregon (Eugene), is considering starting a summer
internship program for undergraduates. The program would be aimed at
students between their junior and senior years who want to spend a
summer participating in CIRL's various research activities;
approximately three students would be selected competitively each year
after a national search. Selected students would be provided with
housing, transportation expenses, participation in the national AI
conference (to be held in Portland next summer), and a salary
comparable to what they could obtain from industry.

The purpose of this message is to help us understand what sort of
interest there would be in such a program. If you are an
undergraduate who would be interested in participating, please reply
to this message with a brief statement of your research interests and
background. Further information on CIRL, together with a description
of the research that goes on here, can be found by visiting our home
page, http://www.cirl.uoregon.edu. (Our overall research is focussed
on basic questions in artificial intelligence including search,
knowledge representation, and reasoning. There is an emphasis on
planning, constraint satisfaction, and reasoning about action and
physical devices.)

Thanks! We hope to see you in Eugene for the summer of 1996.

Matt Ginsberg

-----------------------------------------------------------------------

To: comp-ai-nlang-know-rep@ncren.net
From: parama@cs.unc.edu (Paramasivam)
Subject: Announcement: Paper on Automated Deduction, Concept Languages, UNC
Date: 23 Oct 1995 17:45:12 -0400


The following paper is now available from
ftp://ftp.cs.unc.edu/pub/users/parama/class.ps

Automated Deduction Techniques for Classification in Concept Languages

Abstract:
Mechanical theorem provers are becoming increasingly powerful, and
we believe that it is time to examine whether certain tasks that have
formerly been accomplished by other means can now be performed
efficiently by a theorem prover. One such task is classification in
description logic-based knowledge representation systems or Concept
Language systems. Concept Language systems provide a
formalism for expressing knowledge based on concepts and roles.
Subsumption checking is one important reasoning faculty offered by such
systems. In this paper we use a theorem prover coupled with a
finite-model finder to perform subsumption checking. This approach is
complete and sound for Concept Languages whose underlying description
logics have the finite model property. The performance is compared with
several other well-known Concept Language systems. Some efficient
strategies to compute the subsumption hierarchy, known as
classification, are also described.
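
For readers unfamiliar with the terminology, the following minimal Python
sketch illustrates what subsumption checking and classification mean for a
toy FL0-style concept language (conjunctions of atomic concepts and value
restrictions only). It uses simple structural subsumption, not the
theorem-prover-plus-finite-model-finder method of the paper, and all
concept and role names are invented for the example.

    # Toy structural subsumption and classification for an FL0-style language.
    class Concept:
        """A normalized concept: a set of atomic names and, per role, one filler."""
        def __init__(self, atoms=(), restrictions=None):
            self.atoms = frozenset(atoms)
            self.restrictions = dict(restrictions or {})   # role name -> Concept

    def subsumes(d, c):
        """True iff d subsumes c (every instance of c is an instance of d):
        every atom of d appears in c, and every value restriction of d is
        matched by a restriction of c on the same role whose filler is itself
        subsumed by d's filler."""
        if not d.atoms <= c.atoms:
            return False
        for role, d_filler in d.restrictions.items():
            c_filler = c.restrictions.get(role)
            if c_filler is None or not subsumes(d_filler, c_filler):
                return False
        return True

    def classify(named):
        """Naive classification: compute all subsumers of each named concept
        and print only the direct (most specific) ones, i.e. the hierarchy."""
        names = list(named)
        above = {a: {b for b in names
                     if a != b and subsumes(named[b], named[a])} for a in names}
        for a in names:
            direct = {b for b in above[a]
                      if not any(b in above[c] for c in above[a] if c != b)}
            print(f"{a}  is-a  {', '.join(sorted(direct)) or 'TOP'}")

    if __name__ == "__main__":
        person = Concept({"Person"})
        parent = Concept({"Person"}, {"hasChild": Concept({"Person"})})
        mother = Concept({"Person", "Female"}, {"hasChild": Concept({"Person"})})
        classify({"Person": person, "Parent": parent, "Mother": mother})
        # Prints:
        #   Person  is-a  TOP
        #   Parent  is-a  Person
        #   Mother  is-a  Parent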

We welcome your comments.
--Parama

M. Paramasivam and David A. Plaisted
Department of Computer Science
CB # 3175 Sitterson Hall
University of North Carolina, Chapel Hill NC 27599-3175

--

End of NL-KR Digest
*******************
