NL-KR Digest (Mon Oct 12 10:29:44 1992) Volume 9 No. 53
Today's Topics:
Query: Natural Language to SQL
Query: Genetic Algorithm applications in Natural Language Processing
Announcement: Two special journal issues on conceptual graphs
Announcement: Special Symposium for Jack Minker
Announcement: New Graduate Program in Architecture+Art+Technology
Discussion: A different approach to handling English
Submissions: nl-kr@cs.rpi.edu
Requests, policy: nl-kr-request@cs.rpi.edu
Back issues are available from host archive.cs.rpi.edu [128.213.3.18] in
the files nl-kr/Vxx/Nyy (i.e., nl-kr/V01/N01 for V1#1); mail requests will
not be promptly satisfied. Starting with V9, there is a subject index
in the file INDEX. If you can't reach `cs.rpi.edu' you may want
to use `turing.cs.rpi.edu' instead.
BITNET subscribers: we now have a LISTSERVer for nl-kr.
You may send submissions to NL-KR@RPIECS
and any listserv-style administrative requests to LISTSERV@RPIECS.
-----------------------------------------------------------------
To: nl-kr@cs.rpi.edu
Date: Fri, 2 Oct 92 13:57:55 EDT
From: cmac@ssdc.sterling.com (Chris McNeilly)
Subject: Query: Natural Language to SQL
I'm in search of commercial systems that are natural language front
ends for relational databases: the user enters English queries and
the system generates SQL. Does anybody have experience with any
of these systems and can recommend one? Currently I'm using a system
from Natural Language and it seems to have problems dealing with some
of the things we need to do.
Thanks,
Chris
- ---------------------------------
Chris McNeilly
Sterling Software
Work 703-356-3551
FAX 703-821-1485
email cmac%ssdc@uunet.uu.net
uunet!ssdc!cmac
------------------------------
To: nl-kr@cs.rpi.edu
Date: Sun, 4 Oct 1992 09:38:40 -0400
From: OLCAY BOZ <ob00@lehigh.edu>
Newsgroups: comp.ai.nlang-know-rep
Subject: Query: Genetic Algorithm applications in Natural Language Processing
Date: Sun, 4 Oct 1992 13:38:38 GMT
Organization: Lehigh University
Hi,
I am looking for references on Genetic Algorithm applications in Natural
Language Processing. Any help will be greatly appreciated.
- -
-----------------------------------------------------------------
| OLCAY BOZ | OB00@NS1.CC.LEHIGH.EDU |
| LEHIGH UNIVERSITY | |
| COMPUTER SCIENCE DEPARTMENT | |
| Ph. D. Student | phone : (215) 758-1886 |
------------------------------
To: nl-kr@cs.rpi.edu
Date: Thu, 1 Oct 92 18:38:28 EDT
From: sowa@watson.ibm.com
Subject: Announcement: Two special journal issues on conceptual graphs
Two journals recently devoted complete issues to collections of papers
on conceptual graphs and their use in knowledge representation and
natural language processing:
Journal of Experimental & Theoretical Artificial Intelligence (JETAI),
vol. 4, no. 2, April-June 1992.
Knowledge-Based Systems, vol. 5, no. 3, September 1992.
Following are the tables of contents of these two issues.
JETAI, vol. 4, no. 2, April-June 1992.
Conceptual graph overview, 75
E. C. Way
Incremental planning using conceptual graphs, 85
D. Eshner, J. Hendler, and D. Nau
A comprehensive conceptual analysis using ER and conceptual graphs, 95
L. C. Gray and R. D. Bonnell
Conceptual graph matching: a flexible algorithm and experiments, 107
S. H. Myaeng and A. Lopez-Lopez
Towards compatible primitive structures, 127
G. W. Mineau
A hybrid model for indexical use in natural language, 141
J. Moulton and L. D. Roberts
Speech acts in a connected discourse: a computational representation
based on conceptual graph theory, 149
B. Moulin, D. Rousseau, and D. Vanderveken
Temporal, spatial, and constraint handling in the Conceptual Programming
environment, CP, 167
H. D. Pfeiffer and R. T. Hartley
Knowledge-Based Systems, vol. 5, no. 3, September 1992.
Editorial, 170
Logical structures in the lexicon, 173
J. F. Sowa
Conceptual-graph approach for the representation of temporal
information in discourse, 183
B. Moulin
Conceptual and semantic structures, 193
P. Kocura
Lexical choice as pattern matching, 200
J-F. Nogier and M. Zock
Integration of conceptual graphs and government-binding theory, 213
M. L. McHale and S. H. Myaeng
Metaphor as a mechanism for reorganizing the type hierarchy, 223
E. C. Way
Multilevel hierarchical retrieval, 233
R. Levinson and G. Ellis
Can a large knowledge base be built by importing and unifying diverse
knowledge? Lessons from scruffy work, 245
G. Berg-Cross
------------------------------
To: nl-kr@cs.rpi.edu
From: dawn@umiacs.umd.edu (Dawn Vance)
Newsgroups: comp.ai.nlang-know-rep,comp.ai.philosophy
Subject: Announcement: Special Symposium for Jack Minker
Date: 6 Oct 92 20:25:11 GMT
Followup-To: comp.ai.nlang-know-rep
SPECIAL SYMPOSIUM
The University of Maryland Institute for Advanced Computer Studies and
the Computer Science Department are organizing a symposium in honor of
Prof. Jack Minker's 65th birthday. Prof. Minker has made many significant
contributions to the fields of deductive databases, logic programming and
nonmonotonic reasoning. He has also been a dedicated worker on behalf
of the human rights of computer professionals.
The symposium is free and open to the public. It will be held in room
1112 of the A.V. Williams building on the College Park campus. If you
would like to attend or need additional information, please contact
grier@cs.umd.edu
AGENDA
Logic in Databases, Knowledge Representation and Reasoning:
A Symposium in Honor of Jack Minker's 65th Birthday for Contributions to
Computer Science and Human Rights
Friday, November 6, 1992
8:15 - 8:50 Coffee, rolls
8:50 - 9:00 Greetings
Session 1 - Nonmonotonic Reasoning
9:00 - 9:30 V. Lifschitz
"Generalization of the Closed World Assumption"
9:30 - 10:00 W. Marek and M.Truszczynski
"Logic Programming and Nonmonotonic Reasoning"
10:00 - 10:30 T. Przymusinski
"A Knowledge Representation Framework Based on Epistemic Logic"
10:30 - 10:45 BREAK
Session 2 - Meta-Interpretation
10:45 - 11:15 R. A. Kowalski
"Logic Programming by Forward Reasoning using the Only-If Halves of Definitions"
11:15 - 11:45 D. S. Warren
"Using OLDT Evaluation with Meta-Interpreters"
11:45 - 12:15 L. Henschen
"A Proposal for Meta-Level Control of Reasoning Programs"
12:15 - 1:15 LUNCH
Session 3 - Databases
1:15 - 1:45 R. Reiter
"Formalizing Database Evolution in the Situation Calculus"
1:45 - 2:15 R. Topor
"Incremental Evaluation of Database Queries"
2:15 - 2:45 J. Grant and V.S. Subrahmanian
"The Optimistic and Cautious Semantics for Inconsistent Knowledge Bases"
2:45 - 3:15 D. Fishman
"OODBMS: Gateway to the Information World"
3:15 - 3:30 BREAK
Session 4 - Theorem Proving
3:30 - 4:00 J. A. Robinson
"Parallel Reasoning"
4:00 - 4:30 K. Furukawa
"Parallel Parsing by a Bottom-up Theorem Prover"
4:30 - 5:00 H. Blair
"Undecidable Single Axiom Equational Theories"
Saturday, November 7, 1992
8:15 - 8:45 Coffee, Rolls
Session 5 - Human Rights/Cognition
8:45 - 9:15 R. Schifter
"Jack Minker and the Cause of Scientists in the Former Soviet Union"
9:15 - 9:45 J. McCarthy
"Formalization of Context"
9:45 - 10:15 D. Perlis
"Consciousness and Complexity: the Cognitive Quest"
10:15 - 10:45 S. Kasif
"Some Observations on Common-Sense Reasoning Logic, Memory Based Reasoning"
10:45 - 11:00 BREAK
Session 6 - Human Rights/Disjunctive Theories
11:00 - 11:30 P. Plotz
"Jack Minker and the Human Rights of Scientists"
11:30 - 12:00 D. Loveland
"Near-Horn Prolog and the Ancestry Family of Procedures"
12:00 - 12:30 A. Rajasekar
"DWAM-A WAM Model Extension for Disjunctive Logic Programming"
12:30 - 1:00 J. Lobo
"Computing the Transitive Closure in Disjunctive Databases"
- ------ End of Forwarded Message
------------------------------
To: nl-kr@cs.rpi.edu
From: novak@vitruvius.ar.utexas.edu (Marcos Novak)
Newsgroups: comp.ai.nlang-know-rep
Subject: Announcement: New Graduate Program in Architecture+Art+Technology
Date: 10 Oct 92 04:23:40 GMT
X-Useragent: Nuntius v1.1b3
[ I don't know how relevant this really is, but it mentions AI in
design. -CW ]
______________________________
====================================
ANNOUNCEMENT
====================================
______________________________
ADVANCED DESIGN RESEARCH
Master of Architecture Post-Professional Program
Master of Science in Architectural Studies Program
Professor Marcos Novak, Director
novak@vitruvius.ar.utexas.edu
====================================
====================================
SCHOOL OF ARCHITECTURE
THE UNIVERSITY OF TEXAS AT AUSTIN
====================================
====================================
This program is for students who are interested in exploring the ways in
which advances in science, technology, theory and criticism are
extending the range of the possible in the conception, production,
execution and inhabitation of architecture. Emphasis is on pure rather
than applied research, but this is advanced primarily through empirical
and production-oriented methods, that is, through the experimentation
with, the simulation, production, and testing of works such as drawings,
models, prototypes, environments, performances, computer programs,
etc., as well as their documentation. The aim of the program in
Advanced Design Research is to anticipate and encourage developments
in architectural theory and practice, to advance the body of
architectural knowledge, and to produce researchers, artists, and
practitioners capable of facing the challenges of an information era.
Topics of study that students may pursue include:
Computation and Composition;
Music and Architecture;
Shape Grammars and Other Formal Systems;
Algorithmic Aesthetics;
Art, Architecture and Technology;
The Architecture of Cyberspace and Virtual Worlds;
Embodied Virtuality;
Intelligent Agents and Systems;
Multi-Media(ted) Spaces;
Artificial Intelligence and Expert Systems in Design;
Architecture and Artificial Life (Generative Methods);
Architecture and Complexity;
Advances in Architectural Visualization;
Implications of Advances in Science and Technology;
The Poetics of New Technologies.
====================================
Prerequisites:
Applicants are expected to have either a strong background in the arts
and documented interest and ability in the sciences, or a strong
background in the sciences and documented interest and ability in the
arts.
The prerequisites listed below are given as an indication of the kind of
preparation we are expecting. They should not be seen as prohibitive.
Individuals with different backgrounds in related areas are encouraged
to apply.
- Evidence of creative work in primary field of study;
- Evidence of creative work in secondary field of study;
- Statement of intent;
- Letters of recommendation;
- Consultation;
- Students with Architecture and Architecture Theory backgrounds
3 Mathematics
3 Computer Programming
3 Physics
3 Art or Music Studio
- Students with Art, Design, and Music backgrounds
3 Mathematics or Physics
3 Computer Programming
3 History of Architecture
6 Architecture Studio
- Students with Mathematics and Science backgrounds
3 Computer Programming
6 Architecture, Art or Music Studio
3 Architecture, Art or Music History or Theory
- Students with Computer Science backgrounds
3 Architecture or Art History
3 Architecture, Art or Music Studio
3 Architecture, Art or Music Theory
====================================
Course Requirements
====================================
Option A (M.S.A.S.)
Hours Course(s)
9 Theory of Architecture (ARC 386K, L, M)
3 Research Methods and Topics Seminar (ARC 386N)
3 Music of Architecture Seminar (ARC 389 or 386M)
3 Poetics of New Technologies (ARC 389 or 386M)
6 Minor
6 Thesis (ARC 698)
30 hours
====================================
Option B (M.S.A.S. or M. Arch. [Post-prof.])
Hours Course(s)
6 Theory of Architecture (ARC 386 K, L or L,M or K, M)
3 Research Methods and Topics Seminar (ARC 386N)
3 Music of Architecture Seminar (ARC 389 or 386M)
3 Poetics of New Technologies (ARC 389 or 386M)
6 Advanced Design Studio (ARC 696)
3 Independent Study (ARC 389)
6 Minor
6 Thesis (ARC 698)
36 hours
====================================
====================================
For further information contact:
Graduate Studies in Architecture
SCHOOL OF ARCHITECTURE
THE UNIVERSITY OF TEXAS AT AUSTIN
Austin, Texas, 78712-1160
Telephone: (512) 471-1922
Fax: (512) 471-0716
====================================
====================================
------------------------------
To: nl-kr@cs.rpi.edu
From: markh@csd4.csd.uwm.edu (Mark)
Newsgroups: sci.lang,comp.ai.neural-nets,comp.ai.nlang-know-rep
Subject: Discussion: A different approach to handling English
Date: 3 Oct 1992 01:27:03 GMT
In article <1992Sep23.055357.14440@midway.uchicago.edu> goer@midway.uchicago.edu writes:
>This is just my impression as a practical philologist who's trying to
>get help from any area he can. Gazdar's PROLOG work might be very
>useful to some. It was far from perfect for me, though.
>
>I'd be interested in hearing other impressions, and in hearing about
>other intros to NLP.
One recurring phenomenon I notice in English, which runs deep, is the fact
that the syntactic structure appears NOT to be:
* Recursive
* Structural, along the lines of a phrase structure
(Context Free) grammar.
so much as it appears *for the most part* to be flat. That is, it can
almost be described by a Regular Grammar (Chomsky's Type 3), with
relatively few additional constraints. This is especially true
in the way sentences tend to run on when people write or speak naturally,
take this sentence for example. :)
In particular, I notice a dichotomy existing in the lexical syntax. There
are two kinds of words: the basic functional syntactic morphemes (including
affixes) that are mostly devoid of content, and the specialized words,
which carry most of the content.
When one processes a sentence, one actually sees or hears something like this
(using your article as an example):
fhd@panix.com (Frank Deutschmann) writes:
>...in the N of Ving N's *N* (an ADJ N to N, I might add), I have come across
>a N to a N N, but not much Vion is given, and the Ns have Ven ADV/ADV to V.
Can't help on N. I would like, though, to V my ADJ Ns in about N. I Ved it
as my ADJ N on N, too, and it was ADJ Ving. First and foremost, N just isn't
a ADJly-Ved, Val N. Sorry, but N is Ving either N, N or something like N or N.
This is Level 1 of a 2-level syntax. I venture that it can be completely
described by a Markov Process (which is essentially a "random" finite state
automaton).
If this is true, then that would make it easy to resolve the issue of how it's
learned. A machine can be trained to emulate a Markov process simply by
presenting it with a large sample of input. From this sample it will
calculate the probabilities of the various state transitions in the Markov
process, and from this it effectively models Level 1.
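The training step described here -- estimating the transition probabilities by counting which token follows which in a sample -- can be sketched in a few lines. This is a minimal first-order, word-level illustration (Python, toy corpus, and all names are my own choices, not anything from the original post):

```python
import random
from collections import Counter, defaultdict

def train_markov(tokens):
    """Estimate the transition table of a first-order Markov process
    by counting which token follows which in the training sample."""
    transitions = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        transitions[prev][nxt] += 1
    return transitions

def generate(transitions, start, length=10, seed=0):
    """Walk the chain, picking each next token in proportion to how
    often it followed the current one during training."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        counts = transitions[out[-1]]
        if not counts:  # no observed successor: stop
            break
        words, weights = zip(*counts.items())
        out.append(rng.choices(words, weights=weights)[0])
    return out

sample = "the cat sat on the mat and the dog sat on the rug".split()
model = train_markov(sample)
print(" ".join(generate(model, "the")))
```

With a large enough sample the same counting procedure "effectively models Level 1" in the sense above: every transition the generator takes is one it observed.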
In fact, Shannon was one of the first to approximate English with a Markov
process. Programs written using this or any similar
technique end up generating remarkably English-like output. In fact, even
if you train the machine on nothing more than individual characters (or
even bits!) it will *learn* the proper phonotactic formation rules quite
easily!
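Shannon's letter-level approximations can be reproduced the same way; here is a sketch of an order-2 character model (the training string is an arbitrary stand-in for a large sample, and the function names are illustrative):

```python
import random
from collections import Counter, defaultdict

def char_model(text, order=2):
    """Count which character follows each length-`order` context,
    in the spirit of Shannon's letter-approximation experiments."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def babble(model, context, n=40, seed=0):
    """Extend a context one character at a time. Every emitted
    (order+1)-gram was observed in training, so the output obeys
    the sample's character-formation statistics."""
    rng = random.Random(seed)
    out = context
    for _ in range(n):
        counts = model[out[-len(context):]]
        if not counts:
            break
        chars, weights = zip(*counts.items())
        out += rng.choices(chars, weights=weights)[0]
    return out

text = "the quick brown fox jumps over the lazy dog " * 3
print(babble(char_model(text), "th"))
```

The seed context is assumed to have the same length as the model order; raising the order makes the output look progressively more English-like, exactly the effect Shannon reported.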
Second, there's a kind of filter that will allow it to distinguish the
specialized words from the functional words -- maybe something as simple as
keeping track of the relative frequencies of the lexical items. Functional
words are few in number (about 500-1000), appear relatively often, and make up
roughly half of any natural language text.
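That frequency filter can be sketched directly: sort the vocabulary by frequency and treat everything above a cumulative-mass cutoff as functional. The 50% cutoff follows the figure above; the toy sentence and names are illustrative, not part of the original proposal:

```python
from collections import Counter

def split_lexicon(tokens, mass_cutoff=0.5):
    """Split the vocabulary by relative frequency alone: the most
    frequent words, accounting for up to `mass_cutoff` of all running
    text, are treated as functional; the long tail as specialized."""
    freq = Counter(tokens)
    total = len(tokens)
    functional, specialized = set(), set()
    running = 0
    for word, count in freq.most_common():
        (functional if running / total < mass_cutoff else specialized).add(word)
        running += count
    return functional, specialized

tokens = "the cat sat on the mat and the dog sat on the rug".split()
functional, specialized = split_lexicon(tokens)
print(sorted(functional))  # → ['on', 'sat', 'the']
```

On a sample this tiny, "sat" lands in the functional set purely by frequency; with realistic amounts of text the high-mass head converges on the closed-class words.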
Once you master Level 1 you have the infrastructure to hang the rest of the
language on. Level 2 is basically a fill-in-the-blank process (or
extract-out-of-the-blanks), the blanks being the N's, V's, ADJ's and so on.
As you run through one of the sentences above, you extract all the
specialized words from their N's, V's, and so on, and the state that
the Level 1 processor is currently in determines how the accumulated
items are handled.
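The extraction step can be sketched given a closed-class word list (assumed here; distinguishing the N/V/ADJ slots would require part-of-speech information, so a single numbered placeholder stands in for all of them):

```python
def split_levels(tokens, functional):
    """Separate a sentence into its Level 1 skeleton -- functional
    words plus numbered blanks -- and the Level 2 content words
    extracted from those blanks."""
    template, slots = [], []
    for tok in tokens:
        if tok in functional:
            template.append(tok)
        else:
            template.append("_%d" % len(slots))
            slots.append(tok)
    return template, slots

closed_class = {"the", "a", "on", "and", "of"}  # assumed, not exhaustive
template, slots = split_levels("the cat sat on the mat".split(), closed_class)
print(" ".join(template))  # → the _0 _1 on the _2
print(slots)               # → ['cat', 'sat', 'mat']
```

The template is what the Level 1 processor sees; the slot contents are what Level 2 accumulates, indexed by the state in which each blank was encountered.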
Also, constraints can propagate down to level 1. For instance, as you pick up
a specialized noun it may effectively prune off some branch in the Markov
process at some arbitrary future point.
Every word will have a set of conditional probabilities associated with it
that will constrain how states further down the line are allowed to proceed.
This too can be trained by presenting the machine with a large enough sample
(but ONLY after Level 1 is mastered).
All the semantic constraints will arise from the lexical constraints
generated by Level 2.
Also, new specialized words can be induced quite easily according to which
blank they were picked up from. This too can be governed by the constraints
described above.
And so it appears that natural language has a 2-level syntactic structure
which admits a training procedure.
------------------------------
End of NL-KR Digest
*******************