NL-KR Digest (9/19/86 11:10:39) Volume 1 Number 8
Today's Topics:
Notes on AAAI '86
attribute grammars
Re: attribute grammars
Re: Q: How can structure be learned? A: PDP
Re: Q: How can structure be learned? Pedantic quibble (short)
Looking for Production Rules for English Grammar
request for core nl system code
----------------------------------------------------------------------
Date: Thu, 21 Aug 86 07:28 EDT
From: B.KORT <ulysses!mhuxr!mhuxt!houxm!hounx!kort@ucbvax.Berkeley.EDU>
Subject: Notes on AAAI '86
[Excerpted from AIList]
Notes on AAAI
Barry Kort
Abstract
The Fifth Annual AAAI Conference on Artificial Intelligence
was held August 11-15 at the Philadelphia Civic Center.
These notes record the author's personal impressions of the
state of AI, and the business prospects for AI technology.
The views expressed are those of the author and do not
necessarily reflect the perspective or intentions of other
individuals or organizations.
* * *
The American Association for Artificial Intelligence held
its Fifth Annual Conference during the week of August 11,
1986, at the Philadelphia Civic Center.
Approximately 5000 attendees were treated to the latest
results of this fast growing field. An extensive program of
tutorials enabled the naive beginner and technical professional
alike to rise to a common baseline of
understanding. Research and Science Sessions concentrated on
the theoretical underpinnings, while the complementary
Engineering Sessions focused on reduction of theory to
practice.
Dr. Herbert Schorr of IBM delivered the Keynote Address.
His message was simple and straightforward: AI is here
today, it's real, and it works. The exhibit floor was a sea
of high-end workstations, running flashy applications
ranging from CAT scan imagery to automated fault diagnosis,
to automated reasoning, to 3-D scene animation, to
iconographic model-based reasoning. Symbolics, TI, Xerox,
Digital, HP, Sun, and other vendors exhibited state of the
art hardware, while Intellicorp, Teknowledge, Inference,
Carnegie-Mellon Group, and other software houses offered
knowledge engineering power tools that make short work of
automated reasoning.
Knowledge representation schemata include the ubiquitous tree,
as well as animated iconographic models of dynamic systems.
Inductive and deductive reasoning and goal-directed logic
appear in the guise of forward and backward chaining
algorithms, which seek the desired chain of nodes linking
premise to predicted conclusion or hypothesis to observed
symptoms. Such schemata are especially well adapted to
diagnosis of ills, be it human ailment or machine
malfunction.
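The forward-chaining style described above can be sketched in a few
lines; the rules and facts below are invented for illustration and are
not from any system shown at the conference:

```python
# Minimal forward-chaining sketch. Rules pair a set of premise facts
# with a single concluded fact; firing repeats until nothing new appears.
RULES = [
    ({"fever", "cough"}, "flu-suspected"),
    ({"flu-suspected", "fatigue"}, "recommend-rest"),
]

def forward_chain(facts, rules):
    """Fire any rule whose premises are all known, adding its
    conclusion, until the fact set stops growing."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(forward_chain({"fever", "cough", "fatigue"}, RULES)))
```

Backward chaining runs the same rules in the other direction, starting
from a goal and seeking premises that support it.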
Natural Language understanding remains a hard problem, due
to the inscrutable ambiguity of most human-generated
utterances. Nevertheless, silicon can diagram sentences as
well as a precocious fifth grader. In limited domain
vocabularies, the semantic content of such diagrammatic
representations can be reliably extracted.
[...]
Qualitative reasoning, model-based reasoning, and reasoning
by analogy still require substantial human guidance, perhaps
because of the difficulty of implementing the interdomain
pattern recognition which humans know as analogy, metaphor,
and parable.
[...]
AI goes hand in hand with Theories of Learning and
Instruction, and the field appears to be paying dividends in
the art and practice of knowledge exchange, following the
strategy first suggested by Socrates some 2500 years ago.
The dialogue format abounds, and mixed initiative dialogues
seem to capture the essence of mutual teaching and
mirroring. Perhaps sanity can be turned into an art form
and a science.
Belief Revision and Truth Maintenance enable systems to
unravel confusion caused by the injection of mutually
inconsistent inputs. Nobody's fool, these systems let the
user know that there's a fib in there somewhere.
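The contradiction-spotting behavior described above can be reduced to a
toy check (a real truth-maintenance system also tracks the justifications
behind each belief; this invented fragment only detects the clash):

```python
# Toy inconsistency check in the spirit of truth maintenance:
# a belief set is inconsistent if it holds both P and "not P".
def inconsistent(beliefs):
    """Return True when some belief and its negation coexist."""
    return any(("not " + b) in beliefs for b in beliefs)

print(inconsistent({"valve-open", "not valve-open"}))  # -> True
print(inconsistent({"valve-open", "pump-on"}))         # -> False
```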
[...]
* * *
Lincroft, NJ
August 17, 1986
------------------------------
Date: Thu, 18 Sep 86 14:09 EDT
From: Marilyn Walker Friedman <lyn%hplmwf@hplabs.HP.COM>
Subject: attribute grammars
Does anyone know of any references for attribute grammars? I am
interested in comparing their formal properties with feature passing
grammar formalisms defined from a linguistic perspective.
friedman@hplabs
-------
------------------------------
Date: Thu, 18 Sep 86 14:34 EDT
From: Brad Miller <miller@UR-ACORN.ARPA>
Subject: Re: attribute grammars
Date: Thu, 18 Sep 86 11:09:59 PDT
From: Marilyn Walker Friedman <lyn%hplmwf@hplabs.HP.COM>
Does anyone know of any references for attribute grammars? I am
interested in comparing their formal properties with feature passing
grammar formalisms defined from a linguistic perspective.
friedman@hplabs
-------
You might look at Winograd's _Language as a Cognitive Process_, Volume 1:
Syntax, Addison-Wesley, 1983, which talks about augmented phrase structure
grammars. The original paper on the subject is by Knuth: "Semantics of
context-free languages", _Mathematical Systems Theory 2_ (1968), 127-145.
Errata: _Mathematical Systems Theory 5_ (1971), 95-96.
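Knuth's paper develops attribute grammars around binary numerals as a
running example; the synthesized-attribute rule can be sketched as
follows (the code is an illustrative reconstruction, not from the paper):

```python
# Attribute-grammar sketch with a synthesized attribute "val".
# Grammar: N -> N B | B ;  B -> '0' | '1'
# Attribute rules: val(B) = digit;  val(N B) = 2*val(N) + val(B).
def val(numeral):
    """Evaluate a binary numeral bottom-up, exactly as the
    synthesized-attribute rules for N -> N B dictate."""
    v = 0
    for bit in numeral:
        v = 2 * v + int(bit)   # rule for production N -> N B
    return v

print(val("1101"))  # -> 13
```

Feature-passing formalisms differ chiefly in also allowing inherited
attributes to flow down the tree, which this one-pass sketch omits.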
Brad Miller
------
miller@rochester.arpa
miller@ur-acorn.arpa
------------------------------
Date: Tue, 19 Aug 86 14:03 EDT
From: Jeffrey Goldberg <SU-Russell!goldberg@glacier.stanford.edu>
Subject: Re: Q: How can structure be learned? A: PDP
I wish to make it clear that my own opinions are not reflected by
the quote made...
In article <2814@sdcc6.ucsd.EDU> ix133@sdcc6.ucsd.EDU (Catherine L. Harris) writes:
>Jeffrey Goldberg says (in an immediately preceding article),
>
>> Chomsky has set himself up asking the question: "How can children,
>> given a finite amount of input, learn a language?" The only answer
>> could be that children are equipped with a large portion of language to
>> begin with. If something is innate then it will show up in all
>> languages (a universal), and if something is unlearnable then it, too,
>> must be innate (and therefore universal).
>
>Cathy Harris
In that paragraph, I was presenting the Chomsky view (and
ridiculing it). For those of you who did not see my original
posting, it is in net.nlang.
I will refrain from presenting a lengthy response to Harris's
posting. (I have work to do, and I sent more over the net in the
past week than I have in my entire life.) But I will say that her
attack on language universals is an attack on Chomsky, and there
are people (linguists even) who believe in language universals, but
share her objections to the Chomsky line. I realize that my
original posting was very long (and should have been edited down),
but I would suggest to Catherine Harris that she make a hard copy of
it, and read it more carefully. She will find that we agree more
than we disagree.
-Jeff Goldberg {ucbvax, pyramid}!glacier!russell!goldberg
--
/*
** Jeff Goldberg (best reached at GOLDBERG@SU-CSLI.ARPA)
*/
------------------------------
Date: Wed, 27 Aug 86 08:37 EDT
From: Gilbert Cockton <mcvax!ukc!cstvax!hwcs!aimmi!gilbert@seismo.css.gov>
Subject: Re: Q: How can structure be learned? Pedantic quibble (short)
In article <2814@sdcc6.ucsd.EDU> ix133@sdcc6.ucsd.EDU (Catherine L. Harris) writes:
> Is their language one which requires strict word-order (e.g.,
>English) or can word-order vary (Turkish)?
Unfortunately, English today does not have exactly a strict word-order
consistently wherever adverbs are in fact concerned now! A matter of
taste, moreover, is the non-rule for much clausal ordering.
(In general I found this informative and well written, making me
a little apprehensive about adding this little point of information).
------------------------------
Date: Thu, 14 Aug 86 11:58 EDT
From: EDMUNDSY%northeastern.edu@CSNET-RELAY.ARPA
Subject: Looking for Production Rules for English Grammar
[forwarded from AIList]
Does anyone know where I can find information (or existing results) on
transforming an English (or simplified subset) grammar into production rules
of a regular, context-free, or context-sensitive grammar? For example,
Sentences --> Noun Verb Noun etc.
If anyone has any information on that, I would appreciate it if you could
leave me a pointer. Thanks!! I can be contacted by any of the
following means:
NET: EDMUNDSY@NORTHEASTERN.EDU
ADD: Sy, Bon-Kiem
Northeastern University
Dept. ECE DA 409
Boston, MA 02115
Phone: (617)-437-5055
Bon Sy
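[Production rules of the kind requested above can be sketched directly;
the tiny grammar below is invented for illustration, and any usable
grammar of English would of course be far larger:]

```python
# Toy context-free production rules for a tiny English fragment.
GRAMMAR = {
    "S":    [["NP", "VP"]],
    "NP":   [["Det", "Noun"]],
    "VP":   [["Verb", "NP"]],
    "Det":  [["the"], ["a"]],
    "Noun": [["dog"], ["cat"]],
    "Verb": [["sees"], ["sleeps"]],
}

def generate(symbol="S"):
    """Expand a derivation using the first alternative of each rule;
    symbols absent from the grammar are terminals."""
    if symbol not in GRAMMAR:
        return [symbol]
    words = []
    for sym in GRAMMAR[symbol][0]:
        words.extend(generate(sym))
    return words

print(" ".join(generate()))  # -> "the dog sees the dog"
```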
------------------------------
Date: Sat, 13 Sep 86 13:44:47 pdt
From: ucsbcsl!uncle@ucbvax.Berkeley.EDU
Subject: request for core nl system code
[forwarded from AIList]
We are looking for a core nl system which we can tailor and
extend. There is as yet little comp.ling activity at UCSB,
so we have no local sources. We are interested in developing
a system which can be used in foreign language education, hence
we would need a system in which the "syntactic components"
are such that we could incrementally mung the system into
speaking German or French or Russian without having to
redesign the system. My knowledge in this area is fuzzy
(not 'Fuzzy(tm)' etc, just fuzzy!).
I have read a little about systems such as the Phran component of the
Wilensky et al. project called unix-consultant, and I
understand that the approach taken there is susceptible
to generalization to other languages by entering a new
data-base of pattern-action pairs (i.e. an EXACT parse of
a syntactically admissible sentence is not required). Unfortunately,
Berkeley CS is not currently giving access to components of that system.
Does anyone have pointers to available code for systems
that fall into that part of the syntax-semantics spectrum?
Is it, in fact, reasonable for us to seek such a system as
a tool, or are we better advised to start with car and cdr ????
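[The pattern-action idea mentioned above can be sketched as follows;
the patterns and actions are invented for illustration and are not from
the actual Phran system:]

```python
# Pattern-action sketch: each pattern is a literal prefix ending in '*',
# which captures the remaining words; no exact full parse is required.
PAIRS = [
    (["how", "do", "i", "*"], lambda rest: "explain task: " + " ".join(rest)),
    (["what", "is", "*"],     lambda rest: "define: " + " ".join(rest)),
]

def interpret(sentence):
    """Apply the first pattern whose literal prefix matches the input."""
    words = sentence.lower().split()
    for pattern, action in PAIRS:
        prefix = pattern[:-1]                 # literals before the '*'
        if pattern[-1] == "*" and words[:len(prefix)] == prefix:
            return action(words[len(prefix):])
    return None

print(interpret("What is a symbolic link"))  # -> "define: a symbolic link"
```

Porting to another language would, as the posting suggests, mean
supplying a new table of pairs rather than redesigning the matcher.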
------------------------------
End of NL-KR Digest
*******************