NL-KR Digest             (6/01/87 16:23:06)            Volume 2 Number 47 

Today's Topics:
NL-KR the list...
Re: In layman's terms.
Re: English grammar (was Re: In layman's terms.)

----------------------------------------------------------------------

Date: Fri, 29 May 87 17:11 EDT
From: Brad Miller <miller@ACORN.CS.ROCHESTER.EDU>
Subject: NL-KR the list...

Is now available on USENET in group comp.ai.nlang-know-rep as part of the
massive USENET project to have all arpa lists forwarded to their own groups.

If you can receive this group and would prefer to read the digests there,
please send a message to nl-kr-request@cs.rochester.edu so I can remove you
from the separate mailing list.

Brad Miller
nl-kr moderator
nl-kr-request@cs.rochester.edu

------------------------------

Date: Tue, 26 May 87 18:12 EDT
From: mnetor!utzoo!utgpu!utcsri!uthub!ecf!edusoft@seismo.css.gov
Subject: Re: In layman's terms.

In article <13263@watmath.UUCP> erhoogerbeet@watmath.UUCP (Edwin (Deepthot)) writes:
>Is there a Backus-Naur Form for the English language itself or is this too
>complicated? If not, how is it that we can understand a huge variety of
>different sentence forms and still recognize that some are syntactically
>incorrect? Basically, what I am asking is: is it possible to do syntactic
>checking as if "compiling" a sentence with rules set down in some BNF?

The most easily locatable reference I know of for a phrase-structure
grammar of English (you'll have to convert it to BNF) is:

Generalized Phrase Structure Grammar
Gerald Gazdar, Ewen Klein, Geoffrey Pullum and Ivan Sag
1985
Blackwell and Harvard University Press

Mind you, that'll be quite a task, as GPSG abstracts away from
atomic node labels and order in constituents.

You'd probably have to write a GPSG to BNF compiler.
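
For a concrete (if toy) sense of what "compiling" a sentence against
BNF-style rules looks like, here is a minimal sketch in Python; the
grammar, lexicon, and names are invented for illustration and are not
drawn from GPSG:

    # Toy BNF-style grammar in Chomsky normal form:
    #   S -> NP VP,  NP -> Det N,  VP -> V NP
    # recognize() is a naive CYK-style recognizer over that grammar.
    GRAMMAR = {
        ("NP", "VP"): "S",
        ("Det", "N"): "NP",
        ("V", "NP"): "VP",
    }
    LEXICON = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "saw": {"V"}}

    def recognize(words):
        n = len(words)
        # table[i][j] = nonterminals deriving words[i..j] inclusive
        table = [[set() for _ in range(n)] for _ in range(n)]
        for i, w in enumerate(words):
            table[i][i] = set(LEXICON.get(w, ()))
        for span in range(2, n + 1):
            for i in range(n - span + 1):
                j = i + span - 1
                for k in range(i, j):
                    for b in table[i][k]:
                        for c in table[k + 1][j]:
                            if (b, c) in GRAMMAR:
                                table[i][j].add(GRAMMAR[(b, c)])
        return "S" in table[0][n - 1]

    print(recognize("the dog saw the cat".split()))  # True
    print(recognize("dog the saw cat the".split()))  # False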

>To be semantically correct, a sentence must be syntactically correct.

A lot of people would agree with this statement, but this does not
allow for partial interpretation of sentences.

It is also possible to interpret syntactically incorrect sentences, such as:

Me went to the store.

>But there must be someone out there on the net who can set me straight and
>do it in layman's terms.

I have my doubts whether anything in linguistics can be explained in
layman's terms.

bill idsardi
computational linguist
educational software products
1263 bay st.
toronto, ontario
canada
M5R 2C1
922-0087
...utzoo!utai!utecfa!edusoft

------------------------------

Date: Fri, 29 May 87 00:39 EDT
From: sriram@ATHENA.MIT.EDU
Subject: Re: In layman's terms

I taught a course on applied AI (undergraduate) this spring. A third of
the course (one month) was dedicated to evaluating commercial NL
systems. I found the following material useful:

1. Natural Language Understanding, by M. Harris, Reston
Publishers (I believe) [Very good introductory text]

2. Terry Winograd's book on Syntax (the first and seventh
chapters)

3. Grosz, B., et al., Readings in Natural Language Processing,
Morgan-Kaufmann Publishers, 1986. [The sections on LIFER,
ATNs were very useful]

I have yet to read the books by Cullingford and Allen.

The students used Q&A (costs around $200) for their projects. They all
felt it was a great system. It has its limitations, but it proved very
useful for my course.

sriram@athena.mit.edu
dept of civil eng.
(617)253-6981

------------------------------

Date: Fri, 29 May 87 17:44 EDT
From: Mark A. Johnson <kodak!elmgate!mj@cs.rochester.edu>
Subject: Re: In layman's terms.

I recently finished an experimental graduate course in
Natural Language Processing at Purdue, where I am working on
my MSEE. It was offered for graduate credit jointly by
the schools of Electrical Engineering and English, and co-taught
by an EE and a linguistics professor. Due to
other time constraints, I did not complete the final project,
but understood the material well enough to have done so if
I'd not been so rushed. The course used Terry Winograd's
LANGUAGE AS A COGNITIVE PROCESS: VOLUME I: SYNTAX, which
discusses context-free phrase-structure grammars,
active chart parsing, transformational grammars, and
describes recursive and augmented transition networks (RTN's
and ATN's) well enough that implementation is
really not too difficult. One of the nice things about the ATN
parser is that (for the language described by the ATN), it can
be written to produce a recursive data structure that describes
all of the possible meanings of a sentence available to the ATN.
This means that, with a carefully described grammar and for a
particular subset of a language, syntactically unambiguous
sentences will produce a unique parse, and ambiguous sentences
("Time flies like an arrow") will produce multiple parses.
Semantic analyzers specially designed for acting on the frames
produced by the ATN parser can then choose an appropriate parse,
ask for clarification, update knowledge bases, etc.
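
As a rough illustration of the multiple-parse idea, here is a toy
sketch of mine in Python. It is not an actual ATN (a real one adds
registers and tests on top of the recursive transitions), and the
grammar and lexicon are invented, but it does return every analysis
the rules allow:

    import itertools

    LEXICON = {"time": {"N", "V"}, "flies": {"N", "V"},
               "like": {"V", "P"}, "an": {"Det"}, "arrow": {"N"}}
    RULES = {  # category -> list of possible right-hand sides
        "S":  [["NP", "VP"]],
        "NP": [["N"], ["N", "N"], ["Det", "N"]],
        "VP": [["V", "NP"], ["V", "PP"]],
        "PP": [["P", "NP"]],
    }

    def splits(words, n):
        """All ways to cut words into n non-empty contiguous parts."""
        if n == 1:
            yield [words]
            return
        for i in range(1, len(words) - n + 2):
            for rest in splits(words[i:], n - 1):
                yield [words[:i]] + rest

    def parses(cat, words):
        """Yield every parse tree of category cat spanning words."""
        if len(words) == 1 and cat in LEXICON.get(words[0], set()):
            yield (cat, words[0])
        for rhs in RULES.get(cat, []):
            for parts in splits(words, len(rhs)):
                for kids in itertools.product(
                        *[list(parses(c, p)) for c, p in zip(rhs, parts)]):
                    yield (cat,) + kids

    for tree in parses("S", "time flies like an arrow".split()):
        print(tree)
    # Two trees: [NP time][VP flies [PP like an arrow]] and
    # [NP time flies][VP like [NP an arrow]]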

I can give a limited recommendation for Winograd's book: it is fairly
understandable, but it was quite hard to find anyone who
liked his algorithm notation. Winograd introduces an algorithm
description language called "DL", and I can't say I like it much.
It does have the advantage of separating the topic from the
technology, i.e., you don't have to know any particular programming
language to understand the algorithms. Ain't my cup 'o' tea, tho'.

One of the posters mentioned "waiting FOREVER for Volume II"
of Winograd's book to come out. Don't hold your breath. I
think I'll get the final volume (9 is it?) of THE ART
OF COMPUTER PROGRAMMING before I buy Winograd Vol II. I could be
wrong, though, and would love to see Vol II in print.

-------------------------------------------------------------------------------
Mark A Johnson - Eastman Kodak Co. - Dept 646 - KEEPS - Rochester, NY 14650
The opinions expressed above are not necessarily those of Eastman Kodak Company
and are entirely my responsibility.

------------------------------

Date: Sun, 31 May 87 04:47 EDT
From: Steve Bloch <ma188saa@sdcc3.ucsd.EDU>
Subject: Re: In layman's terms

erhoogerbeet@watmath.uucp writes:
>To be semantically correct, a sentence must be syntactically correct.

Well, now... let's think about e.e. cummings.

------------------------------

Date: Mon, 1 Jun 87 09:21 EDT
From: mark edwards <edwards@uwmacc.UUCP>
Subject: Re: In layman's terms.

In article <639@elmgate.UUCP> mj@elmgate.UUCP (Mark A. Johnson) writes:
:
: One of the nice things about the ATN
: parser is that (for the language described by the ATN), it can
: be written to produce a recursive data structure that describes
: all of the possible meanings of a sentence available to the ATN.
: This means that, with a carefully described grammar and for a
: particular subset of a language, syntactically unambiguous
: sentences will produce a unique parse, and ambiguous sentences
: ("Time flies like an arrow") will produce multiple parses.
: Semantic analyzers specially designed for acting on the frames
: produced by the ATN parser can then choose an appropriate parse,
: ask for clarification, update knowledge bases, etc.

An ATN must be intuitively pleasing for an MSEE. And a simple
sentence like "Time flies like an arrow" is easy enough for
the parser to parse; there are not many permutations for a string
of five words. (Of course, for languages like Japanese, with a lot
of homonyms, it could be much harder.) But as sentences get longer,
and gapping and deletion start to happen, the inappropriateness
of recursive structures bogs down the processing.
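
To put a rough number on the blow-up: the count of binary-branching
analyses over n attachment sites (the classic prepositional-phrase
pile-up) grows as the Catalan numbers. A quick, purely illustrative
computation in Python:

    from math import comb

    def catalan(n):
        """Number of full binary trees over n+1 leaves."""
        return comb(2 * n, n) // (n + 1)

    for n in range(1, 8):
        print(n, catalan(n))
    # 1 1, 2 2, 3 5, 4 14, 5 42, 6 132, 7 429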

mark
--
edwards@unix.macc.wisc.edu
{allegra, ihnp4, seismo}!uwvax!uwmacc!edwards
UW-Madison, 1210 West Dayton St., Madison WI 53706

------------------------------

Date: Thu, 28 May 87 12:31 EDT
From: Randy Gordon <uunet!iscuva!randyg@seismo.css.gov>
Subject: Re: English grammar (was Re: In layman's terms.)

*Sigh*, when I make an unconscious slip, it's a doozy, ain't it?

It was my third try at getting it past the article eater, and I mistyped.
OF COURSE it's semantics more than syntactics...

Anyhow, some of the better intros:

The best of the current crop seems to be Introduction to Natural Language
Processing by Mary D. Harris. It is clear, well written, and at a beginner's
level (approximately equivalent to a Scientific American article). (c) 1985, Reston.

Everyone's mentioned Conceptual Structures: Information Processing in Mind and
Machine by J. F. Sowa. For some reason, I get a headache when I read it, which
is unfair, since it is a very well written book and appears to be easy.

Schank emits books at the speed of a Turbo Prolog compiler, and they are ALL
readable. If you are into Lisp (for which I pity you), then a rather old book,
Inside Computer Understanding: Five Programs plus Miniatures, is about the
nicest intro to Conceptual Dependency around. It has very thorough
explanations, plus programs written in pre-Common Lisp. If you ignore the
instructions on how to implement primitives that are probably already supplied
by your Lisp (such as the MAP??? functions), the programs are easy to implement.

Wilensky has done some rather nice work, such as UC, a smart natural-language
UNIX help consultant I have been dying to get my hands on,
and the PEARL, PAM, PANDORA class of programs. He has a new book out,
Planning and Understanding: A Computational Approach to Human Reasoning,
Addison-Wesley, that I am just starting to read; it looks like a
fascinating example of the forefront of the science.

I have been waiting FOREVER for the second volume of Terry Winograd's Language
as a Cognitive Process; the first volume does a wonderful, if slightly uneven,
job of explaining syntax. It's an excellent reference if
you are reading someone else's paper and trying to figure out what they
are talking about when they discuss some obscure grammar problem.

These should get you started.

Randy Gordon, "No relation to the Idiot who spoonerized the previous message"

------------------------------

Date: Thu, 28 May 87 15:28 EDT
From: ihnp4!homxb!houxm!mtuxo!mtgzz!bds@ucbvax.Berkeley.EDU
Subject: Re: English grammar (was Re: In layman's terms.)

In article <2112@husc6.UUCP>, hughes@endor.harvard.edu (Brian Hughes) writes:
> In article <1116@houdi.UUCP> marty1@houdi.UUCP (M.BRILLIANT) writes:
> (summarized)
> >In article <13263@watmath.UUCP>, erhoogerbeet@watmath.UUCP writes:
> >> ...
> >> Is there a Backus-Naur Form for the English language itself or is this too
> >> complicated? ... Basically, what I am asking is: is it possible to do syntactic
> >> checking as if "compiling" a sentence with rules set down in some BNF?
>
> Natural language is not context free (though some people disagree
> on this). BNF formalisms cannot deal with context sensitive languages

What I've seen in the literature is a definite trend towards
considering English as context free. I think it was Patrick Winston who
wrote in his AI text that examples of context-sensitive English were
really examples of ambiguous English. Context-free parsing algorithms
can handle ambiguous grammars, and in fact a BNF-like formalism for
parsing English was used in the LSP system (described in a book I don't
have handy right now). Ambiguities in parses were
resolved through a "restriction language", which was essentially a
set of rules that ruled out invalid parses (this is analogous to how many
C compilers handle parsing of "lvalues": the grammar just knows about
expressions, and the semantics worry about the "lvalue" quality).
The LSP grammar for English was quite large (but so are the ATNs; take
a peek at the one for LUNAR!) and was still evolving even as the book was
written. Still, the issue has not been resolved to my knowledge.
It's also worthwhile to look at some work being done
with WASP systems (a.k.a. the Marcus parser; see Winston's AI book again).
There are serious arguments that WASP systems model human parsing of
English, and they are being used as a basis for theories of how English is
learned (see "The Acquisition of Syntactic Knowledge" by R. C. Berwick,
MIT Press, 1985).
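
A toy paraphrase of that parse-then-restrict idea in Python (my sketch,
not LSP's actual restriction language): let the grammar overgenerate,
then prune the bad analyses with declarative checks.

    # Candidate analyses for "the sheep runs"; "sheep" is ambiguous
    # between singular and plural, so the grammar overgenerates.
    candidates = [
        {"subject_num": "sg", "verb_num": "sg"},   # sheep (sg) runs
        {"subject_num": "pl", "verb_num": "sg"},   # *sheep (pl) runs
    ]

    # A "restriction": subject and verb must agree in number.
    restrictions = [lambda p: p["subject_num"] == p["verb_num"]]

    valid = [p for p in candidates if all(r(p) for r in restrictions)]
    print(valid)  # only the singular-subject analysis survives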

------------------------------

Date: Sat, 30 May 87 15:45 EDT
From: Jeffrey Goldberg <goldberg@su-russell.ARPA>
Subject: Re: English grammar (was Re: In layman's terms.)

In article <2112@husc6.UUCP> hughes@endor.UUCP (Brian Hughes) writes:
>In article <1116@houdi.UUCP> marty1@houdi.UUCP (M.BRILLIANT) writes:
> (summarized)
>>In article <13263@watmath.UUCP>, erhoogerbeet@watmath.UUCP writes:
>>> ...
>>> Is there a Backus-Naur Form for the English language itself or is this too
>>> complicated? ... Basically, what I am asking is: is it possible to do syntactic
>>> checking as if "compiling" a sentence with rules set down in some BNF?

> Natural language is not context free (though some people disagree
>on this). BNF formalisms cannot deal with context sensitive languages

I don't think that there is any serious disagreement here. The
work done by Culy on Bambara reduplication and by Shieber on Swiss German
cross-serial dependencies has convinced the last holdouts for CFness
(Geoff Pullum, Gerald Gazdar, and their students: me, Culy, etc.).
(Reduplication yields strings of the form ww, the copy language, which
no context-free grammar can generate.)

>>About 30 years ago when I was at MIT doing graduate study in EE, my
>>wife was talking with a guy named Chomsky who wanted to do machine
>>translation. The effort resulted in new approaches to English grammar,
>>but not in machine translation.

> While in a strict sense this is true, Chomsky's transformational
>grammar seems to be almost universally accepted as the basis upon which to
>build models that deal with the syntax of natural languages. This is true
>for computerized models as well as pure abstract models.

This is hardly true at all. It is true that "generative grammar" is
nearly universally accepted, and this comes from Chomsky. While
the most popular current generative theory is transformational
(Government and Binding theory), the role of transformations has
been reduced radically, and much more emphasis is placed on
interacting well-formedness conditions on different levels of
representation.

Substantial minority theories, Generalized Phrase Structure
Grammar and Lexical-Functional Grammar, do not employ
transformations.

A summary of these three theories can be found in "Lectures
on Contemporary Syntactic Theories: An Introduction to
Government-Binding Theory, Generalized Phrase Structure Grammar,
and Lexical-Functional Grammar" by Peter Sells, published by the
Center for the Study of Language and Information and distributed
by the University of Chicago Press.

I have seen implementations based on LFG and GPSG (and an offshoot
of the latter), as well as some other nontransformational models. I
have only once seen a GB-based parser. It was very clever, but
it only parsed four sentences.

None of these theories was constructed with computer processing in
mind, but it does turn out that it is often easier to build a
parser based on nontransformational representations. None of the
authors of these theories would claim that their theory was a
better linguistic theory because of this property.

>>> As I understand it so far, natural language processing would have at least
>>> two levels (syntactic, semantic) and that syntactic checking level would
>>> be the basis of the other.

I have seen parsers that build up semantic representations along
with the syntax, in which there is no sense that the syntax is
prior.
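
A tiny sketch of that rule-to-rule style in Python (my own
illustration, not any of the cited systems): each rule pairs a
syntactic category with a semantic action, so the meaning is assembled
as the parse proceeds rather than in a separate later pass.

    # Each lexical entry carries (category, semantics); the S rule
    # composes the semantics while it reduces NP VP to S.
    LEXICON = {
        "fido":  ("NP", "FIDO"),
        "barks": ("VP", lambda subj: ("BARK", subj)),
    }

    def parse_s(words):
        """Parse 'NP VP' and return syntax and semantics together."""
        (c1, sem1), (c2, sem2) = LEXICON[words[0]], LEXICON[words[1]]
        assert (c1, c2) == ("NP", "VP")
        return ("S", sem2(sem1))  # semantics built during the reduction

    print(parse_s("fido barks".split()))  # ('S', ('BARK', 'FIDO'))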

--
Jeff Goldberg
ARPA goldberg@russell.stanford.edu
UUCP ...!ucbvax!russell.stanford.edu!goldberg

------------------------------

Date: Mon, 1 Jun 87 09:09 EDT
From: Tom Frauenhofer <tfra@ur-tut.UUCP>
Subject: Re: English grammar (was Re: In layman's terms.)

[Et tu, line-eater?]

Actually, there is a (one-paragraph) discussion comparing BNF with Transition
Networks in the latest issue of AI Magazine (Volume 8, Number 1). It is part
of the article "YANLI: A Powerful Natural Language Front-End Tool" by
John C. Glasgow II. It even includes an example of a BNF and a Transition
Network representation of the same grammar fragment.
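
Not Glasgow's example, but a generic illustration of the correspondence
(sketched in Python, with an invented lexicon): the same noun-phrase
fragment written as BNF and as a transition network.

    # BNF:  NP ::= Det N | Det Adj N
    # The network generalizes slightly with a Kleene loop (Det Adj* N),
    # a common transition-network convenience.
    NP_NET = {
        "q0": [("Det", "q1")],
        "q1": [("Adj", "q1"), ("N", "q2")],
        "q2": [],                            # accepting state
    }
    LEXICON = {"the": "Det", "big": "Adj", "red": "Adj", "ball": "N"}

    def accepts(net, words, state="q0"):
        """Walk the network, consuming one word per arc."""
        if not words:
            return state == "q2"
        cat = LEXICON.get(words[0])
        return any(accepts(net, words[1:], nxt)
                   for arc_cat, nxt in net[state] if arc_cat == cat)

    print(accepts(NP_NET, "the big red ball".split()))  # True
    print(accepts(NP_NET, "ball the".split()))          # False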

------------------------------

End of NL-KR Digest
*******************
