NL-KR Digest             (4/05/88 20:18:00)            Volume 4 Number 39 

Today's Topics:
acquiring semantic knowledge
Reflexives
Perlmutter's address
Semantics - is it circular?
Thought without words
Re: language, thought, and culture
Communication & Information

Language & Cognition Seminar
seminar - Dative Questions : Grammar & Processing (CMU)
From CSLI Calendar, March 31, 3:22
Unisys AI seminar

Submissions: NL-KR@CS.ROCHESTER.EDU
Requests, policy: NL-KR-REQUEST@CS.ROCHESTER.EDU
----------------------------------------------------------------------

Date: Thu, 31 Mar 88 01:35 EST
From: Andrew Jennings <munnari!trlamct.oz.au!andrew@uunet.UU.NET>
Subject: acquiring semantic knowledge

I'm involved with a project that aims to improve the process of acquisition of
semantic and syntactic knowledge for task-related natural language dialogue.
One of the areas I am focussing on is providing as much support as possible to
the construction of large grammars. It seems that whilst we have reasonable
parsing techniques for this sort of problem, we still have the problem of
constructing and managing a large amount of semantic knowledge.

Given this objective, I'm obviously interested in recent work that I might not
be aware of, and making contact with people working along similar lines. I have
consulted all of the available bibliographies (e.g. the Stanford on-line 80's
bibliography).

--
UUCP: ...!{uunet, mcvax, ucb-vision, ukc}!munnari!trlamct.trl!andrew
ARPA: andrew%trlamct.trl.oz@uunet.uu.net
Andrew Jennings AI Technology Telecom Australia Research Labs

------------------------------

Date: Thu, 31 Mar 88 13:04 EST
From: PCOLE%UIUCVMD.BITNET@CORNELLC.CCS.CORNELL.EDU
Subject: Reflexives

I am working on a paper on reflexives across languages and wonder if any
reader could provide me with some information I need. I am looking for
languages in which there is a single UNINFLECTED reflexive form. An
example would be Chinese 'ziji' SELF or Korean 'caki' SELF. These forms
do not change in any way regardless of whether the meaning is myself,
himself, yourself etc. in the sentence in which they occur. They also
do NOT agree with any other element in the sentence in gender, number
etc. This rules out forms like Russian 'svoj', that agree with a head
noun in gender, case etc.
I am familiar with a number of east Asian languages, Chinese, Korean,
Thai, Japanese, that have reflexives that meet my criteria, but I am
sure that they exist in diverse languages. Does anyone know of
additional languages, preferably NOT east Asian languages, that have
these forms? If the answer is affirmative, can you suggest references,
or, if you are familiar with the language, might I contact you directly
for some additional information? Thanks in advance for the help.
Peter Cole
Department of Linguistics
University of Illinois
217-344-4878 (home), 244-3056 (office), 333-4166 (message)
PCOLE@UIUCVMD(.BITNET)

------------------------------

Date: Tue, 5 Apr 88 17:37 EDT
From: Mark Maybury <MAYBURY@RADC-TOPS20.ARPA>
Subject: Perlmutter's address

Does anyone have the address (net or otherwise) of David Perlmutter,
founder of Relational Grammar? If not, how about pointers to his
published literature? Please reply directly to maybury@radc-tops20.arpa

Thanks for any help.

Mark

------------------------------

Date: Sun, 3 Apr 88 17:56 EDT
From: Charles <houpt@svax.cs.cornell.edu>
Subject: Semantics - is it circular?

Last year I took a course in semantics. Most of the time was spent
discussing how to transform syntactic structures into the Predicate
Calculus. To me the idea of turning English sentences into Predicate
Calculus statements is a waste of time and a completely circular
operation.

Why? Because the predicate calculus is just another language like
English or French. Any semanticist would agree that translating
an English sentence into a French one will not get you any closer
to the meaning of the sentence - semantic analysis of French is
just as hard as semantic analysis of English. Similarly translating
English into the Pred. Calc. is a waste of time.
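For concreteness, the kind of translation at issue looks like this (a standard textbook example, not taken from the course in question):

```latex
\text{``Every student reads some book''}
\;\leadsto\;
\forall x\,\bigl(\mathrm{student}(x) \rightarrow
    \exists y\,(\mathrm{book}(y) \wedge \mathrm{reads}(x,y))\bigr)
```

The worry, then, is that the right-hand side is itself just another sentence whose symbols still await interpretation.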

At this point you might ask: Isn't the Predicate Calculus a special
super-language created by mathematicians - not just another language
like French? I grant you that the Pred. Calc. is a funny looking
language, but it isn't a super language. Historically the Pred. Calc.
is derived from Greek and Latin. Mathematicians have over the centuries
modified Greek and Latin prose so that today we have an algebra notation
and a logic notation. The logic notation is called the Predicate
Calculus. The Pred. Calc. is a very simple, clean and regular language
but so are languages like Esperanto. The Predicate Calculus is just
Latin in disguise.

So what's going on? Is semantic analysis circular? Is there an alternative?

-Chuck Houpt
houpt@svax.cs.cornell.edu

------------------------------

Date: Sun, 3 Apr 88 21:14 EDT
From: Cliff Joslyn <vu0112@bingvaxu.cc.binghamton.edu>
Subject: Re: Semantics - is it circular?


In article <2114@svax.cs.cornell.edu> houpt@svax.cs.cornell.edu (Charles (Chuck) Houpt) writes:
>Last year I took a course in semantics. [..]

I generally agree with your impressions, not that semantics is
necessarily circular, but rather that you were led down the garden path
by some misguided academics. Apparently what you had was a course in
syntax, not semantics. That is, predicate calculus is actually a
narrower class of language than any natural one, and at any rate, the
semantics of one language cannot be grounded/explained/understood merely
by translating it into another one. Too bad.

>-Chuck Houpt
> houpt@svax.cs.cornell.edu

O---------------------------------------------------------------------->
| Cliff Joslyn, Cybernetician at Large
| Systems Science Department, SUNY Binghamton, New York
| vu0112@bingvaxu.cc.binghamton.edu
V All the world is biscuit shaped. . .

------------------------------

Date: Mon, 4 Apr 88 08:48 EDT
From: Greg Lee <lee@uhccux.UUCP>
Subject: Re: Semantics - is it circular?

From article <2114@svax.cs.cornell.edu>, by houpt@svax.cs.cornell.edu (Charles ):
" ...
"
an English sentence into a French one will not get you any closer
" to the meaning of the sentence - semantic analysis of French is
"
just as hard as semantic analysis of English. Similarly translating
" English into the Pred. Calc. is a waste of time.

There is available for the predicate calculus a theory of implication,
a logic, that is to say, of a sort which is not available for either
English or French. But this may be an accident of history and the greater
regularity of predicate calculus. Frederick Fitch gave a logic for
a fragment of English, for instance.

"
So what's going on? Is semantic analysis circular? Is there an alternative?

Yes, semantic analysis is circular. No, there is no alternative.

At least, it is circular in a certain sense, if we leave aside certain
kinds of efforts, which usually are left aside by linguists and philosophers.
The sense in which it is circular is this: questions about the meaning
of sentences are answered by giving other implicationally related sentences.
Paraphrases, for instance. It is often less than obvious that this is
the kind of answer that is being given, since typically an esoteric
notation is chosen so that the answers will seem deep.

The kinds of efforts to be left aside are those requiring investigation
of what people know about life, in detail, and how people perceive and
react to stimuli, in detail. Of these things, linguists will tell you,
ah, interesting, no doubt, but not my field. So if one decides not
to deal with questions that go beyond logical relations of sentences,
then there is no alternative to dealing only with such questions.

Greg, lee@uhccux.uhcc.hawaii.edu

" -Chuck Houpt
"
houpt@svax.cs.cornell.edu

------------------------------

Date: Tue, 5 Apr 88 11:29 EDT
From: John H. Dailey <dailey@batcomputer.tn.cornell.edu>
Subject: Re: Semantics - is it circular?

Is semantics circular? The answer is both yes and no. What happens in
semantics is that a language, either formal or natural, is translated
into another `language' which is supposedly better understood. This
translation allows one to study such questions as truth (not `truth' in
a philosophical sense, but rather some formal characterization of it)
and, in natural languages, entailment, quantification, beliefs, etc. If
you pick up any book on (classical) predicate logic you will see that the
interpretation (or model) of the language is set-theoretic, i.e. set
theory is used to capture intuitions of truth in first-order logic, yet
set theory has lots of problems, and has its own models and
characterizations.

The first semantics described for the (untyped) lambda calculus were
the D-infinity models of Scott; these are limits of c.p.o.'s, each of
which is the function space of the one before it. So what
semantics attempts to do is give `meaning' to a language by having
its symbols mapped onto a mathematical structure which is in some sense
well (or better) understood (this is a very simplistic explanation, but it
will do). One then shows that what is syntactically provable is true or
holds in the semantics and, if you are really lucky, vice versa. Now, the
problem with natural language semantics is that the theories vary from
simplistic translations into predicate logic (perhaps enriched with
modal operators) to Montague Semantics and beyond. The quality of work
also varies: one early semantics claimed that the meaning of a word
such as `cat' was its capitalization, CAT. Often, the semantics is just
disguised first-order logic semantics, such as Hans Kamp's Discourse
Representation Structures (this is just Beth's semantic tableaux).

Another problem is that natural language semantics means different
things to different people: to a Chomsky GB'er it probably means
logical form; to a philosopher, perhaps a possible-worlds context;
etc.

Finally, in reference to your complaint, one of the problems of
studying natural language semantics in a linguistics department is that
most of the students have no mathematical background, so in most if not
all their courses on (model-theoretic) semantics (and I have sat in on
almost all of them here at Cornell) a lot of time is wasted on
mathematical trivialities (in one graduate seminar, one student asked
what a function was!), and so nothing of real substance can be adequately
analyzed.

----
John H. Dailey
Center for Applied Math.
Cornell U.
Ithaca, N.Y. 14853
dailey@CRNLCAM (Bitnet)
dailey@amvax.tn.cornell.edu (ARPANET)

------------------------------

Date: Mon, 28 Mar 88 00:00 EST
From: Ben Olasov <G.OLASOV@CS.COLUMBIA.EDU>
Subject: Thought without words

Consider the room in which you're now reading this message. Consider
the process by which it was designed. As an architect, experience
tells me that the exclusive role of language in architecture is
taxonomic and descriptive, and that buildings and interior spaces are
designed by essentially non-linguistic processes which involve the
generation and analysis of permutations of spatial and geometric
models. However, thinking about problems of geometry and spatial
relationships is surely not the exclusive domain of architects; it's a
thought process that I assume everyone must undergo at some point.
Architects use words to describe the characteristics of spaces that
they've designed, but this is an entirely distinct process that occurs
after the design process. I don't believe that any architect would
disagree on this point.

The end product of an architect's design has linguistic analogies, but
that is a discussion separate from a discussion about the process of
design.

One of our great American architects, Frank Lloyd Wright, even went so
far as to say that "The word kills art," indicating thereby that
words are the enemy of architecture (to the extent that it is an art).

My specific question is: is architectural design not thought? But
more generally, isn't the process by which one arrives at the solution
to a purely geometric problem essentially non-linguistic in nature?
For example, of what use would words have been to Pythagoras in
arriving at the Pythagorean theorem? The fact that words may not be
useful in this problem doesn't, in and of itself, indicate that there
is no linguistic role in the process, of course, but I think that the
original discussion had to do with whether there could be a kind of
thinking in which words played no significant role.


Benjamin Olasov

------------------------------

Date: Sat, 2 Apr 88 02:02 EST
From: Sarge Gerbode <sarge@thirdi.UUCP>
Subject: Re: language, thought, and culture

In article <2795@mmintl.UUCP> franka@mmintl.UUCP (Frank Adams) writes:
>In article <370@thirdi.UUCP> sarge@thirdi.UUCP (Sarge Gerbode) writes:
>>I can certainly identify things that I *sometimes* regard as part of myself
>>and imagine myself communicating to them. But at the time I am communicating
>>to them, they seem different from me, at least at that moment.

>Yes, that is a mode of experience I have too. But what I was talking about
>was experiencing things as being part of me at that time, yet separate from
>each other. Communicating with each other, too.

>I'm not sure what this proves, other than that we are both capable of
>experiencing things as our beliefs lead us to think is appropriate.

Yes. That certainly happens.

Let me explain a little more what I mean, and let's see if our experiences
don't, after all, coincide.

One of the miraculous qualities of a conscious being is the ability to "step
back" and look at the identity he formerly occupied. Therapists are always
asking people to step back. The client is acting like a child, feeling like a
child, *being* a child. The therapist says, "Now, I want you to look at
what's happening, here. You are playing the role of a child and [say] treating
me like a parent." If this intervention is successful, the client steps back
from his identity as a child and inspects that identity from his new vantage
point. And from this new viewpoint, he cannot still be being the child.
Viewing the identity of a child and being that identity are mutually exclusive
possibilities. He may also have other identities, e.g., parental identities
(to take a Transactional Analysis approach as our example). From a higher
vantage point (the adult?) the client can view both sub-identities and even
allow them to interact and study the interaction. But at the time he does so,
he is not *being* them.

The viewpoint I am talking about is a viewpoint and an identity assumed at a
certain moment. I'm not saying that one cannot be the things one is viewing
now, at a later -- or an earlier -- moment. Just that, during the time one is
viewing these things, one is not being them.

A great way to get a person to break free of a fixed identity is to get him to
look at that identity. If the person is being "Mr. Cool", the moment he
becomes aware of that fact, he is no longer being "Mr. Cool". He has
transcended (stepped back from) that identity, at least for that moment.
--
"The map may not be the territory, but it's all we've got."

Sarge Gerbode
Institute for Research in Metapsychology
950 Guinda St.
Palo Alto, CA 94301
UUCP: pyramid!thirdi!sarge

------------------------------

Date: Sat, 2 Apr 88 14:04 EST
From: Richard Caley <rjc%AIPNA.ED.AC.UK@CUNYVM.CUNY.EDU>
Subject: Communication & Information

> From: Sarge Gerbode <sarge@thirdi.UUCP>
>
> Communication requires the movement of a symbol or token of some
> type across a distance from sender to receiver, with an accurate
> receipt of the token by the receiver and an accurate understanding of
> the meaning of the token by the receiver. . . . "Talking to
> oneself", in present time, violates this idea of communication . . .
> no information is conveyed that was not already there, and so there
> is no net transfer of information.
>

I think it is probably limiting to identify Communication with information
transfer. To do so one must either assume that the other activities which
we would call communicative (e.g. questioning) are in some way secondary, or
re-define them in terms of information transfer (e.g. a question might be
the transfer of the information that the speaker wishes to know the answer).
Neither seems to me to be anything more than "drawing the curve then plotting
the points"; that is, forcing the data to fit a pre-existing theory.

We would not assume that any theory of the way the world is structured
must fit with the common human metaphor of the world being composed of
`objects' having `properties' etc. Similarly to restrict our theories
of language and Communication to those fitting "the conduit metaphor"
(thoughts are packaged in linguistic wrappings and sent down a pipe to
the audience who unwraps them and hence receives the thought) is to
miss the important fact that such models have developed over time to
help people talk and think about certain problems which may arise in
Communication; they are unlikely to bear much more resemblance to the
way people really use language than Peano's axioms bear to how people
add up grocery bills.


- RJC

rjc@uk.ac.ed.aipna
... !ukc!its63b!aipna!rjc

------------------------------

Date: Wed, 30 Mar 88 09:05 EST
From: Dori Wells <DWELLS@G.BBN.COM>
Subject: Language & Cognition Seminar

BBN Science Development Program
Language & Cognition Seminar Series

COMPILATION OF TWO-LEVEL PHONOLOGICAL RULES TO
FINITE-STATE TRANSDUCERS

Lauri Karttunen
Xerox PARC
and
Center for the Study of
Language and Information (CSLI)
Stanford University

BBN Laboratories Inc.
10 Moulton Street
Large Conference Room, 2nd Floor

10:30 a.m., Tuesday, April 12, 1988

Abstract: Recent advances in computational morphology are based on
the discovery [Johnson 1972, Kaplan and Kay 1980] that phonological
rewrite rules define regular relations. A regular relation is like a
regular set except that the elements are pairs consisting of a lexical
symbol and the corresponding surface symbol, so, for example, N:m (a
lexical N realized as surface m). This result has led to the
development of an efficient technique for recognition and generation
[Koskenniemi 1983, Karttunen 1983, Ritchie et al. 1985, Barton et al.
1987] in which the relation of lexical forms to surface forms is
constrained by finite-state transducers.
In this presentation, I will discuss some linguistic issues concerning the
two-level formalism proposed in Koskenniemi 1983 and the compilation of
two-level rules to finite-state transducers as described in Karttunen,
Koskenniemi, and Kaplan 1987. The main innovation in the compilation technique
is the automatic resolution of certain types of rule conflicts. For example,
the compiler implements the "
Elsewhere Principle" and gives a specific rule
priority over a more general one without invoking any notion of rule-ordering.
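As a minimal illustration of the kind of machine the talk is about (a hand-built toy of mine, not Karttunen's compiler or its formalism; the state numbering, alphabet, and rule are assumptions for illustration): the single two-level rule N:m <=> _ p (a lexical N is realized as surface m exactly when a p follows) becomes a three-state transducer that accepts or rejects aligned lexical/surface pairs.

```python
IDENTITY = "abkpt"   # toy alphabet of symbols realized as themselves

def build_delta():
    # delta[(state, lexical, surface)] = next state; a missing key rejects.
    # State 0: no pending obligation.
    # State 1: just emitted N:m, so a p MUST follow.
    # State 2: just emitted N:n, so a p must NOT follow.
    delta = {}
    for c in IDENTITY:
        delta[(0, c, c)] = 0
        if c == "p":
            delta[(1, c, c)] = 0
        else:
            delta[(2, c, c)] = 0
    delta[(0, "N", "m")] = 1
    delta[(0, "N", "n")] = 2
    return delta

def accepts(delta, lexical, surface):
    """True iff the aligned lexical/surface pair obeys the rule."""
    if len(lexical) != len(surface):
        return False
    state = 0
    for l, s in zip(lexical, surface):
        state = delta.get((state, l, s))
        if state is None:
            return False
    return state in (0, 2)   # state 1 still owes a p at word end
```

So, for instance, lexical "kaNpat" pairs with surface "kampat" but not "kanpat". A compiler's job is to produce such transducers automatically from rule notation, resolving conflicts between overlapping rules.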

References:

Barton, Edward G., Berwick, Robert, and Ristad, Sven Eric. Computational
Complexity and Natural Language. MIT Press. 1987.
Johnson, C. Douglas. Formal Aspects of Phonological Description.
Mouton. 1972.
Kaplan, Ronald M. and Kay, Martin. "Phonological Rules and Finite-State
Transducers," unpublished LSA paper. 1980.
Karttunen, Lauri. "KIMMO: A General Morphological Analyzer." Texas
Linguistic Forum 22. Department of Linguistics, University of
Texas, Austin, Texas. 1983.
Karttunen, Lauri, Koskenniemi, Kimmo, and Kaplan, Ronald M. "A Compiler
for Two-Level Phonological Rules." In Dalrymple, M. et al. Tools
for Morphological Analysis. Report CSLI-87-108. Center for the
Study of Language and Information. Stanford University. 1987.
Koskenniemi, Kimmo. Two-Level Morphology: A General Computational
Model for Word-Form Recognition and Production. Publication No. 11.
Department of General Linguistics. University of Helsinki. 1983.
Ritchie, G.D., Black, A.W., Pulman, S.G., and Russell, G.J. Dictionary
and Morphological Analyzer for English. Department of Artificial
Intelligence. University of Edinburgh. 1985.

------------------------------

Date: Wed, 30 Mar 88 14:27 EST
From: Anurag.Acharya@CENTRO.SOAR.CS.CMU.EDU
Subject: seminar - Dative Questions : Grammar & Processing (CMU)

Computational Linguistics Research Seminar

Dative Questions: Grammar and Processing

Howard Kurtzman
Department of Psychology
Cornell University

Thursday, April 7
2:30-4:00 pm

Scaife Hall 220
Carnegie Mellon University


Abstract

There is a longstanding debate concerning the status of indirect object
dative questions (IO-DQ's), such as "Who did you give the book?".
Virtually all speakers judge them to be deviant. However, it has been
unclear whether their deviance is due to genuine ungrammaticality or only
to processing difficulty in comprehension. Further, some evidence suggests
variation across individual speakers concerning the degree or type of
deviance.

To resolve these questions, a series of psycholinguistic studies was
performed, with both adult and child subjects. The results indicate
that speakers divide into three groups: (1) IO-DQ's are entirely
ungrammatical (about 80-90% of the overall population); (2) IO-DQ's are
grammatical for comprehension but ungrammatical for production (about
10-20% of the overall population); (3) IO-DQ's are grammatical for
both comprehension and production (limited to metropolitan New York
City speakers with a lower socioeconomic class background). For speakers
in groups (2) and (3), however, IO-DQ's do create comprehension
processing difficulty.

The linguistic and psychological models available for accounting for
these facts are discussed. It is tentatively concluded that a
"peripheral relaxation rule", overriding the restrictions of core grammar,
underlies the potential grammaticality of IO-DQ's. Although similar in
format to a perceptual strategy, such a rule can be distinguished from a
strategy and can be shown to provide a superior account.

Prof. Kurtzman will be available for appointments on Friday, April 8.
Contact Robin Clark at Robin.Clark@c.cs.cmu.edu or at x8567 if you
are interested in meeting with him.

------------------------------

Date: Wed, 30 Mar 88 21:04 EST
From: Emma Pease <emma@russell.stanford.edu>
Subject: From CSLI Calendar, March 31, 3:22

[Excerpted from CSLI Calendar]

THIS WEEK'S CSLI TINLUNCH
Reading: "
Learning at the Knowledge Level"
by Thomas G. Dietterich
Discussion led by Kurt Konolige
(konolige@bishop.ai.sri.com)
March 31

When Newell introduced the concept of the knowledge level as a useful
level of description for computer systems, he focused on the
representation of knowledge. This paper applies the knowledge level
notion to the problem of knowledge acquisition. Two interesting
issues arise. First, some existing machine learning programs appear
to be completely static when viewed at the knowledge level. These
programs improve their performance without changing their 'knowledge.'
Second, the behavior of some other machine learning programs cannot be
predicted or described at the knowledge level. These programs take
unjustified inductive leaps. The first programs are called symbol
level learning (SLL) programs; the second, nondeductive knowledge
level learning (NKLL) programs. The paper analyzes both of these
classes of learning programs and speculates on the possibility of
developing coherent theories of each. A theory of symbol level
learning is sketched, and some reasons are presented for believing
that a theory of NKLL will be difficult to obtain.

--------------
NEXT WEEK'S CSLI TINLUNCH
Reading: "
The Formal Semantics of Point of View"
by Jonathan E. Mitchell
PhD dissertation, Department of Linguistics, University of
Massachusetts, Amherst, 1986
Discussion led by Syun Tutiya
(tutiya@csli.stanford.edu)
April 7

Some sentences are ambiguous in an interesting way. When you tell
your friend standing across a table that the cat is in front of the
table, the cat could be either between you and the table or between
the table and her. You might be tempted to say the sentence you have
just used should be interpreted relative to the point of view.
Problems concerning the concept point of view are unlikely to be
covered by the conventional notions in terms of which indexical
expressions have been dealt with in the tradition of formal semantics,
since the point of view normally is not expressed as a constituent of
a sentence used. There are also some languages in which the concept
point of view plays such an important role that you might think any
selection of a lexical item refers to the point of view from which the
speaker is speaking. In Japanese, for example, it is said that you
have to use different words to describe the same transference of a
property depending on from which point of view you are speaking, the
donor's, the donee's, or yours. There are a lot more sentences in
English and a lot more languages which are relevant to the problem of
point of view, or perspectivity.

It is natural, therefore, that the concept of point of view deserves
linguists' attention. But once you try to come up with a formal treatment
of the concept which is consistent with linguistic intuition and
philosophical insight, you are bound to be involved in the discussion
of the formal semantics of belief sentences, of the nature of mental
states, and the belief de se. Mitchell seems to have decided to take
on the whole job and concludes, among other things, that "the notion
of self-ascription is central to the explanation of perspectivity in
language." This led him to the idea of representing, within situation
semantics, the interpretation of a sentence in a bifurcated formalism
by ascribing to the sentence both the external and the internal
contents. The external content of a sentence is almost the same as
the propositional content or proposition expressed of an utterance of
the sentence. Well, what is the internal content, then? This is the
very question I want to be answered in the discussion.

The paper is naturally very long, so I will compile some excerpts from
the dissertation to be picked up. Please be warned that my selection
of the parts to be read does not necessarily reflect the ultimate
claims of the dissertation.

------------------------------

Date: Mon, 4 Apr 88 20:40 EDT
From: finin@PRC.Unisys.COM
Subject: Unisys AI seminar

AI SEMINAR
UNISYS PAOLI RESEARCH CENTER


Providing Modules and Lexical Scoping in Logic Programming

Dale Miller
Computer and Information Science
University of Pennsylvania
Philadelphia, PA 19104

A first-order extension of Horn clauses, called first-order hereditary
Harrop formulas, possesses a meta theory which suggests that it would
make a suitable foundation for logic programming. Hereditary Harrop
formulas extend the syntax of Horn clauses by permitting
conjunctions, disjunctions, implications, and both existential and
universal quantifiers in queries and the bodies of program clauses.
A simple non-deterministic theorem prover for these formulas is known
to be complete with respect to intuitionistic logic. This theorem
prover can also be viewed as an interpreter. We shall outline how this
extended language provides the logic programming paradigm with a
natural notion of module and lexical scoping of constants.
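The scoping idea can be sketched in a few lines (a toy propositional interpreter of my own, far short of Miller's first-order system; the clause encoding and names are assumptions): proving an implication goal D => G extends the program with clause D only for the duration of the subproof of G, so the assumption behaves like a locally loaded module.

```python
# A program is a list of clauses (head, body); body None marks a fact.
# Goals: ("atom", a), ("and", g1, g2), ("implies", clause, goal).

def prove(program, goal):
    """Backward-chaining proof search (may loop on cyclic programs)."""
    tag = goal[0]
    if tag == "atom":
        _, a = goal
        return any(head == a and (body is None or prove(program, body))
                   for head, body in program)
    if tag == "and":
        return prove(program, goal[1]) and prove(program, goal[2])
    if tag == "implies":            # the local clause is visible only here
        _, clause, subgoal = goal
        return prove(program + [clause], subgoal)
    raise ValueError(f"unknown goal form: {tag}")
```

With program [("q", ("atom", "p"))], the bare query q fails, but the goal p => q succeeds, and afterwards p is still unprovable: the assumption never escapes its scope.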


2:00 pm Wednesday, April 6
Unisys Paoli Research Center
Route 252 and Central Ave.
Paoli PA 19311

-- non-Unisys visitors who are interested in attending should --
-- send email to finin@prc.unisys.com or call 215-648-7446 --

------------------------------

End of NL-KR Digest
*******************
