Two articles on language acquisition and grammar
One of the key characteristics of Lojban is that it can be described by
an unambiguous grammar which in fact is LALR(1). (Ignoring minor
complications at the lex level, which are not where the theoretical
interest lies.) In two recent articles the authors call into question
the relevance of grammar as we know it for language processing by humans,
and one of them gives a tantalizing introduction to an alternative
formulation of grammar as a problem in optimizing incompatible
constraints.
Seidenberg, Mark S. Language Acquisition and Use: Learning and Applying
Probabilistic Constraints. Science, vol. 275, p. 1599, 1997-03-14.
jimc's summary: In the traditional view one knows a language if one
knows its grammar, which is a deterministic set of rules for what
sentences are allowed in the language. (Semantics is a separate issue
which is recognized but not addressed in the Chomskyan model.) Natlang
grammar is such a mess that young humans couldn't learn it by
listening to adult examples. Therefore (so says Chomsky) many features
of grammar must be innate.
However, recent work suggests that probabilistic constraints are
important, particularly in judging the meaning of an ambiguous phrase.
In particular, a child can tell whether a sentence (such as the one he
or she is about to speak) is ungrammatical by recognizing that similar or related
patterns are rare. Whatever may or may not be innate, there is more
than enough input for a child to learn the statistical patterns of the
language.
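To make the statistical idea concrete, here is a rough sketch in Python
(my own toy illustration, not Seidenberg's model): the learner counts
adjacent word pairs in the speech it hears and treats a candidate
sentence as suspect when it contains pairs it has rarely or never heard.

    from collections import Counter

    # Toy "adult input": the utterances the learner has been exposed to.
    heard = [
        "the dog chased the cat",
        "the cat saw the dog",
        "the dog saw the cat",
        "a dog chased a cat",
    ]

    # Learn simple statistics: how often each adjacent word pair occurs.
    bigrams = Counter()
    for sentence in heard:
        words = sentence.split()
        bigrams.update(zip(words, words[1:]))

    def rare_patterns(sentence, threshold=1):
        """Return the adjacent word pairs seen fewer than `threshold`
        times in the input -- a crude stand-in for the child's sense
        that 'people just don't say it that way'."""
        words = sentence.split()
        return [pair for pair in zip(words, words[1:])
                if bigrams[pair] < threshold]

    print(rare_patterns("the dog chased the cat"))  # [] -- familiar pairs
    print(rare_patterns("dog the chased cat the"))  # several unheard pairs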
Prince, Alan and Paul Smolensky. Optimality: From Neural Networks to
Universal Grammar. Science, vol. 275, p. 1604, 1997-03-14.
jimc's summary: The linguistic theory of "harmony" holds that, rather
than generative rules, the kind of grammar important to speakers
consists of a set of incompatible rules (for example, that the subject
should come first, and also that the shortest argument should come
first). The sentences actually produced are the ones most in
harmony with the rules. The authors describe how a computerized
neural net, which is believed to be an idealized model of biological
neural operation, can realize a harmony grammar, while realizing a
Chomsky-type grammar in wetware is hard to imagine.
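A toy way to see "most in harmony" in code (my own sketch; in the
article the scoring is done by a neural net, not an explicit search):
assign each candidate ordering a weighted penalty for every rule it
violates and keep the candidate with the highest harmony. The two rules
and their weights below are invented for illustration.

    # Two illustrative, mutually incompatible rules (weights invented):
    #   subject_first  -- penalize orderings that do not start with the subject
    #   shortest_first -- penalize orderings that do not start with the
    #                     shortest argument
    def subject_first(order, lengths):
        return 0 if order[0] == "subject" else 1

    def shortest_first(order, lengths):
        shortest = min(order, key=lambda arg: lengths[arg])
        return 0 if order[0] == shortest else 1

    WEIGHTED_RULES = [(subject_first, 2.0), (shortest_first, 1.0)]

    def harmony(order, lengths):
        # Higher harmony means fewer (weighted) rule violations.
        return -sum(weight * rule(order, lengths)
                    for rule, weight in WEIGHTED_RULES)

    # Candidate linearizations of the same message, where the object
    # happens to be much shorter (1 word) than the subject (7 words).
    lengths = {"subject": 7, "object": 1}
    candidates = [("subject", "object"), ("object", "subject")]

    best = max(candidates, key=lambda order: harmony(order, lengths))
    print(best)   # ('subject', 'object') -- under these invented weights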
In addition, it appears that only a particular subset of harmony
grammars is used by humans, and that subset therefore must be innately
configured. Specifically, the rules form a strict hierarchy: if a
stronger rule is satisfied, then weaker rules determine the result
(which version of a sentence is most in harmony), whereas if the
stronger rule is violated, no amount of credit from satisfying weaker
rules can make up for it. But the importance of particular rules such
as word order varies from one language to the next; for example, word
order is strong in English, but nearly irrelevant in Latin.
More surprisingly, the rules themselves are said to be universal among
languages, though their importance varies. For example, a prejudice
against final consonants is found in all languages, although the
prejudice is barely noticeable in English while it is absolute in
Chinese. The authors give several examples of rules, but far from a
complete list.
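The strict hierarchy amounts to comparing violation counts
lexicographically: no amount of credit on a weaker rule can rescue a
candidate that does worse on a stronger one. Here is a rough sketch (my
own illustration, with invented rules): a ban on syllable-final
consonants and a "faithfulness" rule against dropping sounds, where
re-ordering the ranking stands in for moving between languages.

    def violations(candidate, ranking):
        # Violation profile in ranking order (strongest rule first).
        return tuple(rule(candidate) for rule in ranking)

    def most_harmonic(candidates, ranking):
        """Strict domination: compare violation profiles lexicographically,
        so weaker rules only break ties left by stronger ones."""
        return min(candidates, key=lambda cand: violations(cand, ranking))

    # Candidates are syllabified strings; the intended form is "tak.ta".
    def no_final_consonant(cand):
        # Penalize every syllable that ends in a consonant.
        return sum(1 for syll in cand.split(".") if syll[-1] not in "aeiou")

    def be_faithful(cand, target="takta"):
        # Penalize every sound added or dropped relative to the intended form.
        return abs(len(cand.replace(".", "")) - len(target))

    candidates = ["tak.ta", "ta.ta"]

    # Ranking A: faithfulness dominates the coda ban -- the consonant survives.
    print(most_harmonic(candidates, [be_faithful, no_final_consonant]))  # tak.ta
    # Ranking B: the coda ban dominates -- the consonant is sacrificed.
    print(most_harmonic(candidates, [no_final_consonant, be_faithful]))  # ta.ta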
This theory of optimizing grammar can explain the curious fact that
children understand (valid) speech that is much more complex than the
sentences they can generate. The theory unifies the procedures of
generating and understanding language, in that for generation the "deep
structure" to be represented is keyed into the neural net and the
"surface structure" floats to the most harmonious value, whereas in
understanding, the surface structure is keyed in and the deep structure
floats.
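As a final sketch (again my own toy, with a hand-written score standing
in for the article's neural net), the same harmony function can run in
both directions: clamp the deep structure and let the surface form
float to generate, or clamp the surface form and let the deep structure
float to understand.

    # Invented toy inventory of "deep structures" (meanings) and "surface
    # structures" (word strings).  The harmony score is hand-written here;
    # in the article it would emerge from the weights of a neural net.
    deep_structures = ["CHASE(dog, cat)", "CHASE(cat, dog)"]
    surfaces = [
        "the dog chased the cat",
        "the cat chased the dog",
        "the cat was chased by the dog",
    ]

    def harmony(deep, surface):
        score = 0.0
        # Reward pairings where the surface wording matches who chased whom.
        if deep == "CHASE(dog, cat)" and surface in (
                "the dog chased the cat", "the cat was chased by the dog"):
            score += 2.0
        if deep == "CHASE(cat, dog)" and surface == "the cat chased the dog":
            score += 2.0
        # Mild preference for the active word order.
        if "was chased" not in surface:
            score += 0.5
        return score

    def generate(deep):
        # Clamp the deep structure; the surface floats to maximum harmony.
        return max(surfaces, key=lambda s: harmony(deep, s))

    def understand(surface):
        # Clamp the surface; the deep structure floats.
        return max(deep_structures, key=lambda d: harmony(d, surface))

    print(generate("CHASE(dog, cat)"))                   # the dog chased the cat
    print(understand("the cat was chased by the dog"))   # CHASE(dog, cat)

Note that the passive sentence is never produced by generate() here,
yet it is understood correctly, loosely echoing the
comprehension/production asymmetry mentioned above.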
James F. Carter Voice 310 825 2897 FAX 310 206 6673
UCLA-Mathnet; 6115 MSA; 405 Hilgard Ave.; Los Angeles, CA, USA 90095-1555
Internet: jimc@math.ucla.edu (finger for PGP key)
UUCP:...!{ucsd,ames,ncar,gatech,purdue,rutgers,decvax,uunet}!math.ucla.edu!jimc