<OT> New Posting: ROA-677

roa at ruccs.rutgers.edu
Fri Aug 20 10:26:58 PDT 2004


ROA 677-0804

Gradience in Grammar: Experimental and Computational Aspects of Degrees of Grammaticality

Frank Keller <keller at inf.ed.ac.uk>

Direct link: http://roa.rutgers.edu/view.php3?roa=677


Abstract:
This thesis deals with gradience in grammar, i.e., with
the fact that some linguistic structures are neither fully
acceptable nor fully unacceptable, but instead receive
gradient acceptability judgments.
The importance of gradient data for linguistic theory has
been recognized at least since Chomsky's Logical Structure
of Linguistic Theory. However, systematic empirical studies
of gradience are largely absent, and none of the major
theoretical frameworks is designed to account for gradient data.

The present thesis addresses both issues. In the experimental
part of the thesis (Chapters 3-5), we present a set of magnitude
estimation experiments investigating gradience in grammar.
The experiments deal with unaccusativity/unergativity,
extraction, binding, word order, and gapping. They cover all major
modules of syntactic theory, and draw on data from three
languages (English, German, and Greek). In the theoretical
part of the thesis (Chapters 6 and 7), we use these experimental
results to motivate a model of gradience in grammar. This
model is a variant of Optimality Theory, and explains gradience
in terms of the competition of ranked, violable linguistic
constraints.

The experimental studies in this thesis deliver two main
results. First, they demonstrate that an experimental
investigation of gradient phenomena can advance linguistic theory by
uncovering acceptability distinctions that have gone unnoticed
in the theoretical literature. An experimental approach
can also settle data disputes that result from the informal
data collection techniques typically employed in theoretical
linguistics, which are not well suited to investigating the
behavior of gradient linguistic data.

Second, we identify a set of general properties of gradient
data that seem to be valid for a wide range of syntactic
phenomena and across languages. (a) Linguistic constraints
are ranked, in the sense that some constraint violations
lead to a greater degree of unacceptability than others.
(b) Constraint violations are cumulative, i.e., the degree
of unacceptability of a structure increases with the number
of constraints it violates. (c) Two constraint types can
be distinguished experimentally: soft constraints lead to
mild unacceptability when violated, while hard constraint
violations trigger serious unacceptability. (d) The hard/soft
distinction can be diagnosed by testing for effects of the
linguistic context: context effects occur only for soft
constraints, while hard constraints are immune to contextual variation.
(e) The soft/hard distinction is crosslinguistically stable.
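
Read together, properties (a)-(c) amount to an additive picture
of unacceptability. The Python sketch below illustrates that
picture; the constraint costs and violation counts are purely
hypothetical and do not come from the thesis.

# Hypothetical illustration of properties (a)-(c): constraints are
# ranked (hard violations cost more than soft ones), and violation
# costs add up (cumulativity). The cost values are invented.

SOFT_COST = 0.3   # mild degradation per soft-constraint violation
HARD_COST = 1.5   # serious degradation per hard-constraint violation

def unacceptability(soft_violations, hard_violations):
    """Degree of unacceptability as the cumulative sum of violation costs."""
    return soft_violations * SOFT_COST + hard_violations * HARD_COST

print(unacceptability(1, 0))  # 0.3 -- one soft violation: mildly degraded
print(unacceptability(2, 0))  # 0.6 -- violations are cumulative (property b)
print(unacceptability(0, 1))  # 1.5 -- a hard violation outranks soft ones (a, c)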

In the theoretical part of the thesis, we develop a model
of gradient grammaticality that borrows central concepts
from Optimality Theory, a competition-based grammatical
framework. We propose an extension, Linear Optimality Theory,
motivated by our experimental results on constraint ranking
and the cumulativity of violations. The core assumption
of our model is that the relative grammaticality of a structure
is determined by the weighted sum of the violations it incurs.
We show that the parameters of the model (the constraint
weights) can be estimated using the least squares method,
a standard model-fitting technique. Furthermore, we prove
that standard Optimality Theory is a special case of Linear
Optimality Theory.
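
To make the core assumption concrete, the following Python
sketch estimates constraint weights from acceptability scores by
ordinary least squares and computes relative grammaticality as
the weighted sum of violations. The violation profiles and scores
are hypothetical, and the estimation procedure used in the thesis
may differ in detail.

import numpy as np

# Hypothetical violation profiles: rows are candidate structures,
# columns are constraints, entries count violations of each constraint.
violations = np.array([
    [0.0, 0.0, 1.0],   # one violation of C3
    [1.0, 0.0, 1.0],   # violations of C1 and C3
    [0.0, 2.0, 0.0],   # two violations of C2
    [1.0, 1.0, 0.0],   # violations of C1 and C2
])

# Hypothetical acceptability scores for the same candidates
# (e.g. normalized judgments); higher means more acceptable.
scores = np.array([-0.2, -0.9, -0.7, -0.8])

# Estimate constraint weights by least squares, i.e. find w
# minimizing ||violations @ w - scores||^2.
weights, residuals, rank, _ = np.linalg.lstsq(violations, scores, rcond=None)

# Relative grammaticality of each structure: the weighted sum of
# the violations it incurs.
predicted = violations @ weights

print("estimated weights:", weights)
print("predicted scores: ", predicted)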

To test the validity of Linear Optimality Theory, we use
it to model data from the experimental part of the thesis,
including data on extraction, gapping, and word order. For
all data sets, a high model fit is obtained and it is
demonstrated that the model's predictions generalize to unseen data.
On a theoretical level, our modeling results show that certain
properties of gradient data (the hard/soft distinction,
context effects, and crosslinguistic effects) do not have
to be stipulated, but follow from core assumptions of Linear
Optimality Theory.

Comments: PhD Thesis, University of Edinburgh, 2000.
Keywords: gradience, degrees of grammaticality, magnitude estimation, cumulativity, ganging-up effects, harmonic bounding
Areas: Syntax, Computation, Formal Analysis, Learnability, Psycholinguistics
Type: PhD Dissertation

Direct link: http://roa.rutgers.edu/view.php3?roa=677


