[Ligncse256] Fw: CRL talk 4/8: Roger Levy
Ben Cipollini
bcipolli at cogsci.ucsd.edu
Sun Apr 6 11:10:01 PDT 2008
Hey all,
Just an FYI for anybody who took Roger's class last quarter but isn't on this email list... an opportunity to squeeze one more lecture out of our intrepid leader :)
Ben
----- Original Message -----
From: Arielle Borovsky
To: talks at crl.ucsd.edu
Sent: Saturday, April 05, 2008 4:29 PM
Subject: CRL talk 4/8: Roger Levy
*********************************************
CRL Happy Half Hour @ 3:30 in CSB 215
CRL Talk by Roger Levy @ 4 in CSB 280
*********************************************
Roger Levy
Modeling uncertainty about the input in online sentence comprehension
Nearly every aspect of language processing is evidential---that is, it requires informed yet uncertain judgment on the part of the processor. To the extent that language processing is probabilistic, this means that a rational processing strategy could in principle attend to information from disparate sources (lexical, syntactic, discourse context, background world knowledge, visual environment) to optimize rapid belief formation---and there is evidence that information from many of these sources is indeed brought to bear in incremental sentence comprehension (e.g., MacDonald, 1993; Frazier & Rayner, 1982; Rohde et al., 2008; McRae et al., 2005; Tanenhaus et al., 1995). Nevertheless, nearly all formalized models of online sentence comprehension implicitly contain an important interface constraint that limits the use of cross-source information in belief formation: namely, the "input" to the sentence processor consists of a sequence of words, whereas a more natural representation would be something like the output of a word-recognition model---a probability distribution over word sequences. In this talk, I examine how online sentence comprehension might be formalized if this constraint is relaxed. I show how generative probabilistic grammars can be a unifying framework for representing both this type of uncertain input and the probabilistic grammatical information constituting a comprehender's knowledge of their own language. The outcome of the comprehension process is then simply the intersection of a probabilistic input with a probabilistic grammar. I then show how this model may shed light on two outstanding puzzles in the sentence comprehension literature: (i) data underlying the "good enough representation" approach of (F.) Ferreira et al. (2003), such as (1) below:
While Anna dressed the baby spit up in the bed.
where "the baby" is taken by many readers to be both the theme of "dressed" and the agent of "spit up", and (ii) the local-coherence effects of Tabor et al. (2004), in which sentences such as (2) below:
The coach smiled at the player tossed the frisbee.
elicit what are apparently classic garden-path effects despite the fact that global context seemingly should rule out the garden path before it is ever pursued.
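
For anyone who wants a concrete feel for the "intersect a probabilistic input with a probabilistic grammar" idea before the talk, here is a minimal sketch in Python. To be clear, this is a toy illustration, not Roger's model: the candidate strings, the priors standing in for a grammar, and the token-edit-distance noise model are all invented for exposition. It only shows the Bayesian bookkeeping: the comprehender's posterior over intended strings is proportional to a grammar-derived prior times the probability of the perceived input given each candidate.

# Toy noisy-channel comprehension sketch (illustrative numbers only):
#   P(intended string | perceived input)
#     is proportional to P(perceived input | intended string) * P(intended string)
# The prior stands in for the probabilistic grammar; the likelihood
# stands in for an uncertain word-recognition front end.

def edit_distance(a, b):
    """Token-level Levenshtein distance between two word lists."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

# Hypothetical grammar priors over intended strings near example (1).
candidates = {
    "while anna dressed , the baby spit up in the bed": 0.6,  # intransitive "dressed"
    "while anna dressed the baby it spit up in the bed": 0.4,  # transitive "dressed"
}

perceived = "while anna dressed the baby spit up in the bed"

def likelihood(perceived, candidate, per_edit=0.1):
    """Crude noise model: each token edit costs a constant factor."""
    return per_edit ** edit_distance(perceived.split(), candidate.split())

scores = {c: likelihood(perceived, c) * prior for c, prior in candidates.items()}
z = sum(scores.values())
for c, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{s / z:.2f}  {c}")

On these made-up numbers both intended strings retain substantial posterior mass (0.60 vs. 0.40), which is one way to gloss the finding that readers treat "the baby" as both the theme of "dressed" and the agent of "spit up".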
_______________________________________________
talks mailing list
talks at crl.ucsd.edu
http://crl.ucsd.edu/mailman/listinfo/talks
Subscription Form: http://crl.ucsd.edu/mailinglists/