Lexical Interference Effects in Sentence Processing: Evidence from the Visual World Paradigm and Self-Organizing Models
- Authors
Kukona, Anuenue; Cho, Pyeong Whan; Magnuson, James S.; Tabor, Whitney
- Abstract
Psycholinguistic research spanning a number of decades has produced diverging results with regard to the nature of constraint integration in online sentence processing. For example, evidence that language users anticipatorily fixate likely upcoming referents in advance of evidence in the speech signal supports rapid context integration. By contrast, evidence that language users activate representations that conflict with contextual constraints, or only indirectly satisfy them, supports nonintegration or late integration. Here we report on a self-organizing neural network framework that addresses 1 aspect of constraint integration: the integration of incoming lexical information (i.e., an incoming word) with sentence context information (i.e., from preceding words in an unfolding utterance). In 2 simulations, we show that the framework predicts both classic results concerned with lexical ambiguity resolution (Swinney, 1979; Tanenhaus, Leiman, & Seidenberg, 1979), which suggest late context integration, and results demonstrating anticipatory eye movements (e.g., Altmann & Kamide, 1999), which support rapid context integration. We also report 2 experiments using the visual world paradigm that confirm a new prediction of the framework. Listeners heard sentences like "The boy will eat the white …" while viewing visual displays with objects like a "white cake" (i.e., a predictable direct object of "eat"), "white car" (i.e., an object not predicted by "eat," but consistent with "white"), and distractors. In line with our simulation predictions, we found that while listeners fixated "white cake" most, they also fixated "white car" more than unrelated distractors in this highly constraining sentence (and visual) context.
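To make the qualitative pattern described above concrete, here is a minimal illustrative sketch; it is not the authors' self-organizing network, and the candidate names, bias values, and parameters (`context_gain`, `steps`) are hypothetical. It assumes a simple Luce-normalized competition among lexical candidates in which bottom-up support from the current word ("white") arrives immediately while the sentence-context constraint ("eat") is integrated gradually.

```python
# Minimal illustrative sketch (NOT the authors' model): Luce-normalized
# competition among lexical candidates, with a sentence-context bias that
# is blended in gradually over processing time.
import numpy as np

def settle(bottom_up, context_bias, context_gain=0.15, steps=50):
    """Blend bottom-up (lexical) support with a slowly integrated
    context bias, normalizing candidate activations at each step."""
    act = np.array(bottom_up, dtype=float)
    history = [act / act.sum()]
    for t in range(steps):
        # Context influence ramps up over time (gradual integration).
        weight = context_gain * (t + 1) / steps
        act = act * (1.0 + weight * np.array(context_bias))
        act = act / act.sum()  # Luce-style normalization
        history.append(act.copy())
    return np.array(history)

# Hypothetical candidates for "The boy will eat the white ...":
# 0 = "white cake" (context-predicted), 1 = "white car" (matches
# "white" only), 2 = unrelated distractor.
bottom_up = [1.0, 1.0, 0.2]    # "white" supports cake and car equally
context_bias = [1.0, 0.0, 0.0] # "eat" favors cake only
traj = settle(bottom_up, context_bias)
print(traj[1])   # early: cake and car roughly tied, both above distractor
print(traj[-1])  # late: cake dominates, car still above distractor
```

Under these assumed settings the sketch reproduces the paper's qualitative pattern: the context-predicted candidate ("white cake") comes to dominate, yet the merely form-consistent candidate ("white car") remains more active than the unrelated distractor even in a constraining context.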
- Published
2014