Showing all 3 results
Peer reviewed
Megnin-Viggars, Odette; Goswami, Usha – Brain and Language, 2013
Visual speech inputs can enhance auditory speech information, particularly in noisy or degraded conditions. The natural statistics of audiovisual speech highlight the temporal correspondence between visual and auditory prosody, with lip, jaw, cheek and head movements conveying information about the speech envelope. Low-frequency spatial and…
Descriptors: Phonology, Cues, Visual Perception, Speech
Peer reviewed
Simola, Jaana; Holmqvist, Kenneth; Lindgren, Magnus – Brain and Language, 2009
Readers acquire information outside the current eye fixation. Previous research indicates that having only the fixated word available slows reading, but when the next word is visible, reading is almost as fast as when the whole line is seen. Parafoveal-on-foveal effects are interpreted to reflect that the characteristics of a parafoveal word can…
Descriptors: Semantics, Eye Movements, Visual Perception, Language Processing
Peer reviewed
Monaghan, Padraic; Shillcock, Richard; McDonald, Scott – Brain and Language, 2004
We report a series of neural network models of semantic processing of single English words in the left and the right hemispheres of the brain. We implement the foveal splitting of the visual field and assess the influence of this splitting on a mapping from orthography to semantic representations in single word reading. The models were trained on…
Descriptors: Models, Semantics, English, Brain Hemisphere Functions