Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 5

Descriptor
Cues: 6
Brain Hemisphere Functions: 5
Language Processing: 5
Acoustics: 3
Auditory Perception: 3
Semantics: 3
Visual Perception: 3
Auditory Stimuli: 2
Reading Processes: 2
Task Analysis: 2
Adults: 1

Source
Brain and Language: 6

Publication Type
Journal Articles: 6
Reports - Research: 5
Reports - Evaluative: 1

Megnin-Viggars, Odette; Goswami, Usha – Brain and Language, 2013
Visual speech inputs can enhance auditory speech information, particularly in noisy or degraded conditions. The natural statistics of audiovisual speech highlight the temporal correspondence between visual and auditory prosody, with lip, jaw, cheek and head movements conveying information about the speech envelope. Low-frequency spatial and…
Descriptors: Phonology, Cues, Visual Perception, Speech

Zekveld, Adriana A.; Rudner, Mary; Johnsrude, Ingrid S.; Heslenfeld, Dirk J.; Rönnberg, Jerker – Brain and Language, 2012
Text cues facilitate the perception of spoken sentences to which they are semantically related (Zekveld, Rudner, et al., 2011). In this study, semantically related and unrelated cues preceding sentences evoked more activation in middle temporal gyrus (MTG) and inferior frontal gyrus (IFG) than nonword cues, regardless of acoustic quality (speech…
Descriptors: Evidence, Sentences, Cues, Cued Speech

Right Visual Field Advantage in Parafoveal Processing: Evidence from Eye-Fixation-Related Potentials
Simola, Jaana; Holmqvist, Kenneth; Lindgren, Magnus – Brain and Language, 2009
Readers acquire information outside the current eye fixation. Previous research indicates that having only the fixated word available slows reading, but when the next word is visible, reading is almost as fast as when the whole line is seen. Parafoveal-on-foveal effects are interpreted to reflect that the characteristics of a parafoveal word can…
Descriptors: Semantics, Eye Movements, Visual Perception, Language Processing

Halliday, L. F.; Bishop, D. V. M. – Brain and Language, 2006
Specific reading disability (SRD) is now widely recognised as often being caused by phonological processing problems, affecting analysis of spoken as well as written language. According to one theoretical account, these phonological problems are due to low-level problems in auditory perception of dynamic acoustic cues. Evidence for this has come…
Descriptors: Reading Difficulties, Hearing Impairments, Auditory Perception, Cues

Francis, Alexander L.; Driscoll, Courtney – Brain and Language, 2006
We examined the effect of perceptual training on a well-established hemispheric asymmetry in speech processing. Eighteen listeners were trained to use a within-category difference in voice onset time (VOT) to cue talker identity. Successful learners (n = 8) showed faster response times for stimuli presented only to the left ear than for those…
Descriptors: Auditory Perception, Time, Cues, Auditory Training

Monaghan, Padraic; Shillcock, Richard; McDonald, Scott – Brain and Language, 2004
We report a series of neural network models of semantic processing of single English words in the left and the right hemispheres of the brain. We implement the foveal splitting of the visual field and assess the influence of this splitting on a mapping from orthography to semantic representations in single word reading. The models were trained on…
Descriptors: Models, Semantics, English, Brain Hemisphere Functions