Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 88
Descriptor
Brain Hemisphere Functions: 66
Auditory Perception: 57
Language Processing: 34
Visual Perception: 31
Diagnostic Tests: 30
Cognitive Processes: 26
Phonology: 18
Acoustics: 16
Brain: 16
Speech: 16
Task Analysis: 16
Source
Brain and Language: 88
Author
Ellis, Andrew W.: 4
Boets, Bart: 3
Ghesquiere, Pol: 3
Wouters, Jan: 3
van Wieringen, Astrid: 3
Ansorge, Lydia: 2
Barca, Laura: 2
Booth, James R.: 2
Burman, Douglas D.: 2
Fraga, Isabel: 2
Gandour, Jackson T.: 2
Publication Type
Journal Articles: 88
Reports - Research: 71
Reports - Evaluative: 13
Reports - Descriptive: 2
Information Analyses: 1
Opinion Papers: 1
Education Level
Preschool Education: 2
Elementary Education: 1
Grade 2: 1
Kindergarten: 1
Location
France: 1
Netherlands: 1
Kryuchkova, Tatiana; Tucker, Benjamin V.; Wurm, Lee H.; Baayen, R. Harald – Brain and Language, 2012
Visual emotionally charged stimuli have been shown to elicit early electrophysiological responses (e.g., Ihssen, Heim, & Keil, 2007; Schupp, Junghofer, Weike, & Hamm, 2003; Stolarova, Keil, & Moratti, 2006). We presented isolated words to listeners, and observed, using generalized additive modeling, oscillations in the upper part of the delta…
Descriptors: Evidence, Visual Perception, Language Processing, Auditory Perception
Prendergast, Garreth; Green, Gary G. R. – Brain and Language, 2012
Classical views of speech perception argue that the static and dynamic characteristics of spectral energy peaks (formants) are the acoustic features that underpin phoneme recognition. Here we use representations where the amplitude modulations of sub-band filtered speech are described, precisely, in terms of co-sinusoidal pulses. These pulses are…
Descriptors: Auditory Perception, Acoustics, Comprehension, Artificial Speech
Partanen, Marita; Fitzpatrick, Kevin; Madler, Burkhard; Edgell, Dorothy; Bjornson, Bruce; Giaschi, Deborah E. – Brain and Language, 2012
The current study examined auditory processing deficits in dyslexia using a dichotic pitch stimulus and functional MRI. Cortical activation by the dichotic pitch task occurred in bilateral Heschl's gyri, right planum temporale, and right superior temporal sulcus. Adolescents with dyslexia, relative to age-matched controls, showed greater…
Descriptors: Dyslexia, Auditory Perception, Acoustics, Adolescents
Hirschfeld, Gerrit; Zwitserlood, Pienie; Dobel, Christian – Brain and Language, 2011
We investigated whether and when information conveyed by spoken language impacts on the processing of visually presented objects. In contrast to traditional views, grounded-cognition posits direct links between language comprehension and perceptual processing. We used a magnetoencephalographic cross-modal priming paradigm to disentangle these…
Descriptors: Comprehension, Sentences, Speech, Semantics
Biau, Emmanuel; Soto-Faraco, Salvador – Brain and Language, 2013
Spontaneous beat gestures are an integral part of the paralinguistic context during face-to-face conversations. Here we investigated the time course of beat-speech integration in speech perception by measuring ERPs evoked by words pronounced with or without an accompanying beat gesture, while participants watched a spoken discourse. Words…
Descriptors: Auditory Perception, Interpersonal Communication, Diagnostic Tests, Brain Hemisphere Functions
Van der Haegen, Lise; Brysbaert, Marc – Brain and Language, 2011
Words are processed as units. This is not as evident as it seems, given the division of the human cerebral cortex in two hemispheres and the partial decussation of the optic tract. In two experiments, we investigated what underlies the unity of foveally presented words: A bilateral projection of visual input in foveal vision, or interhemispheric…
Descriptors: Inhibition, Visual Perception, Word Recognition, Experiments
Tierney, Adam T.; Kraus, Nina – Brain and Language, 2013
Reading-impaired children have difficulty tapping to a beat. Here we tested whether this relationship between reading ability and synchronized tapping holds in typically-developing adolescents. We also hypothesized that tapping relates to two other abilities. First, since auditory-motor synchronization requires monitoring of the relationship…
Descriptors: Executive Function, Auditory Perception, Reading Ability, Correlation
Yoncheva, Yuliya N.; Maurer, Urs; Zevin, Jason D.; McCandliss, Bruce D. – Brain and Language, 2013
ERP responses to spoken words are sensitive to both rhyming effects and effects of associated spelling patterns. Are such effects automatically elicited by spoken words or dependent on selectively attending to phonology? To address this question, ERP responses to spoken word pairs were investigated under two equally demanding listening tasks that…
Descriptors: Spelling, Attention, Phonology, Word Recognition
Murakami, Takenobu; Restle, Julia; Ziemann, Ulf – Brain and Language, 2012
A left-hemispheric cortico-cortical network involving areas of the temporoparietal junction (Tpj) and the posterior inferior frontal gyrus (pIFG) is thought to support sensorimotor integration of speech perception into articulatory motor activation, but how this network links with the lip area of the primary motor cortex (M1) during speech…
Descriptors: Brain Hemisphere Functions, Auditory Perception, Lateral Dominance, Sensory Integration
Osnes, Berge; Hugdahl, Kenneth; Hjelmervik, Helene; Specht, Karsten – Brain and Language, 2012
In studies on auditory speech perception, participants are often asked to perform active tasks, e.g. decide whether the perceived sound is a speech sound or not. However, information about the stimulus, inherent in such tasks, may induce expectations that cause altered activations not only in the auditory cortex, but also in frontal areas such as…
Descriptors: Music, Auditory Perception, Speech Communication, Brain
Hertrich, Ingo; Dietrich, Susanne; Ackermann, Hermann – Brain and Language, 2013
Blind people can learn to understand speech at ultra-high syllable rates (ca. 20 syllables/s), a capability associated with hemodynamic activation of the central-visual system. To further elucidate the neural mechanisms underlying this skill, magnetoencephalographic (MEG) measurements during listening to sentence utterances were cross-correlated…
Descriptors: Syllables, Oral Language, Blindness, Language Processing
Burman, Douglas D.; Minas, Taylor; Bolger, Donald J.; Booth, James R. – Brain and Language, 2013
Previous studies have shown that the "strength" of connectivity between regions can vary depending upon the cognitive demands of a task. In this study, the "location" of task-dependent connectivity from the primary visual cortex (V1) was examined in 43 children (ages 9-15) performing visual tasks; connectivity maxima were identified for a visual…
Descriptors: Verbal Ability, Children, Age Differences, Gender Differences
Megnin-Viggars, Odette; Goswami, Usha – Brain and Language, 2013
Visual speech inputs can enhance auditory speech information, particularly in noisy or degraded conditions. The natural statistics of audiovisual speech highlight the temporal correspondence between visual and auditory prosody, with lip, jaw, cheek and head movements conveying information about the speech envelope. Low-frequency spatial and…
Descriptors: Phonology, Cues, Visual Perception, Speech
Wagner, Monica; Shafer, Valerie L.; Martin, Brett; Steinschneider, Mitchell – Brain and Language, 2012
The effect of exposure to the contextual features of the /pt/ cluster was investigated in native-English and native-Polish listeners using behavioral and event-related potential (ERP) methodology. Both groups experience the /pt/ cluster in their languages, but only the Polish group experiences the cluster in the context of word onset examined in…
Descriptors: Evidence, Phonology, Polish, Phonemes
Brunelliere, Angele; Soto-Faraco, Salvador – Brain and Language, 2013
This study investigates the specificity of predictive coding in spoken word comprehension using event-related potentials (ERPs). We measured word-evoked ERPs in Catalan speakers listening to semantically constraining sentences produced in their native regional accent (Experiment 1) or in a non-native accent (Experiment 2). Semantically anomalous…
Descriptors: Semantics, Word Recognition, Auditory Perception, Sentences