Showing all 14 results
Peer reviewed
Murakami, Takenobu; Restle, Julia; Ziemann, Ulf – Brain and Language, 2012
A left-hemispheric cortico-cortical network involving areas of the temporoparietal junction (TPJ) and the posterior inferior frontal gyrus (pIFG) is thought to support sensorimotor integration of speech perception into articulatory motor activation, but how this network links with the lip area of the primary motor cortex (M1) during speech…
Descriptors: Brain Hemisphere Functions, Auditory Perception, Lateral Dominance, Sensory Integration
Peer reviewed
Hertrich, Ingo; Dietrich, Susanne; Ackermann, Hermann – Brain and Language, 2013
Blind people can learn to understand speech at ultra-high syllable rates (ca. 20 syllables/s), a capability associated with hemodynamic activation of the central-visual system. To further elucidate the neural mechanisms underlying this skill, magnetoencephalographic (MEG) measurements during listening to sentence utterances were cross-correlated…
Descriptors: Syllables, Oral Language, Blindness, Language Processing
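As a loose illustration of the cross-correlation approach mentioned in the abstract above (not the authors' actual MEG pipeline), the Python sketch below correlates a synthetic speech envelope with a simulated sensor trace; the sampling rate, delay, and signal names are assumptions chosen for the example.

```python
# Illustrative sketch only: cross-correlating a (synthetic) speech envelope with a
# (synthetic) sensor signal. Sampling rate, signals, and lag range are assumptions
# for demonstration; this is not the authors' analysis pipeline.
import numpy as np

rng = np.random.default_rng(0)
fs = 250                      # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)  # 10 s of data

# Synthetic speech envelope and a "sensor" that follows it with a 40 ms delay plus noise.
envelope = np.abs(np.sin(2 * np.pi * 4 * t)) + 0.1 * rng.standard_normal(t.size)
delay_samples = int(0.040 * fs)
sensor = np.roll(envelope, delay_samples) + 0.5 * rng.standard_normal(t.size)

# Normalized cross-correlation over lags from -200 ms to +200 ms.
max_lag = int(0.2 * fs)
env_z = (envelope - envelope.mean()) / envelope.std()
sen_z = (sensor - sensor.mean()) / sensor.std()
lags = np.arange(-max_lag, max_lag + 1)
xcorr = np.array([np.mean(env_z * np.roll(sen_z, -lag)) for lag in lags])

best_lag_ms = 1000 * lags[np.argmax(xcorr)] / fs
print(f"Peak cross-correlation at lag {best_lag_ms:.0f} ms")  # should recover ~+40 ms
```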
Peer reviewed
Tomaschek, Fabian; Truckenbrodt, Hubert; Hertrich, Ingo – Brain and Language, 2013
Recent experiments showed that the perception of vowel length by German listeners exhibits the characteristics of categorical perception. The present study sought to find the neural activity reflecting categorical vowel length and the short-long boundary by examining the processing of non-contrastive durations and categorical length using MEG.…
Descriptors: Language Processing, Brain Hemisphere Functions, Auditory Perception, Syllables
Peer reviewed
Vukovic, Mile; Sujic, Radmila; Petrovic-Lazic, Mirjana; Miller, Nick; Milutinovic, Dejan; Babac, Snezana; Vukovic, Irena – Brain and Language, 2012
Phonation is a fundamental feature of human communication. Control of phonation in the context of speech-language disturbances has traditionally been considered a characteristic of lesions to subcortical structures and pathways. Evidence suggests, however, that cortical lesions may also affect phonation. We carried out acoustic and perceptual…
Descriptors: Evidence, Articulation (Speech), Aphasia, Neurological Impairments
Peer reviewed
Sjerps, Matthias J.; Mitterer, Holger; McQueen, James M. – Brain and Language, 2012
Listeners perceive speech sounds relative to context. Contextual influences might differ over hemispheres if different types of auditory processing are lateralized. Hemispheric differences in contextual influences on vowel perception were investigated by presenting speech targets and both speech and non-speech contexts to listeners' right or left…
Descriptors: Vowels, Brain Hemisphere Functions, Auditory Discrimination, Lateral Dominance
Peer reviewed
Konishi, Masakazu – Brain and Language, 2010
Central nervous networks, be they a part of the human brain or a group of neurons in a snail, may be designed to produce distinct patterns of movement. Central pattern generators can account for the development and production of normal vocal signals without auditory feedback in non-songbirds. Songbirds need auditory feedback to develop and…
Descriptors: Animals, Auditory Perception, Feedback (Response), Acoustics
Peer reviewed
Swink, Shannon; Stuart, Andrew – Brain and Language, 2012
The effect of gender on the N1-P2 auditory complex was examined during listening and speaking with altered auditory feedback. Fifteen normal-hearing adult males and 15 females participated. N1-P2 components were evoked while listening to self-produced nonaltered and frequency-shifted /a/ tokens and during production of /a/ tokens during nonaltered…
Descriptors: Feedback (Response), Speech Communication, Speech, Stuttering
Peer reviewed
Garcia-Sierra, Adrian; Ramirez-Esparza, Nairan; Silva-Pereyra, Juan; Siard, Jennifer; Champlin, Craig A. – Brain and Language, 2012
Event-related potentials (ERPs) were recorded from Spanish-English bilinguals (N = 10) to test pre-attentive speech discrimination in two language contexts. ERPs were recorded while participants silently read magazines in English or Spanish. Two speech contrast conditions were recorded in each language context. In the "phonemic in English"…
Descriptors: Phonetics, Phonemics, Bilingualism, Spanish
Peer reviewed
Pivik, R. T.; Andres, Aline; Badger, Thomas M. – Brain and Language, 2012
The influence of diet on cortical processing of syllables was examined at 3 and 6 months in 239 infants who were breastfed or fed milk-based or soy-based formula. Event-related potentials to syllables differing in voice onset time were recorded from placements overlying brain areas specialized for language processing. P1 component amplitude and latency…
Descriptors: Brain Hemisphere Functions, Speech, Infants, Dietetics
Peer reviewed
Tkach, Jean A.; Chen, Xu; Freebairn, Lisa A.; Schmithorst, Vincent J.; Holland, Scott K.; Lewis, Barbara A. – Brain and Language, 2011
Speech sound disorders (SSD) are the largest group of communication disorders observed in children. One explanation for these disorders is that children with SSD fail to form stable phonological representations when acquiring the speech sound system of their language due to poor phonological memory (PM). The goal of this study was to examine PM in…
Descriptors: Reading Difficulties, Speech, Language Impairments, Communication Disorders
Peer reviewed
Gow, David W., Jr.; Keller, Corey J.; Eskandar, Emad; Meng, Nate; Cash, Sydney S. – Brain and Language, 2009
In this work, we apply Granger causality analysis to high spatiotemporal resolution intracranial EEG (iEEG) data to examine how different components of the left perisylvian language network interact during spoken language perception. The specific focus is on the characterization of serial versus parallel processing dependencies in the dominant…
Descriptors: Speech, Oral Language, Medicine, Auditory Perception
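For readers unfamiliar with the method named in the abstract above, the sketch below runs a pairwise Granger-causality test on two synthetic channel time series with statsmodels; the channel roles, lag order, and data are assumptions for illustration only, not the authors' iEEG analysis.

```python
# Minimal sketch of a pairwise Granger-causality test between two synthetic channels.
# Channel names, lag order, and the data are assumptions; real iEEG analyses involve
# far more preprocessing and model selection.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 2000
source = rng.standard_normal(n)      # e.g., a hypothetical "posterior temporal" channel
target = np.zeros(n)                 # e.g., a hypothetical "inferior frontal" channel
for i in range(2, n):
    # target is driven by past values of source, so source should Granger-cause target
    target[i] = 0.6 * target[i - 1] + 0.4 * source[i - 2] + 0.5 * rng.standard_normal()

# statsmodels tests whether the SECOND column Granger-causes the FIRST column.
data = np.column_stack([target, source])
results = grangercausalitytests(data, maxlag=4)
for lag, res in results.items():
    f_stat, p_value = res[0]["ssr_ftest"][:2]
    print(f"lag {lag}: F = {f_stat:.1f}, p = {p_value:.3g}")
```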
Peer reviewed
Liotti, Mario; Ingham, Janis C.; Takai, Osamu; Paskos, Delia Kothmann; Perez, Ricardo; Ingham, Roger J. – Brain and Language, 2010
High-density ERPs were recorded in eight adults with persistent developmental stuttering (PERS) and eight matched normally fluent (CONT) control volunteers while participants either repeatedly uttered the vowel "ah" or listened to their own previously recorded vocalizations. The fronto-central N1 auditory wave was reduced in response to spoken…
Descriptors: Brain Hemisphere Functions, Stuttering, Vowels, Auditory Perception
Peer reviewed
Dick, Anthony Steven; Solodkin, Ana; Small, Steven L. – Brain and Language, 2010
Everyday conversation is both an auditory and a visual phenomenon. While visual speech information enhances comprehension for the listener, evidence suggests that the ability to benefit from this information improves with development. A number of brain regions have been implicated in audiovisual speech comprehension, but the extent to which the…
Descriptors: Speech, Structural Equation Models, Neurological Organization, Brain Hemisphere Functions
Peer reviewed
Westermann, Gert; Miranda, Eduardo Reck – Brain and Language, 2004
We present a computational model that learns a coupling between motor parameters and their sensory consequences in vocal production during a babbling phase. Based on the coupling, preferred motor parameters and prototypically perceived sounds develop concurrently. Exposure to an ambient language modifies perception to coincide with the sounds from…
Descriptors: Models, Cognitive Processes, Auditory Perception, Psychomotor Skills
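Very loosely in the spirit of the coupling described in the abstract above (and not the authors' actual model), the toy sketch below uses a Hebbian rule to associate randomly "babbled" motor parameters with their sensory consequences; the articulator function, unit tuning, and learning rate are assumptions made for illustration.

```python
# Toy illustration (not the authors' model): Hebbian coupling between motor and
# sensory units during random "babbling". The articulator function, tuning widths,
# and learning rate are all assumptions made for this sketch.
import numpy as np

rng = np.random.default_rng(1)

motor_prefs = np.linspace(0.0, 1.0, 20)     # preferred motor parameter of each motor unit
sensory_prefs = np.linspace(0.0, 1.0, 20)   # preferred sound value of each sensory unit
weights = np.zeros((20, 20))                # motor-to-sensory coupling weights
sigma, lr = 0.05, 0.1

def articulate(m):
    """Assumed mapping from a motor parameter to a 1-D 'sound' value."""
    return np.clip(m + 0.1 * np.sin(6 * m), 0.0, 1.0)

for _ in range(5000):                       # babbling phase
    m = rng.random()                        # random motor command
    s = articulate(m)                       # its sensory consequence
    motor_act = np.exp(-((motor_prefs - m) ** 2) / (2 * sigma ** 2))
    sensory_act = np.exp(-((sensory_prefs - s) ** 2) / (2 * sigma ** 2))
    weights += lr * np.outer(motor_act, sensory_act)   # Hebbian co-activation update

# After babbling, each motor unit is most strongly coupled to the sensory unit
# that its preferred movement tends to produce.
best = sensory_prefs[weights.argmax(axis=1)]
print(np.round(best[:5], 2))
```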