Showing all 10 results
Peer reviewed
Hu, Zhonghua; Zhang, Ruiling; Zhang, Qinglin; Liu, Qiang; Li, Hong – Brain and Language, 2012
Previous studies have found a late frontal-central audiovisual interaction during the period of about 150-220 ms post-stimulus. However, it is unclear which process this audiovisual interaction is related to: the processing of acoustic features or the classification of stimuli. To investigate this question, event-related potentials were recorded…
Descriptors: Auditory Stimuli, Semantics, Interaction, Semiotics
Peer reviewed
Rama, Pia; Relander-Syrjanen, Kristiina; Carlson, Synnove; Salonen, Oili; Kujala, Teija – Brain and Language, 2012
This fMRI study was conducted to investigate whether language semantics is processed even when attention is not explicitly directed to word meanings. In the "unattended" condition, the subjects performed a visual detection task while hearing semantically related and unrelated word pairs. In the "phoneme" condition, the subjects made phoneme…
Descriptors: Phonemes, Semantics, Attention, Language Processing
Peer reviewed
Weber-Fox, Christine; Leonard, Laurence B.; Wray, Amanda Hampton; Tomblin, J. Bruce – Brain and Language, 2010
Brief tonal stimuli and spoken sentences were utilized to examine whether adolescents (aged 14;3-18;1) with specific language impairments (SLI) exhibit atypical neural activity for rapid auditory processing of non-linguistic stimuli and linguistic processing of verb-agreement and semantic constraints. Further, we examined whether the behavioral…
Descriptors: Sentences, Auditory Stimuli, Semantics, Verbs
Peer reviewed
Hocking, Julia; Price, Cathy J. – Brain and Language, 2009
This fMRI study investigates how audiovisual integration differs for verbal stimuli that can be matched at a phonological level and nonverbal stimuli that can be matched at a semantic level. Subjects were presented simultaneously with one visual and one auditory stimulus and were instructed to decide whether these stimuli referred to the same…
Descriptors: Verbal Stimuli, Semantics, Cognitive Processes, Diagnostic Tests
Peer reviewed
Rogalsky, Corianne; Pitz, Eleanor; Hillis, Argye E.; Hickok, Gregory – Brain and Language, 2008
Auditory word comprehension was assessed in a series of 289 acute left hemisphere stroke patients. Participants decided whether an auditorily presented word matched a picture. On different trials, words were presented with a matching picture, a semantic foil, or a phonemic foil. Participants had significantly more trouble with semantic foils…
Descriptors: Phonemics, Semantics, Patients, Brain Hemisphere Functions
Peer reviewed
Bastiaansen, Marcel C. M.; Oostenveld, Robert; Jensen, Ole; Hagoort, Peter – Brain and Language, 2008
An influential hypothesis regarding the neural basis of the mental lexicon is that semantic representations are neurally implemented as distributed networks carrying sensory, motor and/or more abstract functional information. This work investigates whether the semantic properties of words partly determine the topography of such networks. Subjects…
Descriptors: Topography, Semantics, Nouns, Musicians
Peer reviewed
Kittredge, Audrey; Davis, Lissa; Blumstein, Sheila E. – Brain and Language, 2006
In a series of experiments, the effect of white noise distortion and talker variation on lexical access in normal and Broca's aphasic participants was examined using an auditory lexical decision paradigm. Masking the prime stimulus in white noise resulted in reduced semantic priming for both groups, indicating that lexical access is degraded by…
Descriptors: Aphasia, Acoustics, Auditory Stimuli, Patients
Peer reviewed
Feldman, Laurie Beth; Soltano, Emily G.; Pastizzo, Matthew J.; Francis, Sarah E. – Brain and Language, 2004
We examined the influence of semantic transparency on morphological facilitation in English in three lexical decision experiments. Decision latencies to visual targets (e.g., CASUALNESS) were faster after semantically transparent (e.g., CASUALLY) than semantically opaque (e.g., CASUALTY) primes whether primes were auditory and presented…
Descriptors: Semantics, Morphology (Languages), Language Processing, English
Peer reviewed
Friederici, Angela D.; Alter, Kai – Brain and Language, 2004
Spoken language comprehension requires the coordination of different subprocesses in time. After the initial acoustic analysis the system has to extract segmental information such as phonemes, syntactic elements and lexical-semantic elements as well as suprasegmental information such as accentuation and intonational phrases, i.e., prosody…
Descriptors: Listening Comprehension, Language Processing, Brain Hemisphere Functions, Syntax
Peer reviewed
Blumenfeld, Henrike K.; Booth, James R.; Burman, Douglas D. – Brain and Language, 2006
This study used functional magnetic resonance imaging (fMRI) to examine brain-behavior correlations in a group of 16 children (9- to 12-year-olds). Activation was measured during a semantic judgment task presented in either the visual or auditory modality that required the individual to determine whether a final word was related in meaning to one…
Descriptors: Brain Hemisphere Functions, Visual Discrimination, Auditory Discrimination, Neurolinguistics