Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 0
  Since 2016 (last 10 years): 0
  Since 2006 (last 20 years): 10
Descriptor
  Auditory Stimuli: 13
  Visual Stimuli: 13
  Brain Hemisphere Functions: 7
  Diagnostic Tests: 7
  Cognitive Processes: 6
  Language Processing: 5
  Semantics: 5
  Neurological Organization: 4
  Brain: 3
  Children: 3
  Comparative Analysis: 3
Source
  Brain and Language: 13
Publication Type
  Journal Articles: 13
  Reports - Research: 12
  Reports - Descriptive: 1
Hessler, Dorte; Jonkers, Roel; Stowe, Laurie; Bastiaanse, Roelien – Brain and Language, 2013
In the current ERP study, an active oddball task was carried out, testing pure tones and auditory, visual and audiovisual syllables. For pure tones, an MMN, an N2b, and a P3 were found, confirming traditional findings. Auditory syllables evoked an N2 and a P3. We found that the amplitude of the P3 depended on the distance between standard and…
Descriptors: Auditory Stimuli, Audiovisual Aids, Phonemes, Brain Hemisphere Functions
Howell, Peter; Jiang, Jing; Peng, Danling; Lu, Chunming – Brain and Language, 2012
The neural mechanisms underlying tone rises and falls in Mandarin were investigated. Nine participants were scanned while they named one-character pictures that required rising- or falling-tone responses in Mandarin: the left insula and right putamen showed activation differences between rising and falling tones; the left brainstem showed weaker…
Descriptors: Phonology, Mandarin Chinese, Investigations, Visual Stimuli
Hu, Zhonghua; Zhang, Ruiling; Zhang, Qinglin; Liu, Qiang; Li, Hong – Brain and Language, 2012
Previous studies have found a late frontal-central audiovisual interaction during the time period of about 150-220 ms post-stimulus. However, it is unclear to which process this audiovisual interaction is related: to the processing of acoustic features or to the classification of stimuli? To investigate this question, event-related potentials were recorded…
Descriptors: Auditory Stimuli, Semantics, Interaction, Semiotics
Rama, Pia; Relander-Syrjanen, Kristiina; Carlson, Synnove; Salonen, Oili; Kujala, Teija – Brain and Language, 2012
This fMRI study was conducted to investigate whether language semantics is processed even when attention is not explicitly directed to word meanings. In the "unattended" condition, the subjects performed a visual detection task while hearing semantically related and unrelated word pairs. In the "phoneme" condition, the subjects made phoneme…
Descriptors: Phonemes, Semantics, Attention, Language Processing
Emmorey, Karen; Xu, Jiang; Braun, Allen – Brain and Language, 2011
To identify neural regions that automatically respond to linguistically structured, but meaningless manual gestures, 14 deaf native users of American Sign Language (ASL) and 14 hearing non-signers passively viewed pseudosigns (possible but non-existent ASL signs) and non-iconic ASL signs, in addition to a fixation baseline. For the contrast…
Descriptors: Phonetics, Task Analysis, American Sign Language, Language Processing
Kast, Monika; Bezzola, Ladina; Jancke, Lutz; Meyer, Martin – Brain and Language, 2011
The present functional magnetic resonance imaging (fMRI) study was designed to investigate the neural substrates involved in the audiovisual processing of disyllabic German words and pseudowords. Twelve dyslexic and 13 nondyslexic adults performed a lexical decision task while stimuli were presented unimodally (either aurally or…
Descriptors: Decoding (Reading), Metabolism, Stimuli, Stimulation
Dick, Anthony Steven; Solodkin, Ana; Small, Steven L. – Brain and Language, 2010
Everyday conversation is both an auditory and a visual phenomenon. While visual speech information enhances comprehension for the listener, evidence suggests that the ability to benefit from this information improves with development. A number of brain regions have been implicated in audiovisual speech comprehension, but the extent to which the…
Descriptors: Speech, Structural Equation Models, Neurological Organization, Brain Hemisphere Functions
Bastiaansen, Marcel C. M.; Oostenveld, Robert; Jensen, Ole; Hagoort, Peter – Brain and Language, 2008
An influential hypothesis regarding the neural basis of the mental lexicon is that semantic representations are neurally implemented as distributed networks carrying sensory, motor and/or more abstract functional information. This work investigates whether the semantic properties of words partly determine the topography of such networks. Subjects…
Descriptors: Topography, Semantics, Nouns, Musicians
Wolff, Susann; Schlesewsky, Matthias; Hirotani, Masako; Bornkessel-Schlesewsky, Ina – Brain and Language, 2008
We present two ERP studies on the processing of word order variations in Japanese, a language that is suited to shedding further light on the implications of word order freedom for neurocognitive approaches to sentence comprehension. Experiment 1 used auditory presentation and revealed that initial accusative objects elicit increased processing…
Descriptors: Sentence Structure, Word Order, Costs, Japanese
Feldman, Laurie Beth; Soltano, Emily G.; Pastizzo, Matthew J.; Francis, Sarah E. – Brain and Language, 2004
We examined the influence of semantic transparency on morphological facilitation in English in three lexical decision experiments. Decision latencies to visual targets (e.g., CASUALNESS) were faster after semantically transparent (e.g., CASUALLY) than semantically opaque (e.g., CASUALTY) primes whether primes were auditory and presented…
Descriptors: Semantics, Morphology (Languages), Language Processing, English
Agnew, John A.; Dorn, Courtney; Eden, Guinevere F. – Brain and Language, 2004
This study assessed the ability of seven children to accurately judge relative durations of auditory and visual stimuli before and after participation in a language remediation program. The goal of the intervention program is to improve the children's ability to detect and identify rapidly changing auditory stimuli, and thereby improve their…
Descriptors: Auditory Perception, Training, Reading Skills, Auditory Stimuli
Van Strien, Jan W. – Brain and Language, 2004
To investigate whether concurrent nonverbal sound sequences would affect visual-hemifield lexical processing, lexical-decision performance of 24 strongly right-handed students (12 men, 12 women) was measured in three conditions: baseline, concurrent neutral sound sequence, and concurrent emotional sound sequence. With the neutral sequence,…
Descriptors: Auditory Stimuli, Brain Hemisphere Functions, Hypothesis Testing, Cognitive Processes
Blumenfeld, Henrike K.; Booth, James R.; Burman, Douglas D. – Brain and Language, 2006
This study used functional magnetic resonance imaging (fMRI) to examine brain-behavior correlations in a group of 16 children (9- to 12-year-olds). Activation was measured during a semantic judgment task presented in either the visual or auditory modality that required the individual to determine whether a final word was related in meaning to one…
Descriptors: Brain Hemisphere Functions, Visual Discrimination, Auditory Discrimination, Neurolinguistics