Van Strien, Jan W. – Brain and Language, 2004
To investigate whether concurrent nonverbal sound sequences would affect visual-hemifield lexical processing, lexical-decision performance of 24 strongly right-handed students (12 men, 12 women) was measured in three conditions: baseline, concurrent neutral sound sequence, and concurrent emotional sound sequence. With the neutral sequence,…
Descriptors: Auditory Stimuli, Brain Hemisphere Functions, Hypothesis Testing, Cognitive Processes
Kittredge, Audrey; Davis, Lissa; Blumstein, Sheila E. – Brain and Language, 2006
In a series of experiments, the effect of white noise distortion and talker variation on lexical access in normal and Broca's aphasic participants was examined using an auditory lexical decision paradigm. Masking the prime stimulus in white noise resulted in reduced semantic priming for both groups, indicating that lexical access is degraded by…
Descriptors: Aphasia, Acoustics, Auditory Stimuli, Patients
Friederici, Angela D.; Alter, Kai – Brain and Language, 2004
Spoken language comprehension requires the coordination of different subprocesses in time. After the initial acoustic analysis the system has to extract segmental information such as phonemes, syntactic elements and lexical-semantic elements as well as suprasegmental information such as accentuation and intonational phrases, i.e., prosody.…
Descriptors: Listening Comprehension, Language Processing, Brain Hemisphere Functions, Syntax
Indefrey, Peter; Hellwig, Frauke; Herzog, Hans; Seitz, Rudiger J.; Hagoort, Peter – Brain and Language, 2004
Following up on an earlier positron emission tomography (PET) experiment (Indefrey et al., 2001), we used a scene description paradigm to investigate whether a posterior inferior frontal region subserving syntactic encoding for speaking is also involved in syntactic parsing during listening. In the language production part of the experiment,…
Descriptors: Listening Comprehension, Auditory Stimuli, Syntax, Speech Communication
Lavidor, Michal; Hayes, Adrian; Shillcock, Richard; Ellis, Andrew W. – Brain and Language, 2004
The split fovea theory proposes that visual word recognition of centrally presented words is mediated by the splitting of the foveal image, with letters to the left of fixation being projected to the right hemisphere (RH) and letters to the right of fixation being projected to the left hemisphere (LH). Two lexical decision experiments aimed to…
Descriptors: Word Recognition, Language Processing, Visual Stimuli, Orthographic Symbols