Publication Date
  In 2025: 0
  Since 2024: 0
  Since 2021 (last 5 years): 0
  Since 2016 (last 10 years): 0
  Since 2006 (last 20 years): 27
Descriptor
  Task Analysis: 28
  Brain Hemisphere Functions: 24
  Diagnostic Tests: 15
  Language Processing: 14
  Pictorial Stimuli: 12
  Visual Stimuli: 11
  Semantics: 8
  Cognitive Processes: 7
  Auditory Stimuli: 6
  Correlation: 6
  Speech: 5
Source
  Brain and Language: 28
Author
  Pascual-Leone, Alvaro: 2
  Ahonen, Timo: 1
  Amorapanth, Prin: 1
  Annoni, Jean-Marie: 1
  Arciuli, Joanne: 1
  Aro, Mikko: 1
  Aron, Adam R.: 1
  Avila, C.: 1
  Badcock, Nicholas A.: 1
  Baker, Errol H.: 1
  Barry, Johanna G.: 1
Publication Type
  Journal Articles: 28
  Reports - Research: 22
  Reports - Evaluative: 6
Dhooge, Elisah; De Baene, Wouter; Hartsuiker, Robert J. – Brain and Language, 2013
In this study, we investigated how people deal with irrelevant contextual information during speech production. Two main models have been proposed. WEAVER++ assumes that irrelevant information is removed from the production system by an early blocking mechanism. On the other hand, the response exclusion hypothesis assumes a blocking mechanism that…
Descriptors: Cognitive Processes, Speech, Naming, Brain Hemisphere Functions
Rama, Pia; Relander-Syrjanen, Kristiina; Carlson, Synnove; Salonen, Oili; Kujala, Teija – Brain and Language, 2012
This fMRI study was conducted to investigate whether language semantics is processed even when attention is not explicitly directed to word meanings. In the "unattended" condition, the subjects performed a visual detection task while hearing semantically related and unrelated word pairs. In the "phoneme" condition, the subjects made phoneme…
Descriptors: Phonemes, Semantics, Attention, Language Processing
Nemrodov, Dan; Harpaz, Yuval; Javitt, Daniel C.; Lavidor, Michal – Brain and Language, 2011
This study examined the capability of the left hemisphere (LH) and the right hemisphere (RH) to perform a visual recognition task independently as formulated by the Direct Access Model (Fernandino, Iacoboni, & Zaidel, 2007). Healthy native Hebrew speakers were asked to categorize nouns and non-words (created from nouns by transposing two middle…
Descriptors: Evidence, Stimuli, Nouns, Word Recognition
Emmorey, Karen; Xu, Jiang; Braun, Allen – Brain and Language, 2011
To identify neural regions that automatically respond to linguistically structured, but meaningless manual gestures, 14 deaf native users of American Sign Language (ASL) and 14 hearing non-signers passively viewed pseudosigns (possible but non-existent ASL signs) and non-iconic ASL signs, in addition to a fixation baseline. For the contrast…
Descriptors: Phonetics, Task Analysis, American Sign Language, Language Processing
Henderson, Lisa M.; Baseler, Heidi A.; Clarke, Paula J.; Watson, Sarah; Snowling, Margaret J. – Brain and Language, 2011
Using event-related potentials (ERPs), we investigated the N400 (an ERP component that occurs in response to meaningful stimuli) in children aged 8-10 years old and examined relationships between the N400 and individual differences in listening comprehension, word recognition and non-word decoding. Moreover, we tested the claim that the N400…
Descriptors: Listening Comprehension, Stimuli, Semantics, Word Recognition
Osnes, Berge; Hugdahl, Kenneth; Hjelmervik, Helene; Specht, Karsten – Brain and Language, 2012
In studies on auditory speech perception, participants are often asked to perform active tasks, e.g. decide whether the perceived sound is a speech sound or not. However, information about the stimulus, inherent in such tasks, may induce expectations that cause altered activations not only in the auditory cortex, but also in frontal areas such as…
Descriptors: Music, Auditory Perception, Speech Communication, Brain
Megnin-Viggars, Odette; Goswami, Usha – Brain and Language, 2013
Visual speech inputs can enhance auditory speech information, particularly in noisy or degraded conditions. The natural statistics of audiovisual speech highlight the temporal correspondence between visual and auditory prosody, with lip, jaw, cheek and head movements conveying information about the speech envelope. Low-frequency spatial and…
Descriptors: Phonology, Cues, Visual Perception, Speech
Coppens, Leonora C.; Gootjes, Liselotte; Zwaan, Rolf A. – Brain and Language, 2012
Language comprehenders form a mental representation of the implied shape of objects mentioned in the text. In the present study, the influence of prior visual experience on subsequent reading was assessed. In two separate phases, participants saw a picture of an object and read a text about the object, suggesting the same or a different shape.…
Descriptors: Brain Hemisphere Functions, Visual Perception, Cognitive Processes, Reading Processes
Menenti, Laura; Segaert, Katrien; Hagoort, Peter – Brain and Language, 2012
Models of speaking distinguish producing meaning, words and syntax as three different linguistic components of speaking. Nevertheless, little is known about the brain's integrated neuronal infrastructure for speech production. We investigated semantic, lexical and syntactic aspects of speaking using fMRI. In a picture description task, we…
Descriptors: Sentences, Speech Communication, Semantics, Syntax
Cai, Weidong; Oldenkamp, Caitlin L.; Aron, Adam R. – Brain and Language, 2012
Some situations require one to quickly stop an initiated response. Recent evidence suggests that rapid stopping engages a mechanism that has diffuse effects on the motor system. For example, stopping the hand dampens the excitability of the task-irrelevant leg. However, it is unclear whether this "global suppression" could apply across wider motor…
Descriptors: Motor Reactions, Brain Hemisphere Functions, Responses, Cognitive Processes
Bedny, Marina; Pascual-Leone, Alvaro; Dravida, Swethasri; Saxe, Rebecca – Brain and Language, 2012
Recent evidence suggests that blindness enables visual circuits to contribute to language processing. We examined whether this dramatic functional plasticity has a sensitive period. BOLD fMRI signal was measured in congenitally blind, late blind (blindness onset 9-years-old or later) and sighted participants while they performed a sentence…
Descriptors: Evidence, Sentences, Blindness, Brain Hemisphere Functions
Magezi, David A.; Khateb, Asaid; Mouthon, Michael; Spierer, Lucas; Annoni, Jean-Marie – Brain and Language, 2012
In highly proficient, early bilinguals, behavioural studies of the cost of switching language or task suggest qualitative differences between language control and domain-general cognitive control. By contrast, several neuroimaging studies have shown an overlap of the brain areas involved in language control and domain-general cognitive control.…
Descriptors: Evidence, Brain Hemisphere Functions, Bilingualism, Cognitive Ability
Korinth, Sebastian Peter; Sommer, Werner; Breznitz, Zvia – Brain and Language, 2012
Little is known about the relationship of reading speed and early visual processes in normal readers. Here we examined the association of the early P1, N170 and late N1 component in visual event-related potentials (ERPs) with silent reading speed and a number of additional cognitive skills in a sample of 52 adult German readers utilizing a Lexical…
Descriptors: Reading Processes, Visual Stimuli, Silent Reading, Reading Rate
Ziegler, Johannes C.; Pech-Georgel, Catherine; George, Florence; Foxton, Jessica M. – Brain and Language, 2012
This study investigated global versus local pitch pattern perception in children with dyslexia aged between 8 and 11 years. Children listened to two consecutive 4-tone pitch sequences while performing a same/different task. On the different trials, sequences either preserved the contour (local condition) or they violated the contour (global…
Descriptors: Phonology, Dyslexia, Short Term Memory, Brain Hemisphere Functions
Amorapanth, Prin; Kranjec, Alexander; Bromberger, Bianca; Lehet, Matthew; Widick, Page; Woods, Adam J.; Kimberg, Daniel Y.; Chatterjee, Anjan – Brain and Language, 2012
Schemas are abstract nonverbal representations that parsimoniously depict spatial relations. Despite their ubiquitous use in maps and diagrams, little is known about their neural instantiation. We sought to determine the extent to which schematic representations are neurally distinguished from language on the one hand, and from rich perceptual…
Descriptors: Brain Hemisphere Functions, Patients, Schemata (Cognition), Spatial Ability