Interaction between Word Processing and Low-Level Visual Representation in Autistic College Students
Nicolás Acuña Luongo; Valeria Arriaza – Mind, Brain, and Education, 2025
Recent studies have reported differential multisensory integration (MSI) in autism spectrum disorder (ASD). Much of the research on MSI differences has focused on how visual stimuli influence speech processing. The present study takes a reverse perspective. We investigated whether speech processing can affect the construction of low-level visual…
Descriptors: Visual Perception, Autism Spectrum Disorders, College Students, Multisensory Learning
Lacey, Simon; Jamal, Yaseen; List, Sara M.; McCormick, Kelly; Sathian, K.; Nygaard, Lynne C. – Cognitive Science, 2020
Sound symbolism refers to non-arbitrary mappings between the sounds of words and their meanings and is often studied by pairing auditory pseudowords such as "maluma" and "takete" with rounded and pointed visual shapes, respectively. However, it is unclear what auditory properties of pseudowords contribute to their perception as…
Descriptors: Acoustics, Auditory Stimuli, Cognitive Mapping, Definitions
Irwin, Julia; Avery, Trey; Kleinman, Daniel; Landi, Nicole – Journal of Autism and Developmental Disorders, 2022
Children with autism spectrum disorders have been reported to be less influenced by a speaker's face during speech perception than those with typical development. To more closely examine these reported differences, a novel visual phonemic restoration paradigm was used to assess neural signatures (event-related potentials [ERPs]) of audiovisual…
Descriptors: Autism, Pervasive Developmental Disorders, Diagnostic Tests, Brain Hemisphere Functions
Batel, Essa – Journal of Psycholinguistic Research, 2020
This study tested the effect of constraining sentence context on word recognition time (RT) in the first and second language. Native (L1) and nonnative (L2) speakers of English performed self-paced reading and listening tasks to see whether a semantically rich preceding context would lead to the activation of a probable upcoming word prior to…
Descriptors: Word Recognition, Visual Stimuli, Auditory Stimuli, Auditory Perception
Audiovisual Speech Processing in Relationship to Phonological and Vocabulary Skills in First Graders
Gijbels, Liesbeth; Yeatman, Jason D.; Lalonde, Kaylah; Lee, Adrian K. C. – Journal of Speech, Language, and Hearing Research, 2021
Purpose: It is generally accepted that adults use visual cues to improve speech intelligibility in noisy environments, but findings regarding visual speech benefit in children are mixed. We explored factors that contribute to audiovisual (AV) gain in young children's speech understanding. We examined whether there is an AV benefit to…
Descriptors: Auditory Perception, Visual Stimuli, Auditory Stimuli, Cues
Jerger, Susan; Damian, Markus F.; Karl, Cassandra; Abdi, Hervé – Journal of Speech, Language, and Hearing Research, 2018
Purpose: Successful speech processing depends on our ability to detect and integrate multisensory cues, yet there is minimal research on multisensory speech detection and integration by children. To address this need, we studied the development of speech detection for auditory (A), visual (V), and audiovisual (AV) input. Method: Participants were…
Descriptors: Speech, Language Processing, Auditory Perception, Visual Stimuli
Ostarek, Markus; Huettig, Falk – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2017
It is well established that the comprehension of spoken words referring to object concepts relies on high-level visual areas in the ventral stream that build increasingly abstract representations. It is much less clear whether basic low-level visual representations are also involved. Here we asked in what task situations low-level visual…
Descriptors: Word Recognition, Comprehension, Visual Stimuli, Interference (Learning)
Mihye Choi – ProQuest LLC, 2020
One hypothesis to explain perceptual narrowing in speech perception is the distributional learning account. This account claims that both infants and adults are able to infer the number of phonemic categories through observations of frequency distributions of individual phones in their speech input (Maye, Werker, & Gerken, 2002). Although the…
Descriptors: Phonemes, Native Language, Cues, Information Sources
Heikkilä, Jenni; Tiippana, Kaisa; Loberg, Otto; Leppänen, Paavo H. T. – Language Learning, 2018
Seeing articulatory gestures enhances speech perception. Perception of auditory speech can even be changed by incongruent visual gestures, which is known as the McGurk effect (e.g., dubbing a voice saying /mi/ onto a face articulating /ni/, observers often hear /ni/). In children, the McGurk effect is weaker than in adults, but no previous…
Descriptors: Articulation (Speech), Audiovisual Aids, Brain Hemisphere Functions, Diagnostic Tests
Heuer, Sabine; Ivanova, Maria V.; Hallowell, Brooke – Journal of Speech, Language, and Hearing Research, 2017
Purpose: Language comprehension in people with aphasia (PWA) is frequently evaluated using multiple-choice displays: PWA are asked to choose the image that best corresponds to the verbal stimulus in a display. When a nontarget image is selected, comprehension failure is assumed. However, stimulus-driven factors unrelated to linguistic…
Descriptors: Aphasia, Eye Movements, Comparative Analysis, Language Processing
Giustolisi, Beatrice; Emmorey, Karen – Cognitive Science, 2018
This study investigated visual statistical learning (VSL) in 24 deaf signers and 24 hearing non-signers. Previous research with hearing individuals suggests that SL mechanisms support literacy. Our first goal was to assess whether VSL was associated with reading ability in deaf individuals, and whether this relation was sustained by a link between…
Descriptors: Deafness, Hearing Impairments, Task Analysis, Correlation
Gordon, Peter C.; Plummer, Patrick; Choi, Wonil – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2013
Serial attention models of eye-movement control during reading were evaluated in an eye-tracking experiment that examined how lexical activation combines with visual information in the parafovea to affect word skipping (where a word is not fixated during first-pass reading). Lexical activation was manipulated by repetition priming created through…
Descriptors: Human Body, Priming, Word Recognition, Eye Movements
Angele, Bernhard; Rayner, Keith – Journal of Experimental Psychology: Learning, Memory, and Cognition, 2013
One of the words that readers of English skip most often is the definite article "the". Most accounts of reading assume that in order for a reader to skip a word, it must have received some lexical processing. The definite article is skipped so regularly, however, that the oculomotor system might have learned to skip the letter string…
Descriptors: Form Classes (Languages), Sentences, Verbs, Language Processing
Goodhew, Stephanie C.; Visser, Troy A. W.; Lipp, Ottmar V.; Dux, Paul E. – Cognition, 2011
Decades of research on visual perception have uncovered many phenomena, such as binocular rivalry, backward masking, and the attentional blink, that reflect "failures of consciousness". Although stimuli do not reach awareness in these paradigms, there is evidence that they nevertheless undergo semantic processing. Object substitution masking (OSM),…
Descriptors: Semantics, Visual Perception, Cognitive Processes, Cues
Connell, Louise; Lynott, Dermot – Cognition, 2012
Abstract concepts are traditionally thought to differ from concrete concepts by their lack of perceptual information, which causes them to be processed more slowly and less accurately than perceptually-based concrete concepts. In two studies, we examined this assumption by comparing concreteness and imageability ratings to a set of perceptual…
Descriptors: Language Processing, Olfactory Perception, Word Processing, Reaction Time