Sweeny, Timothy D.; Guzman-Martinez, Emmanuel; Ortega, Laura; Grabowecky, Marcia; Suzuki, Satoru – Cognition, 2012
While perceiving speech, people see mouth shapes that are systematically associated with sounds. In particular, a vertically stretched mouth produces a /woo/ sound, whereas a horizontally stretched mouth produces a /wee/ sound. We demonstrate that hearing these speech sounds alters how we see aspect ratio, a basic visual feature that contributes…
Descriptors: Television Viewing, Visual Perception, Auditory Perception, Geometric Concepts
Kraljic, Tanya; Samuel, Arthur G. – Cognition, 2011
Listeners rapidly adjust to talkers' pronunciations, accommodating those pronunciations into the relevant phonemic category to improve subsequent perception. Previous work has suggested that such learning is restricted to pronunciations that are representative of how the speaker talks (Kraljic, Samuel, & Brennan, 2008). If an ambiguous…
Descriptors: Auditory Perception, Learning Processes, Experiments, Speech Communication
Eramudugolla, Ranmalee; Kamke, Marc R.; Soto-Faraco, Salvador; Mattingley, Jason B. – Cognition, 2011
A period of exposure to trains of simultaneous but spatially offset auditory and visual stimuli can induce a temporary shift in the perception of sound location. This phenomenon, known as the "ventriloquist aftereffect", reflects a realignment of auditory and visual spatial representations such that they approach perceptual alignment despite their…
Descriptors: Visual Stimuli, Auditory Stimuli, Spatial Ability, Cognitive Ability
Zeelenberg, Rene; Bocanegra, Bruno R. – Cognition, 2010
Recent studies show that emotional stimuli impair performance on subsequently presented neutral stimuli. Here we show a cross-modal perceptual enhancement caused by emotional cues. Auditory cue words were followed by a visually presented neutral target word. Two-alternative forced-choice identification of the visual target was improved by…
Descriptors: Cues, Visual Perception, Auditory Stimuli, Visual Stimuli
Foxton, Jessica M.; Riviere, Louis-David; Barone, Pascal – Cognition, 2010
Speech prosody has traditionally been considered solely in terms of its auditory features, yet correlated visual features exist, such as head and eyebrow movements. This study investigated the extent to which visual prosodic features are able to affect the perception of the auditory features. Participants were presented with videos of a speaker…
Descriptors: Visual Stimuli, Speech Communication, Suprasegmentals, Human Body
Petrini, Karin; Russell, Melanie; Pollick, Frank – Cognition, 2009
The ability to predict the effects of actions is necessary to behave properly in our physical and social world. Here, we describe how the ability to predict the consequence of complex gestures can change the way we integrate sight and sound when relevant visual information is missing. Six drummers and six novices were asked to judge audiovisual…
Descriptors: Vision, Prediction, Nonverbal Communication, Auditory Perception
Jordan, Kerry E.; Suanda, Sumarga H.; Brannon, Elizabeth M. – Cognition, 2008
Intersensory redundancy can facilitate animal and human behavior in areas as diverse as rhythm discrimination, signal detection, orienting responses, maternal call learning, and associative learning. In the realm of numerical development, infants show similar sensitivity to numerical differences in both the visual and auditory modalities. Using a…
Descriptors: Infants, Associative Learning, Redundancy, Cognitive Ability
Tuomainen, J.; Andersen, T.S.; Tiippana, K.; Sams, M. – Cognition, 2005
In face-to-face conversation speech is perceived by ear and eye. We studied the prerequisites of audio-visual speech perception by using perceptually ambiguous sine wave replicas of natural speech as auditory stimuli. When the subjects were not aware that the auditory stimuli were speech, they showed only negligible integration of auditory and…
Descriptors: Visual Stimuli, Auditory Perception, Auditory Stimuli
Soto-Faraco, Salvador; Navarra, Jordi; Alsius, Agnes – Cognition, 2004
The McGurk effect is usually presented as an example of fast, automatic, multisensory integration. We report a series of experiments designed to directly assess these claims. We used a syllabic version of the "speeded classification" paradigm, whereby response latencies to the first (target) syllable of spoken word-like stimuli are slowed down…
Descriptors: Classification, Auditory Perception, Visual Perception, Syllables