Showing all 13 results
Peer reviewed
Sweeny, Timothy D.; Guzman-Martinez, Emmanuel; Ortega, Laura; Grabowecky, Marcia; Suzuki, Satoru – Cognition, 2012
While perceiving speech, people see mouth shapes that are systematically associated with sounds. In particular, a vertically stretched mouth produces a /woo/ sound, whereas a horizontally stretched mouth produces a /wee/ sound. We demonstrate that hearing these speech sounds alters how we see aspect ratio, a basic visual feature that contributes…
Descriptors: Television Viewing, Visual Perception, Auditory Perception, Geometric Concepts
Peer reviewed
Zeelenberg, Rene; Bocanegra, Bruno R. – Cognition, 2010
Recent studies show that emotional stimuli impair performance on subsequently presented neutral stimuli. Here we show a cross-modal perceptual enhancement caused by emotional cues. Auditory cue words were followed by a visually presented neutral target word. Two-alternative forced-choice identification of the visual target was improved by…
Descriptors: Cues, Visual Perception, Auditory Stimuli, Visual Stimuli
Peer reviewed
Swallow, Khena M.; Jiang, Yuhong V. – Cognition, 2010
Recent work on event perception suggests that perceptual processing increases when events change. An important question is how such changes influence the way other information is processed, particularly during dual-task performance. In this study, participants monitored a long series of distractor items for an occasional target as they…
Descriptors: Attention, Memory, Cognitive Processes, Task Analysis
Peer reviewed
Petrini, Karin; Russell, Melanie; Pollick, Frank – Cognition, 2009
The ability to predict the effects of actions is necessary to behave properly in our physical and social world. Here, we describe how the ability to predict the consequence of complex gestures can change the way we integrate sight and sound when relevant visual information is missing. Six drummers and six novices were asked to judge audiovisual…
Descriptors: Vision, Prediction, Nonverbal Communication, Auditory Perception
Peer reviewed
Yeung, H. Henny; Werker, Janet F. – Cognition, 2009
One of the central themes in the study of language acquisition is the gap between the linguistic knowledge that learners demonstrate and the apparent inadequacy of linguistic input to support induction of this knowledge. One of the first linguistic abilities to exemplify this problem in the course of development is speech perception:…
Descriptors: Language Acquisition, Native Speakers, Infants, Auditory Perception
Peer reviewed
Jordan, Kerry E.; MacLean, Evan L.; Brannon, Elizabeth M. – Cognition, 2008
We report here that monkeys can actively match the number of sounds they hear to the number of shapes they see and present the first evidence that monkeys sum over sounds and sights. In Experiment 1, two monkeys were trained to choose a simultaneous array of 1-9 squares that numerically matched a sample sequence of shapes or sounds. Monkeys…
Descriptors: Reaction Time, Critical Thinking, Animals, Animal Behavior
Peer reviewed
Sanabria, Daniel; Spence, Charles; Soto-Faraco, Salvador – Cognition, 2007
Motion information available to different sensory modalities can interact at both perceptual and post-perceptual (i.e., decisional) stages of processing. However, to date, researchers have only been able to demonstrate the influence of one of these components at any given time, hence the relationship between them remains uncertain. We addressed…
Descriptors: Motion, Cognitive Processes, Classification, Visual Perception
Peer reviewed
Chapados, Catherine; Levitin, Daniel J. – Cognition, 2008
This experiment was conducted to investigate cross-modal interactions in the emotional experience of music listeners. Previous research showed that visual information present in a musical performance is rich in expressive content, and moderates the subjective emotional experience of a participant listening to and/or observing musical stimuli [Vines,…
Descriptors: Musicians, Music, Emotional Response, Interaction
Peer reviewed
Davis, Chris; Kim, Jeesun – Cognition, 2006
The study examined whether people can extract speech-related information from the talker's upper face, presented using either normally textured videos (Experiments 1 and 3) or videos showing only the outline of the head (Experiments 2 and 4). Experiments 1 and 2 used within- and cross-modal matching tasks. In the within-modal task,…
Descriptors: Language Processing, Auditory Perception, Inner Speech (Subvocal), Motion
Peer reviewed
Schwartz, Jean-Luc; Berthommier, Frederic; Savariaux, Christophe – Cognition, 2004
Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances "sensitivity" to acoustic information,…
Descriptors: Hearing (Physiology), Lipreading, Auditory Perception, Visual Perception
Peer reviewed
Justus, Timothy; List, Alexandra – Cognition, 2005
Two priming experiments demonstrated exogenous attentional persistence to the fundamental auditory dimensions of frequency (Experiment 1) and time (Experiment 2). In a divided-attention task, participants responded to an independent dimension, the identification of three-tone sequence patterns, for both prime and probe stimuli. The stimuli were…
Descriptors: Auditory Stimuli, Experiments, Auditory Perception, Brain Hemisphere Functions
Peer reviewed
Soto-Faraco, Salvador; Navarra, Jordi; Alsius, Agnes – Cognition, 2004
The McGurk effect is usually presented as an example of fast, automatic, multisensory integration. We report a series of experiments designed to directly assess these claims. We used a syllabic version of the "speeded classification" paradigm, whereby response latencies to the first (target) syllable of spoken word-like stimuli are slowed down…
Descriptors: Classification, Auditory Perception, Visual Perception, Syllables
Peer reviewed
Toro, Juan M.; Sinnett, Scott; Soto-Faraco, Salvador – Cognition, 2005
We addressed the hypothesis that word segmentation based on statistical regularities occurs without the need of attention. Participants were presented with a stream of artificial speech in which the only cue to extract the words was the presence of statistical regularities between syllables. Half of the participants were asked to passively listen…
Descriptors: Auditory Perception, Word Recognition, Artificial Speech, Hypothesis Testing