Showing all 12 results
Peer reviewed
Hutson, John P.; Chandran, Prasanth; Magliano, Joseph P.; Smith, Tim J.; Loschky, Lester C. – Cognitive Science, 2022
Viewers' attentional selection while looking at scenes is affected by both top-down and bottom-up factors. However, when watching film, viewers typically attend to the movie similarly irrespective of top-down factors--a phenomenon we call the "tyranny of film." A key difference between still pictures and film is that film contains…
Descriptors: Attention, Eye Movements, Films, Motion
Peer reviewed
Dreneva, Anna; Shvarts, Anna; Chumachenko, Dmitry; Krichevets, Anatoly – Cognitive Science, 2021
The paper addresses the capabilities and limitations of extrafoveal processing during a categorical visual search. Previous research has established that a target could be identified as early as the very first saccade, or even without any saccade at all, suggesting that extrafoveal perception is necessarily involved. However, the limits in complexity defining the processed…
Descriptors: Cognitive Processes, Geometric Concepts, Visual Perception, Eye Movements
Peer reviewed
Sekicki, Mirjana; Staudte, Maria – Cognitive Science, 2018
Referential gaze has been shown to benefit language processing in situated communication in terms of shifting visual attention and leading to shorter reaction times on subsequent tasks. The present study simultaneously assessed both visual attention and, importantly, the immediate cognitive load induced at different stages of sentence processing.…
Descriptors: Eye Movements, Cognitive Processes, Difficulty Level, Language Processing
Peer reviewed
Yu, Chen; Smith, Linda B. – Cognitive Science, 2017
Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of this study was to understand the complex system of sensory-motor behaviors that…
Descriptors: Attention Control, Visual Perception, Language Acquisition, Toddlers
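The abstract above treats joint attention as the social coordination of two people's visual attention on the same object. As a hedged illustration only (the authors' actual analysis pipeline is not described here), the sketch below detects joint-attention bouts from two time-aligned gaze streams, where each sample records which object each partner is looking at; the stream format, sampling rate, minimum-duration threshold, and object labels are all assumptions.

```python
# Minimal sketch: detect joint-attention bouts from two time-aligned gaze
# streams. Each stream is a list of object labels (or None) sampled at a
# fixed rate. Names and thresholds are illustrative assumptions, not taken
# from Yu & Smith (2017).

def joint_attention_bouts(child_gaze, parent_gaze, min_samples=15):
    """Return (start_index, end_index, object) for stretches where both
    partners look at the same object for at least `min_samples` samples."""
    bouts, start, current = [], None, None
    for i, (c, p) in enumerate(zip(child_gaze, parent_gaze)):
        shared = c if (c is not None and c == p) else None
        if shared != current:
            if current is not None and i - start >= min_samples:
                bouts.append((start, i, current))
            start, current = i, shared
    if current is not None and len(child_gaze) - start >= min_samples:
        bouts.append((start, len(child_gaze), current))
    return bouts


if __name__ == "__main__":
    # Toy example at 10 Hz: a 2-second shared look at "truck" counts as a
    # bout; a brief 0.5-second overlap on "ball" does not (min_samples=15).
    child = ["truck"] * 20 + [None] * 5 + ["ball"] * 5
    parent = ["truck"] * 20 + ["ball"] * 10
    print(joint_attention_bouts(child, parent))  # -> [(0, 20, 'truck')]
```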
Peer reviewed
Coco, Moreno I.; Keller, Frank; Malcolm, George L. – Cognitive Science, 2016
The human sentence processor is able to make rapid predictions about upcoming linguistic input. For example, upon hearing the verb "eat," anticipatory eye-movements are launched toward edible objects in a visual scene (Altmann & Kamide, 1999). However, the cognitive mechanisms that underlie anticipation remain to be elucidated in ecologically…
Descriptors: Role, Memory, Visual Perception, Linguistic Input
Peer reviewed
Busey, Thomas; Yu, Chen; Wyatte, Dean; Vanderkolk, John – Cognitive Science, 2013
Perceptual tasks such as object matching, mammogram interpretation, mental rotation, and satellite imagery change detection often require the assignment of correspondences to fuse information across views. We apply techniques developed for machine translation to the gaze data recorded from a complex perceptual matching task modeled after…
Descriptors: Eye Movements, Perception Tests, Visual Stimuli, Visual Perception
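The abstract above mentions applying techniques from machine translation to gaze data in order to recover correspondences between two views. The paper's exact method is not reproduced here; as a hedged sketch, the code below runs an IBM-Model-1-style EM loop over paired fixation-region sequences, treating one view's regions as "source words" and the other's as "target words," which is one standard way such alignment probabilities are estimated. The region labels and trial data are invented.

```python
# IBM-Model-1-style alignment applied to paired fixation sequences (regions
# visited in view A vs. view B on each trial). An illustration of the general
# machine-translation alignment idea, not the specific method of Busey et al.
# (2013). All data are invented.
from collections import defaultdict

def train_alignment(pairs, iterations=10):
    """pairs: list of (regions_a, regions_b) sequences. Returns t[b][a], the
    estimated probability that region b in view B corresponds to region a
    in view A."""
    vocab_a = {a for seq_a, _ in pairs for a in seq_a}
    t = defaultdict(lambda: defaultdict(lambda: 1.0 / len(vocab_a)))
    for _ in range(iterations):
        count = defaultdict(lambda: defaultdict(float))
        total = defaultdict(float)
        for seq_a, seq_b in pairs:           # E-step: expected co-occurrences
            for b in seq_b:
                norm = sum(t[b][a] for a in seq_a)
                for a in seq_a:
                    frac = t[b][a] / norm
                    count[b][a] += frac
                    total[a] += frac
        for b in count:                       # M-step: renormalize
            for a in count[b]:
                t[b][a] = count[b][a] / total[a]
    return t

if __name__ == "__main__":
    trials = [(["A1", "A2"], ["B1", "B2"]),
              (["A1", "A3"], ["B1", "B3"]),
              (["A2", "A3"], ["B2", "B3"])]
    t = train_alignment(trials)
    print({b: max(t[b], key=t[b].get) for b in t})  # best-matching region per B
```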
Peer reviewed
Coco, Moreno I.; Keller, Frank – Cognitive Science, 2012
Most everyday tasks involve multiple modalities, which raises the question of how the processing of these modalities is coordinated by the cognitive system. In this paper, we focus on the coordination of visual attention and linguistic processing during speaking. Previous research has shown that objects in a visual scene are fixated before they…
Descriptors: Sensory Integration, Visual Perception, Attention, Language Processing
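The abstract above refers to the well-established finding that objects tend to be fixated before they are mentioned. As a hedged illustration of how that coordination is often quantified (not necessarily the measure used in this paper), the sketch below computes an "eye-voice span" per object: the latency between the first fixation on an object and the onset of the word naming it. The data format and values are invented.

```python
# Minimal sketch: eye-voice span, the latency between the first fixation on an
# object and the onset of the word naming it. A generic illustration, not
# necessarily the measure used by Coco & Keller (2012).

def eye_voice_spans(fixations, name_onsets):
    """fixations: list of (onset_ms, object_label);
    name_onsets: dict object_label -> onset_ms of the naming word.
    Returns dict object_label -> span in ms (positive = eye leads voice)."""
    spans = {}
    for obj, word_onset in name_onsets.items():
        fix_onsets = [t for t, o in fixations if o == obj]
        if fix_onsets:
            spans[obj] = word_onset - min(fix_onsets)
    return spans

if __name__ == "__main__":
    fixations = [(120, "clipboard"), (480, "man"), (900, "clipboard")]
    name_onsets = {"man": 650, "clipboard": 1400}
    print(eye_voice_spans(fixations, name_onsets))
    # -> {'man': 170, 'clipboard': 1280}: both objects fixated before naming
```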
Peer reviewed
Nyamsuren, Enkhbold; Taatgen, Niels A. – Cognitive Science, 2013
Complex problem solving is often an integration of perceptual processing and deliberate planning. But what balances these two processes, and how do novices differ from experts? We investigate this interplay in the game of SET, examining how people combine bottom-up visual processes and top-down planning to succeed…
Descriptors: Visual Perception, Cognitive Processes, Eye Movements, Regression (Statistics)
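For readers unfamiliar with the game of SET mentioned above: three cards form a valid set when each of the four attributes (color, shape, number, shading) is either all the same or all different across the cards. The sketch below states that rule and brute-forces the search a player must perform; it is background for the task the article studies, not code from the article.

```python
# Standard rule of the card game SET (background for the entry above, not
# code from it): three cards form a set iff every attribute is either all
# the same or all different across the three cards.
from itertools import combinations

def is_set(card_a, card_b, card_c):
    """Each card is a tuple of four attribute values, e.g.
    ('red', 'oval', 2, 'striped')."""
    return all(len({a, b, c}) in (1, 3)
               for a, b, c in zip(card_a, card_b, card_c))

def find_sets(cards):
    """Brute-force search over all triples on the table -- the task players
    solve by combining visual scanning with planning."""
    return [trio for trio in combinations(cards, 3) if is_set(*trio)]

if __name__ == "__main__":
    table = [("red", "oval", 1, "solid"),
             ("green", "oval", 2, "striped"),
             ("purple", "oval", 3, "open"),
             ("red", "diamond", 1, "solid")]
    print(find_sets(table))  # only the first three cards form a valid set
```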
Peer reviewed
Xu, Yun; Higgins, Emily C.; Xiao, Mei; Pomplun, Marc – Cognitive Science, 2007
Color coding is used to guide attention in computer displays for such critical tasks as baggage screening or air traffic control. It has been shown that a display object attracts more attention if its color is more similar to the color for which one is searching. However, what does "similar" precisely mean? Can we predict the amount of attention…
Descriptors: Mathematical Models, Eye Movements, Computer Interfaces, Color
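The abstract above asks what "similar" means precisely when color guides attention. As a hedged sketch only (the article's actual model is not reproduced here), the code below implements one obvious candidate: similarity as a decreasing function of Euclidean distance between colors in a chosen color space, used to rank display items by how strongly they should attract search-guided attention. The color space, distance function, and decay constant are all assumptions.

```python
# Hedged sketch: rank display items by color similarity to the search target,
# using Euclidean distance in RGB with an exponential fall-off. The actual
# model in Xu et al. (2007) may differ; space, metric, and decay are assumed.
import math

def color_similarity(rgb_a, rgb_b, decay=100.0):
    """Map color distance (0-441 in 8-bit RGB) to a similarity in (0, 1]."""
    dist = math.dist(rgb_a, rgb_b)
    return math.exp(-dist / decay)

def rank_by_predicted_attention(target_rgb, items):
    """items: dict label -> rgb. Returns labels sorted from most to least
    similar to the target, i.e. predicted to attract the most attention."""
    return sorted(items,
                  key=lambda k: color_similarity(target_rgb, items[k]),
                  reverse=True)

if __name__ == "__main__":
    target = (220, 40, 40)                      # reddish search target
    display = {"red_bag": (210, 50, 45),
               "orange_bag": (230, 140, 30),
               "blue_bag": (40, 60, 200)}
    print(rank_by_predicted_attention(target, display))
    # -> ['red_bag', 'orange_bag', 'blue_bag']
```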
Peer reviewed
Lacroix, Joyca P. W.; Murre, Jaap M. J.; Postma, Eric O.; van den Herik, H. Jaap – Cognitive Science, 2006
The natural input memory (NIM) model is a new model for recognition memory that operates on natural visual input. A biologically informed perceptual preprocessing method takes local samples (eye fixations) from a natural image and translates these into a feature-vector representation. During recognition, the model compares incoming preprocessed…
Descriptors: Recognition (Psychology), Models, Visual Perception, Eye Movements
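The abstract above outlines the model's pipeline: sample image patches at fixation locations, convert each to a feature vector, store the vectors during study, and compare incoming vectors against stored ones at test. The sketch below mirrors that pipeline schematically; the feature extractor (a trivial normalize-and-flatten step), the similarity measure, and the decision threshold are placeholders, not the components actually used in the NIM model.

```python
# Schematic sketch of a fixation-based recognition-memory pipeline in the
# spirit of the NIM model's description above: patches sampled at fixations
# -> feature vectors -> stored during study -> compared at test. The feature
# extractor, similarity measure, and threshold are placeholders.
import numpy as np

def patch_features(image, fixation, size=16):
    """Cut a size x size patch around a fixation and flatten it into a
    unit-normalized feature vector. A real front end would be richer."""
    y, x = fixation
    patch = image[y:y + size, x:x + size].astype(float).ravel()
    norm = np.linalg.norm(patch)
    return patch / norm if norm > 0 else patch

def study(images_with_fixations):
    """Store one feature vector per fixation across all studied images."""
    return [patch_features(img, fix)
            for img, fixations in images_with_fixations
            for fix in fixations]

def recognize(memory, image, fixations, threshold=0.95):
    """Call 'old' if any probe vector is similar enough to a stored vector
    (cosine similarity, since vectors are unit-normalized)."""
    probes = [patch_features(image, fix) for fix in fixations]
    best = max(float(np.dot(p, m)) for p in probes for m in memory)
    return ("old" if best >= threshold else "new", best)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    studied = rng.integers(0, 256, size=(64, 64))
    novel = rng.integers(0, 256, size=(64, 64))
    memory = study([(studied, [(10, 10), (30, 40)])])
    print(recognize(memory, studied, [(10, 10)]))  # high similarity -> 'old'
    print(recognize(memory, novel, [(10, 10)]))    # low similarity  -> 'new'
```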
Peer reviewed
Richardson, Daniel C.; Dale, Rick – Cognitive Science, 2005
We investigated the coupling between a speaker's and a listener's eye movements. Some participants talked extemporaneously about a television show whose cast members they were viewing on a screen in front of them. Later, other participants listened to these monologues while viewing the same screen. Eye movements were recorded for all speakers and…
Descriptors: Eye Movements, Listening Comprehension Tests, Listening Comprehension, Cues
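The abstract above describes measuring the coupling between a speaker's and a listener's eye movements over the same display. One common way such coupling is quantified (offered here as a hedged illustration, not necessarily the authors' exact analysis) is cross-recurrence: the proportion of time the two gaze streams land on the same panel when one stream is shifted relative to the other across a range of lags. The sketch below computes that profile for categorical gaze streams.

```python
# Hedged sketch: cross-recurrence profile between two categorical gaze streams
# (which panel each person looks at, sampled at a fixed rate). One standard
# way to quantify speaker-listener gaze coupling, not necessarily the analysis
# used by Richardson & Dale (2005).

def cross_recurrence(speaker, listener, max_lag=10):
    """Return {lag: proportion of overlapping samples on the same panel},
    with the listener shifted by `lag` samples (positive lag = the listener
    looks where the speaker looked `lag` samples earlier)."""
    profile = {}
    for lag in range(-max_lag, max_lag + 1):
        matches = total = 0
        for i, s in enumerate(speaker):
            j = i + lag
            if 0 <= j < len(listener):
                total += 1
                matches += (s == listener[j])
        profile[lag] = matches / total if total else 0.0
    return profile

if __name__ == "__main__":
    # Toy data: the listener's gaze follows the speaker's with a 2-sample delay.
    speaker  = [1, 1, 2, 2, 3, 3, 1, 1, 2, 2, 3, 3]
    listener = [3, 3, 1, 1, 2, 2, 3, 3, 1, 1, 2, 2]
    profile = cross_recurrence(speaker, listener, max_lag=4)
    print(max(profile, key=profile.get))  # peak at lag 2: listener lags speaker
```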
Peer reviewed
Johansson, Roger; Holsanova, Jana; Holmqvist, Kenneth – Cognitive Science, 2006
This study provides evidence that eye movements reflect the positions of objects while participants listen to a spoken description, retell a previously heard spoken description, and describe a previously seen picture. This effect is equally strong in retelling from memory, irrespective of whether the original elicitation was spoken or visual. In…
Descriptors: Eye Movements, Pictorial Stimuli, Comparative Analysis, Visual Perception