Guidance during Visual Search in Real-World Scenes: Scene Context vs. Object Content



A number of past studies have shown that, when searching for an object in a scene, eye movements are guided towards the target by scene context, which selects the most relevant areas. Other studies have shown that fixations are directed to high spatial frequency information corresponding to objects in the scene (van Diepen …). In the present study, participants searched for target objects under four conditions: Full Scene: the intact search scene; Empty Scene: the search scene with all objects removed; Fractioned Scene: the search scene with only a small number of objects; and No Scene: a black-screen control. Thus, the Empty Scene provided scene context information alone, while the Fractioned Scene provided additional information about object content that did not overlap with the target. Although search was best in the Full Scene condition and worst in the No Scene condition, across a number of eye movement measures there was no difference between the Fractioned and Empty Scene searches. This pattern was seen in the latency to the first target fixation, in the number of fixations before the first target fixation, and in reaction time. These results support previous studies indicating that scene context information may be more useful than object-based features in guiding eye movements during search.
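To make the reported measures concrete, the sketch below (Python; a minimal illustration, with data structures and names invented for the example rather than taken from the study) shows how the latency to the first target fixation and the number of fixations preceding it could be computed from a fixation sequence and a target bounding box:

from typing import NamedTuple, Optional, Sequence

class Fixation(NamedTuple):
    x: float         # horizontal gaze position (pixels)
    y: float         # vertical gaze position (pixels)
    start: float     # fixation onset, relative to search display onset (ms)
    duration: float  # fixation duration (ms)

class Box(NamedTuple):
    left: float
    top: float
    right: float
    bottom: float

def first_target_fixation(fixations: Sequence[Fixation], target: Box) -> Optional[int]:
    # Index of the first fixation landing inside the target region,
    # or None if the target was never fixated on this trial.
    for i, fix in enumerate(fixations):
        if target.left <= fix.x <= target.right and target.top <= fix.y <= target.bottom:
            return i
    return None

def search_measures(fixations: Sequence[Fixation], target: Box) -> dict:
    # Latency to first target fixation (ms from search onset) and the number
    # of fixations made before the target was first fixated.
    i = first_target_fixation(fixations, target)
    if i is None:
        return {"latency_to_target_ms": None, "fixations_before_target": None}
    return {"latency_to_target_ms": fixations[i].start, "fixations_before_target": i}

Reaction time, the third measure, is simply the interval between search-display onset and the participant's response, and is not derived from the fixation record.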

Accession: 036214330

Related references

On-Line Contributions of Peripheral Information to Visual Search in Scenes: Further Explorations of Object Content and Scene Context. 2012

Peripheral guidance in scenes: The interaction of scene context and object content. Journal of Experimental Psychology. Human Perception and Performance 40(5): 2056-2072, 2014

On the visual span during object search in real-world scenes. Visual Cognition 21(7): 803-837, 2013

Contextual guidance of eye movements and attention in real-world scenes: the role of global features in object search. Psychological Review 113(4): 766-786, 2006

Neural representations of contextual guidance in visual search of real-world scenes. Journal of Neuroscience 33(18): 7846-7855, 2013

How do the regions of the visual field contribute to object search in real-world scenes? Evidence from eye movements. Journal of Experimental Psychology. Human Perception and Performance 40(1): 342-360, 2014

The relative contribution of scene context and target features to visual search in scenes. Attention Perception and Psychophysics 72(5): 1283-1297, 2010

Two forms of scene memory guide visual search: Memory for scene context and memory for the binding of target object to scene location. Visual Cognition 17(1-2): 273-291, 2009

The roles of scene gist and spatial dependency among objects in the semantic guidance of attention in real-world scenes. Vision Research 105: 10-20, 2014

Capture by object exemplars during category-based search of real-world scenes. 2013

Anticipation in Real-World Scenes: The Role of Visual Context and Visual Memory. Cognitive Science 40(8): 1995-2024, 2016

How do targets, nontargets, and scene context influence real-world object detection? Attention Perception and Psychophysics 79(7): 2021-2036, 2017

Guidance of visual attention by semantic information in real-world scenes. Frontiers in Psychology 5: 54, 2014

Emotional real-world scenes impact visual search. Cognitive Processing 20(3): 309-316, 2019

Eye guidance during real-world scene search: The role color plays in central and peripheral vision. Journal of Vision 16(2): 3, 2016