Perceptual dissociations among views of objects, scenes, and reachable spaces.
Journal:
Journal of Experimental Psychology: Human Perception and Performance
Published Date:
Jun 1, 2019
Abstract
In everyday experience, we interact with objects and we navigate through space. Extensive research has revealed that these visual behaviors are mediated by separable object-based and scene-based processing mechanisms in the mind and brain. However, we also frequently view near-scale spaces, for example, when sitting at the breakfast table or preparing a meal. How should such spaces (operationalized here as "reachspaces"), which contain multiple objects but not enough space to navigate through, be considered in this dichotomy? Here, we used visual search to explore the possibility that reachspace views are perceptually distinctive from full-scale scene views as well as object views. In the first experiment, we found evidence for this dissociation. In the second experiment, we found that the perceptual differences between reachspaces and scenes were substantially larger than those between scene categories (e.g., kitchens vs. offices). Finally, we provide computational support for this perceptual dissociation: Deep neural network models also naturally separate reachspaces from both scenes and objects, suggesting that mid- to high-level features may underlie this dissociation. Taken together, these results demonstrate that our perceptual systems are sensitive to systematic visual feature differences that distinguish objects, reachspaces, and full-scale scene views. Broadly, these results raise the possibility that our visual system may use different perceptual primitives to support the perception of reachable and navigable views of the world.
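To make the deep-network result concrete, the sketch below shows one generic way such a separation can be probed: extract penultimate-layer features from a pretrained CNN for images of objects, reachspaces, and scenes, then test whether a simple linear classifier can tell the three view types apart. This is a minimal illustration only; the specific model (an ImageNet-trained ResNet-50), the folder names, and the linear-SVM readout are assumptions of this sketch, not the paper's actual analysis pipeline.

```python
# Illustrative sketch: probe whether CNN features separate objects,
# reachspaces, and scenes. Model choice, image folders, and the
# classifier readout are hypothetical, not the paper's pipeline.
from pathlib import Path

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Pretrained CNN used as a generic feature extractor (penultimate layer).
cnn = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
cnn.fc = torch.nn.Identity()  # drop the classification head
cnn.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def features(image_path):
    """Return the CNN's penultimate-layer activation for one image."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return cnn(img).squeeze(0).numpy()

# Hypothetical folders of object, reachspace, and scene photographs.
X, y = [], []
for label, folder in enumerate(["objects", "reachspaces", "scenes"]):
    for path in Path(folder).glob("*.jpg"):
        X.append(features(path))
        y.append(label)

# If the three view types occupy separable regions of feature space,
# a linear classifier should score well above the 33% chance level.
scores = cross_val_score(LinearSVC(max_iter=10000), X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

High cross-validated accuracy under this kind of readout would indicate that the feature space, without any training on the three-way distinction, already groups reachspace views apart from both object and full-scale scene views.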