
Within- and Cross-Modal Distance Information Disambiguate Visual Size-Change Perception

Peter W Battaglia, Massimiliano Di Luca, Marc O Ernst, Paul R Schrater, Tonja Machulla and Daniel Kersten

PLOS Computational Biology, 2010, vol. 6, issue 3, 1-10

Abstract: Perception is fundamentally underconstrained because different combinations of object properties can generate the same sensory information. To disambiguate sensory information into estimates of scene properties, our brains incorporate prior knowledge and additional "auxiliary" (i.e., not directly relevant to the desired scene property) sensory information to constrain perceptual interpretations. For example, knowing the distance to an object helps in perceiving its size. The literature contains few demonstrations of the use of prior knowledge and auxiliary information in combined visual and haptic disambiguation, and almost no examination of haptic disambiguation of vision beyond "bistable" stimuli. Previous studies have reported that humans integrate multiple unambiguous sensations to perceive single, continuous object properties, like size or position. Here we test whether humans use visual and haptic information, individually and jointly, to disambiguate size from distance. We presented participants with a ball moving in depth with a changing diameter. Because no unambiguous distance information is available under monocular viewing, participants rely on prior assumptions about the ball's distance to disambiguate their size percept. Presenting auxiliary binocular and/or haptic distance information augments participants' prior distance assumptions and improves their size judgment accuracy, though binocular cues were trusted more than haptic cues. Our results suggest that both visual and haptic distance information disambiguate size perception, and we interpret these results in the context of probabilistic perceptual reasoning.

Author Summary: To perceive your surroundings, your brain must distinguish between different possible scenes, each of which is more or less likely. To disambiguate interpretations that are equally likely given the sensory input, the brain aggregates multiple sensations to form an interpretation of the world consistent with each. For instance, when you judge the size of an object you are viewing, its distance determines the image size it projects onto your eyes. To estimate its true size, your brain must use extra information to disambiguate whether it is a small, near object or a large, far object. If you touch the object, your brain could use the felt distance to scale the apparent size of the object. Cognitive scientists do not fully understand the computations that make perceptual disambiguation possible. Here we investigate how people disambiguate an object's size from its distance by measuring participants' size judgments when we provide different types of distance sensations. We find that distance sensations provided by viewing objects with both eyes open, and by touching the object, are both effective for disambiguating its size. We provide a general probabilistic framework to explain these results, which offers a unifying account of sensory fusion in the presence of ambiguity.
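
The author summary sketches the underlying computation: a retinal image of a given angular size is consistent with a small, near object or a large, far object, so a distance estimate (a prior assumption, sharpened by any auxiliary binocular or haptic cues) is needed to recover physical size. The following Python snippet is a minimal grid-based Bayesian illustration of that idea, not the paper's actual model; every numerical value (grid ranges, cue means, noise levels) is a hypothetical choice for illustration.

import numpy as np

# Grid over candidate sizes (cm) and distances (cm); all values hypothetical.
sizes = np.linspace(1.0, 10.0, 200)
dists = np.linspace(30.0, 120.0, 200)
S, D = np.meshgrid(sizes, dists, indexing="ij")

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Observed retinal angle (radians), using the small-angle relation theta = S / D.
theta_obs = 5.0 / 60.0          # e.g., a 5 cm ball at 60 cm
sigma_theta = 0.005             # assumed visual-angle noise

# Likelihood of the retinal image under each (size, distance) hypothesis.
lik_image = normal_pdf(theta_obs, S / D, sigma_theta)

# Prior assumption about distance (all that is available under monocular viewing).
prior_dist = normal_pdf(D, 80.0, 30.0)

# Auxiliary distance cues modeled as noisy readings: the binocular cue is
# given a smaller noise level (more trusted) than the haptic cue.
lik_binoc = normal_pdf(D, 58.0, 5.0)
lik_haptic = normal_pdf(D, 62.0, 12.0)

# Posterior over (size, distance); auxiliary cues sharpen the distance
# estimate, which in turn sharpens the size estimate.
posterior = lik_image * prior_dist * lik_binoc * lik_haptic
posterior /= posterior.sum()

# Marginal posterior-mean size estimate.
size_estimate = (posterior.sum(axis=1) * sizes).sum()
print(f"Estimated size: {size_estimate:.2f} cm")

Dropping the auxiliary cue terms from the posterior product leaves only the broad distance prior, which in this sketch yields a noticeably less accurate size estimate, mirroring the monocular condition described in the abstract.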

Date: 2010
References: View complete reference list from CitEc

Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1000697 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 00697&type=printable (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1000697

DOI: 10.1371/journal.pcbi.1000697


More articles in PLOS Computational Biology from Public Library of Science
Bibliographic data for series maintained by ploscompbiol.

Handle: RePEc:plo:pcbi00:1000697