From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach
Goker Erdogan,
Ilker Yildirim and
Robert A. Jacobs
PLOS Computational Biology, 2015, vol. 11, issue 11, 1-32
Abstract:
People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models—that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model’s percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects’ ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.

Author Summary:
When viewing an object, people perceive the object’s shape. Similarly, when grasping the same object, they also perceive its shape. In general, the perceived shape is identical in these two scenarios, illustrating modality invariance, an important type of perceptual constancy. Modality invariance suggests that people infer a modality-independent, conceptual representation that is the same regardless of the modality used to sense the environment. If so, how do people infer modality-independent representations from modality-specific sensory signals? We present a hypothesis about the components that any system will include if it infers modality-independent representations from sensory signals. This hypothesis is instantiated in a computational model that infers object shape representations from visual or haptic (i.e., active touch) signals. The model shows perfect modality invariance—it infers the same shape representations regardless of the sensory modality used to sense objects. The model also provides a highly accurate account of data collected in an experiment in which people judged the similarity of pairs of objects that were viewed, grasped, or both. Conceptually, our research contributes to our understanding of modality invariance. Methodologically, it contributes to cognitive modeling by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.
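The three-component hypothesis in the abstract can be illustrated with a minimal sketch. This is a toy example under strong assumptions: the "grammar" here is just a prior over lists of part sizes, and the forward models are simple feature maps; the paper's actual model uses a richer probabilistic shape grammar, a graphics toolkit, and a hand simulator. The sketch shows only the architecture: a shared modality-independent representation, modality-specific forward models, and Bayesian inference (Metropolis-Hastings with an independence proposal from the prior, so the prior terms cancel in the acceptance ratio).

```python
import math
import random

# Component 1: a toy representational "language" — an object is a list of
# 1-4 part sizes drawn from a small discrete set. Sampling from this prior
# plays the role of generating expressions from a probabilistic grammar.
SIZES = [1.0, 2.0, 3.0]

def sample_object(rng):
    n = rng.choice([1, 2, 3, 4])
    return [rng.choice(SIZES) for _ in range(n)]

# Component 2: sensory-specific forward models mapping the same
# modality-independent representation to modality-specific features.
def visual_forward(obj):
    # "vision" registers total extent and part count
    return (sum(obj), float(len(obj)))

def haptic_forward(obj):
    # "touch" registers the largest part and total extent
    return (max(obj), sum(obj))

def log_likelihood(features, predicted, sigma=0.1):
    # Gaussian noise model on each feature dimension
    return sum(-((f - p) ** 2) / (2.0 * sigma ** 2)
               for f, p in zip(features, predicted))

# Component 3: inference inverting a forward model via Metropolis-Hastings.
def infer(features, forward, steps=5000, seed=0):
    rng = random.Random(seed)
    current = sample_object(rng)
    cur_ll = log_likelihood(features, forward(current))
    best, best_ll = current, cur_ll
    for _ in range(steps):
        proposal = sample_object(rng)  # independence proposal = prior
        prop_ll = log_likelihood(features, forward(proposal))
        if math.log(rng.random()) < prop_ll - cur_ll:
            current, cur_ll = proposal, prop_ll
        if cur_ll > best_ll:
            best, best_ll = current, cur_ll
    return best

# Modality invariance in miniature: the same latent object is inferred
# whether it is "viewed" or "grasped".
true_obj = [2.0, 3.0]
from_vision = infer(visual_forward(true_obj), visual_forward)
from_touch = infer(haptic_forward(true_obj), haptic_forward)
```

Because both inference runs target the same latent space, the inferred representations agree on the properties each modality constrains (e.g., both recover a total extent of 5.0), which is the toy analogue of the model's modality-invariant percepts.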
Date: 2015
Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1004610 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 04610&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1004610
DOI: 10.1371/journal.pcbi.1004610