From Time to Depth: A Unified Framework for 3D Vision and Action
Pin Yang
No. mf9bx_v1, OSF Preprints from Center for Open Science
Abstract:
Understanding how organisms perceive and interact with three-dimensional (3D) structure on the basis of two-dimensional (2D) optic images is a fundamental question in vision science. Traditional approaches have often relied on a perception–cognition–action sequence, in which spatial representations are first constructed from indirect depth cues and subsequently used to plan actions. This review proposes an alternative framework inspired by ecological theory: unveiling the third dimension through the explicit integration of a fourth dimension, time. Time-dimensioned optical variables provide direct, continuously updated information that supports both the recovery of 3D structure and the prospective control of action. Rather than relying on internal spatial maps and feedforward predictions, organisms can achieve real-time, dynamic coupling with their environment by perceiving time-dimensioned optic variables (so-called tau models). Drawing on evidence from perceptual psychology, neuroscience, and sensorimotor control, the review highlights how tau and tau-related variables provide robust information for interacting with objects in depth. This perspective not only aligns with the continuous, dynamic nature of natural behavior but also offers a unified framework of perception and action that emphasizes real-time coupling with the environment.
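For readers unfamiliar with tau: the tau of an optical angle theta is standardly defined as the angle divided by its rate of change, tau = theta / (d theta / dt), which under constant closing velocity approximates time-to-contact without requiring distance or speed to be recovered separately. A minimal sketch of this computation follows (Python; the function name, sampling scheme, and numerical values are illustrative assumptions, not taken from the paper):

    import numpy as np

    def tau(theta, dt):
        """Estimate tau = theta / (d theta / dt) from sampled optical angles.

        theta : 1-D array of the visual angle (radians) subtended by an
                approaching object, sampled at interval dt (seconds).
        Under constant closing velocity, tau approximates time-to-contact.
        """
        theta = np.asarray(theta, dtype=float)
        theta_dot = np.gradient(theta, dt)   # finite-difference rate of optical expansion
        return theta / theta_dot

    # Illustrative example: an object of radius r approaching an observer
    # at constant speed v from initial distance z0 (hypothetical values).
    r, v, z0 = 0.10, 2.0, 4.0                # metres, m/s, metres
    t = np.arange(0.0, 1.5, 0.01)
    z = z0 - v * t                           # remaining distance at each sample
    theta = 2.0 * np.arctan(r / z)           # subtended visual angle
    print(tau(theta, 0.01)[:5])              # early values approximate true time-to-contact z0/v = 2.0 s

The point of the sketch is that tau is computable directly from the changing optic array, consistent with the review's claim that time-dimensioned optical variables bypass explicit depth reconstruction.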
Date: 2025-07-16
Downloads: https://osf.io/download/6873dbf8d36f854ce99169ba/
Persistent link: https://EconPapers.repec.org/RePEc:osf:osfxxx:mf9bx_v1
DOI: 10.31219/osf.io/mf9bx_v1