Auditory and tactile frequency mapping for visual distance perception: A step forward in sensory substitution and augmentation
Pingping Jiang, Jonathan Rossiter and Christopher Kent
PLOS ONE, 2025, vol. 20, issue 3, 1–21
Abstract:
Vision is crucial for daily tasks and for interacting with the environment, but visual impairment can hinder these activities. Many sensory substitution products and studies prioritize providing abundant and accurate information, yet often overlook the inherent relationship between different modalities, potentially preventing users from receiving information intuitively. This study investigated the representation of visual distance using auditory and vibrotactile frequency through a series of psychological cross-modal matching experiments. By establishing mapping functions between auditory/vibrotactile frequency and visual distance, we aim to facilitate the design of sensory substitution devices that take visual distance information (ranging from 1 m to 12 m) and convert it into non-visual information (auditory frequency within the range 47–2764 Hz or vibrotactile frequency within the range 10–99 Hz). Results show distinct correlation patterns between frequency and visual distance in both the auditory (auditory frequency-to-visual distance) and vibrotactile (vibrotactile frequency-to-visual distance) domains. The prevailing trend, shown by 59% of participants, was a monotonic negative correlation (i.e., higher frequencies are associated with shorter distances), while 24% of participants demonstrated a consistently positive correlation. Additionally, we compare this study with our previous investigations into the reverse cross-modal mappings of visual distance-to-auditory frequency and visual distance-to-vibrotactile frequency. Common patterns across the two studies (both negative and positive correlations) suggest a bidirectional mapping between visual distance and frequency in the auditory and vibrotactile domains, and point to new sensory substitution devices for people with visual impairment that integrate these underlying cross-modal mechanisms to support more intuitive and natural human-machine interaction.
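Illustrative sketch (not from the article): the abstract reports the distance and frequency ranges and a prevailing negative monotonic trend, but not the fitted mapping functions. The Python sketch below therefore assumes a log-linear interpolation as one plausible negative monotonic mapping; the function name distance_to_frequency and the log-linear form are illustrative choices, and only the ranges (1–12 m, 47–2764 Hz, 10–99 Hz) and the direction of the trend come from the abstract.

# A minimal sketch, assuming a log-linear negative monotonic mapping;
# the paper's actual fitted functions are not given in the abstract.

def distance_to_frequency(distance_m: float,
                          f_min_hz: float,
                          f_max_hz: float,
                          d_min_m: float = 1.0,
                          d_max_m: float = 12.0) -> float:
    """Map a visual distance to a cue frequency (negative monotonic).

    Nearer objects get higher frequencies. The log-linear form is an
    assumption for illustration, not the mapping fitted in the paper.
    """
    if not d_min_m <= distance_m <= d_max_m:
        raise ValueError(f"distance must lie in [{d_min_m}, {d_max_m}] m")
    # Normalised position along the distance range: 0 (near) .. 1 (far).
    t = (distance_m - d_min_m) / (d_max_m - d_min_m)
    # Interpolate on a log-frequency scale so equal distance steps give
    # equal frequency ratios; the exponent inverts the trend so that
    # frequency falls as distance grows.
    return f_max_hz * (f_min_hz / f_max_hz) ** t

# Frequency ranges taken from the abstract.
auditory = distance_to_frequency(3.0, f_min_hz=47.0, f_max_hz=2764.0)
vibrotactile = distance_to_frequency(3.0, f_min_hz=10.0, f_max_hz=99.0)
print(f"3 m -> {auditory:.0f} Hz (auditory), {vibrotactile:.0f} Hz (tactile)")

With these assumed parameters, a 3 m target maps to roughly 1318 Hz in the auditory range and 65 Hz in the vibrotactile range; a device implementing the paper's fitted functions would substitute those in place of the log-linear form.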
Date: 2025
Downloads:
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0318354 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 18354&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0318354
DOI: 10.1371/journal.pone.0318354