Implicit estimation of sound-arrival time
Yoichi Sugita and
Yôiti Suzuki
Additional contact information
Yoichi Sugita: National Institute of Advanced Industrial Science and Technology, Neuroscience Research Institute
Yôiti Suzuki: Research Institute of Electrical Communication and Graduate School of Information Sciences, Tohoku University
Nature, 2003, vol. 421, issue 6926, p. 911
Abstract:
In perceiving the sound produced by the movement of a visible object, the brain coordinates the auditory and visual input [1,2,3] so that no delay is noticed even though the sound arrives later (for distant source objects, such as aircraft or firework displays, this is less effective). Here we show that coordination occurs because the brain uses information about distance that is supplied by the visual system to calibrate simultaneity. Our findings indicate that auditory and visual inputs are coordinated not because the brain has a wide temporal window for auditory integration, as was previously thought, but because the brain actively changes the temporal location of the window depending on the distance of the visible sound source.
Date: 2003
Downloads: https://www.nature.com/articles/421911a (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:nat:nature:v:421:y:2003:i:6926:d:10.1038_421911a
Ordering information: This journal article can be ordered from
https://www.nature.com/
DOI: 10.1038/421911a
Nature is currently edited by Magdalena Skipper
Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.