EconPapers

Multiscale temporal integration organizes hierarchical computation in human auditory cortex

Sam V. Norman-Haignere, Laura K. Long, Orrin Devinsky, Werner Doyle, Ifeoma Irobunda, Edward M. Merricks, Neil A. Feldstein, Guy M. McKhann, Catherine A. Schevon, Adeen Flinker and Nima Mesgarani
Additional contact information
Sam V. Norman-Haignere: Columbia University
Laura K. Long: Columbia University
Orrin Devinsky: NYU Langone Medical Center
Werner Doyle: NYU Langone Medical Center
Ifeoma Irobunda: Columbia University Irving Medical Center
Edward M. Merricks: Columbia University Irving Medical Center
Neil A. Feldstein: Columbia University Irving Medical Center
Guy M. McKhann: Columbia University Irving Medical Center
Catherine A. Schevon: Columbia University Irving Medical Center
Adeen Flinker: NYU Langone Medical Center
Nima Mesgarani: Columbia University

Nature Human Behaviour, 2022, vol. 6, issue 3, 455-469

Abstract: To derive meaning from sound, the brain must integrate information across many timescales. What computations underlie multiscale integration in human auditory cortex? Evidence suggests that auditory cortex analyses sound using both generic acoustic representations (for example, spectrotemporal modulation tuning) and category-specific computations, but the timescales over which these putatively distinct computations integrate remain unclear. To answer this question, we developed a general method to estimate sensory integration windows—the time window within which stimuli can alter the neural response—and applied our method to intracranial recordings from neurosurgical patients. We show that human auditory cortex integrates hierarchically across diverse timescales spanning from ~50 to 400 ms. Moreover, we find that neural populations with short and long integration windows exhibit distinct functional properties: short-integration electrodes (less than ~200 ms) show prominent spectrotemporal modulation selectivity, while long-integration electrodes (greater than ~200 ms) show prominent category selectivity. These findings reveal how multiscale integration organizes auditory computation in the human brain.
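The abstract defines an integration window as the time window within which stimuli can alter the neural response, but gives no implementation details. The sketch below is a toy illustration of that definition, not the authors' actual analysis pipeline: a simulated neuron that averages its last `win` stimulus samples (a boxcar window, an assumption made here for simplicity) responds identically to a shared stimulus segment embedded in two different random contexts only once the segment is at least as long as its integration window, so cross-context response correlation as a function of segment duration reads out the window size. All function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def boxcar_response(stim, win):
    """Toy neuron: the response at time t averages the last `win` samples."""
    kernel = np.ones(win) / win
    return np.convolve(stim, kernel, mode="full")[: len(stim)]

def cross_context_corr(win, seg_len, n_trials=200, pad=512):
    """Correlate responses to a shared segment placed in two random contexts.

    For each trial, the same segment is embedded in two independent noise
    contexts; the response is sampled at the end of the shared segment.
    If the integration window fits inside the segment, the two responses
    depend only on shared samples and correlate perfectly across trials.
    """
    a_vals, b_vals = [], []
    for _ in range(n_trials):
        seg = rng.standard_normal(seg_len)
        ctx_a = np.concatenate(
            [rng.standard_normal(pad), seg, rng.standard_normal(pad)])
        ctx_b = np.concatenate(
            [rng.standard_normal(pad), seg, rng.standard_normal(pad)])
        t = pad + seg_len - 1  # last sample of the shared segment
        a_vals.append(boxcar_response(ctx_a, win)[t])
        b_vals.append(boxcar_response(ctx_b, win)[t])
    return np.corrcoef(a_vals, b_vals)[0, 1]

# Cross-context correlation saturates once the segment spans the true window.
true_win = 64
for seg_len in (16, 32, 64, 128, 256):
    r = cross_context_corr(true_win, seg_len)
    print(f"segment={seg_len:4d} samples  cross-context r={r:.2f}")
```

In this toy model the correlation rises roughly in proportion to the fraction of the window covered by the shared segment and reaches 1 once the segment is as long as the window, so the shortest segment duration yielding context-invariant responses estimates the window.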

Date: 2022
Citations: View citations in EconPapers (1)

Downloads: (external link)
https://www.nature.com/articles/s41562-021-01261-y Abstract (text/html)
Access to the full text of the articles in this series is restricted.

Persistent link: https://EconPapers.repec.org/RePEc:nat:nathum:v:6:y:2022:i:3:d:10.1038_s41562-021-01261-y

Ordering information: This journal article can be ordered from
https://www.nature.com/nathumbehav/

DOI: 10.1038/s41562-021-01261-y

Nature Human Behaviour is currently edited by Stavroula Kousta

More articles in Nature Human Behaviour from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Page updated 2025-03-19
Handle: RePEc:nat:nathum:v:6:y:2022:i:3:d:10.1038_s41562-021-01261-y