Optimal features for auditory categorization
Shi Tong Liu,
Pilar Montes-Lourido,
Xiaoqin Wang and
Srivatsun Sadagopan
Additional contact information
Shi Tong Liu: University of Pittsburgh
Pilar Montes-Lourido: University of Pittsburgh
Xiaoqin Wang: Johns Hopkins University
Srivatsun Sadagopan: University of Pittsburgh
Nature Communications, 2019, vol. 10, issue 1, 1-14
Abstract:
Humans and vocal animals use vocalizations to communicate with members of their species. A necessary function of auditory perception is to generalize across the high variability inherent in vocalization production and classify them into behaviorally distinct categories (‘words’ or ‘call types’). Here, we demonstrate that detecting mid-level features in calls achieves production-invariant classification. Starting from randomly chosen marmoset call features, we use a greedy search algorithm to determine the most informative and least redundant features necessary for call classification. High classification performance is achieved using only 10–20 features per call type. Predictions of tuning properties of putative feature-selective neurons accurately match some observed auditory cortical responses. This feature-based approach also succeeds for call categorization in other species, and for other complex classification tasks such as caller identification. Our results suggest that high-level neural representations of sounds are based on task-dependent features optimized for specific computational goals.
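The greedy search described in the abstract (selecting the most informative, least redundant features for call classification) resembles mRMR-style feature selection. Below is a minimal sketch of such a greedy loop, assuming mutual-information relevance and a correlation-based redundancy penalty; the function name, scoring rule, and toy data are illustrative assumptions, not the authors' code.

```python
# Hypothetical mRMR-style greedy feature selection: at each step, pick the
# candidate feature that is most informative about the call-type labels
# while being least redundant with features already selected.
import numpy as np
from sklearn.feature_selection import mutual_info_classif


def greedy_select_features(X, y, n_select=15):
    """Greedily choose up to n_select columns of X (candidate feature
    responses per call) that are informative about y (call-type labels)
    and minimally redundant with previously chosen features."""
    n_features = X.shape[1]
    relevance = mutual_info_classif(X, y)          # informativeness of each feature
    corr = np.abs(np.corrcoef(X, rowvar=False))    # redundancy proxy between features

    selected = [int(np.argmax(relevance))]         # seed with the most informative feature
    while len(selected) < min(n_select, n_features):
        candidates = [j for j in range(n_features) if j not in selected]
        # score = relevance minus mean redundancy with the already-selected set
        scores = [relevance[j] - corr[j, selected].mean() for j in candidates]
        selected.append(candidates[int(np.argmax(scores))])
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy stand-in data: 200 calls x 50 candidate features, 4 call types
    X = rng.normal(size=(200, 50))
    y = rng.integers(0, 4, size=200)
    print(greedy_select_features(X, y, n_select=10))
```

In this sketch the stopping criterion is simply a fixed feature count (the paper reports high performance with 10–20 features per call type); other stopping rules, such as a plateau in held-out classification accuracy, would fit the same loop.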
Date: 2019
Downloads: https://www.nature.com/articles/s41467-019-09115-y (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:10:y:2019:i:1:d:10.1038_s41467-019-09115-y
Ordering information: This journal article can be ordered from https://www.nature.com/ncomms/
DOI: 10.1038/s41467-019-09115-y