
Neuromechanistic Model of Auditory Bistability

James Rankin, Elyse Sussman and John Rinzel

PLOS Computational Biology, 2015, vol. 11, issue 11, 1-34

Abstract: Sequences of higher frequency A and lower frequency B tones repeating in an ABA- triplet pattern are widely used to study auditory streaming. One may experience either an integrated percept, a single ABA-ABA- stream, or a segregated percept, separate but simultaneous streams A-A-A-A- and -B---B--. During minutes-long presentations, subjects may report irregular alternations between these interpretations. We combine neuromechanistic modeling and psychoacoustic experiments to study these persistent alternations and to characterize the effects of manipulating stimulus parameters. Unlike many phenomenological models with abstract, percept-specific competition and fixed inputs, our network model comprises neuronal units with sensory-feature-dependent inputs that mimic the pulsatile A1 responses to tones in the ABA- triplets. It embodies a neuronal computation for percept competition thought to occur beyond primary auditory cortex (A1). Mutual inhibition, adaptation and noise are implemented. We include slow NMDA recurrent excitation for local temporal memory that enables linkage across sound gaps from one triplet to the next. Percepts in our model are identified in the firing patterns of the neuronal units. We predict with the model that manipulations of the frequency difference between tones A and B should affect the dominance durations of the stronger percept, the one dominant for a larger fraction of the time, more than those of the weaker percept, a property that has previously been established and generalized across several visual bistable paradigms. We confirm the qualitative prediction with our psychoacoustic experiments and use the behavioral data to further constrain and improve the model, achieving quantitative agreement between experimental and modeling results. Our work and model provide a platform that can be extended to consider other stimulus conditions, including the effects of context and volition.

Author Summary: Humans have an astonishing ability to separate out different sound sources in a busy room: think of how we can hear individual voices in a bustling coffee shop. Rather than voices, we use sound stimuli in the lab: repeating patterns of high and low tones. The tone sequences are ambiguous and can be interpreted in different ways, either grouped into a single stream or separated out into different streams. When listening for a long time, one's perception switches every few seconds, a phenomenon called auditory bistability. Based on knowledge of the organization of the brain areas involved in separating out different sound sources, and of how neurons in these areas respond to the ambiguous sequences, we developed a computational model of auditory bistability. Our model is less abstract than existing models and shows how groups of neurons may compete in order to dictate what you perceive. We predict how the frequency difference between the two tones affects what you hear over time, and we performed an experiment with human listeners to confirm our prediction. The model provides groundwork to further explore the way the brain deals with the busy and often ambiguous world of sound.
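
To make the model ingredients named in the abstract concrete, the sketch below implements a minimal two-unit competition network with mutual inhibition, adaptation, slow NMDA-like recurrent excitation and noise, driven by pulsatile inputs standing in for A1 responses to ABA- triplets. This is an illustrative toy, not the authors' published equations: all parameter values, the sigmoidal rate function f, and the triplet_input drive are assumptions chosen only to show how such a circuit can produce irregular alternations between an "integrated" and a "segregated" unit.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative parameters (not the authors' fitted values) ---
dt, T  = 1e-3, 60.0   # time step and total duration [s]
tau    = 0.02         # firing-rate time constant [s]
tau_a  = 1.4          # adaptation time constant [s]
tau_s  = 0.5          # slow NMDA-like excitation time constant [s]
beta   = 1.2          # mutual inhibition strength
g_a    = 0.7          # adaptation strength
g_s    = 0.3          # slow recurrent excitation strength
sigma  = 0.05         # noise amplitude

def f(x):
    """Sigmoidal firing-rate function (assumed form)."""
    return 1.0 / (1.0 + np.exp(-(x - 0.2) / 0.05))

def triplet_input(t):
    """Pulsatile drive mimicking A1 responses to ABA- triplets
    (four 125 ms slots: A, B, A, silence). Purely illustrative."""
    slot = int(t / 0.125) % 4
    tone_on = (t % 0.125) < 0.06
    if not tone_on or slot == 3:
        return np.zeros(2)
    # unit 0 ~ "integrated", unit 1 ~ "segregated"; a larger A-B
    # frequency difference would shift drive toward unit 1
    return np.array([0.6, 0.5])

n = int(T / dt)
r = np.zeros(2)                    # firing rates of the two percept units
a = np.zeros(2)                    # adaptation variables
s = np.zeros(2)                    # slow recurrent excitation variables
dominant = np.zeros(n, dtype=int)  # which unit is stronger at each step

for k in range(n):
    t = k * dt
    I = triplet_input(t)
    # net input: external drive + slow self-excitation
    #            - cross inhibition - adaptation
    u = I + g_s * s - beta * r[::-1] - g_a * a
    noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(2)
    r += dt / tau * (-r + f(u)) + noise
    r = np.clip(r, 0.0, 1.0)
    a += dt / tau_a * (-a + r)
    s += dt / tau_s * (-s + r)
    dominant[k] = int(r[1] > r[0])

# crude dominance statistics: fraction of time each percept "wins"
print("integrated fraction :", 1 - dominant.mean())
print("segregated fraction :", dominant.mean())
```

Running the script prints the fraction of simulated time each unit dominates; in the actual model it is the dominance durations, and how they change with the A-B frequency difference, that are compared against the psychoacoustic data.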

Date: 2015
References: View complete reference list from CitEc
Citations: View citations in EconPapers (1)

Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1004555 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 04555&type=printable (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1004555

DOI: 10.1371/journal.pcbi.1004555


More articles in PLOS Computational Biology from Public Library of Science
Bibliographic data for series maintained by ploscompbiol.

 
Handle: RePEc:plo:pcbi00:1004555