Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons
Yasuhiro Tsubo, Yoshikazu Isomura and Tomoki Fukai
PLOS Computational Biology, 2012, vol. 8, issue 4, 1-11
Abstract:
The brain is thought to use a relatively small amount of energy for its efficient information processing. Under such a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. MMI attempts to transmit information as accurately as possible, which usually requires a sufficient energy supply to establish clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints: one limiting energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broad communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. We thus propose a mathematical principle by which cortical neurons may encode information about synaptic input in their output spike trains.
Author Summary:
The brain is a highly noisy information machine, in striking contrast to man-made electronic computers, for which noise is merely harmful. However, little is known about how neurons process information in such noisy states. Here, we explore the principles of noisy neural information processing in accurately recorded spike trains of in vivo cortical neurons. We found that their irregular spiking exhibits power-law statistics of inter-spike intervals. While the power law in neuronal firing is itself a surprising finding in neuroscience, simple mathematics further reveals a possible link between the power law and the neural code. Namely, we show that in vivo cortical neurons maximize the firing-rate entropy under joint constraints on energy consumption and the uncertainty of output spike trains. Our results suggest that the brain, which operates in a highly noisy environment and under a severe limitation on energy consumption, may employ a computational principle different from the mutual-information maximization of standard information theory.
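For readers who want to see what such a conditional maximization could look like formally, the following is a minimal maximum-entropy sketch, not taken from the paper itself: it assumes the two constraints are a mean firing rate (energy cost) and a mean log-rate (a common stand-in for a fixed relative uncertainty), and shows that these alone already produce a power-law factor in the rate distribution.

% A minimal maximum-entropy sketch (not from the paper; the two
% constraints below are illustrative assumptions).
% Maximize H[p] = -\int_0^\infty p(r) \ln p(r) dr subject to
% normalization, <r> = mu (energy cost), and <ln r> = nu (uncertainty).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The Lagrangian is
\begin{equation}
  \mathcal{L}[p] = -\int_0^\infty p \ln p \, dr
    - \lambda_0\!\left(\int_0^\infty p \, dr - 1\right)
    - \lambda_1\!\left(\int_0^\infty r\, p \, dr - \mu\right)
    - \lambda_2\!\left(\int_0^\infty \ln r \; p \, dr - \nu\right).
\end{equation}
Setting $\delta\mathcal{L}/\delta p = 0$ gives $\ln p = -(1+\lambda_0) - \lambda_1 r - \lambda_2 \ln r$, i.e.
\begin{equation}
  p(r) \propto r^{-\lambda_2}\, e^{-\lambda_1 r},
\end{equation}
a power law with an exponential cutoff.
\end{document}

Under these assumed constraints, the exponential factor supplies the high-rate cutoff set by the energy budget, while the power-law factor governs the tail, qualitatively matching the long-tailed distributions reported in the abstract.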
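Separately, as an illustrative sketch of how one might test the paper's central empirical claim on one's own spike data (this is not the authors' analysis code; the synthetic inter-spike intervals, the tail cutoff t_min, and the exponent are all assumptions), a standard maximum-likelihood tail estimator in Python:

# A minimal sketch (not the authors' code): estimating the exponent of
# a putative power-law ISI tail, p(t) ~ t^(-alpha) for t >= t_min,
# with the standard maximum-likelihood (Hill-type) estimator.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ISIs with a power-law tail (density exponent alpha = 2.5),
# standing in for a recorded spike train's intervals (in seconds).
alpha_true = 2.5
isis = (rng.pareto(alpha_true - 1.0, size=20_000) + 1.0) * 0.01

def fit_tail_exponent(samples, t_min):
    """MLE for the exponent of p(t) ~ t^(-alpha), t >= t_min."""
    tail = samples[samples >= t_min]
    n = tail.size
    alpha_hat = 1.0 + n / np.sum(np.log(tail / t_min))
    stderr = (alpha_hat - 1.0) / np.sqrt(n)
    return alpha_hat, stderr, n

t_min = 0.02  # assumed tail cutoff; in practice one scans t_min
alpha_hat, stderr, n = fit_tail_exponent(isis, t_min)
print(f"tail n = {n}, alpha_hat = {alpha_hat:.2f} +/- {stderr:.2f}")

In practice the cutoff t_min would be scanned and the power-law fit compared against alternatives (e.g., a lognormal or exponential tail) before drawing conclusions.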
Date: 2012
Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1002461 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 02461&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1002461
DOI: 10.1371/journal.pcbi.1002461