Bits and brains: Information flow in the nervous system
William Bialek,
Michael DeWeese,
Fred Rieke and
David Warland
Physica A: Statistical Mechanics and its Applications, 1993, vol. 200, issue 1, 581-593
Abstract:
Until recently there have been no convincing quantitative measurements of the rates of information transmission in real neurons. Here we review the theoretical basis for making such measurements, together with the data that demonstrate remarkably high information rates in a variety of systems. In fact these rates are within a factor of two of the absolute physical limits set by the entropy of neural spike trains. These observations lead to sharp theoretical questions about the structure of the code and the strategy for adapting the code to different ensembles of input signals.
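The abstract compares measured information rates against the entropy of the spike train itself, which upper-bounds how much information the train can carry. As a minimal sketch (not from the paper), the following estimates a spike-train entropy rate with the naive "plug-in" method: discretize the train into small time bins, group bins into words, and compute the entropy of the empirical word distribution. The function name, parameters, and the Bernoulli surrogate data are all illustrative assumptions.

```python
import math
import random
from collections import Counter

def spike_train_entropy_rate(spikes, word_len, dt):
    """Plug-in entropy-rate estimate in bits per second.

    spikes   -- sequence of 0/1 bin occupancies (illustrative encoding)
    word_len -- number of bins per "word"
    dt       -- bin width in seconds
    """
    # Chop the binary train into non-overlapping words of word_len bins.
    words = [tuple(spikes[i:i + word_len])
             for i in range(0, len(spikes) - word_len + 1, word_len)]
    counts = Counter(words)
    total = sum(counts.values())
    # Shannon entropy of the empirical word distribution, in bits.
    entropy_bits = -sum((c / total) * math.log2(c / total)
                        for c in counts.values())
    # Normalize by the word duration to get bits per second.
    return entropy_bits / (word_len * dt)

# Surrogate data: ~100 s of a Poisson-like train at ~40 spikes/s, 2 ms bins.
random.seed(0)
dt, rate = 0.002, 40.0
spikes = [1 if random.random() < rate * dt else 0 for _ in range(50000)]
print(spike_train_entropy_rate(spikes, word_len=8, dt=dt))
```

With finite data this plug-in estimate is biased downward for long words; the paper's "factor of two" claim rests on far more careful entropy estimation than this sketch.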
Date: 1993
Downloads:
http://www.sciencedirect.com/science/article/pii/037843719390563J
Full text for ScienceDirect subscribers only. The journal offers the option of making the article available online on ScienceDirect for a fee of $3,000.
Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:200:y:1993:i:1:p:581-593
DOI: 10.1016/0378-4371(93)90563-J
Physica A: Statistical Mechanics and its Applications is currently edited by K. A. Dawson, J. O. Indekeu, H.E. Stanley and C. Tsallis
More articles in Physica A: Statistical Mechanics and its Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.