EconPapers    

Exploring neural manifolds across a wide range of intrinsic dimensions

Jacopo Fadanni, Rosalba Pacelli, Alberto Zucchetta, Pietro Rotondo and Michele Allegra

PLOS Computational Biology, 2026, vol. 22, issue 4, 1-28

Abstract: The rapid surge in the number of simultaneously recorded neurons demands reliable tools to explore the latent geometry of high-dimensional neural spaces. Within such spaces, neuronal activity typically lies on a subspace or manifold characterized by an intrinsic dimension (ID) that is much lower than the total number of recorded units. The ID can provide immediate information about the neural code, such as the minimum number of encoded variables and the relation between collective and individual neural activity. Existing studies rely on disparate and potentially unreliable ID estimators, which can contribute to conflicting reports of high-dimensional vs. low-dimensional manifolds. Here, we propose a robust and versatile pipeline for ID estimation, exploiting a local version of the full correlation integral estimator (lFCI). Being able to simultaneously cope with high dimensionality and non-linearity, lFCI overcomes some major limitations of common ID estimation methods. We prove the strength and accuracy of lFCI by applying it to synthetic benchmark data by Altan et al., 2019, where other methods typically underestimate the ID. We apply lFCI to study neural manifolds arising in recurrent neural networks trained on the 20 tasks of the well-known ‘cog-Task’ battery. Across tasks and training repetitions, lFCI uncovers a consistently low ID, which we show to be fundamentally related to the task structure. Finally, we apply lFCI to a reference experimental dataset by Stringer et al., 2019, comprising visual responses to a large set of natural images, strongly supporting previous reports that responses are organized in a high-dimensional manifold.
lFCI has the potential to shed light on the current debate about the geometry of neural codes, and its dependence on structural constraints and computational goals in biological and artificial neural networks.

Author summary: The dimensionality, or intrinsic dimension (ID), of neural manifolds is one of the key properties characterizing the collective organization of neural activity at the population level. However, rigorously estimating the ID is difficult: common methods tend to overestimate the ID in the presence of curvature, and to underestimate it when it is large. Exploiting FCI, an ID estimator that is particularly effective in high-dimensional settings, we developed local FCI (lFCI), a pipeline for ID estimation that addresses these limitations. lFCI proves robust when tested against challenging benchmark data. We applied lFCI in two paradigmatic cases: data from artificial recurrent neural networks trained on simple tasks, and experimental data of neural responses to visual stimuli. In both cases, lFCI provides substantial support for widely debated hypotheses: low ID in the case of simple tasks and large ID for visual responses. lFCI could be used to characterize neural manifolds across a wide range of dimensions, offering a useful tool in the current debate about the dimensionality of neural activity.
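For orientation, the full correlation integral approach mentioned in the abstract builds on the classic Grassberger–Procaccia idea that, on a d-dimensional manifold, the fraction C(r) of point pairs closer than r scales as r^d at small r. The sketch below is a minimal illustration of that scaling-based principle, not the authors' lFCI pipeline; the quantile bounds for the scaling region and the synthetic test data are heuristic choices for the example.

```python
import numpy as np

def correlation_dimension(X, q_lo=0.01, q_hi=0.10, n_radii=20):
    """Estimate intrinsic dimension as the slope of log C(r) vs log r,
    where C(r) is the fraction of point pairs closer than r
    (Grassberger-Procaccia correlation integral; illustrative only)."""
    n = X.shape[0]
    # pairwise squared Euclidean distances via the Gram-matrix identity
    sq = np.sum(X * X, axis=1)
    d2 = np.clip(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0, None)
    dists = np.sqrt(d2[np.triu_indices(n, k=1)])  # all unordered pairs
    # probe radii inside the small-r scaling region (quantile cut is heuristic)
    radii = np.quantile(dists, np.linspace(q_lo, q_hi, n_radii))
    C = np.array([np.mean(dists < r) for r in radii])
    # slope of the log-log fit approximates the correlation dimension
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

rng = np.random.default_rng(0)
# a flat 2-D manifold linearly embedded in 10 ambient dimensions
latent = rng.standard_normal((1500, 2))
X = latent @ rng.standard_normal((2, 10))
print(correlation_dimension(X))  # roughly 2 for this 2-D plane
```

A global estimator like this degrades on curved or very high-dimensional manifolds, which is exactly the regime the paper's local FCI variant is designed to handle.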

Date: 2026
References: View complete reference list from CitEc

Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1014162 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 14162&type=printable (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1014162

DOI: 10.1371/journal.pcbi.1014162


More articles in PLOS Computational Biology from Public Library of Science
Bibliographic data for series maintained by ploscompbiol.

 
Page updated 2026-04-26
Handle: RePEc:plo:pcbi00:1014162