Synergistic information supports modality integration and flexible learning in neural networks solving multiple tasks

Alexandra M Proca, Fernando E Rosas, Andrea I Luppi, Daniel Bor, Matthew Crosby and Pedro A M Mediano

PLOS Computational Biology, 2024, vol. 20, issue 6, 1-28

Abstract: Striking progress has been made in understanding cognition by analyzing how the brain is engaged in different modes of information processing. For instance, so-called synergistic information (information encoded by a set of neurons but not by any subset) plays a key role in areas of the human brain linked with complex cognition. However, two questions remain unanswered: (a) how and why a cognitive system can become highly synergistic; and (b) how informational states map onto artificial neural networks in various learning modes. Here we employ an information-decomposition framework to investigate neural networks performing cognitive tasks. Our results show that synergy increases as networks learn multiple diverse tasks, and that in tasks requiring integration of multiple sources, performance critically relies on synergistic neurons. Overall, our results suggest that synergy is used to combine information from multiple modalities, and more generally to support flexible and efficient learning. These findings reveal new ways of investigating how and why learning systems employ specific information-processing strategies, and support the principle that the capacity for general-purpose learning critically relies on the system’s information dynamics.

Author summary: What is the informational basis of learning in humans, animals, or, indeed, artificial neural networks (ANNs)? Furthermore, how can these systems learn to solve multiple tasks simultaneously? These fundamental questions are, surprisingly, still not fully understood. One advantage of studying ANNs is that we can precisely probe learning-related changes. Here we draw on a recent branch of information theory, partial information decomposition, to examine how, and where, different types of information support different learning goals in ANNs. We show that adding noise to an ANN encourages it to keep copies of information at multiple nodes, promoting robustness. In contrast, whenever flexible learning is required, for instance when facing varied stimulus types or diverse tasks, individual neurons work together to represent information more abstractly. This work sheds light on how systems encode information differently according to their learning pressures, which can help us better understand how and why the human brain uses particular forms of information processing.
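
To make the abstract's notion of synergy concrete, below is a minimal sketch in plain Python (an illustration of the textbook XOR example, not code from the paper) showing a target that carries one full bit of information about a pair of inputs while sharing zero bits with either input alone:

    import itertools
    import math
    from collections import Counter

    def mutual_information(pairs):
        # I(A;B) in bits, computed from a list of equiprobable (a, b) outcomes.
        n = len(pairs)
        p_ab = Counter(pairs)
        p_a = Counter(a for a, _ in pairs)
        p_b = Counter(b for _, b in pairs)
        return sum((c / n) * math.log2(c * n / (p_a[a] * p_b[b]))
                   for (a, b), c in p_ab.items())

    # Uniform distribution over the four input patterns; the target is XOR.
    samples = [(x1, x2, x1 ^ x2) for x1, x2 in itertools.product([0, 1], repeat=2)]

    print(mutual_information([(x1, y) for x1, _, y in samples]))         # 0.0 bits: X1 alone reveals nothing
    print(mutual_information([(x2, y) for _, x2, y in samples]))         # 0.0 bits: X2 alone reveals nothing
    print(mutual_information([((x1, x2), y) for x1, x2, y in samples]))  # 1.0 bit: the pair determines Y

Because redundant and unique components are bounded above by the individual mutual informations (here both zero), any standard partial information decomposition assigns the entire bit to synergy: the information exists only in the pair, not in any subset.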

Date: 2024

Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1012178 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 12178&type=printable (application/pdf)

Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1012178

DOI: 10.1371/journal.pcbi.1012178

More articles in PLOS Computational Biology from Public Library of Science
Bibliographic data for series maintained by ploscompbiol.

 
Handle: RePEc:plo:pcbi00:1012178