Balancing complexity, performance and plausibility to meta learn plasticity rules in recurrent spiking networks
Basile Confavreux, Everton J Agnes, Friedemann Zenke, Henning Sprekeler and Tim P Vogels
PLOS Computational Biology, 2025, vol. 21, issue 4, 1-21
Abstract:
Synaptic plasticity is a key player in the brain’s life-long learning abilities. However, due to experimental limitations, the mechanistic link between synaptic plasticity rules and the network-level computations they enable remains opaque. Here we use evolutionary strategies (ES) to meta learn local co-active plasticity rules in large recurrent spiking networks with excitatory (E) and inhibitory (I) neurons, using parameterizations of increasing complexity. We discover rules that robustly stabilize network dynamics for all four synapse types acting in isolation (E-to-E, E-to-I, I-to-E and I-to-I). More complex functions such as familiarity detection can also be included in the search constraints. However, our meta learning strategy begins to fail for co-active rules of increasing complexity, as it is challenging to devise loss functions that effectively constrain network dynamics to plausible solutions a priori. Moreover, in line with previous work, we can find multiple degenerate solutions with identical network behaviour. As a local optimization strategy, ES provides one solution at a time and makes exploration of this degeneracy cumbersome. Regardless, we can glean the interdependencies of various plasticity parameters by considering the covariance matrix learned alongside the optimal rule with ES. Our work provides a proof of principle for the success of machine-learning-guided discovery of plasticity rules in large spiking networks, and points to the necessity of more elaborate search strategies going forward.
Author summary: Synapses between neurons in the brain change continuously throughout life. This phenomenon, called synaptic plasticity, is believed to be crucial for the brain to learn from and remember past experiences. However, the exact nature of these synaptic changes remains unclear, partly because they are hard to observe experimentally.
Theorists have thus long tried to predict these synaptic changes and how they contribute to learning and memory, using abstractions called plasticity rules. Although many plasticity rules have been proposed, there are many different synapse types in the brain and many more possible rules to test. A recent approach is to automate this screening of possible plasticity rules, using modern machine learning tools. This idea, called meta learning plasticity rules, has so far only been applied to very simple models of brain synapses. Here, we scale up this idea to more complex and more faithful models. We optimize plasticity rules based on their ability to make model brain circuits solve basic memory tasks. We find several different yet equally good plasticity rules (degeneracy). However, our method drops in performance when considering more complex rules or tasks.
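The search procedure described in the abstract can be sketched in miniature. The following toy is illustrative only and is not the paper's implementation: it substitutes a small rate-based network for the paper's large E/I spiking networks, uses a simple four-parameter local rule (a constant term plus presynaptic, postsynaptic and co-active terms), and applies a basic rank-weighted ES update rather than the full method. All function names, parameter values and the loss function are hypothetical stand-ins.

```python
import numpy as np

def simulate(theta, n=20, steps=200, target_rate=0.1, seed=0):
    """Evolve a toy recurrent rate network under a parameterized local
    plasticity rule; return a loss rewarding stable, near-target activity.
    (A stand-in for the paper's spiking-network simulations.)"""
    rng = np.random.default_rng(seed)       # fixed seed: deterministic loss
    W = 0.1 * rng.standard_normal((n, n))   # recurrent weights
    r = 0.1 * rng.random(n)                 # firing rates
    for _ in range(steps):
        r = np.tanh(W @ r + 0.05)           # rate dynamics with weak drive
        pre, post = r[None, :], r[:, None]
        # local rule: weight change depends only on pre/post activity
        dW = theta[0] + theta[1] * pre + theta[2] * post + theta[3] * pre * post
        W = np.clip(W + 0.01 * dW, -1.0, 1.0)
    return (r.mean() - target_rate) ** 2 + r.var()

def es_step(theta, rng, sigma=0.1, pop=32, lr=0.05):
    """One ES update: perturb the rule parameters, rank the resulting
    losses, and step against the rank-weighted gradient estimate."""
    eps = rng.standard_normal((pop, theta.size))
    losses = np.array([simulate(theta + sigma * e) for e in eps])
    ranks = losses.argsort().argsort() / (pop - 1) - 0.5  # worst loss -> +0.5
    grad = ranks @ eps / (pop * sigma)      # uphill direction estimate
    return theta - lr * grad                # descend

rng = np.random.default_rng(1)
theta = np.zeros(4)                         # start from the "no plasticity" rule
for _ in range(30):
    theta = es_step(theta, rng)
```

The rank normalization makes the update invariant to the scale of the loss, a common choice in ES; richer variants additionally adapt a full covariance matrix over the parameters, which is the object the paper inspects to read off parameter interdependencies.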
Date: 2025
Downloads:
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1012910 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 12910&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1012910
DOI: 10.1371/journal.pcbi.1012910