Energy efficient synaptic plasticity
- PMID: 32053106
- PMCID: PMC7082127
- DOI: 10.7554/eLife.50804
Abstract
Many aspects of the brain's design can be understood as the result of evolutionary drive toward metabolic efficiency. In addition to the energetic costs of neural computation and transmission, experimental evidence indicates that synaptic plasticity is metabolically demanding as well. As synaptic plasticity is crucial for learning, we examine how these metabolic costs enter in learning. We find that when synaptic plasticity rules are naively implemented, training neural networks requires extremely large amounts of energy when storing many patterns. We propose that this is avoided by precisely balancing labile forms of synaptic plasticity with more stable forms. This algorithm, termed synaptic caching, boosts energy efficiency manifold and can be used with any plasticity rule, including back-propagation. Our results yield a novel interpretation of the multiple forms of neural synaptic plasticity observed experimentally, including synaptic tagging and capture phenomena. Furthermore, our results are relevant for energy efficient neuromorphic designs.
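As a rough illustration of the synaptic caching idea described above, the sketch below trains a simple perceptron on random patterns while routing weight updates through a cheap, labile (transient) store that is only occasionally consolidated into an expensive, stable (persistent) store. The perceptron task, the per-unit energy costs, and the consolidation threshold are illustrative assumptions for this sketch, not the authors' exact model or parameters.

```python
# Minimal sketch of synaptic caching, assuming a perceptron trained on random
# binary patterns. The energy model (a large cost per unit of persistent weight
# change, a much smaller cost per unit of transient change) and the
# consolidation threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_SYNAPSES = 100
N_PATTERNS = 50
LEARNING_RATE = 0.05
CONSOLIDATION_THRESHOLD = 0.5   # transient change needed before consolidating
COST_PERSISTENT = 1.0           # energy per unit of persistent weight change
COST_TRANSIENT = 0.01           # energy per unit of transient weight change

patterns = rng.choice([-1.0, 1.0], size=(N_PATTERNS, N_SYNAPSES))
labels = rng.choice([-1.0, 1.0], size=N_PATTERNS)

w_persistent = np.zeros(N_SYNAPSES)   # stable, protein-synthesis-dependent form
w_transient = np.zeros(N_SYNAPSES)    # labile, cheap-to-change form
energy_caching = 0.0
energy_naive = 0.0

for epoch in range(200):
    errors = 0
    for x, y in zip(patterns, labels):
        w_total = w_persistent + w_transient
        if np.sign(w_total @ x) != y:       # perceptron-style update on errors
            dw = LEARNING_RATE * y * x
            w_transient += dw
            energy_caching += COST_TRANSIENT * np.abs(dw).sum()
            # Naive scheme: every update is consolidated immediately.
            energy_naive += COST_PERSISTENT * np.abs(dw).sum()
            errors += 1
    # Consolidate only synapses whose accumulated transient change is large.
    big = np.abs(w_transient) > CONSOLIDATION_THRESHOLD
    energy_caching += COST_PERSISTENT * np.abs(w_transient[big]).sum()
    w_persistent[big] += w_transient[big]
    w_transient[big] = 0.0
    if errors == 0:
        break

# Final consolidation of whatever remains in the transient store.
energy_caching += COST_PERSISTENT * np.abs(w_transient).sum()
w_persistent += w_transient

print(f"naive energy:   {energy_naive:.1f}")
print(f"caching energy: {energy_caching:.1f}")
```

In this sketch the saving arises because opposite-sign updates cancel in the cheap transient store before any expensive consolidation takes place, so only the net, reliable part of the weight change is written into the persistent form.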
Keywords: computational models; metabolism; neuroscience; synaptic consolidation; synaptic plasticity.
Plain language summary
The brain expends a lot of energy. While the organ accounts for only about 2% of a person’s body weight, it is responsible for about 20% of our energy use at rest. Neurons use some of this energy to communicate with each other and to process information, but much of the energy is likely used to support learning. A study in fruit flies showed that insects that had learned to associate two stimuli, and then had their food supply cut off, died 20% earlier than untrained flies. This is thought to be because learning used up the insects’ energy reserves. If learning a single association requires so much energy, how does the brain manage to store vast amounts of data?

Li and van Rossum offer an explanation based on a computer model of neural networks. The advantage of using such a model is that it is possible to control and measure conditions more precisely than in the living brain. Analysing the model confirmed that learning many new associations requires large amounts of energy. This is particularly true if the memories must be stored with a high degree of accuracy, and if the neural network contains many stored memories already.

The reason that learning consumes so much energy is that forming long-term memories requires neurons to produce new proteins. Using the computer model, Li and van Rossum show that neural networks can overcome this limitation by storing memories initially in a transient form that does not require protein synthesis. Doing so reduces energy requirements by as much as 10-fold. Studies in living brains have shown that transient memories of this type do in fact exist. The current results hence offer a hypothesis as to how the brain can learn in a more energy efficient way.

Energy consumption is thought to have placed constraints on brain evolution. It is also often a bottleneck in computers. By revealing how the brain encodes memories energy efficiently, the current findings could thus also inspire new engineering solutions.
© 2020, Li and van Rossum.
Conflict of interest statement
HL: No competing interests declared. MvR: Reviewing editor, eLife.
Similar articles
- Evolving interpretable plasticity for spiking networks. eLife. 2021 Oct 28;10:e66273. doi: 10.7554/eLife.66273. PMID: 34709176.
- Competitive plasticity to reduce the energetic costs of learning. PLoS Comput Biol. 2024 Oct 28;20(10):e1012553. doi: 10.1371/journal.pcbi.1012553. PMID: 39466853.
- Neural learning rules for generating flexible predictions and computing the successor representation. eLife. 2023 Mar 16;12:e80680. doi: 10.7554/eLife.80680. PMID: 36928104.
- Possible role of intramembrane receptor-receptor interactions in memory and learning via formation of long-lived heteromeric complexes: focus on motor learning in the basal ganglia. J Neural Transm Suppl. 2003;(65):1-28. doi: 10.1007/978-3-7091-0643-3_1. PMID: 12946046. Review.
- Propagation delays determine neuronal activity and synaptic connectivity patterns emerging in plastic neuronal networks. Chaos. 2018 Oct;28(10):106308. doi: 10.1063/1.5037309. PMID: 30384625. Review.
Cited by
- Learning induces coordinated neuronal plasticity of metabolic demands and functional brain networks. Commun Biol. 2022 May 9;5(1):428. doi: 10.1038/s42003-022-03362-4. PMID: 35534605.
- Synaptic weight dynamics underlying memory consolidation: Implications for learning rules, circuit organization, and circuit function. Proc Natl Acad Sci U S A. 2024 Oct 8;121(41):e2406010121. doi: 10.1073/pnas.2406010121. PMID: 39365821.
- Postsynaptic Potential Energy as Determinant of Synaptic Plasticity. Front Comput Neurosci. 2022 Feb 17;16:804604. doi: 10.3389/fncom.2022.804604. PMID: 35250524.
- Reinforcement learning when your life depends on it: A neuro-economic theory of learning. PLoS Comput Biol. 2024 Oct 28;20(10):e1012554. doi: 10.1371/journal.pcbi.1012554. PMID: 39466882.
- The AI trilemma: Saving the planet without ruining our jobs. Front Artif Intell. 2022 Oct 19;5:886561. doi: 10.3389/frai.2022.886561. PMID: 36337142.
