Neural learning rules for generating flexible predictions and computing the successor representation
- PMID: 36928104
- PMCID: PMC10019889
- DOI: 10.7554/eLife.80680
Abstract
The predictive nature of the hippocampus is thought to be useful for memory-guided cognitive behaviors. Inspired by the reinforcement learning literature, this notion has been formalized as a predictive map called the successor representation (SR). The SR captures a number of observations about hippocampal activity. However, the algorithm does not provide a neural mechanism for how such representations arise. Here, we show the dynamics of a recurrent neural network naturally calculate the SR when the synaptic weights match the transition probability matrix. Interestingly, the predictive horizon can be flexibly modulated simply by changing the network gain. We derive simple, biologically plausible learning rules to learn the SR in a recurrent network. We test our model with realistic inputs and match hippocampal data recorded during random foraging. Taken together, our results suggest that the SR is more accessible in neural circuits than previously thought and can support a broad range of cognitive functions.
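To make the abstract's core claim concrete, below is a minimal numerical sketch in Python/NumPy. It is not the authors' code: it assumes the standard SR definition M = (I - gamma*T)^-1 and a simple linear rate network, and the names T, gamma, and steady_state_activity are chosen here purely for illustration. The sketch shows that iterating recurrent dynamics whose synaptic weights match the transition probability matrix converges to the SR of the current state, with the network gain playing the role of the discount factor.

import numpy as np

# Minimal sketch (not the authors' implementation): a linear recurrent network
# whose weights match the state-transition matrix computes the successor
# representation (SR) at steady state; the network gain sets the horizon.

n_states = 5

# Transition probability matrix T for a random walk on a ring (illustrative).
T = np.zeros((n_states, n_states))
for s in range(n_states):
    T[s, (s - 1) % n_states] = 0.5
    T[s, (s + 1) % n_states] = 0.5

gamma = 0.7  # network gain, playing the role of the SR discount factor

# Closed-form SR: M = sum_k (gamma * T)^k = (I - gamma * T)^(-1)
M = np.linalg.inv(np.eye(n_states) - gamma * T)

def steady_state_activity(input_vec, T, gamma, n_iters=500):
    """Iterate linear rate dynamics x <- input + gamma * T^T @ x to a fixed point."""
    x = np.zeros_like(input_vec)
    for _ in range(n_iters):
        x = input_vec + gamma * T.T @ x
    return x

# One-hot input marking the current state; the steady-state population activity
# matches the SR row for that state (the weight-transpose convention here is a
# bookkeeping choice for this sketch, not necessarily the paper's convention).
current_state = 0
x_ss = steady_state_activity(np.eye(n_states)[current_state], T, gamma)
print(np.allclose(x_ss, M[current_state]))  # True

Raising the gain toward 1 lengthens the predictive horizon of the steady-state activity, while lowering it shortens the horizon; this is the flexible modulation by network gain that the abstract refers to.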
Keywords: hippocampus; neuroscience; plasticity; predictive coding; recurrent neural network; state-space model; tufted titmouse.
Plain language summary
Memories are an important part of how we think, understand the world around us, and plan out future actions. In the brain, memories are thought to be stored in a region called the hippocampus. When memories are formed, neurons store events that occur around the same time together. This might explain why, in the brains of animals, the activity associated with retrieving memories is often not just a snapshot of what happened at a specific moment: it can also include information about what the animal might experience next. This can have a clear utility if animals use memories to predict what they might experience next and plan out future actions.

Mathematically, this notion of predictiveness can be summarized by an algorithm known as the successor representation. This algorithm describes what the activity of neurons in the hippocampus looks like when retrieving memories and making predictions based on them. However, even though the successor representation can computationally reproduce the activity seen in the hippocampus when it is making predictions, it is unclear what biological mechanisms underpin this computation in the brain.

Fang et al. approached this problem by trying to build a model that could generate the same activity patterns computed by the successor representation using only biological mechanisms known to exist in the hippocampus. First, they used computational methods to design a network of neurons that had the biological properties of neural networks in the hippocampus. They then used the network to simulate neural activity. The results show that the activity of the network they designed was able to exactly match the successor representation. Additionally, the data resulting from the simulated activity in the network fitted experimental observations of hippocampal activity in tufted titmice.

One advantage of the network designed by Fang et al. is that it can generate predictions in flexible ways; that is, it can make both short- and long-term predictions from what an individual is experiencing at the moment. This flexibility means that the network can be used to simulate how the hippocampus learns in a variety of cognitive tasks. Additionally, the network is robust to different conditions. Given that the brain has to be able to store memories in many different situations, this is a promising indication that this network may be a reasonable model of how the brain learns.

The results of Fang et al. lay the groundwork for connecting biological mechanisms in the hippocampus at the cellular level to cognitive effects, an essential step to understanding the hippocampus, as well as its role in health and disease. For instance, their network may provide a concrete approach to studying how disruptions to the ways neurons make and break connections can impair memory formation. More generally, better models of the biological mechanisms involved in making computations in the hippocampus can help scientists better understand and test out theories about how memories are formed and stored in the brain.
© 2023, Fang et al.
Conflict of interest statement
CF, DA, LA, EM No competing interests declared
Update of
- doi: 10.1101/2022.05.18.492543
Similar articles
-
RatInABox, a toolkit for modelling locomotion and neuronal activity in continuous environments. eLife. 2024 Feb 9;13:e85274. doi: 10.7554/eLife.85274. PMID: 38334473. Free PMC article.
-
Toward the biological model of the hippocampus as the successor representation agent. Biosystems. 2022 Mar;213:104612. doi: 10.1016/j.biosystems.2022.104612. Epub 2022 Jan 29. PMID: 35093444.
-
A neural network model of when to retrieve and encode episodic memories. eLife. 2022 Feb 10;11:e74445. doi: 10.7554/eLife.74445. PMID: 35142289. Free PMC article.
-
Engineering Aspects of Olfaction. In: Persaud KC, Marco S, Gutiérrez-Gálvez A, editors. Neuromorphic Olfaction. Boca Raton (FL): CRC Press/Taylor & Francis; 2013. Chapter 1. PMID: 26042329. Free Books & Documents. Review.
-
Performance of a Computational Model of the Mammalian Olfactory System. In: Persaud KC, Marco S, Gutiérrez-Gálvez A, editors. Neuromorphic Olfaction. Boca Raton (FL): CRC Press/Taylor & Francis; 2013. Chapter 6. PMID: 26042330. Free Books & Documents. Review.
Cited by
-
Rapid learning of predictive maps with STDP and theta phase precession. eLife. 2023 Mar 16;12:e80663. doi: 10.7554/eLife.80663. PMID: 36927826. Free PMC article.
-
Accounting for multiscale processing in adaptive real-world decision-making via the hippocampus. Front Neurosci. 2023 Sep 5;17:1200842. doi: 10.3389/fnins.2023.1200842. eCollection 2023. PMID: 37732307. Free PMC article. Review.
-
Abstract cognitive maps of social network structure aid adaptive inference. Proc Natl Acad Sci U S A. 2023 Nov 21;120(47):e2310801120. doi: 10.1073/pnas.2310801120. Epub 2023 Nov 14. PMID: 37963254. Free PMC article.
-
Learning predictive cognitive maps with spiking neurons during behavior and replays. eLife. 2023 Mar 16;12:e80671. doi: 10.7554/eLife.80671. PMID: 36927625. Free PMC article.
-
Endotaxis: A neuromorphic algorithm for mapping, goal-learning, navigation, and patrolling. eLife. 2024 Feb 29;12:RP84141. doi: 10.7554/eLife.84141. PMID: 38420996. Free PMC article.
