Learning spatiotemporal signals using a recurrent spiking network that discretizes time

Amadeus Maes et al. PLoS Comput Biol. 2020 Jan 21;16(1):e1007606. doi: 10.1371/journal.pcbi.1007606. eCollection 2020 Jan.

Abstract

Learning to produce spatiotemporal sequences is a common task the brain has to solve, and the same neurons may be used to produce different sequential behaviours. How the brain learns and encodes such tasks remains unknown, as current computational models typically do not use realistic, biologically plausible learning rules. Here, we propose a model in which a recurrent network of excitatory and inhibitory spiking neurons drives a read-out layer: the dynamics of the driver recurrent network is trained to encode time, which is then mapped through the read-out neurons to encode another dimension, such as space or phase. Different spatiotemporal patterns can be learned and encoded through the synaptic weights to the read-out neurons, which follow common Hebbian learning rules. We demonstrate that the model can learn spatiotemporal dynamics on behaviourally relevant time scales, and we show that the learned sequences are robustly replayed during spontaneous activity.


Conflict of interest statement

The authors have declared that no competing interests exist.

Figures

Fig 1. Model architecture.
(A) The recurrent network consists of both inhibitory (in blue) and excitatory (in red) neurons with sparse connectivity. The temporal backbone is established in the recurrent network after a learning phase. Inset: zoom of the recurrent network showing the macroscopic recurrent structure after learning, here for 7 clusters. The excitatory neurons in the recurrent network project all-to-all to the read-out neurons; the read-out neurons are not interconnected. (B) All excitatory-to-excitatory connections are plastic under the voltage-based STDP rule (see Methods for details). The red lines are spikes of neuron j (top) and neuron i (bottom). When neurons j and i are strongly active together, they form bidirectional connections, strengthening both Wij and Wji. Connections Wij are unidirectionally strengthened when neuron j fires before neuron i. (C) The incoming excitatory weights in the recurrent network are L1 normalized, i.e., the sum of the incoming excitatory weights onto each neuron is kept constant. (D) Potentiation of the plastic read-out synapses depends linearly on the weight, which gives the weights a soft upper bound.
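As an illustration of the two weight constraints in panels (C) and (D), the NumPy sketch below (not the authors' code; the variable names, weight values and the simple multiplicative update are assumptions) normalizes the incoming excitatory weights of each neuron to a fixed L1 sum and applies a potentiation step whose size shrinks linearly as the weight approaches a maximum, producing the soft upper bound.

    import numpy as np

    rng = np.random.default_rng(0)

    # --- (C) L1 normalization of incoming excitatory weights (recurrent network) ---
    def normalize_incoming(W, target_sum):
        """Rescale each row of W so the sum of incoming excitatory weights
        onto every neuron stays at target_sum (convention: W[i, j] = j -> i)."""
        row_sums = W.sum(axis=1, keepdims=True)
        return W * (target_sum / row_sums)

    # --- (D) Soft-bounded potentiation of read-out synapses ---
    def potentiate(w, lr, w_max):
        """Potentiation step whose size shrinks linearly as w approaches w_max,
        so the weight saturates softly at w_max."""
        return w + lr * (w_max - w)

    # Toy usage
    W = rng.uniform(0.0, 1.0, size=(5, 5))
    np.fill_diagonal(W, 0.0)
    W = normalize_incoming(W, target_sum=2.0)
    print(W.sum(axis=1))          # each row now sums to 2.0

    w_readout = 0.0
    for _ in range(50):
        w_readout = potentiate(w_readout, lr=0.1, w_max=75.0)
    print(w_readout)              # approaches, but never exceeds, 75.0
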
Fig 2. Learning sequential dynamics stably under plasticity.
(A) The excitatory neurons receive sequential clustered inputs. Excitatory neurons are grouped into 30 disjoint clusters of 80 neurons each (7 clusters are shown in the cartoon for simplicity). (B) The weight matrix after training (only the first five clusters shown) exhibits the learned connectivity structure, e.g., neurons within cluster 1 are highly interconnected and also project to neurons in cluster 2, cluster 2 projects to cluster 3, and so on. The spectrum of the full weight matrix after training shows most eigenvalues in a circle in the complex plane (as in a random graph), two further eigenvalues signifying the balancing of the network, and a series of dominant eigenvalue pairs that encode the feedforward embedding. (C) Raster plot of the full network of 2400 excitatory (in red) and 600 inhibitory (in blue) neurons. After learning, the spontaneous dynamics exhibits a stable periodic trajectory 'going around the clock'. The excitatory clusters discretize time (see zoom) and the network has an overall period of about 450 ms.
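The clustered, feedforward-embedded connectivity and its spectral signature described in (B) can be sketched numerically as below. The block weights, cluster sizes and background connectivity are illustrative assumptions (not the trained matrix from the paper), and only the excitatory part is modelled, so the two balancing eigenvalues contributed by inhibition do not appear; the point is simply that a bulk of eigenvalues in a small disc coexists with a few dominant outliers created by the cluster-to-cluster feedforward structure.

    import numpy as np

    rng = np.random.default_rng(1)

    n_clusters, cluster_size = 7, 20          # 7 clusters, as in the Fig 1 inset
    N = n_clusters * cluster_size

    # Weak random background plus a clustered feedforward structure:
    # strong within-cluster weights and strong weights from cluster k to cluster k+1.
    W = 0.02 * rng.random((N, N))
    for k in range(n_clusters):
        rows = slice(k * cluster_size, (k + 1) * cluster_size)
        nxt = (k + 1) % n_clusters
        cols_next = slice(nxt * cluster_size, (nxt + 1) * cluster_size)
        W[rows, rows] += 0.15        # within-cluster block
        W[cols_next, rows] += 0.10   # cluster k -> cluster k+1 (column = presynaptic)

    eig = np.linalg.eigvals(W)
    # The bulk of eigenvalues sits in a small disc; the feedforward embedding
    # adds a handful of dominant (mostly complex-conjugate) outliers.
    order = np.argsort(-eig.real)
    print(np.round(eig[order][:6], 3))
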
Fig 3. Learning a non-Markovian sequence via the read-out neurons.
(A) Excitatory neurons in the recurrent network are all-to-all connected to the read-out neurons. The read-out neurons receive additional excitatory input from the supervisor neurons and inhibitory input from interneurons. The supervisor neurons receive spike trains drawn from a Poisson process with a rate determined by the target sequence. The read-out synapses are plastic under the voltage-based STDP rule. (B) The rate of the input signal to the supervisor neurons A, B and C. The supervisor sequence is ABCBA, where each letter represents a 75 ms external stimulation of the respective supervisor neuron at 10 kHz. (C) After learning, the supervisor input and plasticity are turned off, and the read-out neurons are solely driven by the recurrent network. (D) The read-out weight matrix W^RE after 12 seconds of learning. (E) Spikes of the recurrent network (top) and read-out (bottom) neurons under spontaneous activity. Excitatory neurons in the recurrent network reliably drive sequence replays. (F) The target rate (top) and the rate of the read-out neurons (bottom), computed from a single sequence replay and normalized to [0, 1]. The spikes of the read-out neurons are convolved with a Gaussian kernel with a width of ∼12 ms.
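A minimal sketch of the rate estimate used in (F), assuming a simple binned spike train: convolve the read-out spikes with a Gaussian kernel of width ~12 ms and normalize the result to [0, 1]. The binning, kernel truncation and example spike times are illustrative choices, not taken from the paper.

    import numpy as np

    def readout_rate(spike_times, t_max, dt=1.0, sigma=12.0):
        """Estimate a read-out neuron's firing rate by convolving its spike train
        with a Gaussian kernel (sigma ~ 12 ms, as in Fig 3F), then normalize to [0, 1].
        Times are in milliseconds; dt is the bin width."""
        t = np.arange(0.0, t_max, dt)
        spikes = np.zeros_like(t)
        idx = (np.asarray(spike_times) / dt).astype(int)
        spikes[idx] = 1.0
        kernel_t = np.arange(-4 * sigma, 4 * sigma + dt, dt)
        kernel = np.exp(-0.5 * (kernel_t / sigma) ** 2)
        kernel /= kernel.sum() * dt            # unit area -> rate in spikes/ms
        rate = np.convolve(spikes, kernel, mode="same")
        return t, rate / rate.max()            # normalized to [0, 1]

    # Toy usage: a burst around 100 ms during a ~450 ms replay
    t, r = readout_rate(spike_times=[95, 100, 103, 110], t_max=450.0)
    print(r.max(), r.argmax())                 # peak of 1.0 near t = 100 ms
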
Fig 4. Learning sequences in parallel.
(A) The recurrent network projects to two sets of neurons. (B) Two different sequences, ABCBA and DEDED, are learned by alternating between them and presenting each for 2 seconds at a time. (C) The read-out weight matrix after 24 seconds of learning. (D) Raster plot of spontaneous sequence reactivations, where an external inhibitory current is assumed to control which sequence is replayed.
Fig 5. Scaling, robustness and time variability of the model.
(A) Change of the mean period of the sequential dynamics as the number of clusters grows with: (i) the total number of excitatory neurons kept constant (red line); (ii) the total number of neurons increasing with cluster size (blue line). Error bars show one standard deviation. (B) Dynamics with varying levels of external excitatory input for four different cluster sizes and N_E = 2400. The external input can modulate the period of the sequential dynamics by ∼10%. (C) Recall performance of the learned sequence ABCBA for varying cluster sizes and N_E = 30·N_C under synapse deletion (computed over 20 repeats). The learning time depends on the cluster size: Δt = 960 s / N_C. (D) The ABCBA sequence is learned with a network of 120 excitatory neurons connected in one large chain and read-out neurons with the maximum synaptic read-out strength increased to W^AE_max = 75 pF. The network is driven by a low external input (r^EE_ext = 2.75 kHz). When a single synapse is deleted at t = 500 ms, the dynamics breaks down and parts of the sequence are randomly activated by the external input. Top: spike raster of the excitatory neurons of the RNN. Bottom: spike raster of the read-out neurons. (E) Left: histogram of the variability of the period of the sequential activity of the RNN over 79 trials. Right: the standard deviation of the cluster activation time, σ_t, increases as the square root of the mean cluster activation time μ_t: σ_t = 0.213·√μ_t (root mean squared error = 0.223 ms).
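The square-root scaling reported in (E) can be reproduced schematically as follows; the activation-time data below are synthetic placeholders, and only the fitted form σ_t = c·√μ_t and the reported values (c ≈ 0.213, RMSE ≈ 0.223 ms) come from the caption.

    import numpy as np

    # Illustrative fit of the square-root scaling of Fig 5E: sigma_t = c * sqrt(mu_t).
    # The data below are synthetic placeholders, not the paper's measurements.
    rng = np.random.default_rng(2)
    mu_t = np.linspace(15.0, 450.0, 30)                 # mean cluster activation times (ms)
    sigma_t = 0.213 * np.sqrt(mu_t) + rng.normal(0.0, 0.2, mu_t.size)

    # Least-squares estimate of c in sigma_t = c * sqrt(mu_t)
    x = np.sqrt(mu_t)
    c = np.dot(x, sigma_t) / np.dot(x, x)
    rmse = np.sqrt(np.mean((sigma_t - c * x) ** 2))
    print(f"c = {c:.3f}, RMSE = {rmse:.3f} ms")
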
Fig 6. Learning a complex sequence.
(A) Target sequence (top). The amplitude shows the rate of the Poisson input to the supervisor neurons, normalized between 0 and 10 kHz. Rate of the read-out neurons for one sample reactivation after 6 seconds of learning (bottom). 45 read-out neurons encode the different frequencies in the song; neuron i encodes the frequency interval [684 + 171i, 855 + 171i] Hz. (B) The read-out weight matrix after 6 seconds of learning. (C) Sequence replays showing the spike trains of both the recurrent network neurons (top, excitatory neurons in red and inhibitory neurons in blue) and the read-out neurons (bottom).
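A small helper (purely illustrative; the 0-based neuron indexing is an assumption) makes the stated frequency mapping of panel (A) explicit: read-out neuron i covers the 171 Hz band [684 + 171·i, 855 + 171·i] Hz, so the 45 neurons tile roughly 684 Hz to 8.4 kHz.

    def frequency_band(i):
        """Frequency interval (Hz) encoded by read-out neuron i in Fig 6,
        following the stated mapping [684 + 171*i, 855 + 171*i] Hz."""
        return 684 + 171 * i, 855 + 171 * i

    def neuron_for_frequency(f_hz):
        """Inverse mapping: which of the 45 read-out neurons covers frequency f_hz.
        (Illustrative helper, not from the paper.)"""
        i = (f_hz - 684) // 171
        if not 0 <= i < 45:
            raise ValueError("frequency outside the encoded range")
        return int(i)

    print(frequency_band(0))             # (684, 855) Hz  -> neuron 0
    print(frequency_band(44))            # (8208, 8379) Hz -> highest band
    print(neuron_for_frequency(1000))    # neuron 1 covers 855-1026 Hz
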
Fig 7. Spectral analysis of reduced linear model.
(A) Cartoon of a simplified linearised rate model with three nodes x1, x2, x3, corresponding to three clusters of excitatory neurons with recurrent strength δ, connected to a central cluster of inhibitory neurons x4. The cyclic connections are stronger clockwise than anticlockwise since ϵ > 1. (B) The spectrum shows a complex-conjugate eigenvalue pair with large real part (2δϵ − 1)/2 and imaginary part ±√3(ϵ − 1)/2, which grows linearly with the asymmetry of the clockwise/anticlockwise strength (ϵ − 1). This pair of eigenvalues dominates the dynamics, as their real parts are close to 1, and leads to the periodic behaviour corresponding to propagation around the cycle x1 → x2 → x3 → x1 → …
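The cycle mechanism in (A) and (B) can be checked numerically with a toy 3-node matrix. The coupling values below (self-coupling δ, clockwise ϵ, anticlockwise 1) are assumptions rather than the paper's parameters, and the inhibitory node x4, which in the full model suppresses the uniform mode, is omitted; the sketch only illustrates that the clockwise/anticlockwise asymmetry ϵ > 1 creates a complex-conjugate eigenvalue pair whose imaginary part grows with (ϵ − 1).

    import numpy as np

    def cycle_spectrum(delta, eps):
        """Eigenvalues of a toy 3-node cycle with self-coupling delta, clockwise
        coupling eps and anticlockwise coupling 1 (illustrative values; the
        inhibitory node of Fig 7A, which tames the uniform mode, is omitted)."""
        W = np.array([
            [delta, 1.0,   eps  ],   # x1 <- x1 (self), x2 (anticlockwise), x3 (clockwise)
            [eps,   delta, 1.0  ],   # x2 <- x1 (clockwise), x2 (self), x3 (anticlockwise)
            [1.0,   eps,   delta],   # x3 <- x1 (anticlockwise), x2 (clockwise), x3 (self)
        ])
        return np.linalg.eigvals(W)

    for eps in (1.0, 1.5, 2.0):
        eig = cycle_spectrum(delta=1.0, eps=eps)
        pair = eig[np.abs(eig.imag) > 1e-9]      # the complex-conjugate pair
        print(f"eps = {eps}: conjugate pair = {np.round(pair, 3)}")
    # For eps = 1 the pair is real (no rotation); as eps grows, its imaginary part
    # grows in proportion to (eps - 1), driving propagation x1 -> x2 -> x3 -> x1.
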
