Entropy, mutual information, and systematic measures of structured spiking neural networks

J Theor Biol. 2020 Sep 21:501:110310. doi: 10.1016/j.jtbi.2020.110310. Epub 2020 May 19.

Abstract

The aim of this paper is to investigate various information-theoretic measures, including entropy, mutual information, and some systematic measures based on mutual information, for a class of structured spiking neuronal networks. To analyze and compute these information-theoretic measures for large networks, we coarse-grain the data by ignoring the order of spikes that fall into the same small time bin. The resulting coarse-grained entropy mainly captures the information contained in the rhythm produced by a local population of the network. We first show that these information-theoretic measures are well defined and computable by proving stochastic stability and the law of large numbers. We then use three neuronal network examples, ranging from simple to complex, to investigate these measures. Several analytical and computational results on the properties of these information-theoretic measures are given.
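
For concreteness, the following is a minimal Python sketch of the kind of coarse-graining and plug-in estimation the abstract refers to: spike times are binned into small time windows (discarding the order of spikes within each bin), and entropy and mutual information are computed from the empirical distribution of the binned counts. The bin width, the toy spike trains, the plug-in estimators, and all function names are illustrative assumptions and are not the estimators or networks analyzed in the paper.

```python
import numpy as np
from collections import Counter

def bin_spike_counts(spike_times, t_max, bin_width):
    """Coarse-grain a spike train into spike counts per time bin,
    discarding the order of spikes that fall into the same bin."""
    n_bins = int(np.ceil(t_max / bin_width))
    idx = np.minimum((np.asarray(spike_times) // bin_width).astype(int), n_bins - 1)
    return np.bincount(idx, minlength=n_bins)

def plug_in_entropy(symbols):
    """Empirical (plug-in) entropy, in bits, of a sequence of discrete symbols."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def plug_in_mutual_information(x, y):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y), in bits."""
    return plug_in_entropy(x) + plug_in_entropy(y) - plug_in_entropy(list(zip(x, y)))

# Toy example (assumed data): two populations sharing a common set of spike
# times (a shared "rhythm") plus independent background spikes.
rng = np.random.default_rng(0)
t_max, bin_width = 100.0, 0.05  # seconds (assumed values)
shared = rng.uniform(0.0, t_max, 1500)
spikes_a = np.sort(np.concatenate([shared, rng.uniform(0.0, t_max, 1000)]))
spikes_b = np.sort(np.concatenate([shared, rng.uniform(0.0, t_max, 1000)]))

counts_a = bin_spike_counts(spikes_a, t_max, bin_width).tolist()
counts_b = bin_spike_counts(spikes_b, t_max, bin_width).tolist()

print("coarse-grained H(A):", plug_in_entropy(counts_a), "bits/bin")
print("plug-in I(A;B):", plug_in_mutual_information(counts_a, counts_b), "bits/bin")
```

Note that the naive plug-in estimators above are biased for small samples; they are shown only to make the binning-based coarse-graining concrete.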

Keywords: Complexity; Degeneracy; Entropy; Mutual information; Neural field models.

MeSH terms

  • Entropy
  • Neural Networks, Computer*
  • Neurons*