The aim of this paper is to investigate various information-theoretic measures, including entropy, mutual information, and some systematic measures based on mutual information, for a class of structured spiking neuronal networks. To make these measures analyzable and computable for large networks, we coarse-grain the data by ignoring the order of spikes that fall into the same small time bin. The resulting coarse-grained entropy mainly captures the information contained in the rhythm produced by a local population of the network. We first show that these information-theoretic measures are well-defined and computable by proving stochastic stability and a law of large numbers. We then use three neuronal network examples, from simple to complex, to investigate these measures, and present several analytical and computational results on their properties.
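As an illustration of the coarse-graining step described above, the following sketch (not the paper's exact procedure; the function name and plug-in estimator are our own choices for illustration) bins spike times into small windows, discards the order of spikes within each bin, and estimates the entropy of the resulting spike-count symbol sequence:

```python
import numpy as np
from collections import Counter

def coarse_grained_entropy(spike_times, bin_width, t_end):
    """Plug-in entropy estimate (in bits per bin) of the coarse-grained
    spike train: spikes are reduced to per-bin counts, so the order of
    spikes within a bin is ignored."""
    edges = np.arange(0.0, t_end + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)  # spikes per bin
    # Empirical distribution over the observed count symbols
    freqs = np.array(list(Counter(counts).values()), dtype=float)
    p = freqs / freqs.sum()
    return float(-np.sum(p * np.log2(p)))
```

For example, a train with spikes at 0.1, 0.15, 0.9, and 1.2 on [0, 2) with bin width 0.5 yields the count sequence (2, 1, 1, 0), whose plug-in entropy is 1.5 bits per bin.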
Keywords: Complexity; Degeneracy; Entropy; Mutual information; Neural field models.
Copyright © 2020 Elsevier Ltd. All rights reserved.