Discrete-Time Signatures and Randomness in Reservoir Computing

IEEE Trans Neural Netw Learn Syst. 2021 May 26;PP. doi: 10.1109/TNNLS.2021.3076777. Online ahead of print.


A new explanation of the geometric nature of the reservoir computing (RC) phenomenon is presented. RC is understood in the literature as the possibility of approximating input-output systems with randomly chosen recurrent neural systems and a trained linear readout layer. Light is shed on this phenomenon by constructing what are called strongly universal reservoir systems as random projections of a family of state-space systems that generate Volterra series expansions. This procedure yields a state-affine reservoir system with randomly generated coefficients whose dimension is logarithmically reduced with respect to that of the original system. This reservoir system can approximate any element in the class of fading-memory filters simply by training a different linear readout for each filter. Explicit expressions for the probability distributions needed to generate the projected reservoir system are stated, and bounds on the resulting approximation error are provided.
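The abstract contains no code, but a minimal numerical sketch may help make the ingredients concrete: a state-affine reservoir system x_t = (A0 + z_t A1) x_{t-1} + (b0 + z_t b1) with randomly generated coefficients, driven by a scalar input sequence, with a linear readout trained for one particular fading-memory filter. Everything below is illustrative rather than taken from the paper: the degree-one polynomial structure, the Gaussian coefficient draws (the paper derives its own explicit distributions), the contraction rescaling, the moving-average target, and the ridge-regression readout are all assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: reservoir state size N and input length T.
N, T = 50, 2000

# Random state-affine system (SAS) coefficients: the state map
#   x_t = (A0 + z_t * A1) @ x_{t-1} + (b0 + z_t * b1)
# is affine in the state and polynomial (degree one here) in the input z_t.
# Gaussian draws are an assumption of this sketch, not the paper's distributions.
A0 = rng.standard_normal((N, N))
A1 = rng.standard_normal((N, N))
b0 = rng.standard_normal(N)
b1 = rng.standard_normal(N)

# Rescale so that ||A0|| + ||A1|| <= 0.9 in spectral norm: for inputs
# |z_t| <= 1 the state map is then a contraction, a simple sufficient
# condition for the echo state / fading memory property.
scale = 0.45 / max(np.linalg.norm(A0, 2), np.linalg.norm(A1, 2))
A0, A1 = scale * A0, scale * A1

def run_reservoir(z):
    """Drive the random SAS with input sequence z and collect the states."""
    x = np.zeros(N)
    states = np.empty((len(z), N))
    for t, zt in enumerate(z):
        x = (A0 + zt * A1) @ x + (b0 + zt * b1)
        states[t] = x
    return states

# Toy fading-memory target: a causal 5-tap moving average of the input.
z = rng.uniform(-1.0, 1.0, size=T)
y = np.convolve(z, np.ones(5) / 5, mode="full")[:T]

# Only the linear readout is trained, here by ridge regression on the states.
X = run_reservoir(z)
ridge = 1e-6
W = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

print("training RMSE:", np.sqrt(np.mean((X @ W - y) ** 2)))
```

Approximating a different fading-memory filter would reuse the same random reservoir and retrain only W, which is the division of labor the abstract describes; the contraction rescaling is one elementary way to secure the fading-memory property that the approximation statement presupposes.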