Robustness analysis for connection weight matrices of global exponential stability of stochastic recurrent neural networks

Neural Netw. 2013 Feb;38:17-22. doi: 10.1016/j.neunet.2012.10.004. Epub 2012 Nov 7.

Abstract

This paper analyzes the robustness of global exponential stability of stochastic recurrent neural networks (SRNNs) subject to parameter uncertainty in the connection weight matrices. Given a globally exponentially stable stochastic recurrent neural network, the question addressed here is how much parameter uncertainty the connection weight matrices can tolerate while the neural network remains globally exponentially stable. We characterize upper bounds on the parameter uncertainty under which the recurrent neural network sustains global exponential stability. A numerical example is provided to illustrate the theoretical result.
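The paper derives its bounds analytically; as a purely illustrative complement, the sketch below (not the authors' method) simulates a small SRNN with Euler-Maruyama and compares trajectories under a nominal weight matrix and a randomly perturbed one. All matrices, the noise intensity `sigma`, and the uncertainty bound `delta` are hypothetical values chosen only to demonstrate the idea of checking whether a bounded weight perturbation preserves apparent exponential decay.

```python
import numpy as np

# Hypothetical 2-neuron SRNN:  dx = (-D x + W tanh(x)) dt + sigma * x dW.
# We perturb the connection weight matrix W by a bounded Delta_W and check,
# by simulation, whether trajectories still appear to decay exponentially.

rng = np.random.default_rng(0)

D = np.diag([2.0, 2.0])                   # self-feedback (decay) matrix
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])               # nominal connection weights (assumed stable)
delta = 0.2                               # assumed uncertainty bound on the weights
Delta_W = delta * (2 * rng.random((2, 2)) - 1)   # random perturbation, |entries| <= delta

sigma = 0.1                               # noise intensity (hypothetical)
dt, T = 1e-3, 10.0
steps = int(T / dt)

def simulate(weights, x0):
    """Euler-Maruyama integration of the SRNN; returns ||x(t)|| at each step."""
    x = x0.copy()
    norms = np.empty(steps)
    for k in range(steps):
        dw = rng.normal(scale=np.sqrt(dt), size=2)
        drift = -D @ x + weights @ np.tanh(x)
        x = x + drift * dt + sigma * x * dw
        norms[k] = np.linalg.norm(x)
    return norms

x0 = np.array([1.0, -1.5])
nominal = simulate(W, x0)
perturbed = simulate(W + Delta_W, x0)

# Rough exponential decay-rate estimate from the tail of each trajectory.
t = np.arange(steps) * dt
for label, n in [("nominal", nominal), ("perturbed", perturbed)]:
    rate = np.polyfit(t[steps // 2:], np.log(n[steps // 2:] + 1e-12), 1)[0]
    print(f"{label}: estimated decay rate ~ {rate:.2f}")
```

A single Monte Carlo run like this can only suggest, not certify, robustness; the paper's contribution is the analytical upper bound on the perturbation size below which global exponential stability is guaranteed for all admissible uncertainties.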

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Models, Neurological
  • Neural Networks, Computer*
  • Position-Specific Scoring Matrices*
  • Stochastic Processes*
  • Time Factors
  • Uncertainty*