Equivariant nonstationary source separation

Neural Netw. 2002 Jan;15(1):121-30. doi: 10.1016/s0893-6080(01)00137-x.

Abstract

Most source separation methods focus on stationary sources, so higher-order statistics are necessary for successful separation unless the sources are temporally correlated. For nonstationary sources, however, it was shown [Neural Networks 8 (1995) 411] that source separation can be achieved by second-order decorrelation. In this paper, we consider the cost function proposed by Matsuoka et al. [Neural Networks 8 (1995) 411] and derive natural gradient learning algorithms for both a fully connected recurrent network and a feedforward network. Since our algorithms employ the natural gradient method, they possess the equivariant property and follow the steepest-descent direction, unlike the algorithm in [Neural Networks 8 (1995) 411]. We also show that our algorithms are always locally stable, regardless of the probability distributions of the nonstationary sources.
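To make the idea concrete, the following is a minimal sketch of a natural-gradient, second-order decorrelation rule for nonstationary mixtures of the general kind described above. It is not the paper's algorithm verbatim: the update form W ← W + η (I − Λ⁻¹ y yᵀ) W with Λ a running diagonal estimate of the output variances, the function name, and all parameter values are illustrative assumptions. Right-multiplying the gradient term by W is what gives the rule its equivariant character.

```python
import numpy as np


def natural_gradient_nonstationary_separation(x, lr=0.002, ema=0.99, seed=0):
    """Hypothetical sketch of a natural-gradient second-order decorrelation
    update for nonstationary source separation (names, defaults, and the exact
    update form are illustrative assumptions, not taken from the paper).

    x : array of shape (n_samples, n_channels), the observed mixtures.
    Returns the estimated unmixing matrix W and the separated outputs y.
    """
    n_samples, n = x.shape
    rng = np.random.default_rng(seed)
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))  # initial unmixing matrix
    var = np.ones(n)                                     # running output variances
    y_all = np.empty_like(x)

    for t in range(n_samples):
        y = W @ x[t]                                     # current separated outputs
        var = ema * var + (1.0 - ema) * y**2             # track nonstationary power
        lambda_inv = np.diag(1.0 / np.maximum(var, 1e-8))
        # Assumed natural-gradient style update: W <- W + lr * (I - Lambda^{-1} y y^T) W.
        # The right-multiplication by W makes the learning dynamics independent
        # of the unknown mixing matrix (the equivariant property).
        W += lr * (np.eye(n) - lambda_inv @ np.outer(y, y)) @ W
        y_all[t] = y

    return W, y_all


if __name__ == "__main__":
    # Toy demonstration with two amplitude-modulated (hence nonstationary) sources.
    rng = np.random.default_rng(1)
    T = 20000
    t = np.arange(T)
    s = np.vstack([
        np.sin(2 * np.pi * 0.010 * t) * rng.standard_normal(T),
        np.cos(2 * np.pi * 0.003 * t) * rng.standard_normal(T),
    ]).T
    A = rng.standard_normal((2, 2))                      # unknown mixing matrix
    x = s @ A.T
    W, y = natural_gradient_nonstationary_separation(x)
    # If separation succeeded, W @ A is close to a scaled permutation matrix.
    print(np.round(W @ A, 2))
```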

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Computer Simulation
  • Neural Networks, Computer*