Convergent decomposition techniques for training RBF neural networks

Neural Comput. 2001 Aug;13(8):1891-920. doi: 10.1162/08997660152469396.

Abstract

In this article we define globally convergent decomposition algorithms for supervised training of generalized radial basis function neural networks. First, we consider training algorithms based on the two-block decomposition of the network parameters into the vector of weights and the vector of centers. Then we define a decomposition algorithm in which the selection of the center locations is split into sequential minimizations with respect to each center, and we give a suitable criterion for choosing the centers that must be updated at each step. We prove the global convergence of the proposed algorithms and report the computational results obtained for a set of test problems.
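The two-block scheme described above can be illustrated with a minimal sketch: the weight subproblem is linear least squares (solved exactly), while the center subproblem is handled by a few gradient steps. This is not the paper's algorithm — the convergence conditions, line searches, and center-selection criterion it proves are omitted — just an assumed Gaussian-RBF setup showing the alternation itself.

```python
import numpy as np

def rbf_design(X, centers, sigma):
    # Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_two_block(X, y, n_centers=5, sigma=1.0, outer_iters=20, lr=0.05, seed=0):
    """Alternate the two blocks of parameters:
    (1) weights w  -- convex subproblem, solved exactly by least squares;
    (2) centers c  -- nonconvex subproblem, here a few plain gradient steps
        (the paper instead uses globally convergent minimization rules)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)].copy()
    w = np.zeros(n_centers)
    for _ in range(outer_iters):
        # Block 1: optimal weights for the current centers.
        Phi = rbf_design(X, centers, sigma)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        # Block 2: gradient steps on the centers of E = 0.5 * ||Phi w - y||^2.
        for _ in range(5):
            Phi = rbf_design(X, centers, sigma)
            r = Phi @ w - y  # residuals
            for j in range(n_centers):
                # dPhi[i, j]/dc_j = Phi[i, j] * (x_i - c_j) / sigma^2
                g = (r * w[j] * Phi[:, j])[:, None] * (X - centers[j]) / sigma ** 2
                centers[j] -= lr * g.sum(axis=0)
    Phi = rbf_design(X, centers, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w, centers

# Toy 1-D regression target.
X = np.linspace(-3, 3, 60)[:, None]
y = np.sin(X[:, 0])
w, centers = train_two_block(X, y)
mse = np.mean((rbf_design(X, centers, 1.0) @ w - y) ** 2)
```

The sequential variant in the abstract would replace block 2 with one-center-at-a-time minimizations (looping `j` in an outer alternation with block 1), updating only the centers singled out by the selection criterion at each step.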

MeSH terms

  • Algorithms
  • Learning
  • Mathematics
  • Models, Neurological
  • Neural Networks, Computer*