Error minimized extreme learning machine with growth of hidden nodes and incremental learning

IEEE Trans Neural Netw. 2009 Aug;20(8):1352-7. doi: 10.1109/TNN.2009.2024147. Epub 2009 Jul 10.

Abstract

One of the open problems in neural network research is how to determine network architectures automatically for given applications. In this brief, we propose a simple and efficient approach that automatically determines the number of hidden nodes in generalized single-hidden-layer feedforward networks (SLFNs), whose hidden nodes need not be neuron-like. This approach, referred to as the error-minimized extreme learning machine (EM-ELM), can add random hidden nodes to SLFNs one by one or group by group (with varying group size). As the network grows, the output weights are updated incrementally. The convergence of this approach is also proved in this brief. Simulation results verify that the new approach achieves good generalization performance while running much faster than other sequential/incremental/growing algorithms.
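The growth procedure the abstract describes — appending random hidden nodes and updating the output weights incrementally rather than retraining from scratch — can be sketched with the standard block pseudoinverse update. The sketch below is an illustrative reconstruction, not the authors' code: the data, sizes, sigmoid activation, and stopping thresholds are all assumptions chosen for a toy regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative): n samples, d inputs, 1 output target
n, d = 200, 3
X = rng.uniform(-1, 1, (n, d))
T = np.sin(X.sum(axis=1, keepdims=True))

def hidden_output(X, W, b):
    """Sigmoid hidden-layer output matrix H (n x L) for random weights W, biases b."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Start with a small random hidden layer and solve for the output weights
L0 = 5
W = rng.standard_normal((d, L0))
b = rng.standard_normal(L0)
H = hidden_output(X, W, b)
Hp = np.linalg.pinv(H)      # H^+ (Moore-Penrose pseudoinverse)
beta = Hp @ T               # least-squares output weights

# Grow one random node at a time, updating H^+ incrementally instead of
# recomputing it: with dH the new column, the block pseudoinverse update is
#   D = ((I - H H^+) dH)^+ ,  U = H^+ - H^+ dH D ,  [H, dH]^+ = [U; D]
target_rmse, max_nodes = 0.05, 60
while np.linalg.norm(H @ beta - T) / np.sqrt(n) > target_rmse and H.shape[1] < max_nodes:
    dW = rng.standard_normal((d, 1))
    db = rng.standard_normal(1)
    dH = hidden_output(X, dW, db)
    D = np.linalg.pinv((np.eye(n) - H @ Hp) @ dH)
    U = Hp - Hp @ dH @ D
    Hp = np.vstack([U, D])                    # pseudoinverse of the grown H
    H = np.hstack([H, dH])
    W = np.hstack([W, dW])
    b = np.concatenate([b, db])
    beta = Hp @ T                             # incremental output-weight update
```

The key point is that each growth step costs only a few matrix products on the new column, rather than a full pseudoinverse of the enlarged hidden-layer matrix, which is what makes node-by-node (or group-by-group) growth cheap.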

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Classification / methods
  • Computer Simulation
  • Databases, Factual
  • Learning / physiology
  • Neural Networks, Computer*
  • Neurons / physiology
  • Pattern Recognition, Automated / methods
  • Regression Analysis
  • Synaptic Transmission / physiology
  • Time Factors