Generalized backpropagation algorithm for training second-order neural networks

Int J Numer Method Biomed Eng. 2018 May;34(5):e2956. doi: 10.1002/cnm.2956. Epub 2018 Feb 6.

Abstract

The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neurons could be upgraded to second-order counterparts, in which the linear operation between the inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has strong nonlinear modeling ability, for example, it can implement basic fuzzy logic operations. In this paper, we develop a generalized backpropagation algorithm to train networks consisting of second-order neurons. Numerical studies are performed to verify the generalized backpropagation algorithm.
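To make the idea concrete, the sketch below shows one plausible form of a second-order (quadratic) neuron and the hand-derived gradients that a generalized backpropagation pass would use. This is an illustrative assumption, not the paper's exact formulation: the neuron is taken to compute sigmoid(x^T A x + w^T x + b), and the names A, w, b, forward, and backward are introduced here for the example only.

```python
# Minimal sketch of a quadratic ("second-order") neuron, assuming the form
#   y = sigmoid(x^T A x + w^T x + b),
# where the usual weighted sum w^T x + b is augmented by a quadratic term.
# Gradients are derived by hand and checked numerically against finite
# differences, illustrating how backpropagation extends to such neurons.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, A, w, b):
    """Quadratic neuron output and pre-activation for a single input x."""
    z = x @ A @ x + w @ x + b
    return sigmoid(z), z

def backward(x, A, w, b, upstream_grad):
    """Gradients of the output w.r.t. A, w, b, and x for a single input x."""
    y, _ = forward(x, A, w, b)
    dz = upstream_grad * y * (1.0 - y)      # sigmoid'(z) = y * (1 - y)
    grad_A = dz * np.outer(x, x)            # d z / d A = x x^T
    grad_w = dz * x                         # d z / d w = x
    grad_b = dz                             # d z / d b = 1
    grad_x = dz * ((A + A.T) @ x + w)       # d z / d x = (A + A^T) x + w
    return grad_A, grad_w, grad_b, grad_x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 4
    x = rng.normal(size=n)
    A = rng.normal(size=(n, n))
    w = rng.normal(size=n)
    b = 0.1

    grad_A, grad_w, grad_b, grad_x = backward(x, A, w, b, upstream_grad=1.0)

    # Finite-difference check of the bias gradient.
    eps = 1e-6
    y_plus, _ = forward(x, A, w, b + eps)
    y_minus, _ = forward(x, A, w, b - eps)
    print("analytic db:", grad_b, " numeric db:", (y_plus - y_minus) / (2 * eps))
```

In a multilayer network, the grad_x term is what gets passed to the preceding layer, so the chain rule proceeds exactly as in standard backpropagation; only the local derivatives of the quadratic pre-activation differ from the first-order case.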

Keywords: artificial neural network; backpropagation (BP); second-order neurons.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Fuzzy Logic
  • Machine Learning
  • Neural Networks, Computer*