IEEE Trans Neural Netw. 1994;5(2):213-28. doi: 10.1109/72.279186.

Back propagation through adjoints for the identification of nonlinear dynamic systems using recurrent neural models


B Srinivasan et al. IEEE Trans Neural Netw. 1994.

Abstract

In this paper, back propagation is reinvestigated for an efficient evaluation of the gradient in arbitrary interconnections of recurrent subsystems. It is shown that the error has to be back-propagated through the adjoint model of the system and that the gradient can only be obtained after a delay. A faster version, accelerated back propagation, which eliminates this delay, is also developed. Various schemes, including the sensitivity method, are studied for updating the weights of the network using these gradients. Motivated by the Lyapunov approach and the adjoint model, predictive back propagation and its variant, targeted back propagation, are proposed. A further refinement, predictive back propagation with filtering, is then developed, in which the states of the model are also updated; the convergence of this scheme is assured. It is shown that back-propagating as many time steps as the order of the system is sufficient for convergence. As a preamble, the convergence of online, batch, and sample-wise updates in feedforward models is analyzed using the Lyapunov approach.
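The core idea the abstract describes — back-propagating the output error through the adjoint (costate) model of a recurrent system to obtain the gradient — can be illustrated with a minimal sketch. The model, parameter names (`W`, `B`, `C`), toy data, and the tanh state equation below are all illustrative assumptions, not taken from the paper; the adjoint gradient is checked against finite differences.

```python
import numpy as np

# Illustrative recurrent model (NOT the paper's specific architecture):
#   x_{t+1} = tanh(W x_t + B u_t),  y_t = C x_{t+1}
# Loss: L = 0.5 * sum_t (y_t - d_t)^2.
# The gradient dL/dW is computed by running the system forward, then
# back-propagating the error through the adjoint recursion
#   lam_t = W^T s_t,  s_t = (C^T e_t + lam_{t+1}) * tanh'(a_t).

rng = np.random.default_rng(0)
n, m, T = 3, 2, 20                      # state dim, input dim, horizon
W = 0.5 * rng.standard_normal((n, n))   # recurrent weights (to be trained)
B = rng.standard_normal((n, m))         # input weights (held fixed here)
C = rng.standard_normal((1, n))         # output weights (held fixed here)
u = rng.standard_normal((T, m))         # toy input sequence
d = rng.standard_normal((T, 1))         # toy desired output sequence

def simulate(W):
    """Forward pass: states, pre-activations, outputs, and loss."""
    x = np.zeros((T + 1, n))
    pre = np.zeros((T, n))
    for t in range(T):
        pre[t] = W @ x[t] + B @ u[t]
        x[t + 1] = np.tanh(pre[t])
    y = x[1:] @ C.T
    loss = 0.5 * np.sum((y - d) ** 2)
    return x, pre, y, loss

def adjoint_gradient(W):
    """Backward pass: error propagated through the adjoint model."""
    x, pre, y, _ = simulate(W)
    lam = np.zeros(n)                   # adjoint (costate) variable
    gW = np.zeros_like(W)
    for t in reversed(range(T)):
        e = (y[t] - d[t]) @ C           # output-error term, C^T (y_t - d_t)
        s = (e + lam) * (1 - np.tanh(pre[t]) ** 2)  # through tanh'
        gW += np.outer(s, x[t])         # accumulate dL/dW
        lam = W.T @ s                   # propagate adjoint one step back
    return gW

def fd_gradient(W, eps=1e-6):
    """Central finite differences, as an independent check."""
    g = np.zeros_like(W)
    for i in range(n):
        for j in range(n):
            Wp = W.copy(); Wp[i, j] += eps
            Wm = W.copy(); Wm[i, j] -= eps
            g[i, j] = (simulate(Wp)[3] - simulate(Wm)[3]) / (2 * eps)
    return g

g_adj = adjoint_gradient(W)
g_fd = fd_gradient(W)
print("max |adjoint - finite diff|:", np.max(np.abs(g_adj - g_fd)))
```

Note that the adjoint recursion runs over the full horizon here for clarity; the abstract's observation that back-propagating only as many steps as the system order suffices for convergence corresponds to truncating this backward loop.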
