Data-driven learning of chaotic dynamical systems using Discrete-Temporal Sobolev Networks

Neural Netw. 2024 May:173:106152. doi: 10.1016/j.neunet.2024.106152. Epub 2024 Feb 1.

Abstract

We introduce the Discrete-Temporal Sobolev Network (DTSN), a neural network loss function that assists dynamical system forecasting by minimizing variational differences between the network output and the training data via a temporal Sobolev norm. This approach is entirely data-driven, architecture-agnostic, and does not require derivative information from the estimated system. The DTSN is particularly well suited to chaotic dynamical systems, as it minimizes noise in the network output, which is crucial for such sensitive systems. For our test cases we consider discrete approximations of the Lorenz-63 system and the Chua circuit. For the network architectures we use the Long Short-Term Memory (LSTM) network and the Transformer. The performance of the DTSN is compared with the standard MSE loss for both architectures, as well as with the Physics-Informed Neural Network (PINN) loss for the LSTM. The DTSN loss is shown to substantially improve accuracy for both architectures, while requiring less information than the PINN and without noticeably increasing computation time, thereby demonstrating its potential to improve neural network forecasting of dynamical systems.
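The abstract describes the DTSN loss as a temporal Sobolev norm on the discrepancy between predicted and target trajectories, requiring no derivative information from the underlying system. One plausible reading is a standard MSE term augmented with MSE terms on discrete temporal differences of the trajectories. The sketch below illustrates that idea only; the function name, the forward-difference choice, and the `order`/`weight` parameters are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def dtsn_loss(pred, target, order=1, weight=1.0):
    """Illustrative discrete-temporal Sobolev-style loss (assumed form).

    pred, target: arrays of shape (T, d) -- T time steps, d state dims.
    Adds to the plain MSE one penalty term per finite-difference order,
    so the network is also fit to the discrete temporal derivatives of
    the training trajectory, damping high-frequency noise in the output.
    """
    loss = np.mean((pred - target) ** 2)  # zeroth-order term: plain MSE
    dp, dt = pred, target
    for _ in range(order):
        dp = np.diff(dp, axis=0)  # forward difference of the prediction
        dt = np.diff(dt, axis=0)  # forward difference of the target
        loss += weight * np.mean((dp - dt) ** 2)
    return loss
```

Because the differences are taken on the sampled trajectories themselves, such a loss stays data-driven and architecture-agnostic: it can wrap any sequence model's output without access to the governing equations, unlike a PINN residual term.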

Keywords: Chaotic system; LSTM; Lorenz system; Neural network; Physics-Informed Neural Network; Prediction.

MeSH terms

  • Algorithms*
  • Forecasting
  • Learning
  • Memory, Long-Term
  • Neural Networks, Computer*