A nonlinear sparse neural ordinary differential equation model for multiple functional processes

Can J Stat. 2022 Mar;50(1):59-85. doi: 10.1002/cjs.11666. Epub 2021 Nov 16.

Abstract

In this article, we propose a new sparse neural ordinary differential equation (ODE) model to characterize flexible relations among multiple functional processes. We describe the latent states of the functions via a set of ordinary differential equations and model the dynamic changes of these latent states using a deep neural network (DNN) with a specially designed architecture and a sparsity-inducing regularization. The new model is able to capture both nonlinear and sparse dependence relations among multivariate functions. We develop an efficient optimization algorithm to estimate the unknown weights of the DNN under the sparsity constraint, and we establish both algorithmic convergence and selection consistency, which together provide the theoretical guarantees of the proposed method. We illustrate the efficacy of the method through simulations and a gene regulatory network example.
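To make the construction described above concrete, the sketch below illustrates, in plain NumPy, a latent state whose evolution is governed by a neural ODE with a small MLP vector field, together with an ℓ0-style hard-thresholding step that zeroes whole groups of first-layer weights so that an entire input process can drop out of the estimated dynamics. This is a minimal illustrative sketch under simplifying assumptions: the forward-Euler solver, the column-wise grouping, and the names (`vector_field`, `group_hard_threshold`, `euler_solve`) are hypothetical and do not reproduce the authors' architecture or optimization algorithm.

```python
# Illustrative sketch (not the paper's method): a sparse neural ODE vector field
# for p functional processes, with an l0-style group hard-threshold on the
# first-layer input columns so a whole process can be excluded from the dynamics.
import numpy as np

rng = np.random.default_rng(0)
p, hidden = 4, 16               # number of processes, hidden width (assumed values)

# MLP weights for the vector field f_theta: R^p -> R^p
W1 = rng.normal(scale=0.3, size=(hidden, p))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.3, size=(p, hidden))
b2 = np.zeros(p)

def vector_field(x):
    """dx/dt = W2 tanh(W1 x + b1) + b2, a generic neural-ODE right-hand side."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

def group_hard_threshold(W, k):
    """Keep only the k input columns of W with largest l2 norm (l0-style group sparsity).
    Zeroing column j removes process j from the dynamics of every process."""
    norms = np.linalg.norm(W, axis=0)
    keep = np.argsort(norms)[-k:]
    W_sparse = np.zeros_like(W)
    W_sparse[:, keep] = W[:, keep]
    return W_sparse

def euler_solve(x0, t_grid):
    """Integrate the latent states with forward Euler (a stand-in for an ODE solver)."""
    xs = [x0]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        xs.append(xs[-1] + (t1 - t0) * vector_field(xs[-1]))
    return np.stack(xs)

# Enforce that at most 2 processes drive the dynamics, then simulate trajectories.
W1 = group_hard_threshold(W1, k=2)
traj = euler_solve(rng.normal(size=p), np.linspace(0.0, 1.0, 50))
print(traj.shape)               # (50, 4): latent states of the 4 processes over time
```

In an actual fit, the thresholding step would alternate with gradient updates of the DNN weights against observed trajectories; here it only shows how an ℓ0-type constraint induces sparse dependence among the processes.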

Keywords: Deep neural networks; multivariate functions; nonconvex optimization; ordinary differential equation; ℓ0-penalty.

MSC: Primary 62R10; secondary 62G05.