A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations

Neural Netw. 2023 Jan:158:331-343. doi: 10.1016/j.neunet.2022.11.022. Epub 2022 Nov 19.

Abstract

Deep Neural Networks are widely used to solve complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis. The strategies employed to investigate their theoretical properties mainly rely on Euclidean geometry, but in recent years new approaches based on Riemannian geometry have been developed. Motivated by some open problems, we study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric. We investigate the structures induced through pullbacks on the other manifolds of the sequence and on some related quotients. In particular, we show that the pullback of the final Riemannian metric to any manifold of the sequence is a degenerate Riemannian metric inducing a structure of pseudometric space. We prove that the Kolmogorov quotient of this pseudometric space yields a smooth manifold, which is the base space of a particular vertical bundle. We investigate the theoretical properties of the maps of such a sequence; finally, we focus on the case of maps between manifolds implementing neural networks of practical interest, and we present some applications of the geometric framework introduced in the first part of the paper.
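The degeneracy phenomenon described in the abstract can be illustrated with a minimal numerical sketch (not the paper's construction): for a hypothetical one-layer map f: R³ → R², f(x) = tanh(Wx), the pullback of the Euclidean metric on the output space is G(x) = J(x)ᵀJ(x), where J is the Jacobian of f. Since rank(J) ≤ 2 < 3, the 3×3 matrix G must have a nontrivial null space, i.e. it is a degenerate (positive-semidefinite) metric.

```python
import numpy as np

# Hypothetical one-layer "network" f: R^3 -> R^2, f(x) = tanh(W x).
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))

def jacobian(x):
    # d/dx tanh(W x) = diag(1 - tanh(W x)^2) @ W
    s = 1.0 - np.tanh(W @ x) ** 2
    return s[:, None] * W

x = rng.standard_normal(3)
J = jacobian(x)
G = J.T @ J  # pullback of the Euclidean metric at x: 3x3, positive semidefinite

# rank(J) <= 2, so the 3x3 pullback metric G has a zero eigenvalue:
eigvals = np.linalg.eigvalsh(G)
print(eigvals)  # smallest eigenvalue is numerically 0 -> degenerate metric
```

Two points with difference lying in the null space of G are at zero pseudodistance; identifying such points is what the Kolmogorov quotient in the abstract formalizes.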

Keywords: Classification; Deep learning; Degenerate metrics; Neural networks; Riemann geometry.

MeSH terms

  • Algorithms*
  • Image Processing, Computer-Assisted / methods
  • Neural Networks, Computer*