On Relations Between the Relative Entropy and χ²-Divergence, Generalizations and Applications

Entropy (Basel). 2020 May 18;22(5):563. doi: 10.3390/e22050563.

Abstract

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper focuses on integral relations between these two divergences, the implications of those relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. The applications studied here include lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the rate of convergence to stationarity for a class of discrete-time Markov chains.
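For concreteness, the following is a minimal sketch of the kind of integral relation studied, stated for probability mass functions P and Q on a common discrete alphabet; the identity below is a known relation of this type, offered as an illustration rather than as the paper's exact statement. With the standard definitions

\[
D(P\|Q) = \sum_{x} P(x)\,\log\frac{P(x)}{Q(x)}, \qquad
\chi^2(P\|Q) = \sum_{x} \frac{\bigl(P(x)-Q(x)\bigr)^2}{Q(x)},
\]

and the convex mixtures \(R_\lambda := (1-\lambda)P + \lambda Q\) for \(\lambda \in [0,1]\), one has

\[
D(P\|Q) = \int_0^1 \frac{\chi^2(P \,\|\, R_\lambda)}{\lambda}\, d\lambda .
\]

This identity can be verified directly from the representation \(\log\frac{P(x)}{Q(x)} = \int_0^1 \frac{P(x)-Q(x)}{R_\lambda(x)}\, d\lambda\) together with \(\sum_{x} \bigl(P(x)-Q(x)\bigr) = 0\); the integrand is well behaved near \(\lambda = 0\) since \(\chi^2(P\|R_\lambda) = O(\lambda^2)\) there.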

Keywords: Markov chains; chi-squared divergence; f-divergences; information contraction; large deviations; maximal correlation; method of types; relative entropy; strong data-processing inequalities.