Recommendation via Collaborative Autoregressive Flows

Neural Netw. 2020 Jun;126:52-64. doi: 10.1016/j.neunet.2020.03.010. Epub 2020 Mar 13.

Abstract

Although it is one of the most widely used methods in recommender systems, Collaborative Filtering (CF) still has difficulty modeling non-linear user-item interactions. Complementary to this, recently developed deep generative models (e.g., the Variational Autoencoder (VAE)), which allow Bayesian inference and approximation of the posterior distribution, have achieved promising performance improvements in many areas. However, common choices of the variational distribution - e.g., the popular diagonal-covariance Gaussian - are insufficient to recover the true posterior, often resulting in biased maximum likelihood estimates of the model parameters. Aiming at more tractable and expressive variational families, in this work we extend flow-based generative models to CF for modeling implicit feedback. We present Collaborative Autoregressive Flows (CAF) for recommender systems, which transform a simple initial density into a more complex one via a sequence of invertible transformations, until the desired level of complexity is attained. CAF is a non-linear probabilistic approach that allows uncertainty representation and exact, tractable latent-variable inference in item recommendation. Compared to the agnostic, presumed prior approximations used in existing deep generative recommendation approaches, CAF estimates the probabilistic posterior more effectively and achieves better recommendation accuracy. Extensive experimental evaluations demonstrate that CAF captures more effective representations of latent factors, yielding a substantial gain in recommendation performance over state-of-the-art approaches.
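
For context, the density transformation the abstract describes is typically expressed through the change-of-variables formula for normalizing flows; the sketch below is the generic formulation, not the paper's specific CAF parameterization. Starting from a sample z_0 drawn from a simple base density p_0 and applying K invertible maps,

  z_K = f_K \circ \cdots \circ f_1(z_0), \qquad z_0 \sim p_0(z_0),

the log-density of the transformed variable is

  \log p_K(z_K) = \log p_0(z_0) - \sum_{k=1}^{K} \log \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|.

In autoregressive flows, each f_k is constructed so that its Jacobian is triangular, so the log-determinant reduces to a sum of diagonal terms and the likelihood remains exactly computable, which is what enables the tractable posterior estimation claimed above.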

Keywords: Autoregressive flows; Collaborative recommendation; Generative models; Normalizing flows; Variational inference.

MeSH terms

  • Bayes Theorem
  • Information Management / methods
  • Likelihood Functions
  • Machine Learning*
  • Normal Distribution