Shallow Representation Learning via Kernel PCA Improves QSAR Modelability

J Chem Inf Model. 2017 Aug 28;57(8):1859-1867. doi: 10.1021/acs.jcim.6b00694. Epub 2017 Aug 7.


Linear models offer a robust, flexible, and computationally efficient set of tools for modeling quantitative structure-activity relationships (QSARs) but have been eclipsed in performance by nonlinear methods. Support vector machines (SVMs) and neural networks are currently among the most popular and accurate QSAR methods because they learn new representations of the data that greatly improve modelability. In this work, we use shallow representation learning to improve the accuracy of L1 regularized logistic regression (LASSO) and meet the performance of Tanimoto SVM. We embedded chemical fingerprints in Euclidean space using Tanimoto (a.k.a. Jaccard) similarity kernel principal component analysis (KPCA) and compared the effects on LASSO and SVM model performance for predicting the binding activities of chemical compounds against 102 virtual screening targets. We observed similar performance and patterns of improvement for LASSO and SVM. We also empirically measured model training and cross-validation times to show that KPCA used in concert with LASSO classification is significantly faster than linear SVM over a wide range of training set sizes. Our work shows that powerful linear QSAR methods can match nonlinear methods and demonstrates a modular approach to nonlinear classification that greatly enhances QSAR model prototyping facility, flexibility, and transferability.
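The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes scikit-learn, uses random binary vectors in place of real chemical fingerprints and toy activity labels, and picks the number of KPCA components and the LASSO regularization strength arbitrarily. The Tanimoto (Jaccard) kernel is supplied to `KernelPCA` as a precomputed Gram matrix, and the resulting Euclidean embedding is fed to an L1-regularized logistic regression classifier.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LogisticRegression

def tanimoto_kernel(X, Y):
    """Tanimoto/Jaccard similarity between rows of two binary matrices:
    K(x, y) = <x, y> / (|x| + |y| - <x, y>)."""
    dot = X @ Y.T
    nx = X.sum(axis=1)[:, None]
    ny = Y.sum(axis=1)[None, :]
    return dot / (nx + ny - dot)

# Synthetic stand-ins for chemical fingerprints and binding labels
# (real data would be e.g. 1024-bit ECFP fingerprints).
rng = np.random.default_rng(0)
X = (rng.random((200, 64)) < 0.2).astype(float)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Step 1: embed fingerprints in Euclidean space via Tanimoto-kernel PCA.
K = tanimoto_kernel(X, X)
kpca = KernelPCA(n_components=32, kernel="precomputed")
Z = kpca.fit_transform(K)

# Step 2: fit an L1-regularized (LASSO-style) logistic classifier
# on the embedded coordinates.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```

To score a new compound, one would compute its Tanimoto similarities to the training fingerprints, project with `kpca.transform`, and call `clf.predict` on the result; this modularity (embedding once, then swapping in any linear classifier) is the transferability the abstract highlights.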

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Informatics / methods*
  • Principal Component Analysis*
  • Quantitative Structure-Activity Relationship*
  • Support Vector Machine*
  • Time Factors