Principal Component Analysis Based on Nuclear Norm Minimization

Neural Netw. 2019 Oct:118:1-16. doi: 10.1016/j.neunet.2019.05.020. Epub 2019 Jun 8.

Abstract

Principal component analysis (PCA) is a widely used tool for dimensionality reduction and feature extraction in the field of computer vision. Traditional PCA is sensitive to outliers, which are common in empirical applications. Therefore, in recent years, massive efforts have been made to improve the robustness of PCA. However, many of the PCA variants developed in this direction have weaknesses. First, few of them pay attention to the 2D structure of the error matrix. Second, estimating the data mean by averaging a sample set that contains outliers is usually biased. Third, if some elements of a sample are corrupted, extracting principal components (PCs) by directly projecting the data with the transformation matrix maps the sample away from its genuine location in the low-dimensional feature subspace. To alleviate these problems, we present a novel robust method, called nuclear norm-based PCA (N-PCA), which takes full advantage of the structural information in the error image. Meanwhile, it is developed under a novel unified framework of PCA in which the data mean and the low-dimensional representation of a sample are both treated as unknown variables in a single model, together with the projection matrix, thereby remedying the bias in estimating the data mean and in mapping a sample to the feature subspace. To solve N-PCA, we propose an iterative algorithm that has a closed-form solution in each iteration. Experimental results on several open databases demonstrate the effectiveness of the proposed method.
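The abstract does not spell out the optimization details, but iterative nuclear norm minimization over a 2D error matrix typically relies on singular value thresholding (SVT) as the per-iteration closed-form step. The following Python sketch shows only that generic SVT operator as an illustration of why such iterations admit closed-form solutions; it is not taken from the paper and is not the authors' N-PCA algorithm.

```python
import numpy as np

def svt(E, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm.

    Returns argmin_Z 0.5 * ||Z - E||_F^2 + tau * ||Z||_*,
    whose closed form shrinks the singular values of E by tau.
    """
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold singular values
    return (U * s_shrunk) @ Vt

# Usage example on a synthetic low-rank "error image" plus noise
# (the data, sizes, and tau below are arbitrary illustrative choices).
rng = np.random.default_rng(0)
low_rank = rng.standard_normal((32, 4)) @ rng.standard_normal((4, 32))
noisy = low_rank + 0.1 * rng.standard_normal((32, 32))
denoised = svt(noisy, tau=1.0)
print(np.linalg.matrix_rank(noisy), np.linalg.matrix_rank(denoised))
```

Because the 2D error matrix of an image tends to be (approximately) low rank when only a few rows or columns are disturbed, thresholding its singular values suppresses such structured corruption, which is the intuition behind using the nuclear norm rather than an element-wise norm on the error.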

Keywords: Low-dimensional representation; Nuclear norm; Optimal mean; Principal component analysis (PCA); Robustness.

MeSH terms

  • Algorithms
  • Databases, Factual / trends
  • Humans
  • Machine Learning* / trends
  • Pattern Recognition, Visual
  • Principal Component Analysis / methods*