Likelihood-based Selection and Sharp Parameter Estimation

Xiaotong Shen et al. J Am Stat Assoc. 107(497):223-232.

In high-dimensional data analysis, feature selection becomes one means of dimension reduction, which precedes parameter estimation. Concerning accuracy of selection and estimation, we study nonconvex constrained and regularized likelihoods in the presence of nuisance parameters. Theoretically, we show that the constrained L0-likelihood and its computational surrogate are optimal in that they achieve feature selection consistency and sharp parameter estimation, under one necessary condition required for any method to be selection consistent and to achieve sharp parameter estimation. This permits up to exponentially many candidate features. Computationally, we develop difference convex methods to implement the computational surrogate through primal and dual subproblems. These results establish a central role for L0-constrained and regularized likelihoods in feature selection and in parameter estimation involving selection. As applications of the general method and theory, we perform feature selection in linear regression and logistic regression, and estimate a precision matrix in Gaussian graphical models. In these situations, we gain new theoretical insights and obtain favorable numerical results. Finally, we discuss an application to predicting the metastasis status of breast cancer patients from their gene expression profiles.
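The difference convex (DC) approach described above can be sketched numerically: each outer step linearizes the concave part of the truncated-L1 penalty, leaving a weighted-L1 (Lasso-like) subproblem. The sketch below uses least-squares loss and proximal gradient for the inner solve; all function and parameter names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def dc_truncated_l1(X, y, lam=0.1, tau=1.0, n_outer=10, n_inner=200):
    """Sketch of a DC algorithm for least squares with a truncated-L1 penalty.

    Each outer iteration linearizes the concave part of the penalty
    J_tau(|b|) = min(|b|/tau, 1), which yields a weighted-L1 subproblem:
    coefficients with |b_j| > tau become unpenalized, the rest keep
    weight lam/tau. The subproblem is solved by proximal gradient (ISTA).
    """
    n, p = X.shape
    # step size from the Lipschitz constant of the gradient of (1/2n)||y - Xb||^2
    lr = n / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_outer):
        # DC linearization: penalty stays active only where |beta_j| <= tau
        w = (np.abs(beta) <= tau).astype(float) * lam / tau
        for _ in range(n_inner):
            grad = X.T @ (X @ beta - y) / n
            z = beta - lr * grad
            # soft-thresholding step with per-coordinate weights
            beta = np.sign(z) * np.maximum(np.abs(z) - lr * w, 0.0)
    return beta
```

In this sketch the nonconvexity enters only through the re-weighting between outer iterations, so large coefficients escape shrinkage, which is the mechanism behind the sharper estimation the abstract refers to.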


Figure 1
Truncated L1 function Jτ(|βj|) with τ = 1 in (a), and its DC decomposition into a difference of two convex functions JL1 and JT,2 in (b).
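The penalty in Figure 1 and its decomposition into a difference of two convex functions can be written down directly. A minimal sketch, assuming the common convention Jτ(z) = min(z/τ, 1) for z ≥ 0 (the function names below are illustrative):

```python
import numpy as np

def j_trunc(z, tau=1.0):
    # Truncated L1 penalty: J_tau(|beta|) = min(|beta| / tau, 1)
    return np.minimum(np.abs(z) / tau, 1.0)

def j_convex_parts(z, tau=1.0):
    # DC decomposition J_tau = S1 - S2, with both parts convex:
    s1 = np.abs(z) / tau                         # scaled L1 (convex)
    s2 = np.maximum(np.abs(z) / tau - 1.0, 0.0)  # hinge (convex)
    return s1, s2

beta = np.linspace(-3, 3, 7)
s1, s2 = j_convex_parts(beta)
assert np.allclose(j_trunc(beta), s1 - s2)  # decomposition is exact
```

Because both pieces are convex, standard DC programming applies: the second piece is replaced by its affine minorant at the current iterate, leaving a convex subproblem.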
