Selective inference for effect modification via the lasso

J R Stat Soc Series B Stat Methodol. 2022 Apr;84(2):382-413. doi: 10.1111/rssb.12483. Epub 2021 Dec 14.

Abstract

Effect modification occurs when the effect of the treatment on an outcome varies according to the level of other covariates, and it often has important implications for decision-making. When there are tens or hundreds of covariates, it becomes necessary to use the observed data to select a simpler model for effect modification and then make valid statistical inference. We propose a two-stage procedure to solve this problem. First, we use Robinson's transformation to decouple the nuisance parameters from the treatment effect of interest and use machine learning algorithms to estimate the nuisance parameters. Next, after plugging in the estimates of the nuisance parameters, we use the lasso to choose a low-complexity model for effect modification. Compared to a full model consisting of all the covariates, the selected model is much more interpretable. Compared to univariate subgroup analyses, the selected model greatly reduces the number of false discoveries. We show that the conditional selective inference for the selected model is asymptotically valid given the rate assumptions in classical semiparametric regression. Extensive simulation studies are conducted to verify the asymptotic results, and an epidemiological application is used to demonstrate the method.

Keywords: causal inference; false coverage rate; machine learning; post-selection inference; semiparametric regression.
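The two-stage procedure described in the abstract can be sketched in code. The following is a minimal, illustrative Python example using scikit-learn: stage 1 estimates the nuisance functions (the outcome regression and the propensity score) with cross-fitted random forests, and stage 2 applies Robinson's transformation and runs the lasso on the residualized interaction design to select effect modifiers. All variable names, the simulated data-generating process, and the tuning choices (forest settings, lasso penalty) are assumptions for illustration; the paper's conditional selective inference step for the selected model is not implemented here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))

# Simulated example (not from the paper): only X[:, 0] modifies the effect.
e = 1 / (1 + np.exp(-X[:, 1]))           # true propensity score E[A | X]
A = rng.binomial(1, e)                    # binary treatment
tau = 1.0 + 2.0 * X[:, 0]                 # heterogeneous treatment effect
Y = X[:, 2] + A * tau + rng.normal(size=n)

# Stage 1: estimate nuisances m(x) = E[Y | X] and e(x) = E[A | X]
# with machine learning, using cross-fitting to avoid overfitting bias.
m_hat = cross_val_predict(RandomForestRegressor(random_state=0), X, Y, cv=5)
e_hat = cross_val_predict(RandomForestRegressor(random_state=0), X, A, cv=5)

# Stage 2: Robinson's transformation decouples the nuisances, leaving
# (Y - m_hat) ~ (A - e_hat) * tau(X); the lasso then selects a
# low-complexity model for tau(X).
Ry = Y - m_hat
Ra = A - e_hat
Z = Ra[:, None] * np.column_stack([np.ones(n), X])  # intercept + candidate modifiers
lasso = Lasso(alpha=0.05, fit_intercept=False).fit(Z, Ry)

selected = np.flatnonzero(lasso.coef_[1:])  # indices of selected effect modifiers
print("selected modifiers:", selected)
```

In practice, naively reporting confidence intervals for the selected coefficients would ignore the selection event; the paper's contribution is to show that conditional selective inference after this lasso step remains asymptotically valid under classical semiparametric rate conditions on the nuisance estimates.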