Interpretable Deep Models for ICU Outcome Prediction

AMIA Annu Symp Proc. 2017 Feb 10:2016:371-380. eCollection 2016.

Abstract

The exponential surge in health care data, such as longitudinal data from electronic health records (EHR) and sensor data from the intensive care unit (ICU), is providing new opportunities to discover meaningful data-driven characteristics and patterns of diseases. Recently, deep learning models have been employed for many computational phenotyping and healthcare prediction tasks to achieve state-of-the-art performance. However, deep models lack interpretability, which is crucial for wide adoption in medical research and clinical decision-making. In this paper, we introduce a simple yet powerful knowledge-distillation approach called interpretable mimic learning, which uses gradient boosting trees to learn interpretable models while achieving prediction performance comparable to deep learning models. Experimental results on a pediatric ICU dataset for acute lung injury (ALI) show that our proposed method not only outperforms state-of-the-art approaches on mortality and ventilator-free-days prediction tasks but can also provide interpretable models to clinicians.
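The mimic-learning idea described above can be sketched in a few lines: train a deep "teacher" model, then fit a gradient boosting tree "student" on the teacher's soft predictions rather than the raw labels. The sketch below is illustrative only, not the authors' implementation; it uses synthetic data, a small scikit-learn MLP as a stand-in for the deep model, and assumes the scikit-learn API.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic stand-in for ICU features and a binary outcome (e.g. mortality).
rng = np.random.RandomState(0)
X = rng.randn(500, 10)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Step 1: train the deep "teacher" model (a small MLP here for brevity).
teacher = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
teacher.fit(X, y)

# Step 2: use the teacher's soft predictions (probabilities) as regression targets.
soft_targets = teacher.predict_proba(X)[:, 1]

# Step 3: fit an interpretable gradient boosting tree "student" on the soft targets.
student = GradientBoostingRegressor(n_estimators=100, max_depth=3, random_state=0)
student.fit(X, soft_targets)

# Unlike the deep teacher, the student exposes feature importances a clinician
# can inspect; it can also be probed with partial dependence plots.
print(student.feature_importances_.round(2))
```

The key design choice is that the student learns the teacher's continuous probability outputs instead of the hard 0/1 labels, so it inherits the smoother decision function the deep model has learned while remaining a tree ensemble that can be interpreted via feature importances and partial dependence.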

MeSH terms

  • Acute Lung Injury
  • Computer Simulation*
  • Electronic Health Records
  • Humans
  • Intensive Care Units, Pediatric*
  • Machine Learning*
  • Models, Theoretical
  • Neural Networks, Computer*
  • Prognosis