Using Explainable Artificial Intelligence Models (ML) to Predict Suspected Diagnoses as Clinical Decision Support

Stud Health Technol Inform. 2022 May 25;294:573-574. doi: 10.3233/SHTI220529.

Abstract

The complexity of emergency cases and the number of emergency patients have increased dramatically. Because specialist medical staff in emergency departments (EDs) are reduced or entirely absent, diagnoses are often made without specialist supervision. The consequences are misdiagnosis and mistreatment, in the worst case death, as well as considerable time expenditure and high costs. Using patient data from the German national emergency department registry (AKTIN registry, aktin.org), the 20 most frequent diagnoses were selected for building explainable artificial intelligence (XAI) models as part of the ENSURE project (ENSURE, umg.eu). 137,152 samples with 51 features (vital signs and symptoms) were analyzed. The XAI models achieved a mean one-vs-rest area under the curve (AUC) of 0.98 for logistic regression (LR) and 0.99 for the random forest (RF), and predictive accuracies of 0.927 (LR) and 0.99 (RF). Based on its explainability and performance, the best model will be incorporated into a portable clinical decision support system (CDSS) to improve diagnoses and outcomes of ED treatment and to reduce costs. The CDSS will be tested in a clinical pilot study in the EDs of selected hospitals in Germany.
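To make the evaluation setup concrete, the following is a minimal sketch (not the authors' code) of multi-class diagnosis prediction with logistic regression and a random forest, scored by mean one-vs-rest AUC and accuracy as reported in the abstract. The AKTIN registry data are not public, so a synthetic stand-in with 51 features and 20 diagnosis classes is generated here; all dataset parameters, model settings, and variable names are assumptions for illustration only.

```python
# Hedged sketch: 20-class diagnosis prediction with LR and RF,
# evaluated by mean one-vs-rest AUC and accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic placeholder for the registry extract (51 features, 20 classes);
# the real study used 137,152 ED cases from the AKTIN registry.
X, y = make_classification(
    n_samples=10_000, n_features=51, n_informative=30,
    n_classes=20, n_clusters_per_class=1, random_state=0,
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    proba = model.predict_proba(X_test)                     # shape (n_samples, 20)
    auc = roc_auc_score(y_test, proba, multi_class="ovr")   # mean one-vs-rest AUC
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: OvR AUC = {auc:.3f}, accuracy = {acc:.3f}")

# Explainability hooks: LR coefficients and RF feature importances give
# per-feature contributions that a CDSS could surface to clinicians.
top_rf_features = np.argsort(models["RF"].feature_importances_)[::-1][:5]
print("Top RF feature indices:", top_rf_features)
```

On synthetic data the scores will differ from the reported 0.98/0.99 AUC; the sketch only illustrates the evaluation protocol (one-vs-rest AUC across 20 diagnosis classes) and where model explainability would be inspected.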

Keywords: Clinical Decision Support; Diagnoses Prediction; Emergency Department; Explainable Artificial Intelligence; Machine Learning.

MeSH terms

  • Area Under Curve
  • Artificial Intelligence*
  • Decision Support Systems, Clinical*
  • Humans
  • Logistic Models
  • Pilot Projects