A novel approach of brain-computer interfacing (BCI) and Grad-CAM based explainable artificial intelligence: Use case scenario for smart healthcare

J Neurosci Methods. 2024 May 7:408:110159. doi: 10.1016/j.jneumeth.2024.110159. Online ahead of print.

Abstract

Background: To push the frontiers of brain-computer interfacing (BCI) and neuroelectronics, this research presents a novel framework that combines cutting-edge technologies for improved brain-related diagnostics in smart healthcare. Drawing on Grad-CAM (Gradient-weighted Class Activation Mapping) based Explainable Artificial Intelligence (XAI), it offers a ground-breaking application of transparent strategies to BCI, promoting openness and confidence in brain-computer interactions. The integration of these technologies stands to reshape healthcare diagnostics, particularly for illnesses of the brain.

New method: This study proposes a novel approach in which the Xception architecture, pre-trained on the ImageNet database and adapted through transfer learning, extracts significant features from a magnetic resonance imaging (MRI) dataset compiled from distinct publicly available sources, and a linear support vector machine (SVM) distinguishes the classes. Gradient-weighted class activation mapping (Grad-CAM) is then deployed as the foundation for explainable artificial intelligence (XAI), generating informative heatmaps that spatially localize the features on which the model's predictions rely.
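The article itself includes no code; the following Python sketch illustrates the described pipeline under stated assumptions: a Keras/scikit-learn environment, and randomly generated stand-ins for the MRI arrays and labels (X_train, y_train, X_test, y_test are hypothetical placeholders, not the authors' data).

# Minimal sketch of the described pipeline: Xception (ImageNet weights) as a
# feature extractor obtained via transfer learning, followed by a linear SVM.
import numpy as np
from tensorflow.keras.applications.xception import Xception, preprocess_input
from sklearn.svm import LinearSVC

# Hypothetical stand-ins for the publicly sourced brain MRI data; replace with
# real (N, 299, 299, 3) image arrays and their tumor/no-tumor labels.
rng = np.random.default_rng(0)
X_train, y_train = rng.uniform(0, 255, (32, 299, 299, 3)), rng.integers(0, 2, 32)
X_test, y_test = rng.uniform(0, 255, (8, 299, 299, 3)), rng.integers(0, 2, 8)

# Xception pre-trained on ImageNet; dropping the top and applying global
# average pooling yields one 2048-dimensional feature vector per image.
backbone = Xception(weights="imagenet", include_top=False, pooling="avg",
                    input_shape=(299, 299, 3))
backbone.trainable = False  # transfer learning: reuse the ImageNet features

def extract_features(images):
    """Map MRI slices to 2048-D Xception feature vectors."""
    return backbone.predict(preprocess_input(images.astype(np.float32)), verbose=0)

# A linear SVM separates the classes in the extracted feature space.
clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(extract_features(X_train), y_train)
print("Held-out accuracy:", clf.score(extract_features(X_test), y_test))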

Results: The proposed model not only produces accurate outcomes but also makes the predictions of the Xception network transparent when diagnosing the presence of abnormal tissue, and it avoids overfitting. Hyperparameters and performance metrics were obtained by validating the proposed network on unseen brain MRI scans to confirm its effectiveness.
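For reference, the reported metrics follow the standard confusion-matrix definitions; the helper below is not from the paper and simply computes them from hypothetical true-positive, false-positive, true-negative, and false-negative counts.

def diagnostic_metrics(tp, fp, tn, fn):
    """Standard definitions of the five reported metrics."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)            # positive predictive value
    sensitivity = tp / (tp + fn)          # recall / true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    dice = 2 * tp / (2 * tp + fp + fn)    # Dice coefficient (equals F1 score)
    return accuracy, precision, sensitivity, specificity, dice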

Comparison with existing methods and conclusions: Integrating Grad-CAM based explainable artificial intelligence with the Xception deep neural network has a significant impact on diagnosing brain tumor disease, as it highlights the specific regions of the input brain MRI images responsible for the predictions. The proposed network achieves 98.92% accuracy, 98.15% precision, 99.09% sensitivity, 98.18% specificity and a 98.91% dice coefficient in identifying abnormal brain tissue. The Xception model, trained on distinct datasets through transfer learning, thus offers remarkable diagnostic accuracy, while the linear support vector machine acts as an efficient classifier among the distinct classes. In addition, the deployed explainable artificial intelligence approach reveals the reasoning behind predictions made by an otherwise black-box deep neural network, providing a clear perspective that helps medical experts achieve trustworthiness and transparency when diagnosing brain tumor disease in smart healthcare.
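As a companion to the pipeline sketch above, the following is a minimal implementation of the standard Grad-CAM algorithm, not the authors' exact code. It assumes a differentiable classification head (here the stock ImageNet head stands in for the trained network), since Grad-CAM needs gradients of a class score with respect to the last convolutional feature maps; block14_sepconv2_act is Xception's final activation layer in Keras.

# Minimal Grad-CAM sketch (standard algorithm, hedged assumptions as above).
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.xception import Xception, preprocess_input

model = Xception(weights="imagenet")  # stand-in for the trained network
last_conv = model.get_layer("block14_sepconv2_act")  # final conv activations
grad_model = tf.keras.Model(model.inputs, [last_conv.output, model.output])

def grad_cam(image, class_index=None):
    """Return a [0, 1] heatmap localizing the evidence for the chosen class."""
    x = preprocess_input(image[np.newaxis].astype(np.float32))
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(x)
        if class_index is None:
            class_index = int(tf.argmax(preds[0]))  # predicted class by default
        score = preds[:, class_index]
    grads = tape.gradient(score, conv_out)        # d(score) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(1, 2))  # global-average-pooled grads
    cam = tf.reduce_sum(weights[:, None, None, :] * conv_out, axis=-1)[0]
    cam = tf.nn.relu(cam)                         # keep positive evidence only
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()

Upsampling the returned map to the input resolution and overlaying it on the MRI slice yields heatmaps of the kind the paper describes.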

Keywords: Automated diagnosis; Brain tumor; Brain-computer interfacing; Deep neural network; Explainable artificial intelligence; Health; Transparency.