Automated detection of COVID-19 through convolutional neural network using chest x-ray images

PLoS One. 2022 Jan 21;17(1):e0262052. doi: 10.1371/journal.pone.0262052. eCollection 2022.

Abstract

The COVID-19 pandemic has had a catastrophic impact on global well-being and public health, with more than 27 million confirmed cases reported worldwide to date. Given the growing number of confirmed cases and the challenges posed by variants of the virus, timely and accurate classification of healthy and infected patients is essential to control and treat COVID-19. We aim to develop a deep learning-based system for the accurate classification and reliable detection of COVID-19 using chest radiography. First, we evaluate the performance of various state-of-the-art convolutional neural networks (CNNs) proposed in recent years for medical image classification. Second, we develop and train a CNN from scratch. In both cases, we use a public chest X-ray dataset for training and validation. With transfer learning, we obtain 100% accuracy for binary classification (Normal/COVID-19) and 87.50% accuracy for three-class classification (Normal/COVID-19/Pneumonia). With the CNN trained from scratch, we achieve 93.75% accuracy for three-class classification. For transfer learning, the classification accuracy drops as the number of classes increases. The results are supported by comprehensive receiver operating characteristic (ROC) and confusion matrix analysis with 10-fold cross-validation.
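
The transfer-learning pipeline summarized above can be sketched in code. Because the abstract does not name the framework, backbone network, or dataset layout, the snippet below is only a minimal illustration, assuming TensorFlow/Keras, an ImageNet-pretrained ResNet50 backbone, and a chest X-ray dataset organized into one subfolder per class; the paths and hyperparameters are hypothetical, not the authors' actual setup.

    # Minimal sketch of transfer learning for binary (Normal/COVID-19)
    # chest X-ray classification. Framework, backbone, paths, and
    # hyperparameters are illustrative assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers

    IMG_SIZE = (224, 224)            # assumed input resolution
    DATA_DIR = "chest_xray/train"    # hypothetical folder with one subfolder per class

    # Load images from class-labeled subfolders, holding out 20% for validation.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        DATA_DIR, validation_split=0.2, subset="training", seed=42,
        image_size=IMG_SIZE, batch_size=32, label_mode="binary")
    val_ds = tf.keras.utils.image_dataset_from_directory(
        DATA_DIR, validation_split=0.2, subset="validation", seed=42,
        image_size=IMG_SIZE, batch_size=32, label_mode="binary")

    # ImageNet-pretrained backbone with its classifier removed and weights frozen.
    base = tf.keras.applications.ResNet50(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    base.trainable = False

    inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.resnet50.preprocess_input(inputs)
    x = base(x, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # Normal vs. COVID-19
    model = tf.keras.Model(inputs, outputs)

    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])

    # Train only the new classification head on top of the frozen backbone.
    model.fit(train_ds, validation_data=val_ds, epochs=10)

For the three-class case (Normal/COVID-19/Pneumonia), the head would instead use a three-unit softmax layer with categorical cross-entropy; ROC curves and confusion matrices can then be computed from held-out predictions, as in the evaluation described above.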

MeSH terms

  • COVID-19 / diagnostic imaging*
  • COVID-19 / pathology
  • COVID-19 / virology
  • Case-Control Studies
  • Databases, Factual
  • Deep Learning*
  • Diagnosis, Differential
  • Female
  • Humans
  • Image Interpretation, Computer-Assisted / methods*
  • Male
  • Pneumonia, Bacterial / diagnostic imaging*
  • Pneumonia, Bacterial / pathology
  • Pneumonia, Bacterial / virology
  • ROC Curve
  • Radiography, Thoracic
  • SARS-CoV-2 / pathogenicity

Grants and funding

The authors received no specific funding for this work.