Artificial intelligence in fracture detection: transfer learning from deep convolutional neural networks

Clin Radiol. 2018 May;73(5):439-445. doi: 10.1016/j.crad.2017.11.015. Epub 2017 Dec 18.

Abstract

Aim: To identify the extent to which transfer learning from deep convolutional neural networks (CNNs), pre-trained on non-medical images, can be used for automated fracture detection on plain radiographs.

Materials and methods: The top layer of the Inception v3 network was re-trained using lateral wrist radiographs to produce a model for the classification of new studies as either "fracture" or "no fracture". The model was trained on a total of 11,112 images, produced by applying an eightfold data augmentation technique to an initial set of 1,389 radiographs (695 "fracture" and 694 "no fracture"). This data set was split 80:10:10 into training, validation, and test groups, respectively. An additional 100 wrist radiographs, comprising 50 "fracture" and 50 "no fracture" images, were used for final testing and statistical analysis.
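
For illustration, a minimal sketch of this kind of transfer-learning setup in Keras/TensorFlow is shown below. The library calls, augmentation parameters, and directory layout ("radiographs/train") are assumptions made for the example and are not taken from the study itself.

    # Illustrative sketch only (not the authors' code): re-train a new top layer
    # on Inception v3 pre-trained on non-medical (ImageNet) images.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Load Inception v3 without its original classification head.
    base = tf.keras.applications.InceptionV3(
        weights="imagenet", include_top=False,
        input_shape=(299, 299, 3), pooling="avg",
    )
    base.trainable = False  # freeze the convolutional layers; only the new top layer is trained

    # New binary head: "fracture" vs "no fracture".
    model = models.Sequential([
        base,
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC()])

    # Data augmentation in the spirit of the eightfold expansion described above;
    # the exact transformations used in the study may differ.
    train_gen = tf.keras.preprocessing.image.ImageDataGenerator(
        preprocessing_function=tf.keras.applications.inception_v3.preprocess_input,
        rotation_range=10, horizontal_flip=True, zoom_range=0.1,
    ).flow_from_directory("radiographs/train", target_size=(299, 299),
                          class_mode="binary")

    model.fit(train_gen, epochs=10)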

Results: The area under the receiver operating characteristic curve (AUC) for this test was 0.954. Setting the diagnostic cut-off at a threshold designed to maximise both sensitivity and specificity gave a sensitivity of 0.90 and a specificity of 0.88.
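
As a worked illustration only, the sketch below shows how such an AUC and a cut-off maximising sensitivity plus specificity (Youden's J) could be computed with scikit-learn; the labels and scores are synthetic stand-ins, not the study's data.

    # Hypothetical stand-in for a 100-image test set (50 fracture, 50 no fracture).
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    y_true = np.concatenate([np.ones(50), np.zeros(50)])          # 1 = fracture, 0 = no fracture
    y_score = np.clip(y_true * 0.6 + rng.normal(0.2, 0.2, 100), 0, 1)  # fake model probabilities

    auc = roc_auc_score(y_true, y_score)
    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    j = np.argmax(tpr - fpr)            # index maximising Youden's J = sensitivity + specificity - 1
    cutoff = thresholds[j]
    sensitivity, specificity = tpr[j], 1 - fpr[j]
    print(f"AUC={auc:.3f}, cut-off={cutoff:.2f}, sens={sensitivity:.2f}, spec={specificity:.2f}")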

Conclusion: The AUC score for this test was comparable to the state of the art, providing proof of concept for transfer learning from CNNs in fracture detection on plain radiographs, and this was achieved using only a moderate sample size. The technique is largely transferable and therefore has many potential applications in medical imaging, which may lead to significant improvements in workflow productivity and in clinical risk reduction.

MeSH terms

  • Artificial Intelligence*
  • Deep Learning
  • Diagnosis, Differential
  • Fractures, Bone / diagnostic imaging*
  • Humans
  • Machine Learning
  • Neural Networks, Computer
  • Radiographic Image Interpretation, Computer-Assisted
  • Sensitivity and Specificity