Objective: Develop technology to predict burn wound depth using a combination of FDA approved ultrasound modalities and interpretation of these images using artificial intelligence (AI).
Summary background data: Physical examination by burn surgeons is the diagnostic gold standard for determining the need for burn surgery. Distinguishing between deep partial-thickness and third-degree burns to determine the need for surgery is the ultimate diagnostic challenge. Reported accuracy for this assessment is 76% for burn experts and 50% for non-experts.
Methods: A pig burn model (n=12) was used to develop the initial AI framework, which was subsequently tested in a nonrandomized prospective study of human subjects with thermal burns (n=30). Images from Tissue Doppler Elastography Imaging (TDI), to measure tissue stiffness, and harmonic B-mode ultrasound, to identify anatomic landmarks, were collected along with digital photographs. Biopsies were obtained from 5 subjects who went to the OR for debridement to serve as ground truth for AI image interpretation. The AI model analyzed both TDI and B-mode images to predict burn depth. AI accuracy and explainability in predicting burn depth were the main outcomes.
Results: The AI algorithm identified third-degree burns in pigs with 100% accuracy. For human subjects, the mean age was 47.6±17.6 years and the mean TBSA was 7.7±8.5%. The AI method achieved 95% accuracy in identifying third-degree burns in humans.
Conclusions: These results indicate that using AI interpretation of B-mode ultrasound and TDI images to improve diagnostic accuracy in predicting burn depth is feasible.
Keywords: AI burn diagnosis; GPT; TDI ultrasound; burn depth prediction; deep learning; explainability; surgical decision-making; vision-language model.
Copyright © 2026 The Author(s). Published by Wolters Kluwer Health, Inc.