Volumetric laser endomicroscopy (VLE) is a balloon-based technique that provides a circumferential, near-microscopic scan of the esophageal wall layers and has the potential to improve detection of Barrett's neoplasia. However, interpretation of VLE imagery in Barrett's esophagus (BE) is time-consuming and complex, owing to the large amount of visual information and the subtle gray-shaded appearance of VLE images. Computer-aided detection (CAD) that analyzes multiple neighboring VLE frames might improve BE neoplasia detection compared with automated single-frame analysis. This study evaluates the feasibility of automatic data extraction followed by CAD using a multiframe approach for detection of BE neoplasia. Prospectively collected ex-vivo VLE images from 29 BE patients with and without early neoplasia were retrospectively analyzed. Sixty histopathology-correlated regions of interest (30 nondysplastic vs. 30 neoplastic) were assessed using different CAD systems. Multiple neighboring VLE frames, spanning 1.25 mm proximal and distal to each region of interest, were evaluated. In total, 3060 VLE frames were analyzed via the CAD multiframe analysis. Multiframe analysis yielded a significantly higher median AUC (0.91) than single-frame analysis (0.83), with a median difference of 0.08 (95% CI, 0.06-0.10; P < 0.001). A maximum AUC of 0.94 was reached when 22 frames on each side were included in the multiframe approach. All 3060 VLE frames were automatically extracted and analyzed by CAD in 3.9 seconds. Multiframe VLE image analysis thus improves BE neoplasia detection compared with single-frame analysis, and CAD with multiframe analysis allows fast and accurate VLE interpretation, demonstrating the feasibility of automatic full-scan assessment in a real-time setting during endoscopy.
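The abstract does not specify how per-frame CAD outputs are fused across neighboring frames or how the AUC comparison is computed. As a minimal illustrative sketch only, assuming the single-frame classifier produces one neoplasia score per VLE frame and that the multiframe approach simply mean-pools scores over a symmetric window of k frames on each side (the actual fusion rule used in the study may differ), the two steps could look like:

```python
def multiframe_score(frame_scores, center, k):
    """Hypothetical fusion rule: mean-pool single-frame neoplasia scores
    over a window of k frames on each side of the center frame
    (clipped at the ends of the scan)."""
    lo = max(0, center - k)
    hi = min(len(frame_scores), center + k + 1)
    window = frame_scores[lo:hi]
    return sum(window) / len(window)

def auc(pos_scores, neg_scores):
    """AUC computed as the Mann-Whitney U statistic: the probability that
    a randomly chosen neoplastic ROI scores higher than a randomly chosen
    nondysplastic ROI (ties count as 0.5)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))
```

Under this sketch, "22 frames on each side" corresponds to k = 22, i.e. a 45-frame window per region of interest; averaging over neighbors suppresses frame-level noise, which is one plausible mechanism for the AUC gain over single-frame scoring.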
Keywords: Barrett's neoplasia; computer-aided detection; volumetric laser endomicroscopy.
© The Author(s) 2019. Published by Oxford University Press on behalf of International Society for Diseases of the Esophagus. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.