Wearable augmented reality (AR) is an emerging technology with enormous potential in the medical field, from training and procedure simulation to image-guided surgery. Medical AR seeks to let surgeons see tissue segmentations in real time, and achieving real-time guidance requires a fast method for both imaging and classification. Hyperspectral imaging (HSI) is a non-contact, optical imaging modality that rapidly acquires hundreds of images of tissue at different wavelengths, from which spectral signatures of the tissue can be derived. Combining HSI data with machine-learning algorithms enables effective tissue classification. In this paper, we constructed a brain tissue phantom with porcine blood, yellow-dyed gelatin, and colorless gelatin to represent blood vessels, tumor, and normal brain tissue, respectively. Using a segmentation algorithm, hundreds of hyperspectral images were processed to classify each pixel, yielding three segmentation labels, one per tissue type. Our system virtually superimposes the HSI channels and segmentation labels of the brain tumor phantom onto the real scene using the HoloLens AR headset. The user can manipulate and interact with the segmentation labels and HSI channels by repositioning, rotating, changing visibility, and switching between them; all actions can be performed through either hand or voice controls. This provides a convenient and multifaceted real-time visualization of brain tissue with minimal restrictions on the user. We demonstrate the feasibility of a fast and practical HSI-AR technique for potential use in image-guided brain surgery.
Keywords: Augmented reality; brain tumor resection; hyperspectral imaging; image-guided surgery.