Synthetic CT generation from weakly paired MR images using cycle-consistent GAN for MR-guided radiotherapy

Biomed Eng Lett. 2021 Jun 19;11(3):263-271. doi: 10.1007/s13534-021-00195-8. eCollection 2021 Aug.

Abstract

Although MR-guided radiotherapy (MRgRT) is advancing rapidly, generating accurate synthetic CT (sCT) from MRI remains challenging. Previous approaches using deep neural networks require a large dataset of precisely co-registered CT and MRI pairs, which are difficult to obtain because of respiration and peristalsis. Here, we propose a method to generate sCT by training a deep network on weakly paired CT and MR images acquired from an MRgRT system, using a cycle-consistent GAN (CycleGAN) framework that allows unpaired image-to-image translation in the abdomen and thorax. Data from 90 cancer patients who underwent MRgRT were retrospectively used. CT images of the patients were aligned to the corresponding MR images using deformable registration, and the deformed CT (dCT) and MRI pairs were used for network training and testing. A 2.5D CycleGAN was constructed to generate sCT from the MRI input. To improve sCT generation performance, a perceptual loss that measures the discrepancy between high-dimensional representations of images extracted from a well-trained classifier was incorporated into the CycleGAN. The CycleGAN with perceptual loss outperformed the U-net in terms of errors and similarities between sCT and dCT, and in dose estimation for treatment planning of the thorax and abdomen. The sCT generated using CycleGAN produced dose distribution maps and dose-volume histograms virtually identical to those of dCT. CycleGAN with perceptual loss outperformed U-net in sCT generation when trained with weakly paired dCT-MRI for MRgRT. The proposed method will be useful for increasing the treatment accuracy of MR-only or MR-guided adaptive radiotherapy.
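The perceptual loss mentioned above compares high-level feature representations of the generated sCT and the reference dCT, extracted by a pretrained classifier, rather than raw pixel intensities. The following is a minimal PyTorch sketch of such a term; the choice of VGG16 as the feature extractor, the layer cut-off, and the way it is weighted against the adversarial and cycle-consistency terms are illustrative assumptions and not details given in the abstract.

    import torch
    import torch.nn as nn
    from torchvision import models

    class PerceptualLoss(nn.Module):
        """L1 distance between feature maps of a frozen, pretrained classifier.

        Using VGG16 features up to relu3_3 is an assumption for illustration;
        the paper only states that features come from a well-trained classifier.
        """
        def __init__(self, feature_layer: int = 16):
            super().__init__()
            vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features
            self.extractor = nn.Sequential(*list(vgg.children())[:feature_layer]).eval()
            for p in self.extractor.parameters():
                p.requires_grad = False  # feature extractor stays fixed during training
            self.criterion = nn.L1Loss()

        def forward(self, generated: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
            # VGG expects 3-channel input; replicate single-channel CT/MR slices.
            if generated.shape[1] == 1:
                generated = generated.repeat(1, 3, 1, 1)
                target = target.repeat(1, 3, 1, 1)
            return self.criterion(self.extractor(generated), self.extractor(target))

    # Hypothetical combination with the standard CycleGAN objective
    # (lambda weights are placeholders, not values from the paper):
    # total_loss = adv_loss + lambda_cyc * cycle_loss + lambda_perc * PerceptualLoss()(sct, dct)

In this sketch the perceptual term is simply added to the usual CycleGAN adversarial and cycle-consistency losses, which is one common way such a loss can be incorporated when weakly paired dCT-MRI slices are available.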

Supplementary information: The online version contains supplementary material available at 10.1007/s13534-021-00195-8.