2.5D and 3D segmentation of brain metastases with deep learning on multinational MRI data

Front Neuroinform. 2023 Jan 18;16:1056068. doi: 10.3389/fninf.2022.1056068. eCollection 2022.


Introduction: Management of patients with brain metastases is often based on manual lesion detection and segmentation by an expert reader. This is a time- and labor-intensive process, and to that end, this work proposes an end-to-end deep learning segmentation network that accommodates a varying number of available MRI sequences.

Methods: We adapted and evaluated a 2.5D and a 3D convolutional neural network, trained and tested on a retrospective multinational study from two independent centers; in addition, nnU-Net was adapted as a comparative benchmark. Segmentation and detection performance was evaluated by: (1) the Dice similarity coefficient, (2) the per-metastasis and average detection sensitivity, and (3) the number of false positives.
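The evaluation metrics above can be sketched in code. The following is a minimal illustration (not the authors' implementation), assuming binary prediction and ground-truth masks as NumPy arrays, with connected components standing in for individual metastases: a ground-truth lesion counts as detected if any predicted voxel overlaps it, and a predicted component with no ground-truth overlap counts as a false positive. The overlap-based detection criterion is an assumption for illustration; the paper does not specify its exact matching rule here.

```python
import numpy as np
from scipy import ndimage


def dice_coefficient(pred, gt):
    """Dice similarity coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0


def lesionwise_metrics(pred, gt):
    """Per-lesion detection sensitivity and false-positive count.

    Lesions are approximated by connected components; a ground-truth
    lesion is 'detected' if any predicted voxel overlaps it, and a
    predicted component with no ground-truth overlap is a false positive.
    """
    gt_labels, n_gt = ndimage.label(gt)
    pred_labels, n_pred = ndimage.label(pred)
    detected = sum(
        1 for i in range(1, n_gt + 1) if np.any(pred[gt_labels == i])
    )
    false_positives = sum(
        1 for j in range(1, n_pred + 1) if not np.any(gt[pred_labels == j])
    )
    sensitivity = detected / n_gt if n_gt > 0 else 1.0
    return sensitivity, false_positives
```

Averaging `sensitivity` and `false_positives` over patients gives the per-patient figures reported in the Results section.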

Results: The 2.5D and 3D models achieved similar results, albeit the 2.5D model had a higher detection rate, the 3D model had fewer false positive predictions, and nnU-Net had the fewest false positives but the lowest detection rate. On MRI data from center 1, the 2.5D, 3D, and nnU-Net models detected 79%, 71%, and 65% of all metastases; had an average per-patient sensitivity of 0.88, 0.84, and 0.76; and had on average 6.2, 3.2, and 1.7 false positive predictions per patient, respectively. For center 2, the 2.5D, 3D, and nnU-Net models detected 88%, 86%, and 78% of all metastases; had an average per-patient sensitivity of 0.92, 0.91, and 0.85; and had on average 1.0, 0.4, and 0.1 false positive predictions per patient, respectively.

Discussion/conclusion: Our results show that deep learning can yield highly accurate segmentations of brain metastases with few false positives in multinational data, but accuracy degrades for metastases with an area smaller than 0.4 cm².

Keywords: 2.5D; 3D; MRI; brain metastases; deep learning; segmentation.

Grant support

The project was supported by the Norwegian South-Eastern Health Authority (grant numbers 2021031, 2016102, 2017073, and 2013069), the Research Council of Norway (grant numbers 261984 and 325971), the Norwegian Cancer Society (grant numbers 6817564 and 3434180), and the European Research Council (grant number 758657-ImPRESS).