Interpretable deep learning for multicenter gastric cancer T staging from CT images

NPJ Digit Med. 2025 Dec 20;9(1):2. doi: 10.1038/s41746-025-02002-5.

Abstract

Preoperative T staging of gastric cancer is critical for therapeutic stratification, yet conventional contrast-enhanced CT interpretation is subjective and shows inconsistent inter-reader reliability. We developed GTRNet, an interpretable end-to-end deep-learning framework that classifies tumors into stages T1-T4 from routine CT without manual segmentation or annotation. In a retrospective multicenter study of 1792 patients, CT images underwent standardized preprocessing, the largest axial tumor slice was used for training, and performance was then tested in two independent external cohorts. GTRNet achieved high discrimination (AUC 0.86-0.95) and accuracy (81-85%) in the internal and external tests, surpassing radiologists. Grad-CAM heatmaps localized attention to the gastric wall and serosa. By combining a deep-learning rad-score with tumor size, differentiation, and Lauren subtype, we constructed a nomogram with good calibration and higher net clinical benefit than conventional approaches. This automated and interpretable pipeline may standardize CT-based staging and support preoperative decision-making and neoadjuvant-therapy selection.
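The Grad-CAM interpretability step can be illustrated with a minimal PyTorch sketch: gradients of the predicted T-stage score are pooled per channel and used to weight the last convolutional activations, yielding a class-discriminative heatmap. The model, layer sizes, and input here are illustrative stand-ins, not GTRNet itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCNN(nn.Module):
    """Stand-in 4-class (T1-T4) CT-slice classifier; not the actual GTRNet."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):
        acts = self.features(x)          # last conv activations (B, 16, H, W)
        pooled = acts.mean(dim=(2, 3))   # global average pooling
        return self.head(pooled), acts

def grad_cam(model, image, target_class):
    """Weight last-conv activations by channel-pooled gradients of the class score."""
    logits, acts = model(image)
    acts.retain_grad()                   # keep gradients on a non-leaf tensor
    logits[0, target_class].backward()
    weights = acts.grad.mean(dim=(2, 3), keepdim=True)  # per-channel importance
    cam = F.relu((weights * acts).sum(dim=1))           # (B, H, W), non-negative
    cam = cam / (cam.max() + 1e-8)                      # normalize to [0, 1]
    return cam.detach()

model = TinyCNN()
ct_slice = torch.randn(1, 1, 64, 64)     # hypothetical largest axial tumor slice
cam = grad_cam(model, ct_slice, target_class=2)
print(cam.shape)
```

In a real pipeline the heatmap would be upsampled to the CT slice resolution and overlaid on the image to check that attention falls on the gastric wall and serosa, as reported in the study.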