MCP-Net: Inter-frame Motion Correction with Patlak Regularization for Whole-body Dynamic PET

Med Image Comput Comput Assist Interv. 2022 Sep;13434:163-172. doi: 10.1007/978-3-031-16440-8_16. Epub 2022 Sep 16.

Abstract

Inter-frame patient motion introduces spatial misalignment and degrades parametric imaging in whole-body dynamic positron emission tomography (PET). Most existing deep learning approaches to inter-frame motion correction treat it purely as an image registration problem and ignore tracer kinetics. We propose an inter-frame Motion Correction framework with Patlak regularization (MCP-Net) that directly optimizes the Patlak fitting error to further improve model performance. MCP-Net contains three modules: a motion estimation module consisting of a multiple-frame 3-D U-Net with a convolutional long short-term memory (ConvLSTM) layer at the bottleneck; an image warping module that performs spatial transformation; and an analytical Patlak module that performs Patlak fitting on the motion-corrected frames using the individual input function. A Patlak loss penalty based on the mean squared percentage fitting error is added to the loss function alongside the image similarity and displacement gradient terms. Following motion correction, parametric images were generated by standard Patlak analysis. Compared with both traditional and deep learning benchmarks, our network further corrected the residual spatial mismatch in the dynamic frames, improved the spatial alignment of the Patlak Ki/Vb images, and reduced the normalized fitting error. By incorporating tracer dynamics and improving network performance, MCP-Net has the potential to further improve the quantitative accuracy of whole-body dynamic PET. Our code is released at https://github.com/gxq1998/MCP-Net.
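To make the Patlak regularization concrete, the sketch below illustrates voxel-wise Patlak fitting and a mean squared percentage fitting-error term in NumPy. It is a minimal sketch under standard assumptions (an irreversible tracer such as 18F-FDG and frames acquired after the Patlak start time t*); the function and variable names (patlak_fit, patlak_mspe, tissue_tac, input_func, frame_times) are illustrative and do not reproduce the released MCP-Net implementation.

# Minimal sketch of Patlak fitting and a mean squared percentage
# fitting-error term (illustrative; not the MCP-Net code base).
import numpy as np

def patlak_fit(tissue_tac, input_func, frame_times):
    """Estimate Patlak slope Ki and intercept Vb by linear least squares.

    tissue_tac  : (T,) tissue activity C_T(t) for one voxel
    input_func  : (T,) plasma input function C_p(t), assumed nonzero
    frame_times : (T,) frame mid-times (same time units as the input function)
    """
    # Patlak coordinates: x = integral of C_p up to t divided by C_p(t),
    #                     y = C_T(t) / C_p(t)
    cum_input = np.array([np.trapz(input_func[: i + 1], frame_times[: i + 1])
                          for i in range(len(frame_times))])
    x = cum_input / input_func
    y = tissue_tac / input_func

    # Ordinary least squares for the line y = Ki * x + Vb
    A = np.stack([x, np.ones_like(x)], axis=1)
    (ki, vb), *_ = np.linalg.lstsq(A, y, rcond=None)
    return ki, vb, x, y

def patlak_mspe(tissue_tac, input_func, frame_times, eps=1e-6):
    """Mean squared percentage error between the Patlak line and the data."""
    ki, vb, x, y = patlak_fit(tissue_tac, input_func, frame_times)
    y_fit = ki * x + vb
    return np.mean(((y_fit - y) / (y + eps)) ** 2)

In MCP-Net this kind of fitting-error term is computed on the motion-corrected frames and added to the image similarity and displacement gradient losses, so that misaligned frames are penalized not only for poor image agreement but also for deviating from the expected linear Patlak kinetics.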

Keywords: Inter-frame motion correction; Parametric imaging; Tracer kinetics regularization; Whole-body dynamic PET.