Impact of the sensitivity factor on the signal-to-noise ratio in grating-based phase contrast imaging

Proc SPIE Int Soc Opt Eng. 2019 Feb;10948:109481Q. doi: 10.1117/12.2512251. Epub 2019 Mar 1.

Abstract

The sensitivity factor of a grating-based x-ray differential phase contrast (DPC) imaging system determines how much fringe shift is observed for a given refraction angle. It is commonly believed that increasing the sensitivity factor will improve the signal-to-noise ratio (SNR) of the phase signal. However, this may not always be the case once the intrinsic phase wrapping effect is taken into consideration. In this work, a theoretical derivation is provided to quantify the relationship between the sensitivity factor and the SNR for a given refraction angle, exposure level, and grating-based x-ray DPC system. The derivation shows that the expected phase signal is not always proportional to the sensitivity factor and may even decrease when the sensitivity factor becomes too large. Likewise, the noise variance of the signal is not always determined solely by the exposure level and fringe visibility, but may become signal-dependent under certain circumstances. As a result, the SNR of the phase signal does not always increase with higher sensitivity. Numerical simulation studies were performed to validate the theoretical models. Results show that when the fringe visibility and exposure level are fixed, there exists an optimal sensitivity factor that maximizes the SNR for a given refraction angle; increasing the sensitivity factor beyond this optimum decreases the SNR.
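The abstract does not spell out the underlying model, but the wrapping mechanism it describes can be illustrated with a small Monte Carlo sketch. The sketch below assumes the standard phase-stepping acquisition model, in which the fringe shift equals the sensitivity factor times the refraction angle and is retrieved, wrapped into (-pi, pi], from the first Fourier harmonic of the stepping curve; all parameter values (refraction angle, fringe visibility, photon counts, sensitivity range) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_snr(sensitivity, theta=0.5e-6, visibility=0.2,
                 counts=1e4, n_steps=8, n_trials=5000):
    """Monte Carlo estimate of the phase-signal SNR for one sensitivity factor.

    Assumed model: the true fringe shift is phi = sensitivity * theta, and the
    arctangent-based retrieval wraps the estimate into (-pi, pi]. All default
    parameter values are illustrative placeholders.
    """
    phi = sensitivity * theta  # true fringe shift (rad)
    k = np.arange(n_steps)
    # Ideal phase-stepping curve, total exposure split over n_steps frames
    mean_counts = counts / n_steps * (
        1 + visibility * np.cos(2 * np.pi * k / n_steps + phi))
    # Poisson photon noise, one stepping curve per trial
    frames = rng.poisson(mean_counts, size=(n_trials, n_steps))
    # Fourier retrieval: the wrapped phase of the first harmonic
    harmonic = (frames * np.exp(-2j * np.pi * k / n_steps)).sum(axis=1)
    phi_hat = np.angle(harmonic)  # wrapped into (-pi, pi]
    return phi_hat.mean() / phi_hat.std()

for S in [1e5, 1e6, 3e6, 5e6, 6e6, 6.3e6, 8e6]:
    print(f"sensitivity {S:.0e}: SNR = {simulate_snr(S):7.1f}")
```

With these placeholder numbers, the printed SNR first rises roughly in proportion to the sensitivity factor, peaks while the fringe shift sensitivity * theta is still comfortably below pi, and then collapses once trials start wrapping across the (-pi, pi] boundary, which shrinks the mean signal and inflates its variance (and, once fully wrapped, even flips the sign of the estimate); this is the signal-dependent noise behavior described above.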