A theory is presented for describing the effect of microscopic spatial inhomogeneities in the static magnetic field on the transverse NMR relaxation rate. The theory applies when the inhomogeneities are weak in magnitude and the nuclear spins diffuse a distance that is significant in comparison with a length scale characterizing the inhomogeneities. It is shown that the relaxation rate is determined by a temporal correlation function and depends quadratically on the magnitude of the inhomogeneities. For the case of unrestricted diffusion, a simple algebraic approximation for the temporal correlation function is derived. The theory is illustrated by applying it to a model of randomly distributed magnetized spheres. The theory is also used to fit experimental data for the dependence of the relaxation rate on the interecho time for a Carr-Purcell-Meiboom-Gill pulse sequence. The experimental systems considered are in vitro red blood cell suspensions and samples of human gray matter and rat liver. Magn Reson Med 44:144-156, 2000.
Copyright 2000 Wiley-Liss, Inc.
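The quadratic dependence of the relaxation rate on the inhomogeneity magnitude stated in the abstract can be illustrated with a minimal numerical sketch. This is not the paper's model: it assumes a hypothetical one-dimensional sinusoidal field perturbation and free (unrestricted) diffusion, and it uses the accumulated-phase variance as a proxy for the dephasing that drives transverse relaxation. Because the accumulated phase is linear in the field amplitude, its variance scales with the square of the amplitude.

```python
import numpy as np

def phase_variance(b, n_spins=2000, n_steps=400, dt=1e-3, D=1.0, L=1.0):
    """Variance of the phase accumulated by spins diffusing through a
    hypothetical offset field delta_omega(x) = b * sin(2*pi*x/L)
    (the gyromagnetic ratio is absorbed into the amplitude b).
    A fixed seed gives identical trajectories for every b, so the
    quadratic scaling appears exactly rather than only statistically."""
    rng = np.random.default_rng(0)
    x = np.zeros(n_spins)          # spin positions (1-D random walk)
    phase = np.zeros(n_spins)      # accumulated precession phase
    step = np.sqrt(2 * D * dt)     # RMS displacement per time step
    for _ in range(n_steps):
        x += step * rng.standard_normal(n_spins)
        phase += b * np.sin(2 * np.pi * x / L) * dt
    return phase.var()

v1 = phase_variance(1.0)
v2 = phase_variance(2.0)
print(v2 / v1)  # -> 4.0: doubling the field magnitude quadruples the variance
```

Since the dephasing rate is governed by this variance, the sketch reproduces the quadratic dependence of the relaxation rate on the inhomogeneity magnitude, though the actual rate in the theory follows from the temporal correlation function of the field seen by a diffusing spin.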