Diffusion tensor imaging (DTI) is a modality known to be highly sensitive to the detrimental effects of experimental noise. Here, using Monte Carlo simulations, we compare and contrast how noise complicates the measurement of diffusion anisotropy in diffusion tensor imaging and conventional diffusion-weighted imaging (DWI). As the signal-to-noise ratio (SNR) decreases below approximately 20, the eigenvalues (λ_i) of the diffusion tensor D diverge rapidly from their true values, with the result that the measured anisotropy can be significantly in error and isotropic structures can be falsely assigned a high degree of anisotropy. The effect of noise on the rotationally variant indices calculated from a conventional DWI experiment is much less severe, because the apparent diffusion coefficients (ADCs) diverge only slowly as the SNR decreases. Thus, although rotationally variant indices almost always underestimate the true diffusion anisotropy, they are only weakly susceptible to experimental noise and hence are preferred to their rotationally invariant counterparts when the SNR is small.
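The eigenvalue bias described above can be illustrated with a minimal Monte Carlo sketch: a truly isotropic voxel is simulated with Rician (magnitude) noise, a tensor is fitted by linear least squares, and the mean fractional anisotropy (FA) over many trials comes out clearly nonzero. All parameter values (b-value, ADC, SNR, gradient scheme) are illustrative assumptions, not the settings used in the study itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative parameters (not the paper's settings).
S0, b, d = 1.0, 1000.0, 1e-3   # base signal, b-value (s/mm^2), isotropic ADC (mm^2/s)
snr = 10.0                      # an SNR below the ~20 threshold discussed above
sigma = S0 / snr
n_trials = 2000

# Six non-collinear gradient directions (a common minimal DTI scheme).
dirs = np.array([[1, 1, 0], [1, -1, 0], [1, 0, 1],
                 [1, 0, -1], [0, 1, 1], [0, 1, -1]], float)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

# Design matrix mapping the 6 unique tensor elements to log-signal decay.
gx, gy, gz = dirs.T
B = b * np.column_stack([gx**2, gy**2, gz**2, 2*gx*gy, 2*gx*gz, 2*gy*gz])

def rician(s):
    """Magnitude of a complex signal with Gaussian noise in each channel."""
    return np.hypot(s + rng.normal(0, sigma, s.shape),
                    rng.normal(0, sigma, s.shape))

fa_vals = []
for _ in range(n_trials):
    s = rician(S0 * np.exp(-b * d) * np.ones(6))   # noisy diffusion-weighted signals
    s0 = rician(np.array([S0]))                     # noisy b=0 signal
    # Linear least-squares fit of the tensor elements from log-attenuations.
    dvec = np.linalg.lstsq(B, np.log(s0 / s), rcond=None)[0]
    D = np.array([[dvec[0], dvec[3], dvec[4]],
                  [dvec[3], dvec[1], dvec[5]],
                  [dvec[4], dvec[5], dvec[2]]])
    lam = np.linalg.eigvalsh(D)
    md = lam.mean()
    fa = np.sqrt(1.5 * np.sum((lam - md)**2) / np.sum(lam**2))
    fa_vals.append(fa)

# For a noiseless isotropic voxel FA would be exactly zero; noise inflates it.
print(f"mean FA of a truly isotropic voxel at SNR={snr:.0f}: {np.mean(fa_vals):.3f}")
```

Because FA is built from squared eigenvalue deviations, any noise-driven spread of the λ_i inflates it in one direction only, which is why the bias appears as spurious anisotropy rather than averaging away over trials.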