Bandpass effects in time-resolved diffuse spectroscopy

Appl Spectrosc. 2009 Jan;63(1):48-56. doi: 10.1366/000370209787169795.

Abstract

This paper discusses the spectral distortions that occur when time-resolved spectroscopy of diffusive media is performed with a wide illumination bandpass. It is shown that the spectral region within the bandpass exhibiting the lowest absorption dominates the resulting time-resolved curve, leading to a significant underestimation of absorption as well as distortions of the spectral shape, including shifts in peak positions. Because of the nonlinear dependence of the detected signal on absorption, the effect becomes even more pronounced as longer and longer photon path lengths are included. A theoretical treatment of the problem is given first; the distortion is then characterized by time-resolved reflectance simulations and by experimental measurements on lipid and water samples. A spectrally constrained data analysis that takes the spectrum of the light injected into the sample into account is proposed to overcome the distortion and to improve the accuracy with which chromophore concentrations are estimated from absorption spectra. Measurements on a lipid sample show a reduction of the error from 30% to 6%.
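The dominance of the least-absorbed wavelengths can be illustrated with a minimal numerical sketch (not taken from the paper; all parameters below are illustrative assumptions): an absorption-free, diffusion-like temporal shape is shared across a hypothetical bandpass over which the absorption coefficient varies linearly, the bandpass-averaged distribution of times of flight is the source-spectrum-weighted sum of the monochromatic curves, and the absorption recovered from its late-time slope falls well below the band-centre value, approaching the minimum absorption within the band.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper)
v = 0.214                                  # speed of light in a tissue-like medium, mm/ps
rho, D = 30.0, 0.33                        # source-detector distance (mm), diffusion coefficient (mm)
t = np.linspace(500.0, 4000.0, 200)        # photon times of flight, ps

# Hypothetical wide bandpass with linearly varying absorption (mm^-1)
lam = np.linspace(900.0, 1000.0, 101)      # wavelengths, nm
mu_a = np.interp(lam, [900.0, 1000.0], [0.002, 0.02])
S = np.ones_like(lam) / lam.size           # flat source spectrum over the band

# Absorption-free temporal shape (diffusion-like, taken identical across the band
# for simplicity); the microscopic Beer-Lambert factor exp(-mu_a * v * t) carries
# the wavelength dependence.
R0 = t ** -2.5 * np.exp(-rho ** 2 / (4.0 * D * v * t))

# Monochromatic curve at the band-centre absorption vs. bandpass-averaged curve
mu_a_centre = mu_a.mean()
R_mono = R0 * np.exp(-mu_a_centre * v * t)
R_band = R0 * (S[:, None] * np.exp(-np.outer(mu_a, v * t))).sum(axis=0)

def late_mu_a(R, R0, t, v, tail=50):
    """Effective absorption from the late-time slope: ln(R/R0) ~ -mu_a_eff * v * t."""
    p = np.polyfit(t[-tail:], np.log(R[-tail:] / R0[-tail:]), 1)
    return -p[0] / v

print(f"band-centre mu_a       : {mu_a_centre:.4f} mm^-1")
print(f"monochromatic estimate : {late_mu_a(R_mono, R0, t, v):.4f} mm^-1")
print(f"bandpass estimate      : {late_mu_a(R_band, R0, t, v):.4f} mm^-1  (biased toward the band minimum)")
```

In this sketch the monochromatic estimate recovers the band-centre absorption exactly, whereas the bandpass-averaged curve yields a much lower value, consistent with the underestimation described in the abstract; weighting the analysis by the injected spectrum S, as the proposed spectrally constrained approach does, is what removes this bias.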