Dynamic nuclear medicine studies can generate large quantities of data, and their analysis consists essentially of reducing these data to a small number of relevant parameters that assist in clinical decision making. This review examines some of the mathematical techniques that have been used in this process of data reduction and attempts to explain the principles behind their application. In particular, it identifies the techniques that have stood the test of time and demonstrated their usefulness, many of which are now available as standard tools on nuclear medicine processing computers. These include curve-processing tools such as smoothing, fitting and factor analysis, as well as tools based on empirical models, such as the Patlak/Rutland plot and deconvolution. Compartmental models and vascular models are also examined, and the review finishes with a summary of some functional and condensed images. It is concluded that an appreciation of the principles and limitations of these mathematical tools is valuable for their correct use and for the interpretation of the results they produce.
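As a concrete illustration of one of the empirical models named above, the standard graphical form of the Patlak/Rutland analysis is sketched here; the notation (R for the organ region, B for the blood input region) is chosen for illustration and is not taken from this review.

% Patlak/Rutland graphical analysis (standard form; illustrative notation).
% R(t): counts in the organ region of interest at time t
% B(t): counts in a blood (input) region at time t
\[
  \frac{R(t)}{B(t)} \;=\; K \,\frac{\int_0^{t} B(\tau)\,d\tau}{B(t)} \;+\; V
\]
% Plotting R(t)/B(t) against \int_0^t B(\tau)\,d\tau / B(t) yields an
% approximately straight line during the uptake phase: the slope K
% estimates the uptake (transfer) constant and the intercept V reflects
% the vascular/distribution volume within the region.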