Historically, osteoporosis has been defined as a disease in which there is "too little bone, but what there is, is normal." Owing to limitations of research design and sample selection, published data both contradict and confirm this historical definition. These limitations have made it difficult to assess the contribution of mineral quality to mechanical properties, and to select therapeutic protocols that optimize bone mineral properties. Coupling an optical microscope to an infrared spectrometer enables the acquisition of spectral data at known sites in a histologic section of mineralized tissue without loss of topographic or orientational information. Second-derivative spectroscopy coupled with curve-fitting techniques allows qualitative and quantitative assessment of mineral quality (crystallite size and perfection, mineral:matrix ratio) at well-defined morphologic locations. We have previously applied these techniques to the study of normal human osteonal, cortical, and trabecular bone; the results indicated that newly deposited bone mineral is less "crystalline/mature" than older mineral. In the present study, Fourier transform infrared microspectroscopy (FTIRM) was applied to human osteonal and cortical bone from iliac crest biopsies of untreated osteoporotic patients. The hypothesis tested was that osteoporotic bone mineral is uniformly different in its properties, expressed as "crystallinity/maturity," from normal bone mineral. The results indicate significant differences in mineral properties as expressed by crystal size and perfection, with the mineral from osteoporotic bone being more crystalline/mature than normal.
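The second-derivative/curve-fitting approach mentioned above can be illustrated with a minimal sketch. This is not the authors' actual analysis pipeline: the band positions, widths, and the two-Gaussian model are invented for demonstration. The idea is that minima of the second derivative of an absorbance spectrum mark the centers of overlapping subbands, which then seed a least-squares curve fit that resolves the individual components.

```python
# Illustrative sketch (hypothetical parameters, not the authors' pipeline):
# resolve two overlapping infrared subbands via second-derivative
# spectroscopy followed by Gaussian curve fitting.
import numpy as np
from scipy.optimize import curve_fit
from scipy.signal import argrelmin

def gaussian(x, amp, center, width):
    return amp * np.exp(-((x - center) ** 2) / (2 * width ** 2))

# Synthetic spectral region: two overlapping subbands (invented values)
x = np.linspace(980, 1120, 600)                 # wavenumber axis, cm^-1
true_params = [(1.0, 1020.0, 12.0), (0.8, 1060.0, 14.0)]
spectrum = sum(gaussian(x, *p) for p in true_params)

# Second derivative: its minima mark the underlying subband centers
d2 = np.gradient(np.gradient(spectrum, x), x)
minima = argrelmin(d2, order=10)[0]
minima = minima[np.argsort(d2[minima])[:2]]     # keep the two deepest minima
guesses = np.sort(x[minima])                    # initial center estimates

# Curve-fit a two-Gaussian model seeded with the second-derivative centers
def model(x, a1, c1, w1, a2, c2, w2):
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

p0 = [1.0, guesses[0], 10.0, 1.0, guesses[1], 10.0]
popt, _ = curve_fit(model, x, spectrum, p0=p0)
centers = sorted([popt[1], popt[4]])
print(centers)
```

In practice the fitted subband areas and positions feed into the reported mineral-quality parameters (e.g., crystallinity/maturity and mineral:matrix ratio); the sketch only shows the peak-resolution step.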