Chromosome aberrations in blood lymphocytes provide a useful measure of past exposure to ionizing radiation. Despite the widespread and successful use of the dicentric assay for retrospective biodosimetry, the approach suffers from substantial drawbacks, including the fact that dicentrics in circulating blood have a rather short half-life (roughly 1-2 years by most estimates). So-called symmetrical aberrations such as translocations are far more stable in that regard, but their high background frequency, which increases with age, makes them less than ideal for biodosimetry as well. We developed a cytogenetic assay for potential use in retrospective biodosimetry that is based on the detection of chromosomal inversions, another symmetrical aberration whose transmissibility (stability) is also ostensibly high. Many of the well-known difficulties associated with inversion detection were circumvented through the use of directional genomic hybridization, a method of molecular cytogenetics that is less labor-intensive and better able to detect small chromosomal inversions than other currently available approaches. Here, we report the dose-dependent induction of inversions following exposure to radiations with vastly different ionization densities [i.e., linear energy transfer (LET)]. Our results show a dramatic dose-dependent difference in the yields of inversions induced by low-LET gamma rays, as compared to more damaging high-LET charged particles similar to those encountered in deep space.
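The contrast between low- and high-LET dose responses described above is conventionally modeled in cytogenetics with a linear-quadratic yield function, Y(D) = c + alpha*D + beta*D^2, where sparsely ionizing (low-LET) radiation shows pronounced upward curvature (the beta term) and densely ionizing (high-LET) radiation is dominated by the linear term. The sketch below illustrates that model; the coefficient values are purely hypothetical placeholders for illustration and are not fitted to the data reported here.

```python
def lq_yield(dose, c, alpha, beta):
    """Linear-quadratic dose-response model for aberration yield per cell:
    Y(D) = c + alpha*D + beta*D**2, with dose D in Gy."""
    return c + alpha * dose + beta * dose ** 2

# Hypothetical coefficients, for illustration only (not the study's fitted values):
# low-LET gamma rays -> strong quadratic (beta) component;
# high-LET charged particles -> yield dominated by the linear (alpha) term.
GAMMA = dict(c=0.005, alpha=0.02, beta=0.05)
HIGH_LET = dict(c=0.005, alpha=0.25, beta=0.01)

for dose in (0.0, 0.5, 1.0, 2.0):
    yg = lq_yield(dose, **GAMMA)
    yh = lq_yield(dose, **HIGH_LET)
    print(f"D={dose:>3} Gy  gamma={yg:.3f}  high-LET={yh:.3f}  per cell")
```

At low doses the high-LET yield rises much faster (larger alpha), mirroring the greater effectiveness per unit dose of densely ionizing particles; at higher doses the quadratic term makes the low-LET curve bend upward.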