Interpretation of DNA evidence depends on the analyst's ability to accurately compare the DNA profile obtained from an item of evidence with that of a known standard. This interpretation becomes progressively more difficult as the number of 'drop-out' and 'drop-in' events increases. Analytical thresholds (ATs) are typically selected to minimize the false detection of noise. However, there is a trade-off between the erroneous labeling of noise as alleles and the false non-detection of alleles (i.e., drop-out). In this study, the effect of ATs on both types of error was characterized. Various ATs were tested: three relied on analysis of baseline signals obtained from 31 negative samples; a fourth was determined from the relationship between RFU signal and DNA input; and the others were the commonly employed 50, 150 and 200 RFU thresholds. Receiver Operating Characteristic (ROC) plots showed that although high ATs completely eliminated the false labeling of noise, DNA analyzed with ATs derived from baseline-signal analysis exhibited the lowest rates of drop-out and the lowest total error rates. A second experiment examined the effect of small changes in the AT on drop-out. As the AT increased from ∼10 to 60 RFU, the number of heterozygous loci exhibiting the loss of one allele increased. Between ATs of 60 and 150 RFU, the frequency of allelic drop-out remained constant at 0.27 (±0.02), and it began to decrease when ATs of 150 RFU or greater were used. In contrast, the frequency of heterozygous loci exhibiting the loss of both alleles increased consistently with the AT. In summary, for samples amplified with less than 0.5 ng of DNA, ATs derived from baseline analysis of negative samples decreased the frequency of drop-out by a factor of 100 without significantly increasing the rate of erroneous noise detection.
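One common way to derive an analytical threshold from negative-control data, sketched below, is a limit-of-detection style rule: the mean baseline noise plus a multiple of its standard deviation. This is a minimal illustrative sketch, not the specific method used in this study; the function name, the multiplier k = 3, and the example noise values are all assumptions for demonstration.

```python
import statistics

def baseline_threshold(noise_peaks, k=3.0):
    """Derive an analytical threshold (RFU) from baseline noise.

    noise_peaks: baseline peak heights (RFU) observed in negative-control
        runs (illustrative values here, not data from the study).
    k: multiplier on the noise standard deviation (hypothetical choice;
        k = 3 is a common limit-of-detection convention).
    Returns mean + k * SD of the observed noise.
    """
    mu = statistics.mean(noise_peaks)
    sigma = statistics.stdev(noise_peaks)
    return mu + k * sigma

# Simulated baseline noise (RFU) pooled from negative samples:
noise = [4, 6, 5, 7, 3, 8, 6, 5, 9, 4, 7, 6]
at = baseline_threshold(noise)
print(round(at, 1))  # ~11.1 RFU for these illustrative values
```

A threshold computed this way sits just above the observed noise (here ≈11 RFU), which is consistent with the abstract's finding that baseline-derived ATs far below the conventional 50 to 200 RFU values sharply reduce drop-out while rarely mislabeling noise as alleles.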
Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.