The quantification of single nucleotide polymorphism (SNP) allele frequencies in pooled DNA samples using real-time PCR is a promising approach for large-scale diagnostics and genotyping. The limits of detection (LOD) and quantification (LOQ) for mutant SNP alleles are of particular importance for determining the working range, which, in the case of allele-specific real-time PCR, can be limited by the variance of calibration data from serially diluted mutant allele samples as well as by the variance of 100% wild-type allele samples (blank values). In this study, 3σ and 10σ criteria were applied for the calculation of LOD and LOQ values. Alternatively, LOQ was derived from a 20% threshold for the relative standard deviation (%RSD) of measurements by fitting a curve to the relationship between %RSD and mutant allele copy number. We found that detection and quantification of mutant alleles were limited exclusively by the variance of the calibration data, since the estimated LOD(calibration) (696 in 30 000 000 copies, 0.0023%), LOQ(20%RSD) (1470, 0.0049%) and LOQ(calibration) (2319, 0.0077%) values were significantly higher than the LOD(blank) (130, 0.0004%) and LOQ(blank) (265, 0.0009%) values derived from measurements of wild-type allele samples. No significant matrix effects of the genomic background DNA on the estimation of LOD and LOQ were found. Furthermore, the impact of large genome sizes and the general application of the procedure for estimating LOD and LOQ in quantitative real-time PCR diagnostics are discussed.
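The 3σ/10σ criteria and the 20% RSD threshold described above can be sketched numerically. The following is a minimal illustration, not the study's actual computation: the blank/calibration standard deviations, the calibration slope, and the Poisson-like %RSD model `RSD(N) = a / sqrt(N)` are hypothetical placeholders, chosen only to show the form of the calculations.

```python
import math

def lod_loq_sigma(sd: float, slope: float) -> tuple[float, float]:
    """3-sigma / 10-sigma criteria: LOD = 3*sd/slope, LOQ = 10*sd/slope,
    expressed in copy numbers when sd and slope are on the copy-number scale."""
    return 3 * sd / slope, 10 * sd / slope

def loq_from_rsd_threshold(a: float, threshold_pct: float = 20.0) -> float:
    """Solve RSD(N) = a / sqrt(N) = threshold for N.
    The inverse-square-root model is an assumed sampling-noise form,
    not necessarily the curve fitted in the study."""
    return (a / threshold_pct) ** 2

# Hypothetical inputs (illustrative only):
sd_blank, sd_cal, slope = 1.2, 7.0, 0.03   # arbitrary units
lod_blank, loq_blank = lod_loq_sigma(sd_blank, slope)
lod_cal, loq_cal = lod_loq_sigma(sd_cal, slope)

# As in the study's conclusion, the calibration-based limits dominate
# whenever the calibration variance exceeds the blank variance:
assert lod_cal > lod_blank and loq_cal > loq_blank

# Fractional abundance relative to a pool of total copies:
total_copies = 30_000_000
pct = 100 * loq_cal / total_copies

# 20% RSD threshold LOQ under the assumed model:
loq_rsd = loq_from_rsd_threshold(a=700.0)  # a is a hypothetical fit parameter
```

With these placeholder values the working range is bounded by the calibration-derived limits, mirroring the paper's finding that LOD(calibration) and LOQ(calibration) exceed the blank-derived values.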