Evaluation of the accuracy of 3DVH software estimates of dose to virtual ion chamber and film in composite IMRT QA

Med Phys. 2012 Jan;39(1):81-6. doi: 10.1118/1.3666771.

Abstract

Purpose: A novel patient-specific intensity modulated radiation therapy (IMRT) QA system, the 3DVH software with MapCHECK 2, purports to use diode array-measured beam doses together with the patient's DICOM RT plan, structure set, and dose files to predict the delivered 3D dose distribution in the patient for comparison to the treatment planning system (TPS) calculated doses. In this study, the composite dose to an ion chamber and film in phantom predicted by the 3DVH and MapCHECK 2 system is compared to the actual measured chamber and film doses. If validated in this context, 3DVH can be used to perform a dose analysis equivalent to that obtained with film dosimetry and ion chamber-based composite IMRT QA. This is important for clinics losing the ability to perform film dosimetry for true composite IMRT QA, and it provides a measure of confidence in the accuracy of 3DVH 3D dose calculations, which may replace phantom-based IMRT QA.

Methods: The dosimetric results from 15 consecutive patient-specific IMRT QA tests, performed by composite-field irradiation of an ion chamber and EDR2 film in a solid water phantom, were compared to the doses predicted for those virtual detectors from the 3D dose calculated by the 3DVH software using MapCHECK 2 measured doses of each beam within each plan. For each of the 15 cases, immediately after performing the ion chamber plus film measurements, the MapCHECK 2 was used to measure the dose for each beam of the plan. The dose to the volume of the virtual ion chamber and the dose distribution in the plane of the virtual film calculated by the 3DVH software were extracted. The ratio of the measured to the 3DVH- or Eclipse-predicted ion chamber dose was calculated. The phantom plane measured with film, as calculated by both Eclipse and 3DVH, was exported, and the 2D gamma metric was used to compare the film doses with the Eclipse- and 3DVH-predicted planar doses. In addition, the 3D gamma metric, comparing the Eclipse dose distribution to the 3DVH-predicted dose distribution, was calculated within the 3DVH software. For the 2D and 3D gamma metrics, 2% dose and 2 mm distance to agreement (DTA) criteria were used. A simple dose-difference test was also performed using either a 2% or 3% dose difference tolerance.
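The abstract does not describe how the 2D gamma analysis was implemented; a minimal brute-force sketch of a global 2D gamma comparison with 2%/2 mm criteria, written in Python, might look like the following. The function name, the low-dose cutoff, the assumption of isotropic pixel spacing, and the finite search window are illustrative assumptions, not details from the paper.

```python
import numpy as np

def gamma_2d(ref_dose, eval_dose, spacing_mm, dose_tol=0.02, dta_mm=2.0,
             low_dose_cutoff=0.10):
    """Brute-force global 2D gamma on two co-registered planar dose grids.

    ref_dose, eval_dose : 2D arrays (same shape, same isotropic pixel spacing)
    spacing_mm          : pixel spacing in mm
    dose_tol            : fractional dose tolerance (0.02 -> 2% of global max)
    dta_mm              : distance-to-agreement tolerance in mm
    low_dose_cutoff     : pixels below this fraction of the max are ignored
    """
    dd = dose_tol * ref_dose.max()                   # global dose criterion
    search = int(np.ceil(3 * dta_mm / spacing_mm))   # search radius in pixels

    ny, nx = ref_dose.shape
    gamma = np.full(ref_dose.shape, np.nan, dtype=float)

    for iy in range(ny):
        for ix in range(nx):
            if ref_dose[iy, ix] < low_dose_cutoff * ref_dose.max():
                continue  # skip low-dose region
            y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
            x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
            yy, xx = np.mgrid[y0:y1, x0:x1]
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = (eval_dose[y0:y1, x0:x1] - ref_dose[iy, ix]) ** 2
            # gamma is the minimum combined dose/distance metric in the window
            gamma[iy, ix] = np.sqrt(np.min(dose2 / dd ** 2 + dist2 / dta_mm ** 2))

    valid = ~np.isnan(gamma)
    fail_rate = 100.0 * np.sum(gamma[valid] > 1.0) / np.sum(valid)
    return gamma, fail_rate
```

A per-pixel dose-difference test with a 2% or 3% tolerance would follow the same pattern but omit the spatial search, simply flagging pixels where the dose disagreement exceeds the tolerance.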

Results: The mean ratio ± standard deviation of the measured to 3DVH-predicted and measured to Eclipse-predicted ion chamber dose was 1.013 ± 0.015 and 1.003 ± 0.012, respectively. For 3DVH vs Eclipse, the mean percentage of pixels failing the 3D gamma metric was 1.2% ± 1.4%, while the failure rate for the 2D gamma metric was 1.1% ± 0.9%. When 3DVH or Eclipse was compared to EDR2 film, the gamma failure rate was 2.3% ± 2.0% and 1.6% ± 1.7%, respectively. Mean dose-difference failure rates ranged from 9% to 27% (standard deviations 5%-15%) for 2% or 3% dose difference tolerances, depending on the combination of systems tested. No statistically significant differences were found for any of the planar dosimetric comparisons.
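The abstract does not state which statistical test was used to conclude that the planar comparisons do not differ significantly; one reasonable choice, sketched below, is a two-sided paired t-test on the 15 per-plan gamma failure rates for two systems, together with the mean ± sample standard deviation summaries reported above. The function names and the choice of test are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def summarize(values):
    """Mean and sample standard deviation, as reported in the Results."""
    values = np.asarray(values, dtype=float)
    return values.mean(), values.std(ddof=1)

def paired_comparison(rates_a, rates_b):
    """Two-sided paired t-test on per-plan failure rates for two systems.

    rates_a, rates_b : matched arrays of per-plan values (e.g., 2D gamma
    failure rates for 3DVH vs film and Eclipse vs film for the same 15 plans).
    Returns the t statistic and p-value; p > 0.05 would indicate no
    statistically significant difference at the usual level.
    """
    return stats.ttest_rel(rates_a, rates_b)
```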

Conclusions: 3DVH with MapCHECK 2 predicts the same absolute dose, the same percentage of pixels failing the gamma metric, and the same percentage of pixels failing 2% or 3% dose difference tolerance tests as one would have obtained by making measurements in a solid water phantom with an ion chamber and coronal film instead of a diode array. This is a necessary, although not sufficient, condition for validating the accuracy of 3DVH predictions of the 3D dose from beam-by-beam measurements.

Publication types

  • Comparative Study
  • Evaluation Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Film Dosimetry / instrumentation*
  • Film Dosimetry / methods
  • Quality Assurance, Health Care / methods
  • Quality Assurance, Health Care / standards
  • Radiation Dosage
  • Radiotherapy, Conformal / instrumentation*
  • Radiotherapy, Conformal / methods*
  • Radiotherapy, Conformal / standards
  • Reproducibility of Results
  • Sensitivity and Specificity
  • Software Validation*
  • Software*
  • United States