Diagnostic radiology training programs must produce highly skilled diagnostic radiologists capable of interpreting radiological examinations and communicating results to clinicians. Established training performance tools evaluate interpretive skills, but trainees' competency in reporting skills is equally essential. Our semi-automated passive electronic tool, the Quantitative Reporting Skills Evaluation (QRSE), allows radiology training programs to quantify the edits made by attending physicians to trainee preliminary reports and to use that quantity as a metric of trainee reporting performance. Consecutive report pairs and metadata extracted from the radiology information system were anonymized and exported to a MySQL database. To perform the QRSE, open-source software was used to calculate, for each report pair, the Levenshtein Percent (LP): the percentage of character changes required to convert the preliminary report into its corresponding final report. The overall average LP (ALP), the ALP for each trainee, and the corresponding standard deviations were then calculated. Eighty-four trainees and 56 attending radiologists interpreted 228,543 radiological examinations during the study period. The overall ALP was 6.38%. Trainee-specific ALPs ranged from 1.1% to 15.3%, with a standard deviation of 3.7% across trainees. Our analysis identified five trainees with ALPs more than 2 standard deviations above the mean and 14 trainees with ALPs more than 1 standard deviation below the mean. The QRSE methodology allows for the passive, quantitative, and longitudinal evaluation of trainee reporting skills during diagnostic radiology residency training. By identifying trainees whose preliminary reports receive unusually high or low levels of edits, as a marker of overall reporting skill, the QRSE represents a novel performance metric for radiology training programs.
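The LP computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard Levenshtein edit distance (insertions, deletions, substitutions at unit cost) and, since the abstract does not specify the normalization, it assumes the distance is expressed as a percentage of the final report's character length; the function names are hypothetical.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance, O(len(a) * len(b))."""
    prev = list(range(len(b) + 1))  # distances from "" to each prefix of b
    for i, ca in enumerate(a, 1):
        curr = [i]  # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,        # delete ca
                            curr[j - 1] + 1,    # insert cb
                            prev[j - 1] + cost))  # substitute (or match)
        prev = curr
    return prev[-1]

def levenshtein_percent(preliminary: str, final: str) -> float:
    """LP: edit distance as a percentage of the final report length (assumed
    normalization -- the abstract does not state the denominator)."""
    return 100.0 * levenshtein(preliminary, final) / max(len(final), 1)

def average_lp(report_pairs: list[tuple[str, str]]) -> float:
    """ALP: mean LP over a set of (preliminary, final) report pairs."""
    lps = [levenshtein_percent(p, f) for p, f in report_pairs]
    return sum(lps) / len(lps)
```

A trainee-specific ALP would then be `average_lp` applied to only that trainee's report pairs, and outliers flagged by comparing each trainee's ALP against the mean and standard deviation of the trainee-specific ALPs.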