Development of an Assessment Strategy in Preclinical Fixed Prosthodontics Course Using Virtual Assessment Software - Part 2

Ramtin Sadid-Zadeh et al. Clin Exp Dent Res. 4(3):94-99. eCollection.
Abstract

The purpose of this study was to evaluate interrater agreement between faculty and virtual assessments of preparations for complete coverage restorations in preclinical fixed prosthodontics. Teeth prepared during preclinical fixed prosthodontics practical exams at the University at Buffalo School of Dental Medicine were used in this study. Teeth were prepared for the fabrication of complete cast, metal-ceramic, and all-ceramic crowns. The specimens were digitized using an intraoral scanner and then virtually superimposed on the corresponding standard preparations using Compare software. The software was used to quantify comparison percentages, average finish line widths, and average axial wall heights. Two calibrated faculty members assessed the preparations for occlusal/incisal reduction, finish line location, axial wall height, and finish line width using traditional assessment forms. Cohen's kappa coefficient was used to measure interrater agreement between faculty and virtual assessments. Kappa interrater agreement scores ranged between 0.83 and 0.88 for virtually assessed comparison percentages and sums of faculty-assessed occlusal/incisal reduction and finish line location. Kappa interrater agreement score ranges were 0.64-0.94 and 0.74-0.89 for comparisons of virtual and faculty assessments of axial wall height and finish line width, respectively. Virtual assessments are similar to faculty assessments for occlusal/incisal reduction, finish line location, axial wall height, and finish line width in fixed prosthodontics and can be used as equivalent evaluations of student performance for these criteria.
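The agreement statistic used above, Cohen's kappa, corrects the observed proportion of rater agreement for the agreement expected by chance from each rater's marginal category frequencies. As a minimal illustration (the pass/fail ratings below are invented for demonstration and are not the study's data), kappa for two raters can be computed as:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical judgments of the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement from the raters' marginal category frequencies.
    """
    assert len(rater1) == len(rater2) and rater1, "need paired, non-empty ratings"
    n = len(rater1)

    # Observed agreement: fraction of items where the raters gave the same category.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n

    # Chance agreement: for each category, the product of each rater's
    # marginal probability of using that category, summed over categories.
    categories = set(rater1) | set(rater2)
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)  # undefined if p_e == 1 (no variation)


# Hypothetical example: faculty vs. virtual pass/fail calls on four preparations.
faculty = ["pass", "pass", "fail", "pass"]
virtual = ["pass", "fail", "fail", "pass"]
print(cohens_kappa(faculty, virtual))  # 0.5
```

Values in the study's reported range (0.64-0.94) are conventionally read as substantial to almost-perfect agreement on the Landis and Koch scale.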

Keywords: Compare software; dental education; faculty assessment; rubrics.


