Purpose: To compare the intrabony thermal changes induced throughout the entire drilling procedure by two different protocols for guided implant surgery.
Materials and methods: Two protocols for guided implant placement were evaluated in vitro using artificial bone cylinders. The control protocol used traditional metal sleeves and a standard drilling sequence of four cylindrical triflute drills (cutting surface length = 16 mm). The test protocol used a three-slot polyurethane sleeve and two cylindrical drills (second-drill cutting surface length = 4 mm). Forty automated, intermittent, graduated osteotomies (depth = 14 mm) were performed under external irrigation. Temperatures were measured in real time by three sensors placed at different depths (2, 8, and 13 mm). The temperature changes generated by the final drill of each protocol during the shearing and withdrawal phases were recorded and compared with the Student t test.
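The between-group comparison named above is a two-sample Student t test. The following sketch shows how the pooled-variance t statistic is computed; the per-osteotomy temperature values are hypothetical, for illustration only, and are not the study's raw data.

```python
import math

def student_t(a, b):
    """Two-sample Student t statistic with pooled variance
    (equal-variance assumption); returns (t, degrees of freedom)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ssa = sum((x - ma) ** 2 for x in a)        # sum of squared deviations, group a
    ssb = sum((x - mb) ** 2 for x in b)        # sum of squared deviations, group b
    sp2 = (ssa + ssb) / (na + nb - 2)          # pooled variance
    se = math.sqrt(sp2 * (1 / na + 1 / nb))    # standard error of the mean difference
    return (ma - mb) / se, na + nb - 2

# Hypothetical temperature rises (deg C) at one sensor depth -- illustrative only:
control = [10.1, 10.4, 9.9, 10.3]
test = [1.5, 1.3, 1.6, 1.4]
t, df = student_t(control, test)
```

A large positive t with the corresponding P value below the significance threshold would indicate that the control protocol produced higher mean thermal changes than the test protocol.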
Results: In both protocols, the maximum temperature increases were recorded during the withdrawal phase. In the control group, the mean thermal changes were 10.18°C, 8.61°C, and 5.78°C at depths of 2, 8, and 13 mm, respectively. In the test group, the mean thermal changes were 1.44°C, 4.46°C, and 3.58°C at the same depths. The control group showed significantly higher thermal changes than the test group (P < .0001), in both the superficial and deeper bone areas.
Conclusion: An appropriate irrigation system may be crucial for limiting temperature rise during guided implant osteotomy, particularly in the coronal and middle thirds of the implant site. Copious irrigation should be provided during the withdrawal phase, when greater thermal increases can be expected. Lower temperature increases could also be achieved by reducing drill-to-bone contact (ie, cutting surface length), owing to shorter exposure to frictional forces.