Purpose: To retrospectively compare the accuracy of observer performance in the detection of wrist fractures on computed radiographs with a personal computer (PC) versus a dedicated picture archiving and communication system (PACS) workstation display.
Materials and methods: This study was conducted according to the principles of the Declaration of Helsinki (2002 version) of the World Medical Association. The institutional clinical board approved the study; informed consent was not required. Seven observers independently assessed randomized, anonymized digital radiographs of the wrist from 259 subjects; 146 had fractures, and 113 were healthy control subjects (151 male and 108 female subjects; average age, 33 years). Follow-up radiographs and/or computed tomographic scans were used as the reference standard for patients with fractures, and follow-up radiographs and/or clinical history data were used as the reference standard for controls. The PC was a standard hospital machine with a 17-inch (43-cm) color monitor that ran Web browser display software. The PACS workstation had two portrait 21-inch (53-cm) monochrome monitors that displayed 2300 lines. The observers assigned scores to the radiographs on a scale of 1 (no fracture) to 5 (definite fracture). Receiver operating characteristic (ROC) curves, sensitivity, specificity, and accuracy were compared.
Results: The areas under the ROC curves were nearly identical for the PC and the workstation (0.910 vs 0.918, respectively; difference, 0.008; 95% confidence interval: -0.029, 0.013). Average sensitivity was similar with the PC and the workstation (85% vs 84%, respectively), as was average specificity (82% vs 81%, respectively). Average accuracy was 83% with both.
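The sensitivity, specificity, and accuracy figures above follow from standard 2x2 contingency-table definitions once each reader's 1-to-5 confidence scores are dichotomized at a chosen cutoff. A minimal sketch, using hypothetical counts for illustration only (the study's per-threshold tables are not given in the abstract; the function name and example numbers are assumptions):

```python
# Sketch, not the study's analysis code: derive sensitivity, specificity,
# and accuracy from a 2x2 confusion matrix after dichotomizing scores.

def diagnostic_metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    """Return sensitivity, specificity, and accuracy as fractions.

    tp/fn: fracture cases called positive/negative by the reader.
    tn/fp: controls called negative/positive by the reader.
    """
    sensitivity = tp / (tp + fn)                  # true-positive rate among fracture cases
    specificity = tn / (tn + fp)                  # true-negative rate among controls
    accuracy = (tp + tn) / (tp + fn + tn + fp)    # overall fraction correct
    return {"sensitivity": sensitivity,
            "specificity": specificity,
            "accuracy": accuracy}

# Illustrative counts only (not reported in the study): 146 fracture
# cases and 113 controls, with hypothetical reader decisions.
m = diagnostic_metrics(tp=124, fn=22, tn=93, fp=20)
print(m)
```

With these hypothetical counts the three metrics land near the averages the abstract reports, which illustrates how the per-reader percentages were obtained; the ROC areas additionally summarize performance across all five score cutoffs rather than a single one.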
Conclusion: The results of this study showed no significant difference in the accuracy of observer performance for the detection of wrist fractures with a PC compared with a dedicated PACS workstation.