Background: The evaluation of abstracts submitted to scientific meetings has been shown to suffer from poor interobserver reliability. A measure was developed to assess the formal quality of abstract submissions in a standardized way.
Methods: Item selection was based on existing scoring systems for full reports, taking into account published guidelines for structured abstracts. Interrater agreement was examined using a random sample of submissions to the American Gastroenterological Association, stratified by research type (n = 100, 1992-1995). For construct validity, the association between formal quality and acceptance for presentation was examined. A questionnaire administered to expert reviewers evaluated sensibility items, such as ease of use and comprehensiveness.
Results: The index comprised 19 items. The summary quality scores showed good interrater agreement (intraclass correlation coefficient 0.60-0.81). Good abstract quality was associated with acceptance for presentation at the meeting. Expert reviewers found the instrument acceptable.
Conclusion: A quality index was developed for the evaluation of scientific meeting abstracts and was shown to be reliable, valid, and useful.