Background: Medical practitioners have unmet information needs. Health care research dissemination suffers from both "supply" and "demand" problems. One possible solution is to develop methodologic search filters ("hedges") to improve the retrieval of clinically relevant and scientifically sound study reports from bibliographic databases. To develop and test such filters, a hand search of the literature was required to determine directly which articles should, and should not, be retrieved.
Objective: To determine the extent to which 6 research associates can agree on the classification of articles according to explicit research criteria when hand searching the literature.
Design: Blinded, inter-rater reliability study.
Setting: Health Information Research Unit, McMaster University, Hamilton, Ontario, Canada.
Participants: 6 research associates with extensive training and experience in health care research methods.
Main outcome measure: Inter-rater reliability measured using the kappa statistic for multiple raters.
Results: After one year of intensive calibration exercises, research staff attained kappa values of at least 0.80 (that is, agreement at least 80% better than that expected by chance) for all classes of articles.
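For reference, chance-corrected agreement among more than two raters is conventionally computed as Fleiss' kappa. The following is a minimal illustrative sketch, not the study's actual analysis code; it assumes each article is classified by the same number of raters:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for multiple raters.

    ratings[i][j] = number of raters assigning article i to category j.
    Assumes every article is rated by the same number of raters.
    """
    n_articles = len(ratings)
    n_raters = sum(ratings[0])
    n_categories = len(ratings[0])

    # Proportion of all assignments falling into each category
    totals = [sum(row[j] for row in ratings) for j in range(n_categories)]
    p_j = [t / (n_articles * n_raters) for t in totals]

    # Per-article observed agreement among rater pairs
    P_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]

    P_bar = sum(P_i) / n_articles          # mean observed agreement
    P_e = sum(p * p for p in p_j)          # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)
```

For example, with 6 raters and unanimous classification of every article, the function returns 1.0; values of 0.80 or above, as reported here, indicate substantial agreement beyond chance.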
Conclusion: With extensive training, multiple raters can attain a high level of agreement when classifying articles in a hand search of the literature.