Aims: This paper highlights the lack of consideration given to statistical power in the health and social sciences, a continuing problem both for single-study research and, more importantly, for meta-analysis.
Background: The power of a study is the probability that it will detect a genuine effect, that is, yield a statistically significant result when a real effect is present. By ignoring power, the single-study researcher makes it difficult to get negative results published, which in turn distorts meta-analysis through publication bias. Researchers using meta-analysis who also ignore power then compound the problem by including low-powered studies whose published, significant results are likely to overestimate the true effect.
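The mechanism described above can be illustrated with a small Monte Carlo sketch (not part of the paper's own method; all parameter values are assumptions for illustration). Many identical, deliberately low-powered two-arm trials are simulated, only those reaching significance are "published", and the mean published effect is compared with the true effect:

```python
import random
from statistics import NormalDist

def simulate_publication_bias(p_control=0.10, p_treat=0.15, n=100,
                              n_studies=2000, alpha=0.05, seed=1):
    """Simulate n_studies two-arm trials with event rates p_control and
    p_treat (true risk difference 0.05) and n patients per arm.
    'Publish' only trials where a two-sided two-proportion z-test gives
    p < alpha; return (empirical power, mean published risk difference)."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    published = []
    for _ in range(n_studies):
        x1 = sum(rng.random() < p_control for _ in range(n))
        x2 = sum(rng.random() < p_treat for _ in range(n))
        p1, p2 = x1 / n, x2 / n
        se = ((p1 * (1 - p1) + p2 * (1 - p2)) / n) ** 0.5
        if se > 0 and abs(p2 - p1) / se > z_crit:
            published.append(p2 - p1)
    power = len(published) / n_studies
    mean_published = sum(published) / len(published) if published else 0.0
    return power, mean_published
```

With these illustrative numbers the trials are badly underpowered, so only a minority reach significance, and the mean effect among the "published" trials is well above the true risk difference of 0.05: exactly the overestimation that feeds through into a meta-analysis of the published literature.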
Method: A simple means of calculating an easily understood measure of effect size from a contingency table is demonstrated. A computer programme for determining the power of a study is recommended, and a method of reflecting the adequacy of the studies' power within a meta-analysis is suggested. The calculation is illustrated using a meta-analytic study of intravenous magnesium that produced inaccurate results.
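As a rough sketch of the kind of calculation the Method describes, the following computes an effect size from a 2x2 contingency table and an approximate power. The risk difference is used here as the "easily understood" measure, and power is approximated with a two-proportion z-test taking the observed proportions as the true effect; the paper's actual measure, programme, and formula may differ:

```python
from statistics import NormalDist

def table_effect_and_power(a, b, c, d, alpha=0.05):
    """2x2 contingency table: rows are groups, columns are event / no event.
        group 1: a events, b non-events; group 2: c events, d non-events.
    Returns (risk difference, approximate power of a two-sided
    two-proportion z-test at the observed proportions)."""
    n1, n2 = a + b, c + d
    p1, p2 = a / n1, c / n2
    diff = p2 - p1                      # risk difference as the effect size
    se = (p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2) ** 0.5
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)   # e.g. about 1.96 for alpha = 0.05
    power = z.cdf(abs(diff) / se - z_crit)
    return diff, power
```

For example, a table of 10/40 versus 20/30 (proportions 0.20 and 0.40 with 50 per group) gives a risk difference of 0.20 but a power of only about 0.6; quadrupling the sample size with the same proportions pushes the power close to 1. A meta-analyst could use such a figure to weight or flag studies whose power is inadequate.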
Conclusion: It is demonstrated that incorporating power analysis into this meta-analysis would have prevented misleading conclusions from being reached. Suggestions are made for changes to the protocol of meta-analytic studies that highlight the importance of power analysis.