An indispensable principle of rational thought is that positive evidence should increase belief. In this paper, we demonstrate that people routinely violate this principle when predicting an outcome from a weak cause. In Experiment 1, participants given weak positive evidence judged outcomes of public policy initiatives to be less likely than participants given no evidence, even though the evidence was separately judged to be supportive. Experiment 2 ruled out a pragmatic explanation of the result, namely that the weak evidence implies the absence of stronger evidence. In Experiment 3, weak positive evidence made people less likely to gamble on the outcome of the 2010 United States midterm Congressional election. Experiments 4 and 5 replicated these findings with everyday causal scenarios. We argue that this "weak evidence effect" arises because people focus disproportionately on the mentioned weak cause and fail to think about alternative causes.
Copyright © 2011 Elsevier B.V. All rights reserved.