Mutation rates can evolve through genetic drift, indirect selection due to genetic hitchhiking, or direct selection on the physicochemical cost of high fidelity. However, for many systems, it has been difficult to disentangle the relative impact of these forces empirically. In RNA viruses, an observed correlation between mutation rate and virulence has led many to argue that their extremely high mutation rates are advantageous because they may allow for increased adaptability. This argument has profound implications because it suggests that pathogenesis in many viral infections depends on rare or de novo mutations. Here, we present data for an alternative model whereby RNA viruses evolve high mutation rates as a byproduct of selection for increased replicative speed. We find that a poliovirus antimutator, 3DG64S, has a significant replication defect and that wild-type (WT) and 3DG64S populations have similar adaptability in two distinct cellular environments. Experimental evolution of 3DG64S under selection for replicative speed led to reversion and compensation of the fidelity phenotype. Mice infected with 3DG64S exhibited delayed morbidity at doses well above the lethal level, consistent with attenuation by slower growth as opposed to reduced mutational supply. Furthermore, compensation of the 3DG64S growth defect restored virulence, while compensation of the fidelity phenotype did not. Our data are consistent with the kinetic proofreading model for biosynthetic reactions and suggest that speed is more important than accuracy. In contrast with what has been suggested for many RNA viruses, we find that within-host spread is associated with viral replicative speed and not standing genetic diversity.