The human cerebellum plays an important role in language, amongst other cognitive and motor functions, but a unifying theoretical framework for cerebellar language function is lacking. In an established model of motor control, the cerebellum is seen as a predictive machine, making short-term estimations of the outcome of motor commands. This allows for flexible control, on-line correction, and coordination of movements. The homogeneous cytoarchitecture of the cerebellar cortex suggests that similar computations occur throughout the structure, operating on different input signals and with different output targets. Several authors have therefore argued that this ‘motor’ model may extend to cerebellar nonmotor functions [3–5], and that the cerebellum may support prediction in language processing. However, this hypothesis has never been directly tested. Here, we used the ‘Visual World’ paradigm, in which on-line processing of spoken sentence content can be assessed by recording the latencies of listeners' eye movements towards objects mentioned. Repetitive transcranial magnetic stimulation (rTMS) was used to disrupt function in the right cerebellum, a region implicated in language. After cerebellar rTMS, listeners showed delayed eye fixations to target objects predicted by sentence content, whereas there was no effect on eye fixations in sentences without predictable content. The prediction deficit was absent in two control groups. Our findings support the hypothesis that computational operations performed by the cerebellum may support prediction during both motor control and language processing.