Synchronization of two independent systems is a widely discussed phenomenon observed across many disciplines, and recent studies have shown its relevance to cognitive functions. However, its influence on language perception has not yet been investigated. Because successful syntactic processing relies on rules that enable the listener to predict the category of the next incoming element, such prediction can be maximized when the auditory speech input is temporally regular and thus invites synchronization. The present ERP experiments therefore investigated the influence of successful synchronization on auditory syntactic processing. Our results demonstrate that late syntactic processes (reflected in the P600 component) depend on a temporally regular input: the latency of the P600 varies as a function of the duration of a predetermined interval between successive elements. The current data thus attest to the impact of synchronization on higher-level cognitive processes such as syntax in language.