Behavioral studies have demonstrated that time perception in adults, children, and nonhuman animals obeys Weber's Law: as with discriminations of other stimulus features, it is the ratio between two durations, rather than their absolute difference, that determines how well an animal can discriminate them. Here, we show that scalp-recorded event-related brain potentials (ERPs) elicited in both adults and 10-month-old human infants by changes in interstimulus interval (ISI) appear to obey this same scalar property. Using a timing-interval oddball paradigm, we tested adults and infants in conditions where the ratio between the standard and deviant intervals in a train of homogeneous auditory stimuli was 1:4 (infants only), 1:3, 1:2, or 2:3. We found that the amplitude of the deviant-triggered mismatch negativity ERP component (deviant-ISI ERP minus standard-ISI ERP) varied as a function of the ratio between the standard and deviant intervals. Moreover, when the absolute values of the intervals were varied and their ratio was held constant, the mismatch negativity did not vary.
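The ratio logic at the core of the scalar property can be sketched in a few lines. This is purely an illustrative sketch, assuming hypothetical ISI values in milliseconds; it is not the authors' stimulus code, and the particular intervals shown are not taken from the study.

```python
# Illustrative sketch of the scalar property in interval timing.
# All ISI values below are hypothetical, chosen only to realize the
# ratio conditions named in the paradigm (1:4, 1:3, 1:2, 2:3).

def interval_ratio(standard_ms: float, deviant_ms: float) -> float:
    """Ratio of the shorter to the longer interval. Under Weber's Law,
    discriminability depends on this ratio, not on the absolute
    difference between the two intervals."""
    return min(standard_ms, deviant_ms) / max(standard_ms, deviant_ms)

# One hypothetical standard/deviant pair per ratio condition:
conditions = {
    "1:4": (250, 1000),   # easiest; tested in infants only
    "1:3": (500, 1500),
    "1:2": (500, 1000),
    "2:3": (1000, 1500),  # hardest
}

for name, (std, dev) in conditions.items():
    print(f"{name}: ratio = {interval_ratio(std, dev):.2f}, "
          f"absolute difference = {abs(std - dev)} ms")

# Scalar property: scaling both intervals preserves the ratio, so
# predicted discriminability is unchanged even though the absolute
# difference doubles.
assert interval_ratio(500, 1000) == interval_ratio(1000, 2000)
```

The final assertion mirrors the control condition of the experiment: pairs with different absolute values but the same ratio are predicted (and were observed) to yield comparable mismatch negativity.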