We study the extreme discrepancy of infinite sequences in the d-dimensional unit cube, which uses arbitrary subintervals of the unit cube as test sets. This is in contrast to the classical star discrepancy, which uses only intervals anchored at the origin as test sets. We show that for any dimension d and any , the extreme discrepancy of every infinite sequence in the d-dimensional unit cube is at least of order of magnitude , where N is the number of initial terms of the sequence under consideration. For , this order of magnitude is best possible.
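For reference, the contrast between the two notions can be made precise with the standard definitions (the notation $A_N$ and $\lambda$ below is ours, not quoted from this paper): for a sequence $\mathcal{S} = (x_n)_{n \ge 1}$ in $[0,1)^d$, let $A_N(I)$ denote the number of indices $n \le N$ with $x_n \in I$, and let $\lambda$ be the Lebesgue measure.

```latex
D_N^{*}(\mathcal{S}) \;=\; \sup_{t \in [0,1]^d}
  \left| \frac{A_N([\mathbf{0}, t))}{N} - \lambda([\mathbf{0}, t)) \right|,
\qquad
D_N(\mathcal{S}) \;=\; \sup_{[a,b) \subseteq [0,1]^d}
  \left| \frac{A_N([a, b))}{N} - \lambda([a, b)) \right|
```

The extreme discrepancy $D_N$ takes the supremum over all axis-parallel subintervals $[a,b)$, while the star discrepancy $D_N^{*}$ restricts to intervals anchored at the origin; since the anchored intervals form a subfamily of the arbitrary ones, $D_N^{*}(\mathcal{S}) \le D_N(\mathcal{S})$ always holds.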
Keywords: Extreme discrepancy
© The Author(s) 2022.