Laws of the iterated logarithm (LILs) are statements about the almost sure fluctuations of sums of random variables. Perhaps the first LIL was due to Khinchine in 1924, in an article titled "Über einen Satz der Wahrscheinlichkeitsrechnung" ("On a theorem of probability theory" — have fun reading that). Kolmogorov proved a LIL for bounded random variables in 1929.

The most famous LIL, and one often appealed to when building confidence sequences (see also stitching for LIL rates), is the following. It is due to Hartman and Wintner. See a nice survey here. Let $X_1, X_2, \dots$ be i.i.d. with mean zero and finite variance $\sigma^2$. Let $S_n = \sum_{i=1}^n X_i$. A classical law of the iterated logarithm states that, with probability 1,

$$\limsup_{n\to\infty} \frac{S_n}{\sqrt{2n\log\log n}} = \sigma,$$

and

$$\liminf_{n\to\infty} \frac{S_n}{\sqrt{2n\log\log n}} = -\sigma.$$

There is also a converse, due to Strassen. If

$$\limsup_{n\to\infty} \frac{|S_n|}{\sqrt{2n\log\log n}} < \infty$$

with positive probability, then $\mathbb{E}[X_1] = 0$ and $\mathbb{E}[X_1^2] < \infty$.
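As a quick sanity check on the theorem, here is a small simulation (an illustrative sketch, not from any of the sources above). Rademacher increments have mean 0 and variance $\sigma^2 = 1$, so the LIL predicts that $S_n/\sqrt{2n\log\log n}$ fluctuates with limsup 1:

```python
import numpy as np

# Illustrative simulation: Rademacher increments have mean 0 and
# variance sigma^2 = 1, so the LIL predicts
# limsup_n S_n / sqrt(2 n log log n) = 1 almost surely.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.choice([-1.0, 1.0], size=n)
s = np.cumsum(x)

idx = np.arange(1, n + 1)
# log log n is only positive for n >= 3, so start the ratio there.
norm = np.sqrt(2 * idx[2:] * np.log(np.log(idx[2:])))
ratio = s[2:] / norm

print("ratio at n = 10^6:", ratio[-1])
print("largest |ratio| along the path:", np.abs(ratio).max())
```

At any fixed $n$ the ratio is typically well below 1 (by the CLT it is roughly $N(0, 1/(2\log\log n))$); the limsup statement is about rare excursions over an infinite horizon, so a finite simulation can only be suggestive.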

Relationship to LLNs and CLTs

LILs are often viewed as sitting between laws of large numbers and central limit theorems. The strong LLN states that $S_n/n \to 0$ almost surely. The Lindeberg–Lévy CLT says that $S_n/(\sigma\sqrt{n})$ converges to a standard normal in distribution. A LIL is trying to find the boundary, asking what kind of sequence $(a_n)$ exists such that $S_n/a_n$ does not converge to 0 almost surely, but also does not diverge. The answer, $a_n = \sqrt{2n\log\log n}$, sits strictly between the LLN scale $n$ and the CLT scale $\sqrt{n}$.
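The sandwich can be seen numerically. In this sketch (assumed setup: standard normal increments, so $\sigma = 1$) the LLN scale $n$ kills the sum, the CLT scale $\sqrt{n}$ leaves it fluctuating at constant order, and the LIL scale keeps it order one with limsup $\sigma$:

```python
import numpy as np

# Illustrative comparison of the three normalizations on one sample
# path, using standard normal increments (sigma = 1).
rng = np.random.default_rng(1)
n = 1_000_000
s = np.cumsum(rng.standard_normal(n))

for m in [10**k for k in range(2, 7)]:
    lln = s[m - 1] / m                                    # -> 0 a.s.
    clt = s[m - 1] / np.sqrt(m)                           # ~ N(0, 1), never settles
    lil = s[m - 1] / np.sqrt(2 * m * np.log(np.log(m)))   # limsup = 1
    print(f"n={m:>8}  S_n/n={lln:+.4f}  S_n/sqrt(n)={clt:+.4f}  LIL={lil:+.4f}")
```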

Infinite variance

It’s natural to ask what we can say if we don’t assume the variance is finite. In this case it can be shown that, with probability 1,

$$\limsup_{n\to\infty} \frac{|S_n|}{\sqrt{2n\log\log n}} = \infty.$$

Does there exist some sequence $(a_n)$ such that

$$\limsup_{n\to\infty} \frac{|S_n|}{a_n} = 1 \quad \text{almost surely?}$$

It turns out that yes, under specific circumstances. Namely, the distribution must sit in the domain of attraction of a Gaussian. See this paper by Maller.
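For context, a standard characterization from classical limit theory (not specific to Maller's paper): a distribution lies in the domain of attraction of the Gaussian exactly when its truncated second moment $V(t) = \mathbb{E}[X^2\mathbf{1}\{|X|\le t\}]$ is slowly varying, which can be phrased as a tail condition:

```latex
% X lies in the domain of attraction of the normal distribution iff
% the tail mass is negligible relative to the truncated second moment:
\lim_{t \to \infty}
  \frac{t^2 \, \mathbb{P}(|X| > t)}
       {\mathbb{E}\!\left[ X^2 \,\mathbf{1}\{|X| \le t\} \right]} = 0
```

Finite variance satisfies this trivially; an infinite-variance example is a symmetric density decaying like $|x|^{-3}$, for which $V(t)$ grows like $\log t$.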