The Chernoff method applies Markov's inequality to the nonnegative random variable $e^{\lambda X}$ for $\lambda > 0$, and then chooses $\lambda$ to optimize the resulting bound. This often yields tighter concentration inequalities than applying Markov's inequality to $X$ directly.

In particular, for a random variable $X$ and any $\lambda > 0$, Markov's inequality gives that

$$\mathbb{P}(X \geq t) = \mathbb{P}\left(e^{\lambda X} \geq e^{\lambda t}\right) \leq \frac{\mathbb{E}[e^{\lambda X}]}{e^{\lambda t}}.$$

This holds for all $\lambda > 0$, so we obtain the bound

$$\mathbb{P}(X \geq t) \leq \inf_{\lambda > 0} e^{-\lambda t}\, M_X(\lambda),$$

where $M_X(\lambda) = \mathbb{E}[e^{\lambda X}]$ is the MGF of $X$. If we place distributional assumptions on $X$ (e.g., boundedness or sub-Gaussianity), then we can solve for the optimal value of $\lambda$.
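As a sanity check on the optimization over $\lambda$, here is a minimal sketch (not from the original text) that numerically minimizes $e^{-\lambda t} M_X(\lambda)$ over a grid for a standard Gaussian, where the closed-form MGF $M_X(\lambda) = e^{\lambda^2/2}$ is known, and compares the result against the exact tail probability:

```python
import math

# For X ~ N(0, 1), the MGF is M(lam) = exp(lam^2 / 2), so the
# Chernoff bound inf_{lam > 0} exp(-lam * t) * M(lam) is minimized
# analytically at lam = t, giving exp(-t^2 / 2).

def mgf_std_normal(lam):
    """MGF of a standard normal random variable."""
    return math.exp(lam ** 2 / 2)

def chernoff_bound(t, mgf, lam_grid):
    """Numerically minimize exp(-lam * t) * mgf(lam) over a lambda grid."""
    return min(math.exp(-lam * t) * mgf(lam) for lam in lam_grid)

t = 2.0
lam_grid = [i / 1000 for i in range(1, 5001)]  # lambda in (0, 5]
bound = chernoff_bound(t, mgf_std_normal, lam_grid)
true_tail = 0.5 * math.erfc(t / math.sqrt(2))  # exact P(X >= t)

print(f"Chernoff bound: {bound:.6f}")  # ~ exp(-t^2/2) = 0.135335
print(f"True tail:      {true_tail:.6f}")
```

The numeric minimum matches the analytic optimum $\lambda = t$, and the bound is valid (it sits above the true tail), though for the Gaussian it is loose by a polynomial factor.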

The Chernoff method is often written in terms of the Cramér transform of $X$ (hence the name Cramér-Chernoff method), which is

$$\psi_X^*(t) = \sup_{\lambda > 0} \left(\lambda t - \psi_X(\lambda)\right),$$

where $\psi_X(\lambda) = \log M_X(\lambda)$ is the cumulant generating function of $X$.

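As an illustrative example (not from the original text): if $X \sim \mathcal{N}(0, \sigma^2)$, then $\psi_X(\lambda) = \lambda^2 \sigma^2 / 2$, and

$$\psi_X^*(t) = \sup_{\lambda > 0} \left(\lambda t - \frac{\lambda^2 \sigma^2}{2}\right) = \frac{t^2}{2\sigma^2},$$

with the supremum attained at $\lambda = t / \sigma^2$. The Chernoff method then recovers the familiar Gaussian tail bound $\mathbb{P}(X \geq t) \leq e^{-t^2 / (2\sigma^2)}$.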
This is also often called the Fenchel-Legendre transform of $\psi_X$. We can write