These are concentration inequalities for functions of random variables that satisfy a bounded differences condition. Intuitively, this means that changing any one argument cannot change the function's value by too much.

The most famous example of a bounded difference inequality is McDiarmid's inequality, which is the following. Suppose $f : \mathbb{R}^n \to \mathbb{R}$ obeys

$$|f(x_i, x_{-i}) - f(x_i', x_{-i})| \le c_i$$

for all $x_i, x_i' \in \mathbb{R}$ and $x_{-i} \in \mathbb{R}^{n-1}$. (Here $x_{-i}$ is the vector of arguments that omits $x_i$.) Then, for independent random variables $X_1, \dots, X_n$,

$$\mathbb{P}\left( \left| f(X_1, \dots, X_n) - \mathbb{E} f(X_1, \dots, X_n) \right| \ge t \right) \le 2 \exp\left( -\frac{2t^2}{\sum_{i=1}^n c_i^2} \right).$$
Note that this is a generalization of Hoeffding's bound. It is recovered by taking $f(x_1, \dots, x_n) = \sum_{i=1}^n x_i$ with $X_i \in [a_i, b_i]$, so that $c_i = b_i - a_i$.
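To spell out the recovery: substituting $c_i = b_i - a_i$ into the bound above gives

$$\mathbb{P}\left( \left| \sum_{i=1}^n X_i - \mathbb{E} \sum_{i=1}^n X_i \right| \ge t \right) \le 2 \exp\left( -\frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2} \right),$$

which is the usual two-sided form of Hoeffding's inequality for independent bounded random variables.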

In fact, $f$ need not be a function with domain $\mathbb{R}^n$. We can replace $\mathbb{R}^n$ with $\mathcal{X}^n$ for any space $\mathcal{X}$. (See Raginsky's lecture notes for the proof.) This is useful if, for instance, we are interested in obtaining bounds on the norm of random vectors. If $\mathcal{X}$ is a normed space, for instance, then $f$ could be $f(x_1, \dots, x_n) = \left\| \sum_{i=1}^n x_i \right\|$, and we can obtain deviation inequalities between $\left\| \sum_{i=1}^n X_i \right\|$ and $\mathbb{E} \left\| \sum_{i=1}^n X_i \right\|$.
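As a quick sketch of why this $f$ satisfies the bounded differences condition (under the added assumption that the vectors are bounded, say $\|x_i\| \le B$ for every $i$), write $s = \sum_{j \ne i} x_j$ and apply the reverse triangle inequality:

$$\bigl| \, \| x_i + s \| - \| x_i' + s \| \, \bigr| \le \| x_i - x_i' \| \le 2B.$$

So McDiarmid's inequality applies with $c_i = 2B$, giving

$$\mathbb{P}\left( \left| \, \left\| \sum_{i=1}^n X_i \right\| - \mathbb{E} \left\| \sum_{i=1}^n X_i \right\| \, \right| \ge t \right) \le 2 \exp\left( -\frac{t^2}{2nB^2} \right).$$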