Anti-concentration results are exactly what they sound like. If concentration inequalities study how random variables concentrate around particular values, anti-concentration inequalities are upper bounds on how much they can concentrate around any such value, i.e. lower bounds on how spread out they are. Anti-concentration shows up, for example, when proving central limit theorems.
Stated another way, concentration inequalities want to provide an upper bound on the quantity $\mathbb{P}(|X - \mathbb{E}X| \geq t)$, whereas anti-concentration wants to provide upper bounds on $\sup_{x} \mathbb{P}(|X - x| \leq \varepsilon)$, thus showing that the concentration around $x$ for any $x$ is limited. The quantity $Q(X, \varepsilon) := \sup_{x} \mathbb{P}(|X - x| \leq \varepsilon)$ is often called “Lévy’s concentration function”.
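To make the definition concrete, here is a small Monte Carlo sketch (my own illustration; the function name, grid resolution, and sample size are arbitrary choices) that estimates $Q(X, \varepsilon)$ for a standard Gaussian and compares it with the small-$\varepsilon$ approximation $2\varepsilon/\sqrt{2\pi}$:

```python
import numpy as np

def concentration_function(samples, eps, grid_size=2000):
    """Monte Carlo estimate of Q(X, eps) = sup_x P(|X - x| <= eps)."""
    samples = np.sort(samples)
    # Candidate centers x: a fine grid over the observed range of the samples.
    grid = np.linspace(samples[0], samples[-1], grid_size)
    # Count samples falling in [x - eps, x + eps] via binary search, take the max over x.
    hi = np.searchsorted(samples, grid + eps, side="right")
    lo = np.searchsorted(samples, grid - eps, side="left")
    return (hi - lo).max() / len(samples)

rng = np.random.default_rng(0)
eps = 0.1
x = rng.standard_normal(200_000)
# For a standard Gaussian the sup is attained at x = 0, so Q(X, eps) ~ 2*eps/sqrt(2*pi) for small eps.
print(concentration_function(x, eps), 2 * eps / np.sqrt(2 * np.pi))
```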
Carbery-Wright
Let $\mu$ be a log-concave probability measure on $\mathbb{R}^n$ and let $p$ be a polynomial of degree $d$. In 2001, Carbery and Wright proved that there is an absolute constant $C$ such that, for $X \sim \mu$ and any $\varepsilon > 0$,

$$\mathbb{P}\big(|p(X)| \leq \varepsilon\big) \leq \frac{C\, d\, \varepsilon^{1/d}}{\big(\mathbb{E}[p(X)^2]\big)^{1/(2d)}}.$$
Note this covers the case where $\mu$ is Gaussian, in particular sums (and more generally polynomials) of normally distributed random variables, which is the typical application. See here for a modern statement and explanation of Carbery-Wright.
Glazer and Mikulincer showed how to get dimension-free bounds on the variance of polynomials, which can then be used in combination with Carbery-Wright to upper bound the right-hand side above in terms of $\varepsilon$ and $d$ only.
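As a rough sanity check of the shape of the bound (ignoring the absolute constant $C$, and using a degree-2 polynomial I made up), one can compare the empirical small-ball probability of a polynomial of Gaussians with $d\,\varepsilon^{1/d} / (\mathbb{E}[p(X)^2])^{1/(2d)}$:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((1_000_000, 4))   # standard Gaussian measure on R^4 (log-concave)
d = 2                                     # degree of the polynomial below

# An arbitrary degree-2 polynomial p(x) -- purely illustrative.
vals = X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2 - X[:, 3] + 0.1
second_moment = np.mean(vals ** 2)

for eps in [0.01, 0.05, 0.1, 0.5]:
    small_ball = np.mean(np.abs(vals) <= eps)
    # Carbery-Wright says small_ball <= C * d * eps**(1/d) / (E p^2)**(1/(2d)) for an absolute C.
    rhs_shape = d * eps ** (1 / d) / second_moment ** (1 / (2 * d))
    print(f"eps={eps:4.2f}   P(|p(X)| <= eps) = {small_ball:.4f}   bound shape = {rhs_shape:.4f}")
```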
Esseen’s inequality
Suppose $X$ has characteristic function $\varphi(t) = \mathbb{E}\, e^{itX}$; then, for some absolute constant $C$,

$$Q(X, \varepsilon) \leq C\, \varepsilon \int_{-1/\varepsilon}^{1/\varepsilon} |\varphi(t)|\, dt.$$
So if $\int_{\mathbb{R}} |\varphi(t)|\, dt < \infty$, then $Q(X, \varepsilon) = O(\varepsilon)$.
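Here is a rough numerical illustration of Esseen's inequality for a Rademacher sum $X = \sum_i s_i a_i$, whose characteristic function is $\prod_i \cos(a_i t)$; the coefficients are my own arbitrary choice and I take $C = 1$, so the two sides are only comparable up to the absolute constant:

```python
import numpy as np

rng = np.random.default_rng(2)
a = np.array([1.0, 0.7, 0.5, 0.3, 0.2])   # coefficients of a Rademacher sum X = sum_i s_i * a_i
eps = 0.25

# |characteristic function| of X: |phi(t)| = prod_i |cos(a_i t)|.
def abs_cf(t):
    return np.abs(np.prod(np.cos(np.outer(t, a)), axis=1))

# Esseen's right-hand side with C = 1 (the actual inequality only holds up to an absolute constant).
t = np.linspace(-1 / eps, 1 / eps, 20_001)
esseen_rhs = eps * np.sum(abs_cf(t)) * (t[1] - t[0])

# Monte Carlo estimate of Q(X, eps) = sup_x P(|X - x| <= eps).
signs = rng.choice([-1.0, 1.0], size=(500_000, len(a)))
samples = np.sort(signs @ a)
grid = np.linspace(samples[0], samples[-1], 4001)
counts = (np.searchsorted(samples, grid + eps, side="right")
          - np.searchsorted(samples, grid - eps, side="left"))
print("Q(X, eps) ~=", counts.max() / len(samples), "  eps * integral of |phi| ~=", esseen_rhs)
```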
Lévy-style anti-concentration
If $X_1, \dots, X_n$ are independent and symmetric random variables in a Banach space and $S_k = X_1 + \cdots + X_k$, then for all $t \geq 0$

$$\mathbb{P}\Big(\max_{1 \leq k \leq n} \|S_k\| > t\Big) \leq 2\, \mathbb{P}\big(\|S_n\| > t\big)$$

and

$$\mathbb{P}\Big(\max_{1 \leq k \leq n} \|X_k\| > t\Big) \leq 2\, \mathbb{P}\big(\|S_n\| > t\big).$$
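A quick simulation of both inequalities (the Laplace coordinate distribution, the dimension, the number of summands, and the threshold $t$ are arbitrary choices of mine), using the Euclidean norm on $\mathbb{R}^3$ as the Banach-space norm:

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n, dim, t = 100_000, 20, 3, 8.0

# Independent symmetric random vectors in R^3 (i.i.d. Laplace coordinates).
X = rng.laplace(0.0, 1.0, size=(trials, n, dim))
S = np.cumsum(X, axis=1)                                   # partial sums S_1, ..., S_n

max_partial = np.linalg.norm(S, axis=2).max(axis=1)        # max_k ||S_k||
max_summand = np.linalg.norm(X, axis=2).max(axis=1)        # max_k ||X_k||
final = np.linalg.norm(S[:, -1, :], axis=1)                # ||S_n||

print("P(max_k ||S_k|| > t) =", np.mean(max_partial > t))
print("P(max_k ||X_k|| > t) =", np.mean(max_summand > t))
print("2 * P(||S_n|| > t)   =", 2 * np.mean(final > t))
```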