Pinelis is responsible for both a Hoeffding-type and a Bernstein-type bound in smooth, separable Banach spaces. At the core of his proofs is the construction of a supermartingale; his results can therefore be made time-uniform by applying Ville’s inequality instead of Markov’s inequality (though they are usually stated as fixed-time bounds).
Hoeffding
Consider a $(2, \beta)$-smooth separable Banach space $(\mathcal{X}, \|\cdot\|)$. Let $X_1, X_2, \dots$ have conditional mean $\mu$ with $\|X_t - \mu\| \leq c$ almost surely. Then, for all $\epsilon > 0$:
$$\mathbb{P}\left(\left\|\frac{1}{n}\sum_{t=1}^n X_t - \mu\right\| \geq \epsilon\right) \leq 2\exp\left(-\frac{n\epsilon^2}{2\beta^2 c^2}\right).$$
Note the similarity to the usual Hoeffding bound (bounded scalar concentration) but with the extra factor of $\beta^2$. This is because Hilbert spaces are $(2,1)$-smooth Banach spaces.
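To make this concrete, here is a minimal sketch (in Python; the function name and parameter values are ours, not Pinelis’s) that inverts the tail bound as stated above, $2\exp(-n\epsilon^2/(2\beta^2 c^2)) \leq \delta$, to obtain a fixed-time confidence radius.

```python
import math

def pinelis_hoeffding_radius(n: int, c: float, beta: float, delta: float) -> float:
    """Fixed-time confidence radius from the Hoeffding-type bound above:
    solve 2*exp(-n*eps^2 / (2*beta^2*c^2)) = delta for eps."""
    return math.sqrt(2 * beta**2 * c**2 * math.log(2 / delta) / n)

# With beta = 1 (a Hilbert space), this recovers the usual scalar Hoeffding radius.
print(pinelis_hoeffding_radius(n=1000, c=1.0, beta=1.0, delta=0.05))
```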
Bernstein
Consider a $(2, \beta)$-smooth separable Banach space $(\mathcal{X}, \|\cdot\|)$. Let $X_1, X_2, \dots$ have conditional mean $\mu$ with $\|X_t - \mu\| \leq c$ almost surely. Then, for all $\epsilon > 0$,
$$\mathbb{P}\left(\left\|\frac{1}{n}\sum_{t=1}^n X_t - \mu\right\| \geq \epsilon\right) \leq 2\exp\left(-\frac{n\epsilon^2}{2(\beta^2\sigma^2 + c\epsilon/3)}\right),$$
where $\sigma^2 = \frac{1}{n}\sum_{t=1}^n \mathbb{E}[\|X_t - \mu\|^2 \mid \mathcal{F}_{t-1}]$.
As with Bernstein’s/Bennett’s inequality in the scalar setting (Bennett’s inequality), if the random variables are iid then $\sigma^2$ is simply the variance $\mathbb{E}\|X_1 - \mu\|^2$.
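The Bernstein bound can be inverted in the same way; the sketch below (assuming the form stated above, with our own function name) solves the quadratic in $\epsilon$ obtained by setting the tail probability equal to $\delta$.

```python
import math

def pinelis_bernstein_radius(n: int, sigma2: float, c: float, beta: float, delta: float) -> float:
    """Fixed-time confidence radius from the Bernstein-type bound above:
    solve 2*exp(-n*eps^2 / (2*(beta^2*sigma2 + c*eps/3))) = delta for eps,
    i.e. take the positive root of n*eps^2 - (2*c*L/3)*eps - 2*beta^2*sigma2*L = 0."""
    L = math.log(2 / delta)
    b = 2 * c * L / 3
    return (b + math.sqrt(b**2 + 8 * n * beta**2 * sigma2 * L)) / (2 * n)
```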
Empirical-Bernstein
See the discussion of the work of Martinez-Taboada and Ramdas on empirical Bernstein bounds:
Random vectors
Using elements of the Pinelis approach to concentration in addition to some game-theoretic techniques (game-theoretic statistics), Martinez-Taboada and Ramdas gave a time-uniform empirical Bernstein bound in $(2, \beta)$-smooth Banach spaces. They show that, for $X_1, X_2, \dots$ with conditional mean $\mu$ and $\|X_t\| \leq 1/2$ almost surely, with probability $1 - \delta$, simultaneously for all $t \geq 1$:
$$\left\|\mu - \frac{\sum_{i=1}^t \lambda_i X_i}{\sum_{i=1}^t \lambda_i}\right\| \leq \frac{\log(1/\delta) + \beta^2 \sum_{i=1}^t \lambda_i^2 \|X_i - \widehat{\mu}_{i-1}\|^2}{\sum_{i=1}^t \lambda_i},$$
where $\widehat{\mu}_0 = 0$ and $\widehat{\mu}_i = \frac{1}{i}\sum_{j=1}^i X_j$, and $(\lambda_t)_{t \geq 1}$ is a predictable sequence with values in $[0, 1]$.
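For intuition, here is a sketch of how the resulting radius could be tracked online for vectors in $\mathbb{R}^d$, assuming the form written above (the exact constants in Martinez-Taboada and Ramdas may differ); the constant choice $\lambda_t \equiv 1/2$ and the function name are purely illustrative.

```python
import numpy as np

def empirical_bernstein_radii(X, beta, delta, lambdas=None):
    """Sketch of a time-uniform empirical-Bernstein radius sequence, following the
    displayed bound above: at time t the radius for the weighted mean is
    (log(1/delta) + beta^2 * sum_i lambda_i^2 * ||X_i - muhat_{i-1}||^2) / sum_i lambda_i,
    where muhat_{i-1} is the running mean of X_1..X_{i-1} (muhat_0 = 0)."""
    X = np.asarray(X, dtype=float)          # shape (T, d): T observations in R^d
    T = len(X)
    if lambdas is None:                     # any predictable sequence in [0, 1]
        lambdas = np.full(T, 0.5)
    radii, mu_hat, lam_sum, v_sum = [], np.zeros(X.shape[1]), 0.0, 0.0
    for t in range(T):
        lam = lambdas[t]
        v_sum += lam**2 * np.linalg.norm(X[t] - mu_hat) ** 2
        lam_sum += lam
        radii.append((np.log(1 / delta) + beta**2 * v_sum) / lam_sum)
        mu_hat = (t * mu_hat + X[t]) / (t + 1)   # running mean, predictable at step t+1
    return radii
```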
We also give an empirical-Bernstein bound for random vectors in $\mathbb{R}^d$ in Time-uniform confidence spheres for means of random vectors using the variational approach to concentration, but it has explicit dependence on the dimension $d$.
Heavy-tailed concentration
In 2015, Minsker proposed the geometric median-of-means for Banach spaces. This is a general method for boosting weak estimators (those with only a polynomial rate of concentration, e.g., from Chebyshev’s inequality) into an estimator with an exponential rate. The idea is similar to the Lugosi-Mendelson median-of-means (see multivariate heavy-tailed concentration), but the weak estimators are aggregated using the geometric median. This can be computed in polynomial time (in the number of weak estimators) using Weiszfeld’s algorithm, since the objective is convex. This estimator was proposed simultaneously by Hsu and Sabato.
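As an illustration, here is a minimal sketch of the geometric median-of-means for vectors in $\mathbb{R}^d$, with the geometric median computed by Weiszfeld’s iteratively re-weighted averaging; the number of blocks, iteration cap, and tolerance below are illustrative choices, not prescriptions from Minsker.

```python
import numpy as np

def geometric_median(points, n_iter=100, eps=1e-8):
    """Weiszfeld's algorithm: iteratively re-weighted averaging that converges to the
    minimizer of sum_j ||y - points[j]|| (the objective is convex)."""
    y = points.mean(axis=0)
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(points - y, axis=1), eps)  # avoid division by zero
        w = 1.0 / d
        y_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < eps:
            break
        y = y_new
    return y

def geometric_median_of_means(X, k=10):
    """Split the sample into k blocks, take the (weak) empirical mean of each block,
    then aggregate the block means with the geometric median."""
    blocks = np.array_split(np.asarray(X, dtype=float), k)
    block_means = np.stack([b.mean(axis=0) for b in blocks])
    return geometric_median(block_means)
```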
In 2022, Yun and Park extended the geometric median-of-means to Polish spaces, which include separable Banach spaces. They seem to get the same rates as Minsker, which are not quite sub-Gaussian in their dependence on $\delta$.