Suppose you are trying to generate a confidence sequence for a parameter $\mu$. Suppose you can write down a supermartingale of the form $M_t(\lambda) = \exp\!\left(\lambda S_t - \psi(\lambda) V_t\right)$ for $\lambda \in \Lambda$ (eg, a sub-psi process based on an exponential inequality), where $S_t$ is some process which is a function of $\mu$. $S_t$ is often the martingale $\sum_{i \le t}(X_i - \mu)$.

One option is to use a predictable sequence $(\lambda_t)_{t \ge 1}$, employing a different $\lambda_t$ at each time step. See confidence sequences via predictable plug-ins.

Another approach, that of conjugate mixtures, is to integrate over $\lambda$. Suppose $M_t(\lambda)$ is a supermartingale for all $\lambda \in \Lambda$. Then

$$M_t = \int_\Lambda M_t(\lambda)\, dF(\lambda)$$

is a supermartingale for any distribution $F$ over $\Lambda$ by Tonelli's theorem. (For intuition take $F$ to be a discrete distribution, as spelled out below.) Applying Ville's inequality to $M_t$ then yields a (sometimes implicit) CS for $\mu$.
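
To spell the parenthetical out: if $F$ places weights $w_k \ge 0$ (summing to one) on finitely many points $\lambda_k \in \Lambda$, the one-step check is just linearity of conditional expectation,

$$\mathbb{E}\!\left[\sum_k w_k\, M_{t+1}(\lambda_k)\,\middle|\,\mathcal{F}_t\right] = \sum_k w_k\, \mathbb{E}\!\left[M_{t+1}(\lambda_k)\mid\mathcal{F}_t\right] \le \sum_k w_k\, M_t(\lambda_k),$$

and Tonelli's theorem licenses the same exchange of conditional expectation and integral for a general $F$, since the integrand is nonnegative.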

Various mixing distributions were studied by Howard et al., most of them giving implicit CSs. One nice exception is Robbins' Gaussian mixture.

Robbins’ Gaussian mixture

If $(X_i)_{i \ge 1}$ are $\sigma$-sub-Gaussian with conditional means $\mu$, then $M_t(\lambda) = \exp\!\left(\lambda S_t - \tfrac{\lambda^2 \sigma^2 t}{2}\right)$ with $S_t = \sum_{i \le t}(X_i - \mu)$ is the standard sub-Gaussian supermartingale, and taking $F$ to be Gaussian with mean 0 and variance $\rho^2$ gives the following bound: with probability at least $1 - \alpha$, simultaneously for all $t \ge 1$,

$$\left|\bar{X}_t - \mu\right| \le \frac{1}{t}\sqrt{\left(\sigma^2 t + \tfrac{1}{\rho^2}\right)\log\!\left(\frac{\rho^2 \sigma^2 t + 1}{\alpha^2}\right)}.$$
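
For completeness, here is a sketch of where this comes from (a standard complete-the-square computation in $\lambda$, with the notation above and $F = N(0, \rho^2)$):

$$\int_{\mathbb{R}} \exp\!\left(\lambda S_t - \frac{\lambda^2 \sigma^2 t}{2}\right) \frac{e^{-\lambda^2/(2\rho^2)}}{\sqrt{2\pi}\,\rho}\, d\lambda = \frac{1}{\sqrt{\rho^2 \sigma^2 t + 1}} \exp\!\left(\frac{\rho^2 S_t^2}{2(\rho^2 \sigma^2 t + 1)}\right).$$

Applying Ville's inequality to this closed form at level $\alpha$ and solving for $\mu$ gives the boundary above; unlike most other mixtures, the Gaussian integral is available in closed form, which is why this CS is explicit.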

Robbins noted this CS (in a slightly less general sense) in the 1970s. This has been generalized to vector-valued random variables.
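
As a purely illustrative sketch, here is how one might compute this CS numerically. The function name, the simulated data, and the choices $\sigma = \rho = 1$, $\alpha = 0.05$ are assumptions made for the example, not anything taken from the references.

```python
import numpy as np

def robbins_gaussian_mixture_cs(x, sigma, rho, alpha=0.05):
    """Running confidence sequence for the mean of sigma-sub-Gaussian data,
    using Robbins' Gaussian mixture with mixing variance rho**2.

    Returns (lower, upper) arrays; with probability at least 1 - alpha,
    every interval simultaneously contains the conditional mean.
    """
    x = np.asarray(x, dtype=float)
    t = np.arange(1, len(x) + 1)
    running_mean = np.cumsum(x) / t
    # Boundary from Ville's inequality applied to the normal-mixture supermartingale:
    # |S_t| <= sqrt((sigma^2 t + 1/rho^2) * log((rho^2 sigma^2 t + 1) / alpha^2))
    radius = np.sqrt(
        (sigma**2 * t + 1.0 / rho**2)
        * np.log((rho**2 * sigma**2 * t + 1.0) / alpha**2)
    ) / t
    return running_mean - radius, running_mean + radius

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=0.3, scale=1.0, size=1000)  # N(0.3, 1) is 1-sub-Gaussian
    lo, hi = robbins_gaussian_mixture_cs(data, sigma=1.0, rho=1.0)
    print(lo[-1], hi[-1])  # final interval; width shrinks at rate ~ sqrt(log(t)/t)
```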

Refs