Suppose you are trying to generate a confidence sequence for a parameter $\mu$. Suppose you can write down a supermartingale of the form $M_t(\lambda) = \exp\{\lambda S_t(\mu) - \psi(\lambda) V_t\}$ for $\mu$ (e.g., a sub-psi process based on an exponential inequality), where $(S_t(\mu))$ is some process depending on $\mu$. Often $(S_t)$ is the martingale $S_t(\mu) = \sum_{i \le t}(X_i - \mu)$.

One option is to use a predictable sequence $(\lambda_t)_{t \ge 1}$, employing a different $\lambda_t$ at each time step. See confidence sequences via predictable plug-ins.

Another approach, that of conjugate mixtures, is to integrate over $\lambda$. Suppose $M_t(\lambda)$ is a supermartingale for all $\lambda \in \Lambda$. Then

$$N_t := \int_\Lambda M_t(\lambda)\,\rho(d\lambda)$$

is a supermartingale for any distribution $\rho$ over $\Lambda$, by Tonelli's theorem. (For intuition, take $\rho$ to be a discrete distribution: a convex combination of supermartingales is again a supermartingale.) Applying Ville's inequality to $N_t$ then yields a (sometimes implicit) CS for $\mu$.
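The discrete-mixture intuition can be checked numerically. The sketch below is my own illustration, not from the notes: it assumes i.i.d. standard normal increments (so $\mu_i = 0$, $\sigma = 1$) and an arbitrary two-point mixture $\rho$; it simulates many paths and verifies that the empirical mean of $N_t$ stays near 1, as it must for a (super)martingale started at 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: standard normal increments, so mu_i = 0 and sigma = 1.
n_paths, T = 200_000, 20
X = rng.standard_normal((n_paths, T))
S = X.cumsum(axis=1)              # S_t = sum_{i <= t} (X_i - mu_i)
t = np.arange(1, T + 1)

def M(lam):
    # M_t(lambda) = exp(lambda * S_t - lambda^2 * t / 2); for exactly
    # Gaussian increments this is a martingale, so E[M_t(lambda)] = 1.
    return np.exp(lam * S - 0.5 * lam**2 * t)

# Discrete mixture rho: equal weights on lambda = 0.1 and lambda = 0.3.
N = 0.5 * M(0.1) + 0.5 * M(0.3)

# E[N_t] = 1 for every t, so the empirical means should sit near 1.
print(N.mean(axis=0)[[0, 9, 19]])
```

The same check goes through for any finite mixture, which is the content of the Tonelli step: swapping the average over $\lambda$ with the conditional expectation over the next increment.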

Various distributions $\rho$ were studied by Howard et al., most of them yielding implicit CSs. One nice exception is Robbins' Gaussian mixture.

## Robbins’ Gaussian mixture

If $X_1, X_2, \dots$ are $\sigma$-sub-Gaussian with conditional means $\mu_i = \mathbb{E}[X_i \mid X_1, \dots, X_{i-1}]$, then $M_t(\lambda) = \prod_{i \le t} \exp\{\lambda(X_i - \mu_i) - \lambda^2 \sigma^2 / 2\}$ is the standard sub-Gaussian supermartingale. Taking $\rho$ to be Gaussian with mean 0 and variance $a^2$ then gives the following bound:

$$\mathbb{P}\left(\exists t \ge 1 : \sum_{i \le t} (X_i - \mu_i) \ge \sigma \sqrt{\frac{2(ta^2 + 1)}{a^2} \log\!\left(\frac{\sqrt{ta^2 + 1}}{\alpha}\right)}\right) \le \alpha.$$

Robbins noted this CS (in a slightly less general form) in the 1970s. It has since been generalized to vector-valued random variables.
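Both the mixture integral and the boundary can be sanity-checked numerically. The sketch below is my own illustration, not from the notes: it assumes $\sigma = 1$, i.i.d. standard normal $X_i$ (so $\mu_i = 0$), and arbitrary choices of the mixture scale `a` and level `alpha`. It compares the Gaussian-mixture integral $\int M_t(\lambda)\,\rho(d\lambda)$ against its closed form, and estimates the probability of ever crossing the boundary by simulation (which, by Ville, should be at most $\alpha$).

```python
import numpy as np

alpha, a = 0.05, 1.0  # hypothetical choices of error level and mixture scale

def radius(t):
    # Boundary of the Gaussian-mixture CS above, with sigma = 1.
    return np.sqrt(2 * (t * a**2 + 1) / a**2
                   * np.log(np.sqrt(t * a**2 + 1) / alpha))

def N_closed(S, t):
    # Closed form of int exp(lam*S - lam^2*t/2) dN(0, a^2)(lam),
    # obtained by completing the square in lam.
    return np.exp(S**2 * a**2 / (2 * (a**2 * t + 1))) / np.sqrt(a**2 * t + 1)

# Check the closed form against a plain Riemann sum over lambda.
lam = np.linspace(-12.0, 12.0, 400_001)
S_val, t_val = 3.0, 5
density = np.exp(-lam**2 / (2 * a**2)) / np.sqrt(2 * np.pi * a**2)
integrand = np.exp(lam * S_val - 0.5 * lam**2 * t_val) * density
numeric = integrand.sum() * (lam[1] - lam[0])

# Monte Carlo check: the boundary is crossed with probability at most alpha.
rng = np.random.default_rng(1)
n_paths, T = 10_000, 1_000
S = rng.standard_normal((n_paths, T)).cumsum(axis=1)
crossed = (S >= radius(np.arange(1, T + 1))).any(axis=1).mean()
print(numeric, N_closed(S_val, t_val), crossed)
```

The boundary grows like $\sqrt{t \log t}$, which is the price paid for time-uniformity relative to a fixed-time $\sqrt{t}$ Chernoff bound.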

# Refs

- Conjugate mixtures, Lecture notes by Aaditya Ramdas.