sufficiency · estimators · decision-theory
We observe data $X \sim P_\theta$ for some $\theta \in \Theta$. Let $L(\theta, d)$ be any loss on estimates of $\theta$ that is convex in $d$ (see statistical decision theory). The Rao-Blackwell theorem states that the expected loss of an estimator $\delta(X)$ can never be made worse by conditioning on a sufficient statistic $T(X)$. That is, if we consider $\delta'(X) = \mathbb{E}[\delta(X) \mid T(X)]$, then
$$\mathbb{E}_\theta\bigl[L(\theta, \delta'(X))\bigr] \le \mathbb{E}_\theta\bigl[L(\theta, \delta(X))\bigr] \quad \text{for all } \theta.$$
Sufficiency is used only to ensure that $\delta'$ is actually an estimator: because $T$ is sufficient, the conditional expectation $\mathbb{E}[\delta(X) \mid T(X)]$ does not depend on $\theta$, so $\delta'$ can be computed from the data alone (otherwise the theorem would be useless).
The estimator $\delta'$ is often called the Rao-Blackwellization of $\delta$. The theorem is usually applied with the squared-error loss $L(\theta, d) = (\theta - d)^2$, in which case the inequality follows from Jensen's inequality (or the law of total variance).
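A small simulation (my own illustrative example, not from the note) makes the improvement concrete. For $X_1, \dots, X_n$ i.i.d. Bernoulli($p$), take the crude unbiased estimator $\delta(X) = X_1$ and the sufficient statistic $T = \sum_i X_i$; conditioning gives $\mathbb{E}[X_1 \mid T = t] = t/n$, the sample mean, whose squared-error risk should be roughly $n$ times smaller:

```python
import random

# Estimate p of a Bernoulli(p) from n flips.
# Naive estimator: delta(X) = X_1 (unbiased but high variance).
# Rao-Blackwellization: E[X_1 | sum(X) = t] = t/n, the sample mean.
random.seed(0)
p, n, trials = 0.3, 10, 20000

mse_naive = 0.0
mse_rb = 0.0
for _ in range(trials):
    x = [1 if random.random() < p else 0 for _ in range(n)]
    delta = x[0]            # naive estimator
    delta_rb = sum(x) / n   # Rao-Blackwellized estimator
    mse_naive += (delta - p) ** 2
    mse_rb += (delta_rb - p) ** 2

mse_naive /= trials  # close to p(1-p) = 0.21
mse_rb /= trials     # close to p(1-p)/n = 0.021
print(mse_naive, mse_rb)
```

Both estimators are unbiased; the entire gain comes from the variance reduction guaranteed by the theorem.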