Adaptive estimator

In statistics, an adaptive estimator is an estimator in a parametric or semiparametric model with nuisance parameters such that the presence of these nuisance parameters does not affect the efficiency of estimation of the parameter of interest.

Definition

Formally, let the parameter $\theta$ in a parametric model consist of two parts: the parameter of interest $\nu \in N \subseteq \mathbb{R}^k$ and the nuisance parameter $\eta \in H \subseteq \mathbb{R}^m$, so that $\theta = (\nu,\eta)$. Then we say that $\hat\nu_n$ is an adaptive estimator of $\nu$ in the presence of $\eta$ if this estimator is regular and efficient for each of the submodels

$$\mathcal{P}_\nu(\eta_0) = \{\, P_\theta : \nu \in N,\ \eta = \eta_0 \,\}.$$

An adaptive estimator estimates the parameter of interest equally well regardless of whether the value of the nuisance parameter is known or not.

The necessary condition for a regular parametric model to have an adaptive estimator is that

$$I_{\nu\eta}(\theta) = \operatorname{E}\left[\, z_\nu z_\eta' \,\right] = 0 \quad \text{for all } \theta,$$

where $z_\nu$ and $z_\eta$ are the components of the score function corresponding to the parameters $\nu$ and $\eta$ respectively, and thus $I_{\nu\eta}$ is the top-right $k \times m$ block of the Fisher information matrix $I(\theta)$.
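
For the normal family used in the example below, this orthogonality condition can be checked directly. The following is a minimal symbolic sketch (assuming SymPy is available; the variable names are illustrative) that computes the cross term $\operatorname{E}[z_\mu z_\sigma]$ and confirms that it vanishes.

```python
import sympy as sp

u = sp.symbols('u', real=True)             # u = x - mu (centered variable)
sigma = sp.symbols('sigma', positive=True)

# N(0, sigma^2) density for the centered variable u
density = sp.exp(-u**2 / (2 * sigma**2)) / (sp.sqrt(2 * sp.pi) * sigma)

# Score components of N(mu, sigma^2), written in terms of u = x - mu:
z_mu    = u / sigma**2                      # d/dmu    log f
z_sigma = u**2 / sigma**3 - 1 / sigma       # d/dsigma log f

# Off-diagonal Fisher information entry I_{mu,sigma} = E[z_mu * z_sigma]
I_mu_sigma = sp.integrate(z_mu * z_sigma * density, (u, -sp.oo, sp.oo))
print(sp.simplify(I_mu_sigma))              # 0 -> the information matrix is block-diagonal
```

Since the off-diagonal block vanishes for every value of the parameters, the necessary condition stated above is satisfied for this family.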

Example

Suppose $\mathcal{P}$ is the normal location-scale family:

$$\mathcal{P} = \left\{\, f_\theta(x) = \tfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2\sigma^2}(x-\mu)^2} \;\Big|\; \mu \in \mathbb{R},\ \sigma > 0 \,\right\}.$$

Then the usual estimator $\hat\mu = \bar{x}$ is adaptive: we can estimate the mean equally well whether we know the variance or not.
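
A quick Monte Carlo check illustrates this (a sketch assuming NumPy; the sample size and parameter values are arbitrary choices): the variance of $\bar{x}$ matches the Cramér–Rao bound $\sigma^2/n$ of the submodel in which $\sigma$ is known, even though the estimator never uses $\sigma$.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 1.5, 2.0, 200, 20_000

# Draw `reps` independent samples of size n from N(mu, sigma^2)
samples = rng.normal(mu, sigma, size=(reps, n))
mu_hat = samples.mean(axis=1)               # sample mean: never uses sigma

print("empirical variance of mu_hat:   ", mu_hat.var())
print("Cramer-Rao bound, sigma known:  ", sigma**2 / n)
# The two numbers agree up to Monte Carlo error, so knowing sigma would not help.
```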
