In econometrics, the information matrix test is used to determine whether a regression model is misspecified. The test was developed by Halbert White,[1] who observed that in a correctly specified model and under standard regularity assumptions, the Fisher information matrix can be expressed in either of two ways: as the outer product of the gradient, or as a function of the Hessian matrix of the log-likelihood function.
Consider the linear regression model

y = X\beta + u,

where u \sim N(0, \sigma^{2}I). If the parameters \beta and \sigma^{2} are stacked in the vector \theta^{T} = \begin{bmatrix} \beta & \sigma^{2} \end{bmatrix}, then the log-likelihood function (omitting the additive constant) is

\ell(\theta) = -\frac{n}{2}\log\sigma^{2} - \frac{1}{2\sigma^{2}}\left(y - X\beta\right)^{T}\left(y - X\beta\right)
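The log-likelihood above can be evaluated directly in code. The following is a minimal sketch in Python; the data, variable names, and the closed-form maximizers (OLS for \beta, mean squared residual for \sigma^{2}) are illustrative, and the constant term -(n/2)\log 2\pi is omitted as in the formula above.

```python
import numpy as np

def log_likelihood(theta, y, X):
    """Gaussian linear-model log-likelihood (constant term omitted).
    theta stacks the regression coefficients and sigma^2 last."""
    beta, sigma2 = theta[:-1], theta[-1]
    resid = y - X @ beta
    n = len(y)
    return -0.5 * n * np.log(sigma2) - (resid @ resid) / (2.0 * sigma2)

# Illustrative simulated data
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

# The MLE: OLS coefficients and the mean squared residual
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
sigma2_hat = np.mean((y - X @ beta_hat) ** 2)
theta_hat = np.append(beta_hat, sigma2_hat)
print(log_likelihood(theta_hat, y, X))
```

Any perturbation of theta_hat lowers the log-likelihood, since the OLS coefficients minimize the residual sum of squares and the mean squared residual maximizes over \sigma^{2}.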
The information matrix can then be expressed as
I(\theta) = \operatorname{E}\left[\left(\frac{\partial\ell(\theta)}{\partial\theta}\right)\left(\frac{\partial\ell(\theta)}{\partial\theta}\right)^{T}\right]

that is, as the expected outer product of the gradient, or score. Equivalently, it can be written as the negative of the expected Hessian matrix of the log-likelihood function:

I(\theta) = -\operatorname{E}\left[\frac{\partial^{2}\ell(\theta)}{\partial\theta\,\partial\theta^{T}}\right]
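The agreement of the two expressions can be checked numerically for the linear model above. The sketch below simulates a correctly specified model and compares the sample average of the outer products of the per-observation scores with the sample average of the negative per-observation Hessians, both evaluated at the true parameters; the analytic derivatives and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta, sigma2 = np.array([1.0, 2.0]), 1.0
y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)
u = y - X @ beta  # errors at the true parameters

# Per-observation score, theta = (beta0, beta1, sigma^2):
#   d l_i / d beta    = x_i u_i / sigma^2
#   d l_i / d sigma^2 = -1/(2 sigma^2) + u_i^2 / (2 sigma^4)
scores = np.column_stack([
    X * (u / sigma2)[:, None],
    -0.5 / sigma2 + u**2 / (2 * sigma2**2),
])

# Average outer product of the gradient (OPG form)
opg = scores.T @ scores / n

# Average negative Hessian (analytic second derivatives)
k = X.shape[1]
neg_hess = np.zeros((k + 1, k + 1))
neg_hess[:k, :k] = X.T @ X / (n * sigma2)          # -d2 l / d beta d beta'
neg_hess[:k, k] = neg_hess[k, :k] = X.T @ u / (n * sigma2**2)
neg_hess[k, k] = np.mean(u**2) / sigma2**3 - 0.5 / sigma2**2

print(np.round(opg, 2))
print(np.round(neg_hess, 2))
```

With the model correctly specified, the two matrices differ only by Monte Carlo error that shrinks as n grows.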
If the model is correctly specified, both expressions should be equal. Combining the equivalent forms yields
\Delta(\theta) = \sum_{i=1}^{n} \left[\frac{\partial^{2}\ell(\theta)}{\partial\theta\,\partial\theta^{T}} + \frac{\partial\ell(\theta)}{\partial\theta}\frac{\partial\ell(\theta)}{\partial\theta^{T}}\right]
where \Delta(\theta) is an (r \times r) random matrix, r being the number of parameters in \theta. White showed that, when the model is correctly specified, the elements of n^{-1/2}\Delta(\hat{\theta}), where \hat{\theta} is the maximum likelihood estimator of \theta, are asymptotically normally distributed with zero means.
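For the Gaussian linear model, \Delta(\hat{\theta}) can be computed in closed form from the per-observation scores and Hessians evaluated at the maximum likelihood estimates. The sketch below (simulated, correctly specified data; analytic derivatives; all names illustrative) shows that \Delta(\hat{\theta})/n is close to zero, consistent with the elements of n^{-1/2}\Delta(\hat{\theta}) being asymptotically centered at zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)  # correctly specified

# Maximum likelihood estimates: OLS beta, mean squared residual for sigma^2
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta_hat
s2 = np.mean(e**2)

# Per-observation scores at theta-hat, theta = (beta0, beta1, sigma^2)
scores = np.column_stack([X * (e / s2)[:, None],
                          -0.5 / s2 + e**2 / (2 * s2**2)])

# Sum of per-observation Hessians at theta-hat (analytic)
k = X.shape[1]
H = np.zeros((k + 1, k + 1))
H[:k, :k] = -X.T @ X / s2
H[:k, k] = H[k, :k] = -X.T @ e / s2**2
H[k, k] = n / (2 * s2**2) - np.sum(e**2) / s2**3

# Delta(theta-hat) = sum_i [Hessian_i + score_i score_i']
Delta = H + scores.T @ scores
print(np.round(Delta / np.sqrt(n), 2))
```

Under misspecification (e.g. heteroskedastic or skewed errors), the Hessian and outer-product terms no longer cancel on average, and the elements of \Delta(\hat{\theta})/n drift away from zero.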