Hannan–Quinn information criterion explained

In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection. It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

HQC = −2Lmax + 2k ln(ln(n)),

where Lmax is the log-likelihood of the fitted model, k is the number of parameters, and n is the number of observations.
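The definition above translates directly into code. The following is a minimal sketch (the function names `hqc`, `aic`, and `bic` are illustrative, not from any particular library) that computes HQC alongside AIC and BIC from the same three inputs:

```python
import math

def hqc(log_likelihood, k, n):
    """Hannan-Quinn criterion: -2*Lmax + 2*k*ln(ln(n))."""
    return -2.0 * log_likelihood + 2.0 * k * math.log(math.log(n))

def aic(log_likelihood, k):
    """Akaike criterion: -2*Lmax + 2*k."""
    return -2.0 * log_likelihood + 2.0 * k

def bic(log_likelihood, k, n):
    """Bayesian criterion: -2*Lmax + k*ln(n)."""
    return -2.0 * log_likelihood + k * math.log(n)
```

As with AIC and BIC, one fits each candidate model by maximum likelihood and selects the model with the smallest criterion value.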

Burnham & Anderson (2002, p. 287) say that HQC, "while often cited, seems to have seen little use in practice". They also note that HQC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence. Claeskens & Hjort (2008, ch. 4) note that HQC, like BIC, but unlike AIC, is not asymptotically efficient; however, it misses the optimal estimation rate by only a ln(ln(n)) factor. They further point out that whatever method is used for fine-tuning the criterion will be more important in practice than the ln(ln(n)) term, since this number is small even for very large n; however, the ln(ln(n)) term ensures that, unlike AIC, HQC is strongly consistent. It follows from the law of the iterated logarithm that any strongly consistent method must miss efficiency by at least a ln(ln(n)) factor, so in this sense HQC is asymptotically very well behaved. Van der Pas and Grünwald prove that model selection based on a modified Bayesian estimator, the so-called switch distribution, in many cases behaves asymptotically like HQC, while retaining the advantages of Bayesian methods such as the use of priors.
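The claim that ln(ln(n)) stays small even for very large n is easy to check numerically. The sketch below (illustrative only) prints the per-parameter penalty of each criterion — 2 for AIC, ln(n) for BIC, 2 ln(ln(n)) for HQC — showing that HQC's penalty grows, but far more slowly than BIC's:

```python
import math

# Per-parameter penalty added by each criterion:
# AIC: 2 (constant), BIC: ln(n), HQC: 2*ln(ln(n)).
for n in (10 ** 2, 10 ** 6, 10 ** 12):
    print(f"n={n:>15,}  AIC: 2.00  "
          f"BIC: {math.log(n):6.2f}  "
          f"HQC: {2 * math.log(math.log(n)):6.2f}")
```

Even at n = 10^12, HQC's per-parameter penalty is under 7, while BIC's is close to 28 — consistent with the observation that the ln(ln(n)) term matters less in practice than how the criterion is fine-tuned, yet still grows without bound, which is what delivers strong consistency.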
