Generalized estimating equation explained

In statistics, a generalized estimating equation (GEE) is used to estimate the parameters of a generalized linear model when there is possible unmeasured correlation between observations from different timepoints.[1] [2] Although GEEs are sometimes believed to be robust in every respect, even under the wrong choice of working correlation matrix, they are robust only in the sense that regression coefficient estimates remain consistent when the working correlation is misspecified.

Regression beta coefficient estimates from the Liang-Zeger GEE are consistent and asymptotically normal (and hence asymptotically unbiased) even when the working correlation is misspecified, under mild regularity conditions. GEE is more efficient than the generalized linear iterative model (GLIM) in the presence of high autocorrelation. When the true working correlation is known, consistency does not require that missing data be missing completely at random. Huber-White standard errors improve the efficiency of Liang-Zeger GEE in the absence of serial autocorrelation but may remove the marginal interpretation. With Liang-Zeger standard errors, GEE estimates the average response over the population ("population-averaged" effects); with Huber-White standard errors, also known as "robust standard error" or "sandwich variance" estimates, it estimates effects in individuals.[3] Huber-White GEE has been in use since at least 1997, while Liang-Zeger GEE dates to the 1980s, based on a limited literature review.[4] Several independent formulations of these standard error estimators contribute to GEE theory; placing the independent standard error estimators under the umbrella term "GEE" may exemplify abuse of terminology.

GEEs belong to a class of regression techniques referred to as semiparametric because they rely on specification of only the first two moments. They are a popular alternative to the likelihood-based generalized linear mixed model, which is more sensitive to misspecification of the variance structure and can lose consistency as a result.[5] The trade-off for retaining consistent regression coefficient estimates under variance-structure misspecification is a loss of efficiency: standard errors are larger than those of the optimal estimator, which inflates Wald test p-values.[6] GEEs are commonly used in large epidemiological studies, especially multi-site cohort studies, because they can handle many types of unmeasured dependence between outcomes.

Formulation

Given a mean model \mu_{ij} for subject i and time j that depends upon regression parameters \beta_k, and a variance structure V_i, the estimating equation is formed via:[7]

U(\beta) = \sum_{i=1}^{N} \frac{\partial \mu_i}{\partial \beta} V_i^{-1} \{ Y_i - \mu_i(\beta) \}

The parameters \beta_k are estimated by solving U(\beta) = 0 and are typically obtained via the Newton–Raphson algorithm. The variance structure is chosen to improve the efficiency of the parameter estimates. The Hessian of the solution to the GEEs in the parameter space can be used to calculate robust standard error estimates. The term "variance structure" refers to the algebraic form of the covariance matrix between outcomes, Y, in the sample. Examples of variance structure specifications include independence, exchangeable, autoregressive, stationary m-dependent, and unstructured. The most popular form of inference on GEE regression parameters is the Wald test using naive or robust standard errors, though the score test is also valid and preferable when it is difficult to obtain estimates of information under the alternative hypothesis. The likelihood ratio test is not valid in this setting because the estimating equations are not necessarily likelihood equations. Model selection can be performed with the GEE equivalent of the Akaike information criterion (AIC), the quasi-likelihood under the independence model criterion (QIC).[8]
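To make the estimating equation concrete, the following minimal Python/NumPy sketch (not part of the original article; the function and variable names are illustrative) solves U(\beta) = 0 by Fisher scoring in the simplest setting of an identity link, unit variance function, and independence working correlation, and then computes the robust "sandwich" covariance behind the robust standard errors discussed above.

import numpy as np

def gee_identity_independence(y, X, groups, max_iter=25, tol=1e-8):
    # Illustrative GEE solver: identity link, unit variance function,
    # independence working correlation (V_i = I for every cluster).
    y, X, groups = np.asarray(y, float), np.asarray(X, float), np.asarray(groups)
    p = X.shape[1]
    beta = np.zeros(p)
    cluster_ids = np.unique(groups)
    for _ in range(max_iter):
        U = np.zeros(p)             # estimating function U(beta)
        H = np.zeros((p, p))        # sum_i D_i' V_i^{-1} D_i (the "bread")
        for g in cluster_ids:
            Xi, yi = X[groups == g], y[groups == g]
            mui = Xi @ beta         # identity link: mu_i = X_i beta
            Di = Xi                 # d mu_i / d beta
            U += Di.T @ (yi - mui)  # V_i^{-1} = I under independence
            H += Di.T @ Di
        step = np.linalg.solve(H, U)    # Fisher-scoring / Newton step
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    # Robust ("sandwich") covariance: H^{-1} M H^{-1}, where
    # M = sum_i D_i' V_i^{-1} (y_i - mu_i)(y_i - mu_i)' V_i^{-1} D_i.
    M = np.zeros((p, p))
    for g in cluster_ids:
        Xi = X[groups == g]
        Ui = Xi.T @ (y[groups == g] - Xi @ beta)
        M += np.outer(Ui, Ui)
    H_inv = np.linalg.inv(H)
    robust_cov = H_inv @ M @ H_inv
    return beta, np.sqrt(np.diag(robust_cov))

For a non-identity link or a non-trivial working correlation, only D_i and V_i change; the iteration and the sandwich construction keep the same form.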

Relationship with Generalized Method of Moments

The generalized estimating equation is a special case of the generalized method of moments (GMM).[9] This relationship is immediately obvious from the requirement that the score function satisfy the equation:

\mathbb{E}[U(\beta)] = \sum_{i=1}^{N} \frac{\partial \mu_i}{\partial \beta} V_i^{-1} \{ \mathbb{E}[Y_i] - \mu_i(\beta) \} = 0
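One way to see the connection, sketched here rather than spelled out in the original text, is that each cluster contributes the moment function

g_i(\beta) = \frac{\partial \mu_i}{\partial \beta} V_i^{-1} \{ Y_i - \mu_i(\beta) \},

which has expectation zero at the true parameter value because \mathbb{E}[Y_i] = \mu_i(\beta). Since there are exactly as many moment conditions as regression parameters, the GMM estimator is the root of \sum_{i=1}^{N} g_i(\beta) = 0, which is precisely the generalized estimating equation above.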

Computation

Software for solving generalized estimating equations is available in MATLAB,[10] SAS (proc genmod[11]), SPSS (the gee procedure[12]), Stata (the xtgee command[13]), R (packages glmtoolbox,[14] gee,[15] geepack and multgee), Julia (package GEE.jl[16]) and Python (package statsmodels[17]).
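As a brief illustration of one of these implementations, the following Python sketch fits a GEE with the statsmodels package on a small synthetic longitudinal data set; the data set, variable names, and chosen working correlation are assumptions made for the example rather than part of the original text.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic longitudinal data: 50 subjects observed at 4 timepoints each,
# with within-subject correlation induced by a subject-level effect.
rng = np.random.default_rng(0)
n_subjects, n_times = 50, 4
subject = np.repeat(np.arange(n_subjects), n_times)
x = rng.normal(size=n_subjects * n_times)
u = np.repeat(rng.normal(scale=0.5, size=n_subjects), n_times)
y = rng.poisson(np.exp(0.5 + 0.3 * x + u))
df = pd.DataFrame({"y": y, "x": x, "subject": subject})

# Marginal (population-averaged) Poisson model with an exchangeable
# working correlation; robust ("sandwich") standard errors are reported
# by default.
model = smf.gee("y ~ x", groups="subject", data=df,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())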

Comparisons among software packages for the analysis of binary correlated data [18] [19] and ordinal correlated data[20] via GEE are available.

Notes and References

  1. Liang, Kung-Yee; Zeger, Scott (1986). "Longitudinal data analysis using generalized linear models". Biometrika. 73 (1): 13–22. doi:10.1093/biomet/73.1.13.
  2. Hardin, James; Hilbe, Joseph (2003). Generalized Estimating Equations. London: Chapman and Hall/CRC. ISBN 978-1-58488-307-4.
  3. Abadie, Alberto; Athey, Susan; Imbens, Guido W.; Wooldridge, Jeffrey M. (October 2022). "When Should You Adjust Standard Errors for Clustering?". The Quarterly Journal of Economics. 138 (1): 1–35. doi:10.1093/qje/qjac038. arXiv:1710.02926.
  4. Wolfe, Frederick; Anderson, Janice; Harkness, Deborah; Bennett, Robert M.; Caro, Xavier J.; Goldenberg, Don L.; Russell, I. Jon; Yunus, Muhammad B. (1997). "A prospective, longitudinal, multicenter study of service utilization and costs in fibromyalgia". Arthritis & Rheumatism. 40 (9): 1560–1570. doi:10.1002/art.1780400904. PMID 9324009.
  5. Fong, Y.; Rue, H.; Wakefield, J. (2010). "Bayesian inference for generalized linear mixed models". Biostatistics. 11 (3): 397–412. doi:10.1093/biostatistics/kxp053. PMID 19966070. PMC 2883299.
  6. O'Brien, Liam M.; Fitzmaurice, Garrett M.; Horton, Nicholas J. (October 2006). "Maximum Likelihood Estimation of Marginal Pairwise Associations with Multiple Source Predictors". Biometrical Journal. 48 (5): 860–875. doi:10.1002/bimj.200510227. PMID 17094349. PMC 1764610.
  7. Diggle, Peter J.; Heagerty, Patrick; Liang, Kung-Yee; Zeger, Scott L. (2002). Analysis of Longitudinal Data. Oxford Statistical Science Series. ISBN 978-0-19-852484-7.
  8. .
  9. Breitung, Jörg; Chaganty, N. Rao; Daniel, Rhian M.; Kenward, Michael G.; Lechner, Michael; Martus, Peter; Sabo, Roy T.; Wang, You-Gan; Zorn, Christopher (2010). "Discussion of 'Generalized Estimating Equations: Notes on the Choice of the Working Correlation Matrix'". Methods of Information in Medicine. 49 (5): 426–432. doi:10.1055/s-0038-1625133.
  10. Ratcliffe, Sarah J.; Shults, Justine (2008). "GEEQBOX: A MATLAB Toolbox for Generalized Estimating Equations and Quasi-Least Squares". Journal of Statistical Software. 25 (14): 1–14.
  11. "The GENMOD Procedure". The SAS Institute.
  12. "IBM SPSS Advanced Statistics". IBM SPSS website.
  13. "Stata's implementation of GEE". Stata website.
  14. "glmtoolbox: Set of Tools to Data Analysis using Generalized Linear Models". CRAN. 10 October 2023.
  15. "gee: Generalized Estimation Equation solver". CRAN. 7 November 2019.
  16. Shedden, Kerby (23 June 2022). "Generalized Estimating Equations in Julia". GitHub. Retrieved 24 June 2022.
  17. "Generalized Estimating Equations — statsmodels".
  18. Ziegler, Andreas (1998). "The generalised estimating equations: a comparison of procedures available in commercial statistical software packages". Biometrical Journal. 40 (3): 245–260. doi:10.1002/(sici)1521-4036(199807)40:3<245::aid-bimj245>3.0.co;2-n.
  19. Horton, Nicholas J.; Lipsitz, Stuart R. (1999). "Review of software to fit generalized estimating equation regression models". The American Statistician. 53 (2): 160–169. doi:10.1080/00031305.1999.10474451. CiteSeerX 10.1.1.22.9325.
  20. Nooraee, Nazanin; Molenberghs, Geert; van den Heuvel, Edwin R. (2014). "GEE for longitudinal ordinal data: Comparing R-geepack, R-multgee, R-repolr, SAS-GENMOD, SPSS-GENLIN". Computational Statistics & Data Analysis. 77: 70–83. doi:10.1016/j.csda.2014.03.009.