MLM

Getting GEE estimates using iteratively reweighted least squares (IRLS)

I show this in a recent JEBS article on using Generalized Estimating Equations (GEEs); shown below are some annotated syntax and examples. Huang, F. (2021). Analyzing cross-sectionally clustered data using generalized estimating equations. Journal of Educational and Behavioral Statistics. doi: 10.3102/10769986211017480. In the original paper draft, I had a section showing how much more widely used mixed models (i.e., MLMs, HLMs) are compared to GEEs, but I was asked to remove it to save space.
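To make the IRLS idea concrete, here is a minimal sketch (not the article's code; the simulated data and variable names are just illustrative) for a binary outcome under an independence working correlation, in which case the GEE updates reduce to ordinary GLM iteratively reweighted least squares. The working correlation structure and the cluster-robust standard errors, which are the point of using GEEs with clustered data, are omitted here.

```r
# Minimal IRLS sketch for a logistic model (a GEE with an independence
# working correlation reduces to this). Illustrative simulated data.
set.seed(123)
n <- 200
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 1.2 * x))
X <- cbind(1, x)                      # design matrix with intercept

beta <- rep(0, ncol(X))               # starting values
for (iter in 1:25) {
  eta <- X %*% beta                   # linear predictor
  mu  <- plogis(eta)                  # inverse logit link
  w   <- as.numeric(mu * (1 - mu))    # working weights (variance function)
  z   <- eta + (y - mu) / w           # working response
  beta_new <- solve(t(X) %*% (w * X), t(X) %*% (w * z))  # weighted LS step
  if (max(abs(beta_new - beta)) < 1e-8) { beta <- beta_new; break }
  beta <- beta_new
}
beta
coef(glm(y ~ x, family = binomial))   # should match the IRLS solution
```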

Using REML and ML for estimation

More notes to self… Obtaining estimates of the unknown parameters in multilevel models is often done by optimizing a likelihood function: the estimates are the values that maximize the likelihood given certain distributional assumptions. The likelihood function differs depending on whether maximum likelihood (ML) or restricted maximum likelihood (REML) is used. For ML, the log likelihood function to be maximized is: \[\ell_{ML}(\theta) = -0.5\, n \ln(2\pi) - 0.5 \sum_{i}\ln(\det(V_i)) - 0.5 \sum_{i}(y_i - X_i\beta)^T V_i^{-1} (y_i - X_i\beta)\]
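As a quick numeric check of that formula (a sketch, not part of the post), the code below evaluates the ML log likelihood by hand and compares it with logLik() for a simple linear regression, where \(V\) reduces to \(\sigma^2 I\) and the ML variance estimate is RSS/n; the simulated data are illustrative.

```r
# Evaluate the ML log likelihood by hand:
# -0.5*n*ln(2*pi) - 0.5*ln|V| - 0.5*(y - X*beta)' V^{-1} (y - X*beta)
ml_loglik <- function(y, X, beta, V) {
  r <- y - X %*% beta                         # residuals given beta
  as.numeric(
    -0.5 * length(y) * log(2 * pi) -
      0.5 * determinant(V, logarithm = TRUE)$modulus -
      0.5 * t(r) %*% solve(V, r)
  )
}

set.seed(1)
n <- 50
x <- rnorm(n)
y <- 2 + 3 * x + rnorm(n)
fit <- lm(y ~ x)
sigma2_ml <- sum(resid(fit)^2) / n            # ML (not the unbiased) estimate of sigma^2
ml_loglik(y, cbind(1, x), coef(fit), sigma2_ml * diag(n))
logLik(fit)                                    # should match
```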

Extracting the V matrix for MLMs

Notes to self (and anyone else who might find this useful). With the general linear mixed model (to simplify, I am just omitting super/subscripts): \[Y = X\beta + Zu + e\] where we assume \(u \sim MVN(0, G)\) and \(e \sim MVN(0, R)\). \(V\) is: \[V = ZGZ^T + R\] Software estimates \(V\) iteratively and maximizes the likelihood (the form of which depends on whether ML or REML is used).
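For example, with an lme4 fit, \(V\) can be put together from the pieces that getME() returns (a sketch assuming the standard lmer setup with independent, homoscedastic level-1 errors; sleepstudy is just a convenient built-in dataset). As a sanity check, GLS using the extracted \(V\) should reproduce the fixed effects.

```r
library(lme4)
library(Matrix)

fit <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)

Z  <- getME(fit, "Z")                        # n x q random-effects design matrix
s2 <- sigma(fit)^2                           # residual variance estimate
G  <- s2 * crossprod(getME(fit, "Lambdat"))  # Var(u) = sigma^2 * Lambda Lambda'
R  <- s2 * Diagonal(nrow(Z))                 # level-1 covariance (sigma^2 * I)
V  <- Z %*% G %*% t(Z) + R                   # marginal covariance of Y

# GLS fixed effects computed from V should match fixef(fit)
X    <- getME(fit, "X")
y    <- getME(fit, "y")
Vinv <- solve(V)
beta_gls <- solve(t(X) %*% Vinv %*% X, t(X) %*% Vinv %*% y)
cbind(gls = drop(as.matrix(beta_gls)), lmer = fixef(fit))
```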