
Gradient boosting for linear mixed models

Int J Biostat. 2021 Jan 13;17(2):317-329. doi: 10.1515/ijb-2020-0136.

ABSTRACT

Gradient boosting, from the field of statistical learning, is widely known as a powerful framework for the estimation and selection of predictor effects in various regression models, adapting concepts from classification theory. Current boosting approaches also offer methods that account for random effects and thus enable prediction with mixed models for longitudinal and clustered data. However, these approaches suffer from several flaws: unbalanced effect selection with falsely induced shrinkage and a low convergence rate on the one hand, and biased estimates of the random effects on the other. We therefore propose a new boosting algorithm that explicitly accounts for the random structure by excluding it from the selection procedure, properly correcting the random-effects estimates and, in addition, providing likelihood-based estimation of the random-effects variance structure. The new algorithm offers an organic and unbiased fitting approach, as shown in simulations and data examples.

PMID:34826371 | DOI:10.1515/ijb-2020-0136
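To give a concrete picture of the kind of scheme the abstract describes, here is a minimal, hypothetical Python sketch of componentwise gradient boosting for the fixed effects of a random-intercept model, with the random intercepts updated outside the selection step. This is not the authors' algorithm: the simulated data, the moment-based variance updates, and all names are illustrative assumptions, and the paper's likelihood-based variance estimation is only indicated by a placeholder. Running the sketch prints the boosted fixed-effect estimates next to the simulated true coefficients.

```python
# Hypothetical sketch (not the paper's exact algorithm): componentwise gradient
# boosting for the fixed effects of a random-intercept model, with the random
# intercepts re-estimated outside the boosting selection step.
import numpy as np

rng = np.random.default_rng(0)

# --- simulate clustered data: y = X beta + b[group] + noise ---
n_groups, n_per, p = 20, 10, 5
n = n_groups * n_per
groups = np.repeat(np.arange(n_groups), n_per)
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
b_true = rng.normal(scale=1.0, size=n_groups)
y = X @ beta_true + b_true[groups] + rng.normal(scale=0.5, size=n)

# --- boosting loop ---
nu, n_iter = 0.1, 300          # learning rate and number of iterations
beta = np.zeros(p)             # fixed effects (selected and updated by boosting)
b = np.zeros(n_groups)         # random intercepts (excluded from selection)
tau2, sigma2 = 1.0, 1.0        # variance components (crude updates below)

for it in range(n_iter):
    eta = X @ beta + b[groups]
    u = y - eta                                    # negative gradient of squared loss

    # componentwise base learners: pick the single covariate that fits u best
    coefs = (X * u[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
    rss = ((u[:, None] - X * coefs) ** 2).sum(axis=0)
    j = int(np.argmin(rss))
    beta[j] += nu * coefs[j]                       # weak update of the selected effect

    # random-effects step, outside the selection: shrunken (BLUP-type) group means
    resid = y - X @ beta
    for g in range(n_groups):
        idx = groups == g
        shrink = idx.sum() / (idx.sum() + sigma2 / tau2)
        b[g] = shrink * resid[idx].mean()

    # moment-based updates of the variance components; a placeholder for the
    # likelihood-based estimation described in the abstract
    sigma2 = np.mean((y - X @ beta - b[groups]) ** 2)
    tau2 = max(np.var(b), 1e-6)

print("estimated fixed effects:", np.round(beta, 2))
print("true fixed effects:     ", beta_true)
```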
