Abstract
A large body of research has focused on theory and computation for variable selection techniques for high-dimensional data. There has been substantially less work on the big “tall” data paradigm, where the number of variables may be large but the number of observations is much larger. The orthogonalizing expectation maximization (OEM) algorithm is one approach for computing penalized models that excels in the big tall data regime. The oem package is an efficient implementation of the OEM algorithm that provides a multitude of computation routines with a focus on big tall data, such as a function for out-of-memory computation and routines for large-scale parallel computation of penalized regression models. Furthermore, in this paper we propose a specialized implementation of the OEM algorithm for cross-validation, dramatically reducing the computing time for cross-validation over a naive implementation.
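The package's interface can be illustrated with a short R sketch. The calls below assume the documented oem() and xval.oem() functions (the latter being the specialized cross-validation routine referred to above); argument names and defaults are illustrative rather than definitive.

```r
## Minimal sketch: fitting penalized regressions on "tall" simulated data
## with the oem package. Function and argument names assume the package's
## documented interface (oem(), xval.oem()); treat the details as illustrative.
library(oem)

set.seed(1)
n <- 100000                       # many observations ("tall" data)
p <- 50                           # comparatively few variables
x <- matrix(rnorm(n * p), n, p)
y <- drop(x %*% c(rep(1, 5), rep(0, p - 5))) + rnorm(n)

## Fit lasso and MCP solution paths in a single call; OEM reuses the same
## precomputed quantities across penalties, which is what makes it cheap
## when n is much larger than p.
fit <- oem(x, y, penalty = c("lasso", "mcp"))

## Specialized cross-validation implementation described in the paper,
## which avoids recomputing the expensive quantities for every fold.
cvfit <- xval.oem(x, y, penalty = c("lasso", "mcp"), nfolds = 10)
```

The out-of-memory computation mentioned in the abstract is handled by a separate routine (big.oem() in the package) that operates on file-backed big.matrix objects from the bigmemory package.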
| Original language | English (US) |
|---|---|
| Journal | Journal of Statistical Software |
| Volume | 104 |
| Issue number | 6 |
| DOIs | |
| State | Published - 2022 |
Bibliographical note
Funding Information: This material is based upon work supported, in whole or in part, by NSF Grants DMS 1055214 and DMS 1564376, and NIH grant T32HL083806.
Publisher Copyright:
© 2022, American Statistical Association. All rights reserved.
Keywords
- C++
- MCP
- OpenMP
- expectation maximization
- lasso
- optimization
- out-of-memory computing
- parallel computing