Abstract
This paper systematically revisits prior meta-analytic conclusions about the criterion-related validity of personnel selection procedures, with particular attention to the effect of range restriction corrections on those validity estimates. Corrections for range restriction in meta-analyses of predictor–criterion relationships in personnel selection contexts typically rely on an artifact distribution. After outlining and critiquing five approaches that have commonly been used to create and apply range restriction artifact distributions, we conclude that each has significant issues that often result in substantial overcorrection, and that the validity of many selection procedures for predicting job performance has therefore been substantially overestimated. Revisiting prior meta-analytic conclusions in light of these issues produces a revised set of validity estimates. Most of the selection procedures that ranked high in prior summaries remain highly ranked, but with mean validity estimates reduced by .10–.20 points; structured interviews emerge as the top-ranked selection procedure. We also pair validity estimates with information about mean Black–White subgroup differences for each selection procedure, providing information about validity–diversity tradeoffs. We conclude that our selection procedures remain useful, but that selection predictor–criterion relationships are considerably lower than previously thought.
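To illustrate why the size of an assumed range restriction artifact matters, the sketch below (ours, not taken from the paper, and using the standard Thorndike Case II correction for direct range restriction rather than the paper's specific artifact-distribution procedures) shows how the corrected validity grows as the assumed ratio u of restricted to unrestricted predictor standard deviation shrinks; an artifact distribution built from unrealistically small u values will therefore inflate corrected validities.

```python
import math

def correct_for_range_restriction(r_restricted, u):
    """Thorndike Case II correction for direct range restriction.

    r_restricted: observed (range-restricted) predictor-criterion correlation
    u: ratio of restricted to unrestricted predictor SD (0 < u <= 1)
    """
    U = 1.0 / u  # expansion factor (unrestricted SD / restricted SD)
    return (r_restricted * U) / math.sqrt(
        1 - r_restricted**2 + (r_restricted**2) * U**2
    )

# The smaller the assumed u, the larger the corrected validity, so an
# artifact distribution that understates u leads to overcorrection.
observed_r = 0.25  # hypothetical observed validity
for u in (1.0, 0.9, 0.8, 0.7, 0.6, 0.5):
    corrected = correct_for_range_restriction(observed_r, u)
    print(f"u = {u:.1f} -> corrected r = {corrected:.3f}")
```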
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 2040-2068 |
| Number of pages | 29 |
| Journal | Journal of Applied Psychology |
| Volume | 107 |
| Issue number | 11 |
| DOIs | |
| State | Published - Dec 30 2021 |
Bibliographical note
Publisher Copyright: © 2021 American Psychological Association
Keywords
- Artifact distribution
- Meta-analysis
- Range restriction
- Selection procedures
- Validity