Revisiting the design of selection systems in light of new findings regarding the validity of widely used predictors

Paul R. Sackett, Charlene Zhang, Christopher M. Berry, Filip Lievens

Research output: Contribution to journal › Article › peer-review


Abstract

Sackett et al. (2022) identified previously unnoticed flaws in the way range restriction corrections have been applied in prior meta-analyses of personnel selection tools. They offered revised estimates of operational validity, which are often quite different from the prior estimates. The present paper attempts to draw out the applied implications of that work. We aim to a) present a conceptual overview of the critique of prior approaches to correction, b) outline the implications of this new perspective for the relative validity of different predictors and for the tradeoff between validity and diversity in selection system design, c) highlight the need to attend to variability in meta-analytic validity estimates, rather than just the mean, d) summarize reactions encountered to date to Sackett et al., and e) offer a series of recommendations regarding how to go about correcting validity estimates for unreliability in the criterion and for range restriction in applied work.
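As background for point e), the corrections at issue are the conventional textbook adjustments for criterion unreliability and for direct (Thorndike Case II) range restriction, sketched below for illustration only; the paper's critique concerns how these corrections have been applied in meta-analyses, not the formulas themselves:

\rho_{op} = \frac{r_{xy}}{\sqrt{r_{yy}}}, \qquad
r_c = \frac{r_{xy}/u}{\sqrt{1 + r_{xy}^{2}\left(\dfrac{1}{u^{2}} - 1\right)}}, \qquad
u = \frac{s_x^{\text{restricted}}}{S_x^{\text{unrestricted}}}

where r_{xy} is the observed predictor-criterion correlation, r_{yy} is the reliability of the criterion, and u is the ratio of the restricted to the unrestricted predictor standard deviation.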

Original language: English (US)
Pages (from-to): 283-300
Number of pages: 18
Journal: Industrial and Organizational Psychology
Volume: 16
Issue number: 3
DOIs
State: Published - Sep 9 2023

Bibliographical note

Publisher Copyright:
© The Author(s), 2023. Published by Cambridge University Press on behalf of the Society for Industrial and Organizational Psychology.

Keywords

  • meta-analysis
  • selection
  • validity
