Abstract
There is a long history of examining assessments used in college admissions or personnel selection for predictive bias, also called differential prediction, to determine whether a selection system predicts comparable levels of performance for individuals from different demographic groups who have the same assessment scores. We expand on previous research that has considered predictive bias in individual predictor variables to (a) examine magnitudes of differential prediction in multipredictor selection systems and (b) explore how differences in prediction generalize across samples. We also share updated methods for computing standardized effect sizes for categorically moderated regression models that facilitate the meta-analysis of differential prediction effects. Our findings highlight the importance of analyzing composite predictors when testing for predictive bias in compensatory selection systems and demonstrate the generalizability of long-observed differential prediction trends by race/ethnicity.
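To illustrate what a categorically moderated regression test of predictive bias looks like in practice, the sketch below fits a criterion on a composite predictor with group intercept and slope terms and computes a rough standardized prediction gap. It is a minimal, hypothetical example on synthetic data (the names `sat`, `gpa`, and `group` are invented), not the authors' procedure or their standardized effect-size formulas.

```python
# Minimal sketch of a differential-prediction check via categorically
# moderated regression. Synthetic data; illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Simulate two groups, a composite predictor, and a criterion with a
# small built-in intercept difference for the moderated model to detect.
group = rng.choice(["A", "B"], size=n)
sat = rng.normal(0, 1, size=n)
gpa = 0.5 * sat + np.where(group == "B", -0.15, 0.0) + rng.normal(0, 0.8, size=n)
df = pd.DataFrame({"gpa": gpa, "sat": sat, "group": group})

# Step 1: common regression line ignoring group membership.
common = smf.ols("gpa ~ sat", data=df).fit()

# Step 2: categorically moderated regression -- the group main effect
# tests intercept differences, the sat:group interaction tests slope
# differences.
moderated = smf.ols("gpa ~ sat * group", data=df).fit()
print(moderated.summary().tables[1])

# A crude standardized index of differential prediction: the predicted
# criterion gap between groups at the mean predictor score, scaled by the
# criterion standard deviation (an analogue of, not the article's exact,
# standardized effect size).
at_mean = pd.DataFrame({"sat": [0.0, 0.0], "group": ["A", "B"]})
gap = np.diff(moderated.predict(at_mean))[0]
print("Standardized prediction gap at mean score:", gap / df["gpa"].std())
```

In a compensatory selection system, `sat` would be replaced by the weighted composite of all predictors, which is the level of analysis the abstract emphasizes for bias testing.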
Original language | English (US) |
---|---|
Pages (from-to) | 1995-2012 |
Number of pages | 18 |
Journal | Journal of Applied Psychology |
Volume | 107 |
Issue number | 11 |
DOIs | |
State | Published - Dec 30 2021 |
Bibliographical note
Funding Information: This research was supported by a grant from the College Board. Paul R. Sackett has served as a consultant to the College Board. This relationship has been reviewed and managed by the University of Minnesota in accordance with its conflict of interest policies. This research is derived from data provided by the College Board.
Publisher Copyright:
© 2021 American Psychological Association
Keywords
- Differential prediction
- Meta-analysis
- Predictive bias
- Predictor weighting
- Selection