Changing abilities vs. changing tasks

Examining validity degradation with test scores and college performance criteria both assessed longitudinally

Jeffrey A. Dahlke, Jack W. Kostal, Paul R. Sackett, Nathan R. Kuncel

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

We explore potential explanations for validity degradation using a unique predictive validation data set containing up to four consecutive years of high school students' cognitive test scores and four complete years of those students' college grades. This data set permits analyses that disentangle the effects of predictor-score age and timing of criterion measurements on validity degradation. We investigate the extent to which validity degradation is explained by criterion dynamism versus the limited shelf-life of ability scores. We also explore whether validity degradation is attributable to fluctuations in criterion variability over time and/or GPA contamination from individual differences in course-taking patterns. Analyses of multiyear predictor data suggest that changes to the determinants of performance over time have much stronger effects on validity degradation than does the shelf-life of cognitive test scores. The age of predictor scores had only a modest relationship with criterion-related validity when the criterion measurement occasion was held constant. Practical implications and recommendations for future research are discussed.
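To make the design described in the abstract concrete, the sketch below builds a grid of criterion-related validity coefficients with predictor (test) year crossed against criterion (GPA) year: reading down a column holds the criterion measurement occasion constant while the age of the predictor score varies, and reading across a row does the reverse. This is only an illustration of the general analytic idea, not the authors' analysis; the variable names and simulated data are hypothetical.

import numpy as np
import pandas as pd

# Hypothetical data: a common ability factor plus year-specific noise stands in
# for four years of high-school test scores and four years of college GPA.
rng = np.random.default_rng(0)
n = 1000
ability = rng.normal(size=n)
tests = {f"test_hs{y}": ability + rng.normal(scale=0.5, size=n) for y in range(1, 5)}
gpas = {f"gpa_col{y}": ability + rng.normal(scale=0.5 + 0.2 * y, size=n) for y in range(1, 5)}
df = pd.DataFrame({**tests, **gpas})

# Validity grid: rows index the predictor (test) year, columns the criterion (GPA) year.
grid = pd.DataFrame(
    [[df[t].corr(df[g]) for g in gpas] for t in tests],
    index=list(tests),
    columns=list(gpas),
)
print(grid.round(2))

# Reading down a column holds the criterion occasion constant while predictor-score
# age varies; reading across a row holds the predictor constant while the criterion
# occasion varies.

In data patterned after the abstract's findings, the coefficients would fall off mainly across rows (later criterion occasions) rather than down columns (older predictor scores).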

Original language: English (US)
Pages (from-to): 980-1000
Number of pages: 21
Journal: Journal of Applied Psychology
Volume: 103
Issue number: 9
DOI: 10.1037/apl0000316
State: Published - Sep 1, 2018

Fingerprint

  • Aptitude
  • Students
  • Individuality
  • Datasets

Keywords

  • Cognitive ability
  • College performance
  • Dynamic criteria
  • Validity
  • Validity degradation

PubMed: MeSH publication types

  • Journal Article

Cite this

@article{b51f31fbe46f49269b7582d944463322,
title = "Changing abilities vs. changing tasks: Examining validity degradation with test scores and college performance criteria both assessed longitudinally",
abstract = "We explore potential explanations for validity degradation using a unique predictive validation data set containing up to four consecutive years of high school students' cognitive test scores and four complete years of those students' college grades. This data set permits analyses that disentangle the effects of predictor-score age and timing of criterion measurements on validity degradation. We investigate the extent to which validity degradation is explained by criterion dynamism versus the limited shelf-life of ability scores. We also explore whether validity degradation is attributable to fluctuations in criterion variability over time and/or GPA contamination from individual differences in course-taking patterns. Analyses of multiyear predictor data suggest that changes to the determinants of performance over time have much stronger effects on validity degradation than does the shelf-life of cognitive test scores. The age of predictor scores had only a modest relationship with criterion-related validity when the criterion measurement occasion was held constant. Practical implications and recommendations for future research are discussed.",
keywords = "Cognitive ability, College performance, Dynamic criteria, Validity, Validity degradation",
author = "Dahlke, {Jeffrey A.} and Kostal, {Jack W.} and Sackett, {Paul R.} and Kuncel, {Nathan R.}",
year = "2018",
month = "9",
day = "1",
doi = "10.1037/apl0000316",
language = "English (US)",
volume = "103",
pages = "980--1000",
journal = "Journal of Applied Psychology",
issn = "0021-9010",
publisher = "American Psychological Association",
number = "9",
}
