The Effects of Response Instructions on Situational Judgment Test Performance and Validity in a High-Stakes Context

Filip Lievens, Paul R. Sackett, Tine Buyse

Research output: Contribution to journal › Article › peer-review


Abstract

This study fills a key gap in research on response instructions in situational judgment tests (SJTs). The authors examined whether the assumptions behind the differential effects of knowledge and behavioral tendency SJT response instructions hold in a large-scale, high-stakes selection context (i.e., admission to medical college). Candidates (N = 2,184) were randomly assigned to an SJT with either knowledge or behavioral tendency response instructions, while SJT content was kept constant. Contrary to prior research in low-stakes settings, no meaningful differences were found between mean scores under the two response instruction sets. Consistent with prior research, the SJT with knowledge instructions correlated more highly with cognitive ability than did the SJT with behavioral tendency instructions. Finally, no difference was found between the criterion-related validity of the SJTs under the two response instruction sets.

Original language: English (US)
Pages (from-to): 1095-1101
Number of pages: 7
Journal: Journal of Applied Psychology
Volume: 94
Issue number: 4
State: Published - Jul 2009

Keywords

  • high-stakes testing
  • response instructions
  • situational judgment test

