Comparative analysis of the reliability of job performance ratings

Chockalingam Viswesvaran, Deniz S. Ones, Frank L. Schmidt

Research output: Contribution to journal › Review article › peer-review

570 Scopus citations

Abstract

This study used meta-analytic methods to compare the interrater and intrarater reliabilities of ratings of 10 dimensions of job performance used in the literature; ratings of overall job performance were also examined. There was mixed support for the notion that some dimensions are rated more reliably than others. Supervisory ratings appear to have higher interrater reliability than peer ratings. Consistent with H. R. Rothstein (1990), mean interrater reliability of supervisory ratings of overall job performance was found to be .52. In all cases, interrater reliability is lower than intrarater reliability, indicating that the inappropriate use of intrarater reliability estimates to correct for biases from measurement error leads to biased research results. These findings have important implications for both research and practice.
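The abstract's central warning is that correcting an observed validity coefficient with an intrarater (e.g., internal-consistency) reliability estimate, rather than the lower interrater estimate, under-corrects for measurement error. The sketch below illustrates this with Spearman's standard correction for attenuation; the formula is standard psychometrics rather than quoted from the article, and the observed correlation of .30 and the intrarater value of .86 are illustrative assumptions. Only the .52 interrater figure comes from the abstract.

```python
import math

def correct_for_attenuation(observed_r, criterion_reliability):
    """Spearman's correction: disattenuate an observed validity
    coefficient for unreliability in the criterion measure."""
    return observed_r / math.sqrt(criterion_reliability)

# Illustrative observed predictor-criterion correlation (assumed value).
observed_r = 0.30

# Interrater reliability of supervisory ratings of overall job
# performance, per the abstract (.52), versus a higher intrarater
# estimate (illustrative value).
interrater_reliability = 0.52
intrarater_reliability = 0.86

# Using the interrater estimate yields the larger (fuller) correction;
# the intrarater estimate under-corrects, leaving a downward bias.
print(round(correct_for_attenuation(observed_r, interrater_reliability), 3))
print(round(correct_for_attenuation(observed_r, intrarater_reliability), 3))
```

Because the intrarater estimate overstates how reliably the criterion is measured, dividing by its square root removes too little attenuation, which is the biased-results problem the abstract describes.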

Original language: English (US)
Pages (from-to): 557-574
Number of pages: 18
Journal: Journal of Applied Psychology
Volume: 81
Issue number: 5
State: Published - Oct 1996
