A Comparison of Robust Likelihood Estimators to Mitigate Bias From Rapid Guessing

Research output: Contribution to journal › Article › peer-review


Abstract

Rapid guessing (RG) behavior can undermine measurement properties and score-based inferences. To mitigate this potential bias, practitioners have relied on response time information to identify and filter RG responses. However, response times may be unavailable in many testing contexts, such as paper-and-pencil administrations. When this is the case, self-report measures of effort and person-fit statistics have been used. These methods are limited in that inferences concerning motivation and aberrant responding are made at the examinee level. As test takers can engage in a mixture of solution and RG behavior throughout a test administration, there is a need to limit the influence of potentially aberrant responses at the item level. This can be done by employing robust estimation procedures. Since these estimators have received limited attention in the RG literature, the objective of this simulation study was to evaluate ability parameter estimation accuracy in the presence of RG by comparing maximum likelihood estimation (MLE) to two robust variants, the bisquare and Huber estimators. Two RG conditions were manipulated: RG percentage (10%, 20%, and 40%) and pattern (difficulty-based and changing state). Compared with the MLE procedure, results demonstrated that both the bisquare and Huber estimators reduced bias in ability parameter estimates by as much as 94%. Given that the Huber estimator showed smaller standard deviations of error and performed as well as the bisquare approach under most conditions, it is recommended as a promising approach to mitigating bias from RG when response time information is unavailable.
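The item-level idea behind the robust estimators can be sketched in code. The following is a minimal illustration, not the study's implementation: it assumes a two-parameter logistic (2PL) model and applies the Huber weight function w(r) = min(1, k/|r|) to each item's standardized residual inside the likelihood score equation, so responses that are very improbable at the current ability estimate (e.g., rapid guesses missed on easy items) contribute less. The item parameters, response pattern, and tuning constant k = 1 below are hypothetical.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def weighted_score(theta, x, a, b, k=None):
    """Estimating equation for theta. k=None gives ordinary MLE; a finite k
    applies Huber weights to standardized residuals, downweighting responses
    that are improbable at the current theta (e.g., missed easy items)."""
    p = p_correct(theta, a, b)
    r = (x - p) / np.sqrt(p * (1.0 - p))  # standardized residual per item
    w = np.ones_like(r) if k is None else np.minimum(1.0, k / np.abs(r))
    return np.sum(w * a * (x - p))

def estimate_theta(x, a, b, k=None, lo=-4.0, hi=4.0):
    """Bisection on the score function, which is positive below the
    root and negative above it on this interval."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if weighted_score(mid, x, a, b, k) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical examinee with true ability near 0 on a 21-item test.
b = np.linspace(-2.0, 2.0, 21)        # item difficulties
a = np.ones_like(b)                   # item discriminations
x = (b <= 0).astype(float)            # effortful response pattern
x_rg = x.copy()
x_rg[:3] = 0.0                        # RG: misses the 3 easiest items

theta_mle = estimate_theta(x_rg, a, b)           # ordinary MLE
theta_huber = estimate_theta(x_rg, a, b, k=1.0)  # Huber-weighted
# The Huber estimate is pulled down less by the missed easy items,
# so theta_huber lies above theta_mle, closer to the effortful pattern.
```

In this toy setup the three missed easy items produce large negative residuals that drag the MLE well below zero, while the Huber weights shrink exactly those contributions, which mirrors the item-level (rather than examinee-level) filtering the abstract describes.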

Original language: English (US)
Pages (from-to): 236-249
Number of pages: 14
Journal: Applied Psychological Measurement
Volume: 46
Issue number: 3
DOIs
State: Published - May 2022

Bibliographical note

Funding Information:
The author would like to thank Samuel Ihlenfeldt and Jiayi Deng from the University of Minnesota for their feedback on earlier drafts of the manuscript. The author(s) received no financial support for the research, authorship, and/or publication of this article.

Publisher Copyright:
© The Author(s) 2022.

Keywords

  • item response theory
  • low-stakes testing
  • noneffortful responding
  • rapid guessing
  • robust likelihood estimation
  • validity

