Investigating the Impact of Noneffortful Responses on Individual-Level Scores: Can the Effort-Moderated IRT Model Serve as a Solution?

Joseph A. Rios, James Soland

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Suboptimal effort is a major threat to valid score-based inferences. While the effects of such behavior have been frequently examined in the context of mean group comparisons, minimal research has considered its effects on individual score use (e.g., identifying students for remediation). Focusing on the latter context, this study addressed two related questions via simulation and applied analyses. First, we investigated how much including noneffortful responses in scoring under a three-parameter logistic (3PL) model affects person parameter recovery and classification accuracy for noneffortful responders. Second, we explored whether improvements in these individual-level inferences were observed when employing the effort-moderated IRT (EM-IRT) model under conditions in which its assumptions were met and violated. Results demonstrated that including 10% noneffortful responses in scoring led to average bias in ability estimates and misclassification rates of as much as 0.15 SDs and 7%, respectively. These effects were mitigated when employing the EM-IRT model, particularly when model assumptions were met. However, when model assumptions were violated, the EM-IRT model's performance deteriorated, though it still outperformed the 3PL model. Thus, findings from this study show that (a) including noneffortful responses when scores are used at the individual level can lead to unfounded inferences and potential score misuse, and (b) the negative impact that noneffortful responding has on person ability estimates and classification accuracy can be mitigated by employing the EM-IRT model, particularly when its assumptions are met.
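For readers unfamiliar with the model, the abstract does not reproduce the equations; the following is a minimal sketch of the effort-moderated item response function as it is commonly formulated in the rapid-guessing literature, assuming a response-time threshold and multiple-choice items (the symbols T_ij, tau_i, and k_i are illustrative, not taken from the article):

\[ P_i(\theta_j) \;=\; c_i + \frac{1 - c_i}{1 + \exp\!\left[-a_i(\theta_j - b_i)\right]} \qquad \text{(standard 3PL)} \]

\[ P_i^{*}(\theta_j) \;=\; SB_{ij}\, P_i(\theta_j) \;+\; (1 - SB_{ij})\,\frac{1}{k_i} \qquad \text{(effort-moderated)} \]

\[ SB_{ij} \;=\; \begin{cases} 1 & \text{if } T_{ij} \ge \tau_i \ \text{(solution behavior)} \\ 0 & \text{if } T_{ij} < \tau_i \ \text{(rapid guess)} \end{cases} \]

Under this formulation, an item flagged as a rapid guess contributes only a constant chance probability (1/k_i, with k_i response options) to the likelihood, so it does not inform the estimate of theta; effortful responses are scored with the usual 3PL. The model's key assumptions, referenced in the abstract, include accurate classification of rapid guesses and independence of guessing behavior from ability.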

Original language: English (US)
Pages (from-to): 391-406
Number of pages: 16
Journal: Applied Psychological Measurement
Volume: 45
Issue number: 6
DOIs
State: Published - Sep 2021

Bibliographical note

Funding Information:
The authors would like to thank Hongwen Guo from the Educational Testing Service and Samuel Ihlenfeldt from the University of Minnesota for their helpful comments on an earlier draft. The author(s) received no financial support for the research, authorship, and/or publication of this article.

Publisher Copyright:
© The Author(s) 2021.

Keywords

  • ability estimation
  • classification accuracy
  • noneffortful responding
  • rapid guessing
