Computerized Adaptive Testing in Early Education: Exploring the Impact of Item Position Effects on Ability Estimation

Anthony D. Albano, Liuhan Cai, Erin M. Lease, Scott R. McConnell

Research output: Contribution to journal › Article

Abstract

Studies have shown that item difficulty can vary significantly based on the context of an item within a test form. In particular, item position may be associated with practice and fatigue effects that influence item parameter estimation. The purpose of this research was to examine the relevance of item position specifically for assessments used in early education, an area of testing that has received relatively limited psychometric attention. In an initial study, multilevel item response models fit to data from an early literacy measure revealed statistically significant increases in difficulty for items appearing later in a 20-item form. The estimated linear change in logits for an increase of 1 in position was 0.024, resulting in a predicted change of 0.46 logits for a shift from the beginning to the end of the form. A subsequent simulation study examined impacts of item position effects on person ability estimation within computerized adaptive testing. Implications and recommendations for practice are discussed.
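The reported linear drift can be sketched as a simple adjustment to Rasch item difficulties. This is an illustrative reconstruction of the arithmetic in the abstract, not code from the study; the function names and the baseline parameters (an average examinee at theta = 0 and an average item at b = 0) are assumptions for the example:

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Estimated linear position effect from the abstract:
# +0.024 logits in difficulty per one-position increment.
SLOPE = 0.024

def position_adjusted_difficulty(b, position, slope=SLOPE):
    """Difficulty of an item at a given (1-indexed) position,
    assuming the linear drift reported in the abstract."""
    return b + slope * (position - 1)

# Shift from the first to the last position on a 20-item form:
shift = SLOPE * (20 - 1)  # 0.456 logits, ~0.46 as reported

# Illustrative effect on response probability for an average
# examinee (theta = 0) and an average item (b = 0):
p_first = rasch_prob(0.0, position_adjusted_difficulty(0.0, 1))
p_last = rasch_prob(0.0, position_adjusted_difficulty(0.0, 20))
```

Under these assumed values, the same item administered last rather than first yields a noticeably lower probability of a correct response, which is the mechanism by which unmodeled position effects can bias ability estimates in adaptive testing.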

Original language: English (US)
Pages (from-to): 437-451
Number of pages: 15
Journal: Journal of Educational Measurement
Volume: 56
Issue number: 2
DOIs
State: Published - Jun 1, 2019
