Computerized Adaptive Testing in Early Education: Exploring the Impact of Item Position Effects on Ability Estimation

Anthony D. Albano, Liuhan Cai, Erin M. Lease, Scott R. McConnell

Research output: Contribution to journal › Article › peer-review

5 Scopus citations


Studies have shown that item difficulty can vary significantly based on the context of an item within a test form. In particular, item position may be associated with practice and fatigue effects that influence item parameter estimation. The purpose of this research was to examine the relevance of item position specifically for assessments used in early education, an area of testing that has received relatively limited psychometric attention. In an initial study, multilevel item response models fit to data from an early literacy measure revealed statistically significant increases in difficulty for items appearing later in a 20-item form. The estimated linear change in logits for an increase of 1 in position was .024, resulting in a predicted change of .46 logits for a shift from the beginning to the end of the form. A subsequent simulation study examined impacts of item position effects on person ability estimation within computerized adaptive testing. Implications and recommendations for practice are discussed.
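The reported position effect can be illustrated with a short sketch. This is not the authors' model code; it is a hypothetical example assuming a simple linear position adjustment to item difficulty, b_k = b + delta * (k - 1), with the abstract's slope of .024 logits per position, evaluated under a standard Rasch response function. The function names are illustrative.

```python
import math

DELTA = 0.024  # estimated logit increase in difficulty per one-position shift


def position_adjusted_difficulty(b, position):
    """Difficulty of an item with baseline difficulty b at a 1-based position,
    assuming a linear position effect of DELTA logits per position."""
    return b + DELTA * (position - 1)


def rasch_prob(theta, b):
    """Rasch model probability of a correct response for ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))


# Moving an item from position 1 to position 20 spans 19 position steps,
# so its difficulty shifts by 19 * 0.024 = 0.456 logits (the ~.46 reported).
shift = position_adjusted_difficulty(0.0, 20) - position_adjusted_difficulty(0.0, 1)
print(round(shift, 3))  # 0.456

# For an examinee with theta = 0 and a baseline-neutral item (b = 0), that
# shift lowers the probability of a correct response from .50 to about .39.
p_late = rasch_prob(0.0, position_adjusted_difficulty(0.0, 20))
print(round(p_late, 2))  # 0.39
```

A shift of roughly a tenth in response probability is large enough to bias ability estimates if the position effect is ignored during CAT item calibration, which is the concern the simulation study addresses.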

Original language: English (US)
Pages (from-to): 437-451
Number of pages: 15
Journal: Journal of Educational Measurement
Issue number: 2
State: Published - Jun 1 2019

Bibliographical note

Funding Information:
This work was supported in part by grant R305A160034 from the Institute of Education Sciences, U.S. Department of Education.

Publisher Copyright:
© 2019 by the National Council on Measurement in Education
