Consistency Theory for the General Nonparametric Classification Method

Chia Yi Chiu, Hans Friedrich Köhn

Research output: Contribution to journal › Article › peer-review



Parametric likelihood estimation is the prevailing method for fitting cognitive diagnosis models—also called diagnostic classification models (DCMs). Nonparametric concepts and methods that do not rely on a parametric statistical model have also been proposed for cognitive diagnosis; these methods are particularly useful when sample sizes are small. The general nonparametric classification (GNPC) method for assigning examinees to proficiency classes can accommodate assessment data conforming to any diagnostic classification model that describes the probability of a correct item response as an increasing function of the number of required attributes mastered by an examinee (the “monotonicity assumption”). Hence, the GNPC method can be used with any model that can be represented as a general DCM. However, the statistical properties of the GNPC estimator of examinees’ proficiency classes have so far been unknown. In this article, consistency theory for the GNPC proficiency-class estimator is developed, and its statistical consistency is proven.
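The core idea behind nonparametric classification, which the GNPC method generalizes, can be illustrated with a minimal sketch: each candidate attribute pattern implies an ideal response vector for the items in the Q-matrix, and an examinee is assigned to the pattern whose ideal responses are closest (in Hamming distance) to the observed responses. The sketch below uses the conjunctive (DINA-type) ideal-response rule purely for illustration; the function names are hypothetical, and the full GNPC method instead uses weighted ideal responses estimated from the data.

```python
import itertools
import numpy as np

def dina_ideal_responses(Q, patterns):
    # Conjunctive (DINA-type) ideal response: 1 iff the attribute pattern
    # masters every attribute the item requires (alpha >= q elementwise).
    # Shapes: patterns (2^K, K), Q (J, K) -> eta (2^K, J).
    return np.all(patterns[:, None, :] >= Q[None, :, :], axis=2).astype(int)

def npc_classify(Y, Q):
    # Assign each examinee to the attribute pattern whose ideal item
    # responses minimize the Hamming distance to the observed responses.
    # Y: (N, J) binary response matrix; Q: (J, K) binary Q-matrix.
    K = Q.shape[1]
    patterns = np.array(list(itertools.product([0, 1], repeat=K)))
    eta = dina_ideal_responses(Q, patterns)                     # (2^K, J)
    dist = np.abs(Y[:, None, :] - eta[None, :, :]).sum(axis=2)  # (N, 2^K)
    return patterns[dist.argmin(axis=1)]                        # (N, K)
```

For example, with Q = [[1,0],[0,1],[1,1]], an examinee who answers only the first item correctly is assigned the pattern (1, 0), whose DINA ideal responses (1, 0, 0) match exactly. The consistency result in the article concerns the behavior of such distance-minimizing estimators as the number of items grows.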

Original language: English (US)
Pages (from-to): 830-845
Number of pages: 16
Issue number: 3
State: Published - Sep 15 2019
Externally published: Yes

Bibliographical note

Funding Information:
Funding was provided by National Science Foundation (Grant No. 1552563). Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Publisher Copyright:
© 2019, The Psychometric Society.


Keywords

  • cognitive diagnosis
  • DINA model
  • DINO model
  • G-DINA model
  • general DCM
  • general nonparametric classification method
  • nonparametric classification
  • Q-matrix


