TY - JOUR
T1 - Technical Adequacy of the Data-Based Instruction Knowledge and Skills Assessment in Writing
AU - Choi, Seohyeon
AU - McMaster, Kristen L.
AU - Lembke, Erica S.
AU - Guha, Manjary
N1 - Publisher Copyright:
© Hammill Institute on Disabilities 2024.
PY - 2024/12
Y1 - 2024/12
N2 - Teachers’ knowledge and skills about data-based instruction (DBI) can influence their self-efficacy and their implementation of DBI with fidelity, ultimately playing a crucial role in improving student outcomes. The purpose of this brief report is to provide evidence for the technical adequacy of a measure of DBI knowledge and skills in writing by examining its internal consistency reliability, considering different factor structures, and assessing item statistics using classical test theory and item response theory. We used responses from 154 elementary school teachers, primarily special educators, working with children with intensive early writing needs. Results from confirmatory factor analysis did not strongly favor either a one-factor solution, representing a single dimension of DBI knowledge and skills, or a two-factor solution, comprising knowledge and skills subscales. Internal consistency reliability coefficients were within an acceptable range, especially with the one-factor solution assumed. Item difficulty and discrimination estimates varied across items, suggesting the need to further investigate certain items. We discuss the potential of using the DBI Knowledge and Skills Assessment, specifically in the context of measuring teacher-level DBI outcomes in writing.
KW - assessment
KW - data-based instruction
KW - teacher
UR - http://www.scopus.com/inward/record.url?scp=85193374786&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85193374786&partnerID=8YFLogxK
U2 - 10.1177/15345084241252369
DO - 10.1177/15345084241252369
M3 - Article
AN - SCOPUS:85193374786
SN - 1534-5084
VL - 50
SP - 40
EP - 47
JO - Assessment for Effective Intervention
JF - Assessment for Effective Intervention
IS - 1
ER -