A Topical and Methodological Systematic Review of Meta‐Analyses Published in the Educational Measurement Literature

Joseph A. Rios, Samuel D. Ihlenfeldt, Michael Dosedal, Amy L. Riegelman

Research output: Contribution to journal › Article

Abstract

This systematic review investigated the topics studied and the reporting practices of published meta‐analyses in educational measurement. Our findings indicated that meta‐analysis is not a highly utilized methodological tool in educational measurement; on average, fewer than one meta‐analysis has been published per year over the past 30 years (28 meta‐analyses were published between 1986 and 2016). Within the field, researchers have utilized meta‐analysis to study three primary subject areas: test format effects, test accommodations, and the predictive validity of operational testing programs. With regard to reporting practices, authors often failed to provide descriptive details of both their search strategy and sample characteristics, limiting the reproducibility and generalizability of findings, respectively. Furthermore, diagnostic analyses of outliers, publication bias, and statistical power were not provided for the majority of studies, calling into question the validity of inferences made from the meta‐analyses sampled. The lack of transparent and replicable practices in educational measurement meta‐analyses is a concern for generating credible research syntheses that can assist the field in improving evidence‐based practices. Recommendations are provided for improving the training and editorial standards of meta‐analytic research.

Cite this

@article{c9ad05fa67e34a31a1813b5fbc4cba3c,
title = "A Topical and Methodological Systematic Review of Meta‐Analyses Published in the Educational Measurement Literature",
abstract = "This systematic review investigated the topics studied and the reporting practices of published meta‐analyses in educational measurement. Our findings indicated that meta‐analysis is not a highly utilized methodological tool in educational measurement; on average, fewer than one meta‐analysis has been published per year over the past 30 years (28 meta‐analyses were published between 1986 and 2016). Within the field, researchers have utilized meta‐analysis to study three primary subject areas: test format effects, test accommodations, and the predictive validity of operational testing programs. With regard to reporting practices, authors often failed to provide descriptive details of both their search strategy and sample characteristics, limiting the reproducibility and generalizability of findings, respectively. Furthermore, diagnostic analyses of outliers, publication bias, and statistical power were not provided for the majority of studies, calling into question the validity of inferences made from the meta‐analyses sampled. The lack of transparent and replicable practices in educational measurement meta‐analyses is a concern for generating credible research syntheses that can assist the field in improving evidence‐based practices. Recommendations are provided for improving the training and editorial standards of meta‐analytic research.",
author = "Rios, {Joseph A.} and Ihlenfeldt, {Samuel D.} and Michael Dosedal and Riegelman, {Amy L.}",
year = "2019",
month = "7",
day = "25",
doi = "10.35542/osf.io/ygds4",
language = "English (US)",
journal = "Educational Measurement: Issues and Practice",
issn = "0731-1745",
publisher = "Wiley-Blackwell",
}

TY - JOUR

T1 - A Topical and Methodological Systematic Review of Meta‐Analyses Published in the Educational Measurement Literature

AU - Rios, Joseph A.

AU - Ihlenfeldt, Samuel D.

AU - Dosedal, Michael

AU - Riegelman, Amy L.

PY - 2019/7/25

Y1 - 2019/7/25

N2 - This systematic review investigated the topics studied and the reporting practices of published meta‐analyses in educational measurement. Our findings indicated that meta‐analysis is not a highly utilized methodological tool in educational measurement; on average, fewer than one meta‐analysis has been published per year over the past 30 years (28 meta‐analyses were published between 1986 and 2016). Within the field, researchers have utilized meta‐analysis to study three primary subject areas: test format effects, test accommodations, and the predictive validity of operational testing programs. With regard to reporting practices, authors often failed to provide descriptive details of both their search strategy and sample characteristics, limiting the reproducibility and generalizability of findings, respectively. Furthermore, diagnostic analyses of outliers, publication bias, and statistical power were not provided for the majority of studies, calling into question the validity of inferences made from the meta‐analyses sampled. The lack of transparent and replicable practices in educational measurement meta‐analyses is a concern for generating credible research syntheses that can assist the field in improving evidence‐based practices. Recommendations are provided for improving the training and editorial standards of meta‐analytic research.

AB - This systematic review investigated the topics studied and the reporting practices of published meta‐analyses in educational measurement. Our findings indicated that meta‐analysis is not a highly utilized methodological tool in educational measurement; on average, fewer than one meta‐analysis has been published per year over the past 30 years (28 meta‐analyses were published between 1986 and 2016). Within the field, researchers have utilized meta‐analysis to study three primary subject areas: test format effects, test accommodations, and the predictive validity of operational testing programs. With regard to reporting practices, authors often failed to provide descriptive details of both their search strategy and sample characteristics, limiting the reproducibility and generalizability of findings, respectively. Furthermore, diagnostic analyses of outliers, publication bias, and statistical power were not provided for the majority of studies, calling into question the validity of inferences made from the meta‐analyses sampled. The lack of transparent and replicable practices in educational measurement meta‐analyses is a concern for generating credible research syntheses that can assist the field in improving evidence‐based practices. Recommendations are provided for improving the training and editorial standards of meta‐analytic research.

U2 - 10.35542/osf.io/ygds4

DO - 10.35542/osf.io/ygds4

M3 - Article

JO - Educational Measurement: Issues and Practice

JF - Educational Measurement: Issues and Practice

SN - 0731-1745

ER -