Refinements to Effect Sizes for Tests of Categorical Moderation and Differential Prediction

Jeffrey A. Dahlke, Paul R. Sackett

Research output: Contribution to journal › Article › peer-review


Abstract

We provide a follow-up treatment of Nye and Sackett's (2017) recently proposed dMod standardized effect-size measures for categorical-moderation analyses. We offer several refinements to Nye and Sackett's effect-size equations that increase the precision of dMod estimates by accounting for asymmetries in predictor distributions, facilitate the interpretation of moderated effects by separately quantifying positive and negative differences in prediction, and permit the computation of nonparametric effect sizes. To aid implementation, we provide software written in the R programming language that computes Nye and Sackett's effect sizes with all of our refinements and includes options for easily computing bootstrapped standard errors and confidence intervals.
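The general idea behind the dMod family can be illustrated with a small numeric sketch. The code below is not the authors' R software and does not reproduce their exact equations; it is a hypothetical Python illustration of the core logic described in the abstract: fit separate regression lines in a referent and a focal group, average the standardized difference in predicted criterion scores over the focal group's predictor distribution, split the positive and negative portions of that difference (one of the refinements noted above), and attach a naive bootstrap confidence interval. All variable names and simulated parameters here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated referent (group 1) and focal (group 2) data -- illustrative only.
n = 1000
x1 = rng.normal(0.0, 1.0, n)
y1 = 0.5 * x1 + rng.normal(0.0, 1.0, n)
x2 = rng.normal(-0.5, 1.0, n)
y2 = 0.3 * x2 - 0.2 + rng.normal(0.0, 1.0, n)

def d_mod_sketch(x1, y1, x2, y2):
    """Signed, positive-part, and negative-part prediction differences.

    Fits a simple regression in each group, evaluates both lines at the
    focal group's observed predictor scores, and scales the difference by
    the referent group's criterion SD (a sketch of the dMod idea, not the
    published equations).
    """
    b1, a1 = np.polyfit(x1, y1, 1)  # referent slope, intercept
    b2, a2 = np.polyfit(x2, y2, 1)  # focal slope, intercept
    diff = ((a1 + b1 * x2) - (a2 + b2 * x2)) / y1.std(ddof=1)
    signed = diff.mean()                       # overall signed effect
    over = diff[diff > 0].sum() / diff.size    # referent line predicts higher
    under = diff[diff < 0].sum() / diff.size   # referent line predicts lower
    return signed, over, under

signed, over, under = d_mod_sketch(x1, y1, x2, y2)

# Naive percentile bootstrap for the signed effect, resampling within groups.
boot = []
for _ in range(200):
    i1 = rng.integers(0, n, n)
    i2 = rng.integers(0, n, n)
    boot.append(d_mod_sketch(x1[i1], y1[i1], x2[i2], y2[i2])[0])
ci = np.percentile(boot, [2.5, 97.5])
```

By construction the signed effect decomposes into the positive and negative parts (`signed == over + under`), which is what makes separately reporting over- and under-prediction informative when the two directions partially cancel.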

Original language: English (US)
Pages (from-to): 226-234
Number of pages: 9
Journal: Organizational Research Methods
Volume: 21
Issue number: 1
DOIs
State: Published - Jan 1 2018

Keywords

  • bias
  • categorical moderation
  • differential prediction
  • effect size
  • multiple regression

