How Methodological Features Affect Effect Sizes in Education
Published online on June 21, 2016
Abstract
As evidence becomes increasingly important in educational policy, it is essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. A total of 645 studies from 12 recent reviews of evaluations of preschool, reading, mathematics, and science programs were examined. Effect sizes were roughly twice as large for published articles, small-scale trials, and experimenter-made measures as for unpublished documents, large-scale studies, and independent measures, respectively. Effect sizes were significantly higher in quasi-experiments than in randomized experiments. Excluding tutoring studies, there were no significant differences in effect sizes between elementary and middle/high school studies. Regression analyses found that the effects of all factors were maintained after controlling for all other factors. Explanations for the effects of methodological features on effect sizes are discussed, as are implications for evidence-based policy.
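The effect sizes compared across studies here are standardized mean differences. As a minimal illustrative sketch (not the authors' own analysis code), the standard computation is Cohen's d with Hedges' small-sample correction; the figures below are hypothetical:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference: (treatment mean - control mean) / pooled SD."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

def hedges_g(d, n_t, n_c):
    """Apply Hedges' correction factor J to remove small-sample upward bias in d."""
    j = 1 - 3 / (4 * (n_t + n_c) - 9)
    return j * d

# Hypothetical example: treatment group scores 105 (SD 10), control 100 (SD 10),
# 50 students per group.
d = cohens_d(105, 100, 10, 10, 50, 50)   # d = 0.5
g = hedges_g(d, 50, 50)                   # slightly smaller than d
```

Because the correction factor J is always below 1, small-scale trials report g values slightly below their raw d; the gap shrinks as samples grow, which is one reason sample size must be considered when comparing effect sizes across studies.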