In a speech before the Association for Public Policy Analysis and Management Conference on November 7, 2008, Judith M. Gueron, President Emerita and Scholar in Residence at MDRC, accepted the Peter H. Rossi Award for Contributions to the Theory or Practice of Program Evaluation.
This MDRC working paper on research methodology explores two complementary approaches to developing empirical benchmarks for achievement effect sizes in educational interventions.
This MDRC working paper on research methodology provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of interventions on children.
No universal guideline exists for judging the practical importance of a standardized effect size, a measure of the magnitude of an intervention’s effects. This working paper argues that effect sizes should be interpreted using empirical benchmarks, and it presents three types of benchmarks in the context of education research.
Empirical Guidance for Studies That Randomize Schools to Measure the Impacts of Educational Interventions
This paper examines how controlling statistically for baseline covariates (especially pretests) improves the precision of studies that randomize schools to measure the impacts of educational interventions on student achievement.
Drawing on 427 classroom observations conducted over a three-year period, this study traces changes in teachers’ instructional practices in the First Things First schools.