About MDRC

Bloom, MDRC’s Chief Social Scientist from 1999 to 2017, led the development of experimental and quasi-experimental methods for estimating program impacts, working closely with staff members to build these methods into their research. He is well known as an evaluation research methodologist and frequently gives workshops and seminars on this work at academic conferences, university seminars, and meetings of government research organizations. Prior to joining MDRC, Bloom taught research methods, program evaluation, and applied statistics at Harvard University and at New York University, where he received the Great Teacher Award in 1993.

The author of numerous articles and several books, Bloom has been a principal investigator of many randomized experiments and rigorous quasi-experiments, including, among others: a study of the impacts of 105 new small public high schools in New York City, a reanalysis of the national Head Start Impact Study, a study of the impacts of alternative social/emotional interventions for students in Head Start programs, the national impact study of Reading First, the National Job Training Partnership Act study, the Canadian Earnings Supplement Project for displaced workers, and the Delaware Displaced Worker Study.

In 2010 Bloom received the Peter H. Rossi Award for Contributions to the Theory or Practice of Program Evaluation from the Association for Public Policy Analysis and Management. He earned his bachelor’s degree in engineering from Case Western Reserve University and holds a Master of City Planning degree, a Master of Public Administration degree, and a PhD in political economy and government from Harvard University.
MDRC Publications
Brief: A Brief Synthesis of 20 Years of MDRC’s Randomized Controlled Trials
June 2022. What works to help community college students progress academically? This brief synthesizes 20 years of rigorous research by MDRC, presenting new evidence about key attributes of community college interventions that are positively related to larger impacts on students’ academic progress.
Report: Findings from an Evaluation of New York City’s Supervised Release Program
September 2020. In 2016, New York City rolled out Supervised Release, which allowed judges to release defendants under supervision instead of setting bail. The findings in this report suggest that the program reduced the number of defendants detained in jail while maintaining court appearance rates and public safety.
Methodology, April 2018. Multisite randomized trials allow researchers to study both the average impact of an intervention and how the impact varies across settings, which can help guide decisions in policy, practice, and science. This Reflections on Methodology post distills some key considerations for research design and for reporting and interpreting such variation.
Issue Focus: Reflections from a Career in Evaluation Research
July 2017. For 18 years, Howard Bloom, MDRC’s chief social scientist, has led the organization’s development of experimental and quasi-experimental methods for estimating program impacts. In this essay, he reviews some of the lessons he has learned in four decades of research both inside and outside academia.
Methodology, April 2015. Beyond measuring average program impacts, it is important to understand how impacts vary. This paper gives a broad overview of the conceptual and statistical issues involved in using multisite randomized trials to learn about and from variation in program effects across individuals, across subgroups of individuals, and across program sites.
Working Paper, March 2015. Using data from the Head Start Impact Study, this paper examines variation in Head Start effects across individual children, policy-relevant subgroups of children, and Head Start centers. It finds that past estimates of the average effect of Head Start programs mask a wide range of relative program effectiveness.
Methodology: Lessons from a Simulation Study
November 2014. This paper contributes to the literature on multiple-rating regression discontinuity designs (MRRDDs), making concrete recommendations for choosing among existing MRRDD estimation methods, for implementing any chosen method using local linear regression, and for producing accurate statistical inferences.
Working Paper, October 2014. The city’s small, academically nonselective high schools have substantially improved graduation rates for disadvantaged students. This report demonstrates that, because more of their students graduate and do so within four years, the schools have lower costs per graduate than the schools their study counterparts attended.
Methodology, September 2013. This paper examines the properties of two nonexperimental study designs that can be used in educational evaluation: the comparative interrupted time series (CITS) design and the difference-in-difference (DD) design. The paper looks at the internal validity and precision of these two designs, using the example of the federal Reading First program as implemented in a midwestern state.
Report: New Findings About the Effectiveness and Operation of Small Public High Schools of Choice in New York City
August 2013. New data from a rigorous study confirm that New York City’s small public high schools, which have nonselective admissions and serve many disadvantaged students, have substantially improved rates of graduation with Regents diplomas. This report also describes what principals and teachers at these schools believe accounts for their success.
Methodology, June 2013. This paper presents a conceptual framework for designing and interpreting research on variation in program effects. The framework categorizes the sources of program effect variation and helps researchers integrate the study of variation in program effectiveness and program implementation.
Methodology, October 2012. This paper explores the use of instrumental variables analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome.
Methodology, August 2012. Despite the growing popularity of regression discontinuity analysis, there is little accessible information to guide researchers in implementing this research design. This paper provides an overview of the approach and, in easy-to-understand language, offers best practices and general guidance for practitioners.
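The core idea behind the regression discontinuity work described above can be sketched in a few lines: in a sharp design, units with a rating at or above a cutoff receive treatment, and the impact is estimated as the jump in the outcome at the cutoff, fit by local linear regression on each side. The sketch below uses simulated data; the data-generating process, bandwidth, and variable names are illustrative assumptions, not details from any of the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sharp RD: units scoring at or above the cutoff are treated.
n, cutoff, true_effect = 5000, 0.0, 0.4
rating = rng.uniform(-1, 1, n)
treated = (rating >= cutoff).astype(float)
outcome = 1.0 + 0.5 * rating + true_effect * treated + rng.normal(0, 0.3, n)

# Local linear regression within a bandwidth around the cutoff:
# separate intercepts and slopes on each side; the coefficient on the
# treatment dummy is the estimated jump in the outcome at the cutoff.
bandwidth = 0.25
keep = np.abs(rating - cutoff) <= bandwidth
centered = rating[keep] - cutoff
X = np.column_stack([
    np.ones(keep.sum()),
    treated[keep],
    centered,
    centered * treated[keep],
])
beta, *_ = np.linalg.lstsq(X, outcome[keep], rcond=None)
print(f"estimated impact at the cutoff: {beta[1]:.3f}")  # close to 0.4
```

The bandwidth choice trades bias against variance; the methodology papers above discuss principled ways to make it, which this toy example does not attempt.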
Brief, January 2012. A rigorous study that takes advantage of lottery-like features in New York City’s high school admissions process demonstrates that new small public high schools that are open to students of all academic backgrounds have substantial impacts on rates of graduation with Regents diplomas for every disadvantaged subgroup of students that was examined.
Methodology, April 2011. This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools — which comprise three levels of clustering (students in classrooms in schools) — to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing.
Methodology: Strategies for Interpreting and Reporting Intervention Effects on Subgroups
November 2010. This revised paper examines strategies for interpreting and reporting estimates of intervention effects for subgroups of a study sample. Specifically, it considers why and how subgroup findings matter for applied research, the importance of prespecifying subgroups before analyses are conducted, and the importance of using existing theory and prior research to distinguish subgroups for which study findings are confirmatory from those for which they are exploratory.
Methodology: Howard Bloom’s Remarks on Accepting the Peter H. Rossi Award
November 2010. In a speech at the Association for Public Policy Analysis and Management conference on November 5, 2010, Howard Bloom, MDRC’s Chief Social Scientist, accepted the Peter H. Rossi Award for Contributions to the Theory or Practice of Program Evaluation.
Methodology, August 2010. This paper is the first step in a study of using instrumental variables analysis with randomized trials to estimate the effects of settings on individuals. The goal is to examine the strengths and weaknesses of the approach and present them in ways that are broadly accessible to applied quantitative social scientists.
Report: How New York City’s New Small Schools Are Boosting Student Achievement and Graduation Rates
June 2010. Taking advantage of lottery-like features in New York City’s high school admissions process, this study provides rigorous evidence that new small public high schools are narrowing the educational attainment gap and markedly improving graduation prospects, particularly for disadvantaged students.
Methodology, December 2009. This paper provides a detailed discussion of the theory and practice of modern regression discontinuity analysis. It describes how regression discontinuity analysis can provide valid and reliable estimates of general causal effects and of the specific effects of a particular treatment on outcomes for particular persons or groups.
Methodology, December 2009. This paper provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of educational interventions.
Brief: What We Know, What We Don’t, and What’s Next
June 2009. Studies of Reading First released in 2008 found no overall effect on student reading comprehension, and the program was eliminated in 2009. However, the research findings were more nuanced than was widely reported, and they offer lessons for policymakers making critical choices today about how the federal government can best support the teaching of reading to young children.
Methodology, October 2008. This MDRC working paper on research methodology explores two complementary approaches to developing empirical benchmarks for achievement effect sizes in educational interventions.
Report, September 2008. This report presents findings on the effectiveness of two professional development strategies for improving the knowledge and practice of second-grade teachers in high-poverty schools and the reading achievement of their students.
Methodology, July 2008. This MDRC working paper on research methodology provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of interventions on children.
Report: Interim Report
May 2008. This report, written by Abt Associates and MDRC and published by the U.S. Department of Education’s Institute of Education Sciences, finds that Reading First increased the amount of time that teachers spent on the five essential components of reading instruction, as defined by the National Reading Panel. While Reading First did not improve students’ reading comprehension on average, there are indications that some sites had impacts on both instruction and reading comprehension. An overview puts these interim findings in context.
Methodology, July 2007. No universal guideline exists for judging the practical importance of a standardized effect size, a measure of the magnitude of an intervention’s effects. This working paper argues that effect sizes should be interpreted using empirical benchmarks and presents three types in the context of education research.
Methodology, August 2006. This MDRC research methodology working paper examines the core analytic elements of randomized experiments for social research. Its goal is to provide a compact discussion of the design and analysis of randomized experiments for measuring the impact of social or educational interventions.
Methodology: Empirical Guidance for Studies That Randomize Schools to Measure the Impacts of Educational Interventions
November 2005. This paper examines how controlling statistically for baseline covariates (especially pretests) improves the precision of studies that randomize schools to measure the impacts of educational interventions on student achievement.
Report: Promoting Work in Seattle Public Housing During a HOPE VI Redevelopment
October 2005. Early success for this ambitious employment program for public housing residents in Seattle was disrupted by a federal HOPE VI grant to tear down and revitalize the housing development.
Report: Findings and Lessons from First Things First
July 2005. First Things First, a comprehensive school reform initiative, increased student achievement in Kansas City, Kansas, the first school district to adopt the reform model. It is not yet clear whether First Things First is working in the four other school districts where it has been replicated.
Report: The Effectiveness of Jobs-Plus
March 2005. Jobs-Plus, an ambitious employment program inside some of the nation’s poorest inner-city public housing developments, markedly increased the earnings of residents in the sites where it was implemented well.
Methodology, March 2003. This paper illustrates how to design an experimental sample for measuring the effects of educational programs when whole schools are randomized to program and control groups. It addresses such issues as how many schools should be randomized, how many students per school are needed, and what the best mix of program and control schools is.
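Design questions like the ones just listed (how many schools, how many students per school) are commonly answered with a minimum detectable effect size (MDES) calculation for a two-level cluster-randomized design. The sketch below uses the standard formula with a conventional multiplier of roughly 2.8, which corresponds to about 80 percent power at a two-tailed 5 percent significance level; the parameter values are illustrative assumptions, not figures from the paper.

```python
import math

def mdes_cluster_randomized(n_schools, students_per_school, icc,
                            prop_treated=0.5, multiplier=2.8):
    """Minimum detectable effect size (in standard deviation units)
    for a two-level design that randomizes whole schools.

    icc is the intraclass correlation: the share of outcome variance
    that lies between schools rather than between students.
    """
    p = prop_treated
    between = icc / (p * (1 - p) * n_schools)
    within = (1 - icc) / (p * (1 - p) * n_schools * students_per_school)
    return multiplier * math.sqrt(between + within)

# Illustrative: 40 schools, 60 students each, ICC of 0.15.
print(round(mdes_cluster_randomized(40, 60, 0.15), 3))  # prints 0.359
```

Note how the between-school term dominates once there are more than a handful of students per school, which is why adding schools buys far more precision than adding students; baseline covariates (the subject of the November 2005 paper above) shrink that term further.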
Report: A Look at Early Implementation and Impacts on Student Achievement in Eight Elementary Schools
November 2001.
Methodology: New Directions in Evaluations of American Welfare-to-Work and Employment Initiatives
October 2001.
Methodology: Methodological Lessons from an Evaluation of Accelerated Schools
October 2001.
Methodology: The Effects of Program Management and Services, Economic Environment, and Client Characteristics
July 2001.
Methodology: Planning for the Jobs-Plus Demonstration
October 1999.
Methodology: Statistical Implications for the Evaluation of Education Programs
August 1999.
Projects
Over 2 million households receive federal housing subsidies that allow them to rent in the private rental market. The Housing Choice Voucher program, funded by the U.S. Department of Housing and Urban Development (HUD), requires households to pay 30 percent of their incomes toward rent; HUD...
Success Academy is a prominent charter network in New York City with schools located in the Bronx, Brooklyn, Manhattan, and Queens. Across 32 elementary, middle, and high schools in 2014, Success Academy served roughly 9,000 students. In 2014, among the 2,255 students who were age-eligible to take the New York State achievement test, 94 percent were proficient in math...
Current practice for ensuring that impact evaluations in education have adequate statistical power does not take the use of multiplicity adjustments into account. Multiplicity adjustments to p-values protect against spurious statistically significant findings when there are multiple statistical tests (for example, due to multiple outcomes, subgroups, or time points),...
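The multiplicity adjustments described above can be illustrated with the two most common procedures: the Bonferroni correction, which controls the chance of any false positive, and the Benjamini-Hochberg step-up procedure, which controls the false discovery rate. This is a minimal sketch; the p-values are made up for illustration.

```python
def bonferroni(p_values, alpha=0.05):
    """Reject only tests whose p-value clears alpha / (number of tests)."""
    m = len(p_values)
    return [p <= alpha / m for p in p_values]

def benjamini_hochberg(p_values, alpha=0.05):
    """Step-up FDR procedure: sort p-values, find the largest rank k
    with p_(k) <= (k/m) * alpha, and reject all tests up to that rank."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * alpha:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject

p = [0.001, 0.012, 0.021, 0.044, 0.300]
print(bonferroni(p))          # only the smallest p-value survives
print(benjamini_hochberg(p))  # the FDR procedure keeps more findings
```

Because both procedures raise the bar for statistical significance, a study powered without them in mind can end up underpowered once they are applied, which is the design problem this project addresses.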
Beginning in the 1990s, the “Make Work Pay” experiments tested whether offering earnings supplements would increase employment and income and improve family well-being among welfare recipients. The experiments responded to a fundamental challenge: Low-wage jobs often leave families only barely better off financially than even subsistence-level welfare benefits. As a...
Shira Kolnik Mattera, Electra Small, Nina Castells, Barbara S. Goldman, JoAnn Hsueh, Ximena Portilla, Frieda Molina, Howard Bloom, Patrizia Mancini, Sharon Rowser
Head Start, which serves nearly 1 million low-income children, is the nation’s largest federally sponsored early childhood education program. Designed to narrow the gap between disadvantaged children and their more affluent peers, Head Start provides comprehensive programming during the preschool period to improve children’s social competence and academic readiness for...
The New York City public school system is the largest in the United States, with over 1,200 schools and more than 1.1 million students enrolled each year. For more than a decade, it has also been the site of an unprecedented investment in high school reform. Beginning in 2002 and with the support of the Bill & Melinda Gates Foundation and...
Elementary schools that educate children at risk of academic failure have traditionally responded by offering remedial instruction that slows the pace of learning. Research suggests, however, that remediation makes it harder for students to catch up and join the educational mainstream. Accelerated Schools offer a different approach: accelerating learning for all...
The Reading First Program, established under the No Child Left Behind Act of 2001, represents one of the most direct and intensive efforts by the federal government to influence instructional practice and student achievement in low-performing schools. Reading First is predicated on research findings that high-quality reading instruction in the primary grades...