Assessing an intervention’s effects on multiple outcomes increases the risk of false positives. Procedures that adjust for this risk can reduce power, that is, the probability of detecting effects that do exist. This post in MDRC’s Reflections on Methodology series discusses how to estimate power when such adjustments are made, as well as alternative definitions of power.
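The core trade-off the post addresses can be illustrated with a short calculation (this sketch is not drawn from the publication; the Bonferroni adjustment is one common procedure, used here only as an example): testing more outcomes inflates the family-wise error rate, and adjusting for it tightens the per-test threshold, which is what reduces power.

```python
# Illustrative sketch (not the publication's method): with m independent
# tests each at significance level alpha, the chance of at least one false
# positive grows with m. A Bonferroni adjustment divides alpha by m, which
# controls that risk but makes each individual test harder to pass.
alpha = 0.05
for m in (1, 3, 5, 10):                      # number of outcomes tested
    fwer = 1 - (1 - alpha) ** m              # family-wise error rate, unadjusted
    per_test = alpha / m                     # Bonferroni per-test threshold
    print(f"{m:2d} outcomes: unadjusted FWER = {fwer:.3f}, "
          f"Bonferroni per-test alpha = {per_test:.4f}")
```

With 5 outcomes, the unadjusted chance of at least one false positive is already about 23 percent, while the Bonferroni threshold shrinks to 0.01 per test.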
To improve outcomes among high-interest borrowers, policymakers need to understand what drives their use of high-interest loans. This second post in MDRC’s Reflections on Methodology series discusses how a data discovery process revealed clusters of borrowers who differed greatly in the kinds of loans and lenders they used and in their loan outcomes.
The SIMPLER framework was developed for the Behavioral Interventions to Advance Self-Sufficiency (BIAS) project, the first major effort to apply behavioral insights to human services programs in the United States. SIMPLER summarizes several key behavioral concepts that can guide practitioners interested in using behavioral insights to enhance service delivery.
Encouraging Additional Summer Enrollment (EASE) aims to increase summer enrollment rates among low-income community college students using insights from behavioral science. This infographic describes some of the benefits of summer enrollment, reasons why students may not enroll in summer, and interventions the EASE team designed to address low enrollment rates.
Machine learning algorithms, when combined with the contextual knowledge of researchers and practitioners, offer service providers nuanced estimates of risk and opportunities to refine their efforts. The first post of a new series, Reflections on Methodology, discusses how MDRC helps organizations make the most of predictive modeling tools.
MDRC launches the first of a five-part web series from the Chicago Community Networks study — a mixed-methods initiative that combines formal social network analysis with in-depth field surveys of community practitioners. It measures how community organizations collaborate on local improvement projects and how they come together to shape public policy.
How a District Might Find a Program That Meets Local Needs
For school districts striving to meet both ESSA requirements and specific educational needs, this infographic shows how evidence can guide decisions. The evaluation of Reading Partners, a one-on-one volunteer tutoring program, serves as an example.
Strategies for Interpreting and Reporting Intervention Effects on Subgroups
This revised paper examines strategies for interpreting and reporting estimates of intervention effects for subgroups of a study sample. Specifically, the paper considers why subgroup findings are important for applied research, why subgroups should be prespecified before analyses are conducted, and how existing theory and prior research can distinguish subgroups for which study findings are confirmatory from those for which they are exploratory.
Howard Bloom’s Remarks on Accepting the Peter H. Rossi Award
In a speech before the Association for Public Policy Analysis and Management Conference on November 5, 2010, Howard Bloom, MDRC’s Chief Social Scientist, accepted the Peter H. Rossi Award for Contributions to the Theory or Practice of Program Evaluation.
This paper is the first step in a study of instrumental variables analysis with randomized trials to estimate the effects of settings on individuals. The goal of the study is to examine the strengths and weaknesses of the approach and present them in ways that are broadly accessible to applied quantitative social scientists.
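The basic logic of instrumental variables analysis can be sketched with simulated data (everything here, including the coefficients and the single-instrument ratio estimator, is an assumption for illustration, not the paper's method): when an unobserved factor drives both treatment and outcome, ordinary regression is biased, but an instrument that affects the outcome only through the treatment recovers the effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta = 2.0                          # true effect of x on y (chosen for the demo)
u = rng.normal(size=n)              # unobserved confounder
z = rng.normal(size=n)              # instrument: shifts x, has no direct path to y
x = 0.8 * z + u + rng.normal(size=n)
y = beta * x + u + rng.normal(size=n)

# Naive OLS slope is biased because u drives both x and y.
ols = np.cov(x, y)[0, 1] / np.var(x)

# Single-instrument IV (Wald ratio): cov(z, y) / cov(z, x) isolates the
# variation in x that is induced by z, which is unrelated to u.
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"OLS estimate: {ols:.2f} (biased upward)")
print(f"IV  estimate: {iv:.2f} (close to the true beta = {beta})")
```

In setting-level applications like those the paper studies, random assignment can play the role of the instrument z; the strengths and weaknesses of that approach are the paper's subject.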