Observation tools allow researchers to rate practitioners’ use of a program or curriculum according to the time spent on its components and how they were implemented. Reflections on Methodology explains how researchers and model developers can collaborate to develop useful assessment tools and the benefits and challenges involved.
In NYC P-TECH Grades 9-14 schools, students take an integrated sequence of high school and college courses with the goal of earning both a high school diploma and a college degree, while also gaining hands-on work experience. This infographic describes the model and introduces MDRC’s evaluation of it.
The new book Randomistas describes how randomized controlled trials (RCTs) have revolutionized many fields. RCTs are a uniquely powerful tool, but they are not the only way to build knowledge about effective programs for low-income people.
By combining prior beliefs about a program’s effectiveness with new data to produce a distribution of impacts, Bayesian statistics provides an alternative to classical methods that may be more useful for policymaking. Reflections on Methodology discusses some issues with and applications of this approach.
An essential step in the child support process is delivering legal documents to the person named as a parent. This infographic summarizes results from a Georgia intervention that aimed to get parents to come in and accept documents voluntarily instead of using a sheriff or process server to deliver them.
In the second of two posts on the research opportunities presented by school choice systems, Reflections on Methodology discusses a few issues common to lottery-based analyses — constrained statistical power, imperfect compliance, and restricted generalizability.
The Center for Applied Behavioral Science (CABS) combines MDRC’s decades of experience tackling social policy issues with insights from behavioral science. This graphic explains CABS’s approach to solving problems.
In a randomized controlled trial, measuring treatment contrast – the difference between the services received by the program group and those received by the counterfactual group – is critical for understanding what a program’s effects suggest about the best ways to improve services. This paper explains why treatment contrast is important and offers guidance on how to measure it.
A two-stage study design can test a complex set of interventions, individually and in combination. Reflections on Methodology shows how this approach was used for a pair of programs, the first administered in preschools and the second implemented as a kindergarten follow-up for individual students.
Too often, programs and policies do not consider the way people actually think and behave. Behavioral science demonstrates that even small hassles create barriers that prevent those in need of services from receiving them. This infographic provides a brief overview of how the Center for Applied Behavioral Science is improving social services by making use of behavioral insights.