Observation tools allow researchers to rate practitioners’ use of a program or curriculum according to the time spent on its components and how those components were implemented. Reflections on Methodology explains how researchers and model developers can collaborate to develop useful assessment tools, and describes the benefits and challenges of doing so.
In 2018, MDRC released more than 80 reports, briefs, infographics, blog posts, podcasts, videos, and technical assistance tools — all with the goal of improving policies and programs that affect low-income Americans. Check out 10 of our most popular releases.
The new book Randomistas describes how randomized controlled trials (RCTs) have revolutionized many fields. RCTs are a uniquely powerful tool, but they are not the only way to build knowledge about effective programs for low-income people.
A “One-Page Protocol” Approach
How can researchers increase the likelihood that focus groups produce information that addresses key implementation questions? The Implementation Research Incubator presents an example of a simple protocol designed to encourage conversational flow about a complex issue — cooperative learning — while helping the interviewers explore core topics in depth.
By combining prior beliefs about a program’s effectiveness with new data to produce a distribution of impacts, Bayesian statistics provides an alternative to classical methods that may be more useful for policymaking. Reflections on Methodology discusses some issues with and applications of this approach.
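To make the mechanics concrete (this example is illustrative, not drawn from the publication itself), consider the common normal-normal case: if prior evidence suggests an impact $\theta \sim N(\mu_0, \tau^2)$ and a new evaluation produces an estimate $\hat{\theta}$ with sampling variance $\sigma^2$, the posterior distribution is

\[
\theta \mid \hat{\theta} \sim N\!\left(\frac{\sigma^2 \mu_0 + \tau^2 \hat{\theta}}{\sigma^2 + \tau^2},\; \frac{\sigma^2 \tau^2}{\sigma^2 + \tau^2}\right),
\]

a precision-weighted average that shrinks the new estimate toward the prior. Researchers can then report, for example, the posterior probability that the impact exceeds zero, rather than only a point estimate and p-value.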
With its new Center for Data Insights, MDRC is furthering its long-standing commitment to helping our partners improve their programs and systems. This Issue Focus describes how the Center supports government agencies, educational institutions, and nonprofit organizations in using their data to refine and target their services.
Studying implementation is a multidisciplinary exercise requiring careful planning and coordination. This post, the first of several from the Implementation Research Incubator describing the processes and procedures of MDRC’s work, shows how a wide array of staff members contribute to the effort.
A key to interpreting study findings is considering not just the features of a program being tested, but also how it differs from business as usual — which may change over the course of the evaluation. The Implementation Research Incubator discusses guidelines for measuring this contrast.
In the second of two posts on the research opportunities presented by school choice systems, Reflections on Methodology discusses a few issues common to lottery-based analyses — constrained statistical power, imperfect compliance, and restricted generalizability.
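For readers unfamiliar with how imperfect compliance is typically handled (an illustration, not material from the post itself): under standard assumptions, lottery-based analyses recover a local average treatment effect for compliers via the Wald estimator,

\[
\widehat{\text{LATE}} = \frac{E[Y \mid Z = 1] - E[Y \mid Z = 0]}{E[D \mid Z = 1] - E[D \mid Z = 0]},
\]

where $Z$ indicates winning the lottery (the offer) and $D$ indicates actually enrolling. A small compliance rate in the denominator inflates the variance of the estimate, which constrains statistical power, and the effect applies only to compliers, which restricts generalizability.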
The Experience of a New Program for Young People Involved in the Juvenile Justice System
STRIVE International engaged MDRC to help the organization improve a new program model aimed at increasing the educational attainment and employment of young adults involved in the juvenile justice system. This Issue Focus describes the partnership and offers advice on how organizations implementing new programs can build evidence of effectiveness.