Semistructured interviews involve an interviewer asking a set of prespecified, open-ended questions, with follow-up questions shaped by the interviewee's responses. This Reflections on Methodology post describes a semistructured interview protocol recently used to explore how children who experience poverty perceive their situations, their economic status, and public benefit programs.
An essential step in the child support process is delivering legal documents to the person named as a parent. This infographic summarizes results from a Georgia intervention that aimed to get parents to come in and accept documents voluntarily instead of using a sheriff or process server to deliver them.
How a District Might Find a Program That Meets Local Needs
For school districts striving to meet both ESSA requirements and specific educational needs, this infographic shows how evidence can guide decisions. The evaluation of Reading Partners, a one-on-one volunteer tutoring program, serves as an example.
Easing the Transition to Adulthood for Vulnerable Young People
This infographic describes MDRC’s results from the largest random assignment evaluation of a program serving young people aging out of the foster care and juvenile justice systems. After one year, YVLifeSet, a program run by Youth Villages, boosted earnings, increased housing stability and economic well-being, and improved outcomes related to health and safety.
An Empirical Assessment Based on Four Recent Evaluations
This reference report, prepared for the National Center for Education Evaluation and Regional Assistance of the Institute of Education Sciences (IES), uses data from four recent IES-funded experimental design studies that measured student achievement using both state tests and a study-administered test.
This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools — designs that involve three levels of clustering (students in classrooms in schools) — to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing.
This paper provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of educational interventions.
This MDRC working paper on research methodology explores two complementary approaches to developing empirical benchmarks for achievement effect sizes in educational interventions.
This MDRC working paper on research methodology provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of interventions on children.
No universal guideline exists for judging the practical importance of a standardized effect size, a measure of the magnitude of an intervention’s effects. This working paper argues that effect sizes should be interpreted using empirical benchmarks — and presents three types in the context of education research.
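The standardized effect size discussed above is commonly computed as Cohen's d: the difference in group means divided by the pooled standard deviation. The sketch below illustrates the calculation; the scores are invented example data, not figures from the paper.

```python
# Illustrative only: a minimal sketch of a standardized effect size
# (Cohen's d). The scores below are made-up example data.
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference: (mean_T - mean_C) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    var_t, var_c = stdev(treatment) ** 2, stdev(control) ** 2
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical test scores for a tutoring intervention and a comparison group
treated = [78, 82, 75, 88, 91, 80]
comparison = [70, 74, 77, 69, 81, 72]
print(round(cohens_d(treated, comparison), 2))  # → 1.58
```

The paper's point is that a number like this has no fixed interpretation on its own; whether it is "large" depends on empirical benchmarks such as typical effects of similar interventions or normal year-to-year achievement growth.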