Formative assessments — assessments that measure what students do and do not know, so that teachers can modify their instruction accordingly — have been widely hailed as a potential vehicle for improving student achievement. Yet little solid research evidence exists about their effectiveness, especially in reform-rich school districts. This study examines the effects of the Formative Assessments of Student Thinking in Reading (FAST-R) initiative in the Boston Public Schools system (BPS), where the use of data to improve instruction is a general priority of the school district. The study looks at changes in reading scores over time at 21 BPS schools that operated FAST-R during the 2005-2006 and 2006-2007 school years and changes at a group of comparison schools serving demographically similar students during the same period.
The Boston Plan for Excellence (BPE), a not-for-profit school reform organization, created and operates FAST-R. The study intervention involved administering a series of short student assessments that focus on reading comprehension skills and whose items resemble those in the Massachusetts Comprehensive Assessment System (MCAS), the state’s “high-stakes” assessment used to measure the performance of both schools and students. BPE staff members compiled the results of the assessments into easy-to-use reports that contained information about each student. Then a BPE instructional data coach met with the teachers at each school to review the reports and to suggest how teachers could respond to students’ learning needs. (One BPE coach served most of the schools, and another BPE coach served the balance.)
The MDRC evaluation includes process and impact analyses. The process analysis found that teachers at the FAST-R schools who took a survey administered as part of the study reported that the professional development they received from the BPE FAST-R coaches was helpful and contributed to their understanding of data and their ability to work with students. At the same time, although the intervention was implemented as intended (it was designed to be flexible, providing as much or as little coaching as administrators and teachers at each school sought), it was not very intensive: the majority of survey respondents spent only one to five hours with the FAST-R data coach during the 2006-2007 school year. Moreover, comparison-school teachers who took the survey reported receiving at least as much professional development as their FAST-R counterparts, were as likely to find it useful, and spent as much or more time analyzing data, including data from other (non-FAST-R) formative assessments.
The impact analysis examines the effects of FAST-R on the reading test scores of third- and fourth-graders. As measured by MCAS reading scores, FAST-R’s impacts on student achievement (that is, the difference that FAST-R made over and above what was going on in the comparison schools) are generally positive but not statistically significant. In other words, these differences could have arisen by chance. Effects on another measure of student reading, the Stanford Achievement Test, are more mixed but are also not statistically significant.
While FAST-R schools put in place a particular model of data utilization, other BPS schools were pursuing similar goals, and this fact, along with the intervention’s lack of intensity, may have undercut the likelihood that it would generate substantial and statistically significant impacts in this evaluation. Thus, this single study in a single district is not the last word on the potential of FAST-R. Much remains to be discovered about how teachers can best learn to use data to improve their instruction and boost the achievement of their students.