Working Paper: School Comparisons in Observational Designs


Full Title: Evaluating Methods for Selecting School-Level Comparisons in Quasi-Experimental Designs: Results from a Within-Study Comparison

Full Abstract: This paper compares the performance of three approaches to selecting school-level comparison units in educational evaluations that rely on matching treatment and comparison schools. In one approach, matching is on “focal” characteristics that are assumed to be related to both treatment assignment and outcome. In another, matching is on geographic proximity, so both sets of schools come from the same “local” area, in this case the same school district. In the third, “hybrid” approach, focal and local attributes are used sequentially: matching first occurs within school districts, and for treatment schools without comparable local matches, focal matches are drawn from non-local schools with otherwise similar observed characteristics. To assess the performance of these three matching approaches, the study employs a within-study comparison design in which treatment effect estimates from a quasi-experimental research design are compared to results from a randomized experiment that shares the same treatment group. We find that focal and local matching each reduce bias by at least 75 percent relative to the simplest two-group design, in which posttest differences are compared without any covariate adjustment. Indeed, after covariate adjustment, all the estimated treatment effects are less than 0.02 SDs from the experimental benchmark, and there are no statistically significant differences between the matched quasi-experimental and the experimental effects. Even so, the hybrid matching approach outperforms both the focal and local approaches, removing more of the initial bias and coming closer to the experimental benchmark.
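
To make the sequential logic of the hybrid approach concrete, the sketch below implements a simplified version in Python. It is an illustrative assumption, not the paper's actual matching procedure: the school records, covariates, distance metric, and caliper threshold are all hypothetical. Each treatment school is first matched to its nearest comparison school within the same district (the local step); any school left without an acceptable local match is then matched against the full non-local pool on observed characteristics (the focal step).

```python
import math

# Hypothetical school records: ids, districts, and standardized covariates
# (e.g., prior achievement, percent FRPL). All values are illustrative.
treatment_schools = [
    {"id": "T1", "district": "D1", "x": (0.10, 0.30)},
    {"id": "T2", "district": "D2", "x": (1.20, -0.40)},
]
comparison_pool = [
    {"id": "C1", "district": "D1", "x": (0.15, 0.25)},
    {"id": "C2", "district": "D3", "x": (1.10, -0.35)},
    {"id": "C3", "district": "D1", "x": (2.00, 2.00)},
]

CALIPER = 0.5  # assumed maximum covariate distance for an acceptable match


def distance(a, b):
    """Euclidean distance on standardized covariates (a stand-in for a
    propensity-score or Mahalanobis distance)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def best_match(treated, candidates):
    """Return the nearest comparison school within the caliper, or None."""
    scored = [(distance(treated["x"], c["x"]), c) for c in candidates]
    scored = [(d, c) for d, c in scored if d <= CALIPER]
    return min(scored, key=lambda dc: dc[0])[1] if scored else None


def hybrid_match(treated_schools, pool):
    """Sequential hybrid matching: local (same-district) match first,
    then a focal fallback drawn from the non-local pool."""
    matches = {}
    for t in treated_schools:
        local = [c for c in pool if c["district"] == t["district"]]
        match = best_match(t, local)  # step 1: local matching
        if match is None:
            non_local = [c for c in pool if c["district"] != t["district"]]
            match = best_match(t, non_local)  # step 2: focal fallback
        matches[t["id"]] = match["id"] if match else None
    return matches


print(hybrid_match(treatment_schools, comparison_pool))
# e.g. {'T1': 'C1', 'T2': 'C2'}
```

In this toy example, T1 finds a close match in its own district (C1), while T2 has no same-district candidates and falls back to the focal step, matching to C2 on observed covariates alone.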

EdPolicyWorks Working Paper Series No. 47. April 2016.