HKS Faculty Research Working Paper Series
HKS Working Paper No. RWP01-019
A youth homicide reduction initiative in Boston in the mid-1990s poses particular difficulties for program evaluation because it had no control group and its exact implementation date is unknown. A standard methodology in program evaluation is to use time-series variation to compare pre- and post-program outcomes. Such an approach is not valid, however, when the timing of a potential break is unknown. To evaluate the Boston initiative, we adapt from the macroeconomics literature a test for a structural break at an unknown date to detect a change in regime. Tests for parameter instability provide a flexible framework for testing a range of hypotheses commonly posed in program evaluation: they both pinpoint the timing of the maximal break and provide a valid test of statistical significance. We evaluate the results of the estimation using the asymptotic results in the literature and with our own Monte Carlo analyses. We conclude that there was a statistically significant discontinuity in youth homicide incidents (on the order of 60 percent) shortly after the intervention was unveiled.
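The unknown-break-point approach the abstract describes can be sketched as follows. This is not the authors' code: it is a minimal illustration of the generic sup-F ("Quandt-Andrews") idea, in which a Chow-type F statistic for a mean shift is computed at every candidate break date in a trimmed window and the maximum is taken. The simulated count series, the break at period 60, and the 15 percent trimming fraction are all illustrative assumptions; note also that the sup-F statistic has a nonstandard null distribution, so conventional F critical values do not apply.

```python
# Sketch of a sup-F test for a mean shift at an unknown break date.
# Illustrative only -- not the paper's estimation code or data.
import numpy as np

def sup_f_mean_shift(y, trim=0.15):
    """Return (sup-F statistic, estimated break index) for a single mean shift.

    For each candidate break k in the trimmed sample, compare the restricted
    model (one mean) against the unrestricted model (separate pre- and
    post-break means) via an F statistic; the break estimate is the k that
    maximizes F. Critical values are nonstandard (Andrews 1993).
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    lo = int(np.floor(trim * n))
    hi = int(np.ceil((1 - trim) * n))
    ssr_r = np.sum((y - y.mean()) ** 2)          # restricted: constant mean
    best_f, best_k = -np.inf, None
    for k in range(lo, hi):                      # candidate break before index k
        ssr_u = (np.sum((y[:k] - y[:k].mean()) ** 2)
                 + np.sum((y[k:] - y[k:].mean()) ** 2))
        f = (ssr_r - ssr_u) / (ssr_u / (n - 2))  # 1 restriction, 2 parameters
        if f > best_f:
            best_f, best_k = f, k
    return best_f, best_k

# Simulated monthly counts with a downward level shift at t = 60, loosely
# mimicking a post-intervention drop in homicide incidents (purely invented).
rng = np.random.default_rng(0)
y = np.concatenate([rng.poisson(5.0, 60), rng.poisson(2.0, 40)])
stat, k = sup_f_mean_shift(y)
print(f"sup-F = {stat:.1f} at candidate break index {k}")
```

Searching over all trimmed dates, rather than fixing the break at the announced program date, is what lets the procedure both locate the break and deliver a test that remains valid when the implementation date is unknown.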
Piehl, Anne Morrison, Suzanne J. Cooper, David M. Kennedy, and Anthony A. Braga. "Testing for Structural Breaks in the Evaluation of Programs." KSG Faculty Research Working Papers Series RWP01-019, April 2001.