Analyzing Social Experiments as Implemented: A Reexamination of the Evidence from the HighScope Perry Preschool Program
Quantitative Economics 1 (2010), 1–46

James Heckman
University of Chicago, University College Dublin, American Bar Foundation, and Cowles Foundation, Yale University

Seong Hyeok Moon
University of Chicago

Rodrigo Pinto
University of Chicago

Peter Savelyev
University of Chicago

Adam Yavitz
University of Chicago

James Heckman: [email protected]
Seong Hyeok Moon: [email protected]
Rodrigo Pinto: [email protected]
Peter Savelyev: [email protected]
Adam Yavitz: [email protected]

A version of this paper was presented at a seminar at the HighScope Perry Foundation, Ypsilanti, Michigan, December 2006, at a conference at the Minneapolis Federal Reserve in December 2007, at a conference on the role of early life conditions at the Michigan Poverty Research Center, University of Michigan, December 2007, at a Jacobs Foundation conference at Castle Marbach, April 2008, at the Leibniz Network Conference on Noncognitive Skills in Mannheim, Germany, May 2008, at an Institute for Research on Poverty conference, Madison, Wisconsin, June 2008, and at a conference on early childhood at the Brazilian National Academy of Sciences, Rio de Janeiro, Brazil, December 2009. We thank the editor and two anonymous referees for helpful comments which greatly improved this draft of the paper.
We have benefited from comments received on early drafts of this paper at two brown bag lunches at the Statistics Department, University of Chicago, hosted by Stephen Stigler. We thank all of the workshop participants. In addition, we thank Amanda Agan, Mathilde Almlund, Joseph Altonji, Ricardo Barros, Dan Black, Steve Durlauf, Chris Hansman, Tim Kautz, Paul LaFontaine, Devesh Raval, Azeem Shaikh, Jeff Smith, and Steve Stigler for helpful comments. Our collaboration with Azeem Shaikh on related work has greatly strengthened the analysis in this paper. This research was supported in part by the American Bar Foundation, the Committee for Economic Development, by a grant from the Pew Charitable Trusts and the Partnership for America's Economic Success, the JB & MK Pritzker Family Foundation, Susan Thompson Buffett Foundation, Robert Dugger, and NICHD R01HD043411. The views expressed in this presentation are those of the authors and not necessarily those of the funders listed here. Supplementary materials for this paper are available online (Heckman, Moon, Pinto, Savelyev, and Yavitz (2010c)).

Copyright © 2010 James Heckman, Seong Hyeok Moon, Rodrigo Pinto, Peter Savelyev, and Adam Yavitz. Licensed under the Creative Commons Attribution-NonCommercial License 3.0. Available at http://www.qeconomics.org. DOI: 10.3982/QE8

Social experiments are powerful sources of information about the effectiveness of interventions. In practice, initial randomization plans are almost always compromised. Multiple hypotheses are frequently tested. "Significant" effects are often reported with p-values that do not account for preliminary screening from a large candidate pool of possible effects. This paper develops tools for analyzing data from experiments as they are actually implemented.

We apply these tools to analyze the influential HighScope Perry Preschool Program. The Perry program was a social experiment that provided preschool education and home visits to disadvantaged children during their preschool years. It was evaluated by the method of random assignment. Both treatments and controls have been followed from age 3 through age 40.

Previous analyses of the Perry data assume that the planned randomization protocol was implemented. In fact, as in many social experiments, the intended randomization protocol was compromised. Accounting for compromised randomization, multiple-hypothesis testing, and small sample sizes, we find statistically significant and economically important program effects for both males and females. We also examine the representativeness of the Perry study.

Keywords. Early childhood intervention, compromised randomization, social experiment, multiple-hypothesis testing.

JEL classification. C93, I21, J15, V16.

1. Introduction

Social experiments can produce valuable information about the effectiveness of interventions. However, many social experiments are compromised by departures from initial randomization plans.1 Many have small sample sizes. Applications of large sample statistical procedures may produce misleading inferences. In addition, most social experiments have multiple outcomes. This creates the danger of selective reporting of "significant" effects from a large pool of possible effects, biasing downward reported p-values. This paper develops tools for analyzing the evidence from experiments with multiple outcomes as they are implemented rather than as they are planned.
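To fix ideas, the sketch below illustrates the two inferential issues just described: exact permutation inference in a small sample and a stepdown adjustment of p-values across multiple outcomes. It is a minimal illustration only, not the procedure developed in this paper, which tailors the set of permutations to the way the Perry randomization was actually carried out; the synthetic data, the difference-in-means test statistic, and the function name stepdown_pvalues are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def stepdown_pvalues(Y, treat, n_perm=2000):
    """Max-T stepdown adjusted permutation p-values for the columns of Y.

    Test statistic: absolute difference in treatment/control means.
    Null distribution: random relabelings of the treatment indicator.
    """
    n_out = Y.shape[1]
    observed = np.array([abs(Y[treat == 1, j].mean() - Y[treat == 0, j].mean())
                         for j in range(n_out)])
    null = np.empty((n_perm, n_out))
    for b in range(n_perm):
        t = rng.permutation(treat)  # relabel treatment status at random
        null[b] = [abs(Y[t == 1, j].mean() - Y[t == 0, j].mean())
                   for j in range(n_out)]
    order = np.argsort(-observed)   # most significant hypothesis first
    adj = np.empty(n_out)
    for step, j in enumerate(order):
        alive = order[step:]        # hypotheses not yet rejected at this step
        max_null = null[:, alive].max(axis=1)
        adj[j] = (1 + np.sum(max_null >= observed[j])) / (1 + n_perm)
    adj[order] = np.maximum.accumulate(adj[order])  # keep adjusted p-values monotone
    return observed, adj

# Synthetic example: 60 subjects, 5 outcomes, true effects on the first two only.
n, k = 60, 5
treat = rng.permutation(np.repeat([1, 0], n // 2))
Y = rng.normal(size=(n, k))
Y[:, :2] += 0.8 * treat[:, None]

effects, p_adj = stepdown_pvalues(Y, treat)
for j in range(k):
    print(f"outcome {j}: |effect| = {effects[j]:.2f}, stepdown-adjusted p = {p_adj[j]:.3f}")

Comparing each outcome's statistic with the maximum of the null statistics over the hypotheses still under consideration is what controls the family-wise error rate when many outcomes are screened; the permutation null avoids reliance on large-sample approximations in a small sample.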
We apply these tools to reanalyze an influential social experiment. The HighScope Perry Preschool Program, conducted in the 1960s, was an early childhood intervention that provided preschool education to low-IQ, disadvantaged African-American children living in Ypsilanti, Michigan. The study was evaluated by the method of random assignment. Participants were followed through age 40 and plans are under way for an age-50 followup. The beneficial long-term effects reported for the Perry program constitute a cornerstone of the argument for early childhood intervention efforts throughout the world.

Many analysts discount the reliability of the Perry study. For example, Hanushek and Lindseth (2009), among others, claim that the sample size of the study is too small to make valid inferences about the program. Herrnstein and Murray (1994) claim that estimated effects of the program are small and that many are not statistically significant. Others express the concern that previous analyses selectively report statistically significant estimates, biasing the inference about the program (Anderson (2008)).

1See the discussion in Heckman (1992), Hotz (1992), and Heckman, LaLonde, and Smith (1999).

There is a potentially more devastating critique. As happens in many social experiments, the proposed randomization protocol for the Perry study was compromised. This compromise casts doubt on the validity of evaluation methods that do not account for the compromised randomization and calls into question the validity of the simple statistical procedures previously applied to analyze the Perry study.2

In addition, there is the question of how representative the Perry population is of the general African-American population. Those who advocate access to universal early childhood programs often appeal to the evidence from the Perry study, even though the project only targeted a disadvantaged segment of the population.3

This paper develops and applies small-sample permutation procedures that are tailored to test hypotheses on samples generated from the less-than-ideal randomizations conducted in many social experiments. We apply these tools to the data from the Perry experiment. We correct estimated treatment effects for imbalances that arose in implementing the randomization protocol and from post-randomization reassignment. We address the potential problem that arises from arbitrarily selecting "significant" hypotheses from a set of possible hypotheses using recently developed stepdown multiple-hypothesis testing procedures. The procedures we use