School Vouchers: Results from Randomized Experiments
This PDF is a selection from a published volume from the National Bureau of Economic Research.

Volume Title: The Economics of School Choice
Volume Author/Editor: Caroline M. Hoxby, editor
Volume Publisher: University of Chicago Press
Volume ISBN: 0-226-35533-0
Volume URL: http://www.nber.org/books/hox03-1
Conference Date: February 22-24, 2001
Publication Date: January 2003
Title: School Vouchers: Results from Randomized Experiments
Author: Paul Peterson, William Howell, Patrick J. Wolf, David Campbell
URL: http://www.nber.org/chapters/c10087

4 School Vouchers: Results from Randomized Experiments

Paul E. Peterson, William G. Howell, Patrick J. Wolf, and David E. Campbell

Paul E. Peterson is the Henry Lee Shattuck Professor of Government and director of the Program on Education Policy and Governance at Harvard University. William G. Howell is assistant professor of government at Harvard University. Patrick J. Wolf is assistant professor of public policy at Georgetown University. David E. Campbell is assistant professor of political science at Notre Dame University.

The authors wish to thank the principals, teachers, and staff at the private schools in Dayton, Washington, and New York City who assisted in the administration of tests and questionnaires. We also wish to thank the SCSF, PACE, and WSF for cooperating fully with these evaluations. Kristin Kearns Jordan, Tom Carroll, and other members of the SCSF staff assisted with data collection in New York City. John Blakeslee, Leslie Curry, Douglas Dewey, Laura Elliot, Heather Hamilton, Tracey Johnson, John McCardell, and Patrick Purtill of the Washington Scholarship Fund provided similar cooperation. T. J. Wallace and Mary Lynn Naughton, staff members of Parents Advancing Choice in Education, provided valuable assistance with the Dayton evaluation. Chester E. Finn, Bruno Manno, Gregg Vanourek, and Marci Kanstoroom of the Fordham Foundation, Edward P. St. John of Indiana University, and Thomas Lasley of the University of Dayton provided valuable suggestions throughout various stages of the research design and data collection. We wish to thank especially David Myers of Mathematica Policy Research, a principal investigator of the evaluation of the New York School Choice Scholarship Program; his work on the New York evaluation has influenced in many important ways the design of the Washington and Dayton evaluations. We thank William McCready, Robin Bebel, Kirk Miller, and other members of the staff of the Public Opinion Laboratory at Northern Illinois University for their assistance with data collection, data processing, conduct of the lottery, and preparation of baseline and year-one follow-up data. We are particularly grateful to Tina Elacqua, Matthew Charles, and Brian Harrigan for their key roles in coordinating data collection efforts. We received helpful advice from Paul Hill, Christopher Jencks, Derek Neal, Donald Rock, and Donald Rubin. Daniel Mayer and Julia Chou were instrumental in preparing the New York City survey and test score data and executing many of the analyses reported in the paper.

In the past decade much has been learned about the way in which school vouchers affect low-income families and their children. Ten years ago, the empirical information available about this widely debated question came primarily from a flawed public school choice intervention attempted in Alum Rock, California during the 1960s (Bridge and Blackman 1978; Elmore 1990). In the early and mid-1990s, however, new voucher programs sprouted across the country in such cities as Milwaukee, Cleveland, Indianapolis, and San Antonio.
Initially, the evaluations of these innovations were limited by the quality of the data or the research procedures employed. Often, planning for the evaluation began after the experiment was under way, which made it impossible to gather baseline data or to ensure the formation of an appropriate control group. As a result, the quality of the data collected was not as high as researchers normally would prefer.1

Despite their limitations, these early evaluations provided program operators and evaluation teams with opportunities to learn the problems and pitfalls accompanying the study of school vouchers. Subsequent evaluations of voucher programs in New York, Washington, D.C., and Dayton, Ohio have been designed in such a way as to allow for the collection of higher-quality information about student test score outcomes and parental assessments of public and private schools. Because vouchers in these cities were awarded by lot, program evaluations could be designed as randomized field trials. Prior to conducting the lotteries, the evaluation team collected baseline data on student test scores and family background characteristics. One, two, and three years later, the evaluation team again tested the students and asked parents about their children's school experiences.2

In the absence of response biases that are conditional on treatment status, any statistically significant differences between students offered a voucher and those not offered a voucher may be attributed to the intervention, because the average initial abilities and family backgrounds of the students are similar between the two groups. Students and families who were evaluated entered private school in grades two through five in New York City and grades two through eight in Washington, D.C. and Dayton (and other parts of Montgomery County, Ohio).3 This chapter reports programmatic impacts on student test scores, parents' satisfaction with their child's school, and parent reports of the characteristics of the schools the child attended.

4.1 The Three Voucher Programs

The design of the three voucher programs was similar in key respects, thereby allowing the evaluation team to combine results from the separate evaluations of these programs. All were privately funded; all were targeted at students from low-income families, most of whom lived within the central city; and all provided partial vouchers, which the family was expected to supplement from other resources. All students included in the evaluation had previously been attending public schools. The programs, however, did differ in size, timing, and certain administrative details. In this section we describe the main characteristics of the School Choice Scholarships Foundation program in New York City, the Washington Scholarship Fund program in Washington, D.C., and the Parents Advancing Choice in Education program in the Dayton metropolitan area.

4.1.1 The School Choice Scholarships Foundation Program in New York City

In February 1997, the School Choice Scholarships Foundation (SCSF) announced that it would provide 1,300 scholarships worth up to $1,400 annually for at least three years to children from low-income families then attending public schools. The scholarship could be applied toward the cost of attending a private school, either religious or secular. After announcing the program, SCSF received initial applications from over 20,000 students between February and late April 1997.

To be eligible for a scholarship, children had to be entering grades one through five, live in New York City, attend a public school at the time of application, and come from families with incomes low enough to qualify for the U.S. government's free or reduced-price school lunch program. To ascertain eligibility, students and an adult member of their family were asked to attend verification sessions during which family income and the child's public school attendance were documented.

Subsequent to the lottery, SCSF assisted families in identifying possible private schools their children might attend. By the end of the first year,

Additional research assistance was provided by Rachel Deyette, Jennifer Hill, and Martin West; Antonio Wendland, Tom Polseno, Shelley Weiner, Lilia Halpern, and Micki Morris provided staff assistance. These evaluations have been supported by grants from the following foundations: the Achelis Foundation, Bodman Foundation, Lynde and Harry Bradley Foundation, William Donner Foundation, Thomas B. Fordham Foundation, Milton and Rose D. Friedman Foundation, John M. Olin Foundation, David and Lucile Packard Foundation, Smith Richardson Foundation, Spencer Foundation, and Walton Family Foundation. The methodology, analyses of data, reported findings, and interpretations of findings are the sole responsibility of the authors of this report and are not subject to the approval of SCSF, WSF, PACE, or of any foundation providing support for this research.

1. Disparate findings have emerged from these studies. For example, one analysis of the Milwaukee choice experiment found test score gains in reading and math, particularly after students had been enrolled for three or more years, whereas another study found gains only in math, and a third found gains in neither subject. See Greene, Peterson, and Du (1998); Rouse (1997); and Witte (1997). On the Cleveland program, see Greene, Howell, and Peterson (1998) and Metcalf et al. (1998). Greene, Peterson, and Du (1998) report results from analyses of experimental data; the other studies are based upon analyses of nonexperimental data.

2. Results from the Dayton evaluation after one year are reported in Howell and Peterson (2000). Second-year results for Dayton are described in West, Peterson, and Campbell (2001). First-year results for Washington are reported in Wolf, Howell, and Peterson (2000). Second-year results for Washington are reported in Wolf, Peterson, and West (2001). First-year results from the New York City evaluation are reported in Peterson et al. (1999). Second-year results from New York City are described in Myers et al. (2000). All of the occasional papers mentioned in this note are available at [http://data.fas.harvard.edu/pepg/].

3. Baseline data from the D.C. and Dayton evaluations are reported in Peterson et al. (1998). Baseline data for New York City are reported in Peterson et al. (1997). Both of these reports are available at [http://data.fas.harvard.edu/pepg/].
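The logic of the lottery-based design described above can be illustrated with a short simulation. Because the lottery ignores family background and prior achievement, the group offered a voucher and the group not offered one match, on average, on baseline characteristics, so a simple difference in follow-up means estimates the effect of the offer itself (an intent-to-treat comparison). This is a hedged sketch with invented numbers, not the authors' actual analysis; the sample size, effect size, and score distributions below are all hypothetical.

```python
# Simulated illustration of a lottery-based (randomized) voucher evaluation.
# All quantities are invented for the sketch; none come from the chapter's data.
import random
import statistics

random.seed(42)

N = 10_000                # hypothetical number of eligible applicants
TRUE_OFFER_EFFECT = 3.0   # hypothetical effect of a voucher offer, in score points

# Each applicant has a baseline test score; the lottery pays no attention to it.
baseline = [random.gauss(50, 10) for _ in range(N)]

# Lottery: half of the applicants are randomly offered a voucher.
offered_set = set(random.sample(range(N), N // 2))
is_offered = [i in offered_set for i in range(N)]

# Follow-up score = baseline + noise, plus the offer effect for lottery winners.
followup = [
    b + (TRUE_OFFER_EFFECT if win else 0.0) + random.gauss(0, 5)
    for b, win in zip(baseline, is_offered)
]

def group_mean(values, flags, flag):
    """Mean of `values` restricted to observations whose flag equals `flag`."""
    return statistics.mean(v for v, f in zip(values, flags) if f == flag)

# Randomization balances baseline characteristics between the two groups...
baseline_gap = (group_mean(baseline, is_offered, True)
                - group_mean(baseline, is_offered, False))

# ...so the raw gap in follow-up means estimates the effect of the offer.
itt_estimate = (group_mean(followup, is_offered, True)
                - group_mean(followup, is_offered, False))

print(f"baseline gap between groups: {baseline_gap:+.2f} (should be near zero)")
print(f"intent-to-treat estimate:    {itt_estimate:+.2f} (true effect {TRUE_OFFER_EFFECT:+.2f})")
```

With a sample of this size the baseline gap is close to zero and the intent-to-treat estimate lands near the true offer effect, which is the property the chapter's design relies on: no baseline adjustment is needed for the comparison to be unbiased, although the evaluations collected baseline data anyway to improve precision and to check the randomization.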