A Cost-Effectiveness Analysis of Early Literacy Interventions

Jessica Simon

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy under the Executive Committee of the Graduate School of Arts and Sciences

COLUMBIA UNIVERSITY
2011

© 2011 Jessica Simon
All rights reserved

ABSTRACT

A Cost-Effectiveness Analysis of Early Literacy Interventions

Jessica Simon

Success in early literacy activities is associated with improved educational outcomes, including reductions in dropout risk, in-grade retention, and special education referrals. When considering programs that will work for a particular school and context, cost-effectiveness analysis may provide useful information for decision makers. This study provides information about the cost-effectiveness of four early literacy programs that the What Works Clearinghouse (WWC), an initiative of the U.S. Department of Education that evaluates effectiveness research in education, has determined show evidence of effectiveness: Accelerated Reader, Classwide Peer Tutoring, Reading Recovery, and Success for All. By using meta-analytic techniques to combine effect sizes across studies and by weighting literacy outcomes, the study provides new information about the relative effectiveness of early literacy programs. In particular, the weighting of literacy outcomes casts new light on the relative importance of different kinds of literacy outcomes for creating successful beginning readers. Costs are often ignored, but they are a necessary consideration given budget constraints. Rigorous measurement of program costs and presentation of cost-effectiveness ratios provide information about the relative cost-effectiveness of four "effective" programs. Using meta-analytic results with confidence intervals, Accelerated Reader, a relatively small add-on software program, appears to be more cost-effective than Reading Recovery, a one-to-one tutoring program. Using point estimates for all four programs, Accelerated Reader and Classwide Peer Tutoring, two relatively small add-on programs, appear to be more cost-effective options than Reading Recovery and Success for All, two relatively more intensive interventions. Cost-effectiveness analysis should be one tool for decision makers, considered alongside goals for different subpopulations, individual contexts, and needs.
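To make the comparison summarized above concrete, the following is a purely illustrative calculation in the standard formulation of a cost-effectiveness ratio (cost divided by effectiveness). The dollar amounts, effect sizes, and program labels "A" and "B" are hypothetical placeholders, not the estimates reported in Chapters IV through VI.

\[
\text{CE ratio} = \frac{\text{cost per student}}{\text{effect size (in standard deviation units)}}
\]
\[
\text{Program A: } \frac{\$200}{0.20\,\text{SD}} = \$1{,}000 \text{ per SD of gain}, \qquad
\text{Program B: } \frac{\$2{,}500}{0.50\,\text{SD}} = \$5{,}000 \text{ per SD of gain}
\]

In this hypothetical case, Program B produces the larger effect, but Program A delivers each standard deviation of gain at lower cost and is therefore the more cost-effective option.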
A COST-EFFECTIVENESS ANALYSIS OF EARLY LITERACY INTERVENTIONS

TABLE OF CONTENTS

CHAPTER I: Introduction
CHAPTER II: Reading Programs in the Study
CHAPTER III: Cost-Effectiveness Analysis in Education
CHAPTER IV: Measuring Costs and Resource Use
CHAPTER V: Effects
CHAPTER VI: Results and Discussion: Cost-Effectiveness Ratios
CHAPTER VII: Conclusions and Areas for Future Research
BIBLIOGRAPHY
APPENDIX I: Studies Reviewed With Decision About Acceptance Into Study, and Why Rejected
APPENDIX II: Text of Literacy Specialist Survey Administered to Membership of Massachusetts Reading Association
APPENDIX III: Meta-Regression Results
APPENDIX IV: Interview Guidelines
APPENDIX V: Resource Lists for Reading Programs
APPENDIX VI: Homogeneity Statistics for Meta-Analysis

LIST OF TABLES

Table 1: Description of Intended Populations for Four Reading Programs
Table 2: Overview of Accelerated Reader/Reading Renaissance
Table 3: Overview of Classwide Peer Tutoring
Table 4: Overview of Reading Recovery
Table 5: Overview of Success for All
Table 6: Brief Overview of Steps to Conduct a Rigorous Cost-Effectiveness Analysis
Table 7: Existing Cost-Effectiveness Analyses of Accelerated Reader
Table 8: Existing Cost-Effectiveness Analyses of Reading Recovery
Table 9: Existing Cost-Effectiveness Analysis of Success for All
Table 10: Description of Categories Used in the Ingredients Method
Table 11: Description of Principal Contact Information Received and Responses
Table 12: Assumptions for Cost Calculations
Table 13: Description of Implementations of AR in Effectiveness Studies
Table 14: Cost Estimates to Implement AR for Grades K to 6
Table 15: Description of Implementation in Effectiveness Study of CWPT
Table 16: Cost Estimates to Implement CWPT for Grades K to 6
Table 17: Description of Implementations of Effectiveness Studies of RR
Table 18: Cost Estimates to Implement RR for Struggling Readers in Grade 1
Table 19: Description of Implementations of Effectiveness Studies of SFA
Table 20: Cost Estimates to Implement SFA for Grades K-2
Table 21: Sources of Effectiveness Studies About Reading Programs
Table 22: Decisions for Studies About Reading Programs
Table 23: Description of Accepted Studies of AR for Meta-Analysis
Table 24: Description of Accepted Study of CWPT
Table 25: Description of Accepted Studies of RR for Meta-Analysis
Table 26: Description of Accepted Studies of SFA
Table 27: Weights Generated by Massachusetts Reading Association Survey Responses
Table 28: Unweighted and Weighted Average Effect Sizes for Reading Programs
Table 29: WWC-Approved Studies About AR
Table 30: WWC-Approved Study About CWPT
Table 31: WWC-Approved Studies About RR
Table 32: WWC-Approved Study About SFA
Table 33: WWC Average Effect Sizes for Beginning Reading Programs
Table 34: Comparison of Cost-Effectiveness Ratios for AR and RR
Table 35: Comparison of Cost-Effectiveness Ratios for Four Reading Programs
Table 36: Description of Potential Benefits of Reading Programs

ACKNOWLEDGMENTS

This work would not have been possible without the collaboration, support, and insight of many people. I am grateful to my advisor, Henry Levin, for his wisdom, mentorship, and dedication. I would also like to thank the committee members: Michael Kieffer, for reading and commenting on early drafts with such insight; and Thomas Bailey, Clive Belfield, and Francisco Rivera-Batiz, for their helpful suggestions and feedback. I am also thankful for the gracious collaboration of the Massachusetts Reading Association, and for conversations with national program staff from the four reading programs in the study, as well as with the school leaders who took the time to talk with me about their individual implementations. I am appreciative of comments made by David Wilson and Andrew Gelman regarding the statistical analyses conducted in this manuscript.

DEDICATION

This work is dedicated to my husband, for his confidence in me and his unfailing support, and to my daughter – my favorite developing reader.

CHAPTER I: Introduction

Policymakers and school leaders with limited budgets must determine how best to allocate scarce financial and temporal resources. Decision makers must consider whose performance to improve – for example, whether an educational intervention should focus on improving scores for as many students as possible, for those who are close to meeting state standards, or for particular struggling students. They must also consider the outcome they hope to improve by implementing an educational intervention. Although test scores are readily available, they are probably not the final outcome of interest. Other results, such as high school dropout rates or labor market outcomes, may be of greater importance, but they would require long-term studies that are difficult and expensive to conduct and may not answer the question that is immediately relevant. Often, school leaders are held accountable for students’ scores, so research showing how educational interventions affect test scores may be immediately useful to decision makers. The most effective means of generating test score gains for students may not come from the most cost-effective program (Levin, Glass, & Meister, 1987). Although cost-effectiveness analyses alone cannot drive educational decision making, this study is intended to provide information that may improve the selection of early reading interventions appropriate to a particular school’s or district’s goals and context.

The study provides a rationale for considering early literacy interventions and for the use of cost-effectiveness methodology. It also examines different kinds of reading outcomes and provides new perspective on how agencies like the What Works Clearinghouse (WWC) may help decision makers choose an educational program. The dissertation presents a rigorous cost-effectiveness analysis that:

• Evaluates study and outcome quality based on a specific set of criteria.
• Weights effectiveness measures using results from a survey of literacy professionals to apply specialists’ perspectives on the role of various literacy outcomes in creating successful beginning readers.
• Combines effectiveness measures for two programs via meta-analysis, describing confidence intervals of effect sizes rather than point estimates (a brief illustrative sketch of this pooling step follows this list).
• Estimates program costs from program developer information using the ingredients method.

The study contributes
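The pooling step referenced in the list above can be illustrated with a minimal sketch. This is not the dissertation’s actual computation: the effect sizes, standard errors, and the fixed-effect (inverse-variance) weighting shown here are assumptions chosen only to show how a pooled effect size and its confidence interval are formed.

```python
import math

# Minimal sketch of inverse-variance (fixed-effect) pooling of effect sizes.
# The effect sizes and standard errors below are hypothetical, not estimates
# from the studies reviewed in Chapter V.
studies = [(0.25, 0.10), (0.40, 0.15), (0.10, 0.08)]  # (effect size d, standard error)

weights = [1.0 / se ** 2 for _, se in studies]  # w_i = 1 / SE_i^2
pooled_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval around the pooled effect size
ci_low = pooled_d - 1.96 * pooled_se
ci_high = pooled_d + 1.96 * pooled_se
print(f"pooled d = {pooled_d:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

Reporting the interval rather than only the point estimate supports comparisons like the one summarized in the abstract, where conclusions about relative cost-effectiveness are drawn from meta-analytic results with confidence intervals.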