
Effects of Preschool Curriculum Programs on School Readiness

Report from the Preschool Curriculum Evaluation Research Initiative

NCER 2008-2009
U.S. Department of Education

July 2008

Preschool Curriculum Evaluation Research Consortium

This report was prepared for the National Center for Education Research, Institute of Education Sciences, under contract numbers ED-01-CO-0039/0005 (Mathematica Policy Research, Inc.), ED-01-CO-0052/0004 (RTI International), and ED-04-CO-0076/0007 (Synergy Enterprises, Inc.).

Disclaimer

The opinions and positions expressed in this report are the authors’ and do not necessarily represent the opinions and positions of the Institute of Education Sciences or the U.S. Department of Education. Any references within the document to specific education products are illustrative and do not imply endorsement of these products to the exclusion of other products that are not referenced.

U.S. Department of Education
Margaret Spellings, Secretary

Institute of Education Sciences
Grover J. Whitehurst, Director

National Center for Education Research
Lynn Okagaki, Commissioner

July 2008

This report is in the public domain. While permission to reprint this publication is not necessary, the citation should be:

Preschool Curriculum Evaluation Research Consortium (2008). Effects of Preschool Curriculum Programs on School Readiness (NCER 2008-2009). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education. Washington, DC: U.S. Government Printing Office.

This report is available for download on the IES website at http://ncer.ed.gov.

Alternative Formats On request, this publication can be made available in alternative formats, such as Braille, large print, audiotape, or computer diskette. For more information, call the Alternative Format Center at (202) 205-8113.

Preschool Curriculum Evaluation Research Consortium1

Institute of Education Sciences (IES): Caroline Ebanks, Allen Ruby
Mathematica Policy Research (MPR), Inc.: John Love, Sarah Avellar, Steve Glazerman, Susan Sprachman
RTI International: Ina Wallace, Randall Bender, Renate Houts, Jun Liu, Melissa Raspa, Holly Rhodes
Florida State University: Christopher J. Lonigan, Christopher Schatschneider
Purdue University: Douglas Powell, Nancy File (University of Wisconsin-Milwaukee)
Success for All Foundation: Bette Chambers, Robert Slavin
University of California, Berkeley2: Prentice Starkey, Alice Klein, Douglas Clements (University at Buffalo, State University of New York), Julie Sarama (University at Buffalo, State University of New York)
University of California, Berkeley3: Anne Cunningham, Marcia Davidson (RMC Research; University of Maine)
University of Missouri: Kathy R. Thornburg, Wayne Mayfield, Johnetta Morrison, Jacqueline Scott
University of North Carolina at Charlotte: Richard G. Lambert, Martha Abbott-Shim (Quality Counts)
University of New Hampshire: Jeffrey S. Priest, Leigh Zoellick
University of North Florida: Cheryl Fountain, Madelaine Cosgrove, Janice Wood
University of Texas Health Science Center at Houston: Susan Landry, Michael Assel, Susan Gunnewig, Paul Swank
University of Virginia: Laura Justice, Khara Pence, Alice Wiggins
Vanderbilt University: Dale Farran, Mark Lipsey

1 The members of the Preschool Curriculum Evaluation Research (PCER) Consortium include the principal investigators and co-principal investigators from each of the 12 funded research projects, Institute of Education Sciences (IES) staff, and staff from RTI International (RTI) and Mathematica Policy Research (MPR), Inc., the evaluation contractors.
2 This University of California, Berkeley research team partnered with researchers at the University at Buffalo, State University of New York to evaluate the Pre-K Mathematics supplemented with DLM Early Childhood Express Math software curriculum in California and New York.
3 This University of California, Berkeley research team evaluated the Ready, Set, Leap! curriculum in New Jersey.


Acknowledgments

The findings reported here are based on research conducted by the Preschool Curriculum Evaluation Research (PCER) program research teams, the evaluation contractors, and Institute of Education Sciences (IES) staff. This report is a product of the collaborative efforts of the PCER Consortium. The PCER Consortium consists of research teams from each participating grantee site, IES staff, and the evaluation contractors: RTI International (RTI) and Mathematica Policy Research (MPR), Inc. Appendix B of the report was authored by Randall Bender (RTI), Jun Liu (RTI), Ina Wallace (RTI), Melissa Raspa (RTI), and Margaret Burchinal (University of North Carolina at Chapel Hill).

The PCER Consortium would like to acknowledge Dr. Susan J. Kontos, who served as the principal investigator for the Project Approach (Wisconsin) evaluation study from July 2002 to September 2003. Dr. Kontos, one of the country’s leading researchers in early childhood education and care, died September 12, 2003. We also acknowledge and thank Heidi Schweingruber and James Griffin for their contributions to the PCER program during their tenure at IES.

We are grateful to the schools, teachers, parents, and children who participated in our assessments, interviews, and observations. Without their cooperation, none of this research would have been possible. The listed authors represent only a part of the research team involved in this project. We would like to thank the research staff at each grantee site, especially the grantees’ site coordinators, who worked closely with the local preschool programs, school staff, and the contractors’ data collection teams to facilitate the successful collection of child-, parent-, teacher-, and classroom-level data. We appreciate the efforts of the classroom observers and the child assessors who were critical to the successful completion of data collection at each research site. We are also grateful to the many contractor staff members who have worked on data collection and data analysis tasks over the duration of the study.

The mention of trade names, commercial products, or organizations in the description of the projects, or the reporting of study findings, does not imply endorsement by the U.S. Government.


Disclosure of Potential Conflicts of Interest

The PCER Consortium consists of research teams (principal investigators and co-principal investigators from each grantee site), IES staff, and the evaluation contractors, Mathematica Policy Research (MPR), Inc. and RTI International (RTI). Most of the grantee research teams, IES staff, and contractor staff from MPR and RTI have no interests that could be affected by findings from the evaluation of the curricula that are highlighted in this report.

It is important to note that four of the PCER initiative research teams developed curricula that were implemented at their respective research sites. The Success for All Foundation (SFA) developed the Curiosity Corner curriculum, which was implemented in preschool classrooms in Florida, Kansas, and New Jersey. Dr. Christopher Lonigan and his colleagues at Florida State University developed the Literacy Express curriculum, which was implemented in public pre-kindergarten classrooms in Florida. Drs. Prentice Starkey and Alice Klein are the developers of the Pre-K Mathematics curriculum. Drs. Douglas Clements and Julie Sarama are the developers of the DLM Early Childhood Express Math software. The Pre-K Mathematics curriculum and the DLM Early Childhood Express Math software were implemented jointly in Head Start and public pre-kindergarten classrooms in California and New York. Drs. Cheryl Fountain, Madelaine Cosgrove, and Janice Wood are on staff at the Florida Institute of Education, University of North Florida, where the Early Literacy and Learning Model (ELLM) was developed.

These researchers were selected to receive funding for their PCER research projects in a competitive grant application process. Each research team implemented its curriculum and conducted site-specific analyses examining the effects of these curricula on child outcomes. RTI and MPR, the evaluation study contractors, conducted independent evaluations of these and the other treatment curricula that were included in the PCER study. The developers/implementers of these curricula did not conduct the impact analyses that are summarized in this report. Members of the RTI data analysis team completed the impact analyses.

In addition to their role as developers and implementers, Drs. Starkey, Klein, Clements, Sarama, and Lonigan developed measures that were included in the PCER child assessment battery. Drs. Starkey and Klein developed a preschool mathematics assessment, the Child Math Assessment (CMA), that was adapted for use in the PCER evaluation study. The Child Math Assessment-Abbreviated (CMA-A) was added to the assessment battery as a measure of children’s early mathematical knowledge and skills using manipulative materials. The Building Blocks Shape Composition task was also included in the child assessment battery. This task was adapted from the Building Blocks assessment tool, which was developed by Clements, Sarama, and Liu. The Elision subtest from the Preschool Comprehensive Test of Phonological and Print Processing (Pre-CTOPPP) was used in the pre-kindergarten year of the evaluation study. Dr. Christopher Lonigan and his colleagues developed the Pre-CTOPPP Elision subtest. The assessment was not commercially available at the time it was selected for inclusion in the study or during the data collection phase of the study. A revised version of the assessment became commercially available as the Test of Preschool Early Literacy (TOPEL) in January 2007, after the PCER data collection. Dr. Lonigan has a financial interest in the commercial version of this measure.

Dr. Susan Landry and her colleagues at the Center for Improving the Readiness of Children for Learning and Education (CIRCLE) developed one of the study’s classroom observation measures and advised on the selection of the child assessments. CIRCLE staff also trained PCER data collection teams to collect classroom observation data using the Teacher Behavior Rating Scale (TBRS), but CIRCLE staff did not collect the data. CIRCLE staff scored the classroom observation data that were collected using the TBRS measure. Data collection teams from MPR and RTI independently collected all of the data using the measures that are mentioned here. The data analysis team completed descriptive and impact analyses using the scored data. The developers of these measures had no direct role in the completion of the descriptive analyses or the impact analyses that are summarized in this report.


Glossary

ACF—Administration for Children and Families

ANCOVA—Analysis of Covariance

Arnett—Arnett Caregiver Interaction Scale

CMA-A—Child Math Assessment-Abbreviated

control classrooms—Classrooms randomly assigned to the control condition. Classrooms where the prevailing or existing curriculum was in use during the course of the study

control curriculum—The prevailing/existing curriculum used by teachers in the control condition at each site

CTOPP—Comprehensive Test of Phonological Processing (CTOPP), Elision subtest

ECERS-R—Early Childhood Environment Rating Scale-Revised

ECLS-K—Early Childhood Longitudinal Study, Kindergarten cohort

ELLM—Early Literacy and Learning Model

FACES—Family and Child Experiences Survey

FSU—Florida State University

full-day—Preschool program where children spend at least 6 hours per day in the preschool classroom

GED—General Educational Development

grantee—Researcher funded by the Institute of Education Sciences, U.S. Department of Education, to conduct a site-specific study under the Preschool Curriculum Evaluation Research initiative. Grants were awarded to investigators at a single institution or to co-investigators at multiple institutions

half-day—Preschool program where children spend less than 6 hours per day

Head Start center—Preschool that is funded by the U.S. Administration for Children and Families Head Start Bureau

ICC—Intraclass correlation

IES—Institute of Education Sciences, U.S. Department of Education

LBS—Learning Behaviors Scale

MDE—Minimum Detectable Effects

MPR—Mathematica Policy Research, Inc.


Glossary—Continued

MPR evaluation sites—Preschool Curriculum Evaluation Research (PCER) research sites where Mathematica Policy Research, Inc. conducted data collection

PCER—Preschool Curriculum Evaluation Research

PLBS—Preschool Learning Behaviors Scale

PPVT—Peabody Picture Vocabulary Test, Third Edition (PPVT-III)

Pre-CTOPPP—Preschool Comprehensive Test of Phonological and Print Processing, Elision subtest

private pre-kindergarten—Preschool that is funded primarily through tuition or other nongovernmental sources

public pre-kindergarten—Preschool that is part of a public school system or receives substantial public funding

random assignment—Determination by lottery, under supervision of a researcher, whether a study subject will be placed in one experimental group or another

randomized trial—Research study in which subjects are randomly assigned to receive or not receive interventions

research site—Collection of preschool programs/classrooms in a specific geographic location that were recruited by each grantee. Grantees implemented one or more preschool curricula at each research site

RTI—RTI International

RTI evaluation sites—PCER research sites where RTI International conducted data collection

SFA—Success for All

site/grantee site—The geographic location of the research sites

SSRS—Social Skills Rating System

SSRS Problem Behaviors—Social Skills Rating System, Problem Behaviors scale

SSRS Social Skills—Social Skills Rating System, Social Skills scale

TBRS—Teacher Behavior Rating Scale

TERA—Test of Early Reading Ability, Third Edition (TERA-3)

TOLD—Test of Language Development-Primary, Third Edition (TOLD-P:3)

treatment classroom—Classrooms randomly assigned to the treatment condition where an experimental curriculum was implemented and evaluated

Glossary—Continued

treatment curriculum—One of the 14 intervention curricula that were implemented in treatment classrooms

UNF—University of North Florida

WJ—Woodcock-Johnson Achievement Test, 3rd Edition (WJ III)

WJ Applied Problems/WJ Applied Problems test—Woodcock-Johnson Achievement Test, 3rd Edition (WJ III), Applied Problems Test

WJ Letter Word Identification/WJ Word Identification test—Woodcock-Johnson Achievement Test, 3rd Edition (WJ III), Letter Word Identification Test

WJ Spelling/WJ Spelling test—Woodcock-Johnson Achievement Test, 3rd Edition (WJ III), Spelling Test


Contents

Page Preschool Curriculum Evaluation Research Consortium ...... iii Acknowledgments ...... v Disclosure of Potential Conflicts of Interest ...... vii Glossary ...... ix List of Tables ...... xxi List of Figures ...... xxix Executive Summary ...... xxxi Research Questions ...... xxxi Study Design ...... xxxii Sample and Assignment to Condition ...... xxxii Measures ...... xxxv Study Implementation ...... xxxviii Analysis ...... xxxix Results ...... xli Chapter 1. An Overview of the Preschool Curriculum Evaluation Research Initiative ...... 1 Study Background ...... 1 School Readiness and Later Academic Achievement ...... 1 Early Childhood Education ...... 2 The Preschool Curriculum Evaluation Research Initiative ...... 3 Research Questions...... 3 Study Design ...... 4 Intervention and Control Curricula ...... 4 Sample and Random Assignment to Condition ...... 6 Measures ...... 9 Study Implementation ...... 16 Timeline of Implementation ...... 17 Response Rates, Attrition, and Mobility ...... 23 Contamination ...... 24 Fidelity of Implementation ...... 25 Sample Description ...... 25 Analysis ...... 26 Results ...... 30 Model Results ...... 31 Considerations: Efficacy, Power, and Multiple Comparisons ...... 31 Criteria for Findings...... 32 Findings ...... 33 Findings by Outcome ...... 33 Findings by Curriculum ...... 38


Contents—Continued

Page Chapter 2. Bright Beginnings and Creative Curriculum: Vanderbilt University (Tennessee site) ...... 41 Curriculum ...... 41 Bright Beginnings ...... 41 Creative Curriculum ...... 41 Sample ...... 42 Children and Families ...... 42 Teachers ...... 44 Programs/Classrooms ...... 44 Random Assignment ...... 44 Contamination ...... 45 Control Condition ...... 45 Data Collection ...... 45 Attrition ...... 45 Implementation ...... 46 Implementation Fidelity Ratings ...... 46 Impact Analysis Results ...... 46 Bright Beginnings—Child Outcomes ...... 47 Bright Beginnings—Classroom Outcomes ...... 48 Summary of Findings for Bright Beginnings ...... 49 Creative Curriculum—Child Outcomes ...... 51 Creative Curriculum—Classroom Outcomes ...... 52 Summary of Findings for Creative Curriculum ...... 53 Chapter 3. Creative Curriculum: University of North Carolina at Charlotte (North Carolina and Georgia sites) ...... 55 Curriculum ...... 55 Sample ...... 55 Children and Families ...... 55 Teachers ...... 57 Programs/Classrooms ...... 57 Random Assignment ...... 58 Contamination ...... 59 Control Condition ...... 59 Data Collection ...... 59 Attrition ...... 59 Implementation ...... 59 Implementation Fidelity Ratings ...... 60 Impact Analysis Results ...... 60 Creative Curriculum—Child Outcomes ...... 60 Creative Curriculum—Classroom Outcomes ...... 62 Summary of Findings for Creative Curriculum (North Carolina and Georgia) ...... 63


Contents—Continued

Page Chapter 4. Creative Curriculum with Ladders to Literacy: University of New Hampshire (New Hampshire site) ...... 65 Curriculum ...... 65 Sample ...... 65 Children and Families ...... 65 Teachers ...... 67 Programs/Classrooms ...... 67 Random Assignment ...... 67 Contamination ...... 68 Control Condition ...... 68 Data Collection ...... 68 Attrition ...... 69 Implementation ...... 69 Implementation Fidelity Ratings ...... 69 Impact Analysis Results ...... 70 Creative Curriculum with Ladders to Literacy—Child Outcomes ...... 70 Creative Curriculum with Ladders to Literacy—Classroom Outcomes ...... 71 Summary of Findings for Creative Curriculum with Ladders to Literacy ...... 72 Chapter 5. Curiosity Corner: Success for All Foundation (SFA sites: Florida, Kansas, and New Jersey) ...... 75 Curriculum ...... 75 Sample ...... 75 Children and Families ...... 75 Teachers ...... 76 Programs/Classrooms ...... 76 Random Assignment ...... 76 Contamination ...... 78 Control Condition ...... 78 Data Collection ...... 78 Attrition ...... 79 Implementation ...... 79 Implementation Fidelity Ratings ...... 79 Impact Analysis Results ...... 79 Curiosity Corner—Child Outcomes ...... 79 Curiosity Corner—Classroom Outcomes ...... 81 Summary of Findings for Curiosity Corner ...... 82


Contents—Continued

Page Chapter 6. Doors to Discovery and Let’s Begin with the Letter People: University of Texas Health Science Center at Houston (Texas site) ...... 85 Curriculum ...... 85 Doors to Discovery ...... 85 Let’s Begin with the Letter People ...... 85 Sample ...... 85 Children and Families ...... 86 Teachers ...... 87 Programs/Classrooms ...... 87 Random Assignment ...... 88 Contamination ...... 89 Control Condition ...... 89 Data Collection ...... 89 Attrition ...... 89 Implementation ...... 90 Implementation Fidelity Ratings ...... 90 Impact Analysis Results ...... 90 Doors to Discovery—Child Outcomes ...... 90 Doors to Discovery—Classroom Outcomes ...... 92 Summary of Findings for Doors to Discovery ...... 93 Let’s Begin with the Letter People—Child Outcomes ...... 95 Let’s Begin with the Letter People—Classroom Outcomes ...... 96 Summary of Findings for Let’s Begin with the Letter People ...... 97 Chapter 7. Early Literacy and Learning Model (ELLM): University of North Florida (Florida-UNF site) ...... 99 Curriculum ...... 99 Sample ...... 99 Children and Families ...... 100 Teachers ...... 100 Programs/Classrooms ...... 101 Random Assignment ...... 101 Contamination ...... 103 Control Condition ...... 103 Data Collection ...... 103 Attrition ...... 103 Implementation ...... 104 Site-Specific Fidelity Ratings ...... 104 Implementation Fidelity Ratings ...... 104 Impact Analysis Results ...... 105 Early Literacy and Learning Model—Child Outcomes ...... 105 Early Literacy and Learning Model—Classroom Outcomes ...... 106 Summary of Findings for Early Literacy and Learning Model ...... 107


Contents—Continued

Page Chapter 8. Language-Focused Curriculum: University of Virginia (Virginia site) ...... 109 Curriculum ...... 109 Sample ...... 109 Children and Families ...... 109 Teachers ...... 110 Programs/Classrooms ...... 110 Random Assignment ...... 110 Contamination ...... 112 Control Condition ...... 112 Data Collection ...... 112 Attrition ...... 112 Implementation ...... 113 Implementation Fidelity Ratings ...... 113 Impact Analysis Results ...... 113 Language-Focused Curriculum—Child Outcomes ...... 113 Language-Focused Curriculum—Classroom Outcomes ...... 115 Summary of Findings for Language-Focused Curriculum ...... 115 Chapter 9. Literacy Express and DLM Early Childhood Express supplemented with Open Court Reading Pre-K: Florida State University (Florida-FSU site) ...... 117 Curriculum ...... 117 Literacy Express ...... 117 DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...... 117 Sample ...... 118 Children and Families ...... 118 Teachers ...... 118 Programs/Classrooms ...... 118 Random Assignment ...... 120 Contamination ...... 121 Control Condition ...... 121 Data Collection ...... 121 Attrition ...... 121 Implementation ...... 121 Implementation Fidelity Ratings ...... 122 Impact Analysis Results ...... 123 Literacy Express—Child Outcomes ...... 123 Literacy Express—Classroom Outcomes ...... 124 Summary of Findings for Literacy Express ...... 125 DLM Early Childhood Express supplemented with Open Court Reading Pre-K—Child Outcomes ...... 127 DLM Early Childhood Express supplemented with Open Court Reading Pre-K—Classroom Outcomes ...... 128 Summary of Findings for DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...130


Contents—Continued

Page Chapter 10. Pre-K Mathematics supplemented with DLM Early Childhood Express Math software: University of California, Berkeley/University at Buffalo, State University of New York (California/New York sites) ...... 131 Curriculum ...... 131 Sample ...... 131 Children and Families ...... 132 Teachers ...... 132 Programs/Classrooms ...... 132 Random Assignment ...... 135 Contamination ...... 136 Control Condition ...... 136 Data Collection ...... 136 Attrition ...... 137 Implementation ...... 137 Implementation Fidelity Ratings ...... 138 Impact Analysis Results ...... 138 Pre-K Mathematics supplemented with DLM Early Childhood Express Math software —Child Outcomes ...... 138 Pre-K Mathematics supplemented with DLM Early Childhood Express Math software —Classroom Outcomes ...... 140 Summary of Findings for Pre-K Mathematics supplemented with DLM Early Childhood Express Math software ...... 141 Chapter 11. Project Approach: Purdue University/University of Wisconsin (Wisconsin site) ...... 143 Curriculum ...... 143 Sample ...... 143 Children and Families ...... 143 Teachers ...... 145 Programs/Classrooms ...... 145 Random Assignment ...... 146 Contamination ...... 146 Control Condition ...... 146 Data Collection ...... 146 Attrition ...... 146 Implementation ...... 146 Implementation Fidelity Ratings ...... 148 Impact Analysis Results ...... 148 Project Approach—Child Outcomes ...... 148 Project Approach—Classroom Outcomes ...... 149 Summary of Findings for Project Approach ...... 150


Contents—Continued

Page Chapter 12. Project Construct: University of Missouri-Columbia (Missouri site) ...... 153 Curriculum ...... 153 Sample ...... 153 Children and Families ...... 153 Teachers ...... 154 Programs/Classrooms ...... 154 Random Assignment ...... 154 Contamination ...... 156 Control Condition ...... 156 Data Collection ...... 156 Attrition ...... 156 Implementation ...... 157 Implementation Fidelity Ratings ...... 157 Impact Analysis Results ...... 158 Project Construct—Child Outcomes ...... 158 Project Construct—Classroom Outcomes ...... 159 Summary of Findings for Project Construct ...... 160 Chapter 13. Ready, Set, Leap! University of California, Berkeley (New Jersey site) ...... 163 Curriculum ...... 163 Sample ...... 163 Children and Families ...... 164 Teachers ...... 165 Programs/Classrooms ...... 165 Random Assignment ...... 166 Contamination ...... 167 Control Condition ...... 167 Data Collection ...... 167 Attrition ...... 167 Implementation ...... 167 Implementation Fidelity Ratings ...... 167 Impact Analysis Results ...... 167 Ready, Set, Leap!—Child Outcomes ...... 168 Ready, Set, Leap!—Classroom Outcomes ...... 169 Summary of Findings for Ready, Set, Leap! ...... 170 References ...... 173 Appendix A: Secondary Analyses Results ...... A-1 Appendix B: Data Analysis Approach and Statistical Model ...... B-1 Appendix C: Unadjusted Mean Scores ...... C-1 Appendix D: Covariate Adjusted Mean Differences and Standard Errors ...... D-1


List of Tables

Table Page Executive Summary A The intervention curricula ...... xxxiii B Units of random assignment for evaluation of each curriculum ...... xxxiv C Outcomes and measures ...... xxxvi D Model used with each measure ...... xl E Effect sizes for student-level measures: Pre-kindergarten ...... xlii F Effect sizes for student-level measures: Kindergarten ...... xliii G Effect sizes for classroom-level measures: Pre-kindergarten ...... xliv H Findings by student-level outcomes ...... xlvii I Findings by classroom-level outcomes ...... xlviii

Chapter 1. An Overview of the Preschool Curriculum Evaluation Research Initiative 1.1 The intervention curricula ...... 5 1.2 The intervention and control curricula ...... 6 1.3 Units of random assignment for evaluation of each curriculum ...... 7 1.4 Dispersion of the preschool study sample into kindergarten schools and classrooms ...... 9 1.5 Outcomes and measures ...... 11 1.6 Standardized mean and reliability for outcome measures ...... 12 1.7 Training and support of treatment teachers ...... 20 1.8 Inter-pair agreement on classroom observations among research teams working with RTI International (RTI), fall 2003 and spring 2004 ...... 22 1.9 Response rates ...... 23 1.10 Characteristics of children and parents ...... 26 1.11 Characteristics of preschool and kindergarten teachers ...... 27 1.12 Characteristics of ...... 27 1.13 Characteristics of ...... 28 1.14 Model used with each outcome measure ...... 29 1.15 Effect sizes for student-level measures: Pre-kindergarten ...... 34 1.16 Effect sizes for student-level measures: Kindergarten ...... 35 1.17 Effect sizes for classroom-level measures: Pre-kindergarten ...... 36 1.18 Achieved minimum detectable effects on the reading, language, mathematics, and behavior composites of measures ...... 37 1.19 Findings by student-level outcomes ...... 37 1.20 Findings by classroom-level outcomes, pre-kindergarten year only ...... 38

Chapter 2. Bright Beginnings and Creative Curriculum: Vanderbilt University (Tennessee site) 2.1 Child demographic characteristics for Bright Beginnings and Creative Curriculum ...... 42 2.2 Primary caregiver demographic characteristics for Bright Beginnings and Creative Curriculum ...... 43 2.3 Preschool teacher characteristics for Bright Beginnings and Creative Curriculum ...... 44 2.4 Effect sizes for Bright Beginnings ...... 50 2.5 Effect sizes for Creative Curriculum: Tennessee ...... 54


List of Tables—Continued

Table Page Chapter 3. Creative Curriculum: University of North Carolina at Charlotte (North Carolina and Georgia sites) 3.1 Child demographic characteristics for Creative Curriculum: North Carolina and Georgia...... 56 3.2 Primary caregiver demographic characteristics for Creative Curriculum: North Carolina and Georgia ...... 57 3.3 Preschool teacher characteristics for Creative Curriculum: North Carolina and Georgia ...... 58 3.4 Effect sizes for Creative Curriculum: North Carolina and Georgia ...... 64

Chapter 4. Creative Curriculum with Ladders to Literacy: University of New Hampshire (New Hampshire site) 4.1 Child demographic characteristics for Creative Curriculum with Ladders to Literacy ...... 66 4.2 Primary caregiver demographic characteristics for Creative Curriculum with Ladders to Literacy ...... 66 4.3 Preschool teacher characteristics for Creative Curriculum with Ladders to Literacy ...... 68 4.4 Effect sizes for Creative Curriculum with Ladders to Literacy ...... 73

Chapter 5. Curiosity Corner: Success for All Foundation (SFA sites: Florida, Kansas, and New Jersey) 5.1 Child demographic characteristics for Curiosity Corner...... 76 5.2 Primary caregiver demographic characteristics for Curiosity Corner ...... 77 5.3 Preschool teacher characteristics for Curiosity Corner ...... 78 5.4 Effect sizes for Curiosity Corner ...... 83

Chapter 6. Doors to Discovery and Let’s Begin with the Letter People: University of Texas Health Science Center at Houston (Texas site) 6.1 Child demographic characteristics for Doors to Discovery and Let’s Begin with the Letter People ...... 86 6.2 Primary caregiver demographic characteristics for Doors to Discovery and Let’s Begin with the Letter People ...... 87 6.3 Preschool teacher characteristics for Doors to Discovery and Let’s Begin with the Letter People ...... 88 6.4 Effect sizes for Doors to Discovery ...... 94 6.5 Effect sizes for Let’s Begin with the Letter People ...... 98

Chapter 7. Early Literacy and Learning Model (ELLM): University of North Florida (Florida-UNF site) 7.1 Child demographic characteristics for Early Literacy and Learning Model ...... 100 7.2 Primary caregiver demographic characteristics for Early Literacy and Learning Model ...... 101 7.3 Preschool teacher characteristics for Early Literacy and Learning Model ...... 102 7.4 Effect sizes for Early Literacy and Learning Model ...... 108

Chapter 8. Language-Focused Curriculum: University of Virginia (Virginia site) 8.1 Child demographic characteristics for Language-Focused Curriculum ...... 110 8.2 Primary caregiver demographic characteristics for Language-Focused Curriculum ...... 111 8.3 Preschool teacher characteristics for Language-Focused Curriculum...... 112 8.4 Effect sizes for Language-Focused Curriculum ...... 116


List of Tables—Continued

Table Page Chapter 9. Literacy Express and DLM Early Childhood Express supplemented with Open Court Reading Pre-K: Florida State University (Florida-FSU site) 9.1 Child demographic characteristics for Literacy Express and DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...... 119 9.2 Primary caregiver demographic characteristics for Literacy Express and DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...... 119 9.3 Preschool teacher characteristics for Literacy Express and DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...... 120 9.4 Effect sizes for Literacy Express ...... 126 9.5 Effect sizes for DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...... 130

Chapter 10. Pre-K Mathematics supplemented with DLM Early Childhood Express Math software: University of California, Berkeley/University at Buffalo, State University of New York (California/New York sites) 10.1 Child demographic characteristics for Pre-K Mathematics supplemented with DLM Early Childhood Express Math software ...... 133 10.2 Primary caregiver demographic characteristics for Pre-K Mathematics supplemented with DLM Early Childhood Express Math software ...... 134 10.3 Preschool teacher characteristics for Pre-K Mathematics supplemented with DLM Early Childhood Express Math software ...... 135 10.4 Effect sizes for Pre-K Mathematics supplemented with DLM Early Childhood Express Math software ...... 142

Chapter 11. Project Approach: Purdue University/University of Wisconsin (Wisconsin site) 11.1 Child demographic characteristics for Project Approach ...... 144 11.2 Primary caregiver demographic characteristics for Project Approach ...... 144 11.3 Preschool teacher characteristics for Project Approach ...... 145 11.4 Effect sizes for Project Approach ...... 151

Chapter 12. Project Construct: University of Missouri-Columbia (Missouri site) 12.1 Child demographic characteristics for Project Construct ...... 154 12.2 Primary caregiver demographic characteristics for Project Construct ...... 155 12.3 Preschool teacher characteristics for Project Construct ...... 156 12.4 Effect sizes for Project Construct ...... 161

Chapter 13. Ready, Set, Leap! University of California, Berkeley (New Jersey site) 13.1 Child demographic characteristics for Ready, Set, Leap! ...... 164 13.2 Primary caregiver demographic characteristics for Ready, Set, Leap! ...... 165 13.3 Preschool teacher characteristics for Ready, Set, Leap! ...... 166 13.4 Effect sizes for Ready, Set, Leap! ...... 171

Appendix A. Secondary Analyses Results A-1 Possible early treatment effects and non-equivalence at baseline ...... A-4 A-2 Secondary analysis: Outcomes, measures, models, and grades analyzed ...... A-5 A-3 Correlation matrix for student-level measures ...... A-8 A-4 Criterion used to determine curricula’s impact on a measure and on an outcome ...... A-9


List of Tables—Continued

Table Page Appendix A. Secondary Analyses Results—Continued A-5 Findings on student-level outcomes: Main and secondary analyses ...... A-11 A-6 Findings on classroom-level outcomes: Main and secondary analyses ...... A-12 A-7 Secondary analysis results for Bright Beginnings ...... A-18 A-8 Secondary analysis results for Creative Curriculum: Tennessee ...... A-24 A-9 Secondary analysis results for Creative Curriculum: North Carolina and Georgia ...... A-30 A-10 Secondary analysis results for Creative Curriculum with Ladders to Literacy ...... A-36 A-11 Secondary analysis results for Curiosity Corner ...... A-42 A-12 Secondary analysis results for Doors to Discovery ...... A-48 A-13 Secondary analysis results for Let’s Begin with the Letter People ...... A-54 A-14 Secondary analysis results for Early Literacy and Learning Model ...... A-60 A-15 Secondary analysis results for Language-Focused Curriculum ...... A-65 A-16 Secondary analysis results for Literacy Express ...... A-71 A-17 Secondary analysis results for DLM Early Childhood Express supplemented with Open Court Reading Pre-K ...... A-78 A-18 Secondary analysis results for Pre-K Mathematics supplemented with DLM Early Childhood Express Math software ...... A-84 A-19 Secondary analysis results for Project Approach ...... A-90 A-20 Secondary analysis results for Project Construct ...... A-96 A-21 Secondary analysis results for Ready, Set, Leap! ...... A-102

Appendix B. Data Analysis Approach and Statistical Model B-1 Units of random assignment for evaluation of each curriculum ...... B-4 B-2 Variables in analysis ...... B-6 B-3 Response rates and attrition ...... B-9 B-4 Main analysis: Model used with each outcome measure ...... B-12 B-5 Secondary analysis: Models and grades for each outcome measure ...... B-13 B-6 Average estimated population standard deviation by study outcome and time points ...... B-28 B-7 Average estimated population standard deviation by study outcome and group assignment ...... B-29 B-8 Pooled standard deviation example ...... B-29 B-9 Estimated pooled population standard deviation using unconditional standard deviations and standard deviations from repeated measures analyses ...... B-31 B-10 Pooled standard deviation details: Pooling for outcomes modeled with the simple repeated measures, the repeated measures spline models, the pre-kindergarten spring analysis of covariance (ANCOVA) models (except Teacher Behavior Rating Scale [TBRS]), and the kindergarten spring ANCOVA models where kindergarten data were comparable to pre-kindergarten ...... B-32 B-11 Pooled standard deviation details: Pooling for Teacher Behavior Rating Scale (TBRS) outcomes modeled with the pre-kindergarten spring analysis of covariance (ANCOVA) models ...... B-33 B-12 Pooled standard deviation details: Pooling for kindergarten spring outcomes (SSRS, Pre- CTOPPP/CTOPP, PLBS/LBS) modeled with analysis of covariance (ANCOVA) models ...... B-34 B-13 Pre-kindergarten and kindergarten classroom clusters of children, maximum and minimum size ...... B-36 B-14 Specific covariance structures found to best fit the data ...... B-38


List of Tables—Continued

Table Page Appendix B. Data Analysis Approach and Statistical Model—Continued B-15 Specific covariance structure definitions ...... B-38 B-16 Significant treatment by covariate interactions from check on homogeneity of regression assumption...... B-40

Appendix C. Unadjusted Mean Scores C-1a Unadjusted mean scores of child-level outcome measures, Bright Beginnings: Tennessee ...... C-3 C-1b Unadjusted mean scores of classroom-level outcome measures, Bright Beginnings: Tennessee ...... C-4 C-2a Unadjusted mean scores of child-level outcome measures, Creative Curriculum: Tennessee ...... C-5 C-2b Unadjusted mean scores of classroom-level outcome measures, Creative Curriculum: Tennessee ...... C-6 C-3a Unadjusted mean scores of child-level outcome measures, Creative Curriculum: North Carolina ad Georgia ...... C-7 C-3b Unadjusted mean scores of classroom-level outcome measures, Creative Curriculum: North Carolina and Georgia ...... C-8 C-4a Unadjusted mean scores of child-level outcome measures, Creative Curriculum with Ladders to Literacy: New Hampshire ...... C-9 C-4b Unadjusted mean scores of classroom-level outcome measures, Creative Curriculum with Ladders to Literacy: New Hampshire ...... C-10 C-5a Unadjusted mean scores of child-level outcome measures, Curiosity Corner: Florida, Kansas, and New Jersey ...... C-11 C-5b Unadjusted mean scores of classroom-level outcome measures, Curiosity Corner: Florida, Kansas, and New Jersey ...... C-12 C-6a Unadjusted mean scores of child-level outcome measures, Doors to Discovery: Texas ...... C-13 C-6b Unadjusted mean scores of classroom-level outcome measures, Doors to Discovery: Texas ...... C-14 C-7a Unadjusted mean scores of child-level outcome measures, Let’s Begin with the Letter People: Texas ...... C-15 C-7b Unadjusted mean scores of classroom-level outcome measures, Let’s Begin with the Letter People: Texas ...... C-16 C-8a Unadjusted mean scores of child-level outcome measures, Early Literacy and Learning Model: Florida—University of North Florida ...... C-17 C-8b Unadjusted mean scores of classroom-level outcome measures, Early Literacy and Learning Model: Florida—University of North Florida ...... C-18 C-9a Unadjusted mean scores of child-level outcome measures, Language-Focused Curriculum: Virginia ...... C-19 C-9b Unadjusted mean scores of classroom-level outcome measures, Language-Focused Curriculum: Virginia ...... C-20 C-10a Unadjusted mean scores of child-level outcome measures, Literacy Express: Florida—Florida State University ...... C-21 C-10b Unadjusted mean scores of classroom-level outcome measures, Literacy Express: Florida —Florida State University ...... C-22 C-11a Unadjusted mean scores of child-level outcome measures, DLM Early Childhood Express supplemented with Open Court Reading Pre-K: Florida—Florida State University ...... C-23 C-11b Unadjusted mean scores of classroom-level outcome measures, DLM Early Childhood Express supplemented with Open Court Reading Pre-K: Florida—Florida State University ...... C-24


List of Tables—Continued

Table Page Appendix C. Unadjusted Mean Scores—Continued C-12a Unadjusted mean scores of child-level outcome measures, Pre-K Mathematics supplemented with DLM Express Math software: California and New York ...... C-25 C-12b Unadjusted mean scores of classroom-level outcome measures, Pre-K Mathematics supplemented with DLM Express Math software: California and New York ...... C-26 C-13a Unadjusted mean scores of child-level outcome measures, Project Approach: Wisconsin ...... C-27 C-13b Unadjusted mean scores of classroom-level outcome measures, Project Approach: Wisconsin ...... C-28 C-14a Unadjusted mean scores of child-level outcome measures, Project Construct: Missouri ...... C-29 C-14b Unadjusted mean scores of classroom-level outcome measures, Project Construct: Missouri ...... C-30 C-15a Unadjusted mean scores of child-level outcome measures, Ready, Set, Leap!: New Jersey ...... C-31 C-15b Unadjusted mean scores of classroom-level outcome measures, Ready, Set, Leap!: New Jersey ...... C-32

Appendix D. Covariate Adjusted Mean Differences and Standard Errors D-1a Covariate adjusted mean differences and standard errors of child-level outcome measures, Bright Beginnings: Tennessee ...... D-3 D-1b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Bright Beginnings: Tennessee ...... D-4 D-2a Covariate adjusted mean differences and standard errors of child-level outcome measures, Creative Curriculum: Tennessee ...... D-5 D-2b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Creative Curriculum: Tennessee ...... D-6 D-3a Covariate adjusted mean differences and standard errors of child-level outcome measures, Creative Curriculum: North Carolina and Georgia ...... D-7 D-3b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Creative Curriculum: North Carolina and Georgia ...... D-8 D-4a Covariate adjusted mean differences and standard errors of child-level outcome measures, Creative Curriculum with Ladders to Literacy: New Hampshire ...... D-9 D-4b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Creative Curriculum with Ladders to Literacy: New Hampshire ...... D-10 D-5a Covariate adjusted mean differences and standard errors of child-level outcome measures, Curiosity Corner: Florida, Kansas, and New Jersey ...... D-11 D-5b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Curiosity Corner: Florida, Kansas, and New Jersey ...... D-12 D-6a Covariate adjusted mean differences and standard errors of child-level outcome measures, Doors to Discovery: Texas ...... D-13 D-6b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Doors to Discovery: Texas ...... D-14 D-7a Covariate adjusted mean differences and standard errors of child-level outcome measures, Let’s Begin with the Letter People: Texas ...... D-15 D-7b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Let’s Begin with the Letter People: Texas ...... D-16 D-8a Covariate adjusted mean differences and standard errors of child-level outcome measures, Early Literacy and Learning Model: Florida—University of North Florida ...... D-17 D-8b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Early Literacy and Learning Model: Florida—University of North Florida ...... D-18


List of Tables—Continued

Table Page Appendix D. Covariate Adjusted Mean Differences and Standard Errors—Continued D-9a Covariate adjusted mean differences and standard errors of child-level outcome measures, Language-Focused Curriculum: Virginia ...... D-19 D-9b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Language-Focused Curriculum: Virginia ...... D-20 D-10a Covariate adjusted mean differences and standard errors of child-level outcome measures, Literacy Express: Florida—Florida State University ...... D-21 D-10b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Literacy Express: Florida—Florida State University ...... D-22 D-11a Covariate adjusted mean differences and standard errors of child-level outcome measures, DLM Early Childhood Express supplemented with Open Court Reading Pre-K: Florida—Florida State University ...... D-23 D-11b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, DLM Early Childhood Express supplemented with Open Court Reading Pre-K: Florida—Florida State University ...... D-24 D-12a Covariate adjusted mean differences and standard errors of child-level outcome measures, Pre-K Mathematics supplemented with DLM Early Childhood Express Math software: California and New York ...... D-25 D-12b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Pre-K Mathematics supplemented with DLM Early Childhood Express Math software: California and New York ...... D-26 D-13a Covariate adjusted mean differences and standard errors of child-level outcome measures, Project Approach: Wisconsin ...... D-27 D-13b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Project Approach: Wisconsin ...... D-28 D-14a Covariate adjusted mean differences and standard errors of child-level outcome measures, Project Construct: Missouri ...... D-29 D-14b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Project Construct: Missouri ...... D-30 D-15a Covariate adjusted mean differences and standard errors of child-level outcome measures, Ready, Set, Leap!: New Jersey ...... D-31 D-15b Covariate adjusted mean differences and standard errors of classroom-level outcome measures, Ready, Set, Leap!: New Jersey ...... D-32


List of Figures

Figure Page 1.1 Timeline for teams working with RTI International ...... 18 1.2 Timeline for teams working with Mathematica Policy Research, Inc...... 19 B-1 Repeated measures spline model ...... B-15 B-2 Simple repeated measures model ...... B-15 B-3 Pre-kindergarten (Pre-K) and kindergarten (K) analysis of covariance (ANCOVA) models ...... B-17


Executive Summary

A variety of preschool curricula is available and in widespread use; however, there is little evidence from rigorous evaluations regarding the effects of these curricula on children’s school readiness. This gap matters because early childhood center-based programs have been a major, and sometimes the sole, component of a number of federal and state efforts to improve young at-risk children’s school readiness (e.g., Head Start, Even Start, public pre-kindergarten). In 2005, nearly half (47%) of all 3- to 5-year-old children from low-income families were enrolled in either part-day or full-day early childhood programs (U.S. Department of Education 2006).

In 2002, the Institute of Education Sciences (IES) began the Preschool Curriculum Evaluation Research (PCER) initiative to conduct rigorous efficacy evaluations of available preschool curricula. Twelve research teams implemented one or two curricula in preschool settings serving predominantly low-income children under an experimental design. For each team, preschools or classrooms were randomly assigned to the intervention curricula or control curricula, and the children were followed from pre-kindergarten through kindergarten. IES contracted with RTI International (RTI) and Mathematica Policy Research (MPR) to evaluate the impact of each of the 14 curricula implemented, using a common set of measures, with the cohort of children who began preschool in the summer and fall of 2003.

This report provides the individual results for each curriculum from the evaluations by RTI and MPR. Chapter 1 describes the PCER initiative and details the common elements of the evaluations, including the experimental design, implementation, analysis, results, and findings. Chapters 2-13 provide greater detail on the individual evaluations of the curricula implemented by each research team, including information on the curricula, the demographics of the site-specific samples, assignment, fidelity of implementation, and results. Appendix A presents results from a secondary analysis of the data. Appendix B provides greater detail regarding the data analyses conducted. Appendixes C and D provide additional information regarding the outcome measures.

Research Questions

The PCER initiative focused on the impact of the intervention curricula on students’ reading and pre-reading, phonological awareness, early language, early mathematics knowledge, and behavior (including social skills and problem behaviors) at the end of pre-kindergarten and kindergarten. These domains of knowledge and skills are predictive of academic success in the early years of elementary school (Downer and Pianta 2006; Miles and Stipek 2006). As a result, the research questions for the initiative primarily concern student outcomes and also include classroom outcomes due to their potentially mediating or moderating roles. The research questions are:

1. What is the impact of each of the 14 preschool curricula on preschool students’ early reading skills, phonological awareness, language development, early mathematical knowledge, and behavior?

2. What is the impact of each of the 14 preschool curricula on these outcomes for students at the end of kindergarten?

3. What is the impact of each of the 14 preschool curricula on preschool classroom quality, teacher-child interaction, and instructional practices?


Study Design

Under the PCER initiative, 12 research teams received peer-reviewed grants to implement one or two preschool curricula of their choosing under an experimental design. For each team’s evaluation, preschool classrooms or programs were randomly assigned to use the treatment or control curricula. The treatment curricula included sufficient standardized training procedures and curriculum materials to be implemented in typical early childhood education settings. RTI and MPR evaluated the impact of each curriculum using a common set of measures. The curricula, corresponding research team, research site, and evaluator are listed in table A.

Three teams each implemented two curricula. Two teams implemented the same curriculum, Creative Curriculum. Four teams had originally developed the curricula that they implemented (Curiosity Corner; Literacy Express; Pre-K Mathematics supplemented with DLM Early Childhood Express Math software; and Early Literacy and Learning Model [ELLM]). RTI evaluated eight curricula implemented by seven teams (including one curriculum that was evaluated by two teams), while MPR evaluated six curricula implemented by five teams. In sum, 14 curricula were evaluated, one of them twice.

The 14 curricula were evaluated in comparison to the local control condition, which, in general, was the local curriculum-as-usual. As a result, multiple curricula were used across the control sites and within some of the individual evaluations. These included teacher-developed nonspecific curricula with a focus on basic school readiness, district-developed curricula, and published curricula (some of which were implemented by other research teams). The control curricula are identified in the section on Findings by Curriculum at the end of the Executive Summary. Because different control curricula were used among the evaluations, this report does not make cross-intervention comparisons.

Rather than one overall evaluation, the PCER study contains individual evaluations for each curriculum, for three reasons. First, each research team worked independently. Second, the selection of the intervention and the random assignment occurred at the team level. Third, different control curricula were used with each intervention curriculum.

Sample and Assignment to Condition

Preschool programs taking part in the evaluation of the curricula included Head Start centers, private child care centers, and public pre-kindergarten programs in urban, rural, and suburban locations. Each research team recruited interested local preschool programs. IES had set a funding priority on grant applications that addressed preschools serving children from low-income families, with the result that 88 percent of the preschools included were either Head Start centers or public pre-kindergarten programs, and half of the children’s primary caregivers had a high school education or less.

Programs agreed to the random assignment (by program or classroom) to a treatment curriculum or to local control conditions. For each evaluated curriculum, table B indicates whether pre-kindergarten programs or classrooms were randomly assigned to treatment or control conditions, the number assigned to each, and the number of treatment and control students included in each evaluation. Three teams (implementing four curricula) randomly assigned pre-kindergarten programs, and the other nine teams randomly assigned classrooms. Three teams compared two curricula against a single set of control classrooms or programs. All but two teams (Purdue University and University of New Hampshire) used block random assignment.
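To make the assignment mechanism concrete, the sketch below shows one simple way block random assignment of classrooms could be carried out, treating each participating program as a block. It is an illustrative sketch only; the program names, classroom identifiers, seed, and helper function are hypothetical and are not drawn from any PCER team's actual procedure.

    import random

    def block_random_assignment(blocks, seed=2003):
        """Shuffle the classrooms within each block (e.g., each preschool program)
        and split them into treatment and control groups of near-equal size."""
        rng = random.Random(seed)
        assignment = {}
        for block, classrooms in blocks.items():
            shuffled = list(classrooms)
            rng.shuffle(shuffled)
            half = len(shuffled) // 2
            assignment[block] = {
                "treatment": shuffled[:half],
                "control": shuffled[half:],
            }
        return assignment

    # Hypothetical example: two programs (blocks), each contributing classrooms.
    example_blocks = {
        "Program A": ["A-1", "A-2", "A-3", "A-4"],
        "Program B": ["B-1", "B-2"],
    }
    print(block_random_assignment(example_blocks))

Blocking in this way ensures that each program contributes classrooms to both conditions, so treatment-control contrasts are made within, rather than across, programs.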


Table A.—The intervention curricula

Curriculum and publisher | Research team | Research site | Evaluator

Bright Beginnings (Charlotte-Mecklenburg Schools 2001) | Vanderbilt University | Tennessee | RTI
Creative Curriculum (Teaching Strategies, Inc. 2002) | Vanderbilt University | Tennessee | RTI
Creative Curriculum (Teaching Strategies, Inc. 2002) | University of North Carolina at Charlotte | North Carolina and Georgia | RTI
Creative Curriculum with Ladders to Literacy (Teaching Strategies, Inc. 2002; Paul H. Brookes Publishing Company 1998) | University of New Hampshire | New Hampshire | RTI
Curiosity Corner (Success for All Foundation, Inc. 2003) | Success for All Foundation | Florida, Kansas, New Jersey | MPR
DLM Early Childhood Express supplemented with Open Court Reading Pre-K (SRA/McGraw-Hill 2003) | Florida State University | Florida | MPR
Doors to Discovery (Wright Group/McGraw-Hill 2001) | University of Texas Health Science Center at Houston | Texas | RTI
Early Literacy and Learning Model (Florida Institute of Education and the University of North Florida 2002) | University of North Florida | Florida | RTI
Language-Focused Curriculum (Paul H. Brookes Publishing Company 1995) | University of Virginia | Virginia | MPR
Let’s Begin with the Letter People (Abrams & Company 2000) | University of Texas Health Science Center at Houston | Texas | RTI
Literacy Express (Author: Lonigan and Farver 2002, unpublished) | Florida State University | Florida | MPR
Pre-K Mathematics supplemented with DLM Early Childhood Express Math software (Scott Foresman—Pre-K Mathematics 2002; SRA/McGraw-Hill—DLM Early Childhood Express Math software 2003) | University of California, Berkeley and University at Buffalo, State University of New York | California and New York | RTI
Project Approach (Ablex 1989) | Purdue University and University of Wisconsin-Milwaukee | Wisconsin | RTI
Project Construct (Missouri Department of Elementary and Secondary Education 1992) | University of Missouri-Columbia | Missouri | MPR
Ready, Set, Leap! (LeapFrog School House 2003) | University of California, Berkeley | New Jersey | MPR

NOTE: RTI: RTI International; MPR: Mathematica Policy Research, Inc.
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.


Table B.—Units of random assignment for evaluation of each curriculum

Research team | Curriculum | Treatment sample | Control sample | Students

Vanderbilt University | Bright Beginnings | 7 classrooms | 7 classrooms (shared) | T: 103; C: 105
Vanderbilt University | Creative Curriculum | 7 classrooms | 7 classrooms (shared) | T: 101; C: 105
University of North Carolina at Charlotte | Creative Curriculum | 9 classrooms | 9 classrooms | T: 97; C: 97
University of New Hampshire | Creative Curriculum with Ladders to Literacy | 7 classrooms | 7 classrooms | T: 62; C: 61
Success for All Foundation | Curiosity Corner | 10 Pre-K programs | 8 Pre-K programs | T: 105; C: 110
University of Texas Health Science Center at Houston | Doors to Discovery | 14 classrooms | 15 classrooms (shared) | T: 101; C: 96
University of Texas Health Science Center at Houston | Let’s Begin with the Letter People | 15 classrooms | 15 classrooms (shared) | T: 100; C: 96
University of North Florida | Early Literacy and Learning Model | 14 classrooms1 | 14 classrooms1 | T: 137; C: 107
University of Virginia | Language-Focused Curriculum | 7 classrooms | 7 classrooms | T: 97; C: 98
Florida State University | DLM Early Childhood Express with Open Court Reading Pre-K | 5 Pre-K programs | 6 Pre-K programs (shared) | T: 101; C: 97
Florida State University | Literacy Express | 6 Pre-K programs | 6 Pre-K programs (shared) | T: 99; C: 97
University of California, Berkeley and University at Buffalo, State University of New York | Pre-K Mathematics with DLM Early Childhood Express Math software | 20 classrooms | 20 classrooms | T: 159; C: 157
Purdue University and University of Wisconsin-Milwaukee | Project Approach | 7 classrooms | 6 classrooms | T: 114; C: 90
University of Missouri-Columbia | Project Construct | 10 Pre-K programs1 | 11 Pre-K programs1 | T: 123; C: 108
University of California, Berkeley | Ready, Set, Leap! | 18 classrooms | 21 classrooms | T: 149; C: 137

1 After one program or classroom attrited.
NOTE: T: Treatment group; C: Control group. Three research teams (Vanderbilt University, University of Texas Health Science Center at Houston, and Florida State University) each evaluated two curricula against a shared control group; for these teams, the same control sample ("shared") appears in both rows. For example, Vanderbilt University compared two curricula, Bright Beginnings (103 students) and Creative Curriculum (101 students), to a control curriculum (105 students).
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.


The process of random assignment differed somewhat depending upon the evaluator. The seven research teams working with RTI were responsible for the random assignment at their sites; RTI monitored the process and tracked any changes. These teams had a pilot preschool implementation year starting in the fall of 2002. The randomization conducted in that year carried over, in most cases, to the actual evaluation begun in the 2003-04 school year. The five research teams working with MPR began implementing the curricula in the 2003-04 school year. In conjunction with the research teams, MPR conducted block random assignment for four teams. In addition, Florida State University (FSU) block randomly assigned pre-kindergarten programs to its two curricula and the control group.

The analyses included 2,911 children, 315 preschool classrooms, and 208 preschools. As noted above, the PCER study evaluates each curriculum separately, so no comparisons are made between all children in the treatment condition and all children in the control condition; comparisons between each evaluation's treatment and control groups are made in chapters 2 to 13. On average, the students were age 4.6 years at the time of the baseline data collection in the fall of 2003 and age 6.1 years at the time of the kindergarten follow-up in the spring of 2005. Approximately half (51%) of the children were male. One-third were White non-Hispanic, 43 percent were African American, and 16 percent were Hispanic. Less than 7 percent had a disability.

On average, the students' primary caregivers, most often their biological or adoptive mother, were age 32 years at the time of the fall 2003 data collection. Less than half (47%) were married and one-third were never married. Less than half (48%) had attended or graduated from college, one-third had a high school diploma or GED, and 19 percent did not complete high school. Half were employed full-time, 14 percent part-time, and 34 percent were unemployed.

Almost all the preschool teachers were female (98%) and the majority were White (54%), with one-third African American. Two-thirds had at least a college degree. On average, they had 12 years of teaching experience and 8 years of experience teaching in pre-kindergarten settings. A majority (87%) of the preschool programs in which they taught were full-day programs. More than half (58%) of the teachers taught in public pre-kindergartens, 31 percent were Head Start teachers, and child care teachers made up the remainder (12%). On average, teachers taught 15 students, with a child-staff ratio averaging 7.5 children per teacher.

The kindergarten teachers were also mostly female (98%) and White (74%), with 17 percent African American. Almost all had at least a BA (97%), with 39 percent holding a graduate degree. They averaged 15 years of teaching experience, with an average of 9 years teaching kindergarten. Ninety-three percent of the kindergarten classrooms were full-day, and 92 percent of the students were enrolled in public schools. The average number of students per classroom was 20 children. Thirty-nine percent were enrolled in schools where more than 75 percent of the students were eligible for free or reduced-price lunch.

Measures

Twenty-seven measures were chosen to address the outcomes of interest regarding children's school readiness (reading, phonological awareness, language, mathematics, and behavior) and classroom conditions (classroom quality, teacher-child interaction, and instructional practices). Table C lists the measures used for each outcome, when they were collected, and through which instrument they were collected. Five major data collection instruments were used to collect the outcome measures and other student, school, and family data: (1) a child assessment, (2) a teacher report, (3) a classroom observation, (4) a teacher interview or questionnaire, and (5) a parent interview.



Table C. —Outcomes and measures

Outcome | Measure | Times collected | Instrument
Reading | TERA | Pre-K: fall/spring, K: spring | Child assessment
Reading | WJ Letter Word Identification | Pre-K: fall/spring, K: spring | Child assessment
Reading | WJ Spelling | Pre-K: fall/spring, K: spring | Child assessment
Pre-kindergarten phonological awareness¹ | Pre-CTOPPP | Pre-K: fall/spring | Child assessment
Kindergarten phonological awareness¹ | CTOPP | K: spring | Child assessment
Language | PPVT | Pre-K: fall/spring, K: spring | Child assessment
Language | TOLD | Pre-K: fall/spring, K: spring | Child assessment
Mathematics | WJ Applied Problems | Pre-K: fall/spring, K: spring | Child assessment
Mathematics | CMA-A Mathematics Composite | Pre-K: fall/spring, K: spring | Child assessment
Mathematics | Shape Composition² | Pre-K: fall/spring, K: spring | Child assessment
Pre-kindergarten behavior¹ | SSRS Social Skills | Pre-K: fall/spring | Teacher report
Pre-kindergarten behavior¹ | SSRS Problem Behavior | Pre-K: fall/spring | Teacher report
Pre-kindergarten behavior¹ | PLBS | Pre-K: fall/spring | Teacher report
Kindergarten behavior¹ | SSRS Social Skills | K: spring | Teacher report
Kindergarten behavior¹ | SSRS Problem Behavior | K: spring | Teacher report
Kindergarten behavior¹ | LBS | K: spring | Teacher report
Classroom quality | ECERS-R | Pre-K: fall/spring | Classroom observation
Teacher-child interaction | Arnett Detachment | Pre-K: fall/spring | Classroom observation
Teacher-child interaction | Arnett Harshness | Pre-K: fall/spring | Classroom observation
Teacher-child interaction | Arnett Permissiveness | Pre-K: fall/spring | Classroom observation
Teacher-child interaction | Arnett Positive Interaction | Pre-K: fall/spring | Classroom observation
Literacy instruction | TBRS Written Expression | Pre-K: spring | Classroom observation
Literacy instruction | TBRS Print and Letter Knowledge | Pre-K: spring | Classroom observation
Phonological instruction | TBRS Phonological Awareness | Pre-K: spring | Classroom observation
Language instruction | TBRS Book Reading | Pre-K: spring | Classroom observation
Language instruction | TBRS Oral Language | Pre-K: spring | Classroom observation
Mathematics instruction | TBRS Math Concepts | Pre-K: spring | Classroom observation

¹ Pre-kindergarten and kindergarten measures are not on the same scale.
² Building Blocks, Shape Composition task.
NOTE: Refer to the glossary for abbreviations of the measures.
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.

Child Assessment

The child assessment measured the student-level academic outcomes for the evaluation, beginning with a preschool pre-test in the fall of 2003 and post-tests near the end of preschool in the spring of 2004 and the end of kindergarten in the spring of 2005. Individually administered, the battery assessed beginning reading skills, phonological awareness, oral language development, and mathematical knowledge and skills. The measures regarding reading included the Test of Early Reading Ability (TERA) (Reid, Hresko, and Hammill 2001), the Woodcock Johnson (WJ) Letter Word Identification, and WJ Spelling (McGrew and Woodcock 2001). For phonological awareness, the measures were the Elision subtests of the Preschool Comprehensive Test of Phonological and Print Processing and the Comprehensive Test of Phonological Processing for kindergarten (Pre-CTOPPP and CTOPP) (Wagner, Torgesen, and Rashotte 1999). For language, the measures included the Peabody Picture Vocabulary Test (PPVT) (Dunn and Dunn 1997) and the Test of Language Development (TOLD) Grammatic Understanding subtest (Newcomer and Hammill 1997). For mathematics, the measures were the WJ Applied Problems (McGrew and Woodcock 2001), the Child Math Assessment-Abbreviated (CMA-A) Composite Score (Klein and Starkey 2002), and the Building Blocks' Shape Composition Task (unpublished).

Teacher Report of Child Behavior

Teacher reports provided the student-level behavior measures used in the evaluation. Preschool teachers gave pre-intervention ratings of child behaviors in the fall of 2003 and post-intervention ratings in the spring of 2004. They rated each child's behavior (social competence, behavior problems, and classroom performance) using three scales: the Social Skills Rating System (SSRS) Social Skills scale, the SSRS Problem Behaviors scale (Gresham and Elliott 1990), and the Preschool Learning Behaviors Scale (PLBS) (McDermott et al. 2000). Kindergarten teachers provided a longer-term post-intervention rating on the students' behavior in the spring of 2005 using the two SSRS scales and the Learning Behaviors Scale (LBS) (McDermott et al. 2000).

Classroom Observation

Two pre-intervention classroom measures and three post-intervention classroom measures were gathered from preschool classroom observations. No observations were made of kindergarten classrooms. Three scales designed to characterize the quality and organization of the classroom and the nature of the interaction between children and the teacher were used in the observations. The Early Childhood Environment Rating Scale-Revised (ECERS-R) (Harms, Clifford, and Cryer 1998) provided an overall measure of the quality of the classroom. The Arnett Caregiver Interaction Scale (Arnett) (Arnett 1989) measured teacher-child interaction on four scales: Positive Interaction, Harshness, Detachment, and Permissiveness. The pre-intervention observation using the ECERS-R and Arnett Scale was conducted in the fall of 2003 and the post-intervention observation in the spring of 2004. The Teacher Behavior Rating Scale (TBRS) (Landry et al. 2002) was added as a post-intervention measure to the spring 2004 observation to capture preschool instructional practices. The TBRS includes scales for teacher instructional practices regarding written expression, print and letter knowledge, phonological awareness, book reading, oral language use, and mathematics concepts.

Teacher Interview/Questionnaire

Preschool teachers were interviewed regarding the types and frequency of classroom activities, general classroom information, clarification of observational data, teacher attitudes and beliefs, and teacher background information. The background information was used to construct covariates for the models used to analyze the data. Instead of an interview, kindergarten teachers completed a questionnaire that addressed their background, views on readiness, classroom resources and activities, instructional practices, and interactions with parents.

Parent Interview

Parents were interviewed regarding demographic information, their own and their child's health and disability status, their assessment of the child's accomplishments and social skills, family-child activities, parenting practices, parental depression, and the use of child care. The interview drew primarily from the Head Start Family and Child Experiences Survey (FACES) (U.S. Department of Health and Human Services 2002), supplemented with additional measures. The demographic information and disability status were used to construct covariates for the models used to analyze the data.


Study Implementation

The key implementation events in the evaluation of each curriculum included randomization of classrooms or programs, consent gathering, teacher training in the use of a treatment curriculum, implementation of the curriculum in the classroom, training of the assessors, and collection of the baseline student and classroom measures and the post-intervention measures in preschool and kindergarten. As research teams independently implemented the curricula and as the schools followed different calendars, the dates and sometimes the order of these events differed between teams and between sites within teams.

Randomization for the seven teams working with RTI occurred in the pilot year (starting in the fall of 2002) and mostly carried over into the 2003-04 evaluation year. For the five teams working with MPR, there was no pilot year, and their time of randomization ranged from July through September of 2003. The consent process followed randomization, except for two teams, for which it occurred concurrently. The start of implementation of the curricula in the classroom ranged from August through October 2003.

The RTI and MPR data collection teams attempted to collect baseline data close to the beginning of school to avoid student exposure to the treatment curricula before pre-testing. Twelve teams began implementation before baseline data collection and two teams began implementation concurrently with collection. The lag between the start of implementation and the collection of baseline data ranged from 8 to 49 days (appendix A discusses additional analyses to adjust for possible early treatment effects that might result from these cases). Baseline data collection followed the consent process for the teams working with MPR and ran concurrently for the teams working with RTI. Baseline data collection took 6 to 8 weeks between September and November 2003. Assessors were trained the week of August 4, 2003 for the teams working with RTI and the week of September 8, 2003 for the teams working with MPR.

The amount and timing of teacher training varied by team. The teams working with RTI provided most of the training during the 2002 pilot year, then gave refresher training during the 2003 evaluation year. The teams working with MPR provided initial training at the beginning of the evaluation year and then follow-up training throughout the year. The students' exposure to the treatment curriculum and their teachers' training in its use were confined to preschool for all teams except the Success for All (SFA) team; in this case, some children entered SFA kindergarten classrooms where the SFA Kinder Corners curriculum was in use.

Pre-kindergarten post-test data were collected in the spring, from April to June 2004, depending on school calendars. Student assessments, teacher interviews, teacher reports on behavior, and classroom observations were completed over a 6- to 8-week period. Parent interviews were completed over a 12-week period. Kindergarten post-test data (student assessments, teacher reports, teacher surveys, and parent interviews but no classroom observations) were collected in the spring and summer of 2005, between March and July.

Fidelity of Implementation

The research teams collected data on the fidelity of implementation for the treatment and control curricula using both a team-specific measure and a global implementation rating that can be used for between-curricula comparisons. The global ratings use a four-point scale representing High, Medium, Low, or No Implementation. The fidelity of implementation for both the treatment and control curricula was rated as Medium.

Contamination

The research teams monitored treatment and control classrooms to ensure that treatment group teachers were not sharing curriculum information or materials with teachers in the control group. At research sites with classroom-level random assignment to the treatment and control groups (treatment and control classrooms in the same school or center), the teams' classroom observations indicated that there was little or no evidence of contamination. There was minimal risk of contamination at sites where pre-kindergarten programs (child care, Head Start centers, or all pre-kindergarten classrooms in an elementary school) were randomly assigned to the treatment or control condition.

Response Rates and Attrition

The baseline data were collected in fall 2003 from the original sample, with an average response rate of 98 percent for the child assessments, 97 percent for the teacher reports, and 84 percent for the parent interviews. For the first follow-up data collection in spring 2004, attrition reduced the percentage of children for whom data were collected: 93 percent of students completed the child assessments, 90 percent had a teacher report, and 79 percent had a parent interview. Further attrition led to an additional decline in the second follow-up data collection in spring 2005, with 85 percent of the original sample completing the child assessments, 72 percent having a teacher report, and 75 percent having a parent interview. Overall, 15 percent of all the students sampled (426 students) were not included in the analyses: 2 percent were non-responders during baseline data collection and 13 percent were lost through later attrition. For the individual research teams, the percentage of students sampled who were not included in the analysis ranged from 3 to 34 percent. There was no evidence of differential sample attrition across the treatment and control groups at each research site.

Analysis

Each curriculum was analyzed separately due to the independence of the research teams, the nonrandom assignment of curricula to research teams and sites, and the differences in control conditions. Because students were nested in classrooms or programs and repeatedly assessed with multiple measures, multi-level models containing a series of student, teacher, and classroom-level covariates were used to address the cross-level correlated errors, allowing for a mixture of random and fixed effects (see appendix B for details). For each curriculum, these models were used to estimate differences between treatment and control group means for each of the 27 outcome measures.

The type of model used to analyze each outcome measure depended on the number of time points at which it was observed. Two types of models for repeated measures (spline and simple) were used for outcome measures with comparable data from two or three time points. Analysis of covariance (ANCOVA) was conducted for outcome measures observed at one time point. The more observations of a measure from different time points a model includes, the better it can identify the parameters of interest, in this case the treatment and control group means of the measures. For this reason, the spline repeated measures model is the preferred model, followed by the simple repeated measures model, and then the ANCOVA. The analysis of each measure uses the most preferred model available given the number of time points at which the measure was observed. Table D lists the model used with each measure.

For the eight student-level outcome measures with observations at three time points, a repeated measures spline model was used to compare the treatment and control group means for the spring pre-kindergarten and spring kindergarten observations. In addition, the model was used to check for differences in group mean measures at the baseline observation, check for such differences at the start of treatment if there was a lag between curriculum implementation and the baseline data collection, and compare the mean rates of growth for the treatment and control groups in pre-kindergarten and in kindergarten (the statistical techniques used are discussed in appendix B and the results from these three analyses are provided in appendix A). For the four student-level outcome measures and five classroom-level outcome measures with observations at two time points, a simple repeated measures model was used to compare the treatment and control group means at spring pre-kindergarten. Similarly, it was used to check on group mean differences at the baseline and start of treatment, and to compare the rates of growth in pre-kindergarten.
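Appendix B documents the exact model specifications. As a rough orientation only, a spline repeated measures model of the general kind described above could take the following form; this is an illustrative sketch under simplifying assumptions (a single knot at the end of pre-kindergarten, one random effect per classroom or program and one per student), not the study's exact equation.

```latex
% Illustrative two-level spline growth model (not the study's exact specification).
% Y_{tij}: score of child i in classroom/program j at time t
% t_1: time elapsed in pre-kindergarten (0 at the fall baseline)
% t_2: time elapsed in kindergarten (0 until kindergarten entry)
% T_j: treatment indicator; X_{ij}: student, teacher, and classroom covariates
\begin{align*}
Y_{tij} &= \beta_0 + \beta_1 t_1 + \beta_2 t_2 + \beta_3 T_j
          + \beta_4 (T_j \times t_1) + \beta_5 (T_j \times t_2)
          + \gamma^{\top} X_{ij} + u_j + r_{ij} + e_{tij},\\
u_j &\sim N(0,\sigma_u^2), \quad r_{ij} \sim N(0,\sigma_r^2), \quad e_{tij} \sim N(0,\sigma_e^2).
\end{align*}
```

In a form like this, the coefficient on the treatment indicator captures any treatment-control difference at baseline, the interaction terms compare pre-kindergarten and kindergarten growth rates, and the reported treatment-control differences at spring pre-kindergarten and spring kindergarten are combinations of these coefficients evaluated at those time points.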


Table D.—Model used with each measure

Outcome | Measure | Times observed | Model
Reading | TERA | 3 | Spline repeated measures
Reading | WJ Letter Word Identification | 3 | Spline repeated measures
Reading | WJ Spelling | 3 | Spline repeated measures
Pre-kindergarten phonological awareness¹ | Pre-CTOPPP | 2 | Repeated measures
Kindergarten phonological awareness¹ | CTOPP | 1 | ANCOVA w/ Pre-K baseline
Language | PPVT | 3 | Spline repeated measures
Language | TOLD | 3 | Spline repeated measures
Mathematics | WJ Applied Problems | 3 | Spline repeated measures
Mathematics | CMA-A Mathematics Composite | 3 | Spline repeated measures
Mathematics | Shape Composition² | 3 | Spline repeated measures
Pre-kindergarten behavior¹ | SSRS Social Skills | 2 | Repeated measures
Pre-kindergarten behavior¹ | SSRS Problem Behavior | 2 | Repeated measures
Pre-kindergarten behavior¹ | PLBS | 2 | Repeated measures
Kindergarten behavior¹ | SSRS Social Skills | 1 | ANCOVA w/ Pre-K baseline
Kindergarten behavior¹ | SSRS Problem Behavior | 1 | ANCOVA w/ Pre-K baseline
Kindergarten behavior¹ | LBS | 1 | ANCOVA w/ Pre-K baseline
Classroom quality | ECERS-R | 2 | Repeated measures
Teacher-child interaction | Arnett Detachment | 2 | Repeated measures
Teacher-child interaction | Arnett Harshness | 2 | Repeated measures
Teacher-child interaction | Arnett Permissiveness | 2 | Repeated measures
Teacher-child interaction | Arnett Positive Interaction | 2 | Repeated measures
Literacy instruction | TBRS Written Expression | 1 | ANCOVA
Literacy instruction | TBRS Print and Letter Knowledge | 1 | ANCOVA
Phonological instruction | TBRS Phonological Awareness | 1 | ANCOVA
Language instruction | TBRS Book Reading | 1 | ANCOVA
Language instruction | TBRS Oral Language | 1 | ANCOVA
Mathematics instruction | TBRS Math Concepts | 1 | ANCOVA

¹ Pre-kindergarten and kindergarten measures are not on the same scale.
² Building Blocks, Shape Composition task.
NOTE: ANCOVA: Analysis of covariance. The repeated measures spline model was used to analyze data collected at three time points (fall and spring of pre-kindergarten and spring of kindergarten). The simple repeated measures model was used to analyze data collected at two time points (fall and spring of pre-kindergarten). Refer to the glossary for abbreviations of the measures.
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.


ANCOVA models were used to estimate the difference in mean outcome measures between the treatment and control groups in the spring of pre-kindergarten or kindergarten when only one observation was available. The availability of only one observation of a measure occurred in two situations. First, four of the kindergarten student measures (the CTOPP, SSRS Social Skills, SSRS Problem Behaviors, and LBS) were not on the same scales as the pre-kindergarten measures. The ANCOVA model for these kindergarten measures included students' scores on the respective pre-kindergarten scale as a covariate to address any differences between the groups that occurred despite randomization. Second, six pre-kindergarten classroom instruction measures were based on the TBRS, which was given only in the spring of pre-kindergarten. Group mean differences for these were estimated using an ANCOVA without a similar baseline covariate. Because there is no baseline measure, these models may be biased by any initial differences in instruction that existed despite randomization.
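As with the growth models, the exact covariates appear in appendix B; the following is a minimal illustrative form of the ANCOVA just described for a kindergarten measure, assuming a classroom- or program-level random effect rather than reproducing the study's specification.

```latex
% Illustrative ANCOVA for a kindergarten measure with a pre-K baseline covariate.
% For the TBRS instruction measures the baseline term is simply omitted.
Y^{K}_{ij} = \beta_0 + \beta_1 T_j + \beta_2 Y^{preK}_{ij}
           + \gamma^{\top} X_{ij} + u_j + e_{ij}
```

Here the kindergarten score is regressed on the treatment indicator, the corresponding pre-kindergarten scale, and the covariates; the coefficient on the treatment indicator is the adjusted treatment-control difference reported for that measure.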

Results

The goal of the PCER initiative was to identify the impact of the 14 preschool curricula on five student-level outcomes (reading, phonological awareness, language, mathematics, and behavior) and six classroom-level outcomes (classroom quality, teacher-child interaction, and four types of instruction). Each outcome was based on one or more of the measures (see table D); thus, determining a curriculum's impact on the outcomes required two steps. First, the models were estimated to identify average differences in the 27 measures between the students receiving the treatment curriculum and those receiving the control, and to determine whether those differences were statistically significant. Second, criteria were applied to the set of measures that made up each outcome to determine whether the results for that group of measures supported a finding that the curriculum had an impact on that outcome. This process is described in the following order: (1) the model results for the 27 measures, (2) the criteria applied to the measures for each outcome, and (3) the findings derived from applying the criteria to the results for the measures.

The analysis tested the statistical significance of the difference between the means of the treatment and control groups for each measure. Tables E-G display this difference as an effect size and note which differences are statistically significant (using a significance level of .05 and a two-tailed test). In the tables, the measures are grouped under their corresponding student-level and classroom-level outcomes. Table E identifies the impacts of each curriculum on the student-level measures in pre-kindergarten (note that Creative Curriculum is listed twice, as it was implemented by the Vanderbilt University (Tennessee) research team and by the University of North Carolina at Charlotte (North Carolina) research team). Ten curricula show no statistically significant impacts on any of the student-level measures, while five show significant impacts on some measures. Table F identifies nine curricula showing no statistically significant impacts on any of the student-level measures in kindergarten and six that do. Table G shows that seven curricula have no statistically significant impacts on any of the classroom-level measures and eight curricula show such impacts.
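The appendices document exactly how these effect sizes were computed from the model estimates. Purely for orientation, and as an assumption of this summary rather than a statement of the study's formula, a standardized mean-difference effect size typically has the form shown below.

```latex
% Generic standardized mean difference; the study's standardizer may differ.
d = \frac{\hat{\mu}_{T} - \hat{\mu}_{C}}{s_{pooled}},
\qquad
s_{pooled} = \sqrt{\frac{(n_T - 1)\,s_T^{2} + (n_C - 1)\,s_C^{2}}{n_T + n_C - 2}}
```

In such a formulation the numerator would be the model-adjusted treatment-control difference and the denominator a pooled standard deviation of the measure; on this scale, values of roughly .2, .5, and .8 are conventionally read as small, medium, and large differences.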


Table E.—Effect sizes for student-level measures: Pre-kindergarten

Outcome/Measure | BB | CC (V) | CC (UNC) | CC with Ldrs | Curiosity Corner | DD | LB | ELLM | LFC | DLM with OC | LE | Pre-K Math | PA | PC | RSL
Reading: TERA | .39* | .02 | -.08 | -.30 | .10 | .06 | .02 | .15 | .16 | .68*** | .17 | .13 | .14 | .00 | .08
Reading: WJ Letter Word Identification | .35 | .16 | -.08 | -.16 | .09 | .10 | .10 | -.05 | .11 | .51** | .30 | -.01 | .42 | -.05 | .01
Reading: WJ Spelling | .18 | .19 | -.18 | .30 | .04 | .06 | .17 | .11 | .25 | .46** | .05 | .20 | .27 | -.15 | .20
Phonological awareness: Pre-CTOPPP | -.07 | .10 | .02 | -.16 | .18 | .18 | -.13 | .18 | .20 | .32* | .14 | .04 | .05 | .10 | -.09
Language: PPVT | .13 | .23 | .08 | -.38 | -.01 | .15 | -.03 | .17 | .02 | .40* | .17 | .17 | .16 | .03 | .15
Language: TOLD | .09 | .07 | -.16 | -.22 | -.08 | .17 | .08 | .15 | .01 | .40** | -.04 | .17 | .15 | -.05 | -.11
Mathematics: WJ Applied Problems | .16 | .17 | .20 | -.14 | .10 | .01 | -.10 | .10 | .20 | .36** | .05 | .22 | .07 | .06 | .04
Mathematics: CMA-A Mathematics Composite | .14 | .10 | -.10 | .18 | .01 | .13 | .15 | .01 | .08 | .17 | -.02 | .44** | .18 | -.11 | -.24*
Mathematics: Shape Composite | -.03 | .12 | .19 | .02 | .16 | -.13 | .21 | -.14 | .08 | .24 | -.01 | .96*** | .27 | -.42** | .08
Behavior: SSRS Social Skills | -.27 | .03 | .05 | -.25 | -.06 | -.18 | -.27 | -.06 | -.42 | -.11 | -.06 | .22 | .04 | .22 | -.05
Behavior: SSRS Problem Behavior | .23 | .07 | -.16 | -.01 | .43 | -.14 | -.06 | -.24 | .37 | .11 | -.31 | -.09 | .50 | -.08 | -.03
Behavior: PLBS | .04 | .14 | .07 | -.08 | -.25 | -.18 | -.44 | .14 | -.27 | -.16 | .17 | .09 | -.31 | .00 | .07

* p < .05; ** p < .01; *** p < .001
NOTE: Refer to the glossary for abbreviations of the measures. Abbreviations for the curricula are: BB: Bright Beginnings; CC (V): Creative Curriculum (Vanderbilt University); CC (UNC): Creative Curriculum (University of North Carolina at Charlotte); CC with Ldrs: Creative Curriculum with Ladders to Literacy; DD: Doors to Discovery; LB: Let's Begin with the Letter People; ELLM: Early Literacy and Learning Model; LFC: Language-Focused Curriculum; DLM with OC: DLM Early Childhood Express supplemented with Open Court Reading Pre-K; LE: Literacy Express; Pre-K Math: Pre-K Mathematics supplemented with DLM Early Childhood Express Math software; PA: Project Approach; PC: Project Construct; RSL: Ready, Set, Leap!
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.


Table F.—Effect sizes for student-level measures: Kindergarten

Outcome/Measure | BB | CC (V) | CC (UNC) | CC with Ldrs | Curiosity Corner | DD | LB | ELLM | LFC | DLM with OC | LE | Pre-K Math | PA | PC | RSL
Reading: TERA | -.07 | .10 | -.04 | -.54 | .43* | -.05 | -.13 | .30 | .05 | .76** | -.11 | .31 | .29 | -.03 | .01
Reading: WJ Letter Word Identification | .09 | .38 | .00 | -.27 | .43* | -.09 | -.18 | .00 | .02 | .50** | .08 | .22 | .03 | .16 | -.12
Reading: WJ Spelling | .06 | .25 | -.05 | -.08 | .20 | -.12 | -.06 | .04 | .11 | .22 | .06 | .03 | .14 | .00 | .04
Phonological awareness: CTOPP | .01 | .06 | .06 | -.10 | .25 | -.09 | -.13 | .08 | .03 | .38* | .08 | -.11 | -.17 | -.12 | -.02
Language: PPVT | .07 | .12 | .15 | -.30 | .14 | .18 | .00 | .34* | -.09 | .48** | .16 | .11 | .10 | .10 | -.02
Language: TOLD | .16 | .11 | -.17 | -.06 | .15 | .06 | -.12 | .44** | -.07 | .46** | .10 | .08 | .32 | .01 | -.03
Mathematics: WJ Applied Problems | .13 | .17 | .09 | -.33 | .26 | -.02 | -.13 | .26 | .11 | .48*** | -.02 | .13 | .27 | .08 | .00
Mathematics: CMA-A Mathematics Composite | .07 | .05 | .14 | -.19 | -.05 | -.16 | -.07 | -.05 | .00 | .13 | -.21 | .13 | .22 | -.06 | -.10
Mathematics: Shape Composite | .15 | .00 | -.01 | -.10 | .32 | -.12 | -.06 | .03 | .06 | .09 | -.14 | .41*** | .24 | .12 | .03
Behavior: SSRS Social Skills | .03 | .35 | -.12 | .17 | .32 | -.05 | .24 | .27 | -.07 | -.18 | -.37 | .06 | -.44* | .12 | -.03
Behavior: SSRS Problem Behavior | .24 | -.05 | .08 | .02 | -.08 | .46 | .06 | .23 | -.05 | .01 | .22 | -.01 | .49* | .07 | .07
Behavior: LBS | .30 | .08 | -.20 | -.11 | .11 | -.32 | -.10 | .04 | .10 | -.13 | -.38* | .01 | -.42* | -.02 | -.01

* p < .05; ** p < .01; *** p < .001
NOTE: Refer to the glossary for abbreviations of the measures. Abbreviations for the curricula are: BB: Bright Beginnings; CC (V): Creative Curriculum (Vanderbilt University); CC (UNC): Creative Curriculum (University of North Carolina at Charlotte); CC with Ldrs: Creative Curriculum with Ladders to Literacy; DD: Doors to Discovery; LB: Let's Begin with the Letter People; ELLM: Early Literacy and Learning Model; LFC: Language-Focused Curriculum; DLM with OC: DLM Early Childhood Express supplemented with Open Court Reading Pre-K; LE: Literacy Express; Pre-K Math: Pre-K Mathematics supplemented with DLM Early Childhood Express Math software; PA: Project Approach; PC: Project Construct; RSL: Ready, Set, Leap!
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.


Table G.—Effect sizes for classroom-level measures: Pre-kindergarten

Outcome/Measure | BB | CC (V) | CC (UNC) | CC with Ldrs | Curiosity Corner | DD | LB | ELLM | LFC | DLM with OC | LE | Pre-K Math | PA | PC | RSL
Global classroom quality: ECERS-R | .80 | .45 | 1.66* | -.71 | -.48 | .39 | .82* | -.48 | — | .34 | 1.29* | .05 | -.19 | .54 | .16
Teacher-child interaction: Arnett Detachment | .19 | -.16 | -1.68* | .51 | -.41 | -.07 | -.07 | -.41 | — | -.06 | -1.09 | -.37 | .57 | .12 | .19
Teacher-child interaction: Arnett Harshness | .12 | -.12 | -.70 | -.26 | .14 | -.38 | -.95* | -.40 | — | -.70 | -.84 | .18 | .86 | -.13 | .30
Teacher-child interaction: Arnett Permissiveness | .16 | .51 | -1.01 | 1.02 | -.98 | .13 | -.05 | -.24 | — | .05 | .51 | -.45 | -.43 | -.02 | -.24
Teacher-child interaction: Arnett Positive Interactions | .41 | -.15 | 1.65** | .03 | .02 | .38 | .48 | .29 | — | .43 | .56 | .16 | -.99 | .46 | .04
Language instruction: TBRS Book Reading | 1.03 | -.47 | .28 | -.32 | 2.06** | 1.18* | .63 | .32 | -.79 | .01 | .49 | .07 | -.76 | .81 | -.18
Language instruction: TBRS Oral Language | .39 | -.07 | 1.80** | -.50 | .37 | .59 | .44 | .14 | .87 | -.33 | .25 | .19 | -.42 | .52 | -.24
Phonological instruction: TBRS Phonological Awareness | 1.53* | 1.97 | -.10 | -.19 | .44 | .58 | .66 | .53 | .92 | 1.41* | 1.26* | .38 | -1.19 | .01 | .22
Literacy instruction: TBRS Print and Letter Knowledge | 1.51* | 1.81 | 1.02 | .75 | -.99 | .90* | .99* | .41 | .33 | .91 | 1.07 | .07 | .34 | .34 | -.02
Literacy instruction: TBRS Written Expression | 1.61* | 1.99 | 1.73** | 1.13* | -.54 | .62 | .60 | -.22 | .99 | -.58 | -.03 | -.12 | .62 | .43 | .10
Mathematics instruction: TBRS Math Concepts | .98 | 1.48 | .75 | .44 | -.33 | .37 | .24 | -.92 | .20 | -.46 | -.12 | .57 | -.64 | .53 | -.10

— Not available.
* p < .05; ** p < .01
NOTE: Refer to the glossary for abbreviations of the measures. Abbreviations for the curricula are: BB: Bright Beginnings; CC (V): Creative Curriculum (Vanderbilt University); CC (UNC): Creative Curriculum (University of North Carolina at Charlotte); CC with Ldrs: Creative Curriculum with Ladders to Literacy; DD: Doors to Discovery; LB: Let's Begin with the Letter People; ELLM: Early Literacy and Learning Model; LFC: Language-Focused Curriculum; DLM with OC: DLM Early Childhood Express supplemented with Open Court Reading Pre-K; LE: Literacy Express; Pre-K Math: Pre-K Mathematics supplemented with DLM Early Childhood Express Math software; PA: Project Approach; PC: Project Construct; RSL: Ready, Set, Leap!
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.


The statistical significance of these results depends, in part, upon the evaluations having adequate power to detect significant impacts. The original IES Request for Applications to which the 12 research teams successfully responded required that each team include a minimum of 10 classrooms or preschool programs (half treatment and half control) with a minimum of 150 total students. Minimal Detectable Effects were calculated after data collection using the smaller achieved (not expected) samples for each team on a set of four composite measures (combining the measures for reading, language, mathematics, and behavior, respectively). The Minimal Detectable Effects ranged from .34 to .69 across the composites and teams.

Four of the five student-level outcomes had more than one measure associated with them (phonological awareness had only one per grade), as did three of the six classroom-level outcomes. The measures within an outcome are conceptually related to one another and sufficiently inter-correlated that an effect on one would not be expected to appear, except by chance, without indications of some effect on the others. To minimize the potential for false positive findings that may arise from multiple comparisons made among related measures, a criterion was applied to the set of measures within each outcome (rather than a post-hoc statistical adjustment). These criteria were used to determine whether a curriculum had a treatment effect on each student-level outcome for pre-kindergarten and for kindergarten (the sketch following these lists illustrates how such rules translate measure-level results into outcome findings). They include:

• The reading, mathematics, and behavior outcomes each contained three measures. A finding that a curriculum has an effect on any of these three outcomes required at least two of the three measures to have had a statistically significant effect with the same sign and no significant effect with the opposite sign.

• The language outcome contained two measures. A finding of an outcome effect required at least one of the two measures to have had a statistically significant effect and no significant effect with the opposite sign.

• The phonological awareness outcome contained one measure. A finding of an outcome effect required this measure (Pre-CTOPPP in preschool and CTOPP in kindergarten) to have had a statistically significant effect.

A similar set of rules was used to determine whether a curriculum had a treatment effect on each pre-kindergarten classroom-level outcome:

• The classroom quality outcome contained one measure. A finding of an outcome effect required this measure to have had a statistically significant effect.

• The teacher-child interaction outcome contained four measures. A finding of an outcome effect required at least two of the four measures to have had a statistically significant effect in the same direction and no statistically significant effect in the opposite direction. For these measures, direction concerns the desirability of the effect; a desirable effect would be a positive sign for the Positive Interaction scale and a negative sign for the other three scales.

• The early literacy instruction outcome and the early language instruction outcome each contained two measures. A finding of an outcome effect required at least one of the two measures to have had a statistically significant effect and no significant effect with the opposite sign.

• The phonological instruction outcome and the mathematics instruction outcome each contained one measure. A finding of an outcome effect required the measure to have had a statistically significant effect.
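The decision rules above can be summarized schematically. The sketch below is an illustration written for this summary, not the study's code, and it assumes effect sizes are oriented so that a desirable effect is positive (the three negatively oriented teacher-child interaction scales would be re-signed first).

```python
def outcome_finding(measure_results):
    """Translate measure-level results for one outcome into a finding.

    measure_results: list of (effect_size, is_significant) tuples, one per measure,
                     with effect sizes oriented so a desirable effect is positive.
    Returns "+" (positive finding), "-" (negative finding), or "" (no finding).
    """
    sig_positive = sum(1 for es, sig in measure_results if sig and es > 0)
    sig_negative = sum(1 for es, sig in measure_results if sig and es < 0)
    # Outcomes with three or four measures need at least two significant effects;
    # outcomes with one or two measures need at least one.
    required = 2 if len(measure_results) >= 3 else 1
    if sig_positive >= required and sig_negative == 0:
        return "+"
    if sig_negative >= required and sig_positive == 0:
        return "-"
    return ""

# Example drawn from table E: Bright Beginnings, reading in pre-kindergarten.
# Only 1 of the 3 reading measures (TERA, .39) was significant, so no finding results.
print(outcome_finding([(0.39, True), (0.35, False), (0.18, False)]))  # prints ""
```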

These criteria were applied to the results for each set of measures within the five student-level outcomes (for pre-kindergarten and for kindergarten) and the six classroom-level outcomes (for pre-kindergarten) presented in tables E-G. In this way, each curriculum's impact on each of the 16 outcomes was determined. Below, these findings are presented in two sections: the first organized by outcome and the second by curriculum. Under the Findings by Outcome, the curricula affecting each of the five student-level outcomes (for pre-kindergarten and kindergarten) and the six classroom-level outcomes (for pre-kindergarten) are identified. Under the Findings by Curriculum, each curriculum is discussed with regard to its effects on the outcomes. The findings described in both sections are presented in tables H and I. Table H shows the impacts of each curriculum on the student-level outcomes for both pre-kindergarten (pre-K) and kindergarten (K). A blank cell stands for no effect, a plus sign (+) means a positive effect, a minus sign (-) means a negative effect, and a zero (0) signifies no effect in one grade when there is an effect in the other. Table I shows the impact of each curriculum on the classroom-level outcomes using the same symbols.

Findings by Outcome

Two of the 14 intervention curricula had impacts on the student-level outcomes for the pre-kindergarten year (table H). DLM Early Childhood Express supplemented with Open Court Reading Pre-K positively affected reading, phonological awareness, and language. Pre-K Mathematics supplemented with DLM Early Childhood Express Math software positively affected mathematics.

In the kindergarten year, four of the curricula had impacts on the student-level outcomes, though three of these did not have impacts during the pre-kindergarten year (table H). DLM Early Childhood Express supplemented with Open Court Reading Pre-K continued to have positive effects on reading, phonological awareness, and language in kindergarten, as it did in pre-kindergarten. Curiosity Corner, which had no effects in pre-kindergarten, was found to positively affect reading in kindergarten. Early Literacy and Learning Model (ELLM), which had no effects in pre-kindergarten, was found to positively affect language in kindergarten. Project Approach, which had no effects in pre-kindergarten, was found to negatively affect behavior in kindergarten.

Eight of the 14 treatment curricula had a positive effect on the pre-kindergarten classroom-level outcomes (table I). Bright Beginnings affected early literacy instruction and phonological awareness instruction. Creative Curriculum (as implemented by the North Carolina team but not by the Tennessee research team) affected classroom quality, teacher-child interaction, early literacy instruction, and early language instruction. Creative Curriculum with Ladders to Literacy affected early literacy instruction. Curiosity Corner affected early language instruction. DLM Early Childhood Express supplemented with Open Court Reading Pre-K affected phonological awareness instruction. Doors to Discovery affected early literacy instruction and early language instruction. Let's Begin with the Letter People affected classroom quality and early literacy instruction. Literacy Express affected classroom quality and phonological awareness instruction.

Findings by Curriculum

Each curriculum is discussed separately, and cross-curriculum comparisons are not made. The type of pre-kindergarten program involved in the evaluation and the control curricula are described (though the results should not be used to evaluate any control curricula). Impacts on the outcomes are then presented in the following order: (1) student-level outcomes in pre-kindergarten, (2) student-level outcomes in kindergarten, and (3) classroom-level outcomes in pre-kindergarten.

Bright Beginnings

Bright Beginnings and its control were implemented in state pre-kindergarten classrooms in Tennessee. In the control classrooms, teachers used teacher-developed curricula with a focus on basic school readiness. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. A positive impact was found at the classroom level on early literacy instruction and phonological awareness instruction.


Table H.—Findings by student-level outcomes

Curricula | Reading | Phonological awareness | Language | Mathematics | Behavior
Bright Beginnings |  |  |  |  | 
Creative Curriculum (Vanderbilt) |  |  |  |  | 
Creative Curriculum (UNC-Charlotte) |  |  |  |  | 
Creative Curriculum with Ladders to Literacy |  |  |  |  | 
Curiosity Corner | Pre-K: 0, K: + |  |  |  | 
DLM Early Childhood Express with Open Court Reading Pre-K | Pre-K: +, K: + | Pre-K: +, K: + | Pre-K: +, K: + |  | 
Doors to Discovery |  |  |  |  | 
Early Literacy and Learning Model |  |  | Pre-K: 0, K: + |  | 
Language-Focused Curriculum |  |  |  |  | 
Let's Begin with the Letter People |  |  |  |  | 
Literacy Express |  |  |  |  | 
Pre-K Mathematics with DLM Early Childhood Express Math software |  |  |  | Pre-K: +, K: 0 | 
Project Approach |  |  |  |  | Pre-K: 0, K: -
Project Construct |  |  |  |  | 
Ready, Set, Leap! |  |  |  |  | 

NOTE: Abbreviations of the findings are: Pre-K: Pre-kindergarten; K: Kindergarten; +: Finding of a positive impact; -: Finding of a negative impact; Blank cell: Finding of no impact; 0: Finding of no impact (when an impact is found for the other grade).
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.

Creative Curriculum─Vanderbilt University

Creative Curriculum and its control were implemented in state pre-kindergarten classrooms in Tennessee. In the control classrooms, teachers used teacher-developed curricula with a focus on basic school readiness. No impacts regarding pre-kindergarten or kindergarten student-level outcomes were found. No impacts were found on the classroom-level outcomes.


Table I.—Findings by classroom-level outcomes

Curricula | Classroom quality | Teacher-child interaction | Early literacy instruction | Phonological awareness instruction | Early language instruction | Math concepts instruction
Bright Beginnings |  |  | + | + |  | 
Creative Curriculum (Vanderbilt) |  |  |  |  |  | 
Creative Curriculum (UNC-Charlotte) | + | + | + |  | + | 
Creative Curriculum with Ladders to Literacy |  |  | + |  |  | 
Curiosity Corner |  |  |  |  | + | 
DLM Early Childhood Express with Open Court Reading Pre-K |  |  |  | + |  | 
Doors to Discovery |  |  | + |  | + | 
Early Literacy and Learning Model |  |  |  |  |  | 
Language-Focused Curriculum |  |  |  |  |  | 
Let's Begin with the Letter People | + |  | + |  |  | 
Literacy Express | + |  |  | + |  | 
Pre-K Mathematics with DLM Early Childhood Express Math software |  |  |  |  |  | 
Project Approach |  |  |  |  |  | 
Project Construct |  |  |  |  |  | 
Ready, Set, Leap! |  |  |  |  |  | 

NOTE: Abbreviations of the findings are: +: Finding of a positive impact; Blank cell: Finding of no impact.
SOURCE: The Preschool Curriculum Evaluation Research (PCER) Study.

Creative Curriculum─University of North Carolina at Charlotte

Creative Curriculum and its control were implemented in full-day Head Start programs in North Carolina and Georgia. In the control condition, teachers used teacher-developed, nonspecific curricula. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. A positive impact was found at the classroom level on overall classroom quality, teacher-child interaction, early literacy instruction, and early language instruction.

Creative Curriculum with Ladders to Literacy

Ladders to Literacy was implemented in full-day and half-day Head Start classrooms in New Hampshire as a supplementary curriculum in conjunction with Creative Curriculum. In the control condition, teachers used only Creative Curriculum. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. A positive impact was found at the classroom level on early literacy instruction.


Curiosity Corner

Curiosity Corner and its control were implemented in full-day preschool programs in three different states (Florida, Kansas, and New Jersey). In the control condition, teachers used a variety of preschool curricula, including the Creative Curriculum and Animated Literacy curriculum models and teacher-developed curricula. No impacts regarding pre-kindergarten student-level outcomes were found. A positive impact on reading was found at the end of kindergarten. A positive impact was found at the classroom level on early language instruction.

DLM Early Childhood Express supplemented with Open Court Reading Pre-K

The evaluation of DLM Early Childhood Express supplemented with Open Court Reading Pre-K took place in public pre-kindergarten classrooms in Florida. In the control condition, teachers were provided with the High/Scope curriculum. A positive impact was found on reading, phonological awareness, and language development in both pre-kindergarten and kindergarten. A positive impact was found at the classroom level on phonological awareness instruction.

Doors to Discovery

Doors to Discovery and its control were implemented in full-day Head Start and public pre-kindergarten (Title I and non-Title I) programs in Texas. In the control condition, teachers used teacher-developed, nonspecific curricula. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. A positive impact was found at the classroom level on early literacy instruction and early language instruction.

Early Literacy and Learning Model (ELLM)

The Early Literacy and Learning Model (ELLM) curriculum was implemented in combination with the existing comprehensive curricula that were in use in the control group classrooms in Florida. Several curricula were used in the control classrooms, including Creative Curriculum, Beyond Centers and Circletime, High Reach, and High/Scope. No impacts regarding pre-kindergarten student-level outcomes were found. A positive impact on language development was found at the end of kindergarten. No impacts were found on the classroom-level outcomes.

Language-Focused Curriculum

The Language-Focused Curriculum was implemented in full-day Head Start and public pre-kindergarten classrooms in Virginia. The control teachers reported using High/Scope curriculum materials. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. No impacts were found on the classroom instruction outcomes. Impacts on classroom quality and teacher-child interaction outcomes could not be determined because of unreliable (inflated) data from 8 of the 14 participating classrooms on the relevant measures.

Let's Begin with the Letter People

Let's Begin with the Letter People and its control were implemented in full-day Head Start and public pre-kindergarten (Title I and non-Title I) programs in Texas. In the control condition, teachers used teacher-developed, nonspecific curricula. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. A positive impact was found at the classroom level on classroom quality and early literacy instruction.

Literacy Express

Literacy Express and its control were implemented in public pre-kindergarten classrooms in Florida. In the control condition, teachers were provided with the High/Scope curriculum. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. A positive impact was found at the classroom level on classroom quality and phonological awareness instruction.
Pre-K Mathematics supplemented with DLM Early Childhood Express Math Software

The evaluation of Pre-K Mathematics supplemented with DLM Early Childhood Express Math software took place in Head Start and public pre-kindergarten classrooms in California and New York. Several curricula were used in the control condition, including Creative Curriculum, High/Scope, Montessori, specialized literacy curricula, and local school district and teacher-developed curricula. A positive impact was found on students' mathematical knowledge at the end of pre-kindergarten. No impacts on the kindergarten student-level outcomes were found. No impacts were found on the classroom-level outcomes.

Project Approach

The Project Approach curriculum was implemented in public pre-kindergarten classrooms in Wisconsin. In the control classrooms, teachers reported implementing their own teacher-developed, nonspecific curricula. No impacts on the pre-kindergarten student-level outcomes were found. A negative impact on behavior was found at the end of kindergarten. No impacts were found on the classroom-level outcomes.

Project Construct

Project Construct was implemented in full-day child care centers in Missouri. In the control schools, teacher-developed generic curricula were implemented. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. No impacts were found on the classroom-level outcomes.

Ready, Set, Leap!

Ready, Set, Leap! was implemented in pre-kindergarten programs in New Jersey. In the control condition, teachers used the High/Scope approach. No impacts on the pre-kindergarten or kindergarten student-level outcomes were found. No impacts were found on the classroom-level outcomes.
