Early Intervention and Education Program, Phase 2
February 9, 2018

Management Audit Committee
Representative Michael Madden, Chairman
Senator Dave Kinskey, Vice Chairman

Senator Paul Barnard
Senator Leland Christensen
Senator John Hastert
Senator Tara Nethercott
Senator Charles Scott

Representative
Representative Jerry Obermueller
Representative
Representative

Prepared by:
Michael Swank, Program Evaluation & Research Manager
Samantha Mills, Program Evaluator
Marla Smith, Associate Program Evaluator

TABLE OF CONTENTS

Early Intervention and Education Program, Phase 2

Executive Summary ...... v
Recommendation Locator ...... xi
List of Acronyms ...... xv
Definitions ...... xvii
Introduction, Scope, and Methodology ...... 1
Chapter 1 Currently Reported Outcomes ...... 7
Chapter 2 Early Education Data Environment ...... 23
Chapter 3 Potential Additional Program Outcomes or Performance Measures ...... 31
Agency Responses
    Wyoming Department of Health ...... 49
    Wyoming Department of Education ...... 53
Supplement to Early Intervention and Education Program, Phase 2 Report ...... 57
Appendices
    (A) Update on Legislative and Agency Actions from Phase 1 ...... A-1
    (B) Summary of Phase 2 Data Analysis Methodology ...... B-1
    (C) Federal Reporting Indicators ...... C-1
    (D) Example Part C Report Card Template ...... D-1
    (E) Example Best Practices for Data System Development ...... E-1
    (F) LSO School District Early Childhood Survey Questionnaire ...... F-1


Wyoming Legislative Service Office
Executive Summary
Early Intervention and Education Program, Phase 2

Introduction

The Early Intervention and Education Program (Program or EIEP), administered through the Wyoming Department of Health (Health), provides funding and oversight to fourteen contracted regional child development centers to provide early intervention and preschool services to young children with disabilities. The Program fulfills the State's implementation of the federal Individuals with Disabilities Education Act (IDEA) and Wyoming's 1989 Services to Preschool Children with Disabilities Act.

The foundation of IDEA is to identify children with developmental delays and disabilities as soon as possible in order to provide interventions and services so that they may function at age-appropriate levels. Interventions and services are provided through the Part C and Part B programs. Part C focuses on early intervention services for the families of eligible children age birth through two years, and is an optional program for the State. Part B focuses on eligible children age three through twenty-one, entitling them to a free and appropriate public education delivered in the least restrictive regular education environment alongside age-appropriate peers, to the maximum extent possible. The Program, however, administers only a sub-component of Part B, Section 619 (Part B 619), specific to children ages three through five years. Therefore, the Program includes services for children age birth through five years.

The State receives two separate allotments of federal funding based on the IDEA Part C and Part B components. Health solely administers the Part C program. For the overall Part B program, the Wyoming Department of Education (Education) is the federally designated State Education Agency (SEA) and is required to oversee its implementation. Consequently, Education provides monitoring and oversight of Health for the Part B 619 portion of the Program.

Phase 2 Evaluation Purpose

In September 2016, the Management Audit Committee (Committee) released the Phase 1 report on the Program, which addressed issues related to child identification rates, the State funding model and its implementation, and the Health-Education organizational structure. This Phase 2 report focuses on the final research objective related to Program outcomes and impacts. The foundation of this report is built upon the premise that since Wyoming funds about 90% of the Program's cost, the Program should demonstrate, to a significant degree, positive, State-defined outcomes and impacts. Importantly, due to the Committee's emphasis on understanding children's outcomes in the kindergarten through twelfth grade (K-12) public education system, this report focuses significantly on the Part B 619 program reporting and analysis. Some Part C information is included as necessary to demonstrate specific processes or conditions of data system structure and use of data.

Background

The Program is driven by the federal IDEA law and regulations, with its core accountability framework based on federal requirements and indicators. Each year, both Health (for Part C) and Education (for Part B) must report and provide assurances to the federal Office of Special Education Programs (OSEP) that the State is meeting federal requirements. Significantly, each year OSEP has found that both Health and Education met performance expectations. In recent years, OSEP has pursued its Results Driven Accountability (RDA) model to incorporate more emphasis on improving child functional and academic outcomes as opposed to pure compliance with federal laws, regulations, and procedures. While this change has not drastically changed reporting requirements, this new emphasis on outcomes has spurred increased use of data analysis and data-driven monitoring practices.

There are three main parts to the federal accountability framework:
▪ State Performance Plan (SPP)
▪ Annual Performance Report (APR)
▪ State Systemic Improvement Plan (SSIP)

These documents include information on compliance and outcome indicators for students and the programs. The indicators cover issues such as timeliness of evaluations and eligibility determinations, and student growth in social-emotional, behavior, and academic knowledge skills. The latter indicator generally relates to program outcomes as defined by the federal government.

Since FY2007 (2006 Laws, Ch. 85), the Legislature has also required the Program to report on parent satisfaction and child outcomes. However, in the absence of State-level definitions and measures, Health has reported the federal indicators to the Legislature. Education and Health have not pursued or conducted analyses on Program children's skills growth or achievement in the K-12 education system.

Report Finding and Recommendation Summary

Findings

The Committee's main question related to this Phase 2 evaluation is how or whether the Program can be determined successful. This assessment relies on two important conditions:
1. The Program's purpose and mission are clearly defined.
2. Performance expectations and methods are set to measure whether the program is meeting its purpose and mission.

Based on the current Program legal authority and administrative practices, evaluation staff conclude that the preceding conditions have generally been set by the federal accountability framework, but have not been sufficiently explained or described at the State level. If relying on the federal framework, under which Health and Education are regularly determined by the federal government to be compliant, the Program may be determined to be meeting expectations.

However, as noted during the Phase 1 evaluation, Wyoming general funds cover the vast majority of Program costs. Therefore, evaluation staff focused on trying to understand what Health, Education, and other system stakeholders believe to be the central purpose and mission of the Program, as well as what may be the best approach for the State to measure its impact and success.

Overall, evaluation staff found the following issues related to the Program's state-level data and reporting framework:
▪ Health is pursuing a pilot initiative to change the Child Outcome Summary (COS) process used to report on federal Outcome Indicator 3 (for Part C) and Outcome Indicator 7 (for Part B 619). Health has not provided sufficient or consistent direction to regional centers on the expected conclusion of the pilot initiative, its anticipated policy and procedural development impacts, and its ultimate goal for improved regional center or agency oversight practices.
▪ The disjointed Part B 619 federal reporting practices between Health and Education contribute to possible problematic SPP, APR, and SSIP report narratives and conclusions that indicate inconsistencies or contradictions in what Health and Education say is being implemented to improve program results.
▪ Health's Special Education Automation Software (SEAS) data system was established in 2010, but does not meet contemporary best practices for early childhood data systems. Issues with the system and resulting data include a lack of data definitions for consistent data entry, a lack of automatic system controls to prohibit or lessen inaccurate data entry, and problematic reporting mechanics that contribute to inefficient data compiling and extraction for ongoing program administration or federal reporting requirements.
▪ The Legislature and agencies have not worked to clearly define the State's purpose and mission for the Program beyond the federal accountability framework. This evaluation provides options on how to look at the Program's impact, outcomes, and ultimate success, including assessing the K-12 education system academic progress or achievement of children who received Program services.

Recommendations

Many of the recommendations noted in this report emphasize a collaborative relationship between Health and Education to establish clear and consistent data and reporting system functions so that both federal and future Legislature-defined accountability measures can be efficiently pursued. More specifically, the report recommends:
▪ Health should revisit its current change initiative for the COS to ensure it remains current and relevant and that the pilot initiative includes a study period to summarize its successes and continuing concerns. Health should also establish clear implementation and data benchmarks for a future statewide rollout.
▪ Education and Health should develop a standard and more formalized Part B 619 data reporting and SPP, APR, and SSIP coordinating process to assure reported data is accurate (accompanied by Phase 1 Recommendation 2.1) and that report narratives accurately reflect agency and program practices for improving performance.
▪ Related to the SEAS data system, in the short term, Health should conduct a thorough assessment of its SEAS data system, in coordination with its vendor and regional centers, to assure more efficient and accurate collection of required data, including setting standard business rules and data definitions. In the long term, especially if the Legislature defines state-level performance measures, Health should work with Education to ensure SEAS or a new early childhood data system meets reasonable and customary education data standards and can more efficiently meet day-to-day program administrative and federal reporting requirements.
▪ Education should revisit its process for distributing WISER IDs to Part B 619 children and revisit its policy for not assigning WISER IDs for Part C children.
▪ Health and Education should explore setting minimum regional center practice, curricula, personnel, training, or other standards to better define what "high quality" early intervention and preschool services include, and that can be efficiently and effectively monitored by the agencies.

Overall, the Legislature has the option and opportunity to define a more specific accountability framework for the Program that targets State interests. Example components to consider in this framework include: defining the Program purpose and goals, setting minimum service and/or curricula standards, establishing outcome measures specific to Program students' K-12 growth and achievement, and setting clear administrative responsibility.


Agency Response

Wyoming Department of Health

The Wyoming Department of Health stated that both the Phase 1 and Phase 2 reports have been useful tools for the agency, providing insight and reflection on the ways in which Health can continue to strengthen the Program. Health agreed with all of the recommendations provided in the Phase 2 report, but provided several notes for the Legislature to consider. Health continues to believe that the best placement for both Part C and Part B 619 is with the Department of Education. However, regardless of which agency the Legislature determines is the best placement for the Program, Health stressed that both programs should remain administratively together. Health also anticipates working with Education on implementing the recommendations.

Wyoming Department of Education

The Wyoming Department of Education also agreed with the recommendations contained in the Phase 2 report. Education stated that the Program is multi-faceted and complex, and that its administration and impact extend far beyond Education's Division of Individual Learning. The evaluation provides the opportunity to refine and improve oversight and procedures as Education works with Health. Related to specific recommendations, Education noted that it does not formally publish data collected by other agencies because it cannot speak to the data collection process. Education is willing to work with Health to ensure that an effective database is used by the Program. Additionally, Education stated that it will continue to train Health staff on WISER ID retrieval for Program children and that there are no technical reasons Health cannot retrieve WISER IDs for all of its students, including Part C children. Finally, Education stated that defining and measuring high quality programs would be easier if the Program were placed under Education's administration, and that it agrees the Legislature could set up a more specific and clear accountability framework, as noted under the report's Chapter 3 policy considerations.


Recommendation Locator

Chapter 1, Recommendation 1.1 (Page 15)
Party Addressed: Health. Agency Response: Agree.
The Wyoming Department of Health should revisit its original Child Outcome Summary change initiative process plan from 2015 to ensure it remains current and relevant. Health should extend the plan to define a study and determination period on the pilot process, its successes, continuing concerns, and then establish clear implementation and data benchmarks for potential future statewide implementation.

Chapter 1, Recommendation 1.2 (Page 21)
Party Addressed: Education and Health. Agency Response: Education: Agree; Health: Agree.
Education's Part B programmatic and Data Unit staff should develop standard Part B reporting and data coordination policy and procedures to ensure appropriate coordination and inclusion of Health staff in the development of Part B 619 federal indicator reporting.

Chapter 2, Recommendation 2.1 (Page 28)
Party Addressed: Health. Agency Response: Agree.
In the short term, the Department of Health should conduct the following actions to mitigate current SEAS data system limitations and data entry or extraction difficulties:
▪ Conduct an assessment of minimal system functions necessary to efficiently comply with federal reporting requirements.
▪ Survey regional centers to catalogue ongoing data input and system navigation difficulties central to the minimum state contract and federal reporting and tracking functions for Program children that they serve.
▪ In consultation with the SEAS data system vendor, develop a plan and cost estimate for updating the system to meet basic needs.

Chapter 2, Recommendation 2.2 (Page 28)
Party Addressed: Health. Agency Response: Agree.
The Department of Health should develop SEAS data system business rules and/or policy standards for consistent and reliable data entry by regional centers.

Chapter 2, Recommendation 2.3 (Page 29)
Party Addressed: Health and Education. Agency Response: Health: Agree; Education: Agree.
In addition to short-term actions outlined in Recommendation 2.1, the Department of Health should work with the information technology and data analysis staff at the Department of Education to assess the current attributes and limitations of the SEAS platform. The agencies should outline requirements needed to meet customary and reasonable data quality control standards for education data as well as develop a plan to potentially replace the SEAS data system to most efficiently meet the needs of the Program administration, federal reporting requirements, and potential adjustments to the Legislature's outcome and accountability expectations (see Chapter 3 Policy Consideration).

Chapter 3, Recommendation 3.1 (Page 48)
Party Addressed: Health. Agency Response: Agree.
The Department of Health should revisit its WISER ID assignment processes and develop a formal policy to allow for straightforward, efficient, and timely assignment of WISER IDs to Part B 619 children entering services.

Chapter 3, Recommendation 3.2 (Page 48)
Party Addressed: Education. Agency Response: Agree.
The Department of Education should allow WISER IDs to be assigned to Part C children entering services.


Chapter 3, Recommendation 3.3 (Page 48)
Party Addressed: Health and Education. Agency Response: Health: Agree; Education: Agree.
The Departments of Health and Education should consider developing policies and/or rules that define and measure a high quality early intervention program (e.g. service standards, curricula, staff licensing or professional development standards, etc.), which can be periodically and efficiently validated within the regional center monitoring protocols.

Policy Considerations

Chapter 3 Policy Consideration (Page 48)
Party Addressed: Legislature. Agency Response: Health: Agree; Education: Agree.
The Legislature could consider defining a more specific accountability framework for the Program that targets state interests, including:
▪ Defining the Program purpose and goal(s).
▪ Setting minimum service and/or curricula standards.
▪ Establishing outcome measures specific to Program students' K-12:
  o Kindergarten readiness,
  o Student skills growth,
  o Student achievement,
  o Potential service and cost reductions.
▪ Setting clear administrative responsibility and timing for Health and/or Education to analyze and report on regional center compliance and results.


List of Acronyms
Early Intervention and Education Program, Phase 2

APR ...... Annual Performance Report
BHD or Division ...... Behavioral Health Division
CDC, Region, or Regional Center ...... Regional Child Development Center
COS ...... Child Outcome Summary
CPAA ...... Children's Progress Academic Assessment
EIEP or Program ...... Early Intervention and Education Program
ETS ...... Department of Enterprise Technology Services
FAPE ...... Free and Appropriate Public Education
IDEA ...... Individuals with Disabilities Education and Improvement Act
IEP ...... Individual Education Program
IF-K ...... Instructional Foundations for Kindergarten (assessment)
IFSP ...... Individual Family Service Plan
K-12 ...... Kindergarten through Twelfth Grade
LEA or IEU ...... Local Education Agency or Intermediary Education Unit
LRE ...... Least Restrictive Environment
MAP ...... Measure of Academic Progress (assessment)
MOE ...... Maintenance of Effort
MOU or IA ...... Memorandum of Understanding or Interagency Agreement
OSEP ...... Office of Special Education Programs (U.S. Department of Education)
Part B 619 ...... Part B, Section 619 of IDEA (ages three through five years preschool)
Part C ...... Part C of IDEA (age birth through two years early intervention)
PAWS ...... Proficiency Assessment for Wyoming Students (assessment)
RDA ...... Results Driven Accountability
SEAS ...... Special Education Automation Software
SiMR ...... State-identified Measurable Results
SLDS ...... State Longitudinal Data System
SPP ...... State Performance Plan
SSIP ...... State Systemic Improvement Plan
WDE or Education ...... Wyoming Department of Education
WDH or Health ...... Wyoming Department of Health
WISE System ...... Wyoming Integrated Statewide Education System
WISER ID ...... Wyoming Integrated Statewide Education Record Identifier (WDE)


Definitions
Early Intervention and Education Program, Phase 2

These definitions are provided to describe or explain key concepts in the report. The language may not directly reflect legal definitions in federal or state statutes, rules, or regulations.

Pre-K Federal Indicators

The State Performance Plan and Annual Performance Reports (SPP/APR) required under federal regulations (34 CFR 80.40) measure several areas of process, procedural, and outcome performance for both Part C and Part B 619 programs. There are fourteen (14) federal indicators for Part C and twenty (20) federal indicators for Part B (including preschool and K-12 program indicators)¹, with each indicator specifying its purpose and criteria for measurement. There are four indicators directly related to Part B 619. A summary of the federal indicators can be found in Appendix C.

Results Driven Accountability (RDA)

The federal Office of Special Education Programs (OSEP) developed the Results Driven Accountability framework to shift accountability from mere process/procedural compliance to states aligning all components of their accountability system toward improved child outcomes. Beginning with its 2014 determinations, OSEP began to consider both results and compliance as factors in making determinations that states are meeting IDEA requirements. The framework applies to both the Part C and Part B 619 programs and emphasizes use of measurable data. The RDA framework includes the SPP/APR, SSIP, and SiMR components, leading to federal determinations of states' results and compliance, and ultimately to differentiated monitoring of states based on their respective levels of compliance and need for federal assistance.

State Performance Plan/Annual Performance Report (SPP/APR)

As a component of the federal Results Driven Accountability framework, each state must develop and submit its State Performance Plan/Annual Performance Report (SPP/APR). The SPP evaluates a state's efforts to implement the requirements and purposes of federal law as well as describes how a state will improve its IDEA implementation. This document, submitted every six years, includes measurable targets for federal indicators established under three monitoring priority areas: 1) free and appropriate public education (FAPE) in the least restrictive environment (LRE); 2) disproportionality (disproportionate representation of racial and ethnic groups in special education or related services categories that is the result of inappropriate identification); and, 3) general supervision of the program. The APR is submitted annually each February, reflects the state's progress toward meeting program goals, and provides the actual target data, explanation of progress or slippage, and discussion of improvement activities completed by the state for each indicator.

State Systemic Improvement Plan (SSIP)

Included within the federal State Performance Plan and Annual Performance Report requirements is the State Systemic Improvement Plan (SSIP), required under Indicator 17. The SSIP is designed to illustrate a state's intended activities to support and improve outcomes in targeted areas. Figure 1.1 in Chapter 1 provides a diagram showing the three-phased roll-out of the SSIP implementation from April 2015 through February 2020.

State-identified Measurable Results (SiMR)

Under the State Systemic Improvement Plan (federal Indicator 17), within the SPP/APR, states are required to have State-identified Measurable Results (SiMR). The SiMR requires a statement of the result(s) a state intends to achieve through the implementation of the SSIP. The SiMR must be aligned to an SPP/APR indicator or a component of an indicator. The SiMR must also be clearly based on data and state education infrastructure analyses and must be a child-level outcome in contrast to a process measure. The current Part B SiMR developed by Education states, "The impact of State Systemic Improvement Plan (SSIP) activities will be measured by an increase in the percentage of third-grade students with disabilities who spend 21 to 60 percent of their school day outside the general education environment who score proficient or advanced on the statewide reading assessment."

Child Outcome Summary (COS)

The Child Outcome Summary (COS) is the entry and exit assessment process used by the Program's regional centers to determine the knowledge and skill levels of IDEA-eligible early intervention and preschool children. The COS information is used by the Program to report on Part B 619 federal outcome Indicator 7 and Part C federal outcome Indicator 3. The assessment is used to show both a child's growth in skills and, when applicable, where a child may be exhibiting skills equivalent to their typically learning, age-appropriate peers. The assessment process is conducted by the regional centers, where professional staff score a child from one through seven depending on the social-emotional, knowledge, and behavioral skills they demonstrate. A score of six or seven means a child is functioning at an age-appropriate level.

State Longitudinal Data System (SLDS)

A Statewide Longitudinal Data System (SLDS) is designed to help education stakeholders (e.g. policy-makers, state administrators, school districts, individual schools, teachers, etc.) make informed, data-driven decisions to improve student learning. The central purpose of an SLDS is to identify trends and results of education practice so that adjustments can be made at different points/levels of the education system to improve children's skills and educational and professional outcomes. An SLDS may be a stand-alone system or may link multiple, separate data systems to utilize a constant flow of historical student data, including demographics, assessments, and other information. Data links may include information from preschool, K-12, post-secondary schools, and even career and professional employment information to see how different types of students fare or how different approaches to instruction impact students' outcomes.

Wyoming Department of Education WISE Certified Teacher-Course-Student (TCS) Enrollment Data Collection 684 (WDE 684 report)

The WDE 684 requires the submission of student-level data and is the authoritative collection of student-level demographics and course information. This is a comprehensive collection that includes all student-level demographics and registered schedule information required under W.S. 21-2-203, W.S. 21-2-204, W.S. 21-2-304(a)(v), and the federal Elementary and Secondary Education Act, as amended by the Every Student Succeeds Act (ESSA).

WDE SPED 684

The WDE 684 SPED file is a subset of the overall WDE 684 report and includes specific student-level data for children receiving special education and related services.

______
¹ Under OMB No. 1820-0624, the Part B SPP/APR required 20 indicators in 2014. However, the Results Driven Accountability process may reduce the number of required indicators in the future.



Introduction, Scope, and Methodology

Introduction and Scope

Wyoming Statute § 28-8-107(b) authorizes the Legislative Service Office (LSO) to conduct program evaluations, performance audits, and analyses of policy alternatives. Generally, the purpose of such research is to provide a base of knowledge from which policymakers can make informed decisions.

At its January 2016 meeting, the Management Audit Committee (Committee) voted to authorize the LSO Program Evaluation staff to move forward with the full evaluation of the Early Intervention and Education Program (EIEP or Program). Due to multiple challenges evaluation staff encountered during the initial stages of the project, including LSO staff reorganization and gaining access to data from the administering agencies, the Committee approved completion of the evaluation in two phases.

The Committee reviewed and released the Phase 1 report in September 2016. The Phase 1 evaluation report covered the Committee's concerns related to child eligibility identification rates, the statutory funding model for distributing State funds, and the Program's organizational structure between the Wyoming Department of Health (Health), which administers the program, and the Wyoming Department of Education (Education), which provides oversight of Health for part of the Program.

The Phase 2 report provides information related to the Committee's last concern: the Program's performance and impact. Specifically, this report provides findings and recommendations related to the following research question:

1. Measured outcomes — Further review could be conducted to measure the program's success. Potential example measured outcomes could include:
   a. Review the number of students that received Part B preschool services that did not need K-12 special education services;
   b. Review pre-kindergarten assessment results for students that received Part B preschool services compared with the same assessment results for non-Part B preschool students, both for those that did and did not receive K-12 special education services.


Summary of Phase 1

For full context related to this Phase 2 report, evaluation staff believe it is necessary to recall the focus of the Phase 1 report. The Phase 1 report noted that the Individuals with Disabilities Education Act (IDEA) is a four-part (Sections A through D) U.S. law that ensures students with a disability are provided a free and appropriate public education (FAPE) that is tailored to their individual needs and served in the least restrictive environment (LRE). Sections C (called Part C) and B (called Part B) are different programs at the federal level. However, under W.S. 21-2-701 through 21-2-706, the Legislature provides that Health will administer both Part C and the portion of Part B applicable to pre-kindergarten or preschool (Section 619; also sometimes called pre-K) services as a single program.

Statute also specifies how federal and state funding will flow for services to students. Other than outlining state general funds disbursement, there is no specific statute that requires or stipulates that Health pursue and disburse federal Part C funds. However, the State has a longstanding tradition of applying for, and disbursing, Part C federal funds to the fourteen community regional child development centers (regional centers). Part C covers children age birth through two years and Part B 619 covers children age three through five years. The combined programs are administered in Health through a unit of four staff within the Developmental Disabilities section of the Behavioral Health Division (Division).

A significant factor in the administration of the Program is that since federal Part B 619 funds must flow directly to the state's education agency (SEA), Education maintains a supervisory relationship over Health related to Part B 619 program and fiscal administration. In this structure, Health is classified as a local education agency (LEA), or an intermediate education unit (IEU), similar to school districts in its use of, and accountability for, the Part B 619 program. Education and Health maintain this relationship, as required by statute, through an interagency agreement that defines the duties and roles for each agency. The most recent update to the interagency agreement was executed in August 2016.

The Phase 1 report provided six findings:
▪ When considering multiple measures and factors, Wyoming's identification rates for Part C and Part B 619 children appear reasonable and appropriately monitored (Chapter 2).


▪ Education's reporting of Part B 619 data to the federal government cannot be reconciled with Health data for the same program (Chapter 2).
▪ Health's child count standard is no longer defined through rules, and the single, November 1st count date may not provide the best timing or information to determine the number of children served by the Program (Chapter 3).
▪ Budget cuts and Health's management of the State's federal maintenance of effort requirement are the primary contributors to Health and the Legislature not implementing the statutory Program funding model as intended, essentially eliminating consistency and stability for the Program (Chapter 4).
▪ Administration for the Program is operational but the current organizational structure does not sufficiently align authority with purpose and responsibility (Chapter 5).
▪ The Program is a part of the overall early childhood learning system in the State, but appears to be viewed independently or separately of other programs, services, and funding (Chapter 5).

The Phase 1 report concluded that identification and eligibility rates in Wyoming do not appear out of balance with national data and research on the frequency of disability occurrence across the nation. The report also provided recommendations for Health and Education to establish a data reconciliation process to verify accurate data is reported to the federal government for the Part B 619 program. Finally, the report provided recommendations and policy considerations to the agencies and the Legislature to implement the current statutory funding model and to reconsider the current organizational structure of the Program. This potential reorganization could include placing the Program alongside other early childhood intervention and education programs in an Office of Early Learning.
Health and Education generally agreed with most of the recommendations, but the Committee opted to wait until after the Phase 2 evaluation was complete prior to considering any legislative policy options. See Appendix A for a summary of steps each agency has considered or taken since the public release of the Phase 1 evaluation report.


Phase 2: Two Outcome Landscapes

With a lack of state-defined outcomes and measures for Program success, Health and Education have focused administrative and oversight efforts on complying with federal reporting requirements. This focus has persisted even as some staff indicated a desire to be more proactive in setting consistent standards for curricula, services, and staff training at regional centers. Evaluators approached this condition by dividing the Program outcome review into the following two areas:

1. What is occurring? This analysis encompassed a review of the federal outcome definitions and reporting requirements, with an initial focus on how Health and Education facilitate Program compliance.

2. "What if" scenarios? This analysis included a review of stakeholder views and data analysis to understand example definitions and performance measures that the Legislature could pursue to more comprehensively understand system processes, outcomes, and ultimately Program "success."

Throughout both the Phase 1 and Phase 2 evaluation research, evaluation staff could not identify an overarching central purpose for the Program for which the Legislature has defined specific performance expectations beyond the federal framework. The main way in which the Legislature has set its performance expectations is through the 2006 reporting requirement established for the Program (2006 Laws, Ch. 85, §2). However, without greater specificity, Health and Education have adapted the federal outcome measures and reporting requirements to fit this state-level report. Meeting federal requirements and focusing on Program compliance can be important to ensure basic IDEA principles and expectations are upheld. This reporting focus also ensures continued access to federal funds for the Program.
However, the federal requirements may not provide a comprehensive accountability framework for Wyoming, the agencies, and the regional centers, where State general funds account for about 90% of Program funding. This report provides the Legislature and agencies with considerations for enhancing both the understanding of, and the impact on, the students served by the Program. Considerations focus on both the short term and the long term, as these students enter and progress through the kindergarten through twelfth grade public education system (K-12 system).


Methodology

This evaluation was conducted according to statutory requirements and professional standards and methods for governmental audits and evaluations. The evaluation research was conducted from October 2016 through May 2017. The general analytical time frame covered by this evaluation includes documents and data from July 1, 2011 (FY2011) through June 30, 2016 (FY2016), unless noted otherwise.

Generally, research methods for Phase 2 of this evaluation included work steps similar to those evaluation staff conducted during Phase 1, such as interviews, observations of meetings, and document and data analysis. Evaluation staff continued to attend relevant meetings, in person and via conference calls, at Health and Education, as well as meetings of various councils and working groups covering early childhood intervention and education services. Evaluation staff also continued to request information from each agency related to the Phase 2 objective of understanding the Program outcome and academic performance schema.

Agencies' Primary Data and Phase 1 Scope Limitation Revisited

In the Phase 1 report, evaluation staff noted a scope limitation on some of the data presented in the report. The limitation primarily related to data from the Special Education Automation Software (SEAS) data system used by Health to track child and Program information for both Part C and Part B 619. Refer to Chapter 2 for a more comprehensive description of the challenging data environment that evaluation staff had to navigate. Throughout Phase 2 research, evaluators spent significant time reviewing various data sets from Health and Education, including more in-depth review, cleaning, and summarizing of SEAS data. Evaluators were able to isolate specific limitations, which are noted with the relevant analyses.
Evaluation staff also accessed and reviewed primary program data from Education for the Instructional Foundations for Kindergarten (IF-K) and the Measures of Academic Progress (MAP) assessment, as well as district reporting through the WDE 684 Enrollment report and corresponding special education file. In summary, while evaluation staff do not articulate an overall scope limitation to this report, significant caveats and considerations throughout the report should provide caution to the reader when drawing specific conclusions related to the success or failure of the Program.


Please review Appendix B for a more detailed summary of the methodologies employed for this Phase 2 report.

Part C Data Challenges

Separate from the overall data challenges noted above, much of the Phase 2 report focuses on the Part B 619 program as it relates to tracking and monitoring child outcomes from the Program into the early grade levels of the K-12 system. The central issue evaluators encountered in utilizing this data was that, while Education has allowed Wyoming Integrated Statewide Education Record Identifiers (WISER IDs) to be assigned to Part B 619 children age three through five years, WISER ID numbers have never been issued to children under the Part C program.

In analysis of Health's Program data, evaluation staff found that it is possible to track children that exit Part C services and those that transition to Part B 619. However, identifying an accurate and reliable population of all children that utilized Part C (both those that exited and those that transitioned to Part B 619) and that can also be found in Education's data is difficult and time consuming. While explained in more detail in Chapter 2, because Health manually assigns WISER IDs to Part B 619 children only when they enter that program, it is difficult to link Part C children to data maintained by Education. Moreover, without WISER IDs, evaluators could not effectively utilize children's names and birthdates from Health's data system for linking students to Education's data.

Acknowledgements

The LSO expresses appreciation to those individuals and agencies that assisted with our research. We convey specific gratitude to the Wyoming Departments of Health and Education for accommodating our numerous requests for documents, data, and interviews. We appreciate the information provided by the regional centers during our Phase 1 evaluation and their continued support through Phase 2.
We also appreciate the forty-eight public school districts for considering and responding to our requests for feedback on the Program and the regional centers. Finally, we thank the various stakeholders (e.g., advisory councils, boards, and associations) that provided access to meetings and discussions that assisted in our understanding of the Program.


Chapter 1: Currently Reported Outcomes

Phase 2 focuses on Program impact and success, with a central feature of investigating possible expanded performance measures for assessing academic outcomes. The report's structure is deliberate: it first describes the current outcome and reporting framework, then explains the backdrop of the data environment that may affect future state-based outcome measurement, and ends with a summary of different outcome or impact concepts that the Legislature may consider. Most importantly, absent more refined and specific legislative and/or statutory definitions or clarification, academic outcomes and associated measures of Program success cannot be established.

Finding 1.1: The Program is meeting federal performance expectations, but Health's revision initiative for the Child Outcome Summary lacks sufficient preparation and direction for the regional centers.

The Management Audit Committee asked evaluation staff to review the Program's outcomes and impact in an effort to determine if the Program is successful. While Chapter 3 provides a more detailed explanation of possible ways to measure the success of the Program through child academic outcomes, there are no State-defined measures to make such a determination at this point. Instead, this chapter summarizes current federal performance expectations. Importantly, as Part C and Part B 619 are separate federal programs, each program must maintain compliance with separate federal requirements.

The federal Office of Special Education Programs (OSEP) has provided the Program a designation of "meets expectations" for the past two years. This determination means that the Program complies with programmatic and financial requirements outlined in federal law and regulations, as measured through specific indicators for compliance. However, Health staff revealed that there has been ongoing concern with the Child Outcome Summary (COS) used to report on the central federal outcome measure for the Part B 619 program.
In an attempt to address these concerns, Health is piloting a new COS process. However, there appears to be minimal planning and direction provided to the regional centers related to the purpose of the pilot, the expected long-term improvements, or the implementation of this initiative based on the pilot results.


Federal Requirements and Results Driven Accountability

In June 2014, the Office of Special Education Programs (OSEP) introduced a shift in its monitoring and assistance. OSEP announced its move away from compliance-focused monitoring and toward improving outcomes (results) through data analysis. The Assistant Secretary for Special Education and Rehabilitative Services at the time stated, "Despite this focus on compliance, states are not seeing improved results for children and youth with disabilities…young children are not coming to kindergarten prepared to learn. In many areas, a significant achievement gap exists between students with disabilities and their general education peers."

To address this ongoing concern, OSEP began to implement a new accountability system called Results Driven Accountability (RDA). The RDA model balances improved developmental and educational results and functional outcomes with continued compliance as it relates to those outcomes and results. The RDA model is explained in more detail in the call-out box on the next page. Through RDA, the central focus of federal monitoring of the states will be to ensure that children improve in skills and abilities, rather than solely ensuring that services and supports are provided according to procedural requirements.

Figure 1: Federal Results Driven Accountability Phases

Source: Legislative Service Office summary of federal Office of Special Education Programs information.


Federal RDA Framework

In accordance with federal statute and regulations (20 U.S.C. 1416(b)(1)(C) and 1442 of the IDEA; 34 CFR 80.40), States receiving IDEA funds must comply with various system planning and reporting requirements:

▪ Complete a State Performance Plan (SPP)
▪ Complete an Annual Performance Report (APR)

Figure 2: Federal RDA Framework

The plan and report outline the State's efforts to implement the requirements and purposes of the Part C and Part B programs. In Wyoming, Health is responsible for reporting on Part C and Education is responsible for reporting on the Part B program. Each program has specific standards that it must meet, called federal performance indicators. More specifically, a six-year State Performance Plan (SPP), which describes how the State will improve its implementation of IDEA, and an Annual Performance Report (APR) must be submitted by the State to report its progress yearly to OSEP. Part of the APR is Indicator 17, which requires each state to compose its State Systemic Improvement Plan (SSIP). In developing, implementing, and evaluating the SSIP, OSEP expects that a state's focus on results will drive innovation in the use of evidence-based practices in the delivery of services to children with disabilities. This innovation will, in turn, lead to improved results for children with disabilities.

Figure 1, on the previous page, shows the multi-year transition through which OSEP required the states to implement the RDA framework. Figure 2, on the left, illustrates the interplay of the SPP, APR, and SSIP in complying with the federal RDA framework. The first two years required the states to develop their SSIPs and, as they enter the analysis phase, states will be required to measure how they progress toward their targeted measures.

Federal Indicators

The APR requires the State to report specific data on a number of standardized indicators of IDEA compliance. Many of the Part C and Part B indicators are different, given the purpose and services for each


program. Only six indicators, listed below, relate directly or generally to the Part B 619 portion of the overall Part B program. The remaining Part B indicators are mainly focused on the K-12 system. The Part B 619 related indicators are as follows:

▪ Indicator 6: Preschool Environments
▪ Indicator 7: Preschool Outcomes
▪ Indicator 8: Parent Involvement (ages 3-21 years)
▪ Indicator 11: Child Find and Evaluation in 60 Days
▪ Indicator 12: Early Childhood Transition
▪ Indicator 17: State Systemic Improvement Plan (ages 3-21 years)

For Part B 619, Health staff stated that Education has never asked them to assist in the completion of the SPP/APR for the full Part B program, except to provide the required data for the pre-K related indicators. Health contracts with a private, third-party consultant to extract the data from its Special Education Automation Software (SEAS) data system in order to bridge the administrative gap and analyze the data for the two agencies. Education also contracts with the same consultant for K-12 system Part B data review and analysis, as well as for administering the Instructional Foundations for Kindergarten (IF-K) assessment program.

Wyoming Meets Requirements for Part B Through Education's Reporting

On June 28, 2016, OSEP determined that "Wyoming meets the requirements and purposes of Part B of the IDEA. This determination is based on the totality of the State's data and information, including the federal fiscal year (FFY) 2014 State Performance Plan/Annual Performance Report (SPP/APR), and other publically available information." Appendix C provides a full listing of all of the indicators tracked for both the Part C and Part B programs, a summary of the six Part B 619 related indicators, and summary tables showing Wyoming's indicator data.
Part C and Part B Impact is Currently Determined by the Child Outcome Summary (COS)

While both the Part C and Part B 619 programs have individual indicators used to gauge the entirety of federal compliance and results for the State, the primary outcome or impact indicator for both programs is the Child Outcome Summary (COS) form and process.


The COS process was developed as part of a federal push to understand how preschool special education and related services contribute to a child's developmental and functional progress.2

Wyoming Began Using the COS in FY2006

In 2005, OSEP began requiring state early childhood special education programs to report on child outcomes. Outcomes measured infants, toddlers, and preschool children in the developmental domains of positive social-emotional skills; acquisition and use of knowledge and skills; and use of appropriate behaviors to meet their needs. In order to meet these performance measures and reporting requirements, Wyoming and other states adopted the COS. According to Health's FY2006 legislative report on the Program, Wyoming regional centers first applied the COS in February 2006.

The key to measuring outcomes is that a child should be assessed at program entry and at program exit so the regional centers can determine when, and how much, growth in each assessed area has occurred. The COS generates a singular score or rating, even when knowledge about children's functioning comes from multiple sources. A service provider scores each domain based on a child's functioning level compared to his or her chronological age. The process also gauges the progress a child makes toward age-appropriate functioning and behavior compared to his or her peers. Initially, there was a specific COS form (the COSF) by which providers summarized information and recorded a student's overall score, with a low score of one and a high score of seven. However, starting in 2011, the term "Child Outcomes Summary Process," rather than COSF, was used to emphasize that this measurement approach is a team process, not merely a form.

Part B Indicator 7: Preschool Outcomes

Indicator 7 for the Part B 619 program (similar to Indicator 3 of the Part C program) is possibly the most important indicator for the purpose of this evaluation.
While state statute does not define Program outcomes, Indicator 7 includes three child development domains with five descriptors, or levels, of functioning. Indicator 7 is measured by regional centers' use of the COS, where each center assesses individual

2 The COS is a process developed by the Early Childhood Outcomes Center (ECOC) that provides a common metric for describing children's functioning compared to age expectations in separate developmental domains. The COS includes a rubric for a team to summarize the child's level of functioning using information from many sources including assessment tools as well as parent and provider reports.


student progress from the time a child enters the Program to the time he/she leaves services. The two summary statements that encapsulate this outcome measure are:

• Of those children who entered the program below age expectations in each of the following outcomes, the percent who substantially increased their rate of growth by the time they turned six years of age or exited the program.

• The percent of children who were functioning within age expectations in the outcome by the time they turned six years of age or exited the program. (LSO emphasis)

Table 1.1, below, provides Wyoming's Part B 619 outcome data as reported by Education in its FFY2014 APR. The data indicate that the regional centers' service delivery is producing developmental progress at rates higher than the State's targets and above the national average.

Table 1.1. Indicator 7: Preschool Outcomes

Outcome | FFY2014 Target | FFY2013 Data | FFY2014 Data | FFY2014 National Average

Summary Statement 1: Of those children who entered the program below age expectations in each of the following outcomes, the percent who substantially increased their rate of growth by the time they turned six years of age or exited the program in the outcome of:
7-A: Positive social-emotional skills | 87.50% | 87.50% | 91.20% | 80%
7-B: Acquisition and use of knowledge and skills | 89.27% | 89.27% | 92.10% | 80%
7-C: Use of appropriate behaviors to meet their needs | 89.18% | 89.18% | 92.80% | 79%

Summary Statement 2: The percent of children who were functioning within age expectations in each of the following outcomes by the time they turned six years of age or exited the program:
7-A: Positive social-emotional skills | 57.13% | 57.13% | 59.40% | 58%
7-B: Acquisition and use of knowledge and skills | 53.72% | 53.72% | 58.10% | 51%
7-C: Use of appropriate behaviors to meet their needs | 68.55% | 68.55% | 77.80% | 63%

Source: 2016 Wyoming Part B State Performance Plan/Annual Performance Report, Indicator 7.
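For reference, the two summary statements above are calculated from counts of children in the five progress categories used in federal child-outcome reporting. A minimal sketch of the standard federal formulas follows; the category counts passed in at the end are illustrative only, not Wyoming Program data.

```python
# Sketch of the federal child-outcome summary statement formulas.
# Progress categories used in federal reporting for one outcome domain:
#   a: did not improve functioning
#   b: improved, but not enough to move nearer to same-age peers
#   c: improved to move nearer to same-age peers, without reaching them
#   d: improved to reach age-appropriate functioning by exit
#   e: maintained age-appropriate functioning from entry to exit

def summary_statements(a, b, c, d, e):
    """Return (SS1, SS2) as rounded percentages for one outcome domain."""
    # SS1: of children who entered below age expectations (a through d),
    # the percent who substantially increased their rate of growth (c + d).
    ss1 = 100.0 * (c + d) / (a + b + c + d)
    # SS2: percent functioning within age expectations by exit (d + e),
    # out of all children with both entry and exit ratings (a through e).
    ss2 = 100.0 * (d + e) / (a + b + c + d + e)
    return round(ss1, 2), round(ss2, 2)

# Illustrative counts only -- not actual Wyoming Program data.
ss1, ss2 = summary_statements(a=5, b=10, c=40, d=60, e=35)
```

In practice, each of the three outcome domains (7-A, 7-B, 7-C) is computed separately from its own category counts, which is why Table 1.1 reports six percentages per reporting year.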

Health is Piloting a New COS Process

Guidance from the Early Childhood Technical Assistance Center (ECTA) states that, "the COS summary form is intended for local, state, and federal data collection, reporting, and program improvement. The contents of the questions, rating scale, and definitions for the scale should not be adapted." While not required by OSEP, Health has been pursuing changes to the COS process in order to gain data that are more reliable for both the Part C and Part B 619 programs. Although Health is taking this proactive step to ensure data validity of the COS, Program staff stated that they anticipate no change in the Program's determination of meeting federal reporting requirements.

Health is Concerned with Varied Regional Center COS Practices and COS Data Validity

Over the course of both phases of this evaluation, Health staff stated that in 2014-2015 they became concerned that the regional centers were using a variety of methods to complete the COS. Regional centers used different formal assessment checklists, structured processes, and personal and professional judgment to assign COS scores to children. Additionally, there has not been consistent or clear training on this process for a number of years, and as such, regional centers have adopted their own practices for the COS. As a result, Health staff stated that the resulting COS scores entered into the SEAS data system might not be indicative of children's progress or skill achievement from Program services.

One example concern expressed by Health staff and their data consultant was that Health identified exit scores for Part C children that were higher than those children's entry scores to Part B 619, even though they transitioned within the same region. In addition, Health stated that, because there is no standard, it could not be certain that a child exhibiting skills at a certain score level in one regional center would score the same in another.
Health's Pilot Initiative Requires Regions to Utilize a Standardized Assessment Tool to Score the COS

After several interactions with federal agencies and the regional centers, Health decided to implement its COS change initiative during FY2017. Health is currently piloting the initiative with three regional centers: Region 1 (Big Horn Basin), Region 7 (Sweetwater County), and Region 13 (Campbell County). Working with its data consultant, Health initially provided these pilot regions with an Excel spreadsheet to report data. Later, Health followed up with a Google Dashboard online reporting system, so these regions could report the new COS scores outside of SEAS. These pilot regions have reported COS entry and exit scores for their


children in batches, once in November 2016 and once in February 2017. Health anticipates the last reporting for the pilot to occur at the end of the school year (May/June 2017). Health, based on feedback from federal technical assistance staff, now requires pilot regions to use a single assessment tool (the Battelle Developmental Inventory, or BDI) that is referenced to the COS scoring levels based on student assessment standard deviation scores. Health's intent is to base COS scores on observed child skill demonstrations and to remove potentially subjective components from the COS process. According to Health staff, work on this process has been done with considerable stakeholder input, which included a COS work group that met in January 2015 to review COS data and identify possible options for improvement.

Regional Centers Expressed Concerns with the COS Changes

In observing monthly teleconference calls between Health and the regional center directors (termed "directors' calls"), evaluation staff could not confirm how much, or to what extent, regional centers have been engaged with the COS change initiative. While initial discussion of a new COS began before January 2016, evaluators did not witness regional centers offering extensive feedback on the COS change initiative when requested by Health over the course of this evaluation. However, interviews conducted by evaluation staff prior to implementation of the COS pilot revealed that several regional centers were satisfied neither with the process to obtain regional feedback, nor with the assessment tool ultimately chosen by Health to implement the COS process change. Overall, regional centers noted the following concerns to Health and/or evaluation staff related to the initiative (not an all-inclusive list):

▪ The standardized assessment tool limits regional centers' flexibility when assessing children.
▪ The assessment tool, with some centers required to assess hundreds of children each year, may be cost prohibitive.
▪ The chosen assessment tool limits educators' time in the classroom.
▪ Health did not consider specific drawbacks or the applicability of the chosen assessment tool for students with certain disabilities.
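The standard-deviation referencing described in this section can be illustrated with a short sketch. The cut points below are hypothetical examples chosen only to show the mechanism; they are not Health's actual BDI-to-COS crosswalk.

```python
# Illustrative sketch of referencing a standardized assessment score to
# the 7-point COS rating scale using standard deviations (SDs) from the
# age-group mean. All cut points below are HYPOTHETICAL examples, not
# the actual crosswalk Health uses with the BDI.

def cos_rating_from_sd(sd_from_mean):
    """Map a child's assessment score (in SDs from the age mean) to a
    COS rating of 1 (lowest functioning) through 7 (highest)."""
    # (threshold, rating) pairs from highest functioning to lowest;
    # a score at or above the threshold earns the corresponding rating.
    bands = [(0.0, 7), (-0.5, 6), (-1.0, 5), (-1.5, 4),
             (-2.0, 3), (-2.5, 2)]
    for threshold, rating in bands:
        if sd_from_mean >= threshold:
            return rating
    return 1  # more than 2.5 SD below the age mean

rating = cos_rating_from_sd(-0.3)  # a mild delay under these example bands
```

The design intent, as described by Health, is that an observed assessment score, rather than team judgment alone, anchors the COS rating, so two regions scoring the same child should land on the same rating.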


Health Appears Unclear on the Long-Term Benefit of the COS Change Initiative

Evaluation staff's primary concern is that there does not appear to be a cohesive pilot, or proof-of-concept, plan in place to direct the initiative. When asked, Health staff noted that while the new COS process will provide more standardization compared to past region-by-region assessment methods, the use of the standardized assessment tool for the COS will not entirely eliminate the possibility of regional center staff subjectivity in assessing children. For example, Health noted that it discovered one region was providing COS scores on several children without direct observation of the children during the completion of the assessments. Instead, the region was using team member knowledge, recollection, and staff notes to complete the assessments.

Further, during the monthly directors' calls with the regional centers, several directors expressed concern regarding the pilot rollout. While Health originally stated it intended to expand the pilot COS method to all centers toward the beginning of FY2018, upon clarification during Phase 2, Health could not confirm this original implementation date would be met. It should be noted that Education has not been involved in the development of the new COS initiative, and its staff could not confirm whether it will be involved in the full implementation phase of the initiative.

Recommendation 1.1: The Wyoming Department of Health should revisit its original Child Outcome Summary change initiative process plan from 2015 to ensure it remains current and relevant. Health should extend the plan to define a study and determination period on the pilot process, its successes, and continuing concerns, and then establish clear implementation and data benchmarks for potential future statewide implementation.
While Health began looking at potential changes to the COS as far back as fall 2014, staff turnover and other obstacles to implementation have created fluctuating expectations. Evidence indicates that the COS change initiative is well intentioned and has the potential to provide more accurate, valuable data with which to manage the Program. However, the current implementation process does not appear to have clear goals or sufficient process and procedural planning to ensure the COS changes will lead to better Program administration or to identifying better outcomes.


Finding 1.2: Federal reporting for Part B 619 is fragmented between Health and Education, leading to inconsistencies and inadequate disclosure of information.

Health and Education each control federal reporting for one program: Health for Part C and Education for Part B 619. For Health, Part C reporting appears consistent in form and content from year to year. However, despite Education staff stating that the relationship between the two agencies is the most interactive and collaborative it has been during the last decade, there remains a lack of sufficient coordination between Education and Health to ensure federal reporting fully represents the Part B 619 program.

Annual "Data Drill Down" Meetings are the Cornerstone of Part B Compliance and Results Monitoring

Annually, Education conducts collaborative analyses of statewide education data to understand current trends and potential red flags among students receiving special education and related services. These "data drill down" meetings have been held in each of the last three years and are facilitated by Health and Education's data consultant. These exercises, for both Part B 619 and the K-12 Part B program, are intended to identify areas for federal compliance monitoring as well as to develop improvement strategies and revise the SPP/APR and SSIP. The data drill down process focuses on IDEA areas of concern, such as assuring requirements for students to receive a free appropriate public education (FAPE) in the least restrictive environment (LRE). This process also investigates challenges (e.g., staff training, extra resources) that may influence outcomes for students. Information gleaned from these drill downs is used to create compliance hypotheses that then direct on-site monitoring of school districts and the regional centers. For Part B 619, Health monitors the regional centers for compliance, and Education monitors Health's efforts as a local education agency/intermediate education unit (LEA/IEU).
Education's monitoring of Health sometimes includes accompanying Health Program staff on site visits to the regional centers. Monitoring visits focus on case file reviews. Aside from ensuring that children are provided services in accordance with their individualized education programs (IEPs), there has traditionally been minimal focus on improved academic outcomes. While Education typically responds to monitoring and site visits with targeted professional development in the school district, Health has not had the resources to do the same for


the regional centers. More recently, Education has begun to support Health and the regional centers' professional development by including Health and regional center staff in the Week of Academic Vision and Excellence conference (August 2016) and the University of Wyoming's Project ECHO online training sessions.

Education's Part B 619 Reporting is Inconsistent

In examining Education's Part B reporting over the last few years, evaluation staff found several inconsistencies or potential contradictions in federally reported information related to how, and whether, Part B 619 information was included. Five examples of these concerns follow.

Use of the COS

States are asked within the SPP/APR submission if they have used the COS assessment tool to report on program outcomes. Within the 2014 SPP/APR, Education stated that the COS process was not used in Wyoming for Indicator 7 for the Part B 619 program. However, Health and the regional centers have never ceased using the COS for reporting on the Program's outcomes. Additionally, later elements within the federal report discuss the strong correlation between Part B 619 COS scores and students' K-12 assessment results.

SSIP Improvement Strategies

Education identifies preschool and pre-literacy skills as a key component of its State Systemic Improvement Plan (SSIP). Specifically, Education reported in its most current APR for Part B that:

"The Wyoming SSIP strategies address the potential root causes of low reading performance for students with disabilities in preschools and grades K-3 identified by the WDE and its stakeholders in several ways.
The Meaningful Professional Development and Technical Assistance strategy will focus on preschool teachers, general education teachers, and special education teachers, to ensure they have access to differentiated professional development and technical assistance activities designed to improve the knowledge and skills necessary to provide evidence-based, differentiated pre-literacy and reading instruction to students with disabilities in preschools and grades K-3.” (LSO emphasis) However, in a response to questions for this evaluation, Education staff stated that the “CDCs were not selected to be an active component in
the pilot phase due to previous difficulties implementing MTSS [multi-tiered system of support] at that level." Education added that it intends to bring in regional centers using data-based interventions (DBIs), with a "hope that they want to participate" in the future. Overall, Education noted that if it had emphasized preschool supports in the original SSIP, "there would not have been enough time to move the needle in regard to the SiMR [a K-12-only measure]."

COS Relationship with MAP Assessment

The previous statement related to the SiMR measure highlights another issue evaluation staff encountered when trying to validate federal reporting related to the COS and its use for outcomes reporting. Education adds to the SiMR discussion that, through its data analysis, the "Child Outcome Summary (COS) scores taken when students exit preschool programs correlate strongly with fall kindergarten Measure of Academic Progress (MAP) scores." When asked about the analysis and conclusion, neither Education nor Health staff could confirm the correlation or the methodology used to make this determination. Education staff specifically stated that "the current administration cannot explain the reasoning," and that Education does not have the data to validate this language in the federal reporting.

Longitudinal Analyses

Education's APR provides that "The WDE will continue to explore ways in which COS data can be used along with MAP and PAWS data to impact the literacy skills of students in preschools and grades K-3. Given that the COS is administered only at entry to and exit from preschool, MAP is administered 2-3 times per school year, and PAWS is administered annually, there remains a need to explore ways in which the progress of preschool students might be used to identify areas of improvement in preschool reading instruction." However, evaluation staff found no evidence in the course of this evaluation that Education has conducted, or plans to conduct, this type of analysis. 
Education staff could not confirm that they reviewed the COS results submitted to OSEP, and stated that they currently have no plans to become involved in, or interfere with, Health's COS pilot initiative.

Public Reporting on LEAs

Under federal requirements (Section 300.602(b)(1)(i)(B)), Wyoming is required to make public the State's annual reports on the performance of each LEA located in the State. As Education has
sometimes noted Health as the state's 49th school district, it seems reasonable for Education to establish a public report for Health's administration of the Part B 619 program. However, Education stated that it does create an annual performance report of all LEAs and that the Program under Health does not qualify because it is an intermediate education unit (IEU). Evaluation staff noted that for the Part C program, Health publishes report cards on each regional center (see Appendix D for an example of a Part C regional report card for FY2015). Education stated, "All indicators, 1-17, are reported on by WDE. There is no requirement from OSEP to break out section 619 and report specifically on that population of students." As of the writing of this report, Education's website report cards for the State of Wyoming exclude indicators 6 and 7, related to preschool environment and outcomes, and individual report cards for the Part B 619 program under Health have not been posted since 2010.

Neither Education nor Health Takes Full Control of Part B 619 Reporting

The current administrative structure between Education and Health over the Part B 619 program is fragmented, as noted in the Phase 1 report. This fragmentation has created a convoluted and arduous process for completing final federal reporting each year. The process is further complicated because both agencies rely heavily on a single data consultant to help review, translate, and report Part B 619 and K-12 system Part B information. Figure 1.3, on the next page, displays the interplay between Education agency staff (both the Individual Learning Division and Data Unit), Health Program staff, a data consultant, and the primary data systems used to report on the Part B program (both Part B 619 and the K-12 Part B program). 
Reviewing the figure from the bottom to the top demonstrates the complex ways in which the regional centers, agencies, and the data consultant transfer and translate data into the overall Part B report to the federal government, including the Part B 619 elements. Additionally, Health staff noted during both Phase 1 and Phase 2 of the evaluation that other than providing data to Education, through the data consultant, Program staff are not asked to contribute to the interpretation of the data or the narratives developed for the SPP/APR or SSIP, as they relate to the Part B 619 program. Similarly, Education staff noted that they do not develop all of the interpretations and narratives to the SPP/APR and SSIP that relate to
Part B 619. Education and Health rely upon the data consultant to provide some of this information. Because neither agency takes full control of, or responsibility for, the data collection and reporting processes, there are miscommunications, misinterpretations, contradictions, and inconsistencies in the reported information. Finally, for Part B 619 SPP/APR and SSIP reporting, Education's Data Unit plays little or no role in assisting programmatic staff or directly overseeing the data consultant's data access, analyses, and reports.

Figure 1.3 Current Interactions and Oversight of Part B Reporting

Source: Legislative Service Office summary of compiled information from the Wyoming Departments of Education and Health. * Education stated that for the most recent reporting cycle, the data consultant directly input or downloaded required reporting materials into the federal portal.


Recommendation 1.2: Education's Part B programmatic and Data Unit staff should develop standard Part B reporting and data coordination policies and procedures to ensure appropriate coordination with, and inclusion of, Health staff in the development of Part B 619 federal indicator reporting.

As the SEA for Part B, Education is ultimately responsible for assuring that accurate and consistent reporting is developed for both the federal government and the State. In light of the current organizational placement of the Part B 619 program, Education should better facilitate and coordinate disclosure of federal indicator reporting. Components to consider in devising the recommended reporting and coordination plan include (at a minimum):

▪ Transfer, validation, and interpretation of program data between Education's Data Unit, the data consultant, and Health.
▪ Development of federal narratives and conclusions related to the SPP/APR and SSIP.
▪ Full disclosure of data for each federal indicator related to Part B 619 students.



Chapter 2: Early Education Data Environment

The core element on which to build a robust but efficient accountability model for the Program, especially if the Legislature goes beyond current federal reporting requirements, is sufficient, complete, and coordinated data on which to analyze and display performance measures. This issue involves not only the structure and implementation of the data system, but also the processes and procedures Health and Education use to work with the data. Evaluators found that Health's current Program data system, the Special Education Automation Software (SEAS), lacks important automatic or procedural controls and data validation functions to ensure that data entered by the regional centers is continually accurate and complete. As a result, Health staff spend a significant amount of time reviewing monthly counts and other SEAS information to ensure data accuracy. While Health acknowledges the system's limitations, it believes it can continue to work under current circumstances. However, under current federal reporting requirements, or if the Legislature pursues a more comprehensive outcomes framework for the Program, Health, Education, and the Legislature would benefit from a planned and coordinated approach to Program data. It should also be noted that the Legislature and several state agencies initiated a coordinated approach to education data a number of years ago, but that initiative has receded into the background.

Finding 2.1: Health's SEAS data system has critical limitations that restrict the State's ability to track children or students across the educational landscape from preschool to the K-12 system. Wyoming does not currently look beyond the Part C and Part B federal reporting measures to gauge Program success. 
While Program data from the SEAS data system is used for federal reporting, Health does not appear to know the extent of the system's data accuracy and reporting limitations and has not provided sufficient data standards and processes to best ensure input accuracy by regional centers. Consequently, any analyses conducted by the agencies or evaluation staff carry with them significant data use and interpretation caveats. Additionally, data cannot be efficiently linked between the two agencies through their existing systems. Therefore, while both Education and Health utilize a consultant with access to each agency's
data, there has been no statutory direction to look consistently at cross-program analyses of early intervention and K-12 system data.

Education Utilizes Data Quality Control Standards and Techniques

Education requires that all districts submit student-level enrollment and demographic data (only for K-12 and district-operated pre-K programs) each year through its WDE 684 data collection, required under W.S. 21-2-203 through 21-2-204 and W.S. 21-2-304(a)(v). As a secondary component of this collection, information is collected for students who are provided services under the IDEA, known as the "SpEd file" within the 684 report. In reviewing background information on Education's WISE system, evaluation staff found that Education follows practices to ensure data consistency, reliability, and completeness. These practices include implementing and regularly revising business rules, procedural standards, and data definitions to assist districts in identifying the correct information to be entered into each field in the system. There are no such requirements for data maintained by the Department of Health for the Program.

WISER Identification Numbers are the Key Link to Analyses

Over the last decade, Education has utilized the WISE system and the Wyoming Integrated Statewide Education Record Identifier (WISER ID) number to track children in the K-12 system.3 Health recommended use of WISER identifiers for the Program's children in its 2007 annual report to the Legislature, but these identifiers were not open for use by the Program until mid-to-late 2011, and then were assigned only to Part B 619 children. This application of the WISER ID for the Program was not consistently used until the 2012-2013 school year. The unique identifiers allow districts and the State to track students from district to district (and now potentially from regional center to regional center) for a more comprehensive review of student progress and achievement. 
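The type of record linkage a shared identifier makes possible can be illustrated with a short sketch. This is not the agencies' actual process or schema; the field names and records below are hypothetical, and the sketch only shows how a common WISER ID allows a preschool record to be joined to a later K-12 enrollment record.

```python
# Illustrative sketch only: joining a Program (preschool) record to a K-12
# enrollment record through a shared WISER ID. Field names are hypothetical.

def link_by_wiser_id(program_records, k12_records):
    """Return combined records for children appearing in both datasets."""
    k12_by_id = {rec["wiser_id"]: rec for rec in k12_records}
    linked = []
    for rec in program_records:
        match = k12_by_id.get(rec["wiser_id"])
        if match is not None:
            # Merge the preschool and K-12 fields into one longitudinal record.
            linked.append({**rec, **match})
    return linked

program = [{"wiser_id": "W001", "eiep_services": True}]
k12 = [{"wiser_id": "W001", "grade": "K", "district": "District A"},
       {"wiser_id": "W002", "grade": "1", "district": "District B"}]

print(link_by_wiser_id(program, k12))
```

Without a consistently assigned common identifier, this kind of join is impossible, which is why the late and partial adoption of WISER IDs for Program children limits the longitudinal analyses described later in this report.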
Aside from children who receive preschool services through a local school district, Education does not assign WISER IDs to other pre-K children served through private or public preschools (e.g. Head Start/Early Head Start). During Phase 1 of the evaluation, evaluation staff were informed that there was some discussion about allowing WISER ID use in the Head Start/Early Head Start programs around the state. However, it does not

3 The initiative and infrastructure for the WISE system was developed between about 2002 and 2006, with full implementation by 2007.

appear that this initiative has moved forward. As such, the population of pre-K children reviewed for outcomes analyses or other comparisons within this evaluation is not a global view of all children in Wyoming, but includes only Program children served by the regional centers and school districts.

Health uses the Special Education Automation Software (SEAS) for its Program Data System

As noted in the Phase 1 report, evaluation staff provided a scope limitation because they were not able to review the system and its data thoroughly. However, after a review of the system and its data for the Phase 2 report, evaluation staff found that SEAS, and the data provided within the system, carry significant caveats and concerns. These concerns are noted because the basis for any longitudinal or comparative analysis for the Program starts with SEAS data.

The SEAS data system was first purchased and configured for Health for the Part B program in 2010 as an off-the-shelf system that allowed Health and the regional centers to keep data and records electronically. Prior to this system, Health used Microsoft Access databases to assist with Program data tracking and reporting. Health later requested that its vendor implement a customized module within SEAS to allow for tracking of Part C data as well. Health stated that due to these different implementations, the Part C module is more responsive to the Program's state and federal administration requirements than the module used for Part B 619. One significant challenge in working with the system is that its reporting capabilities can be problematic. Many regional centers stated that the system is workable until certain functions become inoperable. For example, one center noted that there are problems extracting child information, or "packets," from the stored file to the working file. 
This process causes the centers to restart and re-enter information, leading to duplicated and incomplete forms stored within the system. Regional centers stated that there has been limited response from the SEAS vendor to repair or find accommodations for centers to complete their data requirements. Health currently has a $50,000 annual maintenance agreement with its vendor, but is beginning to look at options for potential major system modifications or updates, or a complete system change.


SEAS Data Quality and Completeness Cannot be Ensured

To determine the number of applicable children served in the Part C and Part B programs, evaluation staff relied on the "BHD Monthly Count" report. While Health stated that this report is the most comprehensive program data that can be extracted from SEAS, it contains notable issues related to overall consistency, accuracy, reliability, and completeness. Example issues found by evaluation staff included: inaccurate numbers and characters (e.g. illogical birthdates, service dates, etc.); children identified as receiving services despite disqualifying codes within the dataset; and children with a negative number of days served (i.e. a service start date that falls after the service exit date).

Evaluation staff noted additional concerns related to data accuracy because the regional centers have discretion and flexibility to enter codes that make sense to them but are not consistent from one center to the next. A prime example of this condition is that, instead of the SEAS data system generating and assigning identification numbers automatically, each region has developed its own temporary identification number methodology used before Health's Program staff send official SEAS ID numbers. A second, similar, manual process is required for WISER ID assignment as well. Evaluation staff list several minimum caveats for the SEAS data used for analyses in this report within Appendix B.

The State Longitudinal Data System Initiative has Been Abandoned

In its current state, the Part B 619 data system is wholly separate from the K-12 system, and there is limited understanding among the agencies and Program stakeholders of the short- or long-term impacts of the Program on students or the State's K-12 system. 
Perhaps the most logical way to connect these educational arenas is through the use of longitudinal data, which would allow the agencies to track child progress from pre-K initiation through post-secondary education and into the workforce. Wyoming began a statewide longitudinal data system (SLDS) initiative in 2010 in an attempt to understand education impacts on post-secondary education, but there was little to no movement on the SLDS, including areas pertaining to pre-K programs and data.


Department of Enterprise Technology Services was Tasked with Implementing the SLDS

The public education system in Wyoming has moved in the direction of longitudinal data since it started assigning unique identifiers to K-12 children through the WISE system. However, the first concrete discussion of building a formal SLDS for Wyoming began in 2010 with the Joint Education Committee. This initiative was focused on bringing together K-12 and post-secondary data to analyze education system impacts (e.g. the Hathaway Scholarship program). After more than a year of discussion, the Legislature passed Section 326 in the 2012 Budget Bill (2012 Laws, Ch. 26) and a separate bill (2012 Laws, Ch. 30) authorizing the Department of Enterprise Technology Services (ETS) and Education to develop an SLDS. Interest in including pre-K data arose later in the system planning process. Based on a needs assessment conducted under the direction of ETS, the initial cost for the SLDS was estimated at $12.5 million. ETS received an initial appropriation of school foundation program funds and moved forward with a request for proposal (RFP) for a vendor to design and build the system. ETS also began coordinating governance of the system and working to complete interagency agreements, or memoranda of understanding (MOUs), for all Wyoming agencies that would participate in the system, such as agencies working with and/or educating children (e.g. Education, Health, the Department of Family Services, etc.). Due to several complications, ranging from issues with the selected vendor and significant turnover at Education to the inability to execute the interagency agreements, the SLDS was effectively abandoned by November 2015. The remaining appropriated funds for the project were transferred to the Wyoming Community College Commission, via the B-11 process, to build a data warehouse. 
Best Practices for Early Childhood Data Systems

While the SEAS data system or a successor system could work to provide some data in an SLDS environment, Wyoming's SLDS initiative appears no longer feasible. Evaluation staff reviewed best practices directly related to early childhood data systems and considerations that do not necessarily focus on full, linked SLDS integration, but may be useful as Health updates the SEAS data system. Whether attempting to incorporate an early childhood data system into an SLDS or not, Health and Education could utilize readily available best practices to gauge the need for, or extent of, their data system
development, revisions, or change. The Early Childhood Data Center has developed the 10 Fundamentals of Coordinated State Early Care and Education Data Systems, which is relevant in light of the data quality issues identified by evaluation staff. These fundamentals focus on identifying the core data collection and reporting requirements that will target the policy questions most important to decision-makers. Examples of best practices, including components that focus on quality indicators, authority, and roles and responsibilities, can be found in Appendix E.

Recommendation 2.1: In the short term, the Department of Health should conduct the following actions to mitigate current SEAS data system limitations and data entry or extraction difficulties:

▪ Conduct an assessment of the minimal system functions necessary to efficiently comply with federal reporting requirements.
▪ Survey regional centers to catalogue ongoing data input and system navigation difficulties central to the minimum state contract and federal reporting and tracking functions for the Program children they serve.
▪ In consultation with the SEAS data system vendor, develop a plan and cost estimate for updating the system to meet basic needs.
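The record-level problems described in this chapter (illogical birthdates, service exit dates preceding start dates, children marked as served despite disqualifying codes) are the kinds of conditions an updated data system could flag automatically. The following is a minimal sketch of such automated validation rules; the field names and rule set are hypothetical, not SEAS's actual schema.

```python
# Illustrative sketch of automated record validation of the kind a revised
# SEAS data system could perform. Field names and rules are hypothetical.
from datetime import date

def validate_service_record(record):
    """Return a list of data-quality problems found in one child service record."""
    problems = []
    if record["birth_date"] > date.today():
        problems.append("birth date is in the future")
    if record["service_exit"] < record["service_start"]:
        problems.append("negative days served: exit date precedes start date")
    if record.get("disqualifying_code") and record.get("receiving_services"):
        problems.append("child marked as served despite a disqualifying code")
    return problems

record = {
    "birth_date": date(2014, 5, 1),
    "service_start": date(2017, 9, 1),
    "service_exit": date(2017, 6, 1),   # exit precedes start, so it is flagged
    "receiving_services": True,
    "disqualifying_code": None,
}
print(validate_service_record(record))
```

Rules of this kind, run at data entry or as a batch check, would shift error detection from Health's manual monthly reviews to the system itself, which is the substance of Recommendation 2.2 below.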

Recommendation 2.2: The Department of Health should develop SEAS data system business rules and/or policy standards for consistent and reliable data entry by regional centers.

The central purpose of Recommendations 2.1 and 2.2 is for Health to complete a formal and coordinated data appraisal and needs assessment of the SEAS data system in order to identify where and how the Program should proceed with any update or change required to more efficiently and effectively administer the Program. Because the SEAS data system serves both as a case management system for the regional centers and as Health's primary data tracking and reporting system, the needs assessment and appraisal should cover both areas. Finally,
Health should develop standards for how regional centers input data consistently and reliably.

Recommendation 2.3: In addition to the short-term actions outlined in Recommendation 2.1, the Department of Health should work with the information technology and data analysis staff at the Department of Education to assess the current attributes and limitations of the SEAS platform. The agencies should outline the requirements needed to meet customary and reasonable data quality control standards for education data, as well as develop a plan to potentially replace the SEAS data system to most efficiently meet the needs of Program administration, federal reporting requirements, and potential adjustments to the Legislature's outcome and accountability expectations (see Chapter 3 Policy Consideration).

This recommendation is closely related to two other factors noted in the Phase 1 report and in Chapter 3 of this report, respectively:

1. Whether or how the Program's organizational structure is changed (e.g. moving Part B 619 and/or Part C to the Department of Education or under an Office of Early Learning); and
2. Whether or how the Legislature, agencies, and stakeholders move to clearly define the Program's purpose and performance expectations to move beyond the current federal framework.

While it would be beneficial for Education's programmatic and data staff to collaborate with Health on any SEAS needs assessment and appraisal work, the bigger concern is whether the current, or even a revised, SEAS data system can remain a competent platform in a broader accountability setting for the Program. Education should, at a minimum, be involved with Part B 619 data issues due to its legal responsibilities for the Program's administration and oversight. If Part C also comes under its purview, Education's role will only increase.



Chapter 3: State-level Performance Expectations

Finding 3.1: The Legislature, Health, and Education have not expanded Program performance expectations beyond the federal outcome definitions or requirements.

Even though the Program has met expectations on the federal indicators, the Committee also charged evaluation staff with reviewing the Program from a state-level performance perspective. The purpose of this phase of the evaluation was to address concerns the Legislature had regarding the difficulty of determining Program success and how to value the Program based on its overall impact on students and the State. In attempting to complete these analyses, evaluation staff explored different stakeholder perceptions and expectations of the Program and its intended purpose. The challenge evaluation staff identified, which was also expressed by Program staff at Health and Education, was the lack of criteria beyond the federal framework.

Separating Factors that Impact Children is Difficult

Establishing causation for results in education is extremely difficult. This concern was a central focus for Education staff when discussing how primary education data would be used in this evaluation. Their premise was that many factors contribute to a child's development, whether in pre-K or in the K-12 system. For example, factors such as natural growth and development, parental and teacher interventions, and possibly progressive special education services at the district level may affect a child's assessed outcome. In addition, each child's ability and readiness to comprehend and apply different interventions and services varies. Once the Program has a clearly defined purpose, consideration of these concerns during legislative and stakeholder policy discussions will make measuring its impact more consistent and useful.

The State Does Not Have a Clear Definition for Program Purpose

The second phase of this evaluation was intended to help provide context for establishing clear definitions and expectations for the Program. 
The primary focus for this phase is on academic outcomes that could be used to assess the impact of early intervention on the children served through the Program. However, the main challenge in doing so is the absence of program purpose or outcome measure definitions within state statute or agency rules and regulations.


State, School District, and Regional Center Staff Have Varied Definitions of the Program's Purpose

Over the course of this evaluation, various stakeholders have defined the purpose of the Program differently, inherently changing the methods of measuring and determining Program success. This chapter explores several of those purposes and potential outcome measures.

Figure 3.1 Example Stakeholder Opinions on the Program's Purpose

Source: Legislative Service Office compilation and summary of stakeholder information.

Figure 3.1 highlights examples of stakeholder responses received throughout Phase 1 and Phase 2 related to the Program's purpose. Overall, the comments of these stakeholders ranged from defining the purpose of the Program as simply meeting federal indicators to calling for a longitudinal study of students to understand when and where they make the most progress or achieve higher functioning. See Appendix F for a list of questions evaluation staff asked of Wyoming school district special education staff.

The Program Has Reported on Undefined Legislative Outcomes

The Legislature studied and implemented numerous changes to the Program between 2004 and 2008. These changes included establishing the current Program funding model and the requirement to analyze and report outcomes to the Legislature. In 2006 House Bill 12 (2006 Laws, Ch. 85, Section 2; herein referenced as the Chapter 85 report), the Legislature stated the following:


(a) The developmental disabilities division of the department of health shall report to the select committee on developmental programs or, if that committee no longer exists, to the joint education interim committee and the joint labor, health and social services interim committee as follows:

(i) By October 1, 2006, and every year thereafter, regarding:

(E) Whether developmental preschools are meeting parent expectations based on parent satisfaction surveys and children are making sufficient and measurable progress based on student performance measurements. If a program is not providing satisfactory results, the board of directors of the program shall submit a remedial plan to the division that details the steps that will be taken to correct the deficiencies. The division shall report the results of the parent satisfaction surveys and student performance measurements as well as the number of regions out of compliance with the above performance measures, the number of remedial plans submitted to the division and the number of regions correcting noncompliance within the past year. For the October 1, 2006 report only, the division shall submit a plan for implementation, including a description of the parent survey instrument to be used in successive years and the method to be used for assessing student performance rather than the outcome measures. (LSO emphasis)

Health has complied with this reporting requirement each year, but has continued to utilize only the requirements set in federal law. For example, parent satisfaction surveys are conducted each year as part of the federal indicator reporting for both Part C and Part B. Similarly, for the outcome or student performance measure assessments, Health reports Child Outcome Summary (COS) information. 
Finally, Health has maintained its federally required monitoring program, which requires corrective action plans (CAPs), or other more detailed corrective action documents, when Health finds regional centers out of compliance with federal requirements over one or more consecutive years.

Evaluation Staff Relied on Data Queried by Education for Review and Analyses of Potential Outcomes

Based on discussions with Education staff regarding data needs, evaluation staff relied upon Education's Data Unit to efficiently review and query its expansive data collections. Evaluation staff also reviewed the WISE system business rules and standards, and believe the WISE system and data provided to LSO met sufficient assurances of data accuracy.


Evaluation staff then compared the data to other public Education reports and found general alignment with child enrollment numbers for kindergarten through fourth grade.4 In an effort to explore the possible outcomes analyses and child or Program comparisons that administering agencies could be conducting, evaluation staff linked WISER IDs to other Education data sets to track children from the Program through the fourth grade. Evaluation staff focused primarily on fall kindergarten scores and child counts because they are the most likely means of determining outcomes of Program services as children transition into the K-12 public education system. The initial goal of evaluation staff was to provide a comparison of the children who received Program services to their peers once enrolled within school districts. A second goal was to provide statistical analysis of how Program students, services, and student performance can change over time. However, as neither agency conducts such analyses or tracks data over time, the following analysis relates only to the potential benefit that could be gained from systematically reviewing data on a longitudinal basis. 
Evaluation Staff Defined Example Groups of Student Populations for Analytical Comparison

To the extent possible, analysis of records was conducted on four broad groups of children based on demographic data provided by Education and Health:

• Group 1: Children who did not receive EIEP services AND did not require special education while enrolled in a Wyoming school district (generally described as "typical learners" or "typical learning peers");
• Group 2: Children who did receive EIEP services, but did not require special education while enrolled in a Wyoming school district;
• Group 3: Children who did not receive EIEP services, but did require special education while enrolled in a Wyoming school district; and,
• Group 4: Children who did receive EIEP services, and did require continuing special education while enrolled in a Wyoming school district.

More than three-quarters of the students from the fall 2012-2016 Education data collections were identified as Group 1 (i.e. typical learners). Another 14.4% of students were not identified within the Program's child count data, but did require K-3 special education services (Group 3). Students identified within the Program's child count data who did not require K-3 special education (Group 2) accounted for 2.5% of the students. Less than 10% of the students in the Education data collections were identified as receiving both Program services and K-3 special education through school districts (Group 4). These four groups of students were used by evaluation staff to analyze available data to determine whether the potential Program purposes could be measured.

Table 3.1 Count of Unique WISER Identification Numbers, by Defined Child Groups, Fall Data 2012-2016

Child Group | Evaluation Staff Description (EIEP and/or K-3 IDEA) | Number of WISERs | Percent of Total
Group 1 | Typical learners | 27,220 | 76.1%
Group 2 | Yes EIEP; No IDEA in K-3 | 881 | 2.5%
Group 3 | No EIEP; Yes IDEA in K-3 | 5,151 | 14.4%
Group 4 | Yes EIEP; Yes IDEA in K-3 | 2,474 | 7%
Total | | 35,726 | 100%

Source: Legislative Service Office analysis of the fall enrollment and special education collections matched with Wyoming Department of Health's Program child count data.

4 Due to the process and timelines for assigning WISER IDs to Program children beginning in 2011, the oldest of these children today would most likely be in the third and fourth grades.

Evaluation Staff Used Five Program Purposes Identified by Stakeholders to Develop Potential Outcomes

As noted earlier in this chapter, Program stakeholders have different notions of what the Program and regional centers' service provisions should accomplish. 
In combination with the Committee’s concerns about Program success, evaluation staff focused on providing analyses on a number of outcome measures that explore several stakeholders’ purposes and potential outcomes for the Program, including:
1) To gauge individual student achievement and age-appropriate social and emotional skill levels;
2) To gauge children’s kindergarten readiness skills;
3) To assess whether Program services may be closing the achievement gap between typical learners and their peers who receive IDEA special education and related services;
4) To track the number of children who received pre-kindergarten IDEA services, along with their potential decreased need and cost of services into the K-3 public education system; and,
5) To identify and provide access to services for children with developmental delays or disabilities.
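To make the four-group classification behind Table 3.1 concrete, group assignment amounts to a two-way membership test against the matched data sets. The sketch below is illustrative only: the set names and WISER IDs are hypothetical and are not drawn from agency data; the actual analysis matched Health's Program child count data with Education's fall enrollment and special education collections.

```python
# Hypothetical sketch of the four-group classification used in Table 3.1.
# eiep_ids: WISER IDs found in Health's Program (EIEP) child count data.
# k3_idea_ids: WISER IDs found in Education's K-3 special education data.

def classify(wiser_id, eiep_ids, k3_idea_ids):
    """Assign a child to one of the four comparison groups."""
    had_eiep = wiser_id in eiep_ids        # served by a regional center
    had_k3_idea = wiser_id in k3_idea_ids  # K-3 special education in a district
    if not had_eiep and not had_k3_idea:
        return "Group 1"  # typical learners
    if had_eiep and not had_k3_idea:
        return "Group 2"  # EIEP only
    if not had_eiep and had_k3_idea:
        return "Group 3"  # K-3 IDEA only
    return "Group 4"      # EIEP and continuing K-3 IDEA

# Example with made-up identifiers:
eiep = {"W001", "W002"}
idea = {"W002", "W003"}
groups = {w: classify(w, eiep, idea) for w in ["W001", "W002", "W003", "W004"]}
```

Counting the resulting labels over all matched WISER IDs would reproduce the tallies reported in Table 3.1.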


Analyses Do Not Provide Conclusions

The evaluation staff analyses described in the following pages illustrate potential approaches to determine outcomes and example methods to measure academic progress or impact. Each analysis provides context and limited information related to the number of children who received services in the Part B 619 program and/or the K-12 system through Wyoming school districts. Evaluation staff also looked at assessment outcomes for kindergarten readiness and for math and reading scores. After reviewing test scores for children who received early intervention services through the Program, along with their peers in both typical and special education programs, the available data does not show that test scores are significantly better or worse as a result of receiving early intervention services.

Due to the data quality limitations noted in Chapter 2, these analyses are illustrative of collection and reporting possibilities, and readers should not rely on the analyses provided in this chapter to draw conclusions about current Program success. Success and desired Program outcome measures should be specifically defined and set by the Legislature and stakeholders; targeted analyses should then be developed and refined thereafter.

Outcome Concept #1: Measuring individual student achievement and progress

This approach defines the purpose of the Program similarly to current federal expectations. Under this approach, the agencies could continue to utilize results from the Child Outcome Summary (COS) as a central outcome measurement, which has indicated program success at the federal level. As discussed in Chapter 1, Wyoming has met federal requirements and expectations over the years. The COS is undergoing a new pilot project to provide more reliable and objective data. Until this process is fully implemented and matures, this information will not be available for effective analysis.
Health staff noted that the changes to the COS process will take about three years before the State can expect consistent and reliable data and analyses.

Individual Education Programs (IEPs) Could Be Used for Measuring Student Achievement

Stakeholders offered that another way to gauge Program success could be through the review of individual child achievement, measured by attainment of individual education program (IEP) goals. The most effective way to determine IEP progress at the state level would be for Program staff to gather and query broad-based IEP goals information from Health’s special education data system.


However, Health staff demonstrated that this data is not currently available in a broad-based report format through its data system. Health does review individual IEPs during regional center monitoring visits, but no current report aggregates this data for cross-regional or statewide analyses within its system. The primary concern Health staff expressed about going this direction is that each regional center’s programming and curricula philosophies differ, so statewide comparisons or summaries of IEP progress may not provide clear performance measures. During Phase 1, Health staff noted that providing the regional centers with training on writing measurable IEP goals is a current and future objective for professional development. On the other hand, Education staff cautioned that using such a measure may violate provisions of the IDEA requiring that each child’s IEP be developed according to the child’s specific educational needs and disabilities.

Outcome Concept #2: Measuring kindergarten readiness

Kindergarten readiness could be measured using assessments aligned with expected kindergarten entry skills and standards, as currently defined by each district.5 According to Education staff, there are two common readiness assessment tools, only one of which is required: 1) the Instructional Foundations for Kindergarten (IF-K), and 2) the Children’s Progress Academic Assessment (CPAA).6 Evaluation staff concentrated its review on the IF-K because it was specifically developed in, and for, Wyoming. The intent of the review is to show how a consistent, statewide skills and development standard could help the State and districts understand the areas of greatest developmental need for incoming kindergartners with disabilities.

IF-K Was Developed With Broad-based Stakeholder Input

Prior to 2007, kindergarten readiness was generally assessed through a checklist of distinct skills to determine a child’s readiness for school.
In 2008, Education piloted the IF-K observational tool, consisting of nine foundational assessment areas for rating preschool and kindergarten children over a 10-week period. The assessment tool was developed with broad-based stakeholder input from school districts and the preschool community. In terms of its intended use, both preschool and kindergarten teachers were to use the data from the IF-K to understand the foundation skills their students demonstrate when they enter or exit preschool or enter kindergarten. It was also to be used as a tool for communicating with families about children making the transition from preschool to kindergarten. However, this assessment tool has never been used as intended, nor has it been required of the Program’s regional centers. The key areas measured through the IF-K included:
A. Representation
B. Language
C. Writing
D. Reading
E. Geometric and Algebraic Math
F. Number Sense and Operations
G. Science
H. Relationships and Self-Regulation
I. Social Problem-solving

As of the 2015-2016 school year, Education suggests that only five of the nine foundational areas be used (Representation, Language, Science, Relationships and Self-Regulation, and Social Problem-solving), as it believes the CPAA assessment tool is more appropriate to cover children’s reading and math foundational skills.

5 The State developed the Wyoming Early Learning Foundation skills a number of years ago to bring consistency and definition to early learning skill standards. However, these foundational elements are not required among preschools, nor are they specifically aligned with K-12 standards.

6 Education noted that for district-based preschool and kindergarten programs, as well as Temporary Assistance for Needy Families (TANF) preschools monitored through Education (as delegated by the Department of Family Services), both of these assessments are used.

Due to Incorrect Use, IF-K Is Not a Reliable and Consistent Foundational Early Childhood Assessment Tool

Despite the broad-based support and testing of the IF-K when it was first developed, districts appear to use the IF-K assessment less as Education administrations have changed. Additionally, the assessment’s intended purpose, obtaining consistent data and driving early childhood policy, seems to have been forgotten. For example, in 2010, Education released a memo advising that the IF-K was to be used to “establish a baseline and trends for data indicators on children entering kindergarten...
data will be used to improve early learning professional development, transitions and outcomes for students entering kindergarten.” In 2014, Education sent another memo to districts, which stated that the purpose of the IF-K was to “support instruction, facilitate transition from preschool to kindergarten, and identify levels of student readiness for the kindergarten curriculum.” While the initial Education memo highlighted data and outcomes as the central benefit to guide future changes, the latter memo highlighted instructor and staff development.


Additionally, this test is not required of all school districts, and when it is used, the test is not given at the same time of year, administered in the same fashion, or conducted over the same amount of time. In reviewing the school district survey results, evaluation staff found that the IF-K is not used for the purposes expressed in either memo, and districts have modified use of the IF-K assessment to best fit their needs. For example, the IF-K data provided by Education for this evaluation revealed that 45 of the 48 school districts used the IF-K assessments within their districts. However, when surveyed by evaluation staff, only 9 of the 33 responding districts (27%) stated they used the IF-K assessment. Most of the districts that used the IF-K stated that its use appears to be more for screening children at their kindergarten registration and screening events, or shortly after the start of kindergarten. In each case, survey respondents stated that conducting this assessment lasted from ten minutes to one hour, not the intended comprehensive 10-week observation.

Evaluation Staff Review of IF-K Data

According to the consultant charged with compiling and analyzing IF-K data, only a score of 4 or 5 indicates that a child is ‘kindergarten ready.’ When evaluation staff reviewed available IF-K scores, the majority of children scored a 3 or 4 in most functional areas. The analysis shown in Table 3.2, below, depicts the available test scores for children who did and did not receive early intervention services from the regional centers (CDC). For example, for the Representation skill, 26% of children who had early intervention services scored at ‘kindergarten ready’ levels of 4 or 5, compared to 44% of children who did not have those services. However, this analysis did not take into consideration the special education needs of the child taking the IF-K assessment.
Table 3.2 IF-K Scores in Each Functional Area for Children, 2012 through 2016
(CDC = children served by regional centers; No CDC = children not served; scores of 4 or 5 are ‘kindergarten ready’)

                          Score 1       Score 2       Score 3       Score 4       Score 5
Functional Area         CDC  No CDC   CDC  No CDC   CDC  No CDC   CDC  No CDC   CDC  No CDC
Representation          15%    6%     26%   15%     33%   34%     21%   32%      5%   12%
Language                16%    5%     31%   17%     33%   36%     16%   29%      5%   13%
Science                 18%    9%     26%   19%     36%   39%     17%   25%      3%    8%
Relationships and
  Self-Regulation       17%    6%     25%   16%     30%   29%     25%   39%      3%   10%
Social Problem-Solving  18%    7%     28%   19%     32%   34%     17%   30%      4%   10%

Source: Legislative Service Office analysis of Education’s IF-K data matched with Health SEAS child count data.
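The ‘kindergarten ready’ shares quoted above can be recomputed directly from the score distributions in Table 3.2. A minimal sketch follows; the dictionary layout is illustrative, and the percentages are those reported for the Representation skill.

```python
# Score distributions for the Representation skill, from Table 3.2.
# Keys are IF-K scores (1-5); values are percentages of children.
representation = {
    "CDC":    {1: 15, 2: 26, 3: 33, 4: 21, 5: 5},   # served by regional centers
    "No CDC": {1: 6,  2: 15, 3: 34, 4: 32, 5: 12},  # not served
}

def ready_share(dist):
    """Percent of children at 'kindergarten ready' levels (scores 4 and 5)."""
    return dist[4] + dist[5]

cdc_ready = ready_share(representation["CDC"])        # 26
no_cdc_ready = ready_share(representation["No CDC"])  # 44
```

The same calculation applied to each row of Table 3.2 yields the ready-share comparison for every functional area.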


Health and Education Do Not Formally Study Program Impacts on K-3 Student Progress and Outcomes

While Education has the ability to track assessment scores of children from preschool through the K-12 system, and does so for district-run preschools with the help of an outside consultant, there has been no similar activity between Health and Education to study and compare child results (e.g., assessment outcomes) or academic progress for children served through the Program. Due to the current organizational structure of one agency overseeing another, Education noted difficulties in considering Health the equivalent of the school districts for the purposes of longitudinal study and outcomes measurement, despite interpreting Health to be a Local Education Agency (LEA) or Intermediary Education Unit (IEU) for Part B 619 reporting.

During Phase 2, evaluators learned that school districts have specific allowances to administer preschool programs using certain education funds. Wyoming Statute 21-4-302(c) authorizes districts to run or contract for part-time preschool programs that utilize developmentally appropriate curricula. According to survey results, about half of the thirty-three responding school districts stated they funded a local district preschool program. As part of this allowance, statute also requires participating districts to assess the success of their preschool programs. Specifically, Wyoming Statute 21-4-302(e) states:

(e) A school district which provides a preschool program under subsection (c) of this section biennially shall assess, through the fourth grade when practical, the school readiness and academic performance of pupils who participate in the program as compared with those who do not participate in the program.
The district shall report the results of the assessment to the department of education and the department shall report the results to the joint education interim committee of the legislature on or before October 1 of each even numbered year. The results of any assessment required by this subsection shall be open for public inspection.

While the districts are required to perform the program assessments or evaluations on their respective preschool programs, Education has taken over this responsibility on a statewide basis and provided two biennial reports, 2013 and 2015, to evaluation staff. These analyses were conducted by the Program’s data consultant and looked at preschools in six districts. It is important to note that the 2015 study included the following caution:

“In evaluating the impact of the district preschool programs on subsequent student performance, the central question is: Did the students who participated in the
preschool program perform at a higher level than they would have had they not participated in the preschool program? Since this question is unanswerable, the question becomes: Did the students who participated in the district preschool program perform at a higher level than a comparable group of students who did not participate in the preschool program? The difficulty in answering this question is coming up with a “comparable” group of students. The students who were selected for the preschool program were selected based on economic and/or academic need… However, there may be other aspects of “economic need” and “academic need” that districts used to select preschool students that cannot be identified from enrollment files… Thus, our ability to identify a true comparison group is very limited.”

Overall, the studies concluded that, while the results were mixed, there were “more positive results” for some children, such as children on free and reduced lunch or students receiving Title 1 services, than for comparable groups not on the lunch program or receiving Title 1 services. Education staff stated that, with the data already available, such longitudinal monitoring and reporting could be extended to children who receive services through the Program.

Outcome Concept #3: Closing the achievement gap between typical learners and their peers who receive special education and related services

According to the National Center for Education Statistics, achievement gaps occur when one group of students consistently outperforms another group. Students can be grouped by demographic and socioeconomic conditions, and common factors such as race/ethnicity, gender, poverty, and limited English proficiency are traditionally noted in education. Moreover, federal education laws and policies direct education systems to address and reduce the achievement gaps between students with disabilities and their non-disabled peers.
Importantly, on December 10, 2015, the Every Student Succeeds Act (ESSA) was signed into law. The ESSA focuses primarily on career and college readiness, and less on identifying achievement gaps among students. However, given that the federal No Child Left Behind (NCLB) law was in place for the children identified for this evaluation (2012-2016), and closing the achievement gap was a primary focus at the time, evaluation staff considered both laws while conducting its analyses.


IDEA and NCLB May Not Align for Ensuring Children With Disabilities Are Both Served and Succeed

Federal education policy specifically addresses the academic achievement of students with disabilities and requires students to be fully included in the provisions of NCLB, thus endorsing “closing the achievement gap” between students with disabilities and other student groups. The NCLB and IDEA were expected to work together to improve the academic performance of students with disabilities in the following areas:
• Academic content and achievement standards
• Annual assessments
• Accountability
• Qualified teachers in general education and special education

However, merging the nation’s main education law with IDEA policies also creates significant challenges. The most notable challenges are grounded in accountability priorities, wherein the individual needs of the student with disabilities and the provision of a free appropriate public education (FAPE; a federal IDEA principle) may not balance with the equal academic success of all students (an NCLB principle). An example of this potential conflict is that each child’s IEP goals are not necessarily aligned with state or district learning or outcome standards. Furthermore, IEPs are not set to measure strictly academic performance. Instead, they are designed for continual skill development and progress toward appropriate social-emotional, physical, and academic functioning.

The Measures of Academic Progress May Provide Balance

The Measures of Academic Progress (MAP) assesses individual student academic progress, but can also be used to compare a student’s scores with those of his or her peers. In Wyoming, kindergarten students are assessed twice per year using MAP in the areas of reading and math. This assessment does not specifically factor in social-emotional or behavioral growth/progress, which may affect many children with disabilities.
Evaluators reviewed and compared the MAP scores (when available) among the four groups identified earlier in this chapter. For this analysis, evaluation staff reviewed only the fall kindergarten MAP scores, collected around October of each year, as these scores would be the least impacted by K-12 system kindergarten special education services/supports. Based on analysis conducted by evaluation staff, the scores for Groups One, Two, and Three did not vary significantly in distribution from one group to the next in either math or reading, making it
difficult to draw any conclusions related to specific outcomes or impact of the Program. However, Group Four had the lowest scores, which is reasonable as these children would have the most difficulties as they continue to need IDEA services. Another observation noted in the analysis was that year-over-year, fall kindergarten scores declined in each group except the special education students who received early intervention services prior to entering kindergarten. Figure 3.4 and Figure 3.5, below, illustrate this trend for both reading and math. This analysis suggests that early intervention may be beneficial to children who continue to receive services once they enter kindergarten.

Figure 3.4 Fall Kindergarten MAP Reading, 2012 through 2016
[Line chart of fall kindergarten MAP reading scores for Groups One through Four, school years 2012-2013 through 2015-2016]

Source: Legislative Service Office analysis of Education’s WDE 684 data matched with Health SEAS child count data.

Figure 3.5 Fall Kindergarten MAP Math, 2012 through 2016
[Line chart of fall kindergarten MAP math scores for Groups One through Four, school years 2012-2013 through 2015-2016]

Source: Legislative Service Office analysis of Education’s WDE 684 data matched with Health SEAS child count data.
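A group-by-year aggregation like the one behind Figures 3.4 and 3.5 can be sketched as follows. The records and scores below are invented for illustration; the actual analysis used Education's WDE 684 data matched with Health SEAS child count data.

```python
# Sketch: mean fall kindergarten MAP score by group and school year.
from collections import defaultdict

records = [
    # (school_year, group, fall_kindergarten_map_score) -- hypothetical values
    ("2012-2013", "Group 1", 146), ("2012-2013", "Group 1", 144),
    ("2012-2013", "Group 4", 134), ("2013-2014", "Group 4", 135),
]

def mean_by_group_year(rows):
    """Average score per (school_year, group) pair."""
    buckets = defaultdict(list)
    for year, group, score in rows:
        buckets[(year, group)].append(score)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

means = mean_by_group_year(records)
```

Plotting the resulting means per group across the four school years would reproduce the trend lines the figures describe.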


Outcome Concept #4: Tracking the number of children to determine possible decreased need and costs for special education services

One could also consider that the purpose of the program may be to provide additional preparation, coping, or mitigation strategies that lessen negative pressures on students once enrolled in public education. Another possible way of measuring outcomes for the Program is through the number of children who are served and then exit the Program because their abilities, or needs, no longer necessitate IDEA special education services.7 Some stakeholders expressed concern with using this method as a measure of Program success because some children will always require services based on the nature of their disability or educational needs. However, to determine the extent of children exiting services in K-3, evaluation staff identified children with exit dates and complete records and included them in this analysis. According to Education, there are several exit codes offered within its special education data; a few examples include:
▪ NM - Normal Matriculation
▪ RP - Returned to Regular Program
▪ TO - Transferred to another educational setting outside the district
▪ PE - Parental Exit

An important consideration when looking at children’s exit from services is that in some instances, children exit in kindergarten (return to regular classroom programming), but then return in later grades with a different disability and possibly a later exit date. Several children within Groups Two, Three, and Four have multiple exit dates and reasons. Within Group Four (children who had Program services at a regional center and remained in special education services during the K-3 years), out of 2,474 unique WISER IDs with exit dates, evaluation staff matched 295 with Health’s child count data.
Although in some cases a child had multiple entrance dates and exit reasons, most children exited the program because they either returned to regular programming or transferred out of the district.

7 Please note that once in the K-12 system, children may receive other interventions not specifically linked to IDEA-eligible services through an IEP. These services may occur under the Multi-Tiered Systems of Support (MTSS) that all children may access (e.g. reading programs/assistance, extended day remediation, etc.).
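Because children can carry multiple entrance and exit records, any exit-reason tally must first decide how to handle repeats. One option, keeping only each child's most recent exit, is sketched below with hypothetical records; the codes follow Education's examples (NM, RP, TO, PE).

```python
# Sketch: summarizing exit reasons when children have multiple exit records.
from collections import Counter
from datetime import date

exits = [
    # (wiser_id, exit_date, exit_code) -- hypothetical records
    ("W001", date(2013, 5, 20), "RP"),  # returned to regular program
    ("W001", date(2015, 5, 22), "TO"),  # later re-entry, then transferred out
    ("W002", date(2014, 5, 21), "RP"),
]

def latest_exit_reasons(records):
    """Count exit codes, keeping only each child's most recent exit."""
    latest = {}
    for wiser, when, code in records:
        if wiser not in latest or when > latest[wiser][0]:
            latest[wiser] = (when, code)
    return Counter(code for _, code in latest.values())

reasons = latest_exit_reasons(exits)
```

Counting every exit record instead of only the latest one would give a different tally; either convention is defensible, but the choice should be stated alongside any reported percentages.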


While no children exited during the fall of a school year with the reason “returned to regular program,” by the end-of-school-year collection date, 114 of the 295 children were reported as exiting for that reason. This reason indicates that the child no longer presented an educational need for IDEA services and supports. Within Group Three (children who did not receive Program services at a regional center, but did have K-3 special education services), there were 730 unique WISER IDs with exit dates. Several children within this count have multiple reasons and/or exit dates. Overall, the majority of Group Three children were identified with an exit reason of having returned to regular programming.

When children exit Program services, they do so because there is no longer a demonstrated need for special education and they return to their regular program. In this analysis, roughly 14% of children receiving Program services no longer needed them by the fourth grade. It is important to note that some children may never reach a point that allows them to exit services. Exit dates coupled with disability categories may provide greater insight into this measure if pursued. This data is collected and available for such analysis.

Cost and Service Tracking Is Difficult for Both the Program and K-12 Special Education

In a more general sense, estimating the potential cost reduction impact of Program services on the K-12 system is extremely difficult. Even in an aggregate sense, current Program and K-12 system cost and service tracking is imprecise. For example, the Program’s current funding model allocates funding to regional centers on a per-child basis, but this average amount of funding (between $8,000 and $9,000 per child) is not tied to each eligible child at the regional center level. The funding is allocated to the regional center to support the centers’ overall operating costs to serve eligible children in their care throughout the year.
If a child enters or leaves a regional center, the child must be served regardless of state funding levels. In the K-12 system, special education and related services are currently paid to districts at 100% of actual, allowable costs incurred by the districts. Therefore, Education’s required reporting of district special education and related service costs is mostly at the district level, with almost no ability to track back to services delivered or staff members’ time delivering services. In terms of data to track services that may be related or associated with K-12 system costs, the Program’s special education data system does not currently have the capacity to report IEP services in aggregate, whether by type, quantity, or frequency. For the K-12 system, Education’s special education data system tracks up to five service
types for each child, but may miss services for higher-need students who receive more than five different services. The K-12 data also does not track quantity of service time or frequency of delivery.

Outcome Concept #5: Service Access and Opportunity

As discussed in Phase 1 of this evaluation, many stakeholders believe that identifying children and providing broad access to free and appropriate education is the primary goal of the Program. Moreover, Wyoming generally identifies and serves more children age birth through five years than almost all other states. Phase 1 confirmed the high identification rates, but also found that Wyoming’s identification rates may not necessarily be a cause for concern when compared to expected disability incidence rates within the U.S. population. While the conversations related to the primary purpose of the Program varied, the school district survey showed an almost universal belief that early intervention services are “very important.” Of the districts that noted otherwise, only one provided comments of frustration and negative feelings about the regional center with which it worked, but not about overall impressions of early intervention’s usefulness.

Regional Centers’ Service Delivery and Resource Standards

Additionally, there tends to be a consensus among stakeholders that to be effective, early intervention and education must be high-quality. The State requires that professionals working with children at the regional centers follow their professional licensing requirements, but there is little other direction as to what constitutes a ‘high-quality’ program. To ensure that the Program is providing the same opportunities to all children throughout the state, standardization could be considered.
As one stakeholder explained, “a child at one regional center may have the same disability as that of another child at a different regional center and neither child would be given the same amount of services for the same amount of time.” While each child’s services should be individualized, there are disparate region-by-region practices where children’s programming standards vary. Both the regional centers and the school districts are subject to Chapter 7 of the Wyoming Department of Education rules and regulations governing IDEA disability classifications. However, philosophies on identification and course of action for children who require special education services differ from one location to the next. Additionally, W.S. 21-2-706(a)(iii) states that, “Contracts with developmental preschool service providers shall require that the providers adopt evidence-based best practices, as defined by the
division by rule and regulation.” (LSO emphasis) During Phase 1 of this evaluation, Health staff expressed that as Education is the SEA for Part B 619 services, only Education has the ability to promulgate rules concerning this portion of the Program. Yet, Health has not adopted rules and regulations related to best practice standards for Part C. Health staff noted that it relies on regions to follow through on statements made in regional centers’ biennial applications that they will use best practices.

In general, there is no clear definition of ‘high-quality’ as it relates to early intervention services. School districts that replied to the evaluation survey stated varying ideas as to what constitutes a high-quality service. However, several themes appeared when districts were asked to define components of a high-quality early learning provider. Those included:
1) Teacher certification,
2) Inclusive environment,
3) Differentiated instruction,
4) Evidence-based early childhood curriculum, and
5) Parental involvement.

Additionally, there were comments about what concerns or improvements could be addressed by regional centers, including:
1) full-day services or more academic programming for preschool ages (three to five years),
2) improved IEPs and service programming, and
3) improved work with high-needs children.

As discussed in Phase 1, while several State advisory committees have attempted to define high-quality, a clear definition remains elusive. Finally, in the event a definition and standards are developed, an important issue is oversight and ensuring fidelity to the standards by regional centers.

Recommendation 3.1: The Department of Health should revisit its WISER ID assignment processes and develop a formal policy to allow for straightforward, efficient, and timely assignment of WISER IDs to Part B 619 children entering services.

Recommendation 3.2: The Department of Education should allow WISER IDs to be assigned to Part C children entering services.
Recommendations 3.1 and 3.2 seek to bring continuity and consistency to the assignment and use of WISER IDs for the Program. Health should work with Education to establish clear policies and procedures for assignment of the identifiers to children entering the Program, as well as work on a strategy to include Part C children.

Recommendation 3.3: The Departments of Health and Education should consider developing policies and/or rules that define and measure a high-quality early intervention program (e.g. service standards, curricula, staff licensing or professional development

Page 47 Early Intervention and Education Program, Phase 2 standards, etc.), which can be periodically and efficiently validated within the regional center monitoring protocols. While different working groups have attempted to look at or define what “high quality means” in early childhood education, this recommendation is specific to the Program and ensuring Health and Education set reasonable but clear and rigorous standards of quality that regional centers should follow. This process should include regional center and family stakeholder input and be set in agency rules commensurate with W.S. 21-2-706(a)(iii). Policy Consideration As noted throughout Chapter 1 and Chapter 3, the current Program emphasis on federal reporting and indicators has set its base accountability framework. However, with the State’s share of funding at about 90% of Program resources, it may be advisable for the Legislature to pursue its own accountability framework based on Wyoming-defined Program purpose(s) and performance measurement expectations. The Legislature could consider defining a more specific accountability framework for the Program that targets state interests, including: ▪ Defining the Program purpose and goal(s). ▪ Setting minimum service and/or curricula standards. ▪ Establishing outcome measures specific to Program students’ K-12: o Kindergarten readiness, o Student skills growth, o Student achievement, o Potential service and cost reductions. o Setting clear administrative responsibility and timing for Health and/or Education to analyze and report on regional center compliance and results.

Page 48 February 2018

Agency Response Wyoming Department of Health





Agency Response Wyoming Department of Education





Supplement Early Intervention and Education Program, Phase 2

On July 13, 2017, the Management Audit Committee requested further research on the Early Intervention Education Program (EIEP) prior to the release of the Phase 2 evaluation. Specifically, the Committee requested that its staff review the following:

1. A review of other states' Part C and Part B 619 program organizational structures.
2. Which states require means testing to receive Part C services?
3. What potential savings may be anticipated from implementing means testing for the program?
4. What other state(s) may have a state-only funded Part C equivalent program, and if so, how is it structured and/or administered?
5. A review of the potential impact of EIEP on K-12 special education system costs.
6. What does national literature say about the cost effectiveness of EIEP-equivalent services?

1. Organizational structure and placement of the IDEA Part B 619 and Part C programs

The Early Childhood Technical Assistance Center (ECTA) produced a document in 2016 identifying the lead agencies for the Part C program in each state. The following table depicts the organizational structure of Part B and Part C, along with whether a state provides universal (also referred to as state-funded) preschool. In most states, the state educational entity or an office of early childhood houses Part B 619. Part C, however, is located in a variety of state-level agencies across the country, with a large share located in a department of health. Notably, three states (Nebraska, Pennsylvania, and Vermont) co-lead their early intervention (Part C) programs between the department of education and another agency.


Table 1.1: Organizational Structure of IDEA Part B 619 and Part C

State | IDEA Part B 619 | IDEA Part C | State-Funded Pre-K
Alabama | State Education Agency | Department of Rehabilitation Services | Yes
Alaska | State Education Agency | Department of Health & Social Services | Yes
Arizona | State Education Agency | Department of Economic Security | No
Arkansas | State Education Agency | Department of Human Services/Developmental Disabilities | Yes
California | State Education Agency | Department of Developmental Services | Yes
Colorado | State Education Agency | Department of Human Services | Yes
Connecticut | State Education Agency | Department of Developmental Services | Yes
Delaware | State Education Agency | Department of Health & Social Services | Yes
Florida | State Education Agency | Department of Health | Yes
Georgia | State Education Agency | Department of Health/Public Health | Yes
Hawaii | State Education Agency | Department of Health | No
Idaho | State Education Agency | Department of Health and Welfare | No
Illinois | State Education Agency | Department of Human Services | Yes
Indiana | State Education Agency | Family and Social Services Administration | No
Iowa | State Education Agency | Department of Education | Yes
Kansas | State Education Agency | Department of Health and Environment | Yes
Kentucky | State Education Agency | Department for Public Health (Cabinet of Health and Family Services) | Yes
Louisiana | State Education Agency | Department of Health | Yes
Maine | State Education Agency | Department of Education | Yes
Maryland | State Education Agency | Department of Education | Yes
Massachusetts | State Education Agency | Department of Public Health | Yes
Michigan | State Education Agency | Department of Education | Yes
Minnesota | State Education Agency | Department of Education | Yes
Mississippi | State Education Agency | Department of Health | No
Missouri | State Education Agency | Department of Elementary and Secondary Education | Yes


Table 1.1 (continued)

State | IDEA Part B 619 | IDEA Part C | State-Funded Pre-K
Montana | State Education Agency | Department of Public Health and Human Services | No
Nebraska | State Education Agency | Department of Health and Human Services and Department of Education | Yes
Nevada | State Education Agency | Department of Health and Human Services | Yes
New Hampshire | State Education Agency | Department of Health and Human Services | No
New Jersey | State Education Agency | Department of Health | Yes
New Mexico | State Education Agency | Department of Health | Yes
New York | State Education Agency | Department of Health | Yes
North Carolina | State Education Agency | Department of Health and Human Services | Yes
North Dakota | State Education Agency | Department of Human Services | No
Ohio | State Education Agency | Department of Developmental Disabilities | Yes
Oklahoma | State Education Agency | Department of Education | Yes
Oregon | State Education Agency | Department of Education | Yes
Pennsylvania | State Education Agency | Department of Education and Department of Human Services | Yes
Rhode Island | State Education Agency | Department of Health and Human Services | Yes
South Carolina | State Education Agency | South Carolina First Steps to School Readiness | Yes
South Dakota | State Education Agency | Department of Education | No
Tennessee | State Education Agency | Department of Education | Yes
Texas | State Education Agency | Department of Health and Human Services | Yes
Utah | State Education Agency | Department of Health | No
Vermont | State Education Agency | Department of Education and Department of Human Services | Yes
Virginia | State Education Agency | Department of Behavioral Health and Developmental Services | Yes
Washington | State Education Agency | Department of Early Learning | Yes
West Virginia | State Education Agency | Department of Health and Human Resources | Yes
Wisconsin | State Education Agency | Department of Health Services | Yes
Wyoming | State Education Agency | Department of Health | No
Source: Early Childhood Technical Assistance Center (http://ectacenter.org/partc/ptclead.asp).


In total, eleven states do not have state-funded preschool. When considering possible states for comparison, these would be the most likely to have oversight structures similar to Wyoming's. Table 1.2 provides a summary of those states and the placement of their Individuals with Disabilities Education Act (IDEA) Part C programs.

Table 1.2: States without State-Funded Pre-K and the Location of their IDEA Part B 619 and Part C Programs

State | Location of Part B 619 | Location of Part C
Arizona | State Education Agency | Department of Economic Security
Hawaii | State Education Agency | Department of Health
Idaho | State Education Agency | Department of Health and Welfare
Mississippi | State Education Agency | Department of Education
Montana | State Education Agency | Department of Public Health and Human Services
New Hampshire | State Education Agency | Department of Health and Human Services
North Dakota | State Education Agency | Department of Human Services
South Dakota | State Education Agency | Department of Education
Utah | State Education Agency | Department of Health
Wisconsin | State Education Agency | Department of Health Services (Children's Services Section)
Wyoming | State Education Agency | Department of Health
Source: "A Framework for Choosing a State-Level Early Childhood Governance System," Build Initiative, 2013.

2. States that Require Means Testing to Receive Part C Services

Several states employ means testing to establish eligibility and fee structures. Means testing is defined as a "determination of whether an individual or family is eligible for government assistance, based upon whether the individual or family possesses the means to do without that help." Currently, Wyoming does not require payment of fees for Part C services. Unlike Part B services, which must be provided to children free of charge to their families, Part C provides that eligible children and their families are entitled to receive early intervention (EI) services.
These services are provided according to an individual state's established policies and procedures, which may include a system of payments. States may use multiple payment sources, such as public and private insurance, community resources, or a "sliding scale" of fees according to family income. Certain Part C services are provided at no cost to any family, regardless of income level, including Child Find services, evaluations and assessments, Individualized Family Service Plan (IFSP) development, and service coordination. Table 1.3, below, summarizes the family cost share plans in each state that requires payment and conducts means testing. Table 1.3


is based on a review of each state's data provided to the Early Childhood Technical Assistance Center (ECTA). Most states use a percentage of the federal poverty level, established by the U.S. Census Bureau and used by the U.S. Department of Health and Human Services, as a baseline in determining need.

Table 1.3: States with Implemented Family Fee Schedules for IDEA Part C
(Baseline income is the percent above the federal poverty level, unless otherwise noted.)

State | Authority | Baseline Income | Method | Range of Fees
Alaska | Administrative Code | 115% | Percent of cost per service hour | 10% - 100%
California | Statute | 400% | Sliding scale | 10% - 100%
Connecticut | Statute | Annual Gross Income (AGI) above $45,000 | Monthly fee | $8 - $544
Georgia | Rule | 200% | Monthly fee | $5 - $100
Illinois | Statute | 185% | Monthly fee | $10 - $200
Indiana | Statute | 250% | Fee per 15-minute service unit | $0.75 - $25 per unit
Kentucky | Statute | 200% | Monthly fee | $20 - $100 (entire fee may be waived if the family has private insurance)
Missouri | Statute | 200% | Monthly fee | $5 - $100
New Jersey | Administrative Code | 300% | Fee per service hour | $2 - $100
North Carolina | Policy | 200% | Percentage of cost of services | 20% - 100% of service costs, or 5% of AGI
Texas | Rule | 200% | Monthly fee | $20 - $150
Utah | Rule | 185% | Monthly fee | $10 - $100
Virginia | Statute | 300% | Monthly fee | $0 - $809
Wisconsin | Policy | 200% | Annual fee | $25 - $150
Source: Early Childhood Technical Assistance Center (http://ectacenter.org/topics/finance/familyfees.asp).

3. What potential savings may be anticipated from implementing means testing for the program?

Answering this question would require intensive analysis that is not currently within the capability of LSO's Evaluation Section, as it would require a review of data that is not currently collected. Therefore, to arrive at an estimate of potential savings,


the following information would need to be assembled and analyzed: the number of children living within the specified poverty levels at each center; the sliding-scale fee for each child, based on household income; and the special education costs of each of those children, which would then need to be compared to the funding provided to each center.

Because families are not currently required to report either family size or income, LSO would need to collect annual income information from private citizens that utilize the services. Assuming this data could be collected, the analysis would then require evaluators to review the number of individuals in each household, along with the household income level, to determine the amount of revenue generated through the assessment of fees for the Part C Program. Once these data were collected, an equation multiplying the number of households of a given income and family size by the fee for that income and family size would provide the anticipated revenue. This revenue would represent the potential savings that may be anticipated from implementing means testing for the Part C program.

In 2011, Colorado hired private consultants to attempt this type of analysis and determined that monthly fees could generate $189,827 in monthly gross revenue. However, the consultants provided a caveat that income data for families participating in early intervention services was not collected. Instead, the consultants used public census data "relative to general population income," which represented median income levels, but not the "distribution of income levels by income ranges or quadrants." The Colorado study recommended that:

1. The state structure a monthly participation fee that was not tied to the frequency and intensity of services provided;
2. The billing and collection of fees be centralized, with no more than 15% of revenue collected going to administration costs; and,
3.
The fee schedule be established with two tiers, depending on a family's ability to use private insurance.

Two options were provided to establish a baseline for fee participation: the first was eligibility for public insurance, and the second was eligibility for free or reduced lunch. For FY 2010-11, Colorado provided services to 10,990 Part C eligible children and their families, with expenditures of approximately $29,325,399. The study also reviewed other states to determine the annual revenue generated from Part C fees, as


reflected in the table below. However, it appears the fees are insignificant compared to the cost of the states' programs.

Table 1.4: FY 2010 Summary of States Participating in Interviews and Survey for Colorado Study

State | Baseline Income | Fee Range | Percent of Families in the Fee System | 2010 Revenue | Family Fees as Percent of Total EI (Part C) Revenue
Arizona | 200% | 15% - 100% of costs of services | 30% | Unknown | Unknown
Connecticut | AGI above $45,000 | $8 - $544 | 25% | $1.1M | 2%
Illinois | 185% | $10 - $200 | 27.6% | $3.75M | 3.1%
Massachusetts | 300% | $250 - $1,500 | Approximately 35% | $1.5M | 1.3%
Missouri | 200% | $5 - $100 | 50% | $350,000 | 0.9%
North Carolina | 200% | 20% - 100% of costs of services (includes insurance revenue) | 30% | $830,000 | 1%
Texas | 200% | $20 - $150 | 10% | $362,251 | 0.2%
Utah | 158% | $10 - $100 | 34% | $259,970 | 1.3%
Source: "Analysis of Family Cost Participation Policy: Final Report," Colorado Department of Human Services, Early Intervention Services, December 2011.

While no comparable data are currently available for preschool-aged children in Wyoming, 26% of school-aged children are eligible for free and reduced lunch. Using that percentage as a proxy, the state could anticipate revenue from 74% of children enrolled in IDEA Part C. However, given that all but one of the regional centers in Wyoming are non-government agencies, the collection of such revenue, and its use, could be difficult to implement.

This analysis becomes further complicated when trying to determine cost savings using the existing funding model for the EIEP preschools. As explained in the Phase 1 EIEP report, child services at the preschool level are not reimbursed at actual cost. Rather, the Department of Health provides a contracted grant amount to each regional center based on its child count on December 1 of each year. According to the Department of Health, no data are currently collected on cost per service.
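For illustration only, the revenue-estimation arithmetic described above (number of participating households in each income and family-size bracket, multiplied by that bracket's fee) can be sketched in a few lines of Python. The fee schedule and household counts below are hypothetical placeholders modeled loosely on the monthly-fee structures in Table 1.3; they are not Wyoming data.

```python
# Hypothetical sketch of the means-testing revenue estimate described above.
# The fee schedule and household counts are invented placeholders, not data
# collected for this evaluation.

# Monthly fee by household income bracket, expressed as a percent of the
# federal poverty level (FPL). Each entry is (bracket ceiling, monthly fee).
FEE_SCHEDULE = [
    (200, 0),             # at or below 200% FPL: no fee
    (300, 25),            # 201% - 300% FPL: $25/month
    (400, 60),            # 301% - 400% FPL: $60/month
    (float("inf"), 100),  # above 400% FPL: $100/month
]

def monthly_fee(income_pct_fpl: float) -> int:
    """Return the monthly fee for a household at the given percent of FPL."""
    for ceiling, fee in FEE_SCHEDULE:
        if income_pct_fpl <= ceiling:
            return fee
    raise ValueError("unreachable: last bracket has no ceiling")

def annual_revenue(households: list[tuple[float, int]]) -> int:
    """Sum fee * household count * 12 months across all income brackets.

    households: list of (income as percent of FPL, number of participating
    households at that income level).
    """
    return sum(monthly_fee(pct) * count * 12 for pct, count in households)

# Hypothetical distribution of enrolled households:
example = [(150, 40), (250, 30), (350, 20), (500, 10)]
# 40 households pay $0, 30 pay $25, 20 pay $60, and 10 pay $100 per month,
# so estimated annual gross revenue is 9,000 + 14,400 + 12,000 = $35,400.
```

As the sketch suggests, the estimate is only as good as its inputs: without collected data on family size, income, and participation rates, both the bracket counts and the fee schedule are assumptions.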
Therefore, using a sliding scale or cost-sharing model would be difficult, as there is no way to determine the percentage of services for which families would be responsible. An additional concern is that some families might not screen or enroll their children in services if


costs and fees are involved. Because Part C has been an entitlement since the inception of the program, charging for Part C services may cause a decline in the number of children served, ultimately leading to a decrease in the requested amount of state funding.

4. What other state(s) may have a state-only funded Part C equivalent program, and if so, how is it structured and/or administered?

All states participate in the IDEA Part C early intervention program; there are no state-only programs. Please see Table 1.1 regarding the organizational structure of the program in each state.

5. Potential EIEP impact on K-12 special education system costs

In Wyoming, special education funding is not tracked on a per-student basis in either preschool (Part B 619 or B 611) or the K-12 public education system. The state utilizes a child count to calculate payments to the Part B 619 program; however, there is no current tracking or reporting of the costs for each child. Likewise, special education costs are not tracked per student at the K-12 level. Each school district provides an aggregated account of its special education expenditures; however, those amounts are not itemized by service costs per child.

There is potential to identify the number of students who received preschool special education services and who no longer need special education in the K-12 system. If early intervention is determined to be the root cause of a child no longer needing special education services, then children who exit services prior to, or shortly after, entering the K-12 system could lessen the burden on K-12 special education reimbursement amounts. These numbers are reported in Chapter 3 of the Phase 2 report, under the section labeled "Outcome Concept #4."

Special Education Funding in Other States

There is not sufficient information available to determine the cost or appropriation of state funds for Part B services in other states.
While Wyoming chose to fund special education at 100% reimbursement, the other 49 states use a variety of other, often more complex, funding systems. A study from the National Association of State Directors of Special Education found that at least 12 states now use funding formulas that differentiate funding amounts based either on a student's specific disability or on the educational services the student is provided (Griffith, 2015).

As part of the supplemental information requested by the Management Audit Committee, formal requests were made to the


Early Childhood Technical Assistance Center (ECTA), the Education Commission of the States, and the IDEA Part B 619 coordinators of seventeen selected states8. The requests asked for information on the amount of state funds associated with developmental preschools as overseen by the IDEA Part B 619 or similar programs.

Only two states, Montana and Idaho, were able to provide dollar amounts associated with their programs; however, Montana does not provide any state funding, instead relying on the federal IDEA amount. Idaho expended $2,123,353 in state funds during the 2016-2017 school year. Annually, Idaho produces a report listing the amount of funding, disability type, and the population of IDEA-eligible students. For school year 2014-2015, 4.7% of the state's 3- through 5-year-old population was eligible for IDEA services. The remaining fifteen selected states either did not respond to the request or stated that, because of their educational funding models, the state amount flowing directly to Part B 619 could not be separated from the overall Part B total (which covers ages 3 through 21).

Thirty-two states and the District of Columbia fund students with disabilities through the state's primary education funding formula. Student funding often occurs through weighted, staff-based, or dollar allocations (Millard & Aragon, 2015). Because many states do not separate the Part B 619 (ages 3 to 5) program from the Part B 611 (ages 6 to 21) program as a whole, LSO sought to compare special education funding models for all states. The Education Commission of the States has created a 50-state comparison for special education funding, reproduced in Appendix 3.1 with the author's permission. Five states provide some type of reimbursement to school districts (Michigan 28.6%, Nebraska 51-57%, Vermont 60% of statewide average salaries of special educators, Wisconsin 26.79%, and Wyoming 100%).
Three of those states include student counts as part of that funding; however, Wyoming is not among them (Millard & Aragon, 2015).

6. National literature on early childhood intervention

As part of the supplemental information request, the Committee instructed staff to review national studies to identify any cost-benefit analysis that had been conducted for early intervention services. To that end, evaluators reviewed and reached out to

8 The following states were selected because, like Wyoming, they do not have universal preschool, and/or because of their proximity to, or similar demographics as, Wyoming: Arizona, Colorado, Hawaii, Idaho, Massachusetts, Michigan, Minnesota, Mississippi, Montana, Nebraska, New Jersey, New Mexico, North and South Dakota, Utah, Vermont and Wisconsin.


numerous national organizations focused on early childhood development. While there is a wealth of information showing economic and societal impacts related to early education, the literature consistently concludes that more information needs to be collected to provide meaningful analysis of the impact, social or economic, of early intervention services for children with developmental delays or disabilities. In addition to the studies discussed below, the Early Childhood Technical Assistance Center (ECTA) maintains a library of studies regarding the cost-benefit of early childhood education. However, ECTA provides very little research regarding cost savings to K-12 special education as a result of early intervention.

According to a 2005 study titled "Early Childhood Education for All: A Wise Investment," sponsored by Legal Momentum's Family Initiative and the MIT Workplace Center, 32 states had performed economic impact studies related to early childhood education. The analysis of these reports found that tax dollars invested in early childhood education create economic development opportunities and build an employable workforce (Calman & Tarr-Whelan, 2005).

Similarly, "Early Childhood Interventions" by Karoly, Kilburn, and Cannon found that, for programs serving higher-risk children and families, estimates of per-child benefits range from approximately $1,400 to $240,000, and that for each dollar invested in early intervention programs, the potential return to society ranges from $1.80 to $17.07.
However, the study also stated, "We do not cover programs that are designed primarily to promote children's physical health, nor do we cover programs that primarily serve children with special needs." (Karoly, Kilburn, & Cannon, 2005) This cost-benefit analysis shows that there is a benefit to early childhood education generally; however, it also highlights that more evidence is needed on the benefit of early intervention for children with developmental delays and/or disabilities.

In 2007, the U.S. Office of Special Education and SRI International produced the National Early Intervention Longitudinal Study (NEILS), which examined only Part C of IDEA. This study is perhaps the most comprehensive look at early intervention education; however, it is important to note that it is a decade old. The report found that kindergarten data provided "a glimpse of the early academic and social skills of former EI participants and suggest that some children who received EI services will achieve academic and social success in the future because they are doing quite well as kindergarteners." It also stated that it is "of course, too early at kindergarten to predict the types of long-term academic and social skill outcomes to expect

for the former EI participants." The report found that, with intervention, 63% of children in Part C went on to receive services between ages 3 and 5. In kindergarten, 32% of those children were no longer considered to have a disability and were no longer receiving special education services. Further, 60% of children who had received early intervention services were rated by kindergarten teachers as having behavioral skills commensurate with their peers; 54% were rated as having commensurate social skills. This study reviewed only children who received early intervention within the Part C program, and did not directly link outcomes to Part B services.

Related to cost, the NEILS study stated, "NEILS examined average expenditures for services and how expenditures vary for children with different types of needs. The average total expenditure for the entire length of time the child and family received EI services was $15,740. Given that the average child received services for 17.2 months, the average monthly expenditure was approximately $916. Not surprisingly, expenditures varied for different types of children receiving EI services. For four disability-related categories—risk condition only, communication delay only, developmental delay with no diagnosed condition, and diagnosed condition—the average monthly expenditures for each category were $549, $642, $948, and $1,103, respectively." Not surprisingly, this report also stated, "Future studies need to examine the nature of EI services in far more depth." (Hebbeler, et al., 2007)

In July 2011, the National Center for Education Evaluation and Regional Assistance published a study entitled "National Assessment of the IDEA." For the year 2007, this study reviewed services provided through the Part C early intervention program to 316,730 infants and toddlers from birth through age 2, and through Part B 619 to 700,166 children with disabilities ages 3 through 5.
While this study provided background on the types of services offered through both programs, it contained no cost-benefit analysis (Institute of Educational Sciences, 2011).

In a brief study entitled "Investing in Our Future: The Evidence Base on Preschool Education," the authors found cost savings (e.g., reduced spending for special education and reduced involvement in the child protection, welfare, and criminal justice systems) and greater economic productivity; however, the study looked only at Head Start programs, not early intervention in special education. The study also stated, "More research is needed replicating and extending initial findings of positive effects for children with special needs." (Yoshikawa, et al., 2013)


Even considering the national studies noted above, we cannot provide sufficient information to draw conclusions, consistent with government auditing standards, as to whether EIEP creates savings in the K-12 special education system. In addition, rigorous analysis of the Program's effectiveness, including cost-benefit analysis, is beyond the scope of this type of government program audit.

Bibliography of Public Works

Calman, L. J., & Tarr-Whelan, L. (2005). Early Childhood Education for All: A Wise Investment. New York: Legal Momentum.

Ficano, M., Boivin, N., Rieger, A., & Dempsy, T. (2015). Early Childhood & Preschool First Steps for Special Needs Children. Retrieved August 30, 2017, from NAPSEC.ORG.

Griffith, M. (2015, March 04). A Look at Funding for Students with Disabilities. The Progress of Educational Reform, 16(1), p. 6.

Hebbeler, K., Spiker, J., Bailey, D., Scarborough, A., Mallik, S., Simeonsson, R., et al. (2007). Early Intervention for Infants and Toddlers with Disabilities and Their Families: Participants, Services, and Outcomes. National Early Intervention Longitudinal Study. Menlo Park: SRI International.

Institute of Educational Sciences. (2011). National Assessment of IDEA Overview. Washington, DC: National Center for Education Evaluation and Regional Assistance.

Karoly, L. A. (2016). The Economic Returns to Early Childhood Education. The Future of Children, 26(2), 37-55.

Karoly, L. A., Kilburn, M. R., & Cannon, J. S. (2005). Early Childhood Interventions. Santa Monica: RAND Labor and Population.

Millard, M., & Aragon, S. (2015, June). State Funding for Students with Disabilities. Education Commission of the States, pp. 1-7.

The President's Council of Economic Advisers. (2014). The Economics of Early Childhood Investments. Washington, D.C.: Executive Office of the President of the United States.

Yoshikawa, H., Weiland, C., Brooks-Gunn, J., Burchinal, M., Espinosa, L., Gormley, W. T., et al. (2013). Investing in Our Future: The Evidence Base on Preschool Education. Washington, D.C.: Society for Research in Child Development and Foundation for Child Development.


Appendices



Appendix A Update on Legislative and Agency Actions from Phase 1

Legislative Update

During the 2017 Legislative Session, the Legislature passed original House Bill 211 (House Enrolled Act 73; 2017 Laws, Ch. 103), which amended the process for developmental preschool payment calculations. Under this legislation, in order for children to be included in the state general fund payment calculation, they must have an IEP or IFSP in place on or before December 1 of each year (previously November 1). The Governor signed the bill on March 2, 2017, and the revised requirements took effect July 1, 2017.

Additionally, as required by statute, in the fall of 2016, Health submitted an exception budget request to the Legislature for a 1% External Cost Adjustment (ECA) reduction, as applicable to the K-12 education system. The Legislature accepted the Governor-proposed budget reductions for both Part C and Part B 619, although it added back $2 million for Part C services.

Agency Update

LSO evaluation staff requested updates from Health and Education on what actions, if any, the agencies have taken related to the findings and recommendations presented in the Phase 1 report. The following summarizes the issue areas covered in the report, with corresponding summaries of Health's and Education's updates on progress made or issues considered through May 2017:

Identification Rates and Data Reconciliation

Health: Health states that no specific actions have been taken related to reconciling Part B 619 data reported to Education, pending results of the Phase 2 report. Education has not requested further discussion or validation of data from Health. Health will begin an update to its data validation approach in June 2017, including reviewing its data contractor's summary of Program data prior to its submission to Education. Health will set up a meeting with Education's data team prior to Education's own data validation process.

Education: Education states that it is waiting for the completion of the Phase 2 report to ensure any data validation process with Health aligns with final Phase 2 recommendations. Education notes that if the Program is organizationally moved under Education, this would provide an opportunity to establish clean and consistent data processes and procedures, as are also used with K-12 education system data.

A-1 Early Intervention and Education Program, Phase 2

Statutory Funding Model Implementation and Federal MOE

Health: Health states that there have been no discussions to include Education in the process of reviewing or otherwise overseeing the funding model implementation as it relates to Part B 619. Health and Education have not discussed possible recommendations for changing the Program's statutory funding model.

Health states that Education has encouraged Health staff to receive training on federal maintenance of effort (MOE) requirements through the Center for IDEA Fiscal Reporting (CIFR) and to utilize CIFR's Part B 619 MOE calculator, which has been approved by OSEP. Health attended two trainings in early 2017 with Education that were facilitated by Wayne Ball of CIFR, and Health will continue to use the federally approved MOE calculator for future MOE determinations. As a result of this training, Health requested that regional centers supply personnel vacancy information for FY2017 to assess whether the State meets one of the federal exceptions for reducing its MOE obligations from year to year. Health states it has made receipt of this information a continuing contract expectation so that it will be provided annually. Health's Director's Unit for Policy, Research and Evaluation (DUPRE) is developing a Part C MOE calculator to help with future determinations.

Health states that it has met the federal MOE requirements for both Part C and Part B 619 for FY2017. Health notes that the MOE calculations are not required to be specifically reported to the federal government, or to Education for Part B 619. The federal grant applications specify that each state must provide assurances regarding financial considerations for each program, including assurances regarding the MOE. For the upcoming federal grant period, Health and Education will submit the grant applications for Part C and Part B, respectively, explaining that the State meets the federal financial assurances for the programs for FY2017.
The federal government considers these assurances when awarding funds to the states.

Education: Education concurs that there have been no discussions or actions taken between Health and Education related to Education reviewing or overseeing the State general funds funding model for Part B 619, and confirms that the agencies do not currently recommend changes to the model formula or its implementation. Education states that it is not required to see Health's calculations for the Part B 619 MOE unless Health reports that it has not met the MOE. If the MOE is met, Health continues to provide assurances to receive federal funds. If the MOE is not met, Health must submit to Education its calculations and documentation on how it will improve for the next fiscal year. Education's grant management system (GMS) will not open for federal application submission until July 2017, as federal funds have not yet been released to the states.

A-2 February 2017

Change of Organizational Structure and Plan

Health: Health stated that there have been no substantive changes to the relationship between Education and Program staff. The August 2016 MOU between the agencies remains the basis for the current relationship, under which Education provides oversight and monitoring of Health's Program Part B 619 administrative staff. Health continues to monitor regional centers according to Education's guidance on its data-driven, continuous improvement monitoring protocols.

Health continues to believe that the most appropriate administrative structure for both Part C and Part B 619 is for the programs to be placed under Education's full control, through a detailed transition plan developed prior to amending statutes. Please see Exhibit 1 through Exhibit 3 at the end of this appendix for supporting documents provided by Health specific to Part C transition planning and implementation. If reorganization is to occur, Health recommends planning for both immediate and long-term program needs, with at least one year of transition planning following passage of any legislation to move the programs. Health outlines the following minimum considerations:

1. Immediate Transition Needs:
   a. Begin and maintain ongoing discussions with the federal OSEP regarding transition planning. Health will lead discussions for Part C, while Health and Education will collaborate on discussions for Part B 619. In speaking with OSEP in April 2017, Health states OSEP identified a minimum of 12 months of active planning. The ultimate goal of transition planning is to ensure that the transition has minimal impact on families and children.
   b. During the 2018 Budget Session, the Legislature could revise the language in current W.S. 21-2-701 through 21-2-706 to identify Education as the responsible agency for the Part B 619 program. The Governor could designate Education as the new lead agency for Part C following guidance provided by OSEP.
   c. Health and Education should initiate discussions with the Wyoming Attorney General's Office to plan the legal activities required to implement statutory or other administrative changes. This would include drafting regional center contracts with Education as the administering agency.

2. Longer-Term Transition Needs:
   a. Comprehensive Part C and Part B 619 program reviews through agency and stakeholder (e.g., regional centers, school districts, state agencies) input and revision. Review and revision should encompass, at a minimum, the following:


▪ Discuss transfer with Attorney General staff for both Health and Education, with regular, ongoing meetings planned throughout the transition process.
▪ Legislative review and planning for revisions to the State general funds funding model.
▪ Plan transition requirements identified in Exhibits 1 and 2, including fiscal information and data systems for Part C. Complete close-out fiscal policies as outlined in Transfer of IDEA Part C Funds, page 2, #4 (Exhibit 1).
▪ Legislative revision of Wyoming Statutes, as applicable.
▪ Notification of transition to all families of children enrolled in Part C and Part B 619, with possible revision of IFSPs and IEPs, respectively.
▪ Notification to pertinent state agencies and other Health and Education divisions (e.g., Department of Family Services, Department of Workforce Services, Health's Public Health Division, Medicaid, and Kid Care CHIP programs).
▪ Ensure Least Restrictive Environment (LRE) options/availability, including provision of Part B 619 in school district preschools consistent with Education's Chapter 7: Services for Children with Disabilities rules.
▪ Review all current contracting provisions and identify any possible contract amendments that will need to occur as a result of the transition.
▪ Review and draft revisions to rules, policies, and procedures.
▪ Align regional center preschool standards with school district preschool standards for Part B 619.
▪ Plan for and ensure monitoring provisions are upheld throughout the transition process.
▪ Plan for and ensure required reporting to OSEP.
▪ Plan for the transition of current Program staff, including review and assurance of current Part C and Part B federal grant applications for administration of the programs.
▪ Plan for revisions and determine the local education agency for the Early Hearing Detection and Intervention (EHDI) program, with its three contracts.
▪ Complete an inventory of all equipment and materials that would be transferred.
▪ Draft and plan transition training activities for all Early Intervention Service providers for Part C, and for all Early Childhood Special Education personnel for Part B 619.


Education: Education continues to support organizationally moving the Part B 619 program under its authority. Education emphasizes that any reorganization must include Education receiving the staffing and funding resources needed to operate the program. Education states that it would be much easier to eliminate Health's LEA/IEU role and to administer funds, technical assistance, and professional development directly to the regional centers. Under Education's current administrative structure, the Part B 619 program could be moved under the Division of Individual Learning, but any statutory revisions should not limit future organizational changes if an Office of Early Learning is eventually adopted.

As for Part C, Education believes it should remain in Health, but with the development of a specific collaborative plan. This may include revising statutes and the agencies' rules and regulations, as well as Education working with districts to oversee transitioning students from Part C to Part B 619 at the regional centers and from Part B 619 to the K-12 education system.

As for funding the services, Education states that the IDEA does not contain any regulations that either require or prevent Education (as an SEA) from flowing funds directly to the regional centers. However, W.S. 21-2-705 and 21-2-706 currently require the distribution of federal funds to Health through Education, and require Health to distribute the state general funds to regional centers. Education's recommended options for structuring Part B 619 services are: 1) incorporate the preschools into their respective districts, in which case each school board would administer the program as if it were its own grade; or 2) allow existing regional centers to apply for their federal funds through Education's GMS, where the regional centers could receive funding the same as a district (emulating the current K-12 system). Education does note that a funding model structured to go through the districts would eventually run more efficiently.

Overall, Education expects an eighteen- to twenty-four-month transition planning and implementation timeframe to complete the transition of Part B 619 from Health to Education. Education's general transition activities and assumptions are as follows:

▪ There should be an extensive transition planning period with stakeholder input, to include Education, district personnel, regional centers, and others.
▪ The funding structure should be reviewed and decided upon, including how the regional centers are supported by Education (e.g., regions in each school district; maintaining regions and having them directly apply for Education funds through the GMS; Education contracting through a structure similar to that currently used by Health).
▪ The transition and implementation of the Part B 619 program should maintain the integrity of the "education model" consistent with other Education programs.
▪ A stakeholder input team should review education best practices and other states' models.


Exhibit 1. OSEP Transfer Procedures for IDEA Part C Grant Funds


Source: Wyoming Department of Health


Exhibit 2. OSEP Considerations When Changing Lead Agency Designations Under Part C of IDEA

Source: Wyoming Department of Health


Exhibit 3. Example Governor’s Letter to Federal Government Signaling Change in Part C Lead Agency.


Source: Wyoming Department of Health


Appendix B Phase 2 Data Analysis Methodology

General Methodology

Evaluation staff conducted the Phase 2 evaluation using an approach and resources similar to those employed during Phase 1, including conducting interviews, requesting documents and data, and observing various stakeholders' meetings and conference calls from October 2016 through May 2017. Information on other states and on best practices was also continuously gathered to provide comparisons to current Program structures and practices. Phase 1 documents and information were also reviewed as necessary to revisit information relevant to the Phase 2 project.

However, because the Phase 2 evaluation focused on the performance, outcomes, and impact of the Program, evaluation staff employed two specific research methods to build on work completed during Phase 1. First, evaluation staff accessed primary child and service data from both Health and Education through interagency data sharing agreements, as authorized by IDEA. Please see the following sections for the specific data review, analysis, and quality control steps evaluation staff used to draw the conclusions expressed in this report. Second, evaluation staff contacted each public school district in the state to request feedback on early childhood programs, emphasizing the work of the regional child development centers. This information helped complement regional centers' comments and perspectives obtained during Phase 1. Please see Appendix F for a complete list of survey questions submitted to the school districts for their response.

Wyoming Department of Health Data

This evaluation uses data gathered by the Program through its primary data system, the Special Education Automation Software (SEAS). Health originally purchased this system in 2010, and the regional centers employ it as a case management system to track children and their education plan and service information.
During Phase 1 research, Health informed evaluation staff that this system has limited controls and validation functions to provide consistent assurances of data completeness, accuracy, and reliability. For Phase 1 research, evaluation staff relied on Program staff's extracted, processed, and cleaned data to evaluate child identification and eligibility rates as well as the annual count of children for the statutory funding model. During Phase 2, evaluation staff were able to review more closely the specific reports and data fields on which Phase 2 research and analysis would rely. In conducting this review, evaluation staff found the following example issues specific to data used for Phase 2 analysis:


▪ There is a lack of automatic edits/controls to prevent entry of clearly erroneous data; for example, a service exit date may predate a child's service start date, or a child's active status may not change when an exit date is entered into the system.
▪ Student identification numbers are not automatically generated by the system, and Program staff rely on regional centers' employees to input Health-generated student identifiers.
▪ Some data fields lack clear data entry conditions to prevent irregular coding, such as allowing letters to be entered in fields intended to be numerical (e.g., student identifiers, regional center identifiers).
▪ The student "Name" field contains multiple names (first, last, and middle), which does not allow for consistent, searchable records or cross-referencing with other data; the field also allows input of nicknames, aliases, or supplementary names next to legal names in the same field.
▪ The system appears to lack logical field agreement checks, allowing field values to conflict with or contradict other field values (e.g., a child may still have a service start date even if classified as disqualified from service, as having a parent who refused placement, or as having been determined not to have an educational need for services).

Additionally, due to the system's real-time reporting structure, it is virtually impossible to recreate historic reports for quality control and evaluation purposes. Essentially, the quality of data produced by this system relies on date-certain data extracts saved outside the system (e.g., in Microsoft Excel) and requires multiple staff manipulation steps to manually check and clean records for required reporting and oversight/monitoring activities. Program staff also stated that they must regularly call regional centers to make changes and adjustments to specific records as discrepancies or errors are gradually discovered.
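For illustration, automated edit checks of the kind the report notes are missing could take roughly the following form. The field names (service_start, student_id, and so on) are hypothetical and do not reflect SEAS's actual schema; this is a minimal sketch of the validation concept, not the system's implementation:

```python
from datetime import date

def validate_record(rec: dict) -> list:
    """Return a list of validation errors for one child record (hypothetical fields)."""
    errors = []
    start, exit_ = rec.get("service_start"), rec.get("service_exit")
    # An exit date may not predate the service start date.
    if start and exit_ and exit_ < start:
        errors.append("service_exit predates service_start")
    # An exited child should no longer be marked active.
    if exit_ and rec.get("active"):
        errors.append("record still active despite exit date")
    # Student identifiers must be strictly numeric.
    if not str(rec.get("student_id", "")).isdigit():
        errors.append("non-numeric student_id")
    # A disqualified child should not carry a service start date.
    if rec.get("status") == "disqualified" and start:
        errors.append("disqualified child has service_start")
    return errors

rec = {"student_id": "A123", "service_start": date(2015, 9, 1),
       "service_exit": date(2015, 3, 1), "active": True, "status": "eligible"}
print(validate_record(rec))
```

Checks like these, run at data entry rather than during after-the-fact cleaning, would prevent the erroneous records described above from entering the system in the first place.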
Taking these concerns together, evaluation staff advise that the data analyses performed for and displayed in this report be read with caution. Evaluation staff did utilize several data review and cleaning processes to eliminate clearly unreliable data records before using the data for analyses and for linking with Education's data. Additionally, Health and Education rely on the SEAS system and its data to meet federal reporting requirements. Therefore, with evaluation staff's review and cleaning processes applied, evaluation staff determined that the data used in this report met minimal quality and completeness standards and was reliable enough to provide a baseline outcome and impact analysis for the Legislature. As noted in the report, future analyses of the Program's performance and impact would benefit from the agencies and the Legislature considering advancing the Program's data environment and data system to provide more easily accessible and regular reporting to effectively oversee the Program and regional centers.

Wyoming Department of Education Data

This phase of the report utilized information and data obtained through several school district and early childhood data collections managed and reviewed by Education. These collections included the WDE 684 Student Count and Special Education data files, the 401 special education financial reports, and other data files related to various assessments, such as the Measures of Academic Progress (MAP) assessments, the Proficiency Assessment for Wyoming Students (PAWS), and the Instructional Foundations for Kindergarten (IF-K).

Unlike its review of SEAS data, evaluation staff did not pull raw data from Education's data systems, but instead worked with Education's data unit to target individual child data that had already been reviewed and validated ("cleaned") by the unit. Relying on the data unit's expertise, evaluation staff had Education staff query the various reports to provide cohort-based data files that most closely related to the research objective: children who entered kindergarten during or after the 2011-2012 school year. While evaluation staff were not able to independently validate, verify, or confirm the number of student records or the information contained in the provided data files, evaluation staff did review aggregate data from various Education reports to assess the general accuracy of the data.

Methodology for Reviewing Education Data

The WDE provided Excel files, which it "scrubbed" prior to submission to evaluation staff. The 2011-12 data was very limited, and thus no meaningful analysis of that group could occur. However, the numbers provided to evaluation staff align with enrollment numbers that the Department of Education publicly reported from 2012 forward. The 684 data provided consisted of 164,344 records; the 684 SPED file contains 28,814 records. As shown in Table B.1, below, within this data there are 35,726 unique WISER IDs.
However, within the 684 enrollment files, there were 3,930 children who were marked both true and false for receipt of IDEA services, most likely indicating either that the child was screened for services and found to be ineligible, or that the child returned to the regular classroom sometime in the school year. To identify only WISER IDs that never received IDEA services, evaluation staff removed the 3,930 duplicative WISER IDs from the IDEA "False" column; the remaining WISER IDs represent children who truly never received IDEA services in kindergarten through third grade.

Table B.1
Count of Unique WISER IDs from Education Data

| IDEA Status          | Count of Unique WISER IDs |
| IDEA Services: True  | 7,186*                    |
| IDEA Services: False | 28,540                    |
| Total                | 35,726                    |

*3,930, or 55%, also had an indicator of false at some point in K-3.
Source: Legislative Service Office.

To ensure there were no Group 1 children with SPED files, evaluation staff cross-checked the "IDEA False" category with the SPED 684 files. In total, there were 764 unique WISER IDs marked "false" in the 684 Enrollment File that did have a Special Education File. The Special Education File, or SpED684, requires that an IDEA status be marked. As such, evaluation staff reviewed each of the 764 WISER IDs to identify the accurate IDEA status. There were 38 WISER IDs with multiple statuses depending on the school year and grade of the child; in other words, a child could have been marked "true" in kindergarten and "ineligible" in first or second grade. Table B.2, below, illustrates the math used to ensure an unduplicated count.

Table B.2
Count of Duplicate WISER IDs Between Enrollment and Special Education Data Files

| Status         | Unique WISER Count | Also Marked True | Also Marked True with Exit | Also Marked False | Removed from Total Count to Prevent Duplication |
| True With Exit | 140                | 26               | 0                          |                   | -26                                             |
| True No Exit   | 335                |                  |                            | 0                 |                                                 |
| False          | 6                  | 1                | 0                          |                   | -1                                              |
| Ineligible     | 303                | 9                | 1                          | 1                 | -11                                             |
| Refused        | 18                 | 0                | 0                          |                   |                                                 |
| Total          | 802                | 36               | 1                          | 1                 | 802 - 38 = 764 (Adjusted Total)                 |

Source: Legislative Service Office.

Within the SpED684 file, districts have the option of marking "False," "Ineligible," or "[Parent] Refused services" designations. Table B.3, below, breaks down these categories by grade and school year.

Table B.3
IDEA Marked "False, Ineligible, or Parent Refused" within the SpED684 File and "False" within the 684 Enrollment File

|              | 2011-12 | 2012-13 | 2013-14 | 2014-15 | 2015-16 | Total |
| Kindergarten | 6       | 20      | 39      | 29      | 33      | 127   |
|   False      | 6       |         |         |         |         | 6     |
|   Ineligible |         | 19      | 35      | 25      | 29      | 108   |
|   Refused    |         | 1       | 4       | 4       | 4       | 13    |
| First Grade  |         | 3       | 36      | 21      | 33      | 93    |
|   Ineligible |         | 3       | 36      | 20      | 32      | 91    |
|   Refused    |         |         |         | 1       | 1       | 2     |
| Second Grade |         |         | 2       | 33      | 38      | 73    |
|   Ineligible |         |         | 2       | 30      | 38      | 70    |
|   Refused    |         |         |         | 3       |         | 3     |
| Third Grade  |         |         |         |         | 41      | 41    |
|   Ineligible |         |         |         |         | 41      | 41    |
| Grand Total  | 6       | 23      | 77      | 83      | 145     | 334   |

Source: Legislative Service Office.
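The unduplicated-count step described for Table B.1 amounts to simple set arithmetic on WISER IDs. The following sketch reproduces the reported figures using synthetic IDs; the data layout is illustrative only, not the actual WDE 684 file format:

```python
# Synthetic WISER ID ranges constructed so the counts match the report:
# 7,186 IDs ever marked "True" and 32,470 IDs ever marked "False",
# with an overlap of 3,930 IDs marked both.
true_ids = set(range(0, 7_186))        # WISER IDs ever marked IDEA "True"
false_ids = set(range(3_256, 35_726))  # WISER IDs ever marked IDEA "False"

both = true_ids & false_ids            # children marked both True and False
never_idea = false_ids - true_ids      # children who truly never received services

print(len(true_ids | false_ids))  # 35726 unique WISER IDs overall
print(len(both))                  # 3930 marked both True and False
print(len(never_idea))            # 28540 never received IDEA services
```

The set difference (removing the 3,930 overlapping IDs from the "False" group) is what yields the 28,540 children counted as never having received IDEA services in K-3.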


Three hundred ten (310) student records in this file were marked ineligible, with 108 records indicating that the ineligible finding occurred in kindergarten. Seven (7) children were found ineligible multiple times; for example, one child was evaluated in first, second, and third grade and was found ineligible all three times. While this child did not receive IDEA services from the school district, each evaluation generated a SpED684 file for that school year. Because of the nature of the analysis needed for this report, each WISER ID in question was reviewed to ensure that children who received IDEA services at the K-3 level, and those who did not, were separated into the correct child groups. Such comparison was required to ensure that the student data used from kindergarten through third grade was accurate and that the base for each of the four groups was sound.

Once the K-3 foundation for the groups had been adjusted and corrected so that the SpED file matched the IDEA status within the 684 enrollment data, evaluation staff built queries to link the WISER IDs with test scores. These files were then linked to MAP Assessment scores, IF-K Assessment scores, and PAWS assessment scores.

Merging Health's SEAS Data with Education's Data

After confirming that SEAS data could be connected to the data provided by the Wyoming Department of Education, evaluation staff created a database and ran a series of reports and queries that linked the two agencies' data sets by WISER IDs. Building on the steps taken to eliminate unreliable data from Health, once the database was complete, evaluation staff were able to identify children from the Program in the Wyoming K-12 system. Figure B.1, on the next page, illustrates the overall logic used to divide children's records into the four comparator groups for analyses of academic assessments.
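The group-designation logic summarized above and depicted in Figure B.1 can be sketched as membership tests on WISER IDs across the two agencies' data sets. The record sets and group labels below are hypothetical simplifications for illustration, not the actual LSO database queries or group definitions:

```python
# Hypothetical linked data: which WISER IDs appear in each source.
seas_ids = {"W001", "W002", "W003"}        # children served by the Program (Health SEAS)
k3_idea = {"W002", "W005"}                 # received IDEA services in K-3 (Education WDE 684)
k3_all = {"W001", "W002", "W004", "W005"}  # all K-3 enrollment records (Education)

def group(wiser: str) -> str:
    """Assign a K-3 child to one of four illustrative comparator groups."""
    in_program = wiser in seas_ids
    in_k3_idea = wiser in k3_idea
    if in_program and in_k3_idea:
        return "Program + K-3 IDEA"
    if in_program:
        return "Program only"
    if in_k3_idea:
        return "K-3 IDEA only"
    return "No services"

groups = {w: group(w) for w in k3_all}
print(groups)
```

Once each WISER ID carries a group label, assessment scores (MAP, IF-K, PAWS) can be joined on the same key and compared across groups.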


Figure B.1 LSO Evaluation Logic Flowchart for Child Group Designation

Source: Legislative Service Office.


Appendix C Summary of Federal Outcome Indicators

Table C.1
IDEA Indicators for Part B and Part C

| #  | Part B 619 Indicator (Education)                                           | Part C Indicator (Health)                                                                                                                            |
| 1  | Graduation Rate                                                            | Percent of infants and toddlers who receive EI services on an IFSP in a timely manner                                                               |
| 2  | Drop Out Rate                                                              | Percent of infants and toddlers who receive EI in their home or in programs for typically developing children                                       |
| 3  | Statewide Assessment: participation and performance                        | Child Outcomes                                                                                                                                       |
| 4  | Suspension and Expulsion for students with disabilities                    | Percent of families participating in Part C who report EI services have helped their family                                                         |
| 5  | LRE for children ages 6-21                                                 | Percent of infants and toddlers birth to 1 with IFSPs compared to state data                                                                        |
| 6  | Preschool Environments (FAPE in LRE)                                       | Percent of infants and toddlers birth to 3 with IFSPs compared to state data                                                                        |
| 7  | Preschool Outcomes                                                         | Percent of eligible infants and toddlers with IFSPs for whom an evaluation and assessment and an initial IFSP meeting were conducted within Part C's 45-day timeline |
| 8  | Parent Involvement                                                         | Timely Transitions                                                                                                                                   |
| 9  | Disproportionate Representation (resulting in inappropriate identification) | Percent of noncompliance findings that are corrected within one year by each component of the general supervision system                            |
| 10 | Disproportionate Representation for disability category                    | Timely Reporting                                                                                                                                     |
| 11 | Child Find and Evaluation in 60 Days (WDE uses K-12 only)                  | Evaluations and assessments by qualified personnel, completed in all areas                                                                          |
| 12 | Early Childhood Transition                                                 | Parent receipt of procedural safeguards                                                                                                              |
| 13 | Transition planning on IEP by age 16                                       | Receipt of services by qualified personnel in accordance with the IFSP                                                                              |
| 14 | Post-Secondary Outcomes                                                    | Timely IFSP meeting                                                                                                                                  |
| 15 | Resolution Sessions                                                        | IFSP with measurable outcomes                                                                                                                        |
| 16 | Mediations                                                                 | Percent of children whose IFSPs include a statement/description of the child's developmental status in all areas                                    |
| 17 | State Systemic Improvement Plan                                            | Percent of children whose eligibility determinations included the use of clinical opinion                                                           |
| 18 |                                                                            | Percent of personnel employed by the program and its contractors that meet state personnel standards/qualifications                                 |
| 20 | Timely and Accurate Reporting                                              |                                                                                                                                                      |

Source: Legislative Service Office compilation of Wyoming Departments of Health and Education information.

Part B 619 Indicator Definitions

Indicator 6: Preschool Least Restrictive Environment (LRE)
Indicator 6 is a "results indicator," which focuses on the state serving pre-K students in their least restrictive environment (LRE). The required federal calculation has the State compare the number of children ages three through five years with individualized education programs (IEPs) who: a) attend a regular early childhood program and receive the majority of special education and related services in that regular early childhood program (environment), and b) attend a separate special education class, separate school, or residential facility. The intent of this measure is to ensure that as many children as possible receive services in an inclusive environment with their typically developing peers, on the premise that students progress better in their academic and social-emotional development in such settings.

Indicator 8: Parental Involvement
Indicator 8 is a "results indicator," which values the input and inclusion of parents in their children's educational progress and success. The measure looks at the percentage of parents with a child receiving special education services who report that schools facilitated parental involvement as a means of improving services and results. Education noted that its reporting encompassed children ages three through twenty-one, with no separate reporting for Part B 619 pre-K children and their parents.

Indicator 11: Child Find (Timely Initial Evaluation)
Indicator 11 is a "compliance indicator," which looks at how quickly children are evaluated upon receipt of parental consent to do so. This measure is based on the federal requirement to evaluate children with suspected disabilities within sixty (60) days of parental consent (or within the timeframe required by a state, if less than sixty days).
The intent of this measure is to ensure children are evaluated in a timely fashion and, if found to need services, enter services as quickly as possible after disability identification and service needs planning.

Indicator 12: Early Childhood Transition
Indicator 12 is a "compliance indicator," which gauges how successful states are at transitioning Part C students into the Part B 619, or pre-K, program. There are several sub-metrics for this overall indicator:

▪ Number of children who have been served in Part C and referred to Part B for eligibility determination


▪ Number of those referred determined to be NOT eligible
▪ Number of those found eligible who have an IEP developed and implemented by their third birthday
▪ Number of children for whom parental refusal to provide consent caused delays in evaluation or initial services
▪ Number of children referred to Part C less than 90 days before their third birthday

Indicator 17: State Systemic Improvement Plan
Indicator 17 is a "results indicator," which encompasses the State Systemic Improvement Plan (SSIP). The SSIP is measured by State-identified Measurable Result (SiMR) data intended to help the state and federal government implement interventions to improve student results. The central focus of Education's SSIP is on the K-12 public education system and is defined as: "An increase in the percentage of third-grade students with disabilities who spend 21 to 60 percent of their school day outside the general education environment, who score proficient or advanced on the statewide reading assessment."

Indicator 6: Preschool Least Restrictive Environment (LRE)

Table C.2, below, summarizes Education's reporting on this indicator from the 2014 APR, which shows past performance and intended targets for the future. Wyoming has generally met its targets for this indicator.

Table C.2
Indicator 6: Preschool Least Restrictive Environment

| Measure      | FFY        | 2011 (Baseline) | 2012   | 2013   | 2014   | 2015   | 2016   | 2017   | 2018   |
| 6-A Regular  | Target (≥) |                 | 60.34% | 61.48% | 61.73% | 61.98% | 62.23% | 62.48% | 62.73% |
| 6-A Regular  | WY Data    | 59.84%          | 60.45% | 61.48% |        |        |        |        |        |
| 6-B Separate | Target (≤) |                 | 31.30% | 29.01% | 28.76% | 28.51% | 28.26% | 28.01% | 27.76% |
| 6-B Separate | WY Data    | 30.80%          | 30.94% | 29.01% |        |        |        |        |        |

National Average FFY2014: 6-A = 50%; 6-B = 21%
6-A: Regular early childhood program and receiving the majority of special education and related services in the regular early childhood program
6-B: Separate special education class, separate school, or residential facility
Sources: WY Part B SSP/APR and 2016 Part B 2014 SPP/APR Indicator Analysis Booklet


Indicator 8: Parental Involvement

Table C.3, below, summarizes Education's reporting on this indicator from the 2014 APR, which shows past performance and intended targets for the future. Wyoming did not meet its target for FFY2012, but did meet the target the following year.

Table C.3
Indicator 8: Parental Involvement

| FFY    | 2011   | 2012   | 2013   | 2014   | 2015   | 2016   | 2017   | 2018   |
| Data   | 79.85% | 70.71% | 74.61% |        |        |        |        |        |
| Target | NR     | 80.35% | 74.61% | 75.14% | 75.14% | 75.39% | 75.64% | 75.89% |

National Average FFY2014 = %
Source: WY Part B SSP/APR and 2016 Part B 2014 SPP/APR Indicator Analysis Booklet. NR = Not Reported

Indicator 11: Child Find (Timely Initial Evaluation)

Table C.4, below, summarizes Education's reporting on this indicator from the 2014 APR, which shows past performance and intended targets for the future. Wyoming has generally met its targets for this indicator.

Table C.4
Indicator 11: Child Find (Timely Initial Evaluation)

| FFY    | 2011   | 2012   | 2013   | 2014   | 2015 | 2016 | 2017 | 2018 |
| Data   | 97.76% | 97.69% | 98.22% | 98.57% |      |      |      |      |
| Target | 100%   | 100%   | 100%   | 100%   | 100% | 100% | 100% | 100% |

National Average FFY2014 = 98%
Sources: WY Part B SSP/APR and 2016 Part B 2014 SPP/APR Indicator Analysis Booklet.

Indicator 12: Early Childhood Transition

The overall calculation requires the following:

Numerator: Number of Part C children found eligible for Part B with an IEP implemented by the 3rd birthday

Denominator: (Part C children referred to Part B) minus (Part C children found NOT eligible) minus (Part C children whose parents refused evaluation/service consent) minus (children referred to Part C within ninety (90) days of the 3rd birthday)

The quotient is multiplied by 100.
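The Indicator 12 calculation above can be expressed as a short function. The counts in the usage example are made up for illustration; only the formula itself comes from the federal measurement:

```python
def indicator_12(referred, not_eligible, parent_refused, late_referrals,
                 iep_by_3rd_birthday):
    """Percent of Part C children found eligible for Part B whose IEP
    was implemented by their third birthday (IDEA Part B Indicator 12)."""
    # Children found not eligible, whose parents refused consent, or who
    # were referred late are excluded from the denominator.
    denominator = referred - not_eligible - parent_refused - late_referrals
    return 100 * iep_by_3rd_birthday / denominator

# Hypothetical counts: 200 referred, 40 not eligible, 5 parental refusals,
# 15 late referrals, and 140 children with an IEP in place by the 3rd birthday.
print(indicator_12(200, 40, 5, 15, 140))  # 100.0
```

Because the exclusions shrink the denominator, a state can reach 100% even when some referred children never received an IEP, which is why the sub-metrics listed earlier are reported alongside the headline percentage.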

C-4 February 2017

Table C.5, on the next page, summarizes Education's reporting on this indicator from the 2014 APR, which shows past performance and intended targets for the future. Wyoming has generally met its targets for this indicator.

Table C.5 Indicator 12: Early Childhood Transition

FFY       2011     2012     2013      2014      2015     2016     2017     2018
Data      94.40%   95.70%   100.00%   100.00%
Target    100%     100%     100%      100%      100%     100%     100%     100%

National Average, FFY2014: 97%

Sources: WY Part B SSP/APR and 2016 Part B 2014 SPP/APR Indicator Analysis Booklet.

Indicator 17: State Systemic Improvement Plan (SSIP)
Education created baseline data for this measure in 2013. According to Education's FFY2014 report, this measure was defined through stakeholder input, and improvement strategies were based on identified strengths, barriers, and challenges associated with improving the reading performance of preschool and early elementary-aged students. However, developmental preschools were not included in this goal. Education staff explained that it plans to begin including regional centers in this analysis/measure in the next couple of years. Table 1.5, below, summarizes Education's reporting on this measure for the SSIP for FFY2014, with targets noted for each subsequent year through FFY2019.

Table 1.5 Percentage of Students Scoring Proficient/Advanced on the Statewide Reading Assessment

FFY        Target   # Students   # Students Scoring Proficient   Status
2013-14    4.4%     295          13                              Actual
2014-15    4.4%     295          13                              Proposed
2015-16    4.8%     295          14                              Proposed
2016-17    5.2%     295          15                              Proposed
2017-18    6.0%     295          18                              Proposed
2018-19    8.4%     295          25                              Proposed

Sources: WY Part B SSP/APR and 2016 Part B 2014 SPP/APR Indicator Analysis Booklet.
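The relationship between the percentage targets in Table 1.5 and the projected student counts can be checked with a short script. This is a minimal sketch assuming, as the table does, a constant cohort of 295 students and rounding to the nearest whole student; the function name is our own, not from the report:

```python
# Projected number of proficient/advanced students implied by each
# SSIP target in Table 1.5, assuming a constant cohort of 295 students.
COHORT = 295

TARGETS = {
    "2013-14": 4.4,
    "2014-15": 4.4,
    "2015-16": 4.8,
    "2016-17": 5.2,
    "2017-18": 6.0,
    "2018-19": 8.4,
}

def students_scoring_proficient(target_pct: float, cohort: int = COHORT) -> int:
    """Convert a target percentage into a whole-student count."""
    return round(cohort * target_pct / 100)

for year, pct in TARGETS.items():
    print(f"{year}: {pct}% of {COHORT} -> {students_scoring_proficient(pct)} students")
```

Each computed count matches the "# Students Scoring Proficient" column in Table 1.5 (e.g. 4.4% of 295 rounds to 13), which confirms the table's internal arithmetic.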



Appendix D Example Part C Regional Center Report Card

Table D.1 Example Part C Report Card Template

Region 1: Child Development Center Performance Report
For Federal Fiscal Year 2014 (FFY: July 1, 2014 through June 30, 2015) for Part C of IDEA
Infants and Toddlers with Disabilities (birth to age 3 years)

For each indicator, the template provides columns for: 2014-15 Target; 2014-15 State Rate; 2014-15 Region 1 Rate; 2013-14 Region 1 History of Performance; and Did Region Meet Target? (Y/N). Percent placeholders (%) mark where each rate is reported.

Indicator 1, Timely Services: Percent of infants and toddlers with IFSPs who receive the early intervention services on their IFSPs in a timely manner.

Indicator 2, Natural Environment: Percent of infants and toddlers with IFSPs who primarily receive early intervention services in the home or community-based settings.

Indicators 3a-3c, Child Outcomes: Percent of infants and toddlers with IFSPs who demonstrate improved:
3a. Positive social-emotional skills (including social relationships);
3b. Acquisition and use of knowledge and skills (including early language/communication); and
3c. Use of appropriate behaviors to meet their needs.
Each child outcome is reported as two summary statements:
Statement 1: Of those children who entered the program below age expectations, the percent who substantially increased their rate of growth by the time they exited.
Statement 2: Percent of children who were functioning at a level comparable to same-aged peers by the time they exited.

Indicators 4a-4c, Family Outcomes: Percent of families participating in Part C who report that early intervention services have helped the family:
4a. Know their rights;
4b. Effectively communicate their child's needs; and
4c. Help their children develop and learn.

Indicator 5, Served, birth to 1: Percent of infants and toddlers birth to 1 with IFSPs, compared to national data.

Indicator 6, Served, birth to 3: Percent of infants and toddlers birth to 3 with IFSPs, compared to national data.

Indicator 7, Evaluation in 45 days: Percent of eligible infants and toddlers with IFSPs for whom an evaluation and assessment and an initial IFSP meeting were conducted within Part C's 45-day timeline.

Indicators 8a-8c: Percent of all children exiting Part C who received timely transition planning to support the child's transition to preschool and other appropriate community services by their third birthday, including:
8a. IFSPs with transition steps and services;
8b. Notification to the Local Education Agency (LEA), if the child is potentially eligible for Part B; and
8c. A transition conference, if the child is potentially eligible for Part B.

Indicator 9, Part C Monitoring System: General supervision system (including monitoring, complaints, hearings, etc.) identifies and corrects noncompliance as soon as possible, but in no case later than one year from identification.

Indicator 14a, Data Timeliness: Percent of state-reported data that are timely.

Indicator 14b, Data Accuracy: Percent of state-reported data that are accurate.

Comments:

Appendix E Example Best Practices for Data System Development

Table E.1 DaSy9 Framework: Purpose and Vision

Quality Indicator: Articulate the purpose and vision of the data system
▪ Formal written statement or embedded in other documents
▪ Obtain input from staff and stakeholders
▪ Readily accessible (e.g. available online)
▪ Reviewed and revised, as necessary
▪ Guides decision-making (e.g. who uses the system)

Quality Indicator: Includes intent and goals for the data system
▪ Addresses state and federal reporting and data requirements
▪ Addresses accountability, improvements, and operations
▪ Addresses linking Part C and Section 619 data
▪ Addresses linking data to broader SLDS efforts

Source: DaSy Framework.

Table E.2 DaSy Framework: Data Governance and Management

Authority and Accountability: focused on structures, responsibility, and oversight

Quality Indicator: Delineates appropriate decision-making authority and accountability
▪ Formalize data governance structure
▪ Include representatives from state programs
▪ Oversee data collection and maintenance; ensure adherence to policies
▪ Written statements delineate decision-making authority

Quality Indicator: Clearly established decision-making authority, roles, and responsibilities
▪ Clearly assign data-related responsibilities
▪ Explain decision-making authority to staff and stakeholders
▪ Review and revise policies, as needed, with stakeholder input
▪ Define relationship between data governance structure and others
▪ Process for staff to recommend policy changes

Quality Indicator: Proper authority granted to implement policies
▪ All requirements are clearly defined
▪ Approve proposed IT changes prior to implementation
▪ Review and revise in response to state and federal changes
▪ Explain policies and procedures to staff and stakeholders

Quality and Integrity: focused on validity, reliability, accuracy, consistency, and data use

9 DaSy = Center for Early Childhood Data Systems


Quality Indicator: Develop and implement procedures to ensure data quality and integrity
▪ Align data system with purpose and vision
▪ Procedures in place to ensure validity of data
▪ Point of contact for each transfer or exchange
▪ Develop data quality and integrity procedures
▪ Provide training for those who collect, maintain, or receive data
▪ Require supporting documentation to ensure interoperability
▪ Ensure adherence to policies and procedures
▪ Regularly review and revise, as needed

Quality Indicator: Implement monitoring procedures to ensure consistency
▪ Communicate regularly with users about policies and procedures
▪ Monitor implementation of policies and procedures
▪ Provide orientation training for state and local data managers
▪ Create and maintain standardized training materials
▪ Ensure adherence to policies when data is exchanged or transferred
▪ Policies should be reviewed and revised, as needed

Security and Access: focused on protection of data (e.g. from loss or contamination)

Quality Indicator: Develop and implement procedures to ensure security of data from breach or loss
▪ Security policies should be in place and available
▪ Adhere to federal, state, and local laws, regulations, and standards
▪ Apply policies to all data collected, maintained, or used
▪ Document data system operations, to at least include the following:
  o Person(s) responsible for data security
  o Data training for authorized users
  o Data storage method
  o Data back-up and recovery
  o Data transference (e.g. email, etc.)
  o Data encryption
  o Data destruction
  o Employee use of program equipment and personal devices
▪ Periodic training for those who collect, maintain, or receive data
▪ Ensure adherence to policies when state data is transferred or exchanged
▪ Policies should be regularly reviewed and revised, as needed

Quality Indicator: Develop and implement procedures for authorized access
▪ Access policies should be in place and available
▪ Adhere to all federal, state, and local laws, regulations, and standards
▪ Policies applied to all data collected, maintained, or used
▪ Require internal data users to participate in relevant training
▪ Routinely monitor and test data system to ensure effective and consistent implementation of policies and procedures
▪ Policies should be regularly reviewed and revised, as needed

Quality Indicator: Implement procedures to maintain and address security and access
▪ Communicate regularly with users about policies and procedures
▪ Monitor the implementation of policies and overall security of data
▪ Monitor all data users to ensure adherence to policies
▪ Develop relevant training materials
▪ Require all users to demonstrate knowledge about security and access policies and procedures
▪ Review and revise training materials, as needed
▪ Policies should be reviewed and revised, as needed

Source: DaSy Framework.

Table E.3 DaSy Framework: Stakeholder Engagement

Leading Part C/619 Data System Stakeholders

Quality Indicator: Identify affected stakeholder groups and individuals
▪ Establish purposes for engaging stakeholders
▪ Identify individuals to represent different types of stakeholders
▪ Articulate expectations for stakeholder involvement
▪ Periodically review stakeholder representation

Quality Indicator: Provide opportunities for stakeholder input
▪ Use multiple methods to maximize opportunities
▪ Provide stakeholders with necessary information to provide input
▪ Methods for input should be reviewed and revised, as needed
▪ Review stakeholder input to guide decision-making

Quality Indicator: Notify stakeholders of decisions made
▪ Use multiple methods to communicate decisions
▪ Notification methods should be reviewed and revised, as needed

Part C/619 Participating as Stakeholders in Integrated Data System Initiatives

Quality Indicator: Engage stakeholders in integrated data system initiatives
▪ Know stakeholder role in the integrated data system initiative
▪ Participate as an active stakeholder (e.g. attend and actively participate in meetings)

Source: DaSy Framework.

Table E.4 DaSy Framework: System Design and Development

Initiation of New System/Enhancement and Requirements Analysis

Quality Indicator: Active involvement in initiating development or enhancement
▪ Provide input to determine project team roles and responsibilities
▪ Review the high-level plan to ensure goals and needs are met
▪ Provide input on the development/enhancement
▪ Provide input on the plan, schedule, and system requirements

Quality Indicator: Active involvement in development of business requirements, process models, and data models
▪ Actively define, review, and revise business requirements
▪ Work with IT team to create work process models that reflect appropriate program, processes, and language
▪ Create data models that reflect program language
▪ Solicit end user input
▪ Reconcile process and data models with business requirements
▪ Clear process for the approval of the final business requirements

Quality Indicator: Define requirements (i.e. identify what the system must do)
▪ Identify features and functions of the data system/enhancement
▪ List required features and functions, identify scope
▪ Prioritize business requirements
▪ Address technical requirements that operate in the background
▪ Diagram or describe work processes and work flows
▪ Break down work processes and work flows into manageable functions and subfunctions
▪ Identify all data needs for reporting, accountability, program improvement, and program operations
▪ Identify data elements, characteristics, and relationships
▪ Develop an initial data dictionary

Quality Indicator: Capacity to support accountability, improvement, and operations
▪ Includes specific data elements (see Attachment A, pg. 37 – 39)
▪ Track data for children that move from one program to another
▪ Built-in edit-check routines
▪ Reports in place to assess data quality (e.g. error reports)
▪ Controls in place (e.g. require strong passwords)
▪ Embedded supports and training materials for end users
▪ Reporting and analysis tools, with ease of access to data
▪ Automated functions that support program practices
▪ Security measures that allow compliance with federal, state, and local privacy requirements, including:
  o Data back-up and recovery
  o Data storage
  o Data encryption
  o Proper destruction of data
  o Secure transmission of data
▪ Allow for selected modifications with little or no reliance on IT
▪ Capacity to link child-level data elements (e.g. outcomes)
▪ Capacity to link child-level data with service provider data
▪ Capacity to link child-level data with program data
▪ Capacity to link service provider data with program data
▪ Capacity to link family outcomes with child outcomes
▪ Able to track entries/changes made by users and who made them
▪ System has interoperability

System Design and Development

Quality Indicator: Work with IT team to design the system
▪ Work with IT as decisions are made about technical architecture
▪ Work with IT to review, refine, and approve mock-ups
▪ Work with IT to develop data dictionary
▪ Work with IT to refine data system requirements

Quality Indicator: Work with IT team as they build and test
▪ Test modules as they develop until functioning as intended
▪ Communicate with IT to ensure adequate system performance
▪ Require technical documentation, including instructions

System Acceptance and Deployment

Quality Indicator: Prepare for, communicate about, and conduct system acceptance testing prior to deployment
▪ Select representative end users for acceptance testing
▪ Collaborate with IT to create acceptance testing plan
▪ Staff prepare materials and feedback mechanism
▪ Ensure that legacy and new data are processed together
▪ Conduct acceptance testing, process feedback, and communicate findings with IT
▪ Adjust plans, as needed
▪ Repeat system acceptance testing, as needed

Quality Indicator: Participate in creating, reviewing, and revising materials
▪ Ensure data dictionary is reviewed and revised, as needed
▪ Participate in creating and updating system materials
▪ Ensure changes to materials are communicated to Help Desk
▪ Update materials based on acceptance testing and feedback

Quality Indicator: Work with IT to deploy data system
▪ Create deployment plan, guidelines for transition, and schedule
▪ Communicate deployment plan to necessary parties
▪ Ensure end user support
▪ Confirm contingency plans exist for problems
▪ Coordinate with IT team to release new data
▪ Coordinate with IT to transition responsibilities
▪ Coordinate with IT team on retirement of legacy system

Source: DaSy Framework.

Table E.5 DaSy Framework: Data Use

Planning for Data Use

Quality Indicator: Plan for data analysis, product development, and dissemination
▪ Develop recommendations for effective data use
▪ Identify potential data users and periodically gather information about specific data needs
▪ Consider accountability and program improvement questions
▪ Have a process to prioritize data requests in a timely fashion
▪ Dissemination focuses on products, methods, and timelines
▪ Review and revise plans for data analysis, product development, and dissemination, as necessary

Analyzing and Disseminating for Data Use

Quality Indicator: Conduct data analysis activities and implement procedures to ensure data integrity
▪ Analyze data to address accountability and improvement needs
▪ Prioritize and respond to various types of data requests
▪ Develop documentation of the specifications to answer specific questions, and update documents as needed
▪ Implement procedures to ensure that data, as queried and reported, are accurate

Quality Indicator: Prepare data products to inform decision-making and promote understanding
▪ Prepare a variety of data products
▪ Include documentation in data products needed for accurate interpretation and use of information
▪ Ensure that personally identifiable information is protected
▪ Use a variety of approaches and displays
▪ Evaluate data products and information; revise as needed

Quality Indicator: Disseminate data products to users
▪ Use a variety of methods to disseminate data products
▪ Disseminate data products with sufficient information
▪ Verify accuracy of the data prior to the release of data products
▪ Periodically evaluate the effectiveness of dissemination strategies

Using Data and Promoting Capacity for Data Use

Quality Indicator: Use data to inform decisions
▪ Use subgroup analysis to facilitate interpretation of the data
▪ State and local staff systematically review the findings of data analysis, interpret findings, and make decisions based on data
▪ Evaluate data use at the state and local levels to support accountability, program improvement, and program operations

Quality Indicator: Support the use of data at state and local levels
▪ Provide multiple resources and tools
▪ Assess professional development needs
▪ Provide professional development to support state and local users
▪ Evaluate the effectiveness of professional development activities

Source: DaSy Framework.

Table E.6 DaSy Framework: Sustainability

Quality Indicator: Systematic process, including stakeholder input, to identify enhancements
▪ Ensure the criteria are meeting the needs of stakeholders
▪ Collect and analyze data on the identified criteria
▪ Use results of the analysis to identify needed improvements
▪ Verify that potential improvements align with the purpose and vision
▪ Have a process for initiating changes
▪ Monitor that the data system is up-to-date

Quality Indicator: Generate political and fiscal support to maintain and enhance the data system
▪ Articulate to decision-makers the benefits of the data system and need for improvements
▪ Work with state leadership to identify needed resources
▪ Promote the use of data-informed decision-making for continuous program improvement
▪ Plan for and address transfer of knowledge
▪ Promote participation in integrated and/or linked data systems

Source: DaSy Framework.

Figure E.1 Early Childhood Technical Assistance Center Data System Framework

Source: The Early Childhood Technical Assistance (ECTA) Center, A System Framework

Similar to the DaSy Framework, each component has several subcomponents, quality indicators, and elements of quality "that specify what needs to be in place to support a high-quality Part C/Section 619 System." As the ECTA Center Framework is focused on a programmatic approach, only a summary of each component is provided below:

▪ Governance (GV). Governance ensures that there is an "established enforceable decision-making authority to effectively implement the statewide system and that leadership advocates for and leverages sufficient fiscal and human resources to support quality services throughout the state." Governance includes creating policies, state laws, regulations, interagency agreements, and other enforceable measures, as needed. The GV subcomponents are vision, mission and/or purpose; legal foundations; administrative structures; and leadership and performance management.

▪ Finance (FN). Resource and funding decisions "are influenced by federal, state, and local guidelines for use of funds, political will and identified need. As a result, state systems need to be current on service utilization data, demographics of children served and opportunities for collaboration and alignment with other early care and education programs serving the same population …This information should directly inform decisions regarding which resources to pursue (procurement), and how they should be allocated, used, and disbursed." The FN subcomponents include finance planning process/forecasting; fiscal data; procurement; resource allocation; use of funds and disbursement; and monitoring and accountability of funds and resources.

▪ Personnel/Workforce (PN). The Early Childhood Personnel Center worked with the ECTA Center to create the information in this component. This component is intended to assist states in the "planning, development, implementation, and evaluation of a comprehensive system of personnel development…[and] is the primary mechanism by which the state ensures that infants, toddlers, and young children with disabilities and their families, are provided services by knowledgeable, skilled, competent, and highly qualified personnel." The PN subcomponents include leadership, coordination, and sustainability; state personnel standards; preservice personnel development; in-service personnel development; recruitment and retention; and evaluation.

▪ Data System (DS). This component is used to assist Part C and Section 619 programs in developing and enhancing high-quality data systems and in improving the quality of their data. The framework is intended to help state staff understand the characteristics and capabilities of their data system so they may lead and participate in development efforts and cross-agency work, and answer important program and policy questions. The DS subcomponents include stakeholder engagement, system design and development, data use, and sustainability.

▪ Accountability & Quality Improvement (AC). This component helps guide states "in an ongoing process of reviewing and evaluating the Part C and Section 619 systems to identify areas for statewide improvement…[because] true accountability holds states responsible for a sustainable process that ensures ongoing quality and improvement." The AC subcomponents are planning for accountability and improvement; collecting and analyzing performance data; and using results for continuous improvement.

▪ Quality Standards (QS). The QS component serves as a guide for states in "an ongoing process of evaluating the quality of their programs and services within the context of the larger early care and education community, to ensure continuous program improvement and to develop more effective, efficient systems that support enhanced child and family outcomes." The QS subcomponents include both child-level standards and program-level standards.


Appendix F Evaluation Staff School District Survey on Early Intervention and Education Program

In total, of the forty-eight possible responses (one for each district), we received 36 responses from 33 of the 48 districts, a 68.75 percent district response rate. The survey was open for district responses from February 15 through March 17, 2017. Figure F.1, below, shows the Wyoming school districts that responded, by Program region. Figure F.1 Wyoming School Districts Response to LSO EIEP Evaluation Survey

Source: Legislative Service Office from U.S. Census information.


The following represents the contents of the survey sent to each Wyoming school district.

Introduction

The Wyoming Legislature's Management Audit Committee (Committee) has authorized the Program Evaluation Section of the Legislative Service Office (evaluation staff) to conduct a program evaluation (or performance audit) of the Wyoming Early Intervention and Education Program, commonly known as the developmental preschool program, administered by the Wyoming Department of Health. The central question the Committee wants addressed is how the Program performs (its outcomes) and its overall impact on children and the State. Therefore, as part of our research, we would like to offer school districts the opportunity to provide comments, feedback, or other information that will assist us. To clarify, we are not evaluating school districts with this project.

In completing this survey, please take into account any issues you believe are important for us to know, where applicable and/or outlined in the survey. We also welcome additional comments, explanations, and information for any question in the survey, as well as in the general comments box at the end of the survey. Please be as clear and as complete as possible in your responses. This will help us keep our process from disturbing your schedule and activities as much as possible.

Your responses to our survey will be confidential and will not be forwarded to or reviewed by the Wyoming Department of Health, the Wyoming Department of Education, the Legislature, or the regional child development center preschools. We may aggregate the survey's results in our final report, but we will ensure that published information cannot be traced back to the original respondents. If you would like to discuss any additional concerns you have regarding this evaluation or the survey questions, please contact Michael Swank at [email protected] or 307-777-7881.
We understand this survey will require your valuable time, and we appreciate your response. We hope this chance to directly address the Legislature on this important issue is worth your effort. For information about the Legislative Service Office, including the Program Evaluation Section, please visit our website: http://legisweb.state.wy.us.

Base Questions

1. Please provide your name and position or title.
   Name:
   Position:
   District ID (select one):

2. With which regional child development center does your district work most often? (select one)
   a. Please briefly explain your understanding of the central purpose and/or goal of the regional child development centers.

3. How important are the regional child development centers in preparing children with disabilities for kindergarten? Please briefly explain the main reason(s) for your response.


4. Would you rate the regional developmental preschool center with which your district works most often as a "high quality" early learning program? Please briefly explain your definition of what "high quality" means or includes.

5. Does your school district support (fund) its own preschool program?
   a. If you responded "Yes" to Question 5, does the district-supported preschool program serve children with disabilities?

6. Does your district use the IF-K (Instructional Foundations for Kindergarten) assessment on incoming kindergarten and/or preschool students? If yes, please summarize which staff commonly administer this assessment to students.
   a. If you answered "No" to Question 6, do you use a different kindergarten readiness assessment?
   b. If you answered "Yes" to Question 6, when is the assessment administered to students?
   c. If you answered "Yes" to Question 6, how long does it take to administer the assessment to each student?

7. Please describe any concerns or reservations your district may have with using the IF-K assessment tool.

8. Please identify one or more examples of early childhood or kindergarten readiness standards used by your district. Please provide a brief explanation of the author or sponsor of the standards and a summary of why these standards are used by the district.

9. How are standards and expectations for kindergarten preparedness shared with regional child development centers within your district?

10. Please describe the most common circumstances or reasons where your district's and the regional child development centers' staff reach different conclusions on a child's special education needs when transitioning from preschool (IDEA Part B 619) into K-12 special education services.

Additional Questions

11. Please describe the current, or standard, coordination and collaboration process employed by your school district when children from the regional child development centers are expected to transition to the district (e.g. timelines, district staff attend preschool IEP meetings prior to transition, district staff observe services provided at the centers to transitioning students, etc.).

12. Please list up to three of the most common challenges confronted by your district, if applicable, when working with the regional child development centers to transition children to the school district.
    Challenge 1:
    Challenge 2:
    Challenge 3:


13. Please list up to three of the most common positive attributes your district encounters, if applicable, when working with the regional child development centers to transition children to your school district.
    Attribute 1:
    Attribute 2:
    Attribute 3:

14. Please provide any additional information you believe the Wyoming Legislature should know about the Early Intervention and Education Program and regional child development centers that would improve the system and services.


Recent Program Evaluations State Park Fees ...... May 2001 Childcare Licensing ...... July 2001 Wyoming Public Television ...... January 2002 Wyoming Aeronautics Commission ...... May 2002 Attorney General’s Office: Assignment of Attorneys and Contracting for Legal Representation ...... November 2002 Game & Fish Department: Private Lands Public Wildlife Access Program ...... December 2002 Workers’ Compensation Claims Processing ...... June 2003 Developmental Disabilities Division Adult Waiver Program ...... January 2004 Court-Ordered Placements at Residential Treatment Centers ...... November 2004 Wyoming Business Council ...... June 2005 Foster Care ...... September 2005 State-Level Education Governance...... December 2005 HB 59: Substance Abuse Planning and Accountability ...... January 2006 Market Pay for State Employees ...... July 2006 Wyoming Drug Courts ...... July 2006 A&I HRD Role in State Hiring ...... December 2006 Kid Care CHIP: Wyoming’s State Children’s Health Insurance Program ...... June 2007 Wyoming Retirement System: Public Employee Plan ...... August 2007 WYDOT and General Fund Appropriations for Highways ...... May 2008 Wyoming Child Protective Services ...... September 2008 Department of Fire Prevention and Electrical Safety ...... December 2008 Office of Health Care Licensing and Surveys ...... July 2009 Victim Services Division: Phase I ...... August 2009 Victim Services Division: Phase II ...... February 2010 Reading Assessment and Intervention Program ...... February 2010 Office of State Lands & Investments: Management of State Trust Lands ...... June 2010 Proficiency Assessments for Wyoming Students (PAWS) ...... December 2010


Wyoming Unemployment Insurance Program ...... December 2010 Department of Administration and Information: Information Technology Division and Office of Chief Information Officer ...... July 2011 Wyoming Department of Health: Veterans’ Home of Wyoming ...... November 2011 Wyoming Aeronautics Commission ...... September 2012 Wyoming Boards and Commissions ...... June 2013 Wyoming’s Interim Budget Process to Modify Legislatively Appropriated Funds ...... November 2013 Wyoming Aeronautics Commission (Follow-up Evaluation) ...... November 2013 University of Wyoming: Effectiveness of Block Grant Funding (with Supplement) ...... January 2015 Wyoming Public Purpose Investments (PPIs) ...... August 2015 Wyoming Water Development Commission ...... January 2016 Early Intervention and Education Program, Phase 1 ...... September 2016 Business Ready Community Program ...... December 2016

Evaluation reports can be obtained from: Wyoming Legislative Service Office 213 State Capitol Building Cheyenne, Wyoming 82002 Telephone: 307-777-7881 Fax: 307-777-5466 Website: http://legisweb.state.wy.us