ISSUES & ANSWERS
REL 2009–No. 072

U.S. Department of Education

How state education agencies in the Northeast and Islands Region support data-driven decisionmaking in districts and schools

At Education Development Center, Inc.

May 2009

Prepared by
Michelle A. LaPointe, Education Development Center, Inc.
Jessica Brett, Education Development Center, Inc.
Melissa Kagle, Learning Innovations at WestEd
Emily Midouhas, Education Development Center, Inc.
María Teresa Sánchez, Education Development Center, Inc.
with Young Oh and Charlotte North, Education Development Center, Inc.


Issues & Answers is an ongoing series of reports from short-term Fast Response Projects conducted by the regional educational laboratories on current education issues of importance at local, state, and regional levels. Fast Response Project topics change to reflect new issues, as identified through lab outreach and requests for assistance from policymakers and educators at state and local levels and from communities, businesses, parents, families, and youth. All Issues & Answers reports meet Institute of Education Sciences standards for scientifically valid research.

May 2009

This report was prepared for the Institute of Education Sciences (IES) under Contract ED-06-CO-0025 by Regional Educational Laboratory Northeast and Islands, administered by Education Development Center, Inc. The content of the publication does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.

This report is in the public domain. While permission to reprint this publication is not necessary, it should be cited as:

LaPointe, M. A., Brett, J., Kagle, M., Midouhas, E., Sánchez, M. T., Oh, Y., and North, C. (2009). How state education agencies in the Northeast and Islands Region support data-driven decisionmaking in districts and schools (Issues & Answers Report, REL 2009–No. 072). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast and Islands. Retrieved from http://ies.ed.gov/ncee/edlabs.

This report is available on the regional educational laboratory web site at http://ies.ed.gov/ncee/edlabs.

Summary

How state education agencies in the Northeast and Islands Region support data-driven decisionmaking in districts and schools (REL 2009–No. 072)

The report examines the initiatives of state education agencies in the Northeast and Islands Region to support data-driven decisionmaking in districts and schools and describes the service providers hired to support this work. Identifying four components of data-driven decisionmaking initiatives, it finds that not all initiatives include all four.

Data-driven decisionmaking is receiving increasing attention from the education community because of federal and state accountability requirements; the enhanced capacity of states and school districts to collect, manage, and distribute data; and a better understanding of the importance of data-driven decisionmaking in improving instruction and student achievement. The report responds to a request by the Rhode Island Department of Elementary and Secondary Education for information about how other state education agencies in the Northeast and Islands Region support districts and schools in effectively using data to inform a range of education decisions.

Two research questions guided this study:

• What state education agency initiatives support data-driven decisionmaking by districts and schools in the Northeast and Islands Region? What are their similarities and differences?

• What service providers do the state education agencies work with to support educators in using data to inform education decisions, and what services do they provide? What are their similarities and differences?

The descriptions of initiatives by state education agencies in the Northeast and Islands Region to support data-driven decisionmaking in districts and schools are based on examinations of state education agency web sites, interviews with agency officials, and a review of the documents they provided. State education agency officials also identified the service providers they have hired to support this work. Three providers were selected for more in-depth profiles. Descriptions of the three service providers are based on interviews with service provider staff, observations of professional development activities, and additional materials provided by the service providers.

Appendixes to the report include profiles of each state education agency initiative, a catalogue of nine service providers identified by the state education agency officials, and profiles of three service providers. This study has several limitations, including the small number of respondents interviewed; the focus on state education agency data-driven decisionmaking supports for districts and schools rather than on district and school implementation of data-driven decisionmaking; and a lack of information about the decisions made by state education agency officials.

Analysis of the state education agency data-driven decisionmaking initiatives across the region revealed that state agencies have implemented one or more of four components to support data use by schools and districts:

• Centralized data system/warehouse. A centralized data system/warehouse combines data from multiple sources into a centralized repository. Data can include a range of evidence, such as classroom assessment data, school-level information on students and staff, demographic data, and state test scores.

• Tools for data analysis and reporting. Data tools allow users to collect, organize, and analyze data for use in decisionmaking.

• Training on data systems/warehouses and tools. Training helps educators learn to effectively and efficiently use the data analysis tools to better understand the available data.

• Professional development in using data for decisionmaking. Teachers and administrators require extensive professional development to build their expertise in identifying and analyzing relevant data and adjusting instructional practices and school processes in response to such data.

Each state education agency's initiatives were reviewed within the framework of these components. Only New Hampshire appears to provide all four components to every school in the state. Maine and Vermont provide all four components but target only specific schools. New York provides a data system, data tools, and professional development, but as separate initiatives with little overlap. Massachusetts, Rhode Island, and the Virgin Islands are creating a longitudinal data system and data tools and developing training in their use. Connecticut focuses solely on the process of using data for decisionmaking and targets that support to low-performing districts.

Despite a dearth of comprehensive initiatives, state education agency officials across the region mentioned the importance of providing a range of data-driven decisionmaking supports to schools. But they noted that limited funding and a lack of capacity force them to make choices about which components to provide.

Analysis of the four components across the state education agency initiatives revealed that implementation is affected in part by available funding and capacity:

• Funding. State education agency data-driven decisionmaking initiatives were shaped by the funding available. External funds (such as Title I) may restrict who can receive services or prescribe the types of services that can be offered. State funding must be stretched to cover a range of education services, and support for data-driven decisionmaking must compete with teacher salaries and book and supply purchases.

• Capacity. State education agency staff may lack the capacity to support their data-driven decisionmaking initiatives because of time or expertise. To help provide needed services to schools and districts, state education agencies have contracted with external service providers.

State education agency officials identified the following providers of support for data-driven decisionmaking initiatives in the region: Center for Assessment, Cognos, the Connecticut Regional Educational Service Centers Alliance (RESC Alliance), ESP Solutions Group, Measured Progress, the New York Board of Cooperative Educational Services (BOCES), Pearson School Systems, Performance Pathways, and TetraData. Three of these nine service providers were selected for in-depth profiles: the Connecticut RESC Alliance, Measured Progress, and Performance Pathways.

These three service providers assist state education agencies in implementing the four key data-driven decisionmaking components identified in this study:

• Centralized data system/warehouse. Performance Pathways and Measured Progress support state education agencies in creating a central repository for storing data.

• Tools for data analysis and reporting. All three service providers use various tools to support data analysis and reporting, ranging from paper and pencil to computer-based spreadsheet software and online programs capable of generating customized reports from the centralized data system.

• Training in data systems/warehouses and tools. Performance Pathways and Measured Progress not only support the creation of a data warehouse system but also provide training in using the system and its tools. To reach a broader audience, both providers use a “train the trainer” model, training groups of educators who are then responsible for training colleagues.

• Professional development in using data for decisionmaking. All three providers offer some professional development in using data to guide decisionmaking. Offerings vary, depending on the organization’s focus on creating a data-driven culture, using assessment data, or developing software tools. Providers work with teams from districts or schools who are expected to train their colleagues in the process.

This study outlines several considerations for education decisionmakers and researchers on the potential benefits of implementing additional components of a data-driven decisionmaking system, sources of funding, and strategies to enhance their capacity to support teachers and administrators. Ideas are proposed for further research, including examining how state education agencies scale up their data-driven decisionmaking initiatives; exploring how state education agencies, schools, and districts implement data-driven decisionmaking; and analyzing the impacts of data-driven decisionmaking on student and school outcomes.

May 2009

Table of contents

Why this study? 1
    Importance to the region 1
    Study questions 2

What state education agency initiatives support data-driven decisionmaking by districts and schools? 3
    State education agency data-driven decisionmaking initiatives by state and jurisdiction 3
    Four key components to support data collection and use 7
    Similarities and differences in data-driven decisionmaking initiatives across state education agencies 8

What service providers do state education agencies work with to support educators’ use of data-driven decisionmaking? 13
    In-depth investigation of three service providers 13
    Similarities and differences across the three service providers 16

Considerations for state education agency leaders developing data-driven decisionmaking initiatives 18
    Adding components of a data-driven decisionmaking system 18
    Sources of funding 18
    Strategies to enhance capacity 18

Considerations for further research 18
    Stages of implementation 19
    Different levels of the education system 19
    Impact of data-driven decisionmaking on student achievement 19

Appendix A Glossary 23

Appendix B Study methods 25

Appendix C Research on the implementation of data-driven decisionmaking, barriers to implementation, and supports available 32

Appendix D Profiles of state education agencies’ data-driven decisionmaking initiatives 35

Appendix E Catalogue of service providers 55

Appendix F Service provider profiles 60

Notes 74

References 76

Boxes
    1 What is data-driven decisionmaking? 2
    2 Study methods 4

Tables
    1 Components of state education agency initiatives in the Northeast and Islands Region, 2007/08 9
    2 Overview of service providers identified by state education agencies, 2007/08 14

    3 Summary of three service providers, 2007/08 15
    4 Service providers’ support for the four components of state education agency initiatives, 2007/08 16
    5 Snapshot of data-driven decisionmaking initiatives in the Northeast and Islands Region, 2007/08 20
    B1 State initiative coding scheme 29
    B2 Professional development services coding scheme 30
    F1 Measured Progress clients, 2008 65

Why this study?

Use of data is recognized as a major component of school improvement. No Child Left Behind (NCLB) legislation stipulates that low-performing Title I schools use data to support school improvement plans submitted to the state education agency (NCLB section 1116). Such data could include test scores, classroom assessments, attendance and discipline records, school or district staffing patterns (teachers’ and principals’ years of experience, areas of expertise, and levels of education), and classroom information gathered systematically by teachers (Wayman 2005). The increased pressure for data-driven decisionmaking comes not only from external accountability requirements, such as the NCLB Act and state legislation, but also from an increased understanding that school improvement efforts benefit from systematically gathering and analyzing information about the school, its students, and the school’s capacity to meet students’ needs (Finnigan, O’Day, and Wakelyn 2003; Marsh et al. 2005; Ikemoto and Marsh 2007).

Data-driven decisionmaking is neither new nor limited to education decisionmakers (see box 1 for a description of data-driven decisionmaking and definitions of key terms). Analyzing data to assess organizational effectiveness and to craft improvement strategies has long been important for managers in the public and private sectors (Feigenbaum 1951; Deming 1982; Barzelay 1992). Although educators have a long tradition of using test scores and other information to shape instructional decisions, until recently the use of data to inform education decisions has not been systematic (Abelman et al. 1999). With external accountability pressures and the availability of new technology to access data, there is now a push by state education agencies for more uniform and rigorous use of data to improve instruction outcomes (see, for example, Abbott 2008; New Hampshire Department of Education 2006).

Importance to the region

Officials at the Rhode Island Department of Elementary and Secondary Education expressed

Box 1
What is data-driven decisionmaking?

Key terms related to data-driven decisionmaking include the following (for more detail see appendix A):

Data-driven decisionmaking in education refers to “teachers, principals, and administrators systematically collecting and analyzing various types of data . . . to guide a range of decisions to help improve the success of students and schools” (Marsh, Pane, and Hamilton 2006, p. 1). It can include the relatively simple process of disaggregating state test scores to identify and support struggling students (Ikemoto and Marsh 2007) and the more complex process in which faculty develop a school environment that supports the use of evidence to “reshape the central practices and cultures of their schools to react intentionally to the new kinds of data” (Halverson et al. 2007, p. 4). Strategies include forming data teams to guide the data-driven decisionmaking process (Love et al. 2008) and implementing new instruction approaches and the associated professional development (Moody, Russo, and Casey 2005).

Culture of inquiry is a type of data-driven decisionmaking in which faculty create an organizational culture focused on using data and other evidence to shape instructional practices.

Data collected about students, teachers, and their schools and districts can include scores from large-scale assessments, classroom assessments, attendance and discipline records, and statistics on school or district staffing patterns (such as teachers’ and principals’ years of experience, areas of expertise, and level of education).

State data-driven decisionmaking initiatives are initiatives that have been conceptualized at the state level to support data use by local educators; have a set of comprehensive goals, objectives, and purposes; and have designated resources (financial and personnel) to support their development and implementation. In addition, state education agency leaders must communicate their policies to districts in the state and offer them support.

interest in research that describes how state education agencies can support data-driven decisionmaking in districts and schools. The department is designing a comprehensive data system to provide educators—from classroom teachers to district and state administrators—with access to data. This effort prompted Rhode Island Department of Elementary and Secondary Education officials to request the Regional Educational Laboratory Northeast and Islands to profile how other jurisdictions in the region support systematic data-driven decisionmaking in districts and schools. Officials at state education agencies in Massachusetts, New Hampshire, and New York also expressed interest in the project.

Extensive research on data-driven decisionmaking at the district and school levels reveals the variety of practices teachers and administrators used to support both classroom practices and school improvement efforts (see appendix C for a brief review of the literature). The research findings are also clear that educators need external support to develop their capacity to conduct these activities. But there is little research on how state education agencies and external service providers support these efforts—the focus of this report.

Study questions

Two research questions guided this study:

• What state education agency initiatives support data-driven decisionmaking by districts and schools in the Northeast and Islands Region? What are their similarities and differences?

• What service providers do the state education agencies work with to support educators in using data to inform education decisions, and what services do they provide? What are their similarities and differences?

The first research question concerns state education agency initiatives to develop the capacity of

district and school leaders to use data to adjust instruction and organize schools to improve student achievement (see box 1 for definitions of key terms). The second research question examines service providers in the region and how they support data-driven decisionmaking in districts and schools. State education agencies contract external service providers to support data-driven decisionmaking initiatives by providing technology tools and professional development. The responses to these research questions will be useful to state leaders who are designing or implementing initiatives for building the capacity of educators to conduct data-driven practices at the school and district levels and to district administrators who sometimes contract with service providers to support data-driven decisionmaking.

To answer these questions, the project team conducted a qualitative study of data-driven decisionmaking initiatives in the Northeast and Islands Region (see box 2 and appendix B for details on the study methods). The report also provides an overview of the literature on data-driven decisionmaking in schools and districts and the external supports for those efforts (appendix C), case profiles of state education agency initiatives (appendix D), a catalogue of the nine providers contracted by state education agencies to support data-driven decisionmaking in districts and schools in their jurisdiction (appendix E), and case profiles of three service providers (appendix F).

The state education agencies and service providers in the Northeast and Islands Region profiled in this report offer a range of examples of how to support data-driven decisionmaking at the district and school levels. While ideally state education agencies could provide a comprehensive range of supports for data-driven decisionmaking, limited resources often force tradeoffs between providing data and tools and fostering a climate that encourages a process of data-driven decisionmaking. The examples in the report are intended to expand the dialogue on how state education agencies can support data-driven decisionmaking and help practitioners better meet the needs of their students.

What state education agency initiatives support data-driven decisionmaking by districts and schools?

The descriptions in this section provide a snapshot of the data-driven decisionmaking initiatives and their context in Connecticut, Maine, Massachusetts, New Hampshire, New York, Rhode Island, Vermont, and the Virgin Islands. Table 5 at the end of this report compares the major elements of the initiative in each jurisdiction, including targeted groups, data collected, data analysis tools, and service providers working with the state education agencies. A full description of each state education agency initiative is in appendix D.

State education agency data-driven decisionmaking initiatives by state and jurisdiction

The goal of each state education agency’s work with districts and schools on data-driven decisionmaking is to help local educators use data to improve student outcomes. Despite this common mission, individual agencies in the Northeast and Islands Region support data-driven decisionmaking in distinct ways. This section describes each state’s approach.

Connecticut. Connecticut’s data-driven decisionmaking initiative emphasizes improving the outcomes of low-performing schools. Under the Connecticut Accountability for Learning Initiative of the State Department of Education, regional educational service centers train school teams in low-performing districts to analyze data to improve student performance. Classroom and school-level data are examined, including state assessment scores from the Connecticut Academic Performance Test and Connecticut Mastery Test, along with data on student behavior and school climate.

Box 2
Study methods

Study sample and analysis. The researchers used publicly available information to identify statewide data-driven decisionmaking initiatives, key respondents, and service providers; supplemental documents provided by respondents from state education agencies and service providers; semistructured, open-ended interviews with key respondents from each state education agency and each service provider; and observations of professional development activities. Data collection was iterative, with the state education agency initiatives informing the selection of service providers.

Interviews were conducted with a state education agency official with a key role in the data-driven decisionmaking initiatives in each of the eight state education agencies included in the study (Connecticut, Maine, Massachusetts, New Hampshire, New York, Rhode Island, Vermont, and the Virgin Islands;1 two officials were interviewed in Massachusetts). Each respondent was asked to identify at least one service provider supporting that state education agency’s data-driven decisionmaking initiatives and the person responsible for implementing the state contract (appendix B provides more details on the selection criteria). Three service providers from the list of nine identified providers were selected for more in-depth profiles: Measured Progress, Performance Pathways, and the Connecticut Regional Educational Service Centers Alliance (see appendix F). They were selected because they serve multiple jurisdictions in the region, have been in existence for two or more years, have provided data-driven decisionmaking services for at least two years, and offer professional development services in using data to inform teaching and learning.

Interviews were recorded and transcribed. Transcripts and observation notes were coded using the qualitative data analysis software ATLAS.ti. The coding output was analyzed alongside documents to triangulate the evidence about state education agency policies and service provider efforts. Researchers used these data to develop state case profiles and service provider profiles. The completed profiles facilitated systematic analysis of cross-state education agency and cross-provider issues. The research team met regularly to share case profiles and to develop cross-case themes.

Limitations of the study. The primary limitation of this study is the small number of respondents interviewed. The knowledge and opinions of one state education agency official from each jurisdiction may not represent the knowledge or perspectives of all state education agency staff. In addition, information in the three service provider profiles cannot be generalized to represent the six providers that were not profiled. Also, the study focuses on the role of state education agencies and does not explore in any depth the role of districts in supporting data-driven decisionmaking initiatives. Finally, the study does not examine the effectiveness of any of these initiatives or their impacts on schools.

Note
1. The plan originally included all nine state education agencies in the region. Despite attempts over several months, the research team was unable to coordinate an interview with an appropriate individual from the Puerto Rico Department of Education. As the study moved into the second phase of the research, focusing on service providers, a decision was made to proceed without information from Puerto Rico because of time constraints.

Professional development in data-driven decisionmaking for teachers and school administrators focuses on creating data teams and training them to build and sustain a five-step process of embedding data and evidence into their decisionmaking processes. No technology is used during the training; data are typically on paper. Certification training, intended for educators who want to create a schoolwide culture of inquiry through facilitation of ongoing professional development at their schools, is provided by the Leadership and Learning Center, a partner organization.

The Connecticut State Department of Education does not have a statewide data warehouse to store student assessment data, but officials are considering one. The regional educational service centers had provided a statewide warehouse to districts through a vendor, TetraData, but that relationship ended in June 2008. Connecticut

received a longitudinal data system grant from the U.S. Department of Education in 2005. According to the U.S. Department of Education web site, the Connecticut State Department of Education is using the grant to enhance its technology infrastructure to support data collection, but not to support data-driven decisionmaking in districts and schools.

Maine. Although Maine lacks a formal data-driven decisionmaking initiative, state education leaders have long recognized the need to assist local educators in developing a culture of inquiry in schools to support the use of evidence in decisionmaking. Since the early years of the Maine Educational Assessment (first administered in 1984), the Maine Department of Education and Measured Progress, a service provider focusing on assessment development, have provided workshops to district and school administrators to support them in interpreting the state assessment reports. In addition, in the 1990s and the early 2000s the state accountability system included local performance assessments.1 In recent years the workshops also have trained educators on an online data tool, designed by Measured Progress, that allows users to customize the Maine Educational Assessment reports. Administrators are then expected to share this training with their colleagues.

Maine Department of Education staff provide follow-up support for school teams—to conduct item analysis and to discuss how to identify student needs and strengthen instruction based on state assessment evidence—but very few personnel are available to serve the state, which is as large as the rest of New England combined. Department staff support data-driven decisionmaking within specific programs (such as Title I, Reading First, and special education), but limited state funds mean that these services are available only to a targeted set of schools. Maine is using a federal longitudinal data grant to develop a new data system that will give all schools access to an online longitudinal data system that will integrate state assessment scores with student information and staffing data.

Massachusetts. The Massachusetts Department of Elementary and Secondary Education has a data-driven decisionmaking initiative that focuses on creating a virtual data warehouse and sophisticated tools to help educators analyze the data. In 2001 the department created a centralized system to collect student background information and link it to Massachusetts Comprehensive Assessment System data. For the past three years the department has been developing and pilot testing its virtual Educational Data Warehouse initiative, which will give people at all levels (state, district, and school) access to a variety of state-maintained data (such as state assessment scores and student and educator information) and data analysis and reporting tools. Districts will have the opportunity to add their own data, including course grades and attendance or discipline data.

The virtual Educational Data Warehouse uses software developed and managed by Cognos Corporation. As of May 2008, 76 districts (of 388) were piloting the data warehouse. The pilot includes training on the data tools and professional development support for implementing a process of data-driven decisionmaking. Once the pilot ends and the initiative is available statewide, districts will need to provide training on the data analysis tools and professional development from their own budget.

New Hampshire. In 2007 New Hampshire unveiled Performance Tracker, a suite of software developed by Performance Pathways to support the state’s formal data-driven decisionmaking initiative, Follow the Child. Follow the Child focuses on measuring growth in the personal, social, physical, and academic facets of each student’s life and on defining the support systems students need to succeed. The intention is to ensure that personalized

learning and student well-being are at the center of district and school policies.

Along with storing and analyzing student and educator data, Performance Tracker can link curriculum, assessments, and standards and help educators create and store local assessments. New Hampshire, like Connecticut and Maine, is the recipient of a grant from the U.S. Department of Education to improve its longitudinal data system. The federal support has allowed New Hampshire to waive the subscription fee for districts to the state’s data warehouse. New Hampshire also is using its federal grant to provide professional development for educators at all levels—from preschool to postsecondary—to make better use of multiple kinds of data to support the learning and development of each child and to foster a climate that supports the use of data and evidence to guide instruction. The New Hampshire Department of Education appears to have the most comprehensive data-driven decisionmaking initiative in the Northeast and Islands Region.

New York. Although the New York State Education Department does not have a formal statewide data-driven decisionmaking policy for districts and schools, its support for low-performing schools encourages data-driven decisionmaking. It also encourages state-sponsored and other agencies to incorporate supports for data-driven decisionmaking into their programs.

New York has a strong history of using data to inform education decisions at all levels of the system. A network of regional school support centers, created in 2003, provides professional development for teachers in low-performing schools in using data to drive instructional decisions. The New York State Testing and Accountability Reporting Tool (nySTART), introduced in 2006, gives school leaders access to a variety of reports on scores on New York’s standardized tests. Educators can view and sort data by performance level, content strand or standard, test item, and subgroup. District and school administrators and teachers are expected to use the information to inform education decisions. Furthermore, the regional Boards of Cooperative Educational Services (BOCES), created in 1948 to provide shared education programs and services to New York school districts, offer data-driven decisionmaking services based on the needs of the regions they serve. Many low-performing schools seek BOCES services in using data to improve teaching and learning.

Rhode Island. The Rhode Island Department of Elementary and Secondary Education supports data-driven decisionmaking in the state through major investments in a data warehouse system to support its longitudinal data system and tools to enable districts to use the data to inform instructional decisions. State, district, and school educators and parents will have access to the online consolidated data warehouse, which facilitates data collection and provides data analysis tools. In 2006 the Rhode Island Department of Elementary and Secondary Education contracted with TetraData and ESP Solutions Group to create a central virtual repository to consolidate vast amounts of information and data that had been available in separate locations (such as New England Common Assessment Program data,2 student and financial information, and teacher certification). The system is being introduced gradually over a two-year period. During the roll-out district staff are being trained in how to use the components of the new data system.

Vermont. Since the mid-1990s the Vermont Department of Education has supported educators’ use of data to improve instruction and student outcomes. However, before passage of the NCLB Act, these initiatives were not tied to accountability efforts. Currently, the department has a formal data-driven decisionmaking initiative with specific requirements as part of its work with low-performing schools. It also informally provides some resources related to data-driven decisionmaking to

all schools in the state. Schools in need of improvement must provide data-based evidence of the impact of their school improvement efforts on student learning.3 These schools receive support and training for data-driven decisionmaking processes through principal learning communities;4 by 2009 similar professional learning communities will be required for all teachers in these schools through teacher learning communities.

In addition to services provided free of charge to low-performing schools, districts can subscribe to the Vermont Data Warehouse, which has both data storage and analysis capabilities. Fees are based on a sliding scale, according to district size and revenues. The warehouse uses software developed by TetraData and is administered by the Vermont Data Consortium, a partner agency of the Vermont Department of Education. The data warehouse provides training on the use of its data tools.

Virgin Islands. The Virgin Islands Department of Education does not have a formal policy to support data-driven decisionmaking in its districts and schools. A department official noted that the jurisdiction relies on the provisions in the NCLB Act to encourage the use of data in decisionmaking. The Virgin Islands Territorial Assessment of Learning (VITAL) has been in place since 2004/05; before that the jurisdiction had had a five-year moratorium on standardized testing. Since territorywide testing resumed in 2004, the assessment has been taken by students in grades 5, 7, and 11 in each of the past three test administrations. The Virgin Islands Department of Education is beginning to develop a longitudinal data system to access and analyze both assessment data and student information, and the professional development that supports the use of the data system is part of a contract with Pearson School Systems, the external service provider that created the system.

Four key components to support data collection and use

Analysis of state education agency data-driven decisionmaking initiatives in the Northeast and Islands Region revealed that agencies typically have one or more of four key components in place to support data collection and use by schools and districts in making education decisions. These components provide a logical framework for presenting findings.

• Centralized data system/warehouse. A centralized data system/warehouse combines data from multiple sources into a centralized repository. Such centralized systems provide a range of stakeholders with an integrated view of multiple data sources and provide a customized interface to access these data (Dembosky et al. 2005). Data range from classroom assessment data to school-level information about students and staff (Gallagher, Means, and Padilla 2007; Wayman 2005).

• Tools for data analysis and reporting. Data tools are software that allows teachers and administrators to collect, organize, and analyze data and use them for decisionmaking (see, for example, case studies by Larocque 2007; Storandt, Nicholls, and Yusaitis 2007; Chen, Heritage, and Lee 2005; Lachat and Smith 2005; Light et al. 2005; Murnane, Sharkey, and Boudett 2005; Mason 2002). These software tools can reside on individual personal computers or be provided online. User-friendly software allows easy access to and manipulation of data to inform decisions about classroom practice.

• Training on data systems/warehouses and tools. Educators often need training on effective and efficient use of data systems/warehouses and tools (Earl, Katz, and Fullan 2006; Dembosky et al. 2005). Teachers and district and school administrators may receive technical training and support through

workshops, user conferences, review of annotated score reports, or technical assistance sessions led by project managers.

• Professional development in using data for decisionmaking. Professional development in data-driven decisionmaking builds educators’ expertise and habits in identifying and analyzing relevant data and adjusting instruction and other practices in response to the data (Boudett and Steele 2007; Knapp, Copland, and Swinnerton 2007; Young 2006; City, Kagle, and Teoh 2005; Lachat and Smith 2005; Murnane, Sharkey, and Boudett 2005; Copland 2003; Feldman and Tung 2001). Professional development occurs over time and often involves an iterative cycle in which teachers examine data, plan for implementation of targeted strategies, implement those strategies, and evaluate their impact (Abbott 2008). It often focuses on building collaborative teams that use data to examine improvement efforts (Love et al. 2008). Such professional development is often delivered through workshops or ongoing coaching for data teams in schools and districts.

These four components have been implemented in varying combinations and sequences and are at different stages of implementation in the agencies studied. Choices about which components to implement—and about the depth and breadth of implementation—are shaped at least in part by the availability of funding and the implementation capacity of state education agencies. The agencies are extending their capacity to support data use in districts and schools by contracting with organizations that can provide services for one or more of the data-driven decisionmaking components.

Similarities and differences in data-driven decisionmaking initiatives across state education agencies

This section looks at the extent to which state education agencies in the region implement the four key components of data-driven decisionmaking, focusing on similarities and differences. It also looks at how available funding and capacity for data-driven decisionmaking supports shaped the decisions of state education agencies on which components to implement, who would receive access to these supports, and which outside vendors would work with states to provide supports.

State education agencies in the Northeast and Islands Region have implemented the four components to different extents and in varying combinations and sequences (see table 1 for an overview of the components).

Centralized data system/warehouse. The state education agencies in the region offer various systems to support data-driven decisionmaking in districts and schools. These include consolidated virtual data warehouses (Massachusetts, New Hampshire, Rhode Island), a data warehouse run by a partner organization rather than the state education agency (Vermont), integrated but more limited data systems (Virgin Islands), and separate repositories for different types of data (New York and Maine). Maine is developing a consolidated data warehouse that should be available to all schools within three years. Connecticut plans to develop a statewide data warehouse, but currently data-driven decisionmaking relies on data collected by districts and at schools.

Tools for data analysis and reporting. Seven of the eight state education agencies (all but Connecticut) provide data reporting and analysis tools to local educators, often with the support of an external service provider. In Vermont a partner agency provides data tools, which are available to districts for a subscription fee. Massachusetts, New Hampshire, Rhode Island, and the Virgin Islands offer software that facilitates analysis of assessment and other data when used in conjunction with their data warehouses.
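To make the data tools component concrete, the short example below shows the kind of subgroup disaggregation of assessment scores that such reporting tools automate. This is a minimal illustrative sketch in Python, not code from any system profiled in this report; the records, field names, and subgroups are hypothetical.

    # Minimal, hypothetical sketch of the analysis a data tool automates:
    # disaggregating assessment scores by student subgroup. The records and
    # field names are invented; no profiled system is represented here.
    from collections import defaultdict

    # Each record pairs an assessment score with demographic information,
    # as an integrated warehouse extract would.
    records = [
        {"student_id": 1, "subgroup": "Title I",     "score": 228},
        {"student_id": 2, "subgroup": "Title I",     "score": 241},
        {"student_id": 3, "subgroup": "Not Title I", "score": 252},
        {"student_id": 4, "subgroup": "Not Title I", "score": 247},
    ]

    def mean_score_by_subgroup(rows):
        """Return the average assessment score for each subgroup."""
        totals = defaultdict(lambda: [0, 0])  # subgroup -> [score sum, count]
        for row in rows:
            totals[row["subgroup"]][0] += row["score"]
            totals[row["subgroup"]][1] += 1
        return {group: total / count for group, (total, count) in totals.items()}

    print(mean_score_by_subgroup(records))
    # {'Title I': 234.5, 'Not Title I': 249.5}

The web-based tools described in this report present such groupings through report interfaces, so educators select a subgroup, content strand, or test item rather than writing code.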

Table 1 Components of state education agency initiatives in the Northeast and Islands Region, 2007/08

Jurisdiction     Consolidated data     Separate data        Tools for data     Training in data       Professional development
                 system/warehouse      system/warehouses    analysis and       systems/warehouses     in using data for
                                                            reporting          and data tools         decisionmaking
Connecticut                                                                                           ✓
Maine                                  ✓                    ✓                  ✓                      ✓ (a)
Massachusetts    ✓                                          ✓                  ✓ (b)                  ✓ (b)
New Hampshire    ✓                                          ✓                  ✓                      ✓
New York                               ✓                    ✓                                         ✓
Rhode Island     ✓                                          ✓                  ✓
Vermont          ✓ (c)                                      ✓ (c)              ✓ (c)                  ✓
Virgin Islands   ✓                                          ✓                  ✓

a. A limited number of Maine Department of Education staff are available to support data-driven decisionmaking at the school level.
b. State education agency funding is available only for pilot districts. Available statewide through districts’ funds.
c. These services are provided by a partner organization for a subscription fee.
Note: Puerto Rico is not included because the study team was unable to set up interviews with key respondents from the Puerto Rico Department of Education, despite repeated attempts over several months.
Source: Authors’ compilation based on interviews and documents provided by state education agency officials and information available on state education agency web sites.
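As a concrete illustration of the first component in table 1, the sketch below shows the basic consolidation step a centralized data system/warehouse performs: linking records from separate sources on a shared student identifier. This is a hypothetical Python example with invented identifiers and values; it does not represent the design of any state system described in this report.

    # Hypothetical sketch of warehouse-style consolidation: joining separate
    # source files (enrollment and assessment results) on a shared student
    # identifier to produce one integrated view. All values are invented.
    enrollment = {
        101: {"school": "Elm Street Elementary", "grade": 5},
        102: {"school": "Bay View Middle", "grade": 7},
    }
    assessments = [
        {"student_id": 101, "subject": "reading", "score": 236},
        {"student_id": 102, "subject": "reading", "score": 244},
    ]

    # The "warehouse" view: one integrated record per assessment result,
    # combining demographic and assessment fields.
    warehouse = [
        {**enrollment[result["student_id"]], **result}
        for result in assessments
        if result["student_id"] in enrollment
    ]
    for row in warehouse:
        print(row)

Production warehouses handle many more sources (attendance, discipline, staffing) and run on database software rather than in-memory structures, but the linking step sketched here is the core of what combining data from multiple sources into a centralized repository entails.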

The Massachusetts Department of Elementary and Secondary Education has purchased a license for data warehouse software that will house all the state-maintained data and will eventually allow school districts to load and analyze their own data. All New Hampshire districts have access to Performance Tracker, the state-adopted data warehouse and analysis tool that is used in conjunction with the state’s data warehouse. The Rhode Island Department of Elementary and Secondary Education is creating a data warehouse to hold all of its education-related data. It will include a data analyzer available at all levels of the state education system. The Virgin Islands Department of Education contracts with Pearson School Systems, the vendor that created its system, to access and analyze state assessment data and student information. Maine and New York have stand-alone web-based tools that allow educators to conduct analyses and create reports using assessment and other types of data. Maine works with Measured Progress to offer a web portal that enables district staff and school faculty to generate customized score reports from Maine Educational Assessment data. New York’s nySTART tool gives school leaders access to a variety of reports on students’ standardized test scores.

Training in data systems/warehouses and data tools. Six of the eight jurisdictions (Maine, Massachusetts, Rhode Island, New Hampshire, Vermont, and the Virgin Islands) provide training on use of the data warehouses and data analysis tools. In Vermont the training is provided by a partner organization. Maine provides workshops on the online data tool, designed by Measured Progress, which allows users to customize Maine Educational Assessment reports. Districts that have been piloting the Massachusetts data warehouse have received training on uploading assessment data to the warehouse and on the analysis tools. District staff in Rhode Island are being trained to use their new system. Performance Pathways provides New Hampshire educators with training in its warehouse data analysis and reporting tools. As part of the contract with Pearson School Systems, the Virgin Islands Department of Education provides professional development in the use of its data system to access and analyze both student assessment data and student information.

Professional development in using data for decisionmaking. Six jurisdictions (Connecticut,

Maine, Massachusetts, New Hampshire, New York, and Vermont) provide professional development in using data for education decisionmaking. Connecticut provides this training through its low-performing districts initiative, the Connecticut Accountability for Learning Initiative. Professional development for teachers and school administrators focuses on training data teams to build and sustain a five-step process of embedding data and evidence into decisionmaking processes. The Maine Department of Education has staff available to facilitate item analysis with school teams and to help teams better understand how data can inform classroom practice and improve student achievement. In Massachusetts state funds are available for professional development for pilot districts only. Once the initiative is launched statewide, districts will need to use local funds to implement this component. New Hampshire provides training in implementing a schoolwide culture that maintains a consistent focus on the use of data. New York’s regional school support centers provide professional development to teachers and administrators from low-performing schools in data-driven decisionmaking for content areas. In Vermont teachers in schools in need of improvement receive training in developing and maintaining professional learning communities.

State implementation of key components. Among the study jurisdictions only New Hampshire appears able to offer all four components of state education agency initiatives to every school in the state (see table 1). Massachusetts and Rhode Island are creating a longitudinal data system and data tools and preparing training on how to use them. Massachusetts is piloting a data warehouse and providing professional development for data-driven decisionmaking in pilot districts. Once the initiative is implemented statewide, however, each district will have to fund its own training for using data analysis tools and professional development in using data in decisionmaking. New York provides a data system, data tools, and professional development, although these are all part of separate initiatives and have little overlap. Maine is piloting a longitudinal data system and has online tools to access a limited warehouse of state assessment data, provides training in using those tools, and offers some help for schools in using data for decisionmaking. Vermont has a statewide data warehouse available to districts by subscription and also provides professional development to its low-performing schools in using data for decisionmaking. Connecticut is the only state focused solely on the process of data-based decisionmaking. It targets professional development in using data for decisionmaking to low-performing districts, although a centralized data system is planned for the near future.

Although only New Hampshire is currently implementing a comprehensive data-driven decisionmaking system that is available to all schools, state education officials across the region noted the importance of having access to a data warehouse, being proficient in using data analysis tools, and understanding how to use data for decisionmaking. While initiatives in Connecticut, Massachusetts, and Rhode Island focus primarily on one component, state officials are aware of the need for a more balanced approach.

The Massachusetts Department of Elementary and Secondary Education is funding a data warehouse and recognizes the need for professional development on data use; however, districts are expected to invest in supports for teachers and administrators. As one department official explained, “districts have to come up with the training plan, professional development, and probably some consulting services” (interview, October 23, 2007). The University of Rhode Island has provided assistance to schools in using data for decisionmaking, with some support from the Rhode Island Department of Elementary and Secondary Education.5 While Connecticut is currently focused on investing in professional development in using data for decisionmaking, it
Massachusetts is piloting explained, “districts have to come up with the Among the study a data warehouse and provid- training plan, professional development, and jurisdictions only New ing professional development for probably some consulting services” (interview, Hampshire appears data-driven decisionmaking in October 23, 2007). The University of Rhode Island able to offer all four pilot districts. Once the initia- has provided assistance to schools in using data components of state tive is implemented statewide, for decisionmaking, with some support from education agency however, each district will have the Rhode Island Department of Elementary 5 initiatives to every to fund its own training for using and Secondary Education. While Connecticut school in the state data analysis tools and profes- is currently focused on investing in professional sional development in using data development in using data for decisionmaking, it What state education agency initiatives support data-driven decisionmaking by districts and schools?11

has a federal longitudinal data system grant and implementing a com- In every jurisdiction is exploring ways to provide districts and schools prehensive data-driven but New Hampshire greater access to data. An official from the Ver- decisionmaking ini- limited state funding and mont Department of Education stressed the need tiative. In particular, capacity appear to have for both technology and the capacity of educators initiatives for a data hampered long-term to analyze data: warehouse and for tools goals of implementing to use the data warehouse a more comprehensive I really see there being two sides to this. . . . are sometimes compet- data-driven decision­ There is the tech side; every state gives a lot ing priorities. This study making initiative of data, and it’s overwhelming if you don’t reveals how funding and know how to analyze it. Then there’s the team capacity shape state edu- side; data is only as valuable as you have cation agencies’ implementation of the data-driven people to look at it. If you don’t have teachers decisionmaking initiatives provided to districts [involved], then kid-specific solutions won’t and schools. State education agency officials from be discussed. In fact, without this perspective several states reported that available funding and [of teachers], people can make bad decisions. staff capacity have influenced decisions about They might, for example, decide to change the which components to implement, who receives ac- curriculum, but it turns out it’s the kids who cess to these supports, and which external service don’t eat breakfast who do badly. (Interview, providers to work with to expand support. October 23, 2007) Funding. The source and amount of funding for Although Maine and Vermont are working toward state education agency initiatives can affect the implementation of all four components, limited services provided and who receives support. The funding has forced state education agencies to Connecticut State Department of Education relies make choices about the components they provide. primarily on education funding from the state An official from the Maine Department of Edu- legislature and allocates its limited resources to cation lamented the fact that their services are support districts and schools with the highest targeted to only a few schools: “Am I happy that we needs. The Maine Department of Education offers are hitting everybody and providing good instruc- training to a team of administrators from each tion in how to use data? No. We just simply don’t district in using the online reporting tools for the have the staff, but we do what we can” (interview, state assessment, and staff are available to coach February 20, 2008). schools in using data for decisionmaking. Because of staff limitations, this coaching is available pri- State education agency officials in the Northeast marily to schools participating in specific grant and Islands Region voiced the need for access to programs (such as Title I and Reading First). a data system/warehouse, for data tools, and for Some districts in New York City have used Title teachers and administrators to receive training in II funding to support the development of school those tools, as well as for professional development leaders (Darling-Hammond et al. 2007), which in using data for decisionmaking. 
In every juris- includes training in analyzing data and lead- diction but New Hampshire limited state funding ing school faculty in an evidence-based school and capacity appear to have hampered long-term improvement process. goals of implementing a more comprehensive data-driven decisionmaking initiative. Massachusetts provides state funding to support its statewide data warehouse, which will be avail- State education agency resources shape approaches able free of charge. It is piloting implementation to data-driven decisionmaking initiatives. Re- with selected districts that have access to federal spondents noted competition for resources in funds. Once the initiative is available statewide, 12 STate education agencies in the Northeast & Islands Region AND data-driven decisionmaking

All state education the Massachusetts Department of previously assessment coordinator at the Maine agencies in the region Elementary and Secondary Educa- Department of Education, noted the increased have turned to external tion will provide guidance to pressure on staff: service providers districts on professional develop- to develop data ment in using the data tools and [Departments of education staff] are being warehouses and other using data for decisionmaking, asked to do exponentially more with fewer technology solutions but districts will need to fund resources. Prior to NCLB, most states tested and to provide training their own services. According to at three grade levels. Overnight, they had to in use of data tools a Department of Elementary and more than double the number of kids they Secondary Education official, the were testing. In most cases, [departments of availability of federal funds has education] were doing it with the same and facilitated the pilot: “It’s been helpful for us to even reduced staff. And they were also trying have partners . . . in the districts who have actu- to implement data warehouses. It’s insane ally reached out and have applied for the Title II what we ask state employees to do. (Inter- D grants” (interview, October 23, 2007). Vermont view, March 24, 2008) charges a subscription fee for access to its data warehouse and analysis tools, which are developed With inadequate staffing and the need for highly and maintained by a partner agency. Also for a fee, specialized skills in data-driven decisionmaking, the partner agency provides professional develop- state education agencies in the region are seeking ment in using the data tools and analyzing the vendors who can provide the needed services to data. Vermont provides funds to low-­performing districts and schools. All state education agencies schools for professional development on data- in the region have turned to external service pro- driven decisionmaking, but other schools and viders to develop data warehouses and other tech- districts must pay their own way. With a federal nology solutions and to provide training in use longitudinal data grant, New Hampshire has been of data tools. The Massachusetts Department of able to offer free statewide access to its data system Elementary and Secondary Education contracted and tools, training on tools, and professional de- with Cognos to develop a data warehouse, but velopment in data-driven decisionmaking. three full-time staff members also work on it, and other staff are available for consultation. Vermont Capacity. State education agency support for data- partnered with another organization to develop driven decisionmaking in districts and schools is a data warehouse and focused state education also shaped by the capacity of staff to work with agency resources and personnel on professional districts and schools. The assessment division of development for data-driven decisionmaking the state education agency is commonly charged processes in low-performing schools. The external with helping local educators understand test items service provider is responsible for the data ware- and how analysis of student scores can shape house, analysis tools, and training on these tools. instructional practice. But these divisions are As one Vermont Department of Education official now being asked to do more with the same level put it: “Being small, the department can’t do a lot of staffing. 
Although the state education agencies of in-house services” (interview, October 23, 2007). in the Northeast and Islands Region administered statewide assessments before 2001, under the State education agencies have also turned to NCLB Act they now test all students in grades 3–8 external service providers to expand supports every year and at least once during high school, for implementing data-driven decisionmaking. dramatically increasing the number of assess- Maine and the states that use the New England ments to be administered, scored, and analyzed. Common Assessment Program (New Hampshire, An official at one of the Maine Department of Rhode Island, and Vermont) have negotiated their Education’s service providers, Measured Progress, large-scale assessment contracts with Measured Service providers state education agencies work with to support data-driven decisionmaking 13

The New York State Department of Education works with its regional school support centers to provide data-driven decisionmaking workshops to all of its low-performing schools. As a New York State Department of Education official stated:

I oversee implementation of school improvement in New York City and there's another office upstate in Albany that handles it, even though I do policy for everyone. We have regional support centers that we fund because we do not have the capacity to cover the state. (Interview, January 30, 2008)

The Connecticut Department of Education, lacking adequate staff and expertise, partnered with the RESC Alliance to implement the professional development necessary to support its data-driven decisionmaking initiative for low-performing districts.

What service providers do state education agencies work with to support educators' use of data-driven decisionmaking?

This section provides an overview of the external service providers mentioned in interviews with state education agency officials and then profiles three organizations that reflect the variety of services provided by the full sample of organizations.

Interviews with state education agency officials in the Northeast and Islands Region identified nine organizations that were contracted to support data-driven decisionmaking initiatives at district and school levels (table 2). These organizations are not an exhaustive list of service providers in the region, but only those mentioned by interviewees. However, the organizations represent the range of services available in the region and the variety of organizations that exist to support state education agencies. Three of the service providers administer state assessments and provide support in interpreting test scores (the Center for Assessment, ESP Solutions Group, and Measured Progress), four focus on creating data warehouses and tools to analyze data (Cognos, Pearson School Systems, Performance Pathways, and TetraData), and three focus on professional development for using data to improve school planning and classroom practice (BOCES, Connecticut RESC Alliance, and Measured Progress). Appendix E provides additional information on the nine organizations.

In-depth investigation of three service providers

Three service providers were selected for profiling. Providers had to have worked with multiple jurisdictions in the region, have been in existence for two or more years, and have offered these services for two or more years. In addition, organizations had to offer professional development in using data to inform teaching and learning, not just support on the other components. Connecticut RESC Alliance, Measured Progress, and Performance Pathways were selected on this basis (table 3). (Appendix B provides more detail on the selection process; appendix E provides a more in-depth profile of each provider.)

Table 2
Overview of service providers identified by state education agencies, 2007/08

State education agency | External service provider | Services provided
Connecticut State Department of Education | Connecticut Regional Educational Service Centers Alliance (ctrescalliance.org) | Programs and services tailored to the needs of particular schools and districts; training in data-driven decisionmaking; data warehousing
Maine Department of Education | Measured Progress (measuredprogress.org) | Development, implementation, and administration of K–12 education assessments; professional development related to use of data tools, item analysis, and formative assessment
Massachusetts Department of Elementary and Secondary Education | Cognos (cognos.com) | Business intelligence software
New Hampshire Department of Education | Performance Pathways (perfpathways.com) | Data-analysis tools and training on tools; professional development in creating a data-informed culture in schools
New York State Education Department | NY Board of Cooperative Educational Services (BOCES) (eboces.wnyric.org/wps/portal/BOCESofNYS) | Technical assistance and technology supports to districts and schools; professional development in data-driven decisionmaking; professional development in data management
Rhode Island Department of Elementary and Secondary Education | ESP Solutions Group (espsolutionsgroup.com) | Data systems and psychometrics in K–12 education for federal, state, and local education agencies; consulting and direct services
Rhode Island Department of Elementary and Secondary Education | TetraData (tetradata.com) | Data management tools, created expressly for the K–12 community
Vermont Department of Education | Center for Assessment (nciea.org) | Partner with states and agencies; design and implementation of effective assessment and accountability policies and programs
Virgin Islands Department of Education | Pearson School Systems (pearsonschoolsystems.com) | Assessment development; technology products to facilitate data storage and analysis

Note: The state education agencies each identified one primary provider of data-driven decisionmaking services, except for Rhode Island, which identified two providers. Source: Authors’ compilation based on interviews with service provider and state education agency staff and material provided by service providers.

Connecticut RESC Alliance. The Connecticut RESC Alliance, a regional education agency, is a consortium of all six regional educational service centers in the state. Since 2000, the regional educational service centers have met regularly to share problems and solutions and cost-effective methods for meeting the needs of public school districts through a variety of programs and services. Through the Connecticut Accountability for Learning Initiative, the RESC Alliance provides training for school data teams at low-performing districts, to help them build a data-driven decisionmaking process—the only data-driven decisionmaking component that the Connecticut State Department of Education implements. The RESC Alliance also provides technical assistance and training in district-level data warehousing and associated data analysis and offers a variety of customized work, such as spreadsheet and database training and custom report development for schools and districts.

Although the RESC Alliance is an agency of the state of Connecticut and serves any district in Connecticut that requests support, it has also worked with districts in Massachusetts and is beginning to serve districts and schools in Rhode Island. The RESC Alliance is partially funded by the state but also raises revenue through outside grants and fee-for-service arrangements with districts.

Table 3
Summary of three service providers, 2007/08

Service provider | Years in existence | Type of service | Clients | Who pays?
Connecticut RESC Alliance | 4 | Training to develop school data teams to build a data-driven decisionmaking process; training and technical assistance for district-level data warehousing and data analysis applications | Districts: any Connecticut district that requests services for data-driven decisionmaking. States: Massachusetts and Rhode Island | Partially funded by the state; partially funded by revenues from delivery of services to districts and nonprofit education institutions and by grant funds
Measured Progress | 24 | Assessment development; professional development in analyzing assessment results, using data to improve instruction, and using formative assessment | Districts: professional development provider to Maine, Massachusetts, New Hampshire, and Vermont districts and schools. States: assessment developer for Maine, Massachusetts, New Hampshire, New York, Rhode Island, and Vermont (in the region) and others outside the region (25 states in all) | State assessment contracts include some professional development to districts and schools (contracts include both assessment and professional development on using assessment data); districts and states contract directly with Measured Progress and pay for professional development
Performance Pathways | 4 (8 years for the two companies that merged to form Performance Pathways) | Software for data analysis, assessment, and curriculum mapping and alignment to standards; professional development in using these software tools and creating a data-driven culture | Districts: Connecticut, Massachusetts, New York, and Rhode Island (in the region) and others outside the region (47 states in all). States: state contract with New Hampshire | Statewide contracts provide software with some professional development; additional professional development can be purchased from the company; districts with their own contracts pay for all their own services

Source: Authors’ compilation based on interviews with service provider and state education agency staff and material provided by service providers.

Measured Progress. Measured Progress, a nonprofit company, provides both large-scale and alternative assessments, as well as professional development services in using data to improve student achievement. Founded in 1983 as Advanced Systems in Measurement and Evaluation, Inc., the company's stated mission is to improve teaching and learning by providing assessment data customized to the needs of schools and by helping educators make the best use of the data to improve student achievement. Measured Progress has worked with more than 35 states and major districts to develop large-scale and alternative assessments.

In the Northeast and Islands Region the company has created assessments for Maine, Massachusetts, and New York and joint assessments (New England Common Assessment Program) for New Hampshire, Rhode Island, and Vermont. It also has contracted with districts in Maine, Massachusetts, New Hampshire, and Vermont to provide professional development on using data and formative assessment to improve classroom practice. State assessment contracts with Measured Progress include some professional development to districts and schools on analysis of assessment data and its use in improving instruction.

Performance Pathways. Performance Pathways, a for-profit technology company, focuses on bringing a data-driven informed culture to educators in schools. The company uses a suite of three products to achieve this goal: Performance Tracker, Assessment Builder, and the TechPaths Curriculum Mapping System. Performance Tracker software allows users to access assessment data and student information in a central location and produce reports and graphs for analysis. Using Assessment Builder, educators can create and score local benchmark assessments based on a library of items aligned to state standards. TechPaths helps educators create, map, and align curriculum and instruction to state standards.

The company has been in existence for three years, the result of a merger between AlterNet Performance and TechPaths Company. The merger allowed the two companies to expand their offerings to include data tools, training on the use of tools, and support for creating a data-informed culture in districts and schools. Professional development provided by Performance Pathways focuses on the use of their data tools to support evidence-based decisionmaking and facilitate the development of curriculum and local assessments aligned with state standards.

In the Northeast and Islands Region Performance Pathways has a state contract with New Hampshire and works with individual districts in Connecticut, Massachusetts, New York, and Rhode Island. Statewide contracts provide districts with software and some professional development. Additional professional development can be purchased from the company. Districts also contract directly with Performance Pathways, using their own funds to pay for those services.

Similarities and differences across the three service providers

The three service providers offer different types of services to support one or more of the four components of a statewide data-driven decisionmaking initiative (table 4). While each of the three service providers has a different focus, state education agencies rely on the specialized skills of these service providers to implement aspects of their data-driven decisionmaking initiatives at the district and school levels. Two of the service providers offer services to support all four components, while one focuses on a single component. Two provide data tools and training in using the tools for analysis and reporting (Wayman et al. 2005; Foley et al. 2008). All three provide at least some support to districts and schools in using data for decisionmaking or in creating a school culture that uses data-driven decisionmaking (Boudett and Steele 2007; Knapp, Copland, and Swinnerton 2007; Young 2006; City, Kagle, and Teoh 2005). And all three follow the "train the trainer" model as a way of leveraging districts' professional development funds.

Table 4
Service providers' support for the four components of state education agency initiatives, 2007/08

Service provider | Centralized data system/warehouse | Tools for data analysis and reporting | Training in data systems/warehouses and data tools | Professional development in using data for decisionmaking
Connecticut RESC Alliance | | | | ✓
Measured Progress | ✓ | ✓ | ✓ | ✓
Performance Pathways | ✓ | ✓ | ✓ | ✓

Source: Authors' compilation based on interviews with service provider and state education agency staff and material provided by service providers.

The three providers differ in their emphasis on developing a culture of using evidence for making education decisions and on promoting proficient and widespread use of data tools. The Connecticut RESC Alliance is more focused on building the capacity of data teams in schools, while Performance Pathways' work is grounded in the development and use of data tools. Measured Progress falls between the two, with its assessments and analysis tools intended to build a climate of data use in schools.

Centralized data system/warehouse. Two of the profiled external service providers provide a centralized data system/warehouse. Performance Pathways' Performance Tracker software allows users to store multiple forms of data, so users can upload assessment data, student information, and other relevant data. The information is stored and linked together in a central location to support reporting. Measured Progress, as part of its work on assessments, helps clients create a web portal to facilitate customized data reporting. Data are generated from the assessments created and scored by Measured Progress. The Connecticut RESC Alliance, which emphasizes the process of data-driven decisionmaking over the use of data tools, does not support services for creating a centralized data system.

Tools for data analysis and reporting. The three service providers vary considerably in the use of technology to support data analysis and reporting. The Connecticut RESC Alliance offers services to support data-driven decisionmaking without the use of technology. District and school teams typically use paper copies of their state assessment reports and other data used in decisionmaking. Measured Progress is beginning to develop online data tools and generate accompanying reports. The tools are housed on their web-based portal. Performance Pathways markets three data tools: one stores and analyzes data, another creates and scores local benchmark assessments, and a third facilitates mapping and aligning of curriculum and instruction to state standards.

Training in data systems/warehouses and data tools. The two service providers that provide tools for data-driven decisionmaking, Measured Progress and Performance Pathways, also provide training in their use. Both organizations stress the importance of this training for maximizing the effectiveness of the tools. Measured Progress uses a train the trainer model for its online tools. Performance Pathways also offers a train the trainer model, providing training for school district representatives who then educate their colleagues on how to use the tools and data. This approach to professional development is increasingly common for districts (Dembosky et al. 2005) and indicates an orientation toward developing a process that will live beyond the initial workshops and participants.

Professional development on using data for decisionmaking. All three service providers offer professional development on using data for decisionmaking. The Connecticut RESC Alliance trains teams of educators from low-performing districts to develop a data-driven culture through a collaborative process of examining data and student work. The training is intended to lead to changes in teacher instruction and student learning. In Connecticut a for-profit partner of the RESC Alliance and the Connecticut Accountability for Learning Initiative, the Leadership and Learning Center, offers certification to participants to prepare them to lead data teams and teach their colleagues about using data for decisionmaking.6

In conjunction with assessment work Measured Progress emphasizes changing the climate of schools and districts by using data to inform instructional decisions. Measured Progress works with educators on how to interpret and use the assessment reports. Performance Pathways offers workshops to educators on how to use the information generated by their reports to make informed instructional decisions.

Considerations for state education agency leaders developing data-driven decisionmaking initiatives

Several issues emerged from this study that may be relevant to state education agency leaders as they develop or expand initiatives to support data-driven decisionmaking.

Adding components of a data-driven decisionmaking system

Although resource constraints force choices about which components of a data-driven decisionmaking system to implement, state education agency leaders interested in supporting data use for decisionmaking in districts and schools might consider whether adding components could help to meet both intermediate and long-term student achievement goals:

• Where the focus has been on data warehousing, state education agencies might consider providing more professional development on how to convert the stored data into usable knowledge that influences teaching and learning.

• Where the focus has been on providing professional development to educators on using data for decisionmaking, state education agencies might consider investing in a data warehouse and data analysis tools to collect, organize, and analyze data for use in decisionmaking.

Any decision to implement additional components will also depend on resources, including capacity and funding.

Sources of funding

Where inadequate resources impede implementation of a comprehensive data-driven decisionmaking initiative, state education agencies could explore a range of funding options to augment their investments in data-driven decisionmaking. State education agencies might consider adopting more creative ways to fund their data-driven decisionmaking initiatives, to enhance the supports provided to districts and schools. Specifically, state education agencies might take advantage of various federal grant programs (such as Title I, Title II, and Reading First) to augment state funding for data-driven decisionmaking.

Strategies to enhance capacity

The state education agencies in this report support data-driven decisionmaking in districts and schools despite limited staff and sometimes limited expertise in developing tools or training educators. State education agencies have turned to external service providers to augment their data-driven decisionmaking initiatives. They might also consider negotiating additional related services under existing contracts to leverage current programs and offer greater support for data use in districts and schools.

Considerations for further research

This study describes how state education agencies in the Northeast and Islands Region support data-driven decisionmaking in districts and schools (see table 5 for a snapshot of services in the region). While it fills some gaps in the research, it raises additional questions about implementation of data-driven decisionmaking. The research could be expanded by looking at state education agency support and implementation of data-driven decisionmaking in greater depth, at implementation at different levels of the education system, and at the impacts of data-driven decisionmaking on student achievement.

Stages of implementation

Many jurisdictions in the region are in the early stages of implementing these initiatives for data-driven decisionmaking. As state education agencies provide more support for data-driven decisionmaking, it would be interesting to revisit this topic:

• How do states scale up their efforts to provide either comprehensive training or expand their support to all districts?

• How do they support a range of data-driven decisionmaking activities, such as formative assessment?

Different levels of the education system

This study did not examine implementation of data-driven decisionmaking at the state, district, school, or classroom level. Exploration of implementation at these levels of the education system could help to answer important questions.

State education agencies. Although this study examined how state education agencies support data-driven decisionmaking by districts and schools, it did not address how state education agency leaders might employ data-driven decisionmaking to inform their own decisions. The emphasis in the literature has been on district and school implementation rather than on state implementation. Some questions to consider:

• What kinds of data-driven decisionmaking practices do state education agency leaders engage in? What kinds of data do they collect and use to make decisions on education policies and practices?

• What kinds of decisions are made by state education agency leaders who engage in data-driven decisionmaking practices?

Districts. Although most data-driven decisionmaking research has focused on implementation at the school level, stakeholders in the region may wish to learn more about data-driven decisionmaking in districts throughout the Northeast and Islands Region.

• How are districts supporting data-driven decisionmaking in schools?

• Specifically, now that the Massachusetts Department of Elementary and Secondary Education has created its data warehouse infrastructure, how are districts helping school-level staff use that data for decisionmaking?

Schools. Avenues for further research at the school level include:

• Is data-driven decisionmaking increasingly embedded in school processes and classroom practice?

• Are teachers and principals implementing the skills taught in professional development sessions?

• What types of decisions are made by teachers and school administrators? Do they have access to the data needed to inform those decisions?

Impact of data-driven decisionmaking on student achievement

Once the data-driven decisionmaking initiatives are more fully implemented, there will be a need to investigate how data-driven decisionmaking affects student achievement.

• Does data-driven decisionmaking help schools meet adequate yearly progress requirements?

• What are the impacts of specific data-driven decisionmaking programs or practices on student achievement?

Table 5
Snapshot of data-driven decisionmaking initiatives in the Northeast and Islands Region, 2007/08
[Multi-page table summarizing, for each jurisdiction (Connecticut, Maine, Massachusetts, New Hampshire, New York, Rhode Island, Vermont, and the Virgin Islands), the description of the data-driven decisionmaking initiative, targeted group, data collected, technology availability, and service provider.]
Note: a. The service providers listed were identified by state education agency officials as the leading provider in the jurisdiction.
Source: Authors' compilation based on interviews, documents provided by state education agency officials, and information available on state education agency web sites. For further information on the material in this table, see appendix D.

Appendix A
Glossary

Culture of inquiry. A type of data-driven decisionmaking in which faculty create an organizational culture focused on using data and other evidence to shape instructional practices.

Cycle of inquiry. An iterative process in which educators collaboratively analyze data to understand their instructional practice and develop a plan to support improvement efforts.

Data. Information maintained by state education agencies, districts, schools, or teachers. Can include assessment data, school-level data on students and staff, demographic data, state test scores, and financial information.

Data-driven decisionmaking. Systematic collection and analysis of various types of data—including input, process, outcome, and satisfaction data—by teachers, principals, and administrators to guide decisions to help students and schools succeed.

Data-driven informed culture. A term used by Performance Pathways for data-driven decisionmaking.

Data system. A centralized repository that combines data from multiple sources to provide an integrated view of data sources and a uniform interface for data access (also called a data warehouse).

Data teams. Small groups of teachers and administrators working together on an ongoing basis to examine individual student work generated from common formative assessments. Data teams use collaborative, structured, scheduled meetings that focus on the effectiveness of teaching and learning.

Data tools. Software that allows teachers and administrators to collect, organize, and analyze data for use in decisionmaking. Can be software on individual personal computers or online tools. User-friendly software allows users to easily access, manipulate, and analyze data to improve classroom practice.

Formative assessment. All activities undertaken by teachers and their students to assess student understanding and to provide feedback for ongoing adjustments in teaching and learning. Timely adjustments help students achieve targeted learning goals or standards within a set timeframe.

Longitudinal data system. A system that holds and tracks multiple years of student and teacher demographic data, test scores and assessments, and other information.

Low-performing school or school in need of improvement. A school that has not made adequate yearly progress for two consecutive years, as specified under the No Child Left Behind Act of 2001. Such schools are identified as "in need of improvement."

Multiple measures. The use of various assessments to characterize the performance of students, schools, and school districts. Measures for assessing student achievement can include student portfolios and exhibitions, performance assessments, teacher observations, and formal tests. School and district measures can include student growth measures, promotion rates, attendance records, suspension rates, graduation rates, enrollment in honor or advanced placement classes, and formal tests.

Service providers. External organizations working with states, districts, and schools to provide technology, professional development, and technical assistance for data-driven decisionmaking.

State data-driven decisionmaking initiatives. Initiatives conceptualized at the state level with a set of comprehensive goals, objectives, and purposes and with designated resources (such as funds and personnel) to support development and implementation. State education agency leaders must communicate such policies to districts in the state and offer them support.

Train the trainer model. A system in which some individuals are trained in a theory or technology and are then in turn responsible for training others in the same material.

User conferences. Meetings that bring together educators who are working with a given product and allow them to learn more about the product and other products and how they work with each other.

Appendix B
Study methods

This project has two main purposes: to catalogue statewide data-driven decisionmaking initiatives being implemented in the Northeast and Islands Region through document reviews and interviews with key respondents from state education agencies and to catalogue service providers that support data-driven decisionmaking initiatives in the region while profiling three such providers.

Research questions

Two research questions guided the study:

• What state education agency initiatives support data-driven decisionmaking by districts and schools in the Northeast and Islands Region? What are their similarities and differences?

• What service providers do the state education agencies work with to support educators in using data to inform education decisions, and what services do they provide? What are their similarities and differences?

Sample selection

The original sample included all nine jurisdictions in the Northeast and Islands Region. However, the research team was unable to schedule interviews with key respondents in the Puerto Rico Department of Education despite repeated attempts over several months to contact the department. All other jurisdictions in the region are profiled (see appendix D for state education agency profiles).

The research team identified key respondents at state education agencies by drawing on existing relationships within the Regional Educational Laboratory Northeast and Islands, through liaisons and researchers assigned to each state and through the team's professional contacts from previous research in the region. In a few cases the team found respondents through web sites describing states' data-driven decisionmaking initiatives. Depending on the state, the respondents held various roles in the state education agencies, including assessment coordinator, director of evaluation and research, information technology director, school improvement consultant, and information systems specialist (among others).

Respondents at each state education agency provided names and contact information for lead service providers supporting its data-driven decisionmaking initiative with professional development for districts and schools. From this list a catalogue of nine service providers was created (see appendix E). This catalogue includes providers mentioned by state education agency respondents in interviews as the lead provider in their state and is not an exhaustive list of all providers working in the region. It includes one provider contracted with each state except Rhode Island, for which two were chosen because both had been selected by the state through the requests for proposals process.

Three providers were then selected for more in-depth profiles. These providers met the following criteria: they worked with multiple jurisdictions in the region, they had existed for two or more years, and they had offered services for two or more years (that is, they were not starting up the activities). This reduced the sample to eight providers—the New York Board of Cooperative Educational Services was excluded because it served only districts and schools in New York State. Organizations were selected from the remaining eight providers of professional development in using data to inform teaching and learning in addition to other services to districts in the region. That left four organizations—the Connecticut Regional Education Service Center Alliance, Measured Progress, Performance Pathways, and TetraData. TetraData and Performance Pathways offered similar services. As Performance Pathways offered a larger range of services, it was selected.

The three service providers profiled in the report and in appendix F are:

• The Connecticut Regional Educational Service Centers Alliance (provides professional development on the process of data-driven decisionmaking).

• Measured Progress (provides data warehousing services, data analysis tools, and professional development on these tools and on data-driven decisionmaking).

• Performance Pathways (provides tools for data analysis, reporting, and curriculum alignment, with professional development on these tools and on data-driven decisionmaking).

Data sources

To answer the research questions, researchers used data from four sources:

• Publicly available information. This included reports from state education agencies and service providers on data-driven decisionmaking, descriptions of procedures and technical assistance to support data use, schedules of procedures, and information on state education agency and service provider web sites.

• Supplemental documents. Respondents from both state education agencies and service providers offered additional documentation that was not publicly available.

• Interviews. Semi-structured, open-ended interviews were conducted with key respondents from each state education agency and service provider. An application to Education Development Center's Institutional Review Board for an expedited review of involvement of human subjects in a research project was approved, and researchers informed respondents of their rights and responsibilities through written informed consent forms.

• Observations. When possible, interviews and reviews of official materials were supplemented with observations of professional development activities—a check on the self-reported information from the providers. Researchers observed two of the three providers profiled in the report. The timing of the project made it impossible for researchers to observe the workshops conducted by Measured Progress, which are held when state assessment data are released. Instead, researchers spoke with a participant from a district that has worked with Measured Progress and an observer from the Maine state education agency to gather information about whether the workshop matched the service provider's description.

Two open-ended protocols were developed for the study, one for state education agencies and one for service providers, to guide data collection. The protocols for state education agency officials included questions addressing 11 key factors:

• Motivation for the initiative.

• Overarching goals of the initiative.

• Data collected and used.

• Tools used to collect, house, and disseminate data.

• Users of data at different levels of the system.

• Data analysis tools.

• Decisions made using information gleaned from the data.

• Leaders of the initiative.

• Infrastructure in place to support data-driven decisionmaking.

• Personnel supports in place.

• Challenges to scaling up the initiative.

The protocols for service provider personnel were designed to collect information on 12 key factors:

• Objectives of services.

• Skills targeted.

• Key components.

• Users of the services.

• Leaders of the services.

• Amount of time that services had existed.

• Numbers of schools and districts that programs served.

• Timeframe for the services.

• Cost of the services.

• Research base for the services.

• Reasons states gave for selecting the services and service providers.

• State leaders' experiences working with the service providers.

Some of these factors were based on the four dimensions of data-driven decisionmaking that are designed to build and sustain capacity: leadership, personal supports, infrastructure, and programmatic content (Abbott 2008).

Data collection methods

Data collection began in September 2007 and ended in April 2008. Six sequential steps were taken to answer the research questions:

• Collect publicly available information on data-driven decisionmaking initiatives. Public online documents, including state education agency web sites and web sites for state education agency partners (such as education agencies), were reviewed to gather information on jurisdictions' data-driven decisionmaking practices. This helped to describe the initiatives and, in some cases, to locate key contacts in each jurisdiction.

• Conduct interviews with state education agency respondents. Three researchers investigated two jurisdictions each, and two researchers examined one each. Researchers held 45-minute interviews with a key state education agency contact in each jurisdiction to gather descriptions of data-driven decisionmaking strategies. In all but one jurisdiction interviews were limited to one respondent. In Massachusetts two state education agency officials were interviewed. Most interviews were held by telephone, but interviews in Maine, Massachusetts, and Vermont were held in person at the state education agencies. In addition to answering questions about the jurisdictions' data-driven decisionmaking initiatives, state education agency contacts supplied the names of data-driven decisionmaking professional development service providers who worked with them and the names of their contacts at each service provider. All interviews were recorded and transcribed.

• Gather publicly available information about service providers who work with state education agencies. Information from service provider web sites was used to create the catalogue of data-driven decisionmaking professional development service providers in appendix E.

• Select three service providers for in-depth profiling. From the nine providers in the catalogue, three were chosen for in-depth profiling using the criteria described above.

• Interview key respondents at service providers. Telephone interviews were conducted with two key contacts at each of the three profiled service providers to gather information about their professional development services in data-driven decisionmaking. At Measured Progress, in addition, researchers interviewed a district administrator who had participated in the training and who demonstrated the data tools. Some phone interviews were followed up with in-person interviews to observe service provider activities. All interviews were recorded and transcribed.

• Observe professional development services. Teams of two researchers observed professional development training programs or workshops. The observations allowed researchers to confirm information reported by the service providers. During the visits researchers asked contacts to provide additional documentation, including training and curriculum materials, course syllabi, and marketing information, thus gathering practical information that could be useful to the state education agencies. For example, details on the cost, duration of services or activities, and size of typical workshops and trainings for a given provider can help states understand what to expect when working with such providers. Researchers also focused on learning how service providers met the differing needs of jurisdictions. Interview transcripts and observation notes were systematically analyzed using the ATLAS-ti software program. Documents were used to further illustrate or explain issues raised in the observations and interviews.

Because Measured Progress had already finished all its professional development activities at the time it was contacted, researchers reviewed workshop materials and spoke with two additional respondents who had attended its workshop in Maine.

Researchers also interviewed two district officials in Massachusetts to understand how they access the state's data warehouse initiative and how they use the available data for decisionmaking.

Data analysis strategies

The analysis of all collected data, including documents and other provider materials, interview transcripts, and observational field notes, was based on the categories outlined in the original research proposal and in data collection protocols. The research team developed a code book, or coding scheme, based on the factors that guided data collection and discussed and developed subcodes to allow a more nuanced analysis of the transcripts and notes. Most of the codes were developed before coding began, but a few subcodes were added during the discussions to ensure intercoder reliability. Subcategories were developed during analysis, though the overarching themes had been identified in the original proposal. The main coding themes, and examples of the subcodes that guided the analysis of transcripts and notes, appear in tables B1 (state initiatives) and B2 (professional development). Cross-state education agency analysis allowed the researchers to identify the four components of a data-driven decisionmaking initiative—a framework for the findings.

To ensure uniform conclusions when coding transcripts and documents, just two coders were assigned to each of the two sets of interview transcripts (state education agency and service provider), but with three researchers participating in the coding. One researcher coded both sets of transcripts, partnering with a second researcher to code one set and with a third to code the other set. The two researchers coding each set worked closely to agree on interpretations of the qualitative data and to ensure consistency in coding. For each set, after coding a single transcript separately, researchers then discussed their individual coding of the transcript item by item. Each coder then coded a different subset of transcripts and notes.

To develop the case profiles of statewide data-driven decisionmaking initiatives and triangulate evidence supporting the description of the initiatives, transcript data were reviewed alongside both publicly available documents and those provided by state personnel. Information relevant to each code and category was synthesized thematically, based on the categories outlined in the analysis protocols (Abbott 2008), and the syntheses were used to describe each initiative.

Table B1
State initiative coding scheme

Main theme | Examples of subcodes
Initiative | Direction; requirement; focus
Motivation for the initiative | Accountability; analyzing data
Overarching goals of the initiative | Data storage; accountability
Data collected and used | State assessment; student information
Tools used to collect, house, and disseminate data | Software from outside provider; data warehouse
Users of data at different levels of the system | State department of education; principals
Data analysis tools | Use of software and reporting; use of professional development
Decisions made using information gleaned from the data | Academic; financial
Leaders of the initiative | State department of education personnel; district officials
Infrastructure in place to support data-driven decisionmaking | Professional development; data warehouse
Personal supports in place | Professional development centers; state department of education personnel
Challenges to scaling up the initiative | Money; culture
Evaluation | Initiative; outside provider
Definition | Data-driven decisionmaking

Source: Authors’ analysis.

For the service provider case profiles, interview transcripts and observational field notes were analyzed alongside documents provided by contacts at the service providers and coded for the key categories included in the interview and analysis protocols. Information for each code and category was synthesized thematically, based on the categories outlined in the analysis protocols, and compiled into each case profile.

This systematic analysis of the collected data allowed the researchers to build case profiles of the state education agencies and the service providers. The team reviewed each profile, revised it based on comments from the team, and sent the revised profile to respondents for validation. Profiles were again revised after comments were received from respondents.

After reviewing all the profiles, the team held several discussions to identify cross-profile similarities, differences, and challenges.

Table B2. Professional development services coding scheme (main theme: examples of subcodes)

States: New York; New England Common Assessment Program
Overarching goals of the organization: Provide technology-based products in data-driven decisionmaking to districts and states; Provide professional development for using assessment data to districts and states
Objectives of the services: Create culture of data-driven decisionmaking in schools; Make states better consumers of data
Skills targeted: Analysis of data; Use of technology-based tools for data-driven decisionmaking
Key components: Training in process of data-driven decisionmaking; Training in use of technology-based products for data-driven decisionmaking
Users of the services: District personnel; Principals
Leaders of the initiative: State department of education personnel; Outside service provider
Amount of time services had been in existence: 1–3 years; Varied for each service provided
Numbers of schools and districts that programs served: All districts in a state; Some districts in a state
Timeframe for the services: Less than three months; 1–3 years
Cost of the services: Paid by state; Paid by districts
Research base for the services: Based on work of data-driven decisionmaking experts; Large-scale evaluation work
Curriculum and content: Analysis of data; Use of technology-based tools for data-driven decisionmaking
Trainers: Company personnel; District personnel previously trained by trainers
Trainees: District personnel; Principals
Kinds of data analyzed: State assessment; Student information
Analysis tools: Use of data teams; Software-generated reports
Decisions made using information gleaned from the data: Financial; Academic
Medium of delivery: In person, with technology; Online


Table B2 (continued). Professional development services coding scheme (main theme: examples of subcodes)

Training process: Based on work of one data-driven decisionmaking expert; Based on work of various data-driven decisionmaking experts
Materials provided to trainees: PowerPoint presentation from training; Activity sheets
Follow-up training: Follow-up technical assistance as part of professional development; Follow-up technical assistance when requested
Supports: Follow-up technical assistance; Technology-based tools
Activities and goals: Data analysis and interpretation activity worksheets; Train other teachers and administrators in a district and school
Connections with everyday practice: Use of real data; Access to real assessment items
Implementation experience issues: Capacity; Teacher resistance

Source: Authors’ analysis.

One member of the team took notes during the discussion and drafted the themes. Once the team had reviewed the draft, it met for another discussion to flesh out the themes and add documentation from the profiles. A similar process guided the development of the considerations for state education agency leaders and for further research.

Study limitations

The primary limitation of this project was the small number of respondents interviewed at each state education agency and service provider. Although the research team triangulated the information from the interviews with documentation and observations, it is unclear whether the opinions of the respondents represented their colleagues' perspectives.

In addition, the project did not explore the role of the districts or schools in supporting the implementation of data-driven decisionmaking, nor did it consider the impact of data-driven decisionmaking initiatives on school improvement or student achievement. Finally, this project did not assess the effectiveness of any of the state education agency initiatives for data-driven decisionmaking or the impact of the professional development that was provided to build educators' capacity to use data-driven decisionmaking.

Appendix C
Research on the implementation of data-driven decisionmaking, barriers to implementation, and supports available

This appendix examines the literature on data-driven decisionmaking. In particular, it explores what research says about the implementation of data-driven decisionmaking, the barriers to its implementation, and the available support to states, districts, and schools in implementing a data-driven culture.

Implementation of data-driven decisionmaking at the district and school levels

Studies of how data-driven decisionmaking is implemented show that districts and schools encourage various activities to build educators' capacity to use data and evidence to improve student outcomes. The research suggests that, to ease the use of data-driven decisionmaking, districts and schools provide technology infrastructure and professional development to help educators collect, organize, and analyze data and to foster an environment that supports a culture of inquiry. Although most of these findings were gleaned from case studies of district and school implementation, one national survey of teachers has documented their data-driven decisionmaking practices.

Education professionals collect various data, including test scores, classroom assessments, and information on the background characteristics of faculty, students, and schools (Gallagher, Means, and Padilla 2008; Wayman 2005). These data may be housed in data systems/warehouses (Marsh, Pane, and Hamilton 2006; Mills 2008; Wayman, Stringfield, and Yakimowski 2004), which facilitate teacher and administrator access to multiple sources of data (Dembosky et al. 2005; Kerr et al. 2006; Lachat and Smith 2005; Marsh, Pane, and Hamilton 2006; Storandt, Nicholls, and Yusaitis 2007; Supovitz and Klein 2003; Wayman 2005). In 2006/07 nearly all school districts in a nationally representative sample surveyed by the U.S. Department of Education's National Educational Technology Trends Study maintained at least some student data electronically, and the majority (91 percent) of teachers surveyed reported access to at least limited data on their students (Gallagher, Means, and Padilla 2008).

In addition, many educators have access to data tools that can facilitate data-driven decisionmaking. These can include software programs and online data portals that provide more advanced and customized ways of analyzing and reporting data (see case studies by Larocque 2007; Storandt, Nicholls, and Yusaitis 2007; Chen, Heritage, and Lee 2005; Lachat and Smith 2005; Light et al. 2005; Murnane, Sharkey, and Boudett 2005; Mason 2002). These software programs can either be separate from data warehouses or used in conjunction with data warehouses or other types of databases. Training on both the data systems/warehouses and the analysis tools is commonly provided by the organization that developed the system or through a district's central office staff (Foley et al. 2008). Training educators how to access and analyze data is an important part of developing data-driven decisionmaking among school or district staff—research shows that educators often lack an understanding of data, or the skills needed to track and analyze useful data (Earl, Katz, and Fullan 2006; Dembosky et al. 2005).

Research on implementation of data-driven decisionmaking reveals that teachers and administrators also receive professional development on data-driven decisionmaking. Schools with comprehensive data-driven decisionmaking develop a culture of inquiry in which educators collaboratively analyze data, to understand their instruction practice and to develop plans to support school improvement efforts (Boudett and Steele 2007; Knapp, Copland, and Swinnerton 2007; Young 2006; City, Kagle, and Teoh 2005; Lachat and Smith 2005; Murnane, Sharkey, and Boudett 2005; Copland 2003; Feldman and Tung 2001). This cycle of inquiry helps schools evolve into learning organizations or "organizations where . . . people are continually learning to see the whole together" (Senge 1990, p. 3).

In these schools educators use data to clarify instructional problems (Firestone and González 2007). Professional development to support these processes follows the model of professional learning communities, which focus on the development of collaborative, reflective practices (Feger et al. 2008). In professional learning communities for data-driven decisionmaking, often called data teams, coaches guide teams of educators through data-driven decisionmaking to help them use data to make and evaluate changes in instruction (Love et al. 2008).

Barriers to data-driven decisionmaking

Research on data-driven decisionmaking in districts and schools also documents barriers to using data to improve student outcomes. A primary barrier is that the available data may not be relevant to school and district improvement plans. For example, data used to demonstrate adequate yearly progress may not be appropriate for analyzing classroom practice (Mandinach 2008). Where teachers have access to student data, the data system might not provide tools to organize and analyze the data in a meaningful or relevant way (Gallagher, Means, and Padilla 2008). Many educators face difficulties in transforming data into actionable knowledge (Massell 2001; Chen et al. 2000; Gallagher, Means, and Padilla 2008; Hassler, Buck, and Torgesen 2004; Love 2004; Lang et al. 2007), in part due to a lack of essential knowledge and skills to analyze data (Lang et al. 2007; Marsh, Pane, and Hamilton 2006).

External supports for data-driven decisionmaking in districts and schools

To overcome the barriers described above and to effectively use data-driven decisionmaking in districts and schools to improve student outcomes, educators not only must have access to data but also must know how to organize and analyze the data available. Often they need external supports to develop their capacity to use data in their districts and schools, so a variety of external organizations provide professional development on data-driven decisionmaking to district and school administrators and teachers (Feldman and Tung 2001; Lachat and Smith 2005; Ikemoto and Marsh 2007). Such external support often comes from organizations that create data tools (Dembosky et al. 2005; Storandt, Nicholls, and Yusaitis 2007; Wayman, Stringfield, and Yakimowski 2004) or from professional development organizations or universities that help districts and schools develop data-driven decisionmaking (Boudett, City, and Murnane 2005; Darling-Hammond et al. 2007; Thorn, Meyer, and Gamoran 2007).

Although most research on external supports for data-driven decisionmaking focuses on professional development organizations and universities, state education agencies have also recently begun to support data-driven decisionmaking in districts and schools (Dembosky et al. 2005; Massell 2001; Moss and Piety 2007; Reichardt 2000). Because these initiatives are new, there is little research in this area. A handful of descriptive studies were found that looked at state education agency initiatives to support data-driven decisionmaking at the local level. They show that states have supported the needs of local educators in three primary ways: creating a policy structure to support and encourage data-driven decisionmaking, providing data storage and analysis tools, and building the capacity of local educators to use data.

One descriptive study, released prior to the implementation of the No Child Left Behind (NCLB) Act, documents that one state education agency supported data-driven decisionmaking in three ways: by providing access to state assessment data (through a standards-based accountability system), by requiring school improvement plans to include using data to measure progress, and by providing each district with Statistical Package for the Social Sciences software and free training on that software (Reichardt 2000). Another study, involving 100 interviews with educators in six districts and surveys of 26 superintendents, detailed how the state education agency in Pennsylvania makes resources available to all districts to support data-driven decisionmaking (Hergert et al. 2009). These supports include customized reports of state assessment data, an annual training seminar on data-driven decisionmaking, and help for districts to implement the Pennsylvania Value Added Assessment System for growth modeling of individual student achievement. The Pennsylvania state education agency also focuses other resources—for example, supports for school improvement planning—on schools in need of improvement. Interviews with state education agency officials and a review of documents revealed that state education agencies in the Northeast and Islands Region support educators in low-performing schools to develop their capacity to analyze and use data in their school improvement plans (Hergert et al. 2009).

Data-driven decisionmaking at the local level is not new. But the need for the state education agency to play a role in supporting data-driven decisionmaking in districts and schools has increased dramatically in the past decade, with new state and federal external accountability systems. The increased nationwide emphasis on data-driven decisionmaking as a school improvement strategy, combined with a lack of information about state education agency supports for data-driven decisionmaking, makes this report an important contribution to data-driven decisionmaking efforts.

Appendix D
Profiles of state education agencies' data-driven decisionmaking initiatives

This appendix provides profiles of the state data-driven decisionmaking initiatives in Connecticut, Maine, Massachusetts, New Hampshire, New York, Rhode Island, Vermont, and the Virgin Islands.

Connecticut

The Connecticut State Department of Education's statewide data-driven decisionmaking initiative targets low-performing districts. Under the department's Connecticut Accountability for Learning Initiative (CALI), regional educational service centers train teams from schools in low-performing districts to analyze state assessment results and other data to improve instruction and student performance. The basic training for teachers and school administrators focuses on how to build and sustain data-driven decisionmaking in district- and school-level data teams. The Leadership and Learning Center, a partner, provides certification training for educators who plan to create a schoolwide culture of data use through professional development at their schools.

State context. Connecticut has 1,111 schools serving 575,058 students. The majority of students are White (67.0 percent), and most of the rest are Hispanic (15.4 percent) or Black (13.7 percent). Roughly a quarter (26.5 percent) of students in the state are eligible for free or reduced-price lunch, and 44 percent of the schools are Title I. Connecticut schools have long been high performing, obtaining scores above the national average on mathematics and reading for all administrations of the National Assessment of Educational Progress (NAEP) since 1992.

The Connecticut Mastery Test (CMT), a state assessment, is administered annually to all public school students in grades 3–8 to assess mathematics, reading, and writing skills. The Connecticut Academic Performance Test (CAPT), given to students in grade 10, is a natural extension of the CMT. The CAPT assesses and reports on student performance in four areas: mathematics, reading across the disciplines, writing across the disciplines, and science.

The state has been making major changes to the organization of the Connecticut State Department of Education and its relationship to school districts and schools, with a new commissioner appointed in May 2007 and new school reform legislation passed in June 2007. The Bureau of School and District Improvement and the Bureau of Accountability, Compliance, and Monitoring were tasked with delivering "systematic and systemic" support to schools and districts. These bureaus work with all districts to provide data on student and district outcomes, processes to analyze data, methods of developing improvement activities, and training through the CALI.

Overview of the initiative. The Connecticut State Department of Education's statewide data-driven decisionmaking initiative is part of the CALI, which aims to advance learning for all students, particularly students from the 12 districts under corrective action. The Connecticut State Department of Education partnered with the regional educational service centers and the Leadership and Learning Center to provide support to districts and schools in six key areas (Connecticut State Department of Education 2008):7

• Data-driven decisionmaking data teams: using district and school data for analyzing, setting goals, and implementing research-based instruction strategies.

• Making standards work: aligning school and district assessment and instruction and developing classroom-based assessments to monitor student progress.

• Effective teaching strategies: examining ways to write lesson plans and deliver instruction using Dr. Robert Marzano's research-based strategies (Marzano, Pickering, and Pollock 2001, cited in Connecticut Department of Education 2008).

• Common formative assessments: collaboratively designing assessments to improve learning, as opposed to assessments of learning, and analyzing the results.

• School climate improvement to support academic achievement: exploring practical structures and strategies that address bullying and school violence by creating safe learning environments.

• Accountability in district and school improvement planning: creating a framework for a new accountability system.

The CALI's overarching goal is to reduce student achievement gaps. Its data-driven decisionmaking component, which began in 2004, provides districts with professional development and technical assistance in using data to improve pedagogy and learning. The professional development and technical assistance are offered to Title I districts and schools identified by adequate yearly progress measures as in need of improvement. Ineligible districts and schools can participate for a fee, although in 2008 the professional development and technical assistance will be offered to the entire state. For districts that choose to participate, four days per year are set aside for professional development, and follow-up technical assistance is offered throughout the year.

Three external service providers offer data-driven decisionmaking services to high-need districts in the state: the CALI, the Leadership and Learning Center (an educational professional development, publishing, and consulting organization based in Denver, Colorado), and the Regional Educational Service Centers (RESC) Alliance (see appendix E) with the State Education Resource Center (SERC). The CALI oversees implementation of the professional development and technical assistance, which are funded by the districts. The Leadership and Learning Center creates materials for the professional development training. The RESC Alliance and SERC deliver the training.

The CALI offers two kinds of data-driven decisionmaking training: basic training and certification training. The basic training is a two-day seminar for teachers and school administrators, including curriculum developers, superintendents, and principals. Participants learn first how to examine their own real student data using a six-step data-driven decisionmaking process (treasure hunting, analyzing needs, prioritizing needs, setting goals, identifying instructional strategies, and determining results indicators) and then how to develop and sustain this process in district- and school-level teams of fellow teachers and administrators. These data teams collaboratively analyze data and identify student strengths and weaknesses. Team members then come up with instruction strategies to address these as well as required learning standards.8 Team members implement their strategies, monitor them, and discuss them at the next meeting, which should take place in their districts.

The three-day certification training is for educators who plan to create a schoolwide culture of data use through professional development at their schools. During the training they learn more about the data-driven decisionmaking process taught in the basic training and about how to become effective leaders of professional development in their schools. Participants bring their own classroom- or school-level data to be analyzed. Some attendees bring state assessment data. Others bring data on behavioral issues, such as tracking the number of classroom referrals and office referrals for students. In addition, some bring school climate survey data. The data are typically on paper or in a spreadsheet. No technological analysis tools are used. Data analysis occurs through the six-step process mentioned above: users examine their data (treasure hunting), identify and analyze student needs, prioritize the needs, set goals for meeting them, identify instruction strategies to target the goals they set, and devise ways to measure whether the goals have been met once the strategies have been tested (determining results indicators). Actions resulting from this process include curriculum revision, program redesign, and funding redistribution.

Districts that participate in the professional development receive two kinds of support: technical assistance and materials to facilitate data team work. Technical assistance, provided to districts throughout the year as a follow-up to the professional development sessions, is based on the specific needs of each district that requests it. Once the training is complete, the materials provided during the training support the formation of data teams in each participant's district. These materials include a booklet with an overview of the six-step data-driven decisionmaking process, activity worksheets for each step, a booklet describing the data teams and how they should work, and activity sheets to guide participants in implementing the teams in their districts. In addition, a DVD is provided with video clips of a sample data team. The initiative does not provide a technological infrastructure to house data.

Next steps. Starting in fall 2008, the CALI planned to expand its services to all districts in the state, rather than providing them to only low-performing districts.

Additional resources. These include:

• The CALI web site at www.sde.ct.gov/sde/cwp/view.asp?a=2618&Q=321754, technical assistance standards at www.sde.ct.gov/sde/cwp/view.asp?a=2618&q=322294, and data-driven decisionmaking and related training at www.sde.ct.gov/sde/cwp/view.asp?a=2618&q=321744 and www.sde.ct.gov/sde/lib/sde/word_docs/curriculum/cali/cali_training_description.doc.

• The Leadership and Learning Center web site at www.leadandlearn.com.

• The Connecticut RESC Alliance web site at http://ctrescalliance.org/index.html.

• The SERC web site at www.ctserc.org.

• The web site for public summary performance reports for Connecticut's statewide testing programs at www.CTreports.com.

Maine

Both leadership and staff at the Maine Department of Education are aware of the importance of data-driven decisionmaking. Before the No Child Left Behind (NCLB) Act of 2001 took effect, Maine was developing a balanced accountability system that combined local assessments with statewide testing in various grades. Maine implemented the Maine Educational Assessment (MEA) in 1984 and has since supported teachers in analyzing MEA test scores to improve classroom practice. As the department moves to implement annual NCLB testing requirements, its focus is shifting from performance assessments to longitudinal analysis of MEA data, with a federal grant supporting the development of a longitudinal data system. Despite constrained resources and a small staff, the department encourages and supports data-driven decisionmaking within specific programs and provides some support to schools and districts for developing their capacity to analyze data.

State context. Geographically, Maine is the largest state in New England, with a larger landmass than the rest of New England combined. That creates challenges to equitably serving students across the state. Maine's population of 1.3 million is 96 percent White, compared with 67 percent across the United States, and is largely rural. Although an increasing population of immigrants has settled in the state's three urban centers, overall school enrollment is declining slowly. Public school enrollment declined from 213,867 in 1996/97 to 193,335 in 2006/07 (Maine Education Policy Research Institute 2008). Historically Maine has had a high-achieving school system, measured in part by high rankings on the National Assessment of Educational Progress (Maine Education Policy Research Institute 2008).

Overview of the initiative. Maine created the MEA in 1984 and, from its beginning, helped local educators interpret their MEA scores and use the data to inform classroom practice. The state expanded this work in the 1990s. Maine's education standards, the Maine Learning Results, were approved in 1997 and included a local performance assessment designed to show what students knew and were able to do.

In the late 1990s and early 2000s Maine helped districts develop local assessments and helped teachers and principals use the data to improve student achievement. With limited state resources, and given the difficulty of holding schools and districts accountable for myriad performance assessments, the Maine Department of Education is now focusing on creating a longitudinal data system that will support analysis of MEA scores at the student level.

Although the department encourages data-driven decisionmaking in many of its state-sponsored programs and in its support of low-performing schools, it has no overarching, integrated policy or vision to encourage data-driven decisionmaking in districts and schools. Capacity is low: at the time of the study the state had several vacancies in data-related positions, including no director of state assessment and no National Assessment of Educational Progress coordinator, and both positions had been empty for some time. Also, Maine's focus has been on supporting activities at the school level rather than on crafting a statewide data system. This focus is beginning to shift, with stronger external accountability measures imposed by the federal government and a federal grant supporting the development of Maine's longitudinal data system.

The MEA coordinator, one of a few department staff members dedicated to assessment-related work, is passionate about using data in the classroom—something she did actively when she was a teacher. "Having yearly data following kids through is powerful—if you know what you're looking for" (interview, February 20, 2008). She regrets that, with limited staffing, state assessment staff cannot get out to schools:

I don't work with individual districts that much. I think we have an obligation as a department to provide data-driven decisionmaking workshops, [to help them use] all the information that is available to them. We get calls from people asking, "Can you help me? I have all this data, and I don't know what to do with it; what I should be looking for?" We don't have the capacity to do that on a large-scale basis. . . . I help people on the phone all the time, but I can't get out there. (Interview, February 20, 2008)

Other department offices deploy content experts to work with teachers on item analyses. But, without an overarching state policy, the data-driven decisionmaking work can become sequestered in programs within the department. For example, the special education program supports districts in analyzing data on students with special needs, and the Title I office helps low-performing schools with data-driven decisionmaking. Staff from different program offices concur that MEA data are critical to their school improvement work and that data-driven decisionmaking is a key part of that work. Despite the lack of a formal, overarching data-driven decisionmaking policy, informal communication and collaboration appear to support using data-driven decisionmaking across programs.

The Maine Department of Education supports professional development for understanding and using MEA data, including both training and support in reading MEA reports and in using data to improve instruction. Support in reading MEA reports is provided by staff from the department assessment office and from the assessment company Measured Progress. The workshops, part of the state's assessment contract, are provided free to districts. Other department offices support using the data to improve teaching and learning.

Maine has contracted with Measured Progress to develop, administer, and score the MEA, to report test scores to each school in the state, and to support district and school leaders in using the test data. Every fall, staff from Measured Progress and the Maine Department of Education's assessment office lead report interpretation workshops. These are held in five locations around the state to ensure that they are accessible to all Maine educators. In the fall of 2007 Measured Progress rolled out a web-based portal for accessing MEA scores, focusing professional development on helping school leaders both understand their scores and use the portal.

The workshops are offered to district leaders, such as superintendents and curriculum directors. Some principals also attend. These leaders attend primarily to learn how to use data and understand adequate yearly progress in their schools. Such workshops have been offered since the state began administering the MEA in 1984.

According to the MEA coordinator, with the new Measured Progress interactive web-based portal, a grade 5 teacher not only sees students' MEA scores for this year but can check how they did on the grade 4 test. Yet a visit to a school district showed that the database accessible through the interactive portal is populated with only one year of MEA data. This is the first year Measured Progress has used the portal to provide assessment reports to schools, so only the current year of data is available online (interview, April 7, 2008). Because of recent changes to state standards and the MEA, there are no plans to add earlier test scores to the database. However, new test scores will be added in each subsequent year. Currently, the lack of longitudinal data limits the portal's usefulness.

Maine Department of Education staffers provide more in-depth professional development, focused on test content, to help faculty use MEA scores to improve instruction and to tailor it to each student. The department offers workshops for teachers in reading, writing, math, and science. Department staff realize that teachers will not use a lot of the MEA data, so they try to focus on item-level data and how these data can help teachers with instruction.

Department staff go to schools, where they can tailor professional development to help school faculty learn from their classes' responses to particular items. The staff explain how to look at specific released items and how to analyze data trends over time. They want teachers to be very familiar with the standards, or Maine Learning Results—the standards students are measured against on the MEA. In the words of the MEA coordinator:

They are supposed to teach [to] standards, and the MEA is supposed to assess [by] those standards. So rather than teaching to the MEA, they should be teaching [to] the standards [that] the MEA will identify . . . but that's a disconnect for people. They just want to get ready for the test. So we've been providing release items for many, many years. . . . We encourage them to integrate as much as they can into the everyday class work [and] not just . . . get ready for the test a week before the test. (Interview, February 20, 2008)

At the time of this study, though, there were only two content-area coaches to serve the entire state of Maine.

Next step. The Maine Department of Education's support for data-driven decisionmaking has evolved over time since the MEA was created in the 1980s. But such support appears to have grown stronger in recent years. The MEA coordinator attributed some of this to NCLB accountability requirements:

We always encourage data-driven decisionmaking. That's always our goal, internally as well as externally. I have seen a big change, anecdotally, since No Child Left Behind came in. The test finally means something to them. Before you'd get your bad score in the paper and you'd feel terrible, but now it's changed. It's huge. Now, they are actually looking at the data, and a lot of them didn't before. (Interview, February 20, 2008)

Despite this shift in attitudes, the coordinator recognized that the state could do more:

Am I happy that we are hitting everybody and providing good instruction in how to use data? No. I've identified that as a need going forward, that we really need to collaborate. . . . We just simply don't have the staff, but we do what we can. (Interview, February 20, 2008)

This issue has recently come to the fore in Maine. In June 2007 the Department of Education received a three-year, $3.2 million grant from the U.S. Department of Education to support the development of a longitudinal data system. The grant was intended to help states expand their existing data systems so they can better use data to meet reporting requirements, support decisionmaking, and aid education research. States were selected in a competition and judged on their need for the project, the quality of their design and management plan, and their promotion of the timely generation of accurate data for local, state, and federal reporting requirements.

In February 2008 Maine contracted with Infinite Campus, an information management firm, to integrate its current databases into one longitudinal data system. The Maine Department of Education's leadership, along with its data management team, determined that the state needed a new data system to meet growing data needs. The new system will integrate MEA data with the student information system, Maine Educational Data Management Systems, and eventually will include staffing data. It is currently being piloted with a small number of schools. It will take several years to integrate all districts and schools into the system.

Additional resources. These include:

• Measured Progress at www.measuredprogress.org.

• Infinite Campus at www.infinitecampus.com.

Massachusetts

Massachusetts is implementing the Educational Data Warehouse, a statewide initiative. The Massachusetts Department of Elementary and Secondary Education has purchased a license for data warehouse software that will house all the state-maintained data and allow school districts to load and analyze their own data. Staff at all levels (department, district, and school) will have access to the data for reporting and for making curricular and pedagogical decisions. For three years Massachusetts piloted the initiative, and at the time of the study it was working toward statewide implementation, with 76 districts having access to the virtual warehouse.

Participation in the initiative will not be mandatory—the state lacks funding to enforce it. The department obtained funding to purchase the online tool and pay a maintenance fee, to hire a few staff dedicated to the data warehouse, and to support school districts in the pilot phase. Once the initiative is statewide, districts that join will rely on their own budgets to build their capacity to understand, run, and use the software.

State context. Massachusetts is the most populous of the six New England states, with more than 6 million residents. The public education system serves nearly a million students in 388 school districts and 1,873 schools. In 1993 Massachusetts passed the Education Reform Act, which required the development of content and performance standards. As a result, the state developed the Massachusetts Comprehensive Assessment System (MCAS) to test all public school students on the state learning standards. Students are tested every year in reading and math in grades 3–8 and in high school. Science is tested in one elementary grade, one middle grade, and one high school grade. In addition to the content assessments, English language learner students are required to take the Massachusetts English Proficiency Assessment (MEPA).

Since 2001 the state has had a centralized system, the Student Information Management System (SIMS), to collect student background information. Each student has a unique identification number that allows the student's SIMS and MCAS data to be tracked. At the time of the study, the state was in the first year of implementing the Education Personnel Information Management System, which gives each educator in the state an identification number.
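The unique identifier is what makes such tracking possible: separate collections can be joined on the student's ID. The sketch below uses hypothetical record layouts (the report does not describe the actual SIMS or MCAS schemas) to show the kind of linkage a data warehouse performs at scale.

```python
# Hypothetical SIMS enrollment records keyed by a state-assigned student ID.
sims = {
    "1001": {"grade": 5, "district": "Northampton"},
    "1002": {"grade": 5, "district": "Amherst"},
}

# Hypothetical MCAS results carrying the same identifier.
mcas = [
    {"student_id": "1001", "year": 2007, "math_scaled": 244},
    {"student_id": "1001", "year": 2008, "math_scaled": 252},
    {"student_id": "1002", "year": 2008, "math_scaled": 238},
]

# Join assessment results to enrollment data on the shared identifier.
for result in mcas:
    student = sims.get(result["student_id"])
    if student is not None:
        print(result["student_id"], student["district"],
              result["year"], result["math_scaled"])
```

Linking on a stable identifier rather than on names is also what lets the same student's scores be followed across years, the longitudinal use the warehouse is meant to support.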

Overview of the initiative. A few years ago, senior personnel at the Massachusetts Department of Elementary and Secondary Education, concerned about increasing demands for using data in decisionmaking, noted that the department collected and used data inconsistently and that data collected from the districts were not going back to the districts effectively. In 2005 the department obtained internal funding to pilot the Educational Data Warehouse. The state invited districts to apply to the two-year pilot. At about the time when department officials were considering the data warehouse, district officials in the Northampton School District were searching for a way to better deal with the data they were collecting and with its various locations. In 2005 a Northampton administrator met the state department official in charge of the Educational Data Warehouse initiative and shared the district's vision for data collection and analysis. The district was awarded an Educational Data Warehouse grant. It and six other school districts became the first cohort to pilot the initiative.

During the first six months of the pilot the districts and the department selected Cognos as the vendor for the data warehouse. Department officials met with representatives from Cognos to examine its data model and to discuss changes required to meet state and district needs. In 2006 Massachusetts bought a statewide license for the warehouse.

Once the warehouse was ready, the state department used it to make five years of its SIMS and MCAS data available to the pilot districts. Two districts with data management staff uploaded district-level data, such as students' grades and courses. Many months were spent learning how to use the warehouse—running the reports, identifying which local data could be uploaded, and learning how to upload them.

Professional development has been integral to the project, explains one district official: "Assuming that you can get your data loaded and you have access to data, what are you going to do with the data? . . . That is by far the most important and difficult question to answer and the most work" (interview, December 3, 2007). During the first two years of the grant, administrative personnel from the districts received professional development from the Connecticut RESC Alliance (see appendix E) on using data for decisionmaking.

The department got funding to expand the pilot for two additional two-year periods. May 2008 marked the end of the initial phase of full rollout, with 76 districts having access to the warehouse and the department finalizing details of its expansion to more districts. Although use of the data warehouse and tools will not be mandatory, department officials expect that districts will take advantage of them. Though initiated and managed by the Massachusetts Department of Elementary and Secondary Education, officials from the seven school districts in the first pilot cohort—especially those from Amherst and Northampton—have been important in shaping the initiative. Its main goals are twofold: to provide a platform to house data collected by the state, districts, and schools, making those data accessible in decisionmaking, and to provide software that allows state and district users to analyze their data. Different stakeholders will use the warehouse differently. For example, department staff will use it for reporting purposes, school personnel will look at data longitudinally, and teachers will access their students' records from previous years to see their progress.

At the time of the study, the warehouse contained state-maintained data, such as SIMS, MCAS, and MEPA data, and the state was uploading Education Personnel Information Management System data. The tool can incorporate other data, explains one department official: "We would love to get our finance data in there, [along with] . . . licensure data, academic program data, and professional development information. The types of things we could include and ultimately link are endless" (interview, October 23, 2007). Districts will also be able to load their own data, such as data on courses, grades, student attendance, or discipline. State department officials will notify districts when reviewing the districts' uploaded data.

It is expected, once the data warehouse is implemented statewide, that people at different levels—policymakers, district officials, principals, school data teams, and teachers—will be able to access state data and their local data and to run reports tailored to their needs.

The state education department bought a statewide license for the virtual data warehouse that will be available free to every school district in the state. Although the state is planning to provide directions to districts about the warehouse, districts will have to budget to participate in the project and build district capacity around the warehouse.

Several personnel supports for the warehouse were developed through pilot testing of the initiative. Three full-time state department staff members and two contractors work exclusively on the warehouse. In addition, weekly conference calls between department staff and districts in the first two pilot cohorts allow discussion of implementation issues. District and pilot department user groups gather feedback from end users and share best practices with them. Finally, a steering committee of department and district personnel meets to prioritize enhancement work for the data warehouse team. It grew out of a previous steering committee, drawn from the pilot districts, that met several times over two years to make decisions about implementation.

Challenges to implementation. The warehouse has limited funding and will not be mandated once it is launched statewide. The pilots have shown that the initiative takes a good deal of districts' time and resources to implement. One district official noted that, although the district received funding, it was a challenge to find "the time to clean the data, or the time to load the data, or the time to do professional development" (interview, December 3, 2007).

Once the initiative is launched statewide, the state department's challenge will be to manage the expectations of the school districts. Districts must view the warehouse as a work in progress—for which cleaning and uploading data take time and resources—and learn how the uploaded data can be used for decisionmaking. Still, the state official in charge of the warehouse suggested that its advantages outweigh the challenges: "I would rather see a model such as [a statewide Educational Data Warehouse] as opposed to a hundred different data warehouses being purchased by only the districts that can afford them" (interview, October 23, 2007).

Additional resources. These include the Massachusetts Department of Elementary and Secondary Education, Educational Data Warehouse web page at www.doe.mass.edu/infoservices/dw/.

New Hampshire

The New Hampshire Department of Education has made using data to improve instruction and student outcomes a priority, resulting in a comprehensive approach to data-driven decisionmaking that includes tools for data storage and analysis, curriculum alignment, and the development of local assessments. The focus of its initiatives is on understanding individual student learning needs.

State context. New Hampshire, a primarily rural state, has a large number of small schools and single-school districts. Districts are organized into school administrative units, each under a single superintendent. New Hampshire's emphasis on local control has resulted in a recent education accountability law specifying that the state will not take over schools that fail to make adequate yearly progress.

New Hampshire established its Statewide Education Improvement and Assessment program (RSA 193-C) in 1993, long before the NCLB Act, to set standards and require education improvement and assessment plans. Recent legislation (RSA 193-H) sought to establish a single state accountability system for schools, coordinated with the NCLB Act. This system seeks both to meet NCLB requirements and to go beyond them by including all public schools, both Title I and non–Title I.

New Hampshire has joined with Rhode Island and Vermont to develop the New England Common Assessment Program (NECAP), an annual assessment developed to comply with NCLB legislation. NECAP has been administered each fall in grades 3–8 since 2005. In fall 2007 it was given in grade 11. New Hampshire educators are also constructing course competencies at the secondary level that link content to skills and use multiple measures to evaluate student proficiency.

Since 2005 the state has emphasized educating the whole child. Its Follow the Child initiative fosters a personalized learning experience for each student and challenges districts to measure growth in and develop support systems for the personal, social, physical, and academic aspects of each student's life. Participation by districts, schools, and individual educators is voluntary.

Overview of the initiative. Data collection and use have been the focus of New Hampshire's approach to school improvement for the past five years. According to the leader of data use initiatives within the New Hampshire Department of Education, about three years ago the department recognized that its methods for data collection and data dissemination to districts did not make good analysis likely at the district and school level. For example, districts were unable to compare year-to-year assessment results or to calculate graduation rates meaningfully.

These deficiencies led New Hampshire to launch a coordinated statewide effort in 2005 to collect data that would allow for meaningful data analyses by districts and schools and to train users in data-driven decisionmaking. These efforts were aided by funding from the U.S. Department of Education for data collection and decisionmaking through a statewide longitudinal data system grant of $3.2 million over three years. This grant is being used to improve data warehouse and analysis tools, to fund the Follow the Child initiative (explained below), and to collaboratively develop a database to connect school and postsecondary data on students and teachers.

New Hampshire exceeds NCLB requirements through a joint initiative of the governor and the commissioner of education called Follow the Child, which brings together several data collection and use programs to track each student's progress toward (at least) proficiency on local and state assessments. Participation in Follow the Child is voluntary for districts (called school administrative units in New Hampshire), approximately half of which were participating at the time of the study.

New Hampshire is committed to the use of data to guide school improvement. A New Hampshire Department of Education publication states:

All of us must use school performance data to continuously improve the education system in our state. Making this data available to the public is essential. Providing every New Hampshire child with a quality education is a result of assessing and understanding the situation, working together in our communities, in our classrooms, and in our own homes to make a difference. (New Hampshire Department of Education 2006)

The department's vision for data-driven decisionmaking in the state is to enable districts to analyze assessment data and other data in various queries, to support district efforts to develop and analyze their own local assessments, and to link these data to curriculum and lesson planning tied to state standards.

New Hampshire's Initiative for School Empowerment and Excellence system gives a unique identification number to each student in New Hampshire, letting districts easily retrieve data for all their students. The U.S. Department of Education longitudinal grant will allow the state to expand the database to include postsecondary students.

Statewide data collection efforts have been led by state education agency personnel. Professional development for district and classroom data use, led by New Hampshire's five regional professional development consortiums and centers, offers basic and advanced data analysis training for schools and districts.

New Hampshire collects a wide range of data with its statewide data warehouse and analysis tool, Performance Tracker. All school administrative units have access to Performance Tracker. It has three modules: Performance Tracker stores assessment and other data and facilitates analysis through various queries, Assessment Builder allows districts to develop customized assessments, and Content Library facilitates linking assessments to curriculum and lesson plans. All the administrative units have free access to the first module. About 15 percent of the units (including Manchester and Nashua, the largest in the state) have paid for access to the other two, and many more units have expressed interest in accessing them. Funding such access through the federal longitudinal data grant has been discussed by the state department.

State department staff reported that one strength of Performance Tracker is the range of data it can store and analyze. New Hampshire's data-driven decisionmaking efforts have focused on making a wide range of data available for analyses. Types of data that districts can put into Performance Tracker include scores on NECAP; the Northwest Evaluation Association assessments (administered by about half of New Hampshire's school administrative units, these measure student instruction level and growth, to help teachers tailor instruction to individual needs); the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) test; and the Developmental Reading Assessment, along with data on attendance and academic, social, personal, and physical development from the Follow the Child initiative.

The New Hampshire Department of Education has two types of infrastructure to support data-driven decisionmaking: its data warehouse facilitating data storage, retrieval, and analysis—described above—and policies that promote data-driven decisionmaking in schools and districts. The policies focus on tying funding and adequate yearly progress reporting to data collection and analysis—to make schools and districts aware of the importance the state places on data use and to give them an incentive to analyze their data. Recognizing the extra effort and training commitments districts must make to learn data-driven decisionmaking, department staff emphasize that their role is to provide customer service to help districts progress toward effective data use. The department answers immediate, individual questions from districts and schools on data collection and provides technical help with the data warehouse.

In addition, New Hampshire's five professional development consortiums and centers provide training in data-driven decisionmaking. The training is voluntary and funded by the districts. Thus, the consortiums and centers are motivated to cater training to district needs.

Challenges to implementation. New Hampshire has a strong tradition of local control. This allows schools to adapt to local circumstances, but it also presents challenges to statewide education initiatives. For data-driven decisionmaking it means that school administrative units use different assessments—NECAP is the statewide assessment developed in response to NCLB legislation, whereas other assessments, such as the Northwest Evaluation Association's, are used voluntarily by local educators. Similarly, professional development is decentralized, differing by area. With the state lacking personnel to provide professional development on data use, much of the state's formal training in data-driven decisionmaking is provided at the local level. Although the state department has provided supports for data-driven decisionmaking, the strong tradition of local control makes it difficult to determine how much data-driven decisionmaking is actually being implemented in school administrative units and schools.

Additional resources. These include the New Hampshire Department of Education at www.ed.state.nh.us/education/ and Performance Pathways at www.perfpathways.com.

New York

New York has a strong history of using data-driven decisionmaking to inform education decisions at different levels of the system. Although the New York State Education Department has no formal statewide data-driven decisionmaking policy for districts and schools, it encourages such decisionmaking in its support of low-performing schools and through state-sponsored and other agency programs. Those programs include New York's statewide data reporting system, nySTART, and the services provided by the Board of Cooperative Educational Services and the New York Data Analysis Technical Assistance Group.

State context. New York contains some of the most densely populated areas in the Northeast and Islands Region and includes a wide range of urban, suburban, and rural schools serving nearly 2.8 million students (U.S. Department of Education, National Center for Education Statistics 2007). The five largest districts—New York City, Buffalo, Rochester, Syracuse, and Yonkers—make up 42 percent of the state's public school population. New York has a very diverse population of students. A little over half (52.7 percent) are White, and most of the rest are Hispanic (20.1 percent) or Black (19.8 percent). Most (68.2 percent) of the schools in the state are Title I, and almost half (44.8 percent) of the students are eligible for free or reduced-price lunch.

Overview of the initiative. The Elementary, Middle, Secondary, and Continuing Education department of the New York State Department of Education established the Office of School Improvement and Community Services (OSICS) in 2003. The OSICS is charged with designing and implementing the state's system of support for low-performing schools. Its overarching goal is to close the achievement gap in English language arts and mathematics for all students, including NCLB subgroups.

In 1998 New York became one of the first states to have a system of accountability. With the accountability plan came the School Under Registration Review process, which identifies schools performing furthest from state standards—based on state assessment scores—and requires that they create comprehensive improvement plans. The state education department has strong expectations that districts and schools include data-driven strategies in their improvement plans.

The seven regional school support centers are the primary means through which support is provided to New York's high-priority districts and schools. Established in 2003 along with the OSICS and funded by the state education department, the centers provide technical assistance and support to schools and priority districts, and they coordinate the implementation of the Reading First initiative and the reauthorization of the Individuals with Disabilities Education Improvement Act (IDEA of 2004).9 One center serves New York City, and the remaining six are spread throughout the state. The centers work with four other department-supported regional network partners to deliver services to schools and districts.

Since the inception of the OSICS and the regional centers, low-performing schools have received substantial professional development focused on data-driven decisionmaking. Low-performing districts and schools, which must include data-driven strategies in their improvement plans, are encouraged to participate in data-driven decisionmaking professional development. When developing the plans, the districts and schools are expected to use data from multiple assessments, which can include standardized assessments—with data that can be disaggregated by student subgroups—and benchmark assessments demonstrating student progress toward meeting learning standards. Other assessments must be used to measure the impact of curriculum and instruction delivery in core academic areas.

These data can help schools revise their education priorities. Once priorities are established, they become the core of each school's comprehensive education plan, including specific annual goals, measurable objectives, and action plans for observable and effective strategies to improve student achievement.

One approach used to provide data-driven decisionmaking professional development to low-performing schools in New York is through the reading, mathematics, science, English as a second language, and special leadership institutes. Held several times a year, these institutes are offered by the OSICS with the New York City Department of Education, Fordham University Graduate School of Education, and the New York Comprehensive Center. Representatives from low-performing schools are required to attend the institutes, where they participate in workshops led by researchers, professors, consultants, professional developers, and other educators on integrating research-based strategies and practices into teaching and learning. Many workshops teach participants to use data for decisionmaking in particular content areas. For example, a workshop at a December 2007 institute addressed the alignment between assessments (formative and summative alike) and learning standards for English language arts and the use of assessment information to drive instruction. Another session taught participants how to identify student weaknesses in mathematics by analyzing state assessment items and to use the information to derive instruction strategies targeting such weaknesses.

Another approach to professional development in data-driven decisionmaking for low-performing schools is through the New York State Teacher Centers. The OSICS placed these on-site professional development centers in low-performing schools, where they help teachers to use data. The state has paid for mathematics and literacy tutors for each school without a Teacher Center, to help teachers examine data more closely and use it to inform instruction.

The professional development in data-driven decisionmaking provided to New York's districts and schools over the past decade or so has contributed to changes in the cultures of schools, shifting them toward knowing their data and being motivated to engage frequently with those data. Not only have teachers become better consumers of data but the state department's associate commissioner finds that leaders in districts and schools are better informed about their data:

When you go into our lowest performing schools . . . particularly in elementary schools, you will find that principals can talk to you about data. To me, this shows real change. . . . Before, you would have to find a classroom teacher to learn those things, if you were lucky. (Interview, January 30, 2008)

Data-driven decisionmaking by district and school leaders is supported and encouraged in New York through avenues other than the initiative for improving low-performing schools. One data reporting and analysis tool is available to all school leaders, and two types of networks or groups encourage and support districts and schools in its use.

Through the statewide data reporting system, nySTART, the state education department supports the management and reporting of state assessment data. Formerly housed in 12 regional information centers, such data are now stored in the New York State Testing and Accountability Reporting Tool, or nySTART, a web site that gives school leaders (including district superintendents, principals, and teachers) access to reports on New York's standardized test scores.10 The change reflected a desire to have a standardized online tool for the regional information centers and Boards of Cooperative Educational Services (see below) to work with state assessment data. Run by the Grow Network and McGraw-Hill, nySTART is funded by the state department of education. The nySTART web site contains detailed reports on test results for assessments, including the New York State Testing Program, New York State Alternative Assessment, the New York State English as a Second Language Achievement Test, and the New York State Regents Examinations. District and school administrators, as well as teachers, are expected to use the data reports to inform education decisions.
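
To make concrete the kind of disaggregated analysis these reports support, the following minimal sketch computes the share of students reaching proficiency in each subgroup. It is purely illustrative: the record layout, scores, and performance-level cutoffs are invented, and nySTART itself is a commercial web tool whose internals are not described in this report.

    from collections import defaultdict

    # Each record: (student_id, subgroup, scale_score) -- all values invented.
    results = [
        ("s01", "English learners", 612),
        ("s02", "English learners", 688),
        ("s03", "Students with disabilities", 645),
        ("s04", "All students", 701),
    ]

    def performance_level(scale_score):
        # Hypothetical cutoffs; real assessment programs publish their own.
        if scale_score >= 700:
            return 4
        if scale_score >= 650:
            return 3
        if scale_score >= 600:
            return 2
        return 1

    def percent_proficient_by_subgroup(records):
        counts = defaultdict(lambda: [0, 0])  # subgroup -> [proficient, tested]
        for _student_id, subgroup, score in records:
            tally = counts[subgroup]
            tally[1] += 1
            if performance_level(score) >= 3:  # levels 3 and 4 count as proficient
                tally[0] += 1
        return {group: round(100.0 * p / t, 1) for group, (p, t) in counts.items()}

    print(percent_proficient_by_subgroup(results))
    # {'English learners': 50.0, 'Students with disabilities': 0.0, 'All students': 100.0}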

The nySTART system provides verification, assessment, and accountability reports. Verification reports let school district officials review and verify the accuracy and completeness of data. Assessment reports provide summary- and individual-level performance results in PDF for English language arts and mathematics assessments in grades 3–8 and for the New York State Alternative Assessment. The data can be analyzed in a variety of ways. For example, teachers and administrators can view and sort performance data by district, school, grade, and student. They can analyze aggregate performance and examine data by performance level, content strands and standards, and test item. Data can be summarized for subgroups (disability status, race/ethnicity, English proficiency, and so on). Last, the accountability reports provide information about school status under the state and federal accountability systems, including school profile information, school accountability status, and performance.

New York's regional school support centers work with various types of partners that are not department-funded to provide professional development in data-driven decisionmaking. One such type is the Board of Cooperative Educational Services (BOCES), regional networks that provide services including student supports, professional development, technical assistance, and technology supports that individual districts cannot provide. The state's 37 boards are funded by the state and by districts, whose superintendents govern and determine services. BOCES provides data-driven decisionmaking services based on regional and district need. For example, in districts with the capacity to manage, interpret, and gather information from data to inform instruction, the boards may ease and enhance those tasks. Other districts lacking such capacity may receive professional development and technical assistance. Housed in the boards are the 12 statewide regional information centers, which provide participating districts with technology services for instruction and administration as well as training and support for teachers, students, and administrators. The centers store and process all the data required for the state and provide customized professional development on data management and analysis to districts.

The New York Data Analysis Technical Assistance Group, developed in 2000 at a Comprehensive District Education Planning conference and funded by the state's education department, grew out of a shared interest in creating data systems that would help teachers and other educators, rather than only those with technical expertise. The group is a research-based learning community, open to education professionals involved in supporting data use to improve instruction around the state. Members come from organizations including school districts, BOCES, special education training and resource centers, regional school support centers, regional information centers, the state education department, and the private sector. Members may be involved in a variety of data-driven activities, such as building data warehouses in the information centers or providing test scoring and data analysis to districts. The goal of the group is to "provide a leadership role in identifying, cataloging, and modeling best practices in the analysis and use of data for New York schools" (New York Data Analysis Technical Assistance Group 2008). The group holds meetings four times a year, with biannual conferences to discuss ways of building its capacity to provide data-driven decisionmaking services and supports. It also has an email list that is free to members, where members post questions and answers related to data-driven decisionmaking.

Next steps. The New York State Education Department has long understood the need for consistent data-driven decisionmaking in districts and schools. It continues to build on its data-driven decisionmaking initiatives, for example, by planning a new initiative to create an integrated data system for PreK through higher education. The new initiative was called for by the New York State Board of Regents in fall 2006. Its action plan calls for the department to create the new data system with the goal of using data to improve graduation rates in high school and in higher education (New York State Board of Regents 2006).

Additional resources. These include:

• The New York State Education Department School Comprehensive Educational Plan at http://schools.nyc.gov/NR/rdonlyres/75097FD5-478D-4129-AE78-EF0442C60B3D/32663/RequiredCEPAppendicesfor20072008ReleasedMay22007.doc and the District Comprehensive Educational Plan at http://schools.nyc.gov/NR/rdonlyres/138A84DB-3E16-418F-AF7C-A90B266FFAFE/32645/DCEPTemplate200708v807.doc.

• The Board of Cooperative Educational Services web site at www.monroe2boces.org/statewide.cfm.

• The Data Analysis and Technical Assistance Group web site at www.datag.org.

• The nySTART web site at www.nystart.gov.

• Introduction to nySTART (PowerPoint) at http://emsc32.nysed.gov/irts/nystart/PD-0713-Intro-to-nySTART.ppt.

• P-16 Education: A Plan for Action at http://usny.nysed.gov/summit/p-16ed.pdf.

Rhode Island

The Rhode Island Department of Elementary and Secondary Education is creating a central data warehouse. Housing all education-related data, including state assessments, student information, teacher information, and financial information related to schools, the warehouse will allow users to access data with up-to-date graphs and figures as well as user-designed reports. It will be available in different ways to users at all levels of the system: parents, teachers, administrators, education department employees, and the state legislature. The state is focusing professional development on helping users in all school districts understand the new data warehouse.

State context. Rhode Island is the smallest state in area. It is highly urban and densely populated, with a public school student population of 152,422. Most students are White (70.5 percent), with Hispanics (17.3 percent) the largest minority group. Close to half (40.9 percent) of the state's schools are Title I, and just over a third (34.9 percent) of the student population is eligible for free or reduced-price lunch.

In 2006 the state began to build a data warehouse for all education data. Formerly in several locations, data would now be available to users through one consolidated source (with seven components). Users can obtain up-to-date data in a user-friendly form and can generate reports based on the data.

Overview of the initiative. Rhode Island's work on a data warehouse began with district officials looking for a way to analyze data and use that data to inform decisions on education. As districts and the state education department began looking into systems, they realized that they were missing a foundation: a statewide student identification system. After such a system was created in March 2004, a two-year project to develop the warehouse was funded through a statewide bond referendum. In June 2006 two companies, ESP Solutions and TetraData Corporation, were chosen to create the warehouse.

The goal of the data warehouse is to let educators base their decisions on real assessment and student data rather than hunches. Education leaders want officials to "be able to back up actual decisions with actual data . . . [and] then to monitor [a] decision along the way to see if it is . . . paying off" (interview, December 3, 2007). The warehouse is designed for use by individuals at all levels of the state education system, from district officials to building administrators to classroom teachers.

The data warehouse initiative is run and supported by many state education department offices. The Office of Network and Information Services, assigned to build and maintain the warehouse, now coordinates the work of the outside service providers selected to create it.

The office answers questions from the external vendors, provides the data to them, and helps to coordinate and support training as components are rolled out.

Office of Assessment and Accountability staff are working with the warehouse project to verify the accuracy of data uploaded into the system. In addition, this office provides workshops throughout the year to help users interpret assessment data. The workshops are timed to coincide with the release of the data from the state. Specialized workshops also help districts identified as in need of improvement.

The data warehouse has seven components (some not funded under the current project):

• A consolidated educational information system portal.

• A data analyzer application.

• DASH, a front-end vision alignment tool for teachers and principals.

• New data collection software.

• Data collection using the Schools Interoperability Framework technology.

• A master directory.

• A graphic interface system.

The warehouse is being introduced gradually, with components launched at different times over the two-year period. Each component is piloted with smaller groups before being introduced to all the districts.

The first component, the consolidated educational information system portal, was released in May 2008. The master portal for the warehouse, it will give all users a single logon to access all the programs at their access level. The second component, the data analyzer application, also is online and was introduced to all districts in March 2007. The third component, DASH, released in fall 2007, is an interface with graphs and gauges on various data such as today's attendance rates. The graphics are based on a car's tachometer, with information color coded to flag problem areas (red) and areas of emerging trouble (yellow).

Additional formally funded warehouse components include new data collection software (released in October 2007) and a component to automate data collection in real time (initially released to the Providence school district in October 2007). The Schools Interoperability Framework (SIF) technology used for the data collection is designed to help all system directories speak the same language, allowing a free flow of data within the system. Explained one department of education official:

Let's say I am a district and I have a different computer system to track transportation. I have a different computer system for the lunch program. I have a different computer system for the library system and a different one for the student information system. None of these systems talk to each other. So, the idea is, let's say, the student information system is speaking French and the library system is speaking Spanish, SIF translates everything into English and then it's kind of like the gateway for communication. So, when a student enrolls into the student information system, a packet of information will be automatically sent to the library system saying this is a new student, enroll this student into the library. So, the idea again is to save time, data entry, and that type of thing. (Interview, December 3, 2007)
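
The publish-and-subscribe idea behind this description can be sketched in a few lines of Python. The sketch is illustrative only: the real SIF specification defines XML message schemas and a zone integration server, and the system names and payload below are invented.

    from collections import defaultdict

    class Gateway:
        """Routes event messages from a publishing system to every subscriber."""
        def __init__(self):
            self.subscribers = defaultdict(list)  # event name -> list of handlers

        def subscribe(self, event, handler):
            self.subscribers[event].append(handler)

        def publish(self, event, payload):
            # Deliver one message to all systems that asked to hear about it.
            for handler in self.subscribers[event]:
                handler(payload)

    gateway = Gateway()

    # A hypothetical library system listens for enrollments coming from the
    # student information system and creates its own record automatically.
    def enroll_in_library(student):
        print(f"Library: created account for {student['name']}")

    gateway.subscribe("StudentEnrolled", enroll_in_library)

    # One entry in the student information system now propagates to the
    # library system with no duplicate data entry.
    gateway.publish("StudentEnrolled", {"id": "s0042", "name": "Pat Rivera"})

The "translation" step the official describes would sit inside publish, converting each vendor's format before delivery; it is omitted here for brevity.
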
Two additional components connected to the data warehouse were not funded under the initiative at the time of the study. The first, a master directory released in July 2007, stores directory information about districts and schools. The second is a graphic interface system that would display all system information graphically. Coordinated by the SIF technology, the interface system would automatically update access to all warehouse systems. Its release date was still to be determined at the time of the study.
Data in the warehouse will include student information, such as enrollment, discipline, special education, and English proficiency data; statewide and local assessment data; financial information; and teacher certification information, with other information about teachers that can be linked to their classes and students.

Many goals are envisioned for the warehouse. The underlying one is to give educators the data they need to make the best decisions for their students, looking at programs and examining which have had better results. The warehouse will allow educators to examine treatment and effect—always an important goal and one that is especially crucial now, during a state of fiscal crisis. Tight resources make it critical that educators make informed program decisions; good data can help them.

The state education department has taken the lead in creating the warehouse. With the help of ESP Solutions and TetraData Corporation, it is creating a system that can be used by anyone with an interest in education. The primary focus has been on establishing the system and orienting users to components as they are introduced. Initially, training was done by the contractors creating the system, with emphasis on learning to use each new application. Current use and training were limited at the time of the study; however, the system is designed for many different people. Although access to and permitted uses of the data will vary based on a user's position, everyone will have appropriate access to the data through a variety of available tools.

Other state supports for data use. The Rhode Island Department of Elementary and Secondary Education created the School Accountability for Learning and Teaching (SALT) survey and process in 1998. The SALT process is a school-centered cycle of inquiry designed to improve Rhode Island school and student performance. The survey is administered by the National Center on Public Education and Social Policy at the University of Rhode Island. The university provides reports for every public school in the state and offers materials and technical assistance to help schools use the survey data to improve student achievement.

Challenges to implementation. The state has encountered numerous challenges—from capacity to time to funding—as it works to implement the warehouse. Funding to support training for users at different levels is limited. Also limited are the personnel who support both the creation of the warehouse and its ongoing maintenance (training, checking data, and so forth). Windows of time when all the right individuals can attend an appropriate training session have been very limited, the system being designed for users with different levels of exposure to data use. In addition, the project faces a challenge in coordinating its technical and analytic components: people with a technical understanding of the system are not always familiar with the analyses that will be run with the information. In contrast, the users of the system are not always familiar with the system's technology but do know what kinds of analyses they want to conduct. This could lead to a technologically sophisticated system that does not meet users' analytical needs.

Additional resources. These include:

• The Rhode Island Department of Elementary and Secondary Education Office of Network and Information Services data warehouse web site at www.ride.ri.gov/onis/DW/DataWarehouse.aspx.

• The SALT survey web site at www.ride.ri.gov/PSI/salt/default.aspx and the "SALT Survey Databook Overview" at www.nksd.net/SALT/2008/overview.pdf/.

• The University of Rhode Island National Center on Public Education and Social Policy "HiPlaces School Improvement Planning and Monitoring Process: Instruction Matrix" at www.ncpe.uri.edu/research/matrix-instruction.asp.
Vermont

Vermont has a history of collecting and using data that predates the NCLB Act. The Vermont Department of Education has focused its data work on the human side of data—increasing the capacity of schools to collaboratively examine data and make instructional decisions. The state has a data warehouse and provides professional development on data analysis, contracting this work to partner organizations.

State context. Vermont has the highest percentage of rural residents of any state. Schools are decentralized, and the state has many small schools and single-school districts. Districts are organized into supervisory unions, each under a single superintendent. Vermont's emphasis on local control has resulted in a recent education accountability law (Act 60, the Equal Educational Opportunity Act) specifying that the state will not take over schools that fail to make adequate yearly progress. However, the state education commissioner determines whether districts are making progress in improving student performance and holds those not making sufficient progress accountable for using data and reporting progress.

Vermont has joined with New Hampshire and Rhode Island to develop the New England Common Assessment Program (NECAP), administered each fall since 2005 in grades 3–8 and, in fall 2007 for the first time, in grade 11.

Overview of the initiative. The Vermont Department of Education sees two components in data use: the technical side, or the data and its analysis, and the human side, or the importance of teams combining different perspectives to learn from the data. This dual emphasis is reflected in requirements that principals (as of 2007) and teachers (partly in 2008, fully as of 2009) participate in learning communities focused on using data to improve achievement. This requirement is a unique response to the issue of schools failing to make adequate yearly progress.

Because Vermont's history with data use predates the NCLB Act, it has traditionally focused on using data to improve instruction rather than to meet accountability requirements. The state assessment coordinator described Vermont's vision for the use of data in Vermont's classrooms: "I think teachers ought to be able to tell you at any moment in time where each kid's at" (interview, October 26, 2007).

An initiative launched in the 2007/08 school year requires that Vermont schools identified as in need of improvement provide data-based evidence on the effect of their school improvement efforts. The state education commissioner's Required Actions (16 Vermont State Law 165[b]) codify some of the responsibilities for low-performing schools and make these uniform statewide. All identified low-performing schools must:

• Identify measures to track student progress, with particular focus on the groups and content areas for which the school is identified (this triggers required implementation and monitoring of local assessment systems).

• Develop a continuum of support for student learning.

• Report to the state at the middle and end of the academic year, using data to show progress.

• Have principals participate in principal learning communities.

• Establish one teacher learning community during 2008/09.

• Have all teachers in teacher learning communities beginning in 2009/10.

According to the state assessment coordinator, these requirements are to ensure that school leaders can show the state education commissioner that "they know whether kids are meeting standards. There are lots of ways to do this; we just want to know you're doing something. The Commissioner is saying, 'Tell me how you know where your kids are.'"

Through this directive, the commissioner ensures that identified schools use data to guide their school improvement efforts.

In addition to this initiative focused on low-performing schools, Vermont is also working to expand the functions and uses of the state's data warehouse so that more districts can access it and better use data. The warehouse restructured its pricing to let smaller districts, with fewer resources, afford the services.

Schools are required to submit reports directly to the state education commissioner. The assessment coordinator manages a range of data and assessment projects within the department. Three school support coordinators within the department lead the initiative for low-performing schools. The Vermont Data Consortium, a nonprofit group that manages the data warehouse, has close ties to the department; one department employee worked with the consortium to develop queries to analyze data.

In Vermont, data-driven decisionmaking is supported by the data warehouse, which stores data and facilitates data analysis, and by policies that provide incentives and accountability for data use. The Vermont Department of Education focuses more of its resources and personnel on supporting low-performing schools than on supporting data use by all schools, which is handled by partner organizations such as the Vermont Data Consortium and educational service agencies.

The Vermont Data Warehouse is a data storage and retrieval system, using software from TetraData, that allows participating districts to access state assessment data (reported through spreadsheets) and generate queries to analyze those data. According to the consortium, the goal of the warehouse is to "ensure that Vermont supervisory unions/districts have cost-effective, timely, and accurate student and educational data available to address continuous school improvement as well as state and federal accountability and reporting requirements" (Vermont Data Consortium 2008). The consortium has worked with the department to create a Common District Model that includes data on students, teachers, courses, class sections, student schedules, course grades, and local assessments. The model will eventually expand to contain data on attendance, discipline, programs, and interventions; standards-based report cards; and more. In addition, the data warehouse software can use data stored in any student information system to generate reports. Data from NECAP, the Vermont Developmental Reading Assessment, and local assessments are also housed in the warehouse. Data are collected by schools, districts, and supervisory unions. About half of Vermont districts were participating in the warehouse at the time of the study.

The data warehouse allows teachers (at the classroom level), administrators (at the school level), and central office personnel (at the district level) to query the data through different filters. One can analyze how students performed on assessments, how different subgroups performed (in Vermont the most critical achievement gap is between students of low and higher socioeconomic status), and how they compare with other students in the state.

The state department of education provides technical assistance and, due to its small size, limited professional development in data use. The six educational service agencies (Vermont's regional education service providers) can provide more extensive professional development, including in data-driven decisionmaking; however, given the agencies' decentralized model, the department has little control over the content and quality of the professional development they provide.

For schools falling under the Required Actions legislation, support is available from principal learning communities in each of the state's four regions, as well as through technical assistance on data use. Unique to Vermont's accountability efforts, such communities bring together principals of schools failing to make adequate yearly progress to share perspectives and reflect on school improvement efforts.

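The scoped querying described above (teachers at the classroom level, administrators at the school level, central office personnel at the district level) can be illustrated with a short sketch. All field names, roles, and scores below are invented for the example; the actual warehouse is a commercial TetraData product.

    # Toy warehouse rows; real records would carry many more fields.
    records = [
        {"district": "D1", "school": "Elm", "classroom": "3A", "score": 48},
        {"district": "D1", "school": "Elm", "classroom": "3B", "score": 61},
        {"district": "D1", "school": "Oak", "classroom": "4A", "score": 55},
    ]

    def visible_rows(rows, role, unit):
        # Each role filters on a different column: teachers see one classroom,
        # principals one school, central office staff a whole district.
        column = {"teacher": "classroom", "principal": "school", "district": "district"}[role]
        return [row for row in rows if row[column] == unit]

    def mean_score(rows):
        return sum(row["score"] for row in rows) / len(rows) if rows else None

    print(mean_score(visible_rows(records, "teacher", "3A")))     # 48.0
    print(mean_score(visible_rows(records, "principal", "Elm")))  # 54.5
    print(mean_score(visible_rows(records, "district", "D1")))    # 54.66...
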
Challenges to implementation. Vermont's history of using data, including its focus on assessment in the decade before the NCLB Act, has created challenges to adding an accountability component to its work on data-driven decisionmaking. As the assessment coordinator said, "The notion of failing schools came later and unwillingly to us." Vermont has opted for the least punitive treatment for schools failing to meet adequate yearly progress requirements, focusing on support rather than sanctions. Other means, such as the Required Actions legislation, address accountability.

An additional challenge is that not all districts are members of the Vermont Data Consortium, which offers the most powerful tool available for data-driven decisionmaking in the state.

Next steps. The state department of education is providing increasing support and guidance for schools under corrective action. Besides the requirements detailed above, the department has been adding a requirement, to be phased in by 2010, that all faculty in schools identified as low performing participate in teacher learning communities. These communities will use data to develop strategies for improving student achievement.

In addition, the department and the Vermont Data Consortium are exploring ways to make the data warehouse more accessible to more schools and districts through a new fee structure, with lower costs for small districts.

Additional resources. These include the Vermont Data Consortium web site at http://vermontdata.org.

The Virgin Islands

The Virgin Islands Department of Education is in the early stages of crafting a policy to support data-driven decisionmaking in its districts and schools. The Virgin Islands Territorial Assessment of Learning (VITAL) has been in place since 2004/05, following a five-year moratorium on standardized testing. After territorywide testing resumed with VITAL in 2004, the assessment was taken only by students in grades 5, 7, and 11 in each of three successive administrations. The department was only beginning to develop a longitudinal data system at the time of the study.

Context. The Virgin Islands, consisting of St. Croix, St. John, and St. Thomas, has a population of only 120,000 (primarily on St. Thomas and St. Croix). Although the population is mostly of African descent, St. Croix has a large Hispanic community (mainly of Puerto Rican descent), and St. Thomas has a significant community of Danish and French descent. In addition, there are U.S. mainlanders as well as people from the Dominican Republic, West Bank and Gaza, and many other locations. All public school students qualify for free or reduced-price lunch.

The education department oversees two school districts, with 34 schools and about 18,500 students. In 2003 all 34 schools were deemed at risk and ranked in the bottom third of all schools in the United States and territories. The National Assessment of Educational Progress results for 2000 show that the share of grade 4 students performing at or above the proficient level was 4 percent, compared with 28 percent nationally.

Overview of the initiative. The Virgin Islands Department of Education's director of planning and evaluation reported that, at the time of the study, it had only recently begun to create a longitudinal data system and was just starting to craft a data-driven decisionmaking initiative: "In the absence of formal policy, what we have been doing is using the NCLB [Act] as a motivator to encourage schools to use data" (interview, October 26, 2007). For over a year the department did not have a permanent confirmed commissioner of education.11 In the absence of a commissioner, the director of planning and evaluation was setting the direction for data-driven decisionmaking. One of the primary motivations for helping districts and schools use data was to help them understand why their schools were not making adequate yearly progress.

The education department contracted with the education information company Pearson School Systems to support it in developing and administering its assessment and longitudinal data system. Pearson was also contracted to hold workshops for principals twice a year—to teach them how to query data in the school administration and student information system database, how to enter data into spreadsheets, and how to pull out the types of information they need.

Data management staff from the research and evaluation office also meet with the schools and stress the importance of data, helping them understand the significance of the data for making adequate yearly progress. According to the director of planning and evaluation:

One of the things we have done is share with them where their challenges are, based on the data in regards to their school's performance. . . . We found . . . in our last report card that participation rate and attendance were two of the big areas why schools didn't make [adequate yearly progress]. We found in the [2006/07] school year it is no longer a significant factor for missing [adequate yearly progress]. Now they are working on addressing the proficiency achievement attainment of the subgroups. (Interview, October 26, 2007)

Next steps. The Virgin Islands has only recently begun using data for decisionmaking. At the time of the study its plans were evolving; it planned to incorporate a comprehensive data warehouse and longitudinal system that would include achievement data, student information, and staffing information. In the words of the director of planning and evaluation:

One of our plans is to incorporate a comprehensive data warehouse and longitudinal system [in which] data is gathered from the schools and uploaded to the state. . . . We will be able to not just look at student achievement data and see whether or not all the teachers are highly qualified but . . . also be able to electronically cross-check those data to other sources and facilities: what's happening nutritionally in the school lunch area, and the like. We are in the planning stage; we are developing a proposal to see what it will cost. We are moving forward with that. We are moving forward with trying to implement another level of data-driven decisionmaking. (Interview, October 26, 2007)

Additional resources. These include:

• The Virgin Islands Department of Education web site at www.doe.vi.

• The Pearson School Systems web site at www.pearsonschool.com.

Appendix E
Catalogue of service providers

This appendix provides a brief profile of nine Northeast and Islands Region service providers. It includes providers mentioned by state education agency respondents in interviews as the lead provider in the state; it is not an exhaustive catalogue of all providers working in the Northeast and Islands Region.

Cognos

Used by more than 23,000 organizations in industries ranging from retail to defense, Cognos business intelligence software is integrated performance management software from IBM that is relatively new to education. It is used to analyze and report on student performance data, as mandated by the NCLB Act. Rather than a simple repository for data, it integrates and aggregates data to analyze student performance along multiple dimensions. Parents, teachers, principals, and district and state administrators can access and manipulate data from the software according to their needs.

The Massachusetts Department of Elementary and Secondary Education has partnered with Cognos Corporation to create and manage a statewide data warehouse. Local districts are key to creating an accurate and consistent database; at the time of the study a pilot group of districts was participating in the data warehouse project. Project participants must complete training, which the department offers to end users, report authors, and data leaders. The "long-term goal," explains one official at the department of education, "is to provide every district and school with the ability to easily query and analyze its organization's state-maintained data [Student Information Management System, Massachusetts Comprehensive Assessment System, Massachusetts English Proficiency Assessment, and educator data], and to provide districts with the option to load and analyze their own data."

Connecticut Regional Educational Service Centers Alliance

The Connecticut Regional Educational Service Centers (RESC) Alliance comprises six educational service centers that provide responsive programs and services to Connecticut schools and districts in a cost-effective manner. As part of the Connecticut State Department of Education's Connecticut Accountability for Learning Initiative, one of its main services is professional development and technical assistance in data-driven decisionmaking for low-performing schools. Training for teachers and school administrators focuses on the data-driven decisionmaking process and how to build and sustain this process in district- and school-level data teams. The training is based on a "train the trainer" model in which trainees then become school leaders of data-driven decisionmaking professional development. Districts that request it receive follow-up technical assistance based on their unique data-driven decisionmaking needs. In addition, the alliance provides a variety of customized work to schools and districts, such as spreadsheet and database training and custom report development.

The RESC Alliance maintained the Connecticut Data Warehouse, powered by TetraData's EASE-e Data Analyzer, and provided technical assistance and training on understanding and using the data warehouse and the data analysis applications associated with it. At one point, as many as 76 of the 169 districts in Connecticut were receiving some type of training and professional development pertaining to the data warehouse. Because of a lack of funding and capacity at the district level, however, the data warehouse was not used and ceased functioning at the end of June 2008.

ESP Solutions Group

ESP Solutions Group provides data systems and psychometrics in K–12 education to federal, state, and local education agencies. Led by a management team with extensive experience in technology and education, it pioneered the concept of "data-driven decisionmaking" in the 1970s.

ESP offers consulting and services in data management, data collection and exchange, data analysis, and data reporting. Its products include the State Report Manager web tool, publications and reports, online data dictionaries, disaster prevention and recovery plans, and electronic transcript tools.

While leading the U.S. Department of Education's Performance-Based Data Management Initiative (now called the Education Data Exchange Network), ESP worked with all states and territories. Past and present work with the Northeast and Islands Region states includes:

• Connecticut—Client: state education agency. Developing a statewide longitudinal educational data system using ESP's State Report Manager web tool and other dissemination tools. In progress.

• Maine—Client: state education agency. Taking a data inventory using DataSpecs, ESP's metadata inventory tool. In progress.

• Massachusetts—Client: Boston Public Schools. Past services for the Massachusetts Department of Education include making an applications inventory for the chief information officer (2006); consulting on statewide student identifiers, confidentiality, and data standards (2002); building a statewide student identification system (1998); and developing a data dictionary (1996).

• New Hampshire—Client: state education agency. Documenting the state's i4see data management policies and procedures for student data access and data verification. In progress. Dropout data collection (1999).

• New York—Client: state education agency. Consulting on student identifiers (2002).

• Rhode Island—Client: state education agency. Designing and implementing a data warehouse; implementing a statewide Schools Interoperability Framework to automate vertical reporting of data. In progress.

• Vermont—Client: state education agency. Consulting work on the state's information system design (1998).

Measured Progress

Measured Progress, an independent nonprofit organization, develops, implements, and operates education assessments for grades K–12 and provides related professional development. It also houses two research centers: one focuses on theoretical and applied psychometrics; the other, on educational assessments. Consulting services in various areas are also offered.

Its K–12 assessment services division covers all aspects of large-scale, alternative, classroom, and online assessments and counts the following Northeast and Islands Region states as former and current clients:

• Connecticut and New York—alternative assessments.

• Maine—Maine Educational Assessment.

• Massachusetts—Massachusetts Comprehensive Assessment System; Massachusetts English Proficiency Assessment.

• New Hampshire—New Hampshire Education Improvement Assessment Program, from 1992 to 2006.

• New Hampshire, Rhode Island, and Vermont—New England Common Assessment Program.

Focusing on assessment literacy and effective assessment strategies, the professional development services division has delivered customized workshops in districts and schools across the Northeast and Islands Region. It also produces a guide to interpreting the state reports so that teachers and administrators can use assessment data to inform, drive, modify, and improve instruction in the classroom.

At the time of the study a new product—i-analyze—that would serve as a reporting portal with filtering options for schools was in development.

National Center for the Improvement of Educational Assessment

The National Center for the Improvement of Educational Assessment, Inc., or the Center for Assessment, is a nonprofit organization founded in 1998 to address the changes in accountability and assessment under way across the country. The center works with state and other education agencies to design and implement effective assessment and accountability policies and programs. It provides services in five areas: technical advisory committees, assessment systems, accountability systems, other consulting, and Reidy Interactive Lecture Series conferences. Since its founding, it has worked in 25 states; at the time of the study it was working with 11 state departments of education and one nonprofit organization. In the Northeast and Islands Region it was working with the Massachusetts and Vermont state departments of education:

• Massachusetts. The Massachusetts Department of Elementary and Secondary Education was creating an accountability system that is part of the Education Reform Law and that validates its psychometric measures and design scheme. The Center for Assessment helped to review the proposal for this system and its education and policy issues.

• Vermont. The Vermont Department of Education needed an accountability system that could meet its unique characteristics: many schools with small student populations and a history of local control. The state wanted a system that could house state tests and other assessments as well as incorporate locally determined data into a reliable measure of school achievement and progress. The center helped the department develop an accountability plan based on its specific needs.

New York Board of Cooperative Educational Services

The Board of Cooperative Educational Services (BOCES), created by the New York State Legislature in 1948, consists of regional support networks that provide services to districts in need of help, including technical assistance and technology support. Participating school districts receive services through a cooperative service agreement, though some services are free. BOCES provides data-driven decisionmaking services based on regional and district needs. Many low-performing schools seek BOCES help to use data to improve teaching and learning.

Housed in the 37 BOCES in the state are 12 statewide regional information centers that give districts access to instructional and administrative technology services as well as training and support for teachers, students, and administrators. They store and process all the data required for the state and provide districts with customized professional development on data management and analysis.

Pearson School Systems

Pearson School Systems is a part of the media giant Pearson, which provides education, business information, and publishing services. Having created innovative education technology for 40 years, Pearson has the largest market share in student information systems solutions—with 6,500 school districts and 130,000 schools serving 22 million students using its products. In addition to its award-winning software, Pearson provides comprehensive consulting, implementation services, and project management.

Some of Pearson School Systems' most popular products include:

• Chancery SMS. A solution for growing urban school districts seeking specific implementation and customization services, streamlined operations, and organizational process re-engineering.

• Pearson Inform. A data analysis and decision support tool for K–12 schools. Performs comparative and longitudinal analyses from the student to the district level. Analyzes data from multiple sources.

• PowerSchool Premier. A fully integrated, web-based, cross-platform student information system. It is a real-time, intuitive, user-friendly communication tool.

• SASI. A flexible, comprehensive student information management system that allows teachers and administrators to monitor, track, and report on student data and progress (through SASI Gradebook, Parent Access, and Classroom). Included in the system is a communication tool that allows parents to track their children's academic progress online. Its focus is on day-to-day operations.

Performance Pathways

A technology company formed in 2005 by the merger of Technology Pathways Inc. with AlterNet Performance, Performance Pathways Inc. focuses on data analysis and a data-informed culture in education. It believes that districts need to have curriculum-driven assessment and that curriculum and assessment data should drive decisionmaking.

Performance Pathways develops and provides curriculum and assessment solutions for the PreK–12 environment. Its three integrated applications—Assessment Builder, Performance Tracker, and TechPaths Curriculum Mapping System—are designed to help teachers and administrators make informed and timely decisions about instruction:

• Assessment Builder. Designs, scores, and analyzes local assessments. Designed to assess benchmark skills to predict how students will perform on the main state test. Instruction can then focus on the identified area of need.

• Performance Tracker. An assessment management tool that collects assessment data and writes reports. It is web based (accessible) and integrates with other popular software. Curriculum and assessments are correlated to state and district standards.

• TechPaths. A curriculum mapping system developed by Dr. Heidi Hayes Jacobs. It uses assessment data and links to curriculum data.

More than 55,000 teachers from 500 schools in 47 states were using products from Performance Pathways at the time of the study. Most Northeast and Islands Region states have districts that employ Performance Pathways:

• New Hampshire—As part of its Follow the Child initiative, the state has provided Performance Tracker to all public schools since spring 2007. The system currently includes assessment data from the New England Common Assessment Program and the Northwest Evaluation Association, but more indicators will be added later. It is geared primarily toward teachers and principals but will ultimately be available to students.

• New York BOCES—All BOCES districts can use the applications. Districts can obtain professional development directly from the BOCES or become certified and train themselves.

• Connecticut, Massachusetts, and Rhode Island—The client list of districts is increasing.

TetraData

Founded in 1997, TetraData became a subsidiary of Follett Software Company in 2006. Its products are data-driven performance solutions for grades K–12. Districts can come aboard at any time in implementing their data system; they are empowered to develop improvement plans as well as monitor, evaluate, and change their strategies along the way.

TetraData Analysis Suite, TetraData DASH, and TetraData Warehouse and Central Data Store are a suite of solutions. A combination of these products and services is packaged to meet each district's needs.

• Analysis Suite. Analyzes and monitors data factors, including longitudinal analyses; measures program efficacy; tracks disaggregated No Child Left Behind subgroups. Intended for use at the district level.

• DASH. A front-end vision alignment tool for the warehouse used by teachers and principals. A communications tool, DASH allows the district to quickly analyze data and disseminate them to principals and teachers.

• Warehouse and Central Data Store. Consolidates databases into a central information source.

In addition to the aforementioned products, TetraData provides data management services, professional development and training, hosting facility services, and customer service that includes a project manager assigned to each client district.

More than 600 K–12 education institutions nationwide, serving more than 2.3 million students, use TetraData products and services in one form or another. Its clients are typically districts with an average of 15,000–30,000 students; the cost of the system often precludes smaller districts from purchasing it. A typical contract is renewable and lasts for three years. In the Northeast and Islands Region, New York City, Rhode Island, Vermont, and at least one district in Connecticut are clients of TetraData.

Appendix F
Service provider profiles

Three service providers were selected for profiling. Providers had to meet the following criteria: work with multiple jurisdictions in the region, have been in existence for two or more years, and have offered these services for two or more years. In addition, organizations had to offer professional development in using data to inform teaching and learning, not just support on the other components. This appendix contains in-depth profiles of those three service providers—the Connecticut Regional Educational Service Centers (RESC) Alliance, Measured Progress, and Performance Pathways.

Connecticut Regional Educational Service Centers Alliance

The Connecticut Regional Educational Service Centers Alliance is a consortium of the six regional educational service centers in the state. Created under state statute, the centers come together to share common problems and solutions and determine cost-effective ways of meeting the needs of public school districts through various programs and services. The RESC Alliance works with Connecticut districts that request its services, though it has also worked in other Northeast and Islands Region states (Massachusetts and Rhode Island). Service areas offered include technology, assessment, curriculum, data warehousing, and data-driven decisionmaking, among others. The centers are funded partially by the state education department, partially by revenues from their services, and partially by federal grant money.

Overview. The Connecticut RESC Alliance was formed in 2000 after the regional educational service centers recognized the strong need for data warehousing in districts. This was before the Connecticut State Department of Education established the Connecticut Accountability for Learning Initiative (CALI), which provides professional development in data-driven decisionmaking and technical assistance to low-performing districts, among other services (see the Connecticut profile in appendix D). At the time, the centers were unsure of when the education department would begin to develop an approach to data warehousing, so they formed the Connecticut RESC Alliance and contracted a vendor to provide districts with data warehousing along with technical assistance so they could use the data to make informed education decisions.

Based on the recommendation of Victoria Bernhardt, an expert in the data-driven decisionmaking field, TetraData was contracted to provide data warehousing to districts. RESC members had worked with Bernhardt and valued her work in data-driven decisionmaking. The Connecticut data warehouse was powered by TetraData's EASE-e Data Analyzer software program.12 Seventy-six of Connecticut's 169 districts have received some type of training and professional development in data warehousing.

The interviewees from the RESC Alliance emphasized that work on data-driven decisionmaking should focus on data warehousing and school improvement planning together, be a collaborative process, and involve multiple methods of collecting data.

RESC Alliance members suggested that districts should not only collect, house, and analyze their data to meet their individual needs but also do this with a focus on the development and monitoring of school improvement plans. EASTCONN, one of the regional educational service centers, worked with the Hampshire Collaborative in western Massachusetts to help it develop its data warehouse and craft school improvement plans that would involve using data. One interviewee from the RESC Alliance described the importance of focusing simultaneously on both data and data use in school improvement in working with districts:

I went out and started working with [schools] initially just focused around data warehousing, but what I knew at that point from what we had tried to do in Connecticut was [how important it was to get them to focus] on school improvement and not just data warehousing. I mean you can buy a data warehouse and that's fine, but if you don't think about the other side of this, which is what you're going to do with these data, and begin to develop these things down parallel roads, then you're going to build a wonderful warehouse and . . . what we found in Connecticut is that we built this great tool, but nobody knew what to do with it. And you know, for those districts in Connecticut that didn't know what to do with it, they failed. (Interview, February 6, 2008)

The RESC Alliance also views data-driven decisionmaking as a collaborative process, as one interviewee described:

It's about conducting some root cause analyses so that you're getting to some of those underlying causes. I'm a firm believer that you can't get to the root cause in education because the variables are so many and so complex, but you've got to [make] . . . your best attempt at getting to some root causes. And then setting some . . . goals and developing some real, clear action plans. Strategies are fine, but the missing piece in some of this is having a real plan to implement those strategies. It's completing the loop here so that . . . we've got monitoring plans in place, we've got action plans, who is responsible, all of those kinds of things. You know, it's almost a misnomer to call it data-driven decisionmaking because you're doing more than making a decision. (Interview, February 6, 2008)

Interviewees also noted the importance of having multiple methods of collecting data if data-driven decisionmaking is to be effective. Due to pressures to meet adequate yearly progress targets, districts and schools have made collecting and examining state assessment data from the Connecticut Academic Performance Test and the Connecticut Mastery Test a priority. The RESC Alliance recognizes that other data must also be used to make improvements and also that students should have classroom assessments to make more real-time decisions on instruction.13 This vision for data-driven decisionmaking work is integrated with the services the RESC Alliance and CALI provide to Connecticut districts and schools.

Clients. The RESC Alliance works with any district in Connecticut that requests services and has also worked in other Northeast and Islands Region states (Massachusetts and Rhode Island). The data-driven decisionmaking training it delivers through CALI had been offered for free only to districts in corrective action, and for a fee to other districts. At the time of the interviews, the RESC Alliance was looking for funding to provide services free of charge to all interested districts.

Services. Along with technical assistance and training on understanding and using the data warehouse and the associated data analysis applications, the RESC Alliance also provides districts and schools with a continuum of services, including forming data teams and guiding school improvement planning. Customization is also available to schools or districts in, say, comprehensive data management services and assessment mapping.

The RESC Alliance is involved in professional development services that are part of the Connecticut State Department of Education's CALI, which aims to advance learning for all students, particularly those from the 12 districts in corrective action. Initiated in 2004, CALI provides support to districts and schools in the following areas: data-driven decisionmaking data teams, making standards work, effective teaching strategies, common formative assessments, school climate improvement to support academic achievement, and accountability in district and school improvement planning. The professional development and technical assistance are offered to Title I districts and schools that are identified as in need of improvement. But ineligible districts and schools can participate for a fee. For those districts that choose to participate, four days per year are set aside for professional development; the technical assistance is offered throughout the year as follow-up support.
The professional development and and schools have made collecting and examin- technical assistance are offered to Title I districts ing state assessment data from the Connecticut and schools that are identified as in need of Academic Performance Test and the Connecticut improvement. But ineligible districts and schools Mastery Test a priority. The RESC Alliance recog- can participate for a fee. For those districts that nizes that other data must also be used to make choose to participate, four days per year are set improvements and also that students should have aside for professional development; the technical 62 STate education agencies in the Northeast & Islands Region AND data-driven decisionmaking

The data-driven decisionmaking component of CALI aims to reduce achievement gaps by providing districts with professional development services and technical assistance on using data to improve pedagogy and learning and, by extension, student achievement. The hope is that these services will create a culture of data-driven decisionmaking in schools and districts in which data teams of school administrators and teachers examine multiple kinds of data on an ongoing basis.

These services are based on the work of data-driven decisionmaking experts, including Doug Reeves (Reeves 2000, 2001), Nancy Love (Love et al. 2008; Love 2002), and Victoria Bernhardt (Bernhardt 1998). Reeves centers his approach to data-driven decisionmaking and other school improvement planning efforts (such as using effective teaching strategies and coaching) around a holistic accountability system in which districts, schools, and communities are accountable for what happens within the system. The system must be collaborative and yield identifiable results based on set goals for each level of stakeholder. This process of collaboration provides a vehicle for examining data and student work that leads to changes in teacher instruction and student learning. In addition, Love and Bernhardt were among the first to approach data-driven decisionmaking on a practical level, and their work focuses on providing technical assistance to schools and districts around establishing a cycle of inquiry and data-driven decisionmaking. The trainers draw on the work of these experts to design their professional development sessions.

RESC Alliance data-driven decisionmaking training. Two of the regional educational service centers provide low-performing districts with professional development training in data-driven decisionmaking and data teams, and they have plans to deliver the common formative assessments training in the near future. The technical assistance follow-up to the professional development is also provided by the regional educational service centers—though not by the data-driven decisionmaking professional development trainers themselves—and the State Education Resource Center.

The RESC Alliance trainers were initially trained by consultants from the Leadership and Learning Center. These trainers could then train others, as well as provide on-site support, in the various modules that make up the organization's approach to school improvement. The interviewees were trained in data-driven decisionmaking and will soon be trained in common formative assessments in order to provide services in that area. The individuals who provide the basic data-driven decisionmaking training have complementary backgrounds that bolster the services provided. While both are certified to train others in data-driven decisionmaking, one has a stronger background in the technical aspects of data-driven decisionmaking, including collecting, housing, and managing data; the other is stronger in school improvement.

Professional development services. The RESC Alliance delivers a two-day basic training in data-driven decisionmaking to eligible districts that are part of CALI.14 This training gives district teams the opportunity to examine data from their own schools and apply them within the context of the guided training. The basic training is a seminar for teachers and school administrators, including, for example, curriculum developers, superintendents, and principals. Participants learn how to examine their own real student data using a five-step data-driven decisionmaking process (see appendix D for an overview of this process) and how to develop and sustain this process on district- and school-level data teams composed of fellow teachers and administrators. Through this process, data teams collaboratively analyze data and identify student strengths and weaknesses. Team members devise instructional strategies that best address these areas as well as the required learning standards. These strategies must be among the Effective Teaching Strategies identified in a meta-analysis by Marzano, Pickering, and Pollock (2001, cited in Connecticut State Department of Education 2008).

Team members are meant to implement their strategies, monitor them, and discuss them at their next meeting. A variety of actions are taken as a result of this process, including curriculum revision, program redesign, and funding redistribution.

Participants bring their own classroom- or school-level data to analyze during the trainings. Attendees bring a variety of data, including state assessment data; data on behavioral issues, such as tracking the number of classroom or office referrals; and school climate survey data. The data are typically housed on paper or in a spreadsheet. No technology analysis tools are used in the training.

Technical assistance services. Technical assistance, based on particular district needs, is provided to districts throughout the year as a follow-up to the professional development. It could involve repeating the basic training described above, developing a common formative assessment, or helping districts use spreadsheet software to examine their data. It could also involve helping districts learn how to provide differentiated instruction or examine student work.

Challenges. A main goal of CALI training is to encourage schools and districts to create a culture in which educators use data frequently, with the ultimate goal of improving student achievement. However, there are challenges associated with institutionalizing data-driven decisionmaking practices in schools and districts in the state. The first challenge needing attention is teacher resistance to change. As an interviewee from the RESC Alliance commented:

    You continue to send a constant message, and at some point either the change starts to occur and they are still in the place they were in, or they start to internalize some of the message and make it their own. So that would be the first obstacle. (Interview, February 6, 2008)

Furthermore, this is layered with the problem of districts making quick fixes to meet adequate yearly progress and ignoring the need for systemic change. Another interviewee stated:

    Everybody is so "deer in the headlights" because of [adequate yearly progress]. I mean, it is hard to institutionalize anything because they are all about quick fixes. They are about safe harbor, which . . . gets you out of the newspaper, but that does not fix the problem. . . . You still have most of your kids not making [adequate yearly progress], but you made safe harbor. It's like, "Wait a minute . . . there's still something wrong with this picture here." (Interview, February 6, 2008)

Another challenge is the lack of time in a typical school day to conduct such practices as data team meetings. In addition, there are often not enough individuals in the district or school who have the ability and time to keep the data-driven decisionmaking work on track throughout the year. But one of the biggest challenges noted by interviewees from the RESC Alliance was the lack of a clear vision for data-driven decisionmaking in districts. As one interviewee noted, "Without that shared vision, people don't know what it is they are trying to institutionalize." He also suggested that leadership is ultimately responsible for creating a vision of data-driven decisionmaking in a district or school and making sure that it is shared among administrators and teachers. A shared vision is often not institutionalized because of poor leadership and, sometimes, teacher turnover. If there is an institutionalized vision in a particular district and a new leader comes in with a completely different idea of data-driven decisionmaking and how it should be implemented, a "flavor of the month" trend can result, preventing the district from making continuing progress in data-driven decisionmaking. One interviewee suggested that when districts are interviewing for a new superintendent or principal, they should explain to the candidate that a culture already exists and encourage that person to embrace it and carry it forward. However, there often is not a culture in place, so it is also important to ensure that a new leader has the motivation and ability to create a vision and make sure it is shared by all.

Finally, while districts have the infrastructure for adequate yearly progress reporting purposes, they have not created the classroom and formative assessments needed to collect additional data beyond demographic and state assessment data.

Despite the challenges associated with institutionalizing a culture of data-driven decisionmaking in districts and schools in Connecticut, the RESC Alliance interviewees cite their success at the school level in raising awareness of the need for data to inform education decisions.

Additional resources. These include:

• Connecticut RESC Alliance web site at http://ctrescalliance.org.

• Connecticut Accountability for Learning Initiative web site at www.sde.ct.gov/sde/cwp/view.asp?a=2618&Q=321754.

Measured Progress

Measured Progress is a nonprofit company that provides large-scale and alternative assessments and professional development in using data to improve student achievement. Founded in 1983 as Advanced Systems in Measurement and Evaluation, Inc., the company has worked with more than 35 states and major districts to develop large-scale and alternative assessment programs. Headquartered in Dover, New Hampshire, Measured Progress also has facilities in Colorado, Kentucky, and New York, with more than 400 full-time employees.

Although the company has grown dramatically over the past 25 years, it remains focused on its original commitment to students and to teaching and learning. One of its cofounders, who still leads the company, began his career as an elementary school teacher; in fact, most senior staff members are former teachers who bring a wealth of experience in education and in supporting schools with Measured Progress. For example, the vice-president of client services is the former director of the Maine Educational Assessment testing program, and before that he was a classroom teacher and principal. As an employee of the state education agency in Maine, he worked closely with Measured Progress (then Advanced Systems) to develop and administer the assessment and was the original provider of the test interpretation workshops in Maine. In an interview he speculated that he has probably conducted test interpretation workshops as long as anyone in the field. Given its staff's rich backgrounds in the field, Measured Progress is particularly qualified to support local educators in implementing data-driven decisionmaking.

Overview. According to the company web site, Measured Progress embraces the motto, "It's all about student learning. Period." This was reiterated by senior staff and was evident in their descriptions of their work. One senior staff member commented on the problem of "data overload" in schools that does not lead to improved student learning, saying, "There's way too much data out there and way too little information."

The company's mission is to improve teaching and learning by providing assessment data that are customized to the needs of schools and then helping educators use those data to improve student achievement. The company web site describes Measured Progress as striving to provide services for educators that meet the following characteristics:

• Relevant. Assessment is not an end in itself; its purpose is to foster student growth.

• Personal. Shared goals, values, and passions enable us to forge lasting relationships.

• Focused. You and your students are at the heart of our mission.

• Reliable. You know you can count on us to do what it takes to get the job done right.

• Tailored. Customization is part of who we are; we specialize in solutions that fit.

• Student driven. It is all about student learning!

Through its professional development, Measured Progress focuses on helping districts and schools develop several factors common to high-achieving schools: a standards-based learning environment, teachers and administrators who understand assessments, effective use of a variety of strategies to gather information about student learning, adjustments and interventions during the learning process, professional learning communities, collaborative teams that examine all aspects of student work to determine program effectiveness, and the use of multiple data sources to plan for improvement. The ultimate goal is to help teachers use data to help students. In the words of another employee, quoted on the Measured Progress web site: "The intention of the company is, always has been, and always will be to impact how well teachers can help students to learn. It's why we exist."

This sentiment was echoed in interviews with two senior staff members. Both spoke passionately about their goal of supporting teachers and students. One noted his strong distaste for one-shot professional development workshops that fail to build relationships with educators or to serve as a guide and consultant as educators deepen their understanding of how evidence can shape classroom practice.

Clients. Measured Progress serves districts and states around the country, developing and administering state assessments for 13 states (table F1). These statewide assessment contracts generally include professional development on administering the assessment and on understanding and using assessment data; this is included in the state's contract and provided to educators in all districts and schools in the state. Measured Progress also has professional development contracts with 3 states and with individual districts in 12 states.

In the Northeast and Islands Region it serves seven states in various capacities. It has worked with Maine to develop and administer the Maine Educational Assessment since the test's inception in 1985. It also developed the Massachusetts Comprehensive Assessment System and works with New Hampshire, Rhode Island, and Vermont to administer the New England Common Assessment Program. It produces alternative assessments for Maine, Massachusetts, New Hampshire, New York, and Rhode Island and has been administering the Massachusetts English Proficiency Assessment since 2003. And it contracts directly with districts in Maine, New Hampshire, and Vermont to provide schools and districts with professional development that focuses on creating a data-driven culture.

Services. Measured Progress primarily offers two kinds of contracts to states and districts around the country. It develops and administers large-scale and alternative assessments to help states and districts understand their assessment reports, and it provides professional development services to improve the capacity of educators to make data-driven decisions.15

Table F1
Measured Progress clients, 2008

Service provided            State or district contracting for services

State contracts
  Assessments               Florida, Georgia, Kentucky, Maine, Massachusetts, Missouri, , , New Hampshire, New York, Rhode Island, Utah, Vermont
  Professional development  Georgia, , Kentucky, Louisiana, Maine, Massachusetts, , Mississippi, Montana, Nevada, New Hampshire, , , South Dakota, Vermont

Source: Authors' compilation.

Most of these services are provided under multiyear contracts, allowing Measured Progress to tailor its assistance to client needs. Professional development is delivered by former educators, whose experience lets them focus on the needs of schools and classrooms.

The professional development services provided by Measured Progress address relevant categories discussed in the research on data-driven decisionmaking: building assessment literacy (Price and Koretz 2005; Center for Research on Evaluation, Standards, and Student Testing 2006), developing inquiry cycles to focus data analysis (Boudett, City, and Murnane 2005; Bernhardt 1998), examining instruction (City, Kagle, and Teoh 2005; Love 2002), and improving achievement (Love et al. 2008). Measured Progress is also beginning to create data analysis tools (Wayman, Stringfield, and Yakimowski 2004). The company houses two research centers, which help it stay up to date on large-scale and alternative assessment, formative assessment, and using data to improve classroom practice. Researchers from Measured Progress regularly present at state, regional, and national conferences. Measured Progress also stays current through its National Advisory Committee, made up of education experts; its partnership with the National Association; and its memberships in the Council of Chief State School Officers and the Council of Great City Schools.

Large-scale assessments. Measured Progress develops and administers various large-scale assessments. Most are criterion-referenced tests administered to students in each grade level, along with alternative assessments for students who cannot participate in the regular assessment program. Measured Progress also administers high school graduation tests for Georgia and Nevada, writing portfolios for Kentucky's comprehensive assessment system, and a test of language arts proficiency for English language learner students (the Massachusetts English Proficiency Assessment) for Massachusetts.

It works with the states to develop large-scale and alternative assessments customized to the needs of each state education agency—aligning assessment items with state content and skills standards, for instance. The company scores the tests and creates test reports for each school and district and works closely with the states to support districts and schools in interpreting the test reports.

Professional development to support assessment contracts. In conjunction with the state assessments, Measured Progress provides professional development for local educators to help them make meaningful decisions using their test data. Supports include detailed assessment reports and workshops to help educators use the data to improve classroom practice and school and district programs.

Because Measured Progress believes a well-designed score report is the first step in helping educators interpret assessment data, it carefully crafts its reports to guide discussions about school improvement. A senior staff member explains:

    The extent to which schools can use data from assessment is driven by the professional development they receive to some extent, but it's also driven by the design of the reports. We've always tried to design reports to maximize information to schools. It's the first line [for deciding] what information you want to communicate. What do you want to be able to say based on the results of these tests? What information do you want to give schools? We focus around . . . three essential questions [How did we do? What do the data tell us about parts of our program? What do the data tell us about parts of our population (No Child Left Behind subgroups)?]. If I see three years in a row that students always do worse in geometry, then that's program evaluation in its purest sense. (Interview, March 24, 2008)
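The strand-level pattern the interviewee describes (consistently weaker geometry results across years) is straightforward to surface once scores are organized by strand and year. The sketch below is purely illustrative: it is not Measured Progress code or its report format, and the data, field names, and flagging rule are invented for the example.

```python
# Illustrative sketch only: a minimal strand-by-year rollup of assessment
# results, in the spirit of the "program evaluation" reading described
# above. Data and names are hypothetical, not Measured Progress's.
from collections import defaultdict

records = [
    # (year, content strand, school mean percent correct) -- fabricated
    (2006, "geometry", 48), (2006, "algebra", 71), (2006, "number sense", 70),
    (2007, "geometry", 52), (2007, "algebra", 74), (2007, "number sense", 69),
    (2008, "geometry", 50), (2008, "algebra", 76), (2008, "number sense", 72),
]

# Average percent correct for each (year, strand) cell.
cells = defaultdict(list)
for year, strand, pct in records:
    cells[(year, strand)].append(pct)
means = {key: sum(vals) / len(vals) for key, vals in cells.items()}

# Flag strands that fall below the all-strand average in every year:
# a multiyear pattern reads as a program signal, not a one-year blip.
years = sorted({year for year, _ in means})
strands = sorted({strand for _, strand in means})
for strand in strands:
    strand_means = [means[(year, strand)] for year in years]
    year_averages = [sum(means[(year, s)] for s in strands) / len(strands)
                     for year in years]
    if all(m < avg for m, avg in zip(strand_means, year_averages)):
        print(f"{strand}: below the all-strand average every year {strand_means}")
```

Run on these fabricated numbers, the sketch flags geometry in all three years; in a real score report this kind of rollup would sit behind the tables and graphs educators see.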

Reports are tailored for each state education agency. There is no boilerplate test report: each report looks different, based on the needs and priorities of the state education agency. Measured Progress tries to make the reports complement and align with other initiatives in the states. The reports provide item-level information linked to state standards so local educators can make meaningful decisions about school programs and classroom practice within the context of their state's accountability system. In the words of one administrator from Measured Progress:

    The design of the reports is really where it starts, I think. The linking of data on individual item results is a way of making it easier for teachers to grab hold of the data and make some sense of it. The item-level report has generated more conversation among teachers than anything we've ever produced. It gets real personal on a kid level, and it helps them to think about why students got something wrong. . . . Massachusetts releases 100 percent of [the test items] every year. Maine and NECAP [states] would if they could afford it. . . . When you can show a teacher an actual item and how all their kids performed on that item, it becomes a personal thing for the teacher and that's really good. (Interview, March 24, 2008)

In interviews, senior staff at Measured Progress made clear their belief that its score reports are a major contribution to helping local educators use data to make informed decisions.

Measured Progress also offers workshops on test interpretation as part of its large-scale assessment services. These workshops focus on answering three important questions through the data: How did we do? What do the data tell us about parts of our program? What do the data tell us about parts of our population (the subgroups)? As one senior staff member at Measured Progress explained:

    It's not just about training people how to look at reports; it's about what you do with them. . . . In the workshops we try to make the data come alive. Schools bring their own reports. We walk them through each one of the reports and give them clues about where they would find data about answering . . . those [three] questions. (Interview, March 24, 2008)

Measured Progress works with the state education agency to design and implement the workshops, which typically last half a day and are offered in various locations in the state. The workshops target district administrators, principals, and teachers, who not only review their own data but are also expected to share the process with others at their schools.

Until recently, these workshops were face-to-face and included little technology. As Measured Progress moves toward online test administration and score reports, workshops are beginning to integrate more technology in order to train users on the new web-based data portal.

Professional development to support data-driven decisionmaking. Measured Progress offers additional professional development, separate from its large-scale assessment contracts, to help educators use data to inform decisionmaking. These professional development contracts are primarily with individual districts. One senior staff member noted:

    We have a variety of professional development offerings, from looking at data to looking at student work. It's all loosely connected with assessment. . . . It's more elective on the part of districts, based on their own judgment or readiness for a variety of things. All professional development is customized for their needs. (Interview, March 24, 2008)

Measured Progress helps districts create classroom assessments tailored to individual local client needs and aligned to state standards, so students are better prepared to take the high-stakes assessment. It also offers support for creating a formative assessment process, which the company defines as adjustments and interventions during the learning process.

These contracts are longer term, not isolated workshops. This reflects the overall philosophy of the company, which emphasizes building relationships with clients to better meet their needs. The director of professional development described how essential this was to him:

    When I joined Measured Progress, it was doing a lot of workshops. I wanted to move toward longer term relationships with clients. It was a condition of joining the company. (Interview, April 10, 2008)

Measured Progress does offer low-cost, one-day workshops, however, primarily as an enticement to establish a longer term relationship.

Challenges. Measured Progress has been busy helping state education agencies scale up their large-scale assessments in the wake of the NCLB Act. Although most state education agencies administered statewide assessments prior to 2001, they now test all students every year, beginning in grade 3. Measured Progress staff understand and respond to the pressure this puts on states:

    Departments of education are being asked to do exponentially more with fewer resources. Prior to NCLB, most states tested at three grade levels. Overnight, they had to more than double the number of kids they were testing. And in most cases [education departments] were doing it with the same and even reduced staff. And they were also trying to implement data warehouses. It's insane what we ask state employees to do. (Interview, March 24, 2008)

Measured Progress has expanded to accommodate the needs of clients and to remain relevant in this new world of assessment. Senior staff point to a new challenge facing educators: they often have more data than they can use. Staffers also argue that in systems where data use is prevalent the biggest danger is "overanalysis"—drawing conclusions that the data do not warrant. Measured Progress works on several levels to counteract both data overload and possible misuse. First, it designs both score reports and professional development to make data as comprehensible and straightforward as possible, with a focus on the three essential questions listed above. Second, it designs assessments to lessen the possibility of users drawing erroneous conclusions by making sure that there are a sufficient number of questions in each subcategory. Finally, it works with state education agency officials and teachers in each state to ensure that questions are tied both to state standards and to classroom curriculum.
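The point about keeping a sufficient number of questions in each subcategory can be made concrete with a rough calculation. The figures below rest on a simple binomial approximation and are illustrative only; they are not a description of the psychometric models Measured Progress actually uses.

```latex
% Illustrative only: treat a subcategory as n independent items, each
% answered correctly with probability p. The standard error of the
% observed percent correct is then approximately
\mathrm{SE} \approx \sqrt{\frac{p(1-p)}{n}}
% For p = 0.7: n = 4 items gives SE \approx \sqrt{0.21/4} \approx 0.23
% (about 23 percentage points), while n = 12 items gives
% SE \approx \sqrt{0.21/12} \approx 0.13.
```

On this rough model a four-item subcategory cannot reliably distinguish moderate from strong performance, which is exactly the kind of erroneous conclusion that adding items guards against.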

It has also been a challenge to work with state education agencies to create data systems that comply with federal reporting requirements but also provide data that are meaningful for improving school programs. If the state education agency's data are not meaningful, it is difficult for the contractor to provide meaningful score reports and professional development on using those data:

    The big [challenge] is the robustness of the [state education agency's] data warehouse. Basically, if you put garbage in, you get garbage out. . . . So that's certainly a huge challenge. (Interview, March 24, 2008)

As both an assessment developer and a professional development provider, Measured Progress is uniquely situated to help state education agencies create data systems useful to educators and to guide data use to shape instruction and school programs.

Next steps. For a quarter of a century, Measured Progress has created customized products and services for its clients. In the twenty-first century this has evolved into administering assessments online, providing online data tools to facilitate using data at the school level, and supporting external accountability systems that did not exist at its founding in 1983. Measured Progress has adapted with the times but has faced challenges in meeting the needs of clients and staying current with the changing environment.

In recent years a shift toward online testing and reporting has presented many challenges for Measured Progress. Many of these are related to technology infrastructure and the limited bandwidth available to access the web-based tools. One challenge associated with online reporting, noted by respondents at the Maine Department of Education and by workshop participants, was finding a venue for the training that was geographically convenient for local educators but also had the capacity to give all participants access to the online reporting tools. Measured Progress will work with the education department to find a venue that supports hands-on computer-based training in using the tool. Similar logistical concerns may arise with other states as Measured Progress expands its online data portal and moves from its pilot sites to general use. Another issue cited by users is the length of time it has taken to roll out the online tools to a point where they are operating at full capacity. This time lag reflects the "growing pains" of working toward more high-tech products for a company that has historically been more concerned with psychometrics and professional development.

Performance Pathways

Performance Pathways is a for-profit technology company focused on bringing a data-driven culture to educators in schools. The company uses a suite of three products to achieve this: Performance Tracker, Assessment Builder, and the TechPaths Curriculum Mapping System. Founded in 2005, Performance Pathways is the result of a merger of two companies, AlterNet Performance and TechPaths Company.

Overview. Founded in 2002 by Jeff Colosimo, the current president and chief executive officer of Performance Pathways, AlterNet Performance was a technology company dedicated to developing software solutions and services to manage assessment information. Its products included Performance Tracker and Assessment Builder. TechPaths Company was started in 1998 by Dr. Bena Kallick, current vice-president for professional and staff development at Performance Pathways, and contained the TechPaths Curriculum Mapping System. The merger of the two companies advanced their mission to create a data-informed culture. One employee described how the merger was almost inevitable and necessary for work in using data with educators:

    There are commercials advertising Reese's Peanut Butter Cups [that ask], "Who got peanut butter in my chocolate and who put chocolate in my peanut butter?" That is kind of how this company is. At this point, it is almost impossible to separate curriculum mapping and curriculum data from assessment data—kind of like the peanut butter cup. . . . It just became so evident that data and data-informed decisions are not just an assessment: that if you don't link that assessment information to curricular and teaching information, then you are really only getting half the picture. . . . We have too much peanut butter or too much chocolate [when] we really need to have the chocolate/peanut butter. (Interview, April 4, 2008)

Performance Pathways uses its three software products to help teachers create a culture of data-driven inquiry in their schools. The products facilitate both analyzing assessment data and mapping curriculum, both essential to improving classroom practice.

Clients. The Performance Pathways suite of products is available to educators through statewide and district contracts. Through these contracts, Performance Pathways serves more than 55,000 teachers from 500 schools and districts in 47 states. In the Northeast and Islands Region, New Hampshire had a statewide contract with Performance Pathways at the time of the study. The state initially purchased Performance Tracker for use by all districts and recently chose to also offer Assessment Builder to districts ready to begin work in this area.

Some districts in New Hampshire have already purchased TechPaths for use in their curriculum mapping as well. In addition to New Hampshire, Performance Pathways contracts with districts in Connecticut, Massachusetts, New York, and Rhode Island on an individual or group basis.

Once a state has purchased the software products, all districts in the state have access to them. States using Performance Tracker are responsible (with the help of Performance Pathways) for uploading all their chosen data for users to access. In addition to providing the software for districts to use, states can also choose to purchase professional development services for the districts. If a state chooses not to purchase the full array of professional development services provided by Performance Pathways, districts can purchase additional services at a reduced price.

Districts can contract directly with Performance Pathways in two ways: by adding products and professional development that their state does not offer, for use in conjunction with the products the state has already purchased, or, in states with no statewide contract, by working directly with Performance Pathways to purchase and use the software products. The services provided to districts are the same as those provided to the states that purchase Performance Tracker; Performance Pathways loads the data into the system and maintains the data that the district has provided.

Services. Performance Pathways offers a suite of data tools and accompanying professional development services comprising Performance Tracker, Assessment Builder, and TechPaths. Performance Tracker houses different types of data and allows users to generate reports based on the available data. Assessment Builder allows users to build their own assessments with the assistance of a content library and easy references to state and district standards. Through the TechPaths software, users access advanced tools to develop a sophisticated curriculum map. The work is tailored to meet the needs of each client.

Data tools. Performance Pathways' three main products—Performance Tracker, Assessment Builder, and TechPaths—can be used individually, but they are also designed to be used together.

Performance Tracker is designed as an easy-to-use tool that can help schools collect and access data and meet accountability requirements. It is web based, so data can be quickly accessed and viewed in different formats (tables and graphs); it also produces performance-based reports. All the assessment data are correlated to state standards, and actual assessments can be added to the system and accessed through the software.

The second product, Assessment Builder, allows schools to create assessments based on state and district standards. The system gives users access to the current available state assessments, as well as previous assessments. Once the assessments are created, accompanying information details which standards are addressed in the test and which questions correspond with each standard. Beginning in summer 2008, a new component of Assessment Builder was introduced: OLA produces online assessments that allow teachers to review the results immediately. Accompanying the assessments are answer sheets that can be scanned into the system to produce item-analysis reports. When attached to Performance Tracker, the results can be uploaded into the system, compared with other assessments, and used to generate reports.
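The alignment information Assessment Builder reports (which standards a test addresses and which questions correspond with each standard) is, at bottom, a many-to-many mapping between items and standards. The sketch below is a hypothetical illustration of that structure, not Performance Pathways code; the identifiers and field names are invented.

```python
# Hypothetical sketch of item-to-standard alignment, in the spirit of the
# Assessment Builder reports described above (not actual product code).

# Each test question is aligned to one or more state or district standards.
alignment = {
    "Q1": ["MATH.G.2"],                  # a geometry standard
    "Q2": ["MATH.A.1", "MATH.A.3"],      # two algebra standards
    "Q3": ["MATH.G.2"],
    "Q4": ["MATH.N.4"],                  # a number sense standard
}

# "Which standards are addressed in the test?"
standards_addressed = sorted({std for stds in alignment.values() for std in stds})
print("Standards addressed:", standards_addressed)

# "Which questions correspond with each standard?" Invert the mapping.
by_standard = {}
for question, stds in alignment.items():
    for std in stds:
        by_standard.setdefault(std, []).append(question)
for std, questions in sorted(by_standard.items()):
    print(f"{std}: {', '.join(questions)}")
```

An item-analysis report of the kind produced from scanned answer sheets then only needs to join per-question results to this inverted mapping to show performance by standard.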
The third product, TechPaths, is a curriculum-mapping system based on the work of Heidi Hayes Jacobs (1989, 1997, 2004). It is designed to link curriculum and assessment data. TechPaths can be linked to both Performance Tracker and Assessment Builder to access assessments and their data. In addition, state standards are downloaded so users can align their curriculum with the standards. And they can reference the standards to evaluate how and when a particular standard is addressed in the curriculum.

The development of these products and the accompanying professional development services is guided by the work of experts in the field.

The work with curriculum mapping is based on the work of Heidi Hayes Jacobs (Jacobs 1989, 1997, 2004), Bena Kallick (Kallick and Costa 1995; Kallick and Wilson 2001; Kallick and Colosimo 2009), Grant Wiggins (Wiggins 1998), and Jay McTighe. The work of Wiggins and McTighe (Wiggins and McTighe 2005; McTighe and Wiggins 1999, 2004), along with that of Rick Stiggins (Stiggins et al. 2004; Chappuis et al. 2004), also influenced the assessment work.

Training. The suite of Performance Pathways products is accompanied by two levels of professional development. These services are a key component of the work, as one staff member explained:

    Each of the tools, Tracker and TechPaths and Assessment Builder, in and of itself is simply a product. . . . Through training provided [we work] around how [to] get your staff together to actually use this as a tool. . . . What conversations do you have? What protocols do you use to begin to examine the data? . . . Then it really isn't just about, "Can you use the software?" It is far more about: "What do you do once you know how to use the software?" (Interview, April 3, 2008)

In working toward this goal, the company offers two levels of training focused on creating a data-informed culture: introductory and advanced. The introductory training is technology based, with a focus on how to use the software, what the software has to offer, and how to generate reports. The advanced training sessions, which make up the majority of the training sessions, are the next step. These sessions teach users how to work with the data and with the reports generated by the software. This training is designed to help administrators work with teachers to determine the best use of the available information: What types of conversations should teachers and administrators be having, and how should they use the information to go back and enhance their work with students? Additional skills developed during the training focus on linking curriculum and assessment data. The underlying goal of all the training is to focus on what improvements can be made for students.

The introductory and advanced training sessions are usually two-day workshops. Once a contract is established with Performance Pathways, the introductory training can be set up in as little as one or two months. Once users are familiar with the software, they move on to the advanced training. The advanced training is continually administered as user needs and understanding of the product change. Districts can decide in conjunction with project managers how to space the training, how to introduce the products, and when to offer more advanced training. Although there is no minimum length, most contracts with Performance Pathways run for at least a year, to allow schools to look at a year's worth of data. Contracts are renewable and often run from one to three years.

District employees receive a first round of training, either introductory or advanced, from Performance Pathways trainers. Once they are familiar with the products, they are trained again by Performance Pathways on how to train others in their districts to use the products. After this second round, trainees continue to receive ongoing support from Performance Pathways trainers. In states with statewide contracts, individuals at larger regional service centers or professional development centers are trained to train, in turn, teachers and other educators in the districts, to reach a larger audience.

The training occurs almost exclusively with each user at an individual computer loaded with the applicable software. This hands-on approach is very important, as the products are software based and require familiarity with all the tools, as one senior staff member explained:

    [Performance Pathways is] a software company, so we really have to be in front of a computer. . . . The [introductory] training [requires] one user per computer. Everyone has to be hands-on; everyone has to get a feel for manipulating [the software] and getting a sense of how it works. (Interview, April 3, 2008)

There are times when training takes place in a workshop format, with participants not on individual computers (such as during an advanced training session that is more content based, though this is not common), and in user conferences.

In addition to professional development on product use, in states with statewide contracts Performance Pathways also provides a user conference. This conference pulls together educators in the state who are working with the software products and allows them to learn more about the software as well as other products and how they work in conjunction with each other. One staff member described how districts are sometimes asked to come to a conference in teams of two or three people "because that is really critical. You need that collegiality and that shared experience" (interview, April 3, 2008).

A New Hampshire user conference for state and district administrators focused on Performance Tracker, as it was the first product purchased by the state. The conference began with information from New Hampshire Department of Education personnel, including the commissioner of education. The president and vice-president of Performance Pathways then gave a presentation on Performance Tracker and the overall work of the company in the area of data-driven decisionmaking. Performance Pathways' director of technology then presented information on upcoming changes to the software and other products. The remainder of the program took place in a variety of breakout sessions. These sessions were staffed by trainers and other employees of Performance Pathways and in some cases included individuals from districts in New Hampshire that have been using Performance Tracker. The sessions covered the basics of each of the three software products, with both introductory information for new users and more advanced information for those already familiar with the products. Other sessions focused on the culture of data-driven decisionmaking by looking at using data and establishing professional learning communities focused on using data. Districts with longer histories with the products presented their perspectives and experiences.

Product support. The products come with several layers of help and support. Each contract has a project manager who checks in with districts and states to find out how things are working and what additional training would best serve the needs of the district. The project manager notes areas of weakness or concern identified by the staff in order to best tailor the training to their needs. In addition, the project manager can monitor how often the software is being used. If the software is underused, he or she will talk with the district to try to understand the reasons and determine how best to help the district make use of the product.

In addition to district-level support related to the overall use of the product, there are several other sources of support for users. For basic questions, such as difficulties logging on or entering information, a toll-free number is available 24 hours a day, 7 days a week. In addition, each product comes with an online manual, and every screen has a "help" function as well as icons that direct users to help documents. And district or state administrators have access to the trainers and can email or call them directly with questions or concerns.

Challenges. Through their work inside districts, Performance Pathways employees have seen a shift in how teachers perceive assessment data: "It seems . . . [that] as a nation, we are going from educators who thought that the assessment information was really done for someone else to a nation of educators who believe that the assessments are truly for them" (interview, April 3, 2008).

Interviewees at Performance Pathways noted that they were receiving less resistance in schools but were still encountering some initial resistance from teachers who see the software as overwhelming, requiring a lot of work through the training and collaborative planning. Once trained, however, teachers report being very happy, able to run reports with a few clicks rather than putting together a report by hand. One challenge Performance Pathways continues to face, though it is decreasing, is staff members' limited experience with technology; as their experience increases, the software becomes easier to use and understand.

Next steps. As a company focused on using data to inform its own decisions, Performance Pathways regularly collects data to evaluate its work. In addition to data compiled from evaluations of its training program, frequent discussions with education leaders identify what is working and what is not and provide information on school successes. This continual evaluation is accompanied by the work of project managers, who monitor use of the software and check in with districts when it is underused. As a result of such evaluation, the company regularly makes both minor and large-scale changes to its products. Minor releases occur based on user comments. Larger changes to the software occur every six months to a year and require retraining staff. Performance Pathways informs clients of changes through release notes that outline the new features and provide a tutorial on them. Larger districts might also receive a web seminar on the changes in addition to the release notes.

Notes

The research team for this project consists of employees of the prime contractor, Education Development Center, Inc., of Newton, Massachusetts, and employees of its subcontractor, Learning Innovations at WestEd of Woburn, Massachusetts. No members of the research team at either organization or any of their colleagues in their research centers (Learning Innovations, Regional Educational Laboratory Northeast and Islands, or the Center for Children and Technology) have financial interests in any of the service providers discussed in this report, nor did they provide any support to the state education agencies in developing or evaluating their data-driven decisionmaking initiatives. The authors are not aware of any colleagues who might have financial interests in the organizations discussed in this report.

This report could not have been completed without the assistance of Pam Buffington, Rebby Carey, Margaret Honey, Michael Maffie, and Ellen Mandinach.

1. Maine has given state assessments for a quarter of a century, but before the NCLB Act tests were not administered to each grade every year. In addition, in the 1990s and the early 2000s the state accountability system included local performance assessments, but in 2007, after about a decade of development, the requirement that each district create a performance assessment was abandoned.

2. The New England Common Assessment Program is the result of collaboration by New Hampshire, Rhode Island, and Vermont. This common assessment has been given in the three states since 2005.

3. Under the NCLB Act a Title I school that fails to make adequate yearly progress for two consecutive years, as measured against its annual measurable objective, is identified as in need of improvement.

4. Principal learning communities bring together principals of schools failing to make adequate yearly progress to share perspectives and reflect on school improvement efforts.

5. The Rhode Island Department of Elementary and Secondary Education created the School Accountability for Learning and Teaching (SALT) survey and process in 1998. The SALT process is a school-centered cycle of inquiry designed to improve school and student performance in the Rhode Island public schools. The survey is administered by the Center for Social Policy and Education at the University of Rhode Island. The university provides reports for every school in the state and offers materials and technical assistance to help schools use the survey data to improve student achievement. (For information about the SALT survey see www.ride.ri.gov/PSI/salt/default.aspx. To learn more about the services available through the University of Rhode Island, see www.ncpe.uri.edu/research/matrix-instruction.asp or www.nksd.net/SALT/2008/overview.pdf.)

6. See www.leadandlearn.com.

7. Each of these components is based on the work of nationally known researchers in education, including Dr. Douglas Reeves, Dr. Michael Schmoker, Dr. Robert Marzano, Dr. Richard Elmore, and Dr. John Simpson (cited in Connecticut State Department of Education 2008).

8. Instructional strategies selected by the data teams must be among the effective teaching strategies identified in a meta-analysis by Marzano, Pickering, and Pollock (2001, cited in Connecticut State Department of Education 2008).

9. Reading First is a grant-funded program held by several of the states in the Northeast and Islands Region: "Through Reading First, states and districts receive support to apply scientifically based reading research—and the proven instructional and assessment tools consistent with this research—to ensure that all children learn to read well by the end of third grade" (Reading First 2008, p. 1). This is another area in which data-driven decisionmaking is used in New York State. More specifically, part of the Reading First program involves teachers using a variety of assessments that identify students at risk of reading difficulty, diagnose their specific weaknesses, and monitor their progress over time. Information obtained through assessments is used formatively throughout the teaching and learning process.

10. New York City has recently developed its own data system, the Achievement Reporting and Innovation System. It is a web-based data management system that collects and analyzes information about student academic performance to help educators and parents make decisions that improve the academic performance of New York City students and schools. See http://schools.nyc.gov/Accountability/SchoolReports/ARIS/default.htm for further information.

11. On April 15, 2008, LaVerne Terry was confirmed as the Commissioner of Education for the Virgin Islands.

12. Because districts lacked funding and capacity, the data warehouse and the relationship with TetraData ended June 30, 2008. At the time of the study, the alliance was researching new vendors to provide data warehousing to districts and was building local databases for districts to meet their needs in the meantime.

13. Although collecting data from common formative assessments has occurred in some districts for years, it has never been a statewide initiative. At the time of the study, Connecticut was piloting a statewide online formative assessment tool.

14. A three-day certification training, the second kind of training offered by CALI, is delivered by the Leadership and Learning Center and is intended for educators who plan to facilitate ongoing professional development at their schools, with the goal of creating a schoolwide culture of data use.

15. Given the timing of this research and the professional development schedule of Measured Progress, the researchers were unable to observe any sessions offered by the company. Because the New England Common Assessment Program is administered in the fall and the goal is to facilitate school and district use of the test reports to guide program improvement, all test interpretation workshops were completed by the time of this study. In Maine the test interpretation workshops are offered at the beginning of the school year, before the research team had identified service providers. To compensate, researchers interviewed the staff at Measured Progress and the Maine Department of Education who provided the training, spoke with a workshop participant, and reviewed materials used to guide the training.

References improving teaching and learning: an assessment-based approach. Urban Education, 35(3), 356–85. Abbott, D.A. (2008). A functionality framework for educa- tional organizations: achieving accountability at scale. City, L., Kagle, M., and Teoh, M. (2005). Examining in- In E.B. Mandinach and M. Honey (Eds.), Data-driven struction. In K.P. Boudette, E.A. City, and R.J. Mur- school improvement: linking data and learning. New nane (Eds.), Data Wise: a step-by-step guide to using York: Teachers College Press. assessment results to improve teaching and learning. Cambridge, MA: Harvard Education. Abelman, C., Elmore, R.F., Even, J., Kenyon, S., and Mar- shall, J. (1999). When accountability knocks, will any- Cognos. (2000, May 30). Massachusetts schools analyze one answer? Philadelphia, PA: Consortium for Policy and predict MCAS student testing results using Cognos Research in Education and the University of Pennsyl- Solution. [News release]. Retrieved April 16, 2008, from vania Graduate School of Education. www.cognos.com/news/releases/2000/rel_315.html.

Barzelay, M. (1992). Breaking through bureaucracy. Berke- Cognos. (n.d.-a). Education perspectives (Cognos Perfor- ley, CA: University of California. mance Perspectives Series). Retrieved April 16, 2008, from www.cognos.com/pdfs/whitepapers/wp_educa- Bernhardt, V. (1998). Data analysis for comprehensive tion-perspectives.pdf. school-wide improvement. Larchmont, NY: Eye on Education. Cognos. (n.d.-b). Making the grade: meeting the mandates of No Child Left Behind. Retrieved April 16, 2008, from Boudett, K.P., City, E.A., and Murnane, R.J. (Eds.). (2005). www.cognos.com/solutions/projects/no-child/. Data Wise: a step-by-step guide to using assessment re- sults to improve instruction. Cambridge, MA: Harvard Connecticut State Department of Education. (2008, Au- Education Press. gust). Connecticut Accountability for Learning Initia- tive: Technical assistance standards. Connecticut State Boudett, K.P., and Steele, J.L. (Eds.). (2007). Data Wise in Department of Education, RESC Alliance. Retrieved action. Stories of schools using data to improve teaching April 8, 2009, from www.sde.ct.gov/sde/cwp/view. and learning. Cambridge, MA: Harvard University. asp?a=2618&q=322294.

Center for Research on Evaluation, Standards and Stu- Copland, M. (2003). Leadership of inquiry: building and dent Testing. (2006). CRESST Assessment Glossary. sustaining capacity for school improvement. Educa- Retrieved from www.cse.ucla.edu/products/glossary. tional Evaluation and Policy Analysis, 25(4), 375–95. html. Ctresc.org. (n.d.). Connecticut Data Warehouse: continuum Chappuis, S., Stiggins, R., Arter, J.A., and Chappuis, J. of services. (2004). Assessment for learning: an action guide for school leaders. Portland, OR: Assessment Training Darling-Hammond, L., LaPointe, M., Meyerson, D., Orr, Institute. M., and Cohen, C. (2007). Preparing school leaders for a changing world: lessons from exemplary leadership Chen, E., Heritage, M., and Lee, J. (2005). Identifying and development programs. Palo Alto, CA: Stanford Educa- monitoring students’ learning needs with technology. tion Leadership Institute. Journal of Education for Students Placed at Risk, 10(3), 309–32. Dembosky, J.W., Pane, J.F., Barney, H. and Christina, R. (2005). Data-driven decision making in southwestern Chen, J., Salahuddin, R., Horsch, P., and Wagner, S.L. Pennsylvania school districts (No. WR-326-HE/GF). (2000). Turning standardized test scores into a tool for Santa Monica, CA: RAND Corporation. References 77

Deming, W.E. (1982). Out of the crisis. Cambridge, MA: Massachusetts Institute of Technology.
Earl, L., Katz, S., and Fullan, M. (2006). Leading schools in a data rich world. San Francisco, CA: Corwin.
Feger, S., and Arruda, E., with the assistance of Pringle, R., and Briggs, D. (2008). Professional learning communities: key themes from the literature. Cambridge, MA: Harvard University.
Feigenbaum, A.V. (1951). Quality control: principles, practice, and administration. New York: McGraw-Hill.
Feldman, J., and Tung, R. (2001, April). Whole school reform: how schools use the data-based inquiry and decision making process. Paper presented at the annual meeting of the American Educational Research Association, Seattle, WA.
Finnigan, K., and O'Day, J.A., with Wakelyn, D. (2003). External support to schools on probation: getting a leg up? Madison, WI: Consortium for Policy Research in Education and the University of Wisconsin-Madison.
Firestone, W.A., and González, R.A. (2007). Culture and processes affecting data use in school districts. Yearbook of the National Society for the Study of Education, 106(1), 132–54.
Foley, E., Mishook, J., Thompson, J., Kubiak, M., Supovitz, J., and Rhude-Faust, M. (2008). Beyond test scores: leading indicators for education. Providence, RI: Annenberg Institute.
Gallagher, L., Means, B., and Padilla, C. (2007). Teachers' use of student data systems to improve instruction. U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service. Retrieved April 25, 2008, from http://ctl.sri.com/publications/downloads/TeacherDataUseBrief09_10_07.pdf.
Halverson, R., Grigg, J., Prichett, R., and Thomas, C. (2007). The new instructional leadership: creating data-driven instructional systems in schools. Madison, WI: University of Wisconsin-Madison, School of Education, Wisconsin Center for Education Research.
Hassler, L.B., Buck, J.A., and Torgesen, J.K. (2004, April). Preparation for high-stakes reading assessments: the scope and nature of test preparation activities. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.
Hergert, L.F., Gleason, S.C., and Urbano, C., with North, C. (2009). How eight state education agencies in the Northeast and Islands Region identify and support low-performing schools and districts (Issues & Answers Report, REL 2009–No. 068). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast and Islands. Retrieved from http://ies.ed.gov/ncee/edlabs.
Ikemoto, G.S., and Marsh, J.A. (2007). Cutting through the "data-driven" mantra: different conceptions of data-driven decision making. Yearbook of the National Society for the Study of Education, 106(1), 105–31.
Jacobs, H.H. (Ed.). (1989). Interdisciplinary curriculum: design and implementation. Alexandria, VA: Association for Supervision & Curriculum Development.
Jacobs, H.H. (Ed.). (1997). Mapping the big picture: integrating curriculum and assessment K–12. Alexandria, VA: Association for Supervision & Curriculum Development.
Jacobs, H.H. (Ed.). (2004). Getting results with curriculum mapping. Alexandria, VA: Association for Supervision & Curriculum Development.
Kallick, B., and Colosimo, J. (2009). Using curriculum mapping and assessment data to improve learning. Thousand Oaks, CA: Corwin Press.
Kallick, B., and Costa, A.L. (Eds.). (1995). Assessment in the learning organization: shifting the paradigm. Alexandria, VA: Association for Supervision & Curriculum Development.
Kallick, B., and Wilson, J.M. (Eds.). (2001). Information technology in schools: creating practical knowledge to improve student performance. San Francisco, CA: Jossey-Bass, Inc.
Kerr, K.A., Marsh, J.A., Ikemoto, G.S., Darilek, H., and Barney, H. (2006). Strategies to promote data use for instructional improvement: actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.
Knapp, M.S., Copland, M.A., and Swinnerton, J.A. (2007). Understanding the promise and dynamics of data-informed leadership. Yearbook of the National Society for the Study of Education, 106(1), 74–104.
Lachat, M.A., and Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333–49.
Laird, E. (2006). Data use drives school and district improvement. Austin, TX: The Data Quality Campaign.
Lang, L., Goldwyn, S., Schatschneider, C., Roehrig-Boce, A., and Johnson, C. (2007, April). Using assessment data to guide school and classroom decision making: an examination of the effects on student achievement. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Larocque, M. (2007). Closing the achievement gap: the experience of a middle school. The Clearing House (March/April), 157–61.
Light, D., Honey, M., Heinze, J., Brunner, C., Wexler, D., Mandinach, E., and Fasca, C. (2005). Linking data and learning: the Grow Network study. Summary report. New York: Education Development Center, Center for Children and Technology.
Love, N. (2002). Using data/getting results: a practical guide for school improvement in mathematics and science. Norwood, MA: Christopher Gordon.
Love, N. (2004). Taking data to new depths: there's a ton of data being collected. The trick is to know how to use it effectively. Journal of Staff Development, 25(4), 22–26.
Love, N., Stiles, K., Mundry, S., and DiRanna, K. (2008). The data coach's guide to improving learning for all students. San Francisco, CA: Corwin.
Maine Education Policy Research Institute. (2008). The condition of K–12 public education in Maine 2008. Gorham, ME: University of Southern Maine. Retrieved April 25, 2008, from http://usm.maine.edu/cepare/Reports/Conditions2008.pdf.
Mandinach, E.B. (2008). Data for compliance v. data for instructional improvement. Paper presented at a meeting of the Council of Chief State School Officers, Milwaukee, WI.
Mandinach, E.B., and Honey, M. (2008). Data-driven school improvement: linking data and learning. New York: Teachers College Press.
Mandinach, E.B., Honey, M., and Light, D. (2006, April). A theoretical framework for data-driven decisionmaking. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
Marsh, J., Kerr, K., Ikemoto, G., Darilek, H., Suttorp, M.J., Zimmer, R., and Barney, H. (2005). The role of districts in fostering instructional improvement: lessons from three urban districts partnered with the Institute for Learning (No. MG-361-WFHF). Santa Monica, CA: RAND Corporation.
Marsh, J.A., Pane, J.F., and Hamilton, L.S. (2006). Making sense of data-driven decisionmaking in education: evidence from recent RAND research. Santa Monica, CA: RAND Corporation.
Marzano, R.J., Pickering, D., and McTighe, J. (1993). Assessing outcomes: performance assessment using the dimensions of learning model. Alexandria, VA: Association for Supervision and Curriculum Development.
Mason, S. (2002). Turning data into knowledge: lessons from six Milwaukee public schools. Madison, WI: Wisconsin Center for Education Research.
Massachusetts Department of Elementary and Secondary Education. (n.d.). Educational Data Warehouse: frequently asked questions. Retrieved April 16, 2008, from www.doe.mass.edu/infoservices/dw/faq.html.
Massell, D. (2001). The theory and practice of using data to build capacity: state and local strategies and their effects. In S.H. Fuhrman (Ed.), From the capitol to the classroom: standards-based reform in the states. Chicago, IL: University of Chicago.

McTighe, J., and Wiggins, G. (1999). Understanding by design handbook. Alexandria, VA: Association for Supervision and Curriculum Development.
McTighe, J., and Wiggins, G. (2004). Understanding by design professional development workbook. Alexandria, VA: Association for Supervision and Curriculum Development.
Mills, L. (2008). Getting started with data warehousing [Electronic version]. Schools CIO. Retrieved August 22, 2008, from www.schoolcio.com/ShowArticle/1048.
Moody, L., Russo, M., and Casey, J.S. (2005). Acting and assessing. In K.P. Boudett, E.A. City, and R.J. Murnane (Eds.), Data Wise: a step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Moss, P.A., and Piety, P.J. (2007). Introduction: evidence and decision making. Yearbook of the National Society for the Study of Education, 106(1), 1–14.
Murnane, R.J., Sharkey, N.S., and Boudett, K.P. (2005). Using student-assessment results to improve instruction: lessons from a workshop. Journal of Education for Students Placed at Risk, 10(3), 269–80.
New Hampshire Department of Education. (2006). A guide to how data improves student achievement. Retrieved May 2, 2008, from www.ed.state.nh.us/education/News/DataInitiativeFactSheet.pdf.
New York Data Analysis Technical Assistance Group. (2008). Mission statement. Retrieved April 25, 2008, from www.datag.org.
New York State Board of Regents. (2006). P–16 education: a plan for action. Albany, NY: New York State Education Department. Retrieved August 22, 2008, from http://usny.nysed.gov/summit/p-16ed.pdf.
Price, J., and Koretz, D. (2005). Building assessment literacy. In K.P. Boudett, E.A. City, and R.J. Murnane (Eds.), Data Wise: a step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Reading First. (2008). Program at a glance. Retrieved January 15, 2009, from www.ed.gov/programs/readingfirst/gtepreadingfirst.pdf.
Reeves, D.B. (2000). Accountability in action: a blueprint for learning organizations. Denver, CO: Advanced Learning Press.
Reeves, D.B. (2001). Holistic accountability: serving students, schools, and community. Thousand Oaks, CA: Corwin Press.
Reichardt, R. (2000). The state's role in supporting data-driven decision making: lessons from Wyoming. Aurora, CO: Mid-Continent Research for Education and Learning.
Senge, P.M. (1990). The fifth discipline: the art and practice of the learning organization. London: Random House.
Stiggins, R., Arter, J.A., Chappuis, J., and Chappuis, S. (2004). Classroom assessment for student learning: doing it right-using it well. Portland, OR: Assessment Training Institute.
Storandt, B.C., Nicholls, C., and Yusaitis, J. (2007, April). Understanding how districts and schools use technology to link student performance data with standards-based instruction. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Supovitz, J.A., and Klein, V. (2003). Mapping a course for improved student learning: how innovative schools systematically use student performance data to guide improvement. Philadelphia, PA: Consortium for Policy Research in Education and University of Pennsylvania Graduate School of Education.
Thorn, C., Meyer, R.H., and Gamoran, A. (2007). Evidence and decision making in education systems. Yearbook of the National Society for the Study of Education, 106(1), 340–61.
U.S. Department of Education, National Center for Education Statistics, Common Core of Data. (2007). "Elementary and Secondary Education Characteristics, New York." Retrieved April 20, 2008, from http://nces.ed.gov/programs/stateprofiles/sresult.asp?mode=short&s1=36.
Vermont Data Consortium. (2008). Mission statement. Retrieved April 20, 2008, from http://vermontdata.org.
Vermont Department of Education. (2008). Act 60: the Equal Educational Opportunity Act. Retrieved April 8, 2008, from http://education.vermont.gov/new/html/laws/act60.html.
Wayman, J.C. (2005). Involving teachers in data-driven decision making: using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295–308.
Wayman, J.C., Stringfield, S., and Yakimowski, M. (2004). Software enabling school improvement through analysis of student data. Baltimore, MD: Center for Research on the Education of Students Placed at Risk.
Wiggins, G. (1998). Educative assessment: designing assessments to inform and improve student performance. San Francisco, CA: Jossey-Bass, Inc.
Wiggins, G., and McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
Young, V.M. (2006). Teachers' use of data: loose coupling, agenda setting, and team norms. American Journal of Education, 112(4), 521–48.