RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA (REEP-A) Year Three Annual Report

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Prepared for

USAID

Prepared by

Dexis Consulting Group

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024

October 2019

ACRONYMS

AFR/SD/ED Africa Bureau’s Office of Sustainable Development Education Team

AREW Africa Regional Education Workshop

CFA Confirmatory Factor Analyses

DERP Data for Education Research and Programming in Africa

EGRA Early Grade Reading Assessment

HDAK Huguka Dukore Akazi Kanoze

LOI Language of Instruction

RDCS Regional Development Cooperation Strategy

REEP-A Research for Effective Education Programming – Africa

RTI Research Triangle Institute

SEL Social and Emotional Learning

SRGBV School-Related Gender-Based Violence

TLLA Teacher Language and Literacy Assessment

TO Task Order

TOCOR Task Order Contract Officer Representative

USAID United States Agency for International Development


CONTENTS

EXECUTIVE SUMMARY
OVERVIEW OF THE REEP-A PROGRAM
BACKGROUND
YEAR THREE ACTIVITIES
PROGRAMMING AND PROJECT SUPPORT
TECHNICAL ACTIVITIES
PROGRESS TOWARD OBJECTIVES (BY RESULT)
GENERAL TECHNICAL ACTIVITIES
RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET
RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED
RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED
CHALLENGES AND LESSONS LEARNED
OTHER DELIVERABLES
PROJECT MANAGEMENT DELIVERABLES
YEAR THREE DELIVERABLES STATUS
ANNEX OF YEAR THREE DELIVERABLES

EXECUTIVE SUMMARY

The Research for Effective Education Programming – Africa (REEP-A) Task Order (TO), a contract between the U.S. Agency for International Development (USAID) and Dexis Consulting Group (Dexis), was awarded on September 29, 2016 for a period of five years, ending on September 28, 2021.

The main objective of REEP-A is to generate and effectively disseminate Africa regional and country-specific education data, analysis, and research to ensure that evidence on effective interventions is available to inform the prioritization of needs and education investment decisions. REEP-A activities support USAID’s 2011 Education Strategy, which set targets for 2015 for early grade reading, workforce development, and education in crisis and conflict-affected environments.

To meet these goals, REEP-A activities are designed to emphasize the following three Results:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

Year 3 of REEP-A focused heavily on Result 2 and Result 3 activities, with the majority of tasks allocated to planning and starting various research activities.

Under Result 1, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, translating the comprehensive education data brief that provided statistics and data points on the global prevalence of School-Related Gender-Based Violence (SRGBV) into French for wider usage and dissemination.

Planning and research for two activities under Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7) continued in Year 3 as well. The activities consisted of supporting the Rwanda Mission through the implementation of performance evaluations for two of its existing education projects, Huguka Dukore Akazi Kanoze (HDAK) and Soma Umenye. For the HDAK Performance Evaluation, fieldwork was conducted, and the final report was developed and submitted during the year. For the Soma Umenye Performance Evaluation, fieldwork occurred in Year 3, as did initial analysis and report writing.

Research under Result 2 was largely devoted to the Language of Instruction (LOI) work stream (Requirements 2.31-2.36), consisting of multiple research activities that explore teacher-focused language factors affecting early grade reading. A draft of the framework document (Requirement 2.31) was submitted on February 4, 2019 for further comments from the Education Office of the Economic Growth, Education and Environment Bureau. Work has continued on the report to incorporate revisions, as well as graphic design development, to produce a high-quality, polished report for wide dissemination; an updated draft was submitted on August 13, 2019 for review. Additionally, under Result 2, Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, Dexis began drafting up to 19 country profiles for sub-Saharan African countries currently implementing Goal 1 reading programs.

Requirement 2.33, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance, continued during Year 3. The initial findings from the inclusion of the teacher questions in the Early Grade Reading Assessment (EGRA) in Uganda were compiled in an initial report and submitted on June 24, 2019 for review. Under Requirement 2.34, Case Study on LOI Implementation and Language Attitudes, a draft of the report was submitted to USAID on August 30, 2019; it included proposed revisions to the LOI-specific questions in the teacher interview questionnaire, with the aim of improving the reliability and content of the data collected through this instrument. Additionally, work progressed under Requirement 2.35, Teacher Language and Literacy Assessment (TLLA), which involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. The final research plan and instruments were submitted on July 8, 2019 and were approved on July 12, 2019.

The research priorities for Result 3 in Year 3 were the continuation of work towards the Delivery of the SRGBV Measurement Framework (Requirement 3.1) and towards Requirement 3.2, Update Status of Early Grade Reading Barometer. The development of the SRGBV Measurement Framework will yield accurate and reliable data on SRGBV that is not currently available through other research tools. Many steps were taken toward this during Year 3, including adjusting the survey instruments, piloting the instruments in Malawi, and conceptualizing and developing the toolkit. Additional discussions and revisions continued throughout Year 3 to address the remaining survey items and toolkit improvements. Secondly, work towards Requirement 3.2, Update Status of Early Grade Reading Barometer, progressed during Year 3 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili.

This Annual Report covers FY2019, the period from September 30, 2018 through September 30, 2019. REEP-A activities during this period included project administration, planning, and technical activities across all Result and activity areas. The annual financial report ending on September 30, 2019 is attached. The REEP-A Program Manager is Stephanie Squires and the Task Order Contract Officer Representative (TOCOR) is Megnote Lezhnev.

OVERVIEW OF THE REEP-A PROGRAM

BACKGROUND

The overarching aim of the Research for Effective Education Programming – Africa (REEP-A) project is to provide the U.S. Agency for International Development (USAID) Africa Bureau, overseas Missions, and partner organizations with concrete research contributions on USAID education initiatives to inform evidence-based investment, decision-making, and the prioritization of needs. REEP-A will contribute to USAID’s education aims in Africa by providing technical and advisory services focused on research and capacity building, intended to enhance the quality and effectiveness of USAID activities. REEP-A will provide practical, action-oriented, gender-sensitive research that will be applied to both project design and implementation. REEP-A aims to strengthen the relationships and links between project evaluation, design, and implementation. In addition, REEP-A will contribute to the capacity development of education personnel and partners by enhancing the use of research in decision-making, mainstreaming the application of feedback loops and lessons learned into project design and implementation, and expanding the evidence and knowledge base on education initiatives.

USAID’s Education Strategy, issued in 2011, has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals, in particular for the measurement of access to education and early grade reading improvements. The education goals set targets for 2015, which centered on early grade reading, workforce development, and education in crisis and conflict-affected environments. REEP-A will specifically focus on improvement in early grade reading and the measurement of equitable access to education in crisis and conflict environments (Goals 1 and 3):

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

Following the release of the Strategy, the Education Team of the USAID Africa Bureau’s Office of Sustainable Development (AFR/SD/ED) identified the need for targeted research and training services to complement the efforts of Africa Missions to effectively align their education portfolios with the three goal areas of the Strategy. The Data for Education Research and Programming in Africa project (DERP), implemented between 2011 and 2016, was established to respond to these identified needs, and provided AFR/SD/ED with timely research and tools that Missions could apply to their ongoing and future education projects. REEP-A will expand on the progress and evidence base established by DERP and will target identified gaps and challenges.

REEP-A seeks to contribute evidence that responds to the question of why, despite some gains, progress toward enhanced early grade reading and access to quality education in conflict and crisis settings in Africa has been modest. Expanding the evidence base and data collection for education interventions in Africa is critical to spurring further progress, increasing the understanding of program effectiveness, identifying barriers, and replicating successes.

YEAR THREE ACTIVITIES

PROGRAMMING AND PROJECT SUPPORT

Building on the efforts made in Year 2 of REEP-A, Year 3 focused on the preparation and implementation of technical research for a range of activities across all three Result areas.

Under Result 1, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, for Year 3 by completing a translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) into French. The English version of the document was submitted in Year 2, and it was determined that for wider usage and dissemination it would be valuable to translate the document to French. The finalized translation was completed for submission on February 4, 2019 and approved on March 7, 2019.

Planning and research for two activities under the requirement Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7) continued throughout Year 3. The activities consist of supporting the Rwanda Mission through the implementation of performance evaluations for two of its existing education projects, HDAK and Soma Umenye. The final version of the Huguka Dukore Akazi Kanoze (HDAK) Performance Evaluation report was submitted June 3, 2019. However, comments still remained, and the report was revised and resubmitted for final approval on September 26, 2019.

The initial research plan for the Performance Evaluation of the flagship early grade reading program in Rwanda, Soma Umenye, was submitted and approved on November 29, 2018. During Quarter 2, conversations occurred with the Mission to update the research plan following adjustments made to program implementation. Following the onboarding of the team lead, the final research plan was submitted and approved on July 30, 2019. Pre-field planning, including onboarding of all technical consultants, occurred in June 2019. The field data collection occurred from August 5 to 21, 2019, with data analysis and report writing carrying over into Year 4.

Under Result 2, Requirement 2.20, following the conclusion of the Africa Regional Education Workshop (AREW) in May 2018, a comprehensive report outlining the planning, logistics, and lessons learned from AREW was requested. The AREW Report was submitted on October 20, 2018, and approved October 22, 2018 to provide a summary of the events and lessons learned to be used for future planning.

Additionally, the Language of Instruction (LOI) work stream continued to constitute a major area of work. In order to meet the objectives of the first activity in this research stream, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes (KSA) related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), a draft of the report was submitted on February 4, 2019. An updated draft, addressing the remaining comments received in Quarter 3, including further comments from the Education Office of the Economic Growth, Education and Environment Bureau, was submitted on August 13, 2019 for review.

Furthermore, work continued on Result 2, Requirement 2.32, Develop Relevant Country-Specific Profiles of Language of Instruction (LOI) Policies, Reading Program Approaches to LOI, and Student Reading Outcomes. This work includes the continuation of data collection across the 19 countries for inclusion in the country profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion and initial feedback. Three draft country profiles (South Sudan, Rwanda, and South Africa) were submitted on June 28, 2019 for further comments and to determine the appropriate document structure for the remainder of the country profiles. Feedback has been received for two of the three draft country profiles. Work continues on completing the country profiles, including gathering missing information and designing the layout and structure of the profiles.

Additional work under Result 2, Requirement 2.33, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance continued during Year 3. Following the inclusion of the questions into an EGRA in Uganda in mid-October 2018, analysis began once the data was received. During Quarter 1 the analysis began by comparing teachers’ responses with data on student reading outcomes. The preliminary findings and proposed revisions were summarized in an initial report and submitted on June 24, 2019 for review.

Requirement 2.35, Teacher Language and Literacy Assessment (TLLA), involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. An initial research plan was developed to map the way forward for this activity. This research plan was submitted on January 18, 2019, with comments received on February 4, 2019. Throughout Quarters 2 and 3, USAID provided comments and feedback on the instruments and research plan, suggesting a change in the proposed structure of the tools in the research plan. The suggested changes were incorporated, and both the research plan and instruments were resubmitted on July 8, 2019 and approved on July 12, 2019.

Under Result 3, the continuation of work on the SRGBV Toolkit (Result 3, Requirement 3.1) remained a primary focus for Year 3. Efforts consisted of analysis of the data from the pilot fieldwork in Malawi and progress toward the Toolkit’s development. The draft Malawi Pilot Report was submitted to USAID for review and comments on September 28, 2018. Throughout Quarter 2 and Quarter 3, conversations among Dexis, RTI, and USAID focused on USAID’s requests for clarification, additional information on the recommendations made regarding the survey content, and more details on the reliability and validity analysis conducted on all SRGBV instruments. Following these discussions, it was determined that additional scenario planning documentation was needed in order to identify the best path for any further instrument revisions and re-piloting.

Building on the additional analysis and clarification of the results from the Malawi pilot, USAID made additional requests for further information to determine next steps. The first item was a scenario document outlining possible next steps for the SRGBV instruments, which was submitted for discussion on April 30, 2019.

The second item was the results of the Confirmatory Factor Analyses (CFA) on the Gender Attitudes and Beliefs instrument for students, teachers, and parents, and was submitted on May 24, 2019. The third item was the results of the CFA on the Perceptions of School Climate instrument for students, teachers, and parents and was submitted on June 19, 2019. Following the CFA results submissions, USAID provided feedback on June 26, 2019, which was discussed in a call with Dexis, RTI, and USAID on June 27, 2019.

A draft of the SRGBV Measurement Toolkit (Result 3, Requirement 3.1) was submitted in early 2019 for USAID review. On May 20, 2019, USAID provided written feedback on the draft version of the SRGBV Toolkit, which was discussed in a phone call on May 22, 2019. The feedback focused on revising the Toolkit to ensure that the overall framing of the Toolkit more clearly links the survey instruments and the conceptual framework; improving the physical and electronic navigation of the Toolkit; and revising content around the data analysis. Additionally, discussion on the merits of including the Social and Emotional Learning (SEL) instrument continued, as this particular instrument was not included in the Malawi pilot, but has been piloted separately in Uganda. Additional discussions and revisions continued throughout Quarter 3 to address the continuing challenges of producing a high-quality deliverable that is user-friendly.

Work towards Requirement 3.2, Update Status of Early Grade Reading Barometer, has progressed during Year 3 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili. The data incorporation is anticipated to be completed in Quarter 1 of Year 4. During Quarter 4, USAID began discussions around revisiting the usefulness of the Early Grade Reading Barometer. Further discussions will be necessary to determine whether proceeding with the Early Grade Reading Barometer activity is beneficial.

Furthermore, overall program management remains a significant portion of the Task Order (TO) in order to ensure effective support of the technical activities within REEP-A. Administrative support and facilitation of discussions between the technical and management teams at Dexis, RTI, and USAID have occurred throughout Year 3 and will continue to be a major management focus throughout the project’s duration.

TECHNICAL ACTIVITIES

REEP-A supports AFR/SD’s Regional Development Cooperation Strategy (RDCS) through the generation of regional, country-specific, and gender-focused data, analysis, and research that will assist in identifying and prioritizing needs, promoting rigorous research to inform evidence-based decisions, and promoting gender-equitable education investments within Africa.

To achieve these aims, technical assistance and advisory services under REEP-A fall across the three Results articulated in the Education Analytic Framework:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This distribution of activities across the three Results is reflected throughout the five years of the TO. Year 3 focused on continuing to provide strategic research across the three Results, building on the foundational research and analysis that occurred over the first two years. Year 3 remained focused on implementing and continuing research begun thus far, as well as initiating new research activities. In discussions with USAID, including during quarterly meetings, future research topics will also be refined to fulfill requirements not yet determined and to ensure relevance and alignment with emerging areas of interest being pursued under the TO. Research activities in Year 3 and subsequent years will prioritize and integrate gender-focused approaches, as was done throughout Year 2.

In close consultation with AFR/SD/ED, all research activities, including technical plans, will utilize gender-sensitive methodologies and evaluation questions to improve understanding of the barriers to improving gender equity in education and learning. Recognizing the importance of gender considerations as a key component in achieving development aims, research activities within REEP-A will draw upon available sex-disaggregated data and other gender-sensitive tools to clearly articulate existing gender disparities and identify knowledge gaps. This research will provide a thorough analysis of gender issues at both the individual and institutional levels, including the interplay of contextual and political factors. In addition to research approaches, a gender lens will be adopted in the other analytic and capacity building services under REEP-A. This gender-sensitive approach includes the engagement of men and boys in conversations around issues of gender equity, which is central to achieving equality both in education and the workforce. The identification of both gender-based disparities and continued barriers to equity will be driving aims of the research and recommendations, and will shape research questions and topics throughout the TO.

PROGRESS TOWARD OBJECTIVES (BY RESULT)

GENERAL TECHNICAL ACTIVITIES

For Result 1, in Quarter 1 of Year 3, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, by starting the process of onboarding a translation services company to translate the Education Data Brief from English to French. This brief is a comprehensive document that provides the most recent statistics and data points related to SRGBV (Result 1, Requirement 1.2). The English version of the document was submitted in Year 2, and it was determined that for wider usage and dissemination it would be valuable to translate the document into French. Under Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7), the evaluation team began the fieldwork in Rwanda towards the end of Quarter 1 for the HDAK Performance Evaluation (Requirement 1.3). Additionally, the initial research plan for the Soma Umenye Performance Evaluation (Requirement 1.4) was submitted and approved on November 29, 2018.

In Quarter 2, under Requirement 1.2, Deliver Education Data Briefs, the translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) into French was completed. The finalized translation was both submitted and approved in Quarter 2. During Quarter 2, under the requirement Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7), the HDAK evaluation team worked collaboratively with the Rwanda Mission to share initial findings and analysis following the in-country work. The draft of the evaluation report was submitted on February 19, 2019. Secondly, conversations were initiated with the Mission to update the Soma Umenye Research Plan (Requirement 1.4) in response to adjustments made to program implementation.

In Quarter 3, under Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7), the final version of the HDAK Performance Evaluation report was submitted. Additionally, following the onboarding of the team lead for the Soma Umenye Performance Evaluation, the final research plan was submitted and approved in Quarter 3. Pre-field planning, as well as the onboarding of technical consultants, also occurred.

In Quarter 4, under Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7), the HDAK Report received comments from USAID. The report was revised and resubmitted for final approval on September 26, 2019. The field data collection, data analysis, and report writing for the Soma Umenye Performance Evaluation (Requirement 1.4) also occurred in Quarter 4.

For Result 2, in Quarter 1, the final AREW report was submitted following final guidance from the Task Order Contract Officer Representative (TOCOR) on this deliverable. The final two versions of the AREW Report (Result 2, Requirement 2.20) were submitted on October 20, 2018, and approved October 22, 2018. Additionally, work under the LOI work stream continued, with a draft of the report under Requirement 2.31, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes (KSA) Related to Language and Literacy that Influence Early Grade Literacy Outcomes, submitted on October 9, 2018. In Quarter 1, a graphic designer was also identified to develop the overall design of the report to ensure a high-quality, polished final deliverable. Under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, Dexis submitted a research plan on November 12, 2018 outlining the information to be included in the profiles. This research plan was approved on December 7, 2018. Technical work continued in Quarter 1 through the collection of the necessary information and the development of the structure of the profiles.

In Quarter 2, a draft of the framework document (Requirement 2.31) was submitted on February 4, 2019 for further comments from the Education Office of the Economic Growth, Education and Environment Bureau. Also in Quarter 2, a draft country profile (Requirement 2.32) was shared with USAID on March 5, 2019 as an example to allow for discussion and initial feedback. During this quarter, under Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), the analysis began by comparing teachers’ responses with data on student reading outcomes. This analysis, as well as other findings, illustrated the need to adjust the questions for the next EGRA inclusion. Under Requirement 2.35, Teacher Language and Literacy Assessment, an initial research plan was developed to map the way forward for this activity. This research plan was submitted in Quarter 2, on January 18, 2019, with comments received on February 4, 2019. A discussion on February 7, 2019 allowed Dexis, RTI, and USAID to walk through the plan. Following this call, revisions to the research plan began, and the plan was resubmitted on February 25, 2019. Work also continued on further development of the instruments. Final comments were received on March 27, 2019.

In Quarter 3, work continued on the revisions and design of the framework document (Requirement 2.31), following comments received from the Education Office of the Economic Growth, Education and Environment Bureau. Under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, three draft country profiles (South Sudan, Rwanda, and South Africa) were submitted on June 28, 2019. Additional work occurred as well under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33). The preliminary findings from the inclusion of the teacher questions in the EGRA in Uganda were included in an initial report to share with USAID. Additionally, a research plan was submitted on June 24, 2019 for review. Under Requirement 2.35, Teacher Language and Literacy Assessment, following the comments received from USAID on March 27, 2019, the final research plan was submitted on June 20, 2019.

In Quarter 4, work continued on the framework document (Requirement 2.31) through revisions and design work to produce a high-quality, polished report for wide dissemination. An updated draft was submitted on August 13, 2019 for review. Feedback was received for two of the three draft country profiles under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes; however, further discussion and feedback on all three profiles is necessary to determine the appropriate structure for the remainder of the country profiles. In Quarter 4, under Requirement 2.34, Case Study on LOI Implementation and Language Attitudes, a draft of the report was submitted to USAID on August 30, 2019 for initial comments. Under Requirement 2.35, Teacher Language and Literacy Assessment, the suggested changes were incorporated, and both the research plan and instruments were resubmitted on July 8, 2019 and approved on July 12, 2019. Additionally, in Quarter 4, Mission concurrence for the TLLA activity in Uganda was received on August 13, 2019.

For Result 3, under Delivery of the SRGBV Measurement Framework (Requirement 3.1), the data cleaning, processing, analysis, and report writing for the SRGBV instrument pilot carried into Quarter 1 of Year 3. The draft Malawi Pilot Report was submitted on September 28, 2018. Following this submission, discussions occurred around some of the issues flagged in the report, leading to additional analyses and discussions around how to correct these and proceed with next steps. Initial sections of the Toolkit were shared with USAID during this quarter to select a design style for the Toolkit.

Work on Requirement 3.2 included submission of an Early Grade Reading Barometer Assessment Report on October 31, 2018, and its approval on November 13, 2018. The assessment report aimed to determine whether the current Early Grade Reading Barometer platform, an online tool hosted by USAID’s Asia Bureau, provided the appropriate venue to incorporate additional datasets. Findings determined that the necessary platform for the Barometer, managed by the Asia Bureau, is well-operated, and thus it is recommended that the Africa Bureau utilize the existing platform to incorporate datasets. The assessment report allowed for taking the next steps toward fulfilling the remaining Sub-Requirement 3.2.3, Incorporate African Mission Early Grade Reading Data into the Early Grade Reading Barometer.

In Quarter 2, under Delivery of the SRGBV Measurement Framework (Requirement 3.1), a document summarizing the history of decisions and additional analyses conducted was requested by USAID and produced to serve as a reference point for the path forward. The summary document was submitted on February 28, 2019. Further discussions stemming from this document indicated that further analysis, as well as a document articulating options for next steps, was needed. Work also continued on the design and layout of the Toolkit during this quarter. Efforts continued on a modification with the subcontractor regarding Sub-Requirement 3.2.3, Incorporate African Mission Early Grade Reading Data into the Early Grade Reading Barometer.

In Quarter 3, following the additional analysis and clarification of the results from the Malawi pilot, three deliverables were added to the SRGBV work stream (Requirement 3.1). The first item consisted of a scenario document outlining possible next steps for the SRGBV instruments, which was submitted on April 30, 2019. The second item consisted of the results of the Confirmatory Factor Analyses (CFA) on the Gender Attitudes and Beliefs instrument for students, teachers, and parents, and was submitted on May 24, 2019. The third item consisted of the results of the CFA on the Perceptions of School Climate instrument for students, teachers, and parents, and was submitted on June 19, 2019. Following the CFA results submissions, USAID provided feedback on June 26, 2019, which was discussed on a call with Dexis, RTI, and USAID on June 27, 2019. On May 20, 2019, USAID provided a written response to the draft version of the SRGBV Toolkit, which was discussed on a phone call on May 22, 2019. The feedback focused on revising the Toolkit to ensure that the overall framing of the Toolkit more clearly links the survey instruments and the conceptual framework; improving the physical and electronic navigation of the Toolkit; and revising content around data analysis. Additionally, discussion around the merits of including the Social and Emotional Learning (SEL) instrument continued, as this particular instrument was not included in the Malawi pilot but has been piloted separately in Uganda. Work towards Requirement 3.2, Update Status of Early Grade Reading Barometer, progressed during Quarter 3 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili.

In Quarter 4, discussions and revisions regarding the SRGBV Toolkit (Result 3, Requirement 3.1) continued to address the continuing challenges of producing a high-quality deliverable that is user-friendly. Work towards Requirement 3.2, Update Status of Early Grade Reading Barometer, progressed during Quarter 4 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili. However, during Quarter 4, USAID began discussions regarding revisiting the usefulness of the Early Grade Reading Barometer. Further discussions will be necessary to determine whether proceeding with the Early Grade Reading Barometer activity is beneficial.

RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET

The aim of Result 1 is to provide Missions with data and research support that meets needs and facilitates understanding of the status of education in specific countries. Result 1 includes three main activity categories, spread across seven requirements:

• Deliver Country and Regional Education Data Trend Snap Shots (Requirement 1.1)
• Deliver Education Data Briefs (Requirement 1.2)
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3-1.7)

The smallest distribution of REEP-A activities in Year 3, as in the wider TO, falls within Result 1. Year 3 activities under Result 1 included Education Data Briefs (Requirement 1.2) and Mission Support for Evaluation, Assessment and Data on Selected Priorities (Requirements 1.3 and 1.4).

Under Result 1, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, for Year 3 by completing a translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) into French. This brief is a comprehensive document that provides the most recent statistics and data points related to SRGBV (Result 1, Requirement 1.2). The English version of the document was submitted in Year 2, and it was determined that for wider usage and dissemination it would be valuable to translate the document into French. The finalized translation was completed for submission on February 4, 2019 and approved on March 7, 2019.

Planning and research for two activities under the requirement Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7) continued throughout Year 3. The activities consist of supporting the Rwanda Mission through the implementation of performance evaluations for two of its existing education projects. The first, Huguka Dukore Akazi Kanoze (HDAK), is the Rwanda Mission's flagship workforce development and youth activity (12/2016-12/2021); it seeks to increase stable employment opportunities, including self-employment, for vulnerable youth, to improve youth training and employment systems, and to increase investment in skills for vulnerable youth. The utility of a performance evaluation at this stage was to ensure that the program is effectively investing in youth to achieve the stated objectives. The HDAK Performance Evaluation was implemented in Rwanda at the end of Quarter 1. The evaluation team worked collaboratively with the Rwanda Mission to share initial findings and analysis following the in-country work. The draft of the evaluation report was submitted on February 19, 2019. The final version of the HDAK Performance Evaluation report was submitted June 3, 2019; however, comments still remained. The report was revised and resubmitted for final approval on September 26, 2019.

The second activity, Soma Umenye, is the Rwanda Mission’s flagship reading activity (7/2016-7/2021) and aims to improve the reading skills of one million public school learners nationwide in grades 1-3. The performance evaluation was determined to be an important tool to assess and learn from prior years and to guide appropriate adjustments as the program progresses. The initial research plan for the Soma Umenye Performance Evaluation was submitted and approved on November 29, 2018. During Quarter 2, conversations occurred with the Mission to update the research plan following adjustments made to program implementation. Following the onboarding of the team lead, the final research plan was submitted and approved on July 30, 2019. Pre-field planning, including onboarding of all technical consultants, occurred in June 2019. The field data collection occurred from August 5 to 21, 2019, with data analysis and report writing carrying over into Year 4.

Result 1 Next Steps:

• Additional Country and Regional Education Data Trends Snap Shots (Requirement 1.1) are allocated to occur in Year 4, and can be conducted at the request of AFR/SD/ED.
• Additional Education Data Briefs (Requirement 1.2) are allocated to occur in Year 4.
• Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3 and 1.4):
◦ Data analysis and report writing for the Soma Umenye evaluation (Requirement 1.4).
◦ Submission of the report for the Soma Umenye evaluation (Requirement 1.4).

RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED

The majority of REEP-A research activities for Year 3 contributed to Result 2, as will the bulk of research activities throughout the TO.

The objectives of Result 2 are to identify or generate knowledge about evidence-based education programming related to early grade reading and access to education, and to ensure that developed knowledge and research products are available to USAID education personnel through cost-effective dissemination methods, as well as to provide trainings and workshops.

Requirements for Result 2 throughout the TO primarily consist of desk research, primary research, capacity building, and ad hoc reports. Result 2 activities that continued in Year 3 include:

• Deliver Workshops (Requirements 2.20-2.30)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31)
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32)
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33)
• Case Study on LOI Implementation and Language Attitudes (Requirement 2.34)
• Development of the Teacher Language and Literacy Assessment (Requirement 2.35)

The final AREW report was submitted in Quarter 1 of Year 3 following final guidance from the TOCOR on this deliverable. The final two versions of the AREW Report (Result 2, Requirement 2.20) were submitted on October 20, 2018, and approved October 22, 2018. No other workshops were delivered in Year 3.

Work under the LOI work stream continued on Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31) in Quarter 1 of Year 3. This Requirement is designed to serve as a framing document for the entire work stream and will pose future research questions and identify gaps in the existing knowledge base. A draft of the framework document was submitted on February 4, 2019 for further comments from the Education Office of the Economic Growth, Education and Environment Bureau. Work continued on the report through revisions and design to produce a high-quality, polished report for wide dissemination throughout Quarters 3 and 4.

Under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, Dexis will develop 19 country profiles for sub-Saharan African countries currently implementing Goal 1 reading programs. This work includes the continuing data collection on a number of areas across the 19 countries for inclusion in the profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion and receive initial feedback. Three draft country profiles (South Sudan, Rwanda, and South Africa) were submitted on June 28, 2019 for further comments and to determine the appropriate document structure for the remainder of the country profiles. Feedback was received for South Africa on July 10, 2019, and for South Sudan on July 15, 2019. Following the receipt of feedback for Rwanda, further discussion is necessary to determine the appropriate structure for the remainder of the country profiles.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during Year 3. Following the inclusion of the questions into an EGRA implemented by RTI in Uganda in mid-October 2018, analysis began once the data was received. During Quarter 2, the analysis began by comparing teachers’ responses with data on student reading outcomes. The initial findings from the inclusion of the teacher questions in the EGRA in Uganda were compiled in an initial report submitted on June 24, 2019. The proposed revisions to the LOI-specific questions included in the teacher interview questionnaire aim to improve the reliability and content of data collected through this instrument. A research plan for this activity was also submitted on June 24, 2019 for review.

In Year 3, work began on Requirement 2.34, Case Study on LOI Implementation and Language Attitudes. To understand how LOI policies and implementation are perceived and followed, the case study will examine the intersection of teacher training, knowledge, and language proficiency; the use of LOI within classrooms; the linguistic diversity of students; the availability of teaching materials; the teacher placement process; and the attitudes and beliefs of teachers, parents, and community members around implementing official LOI policy. A research plan was submitted to USAID on August 30, 2019 for initial comments.

Requirement 2.35, Teacher Language and Literacy Assessment, involves developing and administering a tool that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. An initial research plan was developed to map the way forward for this activity. Throughout Quarters 2 and 3, USAID provided comments and feedback on the instruments and research plan, suggesting a change in the proposed structure of the tools in the research plan. The suggested changes were incorporated, and both the research plan and instruments were resubmitted on July 8, 2019 and approved on July 12, 2019. Additionally, in Quarter 4, TLLA Mission concurrence for Uganda was received on August 13, 2019.

Result 2 Next Steps:

• Deliver Workshops (Requirements 2.20-2.30), to occur throughout Year 4.
• Submission of the final report, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes (KSA) related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31).
• Submission of 19 draft country profiles, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32).
• Submission of the final research plan for the Case Study on LOI Implementation and Language Attitudes (Requirement 2.34).
• Initiating in-country work for the Teacher Language and Literacy Assessment (Requirement 2.35).

RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED

USAID has supported the development of several tools (e.g., the Early Grade Reading Assessment and the Snapshot of School Management Effectiveness) that have been successfully adapted and applied in numerous countries, generating useful data and animating purposeful policy dialogue. The subsequent years of the project will explore the further development and use of the SRGBV Toolkit.

The research priority for Result 3 in Year 3 was continued work toward the Delivery of the SRGBV Measurement Framework (Requirement 3.1). The data cleaning, processing, analysis, and report writing for the SRGBV instrument pilot carried into Year 3, and this work remained a primary focus for Quarters 1, 2, and 3. Following the submission of the Malawi Pilot draft report on September 28, 2018, discussions have been ongoing around how to address issues with the survey questions, including requests for clarification, additional information on the recommendations made regarding the survey content, and more details about the reliability and validity analysis conducted on all the instruments. This has led to additional analyses and discussions around how best to correct these items and proceed with next steps.

During Quarter 2, USAID requested a summary document detailing the decisions made and the additional analyses conducted to date; the document was produced to serve as a reference point for the path forward and was submitted on February 28, 2019. Further discussions stemming from this document indicated that further analysis, as well as a document articulating options for next steps, was needed. Following the additional analysis and clarification of the results from the Malawi pilot, a scenario document outlining possible next steps for the SRGBV instruments was submitted for discussion on April 30, 2019. A second item consisted of the results of the CFA on the Gender Attitudes and Beliefs instrument for students, teachers, and parents, and was submitted on May 24, 2019. The third item consisted of the results of the CFA on the Perceptions of School Climate instrument for students, teachers, and parents, and was submitted on June 19, 2019. Following the CFA results submissions, USAID provided feedback on June 26, 2019, which was discussed in a call with Dexis, RTI, and USAID on June 27, 2019. The scenario document and the findings of the CFAs were revised following these discussions and submitted in Quarter 4.

Initial sections of the SRGBV Toolkit (Result 3, Requirement 3.1) were shared with USAID starting in Quarter 1, through an agreed-upon iterative process to ensure rapid turnaround and to select a design style for the Toolkit; work continued on the design and layout of the Toolkit during Quarter 2. In the beginning of 2019, a draft version of all chapters of the Toolkit was submitted for USAID review. On May 20, 2019, USAID provided written feedback on the draft version of the Toolkit, which was discussed on a phone call on May 22, 2019. The feedback focused on revising the Toolkit to ensure that the overall framing of the Toolkit more clearly links the survey instruments and the conceptual framework; improving the physical and electronic navigation of the Toolkit; and revising content around data analysis. Additionally, discussion around the merits of including the SEL instrument continued, as this particular instrument was not included in the Malawi pilot but has been piloted separately in Uganda.

Work on Requirement 3.2 included submission of an Early Grade Reading Barometer Assessment Report on October 31, 2018, and its approval on November 13, 2018. The assessment report aimed to determine whether the current Early Grade Reading Barometer platform, an online tool hosted by USAID’s Asia Bureau, provided the appropriate venue to incorporate additional datasets. Findings determined that the necessary platform for the Barometer already exists and is well-operated, and thus it is recommended that the Africa Bureau utilize the existing platform to incorporate datasets. The assessment report allowed for taking the next steps toward fulfilling the remaining Sub-Requirement 3.2.3, Incorporate African Mission Early Grade Reading Data into the Early Grade Reading Barometer. Efforts on Requirement 3.2 began in Quarter 2 with work on a modification with the subcontractor, RTI, to begin incorporating the Kenya (Tusome) project data starting in Quarter 3. Work towards Requirement 3.2, Update Status of Early Grade Reading Barometer, continued to progress during Quarter 4 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili. However, during Quarter 4, USAID began discussions regarding revisiting the usefulness of the Early Grade Reading Barometer. Further discussions will be necessary to determine whether proceeding with the Early Grade Reading Barometer activity is beneficial.

Result 3 Next Steps:

• Continue work on Requirement 3.1, Delivery of the SRGBV Measurement Framework, namely:
◦ Submission of a revised draft of the SRGBV Toolkit and instruments.
• Continue work on Requirement 3.2, Update Status of Early Grade Reading Barometer.

Table 1: Status of Technical Deliverables for Year 3, by Result

Result 1: Africa Missions strategy-related data needs met

• Deliver Country and Regional Education Data Trend Snap-Shots (Requirement 1.1): In progress
• Deliver Education Data Briefs (1.2): Completed
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (1.3-1.7): In progress

Result 2: Availability of Africa education data and trends expanded

• Develop Early Grade Reading research training modules or courses for classroom, online and/or resource materials (2.19.2): Upcoming (timing TBD)
• Deliver Workshops (2.20-2.30): Upcoming (timing TBD)
• Deliver current research on teacher knowledge, skills, and attitudes (KSA) related to language and literacy that influence early grade literacy outcomes (2.31): In progress
• Develop relevant country-specific profiles of LOI policies, reading program approaches to LOI, and student reading outcomes (2.32): In progress
• Deliver analysis of relationship between teacher language proficiency and student reading performance (Requirement 2.33): In progress
• Case study on LOI implementation and language attitudes (Requirement 2.34): In progress
• Teacher Language and Literacy Assessment (Requirement 2.35): In progress

Result 3: Measurement tools with applicability across countries developed

• Deliver the SRGBV Measurement Framework (3.1): In progress
• Early Grade Reading Barometer (3.2): In progress

CHALLENGES AND LESSONS LEARNED

During the course of Year 3, there have been some challenges surrounding communication and understanding of tasks and timelines on various activities. These issues have been ongoing since the last quarter and have been discussed continually with the TOCOR to mitigate and remedy them. Steps have been taken to address them by creating additional updates and timelines, accessible to everyone, with clear action items, deliverables, and dates in one central location. Additionally, there were requests for further information and analysis on the SRGBV activity that were not planned or budgeted for. Work has continued despite this in an attempt to keep things moving; however, accommodating these additional requests has resulted in delays and a need for additional resources. The lack of clarity regarding the additional resources needed to continue work on these additional requests led to further challenges and delays. In the future, work will need to halt until formal modifications are in place to mitigate this risk.

Some concerns were raised by USAID regarding the overall quality of the HDAK evaluation report. Dexis sought to remedy these issues by bringing in a Senior Technical Monitoring, Evaluation and Learning colleague to provide the technical insight necessary to resolve all outstanding comments and concerns. Learning from the challenges associated with the HDAK evaluation report, Dexis adjusted the team composition for the upcoming Soma Umenye Performance Evaluation (Requirement 1.4). The Rwanda Mission was also included in the selection process for the Team Lead position for the Soma Umenye Performance Evaluation in order to mitigate potential issues that had occurred in the prior evaluation.

Additionally, USAID flagged some challenges surrounding the SRGBV work, including the level of quality of some of the deliverables. Some of these challenges stemmed from the issues described above: communication, additional requests, and a lack of clarity on both sides regarding expectations for these additional deliverables. Due to these challenges, a formal notice was submitted to Dexis articulating the issues and the immediate need to remedy them. In an effort to address these, Dexis notified the subcontractor, RTI, that it would no longer be involved in completing this Requirement (3.1). Efforts have been focused on ensuring a seamless transition and concise communication to mitigate any similar issues.

OTHER DELIVERABLES

During Year 3, Dexis also submitted the Year 2 Annual Report and the Year 3 Workplan on October 31, 2018. The Year 3 Workplan was approved on November 29, 2018, and the Year 3 Workplan Summary was submitted and approved on December 14, 2018. In order to continue to provide services under the Research for Effective Education Programming – Africa project, a Limitation of Funds notice was submitted on May 30, 2019, and the pipeline budget was submitted on June 24, 2019.

PROJECT MANAGEMENT DELIVERABLES

In Year 3, Dexis submitted Quarterly Reports for Quarters 1 through 3. Dexis also submitted the Year 2 Annual Report, the Year 3 Annual Work Plan, and the Year 3 Annual Work Plan Summary. In Quarter 4, Dexis also developed the Annual Work Plan for Year 4, for submission in Quarter 1 of Year 4. In Quarter 4 of Year 2, a modification (AID-OAA-TO-16-00024 MOD 04) was requested to change some activities within the TO and to increase the subcontractor’s ceiling. This modification was approved in Year 3, on October 10, 2018. Dexis also sought rate approvals for all personnel whose rates changed between Years 2 and 3.

In Quarter 3 of Year 3, a modification (AID-OAA-TO-16-00024 MOD 09) was requested to increase the unburdened ceiling rate; this modification was approved on May 16, 2019. Additionally, in Quarter 4 of Year 3, a modification (AID-OAA-I-15-00024 MOD 05) was requested and approved on September 13, 2019 to incrementally fund the contract, increasing the total obligated amount.

Next Quarter (Year 4, Quarter 1):

• Dexis will submit the Year 4 Work Plan on October 31, 2019.
• Dexis will submit the Work Plan Summary for Year 4 within 15 days of the Work Plan approval.
• Dexis will submit its Year 4, Quarter 1 Report on January 31, 2020.


YEAR 3 DELIVERABLES STATUS

REEP-A Year 3 Deliverables

Type | Subject/Brief Description | Original Submission | Final Submission | Date Approved
Project Management | Year 3 Work Plan | 10/31/2018 | – | 11/29/2018
Project Management | Year 3 Work Plan Summary | 12/14/2018 | – | 12/14/2018
Periodic Report | Quarterly Report – Y3Q1 | 01/31/2019 | – | 02/05/2019
Periodic Report | Quarterly Report – Y3Q2 | 04/30/2019 | – | 06/06/2019
Periodic Report | Quarterly Report – Y3Q3 | 07/30/2019 | – | 07/30/2019
Technical | Early Grade Reading Barometer Assessment Report | 10/31/2018 | – | 11/13/2018
Technical | SRGBV Data Brief, French Translation | 02/04/2019 | – | 03/07/2019
Technical | TLLA Research Plan and Instruments | 07/12/2019 | – | 07/12/2019
Technical | Soma Umenye Research Plan | 11/29/2018 | 07/30/2019 | 07/30/2019

ANNEX OF YEAR 3 DELIVERABLES

1. Year 3 Work Plan
2. Year 3 Work Plan Summary
3. Quarterly Report – Y3Q1
4. Quarterly Report – Y3Q2
5. Quarterly Report – Y3Q3
6. Early Grade Reading Barometer Assessment Report
7. SRGBV Data Brief, French Translation
8. TLLA Research Plan and Instruments
9. Soma Umenye Research Plan


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA Year Three Annual Workplan

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Prepared for

USAID

Prepared by

Dexis Consulting Group

Research for Effective Education Programming – Africa (REEP-A)

Contract No. 7000-S-2015-01-WO-2017-02

October 2018

ACRONYMS

AREW Africa Regional Education Workshop

DERP Data for Education Research and Programming in Africa

DfID United Kingdom Department for International Development

E3/ED Economic Growth, Education and Environment, Office of Education

EGRA Early Grade Reading Assessment

G2G Government to Government

IRB Institutional Review Board

KSA Knowledge, Skills and Attitudes

LOI Language of Instruction

OCI Organizational Conflict of Interest

PIRLS Progress in International Reading Literacy Study

RDCS Regional Development Cooperation Strategy

REEP-A Research for Effective Education Programming – Africa

SART Secondary Analysis for Results Tracking Database

SLA Second Language Acquisition

SOW Statement of Work

SRGBV School-Related Gender-Based Violence

TO Task Order

TOCOR Task Order Contract Officer Representative

USAID United States Agency for International Development

CONTENTS

OVERVIEW OF THE REEP-A PROGRAM 1
PURPOSE 1
ANALYTIC FRAMEWORK 3
YEAR THREE WORK PLAN 4
RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET 4
REQUIREMENT 1.1: DELIVER COUNTRY AND REGIONAL EDUCATION DATA TREND SNAPSHOTS 5
REQUIREMENT 1.2: DELIVER EDUCATION DATA BRIEFS 5
REQUIREMENT 1.3-1.7: DELIVER MISSION SUPPORT FOR EVALUATION, ASSESSMENT, AND DATA ON SELECTED PRIORITIES 6
RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED 8
REQUIREMENT 2.5: DELIVER LESSONS LEARNED IN EDUCATION PARTNERSHIPS 9
REQUIREMENT 2.6-2.18: DELIVER ANALYSIS ON SELECTED RESEARCH TOPICS 10
REQUIREMENT 2.19: DELIVER TRAINING MODULES OR COURSES FOR CLASSROOM, ONLINE AND OR RESOURCE MATERIALS (CAPACITY BUILDING) 10
REQUIREMENT 2.20-30: DELIVER WORKSHOPS 12
REQUIREMENT 2.31: DELIVER CURRENT RESEARCH ON TEACHER KNOWLEDGE, SKILLS, AND ATTITUDES (KSA) RELATED TO LANGUAGE AND LITERACY THAT INFLUENCE EARLY GRADE LITERACY OUTCOMES 13
REQUIREMENT 2.32: DEVELOP RELEVANT COUNTRY-SPECIFIC PROFILES OF LOI POLICIES, READING PROGRAM APPROACHES TO LOI AND STUDENT READING OUTCOMES 14
REQUIREMENT 2.33: DELIVER ANALYSIS OF RELATIONSHIP BETWEEN TEACHER LANGUAGE PROFICIENCY AND STUDENT READING PERFORMANCE 15
REQUIREMENT 2.34: CASE STUDY ON LOI IMPLEMENTATION AND LANGUAGE ATTITUDES 16
RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED 17
REQUIREMENT 3.1: DELIVER THE SRGBV MEASUREMENT FRAMEWORK (METHODS DEVELOPMENT) 17
REQUIREMENT 3.2: UPDATE STATUS OF EARLY GRADE READING BAROMETER 18
REQUIREMENT 3.3-3.4: DELIVER TOOLS FOR SYSTEMATIC DATA COLLECTION IN A SELECTED AREA (METHODS AND TOOL DEVELOPMENT) 19
REQUIREMENT 3.3: DELIVER TOOLS FOR SYSTEMATIC DATA COLLECTION IN A SELECTED AREA (METHODS AND TOOL DEVELOPMENT): TEACHER LANGUAGE AND LITERACY ASSESSMENT 19
REQUIREMENT 3.4: DELIVER TOOLS FOR SYSTEMATIC DATA COLLECTION IN A SELECTED AREA (METHODS AND TOOL DEVELOPMENT): SRGBV DATA COLLECTION ACTIVITY 20
PROJECT MANAGEMENT DELIVERABLES 20
QUARTERLY AND ANNUAL REPORTS 20
SCHEDULE OF YEAR THREE ACTIVITIES 21
ACTIVITY MATRIX 21
GANTT ACTIVITY CHART 31

OVERVIEW OF THE REEP-A PROGRAM

PURPOSE

The overarching aim of the Research for Effective Education Programming – Africa (REEP-A) project is to provide the USAID Africa Bureau and Mission offices, as well as partner organizations, with concrete contributions to research on USAID education initiatives that will inform evidence-based investment, decision-making and the prioritization of needs. REEP-A will contribute to USAID's education aims in Africa by providing technical and advisory services focused on research and capacity building, to enhance the quality and effectiveness of USAID activities. REEP-A will provide practical and action-oriented gender-sensitive research that will be applied to both project design and implementation. REEP-A aims to strengthen the relationships and links between project evaluation, design, and implementation. In addition, REEP-A will contribute to the capacity development of education personnel and partners by enhancing the use of research in decision-making, mainstreaming the application of feedback-loops and lessons learned into project design and implementation, and expanding the evidence and knowledge base on education initiatives. The intended audience of REEP-A is broad, and includes stakeholders across bilateral and regional Missions and Bureaus, host-country governments, implementing partners, and other donors.

USAID’s Education Strategy, issued in 2011, has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals, specifically the measurement of access to education and early grade reading improvements. The education goals set targets for 2015, focusing on early grade reading, workforce development, and education in crisis and conflict-affected environments:

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

Following the release of the Strategy, the Education Team of the USAID Africa Bureau’s Office of Sustainable Development (AFR/SD/ED) identified the need for targeted research and training services to complement the efforts of Africa Missions to effectively align their education portfolios with the three goal areas of the Strategy. In particular, AFR/SD/ED determined the need for comprehensive data and evaluation of initiatives for early grade reading and enhanced access to education in crisis or conflict- affected countries in order to better inform decision-making and strengthen understanding of the barriers to improved outcomes. The Data for Education Research and Programming in Africa project (DERP), implemented between 2011 and 2016, was established to respond to these identified needs, and provided AFR/SD/ED with timely research and tools that Missions could apply to their ongoing and future education projects. REEP-A will expand on the progress and evidence base established by DERP, and will target identified gaps and challenges.

To meet the ambitious aims for both USAID’s Education Strategy Goals 1 and 3, it is critical that resources are used effectively and strategically to ensure the greatest gains. As Africa Missions are expected to make the largest contributions to these goals, the effective use of resources is particularly

vital in the Africa region. REEP-A aims to provide theories of change that will inform investment decisions, using a gender-sensitive lens. Through rigorous research, including the monitoring and evaluation of projects, REEP-A strives to address research gaps and expand the knowledge base on effective education programming.

REEP-A seeks to contribute evidence that responds to the question of why, despite some gains, progress toward enhanced early grade reading and access to quality education in conflict and crisis settings in Africa has been modest. Expanding the evidence base and data collection for education interventions in Africa is critical to spurring further progress, increasing the understanding of program effectiveness, identifying barriers, and replicating successes.

REEP-A will target topics and issues that have not received sufficient analysis, and will also address new challenges that have emerged in the last five years, namely:

• Additional support to Africa Missions, as many education projects move from an initial focus on strategy-alignment to the design and implementation phase, with a focus on the expansion of early grade reading and achieving national-level scale
• Increased emphasis on cost-effective approaches that promote sustainability
• Enhanced understanding of the efficacy of early grade reading interventions
• Targeted focus on girls and the role of gender-based dynamics in promoting or inhibiting progress
• Enhanced understanding of the weak or poor improvements in early grade reading programs in Africa, as evidenced by preliminary midline reporting
• Enhanced understanding of programmatic issues in enrolling and retaining, in both formal and informal education programs, the large number of children and youth living in crisis and conflict areas that lack access to education, particularly girls

REEP-A will extend the analytic framework of DERP, building on the progress made in expanding the evidence base and improving data collection, and will further explore areas for research that were identified as critical during the research program. REEP-A is designed to be flexible and to adapt over time to respond to emerging areas of inquiry and to adequately address practical areas of concern, including bottlenecks and challenges in achieving desired program outcomes. REEP-A will provide analysis on areas with limited, uncertain or weak previous research, with a primary focus on two Strategy Goals: Goal 1 (Early Grade Reading) and Goal 3 (Education in Crisis and Conflict Environments).

The portfolio of education initiatives being implemented by Africa Missions is heavily concentrated on Early Grade Reading (Goal 1) projects, with the number of Goal 1 projects significantly higher than the combined total of both Goal 2 and Goal 3 projects. Currently, 19 Africa Missions have Goal 1 projects, which together reach the largest number of beneficiaries of Goal 1 initiatives globally. Mirroring this emphasis on Early Grade Reading, REEP-A will have a heavier focus on research related to Goal 1.

Following the launch of the Strategy in 2011 and through 2015, the reading skills of around 30 million children globally were improved, of whom 24 million were in Sub-Saharan Africa. However, despite these encouraging numbers during the first five years of the Strategy, overall improvements in reading in the region have been slow, and reading levels remain low. REEP-A's objective within Goal 1 is to better understand the reasons for these slow gains, and to further examine challenges faced by current

interventions, with a broader aim of strengthening education programming for USAID, improving the allocation and effectiveness of investments through rigorous analysis and research, and enhancing information sharing and capacity building for USAID personnel, regional Missions and Bureaus, host-country governments, and other partners and donors. In keeping with the overarching strategy and objectives of REEP-A, activities and analyses under this Goal will be approached through a gender-sensitive lens, and will utilize sex-disaggregated data when available.

The second research focus of REEP-A will be on Education in Crisis and Conflict Environments (Goal 3). Currently, seven Africa Missions are implementing Goal 3 initiatives, with the expectation of reaching approximately 1 million beneficiaries by 2018. In crisis and conflict settings, Goal 1 and 3 projects are implemented both jointly and separately, depending on the context. While research in these settings can be particularly challenging, USAID has made progress in identifying several key project elements and components for Education in Crisis and Conflict Environments project design. Crisis and conflict severely hinder education and development, with a particularly pronounced impact on girls and other vulnerable groups. REEP-A will address the need for additional research into gender-based and regional barriers to quality education for children and youth in these settings.

Goal 2 of the Strategy has two primary content areas: Higher Education Partnerships and Workforce Development. The scope of Goal 2 is extremely broad, and engages all of the technical sectors within USAID, particularly tertiary education. While strengthening the quality, equity and relevance of tertiary education and workforce development training is a strategic priority within USAID, tools that measure these targeted issues are limited. Of the three goal areas, Goal 2 has the least emphasis in Africa Missions. While four Goal 2 projects have been implemented by Africa Missions since the Strategy was issued, it is projected that only two, in Liberia and South Africa, will continue past 2015. Due to the strong emphasis on Goals 1 and 3, and because very few projects in Africa Missions are focused on Higher Education and Workforce Development, Goal 2 will not be a research focus of REEP-A, with limited to no resources and analytic efforts from REEP-A directly related to this goal area.

ANALYTIC FRAMEWORK REEP-A will support AFR/SD’s Regional Development Cooperation Strategy (RDCS) through the generation of regional, country-specific and gender-focused data, analysis and research that will assist in identifying and prioritizing needs, promoting rigorous research to inform evidence-based decisions, and promoting gender-equitable education investments within Africa.

To achieve these aims, REEP-A will provide technical assistance and advisory services across the three results articulated in the Education Analytic Framework:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This distribution of activities across the three Results is reflected both throughout the five years of the Task Order (TO) as well as in Year 3. Year 3 will provide strategic research across the three Results, and build on the foundational research and analysis that has occurred over the last two years. Year 3

remains focused on implementing and continuing research begun thus far, as well as initiating new research activities. In discussion with USAID, including during Quarterly meetings, future research topics will also be determined and refined to fulfill undetermined requirements under the TO.

Research activities will prioritize and integrate gender-focused approaches. In close consultation with AFR/SD/ED, all research activities, including technical plans, will utilize gender-sensitive methodologies and evaluation questions, with the aim of strengthening understanding of the barriers to improving gender equity in education and learning. Recognizing the importance of gender considerations as a key component of success in achieving development aims, research activities within REEP-A will, through the use of available sex-disaggregated data and other gender-sensitive tools, produce research that aims to clearly articulate and understand existing gender disparities, identify gaps, and provide a thorough analysis of gender issues at both the individual and institutional levels, including the interplay of contextual and political factors. In addition to research approaches, a gender lens will be adopted in the other analytic and capacity building services under REEP-A. This gender-sensitive approach includes the engagement of men and boys in conversations around issues of gender equity, which is central to achieving equality both in education and the work force. The identification of gender-based disparities and continued barriers to equity will be a driving focus of the research and recommendations, as well as in shaping research questions and topics throughout the TO.

YEAR THREE WORK PLAN

A substantial number of the activities within the TO will take place within Year 3. In addition to the scheduled research, many activities from the first two years of REEP-A have rolled over to be completed in Year 3.

Research activities in Year 3 will occur across all three TO Result areas. In Result 1, activities will span all the Requirements, including education data briefs and Mission support for evaluation, assessment and data priorities. Research in Result 1 will include two Mission support performance evaluation activities.

Two work streams will comprise the largest share of research in Year 3: research related to School-Related Gender-Based Violence (SRGBV) (under Result 3) and research on teachers and language of instruction (LOI) (under Result 2 and Result 3). Work anticipated under Result 2 will also include the development of training materials and the delivery of workshops. One desk study on a selected research topic may also be conducted in Year 3, as determined in discussion with USAID. In addition, under Result 3, work will begin on incorporating data from African countries into the Early Grade Reading Barometer.

RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET The smallest distribution of activities in Year 3, as in the wider TO, falls within Result 1. Activities under Result 1 will include elements across the Sub-Requirements, including Country and Regional Education Data Snapshots, Education Data Briefs, and Mission Support for Evaluation, Assessment and Data on Selected Priorities.

Requirements for Result 1 are primarily in the work stream of Ad Hoc Data Reports. Seven requirements have been identified for the achievement of Result 1:

1. Provision of country education data-fact sheets or snapshots, and country and regional education trends analyses. Country and regional education trends analyses will require discussion to determine the data to be presented. (Requirement 1.1)
2. Delivery of time-sensitive education data briefs. (Requirement 1.2)
3. Delivery of support to Missions based on specific data demands or evaluations. (Requirements 1.3-1.7)

REQUIREMENT 1.1: DELIVER COUNTRY AND REGIONAL EDUCATION DATA TREND SNAPSHOTS AFR/SD will use the Education Data Trend Snapshots to inform USAID managers and decision-makers of the overall education status of a particular country, which could affect budget allocations and the Strategy goal or goals a country could pursue. Country data such as the percentage of out-of-school children or literacy rates are examples of the type of data points considered for inclusion. Whenever possible, fact sheets will include gender-based data in addition to sex-disaggregation.

Country and Regional Education Data Trend Snapshots were delivered in Year 2, and an additional iteration of the Snapshots is expected to occur in Year 4. Because not all of the Snapshots were used in Year 2, the remaining four roll over and are available in Year 3, in discussion with the TOCOR.

SUB-REQUIREMENT 1.1.1: IDENTIFY DATA AND FORMAT FOR EDUCATION DATA SNAPSHOTS Standards:

• Proposed data includes early grade reading assessment results, where available.
• Proposed data includes number or percentage of out-of-school children, where available.
• Proposed data is sufficiently reliable and comparable to facilitate meaningful cross-country analysis.
• Final, approved format submitted to TOCOR.

SUB-REQUIREMENT 1.1.2: DEVELOP REGIONAL AND COUNTRY EDUCATION DATA SNAPSHOTS Following TOCOR approval of Sub-Requirement 1.1.1, Dexis will develop gender-sensitive Snapshots.

Deliverable:

• Draft for a minimum of five (5) 1-2 page Snapshots for AFR/SD/ED supported countries.

SUB-REQUIREMENT 1.1.3: DISSEMINATE REGIONAL AND COUNTRY EDUCATION DATA SNAPSHOTS Following TOCOR approval of Sub-Requirement 1.1.2, Dexis will disseminate gender-sensitive Snapshots.

Deliverable:

• Ten (10) hardcopies of each country and regional data sheet to be submitted to the TOCOR.

REQUIREMENT 1.2: DELIVER EDUCATION DATA BRIEFS

AFR/SD/ED occasionally receives requests from Africa Bureau leadership or from U.S. Congressional members for time-sensitive education data. Responding usually requires pulling together data from publicly available sources but can require specific Mission project data. The Task Order Contract Officer Representative (TOCOR) will communicate the data information request and provide a quick turnaround time for the delivery.

Upon request of the TOCOR, up to two education briefs will be delivered in Year 3. A French translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) will be completed by January 2019, which will fulfill one of the briefs for Year 3.

The timing and content of the remaining brief will be determined by USAID and communicated to Dexis through the TOCOR.

SUB-REQUIREMENT 1.2.1: DEVELOP EDUCATION DATA BRIEF Upon request, the most relevant available education data will be assembled to provide targeted and informative responses. In some cases, only data will need to be provided and in other cases the data will require a text description or analysis. Responses will typically be one (1) page in length, include data citations, and be without spelling or punctuation errors.

Standards:

• Response time within 24 hours.
• Data citations included.
• Text response without spelling or punctuation errors.

REQUIREMENT 1.3-1.7: DELIVER MISSION SUPPORT FOR EVALUATION, ASSESSMENT, AND DATA ON SELECTED PRIORITIES REEP-A is intended to be sufficiently flexible to respond to specific Mission evaluation, assessment and/or data requests, as certain needs emerge over the term of the TO. The Africa Bureau’s education team anticipates that Missions have a range of evaluation, assessment and data needs for which requests will be made. Support could include gender-sensitive evaluation design, including the development of evaluation questions and support for independent government cost estimates; technical support for the review of performance management plans, evaluation design and reports, and other technical documents; development of gender-sensitive research designs for proposed studies; review of evaluation reports and other evidence to determine policy implications; and technical assistance to Missions to develop or review evaluation data collection methods or evaluation needs. Support under this Requirement could be in the form of performance evaluations and assessments. Resources can only support those field activities that are feasible, meaning activities that can be undertaken and completed during the TO’s period of performance, at a cost that does not prohibit or severely limit the provision of support to Missions seeking support, and that conducting the activities does not create a potential organizational conflict of interest (OCI), real or perceived. Impact evaluations are not intended to be among the range of Mission support provided.

Based on information from the TOCOR and discussions with Missions it was decided that two performance evaluations in Rwanda might be feasible. Following these conversations, Dexis completed a brief Feasibility Determination report for both of the proposed evaluations (Requirement 1.3.1 and

Requirement 1.4.1) to determine if it was possible to undertake the two evaluations under REEP-A. The envisioned scope for these Mission support activities was originally somewhat smaller, such that an extensive or complete evaluation would not have initially been possible. However, given the lack of expressed needs from other Missions, a determination was made that these two evaluations would cover all of the Mission support activities (1.3-1.7), and that resources could be allocated accordingly. For these reasons, the Feasibility reports submitted to USAID confirmed the ability of REEP-A to undertake both of these performance evaluations, which would account for all of the Mission support activities.

These Mission support activities (Requirement 1.3 and Requirement 1.4) began in Year 2 of REEP-A with the submission of research plans for both performance evaluations in Rwanda, namely, of the flagship programs on early grade reading (Soma Umenye) and workforce/youth development (Huguka Dukore Akazi Kanoze). The evaluation of Huguka Dukore Akazi Kanoze is underway: in-country fieldwork will occur in November 2018, with the final report expected in February 2019. The evaluation of Soma Umenye has been pushed back to April 2019 to allow program adjustments to take hold prior to the performance evaluation. The specific dates of this evaluation will be finalized in January 2019.

Work on the performance evaluations in Year 3 will consist of the following Sub-Requirements.

SUB-REQUIREMENT 1.3.2: CONDUCT MISSION-SPECIFIC EVALUATION/DATA ACTIVITY Standards and Deliverables:

• Research plan submitted within one (1) month following a "yes" determination.
• Plan comprises methodology or methodologies to be used and identifies role of USAID Mission.

The Huguka Dukore Akazi Kanoze evaluation will occur in late November 2018.

SUB-REQUIREMENT 1.3.3: PRODUCE REPORT ON MISSION-SPECIFIC EVALUATION/DATA ACTIVITY AND DISSEMINATE Standards and Deliverables:

• Draft report submitted within two (2) months following completion of activity.
• Final report submitted within 15 days after TOCOR provides comments/approval.
• The report will include an executive summary, gender-based data and findings, and will be grammatically correct, without spelling or punctuation errors.
• Minimum of two (2) hardcopy reports distributed to AFR/SD/ED as well as electronic copies provided in Microsoft Word and PDF.
• Report distributed electronically to intended mission recipients within ten (10) days following TOCOR approval.

The draft report on the Huguka Dukore Akazi Kanoze evaluation will be submitted in January 2019.

SUB-REQUIREMENT 1.4.2: CONDUCT MISSION-SPECIFIC EVALUATION/DATA ACTIVITY Standards and Deliverables:

• Research plan submitted within one (1) month following a "yes" determination.
• Plan comprises methodology or methodologies to be used and identifies role of USAID Mission.

The Soma Umenye research plan is pending approval. The Soma Umenye evaluation will occur in April 2019.

SUB-REQUIREMENT 1.4.3: PRODUCE REPORT ON MISSION-SPECIFIC EVALUATION/DATA ACTIVITY AND DISSEMINATE Standards and Deliverables:

• Draft report submitted within two (2) months following completion of activity.
• Final report submitted within 15 days after TOCOR provides comments/approval.
• The report will include an executive summary, gender-based data and findings, and will be grammatically correct, without spelling or punctuation errors.
• Minimum of two (2) hardcopy reports distributed to AFR/SD/ED as well as electronic copies provided in Microsoft Word and PDF.
• Report distributed electronically to intended mission recipients within ten (10) days following TOCOR approval.

The timing of the draft report on the Soma Umenye evaluation will be determined following the updated evaluation timeline.

RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED The majority of REEP-A research activities for Year 3 will contribute to Result 2, as will the bulk of research activities throughout the TO. Result 2 activities span both primary and secondary (desk) research.

The objective of Result 2 is to:

• Identify or generate new knowledge about evidence-based education programming related to reading and school access.
• Ensure that the knowledge and research developed is available to USAID education personnel, including through trainings, workshops, and widespread dissemination across relevant websites and social media platforms, which will be articulated in detailed dissemination plans.

Requirements for Result 2 primarily span the work streams of desk research, primary research, capacity building, and ad hoc reports. The projected Result 2 Requirements for Year 3 are:

1. Lessons Learned in Education Partnerships primary research (Requirement 2.5)
2. Delivery of analysis on selected research topics, which are determined through discussions between Dexis and the TOCOR (Requirement 2.6-2.18)
3. Delivery of Goal 1 training modules or courses for classroom, online and/or resource materials (Requirement 2.19)
4. Deliver Workshops (Requirements 2.20-2.30)
5. Delivery of current research on teacher knowledge, skills and attitudes (KSA) related to language and literacy that influence early grade literacy outcomes (Requirement 2.31)
6. Development of relevant country-specific profiles of Language of Instruction (LOI) policies, reading program approaches to LOI and student reading outcomes (Requirement 2.32)

7. Delivery of an analysis on the relationship between teacher language proficiency and student reading performance (Requirement 2.33)
8. Case study on LOI implementation and language attitudes (Requirement 2.34)

The primary focus of Year 3, under Result 2, will be in-depth research and analysis related to the language skills of teachers, and the corresponding impact this has on the acquisition of reading skills and the effectiveness of early grade reading programs. Several requirements in this work stream will occur in Year 3, with additional activities in later years of the TO. The initial deliverable will be a framing report that sets the stage for the work stream and will serve as a foundational document (Requirement 2.31).

The development of comprehensive country profiles, with policy and project information, will allow USAID and other partner organizations to quickly understand approaches adopted by relevant programs, in light of that country's linguistic and policy context (Requirement 2.32). To gain some insight into the relationships between stated teacher language proficiency and student reading performance, several questions specific to teachers will be included in upcoming Early Grade Reading Assessments (EGRAs), in up to two countries (Requirement 2.33). Case studies in up to two countries will seek to learn more about how teacher preparation for instruction in the LOI, the process of teacher-school assignment, attitudes and beliefs about language in education among teachers and other stakeholders, and the language curriculum and textbooks affect reading outcomes (Requirement 2.34).

While the research on teachers and LOI is the primary focus of Result 2, several additional research activities will likely occur in Year 3. Specifically, in-country research on lessons learned from education partnerships will begin (Requirement 2.5), as will a desk review covered under the selected research topics requirements (Requirements 2.6-2.18). In addition, a training module will be developed (Requirement 2.19) and delivered through an in-country training (Requirement 2.20) in Year 3.

REQUIREMENT 2.5: DELIVER LESSONS LEARNED IN EDUCATION PARTNERSHIPS In support of common education goals and objectives, several Missions have engaged in partnerships with a range of development actors, including the Global Partnership for Education, other bilateral donors (e.g. the United Kingdom's Department for International Development (DfID) in the Democratic Republic of the Congo), and host governments. Partnerships can leverage greater resources, unite entities around specific education objectives, and, in the case of Government to Government (G2G) programming, help build capacity and increase the likelihood of sustained efforts. However, partnerships can be difficult to manage and challenging to implement, and significant time and staff resources are required to develop and remain actively engaged in them. Identifying lessons learned can greatly benefit Missions as they seek to engage in future partnerships. This research activity should be structured to answer the following research question:

What are the pros and cons associated with the types of partnerships most commonly initiated by USAID (i.e., G2G, Global Partnership for Education)?

Dexis will describe and categorize partnerships that reflect the most current trends in Africa and identify benefits, barriers, and any lessons learned. This investigation will require desk review and up to three (3) in-country visits. Dexis and the TOCOR will determine if this research will be undertaken, and if so in which countries, in the early months of Year 3.

SUB-REQUIREMENT 2.5.1: CONDUCT ANALYSIS OF EDUCATION PARTNERSHIPS Standards and Deliverables:

• Desk research plan includes gender-sensitive methodology for search strategy, inclusion/exclusion criteria, analysis and synthesis.
• Research plan includes a gender-sensitive methodology section covering at minimum the following elements: case study selection criteria, respondent sampling strategy, data collection instruments, data collection protocols, and analysis plan.
• Plan outlines anticipated timing of all major study phases, including timeframes for in-country data collection.
• Staffing plan outlines proposed personnel and reporting/management structure for key evaluation phases: design, fieldwork, analysis, reporting.
• Research plan estimates cost.

SUB-REQUIREMENT 2.5.2: PRODUCE REPORT ON LESSONS LEARNED IN EDUCATION PARTNERSHIPS AND DISSEMINATE Dexis will propose a report format that includes the methodology, limitations, findings, conclusions and recommendations.

Standards and Deliverables:

• Report draft submitted within 45 days after completion of analysis.
• Final report submitted within 15 days after TOCOR provides comments/approval.
• Report includes an executive summary.
• Report includes gender-based data, learnings and analysis.
• Up to five (5) page companion brief on findings.
• One (1) presentation of findings to USAID Washington.
• Two (2) hardcopy reports distributed to each of 18 Africa Missions.
• One (1) dissemination plan for the research.

REQUIREMENT 2.6-2.18: DELIVER ANALYSIS ON SELECTED RESEARCH TOPICS As previously addressed, AFR/SD/ED intends that REEP-A remain sufficiently flexible to allow it to respond to topics that emerge over time.

Selected research topics will be determined through discussion with the TOCOR based on the needs of AFR/SD/ED. There is one desk study available for Year 3. Once a topic is determined, written approval will be given for Dexis to develop a research plan and commence work.

REQUIREMENT 2.19: DELIVER TRAINING MODULES OR COURSES FOR CLASSROOM, ONLINE AND OR RESOURCE MATERIALS (CAPACITY BUILDING) Dexis will deliver up to three (3) tailored classroom and/or online courses designed to transfer research-based knowledge about topics determined in conjunction with the TOCOR. This requirement includes development and distribution of resource materials. These modules, courses, and resource materials should incorporate the most current evidence associated with the research activities conducted under REEP-A. The contractor shall collaborate with AFR/SD/ED to determine countries where classroom training will be delivered. It is more cost effective for mission personnel when training

is scheduled in conjunction with other USAID trainings or events (i.e., regional workshops). The TOCOR will provide the contractor with information on USAID's annual training schedules as they are issued. Further, AFR/SD/ED and E3/ED will alternate sponsorship of a regional education workshop, which is always held in an African country, and a USAID education conference, which is held in Washington. The E3/ED education conferences are tentatively planned for 2019 and 2021; AFR/SD/ED sponsors the regional workshops in the alternate years. The tentative plan for the next Africa Regional Education Workshop is 2020. Coordinating training opportunities around these events is desirable but not required.

Standards and Deliverables:

• Training notebooks and flash drives provided for each classroom training
• Classroom training course, up to 13 days total, in Washington, DC, in up to 2 countries, or some combination of those locations
• Gender-sensitive evaluation conducted for each classroom training
• Gender-sensitive evaluation conducted for each online training
• Pre-test and post-test administered to all participants of classroom and online training
• All course material delivered to TOCOR
• Save-the-date sent to participants in advance, including a detailed summary of the objective and goals of the training
• Final Report, including:
◦ Background and purpose of the training
◦ Executive Summary
◦ Daily notes
◦ Key takeaways and recommendations for changes, adjustments
◦ Intended audience and objectives of training
◦ Daily and summative evaluation
◦ Conclusion

The first tailored classroom and/or online course will be developed in Year 3. The topic and form of materials will be determined by USAID; once the topic is chosen, discussions on the format will follow to allow for the development of the necessary materials, and development of the training will then commence.

SUB-REQUIREMENT 2.19.1: DELIVER TRAINING MODULES OR COURSES FOR CLASSROOM, ONLINE AND OR RESOURCE MATERIALS (CAPACITY BUILDING) A starting point for strengthening the knowledge and capacity of USAID education personnel and implementing partners is the identification of priority research findings for classroom training. The TOCOR will schedule a series of meetings with the contractor to discuss possible training topics based on the research generated through DERP and REEP-A. AFR/SD/ED will participate in some of these meetings. The TOCOR will include potential training topics in the quarterly meeting agenda. The contractor shall recommend training topics, training venues, and optimal times, all of which should be in accordance with the contractor's dissemination and utilization plan for the specific research topic.

Standards and Deliverables:

• A list of possible training topics submitted within 30 days of the close of the government fiscal year in Years 2 and 4 (October 1, 2017 - September 30, 2018 and October 1, 2019 - September 30, 2020) of the TO. The list includes a minimum of three possible topics. Each possible topic has a suggested method for delivery (i.e., classroom, online, resource material, etc.). Each possible topic has a suggested timeframe for development.

Discussions on topics for the training/material development are currently underway and will continue until a topic is selected. The intention is for the materials development to be completed in Year 3.

SUB-REQUIREMENT 2.19.2: DELIVER TRAINING MODULES OR COURSES FOR CLASSROOM, ONLINE AND OR RESOURCE MATERIALS (CAPACITY BUILDING) Dexis will develop the selected subject-specific training modules, courses and resource materials. These modules, courses, and resource materials, to the extent practical, should integrate effective portfolio management practices. The contractor shall ensure that each training utilizes gender-sensitive methodology and includes gender-based learning.

Standards and Deliverables:

• Outline and gender-equitable objectives provided for each selected topic.
• Outline includes a minimum of two evaluation components (related to Agency Evaluation Policy).
• Pre-test and post-test developed for each classroom and online training course.
• One (1) online pre-requisite training module for each classroom course.
• A minimum of one (1) trainer identified for each classroom training module and/or course.

Dexis will deliver the selected subject-specific training modules, courses and resource materials. These modules, courses and resource materials, to the extent practical, should integrate effective portfolio management practices. Dexis will ensure that each training utilizes gender-sensitive methodology and includes gender-based learning.

REQUIREMENT 2.20-30: DELIVER WORKSHOPS Dissemination of research findings, and engagement with USAID education personnel and implementing partners, is critical to ensuring the application and effective use of research created under REEP-A. The delivery of workshops throughout REEP-A will be a key component of these efforts.

In Year 3, up to 13 total days of workshops will be conducted in Washington, D.C., in up to 2 countries, or in some combination of those locations, following further discussions with the TOCOR on topics and locations.

SUB-REQUIREMENT 2.21.1: IDENTIFY TOPICS FOR YEAR 3 WORKSHOPS Standards and Deliverables:

• Produce outline and gender-sensitive objectives for the workshops.
• Find and secure cost-effective and disability accessible locations for the workshops.
• Secure experienced facilitators for the workshops.

A series of discussions on workshop topics, initiated by the TOCOR, is ongoing to identify topics relevant and beneficial to USAID staff and to finalize the topics, dates, and locations of these workshops.

SUB-REQUIREMENT 2.22.1: IDENTIFY TOPICS FOR AREW WORKSHOP
• One (1) roadmap document delineating "next steps," not to exceed 5 pages.
◦ The roadmap includes a list of logistics, roles and responsibilities for the contractor and SD/ED staff, technical support, and development of a website for participants.

The roadmap will be developed in collaboration with the TOCOR once further information is provided on options for the upcoming AREW.

REQUIREMENT 2.31: DELIVER CURRENT RESEARCH ON TEACHER KNOWLEDGE, SKILLS, AND ATTITUDES (KSA) RELATED TO LANGUAGE AND LITERACY THAT INFLUENCE EARLY GRADE LITERACY OUTCOMES The framing document will examine and compile available research, literature and case studies related to teacher knowledge, skills, and attitudes (KSA) on language and literacy – three key areas linked to early grade literacy outcomes. The scope, framing, content and timing of the deliverable will be further refined in the accompanying research plan. The review will prioritize research from sub-Saharan Africa; however, applicable research from other regions will also be considered. This framing analysis will provide a basis for the subsequent research activities.

A draft of this document was shared with USAID in October 2018 for initial comments and is in the process of being finalized for approval in February 2019.

SUB-REQUIREMENT 2.31.2: CONDUCT LITERATURE REVIEW OF THE ESSENTIAL COMPONENTS OF TEACHER LINGUISTIC AND LITERACY SKILLS, PEDAGOGICAL KNOWLEDGE BASE AND ATTITUDES. The initial draft of the report was submitted in October 2018 for comments from the Activity Manager. Based on this draft, conversations are continuing to ensure the report meets its objectives and to move toward the final draft.

SUB-REQUIREMENT 2.31.3: PRODUCE REPORT ON FINDINGS The draft report will be submitted within 45 days after the completion of analysis, with the final report submitted within 15 days after the TOCOR provides comments/approval. The report will likely be completed in February 2019.

Standards and Deliverables:

• Desk research plan includes methodology for search strategy, inclusion/exclusion criteria, analysis and synthesis.
• Desk research plan includes time frame.
• Desk research plan includes personnel with sufficient expertise.
• Desk research plan includes cost estimate.
• Report draft submitted within 45 days after completion of analysis.
• Final report submitted within 15 days after TOCOR provides comments/approval.
• Report includes an executive summary.

REQUIREMENT 2.32: DEVELOP RELEVANT COUNTRY-SPECIFIC PROFILES OF LOI POLICIES, READING PROGRAM APPROACHES TO LOI AND STUDENT READING OUTCOMES This Requirement will address the issue of inaccessibility of government policies on LOI through the development of country-level profiles that will allow Missions, AFR/SD and partner organizations to quickly understand the approach taken by each program in light of the country's linguistic and policy context. Currently, USAID is implementing Goal 1 reading programs in 19 countries in sub-Saharan Africa, with the potential for two additional countries. The Requirement will produce country-level LOI profiles for the countries with a current Goal 1 project, resulting in up to 19 profiles. These profiles will clearly and succinctly describe and illustrate the country's linguistic landscape and official LOI policy, and the approach to LOI taken by reading programs with regard to teacher training, curriculum development, teaching and learning materials, and community engagement, if any. This Requirement will build on previous research on LOI policy conducted by UNICEF as well as the framing document produced in Requirement 2.31.

SUB-REQUIREMENT 2.32.1: DEVELOP A COUNTRY PROFILE FOR COUNTRIES IN SUB-SAHARAN AFRICA WHERE A USAID READING PROGRAM IS CURRENTLY BEING IMPLEMENTED, OR HAS BEEN RECENTLY COMPLETED The exact countries will be determined in conjunction with USAID, and based on research conducted in the framing document, with up to 19 country profiles being produced depending on gaps and needs. The country profiles will include the following (an illustrative sketch of how these elements might be stored follows the list below):

• Linguistic context of each country.
• Current LOI policy and brief history of LOI policy if developed or changed in recent past (within past decade).
• Pre-service teacher training curriculum with regard to literacy and second language acquisition (SLA) pedagogy and teacher proficiency in the initial language of instruction (LOI1) and the second or additional language of instruction (LOI2).
• Consideration of teacher language proficiency and/or of language-specific training in the teacher posting/assignment process, if any.
• Outline/description of country's USAID reading program approach and ongoing programs with regard to LOI1 and LOI2.
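The sketch below is one minimal way the profile elements above might be captured in a consistent, machine-readable record to support cross-country comparison. It is written in Python; the type name, field names, and example values are hypothetical and for illustration only, and are not part of the Requirement or of any approved research plan.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LOICountryProfile:
    # Hypothetical record mirroring the profile elements listed above (illustrative only).
    country: str
    linguistic_context: List[str]                           # languages spoken in the country
    current_loi_policy: str                                 # summary of the official LOI policy
    loi_policy_history: Optional[str] = None                # brief history, if changed within the past decade
    preservice_training_notes: Optional[str] = None         # literacy/SLA pedagogy coverage and LOI1/LOI2 proficiency
    language_considered_in_posting: Optional[bool] = None   # teacher posting/assignment process
    usaid_reading_program_approach: Optional[str] = None    # program approach with regard to LOI1 and LOI2

# Placeholder example; the values are not actual country data.
example_profile = LOICountryProfile(
    country="Country X",
    linguistic_context=["Language A", "Language B"],
    current_loi_policy="LOI1 used in grades 1-3, with a transition to LOI2 in grade 4",
)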

The research for the profiles will begin once the research plan is approved and as the framing document (Requirement 2.31) nears completion, likely in February 2019. Draft profiles will be submitted to the TOCOR for comments; once the TOCOR's comments are incorporated, the profiles will be finalized and submitted for final approval, likely in June 2019.

SUB-REQUIREMENT 2.32.2: DEVELOP DISSEMINATION PLAN FOR LOI COUNTRY PROFILES Throughout the research process, discussions will be planned to allow for development of, and agreement on, the best way to disseminate information and results. Following USAID agreement on a dissemination strategy, Dexis will implement the plan to share the completed country profiles. The dissemination plan will be developed collaboratively with USAID to ensure the most effective, and widest, use of the country profiles, and will identify the best medium for dissemination and its utility for the intended audience. Work on the dissemination plan will begin once the profiles are nearly completed in April 2019, and the plan will be finalized and submitted for approval in June 2019.

Standards and Deliverables:

• Research plan outlining the approach to undertaking this research.
• Research plan includes estimated timeframe and cost for each step.
• Research plan includes personnel with sufficient expertise.
• Draft country profiles submitted within 45 days after completion of analysis.
• Final country profiles submitted within 15 days after TOCOR provides comments/approval.
• One (1) dissemination plan for LOI country profiles.

REQUIREMENT 2.33: DELIVER ANALYSIS OF RELATIONSHIP BETWEEN TEACHER LANGUAGE PROFICIENCY AND STUDENT READING PERFORMANCE Limited data is available on the degree to which teachers' language proficiency matches the language of instruction at their schools. This Requirement aims to provide insight into any trends or patterns in student performance based on the matching of proficiency levels between teachers and students. EGRA data typically include questions about students' home language. Including the same types of questions for teachers, within the EGRA process, would allow for the gathering of teacher proficiency information that could be compared with student proficiency and outcomes.

The activity will begin by requesting that up to two reading programs or external evaluators that are planning to conduct EGRAs in the coming years include a very brief teacher questionnaire in their suite of instruments. Implementers will also be asked to ensure that the student questionnaire includes the two language questions mentioned above. Doing so will add minimal additional cost to the overall EGRA, as the existing teams of enumerators/assessors will be able to conduct the interview with the teacher with little additional training. If the EGRA datasets become publicly available through the Secondary Analysis for Results Tracking (SART) database within this project’s timeframe, researchers for this activity could conduct secondary analysis on the data through a language lens.
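As a rough illustration of the kind of secondary analysis envisioned here, the minimal sketch below merges hypothetical teacher and student EGRA records and compares mean reading scores by whether the teacher reports proficiency in the school's LOI and whether the student's home language matches that LOI. The file names, column names, and pandas-based approach are assumptions for illustration only; they do not represent the EGRA toolkit, the SART database schema, or the approved research plan.

import pandas as pd

# Hypothetical input files and column names, for illustration only.
students = pd.read_csv("egra_student_records.csv")   # one row per assessed student
teachers = pd.read_csv("egra_teacher_records.csv")   # one row per interviewed teacher

# Link each student to their teacher's reported proficiency in the school's LOI.
merged = students.merge(
    teachers[["school_id", "teacher_id", "teacher_proficient_in_loi"]],
    on=["school_id", "teacher_id"],
    how="left",
)

# Flag whether the student's reported home language matches the school's LOI.
merged["home_language_matches_loi"] = merged["student_home_language"] == merged["school_loi"]

# Compare mean oral reading fluency by teacher proficiency and language match,
# disaggregated by sex in keeping with the project's gender-sensitive approach.
summary = (
    merged.groupby(["teacher_proficient_in_loi", "home_language_matches_loi", "student_sex"])
    ["oral_reading_fluency"]
    .agg(["mean", "count"])
)
print(summary)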

SUB-REQUIREMENT 2.33.1: DEVELOP SHORT TEACHER INTERVIEW QUESTIONNAIRE WITH QUESTIONS ABOUT LANGUAGE ABILITY Development of the research plan outlining the approach, timing and questions is underway. The draft research plan will be submitted to USAID in January 2019.

SUB-REQUIREMENT 2.33.2: PROVIDE UP TO TWO (2) PROGRAMS PLANNING AN EGRA WITH THE TEACHER QUESTIONNAIRE TO BE ADMINISTERED AT ALL SAMPLED SCHOOLS A few potential EGRAs will occur in the upcoming year. Dexis will identify upcoming EGRAs and seek approval to include the developed teacher questions. Timing and feasibility will be contingent on the schedule of those EGRAs and on agreement for inclusion. The approved questions have already been integrated into an EGRA that is currently underway in Uganda.

SUB-REQUIREMENT 2.33.3: CONDUCT SECONDARY ANALYSIS ON DATA AND COMPILE REPORT FROM UP TO TWO PROGRAMS IF/WHEN IT BECOMES PUBLICLY AVAILABLE, INCLUDING EGRA DATA, STUDENT LANGUAGE DATA, AND TEACHER DATA The analysis will begin as soon as the data is obtained following completion of the EGRA; from that point, it will take approximately six months to clean and analyze the data and write up the findings.

Standards and Deliverables:

• Research plan outlining the approach to undertaking this research.

• Research plan includes estimated timeframe and cost for each step.
• Research plan includes personnel with sufficient expertise.
• Report draft submitted within 45 days after completion of analysis.
• Final report submitted within 15 days after TOCOR provides comments/approval.
• Final report includes an executive summary.
• Final report includes disaggregation by gender.

REQUIREMENT 2.34: CASE STUDY ON LOI IMPLEMENTATION AND LANGUAGE ATTITUDES Case studies will be done in up to two countries. In the selected country, based on conversations with the TOCOR, researchers will conduct focus group interviews with teachers and other stakeholders (such as faculty at teacher training institutions, Ministry of Education staff and officials, and parents/community members) as well as classroom observations, to learn more about how teacher preparation for instruction in the LOI, the process of teacher-school assignment, and attitudes and beliefs about language in education among teachers and other stakeholders impact early grade reading outcomes. The case study will also examine the effectiveness of the language curriculum and textbooks in preparing students for using a given language as the LOI, and how language is utilized in the classroom in both linguistically homogeneous as well as heterogeneous contexts.

SUB-REQUIREMENT: PRODUCE RESEARCH PLAN, USING THE COUNTRY LOI PROFILES TO SELECT UP TO TWO COUNTRIES IN WHICH TO CONDUCT A CASE STUDY. The draft research plan will likely be submitted in March 2019, following the completion of the country profiles, as these will inform the research plan. This submission deadline will shift accordingly if any delays arise.

SUB-REQUIREMENT: CONDUCT CASE STUDY The dates for conducting the case study will be determined once the country selection is made through discussions with the TOCOR.

• Develop protocols, pretest, and conduct focus group interviews with teachers, TTI faculty, Ministry of Education officials, and parents/community members.
• Conduct classroom observations of reading and other subject lessons to collect data on language-use, as well as availability of LOI materials.

SUB-REQUIREMENT: CONDUCT AN ANALYSIS OF CASE STUDY FINDINGS

SUB-REQUIREMENT: PRODUCE REPORT CONTAINING FINDINGS AND RESULTS FROM THE CASE STUDY Standards and Deliverables:

• Research plan outlining the approach to undertaking this research.
• Research plan includes estimated timeframe and cost for each step.
• Research plan includes personnel with sufficient expertise.
• Cost factors are addressed as part of the country recommendations.
• Country recommendations to address process for obtaining permission to conduct field research.
• Report draft submitted within 45 days after completion of analysis.
• Final report submitted within 15 days after TOCOR provides comments/approval.

• Final report includes an executive summary.

Timing for the analysis and final report will be outlined in the research plan; the countries will need to be agreed upon with the TOCOR before work can commence, and that selection will determine the timeline.

RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED USAID has supported the development of several tools (e.g. Early Grade Reading Assessment and Snapshot of School Management Effectiveness) that have been successfully adapted and applied across numerous countries, generating useful data and animating purposeful policy dialogue.

A School-Related Gender-Based Violence (SRGBV) Measurement Framework was started under DERP and is being completed under REEP-A. This work will result in an SRGBV Toolkit, a set of tools designed for global use that will yield accurate and reliable data on SRGBV, which is not currently available through other research tools.

The retesting of the SRGBV instruments, and creation of the toolkit, to be completed in Year 3, is detailed below in Requirement 3.1. USAID expressed interest in utilizing the created SRGBV Toolkit for a data collection activity, as quickly as possible, following completion of the instrument retesting. This proposed data collection will fulfill a methods development activity (Requirement 3.4), and will also begin in Year 3 based on further discussions with the TOCOR.

An update of the Early Grade Reading Barometer for Sub-Saharan Africa data will also occur in Year 3, as a Result 3 deliverable (Requirement 3.2).

Lastly, as part of the extensive work stream on teachers and LOI in early grade reading, a tool will be developed to assess and evaluate teacher language and literacy, and their impact on early grade reading outcomes. This tool development will continue in Year 3, under a methods development activity (Requirement 3.3).

REQUIREMENT 3.1: DELIVER THE SRGBV MEASUREMENT FRAMEWORK (METHODS DEVELOPMENT) The development of an SRGBV measurement framework was started under DERP and advanced by AFR/SD/ED. The tool, comprising 12 survey modules, was developed and pilot tested in Uganda in March 2016. This initiative is in the process of being finalized under REEP-A, such that a fully functional and user-friendly suite of surveys and data collection protocols can be utilized by the global SRGBV community in the field. To that end, this activity includes the following Sub-Requirements for Year 3:

1. Produce (including translation into French) and disseminate the SRGBV Toolkit. (Sub-Requirement 3.1.4)
2. Conduct SRGBV data collection. (Requirement 3.4)

Regular meetings with select AFR/SD/ED team members have occurred to discuss each step in the process of creating the toolkit, and the prerequisite research needed to develop the toolkit. Further discussions will also cover the intended application and dissemination of the Toolkit, to ensure that there is a clear and collective understanding of how to best proceed with each Sub-Requirement.

SUB-REQUIREMENT 3.1.3: CONDUCT RE-TEST OF SRGBV SURVEY INSTRUMENTS

The draft report, including an executive summary, an overview of the field-testing methodology, an overview of statistical validation methodology, and recommended revisions following the field-test, was submitted in October 2018 for comments and approval by the TOCOR. Discussions on how to proceed with re-testing certain components within the toolkit and next steps are currently in progress. The final draft of the report will be submitted following finalization of these components.
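The statistical validation methodology itself is documented in the re-test report rather than reproduced here. Purely as an illustrative sketch, the example below shows one common internal-consistency check (Cronbach's alpha) of the kind a survey re-test analysis might include; the file name and column layout are hypothetical placeholders and do not represent the actual SRGBV instruments or analysis code.

```python
# Illustrative sketch only: an internal-consistency (reliability) check of the kind
# a survey re-test analysis might include. "srgbv_module_items.csv" and its column
# layout are hypothetical placeholders, not the actual SRGBV toolkit data.
import pandas as pd


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Compute Cronbach's alpha for a set of Likert-type item responses
    (one row per respondent, one column per item)."""
    items = items.dropna()
    k = items.shape[1]                               # number of items in the module
    item_variances = items.var(axis=0, ddof=1).sum() # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances / total_variance)


if __name__ == "__main__":
    responses = pd.read_csv("srgbv_module_items.csv")  # hypothetical input file
    print(f"Cronbach's alpha for this module: {cronbach_alpha(responses):.2f}")
```

In practice, a full re-test analysis would cover each survey module and would typically report additional statistics (for example, confirmatory factor analyses), per the methodology described in the report.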

The approved SRGBV Toolkit will be written in accessible, non-technical language, include a clear guide on how to use it, have coding instructions, and will be grammatically correct with no spelling or punctuation errors. The structure of the Toolkit will be based on sections outlined in the table of contents of the Conceptual Framework for Measuring School-Related Gender-Based Violence report, and will be determined in discussions with USAID.

SUB-REQUIREMENT 3.1.4: PRODUCE AND DISSEMINATE SRGBV-TOOLKIT Following approval of the final SRGBV re-test report, the SRGBV Toolkit will be produced and disseminated in Year 3 with an English, electronic version ready for CIES in April 2019. In addition to the production of the final Toolkit in English, the Toolkit will also be translated into French for broader application and dissemination. A minimum of 50 SRGBV Toolkits will be delivered to AFR/SD/ED as well as a minimum of two SRGBV Toolkits for each of the 18 Africa Missions.

To facilitate wide, yet targeted, distribution of the Toolkit, a proposed plan will be discussed and submitted to USAID in Year 3, articulating the steps required to effectively disseminate the SRGBV Toolkit. These steps will include strategies to promote usage of the Toolkit, including proposed trainings, dissemination across relevant websites, blogs, and platforms, and use of communication tools and social media to reach a broader audience. Although the dissemination plan will not be produced and utilized until the end of Year 3, discussions on how best to reach a wide audience and ensure the usefulness and accessibility of the Toolkit will begin immediately and will be incorporated into planning throughout the research process.

The production of the Toolkit is underway, with the English, electronic version to be completed for presentation at CIES (April 2019). The dissemination plan for the Toolkit, covering distribution beyond CIES 2019, will be determined in collaboration with USAID and will likely be submitted in April 2019 for TOCOR approval.

REQUIREMENT 3.2: UPDATE STATUS OF EARLY GRADE READING BAROMETER

SUB-REQUIREMENT 3.2.3: INCORPORATE AFRICA MISSION EARLY GRADE READING DATA INTO THE EARLY GRADE READING BAROMETER Following completion and approval of the summary document, Dexis will begin the process of integrating early grade reading data from sub-Saharan countries that have generated reading data into the Early Grade Reading Barometer online tool.

Standards and Deliverables:

• At least 1 African country data set is uploaded to the Barometer by the end of Year 3.
• A minimum of 3 African country data sets are uploaded by the end of the TO.

At least one African country-level data set will be included in the Barometer by the end of Year 3. Data from the Kenya Tusome project will be the first country dataset included in Year 3. Subsequent countries for data incorporation will be selected based on country priorities articulated by the Africa Bureau and in agreement between USAID and Dexis.
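As a purely illustrative sketch of the kind of data preparation this incorporation could involve, the example below aggregates a hypothetical pupil-level EGRA file into country-level summary indicators. The file and column names (e.g., "kenya_tusome_egra.csv", "oral_reading_fluency") are placeholders, and the actual Barometer ingestion format would follow the specification agreed with the Barometer's maintainers.

```python
# Illustrative sketch only: summarizing a raw, pupil-level EGRA dataset into
# country-level indicators of the kind an online dashboard might display.
# File and column names are hypothetical placeholders.
import pandas as pd

raw = pd.read_csv("kenya_tusome_egra.csv")  # one row per assessed pupil

summary = (
    raw.groupby(["country", "grade", "language"])
       .agg(
           pupils_assessed=("pupil_id", "count"),
           mean_orf=("oral_reading_fluency", "mean"),  # correct words per minute
           pct_zero_scores=("oral_reading_fluency", lambda s: (s == 0).mean() * 100),
       )
       .reset_index()
)

summary.to_csv("barometer_upload_summary.csv", index=False)
print(summary.head())
```

Any real upload would be prepared and reviewed against the Barometer's own data specification before incorporation.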

REQUIREMENT 3.3-3.4: DELIVER TOOLS FOR SYSTEMATIC DATA COLLECTION IN A SELECTED AREA (METHODS AND TOOL DEVELOPMENT) As previously addressed, AFR/SD/ED intends that REEP-A remain sufficiently flexible to allow it to respond to needs that emerge over time. Tool development typically requires primary research and travel to the field, and tool identification needs to be thorough. Avoiding duplication with other existing tools is essential, and some tools may be identified that simply need adaptation rather than development. To ensure that tool development topics are identified early enough for the contractor to plan adequately, the topic of "tool development" should be reviewed at every quarterly meeting between the TOCOR and Dexis.

To meet the requirement of the two tool/methods development activities, discussions between Dexis and the TOCOR have occurred. Both tool/methods development activities under REEP-A will occur in Year 3. One will be a data collection utilizing the SRGBV toolkit and the other will be an assessment of teacher language and literacy proficiency.

REQUIREMENT 3.3: DELIVER TOOLS FOR SYSTEMATIC DATA COLLECTION IN A SELECTED AREA (METHODS AND TOOL DEVELOPMENT): TEACHER LANGUAGE AND LITERACY ASSESSMENT There are many assumptions, but little concrete data and evidence, about teacher language and literacy ability, especially in multilingual contexts. There is an implicit expectation that teachers are able to speak and read fluently in the LOI—whether the LOI1 or the LOI2, or both. Education stakeholders would benefit from a closer examination of this and other assumptions about teacher use of language in the classroom.

This Requirement will involve developing and administering a teacher language and literacy assessment (TLLA) that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Working closely with local Ministry of Education officials, this tool will be developed and pilot tested, then administered to a sample of teachers in up to two countries.

This tool development is a component of the work stream on teachers and LOI in early grade reading, described in detail in Result 2.

SUB-REQUIREMENT 3.3.1: IDENTIFY AREA FOR TOOL DEVELOPMENT This tool development will be a language and literacy assessment for teachers. A research plan will be developed to articulate the approach to be taken to develop the teacher language and literacy assessment including cost and timeline. The research plan will be submitted in December 2018 for TOCOR comments and approval. Upon approval of the research plan, development of the tool will commence.

SUB-REQUIREMENT 3.3.2: DEVELOP AN EASY-TO-IMPLEMENT TOOL FOR SYSTEMATIC DATA COLLECTION The tool development will likely be completed and submitted to the TOCOR for comments in June 2019.

19 | REEP-A YEAR 3 ANNUAL WORKPLAN SUB-REQUIREMENT 3.3.3: CONDUCT PILOT OF TOOL FOR SYSTEMATIC DATA COLLECTION IN SELECTED AREA Timing for conducting the pilot, as well as a proposed location, will be outlined in the research plan and agreed upon and approved by the TOCOR prior to work commencing.

SUB-REQUIREMENT 3.3.4: PRODUCE AND DISSEMINATE TOOL FOR SYSTEMATIC DATA COLLECTION IN SELECTED AREA Timing for the production of the report and tool, as well as for dissemination, will also be outlined in the research plan based on discussions with the TOCOR.

REQUIREMENT 3.4: DELIVER TOOLS FOR SYSTEMATIC DATA COLLECTION IN A SELECTED AREA (METHODS AND TOOL DEVELOPMENT): SRGBV DATA COLLECTION ACTIVITY To further the aims of the SRGBV work stream, a data collection activity will be conducted using the finalized SRGBV Toolkit. This activity is currently being discussed in terms of timing, location and technical factors. USAID and Dexis will continue discussions to allow for the development of a research plan, to be submitted in Year 3.

SUB-REQUIREMENT 3.4.1: IDENTIFY AREA FOR TOOL DEVELOPMENT This tool development activity will be a data collection utilizing the SRGBV toolkit. A research plan will be developed outlining the steps, methodology and timeline for data collection.

SUB-REQUIREMENT 3.4.2: DEVELOP AN EASY-TO-IMPLEMENT TOOL FOR SYSTEMATIC DATA COLLECTION The SRGBV Toolkit will be used for data collection based on conversations with the TOCOR to determine the desired research questions and objectives, as well as the location for implementation.

SUB-REQUIREMENT 3.4.3: CONDUCT PILOT OF TOOL FOR SYSTEMATIC DATA COLLECTION IN SELECTED AREA The SRGBV data collection will begin in Year 3, depending on the timeline of SRGBV Toolkit finalization and based on further discussions with the TOCOR.

SUB-REQUIREMENT 3.4.4: PRODUCE AND DISSEMINATE TOOL FOR SYSTEMATIC DATA COLLECTION IN SELECTED AREA The final report on the data collection, and utilization of the Toolkit, will be dependent on the finalization of the Toolkit itself as well as the timeline for implementing the data collection in a chosen country, per the research plan.

PROJECT MANAGEMENT DELIVERABLES In addition to the research deliverables across the three Result areas, quarterly and annual reports will be submitted, with the following deadlines:

QUARTERLY AND ANNUAL REPORTS
• Year 3 Workplan Summary: Due 15 days from TOCOR approval of Workplan.
• Quarterly Reports: Due January 31, 2019; April 30, 2019; and July 31, 2019.
• Year 3 Annual Report: Due October 31, 2019.
• Year 4 Workplan: Due October 31, 2019.


SCHEDULE OF YEAR THREE ACTIVITIES

The schedule of activities is illustrated by (a) a matrix outlining completion dates for Requirements and Sub-Requirements and (b) a GANTT chart outlining the activity timelines.

ACTIVITY MATRIX

Requirements and Sub-Requirements Description and Deliverables Projected Completion Date

RESULT 1 Africa Missions Strategy Related Data Needs Met

1.1 Deliver country and regional education data trend snap shots

1.1.1 Identify Data and Format for Education Data Snapshots
1.1.2 Develop Regional and Country Education Data Snapshots
1.1.3 Disseminate Regional and Country Education Data Snapshots
Description and Deliverables: A standardized 1-2 page format that can be used for all countries across all Strategy goal areas. Proposed data will include EGRA results and number or percentage of out of school children, where available. Proposed data will be reliable and comparable to facilitate cross-country analyses. List of potential topics submitted to TOCOR.
Projected Completion Date: Four Education Data Snapshots are available for Year 3, due to rollover from Year 2. Timing TBD in discussion with TOCOR, following topic determination.

1.2 Deliver education data briefs

1.2.1 Develop education data briefs
Description and Deliverables: At request, two education data briefs in Year 3 will be provided within 24 hours of the request. Data citations will be included. One of these briefs is already determined. This will be a French translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV).
Projected Completion Date: One brief, TBD in discussion with TOCOR. French translation of SRGBV brief to be completed by January 2019.

1.3-1.7 Deliver mission support for evaluation, assessment, and data on selected priorities

1.3.1 Determine feasibility of mission-specific evaluation/assessment/data support (Huguka Dukore Akazi Kanoze evaluation)
1.4.1 Determine feasibility of mission-specific evaluation/assessment/data support (Soma Umenye evaluation)
Description and Deliverables: Written analysis of Mission support activity, which includes estimated total cost and estimated timeframe to complete, with a "yes" or "no" recommendation, and which identifies potential OCI.
Projected Completion Date: Completed (1.3.1 and 1.4.1).

1.3.2 Conduct mission-specific evaluation/data activity (Huguka Dukore Akazi Kanoze evaluation)
1.4.2 Conduct mission-specific evaluation/data activity (Soma Umenye evaluation)
Description and Deliverables: Research plan, which includes methodology or methodologies to be used and identifies role of USAID Mission, submitted within one (1) month following a "yes" determination.
Projected Completion Date: 1.3.2: Research plan completed; evaluation to be conducted in November 2018. 1.4.2: Research plan with Mission, awaiting comments; evaluation planned for April 2019.

1.3.3 Produce report on mission-specific evaluation/data activity and disseminate (Huguka Dukore evaluation)
1.4.3 Produce report on mission-specific evaluation/data activity and disseminate (Soma Umenye evaluation)
Description and Deliverables: Report including gender-based data and findings, an executive summary, and without spelling or punctuation errors. Two (2) hardcopy reports distributed to AFR/SD/ED as well as electronic copies provided in Word and PDF.
Projected Completion Date: 1.3.3: Draft report to be submitted by February 2019. 1.4.3: TBD based on timing of evaluation.

RESULT 2 Availability of Africa Education Data and Trends Expanded

2.5 Deliver Lessons Learned in Education Partnerships


2.5.1 Conduct Analysis of Education Partnerships
Description and Deliverables: Research plan, which includes methodology for search strategy, inclusion/exclusion criteria, analysis and synthesis. Research plan also includes time frame, personnel with sufficient expertise, and cost estimate.
Projected Completion Date: TBD in discussions with TOCOR.

2.5.2 Produce Report on Lessons Learned in Education Partnerships and Disseminate
Description and Deliverables: Report draft submitted within 45 days after completion of analysis. Final report submitted within 15 days after TOCOR provides comments/approval. Report includes an executive summary. Report includes gender-based data, learnings and analysis. Up to five (5) page companion brief on findings. One (1) presentation of findings to USAID Washington. Two (2) hardcopy reports distributed to each of 18 Africa Missions. One (1) dissemination plan for the research.
Projected Completion Date: TBD in discussions with TOCOR.

2.6 - 2.18 Develop analysis on selected research topics (TBD)

2.x.1 Conduct analysis
Description and Deliverables: One desk review for Year 3.
Projected Completion Date: TBD in discussion with TOCOR.

2.x.2 Produce report
Description and Deliverables: One desk review for Year 3.
Projected Completion Date: TBD in discussion with TOCOR.

2.x.3 Disseminate findings
Description and Deliverables: One desk review for Year 3.
Projected Completion Date: TBD in discussion with TOCOR.

2.19 Deliver Training Modules or Courses for Classroom, Online and/or Resource Materials

2.19.1 Deliver Training Modules or Courses for Classroom, Online and/or Resource Materials
Description and Deliverables: List of possible training topics, including a minimum of three possible topics. Each possible topic has a suggested method for delivery and timeframe for development.
Projected Completion Date: TBD in discussion with TOCOR.

2.19.2 Deliver Training Modules or Courses for Classroom, Online and/or Resource Materials
Description and Deliverables: Subject-specific training modules, courses and/or resource materials integrating effective portfolio management practices. Each training should utilize gender-sensitive methodology and include gender-based learning. For each selected topic, an outline and gender-equitable objectives will be provided, including a minimum of two evaluation components. A pre- and post-test for each classroom and online training will be developed. One (1) online pre-requisite training module for each classroom course. A minimum of one (1) trainer identified for each training module or course.
Projected Completion Date: TBD in discussion with TOCOR.

2.20 - 2.30 Deliver workshops

2.x.1 Identify topics for workshops
Description and Deliverables: Jointly determine preliminary topics for workshops based on prior research and a series of discussions with USAID. Outline and gender-sensitive objectives for the workshops.
Projected Completion Date: TBD in discussion with TOCOR.

2.x.2 Develop workshop materials
Description and Deliverables: Following final topic approval by USAID, initial development of workshop materials will begin.
Projected Completion Date: TBD in discussion with TOCOR.

2.x.3 Produce workshop materials
Projected Completion Date: TBD in discussion with TOCOR.

2.x.4 Conduct/facilitate workshop
Description and Deliverables: Workshops facilitated in a cost-effective and disability-accessible location, with experienced facilitators.
Projected Completion Date: TBD in discussion with TOCOR.

2.31 Deliver current research based on teacher knowledge, skills, and attitudes (KSA) related to language and literacy that influence early grade literacy outcomes

2.31.1 Produce research plan
Description and Deliverables: Research plan, which includes methodology for search strategy, inclusion/exclusion criteria, analysis and synthesis. Research plan also includes time frame, personnel with sufficient expertise, and cost estimate.
Projected Completion Date: Completed.

2.31.2 Conduct a literature review
Description and Deliverables: Literature review which examines the essential components of teacher linguistic and literacy skills, pedagogical knowledge base, and attitudes and beliefs for the effective teaching of literacy in multilingual contexts.
Projected Completion Date: Ongoing.

2.31.3 Produce report on findings
Description and Deliverables: Report including an executive summary.
Projected Completion Date: Draft report to be submitted to TOCOR by February 2019.

2.32 Develop relevant country-specific profiles of LOI policies, reading program approaches to LOI, and student reading outcomes

2.32.1 Develop a country profile for countries in sub-Saharan Africa where a USAID reading program is currently being implemented, or has been recently completed (up to 19 countries)
Description and Deliverables: Research plan outlining approach to research, estimated timeframe and cost for each step, and personnel with sufficient expertise. Country profiles which include: linguistic context; current LOI policy and brief history of LOI policy if recently changed or developed; pre-service teacher training with regard to literacy and SLA pedagogy and teacher proficiency in LOI1 and LOI2; consideration of teacher language proficiency and/or of language-specific training in teacher postings, if any; and an outline/description of the country's USAID reading program approach and ongoing LOI1 and LOI2 programs.
Projected Completion Date: Research plan to be submitted by November 2018. Country profiles to be completed by June 2019.

2.32.2 Develop dissemination plan for LOI country profiles
Description and Deliverables: One (1) tailored PowerPoint presentation on LOI country profile findings; a minimum of one (1) presentation on the findings; one (1) dissemination plan for LOI country profiles.
Projected Completion Date: To be submitted by June 2019.

2.33 Deliver analysis of relationship between teacher language proficiency and student reading performance

2.33.1 Develop short teacher interview questionnaire with questions about language ability
Description and Deliverables: Research plan outlining the approach, timing, and specific questions to be included in planned EGRAs. Research plan also includes estimated timeframe and cost for each step and personnel with sufficient expertise.
Projected Completion Date: To be submitted by January 2019.

2.33.2 Provide up to two (2) programs planning an EGRA with the teacher questionnaire to be administered at sampled schools
Description and Deliverables: Provision of teacher questionnaire to include in EGRAs planned for upcoming year.
Projected Completion Date: Currently being completed with ongoing EGRA in Uganda.

2.33.3 Conduct secondary analysis on data and compile report from up to two programs
Description and Deliverables: Analysis and report of data from up to two (2) programs if/when it becomes publicly available, including EGRA data, student language data, and teacher data. Report draft submitted within 45 days after completion of analysis. Final report submitted within 15 days of TOCOR comments/approval, including an executive summary and disaggregation by gender.
Projected Completion Date: TBD upon availability of EGRA data.

2.34 Case Study on LOI Implementation and language attitudes


2.34.1 Produce research plan using the country LOI profiles to select up to two countries in which to conduct a case study
Projected Completion Date: Research plan to be submitted March 2019, following the completion of the country profiles.

2.34.2 Conduct Case Study
Projected Completion Date: TBD based on discussions with TOCOR.

2.34.3 Conduct an analysis of case study findings
Projected Completion Date: TBD based on discussions with TOCOR.

2.34.4 Produce report containing findings and results from the case study
Projected Completion Date: TBD based on discussions with TOCOR.

RESULT 3 Measurement Tools with Applicability Across Countries Developed

3.1 Deliver the SRGBV measurement framework

3.1.3 Conduct re-test of SRGBV survey instruments
Description and Deliverables: Plan for testing the surveys for validity and reliability includes description of sampling procedures. Report includes an executive summary, overview of field-testing methodology, overview of statistical validation methodology, and recommended revisions following the field-test. Report provides the estimated time and cost for making revisions and is grammatically correct.
Projected Completion Date: Completed; discussion of further testing of some components underway.


3.1.4 Produce and disseminate SRGBV tool kit
Description and Deliverables: Toolkit includes clear guides on use along with coding instructions, and is translated into French. Toolkit is grammatically correct with no errors. A minimum of fifty (50) Toolkits delivered to AFR/SD/ED. A minimum of two (2) Toolkits, each, for 18 Africa Missions.
Projected Completion Date: Electronic, English version of Toolkit to be completed by April 2019. Final version of Toolkit TBD based on discussions with the TOCOR.

3.2 Update status of the early grade reading barometer

3.2.1 Determine if the necessary platform exists for the early grade reading barometer
Description and Deliverables: Assessment report describing current platform and optimal platform.
Projected Completion Date: Submitted October 2018.

3.2.2 Assess the value of the early grade reading barometer as a useful analytic tool
Description and Deliverables: Assessment report describing the optimal method for utilizing the barometer.
Projected Completion Date: Submitted October 2018.

3.2.3 Incorporate Africa mission early grade reading data into the early grade reading barometer
Description and Deliverables: At least one African country data set uploaded to the Barometer by the end of Year 3. A minimum of 3 African country data sets will be uploaded by the end of the TO.
Projected Completion Date: To be completed by June 2019 (at least one country, Tusome data in Kenya, to be incorporated).

3.3 Deliver tools for systematic data collection in a selected area


3.3.1 Identify area for tool development: Teacher Language and Literacy Assessment
Description and Deliverables: Concept paper including a comparison of each tool to the other based on how much time it would take to develop, costs of development, and the number of African countries that would likely benefit from its use. Concept paper includes at least two recommendations for tool development.
Projected Completion Date: Research plan submitted to TOCOR by December 2018.

3.3.2 Develop an easy to implement tool for systematic data collection
Description and Deliverables: A low-cost, easy to administer tool that can be used by USAID education personnel and partners to conduct their own data collection and analysis. Each instrument specifies the factor(s) it evaluates and has a companion guide and coding instructions.
Projected Completion Date: Tool development completed and submitted to TOCOR by June 2019.

3.3.3 Conduct pilot of tool for systematic data collection in selected area
Description and Deliverables: A pilot of the tool or instrument in at least one African country. The recommended country should be identified as conducive and optimal as the pilot target site. The final selection made in collaboration with TOCOR. A plan for conducting a pilot provided within three (3) months after the instrument has been developed and approved by IRB if required.
Projected Completion Date: Timing TBD based on discussions with TOCOR.

3.3.4 Produce and disseminate tool for systematic data collection in selected area
Description and Deliverables: Tool package translated into French. Tool package includes linkage to online tool package. A minimum of two (2) sets of tool package distributed to 17 Africa Missions (implementing Goal 1 projects).
Projected Completion Date: Timing TBD based on discussions with TOCOR.

3.4 Deliver tools for systematic data collection in a selected area


3.4.1 Identify area for tool development: SRGBV Phase 2 Data Collection
Description and Deliverables: Concept paper including a comparison of each tool to the other based on how much time it would take to develop, costs of development, and the number of African countries that would likely benefit from its use. Concept paper includes at least two recommendations for tool development.
Projected Completion Date: TBD following finalization of Toolkit.

3.4.2 Develop an easy to implement tool for systematic data collection
Description and Deliverables: A low-cost, easy to administer tool that can be used by USAID education personnel and partners to conduct their own data collection and analysis. Each instrument specifies the factor(s) it evaluates and has a companion guide and coding instructions.
Projected Completion Date: TBD following finalization of Toolkit.

3.4.3 Conduct pilot of tool for systematic data collection in selected area
Description and Deliverables: A pilot of the tool or instrument in at least one (1) African country. The recommended country should be identified as conducive and optimal as the pilot target site. The final selection made in collaboration with TOCOR. A plan for conducting a pilot provided within three (3) months after the instrument has been developed and approved by IRB if required.
Projected Completion Date: TBD following finalization of Toolkit.

3.4.4 Produce and disseminate tool for systematic data collection in selected area
Description and Deliverables: Tool package translated into French. Tool package includes linkage to online tool package. A minimum of two (2) sets of tool package distributed to 17 Africa Missions (implementing Goal 1 projects).
Projected Completion Date: TBD following finalization of Toolkit.


GANTT ACTIVITY CHART

[GANTT chart: activity timelines for Year 3, October 2018 through September 2019, covering CLIN 1 (Africa Missions Strategy-Related Data Needs Met), CLIN 2 (Availability of Africa Education Data and Trends Expanded), and CLIN 3 (Measurement Tools with Applicability Across Countries Developed). The chart rows correspond to the Requirements and Sub-Requirements in the Activity Matrix above; several activities (education data snapshots, education data briefs, lessons learned in education partnerships, selected research topics, training modules, workshops, and the two tool pilots) are timed per USAID request.]


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA Year Three Annual Workplan Summary

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Prepared for

USAID

Prepared by

Dexis Consulting Group

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024

December 2018

ACRONYMS

AFR/SD Africa Bureau’s Office of Sustainable Development

AFR/SD/ED Africa Bureau’s Office of Sustainable Development Education Team

CIES Comparative & International Education Society

EGRA Early Grade Reading Assessment

KSA Knowledge, Skills and Attitudes

LOI Language of Instruction

REEP-A Research for Effective Education Programming- Africa

SRGBV School-Related Gender-Based Violence

TBD To Be Determined

TLLA Teacher Language and Literacy Assessment

TO Task Order

TOCOR Task Order Contract Officer Representative

USAID United States Agency for International Development


OVERVIEW OF THE REEP-A PROGRAM

PURPOSE Research for Effective Education Programming – Africa (REEP-A) is a five-year project, beginning in October 2016, under the Education Team in the United States Agency for International Development (USAID) Africa Bureau’s Office of Sustainable Development (AFR/SD). The overarching aim of the REEP-A project is to provide the USAID Africa Bureau and Mission offices, as well as partner organizations, with concrete contributions to research on USAID education initiatives that will inform evidence-based investment, decision-making and the prioritization of needs. REEP-A will contribute to USAID’s education aims in Africa by providing technical and advisory services focused on research and capacity building, to enhance the quality and effectiveness of USAID activities. REEP-A will provide practical and action-oriented gender-sensitive research that will be applied to both project design and implementation. REEP-A aims to strengthen the relationships and links between project evaluation, design and implementation. In addition, REEP-A will contribute to the capacity development of education personnel and partners by enhancing the use of research in decision-making, mainstreaming the application of feedback-loops and lessons learned into project design and implementation and expanding the evidence and knowledge base on education initiatives.

USAID’s Education Strategy, issued in 2011, has promoted the rigorous use of evidence-based programming and set targets across the Strategy’s three education goals, specifically the measurement of access to education and early grade reading improvements, workforce development and education in crisis and conflict-affected environments. REEP-A will focus on the improvements in early grade literacy and equitable access to education, particularly in challenging environments (Goals 1 and 3).

REEP-A seeks to contribute evidence that responds to the question of why, despite some gains, progress toward enhanced early grade reading and access to quality education in conflict and crisis settings in Africa has been modest. Expanding the evidence base and data collection for education interventions in Africa is critical to spurring further progress, increasing the understanding of program effectiveness, identifying barriers and replicating successes.

RESEARCH CATEGORIES AND FRAMEWORK To achieve these research aims within Goals 1 and 3, REEP-A will provide technical assistance and advisory services across the three results articulated in the Education Analytic Framework:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

Building on the research activities undertaken in Year 1 and 2, activities in Year 3 will further strategic research across the three Results. Throughout REEP-A the bulk of the research activities will fall within Result 2, with a secondary focus on Result 3. While significantly fewer resources are devoted to Result 1 throughout the Task Order (TO), several activities under this Result took place in Year 2.

In keeping with the overall aim of REEP-A, Year 3 will focus on identifying, developing and delivering research with a gender-sensitive lens, and will highlight strategic topics that could contribute to improving the effectiveness of initiatives related to early grade reading and equitable access to education.

YEAR THREE ACTIVITIES Year 3 efforts for REEP-A will further these aims through research activities spanning the three Result areas.

The largest areas of focus for Year 3 will include continued research on School-Related Gender-Based Violence (SRGBV), within Result 3, and research on teachers and Language of Instruction (LOI), within Result 2.

In addition to these priorities, Year 3 activities under Result 1 will include education data briefs and Mission support for evaluation, assessment and data priorities. Further work under Result 2, in Year 3, will include research on to be determined (TBD) topics, the development of training materials and the delivery of workshops, through discussions with USAID. Under Result 3, work will also continue in Year 3 on incorporating data from African countries into the Early Grade Reading Barometer and the development of the Teacher Language and Literacy Assessment (TLLA).

RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET Activities under Result 1 will continue in Year 3, and will include elements across all the Sub- Requirements.

In discussion with the Task Order Contract Officer Representative (TOCOR), Dexis will deliver the remaining four Country and Regional Education Data Trend Snapshots that were not used in Year 2. Dexis will then prepare a minimum of five (5) 1-2 page gender-sensitive snapshots for the Africa Bureau’s Office of Sustainable Development Education Team (AFR/SD/ED) supported countries in Year 3. These snapshots will be used to inform USAID managers and decision-makers of the overall education status of a particular country, which may affect budget allocations and the Strategy goals a country could pursue.

Upon USAID request, up to two time-sensitive Education Data Briefs will be delivered in Year 3, with a response time of 24 hours. REEP-A produced a comprehensive education data brief that provided statistics and data points on the global prevalence of SRGBV. The data brief was approved on May 24, 2018. A French translation of the Education Data Brief: Global Prevalence of School-Related Gender- Based Violence (SRGBV) will be completed by January 2019. The timing and content of the remaining brief will be determined by USAID and communicated to Dexis through the TOCOR.

To fulfill the Mission Support for Evaluation, Assessment and Data on Selected Priorities requirement, two evaluations were selected in Rwanda. In discussion with the TOCOR and with USAID Missions, Dexis completed a brief Feasibility Determination report for two proposed evaluations, to determine if it was possible to undertake two performance evaluations in Rwanda under REEP-A. The evaluations include the flagship programs on early grade reading (Soma Umenye) and workforce/youth development (Huguka Dukore Akazi Kanoze). The evaluation for Huguka Dukore Akazi Kanoze is underway. The in-country fieldwork began in November 2018, and the final report will be submitted to AFR/SD/ED in February 2019. The evaluation for Soma Umenye is scheduled for April of 2019 to allow for program adjustments to take hold prior to the performance evaluation. The specific dates of this evaluation will be finalized in January 2019.

RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED The majority of REEP-A research activities for Year 3 will contribute to Result 2, as will the bulk of research activities throughout the TO. Result 2 activities span both primary and secondary (desk) research.

The primary focus of Year 3, under Result 2, will be in-depth research and analysis related to the language skills of teachers, and the corresponding impact this has on the acquisition of reading skills and the effectiveness of early grade reading programs. Several requirements in this workstream will occur in Year 3, with additional activities in later years of the TO.

The initial deliverable within the LOI workstream will be a framing report that sets the stage and will serve as a foundational document. The framing document will examine and compile available research, literature and case studies related to teacher knowledge, skills and attitudes (KSAs) on language and literacy, three key areas linked to early grade literacy outcomes. This framing analysis will provide a basis for the subsequent research activities. A draft of this document was shared with USAID in October 2018 for initial comments and is in the process of being finalized for approval in February 2019.

The second activity within the LOI workstream will address the issue of inaccessibility of government policies on LOI through the development of country-level profiles that will allow Missions, AFR/SD and partner organizations to quickly understand the approach taken by each program in light of the country’s linguistic and policy context. Currently, USAID is implementing Goal 1 reading programs in 19 countries in sub-Saharan Africa, with the potential of two additional countries. The exact countries will be determined in conjunction with USAID, and based on research conducted in the framing document, with up to 19 country profiles being produced depending on gaps and needs. To ensure this information is useful and widely used, a dissemination plan will be developed once the profiles are nearly completed in April 2019, and will be finalized and submitted for approval in June 2019.

To deliver an analysis of the relationship between teacher language proficiency and the LOI at school, REEP-A will conduct an activity requesting that up to two reading programs or external evaluators that are planning to conduct Early Grade Reading Assessments (EGRAs) in the coming years include a very brief teacher questionnaire in their suite of instruments. Implementers will also be asked to ensure that the student questionnaire includes the two language questions mentioned above. An analysis of the data gained from the EGRAs aims to provide insight into any trends or patterns in student performance based on the matching of proficiency levels between teachers and students. The research plan outlining the approach, timing and questions is underway. The draft research plan will be submitted to USAID in January 2019.

There are a few potential EGRAs that will occur in the upcoming year. Dexis will identify upcoming EGRAs and seek approval to include the developed questions related to teachers. Secondary analysis on EGRA data, student language data and teacher data will occur as soon as the data is obtained, following completion of the EGRA. Once the data is acquired, it will take approximately six months to clean, analyze and write up the findings.
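Purely as an illustrative sketch, the example below shows one way the secondary analysis could relate teachers' self-reported LOI proficiency to pupils' EGRA scores, disaggregated by pupil gender. The file names, column names, and proficiency categories are hypothetical placeholders and do not represent the project's actual instruments or analysis plan.

```python
# Illustrative sketch only: relating hypothetical teacher questionnaire responses
# to pupil-level EGRA scores, disaggregated by pupil gender. All file and column
# names are placeholders.
import pandas as pd

pupils = pd.read_csv("egra_pupil_scores.csv")        # pupil_id, school_id, gender, oral_reading_fluency
teachers = pd.read_csv("teacher_questionnaire.csv")  # school_id, teacher_loi_proficiency (e.g., low/medium/high)

# Link pupils to their school's teacher questionnaire responses.
merged = pupils.merge(teachers, on="school_id", how="inner")

# Mean fluency by teacher-reported LOI proficiency and pupil gender.
table = (
    merged.groupby(["teacher_loi_proficiency", "gender"])["oral_reading_fluency"]
          .agg(["count", "mean"])
          .round(1)
)
print(table)
```

The actual analysis would follow the approved research plan, including data cleaning, weighting where applicable, and the gender disaggregation required in the final report.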

Case studies on LOI implementation and language attitudes will be done in up to two countries beginning in Year 3. The case study will examine the effectiveness of the language curriculum and textbooks in preparing students for using a given language as the LOI, and how language is utilized in the classroom in both linguistically homogeneous and heterogeneous contexts. In the selected country, based on conversations with the TOCOR, researchers will conduct focus group interviews with teachers and other stakeholders, as well as conduct classroom observations, to learn more about how teacher preparation for instruction in the LOI, the process of teacher-school assignment, and attitudes and beliefs about language in education among teachers and other stakeholders impact early grade reading outcomes. The draft research plan will likely be submitted in March 2019, following the completion of the country profiles, as these will inform the research plan.

While the research on teacher and LOI is the primary focus of Result 2, several additional research activities will likely occur in Year 3. Specifically, in-country research on lessons learned from education partnerships will begin, as will a desk review which will be covered under the selected research topics requirements. In addition, a training module will be developed and delivered through an in-country workshop in Year 3.

USAID missions have engaged in partnerships with a range of development actors, including the Global Partnership for Education (GPE), other bilateral donors (e.g. the United Kingdom’s Department for International Development (DfID) in the Democratic Republic of the Congo (DRC)) and host governments to support education goals and objectives, to leverage greater resources, unite entities around specific education objectives and so forth. Due to the challenges associated with managing these relationships, research on education partnerships, specifically addressing the question, “What are the pros and cons associated with the types of partnerships most commonly initiated by USAID (i.e. G2G, GPE)?”, is to begin in Year 3. Dexis will describe and categorize partnerships that reflect the most current trend in Africa and identify benefits, barriers and any lessons learned. This investigation will require desk review and up to 3 in-country visits. Dexis and the TOCOR will determine if this research will be undertaken, and if so in which countries, in the early months of Year 3.

Selected research topics will be determined through discussion with the TOCOR based on the needs of AFR/SD/ED throughout the TO, with one of these topics to begin in Year 3.

The first tailored classroom and/or online course will be developed in Year 3 to transfer research-based knowledge about topics determined in conjunction with the TOCOR. The topic and form of materials will be determined by USAID. These modules, courses and resource materials should incorporate the most current evidence associated with the research activities conducted under REEP-A. Dexis shall collaborate with AFR/SD/ED to determine topics for the training materials and locations for the workshops.

RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED A primary research focus for this Result in Year 3 will be the creation of the SRGBV Toolkit.

The draft report of the SRGBV Toolkit was submitted in October 2018 for comments and approval by the TOCOR. Discussions on how to proceed with re-testing certain components within the Toolkit and next steps are currently in progress. The final draft of the report will be submitted following finalization of these components.

Following approval of the final SRGBV re-test report, the SRGBV Toolkit will be produced and disseminated in Year 3 with an English, electronic version ready for CIES in April 2019. In addition to the production of the final Toolkit in English, the Toolkit will also be translated into French for broader application and dissemination. To facilitate wide, yet targeted, distribution of the Toolkit, a proposed plan will be discussed and submitted to USAID in Year 3, and will articulate the steps required to effectively disseminate the SRGBV Toolkit. These steps will include strategies to promote usage of the Toolkit including proposed training and dissemination strategies to ensure coverage across relevant websites, blogs and platforms, and will utilize communication tools and social media outlets to reach a broader audience.

An update of the Early Grade Reading Barometer for Sub-Saharan African data will also occur in Year 3. At least one African country data set will be included in the Barometer by the end of Year 3. Data from the Kenya Tusome project will be the first country dataset included in Year 3. The remaining countries for data incorporation will be based on country priorities articulated by the Africa Bureau, and in agreement between USAID and Dexis.

Lastly, as part of the LOI workstream, an assessment tool will be developed for assessing and evaluating teacher language and literacy, and its impact on early grade reading outcomes. This Requirement will involve developing and administering a TLLA that will, at minimum, assess teachers’ oral language proficiency, reading and writing ability in the relevant LOI, in addition to including an interview protocol related to language attitudes and beliefs. A research plan will be developed to articulate the approach to be taken to develop the teacher language and literacy assessment, including cost and timeline. The research plan will be submitted in December 2018 for TOCOR comments and approval. Upon approval of the research plan, development of the tool will commence. The tool development will likely be completed and submitted to the TOCOR for comments in June 2019. Working closely with local Ministry of Education officials, this tool will be developed and pilot tested, then administered to a sample of teachers in up to two countries.


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA (REEP-A) YEAR 3, QUARTER 1 REPORT

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Research For Effective Education Programming-Africa (REEP-A) YEAR 3, QUARTER 1 REPORT

October 1, 2018 - December 31, 2018

January 31, 2019

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024

ACRONYMS

AFR/SD Africa Bureau’s Office of Sustainable Development

AFR/SD/ED Africa Bureau’s Office of Sustainable Development Education Team

AREW Africa Regional Education Workshop

DERP Data for Education Research and Programming in Africa

EGRA Early Grade Reading Assessment

KSA Knowledge, Skills, and Attitudes

LOI Language of Instruction

LOP Life of Project

RDCS Regional Development Cooperation Strategy

REEP-A Research for Effective Education Programming – Africa

RTI Research Triangle Institute

SRGBV School-Related Gender-Based Violence

TBD To be Determined

TLLA Teacher Language and Literacy Assessment

TO Task Order

TOCOR Task Order Contract Officer Representative

USAID United States Agency for International Development

CONTENTS

EXECUTIVE SUMMARY 1
OVERVIEW OF THE REEP-A PROGRAM 2
BACKGROUND 2
YEAR THREE, QUARTER 1 ACTIVITIES 3
PROGRAMMING AND PROJECT SUPPORT 3
TECHNICAL ACTIVITIES 4
PROGRESS TOWARD OBJECTIVES (BY RESULT) 5
GENERAL TECHNICAL ACTIVITIES 5
RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET 6
RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED 7
RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED 8
SUCCESSES, CHALLENGES AND LESSONS LEARNED 10
OTHER DELIVERABLES 10
PROJECT MANAGEMENT DELIVERABLES 10


EXECUTIVE SUMMARY The Research for Effective Education Programming – Africa (REEP-A) Task Order (TO), a contract between the U.S. Agency for International Development (USAID) and Dexis Consulting Group (Dexis), was awarded with a total amount of $7,949,966.00 on September 29, 2016 with a period of performance of five years, ending on September 28, 2021.

The main objective of REEP-A is to generate and effectively disseminate Africa regional and country specific education data, analysis, and research, to ensure the availability of evidence-based interventions that inform the prioritization of needs and education investment decisions. REEP-A specifically focuses on the measurement of equitable access to education and early grade reading improvements. REEP-A activities support USAID’s 2011 Education Strategy, which has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals. The goals set targets for 2015 for early grade reading, workforce development, and education in crisis and conflict-affected environments:

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

To meet these goals, activities are designed to emphasize the following three Results:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This Quarterly Report covers Quarter 1 of FY2019, from October 1, 2018 through December 31, 2018. REEP-A activities during this period included project administration, planning, and technical activities. The quarterly financial report ending on December 31, 2018 (Q1 FY2019) is attached. The REEP-A Program Manager is Stephanie Squires and the Task Order Contract Officer Representative (TOCOR) is Megnote Lezhnev.

OVERVIEW OF THE REEP-A PROGRAM

BACKGROUND The overarching aim of the REEP-A project is to provide the USAID Africa Bureau, overseas Missions, and partner organizations, with concrete research contributions on USAID education initiatives to inform evidence-based investment, decision-making, and the prioritization of needs. REEP-A will contribute to USAID’s education aims in Africa by providing technical and advisory services focused on research and capacity building that intend to enhance the quality and effectiveness of USAID activities. REEP-A will provide practical and action-oriented gender-sensitive research that will be applied to both project design and implementation. REEP-A aims to strengthen the relationships and links between project evaluation, design, and implementation. In addition, REEP-A will contribute to the capacity development of education personnel and partners by enhancing the use of research in decision-making, mainstreaming the application of feedback-loops and lessons learned into project design and implementation, and expanding the evidence and knowledge base on education initiatives.

USAID’s Education Strategy, issued in 2011, has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals, with a particular emphasis on the measurement of access to education and early grade reading improvements. The education goals set targets for 2015, which centered on early grade reading, workforce development, and education in crisis and conflict-affected environments. REEP-A will specifically focus on improvement in early grade reading and the measurement of equitable access to education in crisis and conflict environments (Goals 1 and 3):

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

Following the release of the Strategy, the Education Team of the USAID Africa Bureau’s Office of Sustainable Development (AFR/SD/ED) identified the need for targeted research and training services to complement the efforts of Africa Missions to effectively align their education portfolios with the three goal areas of the Strategy. The Data for Education Research and Programming (DERP) in Africa project, implemented between 2011 and 2016, was established to respond to these identified needs, and provided AFR/SD/ED with timely research and tools that Missions could apply to their ongoing and future education projects. REEP-A will expand on the progress and evidence base established by DERP and will target identified gaps and challenges.

REEP-A seeks to contribute evidence that responds to the question of why, despite some gains, progress toward enhanced early grade reading and access to quality education in conflict and crisis settings in Africa has been modest. Expanding the evidence base and data collection for education interventions in Africa is critical to spurring further progress, increasing the understanding of program effectiveness, identifying barriers, and replicating successes.

YEAR THREE, QUARTER ONE ACTIVITIES

PROGRAMMING AND PROJECT SUPPORT Under Result 1, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, for Year 3 by beginning the process of onboarding a translation services company to translate the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) into French. The English version of the document was submitted in May 2018 and it was determined that for wider usage and dissemination it would be valuable to translate the document to French. The finalized translation will be completed for submission by February 1, 2019.

Additionally, planning for two activities under Requirement Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result I, Requirements 1.3-1.7) continued throughout the first quarter in Year Three. The activities consist of supporting the Rwanda Mission through the implementation of performance evaluations for two of their existing education projects. The Huguka Dukore Akazi Kanoze performance evaluation was implemented in Rwanda in late November/early December 2018. The evaluation team has been working collaboratively with the Rwanda education team at the Mission to share initial findings and analysis following the in-country work. The final research plan for the Soma Umenye Performance Evaluation for the flagship early grade reading program was submitted and approved on November 29, 2018. In Quarter 2 further planning for the evaluation will continue including recruitment and onboarding of technical consultants for the evaluation team.

Under Result 2, two versions of the Africa Regional Education Workshop (AREW) Report (Requirement 2.20) were submitted on October 20, 2018, and approved on October 22, 2018, providing a summary of the events and lessons learned for use in future planning.

Additionally, the Language of Instruction (LOI) work stream continued to constitute a major area of work. To meet the objectives of the first activity in this research stream, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes (KSA) Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), a draft of the report was submitted on October 9, 2018. Work continues on the report through revisions and the onboarding of a graphic designer to produce a high-quality, polished report for wide dissemination.

Furthermore, REEP-A continued ongoing work on Result 2, Requirement 2.32, Develop Relevant Country-Specific Profiles of Language of Instruction (LOI) Policies, Reading Program Approaches to LOI, and Student Reading Outcomes. The consultants are compiling information on a number of areas across the 19 countries for inclusion in the profiles. A draft profile from one of the countries will serve as a template to ensure consistency across all profiles. This draft profile will be submitted in Quarter 2 for discussion and agreement with the activity manager and TOCOR.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), included the submission and approval of questions to be added to the Early Grade Reading Assessment (EGRA). The questions are for teachers to gauge their own ability in the language they are teaching; they were submitted and approved on September 27, 2018, and included in an EGRA implemented in Uganda in mid-October 2018.

The continuation of work on the SRGBV Toolkit (Result 3, Requirement 3.1) remained a primary focus within Result 3 for Quarter 1. Efforts consisted of data analysis from the fieldwork for the pilot in Malawi and progress on the Toolkit’s development. The draft Malawi Pilot Report was submitted for review and comments to USAID on September 28, 2018. Additional analysis and discussion on how to address some of the issues with the instruments continued throughout Quarter 1.

Work toward Requirement 3.2, Update Status of Early Grade Reading Barometer, progressed during this Quarter with the submission of an Early Grade Reading Barometer Assessment Report on October 31, 2018, and its approval on November 13, 2018. This assessment report covered Sub-Requirements 3.2.1 and 3.2.2 to allow for the initiation of work on incorporating country data into the Early Grade Reading Barometer. Furthermore, to assist USAID’s Africa Bureau in its aim to expand the number of EGRA datasets from sub-Saharan Africa, the assessment report aimed to determine whether the current Early Grade Reading Barometer platform, an online tool hosted by USAID’s Asia Bureau, provides the appropriate venue to incorporate additional datasets. Findings determined that the necessary platform for the Barometer already exists and is well operated, and it was therefore recommended that the Africa Bureau utilize the existing Early Grade Reading Barometer to incorporate datasets. The assessment report lays the groundwork to fulfill the associated Sub-Requirement 3.2.3, Incorporate African Mission Early Grade Reading Data into the Early Grade Reading Barometer.

Additionally, overall program management remains a significant portion of the TO, ensuring effective support of the technical activities within REEP-A. Administrative support and facilitation of discussions between the technical and management teams at Dexis, the subcontractor, Research Triangle Institute (RTI), and USAID have occurred throughout Quarter 1 of Year 3 and will continue to be a major management focus throughout the project’s duration.

TECHNICAL ACTIVITIES

REEP-A supports the Africa Bureau’s Office of Sustainable Development (AFR/SD) Regional Development Cooperation Strategy (RDCS) through the generation of regional, country-specific, and gender-focused data, analysis, and research that will assist in identifying and prioritizing needs, promoting rigorous research to inform evidence-based decisions, and promoting gender-equitable education investments within Africa.

To achieve these aims, technical assistance and advisory services under REEP-A fall across the three Results articulated in the Education Analytic Framework:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This distribution of activities across the three Results is reflected throughout the five years of the TO. Year 3 continues to provide strategic research across the three Results, and builds on the foundational research and analysis that has occurred over the last two years. Year 3 remains focused on implementing and continuing research begun thus far, as well as initiating new research activities. In discussion with USAID, including during Quarterly meetings, future research topics will also be discussed and refined, to fulfill undetermined requirements and to ensure relevancy and alignment with emerging areas of interest being pursued under the TO. Research activities in Year 3 and subsequent years will prioritize and integrate gender-focused approaches, as done throughout Year 2.

In close consultation with AFR/SD/ED, all research activities, including technical plans, will utilize gender-sensitive methodologies and evaluation questions to improve understanding of the barriers to improving gender equity in education and learning. Recognizing the importance of gender considerations as a key component in achieving development aims, research activities within REEP-A will draw upon available sex-disaggregated data and other gender-sensitive tools to clearly articulate existing gender disparities and identify knowledge gaps. This research will provide a thorough analysis of gender issues at both the individual and institutional levels, including the interplay of contextual and political factors. In addition to research approaches, a gender lens will be adopted in the other analytic and capacity building services under REEP-A. This gender-sensitive approach includes the engagement of men and boys in conversations around issues of gender equity, which is central to achieving equality both in education and the workforce. The identification of both gender-based disparities and continued barriers to equity will be driving aims of the research and recommendations, and will shape research questions and topics throughout the TO.

PROGRESS TOWARD OBJECTIVES (BY RESULT)

GENERAL TECHNICAL ACTIVITIES

Work on activities within Result 1 continued throughout the first Quarter of Year 3. Planning for the two Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3 and 1.4) activities in support of the Rwanda Mission continued in Quarter 1 of this year. The two performance evaluations, for Soma Umenye, an early grade reading initiative, and Huguka Dukore Akazi Kanoze, a youth and workforce development project, progressed with planning and implementation. The final Soma Umenye research plan was submitted and approved on November 29, 2018. The Huguka Dukore Akazi Kanoze evaluation field work took place in November/December 2018; preliminary data analysis and report writing are underway, with an initial report draft to be submitted in February 2019. Following the submission of the initial Huguka Dukore Akazi Kanoze evaluation report in Quarter 2, planning will begin for the Soma Umenye evaluation.

Within Result 2, two versions of the Africa Regional Education Workshop (AREW) (Result 2, Requirement 2.20) Report were submitted on October 20, 2018, and approved October 22, 2018.

Progress continued on the first activity in the LOI work stream under Result 2, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), with a draft of the report submitted on October 9, 2018. A graphic designer was identified to develop the overall design of the report to ensure a high-quality, polished final deliverable. The approval for onboarding the graphic designer will be submitted in Quarter 2 to allow this work to commence.

Within this work stream, work on Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, continued. The consultants have been compiling information on a number of areas across the 19 countries for inclusion in the profiles. A draft profile from one of the countries will serve as a template to ensure consistency across all profiles; it will be submitted in Quarter 2 for discussion and agreement with the activity manager and TOCOR.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), included the submission and approval of questions to be added to the Early Grade Reading Assessment (EGRA). The questions are for teachers to gauge their own ability in the language they are teaching; they were submitted and approved on September 27, 2018, and included in an EGRA implemented in Uganda in mid-October 2018.

During Quarter 1 of Year 3, work continued on the conceptual foundation for the structure and constructs to be recommended for inclusion in the Teacher Language and Literacy Assessment (TLLA) activity (Result 2, Requirement 2.35). The methodology and approach for this tool will be outlined in a Research Plan to be submitted in Quarter 2 of Year 3.

Work toward the production of the SRGBV Toolkit (Result 3, Requirement 3.1) continued in this Quarter. The draft Malawi Pilot Report was submitted, and additional analyses and discussions are ongoing about finalizing the survey instruments. Additionally, two initial sections that have been designed and stylized within the agreed-upon layout framework are anticipated for submission in Quarter 2.

Lastly, work toward Requirement 3.2, Update Status of Early Grade Reading Barometer, progressed during this Quarter with the submission of an Early Grade Reading Barometer Assessment Report on October 31, 2018, and its approval on November 13, 2018.

RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET

The aim of Result 1 is to provide Missions with data and research support that meets needs and facilitates understanding of the status of education in specific countries. Result 1 includes three main activity categories, spread across seven requirements:

• Deliver Country and Regional Education Data Trend Snap Shots (Requirement 1.1)
• Deliver Education Data Briefs (Requirement 1.2)
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3-1.7)

Year 3 Quarter 1 activities under Result 1 included Deliver Education Data Briefs (Requirement 1.2) and Deliver Mission Support for Evaluation, Assessment and Data on Selected Priorities (Requirements 1.3 and 1.4).

The process of onboarding a translation services company to translate the Education Data Brief from English to French started in Quarter 1. The finalized translation will be completed in Quarter 2 and submitted by February 1, 2019.

Work continued on the performance evaluations for two of the USAID Rwanda Mission’s flagship education programs. Huguka Dukore Akazi Kanoze is the Rwanda Mission's flagship workforce development and youth activity (12/2016-12/2021). The purpose of the performance evaluation carried out this Quarter was to assess whether the program is effectively investing in youth to achieve the stated objectives. Two consultants and two Dexis staff conducted this evaluation in Rwanda, and data analysis and the development of initial report findings will take place in Quarter 2. Soma Umenye is the Rwanda Mission’s flagship reading activity (7/2016-7/2021) and aims to improve the reading skills of one million public school learners nationwide, in grades 1-3. A performance evaluation was determined to be an important tool to assess and learn from the initial year of the project, and to guide appropriate adjustments as the program progresses. The Soma Umenye Performance Evaluation Research Plan was submitted and approved on November 29, 2018, and recruitment of consultants for this evaluation is anticipated to start in Quarter 2.

Result 1 Next Steps:

• Additional Country and Regional Education Data Trends Snap Shots (Requirement 1.1) are allocated to occur in Year 3, and can be conducted at the request of AFR/SD/ED.
• Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3-1.4):
  ◦ Data analysis and report writing from the Huguka Dukore Akazi Kanoze performance evaluation (Requirement 1.3).
  ◦ Recruitment and onboarding of consultants for the evaluation team for the Soma Umenye evaluation (Requirement 1.4).

RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED

The majority of REEP-A research activities for Year 3 will contribute to Result 2, as will the bulk of research activities throughout the TO.

The objectives of Result 2 are to identify or generate knowledge about evidence-based education programming related to early grade reading and access to education, and to ensure that relevant knowledge and research products are available to USAID education personnel through cost-effective dissemination methods, as well as to provide trainings and workshops.

Requirements for Result 2 throughout the TO primarily consist of desk research, primary research, capacity building, and ad hoc reports. Result 2 activities in Year 3 include:

• Deliver Workshops (Requirement 2.20-2.30)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31)
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32)
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33)
• Development of the Teacher Language and Literacy Assessment (Requirement 2.35)

The final Africa Regional Education Workshop (AREW) report was submitted in Quarter 1 of Year 3 following final guidance from the TOCOR on this deliverable. The final two versions of the AREW Report (Result 2, Requirement 2.20) were submitted on October 20, 2018, and approved on October 22, 2018.

Work under the LOI work stream continued on the first requirement, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31), in Quarter 1 of Year 3. This Requirement is designed to serve as a framing document for the entire work stream and will pose future research questions and identify gaps in the existing knowledge base. This framework report draft was submitted on October 9, 2018.

Under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, Dexis will develop up to 19 country profiles for sub-Saharan African countries currently implementing Goal 1 reading programs. The research plan outlining the information to be included in the profiles was submitted on November 12, 2018, and approved on December 7, 2018. Technical work continued in Quarter 1 through the collection of the necessary information and the development of the structure of the profiles.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), included the submission and approval of questions to be added to the EGRA. The questions were developed for teachers to gauge their own ability in the language they are teaching; they were submitted and approved on September 27, 2018, and included in an EGRA implemented by RTI in Uganda in mid-October 2018.

Requirement 2.35, Teacher Language and Literacy Assessment, involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis developed an initial research plan to map the way forward for this activity. This research plan is anticipated to be submitted in Quarter 2 Year 3.

Result 2 Next Steps:

• Continue technical work and onboard graphic designer for Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31).
• Continue work on Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32).
• Continue work on Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33).
• Development of the Teacher Language and Literacy Assessment and submission of research plan (Requirement 2.35).

RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED

USAID has supported the development of several tools (e.g., the Early Grade Reading Assessment and the Snapshot of School Management Effectiveness) that have been successfully adapted and applied in numerous countries, generating useful data and animating purposeful policy dialogue. The subsequent years of the project will explore the further development of relevant and applicable tools, as there is high demand for low-cost, easily implemented tools for systematic data collection, analysis, and policy dialogue.

The research priority for Result 3 in Year 2 was to continue work toward the Delivery of the SRGBV Measurement Framework (Requirement 3.1). An SRGBV Measurement Framework was initiated under DERP, and advanced by AFR/SD/ED. It is the intention of AFR/SD/ED that this initiative be finalized under REEP-A, such that a fully functional and user-friendly suite of surveys and data collection protocols can be widely utilized by the SRGBV community in the field. These efforts have been continued in Quarter 1 with continued work on the SRGBV Toolkit that is being designed for global use.

The data cleaning, processing, analysis, and report writing for the SRGBV instrument pilot carried over from Quarter 4 of Year 2 into Quarter 1 of Year 3. The continuation of work on the SRGBV Toolkit (Result 3, Requirement 3.1) remained a primary focus for Quarter 1. The Malawi Pilot draft report was submitted on September 28, 2018. Following this submission, discussions occurred around some of the issues flagged in the report, leading to additional analyses and discussions around how to correct these and proceed with next steps. Initial sections of the Toolkit were shared with USAID during this Quarter to select a design style for the Toolkit. The design of each section continues, with regular submissions to USAID for review and comments.

Work on Requirement 3.2 included submission of an Early Grade Reading Barometer Assessment Report on October 31, 2018, and its approval on November 13, 2018. The assessment report aimed to determine whether the current Early Grade Reading Barometer platform, an online tool hosted by USAID’s Asia Bureau, provides the appropriate venue to incorporate additional datasets. Findings determined that the necessary platform for the Barometer already exists and is well operated, and it is therefore recommended that the Africa Bureau utilize the existing platform to incorporate datasets. The assessment report allows the next steps to be taken toward fulfilling the remaining Sub-Requirement 3.2.3, Incorporate African Mission Early Grade Reading Data into the Early Grade Reading Barometer.

Result 3 Next Steps:

• Continue work on the SRGBV Toolkit (Requirement 3.1), namely:
  ◦ Next steps for the SRGBV instrument pilot will take place, such as the cleaning, processing, and analysis of the data, which will include psychometric analyses of the test items.
  ◦ The final report, including the analyses, summary of findings, and suggested next steps, will be submitted in the next quarter, following ongoing discussions with USAID.
  ◦ Continue progress toward conceptualization and design of the SRGBV Toolkit.
• Continue work on incorporating country data from the Kenya (Tusome) project into the Early Grade Reading Barometer (Requirement 3.2).

Table 1: Status of Technical Deliverables for Quarter 1, by Result

Result 1: Africa Missions strategy-related data needs met
• Deliver Country and Regional Education Data Trend Snap-Shots (Requirement 1.1): Upcoming (timing TBD)
• Deliver Education Data Briefs (1.2): In progress
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (1.3-1.7): In progress

Result 2: Availability of Africa education data and trends expanded
• Develop Early Grade Reading Research Training Modules or Courses for Classroom, Online, and/or Resources Materials (2.19.2): Upcoming (timing TBD)
• Deliver Workshops (2.20-2.30): Upcoming (timing TBD)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (2.31): In progress
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (2.32): In progress
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading (2.33): In progress
• Development of the Teacher Language and Literacy Assessment (2.35): In progress

Result 3: Measurement tools with applicability across countries developed
• Deliver the SRGBV Measurement Framework: Conduct Case Study Using SRGBV Toolkit (3.1): In progress
• Update Status of Early Grade Reading Barometer (3.2): Upcoming

SUCCESSES, CHALLENGES AND LESSONS LEARNED

There have been some challenges surrounding communication and understanding of tasks and timelines on various activities. This has been discussed with the TOCOR and addressed by creating additional updates and a shared timeline with clear action items, deliverables, and dates in one central place.

OTHER DELIVERABLES

In the current period, Dexis also submitted the Year 2 Annual Report and the Year 3 Workplan on October 31, 2018. The Year 3 Workplan was approved on November 29, 2018, and the Year 3 Workplan Summary was submitted and approved on December 14, 2018.

PROJECT MANAGEMENT DELIVERABLES

Next Quarter (Year 3, Quarter 2):

• Dexis will submit the Year 3, Quarter 2 report on April 30, 2019.


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA (REEP-A) YEAR 3, QUARTER 2 REPORT

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Research For Effective Education Programming-Africa (REEP-A) YEAR 3, QUARTER 2 REPORT

January 1, 2019 - March 31, 2019

April 30, 2019

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024


ACRONYMS

AFR/SD Africa Bureau’s Office of Sustainable Development

AFR/SD/ED Africa Bureau’s Office of Sustainable Development Education Team

AREW Africa Regional Education Workshop

DERP Data for Education Research and Programming in Africa

EGRA Early Grade Reading Assessment

KSA Knowledge, Skills, and Attitudes

LOI Language of Instruction

RDCS Regional Development Cooperation Strategy

REEP-A Research for Effective Education Programming – Africa

RTI Research Triangle Institute

SRGBV School-Related Gender-Based Violence

TBD To be Determined

TLLA Teacher Language and Literacy Assessment

TO Task Order

TOCOR Task Order Contract Officer Representative

USAID United States Agency for International Development

CONTENTS

EXECUTIVE SUMMARY 1
OVERVIEW OF THE REEP-A PROGRAM 2
BACKGROUND 2
YEAR THREE, QUARTER TWO ACTIVITIES 3
PROGRAMMING AND PROJECT SUPPORT 3
TECHNICAL ACTIVITIES 4
PROGRESS TOWARD OBJECTIVES (BY RESULT) 5
GENERAL TECHNICAL ACTIVITIES 5
RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET 6
RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED 7
RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED 9
SUCCESSES, CHALLENGES AND LESSONS LEARNED 10
OTHER DELIVERABLES 11
PROJECT MANAGEMENT DELIVERABLES 11

EXECUTIVE SUMMARY

The Research for Effective Education Programming – Africa (REEP-A) Task Order (TO), a contract between the U.S. Agency for International Development (USAID) and Dexis Consulting Group (Dexis), was awarded with a total amount of $7,949,966.00 on September 29, 2016, with a period of performance of five years, ending on September 28, 2021.

The main objective of REEP-A is to generate and effectively disseminate Africa regional and country specific education data, analysis, and research, to ensure the availability of evidence-based interventions that inform the prioritization of needs and education investment decisions. REEP-A specifically focuses on the measurement of equitable access to education and early grade reading improvements. REEP-A activities support USAID’s 2011 Education Strategy, which has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals. The goals set targets for 2015 for early grade reading, workforce development, and education in crisis and conflict-affected environments:

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

To meet these goals, activities are designed to emphasize the following three Results:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This Quarterly Report covers Quarter 2 of FY2019, from January 1, 2019 through March 31, 2019. REEP-A activities during this period included project administration, planning, and technical activities. The quarterly financial report for the period ending on March 31, 2019 (Q2 FY2019) is attached. The REEP-A Program Manager is Stephanie Squires and the Task Order Contract Officer Representative (TOCOR) is Megnote Lezhnev.

OVERVIEW OF THE REEP-A PROGRAM

BACKGROUND

The overarching aim of the REEP-A project is to provide the USAID Africa Bureau, overseas Missions, and partner organizations with concrete research contributions on USAID education initiatives to inform evidence-based investment, decision-making, and the prioritization of needs. REEP-A will contribute to USAID’s education aims in Africa by providing technical and advisory services focused on research and capacity building that intend to enhance the quality and effectiveness of USAID activities. REEP-A will provide practical and action-oriented gender-sensitive research that will be applied to both project design and implementation. REEP-A aims to strengthen the relationships and links between project evaluation, design, and implementation. In addition, REEP-A will contribute to the capacity development of education personnel and partners by enhancing the use of research in decision-making, mainstreaming the application of feedback loops and lessons learned into project design and implementation, and expanding the evidence and knowledge base on education initiatives.

USAID’s Education Strategy, issued in 2011, has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals, with a particular emphasis on the measurement of access to education and early grade reading improvements. The education goals set targets for 2015, which centered on early grade reading, workforce development, and education in crisis and conflict-affected environments. REEP-A will specifically focus on improvement in early grade reading and the measurement of equitable access to education in crisis and conflict environments (Goals 1 and 3):

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

Following the release of the Strategy, the Education Team of the USAID Africa Bureau’s Office of Sustainable Development (AFR/SD/ED) identified the need for targeted research and training services to complement the efforts of Africa Missions to effectively align their education portfolios with the three goal areas of the Strategy. The Data for Education Research and Programming in Africa (DERP) project, implemented between 2011 and 2016, was established to respond to these identified needs, and provided AFR/SD/ED with timely research and tools that Missions could apply to their ongoing and future education projects. REEP-A will expand on the progress and evidence base established by DERP and will target identified gaps and challenges.

REEP-A seeks to contribute evidence that responds to the question of why, despite some gains, progress toward enhanced early grade reading and access to quality education in conflict and crisis settings in Africa has been modest. Expanding the evidence base and data collection for education interventions in Africa is critical to spurring further progress, increasing the understanding of program effectiveness, identifying barriers, and replicating successes.

YEAR THREE, QUARTER TWO ACTIVITIES

PROGRAMMING AND PROJECT SUPPORT

Under Result 1, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, for Year 3 by completing a translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) into French. The English version of the document was submitted in May 2018, and it was determined that a French translation would support wider usage and dissemination. The finalized translation was submitted on February 4, 2019 and approved on March 7, 2019.

Additionally, planning and implementation for two activities under Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7) continued throughout the second quarter of Year 3. The activities consist of supporting the Rwanda Mission through the implementation of performance evaluations for two of its existing education projects. The Huguka Dukore Akazi Kanoze performance evaluation was implemented in Rwanda at the end of Quarter 1. The evaluation team has worked collaboratively with the Rwanda Mission to share initial findings and analysis following the in-country work. The draft of the evaluation report was submitted on February 19, 2019. The final version of the report will be submitted in Quarter 3.

The initial research plan for the Soma Umenye Performance Evaluation for the flagship early grade reading program in Rwanda was submitted and approved on November 29, 2018. During this quarter conversations occurred with the Mission to update the research plan following adjustments made to program implementation. The updated research plan, including timeline and staffing, will be submitted in Quarter 3. Continued planning for the evaluation will occur as well, including recruitment and onboarding of technical consultants for the evaluation team.

Under Result 2, the Language of Instruction (LOI) work stream continued to constitute a major area of work. To meet the objectives of the first activity in this research stream, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes (KSA) Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), a draft of the report was submitted on February 4, 2019 for further comments from the Education Office of the Economic Growth, Education and Environment Bureau. Work continues on the report with revisions and graphic design to produce a high-quality, polished report for wide dissemination. The next draft, incorporating all comments, will consist of a fully laid out report in the designed template and will be submitted in Quarter 3.

Furthermore, REEP-A continued ongoing work on Result 2, Requirement 2.32, Develop Relevant Country-Specific Profiles of Language of Instruction (LOI) Policies, Reading Program Approaches to LOI, and Student Reading Outcomes. This work includes the continuation of compiling information on a number of areas across the 19 countries for inclusion in the profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion and initial feedback. All draft country profiles will be submitted in Quarter 3 for further comments.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during the quarter. Following the inclusion of the questions in an EGRA implemented by RTI in Uganda in mid-October 2018, analysis began once the data were received. During this quarter, the analysis compared teachers’ responses with data on student reading outcomes. This analysis, along with other findings, illustrates the need to adjust the questions for the next EGRA inclusion; the results will be compiled into a report for submission in the upcoming quarter.
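For illustration only, the type of comparison described above can be sketched in a few lines of Python. The file names, column names, and school-level aggregation below are hypothetical placeholders and are not drawn from the actual EGRA datasets or from the analysis conducted by RTI.

```python
# Hypothetical sketch of relating teacher language self-ratings to student
# reading outcomes; file and column names are illustrative placeholders only.
import pandas as pd
from scipy.stats import pearsonr

# Teacher self-ratings from the added EGRA questions (e.g., 1 = low, 4 = high),
# and student oral reading fluency scores, both keyed by a shared school ID.
teachers = pd.read_csv("teacher_language_selfratings.csv")  # school_id, self_rating
students = pd.read_csv("student_egra_scores.csv")           # school_id, orf_score

# Aggregate student outcomes to the school level and join them to teacher ratings.
school_orf = students.groupby("school_id", as_index=False)["orf_score"].mean()
merged = teachers.merge(school_orf, on="school_id", how="inner")

# Simple association between self-rated proficiency and mean student fluency.
r, p = pearsonr(merged["self_rating"], merged["orf_score"])
print(f"Pearson r = {r:.2f} (p = {p:.3f}) across {len(merged)} schools")
```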

Requirement 2.35, Teacher Language and Literacy Assessment (TLLA), involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis developed an initial research plan to map the way forward for this activity. This research plan was submitted on January 18, 2019, with comments received on February 4, 2019. Following discussions with the Activity Manager and RTI, the final research plan and initial draft instruments will be submitted in Quarter 3.

The continuation of work on the SRGBV Toolkit (Result 3, Requirement 3.1) remained a primary focus within Result 3 for Quarter 2. Efforts consisted of further data analysis from the fieldwork for the pilot in Malawi, as well as continued development of the Toolkit.

Additionally, overall program management remains a significant portion of the TO, ensuring effective support of the technical activities within REEP-A. Administrative support and facilitation of discussions between the technical and management teams at Dexis, the subcontractor, Research Triangle Institute (RTI), and USAID have occurred throughout Quarter 2 of Year 3 and will continue to be a major management focus throughout the project’s duration.

TECHNICAL ACTIVITIES

REEP-A supports the Africa Bureau’s Office of Sustainable Development (AFR/SD) Regional Development Cooperation Strategy (RDCS) through the generation of regional, country-specific, and gender-focused data, analysis, and research that will assist in identifying and prioritizing needs, promoting rigorous research to inform evidence-based decisions, and promoting gender-equitable education investments within Africa.

To achieve these aims, technical assistance and advisory services under REEP-A fall across the three Results articulated in the Education Analytic Framework:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This distribution of activities across the three Results is reflected throughout the five years of the TO. Year 3 continues to provide strategic research across the three Results, and builds on the foundational research and analysis that has occurred over the last two years. Year 3 remains focused on implementing and continuing research begun thus far, as well as initiating new research activities. In discussion with USAID, including during Quarterly meetings, future research topics will also be discussed and refined, to fulfill undetermined requirements and to ensure relevancy and alignment with emerging areas of interest being pursued under the TO. Research activities in Year 3 and subsequent years will prioritize and integrate gender-focused approaches, as done throughout Year 2.

In close consultation with AFR/SD/ED, all research activities, including technical plans, will utilize gender-sensitive methodologies and evaluation questions to improve understanding of the barriers to improving gender equity in education and learning. Recognizing the importance of gender considerations as a key component in achieving development aims, research activities within REEP-A will draw upon available sex-disaggregated data and other gender-sensitive tools to clearly articulate existing gender disparities and identify knowledge gaps. This research will provide a thorough analysis of gender issues at both the individual and institutional levels, including the interplay of contextual and political factors. In addition to research approaches, a gender lens will be adopted in the other analytic and capacity building services under REEP-A. This gender-sensitive approach includes the engagement of men and boys in conversations around issues of gender equity, which is central to achieving equality both in education and the workforce. The identification of both gender-based disparities and continued barriers to equity will be driving aims of the research and recommendations, and will shape research questions and topics throughout the TO.

PROGRESS TOWARD OBJECTIVES (BY RESULT)

GENERAL TECHNICAL ACTIVITIES

Work on activities within Result 1 continued throughout the second Quarter of Year 3. Planning and implementation of the two Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3 and 1.4) activities in support of the Rwanda Mission continued in Quarter 2 of this year. The two performance evaluations, for Soma Umenye, an early grade reading initiative, and Huguka Dukore Akazi Kanoze, a youth and workforce development project, progressed with planning and implementation. The initial Soma Umenye research plan was submitted and approved on November 29, 2018. An updated research plan, reflective of changes in implementation and timeline, will be submitted in Quarter 3, along with the recruitment and onboarding of the evaluation team. The Huguka Dukore Akazi Kanoze evaluation field work took place in Quarter 1. The report writing occurred during this quarter, with an initial draft of the evaluation report submitted on February 19, 2019.

Progress on the first activity in the LOI work stream under Result 2 continued on Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), with an updated draft of the report submitted to USAID for further comments on February 4, 2019. Work continued on the design of the report as well. The next draft, incorporating all comments, will consist of a fully laid out report in the design template and will be submitted in Quarter 3.

Within this work stream, work on Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes continued. This work has consisted of compiling information on a number of relevant areas across the 19 countries for inclusion in the profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion of content and layout, as well as for initial feedback. All draft country profiles will be submitted in Quarter 3 for further comments.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during the quarter. In Quarter 1, the developed questions were included in an EGRA implemented by RTI in Uganda. In Quarter 2, analysis of the resulting data began, comparing teachers’ responses with data on student reading outcomes. This analysis, along with other findings, illustrates the need to adjust the questions for the next EGRA inclusion; the results will be compiled into a report for submission in the upcoming quarter.

Requirement 2.35, Teacher Language and Literacy Assessment, involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis developed an initial research plan to map the way forward for this activity. This research plan was submitted on January 18, 2019 with comments received on February 4, 2019. Following discussions with the Activity Manager and RTI, the final research plan and initial draft instruments will be submitted in Quarter 3.

The continuation of work on the SRGBV Toolkit (Result 3, Requirement 3.1) remained a primary focus within Result 3 for Quarter 2. Efforts consisted of further data analysis from the fieldwork for the pilot in Malawi, and continued development of the Toolkit.

Lastly, work toward Requirement 3.2, Update Status of Early Grade Reading Barometer, progressed during this Quarter through work on a modification that will allow the subcontractor to receive funding to commence this work. The modification will be completed in Quarter 3, and the subcontractor will begin incorporating the Kenya (Tusome) project data.

RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET

The aim of Result 1 is to provide Missions with data and research support that meets needs and facilitates understanding of the status of education in specific countries. Result 1 includes three main activity categories, spread across seven requirements:

• Deliver Country and Regional Education Data Trend Snap Shots (Requirement 1.1)
• Deliver Education Data Briefs (Requirement 1.2)
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3-1.7)

Year 3 Quarter 2 activities under Result 1 included Deliver Education Data Briefs (Requirement 1.2) and Deliver Mission Support for Evaluation, Assessment and Data on Selected Priorities (Requirements 1.3 and 1.4).

Under Result 1, REEP-A continued work on Requirement 1.2, Deliver Education Data Briefs, for Year 3 by completing a translation of the Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV) into French. The English version of the document was submitted in May 2018, and it was determined that a French translation would support wider usage and dissemination. The process of onboarding a translation services company to translate the Education Data Brief from English to French started in Quarter 1. The finalized translation was submitted on February 4, 2019 and approved on March 7, 2019.

Work continued on the performance evaluations for two of the USAID Rwanda Mission’s flagship education programs during this quarter. Huguka Dukore Akazi Kanoze is the Rwanda Mission's flagship workforce development and youth activity. The in-country evaluation of this activity occurred in Quarter 1, along with the data analysis, compiling of initial findings, and writing of the evaluation report. The draft evaluation report was submitted to the Mission this quarter, on February 19, 2019. The Mission provided initial comments on March 1, 2019, with the remaining comments on March 12, 2019. These comments were addressed, and the revised draft was submitted to the Mission on March 26, 2019. The final version of the evaluation report will be submitted in Quarter 3.

Soma Umenye is the Rwanda Mission’s flagship reading activity (7/2016-7/2021) and aims to improve the reading skills of one million public school learners nationwide, in grades 1-3. The initial research plan for this performance evaluation was submitted and approved on November 29, 2018. During this quarter, conversations occurred with the Mission to update the research plan following adjustments made to program implementation. On March 26, 2019, the Mission shared comments from the implementing partners to be addressed in the revised research plan. The updated research plan, including timeline and staffing, will be submitted in the next quarter. Further planning for the evaluation will continue in Quarter 3, including recruitment and onboarding of technical consultants for the evaluation team.

Result 1 Next Steps:

• Additional Country and Regional Education Data Trends Snap Shots (Requirement 1.1) are allocated to occur in Year 3, and can be conducted at the request of AFR/SD/ED.
• Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3-1.4):
  ◦ Report writing and submission for the Huguka Dukore Akazi Kanoze performance evaluation (Requirement 1.3).
  ◦ Submission of updated research plan for the Soma Umenye evaluation (Requirement 1.4).
  ◦ Recruitment and onboarding of consultants for the evaluation team for the Soma Umenye evaluation (Requirement 1.4).

RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED

The majority of REEP-A research activities for Year 3 will contribute to Result 2, as will the bulk of research activities throughout the TO.

The objectives of Result 2 are to identify or generate knowledge about evidence-based education programming related to early grade reading and access to education, and to ensure that relevant knowledge and research products are available to USAID education personnel through cost-effective dissemination methods, as well as to provide trainings and workshops.

Requirements for Result 2 throughout the TO primarily consist of desk research, primary research, capacity building, and ad hoc reports. Result 2 activities in Year 3 include:

• Deliver Workshops (Requirement 2.20-2.30)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31)
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32)
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33)
• Development of the Teacher Language and Literacy Assessment (Requirement 2.35)

Work under the LOI work stream continued on the first requirement, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31), in Quarter 2 of Year 3. This Requirement is designed to serve as a framing document for the entire work stream and will pose future research questions and identify gaps in the existing knowledge base. A draft of the framework document was submitted on February 4, 2019 for further comments from the Education Office of the Economic Growth, Education and Environment Bureau. Work continues on the report through revisions and design to produce a high-quality, polished report for wide dissemination. The next draft, incorporating all comments and the final design, will be submitted in Quarter 3.

Under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, Dexis will develop up to 19 country profiles for sub-Saharan African countries currently implementing Goal 1 reading programs. This work includes the continuation of compiling information on a number of areas across the 19 countries for inclusion in the profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion and initial feedback. All draft country profiles will be submitted in Quarter 3 for further comments.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during the quarter. Following the inclusion of the questions in an EGRA implemented by RTI in Uganda in mid-October 2018, analysis began once the data were received. During this quarter, the analysis compared teachers’ responses with data on student reading outcomes. This analysis, along with other findings, illustrates the need to adjust the questions for the next EGRA inclusion; the results will be compiled into a report for submission in the upcoming quarter.

Requirement 2.35, Teacher Language and Literacy Assessment, involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis developed an initial research plan to map the way forward for this activity. This research plan was submitted on January 18, 2019, with comments received on February 4, 2019. A discussion on February 7, 2019 allowed RTI to walk through the plan and potential concerns with the Activity Manager. Following this call, revisions to the research plan began, and the plan was resubmitted on February 25, 2019. Work also continued on further development of the instruments. Final comments from the Activity Manager were received on March 27, 2019. The final research plan and initial draft instruments will be submitted in Quarter 3.

Result 2 Next Steps:

• Continue technical work and design for Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31).
• Continue work on Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32).
• Continue work on Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33).
• Submission of the final research plan and initial draft of instruments for the Teacher Language and Literacy Assessment (Requirement 2.35).

RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED

USAID has supported the development of several tools (e.g., the Early Grade Reading Assessment and the Snapshot of School Management Effectiveness) that have been successfully adapted and applied in numerous countries, generating useful data and animating purposeful policy dialogue. The subsequent years of the project will explore the further development of relevant and applicable tools, as there is high demand for low-cost, easily implemented tools for systematic data collection, analysis, and policy dialogue.

The research priority for Result 3 is to continue work toward the Delivery of the SRGBV Measurement Framework (Requirement 3.1). An SRGBV Measurement Framework was initiated under DERP, and advanced by AFR/SD/ED. It is the intention of AFR/SD/ED that this initiative be finalized under REEP-A, such that a fully functional and user-friendly suite of surveys and data collection protocols can be widely utilized by the SRGBV community in the field. These efforts have been continued in Quarter 2 with continued work on the SRGBV Toolkit that is being designed for global use.

The data cleaning, processing, analysis, and report writing for the SRGBV instrument pilot carried into Year 3, and the continuation of work on the SRGBV Toolkit (Result 3, Requirement 3.1) remained a primary focus for Quarter 2. Following the submission of the Malawi Pilot draft report, discussions have been ongoing around issues flagged in the report, including requests for clarification, for additional information on the recommendations made regarding the survey content, and for more details about the reliability and validity analysis conducted on all the instruments. These requests have led to additional analyses and discussions around how to correct these items and proceed with next steps.
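As background on what a reliability check of this kind typically involves, the short sketch below computes Cronbach's alpha for a set of piloted survey items. It is a generic illustration using simulated responses; it is not the Malawi pilot dataset or the exact psychometric analysis performed under this activity.

```python
# Generic internal-consistency (Cronbach's alpha) check for piloted survey items.
# The responses below are simulated; this is not the Malawi pilot dataset.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for item responses (rows = respondents, columns = items)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Simulated pilot responses: 200 respondents answering 5 Likert-type items (1-4).
rng = np.random.default_rng(0)
pilot = pd.DataFrame(rng.integers(1, 5, size=(200, 5)),
                     columns=[f"item_{i}" for i in range(1, 6)])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
```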

During this quarter, a summary document was produced detailing the decisions made so far and the additional analyses conducted at the request of USAID, to serve as a reference point for the path forward. The summary document was submitted on February 28, 2019. Further discussions stemming from this document indicated that additional analysis, as well as a document articulating options for next steps, was needed. This document containing options for a re-pilot will be submitted in Quarter 3.

Continued work on the design and layout of the Toolkit occurred during this quarter; the process has been highly iterative to keep things moving as quickly as possible. The Activity Manager called a meeting to discuss dissatisfaction with the sections reviewed to date and expressed concerns for Dexis to address with the subcontractor. All sections of the Toolkit are to be revised and/or completed in Quarter 3, with a user-friendly and comprehensive approach across all sections.

Efforts on Requirement 3.2, Update Status of Early Grade Reading Barometer, continued with work on a modification for the subcontractor. This modification will be completed in Quarter 3, and the subcontractor will begin incorporating the Kenya (Tusome) project data.

Result 3 Next Steps:

• Continue work on the SRGBV Toolkit (Requirement 3.1), namely:
  ◦ Submission of a document outlining scenarios for re-pilot of the SRGBV survey instruments.
  ◦ Continue progress toward the design of the SRGBV Toolkit, with submission of the draft Toolkit in Quarter 3.
• Continue work on incorporating country data from the Kenya (Tusome) project into the Early Grade Reading Barometer (Requirement 3.2).

Table 1: Status of Technical Deliverables for Quarter 2, by Result

Result 1: Africa Missions strategy-related data needs met
• Deliver Country and Regional Education Data Trend Snap-Shots (Requirement 1.1): Upcoming (timing TBD)
• Deliver Education Data Briefs (1.2): Upcoming (timing TBD)
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (1.3-1.7): In progress

Result 2: Availability of Africa education data and trends expanded
• Develop Early Grade Reading Research Training Modules or Courses for Classroom, Online, and/or Resources Materials (2.19.2): Upcoming (timing TBD)
• Deliver Workshops (2.20-2.30): Upcoming (timing TBD)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (2.31): In progress
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (2.32): In progress
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading (2.33): In progress
• Development of the Teacher Language and Literacy Assessment (2.35): In progress

Result 3: Measurement tools with applicability across countries developed
• Deliver the SRGBV Measurement Framework (3.1): In progress
• Update Status of Early Grade Reading Barometer (3.2): Upcoming

SUCCESSES, CHALLENGES AND LESSONS LEARNED There have been some challenges surrounding communication and shared understanding of tasks and timelines on various activities. This has been ongoing since the last quarter and has been discussed regularly with the TOCOR. Steps have been taken to address it by creating additional updates and timelines, accessible to everyone, with clear action items, deliverables, and dates in one central place. Additionally, there have been requests for further information and analysis on the SRGBV activity that were not planned or budgeted for. Work continued despite this in an attempt to keep things moving, but this has now resulted in delays. Insufficient clarity about the additional resources needed to respond to these requests created further challenges and resource needs. In the future, work will need to halt until formal modifications are in place in order to mitigate this risk.

OTHER DELIVERABLES Additional deliverables were requested under the SRGBV work (Requirement 3.1), as described in the technical activities above. Additionally, a formal modification will need to be completed to incorporate the new education strategy into the REEP-A TO for alignment and reporting purposes.

PROJECT MANAGEMENT DELIVERABLES Next Quarter (Year 3, Quarter 3):

• Dexis will submit the Year 3, Quarter 3 report on July 31, 2019.


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA (REEP-A) YEAR 3, QUARTER 3 REPORT

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Research For Effective Education Programming-Africa (REEP-A) YEAR 3, QUARTER 3 REPORT

April 1, 2019 - June 30, 2019

July 30, 2019

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024

ACRONYMS

AFR/SD Africa Bureau’s Office of Sustainable Development

AFR/SD/ED Africa Bureau’s Office of Sustainable Development Education Team

CFA Confirmatory Factor Analysis

DERP Data for Education Research and Programming in Africa

EGRA Early Grade Reading Assessment

KSA Knowledge, Skills, and Attitudes

LOI Language of Instruction

RDCS Regional Development Cooperation Strategy

REEP-A Research for Effective Education Programming – Africa

RTI Research Triangle Institute

SEL Social and Emotional Learning

SRGBV School-Related Gender-Based Violence

TBD To be Determined

TLLA Teacher Language and Literacy Assessment

TO Task Order

TOCOR Task Order Contract Officer Representative

USAID United States Agency for International Development

CONTENTS

EXECUTIVE SUMMARY 1
OVERVIEW OF THE REEP-A PROGRAM 2
BACKGROUND 2
YEAR THREE, QUARTER THREE ACTIVITIES 3
PROGRAMMING AND PROJECT SUPPORT 3
TECHNICAL ACTIVITIES 5
PROGRESS TOWARD OBJECTIVES (BY RESULT) 6
GENERAL TECHNICAL ACTIVITIES 6
RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET 7
RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED 8
RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED 10
SUCCESSES, CHALLENGES AND LESSONS LEARNED 12
OTHER DELIVERABLES 12
PROJECT MANAGEMENT DELIVERABLES 13


EXECUTIVE SUMMARY The Research for Effective Education Programming – Africa (REEP-A) Task Order (TO), a contract between the U.S. Agency for International Development (USAID) and Dexis Consulting Group (Dexis), was awarded on September 29, 2016, with a total value of $7,949,966.00 and a period of performance of five years, ending on September 28, 2021.

The main objective of REEP-A is to generate and effectively disseminate Africa regional and country-specific education data, analysis, and research, to ensure the availability of evidence-based interventions that inform the prioritization of needs and education investment decisions. REEP-A specifically focuses on the measurement of equitable access to education and early grade reading improvements. REEP-A activities support USAID's 2011 Education Strategy, which has promoted the rigorous use of evidence-based programming across the Strategy's three education goals. The goals set targets for 2015 for early grade reading, workforce development, and education in crisis and conflict-affected environments:

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

To meet these goals, activities are designed to emphasize the following three Results:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This Quarterly Report covers Quarter 3 of FY2019, from April 1, 2019 through June 30, 2019. REEP-A activities during this period included project administration, planning, and technical activities. The quarterly financial report for the period ending June 30, 2019 (Q3 FY2019) is attached. The REEP-A Program Manager is Stephanie Squires and the Task Order Contract Officer Representative (TOCOR) is Megnote Lezhnev.

OVERVIEW OF THE REEP-A PROGRAM

BACKGROUND The overarching aim of the REEP-A project is to provide the USAID Africa Bureau, overseas Missions, and partner organizations with concrete research contributions on USAID education initiatives to inform evidence-based investment, decision-making, and the prioritization of needs. REEP-A will contribute to USAID's education aims in Africa by providing technical and advisory services focused on research and capacity building that are intended to enhance the quality and effectiveness of USAID activities. REEP-A will provide practical and action-oriented gender-sensitive research that will be applied to both project design and implementation. REEP-A aims to strengthen the relationships and links between project evaluation, design, and implementation. In addition, REEP-A will contribute to the capacity development of education personnel and partners by enhancing the use of research in decision-making, mainstreaming the application of feedback loops and lessons learned into project design and implementation, and expanding the evidence and knowledge base on education initiatives.

USAID’s Education Strategy, issued in 2011, has promoted the rigorous use of evidence-based programming across the Strategy’s three education goals, with a particular emphasis on the measurement of access to education and early grade reading improvements. The education goals set targets for 2015, which centered on early grade reading, workforce development, and education in crisis and conflict-affected environments. REEP-A will specifically focus on improvement in early grade reading and the measurement of equitable access to education in crisis and conflict environments (Goals 1 and 3):

Goal 1: Improved reading skills for 100 million children in primary grades by 2015

Goal 2: Improved ability of tertiary and workforce programs to generate skills relevant to a country’s development goals

Goal 3: Increased equitable access to education in crisis and conflict environments for 15 million learners by 2015

Following the release of the Strategy, the Education Team of the USAID Africa Bureau’s Office of Sustainable Development (AFR/SD/ED) identified the need for targeted research and training services to complement the efforts of Africa Missions to effectively align their education portfolios with the three goal areas of the Strategy. The Data for Education Research and Programming in Africa (DERP) project, implemented between 2011 and 2016, was established to respond to these identified needs, and provided AFR/SD/ED with timely research and tools that Missions could apply to their ongoing and future education projects. REEP-A will expand on the progress and evidence base established by DERP and will target identified gaps and challenges.

REEP-A seeks to contribute evidence that responds to the question of why, despite some gains, progress toward enhanced early grade reading and access to quality education in conflict and crisis settings in Africa has been modest. Expanding the evidence base and data collection for education interventions in Africa is critical to spurring further progress, increasing the understanding of program effectiveness, identifying barriers, and replicating successes.

YEAR THREE, QUARTER THREE ACTIVITIES

PROGRAMMING AND PROJECT SUPPORT Under Result 1, planning and implementation for two activities under Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3-1.7) continued throughout the third quarter of Year 3. The activities consist of supporting the Rwanda Mission through the implementation of performance evaluations for two of its existing education projects. The final version of the Huguka Dukore Akazi Kanoze performance evaluation report was submitted on June 3, 2019. However, comments remain outstanding, and the report is being revised to address them for resubmission in Quarter 4.

The initial research plan for the Soma Umenye Performance Evaluation for the flagship early grade reading program in Rwanda was submitted and approved on November 29, 2018. During Quarter 2, conversations occurred with the Mission to update the research plan following adjustments made to program implementation. The final research plan will be submitted in Quarter 4, now that the team lead has been onboarded. Pre-field planning, including onboarding of team members, occurred in June 2019. The field data collection is scheduled to occur in July and August 2019.

Under Result 2, the Language of Instruction (LOI) work stream continued to constitute a major area of work. In order to meet the objectives of the first activity in this research stream, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes (KSA) related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), a draft of the report was submitted on February 4, 2019 and received further comments from the Education Office of the Economic Growth, Education and Environment Bureau in Quarter 3. Work continues on the report, with revisions and graphic design, to produce a high-quality, polished report for wide dissemination.

Furthermore, REEP-A continued ongoing work on Result 2, Requirement 2.32, Develop Relevant Country-Specific Profiles of Language of Instruction (LOI) Policies, Reading Program Approaches to LOI, and Student Reading Outcomes. This work includes continuing to compile information on a number of areas across the 19 countries for inclusion in the profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion, and initial feedback was received. Three draft country profiles were submitted on June 28, 2019 for further comments and to determine the appropriate document structure for the remaining country profiles. Once USAID feedback on the three submitted profiles is received, revisions will begin while drafting of the remaining profiles continues.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during the quarter. The analysis, comparing teachers' responses with data on student reading outcomes, as well as other findings, illustrated the need to adjust the questions for the next Early Grade Reading Assessment (EGRA) inclusion. The initial findings from the inclusion of the teacher questions in the EGRA in Uganda were compiled in an initial report and submitted on June 24, 2019 for review. The proposed revisions to the LOI-specific questions included in the teacher interview questionnaire aim to improve the reliability and content of data collected through this instrument.

Requirement 2.35, Teacher Language and Literacy Assessment (TLLA), involves developing and administering a TLLA that will, at a minimum, assess teachers' oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis developed an initial research plan to map the way forward for this activity. This research plan was submitted on January 18, 2019, with comments received on February 4, 2019. Throughout Quarters 2 and 3, the Activity Manager provided comments and feedback on the instruments and research plan, suggesting a change in the proposed structure of the tools in the research plan. The suggested changes were incorporated and both documents were re-submitted on June 21, 2019.

The continuation of work on the SRGBV Survey Instruments (Result 3, Requirement 3.1) remained a primary focus within Result 3 for Quarter 3. Efforts consisted of further data analysis from the fieldwork for the pilot in Malawi. Throughout Quarter 2 and Quarter 3, conversations among Dexis, the subcontractor Research Triangle Institute (RTI), and USAID focused on requests for clarification, additional information on the recommendations made regarding the survey content, and more detail on the reliability and validity analysis conducted on all SRGBV instruments. Following these discussions, it was determined that additional scenario planning documentation was needed in order to identify the best path for any further instrument development and repiloting.

Following the additional analysis and clarification of the results from the Malawi pilot, three deliverables were added to the SRGBV work stream. The first deliverable was a scenario document outlining possible next steps for the SRGBV instruments, which was submitted for discussion on April 30, 2019. Scenario 1 consisted of completing a confirmatory factor analysis (CFA) in order to identify, if possible, a core set of items for the Gender Attitudes and School Climate instruments for all student, teacher, and parent population groups that could be selected and remain as-is for inclusion in the Toolkit. USAID requested this scenario as the first step in order to determine whether any of the other scenarios, all of which involved some degree of repiloting, would be required. The second deliverable, the results of the CFA on the Gender Attitudes and Beliefs instrument for students, teachers, and parents, was submitted on May 24, 2019. The third deliverable, the results of the CFA on the Perceptions of School Climate instrument for students, teachers, and parents, was submitted on June 19, 2019. Following the CFA results submissions, USAID provided feedback on June 26, 2019, which was discussed in a call with Dexis, the subcontractor (RTI), and USAID on June 27, 2019. The scenario document will be revised following these discussions and the findings of the CFAs for submission in Quarter 4.
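For context, the sketch below shows what a CFA of this kind can look like in practice. It is a minimal illustration only, assuming the open-source semopy package and hypothetical item and file names; it is not the model specification or software used for the Malawi pilot data.

    # Minimal CFA sketch (illustrative only; item names and data file are hypothetical).
    import pandas as pd
    import semopy

    # One latent factor per instrument domain, measured by several candidate items.
    model_desc = """
    GenderAttitudes =~ ga_item1 + ga_item2 + ga_item3 + ga_item4
    SchoolClimate =~ sc_item1 + sc_item2 + sc_item3 + sc_item4
    """

    data = pd.read_csv("pilot_item_responses.csv")   # placeholder file: one row per respondent
    model = semopy.Model(model_desc)
    model.fit(data)

    print(model.inspect())           # factor loadings; weakly loading items are candidates for removal
    print(semopy.calc_stats(model))  # fit indices (e.g., CFI, RMSEA) for the retained core set

In a core-item exercise of this kind, items with low loadings or poor fit would typically be flagged for revision or dropped, and the model re-estimated on the reduced item set.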

The draft version of the SRGBV Toolkit (Result 3, Requirement 3.1) was submitted in early 2019 for USAID review. On May 20, 2019, USAID provided written feedback on the draft version of the SRGBV Toolkit, which was discussed in a phone call on May 22, 2019. The feedback focused on revising the Toolkit to ensure that its overall framing more clearly links the survey instruments and the conceptual framework; improving the physical and electronic navigation of the Toolkit; and revising content around data analysis. Additionally, discussion around the merits of including the Social and Emotional Learning (SEL) instrument continued, as this particular instrument was not included in the Malawi pilot but has been piloted separately in Uganda. Additional discussions and revisions are expected to continue throughout Quarter 4 in order to produce a high-quality, user-friendly deliverable.

Work toward Update Status of Early Grade Reading Barometer (Requirement 3.2) progressed during Quarter 3 through the approval and continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili.

Additionally, overall program management remains a significant portion of the TO, ensuring effective support of the technical activities within REEP-A. Administrative support and facilitation of discussions between the technical and management teams at Dexis, the subcontractor (RTI), and USAID have occurred throughout Quarter 3 of Year 3 and will continue to be a major management focus throughout the project’s duration.

TECHNICAL ACTIVITIES REEP-A supports the Africa Bureau’s Office of Sustainable Development (AFR/SD’s) Regional Development Cooperation Strategy (RDCS) through the generation of regional, country-specific and gender-focused data, analysis, and research that will assist in identifying and prioritizing needs, promoting rigorous research to inform evidence-based decisions, and promoting gender-equitable education investments within Africa.

To achieve these aims, technical assistance and advisory services under REEP-A fall across the three Results articulated in the Education Analytic Framework:

Result 1: Africa Missions Strategy-Related Data Needs Met

Result 2: Availability of Africa Education Data and Trends Expanded

Result 3: Measurement Tools with Applicability Across Countries Developed

This distribution of activities across the three Results is reflected throughout the five years of the TO. Year 3 continues to provide strategic research across the three Results and builds on the foundational research and analysis conducted over the last two years. Year 3 remains focused on implementing and continuing research begun thus far, as well as initiating new research activities. In discussion with USAID, including during quarterly meetings, future research topics will be identified and refined to fulfill the undetermined requirements and to ensure relevance and alignment with emerging areas of interest being pursued under the TO. Research activities in Year 3 and subsequent years will prioritize and integrate gender-focused approaches, as was done throughout Year 2.

In close consultation with AFR/SD/ED, all research activities, including technical plans, will utilize gender-sensitive methodologies and evaluation questions to improve understanding of the barriers to improving gender equity in education and learning. Recognizing the importance of gender considerations as a key component in achieving development aims, research activities within REEP-A will draw upon available sex-disaggregated data and other gender-sensitive tools to clearly articulate existing gender disparities and identify knowledge gaps. This research will provide a thorough analysis of gender issues at both the individual and institutional levels, including the interplay of contextual and political factors. In addition to research approaches, a gender lens will be adopted in the other analytic and capacity building services under REEP-A. This gender-sensitive approach includes the engagement of men and boys in conversations around issues of gender equity, which is central to achieving equality both in education and the workforce. The identification of both gender-based disparities and continued barriers to equity

will be driving aims of the research and recommendations, and will shape research questions and topics throughout the TO.

PROGRESS TOWARD OBJECTIVES (BY RESULT)

GENERAL TECHNICAL ACTIVITIES Work on activities within Result 1 continued throughout the third quarter of Year 3. Planning and implementation of the two Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Result 1, Requirements 1.3 and 1.4) activities in support of the Rwanda Mission continued in Quarter 3 of this year. The final version of the performance evaluation report for Huguka Dukore Akazi Kanoze, a youth and workforce development project, was submitted on June 3, 2019. However, comments remain outstanding, and the report is being revised to address them for resubmission in Quarter 4. The second performance evaluation, of Soma Umenye, an early grade reading initiative, progressed with continued planning. Additionally, the recruitment and onboarding of the evaluation team for the Soma Umenye performance evaluation was completed in June 2019.

Progress on the first activity in the LOI work stream, under Result 2, continued on Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Result 2, Requirement 2.31), with an updated draft of the report submitted to USAID for further comments on February 4, 2019 and feedback received from USAID in Quarter 3. Work also continued on the design of the report. The next draft, with all comments incorporated, will be a fully laid-out report in the design template and will be submitted in Quarter 4.

Within this work stream, work on Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes continued. This work has consisted of compiling information on a number of relevant areas across the 19 countries for inclusion in the profiles. A draft profile was shared with USAID on March 5, 2019 as an example to allow for discussion of content and layout. A call was arranged with the Activity Manager on March 7, 2019 to discuss content and layout as well as to provide Dexis with initial overall feedback. Three draft country profiles were submitted June 28, 2019 for comments and to determine the appropriate document structure for the remainder of the country profiles.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during the quarter. This analysis, comparing teachers' responses with data on student reading outcomes, as well as other findings, illustrated the need to adjust the questions for the next EGRA inclusion. The initial findings from the inclusion of the teacher questions in the EGRA in Uganda were compiled in an interim report and submitted alongside the research plan on June 24, 2019 for review. The proposed revisions to the LOI-specific questions included in the teacher interview questionnaire aim to improve the reliability and content of data collected through this instrument.

Requirement 2.35, Teacher Language and Literacy Assessment, involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis

developed an initial research plan to map the way forward for this activity. This research plan was submitted on January 18, 2019, with comments received on February 4, 2019. Throughout Quarter 2 and Quarter 3, the Activity Manager provided comments and feedback on the instruments and research plan, suggesting a change in the proposed structure of the tools in the research plan. The suggested changes were incorporated and both documents were re-submitted on June 21, 2019.

The continuation of work on Delivery of the SRGBV Measurement Framework (Result 3, Requirement 3.1) remained a primary focus within Result 3 for Quarter 3. Efforts consisted of further data analysis from the fieldwork for the pilot in Malawi. Throughout Quarter 2 and Quarter 3, conversations among Dexis, the subcontractor Research Triangle Institute (RTI), and USAID focused on requests for clarification, additional information on the recommendations made regarding the survey content, and more detail on the reliability and validity analysis conducted on all SRGBV instruments. Following these discussions, it was determined that additional scenario planning documentation was needed in order to identify the best path for any further instrument development and repiloting.

Following the additional analysis and clarification of the results from the Malawi pilot, three deliverables were added to the SRGBV workstream. The first deliverable was a scenario document outlining possible next steps for the SRGBV instruments and was submitted for discussion on April 30, 2019. Scenario 1 consisted of completing a CFA in order to identify, if possible, a core set of items for the Gender Attitudes and School Climate instruments for all student, teacher, and parent population groups that could be selected and remain as-is for inclusion in the Toolkit. USAID requested this scenario as the first step in order to determine whether any of the other scenarios, all of which involved some degree of repiloting, would be required. The second deliverable, the results of the CFA on the Gender Attitudes and Beliefs instrument for students, teachers, and parents, was submitted on May 24, 2019. The third deliverable, the results of the CFA on the Perceptions of School Climate instrument for students, teachers, and parents, was submitted on June 19, 2019. Following the CFA results submissions, USAID provided feedback on June 26, 2019, which was discussed in a call with Dexis, the subcontractor RTI, and USAID on June 27, 2019. The scenario document will be revised following these discussions and based on the additional information obtained from the CFAs.

The draft version of the SRGBV Toolkit (Result 3, Requirement 3.1) was submitted in early 2019 for USAID review. On May 20, 2019, USAID provided written feedback on the draft version of the SRGBV Toolkit, which was discussed in a phone call on May 22, 2019. The feedback focused on revising the Toolkit to ensure that its overall framing more clearly links the survey instruments and the conceptual framework; improving the physical and electronic navigation of the Toolkit; and revising content around data analysis. Additionally, discussion around the merits of including the SEL instrument continued, as this particular instrument was not included in the Malawi pilot but has been piloted separately in Uganda. Additional discussions and revisions are expected to continue throughout Quarter 4 in order to produce a high-quality, user-friendly deliverable.

Work toward Update Status of Early Grade Reading Barometer (Requirement 3.2) progressed during Quarter 3 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili.

RESULT 1: AFRICA MISSIONS STRATEGY-RELATED DATA NEEDS MET

The aim of Result 1 is to provide Missions with data and research support that meets needs and facilitates understanding of the status of education in specific countries. Result 1 includes three main activity categories, spread across seven requirements:

• Deliver Country and Regional Education Data Trend Snap Shots (Requirement 1.1)
• Deliver Education Data Briefs (Requirement 1.2)
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3-1.7)

Year 3 Quarter 3 activities under Result 1 included Deliver Mission Support for Evaluation, Assessment and Data on Selected Priorities (Requirements 1.3 and 1.4).

Work continued on the performance evaluations for two of the USAID Rwanda Mission's flagship education programs during Quarter 3. The performance evaluation report for Huguka Dukore Akazi Kanoze, the Rwanda Mission's flagship workforce development and youth activity, was submitted in Quarter 3. Following the feedback provided by the Mission, the final version of the performance evaluation report was submitted on June 3, 2019. However, comments remain outstanding, and the report is being revised to address them for resubmission in Quarter 4.

Soma Umenye is the Rwanda Mission's flagship reading activity (7/2016-7/2021) and aims to improve the reading skills of one million public school learners nationwide in grades 1-3. The initial research plan for this performance evaluation was submitted and approved on November 29, 2018. During Quarter 3, conversations occurred with the Mission to discuss initial planning for the evaluation and to update the research plan following adjustments made to program implementation. The revised research plan, including the timeline and staffing, will be submitted at the beginning of Quarter 4, following the onboarding of the approved evaluation team.

Result 1 Next Steps:

• Additional Country and Regional Education Data Trends Snap Shots (Requirement 1.1) are allocated to occur in Year 3, and can be conducted at the request of AFR/SD/ED.
• Additional Education Data Briefs (Requirement 1.2) are allocated to occur in Year 3.
• Mission Support for Evaluation, Assessment, and Data on Selected Priorities (Requirements 1.3 and 1.4):
◦ Report editing and submission for the revised Huguka Dukore Akazi Kanoze performance evaluation report (Requirement 1.3).
◦ Submission of updated research plan for the Soma Umenye evaluation (Requirement 1.4).
◦ In-country field work for the Soma Umenye evaluation (Requirement 1.4).
◦ Data analysis and report writing for the Soma Umenye evaluation (Requirement 1.4).

RESULT 2: AVAILABILITY OF AFRICA EDUCATION DATA AND TRENDS EXPANDED

The majority of REEP-A research activities for Year 3 will contribute to Result 2, as will the bulk of research activities throughout the TO.

The objectives of Result 2 are to identify or generate knowledge about evidence-based education programming related to early grade reading and access to education, and to ensure that relevant knowledge and research products are available to USAID education personnel through cost-effective dissemination methods, as well as to provide trainings and workshops.

Requirements for Result 2 throughout the TO primarily consist of desk research, primary research, capacity building, and ad hoc reports. Result 2 activities in Year 3 include:

• Deliver Workshops (Requirement 2.20-2.30)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31)
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32)
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33)
• Development of the Teacher Language and Literacy Assessment (Requirement 2.35)

Work under the LOI work stream continued on the first requirement, Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31), in Quarter 3 of Year 3. This Requirement is designed to serve as a framing document for the entire work stream and will pose future research questions and identify gaps in the existing knowledge base. A draft of the framework document was submitted on February 4, 2019. Dexis received further comments from the Education Office of the Economic Growth, Education and Environment Bureau in Quarter 3. Work continued through Quarter 3 on revising and designing the report to produce a high-quality, polished product for wide dissemination.

Under Requirement 2.32, Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes, Dexis will develop up to 19 country profiles for sub-Saharan African countries currently implementing Goal 1 reading programs. This work includes continuing to compile information on a number of areas across the 19 countries for inclusion in the profiles. The country profiles were revised throughout Quarter 3, after receiving initial feedback from USAID regarding the draft profile submitted in Quarter 2. Three draft country profiles were submitted on June 28, 2019 for further comments and to determine the appropriate document structure for the remaining country profiles.

Additional work under Result 2, Delivery of an Analysis on the Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33), continued during this quarter. Following the inclusion of the questions in an EGRA implemented by RTI in Uganda in mid-October 2018, analysis began once the data was received. The analysis, comparing teachers' responses with data on student reading outcomes, as well as other findings, illustrated the need to adjust the questions for the next EGRA inclusion. The initial findings from the inclusion of the teacher questions in the EGRA in Uganda were compiled in an initial report and submitted alongside the research plan on June 24, 2019 for review. The proposed revisions to the LOI-specific questions included in the teacher interview questionnaire aim to improve the reliability and content of data collected through this instrument.

Requirement 2.35, Teacher Language and Literacy Assessment, involves developing and administering a TLLA that will, at a minimum, assess teachers’ oral language proficiency, reading, and writing ability in

the relevant LOI, as well as include an interview protocol related to language attitudes and beliefs. Dexis developed an initial research plan to map the way forward for this activity. Throughout Quarter 2 and Quarter 3, the Activity Manager provided comments and feedback on the research plan. The revised research plan and survey instruments were submitted on April 18, 2019. The final research plan was adjusted based on the feedback received and submitted on June 20, 2019. Additionally, work continued throughout Quarter 3 to further develop the instruments, and an initial draft was submitted on June 20, 2019.

Result 2 Next Steps:

• Continue technical work and design for Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (Requirement 2.31).
• Continue work on Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (Requirement 2.32).
• Continue work on Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading Performance (Requirement 2.33).
• Submission of research plan for case study on LOI implementation and language attitudes (Requirement 2.34).
• In-country work planning for Uganda following the approval of the final research plan and instruments for the Teacher Language and Literacy Assessment (Requirement 2.35).

RESULT 3: MEASUREMENT TOOLS WITH APPLICABILITY ACROSS COUNTRIES DEVELOPED USAID has supported the development of several tools (e.g. Early Grade Reading Assessment and Snapshot of School Management Effectiveness) that have been successfully adapted and applied in numerous countries, generating useful data and animating purposeful policy dialogue. The subsequent years of the project will explore the further development of relevant and applicable tools, as there is high demand for low-cost, easily implemented tools for systematic data collection, analysis, and policy dialogue.

The research priority for Result 3 is to continue work toward the Delivery of the SRGBV Measurement Framework (Requirement 3.1). An SRGBV Measurement Framework was initiated under DERP and advanced by AFR/SD/ED. It is the intention of AFR/SD/ED that this initiative be finalized under REEP-A, such that a fully functional and user-friendly suite of surveys and data collection protocols can be widely utilized by the SRGBV community in the field. These efforts continued in Quarter 3 with ongoing work on the SRGBV Toolkit, which is being designed for global use.

The data cleaning, processing, analysis, and report writing for the SRGBV instrument pilot carried into Year 3, and continued work on the Delivery of the SRGBV Measurement Framework (Requirement 3.1) remained a primary focus for Quarter 3. Following the submission of the Malawi Pilot draft report, discussions have been ongoing around issues flagged in the report, including requests for clarification, additional information on the recommendations made regarding the survey content, and more detail on the reliability and validity analysis conducted on all the instruments. These discussions led to additional analyses and to decisions about how to correct these items and proceed with next steps.

Following these discussions, it was determined that additional scenario planning documentation was needed in order to identify the best path for any further instrument development and repiloting.

Following the additional analysis and clarification of the results from the Malawi pilot, three deliverables were added to the SRGBV work stream. The first deliverable was a scenario document outlining possible next steps for the SRGBV instruments, which was submitted for discussion on April 30, 2019. Scenario 1 consisted of completing a CFA in order to identify, if possible, a core set of items for the Gender Attitudes and School Climate instruments for all student, teacher, and parent population groups that could be selected and remain as-is for inclusion in the Toolkit. USAID requested this scenario as the first step in order to determine whether any of the other scenarios, all of which involved some degree of repiloting, would be required. The second deliverable, the results of the CFA on the Gender Attitudes and Beliefs instrument for students, teachers, and parents, was submitted on May 24, 2019. The third deliverable, the results of the CFA on the Perceptions of School Climate instrument for students, teachers, and parents, was submitted on June 19, 2019. Following the CFA results submissions, USAID provided feedback on June 26, 2019, which was discussed in a call with Dexis, the subcontractor (RTI), and USAID on June 27, 2019. The scenario document will be revised following these discussions and the findings of the CFAs for submission in Quarter 4.

Work on the design and layout of the Toolkit, begun in Quarter 2, continued during this quarter. At the beginning of 2019, the draft version of all chapters of the SRGBV Toolkit (Result 3, Requirement 3.1) was submitted for USAID review. On May 20, 2019, USAID provided a written response to the draft version of the SRGBV Toolkit, which was discussed in a phone call on May 22, 2019. The process has been highly iterative in order to keep work moving as quickly as possible. The Activity Manager discussed with Dexis their dissatisfaction with the sections reviewed to date and asked Dexis to address these concerns with the subcontractor. The feedback focused on revising the Toolkit to ensure that its overall framing more clearly links the survey instruments and the conceptual framework; improving the physical and electronic navigation of the Toolkit; and revising content around data analysis. Additionally, discussion around the merits of including the SEL instrument continued, as this particular instrument was not included in the Malawi pilot but has been piloted separately in Uganda. Additional discussions and revisions are expected to continue throughout Quarter 4 in order to produce a high-quality, user-friendly deliverable.

Work toward Update Status of Early Grade Reading Barometer (Requirement 3.2) progressed during Quarter 3 through the continued incorporation of the Kenya (Tusome) baseline and midline EGRAs in English and Kiswahili.

Result 3 Next Steps:

• Continue work on the SRGBV Toolkit (Requirement 3.1), namely:
◦ Submission of a second document outlining scenarios for re-pilot of SRGBV survey instruments.
◦ Continue progress toward the design of the SRGBV Toolkit with submission of a revised draft Toolkit in Quarter 4.
• Continue work on incorporating country data from the Kenya (Tusome) project into the Early Grade Reading Barometer (Requirement 3.2).

Table 1: Status of Technical Deliverables for Quarter 3, by Result

Result 1: Africa Missions strategy-related data needs met
• Deliver Country and Regional Education Data Trend Snap-Shots (Requirement 1.1): Upcoming (timing TBD)
• Deliver Education Data Briefs (1.2): Upcoming (timing TBD)
• Deliver Mission Support for Evaluation, Assessment, and Data on Selected Priorities (1.3-1.7): In progress

Result 2: Availability of Africa education data and trends expanded
• Develop Early Grade Reading Research Training Modules or Courses for Classroom, Online, and/or Resources Materials (2.19.2): Upcoming (timing TBD)
• Deliver Workshops (2.20-2.30): Upcoming (timing TBD)
• Deliver Current Research on Teacher Knowledge, Skills, and Attitudes Related to Language and Literacy that Influence Early Grade Literacy Outcomes (2.31): In progress
• Develop Relevant Country-Specific Profiles of LOI Policies, Reading Program Approaches to LOI, and Student Reading Outcomes (2.32): In progress
• Deliver Analysis of Relationship Between Teacher Language Proficiency and Student Reading (2.33): In progress
• Development of the Teacher Language and Literacy Assessment (2.35): In progress

Result 3: Measurement tools with applicability across countries developed
• Deliver the SRGBV Measurement Framework (3.1): In progress
• Update Status of Early Grade Reading Barometer (3.2): In progress

SUCCESSES, CHALLENGES AND LESSONS LEARNED After discussing concerns with USAID regarding the Huguka Dukore Akazi Kanoze Performance Evaluation (Requirement 1.3), Dexis adjusted the team composition for the upcoming Soma Umenye Performance Evaluation (Requirement 1.4). The Mission was included in the selection process for the Team Lead position for the Soma Umenye Performance Evaluation in order to mitigate potential issues that had occurred in the prior evaluation.

Additionally, there were some challenges surrounding the quality of products submitted to USAID. To address this, it has been proposed that once the outline of Chapter 5 and the next draft of the SRGBV Toolkit (Requirement 3.1) are submitted, Dexis, the subcontractor, and USAID will hold a workshop to address any outstanding items in real time. The proposed workshop aims to provide concise communication and a forum for open discussion to ensure there is a clear understanding of tasks and expectations moving forward.

OTHER DELIVERABLES Additionally, in order to continue to provide services under the Research for Effective Education Programming-Africa project, a Limitation of Funds notice was submitted on May 30, 2019, and the pipeline budget was submitted on June 24, 2019.

PROJECT MANAGEMENT DELIVERABLES Next Quarter (Year 3, Quarter 4):

• Dexis will submit the Year 3 Annual Report on October 31, 2019.
• Dexis will submit the Year 4 Annual Workplan on October 31, 2019.
• Dexis will submit the Year 4 Annual Workplan Summary 15 days after COR approval of the Workplan.


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING – AFRICA (REEP) Early Grade Reading Barometer Assessment Report

Prepared for USAID

Prepared by Dexis Consulting Group

Research for Effective Education Programming – Africa (REEP) Contract No. 7000-S-2015-01-WO-2017-02

October 2018

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

ACRONYMS

DERP Data for Education Research and Programming in Africa

EGRA Early Grade Reading Assessment

REEP Research for Effective Education Programming – Africa

USAID United States Agency for International Development

INTRODUCTION

The Early Grade Reading Barometer, an online tool hosted by the U.S. Agency for International Development's (USAID) Asia Bureau, aims to increase the accessibility of early grade reading assessment (EGRA) data for USAID staff, academics, government officials, and practitioners. As of August 2018, datasets from 22 locations spanning Asia, the Middle East, and Africa are available within this platform.i For sub-Saharan Africa, the Early Grade Reading Barometer (the Barometer) includes datasets for various regions within the Democratic Republic of the Congo, Ghana, Liberia, Malawi, Nigeria, Uganda, and Zambia. However, the majority of these datasets are based on EGRAs conducted at least five years ago. Recognizing the need for more recent data from the sub-Saharan Africa region, USAID's Africa Bureau seeks to expand the number of EGRA datasets from the region. This report aims to determine whether the current platform provides the appropriate venue to incorporate additional datasets.

DEVELOPMENT OF THE EXISTING PLATFORM

The concept for the Barometer was developed following the release of USAID’s Education Strategy in 2011, which initiated a pivot within USAID’s education portfolio towards early grade reading. In anticipation of a large influx of EGRA data sets, USAID staff began to think of ways this data could be utilized to better inform programming decisions. To facilitate the development of the Barometer, USAID, in partnership with RTI International, initiated a year-long consultation and experimentation phase, which resulted in the design and construction of the existing Barometer platform.

Following piloting of the Barometer in 2014, EGRA datasets from countries in Asia and the Middle East were gradually incorporated into the tool. The process of dataset incorporation entailed back-end preparation of the data by RTI International, as well as front-end tool preparation by a data visualization firm acquired by RTI International. After the initial piloting phase, the Africa Bureau joined the efforts to utilize the Barometer. Through Data for Education Research and Programming (DERP), a project under the Africa Bureau, EGRA datasets from countries in sub-Saharan Africa were incorporated into the Barometer, with the Asia Bureau continuing maintenance and oversight of the platform.

VALUE OF THE EXISTING PLATFORM

The existing platform has built a strong online presence and is widely utilized. The platform is unique in that data is easily accessible in the form of interactive reports that allow users to view snapshot data, access student performance data on EGRA subtasks, explore how different EGRA subtasks relate to one another, and even compare student outcomes on EGRA assessments across countries.ii In addition, the Barometer emphasizes data quality and reliability. Data incorporated into the Barometer has been checked against publicly available data from sources such as national governments and the United Nations, to ensure high-quality data and mitigate potential discrepancies. Furthermore, the platform has the capability to password-protect datasets for countries that may be hesitant to make EGRA scores publicly available.

As reflected in data from Google Analytics, the Barometer sees almost 3,000 users each month. This number has increased dramatically since a digital marketing campaign was implemented and the log-in requirement to use the tool was removed. Furthermore, many users return to the Barometer after their initial visit to the website. From January to August 2017, anywhere from 15 to 45 percent of users were classified as "returning users."iii

A digital marketing campaign was implemented in 2017 to attract more users to the tool. One aspect of the campaign entailed utilizing Google AdWords to direct users searching words relevant to early grade reading in Google to the Barometer’s website. This type of ad reached hundreds of thousands of Google users, including 187,950 Google users in August 2017 alone.iv Furthermore, this

ad received many clicks directing users to the Barometer website, including 3,376 clicks in August 2017.v

Social media outlets also proved effective in attracting users to the Barometer. For instance, as part of the digital marketing initiative in 2017, a Facebook advertisement campaign was successful in drawing attention to the tool, with Facebook advertisements from the period March 1 to August 28, 2017 reaching a total of 306,635 people. These advertisements were viewed a total of 772,836 times, and resulted in 7,320 link clicks to the Barometer website. Likewise, there was an increase in the total number of “Likes” on the Barometer’s Facebook page, from 30 in February 2017 to 3,034 in August of the same year.vi

DATA INCORPORATION COSTS

The level of effort and timeline required for the incorporation of a dataset depend upon how closely the data follow the codebook presented in the EGRA Toolkit 2016, and how clearly the dataset analysis aligns with the accompanying report. Thus, the timeline for dataset incorporation can fluctuate from as little as six weeks to a maximum of six months.viii
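As a purely illustrative aid, the sketch below shows one way a team might screen an incoming dataset for conformance with an expected codebook layout before estimating incorporation effort. The column names, types, and file paths are hypothetical placeholders, not the EGRA Toolkit 2016 codebook itself, and this is not the actual process or tooling used for the Barometer.

    # Illustrative only: screen an EGRA-style dataset against an assumed codebook layout.
    # Column names and the file name are hypothetical placeholders.
    import pandas as pd

    EXPECTED_COLUMNS = {
        "student_id": "object",
        "grade": "int64",
        "language": "object",
        "oral_reading_fluency": "float64",
        "reading_comprehension": "float64",
    }

    def screen_dataset(path: str) -> list[str]:
        """Return a list of issues that would lengthen the incorporation timeline."""
        df = pd.read_csv(path)
        issues = []
        for column, dtype in EXPECTED_COLUMNS.items():
            if column not in df.columns:
                issues.append(f"missing column: {column}")
            elif str(df[column].dtype) != dtype:
                issues.append(f"unexpected type for {column}: {df[column].dtype}")
        return issues

    for problem in screen_dataset("egra_country_dataset.csv"):
        print(problem)

A clean screening result of this kind would suggest a timeline closer to the six-week end of the range, while many flagged issues would point toward the longer end.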

ASSESSMENT OF THE EXISTING PLATFORM

The Barometer has been well-utilized both within USAID and by external actors. Data from Google Analytics and social media sites reflect the growing interest and usage of the platform. Furthermore, USAID has already invested in developing a strong web presence for the tool.

The Asia Bureau's experience with the Barometer presents valuable lessons moving forward on how to ensure the accessibility of the platform and EGRA data. First, where possible, it is important to refrain from using log-in restrictions on the data. This increases traffic to the website as a whole and allows the data to be more widely utilized. Furthermore, marketing has been an important tool for increasing traffic to the tool's site. As revealed by RTI's experience with the social media and Google advertisement campaigns, these tools were important for boosting the web presence and usage of the Barometer. As the necessary platform for the Barometer already exists and is well-operated, it is recommended that the Africa Bureau utilize the existing Early Grade Reading Barometer to incorporate additional datasets.

Based on RTI's experience with the development and implementation of the Barometer with the Asia Bureau, it is recommended that RTI International incorporate the datasets for the Africa Bureau as well. Dexis' Organizational Conflict of Interest Mitigation Plan for the Task Order states that Dexis will "Ensure RTI [International] will not be involved in any activities that require evaluation of reading programs that RTI [International] has led or is currently leading (Requirement 3.2 on the EGRA Barometer)." However, as the work under this sub-requirement involves the incorporation of data into the platform, and does not involve RTI International's evaluation of its own work or programs, Dexis does not see this as a conflict of interest and recommends that the subcontractor take on this sub-requirement.


i U.S. Agency for International Development, Early Grade Reading Barometer, https://earlygradereadingbarometer.org/ (accessed August 15, 2018).
ii Ibid.
iii RTI International, "Barometer: Year in Review," PowerPoint presentation, September 30, 2017.
iv Ibid.
v Ibid.
vi Ibid.
vii RTI International, "Barometer Costs," Spreadsheet, 2017.
viii RTI International, "Barometer: Year in Review," PowerPoint presentation, September 30, 2017.

REFERENCES

RTI International. "Barometer: Year in Review." PowerPoint presentation, September 30, 2017.

RTI International. "Barometer Costs." Spreadsheet, 2017.

U.S. Agency for International Development. Early Grade Reading Barometer. https://earlygradereadingbarometer.org/ (accessed August 15, 2018).


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING – AFRICA (REEP-A) EDUCATION DATA BRIEF

Approval date: May 24, 2018 Contract expiration date: September 28, 2021 Contract number: AID-OAA-TO-16-00024

Contract activity start and end dates: 09/29/2016 – 09/28/2021 Total award amount: $7,949,966.00 Implemented by: Dexis Consulting Group (Dexis) Subcontractor: Research Triangle Institute (RTI)


Education Data Brief: Global Prevalence of School-Related Gender-Based Violence (SRGBV)

February 4, 2019

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024

DISCLAIMER Prepared by Dexis Consulting Group on behalf of the United States Agency for International Development

ACRONYMS

DHS Demographic and Health Surveys

GDP Gross Domestic Product

GMR Global Monitoring Report

GSHS Global School-based Student Health Surveys

LGBT Lesbian, Gay, Bisexual, and Transgender

MICS Multiple Indicator Cluster Surveys

SACMEQ Southern and Eastern Africa Consortium for Monitoring Educational Quality

SRGBV School-Related Gender-Based Violence

SRSG Special Representative of the United Nations Secretary-General

TIMSS Trends in International Mathematics and Science Study

UNESCO United Nations Educational, Scientific and Cultural Organization

UNICEF United Nations Children's Fund

USAID United States Agency for International Development

VACS Violence Against Children Survey

WHO World Health Organization

EDUCATION DATA BRIEF: GLOBAL PREVALENCE OF SCHOOL-RELATED GENDER-BASED VIOLENCE (SRGBV) This Education Data Brief provides an overview of the global prevalence of school-related gender-based violence (SRGBV), drawing on recent data to illustrate the scope and extent of SRGBV around the world. In addition, the Data Brief provides country-level statistics, where available, with a particular focus on sub-Saharan Africa. The Data Brief is divided into two main sections: the prevalence of SRGBV and the impact of SRGBV. Following general figures, the section on the prevalence of SRGBV presents data and information organized as follows: bullying and other forms of non-sexual intimidation; sexual violence; physical violence; and high-risk groups. The data on the impact of SRGBV are organized by impacts on educational outcomes; impacts on physical and mental health outcomes; the link between exposure to prolonged toxic stress and developmental and cognitive outcomes; the impact on early marriage and pregnancy; the costs of SRGBV; and data on reporting and referrals.

PREVALENCE OF SRGBV

GENERAL STATISTICS
● Globally, an estimated 246 million children and adolescents experience some form of violence and bullying at school each year.i
● Estimates of the percentage of children and young people affected by bullying and violence at school range from 10% to 65%, depending on the country, the study, and the definition used.ii

INTIMIDATION ET AUTRES FORMES DE BRIMADES NON SEXUELLES
Y compris les injures, l’exclusion par les pairs, l’intimidation verbale, la violence physique et le cyber-harcèlement

● À travers le monde, près de 130 millions d’élèves (soit un peu plus d’un sur trois), âgés de 13 à 15 ans, sont victimes d’intimidation.iii
● Environ une fille sur trois, âgée de 13 à 15 ans, est régulièrement victime d’intimidation dans le monde.iv
● Dans 12 des 67 pays à revenu faible et intermédiaire qui disposent de données, plus de la moitié des adolescentes déclarent avoir récemment été victimes d’intimidation.v
● Les données des Enquêtes mondiales sur la santé des élèves en milieu scolaire (GSHS) de 85 pays, menées de 2003 à 2014, indiquent que le pourcentage des jeunes de 13 à 15 ans ayant indiqué avoir été victimes d’intimidation un ou plusieurs jours au cours des 30 derniers jours allait de 7 % au Tadjikistan (2006) à 74 % aux Samoa (2011). Les données des Études sur les tendances en mathématiques et en sciences (TIMSS) de 2011 indiquaient des taux plus élevés parmi les élèves de 11 à 15 ans interrogés, avec 78 % des élèves au Ghana et 81 % au Botswana faisant état d’intimidation au cours du mois précédent ou davantage.vi
● Dans un sondage d’opinion de 2016, mené par le Fonds des Nations unies pour l’enfance (UNICEF) auprès de quelque 100.000 enfants et jeunes de 18 pays à travers le monde, environ les deux tiers des répondants signalaient qu’ils avaient été victimes d’intimidation.vii

● Une étude compilant des bases de données GSHS de l’Organisation mondiale de la santé (OMS) de 19 pays à revenu faible et intermédiaire entre 2003 et 2006 a conclu que 34 % des élèves de 11 à 13 ans indiquaient avoir subi des intimidations au cours du mois précédent, et 8 % faisaient état d’intimidations quotidiennes.viii
● Les données du sondage d’opinion de 2016, UNICEF U-Report/Représentante spéciale du Secrétaire général des Nations unies – Violence contre les enfants (SRSG-VAC), ont indiqué que parmi les enfants victimes d’intimidation, 30 % disaient en avoir parlé à un adulte, 30 % l’avaient mentionné à un ami ou à un frère ou une sœur, moins de 10 % en avaient parlé à un enseignant et 30 % n’avaient rien dit à personne.ix
● Une enquête de 2008 au Ghana avait constaté que 31 % des élèves des dernières années de lycée avaient été attaqués physiquement une ou plusieurs fois au cours des 12 mois précédents, tandis que 40 % des interrogés signalaient avoir été victimes d’intimidation au cours des 30 derniers jours.x
● Dans une enquête menée auprès des écoles publiques de Nairobi, au Kenya, 63 % à 82 % des 100.000 élèves interrogés faisaient état de divers types d’intimidation, tandis qu’une enquête en Afrique du Sud constatait que plus de la moitié des répondants avait subi une ou deux fois des intimidations au cours du mois précédent.xi
● L’intimidation est omniprésente au Botswana, au Ghana et en Afrique du Sud, où dans chaque pays environ 80 % des élèves de la population de référence avaient été victimes d’intimidation, sous une forme ou une autre. Environ 80 % des élèves interrogés sont harcelés chaque mois alors que près de 50 % des élèves le sont chaque semaine.xii

CYBER-HARCÈLEMENT

● La prévalence du cyber-harcèlement devient un sujet de préoccupation croissant. Les statistiques disponibles suggèrent que le pourcentage des enfants et des adolescents affectés par le cyber-harcèlement dans le monde oscille entre 5 % et 21 %.xiii
● Une enquête de 2013 aux États-Unis a conclu que 15 % des adolescents de la troisième à la terminale (années 9 à 12 dans le système éducatif américain) avaient été intimidés en ligne au cours de l’année précédente. Les filles étaient deux fois plus susceptibles de signaler qu’elles avaient été victimes de cyber-harcèlement que les garçons, 21 % et 9 % respectivement.xiv
● L’Étude nationale de 2012 sur la violence à l’école en Afrique du Sud, portant sur un échantillon représentatif des élèves du secondaire, a constaté qu’un élève sur cinq faisait état d’une certaine forme de cyber-harcèlement au cours de l’année précédente.xv

VIOLENCE SEXUELLE
Y compris l’abus sexuel, le commerce du sexe, les injures de nature sexuelle, le viol

● Les données des Enquêtes sur la violence contre les enfants (VACS) indiquent que les attouchements de nature sexuelle non désirés représentent la forme d’abus sexuel des enfants la plus souvent signalée dans le monde.xvi
● L’enquête du Consortium de l’Afrique australe et orientale pour le suivi de la qualité de l’éducation (SACMEQ) a conclu que dans six pays, dont le Kenya, l’Ouganda et la Zambie, plus de 40 % des directeurs d’école indiquaient que le harcèlement sexuel d’élève à élève avait lieu « parfois » ou « souvent ». Des enseignants avaient également été signalés comme étant les coupables, avec en moyenne 39 % des directeurs d’école déclarant que le harcèlement enseignant-élève avait eu lieu dans leurs écoles.xvii

● Une analyse des données du SACMEQ montre que deux directeurs d’école sur cinq en Afrique australe et orientale signalent que le harcèlement sexuel se produit entre élèves dans leurs écoles primaires.xviii
● Au Kenya, une femme et un homme sur cinq victimes de violence sexuelle avant 18 ans indiquaient que le premier incident avait eu lieu à l’école. Les femmes étaient toutefois plus susceptibles que les hommes du même âge de dire que l’incident était intervenu lors d’un trajet à pied, 27 % contre 14 %, respectivement.xix
● En Afrique du Sud, une enquête nationale récente a conclu que 8 % des filles au niveau secondaire avaient été victimes d’agressions sexuelles graves ou de viol à l’école au cours de l’année précédente.xx
● En Zambie, une enquête scolaire a conclu que près de 11 % des garçons et 4 % des filles avaient subi des commentaires à caractère sexuel de la part d’enseignants.xxi
● Au Zimbabwe, les femmes de 18 à 24 ans étaient sensiblement plus susceptibles que leurs homologues masculins de signaler que leurs premières expériences de violence sexuelle avant 18 ans avaient eu lieu sur le trajet aller ou retour de l’école, à un taux de 19 % et 7 %, respectivement.xxii
● En Tanzanie, environ 23 % des femmes et 15 % des hommes de 13 à 24 ans signalaient avoir vécu au moins un incident de violence sexuelle pendant l’enfance sur le trajet aller ou retour de l’école.xxiii
● En Afrique du Sud, une enquête représentative, menée à l’échelle nationale en 2011-2012, a constaté qu’environ 1 élève du secondaire sur 20, soit à peu près 5 %, avait fait état d’au moins un acte de contacts sexuels non désirés à l’école au cours de l’année précédant l’enquête. Les filles étaient beaucoup plus susceptibles que les garçons de signaler qu’elles avaient été sexuellement violentées à l’école, à des taux de 8 % et 1 %, respectivement.xxiv

COMMERCE DU SEXE

● Au Mozambique, dans une étude du ministère de l’Éducation, 70 % des filles interrogées signalaient qu’elles savaient que certains enseignants avaient recours à des rapports sexuels en tant que condition pour passer dans la classe supérieure.xxv
● La contrainte et l’abus sexuels par des enseignants en échange de meilleures notes ont été documentés en Amérique latine et en Afrique. La coercition sexuelle exercée par des enseignants de sexe masculin sur des filles qui ne peuvent pas payer les dépenses liées aux études a également été documentée en Afrique.xxvi
● Au Ghana, 75 % des enfants citaient des enseignants en tant que principaux coupables de violence à l’école ; au Sénégal, le pourcentage s’élevait à 80 %. La violence peut se manifester sous forme de relations sexuelles inappropriées entre les enseignants masculins et les élèves de sexe féminin, de commerce du sexe pour couvrir les frais scolaires et le coût des fournitures, de rapports sexuels pour passer à la classe supérieure et d’usage excessif du châtiment corporel.xxvii
● Dans l’enquête sur la violence contre les enfants (VACS) effectuée au Swaziland en 2007, dans laquelle seules les femmes avaient fait partie de l’échantillonnage, près de 2 % des filles et femmes de 13 à 24 ans avaient indiqué qu’un enseignant ou un directeur d’école leur avait proposé de l’argent, des cadeaux, des produits alimentaires, un logement ou de meilleures notes en échange de rapports sexuels à un moment donné dans leur vie.xxviii

VIOLENCE PHYSIQUE
Y compris le châtiment corporel, le travail relevant de l’exploitation, l’humiliation publique

● Un rapport de l’UNICEF de 2014 indique que chez les adolescents, les pairs et les enseignants étaient les coupables de violence physique les plus courants. Concernant les adolescentes, les parents et les autres éducateurs étaient les coupables de violence physique les plus courants. Dans certains pays, les enseignants étaient mentionnés par un pourcentage significatif de filles, dont 48 % en Ouganda, 42 % au Kenya et 32 % au Nigeria.xxix
● Selon des données des Enquêtes démographiques et de santé (EDS) recueillies pour 36 pays, dans quatre pays d’Afrique subsaharienne, les enseignants étaient les coupables de violence physique les plus courants chez les filles non mariées, avec 34 % au Ghana, 47 % au Kenya, 58 % en Ouganda et 39 % en Tanzanie. Chez les garçons de 15 à 19 ans qui subissaient des violences physiques depuis l’âge de 15 ans, les enseignants étaient cités comme coupables dans 34 % des cas en Ouganda, 29 % au Mozambique, 21 % au Cameroun et 19 % au Ghana.xxx

CHÂTIMENT CORPOREL

● Quelque 732 millions d’enfants d’âge scolaire, soit la moitié de la population mondiale âgée de 6 à 17 ans, vivent dans des pays où ils ne bénéficient pas d’une protection juridique contre le châtiment corporel à l’école.xxxi
● Chez les adolescentes plus jeunes, de 10 à 14 ans, près de deux sur trois sont soumises régulièrement à des châtiments corporels, à la maison et à l’école, selon des données disponibles dans 33 pays.xxxii
● La première étude nationale sur l’abus de l’enfant en Inde, en 2007, a constaté que deux enfants sur trois avaient fait état d’abus physiques, dont le châtiment corporel. En dehors de la famille, les enseignants étaient cités comme les principaux coupables.xxxiii
● Les garçons et les enfants des familles les plus pauvres et des castes inférieures subissent le taux le plus élevé de châtiment corporel. Une étude menée par Young Lives en Andhra Pradesh (Inde), où le châtiment corporel a été interdit, a constaté que 82 % des garçons et 72 % des filles de 7 et 8 ans avaient subi une punition physique à l’école au cours de la semaine précédente.xxxiv
● Dans chaque pays couvert par l’étude de Young Lives, à savoir l’Éthiopie, l’Inde, le Pérou et le Vietnam, les garçons étaient plus susceptibles de subir un châtiment corporel à l’école que les filles.xxxv
● Des entretiens avec des élèves à la Barbade, en Égypte, en Inde, au Pakistan, au Soudan, en Tanzanie et au Zimbabwe ont révélé que les châtiments corporels à l’école étaient douloureux, que les adolescents détestaient leurs enseignants pour cette raison, qu’ils avaient des difficultés à se concentrer et à apprendre, avaient de moins bons résultats et évitaient, voire abandonnaient l’école de crainte d’être battus.xxxvi
● Un examen de la prévalence des châtiments corporels dans 63 pays a conclu que plus de 90 % des élèves subissaient des châtiments dans neuf pays, notamment le Botswana, le Cameroun, l’Ouganda et la Tanzanie, tandis que ces taux atteignaient 70 % à 89 % dans 11 pays, dont le Bénin, le Ghana et le Togo.xxxvii

DONNÉES DE PRÉVALENCE POUR DES GROUPES SPÉCIFIQUES À HAUT RISQUE
Y compris les élèves handicapés, les élèves LGBT, les populations marginalisées, les situations de conflit et de crise

ÉLÈVES HANDICAPÉS

● Aux États-Unis, les élèves de sexe féminin qui bénéficiaient de services d’éducation spécialisée étaient 4,8 fois plus susceptibles d’être victimes d’intimidation que leurs homologues non handicapées.xxxviii
● Dans une étude concernant 3.706 élèves du primaire en Ouganda, 24 % des filles handicapées de 11 à 14 ans faisaient état de violence sexuelle à l’école, comparé à 12 % des filles non handicapées.xxxix

ÉLÈVES LGBT

● Les élèves lesbiennes, gays, bisexuels et transsexuels (LGBT) faisaient régulièrement état d’une prévalence de violence plus élevée, comparé à leurs homologues non LGBT. Une étude de 2014 en Nouvelle-Zélande a conclu que les élèves lesbiennes, gays et bisexuels étaient trois fois plus susceptibles d’être intimidés que leurs homologues hétérosexuels et que les élèves transsexuels étaient cinq fois plus susceptibles d’être intimidés que leurs homologues non transsexuels.xl
● Dans une étude de 2012 sur les lycéens aux États-Unis, les filles étaient plus susceptibles que les garçons d’indiquer qu’elles avaient été victimes de cyber-harcèlement en même temps que d’intimidation à l’école, à un taux de 11 % et 8 % respectivement. Parmi les jeunes qui se considèrent comme non hétérosexuels, 23 % indiquaient avoir été victimes à la fois de cyber-harcèlement et d’intimidation à l’école, comparé à seulement 9 % de ceux qui se considéraient comme hétérosexuels.xli
● Une analyse récente des données de l’Organisation des Nations unies pour l’éducation, la science et la culture (UNESCO) a conclu que le pourcentage d’élèves LGBT subissant des violences et des intimidations à l’école allait de 16 % au Népal à 85 % aux États-Unis.xlii
● Des données collectées en Norvège en 2015 indiquaient qu’entre 15 % et 48 % des élèves LGBT étaient victimes d’intimidation, comparé à 7 % des élèves hétérosexuels.xliii
● En Asie, des études indiquent que le pourcentage d’élèves LGBT victimes d’intimidation à l’école allait de 7 % en Mongolie à 68 % au Japon.xliv
● Les élèves qui ne sont pas LGBT mais qui sont perçus comme non conformes aux normes de genre sont aussi des cibles de la violence homophobe ; dans certains pays, jusqu’à un tiers des élèves perçus comme ne respectant pas les normes de genre subissent une violence homophobe et transphobe à l’école.xlv En Thaïlande, par exemple, 24 % des élèves hétérosexuels ont subi des violences parce que l’expression de leur genre était perçue comme non conforme.xlvi

POPULATIONS MARGINALISÉES

● Les enfants affectés par le VIH et le sida courent un risque accru de violence sexuelle et d’être la cible d’intimidation. Une étude portant sur plus de 6.700 élèves du secondaire au Zimbabwe a constaté l’existence de preuves solides indiquant que les enfants orphelins, en particulier ceux ayant perdu leurs deux parents, étaient plus susceptibles d’être victimes de rapports sexuels forcés que les non-orphelins.xlvii

● Les enfants et les adolescents les plus vulnérables, y compris ceux qui sont pauvres, issus de minorités ethniques, linguistiques ou culturelles, issus de communautés de migrants ou de réfugiés, ou en situation de handicap, courent un risque plus élevé de violence et d’intimidation à l’école.xlviii
● L’analyse des données SACMEQ montre qu’au Kenya, où presque la moitié de tous les directeurs d’école faisaient état d’un harcèlement sexuel d’élève à élève, on enregistrait un écart de 40 points de pourcentage entre les écoles desservant les enfants des milieux les plus pauvres et celles desservant les plus aisés.xlix
● Dans le cadre du projet Young Lives, les conclusions indiquaient que les enfants des familles pauvres subissaient constamment les taux les plus élevés d’intimidation.l
● Le projet d’ActionAid concernant la violence sexuelle faite aux filles a constaté que l’exploitation sexuelle pouvait être liée à la pauvreté, les filles étant contraintes à avoir des relations sexuelles avec des enseignants de sexe masculin pour payer leurs frais scolaires.li

SITUATIONS DE CONFLIT ET DE CRISE

● Pendant des conflits, lorsque les institutions, les structures de responsabilité et les réseaux sociaux sont affaiblis, les filles et les garçons courent davantage le risque de subir des violences sexuelles.lii
● Les adolescentes peuvent être particulièrement exposées à la violence et au harcèlement sexuels quand une partie de leurs écoles tient lieu de baraquement ou de base pour les militaires, les groupes armés ou la police. Craignant de tels abus, les filles peuvent abandonner leurs études, être retirées de l’école ou ne pas s’inscrire à une classe supérieure.liii
● Dans les pays où la violence sexuelle est utilisée comme arme de guerre, les conséquences sont graves. Pour les filles, les conséquences du viol (qui comprennent un traumatisme psychologique et une stigmatisation) mettent en danger leur droit à l’éducation pour le reste de leur vie.liv

IMPACT DE LA VBGMS

IMPACT SUR LES RÉSULTATS SCOLAIRES

● Une étude récente a effectué des examens systématiques et des méta-analyses, portant respectivement sur 67 et 43 études de 21 pays, pour évaluer le rapport global entre les différents types de violence subis au cours de l’enfance et les résultats scolaires. L’étude a conclu que les enfants qui avaient subi une forme quelconque de violence pendant l’enfance avaient une probabilité estimée de 13 % de ne pas terminer l’école. Les garçons qui avaient été intimidés étaient trois fois plus susceptibles de manquer l’école, et les filles ayant subi des violences sexuelles étaient trois fois plus susceptibles d’être absentes.lv
● Un examen mondial a constaté que la violence émotionnelle multiplie par deux le risque qu’un enfant abandonne l’école. Les filles qui avaient subi des violences émotionnelles pendant l’enfance étaient 2,5 fois plus susceptibles d’avoir des résultats scolaires négatifs (redoubler, suivre des cours de remise à niveau, etc.) que celles n’ayant jamais été victimes de violence émotionnelle.lvi
● La violence pendant l’enfance a un impact significatif sur les résultats des enfants aux examens normalisés. Selon une étude mondiale récente, les enfants victimes de violence avant l’âge de 18 ans avaient une probabilité prédite de 9 % d’avoir des résultats scolaires médiocres, comparé à leurs pairs n’ayant pas été victimes de violence pendant l’enfance.lvii
● Dans une enquête sur la violence basée sur le genre au Malawi, 61 % des filles victimes de violence basée sur le genre indiquaient que leurs résultats scolaires en avaient été affectés.lviii
● Les victimes de violence et d’intimidation en milieu scolaire sont davantage susceptibles de manquer l’école, d’avoir des notes plus basses et d’abandonner l’école, ce qui peut avoir un effet négatif sur les résultats scolaires, les possibilités d’éducation futures et les perspectives d’emploi.lix
● Le Rapport mondial de 2006 sur la violence contre les enfants des Nations unies fait remarquer que l’abus verbal, l’intimidation et la violence sexuelle dans les écoles sont régulièrement signalés comme des raisons du manque de motivation, de l’absentéisme et de l’abandon scolaire.lx
● Le châtiment corporel peut être un facteur de l’abandon scolaire. Par exemple, dans une étude au Népal, 14 % des abandons scolaires étaient attribués au châtiment corporel et à la crainte des enseignants.lxi
● Une étude de 2010 au Royaume-Uni a conclu que les élèves de 16 ans victimes d’intimidation à l’école étaient deux fois plus susceptibles de ne pas faire d’études, de ne pas avoir d’emploi ou de formation et d’avoir des salaires de faible niveau à 23 et 33 ans que ceux n’ayant pas subi d’intimidation. Ces jeunes hommes étaient trois fois plus susceptibles de souffrir de dépression et cinq fois plus susceptibles d’avoir un casier judiciaire.lxii
● Une analyse récente des données de l’UNESCO a conclu qu’en Thaïlande, 31 % des élèves qui avaient subi des taquineries ou des intimidations homophobes faisaient état d’absences scolaires au cours du mois précédent, et qu’en Argentine, 45 % des élèves transsexuels avaient quitté l’école.lxiii Il était également indiqué que les élèves LGBT avaient des résultats scolaires plus médiocres que leurs pairs hétérosexuels en Australie, au Chili, au Danemark, au Salvador, en Italie et en Pologne.lxiv
● Au Nigeria, une étude a conclu que les enfants fréquentant une école autorisant les châtiments corporels (gifler, pincer, frapper avec une baguette ou un bâton) avaient un vocabulaire réceptif, un fonctionnement exécutif et une motivation intrinsèque moindres que les enfants fréquentant une école interdisant le châtiment corporel.lxv

IMPACT SUR LES RÉSULTATS DE SANTÉ PHYSIQUE ET MENTALE

● Une enquête au Swaziland a conclu que les filles et jeunes femmes de 13 à 24 ans qui avaient été victimes de violence sexuelle pendant l’enfance étaient trois fois plus susceptibles de contracter le VIH ou d’autres maladies sexuellement transmissibles, ou d’avoir une grossesse non désirée, que celles n’ayant pas été victimes.lxvi
● Les jeunes qui ont participé à des altercations physiques sont plus susceptibles d’éprouver une moins grande satisfaction de vivre et un bien-être psychologique plus faible, ainsi que des relations plus fragiles avec la famille et les pairs.lxvii
● Des études récentes suggèrent que les victimes d’intimidation en milieu scolaire courent des risques accrus de santé précaire, ainsi que de moins bons résultats en matière de richesse et de relations sociales à l’âge adulte, même après prise en compte des difficultés familiales et des troubles psychiatriques de l’enfance.lxviii
● Une analyse des données de 30 pays industrialisés et en transition a conclu que seulement 27 % des enfants victimes d’intimidation déclaraient bénéficier d’une excellente santé, comparé à 36 % des autres enfants. De même, 29 % des enfants qui avaient été harcelés et qui se livraient eux-mêmes à l’intimidation disaient que leur vie était très satisfaisante, comparé à 40 % des autres enfants.lxix
● Une étude documentant les taux de blessures liées au châtiment corporel en milieu scolaire a conclu que les élèves en Zambie faisaient état de douleurs, de malaises physiques, de nausées, de gêne et d’un sentiment de rancune.lxx
● En Tanzanie, près d’un quart des 408 élèves du primaire interrogés ont déclaré que le châtiment corporel avait été si intense qu’ils avaient été blessés.lxxi

LIEN ENTRE L’EXPOSITION À UN STRESS TOXIQUE PROLONGÉ ET LES RÉSULTATS DÉVELOPPEMENTAUX ET COGNITIFS

● L’exposition à la violence à un jeune âge peut gêner le développement du cerveau et endommager d’autres parties du système nerveux, ainsi que les systèmes endocrinien, circulatoire, locomoteur, reproductif, respiratoire et immunitaire, avec des conséquences à vie.lxxii
● Le stress toxique peut entraîner des modifications à court terme du comportement observable, ainsi que des changements moins apparents mais néanmoins permanents de la structure et du fonctionnement du cerveau. De plus en plus de preuves associent les expériences négatives pendant l’enfance à un risque plus important d’être atteint de diverses maladies chroniques qui peuvent se manifester à l’âge adulte.lxxiii
● Participer à l’intimidation à l’école peut prédire un comportement antisocial et criminel dans l’avenir. Le fait d’être victime d’intimidation est aussi lié à un plus grand risque de troubles alimentaires et de difficultés sociales et relationnelles.lxxiv
● Les élèves victimes d’intimidation sont plus susceptibles que leurs pairs d’être déprimés, solitaires ou anxieux et de manquer de confiance en eux.lxxv
● Plusieurs études citées récemment dans une analyse des données de l’UNESCO montrent que les enfants et les jeunes qui ont été victimes d’intimidation homophobe courent un plus grand risque d’anxiété, de dépression, de crainte, de stress, de manque de confiance en soi, de solitude, d’autodestruction et de pensées suicidaires.lxxvi
● Une étude aux États-Unis ayant suivi des enfants victimes d’intimidation entre 7 et 11 ans a constaté des effets durables 40 ans plus tard, notamment sur les qualifications scolaires, le soutien familial, les résultats aux tests cognitifs et la santé physique.lxxvii

IMPACT SUR LES MARIAGES ET LES GROSSESSES PRÉCOCES

● L’abus sexuel menant à une grossesse peut se produire en milieu scolaire, sous forme de VBGMS infligée aux filles par des élèves de sexe masculin ou des enseignants. Une étude du ministère de l’Éducation nationale en Côte d’Ivoire, par exemple, a conclu qu’environ 50 % des enseignants indiquaient avoir eu des relations sexuelles avec des élèves, ce pourcentage atteignant 70 % dans une région.lxxviii
● La VBGMS liée à la grossesse englobe aussi l’intimidation et les taquineries exercées par les camarades de classe et les enseignants à l’encontre des filles enceintes et des mères adolescentes.lxxix
● Le mariage précoce a un lien étroit avec la grossesse précoce ou non désirée et l’abandon de l’école, et il peut accentuer la dynamique de l’inégalité des genres.lxxx
● Les données des Enquêtes démographiques et de santé (EDS) et des Enquêtes par grappes à indicateurs multiples (MICS) de 78 pays en voie de développement, recueillies entre 2000 et 2011, montrent que 63 % des filles sans éducation indiquaient avoir été mariées avant 18 ans, comparé à 45 % de celles ayant suivi des études primaires et 20 % de celles ayant suivi des études secondaires.lxxxi

● Certains pays ont des politiques stipulant le renvoi ou l’exclusion de l’école des filles enceintes. Quand les filles restent à l’école ou y reviennent après avoir accouché, elles peuvent être victimes d’intimidation et d’abus verbal de la part de leurs camarades de classe et de leurs enseignants.lxxxii

COÛTS DE LA VBGMS

● Dans la région de l’Asie de l’Est et du Pacifique, on estime que les coûts économiques de certaines conséquences sur la santé associées à la maltraitance des enfants représentaient entre 1,4 % et 2,5 % du produit intérieur brut (PIB) annuel de la région.lxxxiii
● La violence à l’égard des jeunes, ne serait-ce qu’au Brésil, coûte près de 19 milliards USD par an, d’après les estimations, dont 943 millions USD peuvent être associés à la violence dans les écoles.lxxxiv
● Aux États-Unis, on estime que le coût de la violence en milieu scolaire pour l’économie s’élève à 7,9 milliards USD par an.lxxxv
● Des travaux d’analyse appuyés par l’Agence des États-Unis pour le développement international (USAID) montrent que la VBGMS, à elle seule, peut être liée à la perte d’une année d’école primaire, ce qui se traduit par un coût annuel d’environ 17 milliards USD pour les pays à revenu faible et intermédiaire.lxxxvi
● Une étude a montré que chaque année, le Cameroun, la République démocratique du Congo et le Nigeria perdaient respectivement 974 millions USD, 301 millions USD et 1.662 millions USD faute d’éduquer les filles selon les mêmes normes que les garçons, la violence à l’école étant un des facteurs clés contribuant à la sous-représentation des filles dans l’éducation.lxxxvii
● Une étude de Plan International en Inde a calculé que le coût pour la société de l’abandon de l’école à la suite de châtiments corporels correspondait à une perte de bénéfices sociaux de 1,5 à 7,4 milliards USD chaque année, soit entre 0,13 % et 0,64 % du PIB de l’Inde.lxxxviii

DONNÉES SUR LE SIGNALEMENT ET LES ORIENTATIONS

● Une Enquête démographique et de santé (EDS) avait demandé aux filles et aux femmes si elles avaient cherché de l’aide auprès d’une source donnée pour mettre un terme à la violence et, dans l’affirmative, auprès de qui. Les conclusions de 30 pays confirment que la majorité des adolescentes victimes de violence ne cherchent pas à être aidées. Dans les 30 pays, plus de la moitié des filles de 15 à 19 ans qui avaient subi des violences physiques et/ou sexuelles disaient qu’elles n’avaient demandé l’aide de personne.lxxxix
● En se basant sur les données de 30 pays, seulement 1 % des adolescentes victimes de relations sexuelles forcées avaient cherché une aide professionnelle.xc
● Une enquête de référence effectuée en Ouganda dans le cadre de l’étude Good Schools, à laquelle ont participé 3.706 élèves de 42 écoles, a constaté que la prise en charge de première ligne des signalements d’abus par les enfants était médiocre, malgré l’existence de quelques structures d’orientation. Selon cette enquête, 529 enfants (14 %) avaient bénéficié d’une orientation. Les filles étaient plus susceptibles que les garçons de recevoir une orientation et de satisfaire aux critères d’un cas grave, à un taux de 9 % et 4 % respectivement. Au total, 104 orientations (20 %) ont abouti à des mesures relativement concrètes, alors que seulement 20 cas (3,8 %) avaient satisfait à tous les critères pour recevoir une réponse adéquate. Près de la moitié (43 %) des enfants orientés n’avaient jamais pensé à demander de l’aide en divulguant leur expérience de violence avant l’enquête de référence.xci

SOURCES ET RÉFÉRENCES

Child, Jennifer Christine, Dipak Naker, Jennifer Horton, Eddy Joshua Walakira, et Karen M. Devries. 2014. “Responding to abuse: Children’s experiences of child protection in a central district, Uganda.” Child Abuse & Neglect. Vol 38(10). Octobre 2014. 1647-1658. Disponible à l’adresse suivante : https://www.sciencedirect.com/science/article/abs/pii/S0145213414002221

Fry, Deborah, Xiangming Fang, Stuart Elliott, Tabitha Casey, Xiaodong Zheng, Jiaoyuan Li, Lani Florian, et Gillean McCluskey. 2018. “The relationships between violence in childhood and educational outcomes: A global systematic review and meta-analysis.” Child Abuse & Neglect, 75(2018), 6-28. Disponible à l’adresse suivante : https://ac.els-cdn.com/S0145213417302491/1-s2.0-S0145213417302491-main.pdf?_tid=79a4da72-3741-4f75-be6d-6cf42519062d&acdnat=1524512135_5c84772f6be9ba884c849085d4ba30be

Gershoff, Elizabeth T. 2017. “School corporal punishment in global perspective: prevalence, outcomes, and efforts at intervention.” Psychology, Health & Medicine, 22:sup1, 224-239. Disponible à l’adresse suivante : https://www.tandfonline.com/doi/pdf/10.1080/13548506.2016.1271955?needAccess=true

GMR, UNESCO, UNGEI. 2015. “School-related gender-based violence is preventing the achievement of quality education for all.” Policy paper 17. Disponible à l’adresse suivante : http://www.ungei.org/srgbv/files/232107E.pdf

Know Violence In Childhood: A Global Learning Initiative. 2017. Global Report 2017: Ending Violence in Childhood. Disponible à l’adresse suivante : http://globalreport.knowviolenceinchildhood.org/

Parkes, Jenny. 2015. Gender-based Violence in Education. Background paper for the Education for All Global Monitoring Report 2015. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0023/002323/232399e.pdf

Plan International. 2008. “Paying the price: The economic cost of failing to educate girls.” Children in Focus, Volume 1. Disponible à l’adresse suivante : http://www.gbchealth.org/system/documents/category_1/320/The%20Economic%20Cost%20of%20Failing%20to%20Educate%20Girls.pdf?1344866507/.

Plan International. 2013. A girl’s right to learn without fear: Working to end gender-based violence at school. Disponible à l’adresse suivante : https://plan-international.org/publications/girls-right-learn-without-fear.

RTI International. 2015. “What is the cost of school-related gender-based violence?” USAID Fact Sheet. Disponible à l’adresse suivante : http://www.ungei.org/resources/files/Cost_Associated_with_School_Violence_FINAL.pdf.

Shonkoff, Jack P., Andrew S. Garner et le Comité des aspects psychosociaux de la santé de l’enfant et de la famille, le Comité de la petite enfance, de l’adoption et des soins aux personnes à charge, et la section sur la pédiatrie développementale et comportementale. 2012. The Lifelong Effects of Early Childhood Adversity and Toxic Stress. Rapport technique. American Academy of Pediatrics. Disponible à l’adresse suivante : http://pediatrics.aappublications.org/content/early/2011/12/21/peds.2011-2663

UNGEI. 2017. “Realising SDG Commitments to Gender Equality in Education: Pathways and Priority Actions.” Note d’information. Disponible à l’adresse suivante : http://www.ungei.org/SDGS_paper_Final.pdf.

UNGEI et NRC. 2016. “Addressing School-Related Gender-Based Violence is Critical for Safe Learning Environments in Refugee Contexts.” Document d’information. Disponible à l’adresse suivante : http://www.ungei.org/srgbv/files/Refugee_Brief_Final.pdf.

UNESCO. 2014. “Developing an education sector response to early and unintended pregnancy.” Document de réflexion pour une consultation globale. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0023/002305/230510E.pdf.

UNESCO. 2016. Out in the Open: Education Sector Responses to Violence Based on Sexual Orientation and Gender Identity Expression. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0024/002446/244652e.pdf/

UNESCO. 2017. School Violence and Bullying Global Status Report. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0024/002469/246970e.pdf.

UNESCO and UN Women. 2016. Global Guidance on Addressing School-Related Gender-Based Violence. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0024/002466/246651E.pdf.

UNICEF. 2014a. Hidden in Plain Sight: A Statistical Analysis of Violence Against Children. Disponible à l’adresse suivante : http://files.unicef.org/publications/files/Hidden_in_plain_sight_statistical_analysis_EN_3_Sept_2014.pdf

UNICEF. 2014b. “A Statistical Snapshot of Violence Against Adolescent Girls.” Disponible à l’adresse suivante : https://www.unicef.org/publications/files/A_Statistical_Snapshot_of_Violence_Against_Adolescent_Girls.pdf.

UNICEF. 2017. A Familiar Face: Violence in the Lives of Children and Adolescents. Disponible à l’adresse suivante : https://www.unicef.org/publications/files/Violence_in_the_lives_of_children_and_adolescents.pdf

Nations unies. 2016. Ending the torment: tackling bullying from the schoolyard to cyberspace. Disponible à l’adresse suivante : http://srsg.violenceagainstchildren.org/sites/default/files/2016/End%20bullying/bullyingreport.pdf.

USAID. 2016. The Effects of School-Related Gender-Based Violence on Academic Performance: Evidence from Botswana, Ghana and South Africa. Disponible à l’adresse suivante : http://www.ungei.org/The_Effects_of_SRGBV_Academia_FINAL.pdf

OMS. 2016. Inspire. Seven strategies for ending violence against children. Disponible à l’adresse suivante : http://apps.who.int/iris/bitstream/10665/207717/1/9789241565356-eng.pdf.


NOTES DE FIN

i UNESCO. 2017. School Violence and Bullying Global Status Report. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0024/002469/246970e.pdf.

ii UNESCO. 2017

iii UNICEF. 2017. A Familiar Face: Violence in the Lives of Children and Adolescents. Disponible à l’adresse suivante : https://www.unicef.org/publications/files/Violence_in_the_lives_of_children_and_adolescents.pdf

iv UNICEF. 2014b. “A Statistical Snapshot of Violence Against Adolescent Girls.” Disponible à l’adresse suivante : https://www.unicef.org/publications/files/A_Statistical_Snapshot_of_Violence_Against_Adolescent_Girls.pdf.

v UNICEF. 2014b.

vi Dominic and Hiu. 2016. (tel que cité dans Nations unies 2016)

vii Nations unies. 2016. Données 2016 U-Report/Représentante spéciale du Secrétaire général des Nations unies chargée de la question de la violence contre les enfants (SRSG-VAC).

viii World Health Organization GSHS data sets (tel que cité dans UNESCO 2017).

ix UNESCO. 2017.

x Owusu, A. 2008. (tel que cité dans Nations unies 2016).

xi Jones N., et al. 2008. (tel que cité dans Nations unies 2016).

xii USAID. 2016. The Effects of School-Related Gender-Based Violence on Academic Performance: Evidence from Botswana, Ghana and South Africa. Disponible à l’adresse suivante : http://www.ungei.org/The_Effects_of_SRGBV_Academia_FINAL.pdf

xiii UNESCO. 2017.

xiv 2013 Youth Risk Behavior Survey (tel que cité dans UNESCO 2017).

xv UNESCO. 2017.

xvi UNICEF. 2017.

xvii GMR, UNESCO, UNGEI. 2015. “School-related gender-based violence is preventing the achievement of quality education for all.” Policy paper 17. http://www.ungei.org/srgbv/files/232107E.pdf

xviii Analyse citée dans un document d’orientation conjoint publié par le Rapport mondial de suivi (GMR/RMS), l’UNESCO et l’Initiative des Nations unies pour l’éducation des filles (UNGEI).

xix UNICEF. 2014a.

xx Burton and Leoschut, 2013. (tel que cité dans UNESCO et UNGEI 2015).


xxi Plan International. 2013.
xxii UNICEF. 2014a.
xxiii UNICEF. 2014a.
xxiv UNICEF. 2014a.
xxv Plan International. 2013.
xxvi GMR, UNESCO, UNGEI. 2015. (tel que cité dans UNESCO 2017).
xxvii Plan International. 2013.
xxviii UNICEF. 2014a.
xxix Analysis of data from Demographic and Health Survey (DHS) and Multiple Indicator Cluster Survey (MICS) (tel que cité dans UNICEF 2014a).
xxx UNICEF. 2014a.
xxxi UNICEF. 2017.
xxxii UNICEF. 2014b.
xxxiii UNICEF. 2014a.
xxxiv Voir http://www.younglives.org.uk (tel que cité dans UNESCO 2017).
xxxv Cité dans Gershoff. 2017.
xxxvi Cité dans Gershoff. 2017.
xxxvii Cité dans Gershoff. 2017.
xxxviii Know Violence in Childhood. 2017.
xxxix Devries, K., Kyegome, N., Zuurmond, M., Parkes, J., Child, J., Walakira, E. and Naker, D. 2014. Violence against primary school children with disabilities in Uganda: a cross-sectional study. BMC Public Health, Vol. 14. (tel que cité dans UNESCO et ONU Femmes 2016).
xl UNESCO. 2016. Out in the Open: Education Sector Responses to Violence Based on Sexual Orientation and Gender Identity Expression. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0024/002446/244652e.pdf/
xli Schneider et al. (2012) (tel que cité dans UNESCO 2017).
xlii UNESCO. 2016.


xliii UNESCO. 2016.
xliv UNESCO. 2016.
xlv Know Violence in Childhood. 2017.
xlvi UNESCO. 2016.
xlvii Pascoe, S. J. S., Langhaug, L. F., Durawo, J., Woelk, G., Ferrand, R., Jaffar, S., Hayes, R. and Cowan, F. M. 2010. Increased risk of HIV-infection among school-attending orphans in rural Zimbabwe. AIDS Care, Vol. 22, No. 2, pp. 206-20. (tel que cité dans GMR, UNESCO, UNGEI 2015).
xlviii UNESCO. 2017.
xlix GMR, UNESCO, UNGEI. 2015.
l UNESCO. 2017.
li UNESCO. 2017.
lii Cité dans (Girls’ Right 17) : SRSG on Violence against Children. 2012. Tackling Violence in Schools: A Global Perspective. Bridging the Gap Between Standards and Practice. New York, Bureau de la Représentante spéciale du Secrétaire général chargée de la question de la violence contre les enfants.
liii Cité dans (Girls’ Right 17) : Human Rights Watch (2009). Sabotage Schooling: Naxalite Attacks and Police Occupation of Schools in India’s Bihar and Jharkhand States. New York, Human Rights Watch. / Human Rights Watch (2010). ‘Targets of Both Sides’. Violence against Students, Teachers, and Schools in Thailand’s Southern Border Provinces. New York, Human Rights Watch. / Commission des droits de l’homme des Nations unies (2005). Rapport du Haut-Commissariat des Nations unies sur la situation des droits de l’homme en Colombie, Conseil économique et social des Nations unies.
liv Cité dans (Girls’ Right 17) : UNESCO (2011). EFA Global Monitoring Report 2011. The Hidden Crisis: Armed Conflict and Education, p. 15. Paris, UNESCO.
lv Fry, D., X. Fang, S. Elliott, T. Casey, X. Zheng, J. Li, L. Florian et G. McCluskey. 2018. “The relationships between violence in childhood and educational outcomes: A global systematic review and meta-analysis.” Child Abuse & Neglect, 75(2018), 6-28. Disponible à l’adresse suivante : https://ac.els-cdn.com/S0145213417302491/1-s2.0-S0145213417302491-main.pdf?_tid=79a4da72-3741-4f75-be6d-6cf42519062d&acdnat=1524512135_5c84772f6be9ba884c849085d4ba30be


lvi Fry et al. 2018. “The relationships between violence in childhood and educational outcomes: A global systematic review and meta-analysis.” Child Abuse & Neglect, 75(2018), 6-28.
lvii Fry et al. 2018. “The relationships between violence in childhood and educational outcomes: A global systematic review and meta-analysis.” Child Abuse & Neglect, 75(2018), 6-28.
lviii Bisika, T., Ntata, P. et Konyani, S. 2009. Gender-violence and education in Malawi: a study of violence against girls as an obstruction to universal primary education. Journal of Gender Studies, Vol. 18, No. 3, pp. 287-94. (tel que cité dans le GMR, UNESCO, UNGEI 2015).
lix UNESCO. 2017.
lx Nations unies. 2016.
lxi Nations unies. 2016.
lxii Données d’Ellery, F., Kassam, N., & Bazan, C. (2010). Prevention Pays: The Economic Benefits Of Ending Violence In Schools. Plan International: UK, 10. (tel que cité dans UNESCO 2017).
lxiii UNESCO. 2016.
lxiv UNESCO. 2016.
lxv Cité dans Gershoff. 2017.
lxvi Know Violence in Childhood. 2017.
lxvii Currie et al. 2013 (tel que cité dans HBSC Bullying and Fighting Fact Sheet).
lxviii Wolke et al. 2013 (tel que cité dans HBSC Bullying and Fighting Fact Sheet).
lxix UNESCO. 2017.
lxx Cité dans Gershoff. 2017.
lxxi Cité dans Gershoff. 2017.


lxxii Currie et al. 2013 (tel que cité dans HBSC Bullying and Fighting Fact Sheet).
lxxiii Wolke et al. 2013 (tel que cité dans HBSC Bullying and Fighting Fact Sheet).
lxxiv UNESCO. 2017.
lxxv HBSC Bullying and Fighting Fact Sheet et SRSG-VAC (2012) (tel que cité dans UNESCO 2017).
lxxvi UNESCO. 2016.
lxxvii http://www.telegraph.co.uk/news/health/children/10772302/Bullying-at-school-affects-health-40-years-later.html (tel que cité dans UNESCO 2017).
lxxviii UNESCO. 2014. “Developing an education sector response to early and unintended pregnancy.” Document de réflexion pour une consultation globale. Disponible à l’adresse suivante : http://unesdoc.unesco.org/images/0023/002305/230510E.pdf.
lxxix UNESCO. 2014.
lxxx UNESCO. 2014.
lxxxi UNESCO. 2014.
lxxxii UNESCO. 2014.
lxxxiii UNESCO. 2017.
lxxxiv SRSG-VAC 2012 (tel que cité dans UNESCO 2017).
lxxxv SRSG-VAC 2012 (tel que cité dans UNESCO 2017).
lxxxvi RTI International. 2015. What is the cost of school-related gender-based violence? USAID Fact Sheet. Disponible à l’adresse suivante : http://www.ungei.org/resources/files/Cost_Associated_with_School_Violence_FINAL.pdf.
lxxxvii Antonowicz, Laetitia. Too Often in Silence. Rapport sur la violence en milieu scolaire en Afrique de l’Ouest et centrale, basé sur les données de Plan International 2008.
lxxxviii Cité dans Gershoff. 2017.
lxxxix UNICEF. 2014a.
xc UNICEF. 2017.
xci Child et al. 2014.

TEACHER LANGUAGE AND LITERACY ASSESSMENT (TLLA)
Research Plan

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

Prepared for

USAID

Prepared by

Dexis Consulting Group

Research for Effective Education Programming – Africa (REEP-A)

Contract No. AID-OAA-TO-16-00024

July 2019

ACRONYMS

DERP Data for Education Research and Programming in Africa

EGMA Early Grade Mathematics Assessment

EGRA Early Grade Reading Assessment

L1 Language spoken at home

L2 Second language a person learns to speak

Lx 2nd, 3rd, or any additional language beyond L1

LOI Language of Instruction

LOI1 Initial Language of Instruction

LOI2 Subsequent Language of Instruction

REEP-A Research for Effective Education Programming–Africa

SSME Snapshot of School Management Effectiveness surveys

TLLA Teacher Language and Literacy Assessment

USAID United States Agency for International Development

TABLE OF CONTENTS

TABLE OF CONTENTS II
INTRODUCTION 5
GENERAL DESCRIPTION AND BACKGROUND OF WORK 6
OVERVIEW OF THE STAGES OF THE TLLA DEVELOPMENT 7
TLLA FOCAL AREAS AND SUBTASK RECOMMENDATIONS 9
RECOMMENDED SUBTASKS 9
DRAFT ENUMERATION PROTOCOLS 9
DRAFT POST-ENUMERATION ANALYSIS APPROACHES 10
CRITERIA FOR PILOT LOCATION 10
TIMEFRAME 12
STAFFING PLAN 14
STAFF BIOS 15

INTRODUCTION

The Education for All movement, the Millennium Development Goals, and more recently, the Sustainable Development Goals are a few of the overt commitments by the international community to recognizing and upholding the importance of education to development. The United States has been an active promoter of international education for decades, most obviously through a large portfolio of education programs under the United States Agency for International Development (USAID). In line with the international community’s priorities, the United States Government Strategy on International Basic Education, released in September 2018, states that its two primary objectives are to improve learning outcomes and to expand access to quality basic education for all, including the historically marginalized.

From 2011 to 2016 the USAID-funded Data for Education Research and Programming in Africa (DERP) project provided the USAID Africa Bureau, Missions, and partner organizations with research-based evidence, recommendations, and capacity-building activities to guide education project design and implementation. Over the next five years, Research for Effective Education Programming–Africa (REEP-A) aims to build and expand on the research conducted under DERP to continue providing USAID with a strong evidence base on which to plan and carry out activities.

In Africa’s densely multilingual societies, language in education policy is critical to education quality and equity. As governments and Ministries of Education grapple with selecting the best language in which to provide quality education to every child, a variety of political, financial, attitudinal, and logistical factors get mixed in the fray. Even when the intentions are high-minded, the challenges of deciding on and implementing an effective language policy in a complex sociolinguistic context are many. Much of the research on language in education policy has focused on the students: how they learn best, what they need to learn, and how they fare under different policies. Less research has tackled this thorny issue from the perspective of the teachers, their own proficiencies in the different language options, and how their language proficiency and literacy relate to the quality of instruction they are able to offer in each language.

One research focus under REEP-A is the role of the language of instruction (LOI) in the context of low learning levels among children and youth in multilingual contexts. Numerous countries in sub-Saharan Africa have developed policies that recognize the influence of the LOI on literacy acquisition and on learning outcomes in general. Many policies surrounding literacy instruction have recently shifted toward adopting the language spoken at home or the language of the local community as the language for initial literacy acquisition. The new REEP-A LOI research stream will assess the degree to which teachers in sub-Saharan Africa are prepared to teach reading at the early grade level; specifically, the research will examine preparedness through the lens of LOI. The REEP-A research study on LOI is closely aligned to a key priority of USAID’s 2018 Education Policy, which is to build the capacity of teachers and ensure they have the skills, motivation, and support to teach by “transforming teacher policies and professional development systems to increase the availability of qualified teachers and improve instruction”1.

As part of the new REEP-A LOI work stream, the REEP-A Teacher Language and Literacy Assessment (TLLA) will explore how teachers’ language proficiency and literacy in the LOI influence students’ reading acquisition. It is hypothesized that the teachers’ level of language proficiency and literacy in the LOI can either facilitate student learning, if high, or impede it, if low, but limited data are available on how precisely teacher language and literacy skill levels relate to student reading outcomes. Exploring this relationship requires a valid and reliable tool to measure teachers’ language and literacy skills. The purpose of the TLLA, therefore, is to measure teacher language and literacy proficiency in the relevant LOI. The TLLA will consist of a collection of subtasks assessing teachers’ skills in speaking, listening, reading, writing, vocabulary, and grammar in the given language. The Oral Reading component will consist of EGRA subtasks identical to those that have been or will be administered to students in the same context in which the TLLA is administered to teachers. Field testing and pilot testing will help identify which subtasks most reliably evaluate teachers’ language abilities. The TLLA tool will be developed, field tested, and piloted in a sub-Saharan context with a sample of teachers to ensure that the tool performs as designed.

1 U.S. Agency for International Development (2018). Education Policy. Washington, D.C. p. 8
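The research plan does not prescribe how pilot data will be used to judge whether a subtask "performs as designed." Purely as an illustration, the Python sketch below computes a common internal-consistency statistic (Cronbach's alpha) over hypothetical item-level scores for a single subtask; the subtask name, the scores, and any cut-off for flagging a subtask for revision are assumptions, not part of the TLLA design.

from statistics import pvariance

def cronbach_alpha(item_scores):
    # item_scores: one row per teacher, one column per item (0/1 or partial credit)
    k = len(item_scores[0])
    item_variances = sum(pvariance(column) for column in zip(*item_scores))
    total_variance = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical pilot data: 6 teachers x 5 items from one listening subtask.
pilot_scores = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
    [1, 1, 1, 1, 0],
]
print(f"Listening subtask alpha: {cronbach_alpha(pilot_scores):.2f}")

A low value for a given subtask would be one signal, among others, that its items need revision before wider use; the actual decision rules would be set during the pilot analysis.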

GENERAL DESCRIPTION AND BACKGROUND OF ACTIVITY

Goal 1 of USAID’s 2011 Education Strategy was to improve reading skills for children in the primary grades. From 2011 to 2017, USAID provided early grade reading instruction to nearly 70 million children. In sub-Saharan Africa, while significant improvements have been made, progress has been slow, and overall, literacy levels remain low. USAID’s 2018 Education Policy continues to focus on achieving measurable learning outcomes and skills for children and youth in literacy, numeracy, socio-emotional skills2 and other basic skills in a variety of ways, including by generating and using data for decision-making, and strengthening teacher policies and professional development systems to improve the quality of instruction that teachers are able to offer. Over the next several years, many of USAID’s Africa Missions will be designing second-generation reading activities,3 and identifying and dismantling key barriers to learning will be critical to making and sustaining progress in learning outcomes.

A recent report on the state of literacy in sub-Saharan Africa, which reviewed reading assessment scores across 20 countries, found wide variations in student reading performance across languages and countries, as well as within countries. The report concluded that while much work remains to be done to improve literacy in the region, “it is still unclear which particular aspects of [reading] programs and projects are truly essential for improving learning and which components do not provide added value.”4 The report suggests that it would be valuable to gather data through surveys “about pupils, teachers, schools, and communities and about teacher practice, training, and support” alongside literacy assessments. A closer examination of the specific factors at play in each context and program design is needed to identify elements most directly linked to positive impact.

The limited improvement in reading at early primary grades in sub-Saharan Africa has led to increased attention on the quality of both teaching and learning. One area that has received increased focus is the role of the LOI. Numerous countries in sub-Saharan Africa have developed policies that recognize the influence of the LOI on literacy acquisition and on learning outcomes in general. Many policies surrounding literacy instruction have recently shifted toward adopting the language spoken at home (referred to as first language, mother tongue, or L1), or the language of the local community, as the language for initial literacy acquisition and, in some cases, as the initial language of instruction (LOI1) across all subjects in the early grades.5 Of additional importance to learning is the timing and manner of transition to a second or additional language (L2 or Lx6), such as a regional, national, or international language, as the subsequent language of instruction (LOI2).

2 U.S. Agency for International Development (2018). Education Policy. Washington, D.C. p. 6
3 Research for Effective Education Programming-Africa (REEP-A). AID-OAA-TO-16-00024
4 RTI International. 2015. The State of Literacy in Africa. Washington, DC: U.S. Agency for International Development.
5 Albaugh, E. A. (2014). State-building and multilingual education in Africa. New York, NY: Cambridge University Press.

While there is a large research base about language in education issues from high-income contexts, studies have begun to emerge looking at how the choice of LOI affects or interacts with student language proficiencies and learning outcomes in low and middle-income contexts as well. A recent USAID report7 provides an in-depth examination of the literature and issues surrounding LOI policy with a focus on sub-Saharan Africa. The report highlights that an L1 LOI at the primary level can improve all of the following: access and equity (learning in an L1 is linked to enrollment and attendance, while the likelihood of dropping out is linked to learning in a less familiar language); reading acquisition (children learning to read in a language they know benefit from an existing knowledge of vocabulary, language construction, and pronunciation); and learning outcomes (a positive relationship exists between students learning in their L1 and their test scores).

There has been less focus in the research, however, on how the choice of LOI interacts with teacher language proficiencies and literacy, and, by extension, with the quality of instruction that teachers are able to offer in a given LOI. The teacher is widely recognized as the critical element in classroom instruction and student learning.8 Without a qualified and effective teacher, classroom inputs, including teaching and learning materials, textbooks, supplementary readings, and technology support cannot be effectively utilized and thus will not, on their own, result in widespread, positive student outcomes. Nonetheless, the degree to which teachers are effective in producing student learning outcomes can vary significantly. One variable is the teacher’s ability to effectively communicate with students and help them navigate learning in multiple languages in which they all, both teacher and students, may have varying levels of proficiency.

Within the newly established LOI work stream under REEP-A, this research activity seeks to develop a tool to assess teachers’ language proficiency and literacy in the language(s) of instruction that they are required to use in class. The assessment tool, the TLLA, will consist of subtasks assessing speaking, listening, reading, and writing, as well as vocabulary and grammar, in the language(s) used for teaching and learning at the primary school level in that context. The research team plans to assess approximately 150 teachers during the pilot phase of the TLLA, following cognitive interviewing and field testing with approximately 15-35 teachers, using several iterations of the TLLA tool over a span of several days in the country selected. Note that the focus of this activity is to ensure that the tool performs as designed. This activity will not provide representative data for the location(s) where it is pilot-tested.
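To make the composition of the instrument concrete, the sketch below shows one possible way a single teacher's TLLA administration could be recorded, assuming the skill domains listed above and a simple proportion-correct score per subtask. The field names and values are illustrative only; they are not drawn from the project's actual data collection forms.

from dataclasses import dataclass, field

@dataclass
class TeacherRecord:
    # One teacher's TLLA administration; the scoring scale (proportion correct)
    # and field names are assumptions for illustration.
    teacher_id: str
    loi: str
    subtask_scores: dict = field(default_factory=dict)  # domain -> proportion correct

    def mean_score(self) -> float:
        # Average across the subtasks actually administered.
        return sum(self.subtask_scores.values()) / len(self.subtask_scores)

# Example: one of roughly 150 records expected in the pilot phase.
record = TeacherRecord(teacher_id="T-001", loi="Luganda",
                       subtask_scores={"listening": 0.85, "reading": 0.90,
                                       "vocabulary": 0.70})
print(f"{record.teacher_id} ({record.loi}): mean = {record.mean_score():.2f}")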

OVERVIEW OF THE STAGES OF THE TLLA DEVELOPMENT

The following section provides an overview of the process of developing the TLLA and how this assessment might be adapted for teachers in the sub-Saharan Africa context. The pilot location is given as an example to illustrate how the assessment might be adapted.

6 Lx, meaning the 2nd, 3rd, or any ordinal number of a language beyond the L1, is a useful designation in highly multilingual societies where learners have complex language proficiency profiles.
7 RTI International. (2015). Planning for language use in education: Best practices and practical steps to improve learning outcomes. Washington, DC: U.S. Agency for International Development.
8 Darling-Hammond, L. (1992). Teaching and knowledge: Policy issues posed by alternative certification for teachers. Peabody Journal of Education, 67(3), 123-154; Darling-Hammond, L. (2000). Teacher Quality and Student Achievement. Education Policy Analysis Archives; Bold, T., Filmer, D., Martin, G., Molina, E., Rockmore, C., Stacy, B., Svensson, J., & Wane, W. (2017). What Do Teachers Know and Do? Does it Matter? Evidence from Primary Schools in Africa. Policy Research Working Paper 7956. World Bank Group; Westbrook, J., Durrani, N., Brown, R., Orr, D., Pryor, J., Boddy, J., & Salvi, F. (2013). Pedagogy, Curriculum, Teaching Practices and Teacher Education in Developing Countries. Final Report. Education Rigorous Literature Review. Department for International Development.

Stage 1. The first stage of the process will involve developing an initial assessment tool, based on best practices in language and literacy assessment, that will be adapted for adult teacher participants in the sub-Saharan African context. The research team will develop a suite of subtask descriptions, protocols, sample items in English, and guidelines for adaptation to other languages. For ease of adaptability, the TLLA will be designed so that it can be adapted to any language of instruction in sub-Saharan African countries. The subtasks will be designed to cover essential components of language and literacy proficiency relevant for adult teaching professionals and will include four of the same subtasks given to students on a corresponding EGRA.

Stage 2. The process of piloting the TLLA will begin with this stage. The piloting process will consist of a first round of cognitive interviewing with a small number of teachers (approximately five teachers), to gain an understanding of how the English version of the TLLA is interpreted. Data will be collected on all items of the instrument during this stage. This version of the TLLA will be administered using a paper instrument. Cognitive interview results may be captured electronically or on paper. Following the completion of the first round of interviews, the data collected will be used to determine any necessary changes to the TLLA instrument.

Stage 3. Following changes to the instrument based on the initial round of interviews, Stage 3 will consist of adapting the instrument into the local language, which will be Luganda in the case of the pilot. The implementers will work closely with a small group of Luganda experts to adapt each subtask to the particularities of the Luganda language.

Stage 4. Once the tool has been adapted to the local language, Stage 4 will consist of a second round of interviews (approximately five teachers) to understand how the adapted tool is interpreted. This step will be similar to the activities conducted in Stage 2; however, both language versions of the instrument will be used in this case. Any additional changes to the instrument will be made based on the data collected.

Stage 5. This stage will involve field testing the full TLLA instrument. The research team will administer both language versions of the TLLA to approximately 10-25 teachers. The focus of the data collected during the field test will be on the instrument itself, to understand the appropriateness of the subtask protocols and the variability in results. The TLLA will then be updated based on the results of the field testing.

Stage 6. Following the two rounds of cognitive interviewing and one round of field testing, Stage 6 will consist of a small pilot of the updated instrument, with approximately 150 teachers. Ideally, the pilot will be conducted at a teacher training, within a teacher college, or at a similar gathering of teachers, to gain access to as many teachers as possible in a short amount of time. The research team will work with the statisticians to identify pilot opportunities that will provide a diverse sample of teachers, ideally across two to three different locations within Uganda, in the case of the pilot. During pilot data collection, the Tangerine software will likely be used for electronic data collection.

Stage 7. Following the pilot, Stage 7 will consist of an analysis of the results of the pilot data collection.

Stage 8. Stage 8, the final stage of the tool development, will involve report writing and dissemination of results.

The plan proposed here meets the requirements specified in Sub-Requirement 2.35.2 by pre-testing the TLLA tool in Stages 2, 3, and 4 and administering the tool in Stage 5. Sub-Requirement 2.35.3 will be met after completing Stages 6, 7, and 8 of this research plan. Thus, at the completion of this research activity in Uganda, Requirement 2.35 will be completed.

TLLA FOCAL AREAS AND SUBTASK RECOMMENDATIONS

The format of the TLLA will be a mix of direct assessment of abilities and performance tasks structured to replicate authentic instructional activities. This balance is intended to treat teachers as professionals and is expected to increase their support for the assessment.

RECOMMENDED SUBTASKS

The TLLA comprises seven components, which will assess teacher proficiency in the four language skills, vocabulary, and grammar. These seven components are Oral Reading, Silent Reading, Speaking, Listening, Grammar, Vocabulary, and Writing. The Oral Reading component consists of four subtasks, which are identical to those on the Early Grade Reading Assessment (EGRA) for students. The six other components are new subtasks created for this tool. Below is a description of each of the seven components.

Oral Reading: The oral reading component of the TLLA will have four parts. These subtasks will be the same subtasks found on the EGRA for students. First will be a letter sound identification task. Teachers will be presented with a list of letters and, where applicable, digraphs, in the target language orthography, and be asked to pronounce the sound that each letter represents. For the pilot of the TLLA, this will be done in English and Luganda, a local LOI in Uganda. Next, teachers will be given a decoding task using invented words that adhere to the orthography and phonology of the target language. Third, teachers will read aloud a short narrative of about 60 words and be assessed for accuracy, rate, and prosody. They will be asked to read the story aloud as if they were reading it to a class of children, to model fluent and expressive reading. Just as when children take an EGRA, for these first three subtasks we will calculate the teacher's accuracy and rate for letter sounds, invented words, and oral passage reading. Following the oral reading, teachers will be asked five comprehension questions as the fourth subtask. The four explicit questions and one inferential question will evaluate the teacher's reading comprehension, as is done with children. The proposed format is a mix of timed and untimed tasks and is expected to take about seven minutes to administer, which aligns with the amount of time for children to complete these four EGRA subtasks.
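
To make the scoring concrete, the sketch below shows one way the accuracy and rate calculations for the timed oral reading subtasks could be computed. It assumes the standard EGRA-style correct-items-per-minute convention; the function and field names are hypothetical, not the pilot data structure.

```python
# Minimal sketch of EGRA-style scoring for the timed oral reading subtasks
# (letter sounds, invented words, oral passage). Names are hypothetical.

def timed_subtask_scores(attempted: int, incorrect: int, seconds_remaining: int,
                         time_limit: int = 60) -> dict:
    """Return accuracy and correct-items-per-minute for one timed subtask."""
    correct = attempted - incorrect
    elapsed = time_limit - seconds_remaining              # seconds actually used
    accuracy = correct / attempted if attempted else 0.0
    rate_per_minute = correct * 60 / elapsed if elapsed else 0.0
    return {"correct": correct, "accuracy": accuracy, "rate_per_minute": rate_per_minute}

# Example: 78 letters attempted, 6 marked incorrect, 12 seconds left on the timer.
print(timed_subtask_scores(attempted=78, incorrect=6, seconds_remaining=12))
# -> 72 correct, accuracy ~0.92, 90 correct letter sounds per minute
```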

Silent Reading Comprehension: For this component, the teacher will read a short passage to themselves and will be asked to correct student responses to multiple choice and short answer questions about the passage. The questions will have been answered by a hypothetical student, and the teacher will identify if the student's answers are correct. If the answers are not correct, the teacher will supply the correct answer. The silent passage will be an informational text about 150-200 words long, with complexity similar to grade 4-6 textbooks. The items will include low level comprehension (i.e., text-based) items from the passage, as well as higher order items, such as main idea and inferential questions. The proposed format is timed and is expected to take five minutes to administer.

Speaking: The speaking component of the TLLA will ask teachers to describe a picture. The picture will be a black and white scene that contains at least 25 items that could be mentioned. Teachers will be scored on the breadth of the items they identify, and their response patterns will be recorded. Specifically, this subtask will assess grammatical/syntactical forms and vocabulary. To ensure consistency, assessors will tick responses from a comprehensive list of items that could be mentioned in describing the picture. To assess syntax, assessors will be provided with a description, specifics, and example utterances for five types of response patterns. The proposed format is timed and is expected to take two minutes to administer.

Listening: The listening component of the TLLA will prompt teachers to repeat sentences supplied by the assessor. The sentences will begin with the simplest structure for that language (e.g., We fish.) and increase in complexity. The proposed format will have 15 items and represent common linguistic structures of varying lengths. It is expected to take two minutes to administer.

Grammar: In addition to the contextual assessments described in the above components, teacher knowledge of grammar will also be assessed in a separate part of the TLLA. As grammar is an essential component of language proficiency, teachers should know the linguistic rules underlying the language of teaching and learning in order to explicitly model and teach these rules. The grammar assessment will consist of error corrections in two parts. The first part will include structure and written expression items, and teachers will be prompted to choose the correct grammatical phrase. The sentences will be read to the teacher to avoid conflating reading ability with grammar ability. The second part will ask teachers to choose, from a provided set of underlined words, the word that contains an error. The proposed format is untimed and is expected to take five minutes to administer.

Vocabulary: The TLLA will assess teacher vocabulary through a performance measure. Teachers will be given "Tier 2" vocabulary words and asked to orally provide a student-friendly definition for each word. A Tier 2 vocabulary word is the type of word that is used across domains and is more descriptive than a Tier 1 word. Tier 1 words are used in everyday language (such as the word "sad"), while Tier 2 words are more sophisticated and generally learned through books and in adult-child interactions (such as the word "astonished"). Teachers should be directly teaching Tier 2 words, even in cases of second-language learners who also need to learn Tier 1 words. In this component, a teacher's ability to provide a student-friendly definition will indicate whether they know the word. The proposed format is timed and is expected to take five minutes to administer.

Writing: Lastly, the writing component of the TLLA will consist of two parts. The first part will ask teachers to score and correct a sample piece of grade 4 student writing. Teachers will correct the student sample and supply edits for items that the teacher identifies as incorrect. The student writing sample will have errors in grammar, punctuation, capitalization, and spelling. The second part of writing will require teachers to produce a written response to a provided prompt. It will be scored on structure (i.e., paragraph), writing conventions (e.g., capitalization, punctuation), verb tenses, sentence length, spelling, vocabulary and overall meaning. The format is a mix of timed and untimed and is expected to take eight minutes to administer.

DRAFT ENUMERATION PROTOCOLS

The proposed TLLA subtasks are estimated to take 34 minutes to administer in one language. Accounting for an initial explanation (3 minutes) and transitions between subtasks (2 minutes), the complete TLLA process is estimated to take 39 minutes per teacher from start to finish. During piloting, if this duration is found to contribute to subject effects (i.e., teacher fatigue, which can influence the validity of the results), parts of subtasks that have the teacher score and correct hypothetical student work (e.g., Writing) may be combined into one larger subtask. Doing this would reduce duplication (e.g., scoring grammar in two places) and decrease explanation and transition time, which could save about 5 minutes, bringing the total time to 34 minutes per teacher. The draft enumerator protocol will be updated and refined once the site is selected.
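
As a quick check on this arithmetic, the sketch below totals the per-subtask time estimates given in the subtask descriptions above; the figures come from this research plan, and the snippet is illustrative only.

```python
# Sketch of the per-language timing budget, using the per-subtask estimates (minutes)
# stated in the subtask descriptions above.
subtask_minutes = {
    "Oral Reading": 7, "Silent Reading Comprehension": 5, "Speaking": 2,
    "Listening": 2, "Grammar": 5, "Vocabulary": 5, "Writing": 8,
}
explanation, transitions = 3, 2      # initial explanation and between-subtask transitions

subtask_total = sum(subtask_minutes.values())                # 34 minutes of assessment
full_session = subtask_total + explanation + transitions     # 39 minutes per teacher, one language
print(subtask_total, full_session)
```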

DRAFT POST-ENUMERATION ANALYSIS APPROACHES

As outlined in the General Description and Background of Activity, data will be collected across multiple stages to develop and refine the assessment tool. The resulting dataset will capture teachers' responses to a series of subtasks administered orally and in writing. The goal of the process is to refine procedures and subtask items to create an assessment that illustrates both (i) the degree of teacher mastery of the skills needed to teach effectively in the LOI and (ii) the variation and distribution of teacher abilities in the LOI. At this stage, the research activity is focused on creating a valid tool rather than making assumptions about the relationship between teachers' abilities and their students' literacy abilities.

The research plan is based on the following assumptions: (i) the field testing and pilot testing of the TLLA will focus on primary teachers within government schools in the selected country; and (ii) teachers are expected to teach literacy skills to primary students in English and a local language (e.g., Luganda). For the TLLA pilot, the selected country is Uganda, and the TLLA instrument will be adapted into Luganda as the selected local language. In future applications, the TLLA will be adapted to the local language based on the specific context. At this stage, the focus is on the development of the tool. As such, it will not be possible to make claims about the teachers included in the sample, and the results will not be disaggregated by the grade that teachers teach.

The previous section (TLLA Focal Areas and Subtask Recommendations) describes the subtasks and the constructs they measure. Items and instructions that measure each construct will be created. Two rounds of cognitive interviewing will be employed to determine whether the subtasks are understood by teachers as expected.

Throughout Stages 2-5, the feasibility of the recommended methodology will be determined through cognitive interviewing and field testing of the TLLA. These stages will help answer the following questions:

1. What is the amount of time needed to administer each subtask?
2. What is an effective order in which to administer the subtasks?
3. What is an effective format for administering the written subtasks (e.g., ten teachers simultaneously)?
4. How do teachers respond to being told the purpose of the assessment?
5. How do teachers respond to evaluating fictitious student writing and answers?
6. Do teachers' responses on the speaking subtask extend beyond the provided list of items?
7. Are the terms used in the directions (e.g., "student friendly") understood by teachers?
8. Is there obvious redundancy in any of the subtasks?
9. What is an efficient way to score the teacher writing sample that still provides useful information?
10. What level of support is needed to score the teacher-generated comprehension questions?

As the tool is in development, the current guiding assumptions are based on theoretical approaches to measuring the construct in a structured way. Following field testing and piloting of the TLLA, the assumptions (i.e., about the distributions) will be adjusted as needed. Some of the literacy skills being measured are expected to approach a normal distribution of teacher abilities (i.e., a range of scores with the majority in the middle); specifically, this includes Speaking (describing a picture), Listening, the Writing prompt, and Grammar. For several of the literacy skills, it is expected that some of the teachers will demonstrate close to mastery, including Oral Reading and Writing. Since the subtasks are untimed, this may result in negative/left skewness (i.e., many teachers doing well on the subtask). On the other hand, for the literacy skills that place a greater demand on language skills, there may be positive/right skewness in the results (i.e., more teachers doing poorly on the subtask); this includes Grammar, Silent Reading Comprehension, and Vocabulary.
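
As an illustration of how these distributional assumptions could be checked once pilot data exist, the sketch below summarizes the shape of each subtask's score distribution. The file and column names are hypothetical; negative skewness corresponds to many high scores, positive skewness to many low scores.

```python
# Hedged sketch: summarize the shape of pilot score distributions per subtask.
import pandas as pd
from scipy.stats import skew

# pilot_scores.csv is assumed to hold one row per teacher and one column per subtask score.
scores = pd.read_csv("pilot_scores.csv")

for subtask in ["oral_reading", "silent_reading", "speaking", "listening",
                "grammar", "vocabulary", "writing"]:
    values = scores[subtask].dropna()
    print(f"{subtask}: mean={values.mean():.2f}, median={values.median():.2f}, "
          f"skewness={skew(values):.2f}")   # negative = left skew (many high scores)
```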

The research questions for the pilot of the TLLA will build on those explored in the field test and will also include new items. Initial questions for the pilot include:

1. Are the items within each subtask contributing to the overall score of that subtask? (See the analysis sketch following this list.)
2. Is there obvious redundancy in any of the subtasks?
   a. What is the relationship between correcting a writing sample and producing a writing sample?
   b. What is the relationship between correcting comprehension questions and answering comprehension questions?
   c. What is the relationship between correcting grammar in writing and orally?
3. What is the distribution (including mean, median, and mode) of scores for each subtask?
4. What are teachers' self-reported language backgrounds?
5. Do teachers' self-reported language and literacy skills align with their measured skills (i.e., do teachers achieve their best results in the subtasks that align with their self-reported strengths)?
6. What are the syntax response patterns for describing a picture?
7. Does a grade 4 piece of writing show variability in teachers' ability to identify and correct errors?
8. What is the relationship between oral reading rates and reading comprehension?
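
The sketch below illustrates how questions 1-3 might be examined once pilot data are available: a corrected item-total correlation within a subtask, a correlation between paired subtasks, and simple distribution summaries. The file and column names are hypothetical and do not reflect the final instrument layout.

```python
# Hedged sketch of the pilot item analyses implied by questions 1-3.
import pandas as pd

pilot = pd.read_csv("tlla_pilot_items.csv")    # one row per teacher, one column per item (hypothetical)

# Q1: does each item contribute to its subtask score? (corrected item-total correlation)
grammar_items = [c for c in pilot.columns if c.startswith("grammar_")]
grammar_total = pilot[grammar_items].sum(axis=1)
for item in grammar_items:
    rest = grammar_total - pilot[item]          # subtask total excluding the item itself
    print(item, round(pilot[item].corr(rest), 2))

# Q2a: redundancy between paired subtasks, e.g., correcting vs. producing a writing sample.
print(pilot["writing_correction_score"].corr(pilot["writing_prompt_score"]))

# Q3: distribution of scores for a subtask.
print(grammar_total.describe())
```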

The TLLA is a structured interview that elicits samples of language and literacy knowledge in order to estimate an expansive domain of knowledge and abilities. For the pilot, the teachers' language and literacy knowledge will be assessed in English and Luganda. As the TLLA is an approximation of the skills being measured, the tool will have a number of limitations, as do all similar tests, which will be mitigated as follows.9 Because of subject effects (e.g., subject fatigue), the TLLA cannot take too long to administer. To avoid test effects (e.g., lack of objectivity in scoring), the items and formats need to be structured so that responses are scored consistently across uses. Consistency is essential to ensuring confidence in the data and related conclusions and will be a primary factor in determining which teacher language and literacy abilities can be most accurately measured with this tool.
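
One possible way to monitor scoring consistency, not specified in this plan, would be to have two assessors independently score a subset of recorded responses and compute their chance-corrected agreement. The sketch below illustrates this with Cohen's kappa on made-up ratings.

```python
# Illustrative inter-assessor agreement check; the double-scoring design is an assumption.
from sklearn.metrics import cohen_kappa_score

assessor_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 1 = item scored correct by assessor A
assessor_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]   # same items scored by assessor B
print(cohen_kappa_score(assessor_a, assessor_b))   # chance-corrected agreement
```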

CRITERIA FOR PILOT LOCATION

The TLLA is being developed for use in multiple locations in sub-Saharan Africa. Accordingly, a pilot will be conducted to test the feasibility of the tool and improve its design, in order to develop a tool that can be administered in multiple contexts and countries. Various factors were considered in determining a pilot location for the TLLA, namely:

1. Country context. This category includes considerations of the school year timeline, LOI policy, potential reactions among stakeholders (positive and negative) to in-country implementation of this type of research, the overall policy environment, and relationships between the Mission and the Ministry of Education.

2. Logistics. Considerations regarding logistics include support capacity, identification of an existing RTI project, the existence of an RTI- or government-hosted teacher event where large groups of teachers would be gathered over the next year, degree of access, timing concerns that would hinder data collection (such as the rainy season), local data collection capabilities, existing subcontractor relationships, and translation needs.

3. Miscellaneous. In addition to the considerations listed above, there are additional elements to consider in selecting the pilot location. These overarching considerations and/or concerns include over-saturation of assessment activities; language complexity; and varying teacher knowledge of, and proficiency in, languages of instruction in certain areas of the country.

9 Koretz, D. M. (2008). Measuring up. Harvard University Press.

Uganda provides the best overall environment in which to conduct this work, based on the factors for consideration listed above. The country is a linguistically complex environment, which offers the possibility of assessing teachers in both English and one local language of instruction. Accordingly, for the purposes of the TLLA pilot, English and Luganda, a local language in Uganda, will be the languages used. English is the language of teaching and learning beginning in primary 4 in Uganda. However, since English is not a first or home language for most teachers, some teachers may have challenges using it for instruction. Luganda is widely spoken in Uganda as both a first and second language, and the country's language policy specifies the use of Luganda as a language of instruction in the early grades. However, for the many teachers who do not speak Luganda as a first language at home, teaching in the language can be a challenge as well. The variability in teacher comfort levels in Luganda will likely be reflected in their performance on the TLLA, and this variability will be extremely useful for developing an effective assessment tool.

USAID has two early grade reading activities in Uganda, both implemented by RTI: the USAID School Health and Reading Program and the USAID Literacy Achievement and Retention Activity. Both projects have strong teams in-country that can provide logistical support and can help identify a high-quality local data collection partner. The projects also have a strong M&E culture and could address any data collection issues in real time, as well as easily answer any tablet or translation/analysis questions. The USAID Literacy Achievement and Retention Activity is tentatively scheduled to hold teacher trainings in either August 2019 or January 2020. The trainings would provide access to large groups of teachers and would maximize the data collection efforts of this activity within the current timing and funding constraints. Finally, the policy environment in Uganda appears to be open and welcoming toward assessing teacher language proficiency, and RTI can leverage its positive existing client and Ministry relationships in Uganda to encourage support for the TLLA pilot.

TIMEFRAME

Sub Requirement: Produce research plan delineating next steps and research methodology.
Anticipated Timeline: Draft research plan submitted January 2019. Final research plan to be submitted following receipt of comments and discussion, TBD.

Sub Requirement: Develop the TLLA tool.
Anticipated Timeline: January – June 2019.

Sub Requirement: Field-test the TLLA tool.
Anticipated Timeline: TBD following approval of the research plan.

Sub Requirement: Pilot test the TLLA tool.
Anticipated Timeline: TBD based on discussions with USAID, including on teacher training timing, project preference, and country selection. Uganda teacher trainings are tentatively scheduled for August 2019 and January 2020. If Uganda is confirmed as the pilot country, it would be ideal to target one of these trainings for TLLA piloting.

Sub Requirement: Analyze findings and produce report of findings.
Anticipated Timeline: TBD based on the timing of the pilot.

Sub Requirement: Submission of a paper to a peer-reviewed academic journal.
Anticipated Timeline: Within three months following final report approval and discussions with USAID.

Sub Requirement: Presentation of findings at applicable conference(s).
Anticipated Timeline: TBD based on discussions with USAID.

STAFFING PLAN

The following staffing plan has been developed and approved for the TLLA instrument development, with noted changes from the approved version highlighted below, due to staffing changes at RTI. There will be a three-person technical research team, four statisticians, two project management staff, and a publications support team.

The three-person technical team will consist of Dr. Margaret (Peggy) Dubeck, Dr. Patience Sowa, and Ms. Karon Harden. The statisticians involved at key points in the activity will include Mr. Christopher (Chris) Cummiskey, Ms. Jennifer Pressley, and Dr. Tracy Kline, given the recent departure of Dr. Corina Owens from RTI. Each member will have specific roles and responsibilities throughout the research and piloting process and will collaborate to produce the newly developed instrument and pilot findings report.

Dr. Dubeck will act as the technical lead and advisor for the TLLA activity, bringing together various RTI experts to conduct research and analysis related to the development and pilot testing of the TLLA instrument. She will lead the research efforts, guide the overall design of the instrument and pilot, lead the development of the speaking and vocabulary subtasks of the TLLA instrument, review all other subtask development, co-facilitate training of the individuals collecting the pilot data, oversee the synthesis of the findings as well as write portions and review the pilot findings report. Dr. Dubeck will ensure coordinated efforts among the team.

Dr. Sowa and Ms. Harden will collaborate with Dr. Dubeck on the development of the TLLA instrument and pilot study. Dr. Sowa will lead the development of the writing and reading comprehension subtasks of the TLLA instrument and Ms. Harden will lead the development of the reading, listening, and grammar subtasks. Together with Dr. Dubeck, Dr. Sowa and Ms. Harden will implement the design of the pilot study and contribute knowledge and experience from previous endeavors developing instruments on foundational reading skills. Dr. Sowa and/or Ms. Harden will also co-facilitate training of the individuals collecting the pilot data, contribute to the synthesis of pilot data and writing of the pilot findings report, particularly pertaining to the subtasks they develop, but also at a holistic level.

Mr. Cummiskey will contribute to the TLLA instrument activity by defining the pilot sample requirements, that is, calculating the required sample size to ensure that appropriate and valid analyses can be performed on the collected pilot data. He will participate in discussions with the RTI technical team to ensure the pilot sample fully represents the criteria needed for the study. Mr. Cummiskey will also oversee pilot data analyses and contribute as needed to the pilot findings report, particularly sections related to the sample and analyses.
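
For illustration only, the sketch below shows a generic precision check for a pilot of roughly 150 teachers, treating a subtask score as a simple mean; the standard deviation is an assumed placeholder, and this is not necessarily the approach the statisticians will use.

```python
# Rough precision check for a pilot of ~150 teachers (illustrative assumptions only).
import math

def margin_of_error(sd: float, n: int, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a mean."""
    return z * sd / math.sqrt(n)

# e.g., for a subtask scored 0-100 with an assumed SD of 15 points:
print(round(margin_of_error(sd=15, n=150), 1))   # about +/- 2.4 points
```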

Ms. Pressley’s role on the TLLA instrument activity will be to conduct statistical analyses of the pilot data to understand how the instrument performed during the pilot and to inform revisions to the instruments. This role will include cleaning and processing data as they are transmitted from the field, computing frequencies and tabulations, and conducting item analyses. These functions are necessary to understand how the instruments are performing and what revisions are required.

Dr. Kline will perform psychometric analyses on the pilot data to inform any revisions based on results from the analyses. This is a necessary step toward robust, reliable, and valid instruments.

Ms. Emily Geisen will serve as a technical advisor for cognitive interviewing methodology and be available for internal consultation during the cognitive interviewing phase of instrument development.

Given Ms. Keely Stern’s departure from RTI, the project management team has been revised as described here. The project management team consists of overall REEP-A Project Manager, Ms. Lauren Edwards, and overall REEP-A Project Coordinator, Ms. Meredith Sparks (pending personnel approval). As RTI’s REEP-Africa Project Manager, Ms. Edwards will provide management and guidance to the TLLA team. She will also act as a point of contact between RTI International and Dexis to ensure timely deliverables and proactive communication of challenges. Ms. Sparks will be responsible for financial management of the contract, with support from the Project Manager, Ms. Edwards. Ms. Sparks will support the tracking of timelines of deliverables and monitor progress along the way, as well as monitoring the contract budget and appropriate staffing. She will also ensure compliance with the terms of the contract for financial reporting and other issues, as well as compliance with RTI internal policies. With Ms. Edwards, she will act as a second point of contact between Dexis and RTI. Together, Ms. Edwards and Ms. Sparks will work to source, on-board, and manage any local partners that are responsible for the on-site preparations and logistics required for the instrument pilot.

RTI’s Multimedia Communication Services group will provide editing and formatting for all deliverables submitted to Dexis. The five team members from the editing and document preparation pool assigned to support REEP-Africa are Ms. Erin Newton, Ms. Felice Sinno-Lai, Ms. Syanne Olson, Ms. Gail Hayes, and Ms. Pam Prevatt.

After a country has been chosen for the TLLA instrument activity, steps will be taken to recruit and vet a local subcontractor with which to partner. The local subcontractor will be responsible for many practical and logistical considerations of the pilot activities, such as handling logistics for planning and training workshops, recruiting and managing individuals to collect data, assisting with sampling, and developing travel routes for the data collection teams, among other important aspects of the local activities related to the instrument pilot.

STAFF BIOS

Dr. Margaret (Peggy) Dubeck is a senior literacy researcher in the International Education Division at RTI International. Her research expertise encompasses developing, piloting, and implementing assessments related to literacy and literacy acquisition. She has led the adaptation of Early Grade Reading Assessments (EGRA) in more than 20 languages around the globe, as well as the development and refinement of new and existing subtasks that measure language abilities, predict reading achievement, and assess teacher knowledge. Dr. Dubeck's work has also focused on designing reading program interventions, training teachers and teacher professional development, and developing instructional materials. Other focal areas include research design and evaluations of interventions (including longitudinal and tracer studies). She has led instrument design efforts and authored numerous reports and publications. Dr. Dubeck holds a PhD in Reading Education from the University of Virginia.

Dr. Patience Sowa is a senior education analyst in the International Education Division at RTI International. Her work focuses primarily on upper primary and lower secondary literacy and English as a second language (ESL) development. Previously, Dr. Sowa was an associate professor of education, a skilled teacher educator in literacy and language learning, and a curriculum and program designer. Dr. Sowa has worked in a range of multicultural and multilingual contexts in Africa, the Middle East, and North America. She has expertise in research, monitoring and evaluation, learning assessments and evidence in educator preparation, and teacher education in a range of cultural contexts, especially the preparation of ESL teachers, as well as in developing and advising on strategies to improve educational outcomes. Dr. Sowa holds a PhD in Curriculum and Instruction from the University of Kansas.

Ms. Karon Harden is an education specialist in the International Education division of the International Development Group at RTI International. She has more than 20 years’ experience in education (12 years in sub-Saharan Africa), including in teaching, curriculum and materials development, teacher training and support, student assessment, and community engagement. Ms. Harden brings expertise in applied linguistics, first language literacy, second language acquisition and literacy, and early grade reading and assessment. She has led primary education projects in the Central African Republic and Nigeria and has taught linguistics and English as a second language at the secondary and tertiary levels in the United States, Nigeria, and Cameroon. Ms. Harden holds an MSEd in Curriculum and Instruction with an emphasis on Reading Education from University of Kansas and an MA in Linguistics with concentrations in Applied Linguistics and Teaching English to Speakers of Other Languages (TESOL) from Indiana University.

Mr. Christopher Cummiskey is a statistician in the International Education Division at RTI International with extensive professional experience in survey research for international education, health, and economic development studies. He established RTI’s standard statistical processing procedures for all Early Grade Reading Assessment (EGRA), Early Grade Mathematics Assessment (EGMA), and Snapshot of School Management Effectiveness (SSME) surveys. His project experience involves teaching and collaborating with clients, researchers, and other stakeholders; and creating, developing, and implementing complex sample designs. He is adept at preparing analysis reports that highlight substantive findings and methodological details. He has professional experience in the Philippines, Egypt, Nigeria, Ghana, South Sudan, Tanzania, Brazil, and Mozambique. Mr. Cummiskey holds an MPH degree in biostatistics from University of North Carolina, Chapel Hill.

Ms. Jennifer Pressley is an education analyst for the International Education Division at RTI International, where she performs various statistical analyses and ensures data quality of USAID-funded projects. While at RTI, Ms. Pressley has worked on analysis for numerous complex student assessments and led workshops on statistical analysis capacity building and skills transfer for ministry of education staff in Ghana and Tanzania. Ms. Pressley has professional proficiency in SPSS and Stata. She holds an MPP degree in public policy with a concentration in international development from American University.

Dr. Tracy Kline is a research psychometrician in the Social, Statistics, and Environmental Sciences division at RTI International, where she performs psychometric analyses, particularly of pilot data, as part of ensuring the validity and reliability of instruments, and provides analytical evaluations of data collection tools measuring educational attainment. Dr. Kline employs analytic methodologies that investigate item-level instrument efficiency and individual differences elicited by complex item design. Her psychometric experience includes Rasch methodology and factor analysis. She holds a PhD from the University of Virginia.

Ms. Emily Geisen is a survey methodologist in the Social, Statistics, and Environmental Sciences division at RTI International, where she has advised on survey methods research, study design, cognitive interviews, focus groups, and usability testing for various surveys throughout RTI. She has led several workshops on cognitive interviewing and has developed survey methodologies for a variety of health and census research studies during her time at RTI. She holds a master's degree in Survey Methodology from the University of Michigan.

Ms. Lauren Edwards is a senior project management specialist at RTI International. Ms. Edwards provides administrative, operational, and technical support in all aspects of home office project implementation. Her current portfolio includes a project in Malawi valued at $65 million and several smaller research activities. She has previously supported projects in Uganda valued at $36 million and $60 million. She has also provided short-term technical assistance to projects in Malawi, Uganda, Ethiopia, Tanzania, and the Democratic Republic of Congo, including facilitating enumerator trainings and other workshops. Ms. Edwards holds an MA in Education and Human Development (concentration in International Education) from the George Washington University.

Ms. Meredith Sparks is a project associate at RTI International. Ms. Sparks provides administrative and operational support in all aspects of home office project implementation. Her current portfolio includes supporting the start-up of a $13.9 million contract in Cambodia, in addition to supporting multiple projects in Uganda. She has over 10 years of academic studies and experience related to international communities and development issues. Ms. Sparks holds an MA in Global Studies from the University of North Carolina at Chapel Hill.

Ms. Erin Newton and Ms. Syanne Olson are technical editors, Ms. Felice Sinno-Lai and Ms. Gail Hayes are document preparation specialists, and Ms. Pam Prevatt is a Section 508-compliance specialist; all are part of the RTI Multimedia Communication Services group. Their role is to ensure quality control in the editing and formatting of all deliverables submitted by RTI, in addition to making sure deliverables meet standards for marking, branding, and accessibility and convey the intended message. As they are also committed to other projects at RTI, they will work as a team to meet the publication needs for the TLLA deliverables. All five have a minimum of eight years' experience.

EXHIBIT 1. RTI ESTIMATED LABOR (estimated level of effort, in days)

Cummiskey, Christopher: 2
Dubeck, Margaret (Peggy): 36
Edwards, Lauren: 10
Geisen, Emily: 1
Harden, Karon: 41
Hayes, Gail: 2
Kline, Tracy: 2
Newton, Erin: 4
Olson, Syanne: 2
Pressley, Jennifer: 15
Prevatt, Pam: 1
Sinno-Lai, Felice: 3
Sowa, Patience: 13
Sparks, Meredith: 10


TEACHER LANGUAGE AND LITERACY ASSESSMENT (TLLA) INSTRUMENTS

Prepared for USAID

Prepared by Dexis Consulting Group

Research for Effective Education Programming – Africa (REEP–A) Contract No. AID-OAA-TO-16-00024

July 2019

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

TLLA FOCAL AREAS AND SUBTASK RECOMMENDATIONS

The format of the TLLA will be a mix of direct assessment of abilities and performance tasks structured to replicate authentic instructional activities. This balance is intended to treat teachers as professionals and is expected to increase their support for the assessment.

RECOMMENDED SUBTASKS

Below is a description of the formats for assessing the proposed subtasks for language skills, grammar, and vocabulary.

Oral Reading: The oral reading component of the TLLA will have four parts. These subtasks will be the same subtasks found on the EGRA for students. First will be a letter sound identification task. Teachers will be presented with a list of letters and, where applicable, digraphs, in the target language orthography, and be asked to pronounce the sound that each letter represents. For the pilot of the TLLA, this will be done in English and Luganda, a local LOI in Uganda. Next, teachers will be given a decoding task using invented words that adhere to the orthography and phonology of the target language. Third, teachers will read aloud a short narrative of about 60 words and be assessed for accuracy, rate, and prosody. They will be asked to read the story aloud as if they were reading it to a class of children, to model fluent and expressive reading. Just as when children take an EGRA, for these first three subtasks the teacher's accuracy and rate for letter sounds, invented words, and oral passage reading will be calculated. Following the oral reading, teachers will be asked five comprehension questions as the fourth subtask. The four explicit and one inferential question will evaluate the teacher's reading comprehension, as is done with children. The proposed format is a mix of timed and untimed tasks and is expected to take about seven minutes to administer, which aligns with the amount of time for children to complete these four EGRA subtasks.

Silent Reading Comprehension: For this component, the teacher will read a short passage to themselves and will be asked to correct student responses to multiple choice and short answer questions about the passage. The questions will have been answered by a hypothetical student, and the teacher will identify if the student’s answers are correct. If the answers are not correct, the teacher will supply the correct answer. The silent passage will be an informational text about 150-200 words long, with complexity similar to grade 4-6 textbooks. The items will include low level comprehension (i.e., text-based) items from the passage, as well as higher order items such as main idea and inferential questions. The proposed format is timed and is expected to take five minutes to administer.

Speaking: The speaking component of the TLLA will ask teachers to describe a picture. The picture will be a black and white scene that contains at least 25 items that could be mentioned. Teachers will be scored on the breadth of the items they identify, and their response patterns will be recorded. Specifically, this subtask will assess grammatical/syntactical forms and vocabulary. To ensure consistency, assessors will tick responses from a comprehensive list of items that could be mentioned in describing the picture. To assess syntax, assessors will be provided with a description, specifics, and example utterances for five types of response patterns. The proposed format is timed and is expected to take two minutes to administer.

Listening: The listening component of the TLLA will prompt teachers to repeat sentences supplied by the assessor. The sentences will begin with the simplest structure for that language (e.g., We fish.) and increase in complexity. The proposed format will have 15 items and represent common linguistic structures of varying lengths. It is expected to take two minutes to administer.

Grammar: In addition to the contextual assessments described in the above components, teacher knowledge of grammar will also be assessed in a separate part of the TLLA. As grammar is an essential component of language proficiency, teachers should know the linguistic rules underlying the language of teaching and learning in order to explicitly model and teach these rules. The grammar assessment will consist of error corrections in two parts. The first part will include structure and written expression items, and teachers will be prompted to choose the correct grammatical phrase. The sentences will be read to the teacher to avoid conflating reading ability with grammar ability. The second part will ask teachers to choose, from a provided set of underlined words, the word that contains an error. The proposed format is untimed and is expected to take five minutes to administer.

Vocabulary: The TLLA will assess teacher vocabulary through a performance measure. Teachers will be given "Tier 2" vocabulary words and asked to orally provide a student-friendly definition for each word. A Tier 2 vocabulary word is the type of word that is used across domains and is more descriptive than a Tier 1 word. Tier 1 words are used in everyday language (such as the word "sad"), while Tier 2 words are more sophisticated and generally learned through books and in adult-child interactions (such as the word "astonished"). Teachers should be directly teaching Tier 2 words, even in cases of second-language learners who also need to learn Tier 1 words. In this component, a teacher's ability to provide a student-friendly definition will indicate whether they know the word. The proposed format is timed and is expected to take five minutes to administer.

Writing: Lastly, the writing component of the TLLA will consist of two parts. The first part will ask teachers to score and correct a sample piece of grade 4 student writing. Teachers will correct the student sample and supply edits for items that the teacher identifies as incorrect. The student writing sample will have errors in grammar, punctuation, capitalization, and spelling. The second part of writing will require teachers to produce a written response to a provided prompt. It will be scored on structure (i.e., paragraph), writing conventions (e.g., capitalization, punctuation), verb tenses, sentence length, spelling, vocabulary, and overall meaning. The format is a mix of timed and untimed and is expected to take eight minutes to administer.

SUBTASK PROTOCOLS AND SAMPLE ITEMS

The format of each of the TLLA subtasks is presented below. While the TLLA is designed to be adaptable into any alphabetic language, the sample items are given in English here. Because we will be piloting the assessment in Uganda in English and Luganda, the sample items for the first four subtasks, which are the same as the student EGRA subtasks, are taken directly from the English EGRA administered by Uganda SHRP in 2017.

Adapting the assessment into a language other than English requires much more than just translation. The adaptation process requires input from linguistic experts in the target language and takes into consideration the unique phonological, lexical, and grammatical structures of the target language. The adaptation process for the TLLA will follow the same principles as the adaptation process for the EGRA, which is explained in detail in the EGRA Toolkit 2.0.1 Some guidelines for adaptation are included here after each task.

A. TLLA Oral Reading Tasks

1. Letter Sound Identification Task
Timed task: 60 seconds for 100 items

Instructions to the Assessor: Show the teacher the sheet of letters in the teacher stimuli booklet as you read the instructions below.

Learning the letters of the alphabet is one of the first steps that our learners take in learning to read. When we teach a new letter, we teach the sound that it represents.

Here is a page of letters of the English alphabet. Let’s say you are presenting each of these letters to your learners for the first time. Please tell me the sound that you would teach your learners that each letter represents in English. Not the name of the letter, but the sound that it makes. [If the items include digraphs, say:] Some of the boxes contain two letters that combine to make just one sound.

Let’s start with some examples. [Point to the letter u.] In English, the sound of this letter is /u/. If I were teaching my learners this letter for the first time, I would tell them that it makes the sound /u/ in English.

Try this one: [Point to the letter f.] What is the sound of this letter in English?

[If the teacher says /f/:] That’s right. [If the teacher does not say /f/, say:] OK. Then say: In English, the sound of this letter is /f/. If we were teaching our learners the letter f for the first time, we would teach that it makes the sound /f/ in English.

Let’s try one more: [Point to the letter L.] What is the sound of this letter in English?

1 RTI International. (2016). Early Grade Reading Assessment (EGRA) toolkit (2nd ed.) Washington, DC: USAID. Retrieved from https://pdf.usaid.gov/pdf_docs/pa00m4tn.pdf

[If the teacher says /l/:] That's right. [If the teacher does not say /l/, say:] OK. Then say: In English, the sound of this letter is /l/. If we were teaching our learners the letter L for the first time, we would teach that it makes the sound /l/ in English.

When I say “Begin,” start here [point to first letter] and go across the page [point]. Point to each letter (or letter combination) and tell me the sound of that letter in English. Read as quickly and carefully as you can. If you come to a letter that you do not know how to pronounce, just skip it and go on to the next letter.

Are you ready? Begin.

Start the timer when the teacher reads the first letter. Follow along with your pencil and clearly mark any incorrect letter sounds with a slash ( / ). Count self-corrections as correct. If you already marked the self-corrected letter as incorrect, circle it ( ø ) and continue. [On the tablet: Follow along on your screen and mark any incorrect letters by touching that letter on the screen—it will turn blue. Mark self-corrections as correct by touching the letter again—it will return to gray.] Stay quiet, except if the teacher hesitates for 3 seconds. Point to the next letter and say, “Please go on.” Mark the skipped letter as incorrect.

If the teacher provides the letter name rather than the sound, or if the teacher adds other explanation as if she were teaching, (e.g., “Learners, this is sound /m/”), say, “Please just say the sound of the letter in English.”

If a teacher provides the less common sound (e.g., in English short vowel a, e, i, o, u, soft c), say, “Please say the other common sound this letter makes.”

Early stop rule: If the teacher does not provide a single correct response for the first 10 items, say “Thank you, that’s all,” discontinue this subtask, check the box at the bottom of the task, and continue to the next task. [On the tablet: If the teacher does not provide a single correct response for the first 10 items, the screen will flash red and the timer will stop. Then press “Next.”]

If the timer runs out before the last item is read, say “Thank you, that’s all.” If the teacher is almost finished, you may let them finish; you do not have to interrupt them. Either way, mark with a bracket (]) the final letter read when the timer ran out; do not count any letters that they read after the end of the timer. [On the tablet: If the timer runs out before the last item is read, the screen will flash red and the timer will stop. Mark the final letter read by touching it so that a red bracket appears. Then press “Next.”]

If the teacher reaches the last item before the screen flashes red, stop the timer as soon as the teacher reads the last letter. Note the number of seconds remaining and record it at the bottom of the task. [On the tablet: Touch the last letter so the red bracket appears. Then press “Next.”]

When finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus2:

Example: u f L

h i A L h S X A L c (10)

N r c d i T r y s P (20)

D T s N R O J e H i (30)

a e L u V g E U t Z (40)

e t o E I t S n w e (50)

W F A n o E G T N R (60)

M h T b E i n H m T (70)

O e L D Y d a f E a (80)

i U p i N t O Q h R (90)

e o C A O e S a K S (100)

Time remaining on stopwatch at completion (number of SECONDS):

Exercise discontinued because the teacher had no correct answers in the first line.

Exercise skipped because teacher refused.

Development/Adaptation Guidelines:

1. Do an inventory of all the letter-sound correspondences in the standard orthography.
2. Do a corpus analysis of the relative frequencies of the different letter-sound correspondences (see the sketch following this list).
3. Include at a minimum the default letter representation of all consonant and vowel sounds in the language, up to 50.
4. If the orthography includes commonly used digraphs/trigraphs with consistent letter-sound correspondences (e.g., th, ch), include them.
5. If the orthography uses diacritics to distinguish between letter sounds, include them.
6. If the language has fewer than 50 letter-sound correspondences, repeat items according to their relative frequency in the language.
7. Randomize the items. After randomizing, check that neighboring letters do not form a word in that language; if they do, re-scramble.
8. Ensure that the first line is not composed exclusively of rare items.
9. As much as possible, avoid contentious orthographic disputes.
10. If one letter commonly represents two sounds, accept either sound as correct.
11. Accept dialectal pronunciation differences.

2 For this example, the items are taken directly from the 2017 English EGRA administered by the Uganda SHRP program.
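
The sketch below illustrates guidelines 2 and 7 in code: estimating relative letter frequencies from a corpus of grade-level text and re-scrambling a randomized grid until no two neighboring letters form a real word. The corpus file and real-word list are hypothetical placeholders.

```python
import random
from collections import Counter

# Estimate relative letter frequencies from grade-level text (guideline 2).
# "grade_level_corpus.txt" and the real-word list below are hypothetical placeholders.
with open("grade_level_corpus.txt", encoding="utf-8") as f:
    letter_counts = Counter(ch for ch in f.read().lower() if ch.isalpha())
print(letter_counts.most_common(10))

real_words = {"at", "in", "on", "it"}   # short real words that neighbors must not spell (guideline 7)

def randomized_grid(counts, n_items=100, row_len=10):
    """Draw a frequency-weighted 100-item grid and re-scramble until no two
    neighboring letters spell a real word."""
    letters, weights = zip(*counts.items())
    items = random.choices(letters, weights=weights, k=n_items)
    while any("".join(items[i:i + 2]) in real_words for i in range(n_items - 1)):
        random.shuffle(items)
    return [items[i:i + row_len] for i in range(0, n_items, row_len)]

for row in randomized_grid(letter_counts):
    print(" ".join(row))
```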

2. Nonsense Word Decoding Task
Timed task: 60 seconds for 50 items

Instructions to the Assessor: Show the teacher the sheet of nonsense words in the teacher stimuli booklet as you read the instructions below.

Decoding new words by sounding them out is an important skill that we teach our learners when teaching them to read.

Let’s say you are teaching your learners to read some new words they have never seen before, using their knowledge of the letter sounds. For this exercise, we will use some nonsense words that are spelled like real words in English. Please tell me the correct way to read each of these words in English according to the way the words are spelled.

Let’s start with some examples. [Point to the word ud.] This is not a real word in English, but if it were, we would expect a learner to read this word as /ud/.

Try this one: [Point to the word bif.] This is not a real word in English, but if it were, how would you expect a learner to read this word?

[If the teacher says /bif/:] That’s right. [If the teacher does not say /bif/:] OK. Then say: In English we would read this word as /bif/.

Let’s try one more: [Point to the word mep.] This is not a real word in English, but if it were, how would you expect a learner to read this word?

[If the teacher says /mep/:] That’s right. [If the teacher does not say /mep/:] OK. Then say: In English we would read this word as /mep/.

When I say “Begin,” start here [point to first letter] and go across the page [point]. Point to each word and tell me the correct way that you would expect the learners to read that word in English. Remember that these are not real words in English, but they are spelled like real words, so we can still read them as if they were. Read as quickly and carefully as you can. If you come to a word that you do not know how to pronounce, just skip it and go on to the next word.

Are you ready? Begin.


Start the timer when the teacher reads the first word. Follow along with your pencil and clearly mark any incorrect words with a slash ( / ). Count self-corrections as correct. If you already marked the self-corrected word as incorrect, circle it ( ø ) and continue. [On the tablet: Follow along on your screen and mark any incorrect words by touching that word on the screen—it will turn blue. Mark self-corrections as correct by touching the word again—it will return to gray.] Stay quiet, except if the teacher hesitates for 3 seconds. Point to the next word and say, “Please go on.” Mark the skipped word as incorrect.

Early stop rule: If the teacher does not provide a single correct response for the first five items, say “Thank you, that’s all,” discontinue this subtask, check the box at the bottom of the task, and continue to the next task. [On the tablet: If the teacher does not provide a single correct response for the first five items, the screen will flash red and the timer will stop. Then press “Next.”]

If the timer runs out before the last item is read, say “Thank you, that’s all.” If the teacher is in the final five items, you may let them finish; you do not have to interrupt them. Either way, mark with a bracket (]) the final word read when the timer ran out; do not count any words that they read after the end of the timer. [On the tablet: If the timer runs out before the last item is read, the screen will flash red and the timer will stop. Mark the final word read by touching it so that a red bracket appears. Then press “Next.”]

If the teacher reaches the last item before the screen flashes red, stop the timer as soon as the teacher reads the last word. Note the number of seconds remaining and record it at the bottom of the task. [On the tablet: Touch the last word so the red bracket appears. Then press “Next.”]

When finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus3:

Example: ud bif mep

lus paf sim zon maz lus paf sim zon maz (10)

ver lut ral fid gax ver lut ral fid gax (20)

rop teb fut et sal rop teb fut et sal (30)

sen tib lef huz leb sen tib lef huz leb (40)

bif wix fim riz ret bif wix fim riz ret (50)

Time remaining on stopwatch at completion (number of SECONDS):

□ Exercise skipped because teacher refused.
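The time remaining recorded above, together with the items marked incorrect, is what EGRA-style instruments typically use to derive a correct-items-per-minute fluency rate. The TLLA's exact scoring rules are not spelled out in this excerpt, so the Python sketch below only illustrates that conventional calculation.

    def correct_items_per_minute(attempted, incorrect, seconds_remaining, time_limit=60):
        """Conventional EGRA-style rate: correct items scaled to a full minute,
        using the time actually spent (time_limit minus seconds left)."""
        elapsed = time_limit - seconds_remaining
        if elapsed <= 0:
            return 0.0
        return (attempted - incorrect) * 60.0 / elapsed

    # Example: 42 items attempted, 5 marked incorrect, 12 seconds remaining on the stopwatch.
    print(correct_items_per_minute(42, 5, 12))  # 46.25 correct items per minute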

Development/Adaptation Guidelines:

1. Do a corpus analysis to determine the permissible and prohibited letter combinations and syllable structures and the average word length in grade-appropriate texts. Ensure that the invented items conform to the phonological and orthographic constraints of real words and are representative of different structures that dominate grade 1 through grade 4 text.
2. Include at a minimum the default or most common letter representation of all consonant and vowel sounds in the language, up to 50.
3. Vary the word length in terms of letters and syllables.
4. Avoid any forms that are real words in another predominant language spoken in the region. We don’t want the teacher to see the form and read it as it is pronounced in another language instead of reading it as it would be pronounced in the target language.
5. Randomize the items (see the sketch following this list).
6. Ensure that the first line is not composed exclusively of rare items.
7. As much as possible, avoid contentious orthographic disputes.
8. Accept dialectal pronunciation differences.
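Guidelines 1, 4, and 5 can be supported with a small script during adaptation. The Python sketch below is illustrative only: the consonant-vowel-consonant pattern, the letter inventories, and the word list are placeholders standing in for the results of the corpus analysis for the target language and for other predominant regional languages.

    import itertools
    import random

    # Placeholders: replace with the corpus-derived inventories and word lists.
    consonants = ["b", "f", "l", "m", "n", "p", "r", "s", "t", "z"]
    vowels = ["a", "e", "i", "o", "u"]
    real_words = {"bat", "bet", "bit", "but", "fan", "fun", "man", "mat", "men", "met",
                  "net", "not", "nut", "pan", "pen", "pet", "pin", "rat", "run", "sat",
                  "set", "sit", "tan", "ten", "zip"}  # target language AND regional languages

    def candidate_forms():
        """Generate consonant-vowel-consonant forms that are not real words in the
        target language or in other predominant regional languages (guidelines 1 and 4)."""
        for c1, v, c2 in itertools.product(consonants, vowels, consonants):
            form = c1 + v + c2
            if form not in real_words:
                yield form

    items = random.sample(list(candidate_forms()), 50)  # guideline 5: randomize the items
    print(items[:10])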

3. Oral Reading Fluency
Timed task: 60 seconds for approximately 60 words

Instructions to the Assessor: Show the teacher the oral reading fluency passage in the teacher stimuli booklet as you read the instructions below.

We often read stories aloud to our learners. In addition to building their language and reading skills, reading aloud to them lets us model what fluent and expressive reading sounds like.

Let’s say you are going to read the following story to your learners. First, take a moment to skim the story. Then read it aloud to me as if you were reading it to your learners to model what fluent and expressive reading sounds like. If there are any words you don’t know, just skip them and continue to the next word. After you read, we will do two comprehension activities.

Are you ready? Begin.

Start the timer when the teacher reads the first word aloud. Follow along with your pencil and clearly mark any incorrect words with a slash ( / ). Count self-corrections as correct. If you already marked the self-corrected word as incorrect, circle it ( ø ) and continue. [On the tablet: Follow along on your screen and mark any incorrect words by touching that word on the screen—it will turn blue. Mark self-corrections as correct by touching the word again—it will return to gray.] Stay quiet, except if the teacher hesitates for 3 seconds. Point to the next word and say, “Please go on.” Mark the skipped word as incorrect.

Early stop rule: If the teacher does not read a single word correctly in the first line, say “Thank you, that’s all,” discontinue this subtask, check the box at the bottom of the task, skip the Text Comprehension Question Generation task, and continue to the next task. [On the tablet: If the teacher does not read a single word correctly in the first line, the screen will flash red and the timer will stop. Then press “Next.”]

If the timer runs out before the last item is read, say “Thank you, that’s all.” If the teacher is almost finished, you may let them finish; you do not have to interrupt them. Either way, mark with a bracket (]) the final word read when the timer ran out; do not count any words that they read after the end of the timer. [On the tablet: If the timer runs out before the last item is read, the screen will flash red and the timer will stop. Mark the final word read by touching it so that a red bracket appears. Then press “Next.”]

If the teacher reaches the last item before the screen flashes red, stop the timer as soon as the teacher reads the last word. Note the number of seconds remaining and record it at the bottom of the task. [On the tablet: Touch the last word so the red bracket appears. Then press “Next.”]

Mark the rating that best characterizes how expressive the teacher was in reading the passage using intonation (i.e. expressing meaning through strategic variation in vocal pitch and volume).

When finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus4:

My name is Pat. I live on a farm with my mother, father, and brother. The land gets very dry. Every year we watch the sky and look for the rain. One afternoon as I sat outside, I saw dark clouds. Then something hit my head, lightly at first and then harder. The rains had come at last.

Time remaining on stopwatch at completion (number of SECONDS):

□ Exercise discontinued because the teacher had no correct answers in the first line.

□ Exercise skipped because teacher refused.

Assessor Scoring Guide for Intonation:

3: Consistently read with vocal expressiveness, conveying meaning and emotional content through appropriate intonation.

2: Used occasional expressiveness and/or used it sometimes inappropriately (i.e. intonation did not match meaning).

1: Used little to no vocal expressiveness. Mostly dull and monotonous. Intonation offered little support toward conveying meaning.

4 For this example, the text and comprehension questions are taken directly from the 2017 English EGRA administered by the Uganda SHRP program.

Development/Adaptation Guidelines:

1. The story should be a narrative of approximately 60 words, written in third person. The story should have a plot line with a beginning, where the scene is set and the characters are introduced, a middle, where a dilemma is presented, and an end, where the dilemma is resolved. The story should end on a positive note.
2. Do not copy a well-known plot (e.g. from a folktale), but the setting, characters, and actions should be familiar and appropriate for children in the target area.
3. The story should contain emotional content (e.g. excitement, fear, sadness, surprise) so as to give the teacher the opportunity to use expressive intonation.
4. Limit the number of characters to two maximum. All names should be familiar in the context. Do not name a specific place, such as the name of a city.
5. The story should be written approximately at a 4th grade level. Include a variety of sentence lengths: some shorter, some longer. Include some conjunctions and clauses.
6. Avoid words with dialect differences.
7. Use positive gender representations.

4. Oral Reading Comprehension
Untimed task: 5 items

Instructions to the Assessor: Remove the passage from the Oral Reading Fluency task from in front of the teacher as you read the instructions below.

As teachers, we ask our learners a lot of questions to check their understanding of the lesson. Now I will ask you some questions that you might ask your learners about the story you just read. Please tell me what answer you would expect them to give if they understood the story.

Are you ready? Let’s begin.

Ask the provided questions in the table. Mark the provided box according to the teacher’s answer.

Look Back: This activity is used if the teacher did not answer a question correctly. Give the passage again to the teacher and say: Now you can use the passage to help you find the answer. Ask only the questions that were answered incorrectly the first time. If correct, tick the Correct with Lookback box.

Questions (mark one: Correct, Incorrect, Correct with Lookback, or No Response)

1. Where does Pat live? [on a farm]


2. What gets very dry? [the land or the ground]

3. Why do Pat and his family watch the sky? [hoping the rains come; waiting for the rain, looking or watching for rain]

4. What did Pat see as he sat outside? [clouds, dark clouds]

5. How did Pat feel when the rains came? [excited; thankful; happy; any reasonable answer]

Development/Adaptation Guidelines:

1. The questions should be complete statements. Do not use cloze. For example: “Where did the girl go?” is fine. “The girl went to the ___?” is not acceptable.
2. Create four explicit questions that are evenly spaced throughout the passage.
3. Create one inferential question for which the answer can be inferred by reading the passage.
4. Write the acceptable answer(s) next to the questions.


B. TLLA Silent Reading Comprehension Task

I. Silent Reading Comprehension
Timed Task: 10 items, 5 minutes

Instructions to the Assessor: Open the teacher stimuli booklet to the Silent Reading Comprehension Task. Read the following instructions to the teacher.

Part of our work as teachers is to assess learner language proficiency to help improve their skills. Here is an English reading comprehension exercise completed by a primary 4 learner. The purpose is for you to assess the learner’s English reading comprehension skills. You will read the comprehension text silently and then correct the learner’s responses. Use the symbol ✓ for the answers which are correct. If the answer is incorrect, cross it out and circle the correct answer.

Example (Correct student answer):

Let’s start with an example. I will read the text to you. [Point to the example and read it aloud. Then point to the question.] This question asks, “Why was Mary excited to go to a new school?” [Read the response options aloud.] The learner has selected C as the answer. Is the learner’s answer correct? [If the teacher says No, say:] That’s right. Cross out the student answer. Which option do you think best answers the question? [If the teacher says A, say:] That’s right. Now, circle A. [If the teacher says B, look at the text together again, and say:] According to the text, A answers the question better than the other options.

[If the teacher says the answer is correct say:] Let’s look at the text and then read the response options again. Answer A answers the question better than the other options. The learner’s answer is incorrect. Cross out the learner’s answer, and circle answer A.

When you are ready, we can begin. Ready?

Sample Task Stimulus:

Example:

Mary was excited to go to her new school because it had good teachers and it was close to her house.

Question and Student Answers: Mark a tick (✓) if the answer selected by the student was correct. If the student answer is incorrect, cross it out and circle the correct answer.

1. Why was Mary excited to go to a new school?
a. It had good teachers.
b. It had many teachers.
c. It was far from her house.

William Kamkwamba was born in Masitala village in Malawi. His father was a farmer. When William was 14 years old, there was a drought because it had not rained for a long time. The farmers’ crops could not grow. Soon, there was very little food. His family ate one meal a day. Everyone was hungry and thin. William’s family had no crops to sell, so they could not pay his school fees. Then William had to stop going to school. He still wanted to learn so he went to the library and read many books. One day, William found a book with a picture of a windmill. A windmill is a tall machine with a wheel on top of it. When the wind blows hard, the wheel turns around and makes electricity. William wanted to build a windmill to help his family. He looked at the picture in the book to help him. William had no money, so he used old bicycle parts and pipes to build his windmill. He used the windmill to make electricity for his family’s house. People came from far away to see William’s windmill. Now his family is happy because they have electricity at home.

Questions and Student Answers: Mark (✓) if the answer selected by the student was correct. If the student answer is incorrect, cross it out and circle the correct answer.

1. Why was everyone in the village hungry and thin?
a. The people did not like to eat.
b. There was no food because of the drought.
c. It was raining every day.

2. What was the occupation of William’s father?
a. electrician.
b. farmer.
c. librarian.

3. Why did William have to stop going to school?
a. His family ate only one meal a day.
b. His parents did not have money to pay his school fees.
c. He preferred going to the library.

4. Why did William’s family only eat one meal a day?
a. They ate one meal a day because they had very little food and no money.
b. They ate only one meal a day because they had no electricity.
c. They ate only one meal a day because they were too busy in the fields to eat.

5. Why did William go to the library?
a. He went to the library because he loved to learn.
b. He went to the library because he wanted to find a windmill.
c. He went to the library because there was food there.

6. Why did William use old bicycle parts and pipes to build the windmill?
a. He used old bicycle parts and pipes because they are the best parts to use for building a windmill.
b. He used old bicycle parts and pipes because the wind can move these parts easily.
c. He used old bicycle parts and pipes because he had no money to buy new parts.

7. How did the windmill help William’s family?
a. The windmill made electricity for their home.
b. The windmill made food for his family to eat.
c. The windmill helped his family grow crops.

8. People came from far away to see William’s windmill because
a. They had never seen a windmill before.
b. They thought William was very clever.
c. They wanted electricity in their homes.

9. Choose the most appropriate title for this passage.
a. William’s Electric Windmill.
b. A Drought in Masitala Village, Malawi.
c. All About Windmills.

10. What is the main idea of the passage?
a. There was a drought in William Kamkwamba’s village.
b. William Kamkwamba’s electric windmill was a tall machine.
c. William Kamkwamba used a book to build an electric windmill.

Development/Adaptation Guidelines:

Text selection:

1. Choose texts that are appropriate for learners in primary 4.
2. You can select and adapt texts from grade 4 textbooks, tourist guides, newspapers, recipe books, and Wikipedia.
3. To assess text complexity and appropriate grade level for English, use the Flesch-Kincaid test at www.readability.com or the readability check in Microsoft Word (a sketch of this calculation follows this list). It is also important to weigh the readability of the selected text against the readability of primary 4 English textbooks to approximate readability for this grade level, as it may differ significantly from one country to another, even in the same language.
4. For local languages, English word and sentence lengths do not apply. Use the word and sentence lengths in primary 4 textbooks in the target language to approximate readability for primary 4. The website https://wordcounttools.com may help in determining word and sentence lengths of a sample text in the target language.
5. The text should be between 150 and 200 words.
6. Selected texts should be informational (non-fiction). Recommended informational text genres are:
a. Process or “How To” Texts. These are texts which describe the specific directions or steps in a process. Examples of process texts are recipes, how to cook a dish or play a game.
b. Expository Texts. The purpose of these texts is to inform and describe. The content of these texts is factual and educational.
7. Names and proper nouns should be familiar to learners in the context.
8. Select texts that are high in quality, i.e. texts that are well-written, are rich in content, and have a variety of sentence types.
9. Select texts which give readers the opportunity to answer factual and inferential questions. Answers to factual/literal questions can be found directly in the text. Inferential questions require the reader to read between the lines and draw reasonable conclusions based on information given in the text.
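For guideline 3, the Flesch-Kincaid grade level can also be computed directly rather than through an online tool. The Python sketch below uses the published Flesch-Kincaid grade formula; the syllable counter is a rough vowel-group heuristic, so treat the result as approximate and cross-check it against primary 4 textbooks as the guideline advises.

    import re

    def count_syllables(word):
        """Rough heuristic: count groups of consecutive vowel letters."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        """FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

    sample = ("William wanted to build a windmill to help his family. "
              "He used old bicycle parts and pipes to build it.")
    print(round(flesch_kincaid_grade(sample), 1))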

Developing multiple choice items: There are two parts to a multiple-choice item: the stem, which identifies or poses the question that students must answer, and a list of suggested response options. One of the response options is correct. The rest, which are incorrect, are distractors. Below is an example of a multiple-choice question.

Example:

A. Which idea below best expresses the main idea of the paragraph?  [Stem]
a. Reading will help you in every aspect of your life.  [Answer]
b. We do not need to learn to read.  [Distractor]
c. Reading keeps your body in shape.  [Distractor]
(Options a-c together are the response options.)

When constructing multiple choice items, keep the following in mind:

10. Make sure only ONE response option is correct or is the best answer to the question.
11. Keep the stem short and clear.
12. State the stem in a positive form. (Avoid negative stems, such as “Which of the following is not true of the characteristics of mammals?”)


13. Keep the grammar of each response option consistent with the stem.
14. Make sure the stem clearly states the problem. The stem may consist of a direct question or an incomplete sentence which readers must complete.
a. Direct question: Which of the following is a characteristic of a mammal?
b. Incomplete sentence: A safari park is a place where …
15. Response distractors should be relevant to the text. They should not be silly or obviously about a different topic.
16. Keep the response options similar in length.
17. Avoid lifting the response options word for word from the text.
18. Answers to the reading comprehension questions should only be found in the text and should not be answerable from background knowledge.
19. The questions should require the reader to have read and understood the text, and not be answerable from background knowledge alone.
20. Randomize the position of the correct answer (a, b, or c) from one item to the next (see the sketch following this list).
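The shuffling in guideline 20 can be scripted during item assembly. The Python sketch below uses two items from the sample stimulus above; each item is drafted with its correct answer listed first, the options are then shuffled, and the resulting key letter is recorded.

    import random

    # Each item is drafted with the correct answer as the first option.
    items = [
        ("Why was everyone in the village hungry and thin?",
         ["There was no food because of the drought.",
          "The people did not like to eat.",
          "It was raining every day."]),
        ("What was the occupation of William's father?",
         ["farmer", "electrician", "librarian"]),
    ]

    answer_key = []
    for stem, options in items:
        correct = options[0]
        random.shuffle(options)                     # randomize the a/b/c positions
        key_letter = "abc"[options.index(correct)]  # record where the answer landed
        answer_key.append(key_letter)
        print(stem)
        for letter, option in zip("abc", options):
            print(f"  {letter}. {option}")

    print("Answer key:", answer_key)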

C. TLLA Speaking Task

I. Speaking: Describing a Picture
Timed Task: 1 picture, 2 minutes

Instructions to the Assessor: Show the teacher the picture in the teacher stimuli booklet and say, Please look at this picture and tell me what is happening in it.

Start the timer when the teacher starts talking. Follow along with your pencil and tick all the items that the teacher mentions from the list on the left of your score sheet. [On the tablet: Follow along on your screen and tick each item mentioned from the list—it will turn blue.] Stay quiet, except if the teacher hesitates for 5 seconds. Say: You can tell me anything you see happening in this picture.

Early stop rule: If the teacher does not say anything at all for 10 seconds, even after the second prompt, say, Thank you, that’s all, discontinue this subtask, check the box at the bottom of the task, and continue to the next task.

If the timer runs out before the teacher finishes speaking, say: Thank you, that’s all. If the teacher is almost finished, you may let them finish; you do not have to interrupt them. Either way, tick the final item mentioned when the timer ran out; do not count any items mentioned after the end of the timer. [On the tablet: If the timer runs out before the teacher finishes speaking, the screen will flash red and the timer will stop. Mark the last item mentioned before the screen flashed red. Then press “Next.”]

If the teacher finishes speaking before the timer runs out, stop the timer.

At the end of the timed portion, tick one item from the right side of your score sheet, the Utterances Description. Tick the one option that best describes the overall response pattern used by the teacher.

When finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus:


Assessor Score Sheet:

Items Mentioned (Tick all that are said.)

☐ balloons ☐ barefoot ☐ bed ☐ bent ☐ blanket ☐ brother ☐ children ☐ clean ☐ confused ☐ countryside ☐ cupboard ☐ door ☐ doorknob ☐ early ☐ elbow ☐ fields ☐ fists ☐ floor ☐ gently ☐ grass ☐ ground ☐ handle ☐ hoe ☐ horizon ☐ hut ☐ kitenge (wrapper) ☐ landscape ☐ latch ☐ mattress ☐ morning ☐ mother ☐ mouth ☐ nudge ☐ open (ajar) ☐ pattern ☐ peaceful ☐ plaid ☐ plain ☐ rectangular ☐ rise ☐ roof ☐ round ☐ rural ☐ shoulder ☐ siblings ☐ sister ☐ sit up ☐ solid ☐ stretch ☐ stripe(s) ☐ sunrise ☐ tap ☐ thatch ☐ tired (sleepy) ☐ touch ☐ tree ☐ t-shirt ☐ two ☐ village ☐ wake-up ☐ white ☐ wood ☐ yawn

Utterances Description (Tick one, the dominant response pattern.)

☐ The teacher listed the items and events, possibly with pauses between items. (e.g., “I see - children, a hoe, stretching, waking up.”)

☐ The teacher used sentences that were a mix of grammatically correct and incorrect. (e.g., “The mother taps. Child stretch.”)

☐ The teacher used sentences that were grammatically correct. (e.g., “The children stretch. They live near trees. The adult is barefoot.”)

☐ The teacher used sentences that were grammatically correct and cohesive. (e.g., “It is early morning. The adult wakes the kids. They are stretching.”)

☐ The teacher used sentences that were grammatically correct, cohesive, and extended the illustration. (e.g., “It is early morning. The adult is ready to work in the fields. Before she leaves, she has to wake the kids for school. Her shoes are likely kept outside the door.”)

□ Exercise discontinued because the teacher had no response.

□ Exercise skipped because teacher refused.

Adaptation Guidelines

1. Locate a black and white drawing of a local scene (e.g., markets, school, domestic scenes from a pupil book). It should include people and show activities and objects. It is helpful if there is content in the foreground and background.
2. Generate a thorough list of all the objects and actions shown in the picture. Include synonyms. As part of the list generation process, show the picture to several proficient speakers of the target language and ask them to describe it. Ensure that you have included all of the words that they use.
3. The drawing should be between 4 inches by 6 inches and a full B5 page.
4. The prompt is intended to encourage the use of sentences. The two-minute limit is not used to calculate speaking rate; it gives the task a consistent structure.
5. Technical Adequacy: During cognitive interviewing we will ask teachers about the prompt. Scoring syntax (i.e., response patterns) will require some real examples for training.

Note: The order of the subtasks presented in this document aligns with the order in which they appear in the TLLA research plan. While the Oral Reading task appears first in the plan because of its relationship to the student EGRA, when these tasks are administered, we expect that this subtask (the picture description) would be administered first. It is a “friendly” task to begin the interaction, and it avoids any priming effects that the other subtasks may introduce.


D. TLLA Listening Task

1. Elicited Imitation (Sentence Repetition) Task
Untimed task: 15 items (expected to take 2 minutes)

Instructions to the Assessor: There is no text stimulus for the teacher for this task. Read the instructions below to the teacher.

I will read some sentences aloud in English. I will read each sentence only once. After I read each one, please repeat the same exact sentence back to me word for word. The sentences will gradually get longer as we go. If you cannot remember the whole sentence exactly, just repeat as much of it as you remember.

Let’s start with an example. If I say, “We sing songs in the classroom”, then you will just repeat back to me: “We sing songs in the classroom.”

Now you try an example. I will say a sentence, and you repeat it back to me word for word: “Children like to go to school.”

[If the teacher repeats some or all of it, say:] That’s right.

[If the teacher does not repeat or does not understand what to do, say:] OK, all you need to do is repeat the sentence back to me word for word as much as you remember. Let’s try again: “Children like to go to school.”

Ready? Let’s begin.

Read each sentence aloud clearly and naturally. Do not read too fast or too slowly. Read each sentence only one time. Listen as the teacher repeats the sentence and follow along with your pencil. Clearly mark any incorrect, transposed, or omitted words with a slash ( / ). Count self-corrections as correct. If you already marked the self-corrected word as incorrect, circle it ( ø ) and continue. [On the tablet: Follow along on your screen and mark any incorrect, transposed, or omitted words by touching that word on the screen—it will turn blue. Mark self-corrections as correct by touching the word again—it will return to gray.]

Stay quiet, except if the teacher hesitates for 3 seconds. Say, “Please repeat as much as you remember.” Mark any omitted words as incorrect. If the teacher cannot repeat any words in a given sentence, draw a line through the whole sentence. [On the tablet: If the teacher cannot repeat any words in a given sentence, touch the strike-through button to strike through the whole sentence. Then press “Next.”]

Early stop rule: If the teacher does not correctly provide at least 50% of the syllables for any 3 sentences in a row, say “Thank you, that’s all,” discontinue this subtask, check the box at the bottom of the task, mark with a bracket (]) the end of the final sentence that you read, and continue to the next task. [On the tablet: Mark the final word read by touching it so that a red bracket appears. Then press “Next.”]
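For tablet-based administration, this early stop rule could be checked automatically from per-sentence tallies. The Python sketch below is a minimal illustration, assuming the application records, for each sentence administered, the total syllables and the number repeated correctly; the sample values are illustrative only.

    def should_stop_early(results, window=3, threshold=0.5):
        """Return True once the teacher falls below 50% of syllables correct
        on `window` consecutive sentences (the early stop rule)."""
        consecutive = 0
        for total_syllables, correct_syllables in results:
            if correct_syllables < threshold * total_syllables:
                consecutive += 1
                if consecutive >= window:
                    return True
            else:
                consecutive = 0
        return False

    # Illustrative tallies: (syllables in sentence, syllables repeated correctly)
    print(should_stop_early([(6, 6), (7, 3), (8, 2), (9, 4)]))  # True: three items in a row below 50%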

When finished, say: Thank you! Let’s go to the next section.

Assessor Stimulus and Score Sheet:

Example 1: We sing songs in the classroom.

Example 2: Children like to go to school.

1. The girls bought fruit at market. (6 syllables)

2. Please give her the book today. (7 syllables)

3. Was drought the reason for crop loss? (8 syllables)

4. They stopped here to rest around sunset. (9 syllables)

5. The parents reject the new policy. (10 syllables)

6. Would you enjoy coming to the match with us? (11 syllables)

7. Before my brother left today, he finished his work. (12 syllables)

8. Did they lose their crops because there wasn’t enough rain? (13 syllables)

9. It was the principal who said we should do it this way. (14 syllables)

10. The girls won the game because they had trained harder than the others. (15 syllables)

11. There was no room in the bed, so they slept on a mat on the floor. (16 syllables)

12. Every morning Mother rises early to heat water for our bath. (17 syllables)

13. The police have a checkpoint between here and the city that slows traffic. (18 syllables)

14. She was a smart president and it’s logical she was elected three times. (19 syllables)

15. When John woke this morning he discovered that his bike had been stolen in the night. (20 syllables)

□ Exercise discontinued because the teacher had no correct responses for three sentences in a row.

□ Exercise skipped because teacher refused.

Guidelines for the development of the items for this task:

1. Do an inventory of common sentence structures in the language.
2. Develop 15 sentences that gradually increase in length and complexity and use different common sentence structures. The following are some suggested grammatical structures that can be used to vary the sentence structure and level of complexity, if the target language uses them: different verb tenses (e.g. past, present, future); copula/linking verbs (e.g. He is young., She is a strong woman.) versus intransitive verbs (i.e. with no object, e.g. We slept…) versus transitive verbs (i.e. with a direct object, e.g. We ate dinner.) versus bi-transitive verbs (i.e. with direct and indirect object, e.g. We gave him the book.); passive voice (e.g. They were robbed.); subordinate/dependent clauses (such as adverbial clauses, e.g. when we arrived…, because they wanted…; or relative clauses, e.g. the man who left the book, the man that I saw, etc.); negation; questions (interrogative form); commands (imperative form); prepositional phrases (e.g. on the table, to school); etc.
3. Note that making a sentence longer will usually increase its complexity, but avoid simple listing. In the following example, both b (20 syllables) and c (21 syllables) are longer than a (9 syllables), but c is more syntactically complex than b.
a. Grace went to market to buy meat.
b. Grace went to market to buy onions, tomatoes, garlic, potatoes, and meat.
c. When Grace saw that the sky was getting dark, she quickly went to the market to buy meat.
4. Order the items from shortest to longest in terms of syllables. The shortest sentence should be 6-8 syllables long. The longest sentence should be at least 20 syllables long.
5. Use common, familiar, everyday vocabulary.
6. Each sentence should make sense and be grammatical, that is, sound perfectly natural to a native or highly proficient speaker of that language.
7. Each sentence should be independent of and unrelated to all the other sentences.
8. Avoid well-known sentences with which the teacher may already be familiar, such as famous quotations or lines from well-known songs or poetry.
9. Avoid the use of the first or second person singular (i.e. in English, I/me or you), as the teacher may be tempted to use the opposite term when repeating the sentence.
10. Avoid words and structures that are only used in some dialects but not others.
11. Accept dialectal pronunciation differences. That is, if the teacher pronounces the same words in a slightly different way due to their own regional dialect/accent, accept their pronunciation.


E. TLLA Grammar Tasks

1. Structure and Written Expression Task
Untimed task: 5 items (expected to take 2 minutes)

Instructions to the Assessor: Show the teacher the Structure and Written Expression task in the teacher stimuli booklet as you read the instructions below.

Here are some sentences in English. Each sentence is missing one part. Below each sentence are four ways to complete the sentence. I will read each sentence while you follow along. Then you tell me which answer fits the sentence best.

Let’s start with an example. [Point to the example and read it aloud.] This sentence says, “My father beans every spring.” [Point to the response options.] Here are four ways to complete that sentence: “A. planting; B. has plants; C. plants; D. is planted.” Which answer fits the sentence best?

[If the teacher says C, say:] That’s right. / [If the teacher does not say C say:] OK, consider this.

[Then say:] If we completed the sentence with C, the sentence would read “My father plants beans every spring.” Answer C fits the sentence better than the other answers.

When you are ready, we will begin. I will start reading here [point to the first sentence], you follow along and choose the best answer to complete the sentence and tell me which one. You can just say A, B, C, or D. If you come to one that you do not know the answer to, just skip it and continue to the next sentence.

Are you ready? Let’s begin.

Follow along with your pencil and clearly mark the answer that the teacher indicates for each item. If the teacher says one answer and then changes his or her mind, mark the new answer. If you already marked the first answer, circle the old answer ( ø ) and mark the new answer. [On the tablet: Follow along on your screen and mark each answer that the teacher indicates by touching that option on the screen—it will turn blue. Mark self-corrections by touching the new answer—the old answer will return to gray and the new answer will turn blue.]

Stay quiet, except if the teacher hesitates for more than 10 seconds on one item. Point to the next item and say, “Please go on.” Mark the skipped item as incorrect.

When finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus:

Example: My father _____ beans every spring.

A. planting B. has plants C. plants D. is planted

1. The government recognizes education _____ human right and strives to provide free primary education to all children.

A. basic B. as basic C. as a basic D. basically as

2. A tropical cyclone is expected to hit the region on Thursday evening, _____ many domestic flights have been cancelled.

A. so B. because C. provided that D. due to

3. For the first time, the manufacturer has revealed that it _____ three million tonnes of plastic packaging in one year.

A. use B. was used C. used to D. used

4. Our school plans to review _____ official policies on student absenteeism.

A. its B. an C. ourselves D. one

5. With a quarter million people _____ in its many villages in northern Uganda, Bidibidi is the second largest refugee camp in the world.

A. life B. have lived C. living D. lived

□ Exercise skipped because teacher refused.

Development/Adaptation Guidelines:

1. Local or online news stories, Wikipedia, or upper primary textbooks written in the target language can serve as inspiration for sentences. However, you do not have to use sentences verbatim from real sources. You may edit existing sentences for length, clarity, and appropriateness, or develop your own.
2. The sentences should be on topics familiar to adults in that context. Ensure that they are neutral and not disparaging or politically controversial. Avoid company names and proper nouns.
3. Choose one or two words from the sentence to omit. Replace them with a blank. Put those words as one of the four answer options. Devise three distractor options that are similar or plausible enough that the teacher needs to read the larger context of the sentence in order to choose the correct form. Nonetheless, the three distractor options must be grammatically impossible in the context of the sentence.
4. The selection of the correct answer should not require knowledge of prescriptive grammar rules of the “standard” dialect that are not widely known or used (e.g. whom in English). The correct answer should be immediately obvious to a native or highly proficient speaker of the target language based on common usage alone, even if that speaker does not have an academic or specialized background in that language’s grammar system.


5. The errors should be primarily grammatical in nature, and not based on spelling, punctuation, vocabulary, or factual accuracy. The following are some suggested grammatical structures that can be targeted, if the language uses them: verb conjugations, agreement (subject-verb, verb tenses, noun classes, pronoun references, etc.), conjunctions, subordinate/dependent clauses, negation, possession, determiners (e.g. articles, demonstratives), prepositions, etc.
6. Avoid any structures that are subject to dialect differences.
7. Randomize the position of the correct answer (A, B, C, or D) from one item to the next.
8. Because this is a written task, the item sentences can be slightly longer and more complex than if they were administered orally. Different languages have different average word lengths and ways of encoding meaning into words that make it difficult to prescribe an exact number of words per item. In English, an appropriate item length for this task would be one sentence containing about 12-20 words total. For other languages, adjust the number of words based on average word and sentence length of academic text at the upper primary level in that language.
9. The example sentence should be shorter (about six words in English) and relatively easy so that the teacher can focus primarily on the task procedure rather than on finding the right answer.

2. Written Error Identification Task
Untimed task: 5 items (expected to take 2 minutes)

Instructions to the Assessor: Show the teacher the Written Error Identification task in the teacher stimuli booklet as you read the instructions below.

Identifying and correcting student errors is an important part of our role as teachers.

Here are some sentences in English. Let’s say that your learners wrote these sentences. Some of the sentences contain an error in one of the underlined parts, A, B, C, or D. Please read each sentence and tell me which part contains the error, if any. You do not have to correct the error, just find it, if there is one. You can answer A, B, C, D, or E for no error.

Let’s start with an example. [Point to the first example.] This sentence says, “He was very interesting in what he had heard on the news.” There is an error in Part B; “interesting in” is not correct in this sentence. So the answer here is B.

Now you try an example. [Point to the second example.] This sentence says, “She has been working as an accountant at this office for sometimes.” Does any underlined part contain an error?

[If the teacher says D, say:] That’s right. / [If the teacher does not say D say:] OK, consider this.

[Then say:] There is an error in Part D; “sometimes” is not correct in this sentence. So the answer here is D.

When you are ready, we will begin. You will start here [point to the first sentence], read the sentence, find the underlined part that contains an error, and tell me which one. You can

just say A, B, C, D, or no error. If you come to one that you do not know the answer to, just skip it and continue to the next sentence.

Are you ready? Begin.

Follow along with your pencil and clearly mark the answer that the teacher indicates for each item. If the teacher says one answer and then changes his or her mind, mark the new answer. If you already marked the first answer, circle the old answer ( ø ) and mark the new answer. [On the tablet: Follow along on your screen and mark each answer that the teacher indicates by touching that option on the screen—it will turn blue. Mark self-corrections by touching the new answer—the old answer will return to gray and the new answer will turn blue.]

Stay quiet, except if the teacher hesitates for more than 20 seconds on one item. Point to the next item and say, “Please go on.” Mark the skipped item as incorrect.

When finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus:

(In the teacher stimuli booklet, four parts of each sentence are underlined and labeled A, B, C, and D; E stands for No Error.)

Example 1: He was very interesting in what he had heard on the news.   A  B  C  D  E (No Error)

Example 2: She has been working as an accountant at this office for sometimes.   A  B  C  D  E (No Error)

1. Most areas of Uganda usually receive plenty of rain. Some areas of the Southeast and Southwest average more than 150 millimeters per months in rainy season.   A  B  C  D  E (No Error)

2. Female school attendance is low than that of males at all levels of education.   A  B  C  D  E (No Error)

3. The increase population in the city in recent years has put a lot of stress on the limited water resources.   A  B  C  D  E (No Error)

4. In 2004, a team of government scientists at the Ministry of the Environment find that chemicals from the local factory had contaminated the river.   A  B  C  D  E (No Error)

5. The PTA decided to provide more student desks because enrollment was high and there was not enough seating.   A  B  C  D  E (No Error)

□ Exercise skipped because teacher refused.


Development/Adaptation Guidelines:

1. Local or online news stories, Wikipedia, or upper primary textbooks written in the target language can serve as inspiration for sentences. However, you do not have to use sentences verbatim from real sources. You may edit existing sentences for length, clarity, and appropriateness, or develop your own from scratch.
2. The sentences should be on topics familiar to adults in that context. Ensure that they are neutral and not disparaging or politically controversial. Avoid company names and proper nouns.
3. Choose one to three words from a well-formed sentence to alter so that the result is an ungrammatical sentence. Underline that part of the sentence; it is your correct answer. Underline three other words or groups of words in the sentence as distractor options. Label the four underlined parts A, B, C, and D. Add Part E for No Error.
4. Leave one item error-free so that the correct answer for that item is E.
5. The selection of the correct answer should not require knowledge of prescriptive grammar rules of the “standard” dialect that are not widely known or used (e.g. whom in English). The correct answer should be immediately obvious to a native or highly proficient speaker of the target language based on common usage alone, even if that speaker does not have an academic or specialized background in that language’s grammar system.
6. The errors should be primarily grammatical in nature, and not based on spelling, punctuation, vocabulary, or factual accuracy. The following are some suggested grammatical structures that can be targeted, if the language uses them: verb conjugations, agreement (subject-verb, verb tenses, noun classes, pronoun references, etc.), conjunctions, subordinate/dependent clauses, negation, possession, determiners (e.g. articles, demonstratives), prepositions, etc.
7. Avoid any structures that are subject to dialect differences.
8. Randomize the position of the correct answer (A, B, C, D, E) from one item to the next.
9. Because this is a written task, the item sentences can be longer and more complex than if they were administered orally. Different languages have different average word lengths and ways of encoding meaning into words that make it difficult to prescribe an exact number of words per sentence. In English, an appropriate item length for this task would be one to two sentences containing about 10-20 words total. For other languages, adjust the number of words based on average word and sentence length of academic text at the upper primary level in that language.
10. However, the example sentence should be shorter (about 10 words in English) and relatively easy so that the teacher can focus primarily on the task procedure rather than on finding the right answer.

F. TLLA Vocabulary Task

1. Vocabulary: Performance Task
Timed Task: 10 items, 5 minutes

Instructions to the Assessor: Show the teacher the Vocabulary Performance Task in the teacher stimuli booklet as you read the instructions below.

Explaining new vocabulary words to students is an important part of our role as teachers. In this activity, I will give you a word, and you will provide a definition for that word. More specifically, you will provide an explanation of the word in child-friendly language.

[Point to the first word on the list and say:] For example, this word is exhausted. A child-friendly explanation for exhausted is “feeling so tired you can hardly move.”

You can provide the explanation in any language, as you might do in the classroom. I will give you ten words, one at a time. If you want to skip a word, that is okay.

Are you ready? Begin.

Start the timer when you read the first word to the teacher. Read the word, listen to the teacher’s explanation, and rate the explanation on your score sheet. [On the tablet: Follow along on your screen and rate the teacher’s explanation by touching that option on the screen—it will turn blue. Mark self-corrections as correct by touching the option again—it will return to gray.] Then continue to the next word. Stay quiet, except if the teacher hesitates for 5 seconds. Say: How would you explain it to a child?

If the teacher hesitates for another 5 seconds, move on to the next word.

Early stop rule: If the teacher does not provide any response at all for the first three items, say Thank you, that’s all, discontinue this subtask, check the box at the bottom of the task, and continue to the next task. [On the tablet: If the teacher does not provide any response at all for the first three items, the screen will flash red and the timer will stop. Then press “Next.”]

If the timer runs out before the last item is read, say Thank you, that’s all. If the teacher is attempting the last item, you may let them finish; you do not have to interrupt them. Either way, rate the teacher’s explanation of the final word read before the timer ran out; do not count any words that they explained after the end of the timer. [On the tablet: If the timer runs out before the last item is read, the screen will flash red and the timer will stop. Rate the teacher’s explanation of the final word read before the timer ran out. Then press “Next.”]

When finished, say: Thank you! Let’s go to the next section.

Sample Assessor Stimulus and Score Sheet:


Word and child-friendly explanation. For each word, evaluate the teacher’s explanation by ticking one of: ☐ It would help a child to learn the word. / ☐ It was mostly inaccurate or incomplete. / ☐ None provided.

1. curious: To want to know why something happened or how something works.
2. fortunate: When you feel lucky that something happened.
3. horrible: Something is really bad.
4. timid: Someone who is a little scared to try something. They might wait to be asked or watch from a distance before they try.
5. clutch: To hold something really tight so you do not drop it or lose it.
6. regret: To feel really bad about something that you did. You wish you had not done it.
7. stroll: This is like going on a walk. But there is no hurry. You might stop along the way to examine flowers or stop to talk with people.
8. glance: To look at something quickly and then look away. This is the type of look you give a boy or girl you like but you don’t want them to see you looking.
9. feast: A large meal with many things to eat. It may be made for a party or wedding.
10. commotion: A lot of noise that is noticed by others and causes a problem. For example, learners arguing outside a classroom would be heard by others and likely a teacher would stop it.

Development/Adaptation Guidelines:

1. Identify 10 Tier 2 words in the target language. Tier 2 words are used across domains and content areas. They are more descriptive than Tier 1 words, which are everyday words.
2. The ten words should be a mix of different parts of speech (e.g. adjectives, verbs, and nouns) known to educated adults. (Examples: Tier 2 = exuberant; Tier 1 = happy.)
3. Technical Adequacy: Expressive language measures are a challenge for reliability. We may pilot a matching format.


G. TLLA Writing Tasks

Read this introduction to the teacher.

Writing is a critical component of literacy. Modeling writing with students as well as scoring student writing samples are ways in which we can assess and improve student language proficiency. For this task, you will complete two writing activities. For one task you will correct student work, and for the other you will answer a writing prompt.

1. Writing Task I – Correcting Student Writing Samples

Timed Task: 10 items, 2 minutes

Instructions to the Assessor: Show the teacher Writing Task I – Correcting Student Writing Samples in the teacher stimuli booklet as you read the instructions below.

For this task you will correct a letter written by a primary 4 learner. Read the letter carefully and correct the letter for mistakes in grammar, capitalization, punctuation, and spelling. Underline the mistakes the student made in the student answer column and then write the correct answer in the teacher feedback column. If there is a punctuation mark missing, draw a line where the punctuation mark should be and then write the correct punctuation mark in the teacher feedback column. If there are no errors in the sentence leave the sentence as it is.

Let’s look at the example together. [Point to the example and ask the teacher to read the first sentence with you. Then say:] This sentence says “Some children like playing football and netball.” Where is the learner error in the sentence?

[If the teacher says football, say:] That’s right. Now underline the word fotball in the student answer column and write the correct spelling in the teacher feedback column. [If the teacher does not say football, say:] OK, let’s look at the word fotball. It should have two o’s. So, the spelling of the word football is the error in the sentence. Now underline the word fotball in the student answer column and write the correct spelling in the teacher feedback column.

Let’s look at the next example. [Point to the sentence and ask the teacher to read it with you. Then say:] This sentence says, “Can we play other schools” Where is the student error in the sentence? [If the teacher says there is no question mark, say:] That’s right. Now underline the space after the word schools in the student answer column and write the question mark in the teacher feedback column.

[If the teacher does not say there is no question mark, OR if the teacher says there is no full stop, or gives any other answer, say:] OK. Let’s look at the sentence. The student is asking a question. Look at the end of the sentence. It does not have a question mark. The missing question mark

is the punctuation error in this sentence. Now, underline the space after the word “schools” in the student answer column and write the question mark in the teacher feedback column.

When you are ready, we will begin. You will start here. [Point to the first sentence.] Are you ready? Begin.

Start the timer when you give the teacher the letter. Stay quiet. When the teacher completes the subtask, enter the answers written on paper into Tangerine. Mark any skipped items as incorrect. When the teacher is finished, say: Thank you! Let’s go to the next section.

Sample Task Stimulus:

Examples:

Student Answer (underline the student error in each line) / Teacher Feedback (write the correct answer here)

Some children like playing fotball and netball.

Can we play other schools

Student Answer (underline the student error in each line) / Teacher Feedback (write the correct answer here)

1. Dear headmistress, I talk with my friends every morning.
2. They tells me what they like about school.
3. They have some ideas to make our school beter.
4. I will share one ideas with you.
5. Much pupils like to read.
6. They read befour school and during lunch.
7. They want more books for the libray.
8. Can the school get the pupils more books.
9. we think this will improve our school.
10. Sincerely

Development/Adaptation Guidelines:

1. Choose a text that is appropriate for learners in primary 4. You can select and adapt texts from local newspapers, tourist guides, primary 4 textbooks, and Wikipedia.
2. Then insert grammar, capitalization, punctuation, and spelling errors.
3. You can also use student work (with permission).
4. The sentences should be on topics familiar to adults in that context.
5. Use texts that are not disparaging or controversial. Avoid stereotypes, cultural, gender, and ethnic bias.
6. Choose texts that are short and simple (about 60 words in English). The whole passage should consist of 8-10 sentences of varied lengths.
7. Divide the text into sections. Each section of text should only have one error.
8. Make sure you have at least one grammar, punctuation, capitalization, and spelling error in the selected text.

2. Writing Task II – Answering a Writing Prompt
Timed Task: 1 item, 4 minutes

Instructions to the Assessor: Show the teacher Writing Task II – Answering a Writing Prompt in the teacher stimuli booklet as you read the instructions below.

For this next activity you will write a short letter that responds to a prompt. Follow along as I read it. [Point to the writing prompt and read it aloud. Then say:]

The Head Teacher would like to promote more reading in your school. She has asked teachers to send her suggestions that would encourage more reading. Think of activities you can do in your classroom to get your students to read more. Then write the Head Teacher a letter. Describe three or more activities and give reasons why these activities will help children read more.

You will have four minutes.

Are you ready? Begin. [Start the timer.] Follow the timer; at three minutes, tell the teacher there is one more minute. The writing piece is not scored in front of the teacher. It is scored after he/she is dismissed.

Sample Task Stimulus:

Writing Prompt: The Head Teacher would like to promote more reading in your school. She has asked teachers to send her suggestions that would encourage more reading. Think of activities you can do in your classroom to get your students to read more. Then write the Head Teacher a letter. Describe three or more activities and give reasons why these activities will help children read more.

Development/Adaptation Guidelines:

1. Determine the writing genre. Examples of genres you can use are the following:
a. Opinion (viewpoint supported with reasons).
b. Process or “How To” (steps in a process or procedure).
2. Make sure the prompt is short and clear and can be responded to in limited time.
3. You can select and adapt prompts from grade 4 textbooks, tourist guides, newspapers, recipe books, and Wikipedia.
4. Make sure adults have some familiarity with the topic and the writing genre.
5. Develop prompts that are not disparaging or controversial. Avoid stereotypes, cultural, gender, and ethnic bias.
6. Develop a prompt which is interesting and meaningful. Some sample prompts include:

Opinion prompts:
a. Write a paragraph on the reasons why it is important to have a health clinic in your community.
b. Write a letter to a friend telling her why she should visit Murchison Falls in Uganda instead of Lake Victoria.

Process or “How To” prompts:

a. Write a letter to your friend from Ethiopia describing how to make your favorite Ugandan dish (e.g., Ugali, Luwombo, Matooke).
b. Describe how to play your favorite game (e.g. chess, snakes and ladders, omweso, football).

Scoring the writing task:

7. Score the writing task on the following:
a. Ideas: The main message or central idea of the writing piece, with supporting details.
b. Organization: Three or more sentences centered on one idea, with examples.
c. Use of signal words relevant to the type of writing genre or text structure. Examples of signal words for opinion writing are: I think, I believe, in my opinion, the best, my favorite. Examples of signal words for process writing are: first, second, next, then, after.
d. Sentences: This demonstrates the ability to write a variety of sentence structures, including long, short, simple, and complex sentences.
e. Use a scoring guide that is unique to each writing prompt. Below are examples of a persuasive or opinion rubric and a process or “how to” rubric; an illustrative scoring sketch follows the rubrics.


Assessor Scoring Guide: Persuasive/Opinion Rubric:

Ideas
• Score 3: Main idea focused and clearly identified. Personal opinion stated in fresh, original way. Three to four points made with 3 or more detailed reasons.
• Score 2: Main idea stated. Personal opinion stated. 2-3 points made, but not always supported with reasons.
• Score 1: Little or no reference to the main idea. Opinion not clearly stated. Two or fewer points made. No reasons given.

Organization
• Score 3: Clear paragraph structure (beginning, middle, end). Ideas logically arranged and connected with 4 or more signal/transitional words. Long, short, simple, and complex sentences.
• Score 2: Basic paragraph structure. Ideas logically arranged. Some use of 2-3 transitional/signal words; some variation in sentence structure.
• Score 1: Little or no evidence of paragraph structure. Ideas disconnected. Little variation in sentence structure. 1-2 signal words used. 2-3 incomplete sentences.

Word choice
• Score 3: Uses specific nouns and strong (vivid) verbs. Vocabulary above grade level. Writing clearly targeted for appropriate evidence.
• Score 2: Uses appropriate, correct usage of vocabulary (nouns, strong verbs). Writing is appropriate for target audience.
• Score 1: Simple, limited vocabulary. Sometimes incorrectly used. Some parts of the message may be unclear.

Process or “How To” Rubric:

Ideas
• Score 3: Main idea is focused, clearly identified, and understood. Process introduced and stated in fresh, original way. All the steps in the process clearly described.
• Score 2: Main message/topic stated. Process introduced. Not all the steps in the process described; some of the steps may be confusing.
• Score 1: Little or no reference to the main message/topic. Process not clearly stated. Many steps in the process are missing.

Organization
• Score 3: Clear paragraph structure (beginning, middle, end). Ideas logically arranged. Writing has a variety of simple and complex sentences.
• Score 2: Basic paragraph structure. Ideas logically arranged. Some variation in sentence structure. Some sentences incomplete.
• Score 1: Little or no evidence of paragraph structure. Ideas disconnected. Little or no variation in sentence structure. Many incomplete sentences.

Signal Words
• Score 3: Four or more signal words appropriate for process paragraphs are present and lead the reader from one step of the process to the next.
• Score 2: Two to three signal words are present but do not always lead the reader from one step to the next.
• Score 1: No signal words present.

Word choice
• Score 3: Uses specific nouns, strong (vivid) verbs, and precise words. Vocabulary above grade level. Writing clearly targeted for appropriate evidence.
• Score 2: Uses appropriate, correct usage of vocabulary (nouns, verbs, etc.). Writing is appropriate for target audience.
• Score 1: Simple vocabulary. Sometimes incorrectly used. Some parts of the message may be unclear.
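As a purely illustrative aid, the following Python sketch shows one way an assessor's rubric ratings could be recorded and totaled after the teacher is dismissed. The criterion lists mirror the rubrics above, while the function and layout are hypothetical rather than part of the TLLA scoring procedure.

# Illustrative sketch only: one possible way an assessor's rubric ratings for the
# writing subtask could be recorded and summarized after the teacher is dismissed.
# Criterion names follow the rubrics above; everything else is a hypothetical layout.

OPINION_CRITERIA = ["Ideas", "Organization", "Word choice"]
PROCESS_CRITERIA = ["Ideas", "Organization", "Signal Words", "Word choice"]

def summarize_scores(criteria: list[str], ratings: dict[str, int]) -> dict:
    """Check that every criterion received a 1-3 rating and return a summary."""
    for criterion in criteria:
        score = ratings.get(criterion)
        if score not in (1, 2, 3):
            raise ValueError(f"{criterion!r} needs a rating of 1, 2, or 3")
    total = sum(ratings[c] for c in criteria)
    return {
        "ratings": {c: ratings[c] for c in criteria},
        "total": total,
        "maximum": 3 * len(criteria),
    }

# Example: a hypothetical opinion-prompt response rated by an assessor.
print(summarize_scores(OPINION_CRITERIA,
                       {"Ideas": 3, "Organization": 2, "Word choice": 2}))
# -> total of 7 out of a maximum of 9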

Note: The order of the subtasks presented in this document aligns with the order in which they appear in the TLLA research plan. When these tasks are administered, the writing subtasks will be administered last. They will not be scored in front of the teacher; they will be scored after he/she is dismissed.


RESEARCH FOR EFFECTIVE EDUCATION PROGRAMMING-AFRICA (REEP-A) RESEARCH PLAN: SOMA UMENYE PERFORMANCE EVALUATION

Prepared for USAID

Prepared by Dexis Consulting Group

July 2019

Funding was provided by the United States Agency for International Development (USAID) from the American people under Contract AID-OAA-TO-16-00024 81, subcontract AID-OAA-I-15-00019. The contents are the responsibility of the USAID Research for Effective Education Programming (REEP-Africa) Project and do not necessarily reflect the views of USAID or the United States Government. USAID will not be held responsible for any or the whole of the contents of this publication.

ACRONYMS

BLF Building Learning Foundations
CDCS Country Development Cooperation Strategy
CLA Collaborating, Learning, and Adapting
COP Chief of Party
COR Contract Officer’s Representative
DEO District Education Officer
DFID United Kingdom’s Department for International Development
DLL Dual Language Learners
DOS Dean of Studies
ESSP Education Sector Strategic Plan
FGD Focus Group Discussion
GoR Government of Rwanda
IP Implementing Partner
IR Intermediate Result
KII Key Informant Interview
LARS Learning Assessments in Rwandan Schools
MEL Monitoring, Evaluation and Learning
MINEDUC Ministry of Education
NUDOR National Union of Disability Organizations in Rwanda
P1 Primary Grade 1
P2 Primary Grade 2
P3 Primary Grade 3
PMPs Performance Management Plans
REB Rwanda Education Board
REEP-A Research for Effective Education Programming–Africa
RNUD Rwanda National Union of the Deaf
SEI Sector Education Inspector
TTC Teacher Training College
UNICEF United Nations Children’s Fund
USAID United States Agency for International Development
VVOB Flemish Association for Development Cooperation

CONTENTS
BACKGROUND 1
Activity Description 1
Evaluation Background 3
EVALUATION PURPOSE 3
Evaluation Questions 4
EVALUATION METHODOLOGY 5
Data Sources 5
Data Collection Methods 7
Sampling and Site Selection 8
Expected Limitations 9
ANALYSIS 10
WORKPLAN 13
ROLES AND RESPONSIBILITIES 15
REFERENCES 16

LIST OF TABLES AND EXHIBITS

TABLE 1. SCHOOL SITE DATA COLLECTION SAMPLE AND METHOD
TABLE 2. GETTING TO ANSWERS MATRIX (G2A)
TABLE 3. ILLUSTRATIVE WORKPLAN
TABLE 4. ESTIMATED LABOR

BACKGROUND

Over the past two decades, Rwanda has made progress towards its goal of becoming a middle- income, knowledge-based economy. The Rwanda Vision 2020 articulates the Government of Rwanda’s (GoR) ambitious development agenda and its strategy to reach this aim by fostering and sustaining economic, social, and political advances. The Vision emphasizes strengthening human capital formation as one of the six strategy pillars, particularly through enhanced education service delivery. Efforts to improve the education system in Rwanda are focused on ensuring universal education for all, improving education quality, and expanding technical and vocational training.

Rwanda has made significant progress in reducing illiteracy and ensuring universal access to basic education. Efforts at improving basic education have comprised two key components: improving access and enhancing quality. Rwanda has achieved near-universal access to primary education, increasing the primary net enrollment rate from around 83 percent in 2002 to 96 percent in 2016.1 According to an assessment of progress toward the Millennium Development Goals, as highlighted in a recent World Bank report, Rwanda is one of the four best-performing countries in Africa in increasing primary net enrollment, a key indicator of achieving universal primary education.2

Despite this progress, basic education challenges remain, particularly regarding education quality. As improvement in quality has not kept pace with the GoR’s expansion of access, learning levels remain low. As highlighted in Rwanda’s 2015-2020 Country Development Cooperation Strategy (CDCS), challenges to quality education include an inadequate supply of quality teachers and a high ratio of pupils to teachers. Learning outcomes and literacy levels remain low, leaving youth largely unprepared for the formal labor market. Enhancing early grade reading and literacy outcomes through increased and strategic investment is a key component of efforts to improve education quality in Rwanda. In support of these priorities, the United States Agency for International Development (USAID) is partnering with the GoR to improve early grade reading outcomes through strategic investment in the sector.

ACTIVITY DESCRIPTION

The Soma Umenye activity, under the LEARN project, is a $72.4 million, five-year initiative (7/2016-7/2021) of USAID and the Rwanda Education Board (REB) that aims to improve the Kinyarwanda language reading skills of at least one million children nationwide in public and government-funded schools by the end of their third year of primary school. Specifically, Soma Umenye seeks to ensure that at least 70 percent of Primary Grade 1 (P1), Primary Grade 2 (P2), and Primary Grade 3 (P3) students are able to read grade-level text with fluency and comprehension.

Soma Umenye strives for close collaboration with GoR counterparts, including through alignment of Soma Umenye activities with the GoR’s Education Sector Strategic Plan (ESSP) 3, as well as with district education plans.

To improve early grade reading, Soma Umenye promotes changes to the “5Ts” of reading: time, text, teaching technique, test, and tongue. Specifically, Soma Umenye aims to improve student reading by:

1 UNESCO Institute for Statistics. No date. Database: "Education: Net enrolment rate by level of education," accessed May 9, 2018, http://data.uis.unesco.org/#.
2 World Bank Group, Global Monitoring Report 2015/2016 (Washington, DC: World Bank Group, 2016), 252.

• Improving quality of classroom instruction at the primary level through the application of teacher and student performance standards, and the implementation of an evidence-based in-service training and coaching program
• Strengthening the pre-service teacher training program
• Increasing availability and use of evidence-based and gender-sensitive early grade reading materials
• Strengthening the capacity of the Ministry of Education and University of Rwanda to conduct and apply research, and lead the reading reform process

To reach Soma Umenye’s goal of improved literacy outcomes for all children in early grades, inclusive of both boys and girls, children living in extreme poverty, and children with disabilities, the initiative centers on two intermediate results (IR), namely:

• IR 1: Classroom instruction in early-grade reading improved

• IR 2: Systemic capacity for early-grade reading instruction improved

The table below summarizes the sub-IRs that fall under the two intermediate results.

CDCS Development Objective: Increased opportunities for Rwandan children and youth to succeed in schooling and the modern workplace

Program Purpose: Improved literacy outcomes for children in early grades

IR 1: Classroom instruction in early-grade reading improved
• Sub-IR 1.1: Evidence-based, gender-sensitive early-grade reading materials available and used
• Sub-IR 1.2: Teachers’ use of evidence-based, gender-sensitive instructional practices in early-grade reading increased
• Sub-IR 1.3: Capacity of head and mentor teachers to coach and supervise early-grade reading instruction strengthened
• Sub-IR 1.4: Schools’ and teachers’ use of student assessment results improved

IR 2: Systemic capacity for early-grade reading instruction improved
• Sub-IR 2.1: National advocacy mechanisms for early-grade reading interventions strengthened
• Sub-IR 2.2: Student and teacher performance standards and benchmarks for early-grade reading applied
• Sub-IR 2.3: Research-based policies and curricula in support of early-grade reading instruction implemented
• Sub-IR 2.4: Early-grade reading assessment systems strengthened
• Sub-IR 2.5: Capacity of Teacher Training Colleges (TTCs) to prepare effective early-grade reading teachers improved


Cross-Cutting: gender and inclusion of students with special needs, ICT

Soma Umenye is designed to build the capacity of people and systems in Rwanda to adopt and sustain improvements in reading, including providing equitable opportunities for professional development and advancement for female teachers and relevant stakeholders.

The initial year of Soma Umenye was completed in July 2017. In the first year of implementation, Soma Umenye encountered performance and management issues, specifically regarding alignment with GoR priorities and effective coordination with the education sector. To address these challenges, an adjusted approach was adopted for Year 2 (July 2017- July 2018). The updated approach was developed through close collaboration between USAID Rwanda and the implementing partner (IP) over several months. Efforts at course correction included a major revision to the roll-out plan as well as repositioning the activity to better align with and work through GoR systems.

EVALUATION BACKGROUND

USAID determined that a performance evaluation of Soma Umenye would be an important tool to assess challenges and barriers to activity success, specifically by examining the effectiveness of the recent adjustments to activity implementation so as to better reflect the current orientation and plans for Soma Umenye in Years 2-3. The evaluation will also provide insight for future adjustments as the program progresses.

This research plan articulates the design and implementation approach for a performance evaluation of Soma Umenye. To examine the impact of the adjustments made to Soma Umenye in Year 2, the evaluation will take place at the end of the third year of the activity. The Dexis Evaluation Team will share the evaluation instruments (including classroom observation tool, interview guide, etc.), as well as the framework that will be used to evaluate sustainability, with the Mission for comments prior to the evaluation implementation. Based on discussions with the Mission, it was determined to be beneficial to include a focus on reviewing the extent to which teachers have applied the methodology introduced during training. To ensure that teachers have absorbed the Soma Umenye methodology and are applying it in the classroom, Soma Umenye expects to commence school coaching from the start of Term 2 of the 2018-2019 school year. Given the aim of capturing the extent to which initial coaching sessions have occurred and have been perceived to be effective by key stakeholders, the primary data collection is anticipated to occur in August 2019.

EVALUATION PURPOSE

The Soma Umenye evaluation will be a formative midline performance evaluation. The evaluation will assess the effectiveness of mid-course corrections made following the initial year of the activity as well as the effectiveness of current activity implementation and performance. The evaluation aims to provide USAID Rwanda with more detailed information regarding activity challenges and successes, and will identify areas for improvement to enhance activity effectiveness and early grade reading outcomes. This includes improving the potential for sustainability, as well as an activity transition strategy.

Based on identified challenges, a major focus of the evaluation will be an examination of implementation processes, specifically the degree to which the activity has implemented and adhered to updated work plans, as well as the overall activity aims for Soma Umenye.

The evaluation will detail the performance of the IP against the terms of the contract, as well as the extent to which it was able to adapt to changes in the local context to achieve high-level results. This will include a focus on challenges encountered in working effectively with the GoR and other systems in Rwanda. Stakeholder perception of IP performance will also be a primary component of the assessment, as those perceptions are hypothesized to be critical for buy-in and ownership of the activity.

EVALUATION QUESTIONS

The evaluation will be guided by five primary questions, developed by USAID Rwanda in collaboration with the Evaluation Team, which will address the key evaluation aims identified above, namely:

1. Implementing Partner: How has the implementing partner and its team performed against the terms of the contract?

a. What progress has Soma Umenye made toward the stated goals and objectives of the contract?
   i. What progress has been made towards each of the sub-IRs (as shown in the table above)?
   ii. Is Soma Umenye viewed as positively contributing to GoR priorities and learning outcomes among GoR stakeholders?
   iii. To what extent has Soma Umenye addressed inequities based on gender and students and teachers with disabilities?
   iv. Is Soma Umenye making effective use of its organizational structure and staff?

b. Have teachers incorporated best practices from Soma Umenye’s trainings into their classrooms? If not, what have been the key impediments to doing so? How has Soma Umenye responded to those challenges?
   i. How do teachers perceive Soma Umenye interventions? Do teachers perceive that the Soma Umenye interventions have had an impact on their teaching techniques and their students’ learning?

2. Implementation Challenges: What are some challenges or obstacles to implementing the activity? How could those challenges be addressed?

a. National level challenges
   i. Has Soma Umenye addressed barriers to greater GoR uptake of evidence-based approaches?

b. District, sector, school-level challenges

3. Sustainability: What efforts are being made within Soma Umenye to ensure sustainability and how could Soma Umenye more effectively work towards sustainability?

a. How does Soma Umenye define sustainability?

b. Is there currently demonstrable ownership and demand for Soma Umenye interventions? How could Soma Umenye further promote ownership and demand? (covering the following stakeholders)
   i. GoR officials
   ii. Local administrators
   iii. Teachers
   iv. Local NGO partners

c. What is the current level of skills and capacity among stakeholders to sustain Soma Umenye interventions? How could Soma Umenye further build up skills and capacity among stakeholders? (covering the following stakeholders)
   i. GoR officials
   ii. Local administrators
      1. What role are Soma Umenye district advisors playing in developing the skills and capacity of local administrators?
   iii. Teachers
   iv. Local NGO partners

d. Given available resources of GoR and local organizations, which Soma Umenye interventions are most sustainable?

4. Collaboration, Learning, and Adaptation: What evidence, if any, indicates that the activity has employed adaptive management approaches? Does existing evidence indicate the effectiveness of any changes made?

a. Is Soma Umenye effectively collaborating with USAID, GoR, and other development partners in Rwanda? How could their collaboration be improved?

b. How is Soma Umenye using assessments (both formative and summative) to spur adaptation?

c. What opportunities exist to strengthen collaborating, learning, and adapting (CLA) practices in the activity?

EVALUATION METHODOLOGY

The Soma Umenye evaluation will utilize a mixed-methods approach, employing a combination of primary and secondary data. Data collection methods for the evaluation will consist of key stakeholder interviews, focus group discussions, and a comprehensive desk review. The evaluation will engage a wide range of stakeholders, including personnel from the GoR, the IP, and USAID Rwanda, as well as local partners and activity beneficiaries.

Proposed data sources, data collection methods, sampling and site selection, and expected limitations are detailed below. A Getting to Answers Matrix (Table 2) provides a summary of these components.

DATA SOURCES

The evaluation will consist of both primary and secondary data collection from a range of relevant stakeholders and beneficiaries. The Evaluation Team will collect primary data through individual interviews, focus group discussions, and classroom observation while secondary data will be gathered through a comprehensive desk review of relevant documents, detailed below. These data sources will be updated with additional documents following the fieldwork period.

As highlighted in the Getting to Answers Matrix, primary data sources may include:

• Activity sub-grantees and other development partners, particularly those implementing teacher training courses. These will include:

◦ Mureke Dusome

◦ Building Learning Foundations (BLF)

◦ Flemish Association for Development Cooperation (VVOB)

◦ United Nations Children’s Fund (UNICEF)

◦ Norma Evans

◦ National Union of Disability Organizations in Rwanda (NUDOR)

◦ Rwanda National Union of the Deaf (RNUD)

• P1, P2, P3 teachers

• Trainers, head teachers, and others implementing the teacher training courses

• GoR partners, particularly the Ministry of Education (MINEDUC) and Rwanda Education Board (REB)

• District and sector officials and local administrators

• Parents of participating students

• USAID Rwanda personnel, notably the contracting officer’s representative (COR) and relevant technical staff from the education office

• Personnel from the IP, including the Chief of Party (COP), activity managers, and relevant technical specialists, and Soma Umenye district advisors

Secondary data sources may include:

• Soma Umenye activity design and theory of change documents, including Performance Management Plans (PMPs) and Monitoring, Evaluation and Learning (MEL) Plans

• Soma Umenye monitoring data from USAID and the IP, including indicator tracking tables, site visit reports, and context monitoring data

• Soma Umenye contract, including any addendums or revisions

• Soma Umenye Learning Agenda, as approved by the GoR Education Ministry

• Soma Umenye quarterly and annual reports

• Soma Umenye and GoR activity cost and financial reporting

• Teacher training materials, student textbooks and other relevant documents

• Academic literature related to early grade reading interventions

• Relevant GoR documents including the Education Sector Strategic Plan (ESSP) 2018/19-2023/24, job descriptions for local education officials, the Education Sector Policy, the ICT in education policy and plan, a forward-looking Joint Review of the Education Sector, and the Learning Assessments in Rwandan Schools (LARS) system, as well as other documents to be determined

• Soma Umenye Instructional Time Study and datasets

DATA COLLECTION METHODS

Key data collection methods will include:

• Desk review of program documents and relevant literature

• Key informant interviews (KIIs)

• Focus group discussions (FGDs)

• Classroom observation

The first stage of the evaluation will consist of a preliminary desk review that gathers secondary data through the close examination of activity design documents, monitoring data, annual and quarterly reports, and other applicable activity information. The review will also incorporate literature relevant to the activity. The Evaluation Team will develop a desk review instrument to organize and categorize documents, classified by relevance to evaluation questions and sub-questions. The desk review will be used to inform the development of the data collection instruments, and to shape the overall evaluation approach. This phase will also include initial consultation with USAID Rwanda and Soma Umenye prior to arrival in country. USAID Rwanda will introduce the Evaluation Team to the Soma Umenye Team as early in the process as feasible to promote collaboration on constructing the field visit schedule and to build an understanding of the historical project context and background prior to arrival in country.

During the data collection, the Evaluation Team will conduct KIIs with a range of stakeholders, detailed above in primary sources, who have a strong knowledge of Soma Umenye. Stakeholder interviews will be guided by a KII instrument consisting of a semi-structured interview guide designed to collect information relevant to the evaluation questions, followed by secondary probing questions. The probing questions can be adjusted and tailored to the individual, and will allow the facilitator to gather further details specific to the views and perceptions of the respondent. The interview guide will be accompanied by clear facilitator guidance and instructions for administration. KIIs will generally be conducted with a single respondent, and are expected to take 45 minutes to one hour.

Qualitative data will also be collected through FGDs. The Evaluation Team will develop an instrument to guide the discussions, which will seek to elicit information relevant to the evaluation aims. Focus groups will be used as the primary method of data collection for teachers and other activity beneficiaries, likely including parents. The focus groups will consist of semi-structured discussions that aim to gain insight and perspectives from teachers and activity beneficiaries regarding their experiences with, and perceptions of, Soma Umenye. FGDs are anticipated to take less than one hour.

Lastly, the Evaluation Team will collect ethnographic data in the form of classroom observations. These observations will measure the degree to which and how teachers and students are using the Soma Umenye implemented materials and practices. The Team will design/adapt an observation protocol based on the specific desired competencies and practices expected by the Soma Umenye activity. The observation protocol will be designed to capture both quantitative/numeric information (e.g., using scales) and narrative information about content being delivered, delivery methods, and elements such as student engagement and student work.
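As a purely illustrative aid, and not part of the actual protocol, the Python sketch below suggests one possible shape for a single observation record that holds both scaled ratings and narrative notes. The rating items and the 1-4 scale are invented placeholders.

# Illustrative sketch only: a possible shape for a single classroom observation
# record that holds both scaled ratings and narrative notes, as described above.
# The rating items and scale are hypothetical placeholders, not the actual protocol.

from dataclasses import dataclass, field

@dataclass
class ClassroomObservation:
    school_id: str
    grade: str                                              # e.g., "P1", "P2", "P3"
    ratings: dict[str, int] = field(default_factory=dict)   # item -> hypothetical 1-4 scale
    narrative_notes: str = ""                               # content, delivery, engagement

    def mean_rating(self) -> float:
        """Average of the scaled items; 0.0 if nothing was rated."""
        return sum(self.ratings.values()) / len(self.ratings) if self.ratings else 0.0

# Example with made-up ratings:
obs = ClassroomObservation(
    school_id="SCH-001",
    grade="P2",
    ratings={"materials_in_use": 3, "reading_practice_time": 2, "student_engagement": 4},
    narrative_notes="Teacher modeled decoding; students read aloud in pairs.",
)
print(round(obs.mean_rating(), 2))   # -> 3.0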

All data collection tools will be shared in advance with USAID Rwanda for input before finalization.

SAMPLING AND SITE SELECTION

As detailed above, the performance evaluation will primarily employ qualitative methods, including KIIs and FGDs. Given this, evaluation site selection will be largely determined through purposive or convenience sampling. The Evaluation Team will utilize purposive sampling for selecting KII respondents, drawing on a wide range of stakeholders engaged with Soma Umenye, as well as external subject knowledge experts. Convenience considerations, including location and participant availability, may also be a factor in determining KII respondents, particularly for GoR officials, local partners, and sub-contractor personnel.

The Evaluation Team will select sites based on the desk review of relevant documents, as well as consultations with USAID Rwanda. Site selection will allow for diversity in respondents and school/classroom contexts and will aim to provide a representative sample at the district level, including urban and rural districts, overall socio-economic status, educational achievement levels, teacher and student implementation of Soma Umenye desired practices, and other relevant factors. To measure the effect of Soma Umenye over time, the Evaluation Team will seek to achieve a mix of schools and districts that received Soma Umenye interventions in Year 1 and Year 2. Furthermore, the Evaluation Team will also consider the strength of Soma Umenye’s existing relationship with the district and local governments, and schools that have received education interventions from other donors, including the United Kingdom’s Department for International Development (DFID) BLF project. At the individual level, the Evaluation Team will try to ensure a roughly equal balance in gender. Exact site locations will be determined during the desk review process, along with consultation from USAID.

It is anticipated that the sample will include 14 schools from seven districts. In each district, two schools will be selected from two sectors within the district, based on the above-mentioned criteria. In each district, the District Education Officer (DEO) will be interviewed, and in each sector the Sector Education Inspector (SEI) will be interviewed.

In each school, the Evaluation Team will conduct KIIs with the head teacher, Dean of Studies (DOS), and Master Teacher/subject specialist.3 Additionally, the Evaluation Team will conduct two FGDs, one with teachers and one with parents. See Table 1 for details on school site sampling.

3 The DOS and Master Teacher/subject specialist may not be present in all schools, thus the total number of KIIs carried out may be lower than planned.

Table 1. School site data collection sample and method

Data Collection Method: KIIs (roughly 45 minutes to 1 hour)
Sample: DEO: 7 total; SEI: 14 total; Head teachers: 14 total; DOS: 14 total; Master Teacher/subject specialists: 14 total
Content: Soma Umenye training approach and effectiveness; collaboration/relationship with Soma Umenye staff; role in supervising school quality; support for head teachers/teachers; Soma Umenye materials to support teaching and learning; most beneficial interventions/aspects; challenges in implementation; training content most/least mastered; student learning: benefits and challenges; efforts to sustain Soma Umenye interventions

Data Collection Method: FGDs with teachers (4 to 6 people; roughly 1 hour)
Sample: P1-P3 teachers: 14 total
Content: Soma Umenye training approach and effectiveness; Soma Umenye materials to support teaching and learning; most beneficial interventions/aspects; challenges in implementation; training content most/least mastered; student learning: benefits and challenges; assessment practices and use

Data Collection Method: FGDs with parents (4 to 6 people; roughly 1 hour)
Sample: Parents of P1-P3 students: 14 total
Content: Student learning: benefits and challenges; reading practices; benefits and limitations of materials; involvement in school/student learning

Data Collection Method: Classroom observation (roughly 30 minutes)
Sample: P1-P3 classes: 2 per school, 28 total
Content: Early grade reading/literacy teaching strategies; Dual Language Learners (DLL) pedagogical strategies; classroom management practices; student engagement and demonstrated learning
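The arithmetic implied by the sampling plan and Table 1 can be checked with a short script. The sketch below (Python) simply restates the planned counts and is illustrative only.

# Illustrative sketch only: quick arithmetic check of the planned data collection
# volume implied by the sampling plan and Table 1 (7 districts, 2 sectors per
# district, 1 school per sector = 14 schools). Figures mirror the table above.

DISTRICTS = 7
SECTORS_PER_DISTRICT = 2
SCHOOLS_PER_SECTOR = 1
SCHOOLS = DISTRICTS * SECTORS_PER_DISTRICT * SCHOOLS_PER_SECTOR    # 14

planned_kiis = {
    "DEO": DISTRICTS,                                    # 7
    "SEI": DISTRICTS * SECTORS_PER_DISTRICT,             # 14
    "Head teacher": SCHOOLS,                             # 14
    "DOS": SCHOOLS,                                      # 14 (may be fewer; see footnote 3)
    "Master Teacher/subject specialist": SCHOOLS,        # 14 (may be fewer; see footnote 3)
}
planned_fgds = {"Teachers": SCHOOLS, "Parents": SCHOOLS}  # one of each per school
planned_observations = SCHOOLS * 2                        # 28 classes

print("Schools:", SCHOOLS)                                # 14
print("KIIs (planned maximum):", sum(planned_kiis.values()))   # 63
print("FGDs:", sum(planned_fgds.values()))                # 28
print("Classroom observations:", planned_observations)    # 28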

EXPECTED LIMITATIONS

There are expected limitations which will be mitigated, to the extent possible, throughout the evaluation process. First, there is a potential for sampling bias, as with any qualitative research using non-random samples. Despite the use of purposive and convenience sampling, the Evaluation Team will aim to ensure that respondents reflect a variety of perspectives, demographics, and geography, particularly among activity beneficiaries.

Second, there is some potential for response bias on the part of respondents. Though interview questions are unlikely to be highly sensitive, respondents may have some incentive to not be fully forthcoming about their perspectives. The Evaluation Team will attempt to mitigate this concern in the question design process, and through the training of KII and focus group facilitators. Sampling could also pose limitations depending on the availability and accessibility of teachers or students, particularly in rural areas. Random sampling may be done at the unit of classroom or community rather than at the individual level.

ANALYSIS

As the evaluation data will be primarily qualitative, the main methods of analysis will be content analysis, thematic qualitative coding, and convergence and divergence determinations. In addition, statistical analysis may be used to analyze activity monitoring data, including a comparison of performance data and indicator targets, and to aggregate quantitative data from secondary documents. Quantitative analysis may include frequency calculations, cross-tabulations, trend line analysis, correlations, or similar functions, based on need and data availability.
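As an illustration of the kinds of quantitative checks mentioned above (frequencies, cross-tabulations, and target-versus-actual comparisons), the Python sketch below applies them to a hypothetical indicator tracking table. The indicator names and figures are invented for illustration and do not come from Soma Umenye monitoring data.

# Illustrative sketch only: simple target-vs-actual and cross-tabulation style
# summaries on a hypothetical indicator tracking table; all values are invented.

import pandas as pd

monitoring = pd.DataFrame({
    "indicator": ["teachers_trained", "teachers_trained", "books_distributed", "books_distributed"],
    "year":      [1, 2, 1, 2],
    "target":    [5000, 9000, 200000, 400000],
    "actual":    [4200, 9100, 180000, 395000],
})

# Target-vs-actual comparison per indicator and year
monitoring["achievement_pct"] = (monitoring["actual"] / monitoring["target"] * 100).round(1)

# Cross-tabulation style summary: mean achievement by indicator across years
summary = monitoring.pivot_table(index="indicator", values="achievement_pct", aggfunc="mean")

print(monitoring)
print(summary)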

Data and documents from the desk review, as well as any other secondary data, will be analyzed primarily through content analysis. Document excerpts relevant to the evaluation questions and sub- questions will be extracted, coded, and analyzed for emergent themes.

Qualitative data obtained from the KIIs and FGDs will be analyzed using thematic qualitative coding as well as convergence and divergence determinations:

• Thematic qualitative coding: The first stage of analysis of KII and focus group data will consist of a qualitative coding process. Using MAXQDA, NVivo, or similar software, the Evaluation Team will create a coding structure based on evaluation questions, sub-questions, and themes relevant to the evaluation. Segments from KII and focus group transcripts and notes will be coded in alignment with questions and themes. Any unanticipated themes that emerge from the data will be added to the existing codes.

• Convergence and divergence determination: Following the completion of the qualitative coding process, convergence and divergence determinations will be used to analyze the qualitative data gathered, as noted in the Getting to Answers Matrix. Convergence and divergence analysis compares responses toward common questions and themes across respondent groups to identify where responses align or diverge. Using this method, the Evaluation Team will identify common and reoccurring themes, as well as deviations between stakeholder perceptions.
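As a purely illustrative sketch of how convergence and divergence determinations might be operationalized after coding, the Python snippet below tallies hypothetical coded segments by respondent group and theme and flags where dominant stances align or differ. The data, group names, and decision rule are assumptions, not the Evaluation Team's actual procedure.

# Illustrative sketch only: after qualitative coding (in MAXQDA, NVivo, or similar),
# coded segments can be exported and compared across respondent groups to flag
# convergence or divergence on a theme. Data and decision rule here are hypothetical.

from collections import defaultdict

# (respondent_group, theme, stance) for each coded segment, e.g., from an export file
coded_segments = [
    ("teachers", "coaching_useful", "agree"),
    ("teachers", "coaching_useful", "agree"),
    ("teachers", "coaching_useful", "disagree"),
    ("head_teachers", "coaching_useful", "agree"),
    ("GoR_officials", "coaching_useful", "disagree"),
]

stances = defaultdict(lambda: defaultdict(list))
for group, theme, stance in coded_segments:
    stances[theme][group].append(stance)

for theme, groups in stances.items():
    # A group's dominant stance is the one coded most often for that theme.
    dominant = {g: max(set(s), key=s.count) for g, s in groups.items()}
    verdict = "convergence" if len(set(dominant.values())) == 1 else "divergence"
    print(theme, "->", verdict, dominant)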

Following the analysis of both primary and secondary data, the Evaluation Team will triangulate results to create a complete set of evaluation findings. The approach to data analysis will be updated as needed prior to, and throughout, the field data collection.


Table 2. Getting to Answers Matrix (G2A)

Evaluation Question: 1. Implementing Partner: How has the implementing partner and its team performed against the terms of the contract?
   a. What progress has Soma Umenye made toward the stated goals and objectives of the contract?
      i. What progress has been made towards each of the sub-IRs (as shown in the table above)?
      ii. Is Soma Umenye viewed as positively contributing to GoR priorities and learning outcomes among GoR stakeholders?
      iii. To what extent has Soma Umenye addressed inequities based on gender and students and teachers with disabilities?
      iv. Is Soma Umenye making effective use of its organizational structure and staff?
Source(s): Soma Umenye contract and activity documents, including monitoring data; USAID staff; IP personnel (both upper level management and district advisors); local partners; GoR officials; local/district officials; teachers and students
Method(s): Desk review, KIIs, focus group discussions
Analysis: Content analysis, thematic coding, convergence/divergence determinations

Evaluation Question: b. Have teachers incorporated best practices from Soma Umenye’s trainings into their classrooms? If not, what have been the key impediments to doing so? How has Soma Umenye responded to those challenges?
   i. How do teachers perceive Soma Umenye interventions? Do teachers perceive that the Soma Umenye interventions have had an impact on their teaching techniques and their students’ learning?
Source(s): Training materials; sub-grantees; teachers; activity performance monitoring data; students; local partners; IP staff (including district advisors); Soma Umenye Instructional Time Study
Method(s): Desk review, KIIs, focus group discussions
Analysis: Content analysis, thematic coding


b. How is Soma Umenye using assessments (both formative and summative) to spur adaptation?

c. What opportunities exist to strengthen collaborating, learning, and adapting (CLA) practices in the activity?


WORKPLAN

Table 3 provides an overview of the evaluation timeline. A feasibility determination report was developed by Dexis in February 2018 and was formally approved by USAID on April 9, 2018. Following initial approval of the research plan, evaluation preparations were put on hold, resuming in 2019, based on USAID Rwanda’s decision that the evaluation should capture the corrective measures undertaken in Year 2 of the activity.

Recruitment and planning began in April 2019, with logistics planning commencing in May 2019. The field data collection is anticipated to occur in August 2019. Initial findings from the evaluation will be prepared in August 2019, with data analysis and report writing to begin in the same period.

Table 3. Timeline: Soma Umenye Evaluation
• Feasibility Determination: February-March 2018
• Research Plan: April-August 2018
• Research Plan Revisions: March-July 2019
• Recruitment/Onboarding: April-July 2019
• Planning: April-July 2019
• Logistics: May-July 2019
• Methodology Revisions/Finalization: May-July 2019
• Field Data Collection: August 2019
• Initial Findings: August 2019
• Data Analysis and Report Writing: August-October 2019
• Final Report Submission and Dissemination: November 2019

ROLES AND RESPONSIBILITIES

The Evaluation Team will feature a balance of international and local expertise in evaluation methods and early grade reading, and will include the following positions:

• Evaluation Team Lead and Early Grade Literacy Specialist

• Mid-level Evaluation Specialist

• Local Education/Evaluation Specialist

• Research Assistant

The Evaluation Team Lead and Early Grade Literacy Specialist will be responsible for managing the Evaluation Team as well as the design and implementation of the evaluation, including the development of instruments, analysis, and finalized evaluation report. The Evaluation Team Lead will also be an Early Grade Literacy Specialist and will provide subject matter expertise.

The Mid-Level Evaluation Specialist will provide technical input and expertise from the Dexis home office in Washington, D.C. S/he will contribute to the evaluation design, recruitment, data collection, data analysis, and report drafting.

The Local Education/Evaluation Specialist will be the primary point of contact for in-country data collection activities. S/he will conduct KIIs and focus groups, provide contextual expertise, and assist with logistics and scheduling in the field.

The Research Assistant will be the primary point of contact on the Evaluation Team for supporting the Local Specialist in logistics and scheduling. S/he will also provide support in the desk review phase, note-taking during meetings, data cleaning, coding, and data entry into Excel or qualitative analysis software to support the Evaluation Team in data analysis and report writing.

Table 4. Estimated Labor

Position and Estimated Level of Effort (days):
• Evaluation Team Lead/Early Grade Literacy Specialist (Kristin Rosekrans): 90
• Mid-level Evaluation Specialist (Max Shanstrom): 80
• Local Education/Evaluation Specialist (Michael Tusiime): 60
• Research Assistant (Kathryn Norris): 85

REFERENCES

Republic of Rwanda, Ministry of Finance and Economic Planning. Rwanda Vision 2020. Ministry of Finance and Economic Planning, 2000.

Rwanda Ministry of Education. Education Sector Strategic Plan (ESSP). Kigali: Rwanda Ministry of Education, 2013.

UNESCO Institute for Statistics. No date. Database: "Education: Net enrolment rate by level of education." Accessed May 9, 2018. http://data.uis.unesco.org/#.

USAID Rwanda. Country Development Cooperation Strategy. Kigali: USAID, 2015.

World Bank Group. Global Monitoring Report 2015/2016: Development Goals in an Era of Demographic Change. Washington, DC: World Bank Group, 2016.
