SOMA UMENYE ACTIVITY

QUARTERLY PROGRAM REPORT QUARTER 2, FISCAL YEAR 2019 (JANUARY 1 – MARCH 31, 2019) STEPHEN BLUNDEN

DEVELOPMENT OBJECTIVE: INCREASED OPPORTUNITIES FOR RWANDAN CHILDREN AND YOUTH TO SUCCEED IN SCHOOLING AND THE MODERN WORKPLACE.

IR I: CLASSROOM INSTRUCTION IN EARLY-GRADE READING IMPROVED

IR 2: SYSTEMIC CAPACITY FOR EARLY-GRADE READING INSTRUCTION IMPROVED

Contract No. AID-OAA-I-14-00055, Task Order No. AID-696-TO-16-00001

Prepared For: U.S. Agency for International Development
USAID Contracting Officer's Representative: Luann Gronhovd

Prepared By: Chemonics International Inc. (Contractor)
1717 H Street NW, Washington, DC 20006
Phone: 202-955-3300 | Fax: 202-955-3400
www.chemonics.com

QUARTERLY PROGRAM REPORT USAID SOMA UMENYE PROJECT QUARTER 2, FISCAL YEAR 2019 (JANUARY 1 – MARCH 31, 2019)

April 30, 2019

This publication was produced for review by the United States Agency for International Development. It was prepared by Chemonics International Inc.

QUARTERLY PROGRAM REPORT USAID SOMA UMENYE PROJECT QUARTER 2, FISCAL YEAR 2019 (JANUARY 1 – MARCH 31, 2019)

Contract No. AID-OAA-I-14-00055, Task Order No. AID-696-TO-16-00001 Cover photo: Dr. Alphonse Sebaganwa, the Head of Department of Examination, Selection and Assessment in REB, addressing participants during a workshop on defining the performance category labels for Early Grade Reading standards in Rwanda; at Nyamata La Palisse Hotel, Bugesera District, February 2019. (Credit: Alain Patrick Mwizerwa/USAID Soma Umenye).

DISCLAIMER

The authors’ views expressed in this publication do not necessarily reflect the views of the United States Agency for International Development or the United States government.

CONTENTS

Acronyms ...... ii
Executive Summary ...... 1
Project Overview ...... 2
A. Background ...... 2
B. Program Description ...... 2
Achievements and Discussion of Major Activities ...... 4
A. Operational Activities ...... 4
B. Technical Activities ...... 5
C. MEL Activities ...... 30
D. Communications Activities ...... 34
Challenges and Lessons Learned ...... 39
A. Problems Encountered and Proposed Remedial Actions ...... 39
B. Success Stories and Lessons Learned ...... 39
Activities Planned for Next Quarter ...... 43
A. Operational Activities ...... 43
B. Technical Activities ...... 43
C. MEL Activities ...... 50
D. Communications Activities ...... 51
Annex A. Reporting Against Indicators ...... 52
Annex B. Study Visit to Kenya in Support of Digitization and Inclusion Activities ...... 67
Annex C. P2 and P3 Teacher Training ...... 84
Annex D. Rwanda Education Insights: Final Project Plan ...... 93
Annex E. Developing and Implementing a Reading Assessment and Accountability Framework for Early Primary Based on Grade-Level Benchmarks and Targets ...... 123
Annex F. Development and Implementation of a Comprehensive Assessment Program for Reading in Early Primary ...... 164

ACRONYMS

API application programming interface
BEQAD Basic Education Quality Assurance Department
BLF Building Learning Foundations
CIES Comparative and International Education Society
CMA TWG Curriculum-Materials-Assessment Technical Working Group
CMU-A Carnegie Mellon University - Africa
CPD continuing professional development
CTLRD Curriculum, Teaching and Learning Resources Department (REB)
DA district advisor
DCC district continuing professional development committee
DDE district director of education
DEO district education officer
DOS director of studies
EARC education assessment and resource center
EGRA early grade reading assessment
EMIS education management information system
ESAD Examinations, Selection, and Assessment Department (REB)
ESSP Education Sector Strategic Plan
FARS Fluency Assessment in Rwandan Schools
FY fiscal year
GALA group-administered literacy assessment
GRN Global Reading Network
GOR government of Rwanda
HCI Human Capital Index (World Bank)
ICT information and communication technology
IR intermediate result
JRES Joint Review of the Education Sector
KICD Kenya Institute of Curriculum Development


KSL Kenyan Sign Language
L3 Language, Literacy, and Learning Initiative (USAID)
LARS Learning Assessment in Rwandan Schools
LEGRA Local Early Grade Reading Assessment
LEMA Local Education Management Approach
LQAS lot quality assurance sampling
LSA large-scale assessments
LwD learners with disabilities
MEL monitoring, evaluation, and learning
MICS Multiple Indicator Cluster Survey
MINALOC Ministry of Local Government
MINEDUC Ministry of Education
NCPD National Council of Persons with Disabilities
NISR National Institute of Statistics of Rwanda
NIU Northern Illinois University
NQT newly qualified teacher
NRTT National Reading Training Team
NUDOR National Union of Disability Organizations in Rwanda
ORF oral reading fluency
PA provincial advisor
PASEC Programme d'analyse des systèmes d'éducation de la Confemen
PIRLS Progress in International Reading Literacy Study
PISA Programme for International Student Assessment
REB Rwanda Education Board
REI Rwanda Education Insights
RNUD Rwanda National Union of the Deaf
SDMS school data management system
SEN special education needs
SEI sector education inspector
SEO sector education officer
SGAC school general assembly committee
SMSF school monitoring support framework

SNE special needs education
SPAM school performance appraisal meeting
SSME Snapshot of School Management Effectiveness
TCOP Teacher Community of Practice
TDM Teacher Development and Management and Career Guidance and Counseling Department (REB)
TIMSS Trends in International Mathematics and Science Study
TLM teaching and learning materials
TOT training of trainers
TTC teacher training college
UDL universal design for learning
UN United Nations
UNICEF United Nations International Children's Emergency Fund
USAID United States Agency for International Development
URCE University of Rwanda, College of Education
VSO Voluntary Service Overseas
WCPM words correct per minute


EXECUTIVE SUMMARY

This quarterly report details USAID Soma Umenye activities and achievements between January 1 and March 31, 2019.

The objective of the USAID Soma Umenye project, a five-year initiative of USAID and REB, is to improve reading outcomes in Kinyarwanda for at least 1 million children in public and government-aided schools in Rwanda in Grades 1, 2, and 3. Specifically, USAID Soma Umenye will ensure that at least 70 percent of these students are able to read grade-level text with fluency and comprehension.

Summary highlights of USAID Soma Umenye achievements for the quarter:

• Soma Umenye delivered teacher training to 4,121 P2 Kinyarwanda teachers and to 3,616 P3 Kinyarwanda teachers in January 2019.
• Approximately half of the P1 textbooks were reprinted in Quarter 2, with distribution completed for Kigali City and Eastern Province. Distribution in Northern Province also started in Quarter 2.
• Subcontracts with international printers for printing of P2 and P3 textbooks were signed in Quarter 2. The project has operationalized a robust quality assurance approach for both printers to ensure REB quality standards are being met. In addition, subcontracts with local printers were executed for P1 decodables and Andika Rwanda readers.
• Soma Umenye led a study tour to Kenya with representatives from REB and education leaders from Rwanda's disability-focused NGOs to learn about Kenya's progress with inclusive education and the use of digitized textbooks.
• Project staff collaborated with REB to revise and validate Module 2 for the training of P2 and P3 Kinyarwanda teachers.
• Soma Umenye supported REB to define revised learner benchmarks for early grade reading in Rwanda.
• The project co-facilitated a joint site visit to Rulindo District led by the Honorable U.S. Ambassador to Rwanda and MINEDUC's director general of planning.
• The project developed an outline of a planned ESSP dashboard illustrating the potential for MINEDUC to present national and district education data to enable tracking of progress towards ESSP goals.

A highlight of Q2 is the progress USAID Soma Umenye has made with REB to develop draft revised early grade reading learner benchmarks. This intervention is well-timed, supporting MINEDUC's drive to develop a comprehensive assessment model as well as potentially underpinning the GOR's ability to present internationally comparable data to report on SDG4 and the Human Capital Index.

SECTION 1 PROJECT OVERVIEW

A. BACKGROUND Basic education remains a priority for the government of Rwanda (GOR) and remarkable progress has been made in the country. Alongside these advances, some challenges remain related to the availability of sufficient funds for primary education. To address this problem, USAID Soma Umenye is working hand in hand with the GOR to provide a complete package of interventions to increase the number of students that achieve grade-level fluency and comprehension standards in Kinyarwanda. The project was awarded to Chemonics in July 2016.

B. PROGRAM DESCRIPTION USAID Soma Umenye was designed in response to the GOR's priorities and the evidence (demonstrated by assessments like the Learning Assessment in Rwandan Schools) that early grade reading required additional investment. Its objective is to improve reading outcomes in Kinyarwanda for at least 1 million children. Specifically, Soma Umenye will target all children in Grades 1-3 attending public and government-aided schools nationwide and ensure that at least 70 percent of these students are able to read grade-level text with fluency and comprehension. Exhibit 1, below, presents Soma Umenye's results framework.

Exhibit 1. USAID Soma Umenye Project Results Framework

Development Objective: Increased opportunities for Rwandan children and youth to succeed in schooling and the modern workplace

IR 1: Classroom instruction in early-grade reading improved
Sub-IR 1.1: Evidence-based, gender-sensitive early-grade reading materials available and used
Sub-IR 1.2: Teachers' use of evidence-based, gender-sensitive instructional practices in early-grade reading increased
Sub-IR 1.3: Capacity of head and mentor teachers to coach and supervise early-grade reading instruction strengthened
Sub-IR 1.4: Schools' and teachers' use of student assessment results improved

IR 2: Systemic capacity for early-grade reading instruction improved
Sub-IR 2.1: National advocacy mechanisms for early-grade reading interventions strengthened
Sub-IR 2.2: Student and teacher performance standards and benchmarks for early-grade reading applied
Sub-IR 2.3: Research-based policies and curricula in support of early-grade reading instruction implemented
Sub-IR 2.4: Early-grade reading assessment systems strengthened
Sub-IR 2.5: Capacity of TTCs to prepare effective early-grade reading teachers improved

Cross-Cutting: Gender and inclusion of students with special needs, ICT

IR 1 focuses on the classroom and school-level interventions necessary to improve evidence-based reading instruction, including the provision of materials, training and coaching, supportive leadership, and analysis and use of student assessment results.


IR 2 focuses on strengthening the capacity of the education system in Rwanda to implement and support high-quality, evidence-based reading instruction throughout the country, thereby enabling high-quality reading instruction to continue beyond the life of Soma Umenye.

SECTION 2 ACHIEVEMENTS AND DISCUSSION OF MAJOR ACTIVITIES

A. OPERATIONAL ACTIVITIES Recruitment. During this quarter, USAID Soma Umenye made great progress in filling vacant positions. Specifically, the following staff joined the project:

• Procurement and Logistics Director, Edward Tushabe (January 2019);
• School Leadership Specialist, Cyprien Bunani (February 2019);
• Provincial Advisor – Kigali, Raphael Karanganwa (February 2019);
• IR1 Lead, Protogene Ndahayo (February 2019);
• Field Program Assistant, Annet Mbabazi (March 2019).

In addition, USAID Soma Umenye identified candidates for the following positions:

• Technical Finance Coordinator (candidate has accepted the job offer and will begin in May 2019);
• Systems Strengthening Advisor (candidate, from Cambridge Education, was selected and expects to join the project in May 2019);
• EGR Advisor (this position is expected to be embedded at MINEDUC pending further discussions with MINEDUC).

USAID Soma Umenye continues to recruit for the following positions:

• Monitoring, Evaluation and Learning (MEL) Specialist (as the original candidate approved for this position accepted another offer, a second recruitment was conducted, a finalist was selected, and negotiations are expected to be completed shortly);
• Procurement and Logistics Assistant (following the departure of one of the project's procurement and logistics assistants, USAID Soma Umenye is in the process of conducting a recruitment; this position is expected to be filled by June 2019).

Procurement. The project completed or advanced several significant procurements during this quarter.

• P1 textbooks. Following quality issues and rejection of proposed remediation measures, approximately half of the P1 textbooks were reprinted in Quarter 2 with distribution completed for Kigali City and Eastern Province. Distribution in Northern Province also started in Quarter 2.


• P2 and P3 textbooks. Subcontracts with international printers for P2 and P3 textbooks were signed in Quarter 2. The project has operationalized a robust quality assurance approach for both printers to ensure REB quality standards are met.
• P1 decodables and Andika Rwanda readers. Subcontracts with local printers were executed for P1 decodables and Andika Rwanda readers.
• Bookshelves, P2/P3 teacher's guides, and P2/P3 read-alouds. Supplier selection for bookshelf distribution, P2/P3 teacher's guides, and P2/P3 read-aloud storybooks was completed in Quarter 2.

The project also engaged two procurement specialists during Quarter 2, whose efforts contributed greatly to finalizing ongoing procurements; further, the project fielded an education and supply chain specialist to coordinate multiple ongoing book distributions.

Finance and compliance. The project moved forward the following finance and compliance initiatives in Quarter 2:

• Transport reimbursements and per diem allowances totaling approximately $735,000 were paid to project beneficiaries in Quarter 2 (primarily for teacher trainings) using mobile money. These payments represent a 1000% increase in use of mobile money from last quarter and full integration of the system into the project's activities.
• USAID Soma Umenye continued implementation of its expanded compliance program in Quarter 2 with an all-staff training and deep dive into the project's child safeguarding and anti-trafficking policies.

B. TECHNICAL ACTIVITIES This quarter, under Intermediate Result (IR) 1, interventions included:

• Delivered teacher training Module 1 to over 7,700 P2 and P3 teachers nationwide.
• Developed Module 2 training guides for both P2 and P3 in collaboration with REB in preparation for training teachers on Module 2 in Quarter 3.
• Procured printing and distribution services for the P2 and P3 essential core.
• Facilitated a study tour to Kenya to learn more about Kenya's progress with inclusive education and the use of digitized books.
• Began distribution of redesigned classroom library bookshelves.

This quarter, IR 2 interventions included:

• Began developing a mockup of the ESSP dashboard and sharing it with MINEDUC and REB.
• Revised Soma Umenye's school leadership training Module 1, which covers the role of SEOs, head teachers, and directors of studies in delivering coaching to teachers.

B1. IR 1: CLASSROOM INSTRUCTION IN EARLY-GRADE READING IMPROVED

IR 1 focuses on the classroom and school-level interventions necessary to improve evidence-based reading instruction, including the provision of materials, training and coaching, supportive leadership, and analysis and use of student assessment results.

Sub-IR 1.1: Evidence-based, gender-sensitive early-grade reading materials available and used

1.1.1. Collaborate with REB to develop/revise an “essential core” of instructional materials for P2 Kinyarwanda

As agreed upon with REB, the essential core of P2 Kinyarwanda materials consists of (1) a student textbook, (2) a teacher’s guide, (3) a read-aloud storybook, and (4) leveled readers.

Procure and distribute bookshelves. In Quarter 2, 6,000 bookshelves were received at USAID Soma Umenye warehouses. During this quarter, a distribution company was selected and distribution will start with schools in Kigali in early April.

P2 Textbook, Teacher’s Guide (including assessment protocol), and Read-aloud Story Book

Collaborate with CTLRD and TDM staff to develop P2 content for the textbook, teacher’s guide, and read-aloud book. This activity was completed in Quarter 1.

Submit draft P2 pupil book, teacher’s guide, and read-aloud book to USAID and REB for approval and validation. This activity was completed in Quarter 1.

Digitize the P2 teacher’s guide. As USAID Soma Umenye explores ways of supporting REB’s ambition to digitize education materials so that they are accessible and inclusive for all students, including those with disabilities, a team composed of USAID Soma Umenye, REB, and stakeholders from the Rwandan disability community conducted a study visit to Nairobi, Kenya in March. For details on the specific objectives of this trip, please refer to section 1.1.4.

The study tour team explored various digital options for teaching and learning materials and, following the trip, USAID Soma Umenye is building on this information to design the most effective strategy for digitizing teacher guides. A concept note will be drafted in Quarter 3 and will be presented to REB and USAID. See Annex B for additional information about the trip.

P2 Leveled Readers

Hold workshop with CTLRD to review approved leveled readers for P2 and ensure they follow the REB-approved leveling guidelines. This activity was completed in 2018.

Printing and Delivery


Procure printing and delivery services for P2 essential core and print and deliver essential core. The procurement of the P2 essential core started in Quarter 1 and was expected to be completed in Quarter 2. Below, we describe the status of the P2 essential core materials.

• P2 student textbook. Soma Umenye experienced some delays in securing REB approval of the printing prototypes due to two issues.
— The first issue was agreeing on Creative Commons language. While Soma Umenye initially proposed an Attribution license (in which users can revise the work in question as long as they attribute credit for the original work), REB had significant concerns about local parties revising textbooks, potentially resulting in two versions circulating in Rwandan schools. As a result, Soma Umenye proposed an Attribution, No Derivatives license, which REB approved. This license allows users to use the work in any way they like (with attribution) but does not permit users to revise the work without the copyright holder's permission (i.e., REB's permission).
— The second issue related to the binding of the book. As the student textbook is several hundred pages in length, REB required multiple prototypes before it felt comfortable that the binding was strong enough to last under school conditions. We expect to receive approval in Quarter 3.
• P2 teacher's guide. Due to the oversaturation of the local market (related to the Made in Rwanda policy requiring that all books used in schools be printed in Rwanda), Soma Umenye did not initially receive sufficient acceptable responses to its request for printing services for the teacher's guide. After reissuing the tender, Soma Umenye was able to identify and contract an appropriate supplier.
• P2 read-aloud story book. In procuring printing services for the read-aloud book, we faced similar challenges. After reissuing the tender, we expect to sign a subcontract with an appropriate supplier in Quarter 3.
• P2 leveled readers. Several factors have impacted the timeline for the leveled readers. Initially, Soma Umenye sought (and received) confirmation from REB regarding whether these books had to follow the Made in Rwanda policy. As it is often the practice of local publishers (from whom we sought to procure the leveled readers) to print outside of the country, Soma Umenye included the requirement to print in country in the solicitation for procuring leveled readers. In addition, REB requested that the readers be printed in accordance with the technical specifications REB requires for textbooks. As a result, in Quarter 3, Soma Umenye will request that publishers submit revised quotations which reflect REB's textbook specifications.

1.1.2. Collaborate with REB to develop/revise an “essential core” of instructional materials for P3 Kinyarwanda

As agreed upon with REB, the essential core of P3 Kinyarwanda materials consists of (1) a student textbook, (2) a teacher's guide, (3) a read-aloud storybook, and (4) leveled readers.

P3 Curriculum Planning

Hold a workshop with CTLRD and TDM to agree on the plan for P3 content for the student book, teacher’s guide, and read-aloud book following the curriculum. This activity was completed in Quarter 1.

P3 Textbook, Teacher’s Guide (including assessment protocol), and Read-aloud Story Book

Collaborate with CTLRD and TDM staff to develop P3 content for the student textbook, teacher’s guide, and read-aloud book. This activity was completed in Quarter 1.

Submit draft P3 student book, teacher’s guide, and read-aloud book to USAID and REB for approval. This activity was completed in Quarter 1.

Digitize the teacher’s guide. Related to the description under 1.1.1, the planning and implementation of this activity will begin in Quarter 3.

P3 Leveled Readers

Hold a workshop with CTLRD to review approved leveled readers for P3 and ensure they follow the leveling guidelines. This activity was completed in 2018.

Printing and Delivery

Procure printing and delivery services for P3 essential core and print and deliver essential core. The procurement of the P3 essential core started in Quarter 1 and was expected to be completed in Quarter 2. However, as with the P2 essential core, the project experienced delays due to multiple factors.

• P3 student textbook. With these books, Soma Umenye faced similar challenges as those described above for the P2 textbook. REB has approved the printing prototype and mass production of the books has started.
• P3 teacher's guide. Soma Umenye faced the same challenges with an oversaturated supplier market as described for P2 books.
• P3 read-aloud story book. Soma Umenye faced the same challenges with an oversaturated supplier market as described for P2 books.
• P3 leveled readers. See discussion under P2 materials.

1.1.3. Build REB’s skills in writing an early grade reading textbook

Throughout the process of revising the P2 and P3 essential core, Soma Umenye staff collaborated hand in hand with CTLRD staff during book design, book revision, and illustration. Through this process, Soma Umenye helped to build the skills of CTLRD staff to write an early grade reading textbook.


USAID Soma Umenye is still considering options to organize a training workshop for REB Kinyarwanda specialists and other stakeholders on the effective design of early grade reading programs. The Global Reading Network (GRN) has developed a relevant training course, and USAID Soma Umenye is in discussion with USAID about the possibility of offering the GRN course in Rwanda.

1.1.4. Provide adaptations that make the materials described above accessible to students with disabilities.

Support MINEDUC to implement the 2018 Special Needs and Inclusive Education Policy as it relates to early grade literacy in Kinyarwanda. During this quarter, USAID Soma Umenye focused its efforts on understanding the revised Special Needs Education policy (passed by the Cabinet in February 2019) and meeting with Rwandan disability stakeholders in order to determine how best to support the implementation of this policy and to inform the project’s inclusion plan. Soma Umenye facilitated meetings with implementers and other projects, such as Voluntary Service Overseas (VSO) and the DFID-funded Building Learning Foundations (BLF), as well as with civil society and local organizations such as the Rwanda National Union of the Deaf (RNUD) and the National Union of Disability Organizations in Rwanda (NUDOR). Soma Umenye also attended REB-sponsored events on inclusive education.

In further support of developing a revised inclusion plan and providing support to MINEDUC, USAID Soma Umenye conducted a study visit to Kenya to explore the intersection of digitization and inclusion. Along with representatives from REB, RNUD, and NUDOR, project staff met with several inclusion stakeholders, including those described below:

• Policy. Representatives from the government of Kenya, including the Ministry of Education (Directorate of Special Needs Education) and the Kenya Institute of Curriculum Development;
• Providers. Organizations producing and utilizing digital content, including content specifically designed for students with disabilities;
• End-users. Students and teachers incorporating digital content and inclusive education practices into their classrooms (Kambui Primary School for the Deaf).

USAID Soma Umenye also developed and presented a draft concept note to USAID for a proposed Rwandan Inclusive Education workshop to be held in Quarter 3.

Assess the need for Braille books in Rwanda’s schools for the blind. In February, USAID Soma Umenye visited the Educational Institute for the Blind in Kibeho to better understand the current needs of blind learners. USAID Soma Umenye also met with the Executive Director of the Rwanda Union of the Blind to understand current education initiatives for students who are blind or low-vision. From these meetings, USAID Soma Umenye identified several challenges facing blind learners, including the lack of a standardized Kinyarwanda Braille and a shortage of teachers who are trained in Braille. This information will be used to inform the project’s inclusion plan.


Adapt, print, and distribute Braille books to schools for the blind. No activity conducted in Quarter 2.

Survey schools for the deaf to understand their needs for reading instruction materials. During this quarter, Soma Umenye staff met with stakeholders in the deaf community to understand the current opportunities and challenges. In February, USAID Soma Umenye convened a working group session on deaf education, with a particular emphasis on the process to develop and validate Rwandan Sign Language. Representatives from USAID/Rwanda, VSO, RNUD, and NCPD agreed on a way forward and USAID Soma Umenye was invited to join the steering committee for the Rwandan Sign Language Dictionary. During one of the steering committee meetings, NCPD agreed to fully fund the development and validation of the dictionary. During Quarter 3, USAID Soma Umenye will develop a plan to support the teaching of deaf learners that will work in parallel with the dictionary rollout.

Sub-IR 1.2: Teachers' use of evidence-based, gender-sensitive instructional practices in early-grade reading increased

1.2.1. Provide refresher support to P1 teachers

Assess P1 teacher competencies in the classroom. In line with the Year 3 workplan, USAID Soma Umenye will conduct a refresher training for P1 Kinyarwanda teachers. To inform this training, USAID Soma Umenye developed an assessment tool and assessed P1 teacher competencies during this quarter. The project identified a sample of P1 teachers (a total of 44 teachers from 17 schools selected from a mix of urban and rural districts) and assessed their classroom practice as compared to what they learned during project training on evidence-based reading instruction. To gather this information, staff observed teachers during instruction and conducted face-to-face interviews and focus groups with trained P1 teachers. Soma Umenye will use this information to determine the size of the refresher training and identify priority areas to be addressed.

The results of this assessment indicated that some teachers do not refer to the teacher's guide while delivering their lessons, as they are still struggling to familiarize themselves with the structure of both the teacher's guide and the student book. Many of these teachers said that they did not follow the lesson plan in the teacher's guide because their students did not yet have the complementary textbooks. Additionally, most of the teachers were not using the "I do, we do, you do" approach in teaching the five core components of evidence-based reading instruction, even though they were able to describe it (and therefore appear to understand it).

Based on the findings of this assessment, it is evident that P1 teachers could benefit from refresher training that emphasizes the "I do, we do, you do" approach to instruction. In addition, given that P1 teacher training was delivered before the P1 decodable readers were approved for use in public schools, the training will also show teachers how to use decodable readers in the classroom and how to encourage parents to use them at home.

Design and deliver support to address any weaknesses identified. USAID Soma Umenye drafted the content of P1 refresher training during this quarter and the training manual will be finalized in April. The refresher training for all P1 Kinyarwanda teachers is scheduled for May.

1.2.2. Train P2 teachers on evidence-based, gender-sensitive early grade reading instruction (focused on the revised teaching and learning materials).

Collaborate with TDM staff to update the P2 training guide. In Quarter 2, USAID Soma Umenye held a workshop with TDM to revise Module 2 for P2 teacher training. By the end of the workshop, Soma Umenye and REB staff had revised the training modules for Phase 2 of the P2 training and ensured these were aligned with the revised essential core materials.

Submit draft P2 training guide to USAID and REB for approval. Upon completion of the training module for P2 Kinyarwanda teachers, USAID Soma Umenye collaborated with TDM to organize a validation workshop. The overall objective of the workshop was to ensure that the module was in line with REB’s teacher training framework and that it would equip P2 Kinyarwanda teachers with the knowledge and skills to allow them to successfully implement evidence-based reading and writing instruction in their schools. At the end of the workshop, the P2 training modules were validated and approved for use in the training. The approved training materials were shared with USAID for their technical concurrence, which was received.

Train National Reading Training Team (NRTT) to deliver P2 teacher training. In March, USAID Soma Umenye organized the training of trainers for members of the NRTT. In total, Soma Umenye successfully trained 192 trainers on Module 2 content.

Support NRTT to train P2 teachers. P2 Module 2 teacher training is planned for Quarter 3. See Annex C for a report on Module 1 training for P2 and P3 teachers.

1.2.3. Train P3 teachers on evidence-based early grade reading instruction (focused on the revised teaching and learning materials).

Hold a workshop with TDM to agree on the plan for P3 content for the training guide. During this quarter, USAID Soma Umenye met with the newly appointed head of TDM and the director of TDM to discuss and agree on the content for the P3 Module 2 teacher training. USAID Soma Umenye and TDM staff agreed upon a training approach and agreed to jointly develop the Module 2 content.

Collaborate with TDM staff to develop content for the P3 training guide. In Quarter 2, USAID Soma Umenye collaborated with TDM to develop Module 2 for the training of P3 Kinyarwanda teachers. At the end of the workshop, the team had completed the training module for Phase 2 of the P3 training.

Submit draft P3 training guide to USAID and REB for approval. Upon completion of the training module for P3 Kinyarwanda teachers, USAID Soma Umenye collaborated with REB to organize a validation workshop. The overall objective of the validation workshop was to ensure that the teacher training module was in line with REB's teacher training framework and would equip P3 Kinyarwanda teachers to successfully implement evidence-based reading and writing instruction in their schools. At the end of the workshop, the P3 training modules were validated and approved for use in the training. The approved training materials were shared with USAID for approval, which was received.

PHOTO: Members of the NRTT discuss during training as they prepare to train P2 and P3 Kinyarwanda teachers countrywide. (Credit: Alain Patrick Mwizerwa for USAID Soma Umenye)

Train National Reading Training Team (NRTT) to deliver P3 teacher training. In March, USAID Soma Umenye trained trainers to deliver the P3 teacher training, which they will do in April.

Support NRTT to train P3 teachers. P3 Module 2 teacher training is planned for Quarter 3.

1.2.4. Support school-based teacher professional development (the creation of communities of practice for P1-P3 Kinyarwanda teachers).


Collaborate with REB to conduct CPD needs assessment of P1, P2, and P3 teachers. During this quarter, Soma Umenye staff defined the scope of this needs assessment, which will cover continuing professional development (CPD) needs that can be met through a teachers’ community of practice or in-school coaching support. The assessment will take place in Quarter 3.

Develop and implement a strategy to promote communities of practice to support teachers. During this quarter, USAID Soma Umenye developed a draft strategy to promote communities of practice to support teachers. The use of communities of practice is rooted in the first two priorities of the Education Sector Strategic Plan 3 (ESSP 3):

• Enhance quality of learning outcomes; and
• Strengthen the CPD and management of teachers.

The communities of practice will be the mechanisms through which early grade Kinyarwanda teachers receive training and critical support so that they can better fulfil their role in developing their students’ reading skills. Through the communities of practice, in-school coaches will provide teachers with specific training and coaching, with assistance from the project’s district advisors who will ensure that schools have coaching plans in place. During the biweekly community of practice meetings in schools, coaches will provide teachers with resources and videos for self-study along with the scripted lessons and assessment protocols that are in their teacher’s guides.

In Quarter 3, USAID Soma Umenye will train school leaders on coaching and communities of practice. This training will include detailed guidelines on the organization of the community of practice at the school and sector level.

1.2.5. Support a sustainable model for in-service CPD and coaching

Work through TDM Technical Working Group (and the School-Based Mentor Task Force) to design a sustainable model. Given that the draft report on CPD models has not yet been finalized (further described in the paragraph below), this activity was shifted to Quarter 3.

Conduct desk study of different CPD models. In Quarter 2, USAID Soma Umenye conducted a desk study of different CPD models in East Africa for REB’s review and consideration with respect to their applicability in Rwanda. The desk study analyzed CPD models from Kenya and Uganda and the draft report will be discussed internally in Quarter 3 before being shared with REB.

Organize study tour to a country with an effective CPD model in East Africa. This activity was shifted to Quarter 3 as USAID Soma Umenye will need to discuss the desk study both internally and with REB before planning a study tour. Once planned, the study tour will include representatives from REB and Soma Umenye.

Provide CPD data to TDM and the DCCs. The new head of TDM welcomed the idea of sharing teacher training data with TDM. Recently, the Director General of REB requested that this information include both the pre- and post-training results for each training participant. The head of TDM suggested that USAID Soma Umenye should first compile the results of both modules of the training. Since the second module of P2 and P3 teacher training will be conducted in April, USAID Soma Umenye agreed with TDM to share the final data in Quarter 3. While the data has value to indicate the gain in subject knowledge during the training process, Soma Umenye will ensure that the data is shared in a form that is appropriate to guide future training and potentially to inform tailored coaching.

Support review of TCOP. TCOP is no longer active. In Quarter 3, Soma Umenye will seek guidance from REB’s CTLRD on their revised strategy to house digitized texts.

1.2.6. Support teacher incentives for high-performing teachers

USAID Soma Umenye committed to supporting REB's initiative to motivate outstanding teachers. Through this initiative, REB awards high-performing teachers with cows, laptops, and motorcycles on World Teachers' Day (October 5) every year. High-performing teachers are selected at the school, sector, district, and national levels. USAID Soma Umenye has pledged to award high-performing P1-P3 Kinyarwanda teachers as a means of communicating the importance of great teachers in the early grades to ensure strong foundations.

In Quarter 2, USAID Soma Umenye developed draft guidelines for the selection of high performing P1-P3 Kinyarwanda teachers. The guidelines include selection criteria, the selection process and timeline, and the types of awards that will be provided. Draft guidelines will be discussed in Quarter 3 at the REB – USAID Soma Umenye steering committee.

Sub-IR 1.3: Capacity of head and mentor teachers to coach and supervise early-grade reading instruction strengthened

1.3.1. Train designated coaches for P1-P3 Kinyarwanda teachers.

Gather information about the current frequency of SEO visits to schools. In Quarter 2, USAID Soma Umenye developed a template to collect information on the frequency of SEO visits to schools, the activities they undertake when they visit, and, if applicable, the reasons why they might not be visiting schools. The information collected using this form will inform the project’s strategy on how SEOs can support the coaching of P1-P3 Kinyarwanda teachers and the establishment of communities of practice. Soma Umenye will also use this information to develop a strategy to promote SEO school support as sustainable beyond the life of the project.

Collaborate with TDM staff to develop coaching protocols and tools. In Quarter 2, Soma Umenye staff worked with REB to develop coaching protocols, tools, and guidelines, based on existing instructional materials and teacher competencies, for effective coaching support to P1-P3 Kinyarwanda teachers. The coaching guidelines are included in the school leadership Module 1, which will be distributed to all school leaders (including DDEs, DEOs, and SEOs as well as head teachers and DOS) during school leadership training in Quarter 3. The coaching protocols and tools were tested in Gicumbi District to ensure that they were appropriate and user friendly. This exercise confirmed that the protocols and tools are ready to be used by designated coaches to strengthen the competencies of early grade reading Kinyarwanda teachers.

Using the classroom observation tool, the coach will record six things: (1) the fluency rate of randomly selected students, (2) key strengths of the teacher in teaching reading and writing, (3) key areas in which the teacher could do better, (4) three specific suggestions for the teacher to improve before the next visit, (5) the date of the visit, and (6) the signature of the coach. After being signed by the coach, the form will then be countersigned by the teacher to demonstrate that s/he understood the feedback and will commit to revising his/her practice. In addition to the classroom observation tool, other tools developed include the School Monthly Coaching Plan, School Coaching Report, Sector Monthly Coaching Plan, Sector Monthly Coaching Report, and Coaching Supervision Form (to be used by the SEO in his/her coaching of the school-based coach).
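For illustration, the sketch below shows how the fluency entry on the observation form could be computed as words correct per minute (WCPM), the unit defined in the acronym list: errors are subtracted from the total words read and the result is scaled to one minute. This is a minimal sketch of the standard oral reading fluency calculation; the function name and the example figures are hypothetical and are not taken from the project's actual tools.

```python
# Minimal sketch: deriving a fluency rate in words correct per minute (WCPM).
# The example numbers below are invented for illustration.

def words_correct_per_minute(total_words_read: int, errors: int, seconds: float) -> float:
    """Scale the number of correctly read words to a one-minute rate."""
    if seconds <= 0:
        raise ValueError("Reading time must be positive")
    correct_words = max(total_words_read - errors, 0)
    return correct_words * 60.0 / seconds

# Example: a student reads 47 words of a grade-level passage in 60 seconds with 5 errors.
print(round(words_correct_per_minute(total_words_read=47, errors=5, seconds=60)))  # 42 WCPM
```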

Hold a workshop to design the coaching training guide in collaboration with TDM staff. The coaching training guide and the coaching guidelines and tools were developed at the same workshop in February. Workshop participants included CTLRD writers, representatives from TDM, Soma Umenye technical team members, and some external technical assistants.

Submit draft coaching training guide, protocols, and tools to USAID and REB for approval. The school leadership training guide, which includes the coaching tools and guidelines, was approved by REB this quarter. USAID has also approved the materials.

Train National Reading Training Team (NRTT) to deliver coaching training. No activity planned in Quarter 2.

Support NRTT to train coaches. No activity planned in Quarter 2.

1.3.2. Train head teachers to promote their leadership in support of early-grade reading instruction

Finalize curriculum for head teacher/DOS training. The first module for head teacher and DOS training has already been validated by REB and USAID. This training module provides an introduction to school leadership and coaching support for early grade reading Kinyarwanda teachers. Module 1 training of school leaders (head teachers and directors of studies) is scheduled to take place in Quarter 3. This training will build on a one-day orientation for school leaders that was completed in 2018.

Exhibit 2. Contents of School Leader Training

Module 1
• General principles of school leadership
• Evidence-based approach to early grade reading
• How to coach early grade teachers of reading

Module 2
• General principles of school leadership
• How to lead communities of practice for early grade reading teachers

Module 3
• General principles of school leadership
• How to oversee student assessment and use assessment data

Collaborate with TDM staff to develop additional content for head teacher/DOS training. The focus of Module 1 training is the introduction to school leadership and coaching support; Module 2 will focus on school leadership and communities of practice for improved reading and writing instruction. During this quarter, USAID Soma Umenye drafted Module 2. In Quarter 3, Soma Umenye will seek REB and USAID approval for this module.

Submit draft head teacher/DOS training guide to USAID and REB for approval. No activity planned in Quarter 2.

Train National Reading Training Team (NRTT) and/or DDEs to deliver head teacher/DOS training. No activity planned in Quarter 2.

Support NRTT to train head teachers/DOS. No activity planned in Quarter 2.

Sub-IR 1.4: Schools’ and teachers’ use of student assessment results improved

1.4.1. Develop P2 assessment protocol (see sub-IR 1.1), train teachers to use it (see sub-IR 1.2), and train head teachers to oversee it (see sub-IR 1.3).

The formative assessment tools and guidelines are included in the P2 teacher’s guides, which were finalized in Quarter 1. In Quarter 2, Soma Umenye staff incorporated these tools and guidelines into Module 2 of the P2 teacher training and in Module 1 of the school leadership training to ensure that both teachers and school leadership are able to use the tools effectively.

1.4.2. Develop P3 assessment protocol (see sub-IR 1.1), train teachers to use it (see sub-IR 1.2), and train head teachers to oversee it (see sub-IR 1.3).

The formative assessment tools and guidelines are included in the P3 teacher’s guides, which were finalized in Quarter 1. In Quarter 2, USAID Soma Umenye incorporated these tools and guidelines into Module 2 of the P3 teacher training and in Module 1 of the school leadership training.


1.4.3. Collaborate with REB to create and implement a sustainable model(s) of school-level remediation for P2-P3 early grade reading students.

Hold a workshop with TDM and CTLRD to develop several appropriate models for remediation. No activity planned in Quarter 2.

Support implementation of resulting model. No activity planned in Quarter 2.

B2. IR 2: SYSTEMIC CAPACITY FOR EARLY-GRADE READING INSTRUCTION IMPROVED

IR 2 focuses on strengthening the capacity of the education system in Rwanda to implement and support high-quality, evidence-based reading instruction throughout the country during and beyond the life of the USAID Soma Umenye activity.

Sub-IR 2.1: National advocacy mechanisms for early grade reading interventions strengthened

2.1.1. Develop and implement an advocacy strategy to highlight the importance of early grade reading in Kinyarwanda.

Orient REB staff to the Soma Umenye program. This quarter, USAID Soma Umenye:

• Oriented the Examinations, Selection and Assessment Department (ESAD) to the Year 3 Soma Umenye program.
• Briefed the new head of the TDM department (James Ngoga) on the overall USAID Soma Umenye program.
• Met with the head of MINEDUC's BEQAD department to brief him on Soma Umenye's school support model, including the coaching role provided by head teachers and DOS and the role of the SEO in monitoring schools' delivery of their coaching plans.
• Shared a National Reading Roadmap with the director general of REB, the Advisor to the Minister of Education, and the REB steering committee. The National Reading Roadmap detailed Year 3 activities with a specific emphasis on supporting REB to respond to the Human Capital Index (HCI) agenda.

In addition, Soma Umenye staff developed an orientation package for all REB staff that describes planned Year 3 activities.

Support REB to attend CIES. USAID Soma Umenye provided logistical support in preparation for two representatives from REB and two from Soma Umenye who were asked to represent Rwanda at CIES. We will report on the activity in Quarter 3.

Implement advocacy strategy. An advocacy plan is in draft form. Soma Umenye has organized a REB-Soma Umenye Steering Committee that meets quarterly, which includes the director general and heads of department from REB and provides a forum to fast-track USAID Soma Umenye commitments, monitor progress, and ensure accountability to REB and the GOR. In addition, Soma Umenye team leads work directly with the heads of REB's TDM, CTLRD, and ESAD departments, ensuring that there is clear understanding of Soma Umenye's planned activities among the individual departments.

• Technical working groups. USAID Soma Umenye has been actively involved in technical working groups (TWGs) and task forces (TFs). Through these, project staff ensure that project activities are aligned with the ESSP. For example, through the Curriculum-Materials-Assessment (CMA) TWG, USAID Soma Umenye and BLF advocated for the establishment of an assessment task force that is now co-chaired by REB and USAID Soma Umenye. Soma Umenye's MEL director will represent the project in this task force, which will report to the CMA TWG. Terms of reference for the task force have been drafted and will be finalized by the task force in Quarter 3.
• District events. During this quarter, USAID Soma Umenye drafted summary presentations on the ESSP, early grade reading components in the ESSP, and the project's advocacy priorities at the national and district level for early grade reading metrics to be included in district development plans. In further support of implementing the project's advocacy strategy at the district level, two district advisors (in Gicumbi and Kirehe) were selected to lead their respective District Education Sector meetings. USAID Soma Umenye held an internal meeting in February to refine these presentations and integrate the early grade reading assessment and accountability framework discussed under Sub-IR 2.2.
• Instructional time study. USAID Soma Umenye is in the process of finalizing a report on the instructional time study conducted in 2018. This quarter, the instructional time study reference group reviewed the draft report and provided feedback. This report is currently under internal review and USAID Soma Umenye will submit it to USAID in Quarter 3.
• Learning agenda retreat. During this quarter, USAID Soma Umenye reviewed options to facilitate a retreat to kick-start the learning agenda. Current plans are to hold the retreat to reflect on findings from the LEMA district pilot delivery in Quarter 3.

Collaborate with MINEDUC/REB and other development partners in the finalization and implementation of the recently drafted Literacy Policy. No activities in Quarter 2.

2.1.2. Support the implementation of the ESSP 3.

Collaborate with REB to develop and support an implementation plan for ESSP 3. During this quarter, MINEDUC presented their revised ESSP M&E plan and requested support from development partners. USAID Soma Umenye confirmed its offer to support the development of a summary ESSP plan, translated into Kinyarwanda, and to support the dissemination of the summary plans to the district level in collaboration with BLF.

Collaborate with BEQAD and inspectors on inspection forms. During Quarter 2, USAID Soma Umenye presented outline plans for a local early grade reading assessment process at the district and sector level to BEQAD, reflecting the feedback BEQAD provided in December. In addition, in February, BEQAD requested that Soma Umenye facilitate sessions at its national training of inspectors, sharing experience of inspection systems in the region. These activities support a process to revise BEQAD's approach to inspecting early grade reading, potentially with the goal of having sector-based inspectors, now accountable to BEQAD, monitor school delivery of coaching plans and play a key role in the delivery of LEMA, which will be developed in Quarter 3.

Help develop the ESSP dashboard and integrate it with MINEDUC and REB systems. This quarter, USAID Soma Umenye moved forward planning for the ESSP dashboard. First, as described under Sub-IR 2.2 and 2.4, Soma Umenye discussed with REB developing a national assessment and accountability framework. The second part of this proposal (the accountability framework) would include a reading assessment that would be administered locally (the local early grade reading assessment, or LEGRA) and the data captured in a dashboard that is tentatively named the Rwanda Education Insights (REI). Second, Soma Umenye updated and finalized the dashboard plan (Annex D), which included reviewing and integrating HCI education-related indicators. The team developed mock-ups of the dashboard (see Exhibit 3) that displayed national HCI indicators. The dashboard plan is undergoing an internal approval process.

Exhibit 3. Dashboard Mock-up

During the development of this plan, the team organized meetings with various stakeholders to solicit feedback on the prototype mockups. In Quarter 2, three districts (Kicukiro, Gisagara, and Kirehe) were selected as pilot districts for LEGRA and dashboard implementation. Key criteria in selecting these districts were the availability of collected EGRA sample data, data readiness in school data management systems, and engaged leadership.

To promote stakeholder buy-in, USAID Soma Umenye presented the REI proposal to BEQAD in a meeting organized by MINEDUC inspectors. Additionally, USAID Soma Umenye presented an REI visualization packet (effectively demonstrating how the REI could present sector, district, and national data) to MINEDUC's director general of planning and shared it with the head of REB's ESAD department. During this quarter, USAID Soma Umenye also facilitated an orientation meeting with DDEs, district education officers (DEOs), and SEOs from the three pilot districts to share the project's plans to implement the accountability tools (LEGRA and the ESSP dashboard) at the sector and district levels. As a result of the meeting, USAID Soma Umenye established a WhatsApp group for district and sector education officers in the three pilot districts to ease communication on matters related to the implementation of the accountability framework. Finally, USAID Soma Umenye started to collect existing data from the three pilot districts.
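As a rough illustration of the kind of roll-up the REI dashboard could perform with the data being collected from the pilot districts, the sketch below aggregates hypothetical school-level results (the share of assessed pupils meeting the grade-level benchmark) into sector and district summaries. The sector names, values, field layout, and unweighted averaging are assumptions for illustration only; they do not describe the project's actual data model or indicators.

```python
# Minimal sketch: rolling hypothetical school-level LEGRA results up to
# sector- and district-level summaries for display on a dashboard.
from collections import defaultdict
from statistics import mean

# Each record: (district, sector, school, share of assessed pupils meeting the benchmark)
records = [
    ("Kicukiro", "Sector A", "School 1", 0.64),
    ("Kicukiro", "Sector A", "School 2", 0.72),
    ("Gisagara", "Sector B", "School 3", 0.58),
    ("Kirehe",   "Sector C", "School 4", 0.75),
]

by_district = defaultdict(list)
by_sector = defaultdict(list)
for district, sector, _school, share in records:
    by_district[district].append(share)
    by_sector[(district, sector)].append(share)

# Unweighted means for simplicity; a real dashboard would likely weight by enrollment.
district_summary = {d: round(mean(v), 2) for d, v in by_district.items()}
sector_summary = {s: round(mean(v), 2) for s, v in by_sector.items()}

print(district_summary)
print(sector_summary)
```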

2.1.3. Strengthen the capacity of districts to identify and respond to local and national early grade reading priorities.

Collaborate with REB to share their expectations regarding P1, P2, and P3 Kinyarwanda reading performance with district officials, including vice mayors of social affairs. No activity planned for this quarter.

Work with DEOs, SEOs, and possibly vice-mayors of social affairs to get early grade metrics included in district plans and annual performance contracts. During this quarter, USAID Soma Umenye drafted summary presentations on the ESSP, early grade reading components in the ESSP, and project advocacy priorities at the national and district level for early grade reading metrics to be included in district development plans.

Collaborate with REB and district officials to create a LEMA-type instrument to measure early grade reading performance as well as key indicators predicting strong performance. During this quarter, USAID Soma Umenye developed LEGRA tools. The project will test the tools and present them at a workshop with key stakeholders in Quarter 3.

Collaborate with district officials to conduct LEMA-type assessments each term and develop support plans to address any problems identified. Once the tools have been tested and presented to stakeholders in Quarter 3, USAID Soma Umenye will collaborate with district officials to conduct these assessments each term.

2.1.4. Collaborate with REB to support Andika Rwanda to raise the national-level profile of children's literacy

The launch of Andika Rwanda 2019. The activity was completed in Quarter 1.


PHOTO: In Nyamagabe District, Southern Province, an SEO gathered almost 2,000 students to talk about the Andika Rwanda competition. (Credit: USAID Soma Umenye)

Distribute Andika Rwanda 2019 guidelines. In this quarter, USAID Soma Umenye collaborated with DDEs, DEOs, and SEOs to distribute Andika Rwanda 2019 guidelines to all public and government-aided schools in Rwanda. As part of this effort, USAID Soma Umenye, in conjunction with the district education teams, facilitated district-level orientation meetings in all 30 districts for SEOs. SEOs then held meetings with head teachers, Kinyarwanda teachers, and students to discuss Andika Rwanda 2019.

PHOTO: A P2 student reads her Andika Rwanda entry in front of the school jury committee at a school in Ngoma District, Eastern Province. (Credit: Alain Patrick Mwizerwa for USAID Soma Umenye)

SEOs, in collaboration with head teachers and key Kinyarwanda teachers, formed school-level jury committees, where students had the opportunity to read their entries to the jury for consideration. Juries then deliberated and selected entries, which were submitted to the sector and district.

Set up Andika Rwanda Steering Committee. This activity was completed in Quarter 1.

Help produce winning entries as books. The 24 winning titles of Andika Rwanda 2018 have been approved by REB as supplementary readers for all P1 classrooms. Once the printer prototype is approved by REB, printing and distribution will commence in Quarter 3. The Andika Rwanda 2018 books, in addition to the previously procured 65 P1 supplementary readers and 24 decodable readers, will create a classroom library of 113 supplementary readers.

Recognize winners at National Literacy Day celebration. No activity planned this quarter.

Develop concept paper for REB-implemented Andika Rwanda in 2020. No activity planned this quarter.

Sub IR 2.2: Student and teacher performance standards and benchmarks for early grade reading applied

2.2.1. Develop, finalize, communicate, and implement benchmarks and targets for P1, P2, and P3 reading fluency and comprehension as well as student performance standards.

PHOTO: The head of REB's Examinations, Selection and Assessment department and USAID Soma Umenye Chief of Party Stephen Blunden participate in the workshop to finalize cut scores and benchmarks. (Credit: USAID Soma Umenye)

Hold workshop to review available assessment data and identify fluency and comprehension standards. In February and March, USAID Soma Umenye organized two working group sessions with staff from REB, MINEDUC, URCE, and teachers to set oral reading fluency (ORF) and reading comprehension (RC) standards. To set the standards, Soma Umenye used the modified Angoff method.


Exhibit 4. The Three Phases of the Modified Angoff Method

Part 1 (learning assessment specialists): general performance descriptors for each category.

Part 2 (Kinyarwanda curriculum specialists): reading performance descriptors for P1, P2, and P3.

Part 3 (master teachers and Kinyarwanda curriculum specialists): establishment of ORF and RC benchmarks.

The process had three phases. In the first phase, Ministry of Education learning assessment specialists came together to discuss how many performance categories Rwanda should have. The conclusion was four: (1) does not meet expectations, (2) partially meets expectations, (3) meets grade-level expectations, and (4) exceeds grade-level expectations.

Exhibit 5. General Performance Descriptors for Reading in Early Primary

Does not meet expectations: Pupils are unable to identify most sounds in words or to consistently connect sounds to letters. As a result, they make many errors reading the simplest grade-level texts, so many that they do not understand most of what they are reading and can only rarely figure out the meaning of new words. They do not like to read.

Partially meets expectations: Pupils read grade-level texts very slowly (usually syllable by syllable), hesitantly, and with limited confidence, particularly when faced with longer texts. They make some errors when reading familiar words or simple texts, often skipping over words entirely. They rarely go back to self-correct. As a result, they are able to answer some basic literal comprehension questions or figure out the meaning of some new words.

Meets grade-level expectations: Pupils read most grade-level texts accurately. They usually self-correct when they make a mistake and are able to figure out the meaning of most new words by using simple clues in the text or illustrations. They are able to answer most literal comprehension questions and demonstrate confidence in their reading abilities.

Exceeds grade-level expectations: Pupils read grade-level texts easily and fluently, respecting tones and syllable duration. They are able to figure out the meaning of all new grade-level words and answer all literal and simple inferential comprehension questions, including making accurate and logical predictions and making personal connections between the text and their own lives. They like to read and have confidence in their reading abilities.

In the second phase, reading curriculum specialists defined what “performance” looked like for each performance category from the first to third grade. In this phase, they elaborated performance descriptors for each of the four categories for each grade level.

In the final phase, master teachers and curriculum specialists established benchmarks for oral reading fluency by relying on their understanding of student performance as well as data they collected from students. To set the benchmarks, teachers learned how to

collect oral reading fluency data and spent a half day collecting this data from students in a nearby school. Next, they met to discuss how borderline pupils would perform; that is, what a student who met expectations but was right at the border with partially meeting expectations would be able to do. After setting initial cut scores (separating the performance categories), teachers went back to their classrooms to collect additional data to test their initial decisions. Then Soma Umenye brought teachers back to revisit their initial cut scores and revise them if needed.

Through this process, they identified the scores that differentiated each of the four performance categories described above. As a result of this exercise, the grade-level expectation in oral reading fluency for third graders was moved from 33 correct words per minute to 41 correct words per minute, which raises the bar for both Rwandan students and teachers and also aligns Rwanda with regional benchmarks. Please see Annexes D and E for additional information.
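For readers unfamiliar with the modified Angoff method, the arithmetic behind a cut score can be illustrated with a short, hypothetical Python sketch. The report does not publish the panel's individual estimates or the exact calculation used in the workshops; the judge estimates below are invented, and the averaging rule shown is simply the conventional Angoff approach of averaging judges' estimates of a borderline pupil's score, revisited after a second round of classroom data collection.

# Illustrative sketch only: panel sizes and estimates are hypothetical, not workshop data.
def angoff_cut_score(borderline_estimates):
    """Average the judges' borderline estimates and round to a whole cwpm score."""
    return round(sum(borderline_estimates) / len(borderline_estimates))

# Hypothetical round-1 estimates (correct words per minute) from eight judges for the
# P3 "meets grade-level expectations" boundary.
round_1 = [38, 45, 40, 42, 39, 44, 41, 43]
# Hypothetical round-2 estimates after judges collected additional classroom data.
round_2 = [40, 42, 41, 41, 40, 43, 41, 42]

print(angoff_cut_score(round_1))  # 42
print(angoff_cut_score(round_2))  # 41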

Exhibit 6. Draft cut scores and benchmarks for P1 to P3 oral reading fluency and comprehension

Oral reading fluency (cwpm):
• P1: does not meet expectations, 1 to 6 cwpm; partially meets expectations, 7 to 11 cwpm; meets grade-level expectations, 12 to 21 cwpm; exceeds grade-level expectations, 22 or more cwpm; grade-level benchmark, 12 cwpm.
• P2: does not meet expectations, 1 to 7 cwpm; partially meets expectations, 8 to 27 cwpm; meets grade-level expectations, 28 to 31 cwpm; exceeds grade-level expectations, 32 or more cwpm; grade-level benchmark, 28 cwpm.
• P3: does not meet expectations, 1 to 18 cwpm; partially meets expectations, 19 to 40 cwpm; meets grade-level expectations, 41 to 53 cwpm; exceeds grade-level expectations, 54 or more cwpm; grade-level benchmark, 42 cwpm.

Reading comprehension:
• P1: does not meet expectations, 20%; partially meets expectations, 40%; meets grade-level expectations, 60% to 80%; exceeds grade-level expectations, 100%; grade-level benchmark, 60%.
• P2: does not meet expectations, 20%; partially meets expectations, 40%; meets grade-level expectations, 60% to 80%; exceeds grade-level expectations, 100%; grade-level benchmark, 60%.
• P3: does not meet expectations, 20%; partially meets expectations, 40% to 60%; meets grade-level expectations, 80%; exceeds grade-level expectations, 100%; grade-level benchmark, 80%.
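
To illustrate how the draft cut scores in Exhibit 6 would be applied in practice (for example, by a termly assessment tool or the proposed dashboard), the following hypothetical Python sketch places a P3 pupil's oral reading fluency score into one of the four categories. The thresholds are taken from the draft table above; the function itself is illustrative only and is not part of any REB or Soma Umenye system.

# Minimal sketch of applying the draft Exhibit 6 cut scores; illustrative only.
P3_ORF_CUTS = [
    (19, "does not meet expectations"),      # 1-18 cwpm
    (41, "partially meets expectations"),    # 19-40 cwpm
    (54, "meets grade-level expectations"),  # 41-53 cwpm
]

def classify_p3_orf(cwpm: int) -> str:
    """Return the draft performance category for a P3 oral reading fluency score."""
    for upper_bound, label in P3_ORF_CUTS:
        if cwpm < upper_bound:
            return label
    return "exceeds grade-level expectations"  # 54 or more cwpm

print(classify_p3_orf(35))  # partially meets expectations
print(classify_p3_orf(41))  # meets grade-level expectations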

During the last working group session with teachers, Soma Umenye's chief of party solicited participants' feedback on what they would like to see in future teacher trainings, particularly with regard to linking the performance categories with remediation interventions. MINEDUC and REB participants also engaged in a similar exercise to determine the additional support that would be required for remediation efforts in order to improve reading outcomes. This session was opened and closed by Dr. Alphonse Sebaganwa, the head of ESAD.

USAID Soma Umenye will submit the proposed cut scores and benchmarks to REB in April 2019 for final review and approval.

Develop communications campaign to share approved standards with district, sector, and school actors. This activity is subject to the standards being finalized and approved by the GOR.

2.2.2. Develop, finalize, communicate, and implement teacher performance standards for early grade reading.

Develop/finalize teacher performance standards for early grade reading. No activity planned in Quarter 2.


Sub-IR 2.3: Research-based policies and curricula in support of early-grade reading instruction implemented

2.3.1. Support the development of a government of Rwanda-owned learning agenda for early grade reading to inform and strengthen curricula and relevant policy frameworks, including the new draft Literacy Policy.

Gather stakeholders together to create a learning agenda. During this quarter, USAID Soma Umenye met with the Advisor to the Minister of Education to discuss the project’s proposal to host a national retreat that would focus on activities to support MINEDUC’s interest in setting a national reading target within the framework of the HCI indicators and targets. A follow-up discussion is being planned in Quarter 3 following Soma Umenye’s proposed three-district LEMA pilot.

Identify REB priorities with respect to improving early grade reading performance and effectiveness. USAID Soma Umenye proposed a learning agenda retreat in Quarter 2 to identify priority questions whose answers can help Rwanda improve reading outcomes. USAID Soma Umenye expects that the retreat will provide a platform to explore and develop MINEDUC’s agenda. However, the proposed dates for the learning agenda retreat conflicted with the national leadership retreat. As a result, this activity has been shifted to Quarter 3 to follow the proposed three-district LEMA pilot.

Develop a plan to gather necessary evidence. The three-district LEMA pilot planned in Quarter 3 will kick start the learning agenda, collecting data from over 6,000 learners from 154 schools.

2.3.2. Implement learning agenda activities in collaboration with Rwandan stakeholders.

Collaborate with REB to finalize the time-on-task study. USAID Soma Umenye incorporated several rounds of comments on the time-on-task study. In Quarter 3, the project will finalize the time-on-task study by collaborating with the study reference group, which includes REB and URCE officials, to provide inputs to the report and specifically agree on the policy implications of the study.

Collaborate with REB to conduct another study from the learning agenda. USAID Soma Umenye expects that the second learning agenda study will be agreed upon at the proposed learning agenda retreat (described under 2.3.1 above) during Quarter 3. One possible idea, raised by REB, is a study of the impact of dialects on learner performance. REB requested that USAID Soma Umenye widen the scope of the 2018 EGRA sample to include districts with known evidence of dialects. Review of the 2018 EGRA data may result in a more in-depth study. Another idea being considered is a follow-up to the Nkombo Island site visit, where the question was raised as to whether P1 learners arrive at school with sufficient knowledge of Kinyarwanda and whether additional inputs are required at P1 to enable learners to participate effectively.

2.3.3. Support dissemination of evidence and the use of data for decision-making.


Support dissemination of the time-on-task study findings. USAID Soma Umenye, in collaboration with REB, will support the dissemination of the time-on-task study findings in Quarter 3, once the report has been finalized. USAID Soma Umenye will ensure that the findings are shared widely with REB, MINEDUC, and URCE. The project will also develop policy briefs aimed at informing MINEDUC and REB decision-makers about the policy implications of the study findings.

Sub-IR 2.4: Early-grade reading assessment systems strengthened

2.4.1 Prepare for EGRA administration.

Train data collectors for FY2019 EGRA. No activity planned.

2.4.2. Administer EGRA assessment.

Finalize the collection of FY2018 EGRA data. The activity is completed.

2.4.3. Analyze and report on EGRA data.

Conduct data cleaning, analysis, and report writing for FY2018 data. In Quarter 2, USAID Soma Umenye staff finalized the draft baseline report. In Quarter 3, USAID Soma Umenye plans to share these results with stakeholders and solicit feedback during the proposed learning agenda retreat.

Disseminate FY2018 EGRA findings. The findings will be shared with USAID and REB in Quarter 3 through the learning agenda framework.

2.4.4. Collaborate with REB to identify a sustainable national early grade reading assessment model.

This quarter, USAID Soma Umenye developed a concept note on an early grade assessment and accountability framework (see Annex E) and presented it at the REB-USAID Soma Umenye Steering Committee. Upon approval of the concept note, Soma Umenye organized a series of working sessions with staff from REB, MINEDUC, and URCE, as well as teachers and district education personnel. The goal of these sessions was to develop the reading assessment and accountability frameworks.

Over the past decade, roughly a dozen early grade reading assessments have been conducted. However, because they have not been tied to a government-owned assessment framework, it is very difficult to tell a coherent story from their results.


Exhibit 7. Percent of P3 students “meeting expectations” for reading comprehension

[Bar chart not reproduced. The chart compares the percent of P3 students meeting expectations for reading comprehension across EGRA 2011, LARS 2011, EGRA 2014, EGRA 2015, EGRA 2016, LARS 2016, and EGRA 2018; the vertical axis runs from 0% to 70%.]

The inconsistent results shown in Exhibit 7 are likely the product of:

• Assessments that use different metrics to measure "meeting expectations"
• Assessments that use texts of varying levels of difficulty
• Assessments that use varying administrative protocols
• Assessments that use different sampling methodologies
• Assessments that use different instruments (which have not been equated)

During the first working session, in early February, participants developed four different levels of performance (does not meet grade-level expectations, partially meets expectations, meets grade-level expectations, and exceeds grade-level expectations) with respect to the key reading competencies in each term.

In the concept note, Soma Umenye proposed that REB develop a common assessment framework to which all assessments conducted in Rwanda must adhere. This assessment framework would (1) start from the standards in the competency-based curriculum, (2) include stable performance categories, and (3) provide qualitative and quantitative performance descriptors for each category (Soma Umenye proposes that the cut scores separating the categories be set using the modified Angoff method, as described under Sub-IR 2.2). This framework would only change when the curriculum is changed. To measure performance under this assessment framework at the national, district, sector, or school level, Rwanda would use harmonized assessment instruments that have been equated.

The texts used in assessments of early grade reading performance would adhere to rigorous methods that define grade-level text.
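The report does not specify how instruments would be equated. One common approach for linking scores from two test forms is linear (mean-sigma) equating; the sketch below, with invented form statistics, is included only to illustrate the idea of placing scores from different instruments on a common scale, not to describe the method Rwanda would adopt.

# Hypothetical sketch of linear (mean-sigma) equating; method choice and statistics
# are assumptions for illustration only.
def mean_sigma_equate(score_x, mean_x, sd_x, mean_y, sd_y):
    """Convert a score on form X to the scale of form Y."""
    return (sd_y / sd_x) * (score_x - mean_x) + mean_y

# Example: a pupil scores 30 cwpm on a slightly harder form X.
equated = mean_sigma_equate(score_x=30, mean_x=28.0, sd_x=10.0, mean_y=31.0, sd_y=11.0)
print(round(equated, 1))  # 33.2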

This proposed assessment framework aligns P2 and P3 literacy competencies with the performance descriptors used by Sustainable Development Goal (SDG) 4.1.1 as well as with the criteria used to measure education quality in early primary grades in the Human Capital Index. The benefit of such an alignment is that it allows decision makers at all levels of the system (school, sector, district, national) to track progress with respect to these two indicators. Soma Umenye proposed that this comprehensive assessment program be supported by an electronic dashboard (see Sub-IR 2.1) to facilitate the entry, compilation and presentation of data on the percentage of pupils in each of the four performance categories at all levels (school, sector, district, national) of the system. The data will allow educational decision-makers to monitor progress with respect to national targets and direct resources to struggling districts, sectors, or schools.
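As an illustration of the roll-up the proposed dashboard implies, the hypothetical Python sketch below aggregates school-level counts of pupils in each performance category up to the national level and converts them to percentages. The school names, counts, and data structure are assumptions for illustration; they do not describe an actual REB or Soma Umenye system.

# Illustrative roll-up: school-level category counts aggregated to higher levels.
from collections import Counter

CATEGORIES = [
    "does not meet expectations",
    "partially meets expectations",
    "meets grade-level expectations",
    "exceeds grade-level expectations",
]

# Hypothetical school-level counts of P3 pupils per category, keyed by (district, sector, school).
school_counts = {
    ("District A", "Sector 1", "School X"): Counter({CATEGORIES[0]: 12, CATEGORIES[1]: 30,
                                                     CATEGORIES[2]: 25, CATEGORIES[3]: 8}),
    ("District A", "Sector 1", "School Y"): Counter({CATEGORIES[0]: 20, CATEGORIES[1]: 35,
                                                     CATEGORIES[2]: 15, CATEGORIES[3]: 5}),
}

def aggregate(counts_by_unit, key_len):
    """Sum category counts up to a higher level (key_len=2 for sector, 1 for district, 0 for national)."""
    rolled = {}
    for key, counts in counts_by_unit.items():
        rolled.setdefault(key[:key_len], Counter()).update(counts)
    return rolled

def percentages(counts):
    """Convert category counts to percentages of all pupils assessed."""
    total = sum(counts.values())
    return {cat: round(100 * counts[cat] / total, 1) for cat in CATEGORIES}

national = aggregate(school_counts, 0)[()]
print(percentages(national))  # e.g. {'does not meet expectations': 21.3, ...}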

Additionally, the accountability framework associated with the assessment framework takes account of MINEDUC/REB’s recent decision to institute term-based assessments and to use the results of those assessments to direct remediation to pupils in need. As described under sub-IR 2.1, Soma Umenye proposes to work with REB to develop a local early grade reading assessment to be used at the district level to measure the performance of each district’s early grade students against national targets. Once validated, the comprehensive program would be implemented in a number of pilot districts beginning in the 2019-2020 school year. If successful, it could then be expanded to other school districts, subject areas and levels.

During the working sessions organized following the REB-Soma Umenye Steering Committee at which Soma Umenye presented the concept paper, district and central level education representatives, along with education researchers from URCE, worked together to design an assessment and accountability framework for reading in early primary years. The group reached the following decisions (some of which are described under sub-IR 2.2 as well):

Exhibit 8. Decisions Made Related to Assessment and Accountability Framework

Assessment Framework
• Number of performance categories: 4
• Labels for performance categories: (1) does not meet expectations, (2) partially meets expectations, (3) meets grade-level expectations, and (4) exceeds grade-level expectations
• General policy-level performance descriptors for each category: see Annex F


• General policy-level performance descriptors for reading in P1, P2, and P3: see Annex F
• Skills to assess each term in P1, P2, and P3: see Annex F

Accountability Framework
• When to assess? Middle of the term
• Who to assess? All pupils
• Who should assess? Teachers
• How should data be collected? Tablets, pencil and paper
• Who should validate teacher data? SEO
• How will data be aggregated? SEO enters school data into the dashboard
• What data should appear on the dashboard? Percent of pupils in each of the four performance categories by school, sector, district, and national level; progress at each level with respect to meeting SDG and HCI targets
• Who should develop and test termly assessment tools and termly remediation activities? REB/MINEDUC
• What type of remediation activities are appropriate? They should be different for pupils in the four categories; some activities are proposed (see Annex F)

Sub-IR 2.5: Capacity of teacher training colleges (TTCs) to effectively prepare teachers of early-grade reading increased

2.5.1. Strengthen the capacity of TTC tutors to prepare teachers to deliver evidence-based early grade reading instruction and promote gender equality and empowerment of vulnerable populations.

Support revision of the TTC curriculum as it relates to early grade reading in Kinyarwanda. No activity in Quarter 2.

Support TTC tutors to train finalists (students in their last year of study) using Soma Umenye modules. In Quarter 2, Soma Umenye prepared training for newly qualified P1 teachers and newly transferred P1 teachers, which will be delivered in Quarter 3.

Develop concept note for Year 4 support to TTCs. No activity planned in Quarter 2.

2.5.2. Support REB to develop/implement protocols for tracking NQTs in early grade Kinyarwanda and to provide appropriate mentoring

Hold workshop with TDM to develop scope of work for Kinyarwanda tutors with respect to newly qualified teachers. No activity planned in Quarter 2.

Train TTC tutors to mentor NQTs. During this quarter, USAID Soma Umenye trained all the Kinyarwanda tutors from all 16 TTCs as part of its NRTT training. To promote sustainability, the expectation is that these tutors will act as mentors to NQTs in the future.

C. MEL ACTIVITIES

MEL 1. Conduct ongoing performance monitoring.

Collect and analyze data on Soma Umenye performance with respect to its indicators. During this quarter, the MEL team processed data on (1) participants who attended USAID Soma Umenye trainings (Phase 1 training of trainers and Phases 1 and 2 of the P2/P3 teacher training); (2) the Year 2 distribution of teacher guides and read-aloud books; (3) school and teacher profiles; and (4) training participant feedback.

Coordinate school monitoring and lessons observation activities to monitor the fidelity of Soma Umenye implementation. The MEL team finalized the development of Year 3 monitoring tools and protocols and organized a two-day refresher training for district and provincial advisors. The team also conducted inter-rater reliability assessments of their skills, using videos of Kinyarwanda lessons, to ensure consistency in their data collection. In Quarter 3, at the start of the second school term, the project will pilot the tools and, after incorporating feedback from the pilot, will start the monitoring process.

PHOTO: Field staff participate in activities to establish inter-rater reliability with respect to classroom observation. (Credit: Alain Patrick Mwizerwa/USAID Soma Umenye)
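
The report does not state which statistic the MEL team used for the inter-rater reliability exercise. Two common measures for categorical observation ratings are simple percent agreement and Cohen's kappa (which corrects for chance agreement); the hypothetical Python sketch below illustrates both with invented ratings from two raters watching the same lesson videos.

# Illustrative only: the project's actual reliability statistic and ratings are not published.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of items on which the two raters gave the same rating."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for the agreement expected by chance."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of ten lesson-observation items by two district advisors.
rater_1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
rater_2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "yes", "yes"]

print(percent_agreement(rater_1, rater_2))        # 0.8
print(round(cohens_kappa(rater_1, rater_2), 2))   # 0.47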

Produce summary reports of collected data on internal dashboard. The MEL team developed an internal dashboard that is accessible to all staff and that presents FY17 and FY18 datasets and performance indicator targets.


Exhibit 9. USAID Soma Umenye Dashboard


MEL 2. Implement data quality assurance procedures.

Monitor the quality of Soma Umenye activities as per quality benchmarks. During this quarter, the MEL team monitored the quality of implementation of the training of trainers and the P2/P3 Kinyarwanda teacher training (Phases 1 and 2) against established quality benchmarks. Additionally, while Soma Umenye conducted training in every district, the MEL team monitored 15 training sites (one per district) out of a total of 33 to assess the quality of P2 and P3 teacher training, identify best practices, and highlight areas for improvement. These findings were presented at an internal debrief session, and the team established the following recommendations, which will be implemented in upcoming training sessions:

Recommendations:
• A feedback session at the end of each day should be organized for each site.
• Make available separate rooms for group activities.
• Make available and prepare a babysitting room for teachers who bring their children.
• Procure sufficient refreshments for trainees, children, and babysitters.
• District advisors should assign some site logistics coordination to training assistants. This will allow district advisors to be more involved in the training and monitor the quality of training delivery.
• Ensure hygienic restroom facilities.
• Decide whether trainees who miss a day should be allowed to continue the following training day. Only participants who attend at least 90% of the total training hours are reported under Soma Umenye performance indicators.
• If USAID Soma Umenye materials are not available in schools at the time of the training, ensure that the teachers are trained on the Soma Umenye methodology and relate it to their current materials.
• Ensure that training rooms are gender-balanced, which can be accomplished by combining participants from different sectors and districts.
• Combine transport and per diem forms to reduce the time spent filling both forms.
• Plan for enough time to ensure that trainings do not conflict with other development partner activities.

MEL 3. Hold performance review and learning activities.

Conduct semi-annual performance monitoring review and learning workshops. This activity has been postponed to Quarter 3 as there will then be sufficient data to inform the learning sessions.

MEL 4. Report Soma Umenye data as required.


The MEL team compiled information for Quarter 2 performance indicator data reporting and prepared Quarter 1 and Quarter 2 participant training information. This information will be submitted through the TraiNet system in April 2019.

D. COMMUNICATIONS ACTIVITIES

Media engagement and mobilization. The communications team created Twitter buzz with a series of exchanges on different USAID Soma Umenye activities.

• Twitter threads for key events. The communications team created Twitter threads for various key events, described below.

— USAID Soma Umenye worked with Mureke Dusome to organize an International Mother Language Day event that took place in Rulindo District in February. The U.S. Ambassador and the Director General of Education Policy and Planning from MINEDUC attended the event.

- The team also updated the project Twitter account on a regular basis. This quarter, the team tweeted the most about Andika Rwanda 2019, teacher training, International Mother Language Day, International Women’s Day, and World Poetry Day celebrations. The team also tweeted short videos that captured teacher training.



- The team used Twitter to conduct an Andika Rwanda Awareness Campaign to encourage participation and ensure that students understood the competition guidelines. The team also visited several schools during the school jury committee process and captured this on Twitter.

• Soma Umenye retweets. USAID Soma Umenye has also been tagged by various influencers in their tweets, which were in turn retweeted by the project.


The communications team has grown USAID Soma Umenye’s Twitter audience by approximately 24 percent, from 671 followers in the previous quarter to 830 on 31 March 2019.

The communications team attended workshops to collect photographs and conducted interviews (audio and video testimonials) with teachers, school leaders, REB and district and sector officials, and AR participants.

USAID Soma Umenye visibility materials. This quarter, the USAID Soma Umenye communications team finalized the branding and design of teacher training visibility materials, which included banners and pull-ups.

USAID Soma Umenye communications materials. In Quarter 2, the communications team developed press releases, success stories, and talking points.


SECTION 3 CHALLENGES AND LESSONS LEARNED

A. PROBLEMS ENCOUNTERED AND PROPOSED REMEDIAL ACTIONS

Textbook printing and distribution. Distribution of P1 textbooks continued to suffer delays due to slower-than-expected production speeds, distribution failures, and general delays in delivery to schools. The launch of P2 and P3 textbook printing was delayed due to multiple rounds of quality inspections to achieve REB print standards.

Our remedial actions included:

• Working with the printer of P1 textbooks to ensure a rigorous quality assurance process was in place and applied.
• Recruiting a specialist supply chain technical assistant to ensure effective Soma Umenye management of the distribution process.
• Delivering Soma Umenye quality assurance oversight in the initial production of the P2 and P3 print prototypes for review by REB, to ensure that the printers were clear about the specifications required.

Decodables. Following a determination by REB that decodable specifications should be revised, the project began revising the texts and entered into negotiations with printers for the revised specifications. The REB specification changes were twofold. First, REB proposed that Soma Umenye produce decodables in full color. Second, REB confirmed that it now expected the readers to be printed to the same specifications as the textbooks. Both changes have resulted in delays: the cover board specified by REB is not readily available and must be ordered directly from the mill, and local printer capacity to print in full color is limited.

Delays in production precluded the project’s ability to deliver the decodable readers in time for use at targeted points in the P1 curriculum in 2019, and as a result, the project is in the process of revising its distribution strategy.

B. SUCCESS STORIES AND LESSONS LEARNED See next page.

SNAPSHOT: Rwanda Develops Benchmarks to Clarify Expectations for Reading Achievement

Kinyarwanda experts produced grade-level performance expectations for P1, P2, and P3 students.

PHOTO: Participants discuss how to define what it means to meet grade-level expectations. (Credit: Alain Patrick Mwizerwa/USAID Soma Umenye)

The Rwanda Education Board (REB) has set benchmarks that define the expected achievement of students in Kinyarwanda in Grades 1-3. When shared with teachers, these benchmarks will make clear what children should be able to do with respect to reading in Kinyarwanda when they graduate from each grade. REB worked on the benchmarks with USAID Soma Umenye, a program focused on improving reading in Kinyarwanda in early grades. To set the benchmarks, REB curriculum and assessment experts and USAID Soma Umenye staff first drew from Rwanda's Kinyarwanda curriculum the expected performance for Grades 1-3. Next, they established performance categories: (1) not meeting grade-level expectations, (2) partially meeting expectations, (3) meeting expectations, and (4) exceeding expectations. Following this, they described in rich detail what, for example, "partially meeting expectations" meant; that is, what skills a student who partially met expectations would be able to display. They did this for each of the performance categories. Finally, they met with experienced teachers to determine how the performance categories would correspond with pupil performance on an assessment that measured oral reading fluency (the ease with which a student can read words) and reading comprehension. To do this, they associated a certain range of scores from the test with each performance category, thereby establishing benchmarks. As a result, teachers can use these assessment tests to place students in each of the four performance categories. The benchmarks are awaiting approval by the Ministry of Education.

According to Jean Francois Rubanda, Kinyarwanda Examination Officer at Rwanda Education Board, "These benchmarks are key, as they will help us and the ministry stakeholders in early grade reading to track progress in reading and to collect data to inform where to put effort in improving our education quality to meet international standards."


SNAPSHOT: Inclusion and Digitization Study Tour Sparks New Ideas for Rwanda

The Kenyan experience sheds light on Rwanda's path to inclusive education.

PHOTO: Study tour participants testing out equipment at the Kenya Institute of Curriculum Development's in-house educational radio station. (Credit: Kate Brolley/USAID Soma Umenye)

"How do we ensure that students with disabilities are supported in schools and that all students can access materials?" This was a question posed by William Safari of the National Union of Disability Organizations in Rwanda (NUDOR) to Fred Haga, the director of special needs education at the Ministry of Education in Kenya, during a recent study tour to Nairobi. Mr. Safari's question is central to an ongoing discussion in Rwanda, as promoting inclusive education and exploring accessible digital materials in the education sector are two priorities for the Rwandan Ministry of Education (MINEDUC). In an effort to further this discussion, USAID Soma Umenye supported a study tour to Kenya to explore the intersection of digital content and inclusive education.

Throughout the week, participants had the opportunity to interact with accessible and versatile digital content at the Ministry of Education and UNICEF and to see it in action at the Kambui Primary School for the Deaf. What makes this content accessible and versatile is that it allows students to customize and combine diverse features like narration, sign language, interactivity, audio-description of images, structured comprehension questions, and other functions. Not only does this format cater to students with disabilities, but it also accommodates students with different learning styles. Teachers at the Kambui Primary School for the Deaf enthusiastically noted that this type of digital content allows all students to actively participate in the lesson. Study tour participants have now begun to think about the opportunities and challenges of using accessible digital content in Rwanda and supporting MINEDUC to ensure that all students, including those with disabilities, have access to quality education.

"Sharing this experience with my colleagues at the Rwandan Education Board will contribute a lot to the planning and successful implementation of digital content," said Rodgers Kabamba of REB, describing his experience participating in the study tour. "Having seen inclusive education in the Kenyan context, I am confident that REB will be able to train teachers to help learners with disabilities access materials, including digital content."

SNAPSHOT: P2 and P3 Rwandan Teachers Trained to Improve Kinyarwanda Literacy Teaching Techniques

The P2 and P3 Kinyarwanda teacher training equipped teachers with skills to teach Kinyarwanda more effectively.

PHOTO: During teacher training, teachers at one training site in the Northern Province discuss effective ways of teaching reading in Kinyarwanda. (Credit: Alain Patrick Mwizerwa/USAID Soma Umenye)

Rwanda's Grade 1, 2, and 3 students are not yet learning to read well enough to meet grade-level standards set by the Ministry of Education. In 2018, only 30 percent of Grade 2 students met these standards. To address this issue, the Soma Umenye ("read and understand") project, implemented by the United States Agency for International Development (USAID), is training Kinyarwanda teachers in Grades 1, 2, and 3 to implement evidence-based classroom instruction. In early 2019, USAID Soma Umenye trained some 8,000 Grade 2 and 3 teachers in a structured approach to literacy instruction that emphasizes student practice. First, the teacher models a skill, such as reading a letter or decoding a word (I do), then the teacher guides students to practice the skill with her (we do), and then the teacher gives students the opportunity to practice the skill on their own (you do). This interactive process boosts student learning. The training also covered gender-responsive and inclusive instruction practices.

To support teachers in the classroom, USAID Soma Umenye also collaborated with the Rwanda Education Board (REB) to revise and distribute new materials, including a new student textbook and teacher's guide. According to Alphonse Makuza, a P3 Kinyarwanda teacher at Kabagera Primary School (Kirehe District), "The new P3 teachers' guide is better than the previous one we used. The new methodology is more effective because it gives additional time for every student to contribute." Joan Murungi, Head of Department in Charge of Curriculum at REB, noted, "I am confident this training will help teachers to further their understanding of how children learn to read, best practices in reading instruction, and how they can apply these practices in the classroom." USAID Soma Umenye aims to help at least 70 percent of Grade 1, 2, and 3 students meet ministry standards by 2021.


SECTION 4 ACTIVITIES PLANNED FOR NEXT QUARTER

A. OPERATIONAL ACTIVITIES

In Quarter 3, the project will finalize on-boarding of several new staff, as well as finalize recruitment for positions that remain open. This includes re-competition of the Deputy Chief of Party – Operations position following the rejection of the long-term visa renewal of the incumbent, Kate Arden.

The project’s Finance team will continue to refine its mobile money processes, working with the vendor to minimize errors and mitigate risks.

Quarter 3 will be critical for the management of remaining printing subcontracts and coordination of distribution. The project’s short-term book supply chain specialist will continue to provide support in this regard, pending visa extension.

Following the recall of P1 textbooks that did not meet REB quality standards, Soma Umenye will work with USAID in Quarter 3 to establish a plan for destruction of unusable books to ensure they remain out of schools.

B. TECHNICAL ACTIVITIES

B1. IR 1: CLASSROOM INSTRUCTION IN EARLY-GRADE READING IMPROVED

Sub-IR 1.1: Evidence-based, gender-sensitive early-grade reading materials available and used

1.1.1. Collaborate with REB to develop/revise an “essential core” of instructional materials for P2 Kinyarwanda

As agreed upon with REB, the essential core of P2 Kinyarwanda materials consists of (1) a student textbook, (2) a teacher’s guide, (3) a read-aloud storybook, and (4) leveled readers.

Procure and distribute bookshelves. To store classroom readers in schools, USAID Soma Umenye has procured and will distribute and assemble approximately 15,000 bookshelves for P1, P2, and P3 classrooms in public and government-supported schools. These bookshelves will house a set of approximately 100 leveled readers per class. While the distribution of bookshelves started in Quarter 2, it will be completed in Quarter 3.

P2 Textbook, Teacher’s Guide (including assessment protocol), and Read-aloud Story Book

Digitize the teacher’s guide. The project will formalize a plan in Quarter 3 that will link a digital version of the teacher’s guide to the project’s coaching and CPD activities.

Printing and Delivery

Procure printing and delivery services for P2 essential core and print and deliver essential core. Below, we lay out the plans with respect to printing and delivering the P2 essential core.

• P2 student textbook. P2 textbook production is currently ongoing, and printers will start shipping and distributing the books in Quarter 3. It is expected that the distribution process will start immediately after the arrival of the books in Kigali, and USAID Soma Umenye will monitor the process. To accelerate distribution, USAID Soma Umenye will continue to ensure quality assurance by sending key team members to monitor production.
• P2 teacher’s guide. The teacher’s guides will be printed and distributed in Quarter 3 (through teacher training).
• P2 read-aloud story book. The read-alouds will be printed and distributed in Quarter 3 (through teacher training).
• P2 leveled readers. Soma Umenye will continue to work with local publishers to finalize production of the P2 leveled readers and plan future distribution.

1.1.2. Collaborate with REB to develop/revise an “essential core” of instructional materials for P3 Kinyarwanda

As agreed upon with REB, the essential core of P3 Kinyarwanda materials consists of (1) a student textbook, (2) a teacher’s guide, (3) a read-aloud storybook, and (4) leveled readers.

P3 Textbook, Teacher’s Guide (including assessment protocol), and Read-aloud Story Book

Digitize the teacher’s guide. The project will formalize a plan in Quarter 3 that will link a digital version of the teacher’s guide to the project’s coaching and CPD activities.

Printing and Delivery

Procure printing and delivery services for P3 essential core and print and deliver essential core. Below, we lay out the plans with respect to printing and delivering the P3 essential core.

• P3 student textbook. P3 textbook production is currently ongoing, and printers will start shipping and distributing the books in Quarter 3. It is expected that the distribution process will start immediately after the arrival of the books in Kigali, and USAID Soma Umenye will monitor the process. To accelerate distribution, USAID Soma Umenye will continue to ensure quality assurance by sending key team members to monitor production as and when required.
• P3 teacher’s guide. The teacher’s guides will be printed and distributed in Quarter 3 (through teacher training).
• P3 read-aloud story book. The read-alouds will be printed and distributed in Quarter 3 (through teacher training).
• P3 leveled readers. Soma Umenye will continue to work with local publishers to finalize production of the P3 leveled readers and plan future distribution.

1.1.4. Provide adaptations that make the materials described above accessible to students with disabilities.

Support MINEDUC to implement the 2018 Special Needs and Inclusive Education Policy as it relates to early grade literacy in Kinyarwanda.

• Facilitate a Rwanda Inclusive Education Technical Workshop. Soma Umenye will propose to REB to hold a workshop that brings together key stakeholders to support MINEDUC and REB to develop a robust implementation plan for its revised special needs policy. While the workshop will be grounded in the objectives of the new policy, there will be a specific focus on defining achievable activities that can be implemented in the short term.

• Develop a revised project inclusion plan. Based on the outcome of the above-mentioned workshop (as well as Quarter 2 activities), Soma Umenye will draft a plan of proposed inclusion interventions for REB and USAID’s review.

Sub-IR 1.2: Teachers’ use of evidence-based, gender-sensitive instructional practices in early-grade reading increased

1.2.1. Provide refresher support to P1 teachers

Soma Umenye will deliver P1 teacher refresher training in Quarter 3 informed by Soma Umenye’s recent assessment of P1 teachers’ classroom application.

1.2.2. Train P2 teachers on evidence-based, gender-sensitive early grade reading instruction (focused on the revised teaching and learning materials).

Support NRTT to train P2 teachers. USAID Soma Umenye will support the NRTT members trained in Quarter 2 as they train P2 teachers from schools across the country during Quarter 3.

1.2.3. Train P3 teachers on evidence-based early grade reading instruction (focused on the revised teaching and learning materials).

Support NRTT to train P3 teachers. USAID Soma Umenye will support the NRTT members trained in Quarter 2 as they train P3 teachers in Quarter 3.


1.2.4. Support school-based teacher professional development (the creation of communities of practice for P1-P3 Kinyarwanda teachers).

Collaborate with REB to conduct CPD needs assessment of P1, P2, and P3 teachers. USAID Soma Umenye will develop a detailed plan to conduct the CPD needs assessment in Quarter 3.

Sub-IR 1.3: Capacity of head and mentor teachers to coach and supervise early- grade reading instruction strengthened

1.3.1. Train designated coaches for P1-P3 Kinyarwanda teachers.

Submit draft coaching training guide, protocols, and tools to USAID and REB for approval. Soma Umenye’s coaching guide, protocols and tools are included in Soma Umenye’s school leadership Module 1 which was approved by REB and USAID last quarter.

1.3.2. Train head teachers to promote their leadership in support of early-grade reading instruction

Finalize curriculum for head teacher/DOS training. Following the validation of the first module during Quarter 2, USAID Soma Umenye will finalize the remaining modules in Quarter 3.

Train National Reading Training Team (NRTT) to deliver head teacher/DOS training. In Quarter 3, Soma Umenye will conduct a training of trainers session for members of the NRTT to prepare them to train school leaders.

Support NRTT to train head teachers/DOS. In Quarter 3, Soma Umenye will support NRTT training of school leaders nationwide. School leaders’ Module 1 includes focus on teacher coaching by head teachers and DOS and oversight of the school coaching plan by the SEO.

Sub-IR 1.4: Schools’ and teachers’ use of student assessment results improved

1.4.1. Develop P2 assessment protocol (see sub-IR 1.1), train teachers to use it (see sub-IR 1.2), and train head teachers to oversee it (see sub-IR 1.3).

Soma Umenye will train head teachers and directors of studies in the oversight of assessment and use of assessment data for P2 students in Quarter 3.

1.4.2. Develop P3 assessment protocol (see sub-IR 1.1), train teachers to use it (see sub-IR 1.2), and train head teachers to oversee it (see sub-IR 1.3).

Soma Umenye will train head teachers and directors of studies in the oversight of assessment and use of assessment data for P3 students in Quarter 3.


B2. IR 2: SYSTEMIC CAPACITY FOR EARLY-GRADE READING INSTRUCTION IMPROVED

Sub-IR 2.1: National advocacy mechanisms for early grade reading interventions strengthened

2.1.1. Develop and implement advocacy strategy to highlight the importance of early grade reading in Kinyarwanda.

Support REB to attend CIES. REB and USAID Soma Umenye staff will make two joint presentations at CIES during Quarter 3. Soma Umenye staff will refine the planned presentations with the chief of party in collaboration with the travelling REB officials.

Implement advocacy strategy.

• District events. Soma Umenye’s district advisors will conduct meetings with vice mayors of social affairs and DDEs to advocate for the inclusion of early grade reading metrics in their imihigos and district development plans.

• Instructional time study. In Quarter 3, USAID Soma Umenye will finalize the instructional time study draft report and will share it with USAID. After USAID and REB have approved the study, Soma Umenye will disseminate its findings at the provincial and district levels.

• Learning agenda retreat. In Quarter 3, Soma Umenye plans to hold a three-day reflection following the three-district LEMA pilot in June. On the final day of the reflection, MINEDUC and REB leaders will present key findings and recommendations to the Minister of Education and other senior education leaders.

2.1.2. Support the implementation of the ESSP 3.

Collaborate with REB to develop and support an implementation plan for ESSP 3. Soma Umenye will collaborate with BLF to summarize and translate ESSP 3 and disseminate it to the districts. This process will commence as soon as MINEDUC confirms that ESSP 3 has been approved and is published.

Help develop the ESSP dashboard and integrate it with MINEDUC and REB systems. Soma Umenye will share a revised plan to develop the ESSP dashboard and integrate it with MINEDUC and REB systems in Quarter 3. In addition, Soma Umenye will recruit a team to lead and develop the ESSP dashboard in Rwanda as well as international specialists to inform the development of the dashboard. A prototype of the dashboard, presenting 2018 EGRA data, will be prepared for the next JRES presentation. Presentation of data from the three-district LEMA pilot planned for June 2019 will use the proposed ESSP dashboard format to demonstrate how ESSP will be able to make data available at the district level to inform district planning.

2.1.3. Strengthen capacity of districts to identify and respond to local and national early grade reading priorities.

Collaborate with REB to co-develop and deliver a three-district LEMA pilot in June, which will enable districts to assess early grade reading performance in their districts and develop district plans. Soma Umenye envisions the following steps:
• Approval of the concept and sharing it with REB;
• Internal development of LEMA tools;
• Testing EGRA tools against LEGRA tools;
• Meeting district officials from pilot districts;
• A preparatory workshop with stakeholders to plan the pilot and prepare data collection teams;
• Communication to all schools and key stakeholders to prepare them for the pilot.

Hold three District Education Conferences to present and discuss LEMA data to inform district plans. USAID Soma Umenye will hold a national conference in June 2019 to present findings from the three-district pilot to:

• Reflect on the merit of LEMA overall and recommend future steps;
• Propose changes to the LEMA process;
• Propose changes to the LEMA instrument;
• Reflect on the data collected from the LEMA and the recommendations made by districts for both school improvement and district improvement;
• Reflect on the potential for establishing district reading targets;
• Reflect on the potential for establishing a national reading target.

2.1.4. Collaborate with REB to support Andika Rwanda to raise national-level profile of children's literacy.

Support evaluation of Andika Rwanda entries. In Quarter 3, USAID Soma Umenye will work with REB to develop district jury selection criteria and a marking scheme. Based on these criteria, USAID Soma Umenye and REB will support districts to select eight members to form a district-level jury committee. Soma Umenye will collaborate with REB to develop similar criteria and schemes for the national-level juries.

Help produce winning entries as books (Andika Rwanda 2018 winners). In Quarter 3, Soma Umenye will continue to manage the printing of Andika Rwanda books (2018 winners), which will be distributed to P1 classrooms as soon as they are available.

Sub IR 2.2: Student and teacher performance standards and benchmarks for early grade reading applied

2.2.1. Develop, finalize, communicate, and implement benchmarks and targets for P1, P2, and P3 reading fluency and comprehension as well as student performance standards.


Hold workshop to review available assessment data and identify fluency and comprehension standards. In Quarter 3 (April), USAID Soma Umenye will submit the draft cut scores and benchmarks to REB and the REB Steering Committee for review and approval.

Sub-IR 2.4: Early-grade reading assessment systems strengthened

2.4.1 Prepare for EGRA administration.

Train data collectors for FY2019 EGRA. USAID Soma Umenye will begin the recruitment of data collectors for the 2019 EGRA data collection and prepare training materials.

2.4.3. Analyze and report on EGRA data.

Conduct data cleaning, analysis, and report writing for FY2018 data. In Quarter 3, USAID Soma Umenye staff will finalize the draft baseline report; share it with USAID, REB, and other stakeholders; incorporate stakeholder feedback; and produce a final EGRA report.

Disseminate FY2018 EGRA findings. USAID Soma Umenye plans also to hold the first dissemination session through a small learning agenda session with USAID, REB, and MINEDUC. Further dissemination activities will be determined by the participants in this first session.

2.4.4. Collaborate with REB to identify a sustainable national early grade reading assessment model.

Once the grade-specific benchmarks and cut scores are validated, USAID Soma Umenye proposes that national targets be established by estimating the percentage of pupils in each grade level who, in a national large-scale assessment, score at or above the benchmark score, based on our understanding of current reading performance and estimates of improvement that is achievable over time. Discussions on this proposal will continue in Quarter 3 and will be informed by the three-district LEMA pilot planned for June 2019.

Sub-IR 2.5: Capacity of teacher training colleges (TTCs) to effectively prepare teachers of early-grade reading increased

2.5.1. Strengthen capacity of TTC tutors to prepare teachers to deliver evidence- based early grade reading instruction and promote gender equality and empowerment of vulnerable populations.

Support TTC tutors to train finalists (students in their last year of study) using Soma Umenye modules. In Quarter 2, USAID Soma Umenye, in collaboration with TTC principals, identified 1,280 Year 3 student teachers in 16 TTCs who plan to teach Kinyarwanda in early grades after they graduate. The identified Year 3 student teachers will be trained at TTCs, and the training will be facilitated by TTC tutors. In Quarter 2, Soma Umenye trained all Kinyarwanda tutors (64 Kinyarwanda tutors in 16 TTCs) on early grade reading instructional practices as well as on gender-sensitive and disability-inclusive instruction so that, as members of the NRTT, they can deliver training to Year 3 TTC students in Quarter 3.

2.5.2. Support REB to develop/implement protocols for tracking NQTs in early grade Kinyarwanda and to provide appropriate mentoring

Train TTC tutors to mentor NQTs. USAID Soma Umenye will prepare for and conduct NQT training in Quarter 3 in collaboration with REB and TTCs.

C. MEL ACTIVITIES

MEL 1. Conduct ongoing performance monitoring.

Collect and analyze data on Soma Umenye performance with respect to its indicators. The MEL team will compile Quarter 3 actual results for USAID Soma Umenye indicators.

Coordinate school monitoring and lessons observation activities to monitor the fidelity of Soma Umenye implementation. Soma Umenye staff will internally finalize the revised data collection tools after the practice exercise and feedback session planned for April 2019. The MEL team will then identify a sample of schools to be monitored (monthly) by field staff in Quarter 3 to enable USAID Soma Umenye to detect changes in key indicators and evaluate fidelity of implementation to Soma Umenye’s proposed interventions.

Produce summary reports of collected data on internal dashboard. The MEL team will develop initial analytical dashboards in Quarter 3 based on monitoring information collected and internal data needs.

MEL 2. Implement data quality assurance procedures.

Monitor the quality of Soma Umenye activities as per quality benchmarks. The MEL team will monitor the quality of implementation of (1) the training of trainers session, (2) the P2/P3 Kinyarwanda teacher training, and (3) the TLM distribution against established quality benchmarks and guidelines.

Carry out data quality assessments to verify data submitted for Soma Umenye indicators. The MEL team will plan a systematic quality assurance exercise, including the identification of several indicators to verify data submitted for reporting.

MEL 3. Hold performance review and learning activities.

Conduct semi-annual performance monitoring review and learning workshops. This activity will be conducted in Quarter 3.

MEL 4. Report Soma Umenye data as required.

The MEL team will provide the required performance information and participant cost data to USAID via the Development Information System and TraiNet portals.


D. COMMUNICATIONS ACTIVITIES

Communication plan. USAID Soma Umenye will submit the Year 3 Communications Plan to USAID in April.

Media engagement and mobilization. In Quarter 3, the communications team will continue to engage media and ensure project visibility on social media by regularly updating the project’s Twitter and Instagram accounts.

USAID Soma Umenye visibility. The communications team will work with the technical team and produce visibility materials when needed.

USAID Soma Umenye communications materials. The communications team will ensure all workshops or trainings that happen in Quarter 3 are well branded according to USAID branding guidelines.

ANNEX A. REPORTING AGAINST INDICATORS

Indicator Indicator Name Frequency Target Actual Number Objective: Improved literacy outcomes for children in early grades N/A (annual Percent of P1-P3 students able to read grade-level text with fluency and comprehension (outcome, data 1 annual 50% RFTOP: Deliverables 5.8 and 5.9) available Q4)1 N/A (annual data 2 Number of children whose reading outcomes in Kinyarwanda are improved annual 689,180 available Q4)2 N/A (annual data 3 (ES 1-5) Number of learners reached in reading programs at the primary level with USG assistance annual 1,378,359 available Q4) Number of education administrators and officials who complete professional development activities 4 (ES 1-12) quarterly 0 03 with USG assistance Number of men 0 0 Number of women 0 0 Bugesera 0 0 Burera 0 0 Gakenke 0 0 Gasabo 0 0 Gatsibo 0 0 Gicumbi 0 0 Gisagara 0 0

1 The actual will be determined after finalizing the EGRA report (data collected in September and October 2018). Given the likely REB revision of grade-level standards, the analysis of students meeting standards will need to be revised.
2 1.
3 The actual will be reported after completing all USAID Soma Umenye-supported training modules, which is expected to start in Quarter 3 (May 2019).


Indicator Indicator Name Frequency Target Actual Number Huye 0 0 Kamonyi 0 0 Karongi 0 0 Kayonza 0 0 Kicukiro 0 0 Kirehe 0 0 Muhanga 0 0 Musanze 0 0 Ngoma 0 0 Ngororero 0 0 Nyabihu 0 0 Nyagatare 0 0 Nyamagabe 0 0 Nyamasheke 0 0 Nyanza 0 0 Nyarugenge 0 0 Nyaruguru 0 0 Rubavu 0 0 Ruhango 0 0 Rulindo 0 0 Rusizi 0 0 Rutsiro 0 0 Rwamagana 0 0 Dean of Studies/Deputy Head Teacher 0 0 District Education Official (DEO and DDE) 0 0 Headteacher 0 0 Sector Education Officer 0 0 Soma Umenye trainers-(NRTT) 0 0 Other 0 0 Evidence-based reading instruction 0 0 Has disability 0 0 Does not have disability 0 0 4A Percent of head teachers successfully trained quarterly 0 0

SOMA UMENYE QUARTERLY PROGRAM REPORT | 53 Indicator Indicator Name Frequency Target Actual Number Number of men 0 0 Number of women 0 0 Bugesera 0 0 Burera 0 0 Gakenke 0 0 Gasabo 0 0 Gatsibo 0 0 Gicumbi 0 0 Gisagara 0 0 Huye 0 0 Kamonyi 0 0 Karongi 0 0 Kayonza 0 0 Kicukiro 0 0 Kirehe 0 0 Muhanga 0 0 Musanze 0 0 Ngoma 0 0 Ngororero 0 0 Nyabihu 0 0 Nyagatare 0 0 Nyamagabe 0 0 Nyamasheke 0 0 Nyanza 0 0 Nyarugenge 0 0 Nyaruguru 0 0 Rubavu 0 0 Ruhango 0 0 Rulindo 0 0 Rusizi 0 0 Rutsiro 0 0 Rwamagana 0 0


Indicator Indicator Name Frequency Target Actual Number IR 1: Classroom instruction in early- grade reading improved Number of primary school educators who complete professional development activities on 5 (ES 1-7) quarterly 0 04 implementing evidence-based reading instruction with USG assistance Number of men 0 0 Number of women 0 0 Bugesera 0 0 Burera 0 0 Gakenke 0 0 Gasabo 0 0 Gatsibo 0 0 Gicumbi 0 0 Gisagara 0 0 Huye 0 0 Kamonyi 0 0 Karongi 0 0 Kayonza 0 0 Kicukiro 0 0 Kirehe 0 0 Muhanga 0 0 Musanze 0 0 Ngoma 0 0 Ngororero 0 0 Nyabihu 0 0 Nyagatare 0 0 Nyamagabe 0 0 Nyamasheke 0 0 Nyanza 0 0 Nyarugenge 0 0

4 Training of P2 and P3 Kinyarwanda teachers started in Quarter 2 (January 2019). The actual result will be reported in Quarter 3 after the completion of 10 days of training. For Phase 1 of in-service teacher training, 3,986 P2 Kinyarwanda teachers (3,119 female and 867 male), 3,434 P3 Kinyarwanda teachers (2,446 female and 988 male), and 205 Kinyarwanda teachers teaching both P2 and P3 (161 female and 45 male) attended the first five days of USAID Soma Umenye teacher training. The second phase of training is planned in April 2019.

Indicator 5A (ES 1-8): Number of primary or secondary school educators who complete professional development activities on teaching students with special educational needs with USG assistance. Frequency: quarterly. Target: 0. Actual: 0 (see footnote 5). Disaggregations (men/women): Target 0 and Actual 0.

Indicator 5B (GNDR 8): Number of persons trained with USG assistance to advance outcomes consistent with gender equality or female empowerment through their roles in public or private sector institutions or organizations. Frequency: quarterly. Target: 0. Actual: 0 (see footnote 6). Disaggregations (men/women; all 30 districts; grade [P1, P2, P3]; in-service/pre-service): Target 0 and Actual 0 for every category.

5 On track to reach the Quarter 4 target of 6,633.
6 On track to reach the Quarter 4 target of 6,633.


Indicator 5C: Percent of early grade reading teachers successfully trained. Frequency: quarterly. Target: 0. Actual: 0 (see footnote 7). Disaggregations (men/women; all 30 districts; grade [P1, P2, P3]): Target 0 and Actual 0 for every category.

7 The Quarter 4 target is 90%. On track as teachers have begun training, and we expect them to complete and report in Quarter 4.

Indicator 6 (ES 4-3): Number of USG-assisted organizations and/or service delivery systems that serve vulnerable persons strengthened. Frequency: annual. Target: 1. Actual: N/A (annual data available in Q4; see footnote 8).


Sub-IR 1.1: Evidence-based, gender-sensitive early-grade reading materials available and used

Indicator 7 (ES 1-10): Number of primary textbooks and other teaching and learning materials (TLM) provided with USG assistance. Frequency: quarterly. Target: 7,706,908 (see footnote 9). Actual: 1,883 (see footnote 10).
By grade (Target / Actual): P1 7,706,908 / 1,883; P2 0 / 0; P3 0 / 0.
By district (Target / Actual): Bugesera 253,400 / 660; Burera 281,237 / 0; Gakenke 336,921 / 0; Gasabo 169,134 / 0; Gatsibo 268,440 / 0; Gicumbi 306,514 / 0; Gisagara 205,946 / 0; Huye 265,498 / 0; Kamonyi 272,313 / 0; Karongi 363,532 / 0; Kayonza 228,028 / 0; Kicukiro 90,441 / 0; Kirehe 188,353 / 0; Muhanga 324,882 / 0; Musanze 244,650 / 0; Ngoma 212,244 / 0; Ngororero 302,850 / 0; Nyabihu 276,673 / 0; Nyagatare 258,265 / 0; Nyamagabe 317,793 / 0; Nyamasheke 414,978 / 0; Nyanza 241,690 / 0; Nyarugenge 96,356 / 0; Nyaruguru 284,123 / 0; Rubavu 239,142 / 0; Ruhango 232,867 / 0; Rulindo 247,665 / 379; Rusizi 333,536 / 844; Rutsiro 282,141 / 0; Rwamagana 167,296 / 0.
By material type (Target / Actual): Student textbook 670,000 / 1,560; Teacher guide 86 / 0; Read Aloud book 86 / 0; Decodable reader 6,600,000 / 60; Leveled reader 432,136 / 260; Sets of flash cards 0 / 0; Alphabet charts 0 / 3.

8 Annual indicator reported in Quarter 4.
9 This number includes 670,000 P1 textbooks, 172 teacher guides and read-aloud books, 6,600,000 decodable readers (275,000 sets of 24), 432,136 supplementary readers (a set of 76 titles to 5,686 classrooms), and 4,600 SD cards.
10 The large difference between the target and the actual is explained by the fact that the textbook distribution to schools planned for Quarter 2 was postponed to Quarter 3 of Year 3. The Quarter 2 actual of 1,883 reflects materials provided to five schools (EP Shengampuri, EP Nkombo, GS Nkanga, GS Ishywa and GS Bugumira) in Quarter 2. Distribution of P1 Kinyarwanda textbooks to schools is ongoing, and the actual will be reported in Quarter 3 and Quarter 4 along with other P1, P2 and P3 teaching and learning materials.

Indicator 8: Percent of primary school classrooms that receive a complete set of essential reading instructional materials with USG assistance. Frequency: annual. Target: 100% (15,122 of 15,122). Actual: N/A (annual data available in Q4).

Indicator 9: Percent of observed classrooms in which children are using project-provided books. Frequency: quarterly. Target: 50%. Actual: N/A (see footnote 11). Disaggregations (grade [P1, P2, P3]; all 30 districts): Target 50% for every category; actuals not reported this quarter.

11 Data not collected in Quarter 2 as schools had not yet received the full set of essential reading instructional materials.


Sub-IR 1.2: Teachers' use of evidence-based, gender-sensitive instructional practices in early-grade reading increased

Indicator 10: Percent of teachers demonstrating essential skills in the teaching of reading. Frequency: annual. Target: 55%. Actual: N/A (annual; to be reported in QPR4).

Sub-IR 1.3: Capacity of head and mentor teachers to coach and supervise early-grade reading instruction strengthened

Indicator 11: Percent of P1-P3 Kinyarwanda teachers who report receiving adequate coaching for the implementation of an evidence-based early grade reading approach. Frequency: quarterly. Target: 0. Actual: 0 (see footnote 12). Disaggregations (men/women; grade [P1, P2, P3]; all 30 districts): Target 0 and Actual 0 for every category.

12 Coaching interventions are not yet implemented.


Indicator 12: Percent of head teachers demonstrating essential leadership skills in the support of early grade Kinyarwanda literacy in their school. Frequency: quarterly. Target: 0. Actual: 0 (see footnote 13). Disaggregations (men/women; all 30 districts): Target 0 and Actual 0 for every category.

13 The baseline for this indicator will be set in Quarter 3.

Sub-IR 1.4: Schools' and teachers' use of student assessment results improved

Indicator 13: Percent of schools (1) sharing assessment results with SGACs and (2) helping SGACs use assessment results to inform parents. Frequency: quarterly. Target: 0. Actual: 0 (see footnote 14). Disaggregations (grade [P1, P2, P3]; all 30 districts): Target 0 and Actual 0 for every category.

14 The baseline for this indicator will be set in Quarter 3.


IR 2: Systemic capacity for early-grade reading instruction improved

Indicator 14: Number of laws, policies, regulations, or guidelines developed or modified to improve primary grade reading programs. Frequency: annual. Target: 4. Actual: N/A (annual data available in Q4).

Sub-IR 2.1: National advocacy mechanisms for early-grade reading interventions strengthened

Indicator 15: Percent of agreed-on annual activities in the approved transition plan that are completed. Frequency: annual. Target: 50%. Actual: N/A (annual data available in Q4).

Sub-IR 2.2: Student and teacher performance standards and benchmarks for early-grade reading applied

Indicator 16: Number of early grade reading performance standards approved by MINEDUC. Frequency: annual. Target: 3. Actual: N/A (annual data available in Q4).

Indicator 17: Number of early grade reading teacher performance standards approved by MINEDUC. Frequency: annual. Target: 1. Actual: N/A (annual data available in Q4).

Sub-IR 2.3: Research-based policies and curricula in support of early-grade reading instruction implemented

Indicator 18: Number of scientific studies published or conference presentations given as a result of USG assistance for research programs. Frequency: annual. Target: 3. Actual: N/A (annual data available in Q4).

Sub-IR 2.4: Early grade reading assessment systems strengthened

Indicator 19: Number of times Soma Umenye-supported assessment data is cited by policymakers in official documents, presentations, or media interviews. Frequency: quarterly. Target: 0. Actual: 0. Disaggregation by channel (newspaper, radio, official document, presentation): none reported this quarter.

Sub-IR 2.5: Capacity of TTCs to prepare effective early grade reading teachers improved

Indicator 20: Number of host country tertiary education institutions receiving capacity development support with USG assistance (see footnote 15). Frequency: annual. Target: 16. Actual: N/A (annual data available in Q4).

15 Note that the target here includes Rwanda's teacher training colleges. In Rwanda, these are in fact secondary education institutions, but they perform the same function as tertiary institutions elsewhere.


ANNEX B. STUDY VISIT TO KENYA IN SUPPORT OF DIGITIZATION AND INCLUSION ACTIVITIES

INTRODUCTION The objective of USAID Soma Umenye is to improve reading outcomes in Kinyarwanda for at least 1 million children in public and government-aided schools in Rwanda. Specifically, Soma Umenye will target all children in Grades 1-3 attending public and government-aided schools nationwide and ensure that at least 70 percent of students are able to read grade-level text with fluency and comprehension.

As Soma Umenye explores ways of supporting the Rwanda Education Board’s ambition to digitize education materials so that they are accessible and inclusive for all students, a team composed of staff from Soma Umenye, the Rwanda Education Board (REB), and stakeholders from the Rwandan disability community conducted a study visit to Nairobi, Kenya in March 2019.

TRIP REPORT

PURPOSE OF TRIP

The government of Rwanda has a vision to digitize materials in the education sector and to ensure that education materials are inclusive and accessible to all students. ICT-based education is part of Rwanda's Vision 2020, and this is further reflected in the ICT in Education Policy (Ministry of Education, 2016). This policy also stresses the role that digital learning can play in ensuring that students are ready and competitive for the global workforce. In addition to integrating ICT in education, the government of Rwanda has demonstrated commitment to supporting students with special educational needs (SEN) and those with disabilities. This commitment is reflected in Strategic Priority Number 7 of the Ministry of Education's Education Sector Strategic Plan Framework for basic education (ESSP) 2018/19-2023/24, and specifically Outcome 7.2, which has a stated goal to "increase the participation and achievement of children with disabilities and SEN at all levels of education." Furthermore, the recent approval of the Revised Special Needs and Inclusive Education Policy (2019) demonstrates Rwanda's commitment to supporting inclusive education for all students.

Kenya has made significant strides in digitizing education content for primary school students and ensuring that this material is accessible for all students, including those with disabilities. Specific examples of using digital content to increase access for students with disabilities (including those who are blind and deaf) are discussed in the remainder of this report. The specific objectives of this trip included meeting with and gaining insight from the following stakeholders to inform answers to the following questions: (1) how can digital materials support learning outcomes for all students, and (2) how can digital materials support learning outcomes for students with disabilities?

• Policy: representatives from the government of Kenya, including the Ministry of Education (Directorate of Special Needs Education) and the Kenya Institute of Curriculum Development;
• Providers: organizations producing and using digital content, including content specifically designed for students with disabilities; and
• End-Users: students and teachers incorporating digital content and inclusive education practices into their classrooms.

For USAID Soma Umenye to effectively explore the intersection of digitization and inclusion, it was critical to involve key Rwandan stakeholders in the study visit to Kenya. These stakeholders included a representative from REB and several representatives from Rwanda’s disability community (the Rwanda National Union of the Deaf and the National Union of Disability Organizations in Rwanda). The following individuals participated in the study visit to Kenya:

• Rodgers Kabamba: Cloud Solution Specialist, Rwanda Education Board
• William Safari: Education for All Project Manager, National Union of Disability Organizations in Rwanda
• Eric Tuyishime: Education for All Project Officer and Guide, National Union of Disability Organizations in Rwanda (NUDOR)
• Augustin Munyangeyo: Chairperson, Rwanda National Union of the Deaf (RNUD)
• Theophile Binama: Sign Language Program Coordinator and Interpreter, Rwanda National Union of the Deaf
• Martin Arabaruta: Early Grade Reading Advisor, USAID Soma Umenye
• Winnie Muhumuza: Gender and Inclusion Advisor, USAID Soma Umenye
• Kate Brolley: Short-Term Special Projects Coordinator, USAID Soma Umenye

OVERVIEW AND FINDINGS

In line with the specific trip objectives listed above, the study visit team met with organizations working on these issues from the policy, provider, and end-user perspectives in order to get a comprehensive understanding of digitization and inclusion in Kenya. The list of organizations and individuals can be found in the table below, and an overview of each organization can be found in Appendix B-2.

Policy
• Kenya Institute of Curriculum Development: John Kimotho, Director of Educational Media; Charles Munene, Director of e-Learning; Grace Mwiti, ICT in Education
• Ministry of Education (Directorate of Special Needs Education): Fred Haga, Director of Special Needs Education; John Kimani, Special Needs Education Officer

Providers
• eKitabu: Will Clurman, CEO; Michael Ng'eno, Senior Program Manager; Mercy Musyoki, Marketing Manager; Georgine Auma, Studio KSL Director
• eLimu: Sam Rich, Co-Founder
• UNICEF/Kenya: Florian Rabenstein, Education Officer (Office of Innovation); Rolando Villamero, Education Officer (Inclusive Education and Disability)
• VSO/Kenya: Catherine Mwangi, Education Program Manager; Fredrick Odinga, Deaf Inclusion Specialist; Rose Nyagwoka, Deaf Child Worldwide Representative

End-Users
• Kambui Primary School for the Deaf: Connie Mutiso, Head Teacher; Grade 1, 2, 3 and 8 Teachers; Grade 3 Students (age range: 9-14)

Kenyan Policy Context

1. Digitization

The ICT Integration in Primary Education project (also known as the Digital Literacy Program), is one of the government of Kenya’s flagship education programs. The goal of this program is to integrate ICT into the curriculum in all primary schools across Kenya. Key aspects of this ongoing program include: (1) improvement of ICT infrastructure; (2) development of digital content; (3) strengthened teacher capacity to deliver digital content; and (4) procurement and distribution of ICT devices. Thus far, the government has provided laptops to teachers, tablets to students, and special technology devices (such as Braille embossers that can translate and print a braille version of a text) to schools. The initial distribution of these devices started in 2016 and 2017.

John Kimotho, Director of Educational Media at the Kenya Institute of Curriculum Development (KICD), stated that it is the Ministry of Education’s vision to ensure that every book is available in both print and digital form. To achieve this, KICD established the Kenya Education Cloud two years ago. UNICEF provided support for the initial set-up by sponsoring five servers that are currently housed at KICD. The Cloud is an online portal for the submission of digital content by publishers. Publishers also have access to digital content standards that KICD has developed. Once submitted, content is then evaluated and approved by KICD for use in schools. Teachers also have access to this content, once validated. So far, KICD has validated over 300 titles, including textbooks and supplementary readers, that have been submitted via the Kenyan Education Cloud. While the Cloud is currently available in all East African countries (with the majority of books free of charge and in a mix of languages), Mr. Kimotho noted that it would be better to eventually create one harmonized East African Cloud.

2. Inclusion

In 2018, the government of Kenya released a revised special needs education policy called the Sector Policy for Learners and Trainees with Disabilities. This revised sector policy aligns with the Kenya Vision 2030, the Constitution of Kenya, Sustainable Development Goal 4, and the national curriculum reform efforts (namely the ongoing transition to a competency-based curriculum). The policy places a large emphasis on inclusive education, as well as on early identification and assessment as key components in the provision of quality education for students with disabilities. The policy is being disseminated at all levels, from early childhood education to tertiary education, across Kenya's 47 counties.

The Directorate of Special Needs Education within the Ministry of Education is primarily tasked with implementing the Sector Policy for Learners and Trainees with Disabilities. Even though the Directorate is a relatively new department (less than two years old), it also played an instrumental role in drafting the policy. Director Fred Haga, who leads this department, stated that his goal is to collaborate with other departments in the Ministry of Education to harmonize interventions and ensure the provision of reasonable accommodation for students with disabilities. Director Haga also noted that he is trying to advocate to the Ministry of Education leadership that disability and inclusion become required components of any future education project in Kenya.

Currently, there are three types of schools in Kenya: special, inclusive/integrated, and mainstream. According to Director Haga, there are around 300 special schools in Kenya, which are government-funded and serve approximately one-third of all students with disabilities. Currently, inclusive/integrated schools are essentially no different from mainstream schools, though they incorporate a resource room to provide extra help for students with disabilities. Mainstream schools are those that provide little to no tailored support for students with disabilities.

Director Haga noted that, in the implementation of Kenya’s revised policy, the Ministry of Education learned from its experiences with the previous special needs education policy of 2009. Director Haga stated that the old policy was very broad. It included many types of children with specific educational needs (such as refugees, street children, and children who are gifted and talented). However, it failed to give proper attention to students with disabilities. The revised policy has a different scope and it covers only learners and trainees with disabilities. Fredrick Odinga, a Deaf Inclusion Specialist with VSO Kenya, also noted that the previous policy lacked a robust implementation plan and framework, which hindered execution and application. The study visit participants found this last point particularly interesting given the fact that Rwanda recently approved a revised special needs education policy and conversations are ongoing about how partners and stakeholders can support the government of Rwanda in implementing the policy. This point underscores the need for a thorough implementation framework.

Key Themes

Over the course of the week, several themes emerged related to digitization and inclusion. There were also several cross-cutting themes that looked at the intersection of both digitization and inclusion. These themes are outlined in the table below and further explained in the rest of this report.

Digitization themes: Infrastructure (technology capacity and connectivity); Use and management; Teacher capacity
Inclusion themes: Special education training; Resources; Adapted materials
Cross-cutting themes: Universal Design for Learning principles; Phased implementation; Government ownership

1. Digitization

Infrastructure (technology capacity and connectivity). The government of Kenya has invested significant resources to strengthen infrastructure at both the government level and the school level. On a tour of their facilities, KICD showed the study visit team (1) the Kenya Education Cloud servers; (2) computer labs for creating and validating digital content; (3) TV program studio; and (4) radio program studio. As mentioned above, KICD hosts both a public and private version of the Kenya Education Cloud. The Cloud is supported by five large servers, with initial funding provided by UNICEF. The public Cloud enables individuals (students, teachers, parents, publishers) to access KICD-approved education content. KICD also has well- equipped computer labs that are used to create and validate digital content. New MacIntosh computers enable KICD staff to create graphics and video/audio overlays for digital content in house. Finally, KICD produces an educational TV and radio program. The TV program runs 24/7 and the episodes are designed to supplement the curriculum. All of the production, from recording to editing to producing, is done at the KICD facility. While the TV program initially started as a World Bank-funded project, it is now fully-funded by the government of Kenya. The radio program, similar to the TV program, supplements the curriculum and aligns with class times. While the content is created by KICD, the program is broadcast by the Kenya Broadcasting Agency to ensure national coverage (the government of Kenya also purchased radios for all primary schools). Charles Munene, Director of e-Learning at KICD, stated that their goal is to ensure that all episodes from both the TV and radio programs (past and present) are available on the public Kenya Education Cloud.

At the school level, the government of Kenya has also made investments to ensure that schools are technologically capable of receiving and integrating digital content. In addition to distributing tablets to primary school students, the government of Kenya has improved access to electricity and the internet for schools.

Photo: Study tour participants on a tour of KICD's in-house TV studio. (Credit: USAID Soma Umenye)

KICD noted that all primary schools are now connected to electricity. In order to connect schools that previously were not connected, KICD worked with electrical companies to ensure access to power. Even with this investment in schools, Mr. Kimotho of KICD noted that two main challenges remain. The first is that transformers often blow at schools, and this takes a considerable amount of time to fix. The second is that schools now have bills for electricity and, in some cases, the internet. Mr. Kimotho stated that 3G connectivity exists almost all over Kenya; however, schools need to purchase data bundles if they want to access the internet. With regards to electricity bills, Mr. Kimotho noted that Kenya does take advantage of the Universal Service Fund; however, this is only being used in approximately 1,000 primary schools.

Given the current context in Kenya, several providers noted the importance of designing digital content to fit within the current infrastructure. For example, eLimu, eKitabu, and UNICEF stressed the need to ensure that all content is accessible offline. Additionally, eLimu and eKitabu recommended designing both a tablet and an Android version of content. The Android version is particularly useful for parents to use with their children at home. Finally, UNICEF — which is designing an accessible digital textbook with KICD — noted that the specifications of the tablets in schools often impact and, in some cases, limit the type of digital content that they can design. This is especially true if the content includes a lot of video and other media.

Use and management. The government of Kenya has provided and distributed electronic devices to nearly all government-funded primary schools, with the goal of having a 1:1 student to device ratio. However, Mr. Kimotho at KICD notes that the uptake and use of these devices has been slow. Recent KICD estimates place national utilization rates around 35%. While Mr. Kimotho stated that KICD is still exploring the reasons behind these low utilization rates, he suggested that part of the reason could be a lack of buy-in by head teachers and school leadership. Head teachers are responsible for managing the devices in schools. However, they have not received specific training on how to use or encourage the use of these devices. Currently, only teachers have received training (please see the section on “teacher capacity” for additional information on teacher training). Therefore, head teachers can sometimes act as a roadblock to the integration of ICT and digital content in their classrooms. Sam Rich, co- founder of digital content provider eLimu, echoed Mr. Kimotho by providing first-hand accounts of head teachers being resistant to using digital content.

To encourage greater use and adoption of digital materials, Will Clurman, CEO of digital content provider eKitabu, noted that his organization developed two incentive programs: (1) the iTOYA program (ICT Innovation Teacher of the Year Award) and (2) a digital essay competition for students. The iTOYA award was developed in conjunction with the Teachers Service Commission; the Center for Mathematics, Science and Technology Education in Africa; and the Kenya Secondary Schools Heads Association. The digital essay competition is open to all students in upper primary and secondary schools, and winners receive a computing device and scholarship funds. Both eKitabu and eLimu also stressed the importance of ensuring the right messaging to teachers. In their experience, some teachers feel as though they will eventually be replaced by digital content. Therefore, they are reluctant to learn how to use it. It is important for teachers to understand that digital content should be used to supplement the work they already do; it should not be a replacement. While KICD’s goal is to distribute laptops to teachers, they haven’t completed this distribution. eLimu strongly recommended that this intervention (laptops for teachers) continue as another way to increase teacher capacity and buy-in.

Teacher capacity. Related to use and management, teacher capacity also plays a strong role in the effective use of digital content in Kenyan primary schools. This sentiment was echoed by almost every organization that the study team met with in Nairobi. Mr. Kimotho of KICD explained Kenya's current teacher training model. This involves training one teacher per school on using digital devices with the goal that this teacher will then train other teachers at his/her school. Even Mr. Kimotho noted that this model has not been particularly successful and that teachers require refresher training and follow-up support. UNICEF Kenya also stated that the current teacher training model is not as effective as it could be and may be part of the reason why utilization rates are low.

Mr. Rich of eLimu spoke further about the government of Kenya’s teacher training model for digital content. He noted that when the government of Kenya started distributing tablets and digital content to primary schools, they conducted a two-day training for teachers on how to use the devices. However, the training (conducted via PowerPoint) was not hands-on, and it lacked important practical components. Currently, eLimu conducts a two-week teacher training prior to the distribution of their digital content in primary schools. This training is done in schools, and it targets two teachers per school. The training is practical/hands-on and when the teacher goes into the classroom for the first time to use digital content with students, he/she is supported by a training member of the eLimu team. eLimu also brought on volunteers from the community to assist and support teachers when they are using the tablets after the training. However, eLimu recognizes that it’s often difficult for teachers to have a full two weeks available for training. As a result, eLimu is currently exploring ways to minimize teacher training. One of their proposed interventions is to create video clips on essential concepts that teachers need to know about digital content, including where to store tablets, how to explain to students what they will do on the tablets, and how to get students on-task quickly. These video clips will be shared with teachers via WhatsApp groups that eLimu has already established.

2. Inclusion

Special education training. In Kenya, most training for teachers on special needs and inclusive education is in-service training. Given the recent Sector Policy for Learners and Trainees (2018) and its emphasis on building teacher capacity in the area of inclusive pedagogy, the Ministry of Education, UNICEF, and VSO all stated the importance of conducting in-depth training for teachers on new tools and techniques to reach students with disabilities. While the majority of this training is in-service training, the Ministry of Education noted that this model is challenging. Specifically, Fred Haga, in charge of the Directorate of Special Needs Education at the Ministry of Education, stated that it is difficult, expensive, and unsustainable to bring teachers in for an intensive training on inclusive education. As a result, the Ministry of Education is working to include a much larger component on inclusive education during pre-service training. As part of strengthening pre-service training, VSO is working with teacher training colleges and the Kenya Institute of Special Education to incorporate specific training on Kenyan Sign Language.

Still, it was noted that there is a difference between training general education teachers on inclusive education techniques and having specialized teachers who have specific training in Braille and sign language. Both are important and necessary in order to ensure access to quality education for all students. This was particularly evident at the Kambui Primary School for the Deaf, where teachers who were not deaf had received specialized training in Kenyan Sign Language. Some of these teachers were even graduates of Kambui Primary School.

The issue of special education training extends beyond teachers. VSO and the Kambui School for the Deaf both emphasized the importance of providing specialized training to parents of children with disabilities. Catherine Mwangi, VSO’s education program manager, stated that for deaf children, their learning often stops after school because they are unable to communicate with their parents. VSO and the Kambui School for the Deaf have set up programs to train parents of deaf children and equip them with a basic level of Kenyan Sign Language. At the request of parents, VSO has also set up support groups so that parents can share advice and help each other.


Resources. Kenya seems to have a lot of political will when it comes to implementing inclusive education and, specifically, the Sector Policy for Learners and Trainees with Disabilities (2018). However, the identification and adoption of appropriate resources remains a challenge. Director Haga of the Ministry of Education noted that the new policy is ambitious and some of the proposed interventions are expensive. He also noted that learning outcomes can be better in inclusive settings; however, this presupposes that schools are well-provisioned and teachers are properly trained. His team is currently thinking through activities that can be done and resources that can be provided at little or no cost. Unfortunately, he was not able to share specific examples of this with the study visit team.

Additionally, Director Haga and Fredrick Odinga, a deaf inclusion specialist at VSO, both noted that inclusive schools face specific challenges when it comes to resource rooms/special education units. Often these units are staffed by teachers who have little to no training in special needs education (and certainly not in Kenyan Sign Language or Braille). Based on numerous site visits to inclusive schools, Mr. Odinga stated that often these teachers end up finding ways to keep the students busy. However, they don’t have the skills to provide them with specialized education. To tackle this challenge, VSO recruited deaf teachers to teach in selected special units. These deaf teachers also trained some of the hearing teachers on basic sign language skills. Initially, VSO paid the salaries for the teachers; however, recently the government of Kenya started employing and paying the salaries of these teachers.

Another challenge related to resources for inclusive education has to do with Kenya’s Education Assessment and Resource Centers (EARCs). EARCs were established in Kenya in 1984 and there are currently 47 across the country (one in every county). The overall goal of these centers is to provide placement and appropriate education services for students with special education needs. Director Haga explained that identification of students with disabilities is usually done first at the community and school level. If, at that stage, it is determined that a student will require additional and specialized support, he/she is referred to an EARC. At the EARC, the student should theoretically undergo a thorough assessment conducted by a multi- disciplinary team, including experts from the education and medical fields. After this assessment, the EARC team will assist the student’s teachers in developing an Individualized Education Program (IEP) for the student and will refer him/her to an appropriate school. In certain cases, the student may be referred to a special school or he/she may be referred to an inclusive school. During our school site visit to Kambui Primary School for the Deaf, the head teacher noted that the school only accepts students who have been referred by an EARC. However, Director Haga stressed the challenge of ensuring that all 47 EARCs are appropriately staffed and resourced. He noted that it is difficult to find enough teachers who are trained in special education and health specialists to work at these centers.

Adapted materials. In addition to qualified teachers, students with disabilities need access to a diverse set of materials in order to improve their literacy. For many students with disabilities, adapted materials can include audiobooks, texts in Braille, or texts in large print. Several stakeholders in Kenya emphasized that, regardless of the specific format, materials for students with disabilities should incorporate universal design for learning (UDL) principles (see discussion below for more information). Director Haga of the Ministry of Education stated that there is a lack of appropriate materials for students with disabilities in schools, including special schools. VSO and UNICEF noted that sometimes publishers are reluctant to produce teaching and learning materials for students with disabilities because of the higher production cost and lower return on investment.


3. Cross-Cutting Themes (Digitization and Inclusion)

Universal design for learning principles. Universal Principles of UDL design for learning (UDL) emerged as a significant cross-cutting issue when speaking with both digital and • Use multiple means of engagement: inclusive education–focused organizations. The Ministry they “why” of learning • Use multiple means of representation: of Education, KICD, UNICEF, and eKitabu all the “what” of learning specifically referenced the importance of incorporating • Use multiple means of action and UDL principles when designing and producing digital expression: the “how” of learning content for all students, including those with disabilities. UDL recognizes that all children learn differently and benefit from differentiated learning techniques in the classroom. Through this approach, there is a strong emphasis on access for all students (including those who are exceeding expectations, those who are falling behind, and those with disabilities). No assumptions are made regarding the needs of specific students. Rather, students themselves can play a role in choosing the best learning format for them. It should be noted that both KICD and UNICEF acknowledged that the majority of primary school teachers often have large classes and few resources. Therefore, one of the goals of incorporating UDL principles into digital content is to decrease the burden on the teacher. With this type of digital content, which is accessible to students regardless of their specific learning need, the teacher is able to use one digital program to reach a range of students. USAID also recently published a comprehensive toolkit for using UDL to improve literacy specifically for students with disabilities.

The integration of UDL principles and digital content into the Kenyan curriculum was clear in conversations with both UNICEF and KICD. UNICEF is developing an accessible digital textbook with KICD. However, Florian Rabenstein, in the Office of Innovation at UNICEF, noted that KICD is really driving the project. Its goal is to ensure that textbooks are accessible to all students in Kenya, particularly those with disabilities, in alternative formats (as no printed book can offer all the necessary features to ensure access for all users). For students with disabilities, the project is initially focusing on those with hearing, visual, and intellectual impairments.

Mr. Rabenstein of UNICEF and Mr. Kimotho of KICD both noted that the pilot version of this accessible digital text includes versatile formats and allows students to customize and combine diverse features like narration, sign language, interactivity, audio-description of images, and other functions to suit different preferences, learning styles, or access needs. Both UNICEF and KICD also noted that accessible digital textbooks allow children with different learning styles to access the same content, participate in the same textbook-based activities inside and outside the classroom, and have equal opportunities to achieve positive educational outcomes. Mr. Rabenstein also stated that design on this textbook is anchored in UNICEF’s principles of digital innovation (the full set of principles can be found in Appendix B-3).

While Mr. Kimotho of KICD noted that it is the government of Kenya's vision to ensure that all textbooks are designed with UDL principles in mind, this does not come without its challenges. UNICEF noted that the accessible digital textbook project is often perceived by publishers as a project solely for students with disabilities. Even though this is not the case and the textbook is designed to improve learning for all students, there is still some resistance from content designers with regards to incorporating UDL principles into the content design.

Photo: Study tour participants at the UN Compound in Nairobi following a meeting with UNICEF on the accessible digital textbook project. (Credit: USAID Soma Umenye)

The UNICEF/KICD accessible digital textbook is still in the pilot phase. A small portion of the textbook was tested with students and teachers in early March, and the team is now incorporating their feedback to produce a revised version. Even though this particular textbook is still in the pilot phase, the visit to the Kambui Primary School for the Deaf allowed study visit participants to see accessible digital content in action. Grade 3 students at this government-funded school were using tablets to view digital content from eKitabu's Studio KSL (Kenyan Sign Language) program. Studio KSL is a program developed by eKitabu using USAID Tusome's supplementary readers. In this particular class, students were interacting with a digital version of a Grade 3 English book. Each page of the print version of the book appeared on a single panel on the tablet. There was a picture in the background, and the sentence appeared at the bottom of the screen. A voice narrated the sentence, and each word was highlighted in a different color as it was read. On one side of the screen, there was a video overlay of a student signing the sentence in Kenyan Sign Language. At the end of the story, there were interactive questions to test reading comprehension, and there was a glossary of all words in the story. This format was useful for both profoundly deaf and hard of hearing students in the class. All children in the class had a chance to interact individually with the content on their tablets and as a group with the teacher projecting the digital images on the wall. As the students made their way through the story, they practiced signing each word. The teacher guided the students through the questions at the end of the story, and he was able to adapt the level of difficulty of some questions based on the ability of the student. In particular, the teacher noted that it was not only easier to teach using the digital content, but that it also enabled a greater level of student interaction with the text.
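To make the content format just described more concrete, the sketch below models, in TypeScript, how one page of such an accessible digital book could be represented. This is purely illustrative: the type and field names are hypothetical assumptions and do not come from eKitabu, Studio KSL, or the UNICEF/KICD project; they simply show how narration timing, a sign-language video overlay, comprehension questions, and a glossary can coexist in a single content structure.

```typescript
// Hypothetical data model for one page of an accessible digital storybook.
// All names are illustrative assumptions, not an actual product schema.

interface NarrationTiming {
  word: string;     // word as it appears in the sentence
  startMs: number;  // when highlighting of this word begins
  endMs: number;    // when highlighting of this word ends
}

interface BookPage {
  backgroundImage: string;              // picture shown behind the text
  sentence: string;                     // sentence displayed at the bottom of the panel
  narrationAudio: string;               // audio file narrating the sentence
  narrationTimings: NarrationTiming[];  // drives word-by-word highlighting
  signLanguageVideo?: string;           // optional sign-language video overlay
}

interface ComprehensionQuestion {
  prompt: string;
  choices: string[];
  answerIndex: number;
}

interface AccessibleBook {
  title: string;
  pages: BookPage[];
  questions: ComprehensionQuestion[];   // asked at the end of the story
  glossary: Record<string, string>;     // words in the story, with a gloss
}

// A one-page example in the spirit of the Grade 3 book described above.
const sampleBook: AccessibleBook = {
  title: "Sample Grade 3 story (hypothetical)",
  pages: [
    {
      backgroundImage: "page1.png",
      sentence: "The goat eats grass.",
      narrationAudio: "page1.mp3",
      narrationTimings: [
        { word: "The", startMs: 0, endMs: 300 },
        { word: "goat", startMs: 300, endMs: 800 },
        { word: "eats", startMs: 800, endMs: 1200 },
        { word: "grass.", startMs: 1200, endMs: 1800 },
      ],
      signLanguageVideo: "page1_sign.mp4",
    },
  ],
  questions: [
    { prompt: "What does the goat eat?", choices: ["Grass", "Bread"], answerIndex: 0 },
  ],
  glossary: { goat: "a farm animal that gives milk", grass: "green plants that cover the ground" },
};

console.log(`${sampleBook.title}: ${sampleBook.pages.length} page(s)`);
```

Because every accessibility feature lives in the same structure, a single file can serve deaf, hard of hearing, and hearing learners alike, which is the core UDL idea the study team observed in practice.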

Phased implementation. Regarding the integration of digital content in schools and a system-wide shift towards inclusive education, many stakeholders noted the importance of a phased approach to implementation. For digital content specifically, KICD, UNICEF, eKitabu, and eLimu noted that there should have been more attention given to planning for teacher (and head teacher) training and that this should happen in conjunction with the distribution of tablets and digital devices. A more manageable way to do this is through phased implementation.

With regards to inclusion, Director Haga of the Ministry of Education emphasized that a process like this takes time. The overarching goal of the Sector Policy for Learners and Trainees with Disabilities (2018) is to transition towards inclusive education in Kenya. Director Haga noted that Kenya's vision of inclusive education is a complete integration of students with disabilities into classrooms and schools. However, it is critically important to ensure that the implementation plan is appropriately phased. The Ministry of Education recognizes that one of the biggest disservices an education system can do is to place a student with a disability into an inclusive or mainstream school without the appropriate support system. This thinking is reflected in the 2018 policy, which states the need to specifically maintain special schools while transitioning towards inclusive education.

Government ownership. All stakeholders in Kenya emphasized the importance of government ownership in all activities regarding digitization and inclusion. Specifically, several interventions that started off as donor-funded projects have now been fully absorbed by the government of Kenya. For example, the World Bank initially funded KICD’s educational TV program; now the government funds this project. VSO hired deaf teachers to provide support to students with hearing impairments in inclusive schools and now these teachers are government employees. Special schools in Kenya are also funded by the government. Additionally, the government of Kenya provided the funding to procure and distribute digital devices to students and teachers.

Even with projects that are not fully-funded by the government, the government is still playing an active role. For example, UNICEF stated that KICD is driving most of the technical decisions for the accessible digital textbook project. Mr. Rich of eLimu also noted that he has seen success when projects create space for government representatives to observe activities. In this case, eLimu invited the government to observe teacher training and the pilot phase of the rollout of their literacy app. This created buy-in from the government.

RECOMMENDATIONS MOVING FORWARD

The primary objective of this study visit was to explore ways of harnessing digital content to improve educational access for students, including those with disabilities. The themes in the section above highlight some of the key considerations for creating an enabling environment to integrate digitization and inclusion.

The recommendations below are separated into two categories. The first set of more general recommendations is geared towards all study visit participants (the Rwanda Education Board, the Rwanda National Union of the Deaf, the National Union of Disability Organizations in Rwanda, and USAID Soma Umenye). The second set includes more specific actions that Soma Umenye can continue to explore.

General recommendations for using digital content for students with disabilities:

• At the systems level, there needs to be a commitment to ensuring that the infrastructure (for example, power, access to electricity) is in place to support digital content in schools.
• Explore ways of designing digital content so that it is accessible for all students, not just those who have disabilities. One way to do this is to think about UDL principles as the digital application is being designed (examples include: sign language video overlay for deaf students; larger text and audio options for students who have low vision; inclusion of a variety of exercises/activities to test comprehension for students at all levels).
• Providing digital textbooks does not negate the need to provide access to supplementary materials for students with disabilities. These materials could include texts in Braille, large print readers, etc. Additionally, digital content is only effective in supporting the learning of students with disabilities if it is accompanied by teachers who have been trained in special needs and inclusive education methodologies.
• Emphasis should still be placed on ensuring that teachers are trained in sign language, Braille, and general special education pedagogy.
• Teachers and students must be trained and prepared to use digitally accessible content.
• Co-design and test digital content with students, teachers, and experts to ensure that it is accessible and user-friendly.

Recommendations for Soma Umenye:

As Soma Umenye seeks to support REB in the areas of digitization and inclusion, there are several ideas from Kenya that might be appropriate to contextualize to Rwanda.

• Determine the technology that is available to REB (for both students and teachers). On the study visit, Rodgers Kabamba (ICT Officer with REB) informed the group that REB is about to distribute 2,448 new tablets to selected primary schools. Soma Umenye should request to see a sample tablet in order to gauge specifications and capacity. Soma Umenye should also request the distribution list and the distribution timeline.
• In addition to digitizing the teacher's guide, Soma Umenye should explore the possibility of digitizing a student text and incorporating UDL principles into the digital design to make it accessible for all students, including those with disabilities. Incorporating these principles (similar to what the group saw in action in Kenya) would benefit not only students with disabilities but also students who may need extra support to reach fluency levels. Proposed steps under this activity could include:
— Determine the content to adapt.
— Select pilot schools. These could be a small sample of the schools to which REB is distributing new tablets, or selected model/inclusive schools.
— Develop a storyboard for the text to determine a list of adaptations needed. This should be done in consultation with REB, teachers, experts, and students with disabilities. A list of sample digital adaptations can be found in Appendix B-4.
— Based on adaptations, determine the technical expertise required to create the digital content (such as short-term technical experts or a subcontractor).
— Test and validate the content in pilot schools and provide training to teachers/head teachers. This step is particularly important given the lessons from Kenya to constantly adapt.
— Incorporate feedback and revise the content if necessary.
— Finalize the user's guide for the teacher.


Appendix B-1. Trip Itinerary March 11 – 15, 2019

Day 1: March 11, 2019
• 11:00am: Arrive in Nairobi, Kenya
• 1:00pm: Arrive at hotel
• 2:00pm–4:30pm: Meet with eKitabu team
• 5:00pm–6:00pm: Group introduction and overview of the week

Day 2: March 12, 2019
• 9:00am–11:00am: Meet with Fred Haga, Deputy Director in the Directorate of Special Needs Education (MoE)
• 3:00pm–5:00pm: Meet with John Kimotho, Director of Educational Media, Kenya Institute of Curriculum Development
• 5:30pm–6:00pm: Group de-brief and reflection

Day 3: March 13, 2019
• 11:00am–12:30pm: Meet with eLimu
• 2:00pm–4:00pm: Meet with UNICEF
• 5:00pm–6:00pm: Group de-brief and reflection

Day 4: March 14, 2019
• 9:00am–11:00am: Tour KICD facilities with Charles Munene, Director of E-Learning
• 2:00pm–4:00pm: Meet with VSO Kenya
• 5:00pm–6:00pm: Group de-brief and reflection

Day 5: March 15, 2019
• 9:00am–1:00pm: Tour and interact with students using digital content at the Kambui School for the Deaf
• 2:00pm–3:00pm: Final group wrap-up and next steps
• 4:00pm: Depart hotel for the airport

Appendix B-2. Overview of Organizations and Individuals

eKitabu: eKitabu is a Kenya-based organization that specializes in designing digital education content. Since its founding in 2012, eKitabu has brought digital content to over 1,500 schools across all 47 counties of Kenya and 13 African countries. The organization also specializes in producing digital educational content for students with hearing and visual impairments.

eLimu: Founded in 2012, eLimu is a Kenya-based organization that develops digital content that is aligned with the Kenyan curriculum. The organization recently developed an app (Hadithi! Hadithi!) designed to improve literacy rates in the first two years of primary school. The app includes stories written by local teachers and games to reinforce learning. eLimu also conducts extensive teacher training courses to give teachers the confidence to use technology in the classroom.

Fred Haga: Mr. Haga is currently the acting director of special needs education in the Ministry of Education in Kenya. He led the Technical Committee that steered the review of the National Special Needs Education Policy Framework of 2009 that resulted in the Sector Policy for Learners and Trainees with Disabilities in May 2018. Mr. Haga has held several leadership roles within the disability-rights movement for over 23 years. Mr. Haga also presented at the 2018 Inclusive Education in Africa workshop.

Kenya Institute of Curriculum Development: The Kenya Institute of Curriculum Development (KICD) has been developing and producing digital content that aligns with the Kenyan national education curriculum. KICD has also developed and published standards for digital content for content producers and publishers in Kenya.

UNICEF Kenya: UNICEF Kenya recently piloted the first digital textbook (for students with visual and hearing impairments) incorporating universal design for learning (UDL) principles. UDL, supported by USAID, is based on the premise that children learn in different ways and therefore it is important that learning environments, curricula, and materials align with student learning differences.

VSO Kenya: VSO has been working in Kenya since 1959 and primarily focuses on three activities: education, health, and livelihoods. VSO Kenya’s inclusive education program works with disadvantaged and marginalized children and their communities. It aims to increase access to inclusive quality education and lifelong learning. The inclusive education program has three main goals: (1) Enhanced conducive, safe, friendly formal and non-formal learning environments for delivery of inclusive education; (2) Communities, parents, and children actively engaged in education and promoting the safety of disadvantaged and marginalized children; (3) Effective implementation of inclusive education policies by the government.

Kambui School for the Deaf: Kambui School for the Deaf is a public, government-funded special school in Kiambu County. For the past year, the school has been using digital content to improve literacy rates for deaf children in the lower primary grades. The school uses tablets and laptops provided by the government of Kenya.


Appendix B-3. UNICEF Principles of Digital Innovation

PRINCIPLES FOR INNOVATION AND TECHNOLOGY IN DEVELOPMENT

These principles are not intended as hard-and-fast rules but as best-practice guidelines to inform the design of technology-enabled development programs.

1. DESIGN WITH THE USER

• Develop context appropriate solutions informed by user needs. • Include all user groups in planning, development, implementation, and assessment. • Develop projects in an incremental and iterative manner. • Design solutions that learn from and enhance existing workflows and plan for organizational adaptation. • Ensure solutions are sensitive to, and useful for, the most marginalized populations: women, children, those with disabilities, and those affected by conflict and disaster.

2. UNDERSTAND THE EXISTING ECOSYSTEM

• Participate in networks and communities of like-minded practitioners. • Align to existing technological, legal, and regulatory policies.

3. DESIGN FOR SCALE

• Design for scale from the start, assess and mitigate dependencies that might limit ability to scale. • Employ a systems approach to design, considering implications of design beyond an immediate project. • Be replicable and customizable in other countries and contexts. • Demonstrate impact before scaling a solution. • Analyze all technology choices through the lens of national and regional scale. • Factor in partnerships from the beginning and start early negotiations.

4. BUILD FOR SUSTAINABILITY

• Plan for sustainability from the start, including planning for long-term financial health, i.e., assessing total cost of ownership. • Utilize and invest in local communities and developers by default and help catalyze their growth. • Engage with local governments to ensure integration into national strategy and identify high-level government advocates.

5. BE DATA DRIVEN

• Design projects so that impact can be measured at discrete milestones with a focus on outcomes rather than outputs.

• Evaluate innovative solutions and areas where there are gaps in data and evidence. • Use real-time information to monitor and inform management decisions at all levels. • When possible, leverage data as a by-product of user actions and transactions for assessments.

6. USE OPEN STANDARDS, OPEN DATA, OPEN SOURCE, AND OPEN INNOVATION

• Adopt and expand existing open standards.
• Use open data and functionalities and expose them in documented APIs (Application Programming Interfaces) where use by a larger community is possible.
• Invest in software as a public good.
• Develop software to be open source by default with the code made available in public repositories and supported through developer communities.

7. REUSE AND IMPROVE

• Use, modify and extend existing tools, platforms, and frameworks when possible.
• Develop in modular ways, favoring approaches that are inter-operable over those that are monolithic by design.

8. ADDRESS PRIVACY AND SECURITY

• Assess and mitigate risks to the security of users and their data.
• Consider the context and needs for privacy of personally identifiable information when designing solutions and mitigate accordingly.
• Ensure equity and fairness in co-creation, and protect the best interests of the end-users.

9. BE COLLABORATIVE

• Engage diverse expertise across disciplines and industries at all stages.
• Work across sector silos to create coordinated and more holistic approaches.
• Document work, results, processes and best practices and share them widely.
• Publish materials under a Creative Commons license by default, with strong rationale if another licensing approach is taken.

UNICEF innovation principles have been endorsed or adopted by the following partners: UNICEF, WHO, HRP, USAID, Gates Foundation, EOSG Global Pulse, WFP, OCHA, UNDP, SIDA, IKEA Foundation, UN Foundation, and UNHCR.


Appendix B-4. Sample Digital Adaptations that Incorporate UDL Principles

Hearing:
• Provide subtitles for all audio and video components
• Provide a video overlay with sign language

Vision:
• Provide voiceover and auditory reinforcement (narration)
• Supply subtitles and audio description of images

Intellectual:
• Allow manipulation of form and digital content
• Highlight words as they are read on the screen

All Students:
• Prepare a glossary of new vocabulary (in the form of words or images)
• Utilize pre (context) and post (comprehension) questions

Additional adaptations:
• Allow students to control the pace of text (speed up or slow down)
• Make fonts, colors, and spacing of text adaptable
• Make the size of the video adjustable
• Use simple and non-text-heavy panels
• Enable vibration/audible feedback
• Use a variety of exercises to test understanding


ANNEX C. P2 AND P3 KINYARWANDA TEACHER TRAINING PHASE 1: JANUARY 8-27, 2019

1. INTRODUCTION

USAID Soma Umenye is a five-year initiative of the United States Agency for International Development (USAID) and the Rwanda Education Board (REB) that aims to improve reading outcomes in Kinyarwanda for at least 1 million children in public and government-aided schools in Rwanda. Specifically, USAID Soma Umenye is expected to ensure at least 70 percent of P1-P3 students are able to read grade-level text with fluency and comprehension. In Year 3, USAID Soma Umenye expanded its coverage from Grade 1 (Primary 1 or P1) to Grades 2 and 3 (P2 and P3) in all public and government-aided schools. As part of this expansion, we trained P2 and P3 teachers in evidence-based reading instructional strategies.

A trainer conducts training in Nyaruguru District.


2. CONTENT OF PHASE 1 TRAINING OF P2 AND P3 KINYARWANDA TEACHERS

At the beginning of fiscal year 2019, USAID Soma Umenye staff revised the "essential core" materials for P2 and P3, which include (1) a student textbook, (2) a teacher's guide, (3) a read-aloud story book, and (4) leveled readers. The objective of the teacher training was to equip teachers to use these materials to teach the foundational skills of reading and writing in Kinyarwanda. The training also covers Kinyarwanda orthography (which was recently reformed), formative assessment (formal and informal), gender responsive pedagogy, and supporting inclusive education methodologies. In addition, the training introduces the scaffolded approach of "I do, we do, you do." This is a process of gradually releasing responsibility to children: first the teacher models a skill, then practices it together with the children, and then children are encouraged to practice the skill on their own. Finally, teachers are also trained to manage a classroom library and check-out system. They are shown how students can be taught to care for books, check them out to read at home, and how to match children with appropriately leveled books.

Jean Paul, a member of the National Reading Training Team, facilitates a training session in Ngoma District.

Soma Umenye will deliver training to P2 and P3 teachers over 10 days in two phases (five days each). The first phase of P2 and P3 training focused on foundations of early grade reading and writing instruction.

• For P2 teachers, Phase 1 is called "Foundation of early grade reading and writing instruction," while Phase 2 is called "Reading and writing instruction with focus on fluency, vocabulary and comprehension instructional strategies."
• For P3 teachers, Phase 1 is entitled "Learning to read" and focuses on the five core components of evidence-based reading and writing instruction. Phase 2 is called "Reading to learn" and focuses on teaching children how to apply their reading and writing skills in learning.

The focus of this report is on Phase 1 training. In Phase I, Soma Umenye trained approximately 4,100 P2 teachers and 3,600 P3 teachers.

Phase 1 P2 training covers:

• Scaffolding and phonological awareness
• Teaching blends
• Phonics and writing instruction
• Whole lesson demonstration
• Five core components of reading and writing instruction
• Cursive writing
• Inclusive teaching and learning
• Formative assessment
• Kinyarwanda orthography
• Structure of books (teacher's guide and pupil book)

Phase 1 P3 training covers:

• The definition of reading
• Five core components of reading and writing instruction
• Understanding the new textbook (teacher's guide and pupil book)
• Scaffolding approach
• How to teach phonological awareness and phonics
• Teaching blends, vocabulary, fluency
• Whole lesson demonstration
• Read aloud stories
• Cursive writing
• Library management
• Formative assessment
• Inclusive teaching and learning
• Kinyarwanda orthography

The training focused on practice. Trainers modeled instructional activities for teachers to practice. For example, after group discussions, teachers would conduct micro-teaching and receive feedback from their fellow teachers, or they would select members from their groups to present to the larger group what they had discussed. This practical focus built their skills and confidence.

3. TRAINERS

USAID Soma Umenye has supported the creation of the National Reading Training Team (NRTT), which is composed of REB's Kinyarwanda specialists, URCE lecturers, Kinyarwanda tutors from teacher training colleges, REB's national and district master trainers for Kinyarwanda, and selected Kinyarwanda teachers. The NRTT (with more than 200 members) leads the training of teachers across the country. One team of NRTT members trained P2 teachers while another team trained P3 teachers. The NRTT members are supported by trainers from Inspire Educate and Empower Rwanda (a subcontractor).

Teachers discuss during a small group activity.

USAID Soma Umenye only selects trainers who have achieved 90% or higher in the post-test in the training-of-trainers session.

For P2 and P3 teacher training, Soma Umenye assigned one trainer to each room (of 25-30 teachers).


4. TRAINING SUPERVISION

USAID Soma Umenye's technical and MEL teams observed 15 training sessions in 15 districts to review the quality of training implementation. The findings of the team were as follows:

• Trainers generally followed the training modules as written and appeared to have mastered the content. They emphasized practice as requested.
• Teachers mentioned how valuable they found the practice activities.
• Trainers occasionally raised questions (directing them at the observing technical team member) regarding the revised instructional materials.
• In some geographic areas, teachers pronounced certain sounds in Kinyarwanda differently due to their linguistic background (dialects, etc.). This created some confusion during the phonological awareness sessions.

5. CHALLENGES ENCOUNTERED AND PROPOSED SOLUTIONS

• Teaching phonological awareness and phonics is particularly challenging for teachers from Nyabihu, Rubavu, and a few additional districts where certain groups of teachers speak different dialects of Kinyarwanda, with specific sounds that differ from those of standard Kinyarwanda speakers. Soma Umenye plans to study the impact of dialects on the teaching and learning of early grade reading and will explore these issues in significantly more detail.
• A few teachers were unable to complete the training for various reasons, including sudden illness. Some did not participate on Saturday due to their religious beliefs. To catch up on the missed content, these teachers were asked to attend specific days during the second round of the training.
• Some trained teachers were transferred to other schools/classes and are teaching different subjects. While some transfers are bound to happen, Soma Umenye will consider proposing a minimum period for teachers to remain assigned to their posts to enhance the effectiveness of training.
• Some teachers were appointed to teach P2 and P3 after completion of Soma Umenye's Phase 1 training. They will be trained through the catch-up training later in 2019.

6. RESULTS OF POST-TRAINING TEST

In order to understand whether teachers' knowledge changed during the training, USAID Soma Umenye administered a pre-training and post-training test. A version of the test is included in Appendix C-1. The results demonstrated that teachers' knowledge of relevant concepts improved over the course of training. Exhibit 10 below shows the average change in scores of training participants (by province) in the Phase 1 training.

Exhibit 10: Analysis of pre-test and post-test scores from teacher training assessments in each province

              P2 teachers                           P3 teachers
Province      Pre-test   Post-test   Difference    Pre-test   Post-test   Difference
Kigali        56.3       85.8        29.5          54.1       83.5        29.4
South         57.5       88.8        31.3          56.7       89.0        32.3
North         58.4       86.9        28.5          52.3       79.9        27.6
East          55.7       82.9        27.2          47.7       83.8        36.1
West          56.8       85.5        28.7          49.5       84.4        34.9

For P2 trained teachers, the average score increased from 56.9 percent in the pre-test to 85.9 percent in the post-test, while for P3 trained teachers, the average score increased from 51.8 percent in the pre-test to 84.6 percent in the post-test. These increases in scores are statistically significant. Exhibit 11 shows the gain per question (in percentage points). These results suggest that knowledge acquisition occurred as a result of the teacher training. (Note that during our review of teacher performance we identified some results that appeared unusual. Following investigation, we identified improper management of the test, and those results were eliminated from the final report.)
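As a purely illustrative sketch (not the project's actual analysis code), the following Python snippet shows how such per-province averages and a paired significance test could be computed from a hypothetical file of per-teacher scores; the file name and column names are assumptions.

```python
# Illustrative sketch only; file name and column names are assumptions,
# not the project's actual data or analysis code.
import pandas as pd
from scipy import stats

scores = pd.read_csv("p2_teacher_scores.csv")   # hypothetical columns: province, pre_test, post_test
scores["gain"] = scores["post_test"] - scores["pre_test"]

# Average pre-test, post-test, and gain by province, as summarized in Exhibit 10
by_province = scores.groupby("province")[["pre_test", "post_test", "gain"]].mean().round(1)
print(by_province)

# Paired t-test comparing each teacher's post-test score with their own pre-test score
t_stat, p_value = stats.ttest_rel(scores["post_test"], scores["pre_test"])
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```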

Exhibit 11. Percentage-point increase per question in post-training results (P2 and P3)
1. For effective reading instruction, which of the following should you teach? (P2: 22.1; P3: 23)
2. Which of below represent technique of teaching phonological awareness? (P2: 9; P3: 8.9)
3. Select the best activity that describes how a teacher can use the scaffolded (I Do, We Do, You Do) teaching approach. (P2: 38.3; P3: 50.7)
4. Which of the following represent phonics activity? (P2: 57.1; P3: 62.7)
5. Which one is the correct activity of teaching writing instruction? (P2: 30.6; P3: 37.5)
6. Select the right activity that you can use in teaching reading skills. (P2: 28; P3: 50.5)
7. Which statement is correctly describing a classroom-based assessment in P2? (P2: 10.3; P3: 16.7)
8. Select the best statement that describes how to promote inclusion in your classroom. (P2: 44.8; P3: 36.4)
9. Essential core material used in teaching Kinyarwanda P2 include: (P2: 33.3; P3: 21.3)
10. How should you prepare a lesson of Kinyarwanda in P2? (P2: 19.3; P3: 19.5)

Exhibit 12 shows the variation in improvement (measured in terms of percentage point gains in the post-test results) across districts for P2 teachers.


Exhibit 12: The difference between post-test and pre-test P2 scores in each district

[Bar chart: percentage-point gain in average P2 scores (post-test minus pre-test) for each of the 30 districts, shown against the average gain across all districts. District gains ranged from approximately 21 to 41 percentage points.]

7. RECOMMENDATIONS MOVING FORWARD

Based on the training supervision findings and pre- and post-training test results, USAID Soma Umenye has developed the following recommendations:

• Review those training topics that were less well understood in the next phase of training.
• Identify best practices from high-performing training sites and ensure learning informs future planned training.
• Consider tracking future coaching records for teachers who produced low post-test results, to monitor potential improvement.
• Conduct training during the holidays if possible, as it was very effective: most of the training could be completed more quickly than last year, when training was conducted during the weekends, and trainees progressed through the training more systematically.

Appendix C-1: Pre- and Post-Test Tool (translated English version)

Teacher Training Module 1

Instructions:
1. The duration of this test is 20 minutes.
2. Don't put your answers on this questionnaire. All questions should be answered on the answer sheet attached to this questionnaire.
3. Try to answer all questions by filling the bubbles of the answer sheet. [Example images: a correctly filled bubble ("You correctly filled the bubble") and an incorrectly filled bubble ("You didn't fill the bubble correctly").]

1. For effective reading instruction, which of the following should you teach?
a) Phonological awareness and vocabulary.
b) Reading comprehension and phonics.
c) Reading fluency and reading comprehension.
d) All above can be taught for effective reading instruction.

2. Which of below represent technique of teaching phonological awareness?
a) Asking learners to read letters/blends.
b) Asking learners to identify first/middle/or last sound of a word.
c) Showing the words cards to students.
d) Writing sentences on blackboard.
e) First ask learners to write words with the new blend and then ask them to identify it in those words.

3. Select the best activity that describes how a teacher can use the scaffolded (I Do, We Do, You Do) teaching approach.
a) Teacher and pupils practice the exercise on the new blend together.
b) The teacher writes the consonants that make the new blend in calligraphy and models how to write it using a finger to trace it.
c) The learners independently do the exercise of reading words and sentences by imitating the examples from the teacher.
d) All activities above describe how a teacher can use the scaffolded (I Do, We Do, You Do) teaching approach.

4. Which of the following represent phonics activity?
a) The ability to hear or to match sounds of words with new blend.
b) The ability to hear, to identify and to match sounds that make a word.
c) The ability to connect sounds to letters/blends.
d) The ability to understand the sound and analyze it to make a syllable, a word or a sentence.


5. Which one is the correct activity of teaching writing instruction?
a) When a teacher is modeling how to form a new blend, he can write the consonants that make the blend in different ways and ask learners to imitate.
b) The teacher must use the scaffolded "I Do, We Do, You Do" teaching approach when teaching to write words and sentences with the new blend in cursive.
c) During the practice of writing words and sentences with new blends in cursive, the teacher uses the scaffolded teaching approach by breaking into small groups that are managed by bright learners in the absence of the teacher.
d) All activities above are suggested when teaching writing.

6. Select the right activity that you can use in teaching phonological awareness or phonics.
a) In a phonics exercise, the teacher says the sounds that make the word "umutsima" and asks learners to say the syllables that make that word by clapping hands.
b) In a phonological awareness exercise, the teacher displays four illustrations and asks learners to identify the pictures that have names with the new blend "mby".
c) In a phonological awareness exercise, the teacher traces his finger under each syllable that makes a word and then reads that word as an example.
d) In a phonics exercise, the teacher utters two words "ijwi" and "imvi" and learners find that each of those words has two syllables but their sounds are different.

7. Which statement correctly describes a classroom-based assessment in P2?
a) The formal formative assessment of reading can be replaced by summative assessment of reading because those assessments have the same purpose in a P2 classroom.
b) The results from formal formative assessment of reading and writing Kinyarwanda help the teacher to improve or inform both the teaching and learning processes.
c) The teacher conducts formative assessment for reading at the end of units two and six in order to grade students and award the best performers.
d) The results from formal formative assessment will help a teacher to promote students from P2 to P3.

8. Select the best statement that describes how to promote inclusion in your classroom.
a) Check whether all groups include learners with disabilities without considering other learners in teaching and learning.
b) Only focus on females and learners with disabilities because they have some difficulties.
c) Treat all learners equally without excluding some students based on their backgrounds/physical disabilities.
d) All the answers above are correct.


9. Essential core material used in teaching Kinyarwanda P2 include:
a) Textbook, decodable storybook for learners and teacher guide.
b) Textbook, teacher's guide, read aloud story book and supplementary readers.
c) Teacher's guide, decodable story book after any letters and read aloud story book.
d) Read aloud story book, teacher's guide and other supplementary books.

10. How should you prepare a lesson of Kinyarwanda in P2?
a) Participating in training is enough.
b) Review the teacher guide and follow it when teaching.
c) Discuss with a colleague on methodology of teaching.
d) I prepare my own lesson plan and then I teach.


ANNEX D. RWANDA EDUCATION INSIGHTS: FINAL PROJECT PLAN

INTRODUCTION

In partnership with the government of Rwanda and in collaboration with development partners, USAID Soma Umenye proposes the development of an education sector dashboard that will bring together critical information on the education delivery system, and clearly communicate that information to enable its use for policy, monitoring, and improvement. This project plan provides a detailed roadmap through 2021 for the development and implementation of a dashboard that will deliver actionable education information to stakeholders at various levels within the Rwandan education system, from Ministry of Education (MINEDUC) and Rwanda Education Board (REB) leadership down through to head teachers. MINEDUC and REB will be close partners in all stages of the project's development and roll-out, and will be supported to establish the capacity for its long-term success.

Within the terms and scope of this project plan, the Rwanda Education Insights dashboard project will report on primary education outcomes (P1-P6) within public and government-aided schools at the national, district, and sector levels, and at the school level in three pilot districts. The dashboard will have a priority focus on early grade learning outcomes, other primary grade outcomes identified in the national Education Sector Strategic Plan (ESSP) monitoring matrix, and outcomes relating to the World Bank’s Human Capital Index (HCI) education components. While these will be areas of focus, the Rwanda Education Insights website and suite of related reports will provide a broad array of information prioritized by stakeholder groups that can guide national policymakers, development partners, district and sector officials, and head teachers and parents in their efforts to understand and improve Rwanda’s education system. This plan details the proposed approach to Rwanda Education Insights (REI) to produce a comprehensive and innovative support for Rwanda’s continued progress in the education sector.

GOALS FOR THE PROJECT

The primary goal of REI is to provide a "window" on to actionable data that can serve as a focus for improvement planning at the various levels of Rwanda's education system, with new visibility provided for early grade learning outcomes. MINEDUC, REB, Soma Umenye, and other development partners such as Building Learning Foundations (BLF) collect and host significant data relevant to the primary education system and key progress metrics for the ESSP and HCI. However, these data are not integrated, and are rarely shared back to districts, sectors, and schools in a manner that supports local planning and continuous improvement. The REI project will address the data silos throughout the education sector by integrating data from multiple sources needed to track education system performance and progress against the ESSP and metrics relating to the HCI. For that reason, MINEDUC has included the REI dashboard in the ESSP Implementation Plan. In addition, by integrating early learning information from Soma Umenye and BLF, the project will provide new visibility into learning and educator training outcomes that are not currently available through existing governmental systems. Finally, REI will provide data visualizations and links to resources that ensure information is usable for stakeholders at all levels, thereby addressing a key principle of the National Data Revolution Policy (April 2017) to make public data discoverable and easily accessible to the widest range of users for the widest range of purposes.

SCOPE OF THIS REPORT AND THE PROJECT

This report includes the final proposed project plan for REI, including:

• A description of the needs analysis informing the project approach
• The primary user groups and use cases for the project
• The proposed design of REI reports, project indicators, and user support features
• The data sourcing needs and proposed technical architecture of the project
• The proposed approach to project management and staffing
• The project implementation plan and timeline

As depicted below, within the term and scope of this project plan, the REI project will provide one national profile report, 30 district profile reports, 416 sector profile reports, and school-level reports within three pilot districts. In a future scope, the REI will provide a platform that can be extended to other parts of the education continuum (such as pre-primary and secondary), and expanded to include school-level data for the entire nation.

Exhibit 13. Scope and Phasing

Informed by the needs analysis and addressing the proposed use cases, REI will:

• Coherently organize an array of information on the Rwandan education system, allowing end users to quickly access information needed to support local education management and improvement.
• Focus attention on learning outcomes, particularly those in early grades, that are not currently part of government of Rwanda reporting systems.


• Include a comprehensive set of indicators allowing the monitoring of primary educational system performance against the ESSP 3 targets and outcomes relating to the HCI (specifically, its learning-adjusted expected years of school component).
• Allow sector and district officials to benchmark local performance against averages and their best-performing counterparts.
• Enable offline use cases (such as distributing printed copies at a local planning meeting) through PDF reporting features, as well as through a one-page "Highlights Report" that incorporates the highest priority descriptive items and indicators as determined by stakeholders.
• Support the export of all data to Excel for further detailed analysis.
• Provide links to resources, a glossary of descriptors and indicators, and a reports library to support users to place the data in the appropriate context.

The project management structure will build capacity within Rwanda for long-term administration and management while ensuring the availability of necessary technical expertise that has experience with projects similar in scope and complexity. REI will be developed during calendar year 2019 and prepared for its initial launch in early 2020 in close coordination with MINEDUC, REB, and other partners. The management and administration of the project will transition to MINEDUC and REB (or another Rwanda-based administrator designated by the government of Rwanda) during Quarter 4 of 2019, with a view to having the dashboard fully embedded in the government by 2020 in accordance with a transfer and sustainability plan. Throughout, the project will be informed through the input of key local stakeholders, including district and sector education officers and head teachers.

Exhibit 14. Mock-up of REI Site Design

The REI site design will focus attention on early grade learning outcomes and coherently organize an array of priority indicators

PROJECT NEEDS ANALYSIS

The needs analysis for this project included extensive primary source document review, assessment of leading international models, engagement with Soma Umenye team members, meetings with government of Rwanda national and district officials, coordination with other development partners, stakeholder workshops, and school site visits.

The primary source document review included analysis of various governmental and development partner reports and plans, including the draft MINEDUC Education Sector Strategic Plan – 2018/19 to 2023/24 (ESSP 3), the World Bank Group Human Capital Project report (2018), MINEDUC 2017 Education Statistical Yearbook (April 2018), MINEDUC Real Time Monitoring and Data Management Project report (October 2017), Ministry of Youth and ICT National Data Revolution Policy (April 2017), and the Global Education Monitoring Report on Accountability Practices and Policies in Rwanda's Education System (2017). This review identified descriptors and indicators currently reported at the national level, the current state of education data collection and management systems, and priorities and policies for data reporting and monitoring.

The consultant has an in-depth understanding of the Illinois Report Card, which has been recognized by the Education Commission of the States as the leading United States education dashboard site, and of other U.S. education reporting best practices and models. In addition, the consultant analyzed South Africa's Data Driven Districts Dashboard (https://www.eddashboard.co.za/Account/Anonymous) to review a leading model in sub-Saharan Africa.

The consultant made three trips to Rwanda during the planning period for the purpose of this needs analysis: October 7 – 16, 2018, November 20 – 29, 2018, and January 20 – 27, 2019. These trips included meetings with Soma Umenye team members to discuss the project and alignment with other project activities. In particular, the consultant and Soma Umenye focused on strategies for bringing visibility to early learning outcomes. The trips also included meetings with development partners UNICEF and BLF to review progress or activities relating to the dashboard, and a series of meetings with appropriate MINEDUC and REB officials and consultants to discuss related data collections and project use cases. The meetings with UNICEF and BLF revealed that the dashboard can be a key source of information for broader data system integration activities by identifying which system elements require integration to support stakeholder reporting needs. The meetings with MINEDUC and REB demonstrated the need for the dashboard to provide "windows on to available data" so that the government can provide visualizations and information to shape local action. In addition to providing inputs into the REI project plan, these meetings also helped to build awareness of and support for the proposed project.

A key focus for the first two trips was preparing for and delivering two separate day-long workshops that included representation from the district, sector, and school levels. Soma Umenye team members organized a stakeholder advisory group consisting of approximately 20 representatives from the various levels of the education delivery system, and from each province of the country. The first workshop focused on mapping existing data flows and identifying local data needs (see the Exhibit 15 below with the output of the data flow mapping).


Scenes from the stakeholder workshops

These inputs were used to prioritize indicators for the dashboard and better understand the availability of data for those indicators. The second workshop focused on refining the list of proposed indicators, and receiving feedback on initial dashboard design concepts. In both workshops, participants expressed strong interest in the dashboard to support their roles in the Rwandan education system, and participants were enthusiastic about remaining engaged as an advisory group for the project. As one participant stated, “the dashboard will help me know where I am and what is needed to improve education within my sector.”

Finally, the trips included visits to two primary schools to interview the head teachers, and discussions with various district-level officials. The school visits illuminated the on-the-ground systems for data collection and usage, and how the dashboard can support localized planning and school improvement efforts. These head teachers were hungry for data to inform their educational decision-making, and lamented their limited ability to analyze student outcome information and understand comparative performance in relation to other schools and sector/district benchmarks. The district officials believed REI could serve as a valuable tool for monitoring overall district performance, and identifying both high-performing and low-performing sectors for taking action. For example, the vice-mayor of Huye District described how she could envision using REI to determine the performance status of the various sectors throughout the district, identify the under-performing sectors, and match them with higher-performing sectors as models of practice.


Exhibit 15. Output from Mapping of Current Data Flows: First Stakeholder Advisory Group Workshop

[Table summarizing the reports and data exchanged under seven reporting flows: SEO to DEO/DDE; SEO to REB/MINEDUC; SEO or DEO to SCHOOL; DEO to MINEDUC; SCHOOL to SEO; SCHOOL to DEO; and OTHERS. Items catalogued include entry reports, student and teacher attendance, school feeding and capitation grant reports, dropout and repetition data, school inspection and visit reports, adult literacy reports, teacher training, placement, and replacement data, school infrastructure data, assessment and end-of-term reports, and the school census.]

USER GROUPS AND USE CASES

Through the needs analysis, the following use cases emerged as priorities for REI. The below use cases are ordered from those closest to the school level to the national level, and not in order of importance.

Head Teachers and Teachers
Use cases:
• Using REI information for developing objectives, indicators, and targets for school improvement plans
• Providing reports to utilize in school performance appraisal meetings (SPAM) carried out in conjunction with the School Monitoring and Support Framework (SMSF)
• Providing school-based staff with an understanding of the "why" behind data collection – given the volume of data collected at the school level, REI will provide more immediate and more localized benefits than existing reporting systems
Design implications:
• Incorporate indicators relevant to typical school improvement objectives
• Support both web-based and paper-based reporting processes
• Ensure timely reports (e.g., termly) where possible
• Include links to resources to support school improvement

Parents and Community Members
Use cases:
• Providing reports and information to utilize in SPAM meetings that direct the focus on student outcomes
• Providing an objective basis for discussions with other community members and elected officials on school quality and performance
Design implications:
• Support both web-based and paper-based reporting
• Include easily understandable visuals and language
• Incorporate Kinyarwanda translation feature

Sector Education Officers (SEOs)
Use cases:
• Use of REI reports in coordination with field assessments for management, inspection, and improvement planning purposes
• Identification of higher performing sectors within the same region for collaboration on improvement strategies
• Use as a support for school improvement planning meetings within the sector by grounding discussion in objective indicators of performance
• Use to assess progress of the sector in comparison to ESSP 3 outcome areas and HCI-related metrics
Design implications:
• Include indicators relevant to monitoring and inspection activities required by MINEDUC or REB
• Comparisons to district benchmarks and top-performing sectors
• Comparison across sectors
• Assessment of performance improvements over time
• Identify relationship to ESSP 3 outcome areas and HCI metrics
• Include links to resources to support school and sector improvement

District Education Officers (DEOs), District Directors of Education (DDEs), and Vice-Mayors
Use cases:
• Use of REI reports in coordination with field assessments for management, inspection, and improvement planning purposes
• Identification of higher performing districts within the same province for collaboration on improvement strategies
• Use as a support for sector improvement planning meetings within the district by grounding discussion in objective indicators of performance
• Use to assess progress of the district in comparison to ESSP 3 outcome areas and HCI-related metrics
• Use to identify appropriate objectives and indicators for district imihigo contracts and related action plans
Design implications:
• Include indicators relevant to monitoring and inspection activities required by MINEDUC/REB
• Comparisons to national and district benchmarks
• Assess performance improvements over time
• Link to imihigo contracts and action plans for each district
• Identify relationship to ESSP 3 outcome areas and HCI-related metrics
• Include links to resources to support continuous improvement

Soma Umenye & BLF District Advisors and Provincial Advisors, MEL Staff
Use cases:
• Provide baseline information on districts, sectors, and schools to support local monitoring and evaluation
• For Soma Umenye, use of REI as an input for collection of SMSF data by SEOs and project district advisors
• Incorporation of REI into sector and district communities of practice/conferences by grounding discussion in objective indicators of performance
• Serve as a supplement to additional data collected by MEL teams for project monitoring and evaluation
Design implications:
• Focus on early learning outcomes data
• Align to SMSF and MEL indicators
• Exportable data for further analysis

Other NGOs and Development Partners
Use cases:
• Provide baseline information on districts, sectors, and schools so that data collections undertaken by the organization can focus on programmatic activities
Design implications:
• Quick reference to district, sector, and school information
• Exportable data for further analysis

MINEDUC Inspector Reviews
Use cases:
• MINEDUC inspectors at both the regional and district levels can use REI reports to assess the quality of school performance as part of REB's Quality Campaign
• Ability to compare across sectors and districts
Design implications:
• Include indicators relevant to quality review processes
• Comparisons to district/national benchmarks
• Assess performance improvements over time

REB and MINEDUC Staff and Leadership
Use cases:
• Simplify access to information needed for program administration and decision-making
• Assess performance against ESSP 3 and HCI-related metrics at all levels of the education system
• Focus the system on performance against priority indicators
• Use reports and information to assess national and district-level education performance
• Define needs for data access to inform broader data system development and integration activities
Design implications:
• Align to agency programmatic and monitoring needs
• Clear crosswalk to ESSP and HCI education component
• Exportable data for further analysis
• Simple process for future expansion and updating of indicators
• Assess performance improvements over time


Project interviews with a vice mayor, head teachers, and many others informed the use case descriptions

PROPOSED INDICATORS AND DESIGN APPROACH

Informed by the needs analysis and to address the proposed use cases, the REI project will provide a full array of information deemed most important by stakeholders to support local education management and improvement. While REI includes indicators that align to ESSP 3, it is not only a tool for ESSP 3 monitoring. As described in the use cases, it will support a number of functions at the national, district, sector, and school levels. For that reason, the project has shifted from its initial label of the "ESSP Dashboard" to the broader framing of "Rwanda Education Insights." The REI site will either be a standalone website or accessible through MINEDUC's website. The initial landing page will orient the user, and enable him or her to quickly navigate to a national, district, sector, or (eventually) school profile of interest. The information for each profile is then organized into the categories and descriptors/indicators shown in the accompanying table.

These descriptors and indicators were selected based on the priority given by stakeholders, the relevance to ESSP 3 outcome areas, and the availability of the data. Appendix D-1 includes a more detailed description of these descriptors and indicators for the first phase of the REI launch, including the relationship to the ESSP outcome areas, the sources, and additional notes. Additional descriptors and indicators to consider including in a future phase are identified in Appendix D-2. While the items in Appendix D-3 were generally identified as priorities by stakeholders, they either have less relevance to ESSP 3, or the data to support the indicator are not readily available.

A key innovation of REI’s reporting approach is to emphasize learning outcomes data, particularly in early grades, as opposed to input data such as class size or number of textbooks. REI will incorporate Early Grade Reading Assessment (EGRA) data at the national level, as well as the results of Local Early Grade Reading Assessments (LEGRA) at the district level, and School Monitoring and Support Framework early grade indicators at both the district and sector levels. As was discussed by Soma Umenye with the head teacher of G.S. Manyange A School, the only learning outcome data currently available to primary school educators are national exam results at P6. By sourcing learning outcome data directly from Soma Umenye and BLF, REI will direct focus to learning outcomes in P1-P3, which are recognized in ESSP 3 as critical for establishing a foundation of learning for later years.

In addition to addressing ESSP 3 and HCI-related indicators throughout the tool, REI will include a summary table for both ESSP 3 and HCI’s education component that enables the user to quickly assess performance of the nation, district, or sector against relevant ESSP 3 and HCI benchmarks. Critically, REI will be the exclusive tool for (1) monitoring ESSP 3- and HCI-related performance at the district, sector, and eventually school level and (2) reporting that performance in a centralized and standardized manner. These summary tables also highlight the flexibility of the REI platform: as new international and national monitoring priorities emerge in future years, the platform can be easily extended to include new reports addressing these priorities.

Profile Information:
• Sector or district name
• Sector or district identifier number
• Sector or district education officer
• Map
• Listing of schools in sector or district, with additional profile info

Student Characteristics and Progress:
• Pupils newly admitted in P1 who attended pre-primary
• Learners with disabilities and disability types
• Orphan pupils
• Average class attendance
• Grade-level enrollments
• Overage pupils
• Student mobility
• Promotion, repetition, and dropouts

Learning Outcomes:
• Kinyarwanda early grade reading proficiency and comprehension
• Kinyarwanda early grade reading indicators
• English early grade proficiency
• Math early grade proficiency
• National exam participation and pass rates

Teachers:
• Teacher attendance
• Pupil/teacher ratio
• Teaching staff by subject
• Teacher qualifications
• Teacher experience levels

Educator Training:
• Teachers receiving Soma Umenye training and supports
• Teachers receiving BLF training and supports
• ICT trained teachers
• Competency-based curriculum
• Special needs and inclusive education
• School leadership and management

ICT and Learning Supports:
• Pupil/classroom ratio
• Pupil/desk ratio
• Pupil/textbook ratios
• Internet connectivity
• Digitalized content
• Smart classrooms
• Source and number of laptops
• Pupil/laptop ratio
• School library
• Pupil/latrine ratio
• Feeding program
• Rain water harvesting, tap water, safe drinking water
• Solar
• Energy grid

The individual indicator pages will enable quick comparisons across different levels within the education system (nation, district, sector), as well as over time for certain indicators. The site will enable disaggregation by gender for most of the indicators. Based on stakeholder feedback, the data visualizations will include reference numeric values in addition to the graphical representations. In some cases, a hover box will be used to display the reference numeric values to avoid a cluttered display. A box at the bottom of each chart will provide a short description of the information displayed, the source of data for the indicator, and links to user resources relating to that indicator. The site will incorporate a consistent color palette, font structure, and chart styles that support users’ quick comprehension of the information displayed within the tool.
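To make the comparison and disaggregation behavior described above concrete, the sketch below is illustrative only: the data layout, file name, and place names are assumptions, not the project's data or implementation. It computes one indicator by gender for a sector and sets it alongside district and national values.

```python
# Illustrative sketch; data layout, file name, and place names are assumptions.
import pandas as pd

df = pd.read_csv("p1_reading_results.csv")
# assumed columns: district, sector, school, gender, pupils_assessed, pupils_proficient

def proficiency(rows):
    """Share of assessed pupils reaching proficiency, as a percentage."""
    return 100 * rows["pupils_proficient"].sum() / rows["pupils_assessed"].sum()

sector = df[df["sector"] == "Example Sector"]        # hypothetical sector
district = df[df["district"] == "Example District"]  # hypothetical district

comparison = pd.DataFrame({
    "sector": sector.groupby("gender").apply(proficiency),
    "district": district.groupby("gender").apply(proficiency),
    "national": df.groupby("gender").apply(proficiency),
}).round(1)
print(comparison)  # reference numeric values shown alongside any chart
```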

While REI will be a web-based tool, many of the use cases identified for the project also require offline access. For example, to be effectively used in a school performance appraisal meeting, an SEO or head teacher would ideally have paper copies of REI reports that can be reviewed during the meeting by all participants. For that reason, REI will support a conversion to PDF for each of the web-based reports, as well as include a "Highlights Report" feature that will produce a one-page PDF summary overview with approximately 15 descriptors or indicators for the district, sector, or school of interest (see Appendix D-4). The Highlights Report includes the highest priority items identified by stakeholders in the consultant's second workshop session, and will be designed to be highly legible when printed or copied in black and white.

The REI site will include the following features to help users effectively engage with the data:


• Translation to Kinyarwanda: The site will include a button that automatically translates the entire site into Kinyarwanda.
• Export to Excel and printing reports: All data included in the site will be exportable into a single Excel file for further analysis. In addition, the data for a particular unit (district, sector, or school) will also be exportable to Excel. All pages will also enable the ability to convert the information into a PDF report for printing. (A minimal sketch of such an export appears after this list.)
• Sector/District comparison feature: The site will enable side-by-side comparisons of sectors and districts for comparative purposes. For example, an SEO could quickly compare performance of a sector for a particular indicator against a neighboring sector within the district.
• Page feedback and help features: While the project implementation plan includes a series of additional stakeholder workshops, the site itself will enable immediate feedback during user sessions. A feedback button will enable the user to provide a comment or suggestion that is tracked to the page the user is viewing. Users can optionally submit their contact information for follow-up. In addition, a help button will enable the user to directly email a customer support function to address any questions or concerns regarding the site.
• Data source listing, glossary, reports library: The site's reference tools will include a summary listing of the data source used for each descriptor or indicator and a glossary defining each descriptor and indicator. Finally, REI will incorporate a reports library that houses links to key official reports at the national and district level, including the NISR Statistical Year Book, EGRA reports, and district imihigo contracts and action plans.
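As a hedged illustration of the export feature referenced in the list above, the following sketch writes all indicators for a single district to one Excel workbook, with a sheet per REI category. The file names, column names, and helper function are assumptions, not the project's implementation.

```python
# Illustrative sketch of the Excel export feature; file names, column names,
# and this helper function are assumptions, not the project's implementation.
import pandas as pd

indicators = pd.read_csv("rei_indicators.csv")
# assumed columns: level, unit_name, category, indicator, year, value

def export_unit(level, unit_name, path):
    """Write all indicators for one profile (district, sector, or school) to a
    single Excel workbook, with one sheet per REI category."""
    unit = indicators[(indicators["level"] == level) & (indicators["unit_name"] == unit_name)]
    with pd.ExcelWriter(path) as writer:
        for category, rows in unit.groupby("category"):
            rows.to_excel(writer, sheet_name=str(category)[:31], index=False)  # Excel sheet names max 31 chars

export_unit("district", "Example District", "example_district_rei_export.xlsx")
```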

DATA SOURCING AND TECHNICAL ARCHITECTURE

DATA SOURCING:

Initially, the REI project will receive data from two primary sources and four to five secondary sources, as depicted in the "source" column in the indicator table in Appendix D-1. The primary sources of data will be (1) the school census data collected and maintained by MINEDUC for its education management information system (EMIS) and the Statistical Year Book, and (2) the newly emerging School Data Management System (SDMS). There is significant overlap between the data elements collected through census and those collected through the SDMS, and MINEDUC has indicated that eventually the SDMS will replace the spreadsheet-based data collection process for the census. In the long term, the SDMS will be the preferred source of much of the student- and teacher-related data for REI, as it is entered by school officials at the individual student and teacher level. While MINEDUC has indicated the SDMS should be fully functional in spring 2019, the REI project should plan on utilizing either census or SDMS data for Phase I, in case the SDMS data have not been fully validated and are not available for external reporting. The extent of SDMS availability will need to be confirmed in spring 2019, and both census and SDMS data should be obtained, analyzed, and considered for inclusion. Regardless of the readiness of the SDMS system, some census data will continue to be needed for most of the elements in the Profile Information, School Learning Supports & ICT, and School Learning Environment categories. The census data will either be provided as one file from MINEDUC meeting the project's requested specifications (the preferred approach) or as separate spreadsheet files for each district and sector. The REI project team will need to clean and consolidate all SDMS and census data into the requirements and format of the REI database model.

The secondary sources of data for REI include the following:

1. Soma Umenye: Soma Umenye will use its own data for REI indicators addressing learning outcomes in Kinyarwanda early grade reading, and teacher and head teacher training. The data sources for these indicators include the Early Grade Reading Assessment (EGRA) (for reporting at the national level), the Local Early Grade Reading Assessment (LEGRA) (for reporting at the district level), the School Monitoring and Support Framework (SMSF), the MEL training files, and pupil/textbook ratio information. Soma Umenye will validate all of its data before including it within REI.
2. BLF: Similarly, Building Learning Foundations (BLF) will provide learning outcome and teacher/head teacher training data for early grade English and math. BLF has confirmed the availability of these data and its willingness to provide the data for the REI project. Similarly, BLF will need to validate all data prior to its inclusion in REI.
3. National examination: The REI project will seek from REB's ESAD department the national examination participation and pass rate data for P6.
4. Teacher attendance: Teacher attendance was identified as a high priority indicator by stakeholders; however, these data are not currently collected by MINEDUC or REB. The REI project will seek to source this directly from SEOs or DEOs for inclusion in the reporting database.
5. ICT information: Certain information on ICT, such as the extent of smart classrooms, may need to be sourced directly from REB's ICT Department to the extent this information is not available through census data.

These secondary sources will all be collected by the REI project team, normalized to the project’s data model, and incorporated into the REI aggregate data-mart. The REI project team will enter into appropriate data sharing agreements with all data providers specifying how the data will be used and managed. Where available, two years of data will be requested. All of the data the REI project receives will be aggregate, i.e., none of the data will include personally identifiable student or teacher information. In addition, the REI project will only receive file extracts from the source systems, and will not be assuming management or control over any of the source systems. As such, under the National Data Revolution Policy, data provisioned for REI should be considered as “non-sensitive,” which is open by default under the policy. While these factors should limit the security and confidentiality concerns of the data owners, the technical architecture will still be established to meet the highest standards for the safeguarding of all data received for REI. Before any public release of data on REI, it will first be validated by MINEDUC, REB, and the appropriate DEOs and SEOs. This validation process is built into the project timeline.


TECHNICAL ARCHITECTURE DESCRIPTION:

Initial development phase: For the initial development phase addressed by this project plan, REI’s technical architecture will primarily consist of the components for data management and the REI website application, as depicted below:

This project plan proposes the technology systems and hosting approach in the following bullet points based on consultation with technical experts that have developed projects similar in scope and complexity. Adjustments to these technology systems can be made to accommodate REB and MINEDUC specifications, although such adjustments may impact the project schedule and budget estimates.

• The proposed database management system is Microsoft SQL Server. The database management system will include approximately 50 descriptors/indicators for each profile level (national, district, sector, and eventually school), and will need to be extended with each new year of data as it becomes available. (An illustrative sketch of one such aggregate table follows this list.)
• The website is proposed for development in the .NET Framework, and must be mobile optimized to ensure accessibility via all device types.
• The proposed graphing and charting tools are Microsoft Visual Studio Express, which is the free version of Microsoft Visual Studio.
• The website application and database management system are proposed to be initially hosted at a project development environment managed by the lead contractual technical architect, with remote access available to all Rwandan technical team members. Hosting would then be transferred to the government of Rwanda National Data Center once the project has been launched and the website and database system are fully developed, to ensure minimum disruption to the development process with this transfer. If earlier hosting in Rwanda is an expectation of REB and MINEDUC, this can be accomplished but needs to be specified at the outset of the development process.
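The bullet on the database management system refers to the sketch below: an illustrative shape for one aggregate data-mart table. SQLite is used purely as a stand-in (the plan proposes Microsoft SQL Server), and the table and column names and sample row are assumptions rather than the project schema.

```python
# Illustrative only: SQLite stands in for the proposed SQL Server database, and
# the table/column names and placeholder row are assumptions, not the project schema.
import sqlite3

conn = sqlite3.connect("rei_datamart_sketch.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS indicator_values (
    profile_level TEXT NOT NULL,   -- 'national', 'district', 'sector', or 'school'
    profile_id    TEXT NOT NULL,   -- e.g., district or sector identifier
    indicator_id  TEXT NOT NULL,   -- one of the ~50 descriptors/indicators
    school_year   TEXT NOT NULL,   -- extended as each new year of data arrives
    gender        TEXT,            -- NULL for totals, 'F'/'M' for disaggregation
    value         REAL
)
""")
conn.execute(
    "INSERT INTO indicator_values VALUES (?, ?, ?, ?, ?, ?)",
    ("district", "Example District", "p1_reading_proficiency", "2018/19", None, 0.0),  # placeholder row
)
conn.commit()
conn.close()
```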

The proposed initial phase technical architecture is further detailed in Appendix D-5.

Future state technical architecture: Long-term, the sources of data for the REI project should become more centralized within MINEDUC and REB, and the data integrated for REI should also support Business Intelligence analytical platforms and publicly available CSV files on the National Data Portal at NISR. That way, data needed for reporting, monitoring, research, and evaluation can be integrated through a single standardized, automated system approach and made available for multiple stakeholder functions. One conceptual approach to achieve this future state vision is depicted below, demonstrating how the REI project architecture can be incorporated into a larger integrated system design. The approach to the future state technical architecture must be aligned with UNICEF-Rwanda’s and MINEDUC’s project to support the integration of Rwanda education management information systems. REI should be viewed as an important input for that project, as the work to integrate data across multiple systems for reporting purposes will reveal differences in data format and definitions, gaps in data availability or reliability, and integration needs to support the project’s stakeholder use cases.


PROJECT IMPLEMENTATION PLAN AND TIMELINE

As detailed in Appendix D-6, development of the REI project is proposed to commence in Quarter 3 of fiscal year 2019, with the fully functional Phase I site to launch in Quarter 3 of fiscal year 2020. Prototypes and partial releases will be staged throughout 2019 to maintain the visibility of the project and receive iterative feedback on the site's design and features. Phase I will include reports for the nation as a whole, all districts, and all sectors. Early grade learning outcomes from the Soma Umenye project will be incorporated in the district and sector reports for three pilot districts. The pilot districts should be selected to ensure alignment with Soma Umenye's LEGRA and SMSF implementation, and to include at least one district with validated SDMS data.

Phase II of the project will launch in Quarter 2 of fiscal year 2021. Phase II will include a second year of reports for the nation, all districts, and all sectors. In addition, school-level reports will be published in the three pilot districts. Additionally, the management and administration of the project will transition to MINEDUC and REB in Quarter 4 of 2019 with a view to having the dashboard fully embedded in the government by 2020.

Other key project activities reflected in the timeline include:

• REB, MINEDUC, and development partner engagement: For REI to be successfully launched and sustained, it must be developed in full partnership with the Rwandan government and in close coordination with key development partners. This is reflected in the project timeline, with the identification of REB and MINEDUC focal persons; formal data sharing agreements with REB, MINEDUC, and BLF; and presentations at key project milestones to MINEDUC and REB policymakers. In addition, two standing bodies are proposed for project engagement and input: an REI Advisory Board and an REI Technical Working Group.
— The REI Advisory Board will consist of key decision-makers and the focal persons from REB and MINEDUC, as well as senior-level representatives of NISR, UNICEF, BLF, MasterCard Foundation, CMU-Africa, USAID, Tony Blair Institute, and Soma Umenye. The purposes of the Advisory Board are to provide strategic feedback on overall project direction, ensure alignment with data system integration activities within the government, support outreach on the project, and eventually oversee a successful transition to MINEDUC or another long-term administrator. To ensure alignment with the UNICEF-Rwanda education information management system integration project, the REI Advisory Board may be merged with the Task Force that will be established for that project. This Board should meet at least quarterly.
— In addition to the Advisory Board, a Project Technical Working Group will be established consisting of the Project Director, Assistant Director, Technical Director, Soma Umenye's REI point persons, REB and MINEDUC focal persons, CMU-A, UNICEF-Rwanda and its system integration contractor, and BLF's technical lead. On a bi-weekly basis, the REI Technical Working Group will review progress, assess key upcoming milestones, address implementation challenges, and ensure coordination with other related project activities. These meetings will be held either via videoconference or in person at Soma Umenye's offices.

• Prototype development and JRES presentation: A key early project milestone is the development of a prototype website and reports for presentation at the spring/summer 2019 Joint Review of the Education Sector meeting. This will help to build broader awareness of REI and its potential for positively impacting the Rwandan education sector.

• Stakeholder advisory group engagement: In addition to central government and development partner engagement, the project will continue to be informed by a Stakeholder Advisory Group consisting of representatives from DEOs, SEOs, and head teachers. As described in the needs analysis section of this plan, this group met twice during the course of the plan's development and provided invaluable input into the types of information needed to support local education planning and management, the availability of data at various levels, and the preferred approach to visualization of information. This group will be regularly informed of the project's implementation through email and a WhatsApp message group, and will meet in person approximately twice per year for the duration of the project. This group will help ensure the project is addressing the needs of local stakeholders and is continuously informed by their input.

• End-user training: The project timeline incorporates training of all potential end-users prior to the launch of each phase. This training will be supported by a user guide and training manual with accompanying web tutorials that demonstrate the site's navigation and how to utilize the site's features.
— At the national level, training will be provided to MINEDUC and REB staff whose responsibilities include data maintained within REI, or who may be utilizing REI in connection with oversight and improvement processes.
— In addition, training will be provided to Soma Umenye and BLF staff who will utilize REI in connection with monitoring and evaluation activities. For both Soma Umenye and BLF, at least two district advisors per province should be trained as "dashboard champions" to provide direct dashboard training and supports at the sector and school level throughout the year.
— For Soma Umenye, REI training should be an extension of the district advisors' training on the Local Education Monitoring Approach (LEMA), and viewed as a critical support for LEMA implementation and monitoring activities. The REI team will work closely with the Soma Umenye MEL team to accomplish this goal and ensure that REI reports can be incorporated directly into SMSF data collection and SPAM processes.
— DEOs and SEOs will receive in-person training at the provincial level prior to both the Phase I and Phase II launch. REB and MINEDUC should lead this training with Soma Umenye support to demonstrate their ownership and partnership in the project. In addition, Phase II training will also be provided to head teachers in each of the three districts that will receive school-level reports. Both Soma Umenye and BLF district advisors will participate in the training of DEOs, SEOs, and head teachers.

• Project management meetings and project reports: Throughout the project, the project director, assistant director, and technical director will track progress using project management software and address any implementation challenges in real time. The technical experts and Rwanda-based development teams will coordinate activities on a day-to-day basis and establish a regular check-in process. The project director and assistant director will elevate issues to the REI Advisory Board and REI Technical Working Group as needed to ensure a timely resolution of all project issues as they arise. In addition, the project director and assistant director will provide a mid-point and final project report that documents the status of the project and that can be shared with key external stakeholders.


Appendix D-1. Indicator Table for REI Initial Phase

Exhibit 16. Descriptor and Indicator Table Sector Dashboard16

Descriptor or Indicator | ESSP17 | Census18 | SH19 | Source | Notes

Profile Information
Sector Name | | | | |
Sector Identifier # | | | | |
District Name | | | | |
Sector Education Officer: Name and Contact | | | | |
Sector Map | | | | | Latitude and Longitude, in Degree, Minutes, Seconds
Primary Schools in Sector | | | A | Census | Listing with links to school data
a. School Name, Identifier #, Website | | 1: 5 | | Census | EMIS only includes school name; Soma Umenye has captured identifier
b. Geographic Coordinates | | 1: 18, 21-22 | | Census | Point ID and Latitude and Longitude
c. Year Established | | 1: 25 | | Census |
d. School Status (Public or Government-aided) | | 1:29-30 | D | Census | Private not included in initial scope
e. School Setting | | 1; 47-49 | D | Census | Primary only, Primary + Secondary ordinary, Primary + Secondary (O+A)
f. Pre-Primary Level | 5.1 | 1; 68 | | Census | Census refers to this as "nursery level"

Student Characteristics and Progress (all, M, F)
Pupils newly admitted in P1 who attended pre-primary level | 5.1 | III.2.6; 266-268 | | SDMS or Census | ESSP specifies 3 years of pre-primary; years not available in Census
Total learners with disabilities and disability types | 7.2 | III.7; 375-381 | H, D | SDMS or Census | Hearing, visual, speaking, other physical, learning, multiple
Orphan pupils | 1.2 | III.6; 363-369 | S, D | SDMS or Census |
Average class attendance by enrolled pupils | 1.2 | III.5; 337-357 | A | SDMS or Census | Total pupils enrolled/Average attendance per year
Grade-level enrollments | 1.2 | 1.b; 247-250 | A | SDMS or Census | Schools suggested new entrants also be tracked

16 District dashboard reports will vary in the Profile Information area but otherwise include the same areas, descriptors, and indicators. National will also include EGRA results.
17 Indicates outcome area in ESSP 3 plan.
18 Collected through the School Census Data Collection Tool (http://mineduc.gov.rw/resource/statistics/data-collection-tools/); reference to Section and Row of Tool.
19 Stakeholder workshop priority. H = head teacher/teachers; S = sector; D = district; A = all.


Descriptor or Indicator | ESSP17 | Census18 | SH19 | Source | Notes

Overage pupils | 1.2 | III.4.a; 286-306 | D | SDMS or Census | For each level, % at appropriate age, % 1 year over, % 2 years over, % 3 or more years over
Student mobility | 1.2 | | A | SDMS | % of pupils experiencing at least one transfer in or out of a school in prior year
Promotion, repetition, and dropouts | 1.2 | 234-40 and III.3 | A | SDMS or Census | Also incorporate transition from primary to lower secondary

Learning Outcomes (All, M, F)
Kinyarwanda early grade reading proficiency | 1.1 | | D | Soma Umenye | National: EGRA; District: LEGRA for P3; Basic & optimal levels
School Management Support Framework | 1.1 | | D | Soma Umenye | 20 indicators; reported Not Achieved, Partially Achieved, or Achieved
English Early Grade Proficiency | 1.1 | | D | BLF | P1-P3 representative at district level and above
Math Early Grade Proficiency | 1.1 | | D | BLF | P1-P3 representative at district level and above
National Exam Participation (P6) | 7.1 | | | NEPS |
National Exam Pass Rates (P6) | 7.1 | | H, D | NEPS | NEPS has no LwD data; includes school ID; data available in December
National Exam Through-Put Pass Rates (P6) | | | | NEPS and census | Pass rate as a % of potential exam takers, based on P1 enrollment 5 years earlier

Teachers
Teacher attendance (% of teachers with 80% or higher attendance rate) | 2.2 | | A | SEO reports | Possibly LEMA domain or incorporate into LTMIS/Inspectorate data collection
Pupil/Teacher Ratio (P1-P3 and all) | 2.2 | V.9; 584 | A | SDMS or census | Need to account for double shifts. Single in P6 in 2018; in P5 and P6 in 2019
Teaching staff by subject | 2.2 | V.6; 525 | S, H | SDMS or census |
Teacher qualifications | 2.2 | IV.1; 389-400 | A | SDMS or census | Report unqualified, A2, A1, A0 or higher
Teacher experience levels | 2.2 | V.4; 478-489 | A | SDMS or census | Median experience

Educator Training
P1-P3 teachers and head teachers receiving Soma Umenye training and supports | 2.1 | | A | Soma Umenye |
P1-P3 teachers and head teachers receiving BLF English and Math training and supports | 2.1 | | | BLF |
ICT trained teachers | 2.1 | V.12; 610 | S, D | SDMS or census |
Competency-based curriculum | 2.1 | V.12; 608 | A | SDMS or census | Teacher and head teacher
Special needs and inclusive education | 7.2 | V.12; 612 | D | SDMS or census | Teacher and head teacher
School leadership and management | 9.1 | V.12; 609 | | SDMS or census | Teacher and head teacher

Descriptor or Indicator | ESSP17 | Census18 | SH19 | Source | Notes

ICT and Learning Supports
Internet connectivity | 4.1 | V; 651 | H, D | Census |
Pupil/computers ratio | 4.1 | V; 641-644 | A | Census | Census does not break down by grade level; laptops used P4-P6
One Laptop Per Child | 4.1 | V; 642 | H, D | Census | Incorporate into Pupil/Computers Ratio page
Digitalized content for teaching and learning | 4.1 | V; 632 | | Census | Identified as priority by REB ICT Department
Smart classrooms (# - at least 2 per ESSP) | 4.1 | | D | REB ICT Dept. | Primary: projector, digitalized content, internet, storage
Pupil/Classroom ratio | 6.1 | II.1.a; 75-80 | A | Census | Per ESSP: Primary: 40:1
Pupil/Textbooks ratios | 6.1 | IV: 680-688 | A | Census | Math, Kinyarwanda, English, Social and Religious Studies, Science and Technology
Pupil/Desk ratio | 6.1 | II.1.a; 75-80 | D | Census |
School library | 6.1 | VI: 674 | S | Census |
School feeding program and type | | XII; 810, 816-819 | A | Census | Type: None, Breakfast and lunch, Lunch, One cup of milk, porridge only
Electricity grid | 6.1 | II.6.b; 149 | A | Census |
Solar power | 6.1 | II.6.b; 150 | | Census |
Pupil/Latrine ratio | 6.1 | II.6.c; 162 | A | Census | M/F ratio
Tap water | 6.1 | II.6.a; 143 | A | Census |
Rain water harvesting system | 6.1 | II.6.a; 143 | | Census |
Safe drinking water | 6.1 | IX; 768 | | Census |


Appendix D-2. Potential Future Indicators

Descriptor or Indicator | ESSP20 | Census21 | SH22 | Source | Notes

Profile Information
School land area | | | D | |
Furthest student distance traveled | | | D | |
Boarding school or non-boarding school | | 1; 68 | D | | Removed as rarely applicable to primary level
Social protection program category percentages (1-4) | | | D | MINALOC | Information for the sector or district as a whole
Financial status/capitation grant | | | H, D | |
Sports | | | S, D | Census | Asks if the school has sport infrastructure in various areas
Clubs | | | S | |
School partnership information (PTA, SGAC, NGO, churches, etc.) | | | H, S | | Census includes information on SGA and SGAC
Sector environment (market days, employers, etc.) | | | H | |

Student Characteristics and Progress
Drug abuse | | | D | |
Pupils having health insurance | | | S, D | |
Teenage pregnancy | | | S, D | |
Children regularly fed at school | | XII; 821 | A | |

Learning Outcomes
Pass rates in key subjects | | | S, D | |
% of P3 learners at or above LARS benchmarks | 1.1 | | | 2017 LARS | Provincial only; both literacy and numeracy
% of P6 learners at or above expected LARS levels | 1.1 | | | 2017 LARS | Provincial only; both literacy and numeracy

Teachers
Head teacher attendance | | | S | |
Average teacher instructional hours | | | S, D | |
Head teacher class visits | | | S | |
Head teacher action plan, contract performance | | | S | |
Teachers achieving English proficiency APT B1 or above | 2.1 | | | BLF | ESSP dashboard mockup
% of SBMs spending more than 2 hrs/week mentoring fellow teachers | | | | TMIS | REB dashboard prototype
Support staff ratios | | | D | |

ICT and Learning Supports
Classrooms on or above standards of size | | | D | Census |

20 Indicates outcome area in ESSP 3 plan.
21 Collected through the School Census Data Collection Tool (http://mineduc.gov.rw/resource/statistics/data-collection-tools/); reference to Section and Row of Tool.
22 Stakeholder workshop priority. H = head teacher/teachers; S = sector; D = district; A = all.

Descriptor or Indicator | ESSP20 | Census21 | SH22 | Source | Notes

% of schools with ICT for teaching and learning | | X | | Census |
Menstrual hygiene management (MHM) rooms | | | D | |
Science corner | | | | Census |
Meeting standards of accessibility for LwD | 7.2 | | S | |
Greening and beautification program | | | S, D | Census |
Playgrounds | | | A | Census | Playgrounds reported by sports type


Appendix D-3. Proposed REI Website Design


Appendix D-4. Proposed REI Highlights Report Design


Appendix D-5. Technical Architecture Design Parameters

The diagram on the following page represents the flow of data into and across the system from source warehouse to user interface via web application.

Data enters the system from its original sources and is stored in landing tables, either as a direct import or after being transformed based on the needs of the REI system.

The data is then moved from the landing tables into the core layer, the main operational tables in the system. The core layer employs the system's underlying data model, within which the data from all those disparate sources is integrated and reorganized based on the system's output needs. The database prepares output based on that data model and makes it available to the web application using stored procedures, functions, and views (the different mechanisms through which web applications can extract data from the database).

The web application uses application programming interfaces (APIs) to retrieve data previously prepared for it by the central database system and present it through the user interface.
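The following sketch illustrates the flow just described, using an in-memory SQLite database purely for illustration; the proposed REI stack is SQL Server and .NET, and every table, column, function, and sample value here is an assumption rather than the actual REI design:

import sqlite3

# Sketch of the landing-table -> core-layer -> prepared-output flow described above.
conn = sqlite3.connect(":memory:")

# 1. Landing table: raw data as imported from a source system (e.g. the school census).
conn.execute("CREATE TABLE landing_census (district TEXT, indicator TEXT, year INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO landing_census VALUES (?, ?, ?, ?)",
    [("Bugesera", "pupil_classroom_ratio", 2019, 62.0),
     ("Bugesera", "pupil_textbook_ratio_kinyarwanda", 2019, 1.4)],
)

# 2. Core layer: operational tables organised around the system's own data model.
conn.execute("CREATE TABLE core_indicator (profile_level TEXT, entity TEXT, indicator TEXT, year INTEGER, value REAL)")
conn.execute(
    "INSERT INTO core_indicator "
    "SELECT 'district', district, indicator, year, value FROM landing_census"
)

# 3. Prepared output: in SQL Server this role would be played by a stored procedure
#    or view; the web application would call it through an API and render the result.
def district_profile(entity, year):
    return conn.execute(
        "SELECT indicator, value FROM core_indicator "
        "WHERE profile_level = 'district' AND entity = ? AND year = ?",
        (entity, year),
    ).fetchall()

print(district_profile("Bugesera", 2019))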



Appendix D-6. Project Timeline

Project activities scheduled across project quarters in 2019, 2020, and 2021 (the quarter-by-quarter scheduling grid of the original timeline is not reproduced in this text version):

• Finalize implementation agreement and scope
• Identify REB and MINEDUC focal persons
• Data sharing agreements with MINEDUC, REB, and BLF
• Phase I data sourcing: Census, SDMS, Soma Umenye, BLF
• Develop prototype web-based dashboard using sample data
• Present prototype at Joint Review of Education Sector (JRES) meeting
• Conduct Stakeholder Advisory Group Workshop
• Feedback sessions with DEOs/SEOs in 3 pilot sites
• Develop Phase I database
• Finalize and implement MEL plan for dashboard
• Receive and incorporate LEMA data for 3 pilot districts
• Revise and finalize Phase I web design & copy
• Develop Phase I User Guide and Training Manual
• Present Phase I site to REB and MINEDUC teams
• Phase I data validation by MINEDUC, REB, DEOs, SEOs
• Phase I launch: All districts, all sectors, early grade learning outcomes in 3 pilot districts
• Phase I training: DEOs, SEOs, Soma Umenye DAs
• Phase II data sourcing
• Extend database for Phase II data
• Phase II web design & copy
• Develop Phase II User Guide
• Present Phase II site to REB and MINEDUC teams
• Phase II training: DEOs, SEOs, Soma Umenye DAs
• Phase II launch: All districts, all sectors, schools in 3 districts
• Phase II DEO/SEO/head teacher training
• Develop and negotiate transfer and sustainability plan
• Transfer of database/website to MINEDUC or other Rwandan-based administrator
• Bi-weekly technical team meetings with Soma Umenye (videoconference or in-person)
• Stakeholder Advisory Group Workshops
• REI Advisory Board meetings
• Project reports (midpoint and final)


ANNEX E. DEVELOPING AND IMPLEMENTING A READING ASSESSMENT AND ACCOUNTABILITY FRAMEWORK FOR EARLY PRIMARY BASED ON GRADE-LEVEL BENCHMARKS AND TARGETS

A CONCEPT PAPER PREPARED FOR THE RWANDA EDUCATION BOARD BY USAID SOMA UMENYE

1. EXECUTIVE SUMMARY

With the transition to knowledge-based economies, there is heightened recognition of the need to produce graduates with the knowledge and skills, including literacy-related skills, required to be successful in such economies. The increased globalization of world economies has raised the bar for educational systems, whose graduates now need to be competitive on a regional if not international stage. This explains, in part, the dramatic increase in the number of large-scale national, regional and international assessments and of countries participating in these assessments. As a case in point, the number of countries participating in the TIMSS, PIRLS and PISA international assessments (NCES 2019a; NCES 2019b; NCES 2019c) has doubled, and in some cases tripled, over the last two decades. The recent Human Capital Index study did a comparative analysis of the efficacy of 157 national education systems, one of which was Rwanda (World Bank, 2018).

Rwanda’s commitment to ensuring its graduates are competitive on the regional and international stage is reflected in a number of policy-related decisions:

• The decision to include in Education Sector Strategic Plans an indicator that tracks the percentage of pupils meeting grade-level expectations or benchmarks for reading (and math);
• The decision to establish national targets for reading, defined as the percentage of pupils expected to be meeting minimal grade-level benchmarks;
• The decision to introduce district-led assessments at the end of terms 1 and 2 and centrally-led end-of-year national assessments as a means of monitoring progress towards these targets.

This paper proposes two related initiatives, anchored in the lessons learned from the 11 large-scale early grade reading assessments carried out over the last nine years, that would provide MINEDUC/REB with the inputs and tools required to achieve the above: an early grade reading assessment framework that would ensure that future assessments align with international best practices, and an accountability framework that would establish defensible benchmarks and targets and put in place a robust system for monitoring progress against these benchmarks and targets. Each is described below.

TEXTBOX – Cut scores versus benchmarks – What's the difference?
Cut scores are the scores on an assessment that separate one performance category from another. For example, the cut score separating "not meeting expectations" and "partially meeting expectations" for oral reading fluency in P2 may be 10 correct words per minute, or CWPM. A benchmark refers to the specific cut score for the performance category "meeting grade-level expectations." It is the minimal score pupils must attain in order to be considered to be performing at grade level. For example, the benchmark for P2 may be 20 CWPM. If so, any pupil who can read at 20 or more CWPM is considered to be performing at grade level.

A. THE EARLY GRADE READING ASSESSMENT FRAMEWORK

The table on the following page provides an example of an assessment framework for reading in early primary. Assessment frameworks for a given content area and grade level generally:

• establish different categories of pupil performance (e.g., not meeting expectations, meeting expectations, exceeding expectations) and provide a rich description, for each performance category, of the specific grade-level reading skills pupils in each of P1 to P3 should (or should not) be able to demonstrate;
• establish grade-specific performance benchmarks, keyed to the above categories and descriptors, for key reading skills and each assessment instrument. The benchmarks would be established using internationally recognized practices for the establishment of cut scores and would be defensible nationally and internationally;
• use international best practices (specifically the modified Angoff method23) to define, for national and district-level reading assessment instruments, the scores corresponding to each performance category (e.g., not meeting expectations, meeting expectations, exceeding expectations), so that future reading assessment results are comparable across time, jurisdictions (district, national, international), and different instruments.

Once completed, assessment frameworks provide a valid, defensible basis for reporting on progress with respect to regional or global indicators like SDG 4.1.1: the proportion of children in grades 2 or 3 and at the end of primary who achieve at least a minimal proficiency in reading and mathematics (Montoya, 2018).

23 Appendix F-4 summarizes some of the most common internationally-recognized best practices in establishing cut scores or benchmarks. For a more complete discussion, see Livingston & Zieky, 1982 or Zieky & Perie, 2006, https://www.ets.org/Media/Research/pdf/Cut_Scores_Primer.pdf


Assessment frameworks also bring much-needed consistency and reliability to the development of national and district large-scale assessment tools and to the interpretation of pupils' reading skills. The diagram below outlines the different steps in the development of an assessment framework.

Developing an assessment framework. The diagram on the following page outlines the key components of an assessment framework and the specialists involved in developing each component.

SOMA UMENYE QUARTERLY PROGRAM REPORT | 125 Table 1: Example, Reading Assessment Framework for Early Primary Curriculum Standard Read P3 end-of-grade level text fluently and confidently; Retain or identify most of the literal information contained in text Does not meet expectations Partially meets expectations Meets expectations Exceeds expectations Not performing at a level that will Partial mastery of material, but not Solid academic mastery of material; Superior performance; performing Performance categories ensure success enough to meet grade level Performing at grade level above grade level expectations expectations Oral reading fluency. Does not Oral reading fluency. Is Oral reading fluency. Is Oral reading fluency. Exhibits read at all or reads very slowly uncomfortable reading orally; comfortable reading orally. Reads confidence when; Reads quickly, and with great hesitation, with Reads slowly and hesitantly, with with some fluency and generally smoothly and accurately, little or no accuracy. little expression. Tends to read accurately, although not respecting punctuation (commas, words rather than sentences, but necessarily smoothly. Sometimes question mark, period). May even Either cannot read the words, with some fluency and accuracy hesitates or uses overt read with expression and correct reads only first letter/syllable without overtly segmenting. segmenting and blending words intonation before getting stuck, or misreads However, uses overt segmenting when sounding out these words. words entirely to sound out unfamiliar word. Does not consistently respect Sounds out unfamiliar words Does not always identify when sentence punctuation (commas, automatically, accurately and Comprehension. Is unable to reading does not made sense question mark, period). Usually without hesitation. Identifies extract any literal information, because a mistake has been made. checks to make sure the selection when reading has not made sense recall and details or make Continues to read rather than re- is making sense and the re-reads and self-corrects. inferences. Needs adult support reading or self-correcting or self-corrects if necessary to retell a story. Comprehension. Extracts all of Performance Comprehension. Extracts most Comprehension. Extracts some the literal information from text descriptors of the literal information from Vocabulary – Is unable to figure literal information from text, for provides clear, complete and text; provides generally accurate out the meaning of new word. example names of main accurate answers to questions; and complete answers. Makes characters; answers to questions Makes accurate inferences based inferences (but no always accurate may contain inaccuracies; Cannot on what is said or done in text; ones), based on what is said or make logical inferences based on Accurately describes main done in texts. Makes reasonable information in story. When asked characters predictions, when prompted, of to retell a story, focuses on one Can accurately retell a P3 story, what might happen, based on key event or includes inaccurate including specific and relevant events in text. Can accurately information. details. Makes plausible retell, in sequence, most main Predictions are often guesses, not predictions of what might happen, events or ideas in a P3 story based on information in text based on events in text Vocabulary- is usually able to Vocabulary – is able to figure infer meaning of new words by Vocabulary. 
Uses a variety of out the meaning of some new using context clues or analysing strategies to accurately infer the words the structure of the word meaning of new words EGRA Oral reading fluency 0 to EGRA Oral reading fluency 20 to EGRA Oral reading fluency 40 to EGRA Oral reading fluency 61+ Cut scores 19 CWPM 39 CWPM 60 CWPM CWPM for each assessment ETC ETC PASEC – Level 3 ETC tool TERCE – Level 1 UNICEF – MICS Level 6

Benchmark score: Minimally meeting grade level expectations


Certain of these components, like the grade-specific curriculum standards, have already been developed and can be taken directly from the existing curriculum. Others – performance categories, and cut scores for all categories for key reading skills, including the benchmark score for "meeting grade level expectations" – need to be revised. In the case of performance categories, there is a need to come to consensus on the number of performance categories to be used when interpreting the results of large-scale assessments. In the case of cut scores/benchmarks, there is a need to use international best practices to establish defensible cut scores/benchmarks for key reading skills in P1 and P2, and to validate or revise the current P3 benchmarks.24 The process would correct a serious shortcoming in the current P1 benchmarks, where pupils are considered to be "meeting grade level expectations" if, by the end of the year, they are able to read one word. This is a standard that most educational stakeholders would perceive to be less than ambitious. Finally, certain of the components, namely harmonized national and district assessment instruments for P1 to P3 keyed to the framework, and rich qualitative and quantitative performance descriptors for each category,25 do not exist and would have to be developed.

24 Cut scores/benchmarks were established for P3 in 2012 and validated in 2014 using a data-driven method developed exclusively for use with Early Grade Reading Assessment data. Cut scores/benchmarks for P1 and P2 were not based on actual performance of P1 and P2 pupils, but rather extrapolated from P3 benchmarks.
25 The need to establish comprehensive descriptions of what pupils in each performance category can and cannot do with respect to the curriculum expectations of a given grade level was first identified by REB in 2016, in the LARS II report (p. 65).

Diagram 1: Components of an assessment framework for early grade reading (or any subject area). [The diagram shows the framework components and who develops each: grade-specific curriculum standards derived from the general goal for the subject (established by master teachers and curriculum specialists); performance categories, common across subjects and grade levels and aligned with performance descriptors; qualitative and quantitative performance descriptors for each category, describing what pupils in each category can and cannot do with respect to the grade-level standard (developed by curriculum specialists for the content area, and unchanged unless the curriculum changes); cut scores for all performance categories, including the benchmark score corresponding to "meeting expectations" (set by learning assessment specialists using defensible, international best practices such as the Angoff, Ebel and Bookmark methods); and harmonized assessment instruments to measure progress against the descriptors, comprising national end-of-year instruments, district term 1 and 2 instruments, and sector instruments (developed by curriculum and learning assessment specialists).]


B. ACCOUNTABILITY FRAMEWORK FOR EARLY GRADE READING

The second initiative, an accompanying accountability framework for reading in early primary, would:
• Ensure that all educational stakeholders are aware of minimal reading benchmarks for key skills in P1, P2 and P3;
• Use national baseline data to determine the percentage of pupils meeting minimal grade-level benchmarks with respect to key skills;
• Use data-driven methods, and data on average learning gains in low-, middle-, and high-resource countries, to establish defensible short-, medium-, and long-term national targets for the percentage of pupils meeting grade-specific benchmarks;
• Provide districts with simplified tools to collect district baseline data on pupils' performance with respect to benchmarks, and a common process for establishing justifiable district improvement targets to include in their imihigo;
• Provide national and district level officials with the tools they need, including a digital dashboard, to collect and report, each term, on 1) the percentage of pupils meeting benchmarks and 2) progress with respect to targets. A simple illustration of how these two quantities might be computed is sketched below the list; Diagrams 2 and 3 are potential dashboard displays of learning assessment results.
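The sketch below computes the two reported quantities from a set of pupil scores; the benchmark, target, and scores are placeholder values for illustration, not figures established through the framework:

# Illustrative only: benchmark, target, and scores are placeholders, not values
# established through the framework described in this paper.
P3_BENCHMARK_CWPM = 20          # assumed benchmark for "meeting grade-level expectations"
DISTRICT_TARGET_PCT = 45.0      # assumed district imihigo target for the term

def pct_meeting_benchmark(scores, benchmark):
    """Share of assessed pupils at or above the benchmark, as a percentage."""
    return 100.0 * sum(s >= benchmark for s in scores) / len(scores)

term_scores = [4, 12, 22, 35, 18, 27, 0, 41, 19, 25]   # oral reading fluency, CWPM
achieved = pct_meeting_benchmark(term_scores, P3_BENCHMARK_CWPM)
print(f"{achieved:.0f}% of pupils met the benchmark "
      f"({achieved - DISTRICT_TARGET_PCT:+.0f} points vs. the target)")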

Diagram 2: District display, P3 Reading, by performance categories

Diagram 3: National display, % P3 pupils meeting fluency and comprehension benchmarks, by province

Developing an accountability framework. The diagram below outlines the processes and procedures for developing and implementing an accountability framework for early grade reading, aligned with MINEDUC/REB's commitment to district and national assessment of learning results.


Diagram: Developing and implementing a reading accountability framework, P1 to P3. [The diagram distinguishes national- and district-level processes. At the national level: a national baseline determines the percentage of pupils in each performance category; national targets (percentage increase or decrease of pupils in each category) are set, reflecting the skewedness of the data, the resources directed towards improvements, and international trends with respect to learning gains, and are communicated to districts; district term 1 and 2 results are aggregated at the national level via the dashboard; and a national end-of-term-3 assessment measures progress against the targets. At the district level: a district baseline is conducted to identify the percentage of pupils in each category; district imihigo targets (short and medium term) are set in alignment with national targets; annual district term 1 and 2 assessments are carried out, using a district assessment tool with performance-category cut scores aligned with the national assessment framework and developed at the national level; and progress is reported to the national level.]

Although there are some underlying structures and resources in place to support the implementation of an accountability framework (district imihigo; Education Sector Strategic Plan indicators for the percentage of pupils meeting grade-level expectations in reading; the 2018 national Early Grade Reading Assessment at P1 to P3 to establish a national baseline), including the recent commitment to annual district and national assessment of learning results, most of the components identified above would have to be developed. This includes a national/district dashboard for collecting and reporting data on pupils' progress, national and district targets, and harmonized district and national reading assessment tools.

An accountability framework like the one outlined above would make national and district officials equally responsible for collecting and reporting on progress with respect to targets, and at the same time shift responsibility for improving learning outcomes to those who have the most control over the quality of learning in schools: district/sector officials, head teachers, and classroom teachers.

Although the assessment and accountability frameworks proposed here target lower primary and are limited to one key subject area – reading – the processes for developing and implementing them can be applied to all grade levels and subject areas. The development and implementation of early grade-reading assessment and accountability frameworks can in fact serve as a pilot for development of similar frameworks for other subject areas and grade levels.

2. INTRODUCTION

Since the early 1990s, there has been a dramatic rise in the number of national, regional and international large-scale assessments (LSAs) of learning. At the international level, the number of countries participating in the grade 4 Trends in International Mathematics and Science Study

(TIMSS) has increased almost three-fold in the last 25 years, from 20 in 1995 to 58 in 2019 (NCES 2019a).26 Participation in the Progress in International Reading Literacy Study (PIRLS) has increased from 34 countries in 2001 to 49 in 2016, the last year the assessment was carried out (NCES 2019b).27 In the same time period, the number of countries participating in the Programme for International Student Assessment (PISA) for 15 to 16-year-olds has increased from 32 to about 80 (NCES 2019c).28

A. WHY THE INCREASED INTEREST IN LARGE-SCALE ASSESSMENTS?

The dramatic increase in the number of LSAs globally can be attributed to a number of factors (Kellaghan and Greaney, 2001), including:

• An increased emphasis on accountability worldwide (Kellaghan, Greaney & Murray, 2009). Stakeholders want assurance that:

o learners are learning what they should be learning, and they are learning it to the expected extent;
o the resources expended to improve pupil learning outcomes are having the desired effect, i.e., learning outcomes are improving over time (Kellaghan & Greaney, 2001, 2004).
• An increased recognition of the link between educational outcomes and economic growth. Over the past 20 years, repeated studies (Hanushek and Wößmann, 2007) have demonstrated that the knowledge, skills and attitudes that students develop in school have a direct impact on a country's economic growth.
• The transition to knowledge-based economies. Educational systems need to produce graduates who have the higher-order thinking skills required for knowledge-based economies (OECD, 1996).
• The advent of globalization, and with it the need for economies to be competitive on the regional and international stage. Educational systems need to produce graduates who can compete with counterparts regionally and internationally.

All of the above has led to a shift in focus in how jurisdictions monitor educational quality. Rather than defining quality in terms of the quality of educational inputs (student participation rates, teacher-student ratios, class size, infrastructure, etc.), quality is now being defined in terms of educational outputs, and specifically the quality of students' learning (Kellaghan, Greaney & Murray, 2009).

Sustainable Development Goal 4.1. This shift is reflected in the new Sustainable Development Goals (SDGs) (Montoya, 2018), which commit the global community to ensuring that, by 2030, all girls and boys complete free, equitable and quality primary and secondary education leading to relevant and effective learning outcomes (SDG 4.1). At the primary level, the indicator for measuring progress towards this goal is the proportion of children (SDG 4.1.1) in grades 2 or 3 and at the end of primary who achieve at least a minimal proficiency in reading and mathematics, with the definition of "minimal proficiency" for reading captured in the following draft general performance descriptors:

26 Source: IES NCES (National Center for Education Statistics) website https://nces.ed.gov/timss/countries.asp Retrieved Feb 9, 2019.
27 Source: PIRLS website https://nces.ed.gov/surveys/pirls/countries.asp Retrieved Feb 9, 2019.
28 Source: Educational Research Centre website http://www.erc.ie/studies/pisa/who-takes-part-in-pisa/ Retrieved Feb 9, 2019.


Table 1: General performance descriptor – minimally meeting grade level proficiency, SDG 4.1.1

Grade level | Pupils who meet minimal reading proficiency levels for their grade level:
Grade 2/P2 | Read and comprehend most of the written words in an instrument given to them, particularly familiar ones, and extract explicit information from sentences.

Grade 3/P3 | Read aloud written words accurately and fluently. They understand the overall meaning of sentences and short texts. They identify the texts' topic.

Source: S. Montoya. UNESCO Institute for Statistics. November 2018

Human Capital Index. The recently released Human Capital Index (HCI) study (World Bank, 2018) also puts an emphasis on learning outcomes. Of the seven indicators that make up the index, two focus on the quality of the education received. One indicator looks at harmonized test scores on regional or international assessments of reading, mathematics and science. At the early primary level, the indicator looks specifically at P3 or P4 pupils’ average reading comprehension scores on the EGRA reading comprehension subtask. The second indicator, the learning-adjusted years of schooling, uses the harmonized test score results as the basis for calculating the efficiency of educational systems (Ibid.).

B. RWANDA’S POSITION WITH RESPECT TO LARGE-SCALE ASSESSMENTS

Rwanda is not immune from international trends. As early as 2000, with the publication of Vision 2020, Rwanda committed itself to transitioning from an agrarian to a knowledge-based economy and ensuring that its graduates develop the hard and soft skills required to be competitive in such economies. As high-level literacy skills are an undeniable requirement of knowledge-based economies, improving reading and writing skills across the grade levels, and particularly in the early grades, has become a priority and a recognized means of improving overall learning results. The last two Education Sector Strategic Plans have included, as a measure of the overall quality of the education system, the percentage of pupils meeting minimal reading performance standards at P3 (MINEDUC, 2017).

Rwanda’s commitment to monitoring the quality of its educational system is also reflected in the 11 different national assessments of early grade reading skills carried out between 2010 and 2018. (See Appendix E-2 for an overview of the reading assessments administered each year and grade levels targeted). These have included the Learning Assessment in Rwandan Schools (LARS), the Early Grade Reading Assessment (EGRA) and the Fluency Assessment in Rwandan Schools (FARS).

Table 2: Different early grade reading assessment instruments, 2011 to 2018

Assessment | Nature of instrument | Reading skills assessed
EGRA | One-on-one interview where pupils are presented with a series of tasks to complete orally and the enumerator notes correct answers | Listening comprehension; ability to recognize letter sounds; ability to read syllables; ability to read familiar words; oral reading fluency (number of correct words of a grade-level text the pupil reads correctly in 1 minute); reading comprehension (% of correct questions, out of a total of 5, the pupil is able to answer about the text read)
LARS | Pencil and paper multiple choice test | Vocabulary; reading comprehension
FARS | One-on-one interview where pupils are presented with two of the EGRA tasks to complete orally and the enumerator notes correct answers | Oral reading fluency (number of correct words of a grade-level text the pupil reads correctly in 1 minute); reading comprehension (% of correct questions, out of a total of 5, the pupil is able to answer about the text read)
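For reference, the sketch below shows how the two FARS/EGRA measures in Table 2 are commonly computed from what an enumerator records; the function and field names are illustrative and do not represent the official scoring specifications used in the Rwandan assessments:

def oral_reading_fluency(correct_words, seconds_elapsed):
    """Correct words per minute (CWPM): words read correctly, scaled to one minute."""
    return correct_words / seconds_elapsed * 60.0

def comprehension_score(correct_answers, questions_asked=5):
    """Percentage of comprehension questions answered correctly."""
    return 100.0 * correct_answers / questions_asked

# Example: a pupil reads 18 words correctly in 60 seconds and answers 2 of 5 questions.
print(oral_reading_fluency(18, 60))   # 18.0 CWPM
print(comprehension_score(2))         # 40.0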

Although many of these assessments have been donor-driven, the sheer number of assessments carried out is a testament to their importance for decision-makers.

Rwanda's commitment to building a high-quality, competitive educational system is reflected in its recent decision to introduce annual large-scale assessments at the district and national level to monitor educational quality (http://mineduc.gov.rw/fileadmin/user_upload/Amatangazo/Press_Release__English_FINAL_as_at_05_02_2019.pdf) and to track, over time, the percentage of pupils meeting the targets outlined in the Education Sector Strategic Plan (MINEDUC, 2017).

3. WHAT HAVE WE LEARNED FROM 11 YEARS OF EARLY GRADE READING ASSESSMENTS?

The 11 national assessments of early grade reading skills conducted between 2010 and 2018 should ideally have provided MINEDUC/REB with a clear picture of the progress it has made in improving pupils' reading skills, as well as the inputs required to: 1) build robust national, district and sector early grade reading assessment instruments, 2) establish defensible national grade-level targets and 3) monitor progress with respect to these targets. However, the extent to which previous assessments can be used to measure progress over time or serve as the basis for future initiatives is compromised by the following:

• No equating was ever carried out between the LARS instruments and the EGRA/FARS to establish a basis for comparability of results across different assessment instruments. As a result, it is impossible to determine if the LARS assessment for a given grade level is more or less difficult than the EGRA/FARS, or if the assessments are of the same degree of difficulty. This makes it impossible to compare pupil results across assessments.

• Each assessment groups pupils’ results into a different number of performance categories and labels the categories differently. Large-scale assessments typically group pupils according to the level of performance they are able to demonstrate. Reporting pupil results in terms of performance categories is in fact one of the internationally-recognized best practices in large-scale assessments (Bejar, 2009; Cizek, 1996).

There is a lack of consistency across the 11 assessments reviewed for this concept paper as to the number of categories retained and their labels (see table below). This makes it difficult to compare results across assessments.


Table 3: Comparison, LARS and EGRA/FARS performance categories, with category corresponding to "minimally meeting grade level expectations" in bold

LARS Performance Categories 2012 (P2, P3) | 2010-2012 FARS/EGRA Performance categories (P1, P2, P3) | 2014-2018 FARS/EGRA Performance Categories29 (P1, P2, P3)
Does not meet grade level expectations | Does not meet grade level expectations | Non-reader; Beginning reader; Emerging reader
Meets grade level expectations | Meets grade level expectations | Competent Reader
Exceeds grade level expectations | Exceeds grade level expectations | Proficient Reader

The LARS I (MINEDUC 2012) assessment reported pupil performance according to three performance categories. In LARS II and III (MINEDUC 2016, 2018) the categories were dropped in favor of reporting average scores, and in the case of LARS III, the percentage of pupils meeting or not meeting a given benchmark, although no explanation is given as to what the benchmark was or how it was established. In the case of FARS/EGRA, five categories were eventually retained. However, the category corresponding to “meets grade level expectations” varies by grade level: at P3, it is “competent reader”, at P2 it is “emergent reader” and at P1 it is “beginning reader”.

It should be noted that the above table implies an alignment of performance categories across assessment tools, with the LARS I category "meets grade level expectations" aligning with the FARS category of "competent reader" at P3. However, as the next bullet point explains, comprehensive, criterion-referenced performance descriptors keyed to grade-specific curriculum expectations are not provided for the different categories retained. For that reason, it is impossible to determine whether pupils who are meeting grade-level expectations on LARS would meet grade level expectations on FARS/EGRA and vice versa.

• There is no consistent definition, across assessments, of the specific reading skills a pupil at a given grade level who "meets grade level expectations" or "is at or above a benchmark" is able to demonstrate. None of the assessments provided rich, criterion-referenced performance descriptors for the different categories of pupil performance. This limits comparability across assessments, as it is impossible to determine, with any degree of certainty or clarity, what pupils who "meet grade-level expectations" are able to do with respect to curriculum expectations for their grade level. In the case of LARS I (REB 2012), no qualitative or quantitative descriptors are provided for the three performance categories retained. In the case of FARS or EGRA, quantitative descriptors are provided in the form of the number of correct words a pupil is able to read in one minute for the fluency task, or the percentage of reading comprehension questions pupils are able to answer. However, no overall qualitative descriptors, linked to curricular expectations for the grade level, are provided. A P3 pupil who

29 The three EGRA/FARS categories of "non-reader, beginning reader and emergent reader" created in 2014 were established by subdividing the original performance category "does not meet grade level expectations". The creation of additional categories to distinguish pupils who are making some progress in reading (emerging reader) from pupils who are struggling (beginning reader or non-reader) is common when data is highly skewed, i.e., when the majority of pupils are not meeting minimal grade level expectations, as it provides a more nuanced understanding of the level of reading skills. It also allows for important improvements over time to be captured among pupils at the lower end of the scale.

is deemed to be meeting minimal grade-level expectations on LARS may or may not meet minimal grade-level expectations on FARS.

The lack of equating across instruments, or occasionally even between different iterations of the same assessment instrument, combined with the lack of a common definition of the skill level associated with "meeting grade level expectations", makes it difficult to sketch a clear picture of what is happening over time with respect to young children's reading skills. It has also resulted, at the national level, in conflicting portrayals of the efficacy of the early primary education system (see graph below).

Graph 1: Percentage of P3 pupils meeting "grade level expectations", LARS vs FARS. [Bar chart: the LARS assessments (LARS I 2011 and LARS III 2016) place roughly 54-63 percent of P3 pupils at or above grade-level expectations, while the FARS oral reading fluency results for 2014, 2015 and 2016 place roughly 25-31 percent at that level.]

• The instruments used in the assessments have changed, compromising comparability over time. This is the case for EGRA and FARS, where the difficulty of the text passage presented to pupils sometimes varied from one year to the next (see EDC 2016 and 2017), despite the development of initial criteria for end-of-grade-level texts in 2013 (EDC 2013). As the level of difficulty of a text has a bearing on the fluency or speed with which a pupil reads a text (oral reading fluency) and his/her ability to understand it (reading comprehension), changing the difficulty of a text impedes any ability to determine, with any degree of certainty, improvements over time. It also makes it difficult to attribute increases (or decreases) in the percentage of children meeting grade-level expectations to improvements (or not) in the teaching-learning environment.

• There is no consistent methodology, across different assessment instruments, for determining the scores that separate different performance categories. Cut scores are the scores on an assessment that separate different performance categories. Each of the national assessments identified the scores corresponding to each performance category retained, using a data-driven method. LARS I used Item Response Theory and EGRA/FARS used a method developed specifically for EGRA data, the “mean/median method.” (See Appendix E-3 and Room to Read 2018 for an explanation of this method). However, the methodology was only used to establish cut scores for P3 (Clark-Chiarelli, 2012; EDC 2013). P1 and P2 cut scores were extrapolated from the P3 cut scores (Minutes, L3 Steering Committee, April 15, 2015).

SOMA UMENYE QUARTERLY PROGRAM REPORT | 136

Table 4. Oral reading fluency cut scores for different performance categories, EGRA P1 to P3, with cut scores corresponding to 'Meeting grade level expectations' highlighted in red

EGRA/FARS Performance Categories | P1 ORF scores | P2 ORF scores | P3 ORF scores
Non-reader | 0 CWPM | 0 CWPM | 0 CWPM
Beginning reader | 1 to 19 CWPM | 1 to 19 CWPM | 1 to 19 CWPM
Emerging reader | 20 to 32 CWPM | 20 to 32 CWPM | 20 to 32 CWPM
Competent Reader | 33 to 47 CWPM | 33 to 47 CWPM | 33 to 47 CWPM
Proficient Reader | 48+ CWPM | 48+ CWPM | 48+ CWPM

As a result of the extrapolation, a P2 child would be considered to be meeting grade-level expectations if they reached the P3 cut scores for "emerging reader", i.e., 20 to 32 CWPM. A P1 child would be considered to be performing at grade level if she/he achieved the P3 cut scores for "beginning reader", i.e., 1 to 19 CWPM. Although the reasoning is not without its logic, it resulted in a P1 child being considered to be performing at grade level if, by the end of the school year, she/he could read 1 word. Most stakeholders, not to mention members of the international community, would consider this a very low bar to attain.
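To make the implication of the extrapolation concrete, the sketch below classifies an oral reading fluency score using the cut scores in Table 4 and the grade-specific "meets expectations" categories noted above; it is an illustration of the current scheme only, not a proposed tool, and the names used are invented for this example:

# Cut scores from Table 4 (identical for P1-P3, because P1 and P2 were extrapolated
# from P3), and the category treated as "meeting grade-level expectations" per grade.
CATEGORIES = [                       # (upper bound in CWPM, category label)
    (0, "Non-reader"),
    (19, "Beginning reader"),
    (32, "Emerging reader"),
    (47, "Competent Reader"),
    (float("inf"), "Proficient Reader"),
]
MEETS_EXPECTATIONS = {"P1": "Beginning reader", "P2": "Emerging reader", "P3": "Competent Reader"}

def classify(cwpm):
    """Return the EGRA/FARS performance category for an oral reading fluency score."""
    for upper, label in CATEGORIES:
        if cwpm <= upper:
            return label
    return CATEGORIES[-1][1]

# A P1 pupil reading 1 correct word per minute is classified "Beginning reader",
# which under the extrapolated scheme counts as meeting P1 grade-level expectations.
score = 1
print(classify(score), classify(score) == MEETS_EXPECTATIONS["P1"])   # Beginning reader True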

• No district or sector-level data collection and reporting on pupils’ reading skills in early primary, resulting in limited district-level accountability for improving learning outcomes. All of the assessments of early grade reading skills have been carried out at the national level, using instruments that lend themselves to large-scale national assessments. No corresponding, simplified instruments have been developed for districts or sectors to use to collect data on the extent to which pupils are meeting the expectations outlined in the curriculum. That, combined with the lack of national targets, means that districts are unaware of whether their pupils are performing at acceptable levels.

The absence of a decentralized data collection and reporting process has also meant that accountability for meeting ESSP or SDG indicators has largely fallen on the national level. Decentralizing the data collection and reporting process so that districts are reporting the percentage of pupils at each grade level meeting minimal expectations and having district targets – aligned with national targets – incorporated into the district Imihigo would make districts and schools more accountable for improving the quality of teaching and learning.

Summary. Although the 11 different large-scale reading assessments carried out over the past eight years do not provide a consistent or comprehensive picture of early primary pupils' reading skills, or the extent to which their skills have improved over time, many lessons have been learned. These include lessons on: 1) the need to ensure grade- and subject-specific benchmarks for "minimally meeting grade-level expectations" are defensible and comparable across assessment instruments, 2) the importance of establishing ambitious but justifiable national and district targets with respect to the percentage of pupils meeting minimal benchmarks, and 3) the need to transfer responsibility for meeting these targets from the national to the district/school level.

The remaining sections of this paper describe two related initiatives that capitalize on the lessons learned from the numerous large-scale reading assessments carried out in Rwanda:

• an assessment framework (Section 4) that would bring much-needed coherence to early grade reading assessments and ensure that benchmarking practices align with international best practices in the field.

Section 4 describes the critical elements of a reading assessment framework and, for each, identifies whether or not the element has been put in place, as well as elements that warrant revision in order to align with international best practices. It then proposes a program of activities for the development of an assessment framework for early grade reading that, once validated, would bring much-needed consistency and comparability to early grade reading assessments.

• an accountability framework (Section 5) that would ensure that all educational stakeholders are aware of the minimal reading benchmarks pupils at each grade level are expected to achieve, that national and district levels have defensible but ambitious targets with respect to those benchmarks, and that all levels of the system collect and report, on a regular basis and using a digital dashboard, on the percentage of pupils meeting those benchmarks, as well as progress towards targets. The accountability framework described here would allow national and district structures to share the responsibility for collecting and reporting on progress with respect to targets, while at the same time shifting responsibility for improving learning outcomes to the structures that have the most control over the quality of learning in schools: district/sector officials, head teachers, and classroom teachers.

TEXTBOX 1 – Extending to other grade levels and subject areas Although the assessment and accountability frameworks proposed here target lower primary and are limited to one key subject area – reading – the processes for developing and implementing them can be applied to all grade levels and subject areas. The development and implementation of early grade-reading assessment and accountability frameworks can in fact serve as a pilot for development of similar frameworks for other subject areas and grade levels.

4. EARLY GRADE READING ASSESSMENT FRAMEWORKS

The diagram below, and the table on the following page, describe the key elements of an early grade reading assessment framework.


Diagram 1: Elements of an assessment framework for early grade reading (or any subject area). [The diagram repeats the framework elements shown in the executive summary: grade-specific curriculum standards derived from the general goal for the subject (established by master teachers and curriculum specialists); performance categories, common across subjects and grade levels and aligned with performance descriptors; qualitative and quantitative performance descriptors for each category, describing what pupils in each category can and cannot do with respect to the grade-level standard (developed by curriculum specialists for the content area, and unchanged unless the curriculum changes); cut scores for the performance categories for each assessment instrument (set by learning assessment specialists using defensible, international best practices such as the Angoff, Ebel and Bookmark methods); and harmonized assessment instruments to measure progress against the descriptors, comprising national end-of-year instruments, district term 1 and 2 instruments, and sector instruments.]

Each element of the framework is described below.

A. GRADE-SPECIFIC READING CURRICULUM STANDARDS Standards are general statements of what children should be able to do at the end of a learning cycle or a specific grade level.

1. Who should develop grade-specific reading standards? Standards are generally established by curriculum specialists.


Table 5: Illustrative example, Assessment Framework, P3 Reading

Curriculum Standard: Read a P3 end-of-grade-level text fluently and confidently; retain or identify most of the literal information contained in the text.

Performance categories
• Does not meet expectations: Not performing at a level that will ensure success.
• Partially meets expectations: Partial mastery of material, but not enough to meet grade level expectations.
• Meets expectations: Solid academic mastery of material; performing at grade level expectations.
• Exceeds expectations: Superior performance; performing above grade level expectations.

Performance descriptors

Does not meet expectations. Oral reading fluency: Does not read at all or reads very slowly and with great hesitation, with little or no accuracy. Either cannot read the words, reads only the first letter/syllable before getting stuck, or misreads words entirely. Comprehension: Is unable to extract any literal information, recall details or make inferences. Needs adult support to retell a story. Vocabulary: Is unable to figure out the meaning of new words.

Partially meets expectations. Oral reading fluency: Is uncomfortable reading orally; reads slowly and hesitantly, with little expression. Tends to read words rather than sentences, but with some fluency and accuracy without overtly segmenting. However, uses overt segmenting to sound out unfamiliar words. Does not always identify when reading does not make sense because a mistake has been made; continues to read rather than re-reading or self-correcting. Comprehension: Extracts some literal information from text, for example names of main characters; answers to questions may contain inaccuracies. Cannot make logical inferences based on information in the story. When asked to retell a story, focuses on one key event or includes inaccurate information. Predictions are often guesses, not based on information in the text. Vocabulary: Is able to figure out the meaning of some new words.

Meets expectations. Oral reading fluency: Is comfortable reading orally. Reads with some fluency and generally accurately, although not necessarily smoothly. Sometimes hesitates when sounding out unfamiliar words or uses overt segmenting and blending when sounding out these words. Does not consistently respect sentence punctuation (commas, question mark, period). Usually checks to make sure the selection is making sense and re-reads or self-corrects if necessary. Comprehension: Extracts most of the literal information from text; provides generally accurate and complete answers. Makes inferences (but not always accurate ones) based on what is said or done in texts. Accurately identifies main characters. Makes reasonable predictions, when prompted, of what might happen, based on events in the text. Can accurately retell, in sequence, most main events or ideas in a P3 story. Vocabulary: Is usually able to figure out the meaning of new words by using context clues or analysing the structure of the word.

Exceeds expectations. Oral reading fluency: Exhibits confidence when reading. Reads quickly, smoothly and accurately, respecting punctuation (commas, question mark, period). May even read with expression and correct intonation. Sounds out unfamiliar words automatically, accurately and without hesitation. Identifies when reading has not made sense and self-corrects. Comprehension: Extracts all of the literal information from text and provides clear, complete and accurate answers to questions. Makes accurate inferences based on what is said or done in the text. Accurately describes main characters. Can accurately retell a P3 story, including specific and relevant details. Makes plausible predictions of what might happen, based on events in the text. Vocabulary: Uses a variety of strategies to accurately infer the meaning of new words.

Cut scores for each assessment tool
• Does not meet expectations: EGRA oral reading fluency 0 to 19 CWPM; etc.
• Partially meets expectations: EGRA oral reading fluency 20 to 39 CWPM; etc.
• Meets expectations: EGRA oral reading fluency 40 to 60 CWPM; PASEC Level 3; TERCE Level 1; UNICEF MICS Level 6; etc.
• Exceeds expectations: EGRA oral reading fluency 61+ CWPM; etc.


2. Do grade-specific standards exist for early grade reading, Kinyarwanda? Yes. The statements can be found in the curriculum. For example, the P2 curriculum contains the general goal or standard that pupils be able to “read fluently, and answer questions on everyday life experiences.”

B. PERFORMANCE CATEGORIES Performance categories group pupils according to the level of performance they demonstrate, for example not meeting grade-level expectations, meeting grade-level expectations, or exceeding grade-level expectations. While the number of performance categories can vary by jurisdiction, the general guideline is to limit it to three or four, as it becomes more difficult to differentiate between categories as the number of categories increases (Zieky & Perie, 2006).

1. Who should develop performance categories? Performance categories should be developed by those charged with carrying out large-scale learning assessments.

2. Do performance categories exist for early grade reading? Performance categories do exist and have been used for reporting pupils’ performance results in some reports but, as mentioned in Section 2, their use has been inconsistent. Where they have been used, the number of performance categories and the labels attached to them have varied across assessments. There is a need to stabilize the number of categories and their labels to ensure consistency in future large-scale learning assessments.

C. PERFORMANCE DESCRIPTORS Performance descriptors describe qualitatively — and sometimes quantitatively — what a pupil in each performance category can do with respect to the overall curriculum standard. In the example shown (see Table 5), they describe what a P3 pupil who has met (or exceeded) grade-level expectations is able to do when presented with a P3-level text.

Performance descriptors reflect curriculum expectations for a given grade level. For that reason, they are established by curriculum experts familiar with those expectations. Once established, performance descriptors only change if the curriculum changes.

Descriptions that are comprehensive in nature provide assessment developers with an indication of the key skills their assessments should measure. Performance descriptors also play a critical role in the establishment, for each assessment instrument, of cut scores corresponding to each performance category (see further discussion, Section D below).

Importance of national performance descriptors in reporting against common, global indicators like SDG 4.1.1. Table 1 outlines the draft performance descriptors the global community has agreed to report on with respect to pupils’ reading progress in P2 and P3. Aligning national performance descriptors for “meeting grade level expectations” with the global descriptors and ensuring that cut scores for “meeting grade level expectations” align closely with the descriptor provides a defensible basis for the use of the data to report progress with respect to SDG 4.1.1.

1. Who should develop performance descriptors? Performance descriptors should be established by curriculum specialists familiar with the expectations for each grade level.


2. Do performance descriptors for early grade reading in Kinyarwanda exist? No. As mentioned in Section 2, none of the large-scale reading assessments have provided rich, criterion-referenced performance descriptors for the different categories of pupil performance. This limits comparability across assessments. In the case of FARS or EGRA, quantitative descriptors have been provided in the form of the number of correct words a pupil is able to read in one minute for the fluency task, or the percentage of reading comprehension questions pupils are able to answer. However, no overall qualitative descriptors, linked to curricular expectations for the grade level, are provided.

The LARS II 2016 report stressed the need to develop clear performance descriptors for “not meeting” and “meeting grade level expectations”:

We would recommend developing a clear standard of what a pupil needs to achieve in order to “Meet Expectations” before the test is rolled out. One of the lessons we learned from LARS II is that the number of pupils not meeting expectations varies greatly depending on the definition adopted... Having clear cut-offs of the boundaries between the three levels will be helpful when analyzing the results and will allow REB to obtain a precise number of children in each category. (p. 64)

D. CUT SCORES AND BENCHMARKS THAT ALIGN WITH PERFORMANCE CATEGORIES Cut scores are the scores that separate different performance categories. Benchmarks are scores that separate pupils “not meeting grade level expectations” from those who are meeting grade level expectations. The table below, taken from a state-wide assessment of reading in the state of Oregon in the U.S., shows the link between scores on a particular assessment, in this case the Proficiency Assessment (ELPA 21), the different performance categories, and more importantly curriculum-based performance descriptors for each category.


Table 6: Example, link between cut scores, benchmarks, performance categories and performance descriptors, Oregon ELPA 21 (table not reproduced here; for each performance category it shows the corresponding score range and the performance descriptor corresponding to that score)

Benchmarks are the minimal score a pupil needs to attain to be considered to be meeting grade level expectations. In the Oregon framework shown, that score is 555 (second score range) or 610 (third score range).

Linking cut scores and benchmarks to performance descriptors makes pupils’ results more meaningful. Stakeholders get a clear image of what pupils who score within a given range can and cannot do with respect to reading. Not having performance descriptors limits the usefulness of the data.

Cut scores and benchmarks are assessment-specific. Different assessment instruments will have different cut scores for each of the categories retained. However, if the cut scores and benchmarks for each instrument align with the common performance descriptions for each category, a pupil who obtains a score in the “meet expectations” range on assessment A should be able to obtain a score in the “meets expectations” range on assessment B. Also, if the nature of an assessment instrument changes significantly (i.e., if it is harder or easier than previous versions), the cut scores for the different performance categories need to be adjusted to reflect those changes and preserve alignment with the performance descriptors.

1. Who should develop cut scores and benchmarks for a given assessment instrument? International best practices suggest that cut scores should be set by curriculum experts and master teachers with extensive, nuanced knowledge of the range of abilities of pupils at given grade levels with respect to the content area in question. The involvement of panels of experts is an essential component of best practices in establishing cut scores (Zieky & Perie, 2006; Livingston & Zieky, 1982; Zieky & Perie, 2008).

2. Do (defensible) cut scores and benchmarks exist for early grade reading instruments? Yes and no. Although cut scores exist for both LARS I and for the oral reading fluency (ORF) and reading comprehension (RC) EGRA and FARS subtasks, there has been no consistent methodology for establishing them. In the case of LARS III, pupils are evaluated as to whether they met a specific benchmark, but there is no indication in the report as to what the benchmark is or how it was obtained. In the case of EGRA/FARS ORF and RC, only the P3 cut scores were based on empirical data, an essential component of international best practices (EDC, 2013). The P1 and P2 cut scores were established by

extrapolating from the P3 results, a process that produced cut scores and benchmarks that are difficult to defend, nationally or internationally (see Section 3 for a discussion of the limitations of the P1 cut scores and benchmarks).

3. What processes should be used to establish cut scores? As the quote in the textbox below points out, there are no definitive cut scores for a given assessment, just more or less defensible ones based on the methodology used to establish them.

Textbox 1: Best practices, establishing cut scores and benchmarks. “There are no true or correct cut scores for a test, only more or less defensible ones. Defensibility is based in large measure on the method used to set the standards.” Ferrara, Perie & Johnson, 2008. Quoted in EGRA Toolkit, 2016, p. 132.

The most common, internationally recognized methodologies for establishing cut scores – the Angoff, the Bookmark, the Ebel, the Nedelsky – are summarized in Appendix E-4. All of these methodologies share a common feature: they have curriculum experts and master teachers, with extensive and nuanced professional knowledge of the range of abilities of pupils at given grade levels, assess the relative difficulty of each assessment item and, based on that judgment, set cut scores for the different performance categories, including the benchmark score for “minimally meeting grade level expectations.” The involvement of panels of experts is an essential component of best practices in establishing cut scores (Zieky & Perie, 2006), as is ensuring that the cut scores are content- or assessment-specific and empirically established.

Using any of the above methods would bring current practices in establishing cut scores or benchmarks into alignment with international best practices, ensuring their defensibility at the international level. Of all of the methods described, however, only the Angoff method30 (see Textbox 2 below) can be used for both oral, one-on-one interview assessments like EGRA or FARS and pencil and paper assessments like LARS. It is also the most widely used and best researched method for establishing cut scores (Cizek, 2012). For both of those reasons, this paper recommends that future attempts to establish cut scores and benchmarks use a modification of the Angoff method. (See Appendix E-5 and Ferdous (2019) for a description of the modified Angoff method and how it can be used to establish cut scores and benchmarks for oral reading fluency or reading comprehension.)

30 See Plake, Ferdous & Buckendahl (2005); Smith, Davis-Becker & O’Leary (2014); Impara & Plake (1997); and Zieky & Perie (2008) for a more in-depth discussion of how the Angoff method is applied.


Textbox 2 : Brief overview of the Angoff and modified Angoff method (developed in 1990)

With the Angoff method, master teachers and curriculum experts examine the degree of difficulty of each question on the assessment and come to consensus on the percentage of pupils in a given performance category, for example “not meeting grade level expectations,” who would be able to correctly answer each question. This percentage becomes the predictive difficulty score for a given question, for pupils in a given performance category. The overall cut score is then calculated by averaging experts’ predictive difficulty scores for each question in the assessment.

The modified Angoff was developed because master teachers and curriculum experts often have difficulty visualizing a large group of pupils and determining the percentage who would be able to correctly answer a given question. The modified Angoff overcomes this difficulty by having experts identify a pupil in their classroom who is at the lowest end of the performance descriptor for a given category (i.e., a borderline pupil) and indicate whether or not that pupil would be able to answer each question (i.e., yes or no). During collation, facilitators analyze the number of “yes” and “no” votes for each question to establish the average score pupils would obtain.
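A minimal illustration of the collation step described above is sketched below in Python. The panel data, question labels, and function name are hypothetical and are not drawn from any REB or Soma Umenye tool; the sketch simply assumes one yes/no judgment per panelist per question and averages the resulting counts, as the modified Angoff procedure describes.

```python
# Minimal, illustrative sketch of the modified Angoff collation step.
# Each panelist records, for every question, whether a borderline pupil
# (one who just barely meets grade level expectations) would answer correctly.

def modified_angoff_cut_score(judgments):
    """judgments: list of dicts, one per panelist, mapping question id -> True/False.
    Returns the provisional cut score (expected number of correct answers)."""
    panel_estimates = []
    for panelist in judgments:
        # Each panelist's estimate is the number of questions the borderline
        # pupil is judged able to answer correctly.
        panel_estimates.append(sum(1 for correct in panelist.values() if correct))
    # The provisional cut score is the average of the panelists' estimates.
    return sum(panel_estimates) / len(panel_estimates)

# Hypothetical panel of three experts judging a five-question assessment.
panel = [
    {"q1": True, "q2": True, "q3": False, "q4": True, "q5": False},
    {"q1": True, "q2": False, "q3": False, "q4": True, "q5": False},
    {"q1": True, "q2": True, "q3": False, "q4": True, "q5": True},
]

print(round(modified_angoff_cut_score(panel), 1))  # 3.0 out of 5 questions
```

In practice, facilitators would typically also feed divergent judgments back to the panel for discussion before the final averaging.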

E. RIGOROUS METRICS DEFINING THE NATURE OF AN END-OF-GRADE-LEVEL TEXT Although not included in Table 2, a national early grade reading assessment framework should include rigorous metrics describing the characteristics of the grade-level text that pupils should be able to read at the end of each of P1, P2, and P3. That is because the fluency with which pupils are able to read a text, as well as their ability to comprehend the ideas contained, depend upon the difficulty of the text (Amendum et al., 2018; Room to Read, 2018). Pupils will read a text more fluently, and better understand the ideas presented, if the text has short words; familiar words; a high percentage of repeated words; short, simple sentences; and one or two key ideas. Conversely, the fluency of their reading and the degree to which they are able to extract information from a text will decrease when presented with a text with longer words, a higher percentage of unfamiliar words, few repeated words, and longer sentences (Ibid.).

Textbox 3: Link between text difficulty and cut scores/benchmarks. “Benchmarks (or cut scores) will only be reliable and valid if measures of reading fluency and comprehension are themselves reliable and valid.” Room to Read, 2018

Strictly controlling the level of text difficulty for a given grade level is necessary for comparability across jurisdictions and over time (EDC 2013; Room to Read 2018). If texts of equal difficulty are not used in consecutive assessments, it is impossible to attribute any increase (or decrease) in the percentage of children meeting grade-level expectations to improvements (or lack thereof) in the teaching-learning environment. For guidelines on how to develop texts of equal level of difficulty for assessing oral reading fluency or reading comprehension, see Room to Read (2018).
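As an illustration of how such text features can be made operational, the sketch below computes a few of the characteristics named above (average word length, average sentence length, and the share of repeated words) for a candidate passage. The metric definitions are assumptions made for illustration only; they are not the criteria established by MINEDUC/REB Kinyarwanda specialists.

```python
import re

def text_difficulty_metrics(passage):
    """Compute simple, illustrative difficulty metrics for a candidate grade-level text."""
    sentences = [s for s in re.split(r"[.!?]+", passage) if s.strip()]
    words = re.findall(r"[\w']+", passage.lower())
    unique_words = set(words)
    return {
        "word_count": len(words),
        "avg_word_length": sum(len(w) for w in words) / len(words),
        "avg_sentence_length": len(words) / len(sentences),
        # Share of word tokens that repeat an earlier word; higher generally means easier.
        "repeated_word_share": 1 - len(unique_words) / len(words),
    }

# Sample passage taken from the example in Appendix E-5.
sample = ("Paul went to the market. He wanted to buy bananas for his sister. "
          "She loves bananas. Paul bought six bananas. His sister was very happy.")
print(text_difficulty_metrics(sample))
```

Equating texts across years would then involve keeping these (and any other agreed) metrics within narrow, pre-defined ranges.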

1. Who should set text metrics? Criteria for text difficulty are generally established by curriculum specialists knowledgeable about the expectations of the curriculum.

2. Do rigorous metrics exist for early grade reading, Kinyarwanda? Yes. To ensure the comparability of future oral reading fluency and reading comprehension EGRA/FARS-like tasks, in August 2018 MINEDUC/REB Kinyarwanda specialists established rigorous criteria for end-of-grade-level texts for P1, P2, and P3.

Textbox 4: Developing a battery of assessment instruments to measure progress, over time, with respect to national targets Not included in the above description are the development of national, district, or sector assessment instruments, aligned with the performance descriptors, and the establishment of defensible cut scores for each instrument. The development of a battery of coordinated assessments provides the basis for the development and implementation of an accountability framework described in Section 4.

F. MOVING FORWARD: DEVELOPING AN EARLY GRADE READING ASSESSMENT FRAMEWORK The discussions in Sections A to E demonstrate that considerable work has been done over the past nine years to develop individual components of an assessment framework. However, there is a need to complete the work and bring much needed coherence to future large-scale assessments of reading in the early grades. Doing so would ensure that future results can be compared across time, across different assessment instruments, and across jurisdictions. All of these are critical for reporting progress with respect to national reading targets, as well as SDG indicator 4.1.1.

The key actions required to develop the framework are the following:

• Developing grade-specific reading standards, based on the expectations of the curriculum

• Coming to consensus on the number of performance categories for reporting pupil performance

• Developing grade-specific performance descriptors for each performance category, aligned with curriculum expectations for each grade level, and ensuring that the performance descriptors for “meeting minimal grade-level expectations” at P2 and P3 align with the SDG performance descriptors for those grade levels. This latter step would ensure that any future national assessment data could be used to report on progress with respect to SDG 4.1.1 at the P2 or P3 level.

• Implementing international best practices, and specifically a modified Angoff method, to establish grade-specific cut scores for the most common skill assessed in early grade reading, and the one referenced in SDG 4.1.1 at the P3 level: oral reading fluency. Doing so now would allow the results of the 2018 large-scale assessment to be analyzed according to the new performance categories and cut scores. This in turn would provide a valid and defensible baseline for future assessments of ORF in P1 to P3 and could potentially provide baseline data for SDG 4.1.1 for P3 reading.

5. NATIONAL EARLY GRADE READING ACCOUNTABILITY FRAMEWORK In a situation in which accountability policies are not well developed, national assessment findings are unlikely to have much effect. (Hopmann and Brinek 2007, Quoted in Kellaghan, Greaney & Murray, 2009).

Accountability frameworks provide mechanisms for monitoring and reporting progress with respect to national, district, or sector targets. The diagram below outlines the processes and procedures for developing and implementing an accountability framework for early grade reading, each of which is described below.

Diagram 2: Developing and implementing a reading accountability framework, P1 to P3. The diagram (not reproduced here) links the following steps: a national baseline conducted to determine the percentage of pupils in each performance category; national targets (the percentage of pupils in each category) set to reflect the skewedness of the data, the resources directed towards improvements, and international trends with respect to national learning gains, and communicated to districts; a district baseline conducted to identify the percentage of pupils in each category; district imihigo targets (short and medium term) set in alignment with national targets; annual district term 1 and 2 assessments carried out using a district assessment tool and performance category cut scores aligned with the national assessment framework and developed at the national level, with progress reported to the national level; district term 1 and 2 results aggregated at the national level via a dashboard; and a national end-of-term-3 assessment to measure progress towards targets.

A. NATIONAL BASELINE The starting point of any accountability framework is necessarily national assessment data on the extent to which pupils are meeting or exceeding grade-level expectations. Data need to be available on the percentage of pupils, at each grade level, in each performance category. This data, and specifically how pupils are distributed across the different performance categories, is essential for setting justifiable targets (Stern & Piper, in press).

Are national data available for a baseline? Yes. The recent 2018 EGRA measured pupils’ performance with respect to all important reading skills (listening comprehension, letter/syllable reading, word reading, oral reading fluency, reading comprehension). If the decision is made to track progress using similar one-on-one oral interviews at either the national or district level, the 2018 data could be used as a baseline. It could also be used to establish defensible but ambitious national targets for these skills. If the decision is made to use a pencil and paper test at either the national or district level, or a hybrid instrument that combines oral-interview tasks with written assessments, an equating process would allow MINEDUC/REB to map the 2018 baseline results onto the new instruments.
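As a simple illustration of how a baseline distribution would be produced once categories and cut scores are fixed, the sketch below classifies hypothetical pupil ORF scores using the illustrative cut-score ranges from Table 5 (not the officially validated benchmarks) and reports the percentage of pupils in each performance category.

```python
# Illustrative only: classify pupils into performance categories from ORF scores
# and report the percentage in each category, as needed for a national baseline.

# The cut scores below reproduce the illustrative ranges shown in Table 5;
# they are not the officially validated benchmarks.
CATEGORIES = [
    ("Does not meet expectations", 0, 19),
    ("Partially meets expectations", 20, 39),
    ("Meets expectations", 40, 60),
    ("Exceeds expectations", 61, float("inf")),
]

def baseline_distribution(orf_scores):
    counts = {label: 0 for label, _, _ in CATEGORIES}
    for score in orf_scores:
        for label, low, high in CATEGORIES:
            if low <= score <= high:
                counts[label] += 1
                break
    total = len(orf_scores)
    return {label: round(100 * n / total, 1) for label, n in counts.items()}

# Hypothetical sample of pupil scores (correct words per minute).
sample_scores = [0, 5, 12, 18, 22, 25, 31, 38, 41, 45, 52, 58, 63, 70, 15, 28]
print(baseline_distribution(sample_scores))
```

The resulting percentages are exactly the figures that the target-setting process described in the next section would start from.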

B. ESTABLISHMENT OF DEFENSIBLE BUT AMBITIOUS NATIONAL TARGETS The distribution of baseline data for each grade level across the different performance categories retained will provide an initial basis to establish defensible targets (see Graph 2: Example of skewed data; not reproduced here). If a large percentage of pupils is performing below grade-level expectations (i.e., if the data is skewed left, as is the case in Graph 2), more ambitious targets should be set for reducing the percentage of pupils in the lowest category (not meeting expectations), and less ambitious targets established for the percentage of pupils “meeting expectations.” If a high percentage of pupils is clustered near but just below the minimal cut score for “meeting expectations,” more ambitious targets can be established (Stern & Piper, in press).

In addition to the skewedness of the data, targets should reflect the international evidence base with respect to national improvements in learning gains over time. Stern and Piper (in press) review the learning gains achieved after three and then five years of a national initiative to improve reading outcomes in low- and middle-income countries. The results provide an informed basis for establishing defensible national gains for similar countries.

Table 8: Example of national targets, by performance category

Year | A. Does not meet expectations | B. Meets expectations | C. Exceeds expectations | Target: Meeting or exceeding expectations (sum of columns B + C)
2018 | 50% | 40% | 10% | 50%
2020 Target | 40% | 48% | 12% | 60%
2030 Target | 25% | 60% | 15% | 75%

Textbox 5: Focus of national (or district) targets Given the skewed nature of reading data collected over the past nine years, the focus of the national targets should be on identifying the percentage increase and decrease of pupils in each category over the short, medium and long term, in addition to the percentage of pupils meeting or exceeding grade-level expectations or benchmarks


C. DISTRICT BASELINE As the realities of each district are different, each will have a different starting point. Having districts collect their own baseline data, using simplified tools that produce results equivalent to those of the national instrument, will allow each district to identify the percentage of pupils in each performance category at baseline and compare that with the national distribution. This also provides the evidence base for the establishment of defensible but ambitious district targets, based on the skewedness of each district’s data.

D. ESTABLISHMENT OF DEFENSIBLE BUT AMBITIOUS DISTRICT TARGETS Having clear, measurable targets for system improvement is needed to drive decision-making… only with similar, clearly articulated measures of improved learning outcomes, and the systems to translate these down to the school level, will attention begin to focus on what is needed to improve student achievement. Crouch & de Stefano, 2015, p. 10

As is the case with the national level, district targets should reflect the skewedness of the data, as well as the international evidence base with respect to average annual learning gains in low-, medium-, and high-resource contexts. Providing districts with common procedures or algorithms to follow to establish targets will ensure consistency across districts.

Targets should be set by identifying the percentage increase and decrease of pupils in each category over the short, medium and long term, consistent with the resources the district expects to expend to improve learning outcomes in the targeted curricular area. This process will allow districts to quickly establish a target for the percentage of pupils meeting or exceeding grade-level expectations over the short, medium or long term.

Recognize that different districts have different starting points and resources to support improved learning. The processes outlined above will allow for different district targets. The targets for districts that have a higher starting point, or where there are more resources in the home or community to support pupils’ learning (high socio-economic status (SES), for example), may be different from those of districts with a lower starting point or fewer resources. The table below illustrates potential starting points and targets for two districts with different profiles.

Table 9: Illustrative example of different targets for districts with different profiles

District | Year | A. Does not meet expectations | B. Meets expectations | C. Exceeds expectations | Target: Meeting or exceeding expectations (sum of columns B + C)
District A - High SES | 2018 | 50% | 40% | 10% | 50%
District A - High SES | 2020 Target | 40% | 48% | 12% | 60%
District A - High SES | 2030 Target | 30% | 60% | 15% | 75%
District B - Low SES | 2018 | 70% | 25% | 5% | 30%
District B - Low SES | 2020 Target | 60% | 33% | 7% | 40%
District B - Low SES | 2030 Target | 50% | 40% | 10% | 50%

The national target for “meeting or exceeding expectations” for any one year should be based on the average of the district targets. Once district targets are established, they are inserted into the Imihigo for the district.
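A minimal sketch of that aggregation is shown below; the district names and figures are hypothetical. A simple average is used, as described above, though an enrollment-weighted average could be substituted if districts differ greatly in size.

```python
# Illustrative aggregation of district targets into a national target.
# District names and figures are hypothetical.

district_targets_2020 = {
    "District A": 60.0,  # % of pupils targeted to meet or exceed expectations
    "District B": 40.0,
    "District C": 55.0,
}

# Simple average of district targets, per the approach described above.
national_target = sum(district_targets_2020.values()) / len(district_targets_2020)
print(f"National 2020 target (meeting or exceeding expectations): {national_target:.1f}%")
```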

E. DISTRICT MONITORING AND REPORTING OF PROGRESS WITH RESPECT TO TARGETS The recent MINEDUC/REB decision to introduce district-level assessments of learning outcomes at the end of terms 1 and 2 provides the policy-level impetus for districts to measure and report on progress with respect to targets. Providing districts with ready-made but simple assessment tools, keyed to the assessment framework and equated to the national assessment tool, as well as corresponding cut scores for the different performance categories, will ensure that the data collected and reported are comparable across jurisdictions.

Collecting additional learning-outcome-related data. In addition to data on learning outcomes, districts might collect data on those factors that have, over repeated national early grade reading assessments, correlated with stronger reading skills (see summary document, Appendix E-6). These include having and using a Kinyarwanda textbook during reading lessons, regular attendance, taking material home each night to read, and low teacher and head teacher absentee rates. Collecting and reporting these data would allow districts to identify potential factors impeding the attainment of targets, and institute measures to address them.

F. NATIONAL MONITORING OF DISTRICT PROGRESS WITH RESPECT TO TARGETS Knowing which schools (or districts) are performing and which are not (in terms of both leading and lagging indicators) allows the system to then provide differentiated response based on that performance. Crouch & DeStefano, 2015, p. 12

The use of a national dashboard, where districts input the results of their assessments for aggregation at the national level, will allow all decision makers to track progress. The diagram below shows an example of a national/district dashboard screen summarizing progress with respect to targets:


Diagram 3 : District display, P3 Reading, by performance categories

Diagram 4: National display, % P3 pupils meeting fluency and comprehension benchmarks, by province

The submission of data via a national/district dashboard would allow the national level to identify districts that are struggling and need additional support to meet their Imihigo targets. This additional support could come from the national level or through a national-district partnership whereby school-based coaches, head teachers, and sector education inspectors are informed of district results and provided with tools to address learning gaps identified and/or address potential factors contributing to low performance results.

G. NATIONAL ASSESSMENT TO CONFIRM PROGRESS, NATIONWIDE, WITH RESPECT TO TARGETS The recent MINEDUC/REB decision to introduce national, end-of-year assessments of learning outcomes at P1 to P5 provides the policy environment required to: 1) validate performance data submitted by districts and 2) generate a national snapshot of progress with respect to targets. To generate consistent results, the national assessment tools must be keyed to the assessment framework and aligned with district-level tools to ensure comparability across jurisdictions. If the results are to be used to generate data with respect to global or regional indicators (for example, SDG 4.1.1), the instruments themselves and the identification of cut scores corresponding to “meet grade level expectations” must align with internationally recognized best practices.


Appendix E-1. Bibliography

Amendum, S. J., Conradi, K. & Hiebert, E. (2018). Does Text Complexity Matter in the Elementary Grades? A Research Synthesis of Text Difficulty and Elementary Students' Reading Fluency and Comprehension. Educational Psychology Review, 30(1), 121-151.

Bejar, I. I. (2008). Standard setting: What is it? Why is it important? R&D Connections, 7, 1–6.

Clark-Chiarelli, N. (2012). Proposed national reading standards, Kinyarwanda and English, P3 & P5. Boston, MA: Education Development Center.

Cizek, G. J. (1996). Standard-setting guidelines. Educational Measurement: Issues and Practice, 15(1), 13–21.

Cizek, G. J. (Ed.). (2012). Setting Performance Standards: Foundations, Methods and Innovations. NY: Routledge.

Crouch, L. and de Stefano, J. (2015). A Practical Approach to In-Country Systems Research. Paper prepared for the Research on Improving Systems of Education (RISE) Program Launch Conference, Washington, DC. June 18-19, 2015. Available at https://ierc-publicfiles.s3.amazonaws.com/public/resources/14_Crouch-DeStefano_Core%20Systems.pdf

Crouch, L. and de Stefano, J. (2017). Doing Reform Differently: Combining Rigor and Practicality in Implementation and Evaluation of System Reforms. International Development Working Paper No. 2017-01. RTI International. Available at: https://www.rti.org/sites/default/files/resources/rti-publication-file-d63038a1-a11e-4672-ad81-fe57c3f631c7.pdf

Education Development Center (2013). Fluency Assessment in Rwandan Schools 2012 Baseline Report (Draft). Boston, MA: Education Development Center.

Education Development Center (2014). Rwandan National Reading and Mathematics National Baseline Report. Boston, MA: Education Development Center.

Education Development Center (2016). National Fluency Assessment of Rwandan Schools – Midline Report. Boston, MA: Education Development Center.

Education Development Center (2017). National Fluency and Mathematics Assessment of Rwandan Schools – End line Report. Boston, MA: Education Development Center.

Ferdous, M. (2019). Setting Multiple Performance Standards for a Timed Reading Fluency and Comprehension Assessment. Manuscript submitted for publication. Obtain via personal communication with author.

Ferrara, S., Perie, M. & Johnson, E. (2008). Matching the judgemental task with standard setting panelist experts: the item descriptor ID matching procedure. Journal of Applied Testing Technology 9 (1).

Greaney, V. and Kellaghan, T. (Eds.) (2008). Assessing National Achievement Levels in Education. Vol 1 of the National Assessments of Educational Achievement Series. Washington: World Bank. Available at: http://siteresources.worldbank.org/EDUCATION/Resources/278200- 1099079877269/547664-1099079993288/assessing_national_achievement_level_Edu.pdf

Hanushek, E. and Wößmann, L. (2007). The Role of Education Quality in Economic Growth. Policy Research Working Paper 4122, World Bank, Washington, D.C. Available at: http://www-wds.worldbank.org/servlet/WDSContentServer/WDSP/IB/2007/01/29/000016406_20070129113447/Rendered/PDF/wps4122.pdf

Hopmann, S. T. and Brinek, G. (2007). Introduction: PISA According to PISA- Does PISA Keep what it Promises? In PISA Zufolge PISA: PISA According to PISA, S.T. Hopmann, G. Brinek and M. Retzl (Eds). 9-19. Vienna: LIT

Impara, J. & Plake, B. (1997). Standard Setting: An Alternative Approach. Journal of Educational Measurement, 34 (4), pp. 353-366.

Kellaghan, T., and Greaney, V. (2001). The Globalisation of Assessment in the 20th Century. Assessment in Education 8(1), 87-102.

Kellaghan, T., and Greaney, V. (2004). Assessing Student Learning in Africa. Washington: World Bank .

Kellaghan, T., Greaney, V. & Murray, S. (Eds) (2009). Using the Results of a National Assessment of Educational Achievement. Vol 5 of the National Assessments of Educational Achievement Series. Washington: World Bank

Livingston, S. and Zieky, M. (1982). Passing Scores: A Manual for Setting Standards of Performance on Educational and Occupational Tests. Princeton, NJ: Educational Testing Services.

MINEDUC (2017). Education Sector Strategic Plan, 2018/19 to 2022/23. Kigali: Ministry of Education.

Montoya, S. (2018.) Upgrade Indicator SDG 4.1.1 (a). Presentation at Eighth Meeting of the IAEG- SDG. November 2018, Stockholm, Sweden. Retrieved Feb 14, 2019 from https://unstats.un.org/sdgs/files/meetings/iaeg-sdgs-meeting-08/4.5%20UNESCO- UIS%204.1.1a%20Reclassification.pdf

NCES (2019a). TIMSS Participating Countries 1995-2019. Retrieved Feb 9, 2019 from https://nces.ed.gov/timss/countries.asp

NCES (2019b). PIRLS Participating Countries 2001-2016. Retrieved Feb 9, 2019 from https://nces.ed.gov/surveys/pirls/countries.asp

NCES (2019c). PISA Participating Countries 2000-2015. Retrieved Feb 9, 2019 from https://nces.ed.gov/surveys/pisa/countries.asp

Organisation for Economic Cooperation and Development (OECD) (1996) The Knowledge-based Economy. Paris: OECD

Plake, B, S., Ferdous, A. A. & Buckendahl, C. Setting Multiple Performance Standards: A Variation of the Yes/No Method. Unpublished and undated manuscript.

Plake, B, S., Ferdous, A. A. & Buckendahl, C. (2005). Setting multiple performance standards using the Angoff yes/no method: An alternative item mapping method. Paper presented at the meeting of the National Council on Measurement in Education Montreal, Canada.


Room to Read (2018). Setting Data-Driven Oral Reading Fluency Benchmarks. Room to Read. Available at https://www.roomtoread.org/media/984470/room-to-read_fluency-benchmarking- guidance-note_published-may-2018.pdf

RTI (2016). EGRA Toolkit. 2nd Edition. RTI International. Available at https://www.globalreadingnetwork.net/resources/early-grade-reading-assessment-egra-toolkit- second-edition

Rwanda Education Board. (2012). Learning Achievement in Rwandan Schools (LARS I).

Rwanda Education Board. (2016). Learning Achievement in Rwandan Schools (LARS II). Final Draft Report for Validation Workshop, September 2016.

Rwanda Education Board. (2018). Learning Achievement in Rwandan Schools (LARS III).

Smith, R, Davis-Becker, S. & O’Leary, L. (2014). Combining the Best of Two Standard-Setting Methods: the Ordered Item Booklet Angoff. Journal of Applied Testing Technology. 15(1) 18-26. Retrieved Feb 14, 2019.

Stern, J.M.B. & Piper, B. (2019). Resetting targets: Why large effect sizes in education development programs are not resulting in more students reading at benchmark. Research Triangle Park, NC: RTI Press.

World Bank Group (2018). The Human Capital Project. Washington: World Bank Publications. Retrieved Feb 14, 2019 from https://openknowledge.worldbank.org/bitstream/handle/10986/30498/33252.pdf

Zieky, M. & Perie, M. (2006). A Primer on Setting Cut Scores on Tests of Educational Achievement. Princeton, NJ: Educational Testing Services. Available at : https://www.ets.org/Media/Research/pdf/Cut_Scores_Primer.pdf


Appendix E-2: National Early Grade Reading Assessments, 2011-2018

Assessm Years, Grade levels and reading skills assessed ent 2010 2011 2012 2014 2015 2016 2017 2018 EGRA P331 P132 P1, p2, p333 (check) Phonemic awareness Listening P listening (naming initial comprehension comprehension sounds of words) Reading letters Reading letters Reading letters Reading syllables Reading syllables Reading syllables Reading familiar Reading familiar words Reading familiar words ORF words ORF RC Decoding (reading RC invented words) ORF RC LARS LARS I LARS II P2 LARS III P3 P334 VOCABUL VOCABULA VOCABU ARY RY LARY RC RC RC FARS P335 P1, P2, P336 P1, P2, P1, P2, P3 ORF ORF P3 ORF RC RC ORF RC RC

31 Data were collected at the beginning of P4, and results considered to be representative of end of P3. Test developers considered the text used to measure ORF and RC to be an end of P2, beginning P3 level.
32 RC and ORF data were collected using what test developers considered to be a P2-level text. As such, the results are not comparable with the ORF and RC results of 2014 to 2016 using grade-specific texts.
33 Strict quantitative and qualitative metrics were developed to define expectations of the type of text a child should be able to read at each grade level and a bank of equated texts developed for each grade level.
34 Data were collected at the beginning of P4 (and P6) and the results considered to be representative of pupil performance at the end of P3 (and P5).
35 The 2012 FARS national data collection used an end of P3 grade level text to measure ORF and RC. Its purpose was to validate the draft P3 performance standards or cut scores established using the 2011 EGRA data which, although at the P3 level, used an end of P2 text.
36 Grade-level texts were developed to measure progress at P1, P2 and P3, although the texts were not always equated to ensure they were of equal levels of difficulty.

Appendix E-3: Mean, Median Method for Establishing Cut Scores, EGRA/FARS

Ideally, cut scores for foundational reading skills like those measured by EGRA or FARS would be based on research confirming the predictive validity of a given cut score. That is, if a pupil reaches the minimal cut score required to be considered to be “meeting grade level expectations” for a given grade level, there is a greater probability that the pupil will meet minimal expectations for this skill at later grades (Dynamic Measurement Group, Inc., 2010). Although there is reliable data on the predictive value of cut scores for reading skills in some languages and in high-resource countries (English in the US, for example), similar data is not yet available for all languages or all contexts. Given the lack of predictive data, the most defensible means of establishing cut scores for foundational skills is to use international best practices for establishing cut scores for large-scale assessments. These are discussed in Appendix E-4 of the concept paper.

The cut scores for EGRA/FARS assessments were established using a mean/median method37, developed specifically for use with EGRA-like tasks, that exploits the strong correlation between reading comprehension (RC) and ORF in early primary. The methodology is summarized in Diagram 1 and below:

Diagram 1: Steps in mean/median method (from USAID, 2015, p. 1; not reproduced here)

Step 1: Participants establish minimally acceptable levels of RC, based on the text presented and the questions asked. In most countries, the EGRA RC performance standard or benchmark is set somewhat arbitrarily at 80%.38
Step 2: Pupils who read at 80% or higher comprehension are isolated and their average (mean or median) ORF scores calculated and arranged in quartiles.
Step 3: The ORF scores of pupils in quartiles 2 and 3 are then used to set minimal ORF cut scores for the grade level.
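A minimal sketch of the three steps is shown below, assuming each pupil record pairs a reading comprehension score (proportion of questions answered correctly) with an ORF score in CWPM; the data and the 80 percent threshold are illustrative only.

```python
import statistics

def mean_median_orf_cut_score(pupils, rc_threshold=0.80):
    """pupils: list of (rc_proportion_correct, orf_cwpm) tuples.
    Step 1: take the RC benchmark (80% by default).
    Step 2: keep pupils at or above that benchmark and sort their ORF scores.
    Step 3: use the ORF scores falling in quartiles 2 and 3 (the middle half)
    to set the minimal ORF cut score for the grade level."""
    readers = sorted(orf for rc, orf in pupils if rc >= rc_threshold)
    q1_index, q3_index = len(readers) // 4, (3 * len(readers)) // 4
    middle_half = readers[q1_index:q3_index]
    return statistics.median(middle_half)  # the mean could be used instead

# Hypothetical data: (reading comprehension proportion, ORF in CWPM).
sample = [(0.9, 45), (0.8, 38), (1.0, 60), (0.6, 20), (0.8, 42),
          (0.4, 15), (0.9, 50), (0.8, 35), (1.0, 55), (0.7, 30)]
print(mean_median_orf_cut_score(sample))
```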

This is the methodology that was used in 2012 to establish cut scores for the P3 ORF performance categories, based on EGRA data collected in 2011 (see the 2012 Clark-Chiarelli report for a description of the methodology used). At the time, the cut scores were considered provisional because they reflected data collected at a single point in time, using a P2-level text. The cut scores were revised or validated in 2012, following a second national assessment of P3 pupils’ oral reading fluency skills using P3-level texts (see EDC 2014 baseline report).

37 More recently, a second methodology based on linear regressions has been piloted. As it requires more advanced statistical knowledge, it is less widely used than the mean or median method.
38 Some countries with very low reading levels, for example the Democratic Republic of Congo, have opted to use 60% as the minimally acceptable reading comprehension level.

Table 1: National P3 ORF cut scores by performance category, Kinyarwanda
Not meeting grade level expectations: 0 to 33 CWPM
Meeting grade level expectations: 33 to 47 CWPM
Exceeding grade level expectations: 48+ CWPM

In 2014, the performance category of “not meeting grade level expectations” was subdivided into three performance categories, the categories were re-labelled, and new cut scores established for the three new categories. The process used to establish the new cut scores is unclear in the literature.

Table 2: National P3 ORF cut scores by performance category, Kinyarwanda, 2014-2016
Non-reader: 0 CWPM
Beginning reader: 1 to 19 CWPM
Emerging reader: 20 to 32 CWPM
Competent reader: 33 to 47 CWPM
Proficient reader: 48+ CWPM

P1 and P2 ORF cut scores – P1 and P2 data were not used to establish cut scores for these two grade levels. Rather, the P3 cut scores were used (Minutes, L3 Steering Committee, April 15, 2015).


Appendix E-4: Four Different International Best Practices for Establishing Cut Scores

Angoff method (developed 1990; for all assessments). Experts examine the degree of difficulty of each question and come to consensus on the percentage of pupils in a given performance category, for example “not meeting grade level expectations,” who would be able to correctly answer each question. This percentage becomes the predictive difficulty score for a given question, for pupils in a given performance category. The overall cut score is then calculated by averaging experts’ predictive difficulty scores for each question in the assessment.

Bookmark method (developed mid-1990s; for multiple choice or short answer tests). Experts (or an Item Response Theory analysis) rank test questions in order of difficulty, from easiest to hardest. Experts then come to consensus on the “bookmark” that corresponds to the hardest question a pupil who just barely “meets grade level expectations” would be likely to answer correctly. This becomes the minimal cut score for that performance category. The process is repeated for other performance categories.

Nedelsky method (developed 1954; for multiple choice tests only). For each multiple-choice question, experts identify the proposed answers that a borderline test taker (i.e., a pupil who just barely “meets grade level expectations”) would be able to recognize as wrong, as well as those that s/he would not be able to identify as wrong. A score is established for the question by dividing 1 by the number of answer options the pupil would not be able to identify as wrong, and hence would have to guess between. For example, if a multiple-choice question has four possible answers, two of which a borderline test taker would be able to identify as wrong, the borderline test taker’s score for that item is 1/2 or 0.5. The cut score for the overall test is calculated by adding the scores the pupil would be expected to obtain on each question (a short worked sketch of this calculation follows the table).

Ebel method (for pencil and paper tests). Experts are given a 4 x 3 grid that plots the importance or relevance of a question with respect to the content being tested (essential, important, acceptable, questionable) against its degree of difficulty (easy, medium, hard). Experts class all the questions into one of the 12 boxes (e.g., questions that are essential but easy, or hard but questionable). They then consider one performance category, for example pupils “meeting grade level expectations,” and come to consensus on the percentage of questions these pupils would be able to answer correctly with respect to the questions in each of the 12 boxes.
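To make the Nedelsky scoring rule concrete, the short sketch below computes a test-level cut score from hypothetical panel judgments about how many answer options a borderline pupil cannot eliminate on each question.

```python
# Illustrative Nedelsky calculation. For each question, experts record how many
# answer options a borderline pupil CANNOT eliminate as wrong; the expected
# score on that question is 1 divided by that number (a chance guess among
# the remaining options). The test cut score is the sum across questions.

options_not_eliminated = [2, 3, 1, 4, 2]  # hypothetical panel judgments, one per question

cut_score = sum(1 / n for n in options_not_eliminated)
print(round(cut_score, 2))  # 0.5 + 0.33 + 1.0 + 0.25 + 0.5 = 2.58
```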

Appendix E-5: Application of Modified Angoff Method to ORF Cut Scores

In the case of ORF, committee members are presented with the grade-level text and asked to identify if, for each word, a pupil at the lowest end of the performance category would a) attempt to read the word and b) read it correctly. The average of the yes/no votes provides an indication of the minimal ORF score for that category, as well as the average accuracy score (% of words read that will be read correctly). The table below gives an example of a recording device experts would use to assess the ability of pupils in a given performance category to read a grade-level text.

Table 9: Example of working document for assessing pupils’ ability to read a grade-level text For text: Paul went to the market. He wanted to buy bananas for his sister. She loves bananas. Paul bought six bananas. His sister was very happy.

Word | Attempted in 1 minute? | Correctly read?
Paul | Yes / No | Yes / No
went | Yes / No | Yes / No
to | Yes / No | Yes / No
the | Yes / No | Yes / No
market. | Yes / No | Yes / No
He | Yes / No | Yes / No
wanted | Yes / No | Yes / No
to | Yes / No | Yes / No
buy | Yes / No | Yes / No
bananas | Yes / No | Yes / No
Etc.
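One plausible way to collate a completed recording sheet like the one above is sketched below; the votes are hypothetical, and the interpretation (words judged readable correctly within one minute approximate the minimal CWPM for the category) follows the description at the start of this appendix. In practice the counts would be averaged across all panel members.

```python
# Illustrative collation of one expert's word-level judgments for a borderline pupil.
# Each entry: (word, attempted within one minute?, read correctly?).
judgments = [
    ("Paul", True, True), ("went", True, True), ("to", True, True),
    ("the", True, True), ("market", True, False), ("He", True, True),
    ("wanted", True, False), ("to", True, True), ("buy", True, True),
    ("bananas", False, False),
]

attempted = sum(1 for _, att, _ in judgments if att)
correct = sum(1 for _, _, ok in judgments if ok)

# Minimal ORF estimate: words judged to be read correctly within one minute.
# Accuracy estimate: share of attempted words judged to be read correctly.
print("Estimated minimal ORF (CWPM):", correct)
print("Estimated accuracy:", round(correct / attempted, 2))
```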

The modified Angoff has been used to establish cut scores for EGRA/FARS in Lebanon (see reference paper), and the validity and reliability of the method have been established.

Using a method like the Angoff or modified Angoff, or any of those outlined in Appendix E-4, to set cut scores would ensure that future practices align with international best practices, namely that cut scores are content- and assessment-specific, empirically determined, and based on local expert judgement. This would make any reporting of results defensible in the eyes of national and international assessment experts.


Appendix E-6: Factors that Correlate with Better Reading Skills, Rwanda 2010-2018

Factors that correlate, at the p < .05 or p < .01 significance level, with P1 to P4 pupils’ mastery of foundational reading skills: a review of national assessments of foundational reading skills, P1 to P4, 2014 to 2017

REB, in collaboration with USAID-supported literacy programs, carried out four national studies of early primary pupils’ foundational reading skills between 2014 and 2017. Each study collected contextual data to identify factors that correlate statistically with pupils’ performance, although not all studies examined the same factors.

The table below identifies factors that correlated with pupils’ reading scores across the four studies. YES+ indicates that the factor was found to correlate significantly, and positively, with pupil scores for each of the studies; that is, the more the factor was present, the higher pupils’ scores. YES- indicates a negative correlation, i.e., the more the factor was present, the lower pupils’ scores. NC means that the factor was studied but not found to correlate significantly with pupil scores. Filled-in boxes indicate that the factor was not examined in the study in question. R-values are not presented, as the studies calculated them separately, by distinct reading skill and grade level; this provides a level of detail difficult to capture in a summary table. The overall level of significance (p-values) of the correlation is indicated, however.

Factors that account for more of the difference in pupils’ performance - Multivariate linear regression analysis conducted on the different data sets has identified factors that have considerably more bearing on pupils’ performance than others. That is the case for pupils who read aloud to someone at home or have someone who reads to them (2018 P1 study) and for the distance pupils live from Kigali (2014 study). Living further away from Kigali accounted for more of the difference in pupils’ scores than did any of the other factors examined in 2014.39 The multivariate linear regression analysis conducted on the 2016 data, however, revealed that no one factor had significantly more bearing on pupils’ performance than did others. School or teacher factors explained only two to three percent of the variance in oral reading fluency scores.

In the 2017 P1 to P4 end line study, home-related factors accounted for more of the difference in pupils’ performance than teacher or school related factors, suggesting that parental involvement in literacy, where parents or family members read to children, check their homework, and ensure their child goes to school every day, has considerable bearing on children’s reading abilities.

39. By 2016, this was no longer a factor in pupils’ performance. However, distance from a District office did significantly correlate with lower pupil performance. The farther pupils were from a district office, the lower their performance.

Factors in bold in the table are those that account for more of the difference in pupils’ performance in certain studies.

Focus | Factor | 2014: P1-P3, n=1,799 (notes 40, 41) | 2015: P1-P3, n=3,014 (note 42) | 2016: P1-P4, n=2,985 (note 43) | 2017: P1, n=2,895 (note 44)

P1 N/S P1YES +** P3 YES-* Age (older = lower score) P2 YES - * P2 YES - ** P1, P2 No P3 YES - ** P3 YES - ** correlation

P1 YES - ** P1 YES - * P1 YES - ** Number of days absent from school P2 YES - * P2 YES - * (greater absentee rate, lower score) P3 YES - **

Has electric light bulb P1 YES +**

NC at any P1 N/S P2 YES +* P1 YES +** Has a television or motorcycle/car at grade P2 N/S home45 P3 YES +*

P1 YES +** P1 YES +** P1 YES +** Shows homework to parents/caregivers P2 YES +* P2 YES +** Pupil P3 YES +** P3 YES +**

P1 YES +* NC at any P1 YES +** Takes Kinyarwanda books home to read P2 YES +** grade P3 N/S

P2 YES+** P2 YES +* P1 YES +** Someone at home reads to pupil P3 YES+* P3 YES +**

Reads to someone aloud at home P1 YES +**

Reads to him/herself at home P1 YES +*

P1 NO P1 YES+ P1 YES+ YES+ Gender – being a girl P2 YES+** P2 YES+ P2 YES+ P3 YES+** P3 YES+ P3 YES+

40 The 2014 EGRA report does not report correlations for many individual factors, but rather uses three different composite indexes to examine correlations between pupil performance and factors related to the home environment, the school/teacher and pupils’ socio-economic status. The higher the composite score on an index, the more resources available to the pupil -in the home, at school, or economic resources - to support improved reading. At the P1 level, no correlations were found. At the P2 level, there were strong – but generally not important - correlations between pupils’ composite score on all three indexes and pupils’ reading skills (p >.01, 2-tailed). The same is true at the P3 level for oral reading fluency (p >.01, 2-tailed). The composite indexes do not explain a significant amount of the variance in student achievement. 41 See Rwandan National Reading and Mathematics Assessment, Baseline Report. L3 Initiative, USAID. Dec. 2014 42 See National Fluency Assessment in Rwandan Schools, Midline Report. L3 Initiative, USAID. Jan 2016. 43 See National Fluency and Mathematics Assessment in Rwandan Schools, End line Report, L3 Initiative, USAID. Jan 2017 44 From Rwanda Early Grade Reading Baseline Report (Draft). Soma Umenye. USAID. April 2018. 45 This is a proxy for socio-economic level


Distance from Kigali (farther = lower YES- ** NC at any grade P1 YES - ** score) P2 YES - * P3 NC Distance from District Office (farther = P1 YES - ** P1 YES - ** lower score) P2 YES - ** P2 YES - ** P3 YES - ** P3 YES - *

Has A2 or higher diploma P1 YES+**

Teacher provides progress reports on P1 YES+* pupils’ reading skills to HT

Support from mentors/coaches P1 – YES+**

Number of days absent from school P1 YES - * P2 YES – * Teacher (greater absentee rate, lower pupil P2 YES - ** P1, P3 NC scores)4647 P3 YES - **

Class size (i.e., does larger class size NC at any grade correlate with lower results)

Presence of school or community library (does presence of library correlate with higher results)

Head teacher Head teacher evaluates learners orally P1 YES+* Head teacher monitors pupils’ progress P3 YES +* through classroom observations P1, P2 NC

Head teacher monitors pupils’ progress P1 YES +* through results on tests P2, P3 NC

Head teacher discussed lower primary P1 Yes+ * literacy results at last SGAC meeting

** = Correlations are significant at the 0.01 level (2-tailed). * = Correlations are significant at the 0.05 level (2-tailed).

The absence of an asterisk indicates that studies reported correlations, either positive or negative, but did not indicate the level of significance.

46 The 2014 EGRA study found that teacher and pupil absenteeism correlated with distance from the district office. Schools farther from the district office had higher rates of teacher and pupil absenteeism than schools closer to the district office.

ANNEX F. DEVELOPMENT AND IMPLEMENTATION OF A COMPREHENSIVE ASSESSMENT PROGRAM FOR READING IN EARLY PRIMARY

The following outlines a comprehensive assessment program for reading in early primary designed by REB, MINEDUC, URCE and Soma Umenye representatives. The program seeks to (1) identify pupils at four different levels of performance (does not meet grade-level expectations, partially meets expectations, meets grade-level expectations and exceeds grade-level expectations) with respect to the key reading competencies in each term and (2) direct targeted remedial interventions to pupils in the two lowest performance categories. The comprehensive assessment also seeks to identify pupils who are unable to demonstrate any performance with respect to the targeted competencies (“zero score pupils”), who need intensive remediation beyond that which can be provided by teachers during class time.

The comprehensive assessment program aligns end-of-P2 and -P3 competencies with the SDG 4.1.1a performance descriptors for those two grade levels as well as with the criteria used to measure education quality in early primary in the Human Capital Index. Such an alignment will ensure that P2 and P3 Term 3 assessment results allow decision makers at all levels of the system (school, sector, district, national) to track progress with respect to these two indicators.

The comprehensive assessment program is supported by an electronic dashboard (see Annex C above) to facilitate the entry, compilation, and presentation of data on the percentage of pupils in each of the four performance categories at all levels (school, sector, district, national) of the system. The data will allow educational decision-makers to monitor progress with respect to national targets and direct resources to struggling districts, sectors, or schools.

The program supports MINEDUC/REB’s recent decision to institute term-based assessments, and to use the results of those assessments to direct remediation to pupils in need. If validated, the comprehensive program would be implemented in a number of pilot districts beginning in the 2019-2020 school year. If successful, it could then be expanded to other school districts, subject areas and levels.

The comprehensive assessment program is built around three key components:

A formative and summative assessment framework that: 1) answers the question "How good is good enough?" with respect to reading competencies in early primary, 2) defines "good enough" in terms of benchmarks and targets, and 3) provides common tools for measuring and reporting progress across all levels of the system.

A formative accountability framework that:
• Has all levels of the system (school, sector, district, national) use common tools for measuring and common benchmarks for interpreting progress each term;
• Has schools direct targeted remediation activities to pupils in the lowest performance categories.

An electronic dashboard that tracks the percentage of P1, P2 and P3 pupils in each school in each performance category (does not meet, partially meets, meets expectations, exceeds expectations); aggregates that data at the sector, district, and national levels; and makes it available to decision makers at all levels of the system so they can direct resources to struggling areas. The dashboard also provides information on progress with respect to SDG 4.1.1a and Human Capital Index education indicators.


Each of these three components is described below.

A. FORMATIVE AND SUMMATIVE ASSESSMENT FRAMEWORK FOR READING IN EARLY PRIMARY

The assessment framework provides MINEDUC and REB curriculum and learning assessment specialists with a reference for developing future formative and summative assessment instruments for all subjects and grade levels. It groups pupil performance into four performance categories commonly used in international assessments. The following table presents the general performance categories and the policy level descriptors for each. The descriptors are generic in nature and applicable to all subjects and grade levels.

Table 1: General performance categories, with policy level descriptors

Does not meet expectations: Pupils are performing well below expected levels. They have very limited mastery of the knowledge, skills, and competencies outlined in the curriculum. As a result, they cannot complete most basic grade-level tasks.

Partially meets expectations: Pupils are performing below expected levels. They have partial mastery of the knowledge, skills, and competencies outlined in the curriculum. As a result, they are able to complete some (or partially complete some) basic, grade-level tasks.

Meets grade-level expectations: Pupils are performing at expected levels. They have developed sufficient mastery of the knowledge, skills, and competencies outlined in the curriculum to successfully complete most basic tasks.

Exceeds grade-level expectations: Pupils are performing beyond the expected level. They have developed superior mastery of the knowledge, skills, and competencies outlined in the curriculum. As a result, they can successfully complete all basic tasks as well as some complex tasks.

Pupils who are unable to begin a task are considered to be too low to accurately assess and are generally included in a “too low to assess” or “zero” category.

Adapting the general policy level descriptors to reading in early primary generates the following general descriptors for reading at this level:

Table 2: General Performance Category Descriptors, Reading in Early Primary

Does not meet expectations: Pupils are unable to identify most sounds in words or to consistently connect sounds to letters. As a result, they make many errors reading the simplest grade-level texts, so many that they do not understand most of what they are reading and can only rarely figure out the meaning of new words. They do not like to read.

Partially meets expectations: Pupils read grade-level texts very slowly – usually syllable by syllable – hesitantly, and with limited confidence, particularly when faced with longer texts. They make some errors when reading familiar words or simple texts, often skipping over words entirely. They rarely go back to self-correct. As a result, they are able to answer some basic literal comprehension questions or figure out the meaning of some new words.

Meets grade-level expectations: Pupils read most grade-level texts accurately. They usually self-correct when they make a mistake and are able to figure out the meaning of most new words by using simple clues in the text or illustrations. They are able to answer most literal comprehension questions and demonstrate confidence in their reading abilities.

Exceeds grade-level expectations: Pupils read grade-level texts easily and fluently, respecting tones and syllable duration. They are able to figure out the meaning of all new grade-level words and answer all literal and simple inferential comprehension questions, including making accurate and logical predictions and making personal connections between the text and their own lives. They like to read and have confidence in their reading abilities.


The general performance descriptors can be further broken down into grade-specific descriptors (see Appendix F-1), aligned with the overall grade-level expectations for reading outlined in the Kinyarwanda curriculum and the corresponding competencies.

The grade-specific reading performance descriptors describe what pupils in each performance category can do if they have met the expectations of the Kinyarwanda curriculum.

Grade-specific performance descriptors are the primary reference for the establishment of cut scores or benchmarks for reading assessments (term assessments, end-of-year assessments, large-scale pencil and paper assessments like LARS, large-scale one-on-one interview assessments like EGRA, etc.). Doing so ensures comparability across assessment results.

The grade-specific performance descriptors in Appendix F-1 have been aligned with the draft performance descriptors for SDG 4.1.1a. As the table below demonstrates, the Rwandan performance descriptors are of equal or slightly higher difficulty than the SDG descriptors.

Table 3. Comparison of SDG 4.1.1a performance descriptors and P2, P3 performance descriptors

Grade: P2
Draft SDG 4.1.1a performance descriptor (November 2018): Students read and comprehend most of the written words in an instrument given to them, particularly familiar ones, and extract explicit information from sentences.
Rwandan performance descriptor for "meets grade-level expectations": Students read P2-level texts accurately but not necessarily quickly. They are able to answer basic literal comprehension questions, locate information in these texts, and understand the meaning of words in texts.

Grade: P3
Draft SDG 4.1.1a performance descriptor (November 2018): Students read aloud written words accurately and fluently. They understand the overall meaning of sentences and short texts. They identify the texts' topic.
Rwandan performance descriptor for "meets grade-level expectations": Students read P3 texts with enough accuracy and fluency. They answer literal and some basic inferential questions about these texts, make reasonable predictions of what might happen, and summarize accurately main events or ideas.

Cut scores or benchmarks. An internationally recognized best practice (modified Angoff method) was used to establish draft cut scores and benchmarks for oral reading fluency and comprehension, when progress is measured using a one-on-one interview type assessment and a text corresponding to the definition of “end of grade-level text” established in 2018 (see Table 4 below).

Table 4: Draft cut scores and benchmarks for P1 to P3 oral reading fluency and comprehension48

Oral reading fluency – P1: does not meet, 1 to 5 cwpm; partially meets, 6 to 9 cwpm; meets grade-level expectations, 10 to 19 cwpm; exceeds grade-level expectations, 20 or more cwpm; grade-level benchmark, 10 cwpm.
Oral reading fluency – P2: does not meet, 1 to 9 cwpm; partially meets, 10 to 19 cwpm; meets, 20 to 29 cwpm; exceeds, 30 or more cwpm; benchmark, 20 cwpm.

48 This table includes the initial draft cut scores and benchmarks established after the first round of consultation with teachers. They were further revised after Round 2 and the final draft cut scores and benchmarks are included under sub-IR 2.2.1 in main body of the quarterly report above.


Oral reading fluency – P3: does not meet, 1 to 20 cwpm; partially meets, 21 to 34 cwpm; meets, 35 to 45 cwpm; exceeds, 46 or more cwpm; benchmark, 35 cwpm.
Reading comprehension – P1: does not meet, 20%; partially meets, 40%; meets, 60% to 80%; exceeds, 100%; benchmark, 60%.
Reading comprehension – P2: does not meet, 20%; partially meets, 40%; meets, 60% to 80%; exceeds, 100%; benchmark, 60%.
Reading comprehension – P3: does not meet, 20%; partially meets, 40% to 60%; meets, 80%; exceeds, 100%; benchmark, 80%.
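To show how cut scores of this kind translate into classification decisions, the following is a minimal illustrative sketch in Python (hypothetical function and variable names, not part of any REB/MINEDUC instrument), using the initial draft oral reading fluency cut scores from Table 4:

# Illustrative only: assign a performance category from an oral reading fluency
# score (correct words per minute), using the initial draft cut scores in Table 4.
DRAFT_ORF_CUT_SCORES = {
    # grade: (partially_meets_from, meets_from, exceeds_from)
    "P1": (6, 10, 20),
    "P2": (10, 20, 30),
    "P3": (21, 35, 46),
}

def classify_orf(grade, cwpm):
    """Return the draft performance category for an oral reading fluency score."""
    if cwpm == 0:
        return "zero score (needs intensive remediation)"
    partially, meets, exceeds = DRAFT_ORF_CUT_SCORES[grade]
    if cwpm >= exceeds:
        return "exceeds grade-level expectations"
    if cwpm >= meets:
        return "meets grade-level expectations"
    if cwpm >= partially:
        return "partially meets expectations"
    return "does not meet expectations"

print(classify_orf("P2", 24))  # -> meets grade-level expectations

The same pattern would apply to the reading comprehension cut scores and, once finalized, to the revised cut scores reported under sub-IR 2.2.1.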

Finalizing the draft cut scores and benchmarks. The finalization of the draft cut scores and benchmarks should involve a five-part process:

1) Comparing the above cut scores and benchmarks with the average cut scores proposed during Round 2 of the modified Angoff process, to determine the extent to which the decisions in the above table reflect teachers’ deep professional knowledge of pupils’ reading skills in P1, P2, and P3. (See Tables 5 and 6 below).

Table 5: Average (mean) cut scores proposed by master teachers, 2nd round of Angoff deliberations

P1: does not meet, 1 to 7 cwpm; partially meets, 8 to 13 cwpm; meets expectations, 14 to 21 cwpm; exceeds, 22+ cwpm; benchmark, 14 cwpm.
P2: does not meet, 1 to 8 cwpm; partially meets, 9 to 27 cwpm; meets expectations, 28 to 33 cwpm; exceeds, 33+ cwpm; benchmark, 28 cwpm.
P3: does not meet, 1 to 20 cwpm; partially meets, 21 to 41 cwpm; meets expectations, 42 to 53 cwpm; exceeds, 53+ cwpm; benchmark, 42 cwpm.

2) Looking at the data master teachers collect in their classrooms between now and March 20th on pupils' reading skills, in particular: a. the extent to which their predictions of the performance category a given pupil will fall into match their eventual assessment of that pupil's reading skills, and b. the distribution of scores in each performance category. This will help determine whether pupils are clustered at the lower or upper boundaries of any particular category or evenly distributed across the category.

Note: It is critical that participants understand that the classroom data is not nationally representative and cannot be used to establish new cut scores or benchmarks. They will also need to understand that the most valid, reliable and defensible data they have for their cut scores and benchmarks is the results of the Angoff method. That must continue to be the basis for policy decision-making.

3) Comparing Rwanda's benchmarks with those of neighbouring countries where Bantu languages are used as the language of instruction in early primary (see Textbox 1). Although there are differences between these languages that would justify slight differences in benchmarks, any significant deviations would be difficult to justify internationally.

Textbox 1: Malawi (Chichewa), 40 cwpm; Tanzania (Kiswahili), 50 cwpm; Zambia, 45 cwpm; Kenya (Kiswahili), 45 cwpm.

As the reading comprehension benchmark is used in the HCI, and as the reading comprehension and/or oral reading fluency benchmarks are being proposed for reporting progress on SDG 4.1.1a at P2 or P3, it is critical that Rwanda have defensible benchmarks for these two competencies.

4) Comparing the draft benchmarks with benchmarks used previously in Rwanda to report progress with respect to these two competencies.

Table 6: Comparison of draft and previous benchmarks

Oral reading fluency – P1: previous benchmark (2014-2016), 1 cwpm; proposed 2018 benchmark, 10 cwpm; benchmark if mean Angoff score used, 14 cwpm; difference, previous vs. draft 2018, +9 cwpm; difference, previous vs. Angoff, +13 cwpm.
Oral reading fluency – P2: previous, 20 cwpm; proposed 2018, 20 cwpm; mean Angoff, 28 cwpm; previous vs. draft 2018, unchanged; previous vs. Angoff, +8 cwpm.
Oral reading fluency – P3: previous, 33 cwpm; proposed 2018, 35 cwpm; mean Angoff, 42 cwpm; previous vs. draft 2018, +2 cwpm; previous vs. Angoff, +9 cwpm.
Reading comprehension – P1: previous, 80%; proposed 2018, 60%; mean Angoff, n/a; previous vs. draft 2018, decrease; previous vs. Angoff, n/a.
Reading comprehension – P2: previous, 80%; proposed 2018, 60%; mean Angoff, n/a; previous vs. draft 2018, decrease; previous vs. Angoff, n/a.
Reading comprehension – P3: previous, 80%; proposed 2018, 80%; mean Angoff, n/a; previous vs. draft 2018, unchanged; previous vs. Angoff, n/a.

5) Ensuring that the benchmarks established are sufficient to support pupils' progress in subsequent grades. This includes ensuring that pupils are reading at a fluency level sufficient to read quickly the longer texts encountered in all subjects beginning in P3, and to support their transition to reading in English beginning in P4. Pupils who have not developed appropriate levels of fluency in Kinyarwanda in P3 will have difficulty applying what they know about reading in Kinyarwanda to reading in English.

ESTABLISHING NATIONAL TARGETS FOR ORAL READING FLUENCY AND READING COMPREHENSION, P1 TO P3

Once the grade-specific benchmarks and cut scores for these two skills are established, national targets should be established by calculating the percentage of pupils in each grade level who, in the national large-scale assessment carried out in 2018, scored at or above the benchmark score. Data on the annual learning gains in countries in the region that have implemented large-scale reading improvement programs will provide an empirical basis for the establishment of informed future targets.
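As a simple illustration of the target-setting calculation described above, the sketch below (hypothetical scores, not 2018 assessment data) computes the percentage of pupils at or above a grade-level benchmark, which would serve as the baseline for national targets:

# Illustrative only: share of pupils scoring at or above the grade-level benchmark.
def percent_at_or_above(scores, benchmark):
    """Percentage of pupils whose score is at or above the benchmark."""
    return 100 * sum(1 for s in scores if s >= benchmark) / len(scores)

# Hypothetical P2 oral reading fluency scores (cwpm), benchmark of 20 cwpm.
sample_p2_scores = [0, 4, 11, 18, 22, 25, 31, 40]
print(f"{percent_at_or_above(sample_p2_scores, 20):.1f}%")  # -> 50.0%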

USING THE ABOVE AS THE BASIS FOR THE DESIGN AND DELIVERY OF A COMPREHENSIVE ASSESSMENT PROGRAM AT THE DISTRICT, SECTOR, AND SCHOOL LEVELS

Improving reading scores will require two parallel initiatives:

1. Ensuring that struggling pupils are identified, that their reading difficulties are diagnosed, and that targeted remediation is delivered to address these difficulties.
2. Ensuring that the factors that have correlated with improved reading abilities in the past are tracked at the school, sector, and district levels and that necessary interventions are introduced to address these factors.

In the case of number 1 above, REB, MINEDUC, URCE, and Soma Umenye representatives involved in the design of the comprehensive assessment program proposed that the new term-based assessments focus on the following reading competencies:

Table 7: Reading competencies to assess each term

P1 – Term 1: identify sounds in spoken words (phonological awareness); accuracy of identification of vowels, consonants, and CV. Term 2: accuracy of letter reading; accuracy of syllable reading (no blends); accuracy of familiar word reading. Term 3: accuracy of syllable reading (with and without blends); accuracy of familiar word reading; accuracy and basic, literal comprehension of simple decodable sentences (middle of Term 3).
P2 – Term 1: accuracy of syllable reading; accuracy and comprehension of decodable texts. Term 2: fluency in syllable reading; accuracy, fluency, and comprehension of decodable texts. Term 3: fluency and accuracy of reading a middle-of-Term-3 decodable text; basic literal comprehension of decodable text (middle of Term 3).


P3 – Term 1: fluency and literal comprehension of P3-level texts with P2 and Term 1 P3 blends (narrative). Term 2: fluency and literal comprehension of P3-level texts with P2 and Terms 1 and 2 P3 blends (narrative). Term 3: fluency and literal comprehension of P3-level texts with P2 and all P3 blends (narrative).

Selection of competencies. The competencies selected are all foundational ones with high predictive value, i.e., research suggests they predict how well pupils will read at later grade levels. They also form a hierarchy of competencies: those in subsequent terms build on those of previous terms (see Diagram 1 below).

Diagram 1: Hierarchy of reading competencies
1. Identifying sounds (P1)
2. Accuracy of letter reading (P1)
3. Accuracy of syllable reading (P1 and P2)
4. Fluency of syllable reading (P2)
5. Accuracy of familiar word reading (P1 and P2)
6. Accuracy of text reading (P2)
7. Oral reading fluency and comprehension of simple grade-level text (P2 and P3)

Aligning competencies this way has diagnostic value. If a pupil in P2 or P3 has difficulty reading a grade-level text fluently, the school can use one of the P2 tools to check whether they are able to accurately read familiar words (P1 and P2 assessment tasks). If they cannot read familiar words, the school can test whether they can read syllables (P1 and P2 assessment tasks). If they cannot read syllables, the school can use a P1 task to test their ability to read letters. Finally, if they cannot identify letters, the school can use the Term 1 P1 tasks to see whether they can accurately identify sounds in words.

The package of term-specific tasks constitutes a battery of diagnostic instruments that schools can use to identify learning gaps.
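The step-down logic described above can be summarized as a simple procedure; the sketch below (hypothetical skill labels and function names, not an actual assessment tool) walks down the hierarchy until the pupil succeeds and reports the last failed level as the learning gap:

# Illustrative only: step-down diagnostic over the hierarchy in Diagram 1.
DIAGNOSTIC_LADDER = [
    # (skill, level of the task that checks it), ordered from most advanced down
    ("read a grade-level text fluently", "P2/P3"),
    ("read familiar words accurately", "P1/P2"),
    ("read syllables accurately", "P1/P2"),
    ("read letters accurately", "P1"),
    ("identify sounds in spoken words", "P1 Term 1"),
]

def find_learning_gap(can_do):
    """Keep testing lower-level skills until the pupil succeeds; the last failed
    level is the gap to target with remediation."""
    gap = None
    for skill, task_level in DIAGNOSTIC_LADDER:
        if can_do.get(skill, False):
            break  # pupil masters this level; no need to test further down
        gap = f"{skill} ({task_level} tasks)"
    return gap or "no gap found at these levels"

# Example: a pupil who cannot yet read familiar words but can read syllables.
results = {
    "read a grade-level text fluently": False,
    "read familiar words accurately": False,
    "read syllables accurately": True,
    "read letters accurately": True,
    "identify sounds in spoken words": True,
}
print(find_learning_gap(results))  # -> read familiar words accurately (P1/P2 tasks)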

Link to SDG and HCI. The Term 3 P2 and P3 competencies align with those measured by SDG 4.1.1a and the HCI. Collecting data on the percentage of pupils meeting grade-specific benchmarks at the end of P2 and P3 will provide decision makers with an indication of progress with respect to these indicators.

Resources required to implement comprehensive assessment program. REB/MINEDUC/URCE and Soma Umenye Kinyarwanda and learning assessment specialists would have to develop and field test simple assessment instruments for each term.

The team would also have to develop simple remediation strategies for pupils in the "does not meet" and "partially meets" performance categories. Ideally many of these activities would take the form of simple games that pupils could take home to play with parents, siblings, or community members.

Both the assessment instruments and the remediation activities would need to be field tested and then piloted in a couple of districts before implementation nationwide. Appendix F-2 includes other remedial strategies identified by REB/MINEDUC/URCE and Soma Umenye representatives. All of these remediation strategies will have to be reviewed by pilot district and school representatives to assess the feasibility of their implementation with existing resources. The success of schools in implementing appropriate remediation should be evaluated as part of the pilot project.

Ensuring tools are easy to use. Since the most reliable way to measure reading competencies, particularly in P1 and even in P2, is via short one-on-one interviews with pupils, REB/MINEDUC/URCE, Soma Umenye and district representatives recommended that the tools be tablet-based, but capable of collecting data in off-line mode. This would automate the data analysis and the identification of which performance category a pupil should be assigned, based on results. Even if there are resources to support such an e-option, pencil and paper versions of the assessment tasks should be created so that assessments can continue should the technology fail.

Setting cut scores and benchmarks for each term-task. Once the tools are developed, a modified Angoff process can be used, in conjunction with the grade-specific performance descriptors, to establish the scores corresponding to each of the four performance categories.

Creating equivalent tasks each year. To avoid a “teaching to the test” situation, equivalent versions of the term-specific tasks would have to be created each year and distributed to districts and schools a week or so before the assessment week. If the decision is made to use tablet-based assessments, this could be easily done by having district or sector inspectors upload the new tasks to the tablets.

B. ELECTRONIC DASHBOARD TO MEASURE PROGRESS IN PUPILS' READING SKILLS

An electronic dashboard would be created to monitor the implementation of the comprehensive assessment program and track progress with respect to national and international indicators at the local, district, and national levels. The dashboard would allow authorities to identify underperforming schools, sectors, or districts and direct additional resources to them.
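A minimal sketch of the dashboard's core roll-up, assuming (hypothetically) that each school reports pupil counts per performance category; the same records can be aggregated at school, sector, district, or national level simply by changing the grouping key:

# Illustrative only: roll up school-reported counts per performance category.
from collections import defaultdict

CATEGORIES = ["does not meet", "partially meets", "meets", "exceeds"]

# Hypothetical records: (district, sector, school, grade, category, pupil_count)
records = [
    ("District X", "Sector 1", "School A", "P2", "does not meet", 12),
    ("District X", "Sector 1", "School A", "P2", "meets", 30),
    ("District X", "Sector 2", "School B", "P2", "partially meets", 20),
    ("District X", "Sector 2", "School B", "P2", "exceeds", 8),
]

def category_shares(group_key):
    """Percentage of pupils in each performance category, grouped by group_key."""
    totals = defaultdict(int)
    by_category = defaultdict(int)
    for district, sector, school, grade, category, count in records:
        key = group_key(district, sector, school)
        totals[key] += count
        by_category[(key, category)] += count
    return {
        key: {c: round(100 * by_category[(key, c)] / totals[key], 1) for c in CATEGORIES}
        for key in totals
    }

print(category_shares(lambda district, sector, school: school))    # school level
print(category_shares(lambda district, sector, school: district))  # district roll-up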


Appendix F-1. Grade-specific performance descriptors, P1 to P3 reading P1 Curriculum Standard: Read simple P1-level texts accurately and understand the basic ideas presented. Does not meet (or below basic) Partially meets (basic) Meets (competent) Exceeds (proficient) Listening comprehension Can only Listening comprehension Can answer Listening comprehension Listening comprehension answer the most basic comprehension simple questions like Who? Where? in a text Can answer all basic literal listening Can answer literal comprehension questions about the text or story read to read to him/her and provide a little detail in comprehension questions about a text questions, as well as basic inferential him/her. Answers are limited to a few answers. read to him/her. Provides some detail in questions or questions requiring making words. answers. a connection between text and his/her Phonological awareness life experiences. Provides complete Phonological awareness/ (sounds) Identifies or recognizes some P1 Phonological awareness (sounds) answers. (sounds) Rarely identifies or recognizes sounds (vowels, consonants, blends, syllables) Identifies or recognizes most P1 sounds P1 sounds (syllables, vowels, blends, in spoken words, but not consistently. Phonological awareness (sounds) (vowels, consonants, blends, and syllables) consonants) in spoken words. Can identify all P1 sounds (vowels, Alphabetic awareness (letters, syllables) in spoken words. consonants and syllables) in spoken Alphabetic awareness (letters, Connects some sounds to letters (vowels, words. syllables) Cannot consistently connect consonants, blends, syllables) but makes Alphabetic awareness (letters, sounds to letters (vowels, consonants, frequent mistakes. As a result, makes syllables) Connects most P1 letters and Alphabetic awareness (letters, blends, syllables). As a result, makes many frequent mistakes reading syllables or simple, letter blends to sounds. As a result, reads syllables) Accurately connects all P1 mistakes reading letters, syllables or short P1 words and rarely recognizes and most P1 syllables or simple, familiar sounds to letters and does so quickly. As simple P1 words. self-corrects when errors made. Has a lot of words, although may do so slowly and a result, reads P1 syllables and words difficult reading longer words. sometimes syllable-by-syllable. Sometimes quickly and smoothly. Vocabulary Can only rarely identify the recognizes and self-corrects his/her meaning of the key/new word in a text by Vocabulary Can identify the meaning of Vocabulary errors. looking at contextual clues. If asked to use some key/new words by looking at Can quickly identify the meaning of most new word in a sentence, hesitates, repeats illustrations or basic contextual clues but key/new words by looking at clues in Vocabulary Can identify the meaning of sentences given by other pupils or needs does so very slowly and not always illustration or context. If asked to use most key/new words by looking at clues in teacher support to finish sentence. accurately. If asked to use new word in a new word in a sentence, creates an illustration or context. If asked to use new sentence, creates a very basic sentence that interesting, original sentence that Fluency Can only read the most basic or word in a sentence, creates an original does not always convey the meaning. conveys meaning. familiar P1 words. For most other words, sentence that conveys meaning. 
reads the first letters or syllables and then Fluency Reads words, sentences and Fluency Can read some basic or familiar P1 gets stuck and cannot complete the work. Fluency Reads words, sentences with a short stories accurately and quickly. words but does not read fluently. Has Looks at words a long time before high degree of accuracy. Reads calmly/not Sometimes reads with expression, difficulty reading any other words accurately reading. Does excessive segmenting of quickly and without expression. making connections between sentences or may skip them entirely. Does excessive words into syllables and cannot blend Sometimes self corrects when errors are in a text. segmenting when reading. Is not able to syllables to produce words. made. recognize errors and self-correct. Voice is Reading comprehension Answers all Reading comprehension hesitant when reading. Reading comprehension Answers most literal comprehension questions and Can only answer the most literal Reading comprehension Answers some literal comprehension questions about questions connecting the text to the real comprehension question about the text. literal comprehension questions. text read. life


P1 Curriculum Standard: Read simple P1-level texts accurately and understand the basic ideas presented. Urwego rw’ibanze Urwego ruciriritse Urwego rukwiye Intyoza Urwego rw’ikirenga Kumva umwandiko yasomewe Ashobora Kumva umwandiko yasomewe Kumva umwandiko yasomewe Itahuramajwi Ashobora gutahura gusubiza ibibazo byoroheje nka: Ashobora gusubiza ibibazo byose Ashobora gusubiza ibibazo byose byoroheje byo rimwe na rimwe amajwi y’inyajwi, Ni nde? Ni he? Ku mwandiko yasomewe, byoroheje byo kumva umwandiko kumva umwandiko ndetse n’ibibazo byoroshye ingombajwi n’ibihekane mu magambo akanatanga ibindi ibisobanuro bike mu yasomewe. Atanga ibindi bisobanuro byo gusesengura umwandiko yasomewe bisubizo bye. byisumbuye mu bisubizo bye. cynagwa ibindi bihuza umwandiko n’ubuzima Ihuzamajwi Ntashobora guhuza buri gihe busanzwe. Atanga ibisubizo byuzuye. amajwi n’inyajwi, ingombajwi n’ibihekane Itahuramajwi Atahura amwe mu majwi Itahuramajwi Atahura amajwi hafi ya byabyo. Ku bw’iyo mpamvu, ntashobora y’inyajwi ingombajwi, ibihekane n’imigemo ari yose y’inyajwi, ingombajwi n’ibihekane Itahuramajwi Atahura amajwi yose y’inyajwi gusoma cyangwa se yagerageza gusoma ku kigero ke, nubwo hari igihe atabishobora. biri ku kigero ke. ingombajwi n’ibihekane biri ku kigero ke. agakora amakosa menshi asoma inyuguti, imigemo cyangwa inkuru zoroshye ziri ku Ihuzamajwi Ahuza amwe mu majwi Ihuzamajwi Ahuza amajwi hafi ya yose Ihuzamajwi Ahuza amajwi yose n’ibimenyetso kigero cy’umwaka wa mbere. n’ibimenyetso dukoresha twandika inyajwi, n’ibimenyetso bikoreshwa bandika n’ibimenyetso bikoreshwa bandika inyuguti ziri ingombajwi n’ibihekane. Akora amakosa iyo inyuguti n’ibihekane. Ku bw’iyo mpamvu, ku kigero ke. Kandi akabikora yihuta. Ku bw’iyo asoma imigemo cyangwa amagambo magufi ari asoma imigemo hafi ya yose cyangwa mpamvu, asoma imigemo n’amagambo biri ku Inyunguramagambo ku kigero ke. Ni gake yikosora iyo akoze amagambo yoroshye kandi amenyerewe kigero ke yihuta kandi atuje. Ni gake atahura igisobanuro k’ijambo ikosa. Biramugora gusoma amagambo cyane biri ku kigero ke, nubwo asoma rishya mu mwandiko agendeye ku byo amaremare. atihuta kandi rimwe na rimwe agemura. Inyunguramagambo Ashobora gutahura yasomye. Iyo bamubwiye gukoresha Rimwe na rimwe iyo akoze ikosa igisobanuro cy’amagambo mashya hafi ya yose, ijambo rishya mu nteruro, arajijinganya Inyunguramagambo Ashobora gutahura arabimenya akikosora. agendeye ku byo yasomye cyangwa ku agasubiramo interuro zatanzwe n’abandi igisobanuro cy’amagambo amwe n’amwe mashusho ari mu mwandiko. Iyo asabwe banyeshuri cyangwa agafashwa n’ mashya agendeye ku byo yasomye cyangwa ku Inyunguramagambo gukoresha ijambo rishya mu nteruro, akora umwarimu mu gukora iyo nteruro. mashusho ari mu mwandiko, ariko akabikora Ashobora gutahura igisobanuro interuro y’umwimerere kandi ifite icyo gahoro gahoro, ndetse rimwe na rimwe nabi. cy’amagambo mashya hafi ya yose, isobanuye. Gusoma udategwa Ashobora gusoma Iyo asabwe gukoresha ijambo rishya mu agendeye ku byo yasomye cyangwa ku gusa amwe mu magambo amenyerewe nteruro, akora interuro ntoya rimwe na mashusho ari mu mwandiko. Gusoma udategwa cyane ari ku kigero ke. Andi magambo, rimwe idafite icyo isobanuye. Iyo asabwe gukoresha ijambo rishya mu Asoma neza yihuta amagambo, interuro asoma inyuguti ibanza cyangwa umugemo nteruro, akora interuro y’umwimerere n’utwandiko tugufi. Rimwe na rimwe asoma Gusoma udategwa Asoma ategwa amwe ubanza ntabashe kuharenga. Abanza kandi ifite icyo isobanuye. 
asesekaza, bityo agahuza neza interuro zigize mu magambo yoroshye cyangwa amenyerewe kwitegereza ijambo umwanya munini Gusoma udategwa - Asoma neza umwandiko. ari ku kigero ke. Biramugora gusoma neza mbere yo kurisoma. Asoma amagambo amagambo n’interuro. Asoma atuje/ariko andi magambo cyangwa akayasimbuka yose. agemura cyane kandi ntashobora guhuza atihuta kandi adasesekaza. Aragemura cyane iyo asoma. Ntashobora imigemo ngo akoremo ijambo. Kumva umwandiko Asubiza neza ibibazo kumenya ko akoze ikosa ngo yikosore. Kumva umwandiko byose byoroheje byo kumva umwandiko Ntamenya ko yakoze amakosa ngo yikosore. Kumva umwandiko Ashobora gusubiza Asubiza ibibazo byoroheje hafi ya byose n’ibibazo bihuza umwandiko n’ubuzima Asoma mu ijwi rishidikanya. gusa ibibazo byoroheje byo kumva byo kumva umwandiko. busanzwe. umwandiko Kumva umwandiko Asubiza bike mu

bibazo byoroheje byo kumva umwandi


P2 Curriculum Standard: Read P2-level texts accurately and with some fluency; Answer simple literal comprehension questions or questions that require making personal connections with everyday life Does not meet (or below basic) Partially meets (basic) Meets (competent) Exceeds (proficient) Listening comprehension Correctly Listening comprehension Answer some Listening comprehension Answer most Listening comprehension Easily answers answer only the most basic literal literal questions and make some basic literal and some simple inferential literal and inferential questions and makes questions about a story. Answers are (although not always logical) predictions. questions and make basic logical plausible and logical predictions. generally short, lacking detail. Cannot When summarizing, will give one or two predictions. When summarizing, includes Accurately summarizes all of the main make logical predictions about story. events or ideas. most of the main ideas or events. ideas or events in a text. Attempt to summarize a text or story are limited to one or two events or ideas. Phonological Awareness Phonological Awareness Recognize or Phonological Awareness Quickly and Sometimes recognize or identify some P2 identify most P2 blends in spoken words. accurately recognizes or identifies all P2 Phonological Awareness Rarely able to blends in spoken words, but not always blends in spoken words recognize or isolate P2 sounds (blends) in consistently. Alphabetic awareness Match most P1, Alphabetic awareness Correctly spoken words. P2 letters or letter blends to sounds to matches all P1, P2 letter or letter blends Alphabetic awareness Sometimes, match reading most P2 syllables and 3 and 4 to sounds when reading P2 syllables or 3 Alphabetic awareness Rarely able to some P1, P2 letter or letter blends to syllable words accurately, although not to 4 syllable words. Reads words match P1or P2 letter or -blends to sounds to read some syllables and words always quickly. accurately, quickly, and effortlessly. sounds. As a result, are unable to decode accurately. most syllables or words accurately. Vocabulary Can figure out the meaning Vocabulary Use a variety of strategies to Vocabulary Have difficulty figuring out of most new words by using use clues in understand meaning of all new words in meanings of new words in P2 text. Are Vocabulary Because of high inaccuracy of illustration or text; Is able to use them P2 texts (including nuanced ones). Able sometimes able to use new words in a word reading, unable to identify meaning new words in a basic, original sentence. to use new words in original, interesting sentence, although sentence will be basic of new words in P2 text or use new sentences. or modeled on a sentence given by words presented Fluency Reads most words in P2 texts another pupil. Fluency Read P2-level texts easily accuracy., Rarely reads syllable by syllable.. accurately, fluently and with confidence, Fluency Read P2 texts, hesitantly and Fluency Read P2 texts slowly, hesitantly Generally (but not always immediately) checking while reading that text makes inaccurately, either misreading most (syllable-by-syllable), with lots of errors self-corrects when reading does not make sense and self correctly quickly if words or skipping over them. and omissions. Does not self-correct. sense. necessary.

Reading comprehension Answer a Reading comprehension Make so many Reading comprehension Answer most Reading comprehension Answer literal limited number of simple, literal errors and read so slowly they are unable basic, literal comprehension questions and simple inferential questions. comprehension questions about text read. to understand the most basic ideas in text. about text. When summarizing, identify Accurately summarize all main events or When summarizing, identify some but not most main events or ideas. ideas in story. all the important information.

SOMA UMENYE QUARTERLY PROGRAM REPORT | 173 P2 Curriculum Standard: Read P2-level texts accurately and with some fluency; Answer simple literal comprehension questions or questions that require making personal connections with everyday life Urwego rw’ibanze Urwego ruciriritse Urwego rukwiye Intyoza Urwego rw’ikirenga Kumva umwandiko yasomewe Kumva umwandiko yasomewe Kumva umwandiko yasomewe Kumva umwandiko yasomewe Asubiza neza gusa ibibazo byoroheje byo Asubiza neza bimwe mu bibazo byoroheje Asubiza ibyinshi mu bibazo byoroheje byo Asubiza bitamugoye ibibazo byo kumva kumva agakuru yasomewe. Atanga byo kumva umwandiko kandi agashobora kumva umwandiko akanasubiza bimwe na umwandikon’ibibazo byo guhuza ibisubizo bigufi bidafite ibisobanuro gutahura rimwe na rimwe uko umwandiko bimwe mu bibazo byo guhuza umwandiko umwandiko n’ubuzima busanzwe kandi bihagije. Ntashobora gutahura uko uza gukomeza. Mu kuvuga muri make ibyo n’ubuzima busanzwe kandi akanatahura ku agashobora gutahura neza uko umwandiko uza gukomeza. Iyo agerageje yasomye, atanga ingingo imwe cyangwa buryo bworoheje uko umwandiko uza umwandiko uza gukomeza. Avuga neza kuvuga muri make ibyo yasomye, ntarenza ebyiri. gukomeza. Mu kuvuga muri make ibyo muri make iby’ingenzi biri mu mwandiko ingingo imwe cyangwa ebyiri. yasomye, atanga ingingo hafi ya zose cyangwa yasomye. Itahuramajwi Rimwe na rimwe ashobora ibitekerezo bikubiye mu mwandiko. Itahuramajwi Ashobora gutahura mu Itahuramajwi Ni gake cyane ashobora gutahura mu magambo yumvise amajwi Itahuramajwi Atahura mu magambo magambo yumvise amenshi mu amajwi gutahura mu magambo yumvise amajwi y’ibihekane byo mu mwaka wa kabiri. yumvise amenshi mu majwi y’ibihekane byo y’ibihekane byo mu mwaka wa kabiri, y’ibihekane byo mu mwaka wa kabiri. mu mwaka wa kabiri. akabikora neza kandi yihuta. Ihuzamajwi Rimwe na rimwe ashobora Ihuzamajwi Ni gake cyane ashobora guhuza amajwi y’inyuguti cyangwa Ihuzamajwi Ahuza amajwi y’inyuguti Ihuzamajwi Ashobora guhuza amajwi guhuza amajwi y’inyuguti cyangwa y’ibihekane byo mu mwaka wa mbere n’uwa cyangwa y’ibihekane byo mu mwaka wa yose y’inyuguti cyangwa y’ibihekane byo y’ibihekane byo mu mwaka wa mbere kabiri n’ibimenyetso byayo agasoma neza mbere n’uwa kabiri n’ibimenyetso byayo mu mwaka wa mbere n’uwa kabiri n’uwa kabiri n’ibimenyetso byayo. Kurera imigemo n’amagambo. agasoma neza imigemo n’amagabo afite n’ibimenyetso byayo mu gihe asoma iyo mpamvu ntashobora gusoma neza imigemo 3 cyangwa 4 byo mu mwaka wa imigemo yo mu mwaka wa kabiri imyinshi mu migemo cyangwa amagambo. Inyunguramagambo Agerageza kabiri nubwo atabikora yihuta buri gihe. cyangwa amagambo agizwe n’imigemo 3 gusobanura, bimugoye, amwe n’amwe mu cyangwa 4. Asoma amagambo neza Inyunguramagambo Ashobora kuvumbra Inyunguramagambo magambo mashya ahuye na yo mu myandiko yihuta kandi bitamugoye. igisobanuro cy’amagambo menshi mashya Kubera ko hari amagambo menshi yo mu mwaka wa kabiri. Rimwe narimwe ahereye ku mwandiko cyangwa ku mashusho. Inyunguramagambo Akoresha adasoma neza, ntashobora gusobanura ashobora gukoresha amagambo mashya mu Ashobora gukoresha amagambo mashya mu uburyo bunyuranye agashobora cyangwa gukoresha mu nteruro nteruro, nubwo iyo nteruro iba yoroheje nteruro ze bwite z’umwimerere. gusobanura no gukoresha amagambo amagambo mashya ahura na yo. cyangwa yiganye iyatanzwe na mugenzi we. yose mashya mu myandiko yo mu Gusoma udategwa Asoma neza menshi mu mwaka wa kabiri (harimo n’imyandiko Gusoma udategwa Gusoma udategwa Asoma arandaga kandi magambo ahura nayo. 
Ni gacye asoma ijimije). Ashobora gukoresha amagmbo Asoma nabi amenshi mu magambo ahura ajijinganya, agemura amwe n’amwe mu agemura. Muri rusange arikosora iyo asomye mashya mu nteruro ze bwite kandi na yo, ajijinganya cyangwa akayataruka. magambo ahura na yo cyangwa akayataruka. nabi nubwo atikosora buri gihe. zishimishije. Iyo akoze ikosa ntiyikosora. Kumva umwandiko yisomeye Kumva umwandiko Gusoma udategwa Bimworoheye, Ashobora gusubiza ibyinshi mu bibazo Kubera ko asoma nabi kandi arandaga Kumva umwandiko yisomeye asoma neza kandi yifitiye ikizere byoroheje byo kumva umwandiko akanavuga ntashobora kumva iby’ingenzi mu byo Ashobora gusubiza bimwe na bimwe mu imyandiko yo mu mwaka wa kabiri kandi muri make ibyinshi mu by’ingenzi bikubiye mu asoma. bibazo byoroheje byo kumva umwandiko. akagenda agenzura niba umwandiko ufite mwandiko yasomye. Iyo vuga muri make iby’ingenzi yasomye icyo uvuze, akanikora vuba igihe bibaye ntabivuga neza byose. ngombwa. Kumva umwandiko yisomeye


P2 Curriculum Standard: Read P2-level texts accurately and with some fluency; Answer simple literal comprehension questions or questions that require making personal connections with everyday life Urwego rw’ibanze Urwego ruciriritse Urwego rukwiye Intyoza Urwego rw’ikirenga Ashobora gusubiza ibibazo byose byoroheje byo kumva umwandiko n’ibibazo byoroheje bihuza umwandiko n’ubuzima busanzwe. Avuga muri make iby’ingenzi byose bikubiye mu mwandiko yasomye.

SOMA UMENYE QUARTERLY PROGRAM REPORT | 175 P3 Curriculum Standard: Read P3-level texts accurately and fluently. Answer literal and basic inferential questions correctly. Does not meet (or below basic) Partially meets (basic) Meets (competent) Exceeds (proficient) Listening comprehension Struggle Listening comprehension Answer Listening comprehension Listening comprehension Answer to answer most literal comprehension most basic literal comprehension Answer literal and some basic literal and inferential questions about questions. Predictions mostly questions, but not inferential. Make inferential questions about text text and make interesting, plausible inaccurate or illogical. Summaries are some basic (although not always read to them. and logical predictions. Summarize incomplete, with only basic logical) predictions. Provides some Predictions are logical. Summarize accurately all main events or ideas in information and not connected in a but not all important activites or accurately most main events or P3-level text. logical way. events when summazing a text. ideas in text. Phonological Awareness Accurately Phonological Awareness Phonological Awareness Able to and quickly identity P3 blend sounds Phonological Awareness Rarely Recognize or isolate some P3 correctly recognize most P3 blend in spoken words recognize or isolate P3 blend sounds blend sounds in spoken words. sounds in spoken words. in spoken words. Alphabetic awareness Match all P1 Alphabetic awareness Match Alphabetic awareness Match to P3 letter or letter blends quickly Alphabetic awareness Rarely match some P1 to P3 letter or letter most all P1 to P3 letter or letter and accurately to sounds to decode P1 to P3 letter or -blends to sounds. blends accurately to sounds, so able blends accurately to sounds, so effortlessly unfamiliar P3 level works As a result, are unable to decode to read some P3-level syllables or able to read most P3-level most P3-level syllables or words words accurately. syllables or words accurately, accurately, particularly those although not always quickly. Vocabulary Able to use contextual containing P3 blends. clues and word structure to Vocabulary Understand basic P3 understanding meaning of new P3 vocabulary. Does not consistently Vocabulary Able to use words. use contextual clues to understand contextual clues to understand Vocabulary High inaccuracy of word more complex or nuanced most new words in P3 texts. Fluency Read P3 texts accurately reading means pupils are rarely able vocabulary. and fluently, sometimes with to identify meaning of new P3 words. Fluency Read P3 texts accurately expression and/or appropriate Fluency Read P3 texts slowly, and with some level of fluency. gestures. hesitantly. Makes some errors. Most of the time goes back and Fluency Read P3 texts very slowly Either does not self-correct or does corrects when reading does not (syllable by syllable), hesitantly and so slowly. make sense.. Reading comprehension Accurately with high degree of inaccuracy. answer literal and inferential Reading comprehension Answers Reading Comprehension questions on P3 texts. Make some basic literal questions. Answer most literal about P3- interesting and plausible predictions Reading comprehension Answer Information in summaries is level texts. Make reasonable of what might happen, based on only the most basic literal accurate, but incomplete, missing predictions of what might happen. information in text. Summarize comprehension question about P3 important details. 
Able to summarize accurately accurately main and secondary text (e.g., name of character) events or ideas in texts.


P3 Curriculum Standard: Read P3-level texts accurately and fluently. Answer literal and basic inferential questions correctly. Does not meet (or below basic) Partially meets (basic) Meets (competent) Exceeds (proficient) Summaries are incomplete, with only most main events or ideas in a P3 basic information and not connected text. in a logical way.


P3 Curriculum Standard: Read P3-level texts accurately and fluently. Answer literal and basic inferential questions correctly. Urwego rw’ibanze Urwego ruciriritse Urwego rukwiye Urwego rw’ikirenga Kumva umwandiko yasomewe Kumva umwandiko yasomewe Kumva umwandiko Kumva umwandiko yasomewe Agorwa no gusubiza ibyinshi mu Asubisa ibyinshi mu bibazo byoroheje yasomewe Asubiza neza kandi vuba ibibazo byose bibazo byoroheje byo kumva byo kumva umwandiko, ariko ibyo Asubiza neza ibibazo byoroheje byoroheje byo kumva umwandiko umwandiko. Atahura ashidikanya guhuza umwandiko n’ubuzima byo kumva umwandiko na bimwe n’ibibazo bihuza umwandiko n’ubuzima cyangwa nabi uko umwandiko uza busanzwe bikamunanira. Agerageza na bimwe mu bibazo bihuza busanzwe. Atahura neza kandi vuba uko gukomeza. Iyo avuga muri make gutahura uko umwandiko uza umwandiko n’ubuzima busanzwe. umwandiko uza gukomeza kandi ibyo yasomye avuga iby’ibanze bike gukomeza nubwo buri gihe atabitahura Atahura neza uko umwandiko uza agashobora gukora neza inshamake kandi akabikurikiranya nabi. neza. Ashobora gukora inshamake gukomeza. Ashobora gukora neza y’ibyo bamusomeye agaragazamo ingizo y’iby’ ingenzi yasomye ariko inshamake ya bwimwe by’ ingenzi zose z’ingenzi. Itahuramajwi Ni gake cyane ntashyiramo ingingo nyamukuru zose. bamusomeye mu mwandiko. ashobora gutahura mu magambo Itahuramajwi Ashobora gutahura amajwi y’ibihekane byo mu mwaka Itahuramajwi Ashobora gutahura Itahuramajwi Ashobora neza kandi vuba mu magambo, amajwi wa gatatu. mu magambo amajwi y’ibihekane gutahura neza mu magambo y’ibihekane byose byo mu mwaka wa bimwe na bimwe byo mu mwaka wa amajwi y’ibyinshi mu bihekane byo gatatu. Ihuzamajwi Ni gake cyane gatatu. mu mwaka wa gatatu. ashobora guhuza amajwi Ihuzamajwi Ashobora guhuza neza y’ibihekane n’inyuguti zibigize. Ibyo Ihuzamajwi Ashobora guhuza neza Ihuzamajwi Ashobora guhuza kandi vuba amenshi mu majwi bigatuma adashobora gusoma neza amwe n’amwe mu majwi y’inyuguti neza amenshi mu majwi y’inyuguti y’inyuguti cyangwa y’ibihekane byo mu imyinshi mu migemo cyangwa cyangwa y’ibihekane byo mu mwaka cyangwa y’ibihekane byo mu mwaka wa mbere kugeza mu wa gatatu amagambo arimo ibihekane byigwa wa mbere n’ibimenyetso byayo mwaka wa mbere kugeza mu wa n’ibimenyetso byayo kandi agashobora mu mwaka wa gatatu. kugeza mu wa gatatu kandi agashobora gatatu n’ibimenyetso byayo, kandi gusoma neza, bitamugoye amagambo gusoma neza imwe n’imwe mu agashobora gusoma neza imyinshi atamenyerewe ari mu myandiko iri ku Inyunguramagambo Asoma migemo n’amagambo byo mu mwaka mu migemo n’amagambo byo mu kigero cyo mu mwaka wa gatu. amagambo menshi nabi bigatuma wa gatatu biri ku kigero ke. mwaka wa gatatu biri ku kigero ke adashobora gusobanukirwa nubwo abikora atihuta. Inyunguramagambo Ahereye ku n’amagambo mashya ahuye na yo Inyunguramagambo Asobanukirwa mwandiko no ku miterere y’ijambo ari ku kigero cyo mu mwaka wa n’amagambo y’ibanze yo mu mwaka Inyunguramagambo Ahereye ashobora gusobanukirwa vuba gatatu. wa gatutu. Ntashobora gushingira ku ku mwandiko ashobora amagambo mashya ahura na yo mu mwandiko ngo asobanukirwe gusobanukirwa n’amagambo myandiko yo mu mwaka wa gatatu. Gusoma adategwa Asoma nabi, n’amagambo amwe akomeye. mashya ahura na yo mu myandiko arandaga, agemura, anajijinganya yo mu mwaka wa gatatu. Gusoma adategwa Asoma neza imyandiko yo mu mwaka wa gatatu. Gusoma adategwa Asoma arandaga imyandiko yo mu mwaka wa gatatu kandi ajijinganya imyandiko yo mu Gusoma adategwa Asoma neza adategwa kandi asesekaza. 
Kumva umwandiko yisomeye mwaka wa gatatu. Agenda akora imyandiko yo mu mwaka wa


P3 Curriculum Standard: Read P3-level texts accurately and fluently. Answer literal and basic inferential questions correctly. Urwego rw’ibanze Urwego ruciriritse Urwego rukwiye Urwego rw’ikirenga Ashobora gusubiza gusa ibibazo amakosa iyo asoma. Ntashobora gatatu adategwa cyane. Iyo Kumva umwandiko yisomeye byo kumva imyandiko yo mu kwikosora yanakwikosora akikosora asomye nabi akenshi arikosora. Ashobora gusubiza ibibazo byose byo mwaka wa gatatu by’ibanze bigoranye. Kumva umwandiko yisomeye kumva imyandiko yo mu mwaka wa byoroheje (urugero ni nde Ashobora gusubiza ibyinshi mu gatatu n’ibibazo bihuza umwandiko uvugwa). Iyo avuga muri make Kumva umwandiko yisomeye bibazo byoroheje byo kumva n’ubuzima busanzwe. Ashobora iby’ingenzi yasomye, ntabivuga Asubiza bimwe na bimwe mu bibazo imyandiko yo mu mwaka wa gutahura neza cyane uko umwandiko neza byose kandi abikurikiranya byo kumva umwandiko by’ibanze gatatu. Ashobora kandi gutahura uza gukomeza. Avuga muri make ibyo nabi. byoroheje. Ashobora kuvuga muri uko umwandiko uza gukomeza. yasomye agaragaza ingingo z’ingenzi make ibyo yasomye ariko ingingo Ashobora kuvuga neza muri make n’iz’ingereka ziri mu mwandiko.. z’ingenzi zose ntazivuge. ibyo yasomye agaragazamo ingingo z’ingenzi.

Appendix F-2. Results, Modified Angoff, Round 1 and Round 2 Scoring

Average Oral Reading Fluency Cut Scores, Round 1 & Round 2 Scoring

P1 – Partially meets (Round 1 / Round 2): Mean 8.2 / 7.7; Median 8 / 7; Std. Dev. 1.8 / 2.3; Min 5 / 6; Max 11 / 14. Meets expectations: Mean 14.6 / 13.2; Median 14 / 12; Std. Dev. 3.3 / 1.9; Min 12 / 12; Max 20 / 17. Exceeds expectations: Mean 22.2 / 22.4; Median 22 / 22; Std. Dev. 1.99 / 1.2; Min 20 / 20; Max 25 / 24.
P2 – Partially meets (Round 1 / Round 2): Mean 7.4 / 8.7; Median 8 / 8; Std. Dev. 2.2 / 1.8; Min 5 / 6; Max 10 / 12. Meets expectations: Mean 21 / 27.7; Median 23 / 28; Std. Dev. 9.1 / 2.5; Min 4 / 22; Max 30 / 30. Exceeds expectations: Mean 30.6 / 32.6; Median 30 / 32; Std. Dev. 2.2 / 3.2; Min 27 / 30; Max 34 / 41.
P3 – Partially meets (Round 1 / Round 2): Mean 20.6 / 20; Median 20.5 / 19; Std. Dev. 2.4 / 3.74; Min 18 / 16; Max 25 / 27. Meets expectations: Mean 33.7 / 41.2; Median 34 / 41; Std. Dev. 5.9 / 2.2; Min 25 / 37; Max 42 / 44. Exceeds expectations: Mean 53.8 / 53.4; Median 54 / 53; Std. Dev. 3.15 / 1.59; Min 48 / 50; Max 60 / 55.
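The round-by-round figures above are straightforward descriptive summaries of the cut scores proposed by the master teachers; a minimal sketch of the calculation, using hypothetical judge values rather than the actual workshop data:

# Illustrative only: summary statistics of judges' proposed cut scores for one round.
import statistics

def summarize_round(proposed_cut_scores):
    """Mean, median, sample standard deviation, min, and max of proposed cut scores."""
    return {
        "mean": statistics.mean(proposed_cut_scores),
        "median": statistics.median(proposed_cut_scores),
        "std_dev": statistics.stdev(proposed_cut_scores),
        "min": min(proposed_cut_scores),
        "max": max(proposed_cut_scores),
    }

# Hypothetical P2 "meets expectations" cut scores from eight judges, Round 2.
print(summarize_round([28, 27, 30, 28, 26, 29, 28, 30]))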


Appendix F-3. Potential remediation activities

Teachers will receive customized remediation activities for pupils who score in the does not meet and partially meets performance categories each term. These activities should be simple enough for parents, family members, community members or older students to lead, and should directly address the skills identified as weak.

In addition, teachers and schools should consider the following:

A. For all pupils, but particularly those in the lower performance categories:
• At the classroom level
o Teachers should encourage these pupils, give them positive reinforcement, and communicate to them that they can learn to be good readers
• At the school level
o Head Teachers should share the results of the term assessments with the school general assembly and seek their assistance in putting in place programs to reinforce the skills of pupils in the lower categories

o Head Teachers should ensure that available reading textbooks are in classrooms, and in the hands of pupils each day

o Head teachers should organize school-based events that raise awareness of the importance of reading, make reading fun (reading festivals, reading clubs, etc.), and make pupils want to come to school.
• At the community level
o Encourage community leaders and other stakeholders to promote and raise awareness of the importance of reading

o Encourage community to initiate activities that promote reading (reading clubs, reading camps, reading contests, community libraries, etc.)

o Initiate activities to monitor pupil and teacher attendance and punctuality.

B. For pupils in the does not meet and partially meets categories
a. Identify whether attendance or punctuality is a factor in these pupils' poor performance, and if so, work with the community to improve attendance

b. Re-arrange seating arrangement so that these pupils are: 1) sitting in the front of the class, close to the blackboard and the teacher and 2) sitting next to a stronger reader (pupil in “meets expectations” and “exceeds expectations” performance categories) and 3) called on regularly.

c. Have teachers meet with the parents of pupils in these two performance categories each term to explain the specific skills their children are weak in and give them simple activities they can do at home to reinforce these skills (for example, by using some of the remediation games or activities provided by REB/MINEDUC, by asking children to read out loud each evening, or by checking their homework). Also encourage parents to reduce the amount of chores these children have to do so that they can devote more time to school work and reading.

d. Ensure that these pupils have increased time to practice reading by sending reading textbooks or other books home each evening for pupils to read. Increasing the amount of time pupils in these two categories spend practicing reading and basic reading skills is very important for improving their reading scores.

C. For pupils in the meets expectations performance category
a. Give them additional material to read (books to take home, visits to the library, and more challenging material) so that they read more.
b. Organize campaigns to encourage them to read more, and to read outside the classroom (reading clubs, Andika Rwanda, reading competitions, etc.).
c. Organize meetings with parents of pupils in this category to inform them of what they can do to strengthen their children's reading skills (encouraging them to read more, asking them questions about what they are reading, etc.).
d. Give them reading leadership positions in the classroom or in the school (leaders in reading clubs, working with a struggling reader in a younger grade, etc.).

D. For pupils in Zero category

It will be difficult for teachers to address the needs of these pupils in the classroom. These pupils need remedial reading materials that take them back to the foundations (sounds, letters, letter sounds, decoding words…) and a trained person to support them to build these skills. Inserting remedial reading classes in the timetable does not appear to be a viable model and it is difficult to find someone who will deliver remedial reading instruction for free. Given the complexity of the situation, as part of the piloting of the accountability framework, school-based solutions should be found for how to address the needs of these pupils.

U.S. Agency for International Development
1300 Pennsylvania Avenue, NW
Washington, D.C. 20523
Tel.: (202) 712-0000
Fax: (202) 216-3524
www.usaid.gov