
SABER Country Report: Student Assessment 2013

Key Policy Areas for Student Assessment

1. Classroom Assessment

In Mauritania, formal, system-level documents provide guidelines for classroom assessment. Along with the system-level document, there is an official curriculum document which specifies what students are expected to learn and to what level of performance. While there are some system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment, classroom assessment practices are known to be of moderate quality, and only ad hoc mechanisms are in place to monitor the quality of these practices.

2. Examinations

The School Leaving Examination has been administered to grade 13 students since 1974 in Original Literature, Modern Literature, Natural Sciences, Mathematics, and technical subjects. Results are used to certify student completion of the school cycle and to determine selection to higher-education institutions, and are officially recognized by certification and selection systems both in Mauritania and abroad. While there is one systematic mechanism in place to ensure the quality of the examination, inappropriate behavior surrounding the examination process is high. At the same time, only one mechanism, a permanent oversight committee, is in place to monitor the consequences of the examination for stakeholder groups.

3. National Large-Scale Assessment (NLSA)

The Assessment of Learning Achievement was administered for the first time in 1999. Since then, it has been operating on a regular basis, each time assessing a representative random sample of students from different grades and subject areas. In Mauritania, a formal, publicly available policy document authorizes the NLSA, and there is a written plan for future NLSA activities. In addition, although funding for the NLSA is provided on an irregular basis by non-government sources, it covers all core NLSA activities. While the NLSA office is adequately staffed to carry out the assessment effectively, NLSA results are poorly disseminated and there are no mechanisms in place to monitor its consequences.

4. International Large-Scale Assessment (ILSA)

In the last ten years, Mauritania has participated in MLA II (2003) and PASEC (2004), and has taken concrete steps to participate in PASEC 2015 and TIMSS 2015. Although funding for ILSA is mainly provided by loans or external donors, it covers all core ILSA activities. The ILSA team typically attends international meetings on ILSAs; however, no opportunities to learn about ILSAs are offered in Mauritania. While Mauritania-specific results are regularly disseminated in Mauritania, it is not clear that decisions based on ILSA results have had a positive impact on students' achievement levels.

THE WORLD BANK MAURITANIA ǀ SABER-STUDENT ASSESSMENT SABER COUNTRY REPORT |2013

Introduction

Mauritania has focused on increasing student learning outcomes by improving the quality of education in the country. An effective student assessment system is an important component to improving education quality and learning outcomes, as it provides the necessary information to meet stakeholders' decision-making needs. In order to gain a better understanding of the strengths and weaknesses of its existing assessment system, Mauritania has decided to benchmark this system using standardized tools developed under the World Bank's Systems Approach for Better Education Results (SABER) program. SABER is an evidence-based program to help countries systematically examine and strengthen the performance of different aspects of their education systems.

What is SABER-Student Assessment?

SABER-Student Assessment is a component of the SABER program that focuses specifically on benchmarking student assessment policies and systems. The goal of SABER-Student Assessment is to promote stronger assessment systems that contribute to improved education quality and learning for all.

National governments and international agencies are increasingly recognizing the key role that assessment of student learning plays in an effective education system. The importance of assessment is linked to its role in: (i) providing information on levels of student learning and achievement in the system; (ii) monitoring trends in education quality over time; (iii) supporting educators and students with real-time information to improve teaching and learning; and (iv) holding stakeholders accountable for results.

SABER-Student Assessment methodology

The SABER-Student Assessment framework is built on the available evidence base for what an effective assessment system looks like. The framework provides guidance on how countries can build more effective student assessment systems. It is structured around two main dimensions of assessment systems: the types/purposes of assessment activities and the quality of those activities.

Assessment types and purposes

Assessment systems tend to be comprised of three main types of assessment activities, each of which serves a different purpose and addresses different information needs. These three main types are: classroom assessment, examinations, and large-scale, system-level assessments.

Classroom assessment provides real-time information to support ongoing teaching and learning in individual classrooms. Classroom assessments use a variety of formats, including observation, questioning, and paper-and-pencil tests, to evaluate student learning, generally on a daily basis.

Examinations provide a basis for selecting or certifying students as they move from one level of the education system to the next (or into the workforce). All eligible students are tested on an annual basis (or more often if the system allows for repeat testing). Examinations cover the main subject areas in the curriculum and usually involve essays and multiple-choice questions.

Large-scale, system-level assessments provide feedback on the overall performance of the education system at particular grades or age levels. These assessments typically cover a few subjects on a regular basis (such as every 3 to 5 years), are often sample based, and use multiple-choice and short-answer formats. They may be national or international in scope.

Appendix 1 summarizes the key features of these main types of assessment activities.

Quality drivers of an assessment system

The key considerations when evaluating a student assessment system are the individual and combined quality of assessment activities, in terms of the adequacy of the information generated to support decision making. There are three main drivers of information quality in an assessment system: enabling context, system alignment, and assessment quality.

Enabling context refers to the broader context in which the assessment activity takes place and the extent to which that context is conducive to, or supportive of, the assessment. It covers such issues as the legislative or policy framework for assessment activities; institutional and organizational structures for designing, carrying out, or using results from the assessment; the availability of sufficient and stable sources of funding; and the presence of trained assessment staff.

System alignment refers to the extent to which the assessment is aligned with the rest of the education system. This includes the degree of congruence between assessment activities and system learning goals, standards, curriculum, and pre- and in-service teacher training.

Assessment quality refers to the psychometric quality of the instruments, processes, and procedures for the assessment activity. It covers such issues as the design and implementation of assessment activities, the analysis and interpretation of student responses to those activities, and the appropriateness of how assessment results are reported and used.

Crossing the quality drivers with the different assessment types/purposes provides the framework and broad indicator areas shown in Table 1 (Framework for building an effective assessment system, with indicator areas). This framework is a starting point for identifying indicators that can be used to review assessment systems and plan for their improvement. The indicators are identified based on a combination of criteria, including:

- professional standards for assessment;
- empirical research on the characteristics of effective assessment systems, including analysis of the characteristics that differentiate between the assessment systems of low- versus high-performing nations; and
- theory — that is, general consensus among experts that it contributes to effective assessment.

Levels of development

The World Bank has developed a set of standardized questionnaires and rubrics for collecting and evaluating data on the three assessment types and related quality drivers.

The questionnaires are used to collect data on the characteristics of the assessment system in a particular country. The information from the questionnaires is then applied to the rubrics in order to judge the development level of the country's assessment system in different areas.

The basic structure of the rubrics for evaluating data collected using the standardized questionnaires is summarized in Appendix 2. The goal of the rubrics is to provide a country with some sense of the development level of its assessment activities compared to best or recommended practice in each area. For each indicator, the rubric displays four development

levels—Latent, Emerging, Established, and Advanced. These levels are artificially constructed categories chosen to represent key stages on the underlying continuum for each indicator. Each level is accompanied by a description of what performance on the indicator looks like at that level.

- Latent is the lowest level of performance; it represents absence of, or deviation from, the desired attribute.
- Emerging is the next level; it represents partial presence of the attribute.
- Established represents the acceptable minimum standard.
- Advanced represents the ideal or current best practice.

A summary of the development levels for each assessment type is presented in Appendix 3.

In reality, assessment systems are likely to be at different levels of development in different areas. For example, a system may be Established in the area of examinations, but Emerging in the area of large-scale, system-level assessment, and vice versa. While intuition suggests that it is probably better to be further along in as many areas as possible, the evidence is unclear as to whether it is necessary to be functioning at Advanced levels in all areas. Therefore, one might view the Established level as a desirable minimum outcome to achieve in all areas, and aspire beyond that only in those areas that most contribute to the national vision or priorities for education. In line with these considerations, the ratings generated by the rubrics are not meant to be additive across assessment types (that is, they are not meant to be added to create an overall rating for an assessment system; they are only meant to produce an overall rating for each assessment type). The methodology for assigning development levels is summarized in Appendix 4.

Education in Mauritania

Mauritania is a lower-middle-income country in Sub-Saharan Africa. GDP per capita (current US$, 2012) is $1,110, and annual growth was approximately 4% in 2011 and 7.6% in 2012. Mineral discoveries have played a major role in the country's recent economic growth.

In 1999, Mauritania undertook a sweeping reform of its education system, which aimed at consolidating the education system through the introduction of a single schooling track to help ensure full bilingualism (French and Arabic); strengthening lower secondary education by adding one additional year of schooling; and creating professional training centers in each region. Since this reform, Mauritania has made significant gains in the education sector, as demonstrated by the net enrollment rate at the primary level, which rose from 61% in 1999 to 74% in 2011, and by gender parity, with the share of female students increasing from 43% in 2002 to 50% in 2009. While the transition rate from primary to secondary school increased from 37% in 2009 to 53% in 2011, the gross secondary school enrollment rate is still quite low at 27%, with a girls-to-boys ratio of 0.9.

Mauritania committed to further reform in its Education Sector Plan for 2011-2020, which laid out a number of priorities to further improve access to and quality of education, including school feeding programs, improved teacher training and incentive programs, distribution of textbooks, and improved access to secondary schools.

Detailed information was collected on Mauritania's student assessment system using the SABER-Student Assessment questionnaires and rubrics in 2011. It is important to remember that these tools primarily focus on benchmarking a country's policies and arrangements for assessment activities at the system or macro level. Additional data would need to be collected to determine actual, on-the-ground practices in Mauritania, particularly by teachers and students in schools. The following sections discuss the findings for each assessment type, accompanied by suggested policy options. The suggested policy options were determined in collaboration with key local stakeholders, based on Mauritania's immediate interests and needs. Detailed, completed rubrics for each assessment type in Mauritania are provided in Appendix 5.


Classroom Assessment

Level of Development

In Mauritania, the Evaluator's Factsheets According to the Competencies-Based Approach document, authorized by the Ministry of National Education in 2008, provides guidelines for classroom assessment at the primary level. In addition, the Integration and Remediation Guidebook documents, authorized by the Ministry of National Education and the National Pedagogical Institute in 2000, provide guidelines for classroom assessment at the secondary level.

However, there are scarce system-wide resources for teachers to engage in classroom assessment activities. Scoring criteria and the official curriculum documents—which outline what students are expected to learn in different subject areas at different grade or age levels and to what level of performance—are available to teachers. Yet textbooks or workbooks that provide support for classroom assessment, item banks or pools with examples of selection/multiple-choice or supply/open-ended questions, online assessment resources, and computer-based testing with instant reports on students' performance are not available.

In addition, there are very few system-level mechanisms in place to ensure that teachers develop skills and expertise in classroom assessment. For example, school inspection or teacher supervision includes a component focused on classroom assessment, pre-service teacher training opportunities are available, and teachers have opportunities to participate in item development for, or scoring of, large-scale assessments or exams. However, there are no in-service teacher training opportunities, online resources on classroom assessment, opportunities to participate in conferences and workshops, or requirements for teacher training programs to include a course on classroom assessment.

Classroom assessment practices are known to be of moderate quality. Classroom assessment activities typically do not rely mainly on multiple-choice, selection-type questions, and teachers often use explicit or a priori criteria for grading students' work. Although it is not common to observe errors in the scoring or grading of students' work, for classroom assessment activities to provide little useful feedback to students, or for classroom assessment activities to be used mainly as administrative or control tools rather than as pedagogical resources, it is very common for parents to be poorly informed about students' grades and for grade inflation to be a serious problem.

In addition, there are only ad hoc mechanisms to monitor the quality of classroom assessment practices. Although classroom assessment is a required component of a teacher's performance evaluation and of school inspection or teacher supervision, there are no external moderation systems that review the difficulty of classroom assessment activities or the appropriateness of scoring criteria. Additionally, there are no national reviews of the quality of education that include a focus on classroom assessment, and no government funding is available for research on the quality of classroom assessment activities and how to improve them.

However, there are adequate required uses of classroom assessment to support student learning, excluding its use as an input for external examination results. For example, classroom assessment activities are required for use in diagnosing student learning issues, providing feedback to students on their learning, informing parents about their child's learning, planning next steps in instruction, and grading students for internal classroom uses.

Suggested policy options:

1. Introduce varied and systematic resources for teachers to engage in classroom assessment activities. For example, develop and make widely available resources for teachers, such as textbooks/workbooks or item banks with examples of multiple-choice or open-ended questions, which provide guidance for conducting classroom assessment activities.

2. Establish varied and systematic mechanisms to monitor the quality of classroom assessment practices. For example, commission a national review of the quality of education in Mauritania which includes a focus on the quality of classroom assessment activities.


Examinations

Level of Development

The School Leaving (Baccalaureate) Examination's main purposes are student certification of school cycle completion and student selection to higher-education institutions. It was authorized by the Prime Minister of the Islamic Republic of Mauritania and the Ministry of National Education in 2011 through a formal, system-level document, the Decree Organizing the National Baccalaureate, Number 2011-034. The examination supports the monitoring of education quality levels and the planning of reforms. It was first administered in 1974, and it continues to be administered to students in grade 13 in Original Literature, Modern Literature, Natural Sciences, Mathematics, and technical subjects. Student names and their examination results are made publicly available in schools.

Regular funding is allocated by the government for the examination. Registration fees of adult candidates not enrolled at school contribute directly to funding the examination. Funding covers all core examination activities, including examination design and administration, data analysis and reporting, long- or medium-term planning of program milestones, and staff training. However, funding does not cover research and development activities.

The Directorate of Examinations and Assessment, an office within the Ministry of Education, has had primary responsibility for running the examination since 2004. It has state-of-the-art facilities and is adequately staffed with permanent, full-time staff to carry out the examination effectively, with minimal issues.

In addition, Mauritania offers some opportunities to prepare for work on the examination, including non-training courses or workshops on educational measurement and evaluation, which are offered to staff at the Directorate for Examinations and Assessment, and internships in the examination office, which are offered to students from the École Normale Supérieure. At the same time, up-to-date compulsory courses or workshops on the examination are available for teachers. In addition, teachers are involved in most examination-related tasks, including selecting or creating examination questions and scoring guides, and scoring the examination.

While internal review and observers are in place to ensure the quality of the examination, there are no external observers or review, external certification or audits, pilot or field testing, or translation verification. Only one mechanism, a permanent oversight committee, is in place to monitor the consequences of the examination.

In addition, inappropriate behavior surrounding the examination process is high. Examples include leakage of the content of an examination paper, or part of a paper, prior to the examination; impersonation (when an individual other than the registered candidate takes the examination); copying from other candidates; use of unauthorized materials such as prepared answers and notes; collusion among candidates via mobile phones or the passing of paper; intimidation of examination supervisors, markers, or officials; and the provision of external assistance via the supervisor or mobile phone.

Suggested policy options:

1. Establish appropriate preventive and reactive measures to preserve the credibility of the School Leaving Examination results. For example, introduce a strict protocol around the introduction and usage of mobile phones during administration of the exam.

2. Establish measures to preserve the confidentiality of student results. For example, develop and implement a protocol which prohibits results from being posted publicly in schools by student name.

3. Introduce varied and systematic mechanisms to monitor the consequences of the School Leaving Examination. For example, establish, and ensure funding for, regular focus groups or surveys of key stakeholders which monitor the consequences of the examination.


National Large-Scale Assessment (NLSA)

Level of Development

The Assessment of Learning Achievement's main purposes are monitoring education quality at the system level and holding government or political authorities accountable. In addition, it supports policy design, evaluation, and decision making. The Assessment of Learning Achievement was first administered in 1999, and subsequently administered in 2000, 2001, 2003, 2004, 2008, and 2011. Different grade levels and subjects have been chosen for each administration of the assessment, and each administration includes a representative random sample of students.

The Ministry of National Education's organizational chart, which creates an Assessment Department at the National Pedagogical Institute (IPN), authorized the Assessment of Learning Achievement in 1999.

There is irregular funding from non-government sources for the Assessment of Learning Achievement, which covers all core NLSA activities, including assessment design and administration, data analysis and reporting, long- or medium-term planning of program milestones, and staff training. However, NLSA funding does not cover research and development activities.

The NLSA office is a unit created for running the assessment, and is adequately staffed in terms of staff quality. However, staff are mainly temporary or part-time. Although no issues have been identified with the performance of the human resources responsible for the large-scale assessment, Mauritania offers very limited opportunities to prepare individuals for work on the NLSA.

Some mechanisms, such as a requirement for all booklets to be numbered and for all proctors or administrators to be trained according to a protocol, are in place to ensure the quality of the NLSA. In addition, a pilot is conducted before the main data collection, double scoring of data takes place, and there is a standardized manual for large-scale assessment administrators. There is also a requirement for scorers to be trained to ensure high inter-rater reliability. However, there is no requirement for discrepancies to be recorded on a standard sheet, double processing of data does not take place, there are no external or internal reviewers or observers, and no external certification or audit takes place.

NLSA results are disseminated within twelve months after the large-scale assessment is administered, and the main reports on the results contain information on overall achievement levels and subgroups. In addition, the main reports contain information on trends over time, overall and for subgroups, and include standard errors. However, reports with results are not made available to all stakeholder groups. Moreover, although reports on the results are not confidential, dissemination is often limited to policymakers. Although there are workshops or presentations for key stakeholders on the results, there is no media briefing organized to discuss the results, and results are not featured in newspapers, magazines, radio, or television.

No mechanisms are in place to monitor the consequences of the NLSA, such as a permanent oversight committee, expert review groups, regular focus groups or surveys of key stakeholders, themed conferences that provide a forum to discuss research and other data on the consequences of the large-scale assessment, or funding for independent research on the impact of the large-scale assessment.

Suggested policy options:

1. Ensure stable government funding for national large-scale assessment activities, covering core assessment activities as well as research and development.

2. Ensure that results from the Assessment of Learning Achievement are widely disseminated. For example, develop a plan to disseminate NLSA results which includes activities such as hosting a media briefing.

3. Introduce varied and systematic mechanisms to monitor the consequences of the NLSA. For example, establish a permanent oversight committee tasked with monitoring the consequences of the Assessment of Learning Achievement.


International Large-Scale Assessment (ILSA)

Level of Development

Mauritania has participated in two ILSAs in the last 10 years: the Programme on the Analysis of Education Systems (PASEC) in 2004 and the Monitoring Learning Achievement (MLA) II in 2003. In addition, Mauritania has taken concrete steps to participate in PASEC 2015 and TIMSS 2015. An informal policy document, Basic Education Support Project: Funding Request to the GPE for Program Implementation, authorized by the Ministry of National Education in 2013, addresses participation in ILSAs.

Funding for ILSA is provided mainly by loans or external donors as part of Mauritania's National Education Sector Development Programme (PNDSE). Funding covers all core activities of the ILSA, including international participation fees, implementation of the assessment exercise in Mauritania, the processing and analysis of the data collected from the implementation of the assessment exercise, reporting and dissemination of the assessment results in Mauritania, and attendance at international expert meetings for the assessment exercise. However, ILSA funding does not cover research and development activities.

In addition, Mauritania's ILSA office is adequately staffed to carry out the ILSA effectively. The ILSA team has previous experience working on international assessments and has attended all international meetings related to the assessment. In addition, the ILSA coordinator is fluent in the language of the assessment. Although the team does not have extensive training or experience to carry out the required assessment activities effectively, there have been no issues with the performance of the human resources responsible for ILSA activities, such as errors or delays in the printing or layout of the test booklets or in the administration of the assessment.

However, Mauritania offers no opportunities to learn about ILSAs, such as workshops or meetings on using international assessment databases, university courses on the topic of international assessments, funding for attending international workshops or training on international assessments, or online courses on international assessments.

Although country-specific results and information are regularly and widely disseminated in Mauritania, products providing feedback to schools and educators about ILSA results are not made available.

Results from the ILSA are used in a variety of ways to inform decision making in Mauritania, including tracking the impact of reforms on student achievement levels and informing curriculum improvement, teacher training programs, other assessment activities in the system, and resource allocation. For example, results from PASEC have led to changes related to classroom size, textbook production and dissemination, and teacher training. However, it is not clear that decisions based on ILSA results have had a positive impact on students' achievement levels.

Suggested policy options:

1. Establish and make publicly available a formal, system-level policy document regarding ILSA participation in Mauritania.

2. Introduce a variety of opportunities to learn about ILSAs, available to a wide audience including ILSA team members. For example, develop and make widely available workshops on using international assessment databases.

3. Develop and implement a plan to ensure that ILSA results are systematically fed back to schools and educators.


Appendix 1: Assessment Types and Their Key Differences

| | Classroom assessment | Large-scale survey: National | Large-scale survey: International | Examinations: Exit | Examinations: Entrance |
|---|---|---|---|---|---|
| Purpose | To provide immediate feedback to inform classroom instruction | To provide feedback on the overall health of the system at particular grade/age level(s), and to monitor trends in learning | To provide feedback on the comparative performance of the education system at particular grade/age level(s) | To certify students as they move from one level of the education system to the next (or into the workforce) | To select students for further educational opportunities |
| Frequency | Daily | For individual subjects offered on a regular basis (such as every 3-5 years) | For individual subjects offered on a regular basis (such as every 3-5 years) | Annually, and more often where the system allows for repeats | Annually, and more often where the system allows for repeats |
| Who is tested? | All students | Sample or census of students at a particular grade or age level(s) | A sample of students at a particular grade or age level(s) | All eligible students | All eligible students |
| Format | Varies from observation to questioning to paper-and-pencil tests to student performances | Usually multiple choice and short answer | Usually multiple choice and short answer | Usually essay and multiple choice | Usually essay and multiple choice |
| Coverage of curriculum | All subject areas | Generally confined to a few subjects | Generally confined to one or two subjects | Covers main subject areas | Covers main subject areas |
| Additional information collected from students? | Yes, as part of the teaching process | Frequently | Yes | Seldom | Seldom |
| Scoring | Usually informal and simple | Varies from simple to more statistically sophisticated techniques | Usually involves statistically sophisticated techniques | Varies from simple to more statistically sophisticated techniques | Varies from simple to more statistically sophisticated techniques |


Appendix 2: Basic Structure of Rubrics for Evaluating Data Collected on a Student Assessment System

Development Level (columns of the rubric)
- LATENT: Absence of, or deviation from, attribute
- EMERGING: On way to meeting minimum standard
- ESTABLISHED: Acceptable minimum standard
- ADVANCED: Best practice
- Justification

Dimensions (rows of the rubric)
- EC—ENABLING CONTEXT: EC1—Policies; EC2—Leadership, public engagement; EC3—Funding; EC4—Institutional arrangements; EC5—Human resources
- SA—SYSTEM ALIGNMENT: SA1—Learning/quality goals; SA2—Curriculum; SA3—Pre-, in-service teacher training
- AQ—ASSESSMENT QUALITY: AQ1—Ensuring quality (design, administration, analysis); AQ2—Ensuring effective uses


Appendix 3: Summary of the Development Levels for Each Assessment Type

Development levels
- LATENT: Absence of, or deviation from, the attribute
- EMERGING: On way to meeting minimum standard
- ESTABLISHED: Acceptable minimum standard
- ADVANCED: Best practice

CLASSROOM ASSESSMENT
- LATENT: There is no system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
- EMERGING: There is weak system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
- ESTABLISHED: There is sufficient system-wide institutional capacity to support and ensure the quality of classroom assessment practices.
- ADVANCED: There is strong system-wide institutional capacity to support and ensure the quality of classroom assessment practices.

EXAMINATIONS
- LATENT: There is no standardized examination in place for key decisions.
- EMERGING: There is a partially stable standardized examination in place, and a need to develop institutional capacity to run the examination. The examination typically is of poor quality and is perceived as unfair or corrupt.
- ESTABLISHED: There is a stable standardized examination in place. There is institutional capacity and some limited mechanisms to monitor it. The examination is of acceptable quality and is perceived as fair for most students and free from corruption.
- ADVANCED: There is a stable standardized examination in place and institutional capacity and strong mechanisms to monitor it. The examination is of high quality and is perceived as fair and free from corruption.

NATIONAL (OR SYSTEM-LEVEL) LARGE-SCALE ASSESSMENT
- LATENT: There is no NLSA in place.
- EMERGING: There is an unstable NLSA in place and a need to develop institutional capacity to run the NLSA. Assessment quality and impact are weak.
- ESTABLISHED: There is a stable NLSA in place. There is institutional capacity and some limited mechanisms to monitor it. The NLSA is of moderate quality and its information is disseminated, but not always used in effective ways.
- ADVANCED: There is a stable NLSA in place and institutional capacity and strong mechanisms to monitor it. The NLSA is of high quality and its information is effectively used to improve education.

INTERNATIONAL LARGE-SCALE ASSESSMENT
- LATENT: There is no history of participation in an ILSA nor plans to participate in one.
- EMERGING: Participation in an ILSA has been initiated, but there still is need to develop institutional capacity to carry out the ILSA.
- ESTABLISHED: There is more or less stable participation in an ILSA. There is institutional capacity to carry out the ILSA. The information from the ILSA is disseminated, but not always used in effective ways.
- ADVANCED: There is stable participation in an ILSA and institutional capacity to run the ILSA. The information from the ILSA is effectively used to improve education.


Appendix 4: Methodology for Assigning Development Levels

1. The country team or consultant collects information about the assessment system in the country.

2. Based on the collected information, a level of development and score is assigned to each dimension in the rubrics:
- Latent = 1 score point
- Emerging = 2 score points
- Established = 3 score points
- Advanced = 4 score points

3. The score for each quality driver is computed by aggregating the scores for each of its constituent dimensions. For example: the quality driver 'Enabling Context,' in the case of ILSA, has 3 dimensions on which a hypothetical country receives the following scores: Dimension A = 2 points; Dimension B = 2 points; Dimension C = 3 points. The hypothetical country's overall score for this quality driver would be: (2+2+3)/3 = 2.33.

4. A preliminary level of development is assigned to each quality driver.

5. The preliminary development level is validated using expert judgment in cooperation with the country team and the World Bank Task Team Leader. For scores that allow a margin of discretion (i.e., a choice between two levels of development), a final decision is made based on expert judgment. For example, the aforementioned hypothetical country has an 'Enabling Context' score of 2.33, corresponding to a preliminary development level of 'Emerging or Established.' Based on qualitative information not captured in the rubric, along with expert judgment, the country team chooses 'Emerging' as the most appropriate level.

6. Scores for certain key dimensions under 'Enabling Context' (in the case of EXAM, NLSA, and ILSA) and under 'System Alignment' (in the case of CLASS) were set as ceiling scores, i.e., the overall mean score for the particular assessment type cannot be greater than the score for these key dimensions. These key variables include formal policy, regular funding, having a permanent assessment unit, and the quality of assessment practices.
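As a rough illustration, the scoring steps above (points per level, averaging within a quality driver, and the ceiling rule) can be sketched in code. This is a minimal sketch, not the official SABER tool: the function names are invented for illustration, and treating the lowest key-dimension score as the ceiling is an assumption about how step 6 is applied when several key dimensions exist.

```python
# Illustrative sketch of the SABER scoring steps (not the official implementation).
LEVEL_POINTS = {"Latent": 1, "Emerging": 2, "Established": 3, "Advanced": 4}

def driver_score(dimension_levels):
    """Step 3: mean of the dimension scores for one quality driver."""
    points = [LEVEL_POINTS[level] for level in dimension_levels]
    return sum(points) / len(points)

def apply_ceiling(overall_mean, key_dimension_levels):
    """Step 6 (assumed rule): the overall mean cannot exceed the
    lowest score among the key (ceiling) dimensions."""
    ceiling = min(LEVEL_POINTS[level] for level in key_dimension_levels)
    return min(overall_mean, ceiling)

# Step 3 example from the text: dimensions scoring 2, 2, and 3 points.
score = driver_score(["Emerging", "Emerging", "Established"])
print(round(score, 2))  # prints 2.33 -> preliminary level 'Emerging or Established'
```

Per step 5, the final level assignment still rests on expert judgment; the arithmetic only yields the preliminary level.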


Appendix 5: SABER-Student Assessment Rubrics for Mauritania

This appendix provides the completed SABER-Student Assessment rubrics for each type of assessment activity in Mauritania. In each row of the rubric, the relevant selection is marked with an asterisk (*). A selection may carry a superscript number that refers to the justification or explanation for that selection, which is provided in the “Development level rating justifications” section at the end of each rubric. Where a row includes a superscript number but no asterisk, insufficient information was available to determine the relevant selection for that row.

MAURITANIA Classroom Assessment


ENABLING CONTEXT AND SYSTEM ALIGNMENT
Overall policy and resource framework within which classroom assessment activity takes place in a country or system, and the degree to which classroom assessment activity is coherent with other components of the education system.

ENABLING CONTEXT AND SYSTEM ALIGNMENT 1: Setting clear guidelines for classroom assessment
- LATENT: There is no system-level document that provides guidelines for classroom assessment.
- EMERGING: There is an informal system-level document that provides guidelines for classroom assessment.
- ESTABLISHED: There is a formal system-level document that provides guidelines for classroom assessment.1 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: This option does not apply to this dimension.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: The availability of the document is restricted.2 *
- ADVANCED: The document is widely available.

ENABLING CONTEXT AND SYSTEM ALIGNMENT 2: Aligning classroom assessment with system learning goals
- LATENT: There are no system-wide resources for teachers for classroom assessment.
- EMERGING: There are scarce system-wide resources for teachers for classroom assessment.3 *
- ESTABLISHED: There are some system-wide resources for teachers for classroom assessment.
- ADVANCED: There are a variety of system-wide resources available for teachers for classroom assessment.

- LATENT: There is no official curriculum or standards document.
- EMERGING: There is an official curriculum or standards document, but it is not clear what students are expected to learn or to what level of performance.
- ESTABLISHED: There is an official curriculum or standards document that specifies what students are expected to learn, but the level of performance required is not clear.
- ADVANCED: There is an official curriculum or standards document that specifies what students are expected to learn and to what level of performance.4 *

ENABLING CONTEXT AND SYSTEM ALIGNMENT 3: Having effective human resources to carry out classroom assessment activities
- LATENT: There are no system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: There are some system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment.5 *
- ADVANCED: There are a variety of system-level mechanisms to ensure that teachers develop skills and expertise in classroom assessment.


ASSESSMENT QUALITY
Quality of classroom assessment design, administration, analysis, and use.

ASSESSMENT QUALITY 1: Ensuring the quality of classroom assessment
- LATENT: Classroom assessment practices suffer from widespread weaknesses or there is no information available on classroom assessment practices.
- EMERGING: Classroom assessment practices are known to be weak.
- ESTABLISHED: Classroom assessment practices are known to be of moderate quality.6 *
- ADVANCED: Classroom assessment practices are known to be generally of high quality.

- LATENT: There are no mechanisms to monitor the quality of classroom assessment practices.
- EMERGING: There are ad hoc mechanisms to monitor the quality of classroom assessment practices.7 *
- ESTABLISHED: There are limited systematic mechanisms to monitor the quality of classroom assessment practices.
- ADVANCED: There are varied and systematic mechanisms in place to monitor the quality of classroom assessment practices.

ASSESSMENT QUALITY 2: Ensuring effective uses of classroom assessment
- LATENT: Classroom assessment information is not required to be disseminated to key stakeholders.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: Classroom assessment information is required to be disseminated to some key stakeholders.
- ADVANCED: Classroom assessment information is required to be disseminated to all key stakeholders.8 *

- LATENT: There are no required uses of classroom assessment to support student learning.
- EMERGING: There are limited required uses of classroom assessment to support student learning.
- ESTABLISHED: There are adequate required uses of classroom assessment to support student learning, excluding its use as an input for external examination results.9 *
- ADVANCED: There are adequate required uses of classroom assessment to support student learning, including its use as an input for external examination results.


Classroom Assessment: Development-level rating justifications

1. The Evaluator's Factsheets According to the Competencies-Based Approach document, authorized by the Ministry of National Education in 2008, provides guidelines for classroom assessment at the primary level. In addition, the Integration and Remediation Guidebook documents (available by subject area), authorized by the Ministry of National Education and the National Pedagogical Institute in 2000, provide guidelines for classroom assessment at the secondary level.

2. The documents providing guidelines on classroom assessment are available in teacher training colleges, in in-service courses for teachers, and in national education offices, such as the Regional Directorates of National Education and Provincial Education Inspection Offices.

3. There are scarce system-wide resources for teachers to engage in classroom assessment activities. Scoring criteria and the official curriculum documents, which outline what students are expected to learn in different subject areas at different grade or age levels and to what level of performance, are available to teachers. However, textbooks or workbooks that provide support for classroom assessment, item banks or pools with examples of selection/multiple-choice or supply/open-ended questions, online assessment resources, and computer-based testing with instant reports on students' performance are not available.

4. An official curriculum or standards document is available, which specifies what students at different grade or age levels are expected to learn and to what level of performance.

5. There are very few system-level mechanisms in place to ensure that teachers develop skills and expertise in classroom assessment. For example, school inspection or teacher supervision includes a component focused on classroom assessment, pre-service teacher training opportunities are available, and teachers have opportunities to participate in item development for, or scoring of, large-scale assessments or exams. However, there are no in-service teacher training opportunities, online resources on classroom assessment, opportunities to participate in conferences and workshops, or requirements for teacher training programs to include a course on classroom assessment.

6. It is rare for classroom assessment activities to rely mainly on multiple-choice, selection-type questions, or for teachers to score or grade students' work without explicit, a priori criteria. It is also uncommon to observe errors in the scoring or grading of students' work, for classroom assessment activities to provide little useful feedback to students, or for them to be used mainly as administrative or control tools rather than as pedagogical resources. However, it is very common for parents to be poorly informed about students' grades and for grade inflation to be a serious problem. It is also common for classroom assessment activities to focus mainly on recalling information and to not be aligned with a pedagogical or curricular framework, and for the uneven application of standards for grading students' work to be a serious problem.

7. Although classroom assessment is a required component of a teacher's performance evaluation and of school inspection or teacher supervision, there are no external moderation systems that review the difficulty of classroom assessment activities or the appropriateness of scoring criteria. Additionally, no national reviews of the quality of education that include a focus on classroom assessment are in place, and no government funding is available for research on the quality of classroom assessment activities and how to improve them.


8. Classroom assessment information is required to be disseminated to all key stakeholders, including school district or Ministry of Education officials, students, and parents.

9. There are adequate required uses of classroom assessment to support student learning, excluding its use as an input for external examination results. For example, classroom assessment activities are required for use in diagnosing student learning issues, providing feedback to students on their learning, informing parents about their child's learning, planning next steps in instruction, and grading students for internal classroom uses.


MAURITANIA Examinations


ENABLING CONTEXT
Overall framework of policies, leadership, organizational structures, fiscal and human resources in which assessment activity takes place in a country or system and the extent to which that framework is conducive to, or supportive of, the assessment activity.

ENABLING CONTEXT 1: Setting clear policies
- LATENT: No standardized examination has taken place.
- EMERGING: The standardized examination has been operating on an irregular basis.
- ESTABLISHED: The examination is a stable program that has been operating regularly.1 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: There is no policy document that authorizes the examination.
- EMERGING: There is an informal or draft policy document that authorizes the examination.
- ESTABLISHED: There is a formal policy document that authorizes the examination.2 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: This option does not apply to this dimension.
- EMERGING: The policy document is not available to the public.
- ESTABLISHED: The policy document is available to the public.3 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: This option does not apply to this dimension.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: The policy document addresses some key aspects of the examination.4 *
- ADVANCED: The policy document addresses all key aspects of the examination.

ENABLING CONTEXT 2: Having strong leadership
- LATENT: All stakeholder groups strongly oppose the examination or are indifferent to it.
- EMERGING: Most stakeholder groups oppose the examination.
- ESTABLISHED: Most stakeholder groups support the examination.
- ADVANCED: All stakeholder groups support the examination.5 *

- LATENT: There are no attempts to improve the examination by stakeholder groups.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: There are independent attempts to improve the examination by stakeholder groups.
- ADVANCED: There are coordinated attempts to improve the examination by stakeholder groups.6 *

- LATENT: Efforts to improve the examination are not welcomed by the leadership in charge of the examination.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: Efforts to improve the examination are generally welcomed by the leadership in charge of the examination.7 *
- ADVANCED: This option does not apply to this dimension.


ENABLING CONTEXT 3: Having regular funding
- LATENT: There is no funding allocated for the examination.
- EMERGING: There is irregular funding allocated for the examination.
- ESTABLISHED: There is regular funding allocated for the examination.8 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: This option does not apply to this dimension.
- EMERGING: Funding covers some core examination activities: design, administration, data processing or reporting.
- ESTABLISHED: Funding covers all core examination activities: design, administration, data processing and reporting.9 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: This option does not apply to this dimension.
- EMERGING: Funding does not cover research and development.10 *
- ESTABLISHED: This option does not apply to this dimension.
- ADVANCED: Funding covers research and development.

ENABLING CONTEXT 4: Having strong organizational structures
- LATENT: The examination office does not exist or is newly established.
- EMERGING: The examination office is newly established.
- ESTABLISHED: The examination office is a stable organization.11 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: The examination office is not accountable to an external board or agency.12 *
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: The examination office is accountable to an external board or agency.
- ADVANCED: This option does not apply to this dimension.

- LATENT: Examination results are not recognized by any certification or selection system.
- EMERGING: Examination results are recognized by a certification or selection system in the country.
- ESTABLISHED: Examination results are recognized by one certification or selection system in another country.
- ADVANCED: Examination results are recognized by two or more certification or selection systems in another country.13

- LATENT: The examination office does not have the required facilities to carry out the examination.
- EMERGING: The examination office has some of the required facilities to carry out the examination.
- ESTABLISHED: The examination office has all of the required facilities to carry out the examination.
- ADVANCED: The examination office has state-of-the-art facilities to carry out the examination.14 *


ENABLING CONTEXT 5: Having effective human resources
- LATENT: There is no staff to carry out the examination.
- EMERGING: The examination office is inadequately staffed to carry out the examination effectively; issues are pervasive.
- ESTABLISHED: The examination office is adequately staffed to carry out the examination effectively, with minimal issues.15 *
- ADVANCED: The examination office is adequately staffed to carry out the examination effectively, with no issues.

- LATENT: The country does not offer opportunities that prepare for work on the examination.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: The country offers some opportunities that prepare for work on the examination.16 *
- ADVANCED: The country offers a wide range of opportunities that prepare for work on the examination.


SYSTEM ALIGNMENT
Degree to which the assessment is coherent with other components of the education system.

SYSTEM ALIGNMENT 1: Aligning examinations with learning goals and opportunities to learn
- LATENT: It is not clear what the examination measures.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: There is a clear understanding of what the examination measures.17 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: What the examination measures is questioned by some stakeholder groups.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: What is measured by the examination is largely accepted by stakeholder groups.18 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: Material to prepare for the examination is minimal and it is only accessible to very few students.
- EMERGING: There is some material to prepare for the examination that is accessible to some students.
- ESTABLISHED: There is comprehensive material to prepare for the examination that is accessible to most students.19 *
- ADVANCED: There is comprehensive material to prepare for the examination that is accessible to all students.

SYSTEM ALIGNMENT 2: Providing teachers with opportunities to learn about the examination
- LATENT: There are no courses or workshops on examinations available to teachers.
- EMERGING: There are no up-to-date courses or workshops on examinations available to teachers.
- ESTABLISHED: There are up-to-date voluntary courses or workshops on examinations available to teachers.
- ADVANCED: There are up-to-date compulsory courses or workshops on examinations for teachers.20 *

- LATENT: Teachers are excluded from all examination-related tasks.
- EMERGING: Teachers are involved in very few examination-related tasks.
- ESTABLISHED: Teachers are involved in some examination-related tasks.
- ADVANCED: Teachers are involved in most examination-related tasks.21 *


ASSESSMENT QUALITY
Degree to which the assessment meets quality standards, is fair, and is used in an effective way.

ASSESSMENT QUALITY 1: Ensuring quality
- LATENT: There is no technical report or other documentation.
- EMERGING: There is some documentation on the examination, but it is not in a formal report format.22 *
- ESTABLISHED: There is a comprehensive technical report but with restricted circulation.
- ADVANCED: There is a comprehensive, high-quality technical report available to the general public.

- LATENT: There are no mechanisms in place to ensure the quality of the examination.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: There are limited systematic mechanisms in place to ensure the quality of the examination.23 *
- ADVANCED: There are varied and systematic mechanisms in place to ensure the quality of the examination.

ASSESSMENT QUALITY 2: Ensuring fairness
- LATENT: Inappropriate behavior surrounding the examination process is high.24 *
- EMERGING: Inappropriate behavior surrounding the examination process is moderate.
- ESTABLISHED: Inappropriate behavior surrounding the examination process is low.
- ADVANCED: Inappropriate behavior surrounding the examination process is marginal.

- LATENT: The examination results lack credibility for all stakeholder groups.
- EMERGING: The examination results are credible for some stakeholder groups.
- ESTABLISHED: The examination results are credible for all stakeholder groups.25 *
- ADVANCED: This option does not apply to this dimension.

- LATENT: The majority of the students (over 50%) may not take the examination because of language, gender, or other equivalent barriers.
- EMERGING: A significant proportion of students (10%-50%) may not take the examination because of language, gender, or other equivalent barriers.
- ESTABLISHED: A small proportion of students (less than 10%) may not take the examination because of language, gender, or other equivalent barriers.
- ADVANCED: All students can take the examination; there are no language, gender, or other equivalent barriers.26 *


ASSESSMENT QUALITY 3: Using examination information in a fair way
- LATENT: Examination results are not used in a proper way by all stakeholder groups.
- EMERGING: Examination results are used by some stakeholder groups in a proper way.
- ESTABLISHED: Examination results are used by most stakeholder groups in a proper way.
- ADVANCED: Examination results are used by all stakeholder groups in a proper way.27 *

- LATENT: Student names and results are public.28 *
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: Students' results are confidential.
- ADVANCED: This option does not apply to this dimension.

ASSESSMENT QUALITY 4: Ensuring positive consequences of the examination
- LATENT: There are no options for students who do not perform well on the examination, or students must leave the education system.
- EMERGING: There are very limited options for students who do not perform well on the examination.29 *
- ESTABLISHED: There are some options for students who do not perform well on the examination.
- ADVANCED: There is a variety of options for students who do not perform well on the examination.

- LATENT: There are no mechanisms in place to monitor the consequences of the examination.
- EMERGING: This option does not apply to this dimension.
- ESTABLISHED: There are some mechanisms in place to monitor the consequences of the examination.30 *
- ADVANCED: There is a variety of mechanisms in place to monitor the consequences of the examination.


Examinations: Development-level rating justifications

1. The School Leaving (Baccalaureate) Examination's main purposes are to certify students' completion of the school cycle and to select students for higher education institutions. In addition, the examination supports the monitoring of education quality levels and the planning of education policy reforms. The examination was first administered in 1974, and it continues to be administered to grade 13 students in Original Literature, Modern Literature, Natural Sciences, Mathematics, and technical subjects.

2. The Prime Minister of the Islamic Republic of Mauritania (IRM) and the Ministry of National Education (MNE) authorized the examination in 2011 with the document Decree Organizing the National Baccalaureate Number 2011-034.

3. The document Decree Organizing the National Baccalaureate Number 2011-034 is available to the public and was published in the Official Journal of the Islamic Republic of Mauritania.

4. The document Decree Organizing the National Baccalaureate Number 2011-034 addresses some key aspects of the examination, as it outlines governance, distribution of power, and responsibilities among key entities, describes the purpose of the examination and authorized uses of results, outlines procedures to investigate and address security breaches, cheating, or other forms of inappropriate behavior, and specifies who can sit for the examination. However, the document does not state funding sources for the examination, outline procedures for special or disadvantaged students, identify rules about preparation, or explain alignment with curricula and standards or the format of the examination questions.

5. Policymakers, teacher unions, educators, students, parents, media, think tanks and NGOs, and employers all strongly support the examination.

6. The education system reform of 1999 affected many aspects of the educational system, especially the organization of the Baccalaureate. In particular, the reform reduced the number of disciplines covered by the examination. The reform was initiated by the Government and the Administration; it was not the result of demands from any special interest group but rather of a concern to improve the functioning and efficiency of the system in the face of many challenges.

7. Leadership in charge of the examination generally welcomes efforts to improve the examination.

8. Regular funding is allocated by the government for the examination. In addition, funding also comes from student fees, which are collected from adult candidates who are no longer attending school.

9. Funding covers all core examination activities, including examination design and administration, data analysis and reporting, long- or medium-term planning of program milestones, and staff training.

10. Funding for the examination does not cover research and development activities.


11. The Directorate of Examinations and Assessment, which is an office within the Ministry of Education, has had primary responsibility for running the examination since 2004.

12. The Directorate of Examinations and Assessment is not accountable to an external board or agency.

13. Examination results are recognized by certification and selection systems in Mauritania, France, other Francophone countries using the Baccalaureate system, and all countries in Francophone Africa.

14. The Directorate of Examinations and Assessment has state-of-the-art facilities to carry out the examination, including computers for all technical staff, a secure building, secure storage facilities, access to adequate computer servers, the ability to back up data, and adequate communication tools. In addition to its own administrative facilities, during the examination period the Directorate of Examinations and Assessment has a secure facility reserved for conducting confidential activities.

15. The Directorate of Examinations and Assessment is adequately staffed with permanent, full-time staff to carry out the examination effectively, with minimal issues. Some issues with the performance of the human resources responsible for the examination have been identified, such as errors in scoring that have led to delays in reporting results, weaknesses in test design, and omission of curricular topics. However, frequent errors in the examination questions and data processing, poor training of test administrators, and delays in administering the examination due to issues with the design of the examination questions have not been identified.

16. Mauritania offers some opportunities that prepare for work on the examination, including non-university training courses or workshops on educational measurement and evaluation, which are offered to staff at the Directorate for Examinations and Assessment, and internships in the examination office, which are offered to students from the École Normale Supérieure. However, there are no university graduate programs or university courses on educational measurement and evaluation. Also, there is no funding available for attending international programs, courses, or workshops on educational measurement and evaluation.

17. There is a clear understanding that the examination measures internationally recognized curriculum guidelines or standards.

18. Stakeholder groups largely accept what is measured by the examination.

19. There is some material to prepare for the examination that is accessible to all students. Although students must pay for such resources, examples of the types of questions that are on the examination and information on how to prepare for the examination are available. The framework document explaining what is measured on the examination and the report on the strengths and weaknesses in student performance are not available.

20. Up-to-date compulsory courses or workshops on the examination are available for teachers.

21. Teachers are involved in most examination-related tasks, including selecting or creating examination questions and scoring guides, and scoring the examination. However, they are not involved in acting as a judge or in resolving inconsistencies between examination scores and school grades.

SYSTEMS APPROACH FOR BETTER EDUCATION RESULTS MAURITANIA ǀ STUDENT ASSESSMENT SABER COUNTRY REPORT | 2013

22. There is some documentation about the technical aspects of the examination, but it is not in a formal report format. A Technical Committee (TC), comprising senior officials from the Directorate of Examinations and Assessment and the Presidents of the Juries (the juries are teams in charge of examinations locally, which handle the primary organizational problems), resolves problems that may arise during the examination. The TC centralizes the reports generated by the juries; this year, at the request of the Minister, these reports were synthesized by the Directorate of Examinations and Assessment.

23. There is only one systematic mechanism, internal review or observers, in place to ensure the quality of the examination. External review or observers, external certification or audits, pilot or field testing, or translation verification are not in place.

24. Inappropriate behavior surrounding the examination process is high. The following all occur during the examination process: leakage of the content of an examination paper, or part of a paper, prior to the examination; impersonation (an individual other than the registered candidate taking the examination); copying from other candidates; use of unauthorized materials such as prepared answers and notes; collusion among candidates via mobile phones or the passing of paper; intimidation of examination supervisors, markers, or officials; and the provision of external assistance via the supervisor or mobile phone. However, the issuing of forged certificates and the altering of test results do not occur.

25. Baccalaureate results are perceived as credible by all stakeholder groups.

26. All students can take the examination regardless of background, location, or the ability to pay.

27. All stakeholder groups, including policy makers, teacher unions, educators, students, parents, media, think tanks and NGOs, universities, and employers, use examination results in a proper way.

28. Student names and results are made public: results are posted inside school facilities and are also accessible online and via SMS using a student's individual number.

29. Some options are available to students who do not perform well on the examination. For example, students may retake the examination, repeat the grade, or attend remedial or preparatory courses in order to prepare to retake the examination. The remedial or preparatory courses that are available are private in nature, and students can only repeat a grade three times. Students cannot opt for less selective schools, universities, or tracks.

30. Only a permanent oversight committee is in place to monitor the consequences of the examination. This committee studies the reports of the chairmen of the examination and correction centers regarding issues such as fraud and the behavior of some candidates during the examination process. There are no expert review groups, no regular focus groups or surveys of key stakeholders, and no regularly updated studies. In addition, there is no funding for independent research on the impact of the examination.


MAURITANIA National (or System-Level) Large-Scale Assessment (NLSA)


ENABLING CONTEXT: Overall framework of policies, leadership, organizational structures, and fiscal and human resources in which NLSA activity takes place in a country or system, and the extent to which that framework is conducive to, or supportive of, the NLSA activity. Each dimension below is rated on a four-level scale (Latent, Emerging, Established, Advanced); dimension numbers correspond to the numbered rating justifications that follow.

ENABLING CONTEXT 1: Setting clear policies for NLSA
Dimension 1 (rated: Established). Latent: No NLSA exercise has taken place. Emerging: The NLSA has been operating on an irregular basis. Established: The NLSA is a stable program that has been operating regularly. Advanced: not applicable.
Dimension 2 (rated: Established). Latent: There is no policy document pertaining to NLSA. Emerging: There is an informal or draft policy document that authorizes the NLSA. Established: There is a formal policy document that authorizes the NLSA. Advanced: not applicable.
Dimension 3 (rated: Established). Latent: not applicable. Emerging: The policy document is not available to the public. Established: The policy document is available to the public. Advanced: not applicable.
Dimension 4 (rated: Advanced). Latent: There is no plan for NLSA activity. Emerging: not applicable. Established: There is a general understanding that the NLSA will take place. Advanced: There is a written NLSA plan for the coming years.

ENABLING CONTEXT 2: Having strong public engagement for NLSA
Dimension 5 (rated: Established). Latent: All stakeholder groups strongly oppose the NLSA or are indifferent to it. Emerging: Some stakeholder groups oppose the NLSA. Established: Most stakeholder groups support the NLSA. Advanced: All stakeholder groups support the NLSA.




ENABLING CONTEXT 3: Having regular funding for NLSA
Dimension 6 (rated: Emerging). Latent: There is no funding allocated to the NLSA. Emerging: There is irregular funding allocated to the NLSA. Established: There is regular funding allocated to the NLSA. Advanced: not applicable.
Dimension 7 (rated: Established). Latent: not applicable. Emerging: Funding covers some core NLSA activities: design, administration, analysis, and reporting. Established: Funding covers all core NLSA activities: design, administration, analysis, and reporting. Advanced: not applicable.
Dimension 8 (rated: Emerging). Latent: not applicable. Emerging: Funding does not cover research and development activities. Established: not applicable. Advanced: Funding covers research and development activities.

ENABLING CONTEXT 4: Having strong organizational structures for NLSA
Dimension 9 (rated: Established). Latent: There is no NLSA office, ad hoc unit, or team. Emerging: The NLSA office is a temporary agency or group of people. Established: The NLSA office is a permanent agency, institution, or unit. Advanced: not applicable.
Dimension 10 (rated: Advanced). Latent: not applicable. Emerging: Political considerations regularly hamper technical considerations. Established: Political considerations sometimes hamper technical considerations. Advanced: Political considerations never hamper technical considerations.
Dimension 11 (rated: Established). Latent: not applicable. Emerging: The NLSA office is not accountable to a clearly recognized body. Established: The NLSA office is accountable to a clearly recognized body. Advanced: not applicable.



ENABLING CONTEXT 5: Having effective human resources for NLSA
Dimension 12 (rated: Established). Latent: There is no staff allocated for running an NLSA. Emerging: The NLSA office is inadequately staffed to effectively carry out the assessment. Established: The NLSA office is adequately staffed to carry out the NLSA effectively, with minimal issues. Advanced: The NLSA office is adequately staffed to carry out the NLSA effectively, with no issues.
Dimension 13 (rated: Established). Latent: The country does not offer opportunities that prepare individuals for work on NLSA. Emerging: not applicable. Established: The country offers some opportunities to prepare individuals for work on the NLSA. Advanced: The country offers a wide range of opportunities to prepare individuals for work on the NLSA.


SYSTEM ALIGNMENT: Degree to which the NLSA is coherent with other components of the education system.

SYSTEM ALIGNMENT 1: Aligning the NLSA with learning goals
Dimension 14 (rated: Established). Latent: It is not clear if the NLSA is based on curriculum or learning standards. Emerging: not applicable. Established: The NLSA measures performance against curriculum or learning standards. Advanced: not applicable.
Dimension 15 (rated: Advanced). Latent: What the NLSA measures is generally questioned by stakeholder groups. Emerging: not applicable. Established: What the NLSA measures is questioned by some stakeholder groups. Advanced: What the NLSA measures is largely accepted by stakeholder groups.
Dimension 16 (rated: Established). Latent: There are no mechanisms in place to ensure that the NLSA accurately measures what it is supposed to measure. Emerging: There are ad hoc reviews of the NLSA to ensure that it measures what it is intended to measure. Established: There are regular internal reviews of the NLSA to ensure that it measures what it is intended to measure. Advanced: not applicable.

SYSTEM ALIGNMENT 2: Providing teachers with opportunities to learn about the NLSA
Dimension 17 (rated: Emerging). Latent: There are no courses or workshops on the NLSA. Emerging: There are occasional courses or workshops on the NLSA. Established: There are some courses or workshops on the NLSA offered on a regular basis. Advanced: There are widely available high quality courses or workshops on the NLSA offered on a regular basis.


ASSESSMENT QUALITY: Degree to which the NLSA meets technical standards, is fair, and is used in an effective way.

ASSESSMENT QUALITY 1: Ensuring the quality of the NLSA
Dimension 18 (rated: Advanced). Latent: No options are offered to include all groups of students in the NLSA. Emerging: not applicable. Established: At least one option is offered to include all groups of students in the NLSA. Advanced: Different options are offered to include all groups of students in the NLSA.
Dimension 19 (rated: Established). Latent: There are no mechanisms in place to ensure the quality of the NLSA. Emerging: not applicable. Established: There are some mechanisms in place to ensure the quality of the NLSA. Advanced: There are a variety of mechanisms in place to ensure the quality of the NLSA.
Dimension 20 (rated: Advanced). Latent: There is no technical report or other documentation about the NLSA. Emerging: There is some documentation about the technical aspects of the NLSA, but it is not in a formal report format. Established: There is a comprehensive technical report but with restricted circulation. Advanced: There is a comprehensive, high quality technical report available to the general public.

ASSESSMENT QUALITY 2: Ensuring effective uses of the NLSA
Dimension 21 (rated: Emerging). Latent: NLSA results are not disseminated. Emerging: NLSA results are poorly disseminated. Established: NLSA results are disseminated in an effective way. Advanced: not applicable.
Dimension 22 (rated: Established). Latent: NLSA information is not used or is used in ways inconsistent with the purposes or the technical characteristics of the assessment. Emerging: not applicable. Established: NLSA results are used by some stakeholder groups in a way that is consistent with the purposes and technical characteristics of the assessment. Advanced: NLSA information is used by all stakeholder groups in a way that is consistent with the purposes and technical characteristics of the assessment.
Dimension 23 (rated: Latent). Latent: There are no mechanisms in place to monitor the consequences of the NLSA. Emerging: not applicable. Established: There are some mechanisms in place to monitor the consequences of the NLSA. Advanced: There are a variety of mechanisms in place to monitor the consequences of the NLSA.


National (or System-Level) Large-Scale Assessment (NLSA): Development-level rating justifications

1. The Assessment of Learning Achievement's main purposes are monitoring education quality at the system level and holding government or political authorities accountable. In addition, it supports policy design, evaluation, and decision making. The assessment was first administered in 1999, and subsequently in 2000, 2001, 2003, 2004, 2008, and 2011. Different grade levels and subjects have been chosen for each administration, and each administration includes a representative random sample of students. In 1999, students in grades 3, 4, and 5 were assessed in Arabic, French, Mathematics, and Environmental Studies. In 2000, students in grades 4 and 6 were assessed in Arabic, French, and Mathematics; in 2001, students in grade 2 were assessed in the same subjects. In 2003, students in grade 8 were assessed in Mathematics and Science, and students in grade 5 in Arabic, French, Mathematics, and Environmental Studies. In 2004, students in grade 11 (series C and D) were assessed in Mathematics, Physics-Chemistry, and Natural Sciences; in 2008, students in grade 12 were assessed in the same subjects; and in 2011, students in grades 3 and 5 were assessed in Arabic, French, and Mathematics.

2. The Assessment of Learning Achievement was authorized in 1999 by the Ministry of National Education's organizational chart, which created an Assessment Department at the National Pedagogical Institute (IPN).

3. The policy document authorizing the large-scale assessment is publicly available in the Official Journal of the Islamic Republic of Mauritania (RIM).

4. A written plan specifies who will be tested and in which subject areas; it is publicly available and easily accessible.

5. Stakeholder groups have not attempted to reform the NLSA program.

6. There is irregular funding from non-government sources for the NLSA.

7. Funding covers all core NLSA activities, including assessment design and administration, data analysis and reporting, long- or medium-term planning of program milestones, and staff training.

8. NLSA funding does not cover research and development activities.

9. Despite differing administrative affiliations of team members, the NLSA office, the National Evaluation Unit, is a permanent agency, institution, or unit created for running the assessment.

10. Political considerations never hamper technical considerations, and results from the Assessment of Learning Achievement have never been withheld from publication because of political reasons.


11. The NLSA office is accountable to the Direction générale de la stratégie, de la programmation et de la coopération (DGSPC), a directorate for strategy and planning in the Ministry of Education, as well as to the Direction des projets Education et formation (DPEF), a directorate for managing cooperative activities in the Ministry of Finance. In addition, it is accountable to the local coordinating body of donors.

12. Although team members have all been adequately trained in the field of assessment and have experience working on large-scale assessment, they are mainly temporary or part-time staff. However, no issues have been identified with the performance of the human resources that are responsible for the large-scale assessment.

13. Mauritania offers very limited opportunities to prepare individuals for work on the NLSA. Although there is funding available for attending international programs, courses, or workshops on educational measurement and evaluation, there are no university graduate programs, university courses, or non-university training courses or workshops on educational measurement and evaluation. In addition, there are no internships or short-term employment opportunities in the large-scale assessment office.

14. The NLSA measures performance against national curriculum guidelines or learning standards.

15. Stakeholder groups largely accept what is measured by the NLSA.

16. There are regular internal reviews, as well as ad hoc reviews, of the alignment between the assessment instrument and what it is supposed to measure.

17. Although there are occasional presentations offered on the NLSA, there are no courses or workshops available on a regular basis.

18. Special plans are made to ensure that the large-scale assessment is administered to students in hard-to-reach areas, as all geographic areas are sampled. In addition, the large-scale assessment is offered in the language of instruction for almost all student groups.

19. Some mechanisms, such as a requirement for all booklets to be numbered and for all proctors or administrators to be trained according to a protocol, are in place to ensure the quality of the NLSA. In addition, a pilot is conducted before the main data collection and double scoring of data takes place, and there is a standardized manual for large-scale assessment administrators. In addition, there is a requirement for scorers to be trained to ensure high inter-rater reliability. However, there is no requirement for discrepancies to be recorded on a standard sheet, double processing of data does not take place, there are no external or internal reviewers or observers, and no external certification or audit takes place.

20. A comprehensive, high quality technical report is available to the general public.

21. The NLSA results are disseminated within twelve months after the large-scale assessment is administered, and the main reports on the results contain information on overall achievement levels and on subgroups. The main reports also contain information on trends over time, overall and for subgroups, and report standard errors. However, reports with results are not made available to all stakeholder groups; the reports are not confidential, but dissemination is often limited to policymakers. Although there are workshops or presentations for key stakeholders on the results, no media briefing is organized to discuss the results, and results are not featured in newspapers, magazines, radio, or television.

22. Some stakeholder groups use NLSA results in a way that is consistent with the stated purposes and technical characteristics of the assessment.

23. No mechanisms are in place to monitor the consequences of the NLSA, such as a permanent oversight committee, expert review groups, regular focus groups or surveys of key stakeholders, themed conferences that provide a forum to discuss research and other data on the consequences of the large-scale assessment, or funding for independent research on the impact of the large-scale assessment.


MAURITANIA International Large-Scale Assessment (ILSA)


ENABLING CONTEXT: Overall framework of policies, leadership, organizational structures, and fiscal and human resources in which ILSA takes place in a country or system, and the extent to which that framework is conducive to, or supportive of, ILSA activity. Each dimension below is rated on a four-level scale (Latent, Emerging, Established, Advanced); dimension numbers correspond to the numbered rating justifications that follow.

ENABLING CONTEXT 1: Setting clear policies for ILSA
Dimension 1 (rated: Advanced). Latent: The country/system has not participated in an ILSA in the last 10 years. Emerging: not applicable. Established: The country/system has participated in at least one ILSA in the last 10 years. Advanced: The country/system has participated in two or more ILSA in the last 10 years.
Dimension 2 (rated: Established). Latent: The country/system has not taken concrete steps to participate in an ILSA in the next 5 years. Emerging: not applicable. Established: The country/system has taken concrete steps to participate in at least one ILSA in the next 5 years. Advanced: not applicable.
Dimension 3 (rated: Emerging). Latent: There is no policy document that addresses participation in ILSA. Emerging: There is an informal or draft policy document that addresses participation in ILSA. Established: There is a formal policy document that addresses participation in ILSA. Advanced: not applicable.
Dimension 4 (rated: Emerging). Latent: not applicable. Emerging: The policy document is not available to the public. Established: The policy document is available to the public. Advanced: not applicable.

ENABLING CONTEXT 2: Having regular funding for ILSA
Dimension 5 (rated: Emerging). Latent: There is no funding for participation in ILSA. Emerging: There is funding from loans or external donors. Established: There is regular funding allocated at discretion. Advanced: There is regular funding approved by law, decree, or norm.
Dimension 6 (rated: Established). Latent: not applicable. Emerging: Funding covers some core activities of the ILSA. Established: Funding covers all core activities of the ILSA. Advanced: not applicable.
Dimension 7 (rated: Latent). Latent: Funding does not cover research and development activities. Emerging: not applicable. Established: not applicable. Advanced: Funding covers research and development activities.


ENABLING CONTEXT 3: Having effective human resources for ILSA
Dimension 8 (rated: Established). Latent: There is no team or national/system coordinator to carry out the ILSA activities. Emerging: There is a team or national/system coordinator to carry out the ILSA activities. Established: There is a team and national/system coordinator to carry out the ILSA activities. Advanced: not applicable.
Dimension 9 (rated: Established). Latent: not applicable. Emerging: The national/system coordinator or other designated team member may not be fluent in the language of the assessment. Established: The national/system coordinator is fluent in the language of the assessment. Advanced: not applicable.
Dimension 10 (rated: Advanced). Latent: not applicable. Emerging: The ILSA office is inadequately staffed or trained to carry out the assessment effectively. Established: The ILSA office is adequately staffed or trained to carry out the ILSA effectively, with minimal issues. Advanced: The ILSA office is adequately staffed and trained to carry out the ILSA effectively, with no issues.


SYSTEM ALIGNMENT: Degree to which the ILSA is coherent with other components of the education system.

SYSTEM ALIGNMENT 1: Providing opportunities to learn about ILSA
Dimension 11 (rated: Established). Latent: The ILSA team has not attended international workshops or meetings. Emerging: The ILSA team attended some international workshops or meetings. Established: The ILSA team attended all international workshops or meetings. Advanced: not applicable.
Dimension 12 (rated: Latent). Latent: The country/system offers no opportunities to learn about ILSA. Emerging: not applicable. Established: The country/system offers some opportunities to learn about ILSA. Advanced: The country/system offers a wide range of opportunities to learn about ILSA.
Dimension 13 (rated: Latent). Latent: This option does not apply to this dimension. Emerging: not applicable. Established: Opportunities to learn about ILSA are available to the country's/system's ILSA team members only. Advanced: Opportunities to learn about ILSA are available to a wide audience, in addition to the country's/system's ILSA team members.


ASSESSMENT QUALITY: Degree to which the ILSA meets technical quality standards, is fair, and is used in an effective way.

ASSESSMENT QUALITY 1: Ensuring the quality of ILSA
Dimension 14 (rated: Established). Latent: Data from the ILSA has not been published. Emerging: The country/system met sufficient standards to have its data presented beneath the main display of the international report or in an annex. Established: The country/system met all technical standards required to have its data presented in the main displays of the international report. Advanced: not applicable.
Dimension 15 (rated: Advanced). Latent: The country/system has not contributed new knowledge on ILSA. Emerging: not applicable. Established: not applicable. Advanced: The country/system has contributed new knowledge on ILSA.

ASSESSMENT QUALITY 2: Ensuring effective uses of ILSA
Dimension 16 (rated: Advanced). Latent: If any, country/system-specific results and information are not disseminated in the country/system. Emerging: Country/system-specific results and information are disseminated irregularly in the country/system. Established: Country/system-specific results and information are regularly disseminated in the country/system. Advanced: Country/system-specific results and information are regularly and widely disseminated in the country/system.
Dimension 17 (rated: Latent). Latent: Products to provide feedback to schools and educators about the ILSA results are not made available. Emerging: not applicable. Established: Products to provide feedback to schools and educators about the ILSA results are sometimes made available. Advanced: Products to provide feedback to schools and educators about ILSA results are systematically made available.
Dimension 18 (rated: Advanced). Latent: There is no media coverage of the ILSA results. Emerging: There is limited media coverage of the ILSA results. Established: There is some media coverage of the ILSA results. Advanced: There is wide media coverage of the ILSA results.
Dimension 19 (rated: Advanced). Latent: If any, country/system-specific results and information from the ILSA are not used to inform decision making in the country/system. Emerging: Results from the ILSA are used in a limited way to inform decision making in the country/system. Established: Results from the ILSA are used in some ways to inform decision making in the country/system. Advanced: Results from the ILSA are used in a variety of ways to inform decision making in the country/system.
Dimension 20 (rated: Latent). Latent: It is not clear that decisions based on ILSA results have had a positive impact on students' achievement levels. Emerging: not applicable. Established: not applicable. Advanced: Decisions based on the ILSA results have had a positive impact on students' achievement levels.


International Large-Scale Assessment (ILSA): Development-level rating justifications

1. Mauritania has participated in two ILSAs in the last 10 years: the Monitoring Learning Achievement (MLA) II in 2003 and the Programme for the Analysis of Education Systems (PASEC) in 2004.

2. Mauritania has taken concrete steps to participate in at least two ILSAs: PASEC 2015 and TIMSS 2015.

3. The informal policy document, Basic Education Support Project: Funding Request to the GPE for Program Implementation, authorized by the Ministry of National Education in 2013, addresses participation in ILSAs.

4. The document Basic Education Support Project: Funding Request to the GPE for Program Implementation is not available to the public.

5. There is funding sourced from loans or external donors as part of Mauritania's National Education Sector Development Programme (PNDSE).

6. Funding covers all core activities of the ILSA, including international participation fees, implementation of the assessment exercise in Mauritania, the processing and analysis of the data collected from the implementation of the assessment exercise, reporting and dissemination of the assessment results in Mauritania, and attendance at international expert meetings for the assessment exercise.

7. ILSA funding does not cover research and development activities.

8. A team and a national coordinator carry out the ILSA activities.

9. The national coordinator is fluent in the language in which the international-level meetings are conducted and related documentation is available.

10. The ILSA office is adequately staffed to carry out the ILSA effectively. The ILSA team has previous experience working on international assessments and has attended all international meetings related to the assessment. In addition, the ILSA coordinator is fluent in the language of the assessment. Although the team does not have extensive training or experience to carry out the required assessment activities effectively, there have been no issues with the performance of the human resources responsible for ILSA activities, such as errors or delays in the printing or layout of the test booklets or in the administration of the assessment.

11. ILSA team members have attended all international meetings related to the assessment.

12. Mauritania offers no opportunities to learn about ILSAs, such as workshops or meetings on using international assessment databases, university courses on international assessments, funding for attending international workshops or training on international assessments, or online courses on international assessments.

13. This option does not apply to this dimension.


14. Mauritania met all technical standards required to have its data presented in the main displays of the international report.

15. Mauritania has contributed new knowledge on ILSA.

16. Country-specific results and information are regularly and widely disseminated in Mauritania. For example, a national report was made available online, and copies of the national report were distributed to key stakeholders. In addition, Mauritania's results were communicated through a press release and received coverage on television, radio, and in newspapers, while brochures and PowerPoint presentations with the results were made available online or distributed to key stakeholders. However, copies of the international report were not distributed to key stakeholders.

17. Products providing feedback to schools and educators about ILSA results are not made available.

18. ILSA results receive wide media coverage. Assessment results are on the front page of the newspapers or the main story on the TV news, and there are editorials or columns commenting on the international assessment results.

19. Results from the ILSA are used in a variety of ways to inform decision making in Mauritania, including tracking the impact of reforms on student achievement levels, and informing curriculum improvement, teacher training programs, other assessment activities in the system, and resource allocation. For example, results from PASEC have led to changes related to classroom size, textbook production and dissemination, and teacher training.

20. It is not clear that decisions based on ILSA results have had a positive impact on students' achievement levels.


Acknowledgements

This report, part of a 16-country benchmarking exercise in the Middle East & North Africa and Africa regions, was prepared by the World Bank SABER Student Assessment team in partnership with UNESCO, which led the data collection efforts. It benefited from feedback and review from Geraldo Joao Martins, Senior Education Specialist and Task Team Leader for education projects in Mauritania in the World Bank's Africa region, as well as participants of the national validation seminar organized by UNESCO in Mauritania.

References

Clarke, M. 2012. "What Matters Most for Student Assessment Systems: A Framework Paper." READ/SABER Working Paper Series. Washington, DC: World Bank.

United Nations Educational, Scientific, and Cultural Organization (UNESCO) Institute for Statistics. Education Profile: Mauritania. Montreal, QC: UNESCO. Data retrieved from http://stats.uis.unesco.org/unesco on September 11, 2013.

World Bank. 2012. Mauritania - Education Sector Development Program Project. Washington, DC: World Bank. Data retrieved from http://documents.worldbank.org/curated/en/2012/09/16791326/mauritania-education-sector-development-program-project on September 11, 2013.

------. 2013. “Mauritania: Country Brief”. Washington, D.C.: World Bank. Data retrieved from http://go.worldbank.org/RF97LCKG10 on September 11, 2013.

------. World Bank Development Indicators: Mauritania Country Indicator Data. Washington, DC: World Bank. Data retrieved from http://databank.worldbank.org/data on September 11, 2013.


www.worldbank.org/education/saber

The Systems Approach for Better Education Results (SABER) initiative produces comparative data and knowledge on education policies and institutions, with the aim of helping countries systematically strengthen their education systems. SABER evaluates the quality of education policies against evidence-based global standards, using new diagnostic tools and detailed policy data. The SABER country reports give all parties with a stake in educational results—from administrators, teachers, and parents to policymakers and business people—an accessible, objective snapshot showing how well the policies of their country's education system are oriented toward ensuring that all children and youth learn.

This report focuses specifically on policies in the area of student assessment.

This work is a product of the staff of The World Bank with external contributions. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

THE WORLD BANK
