Active Learning Network on Accountability and Performance in Humanitarian Assistance

Record of the Second Meeting

21-22 October 1997

Secretariat: Humanitarian Policy Programme, Overseas Development Institute, Portland House, Stag Place, London SW1E 5DP

In Attendance

Katherine Alley, Evaluation Department, UNICEF, New York
Eva Asplund, Head of Division for Humanitarian Assistance, SIDA, Stockholm
Helen Awan, Administrator, Humanitarian Policy Programme, ODI
Richard Blewitt, Emergencies Unit, ActionAid (currently on secondment to UN DHA)
Sue Birchmore, Head of Evaluation, World Vision UK, Milton Keynes
John Borton, Coordinator, Humanitarian Policy Programme, ODI
Polly Byers, Policy and Planning Adviser, Office of Foreign Disaster Assistance, USAID, Washington DC
Matthew Carter, Programme Officer, CAFOD, London
Louisa Chan, Office of the Director, Emergency and Humanitarian, WHO, Geneva
Jacqueline Coeffard, Head of Evaluation Unit, ECHO, Brussels
Sara Davidson, Inter-Agency Coordinator, People in Aid, London
Dominique Desvignes, Internal Audit Department, ICRC, Geneva
Antonio Donini, Chief, Lessons Learned Unit, UN DHA, New York
Claude Forthomme, Senior Evaluation Officer, FAO, Rome
Laura Gibbons, Coordinator, Relief and Rehabilitation Network, ODI
Andre Griekspoor, Head, Monitoring and Evaluation Unit, MSF-Holland, Amsterdam
Alistair Hallam, Research Fellow, Humanitarian Policy Programme, ODI
Caroline Harper, Head of Research and Development, SCF, London
Sonja Hyland, Department of Foreign Affairs, Dublin
Mukesh Kapila, Senior Humanitarian Advisor, Emergency Aid Department, DFID, London
John Kirkby, Senior Consultant, ETC, North Shields, UK
Natalia Langlais, Programme Officer, Emergency Aid Department, DFID, London
Bernard Lemaire, Emergency Programme Officer, Belgian Administration for Development Cooperation, Brussels
Joanna Macrae, Research Fellow, Humanitarian Policy Programme, ODI
John Mitchell, Emergency Advisor, International Division, British Red Cross Society, London
Stephan Moll, Controller, Humanitarian Aid and Swiss Disaster Relief, Swiss Agency for Development and Cooperation, Federal Department of Foreign Affairs, Berne
Susan Purdin, Project Manager, The Sphere Project, Geneva
David Riley, Chief, Programme Coordination Section, UNHCR, Geneva
Ian Shaw, Director of Graduate Studies, School of Social and Administrative Studies, University of Wales, Cardiff
Paul Smith-Lomas, Deputy Director, Emergency Department, Oxfam, Oxford
John Telford, Consultant, EMMA, Mountrath, Republic of Ireland
Koenraad van Brabant, Research Fellow, Humanitarian Policy Programme, ODI

Unable to Attend

Raymond Apthorpe, Independent Consultant, Australian National University, Canberra
Mikael Barfod, European Community Humanitarian Office, Brussels
Tony Beck, Interchange
Claes Bennedich, Swedish International Development Cooperation Agency, Stockholm
Eric Berg, Ministry of Foreign Affairs, Norway
Jim Bishop, InterAction, Washington
Margie Buchanan-Smith, ActionAid, London
Edmund Cain, United Nations Development Programme, New York
William Carlos, Department of Foreign Affairs, Ireland
Vincent Coultan, CARE UK
Niels Dabelstein, DANIDA, Copenhagen
Sean Doyle, DGVIII, Brussels
John Eriksson, Independent Consultant, Washington DC
Marika Fahlen, Ministry of Foreign Affairs, Sweden
Marco Ferrari, Humanitarian Aid and Swiss Disaster Relief, Berne
Stephen Green, Rome
Nils Gussing, Gussing Consulting, Geneva
Peter Hawkins, Save the Children Fund, London
Paul Hebert, Department of Humanitarian Affairs, Geneva
Ian Hopwood, United Nations Children’s Fund, New York
Kari Karanko, Ministry of Foreign Affairs, Finland
Chris Kaye, Department of Humanitarian Affairs, Geneva
Werner Kiene, World Food Programme, Rome
Ted Kliest, Ministry of Foreign Affairs, Netherlands
Alcira Kreimer, World Bank, Washington
Louise Lavigne, Canadian International Development Agency, Geneva
Hans Lundgren, Organisation for Economic Co-operation and Development, Paris
Wayne MacDonald, Canadian International Development Agency, Quebec
Peter McDermott, United Nations Children’s Fund, Geneva
Joel McLellan, Steering Committee for Humanitarian Response, Geneva
Ole Moesby, DANIDA, Copenhagen
Juan Francisco Montalban, Ministry of Foreign Affairs, Spain
Xavier Ortegat, VOICE
Andy Pugh, CARE, Atlanta
A Riberio-Vanderauwera, Belgian Administration for Development Cooperation, Brussels
John Rogge, United Nations Development Programme, New York
Bernard Sexe, Ministry of Foreign Affairs, France
Nick Stockton, OXFAM, Oxford
Klaus Streicher, Ministry of Foreign Affairs, Germany
Jacques Stroun, International Committee of the Red Cross, Geneva
Laurent Thomas, Food and Agriculture Organisation, Rome
Jeroen Verheul, Ministry of Foreign Affairs, Netherlands
Mr Von Rom, Ministry of Foreign Affairs, Germany
Elizabeth Wade-Brown, Catholic Fund for Overseas Development, UK
Peter Walker, International Federation of Red Cross and Red Crescent Societies, Geneva

DAY 1: 21st October

Welcome Address

Simon Maxwell, who took up his post as Director of ODI at the beginning of October, commended ALNAP members for their initiative in establishing the network and their evident support for and interest in it. He spoke about accountability initiatives over a broad front of development and humanitarian activities, drawing attention to the Charter for Food Security prepared for the World Food Summit, and the exciting linkages to be made between the accountability issues and the growing interest in a rights-based approach to development and poverty reduction.

Introduction and Review of Agenda

Those present introduced themselves. John Borton explained that, whereas the focus of the first ALNAP meeting had of necessity been on conceptual and procedural issues, the intention was for this second meeting to focus on more practical issues. The programme was a full one that would begin with a lateral look at approaches to accountability and performance issues in the UK social services sector, followed by presentations by OFDA and MSF-Holland on their recent experiences in introducing, respectively, performance monitoring and monitoring systems. The second day would focus on ALNAP management issues, with feedback from the newly elected Steering Committee. This would be followed by presentations from Alistair Hallam on his study of best practice in the evaluation of humanitarian assistance programmes and Koenraad van Brabant's discussion paper on institutional learning.

Tour de Table

SIDA: In collaboration with DGI of the European Commission, SIDA is undertaking an evaluation of the Swedish Committee for Afghanistan. It is also undertaking a programme of 'capacity studies' of selected organisations which have received substantial support from SIDA over an extended period. The Life and Peace Institute, the Mines Advisory Group and the African Housing Fund are among those organisations included in the programme.

DHA: The Lessons Learned Unit has been undertaking a study on indigenous mine action programmes in Afghanistan, Mozambique, Cambodia and Angola. The country reports and a synthesis report will be available soon. A study on humanitarian coordination in Angola which includes coordination in relation to the demobilisation programme and a study on strategic coordination in the Great Lakes region during 1996-97 are also underway. A draft report will be available by the end of the year and a workshop is planned to be held in Stockholm early next year.

Antonio Donini also outlined the Strategic Framework exercise which was being carried out under the auspices of the UN ACC (Administrative Committee on Coordination). Afghanistan had been selected as being representative of a country in

crisis and during October a number of senior UN officials and NGO personnel had engaged in a programme of activities with UN personnel working in Afghanistan to explore ways in which coordination could be improved between UN agencies involved in peacebuilding.

Richard Blewitt outlined the current process aimed at improving the strategic planning and monitoring functions within the UN Consolidated Appeals Process (CAP). A paper was in preparation which would examine the principal elements of 'good strategy' and strategic monitoring. A next phase would test the outcomes in relation to two countries, with Sudan being one and either Afghanistan or Tajikistan the other.

WHO: WHO has commissioned CRED (Centre for Research on the Epidemiology of Disasters) at the Catholic University of Louvain to undertake an evaluation of health sector interventions in the Great Lakes Region. WHO is collaborating with OFDA and CDC Atlanta to review health sector interventions in Afghanistan. As part of a study on the extent to which support to the health sector can contribute to the peace-building process, case studies are being undertaken in Slovenia and Angola. A meeting is being held at WHO at the end of October to prioritise the WHO research agenda and examine best practice issues in the health sector.

ICRC: Between late 1997 and early 1998 the ICRC will be reviewing its policies on evaluation. Because of the differences in perspective and use of terms within the organisation an ICRC evaluation terminology has been prepared. In July 1997 ICRC and ECHO auditors undertook a joint audit of ICRC programmes in Afghanistan.

UNICEF: UNICEF is participating in the tripartite WFP/UNHCR/UNICEF evaluation of operational coordination in the Great Lakes Region (complementing the DHA study which is focusing on strategic coordination). UNICEF is also exploring similarities and differences between monitoring and evaluation methodologies in development and conflict situations and the potential cross-overs between the two.

Sphere Project: the Steering Committee for Humanitarian Response (SCHR) and InterAction launched the Sphere Project in July 1997. Technical work, led by personnel seconded from agencies, had commenced at the beginning of October in the sectors of water and sanitation, food security, nutrition, health services, and shelter and site selection. It was planned that the Humanitarian Charter and the minimum technical standards would be published in June 1998.

DFA, Ireland: DFA has just completed an evaluation of its support for programmes in Bosnia and is currently undertaking an evaluation of its support for agencies in Somalia.

British Red Cross Society: As a result of the conclusions of the June 1997 World Disasters Forum held in London the BRCS had set up a project to investigate the value and feasibility of an Ombudsman for Humanitarian Assistance. Experience

with ombudsman and watchdog functions in relation to international organisations and the private sector is being reviewed. The project has a Steering Group and a Reference Group and it is proposed to use ALNAP as a wider reference group. A full report will be made to the 1998 World Disasters Forum. (A one-page Project Overview was circulated.)

MFA, Switzerland: Swiss Development Cooperation has been undertaking a systematic performance review of all (200) of its current projects.

Oxfam-UK: An evaluation of Oxfam's programmes in the Great Lakes Region from 1994-97 is just being finalised. As part of a quality review the Africa team has been reviewing earlier evaluations to assess the extent to which lessons have been learnt. Oxfam International (comprising ten organisations) is holding a Quality Conference in Brussels to determine common quality standards for all Oxfams. The Strategic Planning and Evaluation Department is undertaking a review of impact assessment with the objective of developing practical tools for fieldworkers.

People in Aid: The Code of Best Practice in the Management and Support of Aid Personnel was launched by PIA in February 1997 and is currently being piloted in several organisations. In addition PIA is developing a process and mechanisms for auditing the quality of organisations' human resource management. In order to assess PIA's impact on member and non-member organisations, a baseline study is being undertaken.

UNHCR: UNHCR has been undertaking a major organisational review which has included a review of the organisation’s objectives. Among the outcomes are that management objectives will be set by managers at the country level and that self-evaluation is to become mandatory practice (self-evaluation was found to be more meaningful than external evaluation). As a result of UNHCR's operational collaboration with NGO implementing partners and the need to clarify the responsibilities and obligations of UNHCR and the NGOs, Partnership Agreements are being introduced.

ETC: On behalf of Netherlands Development Cooperation ETC is currently investigating quality assurance mechanisms in Dutch humanitarian assistance with a focus upon the perspectives of beneficiary populations. A code of best practice may result from this work.

DFID: Natalia Langlais indicated that DFID was considering providing support to both the Sphere Project and the BRCS Ombudsman project. Mukesh Kapila signalled that DFID would be publishing a White Paper on the UK Aid Programme (the first White Paper in 22 years) in early November. It would contain a range of new policies and set out DFID's mandate on humanitarian aid in situations of conflict and natural disasters. The UK would have Presidency of the EU during the first half of 1998 and DFID was planning to use the opportunity to develop and raise awareness of humanitarian principles.

FAO: Following work with outside consultants (ETC) FAO was finalising a mission statement. A manual on FAO's role and activities in emergencies had been produced which includes a range of best practice examples. FAO is aware of the weaknesses in its current monitoring systems and is seeking to be more scientific and rigorous. An evaluation is planned of FAO's operations in Rwanda and Burundi.

ECHO: Jacqueline Coeffard indicated that the ECHO Evaluation Manual is to be reviewed and revised and that a study is to be commissioned in the near future on performance indicators. She explained that, during its first years of existence, ECHO's Evaluation Unit had concentrated on ECHO's own operational needs and had not taken part in wider discussions on evaluation and accountability but that this was now changing.

John Telford (EMMA) had been preparing a good practice manual for ECHO staff in Head Office and in the field which was intended to serve as the basis for training programmes. Among the nine principal areas covered were: international humanitarian law and its implications for humanitarian assistance; transparency and accountability; the participation of vulnerable groups; and management issues (particularly human resource management).

USAID/OFDA: OFDA has been developing its performance monitoring systems and a presentation on this process would be given after lunch. An evaluation was underway of the DART team which was deployed in Bosnia on a continuous basis over a five-year period. Internal evaluations were also underway of programmes in South America and Asia.

Belgian Administration Development Cooperation: No evaluations are currently planned of humanitarian assistance programmes. Programmes are monitored but experience has been that monitoring procedures are problematic.

ODI: Simon Maxwell shared some of the outcomes of a study he had been involved in on rehabilitation in the Horn of Africa, including 14 'Important lessons to learn about rehabilitation' and the principal elements of being a 'good donor', a 'good aid recipient' and a 'good NGO'. (A three-page note was distributed.)

John Borton explained that, on behalf of DFID's Evaluation Department, he and Joanna Macrae had submitted a synthesis study of 28 emergency aid evaluations in April. This had been followed by a review he had undertaken of recent organisational and procedural changes in the management of emergency aid within DFID to assess the extent to which the findings of the synthesis study had been addressed. The two reports were scheduled to be considered by DFID's Project Evaluation Committee in December and should be placed in the public domain early in 1998.

Joanna Macrae outlined an evaluation of International Alert which had been commissioned by Sweden, Norway, Denmark and the Netherlands and led by the

Chr. Michelsen Institute, Bergen. She and a colleague had prepared a case study on International Alert's work in Sierra Leone. The other case studies were of Burundi and Sri Lanka. It was planned to publish the main report in the near future.

Presentation: 'Accountability and Performance: Similarities and Differences Between the UK Social Services Sector and the International Humanitarian System' by Ian Shaw.

Ian's presentation began by outlining the trends relating to accountability and performance in the UK with particular reference to social work. A critical influence had been the Conservative Government's determination in the 1980s to apply commercial models of quality to the public sector and the resultant stress on improving consumer interests and satisfaction in the subsequent reforms. A 1986 report on community care by the Audit Commission had served as a benchmark for later reforms and given them a value-for-money orientation. The Social Services Inspectorate was strengthened and advocated inter alia 'more explicit standards', 'choice' and 'accountability'. In 1990 a Home Office review of agencies and projects in the voluntary sector had focused on performance against funding and adopted a 'strategies expressed through objectives' approach. This had led to the establishment of the Charities Evaluation Service and stimulated the use of evaluation by voluntary agencies.

He then identified a cluster of related themes within the literature on the UK experience over the last decade.

The meaning of 'quality' is elusive as it is made to serve different purposes by different groups of people, and by the same groups of people at different times. However, it is the 'value for money' concept of quality which is now central to any debate about service quality. The rhetoric of quality has been used to legitimate organisational change, whether in the social services, the fire service, public libraries or British Rail. 'Quality' has been used to establish a shift in relationships between the state and voluntary sectors, and it has been closely associated with a reassertion of managerial control and the flourishing of management-by-effectiveness strategies. The emphasis upon quantitative measures of effectiveness and efficiency has resulted in various types of 'goal distortion' and 'goal displacement'; one example being an over-emphasis on measurable items and a relative neglect of less measurable ones.

Options open to users dissatisfied with a service are often limited. The social science literature identifies the exit and voice options. The exit option is frequently used by consumers in a competitive multi-product market. But, where users are vulnerable, or regard themselves as involuntary service users, or are on the receiving end of a service monopoly, they have almost no exit option.

Evaluation, if approached in a participatory way, can enhance the voice option. Despite the emphasis on evaluation within the social work sector, a distinct

difference is evident between micro-level evaluations undertaken by practitioners and formal evaluations (evaluations with a big 'E'), which are usually managerially driven and involve 'scrutiny from above'.

He ended by posing a series of questions for discussion in relation to the humanitarian system, including:

Does the absence of a frame of reference in statute or precedent suggest that greater emphasis should be placed on process measures of evaluation based on best practice, peer review and 'in-house' policing of quality?

How can participatory principles be introduced into quality measures?

How do the cultural and political contexts influence the choice of evaluation method and specific enquiry methods?

During the ensuing discussion a number of points emerged:

• A key difference between the UK context and that in which most humanitarian agencies operated was the very limited media scrutiny and public opinion mechanisms, and the weakness or absence of national and local government structures through which the opinions of the affected population could be represented.

• The increased emphasis on performance indicators was likely to reduce the willingness of agencies to work in contexts where the risks of an unsatisfactory outcome were high. Agencies were likely to become less innovative and more cautious.

• It was acceptable for agencies to achieve lower standards in the early phases of an operation than later on, when the situation and resource flows were more stable.

• Ways of taking greater account of beneficiary perspectives might include the election of representatives from the beneficiary community or the use of 'participant observer' evaluation, where the evaluator spent an extended period with the beneficiary community. Care was needed to ensure that the views of those among the affected population who were not beneficiaries (because they had been left out of the assistance programmes) were represented.

• The context in which programmes were operating was critical to the standards that could be expected and the judgements that could be made.

• The concept of cost-effectiveness can be problematic and work against the interests of beneficiaries.

• Some activities are inherently easier to measure than others, and the likelihood was that easily measurable activities (eg. water production) would be selected as indicators of performance whilst other activities (eg. ensuring ease of access and hygiene awareness campaigns) might not be measured and thus experience a reduced emphasis.

Reflecting on the discussion Ian identified the need for the humanitarian system to develop qualitative methodologies, critical awareness and the visibility of practice. Minimum requirements for accountability were:

i) indications of what constituted good practice; ii) peer review; iii) an independent voice.

Presentation: 'The Selection of Performance Indicators in a Donor Organisation: The Process of Preparing the R4 Report Within OFDA' by Polly Byers

Polly began by explaining the structure of USAID and OFDA's location within USAID's Bureau of Humanitarian Response alongside the Office of Food for Peace, the Office of Private and Voluntary Cooperation and the Office of Transition Initiatives. It was important to see OFDA's performance indicators as flowing from OFDA's goals which themselves were framed within a hierarchy of USAID's goals and US national interests.

US national interests:

i) economic opportunity promoted; ii) humanitarian and other complex crises prevented; iii) prospects for peace and stability enhanced; iv) US protected against specific global dangers.

USAID's goals:

i) broad-based economic growth achieved; ii) sustainable democracies built; iii) world's population stabilised and human health protected in a sustainable fashion; iv) environment managed for long-term stability; v) lives saved, suffering reduced and development potential reinforced.

OFDA's goal was the last of these, ie. lives saved, suffering reduced and development potential reinforced.

Polly explained the process by which a team in OFDA (referred to as the R4 team after its full title Results, Review and Resources Request team) had developed a hierarchy of strategic objectives and intermediate results and identified indicators to be used in assessing OFDA's performance in addressing these objectives. In selecting indicators the priority had been to select those that were realistic and could be readily reported on. The process was not yet completed. Piloting was planned in two countries and refinements might well be made before OFDA put the system into global use.

Two strategic objectives had been identified in pursuit of OFDA's goal namely 'Critical needs met of targeted vulnerable groups in emergency situations' and 'Increased adoption of mitigation measures in countries at greatest risk of natural and man-made disasters'. For each Strategic Objective a series of 'Intermediate Results' had been selected and indicators of performance identified for both the Strategic Objectives and the Intermediate Results (see below).

STRATEGIC OBJECTIVE 1:

Critical needs met of targeted vulnerable groups in emergency situations.

Indicator 1.1: Percent of disaster responses where an acceptable proportion of the target vulnerable population's critical emergency needs have been met: a. complex emergencies, b. natural and man-made disasters

Intermediate Result 1.1:

Improved targeting of emergency assistance to the most vulnerable groups.

Indicator 1.1.1: Percent of disaster responses based on periodic process of needs assessment and recalibration of targeting: a. complex emergencies, b. natural and man-made disasters

Intermediate Result 1.2:

Emergency assistance, meeting recognised standards, delivered within acceptable time frame.

Indicator 1.2.1: Percent of disaster response programmes accomplished within acceptable timeframes: a. complex emergencies, b. natural and man-made disasters

Indicator 1.2.2: Percent of disaster response programmes that have delivered emergency assistance packages which meet international standards: a. complex emergencies, b. natural and man-made disasters

Intermediate Result 1.3:

Capacities for livelihoods restored.

Indicator 1.3.1: Percent of disaster response programmes which include the implementation of appropriate relief to development components

Intermediate Result 1.4:

Disaster response capabilities of NGOs and host government entities strengthened.

Indicator 1.4.1: Number and percent of health standards informing health protocols adopted by implementing agencies

Indicator 1.4.2: Percent of emergency health NGOs with health professionals trained in OFDA-approved emergency health protocols

STRATEGIC OBJECTIVE 2:

Increased adoption of mitigation measures in countries at greatest risk of natural and man-made disasters.

Indicator 2.1: Number and percent of OFDA-targeted at-risk countries with one or more Prevention/Mitigation/Preparedness (PMP) programmes

Intermediate Result 2.1:

Enhanced institutional capacity of NGOs and international organisations to reduce the impact of disaster.

Indicator 2.1.1: Change in the institutional capacity of international NGOs and IOs to develop and implement PMP programmes

Intermediate Result 2.2:

Strengthened host country capacities to reduce vulnerability to natural disasters.

Indicator 2.2.1: Percent of OFDA-targeted vulnerable countries developing, adopting and practising national and local disaster mitigation and preparedness programmes (Host Country Institutional Capacity [HCIC] score)

Intermediate Result 2.3:

Improved use of resources to link relief and development.

Indicator 2.3.1: Percent of disasters at sub-national, national and regional levels with a Strategic Plan Quality Score (SPQS) of 3 or more

Discussion following the presentation covered a range of issues. Concern was expressed at the apparent inability of the indicators to account for the context in which OFDA and its partner agencies were operating. How much 'humanitarian space' did they enjoy in their area of operation? Would poor performance against the indicators in a particularly difficult context (eg. where the front-line kept shifting

and access to target populations was severely restricted) be regarded differently from poor performance in less difficult contexts? Might the system represent a "deadening bureaucratic process", making agencies more cautious in their behaviour and discouraging responses to humanitarian needs in particularly difficult contexts? Such a system of fixed indicators and targets appeared to run counter to recent ideas on 'Smart Relief' in areas of ongoing conflict with its emphasis upon pragmatism and opportunism.

Another area of concern was that of funding organisations developing objectives which, though broadly similar, were not identical and in some respects might emphasise different aspects of the performance expected of partner agencies; it might become difficult for partner agencies receiving funding from more than one donor to reconcile the different expectations. As funding organisations made greater use of performance indicators, it was important that they took steps to ensure that major contradictions did not emerge between the objectives of different organisations. Differences between funding organisations might also arise in relation to their country strategies, and it was equally important to ensure coherence between these.

Presentation: 'Monitoring within an Implementing Agency: the Development and Introduction of a New Monitoring System within MSF-Holland' by Andre Griekspoor

Andre explained how MSF-Holland had been working to improve both the monitoring and evaluation processes within the organisation, but that the presentation would focus upon monitoring as it was here that the greatest developments had occurred, resulting in the introduction of a new monitoring system.

Changes in the organisational structure, in particular the delegation of greater responsibility to the field and the dismantling of the Project Management Department to give the organisation a 'flatter' structure, had been an important motive for the introduction of the new monitoring system. The new Operational Directors were less directly involved in the running of operations but still needed to be kept informed of key information. Another factor contributing to the development and introduction of the new monitoring system had been the introduction of a planning system in 1994 based on the Logical Framework. Considerable effort had been put into institutionalising the 1994 planning system and the new monitoring system built upon the planning system.

Monitoring was defined as "the continuous process of measuring, collecting, analysing, recording and communicating information to assist project management decision-making". Guiding principles for the process were:

1. The information provided at each level of the four principal levels of the organisation should be tailored to the information needs and responsibilities of managers at each level (a principle that required clarification of the

decision-making responsibilities at each level).

MSF-H Board > Operational Director/Management Team (Amsterdam) > Country Manager > Project Coordinator

2. Each instrument (report, debriefing, project proposal, etc.) should have two functions:

i) to meet the management information needs of a particular level; and ii) to enable that level to account to a higher level.

3. Each instrument should receive a response.

4. The amount of information should be kept to the minimum needed.

It was recognised that:

i) each management level and instrument operated within a policy context (ie. agency policies in a particular country and overall agency policies); ii) projects involved resource allocations which needed a planning context (project level, country level and agency level); iii) it was critical to observe the environment in which the agency was operating.

The different types of monitoring instrument to be used for operational monitoring by the Country Managers were identified as being:

• exploratory mission/assessment reports
• interventional approach reports
• project proposals and project planning
• updates on human resources and finances
• project reports
• debriefing of other members of the Country Management Team after field trips
• debriefing field staff during field visits or when leaving the country/at the end of contract
• trip reports from supporting HQ departments
• ad hoc communications

The instruments to be used for monitoring at the central level were identified as being:

• Sitreps
• quarterly progress reports (results and resources)
• field trip reports by the Operational Director
• trip reports by supporting departments in HQ
• debriefing of the Country Managers
• ad hoc communications

Many of these instruments were already in use but their design and the way that they were used lacked clarity as to their purpose and the key information that they had to transmit.

Andre explained that the agency fully realised the importance of using reporting instruments in combination; no single instrument could provide all the information required, and important decisions should not be taken on the basis of only one report. Field trips were an important part of checking the information being provided through project reports. In addition, it was accepted that each instrument required feedback; an effective monitoring system requires those generating reports to be informed about what was and was not useful in the report and what use was made of the information.

How well was the new system working? In part it was too early to say, as improvements were still being made to particular instruments, such as the format used for the Sitreps, and to ensuring that the analysis undertaken as part of the decision to commence a project was properly recorded. Nevertheless, appreciation had been expressed by Operational Directors for the additional information that the system was generating (or at least for the fact that information was being provided in a clearer and more consistent form), and there was a general feeling that the new system represented an improvement on previous practice. The inclusion of a 'Status Descriptor' section in the project reports was encouraging greater openness about the constraints being experienced.

A concern was that perhaps more attention was being placed upon formal reporting requirements and indicators than upon informal reporting and upon using the reports as an opportunity to reflect and, if necessary, reconsider activities and approaches. Reporting was not the same as monitoring, and a balance was needed between formal and informal methods of transmitting information. Another concern was that there was still considerable variation in the quality of some reports and that the change in format needed to be followed up with training and coaching.

Finally, the selection of indicators and their usefulness still represented a major problem. Many MSF-H projects included several activities running in parallel, and these required separate indicators of performance. Moreover, most projects were prepared using a Logical Framework approach, but experience was showing that the quality of reporting on the indicators was often only as good as the indicators themselves; in other words, a poor indicator often resulted in reporting that was not very useful. Discussions were continuing within the agency as to the value of different indicators and their usefulness at different levels within the agency.

In the discussion following Andre's presentation a number of points emerged:

- Systematic monitoring and reporting is vital in organisations where there are frequent changes of staff. The ability to trace the 'decision trail' on particular issues or projects was an important resource for newly arrived managers, particularly at the country level.
- End of project reports were useful and should include key questions drawing out the reasons for the closure of the project.
- Monitoring and reporting were activities which offered opportunities for involving the perspectives of the local community/affected population and also for strengthening the capacities of local staff and local structures. The involvement of local communities was a vital part of analysing the needs and setting the objectives of a particular project. The local community could also play a useful role in selecting the indicators to be used.
- The quality of human resources was often central to the issue of the quality of reporting.
- Attempts to clarify what was required of field staff in their monitoring and reporting were to be welcomed, as field staff often found it difficult to decipher what was actually required of them.

ALNAP Management Matters

John Borton explained that the purpose of this final session of the day was to present progress in obtaining funding for ALNAP and in developing the Evaluations and Reports Database; to confirm the composition of the Steering Committee ahead of its first formal meeting, which would take place that evening; and to inform members of the proposed setting up of a monthly Newsletter.

Funding

John presented the table listing financial contributions to date. The Irish Department of Foreign Affairs, the Swiss Agency for Development and Cooperation (SDC) and SIDA had all recently confirmed significant contributions to ALNAP (in the £20,000 to £27,000 range) which complemented the original DFID 'start-up' funding. Smaller contributions (in the £1,200 to £3,000 range) had also been received from FAO, IFRC and MSF-Holland and offers of a similar level of contribution had been made by UNHCR, WHO and ICRC. When these contributions were combined the funding requirements of approximately £115,000 for Year 1 were covered. The contributions from the SDC, SIDA and the IFRC were for two years and the expectation was that DFA funding for Year 1 would be matched in Year 2. Though there was a funding shortfall for Year 2 at this stage, it seemed probable that it would be made up over the next few months.

Following points of clarification and minor corrections to the table, it was agreed that representatives of those organisations contributing funds, or planning to contribute funds, would meet the ODI Accounts Administrator after the first day's session to discuss contractual and reporting arrangements.

Evaluations and Reports Database

John reminded participants that the objective of the database was to make ALNAP members aware of the existence of humanitarian aid evaluations and key studies on humanitarian aid issues and to enable the Secretariat to extract and e-mail summary information on individual studies to ALNAP members.

A database had been established since the May meeting using the same library software as used by the ODI Library (Inmagic/DB Text Plus). The structure of the database was described. In addition to the normal cataloguing used by the ODI Library, additional fields had been created to address the specific needs of ALNAP members. The additional fields included: the Table of Contents of the report, the Executive Summary, the Principal Findings and the TOR for the study. This information had been scanned into the database for 80 reports. Additional fields on the composition of the team which prepared the study, a description of the methodology employed, the way the study was managed, the value of the programmes being evaluated and the approximate cost of the evaluation were awaiting entry by John.

It was noted that the focus of the database would be on evaluation studies but that other documents with a high relevance and use to ALNAP members would also be eligible for entry, including reviews, mission reports, and relevant academic papers. Furthermore it was noted that for the database to become more comprehensive would require ALNAP members to provide reports to ODI to enter onto the database. Anyone submitting a report should indicate its confidentiality. Three status levels were available: 1. Secretariat only; 2. ALNAP members only; 3. Open.

Composition of the Steering Committee

John reminded the meeting of the composition of the Steering Committee that had been agreed at the May meeting:

Bilateral/multilateral donors: Eva Asplund, SIDA; William Carlos, DFA
UN agencies/departments: Maureen Connelly, UNHCR; Antonio Donini, DHA
NGOs: Andre Griekspoor, MSF-H; one to be nominated
ICRC/IFRC: Both to be nominated

It was noted that Sonja Hyland would be representing DFA in place of William Carlos, who was unable to attend this meeting. Following the first ALNAP meeting, discussions within UNHCR had resulted in the decision that it would be more appropriate for a representative of the Programme side to participate in ALNAP meetings, and so Maureen Connelly had been replaced by David Riley from the Programme Coordination Section. At the first meeting Peter Walker had been unsure who would represent the IFRC at subsequent meetings. It had now been agreed that Peter would continue to represent the IFRC, but he was currently on a mission and had proposed that John Mitchell of the BRCS represent the IFRC at this meeting.

This left the second NGO representative and the ICRC representative to be nominated. Dominique Desvignes explained that Jacques Stroun would be the principal contact for ALNAP in ICRC but that he was unable to attend this meeting and she had been nominated to represent ICRC in his place. NGO representatives present proposed and supported the nomination of Susan Purdin of the Sphere Project as the second NGO representative.

Attention of participants was drawn to the Draft Agenda for the Steering Committee meeting to be held at 6pm that evening.

ALNAP Newsletter

During the summer some members had indicated a desire for the sharing of information to continue between the meetings. It had been agreed that the Secretariat would prepare a simple Newsletter which would be e-mailed to members at the end of each month. Perhaps every three months it could include an update on all relevant activities of ALNAP members. The proposal was broadly welcomed.

The first day's session was brought to a close and participants were reminded of the ALNAP dinner to be held at a nearby restaurant at 19.30.

DAY 2: 22nd October

John Borton welcomed participants to the second day. He explained that at the start of the Steering Committee the previous evening Andre Griekspoor had been appointed as Chair of the Steering Committee.

Andre explained that the draft agenda had been adhered to though an additional item 'Dissemination of Material' had been added and there had been a discussion about the time to be served on the Steering Committee. It had been agreed that the current members of the Steering Committee would serve for one year and that there should be an annual rotation. He then summarised the discussion and outcomes on each item of the agenda.

Management of the In-Depth Studies

It had been decided that it would be preferable to 'de-link' the in-depth studies from ALNAP. It had been apparent from the discussions at the May meeting that it would be difficult for the wide range of organisations comprising ALNAP to agree on priority subjects for the in-depth studies. In addition, an uneasiness had been apparent over the involvement of the Secretariat in commissioning, and possibly undertaking, the in-depth studies. It had therefore been proposed to allow such studies to be proposed and funded in the normal way, with interested and willing organisations participating. It was likely that at least some ALNAP members would be aware of, or possibly involved in, such study initiatives and that ALNAP would therefore be kept informed of their initiation, progress and outcome. In cases where those undertaking such studies were not members of ALNAP, they could be invited to come and present the progress and results of the studies. De-linking would therefore not exclude ALNAP from being informed about the studies that were carried out.

Some concern was expressed about the effect of 'de-linking' on ALNAP's ability to ensure that key studies were carried out. Whilst ALNAP's ability to directly initiate studies might be lessened it would still be possible for ALNAP to recommend particular studies to interested parties and for some ALNAP members to carry forward such recommendations. The Secretariat would still have the ability to commission Background Papers (of the sort presented by Koenraad Van Brabant) and studies synthesising the results of evaluation reports.

Review of ALNAP Membership

It had been agreed that membership should deliberately be broadened to include representation of 'southern' NGOs and possibly south-based research institutions. In addition, it would be desirable to add at least one France-based NGO and to encourage participation by the French Government. The issue of participation by 'southern' governments had been discussed again, and it had been decided that it could be highly problematic and should not be considered until ALNAP was much better established. The issue of participation by human rights organisations had also been discussed. It was felt that the UN High Commissioner for Human Rights/Human Rights Centre should be invited to join, as it had an operational role in several contexts, but there was not a consensus on whether human rights NGOs should be invited. In the absence of consensus it had been decided not to pursue participation by human rights NGOs.

Discussion revealed a broad concurrence with the Steering Committee's views, and a similarly divided range of opinion on the human rights NGO issue.

Funding Matters

Nothing was substantially added to the earlier discussion in the Steering Committee. A meeting had been held involving actual and potential contributors to ALNAP and the ODI Accounts Administrator. Agreement had been reached on contractual arrangements and financial and narrative reporting procedures.

Reporting Procedures

The Steering Committee would receive quarterly narrative reports on progress and those organisations contributing funding would receive quarterly financial reports for the same period.

Dissemination of Material

The Steering Committee had discussed the proposed e-mail Newsletter. The Secretariat had also been encouraged to explore the use of Alias Listservers and the possibility of a Website. The existence of a monitoring and evaluation website funded by a group of UK NGOs and maintained by the Centre for Development Studies at Swansea had been noted. The Steering Committee also stressed the need for organisations submitting reports and information to the Secretariat to be clear about the status of the reports and for the three levels of status already in use on the database to be adhered to.

In discussion it was agreed that the e-mail list used by the Secretariat would be provided to all members to enable them to send their own information to all members; it was not necessary for all update information to be channelled via the Secretariat.

Presentation: 'Status of the DAC Study to Identify and Disseminate Best Practice in the Evaluation of Humanitarian Assistance Programmes and Feedback on the Questionnaire Circulated to ALNAP Members in July' by Alistair Hallam

Alistair explained that the study had commenced in June 1997 and would be completed by April/May 1998. A workshop to review a draft report was planned for early in the New Year.

Following an initial review of the literature he had prepared 17 questions which had been sent to 55 existing and potential members of ALNAP in July. A 'report back' paper summarising the 14 responses received had been e-mailed to members the previous week and a hard copy included in their meeting pack. He had deliberately not indicated which respondents had held which views, as some had replied informally and from a personal rather than an organisational perspective, whilst others had clearly presented a considered organisational position. In view of the limited time available he wanted to concentrate on the issues emerging from the work carried out to date.

A three page paper summarising these issues had been included in the meeting pack.

In summary the issues were:

i) Recognising the political dimension of humanitarian assistance
ii) Establishing a framework for humanitarian assistance
iii) Establishing realistic expectations about the scope of the evaluation
iv) Developing appropriate participatory techniques for information-gathering in emergencies
v) The importance of improving overall management of humanitarian aid programmes
vi) The importance of follow-up to evaluations
vii) The importance of dis-aggregating data
viii) The need for a system-wide analysis

He wanted to hear the views of ALNAP members on this formulation of the key issues. His approach to the draft report was iterative, with frequent expansion, revision and refinement. A draft would be sent out in December ahead of the workshop in early 1998.

Initial discussion focused on his proposals regarding follow-up. The idea of a standard one-year-on review of the status of the recommendations was felt to be rather crude but, given the current practice of very limited formal follow-up to reports, was probably necessary.

The role of the evaluators in the follow-up period was discussed. Some members felt that more use should be made of evaluators in the discussions about how best to deal with particular recommendations, as it was very common for the evaluators' contact with the report, and awareness of its status within the commissioning organisation, to end following delivery of the report. During the course of the discussion one member referred positively to the active follow-up to the OLS Review submitted in late 1996 and to a recommendation matrix which had been prepared by the relevant UN agencies. That a positive follow-up process was underway was news to one of the members of the evaluation team present. An effective return on the evaluation 'investment' implied that greater use should be made of the knowledge acquired by the evaluators.

Variability in the quality of evaluators was discussed, and concern was expressed that there was no apparent professional code or standards to which evaluators adhered. This was contrasted with auditors. It was agreed that there was a need for such a code to be developed and for more training to be provided to those involved in humanitarian aid evaluations.

The need for system-wide analysis was broadly supported as responses could only be judged in the context of overall needs rather than those in just one part of the affected area.

The value of continuity in successive evaluations of the same programme was discussed. FAO uses different evaluators for each phase of a programme but is unsure of the efficiency of this approach. Where an earlier evaluation had been critical of project personnel, it was possible that the same evaluators undertaking a follow-up evaluation might not be welcomed.

Presentation: 'Organisational and Institutional Learning in the International Humanitarian System; Opening the Dialogue' by Koenraad Van Brabant

Koenraad presented his paper which had been e-mailed to members the previous week and a hard copy included in the meeting pack. The paper had been commissioned by the Secretariat in response to points made during the May meeting. Potentially the paper could serve as the first ALNAP Discussion Paper.

He emphasised the need to distinguish between organisational learning (collective learning within an organisation) and institutional learning (system wide learning between and across agencies). As with the evaluation literature, most of the literature on organisational and institutional learning in the aid sector related to development agencies and stable contexts. It had therefore been necessary to draw on his own field and agency experience in preparing the paper. His presentation covered the same areas as the paper:

- Organisational and Institutional Learning in the Aid Sector
- Why Should Organisations Learn?
- Obstacles to Learning
- Catalysts for Change, Catalysts for Learning?
- Creating the Learning Organisation: Internal Changes
- Creating the Learning Organisation: External Changes
- Monitoring and Evaluating the Learning Organisation
- Rewarding the Learning Organisation

The following points emerged during the discussion:

The sharing of information necessary for learning is often poor due to time pressures on personnel, which in turn were a product of the level of resources available.

Learning organisations were desirable but cost more to run; this had to be recognised by donor organisations in the levels of overhead they provided for in their funding.

Mechanisms need to be developed to assess whether or not personnel have 'learnt'.

Attitudes to learning are greatly affected by the culture of the organisation and the cultural background of its personnel.

The rapid turnover of staff in agencies is a major barrier to organisational and institutional learning and is a key factor discouraging investment by agencies in training. Instances were cited of personnel leaving the organisation shortly after they had benefitted from specialist training.

One member with considerable experience working in and evaluating humanitarian agencies felt that many of the problems experienced within agencies stem from personality issues, yet the majority of efforts to address such problems employ organisational and structural change approaches rather than improving the way individuals relate to each other.

ALNAP's role in fostering learning was discussed. It was felt it had an important role to play in encouraging agencies to share their learning experiences. As a follow-up to Koenraad's paper it was suggested that a checklist be prepared for agencies to consider whether or not they have the characteristics and mechanisms of a 'Learning Organisation'. This would help stimulate discussion within members' organisations, and the results of these discussions could be shared at subsequent meetings. Another suggestion was to set up a pilot study to examine how lessons had been learned within and between a selected sample of organisations.

Conclusions and Follow-Up

There was broad agreement that the meeting had been valuable. In trying to identify the priority areas of work for ALNAP over the coming year, the following areas were highlighted:

- Organisational and Institutional Learning
- The Strategic Framework
- Evaluation and review processes and their impact on decision-making
- Beneficiary Perspectives and Consultation
- Human Resource Development

The Secretariat undertook to reflect on ways in which these areas could be carried forward.

John Borton thanked participants for making the meeting a success. Members gave a special vote of thanks to Helen Awan for ensuring the smooth running of the meeting.

The meeting ended at 13.30.

Results of the Evaluation Form Completed by Participants in the Second ALNAP Meeting

18 forms were returned by participants. Of these, four were prepared by participants who were not present for all the sessions and so were not fully completed. [6 were prepared by representatives of bilateral/multilateral donors; 4 by representatives of UN agencies/departments; 5 by representatives of NGOs; 1 by a Red Cross representative; 2 by consultants.]

Length of meetings

1.5 days: 11
2 days: 5
Longer than 2 days: 1

[2 members suggested that meetings should be held near a weekend (Thursday/ Friday was suggested) to enable participants to reduce their travel costs by staying over a Saturday night]

Format of meetings/number of presentations

Too many presentations: 0
About the right number: 11
Too few presentations: 7

Results of the Usefulness/Relevance scoring of the five presentations

('Limited use' was given a score of 1; 'relatively useful' a score of 2; 'useful and relevant' a score of 3 and 'very useful and relevant' a score of 4)

UK Social Services: 39
OFDA: 40
MSF-H: 44
DAC Evaluation Study: 61
Organisational/Institutional Learning: 53

[No pattern was discernible between the scorings given by respondents from different types of organisation]

Themed meetings or range of issues

Focus on particular theme: 11
Cover a range of issues: 6

[One member suggested a combination of the above]

Was sufficient time allowed for ALNAP management matters?

Too little: 0
About right: 14
Too much: 3

Food/refreshment arrangements

All 18 respondents said the arrangements were good.

The following suggestions were offered

1. Choose a restaurant with a lower noise level!
2. Some UN agencies get a flat rate DSA, so some members ought to pay separately for their dinner.
3. Hot water should be provided for the herbal tea drinkers.

Satisfaction with the way ALNAP is developing

Not satisfied: 0
Satisfied: 12
Very satisfied: 6

COMMENTS MADE IN RESPONSE TO SPECIFIC QUESTIONS

In what ways do you think ALNAP's structure and activities might be improved?

1. Important to build on ALNAP members' experience, plans and ideas. In the beginning it may require active 'pushing' from the Secretariat to get ideas for presentations at coming meetings and for the newsletter.

2. More discussions in between meetings. Stimulate people to get a far deeper involvement in some issues. Link up people working on same issues.

3. Discussions could be in smaller groups, possibly with an agenda proposed by the speaker, depending on the topic. Need structure for discussion, however achieved.

4. Work in group sessions as well as plenary.

5. ALNAP is still at an early stage of development. The range of issues explored in the second meeting is fundamental to improving organisational learning. Given the very fruitful discussions which took place and the suggestions made for further work, it may be easier to comment on this area after the next meeting, since information has started to flow and will continue to do so from now onwards.

6. Don't keep sending questionnaires out to the membership. Eventually people will stop answering if they receive too many and don't receive satisfactory feedback on them.

7. ALNAP needs to cultivate a clear manageable focus to avoid drifting into an all encompassing body picking up 'all the issues' relating to humanitarian work.

8. Present and discuss more 'case studies'

9. Longer sessions on fewer issues.

10. The Secretariat should indicate at the end of each meeting what the agenda will be for the next meeting and the topics which might be suitable within a year.

11. More opportunities for interaction between meetings and at meetings.

12. Expanding slightly to include southern organisations. Should be very careful not to dilute focus. Hopefully will spend more time on substance presentations rather than on ALNAP management matters.

13. Advance the concrete studies and their follow-up and application especially the DAC Evaluation Study leading to specific guidance on skills, knowledge and attitudes of the 'good evaluation'

14. There was a tendency to focus on the more macro problems and not look at how the results of evaluation can be put into practice at the operational level.

15. Focus on one theme/experience/presentation (or maximum of 2) per meeting. Allow more time for exchange of information among ALNAP members.

What key issues do you think ALNAP should prioritise over the next 12 months?

1. Organisational Learning and beneficiary participation are important and OK for the coming meetings. The issue of reporting is related to a number of questions we have discussed and should be brought up at a later stage as it concerns all types of ALNAP members.

2. Reinforcing evaluation processes by field managers.

3. Organisational learning. Evaluation/learning/decision-making at strategic level - how to go beyond project/programme evaluation.

4. Beneficiary consultation/involvement in monitoring, evaluation and accountability measures.

Ongoing evaluation techniques and tools (as opposed to 'Big E' evaluations).

5. Beneficiary voice and how to find it. Feedback on existing practice (this was suggested for different organisations) Happy with the proposed list produced at the end of the meeting.

6. Organisational learning Human resource development

7. Organisational/institutional learning How to improve dissemination of lessons learned to the field Donor/NGO coordination - How to improve coordination to achieve real change in the delivery of humanitarian assistance.

8. Monitoring and evaluation in humanitarian assistance.

9. Best practice in evaluation. Key themes arising from evaluations.

10. Learning processes.

11. Follow up of the paper on organisational and institutional learning (case studies for NGOs, bilateral and multilateral agencies)

12. How to effectively disseminate only the most relevant documents throughout the network.

13. Continue to share/present/discuss evaluation and monitoring processes at agency and donor level. Follow-up on Sphere project progress, Ombudsman, Learning. What do evaluation, accountability and performance mean to southern NGOs?

14. How does evaluation contribute to improved humanitarian response? Learning from evaluation applied to improved practice.

15. Follow-up on the DAC Best Practice Study. Follow-up on organisational/institutional learning; the role of evaluations in informing organisational change.

16. Follow-up on the DAC Best Practice Study. Methodologies and skills (especially on participatory methods) and qualitative evaluation (eg. of protection activities). There is a need to combine both quantitative and qualitative approaches.

17. Working more closely with implementing agencies/NGOs (There was very little vocal input during the meeting from operational NGOs!).

18. Strategic Framework approach and its implications for strategic planning/monitoring/evaluation.

Suggestions for additional ALNAP members

Health Net International
MEMISA
Centre for Human Rights (suggested separately by 4 members)
ICVA - if rehabilitated!
Inter Africa Group
La Red and/or Duryog Nivaran
Caritas International
Some of Caritas' southern members, eg. Albania or Bangladesh
France (as donor)
AICF
Luke L. Hingson, President, Brothers' Brother Foundation, USA
Tom Baker, American Red Cross, Washington, USA
Albert York, World Concern, Seattle, USA
Anja Clauss, Doctors of the World, New York, USA
Gordon Buhler, ADRA, USA
Kenneth Flemmer, ADRA, USA
Rudy von Bernuth, Save the Children Fund
E L Soper, Latter Day Saints Charities, USA
Nancy Horn, Opportunity International, Illinois, USA
Tex Lanier, President, World Concern, Seattle, USA
Michelle Tereno, International Medical Corps, Los Angeles, USA
Joel MacCollam, President, World Emergency Relief, Carlsbad, USA
Catherine Robins, World Vision Relief and Development, Washington, USA
Harold Northrup, International Rescue Committee, USA
George Freaks, Policies and Operations Evaluation Department, Netherlands Ministry of Foreign Affairs
Rita Parhad, Center on International Cooperation, New York University, USA
Richard Scott, Chief, Division of Programme Evaluation, IOM, USA
Giles Whitcomb, Consultant, Cambridge, Massachusetts, USA

Any other suggestions?

1. Keep up the good work! Thanks for a stimulating two days.
2. Circulation of papers as far in advance as possible would allow comments from a wider group within member organisations.
3. Listserve and discussion groups (e-mail based) might make the newsletter redundant. These can be arranged at very little cost in ODI time.
4. The Secretariat should commission/prepare a background paper on beneficiary perspectives.
5. Make/formalise the 'Constitution' of ALNAP. As important issues arise dissent may appear, and clear mechanisms and agreed procedures will be required on voting, admission, lapsed membership contributions, etc.

6. Presenters should be encouraged to use presentational software such as MS Powerpoint or Lotus Freelance.
7. Well done!