Rapid Humanitarian Assessments – How Rational?
A Value-of-Information Study of Two Assessments in Iraq
NAVIGATING POST-CONFLICT ENVIRONMENTS – Humanitarian Information Management

Summary

"Speed kills" vs. "Victims cannot wait"

During the most recent Iraq war, speed became a topic of significant focus, mixed with the aura of precision weaponry. Even for a subject as mundane as supply chain management, the Harvard Business Review used the dramatic title "Speed kills" (Morales & Geary 2003) in relation to activities in Iraq. The expectation of rapid achievement extended into the post-war period and into non-military aspects, until both were painfully slowed by ever-increasing insecurity. The humanitarian community, not exempt from this climate, strained to execute a running start in its delivery of relief to the Iraqi people. This included attempts at rapidly displaying an operational picture. The tools summoned to this task had been around for many years prior to the spring 2003 emergency. They valued speed for a different reason – the premise that victims cannot wait. One of these tools – rapid assessments – had been part of the humanitarian toolbox for decades. In Iraq, however, some of the rapid assessments created a fabric of information so dense that they offered an unusually close look at the trade-off between speed and other desired qualities. This study investigates that trade-off from the more general perspective of humanitarian information management, using data from actual assessments in Iraq.

For whom this report is meant: The majority of the professionals working in humanitarian information management in recent years have been database and Geographic Information Systems (GIS) specialists. Many were involved in the design or support of rapid assessments. These disciplines are not normally trained in survey process quality management or in value-of-information estimates, concerns that we address here. The second group whose reactions we invite are survey specialists and policy scientists who may have relevant competencies in the aforementioned domains but little exposure to post-war situations. These and other types of turbulent environments undermine the kind of stability that is taken for granted in much of normal social science research, including surveys. We hope that, by learning from both sides, humanitarian practitioners will be able to translate their inputs into better assessment tools.

Information demands in post-conflict emergency relief and rehabilitation settings are heavy and difficult to meet within useful timeframes and with acceptable reliability and precision. Over the past few years, the community of humanitarian practitioners has started to build standardized systems of information collection, analysis and use. These systems are meant to facilitate information transfer across sectors and phases of the relief and development process, and ultimately to enhance the baseline information available to peacekeepers and development agencies. They all struggle, in varying degrees, with the basic fact that war destroys information and that, as a result, the units on which they are expected to deliver substantive information are themselves not completely known. Some of the countries subject to the ravages of wars and subsequent interventions by the humanitarian community have not had a reliable population census in many years; moreover, the destruction of records, the "brain drain" of experts, and population displacement have made much of the pre-existing data inaccessible or obsolete.

Security, survey productivity and rationality

On 1 May 2003, President Bush declared the end of major combat operations in Iraq. On 19 August 2003, a bomb devastated the United Nations headquarters in Baghdad. Rapid assessments are highly vulnerable to insecurity. In relatively secure environments – shown in Figure 1 as the period prior to the attack on the United Nations – productivity fluctuates primarily in response to internal reorganization: new regional offices are opened, and new survey workers are recruited and trained. When security is poor, data collection may cease entirely (red line) or become intermittent (blue line).

Figure 1: Survey productivity over time

Some of the productivity spikes may reflect efforts to retain trained staff in secure regions that otherwise might not be a survey priority, while awaiting improved conditions in priority areas. Ultimately, the value of the assessments depends on how the incomplete information is used in practical decision-making. From this viewpoint, we ask whether the rapid assessments were rational – how much information they acquired, and in response to which factors, both internal ones and the external signals sent by their changing environment.
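"Value of information" is meant here in its decision-analytic sense: information is worth only as much as the improvement it brings to the decisions actually taken. For reference – and only to fix ideas, not as the exact metric used in the empirical chapters – the standard formulation for a decision-maker choosing an action a under an uncertain state θ with prior p(θ) and payoff u(a, θ) is

\[
\mathrm{EVPI} \;=\; \mathbb{E}_{\theta}\!\left[\max_{a} u(a,\theta)\right] \;-\; \max_{a}\,\mathbb{E}_{\theta}\!\left[u(a,\theta)\right],
\]

the expected value of perfect information, and, for an imperfect signal x such as the result of a rapid assessment,

\[
\mathrm{EVSI} \;=\; \mathbb{E}_{x}\!\left[\max_{a}\,\mathbb{E}_{\theta\mid x}\!\left[u(a,\theta)\right]\right] \;-\; \max_{a}\,\mathbb{E}_{\theta}\!\left[u(a,\theta)\right],
\]

the expected value of sample information. Delay, insecurity and measurement error all act on the signal x: the later and noisier the assessment, the less its results can shift the decision, and the smaller the EVSI becomes relative to the cost of obtaining it.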
"Rapid assessments" have gained currency as stopgap measures to fill urgent information needs in turbulent post-war situations. However, they are not exempt from that turbulence. They are expected to produce rapid results – on levels of malnutrition, locations of displaced populations, or the status of basic infrastructure, for example – with a minimum of pre-existing foundational information and with very short set-up time. The new addition to this process has been the requirement to collect foundational data as well. These include a listing of the populated places visited, complete with names, geographic coordinates, administrative status and current population estimates. Such data are basic to the reconstitution of a community gazetteer. The gazetteer – which in the lingo of sample surveys is nothing other than the community frame – is in turn critical for concurrent population and facility surveys, for project tracking and for the avoidance of duplicated effort.
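To make the notion of a community frame concrete, a gazetteer entry can be thought of as a small record carrying exactly these fields. The sketch below is illustrative only – the field names, types and the crude duplicate check are assumptions made for exposition, not the schema of the Iraq assessments:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazetteerEntry:
    """One populated place in the community frame (illustrative fields only)."""
    name: str                                   # settlement name as recorded in the field
    latitude: float                             # geographic coordinates, decimal degrees
    longitude: float
    admin_unit: str                             # administrative status, e.g. district / sub-district
    population_estimate: Optional[int] = None   # current estimate, if one could be obtained
    visit_date: Optional[str] = None            # ISO date of the assessment visit

def near_duplicate(a: GazetteerEntry, b: GazetteerEntry, tol_deg: float = 0.01) -> bool:
    """Crude test for the same place reported twice by different teams."""
    return (abs(a.latitude - b.latitude) < tol_deg and
            abs(a.longitude - b.longitude) < tol_deg)

# The frame is simply the list of such entries; population and facility surveys
# can then be drawn against it, and project coverage tracked without duplication.
frame = [
    GazetteerEntry("Example Town", 33.31, 44.37, "Example District", 12000, "2003-05-10"),
]
```

Even this minimal structure makes the dependency explicit: concurrent surveys and project tracking can only be as good as the completeness and geocoding quality of the frame.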
Standardization and quality

Some of the initiatives to create standardized information systems were closely associated with the United Nations-executed Humanitarian Information Centers (HICs), of which there have been several, including in countries undergoing recent military interventions by Western powers. These centers have been involved in rapid assessments, with responsibilities ranging from supporting independent assessments, to coordinating multi-party data collection efforts, to implementing them directly. For example, the HIC in Kosovo assembled a well-publicized rapid assessment from contributions by a variety of relief agencies, mapping the housing stock in war-affected villages and the need for urgent winterization measures before the 1999-2000 cold season.

Little is known about the quality of rapid assessments. There is an obvious conflict between speed and completeness, frequently compounded by lack of security and/or access. Some domains, notably nutrition, rely on cross-culturally validated protocols, but many rapid assessments use instruments that have not been adequately pre-tested and are administered by minimally trained data collectors. It is safe to assume that, from a survey quality perspective, sampling error, however serious, will often be outweighed by measurement error. It is equally safe to say that, for a good part, if not most, of these assessments, the results no longer deserve the predicate "rapid" by the time they reach the intended users. In some cases, the rapidity is more descriptive of the users, who have left the scene or have moved on to a different policy landscape by the time the data are available for analysis, or the findings are ready to inform decisions.

Such considerations have prompted us to study the rationality of rapid assessments in greater depth. We attempt this on two levels. First, we guide the reader through some of the more abstract points that have surfaced in the literature. The question "Why rapid?" may be easy to answer – in emergency response, delay is universally seen as policy failure. "Rapid" also has an inter-organizational dimension: early collaboration in surveys and assessments signals that coordination processes are being established among disparate responders, and it reflects well on the humanitarian community. Beyond that, several trade-offs remind us that the most rapid is not necessarily the best. Speed may compromise completeness, as noted above, but also other information standards such as reliability and validity. Its relationship with cost is far from straightforward; few are so naive as to believe that "rapid" means "shorter", and therefore "cheaper". Other trade-offs are categorical. Rapid-assessment planners need to decide on the basic units of information collection and analysis: will they conduct population (e.g. household), community or facility surveys, or some combination thereof? Speed has often suggested community-level assessments. These make the important presumption that differences between communities matter more than differences within them, or that policies may neglect within-community differences for the time being.

Iraq as a test case

This report contains a strong empirical part. We test rationality claims – such as the use of pre-existing information, the respect for policy, and dynamic adaptation during fieldwork – against recent data from Iraq. This country makes for an attractive case study because of the premeditated nature of the war and the parallel preparations undertaken by the humanitarian community, including for rapid assessments. After a long build-up, the campaign unfolded as a kind of blitzkrieg, with the United Nations and non-governmental organizations (NGOs) anticipating that their assessment missions would execute a running start while the smoke of battle was still clearing.