CRRC Project Task III Workshop

July 24-25, 2013

Petersen room, Allen Library

University of Washington, Seattle

DRAFT DISCUSSION PAPERS

Please do not cite or circulate without permission

Table of Contents

Stakeholder and public mental models of and concerns about dispersant and oil spill processes………………………………………………………Page 2

What-If Scenario Modeling to Support Oil Spill Preparedness and Response Decision Making………………………………………………………………..Page 28

“It’s Raining Oil”: Information Flow through Twitter after the 2010 Deepwater Horizon Oil Spill ……………………………………………….Page 47

Methods for communicating the complexity and uncertainty of response actions………………………………………………………………..………….Page 77

Best Practices for Community and Stakeholder Engagement in Oil Spill Preparedness and Response……………………………………………...Page 97

Stakeholder and public mental models of and concerns (all concerns, including environmental and public health) about dispersant and oil spill processes (leads Ann Bostrom and Ann Hayward Walker).

DISCUSSION DRAFT July 15, 2013

Section 1. Preface

Experience with stakeholders and the public on oil spills and dispersant issues from 1980 through the Deepwater Horizon has shown that communicating about dispersants has long been and remains a problem across the country (Walker 2012, 2011a, 2011b, 2011c, 2010, 2001a, 2001b, 1999, 1997; Bostrom et al 1995 and 1996; Pavia 1984, 1985). Further, high quality information to support decision making is one of the perceived goals of oil spill response (Tuler et al 2008). The research presented here represents one piece of a collaborative social science and natural science research project designed to address public, media and political concerns and develop preparedness recommendations and response tools to facilitate well-balanced decisions under the uncertain conditions of risk that spills represent. Natural science communications can benefit from data-driven social science research and collaborations between the two can lead to new breakthroughs (Schaal, 2012).

To develop strategies for engaging communities and individuals in discussions about spill issues, the overarching project builds on a mental models approach for risk communications and entails a relatively new approach to survey research, analysis of social media data, and integration of relevant social and natural science research findings. The project has three subsidiary objectives: (1) identify key information needs and areas of confusion and misunderstanding, (2) explore the role of social media in effective risk communication, and (3) identify better methods to communicate scientific uncertainty and complexity with respect to response alternatives. Results from survey research and analysis of Deepwater Horizon Twitter data inform the team’s approach for characterizing model constituencies and their communication needs as they relate to dispersants and oil spills. The results are intended to be immediately applicable to promote effective response communications about dispersants and oil spills. Project end users include Unified Command (Federal and State On-scene Coordinators and spillers known as Responsible Parties), dispersant decision makers from coastal Regional Response Teams (RRTs), and academia. Many of these key stakeholders are looked to by elected officials/politicians and the public for assurance about oil spill response options.

Section 2. Introduction to Mental Models research

Successful risk management and risk communication depend on knowledge of the fears, needs, and values of intended audiences ahead of crafting and delivering risk communication messages (Levine and Picou 2012). A mental models approach is well suited to eliciting local community expertise on how the local marine environment works to produce valued goods and services, how pollutants affect that production, how best to deal with pollution, perceptions regarding environmental tradeoffs (e.g., opportunities and limitations, or risks and benefits), and associated preferences and tradeoffs regarding dispersant use in the event of a spill.

Funding for this project was provided by the University of New Hampshire’s Coastal Response Research Center (NOAA Grant Number: NA07NOS4630143; Contract: 13-003).

Key stakeholders and members of the public in the Gulf of Mexico and other states hold a variety of risk perceptions and mental models of dispersant and oil spill processes (e.g., Gill et al 2012; Webler and Lord, 2010, Walker 2011c). As a result of the Deepwater Horizon (DWH) incident, many may hold correct and sophisticated beliefs regarding dispersant and oil spill processes and recognize the relevant scientific uncertainties, but findings from a workshop earlier this year (Walker, 2012) show that some of their mental models omit key elements, and may focus unduly on elements that contribute relatively little to potential risk. A mental model is someone's understanding of how something works in the real world. Such misperceptions of natural processes associated with the lifecycle of an oil spill and dispersants can influence risk perceptions and public health (CRRC 2012).

To assess mental models, information needs, and risk perceptions of lay stakeholders, this research builds on a mental models risk communication approach. This approach reflects both the natural and engineering sciences of how risks are created and controlled, and the social and behavioral sciences of how people comprehend and respond to such risks (Morgan et al 2002). The approach entails developing a decision-focused expert model of dispersant and oil spill processes, in order to identify correct beliefs as well as misperceptions that influence oil spill response decisions. Comparing layperson mental models with an expert decision model can provide insights about information gaps and misunderstanding, which in turn help identify knowledge areas to address, thus supporting more effective communications.

By recognizing people’s concerns and prior beliefs, a mental models approach can improve ways of communicating complex scientific information, such as that about oil spills and dispersant use, and empower informed decision making (Fischhoff et al., 2011). Mental models approaches belong to a larger category of qualitative research approaches to better understand stakeholder beliefs and perceptions concerning risk (Wood et al., 2012). The aim of such research is to discover how people think about an issue, in order to assess how new information will be interpreted so it can be designed to be most useful.

Mental models are important as they are people’s “inference engines” and show how people connect contexts or ideas (e.g., Gentner and Stevens, 1983). Mental models of hazardous processes include ideas people have about identifying a risk, exposure to the risk, the effects of exposure, how to mitigate the risk, and how the risk unfolds in time.

Communicating risk is much like creating a forecast; there are many factors to consider. Humans perceive risk in many ways: emotionally, socially, and cognitively. The most basic way of perceiving risk is the use of one’s emotions or feelings to evaluate risks and benefits (Slovic et al., 2002). In addition to differences in feelings about risk, social and cultural differences influence perceptions of risk; these differences are associated with variations in beliefs about the role of government (Kahan et al., 2007).


These influences, however, are only part of the picture, and understanding them is not a sufficient basis for determining what information might be useful to or desired by decision makers and should be made accessible for them. It’s important to recognize that how people process information cognitively also affects their individual risk perceptions (e.g. Peters & Levin, 2008; Reyna et al., 2009); their prior beliefs—that is, their mental models—influence their risk perceptions and decision making (e.g., Bostrom et al., 2012), as well as how they interpret new information.

The mental models approach in this research has five steps, as described in detail by Bostrom et al. (1992) and Morgan et al. (1992; 2002). This approach has been applied successfully in a wide variety of domains [e.g., dispersant communications (Bostrom et al., 1995), flash floods (Wagner, 2007), injury prevention (Austin and Fischhoff, 2011), and wildland fire (Zaksek and Arvai, 2004), among others]. The project leverages extensive work on most of the five steps, completed previously (e.g., Bostrom, Fischbeck, Kucklick, Walker 1995). Prior research had revealed a lack of shared understanding among decision makers about oil properties and fate and transport, even before the addition of dispersants (e.g., Bostrom et al 1996; Scholz et al 1999).

Section 3. Research methods

3.1 Expert decision model

Members of the research team developed an expert decision model for dispersant use in oil spill response through expert elicitation in the late 1990s and in 2012 (see Appendix A, include expert model and list of experts who participated). At both time points the experts involved in the elicitation were among the nation’s foremost experts in the science of oil spill response. The model was developed to support communications designed for stakeholders in oil spill response, and in particular for those facing response decisions. Because it is designed for response decisions and not necessarily to address those decisions faced by coastal residents, only some of the model is relevant for public survey samples. Nevertheless, it reflects the overall structure of the hazardous process, from exposure (sources, pathways, and influences on these), through effects (ecological and economic, as well as human health in the later model), to mitigation of risk. The key pieces in the model are: initial oil (dispersibility), time, physical and environmental conditions, fate and transport processes, logistics, response options (best practices), and impacts of both the spill and the response.

3.2 Survey item selection

The initial questions for the project derive from mental models research with oil spill responders and stakeholders in the late 1990s (e.g., Bostrom et al; Pond et al). These were revised during survey toolkit development in 2012, through three workshops (Bostrom et al, in preparation). The initial intent was to refine these questions in a small sample of cognitive interviews (i.e., “read-alouds”; see Ericsson and Simon 1993; Sudman et al 1996) with spill-interested non-experts on the Gulf coast or in Alaska.


A principal components analysis of 2012 workshop responses to earlier versions of the candidate survey questions was used to select initial sets of items whose content and structure (including within-set correlations) would be of interest. The decision model developed in the August 2012 SEA workshop was used to guide the selection of additional questions. Interview recruitment emails were sent to a sample of Alaska CRRC workshop participants and Mississippi Sea Grant contacts (n=100 total), with the anticipation of conducting at least a dozen cognitive interviews by telephone. As described below, fewer than this responded. To compensate for the small number of cognitive interviews, a set of debriefing questions probing the interpretation of each item (question) was added to the survey data collection (the results of which were not available in time to be incorporated into this draft).
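For illustration only, the following Python sketch shows the kind of principal components screening described above: extracting components from a matrix of item responses and inspecting which items load together. The input file, column layout, and number of components are hypothetical placeholders, not the project's actual data or analysis code.

# Illustrative sketch of principal components screening of candidate items.
# The file name and its numeric column coding are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = workshop respondents, columns = candidate item responses (numeric codes)
responses = pd.read_csv("workshop_2012_item_responses.csv").dropna()

# Standardize items so components reflect correlation structure, not scale
X = StandardScaler().fit_transform(responses.values)

pca = PCA(n_components=5)
pca.fit(X)

# Items loading strongly on the same component are candidates for an item set
loadings = pd.DataFrame(
    pca.components_.T,
    index=responses.columns,
    columns=[f"PC{i+1}" for i in range(pca.n_components_)],
)
print(pca.explained_variance_ratio_)
print(loadings.round(2))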

The team worked with Google Insights, with the aim of applying a novel multiple matrix survey design (e.g., Thomas et al 2006; Gonzalez and Eltinge 2007) in order to elicit perceptions, beliefs, and preferences that are representative of coastal residents nationally, using the above-mentioned survey items. Google’s new Consumer Insights platform is now alpha testing the fielding of more than two questions per person, but currently can still only guarantee timely delivery of at most two questions per respondent (pairs). Using NOAA’s designation of coastal counties for the Statistical Abstract, the FIPS codes for all U.S. coastal counties except those in the Great Lakes region were extracted and converted to zip codes. Google can target the population down to the zip code level, but would not stratify the sample to the state level, due to current market limitations. Google uses algorithms to analyze IP addresses and other user features to infer demographics, and has validated this approach, demonstrating that response rates and sampling errors are comparable to or better than those obtained with Internet panels or telephone surveys (McDonald et al 2012). However, not all responses can be weighted, and charges are incurred regardless of the availability of weights. In the analyses that follow, some results are weighted and some are not, pending further discussions with Google regarding weighting. In all cases the weights do not change the gist of the relative distributions, except that they reduce the proportion of don’t know responses by up to ten percent in some instances. The design pairs questions with the intent of allowing response sets for multiple questions to be inferred by coastal region; these pairings for the most part remain to be analyzed.
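As an illustration of the coastal-county targeting step, the sketch below builds a zip-code list from a county FIPS list and a county-to-zip crosswalk. The file names and column names are hypothetical stand-ins; the actual NOAA coastal county designations and the crosswalk the team used may differ.

# Illustrative sketch: map NOAA-designated coastal county FIPS codes
# (excluding the Great Lakes) to zip codes for survey targeting.
import pandas as pd

# NOAA-designated coastal counties with 5-digit county FIPS codes and region labels
coastal = pd.read_csv("noaa_coastal_counties.csv", dtype={"county_fips": str})

# Drop Great Lakes coastal counties, per the sampling frame described above
coastal = coastal[coastal["region"] != "Great Lakes"]

# County-to-zip crosswalk (e.g., a HUD-style crosswalk), also keyed by string FIPS codes
crosswalk = pd.read_csv("county_zip_crosswalk.csv", dtype={"county_fips": str, "zip": str})

# Join to obtain the zip codes that define the targeting frame
target_zips = (
    crosswalk.merge(coastal[["county_fips"]], on="county_fips", how="inner")["zip"]
    .drop_duplicates()
    .sort_values()
)
target_zips.to_csv("coastal_target_zips.csv", index=False)
print(f"{len(target_zips)} zip codes in the targeting frame")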

Given the two-question constraints that still prevail, along with strict character limits on prompts, some questions were used to introduce a context for other questions, including a question regarding ocean ecosystem resilience (add ref), two open-ended questions—one free association with chemical dispersant use, and one about information wants and needs—and a question regarding anticipated economic impacts of a spill.
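A minimal sketch of how such two-question modules might be assembled under that constraint is shown below. The question IDs follow Appendix II, but the specific pairing rule (cycling a few context questions across the substantive items) is an assumption made for illustration, not the project's exact design.

# Illustrative sketch: pair each substantive item with a leading context question
# so that every respondent sees at most two questions.
import itertools

context_items = ["Q2", "Q1", "Q3"]                    # e.g., resilience screen, open-ended items
substantive_items = [f"Q{i}" for i in range(4, 30)]   # Q4..Q29

pairs = [
    (ctx, item)
    for ctx, item in zip(itertools.cycle(context_items), substantive_items)
]

for ctx, item in pairs[:5]:
    print(f"Module: {ctx} -> {item}")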

Section 4. Results

4.1 Cognitive interviews

The response to our interviewee recruitment emails for the cognitive interviews to the Alaska workshop and Mississippi Sea Grant lists was extremely low. Further, even though we screened out a significant proportion of those on the lists whose emails or other contact information showed them to be NOAA employees (noaa email addresses) or academic researchers (.edu email addresses), those who did respond to our request for participants tended to be heavily invested stakeholders with significant experience. We conducted nine cognitive interviews via Skype or telephone, five of which were with a non-expert convenience sample from Seattle. The interviews ranged from about 17 minutes to almost an hour. The comments and results supported switching back from a Likert-type response scale to a True-False response scale for the knowledge questions, eliminating the question about when it was appropriate to consider source control (all respondents said always), and a few minor wording changes. Non-expert respondents struggled with words like biodegrade and photo-oxidation, which we addressed by adding context or definitions. While we did receive a few additional responses from Alaska regarding potential interviews, those respondents were unable to schedule or complete the interviews within a week of the initial recruitment email. In order to complete data collection before the workshop (scheduled for the end of July), we abbreviated the cognitive interview collection to conduct the survey.

The first question we asked participants was open-ended:

"Briefly, what information do you think should be included in a summary reference booklet on oil spill response options—including mechanical on water and shoreline strategies, controlled burning, chemical dispersants, and source control—for it to be most useful to concerned members of the public?"

Responses specified wanting to know the what and how of dispersant use, their pros and cons, and contextual information and history and experience of use, as two responses illustrate:

• “Response booklet should include explanation of how dispersants work, summary of rules for their use (shallow v deep water, application rate, specific variety used and why), timeline of use history - so we had a set of rules in place at DHOS that guided use of dispersants... how did the set of rules change as a result of DHOS use of dispersants and their impacts on the gulf ecosystem?”
• “A description of the control methods, prior experience with each in other oil spills from various sources, the advantages and disadvantages of each.”

The first respondent above describes her experience with oil spills as follows: “I am a geologist, but I know minimal detailed info about oil geology. I began learning about oil spill response on April 20, 2010 and implemented a [grant-type omitted to protect confidentiality] grant in MS related to DHOS in fall 2010 then in spring 2013 led a group of citizen scientists through a literature review DHOS to learn about the process of science and its role in emergency response.”

The second reports his experience with oil spills as follows: “Experience with BP oil spill. I have an oyster farm at [location omitted to preserve confidentiality] AL, on the MS sound. There is a chain of offshore islands 12 miles offshore. The water between each island was boomed. The entrance to the bay where I am located was boomed. We had no evidence of any oil at any time after the spill as determined by periodic sampling of sediment, water and oyster meat. I would say the spill response methods were effective in preventing oil from reaching my site.”

It swiftly became apparent that, given their investment and experience, these respondents have the expertise to interpret questions about oil spill response somewhat differently than most coastal residents do.

Analyses of the cognitive interviews were conducted iteratively, and the survey items were revised throughout the process, as initially planned. One salient result of this process was that the True-False response scale was easier for interviewees to interpret and use for the candidate survey questions than was the Strongly Agree-Strongly Disagree response scale, for which reason we switched to the True-False scale, despite questions raised about it by a survey expert at the August meeting. A point of common understanding in survey methodology is that an explicit “don’t know” option will increase the number of don’t know responses (add ref). Given that uncertainty and lack of knowledge are of interest to us as well, however, we deemed it appropriate to include an explicit don’t know response category.

4.2 Survey results

To date we have received over 500 responses per question (July 2-6th, 2013), but anticipate having 1500 responses per question by July 10th (subject to human subjects review of the last screening question).

Initial results suggest that people see ocean ecosystems as somewhat resilient but potentially vulnerable to the cumulative effects of major oil spills (see Figure 1). A plurality (33% overall) selected the threshold view of how ocean ecosystems work (“Oceans are stable within limits. With a few oil spills, the oceans will return to a stable balance. Major oil spills will lead to dangerous effects”). Next most prevalent (27.3%) was the view that ocean ecosystems are fragile (“Oceans are delicately balanced. A few major oil spills will have catastrophic effects.”), which women were significantly more likely to choose than men. Least frequently selected was the view that ocean ecosystems are very stable, and that major oil spills will have little to no effect.

As mentioned above, two open-ended questions were also included in our surveying. For the open-ended questions, respondents tend to write only a word or two, response times are on the order of 15-22 seconds (see Figures 2-5 below), shorter if the question appears after another question, longer if first. Not all responses are in (n=3871 in Figure 2, n=433 in Figure 4 below), and the responses are not weighted by demographic or other factors that would make these representative pictures for coastal areas.

The first open-ended question was: “What key information do you think should be in a booklet on oil spill response options, for it to be useful to you?” Similar to the results from the cognitive interviews, responses to this open-ended question do include mention of pros (clean up) and cons (costs) (see Figure 2). However, the dominant response is “don’t know,” and the general picture that emerges is a focus on actions—what to do, how to prevent harm, how to clean up—with a secondary emphasis on damage and costs.
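For reference, the following Python sketch shows the kind of word-frequency tally that a word cloud such as Figure 2 visualizes. The response file, its column name, and the stop-word list are hypothetical placeholders rather than the project's actual processing pipeline.

# Illustrative sketch: tally word frequencies in open-ended responses.
import re
from collections import Counter

import pandas as pd

responses = pd.read_csv("q1_open_ended_responses.csv")["response"].dropna()

stop_words = {"the", "a", "an", "of", "to", "and", "in", "on", "for", "it", "is"}

counts = Counter()
for text in responses:
    for word in re.findall(r"[a-z']+", text.lower()):
        if word not in stop_words:
            counts[word] += 1

# Most frequent terms; in the actual data "don't know" dominates
for word, n in counts.most_common(20):
    print(f"{word:15s} {n}")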

Figure 1. Ocean ecosystem resilience (N=10354, sampled from NOAA-designated coastal counties, but excluding the Great Lakes coastal counties). Answers were displayed in random order. Results are weighted by age in this figure. The median response time for this question was 34 seconds, which is relatively slow.


Figure 2, July 2-6th, 2013: Responses to open-ended question following ocean ecosystem resilience screen. Word cloud does not include those few responses that were either numerical or obviously nonresponsive.1

In Figures 3 and 4, responses to “What comes to mind first when you think of using chemical dispersants to respond to marine oil spills?” paint a general picture of a response technology that people dislike and equate with pollution, characterizing dispersants as being as polluting as, or worse than, the spilled oil. When grouped by sentiment, neutrals (don’t know) dominate, but negatives far outweigh positives.
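The Neutral/Negative/Positive grouping shown in the figures was produced by the survey platform's own analytics. Purely as an illustration of the idea, the sketch below uses crude, hypothetical keyword lists to assign a sentiment label to each response; it is not the classifier actually used.

# Rough, illustrative sentiment grouping of open-ended responses.
# Keyword lists are hypothetical and far cruder than the platform's analytics.
from collections import Counter

NEGATIVE = {"toxic", "poison", "pollution", "harmful", "dangerous", "bad", "worse"}
POSITIVE = {"effective", "helpful", "good", "cleanup", "clean", "necessary"}

def label(response: str) -> str:
    words = set(response.lower().split())
    if words & NEGATIVE:
        return "Negative"
    if words & POSITIVE:
        return "Positive"
    return "Neutral"   # includes "don't know" and off-topic answers

responses = ["don't know", "toxic chemicals", "helps clean up the oil", "pollution"]
print(Counter(label(r) for r in responses))
# Counter({'Negative': 2, 'Neutral': 1, 'Positive': 1})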

1 Which included the following: ug, rockets, sex, g you, fragudal, republicans are idiots, something about being gay, what snacks are served.

Figure 3. July 2-6, 2013: Responses to open-ended question following ocean ecosystem resilience screen. Word cloud does not include those few responses that were either numerical or obviously nonresponsive.2

Grouped by sentiment (Neutral, Negative, Positive), by Google analytics:

2 These included the following: baba ganoush, fak, sex, wer.

Figure 4. July 3-6, 2013: Responses to open-ended question (initial question, used as an introductory question—i.e., a screener that does not screen out any respondents—for a subsequent question)

Figure 5: Response metrics from Google analytics for data shown in Figure 4. Note that the south is underrepresented (Texas in particular), and there are few responses from Alaska.

Most respondents do not feel they know whether there is scientific agreement on the effectiveness of chemical dispersants, but a majority of those responding to date have a tendency to think of dispersants as persistent (detectable in fish after a year), and toxic (toxicities due to dispersant rather than oil).

4.3 Initial conditions, fate and transport, logistics

As shown in Figures 6 (below) and 7 (next page), the modal response is don’t know both for whether weathering decreases oil dispersibility and for whether evaporated oil is broken down by sunlight into environmentally safe compounds. However, of the remaining responses, False (and maybe false) are more prevalent than True (and maybe true), whereas both statements are actually true (check and add ref).

Figure 6: How does oil behave? (Q15, Q21)

Figure 8. Add dispersants and what happens? (Q11, Q26, Q27, Q28 )


Figure 7. Responses to Q21: Evaporated oil photo-oxidizes (is broken down by sunlight) into environmentally safe compounds, by ocean ecosystems behavior model (A=Fragile, B=Gradual, C=Stable, D=Threshold, E=Random). Figure shows the results for respondents with demographics. C was not weighted; A, D were weighted by age; B, E were weighted by gender. (400 responses). The unweighted responses differ as one might expect for adherents to different models of ocean ecosystem resilience. For example, those who think ocean ecosystems are fragile are more likely to deem it false that evaporated oil is broken down by sunlight into environmentally safe compounds (χ² = 45.58, p < 0.001).
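For readers unfamiliar with the test reported in the caption, the following sketch runs a chi-square test of independence on a cross-tabulation of responses by ecosystem model using scipy. The counts in the table are invented for illustration; they are not the survey data behind Figure 7.

# Illustrative chi-square test of independence (scipy), analogous to the one
# reported above. The contingency table below is made up.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: ecosystem models (Fragile, Gradual, Stable, Threshold, Random)
# Columns: Q21 grouped as True/Maybe true, Don't know, False/Maybe false
table = np.array([
    [10, 40, 60],   # Fragile
    [20, 45, 35],   # Gradual
    [25, 30, 20],   # Stable
    [22, 50, 38],   # Threshold
    [15, 35, 25],   # Random
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")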


4.4 Monitoring and response options and decisions

At the heart of the expert decision model are elements of the response decision itself, including baseline information, anticipated effectiveness of response options, preferences for different response options, and ways of monitoring the effectiveness of responses once they are implemented.

Baseline information (Q7, Q8, Q9, Q10): The majority of respondents find it either very important or essential to know the four kinds of information suggested (see Figure 9 below), one of which (ingredients) was suggested by the Twitter analysis underway rather than by the specialists who helped develop the decision model.

Figure 9. Perceived information needs for oil spill response

Anticipated response effectiveness (Q4, Q19, Q14): The similarity of judgments for the removal of oil by controlled burning and the hit rate for dispersants applied by aircraft suggests that people are using a common sense model of effectiveness for difficult tasks, with the majority of responses in the lowest two categories (a small percentage, less than a third; or none or a tiny percentage). Almost half responded that they didn’t know whether recovery of over 25% of oil spilled in the open ocean with mechanical response options such as booms and skimmers is common. Nevertheless, more judged this true (34.2%, including true/maybe true) than false (18%, including false/maybe false). Results are similar for weighted and unweighted data. This suggests optimism about the effectiveness of mechanical strategies relative to other response strategies.


Figure 10: Controlled burning removal percentage and chemical dispersant hit rate.

Decision preferences (Q5, Q6): Dispersant use and controlled burning are more likely to be judged as never appropriate than as always appropriate, although a plurality of respondents select the middle option, sometimes appropriate. Responses are similar across coastal regions.

Figure 11: Judged appropriateness of dispersant use and controlled burning, when technically feasible.

4.5 Monitoring (Q17, Q18, Q23)

Monitoring is essential for both response effectiveness and health outcomes. Despite scientific evidence that sensory testing is an effective approach to assessing oil contamination of seafood, coastal residents are more likely to think it invalid than valid, although about a third say they don’t know. Although 39% selected don’t know, the majority of the remainder think air quality monitoring enables response organizations to protect human health from oil spill response-related air pollutants. Almost half (45%) were uncertain of the informativeness of aerial visual inspection with regard to dispersant effectiveness, with the remainder nearly equally split between True/Maybe true and False/Maybe false.

Figure 12: Is monitoring valid and effective?

4.6 Impacts (Toxicity: Q20, Q22, Q24, Q29)

Judgments of the persistence and toxicity of both oil and dispersant indicate that a majority (50.1%) think not only that dispersed oil at low concentrations (54.5%) but also that dispersant ten hours after application (50.3%) is toxic, and judgments are split regarding whether, after application of dispersants, ecological effects are due primarily to the dispersed oil or the dispersant (28.3% True/Maybe true; 27.8% False/Maybe false; 44% Don’t know).

Figure 13: Toxicity related perceptions

4.7 Perceptions of the science of oil spill response

Although the modal response to questions about the science of oil spill response was “don’t know” (from around a third to almost half—47%—of respondents said they didn’t know), more respondents were confident that lab studies can predict the impact of dispersed oil on marine life than not (42.6% True/Maybe true versus 14.7% False/Maybe false; Figure 14). This holds to a slightly lesser degree for controlled burning as well.

In contrast, fewer respondents think scientists agree on the effectiveness or toxicity of dispersants than not (True/Maybe true 21.2% for effectiveness, 25.2% for toxicity; False/Maybe false 31.8% for effectiveness, 31.4% for toxicity). Asking this question after the ocean ecosystem resilience question results in a lower proportion of don’t knows, but paints the same overall picture of relative lack of faith in dispersant science (see Figures 14-15 below, unweighted and weighted results respectively).

Figure 14: Perceptions of the science of oil spill response


Figure 15. Science of oil spill response, unweighted and weighted results for Q12.

Section 5. Discussion of results

Preliminary results suggest that coastal respondents have limited interest in and knowledge about oil spill response, but despite this are negatively disposed toward dispersant use on oil spills. [expand]

Section 6. Application implications for current practice

Common sense models of the shortcomings of technological responses may be driving some of these judgments, but the pattern of missing knowledge and incorrect understanding of fate and transport processes suggests an opening for developing a deeper appreciation of the tradeoffs made in oil spill response decisions. [expand]

REFERENCES

Austin L, Fischhoff B. (2012) Injury prevention risk communication: A mental models approach. Injury Prevention, 18(2):124-9. Epub 2011 Nov 16. doi:10.1136/injuryprev-2011-040079.


Bostrom A, Fischbeck P, Kucklick JH, Hayward Walker A. A Mental Models Approach to the Preparation of Summary Reports on Ecological Issues Related to Dispersant Use. Washington, DC: Marine Spill Response Corporation, 1995. Contract No.: 95–019.

Bostrom, A., O'Connor, R. E., Böhm, G., Hanss, D., Bodi, O., Ekström, F., Halder, P., Jeschke, S., Mack, B., Qu, M., Rosentrater, L., Sandve, A., & Sælensminde, I. (2012). Causal thinking and support for climate change policies: International survey findings. Global Environmental Change: Human and Policy Dimensions, 22, 210-222.

Ericsson, K. A., & Simon, H. A. (1993). Protocol Analysis: Verbal Reports as Data. MIT Press, Cambridge, MA.

Fischhoff, B., Brewer, N., & Downs, J.S. (eds.). (2011). Communicating risks and benefits: An evidence-based user’s guide. Washington, DC: Food and Drug Administration. http://www.fda.gov/AboutFDA/ReportsManualsForms/Reports/ucm268078.htm

Gentner D, Stevens AL (eds.). 1983. Mental Models. Hillsdale, NJ: Lawrence Erlbaum Associates.

Kahan D, Slovic P, Braman D, Gastil J, Cohen G. The Second National Risk and Culture Study: Making Sense of - and Making Progress In – The American Culture War of Fact. Yale Law School, Public Law Working Paper No. 154. Available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1017189 , accessed August 2012.

Morgan, M.G., Fischhoff, B., Bostrom, A., Lave, L. and Atman, C.J. "Communicating Risk to the Public." Environmental Science & Technology, vol. 26, no. 11, pp 2048-2056, 1992.

Morgan, M.G., Fischhoff, B., Bostrom, A. and Atman, C.J. Risk Communication: A Mental Models Approach. Cambridge University Press, 2002.

Peters E., Levin, I.P. (2008). Dissecting the risky-choice framing effect: Numeracy as an individual-difference factor in weighting risky and riskless options. Judgment and Decision Making, 3(6), 435-448.

Reyna VF, Nelson WL, Han PK, and Dieckman NF. (2009). How Numeracy Influences Risk Comprehension and Medical Decision Making. Psychological Bulletin.

Scholz, Debra K., Ann Hayward Walker, Janet H. Kucklick, and Robert G. Pond (1999). Aligning Expectation and Reality: A Summary of Dispersant Risk Communication Issues. International Oil Spill Conference Proceedings: March 1999, Vol. 1999, No. 1, pp. 585-590.

Slovic P., Finucane, M.L., Peters, E., MacGregor, D.G. (2002). Rational actors or rational fools: Implications of the affect heuristic for behavioral economics. Journal of SocioEconomics, 31(4), 329-342.

Starbird, K and Palen L (2011). "Voluntweeters”: Self-Organizing by Digital Volunteers in Times of Crisis. CHI 2011 Session: Behavior May 7–12, 2011, Vancouver, BC, Canada.

Sudman, S., Bradburn, N. M., & Schwarz, N. (eds.) (1996). Thinking about Answers: The Application of Cognitive Processes to Survey Methodology. Jossey-Bass, San Francisco, CA.


Tuler, SP, Webler T, Kay R. 2008. Comparing stakeholders’ objectives for oil spill response: A Q study of four regions. Greenfield, MA, USA: Social and Environmental Research Institute.

Wagner K. 2007. Mental models of flash floods and landslides. Risk Analysis: 27: 671–82.

Wood M, Bostrom A, Bridges T, Linkov I. Cognitive Mapping Tools: Review and Risk Management Needs. Risk Analysis Volume 32, Issue 8, pages 1333–1348, August 2012.

Zaksek M, Arvai, JL. 2004. Toward improved communication about wildland fire: Mental models research to identify information needs for natural resource management. Risk Analysis 24: 1503–1514.


Appendix I. Expert Model

[will insert expert model here]


Time (oil age, time to shoreline/impact): 1 hour, 10 hours, 1 day, 1 week, 1 month, 1 year, 10 years

Initial Oil (dispersibility)
Type (API gravity): oil name, pour point
Amount (volume, extent of spill): <10k, 10k to 100k, >100k, SONS
Source (release characteristics): continuous, intermittent, or instantaneous

Physical and Environmental Conditions
Weather: wind speed, direction, air temperature, cloud cover
Sea conditions: temperature, salinity; sea state and wave height, mixed layer, currents (longshore), tidal height, tides (ebbing and flooding)
Shoreline and water depth: distance to shore, shoreline types, collection areas, hydrography, water depth
Time of day
Season
Boundary conditions
Waterbody type (size of waterbody)
Mobility, flushing, retention
Location

Fate and Transport Processes
Spreading, evaporation, dissolution, dispersion, emulsification, sedimentation, biodegradation, photo-oxidation, stranding, aerosolization (semi-volatiles)

Logistics
Mechanical oil recovery system availability: equipment, quantity, capacity (incl. back-up storage), response time
Dispersant availability: type, quantity, location
Dispersant delivery system availability: air/sea options, dispersant rate capability, on-station time
Bioremediation delivery equipment availability
Controlled oil burning equipment availability
Monitoring equipment availability

Response options (best management practices)
No action / delay action
Encounter rate, as it relates to performance of different response options: subsurface, on water, on shore

Authorization (use of Subpart J and new technologies; regulatory constraints = section 106, section 7 endangered species and marine mammal protection – habitat constraints)

On water response
Mechanical: boom (protective, containment, enhance encounter rates), skimmers (stationary, advancing), storage, sorbents
Controlled burning: containment, ignition, spotters to direct operations, residue management, fire controls
Chemical: dispersants (toxicity -- dispersant, oil, and dispersed oil) [Note: other chemical use is minor: elasticity modifiers, herding agents, vapor suppression]
Source control

Subsurface response
Mechanical recovery (low API gravity and tar sands): vacuuming
Source control: top hat

Chemical dispersants
Physical dispersion

Waste management and decontamination of equipment

Monitoring
Monitoring baselines
Human health conditions, issues & outcomes: pre-existing health status of communities; post-interventions; health outcomes monitored for consumption of contaminated seafood

Human health situation: commercial - regulated by federal government; recreational and subsistence - regulated by state
Consumption practices: selection of locations for testing; sensory (organoleptic) and chemical testing
Monitoring human health exposures and effects
Public health and welfare (population)
Monitoring air exposures (individual and community)
Monitoring water exposures (individual and community)
Medical and mental health surveillance: tracking the trending of potential exposures (SAMHSA and Red Cross); need instrumentation to assess the right analytes
Food exposures (consumption)
Fisheries (incl. commercial, recreational and personal use; molluscs, crustaceans, finfish, other)
Dietary preferences - e.g., urchins, etc., and highly migratory species impacts on international populations – State Department interactions; includes shift of common dietary needs during incident
Subsistence needs – e.g., mammals, and other regulatory and oversight agencies; Alaska native mammal commission, etc.
Operational and other impacts from response: mental health

Monitoring communications (2-way; monitor social media)
Monitoring worker health
Medical surveillance of volunteers: chemical exposures, heat/cold/mental stress
Medical surveillance of workers and responders: chemical exposures, heat/cold stress effects, mental stress
Monitoring response options: effectiveness of response options, situational (location, extent of contamination)
Surface distribution of oil (overflights, remote sensing, visual observations)

Shoreline and near-shore oiling (shoreline surveys--sediment samples and analysis)
Water column concentrations (boat surveys--water samples and analysis)
Bottom sediment concentrations (beyond the near shore) (sediment samples and analysis)
Biological monitoring (organism counts, sampling (contamination/uptake), organoleptic for food fish)
Baseline environmental monitoring (or control area for pre-spill data)--pre- and post-treatment monitoring and relative comparisons
SMART monitoring (air and water)
Data management and sharing

Fate and Transport Modeling
2D and 3D transport modeling: trajectory, distribution, concentration, and description of the contaminant (quantitative and qualitative); TAP output with probabilities of impacts from a hypothetical scenario, modeled uncertainties
Time to reach sensitive species (i.e., species of concern)
Time to reach sensitive habitats (i.e., habitats of concern)
Time to reach human use resources of concern
Remobilization
Long term oil budget

Impacts (of spill and response)

Human health exposures (Note: from Nalini Sathiakumar's summary for the IOM)
Exposures for workers/responders/volunteers (those in spill proximity for extended periods): inhalation, dermal, or ingestion (direct, or indirect through ingestion of exposed seafood)
Chemical exposures (responders/workers/volunteers)
Crude oil may include benzene and other volatile organic compounds (VOCs – benzene, ethylbenzene, toluene, xylene, naphthalene), oil mist, polycyclic aromatic hydrocarbons (PAHs), diesel fumes and heavy metals (aluminum, cadmium, nickel, lead and zinc)
Oil dispersants may include sulfonic acid salt (detergent), propylene glycol, 2-butoxyethanol (solvent) and petroleum distillates (mixture of paraffins that may contain a small amount of aromatic hydrocarbons)
Airborne particulates, combustion byproducts
Physical hazards may include ergonomic hazards, excessive noise levels, sun exposure and heat stress; injuries may occur due to slips, trips
Exposures from plumes: inhalation
Special populations exposed to controlled oil burning (elderly, infants, children with asthma)
Exposures from ingestion, especially high seafood consumption by vulnerable populations (e.g., prenatal through pregnant women)
Exposures to stress (social disruption, spill and response events)

Human health effects (Note: from Nalini Sathiakumar's summary for the IOM; based on indirect measures of exposure, data from very few spills)
Mental health effects (stress, other)
Short term health effects from extended exposures to oil (VOCs, PAH) or dispersant (solvent [2-butoxyethanol], oil mist):
Ocular (eyes): watering, itching
Respiratory: cough and throat irritation
Neurological: nausea/vomiting, headache, dizziness, irritability, confusion, weakness of extremities
Irritation of skin (occupational)
Genotoxicity and endocrine effects (possible, only observed in a few spills to date)

Response resources engaged:
Engagement of health care professionals - outreach and education, 3rd parties (EMTs, nurses, veterinarians, pharmacists, doctors, associations (ASTHO, NACCHO))
Occupational health (SOPs for HHS)
Training, just-in-time training
PPE
Mental health support (helpline)

Natural resources – response options and how they impact resources; what we need to protect and concern over what the chosen response option is doing to the resources
Natural resources at risk: mammals, birds, fish, shellfish, reptiles, turtles, flora, plankton (planktonic sensitive life stages), corals [Mearns: RAR doesn't address subsistence throughout the year]
Concentration (# at risk)
Vulnerability of biological resources
Recovery rates
Integrity and functionality of ecosystem
Status (threatened, endangered)
Significance (% of local, regional or global population)
Habitat preference (where does it live, feed, reproduce)
Habitats (location)
Air
Open water/on water: birds, marine mammals, turtles

Nearshore (< 10 meters): birds, marine mammals, planktonic sensitive life stages, turtles, shoreline intertidal fish, shellfish, corals
Subsurface (upper 0-30 feet of water column, lower water column)
Benthic/subtidal - deep sea bottom: habitats (seagrass, coral reefs, hard bottom), shellfish, corals
Shoreline (ESI categories 1-10): birds, plants, turtles, shellfish, corals
Marsh: birds, fish, shellfish, plants

Effects
Mortality: toxicity indices - indicator species (LC50 24-48 - exposure factor (Anderson's Exposure Index))
Sublethal (chronic)
Reduced growth
Reduced reproduction

Reduced mobility; reduced ability to compete or survive
Tainting
Genetic effects

Human use resources at risk
Effects on extraction: fisheries, water intakes, other resources (sand mining, kelp harvesting, desalinization)
Tainting
Contamination of equipment
Closures
Recreation: tourism, marinas, recreational fishing, parks
Cultural: archeological, historical, Native American
Shipping
Managed areas: refuges, marine sanctuaries, estuarine resources, wildlife management areas

The above is a conceptual sketch of an oil spill response decision model, designed to represent what factors one would take into account (reflecting the best science) in deciding whether or not to delay action or to implement mechanical, chemical, and/or controlled burning responses to an oil spill in the open ocean. This model reflects expert elicitations carried out in the mid-1990s and in August 2012, and does (or will) incorporate
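As a compact, machine-readable view of the outline above, the following sketch encodes only the top-level categories and a few example factors of the expert decision model as a nested Python data structure. It is an abbreviated illustration of the model's structure, not a complete or authoritative encoding.

# Minimal, abbreviated sketch of the expert decision model's top-level structure.
# Categories and example factors are drawn from the outline above; most detail is omitted.
expert_model = {
    "Initial oil": ["type (API gravity)", "amount", "source characteristics"],
    "Time": ["oil age", "time to shoreline/impact"],
    "Physical and environmental conditions": ["weather", "sea conditions", "location"],
    "Fate and transport processes": ["spreading", "evaporation", "dispersion",
                                     "biodegradation", "photo-oxidation"],
    "Logistics": ["mechanical recovery systems", "dispersant availability",
                  "delivery systems", "monitoring equipment"],
    "Response options": ["no action/delay", "mechanical", "controlled burning",
                         "chemical dispersants", "source control"],
    "Monitoring": ["response effectiveness", "human health", "environment"],
    "Impacts": ["human health", "natural resources", "human use resources"],
}

for category, factors in expert_model.items():
    print(f"{category}: {', '.join(factors)}")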

Appendix II. Survey questions [highlight where there are right answers?]

Q1. What key information do you think should be in a booklet on oil spill response options, for it to be useful to you? [open ended]

Q2. Here are five views on how ocean ecosystems work, each shown as a ball balanced on a line with a different ability to withstand oil spills. Which best represents your view? [select one] [5 pictures plus texts: Gradual, Fragile, Stable, Threshold, Random] images randomized

Q3. What comes to mind first when you think of using chemical dispersants to respond to marine oil spills? [open ended]

Q4. On average what proportion of chemical dispersants applied by aircraft hit (land on) a target oil slick, to the best of your knowledge? [(a) none or a tiny percentage (b) a small percentage (less than a third) (c) around half (d) the majority (more than half) (e) all or almost all (more than two thirds)]

Q5. When it is technically feasible to use controlled burning on a marine oil spill, how appropriate is it to consider it as a response (clean-up) option? [Never appropriate, Almost never appropriate, Sometimes appropriate, Almost always appropriate, Always appropriate] responses randomly reversed

Q6. When it is technically feasible to use chemical dispersants on a marine oil spill, how appropriate is it to consider it as a response (clean-up) option? [Never appropriate, Almost never appropriate, Sometimes appropriate, Almost always appropriate, Always appropriate] responses randomly reversed

Q7. For oil spill response, how important is knowing the ingredients of the dispersant available to use? [Not important, Somewhat important, Very important, Essential] responses randomly reversed

Q8. For oil spill response, how important is knowing the relative toxicities of chemical dispersants and oil, in the short and long terms? [Not important, Somewhat important, Very important, Essential] responses randomly reversed

Q9. For oil spill response, how important is knowing the availability of equipment and personnel to rehabilitate/restore impacted wildlife and environmental resources? [Not important, Somewhat important, Very important, Essential] responses randomly reversed

Q10. For oil spill response, how important is knowing the pre-spill ecological baseline of the oil spill area? [Not important, Somewhat important, Very important, Essential] responses randomly reversed

Q11. Chemical dispersants used on oil spills enhance natural rates of biodegradation. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q12. Toxicologists and spill scientists agree on the toxicity of chemical dispersants. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q13. Scientists agree on the effectiveness of using chemical dispersants on oil spills. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q14. When correctly conducted, controlled oil burning at sea has been shown to remove what percentage of spilled oil where applied, to the best of your knowledge? [(a) none or a tiny percentage (b) a small percentage (less than a third) (c) around half (50%) (d) the majority (more than half) (e) all or almost all (more than two thirds)]

Q15. As oil weathers, its dispersibility decreases rapidly. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q16. Laboratory studies can predict the effects controlled oil burns will have on marine life. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q17. Using real-time air quality monitoring, response organizations can protect human health from oil spill response-related air pollutants. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q18. Sensory testing by smell and taste is a valid scientific method to determine oil contamination of seafood. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q19. Recovery of over 25% of oil spilled in the open ocean is common with mechanical response options such as booms and skimmers. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q20. Ten hours after application on the open sea, concentrations of the dispersant itself are generally still high enough to cause lethal or sub lethal effects on some organisms. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q21. Evaporated oil photo-oxidizes [is broken down by sunlight] into environmentally safe compounds. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q22. Normally, after dispersants have been sprayed on an oil spill in the open ocean, ecological effects are due primarily to the dispersed oil, not the dispersant. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q23. Dispersant effectiveness can be determined by aerial visual inspection. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q24. Dispersed oil in the ocean at low concentrations will cause adverse ecological effects. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q25. Laboratory studies can predict the impact of dispersed oil on organisms in the marine environment. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q26. Use of chemical dispersants reduces evaporation of oil. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q27. Photo-oxidation (breakdown due to UV / sunlight) of physically and/or chemically dispersed oil is a major mechanism for removing petroleum pollutants from the marine environment. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q28. A significant proportion of physically and or chemically dispersed oil biodegrades (is removed from the marine environment through biodegradation). [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed

Q29. Chemical dispersants used on oil spills are detectable in fish after a year. [True, Maybe True, Don't know, Maybe False, False] responses randomly reversed


What-If Scenario Modeling to Support Oil Spill Preparedness and Response Decision Making

Tom Leschine

1. Scenarios and Scenario Analysis

Scenarios are essentially descriptions of possible future events that might reasonably be expected to occur, considered in terms of their implications for present-day decisions and actions (Jarke et al. 1998, cited in Duinker and Greig 2007). In the broadest terms, scenario-based thinking can be applied to any of the three basic questions humans typically ask about the future (Rubin and Kaivo-oja 1999, cited in Duinker and Greig 2007, 208):

• What may happen? (possible futures)
• What is most likely to happen? (probable futures)
• What would we prefer to happen? (preferred futures)

As such, scenarios have likely been employed since time immemorial. Bradfield et al. (2005) comment that Plato’s Republic, Thomas More’s Utopia, and George Orwell’s dystopian classic Nineteen Eighty-Four are all products of scenario-based thinking from this broad perspective.

The modern day practice of scenario analysis (SA) had its origins in the post-WW II era, particularly military and governmental strategic planning during the Cold War. Efforts to systematize scenario analytic techniques were initiated then and continue to the present day. The birth of modern SA is strongly associated with work done at the Rand Corporation and Stanford Research Institute (SRI) during the 60s and 70s by Herman Kahn and associates (Kahn 1965; Chermack et al 2001).

The terms “futurism” and “futurology” are now often applied to the practice of thinking systematically and creatively about possible futures. Concurrent with the analytical development of SA was the flowering of futurism, finding a niche in the book-publishing world that has proven extremely durable. From Alvin Toffler’s Future Shock (1970) to Alan Weisman’s The World Without Us (2007), the genre continues regularly to find its way onto bestseller lists worldwide. People are drawn to speculations about the future for both good and bad, but modern SA practice, described in the next section, aims to systematize future-oriented thinking and to focus on, if not the probable, the possible, under the rubric of plausibility.

Modern Day Scenario Analysis

SA emerged as a tool of business planning during the same era as the work of Kahn and his associates. Several authors point to Royal Dutch Shell as the first successful application of scenario analysis in business planning (Lorenz 1990, Postma and Liebl 2005, Bradfield et al. 2005). Shell’s approach and successes were well documented, giving their efforts additional force (Wack 1985, Huss et al. 1987, Lorenz 1990 [all refs. in Bradfield]). Using an approach called “intuitive logics” that was developed specifically for the purpose, Shell managers initiated in 1967 periodic corporate planning exercises built around ‘plausible’ projections of the oil industry business environment in the year 2000. Unlike many in the electric utility industry, Shell correctly foresaw and planned against the possibility that the industry as a whole would not keep expanding as it had in the past, as became the reality during the 1970s and 80s. Postma and Liebl (2005, 162) attribute to Ringland (2002) the idea that SA properly done creates “a coherent and credible set of stories of the future as a ‘wind tunnel’ for testing business plans or projects, prompting public debate, or increasing coherence.”

SA is now employed extensively in long-range business planning and strategic management (Bloom and Menefee 1994, Bradfield et al. 2005, Postma and Liebl 2005, Varum and Melo 2010). While the term encompasses an array of techniques intended to assist managers and planners in anticipating future trends and changes as they potentially affect the organization, the Shell “intuitive logics” approach, or variations on it, has emerged as a much-referenced standard in a field otherwise rife with professional practitioners and approaches (Amer et al. 2013, Bradfield et al. 2005, Postma and Liebl 2005).

SA as a Tool for Environmental Assessment

SA is also an emerging tool of environmental assessment and management (Peterson et al. 2003, Duinker and Greig 2007, Tompkins et al. 2008, Biggs et al. 2010). Especially prominent are the Millennium Ecosystem Assessment (MEA; Carpenter et al. 2003) and the scenarios of future climate change developed by the Intergovernmental Panel on Climate Change (IPCC) in support of its series of climate assessment reports, now four in number with a fifth under development (http://www.ipcc.ch/). Both studies unfolded under UN auspices and both address problems of the human environment at global scales. While scenario analysis is appropriate at any spatial scale from the local to the global (Duinker and Greig 2007), it is most appropriate at temporal scales not so far into the future that little can be counted on for sure (the ‘predetermineds’ discussed below) and uncertainties of increasingly indeterminate character multiply (Postma and Liebl 2005).

The MEA developed and analyzed in detail four scenarios of plausible future change in the global natural and human environments, the drivers behind change, the impacts on ecosystems and ecosystem services of plausible changes, and their potential consequences for human well-being (Carpenter et al. 2003). Specifically, the MEA asked, “What are the consequences of plausible changes in development paths for ecosystems and their services over the next 50 years and what will be the consequences of those changes for human well-being?” The IPCC used models and expert judgment to derive multiple scenarios of future trends in global warming and greenhouse gas (GHG) emissions based on assumptions about the nature and rates of global demographic change, economic growth, and technological development (IPCC 2007 Fourth Assessment Report). To cite another example more immediately relevant to this study, the Arctic Council’s Arctic Marine Shipping Assessment (AMSA 2009) used scenarios to help “identify the major uncertainties that would be critical to shaping the future of Arctic marine activity to 2020 and 2050” (Arctic Council 2009). The study did not, however, consider the potential for spills.

SA Methodology

SA is by definition “what if” analysis. It focuses on the plausible, not on what is most likely to occur or on what might be most desirable (the idea of “visioning” the future). Analysts aim to develop coherent but deliberately varied alternative futures, typically through bounding assumptions about future states of a few key underlying driving forces and their interactions with the other important variables that define the system. The essential initial steps of a typical SA (i.e., one following in spirit the “intuitive logics” approach developed by Shell) are first to identify the forces thought to be shaping the future in the domain of interest and then to select from these the primary driving forces (usually two). The selected driving forces are ideally central to the logic of the problem under consideration, high in their likely impact over the relevant time horizon, and relatively low in their predictability over the same time frame (Postma and Liebl 2005).

For the climate problem, where a primary concern is the rate and extent of future global warming, the key dependent variable is GHG emissions. But many factors will interact to determine future emissions, including future population and economic growth, technological innovation, the extent to which policies encourage conversion to low-carbon energy sources or not, the choices humans make on de- and re-forestation and other land conversion, and much more. In the end the IPCC chose demographic change, social and economic development, and the rate and direction of technological change as the key driving forces. Clearly, extensive additional analysis will often be needed to flesh out the resulting scenarios. In the IPCC’s case, these took the form of statistical analyses of very large data sets and extensive computer simulation.

The advantage of choosing just two driving forces, as many SAs do, is that the pair then conceptually define the axes of a 2 x 2 plot, with the expected high and low end values of each determining the scaling. The associated scenario logic then yields four distinct scenarios, represented by the quadrants of the plane so depicted (Postma and Liebl 2005). By way of example, Figure 1 illustrates the AMSA delineation of possible futures for marine activity in the Arctic in light of a rapidly changing climate and ice regime but, of more direct influence, the twin drivers of the level of future demand for Arctic resources and shipping routes and the nature of future governance arrangements (Arctic Council 2009).
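As a purely illustrative sketch, the Python fragment below crosses the high and low end values of two driving forces to enumerate the four scenario quadrants just described. The driver names and end values are hypothetical placeholders loosely patterned on the AMSA axes, not values drawn from any of the cited studies.

```python
from itertools import product

# Hypothetical driving forces with assumed low/high end values (not from the source studies).
drivers = {
    "resources and trade": ["low demand", "high demand"],
    "governance": ["unstable rules", "stable rules"],
}

def scenario_matrix(drivers):
    """Cross the end values of two drivers to produce the four quadrants of a 2 x 2 scenario matrix."""
    (name_x, ends_x), (name_y, ends_y) = drivers.items()
    return [{name_x: x, name_y: y} for x, y in product(ends_x, ends_y)]

for quadrant in scenario_matrix(drivers):
    print(quadrant)
```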



Figure 1: AMSA Scenarios Matrix Source: Arctic Council. 2009. Arctic Marine Shipping Assessment Report. Second Printing.

In 2007, the Global Business Network facilitated two scenario workshops for the AMSA to identify the major uncertainties in the future of Arctic marine activity to 2020 and 2050. Workshop participants identified 120 factors that could shape Arctic marine activity by 2050. From the identified factors, they then created six potential matrices for framing a set of scenarios by selecting pairs of critical factors or uncertainties and crossing them to produce frameworks. These frameworks were discussed and analyzed for their strengths, weaknesses, and applicability to the Arctic. Finally, two primary drivers and key uncertainties were selected as axes for the final AMSA matrix. Three criteria were used to choose the final axes: degree of plausibility, relevance to the Arctic and maritime affairs, and being at the right level or threshold of external factors. The final axes selected were resource and trade, and governance. Resource and trade encompasses the level of demand for Arctic natural resources and trade, and exposes the scenarios to a broad range of potential market developments. Governance is the degree of relative stability of rules for marine use both within the Arctic and internationally (Arctic Council 2009).
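As an illustration only, the short Python sketch below mimics the axis-selection step just described: candidate driver pairs are scored against the three criteria and the highest-scoring pair becomes the matrix axes. The candidate factors and scores are invented for the example and are not drawn from the AMSA workshops.

```python
from itertools import combinations

# Hypothetical candidate factors (the AMSA workshops identified 120 of these).
# Hypothetical 0-5 scores against the three criteria named in the text:
# plausibility, relevance to the Arctic and maritime affairs, right level/threshold.
scores = {
    "resource demand":      {"plausibility": 5, "relevance": 5, "level": 4},
    "governance stability": {"plausibility": 4, "relevance": 5, "level": 5},
    "ice extent":           {"plausibility": 5, "relevance": 4, "level": 2},
    "fuel prices":          {"plausibility": 3, "relevance": 3, "level": 3},
}

def pair_score(pair):
    """Total a candidate axis pair's scores across all three criteria."""
    return sum(sum(scores[factor].values()) for factor in pair)

best_pair = max(combinations(scores, 2), key=pair_score)
print("Selected axes:", best_pair)
```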

According to the AMSA report, “The use of scenario planning identified three key issues for the Arctic states; the ongoing globalization of the Arctic through natural resource development and destinational marine traffic; arrival of the global maritime industry in the Arctic Ocean with Arctic voyages of large tankers, cruise ships and bulk carriers on regional and destinational voyages; and lack of international policies in the form of maritime governance to meet this arrival” (Arctic Council 2009). The AMSA SA also includes three regional scenario futures to 2020 including the Bering Strait region, Canadian Arctic and Northwest Passage, and the Northern Sea Route and adjacent areas.

Although SA is different from trend analysis, trends should by no means be ignored in SA; relevant trends thought likely to persist through the period of interest—called ‘predetermineds’ by Postma and Liebl (2005)—must be factored into scenario development as appropriate. For example, analysts and decision makers might be comfortable assuming certain patterns of development or demographic trends will continue over relevant time frames for analysis directed at uncertainties elsewhere in a problem’s definition.

Uncertainties are the other key consideration in developing scenarios. The way uncertainty matters in SA, however, is frequently misunderstood. With statistical and probabilistic approaches, the likelihood of future events is often the key uncertainty at issue. With SA, however, events are known with certainty because they are postulated to occur; indeed, they are the backbone of the “what if” that drives SA. It is their coming about that is not yet known, as Postma and Liebl (2005, 163) put it. If they do occur, what will their impacts be and how will they unfold? What effects will prior actions have on mitigating deleterious impacts and on assuring that positive gains are actually realized? And will they in fact occur, or will other, less anticipated things happen instead, things we will wish in retrospect we had been better prepared for? Questions like these frame the residual uncertainties in SA. In the example presented below of a comprehensive SA conducted under Department of the Interior (DOI) leadership to support the response to the Deepwater Horizon (DH) oil spill, scientific uncertainties are estimated via expert judgment (DOI 2010, Machlis and McNutt 2010).

Uncertainty also takes other forms in SA that are more difficult to deal with, ultimately limiting and potentially undermining SA’s effectiveness as a planning tool. Citing Lempert et al. (2003), Postma and Liebl (2005) note that the appropriate causal models for characterizing interactions of the key variables may either not be known or be disputed among experts. Moreover, the values planners, decision makers and the public-at-large attach to various projected outcomes are just that—values—and as such may be disputed as to their relative importance or overall desirability. More fundamentally, the very nature of knowledge itself comes into play. A number of authors, including Postma and Liebl, note that knowledge generally divides into three categories: things we know, things we know we don’t know, and things we don’t know that we don’t know (Postma and Liebl 2005, 166). The type of uncertainty best explored via SA is the second, the “known unknowns” in the terminology of a much-cited quote from former Defense Secretary Donald Rumsfeld. Rumsfeld’s “unknown unknowns” are, not surprisingly, little considered in SA. Examples abound where the logic employed in developing scenarios proved too simplistic or too confining given events that actually later occurred, seen more clearly with the benefit of hindsight.

Sociologist Lee Clarke (1999) uses the term “fantasy planning” to characterize the tendency of institutions and organizations to under-imagine and under-plan relative to events that actually transpire, a problem especially acute, he argues, when organizations formally set out to define and plan for “worst case” events. The two worst marine oil spill disasters in US history, Exxon Valdez and Deepwater Horizon, provide stark cases of this failure to imagine and plan for the worst: events that turned out to be all too realistic scenarios. The use of “worst case” scenarios in oil spill prevention and response is explored in the next section.

2. Scenarios in Oil Spill Preparedness and Response

While scenarios are widely employed in oil spill contingency planning, they most often take the form of estimated or simulated “worst case” discharges (WCD), or other plausible spills that could reasonably be expected to occur from a particular vessel or facility. This is no doubt a product of the way WCDs have been highlighted in law and regulation post-EVOS (see generally 40 CFR Part 300, National Oil and Hazardous Substances Contingency Plan and USCG et al. 2002, National Preparedness Response Exercise Program). Much effort has gone into estimating WCDs for contingency planning purposes. Examples include ERC (2001), Etkin (2003), Etkin et al. (2003), Ferreira et al. (2003), Lima et al. (2003), Etkin et al. (2005), and French-McCay et al. (2005).

With rare exception however, the use made of scenarios in preparedness and response planning falls well short of the fully evolved scenario analysis described in Sec. 1 (Harrald and Mazzuchi 1993, Ott and Stalfort 1997, Holt 2001, Machlis and McNutt 2010). In some instances, historic trends in spill incidence have been relied upon to generate “worst-case” estimates (e.g., Etkin 2003), an approach that assumes the worst thing likely to be encountered is not that much different from what has already been seen. The 1989 Exxon Valdez oil spill (EVOS) revealed many inadequacies in response preparedness, among them a lack of either system or realism in the approaches taken to planning for and training against possible future spill scenarios. Even scenarios that challenged responders in ways actual spills had challenged them in the past were often missing from the repertoire of potential future spill events (Harrald and Mazzuchi 1993).

Such shortcomings were addressed post-EVOS through a number of measures. While PREP (the National Preparedness for Response Exercise Program, discussed below) and similar enhanced planning and training efforts are widely regarded to have improved overall preparedness (Holt 2001, Reiter et al. 2005), attention continues to be called to inadequacies and limitations in the way the basic question of what constitutes adequate response preparedness is framed (Harrald and Mazzuchi 1993, Ott and Stalfort 1997, Holt 2001). In contrast to the way scenarios are developed through open and highly interactive group processes in SA approaches that have evolved outside the spill response community, highly prescribed and agency- and mode-specific definitions of “worst case”, “average most probable” and “maximum most probable” discharge form the backbone of both the National Contingency Plan (NCP; 40 CFR Part 300) and the PREP exercises that have brought industry elements into response planning (USCG et al. 2002).

A notable recent exception is the work of the US Dept. of Interior’s Strategic Sciences Working Group (SSWG), an independent scientific advisory committee formed in the aftermath of the 2010 Deepwater Horizon spill (Machlis and McNutt 2010). Also standing apart, though more limited in scope, is a “hindcasting scenario exercise” developed and field tested shortly after EVOS by Harrald and Mazzuchi (1993). The latter suggested a number of institutional inadequacies endemic to spill response planning as it then was, including several in the areas of public information and communications that are the focus of this project. As with the Royal Dutch Shell SA described above, the insights obtained seem due at least in part to the systematic rigor, breadth and comprehensiveness with which this scenario-driven exercise was framed.

The Spill Preparedness and Response Framework under the NCP

The Oil Pollution Act (OPA), passed in the wake of the Exxon Valdez oil spill (EVOS), resulted in sweeping amendments to the National Contingency Plan (NCP; 40 CFR Part 300), the basic framework for government-led response to hazardous spills. The National Response System (NRS) created by the NCP has the mandate to provide for “efficient, coordinated, and effective response to discharges of oil and releases of hazardous substances, pollutants, and contaminants” (NCP, Sec. 300.3(b)). The NCP requires a coordinated system of contingency plans at federal, regional and local levels (i.e., areas required to produce so-called area plans) that address the removal of oil or hazardous substance discharges that may occur, or the imminent threat of such discharges.

An elaborate organizational structure and set of responsibilities for planning and preparedness is created in Subpart B of the NCP (Responsibility and Organization for Response), with parallel but separate requirements that address both facility- and vessel-based incidents (Figure 2). At the federal level, the National Response Team coordinates the NCP (Sec. 300.110) while guiding and coordinating with regional response teams (RRTs) that prepare regional contingency plans (RCPs; Sec. 300.115). The RRTs in turn set regional policy and offer guidance to designated area committees to assure inter-area consistency and consistency of the area contingency plans (ACPs) they prepare with contingency planning at higher levels (Sec. 300.115(2)).

The basic nexus to scenario-based considerations in spill contingency planning occurs at the level of the ACPs (Sec 300.210(c)):



Each Area Committee, in consultation … shall develop an ACP for its designated area. This plan, when implemented in conjunction with other provisions of the NCP, shall be adequate to remove a worst case discharge under § 300.324 [emphasis added], and to mitigate or prevent a substantial threat of such a discharge, from a vessel, offshore facility, or onshore facility operating in or near the area.

Source: National Contingency Plan Subpart C, Planning and Preparedness. US Government Printing Office Electronic Code of Federal Regulations, accessed 7/8/13.

Under the NCP, individual vessels and facilities that meet certain criteria are likewise required to submit plans that address how they would respond to WCDs in a manner consistent with applicable ACPs (VRPs and FRPs respectively). As Subpart D, Operational Response, makes clear (Sec. 300.324), the preparedness and response system supported by the NCP is oriented in significant ways toward developing and maintaining the ability to respond to WCDs. This orientation has translated, however, into a focus largely on the equipment and other logistical assets necessary for response to major spills, and much less on consideration of the broader environment—ecological, socio-cultural and political—that affects and is affected by spills and spill response. Moreover, although “spills of national significance” (SONS) are defined in OPA along with WCDs, there are no specific requirements for plan holders to develop response plans for spills that, due to their size and complexity of response, would qualify as SONS. The DH spill was the first such designated in US history and proved unprecedented in many ways that were not anticipated. SA was in fact employed to assist responders, as elaborated below.

Highlighting Worst Case Discharge (WCD) for Contingency Planning

Prior to the EVOS, determinations of what constitutes “worst case” were haphazard, highly subjective, and often made in ways that precluded the development of a shared vision across stakeholders of what constitutes “worst case” (Harrald and Mazzuchi 1993, Ott and Stalfort 1997, Holt 2001). A review of published reports suggests that the framing and assessment of worst case is often dominated by technological and limited scientific considerations. The focus is typically on engineering and technical aspects of accident scenarios, conditions of wind and weather that can be either accident inducers or complicating factors to the response, potential oil outflow, and the means to contain or mitigate spills with available technologies. Absent or minimal is consideration of possible societal and ecological consequences of spills, their attendant complexities and linkages, and how such factors can or should feed back to the response itself (By way of contrast, see Machlis and McNutt 2010 and DOI SSWG 2010).

Paraphrasing Holt (2001, p. 605), under business as usual planners may have license to assume a future that validates the plan they already have, or one that reflects their biases rather than inherent future uncertainties. Plans that are dominated by procedural considerations or that are based on unrealistic or poorly thought through scenarios may prove of limited utility in actual use (Harrald and Mazzuchi 1993). As Holt (2001) cautions, “The proverbial question ‘what if’ needs to be in the forefront of the planner’s mind” (loc. cit.).

The Oil Pollution Act of 1990 (OPA; 33 U.S. Code 2701-2761 et seq.) addresses the inadequate attention given to assuring the capacity to respond to major spills like EVOS in Sec. 505(b), where it specifies what constitutes a worst case discharge for planning purposes:

(A) in the case of a vessel, a discharge in adverse weather conditions of its entire cargo; and

(B) in the case of a facility, the largest foreseeable discharge in adverse weather conditions.

The agencies that deal with oil and hazardous materials through the NCP have worked in parallel to elaborate the basic WCD definitions in OPA as they apply to the activities they regulate. For example, the Bureau of Ocean Energy Management (BOEM) defines WCD for exploratory and development drilling operations as:

The daily rate of an uncontrolled flow of natural gas and oil from all producible reservoirs into the open wellbore. The package of reservoirs exposed to an open borehole with the greatest discharge potential is considered the worst-case discharge scenario.

Source: http://www.boem.gov/Oil-and-Gas-Energy-Program/Resource-Evaluation/Worst-Case-Discharge/Index.aspx

The US EPA has jurisdiction over spills in inland waters and has similarly developed its own definition of ‘worst case’ for planning purposes (http://www.epa.gov/oem/docs/oil/fss/fss02/bdavispaper.pdf).

The USCG, EPA and DOI (parent to both BOEM and BSEE) have each defined what constitutes “worst case”, “maximum most probable” and “average most probable” spills from the perspective of their particular responsibilities under the NCP (USCG et al. 2002).

National Preparedness for Response Exercise Program (PREP)

Relatedly, the National Preparedness for Response Exercise Program (PREP) was developed in 1994 to meet the intent of section 4202(a) of OPA. PREP is a unified federal effort that satisfies the exercise requirements of the Coast Guard, EPA, the Research and Special Programs Administration (RSPA) Office of Pipeline Safety, and the Minerals Management Service (now BOEM and BSEE). While PREP is a voluntary program, it satisfies the OPA 90 exercise requirements mandated by the federal agency with regulatory oversight for the specific type of industry involved. Plan holders may choose to develop their own exercise program that complies with the regulatory exercise requirements stated in OPA 90, or they can follow the PREP Guidelines. Area Contingency Plan holders are required to follow the PREP Guidelines.

The National Response System (NRS) that exists under the NCP formally includes a number of organizational entities: the National Response Team, Regional Response Teams, Area Committees, On-Scene Coordinators, and state and local government entities involved with response planning and coordination. PREP, “consistent with OPA 90 objectives, specifically involves the private sector with the NRS in order to ensure effective exercise development, delivery and coordination” (USCG et al. 2002).

PREP includes guidelines for both internal and external exercises. The internal exercises are conducted wholly within the plan holder’s organization, and as such they may or may not involve other members of the broader response community. For example, USCG response personnel often participate in internal Spill Management Team (SMT) exercises, if their schedules can accommodate it. External exercises include other members of the response community and are designed to examine the response plan and ability to coordinate. For vessels, the plan holder must ensure that each vessel in the fleet conducts the exercise. For unmanned barges, only one barge in a fleet must conduct the exercise each quarter. Facilities have the option of conducting emergency procedures exercises. Each response plan holder must identify a SMT that is required to conduct an annual tabletop exercise to ensure familiarity with the response plan.

PREP also includes area exercises for areas that have separate and distinct ACPs. PREP states a national goal of 20 area exercises each year, 60 for a triennial cycle, with all areas across the country exercised triennially. Area exercises are designed to exercise the government and industry interface for spill response and include federal, state, and local government and industry. WCDs are PREP’s primary focus for area exercises; that is, one plan holder’s WCD will be selected for the area exercise. The PREP exercise formulations prescribe that plan holders, response resources, and response agencies will practice responding to small (average most probable), medium (maximum most probable) and large (worst case) discharges of oil from their facility or vessel. At least once in a triennial cycle, each plan holder must conduct an exercise for the WCD identified in their plan (USCG et al. 2002). In addition to planned exercises, the PREP Guidelines also call for government-initiated unannounced exercises and specify that plan holders must conduct annually at least one internal unannounced exercise.

PREP exercises, while praised for the coherence they have brought to the response system, have also been criticized on several grounds. One is the lack of advance over time in the complexity of events that define PREP training exercises (Reiter et al. 2005). While the PREP guidance requires that 15 core components be tested over the triennial cycle, different individual components can be tested each year, in effect keeping the level of difficulty roughly constant over time. This is in contrast to some state-level practices, in particular Washington State’s, where essential components of exercises aimed at WCDs are retained for subsequent drills even though successfully tested in the current drill (Reiter et al. 2005). Also, because PREP exercises are self-evaluated by the private sector entities that organize and deploy them, lessons learned may neither be shared nor embraced within the sponsoring organizations. Offsetting this to a certain extent, agencies with regulatory responsibility have the authority to request and review plan holder records to assure compliance with PREP. But in the 23 years since OPA 90 was passed, and especially since 9/11, competing priorities have arisen for plan holders and for the agencies that must assure compliance. Although feedback mechanisms exist within PREP, as Reiter et al. (2005) put it, drills that become routinized may come to have less and less learning value over time.
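As a purely illustrative sketch, the Python fragment below checks whether a plan holder's exercise records cover all of the required core components across a triennial cycle and whether the WCD scenario was exercised at least once, the two cycle-level requirements described above. The component names, record format, and the fifteen-item component list are hypothetical stand-ins, not the actual PREP core components.

```python
# Hypothetical core components; PREP specifies 15, generic names are used here for brevity.
CORE_COMPONENTS = {f"component_{i}" for i in range(1, 16)}

# Hypothetical exercise records for one plan holder over a three-year cycle.
exercise_log = [
    {"year": 2011, "scenario": "average most probable", "components": {"component_1", "component_2"}},
    {"year": 2012, "scenario": "maximum most probable", "components": {"component_3", "component_4"}},
    {"year": 2013, "scenario": "worst case", "components": {"component_5", "component_6"}},
]

def cycle_compliance(log):
    """Report which core components were never tested and whether a WCD exercise occurred."""
    tested = set().union(*(record["components"] for record in log))
    return {
        "untested_components": sorted(CORE_COMPONENTS - tested),
        "wcd_exercised": any(record["scenario"] == "worst case" for record in log),
    }

print(cycle_compliance(exercise_log))
```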

Other uses of Scenarios in Preparedness and Response Planning

Although scenarios centered around WCD dominate the post-EVOS spill preparedness landscape, a few other, differently construed uses of scenario-based thinking in response planning are worthy of mention. Notable among these are a series of “consensus” ecological risk assessments developed through workshops involving local USCG and industry personnel with other local stakeholders in areas with ACPs (Aurand et al. 2000), and an ad hoc SA conducted in 2010 during the DH oil spill by a special working group convened by DOI. The purpose of the latter was to assess potential future consequences of the DH oil spill across a wide array of ecological and societal concerns (Machlis and McNutt 2010).

Ecological Risk Assessment Workshops

Ecological Risk Assessment (ERA) workshops have been sponsored by USCG, NOAA and other agencies with the goal of developing, in advance of spills in particular regions, an expert consensus on the relative ranking of various spill response measures, including the use of chemical dispersants, in terms of the resource damages avoided by use of the measures [cite response.restoration.noaa.gov]. Spill scenarios are developed by workshop participants for this purpose, with the scenarios that emerge seemingly free of the single worst-case-example thinking that otherwise dominates much spill preparedness planning.

Expert panels are assembled locally and encouraged to develop their own consensus spill scenarios to support the rest of the exercise. Groups are encouraged to develop multiple scenarios that vary oil type, spill size and location, weather, and other physiographic and biological factors that affect spill behavior and impact (Aurand et al. 2000). These workshops are framed using a step-wise ecological risk assessment framework developed originally by EPA in the late 1980s (insert citation), with results taking the form of consensus relative-risk ranking matrices across multiple response options (including the ‘no response’ option). Ecological endpoints appear to dominate in these exercises, but the “what if” character comes through strongly in the experts’ projections of relative impact. These are based on such factors as the extent of a resource affected and the time to its recovery, with the matrix format lending itself to relatively easy comparison across response options (NOAA website).

An added advantage of the ERA approach is that the focus on endpoints can raise the broader question of what appropriate endpoints should be to judge the efficacy of spill response. The dialogue on this question could be further enhanced by the inclusion of other stakeholders in the workshops (Tuler et al. 2008, Webler and Lord, 2010).

Scenario Building for the Deepwater Horizon Spill

This exercise, conducted under DOI auspices by the independent Strategic Sciences Working Group (SSWG), was intended to help spill responders anticipate possible future impacts of the spill and efforts to contain it. It took a very comprehensive approach that seems unprecedented in the annals of scenario analysis in the context of spill preparedness and response. Principles of SA as a strategic planning tool were fully embraced in the exercise conducted by the SSWG. A conceptual framework of the likely evolution of spill response and recovery undergirded the scenario development effort, guided by interaction with both federal and nonfederal scientists and decision makers dealing with the spill on a day-to-day basis (Machlis and McNutt 2010). An interdisciplinary, coupled human and natural systems perspective informed consideration of the spill’s consequences, meaning that scenarios considered how possible future ecological impacts could cascade into social and economic consequences for the human, resource-dependent communities of the Gulf region and vice versa. The Working Group developed its scenarios in the form of “cascading chains of consequences”, assigning as well levels of scientific uncertainty to each consequence on a scale that ranged from “not possible” to “certain”, with successive levels of uncertainty in between and “nk” (not known) reserved for “insufficient information to assign a level of certainty” (DOI 2010).
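The following is a minimal sketch, under assumed names, of how such a cascading chain of consequences might be represented and walked in code. The consequence text and certainty labels are illustrative rather than taken from the SSWG report, though the rating scale endpoints ("not possible", "certain", "nk") follow the description above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Consequence:
    """One node in a cascading chain of consequences, with an expert certainty rating."""
    description: str
    certainty: str                       # e.g. "not possible" ... "certain", or "nk" (not known)
    leads_to: List["Consequence"] = field(default_factory=list)

# Hypothetical chain: an ecological impact cascading into social and economic consequences.
chain = Consequence("persistent oiling of coastal marsh", "probable", [
    Consequence("decline in local fishery landings", "probable", [
        Consequence("loss of income in fishing-dependent communities", "nk"),
    ]),
])

def walk(node, depth=0):
    """Print the chain with indentation showing how consequences cascade."""
    print("  " * depth + f"{node.description} [{node.certainty}]")
    for child in node.leads_to:
        walk(child, depth + 1)

walk(chain)
```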

Ultimately three scenarios were developed and analyzed for their potential consequences. The first examined the time period from containment of oil outflow from the damaged well to the beginning of ecosystem recovery, assuming a flow rate of 40,000 bbl/day for the uncapped well and 100 days to containment, both parameters that were poorly known at the time, but a time frame over which ecosystem stress was expected to keep increasing. The second examined the time horizons for short-term and longer-term recovery, when system stress was expected to be declining. The third was a variant on the second, assuming different flow rates and times to containment, and was developed with the assistance of Coast Guard area commanders. Each scenario produced unique insights into what could happen in the future, including many events and consequences that in fact came to pass (see DOI 2010 for details).
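For orientation only, the short calculation below shows the cumulative release implied by the first scenario's assumed parameters (40,000 bbl/day for 100 days); this is simply the product of the two postulated values, not a figure reported by the SSWG.

```python
flow_rate_bbl_per_day = 40_000   # assumed flow rate of the uncapped well
days_to_containment = 100        # assumed time to containment

cumulative_release_bbl = flow_rate_bbl_per_day * days_to_containment
print(f"Implied cumulative release: {cumulative_release_bbl:,} bbl")  # 4,000,000 bbl
```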

As noted, the level of sophistication in development and analysis of the SSWG’s scenarios was unprecedented in the realm of spill preparedness and response. It is also perhaps worthy of note that both the Working Group itself and the approach it took to developing and analyzing scenarios were largely independent of the responders themselves and their day-to-day priorities and concerns. In taking such a comprehensive approach to scenario building for the DH oil spill, an unprecedented event with a range of consequences difficult to anticipate before the fact, the working group perhaps points the way to other uses that could be made of SA, such as developing response scenarios and consequences for spills in Arctic waters newly exposed to oil development and oil in transit.

Summary and Conclusions

Scenario Analysis is the process of developing plausible futures around the forces affecting an organization or its mission in the face of uncertainties over which the organization has little or no control. As one author puts it, to build scenarios is to construct a ‘wind tunnel’ with which to test future plans, ambitions and ideas. SA has been applied with great success in both military and business strategic planning, but it has also occasionally fallen short in situations where planners have failed to grasp the full complexity of the forces shaping the future, for example, a late-1990s attempt to construct scenarios of the future of the internet as a business planning tool (Postma and Liebl 2005). Still, SA properly done forces careful attention and group consensus on what can be assumed more or less for sure given the question and the planning period of interest, what the chief uncertainties are, what driving forces are likely to determine future outcomes, and how plausible future outcomes can be examined in order to best inform present day decision making.

SA has also traditionally been a game for experts to play, potentially limiting its utility to deal with events like major oil spills where impact spreads well beyond the corporate boardroom or the domains in which public management authority resides. Moreover, as spill impacts grow in magnitude or importance and social conflict grows, the results of SA intended to inform spill management have to be communicated and explained, leaving them vulnerable to the same forces that can limit the utility of all analytic tools and expert findings in such circumstances.

If the value of contingency planning is in the collective “creation of a shared definition of a successful response” by a diverse set of experts (Harrald and Mazzuchi 1993, 189), several elements of this account suggest that SA in oil spill response planning and preparation has yet to achieve its potential. First, in contrast to such other SA examples as the MEA, there seldom appears to be an initial, well-formulated question that guides scenario development. Second, the regulatory emphasis on drills and exercises seems both to preclude the creative engagement of diverse experts and stakeholders in exploring drivers of change in the context of oil spill planning and response and to reify static definitions of SA, both of which may hamper its use as an effective tool for anticipating the unthinkable. Members of the lay public have traditionally not been partners in spill-related SAs, and the metrics used to judge the potential effectiveness of various response options have, with rare exception (the work of DOI’s SSWG on the DH spill in particular), not been metrics of direct societal importance.

That said, the present heavy reliance on WCD as the basis for preparedness and response planning is not an unreasonable starting point for fuller use of SA methodology, with better framed organizing questions and more encompassing bases for defining the outcomes of concern. The SSWG’s effort was, after all, initiated in the middle of “the mother of all WCDs,” and the group, with the benefit of independence both from influence by spill managers and from responsibility for day-to-day decisions regarding spill management, was able to generate a number of valuable insights for decision makers regarding the likely shape of recovery to come. The PREP drills and ERA Workshops both seem to offer potential platforms for launching more comprehensive efforts to apply the tools of scenario analysis than current practice reflects. Thinking broadly, systematically and comprehensively about possible future spills, their management and the nature of recovery afterwards might also further the cause of more fully engaging the public in what is often seen as a technician’s game up until the moment the oil enters the water, the TV lights come on, and opinion regarding the spill and its management has gone viral through social media, which increasingly plays a dominant role in the shaping of public opinion.



Acknowledgements Special thanks to project collaborators Ann Bostrom, Ann Hayward Walker and Bob Pavia for very helpful comments and suggestions, and to Erica Bates of the School of Marine and Environmental Affairs for invaluable research assistance.

References (still in development)

Aamo, O.M., M. Reed, and A. Lewis. 1997. Regional contingency planning using the OSCAR oil spill contingency and response model. Proc. AMOP Seminar, Edmonton, Canada.

Amer, M., T.U. Daim, and A. Jetter. 2013. A review of scenario planning. Futures. 46: 23-40.

Arctic Council. 2009. Arctic Marine Shipping Assessment Report. Second Printing.

Aurand, D., L. Walko, R. Pond. 2000. Developing Consensus Ecological Risk Assessments: Environmental Protection In Oil Spill Response Planning A Guidebook. United States Coast Guard. Washington, D.C. 148p.

Aurand, D. and G. Coelho (Compilers). 2006. Ecological risk assessment workshop: Environmental tradeoffs associated with oil spill response technologies. Cape Flattery, Washington. A report to Regional Response Team X. Ecosystem Management & Associates, Inc., Lusby, MD. Technical Report 05-01, 32 pages.

Aurand, D. and G. Coelho (Compilers). 2006. Ecological risk assessment: Consensus workshop. Environmental tradeoffs associated with oil spill response technologies. Mexico – United States Pacific Coastal border region. A report to the US Coast Guard, District 11. Ecosystem Management & Associates, Inc., Lusby, MD. Technical Report 06-02, 50 pages.

Aurand, D. (Compiler). 2003. Ecological risk assessment: Consensus workshop. Environmental tradeoffs associated with oil spill response technologies. Maryland Eastern Shore. A report to Commander, US Coast Guard Activities, Baltimore. Ecosystem Management & Associates, Inc., Lusby, MD. Report 02-02, 36 pages.

Biggs, R., M.W. Diebel, D. Gilroy, A.M. Kamarainen, M.S. Kornis, N.D. Preston, J.E. Schmitz, C.K. Uejio, M.C. Van De Bogert, B.C. Weidel, P.C. West, D.P.M. Zaks, and S.R. Carpenter. 2010. Preparing for the future: teaching scenario planning at the graduate level. Frontiers in Ecology and the Environment. 8(5): 267-273

Bloom, M.J. and M.K. Menefee. 1994. Scenario planning and contingency planning. Public Productivity & Management Review. 17(3): 223-230.



Bradfield, R., G. Wright, G. Burt, G. Cairns, and K. Van Der Heijden. 2005. The origins and evolution of scenario techniques in long range business planning. Futures. 37: 795-812.

Briggs, W., S. Lundgren, B. Parker, and G. McMullin. 2011. SONS 2010 and the Deepwater Horizon Response: Applying Lessons Learned from a Worst Case Exercise to a Worst Case Discharge. International Oil Spill Conference Proceedings: March 2011, Vol. 2011, No. 1, pp. abs249.

Broje, V., G. Merrell, J. Taylor, and M. Macrander. 2011. Comprehensive Contingency Planning for Arctic Offshore Operations. International Oil Spill Conference Proceedings: March 2011, Vol. 2011, No. 1, pp. abs84.

Carpenter, S.R., P.L. Pingali, E.M. Bennett, and M.B. Zurek, eds. 2003. Millennium Ecosystem Assessment. Ecosystems and Human Well-being: Scenarios, Vol. 2. 560 pgs.

Chermack, T. J., Lynham, S. A., & Ruona, W. E. A. (2001). A review of scenario planning literature. Futures Research Quarterly, 7(2), 7-32.

Clarke, L. 1999. Mission Improbable. Using Fantasy Documents to Tame Disaster. University of Chicago Press. 229 pp.

Department of the Interior. 2010. DOI Strategic Sciences Working Group Mississippi Canyon 252/Deepwater Horizon Oil Spill Progress Report. 9 June 2010.

Duinker, P.N. and L.A. Greig. 2007. Scenario analysis in environmental impact assessment: Improving explorations of the future. Environmental Impact Assessment Review. 27: 206-219.

Environmental Research Consulting (ERC). 2001. Analysis of Washington State Vessel and Facility Oil Discharge Scenarios for Contingency Planning Standards. Prepared for Washington Dept. of Ecology Spills Program, Olympia, WA.

Etkin, D.S., D. French McCay, J. Rowe, and L. Pilkey-Jarvis. 2005. Modeling Impacts of Response Method and Capability on Oil Spill Costs and Damages for Washington State Spill Scenarios. International Oil Spill Conference Proceedings: May 2005, Vol. 2005, No. 1, pp. 467-473.

Etkin, D.S. 2003. Analysis of US Oil Spill Trends to Develop Scenarios for Contingency Planning. International Oil Spill Conference Proceedings: April 2003, Vol. 2003, No. 1, pp. 47-57.

Etkin, D.S., D. French-McCay, J. Jennings, N. Whittier, S. Subbayya, W. Saunders, and C. Dalton. 2003. Financial Implications of Hypothetical San Francisco Bay Oil Spill Scenarios: Response, Socioeconomic, and Natural Resource Damage Costs. International Oil Spill Conference Proceedings: April 2003, Vol. 2003, No. 1, pp. 1317-1325.

Ferreira, H.O., A. Cabrai, and A. Souza Junior. 2003. An Application of Worst Case Scenario Concept in Oil Spill Response Planning for Offshore Drilling Operation in Brazil. International Oil Spill Conference Proceedings: April 2003, Vol. 2003, No. 1, pp. 371-376.

French-McCay, D.P., J.J. Rowe, N. Whittier, S. Sankaranarayanan, D.S. Etkin, and L. Pilkey-Jarvis. 2005. Evaluation of the Consequences of Various Response Options Using Modeling of Fate, Effects and NRDA Costs of Oil Spills Into Washington Waters. International Oil Spill Conference Proceedings: May 2005, Vol. 2005, No. 1, pp. 457-461.

Hannah, L., et al. 2007. Protected areas needs in a changing climate. Frontiers in Ecology and the Environment. 5(3): 131-138.

Harrald, J.R. and T. Mazzuchi. 1993. Planning for success: A scenario-based approach to contingency planning using expert judgment. Journal of Contingencies and Crisis Management. 1(4): 189-198.

Holt, W. 2001. The Use of Scenarios in Contingency Planning. International Oil Spill Conference Proceedings: March 2001, Vol. 2001, No. 1, pp. 605-607.

Huss, W.R. and E.J. Honton. 1987. Scenario planning-What Style Should You Use? Long Range Planning. 20(4): 21-29.

Intergovernmental Panel on Climate Change (IPCC). 2000. Emissions Scenarios. Summary for Policymakers. N. Nakicenovic and R. Swart (eds.). IPCC, Geneva, Switzerland, 20 pp.

Intergovernmental Panel on Climate Change (IPCC). 2007. Climate Change 2007: Synthesis Report. Contribution of Working Groups I, II and III to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change [Core Writing Team, Pachauri, R.K and Reisinger, A. (eds.)]. IPCC, Geneva, Switzerland, 104 pp.

The International Tanker Owners Pollution Federation Limited (ITOPF). Contingency planning for marine oil spills. Technical Information Paper 16.

Jarke, M.X., T. Bui, and J.M. Carroll. 1998. Scenario management: an interdisciplinary approach. Requirements Engineering. 3:155-173.

Jenkins, L. 2000. Selecting scenarios for environmental disaster planning. European Journal of Operational Research. 121: 275-286.



Lempert, R.J., S.W. Popper, S.C. Bankes. 2003. Shaping the Next One Hundred Years: New Methods for Quantitative, Long-term Policy Analysis. Rand, Santa Monica.

Lima, J.A., A. Sartori, E. Yassuda, J.E. Pereira, E. Anderson, and E. Campos. 2003. Development of Oil Spill Scenarios for Contingency Planning Along the Brazilian Coast. International Oil Spill Conference Proceedings: April 2003, Vol. 2003, No. 1, pp. 947-953.

Lorenz, C. 1990. How Shell Made its Managers Think the Unthinkable. Financial Times. March 5th, 1990.

Machlis, G.E. and M.K. McNutt. 2010. Scenario-building for the Deepwater Horizon oil spill. Science. 329: 1018-1019.

National Oil and Hazardous Substances Pollution Contingency Plan. 40 CFR Part 300.

NOPSEMA. 2012. Oil spill contingency planning. Environmental guidance note, N-04700-GN0940, Rev July 2, 2012.

Oil Pollution Act of 1990. 33 U.S. Code 2701-2761 et seq.

Ott, G.L. and D.C. Stalfort. 1997. Planning and Exercising for Success: the Four-Step, Scenario-Based Process. International Oil Spill Conference Proceedings: April 1997, Vol. 1997, No. 1, pp. 3-8.

Peterson, G.D., G.S. Cumming, and S.R. Carpenter. 2003. Scenario planning: A tool for conservation in an uncertain world. Conservation Biology. 17(2): 358-366.

Pereira, H.M., et al. 2010. Scenarios for global biodiversity in the 21st Century. Science. 330: 1496-1501.

Postma, T.J.B.M. and F. Liebl. 2005. How to improve scenario analysis as a strategic management tool? Technological Forecasting & Social Change. 72: 161-173.

Reiter, G., L. Pilkey-Jarvis, and E. Storey. 2005. The National Preparedness for Response Exercise Program (PREP): A Paper Tiger or Dynamic Process? International Oil Spill Conference Proceedings: May 2005, Vol. 2005, No. 1, pp. 969-972.

Ringland, G. 2002. Scenarios in Business. Wiley, Chichester.

Rubin, A. and J. Kaivo-oja. 1999. Towards a futures-oriented sociology. International Review of Sociology. 9(3): 349-371.



Tompkins, E.L., R. Few, and K. Brown. 2008. Scenario-based stakeholder engagement: Incorporating stakeholders preferences into coastal planning for climate change. Journal of Environmental Management. 88: 1580-1592.

Tuler, S., Webler, T., and Kay, R. 2008. Comparing stakeholders’ objectives for oil spill response: A Q study of four regions. Technical report submitted to the Coastal Response Research Center, NOAA Grant # NA04NOS4190063. Project Number: 05- 983. Greenfield, MA: Social and Environmental Research Institute, Inc.

U.S. Coast Guard Research and Development Center. 2003. U.S. Coast Guard Oil Spill Response Research & Development Program, a Decade of Achievement. Final Report. Report No. CG-D-07-03.

U.S. Department of Homeland Security. 2013. National Response Framework. Second Edition, May 2013.

U.S. Department of Homeland Security. 2008. National Incident Management System.

U.S. Department of Transportation, et al. 2002. National Preparedness for Response Exercise Program (PREP) Guidelines.

Varum, C.A. and C. Melo. 2010. Directions in scenario planning literature- A review of the past decades. Futures. 42: 355-369.

Wack, P. 1985. Scenarios: uncharted waters ahead. Harvard Business Review. 63(5): 73-89.

Washington State Department of Ecology, et al. 2013. Northwest Area Contingency Plan (NWACP). Fourteenth Release (Change 14).

Webler, T. and F. Lord. 2010. Planning for the Human Dimensions of Oil Spills and Spill Response. Environmental Management. 45: 723-738.

Wolf, M.A. and C. Baker. 2001. Combining Oil and Hazardous Materials Worst Case Discharge Scenarios. International Oil Spill Conference Proceedings: March 2001, Vol. 2001, No. 1, pp. 719-723.



DRAFT

“It’s Raining Oil”: Information Flow through Twitter after the 2010 Deepwater Horizon Oil Spill

Kate Starbird, Department of Human Centered Design and Engineering, University of Washington, [email protected]

Dharma Dailey, Department of Human Centered Design and Engineering, University of Washington, [email protected]

Abstract

This research examines how information about oil impacts and specifically dispersant use moved through social media and the surrounding Internet during the 2010 Deepwater Horizon Oil Spill. Using a collection of tweets captured during the spill, we employ a mixed-method approach including an in-depth qualitative analysis to examine the content of Twitter posts, the connections that Twitter users made with each other, and the links between that site and the surrounding Internet. This paper offers a range of findings to help practitioners and others understand how social media is used by a variety of different actors during a slow-moving, long-term, environmental disaster.

Introduction - DRAFT

This research examines how information about oil impacts and specifically dispersant use moved through social media and the surrounding Internet during the 2010 Deepwater Horizon Oil Spill. Using a collection of tweets captured during the spill, we employ a mixed-method approach including an in-depth qualitative analysis to examine the content of Twitter posts, the connections that Twitter users made with each other, and the links between that site and the surrounding Internet. This paper offers a range of findings to help practitioners and others understand how social media is used by a variety of different actors during a slow-moving, long-term, environmental disaster. We identify the most-recommended accounts in the set and describe how certain accounts were influential in different “communities” within the larger conversation around the Oil Spill. From a sample of Twitter users, we find several to be “local”—i.e. tweeting from affected areas during the event, and offer a detailed description of the tweeting behavior of locals. We also describe how the Twitter account from Unified Command fit into the larger network, and present some examples of how they were interacting with the Twitter crowd. Additionally, we demonstrate how celebrities and political commentators played significant, though not central, roles in shaping the information flow. Though we examine Twitter behavior in general, we often focus specifically on the conversation around dispersant use (our data set contains more than 11,000 tweets with dispersant-related terms). Related to this, we explore some of the fears and concerns that Twitter users had about dispersant use. In the discussion, we note how many Twitter users were proactively searching for information to help them make sense of the event, and we theorize that even as many users sought to reduce their uncertainty through this interaction, some users may have been looking for sources to support their fears of a worst-case scenario. This latter perspective could explain some of the information-sharing patterns we discovered.



Literature Review - DRAFT

Social Media and Crisis Response

Sociologists of disasters (e.g. Fritz & Mathewson, 1957; Dynes, 1970; Kendra & Wachtendorf, 2003) have established that after a disaster or large-scale emergency event, people will often converge physically onto the scene. Researchers at the University of Colorado have connected this physical convergence behavior with a newly enabled digital convergence behavior that is manifesting online (Hughes et al., 2008; Hughes & Palen, 2009; Palen et al., 2010). People are now turning to social media after crisis events to, among other things, share and seek information (Palen & Liu, 2007; Sutton et al., 2008), to offer and improvise assistance (Starbird & Palen, 2011; Starbird & Palen, 2013), and to participate in collective sense-making (Vieweg et al., 2008; Qu et al., 2009; Heverin & Zach, 2012).

Social Media Use by Emergency Responders

With the knowledge that their constituents are already there and expecting them, emergency responders are increasingly experimenting with using social media and other online tools for crisis communications, though these forays are not without their challenges. Latonero & Shklovski (2011) describe how one EM, an early adopter of social media, uses these new tools to broadcast messages, to interact, and to listen to the public. He describes how the ability to listen to the public through sites like Twitter is extremely important, as it enables him to identify rumors, misinformation, and concerns of his constituents, and to respond to them with better information in real-time. Monitoring social media can also increase situational awareness for responders, and empirical research shows that platforms like Twitter contain a significant amount of situational awareness information during and after disasters (Vieweg et al., 2010). Hughes and Palen (2012) report that social media use is disrupting the work of response professionals who serve as Public Information Officers (PIOs), repositioning them as translators of information, rather than gatekeepers. However, they note that many PIOs are concerned about these changes, feeling that they do not have the capacity or the training to handle social media communications, and fearing the spread of misinformation through these channels. On the other side of this relationship, when official channels of emergency-related information are not adequately meeting their information needs, citizens turn to informal channels—and social media are increasingly playing a role in informal information-seeking (Sutton et al., 2008).

Social Media Use during the 2010 Deepwater Horizon Oil Spill

In June 2010, Pew released a survey conducted June 3-6, finding that nearly two-thirds of Americans were following the story closely, and that respondents trusted the news media more for information about the spill than the government or BP (Pew 2010a). 46% reported that they had little or no trust in the Federal Government. Later that summer, Pew published a related study of the role of media in the event (Pew 2010b), reporting that the Oil Spill dominated the mainstream news during the 100 days after the spill. They noted that due to the complex nature of the spill itself and the methods for cleaning it, media professionals had to have (or quickly gain) a large amount of technical and scientific expertise, and for the most part, the report claimed, they rose to the occasion. Looking to some of the content of the coverage, their research also indicated that while Obama received mixed (both supportive and critical) coverage, BP and their CEO, Tony Hayward, emerged as villains. Sutton et al. (2013) examined how government organizations used the Twitter platform during the 2010 DWH Oil Spill, using what they called the “microstructure”—i.e. retweets, mentions, replies, and links—within tweets to characterize tweeting behaviors from government accounts, and mapping out the network of government accounts using friend-following relationships. They report that government accounts were less likely to interact with others using the retweet mechanism and had a larger proportion of non-reciprocal friend-follower relationships than other accounts—supporting a view of government entities using Twitter primarily as a broadcast mechanism. Spiro et al. (2010) looked at rumoring behavior during the Oil Spill, using quantitative analysis to assess the drivers of social media activity related to the spill. They found that news coverage increased the volume of tweets related to the spill, and concluded that saliency of an event drives online participation. They also found that transmission of information (in the form of retweets) was more common in tweets with event-related keywords. That study did not look at the content of tweets or rumors, or at whose information was being transmitted (retweeted). This research complements the largely quantitative studies of Twitter use during the 2010 DWH Oil Spill (Spiro et al., 2010; Sutton et al., 2013), employing an in-depth qualitative analysis to examine the kinds of information that were flowing through social media and to identify the individuals and organizations who played important roles in shaping those flows.

Twitter

The analysis focuses on a data set collected during the event from public Twitter messages. Twitter is a popular microblogging platform that allows users to share short, 140-character messages with their account’s “followers” as well as with the general public. The vast majority of tweets are public and searchable, an affordance that contributes to the platform’s utility during crisis events. Though the original premise of Twitter was quite simple, users have introduced several linguistic conventions to increase its usability, including the mention (@username), the retweet (often RT @username or via @username) and the hashtag (#keyword). Mentions allow a user to send a public message directed specifically to the attention of another user. Retweets act as a forwarding mechanism, where a downstream user can repost a message that they have seen, giving attribution to its original author or another upstream Twitter user. Retweets also serve, for some, as a recommendation mechanism, both of the information and of the author (Starbird & Palen, 2010). Hashtags act as a mechanism for Twitter users to mark up their tweets with certain keywords, connecting them with certain groups or conversations and increasing their searchability. In recent years, Twitter has added functionality to support these conventions. At the time of the 2010 DWH Oil Spill, Twitter had recently rolled out one-click retweet functionality, though many Twitter users still employed other methods, including manual ones, for marking their tweets as retweets. These tweet features can be important for tracking information flow in the space. Hashtags can be used to home in on specific kinds of conversation and sub-groups in the conversation space, and are therefore useful for focused search. Retweets and mentions demonstrate different kinds of connections between users, and can be traversed to create network graphs.
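Because these conventions are plain-text markers inside the tweet body, they can be recovered with simple pattern matching. The following is a minimal, illustrative sketch (not the project’s actual code) of how mentions, hashtags, and manual retweet attributions might be extracted from raw tweet text; the regular expressions are assumptions about the 2010-era conventions described above.

    import re

    # Illustrative patterns for the conventions described above (assumptions, not the authors' code).
    MENTION_RE = re.compile(r'@(\w+)')
    HASHTAG_RE = re.compile(r'#(\w+)')
    # Manual retweet conventions circa 2010: "RT @user ..." or "... via @user"
    RETWEET_RE = re.compile(r'(?:\bRT\s+@(\w+))|(?:\bvia\s+@(\w+))', re.IGNORECASE)

    def parse_conventions(text):
        """Return mentions, hashtags, and any retweet attribution found in a tweet."""
        rt_match = RETWEET_RE.search(text)
        retweeted_user = next((g for g in rt_match.groups() if g), None) if rt_match else None
        return {
            "mentions": MENTION_RE.findall(text),
            "hashtags": [h.lower() for h in HASHTAG_RE.findall(text)],
            "retweet_of": retweeted_user,
            "is_reply": text.lstrip().startswith("@"),
        }

    print(parse_conventions("RT @NWF: EPA demands that BP use a less toxic dispersant http://bit.ly/dAwCzc #oilspill"))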

Using Tweets to Navigate the Larger Information Space

Another common feature of tweets is the URL link, which users either embed manually, copy and paste from web links, or which is included automatically when users “share” web content (more common now than in 2010). URLs in tweets connect tweet content to other sources on the Internet. Kwak et al. (2010) suggest that Twitter can function as much as a news medium as a “social” medium, since many tweets connect through URL links to other sites. Though the research described here is based primarily on data shared through the Twitter platform, the analysis extends into the wider Internet, using the links embedded in tweets to reconstruct the information space.

Methods

This research examines online communication and interaction during the 2010 Deepwater Horizon (DWH) Oil Spill. It consists of an in-depth qualitative analysis complemented by descriptive network and quantitative analyses, examining information flow within social media as well as the broader information space of the surrounding Internet.

The 2010 Deepwater Horizon Oil Spill

On April 20, 2010, the Deepwater Horizon oil rig exploded and sank into the Gulf of Mexico off the coast of Louisiana. For 87 days, oil flowed from the seabed, resulting in what became the largest marine oil spill along a U.S. coastline (Robertson and Krauss, 2010). Oil made its way to the waters and shores of Louisiana, Alabama, Mississippi, and Florida, causing environmental damage, economic impact and concern for human health in the region. Though it is estimated that as much as 30% of the oil remains in the Gulf (Johnson, 2013), in the immediate aftermath of the explosion over 47,000 people were involved in efforts to burn, contain, recover or disperse oil. The use of two chemical dispersants, Corexit 9500 and 9527, became controversial among both experts and the public, who were concerned about health impacts on marine life and humans. According to the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (2011), this controversy was fueled by three factors: the unprecedented amount of dispersants used (1.84 million gallons); the novel, untested application of dispersants underwater; and the lack of guidelines about overall amounts used and duration of use. The DWH Oil Spill sparked considerable conversation on the web—from mainstream news, to the blogosphere, to the rapidly growing social media sites. At the time, Twitter was just over four years old, but was experiencing rapid growth (Costolo, 2010), and recent earthquakes in Haiti (January 2010) and Chile (February 2010) had attracted attention to the site as a place of information sharing, sense-making, and digital volunteerism during disaster (Harvard Humanitarian Initiative, 2011; Starbird & Palen, 2011).

Data Collection

Twitter provides several Application Programming Interfaces (APIs) that allow anyone with a Twitter account to collect public Twitter data programmatically. The data analyzed in this study were collected using Twitter’s Streaming API, which enables a forward-in-time search on a keyword or a set of keywords. We collected on the term #oilspill, a hashtag that had emerged to signal participation in the ongoing, public conversation around the event. For each tweet, we captured the timestamp, author, and tweet text, among other features. The collection period spanned from May 9, 2010 (~3 weeks after the spill began) to August 4, 2010 (~3 weeks after the well was capped). However, during that time our collection software went inactive for several short periods (less than eight hours each) and one long period (July 8–July 12); no tweets were collected at those times. Figure 1 shows the volume of tweets per day during the collection period.
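As an illustration of the kind of keyword-tracking collection described above, the sketch below uses the tweepy library (a common Python client, assumed here rather than named in the paper, in its pre-4.0 style) to follow a hashtag via the Streaming API and store the fields mentioned in the text. The credentials and output file name are placeholders.

    import json
    import tweepy  # third-party Twitter client, tweepy 3.x-style API; an assumption, not the project's tooling

    class HashtagCollector(tweepy.StreamListener):
        """Write selected fields of each matching public tweet to a line-delimited JSON file."""
        def __init__(self, outfile, *args, **kwargs):
            super().__init__(*args, **kwargs)
            self.outfile = outfile

        def on_status(self, status):
            record = {
                "timestamp": status.created_at.isoformat(),
                "author": status.user.screen_name,
                "text": status.text,
            }
            self.outfile.write(json.dumps(record) + "\n")

        def on_error(self, status_code):
            return False  # disconnect on errors such as rate limiting

    # Placeholder credentials; a real collection requires registered API keys.
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

    with open("oilspill_tweets.jsonl", "a") as f:
        stream = tweepy.Stream(auth, HashtagCollector(f))
        stream.filter(track=["#oilspill"])  # forward-in-time keyword search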



Figure 1. Tweet Volume Over Time. Red represents volume of #OilSpill tweets per day. Blue is volume for dispersant-related tweets. The scale for Red is 10X that of Blue (top #). Black bars represent black-out periods where collection software was inactive.

The collection captured 693,409 tweets sent by 132,075 different Twitter users (see Table 1). Though the mean suggests each Twitter user sent 5.25 tweets, the number of tweets per Twitter user has a heavy-tailed distribution typical of many social media metrics, and the majority of Twitter users contributed only one tweet to our set (83,035). For 50,299 of those accounts, the sole tweet was a retweet, indicating relatively low engagement in the larger Oil Spill conversation. Only 8,464 users sent 10 or more #OilSpill tweets, but those 6.4% of users produced more than half the tweets we collected. Figure 2 illustrates this effect—how most Twitter users sent very few #OilSpill tweets and how a few Twitter users sent a high volume of #OilSpill tweets. This distribution is important for understanding our sampling strategy and for interpreting the findings presented here. Within this set, 11,146 (1.6%) tweets contain one of the following dispersant-related keywords that we identified in the data: dispersant, corexit, and the misspelling dispersent; these were sent by 3,283 different Twitter users. This larger data set is referred to in this paper as the Total #OilSpill Tweet Collection, and several parts of the analysis refer back to this set, including the network diagrams.
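The per-user summary statistics above can be reproduced from the collected records with a simple tally. The sketch below is illustrative only and assumes the line-delimited JSON layout used in the collection sketch earlier.

    import json
    from collections import Counter
    from statistics import mean, median

    # Count tweets per author from the (assumed) line-delimited JSON collection.
    tweets_per_user = Counter()
    with open("oilspill_tweets.jsonl") as f:
        for line in f:
            tweets_per_user[json.loads(line)["author"]] += 1

    counts = list(tweets_per_user.values())
    print("users:", len(counts))
    print("mean tweets/user:", round(mean(counts), 2), "median:", median(counts))
    print("single-tweet users:", sum(1 for c in counts if c == 1))

    # Share of the collection produced by users with 10+ tweets (the heavy tail).
    heavy = sum(c for c in counts if c >= 10)
    print("share from 10+ tweet users:", round(heavy / sum(counts), 2))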

Figure 2. Tweet volume per Twitter user (Total #OilSpill Tweet Collection)
Figure 3. Tweet volume per Twitter user (50-50 Random-Dispersant Sample)




Sampling

To complete an in-depth qualitative analysis of the Twitter activity, we identified a small sample from the larger data set using a tweet-based selection method (as opposed to a Twitter user-based one). Because this research study was designed to assess both the broader conversation about the Oil Spill and commentary specifically related to dispersant use, we randomly selected 250 tweets from the Total #OilSpill Tweet Collection and 250 tweets from the 11,146 that contained a dispersant-related term. These 500 tweets became the 50-50 Random-Dispersant Sample. As a result of the tweet-based selection method, Twitter users who sent more #OilSpill-related tweets and more dispersant-related tweets are more likely to be represented in our data sample. For example, an account that sent 1,000 tweets is 1,000 times more likely to appear in our sample than an account that sent one tweet. The mean number of #OilSpill tweets per Twitter user in the 50-50 Random-Dispersant Sample is 422 and the median is 40, both much higher than in the overall data set. Figure 3 shows how the distribution of tweets per Twitter user changes due to this sampling technique (compare Figure 2). This sample is therefore representative of overall tweet content, but biased towards high-volume Twitter users and those who tweeted about dispersants. This means that though descriptive statistics at the tweet level can be extrapolated across the larger data set, characterizations made about Twitter users (e.g. how many are local, how many accounts belonged to members of the media, etc.) are only true for this sample.
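A minimal sketch of this two-part random selection, assuming tweets are held in memory as dictionaries with a text field (the function name, dispersant term list, and seed are illustrative, not the project’s code):

    import random

    def build_sample(all_tweets, dispersant_terms=("dispersant", "corexit", "dispersent"), n=250, seed=2010):
        """Draw n tweets at random from the full collection and n from the dispersant subset."""
        rng = random.Random(seed)
        dispersant_tweets = [t for t in all_tweets
                             if any(term in t["text"].lower() for term in dispersant_terms)]
        # Assumes both pools contain at least n tweets, as in the collection described above.
        return rng.sample(all_tweets, n) + rng.sample(dispersant_tweets, n)

    # e.g. sample = build_sample(all_tweets)  # yields a 500-tweet "50-50"-style sample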

Data Set | # of Tweets | # of Dispersant-Related Tweets | # of Distinct Twitter Users
Total #OilSpill Tweet Collection | 693,409 | 11,146 | 132,075
50-50 Random-Dispersant Sample | 500 | 254* | 387

Table 1. Descriptions of the Two Data Sets

* Consistent with the relative percentages in the set, 4 of the 250 random tweets also contained dispersant-related terms.

Method of Analysis

This research centers on an in-depth qualitative analysis of a sample of #OilSpill tweets. However, we also employed descriptive statistics, including network analysis, to help us identify and characterize larger patterns in the data. The analysis took place in three stages: exploratory analysis, qualitative coding, and mixed-method analysis of results.

Exploratory Analysis

We first approached the Total #OilSpill Tweet Collection data quantitatively, calculating the overall features of the set (e.g. number of retweets and URLs) and creating network graphs of the interactions. This helped us understand some of the features of the larger set, including which accounts were influential. It also informed our sampling strategy. A second part of the exploratory analysis was qualitative: we sampled randomly from the data, reading tweets and assessing Twitter profiles. Using a grounded qualitative approach, we identified key themes, developed coding categories, attempted to code tweets and Twitter users using these categories, and then revised our coding scheme in an iterative process.



Our exploratory quantitative understanding of the basic features of the set and our preliminary qualitative analysis then informed our sampling strategy, which was designed to capture both a cross-section of all Twitter activity and a substantial subset of the major actors in the space.

Qualitative Coding

We coded the data across three separate dimensions: tweet content, Twitter accounts, and mentions. For each tweet in the 50-50 Random-Dispersant Sample, we coded for tweet theme (allowing one or two selections), emotion conveyed in the tweet, whether the tweet contained scientific content, whether the tweet contained a mention of dispersants and whether that mention was negative, and whether the tweet contained a conspiracy theory. We derived the tweet themes and emotion categories during the exploratory analysis, primarily using a grounded approach and iterating over coding categories until we found a stable set of codes. We also shared preliminary categories with the members of our larger research team (cite), comparing and integrating emerging categories from the social media analysis with categories derived from parallel research on oil spill stakeholders’ mental models (cite). For tweet coding, we had three coders code 200 tweets each, with an overlap of 100 tweets for assessing inter-rater reliability. (Results of coding TBD.)

Each Twitter account that contributed one or more tweets in our 50-50 Random-Dispersant Sample was coded for location relative to the event (Remote, Peripheral, Local, Unknown) as well as for role and organizational affiliation (Individual, Informal Group, Formal Organization, Other). Location was determined through a combination of reading through the #OilSpill tweets and assessing information available on the account owner’s current profile (in June 2013), including their 3,000 most-recent tweets. We classified as “local” anyone for whom we had evidence that they were in the area at any time during the event. Peripheral users lived in or visited affected states, but not affected areas of those states—e.g. the east coast of Florida or northern Louisiana. Since many inland individuals feared oil and dispersant impacts through airborne transmission and other environmental effects, we applied a liberal lens, coding areas within about 100 miles of the Gulf Coast as “local.” Unfortunately, we did not have access to what the user profiles looked like at the time of the event, and additionally, many account owners do not broadcast their account location, so we were unable to determine the location for 39 accounts. Table 2 shows the number of Twitter users in each location category and Table 3 shows the coding across organizational affiliation. For accounts that were organizations and for individuals with an organizational affiliation, we also coded the organizational type and sector. For Twitter user coding, we divided the 387 accounts evenly across three coders, then used a secondary round of consensus coding to bring results into alignment.

Location | # Twitter Accounts
Local | 41
Peripheral | 30
Remote | 276
Unknown | 30

Table 2. Location Distribution of Accounts

Org Affiliation | # Twitter Accounts
Individual | 333
Formal Organization | 21
Other Group/Entity | 20
Unknown | 13

Table 3. Affiliation Distribution of Accounts
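The draft does not name the inter-rater reliability statistic applied to the 100 overlapping tweets. A common choice for categorical codes is Cohen’s kappa, sketched below with scikit-learn purely as an illustration; the coder labels are hypothetical.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical theme codes assigned by two coders to the same overlapping tweets.
    coder_a = ["outrage", "science", "outrage", "conspiracy", "science", "outrage"]
    coder_b = ["outrage", "science", "volunteer", "conspiracy", "science", "outrage"]

    kappa = cohen_kappa_score(coder_a, coder_b)
    print(f"Cohen's kappa for the overlapping tweets: {kappa:.2f}")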

For mention coding, we identified every entity mentioned in the tweets in the 50-50 Random-Dispersant Sample, as well as details about that entity’s professional role and affiliation. For instance, Lisa Jackson is coded as an individual who is an Administrator at the EPA, a U.S. governmental organization at the national level. BP is coded as an organization in the Oil Industry. Additionally, we coded each mention as either an “object” of a comment or an information “source,” and then coded the tweet’s sentiment towards this mention. For mentions that were “objects,” we coded along a 5-point scale from Very Positive to Very Negative, with a secondary option of “directive” where account owners were telling the mentioned entity what to do. For mentions that were “sources,” we coded along a spectrum of trust that included the following options: Explicit Trusted, Implicit Trusted, Neutral, Questioning, Implicit Distrusted, Explicit Distrusted, Unknown. Mentions were coded by a single coder. See Appendix A for more detail on the coding schemes derived from and used in this analysis.

Mixed-Method Analysis

Finally, we analyzed the results of the coding, integrating descriptive statistics of the codes with further qualitative analysis of tweet content, Twitter user behaviors, and network connections between users. This mixed-method approach provides a rich set of findings, described below.

Data Limitations

Consistent with other studies on social media data (boyd & Crawford, 2011), this data set represents only a subset of the broader Twitter conversation about the DWH Oil Spill and has specific biases related to its collection and sampling. Most significantly, only tweets where the account owner purposefully used the #OilSpill hashtag were collected, which means that account owners who were not aware of the hashtag are not represented in this analysis. However, the presence of certain accounts, including several high-volume, local Twitter users and the official account of Unified Command (@Oil_Spill_2010), suggests that some portion of the relevant Oil Spill conversation was indeed organized around the use of this hashtag. Additionally, our sampling technique favored high-volume Twitter users who tweeted about dispersants, so the presented statistics do not characterize all accounts that tweeted about the Oil Spill. We are careful in the reporting to account for these limitations, but recommend that readers keep them in mind when referring to these findings out of the context of this paper.

Findings

Findings on information recommendation, influential accounts, and the information-sharing behaviors of highly-engaged Oil Spill tweeters, including locals, provide insight into the broader patterns of information flow through Twitter during the Oil Spill. These findings characterize a complex information space—a social media platform tightly integrated into the surrounding Internet. Informed by network and tweet analyses, we highlight the roles played by some influential Twitter accounts, including Unified Command; NGOs with context expertise, including wildlife protection groups; celebrities; and political commentators; among others. We show that sense-making within the social media crowd manifested both as complex information-seeking from sources with scientific credentials and as the spread of conspiracy theories. The paper concludes with implications for practitioners.

Tracing Information Flow Using Retweets, Replies and URLs

The connections that exist between accounts within a social media site, as well as the connections between social media and other online sites, are important for understanding information flow. As discussed above, retweets and URL links within tweets play important roles in information flow within Twitter and across the broader information space of the Internet. Table 4 shows the number of retweets, replies, and tweets with URLs both within the Total #OilSpill Tweet Collection and within the subset of tweets that contain dispersant-related terms. 44% of the tweets we collected are retweets and 69% contain a URL link to an external webpage. For both features, these are much higher rates (approximately three times) than what would be found across a completely random set of tweets in 2010 (Rao, 2010; Suh et al., 2010). So, #OilSpill tweets were more likely to contain a URL and to be retweets than the average tweet. For dispersant-related tweets these numbers are even higher—53% are retweets and 75% contain a link. Both of these suggest that many Twitter users relied on other sources and websites for the information they were sharing. First-hand and other “generative” information that was new to the broader information space, though certainly present, was rare. Replies constitute about 5% of the data set, which is much lower than the rate of replies across all Twitter data. This is likely related to the fact that our dataset was collected using a hashtag search and many replies do not contain hashtags.

Tweets | # Tweets | # RTs | # Replies | # with URL
Total #OilSpill Tweet Collection | 693,409 | 302,759 (44%) | 34,102 (4.9%) | 478,547 (69%)
Dispersant-Related Tweets | 11,146 | 5,907 (53%) | 556 (5.0%) | 8,317 (75%)

Table 4. Retweet and URL Counts
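The counts in Table 4 come from simple per-tweet feature checks. The sketch below shows one plausible way to tally them, echoing the convention-parsing approach sketched earlier; the file layout, helper names, and URL heuristic are assumptions, not the project’s code.

    import json
    import re

    URL_RE = re.compile(r'https?://\S+|\bbit\.ly/\S+', re.IGNORECASE)

    def tally(path):
        """Count retweets, replies, and URL-bearing tweets in a line-delimited JSON collection."""
        totals = {"tweets": 0, "retweets": 0, "replies": 0, "with_url": 0}
        for line in open(path):
            text = json.loads(line)["text"]
            totals["tweets"] += 1
            totals["retweets"] += bool(re.search(r'\bRT\s+@|\bvia\s+@', text, re.IGNORECASE))
            totals["replies"] += text.lstrip().startswith("@")
            totals["with_url"] += bool(URL_RE.search(text))
        return totals

    # e.g. tally("oilspill_tweets.jsonl") -> raw counts behind the percentages in Table 4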

Network Diagrams

One way to examine information flow during the event is through network visualizations. Using linguistic conventions in tweet content (like retweets and mentions) we can construct networks of relationships between Twitter users that are event-specific and separate from their lists of followers and followings. Figure 4 is a network graph showing the connections created by each retweet in the Total #OilSpill Tweet Collection. Each node in the graph represents a Twitter account in our set and is sized by the number of retweets that account received. Each edge represents a retweet of one account by another. The edges are directed—i.e. they have an arrow pointing to the retweeted account—and each is sized according to the log of the number of retweets sent between the two accounts. Accounts in this graph are grouped into, and subsequently colored by, “communities” as determined by the Louvain method (Blondel et al., 2008). 48 different communities were found, but only 8 had a significant number of community members. The light blue, aqua, and light green accounts, many of which congregate in the center of the graph, include local Twitter users, formal response organizations, emergent response groups, and media accounts. The red area at the lower right corner of the graph is an almost wholly separate conversation by mostly right-wing political commentators. Edges at the bottom center of the graph are much thicker than in other areas, indicating stronger ties between nodes—i.e. many retweets of one account by another.
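A minimal sketch of how such a retweet network and its Louvain communities could be built, using networkx and the python-louvain package as stand-ins for whatever tooling the project actually used; the edge list here is toy data, and edge construction assumes the retweet attribution extracted in the earlier parsing sketch.

    import networkx as nx
    import community as community_louvain  # python-louvain package; an assumed tool choice

    def build_retweet_graph(edges):
        """edges: iterable of (retweeter, retweeted_account) pairs, one per retweet."""
        G = nx.DiGraph()
        for retweeter, source in edges:
            if G.has_edge(retweeter, source):
                G[retweeter][source]["weight"] += 1
            else:
                G.add_edge(retweeter, source, weight=1)
        return G

    edges = [("userA", "NWF"), ("userB", "NWF"), ("userB", "IBRRC"), ("userC", "WhoDat35")]  # toy data
    G = build_retweet_graph(edges)

    # Node size proxy: total retweets received (weighted in-degree).
    retweets_received = dict(G.in_degree(weight="weight"))

    # Louvain community detection operates on an undirected projection of the graph.
    partition = community_louvain.best_partition(G.to_undirected())
    print(retweets_received)
    print(partition)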



Figure 4. Network Diagram from Retweets

Most Retweeted/Recommended Accounts

Analyzing retweet patterns allows us to see which voices were being heard within the set. Retweets can also be seen as a form of recommendation (Starbird & Palen, 2010), lending insight into which voices were valued as well. We approach the analysis of the most retweeted accounts from two perspectives: by examining retweets across the entire set (Table 5) and by looking at retweets from Twitter users in our 50-50 Random-Dispersant Sample who were coded as local to the event (Table 6). From a high-level view, we see that Non-Governmental Organizations (NGOs) were influential in the space, along with a few individuals who were tweeting from affected areas. Celebrities make the top-retweeted list across the larger set, but are not as influential among local, high-volume #OilSpill tweeters. Significantly, @Oil_Spill_2010, the official account of Unified Command, is highly retweeted both in the larger set and among locals in the sample.

Rank | Twitter User | # of RTs | Affiliation Type | Affiliation Sector | Description
1 | NWF | 7,677 | NGO | Environment - Wildlife | National Wildlife Federation
2 | IBRRC | 7,327 | NGO | Environment - Wildlife | International Bird Rescue
3 | WhoDat35 | 5,261 | Individual | | Local individual
4 | Oil_Spill_2010 | 4,738 | Government | Response | Account of Unified Command
5 | GulfOilCleanup | 4,538 | Informal Group | Media, Community Response | Informal group of collaborating media partners for Oil Spill relief
6 | Alyssa_Milano | 3,507 | Individual | Actress, Celebrity | Early adopter of Twitter, celebrity
7 | MacMcClelland | 3,474 | Individual | Journalist | Mother Jones journalist in NOLA
8 | sami_shamieh | 3,259 | Individual | Political Commenter | Remote individual
9 | TheOilDrum | 3,225 | NGO | Media, Energy | Online news about energy
10 | Jason_Pollock | 3,093 | Individual | Entertainment | Filmmaker, writer, activist

Table 5. Top 10 Most Retweeted Accounts in the Total #OilSpill Tweet Collection

Wildlife NGOs as Context Experts

Table 5 shows the National Wildlife Federation (@NWF) and the International Bird Rescue (@IBRRC) to be by far the most retweeted accounts in the Total #OilSpill Tweet Collection. These two accounts are also both central in our network graph and found to be in the same community (Figure 4, in light green). Their positioning towards the left side of the graph is likely due to a broad array of connections to Twitter users who were not high-volume #OilSpill tweeters. Both NWF and IBRRC are NGOs that focus on the environment and on wildlife protection and rescue, and both had organizational structures that pre-existed the 2010 Deepwater Horizon Oil Spill. Significantly, though the NWF has the higher position in the larger Twittersphere, the IBRRC is more highly retweeted among the local, high-volume #OilSpill tweeters in our sample.

Local “Authorities”

In third place in the larger data set and first place among locals is @WhoDat35, a local Twitterer who sent 5,173 #OilSpill tweets, receiving just over one retweet per tweet. At #4 in the larger data set and #2 among locals is @Oil_Spill_2010, the official account of Unified Command. They were retweeted 4,738 times but sent only 616 tweets, which gives them a rate of 7.7 retweets per tweet. These two accounts, both seen as having a certain kind of local authority (Starbird & Palen, 2010), take on central positions in the network and in the larger “story” around the Oil Spill as communicated on Twitter. Interestingly, they both appear in the same community in the network graph (Figure 4, in aqua). We offer a detailed accounting of each below.

Rank | Twitter User | # of RTs | Affiliation Type | Affiliation Sector | Description
1 | WhoDat35 | 675 | Individual | | Local individual
2 | Oil_Spill_2010 | 601 | Government | Response | Account of Unified Command
3 | GOHSEP | 567 | Government | Response | Local response organization
4 | IBRRC | 403 | NGO | Environment - Wildlife | International Bird Rescue
5 | SaveTheGulf | 342 | Informal Group | Community Response | Local action group
6 | GulfOilCleanup | 318 | Informal Group | Media, Community Response | Informal group of collaborating media partners for Oil Spill relief
7 | TheNewsBlotter | 310 | Individual | Blogger | Local blogger
8 | BPGulfLeak | 198 | Individual | | Individual as a cause
9 | TheOilDrum | 165 | NGO | Media, Energy | Online news about energy
10 | NolaNews | 157 | Organization | Media | Local news outlet

Table 6. Top 10 Most Retweeted Accounts by Locals in the 50-50 Random-Dispersant Sample

Accounts Specific to the 2010 Deepwater Horizon Oil Spill

@GulfOilCleanup, an informal group of collaborating media partners who came together after the event to help with community response efforts, appears in the top ten of both locals and the larger data set. @GulfOilCleanup is one of 21 accounts in the 50-50 Random-Dispersant Sample that were coded as being specific to the 2010 Deepwater Horizon Oil Spill. @SaveTheGulf and @BPGulfLeak, both of which were highly retweeted by locals in our sample, are also accounts initiated specifically to respond, in some way, to the event. Some of these accounts were operated by single individuals, but others, like @GulfOilCleanup, represented groups. Emergent organizations of spontaneous volunteers and other actors are an established feature of disaster response (Dynes, 1970; Kreps & Bosworth, 1994; Tierney et al., 2001). Recent research shows people appropriating social media technologies to improvise collective action online after disaster events (Vieweg et al., 2008; Qu et al., 2009; Starbird & Palen, 2011). The informal groups who operated event-specific accounts can be characterized as emergent response organizations. Some of these groups may have been purely online organizations, but others appear to have had some offline interaction as well (e.g. @GulfOilCleanup and @gulfvolunteers).

Media Reporting from the Ground

Also on the top-ten list for the Total #OilSpill Tweet Collection is Mac McClelland (@MacMcClelland), a journalist from Mother Jones who reported from the ground in affected areas during the spill. Her account is in a central position in our large network graph, but interestingly, she is not highly retweeted by the high-volume #OilSpill tweeters in our 50-50 Random-Dispersant Sample. She appears as the 47th most-retweeted Twitter user by locals and the 24th most-retweeted by remote users in that sample, suggesting less impact among locals. Another media account, @NolaNews, makes the top ten retweeted among locals in the 50-50 Random-Dispersant Sample. Interestingly, that account ranks 59th overall, and 37th among remote Twitter users in our sample, demonstrating that its impact was much higher among highly-engaged locals than others. This effect could be due, in part, to pre-existing following connections between local citizens and local news sites.

The Celebrity Effect

Two celebrities, @Alyssa_Milano and @Jason_Pollock, were among the most retweeted by the larger Twittersphere. @Alyssa_Milano was a very early adopter of Twitter and often tweets about highly visible issues and events. Both of these individuals position themselves as social media activists, and their large follower counts likely contributed to their highly-retweeted status. Though highly retweeted, both appear towards the periphery of the network graph and are not central figures in the long-term #OilSpill conversation. To further illustrate this, for both accounts, the number of different people who retweeted them is quite high in proportion to their overall retweet count—i.e. most people in the set retweeted them only once. This suggests that though celebrities and others with large Twitter followings may be able to attract a large number of people to engage in a topic, this engagement is likely to be low, and therefore the effect may be more superficial than influential. One of these two celebrities was influential in spreading information about oil coming down as rain. The following tweet was retweeted 52 times:

@Jason_Pollock (June 24, 2010): MUST SEE: Its RAINING OIL IN LOUISIANA! (VIDEO) http://j.mp/RainingOilLA #bpFAIL #OilSpill Pls Pass On!

The meme, which contained a video claiming to show evidence of the phenomenon, had begun to spread on June 23, but only among a few accounts, including a few locals. Pollock’s tweet sparked a broad echo effect in the wider Twitterverse, but significantly no one in our 50-50 Random-Dispersant Sample retweeted him.

The Political Echo

In our initial, exploratory qualitative analysis we encountered a large number of tweets of a political nature. That gave rise to some politically-themed coding categories, and we later coded 20 of the 387 accounts in the 50-50 Random-Dispersant Sample as being primarily devoted to political commentary. Many of the political tweets were extremely negative towards the government in general and the response efforts particularly. They also contained a relatively large portion of talk that we later began to classify as related to one of several conspiracy theories that spread in the space. It was therefore not surprising to find a political blogger among the top ten most retweeted in the larger space. @sami_shamieh, at #9, is a right-wing political blogger who sent more than 1,000 #OilSpill tweets, many laying blame on the Obama administration for not adequately responding to the event. Significantly, the retweet network graph (Figure 4) shows that this account was at the core of a secondary conversation about the #OilSpill, and that, though vocal and inflammatory, its influence was not strong within the mainstream #OilSpill conversation. However, some locals did connect at a superficial level with this account, showing some exchange of information between the political fringe and those affected by the spill—an effect that opened up channels for the transmission of conspiracy theories, discussed below.

An In-Depth Look at Three Influential Accounts

Tweeting as an Environmental Watchdog

The NWF was retweeted 7,677 times by 4,103 different Twitter users, suggesting a broad impact and likely reflecting a high number of followers interested in their commentary on the event. In recent years, the NWF has been particularly attentive to its use of social media for promoting the organization’s work (cite presentation/study), and their highly-recommended status during this event may be a by-product of approaching their social media communications with intentionality. They sent 256 #OilSpill tweets during the event. 236 of these (92%) contained a link to a webpage, often to a blog post on their own website or to a source in the mainstream media. They sent only 36 retweets (14%), and 5 of those were of tweets authored by @IBRRC, indicating a connection between the accounts of these two highly-recommended organizations. Evidence suggests that the NWF’s tweets had an effect on the conversation about dispersants as well. Though they sent only four tweets referring to dispersant use, those tweets, which were critical of BP’s strategy for deploying dispersants, were retweeted 111 times.

@NWF (May 20, 2010): EPA demands that BP use a less toxic dispersant http://bit.ly/dAwCzc #oilspill

@NWF (May 25, 2010): Transparency or censored documents? BP holds back on info about the dispersants used in the #OilSpill http://nyti.ms/9yJmcH

The first tweet above was retweeted 21 times, and the second, which contained an implicit accusation about BP’s actions around dispersant use, was retweeted 53 times. These two tweets linked to mainstream media articles—the first to an article in the Washington Post and the second to one in the New York Times. Though these tweets were far from supportive of dispersant use, the following two tweets and the web content to which they linked were substantially more negative about the impacts of dispersants and the motivations for their use:

@NWF (May 10, 2010): Using dispersants on an #oilspill doesn’t reduce the total amount of oil in the environment: http://bit.ly/daeo8q

@NWF (May 25, 2010): BP Still Stonewalling EPA on Dispersant Chemicals http://bit.ly/cmXif8 #oilspill

Of these latter two tweets, the first was sent relatively early in the response period, on May 10. It linked to a blog on the NWF’s website that cited several facts that “you wish you didn’t know about oil spills,” including the following:



“Dispersed oil doesn't disappear. It is simply no longer visible on the surface because it is mixed into the water. The chemicals in dispersants can be harmful to fish and wildlife. Producers are not required to disclose the complete composition of chemical dispersants.”

The second tweet also linked to a blog on the NWF’s website, one that made the following claim:

“Now BP is saying the dispersants are safe for people & wildlife -- trust us. But the National Wildlife Federation won't take [REDACTED] for an answer. We'll continue pushing the federal government to do what BP won't -- conduct proper environmental monitoring, testing & public safety protection.” (This language is exactly as it appeared on the website, including the “REDACTED.”)

The language in this blog suggests a cover-up on the part of BP to conceal the environmental and human health impacts of dispersant use, a claim we see repeatedly in tweets and linked-to sites within this set. This blog positions the NWF as a watchdog in the event, an entity separate from government and corporate interests that will fight to do the right thing for the environment and for human health. This tweet was retweeted 28 times and the one above it 9 times. We hypothesize that the lower retweet rate for the first tweet may be due to the relatively early date of its posting (May 10). Considered in chronological order, the @NWF received consistently more retweets per dispersant-related tweet over time. Two possible explanations for this are that the account itself was reaching a wider audience as the event progressed, or that dispersant use was becoming more of a hot-button issue at the end of May. Indeed, the tweets sent on May 20 coincide with the first spike of dispersant-related conversation in the set—341 tweets were sent that day (see Figure 1). Their final tweet about dispersant use was sent on May 25, long before that conversation was over. It is unclear why they stopped tweeting about dispersants even as they ramped up their overall #OilSpill tweeting.

Local Authority: The Concerned Local

@WhoDat35 is a local Twitter user; her username shows support for the New Orleans Saints football team, for which “Who Dat” is a team catchphrase. She was a high-volume #OilSpill tweeter, sending 5,173 tweets, and she is in our 50-50 Random-Dispersant Sample. She was retweeted 5,261 times, or about once for every tweet she sent. Though her per-tweet impact was not very high, her overall impact on the set was. Significantly, she was retweeted by far fewer unique Twitter accounts than other highly-retweeted accounts in the data. For instance, while the NWF and IBRRC were retweeted by 4,103 and 2,291 different accounts respectively, @WhoDat35 was retweeted by only 877 different accounts. However, these 877 Twitter users were both more engaged with @WhoDat35 (each retweeted her on average 6 times) and more engaged with the #OilSpill conversation over the course of the event. People who retweeted @WhoDat35 sent a mean of 164 and a median of 40 #OilSpill tweets each, far more than the mean (5.25) and median (1) of the data set at large. @WhoDat35 can be seen as having a form of local authority (Starbird & Palen, 2010), because she lived in an affected area. Our network and retweet analyses show her as central and highly influential within the network of high-volume #OilSpill tweeters, especially other locals. She is not the only local in the set—our coded sample identified 40 other accounts that were operated by someone who was in the affected area at some time during the event. However, she is by far the most influential individual who is not associated with an organization or the media. One factor in this may be that she sent a relatively high volume of tweets compared to other accounts. @WhoDat35 ranks fifth in the set by number of tweets sent. Above her are two bots, one account specific to the event (@BPOilpocalypse), and another local (@Seachele420) who had significantly fewer retweets but also played a significant role in the #OilSpill conversation.


A higher volume of tweets means a person is more likely to be retweeted and engaged with in the conversation, and clearly this had some effect on @WhoDat35’s status. But volume is likely not the only factor. @WhoDat35 was also, at times, a source of first-hand reports of oil impacts as she walked her local beaches looking for tarballs and other signs of oil. She was often retweeted for these as well as for second-hand reports of impacts that she found in other sources. Close analysis of @WhoDat35’s tweet stream also shows her to be a measured voice in the discourse. Over 80% of her tweets contain links to external sites, which means that she was using other online sources to back up her claims. In the case of rumors about oil coming down as rain, @WhoDat35 did her own research and tweeted information disputing that claim:

@WhoDat35 (June 25, 2010): Video purports 2 show aftermath of an oily rain in La. EPA says that oily rain is highly unlikely (CSMonitor) http://bit.ly/bpBxxS #oilspill

@WhoDat35 also used the Twitter platform to engage publicly with government accounts related to Oil Spill response. She sent 379 reply tweets that she also marked with the #OilSpill hashtag, intentionally calling them to the attention of others following the conversation. These included four tweets to @WhiteHouse, seven to @BarackObama, 37 to @LisaPJackson, and 26 tweets to the @Oil_Spill_2010 account (described below). Below are a few samples of these tweets:

@WhoDat35 (May 16, 2010): @WhiteHouse We paid for ALVIN & NR-1 with our taxpayer dollars, so we demand deployment now w/media embeded. #oilspill #oceans

@WhoDat35 (May 27, 2010): @BarackObama Mornin Mr. President sir. Please take the WHPressCorps on a tour with Billy Nungesser IN the marsh, not on the shore #oilspill

@WhoDat35 (June 21, 2010): @BarackObama Sir, please do whatever it takes 2 KILL the well NOW. We cant wait until Aug. The Gulf will be a deadzone #oilspill

@WhoDat35 (July 18, 2010): @LisaPJackson : pls consider disbarring BP from ALL federal contracts including those w/Defense Energy Support Center (DESC) #oilspill

These tweets show how one local Twitter user leveraged the platform in an attempt to make her voice heard, questioning the government’s clean-up strategies and giving directives to government officials. These tweets were often retweeted, amplifying these sentiments across the larger information space. From another perspective, these tweets also represent an opportunity for the recipient accounts of government representatives and organizations to respond directly to constituents, in this case to an influential local Twitter user, through the social media platform. @WhoDat35 also engaged with other local users in the set, and sent 44 public tweets to @BP_America, the official account of BP for the spill, including these two:

@WhoDat35 (June 24, 2010): @BP_America During #oilspill cleanup, your co. has a DUTY 2 ensure that workers R protected frm the harms caused by exp. 2 oil & dispersant

@WhoDat35 (June 26, 2010): @BP_America : You call this cleaned? The oil is still beneath the sand! http://www.youtube.com/watch?v=5zQydQB7TOQ #oilspill #blacktide

Like many others in the set, @WhoDat35 was especially critical of BP, accusing them of not doing everything they could to stop and clean up the spill, and also insinuating that they were more concerned with avoiding responsibility than with protecting the environment and the health of people in the region.



A Local’s Fear of Dispersant Use

@WhoDat35 sent 278 tweets about dispersant use—99 of these were retweets and 199 had URL links. A large number of these express a negative sentiment towards dispersant use, and many demonstrate a fear of negative health impacts. The content of these tweets is well-aligned with that of other locals who were also tweeting about these issues. A few examples are listed below:

@WhoDat35 (May 11, 2010): Chemical dispersants: Is the cure worse than the disease? (via @WWL) http://bit.ly/bsuaOq #oilspill #BPoilspill

@WhoDat35 (June 13, 2010): Huge globs of oil mixed in w/dispersant (red) on Ala. Beaches #oilspill #oceans http://twitpic.com/1wmht4

@WhoDat35 (June 20, 2010): @OIL_SPILL_2010 INFORM THE PUBLIC 48 HOURS IN ADVANCE OF ANY AERIAL DISPERSANT SPRAYING IN POPULATED AREAS NOW. #oilspill

@WhoDat35 (July 14, 2010): @LisaPJackson have u conferred with the Dept. Of Ag as to the fallout to our crops of dispersant COREXIT sprays and acid rain?! #oilspill

@WhoDat35 (July 14, 2010): Censored Gulf news: Scientists call on Obama to stop chem-spray (Examiner.com) http://exm.nr/cwXH1H #oilspill #dispersants #STOPCOREXIT

Within the #OilSpill data we collected, dispersant conversation centered around three, often intertwined, issues: human health impacts, environmental impacts, and method of deployment. The first tweet in the above example, which is both a retweet and contains a link to an external site with a full article, suggests that the health impacts are uncertain and possibly negative. This sentiment is often conveyed in the #OilSpill data, along with another view, expressed in the final tweet, that the impacts are known to be extremely negative. As @WhoDat35 does here, other Twitter users often cite other sources for this kind of information, including ones that they view as scientific. Another fear expressed in the set was that dispersants would rain down on people from above, either because they were being directly sprayed on inhabited areas by aircraft or because they could rise into the atmosphere and later come down as toxic rain inland. The third and fourth tweets from the sample above demonstrate those fears. The second and fourth tweets address environmental impact. The second is a Twitpic, likely sent from her phone, that links to an image of oil debris on the beach; in the text of the post @WhoDat35 claims to identify dispersants within an area of oil debris on the Alabama shore. Our in-depth examination of @WhoDat35’s many tweets suggests that she is a relatively “normal” individual. She did not get involved in the more far-fetched conspiracy theories that were rampant in the information space (described below), as some other locals did. In reading through these tweets and tracking back to the sources mentioned in them, our researchers felt that these were reasonable fears considering the information available. Indeed, members of our research team who entered this project without previous knowledge of Oil Spill effects and dispersant use felt many of these same fears as we combed through the information.

Local Authority: The Unified Command Account

@Oil_Spill_2010 was the official account of Unified Command during the summer of 2010. Though their account is now gone and—problematically—another has taken its place, their profile at the time read, “Official tweets from the Deepwater Horizon Incident Joint Information Center (JIC) on Unified Command response efforts to the oil spill in the Gulf of Mexico.” They had significant impact on the #OilSpill conversation, both across the larger set and among local, high-volume Twitter users, as indicated by their high retweet rates (4,738 total retweets from 1,600 different accounts). They sent a total of 616 #OilSpill tweets.


The @Oil_Spill_2010 account was one part of a comprehensive communications strategy employed by Unified Command’s Joint Information Center (JIC) that included liveblogging of press conferences; two event-specific websites with audio, video, and transcripts; a YouTube channel; a Facebook group; a Flickr account; a widget; and text messages. The JIC used the Twitter account to promote their other communication venues and to broadcast information from government organizations, such as health information from the CDC and maps of the spill from NOAA. Each day, a link to a summary of command and government activities was promoted via Twitter along with “status maps” of the event. Actions of the government’s Oil Spill leadership were updated frequently. In addition, non-governmental activities were reported, such as BP America’s announcement that claims were being taken. Occasionally, @Oil_Spill_2010 linked to media stories from national sources like CNN and the New York Times. In addition to serving as a newswire, Twitter was employed by the JIC as a way to monitor public concern and to converse with the public directly. Ideas for capping the well were solicited through Twitter. Questions on topics such as how to volunteer in the clean-up were answered. Responding to the opportunity for direct interaction with the public over their concerns, those operating the @Oil_Spill_2010 account were able to dispel rumors and correct misinformation. The following exchange shows Unified Command using the platform to interact with the public around the use of dispersants, specifically addressing rumors about dispersants being sprayed on people:

@Oil_Spill_2010 (June 10, 2010, 5:28pm): @cindyscott54 @WhoDat Dispersants are only used in and over H2O. EPA has a 24/7 hotline for concerns 1-888-623-0287. #oilspill

@WhoDat35 (June 10, 2010, 5:34pm): @Oil_Spill_2010 thank you for your response. What about workers on boats. Are they warned before aerial sprays r conducted? #oilspill

@Oil_Spill_2010 (June 10, 2010, 6:57pm): @WhoDat35 No dispersants are released within three miles of any boats #oilspill

For the most part, the integrated and comprehensive approach to sharing information and communicating with the public seems exemplary. However, as may be expected, some missteps may have occurred. First, like other response organizations trying to incorporate social media (Hughes & Palen, 2012), they may have had difficulty keeping up with the volume of communications directed at them, and with the real-time pace of social media and the related expectations of its users. As evidence, though their account was publicly addressed 151 times, @Oil_Spill_2010 sent only 31 replies to users. Additionally, their interactions sometimes did not go perfectly. On June 27, 2010, @WhoDat35 was in the New Orleans area and complained about air quality. Her concerns were not addressed by @Oil_Spill_2010, who failed to recognize, due to lost context in the tweet propagation, that the information was indeed being reported from someone in the area, and made a claim that was untrue:

@WhoDat35 (June 27, 2010, 6:54pm): L.A. has smog alerts. There R definitely VOCs in the air in N.O. My eyes and nose are burning. Why no Air Quality alerts here? #oilspill

@Oil_Spill_2010 (June 27, 2010, 7:41pm): @WhoDat @barbiesnow Were in NoLa and have not received any reports. Stand by as we continue to investigate your questions. #oilspill

@Oil_Spill_2010 did not follow up on this exchange, and it may have contributed to distrust from the individual who requested information. Hours later, @WhoDat35 speculated that the lack of information may be intentional:



@WhoDat35 (June 27, 2010, 11:11pm): Captured ALL of the NOAA NWS surface smoke data over SELA that now comes up empty. Was it scrubbed? Who knows? Nite all. #oilspill

This exchange shows a negative effect when interactions go awry. In this case, the Unified Command missed a valuable opportunity to establish trust with a local individual who had significant influence on the larger discourse. The event-specific approach to their communications strategy also led to challenges after the event had “ended.” Much of the rich record of activity documented at the time is inaccessible because the website for the response period (deepwaterhorizonresponse.com) was taken down sometime after the spill was capped. While efforts were made to redirect users to the recovery-focused site that followed it (restorethegulf.gov), the dynamic nature of the site means that a large portion of the event record is unavailable. Further, the Twitter account @Oil_Spill_2010 was cancelled, making it available for use by other entities. It was subsequently assumed by another individual or entity who then posted tweets under this account name, possibly confusing the public.

Characterizing the Accounts of Locals

@WhoDat35 and @Oil_Spill_2010 were two of 41 accounts in our 50-50 Random-Dispersant Sample that were local to the event. Table 2 shows the distribution across location of accounts in this sample, which was skewed towards high-volume #OilSpill tweeters and those who shared information about dispersants. 30 accounts were coded as peripheral—i.e. in states along the Gulf Coast but not near directly affected areas. 276 accounts were remote, and we were unable to code 39 accounts (~10%). Local accounts differed from remote accounts in a couple of important ways. They were more likely to tweet about dispersants—in the 50-50 Random-Dispersant Sample, about 5.8% of locals’ tweets were about dispersants, whereas only 2.4% of remote users’ tweets were dispersant-related. Taking our sampling method into consideration, it is likely that more than 10% of all #OilSpill tweets about dispersants came from locals. Locals in our set also sent more tweets, over a longer period of time, than others. Locals sent, on average, 642 tweets over the course of 65 days, compared to 413 tweets over 48 days for remote users. These distributions are also heavy-tailed. Remote users represent a much higher percentage of Twitter users who contributed only one #OilSpill tweet. Locals in the sample also received far more retweets than remote users—656.7 per user, compared with 106.9. Their retweet-per-tweet rate was also substantially higher: using a log transformation to normalize the distribution, locals received 0.77 log retweets per log tweet, while remote users had a rate of 0.58. In a finding that complements our network analysis, locals were more highly retweeted among a smaller group of people than remote individuals. Each person who retweeted a remote account retweeted them, on average, 2.3 times. That number is 3.0 for locals. Taken together, these findings suggest that people who participated in the #OilSpill conversation over the long term were more likely to value (and amplify) the voices of people who are local to the area, and locals were more likely to connect with and share information from other locals than from remote #OilSpill tweeters. This confirms findings in previous research on Twitter use after crisis events that the crowd values the voices of those who have local authority (Starbird & Palen, 2010; Starbird & Palen, 2012). 22 of the 41 accounts of locals were operated by individuals who were not affiliated with an organization, and seven were individuals who listed an affiliation (e.g. @MacMcClelland, who identified herself as reporting for Mother Jones). Eleven belonged to organizations or other entities, and of those, seven were accounts specific to the event (@Oil_Spill_2010, @GulfOilCleanup, @BPOilSpill, @GulfVolunteers, @CleanTheGulfNow, @WetlandsOfLA, and @PrjGulfImpact).

18 Funding for this project was provided by the University of New Hampshire’s Coastal Response Research Center (NOAA Grant Number: NA07NOS4630143. Contract: 13-003)

SEA-UW CRRC Workshop July 24-25, 2013 Drafts Page 65 DISCUSSION DRAFT – July 15, 2013

These accounts were all started after the spill began and many were focused on galvanizing community response efforts. Almost all have gone silent since the event ended. Five accounts were associated with mainstream media, and five others identified as bloggers in their public Twitter profiles.
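
To make the retweet-rate comparison above concrete, the following is a minimal sketch in Python, assuming the rate is computed per account as log(retweets received) divided by log(tweets sent), which is one plausible reading of the figures reported above. The account records shown are hypothetical placeholders, not values from the coded sample.

```python
import math

# Hypothetical per-account records: (location_code, tweets_sent, retweets_received).
# Real values would come from the coded 50-50 Random-Dispersant Sample.
accounts = [
    ("local", 642, 657),
    ("local", 1150, 2400),
    ("remote", 413, 107),
    ("remote", 35, 6),
]

def log_retweet_rate(tweets_sent, retweets_received):
    """Log-transformed rate: log(retweets received) / log(tweets sent)."""
    if tweets_sent <= 1 or retweets_received < 1:
        return None  # undefined for near-silent or never-retweeted accounts
    return math.log(retweets_received) / math.log(tweets_sent)

for group in ("local", "remote"):
    rates = []
    for location, tweets, retweets in accounts:
        rate = log_retweet_rate(tweets, retweets)
        if location == group and rate is not None:
            rates.append(rate)
    print(group, round(sum(rates) / len(rates), 2))
```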

Volunteerism, Outrage, and Fear

It is difficult to simply characterize the 26,000 tweets sent by locals in our sample. Several themes emerged during our preliminary qualitative analysis and subsequent tweet coding. Though not a primary activity, many did turn to Twitter to post first-hand reports of oil impacts:

@SoloAnn (July 1, 2010): Ft. Morgan, Gulf Shores & Mobile Bay, AL-no oil, no orange goo, no smell. Lots of birds at mouth of the bay. Pelis 2. SEEMS ok. #oilspill

@OceanDog (July 1, 2010): Oiled brown pelican-La. #oilspill http://twitpic.com/21hs3i

@OceanDog (July 3, 2010): Hazmat truck-Long Beach Harbor- driver, dont ask me no questions. #oilspill http://twitpic.com/224pnh #response

@CarmenSisson (July 13, 2010): Spent the evening exploring Biloxi to Waveland. Lots of BP workers on the beach at 9 p.m. using work lamps in Long Beach. #oilspill #response

@SoloAnn (July 13, 2010): The #wildlife sand crabs were few...and discolored. Majority of shells that live on shore are dead. No dead fish though, yet. Gulf Shores #oilspill

Like @WhoDat35, locals @OceanDog, @SoloAnn, and @CarmenSisson spent time during the spill touring coastal areas, taking pictures of oil on the shore, oiled wildlife, and response efforts. Convergence onto the scene is a known phenomenon after disaster events, as is spontaneous volunteerism. It is possible that local Twitter users saw an opportunity to respond to the event by sharing these first-hand reports on impacts. Another local Twitter user in our sample volunteered to help with clean-up efforts, but was turned away when she went to apply. Tweet evidence therefore supports a view that some people want to contribute in a productive way to the event. Another way local Twitter users responded was through communicating outrage at the spill, its causes, its impacts, and the clean-up efforts that they largely felt were insufficient.



Analyzing Mentions in Tweets: Heroes, Villains and Trusted Sources

To explore how the Twitter crowd viewed different sources of information and different actors in the Oil Spill response, we identified all the people, organizations and other entities mentioned in the tweets in the 50-50 Random-Dispersant Sample and coded them along scales of trust and sentiment to determine how Twitter users felt about different entities. During the summer of 2010, Pew reported that BP and Tony Hayward emerged as villains of the oil spill story as told by the mainstream media (2010b), and our analysis of the social media record supports that finding. Among the 272 total mentions of organizations, there were 150 explicit mentions of BP, the vast majority of which were negative, including 44 explicitly negative mentions. Of the 137 mentions of individuals in the sample, Barack Obama was the most mentioned individual (26 times). The sentiment around Obama was mostly negative, although a few tweets defended him. This conflicts somewhat with the Pew study of the media treatment of the U.S. President, which found a more balanced review of his actions and policies, and perhaps reflects the interaction between the political blogosphere and the Twitter #OilSpill conversation. The next most-mentioned individual was Tony Hayward, the CEO of BP, who was mentioned 8 times. The sentiment around Hayward was extremely negative, with many locals bristling at a comment he made about “wanting his life back.” When cited as a source, he was explicitly distrusted:

LScribner (June 17, 2010): Newsflash! BPs CEO Tony Hayward doesnt think BP cut corners on safety to make a profit! Stop the presses! #oilspill hogwash

The EPA: Too Weak to Act But Still Trusted

Exploring how different entities were mentioned offers some insight into public sentiment around the event response. For instance, though Lisa Jackson, who was mentioned five times, was referred to negatively when mentioned as an object in a tweet, tweets in our sample that cited her as a source showed an implicit trust. Twitter users were more ambivalent when talking about the EPA in general, with a mixture of positive and negative tweets about the organization. In our sample, most of the negative EPA mentions indicated anger about its policy on dispersant use. Others showed frustration about the agency’s lack of power and saw it as being pushed around by BP.

GoodTwitty (May 21, 2010): RT @joanneleon: So I guess #BP can override #EPA Nola.com BP is sticking with its dispersant choice http://bit.ly/aiYNmV #oilspill

GulfCoastSpill (May 22, 2010): BP: Screw you EPA: #BP is sticking with its dispersant choice. http://bit.ly/aOz71O #gulf #oilspill

Six of the tweets in our sample that mentioned the EPA were “directive” tweets, in which Twitter users gave the agency advice or orders on what to do. When cited as a source, the EPA was implicitly trusted in seven tweets and distrusted in two, suggesting that many still saw it as a trusted source even though they were unhappy with its policies. Opaque mentions of the “U.S. Government” were either neutral or negative when mentioned as an object, or implicitly distrusted when mentioned as a source. This finding aligns with a Pew survey in June 2010 in which about half of respondents reported having little to no trust in the Federal Government with regard to oil spill information (Pew 2010a). The sentiment around individual politicians mentioned in the set varied; however, sentiment was mostly positive around Bobby Jindal, the governor of Louisiana, and extremely positive around Billy Nungesser, the President of Plaquemines Parish. Both officials were viewed positively for taking a stand against the U.S. government on behalf of the people:



@SoloAnn (May 22, 2010): Venice, LA Nungesser angry, fed up with slow response: http://bit.ly/9wgIxT Oil Spill #oilspill #blacktide The people need to stand up too!

The Other Trusted Sources in the Set? “Scientists”

Of particular interest, tweets in our sample contain seven mentions of otherwise unnamed “scientists,” like the ones below:

@WhoDat35 (July 23, 2010): Leading Ocean Scientists Issue Consensus Statement to End Dispersant Use (CNBC) http://bit.ly/ajhfmq #oilspill #blacktide

@oceandog (July 12, 2010): Researcher says Corexit is toxic at only 2.61 PPM and reacts to warming of Gulf. #oilspill #ecocide

The terms chemist, marine biologist, biologist, researcher and PhD also all appear in our sample, and there are 2360 mentions of scientists across the Total #OilSpill Tweet Collection. When we examine trust according to the role of the person mentioned (e.g. CEO, celebrity, advocate, lawmaker), the most trusted in our coded sample were academic researchers or other scientists, with 22 mentions of implicit trust. In some cases, the researchers cited were explicitly associated with local universities in the text of the tweet, implying both local and expert authority:

@oceansforme (June 8, 2010): Alabama scientist: Dispersant worse than oil on shore - Andalusia Star-News http://bit.ly/d4M0gQ #oilspill #ocean

This suggests the Twitter crowd valued academic credentials. Most tweets that referred to scientists contained links to articles where those sources were cited. In other words, a substantial portion of #OilSpill tweets contained references to cited, scientific content, a finding we return to in the Discussion.

Politics, Activism, and Connection to Past Events

Glenn Beck and Erin Brockovich were both mentioned three times, the former within right-wing blogosphere commentary and the latter by locals after she visited the affected area as an activist to meet with them about their legal rights. Both were positioned in a positive way by those mentioning them. Riki Ott, a scientist and advocate for people affected by oil spills, was mentioned twice in the sample, as were Dick Cheney, Admiral Thad Allen, and George Bush. These last three received mostly negative mentions. The mentions of George Bush, of which there are 323 in the larger set, often co-occur with the term Katrina, marking a significant connection that many made between the 2010 Oil Spill and the hurricane crisis that affected New Orleans in 2005. There are 2428 mentions of “katrina” within the Total #OilSpill Tweet Collection, including the three popular (by number of retweets) tweets below, which again show how partisan, political talk drifted into the oil spill conversation.

RT @anonymized: It took George W Bush two days to waive the Jones Act after Katrina. It took Obama SEVENTY days to do so after #oilspill. (23 retweets)

RT @exposeliberals: I dont know if the #oilspill is Obama’s Katrina, but Obama is certainly America’s Katrina! #tcot #p2 (15 retweets)

RT @anonymized: If one really needs a metaphor, and employs logic to make one, #oilspill is Cheney’s Chernobyl, not Obama’s Katrina. (15 retweets)

Conspiracy Theories

One thing that initially surprised researchers as we delved into the data was the vast number of tweets relating to conspiracy theories, some more far-fetched than others. The digital convergence of political commentators from both sides of the political spectrum likely exacerbated this effect—a fact underscored by the multiple mentions of Glenn Beck in our small sample.
Some of the conspiracy theory conversation was concentrated within the tweets of political commentators and other inflammatory accounts, in some cases event-specific ones, but many of the theories spread out through retweets and links to others in the set, including locals and influencers. Our tweet coding indicated that locals had considerable fear about the event, and our analysis suggests that conspiracy theories may have played a role in these fears. It is also possible that fear played a role in the propagation of conspiracy theories. In “The Righteous Mind,” Haidt (2012) posits that human rationality is subservient to intuition—that our heads follow our gut reactions and not the other way around—and that political division and belief in conspiracy theories both relate to this phenomenon. If this theory is true, then it might explain why seemingly rational, normal citizens tweeted and retweeted conspiracy theory-laden content during the spill. In this case, they created rational arguments to support their underlying fears—the fear of an impending environmental and health disaster, and the fear of government. For many locals, this latter fear was connected to their experiences during Katrina—something many commented on in their tweets. And the former fear is likely rational in some sense, as locals shared photos of oil-covered birds and greasy, black waves coming ashore. This effect certainly played into the fears of oil or dispersants raining down on people. The story of oil raining from the sky later shifted to dispersants coming down from above, either directly from airplanes or as rain. Some of the most-shared photographs and videos in the dispersant-related #OilSpill data are images of planes spraying dispersants. Unified Command was careful to post these photos with a view of the water below, emphasizing that dispersants were being sprayed over sea and not land. But other images do not show the water below, and fearful citizens used them as evidence that chemicals were being dropped on them from above. For some, this explanation became more compelling when people fell ill:

@anonymized (July 12, 2010): .@BP_America I wanna kno what #BP is gonna do for my Daughter age 4 if shes sick b/c of your use of #Corexit in #oilspill #Toxic #blacktide

This may also explain why a notice that oil spill responders might need to be evacuated in advance of a hurricane fed into fears that the whole coast would be evacuated and put into “FEMA camps”:

@britishpollute (June 6, 2010): Gulf Oil Disaster: Planned Event To Cause Mass FEMA Evacuation? http://tinyurl.com/26ofz2l #oilspill 16:45:01

@kate_sheppard (June 3, 2010): Conspiracy du jour: Feds using #BP #oilspill as excuse to put 50 mil people in FEMA camps. And oh yeah, aliens. http://mojo.ly/9bdEy9

The second tweet above, sent in early June 2010, links to a Mother Jones article debunking the theory of FEMA camps and tracing its origin and spread to Tea Party media outlets and bloggers. Indeed, the more sinister versions of this theory did not initially propagate in the space. The Total #OilSpill Tweet Collection contains such tweets between June 3 and June 25—most sent by the same @britishpollute account, which sent this same tweet 34 times—but they were rarely retweeted.
Then an announcement by Unified Command that spill response operations would be evacuated ahead of an impending storm (Alex) set off a new burst of activity around the mass evacuation meme:

@joanneleon (June 25, 8:09am): Acc to Suttles, rig evacuation due to serious storm will take 5 days, another 5 days to return to working #oilspill #p2

@WhoDat35 (June 25, 12:31pm): BP Spill Opps Evacuation To Occur 5 Days Ahead Of Storm Force Winds (WSJ) http://bit.ly/c10L46 #oilspill #blacktide #hurricane

@anonymized (June 25, 5:24pm): Check this video out -- EXTREME RED ALERT! Evacuation about to happen! Storms are mobilizing FEMA http://youtu.be/05xZcyVgI5s #oilspill #gulf


These three tweets show how the Unified Command’s announcement led to a resurgence of an existing fear in the population. In some ways, the affordances of the social media platform may contribute to this effect, as downstream Twitter users who followed @WhoDat35 may not have traversed the link in her tweets (to the Wall Street Journal) to see that the evacuations applied only to responders. Though Twitter can be a self-correcting information space (Mendoza et al., 2010; Sutton, 2010), misinformation often travels further and faster than corrections. The Twitter platform contributed to another conspiracy theory as well: some high-volume #OilSpill tweeters thought they were being censored when their content did not appear in the manner they expected:

@Anonymized (July 25, 2010): #nda #oilspill #censorship || cc: @abcnews Australias #blacktide @noirbp has been BLOCKED / #censored from tweeting #corexit info for 2wks!

Twitter often blocks accounts that appear to be spam and filters the content from “low-quality” accounts from appearing in public streams (Twitter Terms of Service). Twitter does not publish the algorithms it uses—if it did, spammers could easily evolve around them—but tweeting at an extremely high volume and over-using hashtags may cause a change in an account’s status. In this case, the account owners thought it was the #OilSpill content of their tweets that was being “censored” and did not realize the filtering was probably due to other features of their tweets. This fed into a fear, remarked upon in the set, of a widespread cover-up.

Discussion

Sense-making and scientists

Previous research has reported that many people utilize social media platforms after crisis events for collective sense-making activities (Vieweg et al., 2008; Qu et al., 2009; Heverin & Zach, 2012). This research supports those claims, as the #OilSpill tweets show many people, including some locals, coming together through Twitter to try to make sense of the event. Weick describes sense-making as an attempt to make meaning out of experience, and connects it to a need to reduce uncertainty (xxxx). People are not comfortable with uncertainty, and in times of uncertainty they often fear the worst. Researchers in the area of health psychology have theorized that “fearing the worst” is a coping mechanism—i.e., if one prepares for the worst, then one is better able to accept the facts when those worst fears are realized (Sweeny & Cavanaugh, 2012). During the Oil Spill, some people turned to social media platforms to seek information that would help reduce their uncertainty, and to collectively process that information. We want to emphasize here that Twitter users OFTEN sought out and later tweeted/retweeted scientific information and other highly technical resources. These individuals (and groups) were actively searching the information space looking for trustworthy information, and making calculated decisions about what and whom they could trust. Many tweets from @WhoDat35, the most influential local Twitter user in the #OilSpill conversation, demonstrate this:

@WhoDat35 (June 83, 2010): @LisaPJackson Your Gulf EPA #oilspill water sampling data is 6 days old. Please update it. Tx. http://www.epa.gov/bpspill/

@WhoDat35 (June 4, 2010): @BP_America why arent you using peer-reviewed technology of oil-eating bacteria proven to work 4 #oilspill clean-up? http://bit.ly/9gtzLV

Other users in our sample shared similar information, including links to media articles and citing scientific “experts” and technical reports on the efficacy of different dispersants.

When the scientific information they found was conflicting, or when it acknowledged uncertainty in its claims, Twitter users often returned to a place of fearing the worst:

@CarmenSisson (July 13, 2010, 11:30pm): Nearly 1 million barrels of dispersants have been poured into the Gulf, but scientists cant agree on safety. http://bit.ly/d4PGdV #oilspill

@CarmenSisson (July 14, 2010, 11:25am): RT @cleanthegulfnow: Study: Corexit Cuts Survival Rates By 50% http://bit.ly/dhwcSd #oilspill

The tweet excerpts above show another local Twitter user grappling with uncertainty about dispersant use. In the first tweet, sent at 11:30pm on July 13, she cites an article noting a disagreement among scientists on the safety of dispersant use, and in the second, sent 12 hours later, she retweets information citing a study that claims Corexit has negative effects on marine life. In the absence of certainty, she focused on the more negative outcome. For an unprecedented event where the impacts could not be known, and where scientists with different kinds of expertise shared conflicting views, crisis communicators were forced to walk a thin line between aggravating the public’s fears and being less than forthright about the situation.

Implications for Practitioners

Emergency response professionals and other crisis responders are beginning to experiment with social media, and in some cases these platforms are being formally integrated into response plans. Many realize that social media are interactional media, and clearly Unified Command and others understood this already in the Spring of 2010, when they deployed their communications response plan. This research supports the claim (Red Cross, 2010; Hughes & Palen, 2012) that social media users will attempt to contact response agencies through their social media accounts, and they will expect replies, often at a pace that may strain a response agency’s capacity. In many cases, these back and forth communications will be visible to others in the space, and can be an integral part of developing trust with an affected community. Responders can also use social media to detect rumors, misinformation, and concerns among the crowd. This research reveals several important implications for emergency responders who choose to use social media to interact with the crowd:

1. Social media is interactional media. Engage. But only if you can do it well. It is important to keep up with the stream, especially the public messages directed to your accounts. Establishing a social media account in the response space opens up a new communication channel that people will assume they can use. If their messages to responders go ignored, or if they feel disrespected by the response, then the work to engage could be counterproductive.

2. Connect with local users and other influencers. The social media crowd after a crisis event is a global one, but local voices are extremely important in shaping the conversation. One recommendation is to spend some time searching for the influential accounts—finding the most highly retweeted accounts is one way to do this (see the sketch following this list)—and then engage with these accounts in a way that demonstrates respect for both their fears and their drive to be informed.

3. Government sources were seen more negatively and often as less trustworthy than other accounts in the social media crowd.

4. Existing NGOs in the environmental and wildlife protection sectors were, in this event, positively viewed and trusted.
Emergent groups of community responders also developed trust within the local population and across the crowd. Local media were recommended and trusted as well. These groups might make good partners for communication strategies.



5. The social media crowd values academic credentials and scientific information. In their information seeking and through their social media interactions, they are actively trying to make sense of the situation and to reduce their uncertainty. They may also be seeking both to learn the truth and to validate their fears. When these latter two conflict, they may suffer from cognitive dissonance and struggle to align their fears with incoming information. When the “truth” is uncertain, they are likely to assume the worst.

6. People affected by crisis want to help. Develop strategies for soliciting their expertise. Unified Command did use their Twitter account to collect ideas from the crowd for stopping the spill, and this likely had a positive effect on their account’s online reputation within the broader audience. But it may not have helped them connect as well to locals. Meanwhile, several engaged locals were combing the shore looking for oil impacts to share with the crowd. Finding a way to structure and support citizen reporting may be a way of building trust and engagement between responders and the local crowd.

7. One final warning is that responders should take care, when creating event-specific accounts and websites, to ensure they have the resources to keep these alive after the event has ended. After the well had been capped, Unified Command stood down and cancelled their @Oil_Spill_2010 account, and at some point after that, another account took over the name and began tweeting information that was critical of both BP and the government response. Additionally, Unified Command shut down their websites, which led to claims by some that they were covering up information about the response.
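
As a companion to recommendation 2 above, the following is a minimal, hypothetical sketch of one way to surface highly retweeted accounts from an archived tweet collection. The sample tweets, the reliance on classic "RT @user" strings, and the collection format are all assumptions for illustration, not a description of the methods used in this study.

```python
import re
from collections import Counter

# Hypothetical simplified tweet texts; a real collection would come from the
# Twitter API or an archived #OilSpill dataset.
tweets = [
    "RT @WhoDat35: NOAA trajectory maps updated http://bit.ly/example #oilspill",
    "RT @WhoDat35: surface smoke data over SELA comes up empty #oilspill",
    "RT @GulfCoastSpill: BP is sticking with its dispersant choice #oilspill",
    "Oiled brown pelican-La. #oilspill",
]

retweet_counts = Counter()
for text in tweets:
    match = re.match(r"RT @(\w+)", text)  # classic-style "RT @user" retweets only
    if match:
        retweet_counts[match.group(1)] += 1

# The most frequently retweeted accounts are candidate influencers to engage.
for account, count in retweet_counts.most_common(10):
    print(account, count)
```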

Appendix A – Select Coding Categories and Codes

Twitter User Account
  Location: Local; Peripheral; Remote; Unknown
  Affiliation: Individual; Individual w/ Org affiliation; Informal Group; Formal Organization; Individual as a Cause; Individual as an Org; Online “Organization”; Online Service; Other
  Individual Role: Various, e.g. Blogger, Journalist, Lawyer, Advocate, CEO, Executive Director, Entertainer, Elected Official
  Organization Type: NGO; Action Group, not an NGO; Government (US); Government (State); Government (Local); Oil or Dispersant Industry; Research/Education (Academic); Research/Education (Government); Research/Education (Oil Industry); Research/Education (Other); Media (International); Media (National); Media (Local); Media (Alternative); Military (U.S.); Other

Tweet
  Theme: State: Clean up or Spill Status; Response: Clean up strategy; Response: Communications / Cover up; Response: Environmental Policy; Response: Monitoring; Response: Options: Science of Cleanup / Dispersants; Response: Recovery / Assistance; Response: Evacuation / Safety Regulation; Response: Responsibility / Liability; Response: Who is in charge?; Response: Community Response; Response: Wildlife Response; Response: Call to Action; Response: General or Other; Impact: Environmental Impact; Impact: Health Impact; Impact: Mental Health Impact; Impact: Economic Impact; Impact: Political Impact; Impact: General or Other; Driver: Environmental or Commercial Policy; Driver: Corruption; Driver: Us / Our behavior; Other
  Emotion: Neutral; Anger / Outrage; Distrust; Frustration; Fear / Apprehension; Sadness; Disgust / Horror; Happiness / Relief; Neutral Tone, Negative Content; Sarcastic / Negative; Accusatory; General Negative or Other Negative; Supportive / Positive; Undecided / Ambivalent; Other

Mentions
  Name: Write-in
  Individual Role: Various, e.g. Blogger, Journalist, Lawyer, Advocate, CEO, Executive Director, Entertainer, Elected Official
  As Source Sentiment: Explicit Trusted; Implicit Trusted; Neutral; Questioning; Implicit Distrusted; Explicit Distrusted; Unknown – Can’t Code
  As Object Sentiment: Explicit Positive; Implicit Positive; Neutral; Implicit Negative; Explicit Negative; Directive; Other; Unknown
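
To illustrate how these categories fit together, a single coded tweet might be represented as a record like the one below. This is a hypothetical example for illustration only: the field names and values are drawn from the categories above, not from an actual coded tweet in the sample.

```python
# Hypothetical coded record for a single tweet, using the Appendix A categories.
# The values shown are illustrative, not codes assigned to an actual tweet.
coded_tweet = {
    "account": {
        "location": "Local",          # Local / Peripheral / Remote / Unknown
        "affiliation": "Individual",
        "individual_role": "Blogger",
    },
    "tweet": {
        "theme": "Response: Options: Science of Cleanup / Dispersants",
        "emotion": "Fear / Apprehension",
    },
    "mentions": [
        {
            "name": "EPA",
            "organization_type": "Government (US)",
            "as_source_sentiment": "Implicit Trusted",
            "as_object_sentiment": "Implicit Negative",
        }
    ],
}

print(coded_tweet["tweet"]["theme"])
```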

References

Blondel, V. D., Guillaume, J.-L., Lambiotte, R., & Lefebvre, E. (2008). Fast unfolding of communities in large networks. Journal of Statistical Mechanics: Theory and Experiment, 2008, P10008. (http://findcommunities.googlepages.com)

boyd, d., & Crawford, K. (2011). Six Provocations for Big Data. A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, September 2011.
Costolo, D. (2010). Meaningful Growth. Twitter Blog. Available at: https://blog.twitter.com/2010/meaningful-growth

Drabek, T. E. (1986). Human Responses to Disaster: An Inventory of Sociological Findings. New York: Springer-Verlag.
Dynes, R. R. (1970). Organized Behavior in Disaster. Heath.
Fritz, C. E., & Mathewson, J. H. (1957). Convergence behavior in disasters: A problem in social control. Washington, DC: National Academy of Sciences.
Haidt, J. (2012). The Righteous Mind: Why Good People are Divided by Politics and Religion. New York: Vintage Books.
Harvard Humanitarian Initiative. (2011). Disaster Relief 2.0: The Future of Information Sharing in Humanitarian Emergencies. Washington, D.C. and Berkshire, UK: UN Foundation & Vodafone Foundation Technology Partnership.
Heverin, T., & Zach, L. (2012). Use of microblogging for collective sense-making during violent crises: A study of three campus shootings. Journal of the American Society for Information Science and Technology, 63(1), 34-47.
Hughes, A. L., Palen, L., Sutton, J., Liu, S., & Vieweg, S. (2008). “Site-Seeing” in Disaster: An Examination of On-Line Social Convergence. Proc. of the Information Systems for Crisis Response and Management Conference (ISCRAM 2008).
Hughes, A., & Palen, L. (2009). Twitter adoption and use in mass convergence and emergency events. Proc. of the 2009 Information Systems for Crisis Response and Management Conference (ISCRAM 2009).
Hughes, A. L., & Palen, L. (2012). The Evolving Role of the Public Information Officer: An Examination of Social Media in Emergency Management. Journal of Homeland Security and Emergency Management, 9(1), Article 22.
Johnson, B. (2013). “Study: ‘Dirty bathtub’ buried oil from BP spill.” CBS News, January 28, 2013. http://www.cbsnews.com/8301-205_162-57566171/study-dirty-bathtub-buried-oil-from-bp-spill/ Retrieved July 7, 2013.
Kendra, J. M., & Wachtendorf, T. (2003). Reconsidering Convergence and Converger: Legitimacy in Response to the World Trade Center Disaster. Terrorism and Disaster: New Threats, New Ideas: Research in Social Problems and Public Policy, 11, 97-122.
Kreps, G. A., & Bosworth, S. L. (1994). Organizing, Role Enactment, and Disaster: A Structural Theory. Cranbury, NJ: Associated University Presses.
Kwak, H., Lee, C., Park, H., & Moon, S. (2010). What is Twitter, a Social Network or a News Media? In Proceedings of the International World Wide Web Conference (WWW 2010), pp. 591-600. New York: ACM.
Latonero, M., & Shklovski, I. (2011). Emergency Management, Twitter, and Social Media Evangelism. International Journal of Information Systems for Crisis Response and Management, 3(4), 1-16.
Mendoza, M., Poblete, B., & Castillo, C. (2010). Twitter under crisis: can we trust what we RT? In Proceedings of the First Workshop on Social Media Analytics (SOMA '10). New York: ACM, 71-79.
National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling. (2011). “The Use of Surface and Subsea Dispersants During the BP Deepwater Horizon Oil Spill, Staff Working Paper No. 4.” October 6, 2010. Revised January 11, 2011. http://www.oilspillcommission.gov/sites/default/files/documents/Updated%20Dispersants%20Working%20Paper.pdf


Nuclear Regulatory Commission. (2011). Developing an Emergency Risk Communication (ERC)/Joint Information Center (JIC) Plan for a Radiological Emergency.
Palen, L., Anderson, K. M., Mark, G., Martin, J., Sicker, D., Palmer, M., & Grunwald, D. (2010). A Vision for Technology-Mediated Support for Public Participation & Assistance in Mass Emergencies & Disasters. In Proceedings of the 2010 ACM-BCS Visions of Computer Science Conference, 1-12. Edinburgh, UK: British Computer Society.
Pew. (2010a). Gulf Disaster Continues to Dominate Coverage, Interest: News Media Trusted For Information On Oil Leak. June 9, 2010. Pew Research Center for the People and the Press. Web accessed June 18, 2013. http://www.people-press.org/2010/06/09/news-media-trusted-for-information-on-oil-leak/
Pew. (2010b). 100 Days of Gushing Oil – Media Analysis and Quiz. August 25, 2010. Pew Research Center’s Project for Excellence in Journalism. Web accessed June 18, 2013. http://www.journalism.org/analysis_report/100_days_gushing_oil
Qu, Y., Wu, P. F., & Wang, X. (2009). Online Community Response to Major Disaster: A Study of Tianya Forum in the 2008 Sichuan Earthquake. 42nd Hawaii International Conference on System Sciences (HICSS '09), 1-11.
Qu, Y., Huang, C., Zhang, P., & Zhang, J. (2011). Microblogging after a major disaster in China: A case study of the 2010 Yushu Earthquake. In Proc. of the ACM 2011 Conference on Computer Supported Cooperative Work (CSCW 2011). New York: ACM, 25-34.
Rao, L. (2010). Twitter Seeing 90 Million Tweets Per Day, 25 Percent Contain Links. TechCrunch, September 14, 2010. Available at: http://techcrunch.com/2010/09/14/twitter-seeing-90-million-tweets-per-day/
Robertson, C., & Krauss, C. (2010). “Gulf Spill Is the Largest of Its Kind, Scientists Say.” New York Times, August 2, 2010. http://www.nytimes.com/2010/08/03/us/03spill.html?_r=2&fta=y&
Starbird, K., Palen, L., Hughes, A., & Vieweg, S. (2010). Chatter on The Red: What hazards threat reveals about the social life of microblogged information. Proc. of the ACM 2010 Conference on Computer Supported Cooperative Work. New York: ACM, 241-250.
Starbird, K., & Palen, L. (2010). Pass It On?: Retweeting in Mass Emergencies. Proc. of the International Information Systems for Crisis Response and Management Conference (ISCRAM 2010).
Starbird, K., & Palen, L. (2011). ‘Voluntweeters’: Self-organizing by digital volunteers in times of crisis. Proc. of the 2011 ACM Conference on Human Factors in Computing Systems. New York: ACM, 1071-1080.
Suh, B., Hong, L., Pirolli, P., & Chi, E. H. (2010). Want to be Retweeted? Large Scale Analytics on Factors Impacting Retweet in Twitter Network. In IEEE Intl Conference on Social Computing, IEEE, 177-184.
Sutton, J., Palen, L., & Shklovski, I. (2008). Backchannels on the Front Lines: Emergent Use of Social Media in the 2007 Southern California Wildfires. In Proceedings of the Conference on Information Systems for Crisis Response and Management (ISCRAM 2008).
Sutton, J. N. (2010). Twittering Tennessee: Distributed Networks and Collaboration Following a Technological Disaster. In Proceedings of the 7th International ISCRAM Conference. Seattle, USA, May 2010.
Sutton, J. N. (2013). Tweeting the Spill: Online Informal Communications, Social Networks, and Conversational Microstructures during the Deepwater Horizon Oilspill. International Journal of Information Systems for Crisis Response and Management (IJISCRAM), 5(1).


Sweeny, K., & Cavanaugh, A. G. (2012). Waiting is the hardest part: A model of uncertainty navigation in the context of health news. Health Psychology Review, 6(2), 147-164.
Tierney, K., Lindell, M., & Perry, R. W. (2001). Facing the Unexpected: Disaster Preparedness and Response in the United States. Washington, DC: Joseph Henry Press.
Vieweg, S., Palen, L., Liu, S., Hughes, A., & Sutton, J. (2008). Collective Intelligence in Disaster: An Examination of the Phenomenon in the Aftermath of the 2007 Virginia Tech Shootings. Proceedings of the Information Systems for Crisis Response and Management Conference (ISCRAM 2008).
Vieweg, S., Hughes, A., Starbird, K., & Palen, L. (2010). Micro-blogging during two natural hazards events: What Twitter may contribute to situational awareness. Proc. of the 2010 ACM Conference on Human Factors in Computing Systems. New York: ACM, 1079-1088.



Methods for communicating the complexity and uncertainty of response actions and the tradeoffs associated with various response options

(Lead Ann Bostrom, Bob Pavia, Ann Hayward Walker, eventually Kate Starbird, Tom Leschine, Debra Scholz also)

DISCUSSION DRAFT

I. Introduction

Experience with stakeholders and the public on oil spills and dispersant issues from 1980 through the Deepwater Horizon has shown that communicating about dispersants has long been and remains a problem across the country (Walker 2012, 2011a, 2011b, 2011c, 2010, 2001a, 2001b, 1999, 1997; Bostrom et al 1995 and 1996; Pavia 1984, 1985; Pond et al 1997). The research presented here represents one piece of a collaborative social science and natural science research project designed to address public, media and political concerns and develop preparedness recommendations and response tools to facilitate well-balanced decisions under the uncertain conditions of risk that spills represent. Natural science communications can benefit from data-driven social science research and collaborations between the two can lead to new breakthroughs (Schaal, 2012).

To develop strategies for engaging communities and individuals in discussions about spill issues, the overarching project builds on a mental models approach for risk communications and entails a relatively new approach to survey research, analysis of social media data, and integration of relevant social and natural science research findings.

The project has three subsidiary objectives: (1) identify key information needs and areas of confusion and misunderstanding, (2) explore the role of social media in effective risk communication, and (3) identify better methods to communicate scientific uncertainty and complexity with respect to response alternatives. Results from survey research and analysis of Deepwater Horizon Twitter data inform the team's approach for characterizing model constituencies and their communication needs as they relate to dispersants and oil spills. The results are intended to be immediately applicable to promote effective response communications about dispersants and oil spills. Project end users include Unified Command (Federal and State On-scene Coordinators and Incident Commanders representing a spiller, known as the Responsible Party), dispersant decision makers from coastal Regional Response Teams (RRTs), and academia. Many of these key stakeholders are looked to by elected officials/politicians and the public for leadership and assurance about oil spill response options.

Communicating about oil spills and oil spill responses involves conveying not only the logistics and politics of response decisions and actions, but also the science of oil spills and response options (Machlis and McNutt 2011, p 320). And like all science, the science of oil spills and spill response is inherently uncertain. The complex mix of incident-specific variables and unknown information amplifies these scientific uncertainties in spill situations, especially during the initial emergency phase. Tackling this as a risk communication task means acknowledging the uncertainties and complexities; tackling this as a decision support task means providing actionable information. Accordingly, this paper has two aims. The first is to describe current and emerging approaches to conveying uncertainty in risk communications, with an eye toward how these approaches are and can be applied to oil spill response situations. The second is to explore current approaches to tackling the complexity of oil spill response and the sciences behind it, to assess what is currently done to communicate actionable knowledge and information for oil spill response, and how to improve on that.

President Truman famously joked about wanting a one-handed economist, since his economic advisor Nourse was always saying “on the one hand” and “on the other.”

While considering both pros and cons is nearly universally considered an essential element of thoughtful decision making, advising someone to consider the pros and cons of action in a crisis may lead to confusion rather than protective or risk reducing action.

The conflict between simplicity on the one hand and information accuracy and sufficiency on the other is a signature of crisis and emergency communications. Further, even in less critical risk situations, if the scientific and social bases for making a choice are difficult to convey or understand, the resulting confusion can be exploited by parties preferring inaction (Freudenberg et al, 2008).

Any focus on actions requires an assessment of the context: what choices do people have, and what do they need to know in order to act effectively? Oil spill responders face technical decisions and occupational health challenges, in contrast to consumers who face seafood purchase and consumption choices. Even within such categories, levels of expertise and knowledge needs will range widely, with corresponding variation in prior knowledge, information processing capacity (Ericsson and Lehmann, 1996), and ability to handle uncertainty and complexity in decision making (Parker and Fischhoff, 2005).



Much of the advice in vogue in oil spill response focuses on crisis and emergency situations; this guidance emphasizes boiling information down to a handful of specifics (e.g., Covello’s advice, as cited in Reynolds and Seeger 2012 Chapter 1), getting “the facts right” (Reynolds and Seeger 2012 chpt 3), and focusing on actions people should take. Likely more important than the notion of boiling information down to a few small sound bites is recognizing the importance of providing enough specifics for recipients to act on the information in order to effectively mitigate risk (e.g., Wood et al 2012).

II. COMMUNICATING UNCERTAINTY

Uncertainty encompasses a wide range of states, including lack of knowledge (epistemic or model uncertainty fall into this category), natural variability (also called aleatoric uncertainty) (Morgan 1992; Eiser et al 2012), ambiguity (lack of precision or clarity), and ignorance (Smithson, 1989). While experts in the field sometimes insist that it is essential to distinguish between these (e.g., TRB rollover report from NRC), responses to them share some common features. Aversion to ambiguity and uncertainty is a common finding (Camerer and Weber, 1992); people try to avoid them. A consequence of this is that people may prefer point estimates even when they are misleading, and may avoid taking risks even when the net benefits of doing so are apparent and agreed upon. It follows that focusing on the uncertainty in a situation may deter appropriate protective or mitigative actions. However, suppressing known uncertainties may also be regarded as unethical, although not as unethical as distorting information (Smithson 2008).
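
As a clarifying aside not drawn from the sources cited above, the distinction between natural variability and lack of knowledge is often formalized, for a quantity of interest Y that depends on uncertain parameters theta, using the law of total variance:

$$ \operatorname{Var}(Y) \;=\; \underbrace{\operatorname{E}_{\theta}\big[\operatorname{Var}(Y \mid \theta)\big]}_{\text{aleatory (natural variability)}} \;+\; \underbrace{\operatorname{Var}_{\theta}\big(\operatorname{E}[Y \mid \theta]\big)}_{\text{epistemic (uncertainty about }\theta\text{)}} $$

In spill terms, the first term is the spread in outcomes (for example, in oil transport) that would remain even with perfect knowledge of conditions, while the second is the portion that shrinks as responders learn more; only the latter is reducible by further observation.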



While some sources advise acknowledging uncertainty (e.g., Reynolds and Seeger 2012, p. 156), proposals on how to do so are often general, focusing on recognizing that there are uncertainties rather than giving specific guidance on how to communicate particular uncertainties. For example, the CDC CERC principle of acknowledging uncertainty and not overstating what you know is illustrated with this quote:

“I want to acknowledge the importance of uncertainty. At the early stages of an outbreak, there’s much uncertainty, and probably more than everyone would like. Our guidelines and advice are likely to be interim and fluid, subject to change as we learn more.”

Dr. Richard Besser, CDC Acting Director, H1N1 Press Conference, April 23, 2009 (quoted in Reynolds and Seeger, CERC, CDC 2012, p. 14)

Relevant to this discussion is the evidence that “less is more” for information recipients who may face challenges understanding numbers, as distilled from empirical studies of the effects of communicating quantitative information and summarized in the FDA risk communication handbook (Fagerlin and Peters, chpt 6 in Fischhoff et al 2011). The resulting advice steers clear of communicating specific uncertainties around quantitative estimates, with a key message of keeping innumeracy and human information processing limits in mind, generally by finding simple ways of summarizing point estimates. Up to a quarter of the U.S. public may be considered innumerate, and innumeracy is associated with higher reliance on narratives and emotions in decision making (add refs).

There is a rich body of empirical evidence regarding how to communicate uncertainties around numerical estimates that is applicable to oil spill response. Oil amounts, distances, transport rates—many numerical parameters are of interest to oil spill responders and publics concerned about potentially exposed ecosystems, fisheries, or human populations.



In oil spills, communicating uncertainty about oil movement, a topic in which the public has a great interest, is considered important (Beegle-Krause 2001). Experience has shown that it is difficult even for spill response experts to interpret uncertainty in the context of response decisions. It is complex enough that NOAA publishes an extensive interpretation guide for graphical oil spill trajectories (http://docs.lib.noaa.gov/noaa_documents/DWH_IR/reports/NOAA_Oil_Spill_Response/2056_NOAATrajectoryMaps.pdf).
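
One reason trajectory maps are hard to interpret is that the underlying forecasts represent a spread of possible paths rather than a single track. The sketch below is a toy illustration of that idea only (a random-walk stand-in, not NOAA's GNOME model or any operational trajectory algorithm): many simulated particles drift under a mean current plus random perturbations, and the spread of outcomes, rather than one path, is what an uncertainty bound on a trajectory map summarizes.

```python
import random

# Toy 2-D random-walk ensemble. Purely illustrative: real trajectory models
# (e.g., NOAA's GNOME) use forecast currents, winds, and calibrated diffusion.
MEAN_DRIFT = (0.5, 0.1)      # assumed mean current, km per time step
N_PARTICLES, N_STEPS = 500, 48

random.seed(1)
end_points = []
for _ in range(N_PARTICLES):
    x, y = 0.0, 0.0
    for _ in range(N_STEPS):
        x += MEAN_DRIFT[0] + random.gauss(0, 0.2)   # perturbed eastward drift
        y += MEAN_DRIFT[1] + random.gauss(0, 0.2)   # perturbed northward drift
    end_points.append((x, y))

# The spread of end points, not a single deterministic track, is what an
# uncertainty bound or stippled region on a trajectory map summarizes.
xs = sorted(p[0] for p in end_points)
print("central estimate (median eastward drift):", round(xs[len(xs) // 2], 1), "km")
print("90% interval:", round(xs[int(0.05 * len(xs))], 1), "to",
      round(xs[int(0.95 * len(xs))], 1), "km")
```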

And there is considerable press around communication failures stemming from inadequate communications about uncertainties (e.g., oil flow rates from the well head in the BP spill, oil quantities, and so forth) [add refs]. Empirical research suggests that (a) verbal expressions of uncertainty lend themselves to diverse interpretations, depending on context, and so are readily subject to misinterpretation (Budescu et al, 2009; Wallsten and Budescu); (b) visual and graphical representations of uncertainty appear in many cases to be more interpretable than numbers, though numbers may be useful for tasks where they can be applied directly (Shah and others); (c) not all visual and graphical representations of uncertainty are equal (Ancker et al 2006; Cuite et al, 2008), and some graphical presentations of uncertainty have been shown to influence risk preferences (Stone et al 1997); (d) communicating uncertainty may (i) convey an impression of transparency or honesty that promotes good communication (Johnson and Slovic 1995) and improve decision making or trust in the information (Joslyn and LeClerc 2012; Joslyn, Nemec and Savelli 2013), (ii) convey an impression of incompetence (Johnson and Slovic 1995), or (iii) confuse people (Johnson and Slovic 1998); and (e) confirmatory information processing biases may in some instances lead people to select an upper or lower bound estimate that supports their prior beliefs (Viscusi et al, 1991; see also Russo et al 1996), in cases where they have access to a distribution or range of estimates. Finally, evidence suggests that people prefer to communicate uncertainty verbally, although they prefer numbers when they are on the receiving end (Wallsten et al, 1993a,b).

Stemming from these findings are several specific recommendations:

(a) Include numbers with verbal probability descriptions [provide examples] (an illustrative pairing is sketched after this list);

(b) Use simple graphics when possible, bearing in mind that some kinds of graphical representations are more interpretable than others, and that this depends on context, as well as individual numeracy and graphicacy (e.g., Cuite et al, 2008; Shah and Freedman 2009).

(c) Be prepared for increased risk aversion and conflict when communicating uncertainties, as uncertainty increases risk avoidance, and uncertainties give people more leeway to make choices on the basis of their values, which may conflict.

(d) Evaluate communications of uncertainty, as effects may not be predictable (see above).
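
To illustrate recommendation (a), the following minimal sketch pairs a verbal probability phrase with the number it is intended to convey. The probability bands are hypothetical placeholders (loosely patterned on calibrated-language conventions such as the IPCC's), not a scale endorsed by the guidance reviewed here, and the example event is invented.

```python
# Illustrative verbal-numeric probability bands (hypothetical placeholders,
# loosely patterned on calibrated-language conventions; not a prescribed scale).
BANDS = [
    (0.00, 0.10, "very unlikely"),
    (0.10, 0.33, "unlikely"),
    (0.33, 0.66, "about as likely as not"),
    (0.66, 0.90, "likely"),
    (0.90, 1.01, "very likely"),
]

def describe(probability, event):
    """Pair a verbal descriptor with the number it is meant to convey."""
    for low, high, phrase in BANDS:
        if low <= probability < high:
            return f"It is {phrase} (about {probability:.0%}) that {event}."
    raise ValueError("probability must be between 0 and 1")

# Hypothetical example event, for illustration only.
print(describe(0.25, "surface oil reaches the marsh within 72 hours"))
```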

III. COMMUNICATING COMPLEXITY: Simulation and simplification

Many environmental policy decisions are complex and characterized as “wicked,” including oil spill policy and response decisions (Machlis and McNutt 2011). Characteristics that typify wicked problems include persistence, deep scientific uncertainties, conflicting values, and competing definitions. Oil spills and other fossil fuel transport accidents as a class of problems share all of these characteristics. Specific oil spill events exhibit more uniformity of purpose (i.e., containing or cleaning up spills), but still entail deep scientific uncertainties and complexity, in part because they involve ocean ecosystems.

Complexity in decision making can refer to the number of decision attributes, the number of decision alternatives, or to other characteristics of the information available about the decision, for example, including the social or political complexity of the decision context or processes. People prefer a moderate amount of complexity; too little is boring, too much overwhelms (Berlyne, 1966). Preferences for complexity are context dependent. Fear, for example, can reduce preferences for complexity (Berlyne, 1966).

In many circumstances our perceptual, judgment and decision-making processes work to reduce complexity. By selectively directing, enhancing or inhibiting those processes, communications can shape their deployment and effects within limits, either unintentionally or by design. Examples of this kind of shaping include framing effects (Peters et al 2006; Levin, Schneider and Gaeth, 1998) and anchoring effects (Tversky and Kahneman, 1974; Epley and Gilovich, 2006).

Strategies for making decisions under complexity include simplification, simulation, and partitioning or narrowing the scope of the problem. Simplification can be done analytically, through modeling, but is also carried out through storytelling (Kahneman 2011), or mental modeling (Morgan et al, 2002; Gentner and Stevens, 1983; Johnson-Laird 1983), which may mean using analogy (Bostrom 2008; Gentner and Smith 2012), after determining the gist of the problem (on gist extraction, see Adam and Reyna 2005). Mental models are our “inference engines” for simulating events (Bartlett, 1983; Craik 1943; Gentner and Stevens 1983). For further discussion of mental models in the context of oil spill response see Bostrom et al (white paper on stakeholder mental models).

Simplification and satisficing are routine in problem solving and decision making. As put by Simon in 1959 (p 272): “The decision-maker's model of the world encompasses only a minute fraction of all the relevant characteristics of the real environment, and his inferences extract only a minute fraction of all the information that is present even in his model.” Simon goes on to describe how we quite actively—if unwittingly—construct these fractions. While one might like to believe that this is intentional screening of situations to see what is relevant, many of these processes stem from how we process information cognitively. Whether they are adaptive processes depends on the match between the task and the process, which varies. In the ensuing decades, research on behavioral decision making has revealed some of the structure of these active processes, including mental shortcuts (rules of thumb or “heuristics”), biases (such as the tendency to preferentially process information that confirms our prior beliefs), and how these vary as a function of individual differences (e.g., numeracy) or context (e.g., stress or time pressure) (see e.g., Payne, Bettman and Johnson, 1988; Payne, Samper, Bettman and Luce, 2008; Peters et al 2006; Tversky and Kahneman, 1974; Kahneman and Frederick, 2002; Fischhoff, 2012; Kahneman, 2011; Weber and Johnson, 2009). Awareness of these is a first step toward using them to improve the design of communications processes and products.



Storytelling manifests itself in oil spill response both through scenario construction and analysis (Leschine and Pavia draft paper), and through narratives evident in social media (Starbird and Dailey draft paper) and responses to open-ended survey questions (mental models draft paper). Stories have the advantage of engaging people more effectively than statistical evidence (Beach 2009; Kahneman 2011), and the disadvantage of being specific, so that they tend to be relevant only metaphorically or by analogy, and even then are likely misleading (Kahneman 2011, chpt 19).

While use of analogies and metaphors by oil spill response stakeholders remains to be formally analyzed, analogies are prevalent in oil spill risk communications. One example is an API poster that borrows a cake analogy from Raffi Khatchadourian of the New Yorker (see Figure 1). Analogies can be extremely useful (Gentner et al 2011), and even essential to learning and discovery, but they can invite comparisons that evoke unanticipated responses from stakeholders (Johnson 2003; Roth et al 1990; Slovic et al 1990; for further discussion see Bostrom 2008). As with other forms of risk communication, evaluation is essential (Fischhoff et al 2011; see Walker and Pavia draft paper).

Systems dynamics research examining mental models of dynamic systems shows that intuitions often align with simpler, more linear processes and fail to account adequately for accumulation and feedbacks (Booth Sweeney and Sterman 2000, 2007; Moxnes and Saysel, 2009; Sterman 2011). Systems dynamics researchers have had some success in improving people’s intuitions about the performance of nonlinear systems, though it has been modest (e.g., Moxnes and Saysel, 2009; Sterman 2010). Simple linear models generally predict behaviors—such as human performance—as well as or better than experts do, if the models incorporate those variables identified as key by experts (Dawes 1979). However, even analytic linear models may nevertheless fail to explain or predict well, especially for systems such as environmental or ocean ecological systems that are complex and nonlinear (Lorenz 1983; Miles 2009).
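
To illustrate the accumulation dynamics that intuition tends to misjudge in the stock-and-flow tasks cited above, a minimal sketch follows: the stock of oil on the water integrates inflow minus removal, so the stock keeps rising even while removal is ramping up, as long as inflow exceeds removal. All values are hypothetical and chosen only to show the pattern.

```python
# Toy stock-and-flow sketch: oil on the water (the stock) integrates inflow
# (release) minus outflow (removal). All numbers are hypothetical.
release_rate = [50] * 10 + [0] * 10                     # release stops at day 10
removal_rate = [min(5 * day, 40) for day in range(20)]  # response ramps up

stock = 0.0
for day, (inflow, outflow) in enumerate(zip(release_rate, removal_rate)):
    removed = min(outflow, stock + inflow)   # cannot remove more than is present
    stock += inflow - removed
    print(f"day {day:2d}  inflow={inflow:3.0f}  removal={removed:3.0f}  stock={stock:6.1f}")

# The stock keeps rising as long as inflow exceeds removal, even while removal
# capacity is increasing; this is the accumulation pattern intuition tends to misjudge.
```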

NOAA has attempted to create oil spill response simulations for users with a wide range of experience, including those with little or no expertise (such as the General NOAA Operational Modeling Environment, or GNOME, at http://response.restoration.noaa.gov/oil-and-chemical-spills/oil-spills/response-tools/gnome.html, or the Trajectory Analysis Planner, TAP, at http://response.restoration.noaa.gov/oil-and-chemical-spills/oil-spills/response-tools/trajectory-analysis-planner.html). NOAA has also sponsored or developed mapping tools, such as ERMA, the Environmental Response Management Application (http://gomex.erma.noaa.gov/), which is an interactive web mapping tool.

[comment on evaluation/success of these efforts?] There are an increasing number of online simulators for complex systems with which interested parties can play directly (e.g., at http://bit.ly/atmco2, http://bit.ly/stockflow, http://bit.ly/5c4pU3, or http://bit.ly/10Po0ah; also see Demski et al 2013). In the domain of water management, for example, the Decision Center for a Desert City has developed WaterSim on the Web, an online simulation that is now being used in K-12 education (see http://dcdc.asu.edu/watersim/watersim-on-the-web/). The more sophisticated simulations are similar to NOAA's response simulations in that they require downloading and installing software first (e.g., http://www.climateinteractive.org/ and the Decision Theater version of WaterSim). Recently, easily accessible web-based simulators have proliferated, which suggests there may be a role for an easily accessible, simple web-interface simulator of spill response decisions that would illustrate response decision consequences and trade-offs. One such example is the Response Operations Calculator (ROC; http://www.genwest.com/roc), an online tool that allows evaluation of response cleanup methods. ROC allows users to compare combinations of response methods, such as in situ burning, dispersants, and mechanical recovery, under simplified spill scenarios.
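To make the trade-off idea concrete, the following is an illustrative sketch only, not the ROC itself or any NOAA tool: a toy mass-balance comparison of response strategies, with made-up effectiveness rates and volumes, of the kind a simple web-interface simulator could expose to users.

# Illustrative sketch only -- not the Genwest ROC or any NOAA tool. It shows the
# kind of simplified mass-balance trade-off a web-based response simulator could
# expose: oil treated by each countermeasure, with assumed effectiveness rates,
# versus oil left on the water. All rates and volumes are made-up placeholders.

SPILL_VOLUME_BBL = 50_000  # hypothetical spill size

# Assumed fraction of oil each method removes or disperses out of the volume
# it is applied to (placeholders, not measured values).
EFFECTIVENESS = {"mechanical": 0.20, "in_situ_burn": 0.90, "dispersant": 0.45}

def compare_strategies(allocations):
    """allocations: {strategy name: {method: barrels of spilled oil treated}}."""
    results = {}
    for name, allocation in allocations.items():
        applied = {m: allocation.get(m, 0) for m in EFFECTIVENESS}
        if sum(applied.values()) > SPILL_VOLUME_BBL:
            raise ValueError(f"{name}: more oil allocated than was spilled")
        removed = sum(EFFECTIVENESS[m] * v for m, v in applied.items())
        results[name] = SPILL_VOLUME_BBL - removed  # oil remaining on the water
    return results

scenarios = {
    "mechanical only": {"mechanical": 30_000},
    "mixed response":  {"mechanical": 10_000, "in_situ_burn": 5_000, "dispersant": 20_000},
}
for name, remaining in compare_strategies(scenarios).items():
    print(f"{name}: ~{remaining:,.0f} bbl remaining")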

IV. DISCUSSION

This paper highlights six recommendations to complement those developed in other parts of this project (Starbird and Dailey, Walker and Pavia, Leschine and Pavia, and Bostrom et al):

1) Include numbers with verbal probability descriptions (see the sketch following this list);

2) Use simple graphics when possible to communicate probability and uncertainty, bearing in mind that some kinds of graphical representations are more interpretable than others, and that this depends on context as well as individual numeracy and graphicacy;

3) Be prepared for value conflicts and increased risk aversion when communicating uncertainties;


4) Evaluate communications of uncertainty, as effects may not be predictable;

5) Take into account how people simplify information when designing communications (and evaluate); and

6) Develop interactive web-based oil spill response simulations that help users explore tradeoffs in oil spill response decisions.
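As an illustration of recommendation 1, the sketch below pairs verbal likelihood terms with numeric ranges in outgoing text. The bands shown follow the IPCC likelihood scale discussed by Budescu et al. (2009); a response organization would substitute its own calibrated terms and values.

# Sketch of recommendation 1: pair a verbal likelihood term with a numeric
# range in outgoing text. The bands below follow the IPCC likelihood scale
# discussed by Budescu et al. (2009); a response team would substitute its own
# calibrated terms and probabilities.

LIKELIHOOD_BANDS = {
    "virtually certain":       ">99%",
    "very likely":             ">90%",
    "likely":                  ">66%",
    "about as likely as not":  "33-66%",
    "unlikely":                "<33%",
    "very unlikely":           "<10%",
    "exceptionally unlikely":  "<1%",
}

def with_numbers(statement: str) -> str:
    """Append the numeric range to the first verbal likelihood term found."""
    # Check longer phrases first so "very unlikely" is not matched as "likely".
    for term in sorted(LIKELIHOOD_BANDS, key=len, reverse=True):
        if term in statement:
            return statement.replace(term, f"{term} ({LIKELIHOOD_BANDS[term]} chance)")
    return statement

print(with_numbers("Dispersed oil is very unlikely to reach shoreline marshes."))
# -> "Dispersed oil is very unlikely (<10% chance) to reach shoreline marshes."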

These recommendations and the insights identified in this paper come from behavioral decision research, a field which has only recently begun to achieve recognition in government operations. The Office of Science and Technology Policy in the U.S. is now (July 2013) hiring for a behavioral insights team, inspired by the Behavioral Insights Team (BIT, or "nudge unit," after Thaler and Sunstein, 2008) commissioned by UK Prime Minister David Cameron in 2010 (see https://www.gov.uk/government/organisations/behavioural-insights-team). The aim is to use rapid, iterative experimentation to identify and test interventions that will advance government priorities and save government money. Thus it is an opportune moment for NOAA and other federal agencies involved in oil spill response to familiarize themselves with the empirical research strategies used in behavioral decision research, and to use them to achieve improvements in oil spill response communications and management.
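A hedged sketch of what such rapid, iterative experimentation could look like for spill communications: comparing comprehension of two message variants with a two-proportion z-test. The message wording and counts below are hypothetical placeholders, not data from this project.

# Minimal sketch (illustrative only): comparing two dispersant-message variants
# with a two-proportion z-test, the kind of rapid experiment a behavioral
# insights team might run. The counts below are hypothetical placeholders.
from math import sqrt, erfc

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Return (z, two-sided p) for H0: the two proportions are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal tail probability
    return z, p_value

# Hypothetical results: respondents who correctly interpreted each message
verbal_only     = (52, 100)  # "dispersed oil is unlikely to reach the beach"
verbal_plus_num = (68, 100)  # same statement with "less than 1 in 10 chance"

z, p = two_proportion_ztest(*verbal_plus_num, *verbal_only)
print(f"z = {z:.2f}, p = {p:.3f}")  # flags whether the numeric variant helped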


Figure 1. API Poster (source?)

REFERENCES

Adam, M. B. and Reyna, V. F. (2005), Coherence and correspondence criteria for rationality: experts' estimation of risks of sexually transmitted infections. J. Behav. Decis. Making, 18: 169–186. doi: 10.1002/bdm.493

Ancker, J. S., Senathirajah, Y., Kukafka, R., & Starren, J. B. (2006). Design features of graphs in health risk communication: a systematic review. Journal of the American Medical Informatics Association, 13(6), 608-618.

Beach, L. R. (2009). Narrative Thinking and Decision Making: How the Stories We Tell Ourselves Shape our Decisions, and Vice Versa. Online publication: www.LeeRoyBeach.com.

Beegle-Krause J. (2001) General NOAA Oil Modeling Environment (GNOME): A New Spill Trajectory Model. International Oil Spill Conference Proceedings: March 2001, Vol. 2001, No. 2, pp. 865-871


Berlyne, D. E. Curiosity and Exploration. Science , New Series, Vol. 153, No. 3731 (Jul. 1, 1966), pp. 25-33

Booth Sweeney, L. and Sterman, J. (2000) Bathtub Dynamics: Initial Results of a Systems Thinking Inventory. System Dynamics Review, 16(4), 249-294.

Booth Sweeney, L. and Sterman, J. (2007) Thinking About Systems: Students’ and Their Teachers’ Conceptions of Natural and Social Systems. System Dynamics Review, 23(2-3), 285-312.

Bostrom, A. “Lead is like mercury: risk comparisons, analogies and mental models.” Journal of Risk Research, 11(1-2), 99-117, 2008.

Bostrom, A., P. Fischbeck, JH Kucklick, R. Pond, A. Hayward Walker. Ecological Issues in Dispersant Use: Decision-Makers’ Perceptions and Information Needs. Scientific and Environmental Assoc. Inc., for the Marine Preservation Association. Washington DC, Oct 31, 1997.

Bostrom, A., Fischbeck, P., Kucklick, J.H., and Hayward Walker, A.H. A Mental Models Approach to the Preparation of Summary Reports on Ecological Issues Related to Dispersant Use. Marine Spill Response Corporation, Technical Report Series 95-019. Washington, D.C., 1996.

Budescu D, Broomell S and Por HH (2009). Improving Communication of Uncertainty in the Reports of the Intergovernmental Panel on Climate Change. Psychological Science 2009 20: 299- 308.

Camerer, C. and M. Weber (1992). Recent developments in modeling preferences: Uncertainty and ambiguity. Journal of Risk and Uncertainty, 5(4), 325-370. DOI: 10.1007/BF00122575

Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. Sternberg (Ed.), Advances in the psychology of human intelligence (pp. 17–76). Hillsdale, NJ: Erlbaum.

Cuite, C.L., N.D. Weinstein, K. Emmons, and G. Colditz. 2008. A test of numeric formats for communicating risk probabilities. Medical Decision Making 28: 377–84.

Dawes, RM (1979). The robust beauty of improper linear models in decision making. Am Psychology 34:571-582.

Demski, C. Spence, A. and Pidgeon, N. (2013) Summary findings of a survey conducted in August 2012 – Transforming the UK Energy System: Public Values, Attitudes and Acceptability Working Paper. (UKERC: London)

Eiser, J. Richard, Ann Bostrom, Ian Burton, David M. Johnston, John McClure, Douglas Paton, Joop van der Pligt, and Mathew P. White. "Risk interpretation and action: A conceptual framework for responses to natural hazards." International Journal of Disaster Risk Reduction 1 (2012): 5-16.

Epley, N., & Gilovich, T. (2006). The anchoring-and-adjustment heuristic: Why the adjustments are insufficient. Psychological Science, 17, 311–318.

Ericsson K. A. and A. C. Lehmann. Expert and Exceptional Performance: Evidence of Maximal Adaptation to Task Constraints. Annu. Rev. Psychol. 1996.47:273-305.

Fischhoff, B. (2012). Judgment and decision making. Chapter 1 (pp. 3–22) in Judgment and Decision Making. Earthscan.

Fischhoff B, Brewer NT and Downs JS (eds). Communicating risks and benefits: an evidence-based user's guide. Silver Spring, MD: U.S. Dept. of Health and Human Services, Food and Drug Administration, 2011. Accessed 10 July 2013 at: http://www.fda.gov/downloads/AboutFDA/ReportsManualsForms/Reports/UCM268069.pdf

Freudenburg WR, Gramling R, Davidson DJ. Scientific certainty argumentation methods (SCAMs): science and the politics of doubt. Sociol Inq 2008, 78:2–38.

Gentner, D., Holyoak, K. J., & Kokinov, B. N. (Eds.) (2001). The analogical mind: Perspectives from cognitive science. Cambridge, MA: MIT Press.

Gentner, D. & Smith, L. (2012). Analogical reasoning. In V. S. Ramachandran (Ed.) Encyclopedia of Human Behavior (2nd Ed.). pp. 130-136. Oxford, UK: Elsevier.

Gentner, D., & Stevens, A. L. (Eds.). (1983). Mental models. Hillsdale, NJ: Lawrence Erlbaum Associates.

Hegarty, Canham and Fabrikant 2010. Thinking about the weather (good display design)

Johnson, B. B. (2003), Are Some Risk Comparisons More Effective Under Conflict?: A Replication and Extension of Roth et al. Risk Analysis, 23: 767–780.

Johnson, B. B., & Slovic, P. (1995). Presenting uncertainty in health risk assessment: Initial studies of its effects on risk perception and trust. Risk Analysis, 15(4), 485–494.

Johnson, B. B., & Slovic, P. (1998). Lay views on uncertainty in environmental health risk assessment. Journal of Risk Research, 1,261–279.

Joslyn, S., Nemec, L., & Savelli, S. (2013). The Benefits and Challenges of Predictive Interval Forecasts and Verification Graphics for End Users. Weather, Climate, and Society, 5(2), 133-147.


Joslyn S and J. LeClerc, 2012: Uncertainty forecasts improve weather related decisions and attenuate the effects of forecast error. J. Exp. Psychol., 18, 126–140, doi:10.1037/a0025185.

Kahneman, D., & Frederick, S. (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press.

Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Lakoff, George. Women, Fire and Dangerous Things: What Categories Reveal about the Mind. Chicago, The University of Chicago Press, 1987.

Levin, I.P., Schneider, S.L., & Gaeth, G.J. (1998). All frames are not created equal: A typology and critical analysis of framing effects. Organizational Behavior and Human Decision Processes, 76, 149–188.

Lorenz, E. (1963): Deterministic nonperiodic flow. Journal of Atmospheric Sciences, 20, 130 – 141.

Machlis, G.E., and M.K. McNutt. 2011. Ocean policy: Black swans, wicked problems, and science during crises. Oceanography 24(3):318–320, http://dx.doi.org/10.5670/oceanog.2011.89.

Miles, EL On the Increasing Vulnerability of the World Ocean to Multiple Stresses. Annual Review of Environment and Resources, Volume: 34 Pages: 17-41: 2009

Morgan, M. G. (1992). Uncertainty: a guide to dealing with uncertainty in quantitative risk and policy analysis. Cambridge University Press.

Moser, Susanne C., S. Jeffress Williams, and Donald F. Boesch (2012). Wicked Challenges at Land's End: Managing Coastal Vulnerability Under Climate Change, Annual Review of Environment and Resources, Vol. 37: 51–78.

Moxnes, E., and Saysel, A.K. (2009) Misperceptions of global climate change: information policies. Climatic Change 93(1-2), 15-37.

Parker, A. M., & Fischhoff, B. (2005). Decisionmaking competence: External validation through an individual-differences approach. Journal of Behavioral Decision Making , 18 , 1–27.

Pavia, R. and L.A. Onstad. 1985. Plans for Integrating Dispersant Use In California. Proceedings, 1985 International Oil Spill Conference. Washington, D.C. American Petroleum Institute. pp. 85-88.


Pavia, R. and R.W. Smith. 1984. Development and Implementation of Guidelines for Dispersant Use. In Oil Spill Chemical Dispersants: Research, Experience, and Recommendations, T.E. Allen, ed. STP840. Philadelphia: ASTM. pp. 378-389.

Payne, J.W., Bettman, J.R., & Johnson, E.J. (1988). Adaptive strategy selection in decision making. Journal of Experimental Psychology: Learning, Memory and Cognition, 14, 534-552.

Payne, J. W., Samper, A., Bettman, J. R., & Luce, M. F. (2008). Boundary conditions on unconscious thought in complex decision making. Psychological Science, 19, 1118-1123.

Peters, Ellen; Västfjäll, Daniel; Slovic, Paul; Mertz, C.K.; Mazzocco, Ketti; Dickert, Stephan. Numeracy and Decision Making. Psychological Science (Wiley-Blackwell). May2006, Vol. 17 Issue 5, p 407-413.

Pond, R., J.H. Kucklick, A.H. Walker, A. Bostrom, P. Fischbeck, and D. Aurand. 1997. Bridging the Gap for Effective Dispersant Decisions through Risk Communication. In: Proceedings of the 1997 International Oil Spill Conference, April 7-10, Ft. Lauderdale, FL. American Petroleum Institute. Washington, DC. pp. 753-759.

Khatchadourian R. A reporter at large: The Gulf War – Were there any heroes in the BP oil disaster? New Yorker, March 14, 2011. Accessed July 10, 2013, http://www.newyorker.com/reporting/2011/03/14/110314fa_fact_khatchadourian

Reynolds, B and M. Seeger, Crisis and Emergency Risk Communications Handbook, CDC 2012 edition.

Roth, E., Morgan, M. G., Fischhoff, B., Lave, L., & Bostrom, A. (1990). What do we know about making risk comparisons? Risk Analysis, 10(3), 375-387.

Russo JE, Medvec VH, Meloy MG. The distortion of information during decisions. Organizational Behavior and Human Decision Processes 1996;66:102–10.

Shah, Priti and Eric G. Freedman. (2009) Bar and Line Graph Comprehension: An Interaction of Top-Down and Bottom-Up Processes. Topics in Cognitive Science, 1–19.

Shah and Miyake

Simon, H. A. (1955). A behavioral model of rational choice. Quarterly Journal of Economics , 59 , 99–118.

Simon, H. A. (1956). Rational choice and the structure of the environment. Psychological Review , 63 , 129–138.

Slovic, P., Kraus, N., & Covello, V. T. (1990). What Should We Know About Making Risk Comparisons? Risk Analysis, 10(3), 389-392.

Smithson, M. (1989). Ignorance and Uncertainty: Emerging Paradigms. New York: Springer Verlag.

Smithson, M. (2008). ‘The many faces and masks of uncertainty’. In: Bammer, G. and Smithson, M. (eds). Uncertainty and Risk: Multidisciplinary perspectives. London: Earthscan, 13–25.

Stone, E. R., Yates, J. F., & Parker, A. M. (1997). Effects of numerical and graphical displays on professed risk-taking behavior. Journal of Experimental Psychology: Applied, 3(4), 243-256..

Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Yale University Press.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1130.

Walker, Ann Hayward (2012). Integrating Risk Communications with Crisis Communications Real-Time. Interspill 2012, London, UK, March 14, 2012.

Walker, Ann Hayward (2011c). Oil Spill Perceptions and Operational Science. Society for Risk Analysis Annual Meeting, Charleston, SC, December 6, 2011.

Walker, Ann Hayward (2010). Aerial Dispersants on the Deepwater Horizon Spill Response. Pacific States/British Columbia Oil Spill Task Force 2010 Annual Meeting, Honolulu, HI, October 6, 2010.

Walker, Ann Hayward (2011a). Dispersant Risk Communication Needs. SETAC Gulf Oil Spill Meeting, Pensacola, FL, April 27, 2011.

Walker, Ann Hayward (2011b). Risk Communications for Significant Oil Spills. International Oil Spill Conference, Portland, OR, May 2011.

Walker, Ann Hayward, Debra Scholz, Don V. Aurand, Robert G. Pond, and James R. Clark (2001) Lessons Learned in Ecological Risk Assessment Planning Efforts. International Oil Spill Conference Proceedings: March 2001, Vol. 2001, No. 1, pp. 185- 190.

Walker, Ann Hayward, Debra Scholz, John N. Boyd, Ed Levine, and Eric Moser (2001) Using the Pieces to Solve the Puzzle: A Framework for Making Decisions About Applied Response Technologies. International Oil Spill Conference Proceedings: March 2001, Vol. 2001, No. 1, pp. 503-508.


Walker, A.H., D.K. Scholz, J.H. Kucklick, R.G. Pond. 1999. Government and Industry Partnering: Nationwide Progress in Pre-authorization Agreements since 1994. In: Proceedings of the 1999 International Oil Spill Conference, Seattle, WA. American Petroleum Institute, Washington, DC. 6 p.

Walker, A.H., R. Pond, and J.H. Kucklick. 1997. Using Existing Data to Make Decisions about Chemical Countermeasure Products. In: Proceedings of the 1997 International Oil Spill Conference, April 7-10, Ft. Lauderdale, FL. American Petroleum Institute, Washington, DC. pp. 403-408.

Wallsten, T. S., Budescu, D. V., & Zwick, R. (1993). Comparing the calibration and coherence of numerical and verbal probability judgments. Management Science, 39, 176- 190.

Wallsten, T. S., Budescu, D. V., Zwick, R., & Kemp, S. M. (1993). Preferences and reasons for communicating probabilistic information in verbal or numerical terms. Bulletin of the Psychonomic Society, 31, 135-138.

Weber, E. U. & Johnson, E. J. (2009). Mindful judgment and decision making. Annual Review of Psychology.

Wood, MM, Mileti DS, Kano M, Kelley MM, Regan R and Bourque LB. Communicating Actionable Risk for Terrorism and Other Hazards. Risk Analysis, Vol. 32, No. 4, 2012.


Draft White Paper for Review:

Best Practices for Community and Stakeholder Engagement in Oil Spill Preparedness and Response

By Ann Hayward Walker and Bob Pavia (eventually other team members)

Section I. INTRODUCTION

The Deepwater Horizon (DWH) spill was a pivotal moment in spill preparedness and response due in part to the widespread concerns raised about dispersant effectiveness, effects, and safety. The public became deeply engaged in this question through both traditional and social media, arguably more so than in prior spills. The DWH response organization in multiple locations undertook a wide variety of activities to communicate practices and risks to both the general public and to those most directly affected by dispersant use, such as commercial fishers. In many ways those communication activities could not overcome widespread concerns about ecological and human health risks associated with dispersant use (refs?). In this regard, the Deepwater Horizon spill heightened awareness of persistent risk communication problems around oil spill response and dispersant use.

Experience with stakeholders and the public on oil spills and dispersant issues from 1980 through the Deepwater Horizon has shown that communicating about dispersants has long been and remains a problem across the country (Walker 2012, 2011a, 2011b, 2011c, 2010, 2001a, 2001b, 1999, 1997; Bostrom et al 1995 and 1996; Pavia 1984, 1985). In light of the review of best practices undertaken for this paper, the Deepwater Horizon response experience suggests that both institutional and operational factors in oil spill response have inhibited rather than promoted engaging communities and stakeholders in oil spill preparedness and response. In this paper we assess current best practices for community and stakeholder engagement in oil spill preparedness and response, as well as the institutional and operational conditions that prevent or promote them. The results are intended to be immediately applicable to promote effective response communications about dispersants and oil spills. Project end users include Unified Command (Federal and State On-scene Coordinators and Incident Commanders representing the spillers known as Responsible Party - RP), dispersant decision makers from coastal Regional Response Teams (RRTs), and academia. Many of these key stakeholders are looked to by elected officials/politicians and the public for assurance about oil spill response options.

Section II. STAKEHOLDER NEEDS

Stakeholders are defined as those groups that have a stake/interest/right in an issue or activity, i.e., an oil spill, and those that will be affected either negatively or positively by decisions about the issue or activity. They include relevant government agencies, formal and informal resource users, private sector entities, and communities (adapted from the UN REDD Programme). Examples of oil spill stakeholder groups are shown in Table 1.


Unified Command* (Federal On-scene Coordinator, e.g., USCG, USEPA; State On-scene Coordinator(s); Responsible Party/"the spiller"; Affected Local Community; Affected Tribal Nation) | Media and the Public

Regional Response Teams* (15 Federal agencies and appropriate state representatives for each state in a Federal region) | Affected Coastal Users and Industries, e.g., fishing/seafood, tourism, industrial water users

Resource Trustees* (federal, state and tribal), e.g., NOAA, US DOI | Scientific/Academic Community

Spill Managers, Operations Specialists and Practitioners | Affected Community, including property owners in the vicinity of the spill and renewable resource communities, e.g., subsistence and other communities dependent upon renewable resources

State and Local Emergency Managers | Environmental Groups/NGOs, e.g., National Wildlife Federation (NWF)

Elected and Appointed Officials, and their constituents | Oil, Gas, Marine Industry

Table 1. US stakeholder groups in oil spill response (adapted from Walker, 2011), updated to reflect post-Deepwater Horizon changes to the USCG Incident Management Handbook. *Decision authority in regulation for oil spill preparedness and response

One of the outcomes of the Deepwater Horizon spill has been a sustained effort to identify how community and stakeholder communication about dispersant use can be improved. The Coastal Response Research Center, for example, hosted workshops to identify research priorities with regard to dispersants and risk communication (CRRC, 2012), as listed in Table 2.

As the objectives, guidelines and issues listed in Table 2 demonstrate, there is a significant need to address gaps in the current practices for communicating both inside and outside of the response infrastructure. This white paper identifies practices to fill those gaps based on literature review and practitioner experience. Improving communications about dispersants going forward will require responder, stakeholder and community engagement, while assessing and recognizing the influence that governmental and responsible party legal teams have over the process and content of external communications during spill response. The recommended practices in the final section of this paper will provide example approaches to content creation, audience targeting, means of information exchange and dissemination, and mechanisms for analytics and feedback for oil spill preparedness and response communications. They will include approaches for use in conjunction with the Incident Command System (ICS) during a response and for a range of preparedness activities before and after responses.


Objectives: Supply laypeople (political/elected officials/general public/local stakeholders) with credible information they need to make informed judgments about risk to health, safety, and environmental tradeoffs associated with oil spill response, including dispersant application.

Guidelines: Identify what the information needs are based on stakeholder group perspective (culturally sensitive) and develop recommendations for mechanisms to meet these information needs and expectations using multiple research methods (e.g., focus groups, surveys, structured interviews).

Issues/Problems: Acknowledge external (general public) stakeholder perception that unified command inherently involves a conflict of interest (e.g., transparency on the release of proprietary ingredients).

Table 2. Priority Dispersants and Risk Communication research needs (Coastal Response Research Center, 2012).

Engagement represents an opportunity for preparedness and response organizations to learn about the risk perceptions and concerns of stakeholders and communities, share technical information, and establish constructive relationships and dialogue about oil spills and response options, such as dispersants (Walker et al, 2013). Community engagement is the process of working collaboratively with and through groups of people affiliated by geographic proximity, special interest, or similar situations to address issues affecting the well-being of those people (ATSDR, 2011). Generally, for oil spill response, community refers to groups at the local level of government, e.g., those in a county, but it also encompasses those affiliated by a common interest, e.g., renewable resource communities.

Internal stakeholders would make and implement preparedness and response decisions; external oil spill stakeholders would be those affected by preparedness and response decisions.

Stakeholder engagement is a fundamental accountability mechanism that obliges an organization and its stakeholders to identify, understand, and respond to concerns about issues (AccountAbility 2011) that affect both the stakeholders and the organization, in this case the response organizations involved in managing oil spills. Following the Exxon Valdez oil spill in 1989, oil spill response organizations in the US began adopting the Incident Command System (ICS) as a common framework for managing oil spills. In 2003 ICS, a component of the National Incident Management System (NIMS), became the national approach for managing all incidents in the US (HSPD-5). Unified Command, which appeared first in the 1994 revision to the US Oil and Hazardous Substances Pollution Contingency Plan (NCP), is an ICS structure to enable joint incident command based on geographic jurisdiction and legal responsibility and authority.

Critical success factors (CSFs) for oil spills (Walker et al, 1995; Harrald, 2006) indicate that external communications with stakeholders and the public must be effective for an oil spill response to be regarded as successful. A European report on risk communication for accidental oil pollution identified seven communications mistakes following the Prestige oil spill in Spain, one of which was the absence of interaction and direct communication with those affected in the local area suffering from the accident (AMPERA, 2007).


Indicators of successful community engagement (Walker et al. 2013) which relate to the 1995 oil spill response critical success factors are:

• The community trusts and has confidence in the response organization and other relevant authorities.
• Community and stakeholder response expectations are realistic and achievable.
• Community leadership and stakeholders are engaged, communicate, and exchange timely and reliable information that meets the needs of their constituency and the response, which will promote realistic and credible response decision-making, facilitate appropriate compensation for damages, and promote recovery.

Walker et al. (2013) suggested the following purposes for oil spill community and stakeholder engagement:

• Implementing an effective, timely, and credible process for stakeholder interactions during preparedness and response; and
• Providing situation-appropriate, credible and timely assistance and input to Unified Command decisions from oil spill stakeholders during preparedness and response.

A suggested scope of stakeholder engagement for oil spills would be to incorporate stakeholders in procedures and organizational arrangements, e.g., interface with the ICS response organization to improve stakeholder understanding about oil spills and response options, and to mitigate ecological and human dimension impacts from oil spills.

Comparisons of lay and expert risk perceptions, together with research on the effects of risk communication, indicate that expertise and information can have large effects on risk perceptions. Cognitive processes such as categorization, similarity judgments, and inference from mental models are, from an information processing perspective, all components of risk perception (Bostrom, 2005). A previous study of the mental models of US dispersant decision makers (Bostrom et al. 1995, 1997) showed a significant divergence in fundamental understanding of oil weathering and fate processes among decision makers, even without the addition of dispersants. To be effective in addressing risk perceptions around dispersants, the technical content of dispersant communications needs to address stakeholder understanding about both oil and dispersed oil composition, fate and effects processes, in order to enable response decision making. The mental models and risk perceptions white paper for this project reviews these principles and process in detail.

Federal agencies, e.g., the Nuclear Regulatory Commission (NRC), US Environmental Protection Agency, US Coast Guard, and Centers for Disease Control (CDC), as well as the World Health Organization (WHO), have turned to consultants to apply their practitioner knowledge in guidance on risk communication in relation to environmental and public health emergencies. This work emphasizes the need for, and provides guidance about, ways to improve media communications, in addition to recognizing the value of engaging directly and indirectly with stakeholders and communities interested in or affected by an incident. Guidance to improve media messaging and communications has special appeal because of the trust that the public invests in the media as a source of oil spill information (PEW, 2010). Agencies and large organizations generally have dedicated public affairs staff to implement recommendations for external communications. Some agencies have developed guidance on community and/or stakeholder engagement (US EPA 2005, ATSDR 2011, USNMC) in addition to or as part of risk communication guidance.

Communications shortfalls and areas of improvement identified following the Deepwater Horizon oil spill largely focus on the traditional areas of crisis communications, public affairs and communications technology (National Commission, 2011 and USCG, 2011), and also note the impact of political influence on external communications. The role of risk communication and engagement is absent in these recommendations. However, a need for a "whole of government" approach to messaging is noted (USCG, 2011), with a recommended solution being through improvements in public affairs and crisis communications. The contrast between the NCP (federal government/top-down management of pollution incidents) and NRF/Stafford Act (local government/bottom-up management of declared disasters) approaches is also noted, but without a suggested resolution on how to align these two institutional frameworks. Especially for spills that are perceived by communities as technological disasters, such as the Deepwater Horizon (DWH) and Exxon Valdez oil spills, the need for collaboration, or engagement, has been reported in the literature (Tierney, 2009). Even for spills which are not viewed as disasters, social media now gives a global voice to communities in the vicinity of a spill and to the general public; as noted following the 2010 DWH, "we all have to understand that there will never again be a major event … that won't involve public participation" (Allen, 2011).

Walker (2012) notes that informing the media and stakeholders about oil spills which involve controversial issues like dispersants requires first integrating crisis communications with risk communication through constant, real-time coordination and collaboration within the incident command organization and other internal stakeholders, and then through engagement with external stakeholders including affected communities. This real-time coordination and collaboration is necessary, both to learn about stakeholder and community risk perceptions about the incident, and to assess the situation in relation to those perceptions. Those working on the spill in the ICS organization have the technical capabilities to address incident-specific risk perceptions during response by developing the information content to share through external communications and engagement.

Past concerns in the US about the use of dispersants were related primarily to ecological effects. However, during DWH communities along the Gulf of Mexico expressed concerns about human health risks from aerial applications of dispersants and concerns about seafood safety from exposure to both aerial and subsea applications of dispersants (add some example citations). Such concerns also were expressed by the academic community, representing both environmental and public health scientists (IOM, 2010, add others). During DWH, some key dispersant subject matter experts were assigned to other functions, which in effect pre-empted their input to dispersant assessment and decisions; e.g., Dr. Jacqueline Michel, who chaired the National Academy of Sciences (NAS) National Research Council Committee on Understanding Oil Spill Dispersants: Efficacy and Effects, was assigned to the Shoreline Cleanup Assessment Team (SCAT) with no assigned responsibility for dispersants. Also on that NAS committee were Dr. James Payne, who was assigned to Natural Resource Damage Assessment, and Yvonne Adassi, who was assigned to the Alternative Response Technologies Evaluation System (ARTES).

In 2013 the National Response Team (NRT) published an ICS-based model for collaborative communications; however, this model is oriented toward collaboration among multiple governmental levels of ICS within the incident management team, rather than collaboration through engagement with communities, other external stakeholders, or trusted sources of information outside the response organization.


Engagement in the context of the Incident Command System

Community and stakeholder engagement and collaboration with emergency managers can take place during a response within the confines of the ICS organization (FEMA 2007) but this is more common during a Stafford Act response to a disaster than a pollution response under the NCP. In ICS, the Public Information Officer (PIO) is assigned the primary responsibility for communicating with the public, media, and/or coordinating with other agencies. The Liaison Officer, who is the contact for assisting and/or cooperating Agency Representatives, also has assigned responsibility in ICS for external communications.

The PIO is a key member of the Joint Information System (JIS), which is responsible for developing, verifying, and gaining approval for coordinated information during an incident. The JIS mission is to coordinate information delivery to the public for all involved organizations. The Joint Information Center (JIC) is the place where JIS functions are carried out, often but not always within the incident command post. The JIC is managed by the lead PIO for the incident (NRT, 2013).

The functioning of the ICS and JIS/JIC model has limitations in communicating information under conditions of risk and uncertainty, especially as social media has strained the ability of the ICS to control the timing and content of messages about the incident. Tierney (2009) identified advantages and disadvantages of ICS which are relevant to dispersant communications. In particular, Tierney points out that ICS is less compatible with the collaborative modes of decision making often found in community-based organizations and Stafford Act responses, in part because collaborative decision making involves both horizontal and vertical integration. Tierney also notes that critical strategic decisions can fall to elected or appointed leaders who are outside the ICS; they are external stakeholders for oil spill response. Buck et al. (2006) also identified limitations of ICS when events become politicized. Given the continuing controversy associated with dispersant use in the Deepwater Horizon and the ubiquity of socially-generated data containing misrepresentations or misinterpretations of facts (e.g., Devine and Devine, 2013; Tobin, 2010; see also Maynard, 2010), some additional decision making, and monitoring, will be required in the future when dispersant use is considered by regulators as atypical. Communities, stakeholders and the public watching this atypical situation will likely challenge Unified Command and other decision makers as official sources of credible information in ways that are difficult to overcome, especially when attempting to quell false rumors or correct misinformation using the JIS model of communications.

Some models for best practices for the operation of the JIS in the context of rapidly-evolving social media applications and usage trends can be found in recently released guidance by Federal agencies including for example:

• Centers for Disease Control: Crisis & Emergency Risk Communication (2012) and Crisis and Emergency Risk Communication Model, Crisis and Emergency Risk Communication manual; and The Health Communicator's Social Media Toolkit
• Department of Homeland Security Science and Technology Directorate: Community Engagement Guidance and Best Practices (2012)
• Nuclear Regulatory Commission: Developing an Emergency Risk Communication (ERC)/Joint Information Center (JIC) Plan for a Radiological Emergency (2011)
• U.S. Coast Guard: Risk Communication for Responders (2011)
• U.S. Environmental Protection Agency: Effective Risk and Crisis Communication during Water Security Emergencies (2007)


These models have a strong foundation in message mapping approaches put forward by Covello (1998, 2001, 2002, and 2006). Message mapping focuses on clear messages targeted at specific stakeholders that communicate risk, address concerns, and respond to those concerns through trusted sources (Covello, 2003). A great deal of message mapping work has been done in the context of public health issues such as disease outbreaks. In certain situations, message mapping focuses on providing the public with information on actions they should take, such as sheltering in place, boiling water before use, or other specific instructions. Social media is used in this context to understand public concerns, provide information, gather feedback from those carrying out recommended actions, and monitor rumors (NRC, 2011; CDC, no date).
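As a minimal illustration of the monitoring function described above, the sketch below tallies concern keywords across a batch of incoming public messages. The messages and keyword list are invented placeholders; an operational workflow would ingest posts from a platform API and use far richer matching.

# Minimal illustration of keyword-based situational awareness over incoming
# public messages, in the spirit of the rumor monitoring described above. The
# messages and keyword list are invented; a real workflow would ingest posts
# from a platform API and use far richer matching.
from collections import Counter

CONCERN_KEYWORDS = ["dispersant", "seafood", "health", "rain", "beach closure"]

messages = [
    "Is the seafood from the gulf safe to eat after all that dispersant?",
    "Heard it's raining oil and dispersant near the coast, is that true?",
    "Any health risks for cleanup workers on the beach?",
]

def tally_concerns(posts, keywords):
    """Count how often each concern keyword appears across a batch of posts."""
    counts = Counter()
    for post in posts:
        text = post.lower()
        counts.update(k for k in keywords if k in text)
    return counts

for keyword, count in tally_concerns(messages, CONCERN_KEYWORDS).most_common():
    print(f"{keyword}: {count} mention(s)")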

There are, however, limitations in the message mapping approach, especially in technological disaster situations, such as a large or politicized oil spill involving dispersant use, that would benefit from collaborative decision making, as Tierney identified (2009). The role of social media is rapidly evolving and the subject of much discussion. CDC and the Federal Emergency Management Agency (FEMA) are particularly active in exploring ways in which to use social media in a disaster context (e.g., http://www.dhs.gov/social-media-directory), using the term, Social Media in Emergency Management (SMEM).

There is very active debate about how social media activity should be integrated into the ICS. Crowe (2010) identifies a fundamental incompatibility between the constraints of the JIS as defined under NIMS and the realities of social media. A particular conflict is seen in the NIMS construct of controlling information so that the public can validate the information on the basis of the government's trustworthiness. A lack of trust in the government can lead people to other sources to validate information. This assertion seems borne out by Pew (2010, 2010a) research on the DWH oil spill, where there were significant issues of public trust when government and industry issued information about the spill. Crowe does not propose a solution for addressing the fundamental conflict between NIMS/JIS and social media demands. Hughes and Palen (2012) examined how social media is forcing an evolution of PIO roles within the ICS, in particular the bi-directional flow of information between the JIC and the media and between the JIC and the public, as shown in Figure 1.

Figure 1. PIO communication model from Hughes and Palen (2012)


Hughes and Palen's model highlights critical changes in information flow, where the public is informing the media and the response rather than being a passive receptor of information. This new communication paradigm has resulted in a fundamental shift of the PIO/JIS role from gatekeeper to translator. The challenges inherent in this change lead Hughes and Palen to recommend redistribution of PIO social media functions to other components of the ICS. Hughes and Palen, however, do not explicitly address how subject matter experts fit into their model. We propose a modification of Hughes and Palen's model to explicitly include subject matter experts, as shown in Figure 2.

Figure 2. PIO communication model modified to include subject matter experts.

Dispersant communication strategies for planning and response activities can benefit from understanding this key paradigm shift from gatekeeper to translator and guide to response information. That shift is likely to require changes in the central control of social media applications during response, and in particular in the application of social media for situational awareness, collaboration, and communication. It could extend to using social media to build networks among technical specialists to empower collaborative information sharing and input into decision making. Buck et al. (2009) make the point that: "Coordinative systems are more appropriate for dealing with disagreement, controversy, and integrating multiple divergent perspectives, while command systems such as ICS remain useful for the organization and completion of predictable agreed upon tasks by formal agencies."

The challenge for dispersant decision making is to integrate, through community engagement, the input of networks of individuals who fall outside of the responder community (represented by the National Response System and an incident-specific ICS organization) into response decision making. This integration of networks can also leverage limited response resources for engagement. A model that could represent such integration is shown in Figure 3.


Figure 3. Potential integrative communication model showing areas of collaboration among the formal and informal response mechanisms

Five examples of community engagement during emergencies through the ICS should be considered in developing strategies for adapting the existing ICS to these evolving engagement paradigms:

1. Broadcasting messages to the public.
2. Two-way communication between emergency managers and major stakeholder groups.
3. Situational awareness for emergency responders about community risk perceptions, questions and concerns.
4. Situational assessment, internal coordination, and consensus development in the ICS organization to develop technical information to address perceptions, concerns and questions.
5. External coordination with trusted sources and networks in the affected area, and nationally when appropriate, e.g., academics, public health officials, relevant trade associations.

These community engagement examples are highly relevant to dispersant use decision making, public engagement, and monitoring. The challenge for evolving the incident-specific ICS organization for dispersant decision making is understanding how best to integrate networks of individuals who are external to the responder community so that they provide useful and relevant inputs into response decision making through community engagement, e.g., shoreline property owners who provide input to the shoreline cleanup assessment team (SCAT), as well as how to address their risk perceptions, which could become barriers to effective decision making.

ICS is flexible and adaptable to all kinds and sizes of incidents, and provides a common organizational framework. How to implement ICS with regard to engagement and external communications, especially social media, is the subject of debate among the emergency response community (e.g., http://www.emergencymgmt.com/emergency-blogs/crisis-comm/Does-social-media-monitoring-052013.html). There is variability among federal agencies in the details of how they implement it. For example, both the US Coast Guard and EPA have ICS incident management handbooks (IMH) which are generally similar, but there are significant differences in some technical functions and information flow. For oil spills, the USCG IMH is both a training and response guide for government and industry responders. If a function is not represented in the IMH, it is unlikely that a new function will be identified and supported during a spill by Unified Command, especially the RP's Incident Commander. This is especially true for external communications and engagement, which carry significant legal implications for the RP.

A distinct challenge to real-time risk communication and community engagement during oil spills, especially large-scale oil spills resulting from the continuing discharge of a well blowout that may last for weeks to months, is the dynamic interaction of the oil and multiple environmental systems with on-scene weather and oceanographic conditions. The oil spill situation changes constantly. Assessing the situation to address the perceptions, questions, and concerns of political, media, public, affected community, and other stakeholder groups about that incident is highly complex and takes time and resources. Because oil spills are so dynamic, the IMT focus is on assessing the situation to determine tactical operations for controlling the movement of spilled oil to limit the extent of contamination and impacts, rather than on collaboration to address external perceptions, questions, and concerns. With its flexibility, ICS has the capacity to better position the response organization to be responsive to external drivers in addition to executing operational plans, through additions to the written guidance and training.

A structural and functional evolution of the organization can support changing from a gatekeeper role in communications to that of a translator and guide in the context of evolving public concerns and advancing communication patterns. Figure 4 shows functional enhancements in a generalized ICS organization, through the addition of Technical Assessment for Stakeholder Communications (TASC) function in the Environmental Unit and a public health coordinator to the Command Staff, supported by internal information sharing among the highlighted boxes. Functional evolution can occur by implementing social media capabilities into multiple levels of the ICS to further responder situational awareness, assessment and external coordination as identified above. Section 4 will outline specific examples of how to best exploit these suggestions for functional changes.


Figure 4. Functional enhancements in an oil-spill ICS organization (generic rather than incident- or scenario-specific)

Section 3. RECOMMENDATIONS FOR BEST PRACTICES

When a crisis occurs, available traditional and social media often are "appropriated" for the purpose of collecting and disseminating disaster-relevant information, and new disaster-related content is rapidly created and shared by the response, external subject matter experts, e.g., academia, and community members in general. Wide-scale interaction between members of the public is collectively resourceful and self-policing, and will generate information that cannot otherwise be easily obtained (Sutton, 2008). Stakeholder surveys with local-level oil spill stakeholders have shown that known websites (government websites, e.g., Coast Guard, EPA, NOAA, DOI, or websites of professional associations) are the form of media considered most important to inform judgments about dispersant use in response to a spill (Walker, 2012). However, it is also evident that people will use information from any number of sources to satisfy their needs and inform their actions in the face of disaster, especially when information portals and known websites fail to meet their information needs with regard to timing and content (Sutton et al., 2008).

Waugh and Tierney (2007) observed the following.

The national emergency management system includes complex networks of public, private, and nonprofit organizations; nongovernmental organizations; and volunteers. The networks are diverse, and communication—let alone collaboration—can be very difficult. Integrating volunteer organizations, faith-based organizations, for-profit organizations, and others into one unified effort can be a monumental task. Poor cultural interoperability (this is what ICS is supposed to remedy) complicates multi-organizational, intergovernmental, and intersector operations…When the warm and fuzzy meets the lean and mean, cultures and personality types clash, and differences can be hard to overcome. The attempt to impose control can be counterproductive in a system in which resources are dispersed, authority is shared, and responsibility needs to be shared.

Best practices for engagement must acknowledge a situation space that is defined by these complex networks, cultural interoperability, and the need for collaboration across sectoral boundaries. Public (community) engagement is integral to these best practices, so it is important that we define this concept in a way that allows us to build engagement strategies. Rowe and Frewer (2005) identify three mechanisms of public engagement: communication, consultation, and participation. Communication is from the sponsor to the public, consultation is from the public to the sponsor, and participation is a two-way flow of information between the sponsor and the public. Rowe and Frewer identify 100 participatory mechanisms and point out that there is no overarching theory on how best to apply these mechanisms. Of course, in 2005, social media was not a driving force in community engagement, and so is absent from this list. This evolution of communication technologies allows Hughes and Palen's more pervasive two-way communication model shown in Figure 1.

The challenge for dispersant communications then is to adapt these typologies to the circumstances of spill preparedness and response. It is helpful to consider a definition established by the Working Group on Community Engagement in Health Emergency Planning sponsored by FEMA. They have put forth this definition:

“Community engagement—defined here as structured dialogue, joint problem solving, and collaborative action among formal authorities, citizens at-large, and local opinion leaders around a pressing public matter” (Schoch-Spana, et. al. 2007)

What Do Best Practices in Community Engagement for Dispersant Use Policies Look Like?

According to Walker et al. (2013), mutually beneficial engagement, i.e., beneficial for preparedness and response organizations and for stakeholders/communities, can serve as a means of resolving incident-specific technical issues by providing appropriate input into the management of the response, which in turn may facilitate: (1) the resolution of response issues, (2) the process of evaluating and processing claims for compensation, and (3) community resilience. Mutually beneficial means that affected community residents and stakeholders need to understand how their input can be most effective and useful, and responders need to appreciate that engagement can support credible decision making and enhance community recovery from spills, especially major emergencies that are viewed as technological disasters. A variety of practices reflect how community engagement during oil spill preparedness and response can:

• Interface effectively with the response organization to help minimize impacts on the environment and socio-economic activities.
• Facilitate stakeholders' ability to receive compensation for spill impacts in a timely manner that in turn facilitates restoration and resilience in affected communities.
• Support community resilience and recovery.

While most of the emphasis in this paper so far has been on community engagement during spill response, preparedness is also integral to NIMS (FEMA, 2013). FEMA characterizes preparedness within NIMS as a continuous cycle, as shown in Figure 5.

Figure 5. NIMS Preparedness Cycle from FEMA (2013)

Community engagement during the preparedness cycle often focuses on keeping communities safe from harm and resistant to disasters. FEMA (2013) specifically states its preparedness goal as:

“A secure and resilient nation with the capabilities required across the whole community to prevent, protect against, mitigate, respond to, and recover from the threats and hazards that pose the greatest risk.” where hazards include spills and other technological disasters.

Community engagement practices for dispersants should accommodate multiple timescales, audiences, and categories of communications.


To be further developed with input from the peer-review workshop: apply FEMA guidance to inform practices for dispersant preparedness and use, and adapt it here to fit more specific dispersant preparedness and response requirements.

Time Scales

• Learning and adapting (research, lessons learned)
• Preparedness (education, contingency planning, training, and exercising)
• Response (in progress and planned)
• Recovery (natural resources, community infrastructure)

Target audiences

Targeting audiences and tailoring messages to them has been an important approach used in social marketing efforts (Bostrom et al., 2013). In their evaluation of this approach for communicating climate change information, Bostrom et al. (2013) use the following definitions:

“Targeting involves identifying a specific subpopulation for communications, whereas tailoring involves including individuating, personal information within the design of the communication.”

Target audiences for oil spills can be grouped into four categories:

1. Formal authorities (e.g., in the National Response System for preparedness and response for all sizes of spills, including a spill of national significance/SONS)
2. Citizens at large (a. those directly affected by the spill; b. those not in the local community who might still influence response decisions through political or other processes)
3. Local opinion leaders (e.g., elected officials, community leaders), trusted sources (e.g., academia, NGOs), and influential communicators (e.g., both traditional and social media)
4. Other stakeholders (e.g., seafood and tourist industries)

Bostrom et al. (2013) found that targeting and tailoring can have limited effectiveness. Segmenting populations of interest and tailoring messages can require significant effort, especially when the information available during a spill is limited and rapidly changing. Social media participation among some population segments is accelerating this process during disasters (see, for example, http://onlinempa.usfca.edu/social-media/). While targeting and tailoring can still be broadly applicable for spill response communications, the recommendations in this report focus on two groups beyond those directly engaged in response efforts: (1) the public at large and (2) trusted sources of information who are subject matter specialists and experts on specific oil spill issues of concern.

Table 3 is an example of how content could be developed and disseminated during response from the incident management team to the public at large.

Table 3: Means of Information Dissemination during Response: Incident Management Team to Public Audience (Walker 2011 and 2012)

Communicators:
• Unified Command and Internal Incident Management Team (IMT) – Local and Area Level
• UC
• JIC, SSC, PHC, LO, Science Teams
• Response SMEs to address questions which exceed “talking points” depth of knowledge
• RRTs (agency, IMT, and consultants)
External – all levels
• Media
• NRT
• Websites for tourism area
• Local / state officials
• Additional trusted sources (to be identified for specific response and affected stakeholder)

Response Practitioners in IMT:
Initial/early stage of response:
• Incident Management Team (IMT)
• Scientific Support Coordinator (SSC), Public Health Coordinator (PHC)
• Technical Specialists and internal Science Team(s): chemists, toxicologists, and industrial hygienists
• Operations
• Environmental Unit (EU): Sample Coordinator; TASC (Technical Assessment for Stakeholder Communication) and Alternative Response Technology Evaluation System (ARTES)
• Detailed issues: issue-specific Operational Science Advisory Team (OSAT)
• Broad issues: external Science Advisory Board

External Trusted Sources/Intermediaries*:
• Media – all levels, ID and vet local/regional level media via Sea Grant
• Unified Command (UC)
• SSC and PHC
• Centers for Disease Control (CDC)
• Local/regional academia, e.g., LSU and Tulane
• National academia, e.g., National Academy of Sciences Institute of Medicine, National Science Foundation, AMA
• Stakeholder association representatives, e.g., Louisiana Shrimpers Association
• Local health officials associations (ASALPHO, AOEC) to reach physicians and pharmacists in affected area
• NGOs, e.g., NWF and IBBRC

Coordination:
• Unified Command (UC)
• Regional Response Team (RRT)
• PIO/Joint Info. Center (JIC)
• Liaison Officer (LO)
• UC legal team representatives
• Environmental Unit Leader (EUL)
• Volunteer Coordinator, if activated
• National Response Team (if a major response)

Content/Deliverables:
• Fact sheets on the response, sampling programs, scientists involved in the spill, response options being used (what, why, where, how)
• Fact sheets on PPE and worker monitoring (incorporate fact sheets and response options re: potential risks, vulnerability, exposure)
• Press releases
• Q&As
• Talking points
• Traditional media campaign
• Websites
• Social media content for Twitter, Instagram, webinars

*People who are turned to, listened to, and/or lead opinions.

We suggest using three engagement categories to frame recommendations:

• Outreach (1-way communication from authorities to the public)
• Consultation (structured dialogue)
• Participation (joint problem solving, collaborative action)

Section 4. Discussion of results

Information from the national public survey analysis and social media analysis is providing insights into gaps in understanding about oil spills and dispersants. In addition, each of the white papers identifies areas of practice that are relevant to oil spill preparedness and response. In applying these findings, the information flow diagrams will focus on approaches that build resilience within the planning and response system through community engagement that allows for adaptive responses to unanticipated situations. They will also provide recommendations for analytics and feedback that can empower consultation and participation as part of community engagement.

Note to reviewers: This section will use excerpts of key practices identified in other white papers to develop candidate information flow diagrams. The final report will contain the excerpts from other white papers post workshop. Our goal is to identify 4-8 candidate topics to develop into information flow diagrams. The candidate topics will be presented to reviewers at the workshop for discussion. We will incorporate recommendations from the workshop and best practices from the literature to develop the final information flow diagrams. The following table will help frame the workshop discussion on this topic.

Gap in Understanding | Communication Mechanism | Timescale | Audience(s) | Engagement Category | Performance Measure
Behavior of oil, without/pre-dispersants | | | | |
Behavior of dispersants when applied from the air | | Preparedness and response | Public, area planning participants | Prespill: Outreach; Response: Consultation |
Behavior of dispersants when applied subsea | | | | |
Dispersant fate and transport | Ushahidi crowd map, ERMA | Response | Public, disciplinary experts | Participation | Dispersant fate and transport

Section 5. Community Engagement Applications for future practice

To be developed based on discussion and feedback from the workshop. That might mean setting aside time during the workshop to discuss completing a framework like the one offered below.

References

Allen, Thad. 2010. You Have to Lead from Everywhere. An Interview by Scott Berinato. Harvard Business Review. www.hbr.org/2010/11/you-have-to-lead-from-everywhere/ar/1.

AMPERA (European Concerted Action to foster prevention and best response to Accidental Marine Pollution). 2007. Risk Communication in Accidental Marine Pollution: Good Practice Guide for an effective communication strategy. AMPERA Publication No. 2. Retrieved from: http://www.upf.edu/enoticies/0708/_pdf/ampera.pdf

Bostrom, A., Gisela Bohm, and Robert E. O’Connor. 2013. Targeting and tailoring climate change communications. WIREs Climate Change. doi: 10.1002/wcc.234

Bostrom, A. “Risk Perception” in Mitcham, Carl (Ed), Encyclopedia of Science, Technology and Ethics. MacMillan Reference, 2005.

Bostrom, A., P. Fischbeck, J.H. Kucklick, R. Pond, and A.H. Walker. 1997. Ecological Issues in Dispersant Use: Decision-makers’ Perceptions and Information Needs. Prepared for: Marine Preservation Association.

Bostrom, A., P. Fischbeck, J.H. Kucklick, and A.H. Walker. 1995. A Mental Models Approach for Preparing Summary Reports on Ecological Issues Related to Dispersant Use. Marine Spill Response Corporation, Washington, DC. MSRC Technical Report Series 95-019, 28 p.

Buck, Dick A.; Trainor, Joseph E.; and Aguirre, Benigno E. (2006) "A Critical Evaluation of the Incident Command System and NIMS," Journal of Homeland Security and Emergency Management: Vol. 3 : Iss. 3, Article 1. DOI: 10.2202/1547-7355.1252

ATSDR. 2011. Principles of Community Engagement. Clinical and Translational Science Awards Consortium. Community Engagement Key Function Committee Task Force on the Principles of Community Engagement. NIH Publication No. 11-7782 http://www.atsdr.cdc.gov/communityengagement/pdf/PCE_Report_508_FINAL.pdf

Centers for Disease Control (CDC). August 2010. The Health Communicator’s Social Media Toolkit. Office of the Associate Director for Communication. Atlanta, GA. Report No. C8215-469-A.

Covello, V.T. and Allen, F. (1988) Seven Cardinal Rules of Risk Communication. Washington (DC): Environmental Protection Agency.

Covello, V.T. and P. Sandman. 2001. Risk communication: Evolution and Revolution. In Solutions to an Environment in Peril. Baltimore: Johns Hopkins University Press. pp. 164-178

Covello, V.T., 2002. Message Mapping, Risk and Crisis Communication. Invited Paper Presented at the World Health Organization Conference on Bio-terrorism and Risk Communication, Geneva, Switzerland, October 1, 2002

Covello, V.T. (2003) Best practice in public health risk and crisis communication. Journal of Health Communication, Vol. 8, Supplement 1, June: 5-8.

Covello, V.T. (2006) Risk communication and message mapping: A new tool for communicating effectively in public health emergencies and disasters. Journal of Emergency Management, Vol. 4, No. 3, 25-40.

Covello, V, Scott Minamyer and Kathy Clayton 2007. Effective Risk and Crisis Communication during Water Security Emergencies: Summary Report of EPA Sponsored Message Mapping Workshops. US EPA National Homeland Security Research Center. Retrieved from cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=461264

Covello, V.T. 2011. Developing an Emergency Risk Communication (ERC)/Joint Information Center (JIC) Plan for a Radiological Emergency. US Nuclear Regulatory Commission. NUREG/CR-7032. 96pp

Crowe, Adam. 2010. The Elephant in the JIC: The Fundamental Flaw of Emergency Public Information within the NIMS Framework. Journal of Homeland Security and Emergency Management. Volume 7, Issue 1, Article 10.

Devine, Shanna and Tom Devine. 2013. Deadly Dispersants in the Gulf: Are Public Health and Environmental Tragedies the New Norm for Oil Spill Cleanups? Government Accountability Project. Web accessed June 11, 2013. http://www.whistleblower.org/program-areas/public-health/corexit

Federal Emergency Management Agency. 2007. Basic Guidance for Public Information Officers (PIOs). National Incident Management System. FEMA 517/November 2007. 25pp.

Federal Emergency Management Agency. January 2008. Emergency Support Function #15 – External Affairs Annex in the National Response Framework. Retrieved from http://www.fema.gov/pdf/emergency/nrf/nrf-esf-15.pdf

Fischhoff, B., Noel T. Brewer, and Julie S. Downs, editors. 2011. Communicating Risks and Benefits: An Evidence-Based User's Guide. Food and Drug Administration (FDA), US Department of Health and Human Services. Retrieved from http://www.fda.gov/AboutFDA/ReportsManualsForms/Reports/ucm268078.htm

Harrald, J. R. 2006. Agility and Discipline: Critical Success Factors for Disaster Response. Annals of the American Academy of Political and Social Science Vol. 604, Shelter from the Storm: Repairing the National Emergency Management System after Hurricane Katrina, pp. 256-272. Sage Publications, Inc.

Hughes, Amanda L. and Palen, Leysia (2012) "The Evolving Role of the Public Information Officer: An Examination of Social Media in Emergency Management," Journal of Homeland Security and Emergency Management: Vol. 9: Iss. 1, Article 22

Hyer, R.N. and V.T. Covello. 2005. Effective Media Communication during Public Health Emergencies: A WHO Handbook. World Health Organization (WHO). Geneva, Switzerland. Retrieved from: http://www.who.int/csr/resources/publications/WHO%20MEDIA%20HANDBOOK.pdf .

Institute of Medicine (IOM). 2010. Assessing the Human Health Effects of the Gulf of Mexico Oil Spill. Washington, D.C.: The National Academies Press.

Kahneman, D., Slovic, P. and Tversky, A. (Eds). (1982). Judgment under uncertainty: Heuristics and Biases. Cambridge University Press.

Maynard, Andrew. 2010. Nano Dispersants and nano hysteria – time to think about the science folks! Web accessed 06/17/2013. http://2020science.org/2010/05/28/nano-dispersants-and-nano-hysteria-time-to-think-about-the-science-folks/

National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling, 2011. Deep Water: The Gulf Oil Disaster and the Future of Offshore Drilling-Report to the President. January 2011. Retrieved from http://www.oilspillcommission.gov/final-report

National Response Team (NRT). 2013. NRT Joint Information Center Model Collaborative Communications during Emergency Response. April 2013. 172pp

Navy and Marine Corps Public Health Center (NMCPHC). A Risk Communication Primer—Tools and Techniques. 620 John Paul Jones Circle, Suite 1100, Portsmouth, VA 23708-2103. Retrieved from: http://www.med.navy.mil/sites/nmcphc/Documents/policy-and-instruction/nmcphc-risk-communications-primer.pdf

Persensky, J. and S. Browde, A. Szabo, L. Peterson, E. Specht, E. Wright. Effective Risk Communication: The Nuclear Regulatory Commission’s Guidelines for External Risk Communication. Office of Nuclear Regulatory Research, US Nuclear Regulatory Commission. Washington, DC. Report No. NUREG/BR-0308.

Pew. 2010. Gulf Disaster Continues to Dominate Coverage, Interest; News Media Trusted for Information on Oil Leak. June 9, 2010. Pew Research Center for the People and the Press. Web accessed June 18, 2013. http://www.people-press.org/2010/06/09/news-media-trusted-for-information-on-oil-leak/

Pew. 2010a. 100 Days of Gushing Oil – Media Analysis and Quiz. August 25, 2010. Pew Research Center’s Project for Excellence in Journalism. Web accessed June 18, 2013. http://www.journalism.org/analysis_report/100_days_gushing_oil

Reynolds, B. et al. 2012. Crisis and Emergency Risk Communication. Centers for Disease Control and Prevention, Washington DC. Retrieved from http://emergency.cdc.gov/cerc/pdf/CERC_2012edition.pdf

Rowe, G. and L.J. Frewer. 2005. A typology of public engagement mechanisms. Science, Technology, & Human Values 30(2): 251–290.

Sandman, P. 1993. Responding to Community Outrage: Strategies for Effective Risk Communication. American Industrial Hygiene Association. Retrieved from: http://psandman.com/media/RespondingtoCommunityOutrage.pdf

Schoch-Spana, Monica, Crystal Franco, Jennifer B. Nuzzo, and Christiana Usenza. 2007. Community Engagement: Leadership Tool for Catastrophic Health Events. Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science. March 2007, 5(1): 8-25. doi:10.1089/bsp.2006.0036

Sutton, J.N. 2010. Twittering Tennessee: Distributed Networks and Collaboration Following a Technological Disaster. In: Proceedings of the 7th International ISCRAM Conference, Seattle, USA.

Sutton, J., L. Palen, and I. Shklovski. 2008. Backchannels on the Front Lines: Emergent Use of Social Media in the 2007 Southern California Wildfires. In: Proceedings of the Conference on Information Systems for Crisis Response and Management (ISCRAM 2008).

Szabo, A. and J. Persensky, L. Peterson, E. Specht, N. Goodman, R. Black. Effective Risk Communication: Guidelines for Internal Risk Communication. Office of Nuclear Regulatory Research, US Nuclear Regulatory Commission. Washington, DC. Report No. NUREG/BR-0318.

Tierney, K. 2009. Disaster response: research findings and their implications for resilience measures. Retrieved from: http://www.resilientus.org/library/Final_Tierney2_dpsbjs_1238179110.pdf

Tobin, M. (2010, June 17). North America faces years of toxic oil rain from BP oil spill chemical dispersants. Retrieved from http://www.examiner.com/article/north-america-faces-years-of-toxic-oil-rain-from-bp-oil-spill-chemical-dispersants

U.S. Department of Health and Human Services. 2002. Communicating in a Crisis: Risk Communication Guidelines for Public Officials. Washington, D.C.

U.S. Homeland Security Presidential Directive Five (HSPD-5), Management of Domestic Incidents, February 28, 2003, http://www.dhs.gov/dhspublic/display?theme=42&content=496.

US Coast Guard (USCG). BP DEEPWATER HORIZON OIL SPILL Incident Specific Preparedness Review (ISPR). January 2011. Retrieved from http://www.uscg.mil/foia/docs/dwh/bpdwh.pdf

US Environmental Protection Agency (USEPA). April 2005. Superfund Community Involvement Handbook. Office of Superfund Remediation and Technology Innovation. EPA 540-K-05-003. Retrieved from www.epa.gov/superfund

US Environmental Protection Agency (USEPA). July 2008. Communicating Radiation Risks: Crisis Communications for Emergency Responders. Office of Radiation and Indoor Air. EPA-402-F-07-008.

Walker, A. H., and J. Boyd, M. McPeek, D. Scholz, J. Joeckel, and Gary Ott. 2013. Community Engagement Guidance for Oil and HNS Incidents. SEA Consulting Group, Cape Charles, VA USA, 178 pages.

Walker, A.H. 2012. Integrating Risk Communications with Crisis Communications Real-Time. Interspill 2012, London, UK, March 14, 2012.

Walker, A. H. 2011. Risk Communications for Significant Oil Spills. International Oil Spill Conference – Portland, OR. May 24, 2011.

Walker, A.H., D. Ducey, Jr., S. Lacey and J. Harrald. 1994. A White Paper Prepared for the 1995 International Oil Spill Conference: Implementing an effective response management system. Technical Report IOSC-001. API. Washington, DC. December 1994. Retrieved from: http://www.ioscproceedings.com.

Add EPA references, and other good practice guides, examples. E.g., http://fema.ideascale.com
