
U.S. Fish and Wildlife Service U.S. Department of the Interior

National Wildlife Refuge System

National Protocol Framework for Monitoring Common Murre and Brandt's Cormorant Breeding Colonies in the California Current System

Regions 1 and 8

Version 1.0 August 2018

ON THE COVER Roy Lowe photographing colonies on the Oregon coast from a helicopter; image of a common murre and Brandt's cormorant colony at Yaquina Head after "dotting" of birds and nests in ArcMap. Photographs by: David Pitkin and Shawn W. Stephensen, USFWS

Monitoring Common Murre and Brandt’s Cormorant Colonies Ver. 1.0

NWRS Survey Protocol Signature Page

1 Version is a decimal number, with the number left of the decimal place indicating the number of times this protocol has been approved (e.g., the first approved version is 1.0; prior to first approval all versions are 0.x; after first approval, all minor changes are indicated as version 1.x until the second approval and signature, which establishes version 2.0, and so on). A protocol is subject to versioning so that it can be a living, dynamic document, revisable with advancements and new techniques in field methods, data analyses, data processing, and data management that would improve the cost effectiveness, sustainability, and quality of its data over time.

2 Signature by the National I&M Coordinator signifies approval of a protocol used at multiple stations from two or more Regions.



Survey Protocol Summary

This protocol framework (PF) provides a foundation and guidance for consistent development of site-specific survey protocols (SSPs) with detailed instructions for conducting breeding colony surveys of common murre (Uria aalge) and Brandt's cormorant (Phalacrocorax penicillatus) at locations throughout the California Current System (CCS). Cooperating partners throughout the CCS are encouraged to utilize the survey guidance in this PF where feasible. The individual SSPs should, at minimum, address elements identified in this PF pertinent to the survey method, staff, equipment, and environmental conditions at the respective survey locations. This PF is the first such tool developed as part of the Pacific Seabird Program (PSP), a U.S. Fish and Wildlife Service (USFWS) inter-Regional inventory and monitoring program established for the purpose of coordinating, integrating, and supporting the full life cycle of seabird surveys conducted throughout the Service's Pacific jurisdiction. It is intended to be the first in a series of PFs that will eventually establish suggested standards and minimum requirements for surveys contributing to the CCS-wide breeding population status and trend assessment of seabird species with significant portions of their global breeding populations in the CCS.

Common murre (COMU) and Brandt’s cormorant (BRAC) are included in one survey protocol framework because they are surface nesters with overlapping nesting seasons, broadly occur throughout the CCS, often co-occur in dense aggregations at colony sites, and are therefore efficiently surveyed simultaneously within the CCS primarily by aerial photography. Although aerial photographic surveys will be the preferred method of field data collection for both species, boat- and land-based surveys will be used to provide supplemental contextual data, or when aerial surveys are not practical. As a result, all three survey methods are described in this protocol framework.

The content and structure of the protocols described herein follow standards set forth in the U.S. Fish and Wildlife Service's How to Develop Survey Protocols: a Handbook (Version 1.0) (USFWS 2013). Each of the following eight elements of a protocol is addressed in a narrative: introduction; sampling design; field methods; data management and analysis; reporting; personnel requirements and training; operational requirements; and references. Additionally, a series of standard operating procedures provides greater detail on recommended survey methods and technical aspects of this protocol.

A 3-tiered sampling frame is described in this PF to integrate the timing and locations of COMU and BRAC colony surveys such that statistically rigorous estimates of population status, distribution, and trends may be achieved at sub-regional and CCS-wide scales. The data management element of the PF describes a centralized, interactive web-based application to be developed and deployed where all quality assured colony survey data could be easily entered by partners, and accessible for analyses and interpretation. Partners will set the degree to which their data are available to other users.



Suggested citation: Bridgeland, W.T., N. Nur, S.W. Stephensen, S. Thomas, G. McChesney, S. Holzman, R. Swift, and K. Kilbride. 2018. National Protocol Framework for Monitoring Common Murre and Brandt’s Cormorant Breeding Colonies in the California Current System. U.S. Fish and Wildlife Service, Natural Resources Program Center, Fort Collins, CO.

This protocol is available from ServCat: https://ecos.fws.gov/ServCat/Reference/Profile/109398


The authors dedicate this Protocol Framework to Harry Carter (1956-2017) in recognition of his dedication to seabird conservation and decades of Pacific seabird survey work. (Photograph by: Tracy Miner Ames)

Acknowledgments

Members of three Working Groups participated in numerous discussions and made important contributions and editorial comments on drafts of this document, including: Sarah Allen, Cassie Bednar, Louise Blight, Russ Bradley, Phil Capitolo, Ryan Carle, Harry Carter, Aaron Christ, Michael Cunanan, Kaylene Keller, Rachael Orben, Julia Parish, Scott Pearson, Heather Renner, Nora Rojek, Khem So, Erin Stockenberg, Rob Suryan, William Sydeman, Laurie Wilson, and Nathan Zimpfer. The authors also greatly appreciate the participation of over 50 seabird scientists and managers in a three-day meeting in Portland, OR in November 2016 where the scope and contents of this Protocol Framework were discussed and refined. We also thank the following reviewers for their considerable time invested in improving this document: D. Ledig, D. Lyons, and L. Wilson. Any mistakes or omissions remaining are the sole responsibility of the authors.

Disclaimers

The findings and conclusions in this report are those of the authors and do not necessarily represent the views of the U.S. Fish and Wildlife Service.

The mention of trade names or commercial products in this report does not constitute endorsement or recommendation for use by the federal government.


Contents

NWRS Survey Protocol Signature Page ...... iii
Survey Protocol Summary ...... iv
Acknowledgments ...... vi
Abbreviations Used ...... xi
Narratives ...... 1
Element 1: Introduction ...... 1
Element 2: Sampling Design ...... 4
Element 3: Field Methods and Processing of Collected Materials ...... 20
Element 4: Data Management and Analysis ...... 29
Element 5: Reporting ...... 34
Element 6: Personnel Requirements and Training ...... 36
Element 7: Operational Requirements ...... 40
Element 8: References ...... 44
Appendices ...... 49
Appendix 1. Sampling Design Power Analysis ...... 49
Appendix 2. Descriptions of Data Fields for Entry into the Colony Registry and Colony Survey Database ...... 62
Appendix 3. Science Information Needs ...... 65
Standard Operating Procedures (SOP) ...... 67
SOP 1: Standard Operating Procedure for conducting aerial surveys of common murre and Brandt's cormorant colonies within the California Current System ...... 67
SOP 2: Standard Operating Procedure for conducting boat-based surveys of common murre and Brandt's cormorant colonies within the California Current System ...... 77
SOP 3: Standard Operating Procedure for conducting land-based surveys of common murre and Brandt's cormorant colonies within the California Current System ...... 86
Supplemental Materials (SM) ...... 93
SM 1: Maps of Sub-regions within the CCS ...... 93
SM 2: Regional Director Order #12-02 with attachments ...... 101
Attachment A. Project Safety Plan ...... 107
Attachment B. Aviation Project Risk Assessment Worksheet ...... 109
Attachment C. Aviation Operation Go-No Go Checklist ...... 113
SM 3: Example of a completed Survey Project Aviation Safety Plan ...... 114
SM 4: Example Colony Status Record Form and Aerial Survey Log ...... 121
SM 5: Example of Land-based Survey Methods: Colony Count Protocol for Common Murre Restoration Project Colonies ...... 125
SM 6: Brandt's Cormorant Nest Survey Protocols ...... 132
Part 1. Assigning BRAC nest categories when counting nests from aerial photographs ...... 132
Part 2. Assigning BRAC nest categories when counting nests from land ...... 136


List of Tables

Table 2.1. An estimate of the distribution of colonies of COMU and BRAC by proposed sub-region in the CCS...... 13

Table 2.2. The distribution of CCS colonies and numbers of breeding COMU and BRAC by colony size (Small, Medium, Large, as defined) based on most recent available estimates...... 13

Table 2.3. Summarized Sampling Objectives with 80% probability of detection. See Element 2 for more details...... 14

Table 2.4. Sample sizes needed to achieve the sub-regional objective of trend detection for BRAC and COMU...... 15

Table 2.5. Sample sizes needed to achieve the CCS-wide objectives of trend detection for BRAC and COMU...... 16

Table 2.6. Sample sizes needed to achieve the CCS-wide objective of detecting short-term change for BRAC and COMU...... 17

Table 3.1. Categories of BRAC nests to be tallied separately in colony surveys...... 21

Table 7.1. A,B,C. Suggested templates for developing budgets for aerial, boat-based, and land-based surveys of COMU and BRAC colonies...... 42

Table A1.1 Summarized Sampling Objectives with 80% probability of detection. See Element 2 for more details...... 50

Table SOP 1.1. Example of a checklist of equipment needed for aerial surveys of COMU and BRAC...... 71

Table SOP 1.2. Example work plan for aerial surveys...... 72

Table SOP 1.3. An example format for documenting survey crew members’ flight training status for each scheduled survey...... 73

Table SOP 1.4. Factors affecting aerial survey data quality and measures taken to assure high quality...... 76

Table SOP 2.1. Example work plan for boat-based surveys...... 81

Table SOP 2.2. Suggested template for tracking the Boat Operator’s training certifications prior to each scheduled survey...... 81

Table SOP 2.3. Example checklist of equipment that may be needed for boat-based surveys of COMU and BRAC...... 83


Table SOP 2.4. Factors affecting boat-based survey data quality and measures taken to assure high quality...... 85

Table SOP 3.1. Example work plan for land-based surveys...... 89

Table SOP 3.2. Example checklist of equipment that may be needed for land-based surveys of COMU and BRAC...... 90

Table SOP 3.3. Factors affecting land-based survey data quality and measures taken to assure high quality...... 92

List of Figures

Figure 2.1. Map of the eight proposed sub-regions for surveying COMU and BRAC in the CCS. See SM 1 for detailed maps of each sub-region...... 12

Figure 3.1. Satellite photo showing locations of geo-referenced photographs (JPEG), indicating the position of the camera when each photograph was taken...... 28

Figure 7.1. The functional relationship between the PSP and Survey Partners...... 41

Figure A1.1. The statistical power to detect declining and increasing trends of BRAC populations of a typical sub-region in relation to Study Design and number of colony sites per Tier...... 55

Figure A1.2. The statistical power to detect declining and increasing trends of BRAC populations of the entire CCS in relation to Study Design and number of colony sites per Tier...... 56

Figure A1.3. The statistical power to detect declining and increasing trends of COMU populations of a typical sub-region in relation to Study Design and number of colony sites per Tier...... 57

Figure A1.4. The statistical power to detect declining and increasing trends of COMU populations in the entire CCS in relation to Study Design and number of colony sites per Tier...... 58

Figure A1.5. The statistical power to detect short-term change of BRAC populations for a typical sub-region in relation to Study Design and number of colony sites per Tier...... 59

Figure A1.6. The statistical power to detect short-term change of COMU populations in a typical sub-region in relation to Study Design and number of colony sites per Tier ...... 60


Figure SOP 2.1. An example of an annotated photograph showing named portions of an island wherein birds and nests are tallied separately...... 80

Figure SOP 3.1. An example of an annotated photograph showing named portions of an island wherein birds and nests are tallied separately...... 88


Abbreviations Used

AK Alaska
ALSE Aviation Life Support Equipment (DOI Manual)
ASAP As Soon As Possible
BC British Columbia (Canada)
BLM Bureau of Land Management (US)
BRAC Brandt's Cormorant
CA California
CAN Canada
CCS California Current System
CF Refer to
COMU Common Murre
CONANP Comisión Nacional de Áreas Naturales Protegidas (Mexico)
DM Data Manager
DOI Department of the Interior (US)
DSLR Digital Single Lens Reflex (Camera)
FAA Federal Aviation Administration (US)
FGDC Federal Geographic Data Committee (US)
FTE Full Time Equivalent (Personnel)
GAR Green, Amber, Red (Risk assessment system)
GLM General Linear Model
GPS Global Positioning System
I&M Inventory and Monitoring (USFWS Program)
IAT Interagency Aviation Training (DOI)
IMP Inventory and Monitoring Plan (USFWS Refuge)
IUCN International Union for Conservation of Nature
MEX Mexico
MIN Minimum
MOCC Motorboat Operator Certification Course (DOI)
MS Microsoft Corporation
NGO Non-Governmental Organization
NPS National Park Service
NW Northwest
NWR National Wildlife Refuge
NWRC National Wildlife Refuge Complex
OAS Office of Aviation Safety (DOI)
OP Observation Point
OR Oregon
PCR PSP Colony Registry


PCSD PSP Colony Survey Database
PDF Portable Document Format
PF Protocol Framework
PII Personally Identifiable Information
PPE Personal Protective Equipment
PRIMR Planning and Review of Inventory and Monitoring on Refuges (Database)
PSP Pacific Seabird Program (USFWS)
R1 Region 1 (USFWS)
R8 Region 8 (USFWS)
RDO Regional Director Order
ServCat Service Catalog (USFWS Database)
SM Supplemental Material (in this PF)
SO Sampling Objective
SOP Standard Operating Procedure
SSP Site-Specific Survey Protocol
UAS Unmanned Aerial System
USFWS United States Fish and Wildlife Service
USGS United States Geological Survey
WA Washington (State)
YR Year


Narratives

Element 1: Introduction

Background and purpose

In the region extending from Alaska to California and across the tropical Pacific islands, the U.S. Fish and Wildlife Service (USFWS) has public trust responsibility for tens of millions of seabirds. By definition, true seabirds are adapted to the marine environment, obtain a large portion of their resources from the ocean or coastal waters, and come to land mainly for nesting and rearing young. It is at these nesting colonies that seabirds are most easily observed and monitored, and also most vulnerable to exploitation, predation, and disturbance. Three Arch Rocks National Wildlife Refuge, the first National Wildlife Refuge on the West Coast, was established by President Theodore Roosevelt in 1907 to protect large colonies of nesting seabirds. Many other refuges and other public lands along the West Coast have been specifically designated to protect these birds on their nesting grounds.

As abundant marine predators, seabirds play vital roles in marine ecosystems. Due to their high visibility relative to other marine organisms, observed changes in their populations and breeding habits can be valuable indicators of local- or large-scale changes in the health of the marine environment. Seabird populations can be dramatically affected directly or indirectly by a variety of natural environmental fluctuations and human activities. In recent years, many seabird species’ local and regional populations are suspected to have trended steadily downward; however, little is known about actual range-wide trends and their implications for long-term sustainability of some species (USFWS 2005), and whether resource managers should be intensifying protective and conservation measures for them.

Most continental U.S. West Coast seabirds nest on USFWS refuge or other federal public lands (Naughton et al. 2007), where population data for many species have been collected for many years by USFWS personnel and others. However, the lack of standard survey methods, an integrated and statistically rigorous sampling methodology, and a central data repository and management system where survey data from disparate sources are stored and available for analyses has limited the use of these data for ecosystem-scale inferences about species' population status and trends. The Inventory and Monitoring Initiative's Pacific Seabird Program (PSP) was initiated in 2016 by USFWS in collaboration with conservation partners to address these deficiencies. The PSP is an inter-USFWS Regional program (Regions 1, 7, and 8) established for the purpose of monitoring seabird population status and trends by coordinating and integrating seabird surveys conducted by the USFWS and other organizations on species within the USFWS' Pacific Ocean jurisdiction. Ecosystem-wide population status and trend information is fundamental to wildlife and land management agencies for identifying and prioritizing management actions necessary to conserve these migratory birds; the PSP was established to achieve this broader perspective. The California Current System (CCS) defines an oceanographic ecosystem that extends along the West Coast of North America from southern British Columbia to central Baja California, Mexico (experts debate the precise boundaries), and is the initial focus of the PSP and this protocol framework (PF).


This PF for the common murre (Uria aalge; COMU) and Brandt’s cormorant (Phalacrocorax penicillatus; BRAC) is envisioned as the first in a series that will eventually establish standards and minimum requirements for surveys contributing to the CCS-wide breeding population status and trend assessment of seabird species with significant portions of global populations in the CCS. PFs are expected to be produced by the PSP staff in cooperation with its conservation partners as the fundamental tools for achieving program goals of standardization and integration of CCS seabird data from all sources.

Breeding population surveys of seabirds within the CCS have been and will continue to be conducted by a variety of organizations including: National Wildlife Refuges, National Parks, National Monuments, state wildlife agencies, tribes, Canadian and Mexican agencies, academic institutions, and private conservation organizations. USFWS policy (701 FW 2) encourages engaging in cooperative surveys that “meet legal or other directives, assess large-scale resource issues, or improve the effectiveness of the Refuge System.” Cooperative surveys may extend beyond any refuge boundaries and involve partners including “other refuges, Service programs, other Federal or State agencies, or private partners.” The PSP goal of integrating population data collected throughout the CCS can only be achieved with the cooperation of all entities generating pertinent data within the CCS, and the PSP actively seeks to support and promote that cooperation. Throughout this PF, committed and potential PSP cooperators are referred to as Partners.

This PF comprises a foundation and guidance for developing consistent site-specific survey protocols (SSPs) that will provide detailed instructions for conducting breeding colony surveys of COMU and BRAC at specific locations throughout the CCS. The content and structure of this overarching PF meets all requirements set forth in the USFWS’s How to Develop Survey Protocols: a Handbook (Version 1.0) (USFWS 2013). Specifically, each of eight elements of a protocol is addressed in individual sections that are supplemented with Standard Operating Procedures (SOPs) and other supporting materials. The Handbook articulates the requirements for USFWS personnel to prepare SSPs for each base of survey operations under their control. Although Partners conducting surveys with different planning and reporting format requirements may customize their SSPs accordingly, they should use this PF for preparing their content. The individual SSPs should, at minimum, address elements presented in this PF pertinent to the survey method, staff, equipment, and environmental conditions within the respective survey area. At all times and especially during the initial implementation stage when survey participants are developing their SSPs, PSP staff will assist survey coordinators, where possible, to adapt the requirements and recommendations of this PF to their local conditions and needs.

COMU and BRAC are addressed together in this PF because breeding populations of these two species are most efficiently surveyed simultaneously within the CCS by obtaining aerial photographs of their breeding colonies. This is possible because both species typically nest in relatively large, dense, and conspicuous colonies on the surfaces of cliffs and offshore rocks and islands during overlapping seasons, where they are present during daylight hours. Aerial photography (helicopter or fixed-wing aircraft) is the preferred survey method given the relative ease of locating and photographing whole colonies from the air, compared with the difficulty of accurately surveying the large breeding aggregations of these species with boat- or ground-based methods. Counts of either individual birds (COMU) or nests and birds (BRAC) are taken from select photographs of colonies to estimate the respective breeding populations.

As the first PF prepared in conjunction with the PSP, every effort has been made to make it a model for future frameworks to address other seabird species and survey methods given the current (2018) state of knowledge and resources. Nonetheless, it is fully expected that as the PF is implemented and evaluated by survey coordinators and as survey methodology advances, it will be updated and revised when necessary. Thus, the PF is a living document for which users are encouraged to submit comments and criticisms to the PSP staff who will revise it. For example, readers may note Element 4: Data Management and Analysis does not specify what database will be used as the repository. Because this database currently does not exist, the PSP’s Data Manager is initiating a requirements analysis with subsequent development and beta testing in order to deploy a functional, user-friendly application in the future. A full description of the database and how it will be used by survey participants is anticipated to be the first substantial revision of this PF (see Appendix 3, item 1).

Objectives

Management Objectives: The protocols described in this PF are intended to establish standardized surveys that collectively will determine, in a statistically rigorous manner, the status and trends of COMU and BRAC breeding population sizes in the CCS. Specifically, the surveys associated with this PF will:

1. Identify whether these species' populations are declining or increasing at the regional (CCS-wide) and sub-regional scales, and quantify any such changes.
2. Characterize variation in CCS-wide and sub-regional population trends at various time scales.
3. Characterize the geographical distribution of COMU and BRAC breeding colonies, and track changes in their sizes and distributions.

Survey data collected under this PF will provide critically needed information that, when integrated with information from other sources, can inform management activities and conservation goals such as the following:

1. Assess potential impacts from climate change, including sea level rise, increasing storm surges, ocean warming, ocean acidification, changes in upwelling and local currents, and changes in prey availability, in the context of other ecological influences.
2. Assess local, regional, and CCS-wide population response to management actions or other conservation measures.
3. Assess damages from oil spills and other contaminants, including short-term and long-term response to such events. Survey timing and locations may be adjusted to facilitate such assessments.
4. Identify impacts of potential threats from energy production, land uses, or other sources for coastal and marine planning.
5. Inform government regulatory and permitting processes.
6. Inform and direct U.S. Coast Guard and state agency responses to pollution events by providing the latest abundance and distribution information.
7. Strengthen relationships with USFWS partners and cooperators, and integrate seabird information from all sources.

Sampling Objectives

Sampling objectives are derived from the above management objectives, and describe the statistical rigor and confidence levels of the information obtained by the sampling design. When management objectives involve detection of change over time of measured attributes, the sampling objective specifies the time period and magnitude of change that will be confidently detected. See Element 2 for specific sampling objectives, a description of the sampling framework, and sampling design options, and Appendix 1 for statistical power analyses.

Element 2: Sampling Design

Objectives: Overview Objectives of the monitoring are to detect changes in BRAC and COMU breeding population sizes over time and across several spatial scales. In Element 2, management-based objectives are translated (stepped down) into specific sampling objectives. These objectives represent an elaboration of those presented in Element 1.

Temporal change: Determining population change over time involves assessment of several components. First, there is the direction and magnitude of long-term trends (generally exceeding one decade). Second, there is the detection and quantification of short-term changes (e.g., 1-10 years). Third, there is detecting and quantifying changes in trend direction (positive or negative) over the long-term. There is increasing recognition among the scientific community that seabird population trends display significant changes over decades, often associated with changes in environmental conditions (Bestelmeyer et al. 2011). The approach described herein addresses all three components of temporal change.

The three types of temporal change need to be addressed at multiple spatial scales: (1) the entire CCS region, (2) sub-regions within the CCS (as described below), and (3) individual colony or colony complex. The sampling design directly addresses the first two spatial scales, and also provides a basis for SSPs that address the finest spatial scale.

Quantifying change over time at the sub-regional scale is important because threats or other influences on population dynamics may be operating within only one or several of the sub-regions of the CCS. Information at this scale is useful when considering management needs and implementing actions from among available options. The sampling design also allows comparison of sub-regional trends with each other and with the overall CCS-wide trends. In this way, the overall CCS-wide change over time can be partitioned into the sub-regions that contribute to this trend. The interpretation of trends that are similar across the CCS, and the management response to such trends, will be very different from the interpretation of trends that vary strongly across the CCS. Sub-regional trends can also be contrasted with trends at the level of the individual colony or colony complex.

Spatial patterns: Another set of objectives of management interest concerns determining the distribution and abundance of the breeding populations of the two species throughout the CCS. This includes estimating absolute abundance in a standardized and statistically robust manner and determining how the spatial pattern of distribution and abundance changes over time. For example, is one or both of these species shifting its distribution over time, and if so, by how much?

Sub-regions: The concept of sub-regions within the CCS is an important part of the sampling design because of ecological differences throughout this broad region. However, the precise delineation of sub-regions suggested herein (Figure 2.1 and SM 1) is subject to change as better information becomes available, and is only intended as a starting point for the implementation of the PF and SSPs.

Sampling Objectives

For some seabird species, a Recovery Team or other Work Group has already developed sampling objectives independently. However, this is not the case for BRAC and COMU. As a result, this PF derives sampling objectives for BRAC and COMU based on criteria developed for the IUCN (IUCN 2012) by a broad group of scientists and intended to apply to a wide variety of species.

Rationale for the Sampling Objectives: The IUCN has five categories for extant species: Least Concern, Near Threatened, Vulnerable, Endangered, and Critically Endangered, of which only the last three categories have quantitative criteria associated with them. COMU and BRAC are currently classified by the IUCN as “Least Concern.” For the PF, we focus on the criteria that would result in changing the classification of either species to “Vulnerable”. Specifically, this would apply if either of the two species is reduced in population size by at least 30% over 10 years or three generations, whichever is greater. Because seabirds are relatively long-lived, we must consider each species’ generation time.

Based on analysis and demographic modeling of COMUs on the Farallon Islands in central California (Nur et al. 1994, Lee et al. 2008), average generation time is variable (depending on rates of population growth, age of first breeding, and survival rates, which are themselves variable), and is roughly 10 years or more. We have chosen a lower-bound, and thus conservative, estimate of 10 years. Thus, to reach the IUCN criterion of Vulnerable, the COMU population would need to experience the equivalent of a 30% decline in population size over a 30-year period, equivalent to a 1.18% decline per year on average. However, for COMU we set an additional sampling objective of detecting a 30% decline after 15 years. This additional objective was deemed desirable to avoid delaying development of potential management actions and response to a significant decline within the CCS population. Thus, the COMU sampling objectives are to detect a 30% decline at two time scales.

For Farallon Islands BRAC, based on demographic modeling (Nur et al. 1994, Nur and Sydeman 1999), generation time is about five years or more. Again, we have chosen five years as a lower bound, conservative estimate. Hence, the IUCN criterion for BRAC is approximately equivalent to a 30% decline in population size over a 15-year period, equivalent to a 2.35% decline per year on average. These values provide the basis for developing a sampling objective for each species at the CCS-wide level. Specifically, an average decline of 2.35% per year for BRAC is of management concern because, after 15 years, this rate of decline will amount to a 30% cumulative decline. The desired confidence is to detect such a decline, if present, with 80% probability (see below).


The sampling objectives described above for COMU and BRAC apply at the CCS-wide scale; however, the IUCN criteria are not intended to apply at the subpopulation level (IUCN 2012). Therefore, the Sampling Frame Working Group set sampling objectives at the sub-region scale to be detection of a 40% decline over 15 years (i.e., an average decline of 3.35% per year) for each species. Detection of a greater decline is warranted because subpopulations are expected to experience proportionally greater fluctuation in abundance than larger-scale populations; therefore, it would take a larger change to raise concern.
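The annual rates quoted above (1.18%, 2.35%, and 3.35% per year) follow from compounding the cumulative decline over the relevant time horizon. As a quick illustrative sketch (the function name is ours, not part of the protocol), the equivalent constant annual rate is 1 - (1 - d)^(1/t) for a cumulative decline d over t years:

```python
# Convert a cumulative population decline over a horizon of `years`
# into the equivalent constant annual rate of decline, in percent.
def annual_decline_pct(cumulative_decline, years):
    return (1.0 - (1.0 - cumulative_decline) ** (1.0 / years)) * 100.0

# The IUCN-derived objectives used in this PF:
comu_ccs  = annual_decline_pct(0.30, 30)  # COMU, CCS-wide: ~1.18 %/yr
brac_ccs  = annual_decline_pct(0.30, 15)  # BRAC, CCS-wide: ~2.35 %/yr
subregion = annual_decline_pct(0.40, 15)  # either species, sub-region: ~3.35 %/yr
print(round(comu_ccs, 2), round(brac_ccs, 2), round(subregion, 2))
```

Running the sketch reproduces the three per-year rates stated in this Element, confirming they are internally consistent with the 30% and 40% cumulative criteria.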

These sampling objectives refer to detection of long-term, constant population trends. However, short-term changes of large magnitude are also of potential concern for both species. A one- or two-year decline may not be of major concern; for example, strong El Niño or El Niño-like oceanographic events have had one- to two-year, or sometimes longer, impacts on breeding populations (Ainley and Boekelheide 1990, Ribic et al. 1992, Bertram et al. 2005, Capitolo et al. 2014, Oro 2014, Ainley et al. 2018). Therefore, detecting an average decline of 40% over a three-year period in a sub-region for either COMU or BRAC is an additional sampling objective for this PF. This criterion does not refer to a trend; rather, it involves a reduction in the mean abundance metric from one time period (a five-year baseline period) to the following period (a three-year post-change period). The sampling design addresses the same criterion for both species.
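
As a sketch of how this baseline-versus-post-period comparison could be computed (the counts below are hypothetical; the protocol's actual abundance metrics and analysis are described in Elements 3 and 4):

```python
from statistics import mean

def short_term_change(counts, baseline_years=5, post_years=3):
    """Percent change in mean abundance between a baseline period and
    the period immediately following it (negative = decline).
    `counts` is a chronological list of annual abundance metrics."""
    baseline = mean(counts[:baseline_years])
    post = mean(counts[baseline_years:baseline_years + post_years])
    return (post - baseline) / baseline * 100

# Hypothetical sub-region series: stable 5-year baseline, then a drop.
series = [1000, 1040, 980, 1010, 970, 620, 580, 600]
print(round(short_term_change(series), 1))  # -40.0, at the SO4/SO7 threshold
```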

List of Sampling Objectives: In summary, the sampling design identifies seven principal sampling objectives. For each objective, the desired confidence level is an 80% probability of detecting the specified change, where statistical significance is set at an alpha level of P = 0.05 (see Appendix 1).

The seven sampling objectives (SO1 – SO7) are the following:

For COMU:
SO1: For the entire CCS, detect an average population decline of 1.18% per year over a 30-year period (i.e., a cumulative decline of 30%) with 80% probability.
SO2: For the entire CCS, detect an average population decline of 2.35% per year over a 15-year period (i.e., a cumulative decline of 30%) with 80% probability.
SO3: For an individual sub-region, detect an average population decline of 3.35% per year over a 15-year period (i.e., a cumulative decline of 40%) with 80% probability.
SO4: For an individual sub-region, detect a short-term reduction in the mean abundance metric of 40%, for a 3-year period compared to the previous 5 years, with 80% probability.

For BRAC:
SO5: For the entire CCS, detect an average population decline of 2.35% per year over a 15-year period (i.e., a cumulative decline of 30%) with 80% probability.
SO6: For an individual sub-region, detect an average population decline of 3.35% per year over a 15-year period (i.e., a cumulative decline of 40%) with 80% probability.
SO7: For an individual sub-region, detect a short-term reduction in the mean abundance metric of 40%, for a 3-year period compared to the previous 5 years, with 80% probability.


The sampling design for this PF is intended to meet the aforementioned primary sampling objectives. In addition, the sampling design also serves to address the following secondary sampling objectives, which focus on detecting population increase:

- Detection of increasing trend, comparable in magnitude to the listed declining trends (cf. SO1-SO3, SO5-SO6).
- Detection of short-term increases, comparable in magnitude to the listed short-term decreases (cf. SO4, SO7).

These secondary objectives are addressed as well by the sampling design.

Sampling Framework

Sampling units and target universe

The target universe consists of all breeding aggregations of the two colonial species, BRAC and COMU, in coastal areas of the CCS from northern Mexico to southern British Columbia. Breeding aggregations on the mainland as well as on offshore islands are included herein. Also potentially included are estuarine areas, including San Francisco Bay and the Salish Sea (Washington State and British Columbia). These aggregations have been uniquely identified and defined, corresponding to discrete seabird breeding colony sites, which are further documented in Seabird Colony Catalogs and Databases for WA, OR, and CA (see Element 3 and Element 4). A single colony site consists of an island or offshore rock, a small group of islands or offshore rocks, and/or a small section of the mainland coast.

The sampling framework described herein refers to these identified colony locations or sites. The sampling unit is the colony as identified in the Colony Catalog databases. There are more than 460 identified colonies for BRAC and about one-half that number for COMU, though not all are currently occupied.

Sampling Challenges and Sources of Error

For BRAC and COMU, the sampling design addresses four challenges: (1) the geographical extent of the breeding colonies in the CCS is large, (2) counting individuals or nests is very time-consuming, especially for large colonies, (3) the total number of colony locations is large, and (4) translating differences in counts of individuals or nests over time into estimates of changes in relative number of breeding individuals over time is subject to multiple sources of error.

The relationship between counts of individuals or nests at a colony and the number of breeding individuals at a colony is referred to as “detectability” in this PF. For COMU, we identify three types of error associated with detectability: (1) breeding individuals not counted because they were absent during the survey, (2) breeding individuals who were present during the survey but not enumerated, and (3) individuals who were counted as part of a survey but were non-breeders. Surveys in this context are counts taken from aerial photographs (see Element 3).

For BRAC, the number of breeding pairs is estimated on the basis of counts of nests and assessment of nest condition (Element 3). Thus, a breeding pair may fail to be detected if its nest is not observed or if its nest is determined to be in poor condition (e.g., still under construction or partially dismantled after breeding failure). For example, undercounting can occur if a pair's nest is too undeveloped at the time of the survey to be counted but is completed later or, alternatively, if the survey occurred too late in the nesting cycle for the nest to be counted (e.g., the nest had failed some time prior to the survey and either was no longer present or was not counted because no birds were present).

Other factors contribute to the degree of detectability of breeding adult COMU and BRAC on their colonies. Perhaps the primary source of error is that an individual survey is but a snapshot in time, and may be a poor representation of the actual number of breeders at that colony over the course of the season. The relationship between counts of individuals or nests at a colony and the number of breeders is not a constant and reflects many factors including: date of survey, especially in relation to timing of breeding (e.g., incubation, timing of hatching); rates of breeding failure; hour of survey; weather conditions at the time of the survey; survey equipment; characteristics of over-flight (e.g., height); observers, photographers, counters; and foraging conditions. Some of these factors can be controlled and addressed by standardization of survey protocols and observer training, as described in this PF. Of those that cannot be controlled (e.g., survey date relative to breeding phenology), conducting replicate standardized surveys of the same colonies during the same season can support development of statistical models that can estimate effects of some sources of error.

Four-tier Approach to Sampling

To reduce bias, increase precision, and address the previously mentioned challenges, the sampling framework establishes four sampling tiers. Surveys of colonies in Tiers 1-3 provide the means to estimate trends, changes in trends, and short-term changes. Tier 1 colonies are surveyed more than once per year, Tier 2 colonies only once per year, and Tier 3 colonies are surveyed at multiple-year intervals. Tier 4 colonies, which are surveyed every 10-15 years, serve to characterize the distribution and abundance of the two species.

Tier 1: These colony sites are monitored annually with replication. There are two sub-tiers:

1A) Intensively monitored. Data collection includes monitoring of reproductive behavior. Studies at these sites provide information on breeding phenology and allow for the statistical adjustment of counts with regard to timing of the surveys. Information on reproductive success is also important in the interpretation and analysis of breeding survey data (e.g., if there is early season failure or delayed breeding). These sites generally include replicate surveys, which are often land-based.

1B) Sites with replication, non-intensive. These sites have at least two surveys per year. Replicate surveys will generally be aerial surveys, but land- and boat-based surveys may also be included.

Tier 2: Sites surveyed annually. These sites are surveyed only once per year. Generally, these will be sites surveyed aerially. Note that a site may be in Tier 1B (replication) in one year and Tier 2 (no replication) in a subsequent year.


Tier 3: Sites surveyed every several years. These sites are surveyed every third year on a rotating basis. Thus, 33% of Tier 3 sites are surveyed in any given year. These sites serve to extend the spatial coverage provided by sites in Tiers 1 and 2 to improve estimates of population trends and changes in trend.

Tier 4: Tiers 1, 2, and 3 do not include all known breeding colonies. All sites (past or present) not included in Tiers 1, 2, or 3 are included in Tier 4. If possible, about every decade a CCS-wide survey of all tiers should be conducted to enable assessment of the distribution and abundance of the two species throughout the CCS.
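
The survey frequencies above can be expressed as a simple scheduling rule. The sketch below (hypothetical site names and tier assignments; Tier 4 decadal surveys are omitted) generates one season's survey list, rotating Tier 3 sites in thirds:

```python
def surveys_due(year, site_tiers, start_year=2020):
    """Return {site: number of surveys due this season}.
    Tier 1 sites get 2 replicate surveys, Tier 2 sites 1 survey, and
    Tier 3 sites rotate so one third are surveyed each year.
    Site names, tiers, and the start year are hypothetical."""
    tier3 = sorted(s for s, t in site_tiers.items() if t == 3)
    due = {}
    for site, tier in site_tiers.items():
        if tier == 1:
            due[site] = 2                      # annual, with replication
        elif tier == 2:
            due[site] = 1                      # annual, single survey
        else:  # tier 3: surveyed every third year, on a rotating panel
            panel = tier3.index(site) % 3
            due[site] = 1 if (year - start_year) % 3 == panel else 0
    return due

# Hypothetical sub-region: 2 Tier 1, 3 Tier 2, and 6 Tier 3 colonies.
tiers = {"A": 1, "B": 1, "C": 2, "D": 2, "E": 2,
         "F": 3, "G": 3, "H": 3, "I": 3, "J": 3, "K": 3}
due = surveys_due(2021, tiers)
print(sum(1 for n in due.values() if n > 0))  # 7: Tiers 1-2 plus 1/3 of Tier 3
```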

Value of Tiers 1A and 1B (See also Appendix 3, Part 2)

The identifying characteristic of Tier 1A colonies is that they will be the subjects of intensive phenology and reproductive-outcome studies. These will involve multiple replicate surveys tracking within-season variation in counts of individuals and nests and how these are affected by timing of breeding, breeding success, etc. (e.g., Sydeman et al. 1997). Ideally, sites representing the geographical variation of breeding phenology of each species in the CCS should be established to maximize the applicability of the statistical models derived from them to adjust survey results in the respective sub-regions. Colonies with ongoing studies, or that have been surveyed in the past and seem feasible for consideration as Tier 1A colonies for one or both species, include those at Tatoosh Island, Yaquina Head, Coquille Point, Castle Rock NWR, Point Reyes Headlands, South Farallon Islands, Devil's Slide Rock & Mainland, Alcatraz Island, Año Nuevo Island, Castle-Hurricane Colony Complex, and Point Arguello. All of these and others, including BRAC colonies in southern California with distinct breeding phenology, should be seriously considered as Tier 1A sites because intensive monitoring has an important role to play in the long-term success of this sampling framework. Details about the frequency, duration, location, and data collection of these studies should be decided in consultations between site Survey Coordinators, PSP staff, and seabird researchers.

Statistically estimating intra-year error for a colony site requires standardized replicate surveys within a single breeding season. Two or more surveys of Tier 1B colonies conducted on different days of the same season will expose unusually high or low survey counts due to sources of error (e.g., timing of survey was too early or late) or confirm true variation in breeding numbers (reflecting unusual ecological conditions). See also Element 4: Analysis Methods.
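
As an illustration of the intra-year error idea, the following sketch (with hypothetical replicate counts) computes the within-season coefficient of variation across replicate counts of one colony; unusually large values would flag a season's surveys for closer inspection:

```python
from statistics import mean, stdev

def within_season_cv(replicates):
    """Coefficient of variation (%) across replicate counts of the
    same colony within one breeding season (requires >= 2 counts)."""
    return stdev(replicates) / mean(replicates) * 100

# Hypothetical Tier 1B colony with three replicate counts in one season.
counts = [5210, 4980, 5100]
print(round(within_season_cv(counts), 1))  # 2.3
```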

Stratification of Colony Sites in the Sampling Framework

The sampling framework allows characterization of change over time, as well as distribution and abundance of each species, at two spatial scales: the entire CCS and sub-regions of the CCS as described below. To address both spatial scales by reducing bias and increasing precision, the design incorporates stratification of sample sites by 1) sub-region and 2) colony size. Stratification will permit detection of pertinent trends for each stratum and improve CCS-wide population trend estimation.

Sub-regions: The sampling frame identifies eight sub-regions (strata) within the CCS. Objectives of the sampling as described in the PF include detecting change over time at the sub-regional level (SO3, SO4, SO6, and SO7). Tentative sub-regions have been delineated (see Figure 2.1 and Table 2.1). Currently, the same sub-regional boundaries are established for both species. These are proposed delineations; for implementing the PF, they will be revisited with the most recent data and revised as needed. Furthermore, delineation of sub-regions is specifically for the purposes of establishing a sampling design that can attain sub-region-specific objectives. Because survey data are spatially explicit, future analyses will not be constrained by sub-regional boundaries and can group colonies as appropriate to the study question.

The proposed boundaries for the sub-regions were determined by CCS seabird survey experts based on the following primary considerations:
1. Knowledge of the current distribution of COMU and BRAC colonies and locations of population concentrations and discontinuities.
2. Characterization of areas with respect to shared ecological characteristics that may affect BRAC and COMU population change, such as upwelling patterns, currents, prey resources, climate, and/or sources of disturbance at the colonies.
The number of current colonies within a sub-region was also a consideration. The sub-region boundaries described here are not based on documented population genetic structure; they are subject to change as new information becomes available (see Appendix 3, Part 3). In the context of this sampling frame, they serve as one example of how the CCS may be sub-sampled to meet the objectives above. Future PFs that consider other seabird species may designate different sub-regions.

Figure 2.1 shows the eight sub-regions; SM 1 includes individual maps of each showing the proposed boundaries in greater resolution. Note that inland bays have been lumped with adjacent open coasts, particularly for the Salish Sea and San Francisco Bay, because too few colonies of COMU or BRAC occur in either embayment to establish separate sub-regions distinct from adjacent coastal areas. In the future we hope to include the Mexican portion of the CCS, but we currently lack sufficient information about the number and location of BRAC colonies in Mexico. Consistent with the CCS perspective of the PSP, sub-regions that cross state boundaries were recognized, in contrast to the traditional state-specific sub-regional boundaries. The Southern Washington/Northern Oregon sub-region includes the Columbia River, where some BRAC nest in closer proximity to Washington colonies than to those to the south on the Oregon coast. The Southern Oregon/Northern California sub-region recognizes the continuity of colony distributions and ecological conditions within that area irrespective of the political boundary. In California, the sub-region divisions largely align with prominent terrestrial landforms (e.g., capes) that are associated with natural breaks in oceanographic and climatic conditions. Note that BRAC breed further south than COMU; there are currently multiple BRAC colonies in the southern California sub-region, but only one COMU colony.

Colony Size: The second stratification criterion is colony size, which varies strongly for both species, especially COMU, where a single colony can contain fewer than 100 or more than 100,000 breeders. Three strata are proposed: large, medium, and small colonies. Tentative break points were determined by examining the distribution of colony sizes (as determined by the most recent counts available) and noting natural discontinuities in that distribution for each species. For COMU, the initial proposed categories are <5,000, 5,000-15,000, and >15,000 individuals. For BRAC, the categories are <200, 200-1,000, and >1,000 individuals. An approximate indication of how the full set of colonies in the sample frame would fall out into the three size strata is given for each species in Table 2.2. Size categories may change as new information becomes available; the cut-points may also be adjusted among sub-regions to reflect the relative size distribution within each sub-region. The sample design with respect to sample size and allocation of colonies is discussed below.
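
Applying the proposed cut-points, a colony's size stratum can be assigned as follows. This is a minimal sketch; the treatment of counts falling exactly on a boundary (e.g., a COMU colony of exactly 5,000 birds) is an assumption here, since the listed ranges share endpoints:

```python
# Proposed cut-points from the text: (upper bound of Small, upper bound
# of Medium), in individuals. Boundary handling is an assumption.
CUTPOINTS = {"COMU": (5_000, 15_000), "BRAC": (200, 1_000)}

def size_stratum(species, count):
    """Classify a colony count as Small, Medium, or Large."""
    small_max, medium_max = CUTPOINTS[species]
    if count < small_max:
        return "S"
    if count <= medium_max:
        return "M"
    return "L"

print(size_stratum("COMU", 60_607))  # L
print(size_stratum("BRAC", 419))     # M
```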

Additional considerations: The availability of historic data (a recent stretch of 10 or more years of data) is an important sampling consideration. Continued sampling of sites with historical data may be desired, and would require an additional secondary stratification criterion, together with the two primary criteria (sub-region and colony size), as discussed below.


Figure 2.1. Map of the eight proposed sub-regions for surveying COMU and BRAC in the CCS. See SM 1 for detailed maps of each sub-region. Mexican colonies will be included as information about them becomes available.


Table 2.1. An estimate of the distribution of colonies of COMU and BRAC by proposed sub-region in the CCS. These data represent the best available information we have for active BRAC and COMU colonies in BC, WA, OR, and CA. Active colonies represent those that have had nesting birds observed during their last survey and/or have been determined active by local experts.

                                            COMU (Active Colonies)          BRAC (Active Colonies)
Sub-regions                                Total    % Total  % Total       Total    % Total  % Total
                                           Colonies Birds    Colonies      Colonies Birds    Colonies
British Columbia, Washington,
  and Columbia River                          16      1.3%    18.8%          13      0.2%     8.7%
Northern Oregon                               12      1.5%    14.1%           8      0.4%     5.3%
Central Oregon                                10      5.7%    11.8%          13      2%       8.7%
Southern Oregon and Northern California       27     56%      31.8%          18     10.8%    12%
Northern California                            8      3%       9.4%          16      8%      10.7%
North Central California                      11     32%      12.9%          23     41.3%    15.4%
South Central California                       0      0        0             26     21.8%    17.4%
Southern California                            1     Unk       1%            32     15%      21.5%
Totals                                        85                            149

Table 2.2. The distribution of CCS colonies and numbers of breeding COMU and BRAC by colony size (Small, Medium, Large, as defined) based on most recent available estimates. COMU numbers are based on adjusted counts, and BRAC numbers are 2X number of active nests. Note that about 86% of all birds are in large colonies.

                         COMU (Active Colonies)                    BRAC (Active Colonies)
                  S (<5K)  M (5-15K)  L (>15K)    Total     S (<201)  M (201-1K)  L (>1K)   Total
# Colonies           62        9          14         85        102        39          8       149
# Birds          70,910   67,871     848,501    987,282      4,530    13,819     13,680    32,029
% Total birds      7.2%     6.9%       85.9%       100       14.1%     43.2%      42.7%      100
% Total colonies  72.9%    10.6%       16.5%       100       68.4%     26.2%       5.4%      100
Avg. col size     1,313    7,541      60,607     12,822         62       419      1,710       281

Sampling Framework: Study Design and Sample Size

The proposed sampling framework incorporates four key components: the four tiers, stratification criteria, sample size, and selection of sampling sites. The first two components have been discussed above. For sample size, we carried out a statistical power analysis (Nur et al. 1999; see Appendix 1) to provide guidance regarding the minimum number of colony sites needed to achieve the stated sampling objectives (summarized in Table 2.3). However, the number of sites needed depends on how frequently each site is surveyed: more than once per year (Tier 1), once per year (Tier 2), or less than annually (Tier 3). In other words, tiers differ with respect to replication and whether years are "skipped". We use the term "study design" to refer to which tiers are included in the sampling framework. For example, one study design may include Tier 2 sites only (no replication or skipping), whereas another includes Tiers 1-3. In contrast to study design, we use the term "sampling design" to refer to the number of colonies sampled with regard to each of the tiers. Thus, sampling design is the more inclusive term, incorporating both study design (composition in terms of tiers) and sample size. The most inclusive term is "sampling framework", which incorporates sampling design as well as the selection and allocation of sampling sites and thus includes stratification.

Table 2.3. Summarized sampling objectives with 80% probability of detection. See Element 2 for more details.

Sampling    Species  Extent      Decline          Time (years)  Notes
Objective
SO1         COMU     CCS         30% (1.18%/yr)   30
SO2         COMU     CCS         30% (2.35%/yr)   15
SO3         COMU     Sub-region  40% (3.35%/yr)   15
SO4         COMU     Sub-region  40%               3            Compared to past 5 years.
SO5         BRAC     CCS         30% (2.35%/yr)   15
SO6         BRAC     Sub-region  40% (3.35%/yr)   15
SO7         BRAC     Sub-region  40%               3            Compared to past 5 years.

The statistical power analysis is described in Appendix 1. With regard to trend detection, the power analysis focused primarily on the question: What is the sample size needed to achieve 80% power to detect target trends over the specified time span for each of several study designs? Three target trends were investigated: 1.18%/yr decline over 30 years (SO1), 2.35%/yr decline over 15 years (SO2 and SO5), and 3.35%/yr decline (SO3 and SO6).

The power analysis tested the effect of sample size on the ability to detect trends, both decreasing and increasing, and was conducted at the CCS-wide and sub-regional scales. The power to detect short-term changes (SO4 and SO7) was also analyzed. Appendix 1 presents results in more detail, briefly summarized here.
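
The actual power analysis is in Appendix 1. Purely as an illustration of the general approach, the sketch below estimates trend-detection power by Monte Carlo simulation: counts follow a log-linear decline with site, year, and observation noise, a least-squares slope is fit to the yearly mean log-count, and power is the fraction of simulations in which the slope is significantly negative. The variance components and the normal approximation to the test are invented placeholder assumptions, not the protocol's estimates.

```python
import math
import random

def trend_power(annual_decline, years, n_sites,
                sd_site=0.5, sd_year=0.15, sd_noise=0.3,
                nsim=2000, seed=1):
    """Approximate power to detect a log-linear declining trend.
    Noise levels are illustrative assumptions only; detection uses a
    two-sided alpha = 0.05 normal approximation (|z| > 1.96)."""
    rng = random.Random(seed)
    slope_true = math.log(1 - annual_decline)
    xs = list(range(years))
    xbar = sum(xs) / years
    sxx = sum((x - xbar) ** 2 for x in xs)
    hits = 0
    for _ in range(nsim):
        site_eff = [rng.gauss(0, sd_site) for _ in range(n_sites)]
        ys = []
        for t in xs:
            year_eff = rng.gauss(0, sd_year)  # shared year-to-year process noise
            logs = [math.log(1000) + e + year_eff + slope_true * t
                    + rng.gauss(0, sd_noise) for e in site_eff]
            ys.append(sum(logs) / n_sites)    # yearly mean log-count
        ybar = sum(ys) / years
        b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
        resid = [y - (ybar + b * (x - xbar)) for x, y in zip(xs, ys)]
        se = math.sqrt(sum(r * r for r in resid) / (years - 2) / sxx)
        if b / se < -1.96:
            hits += 1
    return hits / nsim

# e.g., a 3.35%/yr decline over 15 years at 9 single-survey sites:
print(round(trend_power(0.0335, 15, 9), 2))
```

Under this toy model, power rises with the number of sites and falls with year-to-year process noise, which is the qualitative pattern the tier comparisons in Appendix 1 describe.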

Sub-regional results: declining trends: Different sampling designs can achieve the same power, as demonstrated by the power analysis. We first summarize the sample sizes needed to achieve the sub-regional objective of detecting a 3.35%/yr decline over 15 years for BRAC (SO6) with respect to the following six study designs, numbered Designs 1-6: 1) Tier 1 only; 2) Tier 2 only; 3) Tier 3 only; 4) Tiers 1 and 2; 5) Tiers 2 and 3; and 6) Tiers 1, 2, and 3.

Results were that either six Tier 1 sites (each with two surveys per year) or approximately 13 Tier 2 sites can achieve the specified power to detect a declining trend (Table 2.4). Design 3 (Tier 3 only) cannot meet the desired power within the range of sample sizes investigated. If the design includes Tiers 1 and 2 (Design 4), then four sites of each tier (eight sites total) will achieve the desired power. Design 5 (Tiers 2 and 3) would require nine sites of each tier (18 sites total). For Design 6, four sites per tier (12 sites total) will achieve the desired power.


For Study Designs 1, 2, 4, 5, and 6, the total number of colony sites needed thus varies from six to 18.

Table 2.4. Sample sizes needed to achieve the sub-regional objective of trend detection for BRAC and COMU. See details in Appendix 1. Target trends are 3.35%/yr decline (SO3, SO6) and the comparable increase, 3.47%/yr (see text), over 15 years. The total number of colony sites needed per design is shown. Sample sizes shown assume an equal number of sites per tier for Designs 4, 5, and 6. Where sample size differed by direction of trend, the respective values are shown as a range; the larger value refers to the increasing trend.

Total Number of Sites Needed per Sub-region
Design  Tiers               BRAC    COMU
1       Tier 1              6-7     5
2       Tier 2              13-14   9
3       Tier 3              >16     >16
4       Tier 1 + Tier 2     8-10    6
5       Tier 2 + Tier 3     18-20   12
6       Tiers 1, 2, and 3   12      9

BRAC surveys display considerable variability in nest counts among years, reflecting annual fluctuations in breeding activity (Nur and Sydeman 1999), movement of adults among colonies, and variability in timing of breeding and nesting success within and among years. Because COMU counts display less variability among years than do BRAC counts, the power to detect a target trend is greater for COMU than for BRAC under the same sampling design. Accordingly, a smaller sample of COMU colonies is needed to achieve the comparable sampling objective. Thus, five Tier 1 sites (Design 1) or nine Tier 2 sites (Design 2) per sub-region can meet the COMU objective SO3 (Table 2.4). Design 4 (Tiers 1 and 2) requires three sites of each tier (six sites total). Design 5 (Tiers 2 and 3) requires six sites of each tier (12 sites total). Design 6 (Tiers 1-3) requires three sites of each tier (nine sites total). Thus, for COMU, the minimum sample size varies from 5 to 12 for all study designs except Design 3 (Table 2.4).

Sub-regional results: increasing trends: Sampling objectives SO1-SO7 were developed with respect to detecting declines. However, detecting population increases is also of management interest. In general, the power to detect an increase at the sub-regional or CCS-wide scale is equivalent to the power to detect the comparable decrease, where "comparable" refers to proportional change rather than absolute percent. A 50% increase is not equivalent in magnitude to a 50% decrease; rather, a 100% increase is equivalent in magnitude to a 50% decrease. For example, if a population of 1,000 increases by 100% and then decreases by 50%, it is back to 1,000; conversely, a 50% decrease followed by a 100% increase also returns it to its starting point. In this context, a 40% decrease (a decrease to 3/5 of the baseline value) is comparable in magnitude to a 66.7% increase (an increase to 5/3 of the baseline value). On an annual basis, a 3.47%/yr increase is comparable to a 3.35%/yr decrease.
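
The proportional-equivalence rule described above can be written as i = 1/(1 − d) − 1, where d is the proportional decrease and i the comparable increase. A quick check (illustrative only):

```python
def comparable_increase(decrease):
    """Proportional increase that exactly offsets a given proportional
    decrease (e.g., a 50% drop is undone by a 100% rise)."""
    return 1 / (1 - decrease) - 1

print(round(comparable_increase(0.50) * 100, 1))    # 100.0
print(round(comparable_increase(0.40) * 100, 1))    # 66.7
print(round(comparable_increase(0.0335) * 100, 2))  # 3.47
```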

The power analysis quantified the sample size needed to detect increasing trends of 3.47%/yr for both species (Table 2.4). In general, sample sizes were similar for detecting an increasing trend and the comparable decreasing trend. For BRAC, the sample size needed was in most cases slightly greater; for COMU there was no difference for sub-regional trends.

CCS-wide results: The target trend for BRAC CCS-wide was a 2.35%/yr decline over a span of 15 years. For each study design, the sample size per sub-region that achieved the sampling objective for a single sub-region also met the CCS-wide objective of detecting the CCS-wide target trend (Table 2.5).

For example, for BRAC, Design 1 (Tier 1 only) with six colony sites yields 81% power to detect the sub-regional target trend of 3.35%/yr decline. The same sampling design (six Tier 1 sites), extended over the entire CCS, is expected to provide 89% power to detect the CCS-wide target of 2.35% decline (Appendix 1). This pattern held for all study designs considered. Hence, satisfying the sub-regional objective SO6 also satisfied the CCS-wide objective SO5.

Table 2.5. Sample sizes needed to achieve the CCS-wide objectives of trend detection for BRAC and COMU. Target trends are 2.35%/yr decline (SO2, SO5) and the comparable increase, 2.41%/yr, over 15 years. The number of colony sites needed per design per sub-region is shown, assuming seven sub-regions for BRAC and five for COMU (see text). Sample sizes shown assume an equal number of sites per tier for Designs 4, 5, and 6. Where sample size differed by direction of trend, the respective values are shown as a range. For BRAC, the larger value shown for Designs 1, 4, and 5 refers to the increasing trend; for Design 3, the larger value is for the decreasing trend. For COMU, the larger value shown (Design 2) refers to the decreasing trend.

Number of Sites Needed per Sub-region
Design  Tiers               BRAC    COMU
1       Tier 1              3-4     3
2       Tier 2              6       4-5
3       Tier 3              14-18   14
4       Tier 1 + Tier 2     4-6     4
5       Tier 2 + Tier 3     8-10    8
6       Tiers 1, 2, and 3   6       6

Similarly for COMU, for each study design the sample size that achieved the sub-regional objective SO3 (detecting a 3.35%/yr decline over 15 years) also met the CCS-wide objectives of detecting a 1.18%/yr decline over 30 years and a 2.35%/yr decline over 15 years (SO1 and SO2, respectively). Thus, COMU sampling designs with five Tier 1 sites (Design 1) or nine Tier 2 sites (Design 2) per sub-region achieved the sub-regional objective (SO3; see Table 2.4). If those sampling designs were applied to all sub-regions, the CCS-wide objectives (SO1 and SO2; Table 2.5) were met.

For both species, the sample sizes needed to detect increasing trends of comparable magnitude (i.e., 2.41%/yr) were similar to those needed to detect declining trends (Table 2.5). Where there was a difference in sample size, the difference was relatively small.


Detecting short-term declines: Sampling objectives SO4 and SO7 refer to detecting short-term changes in counts at the sub-regional level. For both species, sampling designs that met the trend objectives (SO3 and SO6) also met the short-term change objectives of detecting a 40% average decline over a 3-year period for a single sub-region (Table 2.6). The same was the case for detecting short-term increases. For example, for BRAC, six Tier 1 sites are needed to detect a 40% short-term decline (SO7) or a 66.7% increase with 80% power. In comparison, trend detection required either six (declining trend) or seven sites (increasing trend) to achieve 80% power. Results were similar for COMU. Note that Tier 3, with surveys every third year, is not considered here with regard to detecting short-term change.

Table 2.6. Sample sizes needed to achieve the sub-regional objective of detecting short-term change for BRAC and COMU. The objective is to detect a 40% decline (SO4, SO7) or the comparable increase of 66.7% over three years (see text). Only Tiers 1 and 2 are considered. The total number of colony sites needed per design is shown. Sample sizes shown assume an equal number of sites per tier for Design 4. Where sample size differed by direction of change, the respective values are shown as a range; the larger value refers to the short-term decline.

Number of Sites Needed
Design  Tiers             BRAC    COMU
1       Tier 1            6       4
2       Tier 2            10-11   8
4       Tier 1 + Tier 2   8       6

Sampling Design: Conclusions and Recommendations

The limiting factor with regard to the number of survey sites needed to meet the stated statistical objectives for detecting population change was trend detection for the individual sub-region (SO3 and SO6). The specific number of sites needed for each study design differed depending on the composition of tiers.

Thus, under Design 2 (no replication), 13-14 colony sites are needed for BRAC and nine sites for COMU per sub-region to meet the objectives for trend detection for an individual sub-region as well as CCS-wide (Table 2.4). Under Design 1, all sites have replication (here assumed to be two replicates per year), which reduces the minimum number of sites by approximately one-half compared to Design 2. Design 1 is recommended because replication allows for improved estimation of the effects of day of season and other factors that are likely to affect counts, including annual variation in breeding probability and breeding success (see Element 4, Data Analysis). The availability of ancillary information on timing of breeding (collected at Tier 1A sites) makes replication even more valuable. In addition, replication provides for the following: 1) estimation of intra-year sampling error, 2) identification or corroboration of anomalous years, and 3) improved ability to interpret year-to-year variation in counts and identify causes of such short-term variation.

In addition to Design 1, Designs 4 and 6 also include replication. Design 4 (Tiers 1 and 2 only) would require a total of 8-10 sites for BRAC and six sites for COMU per sub-region (assuming an equal number of sites per tier). Design 6 includes not just Tier 1 and 2 colonies but also Tier 3 colonies, which are surveyed every several years (here we assume every third year). Adding Tier 3 colonies expands the geographic basis for trend estimation for a sub-region. A broader set of sampled colonies also assists with analyses to determine whether trends differ within a sub-region with regard to an important criterion such as colony size (large vs. small colonies) or other ecological factors.

Tier 3 sites also contribute to the statistical power to detect a trend for a sub-region or the entire CCS. For example, the power to detect a trend of 3.35%/yr decline for COMU is 68% assuming a sampling design of six Tier 2 sites (i.e., Design 2). By adding six Tier 3 sites (i.e., Design 5), power is increased to 80% (Appendix 1).

In summary, inclusion of Tier 1 sites has benefits beyond detection of sub-regional trends, including improved assessment of individual years and improved statistical estimates of counts. Designs 1, 4, and 6 all include Tier 1 but differ with respect to the other tiers. Inclusion of Tier 3 sites is desirable to the extent that it improves the geographic coverage of trend estimation. This may be the case if, for example, three Tier 3 sites can be surveyed (each site once every three years) for every one Tier 2 site surveyed every year. The inclusion of all three Tiers in a study design corresponds to Design 6. Alternatively, if the total number of sites is fixed, then Design 4 (Tiers 1 and 2) may be preferred to Design 6. All else being equal, surveying twelve sites composed of Tiers 1 and 2 will provide greater precision and statistical power than surveying twelve sites that are a combination of Tiers 1-3, but the former will require more surveys per year. Thus, the decision among Designs 1, 4, and 6 will depend on the number of colonies available to be surveyed as well as on available resources. In addition, it is desirable to include at least one intensive site (Tier 1A) for each sub-region. Tier 1A sites will provide important information that is not otherwise available, especially with regard to phenology (i.e., events in the breeding cycle, including reproductive failure).

Implementation of the PF requires that participants in each sub-region jointly, and in coordination with PSP staff, select a specific study design and associated sample size. This may need to be done before the SSP can describe the scope of the survey effort. Based on the availability of resources and information about changes in colony sizes and distribution, a study design could be modified to increase or decrease the frequency of surveys as deemed appropriate. However, any changes need to take into account criteria for allocation and selection of sites, as described below.

Sampling Framework: Allocation/Selection of Sampling Sites

The sampling frame of BRAC and COMU colonies is large: for each species, there are hundreds of colonies in the CCS. The validity of inferences drawn from the subset of colonies that are surveyed depends on the sample being representative of the population of colonies. Random selection of sites, within the constraints of stratification, is one way to reduce bias. More specifically, the sampling framework is a stratified-random sampling design; i.e., colonies are first stratified by two primary criteria, sub-region and colony size, and are then chosen randomly among available sites. A secondary criterion is the availability of historic data, as described below.

Guidance for sample size selection has been outlined in the previous section. Where the total number of extant colonies in a sub-region is comparable to the desired sample size, inclusion of all colonies is recommended. Where the desired sample size is much smaller than the total number of colonies for the sub-region, colonies to count should be selected based on colony size. All else being equal, it is recommended that all colonies in the "large" category be included in the sampling design, with small and medium-size colonies added to attain the desired total number. The latter two categories can be equally represented, with the combined number of small- and medium-size colonies at least equal to (and likely surpassing) the number of large colonies. The number of colonies per size stratum should remain more or less constant between years unless there is a specific reason to make a change. Table 2.2 shows an overall breakdown of the three size categories for all identified colonies in the CCS based on recent available data. These recommendations are guidelines for implementing the sampling design and are not prescriptive. The size stratification is not intended to dictate analysis, but to aid in selecting colonies for the sampling design.

An additional consideration for site selection is inclusion of colonies with historic data. The suggested threshold is the availability of count data for most years in a recent 10-year period. This includes cases where photographs have been counted (and thus data are ready for analysis) or are available to be counted. Depending on the species, there may be some sub-regions in which most colonies have historic data, whereas in other sub-regions a substantial proportion may not. The two-fold goal is to have an adequate sample size of colonies with historic data, while ensuring that colonies with historic data are representative of the larger set of colonies for that sub-region.

The recommended approach is, for each sub-region and species, to first select candidate colony sites randomly within each of the three colony-size strata. If this yields relatively few sites with historic data (to be determined with the aid of the PSP statistician, or surrogate, when the sub-regional plan is designed), one or more additional colonies would be selected randomly from the pool of colonies with historic data in order to attain the desired number of such sites. For example, for BRAC, if Design 4 were implemented (only Tiers 1 and 2), a minimum of 8 to 10 colonies is recommended per sub-region. If random selection yielded too few sites with historic data, additional sites (beyond the 8 to 10) would be selected to increase the number in that category to five or six, respectively, if available. These additional sites would be added to the original number if resources permit, or they could replace some of the non-historic colonies (provided that the replacement selection is random). This will allow for comparison of historic and non-historic sites to determine whether population change at historic sites is representative of the larger set of colonies for that sub-region. In some cases there may not be sufficient historic sites within the sub-region, so how representative they are of the area will be evaluated on a case-by-case basis. Where a large proportion of colonies already have historic data, or where too few exist to permit augmentation, this step would be omitted. The important point is that sites should be selected randomly within each stratification category, i.e., sub-region, colony size, and, if needed, historic status.
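The selection-and-augmentation procedure described above can be sketched as follows. This is a minimal illustration: the data structure, field names, and top-up rule are hypothetical, and actual selection should be worked out with the PSP statistician.

```python
import random

def select_colonies(colonies, n_per_stratum, min_historic, seed=42):
    """Stratified-random colony selection with a historic-data top-up.

    colonies: list of dicts with keys 'id', 'size_class'
    ('large'/'medium'/'small'), and 'historic' (bool).
    n_per_stratum: desired sample size per size class.
    min_historic: minimum number of selected colonies with historic data.
    """
    rng = random.Random(seed)
    chosen = []
    for size in ('large', 'medium', 'small'):
        pool = [c for c in colonies if c['size_class'] == size]
        k = min(n_per_stratum.get(size, 0), len(pool))
        chosen.extend(rng.sample(pool, k))
    # if historic colonies are under-represented, add more at random
    n_hist = sum(c['historic'] for c in chosen)
    if n_hist < min_historic:
        extras = [c for c in colonies if c['historic'] and c not in chosen]
        chosen.extend(rng.sample(extras, min(min_historic - n_hist, len(extras))))
    return chosen
```

Here the historic top-up adds sites; replacing some non-historic selections at random instead, as the PF also allows, would keep the total fixed.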

Survey timing and schedule

Seabird biologists in the CCS have been conducting aerial population surveys for more than three decades with the goal of surveying during the best time window for capturing the maximum occupancy of surface-nesting breeders of interest at the target colonies. This occurs during the incubation and early chick-rearing phases of the nesting cycle for COMU and BRAC (Sydeman et al. 1997, Manuwal et al. 2001, Capitolo et al. 2014). Predicting exactly when that window occurs each year is challenging because each species responds to different environmental cues that affect the onset of nesting and subsequent nesting outcomes (Nur et al. 2017). These influences vary from year to year, resulting in varying degrees of synchrony within a colony, among colonies, and between species, within and between years. This is further complicated by the necessity to schedule survey logistics (e.g., aircraft, personnel, equipment) before predictive variables can be known, and by the likelihood that a survey may be delayed by poor weather or other logistical issues. Timing becomes even more critical because typically there has been only a single aerial survey per year, and its timing relative to the actual breeding phenology is known to affect the colony counts (i.e., survey date; see Appendix 1). Under these conditions, surveyors must use their local knowledge, experience, and whatever information is available to them to set the target survey window each year, knowing uncertainty remains.

The sampling framework acknowledges this uncertainty and attempts to reduce it by adding standardized replicate surveys when possible, as well as by using Tier 1A colonies to obtain phenological data that can be used to model the effect of survey date and other variables on counts, and to calibrate the timing of future surveys. All available CCS historical information on survey dates, breeding phenology, influences on timing, and survey counts should, at the earliest opportunity, be compiled by the PSP staff to support an initial statistical analysis of optimal survey timing as it varies by sub-region. Note that photographs of BRAC nests can contain some phenological information (e.g., numbers of nests still under construction, numbers of nests in incubation to early chick stage, and numbers of nests in later chick stages). As described in Element 4: Analysis Methods, the spring transition date (reflecting the timing of upwelling in the CCS) provides a good predictor of the timing of COMU egg-laying (Bertram et al. 2005, Nur et al. 2017); it is available for past years and may be correlated with phenological indicators.

As additional information becomes available across sub-regions, subsequent analyses can be used to further inform the survey scheduling process. Multiple land-based surveys within a season can be timed to bracket the entire nesting season to identify phenological events, such as the dates of mean egg-laying, hatching, and fledging. The frequency of surveys within the season will determine the precision with which those events are measured. Task 7 in Table SOP 3.1 refers to scheduling multiple surveys of the same colonies within a season, with the survey frequency and data collected for each to be determined by the sampling objective of the SSP.

Element 3: Field Methods and Processing of Collected Materials

This Element, SOPs 1-3, and SM 2-6 provide guidance and standards for developing SSPs for conducting surveys, preparing colony photographs for counting birds and nests, and compiling the results. Although aerial photographic surveys will be the primary method for collecting population data at most colonies, boat- or land-based surveys may also be used, especially for certain Tier 1 colonies (where phenology and breeding success data may also be collected) and during periodic broad-scale surveys (see Element 2). Boat- and land-based surveys are also useful for correctly identifying cormorant species and for locating small or new colonies that may be missed on aerial surveys. Below are overviews of the survey types; more details are given in the respective SOPs.

Bird and Nest Counts


The fundamental data collected during all survey types consist of counts of birds and nests. Counts will typically be taken from aerial photographs of colonies or from direct observations made during surveys conducted from boats or land.

Common Murre

Because COMU do not build a nest and breed in very dense clusters, only birds are counted. This applies to aerial photographic, boat-, and land-based breeding population surveys. Each bird is tallied separately to produce a total count for the colony, which may include separate counts for subcolonies or other sub-areas. For trend monitoring, raw bird counts conducted during the peak breeding period are often used (e.g., Carter et al. 2001). However, since not all breeders are present, and some non-breeders are likely present at the time of a survey, a correction factor based on separate occupancy studies may be applied to estimate the number of breeding birds (or breeding pairs) present for a season. A correction factor of 1.67 times the colony adult count has been used extensively to estimate breeding adults, but it is based on an occupancy study done on the Farallon Islands (Carter et al. 2001), and its applicability to other areas and years, where occupancy is likely to differ, needs to be confirmed. Actual bird counts should always be recorded and archived to permit updated correction factors to be applied retroactively (see Element 4). Breeders not present at the time of a survey may include active breeding pair members out foraging or loafing on the water, birds that have not yet laid eggs, birds that have failed nesting and departed the colony, or birds that may have been disturbed (flushed) from the colony.
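As a simple worked example, a raw colony count of 12,000 adults multiplied by the 1.67 factor yields an estimate of about 20,040 breeding birds. A minimal sketch of that calculation, keeping the raw count alongside the estimate as the PF requires (the function name is illustrative):

```python
def estimate_comu_breeders(raw_count, k=1.67):
    """Apply an occupancy correction factor to a raw COMU colony count.

    k = 1.67 is the Farallon-derived factor cited in the PF; substitute
    a locally validated value where one exists. Returns the raw count
    together with the estimate so the raw data are always preserved.
    """
    return raw_count, round(raw_count * k)

raw, est = estimate_comu_breeders(12000)   # est == 20040
```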

Brandt’s Cormorant

Cormorants build relatively well-defined nests that can be easily counted from aerial photographs or from boat- or land-based surveys. BRAC nests are counted assuming each nest represents one breeding pair, but deciding what constitutes an active nest requires some experience and interpretation. This PF uses a standardized system of nest and territory categories, summarized in Table 3.1 and Supplemental Materials (SM) 6. Nests in each category are tallied separately. If a nest cannot be categorized, such as one poorly visible in aerial photos, it is counted as an “undetermined nest.” See SM 6 for more details, photographs of categories, and a sample field data form.

Table 3.1. Categories of BRAC nests to be tallied separately in colony surveys. An “X” indicates that, for the respective survey, the nests in that category may be tallied; no X indicates the nest category is not tallied because it is seldom distinguishable during that survey type. See also SM 6. Tallies of the last two categories are not added to nest counts and do not contribute to the estimate of breeding pairs.

| Category | Description | Aerial | Boat | Land |
| --- | --- | --- | --- | --- |
| Well-built nest | Nest with attending adults sitting in an incubating or brooding posture, or with adult standing on nest with eggs visible or possibly obscured by adult (contents undetermined). | X | X | X |
| Fairly-built nest | A well-defined, roughly circular pile of nest material (up to ~6 inches high), with some evidence of a nest bowl depression at its center. | | | X |
| Poorly built nest | A disorganized mound or flat pile of nest material with attending adult. | X | | X |
| Nest w/chicks | Visible small to large-sized chicks; usually with adults attending. | X | | X |
| Brood of large chicks | Usually 1-4 large chicks (>3/4 adult size) within colony (not creching at perimeter) but not in a nest. | X | | X |
| Abandoned nest | Well-built nest with guano ring on nest rim with no attending adult. | X | | X |
| Empty nest | Well-built nest with attending adult but visibly empty of eggs or chicks. | X | | X |
| Undetermined nest | Nest that cannot be placed in any above category due to poor or incomplete visibility. | X | X | X |
| Territorial site | No formed nest, but one adult or pair sitting or displaying courtship behavior. May or may not have an unformed pile of nest material. | X | X | X |
| Undetermined site | Unable to determine between a nest and territorial site because of poor view. | X | X | X |

Each nest is assigned to only one category; if chicks are present and visible in the nest, nest quality is not recorded (i.e., the nest is recorded simply as “nest with chicks”). For aerial surveys or other once-per-season surveys, it is suggested that raw data categories considered “nests” include the following: well-built nests, poorly built nests, nests with chicks, broods of large chicks, abandoned nests, and empty nests. Categories considered “sites” include territorial sites and undetermined sites. Only the total of all categories of nests (excluding undetermined and territorial sites) is currently used to estimate the number of breeding pairs in a colony. This category list may be revised to better estimate annual breeding population size, depending on other data available on breeding phenology or other information. For example, only active nests (i.e., those most likely to have eggs or chicks present) might be used, with poorly built, empty, and abandoned nests excluded, if an annual correction factor is developed that applies to active nests.
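The tallying rule above can be sketched in code. The category key names are illustrative shorthand for the Table 3.1 categories, and the counted set follows the suggested list; both can be revised per the SSP.

```python
# Categories suggested in this PF as "nests" for aerial or other
# once-per-season surveys (illustrative key names for Table 3.1).
AERIAL_NEST_CATEGORIES = {
    'well_built', 'poorly_built', 'nest_with_chicks',
    'brood_of_large_chicks', 'abandoned', 'empty',
}

def brac_breeding_pairs(tallies, extra_categories=()):
    """Sum per-category BRAC tallies into a breeding-pair estimate
    (one nest = one pair).

    Territorial and undetermined sites are never included; the counted
    set can be extended (e.g., with 'undetermined_nest') if the SSP
    revises the suggested category list.
    """
    cats = AERIAL_NEST_CATEGORIES | set(extra_categories)
    return sum(n for cat, n in tallies.items() if cat in cats)
```

For example, with 40 well-built nests, 5 poorly built, 12 with chicks, 3 broods of large chicks, 4 abandoned, and 2 empty, the estimate is 66 pairs regardless of any territorial or undetermined sites tallied.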

Aerial Surveys (See also SOP 1)

All aerial operations conducted with Federal employee participation must comply with all FAA and DOI safety, training, and operational policies and regulations. This PF is not intended to serve as a replacement or substitute for the official documents that describe those policies and regulations, and it is incumbent on all personnel involved with aerial surveys to be familiar and in compliance with the latest versions of those official documents. The latest versions of the following documents (and others) most pertinent to field personnel may be found at the DOI Office of Aviation Services (OAS) website: https://www.doi.gov/aviation/library/forms.

1. Interagency Aviation Training Guide
2. Basic Aviation Safety
3. Interagency Aviation Mishap Response Guide and Checklist
4. Field Reference Guide for Aviation Users
5. Region 1/8 Aviation Operation Go/No-Go Checklist (see also SM 2 and 3)

Type of aircraft: Recent aerial photographic surveys of seabird colonies within the CCS have been conducted from both fixed-wing airplanes and helicopters, and to a very limited extent with unmanned aircraft systems (UAS; drones). Both manned types of aircraft have advantages and disadvantages depending on the specific needs of the survey and the resources available; therefore, the type of aircraft should be determined by the Survey Coordinator. Specific aircraft and pilots must be approved by OAS for use by federal government employees. Under this PF, minimum standards apply to the quality of the aerial photographs obtained from a given platform: colony images must permit comparable (between platforms) counting accuracy after image processing. See SOP 1 for a list of features to consider when choosing aircraft.

Pre-survey logistics and preparations: All survey participants (pilot, aircrew, project aviation manager, flight manager, and flight follower) must hold all required current certifications of training for their respective roles, as specified in the Interagency Aviation Training Guide. See Element 6 for details about personnel training and certifications.

Safety preparations: The SSP must list all personal protective equipment (PPE) that must be available to the survey crew and pilot according to specifications in the Basic Aviation Safety manual, and provide for training in proper PPE use.

The Aviation Safety Plan of the SSP must describe the process of developing a flight plan based on coordination between the Survey Coordinator who is aware of the details of the survey mission (which may vary based on weather during the survey), and the aircraft pilot, who ultimately files the flight plan before the flight. Consult SOP 1 and the safety documents and authorities listed there for all Aviation Safety Plan, Risk Assessment, and reporting requirements. The SSP should set minimum weather standards for conducting a survey based on local knowledge of seasonal weather patterns and the reliability of weather predictions.

Scheduling: Aerial surveys are complex missions that involve coordination of many participants and resources, typically requiring planning and preparations months before any field work can occur. Population surveys must occur during the period when COMU and BRAC nesting seasons overlap in the survey area. Therefore, SOP 1 includes a work plan with a schedule of all necessary activities that the Survey Coordinator administers. An annual schedule should consider aircraft availability, certification of aircraft and pilot, and the availability of survey personnel training courses to assure that all certifications are current for the survey. See Table SOP 1.3 for an example personnel training tracking chart.

Survey equipment: Table SOP 1.1 includes a checklist of survey equipment. Professional-quality Digital Single Lens Reflex (DSLR) cameras are the preferred equipment for obtaining high-resolution photographs of colonies from aerial surveys. Cameras currently (2017) in use offer high-speed automatic focus, image stabilization, quick lens exchange, and rapid cycling of images up to 51 megapixels. They are more than adequate to produce excellent photographs suitable for processing and counting birds the size of COMU from safe survey altitudes. The SSP sets minimum camera and lens specifications (e.g., 21 megapixels, auto-focus, image stabilization) and allows for periodic re-evaluation of affordable equipment that might improve survey efficiency and image processing: higher resolution (permitting higher-altitude flight, and thus less disturbance, or fewer pictures at lower magnification to achieve coverage), faster image cycling, larger-capacity data storage cards, GPS coordinate stamping, or photography obtained using UAS.

Field data collection procedures: On the day of the survey, the first order of business will likely be a consultation between the Survey Coordinator and pilot about flight conditions and the status of the aircraft to determine if the mission can occur. If it is a “go” as determined with the Go/No-Go Checklist, the boarding process can be hectic and stressful even for experienced flyers; the more detail the SSP includes for this phase of the process, the less likely the crew is to discover in the air that a critical item was overlooked. See SOP 1 for a procedural checklist and additional details.


Boat-based Surveys

Although most COMU and BRAC breeding surveys will be conducted primarily using aircraft, certain areas (e.g., Salish Sea) have been, and likely will continue to be, surveyed from boats due to cost-effectiveness and the ability to collect information beyond simple counts (e.g., breeding phenology, behavioral observations) with relatively simple logistics. When information gathered from boat-based surveys includes more than simple counts of breeding birds, as for Tier 1A colonies (see Element 2), it will supplement and contextualize breeding bird and nest counts derived from aerial surveys. This section is an overview of what should be covered if the SSP includes boat-based surveys; see SOP 2 for more details.

For boat-based surveys, compliance with official DOI (http://elips.doi.gov/ELIPS/DocView.aspx?id=1624) and USFWS (https://nctc.fws.gov/courses/programs/watercraft-safety/documents/241fw1.pdf) watercraft operations policies is mandatory, and the SSP should incorporate all elements of the most recent policies that apply to the survey. The DOI Motorboat Operator Certification Course (MOCC) Student Manual is available at https://training.fws.gov/courses/programs/watercraft-safety/MOCC-StudentManualv.2016-12-20.pdf, and other instructional materials are available at https://nctc.fws.gov/courses/programs/watercraft-safety/resources.html.

Type of watercraft: The SSP can specify whatever type of boat is appropriate for the local conditions and provides a safe platform from which high quality observations that meet the survey objectives can be made. Policies cited above address requirements for watercraft certification, safety and navigation equipment, and maintenance procedures.

Pre-boat survey logistics and preparation: The boat operator must hold a current MOCC certificate and have received any additional safety and operational training with the selected watercraft that is necessary to prepare for conditions that may be encountered on the survey. See Element 6 and the boat-based survey SOP 2 for more details about personnel training and certifications. The watercraft should be inspected and current on all scheduled maintenance (including adding fresh fuel), and taken on a test float at least a few days before the survey if it is not in regular use. Electronics (radar, marine VHF radio, GPS, etc.) must be in operable condition, and the boat captain must know proper operating procedures.

Safety preparations: The SSP should detail how the survey Float Plan will comply with all safety considerations described in the U.S. Fish and Wildlife Service policy at https://nctc.fws.gov/courses/programs/watercraft-safety/documents/241fw1.pdf, including providing PPE and exposure survival equipment for all crew and training in its proper use. A Personal Locator Beacon or similar device should be verified as operable. Procedures for completing a survey Float Plan (to be given to a designated on-shore trip follower) and for conducting an operational risk assessment (i.e., GAR; see the MOCC Student Manual) should also be in the SSP.

Scheduling: Boat surveys typically involve coordination of multiple participants and resources requiring planning and preparations months before any field work can occur. SOP 2 includes an example work plan and training schedule to assist the planning process.


Survey equipment: The SSP should include a complete list of the equipment and supplies that will be needed to conduct the survey using watercraft, such as the checklist included in SOP 2 plus appropriate safety equipment listed in the MOCC Student Manual.

Field data collection procedures: The SSP should include a procedural checklist for tasks to be done on the day of the survey, such as the example given in SOP 2 which includes both safety and survey data collection procedures. It is the responsibility of the Survey Coordinator to assure that the field method protocols, as detailed in the SSP, are being implemented correctly as observations are made and recorded.

Land-based Surveys

Land-based surveys include any counts, photography, or observations of COMU or BRAC colonies made from observation points located on the colony itself or on adjacent land. Land-based surveys will most likely be conducted at certain Tier 1 colonies selected for intensive monitoring (see Element 2), in part because they offer the opportunity for longer, more frequent observations at lower cost and with simpler logistics than aerial or boat surveys. However, they will not be feasible in some large areas (e.g., portions of the Washington outer coast) where colonies are too distant from shore or difficult to access.

Pre-survey logistics and preparation: Most COMU and BRAC colonies in the CCS are only partially visible from land observation points (OPs). Therefore, land-based bird and nest counts generally comprise indices of relative abundance rather than complete counts of the colony population, and the accuracy and repeatability of the observations depend on standardizing the location of the OPs, the areas of the colony being surveyed, and other factors that may affect the accuracy of the counts. However, OPs may be moved between years to obtain optimal views if the subject colony has relocated substantially. SOP 3 describes the selection criteria and documentation needed for each OP, a survey equipment list, and personnel roles and training requirements.

Safety preparations: Safety considerations should be addressed if access or use of the OP involves exposing the survey personnel to hazardous conditions such as steep slopes, high winds, unstable footing, or other falling hazards. SOP 3 describes how to mitigate those hazards.

Scheduling: SOP 3 includes a template for a work plan that should be part of the SSP, and describes an annual schedule that includes details unique to each season’s surveys, such as dates of each OP visit and survey crew member names.

Survey equipment: The SSP should include a complete list of the equipment and supplies that will be needed to conduct the survey, such as the checklist included in SOP 3.

Field data collection procedures: Details of the field procedures (i.e., how and what data are collected) will be determined by the role that land-based surveys will play in the sampling framework design (Element 2) as specified in the SSP. SOP 3 presents a general procedural structure that will likely be common to all land-based surveys.


Establishing sample units

Sampling units for all survey methods will be colony sites for breeding COMU and BRAC, as defined and numbered by the respective jurisdictional Colony Databases. (Exceptions: surveys designed to produce index counts of portions of colonies for the purpose of obtaining breeding phenology, productivity, or other information that may or may not be extrapolated to larger populations. In those cases, the sample unit will be defined in the appropriate SSP.)

All currently known CCS colony sites in the US for COMU and BRAC have been assigned official USFWS identification numbers: the first three digits correspond to the U.S. Geological Survey (USGS) 1:250,000 quadrangle map number where each colony is located, and the next three digits were initially assigned sequentially from north to south within that map, although as new sites were added the number sequence no longer followed that geographic pattern. In Oregon only (Naughton et al. 2007), colonies discovered after the initial numbering have been assigned sequential decimals of the nearest previously numbered site (e.g., a site numbered 219-001 in OR in 1988, when numbers were first assigned, was the closest site to a new colony found later, which was designated as colony 219-001.1). In California, some colony sites comprising multiple breeding islands have separately numbered sub-colonies within them (Carter et al. 1992).
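Because downstream data management keys on these identifiers, it can help to validate them programmatically. Below is a small sketch of a parser for the numbering convention just described; the regular expression and error handling are assumptions for illustration, not part of the PF.

```python
import re

# quad number, sequence number, optional decimal suffix (e.g. 219-001.1)
COLONY_ID = re.compile(r'^(\d{3})-(\d{3})(?:\.(\d+))?$')

def parse_colony_id(colony_id):
    """Split a USFWS colony number like '219-001.1' into its parts.

    Returns (usgs_quad, sequence, decimal_suffix_or_None). Raises
    ValueError for strings that do not match the basic pattern.
    """
    m = COLONY_ID.match(colony_id.strip())
    if not m:
        raise ValueError(f'unrecognized colony number: {colony_id!r}')
    quad, seq, sub = m.groups()
    return quad, seq, sub
```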

Colony site numbers in the United States correspond to single islands, small island complexes, sections of mainland coastline, or (less often) a combination of these forming a discrete cluster of known current or historic breeding sites. Many numbered colonies include multiple seabird species, and some include multiple discrete clusters of nests of a single species, including COMU and BRAC. For some analyses, numbered colonies have been clustered into colony complexes (e.g., Carter et al. 2001; Capitolo et al. 2006, 2014). In Oregon, the sites have traditionally been considered single colonies identified by their colony numbers. Historical count data are organized and archived by numbers of each species within each unique colony site number.

For the purposes of this PF, the sampling framework (Element 2) has been designed to treat numbered colony sites as the sampling units from which population parameters for COMU and BRAC are estimated. PSP staff will work with each Partner to determine which colonies should be sampled in a given season according to the sampling framework. Answers to interesting and important questions about intra-colony (as defined herein) variability and dynamics are beyond the scope of this PF.

Due to the inconsistency of colony site numbering conventions between jurisdictions, if new unnumbered COMU or BRAC breeding sites are discovered during a survey, the Refuge Biologist at the respective state office should be consulted for a number assignment: CAN or WA – Washington Maritime NWR; OR – Oregon Coast NWRC; CA – San Francisco Bay NWRC.

Processing of collected data

Raw collected data will be in the form of unprocessed aerial digital photographs, field notes, and field data forms containing counts and survey metadata. Counts of birds and nests will be derived from a subset of the many hundreds of aerial photographs taken, selected as the best representations of the surveyed colonies. The non-photographic data (phenology and breeding status) and metadata will provide the context for interpretation of the count data. All of these data must be processed to be organized and available for analyses, and the SSP will describe in detail how that processing is to be done and what quality assurance and verification procedures are followed for each data category.

Metadata: Metadata for each survey should be recorded digitally (or on field data forms) on the day of the survey and verified by a separate participant if possible. Consistent with the Data Management Plan (Element 4 and Appendix 2), the SSP will list all information to be recorded, the record format, and the record file location. The SSP should also follow guidelines in Element 4 for recording metadata related to supporting digital photographs and entering it into the PSP Colony Survey Database (PCSD).

Aerial photographs: The SSP will incorporate detailed instructions for processing images gathered during the survey, including the following image processing tasks:

1. Upload images from cameras (memory cards) to file folders on two hard drives labeled by date, camera, and colony number as soon as possible after the survey, then store the hard drives in separate locations (or, if only one drive, store it in a fire-proof safe)
2. Options for geo-referencing images
3. Process photos and merge images (if necessary) to obtain whole-colony views
4. Perform COMU and BRAC bird/nest counting in ArcMap or comparable software
5. Save the image with tally marks (dots) to a digital file, using a standardized file naming system
6. Tabulate and record the counts in the PCSD (see Element 4)
7. Archive all photos, including those not counted (see Element 4)
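Step 1 above depends on a consistent folder-labeling scheme so images on both backup drives can be matched later. The sketch below illustrates one way to build such labels; the YYYYMMDD_camera_colony pattern and the drive paths are hypothetical examples, since each SSP defines its own convention.

```python
from datetime import date
from pathlib import PurePosixPath

def upload_folder(drive: str, survey_date: date, camera: str, colony: str) -> PurePosixPath:
    """Build a destination folder path labeled by date, camera, and colony number.

    The folder-name pattern (YYYYMMDD_camera_colony) is a hypothetical example;
    each SSP defines its own labeling convention.
    """
    name = f"{survey_date:%Y%m%d}_{camera}_{colony}"
    return PurePosixPath(drive) / name

# The same folder name is used on both backup drives.
primary = upload_folder("/drives/A", date(2018, 6, 15), "cam1", "OR-270")
backup = upload_folder("/drives/B", date(2018, 6, 15), "cam1", "OR-270")
```

Using identical names on both drives makes it trivial to verify that the two copies stay in sync.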

Geo-referencing images refers to using software that associates each photograph with the GPS coordinates of the camera's location when the photograph was taken. When the GPS track of the survey route is mapped with each image number superimposed on the track (Figure 3.1), the potential for confusion about which colony is shown in a particular image is reduced (see Gladics 2009). Surveyors who are very familiar with the survey area and careful about taking photos in spatial sequence may find this step redundant, and this PF does not require geo-referencing, but less experienced surveyors may find it very useful.
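The core of geo-referencing is matching each photograph's timestamp to the nearest point on the GPS track log. The sketch below is a minimal stand-in for the commercial software mentioned in the text; the track coordinates are illustrative placeholders, and it assumes the camera clock and GPS logger are synchronized.

```python
from datetime import datetime

def nearest_fix(photo_time: datetime, track: list) -> tuple:
    """Return the (time, lat, lon) GPS fix closest in time to a photo.

    Assumes the camera clock and GPS logger record comparable timestamps;
    coordinates below are illustrative only.
    """
    return min(track, key=lambda fix: abs(fix[0] - photo_time))

track = [
    (datetime(2018, 6, 15, 9, 0, 0), 44.6770, -124.0790),
    (datetime(2018, 6, 15, 9, 0, 30), 44.6765, -124.0800),
    (datetime(2018, 6, 15, 9, 1, 0), 44.6760, -124.0810),
]
fix = nearest_fix(datetime(2018, 6, 15, 9, 0, 40), track)  # matches the 9:00:30 fix
```

In practice a clock-offset correction would be applied first if the camera and GPS clocks disagree.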

See Gladics 2009 for an example of detailed, step-by-step instructions for obtaining geo-referenced aerial photographs using Garmin and GPS-Photo Link® software, merging select images using Adobe Photoshop® to prepare them for counting, conducting the counting (i.e., “dotting”) in ArcMap®, and tabulating the results. These instructions have not been updated to include the latest versions of the various software programs but nonetheless provide a comprehensive description of the functional steps developed for use by the Oregon Coast NWRC. Other stations have successfully used different procedures and software, and may continue to do so if the ultimate goal of obtaining accurate, repeatable counts from appropriately labeled and archived images can be achieved. The SSP should be flexible enough to adopt new technologies (e.g., automated counting software) as they become available to improve efficiency.


Count (numerical) data collected by boat or land: Counts made in the field or derived from photographs will need to be properly labeled, recorded, and entered into the PCSD according to standard conventions specified in the Data Management Plan (Element 4) and the SSP. These counts will be identified by survey type and processed separately from counts derived from aerial photography.

Data verification and quality assurance: There are many opportunities for errors or mishaps to affect the quality of the data produced by large surveys with multiple participants, for example: mis-recording data, incorrectly transferring data to a different medium, inexperience with the photographic equipment used, failure of the lens’s autofocus system, failure to collect complete and accurate information, and accidental loss of data records. Survey Coordinators should specify methods to verify data fidelity at these points in the process. For example, proper field personnel training (Element 6) should minimize incomplete or inaccurate data collection and recording. See the respective survey type SOPs for additional data quality assurance measures.

Figure 3.1. Satellite photo showing locations of geo-referenced photographs (JPEG), indicating the position of the camera when each image was taken. This tool helps identify which colony is shown in each image.


End of season procedures Survey supplies and equipment should be carefully inspected and properly prepared for storage, including cleaning and servicing of equipment, battery removal and charging, and reformatting of data memory cards. Any equipment replacement or repairs should be done at this time in preparation for the next season or scheduled photographic survey.

A full accounting of actual associated survey costs (including labor, see Element 7) should be made and reported to the site supervisor (e.g., Refuge Project Leader) and the PSP Coordinator, with explanations of discrepancies from pre-survey cost estimates, if any. Non-USFWS Partners may have different cost accounting procedures.

The SSP describes how all digital and physical data will be managed, backed up, and then archived securely according to the Data Management Plan (Element 4). If data processing or analyses will not be done by the Partner, coordinate data transfer to the appropriate parties. Otherwise, transfer reports of analyses as they become available to the PSP Data Manager as prescribed in the Data Management Plan.

Element 4. Data Management and Analysis

Many individuals from different government agencies and research and conservation organizations currently collect seabird population and nesting data from colonies within the CCS. Although similar data are being collected, different protocols and methods result in data that may not be easily combined across jurisdictions and sub-regions for analyses. This Element describes how all partners can standardize data collection, storage, and analysis across the CCS to overcome this issue.

Data collected in the field must adhere to the data standards described in this PF and shall be consistent with the conventions of other USFWS Inventory and Monitoring (I&M) centralized databases such as ServCat (an online archive of USFWS documents), PRIMR (an online database of survey metadata), and FWSpecies (an online database of refuge species occurrences), through which data can be made publicly accessible. A centralized cross-regional application for the entry of colony data will allow for consistency, increased accuracy, and an ability to report and analyze data across a wider geographic range. This Element describes the minimum functionality and features of such an application as the first step toward identifying what existing or new application will ultimately be used to serve as the repository of these and other survey data regarding seabirds in the Pacific. A critical consideration will be the needs and requirements of the survey partners who will enter and then seek to extract data. Development of that database is a top priority of the PSP Data Manager and PSP Steering Committee, USFWS I&M staff, and ultimately PSP partners throughout the CCS.

Data Processing A PSP Colony Survey Database (PCSD) will be developed to provide centralized data entry, data storage, and reporting/analysis tools for all seabird colony surveys in the CCS. Application development will carefully consider data to be collected under this PF and the need to accept data from other seabird monitoring activities in the Pacific. It will be available to all partners involved in seabird colony monitoring, with access controlled via username and password. During registration, users will request access to the specific colonies they survey for data input. The PSP Data Manager (DM) will review and grant such requests within two working days; an alternate approver will be assigned during DM absence to avoid or minimize delays in approvals. Data fields applicable to the Partner’s SSP will be presented to the user for data entry. Some of the fields will only accept values from an approved domain, while others will accept any value. Required fields will be determined and identified prior to data entry. Survey metadata (weather, etc.) will be entered and linked to specific survey data. After data entry is complete, a screen of values will be presented to the user for quality control review. Flags will indicate if entries are outside the range of accepted values or if required fields are missing. Users will be able to identify the degree to which their data are accessible by the public. All data collected by US federal agencies or at federal properties will be available to the public after they have been entered and reviewed.
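The quality-control screen described above amounts to two checks: required fields present, and values within an approved domain. A minimal sketch, assuming hypothetical field names and domains (the actual PCSD schema is yet to be developed):

```python
def validate_record(record: dict, required: set, domains: dict) -> list:
    """Flag missing required fields and out-of-domain values.

    A minimal sketch of the PCSD quality-control review; field names and
    domains below are hypothetical placeholders, not the actual schema.
    """
    flags = []
    for field in sorted(required - record.keys()):
        flags.append(f"missing required field: {field}")
    for field, allowed in domains.items():
        if field in record and record[field] not in allowed:
            flags.append(f"value outside accepted domain: {field}={record[field]}")
    return flags

flags = validate_record(
    {"species": "COMU", "count_type": "chicks"},
    required={"species", "count_type", "survey_date"},
    domains={"species": {"COMU", "BRAC"}, "count_type": {"birds", "nests"}},
)
# Two flags: survey_date missing, and "chicks" outside the count_type domain.
```

Returning all flags at once, rather than stopping at the first problem, matches the review-screen behavior described in the text.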

Although direct digital entry of data is commonplace for many citizen science projects like eBird and iNaturalist, the difficulty of repeating logistically expensive surveys if data are lost to accidental equipment malfunction justifies using traditional hard-copy field data forms and notebooks. However, SSPs may allow electronic field data entry if the specified device has been field tested and shown to be reliable, and backup units are always taken on surveys. The benefits of electronic field data entry may outweigh the risk of data loss, as data entered into a digital system can benefit from error checking, automated backup, and reduced input time.

Datasheets (or electronic forms) adaptable to each site-specific protocol (SSP) will be developed and provided by PSP staff to users in both Adobe PDF® and Microsoft Word® formats, and will appear similar to electronic data entry screens (see SM4 for an example data sheet). Upon return from the field, a Survey Coordinator will scan the completed datasheets and store them on a local hard drive, or server if available. For surveys that do not involve photo interpretation, data will be entered into the PCSD as soon after the survey as possible. If the survey involves the collection of photographs, initial data processing will focus on the collection of metadata about the survey (e.g., GPS track logs, weather information). Photographs will be named according to the naming convention described in the SSP, stored on a local server/hard disk, and uploaded to ServCat or other cloud-based archive within a month of collection. Once processing and interpretation have been completed for one or more digital images of a colony, the results will also be added to that same cloud-based archive, using the naming convention in the SSP.

The PCSD will allow data entry by any registered user, regardless of agency affiliation. A list of data attributes, attribute definitions, and accepted values will be included in each SSP and will allow flexibility, where needed. Minimum required attributes are presented in Appendix 2. A list of known seabird colonies and their specific geographic coordinates will be maintained by the PSP DM who will coordinate all additions. This will be known as the PSP Colony Registry (PCR), which will be maintained with attributes related to the physical characteristics of the colony location including geographic coordinates stored in WGS84 as decimal degrees with a minimum of six digits after the decimal point, habitat characteristics, topography, ownership, and predator occurrence.

Data Queries Users would be able to query the application and readily view data according to established data access rules. Specific query and summary pages will be developed based on the results of surveys of user needs and requirements.

An online map of colonies (PSP Colony Mapper) linked to survey data will allow the public to interact with the data and perform simple queries about species composition and populations of each colony. This mapper will be maintained by the PSP DM.

Metadata Complete survey documentation is essential to assure that data collected at high risk and expense have maximum utility to future seabird and land managers, researchers, and other potential users. Metadata shall describe all aspects of the subject data, including who collected and interpreted the data, what and where the data were collected, the methods used, and the reasons why the data were collected. Metadata for the PCR will be created and maintained by the PSP DM and follow Federal Geographic Data Committee Standards (Federal Geographic Data Committee 1998). Metadata for the PCSD will adhere to USFWS Data Standards (274 FW 1: https://www.fws.gov/policy/274fw1.html) and will include, at a minimum:
a. Name and type of survey
b. Date and time of survey
c. Personnel and roles of surveyors
d. Environmental conditions during survey (weather, seas)
e. Geographical extent of survey - if a refuge, use the FBMS code for that refuge, plus any accepted sub-unit name, and colony name
f. Programmatic context of survey (where in the sampling frame)
g. Name of individual entering the data into the database.
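The seven minimum metadata items (a-g) above can be captured as a simple record and checked for completeness before archiving. The key names below are hypothetical stand-ins for whatever the PCSD schema ultimately uses:

```python
# Minimum PCSD metadata items (a-g); keys are hypothetical field names.
REQUIRED_METADATA = (
    "survey_name_and_type",      # a
    "survey_date_time",          # b
    "personnel_and_roles",       # c
    "environmental_conditions",  # d (weather, seas)
    "geographic_extent",         # e (FBMS code, sub-unit, colony name)
    "programmatic_context",      # f (place in the sampling frame)
    "data_entry_person",         # g
)

def missing_metadata(record: dict) -> list:
    """List any of the seven minimum metadata items absent from a record."""
    return [k for k in REQUIRED_METADATA if k not in record]

gaps = missing_metadata({
    "survey_name_and_type": "COMU aerial photographic survey",
    "survey_date_time": "2018-06-15T09:00",
})
# Five of the seven minimum items are still missing from this record.
```

A check like this could run before a survey record is accepted into the archive.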

Data security and archiving Digital photos collected for seabird monitoring shall be grouped by colony map number and sub-region and archived in ServCat as soon as practicable. Non-USFWS cooperators can provide their photos and metadata to the PSP DM for archiving in ServCat.

Paper copies of datasheets shall be digitally scanned to a local hard drive upon return from the field. For surveys conducted by the USFWS, these scanned datasheets shall be archived in ServCat at the end of the field season (or ASAP) with the following naming convention: colonycode_date_initialsofdatacollector.pdf. ServCat administrators will be responsible for ensuring security and backup of the archived electronic data. Once the PCSD platform is established, an SOP will be developed to provide additional details on end-of-season procedures, and data security and archiving.
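The scanned-datasheet naming convention above has three components. A small helper like the following keeps the names consistent; the YYYYMMDD date format is an assumption, since the text specifies only the components and their order:

```python
from datetime import date

def datasheet_filename(colony_code: str, survey_date: date, initials: str) -> str:
    """Build the archive name colonycode_date_initialsofdatacollector.pdf.

    The date format (YYYYMMDD) is an assumption; the convention in the text
    specifies only the three components and their order.
    """
    return f"{colony_code}_{survey_date:%Y%m%d}_{initials.upper()}.pdf"

name = datasheet_filename("OR-270", date(2018, 6, 15), "sws")  # "OR-270_20180615_SWS.pdf"
```

Generating names programmatically (rather than typing them by hand) avoids the inconsistent capitalization and date formats that make archived files hard to query later.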

Regardless of the application used by the PSP for these data, the DM will assure that all data entered are in an open and accessible format and archived by the refuge or agency in ServCat or other accessible archive, and will update the PSP colony mapper by December 31st of each year.

Analysis Methods The following is the suggested approach to analysis of the survey count data in order to address management and sampling objectives. It is expected that the PSP statistician will take the lead on conducting these analyses with input from other PSP staff and the Steering Committee, and in coordination with the Survey Coordinators. Considering the complexity and diversity of datasets that will be compiled and the scope of questions to be addressed, these analyses will be best carried out by an analyst with advanced training.

The dependent variable to be analyzed consists of counts of individuals (COMU) or of nests (BRAC) at colonies (as defined by the Seabird colony database), based on individual survey results.

Where a colony has been surveyed and tallied multiple times in the same breeding season, results of each survey will be included in the analysis. An important feature of the analysis method is the inclusion of information on variables that may be influencing “detectability” as discussed above (Element 2). In this regard, the date of the survey (i.e., day of year or “Julian date”) is particularly important. Other variables should be considered for inclusion in the analysis depending on the information available regarding the survey, as illustrated below. Therefore, databases that capture the relevant ancillary information must be assembled.

Regarding the analysis of change over time, three types of analysis are considered here: 1) long-term trends (e.g., over 15 years), 2) short-term changes (over the span of three years or less), and 3) changes in trend. These analyses are of data from repeated surveys conducted at colony locations, as determined by the sampling framework, such that surveys are conducted more than, less than, or exactly once per year at each location (Tiers 1, 3, and 2, respectively; see Element 2).

For many purposes, surveys for all colonies in the CCS will be included in the analysis (described below), and thus, data from multiple sub-regions are included in the statistical models analyzed. However, there may be cases where an individual sub-region is analyzed by itself, or other subsets of the full CCS are pulled out for analysis.

In general terms, the statistical model for analyzing trend can be thought of as analyzing counts (Y) for colony site i in year j on date k, in relation to variables that specify trend, variables that uniquely identify a colony (“colony id”), and variables that affect detectability. In this regard, Generalized Linear Models (GLMs) provide a valuable, robust, and widely-available approach to the analysis (Dobson and Barnett 2011). Note that there may be cases where the GLM simplifies to a Linear Model, but it is still helpful to think of the latter as just a special case of the broader GLM.

Here we outline the use of negative binomial regression, one type of GLM, which is especially suitable for analyses of counts (Hilbe 2011). Negative binomial regression analyzes counts in relation to a set of predictor variables, similar to a Linear Model, but with the relationship between counts and the predictor variables being a “log-link”. Thus, what is being predicted by the predictor variables and their coefficients is ln(Y), not Y itself. Population change over time is proportional (e.g., a population increases at 5% per year), and so either a log-link or a logarithmic transformation of the count variable, i.e., ln(Y), is needed for all analyses (Nur et al. 1999).
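The proportional-change logic can be illustrated with a toy calculation: a population growing 5% per year is linear on the log scale, so the slope of ln(Y) against year recovers ln(1.05). A numpy sketch with simulated, noise-free counts (an illustration of the log-link idea, not the full negative binomial model):

```python
import numpy as np

# Simulated counts growing 5% per year: Y_j = 1000 * 1.05**j.
years = np.arange(20)
counts = 1000.0 * 1.05 ** years

# On the log scale the trend is linear: ln(Y_j) = ln(1000) + j * ln(1.05),
# so an ordinary least-squares fit to ln(Y) recovers the growth rate.
slope, intercept = np.polyfit(years, np.log(counts), 1)

annual_change = np.exp(slope) - 1.0  # back-transform: 5% per year
```

With real survey counts the fit would carry noise and over-dispersion, which is why the text recommends negative binomial regression rather than this simple least-squares fit.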


The following is intended to provide an overview of the statistical approach, and does not provide details, for which see statistical texts (Dobson and Barnett 2011, Hilbe 2011) and available software (e.g., R, Stata).

The GLM is a predictive model, with a set of predictor variables, each with its coefficient as determined through Maximum Likelihood Estimation (or variants thereof). The predictor variables can be classified into three categories, and all models will have one or more variables from each of the categories:
(1) Variables accounting for change over time. If one is testing for a linear trend (constant proportional change over time), then one models b1*X1, where X1 = year. One can also model annual change as a quadratic, cubic, or other non-linear function. One can also model a change in trend between one time interval and another (see below, regarding change in trend). Thus, we can estimate the trend, calculate a standard error, test for significance, test for deviations from linearity, etc.
(2) A variable specifying the individual colony (“colony id”). Colony id is a categorical variable (also referred to as a “factor”) in the model. The statistical model can treat colony id as a fixed effect or as a random effect. In addition, one can include sub-region or other groupings, in which case colony id is nested within sub-region. This variable is needed since counts of COMU and BRAC vary among colonies by several orders of magnitude, and we need to control for that variation in order to estimate proportional change over time at colonies.
(3) Variables that account for variation in detectability as defined in Element 2. That is, variation in counts may reflect variation in: calendar date of the survey; survey date in relation to timing of breeding (e.g., onset or mean incubation or hatching dates); rates of breeding failure; hour of survey; weather conditions at the time of the survey; survey equipment; characteristics of over-flight (e.g., height); observers, photographers, or counters; and oceanographic conditions associated with the survey. Each of these variables can be modeled as a quantitative variable, or as a categorical variable (factor), as appropriate. Whether one of these variables is evaluated in the model depends on the information at hand, and whether there is sufficient variation for that variable.
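Category (2) can be illustrated with the Linear Model special case mentioned earlier: regressing ln(Y) on year plus one intercept dummy per colony recovers a shared trend even when colonies differ by orders of magnitude. A numpy sketch with simulated, noise-free data:

```python
import numpy as np

# Three colonies differing by orders of magnitude, all declining 3% per year.
years = np.tile(np.arange(10), 3)
colony = np.repeat([0, 1, 2], 10)
base = np.array([100.0, 10_000.0, 1_000_000.0])
counts = base[colony] * 0.97 ** years

# Design matrix: one intercept dummy per colony ("colony id" as a fixed
# effect) plus a shared year term; fit by least squares on ln(Y).
X = np.column_stack([colony == 0, colony == 1, colony == 2, years]).astype(float)
coefs, *_ = np.linalg.lstsq(X, np.log(counts), rcond=None)

shared_trend = np.exp(coefs[3]) - 1.0  # a 3% annual decline, common to all colonies
```

The colony dummies absorb the enormous differences in colony size, so the year coefficient estimates the proportional change alone; a GLM with a negative binomial family follows the same design-matrix logic.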

Of these variables, survey date should always be considered in the analysis; that is, we will wish to account for variation in counts due to calendar date, if there is statistical evidence for such an effect. However, survey date in relation to timing of breeding is expected to be even more relevant in understanding variation in counts. Information on timing of breeding can be obtained from intensively monitored colonies (Tier 1A), but there are currently few such sites. However, timing of breeding by COMU has been shown to be related to spring upwelling transition date; e.g., on the Farallones, mean COMU laying date during recent years (1999-2015) increased by 0.234 (± 0.068) days (P = 0.004) for a 1-day increase in spring transition date (Point Blue, unpublished; Nur et al. 2017). Furthermore, spring transition date has been calculated for various locations in the CCS. Thus, the effect of survey date relative to spring transition date can be analyzed where information on COMU and BRAC breeding is not available.

Thus, a GLM analysis can be carried out in which proportional change in counts over time is analyzed, while controlling for variation among sites and with respect to other variables affecting counts. A single slope (trend) can be estimated for the entire CCS, or different slopes can be fit for each sub-region or other grouping. Moreover, one can test whether slopes differ among sub-regions, or differ among colonies within the same sub-region. An example is provided in Nur (2011). Thus, one can compare population trends for different groups of colonies, e.g., those that are subject to disturbance compared to those that are not, or with respect to a management action.

The change over time can be estimated as a trend, or modeled as a step-wise change from one time period to another, whether the time period is one year or several years. That is, the time periods being compared must be specified; the analysis allows flexibility in doing so. Furthermore, a change in trend can be analyzed using splines (Harrell 2001), also referred to as “change-point analysis.” One can estimate the difference in trends before and after a point in time (i.e., compare two trends joined at a “knot”) to determine whether the trends differ significantly, relate them to ecological variables, etc. See Ainley et al. 2013 for an example.
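A change in trend at a known knot can be fit with a linear spline basis: one slope before the knot, plus a hinge term that captures the additional slope after it. A numpy sketch with noise-free simulated log-counts whose trend changes at year 10:

```python
import numpy as np

years = np.arange(20, dtype=float)
knot = 10.0

# Simulated ln(counts): slope +0.02 before the knot, -0.05 after.
lny = 7.0 + 0.02 * years + (-0.05 - 0.02) * np.maximum(years - knot, 0.0)

# Spline basis: intercept, year, and the hinge term max(0, year - knot).
X = np.column_stack([np.ones_like(years), years, np.maximum(years - knot, 0.0)])
b0, b1, b2 = np.linalg.lstsq(X, lny, rcond=None)[0]

slope_before = b1        # trend before the knot
slope_after = b1 + b2    # trend after; b2 is the change in trend at the knot
```

Testing whether b2 differs significantly from zero is the change-point test described in the text; with real data the knot location itself may also be estimated or compared across candidate years.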

As described above, at some colonies surveys may be replicated (more than one survey per colony per year, Tier 1A, B), but in other cases there may be no replication. The analysis can include both, but if there are sufficient observations from replicated surveys, it will be valuable to pull out a separate data set for replicated surveys. An analysis confined to replicated surveys can confirm the estimated effect of those variables that vary within a year, since each colony will have been surveyed two or more times per year, and thus variation among replicates (for the same colony surveyed in the same year) is not due to variation in breeding numbers. Intensively monitored sites (Tier 1A) can provide a rich source of replicated data.

GLM allows flexibility in approach. The choice of model used depends, in part, on the distribution of residuals. Poisson regression is a type of GLM that assumes that residuals are Poisson-distributed. However, ecological count data commonly violate this assumption, displaying “over-dispersion” compared to the Poisson distribution. This over-dispersion can be modeled explicitly through the use of negative binomial regression, which assumes that residuals conform to a negative binomial distribution.
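A quick diagnostic for over-dispersion is the variance-to-mean ratio: Poisson counts have variance approximately equal to the mean, while negative binomial counts (a gamma-Poisson mixture with shape theta) have variance of roughly mean + mean²/theta. A numpy sketch comparing the two on simulated data:

```python
import numpy as np

rng = np.random.default_rng(42)
mu = 50.0

# Poisson counts: variance ~ mean, so dispersion ratio ~ 1.
poisson = rng.poisson(mu, size=10_000)

# Negative binomial via a gamma-Poisson mixture; theta = 2 gives strong
# over-dispersion (variance ~ mu + mu**2 / theta, far exceeding the mean).
theta = 2.0
nb = rng.poisson(rng.gamma(theta, mu / theta, size=10_000))

dispersion_poisson = poisson.var() / poisson.mean()  # near 1
dispersion_nb = nb.var() / nb.mean()                  # much greater than 1
```

A dispersion ratio well above 1 in the residuals of a Poisson fit is the signal, noted in the text, that negative binomial regression is the more appropriate model.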

However, if residuals are Gaussian (i.e., normally distributed), a Linear Model may be most appropriate, but with the dependent variable being the log-transform of counts (i.e., ln(Y)). For example, survey data from three marsh bird species were analyzed by Stralberg et al. (2010); two species were best modeled by negative binomial regression (black rail Laterallus jamaicensis and salt marsh yellowthroat Geothlypis trichas), while the third species (tidal marsh song sparrow, Melospiza melodia) was best analyzed with a linear model (i.e., normally distributed residuals with an identity link). For any GLM, it is important to confirm that residuals conform to their assumed distribution. This is one component of assessing goodness of fit of a model (Harrell 2001).

Element 5: Reporting

Types of reports Several different reports can be prepared from data collected under this PF. Annual progress reports summarizing COMU and BRAC survey activities and results would be prepared by Partners after each field season, where possible and applicable. Completed annual progress reports can be submitted by the Site Manager to the PSP staff, who can then upload them to ServCat. Interim reports describing multiple-year survey results may be prepared as well when accumulated data permit robust statistical inferences (see Element 2) about population trends, distributional changes, or other phenomena relevant to seabird management and regulatory processes. Survey Coordinators will be given the opportunity to contribute to and review any reports that include data from surveys they conducted, and will be offered co-authorship or otherwise appropriately acknowledged. Reports should be submitted to peer-reviewed literature sources whenever appropriate, as encouraged by DOI policy (https://www.doi.gov/scientificintegrity/FAQ).

Other reports will be produced by Principal Investigators when ongoing COMU and BRAC population monitoring data are used to examine time-constrained events, such as the population response to a disturbance (e.g., oil spill, disease event, strong El Niño event, increased predator pressure) or a multi-year breeding phenology study of a Tier 1 colony. Non-governmental investigators generating data or reports outside the public domain may require data-sharing agreements to protect proprietary information.

SSPs, which step down from the PF for individual locations, provide explicit instructions about the different reports to be produced and when they are due, and describe the review process required for approval.

Report contents Consult with the Site Manager to determine the appropriate reporting types and schedules to specify in the SSP. Regardless of its type, any report should succinctly address survey progress toward achieving the CCS-wide and/or SSP-specific objectives, where applicable. If a reason for conducting the survey is to inform site-specific management, then the report would compare survey results to pre-defined values that trigger management actions or model refinement.

Partners contributing survey data will have access to the PCSD to submit queries that will produce summary statistics and common analyses of the dataset of interest to the report author. The results of those queries can be incorporated into reports articulating short- and long-term as well as linear or non-linear population trends at multiple geographical scales and time frames (see Element 2). Beyond the CCS reporting described in this PF, the publication of survey data in peer-reviewed scientific journals requires permission from those who collected the data. Individuals involved with data collection will be invited to participate in the preparation of these CCS-wide reports and be appropriately recognized as co-authors or acknowledged contributors.

COMU and BRAC colony registry updates will be a reporting output based on annual survey data entered into the PCSD by Partners from throughout the CCS. These colony updates will be accessible to all cooperators to incorporate into their reports and available to the public.

Reports with survey results and findings would ideally include the following in accordance with the Survey Protocols Handbook: https://www.fws.gov/policy/SurveyProtocalsHB.pdf (USFWS 2013):

1. Title. Include these three items:
   a. Name of the survey (for USFWS participants, this should match the survey record in the Planning and Review of I&M on Refuges (PRIMR) application or the refuge Inventory and Monitoring Plan (IMP));
   b. Survey ID (from PRIMR or the IMP, if applicable);
   c. Time period that the survey was conducted.
2. Authors. Identify names, affiliations, and contact information.
3. Date prepared. Provide the date of the report.
4. Objectives. Include the management and sampling (monitoring) objectives identified in the SSP.
5. Methods. Provide a succinct description of field and analysis methods from the SSP.
6. Results. Describe the number and types of samples collected and present data summaries. Where applicable, update existing tables or figures with new data.
7. Important findings. Interpret the results of the survey with respect to the management objectives or decisions to be made. Discuss reliability of the results and provide conclusions and any recommendations.
8. Problems encountered. Describe any difficulties with the data collection or analysis, including departures from the methods in the survey protocol.

Reporting schedule The Survey Coordinator prepares annual and interim reports (every 3-5 years, or coincident with sampling frame cycles), which are submitted to the Site Manager and then the PSP staff for uploading to ServCat. The USFWS I&M policy (701 FW 2) recommends reporting of findings in peer-reviewed scientific journals, when appropriate. For reports authored by USFWS employees, follow the requirements for reporting disclaimers described in 117 FW 1.

PSP staff will annually report on Program status and Partner activities to USFWS Regional and National leadership of the Refuge System and the PSP Steering Committee; they also will periodically (depending on when significant results are available) prepare CCS-wide COMU and BRAC population status and trend reports for Partners, interested stakeholders, and the general public using a variety of appropriate media.

Report archiving Survey reports will be archived in ServCat with links to relevant PRIMR survey records, if applicable. Reports on ServCat may be available to the public through data.gov at the discretion of the Survey Coordinator and Site Manager.

Element 6: Personnel Requirements and Training

Roles and responsibilities Survey personnel: The SSP should list all survey personnel by name (if possible) and detail their respective roles and responsibilities for all aspects of the COMU and BRAC survey using the guidelines below.

Field crew: The field crew consists of individuals collecting the survey data, including the Survey Coordinator or crew leader plus one or more biological technicians or assistants. All field personnel should have a complete understanding of the survey objectives and the field methods and procedures used to conduct the survey, be skilled in field identification and knowledgeable of the behaviors of COMU and BRAC, and clearly understand their duties and roles during the survey. All members of the field crew should know and abide by the guidelines for scientific integrity and scholarly conduct in USFWS policy 212 FW 7.7. Crew members have the following additional roles and responsibilities:

Survey Coordinator – This person is responsible for all aspects of the survey planning and preparation, safety procedures, implementation, data quality assurance and management, data analyses, and reporting. The Survey Coordinator selects the field crew and assures that all members are qualified, adequately trained, and hold current DOI certifications (e.g., aircraft safety training, MOCC, etc.) appropriate for conducting the survey.

Biological technician(s) – Assists the Survey Coordinator as directed with any aspect of the survey preparations, implementation, and post-survey activities. Technicians are responsible for adhering to all activity schedules, following survey protocols and safety procedures, and coordinating closely with the Survey Coordinator. The biological technician may also supervise project volunteers or assistants.

Contractor – In cases where some or all of the survey tasks are contracted to parties outside of the USFWS or other managing agency, the roles and responsibilities of the contractor are determined by the Project Leader or Site Manager issuing the contract, and are explicitly detailed in the contract.

Field support: All aerial surveys and most boat-based surveys will involve field support personnel in addition to the survey crew, including the following:

Aircraft pilot or boat operator – If DOI employees will be passengers, operators must hold current DOI and FWS certifications of training (see below) to operate the respective equipment under all potential survey conditions. The SSP will specify the appropriate aircraft or watercraft and the range of environmental conditions that could potentially be encountered during the survey, which will determine the operator’s certification requirements. Operators are responsible for passenger and crew safety, certifying that the equipment meets all maintenance specifications, and reviewing safety procedures and proper PPE use with all passengers before embarking on the survey. Operators should be familiar with the survey protocol, geography, and mission plan, and assure that the equipment is capable of completing the survey safely.

Trip follower – Aerial and boat-based surveys are supported by trip followers who monitor the survey trip’s progress from the ground, and are ready to call for rescue service if necessary. This is a critical position and is necessary for the safety of the crew. According to a pre-established safety plan, the Survey Coordinator or other survey crew member regularly contacts the trip follower with progress updates. Most aircraft are equipped with automated GPS tracking systems that allow the trip follower to track the flight in real time via the internet, but boat-based surveys will likely require contact by radio or cell phone. It is the trip follower’s responsibility to be vigilant throughout the duration of the survey by following the protocols for scheduled contacts and quickly reacting to a missed contact or responding to a distress signal according to the safety plan.


Supervisory personnel: The supervisor of the Survey Coordinator will have the ultimate authority for approval of survey plans and implementation of seabird surveys, and this individual should be kept informed by the Survey Coordinator of all related activities. Depending on the station where the survey is based, the supervisor may be the Project Leader or the Refuge (or site) Manager. The PSP Coordinator may also be involved with survey planning and scheduling, and can act as a liaison among stations to coordinate surveys. The SSP should list supervisory staff and their respective roles and responsibilities.

Support staff: Depending on the complexity of the survey, other personnel may be involved with various aspects of its implementation and, in turn, they should be included in pertinent planning discussions. Supporting staff members may include a Safety Officer (safety plan review for compliance and completeness), maintenance staff (equipment maintenance and repair), budget and contracting administrators (procurement and contract development), data manager (assist with Data Management plan development), biometrician (survey design and data analyses), and an I&M Zone Biologist (planning and logistical assistance). An SSP should list support staff and describe their respective roles and responsibilities.

Qualifications and training

The qualifications and training requirements for personnel directly involved with data collection, analyses, and data management will vary depending on the survey type (aerial, boat-based, or land-based) and other survey characteristics that may require particular skills. The following lists should be considered guidelines for developing an SSP, and may not include skills and training necessary for all surveys of COMU and BRAC colonies.

Aerial surveys – special requirements: All aerial operations conducted with federal employee participation must comply with all FAA and DOI safety, training, and operational policies and regulations. This PF is not intended to serve as a replacement or substitute for the official documents that describe those policies and regulations; it is incumbent on all personnel involved with aerial surveys to be familiar with, and in compliance with, the latest versions of those official documents. The Interagency Aviation Training (IAT) Guide lists how often certifications need to be renewed. Most certification renewals are available on-demand, on-line through the IAT website, but some initial training courses must be attended in person with the instructors and are only offered intermittently. See SOP 1 and the IAT website: https://www.iat.gov/library.asp for a list of documents and training requirements most pertinent to field personnel. Because unmanned aerial systems (UAS) are still being tested for applicability to this survey, they are not identified as an operational method for field data collection within this PF at this time.

Survey personnel general qualifications and training:

Survey Coordinator:
1. Previous survey experience, preferably conducting seabird surveys
2. Competence identifying all native seabirds under variable field conditions
3. Experience flying in small aircraft and/or operating a boat on the ocean
4. Supervisory experience with field personnel
5. Competence using digital cameras, maps, and GPS units
6. Complete understanding of survey goals, objectives, and protocols
7. Current on all pertinent DOI safety training, certificates, and job hazard analyses
8. Competence in data-entry programs and procedures
9. Knowledge of and adherence to DOI scientific integrity and scholarly conduct standards

Biological technician:
1. Trained to identify native seabirds in field conditions
2. Experience flying in small aircraft and/or boating on the ocean
3. Trained to use digital cameras, maps, and GPS units
4. Trained about survey goals, objectives, and protocols
5. Current on all pertinent DOI safety training, certificates, and job hazard analyses
6. Trained in pertinent data-entry programs and procedures
7. Knowledge of and adherence to DOI scientific integrity and scholarly conduct standards

Aircraft pilot:
1. Approved and certified DOI pilot for specified aircraft
2. Experience conducting wildlife surveys at slow speeds and low altitudes
3. Familiar with West Coast geography and weather conditions
4. Briefed on survey goals, protocols, hazards, and the flight plan

Boat operator:
1. Trained and DOI certified on all aspects of operating the survey craft in open ocean and nearshore conditions (see below)
2. Familiar with West Coast weather and sea conditions, and coastal bathymetry
3. Briefed on survey goals, protocols, and float plan

The boat operator must hold a current Motorboat Operator Certification Course (MOCC) certificate, and have received any additional safety and operational training with the selected watercraft that is necessary to prepare for conditions which may be encountered on the survey (e.g., open ocean training). See SOP 2 and consult the latest version of the MOCC manual for certification renewal requirements. The DOI MOCC manual and other policy documents and instructional materials are available at: https://nctc.fws.gov/courses/programs/watercraft-safety/resources.html.

Data manager:
1. Technical skills related to databases and computer technology
2. Experience with seabird survey data and datasets
3. Familiarity with use of the applicable PCSD
4. Knowledge of and adherence to DOI scientific integrity and scholarly conduct standards

Data analyst/biometrician:
1. Advanced statistical training
2. Experience developing statistically sound survey protocols
3. Ability to analyze seabird survey data and report results
4. Knowledge of and adherence to DOI scientific integrity and scholarly conduct standards


Element 7: Operational Requirements

The Survey Partner’s Relationship with the PSP

The functional relationship between the PSP and the Partners conducting surveys at colony sites throughout the CCS is presented in Figure 7.1. The role of the PSP staff is to complement and support the seabird monitoring conducted at the field sites by coordinating and standardizing the surveys to allow CCS-wide (and sub-regional) inferences about population status, trends, and distribution. PSP staff also will maintain a centralized interactive repository for survey data that facilitates data synthesis, analyses, and reporting by both parties. Specifically, the PSP facilitates these survey efforts by providing tools (protocol frameworks, SOPs, and data repository) and services (e.g., coordination among stations, data management, data interpretation) to the Partners in the field. In turn, the role of the Partners in the PSP is to develop SSPs that utilize standardized data collection procedures described in this PF, and to submit subsequent survey data to the centralized PCSD. Accordingly, the operational requirements of Partners are to plan surveys in coordination with PSP staff, obtain resources (personnel, equipment, and funds) to conduct surveys, process collected data for PCSD entry, and implement all relevant aspects of the survey life cycle at relevant spatial scales.


Figure 7.1. The functional relationship between the PSP and Survey Partners. The PSP provides tools and services to the Partners to support coordinated surveys, and the Partners provide standardized survey data to the interactive PSP Colony Survey Database (PCSD) maintained by the PSP. Products such as analyses and reports are produced by both the PSP staff and Partners.

Budget Considerations

Partners that have been conducting COMU and BRAC colony surveys know the costs associated with those surveys, and have probably documented those costs in their budgets. However, if the new sampling frame described in this PF (Element 2) prescribes more extensive or additional surveys than a Partner has conducted in past years, those additional costs would need to be reflected in the budget, and sufficient funding would need to be procured. One of the benefits of adopting this PF and participating in the PSP is the establishment of an overarching rationale to support seabird survey work in the CCS, which can be used to leverage additional resources to help Partners meet the sampling frame objectives. However, if Partners are unable to obtain the funding necessary for full compliance with the sampling frame objectives, they can still participate in the PSP by submitting any pertinent data they are able to contribute, and they will retain access to the PCSD and its products. The expectation is that PSP Partners would allocate limited funding according to the data priorities described in the sampling frame, with consideration of priority survey needs and issues within the PSP.

Budget templates

Costs associated with surveys will vary greatly among survey Partners depending on the survey method, geographical extent, personnel salaries, logistical constraints, need for equipment purchases, and other factors unique to the survey. The following budget templates (Table 7.1) show typical line items likely to apply to any of the three survey types, and can be reproduced in an electronic spreadsheet (e.g., MS Excel) and appropriately modified and populated for the survey SSP.

Staff costs should include direct and indirect costs associated with time spent on survey planning and all phases of implementation including field work, administration, data entry, analyses, and reporting. Federal Partners should calculate the full-time equivalent (FTE) by dividing the total hours spent on the survey for each person by 2080 hours/year (i.e., one FTE).
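As a minimal illustration of this FTE bookkeeping (the function name and the hours below are ours, for illustration only, not values from the protocol):

```python
HOURS_PER_FTE = 2080  # hours in one full-time-equivalent year

def survey_fte(hours_by_person):
    """Convert each person's hours charged to the survey into FTE."""
    return {name: hours / HOURS_PER_FTE for name, hours in hours_by_person.items()}

# Illustrative hours only -- not values from the protocol.
hours = {"Survey Coordinator": 416, "Field Technician": 208}
print(survey_fte(hours))  # {'Survey Coordinator': 0.2, 'Field Technician': 0.1}
```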

Because most equipment will be re-usable for many surveys, do not add it to the costs every survey year; however, account for personnel changes that may require purchasing different sizes of gear such as flight suits, helmets, boots, and rain gear.


Table 7.1. A,B,C. Suggested templates for developing budgets for aerial, boat-based, and land-based surveys of COMU and BRAC colonies.

A. (YEAR) AERIAL SURVEY BUDGET

STAFF (columns: Planning | Conducting and Reporting | Total Hours | Hourly Rate | TOTAL COST | Annual FTE):
  Project Leader/Supervisor
  Survey Coordinator
  Field Technician
  Flight Follower
  Other Staff (List)
  Total Payroll

TRAVEL (If overnight travel involved) (columns: Days | Daily Rate | Other):
  Lodging
  Per diem x Number of Personnel
  Other (non-vehicle) travel expenses (list)
  Total Travel

GROUND TRANSPORT/VEHICLE (columns: Days/Miles | Daily Rate/Mileage Rate):
  Rental/Daily Vehicle Cost
  Mileage or Fuel Charge
  Total Vehicle

AIRCRAFT/PILOT (columns: Total Hours/Days | Hourly/Daily Rate | Other):
  Rental
  Pilot Expenses/Day (Lodging, per diem)
  Landing Fees or Other Expenses (List)
  Total Aircraft

EQUIPMENT (columns: Cost | Quantity):
  Digital Camera – 2 (min)
  Telephoto Lens – 2
  Camera Accessories (Bag, Batteries, Data Cards, etc.) – 1
  Flight Suit and Gloves – 1/crew member
  Flight Helmet – 1/crew member
  GPS Unit – 1
  Other (List)
  Total Equipment

GRAND TOTAL


B. (YEAR) BOAT-BASED SURVEY BUDGET

STAFF (columns: Planning | Conducting and Reporting | Total Hours | Hourly Rate | TOTAL COST | Annual FTE):
  Project Leader/Supervisor
  Survey Coordinator
  Field Technician
  Float Follower
  Other Staff (List)
  Total Payroll

TRAVEL (If overnight travel involved) (columns: Days | Daily Rate | Other):
  Lodging
  Per diem x Number of Personnel
  Other (non-vehicle) travel expenses (list)
  Total Travel

GROUND TRANSPORT/VEHICLE and TRAILER (columns: Days/Miles | Daily Rate/Mileage Rate):
  Rental/Daily Vehicle Cost
  Mileage or Fuel Charge
  Parking Fees
  Total Vehicle

WATERCRAFT/OPERATOR (If not Survey Coordinator or other staff) (columns: Total Hours/Days | Hourly/Daily Rate | Other):
  Rental/Contractor
  Operator Expenses/Day (Lodging, per diem)
  Fuel
  Ramp Fees or Other Expenses (List)
  Total Watercraft

EQUIPMENT (columns: Cost | Quantity):
  Digital Camera – 1
  Telephoto Lens – 1
  Camera Accessories (Bag, Batteries, Data Cards, etc.) – 1
  Float Coat, Boots, Rain Gear, Gloves – 1/crew member
  Binoculars – 1/crew member
  GPS Unit – 1
  Spotting Scope – 1
  Navigation Charts
  Others (List)
  Total Equipment

GRAND TOTAL


C. (YEAR) LAND-BASED SURVEY BUDGET

STAFF (columns: Planning | Conducting and Reporting | Total Hours | Hourly Rate | TOTAL COST | Annual FTE):
  Project Leader/Supervisor
  Survey Coordinator
  Field Technician
  Other Staff (List)
  Total Payroll

TRAVEL (If overnight travel involved) (columns: Days | Daily Rate | Other):
  Lodging
  Per diem x Number of Personnel
  Other (non-vehicle) travel expenses (list)
  Total Travel

GROUND TRANSPORT/VEHICLE (columns: Days/Miles | Daily Rate/Mileage Rate):
  Rental/Daily Vehicle Cost
  Mileage or Fuel Charge
  Parking Fees
  Total Vehicle

EQUIPMENT (columns: Cost | Quantity):
  Digital Camera – 1
  Telephoto Lens – 1
  Camera Accessories (Bag, Batteries, Data Cards, etc.) – 1
  Boots, Rain Gear, Gloves – 1/crew member
  Binoculars – 1/crew member
  GPS Unit – 1
  Spotting Scope – 1
  Others (List)
  Total Equipment

GRAND TOTAL

Schedule

All survey schedules are anchored to the incubation phases of the COMU and BRAC breeding cycles when they overlap in the survey area, which is when population surveys must be conducted. See the work plans in the respective survey type SOPs for survey activity schedules before and after field work. The number and types of surveys conducted by a Partner in a given year should conform to the sampling frame schedule (Element 2) describing when colonies in different sampling tiers are to be surveyed; this should be determined in coordination with the PSP staff and other CCS Partners.

Element 8: References

Ainley DG, and Boekelheide, RJ (Editors). 1990. Seabirds of the Farallon Islands: Ecology, Structure and Dynamics of an Upwelling System Community. Stanford University Press, Stanford, CA.

Ainley DG, Nur N, Eastman JT, Ballard G, Parkinson CL, Evans CW, and DeVries AL. 2013. Decadal trends in abundance, size and condition of Antarctic toothfish in McMurdo Sound, Antarctica, 1972–2011. Fish and Fisheries 14:343–363.


Ainley DG, Santora JA, Capitolo PJ, Field JC, Beck JN, Carle RD, Donnelly-Greenan E, McChesney GJ, Elliott M, Bradley RW, Lindquist K, Nelson P, Roletto J, Warzybok P, Hester M, and Jahncke J. 2018. Ecosystem-based management affecting Brandt's Cormorant resources and populations in the central California Current region. Biological Conservation 217:407-418.

Bertram DF, Harfenist A, and Smith BD. 2005. Ocean climate and El Niño impacts on survival of Cassin’s Auklets from upwelling and downwelling domains of British Columbia. Can. J. Fish. Aquat. Sci. 62:2841–2853.

Bestelmeyer BT, Ellison AM, Fraser WR, Gorman KB, Holbrook SJ, Laney CM, Ohman MD, Peters DPC, Pillsbury FC, and Rassweiler A. 2011. Analysis of abrupt transitions in ecological systems. Ecosphere 2(12):129. doi:10.1890/ESO11-00216.1

Capitolo PJ, McChesney GJ, Carter HR, Parker MW, Hall JN, Young RJ, and Golightly RT. 2006. Whole-colony counts of Common Murres, Brandt’s Cormorants and Double-crested Cormorants at sample colonies in northern and central California, 1996-2004. Unpublished report, Department of Wildlife, Humboldt State University, Arcata, California; and U.S. Fish and Wildlife Service, San Francisco Bay National Wildlife Refuge Complex, Newark, California. 40 pp. http://scholarworks.calstate.edu/bitstream/handle/2148/940/Whole-colony%20counts%20of%20common%20murres,%20brandts%20cormorants%20and%20doublecrested%20cormorants%20at%20sample%20colonies%20in%20northern%20and%20central%20california,%201996-2004.pdf?sequence=1

Capitolo PJ, McChesney GJ, Carter HR, Parker MW, Eigner LE, and Golightly RT. 2014. Changes in breeding population sizes of Brandt’s Cormorants Phalacrocorax penicillatus in the Gulf of the Farallones, California, 1979–2006. Marine Ornithology 42:35–48.

Carter HR, Wilson UW, Lowe RW, Rodway MS, Manuwal DA, Takekawa JE, and Yee JL. 2001. Population trends of the Common Murre (Uria aalge californica). Pages 33-133 in D. A. Manuwal, H. R. Carter, T. S. Zimmerman, and D. L. Orthmeyer, editors. Biology and conservation of the Common Murre in California, Oregon, Washington, and British Columbia. Volume 1: Natural history and population trends. U.S. Geological Survey, Information and Technology Report USGS/BRD/ITR-2000-0012, Washington, D.C.

Carter HR, McChesney GJ, Jaques DL, Strong CS, Parker MW, Takekawa JE, Jory DL, and Whitworth DL. 1992. Breeding populations of seabirds in California, 1989-1991. Volume I: Population estimates. Gilmer DS, editor. U.S. Fish and Wildlife Service, Northern Prairie Wildlife Research Center, Dixon, CA. 491 pp. http://aquaticcommons.org/id/eprint/11254

Cooch EG, and White GC. 2017. MARK: A Gentle Introduction, 17th Edition. Available at www.phidot.org.


Dobson A, and Barnett A. 2011. An Introduction to Generalized Linear Models (3rd ed.) Chapman & Hall/CRC, Boca Raton, FL.

Federal Geographic Data Committee. FGDC-STD-001-1998. Content standard for digital geospatial metadata (revised June 1998). Federal Geographic Data Committee. Washington, D.C. https://www.fgdc.gov/standards/projects/metadata/base-metadata/v2_0698.pdf

Gladics A. 2009. Aerial Seabird Training Manual. US Fish and Wildlife Service, Oregon Coast National Wildlife Refuge Complex, Newport, OR. Unpublished report 69 pp. https://ecos.fws.gov/ServCat/Reference/Profile/83213

Gould WR, and Nichols JD. 1998. Estimation of temporal variability of survival in animal populations. Ecology 79:2531-2538.

Harrell Jr FE. 2001. Regression Modeling Strategies: With Applications to Linear Models, Logistic Regression, and Survival Analysis. Springer, New York.

Hilbe J. 2011. Negative Binomial Regression, 2nd ed. Cambridge University Press, Cambridge, UK.

IUCN. 2012. IUCN Red List Categories and Criteria: Version 3.1. Second edition. Gland, Switzerland and Cambridge, UK, IUCN. iv + 32 pp. Available from: http://www.iucnredlist.org/technical-documents/categories-and-criteria.

Lee DE, Abraham C, Warzybok PM, Bradley RW, and Sydeman WJ. 2008. Age-specific survival, breeding success, and recruitment in Common Murres (Uria aalge) of the California Current System. Auk 125:316-325.

Manuwal DA, Carter HR, Zimmerman TS, and Orthmeyer DL, Editors. 2001. Biology and conservation of the common murre in California, Oregon, Washington, and British Columbia. Volume 1: Natural history and population trends. U.S. Geological Survey, Biological Resources Division, Information and Technology Report USGS/BRD/ITR–2000-0012, Washington, D.C. 132 pp.

Naughton MB, Pitkin DS, Lowe RW, So KJ, and Strong CS. 2007. Catalog of Oregon Seabird Colonies. U.S. Department of Interior, Fish and Wildlife Service, Region 1, Biological Technical Publication FWS/BTP-R1009-2007, Washington, D.C.

Nettleship DN. 1996. "Family Alcidae (auks)." In Handbook of the birds of the world, edited by J. del Hoyo, A. Elliott and J. Sargatal, 678-722. Barcelona: Lynx Edicions.

Nur N, Ford RG, and Ainley DG. 1994. Final report: computer model of Farallon seabird populations. PRBO Report to Gulf of the Farallones National Marine Sanctuary.

Nur N, Jones SL, and Geupel GR. 1999. Statistical Guide to Data Analysis of Avian Monitoring Programs. Biological Technical Publication, US Fish and Wildlife Service, BTP-R6001-1999.


Nur N and Sydeman WJ. 1999. Survival, breeding probability, and reproductive success in relation to population dynamics of Brandt's Cormorants Phalacrocorax penicillatus. Bird Study 46 (suppl.):S92-S103.

Nur N. 2011. Chapter 5: Statistical methodology for aggregation: Analyzing indicator data across space and time. In Collins J, Davis J, Hoenicke R, Jabusch T, Swanson C, Gunther A, Nur N, and Trigueros P. Assessment framework as a tool for integrating and communicating watershed health indicators for the San Francisco Estuary. San Francisco Estuary Partnership, Oakland, CA. Available from: http://www.sfei.org/sites/default/files/biblio_files/DWR_4600007902_Final_Project_Report.pdf

Nur N, Bradley R, and Jahncke J. 2017. Common Murre Laying Indicator. Indicators of Climate Change in California. California EPA, Sacramento, CA.

Oro D. 2014. Seabirds and climate: knowledge, pitfalls, and opportunities. Front. Ecol. Evol. 2:79. doi: 10.3389/fevo.2014.00079

Ribic CA, Ainley DG, and Spear LB. 1992. Effects of El Niño and La Niña on seabird assemblages in the Equatorial Pacific. Marine Ecology Progress Series 80:109-124.

Sowls AL, DeGange AR, Nelson JW, and Lester GS. 1980. Catalog of California seabird colonies. U.S. Fish and Wildlife Service, Biological Services Program FWS/OBS 37/80, Washington, D.C.

Stralberg D, Herzog M, Nur N, Tuxen K, and Kelly M. 2010. Predicting avian abundance within and across tidal marshes using fine-scale vegetation and geomorphic metrics. Wetlands 30:475-487.

Sydeman WJ, Carter HR, Takekawa JE, and Nur N. 1997. Common Murre (Uria aalge) population trends at the South Farallon Islands, California, 1985-1995. Unpublished Report, Point Reyes Bird Observatory, Stinson Beach, CA; U.S. Geological Survey, Dixon, CA; and USFWS, Newark, CA.

[USFWS] U.S. Fish and Wildlife Service. 2005. Regional Seabird Conservation Plan, Pacific Region. U.S. Fish and Wildlife Service, Migratory Birds and Habitat Programs, Pacific Region, Portland, OR.

[USFWS] U.S. Fish and Wildlife Service. 2006. Beringian Seabird Colony Catalog - computer database and Colony Status Record archives. U.S. Fish and Wildl. Serv., Migr. Bird Manage., Anchorage, AK.

[USFWS] US Fish and Wildlife Service. 2013. How to develop survey protocols, a handbook (Version 1.0). Fort Collins, Colorado: US Department of Interior, Fish and Wildlife Service, National Wildlife Refuge System, Natural Resource Program Center. https://www.fws.gov/policy/SurveyProtocalsHB.pdf


Wallace EA, and Wallace GE. 1998. Brandt's Cormorant (Phalacrocorax penicillatus), The Birds of North America (P. G. Rodewald, Ed.). Ithaca: Cornell Lab of Ornithology; Retrieved from the Birds of North America: https://birdsna.org/Species-Account/bna/species/bracor DOI: 10.2173/bna.362


Appendices

Appendix 1. Sampling Design Power Analysis

Nadav Nur and Leo Salas

Introduction and Overview

Element 2 describes a Sampling Framework that outlines how COMU and BRAC breeding colony surveys can be designed to collect data that have the potential to meet the management and sampling objectives of the PF. To provide information to guide decisions regarding the sampling design (Element 2), an analysis was conducted to compare the statistical power of different sampling designs to detect trends of the magnitude and with the confidence specified by the sampling objectives (Table A1.1), referred to as target trends in Element 2 of the PF.

Statistical power to detect a trend depends on the following six components (see also Nur et al. 1999):
1. Sampling effort in each unit of time (in this case, one year)
2. Number of years over which the trend is being assessed
3. Magnitude of the trend for which power is being calculated
4. Variability of the response variable
5. Specific statistical test to be used
6. Alpha level (i.e., the Type I error rate) used in the statistical test

Component 1, sampling effort per year, reflects differences in sampling design with regard to: number of sites surveyed, replication of surveys, and frequency (annual or less than annual) of surveys. We explore the importance of each of these in the power analysis.

Component 2, time span, is tied to the statistical objectives (Table A1.1) identified in Element 2: generally 15 years, but we also examined shorter and longer spans.

Component 3, magnitude of trend, is determined by the statistical objectives (Element 2).

For Component 4, variability of response, we used statistical estimates of variability in counts of COMU individuals and BRAC nests based on the analysis of relevant datasets contributed by seabird biologists, as described below.

Components 5 and 6, the statistical test used and alpha level, were fixed in the analysis and are described below.

This power analysis compares sampling designs that differ with respect to: a) number of colonies sampled (i.e., for which counts of individuals or nests were completed), b) replication of surveys during a breeding season, and c) frequency of sampling among years. As described in Element 2, replication distinguishes Tier 1 from Tiers 2-4, whereas frequency of sampling distinguishes Tiers 1 and 2 (sampling every year) from Tiers 3 and 4 (less frequent sampling). We use the term “study design” to refer to which Tiers are included in the sampling framework; for example, one study design may include Tier 2 sites only, while another may include Tiers 1, 2, and 3. This is distinct from the term “sampling design,” which refers to both the study design (composition in terms of Tiers) and the sample size per Tier.

Specific questions addressed by the power analysis were:
1. What is the sample size (number of sites per sub-region) needed to achieve 80% power to detect target trends, for each of several study designs?
2. What is the power to detect one or more target trends for a specified sampling design, comparing across study designs?
3. How is statistical power to detect trends affected by within-season survey replication?
4. How is statistical power to detect trends affected by the frequency of surveys? We refer to less-than-annual surveys as “skipping.” How much power is lost by skipping? Conversely, how much power is gained by adding Tier 3 sites to a study design that already has Tiers 1 and/or 2?

The power analysis is conducted at the two spatial scales of interest: sub-region and the entire CCS area of interest. Sub-regions are described in Element 2. Here, we do not consider any specific sub-region. Instead, we consider how statistical power varies with the sampling design characteristics of a generalized sub-region.

Table A1.1. Summarized sampling objectives with 80% probability of detection. See Element 2 for more details.

Sampling Objective   Species   Extent      Decline          Time (years)   Notes
SO1                  COMU      CCS         30% (1.18%/yr)   30
SO2                  COMU      CCS         30% (2.35%/yr)   15
SO3                  COMU      Subregion   40% (3.35%/yr)   15
SO4                  COMU      Subregion   40%              3              Compared to past 5 years.
SO5                  BRAC      CCS         30% (2.35%/yr)   15
SO6                  BRAC      Subregion   40% (3.35%/yr)   15
SO7                  BRAC      Subregion   40%              3              Compared to past 5 years.
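The annual rates shown in parentheses in Table A1.1 follow from compounding: a total decline D over Y years corresponds to a constant annual rate r satisfying (1 - r)^Y = 1 - D. A quick check (the function name is ours):

```python
def annual_rate(total_decline, years):
    """Constant annual decline r such that (1 - r)**years == 1 - total_decline."""
    return 1.0 - (1.0 - total_decline) ** (1.0 / years)

for total, years in [(0.30, 30), (0.30, 15), (0.40, 15)]:
    # Reproduces the 1.18, 2.35, and 3.35 %/yr rates in Table A1.1
    print(f"{total:.0%} over {years} yr -> {100 * annual_rate(total, years):.2f}%/yr")
```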

Simulation Methods

Overview

We estimated statistical power through the use of simulations in a three-stage process. First, we simulated population change over time for a population characterized by either (1) an underlying trend over time (SO1-SO3, SO5-SO6) or (2) a short-term change (SO4, SO7; see Table A1.1). We describe the power analysis for trend estimation, and then describe any differences with regard to power to detect short-term change.

In the second stage we simulated the survey process, incorporating variability in counts as determined from our statistical analysis of recent datasets. This resulted in simulated counts per survey at multiple sites within a sub-region, and multiple sub-regions within the CCS. Different scenarios were simulated based on variation in Components 1-3 (sampling effort, time span, and trend magnitude).

Third, we carried out statistical estimation of the simulated data. For each simulation, we determined the estimated trend, its standard error, and whether the trend was significant. The proportion of simulations in which the observed trend was statistically significant, expressed as a percent, is the statistical power to detect a trend of specified magnitude (Nur et al. 1999).
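The three stages can be sketched in a miniature simulation. The sketch below is illustrative only: the parameter values (process and residual standard deviations, starting colony sizes) are invented rather than estimated from the California and Oregon datasets, colony effects are removed by centering each site instead of fitting colony factors, the quadratic date effect is omitted, and a normal approximation stands in for the exact significance test:

```python
import math
import random

random.seed(42)

def simulate_power(n_sites=10, n_years=15, annual_decline=0.0235,
                   process_sd=0.10, residual_sd=0.30,
                   n_sims=500, z_crit=1.96):
    """Proportion of simulated surveys in which the ln-scale trend tests significant."""
    slope_true = math.log(1.0 - annual_decline)  # ln-scale change per year
    tbar = (n_years - 1) / 2.0
    n_signif = 0
    for _ in range(n_sims):
        # Stage 1: sub-region trajectory = trend + year-to-year process variation
        year_effect = [slope_true * t + random.gauss(0.0, process_sd)
                       for t in range(n_years)]
        xc, yc = [], []
        # Stage 2: ln counts at each site = site size + year effect + count error
        for _site in range(n_sites):
            site_size = random.uniform(4.0, 8.0)  # invented ln starting colony size
            ys = [site_size + year_effect[t] + random.gauss(0.0, residual_sd)
                  for t in range(n_years)]
            ymean = sum(ys) / n_years
            for t in range(n_years):   # center within site to absorb
                xc.append(t - tbar)    # colony-size differences
                yc.append(ys[t] - ymean)
        # Stage 3: OLS slope of ln count on year, normal-approximation test
        sxx = sum(x * x for x in xc)
        slope = sum(x * y for x, y in zip(xc, yc)) / sxx
        rss = sum((y - slope * x) ** 2 for x, y in zip(xc, yc))
        se = math.sqrt(rss / (len(xc) - 2) / sxx)
        if abs(slope / se) > z_crit:
            n_signif += 1
    return n_signif / n_sims

power = simulate_power()
print(f"Estimated power to detect a 2.35%/yr decline over 15 years: {power:.2f}")
```

Raising n_sites or n_years, or lowering the variance components, increases the estimated power, mirroring the components of power listed earlier.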

Description of the Simulations

We first describe simulations for a single sub-region. CCS-wide simulations were similar; we describe only the differences. Statistical analyses of four unpublished, long-term, and extensive datasets were used to estimate several parameter values for incorporation into the power analysis. These data were gathered and compiled by many seabird biologists over the years. The datasets used were: (1) aerial survey counts for COMU and BRAC surveys in California provided by G. McChesney and P. Capitolo; (2) aerial survey counts for COMU and BRAC surveys in Oregon provided by S. Stephensen; (3) COMU counts on the Upper Shubrick plot on Southeast Farallon Island provided by R. Bradley (unpublished Point Blue data extending the data analyzed by Sydeman et al. 1997); and (4) aerial survey counts of COMU individuals and BRAC nests from Washington provided by S. Thomas.

The most extensive datasets (in terms of length of time series and spatial extent) were the datasets for California and Oregon, hence estimation of parameters for use in the simulations focused on these two. We used the most recent 20 years of data for Oregon (1996-2015), a span in which most years had at least 30 colonies surveyed per species. The California data covered a similar time period, 1996-2016, with a similarly robust dataset. We considered excluding data from the strong El Niño year 1998, but did not because estimates of parameters for use in the simulations were similar with or without those data.

Analyses and simulations were conducted on natural log (ln[x+1] or loge[x+1]) transformed values. Either ln-transformation or use of a ln-link in the analysis is required because population change is a multiplicative process (e.g., a population declines at 2.35% per year) (Nur et al. 1999), and, furthermore, colony size varies by several orders of magnitude.
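The multiplicative point can be seen directly: under a constant percent decline, successive differences of ln-transformed counts are (nearly) constant, so the trend is linear on the ln scale (the starting count and rate below are invented):

```python
import math

decline = 0.0235  # an illustrative 2.35% decline per year
counts = [10000 * (1 - decline) ** t for t in range(5)]
ln_counts = [math.log(c + 1) for c in counts]            # the ln[x+1] transform
diffs = [ln_counts[t + 1] - ln_counts[t] for t in range(4)]
# Each yearly step on the ln scale is ~ln(1 - 0.0235), i.e. nearly constant
print([round(d, 4) for d in diffs])
```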

Selecting simulation model component values

Colony size varies among colony sites for both species. In the simulations, the starting size of each colony was randomly chosen from a distribution of values that matched the distribution of sizes in the datasets. Once the starting size of the colony was specified, we modeled change over time, and incorporated variability in survey counts as described below. Thus, there was a deterministic component to this change (the trend or short-term change), which the simulations were designed to estimate and test for significance. In addition, there was: 1) year-to-year variation in population size of the sub-region around the trend line; and 2) additional variability in counts for the individual colony, among and within years, as described below. A parallel approach was taken for SO4 and SO7 regarding short-term change.

Modeling population change over time

First, trend values were chosen: declines of 1.18, 2.35, and 3.35% per year for COMU (Sampling Objectives SO1-SO3) and 2.35 and 3.35% per year for BRAC (Sampling Objectives SO5 and SO6). We then added biological process variation reflecting true year-to-year variation (not due to sampling error; Gould & Nichols 1998) around the trend line for the sub-region. In other words, it is not realistic for a population decreasing on average 2.35% per year over 15 years to decline exactly 2.35% in every year. The magnitude of process variation was estimated from the datasets for California and Oregon using the method described by Cooch & White (2017).

Modeling variation in counts

Two final sources of variation were added to the simulations. First, variation in counts due to day of the season was estimated from the datasets. For both species, a quadratic curvature was found with counts peaking at an intermediate date within the period of surveys. Because the magnitude of the curvature was similar for California and Oregon datasets for both species, we used the value for California. For the statistical test used in the power analyses, we fit a model which included a quadratic date effect to adjust for the effect of date. This increased the ability to detect the true, underlying changes in trends specified by the simulation.

Variability not attributed to population change over time at the sub-region level, nor to day of the season, is referred to as residual variability. This variability has two sources. First, there is variation among replicate counts of the same colony in the same year, due to differences in colony occupancy, differences in behavior, and counting errors on each survey. Second, counts will vary between years for the same reasons. Thus, if the sub-region population declines by, for example, 5% between one year and the next, the change in counts at individual colonies can be substantially higher or lower than that value.

To estimate residual variability for the purpose of the simulations, we used the California dataset and fit a statistical model accounting for main effects of colony (some colonies are, overall, larger than others; “colony” treated as a factor), year (reflecting any and all year-to-year fluctuations in size of the population for the entire sub-region, with year as a factor), and the quadratic effect of date, as described above. The residual variation (i.e., unexplained variability) is measured by the Root Mean Square Error of such a model. This source of variation, and its magnitude, was incorporated into the simulations. Our estimates of residual variability were comparable among all four data sets.
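The residual-variability estimate described above (RMSE of a model with colony and year factors plus a quadratic date effect) might be computed along these lines. The function name and design-matrix construction are ours; the PF's actual analysis may differ in implementation detail.

```python
import numpy as np

def residual_rmse(log_counts, colony_ids, years, dates):
    """Fit ln-count ~ colony (factor) + year (factor) + date + date^2
    by least squares and return the root-mean-square error, as a
    measure of residual (unexplained) variability.

    Sketch only: assumes numeric arrays with at least two colonies
    and two years represented.
    """
    log_counts = np.asarray(log_counts, float)
    colony_ids = np.asarray(colony_ids)
    years = np.asarray(years)
    dates = np.asarray(dates, float)
    n = len(log_counts)
    cols = [np.ones(n)]
    for c in np.unique(colony_ids)[1:]:           # colony dummies
        cols.append((colony_ids == c).astype(float))
    for y in np.unique(years)[1:]:                # year dummies
        cols.append((years == y).astype(float))
    d = dates - dates.mean()
    cols += [d, d ** 2]                           # quadratic date effect
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, log_counts, rcond=None)
    resid = log_counts - X @ beta
    return float(np.sqrt(np.mean(resid ** 2)))
```

The RMSE returned here is the quantity that was incorporated into the simulations as residual variation.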

Our approach was to simulate a generalized sub-region rather than analyzing each proposed sub-region separately. We assume that the sub-regional results so obtained will help guide the choice of sampling design across the eight sub-regions. Initial inspection of the results indicates that sources of variation are similar among the sub-regions, but this assumption needs to be verified by further investigation.

Comparison of Study Designs

Study design specifies which tiers are included: Tier 1 sites (see Element 2) have replication and annual surveys. Tier 2 sites have no replication and annual surveys. Tier 3 sites have no replication and less-than-annual surveys. For the purposes of the power analysis, skipped surveys were assumed to follow a three-year cycle (i.e., sites surveyed every third year), but other frequencies can be considered as well. The power analysis compares the following study designs, as numbered:

1) Annual surveys with replication (Tier 1)
2) Annual surveys, no replication (Tier 2)
3) Triennial surveys, no replication (Tier 3)
4) Tier 1 + Tier 2 (annual, with and without replication)
5) Tier 2 + Tier 3 (annual and triennial, no replication)
6) Tiers 1 + 2 + 3

Thus, the three multi-tier designs considered all included Tier 2, which is currently the most prevalent in the CCS.
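The six designs and their survey effort can be encoded compactly. In this sketch (names and encoding are ours), Tier 1 contributes two surveys per site per year, Tier 2 one, and Tier 3 one-third (triennial surveys averaged over years):

```python
# Tiers included in each numbered study design.
DESIGN_TIERS = {1: [1], 2: [2], 3: [3],
                4: [1, 2], 5: [2, 3], 6: [1, 2, 3]}

# Average surveys per site per year for each tier (Tier 1 replicated
# twice annually; Tier 3 surveyed every third year).
SURVEYS_PER_SITE = {1: 2.0, 2: 1.0, 3: 1.0 / 3.0}

def surveys_per_year(design, sites_per_tier):
    """Average total surveys per year for a design, given the number
    of colony sites in each included tier."""
    return sites_per_tier * sum(SURVEYS_PER_SITE[t]
                                for t in DESIGN_TIERS[design])
```

This reproduces the effort figures quoted in the results: for example, Design 4 with 4 sites per tier yields 12 surveys/yr, and Design 6 with 4 sites per tier yields about 13.3 surveys/yr.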

Number of Sites

Within each study design, we assessed statistical power in relation to sample size, which allowed us to compare sampling designs. For each sub-region, the number of colony sites sampled per tier could be 2, 3, 4, 5, and so on, up to 16 sites. Designs 1-3 include exactly that number of sites. Designs 4 and 5 each contain two tiers and, therefore, twice the total number of sites (i.e., 4, 6, 8, 10, etc.). Design 6 contains three tiers and three times as many total sites (i.e., 6, 9, 12, 15, etc.).

Statistical Test Used for Power Analysis to Detect Declining Trends

We fit a linear model to the simulated data, estimating the trend over time. As previously mentioned, the response variable was transformed [ln(x + 1)]. We fit a linear trend for year, which corresponds to constant proportional change over time. The model included colony as a fixed, categorical effect (i.e., colony as a factor) and the date of survey as a quadratic (linear and quadratic terms included). Although negative binomial regression (i.e., a generalized linear model with negative binomial error structure and ln-link; Hilbe 2011) is a desirable means to analyze count data, a linear model provides a robust result for the simulations and a much quicker fit. We confirmed that residuals from the simulation were normally distributed (recall that the dependent variable was first ln-transformed).

We used a Type I error rate of alpha = 0.05 (Nur et al. 1999), assuming a two-sided test of the null hypothesis that the trend = 0. Use of alpha = 0.10 would increase power, but the difference in power is not large (results not reported).

Time Span and Simulations

Time span was either 15 years or 30 years, as per the sampling objectives (SO1-SO7). We ran 1000 simulations for each combination of study design, number of sites, time span, and trend magnitude for the sub-region analysis (see below for other analyses).
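The simulation-based power estimate is simply the fraction of simulated datasets in which the trend test rejects the null hypothesis. The following is a deliberately simplified, self-contained sketch (a single pooled log-linear regression with no colony or date effects, and a normal approximation for the p-value); it illustrates the Monte Carlo logic, not the PF's full model.

```python
import numpy as np
from statistics import NormalDist

def estimated_power(trend, noise_sd, n_sites, n_years=15,
                    n_sims=1000, alpha=0.05, seed=1):
    """Monte Carlo power: fraction of simulated datasets in which a
    log-linear regression detects a nonzero trend at level alpha."""
    rng = np.random.default_rng(seed)
    yrs = np.tile(np.arange(n_years, dtype=float), n_sites)
    X = np.column_stack([np.ones_like(yrs), yrs - yrs.mean()])
    XtX_inv = np.linalg.inv(X.T @ X)
    hits = 0
    for _ in range(n_sims):
        # Simulate ln-counts: trend plus lumped noise.
        logy = np.log(1 + trend) * yrs + rng.normal(0, noise_sd, yrs.size)
        beta, *_ = np.linalg.lstsq(X, logy, rcond=None)
        resid = logy - X @ beta
        s2 = resid @ resid / (yrs.size - 2)
        se = np.sqrt(s2 * XtX_inv[1, 1])
        z = beta[1] / se
        if 2 * (1 - NormalDist().cdf(abs(z))) < alpha:
            hits += 1
    return hits / n_sims
```

As expected, power falls as the lumped noise grows and rises with the number of sites, mirroring the patterns reported in the Results below.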

Power Analysis for CCS-wide Trend

Using the same methodology as for the single sub-region, we carried out simulations to assess statistical power to detect target trends, comparing among sampling designs for multiple sub-regions rather than a single sub-region. For COMU, we assumed five sub-regions because several have no or few COMU colonies. Because the number of "effective" sub-regions for the purpose of the analysis is not known, we selected a conservative number. Though we may be underestimating power in this manner, we consider a conservative choice to be appropriate. For BRAC, we assumed seven sub-regions using similar reasoning. For CCS-wide analyses, we did not assume that all sub-regions conformed to the same trend; instead, we assumed a mean trend (as specified by SO1, SO2, or SO5; Table A1.1) and allowed for variability around the mean trend. Specifically, we assumed that the trend for each sub-region within the CCS was drawn from a probability distribution of trends whose mean was the target trend, with an SD of 0.20 times the mean (i.e., CV = 0.20). Due to the larger number of sites (distributed over five to seven sub-regions), we carried out 500 simulations for each sampling design scenario.

Power Analysis for Increasing Trends

Sampling Objectives SO1-SO7 specifically refer to detecting declining trends. In addition, we evaluated the power to detect increasing trends of equivalent magnitude with regard to proportional change. For example, a 40% decline over 15 years (3.35%/yr), to 3/5 of the starting value, is comparable to a 66.7% increase over 15 years, i.e., to 5/3 of the starting value. Thus, the increasing trends evaluated were 2.41%/yr (cf. the 2.35%/yr decline) and 3.47%/yr (cf. the 3.35%/yr decline). In terms of ln(trend), the coefficients of the increasing and decreasing trends had the same magnitude (± 0.0238 and ± 0.0341, respectively).
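The equivalence is symmetry on the log scale: an increase matching a decline r has ln-coefficient +|ln(1 - r)|, i.e., annual rate exp(-ln(1 - r)) - 1 = r / (1 - r). A one-line check (function name is ours):

```python
import math

def equivalent_increase(decline_per_yr):
    """Annual increasing trend whose ln-scale coefficient has the same
    magnitude as the given annual proportional decline
    (e.g., a 0.0335 decline corresponds to about a 0.0347 increase)."""
    return math.exp(-math.log(1 - decline_per_yr)) - 1
```

This reproduces the paired values used above: a 3.35%/yr decline pairs with a 3.47%/yr increase, and a 2.35%/yr decline with a 2.41%/yr increase, with ln-coefficients of magnitude 0.0341 and 0.0238, respectively.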

Power Analysis for Short-term Change

SO4 and SO7 refer to detecting, for a single sub-region, a short-term drop of 40% in the mean abundance metric for a 3-year period compared to the previous 5 years. We simulated such a change, assuming no trend in abundance during the 8-year period. In general, all aspects of the simulation were the same as for the trend power analysis, except that the postulated underlying change over time was a one-time change in the mean rather than a trend. Note that we retained process variation around the mean value (prior to and subsequent to the "step change"). We also conducted a power analysis to detect short-term increases; the magnitude considered was a 66.7% increase, for the reasons outlined above. Here, the statistical test was a difference in means between the two time periods (a 5-year baseline period followed by 3 years "post-change"). For this analysis, we only considered Designs 1, 2, and 4 (Tier 1 and/or Tier 2); Tier 3 sites would not be suitable for detecting short-term changes.
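The baseline-versus-post comparison might be sketched as a two-sample test on ln-transformed counts. Assumptions: the function name is ours, and it uses a two-sample z statistic with a normal approximation; the PF's analysis may use a different exact test.

```python
import numpy as np
from statistics import NormalDist

def step_change_test(baseline, post):
    """Test for a change in mean ln-count between a baseline period
    (e.g., 5 years of counts) and a post-change period (e.g., 3 years).
    Returns (z, two-sided p-value); z < 0 indicates a drop."""
    a = np.log(np.asarray(baseline, float) + 1.0)
    b = np.log(np.asarray(post, float) + 1.0)
    # Welch-style standard error of the difference in means.
    se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
    z = (b.mean() - a.mean()) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))
```

On the ln scale, a 40% drop shifts the mean by ln(0.6) ≈ -0.51, which is the signal this test must detect against process and residual variation.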

Thus, the power analysis addressed the following questions, at the sub-regional and CCS-wide scales:

1. For a given study design and magnitude of change over time, what is the number of sites needed to achieve a specified level of statistical power, e.g., 80%?
2. For a given study design, specified number of sites, and magnitude of change over time, what is the expected statistical power?
3. How does statistical power vary with replication and skipping (i.e., in the latter case, triennial rather than annual surveys)?

Results

BRAC Trends

Sub-regional Trends, Declining and Increasing

We first examined the power to detect a trend of 3.35% decline per year, over 15 years, for a single sub-region (SO6). Design 2 (Tier 2 only) requires 14 sites to achieve 80% power, while Design 1 (Tier 1 only) can achieve 80% power with 6 sites (i.e., 12 surveys per year) (Figure A1.1). The difference between Designs 1 and 2 is essentially due to Tier 1 having two surveys per year rather than one. As expected, Design 3 (Tier 3 only) had much lower power, about 15 to 26% lower than Tier 2 for numbers of sites varying from 4 to 16. Designs 4-6 also achieved 80% power, as follows (Figure A1.1): Design 4 with 4 sites per tier (Tiers 1 and 2; 8 sites total, 12 surveys/yr); Design 5 with 9 sites per tier (Tiers 2 and 3; 18 sites total, c. 12 surveys/yr); and Design 6 with 4 sites per tier (Tiers 1, 2, and 3; 12 sites total, c. 13.3 surveys/yr).

The total number of sites needed varied depending on the design, as designs differ with respect to replication and skipping. The primary determinant of power is the total number of surveys per year (i.e., number of sites x number of surveys per site per year). To detect a 3.35% trend, approximately 12 to 14 surveys per year were needed to achieve the desired 80% power; the different designs stipulate how that survey effort is allocated.

As expected, the power to detect increasing trends of comparable magnitude was similar to that obtained for the declining trends (Fig A1.1). For a given sampling design, the difference in power for detecting increasing versus declining trends was generally 2.0% or less. Thus, the number of sites needed to achieve 80% power was similar for detecting increasing vs. comparable decreasing trends (Table 2.3).

Figure A1.1. The statistical power to detect declining and increasing trends of BRAC populations of a typical sub-region in relation to Study Design and number of colony sites per Tier. Time span is 15 years; trends are percent change per year. Study Designs are: 1 (Tier 1); 2 (Tier 2); 3 (Tier 3); 4 (Tiers 1 and 2); 5 (Tiers 2 and 3); 6 (Tiers 1, 2, and 3). The lowess smooth curve for power is shown for each simulation scenario; the dashed horizontal lines indicate 80% power.


CCS-wide Trends

The power to detect the target trend of 2.35% decline over 15 years, as stipulated by SO5, was 80% if approximately six sites were surveyed once per year per sub-region (Design 2; Figure A1.2). With replication (Design 1), half that number was needed. With Design 4 (Tiers 1 and 2), a total of four colony sites are needed per sub-region. With Designs 5 and 6, the total number of sites needed per sub-region is eight and six, respectively (Figure A1.2). Thus, the number of sites per sub-region needed to detect a 2.35%/yr decline across the entire CCS (here assumed to include seven sub-regions) is substantially less than the number of sites needed to detect a 3.35%/yr decline for a single sub-region (cf. Figures A1.1 and A1.2). As a result, if the sub-regional sampling objective is met, the CCS-wide objective will also be met.

Figure A1.2. The statistical power to detect declining and increasing trends of BRAC populations of the entire CCS in relation to Study Design and number of colony sites per Tier. Time span is 15 years; trends are percent change per year. Study Designs are: 1 (Tier 1); 2 (Tier 2); 3 (Tier 3); 4 (Tiers 1 and 2); 5 (Tiers 2 and 3); 6 (Tiers 1, 2, and 3). The lowess smooth curve for power is shown for each simulation scenario; the dashed horizontal lines indicate 80% power.

Results were similar for detecting comparable increasing trends (Fig A1.2). The difference in power between detecting a decreasing trend and an increasing trend was relatively small; thus, the sample sizes needed were similar but not identical (Table 2.4).

COMU Trends

Sub-regional Trends, Declining and Increasing

Surveying nine sites annually (Design 2) can achieve 80% power to detect a declining trend of 2.35%/yr over a 15-year period (Fig A1.3). With replication, the number of sites needed dropped even further, as expected. For example, with Design 1 (Tier 1 only), five sites are needed (10 surveys per year; Fig. A1.3). With Design 4, six sites (3 Tier 1, 3 Tier 2) achieve 80% power, a design that corresponds to nine surveys per year. In contrast, with triennial surveys only (Design 3), 12 sites (an average of four surveys per year) achieve only 63% power. However, combining Tier 3 sites with Tiers 1 and/or 2 resulted in effective designs for detecting trends. For example, with Design 5, surveying a total of 12 sites (six each of Tiers 2 and 3) achieved 80% power. This is substantially fewer colony sites than needed to detect a comparable trend for BRAC (Table 2.3), due to the greater annual variability observed for BRAC.

Figure A1.3. The statistical power to detect declining and increasing trends of COMU populations of a typical sub-region in relation to Study Design and number of colony sites per Tier. Time span is 15 years; trends are percent change per year. Study Designs are: 1 (Tier 1); 2 (Tier 2); 3 (Tier 3); 4 (Tiers 1 and 2); 5 (Tiers 2 and 3); 6 (Tiers 1, 2, and 3). The lowess smooth curve for power is shown for each simulation scenario; the dashed horizontal lines indicate 80% power.

Results for increasing trends were within 2% of those for comparable declining trends (Figure A1.3). For COMU, all sample sizes listed above for detecting a declining trend also applied for detecting an increasing trend.

CCS-wide Trend

The power to detect the target trend of 2.35% decline over 15 years was 80% if five colony sites were surveyed once per year per sub-region, across all sub-regions (Tier 2 only; Fig. A1.4). With replication (Tier 1), surveying three sites per sub-region (six surveys/yr) would meet the sampling objective. With Design 4 (Tiers 1 and 2), four colony sites are needed per sub-region (six surveys/yr). Design 5 (Tiers 2 and 3) can achieve 80% power with eight sites (on average, 5.3 surveys/yr).

Figure A1.4. The statistical power to detect declining and increasing trends of COMU populations in the entire CCS in relation to Study Design and number of colony sites per Tier. For 15 year time span, the respective trends are 2.35%/yr decline (small diamond) and 2.41%/yr increase (triangle); lowess smooth curve shown as solid line. For 30 year time span, trends are 1.18%/yr decline (filled circle) and 1.20%/yr increase (square); lowess smooth curve shown as dashed line. Study Designs are: 1 (Tier 1); 2 (Tier 2); 3 (Tier 3); 4 (Tiers 1 and 2); 5 (Tiers 2 and 3); 6 (Tiers 1, 2, and 3). 80% power shown with dashed horizontal line.

We also determined the power to detect a 1.18%/yr decline as well as a 1.20%/yr increase over 30 years (cf. SO1). Power was substantially higher for these two trend scenarios compared to detecting the target trend over a 15 year period, presumably because of the large sample size available over a 30 year period (Figure A1.4).

Results were similar for increasing trends compared to declining trends of comparable magnitude (Fig A1.4; Table 2.4). Comparing increasing and declining trends, differences in power for the sample sizes of interest were in all cases less than 3%.

As with BRAC, the number of sites per sub-region needed to detect a 2.35%/yr decline across the entire CCS (here assumed to include five sub-regions) is less than the number of sites needed to detect a 3.35%/yr decline for a single sub-region (cf. Figures A1.3 and A1.4). Thus, if the sub-regional sampling objective is met (SO3), we can expect that the CCS-wide objectives (SO1, SO2) will also be met.

Short-term Changes for BRAC and COMU

For BRAC, twelve Tier 2 colony sites (Design 2) are sufficient to achieve 80% power to detect a short-term 40% drop (Fig A1.5); interpolation suggests that 11 sites would suffice as well. Six Tier 1 sites (Design 1) can achieve the same power, as can four Tier 1 and four Tier 2 sites (Design 4). Both of these designs require 12 surveys/yr. Results were similar when considering an increase of comparable magnitude (66.7%; Figure A1.5).

Figure A1.5. The statistical power to detect short-term change of BRAC populations for a typical sub-region in relation to Study Design and number of colony sites per Tier. Change analyzed is a 40% drop over a 3-year period or a 66.7% increase over a 3-year period; see text. Study Designs are: 1 (Tier 1); 2 (Tier 2); 4 (Tiers 1 and 2). Tier 3 sites were not considered for short-term change. The lowess smooth curve for power is shown for each simulation scenario; the dashed horizontal lines indicate 80% power.

For COMU, eight Tier 2 colony sites (Design 2) are sufficient to achieve 80% power to detect a 40% drop (Fig A1.6). Four Tier 1 sites (Design 1) can achieve the same power, as can three Tier 1 and three Tier 2 sites (Design 4). Results were similar when considering an increase of comparable magnitude (66.7%; Figure A1.6).


Figure A1.6. The statistical power to detect short-term change of COMU populations in a typical sub-region in relation to Study Design and number of colony sites per Tier. Change analyzed is a 40% drop over a 3-year period or a 66.7% increase over a 3-year period; see text. Study Designs are: 1 (Tier 1); 2 (Tier 2); 4 (Tiers 1 and 2). Tier 3 sites were not considered for short-term change. The lowess smooth curve for power is shown for each simulation scenario; the dashed horizontal lines indicate 80% power.

Thus, for both species, sampling designs that can achieve the sub-regional trend objectives (SO3 and SO6 for COMU and BRAC, respectively) have more than enough power to detect short-term change, thus meeting objectives SO4 and SO7.

Take-away Points from the Power Analysis

We draw the following inferences and conclusions from the power analysis of the two species.

1. The limiting factor in terms of sample size needed to meet the sampling objectives (SO1-SO7) is the power to detect the target trend for an individual sub-region. Sampling designs that meet SO3 (with regard to COMU) will also meet SO1, SO2, and SO4; sampling designs that meet SO6 (with regard to BRAC) will also meet SO5 and SO7.

2. For BRAC, if sites are surveyed annually without replication (Design 2), approximately 14 colony sites would be needed to achieve SO6. However, other designs can also achieve the same sampling objective, such as surveying six or seven sites with replication (Design 1). Designs 4, 5, and 6, involving two or more tiers, can be considered as well.

3. For COMU, five to nine sites surveyed per year per sub-region, depending on whether or not there is replication, will achieve at least 80% power to detect the target trend for a single sub-region.

4. For COMU, a sampling design of five Tier 2 sites surveyed per sub-region will have sufficient power to detect a 2.35%/yr decline across the entire CCS over a span of 15 years (SO2). While this number of sites is sufficient for assessing an overall CCS-wide trend, if present, it would not be adequate for characterizing change at the sub-regional level, including the comparison of trends among sub-regions within the CCS.

5. Consideration of power to detect a trend alone will not determine which sampling design should be used, because there are multiple ways to achieve the desired level of power. The power analysis results illustrate the trade-offs between sampling fewer sites more frequently and sampling more sites less frequently, as discussed in Element 2. Thus, Tier 3 sites can provide benefit when combined with Tiers 1 and/or 2, by augmenting power and providing a broader spatial basis from which to infer sub-regional trends.

Caveats

The power analysis is intended to inform the selection of a sampling design. It is particularly helpful in showing how power is expected to change with a change in the number of sites surveyed, the study design (i.e., degree of replication or frequency of surveying), trend magnitude, time span, etc. For example, Figures A1.1 to A1.6 illustrate the gain in power as one increases the number of colony sites surveyed per tier. However, the precise estimates of power depend on the assumptions made in the modeling exercise. As a result, the metric reported here provides a relative measure of statistical power, but we acknowledge the uncertainty regarding the absolute values reported. As new survey data collected under the PF become available, the model assumptions will be re-evaluated and any changes in analysis results will be identified.

References

Cooch EG, and White GC. 2017. MARK: A Gentle Introduction, 17th Edition. Available at www.phidot.org.

Gould WR, and Nichols JD. 1998. Estimation of temporal variability of survival in animal populations. Ecology 79:2531-2538.

Hilbe J. 2011. Negative Binomial Regression, 2nd ed. Cambridge University Press, Cambridge, UK.

Nur N, Jones SL, and Geupel GR. 1999. Statistical Guide to Data Analysis of Avian Monitoring Programs. Biological Technical Publication, US Fish and Wildlife Service, BTP-R6001-1999.

Sydeman WJ, Carter HR, Takekawa TE, and Nur N. 1997. Common Murre population Uria aalge trends at the South Farallon Islands, California, 1985-1995. Unpublished Report, Point Reyes Bird Observatory, Stinson Beach, CA; U.S. Geological Survey, Dixon, CA; and USFWS, Newark, CA.


Appendix 2. Descriptions of Data Fields for Entry into the Colony Registry and Colony Survey Database

This is a list of minimum fields to be included in an SSP under this PF.

PSP Colony Registry (PCR) – A database of the physical locations of each seabird colony.

MAP NUMBER – From USGS 1:250,000 topographic quadrangle map index.
SITE NUMBER – Colony sites numbered sequentially north to south, except in CA, where colonies added after initial numbering were numbered sequentially as discovered. In OR and WA, new colonies were assigned whole numbers identical to the nearest colony, plus sequential digits right of the decimal. In BC, a new colony is assigned the next integer higher than the colony to the north.
SUB-COLONY NUMBER – In CA, some parts of colony complexes were assigned a sub-colony number.
COLONY CODE – Composed of Map Number, hyphen, Site Number, hyphen, and sub-colony number (if present) (ex. 123-032-011).
STATE – State/Province abbreviation (AK, BC, WA, OR, CA, BCN (Baja California), BCS (Baja California Sur)).
SITE NAME – Recognized name of the colony; unofficial names are given in quotation marks.
COUNTY – County of the respective state.
DESCRIPTION – Site description.
COMPLEX – 1 = part of a complex; 0 = not part of a complex.
LATITUDE – Latitude of colony in decimal degrees based on WGS84. For onshore colonies, latitude is given for the center point. For groups or complexes of offshore rocks or islands that constitute a single colony, latitude is given for the centroid.
LONGITUDE – Longitude of colony in decimal degrees based on WGS84. For onshore colonies, longitude is given for the center point. For groups or complexes of offshore rocks or islands that constitute a single colony, longitude is given for the centroid.
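Building a COLONY CODE from its parts can be sketched as below. The function name is ours, and the zero-padding widths are inferred from the example code in the registry description (123-032-011); an SSP or the PCR schema may specify different widths.

```python
def colony_code(map_number, site_number, sub_colony=None):
    """Compose a colony code: map number, hyphen, site number, and an
    optional sub-colony number, each zero-padded to three digits
    (padding inferred from the example '123-032-011')."""
    code = f"{map_number:03d}-{site_number:03d}"
    if sub_colony is not None:
        code += f"-{sub_colony:03d}"
    return code
```

For example, map 123, site 32, sub-colony 11 yields "123-032-011".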

PSP Colony Survey Database (PCSD) – NOTE: these fields are for the entry of data collected under this PF.

COLONY CODE – Corresponds to the field in the PCR above.
DATE – Date of survey in the format of a 2-digit month, a slash, a 2-digit day, a slash, and a 4-digit year (ex. 06/02/2017).
OBSERVERS – Full names of people collecting data, or as given on the sheet (e.g., David S. Pitkin, Roy W. Lowe). Upper- and lower-case characters are used.
START TIME – 24-hour format.
END TIME – 24-hour format.
SURVEY TYPE – Survey method used to census or view the colony. Combinations can be used (e.g., B, M for boat and mainland coverage of Yaquina Head).
A = Aerial survey from fixed-wing aircraft or helicopter
UAV = Unmanned Aerial Vehicle (drone)
AP = Aerial photography
B = Boat
M = Land-based survey from a disjunct mainland observation point
C = Land-based survey in the colony or on the island
SPECIES – BRAC for Brandt's cormorant and COMU for common murre.


ACTUAL BIRD COUNT – Actual number of identified birds reported by observer.
NUMBER OF NESTS – Number of nests. Filled in for species in which nests were counted. If nests were looked for and not found, the value is zero. If nests were not looked for or not noted, the value is X. For BRAC, use the best estimate of nests derived from all categories of nests (see Element 3).
WBN – For BRAC only, number of well-built nests.
FBN – For BRAC only, number of fairly-built nests.
PBN – For BRAC only, number of poorly-built nests.
NWC – For BRAC only, number of nests with chicks present.
LBC – For BRAC only, number of
AN – For BRAC only, number of abandoned nests.
EN – For BRAC only, number of empty nests.
UN – For BRAC only, number of undetermined nests.
TS – For BRAC only, number of territorial sites (not used to determine number of nests above).
US – For BRAC only, number of undetermined sites (not used to determine number of nests above).
ESTIMATED NUMBER OF BREEDING BIRDS – Population estimate of the number of breeding birds; for species where nests were counted instead of birds, this is filled with (NUMBER OF NESTS x CONVERT), where CONVERT = 2 (for each BRAC nest) or 1.67 (for each individual COMU). This can be filled as 0 for negative data (survey done, no birds seen).
WHAT COUNTED – What was counted: birds or nests.
BO = Individual birds: actual count of individual breeding birds in the colony.
BE = Individual birds: estimate of individual breeding birds in the colony.
N = Nests: count of nests in colony.
E = Pairs: count or estimate of breeding pairs in colony.
ESTIMATE TYPE – Type of estimate or count made by the observer. Codes may be qualitative or quantitative. These codes are the same as those used for the Beringian Seabird Colony Catalog (USFWS 2006):
W = An actual count: exact count of each bird, pair, or nest in colony.
S = Known percent of colony: exact count of birds, pairs, or nests in a known fraction of colony area; count extrapolated to total number. Notes should describe method further.
Z = Part estimated, part counted: exact count on part of colony area; remainder of colony and total number in colony estimated.
T = Count by groups: count of birds, pairs, or nests by groups (e.g., 10's or 1000's). Notes should describe method further.
K = Count adjusted by observer: observer adjusted count of individuals, nests, or pairs to give a better estimate of total population (e.g., using a study of attendance at nests). Notes should describe method further.
G = Observer may have adjusted count: observer reported number of birds, nests, or pairs; did not say whether number was original count or adjusted estimate.
Y = Estimate, not an actual count: observer estimated birds, nests, or pairs by some other method. Observer did not make an exact count of individuals or groups, but numbers were provided (e.g., "500 to 1000" or "thousands"). Notes should describe method further.
X = Present, no actual count: observer reported breeding birds of this species at colony, but no estimate of numbers.
P = Probable: observer reported breeding birds probably or possibly present at colony, but no estimate of numbers.
V = Common or abundant: observer reported breeding birds common or abundant at colony, but no numerical estimate.
R = Rare or uncommon: observer reported breeding birds uncommon or rare at colony, but no estimate of numbers.
U = Type of estimate unknown: census method or accuracy of method unknown (observer did not describe method well).

CONVERT – Conversion. This code describes how the count or survey data were adjusted to generate estimates of breeding birds. These codes are the same as those used for the Beringian Seabird Colony Catalog (USFWS 2006):
2 = Count of nests is multiplied by 2 for count of breeding birds.
1.67 = Count of nests is multiplied by 1.67 for count of breeding birds.
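The conversion described above is a single multiplication. A minimal sketch (the names and the species-to-factor mapping are ours, following the CONVERT codes given for BRAC and COMU):

```python
# Conversion factors per the CONVERT field: 2 per BRAC nest,
# 1.67 per COMU count unit.
CONVERT = {"BRAC": 2.0, "COMU": 1.67}

def estimated_breeding_birds(species, count):
    """Apply the species conversion factor to a count to obtain the
    ESTIMATED NUMBER OF BREEDING BIRDS field value."""
    return count * CONVERT[species]
```

For example, 50 BRAC nests convert to an estimate of 100 breeding birds.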

REPLICATIONS – The number of times a colony was surveyed in a year.

DATA ENTRY DATE: Data entry date.

DATA ENTRY NAME: Name of the person entering the data.

PHOTO INTERPRETER NAME: Full names of people counting seabirds on slides taken during an aerial photography survey. Slide counting may or may not be performed by the observers listed in the Census table. Upper and lower case characters are used.

References

[USFWS] U.S. Fish and Wildlife Service. 2006. Beringian Seabird Colony Catalog - computer database and Colony Status Record archives. U.S. Fish and Wildl. Serv., Migr. Bird Manage., Anchorage, AK.


Appendix 3. Science Information Needs

This PF is intended to be a living document that will be revised and updated as new information becomes available that will improve any of its elements (e.g., field methods, data management) associated with conducting breeding colony surveys of COMU and BRAC described herein. Following is a list of the currently recognized scientific information needs that would be most useful for facilitating full achievement of the goals and objectives of this PF and the PSP in general.

1. Database Requirements Analysis

As described in Element 4, the selection (or development) of a database that will serve as the repository of survey data generated under this PF is critical to integrating those data for analyses and, in turn, making the data and their analyses available to the cooperators. Crucial information needed for database development and deployment is a requirements document, including attributes and functionalities of the database. These attributes will be based on the needs of all the users, which may be determined by a formal survey of all PSP partners administered by the Data Manager. Completing this task is a top priority for the PSP Data Manager.

2. Detailed nesting phenology of representative colonies of COMU and BRAC for all sub-regions

The accuracy of the colony counts as estimates of the breeding populations is closely tied to the timing of the surveys relative to the nesting stage of each species. COMU colony occupancy rates by breeding adults are known to peak during the incubation stage; therefore, annual surveys are timed to coincide with the average "window" of dates when most breeding adults are incubating eggs. Because there is variation in intra-colony breeding synchrony, in adult occupancy relative to the time of day and tidal stage, and in the inter-annual date of initiation of egg-laying, there is added uncertainty with respect to the best date to survey. Variation in the degree of overlap of the COMU and BRAC breeding cycles by season and location also increases the chance that a single aerial survey of both species may occur outside the ideal window when most adults are present. Although BRAC nests are relatively easy to count and provide a good proxy for the number of breeders (each nest equals two breeding birds), estimates are complicated by the fact that nest building within a colony is often poorly synchronized (some nests may be initiated after the survey) and that not all nests actually result in egg-laying (see Figure 3.1).

Much of the uncertainty associated with these variables and others can be mitigated if the dates of aerial surveys can be more precisely correlated with the phenology of the colony. (See Element 2 for a discussion of how historical count data may also be used to statistically reduce uncertainty due to survey timing.) At a minimum, phenological studies should document the dates of initial egg-laying, mean egg-laying, first hatching, mean hatching, first fledging, and mean fledging. These benchmarks should be associated with counts of adults present on the same colony at different times of day, tidal stages, and weather conditions. Nesting phenology studies will most likely occur on Tier 1A colonies (Element 2) using land-based surveys, and ideally would include representative colonies within each sub-region. Because suitable phenology study colonies may not be available in every sub-region, in practice, nesting phenology within a sub-region may have to be estimated from local anecdotal phenological observations. Therefore, it is important to identify all colonies where phenology studies via land- or boat-based observations are feasible and to conduct those studies so sub-regional variation can be estimated and provide context for aerial colony counts. Details about the frequency, duration, location, and data collection of these studies should be decided in consultations between site Survey Coordinators, PSP staff, and seabird researchers.
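To make the benchmark definitions above concrete, the sketch below computes first and mean egg-laying dates from a hypothetical list of per-nest event records; the record layout and field names are illustrative assumptions, not a data format prescribed by this framework.

```python
from datetime import date, timedelta

# Hypothetical per-nest event records: (nest_id, event, date).
# "lay" = first egg observed; "hatch" = first chick observed.
events = [
    ("N01", "lay", date(2018, 5, 2)),
    ("N02", "lay", date(2018, 5, 6)),
    ("N03", "lay", date(2018, 5, 10)),
    ("N01", "hatch", date(2018, 6, 3)),
    ("N02", "hatch", date(2018, 6, 8)),
]

def benchmarks(events, kind):
    """Return (first date, mean date) for one event type, or (None, None)."""
    dates = sorted(d for _, e, d in events if e == kind)
    if not dates:
        return None, None
    origin = dates[0]
    # Mean of the day offsets from the earliest event, rounded to a whole day.
    mean_offset = sum((d - origin).days for d in dates) / len(dates)
    return origin, origin + timedelta(days=round(mean_offset))

first_lay, mean_lay = benchmarks(events, "lay")
print(first_lay, mean_lay)  # → 2018-05-02 2018-05-06
```

The same function applied with "hatch" or "fledge" events yields the remaining benchmarks.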

3. Metapopulation structure of CCS COMU and BRAC populations

Very little is known about the degree of genetic mixing or metapopulation dynamics that occurs in these two species within the large geographic range of the CCS. COMU within the CCS are considered one subspecies (U. a. californica; Nettleship 1996), and there are no recognized subspecies of BRAC (Wallace and Wallace 1998). Although this indicates genetic mixing within the CCS is occurring, there may still be a metapopulation structure with distinct subpopulations subject to different selective pressures and demographics. Improved knowledge of whether distinct subpopulations exist and, if so, where the boundaries between subpopulations are located, their degree of permeability, and whether long-term source/sink dynamics are present would provide context for sub-regional population trends detected via survey data, help define where sub-region boundaries should be located, and assist determination of what environmental factors might be driving subpopulation trends.

References:

Nettleship DN. 1996. Family Alcidae (auks). In: del Hoyo J, Elliott A, Sargatal J, editors. Handbook of the birds of the world. Barcelona: Lynx Edicions. p. 678-722.

Wallace EA, Wallace GE. 1998. Brandt's Cormorant (Phalacrocorax penicillatus). In: Rodewald PG, editor. The Birds of North America. Ithaca: Cornell Lab of Ornithology. Retrieved from the Birds of North America: https://birdsna.org/Species-Account/bna/species/bracor. DOI: 10.2173/bna.362.


Standard Operating Procedures (SOP)

SOP 1: Standard Operating Procedure for conducting aerial surveys of common murre and Brandt’s cormorant colonies within the California Current System

Revision History Log

Previous version number | Revision date | Author | Changes made | Section and paragraph number | Reason | Approved by | New version number

Revisions to this SOP will be logged here. Version numbers increase incrementally by tenths (e.g., 1.1, 1.2) for minor changes. Major revisions should be designated with the next whole number (e.g., 2.0, 3.0, etc.).

Purpose

This standard operating procedure (SOP) describes minimum standards and required procedures for conducting aerial surveys of breeding colonies of common murres (COMU) and Brandt’s cormorants (BRAC) for the purpose of counting breeding adults per colony. Surveys using this SOP will generate population data for these two surface-nesting seabird species consistent with the requirements of the U.S. Fish and Wildlife Service (USFWS) Pacific Seabird Inventory and Monitoring Program (PSP).

Application

This SOP should be used as a guide by any entity (National Wildlife Refuge or other base of operations) developing an SSP for conducting aerial population surveys of seabird colonies within the California Current System (CCS), which extends along the West Coast from southern British Columbia to northern Mexico. This SOP describes methods for photographing subject colonies from helicopter or fixed-wing aircraft using high-resolution digital photographic equipment to acquire images from which counts of adult birds (COMU) and nests (BRAC) can be taken.

Personnel roles and training

Survey Coordinator: Typically the refuge manager or biologist; this person is responsible for all aspects of survey planning and preparation, safety procedures, implementation, data quality assurance and management, data analyses, reporting, and selection of the field crew; assures all crew members are qualified and adequately trained; and prepares a survey Flight Plan in coordination with the Pilot.

Survey crew: One or two technicians assisting the Survey Coordinator; secondary photographer and/or navigator and note-taker; responsible for knowing and adhering to all activity schedules, survey protocols and safety procedures.


Aircraft pilot: Certified operator of the survey aircraft; assures safety plan compliance; files completed flight plan with the airport; coordinates with the Survey Coordinator on all aspects of the mission.

Flight follower: Tracks the survey flight from a ground base by computer and direct contact with the survey crew, and contacts U.S. Coast Guard and other emergency responders in emergencies according to the safety plan.

Supervisor: The Survey Coordinator’s supervisor, typically the refuge (or site) manager or project leader, who authorizes and approves the survey and safety plan and assures compliance with all DOI and USFWS policies.

Support personnel: Other persons not directly conducting the survey, but may contribute to its planning and execution, including administrative staff, Safety Officer, maintenance staff, Zone I&M biologist, and PSP staff.

All aerial operations (missions) conducted with Federal employee participation must comply with all FAA and DOI safety, training, and operational policies and regulations. This SOP is not intended to serve as a replacement or substitute for the official documents that describe those policies and regulations; therefore, it is incumbent on all personnel involved with aerial surveys to be familiar and in compliance with the latest versions of those official documents. The latest versions of the following documents (and others) most pertinent to field personnel may be found at the DOI Office of Aviation Services (OAS) website: https://www.doi.gov/aviation/library/forms.
1. Interagency Aviation Training Guide
2. Basic Aviation Safety Manual
3. Interagency Aviation Mishap Response Guide and Checklist
4. Field Reference Guide for Aviation Users
5. Aviation Life Support Equipment (ALSE) Handbook

Identical Regional Director Orders #12-02 (R1) and #12-01 (R8) apply to flight operations in USFWS Regions 1 and 8, and require:
1. Refresher training accomplished every two years for B-3, Basic Aviation Training; M-3, Aviation Safety for Supervisors; and, when required, the A-312, Water Ditching and Survival course.
2. Mandatory completion of the RDO attachments: Project Safety Plan, Aviation Risk Assessment, and the GO/NO-GO checklist.

See SM 2 for Order #12-02 and the attachments listed under requirement #2. Contact the OAS, your Regional Aviation Manager, and Regional Safety Officer for the latest orders and policies that may apply to survey flights.

Aircraft

Surveys have been successfully conducted from fixed-wing airplanes and helicopters; the Survey Coordinator should choose the aircraft based on the requirements of the survey and the aircraft (and pilots) available. A DOI form AQD-91 (https://www.doi.gov/aviation/library/forms) must be submitted to [email protected] when completed. A list of approved DOI contractors or vendors and contract price listings is available at http://oas.doi.gov. The vendor must be selected and approval to fly must be obtained before the survey can occur. The aircraft must be a suitable platform for obtaining colony images of sufficient quality to make accurate bird and nest counts. Other considerations include the following:
- Cost
- Potential for wildlife disturbance
- Aircraft range relative to survey distances and available fuel stations
- Aircraft minimum airspeed
- Aircraft photography ports (belly port recommended; do not photograph through closed windows)
- Passenger space
- Consistency with previous surveys
- Available pilots’ familiarity with the survey area and methods

Unless a belly port is used, fixed-wing aircraft must be high-wing so there is an unobstructed view downward. Photography should occur through open windows or belly ports rather than through window panes, but check with the pilot about safely opening windows. Photography through side windows typically requires flying in tight circles and will likely result in oblique colony views that may obscure birds and nests. Belly ports permit photography from directly above the colony with level flight, allowing straight line transects which are safer and easier on pilot and passengers. Photography from a helicopter is greatly facilitated by removal of the passenger door(s) to provide an unobstructed view of the colonies. However, optimal aircraft performance (and associated costs), crew safety, and comfort occur with the door installed. Since coastal surveys typically involve round trips over the same coast where photography occurs on only one of the two legs of the route, consider making arrangements for ground transfer of the door during the photographic leg so it is available for flight use on the non-photographic or cruising leg.

Safety preparations

The SSP must list all personal protective equipment (PPE) required for the survey crew according to specifications in the Basic Aviation Safety Manual (https://www.doi.gov/sites/doi.opengov.ibmcloud.com/files/uploads/Basic_Aviation_Safety_2013.pdf) and the ALSE Handbook (https://www.doi.gov/sites/doi.opengov.ibmcloud.com/files/uploads/Aviation_Life_Support_Equipment_Handbook_2008.pdf). In addition, crew members should have training in proper PPE use. The manual and handbook include lists of all required and recommended PPE based on the characteristics of the mission and aircraft.

According to RDO #12-02 (SM 2), each survey mission should be described in a Project Aviation Safety Plan approved by the Project Leader, which is usually prepared by the Survey Coordinator in consultation with the pilot before the mission. Included in this Plan will be a completed Aviation Project Risk Assessment Worksheet, wherein the risk categories associated with the mission are scored and risk mitigation is described. Copies of the Safety Plan form and Risk Worksheet are in SM 2, and example completed forms are in SM 3. Seabird survey flights should not operate below 500 feet above ground level, or otherwise require high-risk maneuvers classifying them as “Special Use Activity” missions (see https://www.doi.gov/sites/doi.opengov.ibmcloud.com/files/uploads/Field_Reference_Guide_2014.pdf for the complete definition). The SSP should describe how the information entered on these forms reflects the local survey conditions (e.g., hazards along the flight path, airports available for emergency landing or fuel), and the forms should be updated regularly with the latest contact information for all first responders and with advances in safety procedures and technology. Finally, the safety plan should include a completed Aviation Mishap Response Guide and Checklist (https://www.doi.gov/aviation/safety/iamrgc) that is updated for each mission and reviewed by all survey participants and the Project Leader prior to the flight. A copy of the safety plan and mishap response form should be given to the Flight follower just before the flight, and filed after the flight for future reference. Note that any safety forms that include personally identifiable information (PII, e.g., home phone numbers) should have the PII redacted before being made publicly available.

It is important to distinguish between weather conditions for safe flight and acceptable survey weather for obtaining quality photographs, which is more restrictive. Minimum conditions for photography typically include a 1,000-foot cloud ceiling or higher, 3 miles horizontal visibility, and wind speeds below 40 knots with low turbulence. However, the SSP should set minimum standards based on local knowledge of seasonal weather patterns and the reliability of weather predictions. During the survey flight, any unplanned deviation from the flight plan, as well as planned takeoffs and landings for refueling or other reasons, should be communicated to the Flight follower as soon as possible.
Emergency response procedures should be initiated by the Flight follower within 15 minutes of an unscheduled stop and a failed attempt to contact the crew. Develop a step-by-step instruction list for the Flight follower to use in an emergency, based on recommendations of local emergency services agencies (i.e., U.S. Coast Guard, State Police, and local FAA office). The instruction list should be reviewed with all parties involved before each flight so it is clear what steps to take in case of an emergency. A debriefing should be held after the flight is completed to obtain suggestions from the crew for improving the survey and safety procedures.
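As an illustration only, the photographic-weather minimums discussed above (1,000-foot ceiling, 3 miles visibility, winds below 40 knots) can be encoded as a simple pre-flight check; the function name and default thresholds are assumptions, and an SSP would substitute its own locally determined standards.

```python
def photo_weather_go(ceiling_ft, visibility_mi, wind_kt,
                     min_ceiling_ft=1000, min_visibility_mi=3, max_wind_kt=40):
    """Return True only if all photographic-weather minimums are met.

    Defaults reflect the typical minimums stated in this SOP; the SSP
    should substitute locally appropriate values.
    """
    return (ceiling_ft >= min_ceiling_ft
            and visibility_mi >= min_visibility_mi
            and wind_kt < max_wind_kt)

# A marginal day: high ceiling and good visibility, but strong wind.
print(photo_weather_go(ceiling_ft=2500, visibility_mi=10, wind_kt=42))  # → False
```

Note that passing this check covers only photographic quality; the GO/NO-GO decision for flight safety remains a separate, joint judgment with the pilot.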

Equipment

Table SOP 1.1 lists the equipment required, other than PPE and other safety equipment. See the Basic Aviation Safety Manual and ALSE Handbook for current lists of safety equipment, and include those in the SSP.

Field methods

Pre-survey preparations: Aerial surveys are complex missions that involve coordination of many participants and resources, typically requiring planning and preparations months before any field work can occur. The survey must occur during the period when the COMU and BRAC breeding seasons overlap. Therefore, the SSP should include a work plan (Table SOP 1.2) with the schedule of all necessary activities, and assign a Survey Coordinator responsible for administering the schedule. Pilots and aircraft often need to be reserved and DOI certified months before the survey, and blocks of dates should be reserved to include enough days to complete the survey plus extra days for weather or other unanticipated delays. The schedule should consider the availability of certified aircraft with proper equipment (e.g., flotation) and of pilot and survey personnel training courses to assure that all required certifications are current for the survey. The schedule should identify all participants by role, name, and certification status; specify when they will be needed and when their availability has been confirmed; and list potential alternates if participants become unavailable. Table SOP 1.3 is a suggested template for tracking crew members’ training status. The schedule should also include an inventory of all supplies and equipment that will be needed for the survey, prepared far enough in advance to procure anything missing or to accomplish replacements or repairs.

Table SOP 1.1. Example of a checklist of equipment needed for aerial surveys of COMU and BRAC. This list should be modified for consistency with the requirements of the SSP, and does not include PPE or other safety equipment. See the Basic Aviation Safety Manual and Aviation Life Support Equipment (ALSE) Handbook for current lists of safety equipment, and include those in the SSP.

Survey Equipment | Qty. | Specifications | Comments
Camera body | 3 | Digital SLR, 21 megapixel (minimum), with AutoFocus, Image Stabilized | 2 plus 1 spare; synchronize clock with GPS unit
Camera lens | 1 (2) | 100-300 mm zoom, f/4, AF | For close-up photos; 2 if overviews not needed
Camera lens | 1 | 70-200 mm zoom, f/4, AF | For overview and close-up
Camera batteries | 4 | Fully charged | Minimum 1 spare for each camera, or more if needed
Camera memory cards | 6 | 8 GB or larger | 3/camera or more if needed
Camera bag | 1 | Suitable for all cameras and accessories | If there is room on the aircraft
Lens cleaner kit | 2 | Cleaner fluid, paper, dust brush | If there is room on the aircraft
Cell phone | 1 | Fully charged; connected to com system | For contact with Flight follower
Flight suit, helmet, gloves, leather boots (helicopter) | 2 | NOMEX or equivalent coveralls; see ALSE Handbook | Match sizes with personnel; check compatibility of helmet com wiring with ship’s system
Electrical tape | 1 | Roll | Use to tape over unused camera controls
Maps | 1@ | 1:24,000 maps of all colonies | Use as reference for colony number/name and navigation; 1 per photographer
Clipboard | 1 | For maps, flight log | Rubber bands or clips to hold paper
Pencil | 4 | Mechanical | Or colored pen; attached to clipboard
GPS unit | 1 | Loaded with local maps | Synchronize clock with camera
Notebook | 2 | Lined weatherproof paper | For field notes, photo log for each photographer
Personal supplies | 1@ | Snacks, water, anti-nausea and headache medications, ear plugs |

Table SOP 1.2. Example work plan for aerial surveys. The task list should be modified to conform to the SSP. Replace time periods with actual dates once the survey date is established. Use the Status column to record progress toward completing each task.

Task | Person(s) responsible | Start by (time before survey) | Complete by (time before survey) | Status | Comments
1. Develop/update site-specific protocol (SSP) for survey; determine cost | Survey Coordinator, Site Manager | 12 months | 6 months | ______ | Follow guidelines in National Protocol Framework and SOP
2. Obtain SSP approval/survey funding | Project Leader | 6 months | 2 months | ______ | Work with PSP Coordinator
3. Submit Form AQD-91; schedule pilot and aircraft | Survey Coordinator | 6 months | 1 month | ______ | Reserve window allowing for weather delays
4. Identify survey crew; schedule required safety training | Survey Coordinator | 6 months | 1 month | ______ | Check required courses and availability
5. Obtain survey equipment, data forms | Survey Coordinator | 6 months | 1 month | ______ | See SSP, PPE checklists
6. Prepare flight/safety plan | Survey Coordinator | 1 month | 1 week | ______ |
7. Identify flight follower | Survey Coordinator | 1 month | 1 week | ______ | Include alternates
8. Train survey crew on survey methods | Survey Coordinator | 1 month | 1 week | ______ | Include alternates
9. Confirm certified aircraft/pilot availability | Survey Coordinator | 2 weeks | 2 weeks | ______ |
10. Confirm survey crew, Flight follower availability | Survey Coordinator | 7 days | 2 days | ______ | Inform alternates as needed
11. Check weather forecast | Survey Coordinator, Pilot | 5 days | Day of | ______ | Adjust schedule if needed
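The Table SOP 1.2 caption asks that relative lead times be replaced with actual dates once the survey date is set; the sketch below shows one way to do that conversion, with the task names and the day-count approximations of the "months"/"weeks" offsets chosen purely for illustration.

```python
from datetime import date, timedelta

survey_date = date(2018, 6, 15)  # hypothetical survey date

# Tasks as (name, start-by offset, complete-by offset) in days before
# the survey, approximating the work plan's month/week lead times.
tasks = [
    ("Develop/update SSP", 365, 180),
    ("Submit Form AQD-91; schedule pilot and aircraft", 180, 30),
    ("Prepare flight/safety plan", 30, 7),
    ("Check weather forecast", 5, 0),
]

# Map each task to its (start-by date, complete-by date).
schedule = {
    name: (survey_date - timedelta(days=s), survey_date - timedelta(days=c))
    for name, s, c in tasks
}
for name, (start, finish) in schedule.items():
    print(f"{name}: start by {start}, complete by {finish}")
```

Shifting `survey_date` recomputes every deadline, which is convenient when the survey window moves for weather.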


Table SOP 1.3. An example format for documenting survey crew members’ flight training status for each scheduled survey. This should be modified to conform to current training requirements and crew composition according to the SSP.



Survey procedures: Even for experienced small-aircraft flyers, the boarding process can be hectic and stressful, and the more details the SSP includes for this phase of the process, the less likely the crew will get up in the air and discover a critical item was overlooked. Here is a suggested day-of-survey checklist for the Survey Coordinator:
1. Consult weather report and conditions at area airports, including fuel availability
2. Consult with pilot about weather suitability, aircraft readiness (pilot completes pre-flight inspection), and fuel availability where needed
3. Make joint preliminary GO/NO-GO decision (if NO GO, fix problem or abort mission)
4. Contact survey crew and Flight follower to inform them of status; update flight plan, if necessary
5. Assemble crew at takeoff location; review equipment against both survey and safety equipment lists; and confirm all equipment is present and serviceable
6. Review mission plan with the pilot; crew members view pilot certification card and aircraft inspection record
7. Entire crew reviews GO/NO-GO list; if NO GO, resolve problem or abort mission
8. Contact Flight follower with estimated time of takeoff
9. Board aircraft; pilot conducts final aircraft check and final safety review with crew; pilot activates the flight plan. All crew members are secured with approved restraint systems and all PPE is used.
10. Crew member turns on GPS, obtains satellite connection, and photographs the GPS unit showing date, time, and location (the first photos taken). The GPS unit is safely affixed to the aircraft in a location where it will maintain continuous satellite connection.
11. Take off to first colony; crew prepares cameras and GPS, and gets in position to photograph
12. Photographers continually communicate with pilot about colony approaches, altitude, number of passes, observations of bird flushing, and next colony heading
13. Photographers frequently check camera settings, photo quality, battery, and memory card status
14. Record order of colonies surveyed, time at each colony, altitude, deviations from flight plan, and other notable observations (see SM 4, Flight Log form)
15. Maintain scheduled contacts with Flight follower via radio or cell phone; contact Flight follower ASAP about flight plan deviations or unscheduled landings
16. Complete survey and return to base; notify Flight follower and Supervisor of landing at base airport
17. Upon landing, turn off all devices (e.g., cameras, GPS)
18. Transfer photo files and GPS log into labeled folders ASAP; recharge camera batteries; and review and verify written notes with crew
19. Complete survey metadata form (see Element 4 for list of metadata)

Standard procedure in OR and CA includes having a second photographer who takes images of the same colonies as the primary photographer, but with a wider-angle lens for more overview and orientation shots. In WA, both photographers take duplicate close-ups. The SSP defines the roles of each photographer, and the photographic equipment should correspond to those roles (see equipment checklist). In the case of two photographers with separate roles, the close-up photographer should take the position in the aircraft with the best and most complete view (or have access to the belly port). The second photographer is responsible for taking overview or orientation photographs and supplemental close-up photographs of the colony, with the lower and higher focal lengths of a zoom lens (e.g., 70 to 200 mm), respectively. The supplemental close-ups can help assure coverage of large colonies and serve as a backup if there is a problem with the close-up photographer’s camera. In all cases, the close-up photographer(s) is responsible for obtaining complete photographic coverage of each colony with a telephoto lens, usually 200 mm or 300 mm focal length. The lens selected must produce close-up images of sufficient magnification and clarity to permit counting individual COMU. All colony photographs should be taken as near perpendicular to the colony surface as possible.

Camera settings should be appropriate for the conditions; typical settings are aperture priority mode with the f-stop set to 5.6 (or as low as the lens allows) and ISO set to 800. To prevent accidental changes in camera settings during the survey that result in low-quality images, controls that will not be intentionally changed should be taped over before the flight. All cameras should be set to record accurate time stamps on the images, with geo-referencing features activated, if applicable. All equipment needs to be secured in the cabin during flight in aircraft with windows or doors removed. At least one spare camera body and telephoto lens should be onboard in case of malfunction, and extra camera batteries and data cards are essential. Consider using a portable GPS unit that is synchronized with the camera time to record the entire survey route so any questionable images can be cross-checked for the time and location of collection.
In addition to the aircraft’s navigation system, hard copies of maps with the survey route clearly indicated can be helpful for communication with the pilot. Survey routes have been established by the state survey stations and should be incorporated into the flight plan. See the equipment checklist, above. Fixed-wing aircraft may have room for a third crew member, who assumes the roles of data recorder and navigator. The data recorder keeps a flight log (SM 4) of times, altitudes, and photographs taken at each colony, along with any other helpful notes, as well as a checklist of colonies to photograph. He/she may also help photographers exchange camera batteries, data cards, and lenses. Once on the survey route, there should be constant communication between the Survey Coordinator and the Pilot about how each colony will be approached (e.g., heading, altitude) and photographically covered, when the photographers are satisfied they have covered the colony, and what the next heading will be. To prevent accidental skipping of a colony, careful attention to the planned route is needed, especially if there is a deviation from the plan for any reason. The cameras should be checked frequently during the aerial survey for image quality, battery level, and memory card status.
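The camera-GPS time synchronization recommended above lets questionable images be cross-checked against the recorded route. As a minimal sketch, assuming a hypothetical track log of (timestamp, latitude, longitude) fixes, the fix nearest an image's timestamp can be found like this:

```python
from datetime import datetime

# Hypothetical GPS track log: (timestamp, latitude, longitude) fixes.
track = [
    (datetime(2018, 6, 15, 9, 0, 0), 44.676, -124.079),
    (datetime(2018, 6, 15, 9, 0, 30), 44.669, -124.080),
    (datetime(2018, 6, 15, 9, 1, 0), 44.662, -124.081),
]

def locate_image(track, image_time):
    """Return the GPS fix closest in time to an image's timestamp.

    Meaningful only if the camera clock was synchronized with the GPS
    unit, as this SOP recommends.
    """
    return min(track, key=lambda fix: abs((fix[0] - image_time).total_seconds()))

fix = locate_image(track, datetime(2018, 6, 15, 9, 0, 40))
print(fix[1], fix[2])  # → 44.669 -124.08
```

With a 1-second track interval, as many GPS loggers record, nearest-fix matching is usually sufficient; interpolation between fixes is an optional refinement.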

Post-flight procedures: At the end of the flight, the Survey Coordinator should complete the survey metadata field form, which records all the important information about the survey flight (see Element 4 for list), and it should be checked for accuracy by another crew member. All images on the camera and GPS data cards should be copied to file folders labeled by colony, date, and camera on a secure hard drive as soon as possible.
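As a sketch of the labeled-folder convention just described, the following copies image files into a folder named by colony, date, and camera; the naming pattern and function name are illustrative assumptions, not part of the protocol.

```python
import shutil
from pathlib import Path

def archive_images(src_files, archive_root, colony, survey_date, camera):
    """Copy image files into a folder labeled by colony, date, and camera."""
    dest = Path(archive_root) / f"{colony}_{survey_date}_{camera}"
    dest.mkdir(parents=True, exist_ok=True)
    for f in src_files:
        shutil.copy2(f, dest)  # copy2 preserves the original file timestamps
    return dest

# Hypothetical usage:
# archive_images(card_files, "E:/surveys/2018", "OR-243", "2018-06-15", "cam1")
```

Copying (rather than moving) leaves the memory cards intact as a second copy until the archive is verified.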



Data quality assurance

Factors potentially affecting data quality (i.e., colony photos, metadata) include the capabilities and functionality of the cameras and lenses, photographer skill, flight and weather conditions, light conditions, accuracy and completeness of metadata, and the photo downloading and labeling process. Table SOP 1.4 lists these factors and how they are controlled to maximize data quality.

Table SOP 1.4. Factors affecting aerial survey data quality and measures taken to assure high quality.

Factor | Quality assurance measures
Camera, lens, and accessories | Minimum capabilities are specified; settings are specified and taped in place or frequently checked; lens cleaned as needed; spare camera body, batteries, and memory cards available during survey
Photographer skill | Pre-survey training in camera operation and survey objectives; diligence about proper perspective and coverage at each colony; redundant photos taken
Flight conditions | Standard altitude, ground speed; coordination with pilot about mission objectives, orientation to colonies, time allotted to photograph each colony
Weather and light conditions | Minimum survey weather and time-of-day parameters specified and adhered to
Metadata accuracy | Pre-printed forms with labeled spaces for data; completed forms checked by different crew members; electronic data entry double checked; timely record keeping
Accuracy of photo labels | Downloaded chronologically by colony number, date, and photographer; checked against GPS record for time and location synchrony



SOP 2: Standard Operating Procedure for conducting boat-based surveys of common murre and Brandt’s cormorant colonies within the California Current System

Revision History Log

Previous version number | Revision date | Author | Changes made | Section and paragraph number | Reason | Approved by | New version number

Revisions to this SOP will be logged here. Version numbers increase incrementally by tenths (e.g., 1.1, 1.2) for minor changes. Major revisions should be designated with the next whole number (e.g., 2.0, 3.0, etc.).

Purpose

This SOP describes minimum standards and required procedures for conducting boat-based surveys of breeding colonies of common murres (COMU) and Brandt’s cormorants (BRAC). Surveys using this SOP will generate population and other observational data for these two surface-nesting seabird species consistent with the National Protocol Framework.

Application

This SOP should be used by any entity (National Wildlife Refuge or other base of operations) developing a Site-Specific Protocol (SSP) for conducting boat-based population and behavioral surveys of seabird colonies within the California Current System (CCS), which extends along the West Coast of North America from southern British Columbia to northern Mexico. Although most COMU and BRAC breeding surveys will be primarily conducted using aircraft, certain areas (e.g., Salish Sea) have been and likely will continue to be surveyed from boats due to cost-effectiveness and the ability to collect information in addition to simple counts (e.g., breeding phenology, behavioral observations) with relatively simple logistics. Information gathered from boat-based surveys following this SOP will supplement and contextualize breeding bird and nest counts derived from aerial surveys.

Personnel roles and training

This SOP is not intended to substitute for or supersede the official documents that describe policies and regulations, and it is incumbent on all personnel involved with boat-based surveys to be familiar and in compliance with the latest versions of those official documents. For boat-based surveys, compliance with official DOI (http://elips.doi.gov/ELIPS/DocView.aspx?id=1624) and USFWS (https://nctc.fws.gov/courses/programs/watercraft-safety/documents/241fw1.pdf) watercraft operations policies is mandatory, and the SSP should incorporate all elements of the most recent policies that apply to the survey. The DOI Motorboat Operator Certification Course (MOCC) Student Manual is available at https://training.fws.gov/courses/programs/watercraft-safety/MOCC-StudentManualv.2016-12-20.pdf, and other instructional materials are available at https://nctc.fws.gov/courses/programs/watercraft-safety/resources.html.

Monitoring Common Murre and Brandt’s Cormorant Colonies Ver. 1.0

All field personnel should be familiar with the survey objectives, proficient in the field methods and procedures used to conduct the survey, have detailed knowledge of COMU and BRAC field identification and behaviors, and clearly understand their specific duties and roles during the survey. Boat-based surveys typically include the following personnel:

Survey Coordinator: Typically, this is the Refuge Manager or Biologist; this person is responsible for all aspects of the survey planning and preparation, safety procedures, implementation, data quality assurance and management, data analyses, reporting, selection of the field crew; assures all crew members are qualified and adequately trained; and prepares a survey Float Plan in coordination with the Boat Operator. The Survey Coordinator may also be the Boat Operator.

Biological technician(s): Assists the Survey Coordinator, as directed, with any aspect of the survey preparations, implementation, and post-survey activities; responsible for knowing and adhering to all activity schedules, survey protocols and safety procedures; and may also supervise project volunteers or assistants.

Boat Operator: Holds current MOCC and Open Water Training certifications; is familiar with the survey area boating conditions; assures safety plan compliance including demonstrating the use of safety equipment and PPE to crew; completes an Operational Risk Assessment Evaluation with crew input (see MOCC Student Manual); operates the boat in compliance with existing policy, guidelines, and training; and submits the Float Plan developed with the Survey Coordinator (form USFWS 3-2227) to the Trip follower before embarking.

Trip follower: Receives the Float Plan from the Survey Coordinator or Boat Operator before the survey crew embarks; is available on shore to receive scheduled contacts with the survey crew; and contacts the U.S. Coast Guard, State Police, or other appropriate authorities in emergencies according to the Float Plan.

Supervisor: Refuge Manager or Project Leader who authorizes and approves the Survey and Safety Plan and assures compliance with all applicable DOI and USFWS policies.
Support personnel: Other persons who are not directly conducting the survey, but may contribute to its planning and execution, including administrative staff, Safety Officer, maintenance staff, Zone I&M Biologist, and PSP staff.

Type of watercraft

The SSP will specify whatever type of boat is appropriate for the local conditions and provides a safe platform from which high-quality observations that meet the survey objectives can be made. DOI 485 DM Chapter 22 Safety Policy applies to all… “… watercraft for which the DOI is responsible (e.g., watercraft the DOI owns, borrows, rents, or leases), anyone on board watercraft for which the DOI is responsible, and DOI personnel conducting official duties on watercraft regardless of ownership. Employees performing official duties on commercially licensed watercraft (ferries, tour boats, commercial vessels, etc.) will abide by established maritime standards for those vessels, orders issued by the captain of the vessel and in accordance with all relevant safety standards and authorities noted in this chapter.

… Contractors are not within the scope of this chapter but must comply with the safety and health clauses in their contract agreement and with Federal, State, and local watercraft requirements.”

Applicable policies that address requirements for watercraft certification, safety and navigation equipment, and maintenance procedures should be cited in the SSP.

Pre-survey logistics and preparation
Boat-based surveys are complex missions that involve coordination of multiple participants and resources, typically requiring planning and preparation months before any field work can occur. Therefore, the SSP should prescribe a format for a work plan (e.g., Table SOP 2.1) that itemizes all necessary activities and their timing in preparation for each survey. Boats may need to be reserved months before the survey, and blocks of dates should be reserved to include enough days to complete the survey plus extra days for weather or other delays. The plan should consider the availability of MOCC and Open Water training courses to assure that the Operator’s certifications are current for the survey. Consult the latest version of the MOCC Student Manual for certification renewal requirements. Table SOP 2.2 is a template for tracking the Boat Operator’s training status.
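Once a survey window is fixed, the time-before-survey offsets in a work plan like Table SOP 2.1 can be back-calculated into concrete calendar dates. A minimal sketch of that arithmetic; the task names and offsets below are illustrative, not prescribed by this PF:

```python
from datetime import date, timedelta

# Illustrative tasks: (task, days before survey to start, days before survey to complete)
TASKS = [
    ("Develop/update SSP", 365, 180),
    ("Schedule Boat Operator and aircraft", 180, 60),
    ("Prepare float/safety plan", 30, 7),
    ("Check weather forecast", 5, 0),
]

def work_plan(survey_date: date):
    """Back-calculate a start-by and complete-by date for each task."""
    return [
        (task, survey_date - timedelta(days=start), survey_date - timedelta(days=done))
        for task, start, done in TASKS
    ]

plan = work_plan(date(2019, 6, 15))
for task, start_by, complete_by in plan:
    print(f"{task}: start by {start_by}, complete by {complete_by}")
```

A spreadsheet does the same job; the point is that the offsets in the work-plan table become hard deadlines as soon as the boat reservation dates are known.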

Designing the survey: The Survey Float Plan route will be developed based on which colonies are to be surveyed, their geographical configuration, tidal and daily wind considerations, locations of observation points relative to each colony, and the proximity of boat ramps to the colonies. Therefore, the first step is to select the colonies based upon how the local survey fits into the sampling frame as described in Element 2, and in coordination with the PSP staff. For the purposes of the Partner conducting the survey, the survey may include species and colonies in addition to the COMU and BRAC colonies targeted by the sampling frame.

In most colonies, it will not be possible to count all birds or nests of these species from a boat because parts of the colonies will be out of sight, or only visible at oblique angles where closer birds or topography are obscuring more distant ones. Therefore, only portions of colonies will be counted, and the exact boundaries of the count areas will need to be determined and documented to permit consistent repeated counts of the same areas. Documentation of count areas should include a unique count area name and descriptive narratives of each, accompanied by annotated photographs taken from designated (by GPS coordinates) observation points (Figure SOP 2.1). These descriptions must be taken into the field for reference. Count area data will be most useful when interpreted in the context of replicate counts of the same areas and counts derived from aerial photography if available, as described in the PF. In addition to area counts, data collected during boat-based surveys may include breeding phenology (timing of incubation, hatched chicks, or fledging), observations of predators or other sources of disturbance, numbers of birds on adjacent waters, or condition of island vegetation and soils. See SM 4 for a suggested field data form template.
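Because only portions of colonies are counted, the replicate counts of the same named count areas are what make the data interpretable. A minimal sketch of summarizing replicates per count area, using the mean as an index of relative abundance and the coefficient of variation as a repeatability check; the count area names and values here are hypothetical:

```python
from statistics import mean, stdev

# Hypothetical replicate counts of birds per named count area
replicates = {
    "DSR-01-UPPER": [412, 398, 420],
    "DSR-01-LOWER": [155, 149, 160],
}

def summarize(counts):
    """Mean count and coefficient of variation (CV, %) per count area."""
    out = {}
    for area, vals in counts.items():
        m = mean(vals)
        cv = 100 * stdev(vals) / m if len(vals) > 1 and m > 0 else None
        out[area] = {"n": len(vals), "mean": round(m, 1), "cv_pct": round(cv, 1)}
    return out

print(summarize(replicates))
```

A low CV across replicates suggests the count area boundaries and observation points are being applied consistently between visits.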

Figure SOP 2.1. An example of an annotated photograph showing named portions of an island wherein birds and nests are tallied separately. (Courtesy G. McChesney, USFWS)

Table SOP 2.1. Example work plan for boat-based surveys. The task list should be modified to conform to the SSP. Replace time periods with actual dates once the survey date is established. Use the Status column to record progress toward completing each task.

Task | Person(s) responsible | Start by (before survey) | Complete by (before survey) | Status | Comments
1. Develop/update site-specific protocol (SSP) for survey; determine cost | Survey Coordinator, Site Manager | 12 months | 6 months | | Follow guidelines in National Protocol Framework and SOP
2. Obtain SSP approval/survey funding | Project Leader | 6 months | 2 months | | Work with PSP Coordinator
3. Schedule Boat Operator and aircraft | Survey Coordinator | 6 months | 2 months | | Reserve window allowing for weather delays
4. Identify survey crew; schedule required safety training | Survey Coordinator | 6 months | 1 month | | Check required courses and availability
5. Obtain survey equipment, data forms | Survey Coordinator | 6 months | 1 month | | See SSP, PPE checklists
6. Prepare float/safety plan; risk assessment (GAR) | Survey Coordinator | 1 month | 1 week | |
7. Identify trip follower | Survey Coordinator | 1 month | 1 week | | Include alternates
8. Train survey crew on survey methods | Survey Coordinator | 1 month | 1 week | | Include alternates
9. Confirm Operator/boat availability | Survey Coordinator | 2 weeks | 2 weeks | |
10. Confirm survey crew, trip follower availability | Survey Coordinator | 1 week | 2 days | | Inform alternates as needed
11. Conduct shake-down cruise | Boat Operator | 1 week | 1 week | | Conduct necessary maintenance
12. Check weather forecast | Survey Coordinator, Boat Operator | 5 days | Day of | | Adjust schedule if needed

Table SOP 2.2. Suggested template for tracking the Boat Operator’s training certifications prior to each scheduled survey. Modify appropriately to conform to the latest training requirements and certification schedules.

Safety preparations: The SSP should detail how the survey plan will comply with all safety considerations described in the 241 FW 1 Watercraft Safety policy, including providing PPE and exposure survival equipment for all crew, instructions for its proper use, and opportunities to practice using it. The SSP should also include procedures for completing a Survey Float Plan, to be given to a designated trip follower, and an Operational Risk Assessment Evaluation. Descriptions and examples of both of these documents are in the MOCC Student Manual.

Acceptable weather and ocean conditions for conducting surveys should be explicit in the SSP, based on coordination between the Boat Operator and the Survey Coordinator and on knowledge of local weather patterns and prediction reliability. Thresholds of wind, wave, and visibility parameters that would warrant abandonment of the survey mission should be specified in the SSP. These thresholds should be set with consideration of the conditions necessary for high-quality survey observations, as well as safety.
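Writing the thresholds down explicitly makes the go/no-go decision mechanical rather than ad hoc. A minimal sketch; the threshold values below are illustrative and an SSP would substitute site-specific limits:

```python
# Illustrative abort thresholds; an SSP would set site-specific values
THRESHOLDS = {"wind_kt": 15, "wave_ft": 4, "visibility_nm_min": 1.0}

def go_no_go(wind_kt, wave_ft, visibility_nm):
    """Return (decision, reasons): 'NO GO' if any threshold is exceeded."""
    reasons = []
    if wind_kt > THRESHOLDS["wind_kt"]:
        reasons.append(f"wind {wind_kt} kt exceeds {THRESHOLDS['wind_kt']} kt")
    if wave_ft > THRESHOLDS["wave_ft"]:
        reasons.append(f"waves {wave_ft} ft exceed {THRESHOLDS['wave_ft']} ft")
    if visibility_nm < THRESHOLDS["visibility_nm_min"]:
        reasons.append(f"visibility {visibility_nm} nm below {THRESHOLDS['visibility_nm_min']} nm")
    return ("NO GO" if reasons else "GO", reasons)

print(go_no_go(wind_kt=10, wave_ft=3, visibility_nm=5))
print(go_no_go(wind_kt=20, wave_ft=3, visibility_nm=5))
```

This complements, and does not replace, the GAR risk assessment made jointly by the Survey Coordinator and Boat Operator.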

The boat should be inspected and current on all scheduled maintenance, and taken on a test cruise before the survey if it is not used regularly. The SSP will include a list of all watercraft equipment, survey equipment, and supplies needed to conduct the survey as a reference for the Survey Coordinator’s preparations. Survey crew members should have the opportunity to become familiar with the basic operation of the boat and its equipment in preparation for an emergency.

Survey equipment: The SSP should include a complete list of the equipment and supplies that will be needed to conduct the survey. The characteristics of observations to be made on the survey will determine which optical equipment (binoculars, spotting scopes, range finders, cameras) is appropriate. Replacement units of any equipment essential to data collection should be on board in case of malfunction, loss overboard, or damage. Although colony counts made from boats typically do not require a systematic photographic record, photography for other records is recommended, so include an appropriate camera and lens. Boat surveys will require navigation charts, colony maps, field data forms (including descriptions of all colony count areas), field notebooks, and possibly electronic data recording devices.

Table SOP 2.3 lists suggested survey equipment; it does not include PPE or other required safety equipment. See the MOCC Student Manual for current lists of required and recommended safety equipment, and include those in the SSP.

Field data collection procedures
The SSP should include a procedural checklist for tasks to be done on the day of the survey (Table SOP 2.3). The decision to conduct the boat survey is made after reviewing weather and sea conditions and boat status. The trip follower is then notified that the Float Plan is in effect, the crew runs through its safety and equipment check, and the survey route is reviewed by the Boat Operator and Survey Coordinator. Typically, the safety plan includes checking in with the trip follower on a regular schedule (hourly, if possible), whenever a deviation from the Float Plan occurs, and when the survey is complete.

It is the responsibility of the Survey Coordinator to assure that the field method protocols, as detailed in the SSP, are implemented correctly as observations are made and recorded. The Survey Coordinator should have trained and practiced field observation methods with all survey personnel before the survey, and should closely monitor inexperienced crew members during the survey for proper technique and count accuracy. Multiple observers conducting counts will need to communicate clearly and frequently about how they are dividing up the counts before and after observations are made at a colony site. Use mechanical or electronic tally counters to keep track of counts. If there are separate observers and data recorders, the observers should review the recorded data immediately after each colony is counted to assure accuracy. Field data forms should be water-resistant and include space for survey metadata to be recorded at the time each colony is counted (see SM 4 for an example).

Table SOP 2.3. Example checklist of equipment that may be needed for boat-based surveys of COMU and BRAC. This list should be modified for consistency with the SSP requirements, and does not include PPE and safety equipment; see the MOCC Student Manual.

Survey Equipment | Qty. | Specifications | Comments
Camera body | 2 | Digital, SLR, 21 megapixel, AF, IS | Synchronize clock with GPS unit
Camera lens | 1 | 70-200 mm zoom, f/4, AF | Consider 100-400 zoom lens
Camera batteries | 2 | Fully charged | 1 spare for each camera, or more if needed
Camera memory cards | 3 | 8 GB or larger | 1/camera plus spare
Camera bag | 1 | Suitable for all cameras and accessories |
Lens cleaner kit | 2 | Cleaner fluid, paper, dust brush |
Spotting scope(s) w/tripod | 1(2) | 20-60 zoom w/weather cover | Or as specified in the SSP
Binoculars | 1/crew member | 10X42 w/image stabilizer; waterproof | Consider higher power w/IS
Tally counter | 1/crew member | 1-9999 minimum capacity |
Charts and maps | 1@ | Waterproof 1:24,000 maps of all colonies; most recent charts showing navigation aids and hazards | Use as reference for colony number/name and navigation
Rainproof clipboard | 1@ | | For maps and data forms
Pencil | 4 | Mechanical | Attach to clipboard
GPS unit | 1 | Loaded with local maps | Synchronize clock with camera; record observation locations
Range finder | 1 | Electronic laser; extra battery | Measure distance to colony
Notebook, data forms | 2 | Lined weatherproof paper | For field notes, observations, photo log
Field guides | 1@ | Seabirds; marine mammals | Others according to interest
Personal supplies | 1@ | Food, water, anti-nausea medications, rain gear, gloves, extra warm layers, sun lotion, sun glasses, rubber boots |

Survey procedures: A procedural checklist for the day of the survey will help assure that all critical steps are taken, such as the following:
1. Consult weather and marine conditions forecast
2. Consult with Boat Operator about forecast suitability and boat readiness
3. Make joint (Leader and Operator) preliminary GAR risk assessment decision (see MOCC Student Manual); if amber or red, mitigate risk, fix problem, or abort mission
4. Contact survey crew and trip follower to inform them of GO/NO GO decision; update Float Plan if necessary
5. Assemble crew at launch site; review equipment against both survey and safety equipment lists; confirm all equipment is present and serviceable, and crew is competent to use it
6. Review mission plan with the Boat Operator
7. Entire crew reviews GAR assessment; if amber or red, mitigate risk, fix problem, or abort mission
8. Board boat; Operator conducts safety review with crew, including location and use of all safety equipment
9. Cruise to first colony observation point using GPS; crew prepares optical equipment and data forms
10. Survey Coordinator and crew review the counting procedure and count area descriptions at each colony; Leader double checks inexperienced surveyors’ counts and BRAC nest categories (see Figure 3.1 of the PF Narrative), discusses discrepancies and how to adjust; data recorder completes field data forms and confirms entries with crew
11. While surveying, the Boat Operator continually communicates with the survey crew about weather and ocean conditions, personnel well-being (address seasickness), colony approaches, boat speed, holding positions, and the next heading
12. Record metadata, including order of colonies surveyed, time at each colony, GPS position, deviations from Float Plan, and other notable observations
13. Assigned person maintains scheduled (e.g., hourly) contacts with trip follower via cell phone or VHF radio
14. Note and respond to any changes in weather or boat condition that may warrant re-evaluation of GAR assessment or adjustment of the Float Plan
15. Complete survey and return to base; notify trip follower and Supervisor
16. Review and verify written notes with crew
17. Complete survey metadata form
18. Download photos and GPS log into labeled files; copy and file completed data forms
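Synchronizing the camera clock with the GPS unit (as the equipment checklist directs) pays off when photos are downloaded and filed: each photo's timestamp can be matched to the nearest fix in the GPS track log to record where it was taken. A minimal sketch, assuming the timestamps have already been extracted from the photo EXIF data and the track log; the track values are hypothetical:

```python
from datetime import datetime

# Hypothetical GPS track log: (timestamp, latitude, longitude)
track = [
    (datetime(2019, 6, 15, 8, 0), 44.677, -124.079),
    (datetime(2019, 6, 15, 8, 30), 44.661, -124.077),
    (datetime(2019, 6, 15, 9, 0), 44.640, -124.070),
]

def nearest_fix(photo_time, track):
    """Return the GPS fix whose timestamp is closest to the photo's timestamp."""
    return min(track, key=lambda fix: abs((fix[0] - photo_time).total_seconds()))

fix = nearest_fix(datetime(2019, 6, 15, 8, 40), track)
print(fix)  # the 08:30 fix is closest
```

If the clocks are not synchronized, the same matching can be done after applying the known clock offset to the photo timestamps.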

Data quality assurance
Factors potentially affecting data quality (i.e., colony counts, metadata) include knowledge and skills of field observers, adherence to clear and complete data collection protocols, specifications and condition of optical equipment, weather and light conditions, accuracy and completeness of metadata, and the data downloading and labeling process. Table SOP 2.4 lists these factors and how they are controlled to maximize data quality.

Table SOP 2.4. Factors affecting boat-based survey data quality and measures taken to assure high quality.

Factor | Quality assurance measures
Observer knowledge and skill | Experienced Survey Coordinator trains crew on protocols; crew practices before survey; crew selected based on pertinent knowledge and experience
Adherence to protocol | Detailed protocol is well documented; Survey Coordinator monitors crew during survey for fatigue or inattentiveness; field forms include instructions and explicit space for and definitions of data categories
Optical equipment | Minimum capabilities are specified; lenses cleaned as needed; spare camera body, batteries, and memory cards available during survey; personnel trained in proper use
Weather and light conditions | Minimum survey weather and time of day parameters specified and adhered to
Metadata accuracy | Field forms with labeled spaces for data; completed forms checked by separate crew members; electronic data entry double checked; timely record keeping
Accuracy of data entry | Data entry SOP followed; photos in labeled files checked against GPS record for time and location synchrony; sample of data fields double checked against field forms

SOP 3: Standard Operating Procedure for conducting land-based surveys of common murre and Brandt’s cormorant colonies within the California Current System

Revision History Log

Previous version number | Revision date | Author | Changes made | Section and paragraph number | Reason | Approved by | New version number

Revisions to this SOP will be logged here. Version numbers increase incrementally by tenths (e.g., 1.1, 1.2) for minor changes. Major revisions should be designated with the next whole number (e.g., 2.0, 3.0, etc.).

Purpose
This standard operating procedure (SOP) describes minimum standards and required procedures for conducting land-based surveys of breeding colonies of common murres (COMU) and Brandt’s cormorants (BRAC). Surveys using this SOP will generate population and other observational data for these two surface-nesting seabird species consistent with the National Protocol Framework (PF).

Application
This SOP should be used by any entity (National Wildlife Refuge or other base of operations) developing a Site-Specific Protocol (SSP) for conducting land-based population and behavioral surveys of seabird colonies within the California Current System (CCS), which extends along the West Coast of North America from southern British Columbia to northern Mexico. Land-based surveys include any counts, photography, or observations of COMU or BRAC colonies made from an observation point (OP) located on the colony island or on the adjacent mainland. Although most COMU and BRAC breeding surveys will be conducted using aircraft, colonies that can be viewed wholly or in part from an OP on land can be surveyed with much less logistical difficulty and cost, and can therefore be scheduled more frequently than aerial or boat-based surveys. Land-based surveys will most likely be conducted at certain Tier 1A colonies selected for intensive monitoring (see PF Element 2). Information gathered from land-based surveys following this SOP will supplement and contextualize breeding bird and nest counts derived from aerial surveys. However, land-based surveys will not be feasible in some large areas (e.g., the Washington outer coast) where colonies are too distant from shore or difficult to access.

Personnel roles and training
All field personnel should be familiar with the survey objectives, proficient in the field methods and procedures used to conduct the survey, have detailed knowledge of COMU and BRAC field identification and behaviors, and clearly understand their specific duties and roles during the survey. Land-based surveys typically involve the following personnel:

Survey Coordinator: Typically the Refuge Biologist or Manager; this person is responsible for all aspects of the survey planning and preparation, safety procedures, implementation, data quality assurance and management, data analyses, reporting, and selection of the field crew; assures all crew members are qualified and adequately trained; and prepares a survey schedule in coordination with the field personnel.

Biological technician(s): Assists the Survey Coordinator, as directed, with any aspect of the survey preparations, implementation, and post-survey activities; responsible for knowing and adhering to all activity schedules, survey protocols, and safety procedures; and may also supervise project volunteers or assistants.

Supervisor: Refuge Manager or Project Leader who authorizes and approves the Survey and Safety Plan and assures compliance with all applicable DOI and USFWS policies.

Support personnel: Other persons who are not directly conducting the survey, but may contribute to its planning and execution, including administrative staff, Safety Officer, maintenance staff, Zone I&M Biologist, and PSP staff.

Pre-survey logistics and preparation
Planning the survey: The selection of colonies to survey from a land-based OP will be based upon the feasibility of viewing colonies from land, and how the local survey fits into the sampling frame as described in the PF (Element 2). Land-based survey data will be most useful when interpreted in the context of replicate counts of the same areas and counts derived from aerial photography. For the purposes of the Partner conducting the survey, the survey may include collecting information on species and colonies in addition to the COMU and BRAC colonies targeted by the sampling frame.

OPs need to be selected carefully to provide a safe vantage of a representative portion of the colony. The SSP must provide detailed descriptions of each uniquely named OP including: maps, GPS coordinates, aerial photographs, written directions, and permission for access; specifications for optical equipment used; and minimum visibility requirements for conducting the survey. In most colonies, it will not be possible to count all birds or nests of these species from an OP because parts of the colonies will be out of sight, or only visible at oblique angles where closer birds are obscuring more distant ones. Therefore, land-based bird and nest counts generally comprise indices of relative abundance rather than a complete count of the colony population. The accuracy and repeatability of the observations depend on standardizing the location of the OPs and other factors that may affect the accuracy of the counts, such as optical equipment, light conditions, and careful attention to field protocols. The specific boundaries of the area to be surveyed from each OP should be described on annotated photographs of the area, and the survey area should be given a unique name (Figure SOP 3.1 and SM 5). Field data forms may need to be customized for individual OPs to accommodate OP-specific protocols and conditions. Permits or written permission from landowners of the OPs and their access routes, where necessary, need to be obtained prior to the survey, kept on file, and copies taken into the field along with descriptions and maps of OPs and access routes (see SM 5). All orientation material should be archived to be available for future replicate surveys.

[Annotated photo count-area labels: DSR-01-UPPER, DSR-01-LOWER, DSR-01-ER]
Figure SOP 3.1. An example of an annotated photograph showing named portions of an island wherein birds and nests are tallied separately. (Courtesy G. McChesney/USFWS)

In addition to area counts of birds or nests, data collected during repeated land-based surveys may include breeding phenology (timing of incubation, hatched chicks, or fledging), evidence of breeding success, observations of predators or other sources of disturbance, numbers of birds on adjacent waters, or condition of island vegetation and soils. See SM 4 for a suggested field data form template. Table SOP 3.1 is an example pre-survey work plan and schedule that can be customized for the SSP.

Scheduling: A big advantage of land-based surveys is the flexibility of scheduling permitted by their simpler logistics. Thus, they can occur more frequently, and can be readily rescheduled for weather or other contingencies. All survey schedules are anchored to the incubation phases of the COMU and BRAC breeding cycles when they overlap in the survey area, which is when population surveys must be conducted. However, multiple land-based surveys within a season can be timed to bracket the entire nesting season to identify phenological events. The frequency of surveys within the season will determine the precision with which those events are measured. Task 7 in Table SOP 3.1 refers to scheduling multiple surveys of the same colonies within a season; the survey frequency and data collected for each are to be determined by the sampling objective of the SSP. An annual schedule should identify all field personnel by name, specify when they will be needed and when their availability has been confirmed, and list potential alternates if individuals become unavailable.

Safety preparations: Safety considerations should be addressed if access to or use of the OP involves exposing the survey personnel to hazardous conditions such as parking along roads, steep slopes, high winds, unstable footing, or other falling hazards. In these cases, the SSP should prescribe a pre-survey job hazard analysis conducted by the Survey Coordinator or supervisor with the field personnel to assure hazards are mitigated with PPE, safety equipment, or special training. The SSP should also include procedures for notifying supervisors of scheduled field work (i.e., participant names, start time, locations, check-ins, and end time). Whenever practical, hazardous surveys should be conducted by at least two people.

Survey equipment: The equipment necessary for the survey will be determined by the data collection methods, but typically will include the equipment listed in Table SOP 3.2. In most cases, all equipment will have to be safely transported on foot to the OP in appropriate containers and backpacks. Setting up optics at the OP may require equipment anchors or tethers to prevent wind-toppling or falling down steep slopes; include those supplies if appropriate, and otherwise adjust the list to conform to the SSP.

Table SOP 3.1. Example work plan for land-based surveys. The task list should be modified to conform to the SSP. Replace time periods with actual dates once the initial survey date is established. Tasks 9-11 would be repeated for each survey conducted within a season. Use the Status column to record progress toward completing each task.

Task | Person(s) responsible | Start by (before survey) | Complete by (before survey) | Status | Comments
1. Estimate survey cost and obtain funding/approval | Survey Coordinator, Project Leader | 9 months | 2 months | | Work with PSP Coordinator
2. Develop/update SSP | Survey Coordinator | 9 months | 2 months | | Follow guidelines in National Protocol Framework and SOP
3. Identify survey crew | Survey Coordinator | 6 months | 1 month | | Consider seasonal hiring process
4. Obtain survey equipment, data forms | Survey Coordinator | 6 months | 1 month | | See SSP checklist
5. Scout and establish OPs; obtain written landowner permission | Survey Coordinator | 4 months | 1 month | | GPS OPs, parking, access route
6. Complete written descriptions and maps of OPs | Survey Coordinator | 2 months | 2 weeks | | Include photos from previous year, if applicable
7. Prepare schedule of surveys for each OP | Survey Coordinator | 1 month | 1 week | | Include personnel names and data to be collected
8. Train survey crew | Survey Coordinator | 1 month | 1 week | | Include alternates; visit all OPs
9. Check weather forecast | Survey Coordinator, crew leader | 5 days | Survey day | | Adjust schedule as needed
10. Confirm crew availability | Survey Coordinator, crew leader | 2 days | 1 day | | Inform alternates if needed
11. Confirm all OPs are appropriate for colony locations | Survey Coordinator | Survey day | Survey day | | Update OP instructions if needed

Table SOP 3.2. Example checklist of equipment that may be needed for land-based surveys of COMU and BRAC. This list should be modified for consistency with the SSP requirements, and does not include PPE and safety equipment.

Survey Equipment | Qty. | Specifications | Comments
Camera body | 1 | Digital, SLR, 21 megapixel, AF, IS | Set time and date stamps
Camera lens | 1 | 70-200 mm zoom, f/4, AF | For overview and close-up
Camera batteries | 2 | Fully charged | 1 plus spare
Camera memory cards | 2 | 8 GB or larger | 1 plus spare
Camera bag | 1 | Suitable for all cameras and accessories |
Backpack | 1/crew member | Suitable for transporting all field supplies to OP |
Lens cleaner kit | 2 | Cleaner fluid, paper, dust brush |
Spotting scope(s) w/tripod | 1(2) | 20-60 zoom w/weather cover, quick release mounting plate; professional grade tripod | Or as specified in the SSP
Binoculars | 1/crew member | 10X42 w/image stabilizer (recommended); waterproof |
Tally counter | 1/crew member | 1-9999 capacity |
OP directions and maps | 1/OP | Narrative plus aerial photos and annotated survey area photos |
OP and access route landowner permits | 1/OP | As needed | Updated annually
Rainproof clipboard | 1(2) | | For maps and data forms
Pencil | 2/crew member | Mechanical | Extra lead
GPS unit | 1 | Loaded with OPs, parking location, access landmarks |
Range finder | 1 | Electronic laser; extra battery | Measure distance to colony
Notebook, data forms | 2 | Lined weatherproof paper | For field notes, observations, photo log
Pruning shears | 1 | Appropriate type for access trail maintenance | Or machete; obtain landowner permission
Field guides | 1@ | Seabirds; marine mammals | Others according to interest
Personal supplies | 1@ | Food, water, rain gear, gloves, extra warm layers, sun lotion, sun glasses, rubber boots | Based on expected conditions and timing

Field data collection procedures
It is the responsibility of the Survey Coordinator to assure that the field method protocols, as detailed in the SSP, are implemented correctly as observations are made and recorded. The SSP should describe what and how data are collected at each OP, and who is responsible for verification and quality assurance. If there are separate observers and data recorders, the observers should review the recorded data immediately after each colony is counted to assure accuracy. Field data forms should be water-resistant and include space for survey metadata (see example in SM 4).

Survey procedures: A procedural checklist for the day of the survey will help assure that all critical steps are taken:
1. Consult weather and marine conditions forecast
2. Assemble crew; review equipment against equipment list; confirm all equipment is present and serviceable
3. Inform Supervisor of the plan for the day
4. Proceed to first OP; review the survey area documentation and the list of data to be collected as indicated on the field data forms, and note the time
5. Survey Coordinator regularly double checks inexperienced surveyors’ counts and BRAC nest categories (see Figure 3.1), discusses discrepancies and how to adjust; data recorder completes field data forms and confirms entries with crew
6. Multiple observers making counts communicate clearly and frequently about how they are dividing up the counts before and after observations are made
7. Use mechanical or electronic tally counters to keep track of counts
8. Record metadata for each OP visit
9. Proceed to the next OP, if planned; repeat steps 4-8
10. Complete survey and return to base; notify Supervisor
11. Review and verify written notes with crew
12. Download photos into labeled files; copy, scan, and digitally file completed data forms

Data quality assurance

Factors potentially affecting data quality (i.e., colony counts, metadata) include careful selection of the OPs, knowledge and skills of field observers, adherence to clear and complete data collection protocols, specifications and condition of optical equipment, weather and light conditions, accuracy and completeness of metadata, and the data downloading and labeling process. Table SOP 3.3 lists these factors and how they are controlled to maximize data quality.

Table SOP 3.3. Factors affecting land-based survey data quality and measures taken to assure high quality.

Factor | Quality assurance measures
Observation Point (OP) selection | OPs selected carefully to assure consistently good view of the target colony and observer safety and comfort
Observer knowledge and skill | Experienced Survey Coordinator trains crew on protocols; crew practices before survey; crew selected based on pertinent knowledge and experience
Adherence to protocol | Detailed protocol is well documented; Survey Coordinator monitors crew during survey; field forms include instructions and explicit space for and definitions of data categories
Optical equipment | Minimum capabilities are specified; lenses cleaned as needed; spare batteries and memory cards available during survey; personnel trained in proper use
Weather and light conditions | Minimum survey weather and time of day parameters specified and adhered to
Metadata accuracy | Field forms with labeled spaces for data; completed forms checked by separate crew members; electronic data entry double checked; timely record keeping
Accuracy of data entry | Data entry SOP followed; database automatically checks for omissions and entries outside normal parameters; sample of data fields double checked against field forms


Supplemental Materials (SM)

SM 1: Maps of Sub-regions within the CCS

[Map of the Cape Flattery, La Push, and Point Grenville sub-region, showing recently occupied and historic common murre colonies and recently occupied Brandt's cormorant colonies. Pacific Seabird Program, Regions 1, 7 and 8, Newport, Oregon; produced May 8, 2018; basemap World_Topo_Map (ESRI); file PF_Maps.aprx, using GIS layers COMU_All and BRAC_All. Recently occupied colonies are those for which a breeding population was observed and recorded during the latest survey in the PSP Colony Survey Database.]

[Sub-region map showing recently occupied and historic common murre colonies and recently occupied Brandt's cormorant colonies; scale bar 0-50 km. Legend, production notes, and data sources as in the preceding map.]

[Sub-region map showing recently occupied and historic common murre colonies and recently occupied Brandt's cormorant colonies; scale bar 0-100 km. Legend, production notes, and data sources as in the preceding maps.]

[Sub-region map (Cape Vizcaino area) showing recently occupied and historic common murre colonies and recently occupied Brandt's cormorant colonies; scale bar 0-50 km. Legend, production notes, and data sources as in the preceding maps.]

[Sub-region map (Farallon Islands NWR, Pigeon Pt.) showing recently occupied and historic common murre colonies and recently occupied Brandt's cormorant colonies; scale bar 0-50 km; produced May 10, 2018. Legend and data sources as in the preceding maps.]

[Sub-region map (Pt. Piedras Blancas, Pt. Buchon, Pt. Arguello, Pt. Conception) showing recently occupied and historic common murre colonies and recently occupied Brandt's cormorant colonies; scale bar 0-50 km. Legend, production notes, and data sources as in the preceding maps.]

[Sub-region map (Southern California Bight: Pt. Conception, Pt. Vicente, Pt. Loma, and the Channel Islands: San Miguel I., Santa Rosa I., Santa Cruz I., Anacapa I., Santa Barbara I., Santa Catalina I., San Nicolas I., San Clemente I.); scale bars 0-100 km and 0-50 miles. Legend and data sources as in the preceding maps.]


SM 2: Regional Director Order #12-02 with attachments (RO #12-01 issued by the R8 Director is identical)


Attachment A. Project Safety Plan


Attachment B. Aviation Project Risk Assessment Worksheet


Attachment C. Aviation Operation Go-No Go Checklist


SM 3: Example of a completed Survey Project Aviation Safety Plan

Project Aviation Safety Plan: 2017 Annual Aerial Seabird Colony Survey (Helicopter)

Mission: Photograph seabird colonies
Project Name: 2017 Aerial Seabird Colony Survey Study
Unit: OCNWRC
Anticipated Project Dates: June 21-22, 2017; Start Time: 10:00 AM; Ending Time: 5:00 PM
Project Plan Prepared by: Shawn W. Stephensen; Title: Wildlife Biologist; Date: 06/20/2017
Note: Signature by the preparer verifies that all personnel have the required training for the mission. Attach a map clearly showing areas to be flown; aerial hazards must be indicated.
Project Plan Reviewed by: ______ Title: ______ Date: ______ (three review lines)
This Flight is Approved by: ______ Title: ______ Date: ______

Project Description: The objective of these flights is to conduct an aerial photographic survey of seabird colonies along the entire Oregon coast with a helicopter to document seabird species abundance, distribution, and nesting areas. Digital photographs will be taken of all common murre and cormorant species colonies, and birds/nests will be counted on the photographs. Proposed flight dates for this study are June 21 and 22; because the survey area is approximately 320 miles long, the survey requires two days. Oregon has a total of 393 colonies with approximately 1.3 million seabirds representing 15 species.

On June 21, Shawn W. Stephensen, Jennifer Nelson, and Lila Bowen (a USFWS employee and two SEA Interns) will depart from Newport and drive to Astoria Airport. Mike Everette (pilot) of Northwest Helicopters will depart from Olympia, Washington and fly to Astoria Airport to meet Shawn, Jennifer, and Lila. The helicopter doors will be removed from the passenger side of the ship to allow unobstructed views; Lila will transport the doors by vehicle from Astoria to Newport. Shawn, Jennifer, and Mike will depart Astoria Airport and fly south along the coastline taking colony photographs from Astoria to the Heceta Head/Sea Lion Point area, and back to Newport Airport. Aerial flight following (AFF) will be completed by Pam Johnson at the USFWS Newport office or her residence during this flight; Rebecca Chuck will be the backup flight follower. Shawn will call Pam or Rebecca via cell phone before takeoff and upon landing at the beginning, during, and end of the flight on both days. The doors will be placed back on the helicopter and secured for the evening.

On June 22, Shawn, Jennifer, and Mike will depart Newport and fly to Gold Beach Airport. The helicopter doors will be removed and transported by Deklyn Wood (SEA Intern) from Gold Beach to Reedsport, where they will be transferred to Lila, who will bring them to Newport Airport. Shawn, Jennifer, and Mike will depart Gold Beach Airport, fly to Goat Island (near the California-Oregon border), and then travel north to Newport Airport taking photographs along the way. AFF will be done by Pam Johnson or Rebecca Chuck during this flight. The doors will be placed back on the helicopter at Newport, and Mike will return to Olympia.

The ship will fuel at Astoria, Tillamook, and Newport on June 21 and at Gold Beach, North Bend, and Newport on June 22. Shawn will contact the AFF person via cell phone during refueling stops, upon landing, and prior to departure. In case of an emergency, the helicopter can land at Tillamook, Florence, North Bend, Bandon, or Brookings Airport, whichever is closest. Flights may be aborted or canceled due to poor weather conditions or low visibility. Attachments: Map. Other: ______

Project Supervisor: Kelly Moroney; Phone: (541) 867-4550; Cell: xxx-xxx-xxxx
Aircraft Manager: Doug Utecht; Phone: (360) 754-7200; Cell: xxx-xxx-xxxx
Participants: Shawn W. Stephensen (USFWS Biologist), Jennifer Nelson (SEA Intern), and Mike Everette (Pilot)


Type of Flight: Specific mission
Desired Aircraft Type: Helicopter with floats
Charge Code: ______; Type Procurement: ______; Method of Payment: Voucher
Projected Cost: $15,000

Vendor: Northwest Helicopters, Inc.; Phone: (360) 754-7200; Cell: (253) 225-3501
Aircraft N#: N88TA; Make & Model: Bell 206 Jet Ranger III; Aircraft Color: Grey
Pilot Name: Mike Everette; Pilot Carded: Yes / No; A/C Carded: Yes / No
Flight Follow Procedure: AFF via computer with Pam Johnson
Method of Resource Tracking: phone contact prior to takeoff, at each stop enroute, and on arrival at destination
Scheduling Dispatch Phone: (541) 867-4550; Destination Dispatch Phone: ______
Request or Flight #: ______; FM Receive/Transmit/Tones: ______; AM Air to Air: ______; AM Unicom: ______; Other: ______

Start Locations (latitude, longitude, elevation, runway length & surface or helispot size):
Astoria: 46°9′33″N, 123°52′50″W; elev. 200; runway length & surface unknown
Newport: 44°34′40″N, 124°3′42″W; elev. 200; unknown

Enroute Stops:
Tillamook: 45°25′7″N, 123°48′49″W; elev. 200; unknown
Gold Beach: 42°24′48″N, 124°25′27″W; elev. 200; unknown

Destination Locations:
Newport: 44°34′40″N, 124°3′42″W; elev. 200; unknown
Gold Beach: 42°24′48″N, 124°25′27″W; elev. 200; unknown
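For mapping or GIS use, the degree/minute/second coordinates above can be converted to decimal degrees. A minimal sketch (the conversion itself is standard; the example uses the Astoria Airport coordinates from the table):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # South latitudes and west longitudes are negative by convention.
    return -value if hemisphere in ("S", "W") else value

# Astoria Airport, from the table above
lat = dms_to_decimal(46, 9, 33, "N")
lon = dms_to_decimal(123, 52, 50, "W")
print(round(lat, 4), round(lon, 4))  # 46.1592 -123.8806
```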

Passenger Name | Weight | Departure Point | Destination Point
Shawn W. Stephensen | 175 lbs | Astoria and Newport, OR | Round trip
Jennifer Nelson | 150 lbs | Astoria and Newport, OR | Round trip
Mike Everette | 180 lbs | Olympia, WA; Astoria and Newport, OR | Round trip

Cargo | Weight | Cubic Feet | Hazardous Material | Destination (no cargo entries listed)

Type of Flight | Personal Protective Equipment Requirements
Air Ops general/ground personnel: Nomex clothing, hardhat w/chin strap, gloves, leather boots, eye protection, hearing protection, fire extinguisher
Fixed Wing point-to-point flights: hearing protection
Fixed Wing mission flights: Nomex clothing, gloves, leather boots, hearing protection
Rotor Wing flights: flight helmet, Nomex clothing, gloves, leather boots, eye protection, hearing protection, approved secondary restraint harness for doors-off flights


Aircraft Manager must confirm with Dispatch prior to the flight that affected routes or other airspace concerns have been deconflicted.

Military Training Route (MTR) Information: MTR | Route Legs-Altitude | Hot/Cold | Activity Time (Start/Stop) | Time Zone (UTC/Local) (no entries)

Other airspace concerns/hazards: ______

Justification Statement for Low-Level Flights:

Special Instructions: Aerial flight following (AFF) will be accomplished via USFWS personnel, Pam Johnson at the Newport field office or residence, office telephone: (541) 867-4550, cell phone: (541) 270-7810. Shawn Stephensen will contact Pam prior to departure and upon return at each stop location. If the ship makes unscheduled stops, Shawn will contact Pam via cell phone upon landing and prior to departure. Backup flight following person is Rebecca Chuck at (541) 867-4550 (office) or (541) 270-7811 (cell).

Emergency Evacuation/Search and Rescue Information and Procedures: Notify search and rescue personnel if contact cannot be made within 15 minutes after the ship makes an unscheduled stop. A 4-man life raft may be deployed in case of water ditching. See attached Mishap Response Guide and Checklist.

Applicable Risk Management Worksheet/Job Hazard Analysis (JHA) referenced and on file.

Risk Assessment Matrix (risk levels: 4 = HIGH, 3 = SERIOUS, 2 = MEDIUM, 1 = LOW):

Likelihood \ Severity | IV Negligible | III Marginal | II Critical | I Catastrophic
A Frequent | 2 | 3 | 4 | 4
B Probable | 2 | 3 | 4 | 4
C Occasional | 1 | 2 | 3 | 4
D Remote | 1 | 2 | 2 | 3
E Improbable | 1 | 2 | 2 | 2

Reference the Aviation Risk Mgmt. Workbook, JHAs, etc., to assist completion of the Risk Assessment. Assess the risks involved with the proposed operation; use additional sheets if necessary. Assignment: ______ Date: ______ Pre-mitigation hazards rate out as:
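Expressed as a lookup table, the worksheet's matrix gives a numeric risk level for each likelihood/severity pair (4 = HIGH, 3 = SERIOUS, 2 = MEDIUM, 1 = LOW). The sketch below transcribes the cell values from the worksheet and uses its own helicopter-crash example:

```python
SEVERITIES = ("IV", "III", "II", "I")  # Negligible ... Catastrophic

RISK_MATRIX = {               # IV III II  I
    "A": (2, 3, 4, 4),        # Frequent
    "B": (2, 3, 4, 4),        # Probable
    "C": (1, 2, 3, 4),        # Occasional
    "D": (1, 2, 2, 3),        # Remote
    "E": (1, 2, 2, 2),        # Improbable
}

def risk_level(likelihood, severity):
    """Look up the numeric risk level for a likelihood (A-E) and severity (I-IV)."""
    return RISK_MATRIX[likelihood][SEVERITIES.index(severity)]

# Worksheet example: helicopter crash, pre- vs. post-mitigation
print(risk_level("D", "I"))  # 3 (Remote x Catastrophic, pre-mitigation)
print(risk_level("E", "I"))  # 2 (Improbable x Catastrophic, post-mitigation)
```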

Describe the Hazard | Likelihood (A-E) | Severity (I-IV) | Risk Level
1. Helicopter crash | D | I | 3
2. ______
3. ______


(Rows 4-8 blank.) Pre-Mitigation Overall Rating: ______

Post-mitigation hazards rate out as:
Mitigation Controls | Likelihood (A-E) | Severity (I-IV) | Risk Level
1. Review aeronautical charts before flight to determine hazards | E | I | 2
2. Check aircraft safety/maintenance and pilot card | E | I | 2
3. ______
4. ______
Post-Mitigation Overall Rating: ______
Success Probability/Benefits Statement: Overall rating (score) is 2. Completing a review of flight hazards before flying will reduce the risk factor, and verifying the helicopter maintenance log and pilot card will also reduce risk.

Operation Approved By: Title: Date:

Appropriate Management Level for Risk Decisions:
Risk Level | Fire | Project
HIGH | Incident Commander or Ops Chief | Line Manager
SERIOUS | Incident Commander or Ops Chief | Line Manager
MEDIUM | Air Operations Branch Chief | Project Aviation Manager
LOW | Helibase Manager | Helicopter or Flight Manager

Mission Planning/Preflight Briefing Checklist: Review with all participants as part of the preflight briefing.
1. Chain of command, individual roles and responsibilities are identified to all participants? Yes / No / NA
2. Project Aviation Safety Plan is approved and signed at the appropriate levels? Yes / No / NA
3. Is the emergency evacuation plan/helibase crash-rescue plan reviewed? Yes / No / NA
4. Are communications and flight following established, including repeater tones? Yes / No / NA
5. Can terrain, altitude, temperature or weather that could have an adverse effect be mitigated? Yes / No / NA
6. Are all aerial hazards identified and known to all participants? Yes / No / NA
7. Have ground operations hazards and safety been identified to all participants? Yes / No / NA


8. Have mitigating measures been taken to avoid conflicts with military or civilian aircraft? Yes / No / NA
9. Have adequate landing areas been identified and/or improved to minimum standards? Yes / No / NA
10. Are all agency personnel qualified for the mission? Yes / No / NA
11. Are there enough (qualified) agency personnel to accomplish the mission safely? Yes / No / NA
12. Is the pilot carded and experienced for the mission to be conducted? Yes / No / NA
13. Will adequate briefings be conducted prior to flight to include Pilot, Passengers and Dispatch (all participants)? Yes / No / NA
14. Are all involved aware that the pilot has the final authority, but that any passenger/aircrew/ground personnel who feels uncomfortable can refuse/curtail the flight without fear of reprisal? Yes / No / NA
15. Is the aircraft capable of performing the mission with a margin of safety? Yes / No / NA
16. Have manifests of cargo and passengers, load calculations, and/or weight & balance been completed? Yes / No / NA
17. Is the aircraft properly carded? Yes / No / NA
18. Do all personnel have the required PPE? Yes / No / NA
19. Fuel planning: adequate fuel on board, fuel truck location, availability of commercial fuel? Yes / No / NA
20. Remember: maps of areas/sites, handheld radios, cell phones, day/survival packs, sick sacks. Yes / No / NA
21. Will the mission be conducted at low levels (below 500' AGL)? Yes / No / NA
22. Can the same objective be achieved by flying above 500' AGL? Yes / No / NA
23. Are pilot flight and duty times compromised? Yes / No / NA
24. Is there an alternative method that would accomplish the mission more safely? Yes / No / NA
25.-28. Other? (identify) Yes / No / NA

Items 1-20 checked "No" and items 22-24 checked "Yes" require correction and/or re-evaluation of the flight/mission before proceeding. Evaluate additional items accordingly.
Identify Correction: ______
Aircraft/Flight Mgr. Signature: ______ Date: ______
Pilot Signature: ______ Date: ______

Project Aviation Safety Briefing

A copy of this briefing page will be submitted to the Aviation Safety Officer within 5 days of the completion of this project.


Briefing Leader: ______ Briefing Date: ______ Time: ______ Location: ______

Discussion Items:
a. Hazard Analysis (as outlined in plan)
b. Safety Air Ops (Ground)
c. Safety Air Ops (Flight)
d. Military Training Routes
e. Flight Following
f. Frequencies
g. Fueling
h. Emergency Evacuation Plan
i. Authorities
j. Weather Considerations
k. Review applicable JHAs/Risk Assessments
l. Other

Briefing Attendees (Signature and Concurrence):


Figure 1. Map of project site locations and airports.


SM 4: Example Colony Status Record Form and Aerial Survey Log


Colony Status Record (U.S. Fish and Wildlife Service)

Area Number: ______ Colony Number: ______ Colony Name: ______
Observer(s): ______ Date: ______ Start Time: ______ End Time: ______
Latitude: ______ Longitude: ______ Other location data: ______

Sea swells & direction: ______ Beaufort sea state: ______ Sea surface temp.: ______ Visibility (NM): ______

Mean wind speed & direction: ______ % cloud cover: ______ Air temperature: ______ Precipitation: ______

Secchi: ______ Conductivity: ______

Species | No. nests/No. pairs | No. birds | Breeding status/Stage of breeding (1) | How counted (2)

Fork-tailed Storm-Petrel
Leach's Storm-Petrel
Double-crested Cormorant
Brandt's Cormorant
Pelagic Cormorant
Cormorant (unidentified)
Black Oystercatcher
Glaucous-winged Gull
Western Gull
Gull (unidentified)
Common Murre
Pigeon Guillemot
Cassin's Auklet
Rhinoceros Auklet
Tufted Puffin
Horned Puffin

(1) Include evidence that the species is breeding in the colony, and the stage (e.g., nest building, eggs, chicks). (2) Include for each species: whole colony or part? Exact count, count by groups, estimate, replicated, etc.? FORM CONTINUES ON BACK


Description of Colony

Colony Number: ______ Census Date: ______

Census methods for colony (where viewed from, special conditions, etc.): ______
Description of colony (habitats, groups of birds, vegetation): ______
Mammalian predators, human disturbances, etc.: ______
Marine mammals: ______
Access to colony: ______
Evaluation of data quality (reliability of this census): ______
Overall evaluation of colony: ______
Supplemental material (photographs, reports, etc.): ______

Detailed map of colony


Flight Log — Aerial Photographic Surveys of Seabird Colonies    Date: ______ Page __ of __

Notes by: ______ Pilot/Aircraft: ______

Camera 1/Lenses: ______ Camera 2/Lenses: ______

Camera 1 -- Lead Photographer: Camera 2 -- Overview Photographer:

Time | Alt. (ft.) | Image #s/Location/Notes (Camera 1) | Image #s/Location/Notes (Camera 2)


SM 5: Example of Land-based Survey Methods: Colony Count Protocol for Common Murre Restoration Project Colonies

(Abridged) By: G. McChesney/USFWS

To monitor seabird attendance, we will be conducting colony counts of various nesting sites. The frequency and details of these colony counts vary from site to site; site-specific procedures are included later in this protocol. Data collected will be entered on the Seabird Colony Count sheets for each field site. During the non-breeding season (Aug 15 – Apr 14), counts are conducted between the hours of 0700–1100. During the breeding season (Apr 15 – Aug 14), counts are conducted between the hours of 1000–1400. If it is too difficult to get counts completed during this time frame (this sometimes happens at Point Reyes, especially if they are short-staffed on a colony count day), counts may be started at 0930 in order to be done by 1400. This schedule is constructed to coincide with times of most consistent murre attendance.
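The seasonal count windows described above can be expressed as a small helper function. This is an illustrative sketch only: dates and times are assumed local, and the 0930 early start is modeled as an optional flag:

```python
from datetime import date, time

def count_window(survey_date, allow_early_start=False):
    """Return the (start, end) count window for a given survey date."""
    breeding = (date(survey_date.year, 4, 15) <= survey_date
                <= date(survey_date.year, 8, 14))
    if breeding:
        # Breeding season: 1000-1400, with a 0930 start allowed if needed.
        start = time(9, 30) if allow_early_start else time(10, 0)
        return start, time(14, 0)
    # Non-breeding season: 0700-1100.
    return time(7, 0), time(11, 0)

print(count_window(date(2017, 6, 21)))   # breeding-season window
print(count_window(date(2017, 1, 10)))   # non-breeding-season window
```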

Using the appropriate optical equipment, count all adult Common Murres on a given nesting site using a tally-counter (clicker). Also count all adults of other species, entering a number on the data sheet for each species observed. If you see a non-listed species, use the blank fields for other species. Do not count young of the year (i.e. chicks) of any species. These young are accounted for using other monitoring techniques. If time allows, you may distinguish adult birds from immature birds, and record them on your datasheet. This is especially helpful for Brown Pelicans.

Note: Most rocks become a bustling, tight little colony during the breeding season, and counting the birds can be daunting. It is not possible to see all birds on the rock from the viewing location. These colony counts are an index. Get the best counts possible.

POINT REYES AND DRAKES BAY COLONIES

Colony counts will be conducted once a week from the traditional overlooks (1-10) along the Headlands and twice a week in Drakes Bay (once a week at each of the Drakes Bay colonies). Data collected will be entered on the Seabird Colony Count sheets for PRH and DB.

See the Point Reyes Overlook Guide and Colony Count Binder for information on counting locations.

Using the appropriate optical equipment, count all adult Common Murres on a given nesting site one time, then count all adults of other species, entering a number on the data sheet for each species observed. Due to the large numbers of COMUs at several nesting sites, designated plots have been created to enumerate a sub-sample of attending COMUs (e.g., PRH-03-B, PRH-05-B, PRH-10-C, PRH-10-B, PRS-02, MPR-01, and DPR-01). Instead of counting all the COMUs in the given nesting area, count the COMUs on these plots three times (with the exception of PRH-10-C and PRH-10-B, which are only counted once) and record them on the datasheet. Enter each count in the database; they will be averaged later, and the average will be used as the "official" count for that location.
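The replicate-count rule above (count designated plots three times; average the replicates later for the "official" count) amounts to a simple mean. A sketch, using plot codes from the protocol but made-up counts:

```python
from statistics import mean

# Three replicate counts per designated plot (counts are invented examples)
replicate_counts = {
    "PRH-03-B": [412, 405, 419],
    "PRS-02": [88, 91, 90],
}

# The mean of the replicates becomes the "official" count for each plot
official = {plot: round(mean(counts), 1)
            for plot, counts in replicate_counts.items()}
print(official)
```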

Try to complete the whole colony count in one day. If counts of the entire Headlands cannot be completed due to rain or fog, try to complete them the next day. Occasionally there will be a week when you cannot count from one or two overlooks (usually 5 & 6). For Drakes Bay, if visibility is obstructed, count as many areas as possible. Be sure to note on the datasheet that you were fogged out.

If you are getting close to 1400 and have not completed your counts, prioritize your COMU counts and finish the survey. Because weather at PRH is often not conducive to getting surveys done, it is important to get the data, even if we are a bit outside of our time window.

* Note – If you have attempted three colony count/nest surveys without success, you can give up for that week and wait until the following week to attempt another survey. This caveat is for Point Reyes only.

DEVIL’S SLIDE ROCK & MAINLAND

Colony Counts of Devil’s Slide Rock will be conducted every other day. This includes DSR Upper, DSR Lower, and Elizabeth Rock. Colony Counts of the mainland areas will be conducted once a week. Data collected will be entered on the Seabird Colony Count sheet for Devil’s Slide Rock.

Using the appropriate optical equipment, count all adult Common Murres on Devil’s Slide Rock and on mainland count areas three consecutive times (to be averaged later for increased accuracy), then count all other adult birds present in the area of interest one time, entering a number on the datasheet for each species observed.

Count Areas

Devil's Slide Rock (DSR-01-UPPER, DSR-01-LOWER, and DSR-01-ER) Count Devil's Slide Rock from the TRADITIONAL pullout. DSR has been separated into two counting areas, Upper and Lower; count them separately. Upper is the area on the "top" of the rock, including the sloped area below plot C on the eastern section of the rock. Lower is the area along the vertical walls of the rock and along the base during lower tides. Elizabeth Rock (DSR-01-ER) is also counted if there are roosting birds on it. This is the low, flat rock immediately to the south of DSR, often washed by waves or covered at high tide.

Mainland North (DSR-02-MN) This is the area on the north-facing slope of the Bunker. Conduct this count from the GRAVEL pullout.

Below Traditional Roost (DSR-07-BTR) This is a roosting area below the Traditional pullout. Conduct this count from the GRAVEL pullout.

West side of Bunker (DSR-02-MN) This is the area on the DSR mainland that curves around the west side of the Bunker. This area is just west of Mainland North and is only visible from the GRAVEL 2 pullout. At the beginning of the season, check this site: PECO and BRCO rarely nest here, but if they do, include them in colony counts and nest surveys. If there are no birds present, you do not need to include this pullout; however, it is worth checking periodically throughout the season. Refer to this area as DSR-02-MN for datasheet and data entry purposes, but make a note that the birds were seen from GRAVEL 2.

April’s Finger (DSR-05-AF) This is the pointy rock outcrop located directly below the observer location on Bunker. Conduct this count from the BUNKER.

Upper Mainland South (DSR-05-A-UPPER)


This is the upper portion of the south face of the Bunker. Count from the top of the ridge to the steep area that separates Upper and Lower Mainland South. Conduct this count from PEFA PT.

Lower Mainland South (DSR-05-A-LOWER) This is the lower portion of south face of bunker. Conduct this count from PEFA PT.

Mainland South Roost (DSR-05-A-ROOST) This is a roost area near the tide line. The rocky outcrop juts westward. Conduct this count from PEFA PT.

Turtlehead (DSR-05-B) This is the rocky outcrop south of the Bunker. Conduct this count from PEFA PT.

DSR-05-C This is a west-facing cliff between DSR-05-B and PEFA PT. Conduct this count from PEFA PT.

DSR-03

This is the cove just south of PEFA PT. Conduct this count from PEFA PT.

DSR-04 This is the cliff that hugs the cove south of PEFA PT. Conduct this count from PEFA PT and SOUTH OF TUNNEL overlook.

PHOTOS OF DSR COUNT AREAS

[Annotated photographs of the DSR count areas, including a view from the Bunker of DSR-05-AF (May 2005) and the mainland area visible from the "Gravel 2" pullout just west of DSR-02-MN (refer to this area as DSR-02-MN for datasheets and data entry).]

SM 6: Brandt's Cormorant Nest Survey Protocols

Part 1. Assigning BRAC nest categories when counting nests from aerial photographs

The following is an excerpt from: McChesney, G.J., and H.R. Carter. 1999 (revised). Protocol for Counting Seabirds from Aerial Photographs of Breeding Colonies and Roosts in California. Unpublished report. U.S. Geological Survey, Biological Resources Division, Western Ecological Research Center, Dixon, CA, and Department of Wildlife, Humboldt State University, Arcata, CA.

Note that nests tallied in the five nest categories below are combined to arrive at a total colony nest count. Territorial sites are noted, but not included in the total count.
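A minimal sketch of this tallying rule. The category codes below follow Part 2 of this SM and the tallies are invented examples; the point is that the five nest categories sum to the colony total while territorial sites (Z) are recorded separately:

```python
# Invented tallies for one colony; Z (territorial sites) is noted but
# excluded from the total nest count.
tallies = {"PBN": 14, "FBN": 22, "WBN": 63, "brooding": 31,
           "nest_unknown": 5, "Z": 9}

NEST_CATEGORIES = ("PBN", "FBN", "WBN", "brooding", "nest_unknown")
total_nests = sum(tallies[c] for c in NEST_CATEGORIES)
print(total_nests)  # 135 (the 9 territorial sites are not included)
```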


Part 2. Assigning BRAC nest categories when counting nests from land

The following is an excerpt from unpublished field protocols developed for California’s Common Murre Recovery Project, courtesy of G. McChesney, USFWS

To better understand the dynamics of these seabird colonies, we are monitoring the formation of BRAC nesting sites and their proximity to COMU. In conjunction with a Colony Count, BRAC nests will be counted and characterized; this information will be entered onto the datasheet and into the BRAC NEST SURVEYS table in the database. At each overlook, count the BRAC nests using the classifications below, along with the total number of nests for PECOs, WEGUs, BLOYs, and possibly CORAs. Do not count COMU nest sites.

Date: Write out the date of the nest survey.

Time: Record the time (military time) of the survey.

Sub-colony/ Area: Record the colony or subarea that you are counting. Refer to colony count protocol for photos of sub-colony areas.

Sample of Nest Survey datasheet for nests located on named (1st line) and unnamed (2nd line) rocks/cliffs:

Date | Time | Overlook | Areas | Z | PBN | FBN | WBN | UNK Nest | # Brooding Nests | Creching Chicks | WEGU Nest | WEGU Chicks | PECO Nest | BLOY Nest | Notes
5/15/10 | 1410 | HPR | CRM-09 | 0 | 2 | 1 | 0 | 0 | 0 | 0 | 2 | 2 | 0 | 0 |
5/15/10 | 1415 | HPR | Bixby S Area | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | On small roost rock below CRM-09 cliffs.

BRAC: For each colony, subcolony, or subarea, record the number of territorial sites, poorly-built nests, fairly-built nests, well-built nests, nests with brooded chicks, and numbers of creching chicks. Be sure not to double count nests! Each nest should be counted only once. It is a good idea to use the big multi-clickers to count everything at once and avoid double counting. If you see chicks in a nest, it is considered a "nest with brooded chicks" and no longer a "well-built nest" (nest condition of nests with brooded chicks does not need to be recorded). Creching chicks are chicks that are no longer at nest sites and too difficult to count as "nests"; instead, we count the total number of chicks that are creching/wandering. When counting creching chicks, nest condition does not have to be recorded; at this point of development, nests have often been destroyed by wandering chicks anyway.

Nest counts are meant to be a snapshot of what is going on at the colony. Record what you see at the time you are conducting the count (ex: “Nest with brooded chicks” is based on which nests you can see chicks in while you are doing the count, not on which nests you know have chicks because of your productivity monitoring). See below list for a description of categories:

Territory (Z): Birds that do not have a nest but are displaying, or a pair that is present, or a site that has nesting material (loose clumps or stringy bunches of marine or terrestrial vegetation forming at most a disorganized mat).

Poorly-Built Nest (PBN): A disorganized mound or a flat pile of nesting material.


Fairly-Built Nest (FBN): A well-defined, roughly circular pile of nesting material up to approximately 6" in height, with some evidence of a nest bowl depression at its center.

Well-Built Nest (WBN): Substantial (>6" vertical height) amount of nesting material, forming a clearly-defined circular nest structure with a well-developed nest bowl, often plastered with much guano.

Nest unk.: Any site that cannot be seen fully and therefore cannot be determined.

# Brooding Nests: Count the number of nests that have chicks present in the nest.

Creching Chicks: Record the number of wandering chicks. These are large chicks (late stage III and older) that are very mobile and no longer in the nest.

Other Species: If there are any WEGU, PECO, BLOY, or CORA nests these should also be counted. When counting these it is only necessary to distinguish between Territories (Z) and Nests (entered in the database as WBN). Record this data in the appropriate column on the datasheet.

Notes: Any other additional information, such as photos taken, # of WEGU chicks present, etc. If you have issues like creching PECO chicks and cannot tell how many nests they come from, give your best guess of the number of nests and put the total number of chicks and other relevant information about the nest count (e.g., there are 3 chicks that could be from one or two nests) in the notes section. For nests recorded in Nest Survey Areas, use the notes to further describe where the nests are located (see datasheet example above).

Territory (Z) These are birds that do not have a nest. This includes birds that are displaying, a pair that is present, or a site that has nesting material (loose clumps or stringy bunches of marine & terrestrial vegetation forming at most a disorganized mat).


Poorly-Built-Nest (PBN): A disorganized mound, or a flat pile of nest material.


Fairly-Built-Nest (FBN): A well-defined, roughly circular pile of nest material (up to ~ 6 inches high), with some evidence of a nest bowl depression at its center.

Well-Built-Nest (WBN): Substantial (> 6 inches high) amount of nest material forming a clearly-defined circular nest structure with a well-developed nest bowl, often plastered with a lot of guano.

