Institutions and Self-Governing Social Systems: Linking Reflexivity and Institutional Theories for Cybersecurity and Other Commons Governance Policies

by Kenneth Norman Clark

B.A. in Physics, May 1982, Whitman College
B.S. in Electrical Engineering, May 1984, The University of New Mexico
M.E. in Electronic Engineering, December 1987, The California State University-California Polytechnic State University

A Dissertation submitted to

The Faculty of The Columbian College of Arts and Sciences of The George Washington University in partial fulfillment of the requirements for the degree of Doctor of Philosophy

August 31, 2012

Dissertation directed by

Kathryn E. Newcomer
Professor of Public Policy and Public Administration

The Columbian College of Arts and Sciences of The George Washington University certifies that Kenneth Norman Clark has passed the Final Examination for the degree of Doctor of Philosophy as of April 23, 2012. This is the final and approved form of the dissertation.

Institutions and Self-Governing Social Systems: Linking Reflexivity and Institutional Theories for Cybersecurity and Other Commons Governance Policies

Kenneth Norman Clark

Dissertation Research Committee:

Kathryn E. Newcomer, Professor of Public Policy and Public Administration, Dissertation Director

Donna L. Infeld, Professor of Public Policy and Public Administration, Committee Member

Michael Harmon, Professor Emeritus of Public Policy and Public Administration, Committee Member


© Copyright 2012 by Kenneth Norman Clark
All rights reserved

Dedication

In memory of my father Norman H. Clark. Thank you for inspiring me and starting off with me on this long journey. I wish we could have arrived here together.

Acknowledgments

First of all, I am truly grateful to my Dissertation Research Committee: Professors Kathryn Newcomer, Donna Infeld, Michael Harmon, Costis Toregas, and Jerrold Post, all of The George Washington University. Your support and guidance throughout this process are greatly appreciated. I especially thank Kathryn Newcomer, my Dissertation Director, and Donna Infeld, our Ph.D. Program Director, for their dedication, patience, and optimism in guiding me through to completion.

My thanks go out to all the survey participants and interviewees who volunteered their time and energy to answer some very important questions regarding cybersecurity, and who contributed to this research, which addresses one of our nation’s greatest national security threats. Thanks also to Professor Stuart Umpleby, who mentored me and inspired me to take on this study of Reflexivity Theory, always expressing confidence that I could complete the task. And to Charlotte Hess, for her interest and support in my research, and for leading the concept of the “New Commons” into the social sciences. I am also very grateful to Janet Hulstrand for her tireless support and encouragement. I am extremely grateful to my true friends Monique Vella, Dave Sapper, and Mike Tovrea for all their encouragement, laughs, and unwavering support through the years: Cheers!

Finally, a person cannot complete a task such as this without the love and support of family. I thank my sister, Karen Gould, for being there and encouraging me. I thank Ethel Maki: you were there on Day 1 and I am truly blessed that you are with me now. Susan McKeehan, you have always been and will be an inspiration and a shining light. I thank my Mom, Kathy Clark: you and Dad raised me and guided me toward new horizons in my life, as only the best parents in the world could do. I wish both of you could be here as I complete this journey. Most of all I thank my wife, Sheila, for all her love, unwavering support, inspiration, and encouragement throughout this long process. We made it!

Abstract

Institutions and Self-Governing Social Systems: Linking Reflexivity and Institutional Theories for Cybersecurity and Other Commons Governance Policies

A commons is the conventional term for a widely accessible and shared resource, such as an ocean, open rangeland, a public park, or public portions of the Internet, from which it is difficult to exclude users or limit use once the resource is naturally provided or humanly produced. Ruin of a commons can result when individuals use the resource recklessly and selfishly rather than adhering to conservation-minded, collective action, in cooperation with others in the community, with a view toward preserving the commons for future generations. Employing a mixed methods research design with the U.S. Federal government’s use of the Internet as an illustrative case, the research described here explores how Reflexivity Theory and Institutional Theory, and their common theoretical element of human agency, can be used to develop new policy concepts for commons governance. The research answers the questions, “How may Reflexivity and Institutional Theories be used to help improve the formulation of commons governance policies?” and “How may Reflexivity and Institutional Theories be used to improve Federal cybersecurity policies governing use of the Internet?” Through interviews and an Internet-based survey used to collect and analyze data, the research demonstrates that elements of these theories can inform both Federal cybersecurity policies and governance policies affecting other commons.

Table of Contents

Dedication ...... iv

Acknowledgments ...... v

Abstract...... vii

List of Figures ...... xii

List of Tables ...... xiii

Chapter 1: Introduction and Overview of the Research ...... 1

Research Purpose...... 1

The Problem in Context: Commons Governance...... 3

The Case: Federal Cybersecurity Policy...... 10

Research Strategy and Questions...... 20

Summary...... 22

Chapter 2: Review of the Literature ...... 24

Introduction...... 24

Social Systems and Reflexivity Theory...... 24

Institutional Theory...... 35

The IAD Framework ...... 44

Linking Reflexivity and Institutional Theories ...... 47

The Research Case: Federal Cybersecurity Policy...... 50

Federal Cybersecurity Policy and the Federal Information Security Management Act of 2002 ...... 56

Reflexivity and Institutional Theories for Addressing Cybersecurity ...... 65

Summary...... 70

Chapter 3: Research Methodology and Design ...... 72

Introduction...... 72

Exploratory Mixed Methods Research...... 72

Data Collection Process...... 75

The Survey ...... 79

The Interviews ...... 88

Data Analysis Process...... 90

Research Validity and Limitations...... 95

Summary...... 103

Chapter 4: Analysis of Data and Findings ...... 105

Introduction...... 105

Results of the Quantitative Data Analysis...... 106

Quantitative Analysis Supporting Research Question #1 ...... 115

Subsidiary Question #1a Results ...... 116

Subsidiary Question #1b Results ...... 126

Subsidiary Question #1c Results ...... 134

Subsidiary Question #1d Results ...... 140

Quantitative Research Results...... 146

Results of the Qualitative Data Analysis...... 147

Reflexivity Theory and Cybersecurity Policy ...... 149

Avoiding Behaviors That Impact Trust and Reciprocity ...... 153

Self-Monitoring Habits for Cybersecurity ...... 156

Strengths and Shortfalls of FISMA ...... 157

Additional Policy Ideas for Cybersecurity ...... 163

Integrating and Summarizing the Results...... 165

Summary...... 169

Chapter 5: Conclusions and Research Implications ...... 171

Introduction...... 171

Summary of the Findings...... 171

Implications for Future Policy...... 175

Considerations for Future Research...... 183

Concluding Remarks...... 185

Appendix A: Internet Survey ...... 187

Appendix B: Interview Questions ...... 197

Appendix C: Survey Solicitation ...... 199

Appendix D: References ...... 200

List of Figures

Figure 2.1: The IAD Framework ...... 44

Figure 2.2: Reflexivity and Institutional Theory Linkage Point ...... 48

Figure 2.3: The IAD Framework and Cybersecurity ...... 67

Figure 4.1: Educational Levels of Survey Respondents .... 107

Figure 4.2: Current Occupation of Survey Respondents .... 108

Figure 4.3: Personal Security Concerns of Respondents ... 110

Figure 4.4: Views on Security Policy Enforcement...... 112

Figure 4.5: Views on Areas for Internet Security Policy ...... 113

Figure 5.1: Reflexivity and Institutional Theory in the IAD Framework...... 173

List of Tables

Table 2.1: Reflexivity and Insider Threat Promising Practices...... 55

Table 3.1: Survey Questions Supporting Question 1a ...... 83

Table 3.2: Survey Questions Supporting Question 1b ...... 85

Table 3.3: Survey Questions Supporting Question 1c ...... 86

Table 3.4: Survey Questions Supporting Question 1d ...... 87

Table 3.5: Interview Question Categories ...... 94

Table 4.1: Views on Receiving Warning Information ...... 111

Table 4.2: Views on Participating in Policy-Making ...... 114

Table 4.3: Views on the Importance of Maintaining Trust . 117

Table 4.4: Views on Avoiding Trust-Impacting Actions .... 117

Table 4.5: Generated Results for TRUST ...... 118

Table 4.6: Views on Describing Product Information ...... 119

Table 4.7: Analysis of TRUST and AUCTIONS ...... 120

Table 4.8: Views on Getting Approval Before Posting Information ...... 121

Table 4.9: Analysis of TRUST and POSTING ...... 122

Table 4.10: Check E-mail Sources Before Forwarding Them . 123

Table 4.11: Analysis of TRUST and EMAIL ...... 123

Table 4.12: Respondent Views on Personal Identity and Action ...... 124

Table 4.13: Analysis of TRUST and IDENTITY ...... 125

Table 4.14: Importance of Another’s Judgment of Actions . 127

Table 4.15: Generated Results for RECIPROCITY ...... 128

Table 4.16: Analysis of RECIPROCITY and POSTING ...... 129

Table 4.17: Analysis of RECIPROCITY and EMAIL ...... 130

Table 4.18: Views on Sharing Another’s Contact Information...... 131

Table 4.19: Analysis of RECIPROCITY and PRIVACY ...... 132

Table 4.20: Analysis of RECIPROCITY and IDENTITY ...... 133

Table 4.21: Respondent Views on Activities Being Monitored...... 135

Table 4.22: Analysis of MONITOR and AUCTIONS ...... 136

Table 4.23: Mask or Use Another Identity at Chat Rooms .. 137

Table 4.24: Analysis of MONITOR and MASK ...... 138

Table 4.25: Analysis of MONITOR and IDENTITY ...... 139

Table 4.26: Views on Security Jeopardizing Activities ... 141

Table 4.27: Scan Personal Computers for Viruses ...... 141

Table 4.28: Analysis of ACCESS and VIRUS ...... 142

Table 4.29: Careful with Activities when Social Networking...... 143

Table 4.30: Analysis of ACCESS and CHATROOM ...... 144

Table 4.31: Operate Home Wireless Networks with Passwords ...... 145

Table 4.32: Analysis of ACCESS and WIRELESS ...... 145

Table 4.33: Summary of Quantitative Results ...... 147

Table 4.34: Questions on Reflexivity Theory and Cybersecurity...... 150

Table 4.35: Questions on Trust and Reciprocity ...... 153

Table 4.36: Question on Self-Monitoring Habits ...... 156

Table 4.37: Questions on FISMA ...... 158

Table 4.38: Question on Additional Policy Ideas ...... 164

Chapter 1: Introduction and Overview of the Research

Research Purpose

Throughout their daily lives, people from around the world access and use a variety of widely shared resources in the course of engaging in their work and leisure activities. A place where such resources are found is conventionally known as a commons. Examples of a commons in nature are a forest, a lake, an ocean, open rangeland, and the atmosphere. Examples of human-made commons are a public road, a public park, and public portions of the Internet. A commons may also include nontangible resources such as scientific or other scholastic knowledge (Hess and Ostrom, 2007:3-9). It is difficult to exclude individuals from, or limit their use of, such widely accessible and shared resources once they are available, and difficult to control self-serving behaviors and prevent the exploitation and inevitable overuse of a commons.

The exploratory research described here examines the challenges involved in governing use of a commons, and proposes theory-based concepts to inform policies governing use of a commons. Employing a mixed methods research design with U.S. Federal cybersecurity policy and the government’s use of the Internet as an illustrative case, the research draws upon two social science-based theories: Reflexivity Theory and Institutional Theory. Reflexivity Theory holds that individuals in social settings reflect on their own actions as both observers and participants, and that they modify their behaviors accordingly (Giddens, 1991:35; Soros, 2003:2-3; and Umpleby, 2007:515-516). Institutional Theory holds that societies can organize themselves around informal and formal rules, norms, and organizations to sustain regular and cooperative patterns of human behavior (Anderson, 2003:19; North, 1993; Ostrom, 2005:3; Peters, 2005:18-19; and Scott, 2008:48-49). The research described here proposes that a common element of both theories can be used to help promote policies for more socially conscious use of a commons.

Through quantitative analysis using collected survey data, and qualitative analysis using data from interviews, the research applies concepts from the two theories to analyze behavioral use of the Internet, and inform cybersecurity policy affecting the Federal government’s use of this prevalent commons. While the Internet and the government’s use of this resource is one case, the research suggests applicability of the findings to inform governance policies affecting other commons found in society.


This chapter has four main parts. First, it outlines the contextual problem, commons governance, and various approaches used for managing the use and protection of the finite resources in a commons. Next, the chapter introduces the research case, Federal cybersecurity policy, which helped guide the data collection and analysis. An outline follows of the research strategy and study questions that provided focus for the research. Finally, this chapter concludes with an overview of the topics addressed in subsequent chapters.

The Problem in Context: Commons Governance

Nobel Laureate Elinor Ostrom asserted that resource problems affecting a commons “are among the core social dilemmas facing all peoples” (Ostrom, 2005:219). Social dilemmas are situations in which individuals must make choices under interdependent circumstances, such that each person’s choices affect the outcomes available to others. These dilemmas are more likely to occur when the strategies and actions of individuals pursuing their own self-serving interests lead to suboptimal, potentially destructive results affecting a broader community (Green and Shapiro, 1994:74; Hess and Ostrom, 2007:5; and Ostrom, 2003:19-20). A social dilemma may even occur when individuals consume resources produced by their community without making their own contributions and efforts to produce these goods, essentially “free riding” at the expense of all other community members (Green and Shapiro, 1994:74-77).

One prevailing social dilemma affecting a commons, given the resource’s potentially uncontrolled accessibility, is the potential for overuse, which can in some cases even lead to extinction of the resource due to the hedonistic actions of individual consumers. In 1968, University of California professor Garrett Hardin called this type of situation a “tragedy of the commons.” He noted that:

Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons. Freedom in a commons brings ruin to all. (Hardin, 1968:1244)

This “tragedy” also describes outcomes that can occur as a result of polluting activities, in which consumers and producers introduce destructive contaminants into the resource. As Hardin warns, “We are locked into a system of ‘fouling our own nest,’ so long as we behave only as independent, rational, free-enterprisers” (Hardin, 1968:1245).
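Hardin’s logic can be restated as a simple marginal calculation, and a small numerical sketch makes it concrete. The sketch below is illustrative only and is not part of the dissertation’s analysis; the payoff values (a private gain of +1 per added animal, with an overgrazing cost of 1 split equally among all herders) are hypothetical assumptions chosen to mirror Hardin’s herdsman example.

```python
# Illustrative sketch of Hardin's "tragedy of the commons" logic.
# Hypothetical assumptions: each added animal yields its herder a
# private gain of +1, while imposing an overgrazing cost of 1 that
# is shared equally among all herders on the pasture.

def marginal_payoff(num_herders: int) -> float:
    """Net gain to ONE herder from adding ONE more animal."""
    private_gain = 1.0                # captured entirely by the adder
    shared_cost = 1.0 / num_herders   # overgrazing cost, split by all
    return private_gain - shared_cost

for n in [1, 2, 10, 100]:
    print(f"{n:>3} herders: adding an animal nets the adder "
          f"{marginal_payoff(n):+.2f}")

# Output: +0.00, +0.50, +0.90, +0.99. With more than one herder the
# adder's net gain is always positive, so each rational, self-
# interested herder keeps adding animals, even though every addition
# pushes the shared pasture closer to ruin.
```

The sole herder (n = 1) internalizes the full cost and has no incentive to overgraze; as the group grows, nearly all of the cost falls on others, which is precisely why “freedom in a commons brings ruin to all.”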

Ruin of a commons can result from self-serving or “free riding” behaviors by individuals who are using the resource recklessly rather than adhering to conservation-minded, collective action along with others in the community, with a view toward preserving the commons for future generations. As Ostrom notes:

All efforts to organize collective action, whether by an external ruler, an entrepreneur, or a set of principals who wish to gain collective benefits, must address a common set of problems. These have to do with coping with free-riding, solving commitment problems, arranging for the supply of new institutions, and monitoring individual compliance with sets of rules. (Ostrom, 1990:27)

In order to counter overuse and ruin of a commons, Ostrom suggests collective action, which entails thoughtful, cooperative behaviors on the part of individual consumers and their community, with reciprocal action and mutual trust as essential elements (Ostrom, 2005:272,287; and Ostrom and Walker, 2003:7). However, collective action can be difficult to accomplish, as observed by economist Mancur Olson:

Unless the number of individuals in a group is quite small, or unless there is coercion or some other special device to make individuals act in their common interest, rational, self-interested individuals will not act to achieve their common or group interests. (Olson, 1971:2)

A fundamental challenge to securing collective action is that there is no real motivation for individuals to cooperate with others when they can have full, unrestrained use of a commons without having to pay for it (Friedman, 2002:596; Green and Shapiro, 1994:9; and Ostrom, 2003:20). Unless there is some external stimulus or coercion, self-interested behaviors can prevail, leading to overuse and potential extinction of a commons.
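Olson’s observation can likewise be restated as an N-person public goods game. The sketch below is illustrative only; the parameter values (an endowment of 10, a production multiplier of 1.5, and a group of 20) are hypothetical assumptions, not figures from the dissertation or from Olson.

```python
# Minimal N-person public goods game illustrating the free-rider
# problem. All parameter values are hypothetical.

def payoff(contributes: bool, others_contributing: int, group_size: int,
           endowment: float = 10.0, multiplier: float = 1.5) -> float:
    """Payoff to one member: whatever endowment is kept, plus an equal
    share of the multiplied pool of everyone's contributions."""
    my_contribution = endowment if contributes else 0.0
    pool = (others_contributing * endowment + my_contribution) * multiplier
    return (endowment - my_contribution) + pool / group_size

n, k = 20, 19  # group of 20; the other 19 members all contribute
print("contribute:", payoff(True, k, n))    # 15.00
print("free ride: ", payoff(False, k, n))   # 24.25

# Free riding pays more whenever the multiplier is smaller than the
# group size, so a rational, self-interested member withholds; if all
# members reason this way, each keeps only the endowment of 10 instead
# of the 15 that universal cooperation would yield.
```

The arithmetic mirrors Olson’s point: in anything but a small group, or absent “coercion or some other special device,” individual rationality undercuts the group interest.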

Given the challenges to achieving collective action, what options are available? Public policies promoting conservation-minded and non-destructive practices can help prevent a potential “tragedy of the commons.” Some scholars have suggested establishing property rights and privatization for selectively managing a commons, or using direct governmental intervention to induce a more regulated use of commons resources (Demsetz, 1967:354-356; and Ophuls, 1973:52). However, growing research in new self-organization and self-governance approaches, such as those suggested by Elinor Ostrom and others, provides viable alternatives that can still foster collective action (Dixit, 2004:12-13; and Ostrom, 1990:15-25). Researchers are introducing policy analysis tools and concepts that can help identify and develop new forms of commons governance policies. The exploratory research described here is one such effort: it introduces supplementary policy formulation concepts for helping strengthen commons governance policies.

To understand the relevance and value of researching new considerations for governance policy, it is necessary to understand the general context of the problem. As mentioned above, individual consumers engaging in self-motivated behaviors in sufficient numbers can be a root cause of the overuse, and even the potential ruin, of a commons.

Protecting finite resources requires promoting collective behaviors and actions among all consumers, and placing an emphasis on conserving and protecting resources rather than exploiting them. Yet formulating and implementing effective management and governance processes is a challenge. Some scholars believe that these matters should be left up to institutional authorities, such as the government, to resolve, or that society should rely on market forces to control the problem. Others do not view either of these approaches as an entirely successful solution for long-term governance. As Ostrom declared:

Neither the state nor the market is uniformly successful in enabling individuals to sustain long- term, productive use of natural resource systems. Further, communities of individuals have relied on institutions resembling neither the state nor the market to govern some resource systems with reasonable degrees of success over long periods of time. (Ostrom, 1990:1)

Developing alternative governance policies, and in particular those with more self-governing mechanisms, requires an understanding of the basic concept of governance. The Oxford English Dictionary defines governance as “controlling, directing, or regulating influence; control, sway, mastery” (Oxford English Dictionary, 1989). Nobel Laureate Oliver E. Williamson defines governance, in the context of economics, as “the means by which to infuse order, thereby to mitigate conflict and realize mutual gain” (Williamson, 2009). Jan Kooiman describes governing as the activities that purposefully “guide, steer, control or manage” sectors or facets of societies (Kooiman, 1993:2). A common theme in these definitions is that some essence of control and influence on human behaviors is needed in order to instill order. With a commons, such influence and control should help move individuals away from self-serving, uncontrolled consumption, towards more favorable collective, conservation-minded behaviors.

Conventional approaches involving the use of property rights or government intervention typically are designed to influence and control the individuals in society who are using a commons. However, these policy options may be prone to inefficiencies and failures. For example, property rights may incur legal expenses, and may have other costs and risks associated with private enforcement, such as fees for service or tolls, if the government cannot provide adequate protection, or if additional monitoring or policing costs are required during the implementation of the rights (Demsetz, 1967:348; Dixit, 2004:129; and Ostrom, 1990:12). Government intervention can also lead to: the inefficient allocation of resources; expenses associated with regulatory action; the introduction of personal biases in regulatory policy as lawmakers seek to satisfy the desires of their constituents; and costs associated with implementing regulatory monitoring and enforcement activities (Friedman, 2002:612-613; Ostrom, 1990:9-11; and Weimer and Vining, 1999:167,183,194).

Self-governance policies provide an alternative approach to shape behaviors and encourage more collective action. Communities benefit from cost savings through self-enforcement and self-monitoring and less third-party enforcement, and when social norms evolve naturally towards sustaining long-term collective action. Moreover, self-governing approaches can gain greater acceptance by members of the community than government intervention (Ostrom, 1990:25; Ostrom, 2000:154; and Ostrom, Walker, and Gardner, 1992:405,414). Kooiman defines self-governance as “the capacity of societal entities to provide the necessary means to develop and maintain their own identity, and thus show a relatively high degree of social-political autonomy” (Kooiman, 2003:79). Furthermore, there is an important linkage between collective action and self-governance, as described by political science theorist Vincent Ostrom:


If there is a shared community of understanding and a reasonable level of consensus about how to address common problems, people will exercise a significant influence in monitoring, facilitating, and constraining one another’s behavior rather than presuming that it is only governments that govern. (V. Ostrom, 1991:17)

However, while policy approaches that incorporate more self-governance mechanisms can help in governing the use of a commons, these approaches are still evolving in current public policy practice. As Elinor Ostrom notes:

What is missing from the policy analyst’s tool kit—and from the set of accepted, well-developed theories of human organization—is an adequately specified theory of collective action whereby a group of principals can organize themselves voluntarily to retain the residuals of their own efforts. (Ostrom, 1990:24-25)

The research presented here contributes to studies in self-governance policy concepts for managing a commons, and suggests new concepts for policies governing the Federal government’s use of one of society’s most ubiquitously shared resources—the Internet.

The Case: Federal Cybersecurity Policy

Research expert Robert K. Yin defines a case in research as the set of real-life events that supports data collection and provides a contextual boundary for evaluating the findings of the research. For example, a case may describe a crisis event, the social life in a community, or an abstract process (Yin, 2003:13-15). For this research, the Internet, as an illustrative commons, and governance policy addressing security for one of its prominent users, the U.S. Federal government, serve as the case to guide the data collection and analysis.

The Internet has origins dating back to 1969, with a Department of Defense (DoD) Advanced Research Projects Agency (ARPA) sponsored project to create a secure network that interconnected university research centers conducting defense-related research. Named the ARPANET and initially comprising four sites, this network continued to expand; in 1984, upon the deactivation of the original ARPANET, over 500 interconnected computers remained, making up a new cross-country network called the “Internet.” In 1987, the DoD transitioned operational oversight of the Internet to the National Science Foundation (NSF), and in 1995, the NSF transferred management of the Internet over to the private sector. This transition to the private sector promoted increased access and public use of the Internet (Dodd, 1998:184-185; and Hafner and Lyon, 1996:154,240-244). Growth of the Internet accelerated, increasing the number of both public and private sector users around the world. In 2010, the International Telecommunication Union estimated that there were over 2.4 billion people using the Internet (International Telecommunication Union, 2011).


The Internet is part of a broader, global information environment called cyberspace. Cyberspace encompasses a vast globally networked environment of information technology and telecommunications systems. The White House formally defines cyberspace as “the interdependent network of information technology infrastructures, and includes the Internet, telecommunications networks, computer systems, and embedded processors and controllers in critical industries” (White House, 2009:1). The U.S. Department of Homeland Security further maintains, “Cyberspace underpins almost every facet of American life, and provides critical support for the U.S. economy, civil infrastructure, public safety, and national security” (U.S. Department of Homeland Security, 2011:iv). Cyberspace is virtually borderless across the international community; as the Center for Strategic and International Studies argues, “Cyberspace spans the globe. No single nation can secure it” (Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency, 2008:20).

Moreover, the Internet and cyberspace are also termed by some a “Global Commons,” a label that also characterizes the maritime, air, and outer space environments (Denmark and Mulvenon, 2010:11). Others view the Internet as simply a “pseudo commons,” having some elements of sovereignty exercised across nation states, albeit sovereignty that is still weak and fragmented (Lewis, 2010:62). Still others have identified the Internet as a “New Commons” that has evolved from new technologies, has no pre-existing rules, and has no clear institutional arrangements (Hess, 2008:1-4). Founder and former Digital Library of the Commons director Charlotte Hess differentiates between “Traditional Commons,” which include forests and grazing lands, and “Knowledge Commons,” which include the Internet (Hess, 2008:13).

No two commons are alike, and there can be unique differences in the impact of externalities, positive or negative by-products or effects, on the resource. For example, pasture land can be depleted from overgrazing, and while the analogy is not precise here, in that one person’s use of the Internet does not take away from another’s ability to use the same information, there can be problems in accessing and using the Internet itself if there are too many users at the same time. On the other hand, the Internet is prone to malicious attacks from both external and insider users, a more negative externality than farmers competing for stock feed from a pasture. Cybersecurity can be viewed as a public good, nonrivalrous and nonexcludable, and can be available to all users (Mulligan and Schneider, 2011:71). The Internet, a representative commons, has similar but also unique challenges that can affect its long-term sustainability.

Nevertheless, viewing the Internet as a representative commons also evokes challenges from social dilemmas and self-motivated user actions, as experienced in other commons (Hess and Ostrom, 2007:4-5). In addition to encouraging overuse and exploitation, such problems can adversely affect the overall security of the Internet and cyberspace for interconnected computers and general users.

The White House highlighted such security concerns in cyberspace in a recent cybersecurity policy report:

The architecture of the Nation’s digital infrastructure, based largely upon the Internet, is not secure or resilient. Without major advances in the security of these systems or significant change in how they are constructed or operated, it is doubtful that the United States can protect itself from the growing threat of cybercrime and state-sponsored intrusions and operations. (White House, 2009:i)

In addition, given its diverse and globally distributed nature, the Internet can be viewed as essentially a user-controlled environment:

State and non-state actors are able to hack, intrude, corrupt and destroy data with relative impunity... Security in the cyber commons is often self-provided by users rather than by a central authority. (Denmark and Mulvenon, 2010:29)

Having a decentralized, user-controlled resource also presents challenges for effective long-term governance:

Governance in cyberspace resembles the American Wild West of the 1870s and 1880s, with limited governmental authority and engagement. Users—whether organizations or individuals—must typically provide for their own security. Much of cyberspace operates outside the strict controls of any hierarchical organizations. (Rattray, Evans, and Healy, 2010:149)

This governance problem complicates the effective implementation and management of security controls that can protect users and systems operating in cyberspace.

The International Telecommunication Union defines cybersecurity as “the collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and organization and user’s assets.” The cyber environment “includes users, networks, devices, all software, processes, information in storage or transit, applications, services, and systems that can be connected directly or indirectly to networks” (International Telecommunication Union, 2009:2).

Cybersecurity threats can come from the outside, from both state and non-state actors, and can include both organizational and individual users. Existing cybersecurity shortfalls in the U.S. are exposing computers to external threats from foreign countries, terrorist cells, organized crime, and criminal factions intent on stealing or corrupting U.S. government data (Committee on Improving Cybersecurity Research in the United States, 2007:3; Director of National Intelligence, 2009:9; and White House, 2003:5-7).

In addition to external threats, there can be internal threats to cybersecurity, and they can be just as catastrophic. Internal threats involve the legitimate, authorized insiders, such as employees and network operators, who cause severe security problems by stealing vital information and/or destroying network assets:

An organization’s people (employees, customers, third parties, suppliers, etc.) are its greatest asset and its weakest link. Human error is overwhelmingly stated as the greatest weakness this year (86%), followed by technology (a distant 63%). (Deloitte, 2009:30)

Insider threats to cybersecurity can result from careless actions, and from either deliberate or inadvertent violations of cybersecurity policy that open up networks to external attacks or introduce viruses and malicious software (“malware”) into networks. The U.S. Government Accountability Office (GAO) cites some of the other problems that can be encountered with insiders:


The disgruntled organization insider is a principal source of computer crime. Insiders may not need a great deal of knowledge about computer intrusions because their knowledge of a target system often allows them to gain unrestricted access to cause damage to the system or to steal system data. The insider threat includes contractors hired by the organization, as well as employees who accidentally introduce malware into systems. (U.S. Government Accountability Office, 2010:4)

The insider threat can come from malicious and deliberate actions taken by insiders to steal or destroy information, or to deliberately sabotage an organization’s computer systems (Cappelli, Moore, and Trzeciak, 2012:1; and National Science and Technology Council, 2006:39). Moreover, as behavioral scientists Eric Shaw, Keven Ruby, and Jerrold Post write, “damage done by insiders far outweighs that by ‘hackers’ and other outsiders…the average cost of insider attacks is nearly fifty times greater than that by outsiders” (Shaw, Ruby, and Post, 1998:1). Cybersecurity experts Dawn Cappelli, Andrew Moore, and Randall Trzeciak write, “Insider threats are influenced by a combination of technical, behavioral, and organizational issues, and must be addressed by policies, procedures, and technologies” (Cappelli, Moore, and Trzeciak, 2012:145). A comprehensive approach to cybersecurity policy must particularly address some control of insider user behaviors on the Internet. Furthermore, research is needed concerning the sociological, behavioral, and psychological factors affecting Internet usage (Committee on Improving Cybersecurity Research in the United States, 2007:6; President’s Information Technology Advisory Committee, 2005:46; and Shaw, Ruby, and Post, 1998:2-3).

The research described here contributes to behavioral research by focusing on insider behaviors in the context of Reflexivity Theory. This may help to identify new governance policy concepts for the Internet and offer guidance applicable to other commons environments. While malicious insider threats are indeed serious threats to cybersecurity, the focus of this research is on those behaviors related to unintentional insider threats from authorized insiders—in other words, careless or negligent cybersecurity actions. Unintentional insider threats can be defined as “insiders who accidentally affect the confidentiality, availability, or integrity of an organization’s information or information systems” (Cappelli, Moore, and Trzeciak, 2012:xxi). These actions, though unintentional, can still open networks to catastrophic external attacks or can propagate serious internal cybersecurity vulnerabilities. The selection of unintentional insider threats for the research focus is not to discount malicious insider threats, but rather to focus the integration of the Reflexivity and Institutional Theories in the context of social dilemmas that may be associated with unintentional insiders. Additionally, a focus on malicious insiders would require an alternative approach to the data sampling used in this research, potentially even requiring a review of criminal information and records, if available. Exploring unintentional insider user behaviors and related policy considerations makes Federal cybersecurity policy an appropriate case for this research.

The Federal government is one of the Internet’s prominent organizational user populations, with over 4.4 million active employees (U.S. Office of Personnel Management, 2010). This available pool of Internet users requires effective and comprehensive security policies to address external and internal threats to cybersecurity. Federal cybersecurity policy affects federal information systems used by executive agencies, contractors of agencies, and organizations acting on behalf of executive agencies (National Institute of Standards and Technology, 2010:2). However, challenges with effective cybersecurity policy continue, as the GAO reported:

The threats to information systems are evolving and growing, and systems supporting our nation’s critical infrastructure and federal systems are not sufficiently protected…actions are needed to enhance security over federal systems and information. (U.S. Government Accountability Office, 2011a:11)


Growing concerns with cybersecurity warrant new governance policy concepts and approaches. An examination of new policy concepts for securing Federal information systems on the Internet, as an illustrative case, will guide exploration of applicable governance policy considerations for other commons as part of satisfying the overall research strategy.

Research Strategy and Questions

According to Yin, exploratory research can involve any, or a combination, of five types of general research strategies (experiments; surveys and interviews; documentation reviews; historical reviews; and case studies), depending on the research questions. Surveys and interviews examine contemporary events where little control exists over events during research, and where the research answers “who,” “what,” “where,” “how many,” and “how much” questions (Yin, 2003:5-6). For this research, documentation reviews, an Internet-based survey, and interviews provided the means to collect the data needed for answering the research questions.

Research questions provide focus for the research activity, and help frame and guide implementation of the research. Yin writes, “Defining the research questions is probably the most important step to be taken in a research study” (Yin, 2003:7). For the research described here, two general questions provided context for the research, and subsidiary research questions guided the specific data collection activity and analysis. The first general question provided scope for examining the use of the two linked theories for helping improve commons governance policy:

Research Question #1: How may Reflexivity and Institutional Theories be used to help improve the formulation of commons governance policies?

Aligned with this first research question are four subsidiary research questions focusing on human behaviors associated with Reflexivity and Institutional Theories:

Question #1a: To what extent will actors avoid behaviors that could jeopardize the trust of other actors using a commons?

Question #1b: To what extent will actors avoid behaviors that could jeopardize reciprocal action with other actors using a commons?

Question #1c: To what extent will actors avoid behaviors that could bring attention of their actions to third-party monitoring agents?

Question #1d: To what extent will actors avoid behaviors that could adversely affect their own future ability to use a commons?

The second general question provided focus on the practical use of the two theories to improve Federal cybersecurity policy:


Research Question #2: How may Reflexivity and Institutional Theories be used to improve Federal cybersecurity policies governing use of the Internet?

Associated with this second general question is one subsidiary research question focusing on user behaviors when using the Internet that can affect cybersecurity:

Question #2a: Under what circumstances will actors reflect on their own actions when using the Internet and avoid behaviors that would adversely affect their ability to use the Internet?

Units of analysis support the original research study questions and characterize the particular entities or phenomena researched in the effort (Frankfort-Nachmias and Nachmias, 2000:47; Rea and Parker, 2005:158; and Yin, 2003:22-25). For the research described here, the unit of analysis is Internet users. An Internet-based survey was the method used to collect data from sampled Internet users, and documentation reviews and interviews supported additional collection of data on current Federal cybersecurity policy.
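As a purely illustrative aside, survey data of this kind are often summarized as cross-tabulations of paired behavioral items, such as the TRUST and AUCTIONS variables reported in Chapter 4. The sketch below is a hedged example of one such analysis; the item wordings, response codings, and the use of a chi-square test here are assumptions for demonstration, not the dissertation’s actual procedures, which Chapters 3 and 4 specify.

```python
# Hedged sketch: cross-tabulating two hypothetical survey items and
# testing for association. Variable names, codings, and the choice of
# a chi-square test are illustrative assumptions only.
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical responses from sampled Internet users.
responses = pd.DataFrame({
    "TRUST":    ["high", "high", "low", "high", "low",
                 "high", "low", "high"],
    "AUCTIONS": ["avoid", "avoid", "engage", "avoid", "engage",
                 "engage", "engage", "avoid"],
})

crosstab = pd.crosstab(responses["TRUST"], responses["AUCTIONS"])
chi2, p_value, dof, expected = chi2_contingency(crosstab)

print(crosstab)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
```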

In Chapter 3, the research strategy is described in more detail.

Summary

Chapter 1 presented the purpose and value of this research endeavor, as well as the research study questions. It also introduced the two foundational theories—Reflexivity Theory and Institutional Theory—that form the basis for the proposed supplementary policy concepts used for commons governance. Finally, this chapter introduced the general strategy of inquiry, a survey and interviews, supporting the research and analysis.

Chapter 2 provides a review of the relevant literature, including an examination of the two underlying theories and the research case, Federal cybersecurity policy. Chapter 3 outlines the research methodology and design for collecting data through the survey and interviews, and reviews the data analysis approach, as well as limitations to the findings. Chapter 4 presents and summarizes the research results. Chapter 5 presents the main conclusions; considerations for applying these results to future policy formulation; and suggestions for further research.


Chapter 2: Review of the Literature

Introduction

Substantiating the relevance of new policy concepts for commons governance requires first of all understanding key principles from Reflexivity Theory and Institutional Theory, and then reviewing the specifics of the case. Chapter 2 presents a literature review of these two theories, as well as the relevant theoretical linkage point used as the basis for developing new governance policy. This chapter will discuss General Systems Theory and Reflexivity Theory, and Institutional Theory, including a review of the Institutional Analysis and Development (IAD) Framework, an innovative policy analysis diagnostic tool developed by Elinor Ostrom and her colleagues at the Workshop in Political Theory and Policy Analysis. Federal cybersecurity policy will also be reviewed in this chapter. This review will establish the background and context for evaluating, through empirical activity, the utility and applicability of the research findings.

Social Systems and Reflexivity Theory

First introduced by Austrian-born biologist Ludwig von Bertalanffy shortly after World War II, General Systems Theory (GST) provided a new theoretical approach for analyzing complex phenomena and objects found in nature, society, and the physical sciences. GST provided a way of making broad interpretations of both internal and external interactions occurring between individual elements inside larger, more complex systems (von Bertalanffy, 1968:37-38).

GST defined a “system” as a conglomeration of numerous individual elements bonding or interacting together in some way different from the general external environment in which they individually operate. For example, complex systems, such as automobiles, aircraft, and even biological organisms, consist of individual parts, or elements, that work together to complement and sustain the overarching system. As von Bertalanffy explained, GST theorized a science of “wholeness,” wherein elements of a complex system are viewed as collectively working together rather than in isolation from each other (von Bertalanffy, 1968:37-38).

Effectively used in engineering and natural science inquiries, GST also showed applicability and relevance in evaluating the sociological influences upon and organizational interrelationships of human social systems. American sociologist Talcott Parsons described social systems as consisting “in a plurality of individual actors interacting with each other in a situation which has at least a physical or environmental aspect” (Parsons, 1951:5).

Likewise, the prominent political scientist David Easton explained, “All social systems are composed of the interactions among persons and that such interactions form the basic units of these systems” (Easton, 1965:36). Social systems vary in size and can even broadly encompass society itself.

Easton described society as the “most inclusive social system,” and “the only one that encompasses all the social interactions of the biological persons involved” (Easton, 1965:47). Similarly, German sociologist Niklas Luhmann wrote, “Society is the all-encompassing social system that includes everything that is social” (Luhmann, 1995:408). Luhmann viewed social contact as an aspect of a system where “every social contact is understood as a system, up to and including society as the inclusion of all possible contacts” (Luhmann, 1995:15). For his part, von Bertalanffy viewed the social sciences—economics, political science, sociology, and social psychology—as an actual “science of social systems” (von Bertalanffy, 1968:194-195).

Broad organizational systems consist of the people, objects, and activities that constitute the working elements of these unique types of social systems. The interactions and interdependencies of all elements, working together, support the organizational system (Coleman and Palmer, 1973:78-79; Katz and Kahn, 1978:20; Scott, 2008:x; and von Bertalanffy, 1968:196). GST introduced a new theoretical approach for identifying and evaluating complex social problems through the modeling and analysis of social interrelationships, organizational structures, and social and political processes.

GST identified two distinct types of systems—closed and open. Closed systems, generally applicable to systems described in the physical sciences, are isolated from direct interactions with their external environments. Just as chemicals in a sealed beaker are isolated from direct contact with the air, closed systems are kept isolated from outside influences. Open systems, on the other hand, interact with their external environments. For example, living organisms conduct chemical exchanges with their environments as part of the normal process of respiration, inhaling oxygen, and exhaling carbon dioxide (von Bertalanffy, 1968:39-41,121-122).

In the social sciences, the open systems model appropriately describes the majority of social systems. Social scientists Daniel Katz and Robert L. Kahn stress this point when discussing social organizations:

Social organizations are flagrantly open systems in that the input of energies and the conversion of output into further energetic input consist of transactions between the organization and its environment. (Katz and Kahn, 1978:20)

As these interactions become increasingly interdependent and cooperative, collective organizational action can be promoted (Katz and Kahn, 1978:20-21; and Parsons, 1951:72).

Within social systems, the basic element is the individual person (Easton, 1965:35). Social systems are prone to experiencing social dilemmas as individual humans, each in pursuit of their own self-serving interests, come into conflict with each other:

Social systems by their nature are in a constant state of relative turmoil, because they comprise independent actors (individual, group, or corporate) seeking their own interests and in so doing coming into conflict with other independent actors seeking different interests. (Dunsire, 1993:33)

Such social dilemmas can inhibit productive collective action, and can lead to the overuse of commonly shared resources.

The human attribute of agency drives the personal actions and behaviors that contribute to social dilemmas. Famed British sociologist Anthony Giddens described agency as the capability of individuals to act or intervene in events, where “the reflexive monitoring which the individual maintains is fundamental to the control of the body that actors ordinarily sustain” (Giddens, 1984:9). Sociologist Frances Cleaver discusses a similar aspect of agency:

In social theory, agency is seen as the capability, or power to be the originator of acts and a distinguishing feature of being human. Reflexivity and the ability to act purposively commonly feature in definitions of agency. (Cleaver, 2007:226)

Human agency links deliberative action to conscious thought, as psychologist Albert Bandura writes:

People possess self-directive capabilities that enable them to exercise some control over their thoughts, feelings, and actions…forethought is translated into incentives and guides for action through the aid of self-regulatory mechanisms. (Bandura, 1989:1179)

Human agency also relates to reflexivity.

Reflexivity is the quality or condition of being reflexive, where reflexive describes a wide range of circumstances that involve the turning or bending of something back upon itself. The act of bending something back can involve light and heat, physiological actions, and even mental actions, including those that are a part of self-examination (Oxford English Dictionary, 2009). A wide range of disciplines incorporate the concept of reflexivity: linguistics, philosophy, physics, biology, political science, art, literature, law, economics, and sociology (Bartlett, 1987:10-18; and Suber, 1987:260-261). Reflexivity can even occur in scientific and doctoral study when researchers reflect on their own actions throughout the researching process. A reflexive approach can benefit research by improving the integrity of the analysis and the results of the research (Alvesson, Hardy, and Harley, 2008:497; Bourdieu, 2004:4-5; Finlay, 2002:531-532; Forbes, 2008:450; Johnson, 1997:284; and Kleinsasser, 2000:157).

In sociology, reflexivity describes the self-conscious act of a person viewing themselves as both an observer and subject in some social activity (Callero, 2003:119). Giddens explains that reflexivity “should be understood not merely as ‘self-consciousness’ but as the monitored character of the outgoing flow of social life” (Giddens, 1984:3). Sociologist Vladimir Lefebvre argued that individuals possess a special quality called reflexivity, and in making decisions through the use of this quality, they increase their personal view of themselves as ethical beings (Lefebvre, 1982:xvii-xx,55,59; and Lefebvre, 2006:1). Heinz von Foerster, one of the founders of cybernetics, the science of communications and control systems, was a proponent of factoring the observer into the process of making observations. According to von Foerster, “we have to observe our own observing, and ultimately account for our own accounting” (von Foerster, 2003/1979:285). Finally, social theorist Margaret Archer related reflexivity to influencing personal choices when taking a particular course of action:

Everyone is a reflexive being. This means that we deliberate about our circumstances in relation to ourselves and, in the light of these deliberations, we determine our own personal courses of action in society. (Archer, 2003:167)

A common theme expressed throughout these examples is the deliberative aspect of self-monitoring, and consequently adjusting, personal behavior and action.

Considering reflexivity and human agency as essential factors in self-monitoring and adjusting behavior and action suggests a means to help address social dilemmas occurring in social systems. Reflexivity suggests an added dimension of self-reflection for monitoring and adjusting human behavior:

All human beings continuously monitor the circumstances of their activities as a feature of doing what they do, and such monitoring always has discursive features. (Giddens, 1991:35)

Similarly, systems and behavioral scientist Stuart Umpleby suggests a linkage between human thought and action that can affect a social system:

If people change the way they think, they will change the way they behave. And if people change the way they behave, they will eventually change the way they think, as they make their thoughts consistent with their actions. If enough people change the way they behave, the social system will operate differently. (Umpleby, 1997:638-639)


Notions about reflexive thought and action affecting social systems are further codified in social science’s Reflexivity Theory.

Reflexivity Theory, as it relates to sociology, holds that human actors reflect on their own thoughts and actions while participating in social settings. This notion of the ability of self-consciousness and reflection to influence thinking even appeared in the writings of Plato:

In the exercise of thought, the soul, as I fancy it, is simply engaged in conversation, questioning itself and answering, affirming and denying. And when, having reached a definition, whether slowly or by a more rapid impulse, it at length agrees and affirms undoubtingly, we state this to be its opinion. (Plato, 1894:187)

In the seventeenth century, the Right Rev. Edward Reynolds, Lord Bishop of Norwich, suggested a relationship between reasoning and reflexivity in affecting personal action:

So it is in those two offices of reason, the transient and reflexive act, that whereby we look outward on others, or inward on ourselves, especially where there is passion to withdraw and pervert it. (Reynolds, 1826: 209)

In 1934, social psychologist George H. Mead similarly described the attribute of reflexivity in affecting general human behaviors:

It is by means of reflexiveness—the turning-back of the experience of the individual upon himself—that the whole social process is thus brought into the experience of the individuals involved in it; it is by such means, which enable the individual to take the attitude of the other toward himself, that the individual is able consciously to adjust himself to that process. (Mead, 1934:134)

Margaret Archer similarly wrote, “Were we humans not reflexive beings there could be no such thing as society. This is because any form of social interaction, from the dyad to the global system, requires that subjects know themselves to be themselves” (Archer, 2003:19). As with human agency, Giddens described the importance of reflexive awareness in influencing an individual’s self-identity and consciousness (Giddens, 1991:52-53).

The principles of Reflexivity Theory continue to evolve conceptually in the social sciences and to have important applications in these studies. As social psychologist Ray Holland writes:

Human reflexivity defines personal existence and is the basis on which people form social units. It is therefore the process which needs to be kept at the center of any method of appraising human existence, including the accounts of that existence provided explicitly by theorists or metatheorists. (Holland, 1999:481).

Reflexivity Theory also has practical applications in the area of economics. George Soros advanced key aspects of Reflexivity Theory in emphasizing the importance of reflexive behaviors in global investing:

I envision reflexivity as a feedback loop between the participants’ understanding and the situation in which they participate, and I contend that the concept of reflexivity is crucial to understanding situations that have thinking participants. (Soros, 2003:2)

Soros views the principles of Reflexivity Theory as relevant for individuals who are seeking to understand the situations they are a part of, and he views the theory as useful even for improving public policy (Soros, 2003:34-35).
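Soros’s feedback loop lends itself to a compact numerical sketch. The update rules below, in which perceptions chase the observed situation while biased actions based on those perceptions move the situation in turn, are hypothetical assumptions for illustration; this is neither Soros’s own model nor part of the dissertation’s analysis.

```python
# Illustrative two-way feedback ("reflexivity") between participants'
# perceptions and the situation they participate in, here a price.
# Both update coefficients and the 5% optimism bias are hypothetical.

price, perception = 100.0, 100.0
for step in range(5):
    # Cognitive function: participants revise their view toward the
    # observed situation.
    perception += 0.5 * (price - perception)
    # Participating function: acting on that (optimistically biased)
    # view moves the situation itself.
    price += 0.4 * (1.05 * perception - price)
    print(f"step {step}: perception = {perception:6.2f}, "
          f"price = {price:6.2f}")

# Neither variable settles where it would if the other were fixed:
# each side keeps responding to the other, the self-reinforcing loop
# Soros argues is characteristic of situations with thinking
# participants.
```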

Reflexivity Theory has other applications, such as in analyzing and improving organizational management (Alvesson, Hardy, and Harley, 2008; and Umpleby, 2010), and in anthropological research (Davies, 1999; Holland, 1999; Pink, 2001; and Scholte, 1972). Examples of other empirical research using Reflexivity Theory have included Majidi’s (2006) study on how culture affects perceptions and supports reflexivity, and Campbell’s (2010) use of reflexivity to support a theory of human consciousness. As the research described here will suggest, Reflexivity Theory also has practical application in developing new cybersecurity policy.

In summary, this section reviewed key concepts of Reflexivity Theory, and in particular the human aspect of reflexivity in influencing human agency and behaviors. When reflexive influences shape individual behaviors, they could motivate behavior toward more collective action within social systems. As will be shown in the next section, Institutional Theory and Reflexivity Theory together address something specific about human behaviors to promote collective action.

Institutional Theory

Formal governmental institutions such as executive branches, legislatures, the courts, and governing constitutions have been a traditional focus of study in political science since its earliest beginnings (Anderson, 2003:19). In recent years, a better understanding of the importance of institutions in formulating public policy has emerged (Parsons, 1995:323). The economist Douglass C. North, a Nobel Laureate, maintains:

Institutions are the humanly devised constraints that structure human interaction. They are made up of formal constraints (rules, laws, constitutions), informal constraints (norms of behavior, conventions, and self imposed codes of conduct), and their enforcement characteristics. (North, 1993)

Elinor Ostrom describes institutions in the context of societal interactions:

Institutions are the prescriptions that humans use to organize all forms of repetitive and structured interactions including those within families, neighborhoods, markets, firms, sports leagues, churches, private associations and governments at all scales. (Ostrom, 2005:3)

And according to public policy expert Wayne Parsons, “Institutionalists argue that our primary focus should be how the institutional arrangements within a society shape human behaviour” (Parsons, 1995:224).

A common theme in all of these views is the role that institutions can play in shaping human behavior, especially in promoting greater collective action. Before outlining how Institutional Theory relates to Reflexivity Theory, and how together they can benefit commons governance, it is important to review some of the background of Institutional Theory.

Formal Institutional Theory in the social sciences emerged in the late nineteenth century, with initial theoretical contributions coming from economists in Germany and Austria, and later from the United States. These early institutional theorists proposed that human behaviors and social processes, shaped by cultural and historical forces, were relevant influences on economic processes.

The European institutional concepts complemented those coming from economists in the United States such as Thorstein Veblen and John Commons. Veblen claimed that habits and conventions drove individual behaviors (Scott, 2008:2-3). Commons proposed that institutional rules, such as property rights and common law, constrained individual and firm-related action, and that transactions were a basic exchange occurring between individuals (Commons, 1961:55-74).

Neoclassical economic theory, emphasizing that rational thinking would cause economic actors to focus on maximizing their own personal satisfaction or utility, dominated economic thought in the early to middle twentieth century. With the emergence of new institutional economics in the 1970s, however, there was a return to viewing institutions as an important element in economic theory (Scott, 2008:3-5,28). And more contemporary institutional economists, such as Nobel Laureate Ronald H. Coase, have advanced theories on the rules and governance systems that affect economic production and transactions, further expanding new institutional theory into economics (Coase, 1991).

During the late nineteenth and early twentieth centuries, political scientists in Europe and the United States also viewed institutions as important. These scholars, however, focused their discussions on explaining formal institutional structures such as constitutions, legal codes, and political systems. From the mid-1930s through the 1960s, political science theory transitioned to a more behaviorist orientation in order to explain the motivations and actions of political actors, voters, and political parties. During this time political science theorists also attempted to define their theories more solidly in order to gain recognition for their discipline as a true science (Peters, 2005:12; and Scott, 2008:xi,5-6,31).

In the 1970s and 1980s, Rational Choice Theory (RCT) emerged. RCT applied economic principles to support the assumption that individual political actors choose actions that provide them with the best personal benefits and outcomes, or the maximum utility (Calhoun et al., 2007:81; Green and Shapiro, 1994:14-15; and Scott, 2008:6-8). This orientation dominated the field of political science until institutional theories reemerged with “new institutionalism” in the 1980s (Scott, 2008:8).

The new institutionalism movement, advanced by scholars such as James March and Johan Olsen, emerged as a counter to the former behaviorist model. March and Olsen, in advancing the theory of political institutions, wrote:

Political life is ordered by rules and organizational forms that transcend individuals and buffer or transform social forces…norms of appropriateness, rules, routines, and the elaboration of meaning are central features of politics. (March and Olsen, 1989:160,171)

These new institutionalist theories followed one of two major orientations. Historical institutionalism focused on how politics, processes, and institutional structures changed over time, and how social constraints on actor behaviors evolved. Rational choice institutionalism, having its roots in economic theory, focused on defining governance systems while viewing these systems as primarily constructed by individuals intent on advancing their own personal interests (Barzelay and Gallego, 2006:532; Durant, 2007:440-441; Hall and Taylor, 1996; Peters, 2005:16; and Scott, 2008:31-35). Both orientations highlighted the importance that institutions had in affecting political behavior and preferences (Scott, 2008:35).

Sociologists such as Herbert Spencer, with his influential work comparing and contrasting institutions across different societies, and his suggestion that society is actually an evolving organic system having institutional subsystems, have also acknowledged the importance of institutions (Scott, 2008:8). William Graham Sumner built on Spencer’s theories to describe how institutions evolve with defined purposes, functions, and structures. Other contributions to institutional analysis include those of Karl Marx, with his work in redefining political and economic structures for society; Émile Durkheim, who viewed systems of knowledge, beliefs, and moral authority as forming social institutions; and Max Weber, who emphasized the importance of rules in defining social structures, social behavior, and economic behavior (Calhoun et al., 2007:141; Scott, 2008:10-13; and Weber, 1947).

In the United States, thought leaders such as sociologist Talcott Parsons, with his theories on normative standards, morals, and value patterns and how culture affects behavior, have advanced current institutional theory (Calhoun et al., 2007:141; and Scott, 2008:14). Parsons describes social institutions as the “body of rules governing action in pursuit of immediate ends in so far as they exercise moral authority derivable from a common value system” (Parsons, 1937:407). New sociological institutionalism has sought to explain institutions and has presumed the presence of purposive individual action (Nee, 1998:1).

Institutions also have a relationship to social systems. Parsons views institutions as essential in patterning the interactive relationships within a social system: “The most fundamental mechanisms of social control are to be found in the normal processes of interaction in an institutionally integrated social system” (Parsons, 1951:301). Institutions help promote more cooperative behaviors over self-motivated ones. Parsons writes, “The function of institutions is always the same—the regulation of action in such a way as to keep it in relative conformity with the ultimate common values and value-attitudes of the community” (Parsons, 1990:331). Peters also highlights a relationship with human agency where he writes, “Institutions are the products of human agency, and the structure of rules and incentives that is created to shape the behaviour of the participants is a choice of the designers” (Peters, 2005:163). In sum, the institutional movement in sociology complemented the institutional theories from political science and economics with the common recognition that institutions can shape human behaviors.

Institutional theorist Avner Greif highlights four key elements of an institution that can regulate social behavior: rules, norms, beliefs, and organizations (Greif, 2006:30). Central to institutions are rules. Rules are enforced prescriptions of required, prohibited, or permitted actions commonly shared by individuals, and result from explicit or implicit efforts to achieve order and predictability of actions (Ostrom, 1986:5; and Ostrom, 2005:6,16). Norms are shared concepts of certain actions, or outcomes, of what “must, must not, or may be appropriate” (Ostrom, 2005:112). In following norms, individuals subordinate their own personal actions or self-interests to the collective interest (Calhoun et al., 2007:84). Beliefs are opinions or ideas regarding the structure and details of an individual's experience, or behavioral beliefs that are opinions about the behavior of others (Greif, 2006:36). A fourth component of institutions, organizations, provides the formal mechanisms needed for governance, produces and disseminates rules, and perpetuates beliefs and norms (Greif, 2006:37).

Ostrom also highlights the importance of trust and reciprocity in promoting collective action and the long-term maintenance of self-governing institutions (Ostrom, 2005:272,287). Ostrom maintains that there is a relationship between “trust, reciprocity, and reputation,” and that a lack of these can bring about negative effects (Ostrom, 2003:53). And according to Cook and Cooper, trust and reciprocity among individuals benefit and strengthen institutions:

The capacity to engage in mutually beneficial relationships based on trust and reciprocity extends the reach of institutionally supported activities, however, and enables a range of interactions that would not be possible in the absence of such institutions. (Cook and Cooper, 2003:209)

Within the context of cybersecurity, there is a role for trust to play in the Internet environment:

Relational trust requires ongoing, experiential information about another person, nonrepeated interactions between individuals with no prior communication are not based on trust: they are acts of risk-taking. (Cheshire, 2011:52)

For the research described here, trust and reciprocity also have an association with Reflexivity Theory, where reflexive action contributes to these collective action enablers.

In general, Institutional Theory supports greater analytical insight into how policy and processes can be established to influence self-governing, cooperative behaviors, with important applications in the areas of economics, political science, and sociology. Institutional Theory has also been useful in studies in information and knowledge management, information technologies and innovation, and electronic government planning (King et al., 1994:158; and Luna-Reyes and Gil-García, 2011:329-345).

Elinor Ostrom and her “Workshop” colleagues at Indiana University have devised a structured diagnostic tool, the Institutional Analysis and Development (IAD) Framework, useful for assessing institutional characteristics and change. And, as will be shown in the following section, the IAD Framework is useful for depicting a key linkage, a common element, between Institutional Theory and Reflexivity Theory.

The IAD Framework

The IAD Framework is designed to help identify and analyze situations shaped by rules and norms that affect individual behaviors. This includes analyzing the effects of introducing new rules, norms, social elements, and even environmental influences (Figure 2.1). Elinor Ostrom and Charlotte Hess explain that the IAD Framework “can be used to investigate any broad subject where humans repeatedly interact within rules and norms that guide their choice of strategies and behaviors” (Ostrom and Hess, 2007:41). The IAD Framework can also help instantiate the theoretical linkage between Institutional Theory and Reflexivity Theory, and can show how application of this linkage helps in developing new commons governance policy.

Figure 2.1 The IAD Framework

The IAD Framework can support empirical analysis of various scenarios involving commons-type resources, focusing on general variables that help analyze human interactions and their consequences in a variety of institutional settings (Ostrom, 2009:408-414). It guides analysis through three broad areas relevant to an institution’s design and activity: underlying factors, which include physical, community, and institutional factors; the Action Arena; and outcomes.

In describing the first area, underlying factors, the Biophysical and Material Conditions outline the material and tangible elements that affect action and are in turn transformed by the actions from the Action Arena. The Attributes of Community describe the community in which the Action Arena takes place, and may include resource users and policy-makers. The Rules address the core component of an institution, that is, the known and enforced instructions, guidelines, or policies that shape participant actions and behaviors. This includes day-to-day operational rules, policy-making rules, and constitutional rules.

The next major section depicted in the IAD Framework is the Action Arena. This is the heart of the IAD Framework, used for analysis of the behavioral impacts of introducing any changes of policy or process to the institution (Ostrom, 2005:13-15). The Action Situations component specifically focuses on and describes the types of interactions, whether cooperative or not, between participants. Here the IAD Framework aids in examining actions and reactions of the participants that may be influenced by the Biophysical and Material Conditions, the Attributes of Community, and the Rules.

Interactions include exogenous influences, activities, constraints, and incentives that affect participants. Outcomes identify the resultant changes, and the feedback that reinforces change, to the physical, community, and institutional characteristics and activities in the Action Arena. Finally, the Evaluative Criteria in the IAD Framework provide the basis for assessing the outcome measurements (Ostrom, 2005:13-27; Ostrom, 2009:413-416; and Ostrom and Hess, 2007:42-63).
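
To make this structure concrete, the components just described can be rendered as a simple data structure, as in the minimal Python sketch below. The class and field names are illustrative assumptions for exposition only; they are not part of Ostrom’s formal specification of the framework.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical rendering of the IAD Framework's major components;
    # names are illustrative, not Ostrom's formal terminology.
    @dataclass
    class ActionArena:
        action_situations: List[str]  # types of interaction between participants
        participants: List[str]       # decision-making entities in positions

    @dataclass
    class IADAnalysis:
        biophysical_conditions: List[str]  # material and tangible elements
        community_attributes: List[str]    # resource users, policy-makers
        rules: List[str]                   # operational, policy-making, constitutional
        arena: ActionArena
        interactions: List[str] = field(default_factory=list)  # exogenous influences
        outcomes: List[str] = field(default_factory=list)      # changes and feedback
        evaluative_criteria: List[str] = field(default_factory=list)

Structuring an analysis this way simply makes the three broad areas, and the Action Arena they feed, explicit objects that an analyst can populate and compare across institutional settings.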

Overall, the IAD Framework provides an effective tool to help in defining and analyzing human behaviors affected by the physical, community, and institutional factors. The IAD Framework has been effectively used in a number of research projects: investigating oil and gas development in the Yukon (May, 2007); modeling decision making in public organizations (Heikkila and Isett, 2004); examining water adjudication systems of California and Nevada (Bethurem, 2009); assessing governance in Mekong and Rhine River basins (Myint, 2005); managing private and public urban property (Choe, 1992); analyzing banking reform in the U.S. (Polski, 2003); examining groundwater banking programs in the Central Valley of California (Pinhey, 2003); and comparing State and national management practices of public forests (Koontz, 1997). The framework also provides a valuable instantiation point to show linkage between Reflexivity Theory and Institutional Theory.

Linking Reflexivity and Institutional Theories

There is an important link between the Reflexivity and Institutional theories. Sociologists Henri Lustiger-Thaler, Louis Maheu, and Pierre Hamel have observed:

Institutions are points of entry for the construction of personal and social experiences…Institutions are confronted by the reflexivity of actors, their own internal reflexivity and expanded informational networks. (Lustiger-Thaler, Maheu, and Hamel, 2001:47-50)

The IAD Framework’s Action Arena helps to highlight a common element between concepts of Reflexivity Theory and Institutional Theory (Figure 2.2).

Figure 2.2 Reflexivity and Institutional Theory Linkage Point

Analysis of the behaviors and actions of participants and their responses to physical, community, and institutional factors takes place in Action Situations, and it is in this part of the IAD Framework that the concepts of Reflexivity Theory and Institutional Theory can come together. Consideration of the human agency of the participants as important in shaping behaviors in an action situation, whether involving use of a commons or not, is the nexus point and common element between the two theories.

As discussed previously, reflexivity and human agency are essential factors in the self-monitoring and shaping of behaviors. Human agency is also a factor when the IAD Framework is used as a tool for Institutional Theory. Regarding the Action Situations in the IAD Framework, Ostrom writes, “Participants in an action situation are decision-making entities assigned to a position and capable of selecting actions from a set of alternatives” (Ostrom, 2005:38). Giddens described agency as the capability of individuals to act or intervene in events (Giddens, 1984:9), and Cleaver wrote, “Agency is seen as the capability, or power to be the originator of acts” (Cleaver, 2007:226).

Human agency involves reflection on the consequences of actions, establishing goals, and planning and selecting courses of action, and this is consistent with the nature of action situations depicted in the IAD Framework. Bandura emphasizes, “People anticipate the likely consequences of their prospective actions, they set goals for themselves, and they plan courses of action likely to produce desired outcomes” (Bandura, 1989:1179). Human agency concepts have informed a wide range of research areas: applications to autonomous leadership and development (Norris, 2011); research into models of human agency in computer simulations (Francisco, 2010); examining institutions and human agency in fragile countries (Narasimhan, 2012); reviewing management and institutional frameworks affecting water resources (Fans, 2005); and studying human rights (Shea, 1985). Appreciating the relevance of human agency in affecting human behaviors, as a foundational concept depicted through both Reflexivity Theory and Institutional Theory, provides a useful and constructive concept for commons governance policy. It is also constructive in addressing cybersecurity policies.

The Research Case: Federal Cybersecurity Policy

Information and communications technologies link to the Internet and benefit society by enhancing collaboration and collective action:

Information and communications technologies provide powerful opportunities for collective actions namely, the sharing of “rich” messages among a very large number of people, and, as such, they are the right tools for the creation and expansion of virtual communities. (Foray, 2004:30)

These communications technologies provide support to the critical infrastructures of the United States such as those supporting electricity, water, air traffic control systems, and financial systems. These infrastructures also support sensitive military and intelligence operations (Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency, 2010:5; Committee on Improving Cybersecurity Research in the United States, 2007:2; President’s Information Technology Advisory Committee, 2005:iii,2,6; and White House, 2011:1).

As discussed in Chapter 1, state and non-state actors, both organizational and individual Internet users, continue to threaten the cybersecurity of the United States, as Defense Secretary Leon Panetta has observed:

Capabilities are available in cyber[sic] to virtually cripple this nation: to bring down the power grid, to impact on our governmental systems, to impact on Wall Street and our financial system and to literally paralyze this country. (Panetta, 2012)

Outside threats from foreign countries, terrorist cells, organized crime, and criminal factions can involve the theft, corruption, and destruction of U.S. telecommunications and information technology systems, as well as the information they contain. These systems control the critical infrastructures that support the nation’s vital domestic and national security requirements; access to water and electricity; transportation delivery; the ability to gather intelligence; and military defense. These threats may also include acts of information terrorism perpetrated by conventional terrorist groups and by information culture groups that exercise aggressive acts within cyberspace (Post, Ruby, and Shaw, 2000:97,111).

Actions that can be taken to address these outside threats include: conducting intelligence operations to identify and counter the actions of the instigators; creating trusted Internet connections that interface with Federal government systems; deploying intrusion detection systems and sensors to identify and respond to malicious activity; increasing information sharing and coordination between national cybersecurity operations centers; and working with international partners and the private sector on risk management strategies (U.S. Department of Homeland Security, 2011:A-1; White House, 2008:1-5; and White House, 2011a:11).

While mitigating the threats from external actors focuses on protecting and monitoring U.S. infrastructures from intrusion, insider threats ranging from malicious to unintentional activity require a focus on the individual actors themselves. Actions that can be taken to address malicious insider threats include screening and selecting personnel before they are hired and given access to networks; monitoring the online activity of users; and effectively managing individuals with personalities prone to conducting malicious activity (Shaw, Ruby, and Post, 1998a:3,9-10).

The monitoring of critical resources is particularly important in mitigating insider threats. As Camp writes, “Monitoring resources in a shared domain is one of the simplest but most underused governance mechanisms” (Camp, 2011:98). However, in some cases monitoring to verify trust is actually related to potential sanctions being in place:

Individuals have a strong incentive to act cooperatively when robust monitoring and assurance structures are present, but cooperation in these cases has more to do with sanctions and other negative outcomes than interpersonal trust. (Cheshire, 2011:54)

Cybersecurity promising practices designed to address malicious insider threats also help in mitigating the effects of unintentional insider threats (Cappelli, Moore, and Trzeciak, 2012:xxi). Periodic security awareness training for employees can help instill cultural change, increasing awareness of security issues and ensuring that employees understand the policies, procedures, and potential sanctions that might be imposed for failure to adhere to the guidelines. Effective password and account management is also crucial to maintain control of systems access, to support system monitoring as needed, and to enable attribution of any malicious or unintentional actions that might impact system security. Logging, monitoring, and auditing online actions to identify any malicious activity are also promising practices (Cappelli, Moore, and Trzeciak, 2012:xxi,147-149).
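
As one small illustration of the logging-and-auditing practice, the Python sketch below scans a hypothetical authentication log for repeated failed log-ons by the same account. The log format and the threshold are assumptions made for illustration; they do not describe any particular product or the practices cited above.

    from collections import Counter

    # Hypothetical log lines of the form "<time> <user> <LOGIN_OK|LOGIN_FAIL>";
    # the format and the threshold of 3 are illustrative assumptions.
    def flag_repeated_failures(log_lines, threshold=3):
        failures = Counter()
        for line in log_lines:
            _, user, event = line.split()
            if event == "LOGIN_FAIL":
                failures[user] += 1
        return [user for user, count in failures.items() if count >= threshold]

    audit_log = [
        "09:00:01 alice LOGIN_OK",
        "09:00:07 mallory LOGIN_FAIL",
        "09:00:12 mallory LOGIN_FAIL",
        "09:00:19 mallory LOGIN_FAIL",
    ]
    print(flag_repeated_failures(audit_log))  # ['mallory']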

Relevant linkages to reflexivity may further enhance the effectiveness of cybersecurity promising practices through the shaping of insider behaviors. Table 2.1 presents examples of ways in which insider threat promising practices can be reinforced by considering the attributes of reflexivity. The reinforcement actions suggested in Table 2.1 may help refine the implementation of promising practices and future cybersecurity policy.

Table 2.1 Reflexivity and Insider Threat Promising Practices

Columns: Insider Threat Promising Practice | Reflexivity Attribute and Reinforcement Action (Source: Cappelli, Moore, and Trzeciak, 2012:147-149)

Promising Practice: Periodic security awareness training (policies, procedures, technical controls) for employees.
  Reflexivity Attribute: Employee reflects on the influence of policies, procedures, and technical controls on his/her online actions.
  Outcome: Reinforcement of training; alternative online behaviors.
  Reinforcement Action: Tailor training to the individual; clearly identify acceptable norms of behavior on the Internet.

Promising Practice: Strict password and account management policies and practices.
  Reflexivity Attribute: Employee reflects on his/her actions in accessing and using the network.
  Outcome: Self-monitoring of behavior; consideration of attribution with his/her actions.
  Reinforcement Action: Implement security banner notification with log-on; institute signed agreements identifying the norms of behavior required for gaining/keeping access.

Promising Practice: Log, monitor, and audit online activity.
  Reflexivity Attribute: Employee reflects on his/her actions that may be monitored and documented.
  Outcome: Self-monitoring of behavior; consideration of attribution with his/her actions being monitored and audited.
  Reinforcement Action: Implement security banner notification highlighting potential for online monitoring; inform employees of past documented infractions.

From this brief analysis of insider threat promising practices it can be seen that some associated behaviors may be shaped through reflexivity and through the corresponding human agency of the individual user to exercise some self-regulation. Organizations and cybersecurity policy can improve the effectiveness of promising practices through various reinforcing actions that leverage reflexivity, as indicated in Table 2.1. In this way, reflexivity can enhance the effectiveness of cybersecurity promising practices. This short analysis was intended to provide a few concrete examples of approaches that can be useful in leveraging the principles of reflexivity to shape Internet user behavior.
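
As a concrete example, the security-banner reinforcement action suggested in Table 2.1 could be approximated in an application’s log-on flow along the lines of the Python sketch below. The banner wording and the acknowledgment step are illustrative assumptions, not official notice language; the point is to prompt the user to reflect on the rules and the monitoring of the session before proceeding.

    # Illustrative log-on banner; wording and acknowledgment step are
    # assumptions for this sketch, not official notice language.
    BANNER = (
        "NOTICE: This system is for authorized use only.\n"
        "Activity on this system may be monitored, logged, and audited.\n"
    )

    def logon_with_banner() -> bool:
        print(BANNER)
        reply = input("Type 'I AGREE' to acknowledge and continue: ")
        return reply.strip().upper() == "I AGREE"

    if __name__ == "__main__":
        if logon_with_banner():
            print("Acknowledged; proceeding to log-on.")
        else:
            print("Not acknowledged; access denied.")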

Other research efforts into insider behaviors and cybersecurity have included: examining online behaviors involving privacy and information assurance (Chai, 2009); reviewing information security behaviors of organizational employees (Herath, 2008); outlining human resource management for information systems security (Edwards, 2011); and identifying working professionals’ security behaviors and practices (Grant, 2010). Research into organizational insider behaviors is particularly relevant, especially for the U.S. Federal government, with more than 4.4 million active employees.

Federal Cybersecurity Policy and The Federal Information Security Management Act of 2002

In order to understand the scope of the cybersecurity challenges facing the Federal government, it is necessary to review the key policies that govern federal information technology. This section reviews the issues raised by the U.S. Government Accountability Office (GAO) involving the security of federal agency information systems, and federal cybersecurity policies, particularly the Federal Information Security Management Act of 2002.

In October 2011, the GAO reported that 24 federal agencies did not have adequate security measures in place for their information systems, and that this included a lack of the necessary protections from both external and internal threats. The agencies assessed by the GAO reported the following types of security issues: unauthorized intrusions into networks; deliberately caused network service interruptions (“denial of service”); the introduction of malicious computer software code; improper computer and network user practices; and scans and probes of networks by unknown individuals (U.S. Government Accountability Office, 2011:4,5,16). The GAO maintains that such weaknesses threaten the confidentiality and availability of information and systems at Federal agencies (U.S. Government Accountability Office, 2011:33). Such problems affect the smooth and effective functioning of the Federal government; and because the Federal government provides services to every American, these vulnerabilities can in turn affect the entire nation.

The roots of Federal cybersecurity policy can be found in the Communications Act of 1934. This Act defined a new national regulatory framework for the telecommunications industry by mandating non-discriminatory interconnection support, suspending tariffs, and requiring the fair and equitable establishment of telephone rates. The Act also created the Federal Communications Commission (FCC), giving it almost unlimited authority to regulate interstate and U.S. access to overseas land and radio communications resources (Brock, 1994:49-52).

In October 1965, Congress passed the Brooks Act of 1965, addressing automatic data processing equipment (computer systems) in the Federal government, and assigning new responsibilities to the Bureau of the Budget (now the White House Office of Management and Budget), the General Services Administration, and the Department of Commerce. The Department of Commerce was now authorized to provide “scientific and technological advisory services relating to automatic data processing,” and, through the National Bureau of Standards (the predecessor to the Department of Commerce’s National Institute of Standards and Technology {NIST}), to make recommendations on standards for Federal computer systems.

The Computer Security Act of 1987 amended the Brooks Act of 1965, giving the National Bureau of Standards (NBS) authority for developing standards and guidelines for Federal computer systems. For sensitive information stored on Federal computer systems, the NBS was required to collaborate with the National Security Agency (NSA). This Act also required the development of security plans by the operators of Federal computer systems “that contain sensitive information,” and required mandatory periodic training for those managing, using, and operating Federal computer systems that held sensitive information. National security systems (systems supporting the Intelligence Community, cryptology supporting national security, and military command and control systems) were not included in these NBS responsibilities.

In 2002, Congress passed the Federal Information Security Management Act of 2002 (FISMA), which amended the Computer Security Act of 1987. FISMA was primarily intended to provide a comprehensive framework for ensuring the effectiveness of information security controls; to coordinate government-wide management and oversight of information security risks; to develop and maintain protective controls; and to increase general oversight of Federal information and information systems.

FISMA expanded the role of the White House Office of Management and Budget (OMB) in information security oversight responsibilities for all Executive Branch departments and agencies, with the exception of national security systems. Agencies were required to follow standards and guidelines developed by NIST; to develop agency-wide security programs; and to take action to ensure compliance (White House, 2009:C-7). FISMA did not specifically address actions and individual behaviors of Federal computer users. Rather, the head of each agency was responsible for providing security protections, complying with standards and policies, addressing the need to have management processes and controls in place, and providing security awareness training for personnel.

The GAO has warned that the lack of the established agency-wide information security programs required by FISMA was an underlying cause of the information security weaknesses it reported in October 2011 (U.S. Government Accountability Office, 2011:16). However, while FISMA was well-intentioned, and despite the progress made in some areas, critics argue that FISMA has not adequately addressed the critical security problems of the nation’s Federal information systems. Furthermore, FISMA is out of date: the guidance established for implementing FISMA and for auditing affected systems is no longer effective in addressing current threats, and has even in some cases slowed down the processes for maintaining security (Paller, 2010). Critics have pointed out a variety of weaknesses in FISMA, including efficiency problems and structural defects; weak performance monitoring and measurement processes; limited security enforcement direction on the use of the Internet; ambiguous and confusing policy language; a lack of currency with technology and current threats; burdens placed on agencies by unfunded policy mandates; and the exclusion of the private sector from policy implementation (Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency, 2008:68-69; Gilligan, 2010; Lewis, 2007; and Silvers, 2006). Critics have called for new approaches and ongoing consideration of ways to improve Federal cybersecurity policy (Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency, 2008:68-69; Gilligan, 2010; Paller, 2010; and Schumm, 2011).

FISMA reform has captured the attention of both the executive and legislative branches of government. In July of 2010, the White House Office of Management and Budget (OMB) issued a memorandum outlining the responsibilities of OMB, the White House Cybersecurity Coordinator, and the Department of Homeland Security (DHS) regarding the implementation of FISMA. This Memorandum stated that OMB is responsible for submitting the annual FISMA report to Congress, developing and approving the cybersecurity portions of the President’s budget, and coordinating with the White House Cybersecurity Coordinator on all policy issues (White House, 2010).

The White House Cybersecurity Coordinator serves as the principal White House official responsible for coordinating interagency cooperation with DHS. DHS was designated with primary responsibility for operational cybersecurity on Federal systems that fall under FISMA oversight, and for overseeing implementation, FISMA compliance, operations and incident response, and for reviewing the cybersecurity programs of agencies (White House, 2010).

Congress is currently addressing FISMA reform through the introduction of a number of new bills in both the House and the Senate. In February 2012 Senators Lieberman, Collins, Rockefeller, and Feinstein introduced the “Cybersecurity Act of 2012” (S.2105), which would increase the role of DHS in addressing cybersecurity, particularly with regard to critical infrastructures; emphasize greater information sharing about cybersecurity between agencies; add an increased emphasis on the continuous monitoring of Federal systems; and address FISMA reform issues such as acquisition risk management, an increased oversight role for DHS, and more responsibility for OMB in setting information technology acquisition standards.

In March 2012 Congresswomen Mack and Blackburn introduced H.R. 4263 (“SECURE IT Act of 2012”) to increase cybersecurity information sharing across the Federal government and with the private sector, and to update roles and responsibilities for FISMA, including closer coordination between DHS and the Department of Commerce in regard to implementation of NIST standards, prosecuting crime in cyberspace, and addressing issues of research and development.

Other policies and laws, while not directly supporting FISMA reform, have an influence on various aspects of monitoring and protecting Federal information systems. National Security Presidential Directive 54/Homeland Security Presidential Directive 23, signed by President Bush in January of 2008, codified the Comprehensive National Cybersecurity Initiative (CNCI) to address Federal as well as national U.S. cybersecurity strategy (White House, 2008). Executive Order 13587 (“Structural Reforms to Improve the Security of Classified Networks and the Responsible Sharing and Safeguarding of Classified Information”) addressed the insider threat to classified national security information (President Obama, 2011) in response to the unauthorized disclosure of classified information from a U.S. soldier to the Internet site WikiLeaks (USA Today, 2012).

Electronic surveillance and monitoring have always presented challenges for intelligence and law enforcement organizations because of Constitutional requirements for protecting the privacy and civil liberties of U.S. citizens. The Communications Act of 1934 protected the privacy of communications on telephones and telegraphs by banning the interception and divulgence of information conveyed through these media, and the Federal Wiretap Act, passed by Congress in 1968, and the Electronic Communications Privacy Act of 1986 extended these protections. In 1978, Congress passed the Foreign Intelligence Surveillance Act, which addressed the limits of electronic surveillance for the gathering of foreign intelligence (White House, 2009:C-5).

Such laws and guidelines present a rather complex and dynamic governance regime for those whose responsibility it is to uphold Federal cybersecurity. These policies also do not meet the current needs of Federal cybersecurity, as the Center for Strategic and International Studies (CSIS) observed: “Federal policies governing cybersecurity activities are also out of date” (Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency, 2011:13). However, a new governance regime drawing on the Reflexivity and Institutional Theories and their common element of human agency may help in improving the current cyberspace environment.

Reflexivity and Institutional Theories for Addressing Cybersecurity

Cybersecurity is indeed a challenge that requires collective action in order to adequately address the commons environment of the Internet, including the Federal government’s portion of cyberspace. The capabilities of the Internet and the users themselves affect online behavior:

Online behavior is shaped by the capabilities and constraints of applications as well as by the socially centered, human factors that influence how people use those applications. (Clark, 2011:15)

The emphasis on informal and formal rules, norms, and organizations in Institutional Theory can inform policy approaches for addressing cybersecurity issues by providing a structured methodology that addresses the key elements necessary to “structure human interaction” (North, 1993), which is essential to long-term cybersecurity governance. Institutional Theory promotes regular and cooperative patterns of behavior among all parties involved, and with cybersecurity this would include the users and operators of the Internet. Institutional Theory concepts may not address the behaviors of all users, particularly those with malevolent intentions; however, increasing collective action within Federal agencies and affiliates can help to eliminate some of the existing problems of enforcement and coordination across all affected parties, including the private sector, thereby improving the level of cybersecurity.

The emphasis in Reflexivity Theory on individuals in social settings reflecting on their own actions as both observers and participants, and then modifying their behaviors, can also help in developing policies that promote the self-monitoring and self-regulating behaviors that improve cybersecurity. Focusing on the issue of human agency and shaping user behaviors on the Internet offers a new constructive approach.

The IAD Framework (Figure 2.3) from Ostrom and her colleagues helps instantiate the linkage between Institutional Theory and Reflexivity Theory. It is also a useful tool for analyzing new approaches for addressing cybersecurity through analysis of the three broad areas: underlying factors; outcomes; and the Action Arena.

Figure 2.3 The IAD Framework and Cybersecurity

In the context of cybersecurity, the Biophysical and Material Conditions identify the material and tangible cyberspace infrastructures affecting action, and are in turn transformed by Action Situations from the Action Arena. When examining a cybersecurity environment, examples of these components would include hardware and software resources, locally-based and enterprise-wide networks, encryption devices, and network intrusion monitoring devices.

The Attributes of Community refers to the community in which the Action Arena takes place, and may include both resource users and policy-makers. Regarding Federal cybersecurity, this could include Internet users, Internet service providers, Federal departments and agencies, NIST, and OMB.

The Rules address the core component of an institution, that is, the known and enforced instructions, guidelines, or policies that affect participant actions and behaviors. In the context of cybersecurity this would include policies addressing privacy, civil liberties, and network monitoring; standards addressing identity and access management; network certification guidelines; and information security standards from NIST (also known as Federal Information Processing Standards {FIPS}).

For the outcomes portion of the framework, Interactions include external influences, activities, constraints, and incentives affecting participants. In the context of cybersecurity, this might include cybersecurity threats, economic and budget limitations, and warning reports from the private sector and even international partners.

Outcomes are the resultant changes, and the feedback that reinforces change, to the physical, community, and institutional characteristics and activities in the Action Arena. In the context of cybersecurity this would include achieving reliable network access, trusted transactions, integrity of information stored on and transiting networks, infrastructure protection, and interagency cooperation.

Finally, the Evaluative Criteria in the IAD Framework provide the basis for assessing the outcome measurements. Measuring and reporting outcomes is one of FISMA’s key requirements, so FISMA measures could be included as outcomes, or metrics could address the number of cybersecurity intrusions, or the number of observed successful attacks on Federal systems.

The Action Arena, the heart of the IAD Framework, is the focal point for analyzing the behavioral impacts of any changes to cybersecurity policy or processes that are introduced, including such activities as information sharing, network and systems monitoring, and cybersecurity incident response. It is also in the Action Arena that the instantiation of the theoretical linkage between Reflexivity Theory and Institutional Theory is identified. Here the reflexive behaviors and human agency of participants in the Federal information systems environment occur, and it is here that analysis can be done regarding the impact of changes in the physical, community, and/or institutional factors. As adjustments are made to each of these three broad areas, the resultant shaping of participant behaviors can be identified and analyzed.

All in all, the IAD Framework provides an effective tool for identifying and analyzing situations that are shaped by cybersecurity rules and norms, and for assessing how they may affect the behaviors of the participants, such as Federal employees, who are using the Internet. While the research described here will not go into a detailed analysis of the IAD Framework for developing new cybersecurity policies, it will show how Institutional Theory could be applied, along with Reflexivity Theory, to address cybersecurity issues and how these two theories could be useful in developing new cybersecurity policy. In the chapters that follow, some of the theoretical concepts represented in the IAD Framework will be examined.

Summary

Chapter 2 provided a review of the relevant literature, including a detailed examination of the two underlying theories and the case. Reflexivity Theory holds that individuals in social settings reflect on their own actions as both observers and participants, and modify their behaviors accordingly. Institutional Theory holds that societies can organize themselves around informal and formal rules, norms, and organizations to sustain regular and cooperative patterns of behavior. Cleaver defines agency as “the capability, or power to be the originator of acts and a distinguishing feature of being human” (Cleaver, 2007:226). Human agency is the nexus point and common element between these two theories: the IAD Framework’s Action Arena is where the link between Reflexivity Theory and Institutional Theory is instantiated, and it can help in the analysis of a commons, including the Internet. This chapter also detailed the elements of the case, Federal cybersecurity policy, which provides a real-life application of the IAD Framework.

The next chapter presents the data-gathering methodologies (electronic survey and personal interviews) that were used to explore the utility of the proposed concepts.

Chapter 3: Research Methodology and Design

Introduction

The research described here draws upon the constructive theoretical linkage between Reflexivity Theory and Institutional Theory and uses a mixed methods approach to help identify new policy concepts for commons governance.

Chapter 3 describes the approach used to fulfill this research objective. First, the exploratory mixed methods research approach is described. Second, the data collection and analysis processes used to address the research questions are introduced. Finally, potential limitations to the validity of the research findings, and the methods used to address them, are reviewed.

Exploratory Mixed Methods Research

General exploratory research focuses on an emerging issue or concept: it is not intended to present widely applicable results, but rather to provide the foundation for conducting more conclusive research in the future (Singh, 2007:63-64). Exploratory research can break new ground, provide insight into new concepts or ideas, and/or test the feasibility and outline the methods needed to conduct more extensive research (Babbie, 2010:92-93). Because both Reflexivity Theory and Institutional Theory are relatively new, exploratory research is appropriate for this dissertation. As new self-governing approaches and new policy considerations for commons governance emerge, cybersecurity remains a challenging issue not only for the Federal government, but also for the entire Internet community. The exploratory research described here introduces new concepts to build on and to help advance self-governance policies that may aid in governing the Internet and other similar commons. A mixed methods research design is appropriate for an exploratory orientation such as the one here.

Mixed methods research involves gathering and analyzing both quantitative and qualitative data. Quantitative data are numerical information collected from surveys and other structured data methodologies. Qualitative data are words collected from narratives, case studies, interviews, and other open-ended text and image data-gathering methodologies. Qualitative data are typically analyzed to depict trends or themes that can help develop concepts and theories (Creswell, 2003:18-20,179; and Patten, 2004:19).

Mixed methods research combines the strengths of quantitative research for testing hypotheses and generalizing findings to a population with the benefits of qualitative research, which provides context as well as insights to help interpret the quantitative findings (Creswell and Plano-Clark, 2007:9-10).

There are six general methodology designs for conducting mixed methods research. The sequential explanatory design involves first collecting and analyzing quantitative data as the primary source, followed by qualitative data collection and analysis, and then integrating the results gathered using both methods. The sequential exploratory design is just the opposite: it involves first collecting and analyzing qualitative data, then collecting quantitative data, and then integrating the results. The sequential transformative design is more flexible: either quantitative or qualitative data collection can be done first, followed by integrating the findings obtained using both methods.

The concurrent triangulation design involves collecting qualitative and quantitative data concurrently, simultaneously integrating the data. Concurrent nested designs follow a similar approach, albeit in this design one method, either quantitative or qualitative, is dominant and focuses the general analysis. Finally, concurrent transformative designs assign equal priority to both methods in focusing the analysis (Creswell et al., 2003:224; and Creswell and Plano-Clark, 2007:71-72,143).

For the research described here, the sequential explanatory design provided the most appropriate methodology for achieving the research goals. Despite the research time this design requires, it has clear, separate phases and is straightforward, providing some ease in reporting the findings (Creswell et al., 2003:223-224). Quantitative data aided in the initial exploration of the relevance of human agency in linking the Reflexivity and Institutional Theories, and of the general relationship of human agency to Internet use. The qualitative data collection elaborated on these initial findings and explored their utility for Federal cybersecurity policy and general commons governance policies. As such, the important first step of implementing the sequential explanatory design involved planning and implementing sampling and quantitative data collection processes.

Data Collection Process

In any data collection process, the researcher must first identify the population that constitutes the entire group of individuals being studied. While the population can be quite large and difficult to examine in totality, as for example when researching an issue that affects every member of a society, a sample is a smaller, finite subset of the population that is drawn in a way to assure that it is representative of the entire population (Conover, 1999:69-70; Frankfort-Nachmias and Nachmias, 2000:163; Gravetter and Wallnau, 2009:3-4; and Warner, 2008:3).

Researchers typically use three common sampling methods: random, stratified, and judgmental (or purposive) sampling. With random sampling, every member of a population has an equal chance of being solicited to participate in the research. With stratified sampling, the researcher selects some factors that are presumed to be pertinent, such as year of birth or country of origin, and then takes random samples from groups assigned on these factors. Judgmental or purposive sampling is non-probabilistic, and the sampling follows the judgment of the researcher based on some characteristic that is especially relevant to the purpose of the research. Purposive sampling does not accommodate full statistical testing. However, data from purposive sampling can still provide valuable information, and can support preliminary explanation of key concepts (Ackoff, 1953:92,104,118; Frankfort-Nachmias and Nachmias, 2000:168-171; Miller and Salkind, 2002:51-53; Patten, 2004:49-51; Rea and Parker, 2005:162-172; and Singh, 2007:108). A fourth method, convenience sampling, provides a ready and convenient source of data, although it is biased and cannot provide a general representation of the population (Davis and Smith, 2005:416; Patten, 2004:45; and Trochim, 2001:56).
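
The differences among these sampling methods can be illustrated with a short Python sketch; the population and strata below are fabricated for illustration only.

    import random

    # Fabricated sampling frame for illustration only.
    population = [{"id": i, "group": "faculty" if i % 5 == 0 else "student"}
                  for i in range(100)]

    # Random sampling: every member has an equal chance of selection.
    simple_random = random.sample(population, 10)

    # Stratified sampling: random samples drawn within groups defined by a
    # factor presumed pertinent (here, faculty versus student status).
    strata = {"faculty": [p for p in population if p["group"] == "faculty"],
              "student": [p for p in population if p["group"] == "student"]}
    stratified = [p for members in strata.values()
                  for p in random.sample(members, 5)]

    # Convenience sampling: whoever is readily at hand; easy to collect,
    # but biased and not representative of the population.
    convenience = population[:10]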

For the research described here, convenience sampling provided the best method to meet the goals of the exploratory research with the readily available data collection resources. Internet users constituted the population of interest as relevant to the purpose of the research. And while Federal cybersecurity policy was the specific case researched, the theoretical concepts being explored address general user behaviors and human agency on the Internet, whether these users are Federal government employees or common citizens.

For the sampling, survey participants were solicited through group notifications posted on the Internet and e-mails sent to individuals. Groups of users were reached via the popular Internet social networking site “LinkedIn®,” and group e-mail solicitations were sent to faculty and students at The George Washington University Trachtenberg School of Public Policy and Public Administration, as well as to business colleagues and associates of the author. The Internet users solicited for participation in the survey constituted a combined pool of more than 135,000 potential participants. This number was derived based on 10 e-mailed associates of the author, 498 e-mail recipients from the Trachtenberg School, and the following LinkedIn® subgroups: California Polytechnic State University Alumni (14,849 members), EOP (1,029 members), Market Research Bulletin (24,692 members), The George Washington Alumni Association (19,500 members), Trachtenberg School (1,049 members), Transformational Leadership: Front Runners in Organizations (12,272 members), U.S. Government Relations and Public Affairs (19,709 members), Washington DC Connections (29,504 members), Whitman College (1,279 members), and Worldwide Intranet Challenge (11,415 members).

The notifications emphasized that participation in the survey was strictly anonymous and confidential. The survey did not collect any personal information; only general demographic information was gathered through questions in the survey. At the conclusion of the nine-month open access period for voluntary participation in the survey, 159 respondents had fully completed the survey, providing data for analysis. While this was a relatively small sample, it was nonetheless useful for exploring the concepts researched.

For the qualitative research phase, the sampled group consisted of cybersecurity experts from both the public and private sectors. E-mail solicitations were used to establish initial contact with the interviewees. The interpretive portion of the research, discussed in Chapter 4, will present the integrated data analysis and results from both the quantitative and qualitative phases.

For this research, missing data was always a concern. Surveys often draw non-responses or inconsistent responses to some questions (Meulman, Van der Kooij, and Heiser, 2004:64; and Singh, 2007:112). In order not to skew the results of the survey, only records in which a respondent elected not to answer any of the survey questions (no survey data provided at all) were initially deleted from the analyzed group. Furthermore, when analyzing a particular survey question, the author deleted any records having missing data, and there was no imputation, or estimation of values, performed on the missing data.

The Survey

The benefits of using an Internet-based survey are convenience, cost-effectiveness, and the ability to ensure confidentiality and security for respondents. The limitations of Internet-based surveys include lower response rates; the bias of self-selection (some participants are not comfortable with the Internet medium and do not answer all questions); and the lack of opportunity for respondents to engage directly with the interviewer or to ask for clarification when they need it (Rea and Parker, 2005:11-12,35). Despite these limitations, because user behaviors on the Internet were a focus of the research described here, an Internet-based survey best met the research goals and was a logical and appropriate method of data collection.

The survey was implemented using SurveyMonkey®, a U.S.-based company that provides Internet-based survey development tools and that hosted the survey on the Internet. The survey consisted of 42 questions that collected respondent demographics, and used multiple-choice list selections, “yes” or “no” options, and Likert attitudinal scaled answers. (See Appendix A for the actual survey questions.)

The primary purpose of the survey questions was to explore respondents’ views about their proclivity toward certain Internet behaviors and actions, and their personal opinions of new policy concepts concerning cybersecurity. Likert-type survey questions are particularly useful for gathering data about personal views and perceptions. Likert-type survey questions typically solicit respondents’ agreement or disagreement with statements provided to them (Miller and Salkind, 2002:330; and Rea and Parker, 2005:68).

The ordering of the questions ensured that the initial questions were straightforward and easy to answer, in an attempt to keep respondents engaged in the survey. The limited use of open-ended questions and the presentation of focused, closed-ended questions were designed to elicit answers that could be easily tabulated and analyzed. The research questions guided the process of framing the survey questions.

As discussed in Chapter 2, reflexivity and the ability to conduct purposive actions are aspects of human agency (Cleaver, 2007:226). Moreover, human agency is common to both Reflexivity Theory and Institutional Theory; as such, exploring reflexivity as a potential influence in shaping human behaviors, such as in using commons resources or the Internet, was a key concept examined in the survey.

In research, an independent variable identifies what is manipulated, either by nature or by the researcher, and represents the cause or treatment under study. A dependent variable identifies what is affected by the independent variable and essentially represents the effects or outcomes (Trochim, 2001:8). As such, in order to explore reflexivity as relevant for shaping behaviors, reflexivity was the independent variable, and the dependent variables represented the behaviors or actions that could be affected by reflexivity.

Survey questions used to address Subsidiary Question 1a (“To what extent will actors avoid behaviors that could jeopardize the trust of other actors using a commons?”) examined the value placed by respondents on the importance of maintaining trust with other resource users, and examined their self-perceived proclivity to avoid actions that might jeopardize this trust. Reflexivity Theory is a component in this question, as it asks respondents to reflect on their own behaviors for maintaining interpersonal trust. Institutional Theory also is a component in this question because, as discussed in Chapter 2, trust between actors is an important element for enabling stronger collective action and self-governance (Cook and Cooper, 2003:209; and Ostrom, 2005:287). Table 3.1 identifies the survey questions selected to address Question 1a.


Table 3.1 Survey Questions Supporting Question 1a

Independent Variable Survey Questions: “How important is maintaining the trust of other users on the Internet to you?” and “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?”
#1 “How frequently do you accurately describe product information when selling at online auctions?”
#2 “How frequently do you get approval before posting information on another Internet user?”
#3 “How frequently do you check e-mail sources before forwarding e-mails to others?”
#4 “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?”

An independent variable was created by adding the results of two survey questions: “How important is maintaining the trust of other users on the Internet to you?” and “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?” While both questions were correlated (Pearson Correlation=0.409, significant at the 0.01 level, 2-tailed), their combined use was essential to support the theories tested.
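Although the composite variable and the correlation check were computed in SPSS®, the procedure is straightforward to sketch. The Python below uses invented responses and a hypothetical variable name, and merely illustrates summing two Likert items after checking their two-tailed Pearson correlation:

    import pandas as pd
    from scipy.stats import pearsonr

    # Hypothetical Likert responses (1-5) to the two trust-related questions.
    df = pd.DataFrame({
        "trust_importance": [5, 4, 3, 5, 2, 4, 5, 3],
        "avoid_trust_loss": [4, 4, 2, 5, 3, 4, 5, 2],
    })

    # Two-tailed Pearson correlation between the two items.
    r, p = pearsonr(df["trust_importance"], df["avoid_trust_loss"])
    print(f"Pearson r = {r:.3f}, p = {p:.3f}")

    # The composite independent variable is the sum of the two items,
    # yielding scores that run from 2 to 10.
    df["trust_composite"] = df["trust_importance"] + df["avoid_trust_loss"]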

Survey questions used to address Subsidiary Question 1b, “To what extent will actors avoid behaviors that could jeopardize reciprocal action with other actors using a commons?”, examined the value respondents placed on maintaining reciprocity with others, and their self-perceived proclivity to avoid actions that might jeopardize reciprocity. As mentioned in Chapter 2, Ostrom argues that there is a relationship between “trust, reciprocity, and reputation” and that the lack of these can bring about negative effects (Ostrom, 2003:53). Therefore an independent variable for this test must bring in these concerns—reputation and trust—along with reflexivity and maintaining reciprocal action. Reflexivity Theory is a component in Question 1b, as this question asks respondents to reflect on their own behaviors for maintaining reciprocal relationships. Institutional Theory also is a component because reciprocity is another key element for collective action (Ostrom, 2005:287; and Ostrom and Walker, 2003:7-8). As such, Table 3.2 identifies the survey questions selected to address Question 1b.


Table 3.2 Survey Questions Supporting Question 1b

Independent Variable Survey Questions: “To what extent do you think other Internet users’ judgment of your online activities on the Internet is important?” and “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?”
#1 “How frequently do you get approval before posting information on another Internet user?”
#2 “How frequently do you check e-mail sources before forwarding e-mails to others?”
#3 “How frequently do you get approval before sharing another Internet user’s contact information?”
#4 “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?”

An independent variable was created by adding the results of two survey questions: “To what extent do you think other Internet users’ judgment of your online activities on the Internet is important?” (a reputation question) and “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?” The two questions were not correlated (Pearson Correlation=0.017, not significant).

Subsidiary Question 1c, “To what extent will actors avoid behaviors that could bring attention of their actions to third-party monitoring agents?”, focused on another aspect of commons governance. The concept of using such agents to help govern the use of a commons, whether these agents are accountable to the resource’s users or are users themselves, is highlighted by Ostrom as an important design principle for establishing long-enduring institutions. These monitors keep watch on the condition of the resource and the activities of users, and provide rule enforcement (Ostrom, 2005:259-265). As with the previous two questions, Reflexivity Theory is a component in Question 1c, as this question asks respondents to reflect on their own behaviors undertaken to avoid receiving undue attention in an environment with third-party monitoring. Table 3.3 identifies the survey questions selected to address Subsidiary Question 1c.

Table 3.3 Survey Questions Supporting Question 1c

Independent Variable Survey Question: “To what extent would you be more careful with your activities on the Internet if they were being monitored by an Internet activity monitoring organization?”
#1 “How frequently do you accurately describe product information when selling at online auctions?”
#2 “How frequently do you mask or use another identity when visiting Internet chat rooms and/or social networking sites?”
#3 “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?”

Subsidiary Question 1d, “To what extent will actors avoid behaviors that could adversely affect their own future ability to use a commons?”, examined respondents’ proclivity to avoid actions that could adversely affect their use of a shared resource. As with the other questions, Reflexivity Theory is a component in the query because the questions address the conscious self-monitoring of respondent behaviors that might be detrimental to their use of the Internet. Table 3.4 identifies the survey questions used to address Subsidiary Question 1d.

Table 3.4 Survey Questions Supporting Question 1d

Independent Variable Survey Question: “To what extent do you know which online activities would jeopardize your personal security when you use the Internet?”
#1 “How frequently do you scan your personal computer for viruses?”
#2 “How much more careful are you with your activities on the Internet when you use social networking sites or chat rooms?”
#3 “How frequently do you operate your home wireless network using a password?”

Analyzing and evaluating the data collected from the survey constituted the quantitative phase of the research. Consistent with the sequential explanatory approach, these quantitative findings then required further analysis and elaboration, leveraging the results obtained with qualitative data. As such, the interviews constituted the qualitative research data-gathering phase.


The Interviews

The interviews explored some of the challenges regarding current Federal cybersecurity policies, and solicited the opinions of interviewees about the new policy concepts and behavioral issues identified in the quantitative phase. Personal interviews follow a variety of formats, depending on the particular requirements of the research. In open-ended interviews, participants are asked questions designed to uncover general facts and opinions about a particular subject. Focused interviews use flexible formats such as open-ended questions, but also include direct questioning based on a particular area of study (Miller and Salkind, 2002:202,310).

For the research described here, the use of focused, non-attributable interviews helped gather data designed to answer the research questions while respecting the limited time interviewees were able to contribute to the interview process. Interviewees representing a broad base of professional expertise—both former government officials and private-sector experts with experience in cybersecurity and in policy analysis and implementation—were selected. A broad range of experience was necessary to ensure adequate coverage of all pertinent areas involving cybersecurity.

Interviewees included a former Director of the National Security Agency, a former headquarters official and advisor to the Director of the Central Intelligence Agency, a former Chief Information Officer of a major Department of Defense agency, a former head of multi-agency information assurance at the National Security Agency, a former official responsible for overseeing White House telecommunications and information technology, two former White House National Security Staff members, a current homeland security official, a nationally renowned expert in cybersecurity and advisor to the President, and two current private-sector market leaders in cybersecurity policy. In total, eleven experts were interviewed.

The author used e-mail to contact fourteen prospective interviewees. The interview solicitations included a copy of the interview questions and a statement affirming that the results of the interview would be non-attributable. The 14 interview questions were designed to address all of the research questions. (The interview questions are included in Appendix B.) The questions asked for interviewees’ personal insights into the Federal Information Security Management Act of 2002 and other cybersecurity policy issues, and the utility of applying the proposed suggestions that were supported by the survey findings to inform future cybersecurity policy. The alignment of the interview questions with the concepts under examination is described in the following section, which outlines the data analysis process, the next step in the sequential explanatory design.

Data Analysis Process

The data analysis process involved identifying common descriptions, themes, and assertions from the data that had been collected. During the quantitative phase of the mixed methods research, the Statistical Package for the Social Sciences (SPSS®), Version 19, was used to cleanse the data and remove records with missing data. SPSS® also supported statistical analysis by generating both descriptive statistics and bivariate analyses that were useful in answering the research questions. Descriptive statistics are calculated to summarize and organize collected data (Patten, 2004:97).

The data analysis activity followed a nonparametric statistical approach. Parametric statistical methods involve the use of variables that have some dependency on the original population’s characteristics, and have a known distributional shape across an interval scale so that it is possible to calculate statistics such as means and variances. Nonparametric statistical methods have fewer testing constraints based on population characteristics, are distribution-free, and generally involve using nominal and ordinal scale data. Nonparametric statistical methods still allow for the calculation of descriptive statistics such as frequencies (Conover, 1999:116-118; Gravetter and Wallnau, 2009:606; and Singh, 2007:162). Likert-type scaled questions associated with surveys are well suited for nonparametric statistical methods (Ackoff, 1953:268; Conover, 1999:115-116; Frankfort-Nachmias and Nachmias, 2000:444; Ho, 2006:357; Miller and Salkind, 2002:330; and Warner, 2008:21-24).

For the quantitative analysis, calculations of frequency distributions provided the first phase of the data analysis activities. Frequency distributions provide an ordering and tabulation of the counts or percentages associated with various categories of measurement in a variable, essentially showing a pattern of responses to the variables under investigation (Frankfort-Nachmias and Nachmias, 2000:321-323; Glenberg and Andrzejewski, 2008:22-24; and Gravetter and Wallnau, 2009:37). Histograms, or bar charts, graphically presented the frequency distribution information, and tables with percentages of tabulated categories in each variable of interest were generated.
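As an illustration only (the author generated these tables in SPSS®), the frequency distribution of one Likert-type question can be tabulated as in the following sketch, which uses invented data:

    import pandas as pd

    # Hypothetical Likert responses (1-5) to a single survey question.
    answers = pd.Series([5, 4, 4, 3, 5, 2, 4, 5, 1, 4], name="response")

    # Frequency distribution: counts and percentages for each category.
    counts = answers.value_counts().sort_index()
    percent = (counts / counts.sum() * 100).round(1)
    print(pd.DataFrame({"count": counts, "percent": percent}))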

The second data analysis activity involved creating contingency tables to present an analysis of bivariate relationships drawing upon selected survey questions. For two variables, a 2x2 contingency table examines the relationship between two properties in each variable, essentially examining four possible combinations of values between the two variables (Conover, 1999:179-181; and Rea and Parker, 2005:180). For the research described here, 2x2 contingency tables were used to test the relationships between pairs of variables from the survey.
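A minimal sketch of this technique, assuming invented, dichotomized responses, is shown below. The chi-square test included here is one common nonparametric test of association for such tables, not necessarily the author's SPSS® procedure:

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical dichotomized answers to two survey questions.
    df = pd.DataFrame({
        "reflexive":      ["yes", "yes", "no", "yes", "no", "yes", "no", "yes"],
        "careful_online": ["yes", "yes", "no", "no", "no", "yes", "yes", "yes"],
    })

    # 2x2 contingency table: the four combinations of the two properties.
    table = pd.crosstab(df["reflexive"], df["careful_online"])
    print(table)

    # Nonparametric (chi-square) test of association between the variables.
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.3f}, p = {p:.3f}")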

For the qualitative portion of the data analysis activity, using data derived from the interviews, summarization of the information helped to identify concepts suggested by the data. To identify concepts in the qualitative interview data, the author classified and categorized information guided by the context of the original interview questions and aligned with the original research questions. This approach was consistent with the requirement for developing a classification or coding scheme, and looking for regularities that could help identify knowledge-revealing patterns in the data. This also supported the qualitative analysis objectives of ensuring that the findings would be coherent and consistent, that they would help in understanding the phenomena studied, and that they would be useful in contributing to new theory (Patton, 2002:463,465-467). For sorting and analyzing the interview data there were five categories: Reflexivity Theory and its relevance to cybersecurity policy; avoiding behaviors that negatively impact trust and reciprocity; self-monitoring habits for cybersecurity; strengths and shortfalls of FISMA; and additional policy ideas for cybersecurity. Table 3.5 presents these categories aligned with the original research questions.


Table 3.5 Interview Question Categories

Reflexivity Theory and Its Relevance to Cybersecurity Policy (Supporting Subsidiary Research Questions 1a, 1b, 1c, and 1d)
- Do you feel that people reflect on the ramifications of their actions when they use the Internet?
- If users were aware of third-party monitoring, do you think they would alter their online behaviors?
- Do you think that people would generally alter their online behaviors if their true identities were visible to other users using the Internet?
- Should these considerations be factored into new cybersecurity policy?

Avoiding Behaviors that Negatively Impact Trust and Reciprocity (Supporting Subsidiary Research Questions 1a and 1b)
- How important is maintaining trust and reciprocity between users on the Internet?
- Do you think users reflect on their own actions to maintain trust and reciprocity with others?
- Do social networking sites promote greater trust between Internet users?
- What policies should address such interactions?

Self-monitoring Habits for Cybersecurity (Supporting Subsidiary Research Questions 1d and 2a)
- What self-monitoring/self-policing practices do you think could promote more effective cybersecurity?

Strengths and Shortfalls of FISMA (Supporting Subsidiary Research Question 2a)
- What do you see as the policy strengths of the FISMA of 2002?
- What are some of the policy shortfalls of the FISMA of 2002?
- What methodologies did policy-makers and their staff members follow to research, analyze, and formulate policy for the FISMA of 2002?
- Do you think there is a clear understanding of the infrastructure boundaries of affected Federal information systems governed by FISMA policies? If not, why not?
- What general security principles from the FISMA could apply to managing other sharable resources, even non-technological resources such as natural resources?
- In general, do you think that the FISMA adequately addresses the need for controlling user behaviors on the Internet?

Additional Policy Ideas for Cybersecurity (Supporting Subsidiary Research Question 2a)
- What new ideas in Federal cybersecurity policy should now be included in future policies?

Using the classification and coding scheme presented in Table 3.5, the author sorted the interview data according to the five categories. Chapter 4 presents the integration of quantitative and qualitative data consistent with this scheme. Throughout the entire data collection and analysis process, it was necessary to maintain research validity and to address potential research limitations. This topic is discussed in the next section of this chapter.

Research Validity and Limitations

Maintaining research validity throughout the data collection and analysis activities is essential for ensuring that the research actually answers the research questions and satisfies the goals of the research. Validity involves making the best and most accurate analysis given the existing circumstances, and ensuring that the measurements obtained in fact measure what was intended to be measured (Frankfort-Nachmias and Nachmias, 2000:149; Patten, 2004:59; Singh, 2007:77; and Trochim, 2001:20). Several types of validity should be considered for the research described here.

Internal validity focuses on cause and effect determinations that are part of the research outcomes. Research experts Donald Campbell and Julian Stanley define internal validity as a determination that the "experimental treatments" used actually make a difference during the research (Campbell and Stanley, 1963:5). Trochim writes:

Internal validity is only relevant in studies that try to establish a causal relationship…The key question of internal validity is whether observed changes can be attributed to your program or intervention (the cause) and not to other possible causes (sometimes described as alternative explanations for the outcome). (Trochim, 2001:172)

It is essential that potential causal factors that were not measured in the research are appropriately identified in the research documentation (Newcomer, 2011). While the research described here is not experimental, internal validity must be addressed to substantiate the evidence that reflexivity, the independent variable, could be a factor in shaping behaviors in using commons resources, such as the Internet, and not something else.

For the research described here, there are three relevant threats to internal validity: measurement effects, situational effects, and selection bias. Measurement effects focus on the potential that, during the process of participating in the experiment or the survey, the participants were affected by the research instrument itself. Situational effects, also known as the Hawthorne effect, address the issue that respondents are aware of the study, and even of its novelty, and that this might affect their answers. Strategies for addressing internal validity include designing the study to discount the effects of these threats, and identifying these other causes in advance and controlling for them (Newcomer, 2011; and Trochim, 2001:172-176).

Measurement effects address the issue of the survey having an effect on the respondents as they proceeded to answer the questions. To address this concern, survey questions focusing on the specific Subsidiary Questions were located on multiple pages of the survey, to avoid grouping, and potentially synchronizing, answers to the questions based on their close proximity. Additionally, initial scoping questions at the beginning of the survey focused on general cybersecurity and Internet-related questions, not those questions specifically related to the Subsidiary Questions. This method provided some control on the potential for a systematic effect in answering the survey.

The survey was pretested with associates of the author for two weeks prior to the survey opening to others, to identify any problems with the instrument and to clarify any confusing survey questions. Finally, once the survey was active, respondents could take the survey only once: when the survey was submitted, the user’s computer Internet identification (Internet Protocol address) was retained by SurveyMonkey® to prevent repeat access to the survey.

Situational effects address the issue of survey participants being aware that they are taking a survey on a specific topic, cybersecurity, and that this novelty might motivate them to change their responses. While it is difficult to completely control for this issue, participation in the survey was strictly voluntary, and respondents were told at the beginning that no personal information was being gathered, in the hope of encouraging them to be candid; trust in the researcher and the process is, of course, a subjective matter. However, the survey was provided to multiple participants, and the interview data were also used to substantiate the findings and increase the internal validity of the resultant data.

Self-selection bias is a threat to both internal and external validity, particularly regarding bias in selecting subjects for the research. One way to control this threat is to take random samples from the population (Trochim, 2001:45,196), but that was not possible in this study. However, the solicitations on LinkedIn® went to a wide range of subgroups with varying member demographics to increase the variety of respondents.


External validity focuses on generalization, or the degree to which the study results may generally apply to other phenomena or populations (Campbell and Stanley, 1963:5; Frankfort-Nachmias and Nachmias, 2000:101; Miller & Salkind, 2002:50; Newcomer, 2011; and Trochim, 2001:42). A research project satisfies the demands of external validity when the results of the research can demonstrate clear applicability to other populations or settings. When the research cannot support such generalizations—the results of the research do not hold true for other phenomena, or even for the originally examined phenomenon—there is a threat to achieving external validity.

To mitigate threats to external validity, it is necessary to identify the domain in which the research's findings provide generalized description that may apply beyond the immediate case in the research effort. Tactics such as following effective sampling strategies and concept mapping, or presenting the interrelationships and similarities of concepts from one environment to another, can help address threats to external validity (Frankfort-Nachmias and Nachmias, 2000:102; Patten, 2004:89; and Trochim, 2001:42-43). Convenience sampling, while providing a ready and convenient source of data, was a limitation and produced a skewed sample: most survey participants had advanced education levels and were unrepresentative of the total Internet population. To demonstrate that the findings may nonetheless apply to other representative commons, despite the sampling limitations, the data analysis and research summary highlighted how the two theories and new policy concepts could be applied to a wider context.

Another applicable validity area, construct validity, addresses the need for the research to follow a well-defined measurement plan, and to follow logically and empirically from the original research goals. To satisfy construct validity, the researcher must demonstrate that theoretical expectations are addressed by using the appropriate research instruments and supporting research processes (Frankfort-Nachmias and Nachmias, 2000:152; Trochim, 2001:64-65; and Warner, 2008:862).

Construct validity requires identifying correct operational measures for the research, and establishing a clear chain of reproducible evidence that is consistent with the original research questions. Having key experts review the research report, and avoiding subjective judgments of the final research results, helps in maintaining construct validity (Trochim, 2001:22,64-65; and Yin, 2009:42). For the research described here, close adherence to the defined research methodology and the empirical analysis outlined in this chapter, and showing a clear chain of evidence to help guide future research, were essential for supporting construct validity. The fact that a dissertation committee reviewed the research instruments and the research findings provided an additional means for enhancing construct validity. Additionally, the survey instrument was pilot tested with associates of the author to address any technical or informational discrepancies prior to full deployment to the public.

Other research limitations include researcher bias, which can pose a threat to research validity if the observations and the recording of information are selective, based on the personal views or opinions of the researcher. Practicing critical self-reflection to address any bias that might affect the reporting of the results is one strategy for addressing researcher bias (Johnson, 1997:283). Throughout the research process, the author carefully considered his own bias and experiences in the subject matter, and relied on the statistical evidence from the survey and the common themes from the interviews. Any bias or predisposition to the results could also be identified through reviews by the dissertation committee.

Chapter 5 summarizes the conclusions and presents alternative applications to other representative commons, which may counter potential bias toward results gathered on, and pertinent to, Internet and cybersecurity issues.

There are inherent limitations in using self-reporting research instruments, such as surveys. There may be problems in understanding the questions; participants may skip or fail to fully answer questions; and/or participants may introduce personal bias in their answers (Warner, 2008:4,126). Measurement errors may also result from differences in the health or mood of the respondents at the time of the survey, and inaccurate responses may be either inadvertent or deliberate on the part of respondents. The level of social awareness or intelligence of the respondent may influence such limitations, and there could be varying interpretations of what the survey questions are asking. Differences in the administration of the research instrument can also be a factor, as well as differences in processing of the information presented to participants (Ackoff, 1953:300-301; and Frankfort-Nachmias and Nachmias, 2000:149). For the survey used in this study, the most common known type of error was failure to answer some survey questions.

One final relevant research validity area, reliability, requires that other researchers intending to examine related phenomena would be able to repeat the operations of the research, follow similar data collection and analysis methodologies, and obtain the same results. The primary goal of reliability is to minimize the errors in the research so that it can consistently be repeated by others (Frankfort-Nachmias and Nachmias, 2000:154; Patten, 2004:71; Singh, 2007:77; Trochim, 2001:88; and Warner, 2008:826). Consistent with Creswell’s recommended practices to satisfy the requirements of measurement reliability, the author gathered an adequate amount of data; collected data that was appropriate to the theory and model of the research being conducted; and presented an audit trail of reliable documentation so that other researchers could reconstruct and verify the research (Creswell, 2007:208-209). A dissertation committee composed of experts in each of the focus areas in the research described here also provided a critical auditing function, and this helped improve the measurement reliability of the research.

Summary

This chapter described the exploratory mixed methods research methodology that was used for answering the questions of the research. It also outlined the sampling process, the data collection processes, and the types of research data collected. The incorporation of collected data from both the quantitative and qualitative phases of the research provided important findings for evaluating the applicability of using the Reflexivity and Institutional Theories to develop new commons governance policies.

This chapter also outlined the empirical processes followed in analyzing the data. Finally, the approaches followed for maintaining research integrity and validity were reviewed. Chapter 4 presents and summarizes the results obtained in the research. Chapter 5 provides an overall assessment of the dissertation results, and a discussion of the relevant applications of the research results to cybersecurity and other matters of commons governance policy.


Chapter 4: Analysis of Data and Findings

Introduction

The research described here proposes that a common element, human agency, provides the theoretical linkage point between the Reflexivity and Institutional Theories, and that the consideration of human agency in shaping behaviors through application of these two theories provides an approach to inform new policies governing use of a commons. Mixed methods research—using quantitative survey data and qualitative interview data—explored concepts from these two theories to analyze Internet-user behaviors, and explore the relationship between reflexivity, human agency, and Internet use.

This chapter presents the data analysis and research findings using both descriptive statistics and bivariate analyses. Consistent with the sequential explanatory design of the research, the first phase involved collecting and analyzing quantitative data, followed by collecting and analyzing qualitative data. Finally, the two data sources were integrated and interpreted together. Quantitative data aided in the initial exploration of the relevance of the link between the Reflexivity and Institutional Theories in answering the research questions, as well as the relationship of human agency and reflexivity to Internet use. The qualitative data collection elaborated on these initial findings and explored their applicability to Federal cybersecurity policy and general commons governance policies. In the following section, the findings from the quantitative data analysis phase are presented, with descriptive statistics (using frequency distributions) and bivariate analyses (using contingency tables).

Results of the Quantitative Data Analysis

In analyzing the survey results, the first demographic identified the gender of survey respondents. Of the 159 respondents, 136 responded to this question. There were 80 female respondents (58.8%) and 56 male respondents (41.2%). Because 80 of the 159 total respondents is just over half (50.3%), the majority of overall survey respondents was female, whatever the gender of the 23 who elected not to answer this question might be.

The next demographic examined was the reported educational level of the respondents (Figure 4.1).


Figure 4.1 Educational Levels of Survey Respondents

Of 159 survey respondents, 137 answered this question (N=137). “Masters Degree” represents 45.3% of respondents; “Bachelors Degree” represents 43.8% of respondents. This advanced level of reported education is not surprising, since faculty and students at The George Washington University Trachtenberg School of Public Policy and Public Administration received a widely distributed e-mail solicitation for the survey, and members of the professional social networking site “LinkedIn®,” who also received the solicitation, are likely to have Masters and Bachelors degrees.

The next demographic area examined was the current occupation of the survey respondents (Figure 4.2).

Figure 4.2 Current Occupation of Survey Respondents

Of 159 survey respondents, 135 answered this question. Of those responding, “management or professional” represents 39.1%, “student” represents 37.5%, and “education,” identifying teachers or administrators of educational institutions, represents 10.9%. Again, the number of student and educator respondents is not surprising given the wide e-mail solicitation to faculty and students at The George Washington University Trachtenberg School of Public Policy and Public Administration. In addition, the professional social networking site “LinkedIn®” tends to attract managers and other professionals, which may have contributed to the higher percentage of professional occupations represented in the survey.

The survey collected data on respondents’ general views of cybersecurity and cybersecurity policy. The responses to the survey question (“On a 10-point scale, where 1 is ‘Not At All Concerned’ and 10 is ‘Extremely Concerned,’ how concerned are you about your personal security (examples: identity theft, stalking) when you use the Internet?”) are presented in Figure 4.3.


Figure 4.3 Personal Security Concerns of Respondents

All 159 respondents answered this question. The mean of responses was 7.09, the median was 7.0, and the interquartile range, the range covered by the middle 50% of reported values, was 6-8. 79.9% of the respondents selected a value between 6 and 10 (“Extremely Concerned”). A majority of survey respondents expressed a fairly high level of concern about personal security when using the Internet.

When asked, “How important to you is getting advanced warning information on potential security problems affecting the Internet?”, practically all respondents indicated that this was important (Table 4.1).

Table 4.1 Views on Receiving Warning Information

When asked “How important to you is getting advanced warning information on potential security problems affecting the Internet?”, survey respondents replied (N=158):
(1){Not at all} or (2): 2.5%
(3){Neutral}: 8.9%
(4) or (5){Extremely important}: 88.6%

88.6% of respondents to this question answered in the affirmative; that is, they said that getting advanced warning information on potential security problems affecting the Internet is important to them.

The next question examined views about the perceived effectiveness of the enforcement of current Internet security policies. Responses to the survey question (“On a 10-point scale, where 1 is ‘Far Too Weak’ and 10 is ‘Far Too Restrictive,’ how would you rate the enforcement effectiveness of current Internet security policies?”) are presented in Figure 4.4.


Figure 4.4 Views on Security Policy Enforcement

For those 113 respondents reporting an opinion, the mean of responses was 4.76, the median was 5.0, and the interquartile range was 4-6. 69.9% selected a value between 1 (“Far Too Weak”) and 5. A majority of the respondents feel that the effectiveness of the enforcement of current Internet security policies is somewhat weak. This result, coupled with the results of the question about security concerns, suggests that security issues should be addressed in order to provide Internet users with greater confidence in the effectiveness of cybersecurity policy.


The next question examined views on specific areas to address in Internet security policy (Figure 4.5).

Figure 4.5 Views on Areas for Internet Security Policy

The top four areas selected were as follows: “Responding to Internet security breaches” was selected by 77.4% of respondents, “Protecting Internet user privacy” by 74.8%, “Levying penalties for policy violations” by 60.4%, and “Managing Internet user identities” by 23.9%.

The next question asked respondents, “Do you think if you were allowed to participate in developing Internet security policies, you would be more willing to follow these new policies?” (Table 4.2). This question addresses one of Ostrom’s design principles for long-enduring institutions, where “most of the individuals affected by a resource regime are authorized to participate in making and modifying their rules” (Ostrom, 2005:263).

Table 4.2 Views on Participating in Policy-making

When asked “Do you think if you were allowed to participate in developing Internet security policies, you would be more willing to follow these new policies?”, survey respondents replied (N=127):
(1){No, not at all} or (2): 7.1%
(3){Neutral}: 26.8%
(4) or (5){Yes, to a great extent}: 66.1%

66.1% of respondents to this question answered in the affirmative; that is, they said that they would be more willing to follow Internet security policies if they were allowed to participate in the development of these policies.

The purpose of presenting these initial univariate statistics is to provide general insight into the demographics of survey participants. These statistics also present general perceptions of the survey respondents on Internet security and the effectiveness of Internet security policies. The bivariate analysis, using contingency tables, follows in the next section.


Quantitative Analysis Supporting Research Question #1

Research Question #1 asks, “How may Reflexivity and Institutional Theories be used to help improve the formulation of commons governance policies?” This question highlights a common element, human agency, as the theoretical linkage point between Reflexivity Theory and Institutional Theory. In answering this question through the quantitative analysis, the focus is on reflexivity as the independent variable under exploration. The dependent variables are those potentially affected through reflexivity, such as shaping behaviors to maintain trust and reciprocity (essential for collective action), and avoiding actions that bring attention from third-party monitoring organizations. In order to explore reflexivity and human agency, it is necessary to look at several representative survey questions that can measure the independent variable addressing reflexivity. These questions, added together, constitute the representative independent variable for analyzing the relationship with the dependent variables. These variables will be identified for answering each subsidiary research question.


Subsidiary Question #1a Results

Subsidiary Question #1a asks, “To what extent will actors avoid behaviors that could jeopardize the trust of other actors using a commons?” This question examines the value placed by respondents on the importance of maintaining trust with other Internet users, and examines their self-perceived proclivity to avoid actions that might jeopardize this trust. Reflexivity and maintaining trust are both components here, as respondents are asked to reflect on the behaviors they use to maintain interpersonal trust. As such, an independent variable, TRUST, derived from the results of two survey questions, is used to analyze the relationship to the dependent variables.

TRUST is created by adding the results of two survey questions: “How important is maintaining the trust of other users on the Internet to you?” and “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?” Tables 4.3 and 4.4 show the initial survey results for each of these two questions.


Table 4.3 Views on the Importance of Maintaining Trust

When asked “How important is maintaining the trust of other users on the Internet to you?”, survey respondents replied (N=132):
(1){Not important at all} or (2): 9.1%
(3){Neutral}: 25.8%
(4) or (5){Extremely important}: 65.1%

65.1% of respondents answered the question about maintaining trust affirmatively, indicating that they believe that it is important to maintain the trust of other users on the Internet.

Table 4.4 Views on Avoiding Trust-Impacting Actions

When asked “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?”, survey respondents replied (N=118):
(1){No, not at all} or (2): 14.4%
(3){Neutral}: 22.0%
(4) or (5){Yes, to a great extent}: 63.6%

63.6% of respondents answered this question about avoiding trust-impacting actions in the affirmative; that is, they said that they do avoid activities on the Internet that could cause them to lose the trust of others.

TRUST is created by adding the results of these two survey questions; it combines the components that maintaining trust is important and that users avoid activities that could lose the trust of others. Records with missing values (i.e., answers of “No Opinion”) were excluded from generating the variable TRUST. Table 4.5 presents the generated results for TRUST.

Table 4.5 Generated Results for TRUST

TRUST: combined, represents that maintaining trust is important and users avoid activities that could lose the trust of others (N=116):
2-4 {Not Reflexive}: 6.0%
5-7 {Somewhat Reflexive}: 38.0%
8-10 {Reflexive}: 56.0%

56.0% of respondents indicated reflexive responses; 38.0% indicated somewhat reflexive responses.
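The binning of the summed scores into the three categories shown in Table 4.5 can be sketched as follows; the scores are invented, the cut points are taken from the table, and the author's own recoding was done in SPSS®:

    import pandas as pd

    # Hypothetical summed TRUST scores (two 1-5 items, so sums run 2-10).
    trust = pd.Series([9, 6, 4, 10, 7, 8, 5, 3, 8, 9], name="TRUST")

    # Bin the scores with the cut points reported in Table 4.5:
    # 2-4 Not Reflexive, 5-7 Somewhat Reflexive, 8-10 Reflexive.
    labels = ["Not Reflexive", "Somewhat Reflexive", "Reflexive"]
    binned = pd.cut(trust, bins=[1, 4, 7, 10], labels=labels)

    # Percentage of respondents in each category.
    print((binned.value_counts(normalize=True).reindex(labels) * 100).round(1))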

The first analysis of TRUST was with the dependent variable, AUCTIONS, representing the Likert-type survey question that asks, “How frequently do you accurately describe product information when selling at online auctions?” First, Table 4.6 presents the results for this question.


Table 4.6 Views on Describing Product Information

When asked “How frequently do you accurately describe product information when selling at online auctions?”, survey respondents replied (N=62):
Rarely [(1)“Never” or (2)]: 16.2%
Occasionally (3): 4.8%
Almost Always [(4) or (5)“Always”]: 79.0%

79.0% of respondents to this question answered in the affirmative; that is, they said that they almost always accurately describe product information when selling at online auctions.

The results for the first analysis of TRUST with AUCTIONS, using cross tabulation, are presented in Table 4.7. As both variables had missing data, there were 47 valid responses available for the analysis.


Table 4.7 Analysis of TRUST and AUCTIONS

AUCTIONS [How frequently do you accurately describe product information when selling at online auctions?] by TRUST:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=2)          (N=17)              (N=28)
Rarely           50%            29.4%               7.1%
Occasionally     0              5.9%                3.6%
Almost Always    50%            64.7%               89.3%
Totals           100%           100%                100%

From these results it appears that reflexivity has a fairly strong relationship to actions associated with almost always accurately describing product information when selling at online auctions (89.3% of reflexive respondents; 64.7% of somewhat reflexive respondents). This result is consistent with what would be expected, as disreputable selling practices at auctions could lead to eventual exclusion from the auctions, and this may be on the minds of auction participants.
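Column-percentage cross tabulations like Table 4.7 can be reproduced with a short sketch; the paired observations below are invented, and the author's own tables were produced in SPSS®:

    import pandas as pd

    # Hypothetical paired observations for the two categorical variables.
    df = pd.DataFrame({
        "TRUST": ["Reflexive", "Reflexive", "Somewhat Reflexive",
                  "Somewhat Reflexive", "Not Reflexive", "Reflexive",
                  "Not Reflexive", "Reflexive"],
        "AUCTIONS": ["Almost Always", "Almost Always", "Rarely",
                     "Almost Always", "Rarely", "Occasionally",
                     "Almost Always", "Almost Always"],
    })

    # Cross tabulation with percentages computed within each TRUST column,
    # matching the layout of Tables 4.7 through 4.22.
    table = pd.crosstab(df["AUCTIONS"], df["TRUST"], normalize="columns") * 100
    print(table.round(1))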

The second analysis with TRUST was with the dependent variable, POSTING, representing the Likert-type question that asks, “How frequently do you get approval before posting information on another Internet user?” Table 4.8 presents the results for this question.


Table 4.8 Views on Getting Approval Before Posting Information

When asked “How frequently do you get approval before posting information on another Internet user?”, survey respondents replied (N=96):
Rarely [(1)“Never” or (2)]: 21.8%
Occasionally (3): 20.8%
Almost Always [(4) or (5)“Always”]: 57.4%

57.4% of respondents to this question answered in the affirmative; that is, they said that they almost always get approval before posting information on another Internet user.

The results of the second analysis of TRUST with POSTING are presented in Table 4.9. As both variables had missing data, there were 79 valid responses available for the analysis.


Table 4.9 Analysis of TRUST and POSTING

POSTING [How frequently do you get approval before posting information on another Internet user?] by TRUST:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=3)          (N=28)              (N=48)
Rarely           33.33%         32.1%               14.6%
Occasionally     33.33%         28.6%               12.5%
Almost Always    33.33%         39.3%               72.9%
Totals           100%           100%                100%

From these results it also appears that reflexivity has a strong relationship to actions associated with users almost always getting approval before posting information on another Internet user (72.9% reflexive and 39.3% somewhat reflexive).

The third analysis with TRUST was with the dependent variable, EMAIL, representing the Likert-type question that asks, “How frequently do you check e-mail sources before forwarding e-mails to others?” Table 4.10 presents the results for this question.


Table 4.10 Check E-mail Sources Before Forwarding Them

When asked “How frequently do you check e-mail sources before forwarding e-mails to others?”, survey respondents replied (N=154):
Rarely [(1)“Never” or (2)]: 15.5%
Occasionally (3): 13.0%
Almost Always [(4) or (5)“Always”]: 71.5%

71.5% of respondents to this question answered in the affirmative; that is, they said that they almost always check e-mail sources before forwarding e-mails to others.

The results of the third analysis of TRUST with EMAIL are presented in Table 4.11. As both variables had missing data, there were 112 valid responses available for the analysis.

Table 4.11 Analysis of TRUST and EMAIL

EMAIL [How frequently do you check e-mail sources before forwarding e-mails to others?] by TRUST:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=7)          (N=43)              (N=62)
Rarely           28.6%          25.6%               8.1%
Occasionally     0              14.0%               11.3%
Almost Always    71.4%          60.4%               80.6%
Totals           100%           100%                100%


From these results it appears that reflexivity has a relationship to actions associated with users almost always checking e-mail sources before forwarding e-mails to others (80.6% reflexive and 60.4% somewhat reflexive).

The fourth analysis with TRUST was with the dependent variable, IDENTITY, representing the Likert-type question that asks, “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?” This question addresses the nature of a user’s identity, and the role it has in shaping behaviors.

The first step in the analysis of this data was to examine the responses to the question, “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?” (Table 4.12).

Table 4.12 Respondent Views on Personal Identity and Action

When asked “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?”, survey respondents replied (N=131):
(1){No, not at all} or (2): 18.3%
(3){Neutral}: 15.3%
(4) or (5){Yes, to a great extent}: 66.4%


66.4% of respondents to this question answered in the affirmative; that is, they said that if their personal identity was known to other Internet users, they would exercise greater care with their activities on the Internet.

The results of the fourth analysis of TRUST with IDENTITY are presented in Table 4.13. As both variables had missing data, there were 112 valid responses available for the analysis.

Table 4.13 Analysis of TRUST and IDENTITY

IDENTITY [If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?] by TRUST:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=7)          (N=42)              (N=63)
Rarely           42.9%          11.9%               23.8%
Occasionally     0              23.8%               12.7%
Almost Always    57.1%          64.3%               63.5%
Totals           100%           100%                100%

From these results it appears that reflexivity has a relationship to identity, with respondents almost always exercising greater care with their activities if their personal identity was known to others (63.5% reflexive and 64.3% somewhat reflexive).

In examining the results for Question 1a, the analyses suggested an apparent relationship between reflexivity and users’ actions to almost always accurately describe product information at online auctions, gain approval before posting information, check e-mail sources, and exercise care if their identity was known—actions that involve trust. Reflexivity appears to be relevant in shaping user behaviors, and this supports the hypothesized results for Question 1a.

Subsidiary Question #1b Results

Subsidiary Question #1b asks, “To what extent will actors avoid behaviors that could jeopardize reciprocal action with other actors using a commons?” This question examines the value that respondents placed on maintaining reciprocity with others, and their self-perceived proclivity to avoid actions that might jeopardize this reciprocity. Reflexivity is a component in this question, since respondents are asked to reflect on their own actions for maintaining reciprocal relationships. Moreover, as mentioned in Chapter 2, Ostrom argues that there is a relationship between “trust, reciprocity, and reputation” and that the lack of these can bring about negative effects (Ostrom, 2003:53). Therefore an independent variable for this test must bring in these concerns—reputation and trust—along with reflexivity and maintaining reciprocal action. Accordingly, an independent variable, RECIPROCITY, derived from the results of two survey questions, is used to test the relationship to the dependent variables.

RECIPROCITY is created by adding the results of two survey questions: “To what extent do you think other Internet users’ judgment of your online activities on the Internet is important?” and “Do you avoid activities on the Internet that could cause you to lose the trust of other Internet users?” Table 4.14 shows the initial survey results for the first question (related to reputation); the results for the second question (related to trust) were presented in Table 4.4.

Table 4.14 Importance of Another’s Judgment of Actions

When asked “To what extent do you think other Internet users’ judgment of your online activities on the Internet is important?”, survey respondents replied (N=149):
(1){Not at all} or (2): 33.5%
(3){Neutral}: 27.5%
(4) or (5){To a great extent}: 39.0%

39.0% of respondents to this question answered in the affirmative, indicating that they believe other Internet users’ judgment of their online activities is important. This is a somewhat low value, yet necessary to factor into the analysis.

RECIPROCITY is created by adding the results of these two survey questions; it essentially incorporates the considerations that other users’ judgment of activities is important and that users avoid activities that could lose the trust of others. Records with missing values (i.e., answers of “No Opinion”) were excluded from generating the variable RECIPROCITY. Table 4.15 presents the generated results for RECIPROCITY.

Table 4.15 Generated Results for RECIPROCITY

RECIPROCITY: combined, represents that other users’ judgment of activities is important and users avoid activities that could lose the trust of others (N=112):
2-4 {Not Reflexive}: 9.8%
5-7 {Somewhat Reflexive}: 55.3%
8-10 {Reflexive}: 34.9%

55.3% of respondents gave somewhat reflexive responses; 34.9% gave reflexive responses.

The first analysis with RECIPROCITY was with the dependent variable, POSTING, representing the Likert-type question that asks, “How frequently do you get approval before posting information on another Internet user?” Reciprocity is a component here, as users who seek approval before posting should expect similar consideration, or reciprocal action, from others in protecting their own personal information (Table 4.16). As both variables had missing data, there were 78 valid responses available for the analysis.

Table 4.16 Analysis of RECIPROCITY and POSTING

POSTING [How frequently do you get approval before posting information on another Internet user?] by RECIPROCITY:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=6)          (N=44)              (N=28)
Rarely           33.33%         27.3%               10.7%
Occasionally     33.33%         18.2%               14.3%
Almost Always    33.33%         54.5%               75.0%
Totals           100%           100%                100%

As indicated in Table 4.16, there is an apparent relationship between reflexivity and user actions to almost always get approval before posting information on another Internet user (75.0% reflexive and 54.5% somewhat reflexive).


The second analysis was with the dependent variable, EMAIL, representing the Likert-type question that asks, “How frequently do you check e-mail sources before forwarding e-mails to others?” (Table 4.17). As both variables had missing data, there were 108 valid responses available for the analysis.

Table 4.17 Analysis of RECIPROCITY and EMAIL

EMAIL [How frequently do you check e-mail sources before forwarding e-mails to others?] by RECIPROCITY:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=10)         (N=61)              (N=37)
Rarely           30%            14.8%               10.8%
Occasionally     0              13.1%               10.8%
Almost Always    70%            72.1%               78.4%
Totals           100%           100%                100%

As indicated in Table 4.17, there is also an apparent relationship between reflexivity and almost always checking e-mail sources before forwarding e-mails to others (78.4% reflexive and 72.1% somewhat reflexive).

The third analysis with RECIPROCITY was with the dependent variable, PRIVACY, representing the Likert-type question that asks, “How frequently do you get approval before sharing another Internet user’s contact information?” First, Table 4.18 presents the results for this question.

Table 4.18 Views on Sharing Another’s Contact Information

When asked “How frequently do you get approval before sharing another Internet user’s contact information?”, survey respondents replied (N=134):
Rarely [(1)“Never” or (2)]: 10.4%
Occasionally (3): 13.4%
Almost Always [(4) or (5)“Always”]: 76.2%

76.2% of respondents to this question answered in the affirmative; that is, they said that they almost always get approval before sharing another Internet user’s contact information.

As both variables had missing data, there were 99 valid responses available for the analysis (Table 4.19).


Table 4.19 Analysis of RECIPROCITY and PRIVACY

PRIVACY [How frequently do you get approval before sharing another Internet user’s contact information?] by RECIPROCITY:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=9)          (N=53)              (N=37)
Rarely           11.1%          7.5%                8.1%
Occasionally     33.3%          18.9%               8.1%
Almost Always    55.6%          73.6%               83.8%
Totals           100%           100%                100%

As indicated in Table 4.19, there is an apparent relationship between reflexivity and users almost always getting approval before sharing another Internet user’s contact information (83.8% reflexive and 73.6% somewhat reflexive).

The fourth analysis with RECIPROCITY was with the dependent variable, IDENTITY, representing the Likert-type question that asks, “If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?” (Table 4.20). As both variables had missing data, there were 107 valid responses available for the analysis.


Table 4.20 Analysis of RECIPROCITY and IDENTITY

IDENTITY [If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?] by RECIPROCITY:

                 Not Reflexive  Somewhat Reflexive  Reflexive
                 (N=11)         (N=58)              (N=38)
Rarely           36.4%          19.0%               18.4%
Occasionally     9.1%           20.7%               13.2%
Almost Always    54.5%          60.3%               68.4%
Totals           100%           100%                100%

As indicated in Table 4.20, there is also an apparent relationship between reflexivity and identity considerations, with users almost always exercising greater care with their activities if their personal identity was known (68.4% reflexive and 60.3% somewhat reflexive).

In examining the results for Question 1b, the analyses suggested an apparent relationship between reflexivity and users’ actions to almost always gain approval before posting and sharing information, check e-mail sources, and exercise care if their identity was known—actions that involve reciprocity, trust, and reputation. Reflexive action appears to be relevant in shaping user behaviors, and this supports the hypothesized results for Question 1b.


Subsidiary Question #1c Results

Subsidiary Research Question #1c asks, “To what extent will actors avoid behaviors that could bring attention of their actions to third-party monitoring agents?” As discussed in the previous chapter, use of such agents is highlighted by Ostrom as an important design principle for establishing long-enduring institutions (Ostrom, 2005:259).

Monitoring was also highlighted in Chapter 2 as a promising practice in cybersecurity for addressing insider threats. As with the previous questions, reflexivity is a component, since this question asks respondents to reflect on behaviors they have taken, or would take, to avoid receiving undue attention in an environment with third-party monitoring.

The first step in the analysis of this data was to examine the responses to the question, “To what extent would you be more careful with your activities on the Internet if they were being monitored by an Internet activity monitoring organization?” (Table 4.21).


Table 4.21 Respondent Views on Activities Being Monitored

When asked “To what extent would you be more careful with your activities on the Internet if they were being monitored by an Internet activity monitoring organization?”, survey respondents replied (N=132): Not Reflexive: Somewhat Reflexive: (4) or (1){Not at all} or Reflexive: (5){To a great (2) (3){Neutral} extent} 31.8% 25.0% 43.2%

43.2% answered the question affirmatively; that is, they indicated that they are more careful (reflexive) with Internet activities if these activities are being monitored.

This variable also serves as the independent variable (MONITOR), as reflexivity is a component in shaping behaviors in the scenario described here. For MONITOR, values of 1 and 2 were coded as "Not Reflexive," values of 3 were coded "Somewhat Reflexive," and values of 4 and 5 were coded as "Reflexive."

The first analysis was with the dependent variable, AUCTIONS, representing the Likert-type question that asks, "How frequently do you accurately describe product information when selling at online auctions?" As online auctions are generally hosted by companies that monitor activities, this question is appropriate for analysis. The results of this question were provided in Table 4.6.

The results of the first analysis of MONITOR with this variable, using cross tabulation, are presented in Table 4.22. As both variables had missing data, there were 50 valid responses available for the analysis.

Table 4.22 Analysis of MONITOR and AUCTIONS

MONITOR

AUCTIONS [How frequently do you accurately describe product information when selling at online auctions?]

                    Not Reflexive   Somewhat Reflexive   Reflexive
                    (N=13)          (N=14)               (N=23)
   Rarely           23.1%           14.3%                17.4%
   Occasionally     7.7%            7.1%                 0
   Almost Always    69.2%           78.6%                82.6%
   Totals           100%            100%                 100%

As indicated in Table 4.22, there is an apparent relationship between reflexivity and users almost always accurately describing product information when selling at online auctions (82.6% reflexive and 78.6% somewhat reflexive).

The second analysis involved the dependent variable, MASK, representing the Likert-type question that asks, "How frequently do you mask or use another identity when visiting Internet chat rooms and/or social networking sites?" This question is relevant, as chat rooms and social networking sites may be monitored either by the service provider or, in some cases, by law enforcement organizations. Table 4.23 presents the results for this question.

Table 4.23 Mask or Use Another Identity at Chat Rooms

When asked “How frequently do you mask or use another identity when visiting Internet chat rooms and/or social networking sites?”, survey respondents replied (N=128): Rarely [(1)“Never” Almost Always [(4) Occasionally (3) or (2)] or (5)”Always”] 64.9% 17.2% 17.9%

64.9% of respondents to this question answered in the negative; that is, they said that they rarely mask or use another identity when visiting Internet chat rooms and/or social networking sites. This result is not surprising, as LinkedIn®, the potential source for some survey respondents, is primarily intended to bring professionals together for networking and collaboration. The same could be true for those who are students and educators, another source of survey respondents.

The results of the second analysis of MONITOR with this variable are presented in Table 4.24. As both variables had missing data, there were 108 valid responses available for the analysis.


Table 4.24 Analysis of MONITOR and MASK

MONITOR

MASK [How frequently do you mask or use another identity when visiting Internet chat rooms and/or social networking sites?]

                    Not Reflexive   Somewhat Reflexive   Reflexive
                    (N=31)          (N=27)               (N=50)
   Rarely           74.2%           63.0%                64.0%
   Occasionally     3.2%            25.9%                20.0%
   Almost Always    22.6%           11.1%                16.0%
   Totals           100%            100%                 100%

As indicated in Table 4.24, there is an apparent relationship between reflexivity and users rarely masking or using another identity (64.0% reflexive and 63.0% somewhat reflexive).

The third analysis involved the dependent variable, IDENTITY, representing the Likert-type question that asks, "If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?" This question addresses the nature of public identity and its role with reflexivity in shaping behaviors in third-party monitored environments. Table 4.25 presents the results. As both variables had missing data, there were 124 valid responses available for the analysis.


Table 4.25 Analysis of MONITOR and IDENTITY

MONITOR

IDENTITY [If your personal identity was known to other Internet users, would you exercise greater care with your activities on the Internet?]

                    Not Reflexive   Somewhat Reflexive   Reflexive
                    (N=38)          (N=30)               (N=56)
   Rarely           31.6%           26.7%                5.4%
   Occasionally     21.0%           10.0%                12.5%
   Almost Always    47.4%           63.3%                82.1%
   Totals           100%            100%                 100%

As indicated in Table 4.25, there is an apparent relationship between reflexivity and users almost always exercising greater care with Internet activities if their personal identity was known (82.1% reflexive and 63.3% somewhat reflexive).

In examining the results for Question 1c, the analyses indicated a relationship between reflexivity and users accurately describing product information, not masking identity at chat rooms or social networking sites, and exercising greater care with Internet activity if their identity was known. These results will be important in addressing the value of formulating new cybersecurity policies concerning identity management and the monitoring of Internet resources, to be discussed in Chapter 5. As the results show, reflexivity appears to play a role in shaping behaviors, which supports the hypothesized results for Question 1c.

Subsidiary Question #1d Results

Subsidiary Research Question #1d asks, “To what extent will actors avoid behaviors that could adversely affect their own future ability to use a commons?” This question examined respondents’ proclivity to avoid actions that could adversely affect their use of the Internet. As with the other questions, reflexivity is a component in the query because it addresses the conscious self-monitoring of behaviors that might be detrimental to respondents’ access and use of the Internet.

The first step in the analysis of this question was to examine the responses to the survey question, "To what extent do you know which online activities would jeopardize your personal security when you use the Internet?" (Table 4.26).


Table 4.26 Views on Security Jeopardizing Activities

When asked “To what extent do you know which online activities would jeopardize your personal security when you use the Internet?”, survey respondents replied (N=156): (1){Not at all} or (4) or (5){To a (3){Neutral} (2) great extent} 9.6% 16.7% 73.7%

73.7% answered the question in the affirmative; that is, they claim to know which online activities jeopardize their personal security. As this question deals with reflexivity in connection with avoiding actions that can impact Internet security, the results of this question also serve as another measure of reflexivity, called ACCESS. For ACCESS, values of 1 and 2 are coded as "Not Reflexive," values of 3 are coded "Somewhat Reflexive," and values of 4 and 5 are coded as "Reflexive."

The first analysis was with the dependent variable, VIRUS, representing the Likert-type question that asks, "How frequently do you scan your personal computer for viruses?" Table 4.27 presents the results for this question.

Table 4.27 Scan Personal Computers for Viruses

When asked “How frequently do you scan your personal computer for viruses?”, survey respondents replied (N=157): Rarely [(1)“Never” Almost Always [(4) Occasionally (3) or (2)] or (5)”Always”] 13.4% 22.9% 63.7%


63.7% of respondents to this question answered in the affirmative; that is, they said that they almost always scan their personal computers for viruses.

The results of the first analysis of ACCESS with this variable are presented in Table 4.28. As both variables had missing data, there were 154 valid responses available for the analysis.

Table 4.28 Analysis of ACCESS and VIRUS

ACCESS

VIRUS [How frequently do you scan your personal computer for viruses?]

                    Not Reflexive   Somewhat Reflexive   Reflexive
                    (N=15)          (N=25)               (N=114)
   Rarely           26.7%           4.0%                 13.1%
   Occasionally     20.0%           32.0%                20.2%
   Almost Always    53.3%           64.0%                66.7%
   Totals           100%            100%                 100%

As indicated in Table 4.28, there is an apparent relationship between reflexivity and users almost always scanning their personal computers for viruses (66.7% reflexive and 64.0% somewhat reflexive).
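As with the other tables in this chapter, the valid N reported here (154) reflects listwise deletion: respondents missing a value on either variable are excluded before the cross tabulation. A minimal sketch of that step, again with hypothetical column names and invented data:

    import pandas as pd

    df = pd.DataFrame({
        "ACCESS": ["Reflexive", None, "Not Reflexive", "Somewhat Reflexive"],
        "VIRUS":  ["Almost Always", "Rarely", None, "Occasionally"],
    })

    # Listwise deletion: keep only respondents with data on both
    # variables, yielding the valid N reported for each table.
    valid = df.dropna(subset=["ACCESS", "VIRUS"])
    print(len(valid))  # valid N for the analysis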

The second analysis involved the dependent variable, CHATROOM, representing the Likert-type question that asks, "How much more careful are you with your activities on the Internet when you use social networking sites or chat rooms?" This question addresses the nature of the security challenges identified with social networking sites and chat rooms, such as the potential effects on the personal security of users themselves if their personal information or password information is shared with other parties. Table 4.29 presents the results for this question.

Table 4.29 Careful with Activities when Social Networking

When asked “How much more careful are you with your activities on the Internet when you use social networking sites or chat rooms?”, survey respondents replied (N=116): Almost Always [(4) Rarely [(1)“Not at Neutral (3) or (5)”To a great all” or (2)] extent”] 10.3% 13.8% 75.9%

75.9% of respondents to this question answered in the affirmative; that is, they said that they are almost always more careful with their activities on the Internet when using social networking sites or chat rooms.

The results of the second analysis of ACCESS with this variable are provided in Table 4.30. As both variables had missing data, there were 115 valid responses available for the analysis.


Table 4.30 Analysis of ACCESS and CHATROOM

ACCESS

CHATROOM [How much more careful are you with your activities on the Internet when you use social networking sites or chat rooms?]

                    Not Reflexive   Somewhat Reflexive   Reflexive
                    (N=15)          (N=16)               (N=84)
   Rarely           13.3%           12.5%                9.5%
   Occasionally     26.7%           12.5%                11.9%
   Almost Always    60.0%           75.0%                78.6%
   Totals           100%            100%                 100%

As indicated in Table 4.30, there is an apparent relationship between reflexivity and being more careful with activities at social networking sites and chat rooms (78.6% reflexive and 75.0% somewhat reflexive).

The third analysis involved the dependent variable, WIRELESS, representing the Likert-type question that asks, "How frequently do you operate your home wireless network using a password?" This question addresses the discipline needed to protect home networks from intruders: a behavior that security experts view as of critical importance for protecting wireless networks. Table 4.31 presents the results for this question.


Table 4.31 Operate Home Wireless Networks with Passwords

When asked “How frequently do you operate your home wireless network using a password?”, survey respondents replied (N=148): Rarely [(1)“Never” Almost Always [(4) Occasionally (3) or (2)] or (5)”Always”] 8.2% 2.6% 89.2%

89.2% of respondents to this question answered in the affirmative; that is, they said that they almost always operate their home wireless networks using a password.

The results of the third analysis of ACCESS with this variable are presented in Table 4.32. As both variables had missing data, there were 145 valid responses available for the analysis.

Table 4.32 Analysis of ACCESS and WIRELESS

ACCESS

WIRELESS [How frequently do you operate your home wireless network using a password?]

                    Not Reflexive   Somewhat Reflexive   Reflexive
                    (N=15)          (N=24)               (N=106)
   Rarely           13.3%           4.2%                 8.5%
   Occasionally     0               0                    3.8%
   Almost Always    86.7%           95.8%                87.7%
   Totals           100%            100%                 100%


As indicated in Table 4.32, there is also a relationship between reflexivity and operating home wireless networks using a password (87.7% reflexive and 95.8% somewhat reflexive).

In examining the results for Question 1d, the analyses suggested an apparent relationship between reflexivity and the various user actions in avoiding online activities that would jeopardize their personal security when using the Internet.

Quantitative Research Results

In reviewing the results of the quantitative analyses for the four subsidiary questions, it is apparent that reflexivity does have a role in shaping the behaviors of Internet users. Table 4.33 provides a summary of the quantitative results for all four questions. These results will be integrated with the results from the qualitative analysis phase to complete the mixed methods data analysis activity.


Table 4.33 Summary of Quantitative Results

1a: To what extent will actors avoid behaviors that could jeopardize the trust of other actors using a commons?
    Relationship with Reflexivity:
    - Accurately describe product information when selling at online auctions
    - Get approval before posting information on another Internet user
    - Check e-mail sources before forwarding e-mails to others
    - Exercise greater care with activities on the Internet if personal identity was known to others

1b: To what extent will actors avoid behaviors that could jeopardize reciprocal action with other actors using a commons?
    Relationship with Reflexivity:
    - Get approval before posting information on another Internet user
    - Check e-mail sources before forwarding e-mails to others
    - Get approval before sharing another Internet user's contact information
    - Exercise greater care with activities on the Internet if personal identity was known to others

1c: To what extent will actors avoid behaviors that could bring attention of their actions to third-party monitoring agents?
    Relationship with Reflexivity:
    - Accurately describe product information when selling at online auctions
    - Mask or use another identity when visiting Internet chat rooms and/or social networking sites
    - Exercise greater care with activities on the Internet if personal identity was known to other Internet users

1d: To what extent will actors avoid behaviors that could adversely affect their own future ability to use a commons?
    Relationship with Reflexivity:
    - Frequently scan personal computer for viruses
    - Careful with activities on the Internet when using social networking sites or chat rooms
    - Frequently operate home wireless network using a password

Results of the Qualitative Data Analysis

The qualitative portion of the data analysis used data derived from the interviews. Interviewees representing a broad base of professional expertise—both former government officials and private-sector experts with experience in cybersecurity policy analysis, formulation, and implementation—were selected. Interviewees included a former Director of the National Security Agency, a former headquarters official and advisor to the Director of the Central Intelligence Agency, a former Chief Information Officer of a major Department of Defense agency, a former head of multi-agency information assurance at the National Security Agency, a former official responsible for overseeing White House telecommunications and information technology, two former White House National Security Staff members, a current homeland security official, a current nationally renowned expert in cybersecurity and advisor to the President, and two current private-sector market leaders in cybersecurity policy. In total, eleven experts were interviewed.

In summarizing the results, the general themes identified were used to continue exploring concepts from the Reflexivity and Institutional Theories and their influence in shaping Internet-user behaviors. The qualitative data elaborated on the initial findings from the quantitative phase that addressed the general Research Question #1 ("How may Reflexivity and Institutional Theories be used to help improve the formulation of commons governance policies?"). The qualitative phase also provided data addressing the general Research Question #2 ("How may Reflexivity and Institutional Theories be used to improve Federal cybersecurity policies governing use of the Internet?").

To identify themes that emerged from the interview data, information was classified and categorized, guided by the original research questions and the specific interview questions. This approach helped identify commonalities from the interview comments that could reveal patterns or themes useful in answering the research questions. To report these findings, the qualitative data analysis was separated into five thematic categories. The first category addressed Reflexivity Theory in general and its relevance to new cybersecurity policy.
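Although the thematic coding itself was an interpretive, manual process, the bookkeeping step of grouping coded interview excerpts by theme can be illustrated with a short sketch; the excerpt-to-theme pairings below are invented for illustration and do not reproduce the study's actual coding.

    from collections import defaultdict

    # Hypothetical (excerpt, theme) pairs assigned during manual coding.
    coded_excerpts = [
        ("In a professional environment, people are more careful.",
         "Reflexivity Theory and Cybersecurity Policy"),
        ("We need trust to conduct the business of government.",
         "Avoiding Behaviors That Impact Trust and Reciprocity"),
        ("FISMA reporting is like grades on report cards.",
         "Strengths and Shortfalls of FISMA"),
    ]

    # Group excerpts by theme to surface commonalities across interviews.
    themes = defaultdict(list)
    for excerpt, theme in coded_excerpts:
        themes[theme].append(excerpt)

    for theme, excerpts in themes.items():
        print(f"{theme}: {len(excerpts)} excerpt(s)")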

Reflexivity Theory and Cybersecurity Policy

This part of the interview examined key principles from Reflexivity Theory involving reflexive action and human agency, and their relevance to concepts for new cybersecurity policy. Interviewees were asked about aspects of cybersecurity derived from Subsidiary Questions 1a, 1b, 1c, and 1d from Research Question #1, addressing issues of trust, reciprocity, third-party monitoring, and avoidance of actions that could negatively impact future Internet access.


Interview questions #6 and #10 were the primary sources of the data, and had four parts as indicated in Table 4.34.

Table 4.34 Questions on Reflexivity Theory and Cybersecurity

Reflexivity Theory and Its Relevance to Cybersecurity Policy (Supporting Subsidiary Research Questions 1a, 1b, 1c, and 1d)
- Do you feel that people reflect on the ramifications of their actions when they use the Internet?
- If users were aware of third-party monitoring, do you think they would alter their online behaviors?
- Do you think that people would generally alter their online behaviors if their true identities were visible to other users using the Internet?
- Should these considerations be factored into new cybersecurity policy?

The interviewees offered mixed responses to the first question. The differences of opinion appeared to stem from views that Internet users have some general understanding of the consequences of their actions, but also a relatively high level of complacency and comfort when using the Internet. Most of the interviewees emphasized that users really have no idea of the potential ramifications that can result from improper or insecure Internet behaviors. One interviewee stated a point shared by others: "It is pretty clear that a lot of people don't know the ramifications. In a professional environment, people are more careful." Most agreed that users do not think matters through when using the Internet, and that they are now too comfortable with it. Another suggested that there seems to be a generational gap, with younger users more likely to consciously trade off security risk for the sake of convenience. "Yes, there is a heightened sensitivity on the Internet, however I'm not sure about the younger generation," stated one interviewee.

In discussing the monitoring of Internet activity, interviewees generally agreed that users expect that some general monitoring is going on; however, if users knew that they were being specifically targeted, they would most likely alter some of their behaviors. Some interviewees were against established monitoring on the Internet, as monitoring opens up potential privacy and civil liberties concerns involving U.S. citizens. There are specific laws, such as the Foreign Intelligence Surveillance Act and other electronic surveillance laws, that prohibit certain types of monitoring of U.S. citizens. With regards to terrorism on the Internet, an interviewee emphasized that identity protection is an important factor, with terrorist factions "trying to mask their identities" as they use the Internet. "This is a major national security concern being addressed by the Federal government," the interviewee stated.


In answering the question of whether people will generally alter their online behaviors if their true identities are known to others, most interviewees agreed that users would alter their behaviors in such circumstances. "Behaviors would be different if there was attribution; anonymity has a detrimental effect," maintained one interviewee. Identities matter, and some means to identify and track identities could be beneficial for cybersecurity. The issue of anonymity was raised by some interviewees, and one suggested that anonymity is beneficial on the Internet under certain circumstances:

However when an individual is engaging in activities that affect a wide range of people, such as associated with recent events in the Middle East, it is important that the identities be known of those involved in initiating great civil events.

Another suggested “there are places where you want identities to be anonymous, it depends on the circumstances.”

When asked whether or not concepts of reflexivity and identity should be factored into new cybersecurity policy, the interviewees generally felt that identity management is indeed a relevant factor for cybersecurity policy. One interviewee expanded on this point by suggesting role-based identification approaches for software applications, and establishing security that is adjusted to the threats and can be selected based on the individual. These concepts were further explored through other aspects of the interviews, including how reflexive action relates to maintaining trust, as explained below.

Avoiding Behaviors That Impact Trust and Reciprocity

This part of the interview examined the relevance of the key principles from Reflexivity Theory involving reflexive action and human agency in relation to avoiding behaviors that could negatively impact trust and reciprocity with others. This question examined interviewees' views of aspects related to Subsidiary Questions 1a and 1b from Research Question #1. Interview questions #8 and #9 were the source of the data and had four parts, as indicated in Table 4.35.

Table 4.35 Questions on Trust and Reciprocity

Avoiding Behaviors that Negatively Impact Trust and Reciprocity (Supporting Subsidiary Research Questions 1a and 1b)
- How important is maintaining trust and reciprocity between users on the Internet?
- Do you think users reflect on their own actions to maintain trust and reciprocity with others?
- Do social networking sites promote greater trust between Internet users?
- What policies should address such interactions?


There was general agreement among the interviewees that maintaining trust and reciprocity with others on the Internet is important. The emphasis on trust in relation to governmental activities was highlighted by an interviewee who stated, "As far as government agencies are concerned, it is especially necessary for having trust. We need trust to conduct the business of government." "The majority of us reflect on trust and reciprocity with others," another interviewee maintained. Trust in relation to transactions was a common point raised by some interviewees. Two interviewees said that maintaining trust is critical when interacting with others involved in certain joint activities such as those involving bank transactions or sales. As with trust, reciprocity is situational and based on the particular transactions, as highlighted by the interviewees. Interviewees pointed out that, generally speaking, exercising reciprocity depends on the nature of the interaction and the existing relationship between users.

When asked whether users reflect on their own actions in order to maintain trust and reciprocity, there was a sense that some reflection occurs; however, there is a tendency not to reflect on the dangers posed by the Internet. An interviewee suggested that there is a tendency, because of the ease of accessibility on the Internet, "to trust in the apparent security of the Internet and to not always reflect on the possible ramifications of every action taken." Another interviewee stated that not enough reflection occurs; another said that, in general, users want trust built into the systems that they operate; and another stated that users often adjust their behaviors based on the level of trust they feel exists with another user.

For the question about whether social networking sites promote greater trust, there were generally two views: social networking sites can promote trust, and social networking sites can be dangerous. One interviewee suggested that trust comes about when there is increased use or experience with the social networking sites, and another suggested that social networking helps promote common values and common beliefs, and as such, these sites could ultimately promote greater trust between users. The dangers of social networking were raised by some interviewees; one observed, "Social networking leads to a mindset that security isn't important." Another interviewee commented that social networking "is abysmal" and that users do not know the true impact of what can happen on social networking sites when interacting with the wrong types of users, such as criminals.


With regards to what cybersecurity policies should be considered concerning social networking sites, one interviewee suggested that policies addressing new technologies for social networking to promote more trust and reciprocity could be beneficial. The same person said that more automated means of supporting the self-monitoring of these sites would be useful, a topic taken up in the third theme.

Self-Monitoring Habits for Cybersecurity

This part of the interview examined self-monitoring habits and their usefulness in improving cybersecurity and Internet usage. Interviewees were asked about topics associated with Subsidiary Questions 1d and 2a from Research Questions #1 and #2, respectively. Interview question #7 was the source of the data, with the question indicated in Table 4.36.

Table 4.36 Question on Self-Monitoring Habits

Self-monitoring Habits for Cybersecurity (Supporting Subsidiary Research Questions 1d and 2a)
- What self-monitoring/self-policing practices do you think could promote more effective cybersecurity?

There was some agreement among a number of interviewees that self-monitoring practices can be useful in promoting more effective cybersecurity; however, two of the eleven interviewees were adamantly against self-monitoring, stating that it does not work. Approaches to self-monitoring practices were highlighted by interviewee comments such as: self-certifying encryption devices for Internet security; widely publishing promising practices for guiding information technology monitoring and balancing security risks; exercising good operational discipline; maintaining awareness of the nature of the particular systems used to access the Internet; and having a tiered defensive approach.

Strengths and Shortfalls of FISMA

This part of the interview examined key policy areas of strengths and shortfalls identified by the interviewees in connection with the Federal Information Security Management Act of 2002 (FISMA). The interviewees addressed topics related to Subsidiary Question 2a. Interview questions 2, 3, 4, 5, and 12 were the sources of the data, as indicated in Table 4.37.


Table 4.37 Questions on FISMA

Strengths and Shortfalls of FISMA (Supporting Subsidiary Research Question 2a)
- What do you see as the policy strengths of the FISMA of 2002?
- What are some of the policy shortfalls of the FISMA of 2002?
- What methodologies did policy-makers and their staff members follow to research, analyze, and formulate policy for the FISMA of 2002?
- Do you think there is a clear understanding of the infrastructure boundaries of affected Federal information systems governed by FISMA policies? If not, why not?
- What general security principles from the FISMA could apply to managing other sharable resources, even non-technological resources such as natural resources?
- In general, do you think that the FISMA adequately addresses the need for controlling user behaviors on the Internet?

There was a general feeling among the interviewees that FISMA put cybersecurity issues onto the government's policy agenda, and that this in itself was valuable. Most interviewees agreed that the policy was well-intentioned and that it provided a clear purpose of addressing serious vulnerabilities. Another interviewee emphasized that the law also highlighted the need for a coherent National Security Council Policy and addressed continuity of operations (COOP) and continuity of government (COG) issues. FISMA also facilitated the implementation of risk-based approaches for addressing cybersecurity. This person said that FISMA also helped establish standardization of security reporting information, though requirements were not effectively defined across the entire issue of cybersecurity.

With regards to policy shortfalls of FISMA, there was a general feeling among the interviewees that FISMA is still generally more administrative than regulatory in nature, having an emphasis on documenting issues and security compliance without having substantive repercussions or consequences tied to non-compliance. One interviewee likened FISMA reporting to "grades on report cards," and said that there is no long-term strategy identified to address the cybersecurity issues or provide a "get-well plan." Funding for implementation is also an issue; FISMA is an "unfunded mandate" on Federal departments and agencies to fix cybersecurity problems.

The focus on addressing technology issues, and not behavioral issues with users and managers of Federal information technology systems, was another general shortfall identified by interviewees. One interviewee indicated that FISMA was an outgrowth of Chief Information Officer work from the Federal Chief Information Officers Council, and followed from work on the Computer Security Act of 1987, along with classified work done at the National Security Agency. (This might explain the technological emphasis of FISMA.) An interviewee added that there needs to be more balance between the issues of technology, users, and managers. "FISMA doesn't deal with user behaviors," claimed one interviewee.

In discussing interfacing with the private sector, interviewees generally felt that there is not a good balance between "security and economics." There is a need to have clear guidance to industry so that the business community can see that implementation of FISMA is in their best interest. The currency of FISMA was also raised by interviewees as an issue. Since 2002, cybersecurity threats have changed along with technology. "FISMA was useful in its time, but it was late coming out," noted one interviewee in relation to the encryption standards used in the government and with the public key. One interviewee said, simply, "FISMA is an unresolved issue."

While the interviewees were not themselves specifically involved in developing FISMA, they felt that there is a need to ensure close dialogue between policy-makers and policy implementers, especially in developing cybersecurity policy. It is essential for Internet operators to be included in the planning of policy implementation, and important for the implementers of the policies to avoid having governance conducted "in a vacuum." Another interviewee suggested that there was not enough coordination between policy-makers and those affected by the policies, and that the issues had not been thought out strategically, looking at all the risks.

With regards to whether there is a clear understanding of the infrastructure boundaries governed by FISMA, interviewees' opinions were mixed. Interviewees acknowledged that the phenomenal growth of social networking sites has made the establishment of boundaries very hard. An interviewee suggested that there is not a clear understanding of the infrastructure boundaries, especially when considering critical infrastructures and all the supporting networks, many of which are in the private sector. Another interviewee felt that at the federal level there were clear boundaries within agencies, and that agency-to-agency boundaries are better understood than in the past.

In discussing the applicability of general security principles from the FISMA to the management of other sharable resources, there was a feeling that there might be some relevance. "The principles applied to information security are not unique to information technology systems," stated one interviewee. Another interviewee mentioned, "Any organization dealing with highly sensitive information, at a minimum should meet standards just as with FISMA." Resource protection principles from the FISMA, especially with regard to critical infrastructure, are applicable, particularly in addressing power grids and the actions to be taken with these resources. One interviewee said that the lessons learned from September 11, 2001 emphasized the need for a broader reach of security, and better protection and sharing of information concerning critical infrastructures. In this regard, the internetworking of critical infrastructure systems is an important area to address, since if one network breaks down it can cause a cascading effect that can take down the entire system. Yet another said that the FISMA mandate for coordination across agencies could effectively apply in managing other resources and sectors in the country, and that the FISMA offers a structure and standardization that is beneficial in managing other types of resources.

In the areas of contingency guidelines, FISMA provides a potential model useful for other shared resources. One interviewee pointed out that the emphasis on COOP and COG that the FISMA addresses also benefits other sectors, and that the risk management approach mandated by the FISMA is beneficial for other resource areas, particularly those involving critical infrastructures. FISMA also highlights owner-operator responsibilities that are beneficial for managing other resources and in addressing contingencies.

FISMA’s emphasis on a common standard for security is

162

beneficial to other sectors and resources, and that it provides the mission need that must be documented along with security compliance was highlighted by other interviewees.

With regards to FISMA adequately addressing the need for controlling user behaviors on the Internet and additional policies for security behavior, there was a general feeling among the interviewees that FISMA primarily addresses technology issues, with little emphasis on correcting human behaviors. One interviewee said, “It’s all about security controls,” and added that there are few incentives in policy for maintaining good Internet behaviors.

Additional Policy Ideas for Cybersecurity

This part of the interview examined ideas from the interviewees about additional policy areas that could be useful for cybersecurity. Interviewees were asked about aspects of the issues related to Subsidiary Question 2a. Interview question 13 was the source of these data, as indicated in Table 4.38.


Table 4.38 Question on Additional Policy Ideas

Additional Policy Ideas for Cybersecurity (Supporting Subsidiary Research Question 2a)
- What new ideas in Federal cybersecurity policy should now be included in future policies?

Education and user awareness training were cited by some interviewees as important for improving cybersecurity, especially with organizational insiders. One interviewee highlighted the need for better education of Internet users, to help make them smarter in their Internet practices: "the human dimension is the weakest link." An interviewee stated that there is a need to have policies that focus more on the business processes and cultural aspects of Internet use; there should be less emphasis on addressing cybersecurity as a purely technological issue.

"Continuous monitoring is an important area," claimed one interviewee. Another interviewee stated that cybersecurity cannot be achieved simply through a security lens, and that the problem is not only in the government sector. "Take the original intent of FISMA and make it dynamic with other areas," recommended one interviewee.

There was a general feeling among interviewees that identity management is important to cybersecurity. Several interviewees suggested that there is a need to emphasize verification of user identity, and that bringing more user attribution information into cybersecurity policy is important. Identity management for malicious external and insider threats as well as unintentional insider threats was highlighted by interviewees. Interviewees suggested that addressing attribution issues with Internet usage is the number-one federal policy priority; bad actors want to be transparent, or undetected; they take actions to achieve transparency; and that is a liability that must ultimately be addressed. Balancing privacy with identity management and monitoring was also a common need highlighted by interviewees. One interviewee said that new policy must resolve the existing issues concerning the tension between being open on the Internet and being secure; there has not been enough debate about the tradeoffs involved in this tension. Another interviewee emphasized, "There has to be an integration of privacy policies with cybersecurity."

Integrating and Summarizing the Results

In reviewing the results of both the quantitative and qualitative phases of data analysis, some common themes emerged. This section summarizes those general themes.

Judging from the results of both the quantitative and qualitative analyses, maintaining the trust of other Internet users appears to be important to users, and it seems that people generally reflexively avoid Internet activities that could cause them to lose this trust.

Most survey respondents said that they generally considered getting approval before posting information about other Internet users, and checking e-mail sources before forwarding e-mails to others. Likewise, most of the interviewees felt that maintaining trust is important, especially in certain transactional actions. In relating these results to general commons usage, as scholars such as Ostrom have noted with regard to maintaining trust in collective action involving a commons, the results suggest trust is relevant and individuals are generally reflexive in their efforts to avoid jeopardizing this trust.

From both the quantitative and qualitative results, it would seem that maintaining reciprocity based on elements of trust and reputation is important to Internet users, and that reflexivity plays a role. That is, survey respondents generally considered getting approval before posting information on other Internet users, and/or before sharing the contact information from other Internet users. Maintaining reciprocity was also seen as important, especially in connection with certain transactional activities identified by the interviewees. In relating these results to general commons usage, as noted by Ostrom on the importance of reciprocity, reputation, and trust, the results indicate that individuals are generally reflexive in their efforts to avoid jeopardizing reciprocity with others.

The results of both the quantitative and qualitative analyses suggest that Internet users are generally more careful with Internet activities, and will alter their behaviors, if they know that their activities are being monitored. Moreover, the relationship of reflexivity in avoiding these actions is heightened under the circumstances when a user's personal identity is known. Known identities and the attribution of actions are very relevant aspects in understanding reflexive action in environments with third-party monitoring. (As will be discussed in Chapter 5, identity management is an important factor for consideration in future cybersecurity policy.)

The results from the quantitative and qualitative analyses were mixed as to whether individuals avoid actions that might cause them to lose access to a resource such as the Internet. Generally, survey respondents indicated that they knew which online activities jeopardized their security; but the cybersecurity experts maintained that many users really have no idea what the ramifications of their actions might be. The results of the research actually seem to indicate there is a sense of complacency and an inappropriate level of comfort with Internet use among general users. (Suffice it to say that cybersecurity policy experts who have daily exposure to the problems generated by Internet use are likely to have greater insight into the real nature of threats and vulnerabilities than general Internet users.)

However, it seems that some self-monitoring action may mitigate at least some of the security issues facing common Internet users. Relating the results of this research to general commons management, knowing the nature of the resource is very important in shaping actions. As discussed in Chapter 2 regarding the IAD Framework, "the physical attributes of a resource always play an essential role in shaping the community and the decisions, rules, and policies" governing it (Ostrom and Hess, 2007:45-46).

The results of both the quantitative and qualitative analyses suggest reflexivity is relevant to some self-managed use of the Internet in the context of Federal cybersecurity policy. A number of such circumstances have already been discussed with regards to maintaining trust and reciprocity, third-party monitoring environments, and personal use of the Internet. A recurring theme was that new cybersecurity policy needs to do a better job of addressing more of the "human factors" in addition to technological factors in order to maintain cybersecurity, and that policies should also be oriented to shape human behaviors.

As suggested through both the quantitative and qualitative analyses, introducing a greater degree of identity management into cybersecurity policy, and finding a means to establish attribution, will also be beneficial in maintaining security. As this research has suggested, when the identities of individuals are publicly known, it can have an effect in shaping user behaviors.

The compilation of the quantitative and qualitative results suggests that Reflexivity Theory, in concert with Institutional Theory, provides a constructive and valuable added dimension for formulating commons governance policy. While the Internet was the focus of this research, there are useful elements that can be applied to other commons scenarios.

Summary

This chapter presented and summarized the results obtained through the mixed methods research. It presented the analysis of the quantitative data that initially explored the relevance of the link between the Reflexivity and Institutional Theories, as well as the relationship of agency and reflexivity to Internet use. Qualitative data analysis elaborated on these initial findings and explored their applicability to Federal cybersecurity policy as well as to general commons governance policies. As the results suggest, there is a relationship between reflexivity and certain user actions on the Internet; and both public and private-sector experts in cybersecurity policy were in general agreement concerning policy concepts that support this relationship. Chapter 5 will summarize the main conclusions of the research described here; implications for applying the results to future commons policy; and suggestions for further research.


Chapter 5: Conclusions and Research Implications

Introduction

The purpose of the exploratory mixed methods research described here was to examine how two theories—Reflexivity Theory and Institutional Theory—and their common theoretical element of human agency can be used to improve the formulation of commons governance policies. The applicability of these theories to governance policies that affect the Federal government's secure use of the Internet was also explored. This chapter summarizes the main research findings; proposes implications for applying the findings to future policy; and suggests areas for additional research.

Summary of the Findings

As discussed in Chapter 1, ruin of a commons can result when self-serving individuals use the resource recklessly rather than adhering to conservation-minded collective action in cooperation with other community members. Collective action is key to avoiding a "tragedy of the commons." The shaping of human behaviors is needed in order to create order and to encourage greater collective action, whether the affected resource is the Internet or another resource. Human agency is the nexus point and common linking element between the Reflexivity and Institutional Theories: considering the importance of reflexivity and human agency in shaping behavior suggests a viable means to address new ways to promote collective action through policy.

Reflexivity Theory, linked with Institutional Theory, can indeed provide a constructive and valuable added theoretical dimension in developing self-governing policies. As Figure 5.1 shows, the IAD Framework helps provide an instantiation of the common element of human agency in the Action Arena. As reviewed in Chapter 2, the IAD Framework also provides a viable approach for analyzing the rules and norms affecting a commons, especially the Internet and cybersecurity.


Figure 5.1 Reflexivity and Institutional Theory in the IAD Framework

The findings of the research described here provide some important implications, through the linkage of the Reflexivity and Institutional theories, for revising some of the basic aspects of the theories. First and foremost, Reflexivity Theory brings more human behavioral considerations to Institutional Theory, which generally emphasizes a structured, top-down, analytical view involving formal and informal rules, norms, and organizations as part of the institution. While rational choice theory, an aspect of rational choice institutionalism, does address human motivations and actions, particularly in the satisfaction of utility, Reflexivity Theory brings an added dimension of an actor's self-reflection and self-monitoring in social settings that is beneficial. Because Reflexivity Theory focuses on the individual actor, with an emphasis on self-monitoring in shaping behaviors, an institution's informal constraints ("self imposed codes of conduct," North, 1993) play a greater role in promoting collective action in lieu of formally established rules (such as laws and constitutions). With the potential for more self-governance involving the individuals themselves, there is the benefit of reducing, but not necessarily eliminating, some of the need for organizational elements to manage and monitor governance processes. Incorporating Reflexivity Theory also revises the Institutional Theory approach, which takes a somewhat top-down view of the broader elements in institutions, by introducing a bottom-up analysis that addresses the individual affected by the institution.

With regard to the research findings refining cybersecurity and, more specifically, FISMA: in addressing the weakest link in cybersecurity—the individual user—there is a need for policy to emphasize human behavior more than it has in the past, where the focus has been more on technology and administrative issues. Incorporating more consideration of human behavior, including how reflexive behavior shapes action, and updating policy to address new technologies and threats—external, malicious insider, and unintentional insider—would be a valuable refinement to Federal cybersecurity policy, and in particular FISMA. Moreover, FISMA provides valuable principles of resource protection and standards that can benefit the management of other commonly shared resources. These include FISMA's emphasis on organizational structure and oversight in providing an established governance approach, including its emphasis on addressing owner-operator relationships in cybersecurity. FISMA's emphasis on addressing contingency requirements, such as continuity of operations, presents considerations for future policies to protect resources from natural and man-made disasters.

Implications for Future Policy

The research described here has some important implications for future commons governance policy and, in particular, Federal cybersecurity policy. Reflexivity Theory, together with Institutional Theory, provides a constructive dimension concerning human behavior that should be considered when developing policies oriented towards greater self-governance. However, no two commons are exactly alike; there are variations in the externalities that can adversely affect various commons; and in certain circumstances, such as with the Internet, there may be the need to augment self-governing policies with formal rules-based governance. The Internet is a dangerous place, and while self-governance approaches may be effective for dealing with unintentional insider threats, malicious internal and external threats would require a more formal approach. However, regardless of the approach taken, policies affecting a commons, and especially cybersecurity policy, must first and foremost address the challenge of shaping human behaviors towards greater collective action.

There should also be an emphasis on developing policy to improve identity-monitoring and managed access to resources that ensures some measure of accountability and oversight of the users of a commons. This is also useful in determining the attribution of user actions that are detrimental to the long-term survival of the resource.

However, such policies must balance the benefits of having open access to the resource with the security risks that may result, as well as with the need to protect privacy. The need for protecting privacy is cited in recommendations to the President on future cybersecurity policy (Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency, 2008:61-63; and White House, 2009:iii). As the research described here has suggested, individuals demonstrate reflexivity by reflecting upon and modifying their own actions, particularly when their identities are known to others, including third-party monitoring organizations. In cases with malicious insiders, anonymity is desired; as Camp writes, "Security is a social activity in which the most malicious participants work to be the least visible" (Camp, 2011:103).

An emphasis on the importance of establishing trusted identities on the Internet is beginning to take shape on the governmental agenda, and the government could benefit from Reflexivity Theory concepts informing new policy. The National Strategy for Trusted Identities in Cyberspace claims that when using innovative and secure online identity solutions, "Individuals can better trust the identities of the entities with which they interact" (White House, 2011:5). Perpetrators of cyber attacks are also sensitive to this aspect of identity and attribution:

It is important to note that the creators and propagators of malware may wish to keep their targets unaware of infection—remaining invisible and unidentifiable as the source of control. (Cerf, 2011:62)

The National Science and Technology Council emphasized the importance of using technologies that can better authenticate user identities in cyberspace:


Authentication is fundamental to all information security because it connects the actions performed on a computer to an identified user that can be held accountable for those actions. (National Science and Technology Council, 2006:32)

The results of this research support descriptive policy considerations for identity monitoring and access, and suggest that public visibility of identity can indeed have some influence on reflexively shaping Internet user behaviors.

The use of trusted identities can enhance trust and reciprocity between users of common resources. As this research has suggested, individuals do reflect on their actions to maintain trust and reciprocity with other Internet users in certain transactional situations. Policies should be developed to promote environments that are conducive to developing trust and reciprocity, especially when the matters of security and reliability of transactions are particularly important. Banking transactions, for example, require environments that ensure the confidentiality of the information shared, and provide all parties assured identity of those involved in the transaction. Sales transactions also require the protection of proprietary information, and accurate identification of the parties involved. Cybersecurity policies need to promote an online environment that supports these protected transactions.

Policies that address the protection of privacy are, and will continue to be, extremely important in the new homeland security environment, and especially in addressing cybersecurity policy. The Computer Science and Telecommunications Board of the National Research Council writes:

Privacy rights are never absolute but rather are rights that society balances with the need or desirability for disclosure under various circumstances. In many cases, individuals can make their own decisions and choices regarding their privacy. (National Research Council, 1994:4)

For example, concerns about protecting privacy in the searching and cataloguing of information available through the Internet led to a recent U.S. Justice Department investigation into Google, Inc. and its Street View application (Schneider, 2012). Protecting the privacy of personal information is critical in Internet transactions. The development and implementation of privacy policies can benefit from understanding the principles of both Reflexivity Theory and Institutional Theory. For example, incorporating the concept of reflexivity in addressing privacy issues puts some of the onus for protecting private information back on individuals as they reflect on themselves, their place in the social setting, and how much private information they wish to release to others. Policy approaches that consider the importance of personal responsibility and self-monitoring, such as increased use of informed consent, where the individual decides what personal information will be released (National Research Council, 1994:106), rather than expecting the government or institutions to be solely responsible for protecting privacy, are supported by concepts from Reflexivity Theory.

Institutional Theory can also formalize and structure the way privacy policies are analyzed, developed, and implemented, following the key elements of rules, norms, and organizations within an institution. Laws and regulations protecting privacy, such as the E-Government Act of 2002, form the rule-based foundation of the governance regime for protecting privacy in a networked environment. Balanced norms should include the identification of new practices that organizations will follow, whether operating on Internet-based systems or managing other shared resources.

Organizations, such as the Federal government, have Chief Privacy Officers and formal organizations for overseeing the requirements of privacy laws and regulations. Privacy norms also have a complementary role in sustaining institutions: "Privacy norms do not merely protect individuals; they play a crucial role in sustaining social institutions" (Nissenbaum, 2011:44). For these reasons, both Reflexivity Theory and Institutional Theory can promote a shared risk environment between individuals and the government in developing and implementing privacy policies.

Policies that address the need to educate users about potential threats, both to them personally and to their use of a shared resource, including training on acceptable norms of behavior, are also important. As the research presented here has suggested, Internet users may not be fully aware of the nature or degree of the threats that their Internet use may introduce to their organization. Developing policies that encourage more user education and better situational awareness, including awareness of threat information, is of crucial importance for cybersecurity. Federal employees are frequently notified of potential cybersecurity threats; notifications to general Internet users are also important. As Table 4.1 indicated, Internet users feel that it is very important to be well informed about potential security problems on the Internet.

Policies that promote standardization of information in interparty transactions and other information-sharing situations are also important. In protecting infrastructure resources, such as electricity and water, having a standardized way to report threat information is of critical importance. Banking and sales transactions also require standards for information sharing.
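
As a hypothetical illustration of this standardization point, the sketch below defines a single threat-report format that operators in different sectors could share, so that reports from banking, electricity, and water systems can be merged and compared by machine. The field names are invented for illustration and are not taken from any existing reporting standard.

import json
from datetime import datetime, timezone

REQUIRED_FIELDS = {"reported_at", "sector", "threat_type", "severity", "description"}

def make_threat_report(sector: str, threat_type: str,
                       severity: int, description: str) -> str:
    report = {
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "sector": sector,
        "threat_type": threat_type,
        "severity": severity,  # e.g., 1 (low) through 5 (critical)
        "description": description,
    }
    # Every report carries the same required fields, which is what makes
    # cross-sector sharing and aggregation possible.
    assert REQUIRED_FIELDS <= report.keys(), "report is missing required fields"
    return json.dumps(report)

print(make_threat_report("electricity", "phishing", 3,
                         "Spear-phishing of control-room staff"))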

Policy implementation can be a challenge and is an important consideration early in policy development: it is important to ensure that policy can be implemented by the people who actually own, operate, and maintain commons resources. As with the nation’s critical infrastructures, significant portions of the Internet infrastructure are owned and operated by the private sector; therefore, policy-makers must consider the ability of the private sector to implement federal policy. In order to be effective, policy must be actionable, not just administrative or “checking a box,” as some interviewees suggested was the case with FISMA. To be actionable, policy must have clearly defined and achievable courses of action, with measures to determine the degree of effectiveness of the implementation and the benefits realized.

Finally, new policy development affecting shared resources such as the Internet should provide opportunities for users to participate in the process of developing new policy concepts, as suggested by the survey responses in Table 4.2. For example, the White House provides a place on its website where citizens can submit petitions on important issues of policy (White House, 2012). Providing citizens an opportunity to participate in cybersecurity policy development, through web-based blogs or Internet-based collaboration forums, is another way to encourage citizen participation in governance processes.

Considerations for Future Research

There have been calls for research into the areas of cybersecurity technology and policy to address accountability and security, user access control, the auditing and traceability of actions, and ascertaining the attribution of actions (Committee on Improving Cybersecurity Research in the United States, 2007:8). Continued research into the behavioral and policy aspects of cybersecurity has also been called for by the White House:

The Federal government could also consider ways in which it could focus more resources on research into possible ‘game-changing’ areas, such as behavioral, policy, and incentive-based cybersecurity solutions. (White House, 2009:19)

The research described here has made it clear that, as with other commons resources, policies addressing Internet use must consider issues of human behavior in order to be effective. The research examined the relevance of reflexivity and human agency, along with Institutional Theory concepts, using the IAD Framework, in addressing the full scope of the challenges involved in maintaining cybersecurity. Further research into the issues of both malicious and unintentional insider threats, into how identity and anonymity can affect Internet user action, and into how public policy addresses these areas would also be beneficial.

Another viable area of research would be the examination of policy formulation tools such as the IAD Framework, tailored to the Internet environment and based on Institutional Theory. As Chapter 2 suggests, the IAD Framework can help document and analyze a variety of issues and factors affecting cybersecurity; as such, it is useful for exploring the impacts of physical, community, and institutional change on Internet user actions and outcomes. Continued research into the concepts of Reflexivity Theory, especially into how reflexive action may be relevant even for malicious insiders, would also be useful in better understanding cybersecurity issues. (The research described here focused only on unintentional insider threats.)
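
As one sketch of what such a tailored tool might look like, the outline below arranges the IAD Framework's main components as the text describes them: exogenous variables (physical conditions, community attributes, and rules-in-use) feeding an action arena that produces actions and outcomes. The class layout is this author's illustration, not Ostrom's notation or any published implementation.

from dataclasses import dataclass

@dataclass
class ExogenousVariables:
    physical_conditions: list[str]   # e.g., shared network infrastructure
    community_attributes: list[str]  # e.g., norms among Internet users
    rules_in_use: list[str]          # e.g., FISMA reporting provisions

@dataclass
class ActionArena:
    actors: list[str]
    situation: str

def evaluate(exogenous: ExogenousVariables, arena: ActionArena) -> dict:
    # Placeholder analysis step: in actual IAD work the analyst traces how
    # each change in the exogenous variables alters actions and outcomes.
    factors = (len(exogenous.physical_conditions)
               + len(exogenous.community_attributes)
               + len(exogenous.rules_in_use))
    return {"arena": arena.situation, "factors_considered": factors}

arena = ActionArena(actors=["Internet users", "agencies"], situation="patch compliance")
exo = ExogenousVariables(["shared network"], ["trust norms"], ["FISMA reporting rules"])
print(evaluate(exo, arena))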

There may also be research applications of Reflexivity and Institutional Theories to external threat actors such as terrorist groups and nation-states. What are the rules, norms, and organizations that might be associated with these actors as they plan and perpetrate cybersecurity attacks?

Finally, because the methodologies described here did not allow for a probability sample that could support tests of statistical significance, conducting research on unintentional and malicious insider threats using sampling from a broader population of Internet users could be beneficial. Such an approach could help in examining any causal relationships by which reflexivity and human agency may affect the use of a commons, and in particular the Internet.
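
As a small illustration of the sampling suggestion above, the sketch below draws a simple random sample from a sampling frame, giving every member a known and equal chance of selection, which is what the convenience sample used in this research lacked. The frame here is synthetic and purely illustrative.

import random

def simple_random_sample(frame: list, n: int, seed: int = 1) -> list:
    # A seeded generator makes the draw reproducible for documentation.
    rng = random.Random(seed)
    return rng.sample(frame, n)

frame = [f"user_{i}" for i in range(10_000)]  # synthetic population of Internet users
print(simple_random_sample(frame, 5))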

Concluding Remarks

The research described here offers a new perspective for addressing cybersecurity and commons governance policy. While cybersecurity policy thus far has focused primarily on external threats and technology, consideration of both malicious and unintentional insider threats is needed in order to address the full scope of the cybersecurity problem, because human behaviors are highly relevant to cybersecurity concerns. It is hoped that this research will help policy-makers find ways to draw on the social sciences, through application of Reflexivity and Institutional Theories, in developing innovative new approaches to cybersecurity policy. The social sciences, especially economics, political science, and sociology, can lend perspectives critical to adequately addressing cybersecurity. Bringing these social science perspectives to bear on important concerns of technology and the Internet could be very helpful in addressing the most challenging issues facing society today.

Appendix A: Internet Survey

[The survey instrument appears on pages 187-196 of the original document and is not reproduced in this text version.]

Appendix B: Interview Questions

1. Please briefly describe your career background in cybersecurity policy.

2. What do you see as the policy strengths of the Federal Information Security Management Act (FISMA) of 2002? What are some of the policy shortfalls of the FISMA of 2002?

3. What methodologies did policy-makers and their staff members follow to research, analyze, and formulate policy for the FISMA of 2002?

4. Do you think there is a clear understanding of the infrastructure boundaries of affected Federal information systems governed by FISMA policies? If not, why not?

5. What general security principles from the FISMA could apply to managing other sharable resources, even non-technological resources such as natural resources?

6. Do you feel that people reflect on the ramifications of their actions when they use the Internet? Do you think that people would generally alter their online behaviors if their true identities were visible to other users using the Internet? Should these considerations be factored into new cybersecurity policy?

7. What self-monitoring/self-policing practices do you think could promote more effective cybersecurity?

8. How important is maintaining trust and reciprocity between users on the Internet? Do you think users reflect on their own actions to maintain trust and reciprocity with others?

9. Do social networking sites promote greater trust between Internet users? Could increased use of these sites promote better security practices as users collaborate with other users? What policies should address such interactions?

10. Do you think third parties, such as the Federal or State governments, should actively monitor Internet user actions? If so, what organizations should perform this monitoring? If users were aware of third-party monitoring, do you think they would alter their online behaviors?

11. Should a graduated system of penalties be in place for users who demonstrate poor security behaviors on the Internet?

12. In general, do you think that the FISMA adequately addresses the need for controlling user behaviors on the Internet? What additional policies would promote more security conscious behaviors?

13. What new ideas in Federal cybersecurity policy should now be included in future policies?

14. Do you have anything else to add?

Appendix C: Survey Solicitation

Group Solicitation:

Your insight is needed for new cybersecurity policy approaches. Research is underway at The George Washington University, and participants are needed for a confidential, 10-15 minute online survey. Please access the survey at: https://www.surveymonkey.com/s/internet_security_policy_survey

Appendix D: References

Ackoff, Russell L. (1953). The Design of Social Research. Chicago: The University of Chicago Press.

Alvesson, Mats, Hardy, Cynthia, and Harley, Bill. (2008). Reflecting on Reflexivity: Reflexive Textual Practices in Organization and Management Theory. Journal of Management Studies, 45(3):480-501.

Anderson, James E. (2003). Public Policymaking: An Introduction (5th ed.). Boston: Houghton Mifflin Company.

Archer, Margaret S. (2003). Structure, Agency and the Internal Conversation. Cambridge: Cambridge University Press.

Babbie, Earl. (2010). The Practice of Social Research (12th ed.). Belmont, CA: Wadsworth.

Bandura, Albert. (1989). Human Agency in Social Cognitive Theory. American Psychologist, 44(9):1175-1184.

Bartlett, Steven J. (1987). Varieties of Self-Reference. In Steven J. Bartlett and Peter Suber (Eds.), Self-Reference: Reflections on Reflexivity. Boston: Martinus Nijhoff Publishers.

Barzelay, Michael and Gallego, Raquel. (2006). From ‘New Institutionalism’ to ‘Institutional Processualism’: Advancing Knowledge about Public Management Policy Change. Governance: An International Journal of Policy, Administration, and Institutions, 19(4):531-557.

Berger, Peter L. and Luckmann, Thomas. (1966). The Social Construction of Reality. In Craig Calhoun, Joseph Gerteis, James Moody, Steven Pfaff, and Indermohan Virk (Eds.), Contemporary Sociological Theory (2nd ed.). Malden, MA: Blackwell Publishing.

Bethurem, Nancye Lou. (2009). “Flowing Toward Sustainability: Two Stream Adjudications Analyzed Under the IAD Framework.” (Ph.D. diss., University of Nevada, Las Vegas).

Bourdieu, Pierre. (2004). Science of Science and Reflexivity. Translated by Richard Nice. Chicago: The University of Chicago Press.

Brock, Gerald W. (1994). Telecommunication Policy for the Information Age: From Monopoly to Competition. Cambridge, MA: Harvard University Press.

Calhoun, Craig, Gerteis, Joseph, Moody, James, Pfaff, Steven, and Virk, Indermohan (Eds.). (2007). Contemporary Sociological Theory (2nd ed.). Malden, MA: Blackwell Publishing.

Callero, Peter L. (2003). The Sociology of the Self. Annual Review of Sociology, 29:115-133.

Camp, L. Jean. (2011). Reconceptualizing the Role of Security User. Daedalus, 140(4):93-107.

Campbell, Douglas. (2010). “A Theory of Consciousness.” (Ph.D. diss., The University of Arizona).

Campbell, Donald T. and Stanley, Julian C. (1963). Experimental and Quasi-Experimental Designs for Research. Boston: Houghton Mifflin Company.

Cappelli, Dawn, Moore, Andrew, and Trzeciak, Randall. (2012). The CERT Guide to Insider Threats: How to Prevent, Detect, and Respond to Information Technology Crimes (Theft, Sabotage, Fraud). Upper Saddle River, NJ: Addison-Wesley.

Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency. (2008). Securing Cyberspace for the 44th Presidency: A Report of the CSIS Commission on Cybersecurity for the 44th Presidency. Washington: Center for Strategic and International Studies.

Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency. (2010). A Human Capital Crisis in Cybersecurity: Technical Proficiency Matters. Washington: Center for Strategic and International Studies.

Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency. (2011). Cybersecurity Two Years Later. Washington: Center for Strategic and International Studies.

Cerf, Vinton G. (2011). Safety in Cyberspace. Daedalus, 140(4):59-69.

Chai, Sangmi. (2009). “Three Essays on Behavioral Aspects of Information Systems: Issues of Information Assurance and Online Privacy.” (Ph.D. diss., University of Buffalo).

Cheshire, Coye. (2011). Online Trust, Trustworthiness, or Assurance? Daedalus, 140(4):49-58.

Choe, Jaesong. (1992). “The Organization of Urban Common- property Institutions: The Case of Apartment Communities in Seoul.” (Ph.D. diss., Indiana University).

Clark, David D. (2011). Introduction. Daedalus, 140(4):5- 16.

Clark, Kenneth N. (1997). National Reorganization and Unity of Effort in the Information Operations Era. Montgomery, AL: U.S. Air Force Air Command and Staff College.

Cleaver, Frances. (2007). Understanding Agency in Collective Action. Journal of Human Development, 8(2):223-244.

Coase, Ronald H. (1991). “The Institutional Structure of Production.” Nobel Prize Lecture, December 9, 1991. Stockholm: The Nobel Foundation.

Coleman, Charles J. and Palmer, David D. (1973). Organizational Application of System Theory. Business Horizons, 16(6):77-84.

Committee on Improving Cybersecurity Research in the United States. (2007). Seymour E. Goodman and Herbert S. Lin (Eds.), Toward a Safer and More Secure Cyberspace. Washington: The National Academies Press.

Commons, John R. (1961). Institutional Economics: Its Place in Political Economy. Madison: University of Wisconsin Press.

Conover, W.J. (1999). Practical Nonparametric Statistics (3rd ed.). New York: John Wiley and Sons, Inc.

Cook, Karen S. and Cooper, Robin M. (2003). Experimental Studies of Cooperation, Trust, and Social Exchange. In Elinor Ostrom and James Walker (Eds.), Trust and Reciprocity: Interdisciplinary Lessons from Experimental Research. New York: Russell Sage Foundation.

Creswell, John W. (2003). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.

Creswell, John W. (2007). Qualitative Inquiry and Research Design: Choosing Among Five Approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.

Creswell, John W. and Plano-Clark, Vicki L. (2007). Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications.

Creswell, John W., Plano-Clark, Vicki L., Gutmann, Michelle L., and Hanson, William E. (2003). Advanced Mixed Methods Research Designs. In Abbas Tashakkori and Charles Teddlie (Eds.), Handbook of Mixed Methods in Social and Behavioral Research. Thousand Oaks, CA: Sage Publications.

Davies, Charlotte Aull. (1999). Reflexive Ethnography: A Guide to Researching Selves and Others. London: Routledge.

Davis, Stephen F. and Smith, Randolph A. (2005). An Introduction to Statistics and Research Methods: Becoming a Psychological Detective. Upper Saddle River, NJ: Pearson Education, Inc.

Deloitte. (2009). Protecting What Matters: The 6th Annual Global Security Survey. Retrieved from http://www.deloitte.com/assets/Dcom-Shared%20Assets/Documents/dtt_fsi_GlobalSecuritySurvey_0901.pdf

Demsetz, Harold. (1967). Toward a Theory of Property Rights. American Economic Review, 57(2):347-359.

Denmark, Abraham M. and Mulvenon, James. (2010). Chapter 1: Contested Commons: The Future of American Power in a Multipolar World. In Abraham M. Denmark and James Mulvenon (Eds.), Contested Commons: The Future of American Power in a Multipolar World. Washington: Center for a New American Security.

Director of National Intelligence. (2009). The National Intelligence Strategy of the United States of America. Washington: The Office of the Director of National Intelligence.

Dixit, Avinash K. (2004). Lawlessness and Economics: Alternative Modes of Governance. Princeton: Princeton University Press.

Dodd, Annabel Z. (1998). The Essential Guide to Telecommunications. Upper Saddle River, NJ: Prentice Hall.

Dunsire, Andrew. (1993). Modes of governance. In Jan Kooiman (Ed.), Modern Governance: New Government—Society Interactions. London: Sage Publications.

Durant, Robert F. (2007). Toxic Politics, Organizational Change, and the ‘Greening’ of the U.S. Military: Toward a Polity-Centered Perspective. Administration and Society, 39(3):409-414,416-446.

Easton, David. (1965). A Framework for Political Analysis. Englewood Cliffs, NJ: Prentice-Hall, Inc.

Edwards, Gregory. (2011). “Federal Government Information Systems Security Management and Governance are Pacing Factors for Innovation.” (Ph.D. diss., Walden University).

Fans, Dirk Rietveld. (2005). “Key Institutional Elements Enhancing or Obstructing Sustainable Rural Livelihoods Through Water Resources Management in the Indian Subcontinent.” (Ph.D. diss., The University of York).

Finlay, Linda. (2002). ’Outing’ the Researcher: The Provenance, Process, and Practice of Reflexivity. Qualitative Health Research, 12(4):531-545.

Foray, Dominique. (2004). The Economics of Knowledge. Cambridge: Massachusetts Institute of Technology Press.

Forbes, Joan. (2008). Reflexivity in professional doctoral research. Reflective Practice, 9(4):449-460.

Francisco, Matthew R. (2010). “Agents on the Loose: Embodied Reflexive Practice in Emerging Computational Social Science.” (Ph.D. diss., Rensselaer Polytechnic Institute).

Frankfort-Nachmias, Chava and Nachmias, David. (2000). Research Methods in the Social Sciences (6th ed.). New York: Worth Publishers.

Friedman, Lee S. (2002). The Microeconomics of Public Policy Analysis. Princeton, NJ: Princeton University Press.

Giddens, Anthony. (1984). The Constitution of Society. Berkeley, CA: University of California Press.

Giddens, Anthony. (1991). Modernity and Self-Identity: Self and Society in the Late Modern Age. Stanford, CA: Stanford University Press.

Gilligan, John. (2010). Federal Information Security: Current Challenges and Future Policy Considerations. Written Testimony to the Subcommittee on Government Management, Organization and Procurement, Committee on Oversight and Government Reform (111th Congress). March 24, 2010.

Glenberg, Arthur M. and Andrzejewski, Matthew E. (2008). Learning From Data: An Introduction to Statistical Reasoning (3rd ed.). New York: Lawrence Erlbaum Associates.

Grant, Gordon J. (2010). “Ascertaining the Relationship between Security Awareness and the Security Behavior of Individuals.” (Ph.D. diss., Nova Southeastern University).

Gravetter, Frederick J. and Wallnau, Larry B. (2009). Statistics for the Behavioral Sciences (8th ed.). Belmont, CA: Wadsworth, Cengage Learning.

Green, Donald P. and Shapiro, Ian. (1994). Pathologies of Rational Choice Theory: A Critique of Applications in Political Science. New Haven: Yale University Press.

Greif, Avner. (2006). Institutions and the Path to the Modern Economy: Lessons from Medieval Trade. Cambridge: Cambridge University Press.

Hafner, Katie and Lyon, Matthew. (1996). Where Wizards Stay Up Late: The Origins of the Internet. New York: Simon and Schuster.

Hall, Peter A. and Taylor, Rosemary C.R. (1996). Political Science and the Three New Institutionalisms. Political Studies, 44(5):936-957.

Hamilton, Gary G. and Feenstra, Robert (1998). Chapter 7: The Organization of Economies. In Mary C. Brinton and Victor Nee (Eds.), The New Institutionalism in Sociology. New York: Russell Sage Foundation.

Hardin, Garrett. (1968). The Tragedy of the Commons. Science, 162(3859):1243-1248.

Heikkila, Tanya, and Isett, Kimberley Roussin. (2004). Modeling Operational Decision Making in Public Organizations: An Integration of Two Institutional Theories. The American Review of Public Administration, 34(1):3-19.

Herath, Tejaswini. (2008). “Essays on Information Security Practices in Organizations.” (Ph.D. diss., The State University of New York at Buffalo).

Hess, Charlotte. (2008). “Mapping the New Commons.” Presentation, 12th Biennial Conference of the International Association for the Study of the Commons. July 14-18, 2008.

Hess, Charlotte and Ostrom, Elinor. (2007). Introduction: An Overview of the Knowledge Commons. In Charlotte Hess and Elinor Ostrom (Eds.), Understanding Knowledge as a Commons: From Theory to Practice. Cambridge, MA: The MIT Press.

Ho, Robert. (2006). Handbook of Univariate and Multivariate Data Analysis and Interpretation with SPSS. New York: Chapman and Hall/CRC.

Holland, Ray. (1999). Reflexivity. Human Relations, 52(4):463-484.

International Telecommunication Union. (2009). Series X: Data Networks, Open System Communications and Security (Recommendation ITU-T X.1205). Geneva: International Telecommunication Union.

International Telecommunication Union. (2011). “Global Number of Internet Users.” ITU Statistics. Retrieved from http://www.itu.int/ITU-D/ict/statistics/

Johnson, R. Burke. (1997). Examining the Validity Structure of Qualitative Research. Education, 118(2):282-292.

Katz, Daniel, and Kahn, Robert L. (1978). The Social Psychology of Organizations (2nd ed.). New York: John Wiley and Sons, Inc.

King, John Leslie, Gurbaxani, Vijay, Kraemer, Kenneth L., and McFarlan, F. Warren. (1994). Institutional Factors in Information Technology Innovation. Information Systems Research, 5(2):139-170.

Kleinsasser, Audrey M. (2000). Researchers, Reflexivity, and Good Data: Writing to Unlearn. Theory Into Practice, 39(3):155-162.

Kooiman, Jan. (1993). Social-Political Governance: Introduction. In Jan Kooiman (Ed.), Modern Governance: New Government—Society Interactions. London: Sage Publications, Inc.

Kooiman, Jan. (2003). Governing as Governance. London: Sage Publications.

Koontz, Tomas M. (1997). “Federalism and Natural Resource Policy: Comparing State and National Management of Public Forests.” (Ph.D. diss., Indiana University).

Lefebvre, Vladimir A. (1982). Algebra of Conscience: A Comparative Analysis of Western and Soviet Ethical Systems. Boston: D. Reidel Publishing Company.

Lefebvre, Vladimir A. (2006). Research on Bipolarity and Reflexivity. Lewiston, NY: The Edwin Mellen Press.

Lewis, James. (2007). Federal IT Security: The Future for FISMA. Testimony before the House Committee on Oversight and Government Reform, Subcommittee on Government Management, Organization, and Procurement, and the Subcommittee on Information Policy, Census, and National Archives, (110th Congress). June 7, 2007.

Lewis, James A. (2010). Sovereignty and the Role of Government in Cyberspace. Brown Journal of World Affairs, 16(2):55-65.

Luhmann, Niklas. (1995). Social Systems. Translated by John Bednarz, Jr., with Dirk Baecker. Stanford, CA: Stanford University Press.

Luna-Reyes, Luis Felipe and Gil-García, J. Ramón. (2011). Using Institutional Theory and Dynamic Simulation to Understand Complex E-Government Phenomena. Government Information Quarterly, 28(3):329-345.

Lustiger-Thaler, Henri, Maheu, Louis, and Hamel, Pierre. (2001). Towards a Theory of Global Collective Action and Institutions. In Pierre Hamel, Henri Lustiger-Thaler, Jan Nederveen Pieterse, and Sasha Roseneil (Eds.), Globalization and Social Movements. New York: Palgrave.

Majidi, Mehdi. (2006). “Cultural Factors in International Mergers and Acquisitions: Where and When Culture Matters.” (Ph.D. diss., The George Washington University).

Mantzavinos, C. (2001). Individuals, Institutions, and Markets. Cambridge: Cambridge University Press.

March, James G. and Olsen, Johan P. (1989). Rediscovering Institutions: The Organizational Basis of Politics. New York: Free Press.

May, Jason C. (2007). “An Institutional Analysis of Oil and Gas Sector Development and Environmental Management in the Yukon Territory.” (Ph.D. diss., Wilfrid Laurier University).

Mead, George H. (1934). Mind, Self, and Society. Charles W. Morris (Ed.). Chicago: The University of Chicago Press.

Meulman, Jacqueline J., Van der Kooij, Anita J., and Heiser, Willem J. (2004). Chapter 3: Principal Components Analysis with Nonlinear Optimal Scaling Transformations for Ordinal and Nominal Data. In David Kaplan (Ed.), The Sage Handbook of Quantitative Methodology for the Social Sciences. London: SAGE Publications.

Miller, Delbert C. and Salkind, Neil J. (2002). Handbook of Research Design and Social Measurement (6th ed.). Thousand Oaks, CA: Sage Publications.

Mulligan, Deirdre K. and Schneider, Fred B. (2011). Doctrine for Cybersecurity. Daedalus, 140(4):70-92.

Myint, Tun. (2005). “Strength of ‘Weak’ Forces in Multilayer Environmental Governance: Cases from the Mekong and Rhine River Basins.” (Ph.D. diss., Indiana University).

Narasimhan, Ajay Tejasvi. (2012). “Toward Understanding the Nature of Leadership in Alleviating State Fragility.” (Ph.D. diss., The Claremont Graduate University).

National Institute of Standards and Technology. (2010). Special Publication 800-53, Revision 3: Recommended Security Controls for Federal Information Systems and Organizations. Gaithersburg, MD: National Institute of Standards and Technology.

National Research Council. (1994). Rights and Responsibilities of Participants in Networked Communities. Dorothy E. Denning and Herbert S. Lin (Eds.). Washington: National Academy Press.

National Science and Technology Council. (2006). Federal Plan for Cyber Security and Information Assurance Research and Development. Arlington, VA: National Coordination Office for Networking and Information Technology Research and Development.

Nee, Victor. (1998). Chapter 1: Sources of the New Institutionalism. In Mary C. Brinton and Victor Nee (Eds.), The New Institutionalism in Sociology. New York: Russell Sage Foundation.

Newcomer, Kathryn E. (2011). “Strategies to Help Strengthen Validity and Reliability of Data.” (Paper, The George Washington University).

Nissenbaum, Helen. (2011). A Contextual Approach to Privacy Online. Daedalus, 140(4):32-48.

Norris, Sharon E. (2011). “Human Agency and Learner Autonomy Among Adult Professionals in an Organizational Context: Towards a New Science of Autonomous Leadership and Development.” (Ph.D. diss., Regent University).

North, Douglass C. (1993). “Economic Performance Through Time.” Nobel Prize Lecture, December 9, 1993. Stockholm: The Nobel Foundation.

Office of Management and Budget. (1996). Circular No. A-130, Management of Federal Information Resources. Washington: The White House.

Olson, Mancur. (1971). The Logic of Collective Action: Public Goods and the Theory of Groups. Cambridge, MA: Harvard University Press.

Ophuls, William. (1973). The Return of Leviathan. Bulletin of the Atomic Scientists, 29(3):50-52.

Ostrom, Elinor. (1986). An Agenda for the Study of Institutions. Public Choice, 48(1):3-25.

Ostrom, Elinor. (1990). Governing the Commons: The Evolution of Institutions for Collective Action. New York: Cambridge University Press.

Ostrom, Elinor. (2000). Collective Action and the Evolution of Social Norms. The Journal of Economic Perspectives, 14(3):137-158.

Ostrom, Elinor. (2003). Toward a Behavioral Theory Linking Trust, Reciprocity, and Reputation. In Elinor Ostrom and James Walker (Eds.), Trust and Reciprocity: Interdisciplinary Lessons from Experimental Research. New York: Russell Sage Foundation.

Ostrom, Elinor. (2005). Understanding Institutional Diversity. Princeton, NJ: Princeton University Press.

Ostrom, Elinor. (2009). “Beyond Markets and States: Polycentric Governance of Complex Economic Systems.” Nobel Prize Lecture, December 8, 2009. Stockholm: The Nobel Foundation.

Ostrom, Elinor and Hess, Charlotte. (2007). A Framework for Analyzing the Knowledge Commons. In Charlotte Hess and Elinor Ostrom (Eds.), Understanding Knowledge as a Commons: From Theory to Practice. Cambridge: The MIT Press.

Ostrom, Elinor and Walker, James. (2003). Introduction. In Elinor Ostrom and James Walker (Eds.), Trust and Reciprocity: Interdisciplinary Lessons from Experimental Research. New York: Russell Sage Foundation.

Ostrom, Elinor, Walker, James, and Gardner, Roy. (1992). Covenants With and Without A Sword: Self-Governance Is Possible. The American Political Science Review, 86(2):404-417.

Ostrom, Vincent. (1991). The Meaning of American Federalism: Constituting a Self-governing Society. San Francisco: Institute for Contemporary Studies Press.

Oxford English Dictionary. (1989, 2009). Second and Third Edition. Oxford: Oxford University Press. Retrieved from www.oed.com.proxygw.wrlc.org/view/Entry/250790

Paller, Alan. (2010). Federal Information Security: Current Challenges and Future Policy Considerations. Testimony before the Subcommittee on Government Management, Organization, and Procurement of the Committee on Oversight and Government Reform, (111th Congress). March 24, 2010.

Panetta, Leon. (2012). Remarks at the University of Louisville on March 1, 2012. American Forces Press Service. Retrieved from http://www.defense.gov/news/newsarticle.aspx?id=67396

Parsons, Talcott. (1937). The Structure of Social Action: A Study in Social Theory with Special Reference to a Group of Recent European Writers. New York: McGraw- Hill Book Company, Inc.

Parsons, Talcott. (1951). The Social System. New York: The Free Press.

Parsons, Talcott. (1990). Prolegomena to a Theory of Social Institutions. American Sociological Review, 55(3):319-333.

Parsons, Wayne. (1995). Public Policy: An introduction to the theory and practice of policy analysis. Cheltenham, UK: Edward Elgar Publishing Limited.

Patten, Mildred L. (2004). Understanding Research Methods: An Overview of the Essentials (4th ed.). Glendale, CA: Pyrczak Publishing.

Patton, Michael Quinn. (2002). Qualitative Research and Evaluation Methods (3rd ed.). Thousand Oaks, CA: Sage Publications, Inc.

Peters, B. Guy. (2005). Institutional Theory in Political Science: The ‘New Institutionalism’ (2nd ed.). London: Continuum International.

Pinhey, Nicholas Alan. (2003). “Banking on the Commons: An Institutional Analysis of Groundwater Banking Programs in California’s Central Valley.” (Ph.D. diss., University of Southern California).

Pink, Sarah. (2001). More visualising, more methodologies: on video, reflexivity, and qualitative research. The Sociological Review, 49(4):586-599.

Plato. (1894). The Theaetetus of Plato. Translation and notes by Benjamin Hall Kennedy, DD. Cambridge: C.J. Clay, M.A. and Sons, at the University Press.

Polski, Margaret M. (2003). The Invisible Hands of U.S. Commercial Banking Reform: Private Action and Public Guarantees. Norwell, MA: Kluwer Academic Publishers.

Post, Jerrold M., Ruby, Keven G., and Shaw, Eric D. (2000). From Car Bombs to Logic Bombs: The Growing Threat from Information Terrorism. Terrorism and Political Violence, 12(2):97-122.

President Obama. (2011). Executive Order 13587, “Structural Reforms to Improve the Security of Classified Networks and the Responsible Sharing and Safeguarding of Classified Information.” Federal Register 76(198):63811-63815.

President’s Information Technology Advisory Committee. (2005). Cyber Security: A Crisis of Prioritization. Arlington, VA: National Coordination Office for Information Technology Research and Development.

Rattray, Greg, Evans, Chris, and Healey, Jason. (2010). Chapter 5: American Security in the Cyber Commons. In Abraham M. Denmark and James Mulvenon (Eds.), Contested Commons: The Future of American Power in a Multipolar World. Washington: Center for a New American Security.

Rea, Louis M. and Parker, Richard A. (2005). Designing and Conducting Survey Research: A Comprehensive Guide (3rd ed.). San Francisco: Jossey-Bass.

Reynolds, Right Rev. Edward, DD, Lord Bishop of Norwich. (1826). The Whole Works of the Right Rev. Edward Reynolds, D.D. London: B. Holdsworth.

Schneider, Howard. (2012). “Google says Justice Dept. probed privacy issues associated with Street View.” Washington Post, 26 April 2012, Online version. Retrieved from http://www.washingtonpost.com/business/google-says-justice-dept-probed-privacy-issues-associated-with-street-view/2012/04/26/gIQAMwkijT_story.html

Scholte, Bob. (1972). Toward a Reflexive and Critical Anthropology. In Dell Hymes (Ed.), Reinventing Anthropology. New York: Pantheon Books.

Schumm, Gigi. (2011). Outdated FISMA Threatens Cybersecurity. Federal Times, March 6, 2011. Retrieved from http://www.federaltimes.com/article/20110306/ADOP06/103060304/

Scott, W. Richard. (2008). Institutions and Organizations: Ideas and Interests (3rd ed.). Thousand Oaks, CA: Sage Publications, Inc.

Shaw, Eric D., Ruby, Keven G., and Post, Jerrold M. (1998). Insider Threats to Critical Information Systems: Characteristics of the Vulnerable Critical Information Technology Insider (Final Report). Bethesda, MD: Political Psychology Associates, LTD.

Shaw, Eric, Ruby, Keven, and Post, Jerrold. (1998a). The Insider Threat to Information Systems: The Psychology of the Dangerous Insider. Security Awareness Bulletin 2-98:1-10.

Shea, Terrence Sinnett. (1985). “Human Rights as an Instrument in the Humanist Reconstruction of Post-Modern Political Economy.” (Ph.D. diss., University of Maryland College Park).

Silvers, Robert. (2006). Rethinking FISMA and Federal Information Security Policy. New York: New York University Law Review.

Singh, Kultar. (2007). Quantitative Social Research Methods. Los Angeles: Sage Publications.

Soros, George. (2003). The Alchemy of Finance. Hoboken, New Jersey: John Wiley and Sons, Inc.

Suber, Peter. (1987). A Bibliography of Works on Reflexivity. In Steven J. Bartlett and Peter Suber (Eds.), Self-Reference: Reflections on Reflexivity. Boston: Martinus Nijhoff Publishers.

Trochim, William M. K. (2001). Research Methods Knowledge Base (2nd ed.). Cincinnati: Atomic Dog Publishing.

Umpleby, Stuart A. (1997). Cybernetics of Conceptual Systems. Cybernetics and Systems, 28(8):635-651.

Umpleby, Stuart. (2007). Reflexivity in Social Systems: The Theories of George Soros. Systems Research and Behavioral Science, 24:515-522.

Umpleby, Stuart. (2010). From Complexity to Reflexivity: The Next Step in the Systems Sciences. In (Ed.), Cybernetics and Systems ’10. Vienna: Austrian Society for Cybernetic Studies.

USA Today. (2012). “Judge refuses to dismiss Manning’s WikiLeaks Case.” April 25, 2012. Retrieved from http://www.usatoday.com/news/military/story/2012-04-25/judge-manning-wikileaks/54538844/1

U.S. Department of Commerce, Internet Policy Task Force. (2011). Cybersecurity, Innovation and the Internet Economy. Washington: U.S. Government Printing Office.

U.S. Department of Homeland Security. (2011). Blueprint for a Secure Cyber Future. Washington: U.S. Department of Homeland Security.

U.S. Government Accountability Office. (2010). Cyberspace: United States Faces Challenges in Addressing Global Cybersecurity and Governance. Washington: U.S. Government Accountability Office.

U.S. Government Accountability Office. (2011). Information Security: Weaknesses Continue Amid New Federal Efforts to Implement Requirements (GAO-12-137). Washington: U.S. Government Accountability Office.

U.S. Government Accountability Office. (2011a). Cybersecurity: Continued Attention Needed to Protect Our Nation’s Critical Infrastructure and Federal Information Systems. Statement of Gregory C. Wilshusen before the House Committee on Homeland Security, Subcommittee on Cybersecurity, Infrastructure Protection, and Security Technologies (112th Congress). March 16, 2011. Washington: U.S. Government Accountability Office.

U.S. Office of Personnel Management. (2010). Historical Federal Workforce Tables. Retrieved from http://www.opm.gov/feddata/HistoricalTables/TotalGovernmentSince1962.asp

Von Bertalanffy, Ludwig. (1968). General Systems Theory: Foundations, Development, Applications. New York: George Braziller, Inc.

Von Foerster, Heinz. (2003/1979). Cybernetics of Cybernetics. In Understanding Understanding: Essays on Cybernetics and Cognition. New York: Springer-Verlag.

Warner, Rebecca M. (2008). Applied Statistics: From Bivariate Through Multivariate Techniques. Los Angeles: Sage Publications.

Weber, Max. (1947). The Theory of Social and Economic Organization. Translated by A.M. Henderson and Talcott Parsons. New York: Oxford University Press.

Weimer, David L. and Vining, Aidan R. (1999). Policy Analysis: Concepts and Practice. Upper Saddle River, New Jersey: Prentice Hall.

White House. (2003). The National Strategy to Secure Cyberspace. Washington: The White House.

White House. (2008). The Comprehensive National Cybersecurity Initiative. Retrieved from http://www.whitehouse.gov/cybersecurity/comprehensive-national-cybersecurity-initiative

White House. (2009). Cyberspace Policy Review: Assuring a Trusted and Resilient Information and Communications Infrastructure. Washington: The White House.

White House. (2010). Memorandum for the Heads of Executive Departments and Agencies (“Clarifying Cybersecurity Responsibilities and Activities of the Executive Office of the President and the Department of Homeland Security”). July 6, 2010. Washington: The White House.

White House. (2011). National Strategy for Trusted Identities in Cyberspace: Enhancing Online Choice, Efficiency, Security, and Privacy. Washington: The White House.

White House. (2011a). International Strategy for Cyberspace: Prosperity, Security, and Openness in a Networked World. Washington: The White House.

White House. (2012). “We the People: Your Voice in Our Government.” Website. Retrieved from https://wwws.whitehouse.gov/petitions

Williamson, Oliver E. (1996). The Mechanisms of Governance. New York: Oxford University Press.

Williamson, Oliver E. (2009). “Transaction Cost Economics: The Natural Progression.” Nobel Prize Lecture, December 8, 2009. Stockholm: The Nobel Foundation.

Yin, Robert K. (2003). Case Study Research: Design and Methods (3rd ed.). Thousand Oaks, CA: Sage Publications.

Yin, Robert K. (2009). Case Study Research: Design and Methods (4th ed.). Thousand Oaks, CA: Sage Publications.
