
Anti-Essentialism in Public Administration Conference
A decentering tendency has undermined the foundations of public administration theory
Fort Lauderdale, FL, March 2-3, 2007

Theorizing Public Administration as a Stochastic Process

Catherine Horiuchi University of San Francisco

Working Paper: Not for attribution or citation without the author’s permission

Abstract

The capacity of information networks to capture and manipulate ever-larger streams of globally acquired, real-time data accelerates fragmentation of traditional public administration protocols, away from managing stable states toward temporary and permeable framing of governmental and corporate interests. Technologies simultaneously provide historic opportunities for dissent, individualism, and small-d democratic movements. Intermittent, overlapping governance – characterized by private government, small wars, state failures, and opportunistic shifts of power and capital from public stewardship to private parties – results in ideological or pragmatic retreats from and progressions of institutional boundaries. Ephemeral balances rather than negotiated long-term settlements demarcate the edges of public and private. This fluidity of realms increasingly affects the division and allocation of administrative responsibilities between formerly firmly edged divisions of local, state, and national governments. The assumption of a static state in public administration theory does not hold. Government becomes a metaphorical fluid, an eddy that more or less retains its shape, long enough to become an object of analysis and action. The new assumption of administrative fluidity invokes a world of measurement estimating impacts of partially ordered and partially stochastic events. Sensemaking derives from sophisticated evaluative and probabilistic analyses. Traditional construction of the field, with its assumption of durable governmental operations, may no longer be a best-fit theory for multi-layered, ephemeral states.

Introduction

A casual observer might be forgiven for thinking academic preparation for

practitioners in the field called public administration involves traversing rudimentary

courses in a loose collection of social and administrative disciplines, grounded in

scholarly studies and practitioner reports of these practices in collective action. These

concepts and practices have been gathered together over the past century in an

occasionally haphazard but generally straightforward fashion, responding to the ebb and

flow of public judgment about the roles and responsibilities of governing institutions.

This constructionist approach to public administration extends to the training of its

professoriate and the development of the field’s curricula.

Public Administration finds its basis in a series of postulates and assumptions of

public governance, first established in the earliest written documents, periodically tested

and revised. The bedrock of these assumptions in modern public administration is that of a stable state. In its absence, we assume humanity exists in anarchy or anomie. This stability is assumed to last, at minimum, long enough for any of the field’s trained practitioners to act and be recognized as a successful manager of public interests through application of this study of collected concepts. In the one-class-per-field model of public

administration, a typical curriculum skims organizational theory, economics, law, policy

formulation, program implementation, ethics, and management, with a smattering of

statistical and analytic techniques.

The closing decade of the 20th century and the opening years of the 21st century

offer considerable evidence suggesting this assumption of stability results in practitioners

who, despite this collective study, may not respond appropriately in the likely range of

situations a typical administrator will encounter. Professional education and research

based on this implicit postulate do not offer practitioners the best possible preparation

for public participation and service.

Long before the terrorist attacks of September 11, 2001, and at an accelerating pace

thereafter in concert with the increasing power and capacity of electronic data systems,

the US government’s law enforcement and national security interests sought the capacity

to capture, catalog, and analyze data related to subjects of investigation. With ubiquitous systems and networks came the possibility to capture and store for an undetermined

period of time every imaginable bit of information on all possible distinct persons.

The exponential increase in storage capacity and new software programs to

capture and assign meaning to streaming data are reframing the role of government and its

relationship to corporate interests whose cooperation is essential, as much of the data

management infrastructure is privately owned and managed. These and other

technologies of mass data transfer simultaneously provide historic opportunities for dissent, individualism, and democratic actions. No detail is too mundane to be captured and posted online, potentially affecting overlapping government structures, from neighborhoods to nation-states, through the immediacy offered in network communications.

There would be little impetus to commit so much government planning and

assessment to these technological advances if existing administrative practices worked

superbly in all instances. To the degree these adoptions reflect explicit or implicit

problems, they should be examined to consider whether they improve or worsen

outcomes and democratic processes. One frequent observation is the speed and

constancy of changes which governments must address, that is, a perceived need for

administrative fluidity and responsiveness.

An assumption of administrative fluidity invokes a world of measurement

estimating impacts of partially ordered and partially stochastic events. The use of

computers to assist decision makers and governance operatives in their sensemaking

under variable conditions and time-sensitivity demonstrates an intention to include as

many known and measurable variables as possible to better model the entire universe of

possible causes and effects.

A sense of fluidity and impermanence is nothing new, particularly in politics,

where officials can rise and fall with bad practice or bad luck. As Lord Palmerston, the 19th-century British Prime Minister, noted, a state enjoys “… no permanent friends and no

permanent enemies, only permanent interests.” (Lord Palmerston, 1848) These concerns

were less troublesome in the past, however. Modern technology has given us tightly coupled

systems that are not capable of being completely free of unexpected effects. (Perrow,

1984; Taleb, 2004)

Public administration theorists and practitioners might consider radical

restructuring, beginning with revision of fundamental assumptions and concepts. This can lead to the development of a new set of competencies to manage stochastic and short-lived administrative constructs. I begin by reviewing the concept of randomness, and continue with a discussion of the problems in existing assumptions. The technical response to the problem, governmental efforts to reduce randomness and error through complete information, is then described. Three cases are considered for the role of

randomness. Lastly, ideas are offered toward restructuring assumptions that might result

in greater explanatory power regarding public responsiveness.

Regarding Randomness, or Stochastic Events and Processes

A stochastic event or instance can be considered a random, uncontrolled input that

affects a process of otherwise fully specified character so that, despite following known

rules and acting within acknowledged constraints, the outcome cannot be accurately

anticipated. A stochastic process might characterize either an unknown system that is

acting on persons and conditions, or a system with random effects that interacts with a

known process in a non-periodic, unforeseeable manner. Central to these descriptions is

the concept of randomness, at least from the point of reference of those acted upon by the

stochastic process. Inherent is a tantalizing potential for reducing seemingly random and

stochastic effects through more complete knowledge.
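The distinction can be made concrete with a toy model. The following is a minimal sketch in Python, assuming a simple service-queue setting; the function name and all parameters are invented for illustration. The service rule is fully specified, yet the outcome varies run to run because of the random input.

```python
import random

def simulate_backlog(days=30, capacity=100, seed=None):
    """A deterministic rule (fixed daily processing capacity) driven by a
    random input (daily arrivals): the ending backlog cannot be accurately
    anticipated, only characterized probabilistically."""
    rng = random.Random(seed)
    backlog = 0
    for _ in range(days):
        arrivals = rng.randint(60, 140)                   # the stochastic input
        backlog = max(0, backlog + arrivals - capacity)   # the known rule
    return backlog

# Identical rules and constraints, different random inputs, different outcomes.
print([simulate_backlog() for _ in range(5)])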

If a substantial number of events are random, can we argue that public leadership simply cannot manage to minimize damage to populations? Perhaps, but such an argument is pointless and fails the test of political legitimacy. Rather, similar to the

Heisenberg principle and the Hawthorne studies, let us assume we affect any system merely through observation and study. If so, the field of public administration may develop useful knowledge and modify praxis through acknowledgement of and investigation around these random effects.

Eagle (2005) addresses what he considers an unjust neglect in the study of the concept of randomness, which he treats as a special case of a process’s unpredictability.

This randomness is not mere chance, and a better understanding of the concept as unpredictability makes it useful in discriminating between various theories. Randomness

is a key concept in the sciences, and for any dynamic process that is modeled with a random

component. Human behavior, and by extension human organizational

systems, are also usefully modeled as random processes (Eagle, 752). Limitations on predictability can be categorized as epistemic, computational, or pragmatic. These limitations prevent the correct recognition of the state or situation under consideration and may result in inappropriate response. Reducing these limitations, therefore, should increase the correct identification of conditions and diminish unpredictable, random, chaotic, and stochastic events with the concomitant risks associated with operating under mistaken assumptions.

A random effect does not map uniformly across all units of analysis. Consider

marine accidents. These rare events were investigated in Perrow (1984) and are of

interest in any question of when and why insurance is purchased in case of loss. Any

particular ship captain will most probably experience zero accidents. The sea as a whole

will most likely experience some loss each year. In these cases, external requirements for

insurance are the most reliable method to avoid uninsured losses. Goulielmos (2004)

tested smaller areas, and found that a non-random number of ships were lost per area

(using areas much smaller than the ocean as a whole) while the number of ships lost per

month remained random, using a new statistical measure for nonlinear dependence.
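A small simulation illustrates how a random effect maps unevenly across units of analysis. The fleet size and per-ship loss rate below are invented for illustration, not drawn from Goulielmos’s data: the fleet reliably loses a few ships each year even though any individual captain expects essentially none.

```python
import random

rng = random.Random(1889)
ships, years, p_loss = 5000, 10, 0.0004   # invented fleet size and annual per-ship loss rate

for year in range(1, years + 1):
    # Each ship independently suffers a loss with small probability p_loss.
    fleet_losses = sum(rng.random() < p_loss for _ in range(ships))
    print(f"year {year}: fleet losses = {fleet_losses}")

# A single captain's expected losses over the same period: effectively zero.
print("expected losses per ship over", years, "years:", p_loss * years)
```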

In its data-driven effort to increase surety of security through complete

knowledge, the US government’s main interest is to reduce unpredictability. But the act of measuring creates system changes, and the measurements themselves introduce process errors. And in developing a computational model of an open system, only a subset and not the full set of measurable inputs can be included, thereby introducing more uncertainty in

the process outputs of the models, including whether the unpredictable model states

reasonably resemble in nature or number unpredictable or stochastic states in experiential

conditions. Since many governance functions revolve around response to uncertain and

unfavorable events, any irreducibility in randomness indicates bona fide risks to

successful operation or indeed viability. To what degree do existing theory and praxis

offer a protection against randomness?

Existing Theory Offers Incomplete Response to Risks of Random Events

Current public administration theory streams propose governments operate on

specific organizational principles, and determine courses of action using specific

decision-making models. In this section several principles and models are considered for

their effects under randomness.

Public managers must facilitate equitable collective social behavior despite a

rising number of unsteady, unstable states. Public administrators are expected to operate

successfully regardless of exposure to a range of outside mega-forces: geopolitical (wars,

annexations), religious (fundamentalism), economic (the ascendancy of capitalism and

globalization), and physical (climate variation, manmade or not). These external forces

result in internal financial, institutional, and political processes that disrupt the “normal”

flows of government performance. Public administration is not alone in facing these

forces, but may be later than other fields in addressing them beyond a peripheral

recognition. The immensity of these forces is illustrated by concerns raised in other fields. The question of sustainability underlying research initiatives across a diverse range of physical and social sciences serves as acknowledgment of the serious risks facing the general population if these forces are ignored. Indeed, administrative theorizing about an unstable and random world, and about the unsteady circumstances in which administrators may operate in a non-deterministic time frame, is of pre-eminent public interest.

Existing theory and practice in the field of public administration “resolves”

questions of random introduction of new issues by accruing additional specializations

such as emergency management, homeland security, leadership, and diversity, and by

developing models of new stabilities, most recently networks of governance. As an

example, these stable networks supplement existing models of traditional local and

national units, which become placeholders in the study of how managers deal with

contracts for services in a hollowed state. A displacement strategy fits well with an

accretive or conglomerate model of the field, as faculty can be added to teach courses in

the new core public administration concepts while less significant faculty and practitioner skills are phased out. Occasionally the field’s programs or entire schools have been renamed or split off to reflect shifting interests and knowledge gaps, from public administration to public affairs to public policy and public management. Perhaps unsurprisingly, even the terms “public” and “administration” vanished in the case of the

University of Southern California’s School of Public Administration merger with its

School of Urban Planning and Development to become the School of Policy, Planning and Development.

Policy schools provide an interesting case of the consequences associated with a determination that essential knowledge cannot be gained through a cursory review of material, the type of review that is ubiquitous in a public management curriculum designed to offer students a limited introduction to the deep bodies of knowledge that underpin collective action. A rationalist perspective infers that more complete knowledge of a policy’s design and potential effects could result in better decisions and improved results. In order to develop the technological/analytical capacities needed for this model of the policy process, students take additional courses in quantitative and evaluation techniques; rather than graduating generalists targeting

positions as public managers, these programs create policy analysts who might become

technical specialists or consultants to governments.

Governance derives most fundamentally from bureaucratic organization, allowing

for predictable collective action. Merton (1957) describes bureaucracy as clearly defined

patterns of activity in formal, rationally ordered social structures overseen by

administrative, positional authority, defined as “the power of control which derives from

an acknowledged status.” (Merton in Shafritz et al., pg. 103.) The negative effects of

bureaucracy develop from three personality dislocations experienced by the trained,

salaried experts who operate under strict categorization: Veblen’s “trained incapacity,”

Dewey’s “occupational psychosis,” and Warnotte’s “professional deformation.” These

human personality responses to the demands of working in bureaucracy result in

“inappropriate responses under changed conditions.” (his italics, pg. 105.) Under the

limitations of human personality, the unsurprising response of government

bureaucracies to any unexpected and randomly timed situation would itself be random

and unclear, and success in no measure certain.

In a society where administrative controls are few and far away, say, a rural office

hundreds of miles from the state capital, a bureaucratic workforce has long enjoyed a natural barrier to tightly coupled control systems and should have experienced fewer

incidents of serious personality or behavioral disturbances that might be related to the bureaucratic model. However, under a new technological regime where every phone call can be pre-scripted and recorded, and every keystroke cataloged, these remote workers may fall prey to the same bureaucratic disorders. To the general public, this situation may be recognized or characterized as “unresponsive” behavior. It is in fact highly responsive; however, it is responding to the demands of the permanent bureaucratic control records rather than to the ephemeral citizen who perhaps will hang up the phone or exit the office in short order.

The most prevalent models of decision-making do not fare much better when considering the frequency and inevitability of random events on a systemic reference scale. The most commonly agreed-upon and applied models, Herbert Simon’s bounded-rationality satisficing and Charles Lindblom’s incrementalism, operate under different assumptions, but each produces sub-optimal results. Using these two models, the preferred outcome of an inappropriate (“wrong”) decision or no decision would be a small adverse impact. Indeed, smaller decisions with smaller impacts lie at the heart of the logic supporting incrementalism. But for certain decisions, in an environment of tight coupling, adverse effects may well be unacceptable, ranging across a continuum from small failures to catastrophic ones. A small failure might be the 2001 California energy market collapse, with an estimated cost of $100 billion paid by taxpayers or ratepayers over an extended period of 10 to 20 years. The largest involve catastrophic losses, such as nuclear technology failure, unanticipated escalation of small wars into global events, or biological annihilation through accidental or deliberate acts.
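A toy comparison suggests why average impact alone misleads here. In the sketch below, both regimes are assumed to have roughly the same expected annual loss; the probabilities and dollar figures are invented, but the tightly coupled regime concentrates its loss in rare, catastrophic years.

```python
import random

rng = random.Random(7)

def incremental_year():
    # Many small, loosely coupled decisions: frequent but bounded losses ($1M each).
    return sum(rng.random() < 0.2 for _ in range(50)) * 1.0

def tightly_coupled_year():
    # One rare, tightly coupled failure mode: a $100B event (cf. the California collapse).
    return 100_000.0 if rng.random() < 0.0001 else 0.0

years = 200_000
inc = [incremental_year() for _ in range(years)]
tc = [tightly_coupled_year() for _ in range(years)]
for name, losses in [("incremental", inc), ("tightly coupled", tc)]:
    # Comparable means, radically different worst cases.
    print(f"{name}: mean ${sum(losses)/years:.1f}M/yr, worst year ${max(losses):,.0f}M")
```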

Comprehensive Data Collection as a Method to Address Stochastic Risk

In this setting, seeking complete knowledge as a guard against missteps seems an

obvious strategy. If analyzing more data leads to greater recognition of risks, perhaps it can

prevent catastrophic outcomes. This section of the paper describes how this data is captured and catalogued. A conundrum regarding unique identifiers provides a logistical

hurdle. Popular antipathy toward some law enforcement uses of data, and concerns about data security and privacy, provide a political hurdle.

Data capture of all information on all persons follows a straightforward method.

The essence of the idea is quite simple: as electronic data streams through a worldwide

network of wires, each bit can be copied as it goes by a collection point. As the data is

collected, or at any later point, computer programs analyze the data and present summary

or detailed information for analyst attention, to determine whether machine-targeted patterns warrant specific attention or intervention. A model of this collection is

provided in Diagram 1.

[Diagram 1 depicts multiple-sourced data feeds flowing into a single pool of “total” information.]

Diagram 1: Sensemaking through Comprehensive Information Assessment

This strategy involves data mining, a technological capacity criticized for ethical

concerns by privacy advocacy groups such as the Electronic Frontier Foundation and the

American Civil Liberties Union. More pragmatic concerns also exist; tests of data will

certainly produce spurious rejections of the null hypothesis in instances where a number of hypotheses are tested by a number of analysts, regardless of coordination. Large data sets contain patterns, so the method also invites participant bias (Denton, 1985; Hand, 1998).

Statisticians have developed strategies to address data cleansing (Hernandez and Stolfo,

1998) and use of random sampling within large data sets (Owen, 2003); the former

method requires several passes through the data and the latter method does not

interrogate the full data set. Philosophical questions have been shelved in the main, and

the commercial world and governments alike develop ever larger and more

sophisticated data mining applications to manipulate these astronomically large data

structures, often using visualization techniques that improve comprehension of large data

sets (Tufte, 2001; Rogowitz and Treinish, n.d.).
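The multiple-testing concern is easy to demonstrate. The sketch below, with all parameters illustrative, runs many crude two-sample z-tests on data containing no real effects; at a 0.05 threshold, roughly five percent of the hypotheses nonetheless appear “significant.”

```python
import random
import statistics

rng = random.Random(0)
n_hypotheses, n = 200, 50
false_positives = 0

for _ in range(n_hypotheses):
    # Two samples drawn from the same distribution: no real effect exists.
    a = [rng.gauss(0, 1) for _ in range(n)]
    b = [rng.gauss(0, 1) for _ in range(n)]
    # Crude z-test on the difference of means.
    se = (statistics.pvariance(a) / n + statistics.pvariance(b) / n) ** 0.5
    z = (statistics.mean(a) - statistics.mean(b)) / se
    if abs(z) > 1.96:          # "significant" at alpha = 0.05, two-sided
        false_positives += 1

print(false_positives, "spurious findings out of", n_hypotheses)  # ~10 expected
```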

After passage of the USA PATRIOT Act in 2001, sustained funding and

administrative support flowed into data collection and mining. The “Total Information

Awareness” program of John Poindexter was closed under public pressure in 2003, shortly after it was announced. Its initiatives have in the main continued, and new databases have been developed and are growing steadily. (Electronic Privacy Information

Center, 2005; American Civil Liberties Union, 2004) As one example, the National

Security Agency requested access to AT&T telecom switches so traffic could be subjected to electronic capture and analysis. According to published reports, some portion of international call traffic was diverted and routed through two large switches situated in the US to increase access to objects of surveillance.

In the case of Homeland Security analysis of large data sets, any real-time

processes preclude data cleansing. Security concerns also likely limit random selection

of reduced data sets for pattern analysis, out of concern that seemingly irrelevant data contains

signals of impending terrorist activity (that is, we must first find the dots, then connect

them).

Matching specific individuals to each message or data record is an ongoing

challenge. This requires fixed and unchanging identifiers for each person to facilitate

cross-indexing and searches. Most US citizens can be associated with a specific 9-digit

number issued by the Social Security Administration, initially designed for use in

collecting payroll taxes and determining retirement benefits. Others with certain

employment rights may apply for a unique tax identification number. Yet the same US

government acknowledges and tacitly facilitates the widespread use of and dependence

upon a substantial underclass of foreign-born labor who have entered the country with no official authorization to live or work in the US. Regarding this community, the California

journalist and author Peter Schrag has asked, “Is the border still a line, or is it now a

region?” (Schrag, 2007) Once this population moves through the border region and starts

working, two strategies predominate. The worker and employer can jointly agree to an

unreported labor agreement, where the employer pays no payroll or other taxes, or the

worker and employer can use (and reuse) a valid Social Security Administration number

(SSN) that uniquely belongs to another person. Reports on reuse suggest many numbers are used by dozens or hundreds of persons; retired workers who have returned to their home

countries participate in a market where they trade use of their identifier for a cash rent.

These behaviors increase false matches and complicate the search through data for

patterns of interest to law enforcement and security agencies.
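A simulation sketch of identifier reuse, with all rates and counts invented, shows how a modest amount of number-renting produces identifiers that match many distinct persons and thereby pollutes record linkage.

```python
import random
from collections import defaultdict

rng = random.Random(42)
people = 90_000
ssn_of = {i: f"{i:09d}" for i in range(people)}      # one legitimate number per person
rented = rng.sample(sorted(ssn_of.values()), 300)    # numbers rented out for reuse

usage = defaultdict(set)                             # SSN -> distinct persons under it
for person in range(people):
    if rng.random() < 0.05:                          # assume 5% work under a borrowed number
        usage[rng.choice(rented)].add(person)
    else:
        usage[ssn_of[person]].add(person)

shared = [s for s, u in usage.items() if len(u) > 1]
print(f"{len(shared)} identifiers match multiple persons;",
      f"heaviest reuse: {max(len(u) for u in usage.values())} persons on one number")
```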

The undocumented or mis-documented workforce is desirable in no small

measure due to their status: without legitimacy, these individuals have few bargaining

rights, are unlikely to report workplace violations related to pay and safety, and are unlikely

to receive the full range of workers’ benefits that otherwise make the US workforce one

of the most highly compensated in the world. For millions operating on falsified

documents or no documents at all, the types of entries generated in the databases create

persistent problems in data mining applications.

With more constraints on using the SSN as an identifier in non-official

applications, a collection of secondary identifiers is commonly used as part of the

process for establishing connections between databases that store records about individuals. These

include employer- or school-assigned identifiers, and the unique number on a person’s state-issued driver’s license. To reduce forgery and increase security in the issuance of

driver’s licenses, the Real ID Act (H.R. 418) was enacted in 2005. It requires states to change processes around the granting and manufacturing of official driver’s licenses and

state identification cards. The states are balking at the requirements and a number of

legislatures have proposed refusing to comply.

Evaluating the Explanatory Power of Different Assumptions: Reflecting on Cases

The quality of a theory rests, in part, on its ability to accurately predict outcomes

and completely explain why certain outcomes are more likely, especially in an applied

field such as public administration. The application of theory in administrative praxis

results in acts of power that can help or harm large numbers of persons, indeed, can result

in the prosperity or deaths of millions. Given the serious outcomes of flawed theoretical

constructs, there should be no limit to the number of ideas considered and evaluated.

Yet, how can ideas be tested, especially when the field encompasses several research

paradigms that do not have equal credence among all? One method is to consider a

collection of cases. Three are considered here as to whether possible changes to

assumptions based on recognition of stochastic events and ephemeral governance provide

better explanation of outcomes. This consideration results in the development of more

equitable, favorable options that might improve outcomes. Many cases, in the US and

internationally, exemplify the limits of the traditional constructs and suggest the value of

revising the field to establish new emphases. Three well-documented US cases are

selected here as illustrations: Hurricane Katrina in 2005, the NASA loss of Challenger in

1986 (followed by the loss of Columbia in 2003); and the 1889 Johnstown Flood. These

cases span the 19th, 20th, and 21st centuries of American governance, and showcase the strength of ephemeral governance theory against situations and traits both stable and unstable. New tools and technology may offer the possibility to minimize government missteps and reduce the intensity and duration of their impacts on the public, but this may require changes in the way administrations are structured and managed. Because the NASA and Katrina cases are more recent and have been widely discussed, the bulk of the description here is

of the earliest event.

The Johnstown Flood of 1889 was the worst civil disaster the United States had

yet suffered, with over 2,200 lives lost (Frank, 1988, 63). Publicly commissioned and constructed, the dam was only briefly operational as originally intended before being

turned over to private interests. Built to support 19th-century river canal traffic in an area

Horiuchi: Stochastic Process 14 with intermittent waterways, the earth and rock dam’s construction was delayed several as other federal projects took priority in funding. By the time it was finished in

1852, railroad tracks had been laid to support commercial interests in the area, ending the public utility of the dam, which was drained in 1855 to repair leaks before being sold to the railroad in 1857. Subject to benign neglect by its new owners, the dam partially washed away five years later in 1862, with little damage downstream, emptying most of the man-made lake. The railroad sold the non-operational dam in turn in 1879 to private interests, industrialists from nearby Pittsburgh, who rebuilt the dam to better serve their purposes in developing the South Fork Hunting and Fishing Club.

On Nov 15, 1879, the charter was approved and signed in the Court of Common Pleas in Allegheny County by Judge Edwin H. Stowe, who for some unknown reason ignored the provision in the law which called for the registration of a charter in the ‘office for recording in and for the county where the chief operations are to be carried on.’ Nor did the sportsmen make any effort to conform to the law. Perhaps it seemed a minor point and was overlooked by mistake. In any case, the charter was secured without the knowledge of the authorities in Cambria County, and there would be speculation for years to come as to what might have happened right then and there had they and Judge Stowe gone about their business in strict accordance with the rules. McCullough (1968, 49)

Capitalizing on an enlarged lake, members built a lodge and private residences for their recreational use. Dam design modifications included abandoning the original discharge system, a stone culvert at the base with five sets of cast iron pipes and discharge valves. Instead, water filled the reservoir and exited near the dam via a spillway originally designed only to handle periodic extraordinary flow. A set of iron screens was placed along the support for a bridge built to cross the formerly dry, now flowing spillway, to prevent fish from exiting the lake via the spillway. The original dam was further modified, reduced in height by two feet, in order to widen the road across its top

to support two-way traffic. Fundamental to understanding the contribution of governance

failure in these private-party modifications is the absence of communication between

Allegheny County, where the industrialists chartered their club, and Cambria County,

where the club was sited. The dam reconstruction was overseen by a club employee with

experience building railroad embankments. Because the discharge valves had been

removed, there was no means to release water and perform preventative maintenance or

repair small leaks that appeared. Nor was preventative maintenance performed on the top of the dam, where settling of earth at the center required rebuilding to the original level.

Limited public oversight of this private project resulted in only a few persons with both

engineering knowledge and knowledge of the dam’s reconstruction and operation

understanding the potential for catastrophic failure. Eight years after the dam was rebuilt,

it collapsed following a major rainstorm on May 31, 1889.

Failure of an O-ring sealing segments of a booster rocket caused the explosion

that destroyed the space shuttle Challenger on January 28, 1986. However,

organizational culture also contributed. (Presidential Commission on the Space Shuttle

Challenger Accident, 1986; Vaughan, 1990) Vaughan used the term “normalization of

deviance” to describe a culture where management and regulators compromised their

own processes and controls, creating unreasonable expectations of success and

minimizing open discussion of problems between the government and its contractor,

Morton Thiokol.

Katrina, one of the strongest storms to impact a US coastal zone in the past 100

years, made landfall as a Category 3 hurricane on August 29, 2005. Government services

failed at all levels. Infrastructure weaknesses included telecommunications equipment

located below sea level, which flooded when the levees failed, worsening official communication problems. The problems of governance during Katrina were neither rare nor specific to that event. The public sector response to Hurricane Katrina’s landfall illustrated some of the same problems in bureaucratic governance as seen in the Indian Ocean earthquake and tsunami event less than a year earlier. (Takeda and Helms, 2006)

Presenting a similar if not quite so likely risk, major levee failure in California’s central valley could damage one of the world’s most productive agricultural regions and stop the transfer of water to the populous south. One third of southern California’s fresh water is transported from the north via the Central Valley Project, and a substantial number of these levees are under private ownership, managed for agricultural support. A

2005 report estimated the economic damage in the range of 30 to 40 billion dollars.

In each instance existing governance failed, according to generally accepted measures of expected performance based upon the degree of public investment in the organizational systems. Responses often involved new interlocking structures and new data measurements and collections. If the problem is that we are not connecting the dots, then we need more dots, leading us back to the concept of total information.

                          Johnstown    Hurricane    Challenger/
                          Flood        Katrina      Columbia
Weather                       X            X            X
Rule “bending”                X            X            X
Infrequency                   X            X            X
Groupthink                    X            X            X
Dependence on technologies    X            X            X

Table 1: Shared explanations for state and institutional failures

Note a pre-eminent role of weather as a factor in three of these cases, which then becomes a favored explanation for those in political authority. However, the weather per

se did not create the failures. In Johnstown, the dam broke in large measure because it

had been re-engineered without discharge valves at the base and with iron grates at the

top so the private reservoir operated in closer alignment with the interests of the sport

fishermen. In Hurricane Katrina, the levees were rated for a Category 3 storm, but two

failed, and there were no secondary water-retention structures. In the Challenger disaster,

the Thiokol engineers were unsuccessful in convincing NASA managers that the morning temperature was dangerously cold, based on the effects of blow-by in earlier flights.

Restructuring Assumptions to Address Randomness

This section covers two points of discussion related to revisions of public

administration theory and praxis. First, I reflect on small-d democracy, then alternative

assumptions that address randomness are suggested. The impact of these reflections on

democratic action closes the section.

Implicit in democratic activity is the concept of individual, self-directed activity.

In a sense, one might hope that activities appear inherently random through self-direction,

and thus this concern ties closely to discussions regarding any right to privacy in the US. If

the government (or banks, or any commercial firm, for that matter) seeks to capture

transactions at an individual level and to apply pattern seeking in order to successfully

target its interventions, it suggests that individual actions are probabilistically determined.

US history, for instance the Populist movement of the 19th century, indicates that

unexpected activity by individuals and groups can be unfavorably received by the

politically powerful institutions.

At the height of the Populist movement, the party experienced a wave of political successes (Schattschneider, 1975). In the election of 1890, eight state legislatures went

Horiuchi: Stochastic Process 18 Populist. In response, conservatives of both parties in the 1896 election acted to

destabilize the Populist base (pg. 76), forming blocs of Northern business Republicans and Southern conservative Democrats (the “solid South”). This split the agrarian radicals

in the south and the west, and crushed the Populist movement. In short, “one party

politics tends strongly to vest political power in the hands of people who already have

economic power,” (pg. 78) reducing the impact of democratic movements. Even when

some democratic processes are repaired, signals of continued disempowerment may persist.

For example, after the Voting Rights Act (similarly the subsequent granting of the vote to

citizens between the ages of 18 and 21), a voluntary reluctance to vote (pg. 95) appeared

and persists. When many do not vote on a voluntary basis, political platforms can pay lip

service to the role of the public and the power of democracy, with little fear that the

people will actually show up. Schattschneider suggests this derives from the thin veneer of democracy on institutional structures that are in reality anti-democratic (pg. 100). Thus the struggle for democracy is still going on, merely shifted from the right to vote to the ability to form political organizations of real power. “Nowadays the fight for democracy takes the form of a struggle over theories of organization, over the right to organize and the rights of political organizations, i.e., about the kinds of things that make the vote valuable.” (pg. 100) Similarly, following the disputed 2000 presidential race, the Help

America Vote Act of 2002 (HAVA) was enacted. In the first few elections, the effect of HAVA appears to be not improved exercise of voting rights and political power; rather, the nation has seen hundreds of millions of dollars awarded to a small handful of producers of voting technologies. This misstep is made obvious as states begin to roll back from fully electronic machinery to more comprehensible paper systems. Similarly,

health care reform does not seek to remove the entire machinery of insurance companies

and billings (clearly not a zero-cost operation) even if the nation goes to universal care.

The electronic patient record, another massive data collection system that will be linked

to existing structures, becomes the new marker for cost savings and mistake reduction.

Governance structures at random intervals experience instabilities of random duration. The Populist movement can be characterized as an instability of the entrenched

system; changes in the demographics of voters can be viewed similarly. Events can be minor (storms, protests, parades) or major (war, disease, climate change).

They can be highly localized to a single site, or involve numerous governance units, as in a multi-state or international power outage. The degree to which any government continues to operate as expected varies. Some events, such as changes in the party in power, might have either minor or major effects. Table 2 offers assumptions that might be revised to better address the questions raised in desiring to maintain good governance under randomness.

Alternative assumptions are offered, along with some ideas about how these alternative assumptions might change public administration research and professional training.

Classic Assumption             Revision                          Implications for Praxis
Governing structures are       Instabilities are frequent        Bargain with non-states,
stable                         if not predictable                unauthorized parties
Institutions define rights     Conditions of the moment          Redesign institutions for
                               create rights; technologies       loose coupling option;
                               constrain rights                  alternative educational
                                                                 strategies
Incrementalism works           Incrementalism unnecessarily      Insurance? Alternative
                               risks catastrophe                 decision models
Government equalizes           Government stratifies,            Incentives for ethical
                               earmarks, redistributes           administration
Data are comprehensible        Data are incomplete, obscured,    Educational and coping
                               or suggest contradictory          strategies
                               explanations

Table 2: Revising basic assumptions for public administration

Diagram 2 offers a revision to Diagram 1 that suggests an alternative to the “total” information collection and retention model. It involves more probabilistic analysis, which might be either quantitative or qualitative in nature. By considering an option for non-quantitative, reflective, or discourse-oriented considerations of probable outcomes, an opening is created for more democratic action. It allows variation in approaches to the

study of randomness across the continuum of approaches to the study of public

administration identified by Raadschelders (2005), from “scientific knowledge” to interpretive

postmodern expressions.

[Diagram 2 depicts a funnel: “total” information is narrowed first to perceptually relevant, knowable information, and then to multiple potential outcome scenarios, each with a calculable probability.]

Diagram 2: Sensemaking through Scenario and Probability Development
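In code, the move from “total” information to scenario-and-probability sensemaking resembles Monte Carlo estimation over a small model. The sketch below invents a levee-failure scenario with illustrative probabilities and costs; nothing here reflects the 2005 report’s actual figures.

```python
import random

def levee_damage(rng):
    """One scenario draw: does a severe storm arrive, and do levee segments hold?
    All probabilities and costs are invented for illustration."""
    storm = rng.random() < 0.10                               # severe storm this year
    if not storm:
        return 0.0
    breaches = sum(rng.random() < 0.02 for _ in range(100))   # 100 levee segments
    return breaches * 0.4                                     # $0.4B per breached segment

rng = random.Random(2007)
draws = [levee_damage(rng) for _ in range(100_000)]
p_any = sum(d > 0 for d in draws) / len(draws)
expected = sum(draws) / len(draws)
# Each draw is one outcome scenario; together they yield calculable probabilities.
print(f"P(any damage in a year) = {p_any:.3f}; expected annual cost = ${expected:.2f}B")
```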

Concluding Remarks

Were public administration schools to acquiesce to a data-driven model of

governance, they could mirror a strategy adopted by some business schools: more

courses in technical data management (a specialization in confidential data mining,

anyone?) supplemented by an ethics course to remind students that governance includes a

moral dimension. It is unclear, however, whether this approach has improved the quality

and success of business enterprises. Indeed, developing operational generalists who can recognize and communicate across specializations is offered in Raadschelders (2005) as

one useful bridge to the field’s divergent approaches to characterizing core knowledge

and functions.

References

American Civil Liberties Union. (2004). “Total Information Awareness.” Accessed February 20, 2007 at http://www.aclu.org/privacy/spying/14956res20040116.html

Denton, Frank T. (1985). “Data Mining as an Industry.” Review of Economics and Statistics, 67:1 (February), 124-127.

Eagle, Antony (2005). “Randomness Is Unpredictability.” British Journal for the Philosophy of Science, 56:4 (December), 749-790.

Electronic Privacy Information Center (2005) “Total ‘Terrorism’ Information Awareness.” Accessed February 20, 2007 at http://www.epic.org/privacy/profiling/tia/default.html

Frank, Walter (1988). “The Cause of the Johnstown Flood.” Civil Engineering, 58:5 (May), 63-66.

Goulielmos, Alexandros M. (2004) “A treatise of randomness tested also in marine accidents.” Disaster Prevention and Management. 13:3; 208-217.

Hand, David J. (1998). “Data Mining: Statistics and More?” The American Statistician, 52:2 (May), 112-118.

Hernandez, Mauricio A. and Stolfo, Salvatore J. (1998) “Real-world Data is Dirty: Data Cleansing and The Merge/Purge Problem.” Data Mining and Knowledge Discovery. 2:1, 9-37.

Kiefer, John J. and Montjoy, Robert S. (2006). “Incrementalism before the Storm: Network Performance for the Evacuation of New Orleans.” Public Administration Review, 66 (December), 122-130.

McCullough, David (1968). The Johnstown Flood. New York: Simon & Schuster Paperbacks.

Merton, Robert K. (1957). “Bureaucratic Structure and Personality” in Shafritz, Jay M., Ott, Steven J. and Yong Suk Jang, Classics of Organization Theory. 6th edition. Belmont, CA: Thomson Wadsworth, 2005.

Norman, Geoffrey R. and David L. Streiner (1999). PDQ Statistics, 2nd Edition. Lewiston, NY: BC Decker.

Owen, Art (2003). “Data Squashing by Empirical Likelihood.” Data Mining and Knowledge Discovery. 7:1, 101-113.

Perrow, C. (1984). Normal accidents: Living with high-risk technologies. New York: Basic Books.

Palmerston, Lord Henry Temple. 1848. Cited in “Morality: Does national interest always come first in Foreign Affairs?” Accessed January 25, 2007 at http://news.bbc.co.uk/hi/english/static/in_depth/uk_politics/2001/open_politics/ foreign_policy/morality.stm

Presidential Commission on the Space Shuttle Challenger Accident. (1986). Report of the Presidential Commission on the Space Shuttle Challenger Accident (Rogers Commission Report). Accessed February 20, 2007 at http://science.ksc.nasa.gov/shuttle/missions/ 51-l/docs/rogers-commission/table-of-contents.html

Raadschelders, Jos C.N. (2005). “Government and Public Administration: Challenges to and need for Connecting Knowledge.” Administrative Theory & Praxis, 27: 4 (December); 602-627.

Reeves, Jay (2007) “VA Notifying 1.8M of Missing Data,” February 12. Accessed February 12, 2007 at http://apnews.myway.com/article/20070213/D8N8GIU00.html

Rogowitz, Bernice E. and Treinish, Lloyd A. (ND). “How NOT to Lie with Visualization.” IBM Thomas J. Watson Research Center. Accessed February 20, 2007 at http://www.research.ibm.com/dx/proceedings/pravda/truevis.htm

Schattschneider, E.E. (1975, 1960). The Semisovereign People: A Realist’s View of Democracy in America. Fort Worth: Harcourt Brace Jovanovich College Publishers.

Schrag, P. (2007). Lecture at California Association of State Counties. January 17.

Takeda, Margaret B. and Helms, Marilyn M. (2006). “‘Bureaucracy, meet catastrophe’: Analysis of Hurricane Katrina relief efforts and their implications for emergency response governance.” International Journal of Public Sector Management, 19:4, 397-411.

Taleb, N. N. (2004). Fooled by randomness: The role of chance in life and the markets, 2nd ed. New York: Texere.

Tufte, Edward (2001). The Visual Display of Quantitative Information, 2nd Edition. Cheshire, CT: Graphics Press.

Vaughan, Diane (1990). “Autonomy, Interdependence, and Social Control: NASA and the Space Shuttle Challenger.” Administrative Science Quarterly, 35:2; 225-257.
