IS IT WORKING?

NARRATIVE PERSPECTIVES ON PERFORMANCE-BASED FUNDING

POLICIES IN PUBLIC HIGHER EDUCATION

by

James Capp

A Dissertation Submitted to the Faculty of

the College for Design and Social Inquiry

in Partial Fulfillment of the Requirements for the Degree of

Doctor of Philosophy

Florida Atlantic University

Boca Raton, FL

August 2019

Copyright 2019 by James Capp


ACKNOWLEDGEMENTS

The unrelenting support of my loved ones bears most of the credit for this work.

Years ago, over some pints on Atlantic Avenue, I shook Cara Jean Capp’s hand as a sign of our commitment to embark on this endeavor. My doctoral studies left her carrying the brunt of parenthood and marriage. Running on intermittent sleep, she often left the house before sunrise to meet in the Everglades with presidents, senators, and governors. Yet she read the awful drafts and offered me honest guidance. She kept the coffee on the stove when the nights were long. We survived loss, and miracles, and broken bones, and a flood. All the while, she was my rock. Thank you for your strength and grace, Cara.

I also owe my family immense gratitude. The Piccirillos have loved me as their own, offering me stability when I needed it most. The Capps have muddled through some of life’s most trying times and come out more resilient. Most of all, I’ve got three bona fide blessings in Virginia Ellen, Nora Elizabeth, and the new soul I’m very much looking forward to meeting next spring. At night when I finished writing or came home late from class, I stopped at your beds to thank God for each of you. I deeply appreciate the opportunity to love you and to be your father. There is no greater joy in my life.

Lastly, thanks to those who listened to my rants about public policy and good governance, including my dissertation committee, the amigos, the Provost’s Office, and friends who have shared drinks, meals, and coffee with me. There are too many of you to name. I appreciate the sanity you bring me and the decency you bring to our world.


ABSTRACT

Author: James Capp

Title: Is it Working? Narrative Perspectives on Performance-based Funding Policies in Public Higher Education

Institution: Florida Atlantic University

Dissertation Advisor: Dr. Alka Sapat

Degree: Doctor of Philosophy

Year: 2019

Public higher education increasingly relies on performance-based funding (PBF) policies to enhance accountability. These policies attempt to steer institutions towards successful outcomes via performance indicators, such as graduation rates. Nationally, PBF policies continue to grow in popularity despite limited evidence that they are effective (Hillman, Tandberg, and Gross, 2014).

Motivated by the apparent conflict between the widespread adoption of PBF policies and the lack of evidence that they actually improve outcomes in higher education, this dissertation investigates the perceived impacts of PBF policies. Florida’s public university system serves as the setting for the study due to its uniquely punitive PBF policy design and the model’s non-standardized performance indicators.

The dissertation assumes the theoretical stance of Stone (1989), in which stories define problems and drive the policy process. Additionally, the dissertation uses the narrative policy framework (NPF) for its qualitative analysis, drawing on the coding framework of Shanahan et al. (2013), including narrative elements and strategies. The research questions focus on 1) understanding stakeholders and their views of PBF policies, 2) exploring the stakeholders’ perceptions regarding the impacts of such policies, and 3) articulating the overarching narratives and values of stakeholders.

Public documents reveal the official narrative perspectives among Florida’s Board of Governors (BOG) system and the individual universities. The dissertation finds disparate views on the impact of the state’s PBF policy. The BOG system presents Florida’s policy makers as businesslike heroes who delivered a novel PBF policy to guide strategic enhancements. In contrast, universities set priorities according to historic missions but increasingly note their own compliance with the policy. Furthermore, infrequent but compelling uses of villains and victims in narratives highlight deep-rooted differences in perspectives between policy makers and implementers.

Methodologically, the analysis also suggests that NPF’s structural archetypes might assume excessive conflict, noting how a more interpretivist narrative analysis could result in less adversarial findings. In practice, PBF policies could benefit from recommendations to increase the focus on institutional missions, to formalize a forum for critiquing and enhancing the model, and to refine the policy’s focus on social mobility – all for the betterment of students and the public.


For Mom

TABLE OF CONTENTS

LIST OF TABLES ...... xiii

LIST OF FIGURES ...... xiv

CHAPTER 1. INTRODUCTION ...... 1

Research Questions ...... 2

Performance-based Funding ...... 4

Narrative Analysis ...... 7

Significance, Implications and Dissertation Roadmap ...... 9

CHAPTER 2. LITERATURE REVIEW ...... 11

Traditions of Accountability in Public Administration ...... 12

Narrative Policy Framework ...... 16

NPF’s Structuralist Approach ...... 22

National Trends in Performance-based Funding ...... 27

Bodies of Literature ...... 35

CHAPTER 3. POLICY CONTEXT ...... 36


Florida’s Budgetary Environment for Public Universities ...... 40

Performance-based Funding in Florida’s Public University System ...... 42

Benchmarking Improvement versus Excellence ...... 45

Linking Scores to Funding ...... 48

Technical Review of Florida’s Metrics...... 50

Metric #1 – Percent of bachelor's graduates employed and/or continuing education in the United States...... 51

Metric #2 – Median wages of bachelor’s graduates employed full-time...... 53

Metric #3 – Cost per undergraduate degree...... 54

Metric #4 – First-time-in-college student graduation rate...... 56

Metric #5 – Academic progress rate (second year retention with a grade point average above 2.0)...... 58

Metric #6 – Percent of bachelor’s degrees awarded in areas of strategic emphasis...... 58

Metric #7 – University access rate (percent of undergraduates who are eligible to receive a Pell grant)...... 60

Metric #8 – Percent of graduate degrees awarded in areas of strategic emphasis...... 61

Metric #9 – Board of Governors’ choice metric...... 61

Metric #10 – Board of Trustees’ choice metric...... 63

The Evolving Policy Design for Florida’s Performance-based Funding ...... 64

CHAPTER 4. RESEARCH DESIGN AND METHODOLOGY ...... 67

Data Collection Process ...... 67

Work Plans ...... 70

Accountability Reports ...... 71

BOG System Documents, Media, and Reports ...... 72

Identified Policy Narratives: Elements, Codes, and Themes ...... 73

Qualitative Review...... 77

Limitations ...... 80

Coding Framework ...... 81

CHAPTER 5. ANALYSIS OF FLORIDA BOARD OF GOVERNORS SYSTEM ...... 88

Causal mechanism ...... 88

Hero...... 91

Victim ...... 95

Villain ...... 99

Evidence (setting) ...... 100

Moral of the story...... 102

Plot ...... 103

Statement of a problem ...... 105

Angel shift ...... 107

Containment ...... 108

Devil shift...... 109

Expansion ...... 110

Policy beliefs ...... 111

CHAPTER 6. ANALYSIS OF INDIVIDUAL UNIVERSITIES IN FLORIDA ...... 113

Causal mechanisms ...... 113

Hero...... 118

Victim ...... 123

Villain ...... 126

Evidence (setting) ...... 128

Moral of the story...... 131

Plot ...... 138

Statement of a problem ...... 140

Angel shift ...... 145

Containment ...... 148

Devil shift...... 151

Expansion ...... 153

Policy beliefs ...... 156

Overview of Analysis Results ...... 159

CHAPTER 7. DISCUSSION AND CONCLUSIONS ...... 160

Discussion ...... 160


Conclusions ...... 162

Research Question #1...... 162

Research Question #2...... 164

Research Question #3...... 167

Theoretical Implications ...... 168

Practical Implications...... 169

Methodological Implications ...... 173

Future Areas of Research ...... 176

Summary ...... 177

APPENDICES ...... 179

Appendix A. Preeminence Metrics and Benchmarks ...... 180

Appendix B. Policy Narrative Coding Sample of Work Plan ...... 182

REFERENCES ...... 189


LIST OF TABLES

Table 1. Policy narratives (Shanahan et al., 2013, p. 459) ...... 74

Table 2. Detailed coding framework as applied to the PBF policy analysis ...... 82

Table 3. Summary table of coding instances ...... 86

Table 4. Angel shift references to post-graduation outcomes ...... 146


LIST OF FIGURES

Figure 1. Historical developments in oversight of Florida’s public universities ...... 37

Figure 2. Performance Based Funding Model 2018-19 Benchmarks, excluding Metric 10 (Florida Board of Governors, 2018a) ...... 47

Figure 3. Summary table (Florida Board of Governors, 2018b) ...... 50

Figure 4. Frequency of coding instances ...... 87


CHAPTER 1. INTRODUCTION

In the US, performance-based funding (PBF) policies are increasingly popular financial mechanisms that attempt to steer state universities and colleges towards more successful outcomes. Currently, more than half of the states incorporate some form of PBF policy into the systems that oversee their public higher education institutions, and each year, more states consider proposals and adopt policies of their own (Davies, 2014).

Since 2013, the State University System of Florida has allocated most of its increases in general revenue appropriations using a ten-metric model (Florida Board of Governors, 2018a). Policies falling under this PBF category attempt to connect the budgeting of state allocations, to varying extents, to specific institutional accountability measures. In developing these measures, policymakers consider many types of outcomes, such as graduation rates, career placement rates, and affordability of educational costs (Burke, 1998; Rabovsky, 2014b). Likewise, the state of Florida incorporates these key performance indicators into its PBF policy, setting up a system to incentivize universities that enhance these numbers on an annual basis and to reward universities that reach certain numerical thresholds.

This argument for a more contextual study of PBF policies focuses less on improvements of quantifiable metrics and instead explores the interpreted consequences of PBF implementation, as evidenced in organizational narratives. The manner in which stakeholders interpret PBF policies reveals disconnects between the policy design and the strategic operations of public universities. Better understanding the contrasting interpretations of PBF policies may also showcase opportunities for enhancing the policy design. At the same time, a better understanding of the areas where stakeholders align in their interpretations may result in recommendations to reinforce and streamline what works well at universities.

This chapter first explicitly lays out research questions regarding system-level and organization-level perspectives, which serve as a grounding focal point for this dissertation’s overall plan of inquiry. Second, an introduction to the PBF literature emphasizes how researchers showcase historical shifts of policy focus from quality assurance to quantifiable measures of financial efficiency. Third, this chapter begins to formulate the theoretical approach that guides this dissertation, using the narrative policy framework (NPF) as both a theoretical and methodological foundation to support a qualitative study of PBF policies. Fourth, an overview of this dissertation’s specific approach to NPF methods explains the systematic process of pulling narrative data from public documents to conduct the analysis. Overall, this chapter and those that follow will engage the conversation regarding the perceived consequences of introducing PBF policies to higher education systems, filling an analytical gap in previous quantitative studies that fall short of holistically describing policy successes and failures.

Research Questions

In this dissertation, I strive to investigate the perceived benefits and possible detrimental impacts of PBF policies in state higher education systems. The following questions summarize the intent of this inquiry:


1. Who are the stakeholders involved in the formulation and implementation of PBF policies, and how do they view these types of accountability models in public higher education systems?

2. What do PBF policymakers within the university system and university-level administrators perceive to be the consequences of these efforts?

3. Do overarching narratives of the state university system policymakers and the university-level policy implementers reflect competing interests and values?

These questions focus on the involved stakeholders’ perceptions and interpretations of PBF policies as evidenced in organizational narratives, as well as how those narratives are articulated. In this case, stakeholders form two primary categories: state system policymakers, who are primarily governing board members and quite often political appointees, and university-level policy implementers, who are institutional leaders such as presidents, provosts, vice presidents, or even lower-level administrators, individuals working daily to implement PBF policies and enhance a university’s performance indicators. The constant underlying concern among both stakeholder groups should be whether the policies include the appropriate performance indicators and whether underlying measurement errors influence quantitative results. Analyzing these concerns is possible through a variety of mechanisms, but systematically studying organizational narratives may provide new and competing views on the effectiveness of PBF policies and the particular PBF model within a state.

The values of organizations are of immense importance to a qualitative inquiry that aims to consider the intended consequences of any policy; especially in the case of PBF policies, these values are brought to the forefront of the policy discourse. Those system-level policymakers who create the performance models are openly stating that they value certain outcomes over others, as measured by specific key performance indicators. By considering and investigating these research questions, we should be able to provide valuable insights regarding the ways in which, and the extent to which, organizations implement PBF policies.

Performance-based Funding

Grounded in the broader accountability literature of public administration and public higher education policy, the different methodologies that scholars have used to analyze the efficacy of PBF policies have produced results indicating that such policies have not typically been sustainably effective (Hillman, Tandberg, and Fryar, 2015). These analyses claimed that the policies fell short in their attempts to assess individual university missions and associated outcomes (Orr, Jaeger, and Schwarzenberger, 2007). These institutional missions likely inform stakeholder perspectives regarding the role and relative importance of PBF policies in university operations.

PBF policies exist as an attempt to enhance the accountability of institutions, following the advent of similar techniques throughout the public sector. Rutherford (2014) described how “demands for a more efficient and effective system of governance” have become increasingly dominant in governmental administration, and “in response, elected officials have adopted a range of incentive policies aimed to increase the performance of the bureaucracy” (p. 440). This is not a new trend by any means, as “since the 1970s, federal and state policymakers have become increasingly concerned about improving the performance of higher education institutions” (Dougherty et al., 2014, p. 163). This shift in concerns has led to policy designs that likewise focus on accountability. As a result of changed priorities, states have required their universities to develop more robust institutional data collection and reporting systems for a broader set of performance measures, as well as to build the analytical capacity at universities to improve upon those measures (Rutherford and Rabovsky, 2014, p. 186). Likewise, PBF policies leverage funding to enhance accountability, with a rigid focus on outputs and outcomes.

Just as institutional missions likely inform this implementation of PBF policies at the operational level within individual universities, the values and perspectives of policymakers at the state-wide level have likely guided the priorities for what outcomes and measures the states used to calculate funding decisions. Accordingly, the adoption of PBF policies typically involves specific key performance indicators that attempt to engage the various important outcomes that state higher education institutions must achieve in order to claim they are successful: namely, producing an adequate number of graduates, who graduate in a timely fashion and are prepared to be future taxpayers who will join the workforce in targeted fields (Blankenberger and Phillips, 2016, p. 893). Ideally, universities would make decisions to adjust their operations in order to comply with the policy and enhance their performance on those outputs and outcomes.

The next question that naturally develops is whether these PBF policies are measurably impactful in what they claim to do. A number of researchers claim that, in general, “the impact of accountability-based reforms has been limited” (Patrick and French, 2011, p. 361). More specifically, “Performance funding policies in place between 1993 and 2010 appear to have very little direct effect on student outcomes, despite the expectation that institutions will react to funding incentives by prioritizing student success” (Rutherford and Rabovsky, 2014, p. 205). Institutions do show evidence of making changes in financial expenditures, but this does not necessarily mean that any of these changes actually result in improved performance on the key indicators associated with PBF policies (Kelchen and Stedrak, 2016).

Accordingly, researchers have used a variety of quantitative methodologies whose statistical tests showed limited evidence to support the argument that PBF policies actually improved educational outcomes in the states that chose to implement them (Hillman, Tandberg, and Gross, 2014; Rabovsky, 2014b). In cases where there were indeed positive outcomes for such policies, the results were largely short-term in nature (Rutherford, 2014). Current evidence suggests that attempts to effect strategic change in higher education may depend on the scope and type of institution, with research-based universities representing the least likely setting in which reforms will succeed (Stensaker et al., 2014). In some instances, this even led two-year colleges to award more certificates rather than full two-year degrees, because certificates were easier to turn around in a short period of time (Hillman, Tandberg, and Fryar, 2015).

Despite minimal quantitative evidence to support their successes, PBF policies have continued to grow in popularity. Overall, the PBF literature presents intriguing perspectives on the development and implementation of accountability measures tied to funding mechanisms, shining a light on the competing values involved in such systems (Shin, 2010).


The literature is rich in quantitative analyses suggesting that PBF policies are arguably ineffective. This dissertation supplements these studies by providing a more contextual understanding of what works and what does not according to different stakeholders in particular settings (e.g., policymakers such as governing board members, or policy implementers at the university level). Additionally, this analysis fills a gap in the literature on the perceived impacts of PBF policies. For instance, quantitative studies have produced findings suggesting that institutions change financial behaviors based on PBF policies, but a qualitative narrative analysis provides unique insights into how seriously organizations took the advent of a particular PBF policy, as well as the managerial changes they believed they should make in response to the policy’s creation.

Narrative Analysis

Stakeholder perspectives can reveal important information regarding the development and operationalization of PBF policies. Narrative analysis is one method to observe these perspectives regarding the extent to which the policy impacts stakeholders and the values that inform these perspectives. This dissertation primarily concerns itself with the historical developments and perceived achievements of the State University System of Florida, which deploys PBF as a policy mechanism to improve accountability and enhance performance throughout the various universities that make up the system. Accordingly, understanding these developments also requires contextual analyses of public documents and exploration of concurrent historical trends in policy creation in other states throughout the US. The bulk of the data derives from public documents, and the analysis involves qualitative review of narrative elements and strategies retrieved from those planning documents and accountability reports. The narrative policy framework (NPF) serves as a theoretical framework and methodology to achieve these objectives and to help answer the research questions discussed above.

The narrative policy framework (NPF) attempts to add scientific rigor to traditional narrative analysis. According to Jones and McBeth (2010), such a framework attempts to “synthesize extant narrative scholarship to offer an NPF as a quantitative, structuralist, and positivistic approach to the study and theory building of policy narratives” (p. 340). The dominant approach to NPF has involved quantitative analysis as a way to produce insights regarding narrative trends. Scholars using the framework have often coded their data using standard elements and strategies from traditional narrative analysis, in terms of archetypical story characters, settings, and plots. Shanahan et al. (2013) formalized this approach with a table of elements and strategies, which this dissertation explores more thoroughly in Chapter 2 regarding the theoretical framework, as well as in Table 2 in Chapter 4, which discusses these narrative components through a PBF policy lens.

NPF analyses can involve multivariate regressions that investigate relationships between the frequency of these elements occurring in discourse throughout the public policy process. For example, Jones and McBeth (2010) state, “Ultimately, it should be possible to test the connection between how narratives impact aggregate public opinion and how (or whether) that public opinion impacts elite and institutional decisions” (p. 347). These scholars in particular have been foundational for NPF and its ongoing standardized development, choosing to take a positivist approach to counter much older interpretivist approaches to narrative studies of public policy.


In contrast with quantitative NPF analyses, this dissertation uses a qualitative methodology to review the coded data thematically, in the hopes of accumulating the narrative elements into overarching stories that appear within the public documents. Coding of the data follows the NPF standards for narrative elements and strategies according to the aforementioned system of Shanahan et al. (2013). Few major studies have used this type of approach, but the work by O’Bryan, Dunlop, and Radaelli (2014) and Gray and Jones (2016) serves as a model for qualitative NPF analysis, as discussed in Chapter 4. This dissertation fills a methodological gap: the mainstream NPF literature is still largely numbers-driven, despite appeals for more interpretivist analyses (Shanahan, Jones, and McBeth, 2018; Jones, 2018). This analysis aims to support the return of narrative analysis to its qualitative roots.

Significance, Implications and Dissertation Roadmap

In addition to methodological and theoretical contributions, this dissertation aims to have practical implications for the development and implementation of PBF policies, both in Florida and more broadly. The perceptions of policymakers and implementers may converge as well as contrast. For example, policymakers could see intended benefits and documented outcomes through the lens of financial accountability, whereas the policy implementers could see them through a potentially competing lens of quality and integrity. Both perspectives will likely reflect underlying interests and values, and there may be additional opportunities for reconciliation.

The hope is that the dissertation will shine a light on disconnects between the state system-level and university-level stakeholders’ perceived impacts of performance models. While PBF policies vary in complexity, scope, and methodology throughout different states and university systems, some of the findings could inform what an ideal model considers, how performance models engage particular topics, and overall whether they are worth the significant time investments involved with their development and implementation.

The coming chapters further investigate narratives surrounding PBF policies. In Chapter 2, a review of multiple bodies of literature highlights major developments in the topics of a) accountability measures in public administration, b) narrative approaches and NPF in policy studies, and c) PBF models throughout the US. In Chapter 3, this leads to the in-depth exploration of the particular model in the state of Florida’s university system. In Chapter 4, the research design and methodology for this specific investigation outlines this inquiry of perspectives and organizational narratives in regard to PBF policies at universities and the system in Florida. In Chapters 5 and 6, the actual analysis follows the aforementioned themes, organizing the findings by stakeholder, in accordance with standardized NPF narrative elements and narrative strategies. Lastly, in Chapter 7, a discussion of the outcomes of the analysis enables consideration of more specific implications, including any overarching stories that evolve during the analysis. In essence, given the nation’s growing list of states with PBF policies, this dissertation seeks to highlight the complexity of these policies with an in-depth exploration of Florida’s system and its perceived consequences.


CHAPTER 2. LITERATURE REVIEW

Multiple bodies of research literature contributed to the development of this dissertation. In order to offer a qualitative approach to NPF in a novel policy arena such as public higher education, the dissertation must position itself within the literature on accountability and on narrative analysis (broadly and in NPF more specifically), as well as the growing literature on the increasingly popular PBF policies.

First, scholarship on the accountability of public administration highlights the continual focus on more business-like mechanisms for increased organizational efficiency in the public sector. Second, the literature regarding narrative analyses and related tools offers some insights into the introduction of NPF. This chapter considers the various ways that scholars use NPF, including both quantitative and qualitative studies that explore the dynamics of policy narratives. Third, a comprehensive look at national trends in PBF policy development is helpful to understanding stakeholder perceptions about these policies.

Ultimately, the specific design of each PBF policy (including the metrics and funding model that each state uses) has implications for the extent to which stakeholders perceive these policies as having an impact on daily operations. For that reason, a review of the research literature in this chapter, combined with a technical review of the PBF policy design in the case selection in the chapter that follows, reveals insights regarding the policy environment that is the subject of this dissertation.


This chapter begins that exploration by summarizing the advancement of performance management systems (especially accountability mechanisms tied to funding models throughout the US) and by highlighting some of the analyses that have explored the efficacy and sustainability of such policies. A thorough review of the national and state context sets the stage for the NPF analysis, providing a clearer understanding of the stakeholders and their perspectives, values, and relationship with the PBF policy.

Traditions of Accountability in Public Administration

Since its inception as a discipline, the study of public administration has focused on streamlining bureaucratic processes and measuring performance outcomes in order to increase organizational accountability to members of the public. The common cry is to make government more like business, implying that the corporate world has clarity in its goal to generate revenue and is perhaps more willing to enhance the efficiency of operations in order to secure net profits.

First, Wilson (1887) argued that the study of administration could make government’s “business less unbusinesslike” and would “crown its duties with dutifulness” (p. 201). His focus on dutifulness contributed to the advent of an entire academic discipline, in which scholars investigated the duties of the public sector and how it conducts its business. Then, following Wilson’s line of administrative reasoning, Taylor (1911) proposed scientific management could eliminate the “awkward, inefficient, or ill-directed movements of men” (p. 5). These two concepts – that administration had duties and that it could scientifically manage those duties – would continue to drive the accountability literature in the study of administration for the next century and beyond.


Moving forward, scholars of administration acknowledged the relative difficulty of enhancing accountability in light of the very human nature of its operations. For instance, Simon (1947) argued for more rational decision making but noted the “bounded rationality” of bureaucrats who had to, at some point, insert themselves into the decision-making process. When faced with competing goals due to the diverse set of stakeholders who made up the public, as outlined in Dahl and Lindblom (1953), the growing question became how to define accountability.

Which stakeholders would win the attention of bureaucrats? The question of “accountable to whom” launched an entirely new line of democratic inquiry in the study of public administration, most notably Waldo (1948) in his description of the competing obligations of the administrative state to both democratic service and bureaucratic practice. Mosher (1968) even recommended that bureaucrats proactively engage the public by incorporating democratic governance into their work, thus fueling a completely different type of accountability literature – less about machinelike efficiency for policy outputs and more on the accountability for democratic outcomes and processes.

Stakeholder competition for bureaucratic attention further risked the spread of goal confusion, which Argyris (1954) suggested could lead to shortfalls in efficacy for public administration. In an ideal world, a policy would steer implementers within public bureaucracies in a singular direction. Oftentimes, though, multiple policies competed for the attention of public managers, forcing them to “muddle through” the decision-making process (Lindblom, 1959). As a result, scholars of administration argued for a formalized goal-setting process with clearer objectives in public management (Drucker, 1954; Locke, 1968). Public organizations are complicated, and as a result, policy makers struggle to steer them and hold them accountable.

Scholars pointed to budgeting as the most powerful tool for directing the focus of public administration. Wildavsky (1969) outlined the political nature of administrative budgeting, given the inherent competition when deciding who gets what levels and types of funding. In public administration, launching reforms has been second nature (O’Toole, 1984). Budget reforms resulted from the constant struggle of public managers to promote efficiency. Bureaucrats faced democratic accountability to multifaceted, disparate stakeholders – all in the hope of doing more, with less, for more members of the public. Unfortunately, this “deluge” of reforms “may have created confusion within government” about what policy makers “really want” (Light, 2006). Accordingly, renewed efforts to reform resulted in increasingly complex explanations of public policy makers’ expectations of bureaucracies.

In the last quarter of the twentieth century, political tides tilted more than ever in the direction of performance management. Efficiency became not just an ethical obligation of good stewardship over the public’s trust and funding but a fiscal imperative. At the federal level, the tool of public budgeting led to the reduction of funding allocations to agencies. The evolving budgetary environment forced public managers to “confront tradeoffs between new demands and old programs rather than to expand whenever a new public problem arises” (Levine, 1978). One strategic approach to shrinking budgets was that of Hood (1991), who proposed “a shift in management structures towards decreased command-orientation and increased ‘results-orientation’,” which he suggested led to “improvements in productivity” (p. 16). New public management (NPM) then became one of the major streams of thought in public administration. It became increasingly popular to suggest that policy makers could steer implementers through policies that strove to manage performance – while being more efficient and reducing costs. One possible solution was corporate outsourcing of services while managing performance through contractual agreements.

As a result, scholars of public administration focused on the development of performance management systems. As Radin (2006) noted in her review of federal attempts to promote accountability and develop standardized measures for agency successes, many performance management systems assumed a one-size-fits-all approach and were not nearly inclusive enough of unique agency functions and contexts. In the Clinton Administration in the 1990s, Al Gore led the development of the National Performance Review, later renamed the National Partnership for Reinventing Government (NPR). Subsequently, the George W. Bush Administration launched the Program Assessment Rating Tool (PART). Radin (2006) noted the reliance on numbers to describe the success of performance management, in which policy makers placed immense trust – too much so – in the data at the point of collection and in its sorting. Numbers can be misleading and easily manipulated, such as by reporting percentage increases over raw numbers (or vice versa) depending on which format benefits the narrative of improved performance.

Ultimately, Radin (2006) concluded, “Certainty about performance measurement is particularly inappropriate in volatile and rapidly changing issues where goals are complex and multiple” (p. 29). This harked back to the multifaceted nature of budgeting and public policy, in which stakeholders compete for the attention and benefit of the policy at hand. Moynihan (2008) similarly described the political motivations of these performance management systems, which have been used to benefit particular interests. In order to cater to these interests, policy makers could require certain methods of measurement to produce predisposed outcomes. As a result of such concerns, Moynihan (2009) noted that a universal theory for performance management has been largely elusive for scholars of public policy and administration. This dissertation looks to place PBF policies in the same grouping as broader attempts to promote accountability through public budgetary policies. The perceived success of PBF policies, particularly in Florida’s model, is similarly unclear.

Narrative Policy Framework

While this dissertation engages a number of topics – ranging from higher education financial oversight to accountability mechanisms – the research is actually rooted in how stakeholders understand, drive, and influence public policy. The following section outlines the narrative framework that aids scholars in understanding this topic, moving from a broader discussion of the roles of narratives in the public policy process to a more targeted theoretical understanding of how to observe and analyze these narratives.

Namely, NPF offers narrative researchers a coherent, structuralist lens to use. Stories are empirically observable, but at the same time, personalized worldviews influence what stakeholders hear and how they hear it. This dissertation supplements that discussion by integrating the approach of pragmatic scientism with a more open-ended, interpretivist understanding of narratives.

In terms of a general narrative theoretical framework, NPF operates with a standard set of assumptions regarding how narratives are integral to the entirety of the public policy process. This theoretical stance reflects much of the work of Stone (1989), who described the role of causal stories in policy agendas as the way that “situations come to be seen as caused by human actions and amenable to human intervention” (p. 281). This general belief then kickstarts NPF’s theoretical framework, here referred to as a tool to “organize inquiry,” as opposed to a more specific subordinate theory that would be more closely associated with a particular study in which it would be “tested and revised” (Schlager, 2007, p. 293).

The foremost assumption of NPF has been that “meaningful parts of policy reality are socially constructed,” and this baseline assumption leads to two core components of NPF as a theoretical framework: 1) These “social constructions vary to create different policy realities, but this variation is bounded (e.g., by belief systems, ideologies etc.) and thus is not random but, rather, has some stability over time,” and 2) “Narratives have specific and identifiable structures” (Shanahan, Jones, and McBeth, 2018, p. 333). In essence, stories are how humans understand concepts; in the case of public policy, they are the bedrock on which problems are identified, solutions are established, and public policies are placed onto agendas, regulated, implemented, and assessed. NPF scholars pointed to narrative structure, or “form,” such as the literary concepts of the protagonist or the moral of the story, as “generalizable across all contexts” (Jones, 2018, p. 728). In other words, NPF suggests that certain narrative components appear in every story, though their content shifts. This dissertation adopts that same theoretical framework, attempting to understand, capture, and articulate stakeholder stories regarding the development and implementation of the PBF higher education policy in Florida.


Narratives are a common part of everyday life and communication, so it makes sense that they would be integral to social inquiry. Particularly in terms of public policy, researchers cannot engage an arena without building concepts with which to address their social subjects. Stone (2012) used narrative and stories interchangeably to refer to “the principal means for defining and contesting policy problems,” because “most definitions of policy problems have a narrative structure, however subtle,” coming in the form of “stories with a beginning, a middle, and an end, involving some change or transformation” (p. 158). When a policymaker seeks to address constituent concerns, she or he must conceptualize those concerns in a narrative format in order to understand them. Who is involved with a policy issue? Is someone doing something wrong, or, perhaps, is there a better way for that someone to be doing the activity that a policy needs to address? As Stone explained, policy definitions “have heroes and villains and innocent victims, and they pit forces of evil against the forces of good” (p. 158). Stone provided scholars with a baseline theoretical framework in which narratives catalyze and sustain the policy process, providing opportunities for stakeholders to shape narratives and, congruently, to shape the policies that they drive.

Once a problem has been established, narrative structure provides stakeholders with a better understanding of possible solutions through elements like plot devices and morals of the story. Kaplan (1993) argued, “Plot in such a form provides the policy analyst with a tool that can ’grasp together’ and integrate into one whole and complete story multiple and scattered events” (p. 172). These narrative elements provide coherence for an issue. Inevitably, social scientists will seek to better understand these stories because they “provide explanations of how the world works,” and “these explanations are often unspoken, widely shared, and so much taken for granted that we aren’t even aware of them” (Stone, 2012, p. 158). Particularly in fields such as anthropology and sociology, where folktales are embedded in the cultures that a scholar studies, narrative analysis becomes an important topic for research.

In particular, Stone (1989) and Stone (2012) served as inspiration for the type of scientific inquiry with which NPF strives to align itself. When Stone (2012) noted how “rational decision models are partly persuasive techniques mounted by people with stakes in the outcome,” she opened the door for the types of empirical analyses that NPF researchers conduct (p. 251). Such a claim, that unspoken but widely shared narratives saturate much of the supposedly rational policy agenda setting process, catalyzed policy scholars to collect data in the form of less obvious stories in order to conduct different types of analyses. Eventually, the phenomenon of how causal stories drive the policy process motivated Jones, McBeth, and Shanahan (2014) to establish NPF. They explained that, “if stories are important to us as individuals, then it also probably follows that stories must play an important role for groups and the collective actions in which these groups engage, such as those present in the processes, outcomes, implementation, and designs of public policy” (p. 1).

Even the most rational approaches to public policy reflect a particular narrative – an attempt to objectively outline the policy story at hand. This point is particularly evident in the highly technical world of environmental policy studies, which relies heavily on quantitative data from biological and ecological analyses in order to improve natural conditions. With this understanding, stakeholders play an active role in developing stories to fuel the processes of creating, implementing, and assessing policies.


Likewise, Jones, McBeth, and Shanahan (2014) referred to policy narratives as having “instrumental goals” because narratives are “strategic constructions of a policy reality promoted by policy actors that are seeking to win (or not lose) in public policy battles” (p. 9). Kaplan (1993) agreed, suggesting that “stories can play an important role in argumentative policy analysis, and policy analysts and planners can frame and conduct their arguments through stories of a certain sort” (p. 167).

In the simplest of terms, then, the bedrock of NPF is a focus on the development and strategic use of narratives to drive the policy process. This is not an uncommon approach in the social sciences and is increasingly common in the study of public policy. Jones, McBeth, and Shanahan (2014) noted the prevalence of “scholarship that was produced in the 1990s that examined the role of narrative in shaping public policy” (p. 3). These authors also suggested that, while the focus on narratives in policy studies grew, it did not necessarily align with social scientific criteria for scholarly rigor. This growing concern established the need for a standard approach, such as NPF provides, following the “increased methodological sophistication and more generalizable findings, all of which have begun to provide for a scientific understanding of narrative and its role in human understanding and behaviors” (Jones, McBeth, and Shanahan, 2014, p. 3). As such, narratives became a unit of analysis that individual researchers could pursue, in the hopes of generating replicable results in line with the longstanding tradition of relying on the scientific method.

The key to understanding the role of policy narratives, including what they look like and how they are used to advance particular issues, nevertheless derived from more traditional narrative scholarship, such as that found in literary theory. As Czarniawska (2004) outlined, a narrative analysis could thus take more structural or formal literary tactics, such as those promoted by the anthropologist Claude Lévi-Strauss, who analyzed classical myths and outlined their structural components to showcase their universal natures, and by Vladimir Propp, who did the same with Russian folktales.

Czarniawska (2004) also described how such a standardized perspective of narratives led the way for literary theorist Northrop Frye’s understanding of texts, wherein he noted that to “stand under” the text is different from explanatory narrative analysis, which “sets the reader above the text” (p. 60). This is more than just clever wordplay juxtaposing the act of understanding with standing under. Instead, Frye highlighted the active role of readers in ascertaining universal truths within text, noting commonalities in narratives across cultures and contexts. Elsewhere, Mailloux (1990) referred to this active role as interpretation, in which readers participate in the process of connecting authorial intention (and authorial context) with the reader’s own worldview. He stated, “Taking a position, making an interpretation, cannot be avoided” (p. 134). That being said, readers strive to “approximate” an understanding of texts that they can share with other readers as well as the authors themselves.

In terms of this dissertation, an interpretative analysis might appropriately depict the perspectives of stakeholders who craft or otherwise feel the impacts of PBF policies. Stakeholders are authors of their narratives. Like texts, “narratives and stories are basic to social understanding; through stories we make meaning in our lives” because they force “a coherent interpretation on the whirl of events and actions around us” (Fischer and Mandell, 2012, p. 356). Applying a narrative approach to the context of policy analysis makes sense because “policymakers as learners turn to narratives to make sense of their own situation and to develop strategies for change, incremental or transformational” due to how “public policy easily translates into narrative form” (p. 356).

Through a structural lens, scholars identify standardized narrative elements and then instill meaning (i.e., “explanation”) into the text. Accordingly, in terms of a post-structural narrative theoretical framework, “the move from structuralism to poststructuralism was not as dramatic as it may seem” because this shift reflects, “above all, abandoning ‘the depth’ for ‘the surface’: if deep structures are demonstrable, they must be observable” (Czarniawska, 2004, p. 88). Narrative form, and its structural components, are the source of the narrative content that is subsequently deconstructed in a post-structural approach.

NPF’s Structuralist Approach

In the instance of NPF, Jones, McBeth, and Shanahan (2014) acknowledged how “narrative form refers to the structure of a narrative, while narrative content refers to the objects contained therein” (p. 4). Traditional research using narratives in public policy studies, such as those studies that Stone (2012) and Fischer and Forester (1993) conducted, investigates the content of narratives in order to better understand the strategies deployed in particular situations.

For example, Kaplan (1993) articulated how “narratives can describe a proposed future,” as “the narrative style forces a knitting together of multiple factors in a complex situation,” and, ultimately, “the consistency of narrative elements in a plot provides an important test of narrative truth” (p. 182). The narrative truth referenced here is not universal but rather situational, in which the truth at hand could be connected with the truth of the stakeholders and factors involved in a particular case study. This complexity of a unique policy instance resists simplification and, accordingly, resists replication due to the individual perspectives and events at hand. For a particular policy scenario, though, a narrative approach can make the policy process both more comprehensible and more enlightening.

Critics might suggest narrative analysis is limited as a method, given “an assertion that due to unique context and individual interpretation, narratives cannot be studied scientifically” (Jones, McBeth and Shanahan, 2014, p. 5). Admittedly, given the unique perspectives involved in understanding narrative content, scholars could find it difficult to replicate the research from these types of case studies. At the same time, suggesting that interpretivist narrative analysis could be more “scientific” also implies that such a method is somehow inferior to a positivist approach. NPF researchers thus note marked differences between their own scholarly aspirations in NPF and those of its narrative predecessors.

In public policy studies, exploring the dynamics behind one instance – or a series of instances – of a narrative could be useful for understanding a particular topic and its stakeholders, but it does not produce the type of generalizable findings that NPF strives to develop, in which broader lessons can be learned about the structure of policy narratives. In contrast, Jones, McBeth and Shanahan (2014) stated, “NPF takes a specifically structural position, defining generalizable and context-independent narrative elements consisting of a setting, characters, a plot, and a moral of the story” (p. 4). These components are universal in nature, making it easier for researchers to replicate the results of another study.

Just as an instrument for measurement limits any given empirical analysis, the narrative elements of NPF also draw boundaries around what is measured and how it is measured. In essence, establishing these standard structural components reflects the NPF attempt “to extract generalizable structures from the existing narrative literatures dispersed across many academic disciplines” (p. 5). Fortunately, this empowers future researchers to attempt the same with different policy arenas, making NPF a useful tool for positivist scholars seeking to produce finite descriptions of correlations between policy variables.

The focus on PBF policies – or even the higher education policy arena more broadly – will be an important new contribution to the NPF literature. While NPF is a common tool for policy analyses in the environmental fields, some scholars have directed their inquiries toward novel policy arenas, such as disaster management. Such analyses remain rare; one example is Crow et al. (2017), who explored wildfire emergency policy to develop a “character coding typology that accounts for the potential effects of the various heroes, villains, and victims on eventual disaster policy outcomes,” which the scholars hoped “will be useful for scholars to incorporate as they attempt to apply the NPF to disaster policy” (p. 650). In this particular scenario, common themes in the NPF analysis emerged: the natural disasters (i.e., the fires) were the villains and emergency policy personnel were the inherent heroes. A particular policy area may lend itself to NPF usage, so scholars should continue to explore additional areas beyond the traditional environmental fields – hence the focus on a novel area like PBF policies in the context of higher education.

Because narratives have standardizable structures, NPF often seeks to identify categories, which can then be quantified and studied using descriptive and more advanced statistical methods. Crow and Berggren (2014) posited that NPF is both a theoretical framework and a method. These scholars agreed upon the assumptions about the socially constructed nature of our realities and the structure of narratives, forming the basis for a framework. As such, the NPF method relies on a common theoretical understanding of narrative form, or structure, from which different methodological variations can then spin off.

Standard units of analysis reflect this positivist, formalized, and rigorous empirical approach. Shanahan et al. (2013) identified definitions for elements like “statement of a problem, victim, hero, evidence (setting), causal mechanism, moral of the story, and plot,” as well as narrative strategies such as “expansion, containment, devil shift, and angel shift” (p. 459). Such a system of narrative elements provides for a standardized format of coding that can be useful in either a quantitative or a qualitative approach.

Shanahan et al. (2013) emerged as a key example in the NPF literature, in which the scholars explored these narrative elements and strategies in the context of policy coalitions that were both pro- and anti-wind farm. In this application of the NPF approach, Shanahan et al. (2013) remarked, “Policy narratives must be populated by one or more characters (hero, villain, and victim), and often offer solutions (e.g., a moral of the story) and evidence in support of the solution” (p. 458). Again, this understanding of narratives was particularly structuralist, suggesting standardized components that served as the scholars’ coding framework.

Specifically, Shanahan et al. (2013) elaborated how narrative elements support particular policy perspectives, such as the pro-wind farm coalition portraying Cape Wind and Associates (i.e., themselves) as hero characters. To measure each set of stakeholders’ reliance on narrative components in “public consumption documents,” the scholars noted, for example, the average number of mentions of heroes and how often each coalition used science as evidence (p. 464). The crux of the analysis rested in the frequency and proportion of the narrative elements that stakeholders used – even timestamping the data to showcase dynamic trends in narrative usage.

Finally, Shanahan et al. (2013) statistically tested relationships between policy beliefs, such as the relationship between nature and humankind. For example, they calculated “environment victims–human victims/total, resulting in a -1.00 to +1.00 scale; (-1.00 is equivalent to only human concerns; +1.00 is equivalent to only environmental concerns)” (p. 475). In other words, if a pro-wind farm coalition framed the environment as the victim in the policy narrative more often than it focused on human victims, the score would be closer to +1.00. By applying their coding framework, the scholars illustrated how the narrative elements and strategies operated in practice.
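As a rough illustration, the victim-ratio calculation described above can be sketched in a few lines of code. The function name and the coded counts below are hypothetical, not taken from Shanahan et al. (2013); the sketch assumes documents have already been hand-coded into counts of environmental and human victim mentions.

```python
def victim_framing_score(env_victims: int, human_victims: int) -> float:
    """(environment victims - human victims) / total, on a -1.00 to +1.00 scale.

    -1.00 means only human victims were invoked; +1.00 means only
    environmental victims were invoked.
    """
    total = env_victims + human_victims
    if total == 0:
        raise ValueError("no victim mentions coded")
    return (env_victims - human_victims) / total

# A coalition whose documents framed the environment as victim nine times
# and humans three times leans environmental:
print(victim_framing_score(9, 3))   # 0.5
print(victim_framing_score(0, 5))   # -1.0  (purely human concerns)
print(victim_framing_score(4, 0))   # 1.0   (purely environmental concerns)
```

Scores like these can then be compared across coalitions or over time, which is how the timestamped trend analysis described above becomes possible.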

The development of standard narrative elements and strategies made it particularly straightforward to replicate the coding framework in future research efforts. Shanahan et al. (2013) defined each code in their framework and offered an example from the wind farm policy scenario. Again, this structuralist understanding of narratives reflected the similarly structuralist approach of Lévi-Strauss (1955), who did the same by creating standard components of myths and traditional folklore.

Less common than the structuralist lens of NPF was the interpretivist approach. Jones and Radaelli (2015) argued that these post-positivist (arguably post-structuralist) NPF analyses require “thoughtful critique and careful refinement,” but the authors did not rule out the use of NPF by scholars from any particular epistemological approach (p. 352). Elsewhere, Jones and Radaelli (2016) described an epistemological “divide” that social scientists endure when they talk about NPF as a method. Jones (2018) similarly issued a call for NPF interpretivism.

These authors noted the applicability of NPF to both quantitative and qualitative analyses. Still, the vast majority of NPF studies deal in numerical data and advanced statistical methods. As Shanahan, Jones, and McBeth (2018) noted, “The bulk of meso level NPF scholarship to date has employed nonparametric statistics” (p. 342). The most common utilization of NPF as a method relied on the study of frequencies, distributions, and related statistical tests.

Given the continual progression of narrative scholarship and the applicability to a novel policy arena like higher education, the topic of PBF policies appears ripe for an interpretivist NPF analysis. As discussed in the section that follows, the national dialogue in higher education policy regarding accountability does not necessarily align with the outcomes of PBF policies. Whether or not they work appears to be up to interpretation.

National Trends in Performance-based Funding

The national landscape for PBF policies is growing and transforming (Friedel, Thornton, D’Amico, and Katsinas, 2013). PBF policies are now a part of almost every major state system of higher education throughout the nation, and “as of early 2016, 46 states were considering, transitioning to, or operating a performance-funding program, leaving only four states without any policy activity around this funding method” (Gándara and Rutherford, 2018, p. 682). Looking at formally adopted policies instead of just current proposals, “35 states tie at least a portion of higher education appropriations to performance funding policies” (Hillman, Fryar, and Crespín-Trujillo, 2018). Such a saturation of PBF policies certainly warrants additional exploration regarding the efficacy and overall health of such policies and funding mechanisms.

Many politicians, public officials, and non-governmental organizations support the spread of these policies. While non-profit organizations such as the Bill and Melinda Gates Foundation and the Lumina Foundation have claimed that states implementing PBF models have positively impacted higher education outcomes, these privately-funded reports produced conclusions that “are inconsistent with the growing body of research that finds states with performance funding policies rarely outperform states that never adopted the policy” (Hillman, Fryar, and Crespín-Trujillo, 2018). This presents further questions regarding the geographic spread and types of implementation that exist throughout the states.

These models look different according to the state and local contexts in which they operate. Over the last thirty years in the US, the use of PBF has primarily been limited to supplemental incentive programs – meaning that the base budget was typically only partially at risk and public institutions competed for a proportionately smaller set of funds by achieving special targets for performance indicators (Jongbloed and Vossensteyn, 2010). Scholars have tried to differentiate between “performance funding 1.0,” which represents largely supplemental models, and “performance funding 2.0,” which puts more base budget funding at risk when institutions do not enhance their performance. Li (2017) showcased a diffusion model, in which PBF policies spread geographically according to the behavior of neighboring states within a particular state’s region. Statistically, states that initially adopted the first type of model were more likely to proceed to the next (“2.0”) type of model (Li, 2017).


In PBF policies, the key performance indicators that serve as variables in funding formulas largely represent the competing values of efficiency and efficacy, with a focus on institutions improving their productivity in terms of student outputs while simultaneously spending less money to accomplish this ambitious task. Again, this harks back to descriptions of NPM in which government has been forced to do more with less (Hood, 1991). Jongbloed and Vossensteyn (2010) further elaborated how, fortunately, the market reforms driving PBF policy development have thus far engaged in a soft roll-out, giving colleges and universities the opportunity to join the conversation as stakeholders in the development of specific policies. In the meantime, these soft roll-outs enable PBF reformers to tell the story of themselves as heroes of a war on inefficiency without offering context regarding the size of the battles they have completed. They move forward with successful reforms, gradually co-opting additional states into launching even more drastic accountability measures. They receive limited pushback from stakeholders at universities, as their arguments have been buoyed by non-governmental organizations and proprietary reports (Hillman, Fryar, and Crespín-Trujillo, 2018).
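To make the mechanics of such formulas concrete, a stylized allocation of this general kind can be sketched as follows. The metric names, weights, institutions, and dollar figures below are purely hypothetical and are not drawn from Florida’s model or any actual state formula; the sketch only shows how weighted performance scores might divide a fixed pool of funds.

```python
# Hypothetical PBF allocation: split a pool of dollars in proportion to
# each institution's weighted performance score. All names and numbers
# are illustrative assumptions, not any state's real policy.

def allocate_pbf(pool, scores, weights):
    """Return each institution's share of `pool` by weighted score."""
    weighted = {
        inst: sum(weights[m] * value for m, value in metrics.items())
        for inst, metrics in scores.items()
    }
    total = sum(weighted.values())
    return {inst: pool * w / total for inst, w in weighted.items()}

weights = {"graduation_rate": 0.6, "cost_per_degree": 0.4}
scores = {
    "Regional U": {"graduation_rate": 0.55, "cost_per_degree": 0.80},
    "Flagship U": {"graduation_rate": 0.85, "cost_per_degree": 0.60},
}
shares = allocate_pbf(1_000_000, scores, weights)
# The institution with the higher weighted score draws the larger share,
# illustrating how metric weights steer dollars between institutions.
```

Even this toy example shows the design tension the literature describes: whoever sets the weights effectively decides which institutional behaviors the money rewards.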

In terms of overall efficacy regarding student outcomes, the findings have been mixed. While some states have certainly seen improvement in particular key performance indicators, the connection between PBF policies and those outcomes was not entirely clear. Dougherty et al. (2016) produced one of the most thorough reviews of PBF policies in order to investigate their efficacy. The scholars argued that, while states have clearly sought to promote their own models as successful, and institutions have sought to promote their subsequent compliance, there was “little evidence that the states vigorously built up institutional capacity to respond effectively to performance funding as a principal policy instrument” (Dougherty et al., 2016, p. 203). In other words, states viewed these models as efficiency tools rather than strategic capacity-building instruments, which means they may not have actually been able to verify that the models produced change to the extent that some stakeholders believe.

Others, such as Li (2017), pointed to the lack of evidence that the introduction of PBF policies had resulted in associated improvements in performance. Any increased performance, she explained, occurred at around the national rate of increase. Also, in a recent special report for The Chronicle of Higher Education, Kelderman (2019) claimed, “Nearly 10 years after states began a wave of performance-based funding, we don’t know whether the policy itself has led to gains in degree completions or even a stronger focus on student success” (p. 25). Similarly, after reviewing PBF policy financial allocations in multiple states, Hagood (2019) found that “it is troubling to see improved performance solely at the institutions that do not seem to reap financial benefits or, worse, experience direct financial burdens” (p. 208). She suggested that PBF policies merely directed funds to institutions with historically high performance and high funding levels. She termed these institutions “politically powerful” and questioned whether, despite claims otherwise, PBF policies resulted in institutional funding levels that reflected “how these institutions were viewed” (p. 209). There was limited evidence in any of these studies that increased performance occurred as a result of the introduction of PBF policies.

PBF is often termed a “newer model” despite having been around for nearly half a century (Kelderman, 2019, p. 9). This spin has itself been part of a market reform narrative. For example, South Carolina launched an initial version of its PBF in the 1970s, with a 1996 re-launch of the program in the form of the “Mission Resource Requirement.” The South Carolina Commission on Higher Education (2014) summarized the gradual devolution from PBF into across-the-board cuts as a result of the Great Recession. South Carolina no longer boasts any sort of PBF policy.

In the midst of unprecedented reductions in public funding, PBF in South Carolina deteriorated into “need-based” funding for public institutions (p. 4). The state’s Commission on Higher Education cited the volatility of PBF to explain its demise. Even though the state is currently considering yet another revitalized launch of PBF (termed in the state “allocation-based funding,” or ABF), the Commission highlighted predictability as a necessary component of any future funding model (p. 6). Predictability is not a term typically associated with market reform narratives; it is more often connected to reliable government services. The adoption of PBF policies in South Carolina thus serves as an example of a PBF failure: the market reform narrative never fully took hold because the innovation story could not thrive in a state where the model was viewed as outdated and unsuccessful.

In contrast, Tennessee’s system operates entirely on PBF, and policymakers are proud of being the first state in the nation to transform from a growth-based enrollment funding model to a 100% outcomes-based model, with the metrics weighted according to the unique mission of each institution (Tennessee Higher Education Commission, 2010). For example, a metric regarding access for low-income students may be weighted heavily for a regional institution that successfully serves those demographics, whereas the state’s flagship institution may carry more weight on research expenditure metrics. The state’s leaders touted the uniqueness of the model as evidence of its success, citing that the “design, utilizing outcomes and an institution-specific weighting structure, is unique in higher education finance policy” (p. 11). In the same document, the system’s administrators claim to both “jettison” and “completely replace” (emphasis theirs) the previous model (p. 11).

The state further refined its model in 2015, but only after the initial model ran for a full five-year cycle. The changes did not retreat from the commitment to fund 100% of the state’s community colleges and universities on the basis of performance rather than enrollments. Further, the state continued to rely on “a three-year average of the outcomes” in order to “limit potential volatility in the formula year over year” (Tennessee Higher Education Commission, 2015). An institution might see a particular outcome drop in a given year, but as long as the three-year trend held strong, the PBF policy in Tennessee did not penalize the institution for transient noise in that trend. Clearly, innovative reform is the driving narrative in Tennessee; however, the state also expressed a commitment to a sustainable model.
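Tennessee’s smoothing approach can be illustrated with a brief sketch. This is not the state’s actual formula; the outcome values below are hypothetical, and the sketch only shows how a three-year average absorbs a one-year dip.

```python
def three_year_average(annual_outcomes):
    """Average the three most recent annual values of an outcome,
    as Tennessee's model does to limit year-over-year volatility."""
    return sum(annual_outcomes[-3:]) / 3

# Hypothetical degree counts with a one-year dip in the middle year
degrees = [1000, 940, 1020]
smoothed = three_year_average(degrees)  # roughly 987, versus the 940 single-year figure
```

Because the formula input is the smoothed value rather than the single-year figure, the dip to 940 moves the funding input only modestly.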

Conducting multivariate analyses not only on Tennessee but also on Indiana and Ohio, Dougherty et al. (2016) described increased enrollments and degrees awarded in those states. The scholars also clearly noted that they “cannot in any way conclude that performance funding in these three states is producing higher student outcomes” (p. 132). Of course, increased outcomes are positive changes for all stakeholders, including policymakers and implementers, all of whom seek recognition for their roles in producing them. Nevertheless, the expansive quantitative analyses of Dougherty et al. (2016) did not specifically point to PBF policies as catalysts for these positive changes.

Also in the context of Ohio and Tennessee, the quantitative analyses of Hillman, Fryar, and Crespín-Trujillo (2018) concluded that “even the most advanced performance funding states have not yet outperformed those without this funding model” (p. 165). Improvements in key performance indicators may be occurring, but it is still a stretch to conclude that these gains are due to PBF policies. Rather, they align with national improvements in degree productivity and graduation rates, and they are not necessarily even keeping pace with the rest of the US.

Worse, the strategies institutions deploy to fall in line with PBF policies can themselves be worrisome. Dougherty et al. (2016) described the prospective unintended impacts of a focus on increasing productivity and overall efficiency without an explicit commitment to enhanced capacity building. They warned that “weakening of academic standards and restricting the admission of less prepared students are disturbing practices, particularly if pursued by institutions historically committed to broad access to higher education” (p. 206). In response to such warnings, researchers such as Gándara and Rutherford (2018) found that certain policy tools are, at least in specific cases, successfully mitigating concerns regarding access through the application of premiums for historically underserved and minority students—in other words, “financial bonuses designed to reward institutions for the enrollment and success of targeted student populations” (p. 683). If such premiums do not exist in a state’s PBF policy, though, the concerns regarding access for minority and low-income students persist.

Are the key performance indicators in these models truly representative of the missions and visions of public universities and colleges? Of particular concern are regional schools that serve college-age populations who are geographically restricted in their higher education options (for financial or family reasons). Looking at the state of Indiana’s model, Umbricht, Fernandez, and Ortagus (2017) concluded that PBF policies “may not necessarily improve accountability and could have negative ramifications on college access” (p. 667). In order to graduate students more quickly, institutions could choose to admit students who are already likely to complete their degrees on time, rather than students who have struggled in their personal and academic lives. Unfortunately, the focus on efficiency may motivate schools to adjust admission standards, further restricting the traditional access these institutions provide to historically underserved populations throughout their states.

Given the growing popularity of PBF policies and the possibility that performance is quickly becoming the dominant narrative in public higher education policy, scholars have asked questions about the appropriateness of specific performance indicators (McLendon and Hearn, 2013). Similarly, other researchers have acknowledged the interconnectedness of particular performance indicators and the possibility that PBF policies will give too much weight to any single indicator, which cannot possibly “signal performance of multiple actors and multiple institutions that themselves interact in complex ways” (Kukla-Acevedo, Streams, and Toma, 2012, p. 14). Simple models thus limit the success of PBF policies, which attempt to improve accountability and enhance efficiency through broad generalizations about universities. Models that rely too heavily on one metric are perhaps too simple to succeed. Instead, a correspondingly complex model could incorporate multiple indicators and consider the types of institutions within the state system. Regardless, PBF policies have become powerful enough to influence financial outcomes for entire university systems.


Bodies of Literature

This chapter summarized the various bodies of research literature that guided this dissertation. First, it reflected on the traditions of accountability and performance management in the discipline of public administration. Then, an overview of the NPF literature explored both quantitative and qualitative applications of the approach, with a particular focus on scholarship that could serve as a methodological model. Lastly, the review of the PBF literature documented the progression of a national trend. Together, these bodies of literature position this dissertation within the discipline of public administration, the NPF approach, and the PBF policy arena, grounding the analysis while setting up opportunities for theoretical, methodological, and practical contributions.


CHAPTER 3. POLICY CONTEXT

Florida is still developing its PBF policy, which includes supplemental funding and base funding, mixing new performance incentive dollars with reallocated base funds from each institution in the system (Florida Board of Governors, 2013). Li (2017) considers it to be a PBF 2.0 policy. Ultimately, this state serves as a rich case study, with an influx of narrative fueling the development of its PBF initiative.

First, this chapter outlines the history of public higher education in the state of Florida that led to the adoption of the PBF policy, in order to clarify some of the stakeholders and their relationship with institutional performance reforms. It notes historical events that have contributed to the political environment for public universities in the state and provides a more technical summary of the policy itself, which the Florida Board of Governors (BOG) system claims “represents a new era of improvement and accountability” due to its threshold for minimal performance in order for universities to earn any of the new funds (Davis, 2014a). Then, the chapter introduces the design of the state of Florida’s PBF policy, how it stands out in comparison to other models throughout the country, and why it is an appropriate selection for inclusion in this dissertation. Ideally, a robust understanding of the unique PBF policy context can fuel a more thorough narrative analysis, with a clearer picture of stakeholders and the situation in which they operate.


Governance of public universities in Florida, as in any other state, is an inherently political activity due to the substantial costs and number of personnel that accompany the operation of institutions of higher education. Arguably, though, higher education systems in Florida have become increasingly politicized over the past few decades, with significant involvement by elected leaders who have sought direct influence over policy decisions. This active role replaced the prior practice of delegating authority to qualified representatives, which had provided some buffer between political oversight and the governing board. With more involvement by the executive, institutions of higher education were increasingly and directly subjected to the political whims of the state’s elected leaders. Figure 1 represents an overview of the involvement of politicians with the governance systems of higher education in the state.

Figure 1. Historical developments in oversight of Florida’s public universities

2001  Governor Bush signed SB 2108, which replaced Board of Regents and

established local Boards of Trustees at individual universities

2002  Supermajority of residents approved creation of Board of Governors

2011  Newly-elected Governor Rick Scott proposed tuition freezes

2013  Board of Governors system introduced a three-metric PBF policy ($20M)

2014  Board of Governors system introduced a ten-metric PBF policy ($100M)

The Board of Regents attempted to respond to political calls from the state Legislature in 1998 to elevate the flagship institutions in the state, the University of Florida and Florida State University. For example, the board suggested at that time that the university system tier itself according to the mission and productivity of each university. “The three tiers of the plan are: Research I, Research II, and Comprehensive tier,” acknowledged Foley (1998). More specifically, “the criteria used to assign each university to a category include graduate and undergraduate enrollment, amount of research funding the university receives, degrees awarded and the size of the university's endowment” (Foley, 1998). The push was unsuccessful, highlighting the limits of the governor’s political influence over the state universities and empowering the faculty and students who vocally protested the tiering of universities out of fear that minority-serving institutions and universities with regional focuses would suffer as lower-tiered institutions.

To shift the political control of the Board of Regents, then-Governor Jeb Bush vigorously engaged in the establishment of a system that led to “greatly increasing his influence over higher education” (Trombley, 2001). In 2001, Bush became frustrated with the Board of Regents, the advisory board that oversaw the State University System (SUS) of Florida, which comprises twelve institutions of higher education located throughout the state (Washington Post, 2016). At the time, Bush was focused on abolishing affirmative action in undergraduate admission processes, a practice that sought to boost university enrollments of historically underrepresented minorities in the state. Using a preexisting proposal to develop local control of universities through institution-level boards of trustees, Bush aggressively supported the passage of SB 2108, which eliminated the state-wide Board of Regents (Powers and Schuster, 2001).

In the past, the Florida Senate and members of the state’s elected cabinet reviewed and confirmed gubernatorial appointments to the Board of Regents for staggered six-year terms (Florida Statutes 240.207, 1997). The newly-established university trustees, in contrast, were appointed to “serve four-year terms and can be dismissed by the governor ‘for cause’” (Trombley, 2001). The new system provided the governor with more direct control over the institutional-level trustees, so that he would no longer have to maneuver to convince the state-wide regents to adopt his policy positions on issues such as affirmative action processes. Likewise, he would not need to wait for holdout regents who disagreed with him to cycle off the board. Instead, Bush had instantaneous appointment power for each full local board of trustees (BOT), resulting in enhanced decision-making power.

This move by the Republican Bush was met with resistance by members of the Democratic opposition within the state. The rebuttal was primarily led by sitting US Senator and former Florida Governor Bob Graham, who launched a ballot initiative in 2001 for the state to adopt a constitutional amendment permanently establishing a state-wide advisory board for coordination of the public universities (Powers and Schuster, 2001). Subsequently, a super-majority of residents in 2002 voted to change the state’s constitution to establish the Florida Board of Governors (BOG) to oversee the SUS. Throughout this dissertation, the term “BOG system” refers to the statewide governing board and its staff. Furthermore, the governor continued to appoint members to this system-wide board but now had the authority to appoint only half of each university’s local BOT, with the power to appoint the remainder constitutionally delegated to the newly-established BOG system. In essence, the governor appointed half of the trustees and the BOG appointed the other half, ensuring the governor still had substantial political influence on the local boards.


From 2003 through 2010, the BOG system primarily functioned as a mechanism for coordinating degree program offerings, with the local BOT retaining authority to oversee “matters including, but not limited to, academic and student affairs, strategic planning, finance, audit, property acquisition and construction, personnel, and budgets,” in accordance with Section 7(c), Article IX of the Florida Constitution (BOG Regulation 1.001). While the BOG’s formal responsibilities remained, the election of a new governor led to increased responsibility for the BOG system as an oversight body, as noted below.

In Governor Rick Scott’s 2010 campaign for his first term, the future Republican governor developed a laser focus on job creation: a seven-step plan to create 700,000 new jobs above and beyond the 1 million jobs that economists predicted would likely materialize over seven years due to a national economic recovery (PolitiFact, 2018). The primary proposed method for job creation was tax reduction, and as the state of Florida has no income tax, these cuts were aimed at corporate income and property taxes (Klas, 2011). Even his fellow “Republican lawmakers pushed back against the governor, saying tax cuts for the middle and working class were more important than giving big business a break” (Frank, 2011, p. 36). Often, Scott’s strategy was to apply pressure to keep special taxing districts’ millage rates, municipal property taxes, and even university tuition costs flat, regardless of the need for revenue or the governmental context (Sigo, 2011; Marklein and Auslen, 2013).

Florida’s Budgetary Environment for Public Universities

Rick Scott’s efforts to freeze taxes were intended to boost jobs, and upon the election of the self-proclaimed “jobs governor,” Scott appointed like-minded representatives to the BOG. For the first time in its existence, the board began to engage proactively in the financial administration of universities, with several members of the BOG voicing concern about tuition rates, despite the fact that Florida had for many years ranked as having some of the lowest student tuition and fees in the nation (Rick Scott for Florida, 2014). Tuition increases, the governor argued, were equivalent to taxes on college students and their families.

The BOG, along with Governor Scott who appointed many of the members of the board, started to lobby against the automatic tuition increases that universities were relying on to counter years of across-the-board revenue reductions that the Legislature had made in order to balance its budgets during the Great Recession (Mitchell, 2013). With general tuition revenues slowing to a freeze, the BOG also began to limit the tuition differential fees that universities were previously able to use to increase revenue on an as-needed basis. These, too, came to a complete freeze during Scott’s efforts to make Florida the most affordable state in the nation for higher education (Rick Scott for Florida, 2014).

Simultaneously, in 2013, the Florida Legislature and the BOG began to turn towards so-called merit-based allocations through programs such as the preeminent state research universities program, for institutions that attain statutorily-defined benchmarks for 11 out of 12 “academic and research excellence standards” (Florida Statutes 1001.7065, 2018). See Appendix A for a full breakdown of preeminence metrics and requisite benchmarks. The Legislature defined these benchmarks using the existing measurements from two institutions: the flagship University of Florida and Florida State University. In effect, political leaders in the state revived the 1998 tiering plan.


In the 2013-14 academic year, the preeminence program rewarded these institutions with millions of dollars in new recurring funding for achieving benchmarks that had been set using the institutions’ own scores. The program also afforded the two schools the opportunity to increase tuition differential fees, an option forbidden to all other public universities (Florida Board of Governors, 2012b). These incentives were not part of the state’s PBF policy, and as such, they were not immediately accessible to all universities. Preeminence incentives focused more on institutional designations but still aligned with the timing of the PBF policy creation.

Likewise, the related “emerging preeminent” state research universities program also provided meritorious funding mechanisms for institutions that achieved 6 of the 12 benchmarks, in order to further catalyze continued improvement in these particular “academic and research excellence standards” (Florida Statutes 1001.7065, 2018). In 2018, the University of South Florida, after years of significant gains against the preeminence standards, achieved 11 of the 12 benchmarks and was officially designated as the state’s third preeminent institution (USF New Era, 2018).

Performance-based Funding in Florida’s Public University System

Institutional accountability and enhanced performance have been common themes in the recent history of Florida’s four-year university system, at least since the election of Rick Scott as governor in 2010. With Governor Scott’s focus on job creation and reduced public spending, initial versions of the state’s performance-based funding (PBF) mechanism reflected employment placement and other outcomes for baccalaureate graduates.


Established during the 2013 legislative session, the general appropriations bill for Florida that year set aside $20 million for the university system to distribute amongst the universities according to performance on three metrics: 1) “percent of bachelor's graduates employed and/or continuing their education further 1 year after graduation,” 2) “median average full-time wages of undergraduates employed in Florida 1 year after graduation,” and 3) “average cost per undergraduate to the institution” (Florida Board of Governors, 2013a). This early three-metric PBF policy highlights the more direct influence that the executive in the state had over the public universities, with none of these measures reflecting the typical focus on academic quality or productivity that a governing board would want to enhance in a university system. The data definitions and the methodology for calculating each university’s results for these metrics would continue to develop over the next five years.

Subsequently, these measures became metrics one through three of the more robust ten-metric PBF policy in 2014. The BOG system’s self-proclaimed “guiding principles” for the model were to “use metrics that align with strategic plan goals, reward excellence and improvement, have a few clear, simple metrics; and acknowledge the unique mission of the different institutions” (Florida Board of Governors, 2018a). These principles are important reference points for what each stakeholder in the PBF policy believes to be the motivation and rules behind the model. The appointed members of the Florida Board of Governors, the technical staff who support them, and the university-level stakeholders likely all have differing viewpoints regarding how accurately these principles are reflected within the PBF policy.


First, these guiding principles referenced the BOG’s 2025 System Strategic Plan, which was originally approved in 2011 (Florida Board of Governors, 2016a). Like many university-level strategic plans, the three overarching goals of the plan revolve around the themes of teaching, research, and service. These strategic plan goals appear throughout the ten metrics that are included in the performance funding model, as well as the aforementioned twelve preeminence metrics. What is more debatable is the degree to which each strategic plan goal of teaching, research, and service remained a core component of each of these metrics.

Second, the concept of rewarding improvement and excellence relies on standardized benchmarks for each category being measured. Points are awarded based on each university’s achievement, both for improvement over the prior year and for reaching specified targets for excellence.

Third, the BOG system has capped the number of metrics at ten since 2014. Levels of clarity and simplicity of metrics are largely matters of opinion, so understanding stakeholder perspectives and responses to the metrics themselves will be quite important in understanding how effectively the board has aligned its model with this principle.

Fourth, the ten-metric model does allow for recognition of individual institutional mission in that the ten metrics are not standard across the system. The model includes at least one BOG-selected metric and at least one university-selected metric for each institution. Stakeholder perspectives accordingly diverge on whether or not recognition of university mission is substantial enough to reward unique successes of each institution.

There are plenty of opportunities to further review the extent to which the system is following the intent of its own guidelines. Still, these principles are referenced on a continual basis and cited when the political appointees on the Board of Governors evaluate their own efforts throughout the years (Florida Board of Governors, 2018a).

Benchmarking Improvement versus Excellence

As mentioned in the BOG’s second guiding principle for its PBF policy, the model rewards institutions for both improved performance and comparably “excellent” performance. For each metric, the PBF policy established two sets of benchmarks that determine how many points, of 100 total points, each institution earns. The university’s score for each metric ultimately depends on which set of benchmarks produces the higher number of points. See Figure 2 for a breakdown of metrics based on excellence and improvement benchmarks.

First, in terms of improvement, when performance-based funding used a 50-point scale in 2014 and 2015, a one-percentage-point increase in a metric equated to one full point in the model. In other words, 1% was equal to 1 point. At that time, the maximum number of improvement points available to each institution per metric was 5 points, or a 5% increase. This changed in 2016 when the model expanded to a 100-point scale (Florida Board of Governors, 2015). Improvement points were then awarded on a half-percentage-point basis, and the maximum number of points available to each institution per metric became 10 points, still representing a 5% increase. In other words, 0.5% was equal to 1 point. Under both scales, institutions were disincentivized from improving at a rate beyond five percent annually, as improvement beyond that rate earned no additional points in the PBF policy.
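The two scales can be sketched as a small conversion function. The assumption that points accrue per full increment of gain (one point per full percentage point, or per full half point) is made here for illustration; the BOG’s precise rounding rules may differ.

```python
def improvement_points(gain_pct, scale=100):
    """Convert a year-over-year gain (in percentage points) into
    improvement points. On the 2014-15 50-point scale, 1% of gain
    earned 1 point, capped at 5; on the 100-point scale used since
    2016, 0.5% earned 1 point, capped at 10. Both caps correspond
    to a 5% annual gain."""
    gain_pct = max(gain_pct, 0)          # declines earn no improvement points
    step = 1.0 if scale == 50 else 0.5   # gain required per point
    cap = 5 if scale == 50 else 10
    return min(int(gain_pct / step), cap)

improvement_points(3.0, scale=50)  # 3 points under the old scale
improvement_points(3.0)            # 6 points under the 100-point scale
improvement_points(8.0)            # capped at 10; gains past 5% earn nothing extra
```

Note how the cap operates identically on both scales: any gain beyond five percentage points produces no additional points, which is the disincentive described above.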

This first set of benchmarks, those for improvement, is the same for each of the current ten metrics, regardless of the number of students being counted in a particular outcome or the relative difficulty of lifting the outcome. Some measures count all undergraduate degrees regardless of whether the student originally entered the university as a freshman or a transfer student, for example, while others count only those students who first entered the institution as true, full-time, degree-seeking freshmen. Depending on the metric, then, a 1% improvement could require anywhere from a handful of students to several dozen students to meet the standard. Even when the population being counted is the same, the improvement points are universal regardless of the relative ease of retaining a student for one year with a 2.0 grade point average versus the more difficult task of ensuring that students have met all graduation requirements in four years.

Second, in contrast, each set of benchmarks for excellence is unique to the particular metric, and the BOG system sets the minimum threshold for 1 point, as well as the range and scale of excellence (i.e., the scores the university must achieve to earn each point, as well as the maximum score necessary for earning the full ten points). Given that each excellence benchmark scale differs by metric, unlike the model’s approach to improvement benchmarking, this manner of earning points does consider the relative difficulty of earning particular scores. If a university’s excellence point total exceeds its improvement point total, then the excellence total is the number of points that the university has earned for that particular metric. In other words, institutions earn points based on either improvement benchmarks or excellence benchmarks, whichever produces the higher score for the university.
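The better-of-two-benchmarks rule described above reduces to taking a maximum. In this sketch the point values are invented for illustration only:

```python
def metric_score(improvement_pts, excellence_pts):
    """A university's score on a metric is whichever is higher:
    its improvement points or its excellence points (each topping
    out at 10 under the 100-point model)."""
    return max(improvement_pts, excellence_pts)

metric_score(4, 7)  # 7: the excellence benchmarks produce the higher score
metric_score(9, 3)  # 9: the improvement benchmarks produce the higher score
```

Because only the higher of the two paths counts, a university that is already near the top of an excellence scale can still earn full points on a metric without improving year over year, and vice versa.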


Figure 2. Performance Based Funding Model 2018-19 Benchmarks, excluding Metric 10 (Florida Board of Governors, 2018a)

Linking Scores to Funding

Performance-based funding is nonrecurring in Florida’s university system, meaning that institutions are discouraged from spending the funds on anything related to salaries or benefits, for fear of putting those funds at risk in subsequent years. The official position of the state system is that “universities have discretion, but are encouraged to spend the funds on initiatives that enhance the student’s experience and improve performance on the model’s metrics” (Florida Board of Governors, 2018a). Likely one of the most notable and controversial aspects of the state of Florida’s performance-based funding model is the rule under which the bottom three performing schools receive no PBF allocations for that year. Due to the nonrecurring nature of the funding, if an institution falls into the bottom three in terms of total points, it may face layoffs or other difficult budgetary decisions if it used performance-based funding to pay for recurring costs. If an institution falls below a specified threshold (25 points or below when the scale was 50 points; 50 points or below on the 100-point scale), then additional funds are withheld for the next fiscal year, and the institution must successfully implement an improvement plan in order to ensure the return of the withheld funds (Florida Board of Governors, 2018a).

This opportunity to submit an improvement plan is available to an institution only once. If a university falls below the threshold again in a future year, after having done so previously and successfully completed an improvement plan, then that institution loses a portion of its base funding for one year. This is one of the key design aspects of the PBF policy in Florida: each university allocates a predetermined portion of its own base budget into the “pot” (totaling $295 million in 2018) alongside the state’s investment in the “pot” (totaling $265 million in 2018). The portion that each university has allocated is then put at risk if the institution falls into the bottom three.

Likewise, institutions have the opportunity to receive bonus funding if they rank as one of the top three institutions. Those three universities ultimately “receive the bonus funding based on points earned [per institution] compared to the total of points for those three institutions” (Florida Board of Governors, 2018a). That is, the model distributes the funds in proportion to their total PBF scores. The top three institutions also receive funding from the bottom three institutions, should any or all of those universities fail to successfully implement improvement plans. The top three and the bottom three are thus key aspects of the funding mechanism tied to scores in the PBF policy.
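The proportional split quoted above amounts to dividing the bonus pool by each school’s share of the three schools’ combined points. The pool size, school labels, and point totals below are hypothetical:

```python
def split_bonus(pool, points_by_school):
    """Distribute a bonus pool among the top three schools in
    proportion to each school's PBF points relative to the three
    schools' combined points."""
    total_points = sum(points_by_school.values())
    return {school: pool * pts / total_points
            for school, pts in points_by_school.items()}

# Hypothetical point totals for a hypothetical $30M bonus pool
shares = split_bonus(30_000_000, {"U1": 90, "U2": 85, "U3": 80})
# The shares sum to the full pool, with U1 receiving the largest portion.
```

The mechanism is zero-sum among the three winners: a school’s share grows only relative to the other two schools’ point totals, not in absolute terms.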

Excellence and improvement points, besides determining the final tally of points

per metric, are also part of the tiebreaking process. After simply splitting allocations in

instances when institutions attained the same exact scores in 2014 and 2015, the system

instituted a mechanism for deciding which institution should rank higher in 2016 (Florida

Board of Governors, 2015). Before that change, a tie could place both institutions into the top three or, conversely, lift both schools out of the bottom three. Accordingly, some years had more than three in the "top three" and some had fewer than three in the "bottom three." In fiscal year 2015-16, for example, Florida State University and the University of North Florida tied for the third-from-last spot with 36 points each, and both earned new funding that year (Florida Board of Governors, 2018a). The tiebreaking process currently has four phases, and the tiebreaker advances to


each subsequent phase only when a tie persists. The newly instituted tiebreakers, in order, are "1 - total of excellence and improvement scores, 2 - give advantage to higher points earned through excellence, 3 - score metric by metric giving a point to the school scoring higher, [and] 4 - if tied after three tiebreakers, the tie will go to the benefit of the institutions" (Florida Board of Governors, 2015).
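The four phases can be read as a simple cascade. The sketch below uses a hypothetical data model (per-metric excellence and improvement score lists for each school); the Board of Governors' actual calculation may differ in detail, particularly in exactly which scores phase 3 compares.

```python
# A rough sketch of the four-phase tiebreaker quoted above, under a
# hypothetical data model: each school is a dict of per-metric
# "excellence" and "improvement" score lists. Details such as which
# scores phase 3 compares are assumptions, not the official rule.

def break_tie(a, b):
    """Return the winning school, or None if the tie stands (phase 4,
    in which the tie resolves to the benefit of both institutions)."""
    def total(s):  # excellence + improvement across all metrics
        return sum(s["excellence"]) + sum(s["improvement"])
    # Phase 1: higher combined excellence-and-improvement total wins.
    if total(a) != total(b):
        return a if total(a) > total(b) else b
    # Phase 2: advantage to higher points earned through excellence.
    if sum(a["excellence"]) != sum(b["excellence"]):
        return a if sum(a["excellence"]) > sum(b["excellence"]) else b
    # Phase 3: metric by metric, a point to the school scoring higher.
    per_metric = list(zip(a["excellence"], b["excellence"]))
    wins_a = sum(x > y for x, y in per_metric)
    wins_b = sum(y > x for x, y in per_metric)
    if wins_a != wins_b:
        return a if wins_a > wins_b else b
    # Phase 4: the tie remains, to the benefit of the institutions.
    return None
```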

This extremely complicated tiebreaking process does not align with the original commitment to simplicity. The performance indicators themselves, many of them non-standardized and unique to Florida's PBF policy, have likewise grown increasingly complex over the years, as outlined in

Figure 3 and the following sections.

Technical Review of Florida’s Metrics

Figure 3. Summary table (Florida Board of Governors, 2018b)


Metric #1 – Percent of bachelor's graduates employed and/or continuing

education in the United States.

The first two metrics in the model rely on external data sources, making

them the only two metrics where institutions are not directly and completely tracking

100% of individual student outcomes. Whereas the other metrics directly measure a

specific function of each university (e.g. progressing and graduating students), these two

metrics actually measure employers' and other universities' functions – whether or not the

students are employed or pursuing further education (Florida Board of Governors,

2013a). Importantly, university results for these post-graduation metrics are a
snapshot in time, capturing the student's employment and educational status one year

after graduation. This is also one of the metrics that includes all bachelor’s graduates,

including students who entered universities as true freshmen during a summer or fall term

as well as students who entered during spring terms or as transfer students who had

preexisting credit hours from another institution but earned a bachelor's degree from a
public university in the state system.

When initially implemented in 2013, the portion of this metric related to career placement was limited to those former bachelor’s students who found full-time

employment within the state of Florida, regardless of salary rate. The Florida Education

& Training Placement Information Program (FETPIP), part of the state’s Department of

Education and external to the SUS, reports which students are employed within the state

of Florida. The outcomes for students who have graduated with a bachelor’s degree are

tracked during different points in the academic year, depending on the date of graduation,

such as the spring, summer, or fall terms. Furthermore, BOG system staff must discern if


a student is continuing education and simultaneously pursuing full-time employment, in

order to ensure an unduplicated count of student records. Given the complexity of this

tracking, BOG system staff records the student outcome for this metric, rather than

institution-level staff members collecting and analyzing the raw data.

Throughout the five years (2014-2018) of the ten-metric model, the state has greatly expanded the student records that are reported in this measure, eventually including students who found full-time employment anywhere in the United States

(Florida Board of Governors, 2015). Moving forward, these students were included in the reported files of the interstate Wage Record Interchange System (WRIS2) from the

Federal Employment Data Exchange (FEDES), which collects unemployment insurance information across state boundaries. During the first year of performance-based funding in Florida, when the state included just three metrics in the model, only 42% of bachelor’s graduates in the state system were included in the dataset (Florida Board of

Governors, 2013a). Even after the inclusion of national data, information is only available from 41 states plus the District of Columbia and Puerto Rico, though 92% of bachelor’s graduates are now included in the calculation for the system (Florida Board of

Governors, 2018a, p. 7). Populous states such as New York are excluded from the data, as are students who find employment abroad, making this an incomplete dataset that impacts some institutions more than others, especially if a university usually places graduates in particular industries or at specific companies or types of companies.

For a graduate to count as employed, in 2016 the system established a wage threshold of $25,000, which system staff members identified as the U.S. Census Bureau’s

Current Population Survey (2016) median personal income of Florida residents who are


between the ages of 25 and 29 and have attained a high school diploma or equivalent

(Florida Board of Governors, 2015). Even though additional records were included from

throughout the nation for review, those bachelor’s graduates who were earning less than

$25,000 would not count positively towards an institution's outcome. The metric is not weighted for institutions that disproportionately serve individuals who are historically underpaid, such as female and minority students. Given the national wage disparity between genders, this would seem to put institutions that serve more female students at a disadvantage – and could even disincentivize institutions from diversifying their student demographics.

In addition to employment data, the National Student Clearinghouse (NSC) provides information on whether or not a bachelor's graduate is continuing her or his education at any institution of higher education in the nation. Such data is available through the NSC, detailed by student record, but specifics regarding the nature of the educational program are not available. In other words, both undergraduate degree programs (i.e. second bachelor's degrees) and graduate degree programs contribute toward an institution's outcomes. Additionally, the number of credits in which a graduate is enrolled is not included in such a report—just whether or not bachelor's graduates are enrolled at all. Similarly, there are no records indicating whether or not the new degree program is in a discipline related to the original bachelor's degree.

Metric #2 – Median wages of bachelor’s graduates employed full-time.

Like the first metric, the collection of median wage data is complicated, drawing on multiple data sources and lacking complete information for all graduates at every university. Similarly, FEDES' WRIS2 information only includes reported student salary outcomes from 41 states plus the District of Columbia


and Puerto Rico. The system staff pulls the WRIS2 data during the fourth quarter after graduation (i.e. approximately one year later), so the information is likewise staggered according to the spring, summer, or fall term in which students graduated. Salaries in this dataset derive from quarterly unemployment insurance figures, so the system staff must annualize the data by multiplying the single-quarter figure by four. Additionally, "data does not include individuals who are self-employed, employed by the military, those without a valid social security number, or making less than minimum wage" (Florida

Board of Governors, 2018a).
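The annualization step described above amounts to multiplying a graduate's single-quarter wage record by four before taking the median across employed graduates. The wage figures below are invented for illustration.

```python
# Illustrative sketch of the wage annualization described above:
# quarterly unemployment-insurance wages are multiplied by four, and
# the median is then taken across employed graduates. All wage figures
# here are invented.
from statistics import median

quarterly_wages = [9_500, 11_200, 8_750, 12_400, 10_100]
annualized = [wage * 4 for wage in quarterly_wages]
median_annual_wage = median(annualized)  # 40_400 for these sample wages
```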

Rather than use standard deviations or some other statistical method to calculate excellence, system staff developed a process for excellence benchmarks that seeks to "ensure that schools were rewarded for reasonable performance above, at, and just below the system average" (Florida Board of Governors, 2018a). Given that averages fluctuate yearly, and that salaries largely depend on macroeconomic conditions, inflation, and other dynamic national trends, the baseline benchmark (i.e. what is worth 1 point) for the median salary of full-time employed bachelor's graduates increases on a regular basis. As such, the definition of excellence changes as well.

Metric #3 – Cost per undergraduate degree.

This third metric has changed completely during the five years in which the ten-metric model has existed in Florida. Initially, the metric was framed as the cost per undergraduate degree to the institution, in which "total undergraduate expenditures are divided by total fundable student credit hours to create a cost per credit hour for each year," then "[multiplied] by 30 (120 credit hours is the standard catalog number) to derive a 4-year average cost per undergraduate degree" (Florida Board of Governors, 2013a). This original metric used actual expenditures for each institution and rewarded decreases in cost per undergraduate degree. The focus was less on the quality of the academic experience, which arguably could positively correlate with expenditures. By tracking decreases in cost per undergraduate, the system incentivized institutions to reduce spending and enhance efficiency.
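The original arithmetic quoted above can be sketched in a few lines: each year's expenditures divided by fundable credit hours gives a cost per credit hour, which is multiplied by 30 (one quarter of the standard 120-hour degree) and accumulated across four years. The dollar and credit-hour figures below are invented.

```python
# Sketch of the original cost-per-degree formula quoted above: divide
# each year's undergraduate expenditures by fundable student credit
# hours, multiply by 30 (one quarter of a 120-hour degree), and sum
# four years. All dollar and credit-hour figures are invented.

years = [  # (undergraduate expenditures, fundable student credit hours)
    (250_000_000, 1_000_000),
    (255_000_000, 1_020_000),
    (260_000_000, 1_040_000),
    (262_000_000, 1_048_000),
]
cost_per_degree = sum(expend / hours * 30 for expend, hours in years)
# With these figures, each year's cost per credit hour is $250,
# so the four-year cost per 120-credit degree is $30,000.
```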

While there are multiple ways to calculate this cost per student, the system

chancellor instituted a workgroup in 2013 that recommended to “calculate the cost per

degree to the student, state and institution based on state appropriations and tuition”

(Florida Board of Governors, 2018a, p. 8). The system acknowledged that the workgroup’s proposed methodologies in its “Cost per Degree Work Group report” were

“not utilized” (p. 8). Effective in 2016, the performance-based funding model shifted to a cost per undergraduate degree to the student. Ultimately, the new definition included “net tuition & fees for resident undergraduates per 120 credit hours,” using the calculation of

“resident undergraduate student tuition and fees, books and supplies” along with “the average number of credit hours attempted … and financial aid (grants, scholarships and waivers) provided” (Florida Board of Governors, 2018a, p. 13). A somewhat complicated methodology resulted, particularly in regard to calculating the costs of course materials and textbooks in addition to the sticker price of per-credit instruction. Universities have the ability to earn improvement points by spending more on financial aid. This new definition would actually encourage institutions to spend more to subsidize the education for each student. The BOG system at the state level, so far, has been unable to gather enough university-level course materials data from the institutions in order to implement the “books and supplies” portion of the formula, instead relying on national averages for


these costs to the students. Interestingly, the Integrated Postsecondary Education Data

System (IPEDS) has a preexisting, standardized “net cost to the student” measure, which

was not adopted. This standard approach takes the federally reported average annual estimated cost to a full-time student (colloquially referred to as the "sticker price") and subtracts student aid, including grants and scholarships, resulting in a total net cost.

Instead, the state chose to move towards its own unique definition of cost to

student. The Florida system definition emerges as particularly complex due to the

calculation the average number of credit hours attempted per degree awarded –

incentivizing schools to shrink the number of superfluous attempted credit hours.

Institutions are accordingly encouraged to limit, for example, the number of students who

change their majors and are not able to use the credit hours earned from the first major.

Students who wish to explore multiple educational paths are likely encouraged to finish

the first chosen major and to seek a second bachelor’s degree or pursue an alternative

discipline at the graduate level. A metric called “cost to the student” then becomes

increasingly focused on student behavior rather than the rates of tuition and fees, as well

as the rate of institutional gift aid to students.

Metric #4 – First-time-in-college student graduation rate.

Graduation rates are a convenient method for measuring student success outcomes and institutional accountability. Rather than choose the federal standard definition for graduation rate, the state system adopted a unique definition for the six-year graduation rate that includes both full-time and part-time students (Florida Board of Governors,

2018a, p. 9). If students are pursuing their education at a slower rate while choosing, for instance, to work full-time, they will have a difficult time completing a 120-credit degree program within a six-year period. It would be impossible, for example, for a student completing only 18 credit hours per year to graduate within six years (18 credits per year for six years totals 108 credits, short of the 120 required), and yet students who intend to take only 9 credit hours each spring and fall are still included in this definition.

In particular, students who come from financially disadvantaged households, who

need to work full-time in order to support their parents and other family members, may

choose to enroll for fewer credit hours. Their progression will be slower, and institutions that cater to these part-time students are thus disadvantaged by this unique graduation rate, which differs from how the rest of the nation calculates the metric. Furthermore, transfer students are excluded from this metric, and while there is no penalty for institutions that

serve primarily transfer students from the two-year state colleges, there is conversely no reward specifically tied to these students.

As of 2018, the state system adopted a four-year graduation rate that only includes students who started at universities as freshmen who were full-time students, and this change was the result of a legislative bill that requires the system to “include a 4-year graduation rate metric” in the performance-based funding model (Florida Board of

Governors, 2018a, p. 11). Accordingly, economically advantaged students who begin as freshmen at the more expensive state universities, and who can afford to take 30 credits in the first year and maintain that pace each year, will complete the typical 120-credit degree program within four years.

Institutions will be rewarded accordingly, though there is a four-year delay between any

change in recruitment practices and rewards for improvements on this particular metric.


Metric #5 – Academic progress rate (second year retention with a grade point

average above 2.0).

One of the leading indicators of a high graduation rate is a high first-year retention rate among first-time-in-college freshmen. This metric considers students who return for a second fall term with at least a 2.0 grade point average. Such a metric could influence universities to alter their admissions processes in order to recruit students who are more likely to be retained – and likely to achieve at a level high enough to be included in the student records reported in this "academic progress rate" metric. This is a more immediate reward system for universities, as a one-year change in admissions processes could result in improvement on this metric in the following fiscal year. This version of the retention rate differs from the federal standard first-year retention rate primarily in its threshold of a 2.0 grade point average. The system suggests that this particular metric exists to reward

“achievement” (Florida Board of Governors, 2018a, p. 4).

Metric #6 – Percent of bachelor’s degrees awarded in areas of strategic emphasis.

In line with the State University System of Florida's 2025 strategic plan goal related to community engagement and economic advancement, this metric is an attempt to fulfill the Board of Governors' commitment to strategic workforce development. Specifically, the metric rewards institutions for awarding more degrees in programs that the board has categorized as "areas of strategic emphasis." The initial group of degree programs that the state designated as "strategic" in 2008 are in science, technology, engineering, and mathematics (STEM) disciplines, as well as critical workforce areas such as those that are health-related or


related to teacher preparation and broader education professional fields (Florida Board of

Governors, 2008). A subsequent revision to the list of areas of strategic emphasis added degree programs related to global engagement (i.e. languages) and a broader selection of critical workforce needs related to communication and finance (Florida Board of

Governors, 2013b).

The methodology for this metric is based on the percentage of degrees awarded in these areas, and accordingly, institutions that award a large proportion of “non-strategic” degree programs are at risk of not improving or achieving excellence for this particular metric. Areas not covered by the strategic designation include many programs within the popular fields of economics or general business studies, as well as many social sciences and humanities (Florida Board of Governors, 2013b). Furthermore, there are ongoing disagreements regarding what qualifies as a STEM degree program, and psychology is treated as a social science, regardless of the physiological nature or scientific experimentation that occurs throughout the curriculum for the degree program – but

“biopsychology” and “psychobiology” are both included on the list of STEM areas

(Florida Board of Governors, 2013b).

In theory, because the metric is based on a percentage, an institution could increase the raw number of degrees awarded in one of these strategic areas but not as rapidly as it increases the number awarded in non-strategic areas, resulting in a poor score for this particular metric and zero improvement points. The metric is not limited to true first-time-in-college freshmen but includes all transfer students as well. It thus not only incentivizes universities to promote particular disciplines but could also


incentivize universities to discourage liberal arts and other broader areas of study, so that the ratio of strategic-to-non-strategic does not penalize the school.
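The dilution effect described above can be illustrated with invented degree counts: strategic degrees grow in raw terms, yet the percentage falls because non-strategic degrees grow faster.

```python
# Invented numbers illustrating the dilution effect described above:
# strategic-area degrees grow in raw terms, yet the percentage-based
# metric declines because non-strategic degrees grow faster.

strategic_y1, total_y1 = 400, 1_000   # year 1: 40.0% strategic
strategic_y2, total_y2 = 440, 1_200   # year 2: 40 more strategic degrees

pct_y1 = strategic_y1 / total_y1 * 100   # 40.0
pct_y2 = strategic_y2 / total_y2 * 100   # about 36.7, despite raw growth
```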

Metric #7 – University access rate (percent of undergraduates who are eligible to receive a Pell grant).

This metric has been a crucial component for the state’s argument that the performance-based funding weighs access against the other metrics that support achievement, “so that institutions that serve a higher percentage of undergraduates with a

Pell grant are acknowledged for their commitment to students with financial need”

(Florida Board of Governors, 2018a, p. 4). Unfortunately, a four-year overview of metric scores shows that this is the only metric in which every single institution, regardless of its actual percentage of undergraduates with a Pell grant, received the full amount of excellence points available under the benchmarks (Florida Board of Governors,

2017a). Even the lowest-achieving institutions received 10 out of 10 points on this metric from 2014 through 2018. In other words, institutions that have higher university access rates are not actually “acknowledged for their commitment to students with financial need” any more than institutions that have lower access rates.

Accordingly, the Florida Legislature passed statutory language that required a more authentic reward for institutions that have higher percentages of Pell-eligible students. As a result, effective 2018, the state “requires access rate benchmarks to be differentiated and scored to reflect the varying access rate levels among the state universities, and prohibits the use of bonus points” (Florida Board of Governors, 2018a, p. 11). Formerly, the full 10 points for excellence benchmarks was set at 30% of undergraduates who are Pell-eligible. Nevertheless, the excellence benchmarks were then


readjusted with the 10-point mark at 42% due to the fact that “Florida’s population with family incomes less than $40k for ages 18-24 is 42% based on a 3 year average of US

Census data,” and “76% of fall 2016 Pell-grant students match this criteria” (Florida

Board of Governors, 2018a, p. 11). While raising the 10-point mark from 30% to 42%, the system also lowered the 1-point minimum threshold from 18.8% down to 6%. In other words, the system made full points harder to achieve while simultaneously lowering the bar for earning at least some points.

Metric #8 – Percent of graduate degrees awarded in areas of strategic emphasis.

The eighth metric, which applies to all public universities in the state with the exception of the undergraduate-only New College of Florida, follows the same methodology as the sixth metric for the percent of bachelor's degrees awarded in areas of strategic emphasis (Florida Board of Governors, 2016a). The system strategic plan defines the areas of strategic emphasis using the U.S. Department of Education's
Classification of Instructional Programs (CIP) catalog. The list does not differentiate between critical workforce needs for undergraduate-level employees in the state versus graduate-level employees.

Metric #9 – Board of Governors’ choice metric.

In an effort to recognize the unique missions of the various universities, the performance-based funding model in Florida empowers the Board of Governors to assign a specific metric to each university. Most of the universities have been assigned the percent of bachelor's degrees awarded without excess hours. The exceptions are the University of Florida and Florida State University, which are assigned the number of faculty awards, and the New College of Florida, which is assigned the number of national rankings for institutional achievement.

The majority of the schools are assigned the metric "percent of bachelor's degrees awarded without excess hours," which is based on the proportion of degrees that students earn without incurring financial surcharges for taking credits beyond what is needed to graduate. These surcharges begin at 110% of the credit hours required for the degree program, regardless of how many excess hours a student actually accumulates (Florida Board of Governors, 2017b). Even a single excess credit hour triggers the surcharge, and institutions earn no credit in this metric for graduating these students. Essentially, then, this metric is another way of measuring the rate at which students graduate (metric 4). Schools that do well on graduation rates double down on the benefits of graduating students on a timely basis, while schools that struggle to graduate students within four years are essentially twice as disadvantaged.
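The 110% trigger can be sketched as a simple threshold check. The function name, the treatment of exclusions, and the 120-hour default below are illustrative simplifications, not the statutory calculation.

```python
# A minimal sketch of the excess-hour surcharge trigger described
# above: the surcharge applies once attempted hours, net of excluded
# credits, exceed 110% of the hours the degree requires. The function
# and its simplifications are illustrative, not the statutory rule.

def triggers_surcharge(attempted_hours, excluded_hours, required_hours=120):
    countable = attempted_hours - excluded_hours
    return countable > 1.10 * required_hours

triggers_surcharge(140, 0)    # 140 > 132: surcharge applies
triggers_surcharge(131, 0)    # under the 110% line: no surcharge
triggers_surcharge(140, 10)   # exclusions bring 130 under the line
```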

Furthermore, these surcharges, by design, discourage students from going off track or spending too much time exploring various majors, but they also financially penalize students who are attempting to finish the final requirements for their degree programs. Before this metric, the burden for excess hours was placed solely on the backs of students in the form of financial surcharges. With the advent of this metric in the performance-based funding model, the state of Florida attempts to engage the institutions in the consequences for students who delay graduation beyond 110% of the necessary credit hours.

Some credits may be excluded from the calculation of excess hours, according to the statutes that established the excess surcharge mechanism. As such, “the following


types of student credits” can be removed from the calculation, including “remedial

coursework, non-native credit hours that are not used toward the degree, … credit hours

from internship programs, credit hours up to 10 foreign language credit hours,” and

military science coursework (Florida Board of Governors, 2017b). Institutions can

likewise exclude these credits from the calculated progression towards a degree that

requires the excess hour surcharge. Transfer students would be included in this metric,

though the exclusion of non-native credit hours that are not used toward the degree means

that transfer students will have more exclusions than native students who entered as first-time-in-college freshmen. If freshmen were to take credits that are not used toward the degree, those credits would count towards the excess hours calculation, but transfer students would not have that same concern. An institution could thus seek to recruit and award more degrees to transfer students, which would result in higher performance in this metric if exclusions are applied strategically to non-native coursework. Also, given the complicated calculation, there may be some discrepancy between what an institution actually charges students and what is reported to the state system.

Metric #10 – Board of Trustees’ choice metric.

The last metric is intended to reward institutions for working towards their

specialized mission (Florida Board of Governors, 2018a, p. 3). Universities have chosen

metrics related to degrees awarded to minority students – including students from historically underrepresented racial and ethnic backgrounds, such as black and Hispanic students – as well as the percent of undergraduate enrollments that are online or classified as distance learning, among other choice metrics (Florida Board of Governors, 2017b). Even


when multiple universities choose the same metric, they may choose differing

methodologies and unique excellence benchmarks. A review of historical institutional

scores on the universities’ own choice metrics shows that almost every single institution

earns the full 10 points each year (Florida Board of Governors, 2017a). Thus, even

though the state system’s claim is that the metric should reward institutional missions, if

it is rewarding all institutions equally each year, then there is little true reward for each

university. The inclusion of this metric makes very little difference in the outcomes for

performance-based funding if all universities are doing equally well in this particular

metric, so the system has committed to reviewing and improving both metric 9 and

metric 10 in the future (Florida Board of Governors, 2018a, p. 13).

The Evolving Policy Design for Florida’s Performance-based Funding

The BOG system provides regular updates to the model, which can create moving targets for institutions attempting to improve on any particular metric

(Florida Board of Governors, 2015; Florida Board of Governors, 2018a). For instance, the introduction of a $25,000 threshold for minimum salaries in Metric 1 means that one year cannot be compared to the last without recalculating prior data submissions using the new metric definition. This recalculation does not always occur within reports at the

BOG system level, ultimately resulting in misleading or less-than-reliable summary statistics regarding the progress that institutions have made in each metric (Florida Board of Governors, 2017a). Changes to the model sometimes follow legislative action. Often, as was the case with the $25,000 threshold in Metric 1, the governors themselves or their staff members recommended the updates. Florida's PBF policy thus continues to evolve through legislative and administrative processes.


While PBF policies differ from state to state, this dissertation provides a review of

the perspectives of stakeholders in Florida, both policymakers on governing boards and

policy implementers at universities, through the lens of the narratives they produce in

public documents.

Additionally, the nature of the PBF policy is itself key to using Florida as the setting for this analysis. Cornelius and Cavanaugh (2016) and Li (2017) both reviewed

Florida’s model. Accordingly, this is not the first study of the state’s PBF policy, and the

dissertation also contributes to the body of research literature regarding how extensive the

impact of the policy is on university operations. Of particular interest is how the state’s

iteration of the policy operates "within certain punitive limits," including the possibility

that institutions "will actually lose funding, as the program is partially funded by

deductions for the base funding" of the various universities (Cornelius and Cavanaugh,

2016, p. 167). In addition, Cornelius and Cavanaugh (2016) described a negative

relationship between the proportion of black students at institutions and the levels of

funding that universities earn in the PBF policy in Florida.

Also, Li (2017) included Florida as a state of interest due to the possibility of

losing funding based on outcomes. The punitive nature of the model, in which

underperformers have funds withheld until they can showcase a plan and a commitment

to improving, makes Florida unique. Other states primarily focus on rewarding

institutions that produce positive results for key performance indicators. Given the novel

approach of incorporating both bonus funds and penalties, with the carrot and the stick,

Florida’s PBF policy is worth investigating further.


The uniquely punitive model in Florida, combined with the availability of documents that reflect organizational narrative content, makes the state a suitable location for further investigation. In particular, the NPF focus on heroes and villains seems apt when dealing with a uniquely punitive policy. Florida's PBF policy design mirrors the narratives that drove the process to create and revise the policy.

While the above review showcased the complexity of the policy design from a technical viewpoint, the narrative analysis sheds light on the complexity of the PBF policy's development in the state and of compliance with it. The next chapter explores the methodological framework through which this analysis occurs.


CHAPTER 4. RESEARCH DESIGN AND METHODOLOGY

A vast amount of public data informed this dissertation, and this chapter discusses the plan for organizing and analyzing the narrative information regarding PBF policies in

Florida. In terms of research design, the dissertation follows a qualitative NPF analysis, relying on structural coding and interpretive analysis of narratives located in public documents. In addition to describing the data sources and their formats, this chapter explains how researchers identify policy narratives and produce a coherent analysis. It also considers how an interpretivist NPF lens can provide

more thorough details regarding the ways that stakeholders in Florida view and

understand the PBF policy, as well as the limitations to doing so.

Data Collection Process

The first task at hand was to develop a method for mining the narratives. For the

purposes of narrative research, public documents are particularly apt for analysis.

Especially in annual reports that use common templates and standardized questions asked of each institution or of the system itself, these types of

public documents will provide rich prose for an NPF content analysis. Public documents

have served as a common data source for NPF analyses using both qualitative and

quantitative approaches (Shanahan, Jones, and McBeth, 2018).

In the documents described in the following sections, universities respond to

template prompts regarding strategies and operations. In essence, the documents follow a


question-and-answer format. As such, these reports group information according to

relevant topics, prompting each university to provide its own institutional interpretation

of its priorities and, perhaps even more importantly, to omit what is not of particular

importance for the institution. The information that is absent, and the story that is not

told, can be just as important and informative as the story that is included.

These public reports and annual plans reflect institutional values and strategic

priorities, and it is crucial to observe whether or not institutions are focusing on the PBF

policy in these discussions. As such, these public documents are particularly revealing,

though that is not necessarily the original intent of the documents, which are somewhat perfunctory in nature: they exist to provide the political appointees who govern the universities with enough information to fulfill their fiduciary responsibility of proper oversight.

The public availability of planning and reporting documents is also one of the initial reasons for selecting Florida as the setting for the analysis. The state has operated its PBF policy for multiple years and offers public documentation of how the BOG system and the universities have strategically relied on performance models, supporting the state’s inclusion in this dissertation. Likewise, the state of Florida is an appropriate choice due to its open public records laws, with sunshine statutes that “have long been viewed by public information advocates in other states as the model of openness” (McLendon and Hearn, 2006, p. 661). Importantly, many of these public documents in the state directly address the research questions regarding who the stakeholders are and what they interpret to be the consequences of the PBF policy, as discussed below.


To pursue this research design, I will undertake document collection within the

Florida state system to investigate how policymakers and implementers articulate their values through the prescribed measures. From 2010 to 2017, each of the twelve universities produced an annual work plan and a subsequent accountability report that described the outputs and outcomes of the planned work (the system modified the format of the two documents in 2018). In other words, within the SUS, public universities annually submit

two standardized types of documents to the BOG system that are available for public

consumption and scholarly review through a narrative lens: 1) the university work plans,

which outlined broad 3-year to 5-year strategies and set annual targets for a number of

key performance indicators, and 2) the university accountability reports, which reflected

on the past year’s key achievements and reported on progress towards the goals that each

university set for itself in the work plan. The standardized format, with specific

prompts throughout the template, elicits a survey-type response on behalf of the

organizations. The corresponding responses of the universities and the BOG system,

which articulate values and strategic priorities, are not explicit reports on the status of the

PBF policy. Rather, they serve as a testament to the focus of the organizations involved in higher

education within the state.

While the state created its first PBF policy in 2013, the data from 2010 until that

point provides baseline narrative evidence, as well as documentation for how the

narratives shifted over time in the agenda setting, policy creation, and implementation

phases of the public policy process. Narratives are indicative of time and the policy

environment, and the full spectrum of data from pre- and post-PBF policy serves to

illustrate the evolution of stakeholder perspectives.


Within Regulation 2.002 (2009), University Work Plans and Annual Reports, the

BOG system listed the content that universities are supposed to include in these two types of documents. According to the regulation, both of these documents should reflect the unique mission of the institution that is developing the report, as well as the institution’s contribution to the system as a whole.

Work Plans

More specifically, Section 3 of Regulation 2.002 (2009) detailed that each university’s annual work plan will “outline the university’s top priorities, strategic directions, and specific actions and financial plans for achieving those priorities, as well as performance expectations and outcomes on institutional and System-wide goals” (p. 1). The work plan ultimately has a more forward-thinking approach, with detailed targets for a number of measures and outcomes.

Breaking down the document type into sections, the template requires each institution to list its mission and vision, as well as provide a brief statement of strategy that broadly defines how each university will work towards its unique institutional vision.

Universities also provide narrative content related to their own strengths, opportunities, and key initiatives and investments that will promote the concepts of enhanced academic quality as well as overall efficiency and the associated benefits to the state. Section 4 of the regulation specifically calls for “three additional institution-specific goals on which university effort will be focused within the next three years, the proposed strategy for achieving each goal, the metrics by which success will be measured, and any assumptions, including financial, upon which the projected outcomes are predicated” (p. 1). The vast majority of the work plan involves goal-setting and numerical targets, but


there are also opportunities for institutions to note any new academic degree programs that they may be considering in the coming years. Likewise, a comparative system work plan document aggregates the university targets into system-average targets and predicted progress towards long-term strategic planning goals.

Accountability Reports

As established by Florida State Statute 1008.46 (2018), under the section for the state university accountability process, “the Board of Governors shall submit an annual accountability report providing information on the implementation of performance standards, actions taken to improve university achievement of performance goals, the achievement of performance goals during the prior year, and initiatives to be undertaken during the next year.” Then, Section 6 of Regulation 2.002 further prescribed how the annual accountability report (termed an “annual report” in the regulation) should look, prescribing areas that Board of Governors staff would then insert into a standardized template.

The mandated sections include “summary information on budgets, enrollments, and other core resources” as well as “reports on undergraduate education, graduate education, and research and economic development, as appropriate to the university’s mission, including narrative to provide context and perspective on key goals, data trends, and university performance on metrics” (p. 2). The template for the report then echoes these required sections, with a section on key achievements from the prior academic year organized by student accomplishments, faculty awards, degree program achievements, research awards, and other institutional activities of note. Accountability reports are reflective in nature, providing detailed narratives embedded within sections on


undergraduate and graduate education, as well as sponsored research activities, service to

local communities, and major collaborations with industry partners. The sections of the

accountability report mirror the system’s strategic plan, so there is an evident focus on science, technology, engineering, and mathematics (STEM) disciplines, as well as a strongly workforce-driven framework for reporting on community engagement.

BOG System Documents, Media, and Reports

During the same 2010 to 2017 timeframe described above, the system produced its own multi-university work plan and accountability report, which

summarized and added to the goals and strategic actions of the individual university

reports. In essence, these reports collected and displayed key performance indicator

outcomes for all of the various universities.

These reports include annual system work plans and annual system accountability

reports, though the majority of the information contained within these documents is not in narrative format but is specifically quantitative in nature. As a system, the BOG

produces prose within many of its performance-based funding documents, including

annual summaries of university initiatives that utilize performance-based funding allocations, frequently asked questions (FAQ) documents, system-wide marketing materials and press releases, approved changes to the performance-based funding metrics and models, as well as definitions for the metrics themselves. For the purposes of narrative analysis, this dissertation supplements these public documents with other records from 2010 to 2017 such as BOG brochures and information briefs, to collect as robust a group of narrative elements and strategies regarding PBF policies as possible.


Importantly, the BOG system posted to its website at least 23 individual media

reports, including press releases and clippings from newspaper articles and op-ed pieces.

While some of these media reports focus on individual universities, they are included in this section of the analysis because the BOG system considered them noteworthy enough to post publicly at www.flbog.edu/board/office/budget/performance_funding.php (the

BOG system’s PBF policy web page). Spanning four years from the introduction of the

10-metric model in 2014 through 2018, these media reports provide insights regarding the narrative elements and strategies with which the BOG system associates itself.

Identified Policy Narratives: Elements, Codes, and Themes

The foremost responsibility of this dissertation will be identifying the

development of policy narratives, which Shanahan, McBeth, and Hathaway (2011)

suggested evolve “when the author or group strategically constructs the story to try to win

the desired policy outcome” (p. 375). For the purposes of this dissertation, Shanahan et

al. (2013) provide a standard system for coding narrative elements according to a

preexisting tradition. Narrative elements and narrative strategies thus serve as

components for overarching policy narratives, as well as codes in NVivo for subsequent

investigation and analysis. Overarching stories (or series of competing stories)

cumulatively develop from these coded elements and strategies. These pieces all reflect

the underlying narrative perspectives of the stakeholders engaged in the policy process.

For instance, particular stakeholders attempt to position themselves as heroes and

categorize others as villains. Similarly, the interpretations of the stakeholders contribute

to their understanding of the policy and its implications. The plot of the narrative, in

addition to the setting and moral of the story, also reflect the narrative perspectives of the


stakeholders. Table 1 breaks down this particular policy narrative coding system, with the

items appearing in the same order as they do in Shanahan et al. (2013).

Table 1. Policy narratives (Shanahan et al., 2013, p. 459)

Narrative Elements
  Causal Mechanism: A theoretical relationship denoting a cause and effect relationship between one or more independent variables and a dependent variable. Common causal relationships include intentional, mechanical, inadvertent, and accidental (Stone, 2012).
  Characters: The participants in a policy narrative.
  Hero: The entity designated as fixing or being able to fix the specified problem.
  Victim: The entity hurt by a specified condition.
  Villain: The entity responsible for the damage done to the victim.
  Evidence (setting): Support offered with the intention of demonstrating a problem, usually pertaining to real world fixtures in the problem environment.
  Moral of the story: A policy solution offered that is intended to solve the specified problem.
  Plot: A story device linking the characters, evidence (setting), causal mechanism, and moral of the story (policy solution). Common plots include decline and control (Stone, 2012).
  Statement of a problem: A policy narrative is always built around some stated problem.
Narrative Strategies
  Angel shift: A policy story that emphasizes a group or coalition’s ability and/or commitment to solving a problem, while de-emphasizing the villain.
  Containment: A policy story depicting diffused benefits and concentrated costs that is intended to dissuade new participants and maintain the status quo.
  Devil shift: A policy story exaggerating the power of an opponent while understating the power of the narrating group or coalition.
  Expansion: A policy story depicting concentrated benefits and diffuse costs that is intended to draw in more participants and expand the scope of conflict.
  Policy Beliefs: A set of values and beliefs that orient a group and/or coalition.


Coding is an analytical decision-making exercise (Elliott, 2018). The Shanahan et

al. (2013) standards reflect a systematized approach, highlighting the structuralism that lies at the root of NPF scholarship (in some ways including this dissertation), as opposed to broader interpretive narrative methodologies. As alluded to previously, this dissertation attempts to add to the NPF methodological framework, diverging from the quantitative methods that NPF scholars typically use and instead offering descriptive and comparative narrative analyses in order to contextualize the findings.

While NPF may seek to generalize, this dissertation seeks to gain unique insights within the context of the PBF environment (i.e. the state systems). NPF thus serves as a jumping-off point from which this dissertation can use the established narrative elements and strategies—as outlined by Jones and McBeth (2010) and Shanahan et al. (2013)—to code and analyze raw public documents.

A sample of this coding framework in action appears in Appendix B, using the

2014 work plan from Florida Agricultural and Mechanical University (FAMU). The majority of the coding took place in the narrative sections within the work plan document template, which, as noted previously, establishes notions of mission, vision, and strategic priorities for the institution. After coding all university work plans and accountability reports, as well as system reports and other documents, a descriptive analysis of the coded data provides insights into BOG system-level versus university-level stakeholders, particularly with regard to the importance and efficacy of PBF policies.

A clear example of a qualitative NPF analysis appeared in O’Bryan, Dunlop, and

Radaelli (2014). That analysis focused not on the quantitative significance of narratives in the policy process but rather on “how narratives were


deployed, and to what effect” (p. 105). The authors tracked narrative elements in public

records of official legislative hearing minutes for further scrutiny and qualitative analysis

of narrative developments, providing an ideal coding and analytical template for this

project. The fact that they used standardized minutes, with some exceptions, also

supports defining the scope of this dissertation with a focus on particular public records.

In many ways, statements made in public elevate the seriousness of the commitments

that are made, given the openness and extensiveness of the documentation. Various

constituent groups, as well as oversight bodies, can point to these public records and

compare the different responses from individual universities and the BOG system itself.

Thus, some comparative analysis would be appropriate to understand how different

organizations treat narratives in different manners.

Likewise, Gray and Jones (2016) noted that qualitative studies in NPF can leverage some of the empirical tools that prior quantitative studies have developed. More specifically, they argued that “NPF provides theoretical means to disaggregate the

component parts of competing policy narratives, examine how they vary, and identify

patterns” (p. 193). Instead of public documents, their source of data was semi-structured

interviews, pulling narrative elements out of the primary sources. Gray and Jones then

suggested that their findings were more enlightening than previous quantitative NPF

studies, as the personal insights of their interview subjects were highlighted. They

explained, “such exposition illustrates several facets of expression and equality policy

narratives that if left to the devices of ‘normal’ science-based NPF would have been

missed,” which include items such as private motivations and elite insights and privileges


(p. 215). In a quantitative NPF analysis, this information may not be as apparent without an appropriately normalized population or a large enough sample size.

Certainly, the NPF approach also empowers researchers to collect qualitative data, through semi-structured interviews or surveys, and then to categorize and analyze the results in a numerical fashion. For example, McMorris, Zanocco, and Jones (2018) developed scales and conducted an ordered logistic regression analysis. Their findings reflected the perceived levels of narrative persuasiveness, and they rightly described their work as qualitative in nature; however, they also stated that they used “a mixture of qualitative and quantitative tools” to conduct their analysis (p. 792). The flexibility to organize and assess datasets of survey respondents’ opinions, while a benefit for some researchers, does not provide the same deep, contextual analysis of the guiding factors of organizational-level narratives, such as core values and policy beliefs.

Instead, the analysis for this dissertation follows the qualitative procedures of

O’Bryan, Dunlop, and Radaelli (2014) and Gray and Jones (2016). Coded data fall under two categories to limit the investigation to the cornerstone themes of Florida’s PBF model: a) career placement outcomes and b) academic success in student progression and degree completion. The dataset largely excludes references to the research enterprises and other prose content that is unrelated to the PBF policy. Additional themes could arise, or these themes may evolve, throughout the coding and analysis process, but the aim of the study is to limit the number of themes to ensure the analysis remains focused.

Qualitative Review

With the coded data grouped into career placement and academic success themes, a code-by-code qualitative review explores emerging narratives and narrative strategies,


as listed in the Shanahan et al. (2013) framework in Table 1. Data from multiple institutions, as well as the BOG system itself, appear throughout the analysis, permitting comparisons and contrasts between stakeholders’ use of narrative elements and strategies.

The analysis itself organizes elements according to the aforementioned themes of career placement and academic success, presenting selections of coded public documents alongside descriptive interpretations regarding the role that each narrative element and strategy plays in developing a full policy narrative. These interpretations contribute to a

“thick description” in the qualitative analysis that seeks to interpret the meaning of the stakeholders’ language, much in the way that Clifford Geertz (1973) used the term to differentiate between the thin description of the physical act of closing one’s eye and the thick description of suggestively winking. The NPF coding mechanism will showcase the structure of the policy narrative in a standardized format, but it is the subsequent qualitative review that digs into the prose, extracting meaning through the context of a larger policy narrative.

Perhaps even more telling than the most overt narrative strategies will be the statements and narrative elements that stakeholders fail to mention. For example, Pell-eligible and minority students suffer from historically lower academic success rates than their more affluent, white counterparts (Webber and Ehrenberg, 2010). Are institutions referencing underrepresented populations when discussing graduation rates? Is the university system as a whole not prioritizing the essential function of providing access to these groups of students?

This dissertation, then, seeks out the organizational stories that describe the objectives and strategies of PBF models, as well as the assumptions and values that


define their successes or failures according to these narratives. This would supplement

quantitative studies that relied largely on surveys from public managers regarding why

they used performance data (Moynihan and Pandey, 2010; McLendon et al., 2006).

The root question of this dissertation, regarding whether or not the PBF policy

specifically in the state of Florida is working, is an oversimplification of a much more

complicated question. All universities are different, with unique priorities, and those

priorities are then different from the BOG system at the statewide policymaking level.

Institutions might value quick graduations in order to produce more alumni who contribute to the state, but socioeconomic status might influence the rates at which particular demographics of students graduate. Surely the state’s PBF policy has had successes, but unintended consequences remain; how one institution prioritizes the issue of increased socioeconomic access to public higher education serves as just one example of how those successes may be mixed with failures. The PBF policy in Florida, as examined in Chapter 3, is remarkably complicated, despite its self-proclaimed guiding

principle to remain simple. As the analysis will exhibit, the narratives associated with it

are equally as complicated.

By grouping narrative content according to the standardized elements and

strategies of Shanahan et al. (2013), the analysis then progresses through the coding

results, forming the analytical pieces of a larger policy narrative discussion. Again, the

order of the analysis follows the same order that Shanahan et al. (2013) use to present

their structural components of narratives. Moving through each element and strategy, the

analysis strives to compile overarching stories (or a limited number of large overarching

stories) in the form of comprehensive policy narratives that reveal the connection


between the narrative elements in the data and the public policy process itself. In the

discussion and conclusions in Chapter 7, the findings will attend to the research questions regarding stakeholders and their views on the PBF policy.

Limitations

This type of qualitative review may not produce replicable results, but the findings will hopefully reflect the type of overarching policy narrative that tells a comprehensive story, full of characters, setting, and plot. Admittedly, as a higher education administrator in the state of Florida who works intimately in the area of

strategic planning and policy analysis, I possess many biases as a researcher with prior

knowledge of PBF policies, Florida’s university system and particular model for PBF, the

stakeholders involved, and the overall state of higher education policy.

Conversely, my personal experiences provide expertise and unique insights that

would otherwise be unavailable to scholars without the same years of hands-on PBF

policy implementation. While certain narrative elements are blatantly recognizable, stakeholders may use jargon or less forthcoming language in the public documents to describe the activities at their institutions or at the state system. With more than ten

years of experience as a public higher education administrator, I am familiar enough with

the policies and processes in Florida’s university system to be able to detect more

nuanced narrative elements and strategies.

Similarly, certain portions of the public documents are perfunctory in nature,

standardized across the various universities, or otherwise unnecessary for coding and

review. This dissertation attempts to recognize these instances and avoid their inclusion

in the analysis. The qualitative review provides more opportunities for explanation of


these nuances and is thus a valuable contribution to the study, notwithstanding the limitations that these personal experiences present.

Additionally, while Saldaña (2009) recommends multiple coders in order to

enhance the internal validity of such a research project, this dissertation is an independent

study that relies on the coding decisions of only one individual. Ideally, the analysis

would one day benefit from intercoder agreement, and future

research may have the opportunity to engage additional scholars in the coding activities.

Coding Framework

Despite the tendency of NPF to embrace a standardized narrative structure, qualitative coding is an interpretive exercise, as the decision-making process for assigning codes reflects the cultural lens and traditions that set the “deep structure” for how the analyst perceives each coding instance (Prasad, 2005, p. 102). In the process of coding the documents for this analysis, my own understanding of the PBF policy narratives evolved for each of the structural narrative elements and narrative strategies.

For instance, prior to reviewing public documents, one might assume that the heroes

noted by universities or the BOG system were acting as champions for students. Instead,

it became clear that this type of character included those individuals who were acting as

champions for the public, or taxpayers, more generally. The actual individual who served

as the hero might have remained the same; instead, it was the particular code itself

that became more inclusive.

The descriptions in Table 2 represent an application of the standardized NPF

elements and strategies in the context of PBF policies, appearing in the same order used by Shanahan et al. (2013).


Table 2. Detailed coding framework as applied to the PBF policy analysis

Narrative Elements

Causal Mechanism
  Shanahan et al. (2013): A theoretical relationship denoting a cause and effect relationship between one or more independent variables and a dependent variable. Common causal relationships include intentional, mechanical, inadvertent, and accidental (Stone, 2012).
  PBF policy narrative application: Any reference, whether at the BOG system level or at the individual universities, referring to a claim of a direct or indirect causal impact of a particular policy, strategy, or initiative.

Characters
  Shanahan et al. (2013): The participants in a policy narrative.
  PBF policy narrative application: The participants in a PBF policy discussion.

Hero
  Shanahan et al. (2013): The entity designated as fixing or being able to fix the specified problem.
  PBF policy narrative application: Examples of PBF policy-related achievements by the BOG system or the individual universities, particularly in the arena of the academic or career successes of all students or of certain demographics of students.

Victim
  Shanahan et al. (2013): The entity hurt by a specified condition.
  PBF policy narrative application: Examples of limitations or other forms of suffering that limit success in institutional or BOG system-defined outcomes for PBF policies.

Villain
  Shanahan et al. (2013): The entity responsible for the damage done to the victim.
  PBF policy narrative application: Examples of the external forces that impose the aforementioned limitations or other forms of suffering.

Evidence (setting)
  Shanahan et al. (2013): Support offered with the intention of demonstrating a problem, usually pertaining to real world fixtures in the problem environment.
  PBF policy narrative application: Observational data to reinforce the unique context or to provide additional understanding of a stated problem. Stakeholders rely on this element for providing supporting documentation of obstacles or major concerns. Oftentimes, they cite numerical data to provide sufficient real-world evidence to support the existence of an obstacle or major concern.

Moral of the story
  Shanahan et al. (2013): A policy solution offered that is intended to solve the specified problem.
  PBF policy narrative application: A broad strategy or general statement claiming to engage topics, such as academic or career success of students, which are key components of PBF policies (e.g. institution goal/vision statements or other statements of strategy). The moral of the story is not a specific tactical policy solution but rather a much larger proposed strategic framework for engaging a PBF policy obstacle or challenge, such as low graduation rates or low career placement rates.

Plot
  Shanahan et al. (2013): A story device linking the characters, evidence (setting), causal mechanism, and moral of the story (policy solution). Common plots include decline and control (Stone, 2012).
  PBF policy narrative application: Specific examples of initiatives or work being done in order to engage topics, such as academic or career success of students, which are key components of PBF policies. Often directly follows statements of valiance (hero characters) or victimization (victim characters), causal mechanisms, or broader strategies/statements claiming to engage student success (policy statements). Plot here supports these claims of working towards a solution. These are very specific initiatives that would be subordinate to the broader moral of the story.

Statement of a problem
  Shanahan et al. (2013): A policy narrative is always built around some stated problem.
  PBF policy narrative application: An explicit deficit or concern articulated by the BOG system or a university, which prohibits improved academic or career outcomes for students. These are usually challenges or obstacles that would prevent stakeholders from fully complying with the PBF policy.

Narrative Strategies

Angel shift
  Shanahan et al. (2013): A policy story that emphasizes a group or coalition’s ability and/or commitment to solving a problem, while de-emphasizing the villain.
  PBF policy narrative application: A policy story that attempts to highlight ways in which a university or the BOG system rises above peer groups, usually with a reference to ranking above those peers or otherwise downplaying their comparable successes.

Containment
  Shanahan et al. (2013): A policy story depicting diffused benefits and concentrated costs that is intended to dissuade new participants and maintain the status quo.
  PBF policy narrative application: A policy story that attempts to slow a university’s response to the development of the state’s PBF policy to maintain the status quo or, in contrast at the BOG system level, attempts to overlook failures or glitches in the PBF policy to maintain the status quo.

Devil shift
  Shanahan et al. (2013): A policy story exaggerating the power of an opponent while understating the power of the narrating group or coalition.
  PBF policy narrative application: A policy story that attempts to downplay the BOG system’s or a university’s ability to increase outcomes in PBF policy, while highlighting the ability of others to succeed in the model.

Expansion
  Shanahan et al. (2013): A policy story depicting concentrated benefits and diffuse costs that is intended to draw in more participants and expand the scope of conflict.
  PBF policy narrative application: A policy story that attempts to draw in more partners to spread the responsibility for PBF policy outcomes (both in terms of successes and failures).

Policy Beliefs
  Shanahan et al. (2013): A set of values and beliefs that orient a group and/or coalition.
  PBF policy narrative application: A description of the BOG system or a university’s core values and beliefs in regards to academic and career success for students (e.g. institutional mission statements).


While this dissertation is definitively not engaged in a quantitative analysis, the summary report in Table 3 below is worth consideration at the most superficial of levels.

These tallies reveal comparable trends regarding the presence of particular narrative elements and strategies within the public documents.

Table 3. Summary table of coding instances

Table columns, left to right: BOG System Accountability Reports and Work Plans (n=16); BOG System Annual Documents and Media (n=25); University Annual Reports (n=76); University Annual Documents (n=85); All Documents Reviewed (n=202)

Narrative elements
Causal mechanism: 0, 7, 76, 28, 111
Hero: 22, 13, 304, 74, 413
Victim: 0, 5, 24, 6, 35
Villain: 0, 2, 1, 1, 4
Evidence: 6, 0, 14, 3, 23
Moral of the story: 11, 0, 134, 175, 320
Plot: 8, 0, 345, 171, 524
Statement of problem: 3, 1, 21, 45, 70

Narrative strategies
Angel shift: 10, 8, 93, 32, 143
Containment: 3, 2, 12, 7, 24
Devil shift: 3, 1, 2, 5, 11
Expansion: 8, 2, 31, 18, 59
Policy beliefs: 7, 0, 54, 91, 152

TOTAL: 81, 41, 1,111, 656, 1,889

When plotted on two separate y-axes, one reflecting coding instances in the documents from the BOG system and one reflecting coding instances in the documents from the individual universities, the frequencies of coding instances for each narrative element aligned proportionately, regardless of stakeholder (see Figure 4). The major difference in frequency appeared in the plot code, which is addressed later in this analysis. More importantly, this dissertation gives equal consideration to a) the categories with only a handful of coding instances and b) the categories with hundreds of coding instances.

Something significant might have occurred in those few coding instances. The next section presents a code-by-code review of the qualitative data. It serves as a full NPF analysis of the narrative elements and narrative strategies, carefully exploring the most noteworthy coding instances.

Figure 4. Frequency of coding instances (plotted on two y-axes: left, coding instances in documents from the individual universities; right, coding instances in documents from the BOG system)

Utilizing the above coding framework and application of NPF, the next chapter strives to overcome the stated limitations and even benefit from them when appropriate.
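As a cross-check on the proportional alignment just described, the Table 3 tallies can be compared directly. The following sketch is purely illustrative and is not part of the dissertation's qualitative method: it pools the two BOG system columns and the two university columns from Table 3, then asks which code's share of its own corpus diverges most between the two stakeholders.

```python
# Coding tallies transcribed from Table 3, pooled as (BOG system, universities).
table = {
    "Causal mechanism": (7, 104),
    "Hero": (35, 378),
    "Victim": (5, 30),
    "Villain": (2, 2),
    "Evidence": (6, 17),
    "Moral of the story": (11, 309),
    "Plot": (8, 516),
    "Statement of problem": (4, 66),
    "Angel shift": (18, 125),
    "Containment": (5, 19),
    "Devil shift": (4, 7),
    "Expansion": (10, 49),
    "Policy beliefs": (7, 145),
}

bog_total = sum(b for b, _ in table.values())   # 81 + 41 = 122
univ_total = sum(u for _, u in table.values())  # 1,111 + 656 = 1,767

# For each code, compare its share of the BOG corpus with its share of the
# university corpus; the largest absolute gap marks the biggest divergence.
gaps = {code: u / univ_total - b / bog_total for code, (b, u) in table.items()}
largest = max(gaps, key=lambda code: abs(gaps[code]))
print(largest)  # prints "Plot"
```

Run against the tallies above, the largest divergence falls on the plot code, matching the observation that plot was the one element whose frequency did not align across the two sets of documents.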

The coding and analysis in this dissertation aim to comb through all the public documents to organize the various stakeholders’ narrative elements, strategies, and policy beliefs.

Eventually, the results will reveal overarching policy narratives regarding PBF policy development and implementation in the context of Florida.


CHAPTER 5. ANALYSIS OF FLORIDA BOARD OF GOVERNORS SYSTEM

The following analysis relies on qualitative NPF traditions, broken down by narrative elements, narrative strategies, and policy beliefs – in the same order in which they appear in Shanahan et al. (2013). First, the dissertation considers the narratives of the BOG system in its various reports and documents, primarily in its role as a policymaking body. In the next chapter, narrative components from the individual universities drive a layer of analysis regarding the implementation of PBF policy in Florida. While not exhaustive, the analysis strives to compare and contrast the results, investigating the relative outliers just as closely as the more frequently occurring coding instances.

Causal mechanism

Shanahan et al. (2013) referred to the narrative element causal mechanism as “a theoretical relationship denoting a cause and effect relationship between one or more independent variables and a dependent variable” (p. 459). With that baseline understanding, such a narrative element was central to the argument espoused by the state-level stakeholders. To support the argument that the adoption of the PBF policy was directly responsible for increased performance at the various public universities throughout the state, the BOG system referred to the policy itself as the causal mechanism.


If the PBF policy served as the independent variable, then the relative success of the various institutions, primarily in the form of final ranking status in the PBF model, served as the dependent variable in this causal mechanism. The BOG system used this approach to reinforce the overall effectiveness of the policy, pointing to particular instances of successful correlation between the advent of the state’s PBF policy and increased institutional performance. The reliance on this particular narrative element thus glossed over the possibility that such increases were a regular part of the operation of institutions of higher education – or that other institutions found it increasingly difficult to escape the grasp of the “bottom three” in the final rankings, year after year.

Of the instances of causal mechanism coding in this dataset, the BOG system documents contained only 7 of the 111 total identified uses of this narrative element.

Rather than occurring in formal settings, such as a BOG-approved system-wide accountability report or work plan, the causal mechanism appeared in press releases, news media, or informal bureaucratic documents compiled by system staff members.

For example, a question-and-answer session with the Palm Beach Post in 2014 included a question for the state chancellor, who serves as the chief executive officer for the system, about the appropriate proportion of performance funding levels compared to regular recurring base funding. The BOG system chancellor, Marshall Criser III, responded, “The simple answer is that performance funding should be a large enough percentage of university budgets so that it incentivizes universities to focus and improve on the 10 metrics we’ve identified” (Christie, 2014). The chancellor articulated a direct link between the financial standing of the institution and its outcomes on the state’s key performance indicators.


Also in 2014, after the BOG system released the first round of results of the 10-metric version of the state’s PBF policy, Criser (2014) penned, in a letter to the editor of the Bradenton Herald, that the PBF policy “rewards the universities that focus to achieve year-over-year improvement, requiring universities to compete against themselves rather than each other.” The increased performance he described is thus key to the BOG system’s understanding of its own PBF policy, as opposed to a model that would merely reward sustained, comparative excellence.

More specifically, the BOG system used the causal mechanism narrative element to describe the type of entrepreneurial turnaround story that universities could represent.

If institutions were ranked very low in the first few years of introducing a PBF policy in Florida, then the BOG system could use any subsequent successes to laud the effectiveness of the state’s particular PBF model and the direct responsibility that PBF policies have for producing such successes. In other words, within the confines of the state of Florida’s PBF policy, all of the universities have the ability to lift themselves up by the proverbial bootstraps, as noted in one frequently asked questions (FAQ) document published by the BOG on its website:

Universities will need to be strategic in the investment of performance funds to focus on improving metrics. For example, a university could choose to invest in improving internship opportunities within the disciplines that perform the best on these postgraduation metrics, and other career center efforts. (Florida Board of Governors, 2018a, p. 5)

In one instance, Tom Kuntz, who was then the vice-chair of the BOG system, proclaimed, “The headline writes itself: Performance funding works” (Isern, 2015). This turn of phrase came from someone noted by the Tallahassee Democrat as “one of the key architects of the metrics system” for the PBF policy (Dobson, 2016). In this particular circumstance, Kuntz cited the turnaround story of the University of West Florida (UWF) as evidence to deploy the causal mechanism narrative element. He noted, “Whether through increased communication with students or new faculty in high-demand workforce areas, UWF found smart ways to invest in its own success” (Isern, 2015). The next year, in 2016, Kuntz was elevated to the role of chair of the BOG system, and he explained, “This year’s performance outcomes demonstrate once again that incentive-based funding works in higher education” (Dobson, 2016). The idea that the PBF policy works is important to its own growth in funding and impact. The deployment of the causal mechanism narrative element thus catalyzes a self-perpetuating phenomenon for the policy itself. Without documented success linking the incentives and disincentives to future outcomes, the BOG system could not ensure future expansion of the PBF policy.

Hero

Following the same line of reasoning, the BOG system repeatedly invoked the hero narrative element in order to reify itself as a savior in terms of bringing forth a model to catalyze performance enhancements. The heroics were not necessarily directed at any particular entity – the taxpayer, the universities themselves, and the students were all recipients of the heroism.

In some instances, the hero was clearly an individual, such as the system chancellor, Marshall Criser III, whose professional background was in business rather than in higher education or elsewhere in the public sector. After his appointment, the Palm Beach Post published in 2014 this mini-biography in the introduction to a question-and-answer feature:

To implement this more ‘businesslike’ funding model, Florida’s Board of Governors tapped Marshall Criser III as its new chancellor in November [2013]. The Palm Beach County native, and UF graduate, is well-known in the state’s business and political establishment. With more than 30 years with AT&T Florida (and BellSouth and Southern Bell before), plus stints as chairman of the Florida Chamber of Commerce and Florida Council of 100, Criser must sell everyone from lawmakers to university officials to parents and students on this new approach. (Christie, 2014)

This description of the chancellor was more reminiscent of a clever outsider, ready to use his corporate acumen for the good of the state, as opposed to the typical character of an experienced administrator who was versed in academic protocol and able to navigate the complex bureaucracies of public higher education. In many ways, his external perspective reflected that of the political appointees on the Board of Governors themselves – principled in the rational traditions of the for-profit sector and engaged in a good faith effort to likewise transform public higher education, all in a quixotic attempt to save it. The hero character here reflected an individual in his attempt to champion PBF policies and the philosophical “businesslike” motivations behind them.

Another hero that appeared in the BOG system documents, including in press releases, was Governor Rick Scott, often mentioned in the same breath as the Florida Legislature. Then-vice chair of the BOG system Tom Kuntz praised the support of Governor Scott, citing his support as the reason that “the State University System can now implement its plan to raise all of our universities to a new level of efficiency, accountability and academic quality” (Davis, 2014b). In other words, the BOG system used a press release to assert that it was the heroic leadership of Governor Scott that resulted in the enactment of change, rather than any nationally recognized best practices or institution-level interventions based on the instructional experiences of faculty.

Like the earlier reference to Chancellor Criser, this narrative element derived from the governor delivering a PBF policy that was based on businesslike concepts of efficiency and accountability, with academic quality serving as a literal afterthought.

Furthermore, the 2014 BOG system press release did not dig into the purpose or intended outcomes of the PBF policy. There was a clear reference to how Governor Scott wanted “Florida’s public universities to be among the best in the nation,” although no clear explanation for what “the best” meant in this context (Davis, 2014b). Whatever it was, Governor Scott was going to make it happen. The focus was on how the governor championed responsible spending of taxpayer dollars – an admittedly important topic; however, the substance of the PBF policy, such as its overarching goals and key performance indicators, is overlooked in favor of valiant descriptions of those individuals who provided it to the state.

These remarks contributed to an overall characterization of the state’s political leaders as heroes. In the same press release, then-chair of the BOG system Mori Hosseini claimed that “Our Governor and Legislature understand the importance of investing strategically in our universities and students” and are accordingly using the state’s PBF policy to “[ensure] that today’s students have the opportunity to succeed in tomorrow’s workforce,” while Chancellor Criser declared that “Our system is better off because of [Governor Scott’s and the Florida Legislature’s] leadership” (Davis, 2014b). Again, casting these individuals in this light was self-serving for the original crafters of the state’s PBF policy. Lauding public financial allocations would likely have led to additional investments, so this press release actually served as one of the most overt forms of narrative character development in any of the BOG system documents during the time period reviewed.

When the reports and other documents from the BOG system did not characterize specific individuals or groups using hero narrative elements, they often referred to “the Board,” “the System,” or the PBF policy itself as the hero. For example, the BOG system accountability reports regularly referred to how “the System continues to be ranked in the top ten nationally for six-year graduation rates” (Florida Board of Governors, 2013c). Upon improvement in this particular comparative measure, the discussion eventually shifted to how “the State University System of Florida six-year graduation rate is ranked 1st compared to the ten largest states” (Florida Board of Governors, 2016b). This enhanced standing in comparison to other similar-sized states served as an example of heroics in efficiency, having ensured completion rates were admirably high.

In another example, the BOG system published a press release in which it promoted the PBF policy’s positive reception during a panel presentation at a conference for municipal financial analysts – a group of individuals who arguably have limited understanding of complex higher education financial policy. The chief academic officer for the BOG system shared how the state’s PBF policy was unique in the nation in how it “established a minimum acceptable level of performance for universities by requiring them to put existing funds at risk” (Davis, 2014a). The BOG system thought its model was so heroic that it published an entire press release about a panel presentation at a peripherally-related professional conference. Despite its reliance on the hero narrative element, such an inflation of self-importance, evident in this example alone, signaled an insecurity in the BOG system and its need to justify the policy.

To attend to concerns about access to higher education, the BOG system similarly framed itself as a hero character in its adjustments to the particular access-related benchmark concerning the percentage of students enrolled at a university who are Pell-eligible. By adjusting the number of points that the BOG system awarded for attaining specified Pell demographic thresholds, the FAQs document noted, the benchmark was “representative of the state’s population of low income families and continues to reflect the Board’s policy of encouraging the institutions not to decline in this metric” (Florida Board of Governors, 2018a, p. 11). As noted previously, the PBF policy historically did not reward institutions that were particularly high performers on this metric. Accordingly, the BOG system’s interpretation of its own actions reflected that of a hero for the state’s low-income families. The BOG system was just as willing to laud its own valiance as it was that of its elected and appointed leadership.

Victim

While the BOG system regularly used more positive narrative elements to portray the strength of the PBF policy and itself as a system, the public documents similarly showcased a tendency to avoid characterizing the state’s political leadership in a negative light. Accordingly, the victim element only appeared in these documents in reference to individual universities, and in those cases, the documents themselves were neither BOG-approved nor formal in nature. This further highlighted the avoidance of references to victimization at the system level.

In the coded data, the instances of the victim narrative element arose in three documents, two of which were produced externally from the BOG system, such as in quotes from university-level administrators in newspaper articles. Importantly, the BOG system decided to include clippings from these media reports in its highlights on the state’s PBF policy web page. At the BOG system level, policy makers pointed to media reports and op-eds that focused on the individual university-level experience with the PBF policy, thus coopting the narratives.

For example, New College of Florida’s vice president of finance and administration, John Martin, highlighted in 2015 in a Herald Tribune article how close his institution was to escaping the near-arbitrary threshold by which it landed in the “bottom 3,” ultimately meaning that his institution had lost any chance at earning performance funds that fiscal year:

‘Even though we had 35 points, we were one point behind Florida State University and the University of North Florida,’ Martin said. ‘So, doggone it, if we had one more point we would have been in the money.’ (Webb, 2015)

Admittedly, the mention of the margin of one point was not so much victimization as it was an expression of frustration with the loss itself. In fact, Martin seemed quite tongue-in-cheek in his response to the institution’s suffering. Part of that lightheartedness surely derived from the institution having escaped an additional penalty at the threshold of 25 points, at which point New College of Florida would not just have had performance funds withheld but would also have, for one fiscal year, forfeited roughly 10% of its base budget. Having already completed an improvement plan, the school would not have had the opportunity to earn back these funds.

The suffering that Martin reflected on here did not result in the type of institutional crisis that would surely have occurred had the college faced a substantial budget reduction like the one it avoided. Still, this served as a form of suffering, and so the instances nevertheless resulted in a victim classification. Such a lighthearted approach to this “loss” was likely why the BOG system staff members decided to include this particular newspaper article in their selected media coverage of the PBF policy, as a less serious victim deflected from the more serious consequences of the model.

In addition to the direct example above, the documents produced indirect references to victimhood. In a News Press article in 2014, prior to the official release of the first year of PBF policy outcomes and institution rankings, the administrative leadership at one of the state’s two preeminent universities, Florida State University (FSU), treated the “bottom 3” schools as victims:

Even presidents of Florida’s largest schools have concerns. Florida State University President Eric Barron worries that the lowest scoring institutions will stay near the bottom year after year, potentially losing money annually as a result of their shortcomings. That money then would be redistributed to other universities. (Breitenstein, 2014)

As mentioned in the previous example, the year in which Barron commented on the model was the year New College of Florida first ranked in the “bottom 3.” This ranking eventually occurred three years in a row – in 2014, 2015, and 2016. The news article depicts a sense of pity in the FSU president’s comments, as well as concern that the PBF policy structurally impeded the future success of the “bottom 3.”

Two additional examples of New College of Florida having served as a victim appear in the BOG system’s frequently asked questions (FAQs) document. In the first instance, one of the frequently asked questions is: “Will the performance-based funding model drag down the pre-eminent institutions and New College, which is considered a top liberal arts college?” (Florida Board of Governors, 2018a, p. 4). The suggestion here was that the PBF policy could have made a victim out of this unique institution that boasted a national record of excellence, albeit not necessarily in the key performance indicators with which it is evaluated in the PBF metrics model. The inclusion of this question in the full list of FAQs implied that the BOG system was nervous about this characterization of a national gem as being “dragged down” by the policy. Of course, the BOG system staff response to this inquiry was “no,” with an explanation that the state of Florida’s brand of PBF policy was equally focused on improvement and on excellence, and so any institution had a shot at success.

The same theme appeared again in the subsequent example of the victim narrative element in the document, which described an ad hoc methodological change that occurred for New College of Florida (NCF). The BOG system document stated that “the small number of NCF graduates makes it necessary to account for every single graduate or their percentages are disproportionately affected” (Florida Board of Governors, 2018a, p. 7). In the same document in which the BOG system responded to claims that the model “dragged down” the state’s small liberal arts college, it also admitted that “their percentages are disproportionately affected.” While the victimization that occurred through these public documents was nuanced and oftentimes sympathetic, it served to highlight the unique struggle that the state’s PBF policy forced upon certain institutions.

Villain

In the case of public documents from the BOG system, the presence of one particular character narrative element in discussion of the PBF policy did not preclude the introduction of another. In at least one scenario, the description of a hero even demanded mention of a villain. If Governor Rick Scott served in this champion role, as described at length earlier, then what precisely was the force that he fought against? In one quote in a BOG system release from 2016, Scott answered this question when he proclaimed, “I will continue to challenge Florida's universities to make higher education accessible and affordable, and strive to graduate students in degrees that lead to high-skill, high-wage jobs” (Davis Wise, 2016). He thus positioned himself in an adversarial role against the universities themselves, who, according to this logic, were supposedly not focused on “accessible and affordable” education and did not “strive to graduate students” in the most critical areas of workforce demand.

Such a characterization reflected an attempt to vilify the administrative leadership of universities. Either universities were negligent (and unintentionally had the wrong priorities, producing graduates in unemployable disciplines while they closed the doors to their institutions and ratcheted up their prices), or they were actively working against Scott and these goals. These details were not clear in the public documents, but either way, the administrative leaders at the universities formed the opposition to the aforementioned valiant interpretations of the governor’s actions.


Evidence (setting)

The BOG system narrative element of evidence (setting) appeared to occur only in official documents. Quite aptly, these coding instances described a shift in focus for the political appointees who reviewed and formally approved the system’s accountability reports. Prior to the adoption of a PBF policy, as noted in the 2012-2013 accountability report, the BOG system directed its attention to enrollment and the resulting expansion of access that occurred alongside increased student populations and numbers of degrees awarded, regardless of academic discipline.

The statement below served as evidence (setting) to describe this concern in real world terms, meaning that data depicted the problem at hand using a specific example:

As a System, undergraduate enrollment increased 1% from Fall 2011 to Fall 2012, and graduate headcount enrollment increased 1% from Fall 2011 to Fall 2012. However, the amount of credit hours (as measured by FTE) was flat for undergraduates and declined at the graduate level. (Florida Board of Governors, 2014a)

While a slight increase in SUS headcount occurred, the noted lack of movement in the federally-standardized measure of full-time equivalent (FTE) student enrollment meant that undergraduate credit hours did not increase simultaneously. Worse, the graduate FTE declined. More students may have attended the universities, but the evidence (setting) suggests that instructional activity did not increase – and actually decreased at the graduate level. The problem, and the evidence of that problem, reflected the BOG system’s focus on enrollment and productivity in the form of the delivery of education via classroom time and the associated credit hour calculation.


In contrast, after the implementation of the PBF policy in the state, the BOG system began to focus on the academic success of undergraduate students, a concept that, as noted previously, pervades the majority of the metrics in the model. While not providing the specific data to support the claims, the BOG system relied on the evidence (setting) narrative element when it noted trends that described academic success and failure:

The percentage of students who continue to their second Fall term serves as a valuable early indicator of student success. The percentage of students who have maintained a Grade Point Average of 2.0 or higher by the end of their first year is an even stronger predictor of student success. (Florida Board of Governors, 2013c)

Despite the circular logic behind the BOG system’s prediction that students who successfully continued to pursue their education were more likely to succeed (and that students who earned higher grades were likewise predicted to succeed), what the BOG system attempted to describe here was evidence (setting) of a complicated academic problem. The scene it described was not one of dwindling student populations but rather of the instructional quality and engagement opportunities it would have taken for the students to have earned higher grades and continued in their studies at their initial university. A genuine shift in concern from productivity to student completion occurred, as observed in this narrative element. The timing of the shift was of particular note, given that the subsequent evidence (setting) regarding the academic success of undergraduate students occurred in the post-PBF policy era.


Moral of the story

Just as other narrative elements evolved with the state’s PBF policy, so too did the moral of the story narrative element depict a broader shift in strategy. The following example described the national movement toward PBF policies prior to the adoption by the state of Florida in 2014, as well as the impending focus of the BOG system’s own version of such a policy:

Accountability measures in higher education have increasingly focused on graduation rates as a proxy for institutional effectiveness in state and national governmental measures, national rankings, and institutional strategic plans. (Florida Board of Governors, 2012a)

While the above statement reflected more of a national moral of the story than the state of Florida’s own approach, in the subsequent years of 2012 and 2013 the BOG system made the following claim:

The System is developing a performance-funding model, that will drive universities toward achieving the State’s top priorities and reward both excellence and improvement on key metrics, especially in areas of student success. (Florida Board of Governors, 2013c; Florida Board of Governors, 2014a)

This sentence adequately described the Florida PBF policy in terms of a moral of the story, in which a broad strategic philosophy centered on quantifiable measures of academic and career success for students throughout the various universities in the state.

Likewise, it mirrored the same type of moral of the story narrative element previously attributed to “state and national governmental measures” for accountability. The suggestion here was that standardized approaches for accountability were reliable and useful for strategic purposes.

For five sequential years, from 2012 to 2016, the BOG system advocated for standardized measures in its moral of the story. Instead of exploring institution-specific accountability mechanisms that would consider unique institutional missions, the BOG system described a broad strategic approach to promoting excellence, in which “the State University System (SUS) of Florida is committed to excellence in teaching, research and public service—the traditional mission of universities” (Florida Board of Governors, 2012a; Florida Board of Governors, 2013c; Florida Board of Governors, 2014a; Florida Board of Governors, 2015b; Florida Board of Governors, 2016b). Admittedly, as noted previously, the university metrics did vary slightly according to mission, mostly through the university Board of Trustees-choice and the BOG-choice metrics. Still, statements like the one above seemed to overpower hints of a need to support unique university missions, instead supporting a system-wide mission. While additional analysis might review the extent to which that was true, the BOG system’s moral of the story suggested that all of the universities had the same mission.

Plot

In terms of specific initiatives that reflected the plot to implement strategic priorities, the BOG system made limited reference to efforts to support completion rates and to promote academic and career success for students. For example, the BOG system public documents made no reference to the statewide adoption of any support systems or technologies, such as predictive analytics subscriptions or electronic degree mapping or advising notes systems. For the most part, the BOG system lacked plot.


An interpretive narrative analysis, breaking free of the NPF elements, might have provided a more critical account of the BOG system’s apparent disinterest in the coordination of system-wide interventions for promoting student success. It appears that the BOG system chose not to focus its efforts on proactive interventions at the state level, across several universities, though this is not entirely clear and would require additional documentation and consideration to better understand BOG system priorities in terms of the actual work the staff at the state system level is producing. In other words, the NPF coding framework merely depicts an absence of plot coding instances, whereas a less standardized method might provide more flexibility to analyze such gaps in the narrative.

The sole instance of BOG system plot was the inclusion of regular updates about initiatives related to fully online, or distance learning, educational formats. From 2012 to 2016, the BOG system accountability reports noted progress on distance learning enrollment growth; the first of these acknowledged how “Florida is now working to better organize its distance learning offering” and explained how “a consultant hired by the Board outlined four options that will help shape recommendations for the future of online learning” (Florida Board of Governors, 2012a). By 2016, nearly half of all students throughout the state were enrolled in at least one fully online course, illustrating rapid and widespread growth for this plot initiative.

Notably, the PBF policy does not directly measure online education rates. Instead, the BOG system viewed fully online education as an opportunity to attend to many of the metrics indirectly. Students who have taken asynchronous online courses would not suffer from as many scheduling bottlenecks as they might have experienced with in-person alternatives. Additionally, the possibility existed that reductions in in-person facility use would have eventually lowered costs to students (a misguided assumption, given the potentially high costs of advanced technological systems and professional development efforts that institutions would have needed to adequately deliver online coursework).

Accordingly, the BOG system launched a “Task Force on Postsecondary Online Education” in conjunction with the Florida College System and private universities and colleges, among other stakeholders (Florida Board of Governors, 2014a). Online education, while not directly responsible for enhancing performance in any single PBF indicator, could have promoted increases across the state. Overall, though, the BOG system did not focus on statewide initiatives, as evidenced by limited instances of plot coding. Such an approach depicted a go-it-alone response to the PBF policy, in which universities competed and were not incentivized to develop cooperative solutions for increasing their specified metrics.

Statement of a problem

Given that Florida’s PBF policy has centered on academic and career success for students since its earliest ideation, the statement of a problem reflected similar concerns. For example, the 2011-12 system accountability report stated, “Research shows that the highest attrition rates occur in the first two years of college, so early identification is crucial in helping first-time-in-college (FTIC) students who are at risk academically” (Florida Board of Governors, 2012a). In its most basic form, this statement summarized the ambition of the PBF policy. Career success depended on successful academic experiences, and so academic failure became the core motivation for the BOG system.

Given that the various universities have had unique missions to serve diverse student populations, the solutions at each institution looked different. This changed over the course of the implementation of the PBF policy. Along this line of reasoning, in a 2017 op-ed in the Pensacola News Journal, the chair of the Board of Trustees at the University of West Florida (UWF), Mort O’Sullivan, articulated the less obvious, much broader statement of a problem for the BOG system:

When the metrics system was created, we knew we had a challenge. The system puts all universities in Florida in the same bucket and presents us with challenging metrics when we are competing with large research universities. Despite those challenges, the system offers opportunities by allowing scoring on the higher of two measurements for each metric — an excellence score or an improvement score. Through our focus on improvement, we have stepped up among the leaders in the statewide results from this metrics system. However, the design of this system works in a remarkable way that forces universities like us to continually improve in order to score well in future years. (O’Sullivan, 2017)

As O’Sullivan alluded to, large research universities like the University of Florida had very little room for improvement in their academic progress rates, which measured retention of students with above a 2.0 grade point average from freshman to sophomore year, so they focused on maintaining excellence. Strategies to accomplish this goal would have looked quite different from those needed to begin the improvement process. Lower tier institutions, though, had no ceiling for improvement points, except their own capacity to improve. Higher tier institutions benefited from excellence points but might have struggled to maintain these scores. While not specific to the academic success of students, this statement of a problem narrative element acknowledged the challenge that Florida’s PBF policy needed to overcome with regard to addressing the different types of institutions within the state, while somehow standardizing the measures of the model.

Angel shift

A plausible use of the angel shift narrative strategy for the BOG system could have involved the system attempting to elevate itself above its peer systems across the US – and to deemphasize comparable successes throughout different states. Instead, the primary technique that the BOG system deployed in its documents and reports involved the individual universities themselves. In particular, the angel shift strategy was a useful tool to emphasize the relative improvements of specific universities that had previously scored poorly in the PBF policy and somehow managed to make adjustments that righted themselves within the model. For example, then-BOG system chair Tom Kuntz described how, “in the past four years, we’ve seen steady improvements at the System level and for individual universities,” and “especially exciting is that we’ve seen universities in the bottom three soar to the top of the pack as they’ve renewed their focus on student success” (Davis Wise, 2017). The possibility of transition from the worst to the best was essential to the success of the model, which validated the notion that such turnaround stories could be possible at all.

More specifically, the BOG system documents relied on the angel shift narrative strategy when they referenced the University of West Florida (UWF) and Florida Atlantic University (FAU) – two institutions that had ranked poorly in the early years of the state’s PBF policy implementation. In the first case, the BOG system published the aforementioned op-ed by the chair of the UWF board of trustees, Mort O’Sullivan, who pointed out that after his institution “found itself ranked in the bottom three, at risk of losing funding,” then “the campus community responded, coming together to focus on supporting students, improving scores,” and eventually resulting with the institution ranking “among the three top-performing public universities in Florida, alongside the University of Florida and the University of South Florida” (O’Sullivan, 2017). Here, UWF leveraged its comparative success over other universities in order to emphasize its own turnaround story. The BOG system then leveraged this document by posting it to its own website, reifying the success of its PBF policy.

The BOG system published similar references to FAU, such as descriptions of the university “beating out the University of Florida and Florida State for the top spot” – all “only two years after facing a $7 million penalty from the state for rock bottom graduation rates” (Ostrowski, 2016). Additionally, Travis (2016) referred to FAU as a “major turnaround,” considering that the university “just two years ago was considered one of the lowest performing” schools in the system. The BOG system also published a report that described FAU as “skyrocketing from a bottom scoring university in 2014 to the top of the rankings tied with University of Central Florida” (Davis Wise, 2016). The reliance on the angel shift of a turnaround success story, with accompanying references to how other universities fared, was a common theme particularly among the news media clips that the BOG system published. These institutions served as evidence that improvement was essential to the PBF policy.

Containment

At the same time that the BOG system was highlighting particular success stories, it did not hesitate to deploy the containment narrative strategy in order to stall critical assessments of the PBF policy. As noted previously in Chapter 2 of this dissertation, some states, such as Tennessee, have relied on multi-year trend data for their particular brand of PBF policy, a methodological adjustment intended to smooth out one-year spikes or rapid drops that were outliers and not representative of an institution’s sustained performance (Tennessee Higher Education Commission, 2015).

In its FAQs document, the BOG system posed the question of why its PBF policy was “based on one year and not 2, 3 or 5-year averages,” and the response noted that, “for some metrics, historical data is not available and in other cases the metric definitions have been revised recently, thus the use of averages would not be appropriate” (Florida Board of Governors, 2018a, p. 7). The system pointed to these concerns as a form of containment strategy, using the passive voice to disguise the actor who was responsible for how “definitions have been revised” (note that it was, of course, the BOG system itself that bore the responsibility). This also disregarded that historical data was unavailable because of the non-standardized, ad hoc methodologies that the state’s PBF policy used to calculate those metrics. Such concerns appeared to be disingenuous and less-than-reasonable excuses for not adjusting the policy to reflect multi-year trend data.

Devil shift

A common theme throughout the BOG system documents regarding Florida’s PBF policy was that universities were directly responsible for the high costs of a college education (as evidenced by metric #3 – net cost to student), but the devil shift narrative strategy in the documents deflected this responsibility to the students themselves. An interesting counterproposition, located within the FAQs document, suggested that the BOG system should not use “student loan/default data as an accountability metric” because “tuition, fees and books only represent one-third of the total 2016-17 costs of attendance within the State University System of Florida” (Florida Board of Governors, 2018a, p. 14). In this one instance, the BOG system pointed to students rather than the universities. The staff members from the BOG system drafted a response stating that “it is a large conceptual jump to actually use this data to hold universities accountable for the non-instructional financial decisions that individual students choose to make about their personal lifestyle” (p. 14). Here, the BOG system suggested that the majority of educational costs are actually discretionary in nature.

The shifted responsibility did not align with prior characterizations of the powerful ability to improve that universities wielded, such as those referenced in the causal mechanism narrative element. When the BOG system deployed this particular devil shift narrative strategy, then, it deflected away from the core understanding of the PBF policy and, in turn, shielded the policy that it had created from further critique.

Expansion

While some of the other narrative strategies shielded the state’s PBF policy from criticism, the BOG system utilized the expansion narrative strategy to extend the responsibility for the development of the policy to other stakeholders. For example, a common refrain for implementing a PBF policy in any state was that accountability measures were increasingly tied to budgets as part of a national trend. Pointing to the PBF policy as an increasingly common trend expanded the responsibility to a broader set of stakeholders (in this case, peer policymakers at other state systems outside of Florida).

In one news media article, Tom Kuntz, “chairman of BOG’s Budget and Finance Committee which gave initial approval to the performance funding plan, noted that a growing number of states already are using similar models to determine funding for their public universities” (Blackburn, 2014). As noted previously, there were certainly both similarities and unique features of Florida’s model that made it different from other states. Nevertheless, the BOG system was willing to expand responsibility to other partners in PBF policy development, as an effort to hedge its bets regarding policy outcomes, including both the blame for possible failures as well as the glory from any potential successes.

In addition to expansion to other states, the BOG system sought to share responsibility with the individual universities and their leadership. In its FAQ document, the BOG system described public vetting at board meetings and how the system staff ensured that there were “updates provided on the status of developing the model,” in addition to how “discussions have been held with universities through phone calls and face-to-face meetings” (Florida Board of Governors, 2018a, p. 4). If there was an understanding that these institutions and their experts were involved in the policy creation, then in theory that socialization process and engagement should have given representatives from universities the opportunities to change the state’s particular brand of PBF model along the way. In other words, the blame and the failures were shared with individual universities, as well as other state university systems outside of Florida.

Policy beliefs

The policy beliefs for the BOG system, as evidenced in the annual accountability reports, revolved around two ultimately competing themes of system-wide standardization and recognition of distinct university-level goals. This commitment to a simultaneously unified and distributed focus is “achieved through a coordinated system of institutions, each having a distinct mission and each dedicated to meeting the needs of a diverse state and nation” (Florida Board of Governors, 2017c). Despite remaining committed to both concepts strategically, the BOG system promoted a somewhat incoherent message that worked counter to its promise to launch a PBF policy that was simple and clear. In other words, the policy beliefs for the BOG system were not explicit. The leadership could choose to build narratives around the individual strengths of each university, but the policymakers seemed hesitant to fully commit to this approach.

Perhaps the PBF policy could have treated each university with its own unique set of metrics, or at least in its own category of metrics, but that is not the design of Florida’s PBF policy. While the BOG system certainly leaned towards standardization, the current model incorporated at least two metrics that were supposedly unique for each institution. A one-size-fits-all PBF approach could indeed fulfill the obligation to standardization, but for five years straight, the BOG system continually re-dedicated itself towards achievement through the diversity of its member institutions. As noted in Chapter 2, this differentiation resulted in maximum scores each year for all of the universities, thus nullifying the actual efficacy of the PBF policy’s attempt to recognize distinct missions.

Nevertheless, the diversity of the universities was central to the BOG system’s policy beliefs. In the following section, the analysis considers the perspectives of those universities through narrative elements, strategies, and the universities’ policy beliefs.


CHAPTER 6. ANALYSIS OF INDIVIDUAL UNIVERSITIES IN FLORIDA

The individual universities operated in many different contexts, and the narrative elements and strategies reflected this array of experiences and priorities. While some universities were large, complicated organizations that served multiple stakeholders at once, others had been more targeted in their missions and operations. The work plan documents reflected these differences, among others, and empowered universities to articulate their concerns and opportunities regarding the PBF policy and related initiatives, best practices, and assessment activities. The accountability reports, completed roughly six months after the end of each fiscal year, outlined nearly exhaustive lists of interventions, projects, and major achievements for each university as a whole, for the faculty and staff members of each university, and for their students.

In some instances, the components of overarching narratives (e.g. the narrative elements and strategies at universities) evolved alongside PBF policy development and implementation in the state of Florida. In other words, as the policy evolved, so too did the various pieces of the overarching narratives. The following section explores findings that appeared within the narrative elements and strategies within the accountability report and work plan documents.

Causal mechanisms

Within their annual documents, individual universities described a number of different causal mechanisms in which a prescribed action or program would have had explicit results. Often in the work plans, universities discussed initiatives that would enhance institutional standing in the PBF policy itself, particularly with regard to academic and career success for students. In the accountability reports, the universities investigated the outcomes of these specific initiatives, which often focused on the PBF policy more so than even the particular mission of the institution. The causal mechanisms served as evidence of working towards implementing best practices for academic and career success for students. Universities explained how these projects worked to enhance institutional performance – but failed to consider whether or not they were launching the most appropriate initiatives for the types of institutions they were.

For example, Florida Gulf Coast University (FGCU), a small comprehensive regional university, continually referenced its relative youth in many of its reports and plans, as the university first offered classes in 1997. Despite its short history as a local commuter school, FGCU announced plans to establish an honors college in 2017, which it claimed would have served as “a vehicle for attracting and retaining the best and the brightest in an ever-more competitive environment and yields increased scholarship, student retention, and timely graduation” (Florida Gulf Coast University, 2017b, p. 7). Whether or not an honors college was suitable for a regional commuter institution was not a part of the discussion. Instead, the understanding within this causal mechanism was that an honors college, with its rigor and engagement opportunities, would have drawn higher achieving students who could contribute to increased persistence and graduation rates at the university. The appropriateness of the mechanism (i.e. the substance of the intervention) appeared to have been, perhaps, secondary to its causal function.


Similarly, Florida Atlantic University (FAU) articulated its intention to enhance its admission standards, reasoning that high-attaining freshmen would be more likely to succeed in higher education – both in terms of first-to-second year retention rate and timely graduation. FAU (2017a) stated that it had “implemented strategies that have increased the academic profiles of enrolled students” (p. 7). While not necessarily an inappropriate approach, the institutional focus clearly shifted away from a broader mission of access to a willingness to restrict its enrollments to students whose causal mechanism would be to fuel the self-fulfilling prophecy of student success. Promoting a higher profile of enrolled students assumedly required excluding traditionally underserved prospective students who might not historically have had access to privileges such as private tutoring or standardized test preparation centers, due to a lack of geographic or otherwise socioeconomic access. The institution openly spotlighted this change as it embraced this new strategic direction, much in line with the PBF policy.

Universities also mentioned other examples of the importance of high impact practices related to students’ academic and career success throughout their public documents. For example, in its very first work plan since its establishment as an independent institution in 2012, Florida Polytechnic University (2013) referenced the importance of internships in securing students full-time jobs, as “a recent study indicates that internships have a high ROI,” with companies offering their interns full-time roles 70% of the time (p. 8). Again, some universities leveraged the causal mechanism function of these interventions as narrative elements, evoking efficiency and efficacy, rather than necessarily articulating the specific institutional problem at hand and the appropriate situational solution.


Other universities connected the importance of internships to enhancing the local corporate landscape, in addition to the direct impact on students themselves. Along these lines, Florida State University (FSU) elaborated how “internships not only benefit our students by helping prepare them for the workforce but our students also provide valuable services to local businesses and organizations” (2014a, p. 13). In contrast, some institutions focused on how the causal mechanism directly impacted their own students, rather than treating the community alone as key to their initiatives. For example, FAU (2017a) referenced its “Embedded Career Liaison Program that focuses on bolstering the number of students registering for internship and co-op courses” (p. 7), while Florida International University (FIU) explained how, “through internships, students gain real-world experience and a first-hand opportunity to try out their chosen career and build their resume with actual work experience” (2014a, p. 6). These cases highlighted the necessity of internships to promote career placement. Additionally, the leveraging of a causal mechanism showcased the role of the universities as experts in understanding the interactive role of their various interventions and best practices.

Almost all institutions saw a need to integrate the two populations of the business community and their student bodies. Along those lines, New College of Florida (NCF) explained how “internships engage students directly in the community workforce, and create bridges between the academic program and the world of work” (2017a, p. 14). Likewise, Florida Polytechnic as an institution was particularly outward-facing, as its accountability report the next year explained how, “as a new university, we have the ability to adapt, be nimble, cooperate, and be responsive to our industry partner’s needs in a timely manner” (2014a, p. 9). The causal focus was not always on the students alone but on members of the local and regional business communities, highlighting a pipeline function that universities fulfilled in the state. With the PBF metrics focusing on career success, the institutions saw a need to develop narrative elements, in this case causal mechanisms, not just regarding their student bodies but for businesses as well.

Multiple universities cited unnamed “national research” as evidence that students who reside on-campus were more likely to graduate on time than their commuting student counterparts (University of North Florida, 2016b, p. 5; University of South Florida – St. Petersburg, 2013b, p. 5). They claimed residential students took higher course loads, succeeded more in their classes, and formed important social relationships that engaged them more with campus life. Universities often showcased that they were knowledgeable about how a policy solution worked, implying that the institutions themselves were experts regarding which causal mechanisms generally worked.

Some universities developed this narrative element over the course of several years, relying on more robust causal mechanisms that boasted sound contextual research and focused on the particular needs of their students. For instance, FIU (2015a) identified “College Algebra” as “a critical course for predicting graduation success” (p. 7). The institution subsequently launched “Mastery Math Lab, a computer-assisted, adaptive program of algebra instruction and exercise,” and ultimately found that “requiring student presence two hours/week led to significant increases in passing rates” (2017a, p. 8). First, FIU identified a problem (i.e. not passing College Algebra), and then it connected that particular concern to a causal mechanism that directly served its students. These are just a few examples of multi-year narrative components that did not rely on broadly-generalized studies but instead focused on the unique needs of students at the particular institutions.


Hero

The narrative element of the hero arose as the most commonly used character in all of the public documents, including university work plans. In general, each individual university used the opportunity of the standardized work plan format, in particular the template section that spotlighted strengths and opportunities, to showcase its own status as a hero character. Rather than showcasing individual hero characters who played a role in the policy process, the institutions tended to refer to organization-level heroics.

These findings align with the recurring focus of NPF on the rhetoric of heroics, relying on the traditional role of ethos to stimulate the public with the depiction of a champion and pathos to drive public emotions and frustration with unjust scenarios (Smith-Walter, 2018). The heroes that universities depicted in their accountability reports and work plans reflected similar patterns. In the documents, universities described their efforts to triumph over the stale policy environment with dramatic overtures. They leaned into the rhetorical devices and willingly participated in the theatrics.

In some instances, universities’ descriptions of themselves evolved, such as in the case of Florida Agricultural and Mechanical University (FAMU). The school referred to itself as “one of the premiere Historically Black Colleges and Universities (HBCUs) in the nation” in one of its earlier work plans (2013b), but shifted to focusing more on explicit accomplishments in subsequent plans. Years later, after the PBF policy had been in place for multiple years, the university stated:

FAMU has invested significantly in some of these activities designed to increase student retention and progression, which has been evidenced by the increase in the academic progress rate of FTIC students returning their second year by 21% since 2010. (Florida Agricultural and Mechanical University, 2016b, p. 5)

In the example above, FAMU transitioned from promoting its national acclaim to relying on more specific evidence of its hero-type actions and results. It is possible that the PBF policy catalyzed this shift, as the new form of accountability in the state no longer empowered universities to rest on their laurels and historical reputations.

Similarly, FSU claimed in its work plan that, “as a top-tier research university, it is crucial to offer the full breadth of disciplinary excellence, and we seek continual improvement in our position in retaining and educating the most promising students in the State of Florida” (Florida State University, 2012b, p. 4). Later, FSU (2017b) delivered on this rhetoric, showcasing its hero status while being “recognized for advancing the persistence of all students, FSU’s 4-year graduation rates are now among the top 15 in the country” (p. 5). In addition to shifting the reference of itself from a “top-tier research university” that just so happened to recruit promising students to a national leader in student persistence, FSU also moved from broader claims of nobility to specific examples of championing students.

Furthermore, universities offered differing interpretations of what is valiant or heroic. For example, FIU (2012b) in its work plan emphasized its national prominence in its ability to produce more Hispanic graduates than any other institution in the country, ultimately claiming that “FIU serves the nation as a demonstration that diversity and excellence can be coterminous” (p. 5). An undoubtedly important factor in practically any other situation, the university’s success in graduating minority students unfortunately played a relatively minor role in its PBF policy success, as it was only measured directly in the university’s own choice metric #10, in degrees awarded to minorities.

Accordingly, the way that the institution emphasized itself as a hero evolved over the years. In 2017, for instance, FIU claimed that it had “seen evidence of dramatic improvements in student success” and was focused more on efficiency of “timely graduation rates” and reducing “excess hours” (2017b, p. 7). Volume was less of the focus, regardless of its role as one of the nation’s leaders in producing Hispanic graduates. The speed with which those minority students progressed became more of the focus than the overall impact of the volume of Hispanic students.

Though it was not clear that the PBF policy itself drove the nature in which an institution cast itself as a hero, these public documents definitely showcased a shift. In one of its accountability reports during the first full year of implementation of the state’s PBF policy, the University of South Florida (USF) focused on a commitment to developing a diverse institution:

Access to that high-quality higher education is growing significantly, equipping more and more students from a variety of different backgrounds with the skills they need for a successful lifetime career. In the last decade, USF’s degree production has doubled, increasing to a record 10,950 in 2012-13. That includes students from 120 different countries, making USF one of the 40 most diverse public universities in the country and the second most diverse institution in the State University System of Florida. USF is a leader in awarding baccalaureate degrees to traditionally underrepresented groups and minority students, recognized in 2012 by Diverse magazine as 28th among all U.S. colleges or universities in awarding undergraduate degrees to minorities. Forty-three percent of USF undergraduate students receive a federal Pell Grant (IPEDS, 2011-12). (USF, 2014a, p. 7)

In the above statement from the first year of the ten-metric PBF policy, the university cast itself as a hero of diversity in approximately four different lights, and the institution documented its commitment to serving veterans just a few paragraphs later.

An institution that alone awarded 10,950 degrees would have carried significant weight not just in the state but nationally, as the number of graduates entering the workforce and continuing on to other institutions would have been substantial. Furthermore, though, the institution made a point to highlight that it was not merely producing a large volume of alumni but was doing so with an ongoing record of supporting minority and lower-income individuals. The heroics here represented diverse worldviews, culminating in an image of the institution as a haven for global and intersectional dialogue. Ultimately, the narrative element of hero was grounded in diversity and service to traditionally disadvantaged populations.

Eventually USF also redirected the way that it described its hero narrative element, as it focused more on its role as an elite institution. The university proclaimed how “USF Tampa was designated as a Florida Emerging Preeminent Research University in June 2016” and “USF Tampa continues to strengthen the quality and reputation of our programs – evidenced by the record setting fall FTIC student profile” (2017a, p. 6). While the enhancement of academic rigor would not have definitively indicated exclusionary practices, USF supported its claim to excellence by noting the record of success of its student body. Yet again, an institution leaned on the documented success of its high-ability inputs (i.e. the academic preparation of its incoming class) to produce excellent outputs (i.e. timely graduation) in line with the state’s PBF policy. USF’s initial description of itself as a diverse institution was supported by a number of quantitative factors, but those factors would not have driven high ranks in the PBF model and thus were deemphasized.

In a similar manner, Florida Gulf Coast University (FGCU) shifted its self-conceptualization from a hero of access to a hero of efficiency. The values of access and efficiency likely competed. For instance, prior to the advent of the state’s PBF policy, the university noted how “one of the most critical aspects of FGCU is the advanced education it provides to regional residents enabling them to access professional careers vital to Southwest Florida’s further development and economic diversification” (2013a, p. 5). This bold statement reflected the traditional role of FGCU as a provider of access and an economic engine for the region, having empowered its local residents with a broad portfolio of academic offerings.

Only two years into the implementation of the ten-metric PBF model, FGCU (2016a) lauded its draconian measures to reduce the selection of degree programs available to its students. Diversity of education was no longer the focus when the university’s politically-appointed board of trustees “conducted a major review of academic programs to ensure programs were providing an appropriate return on the university’s allocation of resources” and ultimately “took action on 38 majors and seven programs were formally terminated” (p. 7). The change in specificity was also striking. The first statement from 2013 was broad, without specifics regarding how the university actually provided the region with educational opportunities, while the second statement was specific and action-oriented to support claims of efficiency. In moving from a populist, access-based framing of broad professional career opportunities to an efficiency-based framing of a more limited selection of careers, FGCU shifted the tone of its own heroism.

What is perhaps most telling from the evolution of the various universities’ hero narrative elements was not necessarily the initial versions the universities shared but how they all eventually spotlighted the same type of hero: one that recruited great students and kept them engaged and academically progressing towards timely graduation and successful careers. The final homogeneity of the narrative element spoke directly to the PBF policy, possibly at the cost of losing focus on the unique mission of each institution, though universities moved more towards providing evidence to support the claims (rather than the previous tendency to merely promote the rhetoric of valiance).

Victim

The work plans did not include frequent reference to universities (or any other entities) as victims, with a few exceptions. Four of the six coding instances in the work plans were perfunctory in nature. These instances occurred when the University of North Florida (UNF) added a notation from 2014 to 2017, explaining that the word “preeminence” in its official vision statement predated the adoption of Florida Statute 1001.7065, which formed the formal Preeminent State Research Universities Program. While not explicit victimization, UNF’s continual inclusion of this notation was defensive in nature, forcing the university to justify its own ambitions in deference to more prominent institutions.


The other two coding instances within the work plans addressed funding concerns. During the first year of the ten-metric PBF policy, FAMU explained that, “although we have realized efficiencies and made strides in various areas, in order to create transformational change, additional funds are essential” (2014b, p. 4). FAMU had avoided any financial penalties in the PBF policy that year, but it would suffer them in subsequent years. Likewise, in 2013, the University of South Florida – Sarasota-Manatee (USF-SM) remarked how “the University’s steepest challenge is continuing to provide high quality university education amidst declining state support” (2013b, p. 5). USF-SM benefited from the PBF policy, as it is part of the University of South Florida system of campuses, which earned a place among the top recipients of performance funds, directly addressing the declining state support referenced here.

Accountability reports provided similar results, with victimization having occurred earlier on, before PBF policy implementation and in the first years of the model, but such narrative elements eventually faded out of the reports. FGCU made continual references to its ranking as the lowest-funded university in the system in terms of state appropriations per student (2012a, p. 13; 2014a, p. 6). Similarly, FAU pointed to a one-time $25 million budget reduction as evidence of external forces restricting opportunities for institutional success while simultaneously pushing for improved performance. The university claimed its strategic plan “was developed in the context of the increased expectations and relentless budget reductions being imposed upon many, if not all, institutions of higher education” (FAU, 2013a, p. 5). This may have been one of the most explicit expressions of frustration, with the university framing itself as a victim of not just budgetary constraints but the forces of the PBF policy and related initiatives that drive institutions to promote particular ideas of student academic and career success.

The state’s flagship universities were not immune from similar fiscal complaints.

FSU also made multiple statements that framed itself as a victim, with an expectation for it to compete nationally but without the lofty resources of its top national peers (FSU, 2012a, p. 10; FSU, 2013a, p. 10). The university expressed reservations about whether it would be able to compete nationally with funding levels that were limited compared to those of its national peers. FSU noted, “while we continue to investigate ways to boost our retention rates, sustaining this level of retention in difficult economic times is a challenge” (2014a, p. 6). It was not an uncommon narrative element among the various universities’ accountability reports, but it was noteworthy that such a successful institution would have decided to cast itself as a victim in terms of its finances. Again, these universities focused on budgetary concerns only in reports prior to the PBF policy.

The most striking finding here was that universities in subsequent years did not frame themselves as victims for budgetary reasons, at least within the pages of the formal work plans, even when institutions ranked in the bottom three of the PBF model.

These examples appeared early on in the adoption of the PBF policy, with one exception.

NCF (2015a), after two years of not having received any performance funding, positioned itself as a victim of the PBF policy and the BOG system staff, as its students continued to graduate outside of the state’s defined areas of strategic emphasis. Previously, the liberal arts college only offered one degree program (“non-strategic”) in liberal arts and sciences, but it had spun off some of the concentrations within that one degree program into standalone degree programs. Unfortunately for the college, “the BOG decision to exclude both Anthropology and Political Science from the Areas of Strategic Emphasis was a setback” (p. 8). While certainly cross-coded as a villain (see next section), the term “setback” was uncommon for an annual report of this kind and highlighted the feeling the institution was experiencing at the time.

In this example, the college may have been frustrated that its Anthropology students could go on to careers in medical forensics offices (clearly STEM or health-related in nature and covered by the areas of strategic emphasis) or that its Political Science students could go on to careers in international relations (clearly global in nature and thus, again, covered by the areas of strategic emphasis). This is the latest example of the use of the victim narrative element in either accountability reports or work plans. Nevertheless, the tone of the universities, overall, shifted away from victimization during the years that the state implemented the policy. For the most part, the findings did not reflect ongoing use of the victim narrative element after the first few years of PBF policy implementation.

Villain

While universities tended to use the hero narrative element the most frequently out of all the character codes in the analysis, they used the villain character least often. Part of this reluctance may have been due to the format – a formal annual work plan or accountability report, approved by an institution’s own board of trustees and submitted for review and approval by the BOG system, might lend itself to pulling some punches. When a university did produce a document that formulated a villain, then the situation was certainly remarkable, as noted in the New College of Florida case above regarding areas of strategic emphasis.


While infrequent, the use of this narrative element was profound, such as when Florida Gulf Coast University (FGCU) formulated a villain in 2015 by embedding a two-page executive summary introduction within the pages of the official work plan document. FGCU (2015b) noted that the work plan’s “use of one-year trend data is limiting without additional context” that highlighted the relative success the university boasted in the metric measuring the percentage of graduate degrees awarded in areas of strategic emphasis (p. 2). Certainly unorthodox, the act of defiance disregarded the standardized template and targeted the BOG system staff as villains.

The university took an additional step by explicitly pointing the finger at BOG system staff members regarding another metric. FGCU further remarked, “The ‘n/a’ for one-year trend data on the excess credit hour measure is due to a change in methodology employed by the BOG staff in the calculation of the measure and not indicative of any institutional problem” (p. 2). This example certainly represented the university deploying additional narrative strategies, which are discussed in later sections of this dissertation. It also represented a targeted attack on BOG system staff and highlighted more technical concerns regarding the PBF policy’s metrics reporting structure.

Even more worrisome was the other instance of the villain narrative by an individual university. Prior to the introduction of the state’s ten-metric version of the PBF policy, the University of West Florida framed its own students as villains, holding back the institution from higher rates of student academic and career success. The institution claimed that it “made an effort to extend access to a greater number of students from traditionally underrepresented groups, including some students with somewhat weaker academic backgrounds” (2013a, p. 9). Here, UWF aligned its service to underrepresented demographics with somewhat weaker academic backgrounds.

While perhaps delicately stated, the underlying premise is that those students were at fault for the institution’s shortcomings. The institution elaborated a few sentences later, “UWF has experienced some unevenness in student persistence and success rates as it has tried to determine the optimal profile of students to serve” (p. 9). The statement was unnerving due to its lack of ownership of the issue at hand. Rather than acknowledge that students from traditionally underrepresented demographics were also underserved, and that perhaps the weak backgrounds noted were indicative of a more systemic concern, the university stated here that it considered shifting its admissions processes in order to remove these types of students from its enrollments. While this statement predated the PBF policy, it was certainly indicative of the environment in which the policy was being crafted by the BOG system. It showcased a baseline reaction to the PBF policy.

Beyond these statements, universities rarely made such bold claims regarding who was at fault for any institutional suffering, or even the appearance thereof. FGCU attempted to depict the BOG system staff members as villains, and UWF attempted to depict its own underrepresented students as villains. These are particularly important examples of overt, public conflict about the PBF policy between individual universities and the state’s BOG system.

Evidence (setting)

In their accountability reports and work plans, universities highlighted institution-specific evidence (setting) details that described the complexity they encountered while pursuing increased performance. Using statistics, universities shared this information to establish the baseline situation at each of these universities, primarily in terms of the PBF policy. The evidence (setting) narrative elements reinforced the unique context of their institutional missions, which do not necessarily align directly with the performance metrics. For example, FAMU (2014b) noted its historical “focus on STEM and health-related disciplines, areas in which minorities are particularly underrepresented” (p. 5). FAMU thus sought to increase minority enrollments in academic disciplines in which those same students were underrepresented, pushing against historical national trends.

This statement served to highlight the imposing task that FAMU completed, as it primarily enrolls minority students. The university shared facts related to its specific situation, which are important pieces of evidence to note in light of PBF metrics related to the percentage of degrees awarded in areas of strategic emphasis.

Universities also leveraged the narrative element of evidence (setting) to relate their own struggles and how they needed to develop strategies to overcome these concerns. Beyond merely describing a problem, these forms of evidence served to illustrate problems using real world examples. FIU, as noted previously, demonstrated that there was a causal mechanism related to College Algebra; the university illustrated this dynamic with the statistical claim that “freshmen who fail College Algebra are 75 percent less likely to graduate in a timely manner than students who pass the course” (2013a, p. 7). Clearly, then, the evidence suggested that FIU needed to tackle its high failure rates for that particular course if it were to retain students with adequate academic standing and graduate them in a timely fashion.

While statistics about mathematics difficulties (and broader academic challenges) illustrated the PBF policy implementation for FIU, it was the topic of career success that NCF used to illustrate its own unique context. In regard to the first metric of the PBF policy, related to the percent of bachelor’s recipients one year after graduation who were employed full-time with a salary above the $25,000 threshold or who were otherwise continuing their education in the US, NCF used the facts regarding its students’ career placement to deepen understanding of the metric. Rather than leaving it at the surface-level fact that “37% of the 2012 graduates were employed or enrolled,” NCF noted that this number was significantly lower than the 68% without the salary threshold. Additionally, the college highlighted the way in which its administrators dug deeper to discover further student outcomes. The college noted the following:

Twelve additional 2012 graduates were awarded international scholarships or employed or enrolled overseas, raising the employed/enrolled percentage to 44%. Since FETPIP and [National Student Clearing House] underrepresent our graduates, we conducted a telephone survey of the 2012 graduates and learned that an additional 14% of the 2012 graduates were employed outside of Florida (NCF, 2014a, p. 13)

The college claimed that it combed through the lists of graduated students to find every last student and encouraged the BOG system to recognize out-of-country employed graduates. The narrative element of evidence (setting) depicted a unique situation at the college, wherein even the smallest number of students would tip the scale for the metrics in the PBF policy.

Additional evidence regarding timely graduation rates appeared in the documents as well. At FAU (2017b), the work plan described a situation wherein a large proportion of students were enrolled part-time, likely not graduating within the requisite timeline in order to contribute positively to the institution’s standing in the PBF policy. The institution shared that “many students starting their university studies at full-time before having to drop down to part-time status for any number of reasons and/or personal obligations” (p. 7). At traditional commuter schools, this would not have been a problem. In fact, another state or PBF model might have otherwise rewarded the number of degrees awarded to part-time students in an effort to recognize a special institutional mission to provide access to students who are older or are employed full-time. This evidence (setting) explained why FAU might have struggled with some of the metrics that were targeted towards traditional schools at which students enrolled full-time and maintained full-time enrollments until they graduated within four years.

Institutions relied on the evidence (setting) narrative element to better depict their campus environments in light of the state’s PBF policy. In many instances, this element fueled a larger narrative of empowerment, forcing the BOG system to at least consider the institution-specific data and unique problems they showcased.

Moral of the story

Each institution had its own broad set of solutions to promote students’ academic and career success, though the extent to which each moral of the story aligned with the PBF policy was not explicitly clear. Some universities with more research expenditures and comparatively successful graduation rates expressed a need to sustain their strategic direction of graduating more students in a timely fashion. Perhaps the simplest approach was that of FSU (2017a), which promoted plans of study for its students, concluding that “we continue to identify and eliminate barriers to graduation by improving processes and optimizing academic pathways, such as strengthening students’ academic maps and expanding student advising” (p. 7). Better advising and academic planning for students seemed appropriate, though these solutions do not necessarily target deeper-rooted obstacles to timely graduation, such as cultural and/or socioeconomic barriers.

UWF (2017b) shared a similarly pragmatic, and similarly simple, moral of the story for overcoming geographic obstacles to timely graduation. The university highlighted “its multiple instructional sites and strong virtual presence” (p. 4). Like FSU’s proposal to enhance academic maps and advising, UWF focused on the surface-level concerns of the PBF policy. Surely access increased. More students could take classes that were convenient, such as through additional instructional sites near students’ homes or by circumventing geographic obstacles altogether through online education. At the same time, UWF did not engage the more systemic campus cultural concerns regarding the drive of students to go to school full-time and graduate in a timely fashion. This moral of the story was aligned with the PBF policy, but it was perhaps not as far-reaching as it could have been.

In another simple strategy, the University of Florida (UF) emphasized increased spending on additional faculty personnel lines that would simultaneously boost “quality and reputation” while also enhancing “undergraduate teaching by helping to stabilize the student-faculty ratio and by bringing undergraduates in contact with some of the world’s leading scholars” (2016a, p. 7). While not particularly comprehensive, this straightforward solution to the problems of the PBF policy was also appropriate for preparing students for academic success. In essence, UF suggested that it should use its buying power to hire more faculty to ensure that the university had enough instructors for all of its students to continue to have personalized experiences in the classroom.


The strategy to spend on faculty personnel was not adopted by the flagship institutions alone. Regional schools also attempted this straightforward approach. UNF (2016a) noted, “As a result of performance-based funding and special allocation of resources, UNF was able to add approximately 52 new, permanent faculty” (p. 7). This was a cautionary tale. While such an approach would have been sustainable for promoting academic success among students in a recurring funding model, the state’s peculiar nonrecurring “skin in the game” approach made this quite risky, and UNF would have potentially faced layoffs of these new faculty members if it had not succeeded with the PBF policy in future years. The spending solution did not address structural or organizational cultural concerns. In other words, these morals of the stories served as logical frameworks, albeit often drifting towards oversimplification, that aligned directly with the PBF policy.

Other universities had more comprehensive solutions. USF (2016a) directed its attention to a comprehensive academic success strategy, one that was unique in how particularly inclusive it was of the university’s many demographics. USF noted, “We continue to focus on greater academic support and build a campus environment that supports and celebrates success – a strategy that has allowed us to eliminate the degree completion gap by socioeconomic status and race or ethnicity” (p. 8). The philosophy here was rooted in the old adage that a rising tide lifts all boats, with USF understanding that its overall enhanced success relied on all students, including minorities, succeeding at heightened rates. The institution did not characterize these students as a component that held the university back from preeminence (see again, in UWF (2013a), the casting of the student as villain). Rather, the university’s moral of the story cited the elimination of completion gaps for racial/ethnic minorities and socioeconomically underrepresented students as part of the comprehensive strategy to build a campus culture of academic success.

While some universities focused on the shift of student support systems or a campus environment enhancement, others boasted a moral of the story that directed efforts to the organizational capacity for learning through assessment and analytics. For instance, FAU (2015a) articulated an attempt to build those support systems and environments through data-driven tactical planning, highlighting when the university “implemented an improvement plan with six specific, actionable strategies with measurable targets to enrich the educational experience by supporting an organizational culture of student success” (p. 6). The moral of the story thus integrated rational measures (“measurable targets”) with more nuanced outcomes (“an organizational culture of student success”).

FAMU (2017b) took the same approach, as it committed to changing itself with “a more data-driven culture and improve oversight and management of academic, fiscal and critical business operations, leading to improvement on key performance indicators and increased efficiency in University operations” (p. 4). In both of these examples from FAU and FAMU, the universities homed in on the main issue of the state’s PBF policy, primarily the concepts of academic progression and timely graduation, and provided overarching solutions to those concerns. Specifically, they pointed to logical frameworks that would have transformed the universities themselves into analytical organizations.

Interestingly, the types of moral of the story elements produced by other universities were only peripherally focused on the PBF policy, as they each held their own missions with their own slants on what “academic success” might mean to their students. To provide their students with opportunities for academic and career success, institutions built their solutions on the backs of their missions of technical training, liberal arts education, access to urban and minority residents, access to veterans, and access in terms of volume. These were arguably more refined, more directed morals.

Florida Polytechnic (2015a), for example, claimed to be “devoted to offering our graduate and undergraduate students strong technology and engineering degrees designed to meet cutting-edge high-tech employment demands” (p. 5). While the PBF policy did reward a focus on technology and engineering through the metrics for degrees awarded in areas of strategic emphasis, these types of degree programs can also be difficult due to their number of gateway courses (e.g. advanced Calculus or physics preparation for prerequisite coursework) or due to the lengths of the degree programs (e.g. beyond the traditional 120 credit hours, which could have increased the percent of excess hours attempted and eventually slowed time-to-degree). Nevertheless, the university committed to fulfilling its mission through a moral of the story that described a focus on high-tech education and career preparation.

While access is a metric within the state’s PBF model, as outlined earlier in Chapter 2 of this dissertation, the lack of variance between universities has meant the policy did not duly reward institutions if they provided disproportionate levels of access to Pell-eligible students or to minority students. FIU (2017b), one of the largest Hispanic Serving Institutions in the nation, described a moral of the story that contributed to academic success through its role “as an urban public research university in the 21st century” (p. 5). Neither urban service nor community impacts were reflected anywhere in the PBF policy, but FIU boldly articulated its role in this manner.

Similarly, the University of Central Florida (UCF) regularly framed its impact in terms of volume and its commitment to “harness the power of scale to transform lives and livelihoods” and to continue “serving a fast-growing region and state” (2017b, p. 5). UCF almost sounded like it was operating outside the bounds of the PBF policy, with its unique priority to serve a growing state. Both FIU and UCF presented a similar moral of the story that spoke more to a broad mission of community advancement and engagement through education than to the type of efficiency indicative of timely graduation.

These morals focused on impact in terms of urban reach in two of the nation’s largest cities (Miami and Orlando) and in terms of volume at two of the nation’s largest public university enrollments, whereas the PBF policy directed its attention towards impact in terms of degree completion. These universities saw themselves through a forward-facing lens more than through the PBF policy, although admittedly, their solutions did not necessarily preclude success in terms of performance with the metrics.

Another, more strategic, approach to student success was embedded in the commitment of NCF (2016b) to promote a liberal exploration of education. The college claimed that “it seeks to inculcate in students the timeless virtues of a liberal arts and sciences education while, at the same time, acquiring the skills to thrive in a rapidly evolving world,” and ultimately, “a New College education will propel graduates toward productive careers, post-graduate study, and lives that make the world a better place” (p. 4). This overarching policy solution was perhaps the least aligned, at least in terms of the state’s PBF policy, with the type of academic success described by streamlined progression towards a timely graduation. NCF focused less on efficient completion rates and more on its efforts to develop productive and engaged citizens.

A well-rounded educational experience might have taken time, and it might have taken failure, and it might have looked less like the cookie-cutter experience depicted in the PBF policy: quickly choose a major in an area of strategic emphasis, graduate in four years with minimal failure and without accumulating excess hours, and find a high-paying career or enroll in further education within one year of graduation. All of these things could have happened at NCF, but they might not have happened exactly in this order, and they might not have occurred on the same timeframe as they would have for a student at another university.

FGCU (2016b) was the only university that seemed to lack a coherent moral of the story. The university claimed that it “will achieve national prominence in offering exceptional value in high-quality educational programs that address regional and statewide needs” (p. 4). This approach was not particularly strategic at all. As the university aimed to be a national brand, it also sought regional and statewide recognition. It planned to be both excellent in quality and to offer “exceptional value.” All of these notions competed with one another, and the solution to the problems targeted by the PBF policy – academic and career success – was present neither directly nor indirectly.

While most of the universities ultimately articulated a coherent moral of the story that engaged the problems outlined by the PBF policy and its metrics, the extent to which some of them authentically aligned with policy was unclear. In some instances, the solutions universities planned to undertake to engage student success were unclear themselves, clouded by disparate and conflicting aspirations.


Plot

While universities were more likely to include plot than any other narrative element in their accountability reports and work plans, the relative importance of this information was questionable. The plot instances that universities included were typically related to the various student success initiatives that they had launched, and the volume was unwieldy. Of particular note was the fact that the BOG system documents included only 8 coding instances of plot, whereas the individual university accountability reports referenced plot 345 times and the work plans referenced plot 171 times. Clearly, the BOG system left the work to the institutions themselves, at the very least in terms of describing the initiatives related to academic and career success for students.

In terms of the individual universities, the comparative disparity of instances between the two document types (again, 345-to-171) made sense given the nature of each. Accountability reports were historical and focused on achievements, whereas work plans were forward-looking and focused on planned initiatives. Still, this intriguing disparity, in which 98.5% of plot instances in these documents were attributed to individual universities, was probably ripe for additional quantitative NPF content analysis. Did particular universities include plot more frequently than others, and was it possible that specific institutional characteristics accounted for those differences? Perhaps that could be a topic worth exploring, but this qualitative review permitted a wider sweep of the initiatives, which did not reveal much differentiation in regards to plot, with few exceptions.

As a summary of the coding results, institutions documented a vast array of initiatives related to students’ academic and career success. Just as important as what the individual universities included was what they did not mention. For example, most universities discussed a number of initiatives related to academic and career support services – everything from innovative approaches to expand tutoring to creative workforce development programs. NCF (2013b) advised that it required all of its students to participate in hands-on undergraduate research, which it believed “strengthened writing and critical inquiry skills” (p. 5). In another instance, UNF (2016b) renovated and transformed its Library Commons into a collaborative student support center with tutoring and supplemental instruction (p. 9). UWF (2016b) reorganized all advising, tutoring, and retention programs with its general education course offerings, establishing what it termed a “University College” so that students would have continuity in student support services throughout their entire experience at the university (p. 5). While in some instances these examples were remedial in nature, for the most part their inclusion served as an opportunity to trade plot with the BOG system that oversaw the submission of these annual reports, as well as with other universities that reviewed one another’s work plans and accountability reports.

Instead of taking advantage of this opportunity to share student support initiatives,

UF (2014b), as the flagship institution in the state and one of the statutorily-designated

preeminent institutions, discussed how the preeminence legislation authorized the

university to deviate from the general education curriculum that was common to all of its

peer universities in the state system. The university explained its initiative to develop a

“signature UF experience that serves to introduce students to important subject matter

and that provides a common student experience to help the freshman class to bond,”

starting with a “humanities course ‘What is the Good Life?’” that UF delivered to all


members of its freshmen class (p. 5). Such an initiative conflicted with the standardized approach of the state’s PBF policy, and humanities were not a component of the strategic areas of emphasis.

Accordingly, UF exerted its privilege: it did not have to focus on the same support system initiatives, largely because it was already “preeminent” and enrolled high-ability students who did not require the same level of intervention to succeed academically or in their careers. Instead, UF was able to enrich its curriculum with the same sorts of liberal education offerings that could have led other universities to struggle under the state’s PBF policy implementation.

Statement of a problem

Throughout the accountability reports and work plans, each university attempted to articulate any challenges it foresaw in the form of a statement of a problem. While many of these concerns were financial in nature, they were sometimes more complicated and reflected the diversity of narrative elements found in prior sections, such as moral of the story (which made sense, as that particular element could have just as easily been coded as statement of a solution). While some universities engaged the PBF policy directly and articulated their problems in terms of the academic and career success of their students, others took an institutional approach that only addressed the PBF policy on the most basic, surface level.

The flagship institutions focused mostly on limited resources and the associated limited capacity to hire new faculty, but they fell short of directly engaging topics such as retaining and graduating students. For example, FSU (2016b) clearly noted that “our most immediate need is to hire additional faculty, and to that end, we will use recurring and


non-recurring funds to invest in new faculty, replacing the non-recurring funds with recurring revenue when it becomes available” (p. 5). In fact, this concern was not necessarily even about the need for faculty themselves so much as it was about a need for a sustainable funding infrastructure.

UF (2017b) at least attempted to integrate these concerns, as it established financial capacity as its number one challenge and plainly stated,

Resources: the ability to compensate exceptional faculty at nationally competitive

levels so UF can attract and retain them, the ability to provide nationally

competitive graduate stipends to attract top-notch students, the money needed to

refresh and rebuild an aging infrastructure, and the ability to provide need-based

student financial aid to undergraduates to afford them access to higher education.

(p. 5)

In the above statement of a problem, UF sketched out the complex relationships between funding levels and the services needed to retain and graduate students. The university, though, explicitly looked past the PBF policy and into broader conversations regarding national competitiveness, essentially playing an entirely different ballgame than the rest of the schools in the state’s system. This was a comprehensive articulation of the concerns the university had, and it connected the financial restrictions to another layer of concern – students themselves.

In contrast, USF (2015b) conflated these challenges from the beginning, but what remained constant in both situations was an integrated concern of serving students amid a tight budgetary environment. The university explained that it saw challenges with

“maintaining momentum in student success and institutional quality with limited


resources, as the university is working to increase budgetary efficiencies and hold down

costs for students” (p. 5). The issue of financial restrictions for students was the priority, which subsequently resulted in limited resources for the university.

Over the time period during which the state implemented its PBF policy, some

universities saw their statement of a problem evolve. For example, FGCU (2012b) first

noted its “historic underfunding” and elaborated on a record loss of capital infrastructure

funding (2013b, p. 5). These financial limitations did not explicitly relate to challenges

that were more directly germane to academic or career success. Similarly, FGCU (2014b) acknowledged difficulties related to the institution’s “relative youth and enhancing our visibility and reputation” (p. 5). Ultimately, the university admitted a need to focus on

“improving the four-year graduation rate” (2016b, p. 7). The university’s perspective

shifted from outwards, regarding reputation, to inwards, regarding its own responsibility

to graduate students in a timely manner. As the PBF policy took effect, FGCU’s

understanding of its own challenges also appeared to evolve.

In a comparable manner, FAU moved from a generic focus on academic success

for its students to a more targeted understanding of the role the university played in

setting the stage for academic success. Early in the adoption of the state’s PBF policy,

FAU (2013a) admitted to needing to focus on “student retention and graduation rates,”

but largely glossed over any strategic or reflexive investigation in the public documents

(p. 5). Eventually, though, FAU (2017b) took its statement of a problem a level deeper when it noted – at least in terms of “improving graduation rates, academic progress

rates, and reducing time-to-degree” – that “one of our biggest challenges for student

success is the high proportion of undergraduates who are enrolled part-time” (p. 6). The


statement thus became more actionable, more analytical, and ultimately more centered on

the measurable success of students, and the factors that promoted improvements in those

measures, as opposed to a surface-level admission of concern.

Similar self-reflection took place at UWF, which framed its own statement of a problem in terms of inputs – meaning the academic records and arguable preparedness of incoming students. Initially, UWF (2013a) explained how the university had

“experienced some variability in student admission metrics (above target on average high school GPA and below target on average SAT score) and fell short on targets for student persistence and success rates” (p. 7). Not only did the university enroll students with lower “admission metrics” in terms of inputs, but it also failed to increase its measures of outcomes. As UWF (2014a) later remarked, “A challenge for any public institution of higher education is to determine the level of access it can provide within the context of its mission and available resources” (p. 7). This introspection was perhaps one of the most honest forms of a statement of a problem regarding the PBF policy, which, as noted previously, tended to reward institutions that already admitted students who were likely to succeed, thus generating a sustained self-fulfilling prophecy through which high-performing institutions could have become higher-performing institutions. The level of access for each university thus was a component of the statement of a problem itself.

Institutions engaged this phenomenon with a few different approaches. UCF

(2016b) continued to grow through a “high transfer population,” which contributed to its overwhelming enrollment size, “resulting in a greater proportion of major-specific course offerings that are more costly than general education course work” (p. 5). This complicated statement of a problem noted the constraints but admitted a continued need


to cater to transfer students. Because these students were not members of traditional first-time-in-college fall cohorts, as defined by the federal Integrated Postsecondary Education Data System (IPEDS), the PBF policy judged transfer students less harshly and excluded them from a number of the most variable metrics (i.e. academic progress rate and graduation rate). The concern UCF expressed was thus even broader than the PBF policy and reflected a true commitment to promoting academic success – for all of its students.

FIU (2015a) expressed similar concerns. The university commented that it was

“committed to increasing access to higher education and successful degree completion for our students who are typically underrepresented in degrees at all levels of higher education” (p. 6). Unlike the above UCF example, FIU directly benefited from these dual commitments – to access (perhaps through Pell eligible enrollment growth) and to degrees awarded to minorities (its university Board of Trustees’ choice metric) – at least in the frame of the state’s PBF policy. Ultimately, though, FIU (2015b) elaborated, noting that “ineffective pedagogy in gateway courses is a significant barrier to student success” (p. 6). This authentic reflexivity was not directly measured and showcased a commitment to improving the quality of instruction provided to students, beyond anything measured in the PBF policy or other rankings mechanisms. FIU’s statement of a problem was definitively more than surface-level in terms of enrollment figures for particular demographics, whether socioeconomic or racial, and focused all the way down to the student experiential level. The university dug down into the next layer.

Of all the narrative elements, one of the most honest comments was the statement of a problem from NCF (2014a), which outlined the communal barriers that its students


experienced, barriers that prevented personal engagement with the campus’ academic

environment. The college explained, “Social reasons were cited as the reason for withdrawal (42%) four times as frequently as academic reasons (11%)” (p. 7). Further, the institution surmised that “first year students need to make strong academic connections” (2014b, p. 6). As a result of this barrier to engagement, students suffered from lower-than-expected retention and graduation rates.

Such a reflective admission of an institutional shortcoming did not occur

frequently throughout the public documents, and perhaps never in this manner. The

college’s statement of a problem thus centered around students who were doing well

academically but were not otherwise able to connect with their campus community in

order to actively engage in the living-learning experience. This type of self-analysis should ultimately have led to enhanced levels of academic and career success for students, both measurable in the PBF policy and otherwise, so long as appropriate follow-up actions occurred to address these concerns.

Angel shift

In terms of narrative strategy, the universities deployed a number of mechanisms

in their accountability reports and their work plans. The angel shift strategy typically

involved universities highlighting their successful roles as heroes relative to non-heroes

(not technically villains). Merely championing students was not sufficient for coding as

angel shift, unless some reference to outperforming other entities accompanied the

description of valiance.

Overwhelmingly, the most common angel shift revolved around post-graduation metrics in the PBF policy, both in terms of career or educational placement rates and in


terms of salaries. Many universities made claims about their comparable successes in these areas, showcasing a narrative alignment with the PBF policy and the BOG system regarding the role of universities in contributing to the state’s workforce. See Table 4 below for a more extensive report of these claims.

Table 4. Angel shift references to post-graduation outcomes

FAU: highest percentage / second highest of employment metrics (2014b); ranked first in job placement (2015a); highest rates (2016a)

FGCU: among the leaders (2012a); among the highest rates (2013a); consistently high ranking (2014a); consistently among leaders (2015b); consistently places among the top institutions, fourth (2016a); first (2017a)

FIU: high paying jobs (2012b); among the best employment metrics in the system (2013b); higher salaries and employed at higher rates (2016a); graduates demand higher salaries (2016b)

NCF: significantly greater percentage of alumni with doctorates (2015a)

UNF: highest percentage, among the three highest (2014a); tops all schools (2017a)

USF: top producer (2014a); post-graduation success (2014b); high percentage of graduates employed, high wages (2015a); top performer (2015b); maintained leadership (2016a); top performer (2016b); highly ranked, leader (2017a); continue to lead (2017b)


Many of these references overlapped; multiple universities simultaneously noted

their institutional triumph over other peer universities. Perhaps institutions felt a need to stand out comparatively in post-graduation success because it served as the most measurable return-on-investment for the collegiate experience. The angel shift narrative

strategy reflected a sense of superiority (bordering on elitism), but the irony was that half

of the institutions made these same claims. If everyone claimed to be the best, then recognizing the true “top performer” became difficult. The fact that so many claimed to be “highly ranked” in these outcomes diluted the actual outcomes of the PBF policy.

Interestingly, the flagship institutions of UF and FSU did not implement the angel shift narrative strategy for post-graduation outcomes. This made sense, as regional and metropolitan universities could have very well benefited from geographic proximity to job market openings. Additionally, universities with higher percentages of lower income students might have had a higher percentage of students who were previously employed during their enrollment in school (i.e. working-class students) and then leveraged their prior working experience towards gainful employment after they graduated.

An additional remarkable outlier existed in the use of the angel shift narrative strategy among the universities. FGCU (2013a) made a total of eight separate references to comparative ranking in a single year, prior to the 2014 introduction of the ten-metric PBF policy. These comments largely focused on comparative growth rate, as the young institution had the fastest-growing enrollments of any university in the state system at that time, and FGCU’s degree productivity was also noteworthy for having the fastest growth rate.


These particular feats might have served the university well in another state’s PBF

policy if the model rewarded outputs and productivity in terms of raw degrees awarded

and raw enrollments. Unfortunately, it did not align with the ten metrics in the state’s actual PBF policy. Once the state adopted its PBF policy, the university avoided discussing growth, as did other universities. Moving forward, the angel shift narrative strategy steered away from outputs and towards outcomes, such as academic progression rates, timely degree completion rates, and, as noted above, overwhelmingly, post-graduation student success.

Containment

When faced with limited success, or otherwise negatively impacted by the PBF policy, the universities used the containment narrative strategy to dissuade additional supporters of the policy and to maintain the status quo of the policy environment.

Instances of containment were largely limited to appearing early in the development of the PBF policy – mostly in 2013 – prior to the first full year of implementation the next year. Even during the initial conversations regarding the development of the metrics and the understanding that the PBF policy was imminent, FGCU (2012a) quipped that “the university’s progress cannot be easily marked by traditional measures alone, but more significantly in the way it transforms the lives of its students and the region” (p. 6).

FGCU (2013a) elaborated on this argument, further stating, “Given the relative youth of the institution, the success of the university cannot be simply gauged in terms of absolute numbers of degrees conferred or in graduation rates” (p. 5). The idea of the immeasurable outcome, while significant and likely worth further consideration, conflicted with the accountability efforts of the PBF policy.


The university thus directly countered the philosophy behind the metrics and

attempted to stifle discussion regarding the BOG system’s completion agenda.

Apparently, producing graduates more efficiently was not an explicit priority for FGCU.

Furthermore, the university made no attempt to offer an alternative to the methodology of

the state’s PBF policy, or to constructively criticize the particular metrics with prescriptive recommendations for more appropriate key performance indicators. Instead,

FGCU offered the clearest and most obvious form of the containment narrative strategy

out of all the universities’ public documents included within this analysis.

Based on later comments in a subsequent work plan, the efficacy of FGCU’s

strategy appeared to be mostly limited. The university issued an executive summary that

it embedded within the document as a particularly defensive preface to the university’s

outcomes in the PBF policy. FGCU (2015b) explained how the university’s final scores

in that year’s results of the PBF model, “while overwhelmingly positive,” also “lack

important context” (p. 2). To provide a more specific example of this lack of context, the

university expressed concerns about one of the key performance indicators in the work plan document. While the PBF policy in 2015 included cost to the institution as a metric,

FGCU critiqued the work plan’s use of the cost to the student, which had not yet made it

into the model. The BOG added the cost to the student as a replacement metric in the

PBF policy in 2016. At the time FGCU expressed these concerns, though, “the Cost of

Attendance” was not the true measure of financial burdens to actual students, the

university argued, but rather “a hypothetical representation of expenses that may or may

not be incurred by a particular student” (p. 3). The university offered an alternative

interpretation of the document and its metrics, which was notably defensive in nature.


By articulating methodological concerns in advance, through an executive

summary that supplemented the standardized work plan template, FGCU delegitimized

the key performance indicator at hand. The university pointed out that the measure did

not account for financial aid contributions that covered these student costs, but that would

have reflected a different measure – a net cost – rather than the actual gross statement of

total charges that the BOG system had included in the work plan template. Despite the

institution’s comments that explained away the heightened costs to students, all

universities were subject to the same formula to calculate their expenses. The document

presented these costs in an itemized manner. The extra explanation of the metric was

accordingly superfluous and unnecessarily defensive.

Anyone who reviewed all of the various universities’ data would have been concerned only if FGCU’s costs were out of line with the others. Even then, the university’s comparatively inadequate capacity to award grants and scholarships to students, thus reducing the net cost, would have highlighted an important funding disparity among universities. FGCU did not present such an argument, instead limiting the conversation to methodological concerns about the metric in question. The university thus attempted to contain the PBF policy conversation, knowing that the metrics of the model would be subject to changes in future years and that the commentary could constrain the policy’s development.

NCF (2013a) had used a similar approach two years prior, but at that point, the

BOG system was still using cost to the institution in the PBF policy. The college sought

to differentiate its relatively expensive offerings from the rest of the BOG system because

it was the only institution that focused on a traditional liberal arts and sciences


experience. With a higher number of faculty members for every student, and with more

operational funding for facilities and support services per every student, “Instructional

Cost per FTE Appropriated funding per actual student at New College of Florida is high

in comparison to the other SUS institutions, but similar to more costly specialized

programs within the Florida universities” (p. 12). The college was remarkably expensive,

but the point NCF made here was that other universities did offer similarly expensive

degree programs. Unfortunately for the college, those other universities were also

saturated with less expensive degree programs that would balance the overall

instructional costs in the end.

Unlike the FGCU example, in which institutional costs to the student would have been compared apples to apples (in that the same methodology and context, as a general rule, applied to all universities through a standard work plan template), NCF’s example of costs to the student would have been a comparison of apples to oranges. The different containment attempts showcased that not all uses of this narrative strategy were equal.

Devil shift

Unlike some of the other narrative strategies, institutions infrequently used the

devil shift. The hesitancy to do so reflected the risk that universities would appear weakened by asserting that other entities were better prepared to address performance concerns than the universities themselves. In essence, this strategy served as an attempt to point fingers and claim that someone else was responsible for the universities’ woes.

The same two universities that relied on containment of the PBF policy also relied

on the devil shift in regards to increased performance within the framework of the policy,


so apparently the narrative strategies were compatible. For example, NCF (2012b)

articulated concerns regarding the methods with which the institution evaluated academic

progress, claiming it would “work with the BOG to resolve any friction between the core

features of our program (narrative evaluations, senior thesis, and academic contracts) and

federal regulations/state statutes based on GPA and credit hours associated with more

traditional academic programs” (p. 5). This happened before the advent of the PBF policy

in Florida. The college recognized a need to standardize as the state system

simultaneously moved more towards standardized forms of institutional performance

review. Still, the college looked to the governing board and the state and federal

regulatory environment to guide the changes to prepare for the introduction of a PBF

policy. This strategy was in no way spiteful or negative, but it did defer to greater powers

to resolve the situation, thus exhibiting a devil shift of minimal proportions.

In contrast, FGCU (2015b) blamed the governing board for the restrictions

associated with the standardized templates of the annual work plan document. It claimed

that the document “and the content” are “the conception of the Board of Governors

(BOG) and features the addition this year of a section that reflects each university’s

performance funding metrics with trend data and proposed goals (by the university’s

administration) for the next three years” (p. 2). Unlike the NCF example above, which

pointed to a need for the BOG system to resolve conflicts with unique institutional

characteristics, FGCU failed to recognize that every other institution also received the

same document, admittedly with pre-populated content in the quantitative sections.

In the case of FGCU, the university could not highlight a particular disadvantage it suffered, or in any way explain why it was disproportionately impacted by the template


and the metrics. Had the university relied on a unique institutional mission or a special

regional policy environment for academic or career success, the situation would have

been different. It did not point to anything special about itself that would have explained

why it would have suffered from a standardized work plan template. Instead, FGCU

highlighted the state’s attempts at standardization as a limitation, which may have been true, though it would have been equally limiting for all of the state’s universities. While the devil shift narrative strategy was used infrequently, it did provide some insights into how institutions chose to embrace or reject their own roles in PBF policy implementation.

Expansion

Expansion was a popular narrative strategy; universities relied on it to include more stakeholders and to spread the responsibility for institutional performance within the PBF policy. The universities often highlighted partnerships according to their own role within the state. For example, traditionally regional universities chose to acknowledge the importance of relationships with community colleges and state colleges, whereas research universities focused on engagement with other universities in the state and nationally.

Either way, the impact was the inclusion of partners jointly invested in the academic and career success of students.

In terms of regional universities, the most common use of the expansion narrative strategy was to focus on their roles in economic development and career preparation.

For instance, FAU (2014b) described the “creation of an accelerated pipeline for students in computer science and computer engineering involving FAU, Broward College, Palm

Beach State College and over 30 companies in our service region” (p. 9). In addition to homing in on strategic areas of emphasis like computer science and engineering, the


university looped in both state college partners and corporate relationships. UNF (2014b)

and UWF (2015a) did the same with local partners, but largely in terms of developing

pathways from high schools to community colleges to university-level education. NCF

(2016b) did the same, except with the local fine arts schools that fueled the Sarasota area’s arts-based economy. The college termed the collaboration the College Consortium of the

Creative Coast. In all instances, co-investment in the community became particularly clear, which empowered the universities as economic engines in their regions. These institutions were able to share the responsibility for the career outcomes of their students.

Similarly, members of the Florida Consortium of Metropolitan Research

Universities included FIU, UCF, and USF. According to FIU (2015b), the mission of the collaboration was to “drive economic development by creating synergies and efficiencies among the state’s three largest metropolitan public research universities” (p. 6). As universities that each developed in the major cities of Miami, Orlando, and Tampa, respectively, these institutions played a major role in urban environments by providing broad access to higher education and preparing students to enter the local workforces. In terms of how this collaboration would have resulted in such efficiencies, USF (2014b) explained that the consortium made it possible to “share best student success practices and leverage the unique strengths as large, diverse, metropolitan universities” (p. 5).

Rather than venture out on their own, these universities leveraged their partnerships to expand the responsibility for promoting academic and career success throughout three major regions, representing service areas that covered at least half of the state’s population. These metropolitan universities served as one of the few examples of public universities in Florida sharing capacity to enhance their outcomes in the PBF policy.


Perhaps the most intriguing finding in this section of the analysis was how

relatively uncommon it was for institutions to collaborate and work together to improve

academic and career success throughout the state. The competitive nature of the PBF policy, with a focus on a top three and a bottom three, might have stifled the sharing of best practices. With the exception of the metropolitan universities mentioned above, the universities did not laud the role of their partner public universities in the state. Similarly, they did not focus on their own roles in sharing best practices and enhancing other institutions. Such a lack of examples of collaboration within the state might have been

indicative of a systemic flaw in the PBF policy, worthy of some additional exploration

and consideration.

With the primary role as a national research university, the state’s flagship

institution highlighted its partnership with universities outside of Florida. UF (2015a)

noted, “In order to further secure a foothold in the online marketplace and to enhance

program quality, UF and several other universities created and capitalized Unizin” (p. 6).

The institution then went on to list the names of the many major national universities that

joined this collaborative. As such, this tool represented a joint opportunity for institutions

from several states to promote learning outcomes in online education. UF leveraged

national partnerships to promote its own legitimacy through the legitimacy of these other

national brands of universities. Simultaneously, the institution established itself as a

leader in this movement, as one of the founders of this collaborative, Unizin. Clearly,

different types of universities chose to engage different types of partnerships, and they

were all able to successfully promote expansion of responsibility for the improved

performance in the PBF policy and the academic and career success of their students.


Policy beliefs

Each of the public universities in the state of Florida had its own unique set of policy beliefs that guided all of its narrative elements, strategies, and overarching understanding of the role of the PBF policy in the daily lives of its faculty members, staff members, and students. In most instances, these policy beliefs were so broad that

they were compatible with the PBF policy and inclusive of many shared strategic aims of

the BOG system and the other state universities themselves. In other examples, the policy

beliefs were so different that they potentially conflicted with the more explicit aspirations

of the PBF policy.

In terms of unique strategic approaches to the PBF policy, a few of the

universities stood out with less generic policy beliefs. NCF (2016b) focused on its role to

“offer liberal arts education of the highest quality in the context of a small, residential public honors college with a distinctive academic program” (p. 4). The focus here was on the student as an individual, which was admittedly not essential to the PBF policy, with

its focus on cumulative outcomes (e.g. more efficient averages and enhanced proportions

of science and technology students) – as opposed to individualized educational

experiences in the liberal arts.

Conversely, Florida Polytechnic (2017b) attempted to “prepare 21st century learners in advanced fields of science, technology, engineering, and mathematics (STEM) to become innovative problem-solvers and high-tech professionals” (p. 4). This description, unlike the one that NCF provided, more explicitly aligned with the PBF policy’s focus on science and technology, and the policy beliefs reflected an overt focus on successful career outcomes.

Some institutions viewed themselves largely outside the lens of the PBF policy.

FAMU served as an example of a university with a distinctive set of policy beliefs. The university regularly noted its roots as “an 1890 land-grant institution,” and,

“while the University continues its historic mission of educating African Americans,

FAMU embraces persons of all races, ethnic origins and nationalities as life-long members of the university community” (p. 4). Such a historical role did not clearly overlap with the core values indicative of the PBF policy. Instead, a set of historical policy beliefs led the university to continue in the direction of its original mission of promoting access to traditionally underserved populations, and to developing intercultural dialogues regarding education and many other topics. While FAMU’s historical mission might not have been in conflict with the PBF policy, it was not clearly focused on similar topics or policy beliefs.

Relatedly, some other universities described their policy beliefs in line with their mission to provide access to higher education to their geographic area. As was the case with UCF and FIU, some universities connected these core positioning statements directly to the locality of their overall impact. UCF (2017b) described itself as “a public multi-campus, metropolitan research university that stands for opportunity” and “anchors the Central Florida city-state in meeting its economic, cultural, intellectual, environmental, and societal needs” (p. 4). FIU (2014a) claimed that it was “committed to increasing access to higher education and successful degree completion for our students who are typically underrepresented in degrees at all levels of higher education” (p. 6). As two of the largest public universities in the nation, these institutions absolutely embodied a

commitment to access, but the extent to which their policy beliefs aligned with the PBF

policy was questionable.

As noted previously, the model did not reward the volume of raw degrees

awarded or incentivize growth in headcount or student credit hour enrollments. UCF

believed that its service to the “city-state” of Central Florida resulted in it being an

“anchor” for the region. More explicitly, FIU spoke directly to the completion agenda

and efficient graduation rates when it noted that it was committed to “successful degree

completion.” Unfortunately, the extent to which the PBF policy valued service to underrepresented populations was uncertain, as articulated in Chapter 2. Both of these examples showcased universities’ commitment to enhancing their communities through not just economic contributions but also through cultural empowerment.

While perhaps similar in substance, additional universities were less clear about the role they played in their geographic regions, although those local connections were absolutely key to their policy beliefs. In the case of FGCU (2016b), the university stated that it awarded “undergraduate and graduate degree programs of strategic importance to Southwest Florida and beyond” (p. 4). Similarly, UWF (2016b) referred to itself as

“based in Northwest Florida with multiple instructional sites and a strong virtual presence” (p. 4). In this second instance, UWF commented on its online program offerings and its many campuses and sites, meaning that the accessibility of its educational opportunities was key to the university’s understanding of itself. Perhaps

UWF’s virtual presence was important due to connectivity with the local military institutions and the mobility of those student populations. In both of these situations, the institutions vested themselves within their localities.

While not quite the same policy beliefs as UCF and FIU, which were more rooted

in the sheer volume of their enrollments, FGCU and UWF viewed themselves as essential

to regional development. These local connections, while essential to the universities,

were not clearly aligned with the PBF policy.

Overview of Analysis Results

The above analysis explored the topics of institutional culture and values as thoroughly as possible. The overarching narratives that the public documents told, both in terms of the policymakers at the BOG system and the implementers at the individual universities, followed the same two themes as stated previously: the promotion of academic success and the assurance of career success for students. Some particularly

striking findings from the BOG system involved the proposition that higher education’s

primary function is to fuel job development. Policymakers saw their responsibility as

focusing on workforce growth and the creation of high-paying jobs. In contrast, the

policy implementers at the universities supported the ideas of liberal education and the

advancement of knowledge for the betterment of their communities and regions. This

represented the most obvious split between the two groups of stakeholders, and it resulted

in at-times unhealthy relationships between them. The next chapter offers a careful look

into the other results of this analysis.

CHAPTER 7. DISCUSSION AND CONCLUSIONS

The analysis revealed a number of recurring themes in the narrative elements,

narrative strategies, and the embedded policy beliefs of both the BOG system and the

universities themselves. In this final chapter, a brief discussion of the analytical findings

prefaces the conclusions that were drawn. Returning to the research questions posed in

the introduction, the results of the analysis depict a very complex and multifaceted policy

environment for Florida’s PBF policy.

Discussion

Even in public documents that did not appear to have the most explicit

opportunities for the development of policy narratives, such as the state of Florida’s

annual accountability plans and work plans, the stakeholders revealed their narrative perspectives on the PBF policy. Overall, the public documents also served as a tool with which the stakeholders could refine and showcase these narratives, and ultimately, their core values and policy priorities.

This dissertation attempted to organize and analyze the narrative elements and strategies for each set of stakeholders, with the ambition to develop overarching policy narratives regarding the adoption and implementation of the PBF policy in Florida.

First, in the simplest of terms, the BOG system stakeholders, as policymakers, viewed

politicians and political appointees as the heroes of this story who had the corporate

acumen to improve outcomes in higher education. These leaders positioned themselves as

adversaries of ineffectual, sluggish universities within the state, which were in dire need

of change to enhance their performance. BOG system stakeholders used the causal

mechanism of the PBF policy and its metrics to showcase their own valor. They pointed

to turnaround stories at institutions like UWF and FAU as evidence of angel shifting and sought to validate their model through expansion by claiming to have engaged universities in the development of the PBF policy.

At the same time, these leaders sought the containment of further adjustments of their policy design, ensuring that the PBF policy only recognized and rewarded a limited threshold of unique characteristics, regardless of institution-level missions and values.

The BOG system offered a standardized moral of the story that sought to leverage the state’s public universities as engines of workforce development. These state-level policymakers contributed minimal plot in terms of the specific initiatives that the BOG system might coordinate at a macro level, instead leaving universities the responsibility to fuel most of the plot. The overarching narrative for these policymakers involved ambitious political leaders installing a blanket PBF policy, rooted in market-based reform, to kickstart all of the state’s universities to increase efficiency.

Then, the PBF policy implementers at the individual universities defied a singular policy narrative. Importantly, they did all appear to experience a shift in narrative, evolving from an initial attitude of complaints about funding issues to an attitude of silence in terms of containment of the policy. Eventually, they all made themselves out to be the heroes for their students. These universities all described different statements of the problem and morals of the story, depending on their various institutional contexts.

Additionally, they proudly described causal mechanisms to address how interventions

functioned to promote academic and career success, and they also described the specific

initiatives they launched as their evolving plots, with the evidence (setting) to depict the

improvements they experienced with their newfound analytical approaches.

While the universities all represented very different institutional missions and

core values, and they all espoused different narratives, the introduction of PBF brought

on an observable, simultaneous shift in their narratives. In other words, they all

responded to the PBF policy differently – but they all appeared to respond, at least.

Conclusions

Returning to the research questions that initiated this dissertation, the NPF

analysis herein strives to understand the basic components of the narratives that have

driven PBF policy development in the state of Florida. NPF provided an accessible tool

for analyzing those basic structural components at hand – exploring stakeholders, their

understandings of the PBF policy, and how their unique perspectives and values shape and guide these understandings. Standard narrative elements and strategies call for reflection on multiple interpretations, but NPF also tames inclinations toward more flexible interpretive analyses.

Research Question #1.

Who are the stakeholders involved in the formulation and implementation of PBF policies, and how do they view these types of accountability models in the public higher education systems?

In terms of who the stakeholders were at the state’s PBF policymaker level, the foremost revelation from the overview of BOG system documents was their nearly universal focus on corporatization. Proponents of the policy included leaders who had

come from the business world and those who supported the business world. For instance,

the state’s governor, Rick Scott, a multimillionaire with a health care industry

background, aligned with the introduction of a former telecommunications executive as

the BOG system chancellor, Marshall Criser III, as well as the business perspectives of

the political appointees who served on the BOG itself. As a result, the stakeholders from

the BOG system side included advocates of market-based reforms.

The PBF policy thus reflected a focus on strategic areas of emphasis and efficient graduation outcomes, favoring students who progress quickly with minimal “excess” credit hours. The metrics of the PBF policy accordingly harkened to a process management approach, much like the intake and outtake processes of a hospital management system or a customer retention strategy of a telecommunications firm.

While the BOG system rooted its views on the PBF policy through a corporate lens, the topics of academic integrity and programmatic quality did not appear throughout the policy narratives.

In contrast, the universities engaged the PBF policy in their roles as stakeholders guided by both historical, explicit missions and less obvious, underlying core values. They produced their own narrative elements and narrative strategies in line with this institutional context. National research universities often shared policy narratives regarding academic and career success that clearly aligned with the PBF policy. In those instances, the university-level stakeholders saw these accountability models as redundant.

This aligns with Hagood (2019), who found that the most well-funded universities were

the most likely to succeed. As a result, they did not need to consider PBF much at all.

They could continue to do what they were already doing and thrive in the model.

Alternatively, niche institutions such as New College and FAMU often suffered

as a result of misalignment with the policy, and they produced policy narratives that showcased the disparate intentions of the BOG system and their own institutions. The

PBF policy did not necessarily influence the identity of the universities, but the elements of the policy definitely catalyzed strategic actions on particular occasions while hindering and stifling traditional approaches in other scenarios. In other words, these universities viewed the PBF policy in a serious enough fashion that they highlighted their efforts to comply and thrive in the model. For example, New College noted communal barriers, such as its small enrollment size, that prevented its students from engaging in traditional collegiate activities that might promote retention of first-year students. Notably, they did not openly critique the model. Instead, they merely divulged their incompatibility, suggesting that they believed accountability measures painted with too broad a brush.

Prior narrative studies suggested that stakeholders indeed tend to shift their policy narratives, as described above (McBeth, Lybecker, & Stoutenborough, 2016).

Stakeholders initiated the shift based on the other stakeholders who were the targeted audience of the narratives (Kirkpatrick & Stoutenborough, 2018). Overall, institutions absolutely shifted in their tones and in their strategic priorities as a result of the advent of the state’s

PBF policy and increased scrutiny from the BOG system.

Research Question #2.

What do PBF policymakers within the university system and university-level administrators perceive to be the consequences of these efforts?

Perhaps the most telling component of this narrative analysis was the establishment of the character-based narrative elements, as well as the ways in which

these characters developed over time. The stakeholders’ conceptualization of the hero,

the victim, and the villain reflected organizational perspectives regarding what was

important and what was the best way to support those priorities. In some situations, these

were unique perspectives that created conflict within the PBF policy implementation, and

in others, the views were overlapping. For the most part, there was a commonality

between the BOG system and the individual universities regarding what constituted good

actions, interventions, and metrics. Likewise, the stakeholders aligned in how they chose

to deploy narrative strategies to discuss them. Apparently, the PBF policy accomplished

its goals of redirecting the attention and efforts of the BOG system and the universities

themselves. University-level stakeholders essentially began to parrot the BOG system,

coopting its narratives about the heroic nature of the PBF policy and its creators.

As noted in the analysis, though, there were instances in which various policy stakeholders had interpreted the same interventions and metrics differently. Instances in

which these conceptualizations conflicted remained key, depicting the working

relationships between the stakeholders and, of utmost importance to this particular

dissertation, their understandings of the PBF policy itself. Stakeholder perspectives also

highlighted the PBF policy’s perceived reach, impact on measurable outcomes, and levels

of acceptance or resistance.

At both the BOG system and the individual institutional levels, the stakeholders

tended to make themselves out to be heroes, leading to the academic and career successes

of their students. This of course accounted for the high volume of coding instances

related to the hero narrative element throughout the public documents included in the

analysis, as noted in the summary table. What was particularly important was how all of

the stakeholders chose to highlight different contributions of each hero. The PBF policy

was central to a number of these heroes, though their interactions with the PBF policy

were different according to the core values and priorities of the stakeholders producing

the narrative elements and strategies.

For the most part, the BOG system lauded specific individuals who championed

the causes of the PBF policy on behalf of students and their families (i.e. the taxpayers

within the state of Florida). The BOG system, as the policymaker and the party

responsible for ensuring the maintenance and growth of the PBF policy, viewed

themselves (and specific state-wide politicians and political appointees) as the hero – the catalyst of change. In contrast, the universities focused more broadly on organizational valor, and the focus of that valor drastically transitioned with the development and implementation of the PBF policy. It was individual universities themselves that showcased different levels of compliance and direct support for the metrics and the state’s goals to enhance outcomes.

The type of hero developed throughout the model. Initially, universities tended to promote efficiency in terms of their roles as good stewards of public funding. This likely followed the lead of the BOG system. Ultimately, though, universities moved more towards focusing on their own efforts to promote efficient progression towards timely graduation, directly in line with the PBF policy. This too followed the lead of the BOG system. Accordingly, the policy had real-world operational impacts on the universities.

Throughout the implementation of the PBF policy, the universities also enhanced the ways in which they documented their heroics and provided an increasing amount of plot in the form of supporting evidence for institutional valor. Institutions gradually

shifted away from the broadest of generalizations regarding their support of academic and

career success. Instead, they provided specific pieces of evidence regarding their efforts

to intervene with students who were not otherwise likely to be retained, graduated, or

professionally successful.

The increased focus on plot supported the PBF policy, showcasing a shift in

operations, or at least, in the policy narratives in which the universities depicted

themselves as compliant with the policy and having measurably significant impacts on

the “boots on the ground” stakeholders who were supporting students. This explained the

disparity in the frequency of plot in the BOG system versus university-level documents.

Institutions began to showcase work through plot in an effort to craft a story in which they were thriving in the PBF policy. The BOG system did not need to produce plot, as they viewed this primarily as a successful top-down implementation.

Research Question #3.

Do the overarching narratives of the state university system policymakers and the university-level policy implementers reflect competing interests and values?

Lastly, the outlier narrative elements and strategies outlined examples of conflict between the BOG system and the universities. These stories of struggle and the characterization of villains and victims were infrequent, but they were compelling in the way that they highlighted deep-rooted differences in priorities between the universities and the BOG system. Some universities, like FGCU, were explicit in their blame of the state leaders at the BOG system for not recognizing their institutional success. Other universities were underhanded in devil shifts, willing to showcase their own shortcomings only in contrast with the strength in capacity of the BOG system.

Foremost of the differences was a focus on promoting access to higher education through expanded service to socioeconomically disadvantaged students and historically underserved populations, including minority students. The BOG system did not necessarily highlight such concerns of access, but individual universities did. As a result, the PBF policy eventually engaged the topic but did not provide outlets to proportionately reward access or social mobility in the same way that it rewarded efficiency in terms of timely graduation and academic success.

Amongst fellow stakeholders at the institutional level (i.e. at different universities), particularly telling was the manner in which universities collaborated with one another through the leveraging of the expansion narrative element. They seemed most willing to engage with their communities and local state colleges, and indeed a handful of instances showcased collaborations between universities in and out of Florida.

Unfortunately, the competitive element of the PBF policy potentially seemed to stifle collaborations through the sharing of best practices with one another, particularly in regards to the topics of academic and student success. This is not to say that universities did not share their work with one another, but by no means was it a central theme or overarching focus for universities. The introduction of competition into the funding mechanism for the state resulted in infrequent opportunities for expansion of responsibilities among the individual universities themselves.

Theoretical Implications

The basic premise behind a narrative theoretical framework is that narratives drive the public policy process. With this in mind, stakeholders can promote particular stories, while deemphasizing others, in an effort to influence policy outcomes. As Stone

(1989) posited, understanding the role of stories in driving the public policy process can

also result in an ethical obligation for policy stakeholders. These stories “implicitly call

for a redistribution of power by demanding that causal agents cease producing harm and

by suggesting the types of people who should be entrusted with reform” (p. 300).

Likewise, policy narratives “can restructure political alliances by creating common

categories of victims” (p. 300). The resulting implication for theory in the case of

Florida’s PBF policy is that the BOG system has an obligation to the individual

universities and their unique missions, particularly in terms of providing

underrepresented minorities and socioeconomically disadvantaged students with access to

higher education. Rather than steering the narratives to reify the broad-stroke successes of the state’s flagship institutions, the BOG system should redistribute its power. If it fails to do so, the universities could potentially regroup politically by highlighting commonalities in their victimization, which could then further influence the shape of the

PBF policy.

Practical Implications

The findings of this dissertation could hopefully lead to enhancement in the PBF

policy and its implementation. In particular, there were opportunities to reconcile the

disparate perspectives of the BOG system and that of individual universities. Based on

the narrative elements and strategies outlined within this dissertation, the easiest way to

accomplish this reconciliation would be to add more institution-specific metrics and for the BOG system to more readily acknowledge the unique missions and characteristics of the universities through additional modifications to the PBF policy.

Specifically, the BOG system could adjust its own choice metric #9 to be more reflective

of institutional missions, which would double the impact of the BOT choice metric #10.

There was no question that all stakeholders sought to produce more graduates

who would contribute to the state’s workforce development. The BOG system and the

universities had this in common. The PBF policy, with its focus on efficiency, lent itself

towards promoting the academic and career success of those who were already prepared

for such success. In other words, students who came from a background of significant

financial means and the associated privilege were likely to go to the already high-performing institutions, such as the University of Florida. Nowhere in the overarching

policy narrative did the BOG system focus on enhancing the state’s higher education

capacity to aid its citizenry through social mobility. A broader focus on outputs, such as

raw numbers of degrees awarded, would incentivize institutions to think more broadly.

The BOG system could reward universities for graduating transfer students from state

colleges or graduating students who worked while going to school and eventually

graduated beyond the six-year mark. These revisions to the model would more accurately reflect a focus on workforce development for the state, as they would yield more and better-educated workers.

As the model stands now, universities tended to shy away from topics of social mobility when highlighting their own heroics, though they did reiterate them through their context-specific morals of the story and policy beliefs. Some university perspectives

appeared to note that, given their preexisting diverse student populations, they would

likely need to focus on topics of access. Otherwise, they would not have succeeded in the

PBF model. The central themes of academic and student success meant these institutions

could not forget that their special populations and missions required particular forms of

academic and student success – for all students. These were not two different

conversations but rather two different components of the same conversation. Modifying

the PBF policy to recognize these unique institutional characteristics would be a start in

the right direction towards reconciling the disparate perspectives.

Similarly, the PBF policy very much reflected a top-down implementation model,

in which the BOG system outlined priorities for these diverse universities. Not

surprisingly, some of the more unique institutions struggled to find ways to follow the

lead. The state’s PBF policy did not directly reward these universities’ unique service

provision, such as in the form of liberal education or urban-based community

development. Additional bottom-up engagement of the various university missions and

characteristics, through additional metrics related to these topics, would likely have

assisted in the success of the PBF policy.

The narrative analysis in this dissertation highlighted the relative success of

standardization in shifting policy narratives. At the same time, it noted the need to

incorporate university-specific context into the policy. A possible solution to reconciling

the need for standardized metrics and university-specific metrics would be a weighting system of the PBF policy, which is a preexisting model from other states (Tennessee

Higher Education Commission, 2010). For instance, the Florida model could be tweaked to enhance the proportional rewards for universities that outlined policy beliefs related to

service to minority students or service to socioeconomically disadvantaged students.

If an urban university catered more towards students who might have graduated in

six or more years, yet still contributed to the educational enhancement and economic

development of its metropolitan service area, perhaps that access-based metric could

count more for the institution. Rather than the university choice metric of degrees

awarded to minorities only counting for ten percent of the total point count, maybe that

metric, or the access metric for Pell-eligible students, could make up a quarter or a third

of the overall total point count. In contrast, the model could proportionally reward the

state’s flagship institutions with more weight assigned to similarly appropriate metrics,

like four-year graduation rates. The policy narratives did highlight a need to empower

university missions and unique characteristics. This could have been one practical way to

provide this reconciliation.
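The reweighting described above can be illustrated with a simple, hypothetical calculation. The metric names, scores, and weights below are purely illustrative and do not reflect the actual BOG scoring rubric; this is only a sketch of how mission-specific weights might shift an institution's composite score.

```python
# Hypothetical sketch of a mission-weighted PBF composite score.
# Metric names, scores, and weights are illustrative, not the BOG rubric.

def weighted_score(metric_scores, weights):
    """Combine per-metric scores (0-10 scale) using weights that sum to 1.0,
    returning a composite on a 0-100 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return 10 * sum(metric_scores[m] * w for m, w in weights.items())

# The same raw scores for a hypothetical access-focused urban university...
scores = {"four_year_grad_rate": 4, "degrees_to_minorities": 9, "pell_access": 8}

# ...evaluated under equal weights versus mission-specific weights
# that emphasize the access metrics.
equal = {"four_year_grad_rate": 1/3, "degrees_to_minorities": 1/3, "pell_access": 1/3}
access_weighted = {"four_year_grad_rate": 0.2, "degrees_to_minorities": 0.4, "pell_access": 0.4}

print(round(weighted_score(scores, equal), 1))            # equal weighting
print(round(weighted_score(scores, access_weighted), 1))  # access-emphasizing weighting
```

Under this sketch, an institution strong on access metrics but weak on four-year graduation rates scores higher once the weights reflect its mission, which is the reconciliation mechanism the weighting proposal envisions.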

Many of these practical implications are also true of other accountability

measures in higher education policy and in public administration more generally. As

Radin (2006) and Moynihan (2009) both noted, policymakers who attempt to impose

universal accountability measures are destined for failure. Too often, performance

management systems do not consider policy impacts on agency cultural context and

organizational structures. In the case of PBF policies more generally, policymakers must

carefully consider university-level missions when choosing performance indicators.

One obvious concern with the PBF policy, based on the disparate focus of the

BOG system and the priorities of individual universities, was the lack of a formal outlet for critical discussions regarding the PBF policy and the state’s overarching concerns.

From the narrative analysis, it was clear that universities were home to countless experts who knew the best practices and, most importantly, knew the unique characteristics and needs of their own students. The BOG system would have benefited from regular insights

regarding the overall function of the PBF policy in the daily operations of universities,

and whether or not those impacts were positive or negative – or both.

In terms of the accountability reports and work plans as outlets for public

discourse, there could have been more opportunity to engage in policy analysis directly.

In other words, it may have benefited the PBF policy if there were formalized paths for

criticism or any other expression of policy innovations that could have occurred within

the performance model. Given the consistent nature of these public documents, issued

publicly with the same templates each year, they appeared to be ripe for more critical

forms of planning and annual reporting.

Universities could have provided explicit insights regarding how the nonrecurring

funding structure was difficult for personnel planning purposes, or how the universities’

competitive approach to the model potentially stifled collaborations. Instead, these

documents were largely monuments to PBF policy compliance, again reflective of the

top-down approach of the BOG system and the state’s leaders. Engaging the experts at

universities in a formal, regular format would only serve to enrich the PBF policy with

additional considerations and perhaps some agility to offer the model some sustainability

for future success. Most importantly, it could potentially elevate the narratives involved

in the discourse about higher education in Florida.

Methodological Implications

In addition to attempting to conduct an in-depth investigation into the roles of stakeholders in a particular public policy environment, with practical implications for the continued development of the PBF policy for the state of Florida, this dissertation also sought to offer contributions to the policy studies literature. In particular, it aspired to contribute to the methodological dialogue about narrative analyses, and more specifically, to the NPF as a specific interpretivist method. In addition to the NPF’s strengths regarding the coherence and accessibility of the analysis (e.g., the broad understanding of what heroes or villains are), the method also raised some concerns about whether it is flexible enough to ensure a rigorous and thorough investigation. Perhaps the narrative elements and strategies are not universally descriptive of every narrative’s structure.

Without question, this dissertation benefited from the structural components of the qualitative NPF approach. The use of the narrative elements and strategies from Shanahan et al. (2013) offered the analysis some formal structure. The coherence of the narrative elements and strategies also meant that the analysis was easier to follow than if it had relied on a unique coding structure. In other words, the NPF enables scholars to engage one another using a common language of sorts through standardized coding frameworks.

Had this dissertation expanded the codes, through sub-codes or other unique coding methods, the resulting analysis would likely have been a little less accessible. Readers share somewhat similar interpretations of what represents a hero and other narrative elements. Diving into additional codes regarding the types of heroes might have been confusing, or at a minimum, even more subject to personal biases regarding these topics. The NPF kept the conversation on a familiar track, so that scholars had the chance to engage one another with compatible terms and concepts.

With that understanding of the benefit of the standardized NPF coding framework, the analysis might have benefited from a more open-ended qualitative analysis of the same public documents. Admittedly, an analysis’s conceptualization of stakeholders as heroes, victims, and villains assumes there is a conflict – perhaps it even establishes a conflict – within the public policy discourse as reflected in its documents. If the conversations between the BOG system and the individual universities are only described in particularly adversarial frames, then those stakeholders will always appear to be adversaries. One of the foremost findings of this dissertation was a disconnect between the BOG system and the individual universities. In the real world, beyond the confines of the work plan templates, these relationships are much more nuanced. Perhaps the stakeholders would have appeared more compatible had the analysis not forced them into these narrative archetypes. The NPF, as a method, lent itself to the academic rigor of standardization, but it lacked the flexibility of more open-ended interpretive methods that can more readily describe complexity and social dynamics.

As noted previously regarding quantitative methods, analyzing public policies through categorical or explicitly measurable analyses might not explore all aspects of the topic of the scholarship. Accordingly, the qualitative NPF method was preferable. Within this dissertation, the topic did not easily fit into black-and-white codes, instead requiring overlapping codes or a unique understanding of a particular code. Even so, many people contributed to the public documents in the names of the BOG system and the individual universities, and surely there was more than one perspective among these co-authors. The NPF, without micro-level data (e.g., interview or survey findings), did not directly empower the analysis to explore multiple, simultaneous, and competing perspectives within each organization. With a macro-level analysis, the NPF assumes homogeneity of policy narratives at the institutional level. Moving beyond the restrictive nature of the NPF might have provided more of the flexibility necessary to better discover the unique institutional situations at the BOG system or the universities, thus providing even more robust results for the analysis.

Future Areas of Research

A benefit of the standardized NPF qualitative approach arose in how easily the data could have been translated into a quantitative analysis. The summary table only scratched the surface of the ways that the dissertation could have counted the coding instances of narrative elements and strategies. Researchers can use coded data, at times even combined with related numerical data, to conduct any number of quantitative analyses.

For example, while the qualitative method here included opportunities to describe narrative element trends and more nuanced developments in narrative strategies, a temporal quantitative analysis could have identified more specific shifts in stakeholder perspectives. Additional research regarding the rate at which stakeholders used narrative elements or strategies, and the changes in those rates, could very well have correlated with specific changes to the PBF policy, with universities launching interventions, or with other events.

Likewise, the extensive amount of coded data in this dissertation could be added, sorted, contrasted, and built into a multiple regression model. Future research could incorporate additional quantitative information to analyze the relationships between narrative data and other university characteristics, such as total headcount enrollment, number of instructional faculty members, research expenditures, and the proportion of Pell-eligible or minority student enrollments. The scope of such research could include investigations regarding the disparity in the levels of performance funding, or perhaps even the development of predictive analytics related to the potential performance of institutions in light of their quantified characteristics.
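As an illustration of this kind of quantitative extension, the sketch below fits an ordinary least squares model relating counts of narrative codes and an institutional characteristic to performance funding levels. All variable names and values are hypothetical, invented purely for demonstration; they are not drawn from the dissertation's coded data.

```python
import numpy as np

# Hypothetical example: each row is one university-year observation.
# Columns: count of "hero" codes, count of "villain" codes, and
# headcount enrollment in thousands. Values are illustrative only.
X = np.array([
    [12, 3, 30.5],
    [8, 7, 48.2],
    [15, 2, 25.1],
    [5, 9, 56.7],
    [10, 4, 33.0],
    [7, 6, 41.8],
])
# Hypothetical outcome: performance funding awarded ($ millions).
y = np.array([22.0, 18.5, 25.3, 14.1, 20.7, 17.9])

# Add an intercept column and fit ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
coef, _, _, _ = np.linalg.lstsq(X1, y, rcond=None)

# Predicted funding and in-sample R^2 for the fitted model.
y_hat = X1 @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 3))
print("R^2:", round(r2, 3))
```

A full study along these lines would, of course, require far more observations than predictors and a dedicated statistical package for standard errors and significance tests; the sketch only shows how coded narrative counts could sit alongside institutional characteristics in a single design matrix.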

Moving forward, this dissertation also potentially set the stage for additional applications of qualitative narrative analyses. Just as literary studies developed beyond structuralism to more critical and interpretive methods, so too should policy studies explore the possibility of breaking free of the standardized coding structure of the NPF in favor of more critical methodologies. A literary policy analysis could serve the topic of PBF policies well, but it could also have applications in any other policy arena.

Summary

Governments, including state agencies that oversee higher education, will continue to develop policies that seek to enhance accountability and improve performance. This trend includes the growing popularity of PBF policies. Policymakers at the state system level and policy implementers at the individual institutional level have defined perspectives on what these policies should look like and how extensive their reach should be. These perspectives are based on unique priorities, missions, and core values, and researchers can collect and review them through narrative analyses, including the NPF approach. With a standardized NPF coding framework, these perspectives might provide insights regarding potential alignment and conflicts between the state system and individual institutional levels of stakeholders. Researchers can use these insights to better understand how the PBF policy operates and to identify future opportunities for refinement and enhancement.

In the case of the state of Florida, the BOG system and the various universities have disparate and often competing views on the PBF policy. Despite a significant number of overlapping perspectives, the universities are all different and prioritize their efforts to promote academic and career success according to their unique priorities. In this dissertation, the narrative elements and strategies reflected these competing views and core values, highlighting a need for reconciliation between the state and the universities. Possible opportunities to reconcile include enhanced, formal bottom-up dialogue regarding the PBF policy, as well as the weighting of metrics according to institutional missions and priorities. While the NPF approach revealed these elements of the overarching policy narrative, a more open-ended qualitative analysis may have provided even further insights. With these considerations and possible modifications, the Florida PBF policy could extend its influence on daily and strategic operations at universities, for the betterment of a wider array of students, regardless of socioeconomic or demographic status.


APPENDICES


Appendix A. Preeminence Metrics and Benchmarks

(Florida Statutes 1001.7065, 2018)

(2) ACADEMIC AND RESEARCH EXCELLENCE STANDARDS.—The following academic and research excellence standards are established for the preeminent state research universities program:

(a) An average weighted grade point average of 4.0 or higher on a 4.0 scale and an average SAT score of 1800 or higher on a 2400-point scale or 1200 or higher on a 1600-point scale for fall semester incoming freshmen, as reported annually.

(b) A top-50 ranking on at least two well-known and highly respected national public university rankings, including, but not limited to, the U.S. News and World Report rankings, reflecting national preeminence, using most recent rankings.

(c) A freshman retention rate of 90 percent or higher for full-time, first-time-in-college students, as reported annually to the Integrated Postsecondary Education Data System (IPEDS).

(d) A 4-year graduation rate of 60 percent or higher for full-time, first-time-in-college students, as reported annually to the IPEDS. However, for the 2018 determination of a state university’s preeminence designation and the related distribution of the 2018-2019 fiscal year appropriation associated with preeminence and emerging preeminence, a university is considered to have satisfied this graduation rate measure by attaining a 6-year graduation rate of 70 percent or higher by October 1, 2017, for full-time, first-time-in-college students, as reported to the IPEDS and confirmed by the Board of Governors.

(e) Six or more faculty members at the state university who are members of a national academy, as reported by the Center for Measuring University Performance in the Top American Research Universities (TARU) annual report or the official membership directories maintained by each national academy.

(f) Total annual research expenditures, including federal research expenditures, of $200 million or more, as reported annually by the National Science Foundation (NSF).

(g) Total annual research expenditures in diversified nonmedical sciences of $150 million or more, based on data reported annually by the NSF.

(h) A top-100 university national ranking for research expenditures in five or more science, technology, engineering, or mathematics fields of study, as reported annually by the NSF.

(i) One hundred or more total patents awarded by the United States Patent and Trademark Office for the most recent 3-year period.

(j) Four hundred or more doctoral degrees awarded annually, including professional doctoral degrees awarded in medical and health care disciplines, as reported in the Board of Governors Annual Accountability Report.

(k) Two hundred or more postdoctoral appointees annually, as reported in the TARU annual report.

(l) An endowment of $500 million or more, as reported in the Board of Governors Annual Accountability Report.

181

Appendix B. Policy Narrative Coding Sample of Work Plan (Florida Agricultural & Mechanical University, 2014b)


REFERENCES

Argyris, C. (1954). The present state of human relations research. New Haven, CT: Labor and Management Center, Yale University.

Barbero, N. (2014, August 20). University receives $8.1 million in performance-based funding in time for its 18th year. FGCU Eagle News. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/University-receives-$8.1M-in-PBF.pdf.

Blackburn, D. (2014, January 17). BOG embraces new performance funding. Tallahassee Democrat. Retrieved from https://www.flbog.edu/pressroom/newsclips_detail.php?id=28154.

Blankenberger, B. and A. Phillips. (2016). Performance funding in Illinois higher education: The roles of politics, budget environment, and individual actors in the process. Educational Policy, 30(6), 884-915.

Breitenstein, D. (2014, January 16). Three Florida universities won't get performance funds. News Press. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/University-Concerns-about-Performance-Based-Funding.pdf.

Burke, J.C. (1998). Performance funding indicators: Concerns, values, and models for state colleges and universities. New Directions for Institutional Research, 97, 49-60.

Christie, R. (2014, March 1). Commentary: Criser explains university system’s ‘performance funding’. Palm Beach Post. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/Chancellor-Criser-explains-university-systems-performance-www-mypalmbeachpost.pdf.

Coleman, S. (2012). The Internet as a space for policy deliberation. The argumentative turn revisited: Public policy as communicative practice. F. Fischer and H. Gottweis (Eds.). Durham, NC: Duke University Press.

Cornelius, L.M. and Cavanaugh, T.W. (2016). Grading the metrics: Performance-based funding in the Florida State University System. Journal of Education Finance, 42(2), 153-187.

Criser, M. (2014, February 14). Florida's State University System adopts metrics to gauge, improve education. Bradenton-Herald. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/Chancellor-Criser-Letter-to-the-Editor-Bradenton-Herald.pdf.

Crow, D.A., Lawhon, L.A., Berggren, J., Huda, J., Koebele, E., and Kroepsch, A. (2017). Narrative policy framework analysis of wildfire policy discussions in two Colorado communities. Politics & Policy, 45(4), 626-656.

Crow, D.A. and J. Berggren. (2014). Using the narrative policy framework to understand stakeholder strategy and effectiveness: A multi-case analysis. The science of stories: Applications of the narrative policy framework in public policy analysis. Jones, M.D., Shanahan, E.A., and M.K. McBeth (Eds.). New York, NY: Palgrave Macmillan.

Czarniawska, B. (2004). Narratives in social science research. London, UK: Sage Publications.

Dahl, R.A., and C.E. Lindblom. (1953). Politics, economics, and welfare. New York, NY: Harper and Row.

Davies, L. (2014). State 'shared responsibility' policies for improved outcomes: Lessons learned. Washington, DC: HCM Strategists.

Davis, B. (2014a). Board of Governors’ performance funding model receives recognition at national conference. Board of Governors website. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/Board-of-Governors-Performance-Funding-Model-Receives-Recognition-at-National-Conference.pdf.

Davis, B. (2014b). Governor Scott signs State University System budget. Board of Governors website. Retrieved from https://www.flbog.edu/pressroom/news.php?id=528.

Davis Wise, B. (2016). Governor Scott signs Board of Governors' performance based funding model into law. Board of Governors website. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/Governor-Scott-Signs-Board-of-Governors-performance-based-funding-model-into-law.pdf.

Davis Wise, B. (2017). Universities mark another year of improvement under performance funding. Florida Board of Governors website. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/Universities%20mark%20another%20year%20of%20improvement%20under%20performance%20funding.pdf.

Dobson, B. (2016, March 16). Florida A&M University makes gains in state metrics rankings. Tallahassee Democrat. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/Florida-A&M-University-makes-gains-in-state-metrics-rankings.pdf.

Dougherty, K.J., Jones, S.M., Lahr, H., Natow, R.S., Pheatt, L., and V. Reddy. (2014). Performance funding for higher education: Forms, origins, impacts, and futures. Annals of the American Academy of Political and Social Science, 655, 163-184.

Dougherty, K.J., Jones, S.M., Lahr, H., Natow, R.S., Pheatt, L., and V. Reddy. (2016). Performance funding in higher education. Baltimore, MD: Johns Hopkins University Press.

Drucker, P.F. (1954). The practice of management. New York, NY: Harper & Row.

Dunkelberger, L. (2018, March 28). Universities face tougher graduation standard. CBS12.com. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/News%20Clip%20-%20CBS12%20-%20universities%20face%20tougher%20graduation%20standard.pdf.

Elliott, V. (2018). Thinking about the coding process in qualitative data analysis. The Qualitative Report, 23(11), 2850-2861.

Florida Agricultural & Mechanical University (2012a). FAMU Accountability Report, 2010-11. Retrieved from https://www.flbog.edu/board/_doc/accountability/FAMU_2010-11_Annual_Report_FINAL.pdf.

Florida Agricultural & Mechanical University (2012b). FAMU Work Plan, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/FAMU_2012-13_Workplan_FINAL.pdf.

Florida Agricultural & Mechanical University (2013a). FAMU Accountability Report, 2011-12. Retrieved from https://www.flbog.edu/board/_doc/accountability/FAMU_2011-12_Accountability_Report_FINAL.pdf.

Florida Agricultural & Mechanical University (2013b). FAMU Work Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/FAMU_2013-14_Workplan_FINAL.pdf.

Florida Agricultural & Mechanical University (2014a). FAMU Accountability Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/FAMU_2012-13_Accountability_Report_FINAL.pdf.

Florida Agricultural & Mechanical University (2014b). FAMU Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FAMU_2014-15_Workplan_FINAL.pdf.

Florida Agricultural & Mechanical University (2015a). FAMU Accountability Report, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-14/FAMU_2013-14_Accountability_Report_FINAL_2015-01-14.pdf.

Florida Agricultural & Mechanical University (2015b). FAMU Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/FAMU_2015_Work_Plan_Revised.pdf.

Florida Agricultural & Mechanical University (2016a). FAMU Accountability Report, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-15/FAMU_2014-15_Accountability_Report__FINAL__2016-03-10.pdf.

Florida Agricultural & Mechanical University (2016b). FAMU Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/FAMU_2016_Work_Plan_FINAL.pdf.

Florida Agricultural & Mechanical University (2017a). FAMU Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-16/FAMU_2015-16_Accountability_Report_FINAL.pdf.

Florida Agricultural & Mechanical University (2017b). FAMU Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/FAMU_2017_Work_Plan_FINAL_2017-06-08.pdf.

Florida Atlantic University (2012a). FAU Accountability Report, 2010-11. Retrieved from https://www.flbog.edu/board/_doc/accountability/FAU_2010-11_Annual_Report_FINAL.pdf.

Florida Atlantic University (2012b). FAU Work Plan, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/FAU_2012-13_Workplan_FINAL.pdf.

Florida Atlantic University (2013a). FAU Accountability Report, 2011-12. Retrieved from https://www.flbog.edu/board/_doc/accountability/FAU_2011-12_Accountability_Report_FINAL.pdf.

Florida Atlantic University (2013b). FAU Work Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/FAU_2013-14_Workplan_FINAL.pdf.

Florida Atlantic University (2014a). FAU Accountability Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/FAU_2012-13_Accountability_Report_FINAL.pdf.

Florida Atlantic University (2014b). FAU Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FAU_2014-15_Workplan_FINAL.pdf.

Florida Atlantic University (2015a). FAU Accountability Report, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-14/FAU_2013-14_Accountability_Report_FINAL_2015-01-16.pdf.

Florida Atlantic University (2015b). FAU Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/FAU_2015_Work_Plan_FINAL.pdf.

Florida Atlantic University (2016a). FAU Accountability Report, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-15/FAU_2014-15_Accountability_Report_BOT_APPROVED_FINAL_2016-03-15.pdf.

Florida Atlantic University (2016b). FAU Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/FAU_2016_Work_Plan_FINAL.pdf.

Florida Atlantic University (2017a). FAU Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-16/FAU_2015-16_Accountability_Report_FINAL.pdf.

Florida Atlantic University (2017b). FAU Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/FAU_2017_Work_Plan_FINAL_2017-05-25.pdf.

Florida Board of Governors (2008). State University System of Florida methodology for determining areas of programmatic strategic emphasis. Retrieved from https://www.flbog.edu/board/office/asa/_doc/PreviousMethodology.pdf

Florida Board of Governors (2012a). State University System Annual Accountability Report, 2010-11. Retrieved from https://www.flbog.edu/board/_doc/accountability/2010-11_System_Annual_Accountability_Report_FINAL.pdf.

Florida Board of Governors (2012b). Implementation of House Bill 851 preeminent universities. Retrieved from https://www.flbog.edu/documents_meetings/0191_0878_6591_9.5.3%20BUD%2005b%20-%20Preeminence%20Metric%20Scenarios%20Updated%2012_19.pdf.

Florida Board of Governors (2013a). Performance based funding 3 metrics model approved. Retrieved from https://www.flbog.edu/documents_meetings/0183_0727_5462_Performance%20Funding%20model_Governors%203%20metrics_2013-9-12%20APPROVED.pdf.

Florida Board of Governors (2013b). Methodology for updating programs of strategic emphasis in the State University System of Florida, Board of Governors 2012 – 2025 Strategic Plan. Retrieved from https://www.flbog.edu/board/office/asa/_doc/2013_09_26-PSE-Methodology-and-list-FINAL.pdf

Florida Board of Governors (2013c). State University System Annual Accountability Report, 2011-12. Retrieved from https://www.flbog.edu/board/_doc/accountability/2011_12_SYSTEM_Accountability_Report_FINAL.pdf

Florida Board of Governors (2013d). State University System Work Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/2013-14_SYSTEM_Work_Plan_FINAL.pdf.

Florida Board of Governors (2014a). State University System Annual Accountability Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/2012_13_System_Accountability_Report_Summary_FINAL_2014-02-3.pdf.

Florida Board of Governors (2014b). State University System Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/2014-15_SYSTEM_WORK_PLAN_FINAL_2014-08-25.pdf.

Florida Board of Governors (2015a). Performance based funding model: Proposed changes. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/Changes_2016-17.pdf

Florida Board of Governors (2015b). State University System Annual Accountability Report, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-14/2013_14_System_Accountability_Report_Summary_REVISED_FINAL.pdf.

Florida Board of Governors (2015c). State University System Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/2015_SYSTEM_WORK_PLAN_FINAL.pdf.

Florida Board of Governors (2016a). State University System of Florida, Board of Governors 2012 – 2025 Strategic Plan. Retrieved from https://www.flbog.edu/board/_doc/strategicplan/2025_System_Strategic_Plan_Amended_FINAL.pdf

Florida Board of Governors (2016b). State University System Annual Accountability Report, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-15/2014_15_System_Accountability_Report_Summary_FINAL_2016-04-28.pdf.

Florida Board of Governors (2016c). State University System Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/2016_SYSTEM_WORK_PLAN__2016-09-09.pdf.

Florida Board of Governors (2017a). Performance based funding: At a glance, 2014-2017. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/PBF-At-a-Glance%202014-17.pdf.

Florida Board of Governors (2017b). Performance based funding metrics definitions. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/Definitions_2017.pdf

Florida Board of Governors (2017c). State University System Annual Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-16/2015_16_System_Accountability_Report_Summary_FINAL__2017-03-30.pdf.

Florida Board of Governors (2017d). State University System Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/2017_SYSTEM_WORK_PLAN__FINAL.pdf.

Florida Board of Governors (2018a). Performance based funding model (10 metrics): Questions & answers. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/PBF_FAQs.pdf.

Florida Board of Governors (2018b). Performance Funding Model Overview. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/Overview-Doc-Performance-Funding-10-Metric-Model-Condensed-Version.pdf.

Florida Gulf Coast University (2012a). FGCU Accountability Report, 2010-11. Retrieved from https://www.flbog.edu/board/_doc/accountability/FGCU_2010-11_Annual_Report_FINAL.pdf.

Florida Gulf Coast University (2012b). FGCU Work Plan, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/FGCU_2012-13_Workplan_FINAL.pdf.

Florida Gulf Coast University (2013a). FGCU Accountability Report, 2011-12. Retrieved from https://www.flbog.edu/board/_doc/accountability/FGCU_2011-12_Accountability_Report_FINAL.pdf.

Florida Gulf Coast University (2013b). FGCU Work Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/FGCU_2013-14_Workplan_FINAL.pdf.

Florida Gulf Coast University (2014a). FGCU Accountability Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/FGCU_2012-13_Accountability_Report__FINAL.pdf.

Florida Gulf Coast University (2014b). FGCU Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FGCU_2014-15_Workplan_FINAL.pdf.

Florida Gulf Coast University (2015a). FGCU Accountability Report, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-14/FGCU_2013-14_Accountability_Report_FINAL_2015-01-16.pdf.

Florida Gulf Coast University (2015b). FGCU Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FGCU_2014-15_Workplan_FINAL.pdf.

Florida Gulf Coast University (2016a). FGCU Accountability Report, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-15/FGCU_2014-15_Accountability_Report_FINAL.pdf.

Florida Gulf Coast University (2016b). FGCU Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/FGCU_2016_Work_Plan_FINAL.pdf.

Florida Gulf Coast University (2017a). FGCU Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-16/FGCU_2015-16_Accountability_Report_FINAL.pdf.

Florida Gulf Coast University (2017b). FGCU Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/FGCU_2017_Work_Plan_FINAL_2017-05-12.pdf.

Florida International University (2012a). FIU Accountability Report, 2010-11. Retrieved from https://www.flbog.edu/board/_doc/accountability/FIU_2010-11_Annual_Report_FINAL.pdf.

Florida International University (2012b). FIU Work Plan, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/FIU_2012-13_Workplan_FINAL.pdf.

Florida International University (2013a). FIU Accountability Report, 2011-12. Retrieved from https://www.flbog.edu/board/_doc/accountability/FIU_2011-12_Accountability_Report_FINAL.pdf.

Florida International University (2013b). FIU Work Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/FIU_2013-14_Workplan_FINAL.pdf.

Florida International University (2014a). FIU Accountability Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/FIU_2012_13_Accountability_Report_FINAL.pdf.

Florida International University (2014b). FIU Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FIU_2014-15_Workplan_FINAL.pdf.

Florida International University (2015a). FIU Accountability Report, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-14/FIU_2013-14_Accountability_Report_FINAL_2014-12-19.pdf

Florida International University (2015b). FIU Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/FIU_2015_Work_Plan_FINAL.pdf.

Florida International University (2016a). FIU Accountability Report, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-15/FIU_2014-15_University_Accountability_Report_FINAL-BOT-Approved-3.11.16.pdf.

Florida International University (2016b). FIU Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/FIU_2016_Work_Plan_FINAL.pdf.

Florida International University (2017a). FIU Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-16/FIU_2015-16_Accountability_Report_BOT_Approved_2017-03-20_revised2017-06-23.pdf

Florida International University (2017b). FIU Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/FIU_2017_Work_Plan_FINAL_2017-06-08.pdf.

Florida Polytechnic University (2013). FPU Work Plan, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/FPU_2013-

14_Workplan_FINAL.pdf.

Florida Polytechnic University (2014a). FPU Accountability Report, 2012-13.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-

13/FPU_2012-13_Accountability_Report_FINAL.pdf.

Florida Polytechnic University (2014b). FPU Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FPU_2014-

15_Workplan_FINAL.pdf.

Florida Polytechnic University (2015a). FPU Accountability Report, 2013-14.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-

14/FPU_2013-14_Accountability_Report_FINAL_2014-12-10.pdf.

Florida Polytechnic University (2015b). FPU Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/FPU_2015_WorkPl

an_FINAL.pdf.

Florida Polytechnic University (2016a). FPU Accountability Report, 2014-15.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-

15/FPU_2014-15_Accountability_Report_FINAL.pdf.

Florida Polytechnic University (2016b). FPU Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/POLY_2016_Work

_Plan_FINAL_2016-09-08.pdf.

Florida Polytechnic University (2017a). FPU Accountability Report, 2015-16.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-

16/FPU_2015-16_Accountability_Report_FINAL.pdf.

Florida Polytechnic University (2017b). FPU Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/FPU_2017_Work_P

lan_FINAL_2017-06-12.pdf.

Florida State University (2012a). FSU Accountability Report, 2010-11. Retrieved from

https://www.flbog.edu/board/_doc/accountability/FSU_2010-11_Annual_Report_

FINAL.pdf.

Florida State University (2012b). FSU Work Plan, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/FSU_2012-

13_Workplan_FINAL.pdf.

Florida State University (2013a). FSU Accountability Report, 2011-12. Retrieved from

https://www.flbog.edu/board/_doc/accountability/FSU_2011-12_Accountability_

Report_FINAL.pdf.

Florida State University (2013b). FSU Work Plan, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/FSU_2013-

14_Workplan_FINAL.pdf.

Florida State University (2014a). FSU Accountability Report, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2012-13/FSU_2012-

13_Accountability_Report_FINAL.pdf.

Florida State University (2014b). FSU Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/FSU_2014-

15_Workplan_FINAL.pdf.

Florida State University (2015a). FSU Accountability Report, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2013-14/FSU_2013-

14_Accountability_Report_FINAL_2015-01-12.pdf.

Florida State University (2015b). FSU Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/FSU_2015_Work_P

lan_FINAL.pdf.

Florida State University (2016a). FSU Accountability Report, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2014-15/FSU_2014-

15_Accountability_Report_FINAL.pdf.

Florida State University (2016b). FSU Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/FSU_2016_Work_P

lan_FINAL.pdf.

Florida State University (2017a). FSU Accountability Report, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2015-16/FSU_2015-

16_Accountability_Report_FINAL.pdf.

Florida State University (2017b). FSU Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/FSU_2017_Work_P

lan_2017-06-07_FINAL.pdf.

Foley, M.T. (1998). Protesters voicing their opposition about the state Board of Regents'

plan to classify public universities as research or teaching institutions –

Tallahassee, Florida. Color digital image, State Archives of Florida, Florida

Memory. Retrieved from www.floridamemory.com/items/show/134637.

Frank, J. (2011). The education of Rick Scott. American Prospect, 22(6), 34-39.

Friedel, J.N., Thornton, Z.M., D’Amico, M.M., and S.G. Katsinas. (2013). Performance-

based funding: The national landscape. Tuscaloosa, AL: Education Policy Center.

Gándara, D. and A. Rutherford. (2018). Mitigating unintended impacts? The effects of

premiums for underserved populations in performance-funding policies for higher

education. Research in Higher Education, 59(6), 681-703.

Geertz, C. (1973). The interpretation of cultures. New York, NY: Basic Books.

Graves, A. (2018). Gov. Rick Scott mostly fulfills promise to create 700,000 jobs in 7

years. PolitiFact. Retrieved from https://www.politifact.com/florida/

promises/scott-o-meter/promise/588/create-over-700000-jobs/.

Gray, G. and M.D. Jones. (2016). A qualitative narrative policy framework? Examining

the policy narratives of US campaign finance regulatory reform. Public Policy

and Administration, 31(3), 193-220.

Hagood, L.P. (2019). The financial benefits and burdens of performance funding in

higher education. Educational Evaluation and Policy Analysis, 41(2), 189-213.

Heavener, J.W. (2017, June 30). Letter to the editor. The Gainesville Sun. Retrieved from

https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/

Letter_UF-Board-of-Trustees-Chair-Bill-Heavener.pdf.

Heinrich, C.J. (2002). Outcomes-based performance management in the public sector:

Implications for government accountability and effectiveness. Public

Administration Review, 62(6), 712-725.

Hillman, N.W., Fryar, A.H., and V. Crespín-Trujillo. (2018). Evaluating the impact of

performance funding in Ohio and Tennessee. American Educational Research

Journal, 55(1), 144-170.

Hillman, N.W., Tandberg, D.A., and A.H. Fryar. (2015). Evaluating the impacts of ‘new’

performance funding in higher education. Educational Evaluation and Policy

Analysis, 37(4), 501-519.

Hillman, N.W., Tandberg, D.A., and J.P.K. Gross. (2014). Performance funding in higher

education: Do financial incentives impact college completions? The Journal of

Higher Education, 85(6), 826-857.

Hood, C.C. (1991). A public management for all seasons? Public Administration, 69(1),

3-19.

Isern, W. (2015, March 18). UWF investing in its own success. Pensacola News Journal.

Retrieved from https://www.flbog.edu/board/office/budget/_doc/

performance_funding/newsclips/UWF-investing-in-its-own-success.pdf.

Jongbloed, B. and H. Vossensteyn (2001). Keeping up performances: An international

survey of performance-based funding in higher education. Journal of Higher

Education Policy and Management, 23(2), 127-145.

Jones, M.D. (2018). Advancing the narrative policy framework? The musings of a

potentially unreliable narrator. Policy Studies Journal, 46(4), 724-746.

Jones, M.D. and M.K. McBeth. (2010). A narrative policy framework: Clear enough to

be wrong? Policy Studies Journal, 38(2), 329-353.

Jones, M.D., McBeth, M.K., and E.A. Shanahan. (2014). Introducing the narrative policy

framework. In M.D. Jones, E.A. Shanahan, and M.K. McBeth (Eds.), The science

of stories: Applications of the narrative policy framework in public policy

analysis (pp. 1-25). New York, NY: Palgrave Macmillan.

Jones, M.D. and C.M. Radaelli (2015). The narrative policy framework: Child or

monster? Critical Policy Studies, 9(3), 339-355.

Jones, M.D. and C.M. Radaelli (2016). The narrative policy framework's call for

interpretivists. Critical Policy Studies, 10(1), 117-120.

Jones, T. (2016). A historical mission in the accountability era: A public HBCU and state

performance funding. Educational Policy, 30(7), 999-1041.

Kelchen, R. and L.J. Stedrak. (2016). Does performance-based funding affect colleges’

financial priorities? Journal of Education Finance, 41(3), 302-321.

Kelderman, E. (2019). The rise of performance-based funding: How colleges are adapting

in the new age of accountability. The Chronicle of Higher Education.

Kirkpatrick, K.J. and J.W. Stoutenborough (2018). Strategy, narratives, and reading the

public: Developing a micro-level theory of political strategies within the Narrative

Policy Framework. Policy Studies Journal, 46(4), 949-977.

Klas, M.E. (2011, May 6). Bipartisan vote passes corporate tax cut. The St. Petersburg

Times. p. 7b.

Korn, M. (2017, March 11). States challenge public universities to prove they are worth

their funding. Wall Street Journal. Retrieved from https://www.flbog.edu/board/

office/budget/_doc/performance_funding/newsclips/16_States_Challenge_Public_

Universities.pdf.

Kukla-Acevedo, S., Streams, M.E., and E. Toma. (2012). Can a single performance

metric do it all? A case study in education accountability. The American Review

of Public Administration, 42(3), 303-319.

Levine, C.H. (1978). Organizational decline and cutback management. Public

Administration Review, 38(4), 316-325.

Lévi-Strauss, C. (1955). The structural study of myth. The Journal of American Folklore,

68(270), 428-444.

Li, A.Y. (2017). Covet thy neighbor or 'reverse policy diffusion'? State adoption of

performance funding 2.0. Research in Higher Education, 58(7), 746-771.

Light, P. C. (2006). The tides of reform revisited: Patterns in making government work,

1945-2002. Public Administration Review, 66(1), 6-19.

Lindblom, C.E. (1959). The science of "muddling through." Public Administration

Review, 19(2), 79-88.

Mailloux, S. (1990). Interpretation. In F. Lentricchia and T. McLaughlin (Eds.), Critical

terms for literary study (pp. 121-134). Chicago, IL: University of Chicago Press.

Marklein, M.B. and M. Auslen. (2013, June 19). Tuition hikes may come to a halt. USA

Today, p. 03a.

McBeth, M.K., Lybecker, D.L., and J.W. Stoutenborough (2016). Do stakeholders

analyze their audience? The communication switch and stakeholder personal

versus public communication choices. Policy Sciences, 49(4), 421-444.

McLendon, M.K., Hearn, J.C., and R. Deaton. (2006). Called to account: Analyzing the

origins and spread of state performance-accountability policies for higher

education. Educational Evaluation and Policy Analysis, 28(1), 1-24.

McLendon, M.K. and J.C. Hearn (2006). Mandated openness in public higher education:

A field study of state sunshine laws and institutional governance. The Journal of

Higher Education, 77(4), 645-683.

McLendon, M.K., and J.C. Hearn (2013). The resurgent interest in performance-based

funding for higher education. Academe, 99(6), 25-30.

McMorris, C., Zanocco, C., and M. Jones. (2018). Policy narratives and policy outcomes:

An NPF examination of Oregon's Ballot Measure 97. Policy Studies Journal,

46(4), 771-797.

Mitchell, T. (2013). Scott’s interest in tuition crosses the line, some say. Miami Herald.

Retrieved from https://www.miamiherald.com/news/state/article1952358.html.

Mosher, F.C. (1968). Democracy and public service. New York, NY: Oxford University

Press.

Moynihan, D.P. (2008). The dynamics of performance management: Constructing

information and reform. Washington, DC: Georgetown University Press.

Moynihan, D.P. (2009). Through a glass, darkly: Understanding the effects of

performance regimes. Public Performance & Management Review, 32(4), 592-

603.

Moynihan, D.P. and S.K. Pandey. (2010). The big question for performance

management: Why do managers use performance information? Journal of Public

Administration Research and Theory, 20(4), 849-866.

New College of Florida (2012a). NCF Accountability Report, 2010-11. Retrieved from

https://www.flbog.edu/board/_doc/accountability/NCF_2010-11_Annual_

Report_FINAL.pdf.

New College of Florida (2012b). NCF Work Plan, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/NCF_2012-

13_Workplan_FINAL.pdf.

New College of Florida (2013a). NCF Accountability Report, 2011-12. Retrieved from

https://www.flbog.edu/board/_doc/accountability/NCF_2011-12_Accountability

_Report_FINAL.pdf.

New College of Florida (2013b). NCF Work Plan, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/NCF_2013-

14_Workplan_FINAL.pdf.

New College of Florida (2014a). NCF Accountability Report, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2012-13/NCF_2012_13_

Accountability_Report_FINAL.pdf.

New College of Florida (2014b). NCF Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/NCF_2014-

15_Workplan_FINAL.pdf.

New College of Florida (2015a). NCF Accountability Report, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2013-14/NCF_2013-

14_Accountability_Report_FINAL_REVISED_2015-01-13.pdf.

New College of Florida (2015b). NCF Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/NCF_2015_Work_P

lan_FINAL.pdf.

New College of Florida (2016a). NCF Accountability Report, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2014-15/NCF_2014-

15_Accountability_Report_FINAL_2016-03-07.pdf.

New College of Florida (2016b). NCF Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/NCF_2016_Work_P

lan_FINAL.pdf.

New College of Florida (2017a). NCF Accountability Report, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2015-16/NCF_2015-

16_Accountability_Report_FINAL.pdf.

New College of Florida (2017b). NCF Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/NCF_2017_Work_P

lan_FINAL_2017-06-12.pdf.

O’Bryan, T., Dunlop, C.A., and C.M. Radaelli (2014). Narrating the “Arab Spring”:

Where expertise meets heuristics in legislative hearings. In M.D. Jones, E.A.

Shanahan, and M.K. McBeth (Eds.), The science of stories: Applications of the

narrative policy framework in public policy analysis. New York, NY: Palgrave

Macmillan.

Ordway, D.M. (2014, June 17). Florida universities get $200 million in performance

funding. Orlando Sentinel. Retrieved from

https://www.flbog.edu/board/office/budget/_doc/

performance_funding/newsclips/Florida-universities-get-$200-million-to-bolster-

performance-tribunedigital-orlandosentinel.pdf.

Orr, D., Jaeger, M. and A. Schwarzenberger. (2007). Performance‐based funding as an

instrument of competition in German higher education. Journal of Higher

Education Policy and Management, 29(1), 3-23.

Ostrowski, J. (2016, March 16). Graduation rate improving, FAU wins top spot in Florida

scorecard. Palm Beach Post. Retrieved from

https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/

Graduationrate-improving-FAU-wins-top-spot-in-Florida-scorecard-

www_mypalmbeachpost.pdf.

O'Sullivan, M. (2017, June 30). Guestview: Long-term strategies pay off for UWF.

Pensacola News Journal. Retrieved from https://www.flbog.edu/board/office/

budget/_doc/performance_funding/newsclips/Guestview-Long-term-strategies-

pay-off-for-UWF.pdf.

O'Toole, L.J. (1984). American public administration and the idea of reform.

Administration & Society, 16(2), 141-166.

Patrick, B.A. and P.E. French. (2011). Assessing new public management’s focus on

performance measurement in the public sector. Public Performance &

Management Review, 35(2), 340-369.

Powers, S. and K. Schuster. (2001). Senate agrees to abolish regents. South Florida Sun

Sentinel. Retrieved from http://articles.sun-sentinel.com/2001-04-

26/news/0104260094_1_seven-member-state-board-university-s-11-member-

board-regents.

Prasad, P. (2005). Crafting qualitative research: Working in the postpositivist traditions.

New York, NY: M. E. Sharpe Inc.

Rabovsky, T. (2014a). Support for performance-based funding: The role of political

ideology, performance, and dysfunctional information environments. Public

Administration Review, 74(6), 761-774.

Rabovsky, T. (2014b). Using data to manage for performance at public universities.

Public Administration Review, 74(2), 260-272.

Radin, B.A. (2006). Challenging the performance movement: Accountability, complexity,

and democratic values. Washington, DC: Georgetown University Press.

Rick Scott for Florida (2014). Let’s Keep College Affordable. Retrieved from

http://rickscottforflorida.dev1-ironistic.com/wp-content/uploads/2014/11/Lets-

Keep-College-Affordable.pdf.

Rutherford, A. (2014). Organizational turnaround and educational performance: The

impact of performance-based monitoring analysis systems. American Review of

Public Administration, 44(4), 440-458.

Rutherford, A. and T. Rabovsky. (2014). Evaluating impacts of performance funding

policies on student outcomes in higher education. Annals of the American

Academy of Political and Social Science, 655, 185-208.

Saldaña, J. (2009). The coding manual for qualitative researchers. Thousand Oaks, CA:

SAGE Publications.

Schultz, R. (2015, June 4). FAU making changes and help comes to Boca’s permitting

process. Boca Magazine. Retrieved from https://www.flbog.edu/board/office/

budget/_doc/performance_funding/newsclips/FAU-making-changes-and-help-

comes-to-Boca's-permitting-process-Boca-Raton-Magazine.pdf.

Shanahan, E.A., Jones, M.D., and M.K. McBeth. (2011). Policy narratives and policy

processes. Policy Studies Journal, 39(3), 535-565.

Shanahan, E.A., Jones, M.D., McBeth, M.K., and R.R. Lane. (2013). An angel on the

wind: How heroic policy narratives shape policy realities. Policy Studies Journal,

41(3), 453-483.

Shanahan, E.A., Jones, M.D., and M.K. McBeth. (2011). Narrative policy framework:

The influence of media policy narratives on public opinion. Politics & Policy,

39(3), 373-400.

Shanahan, E.A., Jones, M.D., and M.K. McBeth (2018). How to conduct a Narrative

Policy Framework study. The Social Science Journal, 55, 332-345.

Shin, J.C. (2010). Impacts of performance-based accountability on institutional

performance in the U.S. Higher Education, 60, 47-68.

Sigo, S. (2011, August 16). S&P drops South Florida water district to AA-plus. The Bond

Buyer. Retrieved from https://www.bondbuyer.com/news/s-p-drops-south-florida-

water-district-to-aa-plus.

Simon, H.A. (1947). Administrative behavior: A study of decision-making processes in

administrative organization. New York, NY: Macmillan.

Smith-Walter, A. (2018). Victims of health-care reform: Hirschman’s rhetoric of reaction

in the shadow of federalism. Policy Studies Journal, 46(4), 894-921.

South Carolina Commission on Higher Education. (2014). A review of parity funding for

South Carolina higher education. Retrieved from www.che.sc.gov.

Stensaker, B., Frølich, N., Huisman, J., Waagene, E., Scordato, L., and P. Pimentel Bótas.

(2014). Factors affecting strategic change in higher education. Journal of Strategy

and Management, 7(2), 193-207.

Stockfisch, J. (2014, June 9). USF St. Pete Makes Push to Retain Students. Tampa Bay

Times. Retrieved from https://www.flbog.edu/board/ office/budget/_doc/

performance_funding/newsclips/USF-St.Pete-makes-push-to-retain-students-

TBO.pdf.

Stone, D.A. (1989). Causal stories and the formation of policy agendas. Political Science

Quarterly, 104(2), 281-300.

Stone, D.A. (2012). Policy paradox: The art of political decision making (3rd ed.). New

York, NY: W.W. Norton & Co.

Taylor, F.W. (1911). The principles of scientific management. New York, NY:

Harper and Brothers.

Tennessee Higher Education Commission (2010). Outcomes-based formula narrative.

Retrieved from https://www.tn.gov/content/dam/tn/thec/bureau/fiscal_

admin/fiscal_pol/obff/1-Funding_Formula_2010-15_Presentation.pptx.

Tennessee Higher Education Commission (2015). Outcomes-based formula narrative.

Retrieved from https://www.tn.gov/content/dam/tn/thec/bureau/fiscal_admin/

fiscal_pol/obff/1_-_Outcomes_Based_Funding_Formula_Overview_-

_One_Page.pdf.

Travis, S. (2016, July 1). FAU's $25 million windfall to pay for raises, research, safety.

Sun Sentinel. Retrieved from https://www.flbog.edu/board/

office/budget/_doc/performance_funding/newsclips/FAU_25M_Windfall.pdf.

Trombley, W. (2001). Florida's new "K-20" Model: An intensely political battle is waged

over controversial kindergarten-through graduate-school governance structure.

National Center for Public Policy and Higher Education. Retrieved from

http://www.highereducation.org/crosstalk/ct0401/news0401-florida.shtml.

Tucker, N. (2016). Jeb Bush: He got his way. Then he got a mess. The Washington Post.

Retrieved from https://www.washingtonpost.com/sf/national/2016/01/07/

decidersbush.

Umbricht, M.R., Fernandez, F., and J.C. Ortagus. (2017). An examination of the

(un)intended consequences of performance funding in higher education.

Educational Policy, 31(5), 643-673.

University of Central Florida (2012a). UCF Accountability Report, 2010-11.

Retrieved from https://www.flbog.edu/board/_doc/accountability/UCF_2010-

11_Annual_Report_FINAL.pdf.

University of Central Florida (2012b). UCF Work Plan, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/UCF_2012-

13_Workplan_FINAL.pdf.

University of Central Florida (2013a). UCF Accountability Report, 2011-12.

Retrieved from https://www.flbog.edu/board/_doc/accountability/UCF_2011-

12_Accountability_Report_FINAL.pdf.

University of Central Florida (2013b). UCF Work Plan, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/UCF_2013-

14_Workplan_FINAL.pdf.

University of Central Florida (2014a). UCF Accountability Report, 2012-13.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-

13/UCF_2012_13_Accountability_Report_FINAL.pdf.

University of Central Florida (2014b). UCF Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/UCF_2014-

15_Workplan_FINAL.pdf.

University of Central Florida (2015a). UCF Accountability Report, 2013-14.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-

14/UCF_2013-14_Accountability_Report_FINAL_Revised_2015-01-21.pdf.

University of Central Florida (2015b). UCF Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/UCF_2015_Work_P

lan_FINAL.pdf.

University of Central Florida (2016a). UCF Accountability Report, 2014-15.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-

15/UCF_2014-15_Accountability_Report_FINAL.pdf.

University of Central Florida (2016b). UCF Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/UCF_2016_Work_P

lan_FINAL.pdf.

University of Central Florida (2017a). UCF Accountability Report, 2015-16.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-

16/UCF_2015_16_Accountability_Report_FINAL.pdf.

University of Central Florida (2017b). UCF Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/UCF_2017_Work_P

lan_FINAL_2017-05-18.pdf.

University of Florida (2012a). UF Accountability Report, 2010-11. Retrieved from

https://www.flbog.edu/board/_doc/accountability/UF_2010-11_Annual_Report_

FINAL.pdf.

University of Florida (2012b). UF Work Plan, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/UF_2012-

13_Workplan_FINAL.pdf.

University of Florida (2013a). UF Accountability Report, 2011-12. Retrieved from

https://www.flbog.edu/board/_doc/accountability/UF_2011-12_Accountability

_Report_FINAL.pdf.

University of Florida (2013b). UF Work Plan, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/UF_2013-

14_Workplan_FINAL.pdf.

University of Florida (2014a). UF Accountability Report, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2012-13/UF_2012_13

_Accountability_Report_FINAL.pdf.

University of Florida (2014b). UF Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/UF_2014-

15_Workplan_FINAL.pdf.

University of Florida (2015a). UF Accountability Report, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2013-14/UF_2013-

14_Accountability_Report_REVISED_FINAL_2015-03-12.pdf.

University of Florida (2015b). UF Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/UF_2015_Work_Pla

n_FINAL.pdf.

University of Florida (2016a). UF Accountability Report, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2014-15/UF_2014-

15_Accountability_Report_FINAL.pdf.

University of Florida (2016b). UF Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/UF_2016_Work_Pla

n_FINAL.pdf.

University of Florida (2017a). UF Accountability Report, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/accountability/ar_2015-16/UF_2015-

16_Accountability_Report_FINAL.pdf.

University of Florida (2017b). UF Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/UF_2017_Work_Pla

n_FINAL_2017-06-15.pdf.

University of North Florida (2012a). UNF Accountability Report, 2010-11.

Retrieved from

https://www.flbog.edu/board/_doc/accountability/UNF_2010-11_Annual_Report

_FINAL.pdf.

University of North Florida (2012b). UNF Work Plan, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/UNF_2012-

13_Workplan_FINAL.pdf.

University of North Florida (2013a). UNF Accountability Report, 2011-12.

Retrieved from https://www.flbog.edu/board/_doc/accountability/UNF_2011-

12_Accountability_Report_FINAL.pdf.

University of North Florida (2013b). UNF Work Plan, 2013-14. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/UNF_2013-

14_Workplan_FINAL.pdf.

University of North Florida (2014a). UNF Accountability Report, 2012-13.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-

13/UNF_2012_13_Accountability_Report_FINAL.pdf.

University of North Florida (2014b). UNF Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/UNF_2014-

15_Workplan_FINAL.pdf.

University of North Florida (2015a). UNF Accountability Report, 2013-14.

Retrieved from https://www.flbog.edu/board/_doc/accountability/

ar_2013-14/UNF_2013-14_Accountability_Report_FINAL_2015-01-20.pdf.

University of North Florida (2015b). UNF Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/UNF_2015_Work_P

lan_FINAL.pdf.

University of North Florida (2016a). UNF Accountability Report, 2014-15.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-

15/UNF_2014-15_Accountability_Report_FINAL_2016-03-08.pdf.

University of North Florida (2016b). UNF Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/UNF_2016_Work_P

lan_FINAL.pdf.

University of North Florida (2017a). UNF Accountability Report, 2015-16.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-

16/UNF_2015_16_Accountability_Report_FINAL.pdf.

University of North Florida (2017b). UNF Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/UNF_2017_Work_P

lan_FINAL_2017-05-24.pdf.

University of South Florida (2012a). USF Accountability Report, 2010-11.

Retrieved from https://www.flbog.edu/board/_doc/accountability/USF_2010-

11_Annual_Report_FINAL.pdf.

University of South Florida (2012b). USF Work Plan, 2012-13. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2012-

13/USF_System_2012-13_Workplan_FINAL.pdf.

University of South Florida (2013). USF Accountability Report, 2011-12. Retrieved from

https://www.flbog.edu/board/_doc/accountability/USF_2011-12_Accountability_

Report_FINAL.pdf.

University of South Florida (2014a). USF Accountability Report, 2012-13.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-

13/USF_SYSTEM_2012_13_Accountability_Report_FINAL.pdf.

University of South Florida (2014b). USF Work Plan, 2014-15. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/USF_2014-

15_Workplan_FINAL.pdf.

University of South Florida (2015a). USF Accountability Report, 2013-14.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-

14/USF-SYSTEM_2013-14%20Accountability%20Report_FINAL_2014-12-

18.pdf.

University of South Florida (2015b). USF Work Plan, 2015-16. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2015/USF_2015_Work_P

lan_FINAL_REVISED.pdf.

University of South Florida (2016a). USF Accountability Report, 2014-15.

Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-

15/USF-SYSTEM_2014-15_Accountability_Report_FINAL_2016-03-03.pdf.

University of South Florida (2016b). USF Work Plan, 2016-17. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2016/USF_2016_Work_P

lan_FINAL.pdf.

University of South Florida (2017a). USF Accountability Report, 2015-16.

Retrieved from https://www.flbog.edu/board/_doc/accountability/

ar_2015-16/USF%20SYSTEM_2015-16_Accountability_2017-03-

09_FINAL_REVISED.pdf.

University of South Florida (2017b). USF Work Plan, 2017-18. Retrieved from

https://www.flbog.edu/board/_doc/workplan/workplan_2017/USF-

SYS_2017_Work_Plan_FINAL_2017-06-12.pdf.

University of South Florida–Tampa (2013). USF–Tampa Work Plan, 2013-14.

Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-

14/USF-TAMPA_2013-14_Workplan_FINAL.pdf.

University of South Florida–Tampa (2014a). USF–Tampa Accountability Report, 2012-

13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-

13/USF_TAMPA_2012_13_Accountability_Report_FINAL.pdf.

University of South Florida–Tampa (2014b). USF–Tampa Work Plan, 2014-15.

Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-

15/USF-TAMPA_2014-15_Workplan_FINAL.pdf.

University of South Florida–Tampa (2015). USF–Tampa Work Plan, 2015-16.

Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/

USF-Tampa_2015_Work_Plan_FINAL_REVISED.pdf.

University of South Florida–Tampa (2016). USF–Tampa Work Plan, 2016-17.

Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2016/USF-Tampa_2016_Work_Plan_FINAL.pdf.

University of South Florida–Tampa (2017). USF–Tampa Work Plan, 2017-18.

Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2017/USF-TAMPA_2017_Work_Plan_FINAL_2017-06-12.pdf.

University of South Florida–St. Petersburg (2013). USF–St. Petersburg Work Plan, 2013-

14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-

14/USF-SP_2013-14_Workplan_FINAL.pdf.

University of South Florida–St. Petersburg (2014a). USF–St. Petersburg Accountability

Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/

accountability/ar_2012-13/USF_ST.PETERSBURG_2012_13_

Accountability_Report_FINAL.pdf.

University of South Florida–St. Petersburg (2014b). USF–St. Petersburg Work Plan,

2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2014-15/USF-SP_2014-15_Workplan_FINAL.pdf.

University of South Florida–St. Petersburg (2015). USF–St. Petersburg Work Plan, 2015-

16. Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2015/USF-SP_2015_Work_Plan_FINAL.pdf.

University of South Florida–St. Petersburg (2016). USF–St. Petersburg Work Plan, 2016-

17. Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2016/USF-SP_2016_Work_Plan_FINAL.pdf.

University of South Florida–St. Petersburg (2017). USF–St. Petersburg Work Plan, 2017-

18. Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2017/USF-STPETE_2017_Work_Plan_FINAL_2017-06-12.pdf.

University of South Florida–Sarasota/Manatee (2013). USF–Sarasota/Manatee Work

Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/

workplan_2013-14/USF-SM_2013-14_Workplan_FINAL.pdf.

University of South Florida–Sarasota/Manatee (2014). USF–Sarasota/Manatee Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/USF-SM_2014-15_Workplan_FINAL.pdf.

University of South Florida–Sarasota/Manatee (2015). USF–Sarasota/Manatee Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/USF-SM_2015_Work_Plan_FINAL.pdf.

University of South Florida–Sarasota/Manatee (2016). USF–Sarasota/Manatee Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/USF-SM_2016_Work_Plan_FINAL.pdf.

University of South Florida–Sarasota/Manatee (2017a). USF–Sarasota/Manatee Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/USF_SARASOTA-MANATEE_2012_13_Accountability_Report_FINAL.pdf.

University of South Florida–Sarasota/Manatee (2017b). USF–Sarasota/Manatee Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/USF-SAR-MAN_2017_Work_Plan_FINAL_2017-06-12.pdf.

University of West Florida (2012a). UWF Accountability Report, 2010-11. Retrieved from https://www.flbog.edu/board/_doc/accountability/UWF_2010-11_Annual_Report_FINAL.pdf.

University of West Florida (2012b). UWF Work Plan, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2012-13/UWF_2012-13_Workplan_FINAL.pdf.


University of West Florida (2013a). UWF Accountability Report, 2011-12. Retrieved from https://www.flbog.edu/board/_doc/accountability/UWF_2011-12_Accountability_Report_FINAL.pdf.

University of West Florida (2013b). UWF Work Plan, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2013-14/UWF%202013-14_Workplan_FINAL.pdf.

University of West Florida (2014a). UWF Accountability Report, 2012-13. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2012-13/UWF_2012_13_Accountability_Report_FINAL.pdf.

University of West Florida (2014b). UWF Work Plan, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2014-15/UWF_2014-15_Workplan_FINAL.pdf.

University of West Florida (2015a). UWF Accountability Report, 2013-14. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2013-14/UWF_2013-14_Accountability_Report_FINAL_2014-12-12.pdf.

University of West Florida (2015b). UWF Work Plan, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2015/UWF_2015_Work_Plan_FINAL.pdf.

University of West Florida (2016a). UWF Accountability Report, 2014-15. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2014-15/UWF_2014-15_Accountability_Report_FINAL.pdf.


University of West Florida (2016b). UWF Work Plan, 2016-17. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2016/UWF_2016_Work_Plan_FINAL.pdf.

University of West Florida (2017a). UWF Accountability Report, 2015-16. Retrieved from https://www.flbog.edu/board/_doc/accountability/ar_2015-16/UWF_2015_16_Accountability_Report_FINAL.pdf.

University of West Florida (2017b). UWF Work Plan, 2017-18. Retrieved from https://www.flbog.edu/board/_doc/workplan/workplan_2017/UWF_2017_Workplan_FINAL_2017-05-25.pdf.

USF New Era (2018). University of South Florida Preeminence website. Retrieved from https://usfnewera.org.

UWF News Room (2017, June 22). UWF ranks in top three in Florida Board of Governors performance-based funding model. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/UWF%20ranks%20high%20in%20Florida%20Board%20of%20Governors'%20performance%20funding%20model.pdf.

Waldo, D. (1948). The administrative state: A study of the political theory of American public administration. New York, NY: Ronald Press.

Webb, S. (2015, August 16). New College is striving for state dollars. Herald-Tribune. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/New-College-is-striving-for-state-dollars-HeraldTribune.pdf.


Webber, D. A., & Ehrenberg, R. G. (2010). Do expenditures other than instructional expenditures affect graduation and persistence rates in American higher education? Economics of Education Review, 29(6), 947-958.

Wexelman, A. (2015, February 5). UCF receives $21.8 million in performance funding. Central Florida Future. Retrieved from https://www.flbog.edu/board/office/budget/_doc/performance_funding/newsclips/UCF-receives-$21.8M-in-PBF.pdf.

Wilson, W. (1887). The study of administration. Political Science Quarterly, 2(2), 197-222.
