
Measuring Productivity And Performance In A System Of Systems Of Government Organizations, Educational Institutions And Industry To Meet The Needs Of Society

by

Jon Ilseng, B.S. Basic Sciences, M.S. Engineering Management, M.E. Transdisciplinary Studies

A Dissertation

In

MECHANICAL ENGINEERING

Submitted to the Graduate Faculty of Texas Tech University in Partial Fulfillment of the Requirements for the Degree of

DOCTOR OF PHILOSOPHY

Approved

Dr. Atila Ertas

Dr. Susan Mengel

Dr. Burak Aksak

Dr. David Wyrick

Mark A. Sheridan, PhD
Dean of the Graduate School

December, 2015

Copyright 2015, Jon Ilseng


ACKNOWLEDGMENTS

There are numerous persons I wish to acknowledge as I prepare this dissertation. First, I would like to thank my beautiful wife Kim for her support, encouragement and love during this difficult process. She was always willing to listen to my gripes and complaints and was a tremendous encourager. I would like to acknowledge my son J.J. Ilseng, daughter Brooke Thompson and her husband Ryan Thompson, and daughter Kaitlin Olson and her husband Jeff Olson. They were always there to encourage me from the very start of the PhD program. I would like to thank committee advisor Dr. David Wyrick for the tremendous advice on research material and the excellent feedback he gave me as I prepared this dissertation. I know it was very difficult for him while he was out of the country, but his responsive feedback to my numerous questions was very helpful. I would like to thank Dr. Atila Ertas for the excellent feedback he gave on this dissertation. Dr. Ertas has been an outstanding resource and provided excellent guidance on how to pursue a PhD.


TABLE OF CONTENTS

ABSTRACT
1 INTRODUCTION
1.1 Background
1.1.1 Introduction
1.2 Motivation
1.3 Goal
1.4 Significance
1.5 Research Question
1.6 Specific Aims
1.6.1 State the performance measures in industry, government and education (Aim 1)
1.6.2 Identify System Relationships from a SoS perspective (Aim 2)
1.6.3 Propose SoS Sustainability Performance Measures (Aim 3)
1.7 Scope of Work
1.8 Organization of the Dissertation
2 LITERATURE SEARCH
2.1 Sustainability Literature
2.2 System-of-Systems Literature
2.3 Balanced Scorecard Framework Literature
2.4 Current Performance Measures Collected by Government Organizations, For-Profit Companies, Non-Profit Companies and Educational Institutions Literature
2.5 Evidential Reasoning Literature
2.6 Data Envelopment Analysis Literature
2.7 Current Performance Measurement Systems Literature
3 RESEARCH DESIGN AND METHODS
3.1 Introduction
3.2 Performance Measures (Aim 1)
3.2.1 Data Envelopment Analysis
3.2.2 Evidential Reasoning
3.3 Identify System Relationships from a SoS perspective (Aim 2)
3.3.1 Introduction
3.3.2 SoS Relationship
3.4 Propose SoS Sustainability Performance Measures (Aim 3)
3.4.1 Introduction
3.4.2 Approach
4 IMPLEMENTATION
4.1 Introduction
4.1.1 Case Studies for Government Agencies
4.1.2 Case Studies for For-Profit Organizations
4.1.3 Case Studies for Educational Institutions
4.1.4 Case Studies for Data Envelopment Analysis
4.1.5 Case Studies for Evidential Reasoning
5 METHODOLOGY
6 SURVEY DEVELOPMENT AND DATA COLLECTION
6.1 Cost Performance Index Data
6.2 Logistics Performance Index Data
6.3 City of Dallas Measures
6.3.1 Economic Vibrancy Focus Area
6.3.2 Office of Economic Development - Dallas Protocol and World Affairs
6.3.3 Clean Healthy Environment Focus Area
6.3.4 Effective, Efficient and Economical Focus Area
6.4 Texas Tech University Institutional Review Board Proposal
7 RESULTS AND DISCUSSION
8 CONCLUSIONS, FUTURE WORK AND CONTRIBUTIONS
Appendix A CPICTD DATA
Appendix B LPI DATA
Appendix C DALLAS ECONOMIC VIBRANCY FOCUS AREA DATA
Appendix D DALLAS CLEAN HEALTHY FOCUS AREA DATA
Appendix E DALLAS EFFECTIVE, EFFICIENT AND ECONOMICAL FOCUS AREA DATA
Appendix F TEXAS TECH UNIVERSITY INSTITUTIONAL REVIEW BOARD PROPOSAL
Appendix G ONLINE SURVEY RESULTS
Appendix H COST LOGISTICS INDEX SURVEY RESULTS
Appendix I PERFORMANCE LOGISTICS INDEX SURVEY RESULTS
Appendix J AVAILABILITY LOGISTICS INDEX SURVEY RESULTS

LIST OF TABLES

Table 3-1: Data Envelopment Analysis Model
Table 3-2: CSM performance dimensions and measures [112]
Table 3-3: Inputs and outputs of public university teaching efficiency [113]
Table 3-4: Inputs and outputs of public university research efficiency [114]
Table 3-5: ER results for three schools [115]
Table 3-6: Attribute probabilities for ER results [116]
Table 3-7: Attribute probabilities for ER results [117]
Table 3-8: SoS characteristics from Environmental Aspects [118]
Table 4-1: Journal articles' case studies for SoS for-profit organizations, government agencies and educational institutions
Table 4-2: Journal articles' case studies using Data Envelopment Analysis and Evidential Reasoning
Table 4-3: Respondents and Contract Areas
Table 4-4: Performance Measures Used (% "yes")
Table 4-5: Corporate Social Performance [149]
Table 4-6: Characteristics of SoS Engineering Problems and Corporate Sustainability [152]
Table 4-7: Twelve Nigerian Universities used in Case Study [154]
Table 4-8: Output Factors and Research Productivities by Field of Study [156]
Table 6-1: U.S. Air Force B-1B Bomber Program CPICTD Data
Table 6-2: U.S. Air Force C-17A Transport Program CPICTD Data
Table 6-3: Calendar Year 2007 Reported Logistics Performance Index Scores
Table 6-4: Number of Awareness Events Held
Table 6-5: Number of Business Referrals - Protocol
Table 6-6: Regulated Source Investigations
Table 6-7: Number of Purchasing Transactions
Table 7-1: 2007 LPI Scores
Table 7-2: 2010 LPI Scores
Table 7-3: 2012 LPI Scores
Table 7-4: 2014 LPI Scores
Table 7-5: Standard Balanced Scorecard for SoS parasitic measures
Table 7-6: Standard Balanced Scorecard for on-line survey results
Table 8-1: B-1B CPICTD and LPI measures versus CLI SoS measure
Table 8-2: C-17A CPICTD and LPI measures versus CLI measure
Table 8-3: System Performance measures versus PLI measure
Table 8-4: Availability Performance measures versus ALI measure
Table A-9-1: U.S. Navy E-6A TACAMO Program CPICTD Data
Table A-9-2: U.S. Navy F/A-18E/F Fighter/Attack Program CPICTD Data
Table A-9-3: U.S. Air Force F-22 Advanced Tactical Fighter Program CPICTD Data
Table A-9-4: U.S. Air Force/Navy Joint Primary Aircraft Training System Program CPICTD Data
Table A-9-5: U.S. Air Force Joint Surveillance Target Attack Radar System (Airborne Segment) Program CPICTD Data
Table A-9-6: U.S. Army Longbow Apache Airframe Modifications Program CPICTD Data
Table A-9-7: U.S. Army AH-64 Helicopter Program CPICTD Data
Table A-9-8: U.S. Navy Undergraduate Jet Flight Training System (45TS) Program CPICTD Data
Table A-9-9: U.S. Army Advanced Field Artillery Tactical Data System Program CPICTD Data
Table A-9-10: U.S. Air Force Airborne Warning and Control System Program CPICTD Data
Table A-9-11: U.S. Air Force B-1B Conventional Mission Upgrade Program-Computer CPICTD Data
Table A-9-12: U.S. Air Force B-1B Conventional Mission Upgrade Program-Joint Direct Attack Munition CPICTD Data
Table A-9-13: U.S. Air Force High Speed Anti-Radiation Missile Program CPICTD Data
Table A-9-14: U.S. Navy Harpoon Missile Program CPICTD Data
Table A-9-15: U.S. Navy High Speed Anti-Radiation Missile Program CPICTD Data
Table A-9-16: U.S. Air Force Low Altitude Navigation Targeting Infrared Night (LANTIRN) Program CPICTD Data
Table A-9-17: U.S. Army Longbow Apache Fire Control Radar Program CPICTD Data
Table A-9-18: U.S. Army Multiple Launch Rocket System Program CPICTD Data
Table A-9-19: U.S. Air Force Joint Tactical Information Distribution System Program CPICTD Data
Table A-9-20: U.S. Army Forward Area Air Defense Command and Control Program CPICTD Data
Table A-9-21: U.S. Army All Source Analysis System/Enemy Situation Correlation Element Program CPICTD Data
Table A-9-22: U.S. Air Force Advanced Medium Range Air-to-Air Missile Program CPICTD Data
Table A-9-23: U.S. Navy AIM-7M Short Range Air-to-Air Missile Program CPICTD Data
Table A-9-24: U.S. Army Longbow Hellfire Missile Program CPICTD Data
Table A-9-25: U.S. Army Tactical Missile System Program CPICTD Data
Table A-9-26: U.S. Army Patriot Advanced Capability Missile Program CPICTD Data
Table A-9-27: U.S. Navy Trident II Missile Program CPICTD Data
Table A-9-28: U.S. Air Force NAVSTAR Global Positioning System Satellite Program CPICTD Data
Table A-9-29: U.S. Air Force Ground Launched Cruise Missile Program CPICTD Data
Table A-9-30: U.S. Air Force Cluster Bomb Unit-97B Sensor Fused Weapon Program CPICTD Data
Table A-9-31: U.S. Army Bradley Fighting Vehicle Program CPICTD Data
Table A-9-32: U.S. Navy Advanced Self Protection Jammer Program CPICTD Data
Table A-9-33: U.S. Army Advanced Anti-tank Weapon System-Medium Program CPICTD Data
Table B-10-1: Calendar Year 2010 Reported Logistics Performance Index Scores
Table B-10-2: Calendar Year 2012 Reported Logistics Performance Index Scores
Table B-10-3: Calendar Year 2014 Reported Logistics Performance Index Scores
Table C-10-4: Number of business-related inbound delegations assisted to promote international business
Table C-10-5: Number of COD Partnership Events

LIST OF FIGURES

Figure 1-1 Areas of Emphasis in Systems and SoS [1]
Figure 1-2 Venn Diagram Industry, Government and Education
Figure 5-1 Standard Balanced Scorecard Template [171]
Figure 5-2 Standard Balanced Scorecard Template [172]
Figure 7-1 Cost/Logistic Index Measures for fifty-four systems in Dallas-Fort Worth, TX SoS
Figure 7-2 Cost/Logistic Index Measures for thirty-six systems in Dallas-Fort Worth, TX SoS
Figure 7-3 Performance/Logistic Index Measures for fifty-four systems in Dallas-Fort Worth, TX SoS
Figure 7-4 Performance/Logistic Index Measures for thirty-six systems in Dallas-Fort Worth, TX SoS
Figure 7-5 Availability/Logistic Index Measures for fifty-four systems in Dallas-Fort Worth, TX SoS
Figure 7-6 Availability/Logistic Index Measures for thirty-six systems in Dallas-Fort Worth, TX SoS
Figure 8-1 BSC Performance Measures


ABSTRACT

For-profit organizations strive to be as effective and efficient as possible. Lean management has become the de facto standard for resource allocation and decision-making. Previous work has described the pros and cons of lean management, both for short-term competitive positioning and for long-term viability. One of the key problems is how to measure performance such that doing things right (efficiency) does not impair the organization from doing the right thing (effectiveness). Manufacturing firms, like other for-profit organizations, are situated in areas that have a population with an educational system. This creates a local System of Systems. The interaction of these systems can be either parasitic, where the success of one requires the sacrifice of another, or symbiotic, where success for one fuels success in the others. As sustainability concerns become more pervasive, the relationship of efficiency and effectiveness in this System of Systems becomes of greater interest. How performance is measured in such an environment is therefore important. This dissertation reviews the traditional measures of performance in for-profit organizations, governmental organizations and educational institutions. The relation of for-profit firms, education and government is phrased in a systems perspective, with a discussion of the forces for parasitic and symbiotic behavior. Based on this, performance measures that integrate the needs of the various systems, as well as the overall System of Systems, are proposed. For-profit organizations, manufacturing firms and research and development (R&D) agencies tend to be located in areas that have a high number of technology companies and well-known educational institutions. The Dallas-Fort Worth, Texas metropolitan area is home to a high number of technology companies (e.g. Raytheon, Lockheed Martin, Cisco, Oracle, Texas Instruments) and to well-known educational institutions such as the University of Texas-Dallas, the University of Texas-Arlington, Southern Methodist University and Texas Christian University. The Dallas-Fort Worth, Texas metropolitan area is the local System of Systems studied here. Focusing on one local System of Systems is more efficient and provides a larger impact than focusing on numerous local Systems of Systems. This dissertation proposes the following performance measures:

1. Cost/Logistics Index (CLI) – measures the actual cost of the SoS versus the logistics information.
2. Performance/Logistics Index (PLI) – measures the performance of the SoS versus the logistics information.
3. Availability/Logistics Index (ALI) – measures the availability of the SoS versus the logistics information.

These proposed performance measures integrate the various needs of the Dallas-Fort Worth, Texas SoS with regard to sustainability concerns and further the research on Transdisciplinary Engineering. Transdisciplinary research and educational programs typically consist of four core courses (design fundamentals, process fundamentals, systems fundamentals and metrics fundamentals). The metrics fundamentals course develops concepts of engineering measurement as well as quality assurance; the proposed performance measures could be added to the current transdisciplinary metrics fundamentals course. I recommend that the above performance measures be used to meet current and future sustainability concerns. These proposed performance measures enable the Dallas-Fort Worth SoS to truly execute its strategies.


CHAPTER 1
INTRODUCTION

1.1 Background

1.1.1 Introduction

Performance measures are currently used in for-profit organizations, government agencies and educational institutions to determine the efficiency and effectiveness of such organizations. Performance measures quantitatively describe key information about services and the processes that produce them. Performance measures are typically based on data and describe whether an organization is achieving its goals and whether progress is being made toward meeting them. Performance measures normally let organizations know:

• How they are currently performing relative to their competitors.
• Whether they are meeting their current goals.
• Whether their customers are satisfied.
• Whether their processes are in statistical control.
• If and where improvements are necessary.
• The information necessary to make intelligent decisions.

A performance measure is typically composed of a number or value and a unit of measure. The number or value is a magnitude (how much) and the unit of measure is a meaning (what). Most performance measures can be categorized into one of the following five general categories:

1. Efficiency: Indicates the degree to which a process produces the required output at a minimum resource cost. It asks the question "Are we doing things right?"
2. Quality: Degree to which a product or service meets customer requirements and expectations.
3. Timeliness: Whether a unit of work was done correctly and on time.
4. Productivity: The value added by the process divided by the value of the labor and capital consumed.
5. Safety: Determines the overall health of the organization and the working environment of the employees.
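A hedged illustration of this structure may help. In the following minimal Python sketch, the `PerformanceMeasure` class, the `productivity` helper and the dollar figures are hypothetical constructs for this document, not part of the dissertation's methodology; the sketch simply pairs a magnitude with a unit and computes the productivity ratio defined in category 4 above.

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    """A performance measure: a magnitude ("how much") plus a unit of measure ("what")."""
    name: str
    value: float  # the number or value (magnitude)
    unit: str     # the unit of measure (meaning)

def productivity(value_added: float, labor_cost: float, capital_cost: float) -> PerformanceMeasure:
    """Category 4 above: value added divided by the value of labor and capital consumed."""
    return PerformanceMeasure(
        name="productivity",
        value=value_added / (labor_cost + capital_cost),
        unit="dollars of value added per dollar of labor and capital",
    )

# Illustrative (made-up) figures for one process over one year.
p = productivity(value_added=1_200_000, labor_cost=600_000, capital_cost=200_000)
print(f"{p.name}: {p.value:.2f} {p.unit}")  # productivity: 1.50 dollars ... per dollar
```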

Performance measures are normally objective, timely, accurate, useful, motivating and trackable. They provide a tool for organizations to manage progress toward achieving predetermined goals, defining key indicators of organizational performance and customer satisfaction. Currently, there are traditional performance measures (e.g. productivity, efficiency and quality performance metrics) in industry, government agencies and educational institutions which determine the efficiency and effectiveness of such organizations. Good performance is the criterion whereby an organization (e.g. for-profit, not-for-profit, government, educational) determines its capability to prevail in competitive environments. As these organizations strive to be effective and efficient, one key problem is how to measure performance such that doing things right (efficiency) does not impair an organization from doing the right thing (effectiveness). These traditional performance measures can be analyzed from a System-of-Systems (SoS) perspective. For the purposes of this dissertation, the SoS definition given in the U.S. Department of Defense (DoD) Defense Acquisition Guidebook will be used:

A system of systems is a set or arrangement of interdependent systems that are related or connected to provide a given capability. The loss of any part of the system will significantly degrade the performance or capabilities of the whole. The development of a system of systems solution will involve trade space between the systems as well as within an individual system's performance.

Areas of emphasis for SoS are shown in Figure 1-1. For-profit organizations, manufacturing firms and research and development (R&D) agencies tend to be located in areas of the world that have educational systems. This creates a local SoS. The interactions within this local SoS can be either parasitic, where the success of one requires the sacrifice of another, or symbiotic, where success for one fuels success in the others. With sustainability concerns becoming more pervasive in today's world, the relationship of efficiency and effectiveness in this local SoS becomes of greater interest. Performance measures that integrate the needs of these organizations and the overall SoS also become very important.

Figure 1-1 Areas of Emphasis in Systems and SoS [1]

This dissertation investigates the traditional performance measures in for-profit organizations, government agencies and educational institutions. The relation of firms, education and government will be phrased in a systems perspective, with a discussion of the forces for parasitic and symbiotic behavior. Based on this, performance measures that integrate the needs of the various systems, as well as the overall SoS, are proposed.

1.2 Motivation

For-profit organizations, not-for-profit organizations, government agencies and educational institutions all strive to be as effective, productive and efficient as possible. Each is always told "to do more with less": in other words, produce more automobiles, cellular phones, regulations, research grants, graduates, etc. than the previous year, but do it with reduced budgets and resources and in a timely manner. Popular techniques such as Lean Management, Six Sigma and Total Quality Management are benchmarks for decision-making and for making such organizations more productive and effective. However, it is important that efficiency does not impair an organization from being effective and productive. For-profit organizations, manufacturing firms and research and development (R&D) agencies tend to be located in areas of the world that have educational systems. This creates a local SoS. The Silicon Valley metropolitan area in Northern California is home to a high number of R&D firms, technology corporations and venture capitalists. Also located in Silicon Valley are well-known educational institutions such as the University of California-Berkeley and Stanford University. The Boston, Massachusetts metropolitan area is home to a high number of technology companies as well as well-known educational institutions such as the Massachusetts Institute of Technology, Northeastern University, Harvard University and Boston College. The Dallas-Fort Worth, Texas metropolitan area is also home to a high number of technology companies (e.g. Raytheon, Lockheed Martin, Cisco, Oracle) and well-known educational institutions such as the University of Texas-Dallas, the University of Texas-Arlington, Southern Methodist University and Texas Christian University. For the purposes of this dissertation, the focus of the local SoS will be the Dallas-Fort Worth, Texas metropolitan area. Focusing on one local SoS is much more efficient and provides a much greater impact than focusing on numerous local SoSs.

1.3 Goal

Current performance measures for the local Dallas-Fort Worth, Texas SoS do not integrate needs for a sustainable society, do not measure the relationship of efficiency and effectiveness, and do not measure interactions (parasitic or symbiotic). The goal of this dissertation is to propose performance measures that integrate the needs of the local Dallas-Fort Worth, Texas SoS and the needs of for-profit organizations, government agencies and educational institutions. The intent is to provide enough detailed data and information on these proposed performance measures to meet current and future sustainability concerns. As sustainability concerns and issues become pervasive, the relationship between productivity/efficiency and effectiveness in the local Dallas-Fort Worth, Texas SoS becomes very important. These proposed performance measures: 1) take into account cost, performance, availability and logistics characteristics, which current performance measures do not; 2) integrate the local Dallas-Fort Worth, TX SoS to meet current and future sustainability concerns, which current performance measures do not; and 3) will be used for a current U.S. Air Force contract to help reduce life cycle costs. Figure 1-2 is a Venn diagram that shows the intersection of the current performance measures that industry, government and education use in the local Dallas-Fort Worth, TX SoS. The green intersected areas represent a symbiotic SoS, while the red intersected area represents the "sweet spot" of those performance measures that are used in common by industry, government and education. The common performance measures are the ones which cause an issue of efficiency and effectiveness because they are defined and measured differently by industry, government and education in the local Dallas-Fort Worth, TX SoS. They impair the local Dallas-Fort Worth, TX SoS from doing things right (efficiency) and doing the right thing (effectiveness). Managing them is the most effective, efficient and strategic way to get increased participation from industry, government and education. These common performance measures are as follows:


• Engineering Productivity
• Organizational Performance
• Research and Development Productivity
• Corporate Social Performance
• Operational Performance
• Customer Satisfaction
• Financial Performance
• Human Resource Performance
• Resource Productivity
• Research & Development Performance
• Contract Accountability
• Sustainability Measurement
• Environmental Sustainability

Figure 1-2 Venn Diagram Industry, Government and Education


The performance measures proposed address each of these common measures and are discussed in Chapter 7.

1.4 Significance

Interactions are very important for the local SoS. Interactions can be either parasitic, where the success of one requires the sacrifice of another, or symbiotic, where success for one fuels success in the others. As sustainability concerns become more prevalent, the relationship of efficiency and effectiveness in a System of Systems is crucial. Sustainability is a continuing global issue, and industry, educational institutions and government agencies must continue to address it. For this research, sustainability is defined as "the ability to sustain, or a state that can be maintained at a certain level" [2]. For these local SoSs to be competitive and productive, it is crucial that each one address sustainability with regard to efficiency and effectiveness. With regard to education curricula at universities and colleges worldwide, "the pace of change in education curriculums is growing exponentially due to numerous legislative arrangements and changes" [3]. With this rapid pace of change, it is critical that educational institutions be effective and efficient to meet growing sustainability needs. With regard to industry (profit and non-profit), sustainable business practices must be efficient, effective and productive. For example, in the aviation industry, aerospace manufacturers must be readily adaptable to change with regard to sustainability: the "aviation industry will be driven by, and judged on its environmental record", and "the impact of aviation on the environment has been a source of focus and concern for the industry for more than 60 years" [4]. Government agencies must also be efficient, effective and productive to meet growing sustainability needs. For example, government contracts are measured by their impact on the government's ability to effectively manage contracts, specifically measuring costs, client impact, service timeliness and disruptions that are associated with higher perceived accountability effectiveness [5]. As stated earlier, proposed performance measures will be introduced that integrate the needs of this local system of systems.

1.5 Research Question

The research questions addressed in this dissertation are the following: "What proposed measures, other than the traditional performance measures, could measure the needs of for-profit organizations, not-for-profit organizations, government agencies and educational institutions?" These proposed measures must be understandable, accountable and verifiable. "What proposed measures, other than the traditional performance measures, could integrate the needs of the local Dallas-Fort Worth, Texas SoS?"

1.6 Specific Aims

1.6.1 State the performance measures in industry, government and education (Aim 1)

What are the current performance measures in industry, government and educational institutions? There are currently traditional performance measures that industry, government and educational systems use to measure productivity and efficiency. Aim 1 reviews in detail the current performance measures in industry, government and educational systems. For example, numerous United States Department of Defense (DoD) contractors use engineering performance measures such as productivity (systems engineering, software and hardware) to measure the rate at which budgeted work must be completed to meet a schedule. Software engineering productivity measures the lines of code per hour (e.g. 1.6 LOC/hr) that a software coder must write to ensure that the software configuration item is completed on time and without error. Systems engineering productivity measures the rate at which system-level requirements are completed and verified. Hardware engineering productivity measures the rate at which engineering drawings must be completed to ensure a release date is met. Government agencies have current performance measures such as inputs (resources used), outputs (program activities), efficiency measures (ratio of inputs to outputs), and outcomes (the actual results of programs and services). These performance measures are currently used in the following areas of government agencies: 1) Risk Management, 2) Public Health, 3) Economic Development, 4) Information Technology, and 5) Streets and Roads. Government agencies include the federal, state and local levels. Educational institutions (universities, public schools, private schools, etc.) have current performance measures as well. For example, some U.S. state legislatures decide appropriations to their state colleges and universities based on how well each institution performs. Also, when the No Child Left Behind Act passed in 2001, it required states to develop subject-area measures of adequate yearly progress (AYP) in reading and math that could be used to evaluate the quality of the public education system [6].
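To make the productivity measures just described concrete, here is a small hedged sketch. The function, the 2,560-LOC configuration item and the 40-day schedule are made-up illustrations; only the 1.6 LOC/hr budgeted rate is taken from the text above.

```python
def required_loc_rate(loc_remaining: int, workdays_remaining: int,
                      hours_per_day: float = 8.0) -> float:
    """LOC/hr at which the remaining code must be written to meet the schedule."""
    return loc_remaining / (workdays_remaining * hours_per_day)

# Hypothetical software configuration item: 2,560 LOC left, 40 working days to release.
required = required_loc_rate(loc_remaining=2560, workdays_remaining=40)
budgeted = 1.6  # LOC/hr, the budgeted productivity cited in the text

print(f"required: {required:.1f} LOC/hr vs. budgeted: {budgeted} LOC/hr")
print("schedule at risk" if required > budgeted else "on plan")  # -> schedule at risk
```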

1.6.2 Identify System Relationships from a SoS perspective (Aim 2)

Aim 2 identifies the relation of these systems from a System-of-Systems perspective and discusses how parasitic and symbiotic behaviors affect it. For the SoS definition, the definition described in Paragraph 1.1.1 will be used. Parasitic behavior is where the success of one system requires the sacrifice of another; symbiotic behavior is where success for one fuels success in the others. Areas of emphasis for a SoS are shown in Figure 1-1.

1.6.3 Propose SoS Sustainability Performance Measures (Aim 3)

Aim 3 identifies and proposes performance measures that integrate the needs of these various systems, as well as the overall SoS, with regard to sustainability. Using the Balanced Scorecard (BSC) measurement system, I propose the following performance measures to address Aim 3:


1. Cost/Logistics Index (CLI) – measures the actual cost of the SoS versus the logistics information.
2. Performance/Logistics Index (PLI) – measures the performance of the SoS versus the logistics information.
3. Availability/Logistics Index (ALI) – measures the availability of the SoS versus the logistics information.

The BSC measurement system has been used to measure performance and enhance responsibility in government public administration [7]. For government public administration, the BSC measurement system included four perspectives to measure government performance: financial responsibility, service to citizens, work process improvement, and learning and growth of employees [8].
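As a minimal sketch of how ratio-style indices like these could be computed: the dissertation develops its actual formulation in later chapters, so the function names, the division-based form and the sample values below are illustrative assumptions, not results from this work.

```python
def cost_logistics_index(cost_measure: float, lpi_score: float) -> float:
    """CLI: a system's cost measure relative to the logistics information."""
    return cost_measure / lpi_score

def performance_logistics_index(performance_measure: float, lpi_score: float) -> float:
    """PLI: a system's performance measure relative to the logistics information."""
    return performance_measure / lpi_score

def availability_logistics_index(availability: float, lpi_score: float) -> float:
    """ALI: a system's availability relative to the logistics information."""
    return availability / lpi_score

# Illustrative inputs: a cost performance index near 1.0 means roughly on budget;
# World Bank LPI scores fall on a 1-5 scale.
print(round(cost_logistics_index(cost_measure=0.95, lpi_score=3.9), 3))          # 0.244
print(round(availability_logistics_index(availability=0.85, lpi_score=3.9), 3))  # 0.218
```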

1.7 Scope of Work

Aim 1 defines and describes the current performance measures in industry, government and educational systems used to measure productivity and efficiency. It describes industry performance measures such as software engineering productivity, systems engineering productivity and hardware engineering productivity. It describes government performance measures such as inputs (resources used), outputs (program activities), efficiency measures (ratio of inputs to outputs), and outcomes (the actual results of programs and services). It also describes educational institutions' performance measures such as the No Child Left Behind Act measures of adequate yearly progress (AYP) in reading and math that are used to evaluate the quality of the public education system. Aim 2 defines and identifies the relation of these systems (industry, government and educational institutions) from a System-of-Systems perspective and discusses how parasitic and symbiotic behaviors affect it. Aim 3 defines and proposes performance measures that could integrate the needs of these various systems, as well as the overall SoS, with regard to sustainability.


1.8 Organization of the Dissertation

The dissertation is organized by chapters. Chapter 1, Introduction, introduces the dissertation and discusses the motivation, goals and significance of the dissertation topic; it also lists the three Aims and the scope of work for each Aim. Chapter 2, Literature Search, lists the research references and summarizes each literature reference. I researched specific topics that are relevant to this dissertation: 1) Sustainability, 2) System-of-Systems, 3) Balanced Scorecard Framework, 4) Current performance measures collected by government organizations, for-profit companies, non-profit companies and educational institutions, 5) Evidential Reasoning, 6) Data Envelopment Analysis, and 7) Current Performance Measurement Systems. Chapter 3, Research Design and Methods, discusses the research design and methods for each of the three Aims. Chapter 4, Implementation, discusses case studies that have been researched and conducted for SoS for-profit organizations, government agencies and educational institutions. Chapter 5, Methodology, discusses how the Balanced Scorecard (BSC) measurement system is used to define performance measures. Chapter 6, Survey Development and Data Collection, discusses how I collected primary data (data collected for the first time and in crude form) and secondary data (public data which has already been collected), and lists the primary and secondary data. Chapter 7, Results and Discussion, discusses the results of the primary and secondary data. Chapter 8, Conclusions, Future Work and Contributions, summarizes the dissertation and discusses what future work could be conducted from this dissertation and the contributions the dissertation makes.


CHAPTER 2
LITERATURE SEARCH

Chapter 1 introduced the problem of performance measures that can be used in the Dallas-Fort Worth industry-government-education System of Systems, and described how this dissertation is organized. Chapter 2 lists and summarizes literature references for the following areas: 1) Sustainability, 2) System-of-Systems, 3) Balanced Scorecard Framework, 4) Current performance measures collected by government organizations, for-profit companies, non-profit companies and educational institutions, 5) Evidential Reasoning, 6) Data Envelopment Analysis, and 7) Current Performance Measurement Systems. The following paragraphs summarize the literature references.

2.1 Sustainability Literature

Kajikawa describes how to bridge the existing gaps among disciplines and sectors [9]. The author considers the interdisciplinary characteristics of sustainability science and discusses the key issues in integrating disciplines. He analyzes the structure of sustainability science and discusses the necessity of transdisciplinary expertise and how such initiatives have been conducted.

The forces driving lean management, internationalization, sustainability, and technology policy are discussed by Wyrick, Natarajan, Eseonu, Lindeke & Chen [10]. They describe the implications for manufacturers, individuals and governments and provide recommendations for implementation and suggestions for further work.

Marinova & Newman describe how data on research publications show that Australia outperforms the United Kingdom and New Zealand, whose systems are being used as the model for the proposed changes in Australia [11]. They describe how research funding should provide the basis for achieving long-term sustainability. They propose a new funding model that allows for diversity and flexibility in research to properly reflect the complexity of the academic world.

Lotz-Sisitka considers the question of what education for sustainable development (ESD) research might signify when linked to the concept of "retention" and how this relation might be researched [12]. She considers two different perspectives on retention. She describes an ESD research agenda that documents retention by focusing on the issue of keeping children in schools. She also discusses a related ESD research agenda that focuses more on the pedagogical and curricular aspects of retention.

Hasna presents the application of sustainability design criteria in the context of capstone design projects by applying a social, economic, ecological, technological and time (SEETT) framework [13]. He argues that sustainability feasibility studies and assessment in capstone engineering design projects are of grave importance for the success of this new frontier.

Abbott focuses on the growth of Private Sustainable Governance (PSG) [14]. He examines the public-private engagement gap in practice and scholarship. He considers why the interstate system has been loath to engage with PSG and identifies a number of benefits that public engagement could achieve.

Carvalho proposes a methodology to establish a quantitative definition of sustainability structured on the principles of minimum and maximum entropy production [15]. Based on this, he outlines a way of organizing the many sources and kinds of energy we have available to us in order of the intensity of their respective environmental impacts. He produces an Environmental Sustainability Index, linked to existing statistical indicators of human development, and arrives at a Sustainable Human Development Index. He believes the Environmental Sustainability Index is positively or negatively influenced by parameters linked to environmental sustainability and quality of life.

Munier proposes a methodology for establishing a set of sustainability indicators as a baseline to measure the state of a city and, after appropriate actions are taken, to assess performance and determine whether changes point to improvements regarding goals [16]. The methodology starts with choosing a procedure for identification, scope, evaluation, and level of indicators to build the initial dataset. As a second step, the methodology produces a reduced set of indicators, complying with all criteria imposed, and selected under the condition of extracting the maximum amount of information from the initial dataset.

Moldan, Janouskova and Hak analyze the different approaches and types of indicators developed and used for the assessment of environmental sustainability [17]. One important aspect they describe is setting targets and then "measuring" the distance to a target to get the appropriate information on the current state or trend.

Idoro discusses a study of the level of mechanization and its sustainability, the relationship between mechanization and its sustainability, and between mechanization and project outcome [18]. To achieve these, a field survey involving a sample of eighty projects was conducted with the aid of questionnaires. Data were collected on the production methods adopted in excavation and concreting, whether or not the use of plant for the operations was sustainable, and the initial and actual delivery time and cost of the projects sampled. The study concludes that there is a strong need for measures that will improve the level of mechanization and its sustainability in the industry, and recommends the introduction of a plant mobilization fund by clients, incentives on importation of construction plant, and an effective and functional lease market for construction plant as some of the measures that will improve mechanized construction.

Chari and David discuss the negative relationship between pro-market reforms and the sustainability of superior profits in an emerging economy [19]. The decline in sustainability of superior profits shows that pro-market reforms bring significant threats in addition to the various opportunities, such as greater availability of production factors and greater freedom to enter and operate businesses, highlighted in the extant literature. Their study contributes to a more complete conceptual understanding of the performance consequences of pro-market reforms in emerging economies. They show that investment in research and development and greater investments in marketing and advertising are firm-level resources that provide a measure of protection against the erosion in sustainability of superior profits associated with pro-market reforms.

Fox, Hundley, Cowan, Tabasi and Goodman discuss how sustainability can be taught and how sustainability might be integrated into the curriculum from three perspectives: course, program and degree [20]. At the course level, they discuss examples of how to integrate the concepts and applications of sustainability into existing material. Program-level considerations for teaching sustainability are also discussed, as are the current demand for sustainability knowledge in the workplace and how that might lead to a sustainable degree.

2.2 System-of-Systems Literature

Searcy describes the implications of applying a System of Systems (SoS) Engineering perspective to corporate sustainability performance measurement [21]. He highlights that measuring corporate sustainability is a complex problem characterized by pluralistic goals, ambiguity, emergence, uncertainty and context dominance. He describes the distinctions between traditional systems engineering and system of systems engineering and briefly reviews the characteristics of system of systems engineering problems. He also discusses the implications of a SoS engineering perspective for corporate sustainability performance measurement.

Yuutaka, Kobashi and Nakamima describe a human health management system scheme and its practical applications [22]. They focus on health management, medical diagnosis, and surgical support system of systems engineering (SoSE). The application domains they describe are broad and essential in health management and clinical practice. They describe a system of systems (SoS) in human health management and in medical diagnostic imaging. They also describe a SoS for a medical ultrasonic surgery support device.

Karcanias and Hessami attempt to formalize the loose concept of "System of Systems" (SoS) within the context of Systems Theory and develop a conceptual framework for emergence that is suitable for further development [23]. They view the notion of SoS as an evolution of the standard notion of systems and provide an abstract and generic definition that is detached from any particular domain.

The Systems Engineering Guide for Systems of Systems (Version 1.0) provides today's systems engineering practitioners with well-grounded, practical guidance on what to expect as they work in today's increasingly complex systems environment and tackle the challenges of systems of systems [24]. This guide is a step in supporting the systems engineering community in adapting systems engineering processes to address the changing nature of today's world, increasingly characterized by networked systems and systems of systems.

Dahmann and Baldwin discuss how the U.S. Department of Defense (DoD) needs to manage and engineer complex Systems of Systems (SoS) [25]. It is critical that systems engineers are sufficiently trained to face the challenges in applying systems engineering processes to support SoS, particularly in situations where the systems retain their independence. They discuss the situation in SoS in the context of existing SoS frameworks and discuss the current DoD approach to SoS and the challenges systems engineers will face at both the SoS and system levels. They suggest areas for further investigation to address key issues as systems engineering takes up the challenge of these changes in the interdependent networked environment of the future battle space.

2.3 Balanced Scorecard Framework Literature

Amado, Santos and Marques describe the development of a conceptual framework which aims to assess Decision Making Units (DMUs) from multiple perspectives [26]. Their proposed framework combines the Balanced Scorecard (BSC) method with the non-parametric technique known as Data Envelopment Analysis (DEA) by using various interconnected methods which try to encapsulate four perspectives of performance (financial, customers, internal processes, learning and growth). By integrating the BSC with DEA, they identify where there is room for improving organizational performance and point out opportunities for reciprocal learning between DMUs.
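Because DEA recurs throughout this dissertation, a brief sketch of the standard input-oriented CCR DEA model may make the technique concrete. This is the textbook linear program rather than Amado et al.'s interconnected BSC-DEA framework, and the four decision making units with their staff/budget/graduates data are made-up numbers for illustration only.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X: np.ndarray, Y: np.ndarray, o: int) -> float:
    """Input-oriented CCR efficiency of DMU o.

    X is (m inputs x n DMUs), Y is (s outputs x n DMUs).
    Solves: min theta
            s.t. sum_j lam_j * x_ij <= theta * x_io  for every input i
                 sum_j lam_j * y_rj >= y_ro          for every output r
                 lam_j >= 0
    with decision vector z = [theta, lam_1, ..., lam_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    A_in = np.hstack([-X[:, [o]], X])           # lam.X - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -lam.Y <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return float(res.x[0])

# Made-up data: 4 DMUs, inputs = (staff, budget), output = graduates.
X = np.array([[20.0, 30.0, 40.0, 20.0],
              [300.0, 200.0, 100.0, 200.0]])
Y = np.array([[100.0, 120.0, 80.0, 60.0]])
for j in range(Y.shape[1]):
    print(f"DMU {j}: CCR efficiency = {ccr_efficiency(X, Y, j):.3f}")
```

A DMU scores 1.0 only if no convex combination of the observed DMUs can match its outputs using proportionally smaller inputs; scores below 1.0 indicate how far inputs could, in principle, be scaled down.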

2.4 Current performance measures collected by Government Organizations, For-profit companies, non-profit companies and educational institusions literature Patrick and French discuss present research indicating that the development and use of performance measures to hold educators accountable and improve performance is limited by organized employee groups and enhanced by minority student populations

16 Texas Tech University, Jon Ilseng, December 2015

[27]. They describe how proposed significant increases in student performances as a result of the No Child Left Behind Act are not evidenced Baruch & Ramalho describe the way academic scholarly works measure organizational outcomes, commonly reported as either organizational effectiveness or organizational performance (OEP) [28]. They describe analysis of 149 scholarly publications published in the past decade focusing on business organizations, nonprofit organizations and mix of both. They describe a multivariate analysis of different configurations of criteria for business and not-for-profit research. Berument, Dincer and Mustafaoglu examine the relationship for the country of Turkey between growth and growth volatility for a small open economy with high growth volatility [29]. Quarterly data for the period from 1987Q1 to 2007Q3 suggests that growth volatility reduces growth and that this result is robust under different specifications. They contribute to the literature by focusing on how growth volatility affects a set of variables that are crucial for growth. Empirical evidence from Turkey suggests that higher growth volatility reduces total factor productivity, investment, and the foreign currency value of local currency (depreciation). Moreover, it increases employment, though the evidence for this is not statistically significant. Sackett discusses how Engineering Productivity cannot be measured objectively and how subjective estimates are constantly being made [30]. He discusses how the shift from the measurement of productivity to engineering effectiveness occurs. Russell and Allwood discuss the comparison of the life-cycle environmental impacts of changed production structures for two consumer goods (high-density polyethylene (HDPE) shopping bags and beds) in Jamaica [31]. They discuss three alternative production structures for each products and how each reflected an increase in Jamaica. Assaf and Josiassen study measures and compare the efficiency and productivity of European and U.S. airlines, from the 2001 to 2008 period [32]. They measure efficiency by estimating a Bayesian distance frontier model subject to regularity constraints. Productivity estimates are also derived parametrically, based on the

17 Texas Tech University, Jon Ilseng, December 2015 estimates of the distance frontier model. A comparison based on the type of airlines indicates that low-cost airlines are on average more productive and efficient than full- service airlines. The decomposition of productivity growth and related market discussions are also provided Collins discusses the concept of collective intelligence - the idea that groups of individuals possess intelligence not directly related to the cognitive ability of the groups' individual members— and how it has served as a critical factor in understanding teams’ productivity [33]. Soriano and Castrogiovanni investigate the effects of entrepreneurial human capital on SME performance using data on 2,713 SMEs within the European Union [34]. They describe how performance was measured in two ways: profitability as Return on Assets (ROA) and productivity as revenue per employee. Their results indicate that both profitability and productivity are positively related to industry-specific knowledge possessed by the CEO-owner prior to starting up the firm and the general business knowledge acquired once the firm is up and running. They believe there is a link between performance and inclusion of other CEO-owners in the founder’s inner circle of advisors. Boehm discusses how the Japanese truck manufacturer Mitsubishi Fuso faced the need to reduce its R&D costs by 30 percent [35]. Rather than adopt a simple cost-cutting approach, the company used the situation as an opportunity to apply lean principles in order to transform the way its R&D activities were run, organized, and managed. Boehm describes the principles, elements, and results of this R&D restructuring effort Gazley uses the context of local government–nonprofit partnerships to test the potential influence of various partnership and organizational factors on collaborative outcomes, using two contrasting outcome measures [36]. She finds that formal contracts and prior experience working with nonprofits and volunteers can increase at least a public manager’s perception of success, but the strongest association to real performance improvement comes from the intensity of shared goals and the level of investment in the partnership. She suggests the value in comparing multiple performance measures and

18 Texas Tech University, Jon Ilseng, December 2015 also reinforces an understanding of the experiential, interpersonal factors that support collaborative success. Steinberger and Krausmann discuss resource productivity, measured as GDP output/resource input, and how it is a widespread sustainability indicator that combines economic and environmental information [37]. They demonstrate that different types of materials and energy exhibit fundamentally different behaviors, depending on their international income elasticities of consumption. They question the interpretation of resource productivity as a sustainability indicator. Serikh, Yakimova and Palchun describe how the quality of educational services and processes of their granting cannot be improved without identification and measurement of their characteristics [38]. They describe the concept “quality control” that includes planning and improvement of quality. They also describe the program of continuous improvement. Herron and Braiden describe a model which has been developed to direct and generate productivity improvement in a group of manufacturing companies [39]. The companies are of all sizes including Small and Medium Enterprises (SMEs) and form a cross-section of industries and abilities with regard to manufacturing. There is a wide range of manufacturing efficiency improvement methods available to the companies, such as Just in Time (JIT), or a range of lean manufacturing tools. They describe the approach and the results obtained from 15 companies. Vanclay and Bommann describe the Excellence for Research in Australia (ERA) and its attempt by the Australian Research Council to rate Australian universities on a 5- point scale within 180 Fields of Research using metrics and peer evaluation by an evaluation committee [40]. They describe the bibliometric data and how it is extensive. They also describe the indirect H2 index, how it is objective and can be computed automatically and efficiently. Brown and Gobeli review the major measurement issues connected with Research & Development (R&D) productivity and represent the results of a case study to develop an R & D productivity measurement system [41]. They describe the process of designing

19 Texas Tech University, Jon Ilseng, December 2015 such a system for a high-tech, product-development organization, and also provide a reduced list of ten R&D productivity indicators for ongoing monitoring purposes. Zhao describes the comparison between local government and instrumental rationality on their connotations and practice [42]. He compares the fundamental orientations of public policy performance and explains the four prime value factors (equity, service, responsibility and good governance) of policy performance evaluation of local government. Maudgalya, Genaidy and Shell describe that the objective of this study is to determine whether workplace safety as a business objective adds value to the business bottom line [43]. Their research reviews published case studies to determine if there is a relationship between safety initiatives and increased productivity, quality, and cost efficiencies. Eighteen case studies (17 published by the National Safety Council) were analyzed using the Workplace Safety Intervention Appraisal Instrument. The appraisal scores ranged from 0.55 to 1.27, with an average of 0.91. The case studies were relatively strong in the Evidence Reporting and Data Analysis categories, as compared to the Subject Selection, Observation Quality. Ryynanen, Sirvio, Tanninen and Lindell discuss the suitability of digital printing in short runs in packaging industry [44]. Their study calculates the break-even point to be in the range of 4000-8000 packages, depending on the package and sheet sizes. Phusavat and Photaranon address two key problems facing the production department at the government pharmaceutical organization (GPO) [45]. The two key problems were: 1) a lack of productivity and performance measurement at the operational level, and 2) a need to assess the functional readiness to undertake its own performance-analysis work. Their study generated insightful information into operational performance at the GPO. It was clear that productivity represented the basis for formulating future policy initiatives. Lee, Park and Kim evaluate the performance of natural gas transportation utilities, focusing on the three-key strategic performance measures of profit, productivity, and price differential [46]. They propose a methodology that expresses the three

20 Texas Tech University, Jon Ilseng, December 2015 performance measures in a unified single equation using the Edgeworth index. The proposed methodology is applied to an international comparison of the performance of 28 natural gas transportation utilities operating in eight countries in which the business environments difer greatly. Their empirical results show the possible causes of performance differences and may shed some light on the direction of regulatory policy, especially for developing countries that have relatively short histories in the natural gas industry, such as Korea. Pearson, Nixon and Keersens-Van focus on the future trends in R&D management for the way performance is [47]. They discuss findings, results and how management can identify these future trends. Lahiri and Kumar measure and rank the productivity of academic institutions and faculty members based on the number of publications appearing in the top three core international business journals between 2001 and 2009 [48]. Their research serves as a useful update and extension of studies by Morrison and Inkpen (1991), Inkpen and Beamish (1994), and Kumar and Kundu (2004), which examined the top three international business journals, namely, Management International Review, Journal of International Business Studies, and Journal of World Business. Isola, Siyanbola and Ilori discuss critical input and output factors of research and development (R&D) in higher education institutions in Nigeria [49]. They show quantitative analyses of researchers’ productivity using partial productivity approach and an assessment of factors influencing research productivity. Based on a recent research effort, researchers were sampled from 12 leading Universities. They were randomly selected from the fields of Agriculture, Science and Technology/Engineering. Findings from the results reveal that productivity of the researchers is of medium category. Also, input factors such as: qualifications of researchers, years of experience, research collaborations and time spent on research; significantly contribute to research productivity. Mahto, Davis, Pearch II and Robinson, Jr. discuss how business goals in family businesses are often subsumed by family goals [50]. As a result, reference performance

for each family business is different. This makes the popular financial performance measure of publicly traded companies, profit maximization, insufficient for family businesses. The authors believe that for evaluating family businesses, the family members' satisfaction with firm performance is a better measure of performance. In their study, the authors identify three predictors of family members' satisfaction with firm performance and test the proposed linkages on two samples of family businesses. Baum and Wally discuss a 4-year study that examines the effect of strategic decision speed upon subsequent firm performance and identifies environmental and organizational characteristics that relate to decision speed [51]. They draw upon strategic decision-making theory and organization theory to propose that strategic decision speed mediates the relation between environmental and organizational characteristics and performance. Measures of business environment, organization structure, strategic decision speed, and firm performance (growth and profitability) were collected from 318 CEOs from 1996 to 2000. Structural equation modeling confirmed that fast strategic decision-making predicts subsequent firm growth and profit and mediates the relation of dynamism, munificence, centralization, and formalization with firm performance. Davis-Mendelow describes how the global aviation industry is changing and how the industry will be driven by, and judged on, its environmental record [52]. The international aviation community, through ICAO/CAEP, has made significant progress addressing the most visible environmental issues in aviation: airframe and engine noise and air emissions (NOx, CO2). This initiative anticipates the next wave of environmental concerns in aviation, the role and contribution of individual aerospace manufacturing processes towards sustainability. Fisher and Schoenberger examine the inevitable interconnection between engineering, economics and environmental impact in the electricity industry, and how they shape the roles engineers can and should play in designing the energy systems of the future [53]. Involvement of engineers needs to happen in two ways: engineers can bring a technological awareness to the policy debate, and engineers need to be able to identify and re-think their own assumptions about engineering problems, particularly during times

of economic and regulatory change. In other words, engineers must help others re-think their assumptions, and engineers must allow others to help them re-think their own. Asgarzadeh, Koga, Yoshizawa, Munakata and Hirate present years of Japanese research on the feeling of oppression in urban environments to the international scientific community [54]. They compare the magnitude and quality of trees' effect against other physical factors in the city environment. Two experiments were conducted, one in the real Tokyo urban environment, a mega city, and the other utilizing 3-dimensional computer software to simulate the real urban environment in an experiment room. In total, 60 participants from the field of architecture looked at specific images and responded by filling in a pre-designed questionnaire. Results indicate that oppression, which increases as a building's solid angle increases, is significantly influenced by the existence of trees and the sky factor. Althin, Behrenz, Fare, Grosskopf and Mellander measure how well Swedish employment offices perform in delivering the services required of them by the Swedish government [55]. In contrast to earlier studies, they use a dynamic efficiency framework, which better models placements of an intermediate nature across periods. The empirical results demonstrate an increase in the offices' expected workloads over time and point to substantial differences in performance across offices. The results also point toward more than optimal placements in intermediate outputs such as non-matching jobs, training, and continued unemployment. McGrath and Romeri discuss the R&D Effectiveness Index and alternative ways to calculate it [56]. They believe that the R&D Effectiveness Index can be used to compare performance, measure improvement and evaluate business units. Sarkees discusses the capability of firms to sense and respond to changes in technologies, called technological opportunism [57]. He explores the links between technological opportunism and firm performance. The results show that technological opportunism has a strong positive impact on key measures of performance such as firm sales, profits and market value. Importantly, marketing emphasis is the mechanism through which the technological opportunism-performance relationship is achieved.


Finally, the impact of marketing emphasis on Business to Business (B2B) firms is different than that for Business to Consumer (B2C) firms, highlighting the importance of these activities for B2B marketing managers. Maudgalya, Genaidy and Shell describe case studies to determine whether workplace safety as a business objective adds value to the business bottom line [58]. They reviewed case studies to determine if there is a relationship between safety initiatives and increased productivity, quality, and cost efficiencies. A total of eighteen case studies were analyzed using the Workplace Safety Intervention Appraisal Instrument. Romzek and Johnston examine the effectiveness of contract accountability in social service contracts [59]. Their analysis is based on five case studies of Kansas contracts for selected welfare, Medicaid, and foster care and adoption services. Their results indicate the state of Kansas has achieved moderate to high levels of accountability effectiveness, especially in terms of specifying social service contracts and selecting appropriate accountability strategies. Their conclusions contradict the conventional wisdom, theory and existing research on contracting.

2.5 Evidential Reasoning Literature
The process of assessing schools is not an easy task as it involves many attributes, as discussed by Borhan and Jemain [60]. They propose an innovative approach called Evidential Reasoning (ER) that can be used to assess school performance in a multilevel or hierarchical setting, measuring quality indirectly through standardized examination results rather than directly measuring the quality of the processes unfolding within the schools. The approach differs from most conventional decision-making modeling methods in that it employs a belief structure to represent an assessment as a distribution. They conclude that the resulting school ranking shows little similarity to the ranking produced by the normal practice currently adopted. Wang, Xie, Chin and Fu propose an accident analysis model to develop cost-efficient safety measures for preventing accidents using the Bayesian Network and


Evidential Reasoning (ER) approach [61]. The ER approach provides a procedure for aggregating calculations which can preserve the original features of multiple attributes with various types of information. ER provides a solution for processing subjective risk assessments, possibly with bias resulting from the various opinions of different individuals. They discuss an ER-based cost-benefit analysis method that considers risk reduction. Xu discusses the Evidential Reasoning (ER) approach and how it is used to analyze multiple criteria decision problems under various types of uncertainty using a unified framework [62]. He describes how the ER approach is surveyed from two aspects: 1) theoretical development and 2) applications. He then outlines the ER approach with a focus on the links among its various developments. Jian, Li, Zhou, Xu and Chen discuss Weapon System Capability Assessment (WSCA), how it is the initial point of quantifying capabilities in military capability planning, and how Evidential Reasoning (ER) was used to handle various types of uncertainty such as ignorance and subjectiveness [63]. The ER approach is used to aggregate the capability measurement information from the sub-capability criteria to the top-capability criterion. They present results using the ER approach.

2.6 Data Envelopment Analysis Literature
Chien-Ming Chen and Magali Delmas discuss how corporate social performance (CSP) metrics pose a major challenge to researchers and practitioners [64]. They provide a critical evaluation of current aggregation approaches and propose a new methodology based on data envelopment analysis (DEA) to compute a corporate social performance index. Using CSP data from 2,190 firms in three major industries, the study presents the first application of the DEA model for CSP, and its handling of ordinal data opens up a new path for future empirical CSP research. Cheng, Zervopoulos and Qian discuss Data Envelopment Analysis (DEA), a linear programming methodology, to evaluate the relative technical efficiency of each member of a set of peer decision making units (DMUs) with multiple inputs and multiple

outputs [65]. DEA is widely used to measure performance in many areas. However, the traditional DEA model cannot deal with negative input or output values. They develop a variant of the traditional radial model whereby original values are replaced with absolute values as the basis to quantify the proportion of improvements needed to reach the frontier. The new radial measure is units invariant, can deal with all cases of negative data, preserves the property of proportionate improvement of a traditional radial model, and provides exactly the same results in the cases the traditional radial model can handle. They explain the advantages of the new approach. Kuah and Wong discuss how the efficiency of universities is vital for effective allocation and utilization of educational resources. They discuss a Data Envelopment Analysis (DEA) model for jointly evaluating the relative teaching and research efficiencies of universities [66]. The inputs and outputs for university performance measurement are identified and comprise a total of 16 measures which are believed to be essential. A variant of DEA, joint DEA maximization, was used to model and evaluate these measures. The model was tested using a hypothetical example, and its use and implications in university performance measurement are described. The application of DEA enables academics to identify deficient activities in their universities and take appropriate actions for improvement. Sutton and Dimitrov discuss how a variation of Data Envelopment Analysis (DEA), the Generalized Symmetric Weight Assignment Technique, is used to assign sailors to jobs for the U.S. Navy [67]. This method differs from others in that the assignment is a multi-objective problem where the importance of each objective, called a metric, is determined by the decision-maker and promoted within the assignment problem. They explore how the method performs as the importance of a particular metric increases. They conclude that the proposed method leads to substantial cost savings for the U.S. Navy without degrading the resulting assignments' performance on other metrics.


Branda discusses new efficiency tests which are based on traditional Data Envelopment Analysis (DEA) models and take into account portfolio diversification [68]. He identifies the investment opportunities that perform well without specifying a bias toward risk. He uses general deviation measures as the inputs and return measures as the outputs. He concludes with test results on the efficiency of 25 world financial indices using DEA models with CVaR deviation measures. Ho and Liu discuss cross-border diffusion and how Data Envelopment Analysis (DEA) is used to analyze knowledge dissemination [69]. By collecting theoretical and application papers in DEA methodology from the Web of Science data set, they analyze the academic network of 610 researchers and identify author locations, research disciplines and their mutual linkages to explain the importance of personal specific characteristics in cross-border diffusion. Regression models and network analysis show the advantages of personal research seniority and cross-disciplinary coordinating capabilities for researchers to diffuse knowledge from one region to another. Paradi and Zhu survey 80 published DEA applications in 24 countries/areas that specifically focus on bank branches [70]. They discuss key issues related to the design of DEA models and how to design future experiments and studies in this domain. The survey found that methodological improvements account for the majority of DEA studies on bank branches. They believe that the basic DEA models as well as their many extensions will likely play a more important role in bank branch studies in the future. Samoilenko and Osei-Bryson describe a mechanism that allows for discovering appropriate productivity models for improving overall performance and a feedback-type mechanism that allows the evaluation of multiple productivity models in order to select the most suitable one [71]. They focus on organizations that consider the states of their internal and external organizational environment in the formulation of their strategies. They propose and test a DEA-centric Decision Support System that aims to assess and manage the relative performance of such organizations. Amirteimoori and Kordrostami extend prior research on DEA-based production planning in a centralized decision-making environment [72]. In such an environment, the

production planning problem involves the participation of all individual units, each contributing in part to the total production. Their approach takes the size of operational units into consideration, and the production level for each unit becomes proportional to the ability of the unit. The applicability of their proposed approach in real applications is described using two real cases. Lee and Saen explain the measurement of corporate sustainability management by introducing a DEA technique and an empirical case from the Korean electronics industry [73]. They employ cross-efficiency in the presence of dual-role factors by reflecting a complexity of a real-world case, donations for tax benefits. They propose a new model to measure corporate sustainability management by employing the combined approach of cross-efficiency and dual-role factors. The new model and study findings contribute to the body of knowledge in corporate sustainability management and its performance measurement. Chen, Zhu, Yu and Noori propose a novel use of two-stage network data envelopment analysis (DEA) to evaluate sustainable product design performances [74]. They conceptualize "design efficiency" as a key measurement of design performance in terms of how well multiple product specifications and attributes are combined in a product design that leads to lower environmental impacts or better environmental performance. They develop a new research framework for evaluating sustainable design performances and propose an innovative application of the two-stage network DEA for finding the most eco-efficient way to achieve better environmental performance through product design. Farris, Groesbeck, Van Aken and Letens present a case study of how Data Envelopment Analysis (DEA) was applied to generate objective cross-project comparisons of project duration within an engineering department of the Belgian Armed Forces [75]. For this case study, they demonstrate how DEA fills a gap not addressed by commonly applied project evaluation methods (i.e. earned value management) by allowing the objective comparison of projects on actual measures, such as duration and cost. They describe how DEA allowed the department to gain new insight about the

impact of changes to its engineering design process, creating a performance index that simultaneously considers project duration and the key input variables that determine project duration. They conclude with guidance on applying DEA as a project evaluation tool for project managers, program office managers, and other decision-makers in project-based organizations. Blidisel describes how the Data Envelopment Analysis method was used to measure efficiency for Romanian higher educational institutions [76]. Data Envelopment Analysis was used to assess the efficiency of 40 universities in three main categories: 1) Level 1 - universities focused on education, 2) Level 2 - universities focused on advanced research and education, and 3) Level 3 - universities focused on education and scientific research, specifically, education and artistic creation universities. She concludes that all Level 1 universities have the highest Data Envelopment Analysis efficiency value. Also, the bigger the classification index of a public university, the higher the Data Envelopment Analysis efficiency.

2.7 Current Performance Measurement Systems Literature
Zaum, Olbrich and Barke describe an approach to automated data extraction developed in cooperation with industry partners [77]. They describe concepts based on the evaluation of a large collection of logfile data generated by a state-of-the-art workflow in the semiconductor industry and on staff feedback. They also describe the experiences gathered in the process of implementing their own approach. Norcross describes how hard-earned improvements quickly slip away if appropriate measures are not put in place to monitor performance [78]. She describes how, with a well-designed, effectively operated performance system, companies can ensure that improvements achieved during the initial phase of an operational transformation are maintained and enhanced over the long term. Zeydan, Colpan and Taylan describe the process of how to remain competitive in the market [79]. They describe decision makers who regularly measure and evaluate the process performance of organizations. They discuss the performance of a manufacturing

company pertaining to 10 manufacturing job shops and how the evaluation was carried out using a new Multiple Criteria Decision Making (MCDM) methodology. The MCDM methodology has three stages established to develop a complete and accurate evaluation model. Mola and Parsaei identify the critical dimensions of performance and a set of measures that reflect the performance an organization is trying to achieve as a first step in designing an effective performance measurement system [80]. They propose a framework to select the appropriate measures. Amirkhanyan evaluates the effect of different performance measurement practices on accountability effectiveness in government contracts [81]. She describes findings suggesting that performance measurement has a positive impact on government's ability to effectively manage contracts. The study results provide motivation for contract managers to optimize performance monitoring and reduce transaction costs by relying on measures that are more likely to improve contract implementation. Wang attempts to establish an index system to measure organizational performance from four dimensions: operation performance, customer satisfaction degree, financial performance and human resource performance [82]. He describes the methods of membership degree analysis and correlation analysis used to systematically construct the index system. He states that most indicators will reflect the main characteristics and measurement requirements of the measured targets. Likierman discusses the following five traps of performance measurement: 1) Measuring against yourself, 2) Looking backward, 3) Putting your faith in numbers, 4) Gaming your metrics, and 5) Sticking to your numbers too long [83]. He concludes that a really good assessment system must bring finance and line managers into some kind of meaningful dialogue that allows the company to benefit from both the relative independence of the finance managers and the expertise of the line managers. Wu, Tsai, Shih and Fu discuss a new Government Performance Evaluating (GPE) procedure using a balanced scorecard structure integrated with a fuzzy linguistic scale

[84]. Their GPE procedure contributes the following: (1) integration of financial, citizen

service and internal work processes as well as learning and growth perspectives in the evaluation procedure; (2) use of a fuzzy linguistic scale to convert the subjective cognition of managers into an information entity and (3) confirmation of improvement. Zhaoji and Lifeng describe performance evaluation from two aspects (positive and negative) in a macroscopic environment [85]. They describe the system dynamics method to explain the mechanism of these two kinds of effects on China's public policy implementation. They also describe how computer simulation constructs the local government performance evaluation of public policy implementation as a stock and flow model of the overall system dynamics. Mola and Parsaei describe performance measurement and how numerous integrated performance measurement systems have been proposed in the field [86]. Performance measurement systems are designed to help organizations identify a set of performance measures that appropriately reflect their objectives. They review some of the measurement systems and identify the criteria that they reveal. Przekop and Kerr discuss how life cycle tools can be used to design and continually improve future products as well as provide bottom-line cost savings [87]. They describe how to increase the recycled content of products and reduce or eliminate the hazardous substance content in products. They also describe data collection and management procedures and tools that have been developed and implemented to meet global market demands. They believe that companies that are proactive in implementing data collection and management procedures throughout their corporation are better positioned to meet end-of-life reporting requirements. Coccia describes two general models to assess the R&D performance of a public research lab [88]. These models can provide indications about the performance, and hence productivity, of research labs. Model II, an evolution of Model I, is successfully applied to 200 public research institutes belonging to the Italian National Research Council. These functions are tools for appropriate decisions and actions to

improve research performance, especially through more effective use of existing resources and by reducing X-inefficiency. Kaisheng, Zeng and Xiaohui describe the problem management faces with regard to developing performance measurement systems for electronic business or e-business [89]. They describe the gap in linking performance measures with the uncertainty of e-business and present the results of an effort to redress this gap. They shed some light on the development of performance measurement for e-business by proposing that, in the highly uncertain e-business environment, selective nonfinancial performance measures under the balanced scorecard (BSC) framework are most useful in improving organizational effectiveness. Lin and Ma describe the Analytic Hierarchy Process (AHP) and how it is used to evaluate performance in public decision-making [90]. Through analysis of the operating process of the AHP approach, problems such as the difficulty of presenting systematic evaluation information, quantifying performance indexes and realizing effective comprehensive evaluation can be solved to a certain extent. Their research results show that the hierarchical structure model can systematically present evaluation information, the single-level ranking can facilitate quantifying criteria and indexes, and the composite ranking can realize comprehensive evaluation. In short, the application of AHP will make public decision-making more scientific. Cheng and Lin describe the current urban redevelopment problems in Taiwan's old urban centers [91]. Based on an indicator system framework, their study proposes a regeneration model, a performance evaluation method for a livable urban district, composed of four constructs, 21 general indicators and 62 policy-making indicators. The four constructs are: 1) land use sustainability, 2) transit-oriented development (TOD) pattern, 3) district composition, and 4) architectural typology and estate. To assess the framework's feasibility, they recruited a panel of experts to judge priorities for the indicator system in two phases of questionnaires. The first phase uses the fuzzy Delphi method (FDM) for experts to decide each policy-making indicator and the

degree of importance of that indicator. The second phase uses the analytic hierarchy process (AHP) to determine the priority of the four constructs and that of each general indicator. Since panel experts from various disciplines and regions evaluate this indicator system with priorities, the study discusses the variance in priorities across panel experts. This AHP framework serves as a guideline for local governments to proceed with regeneration plans toward livable districts, or a framework to evaluate the performance of regenerated old urban districts. Martinez-Lorente, Dewhurst and Gallego-Rodriguez discuss Total Quality Management (TQM) and consumer perception of it [92]. They examine the relationship between TQM, some marketing mix variables and measures of company performance. They discuss an empirical study of Spanish manufacturing companies; the results showed that the most important TQM dimensions are the system of employee relations and the use of quality management-related design tools. Puyun and Wei discuss the hidden costs when implementing E-government, including the costs of impersonal communication, loss of information, and network paralysis and network attacks [93]. They take into account the impact of training for citizens, re-shaping the structure of government and the convenience of administrative supervision. They construct an index system of E-government input-output efficiency to understand the entire effect of E-government. They conclude that it is biased to analyze the benefits of E-government while neglecting the specific economic and social environment. Murdoch, Fernandes and Astley discuss a method of developing performance measures and linking them to role-based process modeling and supporting infrastructure technology [94]. Their method addresses the development of measures in situations that initially lack sufficient theory to formalize measurement constructs. Subjective empirical observations are accepted, assigned to specified roles and used as a basis for the development of formal measures, based on practice. Proposals are expressed in the context of current measurement frameworks developed within the software and systems community.


Alladi and Vadar outline how to identify all stakeholders and get their concurrence systemically [95]. They propose the use of a holistic stakeholder matrix for identification of the right stakeholders at the right time. They believe successful stakeholder engagement based on systems principles ensures sustainability for both the project and the organization. Neely, Mills, Platts, Richards, Gregory, Bourne and Kennerley describe the development and testing of a structured methodology for the design of performance measurement systems [96]. Frameworks, such as the balanced scorecard and the performance prism, have been proposed, but until recently little attention has been devoted to the question of how these frameworks can be populated, i.e. how managers can decide specifically which measures to adopt. Following a wide-ranging review of the performance measurement literature, a framework identifying the desirable characteristics of a performance measurement system design process is developed. This framework provided guidelines which were subsequently used to inform the development of a process-based approach to performance measurement system design. The process was enhanced and refined during application in three action research projects involving major UK automotive and aerospace companies. The revised process was then formally documented and tested through six further industrial applications. Neely, Richards, Mills, Platts and Bourne discuss the design of performance measures that are appropriate for modern manufacturing firms and its importance to both academics and practitioners [97]. Existing systems tend to accept simplistic notions in evaluating the performance of manufacturing facilities, and the general tendency in many companies is to evaluate manufacturing primarily on the basis of cost and efficiency. They discuss developing a balanced scorecard and how specific measures of performance should be designed, noting that designing a performance measure involves much more than simply specifying a robust formula: the frequency of measurement and the source of data both have to be considered. Giannoccaro, Ludovico and Triantis discuss current developments in performance measures, consider the definitions used in performance measures, the current meaning of

terms and the development of the performance measurement framework [98]. They discuss how managers need an effective process for designing measures to meet their unique and evolving needs. They present a structured methodology for developing performance measures for a particular environment that can be used by any organization to define quality measures for control, assurance and improvement. Washio and Yamada propose a model called the rank-based measure (RBM) to evaluate Decision Making Units (DMUs) from a different standpoint [99]. They suggest a method to obtain a weight which gives the best ranking, and calculate a weight that balances maximizing the efficiency score against keeping the best ranking. They describe a numerical experiment to compare the rankings and scores with traditional evaluation. Papavramides, Aupee and Yttesen describe a different approach to estimating how project performance can positively or negatively affect organizational capacities and generate proportional organizational impact [100]. They present an approach according to which projects should also be evaluated for the positive or negative "organizational impact" they create for both the performing and the recipient organization(s). They describe a framework to measure the organizational impact created by a project in a case study.

The main objectives of this dissertation are as follows:
1. Propose Performance Measures that integrate the needs of the Dallas-Fort Worth, Texas SoS.
2. Propose Performance Measures that integrate the needs of for-profit organizations, government agencies and educational institutions.
3. Ensure the proposed Performance Measures meet current and future sustainability concerns.
4. Ensure the relationship between productivity/efficiency and effectiveness in the Dallas-Fort Worth, Texas SoS addresses current and future sustainability concerns and issues.


There are thirty-five literature references relating to Aim 1 "What are current performance measures in industry, government and educational institutions?", five relating to Aim 2 "Identify the relation of these systems from a System-of-Systems perspective and discuss how parasitic and symbiotic behaviors affect it" and fifty-two relating to Aim 3 "Identify and propose Performance Measures that integrate the needs of these various SoS with regards to sustainability". Regarding Aim 1, current performance measures in industry, government and educational institutions, the literature references covered the topic very well. They describe current performance measures such as: 1) Corporate Social Performance, 2) No Child Left Behind Act, 3) Evidential Reasoning to assess schools' performance, 4) Organizational Effectiveness, 5) Multiple Criteria Decision Making, 6) Accountability Effectiveness in Government Contracts, 7) Engineering Productivity, 8) Operation Performance, 9) Customer Satisfaction, 10) Financial Performance, 11) Human Resource Performance, 12) Efficiency and Productivity of European and U.S. airlines, 13) Collective Intelligence, 14) Profitability as Return on Assets, 15) Productivity as Revenue/employee, 16) Intensity of shared goals, 17) Level of investment in partnerships, 18) Resource Productivity, 19) Research & Development performance of public research labs, 20) Excellence for Research in Australia field of research metrics, 21) Research & Development Productivity, 22) Policy performance evaluation of local governments, 23) Productivity of Government Pharmaceutical Organizations, 24) Profit, productivity and price differential performance measures for natural gas transportation utilities, 25) Productivity of academic institutions, 26) E-government input-output efficiency, 27) Research & Development input and output factors for Nigerian higher education institutions, 28) Business goals in family businesses, 29) Research & Development Effectiveness Index, 30) Technological opportunism and firm performance, 31) Workplace safety as a business objective and 32) Contract accountability in social service contracts. There were only five literature references for Aim 2, signifying that there is a knowledge gap regarding the relation of these systems from a System-of-Systems

perspective and how parasitic and symbiotic behaviors affect them. Even though there were few literature references for Aim 2, they were very beneficial. One literature reference described how SoS Engineering can be applied to measure corporate sustainability performance. Another described how SoS Engineering can be applied to a Human Health Management System scheme and its practical applications. The literature reference describing the Systems Engineering Guide for Systems of Systems Version 1.0 is an excellent guide supporting the Systems Engineering community in adapting Systems Engineering processes for complex SoSs. The majority of the literature references were for Aim 3 "Identify and propose Performance Measures that integrate the needs of these various SoS with regards to sustainability". They focused on the following topics: 1) Sustainability science and how such initiatives were conducted in Japan, 2) Lean management, internationalization, sustainability and technology policy, 3) Long-term sustainability, 4) Education for sustainable development research, 5) Growth and growth volatility for the country of Turkey, 6) Sustainability design criteria, 7) Private Sustainable Governance, 8) Performance Measurement traps, 9) A new Government Performance Evaluating procedure, 10) Performance Measurement Systems, 11) Life cycle tools, 12) Quantitative definition of sustainability, 13) Productivity improvement for small and medium enterprises, 14) Sustainability indicators as a baseline, 15) Performance measurement systems for electronic businesses, 16) Environmental Sustainability, 17) Analytic Hierarchy Process and how it is used to evaluate performance in public decision-making, 18) Workplace safety as a business objective and whether it adds value to a business bottom line, 19) Performance evaluation method for a livable urban district, 20) Total Quality Management, 21) Relationship between mechanization and sustainability, 22) Negative relationship between pro-market reforms and the sustainability of profits in an emerging economy, 23) Sustainability course, program and degree, 24) Balanced Scorecard and Performance Prism performance measurement frameworks, 25) Data Envelopment Analysis, 26) Evidential Reasoning, 27) Rank-based measure model, 28) Data Envelopment Analysis-centric Decision Support System, 29) Data Envelopment


Analysis-based production planning, 30) Corporate sustainability management and 31) Sustainable product design performances. The literature references helped define the Research Design and Methods for each of the three Aims. The Research Design and Methods for the three Aims are described in Chapter 3.


CHAPTER 3
3 RESEARCH DESIGN AND METHODS

3.1 Introduction
The research design consisted of case studies, an on-line survey and data collection (primary and secondary data). The research methods for the three Aims are described in the following sections. For Aim 1, Data Envelopment Analysis (DEA) and Evidential Reasoning (ER) methodologies are used. For Aim 2, the U.S. Department of Defense (DoD) Systems Engineering Guide for System-of-Systems Engineering Version 1, August 2008 is used. For Aim 3, DEA and ER methodologies are used. The research design and methods for Aim 1 "What are current performance measures in industry, government and educational institutions?" include DEA and ER. DEA is a "mathematical programming method for evaluating firms' productive efficiency that has been used extensively in the operations research and management literature." The DEA methodology notes that "efficient firms are those that use minimal inputs to produce maximum outputs" [101]. ER focuses on the "evidential reasoning algorithm and is different from many conventional Multi Criteria Decision Making modeling methods in that it uses an evidence-based reasoning process to reach a conclusion" [102]. ER has many advantages in that it can handle: 1) a mixture of quantitative and qualitative information, 2) a mixture of deterministic and random information, 3) incomplete information and 4) a large number of attributes and alternatives. [103] The research design and methods for Aim 2 "Identify the relation of these systems from a System-of-Systems perspective and discuss how parasitic and symbiotic behaviors affect it" include the U.S. DoD Systems Engineering Guide for System-of-Systems Engineering Version 1, August 2008. The U.S. DoD SoS Engineering perspective defines four types of SoS: directed, acknowledged, collaborative and virtual. [104] Directed SoS are those in which the integrated system-of-systems is built and managed to fulfill specific purposes. It is centrally managed during long-term operation to continue to fulfill those purposes as well as any new ones the system owners might wish to address. The component systems maintain an ability to operate independently, but their normal

operational mode is subordinated to the centrally managed purpose. [105] Acknowledged SoS have recognized objectives, a designated manager, and resources for the SoS; however, the constituent systems retain their independent ownership, objectives, funding, and development and sustainment approaches. Changes in the systems are based on collaboration between the SoS and the system. [106] In a collaborative SoS, the component systems interact more or less voluntarily to fulfill agreed-upon central purposes. The Internet is a collaborative system: the Internet Engineering Task Force works out standards but has no power to enforce them, and the central players collectively decide how to provide or deny service, thereby providing some means of enforcing and maintaining standards. [107] Virtual SoS lack a central management authority and a centrally agreed-upon purpose for the SoS. Large-scale behavior emerges, and it may be desirable. [108] The research design and methods for Aim 3 "Identify and propose Performance Measures that integrate the needs of these various SoS with regards to sustainability" include a process-based approach embodying the desirable characteristics of a performance measurement system design process. [109] Specifically, performance measures must: 1) be explicit and understandable, 2) have a very clear calculation method, 3) have understandable data collection, 4) enable/facilitate benchmarking, 5) be ratio-based rather than absolute numbers where possible, 6) provide fast feedback and 7) stimulate continuous improvement. [110] Also, a structured approach that specifies what constitutes a performance measure is required. The framework is defined as follows: 1) performance measures should be derived from strategy, 2) performance measures should be relevant, 3) performance measures should relate to specific goals or targets, 4) performance measures should be consistent in that they maintain their significance as time passes, 5) performance measures should be reported in a simple consistent format, 6) performance measures should provide relevant information, 7) performance measures should be precise and 8) performance measures should be objective. [111]


3.2 Performance Measures (Aim 1)
3.2.1 Data Envelopment Analysis
Data Envelopment Analysis is a nonparametric method widely used in the evaluation of the performance of Decision Making Units (DMUs). It is used to empirically measure the productive efficiency of DMUs. DMUs can be business units (e.g. bank branches), government agencies, educational institutions, for-profit institutions, non-profit institutions and people (e.g. assessing student performance). When compared, the DMUs are assumed to operate homogeneously; in other words, they receive the same types of inputs and produce the same types of outputs, and these inputs/outputs represent a whole population. Table 3-1 shows a model for a bank branch.

Table 3-1: Data Envelopment Analysis Model

Some important assumptions for the DEA model are as follows:

- The efficiency of each DMU is the ratio (sum of weighted outputs / sum of weighted inputs), adjusted to be a number between 0 and 1. The fewer inputs consumed and the more outputs produced, the more efficient the DMU.
- In DEA, the weights are not known in advance and are not even the same among DMUs. This is a unique feature of DEA and provides the element of objectivity which is missing when arbitrary weights are applied.


- Those DMUs which have an efficiency of 1 define a frontier that envelops all the other DMU points. This allows the calculation of potential improvements for inefficient DMUs.

DEA is capable of handling multiple inputs and outputs which are expressed in different measurement units. This type of analysis cannot be done with classical statistical methods. Furthermore, because DEA is not a statistical method, one is not constrained in the type and the relations of the data used, as in regression techniques. Inputs and outputs can be anything, including qualitative measurements. In the bank branches example, the quality of service output could be such a measurement, based on the results of a customer survey.
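To make the envelopment idea concrete, the sketch below solves the standard input-oriented CCR DEA linear program with SciPy. This is a minimal illustration, not the dissertation's computational procedure; the bank-branch inputs and outputs (staff, operating cost, transactions, loans) and all numbers are invented for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.

    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs.
    Solves: min theta s.t. X @ lam <= theta * X[:, o], Y @ lam >= Y[:, o],
    lam >= 0. Returns theta in (0, 1]; theta = 1 puts DMU o on the frontier.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    A_in = np.hstack([-X[:, [o]], X])            # X lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y lam <= -y_o
    b = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Three hypothetical bank branches (columns); numbers are illustrative only.
X = np.array([[8.0, 12.0, 10.0],        # staff
              [50.0, 90.0, 60.0]])      # operating cost
Y = np.array([[300.0, 420.0, 310.0],    # transactions
              [20.0, 30.0, 25.0]])      # loans processed
for j in range(X.shape[1]):
    print(f"branch {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```

Because the linear program implicitly chooses the weights most favorably for each DMU, no arbitrary weighting is imposed, which matches the objectivity property noted above.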

For current performance measures, DEA has been used to measure corporate sustainability management (CSM). CSM measures are shown in Table 3-2.

Table 3-2: CSM performance dimensions and measures [112]

Economic transparency and profitability
- Corporate governance transparency: number of board meetings; personnel costs/expenses of communication and relevant meetings
- Corporate transparency and accountability: material costs (design and printing costs of communication materials, e.g. annual sustainability reports, financial reports, etc.); personnel/administrative costs

Social responsibility
- Human rights: number of employee training hours for corporate social responsibility; expenses to train and promote corporate social responsibility
- Social contribution: number of social events with local communities; amounts of donations; volunteering hours/personnel costs

Environmental sustainability
- Environmental management: number of green technology development and innovation projects; expense of environmental management; costs of environmental product innovation

DEA has also been used for the efficiency assessment of public universities, specifically, to evaluate teaching and research efficiencies. Table 3-3 shows DEA inputs and outputs for measuring public university teaching efficiency.

Table 3-3: Inputs and outputs of public university teaching efficiency [113]
Inputs: number of academic staff; number of undergraduate and postgraduate taught-course students; average students' qualifications; university expenditures
Outputs: number of undergraduate and postgraduate taught-course students; average graduates' results; graduation rate; graduates' employment rate


Table 3-4 shows DEA inputs and outputs for measuring public university research efficiency.

Table 3-4: Inputs and outputs of public university research efficiency [114]
Inputs: university expenditures; number of research staff; average research staff qualifications; number of research students; research grants
Outputs: number of graduates from research; number of publications; number of awards; number of intellectual properties
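To make these measures concrete, the CCR sketch from Section 3.2.1 can be applied to universities in exactly the same way as to bank branches. The four universities and every figure below are invented purely for illustration (the snippet reuses the ccr_efficiency function defined in the earlier sketch).

```python
import numpy as np

# Rows follow Table 3-4. Columns are four hypothetical universities;
# every number below is invented for illustration only.
X = np.array([[95.0, 120.0, 80.0, 140.0],    # university expenditures
              [210.0, 260.0, 150.0, 300.0],  # research staff
              [3.1, 3.4, 2.9, 3.6],          # average staff qualifications
              [400.0, 520.0, 310.0, 600.0],  # research students
              [12.0, 18.0, 9.0, 22.0]])      # research grants
Y = np.array([[150.0, 210.0, 100.0, 240.0],  # graduates from research
              [480.0, 700.0, 300.0, 820.0],  # publications
              [6.0, 11.0, 3.0, 12.0],        # awards
              [4.0, 9.0, 2.0, 10.0]])        # intellectual properties
for j in range(X.shape[1]):
    print(f"university {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```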

3.2.2 Evidential Reasoning
Evidential Reasoning (ER) is currently used to assess the performance of educational institutions. Table 3-5 shows how ER was used to assess three schools' performance in three subjects: Mathematics, Science and English. Each subject is of equal importance, and the performance in each subject is categorized in three levels: Worst (W), Average (A), and Excellent (E).

Table 3-5: ER results for three schools [115]
School | Mathematics (W, A, E) | Science (W, A, E) | English (W, A, E)
A      | 0.25, 0.68, 0.07      | 0.45, 0.50, 0.05  | 0.10, 0.32, 0.58
B      | 0.38, 0.44, 0.18      | 0.09, 0.65, 0.26  | 0.16, 0.51, 0.33
C      | 0.19, 0.51, 0.30      | 0.24, 0.59, 0.17  | 0.21, 0.47, 0.32

Attribute probabilities were assessed for the ER results and are shown in Table 3-6.


Table 3-6: Attribute probabilities for ER results [116]
School | Mathematics (W, A, E) | Science (W, A, E)   | English (W, A, E)
A      | 0.083, 0.226, 0.023   | 0.149, 0.166, 0.016 | 0.033, 0.106, 0.193
B      | 0.126, 0.146, 0.059   | 0.030, 0.216, 0.086 | 0.053, 0.169, 0.109
C      | 0.063, 0.169, 0.099   | 0.079, 0.196, 0.056 | 0.069, 0.156, 0.106
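Note that each entry of Table 3-6 appears to be the basic probability mass obtained by multiplying the corresponding belief degree in Table 3-5 by the attribute weight of 1/3 (for example, 0.25 x 1/3 ≈ 0.083 for school A in Mathematics); this is the standard first step of the ER algorithm before the evidence is combined.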

Using the ER results and attribute probabilities, the final school performance scores and their rankings are shown in Table 3-7.

Table 3-7: Final school performance and rankings [117]
School | School Performance | Ranking
A      | 0.5017             | 3
B      | 0.5437             | 2
C      | 0.5446             | 1
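For readers who want to reproduce this style of aggregation, the sketch below implements the recursive ER combination rule (in the style of Yang and Xu) for the school data of Table 3-5, assuming equal attribute weights. The grade utilities u(W)=0, u(A)=0.5, u(E)=1 are illustrative assumptions, so the resulting scores will not necessarily match Table 3-7 exactly; the exact values depend on the utilities and normalization used in [115].

```python
import numpy as np

def er_aggregate(beliefs, weights):
    """Combine attribute-level belief distributions with the recursive
    evidential reasoning algorithm (Yang/Xu style).

    beliefs: (L, N) array; beliefs[i, n] = belief that the alternative is
             assessed to grade n on attribute i (rows may sum to < 1).
    weights: (L,) attribute weights summing to 1.
    Returns (beta, beta_H): combined beliefs per grade, residual ignorance.
    """
    L, N = beliefs.shape
    m = weights[0] * beliefs[0]                    # mass assigned to grades
    mbar = 1.0 - weights[0]                        # unassigned: weight share
    mtil = weights[0] * (1.0 - beliefs[0].sum())   # unassigned: incompleteness
    for i in range(1, L):
        mi = weights[i] * beliefs[i]
        mbari = 1.0 - weights[i]
        mtili = weights[i] * (1.0 - beliefs[i].sum())
        # Conflict = mass the two sources place on different grades
        conflict = m.sum() * mi.sum() - (m * mi).sum()
        K = 1.0 / (1.0 - conflict)                 # normalizing factor
        mH, mHi = mbar + mtil, mbari + mtili
        m = K * (m * mi + mH * mi + m * mHi)
        mtil = K * (mtil * mtili + mbar * mtili + mtil * mbari)
        mbar = K * (mbar * mbari)
    beta = m / (1.0 - mbar)                        # final combined beliefs
    return beta, mtil / (1.0 - mbar)

# School A from Table 3-5, grades ordered (W, A, E), equal weights.
beliefs_A = np.array([[0.25, 0.68, 0.07],   # Mathematics
                      [0.45, 0.50, 0.05],   # Science
                      [0.10, 0.32, 0.58]])  # English
beta, beta_H = er_aggregate(beliefs_A, np.array([1/3, 1/3, 1/3]))
# Assumed illustrative grade utilities: u(W)=0, u(A)=0.5, u(E)=1
print(beta, beta @ np.array([0.0, 0.5, 1.0]))
```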

3.3 Identify System Relationships from a SoS perspective (Aim 2)
3.3.1 Introduction
A System-of-Systems (SoS) is a collection of task-oriented or dedicated systems that pool their resources and capabilities together to create a new, more complex system which offers more functionality and performance than the sum of the constituent systems. With the problems of increasing complexity facing decision makers in today's industry, government organizations and educational institutions, SoS thinking is being applied to many of these problems. Multiple, evolving, heterogeneous and distributed systems are involved and embedded at many levels. Therefore, a framework is needed to enable satisfactory decision support. Addressing and identifying the relation of industry, government organizations and educational institutions from a SoS perspective is urgent and critical because it involves decisions that commit large amounts of money and resources. These decisions can lead to ultimate failure or success and carry significant consequences for

today's society and for future generations. SoS have challenges that extend beyond current complex systems. SoS comprise an evolving collection of distributed, heterogeneous networks of systems, each capable of independent and useful operation. The combination produces emergent and enterprise capabilities that individual systems alone cannot obtain. Understanding how a SoS is employed in today's environment is critical. Table 3-8 describes SoS characteristics from different environmental aspects.

Table 3-8: SoS characteristics from Environmental Aspects [118]
- Stakeholder Involvement: all stakeholders may not be recognized.
- Governance: added levels of complexity due to management and funding; the SoS does not have authority over all the systems.
- Operational Focus: must meet a set of operational objectives using systems whose objectives may or may not align with the SoS objectives.
- Acquisition: added complexity due to multiple system lifecycles across acquisition programs (legacy systems, development systems, new systems).
- Test & Evaluation: challenging due to the difficulty of synchronizing across multiple systems' life cycles.
- Boundaries & Interfaces: focuses on identifying the systems that contribute to the SoS objectives and enabling the flow of data, control and functionality across the SoS while balancing the needs of the systems.
- Performance & Behavior: performance across the SoS that satisfies SoS user capability needs while balancing the needs of the systems.

3.3.2 SoS Relationship
For the industry, government and educational SoSs, the following are key characteristics:

46 Texas Tech University, Jon Ilseng, December 2015

1. Component systems achieve well-substantiated purposes in their own right even if removed from the SoS.
2. Component systems are managed in large part for their own purposes rather than the purposes of the whole.
3. SoS exhibit behaviors not achievable by the component systems acting independently.
4. Functions, behaviors and component systems may be added or removed during the SoS's use.

3.4 Propose SoS Sustainability Performance Measures (Aim 3)
3.4.1 Introduction
Aim 3 identifies and proposes performance measures that integrate the needs of these various systems, as well as the overall SoS, with regard to sustainability. Using the Balanced Scorecard (BSC) measurement system, I propose the following performance measures to address Aim 3:
1. Cost/Logistics Index - measures the actual cost of the SoS versus the logistics information.
2. Performance/Logistics Index - measures the performance of the SoS versus the logistics information.
3. Availability/Logistics Index - measures the availability of the SoS versus the logistics information.
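As a rough illustration of how one of these indices might be computed (the formal definitions come from the survey and data collection described in Chapter 6), the sketch below relates a cost performance index to a logistics performance index. The ratio form, the assumed 1-5 LPI scale and all variable names are illustrative assumptions, not the dissertation's formal definition.

```python
def cost_logistics_index(cpi: float, lpi: float, lpi_max: float = 5.0) -> float:
    """Hypothetical Cost/Logistics Index: SoS cost performance relative to
    normalized logistics information. cpi is an earned-value cost performance
    index (budgeted cost / actual cost); lpi is a logistics performance index
    score assumed to lie on a 1-5 scale. Both the inputs and the ratio form
    are illustrative assumptions only.
    """
    return cpi / (lpi / lpi_max)

# Example: cpi = 0.95 (slightly over budget), lpi = 3.8 out of 5.
print(cost_logistics_index(0.95, 3.8))   # ~1.25
```

The Performance/Logistics and Availability/Logistics indices could follow the same pattern, with a performance or availability figure in the numerator.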

3.4.2 Approach
The research methods used to identify and propose Performance Measures will be DEA and ER. DEA has been used to assess performance in educational institutions. With educational institutions under constant financial pressure to be more efficient with their allocated resources, it is crucial that they are productive in fulfilling the increasing demand for education. The U.S. and foreign governments are very concerned with the increasing cost of higher education. In Romania, DEA was used to assess the efficiency

of 40 universities in three levels of classification: 1) Level 1 - universities focused on education, 2) Level 2 - universities focused on advanced research and education, and 3) Level 3 - universities focused on education and scientific research, specifically, education and artistic creation universities. [119] DEA has been used to measure the efficiency of the banking industry, specifically, bank branch efficiency. Eighty (80) DEA applications in twenty-four (24) countries/areas were surveyed to identify the best practitioners and the areas in need of improvement for a bank's complex operating situations. [120] DEA has also been used to create a Decision Support System (DSS) to assess and manage the relative performance of for-profit organizations. [121] ER has been used to assess military capability planning, specifically, weapon system capability assessment (WSCA). ER was used to assess the WSCA of a Main Battle Tank. [122] ER was used to handle various types of uncertainty such as ignorance and subjectiveness, and the ER approach was used to aggregate the capability measurement information from the sub-capability criteria to the top-capability criterion. The ER approach has also been applied to engineering design, engineering evaluation, new product design and business management. [123] The research design and methods described in Chapter 3 were beneficial in researching the case studies described in Chapter 4, Implementation.


CHAPTER 4
4 IMPLEMENTATION

4.1 Introduction
Chapter 4 centers on case studies that have been researched and conducted for SoS for-profit organizations, government agencies and educational institutions. It also involves case studies using Data Envelopment Analysis (DEA) and Evidential Reasoning (ER). Table 4-1 lists journal articles' case studies that have been researched and conducted for SoS for-profit organizations, government agencies and educational institutions. Each case study is discussed in detail in the following paragraphs.

Table 4-1: Journal articles' case studies for SoS for-profit organizations, government agencies and educational institutions
Journal Article | SoS
[124]           | Government Agencies
[125]           | Educational Institutions
[126]           | For-profit organizations
[127]           | Educational Institutions
[128]           | For-profit organizations
[129], [130]    | Government Agencies

Table 4-2 lists journal articles' case studies that have been conducted using DEA and ER.


Table 4-2: Journal articles' case studies using Data Envelopment Analysis and Evidential Reasoning
Journal Article | Approach
[131]           | Data Envelopment Analysis
[132]           | Data Envelopment Analysis
[133]           | Evidential Reasoning
[134]           | Evidential Reasoning
[135]           | Evidential Reasoning

4.1.1 Case Studies for Government Agencies
A case study for government agencies was conducted to evaluate the effect of different performance measurement practices on the accountability effectiveness of state and local government contracts. "Accountability effectiveness" describes the capacity of a government agency to design, implement, manage and achieve accountability for its social service contracts. [136] There were three main objectives for this case study: 1) determine how much of an influence performance measurement had on accountability effectiveness for a sample of state and local government contracts, 2) determine what effect fifteen (15) performance measurement practices had on contract accountability effectiveness, and 3) determine whether any collaborative measurement efforts affected accountability effectiveness. [137] The case study involved sixty-nine interviews with various government contract managers in state and local government agencies. Table 4-3 shows the service areas, the number of government respondents and contractors, and the percentage of contract areas. Program managers, procurement officers and government officers who managed multiple contracts were interviewed and asked to discuss the most typical contract they monitored [138].


Table 4-3: Respondents and Contract Areas
Service Area | Government Respondents (n = 39) | Contractors (n = 30)
Consulting, evaluation and management training | n = 9, 23.1% | n = 6, 20.0%
Long-term care | n = 7, 17.9% | n = 1, 3.3%
Construction, maintenance, public works | n = 6, 15.4% | n = 3, 10.0%
Medical, nursing care, health management | n = 5, 12.8% | n = 2, 6.7%
Mental health | n = 4, 10.3% | n = 1, 3.3%
Information technology | n = 3, 7.7% | n = 2, 6.7%
Programs for women and children | n = 1, 2.6% | n = 3, 10.0%
Criminal justice | n = 1, 2.6% | n = 2, 6.7%
Environmental (planting, plant control) | n = 2, 5.1% | n = 2, 6.7%
Food supply monitoring | n = 1, 2.6% | n = 0, 0%
Animal care | n = 0, 0% | n = 1, 3.3%
Substance abuse, homelessness | n = 0, 0% | n = 3, 10.0%
Janitorial | n = 0, 0% | n = 1, 3.3%
Translation | n = 0, 0% | n = 1, 3.3%
Recreational | n = 0, 0% | n = 2, 6.7%
TOTAL | n = 39, 100% | n = 30, 100%

In addition to service areas, the case study also tracked fifteen (15) specific performance measures which the sixty-nine (69) Government respondents and contractors normally used (see Table 4-4) [139].


Table 4-4: Performance Measures Used (% "yes")

Performance Measurement              Government Respondents (n = 39)   Contractors (n = 30)
Costs                                64.1                              46.7
Quality                              94.9                              66.7
Workload                             87.2                              63.3
Impact on clients                    79.5                              56.7
Client satisfaction                  84.6                              53.3
Equitable delivery of services       41.0                              30.0
Compliance with laws/regulations     64.1                              43.3
Timeliness                           94.9                              66.7
Disruptions                          97.4                              73.3
Process specified in detail          69.2                              43.3
Quantitative indicators              76.9                              60.0
Qualitative indicators               82.1                              66.7
Informal monitoring                  94.9                              70.0
Measures tailored to organizations   41.0                              40.0
Reputation                           64.1                              56.7

What is interesting about these fifteen (15) performance measures is that quality, workload, timeliness and informal monitoring are very common among both the Government respondents and the contractors. The biggest disparity is for client satisfaction, which the Government used 84.6% of the time while the contractors used it only 53.3% of the time. This case study revealed that any use of informal monitoring during oversight can have a negative effect on accountability effectiveness, specifically on receiving timely and accurate information and using it to reveal problems [140]. It also revealed that Government respondents are less likely to perceive high accountability effectiveness, whereas contractors tend to be satisfied with the extent of evaluation [141].

A second case study for Government Agencies evaluated public policy performance for local governments in the United Kingdom and United States. This case study compared value orientations and influencing factors with regard to the policy performance evaluation of a local government. The case study focused on four prime value factors for public policy performance: 1) equity, 2) service, 3) responsibility and 4) good governance [142]. Equity is one performance measure, and the case study defined it as "whether the groups or individuals receiving services are treated fairly" [143]. The case study concluded that equity was a good evaluator of local government actions, even though there are many difficulties in measuring it. Service was regarded as the most important performance measure when evaluating local governments. Key to service as a performance measure is whether a local government can provide high-quality public services, increase customers' choices and meet customers' demands. To do these things, local governments must be willing to change their traditional roles and concepts from "managers to service providers", listen to the public's voice and regard the customer's need as "action-oriented" [144]. The case study regards responsibility as a fundamental performance measure for local governments: with responsibility, local governments had to pay more attention to the results and evaluation of their actions, and responsibility forced them to expand and diversify and to adopt more effective mechanisms for implementing their responsibility. For good governance, the case study recommended that the role of local governments is not only "the regulators, funders and service providers", but also "the promoters and partners of the other society sectors, such as business organizations and nonprofit organizations" [145]. The case study concluded that good governance must involve partnerships between local governments and the private sector.

A third case study for Government Agencies evaluated the performance of local governments in China, specifically why local governments are inefficient at all levels of administration. The case study used the System Dynamics method to conduct the evaluation. The System Dynamics method is based on systems theory, cybernetics and information theory, and is a quantitative analysis method used to conduct research on social, economic and other kinds of complex systems [146]. The case study evaluated three groups: 1) policy makers, 2) policy performers and 3) public interest groups. The evaluation results were: 1) help is needed to improve the quality of public policy executives, 2) help is needed to improve the monitoring mechanisms of policy implementation, and 3) help is needed to increase the resources put into public policy and to optimize the environment of public policy implementation [147].

4.1.2 Case Studies for For-Profit Organizations
A case study was conducted to measure organizational performance and effectiveness for for-profit organizations. The for-profit organizations in this case study included global private businesses and technology companies. The measurement criteria included: 1) efficiency and/or productivity, 2) sales, 3) profitability and/or shareholder return, 4) financial success, 5) employee satisfaction and/or morale, 6) growth and/or market share, 7) customer orientation, 8) public image and reputation, 9) quality and 10) social performance [148]. Among these criteria, the case study revealed that profitability and/or shareholder return and financial success were key to improving the organizational performance and effectiveness of for-profit organizations. Employee satisfaction was a close second because it included social concerns and human management.

A second case study involved measuring corporate social performance (CSP) for over 3,000 U.S. publicly traded companies. CSP, as defined in this case study, is based on the Kinder, Lydenberg and Domini, Inc. structure (version 2007) and is shown in Table 4-5:

Table 4-5: Corporate Social Performance [149]

Level 1: Categories
  1-1 Environmental Performance
  1-2 Social Ratings
  1-3 Governance Ratings

Level 2: Issues
  2-1 Environmental: Climate change; Products & Services; Operations & Management; Other factors
  2-2 Social: Community; Diversity; Employee Relations; Human rights; Product
  2-3 Governance: Reporting; Structure; Other

Level 3: Items (Concerns and Strengths)
  Environmental Concerns: Climate change; Ozone depleting chemicals; Agricultural chemicals; Hazardous waste
  Environmental Strengths: Clean energy; Beneficial products & services; Pollution prevention; Recycling; Management systems
  Social Concerns: Investment controversies; Negative economic impact; Tax disputes; Other concerns
  Social Strengths: Charitable giving; Innovative giving; Support for education
  Governance Concerns: Public policy; Compensation; Transparency; Ownership
  Governance Strengths: Public policy; Transparency; Compensation; Ownership

The CSP structure shown in Table 4-5 was used to assess over 3,000 U.S. publicly traded companies in four major areas: 1) environment, 2) social, 3) governance and 4) controversial business involvement [150]. Each of these issues was evaluated from two different perspectives, concern and strength. The results from this case study were: 1) Chief Executive Officers (CEOs) are able to make resource and operation planning decisions to improve their company's CSP, 2) DEA efficiency scores help companies devise strategies at the operational level to improve their CSP, and 3) DEA efficiency scores and benchmark targets help companies benchmark their supply chain partners' CSP.

A third case study involved measuring corporate sustainability performance by applying a SoS Engineering perspective. The corporations in this case study used a Sustainability Performance Measurement System (SPMS) to measure progress towards their sustainability goals, objectives and targets [151]. An SPMS is very similar to other performance measurement systems, but it focuses on issues relevant to sustainability. The case study dealt with the characteristics of SoS Engineering problems and how these challenges are associated with corporate sustainability (see Table 4-6).

Table 4-6: Characteristics of SoS Engineering Problems and Corporate Sustainability [152]

Holistic Problem
  Definition: Requires consideration of the technical, human/social, managerial, organizational, policy and political dimensions.
  Corporate Sustainability: The sustainability space recognizes the interrelation and interdependence between a vast array of economic, environmental and social issues; corporations must consider the technical, human, managerial, organizational, policy and political dimensions.

Ambiguity
  Definition: Difficulty in clearly demarking problem boundaries, as well as their interpretation, is an inherent characteristic of SoSE.
  Corporate Sustainability: Establishing an interpretation of sustainability for corporations is key.

Uncertainty
  Definition: SoSE problems are not tightly bound.
  Corporate Sustainability: Corporate sustainability is a process of continual transformations and is therefore not tightly bound.

Highly Contextual
  Definition: The circumstances, conditions, factors and patterns that give meaning and purpose to the SoS.
  Corporate Sustainability: There is no universally applicable approach to corporate sustainability; different strategies, actions and solutions are appropriate in different contexts.

Emergence
  Definition: SoS behavioral and structural patterns and their interpretations.
  Corporate Sustainability: Corporate sustainability is a dynamic, evolutionary process.

Non-ergodic
  Definition: The condition of having no defined states or discernable transitions between states.
  Corporate Sustainability: There is no definitive end point where a corporation can declare that it has achieved sustainability.

Non-monotonic
  Definition: Increases in knowledge are not reciprocated by increases in understanding.
  Corporate Sustainability: Decisions to move a corporation towards sustainability are based on incomplete understanding that changes over time.

This case study summarized how corporate sustainability performance has the same characteristics that are normally associated with SoS Engineering problems, and concluded that corporations will face the following issues when establishing and developing their own SPMS [153]:
• The purpose and objectives of the SPMS will change over time.
• An optimal SPMS is not realistic.
• The SPMS represents an initial response, not a lasting solution.
• The SPMS does not directly measure corporate sustainability.
• The SPMS quality can only be assessed relative to the context in which it is applied.
• There will be trade-offs between economic, environmental and social goals and objectives.
• The SPMS should include measures of the whole.
• The process of developing a SPMS must remain flexible.

4.1.3 Case Studies for Educational Institutions
A case study was performed in Nigeria regarding research productivity at selected higher educational institutions. Specifically, the case study focused on input and output factors of research & development (R&D) and on which factors influenced research productivity. The case study had a sample size of twelve (12) leading Nigerian universities, and the researchers were randomly selected from the areas of Agriculture, Science and Engineering. The twelve (12) Nigerian universities are shown in Table 4-7 below.

Table 4-7: Twelve Nigerian Universities used in Case Study [154]

Ahmadu Bello University, Zaria, Kaduna State
Babcock University, Ogun State
Bowen University
Enugu State University of Technology, Enugu State
University of Nigeria, Nsukka
Ladoke Akintola University of Technology, Ogbomoso, Oyo State
Lagos State University
Obafemi Awolowo University, Ile-Ife, Osun State
University of Ilorin, Kwara State
University of Maiduguri, Borno State
University of Benin, Edo State
University of Ibadan, Oyo State

Researchers' publications from all twelve (12) universities consisted of journal articles, conference papers and contributions to books. The case study defined productivity "as a ratio of researchers' publications and the number of years used to produce the output (for this case study five years)" [155]. The results of the case study are shown in Table 4-8, which gives the output factors and the research productivity distributed by field of study.

Table 4-8: Output Factors and Research Productivities by Field of Study [156]

Output Factors (Publications)   Agriculture (n = 92)   Science (n = 144)   Technology (n = 100)
Local Journal                   4.34                   2.74                3.27
Foreign Journal                 2.79                   3.01                2.48
Local Conference                3.40                   2.89                2.25
Foreign Conference              1.42                   1.10                1.08
Books/Chapters in Books         0.88                   0.58                0.64

Table 4-8 indicates that more local journal articles were published than foreign ones. The case study found that approximately 14% of researchers had low productivity, 70% had medium productivity and 16% had high productivity. It also found that the researchers' qualifications, years of experience, research collaborations and time devoted to research significantly influenced R&D productivity [157].


A second case study on educational institutions involved the No Child Left Behind (NCLB) Act of 2001 and whether it was influential in shaping the goals and outcomes of educational policy across the United States. The NCLB Act of 2001 was passed by the U.S. Congress to help synchronize state accountability models. It required each of the fifty (50) states to develop measures of Adequate Yearly Progress (AYP) in the subjects of reading and mathematics in order to evaluate the quality of the public education system. One AYP performance measure was the performance accountability index, which consisted of the following four elements [158]:
1. Trajectory Selection – Provides a detailed description of how each of the fifty (50) states planned to advance toward a 100% proficiency goal by the end of the 2014 school term.
2. Confidence Intervals – Ranges of the performance expectations bounding the band of acceptable scores. Confidence intervals were used to define the level of difficulty each state had in accomplishing performance targets.
3. Rewards – States were awarded points from zero to one based on whether they incorporated grants, increased discretion or provided monetary rewards.
4. Supplemental Measures – States received additional points if they incorporated supplementary performance measures not mandated by NCLB. For example, Arizona reported both AYP and its own state accountability indicator to take growth into account when assessing test results, and Delaware incorporated the AYP measures into its State Progress Determination system, which included each school's accountability history in its academic rating.

The overall summary indicated the following [159]:


• The development and use of performance measures to hold educators accountable and improve performance was limited by organized employee groups and enhanced by minority student populations.
• States that had strong unions tended to produce watered-down reforms, which raised concern about employee involvement in the process.
• States with more minorities implemented systems that were more demanding.
• The impact of accountability-based reforms is limited.
• States with higher performance index scores under NCLB failed to produce significant improvements in student performance in terms of increases in the average eighth-grade reading and math scale scores.

4.1.4 Case Studies for Data Envelopment Analysis
A case study was performed within an Engineering Department of the Belgian Armed Forces examining how Data Envelopment Analysis was used to generate objective comparisons of project durations [160]. This case study focused on how DEA was used to compare fifteen (15) engineering design projects based on actual measures such as duration and cost, and on how the impact of changes affects the engineering design process. The evidence from this case study suggested that using DEA shortened project duration by an average of 22% [161]. Another case study used a DEA model to evaluate the teaching and research efficiencies of universities [162]. The case study first identified inputs/outputs for university performance measurement, consisting of sixteen (16) total measures. Even with a small sample size of thirty (30), the DEA model demonstrated large differences between efficient and inefficient universities. A minimal numerical sketch of the DEA calculation is given below.
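To make the DEA mechanics concrete, the following is a minimal sketch, assuming the standard input-oriented CCR multiplier model and invented project data (the figures below are illustrative, not the Belgian case-study numbers). It computes an efficiency score between 0 and 1 for each decision-making unit (DMU).

```python
# A minimal sketch of the input-oriented CCR DEA model (multiplier form),
# solved with scipy.optimize.linprog. The project data are hypothetical.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Efficiency of DMU j0 given inputs X (m x n) and outputs Y (s x n)."""
    m, n = X.shape
    s, _ = Y.shape
    # Decision variables: output weights u (s entries), then input weights v (m entries).
    c = np.concatenate([-Y[:, j0], np.zeros(m)])        # maximize u . y_j0
    A_eq = np.concatenate([np.zeros(s), X[:, j0]]).reshape(1, -1)
    b_eq = [1.0]                                         # normalize: v . x_j0 = 1
    A_ub = np.hstack([Y.T, -X.T])                        # u . y_j - v . x_j <= 0 for all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                      # efficiency score in (0, 1]

# Illustrative data: 4 projects, 2 inputs (cost, staff), 1 output (delivered scope).
X = np.array([[100.0, 80.0, 120.0, 90.0],
              [  8.0,  6.0,  10.0,  7.0]])
Y = np.array([[ 50.0, 45.0,  55.0, 48.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```

A DMU scoring 1.0 lies on the efficient frontier; scores below 1.0 quantify how far a project (or university, in the second case study) falls short of its efficient peers.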

4.1.5 Case Studies for Evidential Reasoning
A case study was performed using the Evidential Reasoning approach to measure the Weapon System Capability Assessment (WSCA) for four Main Battle Tanks [163]. Specifically, the Evidential Reasoning approach was used to measure the attack capability, mobility capability, defense capability and communication & control capability of each of the four Main Battle Tanks: 1) the Chinese Type 98T, 2) the United States of America (USA) M1A2 Abrams, 3) the United Kingdom Challenger 2E and 4) the German Leopard 2. Using the Evidential Reasoning approach, these four Main Battle Tanks were ranked by each of the above capabilities. The results were that the Chinese Type 98T had the best attack capability, the German Leopard 2 had the best mobility capability, the United Kingdom Challenger 2E had the best defense capability and the USA M1A2 Abrams had the best communication & control capability [164]. A second case study surveyed the Evidential Reasoning approach from two viewpoints: 1) theoretical development and 2) applications [165]. The survey covered applications of the Evidential Reasoning approach in the following areas: 1) engineering design, 2) reliability, safety and risk assessment, 3) business management, 4) project management and supply chain management, 5) environmental and sustainability management, 6) policy making and 7) group decision making. It concluded that the Evidential Reasoning approach is very capable of handling various types of uncertainties in an integrated way [166]. A third case study was conducted in Malaysia using Evidential Reasoning to assess schools' performance using standardized examination results rather than directly measuring the quality of the processes unfolding within the schools [167]. This case study was able to assess and rank schools by an appropriate weight and merit point for each attribute and grade. Fifty-seven (57) schools were selected (boarding, religious and vocational/educational schools were excluded), and there was no requirement based on admission standards. The case study concluded that the Evidential Reasoning approach was a mathematically sound approach to measuring the schools' performance [168]. A minimal sketch of the core ER aggregation step follows.
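As a concrete illustration, here is a minimal sketch of the recursive ER aggregation algorithm of Yang and Xu, assuming normalized attribute weights and illustrative belief degrees; the capability labels in the comments are hypothetical, not the case studies' data.

```python
# A minimal sketch of the recursive evidential reasoning (ER) aggregation step.
def er_aggregate(weights, beliefs):
    """Combine belief distributions over N assessment grades from L attributes.

    weights : list of L attribute weights summing to 1.
    beliefs : L x N list; beliefs[i][n] is the belief that the option scores
              grade n on attribute i (a row summing to < 1 leaves ignorance).
    Returns (combined beliefs over the N grades, residual ignorance).
    """
    n_grades = len(beliefs[0])
    # Probability masses from the first attribute.
    m = [weights[0] * b for b in beliefs[0]]
    m_bar = 1.0 - weights[0]                         # mass unassigned due to weight
    m_tilde = weights[0] * (1.0 - sum(beliefs[0]))   # mass unassigned due to ignorance

    for w, bel in zip(weights[1:], beliefs[1:]):
        mi = [w * b for b in bel]
        mi_bar = 1.0 - w
        mi_tilde = w * (1.0 - sum(bel))
        m_H, mi_H = m_bar + m_tilde, mi_bar + mi_tilde
        # Normalizing factor removes conflict between different grades.
        conflict = sum(m[t] * mi[j] for t in range(n_grades)
                       for j in range(n_grades) if j != t)
        k = 1.0 / (1.0 - conflict)
        m = [k * (m[n] * mi[n] + m[n] * mi_H + m_H * mi[n]) for n in range(n_grades)]
        m_tilde = k * (m_tilde * mi_tilde + m_tilde * mi_bar + m_bar * mi_tilde)
        m_bar = k * (m_bar * mi_bar)

    beta = [mn / (1.0 - m_bar) for mn in m]          # final belief degrees
    beta_H = m_tilde / (1.0 - m_bar)                 # residual ignorance
    return beta, beta_H

# Illustrative use: two capability attributes assessed on three grades.
weights = [0.6, 0.4]
beliefs = [[0.5, 0.3, 0.2],   # e.g. attack capability (fully assessed)
           [0.2, 0.5, 0.2]]   # e.g. mobility capability (0.1 left unassigned)
print(er_aggregate(weights, beliefs))
```

The explicit ignorance term is what lets the ER approach handle the "various types of uncertainties" the survey case study highlighted, rather than forcing incomplete assessments into a single point score.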

The three case studies on Government Agencies conclude that: 1) informal monitoring has a negative effect on accountability effectiveness, 2) Government respondents have a low perception of accountability effectiveness while contractor respondents have a high perception of accountability effectiveness, 3) good governance involves partnerships between local governments and the private sector, and 4) help is needed to improve the quality of public policy executives and the monitoring mechanisms of policy implementation. The three case studies on For-Profit Organizations conclude that: 1) profitability and/or shareholder return are key to improving organizational performance, 2) measuring CSP helps CEOs make resource and operation planning decisions, 3) DEA efficiency scores help companies devise strategies at the operational level to improve their CSP, 4) DEA efficiency scores and benchmark targets help companies benchmark their supply chain partners' CSP and 5) an SPMS is a good measure of progress towards a corporation's sustainability goals and objectives. The two case studies on Educational Institutions conclude that: 1) researchers' qualifications, years of experience, research collaborations and time devoted to research influence R&D productivity and 2) the NCLB Act of 2001 can hold educators accountable and improve performance, even though it is limited by organized employee groups. The research and implementation from these case studies helped define the methodology used to define performance measures: the Balanced Scorecard system, described in Chapter 5.


CHAPTER 5
METHODOLOGY
The case studies described in Chapter 4 shaped the methodology used here. For the purposes of this dissertation, the methodology used to define performance measures is the balanced scorecard (BSC) measurement system. As discussed in Chapter 2, the BSC has been used to measure performance in Government Agencies for the following measures: 1) financial responsibility, 2) service to citizens, 3) work process improvement and 4) learning and growth of employees. The BSC has also been used to evaluate performance in manufacturing firms for the measures of cost and efficiency. The BSC is a well-known and widely used framework for performance measurement. It is defined as a "frame that helps the organization to transpose the strategy into operational objectives, in order to direct both the organization performance and behavior" [169]. The BSC is a reasonably good guide for reaching the final goal of any organization, whether performance improvement or profit, from more than one perspective. For this research, the BSC methodology will examine performance by looking at production and innovation, measuring performance in terms of maximizing profit from current products, and following indicators of future productivity. The Balanced Scorecard helps align key performance measures with strategy at all levels of an organization. Other measurement systems have been implemented, but the BSC helps planners identify what should be done and measured, and it enables company executives to truly execute their strategies. The BSC model translates an organization's vision, strategy and long-term goals into a set of measures built around four perspectives: 1) Financial, 2) Customer, 3) Internal Business Processes and 4) Innovation and Learning. The BSC model was created by Robert S. Kaplan and David P. Norton (referred to as the Kaplan and Norton model) and is considered the standard BSC template [170]. Their model added three perspectives (Customer, Internal Business Process, and Innovation and Learning) to the traditional financial measure. It incorporates both leading and lagging indicators, drives the choice of performance measures, and emphasizes balance across multiple dimensions of performance. This ensures that good performance in one area is not offset by poor performance elsewhere. Figure 5-1 illustrates the standard BSC template.

Figure 5-1 Standard Balanced Scorecard Template [171]. The figure places Vision and Strategy at the center, linked to the Financial Perspective ("How do we look to shareholders?"), the Customer Perspective ("How do customers see us?") and the Internal Business Process Perspective ("What must we excel at?").


This dissertation uses the BSC template to define and propose performance measures for integrating the Dallas-Fort Worth SoS with regard to sustainability for the following reasons. The BSC template:
• Measures the progress of any organization toward its strategic goals by translating vision and objectives into measures across a balanced set of perspectives.
• Captures customers' expectations and measures an organization's ability to meet them.
• Translates strategy, mission and vision into tangible measures for use by decision makers.
• Is the culmination of sophisticated data collection.
• Drives the process of change.

The BSC template focuses on four components, shown in Figure 5-2: 1) Perspectives (Financial, Internal Business Process, Innovation and Learning, Customer), 2) Objectives (what an organization needs to do to accomplish its strategy), 3) Metrics (actionable and tangible measurements that support achieving objectives) and 4) Targets (performance level expectations set against the strategic plan). The BSC template challenges an organization's strategy to ensure that it is the most effective one for the company. It places vision and strategy at the center of performance measurement, so a clear strategy and mission are critical to its success. The BSC does not give or define an organization's strategy, but it notifies an organization very quickly when the strategy is not working or will not work. The BSC template helps align an organization's strategic objectives across the entire organization at all levels, and it helps employees at all levels clearly understand what each of them can do to make the organization successful and meet its strategic objectives. Once an organization or company implements the BSC template, its performance measures indicate whether the overall strategy is being met and fulfilled effectively. A minimal sketch of how these four components fit together as data follows.
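As an illustration of the four components, here is a minimal sketch assuming hypothetical objectives, measures and targets (none of the entries below are taken from this research); it simply organizes the Kaplan and Norton perspectives as data and checks measures against their targets.

```python
# A minimal sketch of a balanced scorecard as a data structure.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str        # actionable, tangible measurement (a "metric")
    target: float    # performance level expected by the strategic plan
    actual: float    # observed value

    def met(self) -> bool:
        # Assumes "higher is better" measures for simplicity.
        return self.actual >= self.target

@dataclass
class Objective:
    description: str
    measures: list[Measure] = field(default_factory=list)

# One hypothetical objective per perspective, purely for illustration.
scorecard = {
    "Financial": Objective("Grow revenue",
                           [Measure("Revenue growth %", 5.0, 6.2)]),
    "Customer": Objective("Improve satisfaction",
                          [Measure("Satisfaction score (1-5)", 4.0, 3.8)]),
    "Internal Business Process": Objective("Deliver on time",
                                           [Measure("Orders on time %", 95.0, 97.0)]),
    "Innovation & Learning": Objective("Develop staff",
                                       [Measure("Training hours per FTE", 20.0, 24.0)]),
}

for perspective, obj in scorecard.items():
    for msr in obj.measures:
        status = "on target" if msr.met() else "below target"
        print(f"{perspective}: {msr.name} = {msr.actual} ({status})")
```

The point of the structure is the balance the BSC emphasizes: a shortfall in one perspective (here, Customer) remains visible even when the other three are on target.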


Figure 5-2 Standard Balanced Scorecard Template [172]. Each of the four perspectives carries its own Objectives, Measures and Targets, framed by a guiding question: Financial ("To succeed financially, how should we appear to our shareholders?"), Customer ("To achieve our strategy, how should we appear to our customers?"), Internal Business Process ("To satisfy our shareholders and customers, what business processes must we excel at?") and Innovation & Learning ("To achieve our strategy, how will we sustain our ability to change and improve?").


The BSC methodology helped define the data collection process described in Chapter 6.


CHAPTER 6
SURVEY DEVELOPMENT AND DATA COLLECTION
The BSC methodology discussed in Chapter 5 helped define the data collection process and the survey created to collect data. The data collection process involved several commonly used methods. Both primary data (data collected for the first time and in crude form) and secondary data (public data which has already been collected) were gathered. For the primary data, two commonly used methods were employed: 1) a direct on-line survey and 2) direct oral interviews. Secondary data came from: 1) DoD Selected Acquisition Reports (SARs) in a study published by the RAND Corporation for the U.S. Air Force, 2) World Bank Open Data and 3) the City of Dallas, Texas "Dallas Measures". The World Bank Open Data and Dallas Measures are free and accessible to the public.

6.1 Cost Performance Index Data
When determining what performance measures to collect data for, what do organizations (education, for-profit, non-profit, government agencies) typically consider as their primary measures? The first and foremost primary measure is cost (actual and predicted). Every organization must consider cost because it is a key indicator of whether an organization survives. When researching these organizations for the most commonly used cost performance measure, the Cost Performance Index (CPI) emerged as the most common. CPI is a cost efficiency factor relating the value of the work performed to the actual costs expended, and it is a commonly used performance measure for DoD contractors and commercial companies. The actual cost expended is referred to as the Actual Cost of Work Performed (ACWP); the value of the work performed is referred to as the Budgeted Cost of Work Performed (BCWP). ACWP is the actual hours of work performed through a point in time; BCWP is the planned hours of activities that have been completed at any given point in the schedule. CPI is calculated as CPI = BCWP/ACWP and indicates the cost efficiency with which contractual work has been accomplished. A CPI of less than 1.0 implies a cost overrun: the cost of completing the work is higher than planned. A CPI greater than 1.0 implies a cost underrun: the cost of completing the work is less than planned. A CPI equal to 1.0 implies that the cost of completing the work is right on plan. CPI can be a monthly or a cumulative average, as long as the usage is consistent. The CPI data collected here is cumulative over a period of time; the period varies depending on the program timeframe and how far into the program CPI was collected. The CPI cumulative-to-date (CTD) data was extracted from DoD Selected Acquisition Reports (SARs) in a study published by the RAND Corporation for the U.S. Air Force [173]. Thirty-five (35) major DoD acquisition programs were analyzed over fiscal year 2004 (October 2003 – September 2004). The CPI data collected for two U.S. Air Force programs is shown in Tables 6-1 and 6-2; the CPI data for the remaining thirty-three (33) programs appears in Appendix A, Tables A-1 through A-33. A short sketch of the CPI calculation and its interpretation follows.
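As a concrete illustration of the formula above, here is a minimal sketch with invented BCWP/ACWP figures (not the SAR data):

```python
# A minimal sketch of the CPI calculation defined above.
def cpi(bcwp: float, acwp: float) -> float:
    """Cost Performance Index: value of work performed / actual cost."""
    return bcwp / acwp

def interpret(value: float) -> str:
    if value < 1.0:
        return "cost overrun (work is costing more than planned)"
    if value > 1.0:
        return "cost underrun (work is costing less than planned)"
    return "right on plan"

# Hypothetical cumulative (BCWP, ACWP) pairs over three months.
monthly = [(1000, 1000), (2100, 2160), (3300, 3280)]
for bcwp, acwp in monthly:
    c = cpi(bcwp, acwp)
    print(f"CPI(CTD) = {c:.2f}: {interpret(c)}")
```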

Table 6-1: U.S. Air Force B-1B Bomber Program CPI(CTD) Data

Month/Year       CPI(CTD)
October 2003     1.00
November 2003    1.00
December 2003    0.99
January 2004     0.97
February 2004    0.98
March 2004       0.98
April 2004       0.99
May 2004         0.99
June 2004        1.00
July 2004        1.00
August 2004      1.02
September 2004   1.02


Table 6-2: U.S. Air Force C-17A Transport Program CPI(CTD) Data

Month/Year       CPI(CTD)
October 2003     1.00
November 2003    0.96
December 2003    1.01
January 2004     1.05
February 2004    1.06
March 2004       1.12
April 2004       1.15
May 2004         1.33
June 2004        1.37
July 2004        1.55
August 2004      1.60
September 2004   1.60

6.2 Logistics Performance Index Data
The performance measure Logistics Performance Index (LPI) is commonly used throughout the world. LPI is a comprehensive index the World Bank uses to help countries identify the challenges and opportunities they face in trade logistics performance. The LPI overall score reflects perceptions of a country's logistics and is the weighted average of a country's scores on six dimensions:
1. Efficiency of the customs clearance process (i.e. speed, simplicity and predictability of formalities) by border control agencies.
2. Quality of trade and transport related infrastructure (e.g. ports, railroads, roads, information technology).
3. Ease of arranging competitively priced shipments.
4. Competence and quality of logistics services (e.g., transport operators, customs brokers).
5. Ability to track and trace consignments.


6. Timeliness of shipments in reaching destination within the scheduled or expected delivery time.

LPI scores range from 1 (worst) to 5 (best), with a higher score representing better performance. LPI data is collected from worldwide surveys conducted by the World Bank in partnership with academic and international institutions, private companies and individuals involved in international logistics. The surveys are designed and implemented by the World Bank International Trade and Transport Departments, with support from Finland's Turku School of Economics. Each country's score is the total of the evaluation resulting from the survey: respondents score each of the six dimensions from 1 (worst) to 5 (best), and the scores for the six dimensions are averaged across all respondents and aggregated into a single overall score (a small sketch of this aggregation follows the dimension labels below). The respondent demographics consist of nearly 1,000 logistics professionals from international logistics companies in approximately 130 countries. The first survey was conducted in calendar year (CY) 2006, with the LPI scores reported in CY 2007; subsequent surveys were conducted every two years thereafter, in CYs 2009, 2011 and 2013, with LPI scores reported in CYs 2010, 2012 and 2014. Table 6-3 shows the LPI scores reported in CY 2007; the LPI scores for CYs 2010, 2012 and 2014 are shown in Appendix B, Tables B-1 through B-3. For brevity, each of the six dimensions is labeled in the tables as follows:
1. CUSTOMS: Efficiency of the customs clearance process (i.e. speed, simplicity and predictability of formalities) by border control agencies.
2. INFRASTRUCTURE: Quality of trade and transport related infrastructure (e.g. ports, railroads, roads, information technology).
3. EASE OF SHIPMENT: Ease of arranging competitively priced shipments.
4. LOGISTICS SERVICES: Competence and quality of logistics services (e.g., transport operators, customs brokers).


5. EASE OF TRACKING: Ability to track and trace consignments. 6. TIMELINESS: Timeliness of shipments in reaching destination within the scheduled or expected delivery time.
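As a concrete illustration of the aggregation just described, here is a minimal sketch assuming equal dimension weights and two hypothetical respondents; the World Bank's exact weighting scheme is not reproduced in this chapter, so equal weights are an assumption.

```python
# A minimal sketch of LPI aggregation: per-dimension averaging across
# respondents, then combination into a single overall score.
from statistics import mean

DIMENSIONS = ["CUSTOMS", "INFRASTRUCTURE", "EASE OF SHIPMENT",
              "LOGISTICS SERVICES", "EASE OF TRACKING", "TIMELINESS"]

def overall_lpi(responses):
    """responses: list of dicts mapping dimension name -> score (1..5)."""
    per_dimension = {d: mean(r[d] for r in responses) for d in DIMENSIONS}
    overall = mean(per_dimension.values())   # equal weights assumed here
    return overall, per_dimension

# Two hypothetical respondents scoring one country.
respondents = [
    {"CUSTOMS": 3, "INFRASTRUCTURE": 4, "EASE OF SHIPMENT": 3,
     "LOGISTICS SERVICES": 4, "EASE OF TRACKING": 4, "TIMELINESS": 5},
    {"CUSTOMS": 4, "INFRASTRUCTURE": 4, "EASE OF SHIPMENT": 3,
     "LOGISTICS SERVICES": 3, "EASE OF TRACKING": 4, "TIMELINESS": 4},
]
print(overall_lpi(respondents))
```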

Table 6-3: Calendar Year 2007 Reported Logistics Performance Index Scores

Country (columns: Overall LPI, Customs, Infrastructure, Ease of Shipment, Logistics Services, Ease of Tracking, Timeliness)
Afghanistan 1.21 1.30 1.10 1.22 1.25 1.00 1.38
Albania 2.08 2.00 2.33 2.33 2.00 1.67 2.13
Algeria 2.06 1.60 1.83 2.00 1.92 2.27 2.82
Angola 2.48 2.40 2.25 2.50 2.50 2.38 2.82
Argentina 2.98 2.65 2.81 2.97 3.00 3.00 3.50
Armenia 2.14 2.10 1.78 2.00 2.11 2.22 2.63
Australia 3.79 3.58 3.65 3.72 3.76 3.97 4.10
Austria 4.06 3.83 4.06 3.97 4.13 3.97 4.44
Azerbaijan 2.29 2.23 2.00 2.50 2.00 2.38 2.63
Bahrain 3.15 3.40 3.40 3.33 2.75 3.00 3.00
Bangladesh 2.47 2.00 2.29 2.46 2.33 2.46 3.33
Belarus 2.53 2.67 2.63 2.13 2.13 2.71 3.00
Belgium 3.89 3.61 4.00 3.65 3.95 3.96 4.25
Benin 2.45 1.80 1.89 2.78 2.56 2.89 2.78
Bolivia 2.31 2.00 2.08 2.42 2.17 2.38 2.81
Bosnia & Herzegovina 2.46 2.32 2.26 2.50 2.37 2.29 3.00
Brazil 2.75 2.39 2.75 2.61 2.94 2.77 3.10
Bulgaria 2.87 2.47 2.47 2.79 2.86 3.14 3.56
Burkina Faso 2.24 2.13 1.89 2.67 2.33 2.13 2.25
Burundi 2.29 2.20 2.50 2.50 2.50 2.00 2.00
Bhutan 2.16 1.95 1.95 2.06 2.18 2.27 2.57
Cambodia 2.50 2.19 2.30 2.47 2.47 2.53 3.05
Cameroon 2.49 2.57 2.00 2.33 2.25 2.50 3.29
Canada 3.92 3.82 3.95 3.78 3.85 3.98 4.19
Chad 1.98 2.00 1.80 1.83 1.82 1.91 2.56
Chile 3.25 3.32 3.06 3.21 3.19 3.17 3.55
China 3.32 2.99 3.20 3.31 3.40 3.37 3.68
Colombia 2.50 2.10 2.28 2.61 2.44 2.63 2.94
Comoros 2.48 2.30 2.46 2.33 2.64 2.50 2.67
Costa Rica 2.55 2.49 2.43 2.53 2.43 2.57 2.89
Cote d'Ivoire 2.36 2.22 2.22 2.13 2.38 2.00 3.25
Croatia 2.71 2.36 2.50 2.69 2.83 2.46 3.45
Cyprus 2.92 2.77 2.91 2.92 2.77 2.92 3.25
Czech Republic 3.13 2.95 3.00 3.06 3.00 3.27 3.56
Denmark 3.86 3.97 3.82 3.67 3.83 3.76 4.11
Djibouti 1.94 1.64 1.92 2.00 2.00 1.82 2.30
Dominican Republic 2.38 2.33 2.18 2.34 2.25 2.28 2.89
Ecuador 2.60 2.25 2.36 2.64 2.64 2.45 3.27
Egypt 2.37 2.08 2.00 2.33 2.38 2.62 2.85
El Salvador 2.66 2.38 2.42 2.78 2.53 2.82 3.06
Eritrea 2.19 2.14 2.00 2.00 2.67 2.50 1.83
Estonia 2.95 2.75 2.91 2.85 3.00 2.84 3.35
Ethiopia 2.33 2.14 1.88 2.43 2.00 1.83 3.67
Finland 3.82 3.68 3.81 3.30 3.85 4.17 4.18
France 3.76 3.51 3.82 3.63 3.76 3.87 4.02
Gabon 2.10 2.25 2.40 1.67 2.00 2.00 2.33
Gambia 2.52 2.25 2.33 2.67 3.00 2.33 2.50
Germany 4.10 3.88 4.19 3.91 4.21 4.12 4.33
Ghana 2.16 2.00 2.25 2.25 1.75 2.25 2.50
Greece 3.36 3.06 3.05 3.11 3.33 3.53 4.13
Guatemala 2.53 2.27 2.13 2.62 2.50 2.43 3.23
Guinea 2.71 2.50 2.33 2.50 2.67 2.83 3.50
Guinea-Bissau 2.28 2.14 2.25 2.22 2.00 2.22 2.86
Guyana 2.05 1.95 1.78 1.80 1.95 2.35 2.50
Haiti 2.21 2.08 2.14 2.20 2.11 2.16 2.60
Honduras 2.50 2.48 2.32 2.48 2.41 2.41 2.88
Hong Kong, China 4.00 3.84 4.06 3.78 3.99 4.06 4.33
Hungary 3.15 3.00 3.12 3.07 3.07 3.00 3.69
India 3.07 2.69 2.90 3.08 3.27 3.03 3.47
Indonesia 3.01 2.73 2.83 3.05 2.90 3.30 3.28
Iran 2.51 2.50 2.44 2.59 2.69 2.00 2.80
Ireland 3.91 3.82 3.72 3.76 3.93 3.96 4.32
Israel 3.21 2.73 3.00 3.27 3.23 3.46 3.58
Italy 3.58 3.19 3.52 3.57 3.63 3.66 3.93
Jamaica 2.25 2.35 2.03 2.13 2.07 2.24 2.65
Japan 4.02 3.79 4.11 3.77 4.12 4.08 4.34
Jordan 2.89 2.62 2.62 3.08 3.00 2.85 3.17
Kazakhstan 2.12 1.91 1.86 2.10 2.05 2.19 2.65
Kenya 2.52 2.33 2.15 2.79 2.31 2.62 2.92
Kuwait 2.99 2.50 2.83 2.60 3.00 3.33 3.75
Kyrgyz Republic 2.35 2.20 2.06 2.35 2.35 2.38 2.76
Laos, PDR 2.25 2.08 2.00 2.40 2.29 1.89 2.83
Latvia 3.02 2.53 2.56 3.31 2.94 3.06 3.69
Lebanon 2.37 2.17 2.14 2.50 2.40 2.33 2.67
Lesotho 2.30 2.40 2.00 2.50 2.20 1.83 2.83
Liberia 2.31 2.40 2.14 2.83 2.00 2.00 2.43
Lithuania 2.78 2.64 2.30 3.00 2.70 2.60 3.40
Luxembourg 3.54 3.67 3.86 3.00 3.22 3.56 4.00
Macedonia 2.43 2.00 2.29 2.67 2.33 2.50 2.83
Madagascar 2.24 2.24 2.13 2.25 2.00 2.19 2.67
Malawi 2.42 2.25 2.13 2.56 2.56 2.00 3.00
Malaysia 3.48 3.36 3.33 3.36 3.40 3.51 3.95
Mali 2.29 2.17 1.90 2.23 2.21 2.38 2.88
Mauritania 2.63 2.40 2.20 2.60 2.70 2.80 3.10
Mauritius 2.13 2.00 2.29 2.20 1.75 2.25 2.33
Mexico 2.87 2.50 2.68 2.91 2.80 2.96 3.40
Moldova 2.31 2.14 1.94 2.36 2.21 2.50 2.73
Mongolia 2.08 2.00 1.92 2.50 1.80 2.00 2.25
Morocco 2.38 2.20 2.33 2.75 2.13 2.00 2.86
Mozambique 2.29 2.23 2.08 2.25 2.36 2.00 2.83
Myanmar 1.86 2.07 1.69 1.73 2.00 1.57 2.08
Namibia 2.16 2.14 2.00 2.14 1.83 1.83 3.00
Nepal 2.14 1.83 1.77 2.09 2.08 2.33 2.75
Netherlands 4.18 3.99 4.29 4.05 4.25 4.14 4.38
New Zealand 3.75 3.57 3.61 3.77 3.82 3.68 4.05
Nicaragua 2.21 2.14 1.86 2.18 2.41 2.19 2.50
Niger 1.97 1.67 1.40 1.80 2.00 2.00 3.00
Nigeria 2.40 2.23 2.23 2.49 2.38 2.36 2.69
Norway 3.81 3.76 3.82 3.62 3.78 3.67 4.24
Oman 2.92 2.71 2.86 2.57 2.67 2.80 4.00
Pakistan 2.62 2.41 2.37 2.72 2.71 2.57 2.93
Panama 2.89 2.68 2.79 2.80 2.73 2.93 3.43
Papua New Guinea 2.38 2.00 2.00 2.57 2.29 2.29 3.14
Paraguay 2.57 2.20 2.47 2.29 2.63 2.67 3.23
Peru 2.77 2.68 2.57 2.91 2.73 2.70 3.00
Philippines 2.69 2.64 2.26 2.77 2.65 2.65 3.14
Poland 3.04 2.88 2.69 2.92 3.04 3.12 3.59
Portugal 3.38 3.24 3.16 3.23 3.19 3.44 4.06
Qatar 2.98 2.44 2.63 3.00 3.00 3.17 3.67
Romania 2.91 2.60 2.73 3.20 2.86 2.86 3.18
Russian Federation 2.37 1.94 2.23 2.48 2.46 2.17 2.94
Rwanda 1.77 1.80 1.53 1.67 1.67 1.60 2.38
Sao Tome 2.86 2.50 2.20 3.40 3.00 3.00 3.00
Saudi Arabia 3.02 2.72 2.95 2.93 2.88 3.02 3.65
Senegal 2.37 2.38 2.09 2.73 2.30 2.30 2.63
Serbia & Montenegro 2.28 2.33 2.18 2.25 2.29 2.07 2.54
Sierra Leone 1.95 1.58 1.83 1.82 1.91 2.00 2.64
Singapore 4.19 3.90 4.27 4.04 4.21 4.25 4.53
Slovak Republic 2.92 2.61 2.68 3.09 3.00 2.87 3.26
Slovenia 3.14 2.79 3.22 3.14 3.09 2.91 3.73
Solomon Islands 2.08 1.73 2.00 2.36 2.10 2.00 2.30
Somalia 2.16 2.43 1.63 1.88 2.25 1.75 3.00
South Africa 3.53 3.22 3.42 3.56 3.54 3.71 3.78
South Korea 3.52 3.22 3.44 3.44 3.63 3.56 3.86
Spain 3.52 3.17 3.51 3.45 3.55 3.63 3.86
Sri Lanka 2.40 2.25 2.13 2.31 2.45 2.58 2.69
Sudan 2.71 2.36 2.36 2.67 2.83 2.92 3.17
Sweden 4.08 3.85 4.11 3.90 4.06 4.15 4.43
Switzerland 4.02 3.85 4.13 3.67 4.00 4.04 4.48
Syria 2.09 2.17 1.91 2.00 1.80 2.00 2.67
Taiwan 3.64 3.25 3.62 3.65 3.58 3.60 4.18
Tajikistan 1.93 1.91 2.00 2.00 1.90 1.67 2.11
Tanzania 2.08 2.07 2.00 2.08 1.92 2.17 2.27
Thailand 3.31 3.03 3.16 3.24 3.31 3.25 3.91
Timor-Leste 1.71 1.63 1.67 1.50 1.60 1.67 2.25
Togo 2.25 2.10 2.25 2.40 2.40 2.20 2.11
Tunisia 2.76 2.83 2.83 2.86 2.43 2.83 2.80
Turkey 3.15 3.00 2.94 3.07 3.29 3.27 3.38
Uganda 2.49 2.21 2.17 2.42 2.55 2.33 3.29
Ukraine 2.55 2.22 2.35 2.53 2.41 2.53 3.31
United Arab Emirates 3.73 3.52 3.80 3.68 3.67 3.61 4.12
United Kingdom 3.99 3.74 4.05 3.85 4.02 4.10 4.25
United States 3.84 3.52 4.07 3.58 3.85 4.01 4.11
Uruguay 2.51 2.29 2.38 2.40 2.45 2.57 3.00
Uzbekistan 2.16 1.94 2.00 2.07 2.15 2.08 2.73
Venezuela 2.62 2.37 2.51 2.69 2.59 2.54 3.03
Vietnam 2.89 2.89 2.50 3.00 2.80 2.90 3.22
Yemen 2.29 2.18 2.08 2.20 2.22 2.30 2.78
Zambia 2.37 2.08 2.00 2.40 2.44 2.80 2.50
Zimbabwe 2.29 1.92 1.87 2.27 2.21 2.64 2.85

6.3 City of Dallas Measures
The City of Dallas has a performance measurement system called "Dallas Measures". "Dallas Measures" collects data on program activities and accomplishment progress towards the City of Dallas' goals and objectives. There are six Key Focus Areas for which "Dallas Measures" collects data:
1. Public Safety – consists of the Court and Detention Services, Emergency Management, Judiciary, Fire Department and Police Department.
2. Economic Vibrancy – consists of Aviation, Code Compliance Services, Convention & Event Services, Development Services, Housing, Office of Economic Development, Park & Recreation, Public Works, Street Services and Water Utilities.
3. Clean Healthy Environment – City Attorney, Code Compliance, Court & Detention Services, Environmental & Health Services, Environmental Quality, Sanitation and Water Services.


4. Culture, Arts and Recreation – Cultural Affairs, Environmental & Health Services, Library, Park & Recreation and Public Works.
5. Educational Enhancements – Cultural Affairs, Environmental & Health Services and Library.
6. Efficient, Effective and Economical Government – Business Development & Procurement, City Attorney, City Auditor, City Secretary, Civil Service, Human Resources, Public Information Office, Strategic Customer Services and Water Utilities.

For the purposes of this research, only those performance measures applicable to the Dallas-Fort Worth, Texas SoS and its sustainability needs are discussed. The key focus areas' performance measures, their definitions and the data are as follows:
6.3.1 Economic Vibrancy Focus Area
The Business Development and Procurement Services – Vendor Development metric measures how the City of Dallas does business with Small and Minority/Women Business Enterprises (MWBE), in terms of the number of awareness events held, training sessions per Full Time Equivalent (FTE), the percentage of MWBE vendors registered online without assistance and the number of training sessions conducted for MWBE vendors. The overall Target Index (actual performance as a percentage of the target) is graded as follows:
o Excellent – 105% or greater
o Good – 90% or greater
o Caution – 80% or greater
o Poor – 75% or greater
o Very Poor – less than 75%
A minimal sketch of this grading follows.
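Because the Target Index and its grading recur throughout "Dallas Measures", here is a minimal sketch of the calculation; the example reproduces the first row of Table 6-4 (23 awareness events against a target of 17).

```python
# A minimal sketch of the Target Index calculation and its grading
# thresholds, as defined in the list above.
def target_index(actual: float, target: float) -> float:
    return 100.0 * actual / target

def grade(index: float) -> str:
    if index >= 105.0:
        return "Excellent"
    if index >= 90.0:
        return "Good"
    if index >= 80.0:
        return "Caution"
    if index >= 75.0:
        return "Poor"
    return "Very Poor"

idx = target_index(23.0, 17.0)
print(f"{idx:.1f}% -> {grade(idx)}")   # 135.3% -> Excellent
```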

Table 6-4 shows the data for the Number of Awareness Events Held. Appendix C, Tables C-1 through C-3, shows the data for the other three performance measures.


Table 6-4: Number of Awareness Events Held

Month/Year       Actual   Target   Target Index   Target Index Grade
October/2008     23.0     17.0     135.3%         Excellent
November/2008    43.0     34.0     126.5%         Excellent
December/2008    61.0     51.0     119.6%         Excellent
January/2009     83.0     68.0     122.1%         Excellent
February/2009    104.0    85.0     122.4%         Excellent
March/2009       123.0    102.0    120.6%         Excellent
April/2009       149.0    120.0    124.2%         Excellent
May/2009         173.0    138.0    125.4%         Excellent
June/2009        198.0    156.0    126.9%         Excellent
July/2009        218.0    174.0    125.3%         Excellent
August/2009      240.0    192.0    125.0%         Excellent
September/2009   263.0    210.0    125.2%         Excellent
October/2009     18.0     14.0     128.6%         Excellent
November/2009    30.0     27.0     111.1%         Excellent
December/2009    39.0     40.0     97.5%          Good
January/2010     58.0     53.0     109.4%         Excellent
February/2010    73.0     66.0     110.6%         Excellent
March/2010       96.0     79.0     121.5%         Excellent
April/2010       110.0    92.0     119.6%         Excellent
May/2010         127.0    105.0    121.0%         Excellent
June/2010        154.0    118.0    130.5%         Excellent
July/2010        162.0    131.0    123.7%         Excellent
August/2010      172.0    144.0    119.4%         Excellent
September/2010   179.0    157.0    114.0%         Excellent
October/2010     14.0     9.0      155.6%         Excellent
November/2010    23.0     19.0     121.1%         Excellent
December/2010    31.0     28.0     110.7%         Excellent
January/2011     40.0     40.0     100.0%         Good
February/2011    49.0     52.0     94.2%          Good
March/2011       60.0     63.0     95.2%          Good
April/2011       77.0     75.0     102.7%         Good
May/2011         87.0     93.0     93.5%          Good
June/2011        100.0    111.0    90.1%          Good
July/2011        110.0    124.0    88.7%          Caution
August/2011      119.0    141.0    84.4%          Caution
September/2011   167.0    157.0    106.3%         Excellent
October/2011     14.0     7.0      200.0%         Excellent
November/2011    23.0     14.0     164.3%         Excellent
December/2011    31.0     19.0     163.2%         Excellent
January/2012     40.0     26.0     153.8%         Excellent
February/2012    49.0     33.0     148.5%         Excellent
March/2012       60.0     40.0     150.0%         Excellent
April/2012       77.0     47.0     163.8%         Excellent
May/2012         87.0     54.0     161.1%         Excellent
June/2012        100.0    61.0     163.9%         Excellent
July/2012        110.0    66.0     166.7%         Excellent
August/2012      119.0    73.0     163.0%         Excellent
September/2012   130.0    80.0     162.5%         Excellent
October/2012     14.0     9.0      155.6%         Excellent
November/2012    21.0     18.0     116.7%         Excellent
December/2012    27.0     27.0     100.0%         Good
January/2013     35.0     36.0     97.2%          Good
February/2013    45.0     44.0     102.3%         Good
March/2013       48.0     52.0     92.3%          Good
April/2013       61.0     60.0     101.7%         Good
May/2013         71.0     68.0     104.4%         Good
June/2013        82.0     76.0     107.9%         Excellent
July/2013        91.0     84.0     108.3%         Excellent
August/2013      97.0     92.0     105.4%         Excellent
September/2013   102.0    100.0    102.0%         Good
October/2013     14.0     10.0     140.0%         Excellent
November/2013    36.0     20.0     180.0%         Excellent
December/2013    48.0     31.0     154.8%         Excellent
January/2014     57.0     42.0     135.7%         Excellent
February/2014    64.0     53.0     120.8%         Excellent
March/2014       75.0     64.0     117.2%         Excellent
April/2014       91.0     75.0     121.3%         Excellent
May/2014         98.0     86.0     114.0%         Excellent
June/2014        104.0    97.0     107.2%         Excellent

6.3.2 Office of Economic Development – Dallas Protocol and World Affairs
The City of Dallas currently has a contract with the World Affairs Council to aid the overall City economic development strategy by hosting international visitors and trade delegations, managing the Sister City and Friendship City Programs and leveraging resources from other international organizations. The following performance measures are recorded as part of this contract:
1) Business Referrals – Protocol – Measures the number of business referrals from Mexico, Canada and China, plus Dallas' top ten trading partners.


2) Inbound Delegations Assisted – Measures the number of business-related inbound delegations assisted to promote international business.
3) City of Dallas (COD) Partnership Events – Measures the number of COD partnership events held annually.

The overall Target Index for each is graded as follows:
o Excellent – 105% or greater
o Good – 90% or greater
o Caution – 80% or greater
o Poor – 75% or greater
o Very Poor – less than 75%

Table 6-5 shows the data for the Number of Business Referrals – Protocol. Appendix D, Tables D-1 and D-2, show the data for the other two measures.

Table 6-5: Number of Business Referrals – Protocol

Fiscal Year/Quarter   Actual   Target   Target Index   Target Index Grade
FY09/Q1               19.0     8.0      200.0%         Excellent
FY09/Q2               19.0     16.0     118.8%         Excellent
FY09/Q3               29.0     24.0     120.8%         Excellent
FY12/Q1               4.0      7.0      57.1%          Very Poor
FY12/Q2               12.0     14.0     85.7%          Caution
FY12/Q3               22.0     23.0     95.7%          Good
FY12/Q4               46.0     30.0     153.3%         Excellent
FY13/Q1               7.0      7.0      100.0%         Good
FY13/Q2               12.0     14.0     85.7%          Caution
FY13/Q3               15.0     23.0     65.2%          Very Poor


6.3.3 Clean Healthy Environment Focus Area
The Environmental Quality – Air Quality Compliance department falls within the Clean Healthy Environment Focus Area. It conducts investigations of industry plants and businesses, gasoline service stations, paint and body shops, used car lots, construction sites, dry cleaners and citizen complaints, targeting air contaminants that have the potential to be injurious or to adversely affect human health and the environment. The following performance measures are recorded:
1) Regulated Source Investigations – Measures the number of regulated source investigations.
2) Percent of complaints resolved after initial investigation – Measures the percentage of complaints resolved after the initial investigation.
3) Percent of facilities in compliance with applicable regulations during the initial investigation – Measures the percentage of facilities that are compliant with the applicable regulations during the initial investigation.
The overall Target Index for each is graded as follows:
o Excellent – 105% or greater
o Good – 90% or greater
o Caution – 80% or greater
o Poor – 75% or greater
o Very Poor – less than 75%

Table 6-6 shows the data for Regulated Source Investigations. Appendix E, Tables E-1 and E-2 show the data for the other two measures.

Table 6-6: Regulated Source Investigations

Month/Year       Actual   Target   Target Index   Target Index Grade
October/2008     62       62       100%           Good
November/2008    124      117      105.6%         Excellent
December/2008    186      172      107.5%         Excellent
January/2009     248      229      107.7%         Excellent
February/2009    313      306      102.2%         Good
March/2009       381      378      100.7%         Good
April/2009       462      455      101.5%         Good
May/2009         533      532      100.1%         Good
June/2009        633      619      102.2%         Good
July/2009        711      706      100.7%         Good
August/2009      853      834      102.2%         Good
October/2009     63       63       100.0%         Good
November/2009    126      111      112.7%         Excellent
December/2009    189      172      109.0%         Excellent
January/2010     252      237      106.0%         Excellent
February/2010    315      303      103.8%         Good
March/2010       383      378      101.3%         Good
April/2010       444      441      100.6%         Good
May/2010         518      504      102.7%         Good
June/2010        583      567      102.8%         Good
July/2010        672      630      106.6%         Excellent
August/2010      734      693      105.9%         Excellent
September/2010   781      755      103.4%         Good
September/2011   825      796      103.6%         Good
October/2011     68       36       147.1%         Excellent
November/2011    136      104      123.5%         Excellent
December/2011    204      152      125.5%         Excellent
January/2012     272      242      111.0%         Excellent
February/2012    340      294      113.5%         Excellent
March/2012       408      357      112.5%         Excellent
April/2012       476      425      110.7%         Excellent
May/2012         544      504      107.4%         Excellent
June/2012        612      566      107.5%         Excellent
July/2012        680      642      105.6%         Excellent
August/2012      758      748      101.3%         Good
September/2012   816      801      101.8%         Good
October/2012     76       68       111.7%         Excellent
November/2012    136      114      119.2%         Excellent
December/2012    204      169      120.7%         Excellent
January/2013     272      225      120.8%         Excellent
February/2013    340      258      131.7%         Excellent
March/2013       408      321      127.1%         Excellent
April/2013       476      404      117.8%         Excellent
May/2013         544      481      113.0%         Excellent
June/2013        612      548      111.6%         Excellent
July/2013        680      673      101.0%         Good
August/2013      774      748      103.4%         Good
September/2013   841      816      103.0%         Good
October/2013     85       68       125.0%         Excellent
November/2013    136      130      104.6%         Excellent
December/2013    237      204      116.1%         Excellent
January/2014     293      272      107.7%         Excellent
February/2014    355      340      104.4%         Good
March/2014       421      408      103.3%         Good
April/2014       476      468      101.7%         Good
May/2014         544      531      102.4%         Good
June/2014        612      583      104.9%         Good

6.3.4 Effective, Efficient and Economical Focus Area The Business Development and Procurement – Purchasing/Contract Management is within the Effective, Efficient and Economical Focus Area. It is governed by the Texas Local Government Procurement Code 252, AD 4-5 and the City Charter. It includes the development of procurement specifications, collaboration within departments and partnering with other governmental agencies. The following performance measures are recorded as part of this contract:

1) Purchasing Transactions – Measures the number of purchasing transactions.
2) Percent of bids advertised within publisher's deadline – Measures the percentage of bids advertised within a publisher's deadline.
3) Percent of requisitions processed in compliance with State Law – Measures the percentage of requisitions processed in compliance with Texas State Law.
4) Number of contracts managed to effectiveness – Measures the number of contracts managed to effectiveness.

The overall Target Index for each is graded as follows:
o Excellent – 105% or greater
o Good – 90% or greater
o Caution – 80% or greater
o Poor – 75% or greater
o Very Poor – less than 75%


Table 6-7 shows the data for the Number of Purchasing Transactions. Appendix F, Tables F-1 through F-3, shows the data for the other three measures.

Table 6-7: Number of Purchasing Transactions

Month/Year       Actual   Target   Target Index   Target Index Grade
October/2008     1,813    1,329    136.4%         Excellent
November/2008    3,452    2,658    129.9%         Excellent
December/2008    5,333    3,987    133.8%         Excellent
January/2009     7,006    5,316    131.8%         Excellent
February/2009    8,665    6,645    130.4%         Excellent
March/2009       10,419   7,974    130.7%         Excellent
April/2009       12,080   9,303    129.9%         Excellent
May/2009         13,512   10,632   127.1%         Excellent
June/2009        14,888   11,961   124.5%         Excellent
July/2009        16,174   13,290   121.7%         Excellent
August/2009      17,747   14,620   121.4%         Excellent
September/2009   18,776   15,950   117.7%         Excellent
October/2009     1,330    1,301    102.2%         Good
November/2009    2,659    2,263    117.4%         Excellent
December/2009    3,988    3,269    121.9%         Excellent
January/2010     5,317    4,238    125.4%         Excellent
February/2010    6,646    5,126    129.6%         Excellent
March/2010       7,975    6,002    132.8%         Excellent
April/2010       9,304    6,659    139.7%         Excellent
May/2010         10,633   7,263    146.3%         Excellent
June/2010        11,962   7,978    149.9%         Excellent
July/2010        13,291   8,623    154.1%         Excellent
August/2010      14,620   9,400    155.5%         Excellent
September/2010   15,950   10,007   159.3%         Excellent
September/2011   16,583   11,500   144.2%         Excellent
October/2011     845      712      118.8%         Excellent
November/2011    1,660    1,420    116.9%         Excellent
December/2011    2,176    2,128    102.3%         Good
January/2012     3,235    2,836    114.1%         Excellent
February/2012    4,210    3,544    118.8%         Excellent
March/2012       5,102    4,252    120.0%         Excellent
April/2012       6,056    4,960    122.1%         Excellent
May/2012         7,113    5,668    125.5%         Excellent
June/2012        8,110    6,376    127.2%         Excellent
July/2012        9,180    7,084    129.6%         Excellent
August/2012      10,036   7,792    128.8%         Excellent
September/2012   11,092   8,500    130.5%         Excellent
October/2012     932      712      130.1%         Excellent
November/2012    1,943    1,420    136.9%         Excellent
December/2012    2,938    2,128    138.1%         Excellent
January/2013     3,984    2,836    140.5%         Excellent
February/2013    5,011    3,544    141.4%         Excellent
March/2013       6,161    4,252    144.9%         Excellent
April/2013       7,325    4,960    147.7%         Excellent
May/2013         8,513    5,668    150.2%         Excellent
June/2013        9,710    6,376    152.3%         Excellent
July/2013        10,866   7,084    153.4%         Excellent
August/2013      11,999   7,792    154.0%         Excellent
September/2013   13,175   8,500    155.0%         Excellent
October/2013     578      425      136.2%         Excellent
November/2013    1,154    850      135.8%         Excellent
December/2013    1,785    1,275    140.0%         Excellent
January/2014     2,286    1,700    134.5%         Excellent
February/2014    2,853    2,125    134.3%         Excellent
March/2014       3,417    2,550    134.0%         Excellent
April/2014       3,977    2,975    133.7%         Excellent
May/2014         4,505    3,400    132.5%         Excellent
June/2014        5,068    3,825    132.5%         Excellent

6.4 Texas Tech University Institutional Review Board Proposal
An on-line survey was conducted to gather performance measurement data from the Dallas-Fort Worth SoS. The survey was sent to 100 respondents: Dallas-Fort Worth SoS city managers, county commissioners, universities, community colleges, not-for-profit companies and for-profit companies. The Qualtrics software tool was used to collect the on-line survey results. Because some of the respondents might be interviewed, and prior to the survey being sent, Dr. Atila Ertas, Principal Investigator, and I submitted a proposal to the Texas Tech University Human Research Protection Program. This proposal required approval from the Texas Tech University Institutional Review Board to protect the rights and welfare of the human subjects participating in this research. The approved proposal is described in Appendix F. Chapter 6 has presented the primary and secondary data collected for current performance measures: CPI data from DoD acquisition programs, LPI data from the World Bank and the current performance measures collected by the City of Dallas, Texas. The data collected and shown in Chapter 6 and Appendices A through E is analyzed and discussed in Chapter 7.


CHAPTER 7
RESULTS AND DISCUSSION

Chapter 6 and Appendices A through F show the primary and secondary data collected; this chapter discusses and analyzes that data. As stated in Chapter 6, a CPI of less than 1.0 indicates a cost overrun: the cost of completing the work is higher than planned. A CPI greater than 1.0 indicates that the work is being completed for less than planned, and a CPI equal to 1.0 indicates that the cost of completing the work is exactly on plan. Reviewing the CPI data for the thirty-five (35) major DoD acquisition programs, the overall results show programs initially performing under budget, then tending to perform over budget as they move forward, and eventually finishing the year performing under budget. The primary reasons these programs initially perform under budget, or show positive cost variances, are that: 1) programs are working with a fully staffed team and budgets are set, and 2) the customer and contractor agree on how to execute the program. As a program moves forward, various issues and risks begin to occur that cause negative cost variances. These issues include: 1) performance requirements are not well defined, which delays the writing of satisfactory performance specifications, and 2) additional staffing is needed. Once a program mitigates these issues and risks, it eventually stabilizes and the negative cost variances turn positive again. CPI would be a good measure for integrating SoSs with regards to sustainability.
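As a minimal sketch of the CPI interpretation above, assuming the standard earned-value definition of CPI as budgeted cost of work performed (BCWP) divided by actual cost of work performed (ACWP); the example values are illustrative, not drawn from the DoD SARs:

```python
# Sketch of the CPI interpretation used in this chapter, assuming the
# standard earned-value definition CPI = BCWP / ACWP.

def cpi(bcwp: float, acwp: float) -> float:
    """Cost Performance Index: earned value divided by actual cost."""
    return bcwp / acwp

def interpret_cpi(value: float) -> str:
    if value < 1.0:
        return "cost overrun: completing the work costs more than planned"
    if value > 1.0:
        return "under budget: completing the work costs less than planned"
    return "on plan"

# Illustrative values only.
print(interpret_cpi(cpi(bcwp=98.0, acwp=100.0)))
```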

The LPI scores, which range from 1 (worst) to 5 (best), were very interesting for the timeframes examined. As mentioned earlier, the index has six dimensions, each scored from 1 (worst) to 5 (best):

1. CUSTOMS: Efficiency of the customs clearance process (i.e., speed, simplicity and predictability of formalities) by border control agencies.
2. INFRASTRUCTURE: Quality of trade and transport related infrastructure (e.g., ports, railroads, roads, information technology).
3. EASE OF SHIPMENT: Ease of arranging competitively priced shipments.
4. LOGISTICS SERVICES: Competence and quality of logistics services (e.g., transport operators, customs brokers).
5. EASE OF TRACKING: Ability to track and trace consignments.
6. TIMELINESS: Timeliness of shipments in reaching destination within the scheduled or expected delivery time.

I grouped the overall LPI scores into four ranges:

- 1.00 – 1.99 (Worst)
- 2.00 – 2.99 (Marginal)
- 3.00 – 3.99 (Good)
- 4.00 – 5.00 (Best)

The following tables show the number of countries whose overall LPI scores fall in each group for the calendar years 2007, 2010, 2012 and 2014 (a short sketch of this grouping follows Table 7-4).

Table 7-1: 2007 LPI Scores
1.00 – 1.99 (Worst)   2.00 – 2.99 (Marginal)   3.00 – 3.99 (Good)   4.00 – 5.00 (Best)
        9                      98                      34                    8

Table 7-2: 2010 LPI Scores
1.00 – 1.99 (Worst)   2.00 – 2.99 (Marginal)   3.00 – 3.99 (Good)   4.00 – 5.00 (Best)
        3                     101                      38                    4

Table 7-3: 2012 LPI Scores
1.00 – 1.99 (Worst)   2.00 – 2.99 (Marginal)   3.00 – 3.99 (Good)   4.00 – 5.00 (Best)
        2                      99                      48                    6

Table 7-4: 2014 LPI Scores
1.00 – 1.99 (Worst)   2.00 – 2.99 (Marginal)   3.00 – 3.99 (Good)   4.00 – 5.00 (Best)
        1                      97                      53                    5
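As referenced above, the grouping behind Tables 7-1 through 7-4 reduces to a simple bucketing function. A minimal sketch follows; the scores in the list are hypothetical placeholders, not actual World Bank data:

```python
# Sketch of the LPI grouping used for Tables 7-1 through 7-4.
from collections import Counter

def lpi_group(score: float) -> str:
    if score < 2.00:
        return "1.00-1.99 (Worst)"
    if score < 3.00:
        return "2.00-2.99 (Marginal)"
    if score < 4.00:
        return "3.00-3.99 (Good)"
    return "4.00-5.00 (Best)"

sample_scores = [4.19, 3.84, 2.65, 1.77]   # hypothetical country scores
print(Counter(lpi_group(s) for s in sample_scores))
```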

The countries with the Best (4.00 – 5.00) scores are Austria, Belgium, Denmark, Finland, Germany, Hong Kong, Japan, Netherlands, Singapore, Sweden, Switzerland and the United Kingdom. The countries with the Worst (1.00 – 1.99) scores are Afghanistan, Burundi, Chad, Democratic Republic of Congo, Djibouti, Eritrea, Niger, Rwanda, Sierra Leone, Somalia, Tajikistan and Timor-Leste. The countries with the Best scores are economically strong and strive to meet sustainability needs, while the countries with the Worst scores are not economically strong and struggle to meet sustainability needs. LPI would be a good measure for integrating SoSs with regards to sustainability.

The City of Dallas performance measures mentioned earlier produced some interesting results. For the Economic Vibrancy Focus Area, specifically Business Development and Procurement Services – Vendor Development, the number of awareness events held and training sessions/FTE earned a consistent Target Index Grade of Excellent. The number of training sessions conducted for MWBE vendors had mostly Target Index Grades of Excellent, but also a few Caution and Poor grades. For the Office of Economic Development – Dallas Protocol and World Affairs Council Contract, the performance measure number of business related inbound delegations assisted to promote international business had Target Index Grades of Good and Excellent. The performance measure Number of Business Referrals – Protocol had Target Index Grades ranging from Very Poor to Excellent. Except for a few Cautions, the performance measure Number of COD Partnership Events had Target Index Grades of Good and Excellent.

For the Clean Healthy Environment Focus Area, the performance measure Regulated Source Investigations had mostly Target Index Grades of Good and Excellent, while the performance measure percent of facilities in compliance with applicable regulations during the initial investigation had Target Index Grades that were all Good. Except for a few Poor and Caution grades, the performance measure Percent of Complaints resolved after initial investigation had Target Index Grades of mostly Good and Excellent. For the Effective, Efficient and Economical Focus Area, the performance measures Number of Purchasing Transactions and Percentage of bids advertised within publisher's deadline both had Good and Excellent Target Index Grades. The performance measure Number of Contracts managed to Effectiveness had Good and Excellent Target Index Grades, but 34% of its grades were Caution.

For the on-line survey sent to Dallas-Fort Worth SoS city managers, county commissioners, universities, community colleges, not-for-profit and for-profit companies, 90 of the 100 respondents completed it. The survey questions were written by the author to capture which performance measures are currently collected and which performance measurement models or tools are currently implemented. Question 1 asked respondents to classify their organization (Educational Institution (Community College or University), For-Profit or Non-Profit Company, or Government Organization (City, County, State or Federal)). Question 2 asked for the number of employees in the organization. Question 3, "What kind(s) of performance measurement is (are) implemented in your organization?", drew interesting responses. Question 4 asked "Which performance measurement models or tools are used in your organization?" Question 5 asked "What are the initial reasons for your organization to implement its performance measurement system?" Question 6 asked respondents to list the top five most important performance measures in their organization and to define them. Question 7 asked "For any of the top five most important performance measures listed in Question 6, are they publicly accessible?" The results of the on-line survey are shown in Appendix G, Table G-1.


Reviewing the results of Question 3, the top performance measures implemented, in order of importance, are: 1) Financial Performance Measurement, 2) Sustainability Measurement, 3) Process Management Measurement, 4) Customer Satisfaction Measurement, 5) Human Resource Performance Measurement and 6) Strategy Performance Measurement. Logistics performance was also listed under the Other category. It is not surprising that Financial Performance was the top measure, since all of these systems recognize that satisfactory financial performance keeps them in business. For Question 4, the following are the results:

[Bar chart of Question 4 results: number of respondents using each performance measurement model; counts range from 0 to 48.]

The Benchmarking Systems Model was the most used performance model in the survey. This was somewhat surprising; I had expected the Balanced Scorecard Method and ISO 9000 Certification to be the most commonly used models.


For Question 5, the following are the results:

[Bar chart of Question 5 results: number of respondents citing each reason for implementing a performance measurement system; counts range from 70 to 95.]

The Question 5 results are not too surprising; most of the reasons the on-line respondents gave were what I expected.

For Question 6, all on-line survey respondents listed the following:

a. Cost/Financial Measure b. Customer Satisfaction c. Logistics d. Sustainability e. Return on Invested Capital f. Measures relating to Environmental Issues (energy usage, water usage, etc.)


The Standard Balanced Scorecard Template was applied both to the SoS parasitic performance measures described in Section 1.3 and to the on-line survey data. The Standard Balanced Scorecard for the SoS parasitic data is shown in Table 7-5, and the Standard Balanced Scorecard for the on-line survey results is shown in Table 7-6 (in both tables, the collected measures appear in the Measures column).

Table 7-5 Standard Balanced Scorecard for SoS parasitic measures

Financial Perspective (To succeed financially, how should we appear to our shareholders?)
  Objectives: Lower costs; Increase profitability
  Measures: Financial Performance; Contract Accountability
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Customer Perspective (To achieve our strategy, how should we appear to our customers?)
  Objectives: Lower wait time; Improve customer retention
  Measures: Customer satisfaction; Corporate social performance; Operation Performance; Human Resource Performance; R&D Performance
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Internal Business Perspective (To satisfy our shareholders and customers, what business processes must we excel at?)
  Objectives: Increase process efficiency; Improve customer retention
  Measures: Org performance; Engineering productivity; R&D Productivity; Resource productivity
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Innovation & Learning Perspective (To achieve our strategy, how will we sustain our ability to change and improve?)
  Objectives: Improve knowledge and skills; Improve tools & technology
  Measures: Environmental sustainability; Sustainability Mgmt
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions


Table 7-6: Standard Balanced Scorecard for on-line survey results

Financial Perspective (To succeed financially, how should we appear to our shareholders?)
  Objectives: Lower costs; Increase profitability
  Measures: Financial Performance Measurement
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Customer Perspective (To achieve our strategy, how should we appear to our customers?)
  Objectives: Lower wait time; Improve customer retention
  Measures: Sustainability measurement; Customer Satisfaction measurement
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Internal Business Perspective (To satisfy our shareholders and customers, what business processes must we excel at?)
  Objectives: Increase process efficiency; Improve customer retention
  Measures: Human Resource Performance Measurement
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Innovation & Learning Perspective (To achieve our strategy, how will we sustain our ability to change and improve?)
  Objectives: Improve knowledge and skills; Improve tools & technology
  Measures: Strategy Performance measurement
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions


The results address the research questions posed in Chapter 1. One research question was "What proposed measure, other than the traditional performance measures, could measure the needs of for-profit organizations, not-for-profit organizations, Government Agencies and Educational Institutions?" Using the BSC, the following performance measures are proposed to address this research question (a short sketch of the calculations follows the list):

- Cost/Logistics Index – measures the actual cost of the SoS against the logistics information. It is calculated as (Actual cost / Budgeted cost) / (Logistics Performance Index).
- Performance/Logistics Index – measures the performance of the SoS against the logistics information. It is calculated as (SoS performance) / (Logistics Performance Index).
- Availability/Logistics Index – measures the availability of the SoS against the logistics information. It is calculated as (SoS availability) / (Logistics Performance Index).
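A minimal sketch of these three definitions, written directly from the formulas above; the example values are the City of Allen, TX entry from Table 8-1:

```python
# Sketch of the three proposed SoS measures, taken directly from the
# definitions above. "performance" and "availability" are the 0-to-1
# system ratios described in this chapter; lpi is the matched World Bank score.

def cost_logistics_index(actual_cost: float, budgeted_cost: float, lpi: float) -> float:
    """CLI = (actual cost / budgeted cost) / LPI."""
    return (actual_cost / budgeted_cost) / lpi

def performance_logistics_index(performance: float, lpi: float) -> float:
    """PLI = SoS performance / LPI."""
    return performance / lpi

def availability_logistics_index(availability: float, lpi: float) -> float:
    """ALI = SoS availability / LPI."""
    return availability / lpi

# City of Allen, TX row of Table 8-1: cost ratio 1.00, LPI 3.86.
print(round(cost_logistics_index(1.00, 1.00, 3.86), 6))   # 0.259067
```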

The second research question was "What proposed measures, other than the traditional performance measures, could integrate the needs of the local Dallas-Fort Worth, Texas SoS?" The same proposed performance measures could also integrate the needs of the local Dallas-Fort Worth, Texas SoS.

Aim 1 is "What are the current performance measures in industry, government and educational institutions?" As described in Chapters 4, 5 and 6 and Appendices A through E and Appendix G, the current performance measures in industry, government and educational institutions are as follows:

- Cost Performance Index
- Logistics Performance Index
- City of Dallas, Texas Performance Measures
- Corporate Social Performance
- Engineering Productivity
- Customer Satisfaction
- Sustainability
- Return on Invested Capital
- Measures relating to Environmental issues (energy usage, water usage, etc.)

Aim 2 is "Identify the relation of these Systems from a Systems-of-Systems perspective and discuss how parasitic and symbiotic behaviors affect it." The Dallas-Fort Worth, Texas SoS contains many systems (i.e., for-profit organizations, not-for-profit organizations, Government Agencies and Educational Institutions).

Aim 3 is "Identify and Propose Performance Measures that integrate the needs of these various SoS with regards to sustainability." The performance measures proposed above integrate the needs of the Dallas-Fort Worth, Texas SoS with regards to sustainability. The Cost/Logistics Index measures the actual cost of the SoS against the logistics information and is calculated as (Actual cost / Budgeted cost) / (Logistics Performance Index). For each system in the Dallas-Fort Worth SoS, the monthly actual costs expended are divided by the budgeted costs, and that value is then divided by the Logistics Performance Index, which is defined in Section 6.2 and in this chapter. The data for this measure can be found in each system's on-line database. There were 90 respondents in the on-line survey. Assigning CPICTD data from the DoD SARs and LPI scores from the World Bank to match each system in the Dallas-Fort Worth SoS as closely as possible, the Cost/Logistics Index measures for the 90 respondents are shown in Figures 7-1 and 7-2. Figure 7-1 shows the Cost/Logistics Index for fifty-four systems in the Dallas-Fort Worth, TX SoS, and Figure 7-2 shows the Cost/Logistics Index for the remaining thirty-six systems.


[Chart: grouped bars of CPI, LPI and CLI values (vertical axis 0.00 to 4.50) for fifty-four Dallas-Fort Worth, TX SoS systems, including area counties, cities, for-profit companies, universities, community colleges and not-for-profit organizations.]

Figure 7-1 Cost/Logistic Index Measures for fifty-four systems in Dallas-Fort Worth, TX SoS

Figure 7-2 Cost/Logistic Index Measures for thirty-six systems in Dallas-Fort Worth, TX SoS

This data shows that a higher Cost Index (greater than 1.0) paired with a higher LPI (scores of roughly 3.5 or more) yields a lower Cost/Logistics Index measure, which is the desired outcome. Lower Cost/Logistics Index measures are good scores, while higher scores are not. The scores range from 0.245 to 0.644. Summing all ninety Cost/Logistics Index measures and dividing by ninety gives an average Cost/Logistics Index measure of 0.3478 for the Dallas-Fort Worth, TX SoS. This measure is a good indicator of how the Dallas-Fort Worth, TX SoS is performing from a cost/logistics viewpoint: a lower value implies the SoS is performing within its cost budget while also managing the logistics that come with it.

The Performance/Logistics Index measures the performance of the Dallas-Fort Worth, TX SoS against the logistics information. For each system in the Dallas-Fort Worth, TX SoS, performance is defined as (system operational time / system non-operational time) / (Logistics Performance Index): the system operational time is divided by the system non-operational time, and that value is then divided by the Logistics Performance Index, which is defined in Section 6.2 and in this chapter. There were 90 respondents in the on-line survey. Assigning values from 0 to 1.0 for each system's performance in the Dallas-Fort Worth, TX SoS and matching LPI scores from the World Bank to each system as closely as possible, the Performance/Logistics Index measures for the 90 respondents are shown in Figures 7-3 and 7-4. Figure 7-3 shows the Performance/Logistics Index for fifty-four systems in the Dallas-Fort Worth, TX SoS, and Figure 7-4 shows the Performance/Logistics Index for the remaining thirty-six systems.

Figure 7-3 Performance/Logistic Index Measures for fifty-four systems in Dallas-Fort Worth, TX SoS


Figure 7-4 Performance/Logistic Index Measures for thirty-six systems in Dallas-Fort Worth, TX SoS

This data shows that a higher Performance Index (values approaching 1.0) paired with a higher LPI (scores of roughly 3.5 or more) yields a lower Performance/Logistics Index measure, which is the desired outcome. Lower Performance/Logistics Index measures are good scores, while higher scores are not. The scores range from 0.195 to 0.379. Summing all ninety Performance/Logistics Index measures and dividing by ninety gives an average Performance/Logistics Index measure of 0.270 for the Dallas-Fort Worth, TX SoS. This measure is a good indicator of how the Dallas-Fort Worth, TX SoS is performing from a performance/logistics viewpoint: a lower value implies the SoS is performing as required while also managing the logistics that come with it.
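The SoS-level figure quoted above is a simple arithmetic mean of the per-system values. A minimal sketch follows; the list is illustrative, with the actual per-system values plotted in Figures 7-1 through 7-4:

```python
# Sketch of the SoS-level roll-up used above: the average of the
# per-system index values over all ninety respondents.

def sos_average(per_system_values):
    """Average an index (CLI, PLI or ALI) across every system in the SoS."""
    return sum(per_system_values) / len(per_system_values)

pli_values = [0.195, 0.208, 0.270, 0.379]   # hypothetical per-system PLIs
print(round(sos_average(pli_values), 3))     # 0.263
```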


The Availability/Logistics Index measures the availability of the Dallas-Fort Worth, TX SoS against the logistics information. For each system in the Dallas-Fort Worth, TX SoS, availability is defined as (system uptime / system downtime) / (Logistics Performance Index): the system uptime is divided by the system downtime, and that value is then divided by the Logistics Performance Index, which is defined in Section 6.2 and in this chapter. There were 90 respondents in the on-line survey. Assigning values from 0 to 1.0 for each system's availability in the Dallas-Fort Worth, TX SoS and matching LPI scores from the World Bank to each system as closely as possible, the Availability/Logistics Index measures for the 90 respondents are shown in Figures 7-5 and 7-6. Figure 7-5 shows the Availability/Logistics Index for fifty-four systems in the Dallas-Fort Worth, TX SoS, and Figure 7-6 shows the Availability/Logistics Index for the remaining thirty-six systems.

Figure 7-5 Availability/Logistic Index Measures for fifty-four systems in Dallas-Fort Worth, TX SoS


Figure 7-6 Availability/Logistic Index Measures for thirty-six systems in Dallas-Fort Worth, TX SoS

This data shows that a higher Availability Index (values approaching 1.0) paired with a higher LPI (scores of roughly 3.5 or more) yields a lower Availability/Logistics Index measure, which is the desired outcome. Lower Availability/Logistics Index measures are good scores, while higher scores are not. The scores range from 0.212 to 0.418. Summing all ninety Availability/Logistics Index measures and dividing by ninety gives an average Availability/Logistics Index measure of 0.278 for the Dallas-Fort Worth, TX SoS. This measure is a good indicator of how the Dallas-Fort Worth, TX SoS is performing from an availability/logistics viewpoint: a lower value implies the SoS is performing as required while also managing the logistics that come with it.

In summary, this chapter has discussed the LPI scores, the City of Dallas, Texas current performance measures, the on-line survey results, and the use of the Standard Balanced Scorecard for the collected data and the survey results. These results were used to calculate the three performance measures based on the LPI scores, the on-line survey results, and the DoD SARs. The new performance measures are discussed further in Chapter 8.


CHAPTER 8
CONCLUSIONS, FUTURE WORK AND CONTRIBUTIONS

The results of this research, as described in Chapter 7, show that traditional performance measures currently do not integrate the needs of the local Dallas-Fort Worth, Texas SoS and the needs of for-profit organizations, government agencies and educational institutions, nor do they address current and future sustainability concerns. As sustainability concerns and issues become pervasive, the relationship between productivity/efficiency and effectiveness in the local Metroplex SoS becomes very important. One proposed approach that integrates the needs of the local Metroplex SoS with regards to sustainability is a balanced scorecard measurement system. Such a system could be used to measure performance and enhance accountability in government public administration, using four perspectives to measure government performance: 1) financial responsibility, 2) service to citizens, 3) work process improvement and 4) learning and growth of employees. A comparison of these measures of productivity and performance provides insight into the potential challenges of measurement for a holistic System-of-Systems. Based on the data and survey results collected and using the BSC Template, the following performance measures are proposed for the Dallas-Fort Worth, Texas SoS:

1. Cost/Logistics Index – measures the actual cost of the SoS against the logistics information. It is calculated as (Actual cost / Budgeted cost) / (Logistics Performance Index).
2. Performance/Logistics Index – measures the performance of the Dallas-Fort Worth, TX SoS against the logistics information. For each system in the Dallas-Fort Worth, TX SoS, performance is defined as (system operational time / system non-operational time) / (Logistics Performance Index).
3. Availability/Logistics Index – measures the availability of the Dallas-Fort Worth, TX SoS against the logistics information. For each system in the Dallas-Fort Worth, TX SoS, availability is defined as (system uptime / system downtime) / (Logistics Performance Index).

To replicate this study and its techniques in a different geographic area, the following tasks are required (a brief sketch of this pipeline follows the list):

- Define the local System-of-Systems with regard to the for-profit/non-profit companies, governmental agencies and educational institutions it contains.
- Collect cost, performance, availability and logistics performance data as defined in Chapter 7.
- Use the definitions and equations for the three performance measures defined in Chapter 7 to calculate each performance measure.
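A minimal end-to-end sketch of that replication pipeline, assuming each local system is recorded with its cost ratio, performance, availability and matched LPI score; the system name and values below are hypothetical:

```python
# Sketch of the replication methodology: record each local system and
# compute the three proposed measures from the Chapter 7 definitions.

from dataclasses import dataclass

@dataclass
class SoSystem:
    name: str
    cost_ratio: float      # actual cost / budgeted cost
    performance: float     # 0-to-1 ratio defined in Chapter 7
    availability: float    # 0-to-1 ratio defined in Chapter 7
    lpi: float             # matched World Bank LPI score

def measures(s: SoSystem) -> dict:
    return {
        "CLI": s.cost_ratio / s.lpi,
        "PLI": s.performance / s.lpi,
        "ALI": s.availability / s.lpi,
    }

# Hypothetical system in a different geographic SoS:
print(measures(SoSystem("Example City", 0.99, 0.90, 0.92, 3.80)))
```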

Figure 8-1 maps the three performance measures onto the BSC, identifying which performance measure fits each BSC perspective.


Financial Perspective (To succeed financially, how should we appear to our shareholders?)
  Objectives: Lower costs; Increase profitability
  Measures: Cost Logistics Index
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Customer Perspective (To achieve our strategy, how should we appear to our customers?)
  Objectives: Lower wait time; Improve customer retention
  Measures: Performance Logistics Index; Availability Logistics Index
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Internal Business Perspective (To satisfy our shareholders and customers, what business processes must we excel at?)
  Objectives: Increase process efficiency; Improve customer retention
  Measures: Cost Logistics Index; Availability Logistics Index; Performance Logistics Index
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Innovation & Learning Perspective (To achieve our strategy, how will we sustain our ability to change and improve?)
  Objectives: Improve knowledge and skills; Improve tools & technology
  Measures: Availability Logistics Index; Performance Logistics Index
  Targets: For-profit/non-profit companies; Government agencies; Educational institutions

Figure 8-1 BSC Performance Measures

When the on-line survey results are viewed through the three new performance measures, the following conclusions are reached:


- For the Cost Logistics Index performance measure, the lower (better) scores were represented by for-profit companies (e.g., Lockheed Martin, L3 Mustang Technology, Hewlett-Packard), local governments (e.g., City of Dallas, City of Plano, City of McKinney) and educational institutions, specifically community colleges (e.g., El Centro Community College, Richland Community College, Brookhaven Community College). The local major universities (e.g., Texas Christian University and Southern Methodist University) had higher scores than the local community colleges. The not-for-profit companies (e.g., North Texas Food Bank, Catholic Charities of Dallas, Inc. and Susan G. Komen Breast Cancer Foundation) had higher scores than the for-profit companies, and the local county governments (e.g., Collin County, Hunt County and Tarrant County) had higher scores than the local city governments.
- For the Performance Logistics Index performance measure, the lower scores were represented by some not-for-profit companies (e.g., Dallas Can Academy, The Samaritan Inn, Friends of the Frisco Public Library), for-profit companies (e.g., Intuit, Kimberly-Clark, I2 Technologies), local governments (e.g., City of Dallas, City of Fairview and City of Coppell) and local community colleges (e.g., Richland Community College, El Centro Community College and Brookhaven Community College). The higher scores were represented by local universities (e.g., Texas Christian University, Southern Methodist University and the University of Texas-Arlington), the majority of not-for-profit companies (e.g., McKinney Community Development Corporation, Task Force Dagger Foundation and McKinney Education Foundation) and local county governments (e.g., Parker County, Tarrant County and Rockwall County).
- For the Availability Logistics Index performance measure, the lower scores were represented by local city governments (e.g., City of Dallas, City of Fairview and City of Coppell), for-profit companies (e.g., Raytheon, Celanese and Oracle) and some local community colleges (e.g., Richland Community College, Collin County Community College and El Centro Community College). The higher scores were represented by not-for-profit companies (e.g., Dallas Leadership Foundation, Global Future Institute and Task Force Dagger Foundation) and local universities (e.g., University of Texas-Dallas, University of North Texas and Southern Methodist University).

Comparing all three performance measures together, the common theme is that the local universities and the not-for-profit companies have the highest scores on all three. In other words, with regard to managing their respective logistics responsibilities, the local universities and not-for-profit companies tend to overrun their budgets, underperform and are not always available to their customers. The following paragraphs compare the results from the three performance measures to traditional measures and methods.

Traditional measures such as the Cost Performance Index and the Logistics Performance Index measure only an individual system, while the three proposed measures can measure a SoS composed of individual systems. Table 8-1 shows the CPICTD for the B-1B program from October 2003 through September 2004 (representing one system), the top twelve LPI scores for twelve countries in CY 2007 (each country representing one system) and the Cost/Logistics Index scores for the top twelve systems in the Dallas-Fort Worth, TX SoS. Table 8-2 shows the CPICTD for the C-17A program from October 2003 through September 2004 (again one system), the next twelve LPI scores from CY 2007 and the next twelve Cost/Logistics Index scores for the Dallas-Fort Worth, TX SoS. The CLI measure for the Dallas-Fort Worth, TX SoS takes into account both the cost and the logistics of any system in the SoS, while the traditional CPI and LPI measures do not.


Table 8-1 B-1B CPICTD and LPI measures versus CLI measure

B-1B CPICTD   LPI (country)          DFW SoS system                CPI    LPI    CLI
1.00          4.19 (Singapore)       City of Allen, TX             1.00   3.86   0.259067
1.00          4.18 (Netherlands)     Hewlett-Packard               0.97   3.76   0.257979
0.99          4.10 (Germany)         Richland Community College    0.99   3.84   0.257813
0.97          4.08 (Sweden)          City of DeSoto, TX            0.96   3.75   0.256
0.98          4.06 (Austria)         City of McKinney, TX          1.00   3.92   0.255102
0.98          4.02 (Japan)           City of Plano, TX             0.97   3.82   0.253927
0.99          4.02 (Switzerland)     I2 Technologies               0.95   3.75   0.253333
0.99          4.00 (Hong Kong)       City of Dallas, TX            1.02   4.06   0.251232
1.00          3.99 (United Kingdom)  Kimberly-Clark                0.99   3.99   0.24812
1.00          3.92 (Canada)          Lockheed Martin               0.99   4.00   0.2475
1.02          3.91 (Ireland)         L3 Mustang Technology         0.99   4.00   0.2475
1.02          3.89 (Belgium)         El Centro Community College   0.98   3.99   0.245614

Table 8-2 C-17A CPICTD and LPI measures versus CLI measure

C-17A CPICTD   LPI (country)        DFW SoS system               CPI    LPI    CLI
1.00           3.86 (Denmark)       Collin County CC             1.10   3.91   0.28133
0.96           3.86 (Denmark)       Bank of America              0.99   3.54   0.279661
1.01           3.84 (U.S.)          City of Arlington, TX        1.00   3.58   0.27933
1.05           3.82 (Finland)       City of Coppell, TX          1.15   4.19   0.274463
1.06           3.81 (Norway)        City of Frisco, TX           1.02   3.75   0.272
1.12           3.76 (France)        Ericsson                     1.01   3.82   0.264398
1.15           3.75 (New Zealand)   Texas Instruments            1.03   3.91   0.263427
1.33           3.73 (UAE)           City of Farmers Branch, TX   0.99   3.76   0.263298
1.37           3.64 (Taiwan)        Intuit                       1.10   4.18   0.263158
1.55           3.58 (Italy)         City of Fort Worth, TX       0.99   3.79   0.261214
1.60           3.54 (Luxembourg)    Celanese                     0.99   3.82   0.259162
1.60           3.53 (South Africa)  Brookhaven CC                0.99   3.82   0.259162

The proposed measures capture System Performance with regard to managing logistics and sustainability, while traditional measures do not. Table 8-3 shows the City of Dallas Effective, Efficient and Economical Focus Area measure, percentage of bids advertised within the publisher's deadline. Specifically, the Target Index is calculated as the actual percentage of bids advertised within the publisher's deadline divided by the target percentage. The values are expressed as decimal ratios; for example, 1.00 represents 100% and 1.11 represents 111%. This measure closely represents System Performance within the City of Dallas program measures system, covering thirty-seven reporting months between October 2008 and September 2012, and it represents only one system. Table 8-3 also shows the top thirty-seven LPI scores for thirty-seven countries in CY 2010, each country representing one system, alongside Performance/Logistics Index scores for the top thirty-seven systems in the Dallas-Fort Worth, TX SoS.


Table 8-3 System Performance measures versus PLI measure
(System Performance = percentage of bids advertised within publisher's deadline)

Sys. Perf.   LPI (country)             DFW SoS system                       Perf. Index   LPI    PLI
1.00         4.11 (Germany)            Dallas Can Academy                   0.75          3.84   0.195
1.00         4.09 (Singapore)          I2 Technologies                      0.78          3.75   0.208
1.00         4.08 (Sweden)             Richland CC                          0.80          3.84   0.208
1.00         4.07 (Netherlands)        City of Fairview, TX                 0.84          4.02   0.208
1.00         3.98 (Luxembourg)         Kimberly-Clark                       0.84          3.99   0.210
1.00         3.97 (Switzerland)        TX Extension Education Foundation    0.82          3.87   0.211
1.00         3.97 (Japan)              Samaritan Inn McKinney               0.80          3.76   0.212
1.00         3.95 (United Kingdom)     Intuit                               0.89          4.18   0.212
1.00         3.94 (Belgium)            City of Coppell, TX                  0.90          4.19   0.214
1.00         3.93 (Norway)             Community ISD Education Foundation   0.75          3.49   0.214
1.00         3.89 (Ireland)            City of DeSoto, TX                   0.81          3.75   0.216
1.00         3.89 (Finland)            City of Southlake, TX                0.88          3.99   0.220
1.11         3.88 (Hong Kong)          El Centro CC                         0.88          3.99   0.220
1.11         3.87 (Canada)             City of Lucas, TX                    0.92          4.11   0.223
1.11         3.86 (United States)      Lockheed Martin                      0.90          4.00   0.225
1.11         3.85 (Denmark)            City of McKinney, TX                 0.89          3.92   0.227
1.11         3.84 (France)             Raytheon                             0.95          4.18   0.227
1.11         3.84 (Australia)          L3 Mustang Technology                0.91          4.00   0.227
1.11         3.76 (Austria)            City of Carrollton, TX               0.88          3.82   0.230
1.11         3.71 (Taiwan)             Celanese                             0.88          3.82   0.230
1.11         3.65 (New Zealand)        Oracle                               0.95          4.10   0.231
1.11         3.64 (Italy)              Collin County CC                     0.91          3.91   0.232
1.11         3.64 (Republic of Korea)  City of Dallas, TX                   0.95          4.06   0.233
1.11         3.63 (Spain)              Brookhaven CC                        0.90          3.82   0.235
1.11         3.63 (UAE)                Research in Motion                   0.82          3.48   0.235
1.05         3.51 (Czech Republic)     City of Allen, TX                    0.91          3.86   0.235
1.05         3.49 (China)              USAA                                 0.95          4.02   0.236
1.05         3.46 (South Africa)       City of Fort Worth, TX               0.90          3.79   0.237
1.05         3.44 (Poland)             Tarrant County CC                    0.85          3.54   0.240
1.05         3.44 (Malaysia)           North TX Clean Air Coalition         0.81          3.37   0.240
1.05         3.41 (Israel)             City of Addison, TX                  0.91          3.76   0.242
1.05         3.37 (Bahrain)            TX Instruments                       0.95          3.91   0.242
1.05         3.34 (Portugal)           City of Rockwall, TX                 0.91          3.74   0.243
1.05         3.34 (Lebanon)            City of Plano, TX                    0.93          3.82   0.243
1.05         3.22 (Saudi Arabia)       Hewlett-Packard                      0.92          3.76   0.244
1.05         3.22 (Turkey)             City of Frisco, TX                   0.92          3.75   0.245
1.05         3.20 (Brazil)             City of Arlington, TX                0.88          3.58   0.245

The proposed measures also capture Availability Performance with regard to managing logistics and sustainability, while traditional measures do not. Table 8-4 shows the City of Dallas Effective, Efficient and Economical Focus Area measure, number of contracts managed to effectiveness. Specifically, the Target Index is calculated as the actual number of contracts managed to effectiveness divided by the target number. The values are expressed as decimal ratios; for example, 1.05 represents 105% and 1.02 represents 102%. This measure closely represents Availability Performance within the City of Dallas program measures system, covering forty-four reporting months between September 2009 and June 2014, and it represents only one system. Table 8-4 also shows the top forty-four LPI scores for forty-four countries in CY 2012, each country representing one system, alongside Availability/Logistics Index scores for the top forty-four systems in the Dallas-Fort Worth, TX SoS.

Table 8-4 Availability Performance measures versus ALI measure
(Availability Performance = percentage of contracts managed to effectiveness)

Avail. Perf.   LPI (country)           DFW SoS system                       Avail. Index   LPI     ALI
1.05           4.13 (Singapore)        City of Fairview, TX                 0.84           4.02    0.208
0.984          4.12 (Hong Kong)        Intuit                               0.89           4.18    0.212
0.980          4.05 (Finland)          I2 Technologies                      0.80           3.75    0.213
1.00           4.03 (Germany)          City of Coppell, TX                  0.90           4.19    0.214
1.01           4.02 (Netherlands)      City of Southlake, TX                0.88           3.99    0.220
1.02           4.02 (Denmark)          City of Lucas, TX                    0.92           4.11    0.223
1.06           3.98 (Belgium)          Lockheed Martin                      0.90           4.00    0.225
1.10           3.93 (United States)    City of McKinney, TX                 0.89           3.92    0.227
1.13           3.93 (Japan)            Raytheon                             0.95           4.18    0.227
1.16           3.90 (United Kingdom)   L3 Mustang Technology                0.91           4.00    0.227
1.19           3.89 (Austria)          Celanese                             0.88           3.82    0.230
1.22           3.85 (Sweden)           City of Carrollton, TX               0.88           3.82    0.230
1.24           3.85 (Canada)           Kimberly-Clark                       0.92           3.99    0.230
1.40           3.85 (France)           El Centro CC                         0.90           3.89    0.231
1.01           3.82 (Luxembourg)       Oracle                               0.95           4.10    0.231
1.02           3.80 (Switzerland)      Collin County CC                     0.91           3.91    0.232
1.02           3.78 (UAE)              Wise County                          0.90           3.85    0.233
1.00           3.73 (Australia)        City of Dallas, TX                   0.95           4.06    0.234
0.991          3.71 (Taiwan)           Richland CC                          0.90           3.84    0.234
0.991          3.70 (Korea Republic)   Brookhaven CC                        0.90           3.82    0.235
0.998          3.70 (Spain)            City of Allen, TX                    0.91           3.86    0.235
0.974          3.67 (South Africa)     USAA                                 0.95           4.02    0.236
0.970          3.67 (Italy)            Dallas Can Academy                   0.91           3.84    0.237
0.970          3.52 (China)            City of Fort Worth, TX               0.90           3.79    0.237
0.971          3.52 (Ireland)          City of DeSoto, TX                   0.90           3.75    0.240
0.974          3.51 (Turkey)           Tarrant County CC                    0.85           3.54    0.240
0.975          3.50 (Portugal)         North TX Clean Air Coalition         0.81           3.37    0.240
1.00           3.49 (Malaysia)         City of Hurst, TX                    0.93           3.85    0.241
1.00           3.43 (Poland)           City of Addison, TX                  0.91           3.76    0.242
0.945          3.42 (New Zealand)      TX Instruments                       0.95           3.91    0.242
0.945          3.39 (Iceland)          City of Rockwall, TX                 0.91           3.74    0.243
0.903          3.32 (Qatar)            City of Plano, TX                    0.93           3.82    0.243
0.880          3.29 (Slovenia)         Research in Motion                   0.85           3.48    0.244
0.869          3.24 (Cyprus)           Hewlett-Packard                      0.92           3.76    0.244
0.849          3.21 (Bulgaria)         City of Frisco, TX                   0.92           3.75    0.245
0.835          3.18 (Saudi Arabia)     City of Arlington, TX                0.88           3.58    0.245
0.826          3.18 (Thailand)         Ericsson                             0.94           3.82    0.246
0.819          3.17 (Tunisia)          City of Farmers Branch, TX           0.93           3.76    0.247
0.824          3.17 (Chile)            MetroPCS Communications              0.96           3.86    0.248
0.810          3.17 (Hungary)          City of Mesquite, TX                 0.93           3.73    0.249
0.851          3.16 (Croatia)          The Samaritan Inn McKinney           0.94           3.76    0.250
0.845          3.14 (Czech Republic)   Collin County                        0.98           3.89    0.252
0.844          3.13 (Brazil)           Community ISD Education Foundation   0.89           3.49    0.255
0.844          3.06 (Mexico)           TX Extension Education Foundation    0.99           3.879   0.256

The point is that these three performance measures can measure cost, performance and availability for each system in a SoS with regards to sustainability, whereas current measures do not. On a scale from 0 to 0.99, the lower the score for each measure, the better. For example, in the Dallas-Fort Worth, TX SoS, El Centro Community College has the #1 CLI = 0.24, the #12 PLI = 0.22 and the #14 ALI = 0.23; since El Centro Community College is an educational institution, we can research why an educational institution in the Dallas-Fort Worth, TX SoS ranks so high on these three performance measures. Among the non-profit systems, Global Future Institute has the #90 CLI = 0.64, the #90 PLI = 0.37 and the #89 ALI = 0.41; why is this non-profit system ranked so low on these three performance measures? Among the for-profit systems, L3 Mustang Technology has the #2 CLI = 0.24, the #18 PLI = 0.22 and the #10 ALI = 0.22. Among the governmental institutions, the City of Dallas, TX has the #5 CLI = 0.25, the #23 PLI = 0.23 and the #18 ALI = 0.23. Each performance measure indicates how that particular system in a SoS is performing in terms of cost, performance and availability with regards to sustainability. The three proposed measures can also measure any geographic SoS, while traditional measures cannot. For example, these three performance measures could be used to perform a SoS study comparing the geographic SoS of a community that attracts an existing company against that of the community that loses the company. If Texas Instruments, which has its corporate headquarters in the Dallas-Fort Worth, TX SoS, decided to relocate to the Houston, TX SoS, these three performance measures could be used to explain or predict such a move. The methodology to collect the necessary data is explained at the beginning of this chapter.
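Since lower index values are better, ranking the systems in a SoS reduces to an ascending sort. A minimal sketch using the CLI values quoted above follows; note the ranks printed here are within this four-system sample, not the full ninety-system list:

```python
# Sketch of the ranking discussed above: lower index values are better,
# so systems are ranked in ascending order of their index.

cli_by_system = {
    "El Centro Community College": 0.24,
    "L3 Mustang Technology": 0.24,
    "City of Dallas, TX": 0.25,
    "Global Future Institute": 0.64,
}

for rank, (system, cli) in enumerate(sorted(cli_by_system.items(),
                                            key=lambda kv: kv[1]), start=1):
    print(f"#{rank} {system}: CLI = {cli:.2f}")
```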


The intent and contribution of this research is to positively address global sustainability needs with regard to the efficiency and effectiveness of the Dallas-Fort Worth, Texas SoS. Measuring performance in such an environment is critical as sustainability concerns become more pervasive. The proposed performance measures integrate the various needs of the Dallas-Fort Worth, Texas SoS with regard to sustainability concerns and further the research on Transdisciplinary Engineering. Transdisciplinary research and educational programs typically consist of four core courses (design fundamentals, process fundamentals, systems fundamentals and metrics fundamentals); the metrics fundamentals course develops concepts of engineering measurement as well as quality assurance, and the proposed performance measures could be added to the current Transdisciplinary Metrics Fundamentals course. These measures can be used to meet current and future sustainability concerns and enable the Dallas-Fort Worth SoS to truly execute its strategies. For future work, the three performance measures could be used to perform the geographic SoS comparison described above, such as the Texas Instruments relocation example. Further research is also needed to determine future performance measures. This dissertation has primarily laid the groundwork for that future work by locating similar studies, identifying the gap in performance measures used in a SoS of for-profits, governmental agencies, and educational institutions, and targeting future work to provide measures compatible with the balanced scorecard.


REFERENCES

[1] Valerdi, R.; Ross, A.M.; and Rhodes, D.H., "A framework for evolving system of systems engineering," Crosstalk, Vol. 20, pp. 28-30, 2007. [2] Kajikawa, Yuya, "Promoting Interdisciplinary Research and Transdisciplinary Expertise for Realizing Sustainable Society," IEEE 2010, pp 1-6. [3] Hasna, Abdallah M., "Embedding Sustainability in Capstone Engineering Design Projects," IEEE EDUCON Education Engineering 2010-The Future of Global Learning Engineering Education, April 14-16 2010, Madrid, Spain, pp. 1601- 1610. [4] Davis-Mendelow, Steven, PhD., "Sustainability for a Change: An Aerospace Perspective", IEEE 2006, pp. 1-4. [5] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339. [6] Patrick, Barbara A. and French, P. Edward, “Assessing New Public Management’s Focus On Performance Measurement In The Public Sector”, Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 340-369. [7] Wu, Jerry Chun-Teh; Tsai, Hsien-Tang; Shih, Meng-Hsun; Fu, Hwai-Hui, "Government Performance Evaluation using a balanced scorecard with a fuzzy linguistic scale," The Service Industries Journal, Vol, 30, No. 3, March 2010, pp 449-462. [8] Wu, Jerry Chun-Teh; Tsai, Hsien-Tang; Shih, Meng-Hsun; Fu, Hwai-Hui, "Government Performance Evaluation using a balanced scorecard with a fuzzy linguistic scale," The Service Industries Journal, Vol, 30, No. 3, March 2010, pp 449-462. [9] Kajikawa, Yuya, "Promoting Interdisciplinary Research and Transdisciplinary Expertise for Realizing Sustainable Society," IEEE 2010, pp 1-6. [10] Wyrick, David; Natarajan, Ganapathy; Eseonu, Chinweike; Lindeke, Richard and Chen, Hongyi, “Lean Management, Internationalization, Sustainability and Technology Policy: Congruence or Turbulence,” FAIM 2012, pp. 1-9. [11] Marinova, Dora and Newman, Peter, “The changing research funding regime in Australia and academic productivity,” Mathematics and Computers in Simulation 78 (2008), pp. 283-291. [12] Lotz-Sisitka, Heila, “ Education for Sustainable Development and Retention: Unravelling a research agenda,” Springer Science Business Media, Vol. 2010, 13 June 2010. [13] Hasna, Abdallah M., “Embedding Sustainability in Capstone Engineering Design Projects,” IEEE EDUCON Education Engineering 2010.

124 Texas Tech University, Jon Ilseng, December 2015

[14] Abbott, Kenneth W., “Engaging the public and private in global sustainability governance,” International Affairs 88: 3 (2012), The Royal Institute of International Affairs, pp. 543-564. [15] Carvalho, Joaquim Francisco de, “Measuring economic performance, social progress and sustainability using an index”, Renewable and Sustainable Energy Reviews 15 (2011) pp. 1073–1079. [16] Munier, N., “Methodology to select a set of urban sustainability indicators to measure the state of the city, and performance assessment”, Ecological Indicators 11 (2011) 1020–1026. [17] Moldan, Bedrich; Janouskova, Svatava and Hak, Tomas, "How to Understand and Measure Environmental Sustainability: Indicators and Targets", Elsevier Ecological Indicators 17 (2012) pp. 4-13. [18] Idoro, Godwin Iroroakpo, “Sustainability of Mechanisation in the Nigerian Construction Industry”, Journal of Civil Engineering and Management, 2012 Volume 18(1): 91–105. [19] Char, Murali D.R. and David, Parthiban, “Sustaining Superior Performance In An Emerging Economy: An Empirical Test in the Indian Context”, Strategic Management Journal, 33: 217-229 (2012). [20] Fox, Patricia; Hundley, Stephen; Cowan, David Jan; Tabasi, Joe and Goodman, David, “Teaching Sustainability: Course, Program and Degree Considerations”, PICMET 2009 Proceedings, August 2-6, Portland, USA © 2009 PICMET. [21] Searcy, Cory, “Corporate Sustainability Performance Measurement,” Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics, San Antonio, TX, USA - October 2009. [22] Yutaka Hata, Yutaka,; Kobashi, Syoji and Nakajima, Hiroshi, "Human Health Care System of Systems," IEEE Systems Journal, Vol. 3, No. 2, June 2009. [23] Karcanias, Nicos and Hessami, Ali G., “System of Systems and Emergence Part 1: Principles and Framework”, 2011 Fourth International Conference on Emerging Trends in Engineering & Technology. [24] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering. "Systems Engineering Guide for Systems of Systems, Version 1.0". , D.C: ODUSD(A&T)SSE, 2008. [25] Dahmann, Judith and Baldwin, Kristen J., “Understanding the Current State of U.S. Defense Systems of Systems and the Implications for Systems Engineering,” SysCon 2008 – IEEE International Systems Conference, Montreal, Canada, April 7-10, 2008. [26] Amado, Carla A.F.; Santos, Sergio P. and Marques, Pedro M., “Integrating the Data Envelopment Analysis and the Balanced Scorecard approaches for enhanced performance assessment”, Omega 40 (2012), 390-403. [27] Patrick, Barbara A. and French, P. Edward “Assessing New Public Management’s Focus on Performance Measurement In The Public Sector,” Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 340-369.

125 Texas Tech University, Jon Ilseng, December 2015

[28] Baruch, Yehuda and Ramalho, Nelson, “Communalities and Distinctions in the Measurement of Organizational Performance and Effectiveness Across For-Profit and Nonprofit Sections,” 2006 Association for Research on Nonprofit Organizations and Voluntary Action, Vol. 35, No. 1, March 2006, pp. 39-65. [29] Berument, M. Hakan; Dincer, N. Nergiz and Mustafaoglu, Zafer, “Effects of growth volatility on economic performance – empirical evidence from Turkey,” European Journal of Operational Research, 15 September 2011 [30] Sackett, W.T., “Engineering Productivity-Its Measurement and Career Impact”, Sixth IEEE-USA Careers Conference. [31] Russell, S.N. and Allwood, J.M., “Environmental evaluation of localizing production as a strategy for Sustainable Development: a case study of two consumer goods in Jamaica,” Journal of Cleaner Production 16 (2008), pp. 1327- 1338. [32] Assaf, A. George and Josiassen, Alexander, "European vs. U.S. Airlines: Performance Comparison in a Dynamic Market", Elsevier Tourism Management 33 (2012) 317-326. [33] Collins, Luke, “Group Intelligence, Teamwork and Productivity,” Perspectives, March-April 2012. [34] Soriano, Domingo Riberio and Castrogiovanni, Gary J., “The Impact of Education, Experience and Inner Circle advisors on SME performance: Insights from a study of public development centers,” Springer Science Business Media, 17 March 2010. [35] Boehm, Eike, "Improving Efficiency and Effectiveness in an Automotive R&D Organization," Research-Technology Management, March-April 2012, pp. 18-25. [36] Gazley, Beth, “Linking Collaborative Capacity to Performance Measurement in Government–Nonprofit Partnerships”, 2010 SAGE Publications. [37] Steinberger, Julia K. and Krausmann, Fridolin, “Material and Energy Productivity”, 2011 American Chemical Society Publications. [38] Serikh, V.I.; Yakimova, I.V. and Palchun, Yu A., “Measurements in Educational Processes”, 2010 IEEE. [39] Herron, Colin and Braiden, Paul M., “A methodology for developing sustainable quantifiable productivity improvement in manufacturing companies”, International Journal of Production Economics 104 (2006), pp. 143-153. [40] Vanclay, Jerome K. and Bommann, Lutz, “Metrics to evaluate research performance in academic institutions: a critique of ERA 2010 as applied in forestry and the indirect H2 index as a possible alternative”, Scientometrics (2012) 91:751–771. [41] Brown, Warren B. and Gobeli, David, “Observations on the Measurement of R&D Productivity: A Case Study,” IEEE TRANSACTIONS ON ENGINEERING MANAGEMENT, VOL. 39, NO. 4, NOVEMBER 1992. [42] Zhao, Lei, “Policy Performance Evaluation of Local Government: Value Orientation and Influencing Factor”, 2011 Fourth International Conference on Business Intelligence and Financial Engineering.


[43] Maudgalya, Tushyati; Genaidy, Ash and Shell, Richard, "Productivity-Quality-Costs-Safety: A Sustained Approach to Competitive Advantage - A Systematic Review of the National Safety Council's Case Studies in Safety and Productivity," Human Factors and Ergonomics in Manufacturing, Vol. 18 (2), pp. 152-179 (2008).
[44] Ryynanen, Marko; Sirvio, Petri; Tanninen, Panu and Lindell, Henry, "A Productivity Study of Digital Printing in the Packaging Industry," Packaging Technology and Science 2012; 25: 119-124.
[45] Phusavat, Kongkiti and Photaranon, Watcharapon, "Productivity/performance Measurement: Case application at the government pharmaceutical organization," Industrial Management & Data Systems, Vol. 106, No. 9, 2006.
[46] Lee, Jeong-Dong; Park, Sung-Bae and Kim, Tai-Yoo, "Profit, productivity, and price differential: an international performance comparison of the natural gas transportation industry," Energy Policy 27 (1999), pp. 679-689.
[47] Pearson, A.W.; Nixon, W.A. and Kerssens-van Drongelen, "R&D as a business - what are the implications for performance measurement?" R&D Management 30, 4, 2000.
[48] Lahiri, Somnath and Kumar, Vikas, "Ranking International Business Institutions and Faculty Members Using Research Publication as the Measure," Management International Review (2012) 52: 317-340.
[49] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE, 2011.
[50] Mahto, Raj V.; Davis, Peter S.; Pearce II, John A. and Robinson, Richard B., Jr., "Satisfaction with Firm Performance in Family Business," Entrepreneurship Theory and Practice, Baylor University, 2010.
[51] Baum, J. Robert and Wally, Stefan, "Strategic Decision Speed and Firm Performance," Strategic Management Journal, 24: 1107-1129 (2003).
[52] Davis-Mendelow, Steven, "Sustainability For A Change: An Aerospace Perspective," IEEE, 2006, pp. 1-4.
[53] Fisher, Emily Bartholomew and Schoenberger, Erica, "Who's Driving the Bus: The Importance of Interdisciplinary Awareness on the Road to Sustainability," IEEE Energy 2030, 17-18 November 2008.
[54] Asgarzadeh, Morteza; Koga, Takaaki; Yoshizawa, Nozomu; Munakata, Jun and Hirate, Kotaroh, "A Transdisciplinary Approach to Oppressive Cityscapes and the Role of Greenery as Key Factors in Sustainable Urban Development," IEEE, 2009.
[55] Althin, Rikard; Behrenz, Lars; Fare, Rolf; Grosskopf, Shawna and Mellander, Erik, "Swedish employment offices: A new model for evaluating effectiveness," European Journal of Operational Research 207 (2010), pp. 1535-1544.

[56] McGrath, Michael E. and Romeri, Michael N., "The R&D Effectiveness Index: A Metric for Product Development Performance," Journal of Product Innovation Management, 1994; 11: 213-220.


[57] Sarkees, Matthew, "Understanding the links between technological opportunism, marketing emphasis and firm performance: implications for B2B," Industrial Marketing Management 40 (2011), pp. 785-795.
[58] Maudgalya, Tushyati; Genaidy, Ash and Shell, Richard, "Productivity-Quality-Costs-Safety: A Sustained Approach to Competitive Advantage - A Systematic Review of the National Safety Council's Case Studies in Safety and Productivity," Human Factors and Ergonomics in Manufacturing, Vol. 18 (2), pp. 152-179 (2008).
[59] Romzek, Barbara S. and Johnston, Jocelyn M., "State Social Services Contracting: Exploring the Determinants of Effective Contract Accountability," Public Administration Review, Vol. 65, No. 4, July/August 2005.
[60] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Springer Science Business Media, 10 February 2011.
[61] Wang, Yan Fu; Xie, Min; Chin, Kwai-Sang and Fu, Xiu Ju, "Accident analysis model based on Bayesian Network and Evidential Reasoning Approach," Journal of Loss Prevention in the Process Industries 26 (2013), pp. 10-21.
[62] Xu, Dong-Ling, "An introduction and survey of the evidential reasoning approach for multiple criteria decision analysis," Annals of Operations Research (2012), Volume 195, Issue 1, pp. 163-187.
[63] Jiang, Jiang; Li, Xuan; Zhou, Zhi-jie; Xu, Dong-ling and Chen, Ying-wu, "Weapon System Capability Assessment under uncertainty based on the evidential reasoning approach," Expert Systems with Applications 38 (2011), pp. 13773-13784.
[64] Chen, Chien-Ming and Delmas, Magali, "Measuring Corporate Social Performance: An Efficiency Perspective," Production and Operations Management, Vol. 20, No. 6, November-December 2011, pp. 789-804.
[65] Cheng, Gang; Zervopoulos, Panagiotis and Qian, Zhenhua, "A variant of radial measure capable of dealing with negative inputs and outputs in data envelopment analysis," European Journal of Operational Research 225 (2013), pp. 100-105.
[66] Kuah, Chuen Tse and Wong, Kuan Yew, "Efficiency assessment of universities through data envelopment analysis," Procedia Computer Science 3 (2011), pp. 499-506.
[67] Sutton, Warren and Dimitrov, Stanko, "The U.S. Navy explores detailing cost reduction via Data Envelopment Analysis," European Journal of Operational Research 227 (2013), pp. 166-173.
[68] Branda, Martin, "Diversification-consistent data envelopment analysis with deviation measures," European Journal of Operational Research 226 (2013), pp. 626-635.

[69] Ho, Mei Hsiu-Ching and Liu, John S., "The motivations for knowledge transfer across borders: the diffusion of data envelopment analysis (DEA) methodology," Scientometrics (2013) 94: 397-421.
[70] Paradi, Joseph C. and Zhu, Haiyan, "A survey on bank branch efficiency and performance research with data envelopment analysis," Omega 41 (2013), pp. 61-79.


[71] Samoilenko, Sergey and Osei-Bryson, Kweku-Muata, "Using Data Envelopment Analysis (DEA) for monitoring efficiency-based performance of productivity-driven organizations: Design and implementation of a decision support system," Omega 41 (2013), pp. 131-142.
[72] Amirteimoori, Alireza and Kordrostami, Sohrab, "Production Planning in Data Envelopment Analysis," International Journal of Production Economics 140 (2012), pp. 212-218.
[73] Lee, Ki-Hoon and Saen, Reza Farzipoor, "Measuring corporate sustainability management: A data envelopment analysis approach," International Journal of Production Economics 140 (2012), pp. 219-226.
[74] Chen, Chialin; Zhu, Joe; Yu, Jiun-Yu and Noori, Hamid, "A new methodology for evaluating sustainable product design performance with two-stage network data envelopment analysis," European Journal of Operational Research 221 (2012), pp. 348-359.
[75] Farris, Jennifer A.; Groesbeck, Richard L.; Van Aken, Eileen M. and Letens, Geert, "Evaluating the Relative Performance of Engineering Design Projects: A Case Study Using Data Envelopment Analysis," IEEE Transactions on Engineering Management, Vol. 53, No. 3, August 2006.
[76] Blidisel, Rodica Gabriela, "Data Envelopment Analysis and the Efficiency of Romanian Public Higher Education," Metalurgia International, Vol. XVIII, No. 3 (2013), p. 221.
[77] Zaum, D.; Olbrich, M. and Barke, E., "Automatic Data Extraction: A Prerequisite for Productivity Measurement," IEEE, 2008, pp. 1-5.
[78] Norcross, Lisa, "Building on Success," IET Manufacturing Engineer, June/July 2006, pp. 1-4.
[79] Zeydan, Mithat; Colpan, Cuneyt and Taylan, Osman, "A Competitive Performance Evaluation Methodology in a Manufacturing Enterprise," 2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD 2010).
[80] Mola, Khaled Gad El and Parsaei, Hamid, "Dimensions and Measures of Manufacturing Performance Measurement," Proceedings of the IERC 2004 Conference, Houston, Texas, May 15-19, 2004.
[81] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[82] Wang, Jiahe, "Establishing the Index System for Performance Measurement of Local Government's Public Service Quality Management," IEEE, 2011.
[83] Likierman, Andrew, "The Five Traps of Performance Measurement," Harvard Business Review, October 2009.
[84] Wu, Jerry Chun-Teh; Tsai, Hsien-Tang; Shih, Meng-Hsun and Fu, Hwai-Hui, "Government Performance Evaluation using a balanced scorecard with a fuzzy linguistic scale," The Service Industries Journal, Vol. 30, No. 3, March 2010, pp. 449-462.
[85] Zhaoji, Yu; Wenjuan, Wang and Lifeng, Guo, "Influence Mechanism of Performance Evaluation of Public Policy Implementation in Local Government Based on System Dynamics," IEEE, 2011.
[86] Mola, Khaled Gad El and Parsaei, Hamid, "Integrated Performance Measurement Systems: A Review and Analysis," IEEE, 2007.
[87] Przekop, Laura Armstrong and Kerr, Shawn, "Life Cycle Tools for Future Product Sustainability," IEEE, 2004.
[88] Coccia, Mario, "Metrics of R&D Performance and Management of Public Research Labs," IEEE, 2003.
[89] Kaisheng, Zeng and Xiaohui, "Performance Measurement Systems for E-business," 2011 The 6th International Forum on Strategic Technology.
[90] Lin, Minjuan and Ma, Yuan, "Problems and Strategies for Performance Evaluation in Public Decision-Making: What Can AHP Do?" 2011 Fourth International Conference on Business Intelligence and Financial Engineering.
[91] Cheng, Hsienhsin and Lin, Chaochih, "Regeneration Model of Taiwan Old Urban Centers - A Research Framework of a Performance Evaluation System for a Livable Urban District," Journal of Asian Architecture and Building Engineering, May 2011, p. 170.

[92] Martinez-Lorente, Angel R.; Dewhurst, Frank W. and Gallego-Rodriguez, Alejandrino, "Relating TQM, marketing and business performance: an exploratory study," International Journal of Production Research, 2000, Vol. 38, No. 14, pp. 3227-3246.
[93] Puyun, Bi and Wei, Jiang, "Research on the input-output efficiency of E-Government," IEEE, 2011.
[94] Murdoch, John; Fernandes, Kiran J. and Astley, Ken, "Role-Based Measurement of Performance," IEEE, 2012.
[95] Alladi, Anuradha and Vadar, Sekhar, "Systemic Approach to Project Management: A Stakeholders Perspective for Sustainability," IEEE, 2008.
[96] Neely, Andy; Mills, John; Platts, Ken; Richards, Huw; Gregory, Mike; Bourne, Mike and Kennerley, Mike, "Performance measurement system design: developing and testing a process-based approach," International Journal of Operations & Production Management, Vol. 20, No. 10, 2000, pp. 1119-1145.
[97] Neely, Andy; Richards, Huw; Mills, John; Platts, Ken and Bourne, Mike, "Designing performance measures: a structured approach," International Journal of Operations & Production Management, Vol. 17, No. 11, 1997, pp. 1131-1152.
[98] Giannoccaro, R.; Ludovico, A.D. and Triantis, K., "A Structured Methodology for Developing Performance Measures in Any Environment," Advances in Production Engineering & Management 2 (2007) 2, pp. 91-99.
[99] Washio, Satoshi and Yamada, Syuuji, "Evaluation Method based on Ranking in Data Envelopment Analysis," Expert Systems with Applications (2013), pp. 257-262.


[100] Papavramides, Thanos C.; Aupee, Pascal and Yttesen, Jacob, "Assessing Project Performance by Measuring Organizational Impact," IEEE Transactions on Engineering Management, 978-1-4244-2289-0, 2008.
[101] Chen, Chien-Ming and Delmas, Magali, "Measuring Corporate Social Performance: An Efficiency Perspective," Production and Operations Management, Vol. 20, No. 6, November-December 2011, pp. 789-804.
[102] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research (2012) 106: 187-197.
[103] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research (2012) 106: 187-197.
[104] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering, Systems Engineering Guide for Systems of Systems, Version 1.0, Washington, D.C.: ODUSD(A&T)SSE, 2008.
[105] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering, Systems Engineering Guide for Systems of Systems, Version 1.0, Washington, D.C.: ODUSD(A&T)SSE, 2008.
[106] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering, Systems Engineering Guide for Systems of Systems, Version 1.0, Washington, D.C.: ODUSD(A&T)SSE, 2008.
[107] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering, Systems Engineering Guide for Systems of Systems, Version 1.0, Washington, D.C.: ODUSD(A&T)SSE, 2008.
[108] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering, Systems Engineering Guide for Systems of Systems, Version 1.0, Washington, D.C.: ODUSD(A&T)SSE, 2008.
[109] Neely, Andy; Mills, John; Platts, Ken; Richards, Huw; Gregory, Mike; Bourne, Mike and Kennerley, Mike, "Performance measurement system design: developing and testing a process-based approach," International Journal of Operations & Production Management, Vol. 20, No. 10, 2000, pp. 1119-1145.
[110] Neely, Andy; Mills, John; Platts, Ken; Richards, Huw; Gregory, Mike; Bourne, Mike and Kennerley, Mike, "Performance measurement system design: developing and testing a process-based approach," International Journal of Operations & Production Management, Vol. 20, No. 10, 2000, pp. 1119-1145.
[111] Neely, Andy; Richards, Huw; Mills, John; Platts, Ken and Bourne, Mike, "Designing performance measures: a structured approach," International Journal of Operations & Production Management, Vol. 17, No. 11, 1997, pp. 1131-1152.
[112] Lee, Ki-Hoon and Saen, Reza Farzipoor, "Measuring corporate sustainability management: A data envelopment analysis approach," International Journal of Production Economics 140 (2012), pp. 219-226.
[113] Kuah, Chuen Tse and Wong, Kuan Yew, "Efficiency assessment of universities through data envelopment analysis," Procedia Computer Science 3 (2011), pp. 499-506.


[114] Kuah, Chuen Tse and Wong, Kuan Yew, "Efficiency assessment of universities through data envelopment analysis," Procedia Computer Science 3 (2011), pp. 499-506.
[115] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research (2012) 106, pp. 187-197.
[116] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research (2012) 106, pp. 187-197.
[117] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research (2012) 106, pp. 187-197.
[118] Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and Software Engineering, Systems Engineering Guide for Systems of Systems, Version 1.0, Washington, D.C.: ODUSD(A&T)SSE, 2008.
[119] Blidisel, Rodica Gabriela, "Data Envelopment Analysis and the Efficiency of Romanian Public Higher Education," Metalurgia International, Vol. XVIII, No. 3 (2013), p. 221.
[120] Paradi, Joseph C. and Zhu, Haiyan, "A survey on bank branch efficiency and performance research with data envelopment analysis," Omega 41 (2013), pp. 61-79.
[121] Samoilenko, Sergey and Osei-Bryson, Kweku-Muata, "Using Data Envelopment Analysis (DEA) for monitoring efficiency-based performance of productivity-driven organizations: Design and implementation of a decision support system," Omega 41 (2013), pp. 131-142.
[122] Jiang, Jiang; Li, Xuan; Zhou, Zhi-jie; Xu, Dong-ling and Chen, Ying-wu, "Weapon System Capability Assessment under uncertainty based on the evidential reasoning approach," Expert Systems with Applications 38 (2011), pp. 13773-13784.
[123] Xu, Dong-Ling, "An introduction and survey of the evidential reasoning approach for multiple criteria decision analysis," Annals of Operations Research (2012), 195: 163-187.
[124] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[125] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[126] Chen, Chien-Ming and Delmas, Magali, "Measuring Corporate Social Performance: An Efficiency Perspective," Production and Operations Management, Vol. 20, No. 6, November-December 2011, pp. 789-804.
[127] Patrick, Barbara A. and French, P. Edward, "Assessing New Public Management's Focus on Performance Measurement in the Public Sector," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 340-369.


[128] Searcy, Cory, "Corporate Sustainability Performance Measurement," Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics (2009), pp. 1057-1060.
[129] Zhao, Lei, "Policy Performance Evaluation of Local Government: Value Orientation and Influencing Factor," 2011 Fourth International Conference on Business Intelligence and Financial Engineering (2011), pp. 535-538.
[130] Zhaoji, Yu; Wenjuan, Wang and Lifeng, Guo, "Influence Mechanism of Performance Evaluation of Public Policy Implementation in Local Government Based on System Dynamics," IEEE 2011 (2011), 978-1-4244-8385-3/11, pp. 664-668.
[131] Farris, Jennifer A.; Groesbeck, Richard L.; Van Aken, Eileen M. and Letens, Geert, "Evaluating the Relative Performance of Engineering Design Projects: A Case Study Using Data Envelopment Analysis," IEEE Transactions on Engineering Management (2006), Vol. 53, No. 3, pp. 471-482.
[132] Kuah, Chuen Tse and Wong, Kuan Yew, "Efficiency assessment of universities through data envelopment analysis," Procedia Computer Science 3 (2011), pp. 499-506.
[133] Xu, Dong-Ling, "An introduction and survey of the evidential reasoning approach for multiple criteria decision analysis," Annals of Operations Research (2012), 195, pp. 163-187.
[134] Jiang, Jiang; Li, Xuan; Zhou, Zhi-jie; Xu, Dong-ling and Chen, Ying-wu, "Weapon System Capability Assessment under uncertainty based on the evidential reasoning approach," Expert Systems with Applications 38 (2011), pp. 13773-13784.
[135] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research 106 (2012), pp. 187-197.
[136] Romzek, Barbara S. and Johnston, Jocelyn M., "State Social Services Contracting: Exploring the Determinants of Effective Contract Accountability," Public Administration Review (2005), Vol. 65, No. 4, pp. 436-449.
[137] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[138] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[139] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[140] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[141] Amirkhanyan, Anna A., "What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts," Public Performance & Management Review, Vol. 35, No. 2, December 2011, pp. 303-339.
[142] Zhao, Lei, "Policy Performance Evaluation of Local Government: Value Orientation and Influencing Factor," 2011 Fourth International Conference on Business Intelligence and Financial Engineering, IEEE 2011, 978-0-7695-4527-1/1, pp. 535-538.
[143] Zhao, Lei, "Policy Performance Evaluation of Local Government: Value Orientation and Influencing Factor," 2011 Fourth International Conference on Business Intelligence and Financial Engineering, IEEE 2011, 978-0-7695-4527-1/1, pp. 535-538.
[144] Zhao, Lei, "Policy Performance Evaluation of Local Government: Value Orientation and Influencing Factor," 2011 Fourth International Conference on Business Intelligence and Financial Engineering, IEEE 2011, 978-0-7695-4527-1/1, pp. 535-538.
[145] Zhao, Lei, "Policy Performance Evaluation of Local Government: Value Orientation and Influencing Factor," 2011 Fourth International Conference on Business Intelligence and Financial Engineering, IEEE 2011, 978-0-7695-4527-1/1, pp. 535-538.
[146] Zhaoji, Yu; Wenjuan, Wang and Lifeng, Guo, "Influence Mechanism of Performance Evaluation of Public Policy Implementation in Local Government Based on System Dynamics," IEEE 2011, 978-1-4244-8385-3/11, pp. 664-668.
[147] Zhaoji, Yu; Wenjuan, Wang and Lifeng, Guo, "Influence Mechanism of Performance Evaluation of Public Policy Implementation in Local Government Based on System Dynamics," IEEE 2011, 978-1-4244-8385-3/11, pp. 664-668.
[148] Baruch, Yehuda and Ramalho, Nelson, "Communalities and Distinctions in the Measurement of Organizational Performance and Effectiveness Across For-Profit and Nonprofit Sectors," Nonprofit and Voluntary Sector Quarterly, Vol. 35, No. 1, 2006, pp. 39-65.
[149] Chen, Chien-Ming and Delmas, Magali, "Measuring Corporate Social Performance: An Efficiency Perspective," Production and Operations Management, Vol. 20, No. 6, November-December 2011, pp. 789-804.
[150] Chen, Chien-Ming and Delmas, Magali, "Measuring Corporate Social Performance: An Efficiency Perspective," Production and Operations Management, Vol. 20, No. 6, November-December 2011, pp. 789-804.
[151] Searcy, Cory, "Corporate Sustainability Performance Measurement: Lessons from System of Systems Engineering," Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics, IEEE 2009, 978-1-4244-2794-9/09, pp. 1057-1060.
[152] Searcy, Cory, "Corporate Sustainability Performance Measurement: Lessons from System of Systems Engineering," Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics, IEEE 2009, 978-1-4244-2794-9/09, pp. 1057-1060.
[153] Searcy, Cory, "Corporate Sustainability Performance Measurement: Lessons from System of Systems Engineering," Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics, IEEE 2009, 978-1-4244-2794-9/09, pp. 1057-1060.
[154] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[155] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[156] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[157] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[158] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[159] Isola, O.O.; Siyanbola, W.O. and Ilori, M.O., "Research Productivity in Selected Higher Education Institutions in Nigeria," IEEE 2011 (2011), 978-1-890843-23-6/11, pp. 1-6.
[160] Farris, Jennifer A.; Groesbeck, Richard L.; Van Aken, Eileen M. and Letens, Geert, "Evaluating the Relative Performance of Engineering Design Projects: A Case Study Using Data Envelopment Analysis," IEEE Transactions on Engineering Management (2006), Vol. 53, No. 3, pp. 471-482.
[161] Farris, Jennifer A.; Groesbeck, Richard L.; Van Aken, Eileen M. and Letens, Geert, "Evaluating the Relative Performance of Engineering Design Projects: A Case Study Using Data Envelopment Analysis," IEEE Transactions on Engineering Management (2006), Vol. 53, No. 3, pp. 471-482.
[162] Kuah, Chuen Tse and Wong, Kuan Yew, "Efficiency assessment of universities through data envelopment analysis," Procedia Computer Science 3 (2011), pp. 499-506.
[163] Jiang, Jiang; Li, Xuan; Zhou, Zhi-jie; Xu, Dong-ling and Chen, Ying-wu, "Weapon System Capability Assessment under uncertainty based on the evidential reasoning approach," Expert Systems with Applications 38 (2011), pp. 13773-13784.
[164] Jiang, Jiang; Li, Xuan; Zhou, Zhi-jie; Xu, Dong-ling and Chen, Ying-wu, "Weapon System Capability Assessment under uncertainty based on the evidential reasoning approach," Expert Systems with Applications 38 (2011), pp. 13773-13784.


[165] Xu, Dong-Ling, "An introduction and survey of the evidential reasoning approach for multiple criteria decision analysis," Annals of Operations Research (2012), 195, pp. 163-187.
[166] Xu, Dong-Ling, "An introduction and survey of the evidential reasoning approach for multiple criteria decision analysis," Annals of Operations Research (2012), 195, pp. 163-187.
[167] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research 106 (2012), pp. 187-197.
[168] Borhan, Maslan and Jemain, Abdul Aziz, "Assessing Schools' Academic Performance Using a Belief Structure," Social Indicators Research 106 (2012), pp. 187-197.
[169] Mircea, M.; Stoian, I.; Meza, R. and Dancea, O., "Information System for the Global Performance Evaluation of Organizations Based on the Balanced Scorecard Method," IEEE 2006 (2006), 1-4244-0361-8/06, pp. 1-5.
[170] Kaplan, Robert S. and Norton, David P., "The Balanced Scorecard - Measures That Drive Performance," Harvard Business Review, January-February 1992, pp. 71-79.
[171] Kaplan, Robert S. and Norton, David P., "The Balanced Scorecard - Measures That Drive Performance," Harvard Business Review, January-February 1992, pp. 71-79.
[172] Kaplan, Robert S. and Norton, David P., "The Balanced Scorecard - Measures That Drive Performance," Harvard Business Review, January-February 1992, pp. 71-79.
[173] Bolten, Joseph G.; Leonard, Robert S.; Arena, Mark V.; Younossi, Obaid and Sollinger, Jerry M., "Sources of Weapon System Cost Growth: Analysis of 35 Major Defense Acquisition Programs," RAND Project Air Force, 2008, pp. 1-118.


APPENDIX A CPICTD DATA


Table A-9-1 U.S. Navy E-6A TACAMO Program CPICTD Data

Month       Year   CPICTD
October     2003   0.96
November    2003   0.93
December    2003   1.02
January     2004   0.89
February    2004   0.89
March       2004   0.92
April       2004   0.93
May         2004   0.96
June        2004   1.00
July        2004   0.99
August      2004   1.03
September   2004   1.04
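For readers who want to reproduce the kind of summary statistics discussed in Chapter 6, the short Python sketch below shows one way a monthly CPICTD series such as Table A-9-1 could be summarized before being used as an input to the DEA and evidential reasoning analyses. The sketch is illustrative only; the variable names are hypothetical and the original analysis code is not reproduced here.

    # Illustrative sketch only: summarizing one program's monthly CPICTD series.
    # Values are transcribed from Table A-9-1 (October 2003 - September 2004).
    import statistics

    e6a_cpictd = [0.96, 0.93, 1.02, 0.89, 0.89, 0.92,
                  0.93, 0.96, 1.00, 0.99, 1.03, 1.04]

    mean_cpi = statistics.mean(e6a_cpictd)    # average performance over the year
    sd_cpi = statistics.stdev(e6a_cpictd)     # month-to-month variability

    print(f"mean CPICTD = {mean_cpi:.3f}, standard deviation = {sd_cpi:.3f}")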

Table A-9-2 U.S. Navy F/A-18E/F Fighter/Attack Program CPICTD Data

Month       Year   CPICTD
October     2003   1.01
November    2003   1.03
December    2003   1.04
January     2004   1.10
February    2004   1.31
March       2004   1.19
April       2004   1.17
May         2004   1.19
June        2004   1.27
July        2004   1.23
August      2004   1.30
September   2004   1.29


Table A-9-3 U.S. Air Force F-22 Advanced Tactical Fighter Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   0.85
December    2003   0.85
January     2004   0.85
February    2004   0.86
March       2004   0.92
April       2004   1.05
May         2004   1.21
June        2004   1.19
July        2004   1.18
August      2004   1.22
September   2004   1.21

Table A-9-4: U.S. Air Force/Navy Joint Primary Aircraft Training System Program CPICTD Data

Month       Year   CPICTD
October     2003   1.01
November    2003   1.10
December    2003   1.20
January     2004   1.15
February    2004   1.10
March       2004   1.12
April       2004   1.11
May         2004   1.13
June        2004   1.11
July        2004   1.10
August      2004   1.09
September   2004   1.09


Table A-9-5: U.S. Air Force Joint Surveillance Target Attack Radar System (Airborne Segment) Program CPICTD Data

Month       Year   CPICTD
October     2003   1.08
November    2003   1.09
December    2003   1.14
January     2004   1.61
February    2004   1.59
March       2004   1.40
April       2004   1.50
May         2004   1.74
June        2004   1.80
July        2004   1.73
August      2004   1.75
September   2004   1.76

Table A-9-6: U.S. Army Longbow Apache Airframe Modifications Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.24
December    2003   1.52
January     2004   1.00
February    2004   1.19
March       2004   1.21
April       2004   1.46
May         2004   1.00
June        2004   1.03
July        2004   1.10
August      2004   1.08
September   2004   1.09


Table A-9-7: U.S. Army AH-64 Helicopter Program CPICTD Data

Month       Year   CPICTD
October     2003   1.02
November    2003   1.03
December    2003   1.20
January     2004   1.21
February    2004   1.20
March       2004   1.33
April       2004   1.41
May         2004   1.67
June        2004   1.70
July        2004   1.67
August      2004   1.69
September   2004   1.75

Table A-9-8: U.S. Navy Undergraduate Jet Flight Training System (T45TS) Program CPICTD Data

Month       Year   CPICTD
October     2003   0.82
November    2003   0.97
December    2003   0.98
January     2004   0.99
February    2004   1.06
March       2004   1.13
April       2004   1.20
May         2004   1.24
June        2004   1.27
July        2004   1.24
August      2004   1.49
September   2004   1.45


Table A-9-9: U.S. Army Advanced Field Artillery Tactical Data System Program CPICTD Data

Month       Year   CPICTD
October     2003   1.13
November    2003   1.13
December    2003   0.81
January     2004   0.83
February    2004   0.85
March       2004   0.84
April       2004   0.87
May         2004   0.86
June        2004   0.85
July        2004   0.84
August      2004   0.85
September   2004   0.84

Table A-9-10: U.S. Air Force Airborne Warning and Control System Program CPICTD Data

Month       Year   CPICTD
October     2003   1.33
November    2003   1.11
December    2003   1.12
January     2004   1.18
February    2004   1.17
March       2004   1.00
April       2004   1.02
May         2004   1.10
June        2004   1.10
July        2004   1.17
August      2004   1.19
September   2004   1.19


Table A-9-11: U.S. Air Force B-1B Conventional Mission Upgrade Program-Computer CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.00
December    2003   0.99
January     2004   0.97
February    2004   0.98
March       2004   0.98
April       2004   0.99
May         2004   0.99
June        2004   1.00
July        2004   1.00
August      2004   1.02
September   2004   1.02

Table A-9-12: U.S. Air Force B-1B Conventional Mission Upgrade Program-Joint Direct Attack Munition CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.00
December    2003   0.99
January     2004   0.97
February    2004   0.98
March       2004   0.98
April       2004   0.99
May         2004   0.99
June        2004   1.00
July        2004   1.00
August      2004   1.02
September   2004   1.02


Table A-9-13: U.S. Air Force High Speed Anti-Radiation Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.22
November    2003   1.45
December    2003   0.81
January     2004   1.72
February    2004   1.54
March       2004   1.50
April       2004   1.52
May         2004   1.00
June        2004   1.15
July        2004   1.11
August      2004   1.04
September   2004   1.02

Table A-9-14: U.S. Navy Harpoon Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.29
November    2003   1.12
December    2003   1.15
January     2004   1.22
February    2004   1.29
March       2004   1.32
April       2004   1.45
May         2004   1.44
June        2004   1.34
July        2004   1.52
August      2004   1.71
September   2004   1.64


Table A-9-15: U.S. Navy High Speed Anti-Radiation Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.02
November    2003   1.12
December    2003   1.23
January     2004   1.36
February    2004   1.60
March       2004   1.65
April       2004   1.25
May         2004   1.35
June        2004   1.36
July        2004   1.47
August      2004   1.54
September   2004   1.55

Table A-9-16: U.S. Air Force Low Altitude Navigation Targeting Infrared Night (LANTIRN) Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.04
December    2003   1.09
January     2004   1.13
February    2004   1.13
March       2004   1.15
April       2004   1.13
May         2004   1.13
June        2004   1.13
July        2004   1.15
August      2004   1.17
September   2004   1.17


Table A-9-17: U.S. Army Longbow Apache Fire Control Radar Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.19
December    2003   1.21
January     2004   1.46
February    2004   1.00
March       2004   1.03
April       2004   1.04
May         2004   1.05
June        2004   1.04
July        2004   1.03
August      2004   1.04
September   2004   1.04

Table A-9-18: U.S. Army Multiple Launch Rocket System Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   0.98
December    2003   1.01
January     2004   0.96
February    2004   0.92
March       2004   0.90
April       2004   0.91
May         2004   0.90
June        2004   0.93
July        2004   0.92
August      2004   0.97
September   2004   0.97


Table A-9-19: U.S. Air Force Joint Tactical Information Distribution System Program CPICTD Data

Month       Year   CPICTD
October     2003   1.22
November    2003   1.88
December    2003   1.88
January     2004   1.97
February    2004   2.02
March       2004   2.12
April       2004   1.27
May         2004   1.20
June        2004   1.30
July        2004   1.30
August      2004   1.68
September   2004   1.65

Table A-9-20: U.S. Army Forward Area Air Defense Command and Control Program CPICTD Data

Month       Year   CPICTD
October     2003   1.03
November    2003   1.18
December    2003   1.40
January     2004   1.39
February    2004   1.60
March       2004   1.59
April       2004   1.81
May         2004   1.49
June        2004   1.45
July        2004   1.29
August      2004   1.27
September   2004   1.27


Table A-9-21: U.S. Army All Source Analysis System/Enemy Situation Correlation Element Program CPICTD Data

Month       Year   CPICTD
October     2003   1.30
November    2003   1.63
December    2003   1.73
January     2004   1.77
February    2004   1.78
March       2004   1.77
April       2004   1.76
May         2004   2.26
June        2004   2.34
July        2004   2.33
August      2004   2.22
September   2004   2.33

Table A-9-22: U.S. Air Force Advanced Medium Range Air-to-Air Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.01
November    2003   1.15
December    2003   1.05
January     2004   1.06
February    2004   1.11
March       2004   1.18
April       2004   1.58
May         2004   1.54
June        2004   1.62
July        2004   1.48
August      2004   1.75
September   2004   1.82


Table A-9-23: U.S. Navy AIM-7M Short Range Air-to-Air Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   0.95
November    2003   1.19
December    2003   1.16
January     2004   1.37
February    2004   1.37
March       2004   1.40
April       2004   1.37
May         2004   1.39
June        2004   1.42
July        2004   1.42
August      2004   1.41
September   2004   1.43

Table A-9-24: U.S. Army Longbow Hellfire Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.01
December    2003   1.44
January     2004   1.35
February    2004   1.07
March       2004   1.08
April       2004   1.10
May         2004   1.12
June        2004   1.11
July        2004   1.13
August      2004   1.15
September   2004   1.14


Table A-9-25: U.S. Army Tactical Missile System Program CPICTD Data

Month       Year   CPICTD
October     2003   0.95
November    2003   0.92
December    2003   0.91
January     2004   0.90
February    2004   0.82
March       2004   0.83
April       2004   0.86
May         2004   1.16
June        2004   1.15
July        2004   1.13
August      2004   1.12
September   2004   1.13

Table A-9-26: U.S. Army Patriot Advanced Capability Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.03
November    2003   1.04
December    2003   1.05
January     2004   1.04
February    2004   1.31
March       2004   1.48
April       2004   1.48
May         2004   1.49
June        2004   1.54
July        2004   1.52
August      2004   1.53
September   2004   1.51


Table A-9-27: U.S. Navy Trident II Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   0.99
December    2003   0.94
January     2004   0.93
February    2004   0.90
March       2004   0.88
April       2004   0.84
May         2004   0.83
June        2004   0.85
July        2004   0.87
August      2004   0.87
September   2004   0.86

Table A-9-28: U.S. Air Force NAVSTAR Global Positioning System Program Satellite CPICTD Data

Month       Year   CPICTD
October     2003   1.07
November    2003   1.07
December    2003   1.11
January     2004   1.11
February    2004   1.16
March       2004   1.02
April       2004   1.08
May         2004   1.04
June        2004   1.04
July        2004   1.06
August      2004   1.06
September   2004   1.09


Table A-9-29: U.S. Air Force Ground Launched Cruise Missile Program CPICTD Data

Month       Year   CPICTD
October     2003   1.05
November    2003   1.23
December    2003   1.51
January     2004   1.88
February    2004   2.03
March       2004   2.07
April       2004   2.06
May         2004   2.04
June        2004   2.01
July        2004   1.97
August      2004   1.89
September   2004   1.90

Table A-9-30: U.S. Air Force Cluster Bomb Unit-97B Sensor Fused Weapon Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.01
December    2003   1.01
January     2004   1.07
February    2004   1.05
March       2004   1.02
April       2004   0.99
May         2004   0.91
June        2004   0.95
July        2004   0.98
August      2004   0.97
September   2004   0.96


Table A-9-31: U.S. Army Bradley Fighting Vehicle Program CPICTD Data

Month       Year   CPICTD
October     2003   1.16
November    2003   1.11
December    2003   1.12
January     2004   1.24
February    2004   1.30
March       2004   1.35
April       2004   1.87
May         2004   2.58
June        2004   2.78
July        2004   2.59
August      2004   2.61
September   2004   2.58

Table A-9-32: U.S. Navy Advanced Self Protection Jammer Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.16
December    2003   1.41
January     2004   1.37
February    2004   1.26
March       2004   1.36
April       2004   1.26
May         2004   1.43
June        2004   1.28
July        2004   1.22
August      2004   1.23
September   2004   1.21


Table A-9-33: U.S. Army Advanced Anti-tank Weapon System-Medium Program CPICTD Data

Month       Year   CPICTD
October     2003   1.00
November    2003   1.02
December    2003   1.13
January     2004   1.39
February    2004   1.62
March       2004   1.52
April       2004   1.42
May         2004   1.44
June        2004   1.43
July        2004   1.45
August      2004   1.46
September   2004   1.47


APPENDIX B LPI DATA


Table B-10-1 Calendar Year 2010 Reported Logistics Performance Index Scores
(Scores by column: Overall LPI, Customs, Infrastructure, Ease of Shipment, Logistics Services, Ease of Tracking, Timeliness)

Afghanistan 2.24 2.22 1.87 2.24 2.09 2.37 2.61
Albania 2.46 2.07 2.14 2.64 2.39 2.39 3.01
Algeria 2.36 1.97 2.06 2.70 2.24 2.26 2.81
Angola 2.25 1.75 1.69 2.38 2.02 2.54 3.01
Argentina 3.10 2.63 2.75 3.15 3.03 3.15 3.82
Armenia 2.52 2.10 2.32 2.43 2.59 2.26 3.40
Australia 3.84 3.68 3.78 3.78 3.77 3.87 4.16
Austria 3.76 3.49 3.68 3.78 3.70 3.83 4.08
Azerbaijan 2.64 2.14 2.23 3.05 2.48 2.65 3.15
Bahamas, The 2.75 2.38 2.40 2.69 2.69 2.81 3.46
Bahrain 3.37 3.05 3.36 3.05 3.36 3.63 3.85
Bangladesh 2.74 2.33 2.49 2.99 2.44 2.64 3.46
Belgium 3.94 3.83 4.01 3.31 4.13 4.22 4.29
Benin 2.79 2.38 2.48 2.65 2.64 3.07 3.49
Bhutan 2.38 2.14 1.83 2.44 2.24 2.54 2.99
Bolivia 2.51 2.26 2.24 2.53 2.38 2.38 3.20
Bosnia & Herzegovina 2.66 2.33 2.22 3.10 2.30 2.68 3.18
Botswana 2.32 2.09 2.09 1.91 2.29 2.59 2.99
Brazil 3.20 2.37 3.10 2.91 3.30 3.42 4.14
Bulgaria 2.83 2.50 2.30 3.07 2.85 2.96 3.18
Burkina Faso 2.23 2.22 1.89 1.73 2.02 2.77 2.77
Cambodia 2.37 2.28 2.12 2.19 2.29 2.50 2.84
Cameroon 2.55 2.11 2.10 2.69 2.53 2.60 3.16
Canada 3.87 3.71 4.03 3.24 3.99 4.01 4.41
Chad 2.49 2.27 2.00 2.75 2.04 2.62 3.14
Chile 3.09 2.93 2.86 2.74 2.94 3.33 3.80
China 3.49 3.16 3.54 3.31 3.49 3.55 3.91
Colombia 2.77 2.50 2.59 2.54 2.75 2.75 3.52
Comoros 2.45 1.96 1.76 2.56 2.26 2.79 3.23
Congo, Democratic Republic 2.68 2.60 2.27 2.56 2.93 2.43 3.20
Congo, Republic 2.48 2.02 1.62 2.33 2.42 2.33 4.00
Costa Rica 2.91 2.61 2.56 2.64 2.80 3.13 3.71
Cote d'Ivoire 2.53 2.16 2.37 2.44 2.57 2.95 2.73
Croatia 2.77 2.62 2.36 2.97 2.53 2.82 3.22
Cuba 2.07 1.79 1.90 2.32 1.88 2.03 2.41
Cyprus 3.13 2.92 2.94 3.13 2.82 3.51 3.44
Czech Republic 3.51 3.31 3.25 3.42 3.27 3.60 4.16
Denmark 3.85 3.58 3.99 3.46 3.83 3.94 4.38
Djibouti 2.39 2.25 2.33 2.50 2.17 2.42 2.67
Dominican Republic 2.82 2.51 2.34 2.59 2.42 3.17 3.85
Ecuador 2.77 2.32 2.38 2.86 2.60 2.84 3.55
Egypt 2.61 2.11 2.22 2.56 2.87 2.56 3.31
El Salvador 2.67 2.48 2.44 2.18 2.66 2.68 3.63
Eritrea 1.70 1.50 1.35 1.63 1.88 1.55 2.21
Estonia 3.16 3.14 2.75 3.17 3.17 2.95 3.68
Ethiopia 2.41 2.13 1.77 2.76 2.14 2.89 2.65
Fiji 2.24 1.95 1.98 2.48 2.11 1.96 2.82
Finland 3.89 3.86 4.08 3.41 3.92 4.09 4.08
France 3.84 3.63 4.00 3.30 3.87 4.01 4.37
Gabon 2.41 2.23 2.09 2.29 2.31 2.67 2.87
Gambia 2.49 2.38 2.17 2.54 2.37 2.27 3.15
Georgia 2.61 2.37 2.17 2.73 2.57 2.67 3.08
Germany 4.11 4.00 4.34 3.66 4.14 4.18 4.48
Ghana 2.47 2.35 2.52 2.38 2.42 2.51 2.67
Greece 2.96 2.48 2.94 2.85 2.69 3.31 3.49
Guatemala 2.63 2.33 2.37 2.16 2.74 2.71 3.52
Guinea 2.60 2.34 2.10 2.43 2.68 2.89 3.10
Guinea-Bissau 2.10 1.89 1.56 2.75 1.56 1.71 2.91
Guyana 2.27 2.02 1.99 2.31 2.25 2.28 2.70
Haiti 2.59 2.12 2.17 3.17 2.46 2.43 3.02
Honduras 2.78 2.39 2.31 2.67 2.57 2.83 3.83
Hong Kong, China 3.88 3.83 4.00 3.67 3.83 3.94 4.04
Hungary 2.99 2.83 3.08 2.78 2.87 2.87 3.52
Iceland 3.20 3.22 3.33 3.10 3.14 3.14 3.27
India 3.12 2.70 2.91 3.13 3.16 3.14 3.61
Indonesia 2.76 2.43 2.54 2.82 2.47 2.77 3.46
Iran 2.57 2.22 2.36 2.44 2.65 2.50 3.26
Iraq 2.11 2.07 1.73 2.20 2.10 1.96 2.49
Ireland 3.89 3.60 3.76 3.70 3.82 4.02 4.47
Israel 3.41 3.12 3.60 3.17 3.50 3.39 3.77
Italy 3.64 3.38 3.72 3.21 3.74 3.83 4.08
Jamaica 2.53 2.00 2.07 2.82 2.32 3.07 2.82
Japan 3.97 3.79 4.19 3.55 4.00 4.13 4.26
Jordan 2.74 2.31 2.69 3.11 2.49 2.33 3.39
Kazakhstan 2.83 2.38 2.66 3.29 2.60 2.70 3.25
Kenya 2.59 2.23 2.14 2.84 2.28 2.89 3.06
Korea, Republic 3.64 3.33 3.62 3.47 3.64 3.83 3.97
Kuwait 3.28 3.03 3.33 3.12 3.11 3.44 3.70
Kyrgyz Republic 2.62 2.44 2.09 3.18 2.37 2.33 3.10
Laos, PDR 2.46 2.17 1.95 2.70 2.14 2.45 3.23
Latvia 3.25 2.94 2.88 3.38 2.96 3.55 3.72
Lebanon 3.34 3.27 3.05 2.87 3.73 3.16 3.97
Liberia 2.38 2.28 2.00 2.33 2.16 2.38 3.08
Libya 2.33 2.15 2.18 2.28 2.28 2.08 2.98
Lithuania 3.13 2.79 2.72 3.19 2.85 3.27 3.92
Luxembourg 3.98 4.04 4.06 3.67 3.67 3.92 4.58
Macedonia 2.77 2.55 2.55 2.83 2.76 2.82 3.10
Madagascar 2.66 2.35 2.63 3.06 2.40 2.51 2.90
Malaysia 3.44 3.11 3.50 3.50 3.34 3.32 3.86
Maldives 2.40 2.25 2.16 2.42 2.29 2.42 2.83
Mali 2.27 2.08 2.00 2.17 2.13 2.31 2.90
Malta 2.82 2.65 2.89 2.91 2.89 2.56 3.02
Mauritius 2.72 2.71 2.29 3.24 2.43 2.57 2.91
Mexico 3.05 2.55 2.95 2.83 3.04 3.28 3.66
Moldova 2.57 2.11 2.05 2.83 2.17 3.00 3.17
Mongolia 2.25 1.81 1.94 2.46 2.24 2.42 2.55
Montenegro 2.43 2.17 2.45 2.54 2.32 2.44 2.65
Mozambique 2.29 1.95 2.04 2.77 2.20 2.28 2.40
Myanmar 2.33 1.94 1.92 2.37 2.01 2.36 3.29
Namibia 2.02 1.68 1.71 2.20 2.04 2.04 2.38
Nepal 2.20 2.07 1.80 2.21 2.07 2.26 2.74
Netherlands 4.07 3.98 4.25 3.61 4.15 4.12 4.41
New Zealand 3.65 3.64 3.54 3.36 3.54 3.67 4.17
Nicaragua 2.54 2.24 2.23 2.63 2.31 2.51 3.21
Niger 2.54 2.06 2.28 2.66 2.42 2.45 3.28
Nigeria 2.59 2.17 2.43 2.84 2.45 2.45 3.10
Norway 3.93 3.86 4.22 3.35 3.85 4.10 4.35
Oman 2.84 3.38 3.06 2.31 2.37 2.04 3.94
Pakistan 2.53 2.05 2.08 2.91 2.28 2.64 3.08
Panama 3.02 2.76 2.63 2.87 2.83 3.26 3.76
Papua New Guinea 2.41 2.02 1.91 2.55 2.20 2.43 3.24
Paraguay 2.75 2.37 2.44 2.87 2.59 2.72 3.46
Peru 2.80 2.50 2.66 2.75 2.61 2.89 3.38
Philippines 3.14 2.67 2.57 3.40 2.95 3.29 3.83
Poland 3.44 3.12 2.98 3.22 3.26 3.45 4.52
Portugal 3.34 3.31 3.17 3.02 3.31 3.38 3.84
Qatar 2.95 2.25 2.75 2.92 2.57 3.09 4.09
Romania 2.84 2.36 2.25 3.24 2.68 2.90 3.45
Russian Federation 2.61 2.15 2.38 2.72 2.51 2.60 3.23
Rwanda 2.04 1.63 1.63 2.88 1.85 1.99 2.05
Saudi Arabia 3.22 2.91 3.27 2.80 3.33 3.32 3.78
Senegal 2.86 2.45 2.64 2.75 2.73 3.08 3.52
Serbia & Montenegro 2.69 2.19 2.30 3.41 2.55 2.67 2.80
Sierra Leone 1.97 2.17 1.61 2.33 1.53 1.73 2.33
Singapore 4.09 4.02 4.22 3.86 4.12 4.15 4.23
Slovak Republic 3.24 2.79 3.00 3.05 3.15 3.54 3.92
Slovenia 2.87 2.59 2.65 2.84 2.90 3.16 3.10
Solomon Islands 2.31 2.08 2.23 2.18 2.27 2.03 3.05
Somalia 1.34 1.33 1.50 1.33 1.33 1.17 1.38
South Africa 3.46 3.22 3.42 3.26 3.59 3.73 3.57
Spain 3.63 3.47 3.58 3.11 3.62 3.96 4.12
Sri Lanka 2.29 1.96 1.88 2.48 2.09 2.23 2.98
Sudan 2.21 2.02 1.78 2.11 2.15 2.02 3.09
Sweden 4.08 3.88 4.03 3.83 4.22 4.22 4.32
Switzerland 3.97 3.73 4.17 3.32 4.32 4.27 4.20
Syria 2.74 2.37 2.45 2.87 2.59 2.63 3.45
Taiwan 3.71 3.35 3.62 3.64 3.65 4.04 3.95
Tajikistan 2.35 1.90 2.00 2.42 2.25 2.25 3.16
Tanzania 2.60 2.42 2.00 2.78 2.38 2.56 3.33
Thailand 3.29 3.02 3.16 3.27 3.16 3.41 3.73
Togo 2.60 2.40 1.82 2.42 2.45 3.42 3.02
Tunisia 2.84 2.43 2.56 3.36 2.36 2.56 3.57
Turkey 3.22 2.82 3.08 3.15 3.23 3.09 3.94
Turkmenistan 2.49 2.14 2.24 2.31 2.34 2.38 3.51
Uganda 2.82 2.84 2.35 3.02 2.59 2.45 3.52
Ukraine 2.57 2.02 2.44 2.79 2.59 2.49 3.06
United Arab Emirates 3.63 3.49 3.81 3.48 3.53 3.58 3.94
United Kingdom 3.95 3.74 3.95 3.66 3.92 4.13 4.37
United States 3.86 3.68 4.15 3.21 3.92 4.17 4.19
Uruguay 2.75 2.71 2.58 2.77 2.59 2.78 3.06
Uzbekistan 2.79 2.20 2.54 2.79 2.50 2.96 3.72
Venezuela 2.68 2.06 2.44 3.05 2.53 2.84 3.05
Vietnam 2.96 2.68 2.56 3.04 2.89 3.10 3.44
Yemen 2.58 2.46 2.35 2.24 2.35 2.63 3.48
Zambia 2.28 2.17 1.83 2.41 2.01 2.35 2.85

Table B-10-2 Calendar Year 2012 Reported Logistics Performance Index Scores
(Scores by column: Overall LPI, Customs, Infrastructure, Ease of Shipment, Logistics Services, Ease of Tracking, Timeliness)

Afghanistan 2.30 2.33 2.00 2.33 2.16 2.10 2.80
Albania 2.77 2.43 2.43 2.84 2.65 2.65 3.58
Algeria 2.41 2.26 2.02 2.68 2.13 2.46 2.85
Angola 2.28 2.33 2.48 2.26 2.00 2.00 2.59
Argentina 3.05 2.45 2.94 3.33 2.95 3.30 3.27
Armenia 2.56 2.27 2.38 2.65 2.40 2.57 3.07
Australia 3.73 3.60 3.83 3.40 3.75 3.79 4.05
Austria 3.89 3.77 4.05 3.71 4.10 3.97 3.79
Azerbaijan 2.48 1.92 2.42 2.43 2.14 2.75 3.23
Bahamas, The 2.75 2.69 2.77 2.72 2.69 2.65 2.99
Bahrain 3.05 2.67 3.08 2.83 2.94 3.42 3.41
Belarus 2.61 2.24 2.78 2.58 2.65 2.58 2.87
Belgium 3.98 3.85 4.12 3.73 3.98 4.05 4.20
Benin 2.85 2.59 2.57 2.44 2.90 2.87 3.74
Bhutan 2.52 2.29 2.29 2.61 2.42 2.56 2.90
Bolivia 2.61 2.40 2.39 2.60 2.58 2.73 2.95
Bosnia & Herzegovina 2.99 2.65 2.86 3.00 2.93 2.81 3.61
Botswana 2.84 2.82 2.82 2.53 2.74 2.73 3.43
Brazil 3.13 2.51 3.07 3.12 3.12 3.42 3.55
Bulgaria 3.21 2.97 3.20 3.25 3.10 3.16 3.56
Burkina Faso 2.32 2.12 2.40 2.33 2.28 2.13 2.67
Burundi 1.61 1.67 1.68 1.57 1.43 1.67 1.67
Cambodia 2.56 2.30 2.20 2.61 2.50 2.77 2.95
Cameroon 2.53 2.37 2.24 2.37 2.41 2.55 3.19
Canada 3.85 3.58 3.99 3.55 3.85 3.86 4.31
Central African Republic 2.57 2.45 2.09 2.33 2.70 2.48 3.33
Chad 2.03 1.86 2.00 2.00 2.00 1.57 2.71
Chile 3.17 3.11 3.18 3.06 3.00 3.22 3.47
China 3.52 3.25 3.61 3.46 3.47 3.52 3.80
Colombia 2.87 2.65 2.72 2.76 2.95 2.66 3.45
Comoros 2.14 2.00 1.94 1.81 2.20 2.20 2.70
Congo, Democratic Republic 2.21 2.10 1.96 2.23 2.17 2.35 2.38
Congo, Republic 2.08 1.80 1.27 1.94 2.15 2.35 2.90
Costa Rica 2.75 2.47 2.60 2.85 2.53 2.81 3.19
Cote d'Ivoire 2.73 2.31 2.31 2.90 2.73 2.69 3.36
Croatia 3.16 3.06 3.35 2.95 2.92 3.20 3.54
Cuba 2.20 2.18 2.08 2.12 2.21 2.26 2.31
Cyprus 3.24 3.02 3.17 3.21 3.17 3.36 3.54
Czech Republic 3.14 2.95 2.96 3.01 3.34 3.17 3.40
Denmark 4.02 3.93 4.07 3.70 4.14 4.10 4.21
Djibouti 1.80 1.72 1.51 1.77 1.84 1.73 2.19
Dominican Republic 2.70 2.53 2.61 2.83 2.74 2.49 2.97
Ecuador 2.76 2.36 2.62 2.86 2.65 2.58 3.42
Egypt 2.98 2.60 3.07 3.00 2.95 2.86 3.39
El Salvador 2.60 2.28 2.46 2.57 2.60 2.60 3.08
Eritrea 2.11 1.78 1.83 2.63 2.03 1.83 2.43
Estonia 2.86 2.51 2.79 2.82 2.82 3.00 3.23
Ethiopia 2.24 2.03 2.22 2.35 2.16 2.10 2.54
Fiji 2.42 2.07 2.22 2.41 2.18 2.48 3.12
Finland 4.05 3.98 4.12 3.85 4.14 4.14 4.10
France 3.85 3.64 3.96 3.73 3.82 3.97 4.02
Gabon 2.34 2.00 2.00 2.40 2.40 2.20 3.00
Gambia 2.46 2.29 1.90 2.63 2.55 2.80 2.55
Georgia 2.77 2.90 2.85 2.68 2.78 2.59 2.86
Germany 4.03 3.87 4.26 3.67 4.09 4.05 4.32
Ghana 2.51 2.33 2.05 2.81 2.68 2.31 2.76
Greece 2.83 2.38 2.88 2.69 2.76 2.98 3.32
Guatemala 2.80 2.62 2.59 2.82 2.78 2.80 3.19
Guinea 2.48 2.42 2.34 2.67 2.59 2.33 2.50
Guinea-Bissau 2.60 2.39 2.68 2.61 2.58 2.58 2.74
Guyana 2.33 2.29 2.15 2.35 2.33 2.14 2.67
Haiti 2.03 1.78 1.78 1.94 1.74 2.15 2.74
Honduras 2.53 2.39 2.35 2.70 2.44 2.35 2.90
Hong Kong, China 4.12 3.97 4.12 4.18 4.08 4.09 4.28
Hungary 3.17 2.82 3.14 2.99 3.18 3.52 3.41
Iceland 3.39 3.53 3.39 3.01 3.47 3.39 3.62
India 3.08 2.77 2.87 2.98 3.14 3.09 3.58
Indonesia 2.94 2.53 2.54 2.97 2.85 3.12 3.61
Iran 2.49 2.19 2.42 2.49 2.66 2.49 2.66
Iraq 2.16 1.75 1.92 2.38 2.19 1.86 2.77
Ireland 3.52 3.40 3.35 3.40 3.54 3.65 3.77
Italy 3.67 3.34 3.74 3.53 3.65 3.73 4.05
Jamaica 2.42 2.22 2.27 2.43 2.21 2.43 2.91
Japan 3.93 3.72 4.11 3.61 3.97 4.03 4.21
Jordan 2.56 2.27 2.48 2.88 2.17 2.55 2.92
Kazakhstan 2.69 2.58 2.60 2.67 2.75 2.83 2.73
Kenya 2.43 2.08 2.16 2.69 2.38 2.34 2.88
Korea, Republic 3.70 3.42 3.74 3.67 3.65 3.68 4.02
Kuwait 2.83 2.73 2.82 2.68 2.68 2.98 3.11
Kyrgyz Republic 2.35 2.45 2.49 2.00 2.25 2.31 2.69
Laos, PDR 2.50 2.38 2.40 2.40 2.49 2.49 2.82
Latvia 2.78 2.71 2.52 2.72 2.64 2.97 3.08
Lebanon 2.58 2.21 2.41 2.71 2.38 2.61 3.11
Lesotho 2.24 2.00 2.13 2.13 2.42 1.99 2.73
Liberia 2.45 2.00 2.41 2.54 2.46 2.42 2.84
Libya 2.28 2.08 1.75 2.63 2.25 2.38 2.51
Lithuania 2.95 2.73 2.58 2.97 2.91 2.73 3.70
Luxembourg 3.82 3.54 3.79 3.70 3.82 3.91 4.19
Macedonia 2.56 2.24 2.60 2.66 2.66 2.41 2.79
Madagascar 2.72 2.80 2.40 2.40 2.80 2.80 3.13
Malawi 2.81 2.51 2.78 3.01 2.85 2.56 3.09
Malaysia 3.49 3.28 3.43 3.40 3.45 3.54 3.86
Maldives 2.55 2.24 2.47 2.47 2.68 2.43 2.96
Malta 3.16 2.81 3.10 3.17 3.01 3.05 3.79
Mauritania 2.40 2.33 2.34 2.52 2.28 2.28 2.60
Mauritius 2.82 2.58 2.83 2.50 2.67 2.83 3.52
Mexico 3.06 2.63 3.03 3.07 3.02 3.15 3.47
Moldova 2.33 2.17 2.44 2.08 2.15 2.44 2.74
Mongolia 2.25 1.98 2.22 2.13 1.88 2.29 2.99
Montenegro 2.45 2.31 2.30 2.22 2.35 2.62 2.89
Morocco 3.03 2.64 3.14 3.01 2.89 3.01 3.51
Myanmar 2.37 2.24 2.10 2.47 2.42 2.34 2.59
Namibia 2.65 2.73 2.72 2.49 2.65 2.85 2.52
Nepal 2.04 3.85 4.15 3.86 4.05 4.12 4.15
Netherlands 4.02 3.85 4.15 3.86 4.05 4.12 4.15
New Zealand 3.42 3.47 3.42 3.27 3.25 3.58 3.55
Niger 2.69 2.67 2.45 2.91 2.49 2.49 3.07
Nigeria 2.45 1.97 2.27 2.60 2.52 2.35 2.92
Norway 3.68 3.46 3.86 3.49 3.57 3.67 4.09
Oman 2.89 3.10 2.96 2.78 2.73 2.59 3.17
Pakistan 2.83 2.85 2.69 2.86 2.77 2.61 3.14
Panama 2.93 2.56 2.94 2.76 2.84 3.01 3.47
Papua New Guinea 2.38 1.98 2.20 2.34 2.18 2.51 3.01
Paraguay 2.48 2.36 2.41 2.31 2.49 2.59 2.74
Peru 2.94 2.68 2.73 2.87 2.91 2.99 3.40
Philippines 3.02 2.63 2.80 2.97 3.14 3.30 3.30
Poland 3.43 3.30 3.10 3.47 3.30 3.32 4.04
Portugal 3.50 3.19 3.42 3.43 3.48 3.60 3.88
Qatar 3.32 3.12 3.23 2.88 3.25 3.50 4.00
Romania 3.00 2.65 2.51 2.99 2.83 3.10 3.82
Russian Federation 2.58 2.04 2.45 2.59 2.65 2.76 3.02
Rwanda 2.27 2.19 1.88 2.27 2.06 2.39 2.76
Saudi Arabia 3.18 2.79 3.22 3.10 2.99 3.21 3.76
Sao Tome and Principe 2.48 2.33 2.24 2.33 2.42 2.78 2.78
Senegal 2.49 2.46 2.31 2.72 2.55 2.10 2.74
Serbia 2.80 2.39 2.62 2.76 2.80 3.07 3.14
Sierra Leone 2.08 1.73 2.50 1.85 1.98 2.14 2.35
Singapore 4.13 4.10 4.15 3.99 4.07 4.07 4.39
Slovak Republic 3.03 2.88 2.99 2.84 3.07 2.84 3.57
Slovenia 3.29 3.05 3.24 3.34 3.25 3.20 3.60
Solomon Islands 2.41 2.37 2.03 2.44 2.14 2.39 3.04
South Africa 3.67 3.35 3.79 3.50 3.56 3.83 4.03
Spain 3.70 3.40 3.74 3.68 3.69 3.67 4.02
Sri Lanka 2.75 2.58 2.50 3.00 2.80 2.65 2.90
Sudan 2.10 2.14 2.01 1.93 2.33 1.89 2.31
Sweden 3.85 3.68 4.13 3.39 3.90 3.82 4.26
Switzerland 3.80 3.88 3.98 3.46 3.71 3.83 4.01
Syria 2.60 2.33 2.54 2.62 2.48 2.35 3.26
Taiwan 3.71 3.42 3.77 3.58 3.68 3.72 4.10
Tajikistan 2.28 2.43 2.03 2.33 2.22 2.13 2.51
Tanzania 2.65 2.17 2.41 2.91 2.64 2.77 2.97
Thailand 3.18 2.96 3.08 3.21 2.98 3.18 3.63
Togo 2.58 2.29 2.46 3.13 2.29 2.46 2.77
Tunisia 3.17 3.13 2.88 2.88 3.13 3.25 3.75
Turkey 3.51 3.16 3.62 3.38 3.52 3.54 3.87
Ukraine 2.85 2.41 2.69 2.72 2.85 3.15 3.31
United Arab Emirates 3.78 3.61 3.84 3.59 3.74 3.81 4.10
United Kingdom 3.90 3.73 3.95 3.63 3.93 4.00 4.19
United States 3.93 3.67 4.14 3.56 3.96 4.11 4.21
Uruguay 2.98 2.99 2.87 2.91 2.98 2.98 3.16
Uzbekistan 2.46 2.25 2.25 2.38 2.39 2.53 2.96
Venezuela 2.49 2.10 2.17 2.54 2.33 2.57 3.18
Vietnam 3.00 2.65 2.68 3.14 2.68 3.16 3.64
Yemen 2.89 2.29 2.62 3.14 2.79 3.12 3.29
Zambia 2.55 2.31 2.20 2.67 2.27 2.50 3.27

Table B-10-3 Calendar Year 2014 Reported Logistics Performance Index Scores

Country | Overall LPI | Customs | Infrastructure | Ease of Shipment | Logistics Services | Ease of Tracking | Timeliness
Afghanistan | 2.07 | 2.16 | 1.82 | 1.99 | 2.12 | 1.85 | 2.48
Algeria | 2.65 | 2.71 | 2.54 | 2.54 | 2.54 | 2.54 | 3.04
Angola | 2.54 | 2.37 | 2.11 | 2.79 | 2.31 | 2.59 | 3.02
Argentina | 2.99 | 2.55 | 2.83 | 2.96 | 2.93 | 3.15 | 3.49
Armenia | 2.67 | 2.63 | 2.38 | 2.75 | 2.75 | 2.50 | 3.00
Australia | 3.81 | 3.85 | 4.00 | 3.52 | 3.75 | 3.81 | 4.00
Austria | 3.65 | 3.53 | 3.64 | 3.26 | 3.56 | 3.93 | 4.04
Azerbaijan | 2.45 | 2.57 | 2.71 | 2.57 | 2.14 | 2.14 | 2.57
Bahamas, The | 2.91 | 3.00 | 2.74 | 2.96 | 2.92 | 2.64 | 3.19
Bahrain | 3.08 | 3.29 | 3.04 | 3.04 | 3.04 | 3.29 | 2.80
Bangladesh | 2.56 | 2.09 | 2.11 | 2.82 | 2.64 | 2.45 | 3.18
Belarus | 2.64 | 2.50 | 2.55 | 2.74 | 2.46 | 2.51 | 3.05
Belgium | 4.04 | 3.80 | 4.10 | 3.80 | 4.11 | 4.11 | 4.39
Benin | 2.56 | 2.64 | 2.35 | 2.69 | 2.35 | 2.45 | 2.85
Bhutan | 2.29 | 2.09 | 2.18 | 2.38 | 2.48 | 2.28 | 2.28
Bolivia | 2.48 | 2.40 | 2.17 | 2.35 | 2.68 | 2.68 | 2.60
Bosnia & Herzegovina | 2.75 | 2.41 | 2.55 | 2.78 | 2.73 | 2.55 | 3.44
Botswana | 2.49 | 2.38 | 2.23 | 2.42 | 2.58 | 2.40 | 2.94
Brazil | 2.94 | 2.48 | 2.93 | 2.80 | 3.05 | 3.03 | 3.39
Bulgaria | 3.16 | 2.75 | 2.94 | 3.31 | 3.00 | 2.88 | 4.04
Burkina Faso | 2.64 | 2.50 | 2.35 | 2.63 | 2.63 | 2.49 | 3.21
Burundi | 2.57 | 2.60 | 2.40 | 2.60 | 2.51 | 2.51 | 2.76
Cambodia | 2.74 | 2.67 | 2.58 | 2.83 | 2.67 | 2.92 | 2.75
Cameroon | 2.30 | 1.86 | 1.85 | 2.20 | 2.52 | 2.52 | 2.80
Canada | 3.86 | 3.61 | 4.05 | 3.46 | 3.94 | 3.97 | 4.18
Central African Republic | 2.36 | 2.47 | 2.50 | 2.16 | 2.31 | 2.31 | 2.47
Chad | 2.53 | 2.46 | 2.33 | 2.33 | 2.34 | 2.71 | 3.02
Chile | 3.26 | 3.17 | 3.17 | 3.12 | 3.19 | 3.30 | 3.59
China | 3.53 | 3.21 | 3.67 | 3.50 | 3.46 | 3.50 | 3.87
Colombia | 2.64 | 2.59 | 2.44 | 2.72 | 2.64 | 2.55 | 2.87
Comoros | 2.40 | 2.58 | 2.30 | 2.51 | 2.26 | 2.37 | 2.37
Congo, Democratic Republic | 1.88 | 1.78 | 1.83 | 1.70 | 1.84 | 2.10 | 2.04
Congo, Republic | 2.08 | 1.50 | 1.83 | 2.17 | 2.17 | 2.17 | 2.58
Costa Rica | 2.70 | 2.39 | 2.43 | 2.63 | 2.86 | 2.83 | 3.04
Cote d'Ivoire | 2.76 | 2.33 | 2.41 | 2.87 | 2.62 | 2.97 | 3.31
Croatia | 3.05 | 2.95 | 2.92 | 2.98 | 3.00 | 3.11 | 3.37
Cuba | 2.18 | 2.17 | 1.84 | 2.47 | 2.08 | 1.99 | 2.45
Cyprus | 3.00 | 2.88 | 2.87 | 3.01 | 2.92 | 3.00 | 3.31
Czech Republic | 3.49 | 3.24 | 3.29 | 3.59 | 3.51 | 3.56 | 3.73
Denmark | 3.78 | 3.79 | 3.82 | 3.65 | 3.74 | 3.36 | 4.39
Djibouti | 2.15 | 2.20 | 2.00 | 1.80 | 2.21 | 2.00 | 2.74
Dominican Republic | 2.86 | 2.58 | 2.61 | 2.93 | 2.91 | 2.91 | 3.18
Ecuador | 2.71 | 2.49 | 2.50 | 2.79 | 2.61 | 2.67 | 3.18
Egypt | 2.97 | 2.85 | 2.86 | 2.87 | 2.99 | 3.23 | 2.99
El Salvador | 2.96 | 2.93 | 2.63 | 3.20 | 3.16 | 3.00 | 2.75
Eritrea | 2.08 | 1.90 | 1.68 | 1.90 | 2.23 | 2.01 | 2.79
Estonia | 3.35 | 3.40 | 3.34 | 3.34 | 3.27 | 3.20 | 3.55
Ethiopia | 2.59 | 2.42 | 2.17 | 2.50 | 2.62 | 2.67 | 3.17
Fiji | 2.55 | 2.40 | 2.47 | 2.72 | 2.22 | 2.47 | 2.97
Finland | 3.62 | 3.89 | 3.52 | 3.52 | 3.72 | 3.31 | 3.80
France | 3.85 | 3.65 | 3.98 | 3.68 | 3.75 | 3.89 | 4.17
Gabon | 2.20 | 2.00 | 2.08 | 2.58 | 2.25 | 1.92 | 2.31
Gambia | 2.25 | 2.06 | 2.00 | 2.67 | 2.22 | 2.00 | 2.46
Georgia | 2.51 | 2.21 | 2.42 | 2.32 | 2.44 | 2.59 | 3.09
Germany | 4.12 | 4.10 | 4.32 | 3.74 | 4.12 | 4.17 | 4.36
Ghana | 2.63 | 2.22 | 2.67 | 2.73 | 2.37 | 2.90 | 2.86
Greece | 3.20 | 3.36 | 3.17 | 2.97 | 3.23 | 3.03 | 3.50
Guatemala | 2.80 | 2.75 | 2.54 | 2.87 | 2.68 | 2.68 | 3.24
Guinea | 2.46 | 2.34 | 2.10 | 2.47 | 2.35 | 2.41 | 3.10
Guinea-Bissau | 2.43 | 2.43 | 2.29 | 2.29 | 2.57 | 2.29 | 2.71
Guyana | 2.46 | 2.46 | 2.40 | 2.43 | 2.27 | 2.47 | 2.74
Haiti | 2.27 | 2.25 | 2.00 | 2.27 | 2.14 | 2.32 | 2.63
Honduras | 2.61 | 2.70 | 2.24 | 2.79 | 2.47 | 2.61 | 2.79
Hong Kong, China | 3.83 | 3.72 | 3.97 | 3.58 | 3.81 | 3.87 | 4.06
Hungary | 3.46 | 2.97 | 3.18 | 3.40 | 3.33 | 3.82 | 4.06
Iceland | 3.39 | 3.54 | 3.34 | 3.15 | 3.46 | 3.38 | 3.51
India | 3.08 | 2.72 | 2.88 | 3.20 | 3.03 | 3.11 | 3.51
Indonesia | 3.08 | 2.87 | 2.92 | 2.87 | 3.21 | 3.11 | 3.53
Iraq | 2.30 | 1.98 | 2.18 | 2.31 | 2.15 | 2.31 | 2.85
Ireland | 3.87 | 3.80 | 3.84 | 3.44 | 3.94 | 4.13 | 4.13
Israel | 3.26 | 3.10 | 3.11 | 2.71 | 3.35 | 3.20 | 4.18
Italy | 3.69 | 3.36 | 3.78 | 3.54 | 3.62 | 3.84 | 4.05
Jamaica | 2.84 | 2.88 | 2.84 | 2.79 | 2.72 | 2.72 | 3.14
Japan | 3.91 | 3.78 | 4.16 | 3.52 | 3.93 | 3.95 | 4.24
Jordan | 2.87 | 2.60 | 2.59 | 2.96 | 2.94 | 2.67 | 3.46
Kazakhstan | 2.70 | 2.33 | 2.38 | 2.68 | 2.72 | 2.83 | 3.24
Kenya | 2.81 | 1.96 | 2.40 | 3.15 | 2.65 | 3.03 | 3.58
Korea, Republic | 3.67 | 3.47 | 3.79 | 3.44 | 3.66 | 3.69 | 4.00
Kuwait | 3.01 | 2.69 | 3.16 | 2.76 | 2.96 | 3.16 | 3.39
Kyrgyz Republic | 2.21 | 2.03 | 2.05 | 2.43 | 2.13 | 2.20 | 2.36
Laos, PDR | 2.39 | 2.45 | 2.21 | 2.50 | 2.31 | 2.20 | 2.65
Latvia | 3.40 | 3.22 | 3.03 | 3.38 | 3.21 | 3.50 | 4.06
Lebanon | 2.73 | 2.29 | 2.53 | 2.53 | 2.89 | 3.22 | 2.89
Lesotho | 2.37 | 2.22 | 2.35 | 2.48 | 2.23 | 2.35 | 2.60
Liberia | 2.62 | 2.57 | 2.57 | 2.57 | 2.86 | 2.57 | 2.57
Libya | 2.50 | 2.41 | 2.29 | 2.29 | 2.29 | 2.85 | 2.85
Lithuania | 3.18 | 3.04 | 3.18 | 3.10 | 2.99 | 3.17 | 3.60
Luxembourg | 3.95 | 3.82 | 3.91 | 3.82 | 3.78 | 3.68 | 4.71
Macedonia | 2.50 | 2.35 | 2.50 | 2.38 | 2.51 | 2.46 | 2.81
Madagascar | 2.38 | 2.06 | 2.15 | 2.38 | 2.33 | 2.29 | 3.07
Malawi | 2.81 | 2.79 | 3.04 | 2.63 | 2.86 | 2.63 | 2.99
Malaysia | 3.59 | 3.37 | 3.56 | 3.64 | 3.47 | 3.58 | 3.92
Maldives | 2.75 | 2.95 | 2.56 | 2.92 | 2.79 | 2.70 | 2.51
Malta | 3.11 | 3.00 | 3.08 | 3.23 | 3.00 | 3.15 | 3.15
Mauritania | 2.23 | 1.93 | 2.40 | 2.07 | 2.06 | 2.23 | 2.75
Mauritius | 2.51 | 2.25 | 2.50 | 2.63 | 2.48 | 2.34 | 2.88
Mexico | 3.13 | 2.69 | 3.04 | 3.19 | 3.12 | 3.14 | 3.57
Moldova | 2.65 | 2.46 | 2.55 | 3.14 | 2.44 | 2.35 | 2.89
Mongolia | 2.36 | 2.20 | 2.29 | 2.62 | 2.33 | 2.13 | 2.51
Montenegro | 2.88 | 2.83 | 2.84 | 3.15 | 2.45 | 2.76 | 3.19
Myanmar | 2.25 | 1.97 | 2.14 | 2.14 | 2.07 | 2.36 | 2.83
Namibia | 2.66 | 2.27 | 2.57 | 2.70 | 2.69 | 2.56 | 3.15
Nepal | 2.59 | 2.31 | 2.26 | 2.64 | 2.50 | 2.72 | 3.06
Netherlands | 4.05 | 3.96 | 4.23 | 3.64 | 4.13 | 4.07 | 4.34
New Zealand | 3.64 | 3.92 | 3.67 | 3.67 | 3.56 | 3.33 | 3.72
Niger | 2.39 | 2.49 | 2.08 | 2.38 | 2.28 | 2.36 | 2.76
Nigeria | 2.81 | 2.35 | 2.56 | 2.63 | 2.70 | 3.16 | 3.46
Norway | 3.96 | 4.21 | 4.19 | 3.42 | 4.19 | 3.50 | 4.36
Oman | 3.00 | 2.63 | 2.88 | 3.41 | 2.84 | 2.84 | 3.29
Pakistan | 2.83 | 2.84 | 2.67 | 3.08 | 2.79 | 2.73 | 2.79
Panama | 3.19 | 3.15 | 3.00 | 3.18 | 2.87 | 3.34 | 3.63
Papua New Guinea | 2.43 | 2.40 | 2.23 | 2.47 | 2.47 | 2.27 | 2.73
Paraguay | 2.78 | 2.49 | 2.46 | 2.83 | 2.76 | 2.89 | 3.22
Peru | 2.84 | 2.47 | 2.72 | 2.94 | 2.78 | 2.81 | 3.30
Philippines | 3.00 | 3.00 | 2.60 | 3.33 | 2.93 | 3.00 | 3.07
Poland | 3.49 | 3.26 | 3.08 | 3.46 | 3.47 | 3.54 | 4.13
Portugal | 3.56 | 3.26 | 3.37 | 3.43 | 3.71 | 3.71 | 3.87
Qatar | 3.52 | 3.21 | 3.44 | 3.55 | 3.55 | 3.47 | 3.87
Romania | 3.26 | 2.83 | 2.77 | 3.32 | 3.20 | 3.39 | 4.00
Russian Federation | 2.69 | 2.20 | 2.59 | 2.64 | 2.74 | 2.85 | 3.14
Rwanda | 2.76 | 2.50 | 2.32 | 2.78 | 2.64 | 2.94 | 3.34
Saudi Arabia | 3.15 | 2.86 | 3.34 | 2.93 | 3.11 | 3.15 | 3.55
Sao Tome and Principe | 2.73 | 2.42 | 2.59 | 2.95 | 2.50 | 3.13 | 2.77
Senegal | 2.62 | 2.61 | 2.30 | 3.03 | 2.53 | 2.65 | 2.53
Serbia | 2.96 | 2.37 | 2.73 | 3.12 | 3.02 | 2.94 | 3.55
Singapore | 4.00 | 4.01 | 4.28 | 3.70 | 3.97 | 3.90 | 4.25
Slovak Republic | 3.25 | 2.89 | 3.22 | 3.30 | 3.16 | 3.02 | 3.94
Slovenia | 3.38 | 3.11 | 3.35 | 3.05 | 3.51 | 3.51 | 3.82
Solomon Islands | 2.59 | 2.49 | 2.46 | 2.22 | 2.72 | 2.72 | 2.96
South Africa | 3.43 | 3.11 | 3.20 | 3.45 | 3.62 | 3.30 | 3.88
Spain | 3.72 | 3.63 | 3.77 | 3.51 | 3.83 | 3.54 | 4.07
Sri Lanka | 2.70 | 2.56 | 2.23 | 2.56 | 2.91 | 2.76 | 3.12
Sudan | 2.16 | 1.87 | 1.90 | 2.23 | 2.18 | 2.42 | 2.33
Sweden | 3.96 | 3.75 | 4.09 | 3.76 | 3.98 | 3.98 | 4.26
Switzerland | 3.84 | 3.92 | 4.04 | 3.58 | 3.75 | 3.79 | 4.06
Syria | 2.09 | 2.07 | 2.08 | 2.15 | 1.82 | 1.90 | 2.53
Taiwan | 3.72 | 3.55 | 3.64 | 3.71 | 3.60 | 3.79 | 4.02
Tajikistan | 2.53 | 2.35 | 2.36 | 2.73 | 2.47 | 2.47 | 2.74
Tanzania | 2.33 | 2.19 | 2.32 | 2.32 | 2.18 | 2.11 | 2.89
Thailand | 3.43 | 3.21 | 3.40 | 3.30 | 3.29 | 3.45 | 3.96
Togo | 2.32 | 2.09 | 2.07 | 2.47 | 2.14 | 2.49 | 2.60
Tunisia | 2.55 | 2.02 | 2.30 | 2.91 | 2.42 | 2.42 | 3.16
Turkey | 3.50 | 3.23 | 3.53 | 3.18 | 3.64 | 3.77 | 3.68
Turkmenistan | 2.30 | 2.31 | 2.06 | 2.56 | 2.07 | 2.32 | 2.45
Ukraine | 2.98 | 2.69 | 2.65 | 2.95 | 2.84 | 3.20 | 3.51
United Arab Emirates | 3.54 | 3.42 | 3.70 | 3.20 | 3.50 | 3.57 | 3.92
United Kingdom | 4.01 | 3.94 | 4.16 | 3.63 | 4.03 | 4.08 | 4.33
United States | 3.92 | 3.73 | 4.18 | 3.45 | 3.97 | 4.14 | 4.14
Uruguay | 2.68 | 2.39 | 2.51 | 2.64 | 2.58 | 2.89 | 3.06
Uzbekistan | 2.39 | 1.80 | 2.01 | 2.23 | 2.37 | 2.87 | 3.08
Venezuela | 2.81 | 2.39 | 2.61 | 2.94 | 2.76 | 2.92 | 3.18
Vietnam | 3.15 | 2.81 | 3.11 | 3.22 | 3.09 | 3.19 | 3.49
Yemen | 2.18 | 1.63 | 1.87 | 2.35 | 2.21 | 2.21 | 2.78
Zambia | 2.46 | 2.54 | 2.31 | 2.13 | 2.47 | 2.47 | 2.91
Zimbabwe | 2.34 | 1.89 | 2.25 | 2.25 | 2.50 | 2.22 | 2.93
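For reference, the overall LPI score is derived from the six component scores; the World Bank computes it as a weighted average of the components, with weights obtained from principal component analysis, so the weights are close to uniform. The short sketch below is a simplified illustration, not the official computation: it approximates a country's overall score as the unweighted mean of its six components, using values transcribed from Table B-10-3.

```python
# Approximate a country's overall LPI as the unweighted mean of its six
# component scores. The official World Bank overall LPI is a weighted
# average with weights from principal component analysis, so this simple
# mean is only a close approximation.

# Component order: Customs, Infrastructure, Ease of Shipment,
# Logistics Services, Ease of Tracking, Timeliness (Table B-10-3).
lpi_components = {
    "Germany":       [4.10, 4.32, 3.74, 4.12, 4.17, 4.36],  # reported overall: 4.12
    "United States": [3.73, 4.18, 3.45, 3.97, 4.14, 4.14],  # reported overall: 3.92
    "Vietnam":       [2.81, 3.11, 3.22, 3.09, 3.19, 3.49],  # reported overall: 3.15
}

for country, scores in lpi_components.items():
    approx = sum(scores) / len(scores)
    print(f"{country}: unweighted mean = {approx:.2f}")
```

Running this gives 4.13, 3.94 and 3.15, within a few hundredths of the reported overall scores, which shows how little the PCA weighting departs from a plain average.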


APPENDIX C - DALLAS ECONOMIC VIBRANCY FOCUS AREA DATA


Table C-1 Training Sessions per FTE

Month/Year | Actual | Target | Target Index | Target Index Grade
October/2008 | 25.0 | 16.129 | 155.0% | Excellent
November/2008 | 21.053 | 16.129 | 130.5% | Excellent
December/2008 | 16.038 | 14.019 | 114.4% | Excellent
January/2009 | 16.788 | 14.493 | 115.8% | Excellent
February/2009 | 16.568 | 14.793 | 112.0% | Excellent
March/2009 | 16.915 | 15.0 | 112.8% | Excellent
April/2009 | 17.167 | 15.152 | 113.3% | Excellent
May/2009 | 16.981 | 15.267 | 111.2% | Excellent
June/2009 | 16.294 | 14.658 | 111.2% | Excellent
July/2009 | 16.667 | 14.793 | 112.7% | Excellent
August/2009 | 16.944 | 14.634 | 115.8% | Excellent
September/2009 | 17.048 | 14.5 | 117.6% | Excellent
October/2009 | 41.667 | 25.0 | 166.7% | Excellent
November/2009 | 25.00 | 21.429 | 116.7% | Excellent
December/2009 | 21.154 | 17.308 | 122.2% | Excellent
January/2010 | 20.896 | 17.647 | 118.4% | Excellent
February/2010 | 19.512 | 17.857 | 109.3% | Excellent
March/2010 | 21.429 | 18.0 | 119.0% | Excellent
April/2010 | 22.807 | 18.103 | 126.0% | Excellent
May/2010 | 21.538 | 18.182 | 118.5% | Excellent
June/2010 | 19.608 | 17.308 | 113.3% | Excellent
July/2010 | 19.643 | 17.442 | 112.6% | Excellent
August/2010 | 19.565 | 17.553 | 111.5% | Excellent
September/2010 | 19.5 | 17.5 | 111.4% | Excellent
October/2010 | 18.75 | 18.75 | 100.0% | Good
November/2010 | 17.50 | 15.63 | 111.9% | Excellent
December/2010 | 12.5 | 8.74 | 143.0% | Excellent
January/2011 | 11.11 | 6.29 | 176.5% | Excellent
February/2011 | 13.79 | 9.10 | 151.5% | Excellent
March/2011 | 17.48 | 16.9 | 103.4% | Good
April/2011 | 20.34 | 24.09 | 84.4% | Caution
May/2011 | 22.56 | 28.44 | 79.3% | Poor
June/2011 | 23.4 | 29.28 | 79.9% | Poor
July/2011 | 23.65 | 29.37 | 80.5% | Caution
August/2011 | 24.36 | 29.67 | 82.1% | Caution
September/2011 | 25.77 | 14.42 | 178.7% | Excellent
October/2011 | 37.50 | 18.75 | 200.0% | Excellent
November/2011 | 25.0 | 15.63 | 160.0% | Excellent
December/2011 | 17.14 | 14.58 | 117.6% | Excellent
January/2012 | 19.61 | 14.06 | 139.4% | Excellent
February/2012 | 20.90 | 13.75 | 152.0% | Excellent
March/2012 | 18.07 | 13.54 | 133.5% | Excellent
April/2012 | 17.17 | 13.39 | 128.2% | Excellent
May/2012 | 17.89 | 12.50 | 143.1% | Excellent
June/2012 | 18.71 | 12.50 | 149.6% | Excellent
July/2012 | 19.05 | 12.50 | 152.4% | Excellent
August/2012 | 20.0 | 12.50 | 160.0% | Excellent
September/2012 | 22.09 | 12.50 | 176.7% | Excellent
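In Tables C-1 through E-3, the Target Index is simply the Actual value divided by the Target value, expressed as a percentage, and the Target Index Grade bands that ratio. The source data do not state the band boundaries; from the reported grades they appear to be roughly Excellent at 105% and above, Good from 90% to 105%, Caution from 80% to 90%, and Poor below 80%, with a few rows departing from this pattern. The sketch below is a minimal illustration under that assumed banding; the function names and thresholds are the author's reading of the data, not an official City of Dallas specification.

```python
# Target Index and grade banding for the City of Dallas measures in
# Tables C-1 through E-3. The thresholds are inferred from the reported
# data (an assumption), not an official City of Dallas specification.

def target_index(actual: float, target: float) -> float:
    """Target Index: the actual value as a percentage of the target."""
    return 100.0 * actual / target

def target_index_grade(index_pct: float) -> str:
    """Band a Target Index into the grades used in the tables (assumed bands)."""
    if index_pct >= 105.0:
        return "Excellent"
    if index_pct >= 90.0:
        return "Good"
    if index_pct >= 80.0:
        return "Caution"
    return "Poor"

# Two rows from Table C-1 (training sessions per FTE):
for month, actual, target in [("October/2008", 25.0, 16.129),
                              ("May/2011", 22.56, 28.44)]:
    idx = target_index(actual, target)
    print(f"{month}: {idx:.1f}% -> {target_index_grade(idx)}")
```

For these two rows the sketch reproduces the reported values (155.0%, Excellent and 79.3%, Poor).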


Table C-2 Percentage of MWBE vendors registered online without assistance

Month/Year | Actual | Target | Target Index | Target Index Grade
October/2008 | 95.0% | 95.0% | 100.0% | Good
November/2008 | 95.0% | 95.0% | 100.0% | Good
December/2008 | 95.0% | 95.0% | 100.0% | Good
January/2009 | 95.0% | 95.0% | 100.0% | Good
February/2009 | 95.0% | 95.0% | 100.0% | Good
March/2009 | 95.0% | 95.0% | 100.0% | Good
April/2009 | 95.0% | 95.0% | 100.0% | Good
May/2009 | 95.0% | 95.0% | 100.0% | Good
June/2009 | 95.0% | 95.0% | 100.0% | Good
July/2009 | 95.0% | 95.0% | 100.0% | Good
August/2009 | 95.0% | 95.0% | 100.0% | Good
September/2009 | 95.0% | 95.0% | 100.0% | Good

Table C-3 Number of training sessions conducted for MWBE vendors

Month/Year | Actual | Target | Target Index | Target Index Grade
September/2009 | 67.0 | 58.0 | 115.5% | Excellent
October/2009 | 5.0 | 3.0 | 166.7% | Excellent
November/2009 | 7.0 | 6.0 | 116.7% | Excellent
December/2009 | 11.0 | 9.0 | 122.2% | Excellent
January/2010 | 14.0 | 12.0 | 116.7% | Excellent
February/2010 | 16.0 | 15.0 | 106.7% | Excellent
March/2010 | 21.0 | 18.0 | 116.7% | Excellent
April/2010 | 26.0 | 21.0 | 123.8% | Excellent
May/2010 | 28.0 | 24.0 | 116.7% | Excellent
June/2010 | 30.0 | 27.0 | 111.1% | Excellent
July/2010 | 33.0 | 30.0 | 110.0% | Excellent
August/2010 | 36.0 | 33.0 | 109.1% | Excellent
September/2010 | 39.0 | 35.0 | 111.4% | Excellent
October/2010 | 3.0 | 3.0 | 100.0% | Excellent
November/2010 | 7.0 | 5.0 | 140.0% | Excellent
December/2010 | 7.0 | 7.0 | 100.0% | Good
January/2011 | 8.0 | 6.4 | 125.0% | Excellent
February/2011 | 12 | 10.3 | 116.7% | Excellent
March/2011 | 18.0 | 21.0 | 83.3% | Caution
April/2011 | 24 | 33.0 | 70.8% | Poor
May/2011 | 30.0 | 40.0 | 73.3% | Poor
June/2011 | 33.0 | 41.8 | 78.8% | Poor
July/2011 | 35.0 | 43.75 | 80.0% | Caution
August/2011 | 38.00 | 46.0 | 81.6% | Caution
September/2011 | 42.0 | 30.0 | 140.0% | Caution
October/2011 | 3.0 | 3.0 | 100.0% | Good
November/2011 | 5.0 | 5.0 | 100.0% | Good
December/2011 | 6.0 | 7.0 | 85.7% | Caution
January/2012 | 10.0 | 9.0 | 111.1% | Excellent
February/2012 | 14.0 | 11.0 | 127.3% | Excellent
March/2012 | 15.0 | 13.0 | 115.4% | Excellent
April/2012 | 17.0 | 15.0 | 113.3% | Excellent
May/2012 | 22.0 | 17.0 | 129.4% | Excellent
June/2012 | 26.0 | 19.0 | 136.8% | Excellent
July/2012 | 28.0 | 21.0 | 133.3% | Excellent
August/2012 | 31.0 | 23.0 | 134.8% | Excellent
September/2012 | 36.0 | 25.0 | 144.0% | Excellent
October/2013 | 3.0 | 3.0 | 100.0% | Good
November/2013 | 7.0 | 6.0 | 116.7% | Excellent
December/2013 | 9.0 | 9.0 | 100.0% | Good
January/2014 | 12.0 | 12.0 | 100.0% | Good
February/2014 | 18.0 | 15.0 | 120.0% | Excellent
March/2014 | 24.0 | 18.0 | 133.3% | Excellent
April/2014 | 28.0 | 22.0 | 127.3% | Excellent
May/2014 | 29.0 | 26.0 | 111.5% | Excellent
June/2014 | 34.0 | 29.0 | 117.2% | Excellent

Table C-10-4 Number of business-related inbound delegations assisted to promote international business

Fiscal Year/Quarter | Actual | Target | Target Index | Target Index Grade
FY09/Q1 | 10.0 | 7.0 | 142.9% | Excellent
FY09/Q2 | 16.0 | 13.0 | 123.1% | Excellent
FY09/Q3 | 29.0 | 19.0 | 152.6% | Excellent
FY11/Q1 | 6.0 | 6.0 | 100.0% | Good
FY11/Q2 | 12.0 | 12.0 | 100.0% | Good
FY11/Q3 | 18.0 | 17.0 | 105.6% | Excellent
FY11/Q4 | 24.0 | 23.0 | 104.2% | Good
FY12/Q1 | 6.0 | 5.0 | 120.0% | Excellent
FY12/Q2 | 12.0 | 11.0 | 109.1% | Excellent
FY12/Q3 | 19.0 | 18.0 | 105.6% | Excellent
FY12/Q4 | 25.0 | 25.0 | 100.0% | Good
FY13/Q1 | 10.0 | 5.0 | 200.0% | Excellent
FY13/Q2 | 16.0 | 11.0 | 145.5% | Excellent
FY13/Q3 | 22.0 | 18.0 | 122.2% | Excellent

Table C-10-5 Number of COD Partnership Events

Month/Year | Actual | Target | Target Index | Target Index Grade
September/2010 | 16.0 | 10.0 | 160.0% | Excellent
October/2010 | 6.0 | 7.2 | 83.3% | Caution
November/2010 | 6.0 | 7.2 | 83.3% | Caution
December/2010 | 6.0 | 7.2 | 83.3% | Caution
January/2011 | 12.0 | 14.4 | 83.3% | Caution
February/2011 | 12.0 | 14.4 | 83.3% | Caution
March/2011 | 12.0 | 14.4 | 83.3% | Caution
April/2011 | 18.0 | 19.0 | 94.4% | Good
May/2011 | 18.0 | 19.0 | 94.4% | Good
June/2011 | 18.0 | 19.0 | 94.4% | Good
July/2011 | 24.0 | 26.0 | 91.7% | Good
August/2011 | 24.0 | 26.0 | 91.7% | Good
September/2011 | 24.0 | 16.0 | 150.0% | Excellent
October/2011 | 5.0 | 3.0 | 166.7% | Excellent
November/2011 | 5.0 | 3.0 | 166.7% | Excellent
December/2011 | 5.0 | 3.0 | 166.7% | Excellent
January/2012 | 10.0 | 7.0 | 142.9% | Excellent
February/2012 | 10.0 | 7.0 | 142.9% | Excellent
March/2012 | 10.0 | 7.0 | 142.9% | Excellent
April/2012 | 17.0 | 11.0 | 154.5% | Excellent
May/2012 | 17.0 | 11.0 | 154.5% | Excellent
June/2012 | 17.0 | 11.0 | 154.5% | Excellent
July/2012 | 22.0 | 15.0 | 146.7% | Excellent
August/2012 | 22.0 | 15.0 | 146.7% | Excellent
September/2012 | 22.0 | 15.0 | 146.7% | Excellent


APPENDIX D - DALLAS CLEAN, HEALTHY ENVIRONMENT FOCUS AREA DATA


Table D-1 Percent of Complaints resolved after initial investigation

Month/Year | Actual | Target | Target Index | Target Index Grade
October/2008 | 100% | 95% | 105.3% | Excellent
November/2008 | 92.857% | 95% | 97.7% | Good
December/2008 | 92.657% | 95.0% | 97.6% | Good
January/2009 | 92.0% | 95.0% | 96.8% | Good
February/2009 | 93.605% | 95.0% | 98.5% | Good
March/2009 | 92.804% | 95.0% | 97.7% | Good
April/2009 | 92.532% | 95.0% | 97.4% | Good
May/2009 | 92.730% | 95.0% | 97.6% | Good
June/2009 | 91.829% | 95.0% | 96.7% | Good
July/2009 | 92.646% | 95.0% | 97.5% | Good
August/2009 | 93.06% | 95.0% | 98.0% | Good
September/2009 | 93.638% | 95.0% | 98.6% | Good
October/2009 | 91.6% | 95.0% | 96.4% | Good
November/2009 | 95.8% | 95.0% | 100.8% | Good
December/2009 | 97.2% | 95.0% | 102.3% | Good
January/2010 | 93.733% | 95.0% | 98.7% | Good
February/2010 | 94.987% | 95.0% | 100.0% | Good
March/2010 | 95.822% | 95.0% | 100.9% | Good
April/2010 | 93.562% | 95.0% | 98.5% | Good
May/2010 | 92.283% | 95.0% | 97.1% | Good
June/2010 | 89.437% | 95.0% | 94.1% | Good
July/2010 | 89.243% | 95.0% | 93.9% | Good
August/2010 | 90.221% | 95.0% | 95.0% | Good
September/2010 | 89.647% | 95.0% | 94.4% | Good
September/2011 | 95.10% | 90.0% | 105.7% | Excellent
October/2011 | 100.0% | 90.0% | 111.1% | Excellent
November/2011 | 100.0% | 90.0% | 111.1% | Excellent
December/2011 | 100.0% | 90.0% | 111.1% | Excellent
January/2012 | 95.0% | 90.0% | 105.6% | Excellent
February/2012 | 96.0% | 90.0% | 106.7% | Excellent
March/2012 | 96.67% | 90.0% | 107.4% | Excellent
April/2012 | 97.14% | 90.0% | 107.9% | Excellent
May/2012 | 94.72% | 90.0% | 105.2% | Excellent
June/2012 | 95.31% | 90.0% | 105.9% | Excellent
July/2012 | 95.78% | 90.0% | 106.4% | Excellent
August/2012 | 94.64% | 90.0% | 105.2% | Excellent
September/2012 | 94.16% | 90.0% | 104.6% | Good
October/2012 | 90.0% | 90.0% | 100.0% | Good
November/2012 | 70.0% | 90.0% | 77.8% | Poor
December/2012 | 71.67% | 90.0% | 79.6% | Poor
January/2013 | 78.75% | 90.0% | 87.5% | Caution
February/2013 | 83.0% | 90.0% | 92.2% | Good
March/2013 | 82.80% | 90.0% | 92.0% | Good
April/2013 | 83.83% | 90.0% | 93.1% | Good
May/2013 | 84.07% | 90.0% | 93.4% | Good
June/2013 | 85.84% | 90.0% | 95.4% | Good
July/2013 | 87.25% | 90.0% | 96.9% | Good
August/2013 | 88.41% | 90.0% | 98.2% | Good
September/2013 | 88.44% | 90.0% | 98.3% | Good
October/2013 | 100.0% | 93.0% | 107.5% | Excellent
November/2013 | 100.0% | 93.0% | 107.5% | Excellent
December/2013 | 100.0% | 93.0% | 107.5% | Excellent
January/2014 | 100.0% | 93.0% | 107.5% | Excellent
February/2014 | 97.14% | 93.00% | 104.5% | Good
March/2014 | 94.84% | 93.00% | 102.0% | Good
April/2014 | 93.79% | 93.00% | 100.9% | Good
May/2014 | 91.00% | 93.00% | 97.8% | Good
June/2014 | 92.0% | 93.0% | 98.9% | Good

Table D-2 Percent of facilities in compliance with applicable regulations during the initial investigation

Month/Year | Actual | Target | Target Index | Target Index Grade
October/2008 | 93.478% | 90.0% | 103.9% | Good
November/2008 | 90.787% | 90.0% | 100.9% | Good
December/2008 | 90.916% | 90.0% | 101.0% | Good
January/2009 | 92.452% | 90.0% | 102.7% | Good
February/2009 | 93.382% | 90.0% | 103.8% | Good
March/2009 | 94.250% | 90.0% | 104.7% | Good
April/2009 | 94.041% | 90.0% | 104.5% | Good
May/2009 | 93.377% | 90.0% | 103.8% | Good
June/2009 | 93.669% | 90.0% | 104.1% | Good
July/2009 | 93.276% | 90.0% | 103.6% | Good
August/2009 | 93.177% | 90.0% | 103.5% | Good
September/2009 | 93.039% | 90.0% | 103.4% | Good
October/2009 | 89.6% | 90.0% | 99.6% | Good
November/2009 | 88.417% | 90.0% | 98.2% | Good
December/2009 | 91.203% | 90.0% | 101.3% | Good
January/2010 | 91.864% | 90.0% | 102.1% | Good
February/2010 | 90.46% | 90.0% | 100.5% | Good
March/2010 | 91.217% | 90.0% | 101.4% | Good
April/2010 | 91.769% | 90.0% | 102.0% | Good
May/2010 | 91.616% | 90.0% | 101.8% | Good
June/2010 | 91.522% | 90.0% | 101.7% | Good
July/2010 | 92.257% | 90.0% | 102.5% | Good
August/2010 | 92.668% | 90.0% | 103.0% | Good
September/2010 | 92.924% | 90.0% | 103.2% | Good
September/2011 | 94.31% | 93.0% | 101.4% | Good
October/2011 | 86.11% | 93.0% | 92.6% | Good
November/2011 | 86.44% | 93.0% | 92.9% | Good
December/2011 | 90.96% | 93.0% | 97.8% | Good
January/2012 | 91.27% | 93.0% | 98.1% | Good
February/2012 | 92.64% | 93.0% | 99.6% | Good
March/2012 | 93.07% | 93.0% | 100.1% | Good
April/2012 | 93.43% | 93.0% | 100.5% | Good
May/2012 | 93.46% | 93.0% | 100.5% | Good
June/2012 | 93.83% | 93.0% | 100.9% | Good
July/2012 | 94.18% | 93.0% | 101.3% | Good
August/2012 | 94.16% | 93.0% | 101.3% | Good
September/2012 | 94.46% | 93.0% | 101.6% | Good
October/2012 | 94.74% | 93.0% | 101.9% | Good
November/2012 | 93.43% | 93.0% | 100.5% | Good
December/2012 | 93.80% | 93.0% | 100.9% | Good
January/2013 | 94.46% | 93.0% | 101.6% | Good
February/2013 | 91.93% | 93.0% | 98.8% | Good
March/2013 | 92.72% | 93.0% | 99.7% | Good
April/2013 | 93.24% | 93.0% | 100.3% | Good
May/2013 | 92.62% | 93.0% | 99.6% | Good
June/2013 | 92.61% | 93.0% | 99.6% | Good
July/2013 | 92.79% | 93.0% | 99.8% | Good
August/2013 | 93.0% | 93.0% | 100.0% | Good
September/2013 | 93.21% | 93.0% | 100.2% | Good
October/2013 | 98.82% | 95.0% | 104.0% | Good
November/2013 | 98.31% | 95.0% | 103.5% | Good
December/2013 | 97.32% | 95.0% | 102.4% | Good
January/2014 | 95.31% | 95.0% | 100.3% | Good
February/2014 | 94.31% | 95.0% | 99.3% | Good
March/2014 | 94.25% | 95.0% | 99.2% | Good
April/2014 | 94.16% | 95.0% | 99.1% | Good
May/2014 | 93.90% | 95.0% | 98.8% | Good
June/2014 | 94.57% | 95.0% | 99.6% | Good


APPENDIX E - DALLAS EFFECTIVE, EFFICIENT AND ECONOMICAL FOCUS AREA DATA


Table E-1 Percentage of bids advertised within publisher’s deadline

Month/Year | Actual | Target | Target Index | Target Index Grade
October/2008 | 100.0% | 100.0% | 100.0% | Good
November/2008 | 100.0% | 100.0% | 100.0% | Good
December/2008 | 100.0% | 100.0% | 100.0% | Good
January/2009 | 100.0% | 100.0% | 100.0% | Good
February/2009 | 100.0% | 100.0% | 100.0% | Good
March/2009 | 100.0% | 100.0% | 100.0% | Good
April/2009 | 100.0% | 100.0% | 100.0% | Good
May/2009 | 100.0% | 100.0% | 100.0% | Good
June/2009 | 100.0% | 100.0% | 100.0% | Good
July/2009 | 100.0% | 100.0% | 100.0% | Good
August/2009 | 100.0% | 100.0% | 100.0% | Good
September/2009 | 100.0% | 100.0% | 100.0% | Good
October/2009 | 100.0% | 90.0% | 111.1% | Excellent
November/2009 | 100.0% | 90.0% | 111.1% | Excellent
December/2009 | 100.0% | 90.0% | 111.1% | Excellent
January/2010 | 100.0% | 90.0% | 111.1% | Excellent
February/2010 | 100.0% | 90.0% | 111.1% | Excellent
March/2010 | 100.0% | 90.0% | 111.1% | Excellent
April/2010 | 100.0% | 90.0% | 111.1% | Excellent
May/2010 | 100.0% | 90.0% | 111.1% | Excellent
June/2010 | 100.0% | 90.0% | 111.1% | Excellent
July/2010 | 100.0% | 90.0% | 111.1% | Excellent
August/2010 | 100.0% | 90.0% | 111.1% | Excellent
September/2010 | 100.0% | 90.0% | 111.1% | Excellent
September/2011 | 100.0% | 90.0% | 111.1% | Excellent
October/2011 | 100.0% | 95.0% | 105.3% | Excellent
November/2011 | 100.0% | 95.0% | 105.3% | Excellent
December/2011 | 100.0% | 95.0% | 105.3% | Excellent
January/2012 | 100.0% | 95.0% | 105.3% | Excellent
February/2012 | 100.0% | 95.0% | 105.3% | Excellent
March/2012 | 100.0% | 95.0% | 105.3% | Excellent
April/2012 | 100.0% | 95.0% | 105.3% | Excellent
May/2012 | 100.0% | 95.0% | 105.3% | Excellent
June/2012 | 100.0% | 95.0% | 105.3% | Excellent
July/2012 | 100.0% | 95.0% | 105.3% | Excellent
August/2012 | 100.0% | 95.0% | 105.3% | Excellent
September/2012 | 100.0% | 95.0% | 105.3% | Excellent

Table E-2 Percentage of requisitions processed in compliance with Texas State Law

Month/Year | Actual | Target | Target Index | Target Index Grade
October/2008 | 100.0% | 100.0% | 100.0% | Good
November/2008 | 100.0% | 100.0% | 100.0% | Good
December/2008 | 100.0% | 100.0% | 100.0% | Good
January/2009 | 100.0% | 100.0% | 100.0% | Good
February/2009 | 100.0% | 100.0% | 100.0% | Good
March/2009 | 100.0% | 100.0% | 100.0% | Good
April/2009 | 100.0% | 100.0% | 100.0% | Good
May/2009 | 100.0% | 100.0% | 100.0% | Good
June/2009 | 100.0% | 100.0% | 100.0% | Good
July/2009 | 100.0% | 100.0% | 100.0% | Good
August/2009 | 100.0% | 100.0% | 100.0% | Good
September/2009 | 100.0% | 100.0% | 100.0% | Good
October/2009 | 100.0% | 100.0% | 100.0% | Good
November/2009 | 100.0% | 100.0% | 100.0% | Good
December/2009 | 100.0% | 100.0% | 100.0% | Good
January/2010 | 100.0% | 100.0% | 100.0% | Good
February/2010 | 100.0% | 100.0% | 100.0% | Good
March/2010 | 100.0% | 100.0% | 100.0% | Good
April/2010 | 100.0% | 100.0% | 100.0% | Good
May/2010 | 100.0% | 100.0% | 100.0% | Good
June/2010 | 100.0% | 100.0% | 100.0% | Good
July/2010 | 100.0% | 100.0% | 100.0% | Good
August/2010 | 100.0% | 100.0% | 100.0% | Good
September/2010 | 100.0% | 100.0% | 100.0% | Good
September/2011 | 100.0% | 100.0% | 100.0% | Good
October/2011 | 100.0% | 100.0% | 100.0% | Good
November/2011 | 100.0% | 100.0% | 100.0% | Good
December/2011 | 100.0% | 100.0% | 100.0% | Good
January/2012 | 100.0% | 100.0% | 100.0% | Good
February/2012 | 100.0% | 100.0% | 100.0% | Good
March/2012 | 100.0% | 100.0% | 100.0% | Good
April/2012 | 100.0% | 100.0% | 100.0% | Good
May/2012 | 100.0% | 100.0% | 100.0% | Good
June/2012 | 100.0% | 100.0% | 100.0% | Good
July/2012 | 100.0% | 100.0% | 100.0% | Good
August/2012 | 100.0% | 100.0% | 100.0% | Good
September/2012 | 100.0% | 100.0% | 100.0% | Good

Table E-3 Number of contracts managed for effectiveness

Month/Year | Actual | Target | Target Index | Target Index Grade
September/2009 | 506.5 | 480.0 | 105.5% | Excellent
October/2009 | 494.0 | 502.0 | 98.4% | Good
November/2009 | 492.0 | 502.0 | 98.0% | Good
December/2009 | 502.333 | 502.0 | 100.1% | Good
January/2010 | 510.75 | 502.0 | 101.7% | Good
February/2010 | 515.6 | 502.0 | 102.7% | Good
March/2010 | 536.667 | 502.0 | 106.9% | Excellent
April/2010 | 556.571 | 502.0 | 110.9% | Excellent
May/2010 | 571.5 | 502.0 | 113.8% | Excellent
June/2010 | 582.889 | 502.0 | 116.1% | Excellent
July/2010 | 598.7 | 502.0 | 119.3% | Excellent
August/2010 | 612.909 | 502.0 | 122.1% | Excellent
September/2010 | 625.833 | 502.0 | 124.7% | Excellent
September/2011 | 830.25 | 592.0 | 140.2% | Excellent
October/2011 | 882.0 | 870.0 | 101.4% | Good
November/2011 | 891.0 | 870.0 | 102.4% | Good
December/2011 | 893.67 | 870.0 | 102.7% | Good
January/2012 | 872.75 | 870.0 | 100.3% | Good
February/2012 | 862.6 | 870.0 | 99.1% | Good
March/2012 | 852.5 | 870.0 | 98.0% | Good
April/2012 | 847.0 | 870.0 | 97.4% | Good
May/2012 | 843.63 | 870.0 | 97.0% | Good
June/2012 | 843.67 | 870.0 | 97.0% | Good
July/2012 | 845.0 | 870.0 | 97.1% | Good
August/2012 | 847.18 | 870.0 | 97.4% | Good
September/2012 | 848.0 | 870.0 | 97.5% | Good
October/2012 | 874.0 | 870.0 | 100.5% | Good
November/2012 | 822.50 | 870.0 | 94.5% | Good
December/2012 | 786.0 | 870.0 | 90.3% | Good
January/2013 | 765.25 | 870.0 | 88.0% | Caution
February/2013 | 756.4 | 870.0 | 86.9% | Caution
March/2013 | 738.67 | 870.0 | 84.9% | Caution
April/2013 | 726.29 | 870.0 | 83.5% | Caution
May/2013 | 718.5 | 870.0 | 82.6% | Caution
June/2013 | 712.11 | 870.0 | 81.9% | Caution
October/2013 | 630.0 | 765.0 | 82.4% | Caution
November/2013 | 620.0 | 765.0 | 81.0% | Caution
December/2013 | 650.67 | 765.0 | 85.1% | Caution
January/2014 | 646.5 | 765.0 | 84.5% | Caution
February/2014 | 645.6 | 765.0 | 84.4% | Caution
March/2014 | 647.33 | 765.0 | 84.6% | Caution
April/2014 | 646.29 | 765.0 | 84.5% | Caution
May/2014 | 646.5 | 765.0 | 84.5% | Caution
June/2014 | 648.33 | 765.0 | 84.7% | Caution


APPENDIX F - TEXAS TECH UNIVERSITY INSTITUTIONAL REVIEW BOARD PROPOSAL


PERFORMANCE MEASUREMENT DATA COLLECTION IN THE DALLAS/FORT WORTH, TEXAS METROPLEX

Dr. Atila Ertas, Principal Investigator
Jon K. Ilseng, PhD Candidate, Texas Tech University, Co-Investigator

I. Rationale: For-profit organizations strive to be as effective and efficient as possible. Lean management has become the de facto standard for resource allocation and decision-making. Previous work has described the pros and cons of lean management, both for short-term competitive positioning and for long-term viability. One of the key problems is how to measure performance so that doing things right (efficiency) does not keep the organization from doing the right thing (effectiveness). Manufacturing firms, like other for-profit organizations, are situated in areas that have a population with an educational system. This creates a local System of Systems. The interaction of these systems can be either parasitic, where the success of one requires the sacrifice of another, or symbiotic, where success for one fuels success in the others. How performance is measured in such an environment is therefore important. For this dissertation, the local System-of-Systems is the Dallas/Fort Worth, Texas Metroplex region. The purpose of the survey is to collect the performance measurement data that educational institutions, for-profit and not-for-profit companies, and government organizations (city, county, state and federal) currently collect. The performance measurement data will be analyzed and reviewed to propose performance measures that integrate the needs of the local Dallas/Fort Worth, Texas Metroplex SoS with the needs of for-profit organizations, government agencies and educational institutions. The intent is for the proposed performance measures to address current and future sustainability concerns. As sustainability concerns and issues become pervasive, the relationship between productivity/efficiency and effectiveness in the local Dallas/Fort Worth, Texas Metroplex SoS becomes very important. The Dallas-Fort Worth Metroplex is home to a large number of technology companies (Raytheon, Lockheed Martin, Cisco, Oracle, Texas Instruments, etc.) and research universities including the University of Texas-Dallas (UTD), University of


Texas-Arlington (UTA), Southern Methodist University (SMU), University of North Texas-Denton (UNT) and Texas Christian University (TCU), as well as community and technical colleges such as Collin County Community College (CCCC) and Tarrant County Community College (TCCC). For the Dallas-Fort Worth Metroplex local SoS, these interactions are very important; as sustainability concerns become more prevalent, the relationship between efficiency and effectiveness in this local SoS is crucial to ensuring that the interactions are symbiotic rather than parasitic. Sustainability is a continuing global issue, and industry, educational institutions and government agencies must continue to address it. In this research, sustainability is defined as "the ability to sustain, or a state that can be maintained at a certain level" (Kajikawa, 2010). For the Metroplex local SoS to be competitive and productive, it is crucial that each member address sustainability with regard to efficiency and effectiveness. With regard to education curricula at universities and colleges worldwide, "the pace of change in education curriculums is growing exponentially due to numerous legislative arrangements and changes" (Hasna, 2010). With regard to industry (for-profit and non-profit), sustainable business practices must be efficient, effective and productive. For example, in the aviation industry, aerospace manufacturers must be readily adaptable to change with regard to sustainability; the "aviation industry will be driven by, and judged on its environmental record. The impact of aviation on the environment has been a source of focus and concern for the industry for more than 60 years" (Davis-Mendelow, 2006). Government agencies must likewise be efficient, effective and productive to meet growing sustainability needs. For example, government contracts are assessed by their impact on the government's ability to manage contracts effectively, specifically by measuring costs, client impact, service timeliness and disruptions, which are associated with higher perceived accountability effectiveness (Amirkhanyan, 2011).


Industry, government and educational systems already use traditional performance measures of productivity and efficiency. For example, numerous United States Department of Defense (DoD) contractors use engineering productivity measures (for Systems Engineering, Software Engineering and Hardware Engineering) to track the rate at which budgeted work must be completed to meet a schedule. Software Engineering productivity measures the lines of code per hour (e.g., 1.6 LOC/hour) that a software developer must write to ensure that the software configuration item is completed on time and without error. Systems Engineering productivity measures the rate at which system-level requirements are completed and verified. Hardware Engineering productivity measures the rate at which engineering drawings must be completed to ensure a release date is met. Government agencies use performance measures such as inputs (resources used), outputs (program activities), efficiency measures (the ratio of inputs to outputs), and outcomes (the actual results of programs and services); these are applied in functional areas such as 1) risk management, 2) public health, 3) economic development, 4) information technology, and 5) streets and roads. Government agencies include the federal, state and local levels. Educational institutions (universities, public schools, private schools, etc.) also have performance measures. For example, some U.S. state legislatures decide appropriations to their state colleges and universities based on how well each institution performs. Also, when the United States Congress passed the No Child Left Behind Act in 2001, the act required states to develop subject-area measures of adequate yearly progress (AYP) in reading and math that could be used to evaluate the quality of the public education system (Patrick & French, 2011).
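To make the rate-based productivity measure above concrete, the short sketch below checks whether a software task can finish on schedule at the required rate of 1.6 LOC/hour cited above; the remaining-work and remaining-schedule quantities are hypothetical, chosen only for illustration.

```python
# Hypothetical illustration of a rate-based software productivity
# measure: at a required rate of 1.6 LOC/hour, will the remaining
# budgeted work fit in the remaining schedule? All quantities below
# are invented for the example.

def hours_needed(remaining_loc: float, rate_loc_per_hour: float) -> float:
    """Hours of coding effort required to finish the remaining lines of code."""
    return remaining_loc / rate_loc_per_hour

required_rate = 1.6      # LOC/hour, the example rate cited above
remaining_loc = 2400     # hypothetical budgeted lines of code still to write
remaining_hours = 1600   # hypothetical labor hours left in the schedule

needed = hours_needed(remaining_loc, required_rate)
print(f"Need {needed:.0f} h of effort; {remaining_hours} h remain.")
print("On track" if needed <= remaining_hours else "Behind schedule")
```

Here 2,400 LOC at 1.6 LOC/hour requires 1,500 hours, so the hypothetical task is on track; the same comparison, with drawings or verified requirements in place of LOC, applies to the hardware and systems engineering measures.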

II. Subjects: Ten subjects will be recruited for this study from 1) city managers, 2) public relations personnel, 3) marketing personnel, and 4) college/university professors. The inclusion/exclusion criteria limit subjects to Dallas-Fort Worth Metroplex city managers and university/college professors who collect performance measures.


The recruitment procedure will consist of sending an email to subjects. All email addresses will be obtained via public websites. See Attachment 1 for the recruiting message. A link to an on-line survey will be embedded in the recruiting email. After completing the on-line survey, subjects will be asked if they are willing to be interviewed anonymously. If so, a follow-up interview will be conducted with those subjects. Completing the survey poses very little risk to the subjects.

III. Procedures: Researchers will email the subjects and include a link to the online survey in the message (see Attachment 1). Subjects will be asked to complete the survey (see Attachment 2) and submit their answers. Another screen will then appear and ask the subjects if they are willing to be interviewed anonymously. If so, subjects will voluntarily provide their name and contact information so the researchers can arrange the interviews. The researchers will send an email to the subjects (see Attachment 3) to arrange a time and place for the interviews. See Attachment 4 for the interview questions. Interviews will take place in the subjects' offices and be conducted anonymously. No personal data will be collected during these interviews. There are no risks (physical, psychological, social, legal or economic) to any of the participants beyond those of daily life. The researchers will protect the identities of the subjects. The anticipated amount of time to collect this performance measurement data is three months (September-December 2014).

IV. Adverse Events and Liability: The research does not pose more than minimal risk to participants beyond that of everyday life; therefore, no liability plan is offered.

V. Consent Form: N/A.


References

Kajikawa, Yuya (2010). Promoting Interdisciplinary Research and Transdisciplinary Expertise for Realizing Sustainable Society. 978-1-890843-21-0/10, IEEE 2010, 1-6.

Hasna, Abdallah M. (2010). Embedding Sustainability in Capstone Engineering Design Projects. IEEE EDUCON Education Engineering, April 14-16, 2010, Madrid, Spain, 1601-1610.

Davis-Mendelow, Steven, PhD (2006). Sustainability for a Change: An Aerospace Perspective. 1-4244-0218-2/06, IEEE 2006, 1-4.

Amirkhanyan, Anna A. (2011). What is the Effect of Performance Measurement on Perceived Accountability Effectiveness in State and Local Government Contracts. Public Performance & Management Review, Vol. 35, No. 2, 303-339.

Patrick, Barbara A. and French, P. Edward (2011). Assessing New Public Management's Focus on Performance Measurement in the Public Sector. Public Performance & Management Review, Vol. 35, No. 2, 340-369.

Attachment 1. Recruiting Materials

Dear ______:

My name is Jon K. Ilseng and I am a PhD Candidate in the Texas Tech University Mechanical Engineering Department. I am researching performance measurement data that Educational Institutions, For-Profit/Not-for-Profit Companies and Government Agencies currently collect. This research will be focused on the Dallas/Fort-Worth, Texas Metropolitan area (referred to as the Metroplex System-of-Systems). Dr. Atila Ertas is the professor overseeing the research study.

I am contacting you because you are a (City Manager/University Professor) in the Dallas/Fort Worth area. I would like to ask you to participate in this research study. The

link below will take you to a short on-line survey asking questions that concern performance measurement data. The survey will take only 10-15 minutes of your time. Answering the survey questions is your choice, and you can skip questions or even quit at any time by closing the browser window.

When you are done with the questions, please click on the submit button. Your answers will not be tied to your name. Afterwards, another screen will appear asking whether you are willing to participate in an interview. The interview is also voluntary and will last about 30 minutes. I will meet you at a place of your choosing and a time convenient for you.

If you have any questions, please do not hesitate to call Jon K. Ilseng at 214-336-5033. Jon K. Ilseng can also be reached via the following email address: [email protected]. Texas Tech University also has a Board that protects the rights of people who participate in research. You can call to ask them questions at 806-742-2064 or send your questions to the Human Research Protection Program, Office of the Vice President for Research, Texas Tech University, Lubbock, Texas 79409, or you can email your questions to [email protected] .

Thank you for your time and consideration in helping collect this data. Please click on the link below to go to the survey.

Sincerely,

Dr. Atila Ertas, Mechanical Engineering Department, [email protected]
Jon K. Ilseng, PhD Candidate, Texas Tech University, [email protected]


Attachment 2. Survey

Thank you for participating in this research study. The purpose of the survey is to collect performance measurement data that educational institutions (e.g. community colleges, universities), for-profit and not-for-profit companies, and government organizations (city, county, state and federal) located in the Dallas-Fort Worth, Texas Metroplex currently use. The survey takes 10-15 minutes to complete. You may skip questions and quit the survey at any time by closing the browser window. At the end of the survey, you will be asked if you are willing to be contacted for an interview. If you are willing, please provide your name and email address so I can contact you. If you have questions, please contact Dr. Atila Ertas or Jon Ilseng at the following numbers: 214-336-5033 (cell) or 972-344-2580 (work).

1. Of the following choices, how would you define your current organization?

□ Educational Institution (Community College)
□ Educational Institution (University)
□ For-Profit Company
□ Not-for-Profit Company
□ Government Organization (City)
□ Government Organization (County)
□ Government Organization (State)
□ Government Organization (Federal)

2. Based on the selection in Question 1, please estimate the number of employees in your current organization.

□ 1-10
□ 11-20
□ 21-100
□ 101-250
□ 251 or more


3. What kind(s) of performance measurement is (are) implemented in your organization?

□ Financial Performance Measurement
□ Human Resource Performance Measurement
□ Customer Satisfaction Measurement
□ Process Management Measurement
□ Strategy Performance Measurement
□ Sustainability Measurement (impact on society)
□ Innovation Measurement
□ Other: ______

4. Which performance measurement model or tools are used in your organization?

□ ISO9000 Certification
□ Total Quality Management (TQM)
□ Baldrige Business Excellence criteria
□ Balanced Scorecard
□ Key Performance Indicator (KPI) System
□ Benchmarking Systems
□ Other: ______
□ None of the above

5. What are the initial reasons for your organization to implement its performance measurement system?

□ Estimating Customer Requirements
□ Providing feedback for people to monitor their own performance levels
□ Highlighting quality problems and determining which areas most need attention
□ To identify possible needs for changes in strategy
□ Justifying the use of resources
□ Providing feedback for driving the improvement effort
□ For decision support at the top-management level
□ For decision support at the operating level
□ To determine the bonus to management and/or staff
□ Other: ______

6. Please list the top five (5) most important performance measures in your organization and define them.

a.

b.

c.

d.

e.

7. For any of the top five (5) most important performance measures listed in Question 6, are they publicly accessible? If so, please explain how to retrieve this publicly accessible data.

SUBMIT – Thank you for completing the survey.

8. If you are willing to be interviewed regarding this survey, please provide your name and contact information so I can arrange a time and place to meet.


Name: ______

Email: ______

Attachment 3. Email for the Interview

Thank you very much for completing the on-line survey on performance measurement usage. It is a tremendous help for this dissertation research.

You stated that you are willing to be interviewed and this email is a follow-up. The interview should not take longer than 30 minutes and will be a discussion on the survey results you completed. Your participation is voluntary and the interview will be conducted anonymously.

The interview time and place is based on your availability. Please contact me so we can arrange a day and time. I can be reached at the following phone numbers: 214-336-5033 (cell), 972-344-2580 (work). I can also be reached at the following email addresses: [email protected] and [email protected].

Thank you very much for your time for this interview.

Regards,

Dr. Atila Ertas, Mechanical Engineering Department, [email protected]
Jon K. Ilseng, PhD Candidate, Texas Tech University, [email protected]

Attachment 4. Interview Questions


1. For Question 3, your survey results indicated that the following performance measurements are implemented in your organization (list the performance measurements from the applicable respondent’s survey). Please explain why these particular performance measurements are implemented and the importance of them.

2. For Question 4, your survey results indicated that the following performance measurement model or tools are used in your organization (list the model or tools from the applicable respondent’s survey). Please explain why your organization uses these specific model or tools.

3. For Question 5, you listed the following initial reasons as to why your organization implements its performance measurement system (list the initial reasons from the applicable respondent’s survey). Can you think of other reasons why your organization implements its performance measurement system?

4. For Question 6, you listed the following top five most important performance measures in your organization and the definition for each one (list the top five measures and definitions from the applicable respondent's survey). Can you think of other performance measures that your organization currently uses?

For reference during the interview, the on-line survey questions are as follows:

1. Of the following choices, how would you define your current organization?

□ Educational Institution (Community College)
□ Educational Institution (University)
□ For-Profit Company
□ Not-for-Profit Company
□ Government Organization (City)
□ Government Organization (County)
□ Government Organization (State)
□ Government Organization (Federal)

2. Based on the selection in Question 1, please estimate the number of employees in your current organization.

□ 1-10
□ 11-20
□ 21-100
□ 101-250
□ 251 or more

3. What kind(s) of performance measurement is (are) implemented in your organization?

□ Financial Performance Measurement
□ Human Resource Performance Measurement
□ Customer Satisfaction Measurement
□ Process Management Measurement
□ Strategy Performance Measurement
□ Sustainability Measurement (impact on society)
□ Innovation Measurement
□ Other: ______

4. Which performance measurement model or tools are used in your organization?

□ ISO9000 Certification
□ Total Quality Management (TQM)
□ Baldrige Business Excellence criteria
□ Balanced Scorecard
□ Key Performance Indicator (KPI) System
□ Benchmarking Systems
□ Other: ______
□ None of the above

5. What are the initial reasons for your organization to implement its performance measurement system?

□ Estimating Customer Requirements
□ Providing feedback for people to monitor their own performance levels
□ Highlighting quality problems and determining which areas most need attention
□ To identify possible needs for changes in strategy
□ Justifying the use of resources
□ Providing feedback for driving the improvement effort
□ For decision support at the top-management level
□ For decision support at the operating level
□ To determine the bonus to management and/or staff
□ Other: ______

6. Please list the top five (5) most important performance measures in your organization and define them.

a.

b.

c.

d.

e.


APPENDIX G - ONLINE SURVEY RESULTS


Table G-1 Online Survey Results

Twenty-six Dallas-Fort Worth Metroplex cities responded to the on-line survey. The survey questions were:

Question 1: Of the following choices, how would you define your current organization?
Question 2: Based on the selection in Question 1, please estimate the number of employees in your current organization.
Question 3: What kind(s) of performance measurement is (are) implemented in your organization?
Question 4: Which performance measurement model or tools are used in your organization?
Question 5: What are the initial reasons for your organization to implement its performance measurement system?
Question 6: Please list the top five most important performance measures in your organization and define them.
Question 7: For any of the top five most important performance measures listed in Question 6, are they publicly accessible? If so, please explain how to retrieve this publicly accessible data.
Question 8: If you are willing to be interviewed regarding this survey, please provide your name and email address.

The following responses were identical across all 26 respondents except where noted:

Question 1: Government Organization (City)
Question 2: 251 or more (City of Fairview: 21-100)
Question 5: Estimating customer requirements; highlighting quality problems and determining which areas most need attention; to identify possible needs for changes in strategy; justifying the use of resources; providing feedback for driving the improvement effort; for decision support at the top management level; for decision support at the operating level
Question 6: Cost/financial measure; customer satisfaction; sustainability; return on invested capital; measures relating to environmental issues (energy usage, water usage, etc.)
Question 7: Yes
Question 8: Left blank by all respondents except the City of Dallas: Cecilia Scheu, Performance Measure Coordinator, 214-671-5006, [email protected]

Questions 3 and 4 varied by respondent. Question 3 abbreviations: FIN = Financial Performance Measurement; CS = Customer Satisfaction Measurement; PM = Process Management Performance Measurement; SUS = Sustainability Performance Measurement (impact on society).

Respondent | Question 3 | Question 4
City of Dallas, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of Fort Worth, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of Plano, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of Allen, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of McKinney, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Richardson, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Garland, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Mesquite, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Carrollton, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Farmers Branch, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Arlington, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Grand Prairie, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Irving, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Lewisville, TX | FIN, CS, SUS | Benchmarking Systems
City of Colleyville, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Grapevine, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Keller, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of DeSoto, TX | FIN, CS | Benchmarking Systems
City of Mansfield, TX | FIN, CS, PM, SUS | Benchmarking Systems
City of Frisco, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of Denton, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of Coppell, TX | FIN, CS, PM, SUS | Key Performance Indicator (KPI) System
City of Hurst, TX | FIN, CS | Benchmarking Systems
City of Bedford, TX | FIN, CS | Benchmarking Systems
City of Euless, TX | FIN, CS | Benchmarking Systems
City of Fairview, TX | FIN, CS, SUS | Benchmarking Systems

219 Texas Tech University, Jon Ilseng, December 2015

City of Lucas, TX Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer Financial driving the Satisfaction, Performance improvement Sustainability, Measurement, effort, For Return on Customer decision Invested Satisfaction support at the Capital, Measurement, top Measures Sustainability management relating to Performance level, For Environmental Measurement decision issues (energy Government (impact on Benchmarking support at the usage, water Organization (City) 21-100 society) Systems operating level usage, etc.) Yes City of Addison, Estimating TX Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial Financial providing Measure, Performance feedback for Customer Measurement, driving the Satisfaction, Customer improvement Sustainability, Satisfaction effort, For Return on Measurement, decision Invested Sustainability support at the Capital, Performance top Measures Measurement management relating to (impact on Key level, For Environmental society), Performance decision issues (energy Government Customer Indication support at the usage, water Organization (City) 251 or more Satisfaction (KPI) operating level usage, etc.) Yes City of Rockwall, Estimating TX Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer Financial driving the Satisfaction, Performance improvement Sustainability, Measurement, effort, For Return on Customer decision Invested Satisfaction support at the Capital, Measurement, top Measures Sustainability management relating to Performance Key level, For Environmental Measurement Performance decision issues (energy Government (impact on Indication support at the usage, water Organization (City) 251 or more society) (KPI) operating level usage, etc.) Yes City of Southlake, Financial Estimating Cost/Financial TX Performance Customer Measure, Measurement, Requirements, Customer Customer Highlighting Satisfaction, Satisfaction quality Sustainability, Measurement, problems and Return on Process Key determining Invested Management Performance which areas Capital, Government Performance Indication most need Measures Organization (City) 251 or more Measurement, (KPI) attention, To relating to Yes

220 Texas Tech University, Jon Ilseng, December 2015

Sustainability identify Environmental Performance possible needs issues (energy Measurement for changes in usage, water (impact on strategy, usage, etc.) society) justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Collin County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability Benc decision issues (energy Organization Performance hmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes Dallas County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes Tarrant County Estimating Customer Requirements, Highlighting quality problems and determining Cost/Financial which areas Measure, Financial most need Customer Performance attention, To Satisfaction, Measurement, identify Sustainability, Performance possible needs Return on Measurement, for changes in Invested Process strategy, Capital, Management justifying the Measures Performance use of relating to Measurement, resources, Environmental Government Sustainaability providing issues (energy Organization Performance Benchmarking feedback for usage, water (County) 251 or more Measurement Systems driving the usage, etc.) Yes

221 Texas Tech University, Jon Ilseng, December 2015

improvement effort, For decision support at the top management level, For decision support at the operating level Denton County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes Ellis County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes Hunt County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes

222 Texas Tech University, Jon Ilseng, December 2015

Johnson County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes Parker County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes Rockwall County Estimating Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Government Sustainaability decision issues (energy Organization Performance Benchmarking support at the usage, water (County) 251 or more Measurement Systems operating level usage, etc.) Yes McKinney Financial Estimating Cost/Financial Habitat for Performance Customer Measure, Humanity Measurement, Requirements, Customer Performance Highlighting Satisfaction, Measurement, quality Sustainability, Process problems and Return on Management determining Invested Government Performance which areas Capital, Organization Measurement, Benchmarking most need Measures (County) 251 or more Sustainaability Systems attention, To relating to Yes

223 Texas Tech University, Jon Ilseng, December 2015

Performance identify Environmental Measurement possible needs issues (energy for changes in usage, water strategy, usage, etc.) justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Dallas Can Estimating Academy Customer Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs Cost/Financial for changes in Measure, Financial strategy, Customer Performance justifying the Satisfaction, Measurement, use of Sustainability, Performance resources, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Sustainability decision issues (energy Performance Benchmarking support at the usage, water Not-for-Profit 21-100 Measurement Systems operating level usage, etc.) Yes Friends of the Estimating Frisco Public Customer Library Requirements, Highlighting quality problems and determining which areas most need attention, To identify possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, Financial feedback for Customer Performance driving the Satisfaction, Measurement, improvement Sustainability, Performance effort, For Return on Measurement, decision Invested Process support at the Capital, Management top Measures Performance management relating to Measurement, level, For Environmental Sustainaability decision issues (energy Performance Benchmarking support at the usage, water Not-for-Profit 251 or more Measurement Systems operating level usage, etc.) Yes The Samaritan Financial Estimating Inn McKinney Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs Cost/Financial for changes in Measure, strategy, Customer justifying the Satisfaction, use of Sustainability, resources, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Benchmarking support at the usage, water Not-for-Profit 21-100 Systems operating level usage, etc.) No

224 Texas Tech University, Jon Ilseng, December 2015

Task Force Financial Estimating Dagger Performance Customer Foundation Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs Cost/Financial for changes in Measure, strategy, Customer justifying the Satisfaction, use of Sustainability, resources, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Benchmarking support at the usage, water Not-for-Profit 21-100 Systems operating level usage, etc.) No Texas Extension Financial Esti Education Performance matingCustom Foundation Measurement, er Performance Requirements, Measurement, Highlighting Process quality Management problems and Performance determining Measurement, which areas Sustainaability most need Performance attention, To Measurement identify possible needs Cost/Financial for changes in Measure, strategy, Customer justifying the Satisfaction, use of Sustainability, resources, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Benchmarking support at the usage, water Not-for-Profit 11-20 Systems operating level usage, etc.) No Texas Association Fina Estimating of Community ncial Customer College Performance Requirements, Foundation Inc. Measurement, Highlighting Performance quality Measurement, problems and Process determining Management which areas Performance most need Measurement, attention, To Sustainaability identify Performance possible needs Measurement for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Benchmarking support at the usage, water Not-for-Profit 21-100 Systems operating level usage, etc.) No Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Cost/Financial Management determining Measure, Performance which areas Customer Measurement, most need Satisfaction, Sustainaability attention, To Sustainability, Performance identify Return on Measurement possible needs Invested for changes in Capital, strategy, Measures justifying the relating to use of Environmental Collin County resources, issues (energy Healthcare Benchmarking providing usage, water Foundation Not-for-Profit 21-100 Systems feedback for usage, etc.) No

225 Texas Tech University, Jon Ilseng, December 2015

driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental McKinney decision issues (energy Education Benchmarking support at the usage, water Foundation Not-for-Profit 21-100 Systems operating level usage, etc.) No Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to McKinney level, For Environmental Community decision issues (energy Development Benchmarking support at the usage, water Corporation Not-for-Profit 101-250 Systems operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of Cost/Financial resources, Measure, providing Customer feedback for Satisfaction, driving the Sustainability, improvement Return on effort, For Invested decision Capital, support at the Measures top relating to management Environmental level, For issues (energy North Texas Food Benchmarking decision usage, water Bank Not-for-Profit 101-250 Systems support at the usage, etc.) Yes

226 Texas Tech University, Jon Ilseng, December 2015

operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Global Future Benchmarking support at the usage, water Institute Not-for-Profit 21-100 Systems operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Susan G. Komen decision issues (energy Breast Cancer Benchmarking support at the usage, water Foundation Not-for-Profit 251 or more Systems operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental North Texas decision issues (energy Clean Air Benchmarking support at the usage, water Coalition Not-for-Profit 251 or more Systems operating level usage, etc.) Yes Financial Estimating Cost/Financial Performance Customer Measure, Measurement, Requirements, Customer Performance Highlighting Satisfaction, Measurement, quality Sustainability, Process problems and Return on Management determining Invested Mary Kay Performance Benchmarking which areas Capital, Foundation Not-for-Profit 251 or more Measurement, Systems most need Measures Yes

227 Texas Tech University, Jon Ilseng, December 2015

Sustainaability attention, To relating to Performance identify Environmental Measurement possible needs issues (energy for changes in usage, water strategy, usage, etc.) justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Catholic Charities Benchmarking support at the usage, water of Dallas, Inc. Not-for-Profit 251 or more Systems operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Community ISD decision issues (energy Education Balanced support at the usage, water Foundation Not-for-Profit 251 or more Scorecard operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Cost/Financial Management determining Measure, Performance which areas Customer Measurement, most need Satisfaction, Sustainaability attention, To Sustainability, Performance identify Return on Measurement possible needs Invested for changes in Capital, Dallas Black strategy, Measures Chamber of justifying the relating to Commerce use of Environmental Business resources, issues (energy Development Benchmarking providing usage, water Corporation Not-for-Profit 251 or more Systems feedback for usage, etc.) No

228 Texas Tech University, Jon Ilseng, December 2015

driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Homeless decision issues (energy Veterans Services Benchmarking support at the usage, water of Dallas, Inc. Not-for-Profit 251 or more Systems operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs Cost/Financial for changes in Measure, strategy, Customer justifying the Satisfaction, use of Sustainability, resources, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Dallas Leadership Benchmarking support at the usage, water Foundation Not-for-Profit 101-250 Systems operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Balanced support at the usage, water Cisco Systems Not-for-Profit 101-250 Scorecard operating level usage, etc.) Yes Financial Estimating Cost/Financial Performance Customer Measure, Measurement, ISO9000 Requirements, Customer Texas Instruments For-Profit 251 or more Performance Certification Highlighting Satisfaction, yes

229 Texas Tech University, Jon Ilseng, December 2015

Measurement, quality Sustainability, Process problems and Return on Management determining Invested Performance which areas Capital, Measurement, most need Measures Sustainaability attention, To relating to Performance identify Environmental Measurement possible needs issues (energy for changes in usage, water strategy, usage, etc.) justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy ISO9000 support at the usage, water Lockheed Martin For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Total Quality support at the usage, water Oracle For-Profit 251 or more Management operating level usage, etc.) Yes Financial Estimating Cost/Financial Performance Customer Measure, Measurement, Requirements, Customer Performance Highlighting Satisfaction, Measurement, quality Sustainability, Process problems and Return on Management determining Invested Performance which areas Capital, Measurement, most need Measures Sustainaability attention, To relating to Performance identify Environmental Measurement possible needs issues (energy Total Quality for changes in usage, water Ericcsson For-Profit 251 or more Management strategy, usage, etc.) Yes

230 Texas Tech University, Jon Ilseng, December 2015

justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy ISO9000 support at the usage, water Hewlett-Packard For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy ISO9000 support at the usage, water USAA For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Cost/Financial Measurement possible needs Measure, for changes in Customer strategy, Satisfaction, justifying the Sustainability, use of Return on resources, Invested providing Capital, feedback for Measures driving the relating to improvement Environmental effort, For issues (energy Total Quality decision usage, water AT&T For-Profit 251 or more Management support at the usage, etc.) Yes

231 Texas Tech University, Jon Ilseng, December 2015

top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Total Quality support at the usage, water Monitronics For-Profit 251 or more Management operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Research in ISO9000 support at the usage, water Motion For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy ISO9000 support at the usage, water I2 Technologies For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Cost/Financial Performance Customer Measure, L3 Mustang Measurement, Total Quality Requirements, Customer Technology For-Profit 251 or more Performance Management Highlighting Satisfaction, Yes

232 Texas Tech University, Jon Ilseng, December 2015

Measurement, quality Sustainability, Process problems and Return on Management determining Invested Performance which areas Capital, Measurement, most need Measures Sustainaability attention, To relating to Performance identify Environmental Measurement possible needs issues (energy for changes in usage, water strategy, usage, etc.) justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy ISO9000 support at the usage, water Intuit For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Affiliated decision issues (energy Computer ISO9000 support at the usage, water Services For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Cost/Financial Performance Customer Measure, Measurement, Requirements, Customer Performance Highlighting Satisfaction, Measurement, quality Sustainability, Process problems and Return on Management determining Invested Performance which areas Capital, Measurement, most need Measures Sustainaability attention, To relating to Performance identify Environmental Measurement possible needs issues (energy ISO9000 for changes in usage, water Radioshack For-Profit 251 or more Certification strategy, usage, etc.) Yes

233 Texas Tech University, Jon Ilseng, December 2015

justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Bank of America, ISO9000 support at the usage, water Corp. For-Profit 251 or more Certification operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Total Quality support at the usage, water Raytheon For-Profit 251 or more Management operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Cost/Financial Measurement possible needs Measure, for changes in Customer strategy, Satisfaction, justifying the Sustainability, use of Return on resources, Invested providing Capital, feedback for Measures driving the relating to improvement Environmental effort, For issues (energy ISO9000 decision usage, water Celanese For-Profit 251 or more Certification support at the usage, etc.) Yes

234 Texas Tech University, Jon Ilseng, December 2015

top management level, For decision support at the operating level Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy MetroPCS Balanced support at the usage, water Communications For-Profit 251 or more Scorecard operating level usage, etc.) Yes Fina Estimating ncial Customer Performance Requirements, Measurement, Highlighting Performance quality Measurement, problems and Process determining Management which areas Performance most need Measurement, attention, To Sustainaability identify Performance possible needs Measurement for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental decision issues (energy Balanced support at the usage, water Kimberly-Clark For-Profit 251 or more Scorecard operating level usage, etc.) Yes 251 or more Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Southern decision issues (energy Methodist ISO9000 support at the usage, water University For-Profit Certification operating level usage, etc.) Yes 251 or more Financial Key Estimating Cost/Financial Educational Performance Performance Customer Measure, Texas Christian Institution Measurement, Indication Requirements, Customer University (University) Performance (KPI) Highlighting Satisfaction, Yes

235 Texas Tech University, Jon Ilseng, December 2015

Measurement, quality Sustainability, Process problems and Return on Management determining Invested Performance which areas Capital, Measurement, most need Measures Sustainaability attention, To relating to Performance identify Environmental Measurement possible needs issues (energy for changes in usage, water strategy, usage, etc.) justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level 251 or more Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to Key level, For Environmental Educational Performance decision issues (energy Institution Indication support at the usage, water UT-Dallas (University) (KPI) operating level usage, etc.) Yes 251 or more Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Educational decision issues (energy Institution Total Quality support at the usage, water UT-Arlington (University) Management operating level usage, etc.) Yes 251 or more Financial Estimating Cost/Financial Performance Customer Measure, Measurement, Requirements, Customer Performance Highlighting Satisfaction, Measurement, quality Sustainability, Process problems and Return on Management determining Invested Performance which areas Capital, Measurement, most need Measures Sustainaability attention, To relating to Performance identify Environmental Educatio Measurement possible needs issues (energy University of nalInstitution Total Quality for changes in usage, water North Texas (University) Management strategy, usage, etc.) Yes

236 Texas Tech University, Jon Ilseng, December 2015

justifying the use of resources, providing feedback for driving the improvement effort, For decision support at the top management level, For decision support at the operating level 251 or more Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to level, For Environmental Collin County Educational decision issues (energy Community Institution Total Quality support at the usage, water College (University) Management operating level usage, etc.) Yes 251 or more Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to Educational level, For Environmental Tarrant County Institution decision issues (energy Community (Community Balanced support at the usage, water College College) Scorecard operating level usage, etc.) Yes 251 or more Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Cost/Financial Measurement possible needs Measure, for changes in Customer strategy, Satisfaction, justifying the Sustainability, use of Return on resources, Invested providing Capital, feedback for Measures driving the relating to Educational improvement Environmental Richland Institution effort, For issues (energy Community (Community Balanced decision usage, water College College) Scorecard support at the usage, etc.) Yes

237 Texas Tech University, Jon Ilseng, December 2015

top management level, For decision support at the operating level 251 or more Fina Estimating ncial Customer Performance Requirements, Measurement, Highlighting Performance quality Measurement, problems and Process determining Management which areas Performance most need Measurement, attention, To Sustainaability identify Performance possible needs Measurement for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to Educational level, For Environmental El Centro Institution decision issues (energy Community (Community Balanced support at the usage, water College College) Scorecard operating level usage, etc.) Yes Financial Estimating Performance Customer Measurement, Requirements, Performance Highlighting Measurement, quality Process problems and Management determining Performance which areas Measurement, most need Sustainaability attention, To Performance identify Measurement possible needs for changes in strategy, justifying the use of resources, Cost/Financial providing Measure, feedback for Customer driving the Satisfaction, improvement Sustainability, effort, For Return on decision Invested support at the Capital, top Measures management relating to Educational level, For Environmental Broohaven Institution decision issues (energy Community (Community Total Quality support at the usage, water College College) 251 or more Management operating level usage, etc.) Yes


APPENDIX H COST LOGISTICS INDEX SURVEY RESULTS


CPI = Cost Performance Index; LPI = Logistics Performance Index; CLI = Cost Logistics Index.

Organization | CPI | LPI | CLI
Global Future Institute | 1.25 | 1.94 | 0.64433
Dallas Can Academy | 1.20 | 2.37 | 0.506329
The Samaritan Inn McKinney | 1.20 | 2.38 | 0.504202
North Texas Clean Air Coalition | 1.10 | 2.19 | 0.502283
Mary Kay Foundation | 1.30 | 2.66 | 0.488722
Texas Extension Education Foundation | 1.11 | 2.28 | 0.486842
Collin County Healthcare Foundation | 1.12 | 2.31 | 0.484848
Community ISD Education Foundation | 1.02 | 2.19 | 0.465753
Dallas Black Chamber of Commerce Business Development Corporation | 1.06 | 2.33 | 0.454936
Wise County | 0.99 | 2.21 | 0.447964
City of Hurst, TX | 1.20 | 2.69 | 0.446097
North Texas Food Bank | 1.10 | 2.5 | 0.44
Collin County | 1.10 | 2.51 | 0.438247
Johnson County | 1.10 | 2.52 | 0.436508
Task Force Dagger Foundation | 0.98 | 2.27 | 0.431718
Catholic Charities of Dallas, Inc. | 1.25 | 2.95 | 0.423729
McKinney Habitat for Humanity | 1.10 | 2.6 | 0.423077
Tarrant County | 1.10 | 2.62 | 0.419847
Susan G. Komen Breast Cancer Foundation | 0.99 | 2.38 | 0.415966
Hunt County | 1.05 | 2.53 | 0.41502
RadioShack | 0.95 | 2.31 | 0.411255
Dallas Leadership Foundation | 0.96 | 2.37 | 0.405063
Texas Association of Community College Foundation Inc. | 1.00 | 2.51 | 0.398406
Parker County | 0.99 | 2.5 | 0.396
Homeless Veterans Services of Dallas, Inc. | 1.05 | 2.66 | 0.394737
McKinney Community Development Corporation | 0.99 | 2.51 | 0.394422
City of Keller, TX | 1.06 | 2.71 | 0.391144
Ellis County | 0.98 | 2.51 | 0.390438
City of Grapevine, TX | 1.20 | 3.15 | 0.380952
UT-Dallas | 0.99 | 2.62 | 0.377863
McKinney Education Foundation | 0.98 | 2.6 | 0.376923
Texas Christian University | 0.99 | 2.63 | 0.376426
Denton County | 1.20 | 3.21 | 0.373832
Rockwall County | 1.01 | 2.71 | 0.372694
Monitronics | 1.10 | 2.99 | 0.367893
City of Euless, TX | 1.10 | 3.01 | 0.365449
University of North Texas | 0.97 | 2.66 | 0.364662
AT&T | 1.30 | 3.58 | 0.363128
Cisco Systems | 1.10 | 3.15 | 0.349206
City of Bedford, TX | 1.03 | 2.98 | 0.345638
City of Denton, TX | 1.03 | 3.02 | 0.34106
Dallas County | 0.98 | 2.89 | 0.3391
Southern Methodist University | 0.98 | 2.89 | 0.3391
City of Lewisville, TX | 1.01 | 2.99 | 0.337793
UT-Arlington | 0.97 | 2.89 | 0.33564
Friends of the Frisco Public Library | 0.99 | 2.95 | 0.335593
City of Colleyville, TX | 1.02 | 3.07 | 0.332248
City of Mansfield, TX | 1.00 | 3.02 | 0.331126
City of Fairview, TX | 1.33 | 4.02 | 0.330846
City of Southlake, TX | 1.30 | 3.99 | 0.325815
City of Irving, TX | 0.98 | 3.01 | 0.325581
City of Grand Prairie, TX | 0.97 | 3.01 | 0.322259
City of Addison, TX | 1.20 | 3.76 | 0.319149
City of Lucas, TX | 1.25 | 4 | 0.3125
USAA | 1.25 | 4.02 | 0.310945
Tarrant County Community College | 1.10 | 3.54 | 0.310734
City of Garland, TX | 1.00 | 3.25 | 0.307692
City of Richardson, TX | 1.00 | 3.32 | 0.301205
City of Mesquite, TX | 0.98 | 3.27 | 0.299694
City of Rockwall, TX | 1.10 | 3.74 | 0.294118
Oracle | 1.20 | 4.10 | 0.292683
City of Carrollton, TX | 1.10 | 3.82 | 0.287958
Research in Motion | 1.00 | 3.48 | 0.287356
Raytheon | 1.20 | 4.18 | 0.287081
MetroPCS Communications | 1.10 | 3.86 | 0.284974
Affiliated Computer Services | 0.99 | 3.48 | 0.284483
Collin County Community College | 1.10 | 3.91 | 0.28133
Bank of America, Corp. | 0.99 | 3.54 | 0.279661
City of Arlington, TX | 1.00 | 3.58 | 0.27933
City of Coppell, TX | 1.15 | 4.19 | 0.274463
City of Frisco, TX | 1.02 | 3.75 | 0.272
Ericsson | 1.01 | 3.82 | 0.264398
Texas Instruments | 1.03 | 3.91 | 0.263427
City of Farmers Branch, TX | 0.99 | 3.76 | 0.263298
Intuit | 1.10 | 4.18 | 0.263158
City of Fort Worth, TX | 0.99 | 3.79 | 0.261214
Celanese | 0.99 | 3.82 | 0.259162
Brookhaven Community College | 0.99 | 3.82 | 0.259162
City of Allen, TX | 1.00 | 3.86 | 0.259067
Hewlett-Packard | 0.97 | 3.76 | 0.257979
Richland Community College | 0.99 | 3.84 | 0.257813
City of DeSoto, TX | 0.96 | 3.75 | 0.256
City of McKinney, TX | 1.00 | 3.92 | 0.255102
City of Plano, TX | 0.97 | 3.82 | 0.253927
I2 Technologies | 0.95 | 3.75 | 0.253333
City of Dallas, TX | 1.02 | 4.06 | 0.251232
Kimberly-Clark | 0.99 | 3.99 | 0.24812
Lockheed Martin | 0.99 | 4.00 | 0.2475
L3 Mustang Technology | 0.99 | 4.00 | 0.2475
El Centro Community College | 0.98 | 3.99 | 0.245614
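In every row of the table, the CLI equals the cost index divided by the logistics index (CLI = CPI / LPI), so a lower LPI at a given CPI yields a higher CLI. A minimal Python sketch (the three (CPI, LPI) pairs below are sampled from the table above) that reproduces the tabulated CLI values:

    # Recompute the Cost Logistics Index (CLI = CPI / LPI) for three sampled rows.
    rows = [
        ("Global Future Institute", 1.25, 1.94),      # tabulated CLI: 0.64433
        ("North Texas Food Bank", 1.10, 2.5),         # tabulated CLI: 0.44
        ("El Centro Community College", 0.98, 3.99),  # tabulated CLI: 0.245614
    ]
    for name, cpi, lpi in rows:
        cli = cpi / lpi
        print(f"{name}: CLI = {cli:.6f}")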


[Figure: CPI, LPI, and CLI plotted for each surveyed organization; the underlying values are tabulated above.]


APPENDIX I PERFORMANCE LOGISTICS INDEX SURVEY RESULTS


Organization  Performance Index  LPI  PLI
Global Future Institute  0.85  2.24  0.379464
McKinney Community Development Corporation  0.91  2.51  0.36255
Task Force Dagger Foundation  0.82  2.27  0.361233
Ellis County  0.89  2.51  0.354582
Susan G. Komen Breast Cancer Foundation  0.83  2.38  0.348739
McKinney Education Foundation  0.90  2.60  0.346154
University of North Texas  0.92  2.66  0.345865
Parker County  0.86  2.50  0.344
Tarrant County  0.90  2.62  0.343511
Rockwall County  0.93  2.71  0.343173
UT-Dallas  0.89  2.62  0.339695
City of Keller, TX  0.92  2.71  0.339483
Dallas Leadership Foundation  0.79  2.37  0.333333
Hunt County  0.84  2.53  0.332016
Texas Christian University  0.87  2.63  0.330798
Texas Association of Community College Foundation Inc.  0.83  2.51  0.330677
Dallas County  0.95  2.89  0.32872
RadioShack  0.75  2.31  0.324675
Johnson County  0.80  2.52  0.31746
Southern Methodist University  0.91  2.89  0.314879
McKinney Habitat for Humanity  0.81  2.60  0.311538
Catholic Charities of Dallas, Inc.  0.90  2.95  0.305085
Cisco Systems  0.96  3.15  0.304762
City of Bedford, TX  0.89  2.98  0.298658
Homeless Veterans Services of Dallas, Inc.  0.79  2.66  0.296992
City of Irving, TX  0.89  3.01  0.295681
City of Grapevine, TX  0.93  3.15  0.295238
City of Lewisville, TX  0.88  2.99  0.294314
UT-Arlington  0.85  2.89  0.294118
City of Denton, TX  0.88  3.02  0.291391
North Texas Food Bank  0.90  3.16  0.28481
Mary Kay Foundation  0.90  3.20  0.28125
Denton County  0.90  3.21  0.280374
City of Euless, TX  0.83  3.01  0.275748
City of Grand Prairie, TX  0.83  3.01  0.275748
City of Mansfield, TX  0.83  3.02  0.274834
Dallas Black Chamber of Commerce Business Development Corporation  0.86  3.13  0.27476
Collin County Healthcare Foundation  0.83  3.09  0.268608
City of Richardson, TX  0.89  3.32  0.268072
Monitronics  0.80  2.99  0.267559
AT&T  0.95  3.58  0.265363
City of Garland, TX  0.86  3.25  0.264615
Friends of the Frisco Public Library  0.78  2.95  0.264407
City of Colleyville, TX  0.81  3.07  0.263844
Bank of America, Corp.  0.92  3.54  0.259887
Affiliated Computer Services  0.90  3.48  0.258621
City of Mesquite, TX  0.84  3.27  0.256881
Wise County  0.90  3.51  0.25641
City of Hurst, TX  0.98  3.85  0.254545
Collin County  0.98  3.89  0.251928
MetroPCS Communications  0.96  3.86  0.248705
City of Farmers Branch, TX  0.93  3.76  0.24734
Ericsson  0.94  3.82  0.246073
City of Arlington, TX  0.88  3.58  0.24581
City of Frisco, TX  0.92  3.75  0.245333
Hewlett-Packard  0.92  3.76  0.244681
City of Plano, TX  0.93  3.82  0.243455
City of Rockwall, TX  0.91  3.74  0.243316
Texas Instruments  0.95  3.91  0.242967
City of Addison, TX  0.91  3.76  0.242021
North Texas Clean Air Coalition  0.81  3.37  0.240356
Tarrant County Community College  0.85  3.54  0.240113
City of Fort Worth, TX  0.90  3.79  0.237467
USAA  0.95  4.02  0.236318
City of Allen, TX  0.91  3.86  0.235751
Research in Motion  0.82  3.48  0.235632
Brookhaven Community College  0.90  3.82  0.235602
City of Dallas, TX  0.95  4.06  0.23399
Collin County Community College  0.91  3.91  0.232737
Oracle  0.95  4.10  0.231707
City of Carrollton, TX  0.88  3.82  0.230366
Celanese  0.88  3.82  0.230366
L3 Mustang Technology  0.91  4.00  0.2275
Raytheon  0.95  4.18  0.227273
City of McKinney, TX  0.89  3.92  0.227041
Lockheed Martin  0.90  4.00  0.225
City of Lucas, TX  0.92  4.11  0.223844
City of Southlake, TX  0.88  3.99  0.220551
El Centro Community College  0.88  3.99  0.220551
City of DeSoto, TX  0.81  3.75  0.216
Community ISD Education Foundation  0.75  3.49  0.2149
City of Coppell, TX  0.90  4.19  0.214797
Intuit  0.89  4.18  0.212919
The Samaritan Inn McKinney  0.80  3.76  0.212766
Texas Extension Education Foundation  0.82  3.87  0.211886
Kimberly-Clark  0.84  3.99  0.210526
City of Fairview, TX  0.84  4.02  0.208955
Richland Community College  0.80  3.84  0.208333
i2 Technologies  0.78  3.75  0.208
Dallas Can Academy  0.75  3.84  0.195313
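As with the CLI values in the preceding appendix, each PLI value is consistent with the ratio of the surveyed Performance Index to the LPI; for example, Global Future Institute: 0.85 / 2.24 ≈ 0.379464, matching the tabulated value.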


[Figure: Performance Index, LPI, and PLI survey results charted by organization. Only the axis scale and axis labels were recoverable from the source charts.]


APPENDIX J AVAILABILITY LOGISTICS INDEX SURVEY RESULTS


Organization  Availability  LPI  ALI
Task Force Dagger Foundation  0.95  2.27  0.418502
Global Future Institute  0.92  2.24  0.410714
Dallas Leadership Foundation  0.93  2.37  0.392405
McKinney Community Development Corporation  0.91  2.51  0.36255
McKinney Habitat for Humanity  0.94  2.60  0.361538
Texas Association of Community College Foundation Inc.  0.90  2.51  0.358566
Johnson County  0.90  2.52  0.357143
Susan G. Komen Breast Cancer Foundation  0.85  2.38  0.357143
UT-Dallas  0.93  2.62  0.354962
Ellis County  0.89  2.51  0.354582
Hunt County  0.89  2.53  0.351779
RadioShack  0.81  2.31  0.350649
McKinney Education Foundation  0.90  2.60  0.346154
University of North Texas  0.92  2.66  0.345865
Parker County  0.86  2.50  0.344
Tarrant County  0.90  2.62  0.343511
Rockwall County  0.93  2.71  0.343173
City of Keller, TX  0.92  2.71  0.339483
Texas Christian University  0.87  2.63  0.330798
Dallas County  0.95  2.89  0.32872
Southern Methodist University  0.91  2.89  0.314879
Homeless Veterans Services of Dallas, Inc.  0.83  2.66  0.31203
Collin County Healthcare Foundation  0.95  3.09  0.307443
City of Bedford, TX  0.91  2.98  0.305369
Catholic Charities of Dallas, Inc.  0.90  2.95  0.305085
Cisco Systems  0.96  3.15  0.304762
Friends of the Frisco Public Library  0.88  2.95  0.298305
City of Denton, TX  0.90  3.02  0.298013
City of Irving, TX  0.89  3.01  0.295681
City of Lewisville, TX  0.88  2.99  0.294314
UT-Arlington  0.85  2.89  0.294118
City of Euless, TX  0.88  3.01  0.292359
City of Colleyville, TX  0.89  3.07  0.289902
City of Grapevine, TX  0.91  3.15  0.288889
North Texas Food Bank  0.91  3.16  0.287975
Monitronics  0.85  2.99  0.284281
Mary Kay Foundation  0.90  3.20  0.28125
Denton County  0.90  3.21  0.280374
City of Grand Prairie, TX  0.83  3.01  0.275748
City of Mansfield, TX  0.83  3.02  0.274834
Dallas Black Chamber of Commerce Business Development Corporation  0.86  3.13  0.27476
City of Garland, TX  0.89  3.25  0.273846
City of Richardson, TX  0.89  3.32  0.268072
AT&T  0.95  3.58  0.265363
Bank of America, Corp.  0.92  3.54  0.259887
Affiliated Computer Services  0.90  3.48  0.258621
Texas Extension Education Foundation  0.99  3.87  0.255814
Community ISD Education Foundation  0.89  3.49  0.255014
Collin County  0.98  3.89  0.251928
The Samaritan Inn McKinney  0.94  3.76  0.25
City of Mesquite, TX  0.93  3.73  0.24933
MetroPCS Communications  0.96  3.86  0.248705
City of Farmers Branch, TX  0.93  3.76  0.24734
Ericsson  0.94  3.82  0.246073
City of Arlington, TX  0.88  3.58  0.24581
City of Frisco, TX  0.92  3.75  0.245333
Hewlett-Packard  0.92  3.76  0.244681
Research in Motion  0.85  3.48  0.244253
City of Plano, TX  0.93  3.82  0.243455
City of Rockwall, TX  0.91  3.74  0.243316
Texas Instruments  0.95  3.91  0.242967
City of Addison, TX  0.91  3.76  0.242021
City of Hurst, TX  0.93  3.85  0.241558
North Texas Clean Air Coalition  0.81  3.37  0.240356
Tarrant County Community College  0.85  3.54  0.240113
City of DeSoto, TX  0.90  3.75  0.24
City of Fort Worth, TX  0.90  3.79  0.237467
Dallas Can Academy  0.91  3.84  0.236979
USAA  0.95  4.02  0.236318
City of Allen, TX  0.91  3.86  0.235751
Brookhaven Community College  0.90  3.82  0.235602
Richland Community College  0.90  3.84  0.234375
City of Dallas, TX  0.95  4.06  0.23399
Wise County  0.90  3.85  0.233766
Collin County Community College  0.91  3.91  0.232737
Oracle  0.95  4.10  0.231707
El Centro Community College  0.90  3.89  0.231362
Kimberly-Clark  0.92  3.99  0.230576
City of Carrollton, TX  0.88  3.82  0.230366
Celanese  0.88  3.82  0.230366
L3 Mustang Technology  0.91  4.00  0.2275
Raytheon  0.95  4.18  0.227273
City of McKinney, TX  0.89  3.92  0.227041
Lockheed Martin  0.90  4.00  0.225
City of Lucas, TX  0.92  4.11  0.223844
City of Southlake, TX  0.88  3.99  0.220551
City of Coppell, TX  0.90  4.19  0.214797
i2 Technologies  0.80  3.75  0.213333
Intuit  0.89  4.18  0.212919
City of Fairview, TX  0.84  4.02  0.208955
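Each ALI value is likewise consistent with the surveyed Availability score divided by the LPI; for example, Task Force Dagger Foundation: 0.95 / 2.27 ≈ 0.418502, matching the tabulated value.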


[Figure: Availability, LPI, and ALI survey results charted by organization. Only the axis scale and organization labels were recoverable from the source charts.]
