
A Dissertation

entitled

Big and Small Data for Value Creation and Delivery: Case for Manufacturing Firms

By

Blaine David Stout

Submitted to the Graduate Faculty as partial fulfillment of the requirements for the

Doctor of Philosophy Degree in Manufacturing and Technology Management

______Dr. Paul C. Hong, Major Advisor

______Dr. An Chung Cheng, Committee Member

______Dr. Thomas S. Sharkey, Committee Member

______Dr. Steven A. Wallace, Committee Member

______Dr. Amanda C. Bryant-Friedrich, Dean, College of Graduate Studies

The University of Toledo

December 2018

Copyright 2018 ©, Blaine David Stout

This document is copyrighted material. Under copyright law, no parts of this document may be reproduced without the expressed permission of the author.

An Abstract of

Big and Small Data for Value Creation and Delivery: Case for Manufacturing Firms

By

Blaine David Stout

Submitted to the Graduate Faculty as partial fulfillment of the requirements for the

Doctor of Philosophy Degree in Manufacturing and Technology Management

The University of Toledo

November 2018

Today's small-market and mid-market manufacturers face increasing competitive pressure to capture, integrate, operationalize, and manage diverse sources of digitized data.

Many have made significant investments in data technologies with the objective of improving organization performance, yet not all have realized demonstrable benefits that create organization value. One simple question arises: do business-analytics make a difference in company performance in today's information-intensive environment? The purpose of this research is to explore that question through the lens of the data-centric pressures placed on management that drive the invested use of data-technologies: how these drivers shape management's influence in adopting a digitized organization mindset, affecting data practices, shaping key processes and strategies, and leading to capabilities growth that impacts performance and culture. The terms 'Big Data' and 'Small Data' are two of the most prolifically used phrases in today's discussions of business analytics and the value data provides to organization performance. Big Data, strategic to organization decision-making, and Small Data, operational, are captured from a host of internal and external sources. Studying how business-analytics are leveraged into organizational value is of research benefit to academic and practitioner audiences alike. The research on 'Big and Small Data and business analytics' is varied and deep, originating from a host of academic and non-academic sources; however, few empirical studies deeply examine the phenomena as experienced in the manufacturing environment. Exploring the pressures managers face in adopting data-centric managing beliefs and applied practices, and understanding the key value-creating process and strategy mechanisms at work in the organization, thus provides generalizable insights contributing to the pool of knowledge on how data-technology investments affect organizational culture and performance outcomes. The exploratory, theory-building phase of the research uses case studies to examine topics of interest and to uncover others, adding to the richness of the study on which a research model is built. To hear the voice of practice, multiple in-depth, semi-structured interviews were conducted among senior managers of 10 regional manufacturers located in the central Midwest United States. The confirmatory phase first reviews literature on the topics revealed in the research model, augmenting the depth of data on which to construct a large-scale survey instrument. Second, a survey-study is developed and conducted among 333 managers of manufacturers located across North America. The results are presented in two forms, multiple regression analysis and structural equation modeling, both demonstrating the moderating impact of executive management influence and of data accessibility and use mechanisms on organization performance in the form of capabilities growth. The research presents a data-centric management influence model generalizing the effect of management influence and investment-making on data-technologies, thereby enabling a data-centric mindset, or culture, that maintains and sustains organization value in knowledge- or digital-intense competitive environments.


To my most loving wife, Linda: without her steadfast support, giving of time, care, insights, laughter, smiles, understanding patience, and persistent encouragement to complete this doctoral pursuit, this achievement would not have been possible.

With my deepest heartfelt appreciation, Thank You


ACKNOWLEDGEMENTS

Life is about learning; learning requires teachers, teachers who appear at the right time in our lives to make the learning experience possible. Some are present, some past, some in spirit. To those present, my most grateful appreciation goes to my dissertation chair, Dr. Paul Hong, one of the first professors I met when joining the PhD program. Dr. Hong has a unique, engaging curiosity about people, and when asked about my interests in that first meeting, he believed we would be doing some 'great research together'. His mentoring, patience, and wisdom since have been invaluable on this learning experience.

To the committee members: Dr. Thomas Sharkey, a mentor, friend, and supporter of this doctoral pursuit, whom I have known since my MBA days here at the University and whose guidance I have always appreciated. Dr. An Chung Cheng, whose involvement from the language arts disciplines adds to the committee an affinitive dimension, gleaned from global perspectives on how we as people communicate, is likewise most appreciated. Dr. Steven Wallace, whose understanding of case study research and service as a sounding board on business analytics topics of interest has been most helpful in this effort. To all of the professors whose seminars shaped my learning, my understanding of research, and my writing of papers (many they were) in preparation for this event, my grateful appreciation. To the University of Toledo and the College of Business and Innovation, its deans and PhD program directors, past and present, for granting me the wonderful opportunity to study with the best and brightest colleagues from distances near and far.

From the practical side of learning, the case study participants, whose names are withheld for reasons of anonymity, are thanked for contributing to the richness of discovery found in those meetings and conversations. Grateful appreciation goes to a dear friend and professional colleague, Rob Bleile, whose information systems and technology insights, knowledge of data collection, research, and analytics, together with career manufacturing experience, were instrumental in this research effort; thank you. Those past are mentors from prior professional careers in manufacturing, who provided opportunities to serve at high levels of responsibility and to learn the best forms of leadership by which an organization effectively functions; I am very grateful for those learning experiences. To those in spirit, my mom and dad, my wife's mom and dad, and a host of other relatives and friends who I know have an unseen role in this achievement.

To the wisdom of my faith, that God guides us on paths best for our journey in this life: it is meant as it is to be. Thank you.


TABLE OF CONTENTS


ACKNOWLEDGEMENTS ...... iv

TABLE OF CONTENTS ...... vi

List of Tables ...... x

List of Figures ...... xii

CHAPTER 1: INTRODUCTION ...... 1

1.1 Research importance ...... 1

1.2 Research contribution ...... 4

1.3 Essay organization ...... 6

CHAPTER 2: VOICE OF THE MANUFACTURER ...... 8

2.0 Grounded theory ...... 8

2.1 Case Study Research ...... 11

2.1.1 Case study process ...... 13

2.1.2 Case selection ...... 14

2.1.3 Field interviews ...... 15

2.1.4 Coding and analysis ...... 16

2.1.5 Data structure ...... 16

2.2 Data-centric Adoptive Drivers ...... 18

2.2.1 Aggregate dimension: Performance Pressure ...... 19

2.2.2 Aggregate dimension: Competitive Pressure ...... 25

2.2.3 Aggregate dimension: Innovation Pressure ...... 28

2.2.4 Aggregate dimension: Cyber-security Pressure ...... 31

2.3 Moderation mechanism: Management influence ...... 35

2.4 Moderation mechanism: Data accessibility and use ...... 39

2.5 Data-integration practices ...... 46

2.5.1 Aggregate dimension: Strategic-level ...... 47

2.5.2 Aggregate dimension: Operational-level ...... 53

2.5.3 Aggregate dimension: IIOT-level ...... 57


2.5.4 Aggregate dimension: Data-security-level ...... 60

2.6 Data-actuation, Key Processes ...... 62

2.6.1 Aggregate dimension: Productivity processes ...... 63

2.6.2 Aggregate dimension: Planning processes ...... 69

2.6.3 Aggregate dimension: Innovation processes ...... 74

2.6.4 Aggregate dimension: Data governance processes ...... 80

2.7 Organization performance outcomes ...... 81

2.7.1 Organization capabilities growth ...... 83

2.7.2 Planned and or Market capabilities growth ...... 86

2.7.3 Aggregate dimension: Innovation capabilities growth ...... 89

2.8 Research Model: The data-centric eco-system of a manufacturer ...... 91

2.8.1 Final aggregation ...... 92

CHAPTER 3: VOICE OF LITERATURE ...... 97

3.0 Theory ...... 97

3.0.1 From the RBV perspective ...... 98

3.0.2 From the KBV perspective ...... 100

3.0.3 From the Organization Learning perspective ...... 102

3.0.4 Technology organization environment perspective ...... 103

3.1 Executive Management influence ...... 104

3.1.1 Data-analytics driven enterprise ...... 107

3.1.2 Barriers to a data-driven culture change ...... 113

3.1.3 Implementing a data-driven culture ...... 115

3.1.4 Making the data investment ...... 116

3.1.5 Tangible value of data ...... 117

3.1.6 Construct and definition ...... 123

3.2 Data accessibility and use mechanisms ...... 123

3.2.1 Data-use ...... 128

3.2.2 Data accessibility ...... 132

3.2.3 Data management ...... 133

3.2.4 Data analytics and business analytics ...... 133

3.2.5 Construct and definition ...... 136

3.3 Knowledge intense environments driving data-analytics adoption ...... 136

3.3.1 Performance pressure ...... 138


3.3.2 Competitive pressure ...... 140

3.3.3 Innovation pressure ...... 141

3.3.4 Cyber-security pressure ...... 143

3.3.5 Constructs and definitions ...... 147

3.4 Data-analytics integration practices ...... 147

3.4.1 Strategic-level practice ...... 149

3.4.2 Operation-level practice ...... 152

3.4.3 Data-security level practice ...... 155

3.4.4 Industrial Internet of Things (IIOT) level practice ...... 157

3.4.5 Constructs and definitions ...... 163

3.5 Data-analytics actuated processes ...... 164

3.5.1 Key productivity process ...... 165

3.5.2 Key planning process ...... 168

3.5.3 Key Data Governance process ...... 171

3.5.4 Key innovation processes ...... 176

3.5.5 Constructs and definitions ...... 180

3.6 Making a difference on organizational performance ...... 181

3.6.1 Organization capabilities growth ...... 183

3.6.2 Innovation capabilities growth ...... 185

3.7 Summary ...... 187

CHAPTER 4: VOICE OF MANY ...... 187

4.1 General facts on the final survey instrument sample characteristics ...... 188

4.2 Research model on data-analytics on organization outcomes ...... 193

4.3 Item generation and pre-testing ...... 194

4.4 Final pilot-study ...... 196

4.5 Methods and Analysis ...... 197

4.5.1 Principal Components ...... 201

4.6 Multiple regression model and results ...... 206

4.6.1 Pressures on data integration practices regression model ...... 208

4.6.2 Practices on key processes regression model ...... 212

4.6.3 Processes on organization outcomes ...... 216

4.6.4 Discussion ...... 219


4.7 Structural model and results ...... 224

4.7.1 [DCP-EMI-DAUM-DIP] model ...... 227

4.7.2 [DIP-EMI-DAUM-DAP] model ...... 230

4.7.3 [DAP-EMI-DAUM-OCG] model ...... 233

4.7.4 Discussion ...... 235

4.7.4.1 Hypotheses Result ...... 237

CHAPTER 5: ARIA ...... 239

5.0 Summary and management implications ...... 239

5.1 Recommendations on the research ...... 245

5.2 Recommendations on further research ...... 248

References ...... 250


List of Tables

Table 1: Case Study Sample Participants ...... 15

Table 2: Measures expressed in the case studies ...... 24

Table 3: Data-centric Adoptive Drivers Constructs and Definitions...... 35

Table 4: Executive Management Influence Constructs and Definitions ...... 39

Table 5: Data Accessibility & Use Integration Constructs and Definitions ...... 45

Table 6: Strategic Level Data Constructs and Definitions ...... 52

Table 7: Operation Level Data Integration Constructs and Definitions ...... 56

Table 8: IIOT Level Data Constructs and Definitions ...... 60

Table 9: Data-security Level Constructs and Definitions ...... 62

Table 10: Key Productivity Processes Constructs and Definitions ...... 68

Table 11: Key Planning Processes Constructs and Definitions ...... 74

Table 12: Key Innovation Processes Constructs and Definitions ...... 79

Table 13: Organization capabilities growth (Labor) ...... 84

Table 14: Organization capabilities growth (Manufacturing) ...... 85

Table 15: Organization capabilities growth (Cyber-security) ...... 85

Table 16: Organization capabilities (Financial) ...... 86

Table 17: Market Capabilities Growth Constructs and Definitions ...... 88

Table 18: Innovation Capabilities Growth Constructs and Definitions ...... 90

Table 19: Data-centric pressures constructs, definitions ...... 147

Table 20: Data-centric integration practices constructs and definitions ...... 163

Table 21: Data-centric actuated processes constructs and definitions...... 180

Table 22: Data-centric pressures factors and scores ...... 201

Table 23: Data-centric integration practice items...... 202

Table 24: Data-centric actuated process items ...... 203

Table 25: Data-centric management influence and data use mechanisms ...... 204


Table 26: Organization Capabilities Growth ...... 205

Table 27: Performance pressure on integration practices ...... 208

Table 28: Innovation pressure on integration practices ...... 209

Table 29: Competitive pressure on integration practices ...... 210

Table 30: Cyber-security pressure on integration practices ...... 211

Table 31: Strategic level data integration on key processes ...... 212

Table 32: Operation level data integration on key processes ...... 213

Table 33: Data-security level integration on key processes ...... 214

Table 34: IIOT level integration on key processes ...... 215

Table 35: Key productivity actuated processes on organization outcomes ...... 216

Table 36: Key planning actuated processes on organization outcomes ...... 217

Table 37: Key data governance actuated processes on organization outcomes ...... 218

Table 38: Key innovation processes on organization outcomes ...... 219

Table 39: DCP-EMI-DAUM-DIP Model Hypothesis Results ...... 237

Table 40: DIP-EMI-DAUM-DAP Stage 2 Hypotheses Result ...... 237

Table 41: DAP-EMI-DAUM-OCG Stage 3 Hypotheses Result ...... 238


List of Figures

Figure 1: Case Selection Process ...... 13

Figure 2: Performance Pressures ...... 22

Figure 3: Competitive Pressures ...... 27

Figure 4: Innovation Pressures ...... 30

Figure 5: Cyber-security Pressure ...... 34

Figure 6: Executive Management Influence ...... 38

Figure 7: Data Accessibility & Use Integration (Mechanisms) ...... 44

Figure 8: Strategic Level Data Integration ...... 51

Figure 9: Operation Level Data Integration ...... 55

Figure 10: IIOT Data Level Integration ...... 59

Figure 11: Data-asset Security Integration ...... 61

Figure 12: Data-actuation Key Productivity Processes ...... 67

Figure 13: Data-actuated Key Planning Processes ...... 73

Figure 14: Data-actuated Key Innovation Processes ...... 78

Figure 15: Data-governance processes ...... 81

Figure 16: Organization Capabilities Growth ...... 83

Figure 17: Planned or market capabilities growth ...... 87

Figure 18: Innovation Capabilities Growth ...... 89

Figure 19: Case Study Concluding Research Model ...... 91

Figure 20: Case-study Generalized Concluding Research Model ...... 96

Figure 21: Survey Demographics Dashboard ...... 191

Figure 22: Demographics dashboard technology implemented ...... 192

Figure 23: DCP→EMI→DAUM→DIP model stage ...... 227

Figure 24: DIP→EMI→DAUM→DAP model stage ...... 230

Figure 25: DAP→EMI→DAUM→OCG model stage ...... 233


CHAPTER 1: INTRODUCTION

1.1 Research importance

Does data-analytics (DA), its applied vernacular business analytics (BA)¹, or the act of digitization (DZ) make a difference in company performance in a knowledge-intensive environment? Motivating this question is a quote taken from IQMS (2015): "the future of manufacturing lies in the hands of businesses that can best capture and manage data", data being paramount to "improved decision making and performance" (McAfee, 2012, p. 62) through the organization's ability to apply data insights enabling management to know more about the business, and requiring the cognitive task of discerning what data is important to the organization's functionality and growth. DA is critical to a firm's digital or IT strategy; Bughin et al.'s (2017) study across varied industries found a digitized strategy, coupled with superior operational and organizational practices, important to competitive economic advantage and the separator between high- and low-performing companies. As a firm ramps up a data-driven mindset, its peer competitors may be doing, or have done, the same, identifying parallel or differentiating value-creation opportunities; this chase of data to improve performance results in ever-increasing economic pressures placed on management to continuously grow revenue and profits to sustain competitiveness and enhance firm value.

¹ Throughout this paper the terms data-analytics, business-analytics, and digitization are used interchangeably and at times abbreviated DA, BA, or DZ. DA is the process of examining data sets and drawing conclusions from the information contained therein, by use of qualitative and quantitative analysis technologies, to make more informed business decisions or, in the scientific realm, to validate research models, theories, and hypotheses (TechTarget; techtarget.com/definition/data-analytics). BA is the scientific process of transforming data into insight for making better decisions (INFORMS) that are data-driven and fact-based (Camm et al., 2017, p. 5).

Contained within these progressions of thought is the fundamental aspect of data: its value as a resource. Like any raw material, its value is unleashed as it is worked and shaped into useful forms that have value-creating application. Converting these data materials into usable insights, when needed, where needed, and made accessible at the appropriate organizational level, is core to the practical purpose of data-analytics (Chen et al., 2015): the mill in which raw Big and Small Data are transformed into forms of organizational value.

Big Data, as a representative term of the knowledge-intense environment manufacturers live within, is characteristically referenced by the attributes of volume (size), variety (type), velocity (speed of delivery), veracity (quality), and value (usefulness) (Chen et al., 2015; Baesens et al., 2016), consisting of structured and unstructured data-sets originating from sources external to the company (albeit appropriate internal data are integrated for strategic analysis). External data include, but are not limited to: commodity data from futures exchanges used in forecasting raw-material availability and cost, social media data to identify sentiments and trends, customer data used to identify new products or business opportunities, supplier data for sourcing decisions, industry data for benchmarking purposes, and government data to understand policy changes and regulations that affect the business.

Small Data is internal to the company: the operation-data or tactical-data "consisting of usable chunks" (Banafa, 2016) that is manageable versus the enormity and unstructured complexity of Big Data. Small Data is generally structured, easily accessible, understandable, and actionable (Davenport, 2012), generated from business-process and production-process activities. Business process data is composed of information collected from systems focused on customer-facing and operation-facing management tasks. Production process data is gathered from systems and devices connected to machine-facing and material-facing management tasks.

Big Data is transformative on business strategies and models through its information sources' influence on organization decision-making; Small Data transforms practices, processes, and procedures for more efficient use (Chen et al., 2015; Slinger and Morrison, 2014; Galbraith, 2014; McAfee and Brynjolfsson, 2012). Big and Small Data-supported decision-making becomes a proxy of organizational influence, augmenting legacy self-reliance on 'gut-feeling' and on decisions made with less accurate, timely, or reliable information, at both the strategic and the operational level.

The data will set you free: playing on the oft-used phrase of truth setting one free, this represents a fundamental aspect of how information, when digitally captured, stored, and integrated into an organization, works to free the organization's value-creating capabilities. Value here is in the sense of insightful decision-making on productivity performance, innovativeness, planning, and the optimal deployment of resources, building organization capabilities, guided by the stories data tells management about organizational effectiveness and growth opportunities. As an economic value, data has become a critical asset; managers want to know the value or worth of information in terms of usefulness and the return on the investments made in the data and in the data-technologies used to reveal its insights (Bughin et al., 2016).


Data-analytics is not a stand-alone central idea; it requires understanding the adoptive drivers that motivate the invested use of data, its cross-discipline implementation, the multiple organizational levels of data-accessibility, and the agility of data-supported decision-making, in consideration of decisions speedily made in knowledge-intensive environments (Eisenhardt, 1989b). The organization thus evolves towards a data-centric culture that embraces risk-taking through cross-function collaborative data-analysis (Kane et al., 2017; Shields, 2017; Alamar, 2013), investing in advanced quantitative metrics (data-technologies) alongside other types of information to aid the decision-making process.

Akter et al. (2016) report a gap between data-technologies investment and organization performance, stating that some researchers find the value of analytics cresting in use and the return on investment difficult to assess in terms of increased productivity and innovation capabilities. Bughin et al. (2017) counter by finding "heavy digital investment is a differentiator" (p. 15) of leading companies and that those investments demonstrate a favorable return; albeit it remains subjective among firms how to arrive at a return-on-investment metric that meets specific internal qualifications, and whether the applied metric(s) view the investment, and its quantified value, from a long- or short-term perspective.

1.2 Research contribution

While extant literature about Big Data (and, to a lesser extent, Small Data), its meaning, application, and impact on large organizations is prolific (the term having become a near household, or rather 'businesshold', word in the 1990s), little empirical research has been conducted on the impact of Big Data, Small Data, and DA within small to medium-sized manufacturers; rarer still is research designed to include both qualitative and quantitative methods exploring the notion of DA creating value for the organization.

According to US Census statistics [2012], small to mid-market manufacturers account for 99% of all US manufacturers and represent 77% of revenues. Compounding this fact, from a geographic regional perspective, the states of Indiana, Michigan, and Ohio contain 44% of manufacturing employment within the United States and 43% of total manufacturing revenues². A 2016 survey (Agrawal, 2016) among small to mid-market manufacturers found that seventy-seven percent of mid-market firms³ have invested in and deployed data-analytic solutions, spending on average $85,000 per year, while thirty-three percent of small-market firms spend on average $10,000 annually. Questions regarding future information technology adoption plans showed 91 percent of mid-market companies identifying Big Data / Analytics as a top priority, with 29 percent of small-market firms prioritizing analytics.

DA is on the mind of the manufacturer, and it is the quest of this research to explore how the level of DA investment, not merely financial but in terms of organization resources, is transforming the capabilities of the small to mid-market manufacturer. Recounting these aforementioned facts is foundational to understanding the importance of implementing DA and of the organization moving towards a data-centric culture. The research therefore seeks to understand the types of data used, the investments made in technologies to extract value from data variety, the pressures on management to build an organizational data-centric mindset, the processes affected by DA, and the outcomes leading to organization, market, and innovation growth.

² Source data: US Census Bureau American Fact Finder; Annual Survey of Manufacturers: Geographic Area Statistics: Statistics for All Manufacturing by State: 2014, 2013, 2012; see table 1 in the appendices. https://factfinder.census.gov/faces/nav/jsf/pages/download_center.xhtml

³ Mid-market firms are those with revenues less than 1 billion USD and 500-1,000 employees; small-market firms have fewer than 100 employees and less than $10 million in revenue; large-market firms have 1,000 or more employees and revenues over 1 billion USD.

1.3 Essay organization

This paper is organized as a trilogy based on a multi-method empirical research approach; case studies, a literature review, and a large-scale survey are used and sequenced operationally in that order. The primary thrust of the research is Chapter two, titled 'The Voice of the Manufacturer': a case study of ten Midwest United States manufacturers from which various concepts, themes, and dimensions emerge. The case research followed the methodology prescribed by Gioia et al. (2010, 2012) and Corley and Gioia (2004, 2011). Researchers are predisposed to literature on topics of interest (as is this researcher), and such bias forms an impediment to case study research; it is advised, as best possible, to refrain from things you know and to approach case studies in a manner that allows for keen, open-minded listening in a free-flowing dialogue between interviewer and interviewee, not constraining discoveries with known literature. In this manner, theory emerges, and greater insights are revealed than would be framed by specific, literature-supported questions. After analysis of the case study data, a research model is formed, on which a literature review is conducted to augment the case-study findings.

Chapter three, 'The Voice of Literature', supplements Chapter two and leads a discussion of literature relative to the findings in the case studies, with supportive theories on data as a valuable resource, and on the knowledge it provides the organization, as foundational to value creation. Following the section on theory development, literature is reviewed on the emergent topics found through the case studies: the adoptive drivers motivating organizations towards a data-mindset culture, the accessibility and use of data in the form of practices, the transformation of data into key data-actuated processes or strategies, and the value-added outcomes of data on the organization.

Chapter four, 'The Voice of Many', extends the findings of Chapters two and three through a nationally distributed survey. Analysis of the survey data is conducted using multiple regression analysis and structural equation modeling; important to the research is testing the moderating effects of two influences, executive management influence and data accessibility and use mechanisms.
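To make the moderation logic concrete, the sketch below fits an ordinary least squares regression with an interaction term, the standard way a moderating effect is tested in multiple regression. It is an illustrative sketch only, not the analysis code used in this research: the construct abbreviations (DAP, EMI, OCG) borrow from those used later in this paper, while the data file and column layout are hypothetical stand-ins for the survey composites.

```python
# Illustrative sketch: testing a moderating effect with an interaction term.
# Column names (DAP, EMI, OCG) borrow construct abbreviations from this
# paper; 'survey_composites.csv' is a hypothetical stand-in for the data.
import pandas as pd
import statsmodels.formula.api as smf

# One row per respondent; columns hold mean-scored survey composites.
survey = pd.read_csv("survey_composites.csv")  # hypothetical file name

# Mean-center the predictors so the main effects remain interpretable
# alongside the interaction term.
for col in ["DAP", "EMI"]:
    survey[col] = survey[col] - survey[col].mean()

# 'DAP * EMI' expands to DAP + EMI + DAP:EMI; a significant DAP:EMI
# coefficient is evidence that executive management influence (EMI)
# moderates the effect of data-actuated processes (DAP) on organization
# capabilities growth (OCG).
model = smf.ols("OCG ~ DAP * EMI", data=survey).fit()
print(model.summary())
```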

In Chapter two, propositions are formed through the case studies and a research model is presented; the literature review in Chapter three further shapes hypotheses from the model, which are then tested by the survey research of Chapter four. Chapter five, 'The Aria', concludes with a research summary, the limitations of each research method, management implications, and opportunities for further research.



CHAPTER 2: VOICE OF THE MANUFACTURER

2.0 Grounded theory

Conventional empirical research utilizes literature reviews on topics of keen interest to study certain phenomena and develop theory. However, literature does not intimately record the voice of managers; case studies are ideal mechanisms for listening to that voice and to the perspective of management as it organizationally pursues a DA mindset.

An inductive theoretical foundation, in which drivers impact practices (strategies), practices form processes (operationalizing), and processes lead to outcomes (Hong, 2018), is used in conducting the comparative case studies. As such, this paper primarily follows a variance research agenda, exploring the how and why of data-analytics adoption and its effect on organization outcomes; to a lesser extent, it incorporates process research, detecting sequential event patterns among the comparative studies (Langley, 1999). Glaser and Strauss (1967) state that qualitative, inductive grounded theory generation (predictive, explanatory, and relevant) is accomplished through comparative analysis⁴ using a codified methodology "suited to its supposed use, based on data that can usually not be completely refuted" (pp. 2-4)⁵, where collection, coding, and data-analysis cooperate in alignment to generate theory. Systematically, comparative coding of small units of data forms categories containing sub-categories, aggregating to dimensions, with the research identifying core-categories woven to form theory (Langley, 1999, p. 700; Strauss and Corbin, 1990).

⁴ The purpose of comparative analysis: accuracy of evidence collected among comparative groups, establishing delimiting theory generalizations, verifying theory, and generating theory (chapter 2).

⁵ The useful attributes of theory generation: the degree of inductivity, logical consistency, clarity, parsimony, scope, integration, fit, and its ability to work according to its supposed use.

Langley (1999) posits grounded theory as a strategy highly accurate in its proximity to the original data, beginning with empirical data derived from comparative interviews to illustrate the experiential, "the processes by which the organizing and organization unfold" (Langley, 1999; Gioia et al., 2012, p. 16).

The research model is developed in this way and used to expand the overall research; the second research phase confirms the findings of the first through a survey instrument, testing the validation or invalidation of the propositions formed in the case study analysis. In this context, the research looks to literature related to the topics embedded in the conceptual research model: drivers, practices, processes, and outcomes.

During the first phase of case study research, developing strictly formed constructs based on literature is cautioned against by Gioia et al. (2012), who advise, when conducting qualitative studies, that "advances in knowledge that are too strongly rooted in what we already know delimit what we can know" (p. 16). Essentially, in the context of a pre-research literature review, known-literature bias may shade the researcher's agility and ability to recognize new and novel concepts, artificially impeding concept discovery and the generation of grounded theory through participant interviews. However, a balance needs to be struck between 'knowing too much and knowing too little': for Yin (2014), a literature review is used to "develop sharper and more insightful questions, a means to an end and not an end in itself" (pp. 14-15), and it "does not describe the most recent insights due to publication lag" (Dul and Hak, 2004, p. 49). Granted, a review of literature of interest to this paper's research was conducted to develop a fundamental understanding of related concepts; however, those concepts were guarded with an awareness not to permit the reviewed literature to constrain the questions asked of the case study participants. Hence the semi-structured, open-question format of the interviews, allowing the participant to elaborate and at times diverge the discussion, providing rich content and contemporary intrigue on the topic researched.

Essay topics are discussed in the following order. First, data-centric adoptive drivers: interview responses and observations are analyzed relative to the performance, competitive, and technology pressures placed on management to influence the organization in adopting and evolving towards a data-centric culture. Second, executive management influence: this section examines participants' influence on organizationally adopting data-technologies, their experience backgrounds, and the degree to which data-centric managing beliefs are organizationally embraced. Third, data-use practices: this section reveals the definition of Big and Small Data through participants' eyes and their strategic and operational importance to organization decision-making, and extends these themes to the organizational level of data accessibility and useful application. Fourth, key process indicators: these are data-shaped metrics, aligned with the organization's strategies and objectives, used by management to understand organization productivity, its forecasting and planning abilities to predict current and future outcomes, and the types of innovation occurring within the organization.

Fifth, and the final topic discussion, organizational performance outcomes: these are growth-oriented metrics, considered here to be organizational (i.e., employment, capabilities, and financial), market (i.e., existing and new market growth), and innovation (new business creation, new IP creation, and technology investments). Each section is structured by first presenting a model of the dimension revealed through the interviews. Following the topic discussions, the chapter describes the research model, the case research methodology and analytic technique, a concluding discussion, and research implications and limitations.

Within these discussions, the term technology is used interchangeably, in singular and plural senses, for data-technology, data-analytics, and information or communications technology. As a positioning definition for this research, technology is any know-how or knowledge that improves our understanding of how to do things (Chadee and Pang, 2008; Capon and Glazer, 1987), being "part and parcel of the mainstream of cultural inclinations, irrevocably bound to the social setting in which they arise" (Bugliarello and Doner, 1973, p. 137).

2.1 Case Study Research

The case study research method is an empirical inquiry, exploratory and or explanatory in its research objective: an in-depth examination "investigating a contemporary phenomenon within a real-world context" where the "investigator has little or no control of behaviors and the results" (Yin, 2014, p. 16); "a research strategy which focuses on understanding the dynamics present in single settings" (Eisenhardt, 1989, p. 534). "[Case research] is building and testing statements by analyzing evidence drawn from observation especially when topics of interest definitions are lacking" (Dul and Hak, 2008, p. 180); case studies "are more powerful at indicating causal, predictive relationships than many other forms of empirical research" (Stuart et al., 2002, p. 422); they are "a prime means of developing well-grounded theories" or of examining unfamiliar situations, the method "focuses on current conditions of the phenomenon" (McCutcheon and Meredith, 1993, pp. 239-241), and it "assesses the conditions surrounding the phenomenon to build a plausible explanation or discover causal relationships that link antecedents to the results" (McCutcheon and Meredith, 1993, p. 240; Voss et al., 2002; Benbasat et al., 1987). Case studies can be of single or multiple design, at various levels of analysis (Yin, 1984, 2014), and used to describe a phenomenon, to test theory, or to build theory (Dul and Hak, 2008; Eisenhardt, 1989a).

To understand the explanatory effect of data [big and small] and its use on the organization, this research sought to understand how data is used and its impact on organization performance, how data creates value and its enabling or constraining effects, and why organizations adopt a data-driven culture and what the drivers of that adoption are. Yin (2014) speaks to the "iterative nature of explanation building" (p. 149): form an initial theoretical statement or proposition, contextually compare the findings of an initial case with the statement, revise the statement or proposition, compare other case details with the revision, compare with additional cases, and iterate as necessary. Primary data in case-based research is gathered through direct observation and or interviews of people involved in the phenomenon being studied; secondary data that supplements the primary data originates from publicly available documentation such as annual 10-K reports, investor presentations, online news posts, and other records obtained from participants involved with the study (McCutcheon and Meredith, 1993; McCutcheon et al., 2002). Case studies "typically combine data collection methods such as archives, interviews, questionnaires, and observations and [data] may be qualitative and or quantitative" (Eisenhardt, 1989, p. 534). Yin (2014, p. 102) states that "case study evidence may come from six sources: documents, archival records, interviews, direct observation, participant-observation, and physical artifacts".

2.1.1 Case study process

The case process illustrated in figure 1 was adapted from the investigative path prescribed by Park and Hong (2017). Other investigative paths were studied for comparison and understanding, as described by Eisenhardt (1989), Miles and Huberman (1994), Yin (2003), Corley and Gioia (2004), and Dul and Hak (2004).

Figure 1: Case Selection Process

[Figure: a four-stage flow. Case Selection: defining search criteria (Midwest regional manufacturers, mid-market revenue < $1b, industry type variety); participation solicited by both e-mail and phone call. Field Interviews: semi-structured format, using topics and open-ended questions; electronically recorded as allowed, with physical note-taking when not; two interviewers present. Coding and Analysis: transcriptions within 48 hours; line-coded; text analyzed; data structure established; follow-up questions discussed with participants as needed. Case Write-up: quality assessment and participant feedback via a pilot-study and feedback discussion; the data structure serves to build the case write-up and to develop propositions and theories.]



2.1.2 Case selection

This research investigates ten Midwest manufacturers (mid-market and small-market, with revenues less than $1 billion) to examine whether data-analytic investments improve organization performance, motivated by the prospect of productivity and process improvement gains, improved employee job-performance, the search for new revenue streams or business models, improved customer revenue and profit performance, and enabled organization innovativeness. E-mails were sent to known contacts, followed by phone calls to discuss the project.

Executive-level participation in this research comprised: 1 CEO, 4 presidents, 1 SVP of finance, 1 SVP of business intelligence, 2 controllers, 1 VP of operations, 1 VP of technical sales, and 1 director of production and operations. Executive-level managers are more likely to influence the investment, adoption, and use of data-technologies through leadership and the exercise of power types (French and Raven, 1960; Avolio et al., 2009), with vested authority to make decisions organizationally (Parnell and Bresnick, 2013, p. 2). Participant companies were chosen for diversity among manufacturing industry types, the research desiring a cross-sector portfolio of companies operating in the engineered products, materials, agriculture (in the form of food processing and biologics), and consumer durables verticals (table 1).



Table 1: Case Study Sample Participants

Sector | Held | Employees | Locations | Revenue MM (2016) | Interviewee | How Recorded
Primary Metals | Public | 600 (at location) | 1 (Domestic) | $850 | SVP Finance, Controller | Electronic
Food | Private | 510 | 1 (Domestic) | $125 | CEO | Electronic
Food | Public | 680 (at location) | 8 (Domestic), 8 (Foreign) | $2,500 | Director Operations | Notes
Tobacco | Public | 290 | 1 (Domestic) | $210 | Sr. VP Planning | Electronic
Chemicals | Private | 50 (at location) | 2 (Domestic), 1 (Foreign) | $15 | VP Technical Sales | Electronic
Wood | Private | 100 | 2 (Domestic) | $25 | President | Electronic, Notes
Plastics, Rubber | Private | 70 | 1 (Domestic) | $20 | President | Electronic
Furniture | Private | 3,000 | 6 (Domestic) | $300 | VP Operations | Notes
Furniture | Public | 250 (at location) | 16 (Domestic), 2 (Foreign) | $200 | President, Controller | Electronic, Notes
Fab Metals | Private | 35 | 1 (Domestic) | $6 | President | Electronic

2.1.3 Field interviews

Interviews with participants were conducted in a semi-structured format (Dul and Hak, 2008); discussing topics face-to-face in an open-ended way yields a richness of information greater than phone interviews. Seven interviews were conducted face-to-face and three telephonically. Research topics of interest were presented in a way that allowed discussion of a topic to extend beyond yes-or-no answers, without the use of leading questions.

The face-to-face interviews averaged two to three hours, and at times longer with plant tours. When a plant tour occurred, recording and camera devices were not permitted; observations from the tour were jotted down in a note-pad and detailed when transcribing the interview. Telephonic interviews lasted approximately 60 minutes. Two interviewers participated in all but two interactions. When permitted, interviews were electronically recorded, with the exception of two; recorded interviews were transcribed using a professional transcription software program, and note-taken interviews were manually transcribed within forty-eight hours.

2.1.4 Coding and analysis

Pattern-matching logic was used in this study, by means of framing the interviews in a semi-structured, open-dialogue manner focused on the primary research model constructs: data adoptive drivers, data-use practices, key process indicators, and organizational performance outcomes. Pattern matching is considered "one of the most desirable techniques" (Yin, 2014, p. 143) in case study research.

The logic is to compare findings within and among the case studies; these can be "related to the independent or the dependent variables of the study", and if the predicted and empirical patterns are similar, "then internal validity is strengthened" (Yin, 2014, pp. 143-144). "One aim of studying multiple cases is to increase generalizability" and to use "adequately sampled cases", a mix of typical, diverse, unusually effective, and unusually ineffective participants (Yin, 2014, pp. 42-44; Dul and Hak, 2004; Miles and Huberman, 1994). Each transcription was line-numbered, and text analytics were used to seek common statements made about a topic. Each topic was entered onto a spreadsheet and reference-coded to the interview and numbered line, providing a control mechanism and rigor to the analysis.

2.1.5 Data structure

The data structure was formed utilizing the methodology developed by Gioia et al. (2012) and Corley and Gioia (2004, 2011) as a means of structuring qualitative, interpretative research and of presenting that research in a sensible way that progresses from raw data, to concepts, to themes, and to aggregate dimensions; or, as the authors succinctly state, "No data structure; Know nothing" (Gioia et al., 2012, p. 21). The analysis begins with "balancing the deep embeddedness of the informant's view in living the phenomenon with the necessary 30,000-foot view" (Gioia et al., 2012, p. 21), cycling between the emergent data, concepts, themes, and dimensions and the relevant literature to discover and reveal new concepts.

Gioia et al. (2012) caution against knowing too much of the literature (advising a state of semi-ignorance or enforced ignorance of literature) to avoid confirmation bias in the analysis, the researcher doing her or his best to "suspend known beliefs" while avoiding "reinventing well-ridden wheels" (Gioia et al., 2012, p. 21).

Care was taken to steer clear of confirmation and desirability biases during the interviews and analysis; open-ended questions were put to the participants, with leading questions held to a minimum and used only to clarify a point or a response not clearly understood. After-interview questions were posed on topics needing further clarification or for comparison with similar comments made by other participants.

The data structure is segmented in a three-step fashion. First-order concepts are the "informant-centric" terms stated by the interview participants, building a database of statement commonality. Second-order, "theory-centric" themes are aggregates of the first-order terms; these themes guide the research in understanding the phenomena as expressed by the participants. The last step aggregates the second-order themes into theoretical dimensions and a model. Proceeding in this rigorous method and manner allows grounded theory to be firmly established and provides validation of how the research was conducted and the data collected, codified, analyzed, and shaped into a theoretical model. For sake of brevity, the remainder of this chapter is presented by aggregated dimension, with a short narrative on the second-order themes and tables containing the first-order concepts; following each section is a table with interpreted constructs and definitions.
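To make the three-step structure concrete, the sketch below shows one illustrative slice of such a data structure, rendered as a nested mapping from an aggregate dimension to second-order themes to first-order concepts. The entries paraphrase the performance-pressure material developed in the next section; they are illustrative only, not the full coding scheme.

```python
# Illustrative slice of a Gioia-style data structure: first-order concepts
# (informant-centric) roll up into second-order themes (theory-centric),
# which aggregate into a dimension. Entries paraphrase section 2.2.1.
data_structure = {
    "Performance Pressure": {  # aggregate dimension
        "Seeking returns on data asset investments": [  # second-order theme
            "data accessibility and its impact on data value-creation",
            "cross-functional data collaboration and value-creating outcomes",
        ],
        "Enabling talent": [
            "data-technologies reduce job-frustrations",
            "automating routine job-activities increases job-control",
        ],
        "Sustaining operations": [
            "metrics aligned with strategies",
            "management evaluated on strategy-tied measures",
        ],
    }
}

# Walk the structure from dimension down to the first-order evidence.
for dimension, themes in data_structure.items():
    for theme, concepts in themes.items():
        print(f"{dimension} <- {theme}: {len(concepts)} first-order concepts")
```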

2.2 Data-centric Adoptive Drivers

A concise definition of data-centric adoptive drivers, as interpreted from the case interviews: the performance, competitive, and innovation pressures placed on management by internal and external influences to organizationally adopt a data-centric mindset and culture. Performance pressures have many sources; common themes presented during the interviews focused on 'the investments made in data-analytics and data-technologies and their return in economic and non-economic terms', 'enabling talent through the organizational use of data-technologies and access to the right data, allowing for informed and effective decision making', and 'the effective use of data, in its variety, to sustain operations'. Sustain in the sense that data is used on a consistent basis in support of the organization's strategic and operational goals and objectives; in this context, sustain means that with the application and use of data-technologies the company is better able to manage growth, generate value, and protect its data assets from cyber intrusions.

Competitive pressures coalesced into three themes: 'meeting government regulations', 'meeting industry standards', and 'meeting customer expectations'. Interesting in this observation was the direct absence of competitors as the reason for data-technology adoption. This may be an obvious inference: to remain competitive, adoption of data-technologies is driven by the need to improve organization processes and outcomes, and doing so enhances competitive advantage. Albeit, none of the interviewees stated directly that the pressure to adopt a data-centric mindset was due to competitor influences. Rather, it was viewed either as a need to continuously upgrade existing technologies to meet certain regulatory requirements or industry standards or, as learned from a couple of CEOs and company presidents, as a self-imposed technological propensity to 'make operations run smoother, more efficient, and prepare the company for the future' as much as any other reasoning. This last statement leads nicely into the third pressure attribute, innovation. Three 'capability' themes were observed: the first, to 'expand on innovation creation'; the second, to 'enhance data-driven innovation'; and the third, to 'innovate operating capabilities'. Strongly expressed was the need to adopt data-technologies, and those related, to expand current innovation capabilities, with the 'need to stay on top of technology advancements' to enable greater innovation to occur within the organization.

Clear consensus among the interviewees was the desire for data to provide insights on how to improve organization processes, 'finding ways to do things better': what will the data tell us that we did not know before, and what can it tell us about how it can help create innovation? With this brief introduction to data-adoptive drivers, the essay now drives deeper into the interview observations.

2.2.1 Aggregate dimension: Performance Pressure

Three second-order themes were extrapolated from first-order concepts, leading to the aggregate dimension of performance pressure. In other words, when senior management invests in data-technologies and takes on the task of supporting data initiatives, there exist certain ambition pressures to realize positive outcomes from the applied use of data-technologies (data initiatives) to validate the investment decision. Ambition pressures become the performance pressure motivating senior management to realize tangible proof that the decisions made were organizationally correct; these being 1) seeking returns on data asset investments, 2) enabling talent, and 3) sustaining operations. Fifty percent of interviews referenced investments made in data-technologies, determining value-creating returns as important to understanding the impact of data-use on the organization; 33% believed data-technologies were useful in enabling higher levels of job-satisfaction, and the same number regarded data-technologies as important to adapting and or reallocating human capital resources to achieve the organization's strategies and objectives. Each interviewee referenced performance measures being linked with either strategy and or some overarching direction. Varying degrees of metric-tied strategies were discussed; 25% mentioned complying with government regulations as a pressure needing constant vigilance, and 50% stated data security as critical to the organization and its stakeholders. Two interviewees described finance departments as instrumental in DA, working across the organization to set fiscally tied metrics.

A small to mid-market manufacturing firm's data-technologies investment is significant and required to attain sufficient levels of data accessibility for use in decision making. Interviewees willing to disclose either annual expenses on data-related activities or investments made in specific data-analytic technologies cited ranges from $250,000 to $1,000,000 for software programs (either internal service-based or external cloud-based), including but not limited to seat licenses, initial implementation fees, and hardware. Investments made to purchase data from external sources were revealed in generalized amounts of $50,000 to $100,000 annually. Much of this investment is hidden within information technology budgets and difficult to identify. Seeking returns on data asset investments (RODA) concerned 'data accessibility and its impact on data value-creation', 'cross-functional data collaboration impact on value-creating outcomes', and 'data-variety integration impact on value-creating outcomes'.

Enabling talent: in varied references, the importance of data-technologies to job-productivity, job-frustrations⁶, job-routines, and job-control surfaced and was considered a motivating factor in improving job satisfaction by enhancing self-efficacy⁷ and one's sense of what one does. The need for continued improvement on these job-dimensions is, while not uniformly stated, an over-arching concern due to the perceived 'routineness' of manufacturing; finding individuals with the skills and talents willing to work within these industries is becoming ever more challenging. Making jobs more interesting by providing some sense of personal job-control through enabling technologies, increasing creativity, and working to eliminate routine activities are critical considerations that may have partial solutions found in data-technologies.

Sustaining operations by aligning metrics with strategies and evaluating management on those measures, whether referenced succinctly or briefly in conversation, was a consistent observation across the interviews. Key strategy-tied measures stated included, but were not limited to: return on assets as a measure of capital investments, reduction in energy used per unit produced as a measure of energy strategy, the number of units annually produced as a measure of innovation strategy, the number of products sold into new markets as a measure of market-growth strategy, the reduction of per-unit cost as a measure of innovation strategy, the number of advanced technologies implemented as a measure of organization capabilities strategy, and several others; table 2 details the interviewee references. Figure 2 presents a summary of first-order concepts and second-order themes, followed by propositions. Table 2 illustrates strategy-tied metrics.

⁶ Job frustration as defined in the case studies: the aggravation experienced when the non-functionality, use-complexity, or information confusion presented by a technology obstructs an individual's job-performance and decision-making capacity.

⁷ Bandura (1993) defines self-efficacy as "an individual's confidence in their ability to perform a behavior, the capacity to act in a variety of circumstances".

Figure 2: Performance Pressures

P1a: Management seeks data-technologies to increase returns on investment.

P1b: Management seeks cross-function collaboration for returns on data investments.

P1c: Management seeks data-technologies to enable talent.

P1d: Management seeks to reduce job-frustrations through data-technologies.

P1e: Management seeks to automate routine job-activities to increase job-control.

P1f: Management seeks to improve on organizational critical thinking skills.

P1g: Management seeks data-technologies to redeploy human capital resources.

P1h: Management seeks data-technologies to accurately align strategy objectives.

P1i: Management seeks data-technologies to improve organization critical thinking skills.

P1j: Management seeks data-technologies to facilitate government agency compliance.



Table 2: Measures expressed in the case studies

Measure | Definition | Strategy Link | Management Level

Return on Assets (ROA) | How well management is utilizing company assets | Enterprise | CEO, President, CFO, VP Finance, COO, VP Operations
Return on Investments (ROI) | The overall return on investments made in technologies or projects, relative to the cost of capital | Enterprise | CEO, President, CFO, VP Finance, COO, VP Operations
Return on Sales (ROS) | Overall profitability | Enterprise | CEO, President, CFO, VP Finance, COO, VP Operations
Revenue growth (RG) | Revenue growth over prior years | Enterprise | CEO, President, CFO, VP Finance, COO, VP Operations
Cash-flow (CF) | Measures of cash-flow contribution to sustain operations | Enterprise | CEO, President, CFO, VP Finance, COO, VP Operations
Company-value (CV) | How the value of the company is changing | Enterprise | CEO, President, CFO, VP Finance, COO, VP Operations
Energy-use per unit produced | The cost-efficient use of energy in the production process | Innovation | CFO, VP Finance, COO, VP Operations, VP Production
Number of processes | Identify process complexity to reduce the number of processes | Innovation | CFO, VP Finance, COO, VP Operations, VP Production
Speed of production | Number of products produced on a production line | Innovation | CFO, VP Finance, COO, VP Operations, VP Production
Waste monetization | The re-use, re-purpose, and recycling of waste materials; the calculated cost of selling waste materials; and the cost and revenue opportunity of internally using materials | Operation | CFO, VP Finance, COO, VP Operations, VP Production, CIO, VP IT
Safety incidents per number of employees | Personnel suffering injury in the work environment, per 100 employees | Operation | CFO, VP Finance, COO, VP Operations, VP Production, CIO, VP IT
Employee productivity | Revenue per employee, profit per employee, training investment per employee | Operation | CFO, VP Finance, COO, VP Operations, VP Production, CIO, VP IT
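For reference, the enterprise-level financial measures in table 2 have standard textbook formulations; the expressions below state the common forms (the participant firms' exact calculation conventions were not disclosed in the interviews):

\[
\mathrm{ROA} = \frac{\text{Net income}}{\text{Total assets}}, \qquad
\mathrm{ROS} = \frac{\text{Net income}}{\text{Revenue}}, \qquad
\mathrm{RG}_{t} = \frac{\text{Revenue}_{t} - \text{Revenue}_{t-1}}{\text{Revenue}_{t-1}}
\]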


2.2.2 Aggregate dimension: Competitive Pressure

While inferences to these topics were numerous, extrapolating cross-comparative concepts and aggregating them revealed two streams of thought: the first, competitive pressure driven by regulatory, industry, and customer expectations; the second, competitive pressure driven by motivations to make data-technology investments either for the novelty of staying ahead of the competition or for a novelty not competitively expressed that simply affords the business greater opportunities for organizational growth. Each of these themes fits into varied competitive models or reasonings; it is not the purpose of this research to argue one model or method over another, but to report the expressions of the participants as illustrated in Figure 3.

To varying degrees, each participant company has some form of regulatory responsibility; most common among the sample group is OSHA8 and the responsibility for accurately reporting injury events and, organizationally, for providing systemic processes to ensure the working well-being of employees. When comparing interview outtakes, certain patterns formed: first, the need to maintain documented adherence to regulatory requirements. Second, all have processes related to product traceability, with access to data in near real-time, allowing for timely recall of information related to product issues. Third, meeting regulatory requirements is a simple function of doing business; the competitive advantage lies in adhering to those requirements in a fashion that quickly mitigates product issues that may damage the company's reputation and, therefore, its industry market share. None of the participant companies referenced any regulatory situations related to the products they produce, albeit some references were made to competitors whose products came under scrutiny due to performance-failure issues.

8 OSHA is a part of the US Department of Labor, established to 'assure safe and healthful working conditions for working men and women by setting and enforcing standards and by providing training, outreach, education, and assistance'.

Similar in nature, companies choose to comply with industry standards to remain competitive. This statement was borne out in two of the case interviews, which stated that customers adhere to these standards when providing products to the end consumer; all industry companies treating wood materials are expected to provide products based on established organization guidelines. Similarly, hot-rolled metals manufacturers follow specific guidelines9 on metal chemistry, mechanical properties, and other metal characteristics so that a common understanding of metal performance is known by customers purchasing such materials. Food processors work to maintain product quality while remaining sensitive to federal regulations (i.e., USDA, FDA, or other country blocs' established standards) and may exceed industry standards to deliver a superior product relative to competitors.

Data-technologies play an important role as conduits allowing companies to access industry benchmark data against which to measure product performance standards. Industry benchmark data is not a substitute for other data streams obtained from external sources that aid companies in developing new products or improving existing ones. The competitive advantage is found in the character consistency of the products provided, the timeliness of delivery by the company to its customer, and the minimization of field-use product service issues; data-technologies are the enablers sustaining this competitiveness.

9 ASTM, the American Society for Testing and Materials, provides standards which many manufacturers maintain in their production processes and material compositions. The society provides guidelines and standards specific to the industries studied in this research.

Regarding meeting customer expectations, the company self-determines to invest in data-technologies based on its internal assessment that doing so would enable it to continue meeting customer requirements or expectations and, in the process, capture more customer information useful to those interactions. In other words, the company is under pressure to better understand its customers, assessing interactions, determining business-process efficiencies, and using data-technologies to facilitate those efforts.

Figure 3 summarizes first-order concepts and second-order themes, followed by propositions.

Figure 3: Competitive Pressures

P2a: Management seeks data-technologies to meet regulatory requirements to sustain competitiveness.


P2b: Management seeks data-technologies to meet industry standards to sustain competitiveness.

P2c: Management seeks data-technologies to meet customer requirements and expectations.

2.2.3 Aggregate dimension: Innovation Pressure

These are but a few of the generalized topics discussed among the research participants. Three themes were observed that, when aggregated, formed the dimension of technology pressure: expanding innovation-creating capabilities, enhancing decision-making capabilities, and sustaining operating capabilities. The revealed concepts and themes define technology pressure as the external influences and internal ambitions placed on, and felt by, management to make technology investments and to seek value-creating results.

Moving toward more data-centric management models yields innovation pressures (at times the term technology is interchanged with innovation) and raises queries: is the company prepared for changing technologies; does it keep up with the technology standards of the industry; does it consider technology implementations and data-analytics important to creating competitive advantage, or simply to staying current with industry trends? To what degree is innovation pressure externally applied? External in the sense that industry competitive pressures require the company to invest in data-technologies just to 'keep up' with the market.

Or the pressure may be internal to the company, self-pressuring data-technology investments to 'stay ahead' of the market or to create new markets; management recognizes the investments will benefit the organization, and the pressure on management then becomes the expected results found in the hoped-for and useful application of the technology and the return on that investment. When considering this concept, it is not only capital- or expense-related; it has more to do with the company understanding how the technology will provide innovativeness in maintaining (keeping things going) and sustaining (keeping things growing) the organization. The novelty theme of technology-inspired innovativeness reflects management's desire to stay on top of technology advancements that make things maintainable and sustainable. Accordingly, participants referenced data-technology investments as important to the organization's infrastructure; data and information systems were vital to ongoing operations.

Given the prospect of companies becoming more data-centric in managing beliefs, understanding the value delivery of those investments is becoming more top of mind, especially in making data accessible and usable for decision-making. Four cross-case examples of data-technologies enabling decision-making show that combining fact-based support in context with experiential information is important to managers; employing data-supported decision-making without foregoing experiential, tacit information (or intuitive inputs) matters, indicating that too much reliance on data to guide the decision-making process is not always of benefit. All participants acknowledged that having the right data readily available is important to decision-making, but so is experience. Several dimensions of the reasoning behind real-time preferences for data accessibility are observed in the comments, those naturally occurring in manufacturing: production-line operations are critical to sustaining operating capabilities, any form of down-time is costly, and close monitoring is required to proactively respond to disruptive events. The concept of timely data access becomes innovative in use as a way of creating avoidance processes, anticipating interruptions to productivity. In manufacturing, the primary predictive use of data-technologies relates to equipment and machine maintenance; down-time events, as mentioned in earlier discussions, are costly to the organization, and preventative maintenance using data was discussed throughout all the interviews and requires no further discussion here (a minimal sketch of this predictive use follows the propositions below). Figure 4 summarizes first-order concepts and second-order themes, followed by propositions.

Figure 4: Innovation Pressures

P3: Management seeks data-technologies to expand organization innovation capabilities, enhance decision-making capabilities, and improve operation capabilities.

P3a: Management seeks data-technologies with the purpose of novel innovation.

P3b: Management seeks to stay on top of technology advancement through experimenting with novel data-technologies to increase innovation capabilities.


P3d: Management seeks data-technologies to provide the organization with greater decision-making capabilities.

P3e: Management seeks data-technologies as supporting experiential inputs for decision-making.

P3f: Management seeks data-technologies to innovate operating capabilities.

P3g: Management seeks real-time access to meaningful data to sustain operating capabilities.

P3h: Management seeks data-technologies that will allow for improved event predictability to sustain operating capabilities.
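As an illustration of the predictive-maintenance use of data referenced above (and formalized in P3h), the following minimal sketch flags a machine for inspection when a sensor reading drifts beyond a rolling baseline; the vibration readings, window size, and three-sigma threshold are hypothetical assumptions, not parameters reported by the case companies.

# Flag a machine when a vibration reading drifts beyond a rolling baseline.
from statistics import mean, stdev

readings = [0.42, 0.44, 0.41, 0.43, 0.45, 0.44, 0.52, 0.58, 0.61]  # vibration (mm/s)
WINDOW = 5      # number of prior readings forming the baseline
SIGMAS = 3.0    # alert threshold, in standard deviations

for i in range(WINDOW, len(readings)):
    baseline = readings[i - WINDOW:i]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma > 0 and abs(readings[i] - mu) > SIGMAS * sigma:
        print(f"reading {i} = {readings[i]}: outside {mu:.2f} +/- {SIGMAS} sigma; schedule maintenance")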

2.2.4 Aggregate dimension: Cyber-security Pressure

Cyber-security pressures are those internal and external motivations to protect the data-assets of the company. Three second-order themes were derived from the transcripts: internally caused data departure, externally caused data departure, and data privileges.

Figure 5 illustrates the first-order concepts forming these themes. Data visualization, as mentioned in the prior section, helps control the look and distribution of data, hence governance in the sense of determining what data is important to be viewed and how it relates to company performance objectives. Departing from this interpretation, data governance elevated to a unanimous concern among the interviewees, bilaterally interpreted as 'who in the organization has access to the data, and what level of data is necessary for job-relevant use' and also as 'cyber-security, data-breaches, data leaving the company'. What data is important to see, and what data is relevant to making decisions on the job activity? This is a scalable question. The CEO requires access to all types of data, to measure performance against strategy, objectives, and trends. The field sales representative requires only data related to the territory: product sales, customer purchase history, product profitability, and so on. The product design engineer requires data on current product performance and performance data collected on prototypes of new products. The CFO requires data on everything. These are but simple examples of the scalability of job-relevant data and the associated privilege levels assigned by position and job-function. Case-study companies mature in the management of data have protocols in place that assign scalable levels of data-access, or privileges.
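A minimal sketch of such scalable data-access privileges follows; the role names and data scopes are illustrative assumptions, not the case companies' actual schemes.

# Map roles to the data scopes they may access; serve a request only when granted.
ROLE_SCOPES = {
    "CEO":             {"financials", "strategy", "sales", "engineering", "operations"},
    "CFO":             {"financials", "strategy", "sales", "operations"},
    "sales_rep":       {"sales"},        # territory-level, job-relevant data only
    "design_engineer": {"engineering"},  # current product and prototype data
}

def can_access(role: str, scope: str) -> bool:
    """Return True if the role holds the privilege for the requested data scope."""
    return scope in ROLE_SCOPES.get(role, set())

assert can_access("CEO", "financials")
assert not can_access("sales_rep", "financials")  # privilege scales with job-function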

Today, most companies allow the use of mobile devices to conduct work activities, and the same held true for the case-study companies. Doing so invites a host of opportunities for 'data to walk out the door'. Primary to accessibility is the level of data privilege. The CEO of the company may work on a laptop while traveling, a tablet on the airplane, and a smartphone when moving from meeting to meeting, with privileges to access all levels of data. The sales representative (for example) performs her or his job using similar electronic devices, with access only to job-relevant data. Both have the freedom to download relevant data onto external storage devices that may or may not be company-owned.

Herein lie three problems with data-assets walking out the door: the first relates to the employee who inadvertently loses control of computing devices through misplacement or theft; second, data can be downloaded and stored on a variety of storage devices that may contain viruses and corrupt data when uploaded into a company's data-warehouse; and third, the employee leaves the company for employment elsewhere and, while not taking physical data-assets along, the knowledge of using those data-assets departs. The first and second scenarios can be managed to greater and lesser extents through various technology control mechanisms, the third through possible legal means.10

The fear of attacks on information systems was not an overt concern among the observations. The need for tightened data security was acknowledged by those companies who manufacture products sold to retail consumers through the company's customers. For two of the furniture manufacturers, warranty data is collected through mail-in or on-line registration, the information contained being primarily date of purchase, product serial numbers, place of purchase, and the purchaser's name, address, and contact information. Intrusions by external entities usually occur through back-door efforts: employees unwittingly open phishing e-mails containing links that invite viruses affecting information systems, or the link may embed scripts that capture passwords and user names, gaining access to company servers. Protecting against e-mail hacks requires educating employees on the importance of safe e-mail use and of not responding to requests for information. Cyber-security also requires a champion to head data-security efforts, setting data-governance infrastructure and protocols and organizationally communicating the importance of data-security. Figure 5 summarizes first-order concepts and second-order themes, followed by propositions.


10 Non-disclosure, confidentiality, and non-compete agreements, if these have application to the reason for separation. If the employee chooses to leave on her or his own volition, the tacit knowledge (from data-assets) gained while employed with the company leaves with that departure, along with any information that may be harmful to the company if transferred to competitors or used against the company in any competitive manner.


Figure 5: Cyber-security Pressure

P4a: Companies that employ greater use of data-technologies will plan and implement strategies to safeguard internal data from departing the organization.

P4b: Companies that employ greater use of data-technologies will plan and implement strategies to safeguard data from external intrusion on data-assets.

P4c: Managers seek data-security technologies to prevent unwanted departure of data-assets.

P4d: Managers seek data-security technologies in response to unknown threats on data-assets.


Table 3: Data-centric Adoptive Drivers Constructs and Definitions

Item: Data-centric adoptive driver
Definition: A pressure placed on management by internal and external forces to organizationally adopt a data-driven culture.
Dimensions or measures: performance pressure; competitive pressure; innovation pressure; cyber-security pressure.

Item: Data-centric culture
Definition: The organizationally commonized use of, and reliance on, data in support of problem-solving and decision-making activities.
Dimensions or measures: problem-solving and/or decision-making methods; computer tools facilitating the use of data; continuous education on technology use.

Item: Performance pressure
Definition: Internal motivations and ambitions of senior management to invest in data-technologies for productivity.
Dimensions or measures: economic and non-economic ROI; enabling human-capital resources; sustaining operations.

Item: Competitive pressure
Definition: External motivations and ambitions of senior management to stay technologically ahead of competitors.
Dimensions or measures: meeting customer expectations; meeting industry standards; meeting regulatory standards.

Item: Innovation pressure
Definition: Internal motivations and ambitions of senior managers to build unique and differentiated capabilities.
Dimensions or measures: expanding innovation creation; data-driven innovation; innovating operations.

Item: Cyber-security pressure
Definition: External motivations and ambitions of senior management to protect the data-assets of the company.
Dimensions or measures: expenditures on data-asset protection mechanisms; senior positions leading cyber-protection efforts; frequency of organization-wide communication on data-security events.

2.3 Moderation mechanism: Management influence

Executive management, in the context of this case research, refers to individuals with the authority to make decisions that enable the achievement of organization objectives. As demonstrated in the sections on the data-adoptive drivers of technology, competitive, and performance pressures, executive management deals with many motivations for adopting data-technologies. In each of the case-study interviews those motivations varied among the executives, yet management's continued influence on the adoption and implementation of data-technologies tightly threads together the adoptive pressures, data-use practices, key processes, and performance outcomes of the organization. Figure 6 illustrates the first-order concepts derived from the interviews and the second-order themes, culminating in the aggregate dimension, Executive Management Influence. Level-one management participants have career backgrounds in engineering (1), information technologies (1), marketing (1), and operations (2); level-two, finance (5) and operations (1); and level-three, technical sales (1), operations and production (1), and finance.

Interaction with each of the participants demonstrated differing intensities as influencers in the adoption of, and adaptation to, data-technologies. Three of the five level-one influencers in this study consistently expressed their own initiative in the adoption of data-technologies as the primary driving force behind the investment and allocation of resources; the other two, being part of larger conglomerates, became influencers as part of the overall strategic direction of the parent firm.

Level-two influencers interviewed for this research are senior executives serving in finance roles. The analytic nature of the finance discipline is closely associated with 'having accurate and timely data' for decision-making. Level-two participants are the primary initiators of adopting data-technologies within their companies, given this charge by the CEOs of the organizations. Among the five interviewed companies, a joined alignment exists between the finance and information-technology functions, with the finance function providing the greater influence as to which data-technology to adopt and gauging the value of that data-technology by estimating its return on investment. The return on investment is not relegated to a discounted cash-flow analysis as the determinant of adoption (albeit that is part of the examination); it also considers how extensive the data-technology's reach is within the organization. Finance heads are the driving influence in adopting data-technologies, CEOs are disposed to give the task of selecting and adopting a data-technology to finance-function heads, and the finance function is collaborative across other function disciplines in its approach to selecting a data-technology. CFOs, VPs of Finance, and VPs of Business Intelligence drive the data-technology initiative, primarily from a data-governance perspective, while looking for ways in which data can be used across functional areas to make value-oriented decisions.
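A minimal sketch of the discounted cash-flow element of that examination follows; the up-front cost, benefit stream, and cost of capital are hypothetical figures chosen only to show the mechanics.

# Net present value of a data-technology investment at a given cost of capital.
investment = 500_000                   # up-front cost of the data-technology ($)
annual_benefits = [120_000, 160_000, 180_000, 180_000, 160_000]  # expected benefit per year ($)
cost_of_capital = 0.10                 # discount rate

npv = -investment + sum(
    cf / (1 + cost_of_capital) ** (t + 1) for t, cf in enumerate(annual_benefits)
)
print(f"net present value: ${npv:,.0f}")

A positive net present value supports adoption, but, as the interview observations above suggest, the decision also weighs the data-technology's reach within the organization, which the cash-flow figure alone does not capture.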

Reaching the user groups and demonstrating the use of the technology is critical to successful adoption, a common observation among the conversations with level-three influencers. The successful use of data-technologies is also shaped by how level-one and level-two influencers understand the challenges of level-three influencers and how those challenges flow upward to influence change. Three observations support this view of data-technology adoption: bringing functional-area heads into a room to discuss how data affects decision-making from a cross-functional perspective (DeClerq et al. 2011); function-area heads pressing upward for access to data upon which to make informed decisions; and function-area heads learning how to influence the use of analytics within their areas of responsibility. Interesting to this observation was one president's focus on 'how technologies can help reduce job-performance frustrations' as a driving motivation for committing capital and organization resources. As an influencer, the decisions made on which technologies to adopt are inclusive of 'sub-influencers' on organizationally using the technologies; final authority rested with level-one influencers. Figure 6 summarizes first-order concepts and second-order themes, followed by propositions.

Figure 6: Executive Management Influence

P4a: Level-one influencer’s with an analytic/technological orientation actively pursue the adoption of data-technologies.

P4b: Level-one influencers will seek the advice of those who will use the technology before making a final adoption decision.

P4c: Level-one influencers seek to allocate resources on data-technology initiatives.

P4d: Level-one influencers are disposed to assign the responsibility of selecting a data-technology to heads of the finance function.

P4e: Finance heads seek cross-function collaboration on selecting a data-technology.


P4f: The greater the cross-function collaboration with level-three influencers, the greater the success of adopting a data-technology.

Table 4 summarizes the construct items, definitions, and measures interpreted from the case interviews.

Table 4: Executive Management Influence Constructs and Definitions

Item: Executive management
Definition: Individuals who set company strategy, policy, and resource allocations, and who make investments, with the authority to make decisions affecting organization outcomes.
Dimensions or measures: organizational hierarchy of levels of influence in decision-making; organizational influence on technology adoption; allocation of resources; allocation of capabilities.

Item: Level-one influencers
Definition: The CEO or President of the company.
Dimensions or measures: board of directors; other answering bodies.

Item: Level-two influencers
Definition: Senior management.
Dimensions or measures: CFO, CIO, CTO, CMO, CDO; senior VPs.

Item: Level-three influencers
Definition: Function-area managers.
Dimensions or measures: VPs; general managers; controllers; directors; plant managers.

2.4 Moderation mechanism: Data accessibility and use

An April 2016 McKinsey report, 'The need to lead in data and analytics' (Brown and Gottlieb, 2016), asked respondents about data and analytics capabilities within their respective organizations; the question framed the practice dimensions of data accessibility across the organization, tools and expertise to work with unstructured data (see discussion on strategic-level data), self-serve analytics capability for users, Big Data and analytics tools (e.g., Hadoop, SAS, SPSS), advanced modeling techniques (machine learning, real-time analytics, natural-language processing), and other. Of these mechanisms, data accessible across the organization, tools and expertise, and self-serve capabilities were significant among respondents from high-performing organizations11.

Referencing this survey provides secondary evidence for the validity of the second-order themes garnered from the case studies: deepening the organizational level of data availability, and deepening the level of data understanding.

The more critical mechanism role is established in how data flows through the organization: its fluidity in terms of ease of access and use, and its viscosity, the density of data in terms of variety, volume, and organizational flow. Determining what data is important for employee use is as much a function of management channeling the right data at the right time as it is of the employee understanding what data is important to job performance.

Data-accessibility, as interpreted from the case studies, is defined as the methods and mechanisms of making data universally available within an organization's operating environment. Data-use is defined as the methods and mechanisms of integrating data variety into an organization's operating environment.

Organizationally, each participant stated, to greater and lesser extents, how information in the form of data is made available among organization levels. First-order concepts fell into five items: data access is made available to management personnel only; implementation of new data-technologies will provide the ability to move data to more levels in the organization; at some point in time employees will see the data which affects their decisions; the cross-functional use of data is important to decision-making; and self-service availability of data provides the mechanisms to move data to applicable levels within the organization. Performance relevance in the interview transcripts is captured in: 'the ability to access relevant data, versus waiting for someone to distribute a report, aiding in doing one's job; the ability to improve on processes through data-technologies to ease task intensities; data-technologies to improve job productivity; the ability to visually see performance metrics as important to job performance and innovation creation; greater job-control; job-performance effectiveness and efficiency; speedier task accomplishment; and helping build employees' knowledge and understanding about things that affect their jobs'. Job-performance expectation is then defined as: the believed use of a data-technology to enable the user's ability to efficiently and effectively perform a task.

11 High-performing organizations (HPO) were those in top-quartile rankings of revenue growth, EBIT growth, and return on digital investment; HPOs 'closely tying digital strategies with corporate strategies, responding to digitization influences by changing corporate strategies, through strategic reallocation of resources to create value and higher returns'.

Frustration-reducing mechanisms were also revealed in the case interviews; the steel manufacturer, to reduce the stressful impact of adopting a new, organization-wide data-technology, gathered the heads of each function area and made them part of the adoption process, serving to create subject-matter experts within those function areas and to facilitate acceptance. This fits the definition of cross-function collaboration: outcomes derived from the coordinated sharing of information among members of different function areas using data-analytic technologies.

Data accessibility by management only was common among participants, mostly due to controlling the confidentiality of data within the organization. The privately held companies provide accessibility to financial information to only a few individuals; sharing any revenue or profit details organizationally was primarily kept to percentage metrics, as in increases or decreases over prior-year performances or relative to equivalent time-periods. Some senior managers would share generalized revenue figures with employees, but not profitability, having sensitive information leak from the organization being the critical concern.

Data related to strategies and performance objectives would find movement to lower levels within the organization when deemed relevant to specific departments or individuals. While sensitivity and leakage of data was a cross-company concern, there is a desire to share department or division financial metrics with employees, illustrating how decisions affect financial performance and using the data as learning lessons to identify areas of improvement. 'At some point in time employees will see the data which affects their decisions': this generalized statement gleaned from the interviews illustrates the altruistic desire for employees to have access to the data that affects their job performance. Collectively among the dialogues, not all employees require, or should be advanced on, the ability to analyze data, let alone given access to data. Taken in context, none rejected the idea of educating employees to become more effective and productive in their job-performance, and some form of understanding of how data measures performance would be important. The greater concern was allowing data, regardless of its job-relevance, to spread organizationally without some form of governance on its distribution (discussed in 2.2.4). The practice portion is defined as routinely applied data-variety12 appropriate to the decision under consideration. Data-use practices then center on the type of data being captured, the accessibility of that data by employees, and the interaction methods of the data-technology tool applications employed by the organization to facilitate data use.

12 Data-variety represents the number and type of sources generating data to be collected; these can range from machine sensors, robotics, wearables, smart devices, social networks, and financial systems to MES, MRP, ERP, CRM, and similar systems which collect information concerning all operational aspects of the organization.

Practice also implies that employee job frustrations can be eased by deploying data-technologies or other perceived helpful technologies; however, technology may not deliver the intended result, possibly exacerbating frustration levels and increasing stressful effects13. Without proper education on the use of the technology, frustrations become more elevated than previously experienced.

The organizational level-depth of data availability is observed as dependent on whether management feels compelled to make data available, or believes the individuals given that availability understand its use. Investments in education on the use of data and its analysis, and investments in technologies that enable real-time accessibility of data, were consensus comments among the interviewees.

For data accessibility to be made possible at all organization levels, education is the most critical activity the organization is required to undertake. Data is invisible: a spreadsheet may contain dozens of columns with header titles and hundreds of rows of numbers or symbols, the only visible reference being the data contained in the voluminous number of cells. This is overwhelming unless the person viewing the data is a data-analyst, and even then, without tools for interpretation, data is of little use. The simple fact is, 'for data to be actionable it has to be viewable', and this visibility has to be accessible on the screen of any electronic device, consistent in the visuals of the metrics it illustrates, and preferably in real-time (a minimal self-service visualization sketch follows the propositions below). Figure 7 summarizes first-order concepts and second-order themes, followed by propositions.

13 Tarafdar et al. (2010-11): the term 'technostress' and its effects refer to the negative or frustrating experiences of the user in the use of a technology or several technologies, resulting from 'application multi-tasking, constant connectivity, information overload, consequential uncertainty in understanding the use of a technology, continual relearning of new technologies, and technical problems with a technology that causes disruption to task accomplishment'.

Figure 7: Data Accessibility & Use Integration (Mechanisms)

P5a: Job performance productivity increases with greater access to data.

P5b: Job performance productivity increases with education on the use of data-technologies.

P5c: Job performance productivity increases when data is presented in a visual manner and at a regular frequency.

P5d: Job performance productivity increases when computing devices and software are regularly upgraded to provide greater functionality.
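A minimal sketch of the visual, regular-frequency presentation in P5c follows: a single dashboard-style tile rendered with matplotlib. The metric series and target line are hypothetical.

# Render one throughput metric as a dashboard-style tile.
import matplotlib.pyplot as plt

weeks = list(range(1, 9))
units_per_hour = [118, 121, 117, 125, 129, 127, 131, 134]  # weekly throughput metric

plt.plot(weeks, units_per_hour, marker="o")
plt.axhline(125, linestyle="--", label="target")  # performance boundary
plt.xlabel("Week")
plt.ylabel("Units per hour")
plt.title("Production throughput (illustrative dashboard tile)")
plt.legend()
plt.show()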


Table 5: Data Accessibility & Use Integration Constructs and Definitions

Item: Data-use practices
Definition: Routinely applied data-technologies appropriate to decision-making and task management.
Dimensions or measures: types of deployed data-technologies; depth of organizational-level deployment.

Item: Data-accessibility
Definition: The methods and mechanisms of making data universally available within an organization's operating environment.
Dimensions or measures: hierarchical levels of data-distribution; number and type of computer tools facilitating access to data; types of data variety.

Item: Data-use
Definition: The methods and mechanisms of integrating data into useful application within an organization's operating environment.
Dimensions or measures: frequency of education on data-technologies use; expenditures on data-technology education; number of personnel utilizing data-technologies.

Item: Depth of data availability
Definition: The levels within the organization at which data can be obtained.
Dimensions or measures: cross-functional use of data; levels of data 'self-service' accessibility; functional-area involvement in determining data needs.

Item: Depth of data understanding
Definition: The levels within the organization at which data are relied upon for job functionality.
Dimensions or measures: frequency of education on data accessibility and use; levels of timeliness of data (e.g., real-time); levels of data visibility (e.g., dashboards); levels of data-supported recommendations.

Item: Data Accessibility & Use Integration Mechanisms
Definition: The methods and mechanisms making data available and useful in the organization's operating environment.
Dimensions or measures: data-technologies; education on use; data-distribution.


2.5 Data-integration practices

The umbrella term 'data-integration practices' covers the strategies of data-integration: the ordinary and extraordinary aggregate dimensions of 'strategic-level data', 'operational-level data', 'IIOT-level data', and 'data-security-level data'. 'Strategic-level data' carries the second-order themes of 'determining business opportunities', 'determining business environments', and 'determining fungibility of source materials', speaking to the strategic influence of external data on the organization. 'Operational-level data', with second-order themes of 'innovating operational capabilities', 'improving R&D capabilities', and 'sustaining operational capabilities', considers the influence of internal data as critical to sustaining company operations. 'IIOT-level data' carries the second-order themes of 'automation/robotics deployment', 'device connectivity', and 'data retrievability'. 'Data-security-level' integration carries the second-order themes of securing 'data-assets from competitors', 'protection strategies', and 'data-asset value'.

Frustration effects were observed in two situations. One concerns a newly deployed ERP system providing greater functionality than the system it replaced; confusion over process steps remained after implementation, increasing frustrations by slowing the speed of completing tasks. After several months of continued use and support from system-trained subject-matter experts, frustration subsided and was replaced by feelings of 'time better used' in accomplishing tasks, a supporting motivation for the technology's implementation. Another example revealed frustrations reduced through the use of data-visualization tools. Once the comfort level of use and the confidence level in the automatically generated data increased, and the time gained versus time lost in manually downloading information onto spreadsheets and creating visuals was acknowledged, frustrations reduced. The emotional factors here were report authorship and trust in the new technology to provide the right data at the right time for decision-making. Once individuals became accustomed to accessing the data and seeing it in creative, graphical, and illustrative views, the time spent manually downloading data and creating spreadsheets gave way to trusting the data being delivered and trusting it to support decision-making. The concept of making data visible is to help focus the individual's cognition on information (the right information at the right time) that is relevant and important to task accomplishment and decision-making; in doing so, job-frustrations are reduced.

2.5.1 Aggregate dimension: Strategic-level

What is strategic-level data? Research suggests data in support of enterprise goals (Demartini, 2014), originating from market, consumer, industry, government, and other relevant data captured from external as well as internal sources of information. Thinking on the term Big Data, the characteristics noted above are core to its generally accepted 'variety' dimension, meaning data from varying sources in both structured and unstructured forms; other dimensions are velocity, volume, veracity, and value. Desouza and Smith (2014) add viscosity, variability, and volatility to the 'V' nomenclature. In reviewing the transcripts, the overwhelming amount of data manufacturers are managing presents arguments for data clarity and degrees of necessity in decision-making, hence the density of data, or its viscous nature, in how data flows through the organization.


Opening discussions on the term Big Data revealed that seven of the interviewees referenced some or all of these dimensions, indicating an acute knowledge of the term and the contemporary relevance of its meaning. As the conversations continued, four of the seven steered towards Big Data's strategic applications. Common second-order themes framed the use of data in the following ways: first, to find and determine new business opportunities, whether supporting the existing business, discovering previously unknown opportunities, or aligning with enterprise strategies. Second, to forecast business environments: projecting human-capital needs as compared with the company's current talent pool, aligned with new technologies that are changing or will change how the business operates; and incorporating social-media data into strategy based on trending customer sentiments on both macro and micro issues that affect the company's business. Third, to consider the fungibility of resources, as in 'can existing materials used in the production process be readily substituted with other materials that will perform equally or better than those currently used'. These themes shaped the aggregate dimension of Strategic-Level Data Integration (Figure 8). Foremost among first-order concepts when discussing business opportunities is customer data, in the form of end-user surveys, affinitive data provided by customers, and trend data from the company's internal sources.

Business-opportunity data comes in the form of new-product requests or is interpreted through data-analyzed insights. Each of the interviewees stated industry and customer macro data as important to the company's product and business decisions. The source data is primarily derived from written documents subscribed to from various industry trade organizations and then interpreted against the company's business environment and its current and future effects. Five companies obtain data in downloadable form, and two through application programming interfaces (APIs), making data easily accessible and readily usable in frequently produced reports discussing trends and industry changes (a minimal sketch of such API retrieval follows this paragraph).
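A minimal sketch of API-based retrieval of industry benchmark data follows. The endpoint URL, query parameters, and response fields are hypothetical; a real trade-organization API would define its own schema and authentication.

# Pull benchmark records from a (hypothetical) trade-association API.
import requests

API_URL = "https://api.example-trade-association.org/v1/benchmarks"  # hypothetical endpoint

response = requests.get(
    API_URL,
    params={"industry": "hot-rolled-steel", "period": "2018Q3"},  # hypothetical parameters
    timeout=10,
)
response.raise_for_status()

for record in response.json().get("benchmarks", []):
    print(record["metric"], record["value"])  # e.g., defect rate, on-time delivery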

Determining business environments, to an extent, overlaps in definition with determining business opportunities, primarily in the manner of interpreting and responding to external data. Observations on the business environment produced first-order concepts of: macro-economic, macro-industry, and socio data used to understand effects on human-capital resources; understanding sentiments through socio-text analysis affecting business perceptions; economic-policy decisions impacting financial capital; government-policy decisions; and predicting competitor actions.

References to these first-order concepts varied among the interviewees. Unanimous among the dialogues were government-policy decisions and economic-policy decisions affecting business decisions and strategy. Using industry data to, in some form, predict competitor actions was voiced in three interviews. The remaining concepts were singular in response; however, they fit well into understanding the data used in determining external influences on business environments.

Definitions of fungibility take several forms: equivalent substitutability of one thing for another, something of the same performance specification, equally exchanged. Non-fungibility is the uniqueness of the thing; it cannot be substituted for another since the performance specifications cannot be equally exchanged, a certain rarity to the thing.

Four first-order concepts were observed: external information on raw-material availability; external data gathered to determine other industries' use of the same raw materials affecting availability; external data gathered on agency regulations that affect raw-material substitutability; and external data gathered from supply-chain members to determine the substitutability of materials.

Raw materials to a manufacturer are its life-blood; without them nothing gets produced. Critical to that life-blood is the material's degree of fungibility and how that fungibility affects strategy. Each company interviewed has some mechanism in place to alert it to material availability. Most are electronically fed data streams from suppliers, in the form of shipment quantities and expected arrival dates; some, for smaller manufacturers, are manual, relying on fax and e-mail communication on material availability. Data is trended in context with scheduled demand while anticipating future requirements (a minimal sketch of such trending follows this paragraph). The manufacturers in this study rely on several sources of data to determine near- and long-term impacts on their ability to sustain business. Suppliers are important to material substitutability: changing regulations and market conditions affect manufacturing materials, and suppliers have their 'ears to the ground', providing manufacturers with rich data through conversations between agents of the companies and through the companies' inter-connected information systems, or distributed data.
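A minimal sketch of that trending follows: projecting the raw-material balance from a supplier shipment feed against scheduled demand. All quantities are hypothetical.

# Project weekly material balance from on-hand stock, inbound shipments, and demand.
on_hand = 4_000                                      # units of raw material in stock
inbound = [(1, 2_000), (3, 2_500)]                   # (week, quantity) from the supplier feed
weekly_demand = [1_800, 1_900, 2_000, 2_100, 2_200]  # scheduled production demand

projected = on_hand
for week, demand in enumerate(weekly_demand, start=1):
    projected += sum(qty for wk, qty in inbound if wk == week)  # receipts arriving this week
    projected -= demand
    status = "OK" if projected >= 0 else "SHORTFALL: review substitutes and suppliers"
    print(f"week {week}: projected balance {projected:+,} ({status})")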

Important to this strategic-level integration narrative is the use of leading and lagging indicators: cause and effect, leading as the cause, lagging as the effect, those critical to the success of the company. The intent, when referencing metrics and which measures are critical to the organization strategically and operationally, is not to gain an itemized listing of indicators so much as to learn how metrics are categorized: does the company use core leading indicators when providing data support for decisions, and are those leading indicators applied in some form to outcomes, or lagging indicators? Financial metrics lag the daily sales generation of the company, customer satisfaction lags experience with product performance or product quality, plant utilization lags production-line capacity, machine use lags machine down-time, and so on; the point being that each of the interviewees expressed some form of metrics that measure causes and some form of metrics that measure effects (a simple lead-lag computation is sketched after the propositions below). Participants' references to non-financial metrics are, as the term suggests, those measures that are not financial. However, 'any metric used to measure performance can be financially tied in some way', and any metric has some form of strategic tie, not just operational. Figure 8 summarizes first-order concepts and second-order themes, followed by propositions.

Figure 8: Strategic Level Data Integration

P8: Companies that integrate external sourced data are better positioned to create new business opportunities.

P9: Companies that integrate external sourced data are better positioned to manage changing business environments.

P10: Companies that integrate external data are better positioned to manage fungibility.
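Illustrating the lead-lag pairing sketched in the narrative above, the following minimal example correlates a leading indicator (machine down-time) with a lagging indicator (plant utilization) one period later; both series are hypothetical.

# Correlate a leading indicator with a lagging indicator shifted one period.
downtime_hours = [12, 18, 9, 22, 15, 11, 25, 14]    # leading indicator (cause)
utilization_pct = [88, 84, 86, 90, 81, 85, 87, 79]  # lagging indicator (effect)

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Pair each down-time value with the utilization observed the following period,
# so the cause precedes the effect.
r = pearson(downtime_hours[:-1], utilization_pct[1:])
print(f"lagged correlation: {r:+.2f}")  # a strongly negative value supports the pairing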


Table 6: Strategic Level Data Constructs and Definitions

Item: Strategic-level data
Definition: Data in support of enterprise goals to sustain organization competitiveness.
Dimensions or measures: Big Data, characterized by volume, variety, veracity, velocity, value, and viscosity; externally sourced data applied to strategy development and strategy operationalization.

Item: Determining business opportunities (DBO)
Definition: Externally sourced data which, upon analysis, reveals insights on new commerce ideas affecting strategy.
Dimensions or measures: sourced customer data; sourced market data; sourced trend data.

Item: Determining business environments (DBE)
Definition: Externally sourced data which, upon analysis, reveals insights on changes to market conditions affecting strategy.
Dimensions or measures: sourced data on managing human capital; sourced socio-text/media data; sourced data on financial markets; sourced data on government regulatory policies.

Item: Determining resource fungibility (DRF)
Definition: Externally sourced data which, upon analysis, reveals insights on changes to resource-material preferences affecting strategy.
Dimensions or measures: sourced data from suppliers on material availability or substitutability; sourced data on regulatory environments affecting supply; sourced data on competitors.


2.5.2 Aggregate dimension: Operational-level

Strategic-level data is extracted from external sources, and some internal: data that affects strategies, the company's competitive position within the marketplace, and the decisions it makes on sustaining company operations. Operational-level data in some respects overlaps with the internal portion of strategic-level data, such as data used in product innovation that may move the company into new markets and possibly away from legacy ones. However, operational-level data is concerned with the fundamental operations of the company and the decisions made using data-supported recommendations. The manufacturers interviewed in this research could speak at length on integrating operational data into the decision-making process. There is an observed, natural inclination for manufacturers to focus on machine activity, capacity performance, labor performance, product-cost performance, customer profitability, leading and lagging indicators for predictions (some of which can be classified as strategic; although not referenced in the discussion on strategic-level data, this section explores these cross-level indicators), and others. Three second-order themes were observed among the cases: innovating operational capabilities, developing research-and-development capabilities, and sustaining operational capabilities; Figure 9 illustrates these themes in context with the related first-order concepts. A general definition of operational-level data derived from the interviews is data in support of maintaining and sustaining the achievement of organization goals and objectives; in turn, operational data supports strategic-level data. Innovating operational capabilities and improving R&D capabilities are fundamental to sustaining operational capabilities, drawing on data collected that is actionable in affecting new-product development and new manufacturing methods and processes. The remaining second-order theme speaks to data that is timely, predictive, and actionable, affecting decision-making among all operation levels within the organization and commonizing decision-making and problem-solving methods.

When ventured the question of a one-word definition of capabilities, 'resources' and 'strengths' were common responses among the interviewees. Following these responses, on the term data: 'data is a critical resource of the company' and 'data is a capability, wait, doing something with the data is a capability'; responses underscoring the criticality of data as a resource and, more importantly, as a capability of the organization.

Four first-order concepts are observed: the use of leading and lagging indicators in decision-making, the use of non-financial performance data in decision-making, the use of financial data in decision-making, and the deployed information-system technologies facilitating the timely collection and distribution of data for decision-making.

As with the discussions on types of data, the questions concerning methods in the decision-making process did not focus on problem-solving as a topic; they were relative to whether a formalized process is utilized, whether it is data-supported in making solution recommendations, and the approval path for acting on those recommendations.

Transcripts revealed two interviewed companies (both publicly traded) expressing a formal method; however, these were only relevant to requesting capital funding for projects. Pseudo-formal problem-solving approaches were demonstrated in each dialogue, most following a problem identification and impact analysis. Figure 9 summarizes first-order concepts and second-order themes, followed by propositions.

54

Figure 9: Operation Level Data Integration

P11: Companies that recognize the need to apply disciplined use of critical success factors are better positioned to sustain operational capabilities.

P12: Companies that adopt formal, commonized approaches to decision-making and problem-solving are better positioned to sustain operational capabilities.

P13: Companies that have well-integrated data-collection and analysis mechanisms are better positioned to strengthen their innovative capabilities.


Table 7: Operation Level Data Integration Constructs and Definitions

Item: Operational-level data
Definition: Data in support of maintaining organization capabilities to achieve strategic objectives and goals.
Dimensions or measures: internal data collected from information systems, digital devices, and other mechanisms where data is generated; data important to the day-to-day functioning of the organization.

Item: Innovating operational capabilities
Definition: Internally sourced data revealing insights on altering the performance of existing capabilities.
Dimensions or measures: actionable data affecting product quality, product throughput, manufacturing processes, and the allocation of human-capital resources; innovation teams; simulation testing; technology testing.

Item: Improving innovation capabilities
Definition: Internally and externally sourced data revealing insights on amplifying exploratory resources to enable innovativeness.
Dimensions or measures: actionable data collected affecting new-product development and new manufacturing methods and processes; market research; prototype testing; simulation testing; technology testing.

Item: Sustaining operational capabilities (SOC)
Definition: Internally and externally sourced data revealing insights on supporting functional resources.
Dimensions or measures: actionable data affecting decision-making among all operation levels within the organization; commonized approaches and methods for decision-making and problem-solving; leading indicators; lagging indicators; non-financial data; financial data.


2.5.3 Aggregate dimension: IIOT-level

Innovation has become such a common term that its strength of meaning becomes subdued with repetitive use, almost a throw-away term and phrase: 'let's innovate this, let's innovate that, let's innovate what', without understanding its true intent; to thoughtfully explore and discover the new or different when compared to something that currently exists, or the thing that does not yet exist; a process, a product, an improvement that favorably affects the performance of a thing, a betterment that provides a demonstrated benefit. Data serves thoughtful exploration.

Every interviewee stated innovation as important to the company. Some were skeptical of innovation's impact, especially in regard to investments made in technologies that may not be providing expected returns, a discussion saved for later in this article.

Innovation was observed in these first-order concepts: collecting data from devices and sensors attached to machines that monitor quality performance; collecting data to improve throughput or machine utilization; collecting data to determine how automation and robotics may improve the production process; and collecting data on labor use and labor performance, or the best use of human capital (a minimal machine-utilization sketch follows this paragraph).
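A minimal sketch of turning machine-state events collected from shop-floor sensors into a utilization figure, one of the first-order uses listed above, follows; the event log is hypothetical.

# Aggregate machine-state events into per-machine utilization.
events = [  # (machine_id, state, duration_minutes) relayed by sensors
    ("press-01", "running", 410), ("press-01", "idle", 35), ("press-01", "down", 35),
    ("lathe-02", "running", 455), ("lathe-02", "down", 25),
]

totals = {}
for machine, state, minutes in events:
    per_machine = totals.setdefault(machine, {})
    per_machine[state] = per_machine.get(state, 0) + minutes

for machine, states in totals.items():
    shift = sum(states.values())
    utilization = states.get("running", 0) / shift
    print(f"{machine}: utilization {utilization:.0%} over a {shift}-minute shift")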

Sublime to this discussion on innovation is the contemporary phenomenon of the Industrial Internet of Things (IIOT), the connectivity or 'smart factory' expressly desired by interviewees, and its influence on innovation. In discussing data collected from machine and equipment sensors, the most common innovation-related responses dealt with processes and how processes can be improved through collected data. Much of the interview data was collected during plant tours; no recording devices or photography were allowed, and the details were recorded later in the day upon returning from the interview. The following observations are truncated for the sake of brevity, simply to illustrate first-order concepts. Innovating on processes is innovating on quality: changing a production process has the benefit of improving quality, and machine data speaks to many possible innovations, where the intended change in one may affect another. The two terms are homogeneous in many ways; they are different while resulting in affinitive outcomes, some intended, some not. The intent of improving a process may initially have been to mitigate a safety issue; the result may also improve quality. How does one innovate on quality? By innovating on processes.

Data collected from human-learned tacit inputs becomes innovative when those inputs recommend process changes. Integrate tacit knowledge with data observations and analysis, and innovation occurs: innovation in the sense of finding new ways to perform tasks using alternative technologies. Fundamental to simply replacing human mechanized tasks with robots is the cost-benefit analysis of the investment and doing what is best for the business while considering the effect on its shareholders and stakeholders. Collected data from currently deployed robots helps in this analysis; referenced cost savings and performance benefits can be used to benchmark future plans to introduce robotics to the manufacturing process, supporting the business case for the investment.

Automating, or having data self-generated for the user, reducing or eliminating the manual aspects of the routine, can be accomplished through data visualizations (discussed earlier in this article) whereby dashboards are made available on a timely, actionable basis. The other visualization benefit is the functionality to provide self-service data accessibility, immediately populating selected variables into graphical form from data-tables onto the user's desktop. The follow-on question of whether this has reduced time spent on manual analysis drew mixed answers, reasoned by the recent introduction of automated visual technologies; weaning off well-treaded dependence on self-designed processes takes time. Figure 10 summarizes first-order concepts and second-order themes, followed by propositions.

Figure 10: IIOT Data Level Integration

P14: Companies that embrace IIOT technologies are better positioned to adapt to changing technology environments.

P15: Companies that embrace IIOT technologies are better positioned to collect and analyze data to improve operational performance.

P16: Companies that embrace IIOT technologies are better positioned to make timely changes to processes.


Table 8: IIOT Level Data Constructs and Definitions

Item: IIOT-level data
Definition: Information captured from wired and wireless networked devices.
Dimensions or measures: number of interconnected devices; number of sensors, scanners, and optimizers used to relay information; volume of data; variety of data.

Item: Device connectivity
Definition: Information relayed on machine operating status.
Dimensions or measures: number of internal network connections; number of external network connections; velocity of data retrieval; scalability of connectivity.

Item: Manufacturing automation
Definition: Manufacturing activities involving no human participation.
Dimensions or measures: number of installed robots; number of automated processes.

2.5.4 Aggregate dimension: Data-security-level

Data-security is repeated herein from the second-order theme of cyber-security pressure; the pressure to protect data from breaches also plays into data-accessibility and use, and repeating the description of this theme as explained in section 2.2.4 would be redundant to the reader. The practices employed by the interviewed companies on data privileges and the methods of securing data concern securing data-assets from competitors, implementing data-asset protection strategies, and considering data a valuable asset of the company. Two companies directly referenced securing data from leakage to competitors; both were publicly traded, possibly indicating a greater sensitivity to data intrusions, albeit this does not preclude the same sensitivity felt by privately held companies, simply not directly expressed in the transcripts. Three companies referenced data-asset protection strategies in degrees of formalized effort; all had defined individuals accountable for data-asset security. Among all the interviewees, data was viewed as valuable, something unique to each organization, a thing that holds much promise in its ability to create value. Figure 11 summarizes first-order concepts and second-order themes, followed by propositions.

Figure 11: Data-asset Security Integration

P17: Companies who integrate data-security mechanisms reduce the likelihood of unwanted data-asset departure.

P18: Companies who designate a data-security champion to communicate data-asset importance reduce the likelihood of unwanted data-asset departure.

P19: Companies who integrate a data-security strategy into enterprise strategies are better positioned to manage unwanted data-asset departure.


Table 9: Data-security Level Constructs and Definitions

Item: Data-asset security
Definition: Technology-integrated data mechanisms to protect data assets from unwanted departure from the company
Dimensions or measures: number of mechanisms in place to protect data assets; formalized protocols for the protection of data assets

Item: Data-asset protection strategy
Definition: A high-level set of organizationally integrated goals and objectives for the protection of data assets
Dimensions or measures: defined data-asset strategy or plan

Item: Data-asset value
Definition: The communicated economic and non-economic importance placed on data assets by the company
Dimensions or measures: frequency of communication on how data affects the value of the company; demonstration of data-asset value at all organization levels

2.6 Data-actuation, Key Processes

A key process, as interpreted from the case interviews, is an ordinary or extraordinary operationalized system reflecting the organization's strategic and operational goals and objectives, made visible through data-practice integration. Thinking of a process as a set of routine steps performed in an activity or task with an expected outcome, processes become invisible without knowing how they are performing: is the process efficient, is it effective, is it achieving the desired result? Data accessibility and use provide the mechanism to create visibility, identifying under-performing processes and providing insights on the problem and possible solutions for correction. Processes are interconnected; one affects another, and failure in one may infect the entire eco-process-system. The ability to quantitatively measure process steps, to adjust processes when measures fall outside performance boundaries, and to create new capability-enhancing processes are functions of data use.

Processes require data-measured visibility and connectivity to the goals and objectives they are meant to achieve; without this level of measurement, a process will underperform expectations.

The prior section's discussion of the integration themes 'strategic-level', 'operational-level', 'IIOT-level', and 'data-security-level' provided the foundation for understanding the instrumental role data use practices play in developing key processes. Insights from those discussions, comparatives among the transcripts, and follow-up conversations for topic clarification revealed three commonly referenced dimensions influenced by data use practices: productivity, planning, and innovation. Within those dimensions, productivity is built on the second order themes 'safety', 'throughput', and 'financial' as components; planning on 'trade industry', 'plant-utilization', and 'supplier'; and innovation on 'product', 'process', and 'business model' components.

2.6.1 Aggregate dimension: Productivity processes

Returning to the operational-level data discussion, the components of productivity, safety (labor related), through-put, and financial, were the items most commonly raised when discussing data use and productivity. Monitoring process-tied metrics through data practices allows processes to be modified to meet the dynamic demands of the company. A key productivity process is thus defined as the data-measurable relationship of inputs and outputs on a routine, practice, or activity mechanism, with relationship outcomes analyzed to maintain and sustain operations. Unequivocally, 'safety' is the single most mentioned item, motivated in part by watchful compliance with OSHA regulations, fear of incurring costly penalties, and safety consciousness to minimize the expense of workers compensation insurance. While externally influenced, these are not the only reasons for concern over safety; conversationally there existed a sincere desire to protect worker safety and welfare regardless of other motivations. Moreover, this protection can be construed as self-interested: the self-interest of the company in maintaining its labor force. Lost talent, experience, and job knowledge are difficult to replace, even more so in manufacturing given this research's earlier discussion of finding and retaining talent. Collecting data on incidents and on processes that rely heavily on human activity, and using that data to modify the process, is critical to safety. Given manufacturers' propensity for human involvement in the production process, safety is demonstrably a critical component of productivity.

The second most referenced item is 'through-put': the production, machine, and labor processes that constrain the ability to meet production goals, and the use of data to determine the degree of constraint imposed by existing processes. Primary uses of data in through-put analysis include learning process interrupters, machine and equipment maintenance, change-overs and set-ups, and monitoring patterns of down-time to determine unanticipated causes. Companies interviewed also consider labor a constraining element on production, asking whether processes can be automated to improve through-put.
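As one way to operationalize this kind of down-time monitoring, the following minimal sketch ranks machine-event log entries by lost minutes; the log fields and cause labels are assumptions for illustration only.

```python
# Minimal sketch: scanning machine-event logs for down-time patterns and
# ranking causes by lost minutes. Fields and cause labels are hypothetical.

import pandas as pd

events = pd.DataFrame({
    "machine": ["press1", "press1", "lathe2", "press1", "lathe2"],
    "cause": ["changeover", "maintenance", "setup", "maintenance", "jam"],
    "minutes_down": [35, 120, 25, 90, 15],
})

# Total lost minutes by cause: the largest totals indicate the binding constraints.
by_cause = events.groupby("cause")["minutes_down"].sum().sort_values(ascending=False)
print(by_cause)

# Share of all down-time attributable to each cause, for prioritizing fixes.
print((by_cause / by_cause.sum()).round(2))
```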

The third component, financial, concerns the use of data analysis on process costs, the cost of activity, and the investments required to improve those processes. Three interviewees viewed any process investment in financial terms, the process being defined by its cost performance. Figure 12 models this dimension, illustrating the first order concepts that formed each second order theme.

The safety component is shaped by first order concepts of monitoring safety issues as productivity metrics, frequent reporting on OSHA compliance requirements, monitoring workers compensation insurance expense, evaluating processes that affect employee safety and welfare, and protecting the labor pool working in the plant, topics found across the cross-case comparatives. 'Manufacturers are natural review targets for OSHA' (President, consumer products company); this statement does not claim manufacturers are a singled-out industry, it references the many processes in the manufacturing environment suspect for unsafe working conditions. Safety strategy was voiced among all participants: establishing goals and objectives on safety performance as it relates to productivity, with elements of regular manufacturing process reviews, safety education curricula and an annual event calendar, appointed safety leads, a methodology for data collection, and performance-tied metrics aligned with productivity objectives.

Through-put was defined by interviewees in unique yet similar ways: manufacturing process efficiency 'to maximize production capabilities', using 100% of production capacity, meeting planned production goals or production schedules, achieving annual unit production objectives by product line. The second most discussed influence on productivity, through-put extends to 'data use on monitoring process enablers and constrainers'. Enablers are implemented manufacturing activities and technologies that facilitate efficient process performance.

Constrainers are identified manufacturing activities and technologies that impede efficient process performance. 'Data use to improve on production processes': within the manufacturing environment, idle machinery and inefficient production processes are costly, both in unmet production objectives and in company value, thereby affecting, in the short and long term, customer expectations and market share, labor retention (if layoffs occur), the maximized use of assets, capabilities and ROI, and the opportunity for innovation; each an item referenced during interviews. Phrases central to the discussion, 'keep the production lines running at capacity', 'look for ways to increase capacity', 'minimize production line interruptions', 'meeting our unit projections', each reflect a factor of maximizing production capabilities, examined through data by monitoring and identifying process enablers and constrainers.

Third among the components of productivity is financial: processes based on collecting cost data, labor time allocated to the process and the cost of labor, energy used in the process and the cost of energy, maintenance required of the process and the cost of maintenance, and the cost of process interruptions, with process improvements measured by the cost of investment and the expected return. Paraphrasing, and unique on the financial aspect of productivity, is the 'cost of complacency' (VP finance, primary metals); in other words, measuring the opportunity cost, the benefit lost in accepting current process performance rather than regularly re-evaluating how processes perform and seeking alternatives to improve cost effectiveness14. The concept is to bring visibility to process cost and process value contribution; accepting current processes as adequate may be the right decision if they perform to expectations, but the insight is to not accept adequacy without re-examination. Figure 12 summarizes first order concepts and second order themes, followed by propositions.


14 While the terminology was not referenced, the comments made by these interviewees represent a form of zero-based budgeting: regularly reassessing processes to find better cost alternatives.
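The 'cost of complacency' idea lends itself to a simple opportunity-cost calculation. The following minimal sketch compares a current process against a re-evaluated alternative; every figure is a hypothetical placeholder, not a number from the interviews.

```python
# Minimal sketch: 'cost of complacency' as an opportunity cost, the annual
# benefit forgone by keeping the current process. Figures are hypothetical.

current_cost_per_unit = 4.80      # labor, energy, maintenance per unit today
alternative_cost_per_unit = 4.35  # projected per-unit cost after a process change
annual_units = 200_000
change_investment = 60_000.0      # one-time cost to implement the alternative

annual_savings = (current_cost_per_unit - alternative_cost_per_unit) * annual_units

print(f"Cost of complacency (forgone benefit per year): {annual_savings:,.0f}")
print(f"Payback on the change investment: {change_investment / annual_savings:.2f} years")
```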


Figure 12: Data-actuation Key Productivity Processes

P20: Companies that use data to regularly evaluate safety processes are better positioned to limit work-force vulnerabilities.

P21: Companies that use data to regularly evaluate production processes are better positioned to expand manufacturing capabilities.

P22: Companies that regularly review processes in financial terms are better positioned to increase the value of productivity.


Table 10: Key Productivity Processes Constructs and Definitions

Item: Key process
Definition: A mechanism employed to accomplish the organization's strategic and operation goals and objectives; a mechanism as a routine, practice, or activity
Dimensions or measures: made visible through data use practices; strategic-data alignment; operational-data alignment; process mechanisms

Item: Key productivity processes
Definition: The data-measurable relationship of inputs and outputs on an operation activity towards some expectation
Dimensions or measures: relationship outcomes of operation processes; productivity improvement actions

Item: Safety component
Definition: The data-measurable mechanisms deployed on maintaining and sustaining human capital
Dimensions or measures: safety strategy; OSHA responsiveness; workers compensation insurance expense; labor pool protection; worker welfare; worker satisfaction

Item: Through-put component
Definition: The data-measurable mechanisms deployed on expanding manufacturing capabilities
Dimensions or measures: maximize production capabilities; process enablers; process constrainers

Item: Financial component
Definition: The data-measurable mechanisms deployed on optimizing manufacturing costs and value contribution
Dimensions or measures: costing processes; costing methods; costing contribution; cost of manufacturing; cost of complacency; costing elements


2.6.2 Aggregate dimension: Planning processes

Three component themes support planning processes: trade industry utilization, capabilities utilization, and buyer/supplier utilization. Planning as a process, as defined through comparative interpretation of the case studies, is the integration of business elements relevant to current and future business activities, aligned with strategies and objectives. Business elements are represented by these second order themes; their relation to current and future activities is found in the data that support 'making plans', with plans agile to changing environments and re-aligned as necessary to achieve company strategy and objectives. Manufacturing is built on adherence to production-planning techniques that maximize process capabilities to meet through-put objectives, the expected unit production. Plans are made for achieving strategies and objectives, and more business elements shape planning than the themes derived from the interviews describe; what follows provides context important to the planning process.

The trade industry component is formed by first order concepts of: using strategically and operationally related external data to create databases of trade-related information impacting plan formation and sustained competitiveness (e.g. commodity markets, raw-material availability, capital markets, labor markets, and competition); market trends; maintaining trade industry standards; and complying with government or agency regulatory policies placed on the trade industry. Trade industry, for this research, represents the degree to which strategic data (e.g. externally sourced from industry, government agency, market, and other research providers), when processed with operational data, influences plan development in near- and long-term planning. Business elements of strategic data include capital markets, important to funding strategic ambitions, and credit vehicles for funding operations; maintaining trade industry standards (as standards change, planning is required for adjusting to those standards, perhaps entailing new machinery or processes, with investment in those changes an important input to meeting new standards); and complying with government or agency regulatory policies placed on the trade industry (EPA and USDA regulations were most often mentioned). Inclusion of these first order concepts as business elements in planning sustained competitiveness within respective trade industries invites the following proposition.

The capabilities utilization component considers projecting manufacturing capacity to meet planned unit production objectives, projecting labor resources, waste utilization, and projecting gaps in capabilities that may prevent plan achievement; supplier data in terms of plans to ensure material reliability, quality, and availability; customer data in terms of forecasted demand; and market trends in terms of forecasting material requirements. Not all companies interviewed tie raw materials to commodity trade markets, but all have some form of marketplace providing operational data for planning around anticipated cost and availability variances. Whether for steel, sugar, cocoa, wheat, timber, oil, polyesters, energy, lumber, or textiles, production planning projects material flow to meet forecasted demand.

Manufacturing capacity as a capability begins with knowledge of annual product demand. Projecting capacities is dynamic; as the year progresses, adjustments are made for demand fluctuations, raw material availability, or unexpected events that disrupt capacities. Capacity planning begins with knowing production targets in terms of the units required by the plan and the projected unit cost associated with product profitability. Assessing manufacturing capabilities to meet targets and projecting raw material supply follow, in context with associated manufacturing costs and plan achievement.
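A first-pass capacity check of the kind described reduces to simple arithmetic. The following minimal sketch compares a planned unit target against available machine-hour capacity; all figures are hypothetical placeholders.

```python
# Minimal sketch: first-pass capacity check against a planned unit target.
# Capacity figures and costs are hypothetical placeholders.

planned_units = 1_200_000        # annual production target from the plan
units_per_machine_hour = 150
machine_hours_available = 7_200  # scheduled hours across the line for the year
unit_cost = 3.40                 # projected cost per unit tied to profitability

capacity_units = units_per_machine_hour * machine_hours_available
gap = planned_units - capacity_units

print(f"Capacity: {capacity_units:,} units; plan: {planned_units:,} units")
if gap > 0:
    extra_hours = gap / units_per_machine_hour
    print(f"Shortfall of {gap:,} units (~{extra_hours:,.0f} added machine-hours needed)")
else:
    print(f"Headroom of {-gap:,} units at planned cost {unit_cost}/unit")
```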

Planning the labor requirements of the production process, in context with capacity resources, is an ongoing dilemma for manufacturers. The attractiveness of manufacturing as an industry, working in rural locations, perceived lower levels of compensation and benefits, perceived routineness of manufacturing job activities, and perceived limits on career advancement are dimensions retarding the recruitment and retention of employees. Another dimension of labor resource capacity is skill-set and knowledge-set capacity. Adopting technologies requires manufacturers to seek individuals with the skills and knowledge to work with ever more complex manufacturing processes; the more technologically advanced the manufacturer becomes, the greater the influence technology has on shaping the skill and knowledge capacities required of the organization and on planning for those resources. The greater the shrink in available labor, the more reliant the manufacturer becomes on alternate forms of manufacturing, such as off-shoring to where labor is available or planning increased use of automation and robotics to offset labor shortfalls. Labor resource capacity is possibly more critical than any other resource in the strategic plans of the organization.

Planning for waste utilization, waste economic contribution, or waste monetization (terms used among interviewees) has always been important to organization value in the manufacturing process. Recycling, remanufacturing, and repurposing of material proved evident in planning discussions of 'things we plan on' (President, metal fabrication company). Some company income statements carry a line item for waste materials monetized through sale to third parties; waste economic value also reaches the income statement through remanufactured products, and some through energy used in the manufacturing process. The financial-report reference is a call-out of how waste impacts organization value and of the importance of waste resource capacity in planning.

Analyzing data on look-forward capabilities utilization when setting plan outcomes reveals possible gaps in their accomplishment: the voids between available resources and capabilities and those required to execute a plan. One-year look-forward operation planning based on a rolling twelve months is believed a more effective means of anticipating resource gaps than a traditional one-year start-and-stop process (i.e. complete one year's plan and begin planning for the next). Using rolling trends allows a company to view past events that may have strained resources and impeded plan performance, and to project possible recurrence in the near future (one to two years).
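The rolling twelve-month approach can be illustrated with a trailing-window calculation. The following minimal sketch uses a hypothetical monthly demand series and a naive trend projection; it is illustrative only, not the planning method of any interviewed company.

```python
# Minimal sketch: rolling twelve-month look-forward instead of a
# start-and-stop annual plan. The monthly demand series is hypothetical.

import pandas as pd

months = pd.date_range("2017-01", periods=24, freq="MS")
demand = pd.Series(range(900, 900 + 24 * 10, 10), index=months)  # stand-in demand

# Trailing twelve-month totals highlight strained periods from the recent past...
trailing_12m = demand.rolling(window=12).sum()

# ...and a naive look-forward projects the next twelve months from that trend.
projected_next_12m = trailing_12m.iloc[-1] + (trailing_12m.iloc[-1] - trailing_12m.iloc[-13])

print(trailing_12m.dropna().tail(3))
print(f"Projected next-12-month demand: {projected_next_12m:,.0f} units")
```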

The buyer/supplier utilization component is the degree to which data collected from customers and manufacturing material suppliers is integrated into planning processes. Marketplace influences, requests for new materials and material formulations to meet changing customer preferences, and forecasting of future demands and material changes are highly dependent on proximity to buyer and supplier data sources. Planning relies on customer purchase history; forecasting relies on predicting products falling out of market favor and introducing replacements, influenced by customer input on demand affecting planning horizons. Figure 13 summarizes first order concepts and second order themes, followed by propositions.


Figure 13: Data-actuated Key Planning Processes

P23: Companies who analyze externally sourced industry data in planning processes are better positioned to sustain competitiveness.

P24: Companies who regularly analyze data on manufacturing capacity utilization are better positioned to meet changing market demands.

P25: Companies who regularly analyze levels of labor resource capacity are better positioned to adapt to technology-changing environments.

P26: Companies who continuously scan for the optimal economic return on the use of waste materials are better positioned to add incremental value to the organization.

P27: Companies who regularly analyze and identify gaps between capabilities and planned outcomes are better positioned to achieve organization strategies.

P28: Companies who regularly analyze trends in buyer demand and the supply of materials used in product manufacturing are better positioned to adapt to changing competitive strategies.


Table 11: Key Planning Processes Constructs and Definitions

Item: Key planning processes
Definition: The integration of organized activities on current and future business elements aligned with strategies and objectives
Dimensions or measures: agility on changing environments; planning techniques; strategic data; operational data; external data integration; internal data integration

Item: Trade industry utilization component
Definition: Integration of external and internal data in short- and long-term planning mechanisms
Dimensions or measures: capital markets; commodity markets; supply resources; human capital resources; regulatory environments

Item: Capabilities utilization component
Definition: Integration of external and internal data in the planned use of resources
Dimensions or measures: manufacturing capacities; labor-skill-talent capacities; waste-material utilization; capabilities gaps

Item: Buyer/supplier utilization component
Definition: Integration of customer (buyer) and supplier data in the planned use of resources
Dimensions or measures: projecting material reliability; projecting material demand; projecting customer demand; projecting buyer trends

2.6.3 Aggregate dimension: Innovation processes

Three second order themes form innovation processes as components: new processes, introduced through data analysis and created by adopting new technologies; new products, developed with data analytics; and people, whose interaction with data and technologies motivates innovativeness. Innovation in a general, one-word meaning is 'change', its attributes either incremental, retaining degrees of familiarity with what exists in the operating environment, or novel, introducing something totally new.

Innovation as a process adds complexity to the simple 'change' definition; the steps to creating innovation are not linear but involve thinking in non-linear terms. Developing new processes may require understanding related and unrelated processes affecting multiple functional areas, and innovations on those may precede the intention to change or create a new focal process. Innovation as a product found evidence in the development of product enhancements: the agriculture biologics company formulating new strains to enhance plant growth, the wood products company investing in new metal fabrication facilities and developing unfamiliar new products, the food-processing confectionary company developing new candied flavors for existing products, and the furniture company designing new modular cabinetry adapted to changing life-styles. Data from customer input, employee input, benchmarking competitor products, and experimentation are first order concepts of innovation's product component.

Innovation as people is the most interesting and unique of the components revealed during the interviews. Process and product innovation are the top-of-mind responses when asking the meaning of innovation; in two interviews, common discussion of the people factor of innovation surfaced.

The process component of innovation considers changes to processes made by introducing a new technology into the manufacturing environment, or by using data to review current process performance and, through data analysis, innovating that process to improve performance. Understanding non-linearity in manufacturing (e.g. production control, demand uncertainty, and through-put speeds) is important to innovating processes; observations of non-linearity when adapting data technologies to change processes brought forward innovation through redesigned production processes. Production data is used to measure process performance against some expected outcome. When data indicate a short-fall or over-fall against expectations, analysis of the result and its timing factor significantly into the level of responsiveness on the process. When asked about 'reacting to data', general responses regarded machine maintenance, down-time, getting things up and running, not losing production time, and getting it fixed as common characteristics of data use in process innovation. Observing processes and reviewing performance data may lead to the introduction of new technologies (e.g. robotics) or the redesign of those currently in use (e.g. re-engineering of existing equipment). Innovating processes is as much a function of determining something new to add as of tweaking what exists. Fundamental to process change is understanding the cost of the current process in terms of lost opportunity compared with the opportunity gained by changing the process.

The product component is a simplified term meaning insights found through data analysis for the development of new things or improvements to existing things; in other words, the influence of data on product innovations. As with processes, evidence of product innovation was learned during factory tours and discussed in face-to-face meetings. When discussing product innovation, details are held to minimum explanation in this article section; agreed non-disclosure reasons allow only general references to observed or articulated innovations underway. Comparing competitive products is a good method for collecting and measuring performance data and determining feature-to-price-to-value relationships; the collected, analyzed data may influence innovations on current products or suggest the opportunity to develop a new product.
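One simple way to frame such feature-to-price-to-value comparisons is a features-per-dollar ranking. The following minimal sketch assumes hypothetical products, feature scores, and prices; it is not a benchmarking method described by any interviewee.

```python
# Minimal sketch: feature-to-price-to-value comparison across benchmarked
# products. Products, feature scores, and prices are hypothetical.

import pandas as pd

bench = pd.DataFrame({
    "product": ["ours", "competitor_a", "competitor_b"],
    "feature_score": [7.5, 8.2, 6.9],  # summed weighted feature ratings
    "price": [129.0, 149.0, 109.0],
}).set_index("product")

# Value proxy: features delivered per dollar; a low rank flags where either
# the price or the feature set is a candidate for innovation.
bench["value_per_dollar"] = bench["feature_score"] / bench["price"]
print(bench.sort_values("value_per_dollar", ascending=False))
```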

The people component: innovation is people driven, i.e. the insights revealed through data analysis and data use are operationalized by human interaction to create innovativeness, changing the requirements of human capital to maintain and sustain operations. Observing work behaviors and highlighting issues frustrating an employee's ability to perform a task or meet performance expectations can motivate innovativeness. An employee's self-aware observation to 'do something' about an issue is innovative thinking; whether the decision is to recommend a change in procedures to reduce the frustration or the simple act of leaving the company, each initiates something new, an action taken on a new idea. Management by walking around, observing work behaviors, taking time to understand existing processes retarding performance and causing frustration, and gaining insights and involvement from the individuals performing the task are each actions leading to solutions and significant contributors to innovation processes. Motivating innovation and sustaining innovativeness among employees results from working with the employee to find solutions, innovations on how the task is performed. Innovativeness is encouraged through idea exchanges during department meetings, on-line suggestion boxes (some still use the 'box-on-the-wall' method), off-site idea-generating sessions, third-party innovation camps, and employee SWOT analyses. Bringing employees together on task improvements creates innovation opportunities. Cross-collaboration among functional areas for sharing data and acting on data analysis has been widely discussed; the groups may not be formed specifically as 'innovation teams', yet they are usually purposed with developing innovative solutions to identified opportunities. Figure 14 summarizes first order concepts and second order themes, followed by propositions.


Figure 14: Data-actuated Key Innovation Processes

P29: Companies who regularly apply data analysis to production processes are better positioned to create process innovations.

P30: Companies who regularly benchmark products against competitors are better positioned to create new product innovations.

P31: Companies who regularly encourage and act on innovation inputs from employees are better positioned to sustain innovativeness.

P32: Companies who encourage cross-function collaboration on organization problems and opportunities are better positioned to sustain innovativeness.

P33: Companies who demonstrate commitments to innovative thinking through technology and work-space investments are better positioned to sustain innovativeness.


Table 12: Key Innovation Processes Constructs and Definitions

Item: Key innovation processes
Definition: The integration of new or modified mechanisms to create novelty and value
Dimensions or measures: existing processes; new processes

Item: Innovation
Definition: Incremental improvement on mechanism operation, or the introduction of new mechanism novelty supplanting existing mechanisms
Dimensions or measures: pre and post changes on existing mechanisms; pre and post changes on new mechanisms

Item: Process component
Definition: Changes on mechanisms by the introduction of a new technology
Dimensions or measures: production data; non-production data; new processes tested; new process introductions; new technologies adopted

Item: Product component
Definition: Changes on products, or introducing new products, as influenced by internal and external data
Dimensions or measures: competitor benchmarking; customer data; supplier data; customer focus group data; purchased business data; prototype data

Item: People component
Definition: Changes on requirements of human capital to maintain and sustain operations
Dimensions or measures: work-task assessments; data-sharing cross-function teams; inter-department collaborative networks; innovation teams; innovation centers


2.6.4 Aggregate dimension: Data governance processes

Data governance and data security are symbiotic in definition and relationship: data security refers to data-asset protection; data governance refers to allowable data-asset access, the protocols for how and by whom data is accessed and used. The reader will recognize some overlap in definition; this is unavoidable, since both topics manage the same resource, data. Three second order themes are shaped through the case studies: governance infrastructure, governance leads, and governance access.

Characteristic of data governance is the infrastructure companies build to govern data: technologies (virus, malware, and phishing protection were commonly mentioned among the case study participants) and the scalability of data controls as the organization and its information systems change through upgrades to new software and hardware.

Governance leads are instrumental, at both the data-security integration level and the data-governance process level, in the company having a governance strategy in place, led by a champion who oversees the distribution of data and authorizes access to its use. The champion also views data governance from a competitive-advantage perspective: the protocols necessary to ensure those measures are superior to competitors' in constraining unauthorized access, thereby reassuring stakeholders of measures that protect not only data assets but image assets such as company reputation and the reputation of its interactions with customers and suppliers.

Governance access, data warehousing (e.g. cloud-based, internal servers), data access privileges, and the scalability of data distribution requirements are fundamental protocols requiring formalization as part of the company's overall cyber-security strategy. Figure 15 summarizes this dimension, followed by propositions.


Figure 15: Data-governance processes

P34: Companies who generate high levels of data access will build data governance infrastructures on data containment.

P35: Companies who generate high levels of data usage will treat data governance as a management responsibility.

P36: Companies who generate high levels of data are more apt to employ the latest in data warehousing techniques.

2.7 Organization performance outcomes

This case-study research deduces organization performance outcome to mean results from processes measured against planned expectations: a simple definition, purposely made broad to allow generic interpretation and application. The prior section's discussion of processes summarized interviewee thoughts on the key dimensions affecting performance and operationalized by data use practices: productivity, planning, and innovation. Interviewees measured organization performance along several dimensions, weighting those on one outcome term, growth, generally interpreted as planned improvements on company capabilities.

The research's origin question, whether the investment made in data technologies makes a difference on company performance in a knowledge-intensive environment, is demonstrated through outcomes realized in some measure of value. Growth as an outcome value transitions to the effect of data technologies on organization change, market presence, and sustained innovativeness. The second order themes employment growth, capabilities growth, cyber growth, and financial growth form the aggregate dimension 'organization growth'. Existing and new market growth aggregate to 'market growth'. New business creation, new intellectual-property creation, and technology investments round out the dimension 'innovation growth'.

The manner in which interviewees applied measures to capabilities growth dimensions varied, with organizational measures the most similar in discussion and measurement; market growth topics were limited, interviewees not desiring to expose competitive particulars, and guarded discussions on innovation revealed only general insights into company activities. In this section, for the sake of brevity, only summary reference support for first order concepts and second order themes is given. Figures summarize first order concepts and second order themes followed by propositions; tables contain construct items, definitions, and measures.


2.7.1 Organization capabilities growth

Figure 16: Organization Capabilities Growth

P37: Companies seek investments in advanced technologies to mitigate declining work-force capabilities.

P38: Companies who invest in advanced technologies also invest in recruiting and retaining high skill level employees.

P39: Changes in a company’s employment population may indicate either an expansion or contraction of capabilities.

P40: Companies who invest in regular upgrades on technology tools are better positioned to recruit and retain a skilled work-force.

P41: Companies who invest in regular assessments of work satisfaction are better positioned to recruit and retain a skilled work-force.

P42: Companies who regularly assess capabilities anticipating technology investments will be better positioned to address gaps in capabilities to achieve company objectives.


P43: Companies who make data-protection a strategically important capability are better positioned to manage cyber-security events.

P44: Companies who view technology investments as long-term in generating positive returns are better positioned to organizationally embed the use of the technology.

P45: Companies who establish a financial means to measure the asset value and profit contribution of a technology are better positioned to make future technology investments.

Table 13: Organization capabilities growth (Labor)

Item: Labor capabilities growth
Definition: Change in work-force population (WFP) size
Measure: time-period comparatives of WFP size

Item: Employment-tied advanced technology investments
Definition: WFP size affected by the introduction of advanced technologies
Measure: WFP size pre-introduction; WFP size post-introduction

Item: Organic employment growth
Definition: WFP size reflective of increases or decreases of internally added capabilities
Measure: WFP size pre-addition; WFP size post-addition

Item: Non-organic employment growth
Definition: WFP size reflective of increases or decreases of externally added capabilities
Measure: WFP size pre-addition; WFP size post-addition

Item: Digital technology infrastructure
Definition: Replacement frequency of tool upgrades, software, and hardware tools used by employees
Measure: average annual expenditure on non-labor related IT per employee; posted time-schedule on technology tool upgrades

Item: Worker sustainment assessment
Definition: Surveys and or other methods used to determine levels of frustration on technology-aided productivity
Measure: frequency of assessments taken; skill level requirements of a technology compared with those currently possessed


Table 14: Organization capabilities growth (Manufacturing)

Item: Manufacturing capabilities growth
Definition: A resource employed to achieve company objectives
Measure: type of capability

Item: Capabilities growth
Definition: Change in the number of resources and technologies used to achieve company objectives
Measure: time-period comparatives on number and type of available capabilities; time-period comparatives on the number of deployed technologies

Item: Capabilities verticalization
Definition: In-sourcing of resources to achieve company objectives
Measure: number of capabilities developed using internal resources; capital investments made on in-sourcing resources

Item: Capabilities reconfiguration
Definition: Repositioning resources to achieve company objectives
Measure: number of capabilities out-sourced to third parties; changes in number of in-source capabilities based on new technology introductions

Table 15: Organization capabilities growth (Cyber-security)

Item: Cyber-security capabilities growth
Definition: Changes in resources to prevent the unauthorized departure of data assets
Measure: time-period changes to resources committed on data-protection

Item: Cyber capability
Definition: Resources made available on the protection of data assets
Measure: financial investments made to prevent data breaches; infrastructure investments made to prevent data breaches

Item: Cyber advantage
Definition: Data-asset protection capabilities greater than those held by competitors
Measure: annual financial investment in cyber resources as compared with competitors; number of cyber events versus competitors; cost of cyber events


Table 16: Organization capabilities (Financial)

Item: Financial capabilities growth
Definition: Changes in returns on asset performance
Measure: time-period changes to ROA or RONA

Item: Technology-induced profit growth
Definition: Changes in profit performance of a process with the introduction of a new technology asset
Measure: technology investment compared with performance expectations; technology performance pre and post introduction on profit performance

Item: Technology-induced cost improvements
Definition: Changes in cost performance of a process with the introduction of a new technology asset
Measure: technology investment compared with performance expectations; technology performance pre and post introduction on cost performance

Item: Technology embeddedness
Definition: The steps taken by the company to ensure successful implementation of a technology
Measure: business case on cost-benefit analysis; annual expenditures on technology education; time-line of technology selection; time-line of technology implementation
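As one way to operationalize the pre- and post-introduction measures in Table 16, the following minimal sketch compares average monthly profit performance around a technology go-live and estimates the months needed to recover the investment; all figures are hypothetical placeholders.

```python
# Minimal sketch: pre/post comparison of process profit performance around a
# technology introduction, per the measures in Table 16. Figures are hypothetical.

pre_profit_per_month = [41_000, 39_500, 42_300, 40_200]   # months before go-live
post_profit_per_month = [44_800, 46_100, 45_400, 47_000]  # months after go-live
technology_investment = 180_000.0

pre_avg = sum(pre_profit_per_month) / len(pre_profit_per_month)
post_avg = sum(post_profit_per_month) / len(post_profit_per_month)
monthly_lift = post_avg - pre_avg  # profit change attributable to the technology

print(f"Average monthly profit pre: {pre_avg:,.0f}, post: {post_avg:,.0f}")
print(f"Months to recover the investment at this lift: {technology_investment / monthly_lift:.0f}")
```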

2.7.2 Planned and/or market capabilities growth

Observation and summarization of interviewee discussions on data and its influence on competitiveness yield themes of using data analysis to sustain or grow business among existing customers and of creating growth opportunities through new-customer acquisition or entry into new markets. These categorizations, titled existing market growth and new market growth, form the aggregate dimension market growth: the evolutionary changes to the organization's capabilities as influenced by the planned introduction and future use of new technologies. Three first order concepts illustrated through transcribed statements on existing market growth spoke to customer involvement with product enhancements leading to increased sales with existing customers, acquisitions of competitor businesses within the same served market, and analysis of customer data to manage selling environments. First order concepts shaping new market growth concern data gathered from external and internal sources to identify new markets; entering new markets, those in which the company does not currently conduct business, is a growth mechanism to sustain strategic and operational objectives.

Figure 17: Planned or market capabilities growth

P46: Companies that plan on adopting advanced technologies are better positioned to sustain future competitiveness.

P47: Companies that utilize data-technologies are better able to plan capabilities to sustain future competitiveness.


Table 17: Market Capabilities Growth Constructs and Definitions

Item: Market growth
Definition: Changes on the organization's competitiveness
Measure: time-period changes to share of market, measured as a percentage of market share

Item: Existing market growth
Definition: Changes on customer performance in currently served markets
Measure: market potential measured against items-sold performance; changes in the number of competitors

Item: Existing market performance
Definition: Financial objective measures of currently served markets
Measure: number of items sold per customer; revenue per item sold per customer; profit per item sold per customer

Item: Customer selling environment
Definition: Market area attributes in which the company operates
Measure: customer segment items sold; customer segment sales trends; changes in the number of competitors; externalities affecting customer segments; items sold by geography

Item: New market growth
Definition: Strategic decisions to expand a company's presence into marketplaces not currently served
Measure: analysis of affinitive product data to reveal new market opportunities; analysis of internal data to reveal verticalization opportunities leading to new market entrance; analysis of research data to reveal new market opportunities


2.7.3 Aggregate dimension: Innovation capabilities growth

Without the introduction and implementation of new ideas into the organization's thought-stream, performance stagnates, products stale, employees weary of unproductive tasks, firm value declines, competitors encroach on customers, and human capital departs for greener working pastures. These summarized interviewee expressions on outcomes are the core reasons for organizationally spurring innovation. Extrapolated from first order concepts, three second order themes form growing organization innovativeness: productivity momentum, from data-induced innovation; organization momentum, toward data-driven innovativeness; and concept momentum, externally and internally collected ideas gathered to sustain operations and competitiveness. The operative term within innovation growth is momentum, meaning the consistent and continuous flow of internal and external idea inputs transformed into value-creating activities or products to sustain a company's competitiveness.

Figure 18: Innovation Capabilities Growth


P48: Companies that utilize data-technologies are better positioned to sustain productivity innovation.

P49: Companies that utilize data-technologies are better positioned to sustain organization innovativeness.

P50: Companies that utilize data-technologies are better positioned to sustain concept innovation.

Table 18: Innovation Capabilities Growth Constructs and Definitions

Item: Innovation growth
Definition: Changes on the consistent and continuous flow of ideas transformed into value-creating activities or products
Measure: number and type of acted-upon ideas; annual profit contribution of implemented new ideas

Item: Productivity momentum
Definition: The consistent and continuous flow of ideas on improving processes
Measure: number of changed processes; pre and post outcomes of process changes (e.g. cost reduction, through-put, quality improvement); annual expenditures on information and or digital technologies

Item: Organization momentum
Definition: The consistent and continuous flow of acted-upon ideas made by employees
Measure: number of idea recommendations per employee versus number of implemented ideas; education expenditures per employee on technology use; annual digital technology expenditures

Item: Concept momentum
Definition: The consistent and continuous flow of ideas unique and new to the company and or marketplace, transformed into competitive advantages
Measure: annual R&D expenditures; annual number of prototypes developed; annual number of prototypes that become marketable; annual expenditures on customer and third-party inputs; annual number of new patents applied for


2.8 Research Model: The data-centric eco-system of a manufacturer

The purpose of this research is exploratory; its approach examines topics related to the use of data analytics in a knowledge-intensive environment through case-study methodology. Important to this examination is doing so with minimally held biases or pre-conceived knowledge of the subjects of interest. Rather, the richness of the study comes from allowing interview conversations to flow along a semi-structured format in which topics could be explored beyond the limitations of focused questions.

Figure 19: Case Study Concluding Research Model

[Figure 19 depicts the data-centric ecosystem model: data-centric adoptive drivers (innovation pressure, performance pressure, competitive pressure, and cyber-security pressure) flow into data integration practices (strategic-level, operation-level, IIOT, and data-security integration), then into data-actuated key processes (productivity, planning, innovation, and data governance), and finally into organization performance outcomes (organization capabilities growth, market capabilities growth, and innovation capabilities growth); executive management influence and data accessibility and use integration mechanisms moderate each stage across manager levels 1 through 3.]


In some respects, the author acknowledges the a priori aspect of this research; however, its uniqueness and depth of intrinsic discovery revealed insights possibly not found or explored through the traditional survey methods typically used in empirical studies. Brought to the forefront are four principal models encased within the data-centric ecosystem model illustrated in figure 19, any of which provides a platform for additional research and discovery; each model divides by theme-aggregated dimensions, producing fifteen sub-constructs.

2.8.1 Final aggregation

The aggregated dimensions, as previously discussed, are now interpreted as macro dimensions, macro in the sense of overarching, generalized dimensions. Data-adoptive drivers are shaped by internal and external pressures. Internal pressures are the ambitions and motivations toward adopting a data-centric mind-set driven by self-desired needs to improve organization functionality, as demonstrated in the case studies by executives who desire better ways to improve the organization environment not directly based on external pressures. Whether those desires are to provide technologies that enable better job performance or a more productive working environment, they reside in CEO and management beliefs in making the work-place experience one that facilitates company strategies and objectives. Performance and innovation pressures are internal to the organization, both independent of and dependent on external pressures. The company has a choice to improve performance through innovativeness, at times independent of external pressures: providing employees with new technology hardware and software for productivity may not initially be motivated by competitive pressures, and moving to new offices to facilitate a better working environment may be driven by the need to give employees a pleasant place in which to work. These actions may be motivated by the desire of the CEO or owner to have a place representative of their vision of the company; indirectly these motivations are tied to external pressures, while directly their effect is pronounced through increased productivity and performance. External pressures are those that influence internal pressures to take certain actions or adopt certain technologies to sustain competitiveness. If competitors are 'deep-diving' into data analytics to gain competitive advantage, the focal company has a choice: do nothing and possibly watch market share diminish, match those initiatives to keep pace with the market, or exceed those initiatives and hopefully gain a greater competitive advantage. The case studies revealed competitive pressures as external to the organization yet internally adapted. Cyber-security was a 'carve-out' effect found in the case studies: an external pressure caused by the intrusion of entities outside the company to compromise or steal data assets. While some cyber-security activities are internal to the company, their motivations may be external, through data 'walking out the door' to competitors or other harmful actors, as discussed earlier in this article. Unique external pressures such as this are those a company may not recognize until an event brings realization to something previously unknown. Anticipating unknown risks, a dimension not fully discussed during the interviews, underpins the external pressures companies experience and is a topic for further research.

When considering data integration practices and data-actuated processes, these are placed in aggregated 'ordinary and extraordinary' dimensions of effect on the company. Ordinary dimensions are those common to the organization; extraordinary dimensions are those uncommon to it, those with the potential to up-end current practices. Ordinary dimensions are those identified as strategic- and operation-level data integration practices and as productivity and planning actuated processes. Extraordinary dimensions are those uncommon: among integrated data practices they are represented by Industrial Internet of Things and data-security integrated practices, and among actuated processes by innovation-actuated and data-governance-actuated processes. Companies ordinarily integrate strategic- and operation-level data in decision-making and problem solving. New technology trends uncommon to the company, such as IIOT applications and the adoption of data-security-level integration, add to the newness of data integration and are extraordinary to the organization. Likewise, extraordinary actuated processes are those affected by integrated practices: innovation processes result from IIOT integration, and data governance processes result from data-security-level integration.

Executive management influence (EMI) and data accessibility and use mechanisms (DAUM) are relied upon by the organization to integrate and actuate data in growing organization capabilities and producing value-creating outcomes. The case studies allude to these mechanisms being continuous in their effect on the organization, flowing from the initiating pressures for data-centric adoption through to value-creation outcomes. Surfacing in several conversations was the need for CEOs, owners, and senior management to provide the resources and support necessary to realize positive returns on technology investments; simply purchasing a new technology without emphasizing its continuous productive use diminishes the technology's value-creating ability. The first order concepts of EMI comprise resource allocation, the importance of data analytics in decision-making and problem solving, securing talent with an analytic and business mindset, the CEO / president / owner leading by 'use' example, requiring cross-function participation in improving operations, and understanding the changes data-centric adoption invites on the organization; these concepts are continuously emphasized by managers to enhance positive returns on technology investments. DAUM relates to the continuous provision of mechanisms enabling data accessibility and use; its first order concepts center on organization-level distribution of job-relevant data, timeliness of data distribution, data visibility and automated data generation (versus manually accessing data on an ad-hoc basis), and user education and training as important to maintaining continuous data accessibility and use. Much like EMI, the consistent reinforcement of data applications is found in how data is understood and used throughout the organization, its deepening within the organization.

Outcomes are capabilities-growth oriented and segmented into those that are organization, market, and innovation related. Reclassifying these dimensions on the basis of capabilities growth and value creation considers the effect of data-centric adoption on current capabilities growth and planned capabilities growth, the former representing recent effects on organization growth and the latter on market growth. Value creation is the effect of data-centric innovativeness in the form of current and planned cost improvements on the organization. Given these generalized aggregations, the research model is reframed in figure 20.


Figure 20: Case-study Generalized Concluding Research Model

[Figure 20 depicts the generalized model: data-centric pressures (internal and external) drive data-integrated practices (ordinary and extraordinary), which drive data-actuated key processes (ordinary and extraordinary), leading to capabilities growth and value creation; executive management influence and data accessibility and use mechanisms moderate the entire flow.]


CHAPTER 3: VOICE OF LITERATURE

3.0 Theory

Following chapter two's Voice of the Manufacturer and its reliance on case-study findings foregoing prior researched literature, the research now turns toward the Voice of Literature. At times in this literature review, sprinklings of insights gleaned from the case studies are interjected into the narrative.

Harking back to the originating research questions: do business analytics15 [data technologies] make a difference on company performance in today's information-intense environment, and subsequently, how are manufacturing firms capturing, integrating, and actuating data to create value? What are the drivers of data-centricity adoption, what mechanisms influence the successful integration and actuation of data technologies, and how do data technologies facilitate the growth of organization capabilities? Is data a tangible or intangible resource, a strategic asset unique to the company, used for competitive advantage? Is data analytics a competency critical to firm performance? Does the external business environment motivate management to adopt a data-driven mindset strictly for competitive reasons, or are the reasons equally associated with the organization's ability to sustain its existence by providing a data-oriented working environment to manage the ever-increasing demands of a world being consumed by the immediacy of information and digitization? Theory speaks to data as a unique resource of the firm, data analytics as a capability of the firm, organizational learning from the application of BA-DA as knowledge of the firm, and knowledge turned into performance outcomes, aligned with expected, value-creating returns to the firm, becoming a strategic capability.

15 Business analytics (BA) is used interchangeably with data analytics (DA).



3.0.1 From the RBV perspective

Wade and Hulland (2004) premise that the Resource Based View (RBV) of the firm is useful when studying and conducting research on information technology and systems topics16. RBV states that a firm's internal environment is the driver of competitive advantage: the organization develops and possesses certain resources and capabilities that shape its competitive ability to operate in the external environment. Wernerfelt (1984) posed the firm as 'a bundle of resources tied semi-permanently to the firm, characterized by the resource's tangible and intangible nature, its strength and weakness that alters over time on importance to the firm in the ability to produce high returns.' Data, as a resource important to the company's internal environment, is both tangible and intangible. Hunt (1997) describes tangible resources as physical and tactile, and intangible resources as non-physical and non-tactile, both enabling the firm to create a competitive resource advantage (RA theory) through their use to efficiently produce and effectively market something of competitive value. Data in a tangible resource-advantage state is the outcome of its analysis in the form of a decision that leads to a new product, a better process that can be financially tied to cost improvements, a return of value that can be quantitatively calculated.

16 Wade and Hulland recommend, when studying RBA as a part of IS Research, to consider variables that are not only firm-level since items such as ROI, ROA, market share, can be too restrictive and limit the ability to measure more important effects on the organization. IT and therefore DA, have cross-organization effects and its effect should be viewed from other measures. This paper’s research took care in measuring items in terms of employment, production outputs, processes changed, energy use in the production process, and several others to more accurately measure outcomes. In addition, Wade and Hulland advise to consider attributes of performance assessment, competitive assessment, and over-time performance in designing variables.

98

be quantitatively calculated. Data, an intangible intellectual property as commonly understood in literature, is without form until analyzed and used in some fashion to produce an outcome. Data over time can lose its value if not updated, or its capabilities diminished if investments made in DA infrastructure to reveal relevant insights are not made, hence the semi-permanent nature of data, its perishability and consumability.

Data is also an intangible asset: Del Canto and González (1999) posit a firm's 'scientific and technical knowledge most frequently bring together the requirements necessary for competitive advantage, difficult to externally observe, invisible assets on the balance sheet'. Data and data-analytics likewise are invisible assets, made visible only when applied to a purpose that produces a tangible result.

Resources are understood dimensionally. Multi-categorized as financial, physical, human, technological, reputational, and organizational (Grant, 1991). Tri-categorized by Ansoff (1965, 1988) as physical, monetary, and human; by Amit and Schoemaker (1993) as physical, human, and technological; by Barney (1991) as physical capital, human capital, and organization capital; and as physical assets, knowledge assets, and human assets (Wang and Ahmed, 2004). Bi-categorized as individual-level resources and firm-level resources (Lee et al., 2001); property-based and knowledge-based (Miller and Shamsie, 1996).

Resources are firm specific, 'all assets, capabilities, organizational processes, information, knowledge, controlled by the firm enabling it to conceive and implement strategies improving its efficiency and effectiveness' (Barney, 1991). Resources are competencies core to the organization, distinctive for competing in the marketplace: rare, inimitable, non-substitutable, and non-reproducible (Barney, 1991; Prahalad and Hamel, 1990, 1994). Competency is dependent on the firm's ability to manage capabilities organizationally categorized as cross-functional, broad-functional, activity-related, or specialized (Grant, 1996); capabilities sustain competitive advantage by deploying resources combinatively and recombinatively among firm-specific, legacy-developed, tangible and intangible organizational processes, through resource interaction responding to changing internal and external environments (Amit and Schoemaker, 1993; Teece, et al., 1997); in other words, how resources and capabilities are managed and organizationally integrated.

Viewing data through the RBV and RA lens, it is a cross-dimensional, firm-specific resource; its fit into any one of the referenced categories is easily justified, due in part to all resources generating data and resources being organizationally intertwined, with data exchanged among those resources. DA is a cross-organization capability and competency; the more dexterous and agile a company is in capturing, integrating, collaborating (Akter, et al., 2016; Davenport, 2014), and managing data, the greater the opportunity for innovation to occur, creating economic value for the organization superior to competitors.

3.0.2 From the KBV perspective

Given the resource-dimensionality of data cloaked in RBV, the knowledge-based view (KBV) strengthens data as a critical organization resource. Thinking of data within the DIKW hierarchy, it is viewed as symbolic products of observation, functional in application and only useful when relevant (Ackoff, 1989; Rowley, 2006); in this sense, no wisdom can occur without data. Data is the secret ingredient and DA the formula on which the organization relies to grow capabilities for competitive advantage, a proposition that conveniently fits the RBV and KBV narrative.

Knowledge is embedded in organization culture, routines, policies, systems, artifacts, and people (Grant, 1996); knowledge-based resources are (for the most part) heterogeneous to each firm (Alavi and Leidner, 2001) and the most important resource of the firm (Alavi and Tiwana, 2002). Data requires analysis before transforming into information, thus becoming organization knowledge, forming competencies, building capabilities, shaping intellectual assets, and enabling superior organization performance (Prahalad and Hamel, 1994), especially in the digitized revolution of Industry 4.0 (Schwab, 2016).

Knowledge is categorized as core, required to manage operations; advanced, required to sustain competitiveness; and innovative, required to excel in competitiveness (Zack, 1999). Knowledge is embedded organizationally as a system for processing information and solving problems, efficiently managing information for making decisions and creating new knowledge for innovation in changing environments (Nonaka, 1994). If the intent of DA is value creation, it is primarily accomplished through insights delivered by transforming information into innovation knowledge: developing new and or improved products, processes, or business models. Subsequently, these theories suggest firms that strategically allocate resources create greater value through data accessibility and use, generating higher returns and making competitive advantages possible. RBV and KBV provide grounding to understand resource value and the motivations to value data, support the cross-functional importance of data sharing and DA in creating innovation, and carry implications for strategy. Missing from this discussion is the learning aspect that data and DA provide when operationalized to meet the challenges found in knowledge-intensive, competitive environments.

3.0.3 From the Organization Learning perspective

Data as a rare resource and as a knowledge capability thereby serves the organization in its decision-making actions, bridging its asset uniqueness to an asset application critical to understanding changing environments and enabling competitiveness. Organization learning (OL) states that, to be competitive, goals need adjusting to changing environments, with understanding and interpretation of the environment to assess strategies (Weick, 1979; Daft and Weick, 1984; Fiol and Lyles, 1985); it is a strategic capability (Bapuji and Crossan, 2004) and a dynamic capability, adapting to changing business environments by developing new knowledge for competitive advantage (Kandemir and Hult, 2005). Cyert and March (1963) posit OL as a strategic capability necessary for creating a competitive advantage, where organizational decision-making is fundamentally dependent on available information [data] to make informed decisions quickening the advantage. The learning aspect, itself the remembered outcome of past decisions, is made useful in future actions. Remembered learning then becomes the degree of organizational effectiveness when expectations, shaped by learning, are aligned with dynamic environments; recurring in a continuous cycle of environment scanning and knowing what data to collect, knowing its meaning, and knowing its use towards achieving some goal and or objective. Organization learning is manifested through a culture of learning (Fiol and Lyles, 1985); adopting a DA cultural mindset is reflective of management's strategic support of DA initiatives, empowering learning via decision-making flexibility made possible by an organization control structure (e.g. centralized, decentralized) that invites willing innovativeness (Fiol and Lyles, 1985) to address changing environments. Barney (1986), on organization culture and financial performance, theorizes culture adds value to the firm, possessing a certain rarity among competitors that is difficult to imitate; accordingly, a data-centric culture embedded by OL presents a competitive advantage for the firm choosing to embrace knowledge- and digital-intense contemporary environments.

3.0.4 Technology organization environment perspective

To the extent relevant to this literature research, DePietro, et al. (1990) and Tornatzky and Fleischer's (1990) technology-organization-environment framework of technological innovation adoption (Baker, 2012) has applicability to the adoption of data-analytic technologies; the adoption of new technologies stems from organizational and environmental motivations. The technological context of the framework considers the relevance of internal and external technologies required of the firm; the organizational context comprises firm demographics and structure, descriptive measures indicating the need for technology adoption; and the environmental context comprises the external pressures placed on the firm by the market in which it operates (technological, competitive, regulatory, etc.). While not used as a theoretical foundation for this research, the framework does provide supportive context for the originating research model developed from the case studies: the technological context being the data-centric drive to adopt data-technologies, the current practices employed by the firm, and the mechanisms (EMI and DAUM) used to data-integrate practices and data-actuate processes; the organizational context being the organization structure and firm demographics (the case studies revealed company size isn't necessarily a predictor of technology adoption); and the environmental context being the external pressures exerted on the firm which influence desires for innovativeness.

3.1 Executive management influence

Commitment from senior executives is essential to the successful implementation of organization-wide information systems (Laughlin, 1999), requiring leadership in vision, setting a tone of importance for the implementation, active participation, and, as in DA, active use of the technology (Wittmer, 2014; O'Leary, 2000). Senior leaders that pay attention to advanced analytics will have transformational effects on the business (Barton and Court, 2012) and could realize productivity gains 5% to 6% higher than industry peers (McAfee and Brynjolfsson, 2012). Executive management comprises 'people vested with authority and responsibility for the organization' (Parnell and Bresnick, 2013, p 2). These include any individual possessing a leadership role: a level-one executive possesses a 'C-suite' (chief of) and or president title, level-two would be senior vice presidents, level-three mid-level vice presidents, and level-four functional area directors and managers.

Management influence on the adoption of and adaptation to new technologies impacting organization change is shaped by personal leadership characteristics: the charismatic inspiration to enlist the aid of others through effective communication of a vision, a direction (Wittmer, 2014); an enthusiastic driving force behind a cause; an understanding of organization culture to align it with new paradigms; being transformational (Bass and Avolio, 1993 17; Rubin, et al., 2005); an analytic propensity (Davenport, 2010); a collaborative mind-set; a belief in the power of technology to enable organization performance; flexibility to changing environments, allowing risk-taking innovativeness (Singh, 1986); a willingness to understand the nature of work; and allocating resources for the advancement of the organization; attributes identified among the executives interviewed for this research.

17 Bass and Avolio describe leadership behavior as 'transformational, transactional, and non-transactional', with transformational representing active / engaged leadership.

Gladwell (2002), in the context of organizational change, states the message of change is best communicated by those who can inspire change and thereby influence organization culture (Vales, 2007). Executive management is comprised of those individuals in the best position to influence a data-driven organization culture, 'developed by a group as it learns to cope with external adaptation and internal integration' (Cooper, 1994, p 18); influencers consistently emphasize the beliefs and values of adopting a culture-changing technology, with those possessing a greater 'people versus production oriented behavior' exerting the most influence (Harper and Utley, 2001, p 11) on that adoption.

A data-driven organization culture is then the outcome of management adopting a data-analytic mindset and investing in technologies that allow operational, organizational, managerial, and competitive benefits to be realized (McDermott and Stock, 1999) and that sustain a culture. The use of data and its analytics is revolutionary in changing roles, transforming how "companies organize, operate, manage talent, and create value" (Henke, et al., 2016, p 3). Doing so requires leadership from the CEO accepting a new cultural reality, where not adapting to a data-centric environment is done at the risk of long-term sustainability. As executive management influence increases, the expectation of cross-function collaboration in the use of data-technologies increases. Hwang, et al. (2015, p 5) describe this coordination as 'the extent to which intra-inter organizational work collaboration is required to achieve complex performance outcomes', being 'responsible for a shared outcome' (Klotz, 2017, p 2), 'improving on the organization's responsiveness towards problem solving capabilities, creative capacity, and efficient allocation of resources' (Parker, 1994; Holland, et al., 2000).

Cross-function collaboration on data-analytics use aids in realizing greater returns on technology investments. "Many organizations have responded to competitive pressure by making large technology investments … without adopting the necessary organizational changes to make the most of them" (Bughin, et al., 2016, p 34). Without adequate resource allocations and executive leadership on using data-technologies, expected returns on the invested technology may never be fully realized. Furthering this belief is Jeff Immelt (CEO, General Electric, 2016 18), who stated "an analytics-enabled transformation is as much about a cultural change as it is about parsing data [….] product managers have to be different, salespeople have to be different, on-site support has to be different [….] and treating analytics as being core to the company over the next twenty years just as material science was for the past fifty", referencing how data, its sources, and connectivity are critical to ensure long-term, successful performance.

Bridging from the case-study definition, executive management influence is redefined here as those individuals with authority, control, and persuasiveness over company resources and assets to affect the application and use of a technology.

18 McKinsey & Company (October 2015) Interview ‘GE’s Jeff Immelt on digitizing the industrial space’


3.1.1 Data-analytics driven enterprise

Ubiquitous digital-native cultures are those businesses built on foundations of data, data-analytics, and algorithms as the primary raw-material resources used for their existence; companies such as Google, Amazon, Uber, Airbnb, Lyft, Netflix, Pinterest, Spotify, Tesla, and others, through advanced algorithms, found that varied digitized data streams had value on which to build successful, innovative business models. These ground-up new models began with organization cultures driven top-down by management and relying on data to shape how the business is structured and managed, how it would operate to achieve its objectives, and how decisions are made. It is much easier to plant new seeds of a data-driven culture in fresh organization soil, and much more challenging when managers of legacy organizations must till the organization to plant the seeds of change.

Yet some very conventional, legacy-laden companies such as General Electric, Siemens, Boeing, Airbus, John Deere, and others have made the transition to being more culturally data-driven; doing so required a management mind-set to think digitally from the top-down and then make the financial and human-capital investment necessary to instill a new culture. Organizations that establish formalized data-use strategies that influence organizational culture have executive proponents who are also users of data, where data insights and intuition are balanced inputs in organizational decision-making processes, where data-use is strategically versus operationally applied, and where data becomes more than a tool for process optimization; it is used in the discovery of new ideas and innovation (Ransbotham, 2015; Ransbotham, et al., 2016).


The challenge many managers find in moving the organization culture towards being data-driven is making 'the expense and risk of data worth it' (Fitzgerald, 2015, p 3). Realizing favorable, culture-changing returns on investment in data-technologies places certain performance pressures on managers to ensure data accessibility and its useful, productive application.

Cultural change is driven from the top-down, and data is the life-blood for establishing data-driven organizational change; "CEOs are on the hook for performance" (Henke, et al., 2016, p 1), for understanding how data as a valuable corporate asset (Bughin, et al., 2016) impacts organization performance. Top-down driven organization change is not a new revelation to the study of executive management. However, transitioning to a culture that is data-driven requires senior management to be 'one with the data' 19 to reshape how the company operates, possibly altering strategies to take advantage of revelations found in a new data-driven culture.

McKinsey Research (Brown and Gottlieb, 2016) found that "ensuring senior management involvement in data and analytics activities is the most significant reason for data-use effectiveness" and the most significant challenge is securing senior leadership for analytics projects (p 7). Forty-two percent of the 519 respondents classified as high-performing companies (those that have adopted data-analytics capabilities versus industry competitors) stated analytic activities can be attributed to a three-percent (three hundred basis points) increase in revenue (less data-analytic-enabled companies reported a one-percent increase). Assuming the accuracy of this survey, this form of revenue increase is substantial, regardless of organization size. In that same report, the challenge of attracting and retaining business users with analytic-related skills was found to be one of the most important factors in creating a data-driven culture (followed by data scientists and engineers), primarily due to limited career pathing and aggressive compensation competition for like talent capabilities. An in-depth discussion of these reasons was not detailed in the report, and suffice to say any related explanation on the part of this research paper would be pure speculation, albeit worth a study in future research.

19 Referencing an acceptance of data as the primary source for analyzing business operations, opportunities, objectives.

Establishing a data-driven culture begins with the senior executives of the organization and their propensity to allow data to influence decision making. Who has access to the data, and in what form, is critical to decision making (McAfee and Brynjolfsson, 2012), especially when data is "scarce, expensive to obtain, and not digitized (easily accessible formats)". All too often 'decisions by intuition' over-rule the consideration of data, where opinions become predictive and decision-making judgement is the bias of "the highest paid person's opinion (HiPPO)" (Kaushik, 2009; Marr, 2017; Mauboussin, 2012).

A data-driven culture exemplifies five management characteristics (McAfee, 2012): 1) executive leadership that understands the decision power contained in the data yet remains sensitive to intuitive insights; 2) data talent management to clean and organize data-sets and assist with data interpretation vis-à-vis visualization; 3) technology strategy to embrace new tools that enable data analysis and data collection; 4) decision making that makes relevant data available and grants decision-making rights while also ensuring cross-functional cooperation; and 5) a company culture that becomes 'question oriented' by continuously asking what problem we are trying to solve, what we know and do not know about the problem, what data is needed to better understand the problem, what the origin of the data is, and what type of analysis is required to facilitate decision making.

For example 20: What is causing order-processing delays, what data do we need to understand the problem, who is it affecting, and who else do we need to speak with about the problem? What is the trend of use for a certain product, who other than our current customers can benefit from this product, how do we identify them, and how do we reach them? How do we set optimal pricing based on material resource availability? We need to know the timing of receiving and off-loading incoming freight; can we speed this process? Each of these real-life, case-study questions requires data, and the culture of the company should be one where the employees involved with the problem are assessing solutions through the support of data to make the most informed choices.

Data-influenced decision making is requisite on the number of management responsibility levels within the firm and the freedom made available to managers for enacting decisions. Larson (2012) discusses the impact of data use by managers through the lens of the timing sensitivity of data relevance and the type and contextual goal or scope of the decision, through a pyramidal illustration whereby: 1) 'front-line' decision makers are responsible for 'day-to-day' operation decisions requiring 'concrete measures' of 'detail-level data' relative to the function(s) being performed, with an 'hourly-daily latency' data requirement; 2) 'mid-level managers' are focused on the firm's short-term goals, where 'concrete measures' of summarized 'drill-down' capable data and or data-mining are necessary to the decision-making process, with latency requirements being data viewed on a weekly or monthly basis; and 3) 'upper-management' is concerned with long-term goals supported by 'concrete measures' of 'highly summarized KPIs' with both historical and forecasted latencies.

20 Source of examples taken from consulting experiences of the author with manufacturers.

Interesting to the use of data in creating organization culture, Kiron et al. (2014) positioned organizations at three levels: the 'analytically challenged', 'analytical practitioners', and 'analytic innovators'. Analytically challenged organizations are prone to rely on management experience and intuition versus data analysis in decision-making; the application of data is strictly seen as a means to cost-reduction initiatives, data quality and accessibility are often poor, and the organization usually lacks overall data management capabilities. Analytical practitioner organizations are working to become a data-driven culture, operationally focused on data-use, where data quality is 'good enough' to support decisions and the organization possesses the right type of data needed to make decisions. An analytic innovator organization is led by senior management that embraces the use of data in all decision-making processes within the organization; it views data strategically, with a mind-set that values the importance of data as an asset, and its senior management and management teams are skilled in data-use and analytics.

A data-driven culture "unites business and technology around a common goal through a specific set of behaviors, values, decision-making norms and outcomes" (Kiron, et al., 2014, p 10) with a common language when discussing the meaning of data 21, "should have purpose, a grounded foundation, and mindful adoption" 22, and the "real power of analytic-enabled insights comes when they are so fully embedded in the culture that the predictions and prescriptions drive a company's strategy and operations, reshaping how the organization delivers on them" (Henke, et al., 2016).

21 Kiron et al. (2014): Behaviors are defined as the strategic integration of information management and business analytics, promoting collaboration and analytic best practices across the organization, investments made in analytic tools and education, and senior management pressure to become data-driven. Values: data is a 'core asset' with 'top-down' mandates. Decision-making norms: data insights lead to strategies, analytics out-weighs experiential information in major decision-making scenarios, and there is an openness to challenging the status quo. Outcomes: analytics changes the way business is conducted, causing shifts in centers of power due to the reliance on data for decision making.

Adopting a data-driven culture is in effect a situated change causing a planned organizational transformation: adapting to the impact of data-analytic decision-making processes and their continuous effect (Orlikowski, 1992, 1996) on reshaping the organization's operating strategies, as data-made decisions alter previously held paradigms about how the organization functions and its performance expectations. Granted, organizational transformation is academically considered revolutionary (i.e. a large-scale, disruptive undertaking), while organizational development is viewed as evolutionary (i.e. a well-planned, incremental process) in its effect on the organization (Porras and Silvers, 1991). A data-driven culture, as noted in current literature, is viewed as taking a revolutionary step or leap in the advancement of the organization and as such fits a potentially disruptive change to the proverbial 'way things are done around here' attitude; however, in its application as a culture-changing vehicle, moving the organization into a data-driven state appears more evolutionary, in that it requires considerable planning and coordination of activities (the impact initially incremental, eventually perhaps leading to more radical changes). Data analytics, much like the introduction of any new advanced technology into the organization environment, should consider the interplay of the introduced technology with social structures and human interaction, which can enable or constrain a cultural change (DeSanctis and Poole, 1994).23 Questions senior management should ask of itself to frame a data-driven culture are: 1) is the organization open to new ideas and willing to challenge the status quo, 2) is data viewed as a 'core asset', 3) is there a 'top-down' initiative to make the organization data-driven, 4) does data help form strategies, and 5) can we listen to the data and alter the way business is conducted (Kiron, et al., 2014)?

22 Foundation (Henke, et al., 2016): technology and infrastructure (varied use of analytic tools and methodologies) and organization and governance (instilling a company-wide data orientation, building teams with complementary data skills). Insights: purposeful data and purposeful use cases (capturing pertinent data from internal, external, hard, or fuzzy sources; applying data to value-creating cases that clearly measure impact). Loops not lines: feedback loops (observe, orient, decide, act, feedback, repeat). Actions: insights into action (integrating insights into real-life workflows and processes, creating user-friendly interfaces and platforms) and adoption (making adoption the deliverable).

23 Adaptive Structuration Theory describes the interplay between advanced information technologies, organizational social structures, and human interaction, where group decision support systems comprise a 'spirit' of group decision processes, participatory leadership, time efficiencies, orderly conflict management, and an atmosphere that facilitates cooperation and coordination of a technology adoption.

3.1.2 Barriers to a data-driven culture change

McAfee and Brynjolfsson (2012) found leadership, talent management, technology, decision-making, and company culture to be challenges in establishing a data-driven enterprise; resistance to adopting data-centricity is often due to senior managers' reliance on experience, intuition, or, as the authors state, the HiPPO (highest paid person's opinion) in decision-making processes, supplanting data evidence with non-evidenced opinion. A 2016 McKinsey survey ranked the challenges companies face in establishing a data-driven culture: first was 'designing an appropriate organizational structure to support data and analytic activities', followed closely by 'ensuring senior management involvement', third 'designing effective data architecture and technology infrastructure', fourth 'securing internal leadership for data and analytics projects', and fifth 'constructing a strategy' for data use. Rounding out the remaining responses were 'tracking business impact of data and analytic activities', 'attracting and or retaining appropriate talent', 'investing at scale', 'providing support', and 'creating flexibility in existing processes to take advantage of data-driven insights'. Categorizing the responses, the report identifies three areas in which companies should think about the internal barriers they could encounter: 1) strategy, leadership, and talent; 2) organizational structure and processes; and 3) information technology infrastructure; albeit there is debate among researchers and practitioners as to whether data activities operate somewhat independently from traditional IT departments. As an aside to the above discussion, four types of data talent are sought by companies moving towards a data-driven culture: data architects, data engineers, data scientists, and data translators 24. Arising from the survey, companies struggling to gain a foothold on a data-driven culture are faced with internal talent not capable of aiding a cultural transition, let alone senior management possessing both the desire and the utilitarian belief that becoming data-driven is key to ongoing organization performance and sustainability.

Other barriers to becoming culturally data-driven are concerns over privacy issues and data leakage, which have become ever more commonplace news events as reported by various information sources. Privacy issues bridge to cyber-security, where competitors could gain access to sensitive company data. Both privacy and cyber-security barriers move onto a larger stage of concern and precipitate the financial liabilities companies would face should such unauthorized access occur.

24 The data architect designs data systems and processes, data engineers build data products and scale solutions, the data scientist analyzes data utilizing advanced tools and techniques, and data translators have both a technology and business-application understanding of the data and its application in business settings. The age of analytics: Competing in a data-driven world, McKinsey and Company, 2016. The report also highlights a shortfall in available talent that will extend well beyond demand; as academia works to catch up on developing data-analytic programs, it is projected that in the mid-to-late 2020s, available talent will begin entering the work-force to satisfy demand.

Patience with the data can also become a barrier; the insights data provides may not immediately become a monetized outcome, and managers should recognize that insights revealed today and ignored could become the business models or innovations of tomorrow and drivers of growth.

3.1.3 Implementing a data-driven culture

Companies that embrace a data-driven mind-set are more likely to align DA and corporate strategies, responding strategically to changes that affect products, distribution, processes, supply chains, and the internal-external organization eco-system, avoiding siloed behaviors, and commonizing cross-function cultures and customer-support perspectives (Bughin et al., 2017). The items required to successfully implement a data-driven culture synthesize into the following. Use case analysis and or problem-solving methods on how data fulfills the information required in decision-making, aligned with expected outcomes, routinely asking what data is needed to understand the question. Know the data ecosystem: the source and type of available data, internal and external to the organization, determining the key data and variables important to the question under consideration. Establish data-use protocols to test various analytics, seeking predictors that aid in understanding business processes and outcomes. Build cross-function collaboration that allows data to be woven together across the organization to provide deeper insights as to cause and effect on business processes. Data-analytics integration: make accessibility to data and its analysis organization-wide, driven to where the most opportune and efficient decisions can be made. Automate data processes with intuitive user interfaces that aid the efficiency of functions and work-flows. Educate: build frontline and management data-analytic capabilities. Manage the cultural change through frequently measured key performance indicators, making those KPIs visible and relatable. Appreciate the asset value of data, its cost to acquire and use, and make an effort to understand its useful return on investment through innovative new processes and products (Bughin et al., 2017).

Hm1a: Executive management influence (EMI) continuously moderates with data accessibility and use mechanisms (DAUM) on data integration practices.
Hm1b: … on data actuation processes.
Hm1c: … on organization outcomes.
Hm1d: EMI has a positive and direct effect on data accessibility and use mechanisms (DAUM).
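To make the form of these moderation hypotheses concrete, a minimal sketch of the type of regression analysis used in the confirmatory phase follows: an ordinary least squares model with an EMI × DAUM interaction term. The data file and column names (emi, daum, dip) are hypothetical placeholders, not the dissertation's actual survey items or analysis code.

    # Hypothetical sketch of an Hm1a-style moderation test; a significant
    # interaction term is the usual regression evidence of moderation.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey_responses.csv")  # assumed file of Likert-scale scores

    # Mean-center predictor and moderator so main effects stay interpretable
    df["emi_c"] = df["emi"] - df["emi"].mean()
    df["daum_c"] = df["daum"] - df["daum"].mean()

    # dip = data-integration practices outcome; '*' expands to both main
    # effects plus the emi_c:daum_c interaction
    model = smf.ols("dip ~ emi_c * daum_c", data=df).fit()
    print(model.summary())  # inspect the emi_c:daum_c coefficient and p-value

The same pattern extends to Hm1b and Hm1c by swapping in the data-actuation and organization-outcome variables.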

3.1.4 Making the data investment

To provide economic context for 'Big Data', studies by Business Intelligence, Gartner Research, and other research organizations state the number of connected devices will surpass 20 billion by 2020. IDC Research forecasts these interconnected devices [known as the 'Internet of Things'] and related information systems will create a business-analytics market of $187 billion (USD), with $39.2 billion (USD) gleaned from manufacturing businesses. Per the US Census Bureau [2012], 297,191 manufacturers are operating in the United States; 251,956 with revenues less than $10 million (USD) are classified as small enterprises (small-market), 44,240 with revenues less than $1 billion as medium (mid-market), and 815 with revenues greater than $1 billion as large (large-market). To establish data-investment benchmarks for this dissertation's research, proportionally using the estimated business-analytics investment figure for manufacturers mentioned above, these would indicate that small-market manufacturers may invest up to $65 thousand, and those mid-market up to $3.8 million, to enhance their data-analytics capabilities. The US Census 2012 data also demonstrates that overall capital expenditures among small and medium sized manufacturers range from 2.4% to 3.4%, of which data investment may be a part. Examining current and planned expenditures will be useful toward understanding the impact of data investment on organization performance, furthering the question of data investment being a key value indicator of organization performance. As a matter of context, small and medium manufacturers represent 77.2% of total manufacturing revenue; hence the importance of studying these size segments, given their numerical industry influence, and of understanding how data usage is working to transform those businesses.
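As a rough arithmetic illustration (not the dissertation's estimation method), the implied analytics-spend rate behind those benchmarks can be expressed against each segment's revenue ceiling:

    # Implied analytics-investment rate behind the benchmark figures above;
    # revenue ceilings come from the 2012 US Census size classes quoted in text.
    SEGMENTS = {
        # segment: (revenue ceiling in USD, benchmark data investment in USD)
        "small-market": (10e6, 65e3),   # < $10M revenue, ~$65K benchmark
        "mid-market":   (1e9,  3.8e6),  # < $1B revenue, ~$3.8M benchmark
    }

    for name, (revenue_cap, investment) in SEGMENTS.items():
        rate = investment / revenue_cap
        print(f"{name}: implied analytics spend ~ {rate:.2%} of revenue ceiling")
        # ~0.65% (small) and ~0.38% (mid): both well below the 2.4%-3.4%
        # overall capital-expenditure range, consistent with data investment
        # being only one part of total capital spending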

3.1.5 Tangible value of data

Since this research's level of analysis is the manufacturer, thinking of data as the raw material that produces a product is a start towards understanding it as a tangible asset. A manufacturer of metal fabricated parts knows the raw-material cost of the sheet steel used to form a product; the initial cost of the raw material may be $40 a ton, and when processed it yields parts worth $2,000 in gross revenue. The obvious interpretation: the steel in its raw state has little value, but when formed into purposeful parts its raw-state value greatly increases. Data likewise has little value in its raw state until it is formed into something of use that creates value. Through observations gleaned from case interviews conducted in this research, comments surfaced concerning the value of data assets: in what way can data assets be valued, and what is their economic as well as strategic importance to the organization.

The value of data synthetically finds a place on a firm's balance sheet 25 in the form of intellectual capital, typified as an intangible knowledge asset. Researchers studying knowledge-based perspectives (Teece, 1997; Grant, 1996, 1997; Alavi, et al., 2001), resource-based perspectives (Penrose, 1959; Wernerfelt, 1984; Barney, 1991) in the context of strategic-management competitive advantage (Teece, et al., 1997), and information systems design and strategy (Peppard and Ward, 2004; Henfridsson and Lind, 2014) classify data as a resource, capability, and asset.

Data assets, often referenced as digital assets, can be measured in both financial and non-financial terms or by their intrinsic value, or how the data ordinally contributes to the organization; all data has value in its useful application, however some types of data may have greater influence or importance on the organization than others. The asset value of data relates to its cost of capture and integration into the organization and its cost of operationalizing: costs associated with where and how the data is stored, how the data is maintained, and how the data is analyzed. Information, as an "unvalued asset", has "resisted quantitative measurement" (Moody and Walsh, 1999, p 2); data is the raw material of useful information, with software and hardware essentially the plant and equipment of IT that enables a deliverable outcome. The costs of software, hardware (i.e. servers, cloud-based storage), and the labor required to maintain and analyze data are relatively easy to determine. Data in its raw state is of little value; data once analyzed and made actionable as information becomes valuable, and that value increases with usage (Moody and Walsh, 1999). It is the value created by data usage that quantifies returns on investment. The intrinsic value of data can then be quantified in weighted terms along the lines of its veracity, accessibility, completeness, and competitive uniqueness; its business value as related to the timeliness or shelf-life value of the data; and its performance value measured by its impact on KPIs (Friedman, 2015).

25 This is not to suggest accounting rules be changed to accommodate some form of quantifying the asset value of data and making a place for it on a firm's balance sheet. Data is well-known as a part of the firm's intellectual properties or capital, recognized in the form of an intangible asset and aggregated among the value placed by the firm on its intellectual properties. The purpose of stating the asset value of data and giving it a pseudo balance-sheet importance is to make its form internally visible as a discrete asset, appreciated among the firm's managers. By making data a quantitative substance, it can then be viewed as something tangible that requires maintenance, no different than any other tangible asset.
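To make the weighted-terms idea concrete, the following sketch scores a data-set across those dimensions; the weights, the 0-1 scores, and the composite formula are illustrative assumptions of this paper, not a formula published by Friedman (2015).

    # Hypothetical weighted intrinsic-value index for a data-set (0-1 scale)
    WEIGHTS = {
        "veracity": 0.25,
        "accessibility": 0.15,
        "completeness": 0.20,
        "competitive_uniqueness": 0.20,
        "timeliness": 0.10,   # business / shelf-life value of the data
        "kpi_impact": 0.10,   # performance value measured against KPIs
    }

    def intrinsic_value(scores):
        """Weighted composite across the value dimensions (weights sum to 1)."""
        return sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)

    # Example: a customer-order data-set scored by a hypothetical review team
    order_data = {"veracity": 0.9, "accessibility": 0.7, "completeness": 0.8,
                  "competitive_uniqueness": 0.5, "timeliness": 0.9,
                  "kpi_impact": 0.6}
    print(f"intrinsic value index: {intrinsic_value(order_data):.2f}")  # 0.74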

Data value becomes visible through use and through methods of viewing its application (among others not herein mentioned), such as the 'Navigator' 26 (Edvinsson and Malone, 1997; Edvinsson, 1997, 2002), the 'Scorecard' (Kaplan and Norton, 1996), and the 'Dashboard' (Conference Board, 1997) as referenced by Luthy (1998), the 'Intangible Asset Monitor' (Sveiby, 1997), and the Technology Broker (Brooking and Motta, 1996).

Data as intellectual capital is generally defined as an organization's intangible or invisible assets (Heisig, et al., 2001), those collective knowledge assets that can be legally defended (Edvinsson and Malone, 1997, 2002); composed of the know-how, skills, expertise, creativity, image properties, internal processes, structures, and customer relationships that allow for organization functionality and wealth (or value) creation (Stewart, 1997; Bontis, 2000). Intellectual capital works such that "human capital transforms into structural capital; human capital cannot be owned, only rented, whereas structural capital can be owned and traded by the firm" (Edvinsson, 1997, p 369) 27.

26 The Navigator is a well-cited demonstration of exploring the measurable value of intellectual capital, detailed in the originating thesis 'Visualizing Intellectual Capital in Skandia' (Edvinsson and Malone, 1995; Edvinsson, 1997, 2002). The essence of this model considered balancing the firm's financial aspects with the non-financial, as captured in three dimensions: human, structural, and customer capital; thereby visually raising management awareness of the value found in capabilities contained within intellectual capital as non-financial assets (Edvinsson, 1997; Luthy, 1998).

Housed within structural capital are organizational, process, and innovation capital components, within which databases and data are contained. Through analytics, data converts to structural capital through value-creating insights, information, and innovative processes and products that build organization capabilities. Bontis (2000) highlights financial, customer, process, renewal and development, and human measures 28 derived from data to visualize capability-creating value. Kaplan and Norton's (1992, 1996, 2000) balanced scorecard method allows for setting measures against strategic goals relative to the firm's strategy: financial, internal business, innovation and learning, and customer. Hence, the value of data can be measured by the wealth it creates and the capabilities it creates or enhances, whether that wealth is found in cost-saving initiatives or in the development of new products, new markets, or new business models.

Data's marginal cost of exploitation is low, and its reuse, unlike other assets, technically has no limits of capacity or of combinative and re-combinative application (Higson and Waltho, 2009) to create value. Short and Todd (2017) find companies need to "develop greater expertise in valuing data assets" (p 17); 'data-value' is comprised of three dimensions: asset value, activity value, and expected future value. 29 Like any other physical or financial asset, the asset value of data could be stress-tested, and its measurement of return as a value-creating investment could be assessed through cost-benefit analysis.

27 In this context, Bontis (2000) explains that 'leadership' is fundamental to this transformation, considering human capital to be volatile in that people enter and leave the firm, either bringing knowledge to the firm or taking knowledge from it; the employee is then 'rented' for the knowledge contributed to the organization. Structural capital comprises those elements which form material, tradable value. Sveiby (1990) preceded in defining structural capital, on which Edvinsson built his works, describing 'structural capital' as the know-how possessed by the firm's personnel, the firm's problem-solving ability, and the firm's customer capital, referring to those customers ready to make transactions with the firm.

28 Bontis (2000) highlights the five primary focus dimensions of Skandia: financial, customer, process, renewal and development, and human. For example, under 'process focus', measures such as PCs per employee, processing times, and IT capacity were named; given the time frame in which the article was written, today's measures under this category would be revised to include server capacity, download and upload speeds, visual screens per user, frequency of technology upgrades, and many others which can be quantified; or under 'financial focus', the cost of data, its acquisition, and its maintenance can be measured against the innovations it produces. These focus dimensions are universal to any organization and can be modified accordingly.

Data is typically viewed as an intangible asset, its quantitative valuation made by means of cost-based, market-based, and or income-based methods 30. Findings among the case-study companies illustrated that few had established a means and/or method to value data assets along these lines of thought. The fundamental challenge companies face in valuing data is understanding its complexities and its economic extension as a contributor to organization performance. Sveiby (2010) suggested measuring intangible assets through 'direct intellectual capital methods, market capitalization methods, return on assets methods, and scorecard methods', among many other models identified through the author's literature review 31. Concepts of economic value added (EVA) and deprival value (DV) may also provide a means to quantify data as an asset. Bughin et al. (2017) value data-technology initiatives relative to a return on investment greater than the cost of capital, noting research conducted on the digitization effect on products and services, marketing and distribution, business processes, and supply chains produced varying levels of contribution to economic value. 32 Moody and Walsh (2002) outline the investment-value characteristics of information through "7 laws": 1) information (e.g. Big and Small Data) is sharable and can be shared without loss of value, 2) its value increases with use, 3) it is perishable and depreciates over time, 4) accuracy of information increases its value, 5) when combined with other information, the value increases, 6) more information is not necessarily better, and 7) it is not depletable; data assets are not depleted by use. Laney (2014), through examination of asset characteristics, finds three of importance: 1) cash exchangeability, 2) specific entity ownership, and 3) the ability to provide future benefits. Data as an asset can be monetized, is ownership specific, and has unlimited potential value-creating benefits. The fundamental valuation aspect of data is to view it as a strategic asset of the firm that "allows companies to acquire or maintain a competitive edge" (Gilkman and Glady, 2015); "elevating data to asset status can pave the way for innovation and cultural change" (Laskowski, 2014).

29 Short and Todd (2017) state data-value is initially formulated as a strategic asset that can be monetized, followed by its value in use as defined by its application (i.e. CRM, CAD, MES, ERP systems, etc.) and frequency of use (i.e. application workload, transaction rates, accessibility), and lastly the data's future value as determined by market-based transactions among like assets, cash-flow generation, cost savings, and cost avoidance.

30 In the article 'Valuing Data is Hard' (Mawer, C., 2015: https://svds.com/valuing-data-is-hard/), the author defines cost-based value as the cost associated with creating an asset, market-based value as the comparable value of the data in the marketplace, and income-based value as an estimation of future cash flows. In this regard the income-based approach is preferred for data valuation where possible, and a difficult task at best to accomplish. To understand the difficulty in valuing data, view the data from a value-chain perspective: raw data moves through stages of processing, integration, analysis, actionable insights, and actions that create value derived from the data. The value of data increases as it advances through the chain, and its usefulness may not be assigned to a singular application but may produce multiple insights, pathed along other value chains to create additive value for the organization. The data would then be valued according to the value chain in which it is used, thereby adding to the complexity of determining its many-use, overall economic value as a strategic asset.

31 The Direct intellectual capital (DIC) method is accomplished by deconstructing the intangible asset into quantifiable components that can be aggregated into a monetary value. The Market capitalization (MCM) method views the difference between a firm's market capitalization and stockholders' equity as a measure of intellectual capital (albeit very difficult to ascertain with privately held companies). The Return on assets (ROA) method calculates an estimated value for intangible assets (or intellectual capital) by considering the firm's ROA (average earnings divided by average tangible assets) compared with the relative industry average, with the difference becoming a multiplier against the firm's average tangible assets, resulting in average earnings from intangibles; dividing this amount by the firm's average cost of capital results in an estimate of intangible asset value. The Scorecard (SC) method is similar to the DIC method without assigning a monetary value to the components; rather, indicators are utilized, denoted on visual representations that provide some relatable measure of intrinsic value. The author identified through literature 13 varied forms of DIC methods, 21 SC methods, 5 MCM methods, and 3 ROA methods, none of which directly address the specific valuation of data as an asset. The purpose of illustrating this portfolio of methods is to assess the possibility that any one of these, upon additional study, may provide a platform on which to formulate a data-valuation method.

32 Bughin et al. (2017) study respondents indicated the effect of digitization on supply chains demonstrates the greatest opportunity for firms to impact economic growth (revenue and EBIT growth as measured in the study), followed by products and services, and business processes, albeit these contribute primarily to "profit improvements, but little to top-line growth".
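As a numeric illustration of the return-on-assets approach described in note 31, the following sketch walks through the arithmetic with invented figures; none of the inputs come from the dissertation or its case companies.

    # Simplified ROA-method estimate of intangible-asset value (inputs invented)
    avg_earnings = 12.0e6          # firm's average earnings (USD)
    avg_tangible_assets = 100.0e6  # firm's average tangible assets (USD)
    industry_avg_roa = 0.08        # comparable industry-average ROA
    cost_of_capital = 0.10         # firm's average cost of capital

    firm_roa = avg_earnings / avg_tangible_assets             # 0.12
    excess_roa = firm_roa - industry_avg_roa                  # 0.04
    intangible_earnings = excess_roa * avg_tangible_assets    # $4.0M per year
    intangible_value = intangible_earnings / cost_of_capital  # $40.0M estimate
    print(f"estimated intangible-asset value: ${intangible_value/1e6:.1f}M")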

3.1.6 Construct and definition

Item: Executive Management Influence

Case Definition: Individuals with the authority, control, and persuasiveness over company resources and assets to affect the application and use of a technology.

Combinative Definition: Individuals who set company strategy, policy, resource allocations, and make investments, with authority to make decisions affecting organization outcomes.

Sources: Laughlin, 1999; Wittmer, 2014; O'Leary, 2000; Barton and Court, 2012; McAfee and Brynjolfsson, 2012; Davenport, 2010; Gladwell, 2002; Vales, 2007; Cooper, 1994; Harper and Utley, 2001; McDermott and Stock, 1999; Henke, et al., 2016; Hwang, et al., 2015; Klotz and Edmondson, 2017; Parker, 1994; Holland, et al., 2000; Bughin, et al., 2016

3.2 Data accessibility and use mechanisms

Today’s manufacturers are handling varied data streams flowing from Big and Small

Data generating sources. Bughin, et al., (2016) emphasizes effective data-use as the result of a circular ecosystem that considers; 1) data-generation sources 2) data aggregation, 3) data-analytics), and 4) data value creating ability. Davenport (2013) describes the evolution of data-analytic use in progressive-era terms. Analytics 1.0, the era of business intelligence made possible by advances in computing technologies, uncovered insights collected from a company’s internal transaction, production, and operation processes, its Small Data.


Analytics 2.0 arrived with the advent of Big Data, where digital-native companies such as Google, Amazon, and Uber collect massive troves of externally generated consumer behavioral data for strategic application; in comparison, Small Data is gathered from a company's internal information systems (e.g. ERP, CRM, MES, etc.), operationally relative data. Analytics 3.0 is the movement of non-digital-native companies towards power analytics, blending Big Data and Small Data and revealing innovations through cross-function-discipline collaboration on data initiatives. Innovations shape new ways to conduct business through a "company's management resolving to compete on analytics … improving on internal business decisions … creating more valuable products and services" 33 (Davenport, 2013, p 68).

For analytics to be put into practice, building DA power and transforming that power into innovations becomes the ultimate strategic objective. Attributes of DA practices are: data-use, defined by the frequency of integration among the Big and Small Data variety collected from external and internal sources to yield insights that are descriptive, predictive, and prescriptive (Davenport, 2013; Short and Todd, 2017); data-stock, the data-variety inventory held by the organization (e.g. data-warehouse, data-marts) (Davenport, Barth, and Bean, 2012); and data-accessibility, made possible by data-analytic technologies and data management, where the most timely received data is made part of the decision-making process and quickly acted upon (Davenport, Barth, and Bean, 2012).

33 Davenport (2013) references four companies to illustrate Analytics 3.0: The Bosch Group, applying cross-function collaboration to provide intelligent customer offerings; Schneider Electric, shifting over the years from heavy manufacturing to electronics and intelligent control devices for use in the manufacturing and energy industries; GE, becoming a provider of asset and operations optimization services and active product management through real-time sensor connectivity; and UPS, with self-developed real-time package-tracking systems.


It is the relational understanding of Big (strategic) and Small (operational) Data that creates value in decision making; insights are driven by acumen on the topic and the analysis mechanism that reveals useful relationship insights. Like the oft-referenced statement that good processes make even bad or mediocre employees perform better while bad processes can make good employees appear to perform poorly, the same can be said of data-analytics: good analytics make what appears to be unimportant data insightful, and poor analytics can make important data misunderstood. The mechanisms of analysis and use of data in decision-making represent both opportunities and challenges to managing organizational outcomes and "capitalizing them to their advantage" (Kwon, et al., 2014, p 387).

Capitalizing on the advantages data may offer the organization requires examination through data-use analytic practices: how a manufacturing organization understands the quantitative practices used to reveal statistically meaningful information (Hoerl, et al., 2014) 34, how the organization accesses data, and at what levels within the organization data is made available to create value, through the 'tools to access data', the 'data freshness' (i.e. data in real-time, near-time, or lag-time 35), and the 'devices where data is made available' (e.g. mobile devices).
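A small, hypothetical helper makes those freshness tiers concrete; the thresholds follow this research's definitions in note 35, and the five-second cutoff for 'instant' is an assumption of this sketch.

    # Classify data freshness per this research's definitions (see note 35)
    from datetime import timedelta

    def freshness_class(delay):
        """Map the delay between data capture and availability to a tier."""
        if delay <= timedelta(seconds=5):    # assumed cutoff for 'instant'
            return "real-time"
        if delay <= timedelta(minutes=15):   # acceptable near-real-time range
            return "near-real-time"
        return "lag-time"                    # batched; updates the following day

    print(freshness_class(timedelta(minutes=3)))  # near-real-time
    print(freshness_class(timedelta(hours=12)))   # lag-time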

Moon (2013) posits the 'ability to think strategically is an increasingly important requirement for managers at diverse levels in the organization'. Harper and Utley (2001, p 14) found 'it is important not only that employees have the proper tools to perform their job, but also they be given the autonomy to decide on what extent to utilize those tools'; further, the organization should remain flexible towards the re-tooling of technologies to enable [effective decision-making and] successful outcomes, valuing the free flow of information among organization members, and conscious of attributes that restrain technology implementations.

34 Hoerl, et al. (2014) consider the concept of statistical thinking, as in analytics, the "quantitative methods to discover meaningful information in data", whose fundamentals are made of building blocks: clearly defined problem statements, process understanding, analysis strategy, variation of sources, quality of data, knowledge within the domain, sequential analysis approaches, and modeling processes.

35 This paper's research defines: real-time as instant accessibility to data that updates on a time-present basis; near-real-time as being within an acceptable range of time, such as a 15-minute delay; and lag-time where data-batching is required and data updates occur the following day.

Orlikowski and Iacono (2001) segment information processing (e.g. data technologies) as an artifact, a tool that “alters and enhances the ways humans and organizations process information” (p124), extending that finding to how information can become an ‘overload’ factor when too much information is made available and negatively impacts the ability to process information; this is especially observable in the context of this paper’s research, where the data-variety seen in today’s manufacturing environment is voluminous. Thompson, et al. (1991) found that belief in a technology and its use can boost job performance, with Beaudry and Pinsonneault (2010) presenting the emotional impact of technology use and how frustrations and elations36 with technology use vary with the individual and the perceived usefulness of the technology. The emotions displayed by an individual may be related to poor functionality of existing technologies used in task performance, or to the individual’s ability or cognitive capacity to use the technology.

36 Beaudry and Pinsonneault (2010) provide a framework on which to classify emotions: those related to opportunity (achievement, i.e. happiness, satisfaction, pleasure, relief – challenge, i.e. excitement, hope, anticipation, playfulness), threat (loss, i.e. anger, dissatisfaction, disappointment, frustration – deterrence, i.e. anxiety, fear, worry, distress). Opportunity and Threat are conditioned by perceptions of control, whether a lack of control or control thereof. Thompson et al. (1991) also found emotions play a significant role in the use of technologies.


Tarafdar, et al. (2010-2011) consider the effects of technology use in the forms of information overload, application multitasking, information-system upgrades, consequential upgrade uncertainties, continual relearning, insecurity with system use, and technical problems as causal to employee frustrations, terming this effect ‘technostress’. Stress, as the authors describe, is individual dependent and subject to the person’s role and task responsibilities, the outcome of the stress, and the organizational coping mechanisms to manage and reduce stress37. Eppler and Mengis (2004) provide a detailed cross-function analysis of literature on the mechanisms available to reduce technology-induced stress and improve individual technology adaptation and cross-function collaboration in the use of a technology or data-technologies. The authors segment these mechanisms into five baskets: 1) personal factors, those related to training and skill-development programs in technology use; 2) information characteristics, the manner in which data and information is formatted and delivered to the individual; 3) task and process parameters, focusing on the processes and protocols of task management and the collaborative nature of integrating information; 4) organizational design, considering the social and structural aspects of the organization; and 5) information technology application, dealing with the types, purpose, and performance of technologies among users, individually or within teams.

37 The authors term these factors ‘stressors, strain, and situational variables’. Constructs of end-user satisfaction and end-user performance are shaped by these technostress creators and mitigated through user involvement with system implementations, termed ‘involvement facilitation and innovation support’.

The ease of technology use and its intersection with innovation outcomes is based on degrees of perceived use-difficulty (Moore and Benbasat, 1991; Thompson, et al., 1991; Davis, 1989). Venkatesh, et al. (2003) identified performance expectancy as the belief by a user in a [data] technology to aid in facilitating task performance; a data-technology assists in the interpretation of data “within some context or view-of-the-world” (Mason, 1978, p222)38. Job-frustrations and job-expectations can be mitigated through the quality of the technology (i.e. data-analytic systems), the quality of the information (i.e. data veracity), and the service quality (i.e. data viscosity) distributed through the organization (DeLone and McLean, 2003). Interpreting these dimensions along the lines of this paper’s research: system quality is technology that is usable, available, reliable, adaptable, and responsive; information quality is data that is personalized (i.e. data relevant to the task), complete (all known relevant data that impacts on the task), easy to understand (i.e. data visualized or formatted in ways to ease understanding), and secure (i.e. data sensitivity and protection is critical to data use); and service quality is the level and intensity of support given the user.

38 Mason (1978), in developing a theory of information apropos to this discussion, considers information as existing at two levels: semantic (variety of data-interpreted outputs handled in a time-period) and technical (volume and velocity of data inputs handled in a time-period), where the purpose of information is measured in terms of an ‘information producing unit’ (as in data) being converted into useful information. An information producing unit is comprised of ‘physical activity’ (the technology used to capture data), ‘logical activity’ (the technology used to process data), and ‘conversion activity’ (the technology used to interpret data).

3.2.1 Data-use

Data are different: transactional data found in ERP and like systems records business activities; sensor data records information collected from electronic devices connected to physical environments; geo-spatial data records geographic locations where an activity is occurring; reference data is contained in publicly available textual media [physical and electronic formats of readable, written documents such as magazines, textbooks, literature, newspapers, etc.]; and intimate data records personal behavioral activities of web-site browsing, video and audio posting and streaming, blog writing, social media communications, on-line purchase-payment-banking actions, and any other like place where humans interact with a wearable, handheld, and or computer device that captures, records, and stores data (Chen, et al., 2015). Data are formed in structured and unstructured formats, and combining these to learn of relational insights to help guide decision making is both the value and the challenge for many enterprises; transactional, sensor, geo-spatial, and some forms of intimate data are usually captured in structured formats, whereas many other forms of reference and intimate data are unstructured.
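To make the distinction concrete, the sketch below tags a few example data sources with the type and format categories described above. It is purely illustrative; the source names and the mapping are hypothetical rather than drawn from the case studies.

```python
# Illustrative taxonomy of the data types discussed above; the entries
# are hypothetical examples, not data from the case studies.
DATA_SOURCES = {
    "erp_order_lines":   {"type": "transactional", "format": "structured"},
    "plc_temperature":   {"type": "sensor",        "format": "structured"},
    "truck_gps_pings":   {"type": "geo-spatial",   "format": "structured"},
    "trade_press_text":  {"type": "reference",     "format": "unstructured"},
    "social_media_feed": {"type": "intimate",      "format": "unstructured"},
    "web_purchase_log":  {"type": "intimate",      "format": "structured"},
}

def sources_needing_preparation(sources: dict) -> list:
    """Unstructured sources require extra capture and cleansing effort
    before they can be combined with structured data for analysis."""
    return [name for name, tags in sources.items()
            if tags["format"] == "unstructured"]

print(sources_needing_preparation(DATA_SOURCES))
# ['trade_press_text', 'social_media_feed']
```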

Data-use as a new capability (data and DA are often embedded in IT capabilities, information-system stability, and scalability) is based on discovery and agility (Davenport, et al., 2012); as increases in the volume and velocity of data occur, a new set of skills (either contained in IT capabilities or differentiated as a separate core organizational capability) is required to integrate analytic capabilities into production environments. In other words, data and its use are redefining the role of IT through their effect on changing technologies, skills, and processes.

“As data grows more complex, distilling it and bringing it to life through visualization is becoming critical to help make the results of analysis digestible for decision makers” (Bughin, et al., 2016, p5). “The biggest problem in most organizations today is not the lack of information (i.e. data) but its over-abundance … when the quantity of information exceeds the human capacity to process it” (Moody and Walsh, 1999, p9), decision-making performance decreases, despite the cultural belief that more data and or information is better for the organization (O’Reilly et al., 1991; Driver and Mock, 1975).

This paradox is realized when putting data into play within the organization; we all seem to want more data, but what data, for what purpose, and when is it needed? Experiencing an environment of data richness and variety can be a wonderful information state to operate within; however, like a rich smorgasbord of food, too much data without the purposed intent to consume what is important to satisfy one’s desires can be counterproductive to decision making.

“Humans have limited time and brainpower; as a result, they use heuristics (a practical yet non-optimal method of problem solving; i.e. rules of thumb, intuition-made decisions based upon prior experience of similar circumstances, and or simply an instinctive ‘gut feeling’ that has no fact-based rationale) to help make judgements” (Thaler, 2015, p22). Heuristics, in this sense, provide a framework in which decisions are commonly made; data and data-analytics augment the bias boundaries of heuristic decision making by expanding the limits of human nature’s ‘cognitive ability to solve complex problems’ (Thaler, 2015, p23). Business intelligence, the “techniques, technologies, systems, practices, methodologies, and applications that analyze data” (Chen, et al., 2012, p1166), helps better management’s understanding of the business enterprise (i.e. operating performance, financial results, resource allocations), competitive markets, and the environmental externalities needed for efficient and effective decision making. “When humans make decisions, the process is often muddy, biased, or limited by our ability to process information … data and data-analytics can change all that … adding automated algorithms to make the process instantaneous” (Bughin, et al., 2016, p11) [i.e. relieving the overload of too much information by speeding data via automated generating systems].

Machine learning39 (or artificial intelligence, AI) is touted as being instrumental in aiding decision making; dealing with highly complex classification, prediction, and generation problems, it advises managers through analytics on allocating resources, forecasting, discovering new trends, optimizing prices and plans, managing unstructured data (understanding natural-language generation as found in unstructured intimate data), and sorting potential issues to identify possible solutions (Bughin, et al., 2016). Neural network40 (Schmidhuber, 2014) advancements in machine learning are classified into convolutional (image recognition) and recursive (speech recognition, language processing) forms (Bughin, et al., 2016), termed ‘deep learning systems’ and defined as “computational models composed of multiple processing levels to learn representations of data with multiple levels of abstraction” (LeCun, et al., 2015). Reinforcement learning, another aspect of machine learning, refers to ‘trial and error’ or sequential decision making (Li, 2017), where the algorithm considers and acts on multiple options to arrive at a prescribed, but non-directed, objective or goal with an optimal solution. Ensemble learning, as the operative word suggests, refers to a collection of machine learning methods utilized to provide optimal predictors on a range of hypotheses; noted as classifiers “whose individual decisions are combined in some way to classify new examples” (Dietterich, 2000, p1).

39 Machine learning: the algorithm essentially learns by inducing from the data it collects and assesses to establish real-world representations, altering its assessment upon receiving new data; thereby learning occurs continuously within the algorithm. 40 Neural network: a system of connected processors (neurons) producing sequential ‘real-valued’ activations; input neurons perceive the environment and are activated by sensors, other neurons activate via weighted connections from previously activated neurons, and some neurons influence the environment by triggering actions.
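Since ensemble learning is the most directly implementable of the methods above, here is a minimal, hedged sketch of Dietterich’s combined-classifier idea using scikit-learn’s majority-vote ensemble. The dataset and model choices are illustrative assumptions, not drawn from the research.

```python
# A minimal ensemble-learning sketch: three different classifiers vote on
# each new example, per Dietterich's (2000) description. Assumes
# scikit-learn is installed; the toy data stands in for real plant data.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logit", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(max_depth=5)),
        ("knn", KNeighborsClassifier(n_neighbors=7)),
    ],
    voting="hard",  # each model casts one vote; the majority classifies
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```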

3.2.2 Data accessibility

The collection, analysis, and use of structured and unstructured data41 in decision making represent both opportunities and challenges to managing organizational outcomes and “capitalizing them to their advantage” (Kwon, et al., 2014, p387). Capitalizing on the advantages data may offer the organization requires examination through data-analytic practices; that is, how a manufacturing organization understands the “technologies and techniques” (Kwon, et al., 2014, p387) used to reveal meaningful information (Hoerl, et al., 2014), creating value and impacting on organization performance.

An April 2016 McKinsey report asked respondents about the data and analytics capabilities within their respective organizations; the question framed dimensions of data accessibility, tools and expertise to work with unstructured, real-time data, self-serve analytics capabilities, and Big Data and advanced analytic tools (an ‘other’ category was listed, but not defined). Of these practices, data accessibility and its latency were deemed most important. Data accessibility, and accessing the data in real-time, is “desirable to reduce the latency between a business event and a corresponding action” (Uckelmann, et al., 2011, p8): having the right data at the right time, at the right place, usable with minimal effort, and humanly readable, semantically and syntactically, to allow for pattern recognition and analysis.

41 Structured data sets, as the operative term suggests, are data that are highly organized and can be formed into a relational data-base that is easily searchable and configured to provide insights (i.e. structured query language (SQL) spreadsheets, data views generated by ERP systems). Unstructured data refers to data that is unorganized and requires considerable time to capture; the highly complex nature of unstructured data makes it challenging (difficult to assemble) to integrate into a data-base; examples would be text-related communications (e-mails, chats, texting) that may deal with more than one topic or issue embedded in the communication.

3.2.3 Data management

While data accessibility is desirable in supporting decision-making situations, the quality, or veracity, of the data (trust in data quality) is of even greater importance42 (Baesens, et al., 2016; Moges et al., 2013). Trust is an important characteristic (possibly the most important) of data, and the more reliable the data and the methods of analysis used to yield acceptable insights, the greater the trust in the data. Data quality is defined as fitness for use (Pipino, et al., 2002), which refers to having not just the right type of data available, but valid, quality data used in analytical models as input to the decision-making process. Ensuring quality data requires procedures for, and monitoring of: 1) how data is collected and from what data-generating sources, 2) the data repository and the updating frequency of the data, 3) the retrieval speed of the data, and 4) how data is then processed and prepared for analytical purposes (Baesens, et al., 2016).

42 Moges et al. (2013) conducted a study among financial institutions highlighting data quality as critical to firm success; the study further found that the financial and firm-resource cost of making quality data available is very high and some firms forgo the investment; as a result, the firm may be working with data that is not valid, or not of the quality necessary, which could degrade the value of decisions being made. When firms think of data, it is many times in the context of using it to create a competitive advantage, and or some other value-added characteristic; not ‘quality’ checking data could compromise decisions being made via ‘failures to identify and pursue potential long-term gains or benefits’ (p809).
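The four monitoring points above lend themselves to simple automated checks. The following sketch is one hypothetical way to express them in Python; the rules, thresholds, and field names are invented for illustration, not taken from Baesens, et al. (2016) or the case studies.

```python
from datetime import datetime, timedelta

# One hypothetical check per monitoring point in Baesens, et al. (2016):
# collection source, update frequency, retrieval speed, and preparation.
def check_record(record: dict, trusted_sources: set) -> list:
    problems = []
    # 1) collection: is the record from a known data-generating source?
    if record.get("source") not in trusted_sources:
        problems.append("untrusted source")
    # 2) repository freshness: was the record updated within a day?
    if datetime.now() - record["updated_at"] > timedelta(days=1):
        problems.append("stale record")
    # 3) retrieval speed: did the query exceed a latency budget?
    if record.get("retrieval_ms", 0) > 500:
        problems.append("slow retrieval")
    # 4) preparation: are required analytical fields present and valid?
    if record.get("measure") is None or record["measure"] < 0:
        problems.append("invalid measure")
    return problems

record = {"source": "mes", "updated_at": datetime.now(),
          "retrieval_ms": 120, "measure": 42.0}
print(check_record(record, trusted_sources={"erp", "mes"}))  # []
```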

3.2.4 Data analytics and business analytics

Data-analytics has existed for thousands of years, possibly as long as humans have walked the earth: from early man, through trial and error, calculating the speed at which to run from or to prey; to centuries-old sundials that illustrated the times between sunrise and sunset; to the Sumerians, Greeks, Egyptians, and other ancient civilizations using plumb-line and square with geometry and mathematical analysis to build temples, monuments, and cities; to Druids tracking sun, moon, and planetary movements, analyzed through circular-positioned monoliths, for seasonal planting, harvesting, and religious activities; to the loom (often referred to as the first computer), where pattern design was a function of human input in analyzing how fabric thread types and colors were integrated to form useful materials; and to the abacus, of unknown origin, for the analysis of quantities. Man has been in a constant state of analysis, and these ancient data-technologies illustrate a simple point: each represented data transformed into something people could easily understand and use. Hence the purpose of DA, to make visible the invisible and the uncertain certain, giving insights from collected observations to act upon.

So, what is DA? In its purest definition, DA is a discovery method for problem-solving and decision-making. It is commonly put into action through a seven-step process: first, recognizing a problem and framing its scope of effect on the decision to be made or outcome to be achieved; second, asking what known and unknown information is required to understand the problem; third, determining what sources can provide that information; fourth, collecting the information; fifth, analyzing and understanding the relationships between the information and the problem; sixth, determining how the relationships work together and produce insights on the problem; and seventh, using the interpreted insights to make decisions on solutions to the problem (Hoerl, et al., 2014; Albright and Winston, 2015).
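A toy worked example can make the seven steps concrete. The sketch below walks a hypothetical defect-rate question through the process using only Python’s standard library; all data and the decision threshold are invented for illustration.

```python
import statistics

# Step 1: problem framed -- did defect rate track machine temperature?
# Steps 2-3: information needed is paired temperature/defect data; the
# source here is a hypothetical shift log.
# Step 4: collect the information (invented numbers).
temps   = [61, 63, 70, 74, 78, 80, 85, 88]   # deg C per shift
defects = [ 2,  2,  3,  4,  6,  7,  9, 11]   # defective units per shift

# Step 5: analyze the relationship (Pearson correlation; Python 3.10+).
r = statistics.correlation(temps, defects)

# Step 6: interpret how the variables work together.
insight = "rises with temperature" if r > 0.7 else "weakly related"

# Step 7: use the interpreted insight to decide on a solution.
print(f"correlation r={r:.2f}; defect rate {insight}")
if r > 0.7:
    print("decision: investigate cooling on the affected line")
```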

So, what is BA? Very much the same definition and process as DA. INFORMS states that BA is a scientific process of transforming data into insight for making fact-based and or data-driven decisions (Camm, et al., 2015), extracting useful business patterns and decision models from data-sets (Baesens, 2014), and supporting managers’ decisions beyond intuition or gut-feelings (Davenport, 2013). The terms DA and BA are interchangeable, requiring rigor in application and understanding of use; they take form in descriptive (understanding what has occurred), predictive (understanding what could occur), and prescriptive43 (understanding alternative outcomes) techniques, contextual with the level of risk-reward tolerance on the decision being made. They are shaped by the business relevance of the technique used in analyzing a problem, characterized by statistical significance and power, interpretability, justifiability, operational efficiency, cost of analysis, and regulatory concerns (Baesens, 2014)44. As the sophistication of the technique or mechanism increases, so does its value on the decisions being made.

Combining this with the case-study definition, a data accessibility and use mechanism means those methods, manners, and technologies used to facilitate the integration of data into the organization environment.

43 Descriptive, what has occurred? (e.g. studying past product-sales behavior or past material performance to identify problems or opportunities for product improvement); Predictive, what may happen? (the anticipated effect of variables on each other, e.g. social-media text analysis or survey data used to predict market acceptance of a product); Prescriptive, what options do we have? (indicating courses of action, e.g. scenario-optimization pricing models for airlines and seat-capacities, where pricing increases as seat supply diminishes to reflect demand, or simulation factory-production models that illustrate changes in processes that would speed production and through-put).

44 Interpretability, viewing and understanding patterns in the data; Justifiability, relationship or correspondence with prior knowledge and intuition; Operational efficiency, the process of collecting data, preprocessing, model evaluation, and outputs to the business application; Economic cost, expenses associated with the analysis, cost of technology, data collection, analyst personnel, etc.; Regulatory, compliance with any government policies and data-types that can be used to build models (e.g. banking regulations and stress testing financial soundness).
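To illustrate the three technique families in footnote 43, the following sketch runs a descriptive summary, a naive predictive trend, and a simple prescriptive rule over an invented monthly sales series; the data, capacity figure, and decision rule are assumptions for illustration only.

```python
import statistics

sales = [100, 104, 110, 118, 123, 131]  # hypothetical monthly unit sales

# Descriptive: what has occurred?
mean_sales = statistics.mean(sales)
growth = [b - a for a, b in zip(sales, sales[1:])]

# Predictive: what may happen? (naive linear trend from average growth)
forecast = sales[-1] + statistics.mean(growth)

# Prescriptive: what options do we have? (a toy capacity rule)
CAPACITY = 128  # assumed monthly production capacity
action = ("add a production shift" if forecast > CAPACITY
          else "hold current schedule")

print(f"mean={mean_sales:.1f}, forecast={forecast:.1f}, action: {action}")
```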


Hm2a: Data accessibility and use mechanisms (DAUM) continuously moderate the organizational integration of data.
Hm2b: DAUM continuously moderate the actuation of data.
Hm2c: DAUM continuously moderate organization outcomes.

3.2.5 Construct and definition

Item: Data Accessibility & Use Integration Mechanisms
Case Definition: Those methods, manners, and mechanisms making data available and useful in the organization’s operating environment.
Combinative Definition: The methods and technologies used to facilitate the integration and actuation of data into the organization environment.
Sources: Bughin, et al., 2016; Davenport, 2013; Short and Todd, 2017; Davenport, et al., 2012; Kwon, et al., 2014; Hoerl, et al., 2014; Moon, 2013; Orlikowski and Iacono, 2001; Thompson, et al., 1991; Beaudry and Pinsonneault, 2010; Tarafdar, et al., 2010-2011; Eppler and Mengis, 2004; Moore and Benbasat, 1991; Mason, 1978; DeLone and McLean, 2003; Chen, et al., 2012, 2015; Moody and Walsh, 1999; O’Reilly, et al., 1991; Driver and Mock, 1975; Thaler, 2015; Schmidhuber, 2014; LeCun, et al., 2015; Li, 2017; Dietterich, 2000; Uckelmann, et al., 2011; Baesens, et al., 2016; Moges, et al., 2013; Pipino, et al., 2002; Albright and Winston, 2015; Camm, et al., 2015

3.3 Knowledge intense environments driving data-analytics adoption

Why adopt data-analytics and or a data-centric management style? Literature states the motivation for doing so is driven by realizing performance and process improvements. The case studies conducted in this research also point to simply ‘getting a handle’ on all the information bombarding the company from everywhere [internal and external sources] and how to make sense of it all; in other words, data management and data sensemaking (Weick and Sutcliffe, 2005; Gioia and Chittipeddi, 1991). Technology forecasters indicate the use and potential of a data-driven business will become more prevalent as the data-sources, technologies, and methods to capture and organizationally integrate data continue to accelerate (Kwon, et al., 2014). Computer-controlled equipment is creating a data-based understanding of how automated machines and humans interact to perform tasks, and the fact-based means to make decisions using real-time (or near-real-time) gathered data. As artificial intelligence, machine-learning, and task-performance automation occur, and data generated from these devices leads to improved productivity, the need to employ production-line laborers (and or other repetitive-task positions) will decline, unfortunately at a cost of reduced employment (Frey and Osborne, 201345; Brynjolfsson and McAfee, 2012) that may impact senior management’s view of its social responsibility (i.e. economic well-being) to the firm’s employees as stakeholders in the organization.

Production-line workers are not the only employment category suffering from the greater use of automation and data-fed decisions; McKinsey Global Institute (2013) predicts the use of algorithms to perform analytic tasks will globally displace 140 million knowledge workers in the years ahead (Frey and Osborne, 2013). Performance pressures arise from this employment, social-responsibility perspective: when removing the human element from organization operations through heavy investments in data-technologies, machine-learning technologies, and other forms of automation, the company may not realize satisfactory performance outcomes should it not properly plan for and anticipate those impacts on the organization.

45 The Future of Employment: How Susceptible are Jobs to Computerization? Oxford Martin School, University of Oxford Press.

Contextually, common themes emerge from the case interviews, some anticipated as expressed in literature, some newly revealed; these are characterized as the motivational dimensions of performance pressure, competitive pressure, and technology pressure. A data-centric adoptive driver is defined as the motivational pressures upon which an organization adopts a data-centric management belief.

Aggregating into two overarching dimensions, pressures can be segmented as push and pull drivers. Push drivers are those external forces on the company motivating management to invest in data technologies. Pull drivers are internal motivations, the ambitions of management to realize a return on the technology investments made.

3.3.1 Performance pressure

Performance pressure is simply interpreted to mean those motivations placed by managers on the organization to achieve some desired outcome; characterized as measurable expectations, stipulated by management in support of company strategies and objectives, where the deployment of resources organizationally delivers on expected outcomes (Hong and Stout, 2016; Parmenter, 2015; Gardner, 2012). Relative to DA, the performance pressure on management is to become less reliant on experience and intuition (or gut-feeling, as noted in the case interviews) as sources of supporting information and more reliant on accurate information presented through data and its analytics46 (Kiron et al., 2014). Any pressure is the sum of factors increasing the motivation to perform well (Baumeister, 1984); data and applied information work to support performance in addressing pressures. The performance-pressure themes revealed in the case studies sought returns on data investments, enabling talent, and sustaining operations; concepts encompassing management evaluation through strategy-linked performance indicators by which the company maintains strategic focus on certain outcomes, allocation of resources based on performance expectations aligned with strategy, monitoring of operation processes to understand redundancy and value-added activities, reduction of employee job-frustrations, managing talent, and organizationally enhancing critical-thinking skills, solving problems, and making informed decisions.

Brynjolfsson and McAfee (2012) discuss the creative destruction of technology on jobs and the pressure managers face in balancing the integration of technologies with promises of improved productivity; technological adaptation requires ‘parallel innovation’ of organization processes, structures, skills, and talents, and the need to invest in human capital to meet intensified, fast-paced market environments, motivated by internal and external pressures to achieve certain outcomes (Gardner, 2012), is important to managing technology integration. The case-research definition of performance pressure is the internal motivations and ambitions of senior management on productivity to invest in data-technologies; recombined with literature, it is the productivity motivations placed on senior managers to organize, allocate, and deploy data asset resources.

46 Kiron et al. (2014) conducted the fourth in a series of surveys among 2,037 business executives, managers, and analysts working in 25 varied industries and located in 100 countries. The term analytics refers to data-use and related insights formed through applied analytic disciplines that ‘drive fact-based planning, decision making, execution, management, measurement and learning’.


H3a: Performance pressure (PP) will have a direct and positive effect on EMI.
H3b: PP will have a direct and positive effect on DAUM.
H3c: PP when moderated by EMI and DAUM will have a positive effect on data integration practices (DIP): strategic-level data (SLDIP), operation-level (OLDIP), data-security level (DSLIP), and Industrial Internet of Things level (IIOTLIP).
H3d: PP when not moderated by EMI and or DAUM will have a lesser effect on DIP.

3.3.2 Competitive pressure

Competitive pressure is asymmetrical; how one firm responds to another differs, some firms feeling greater motivation to maintain competitiveness, others finding it less necessary; yet understanding the pressure systems facing a firm, regardless of believed effect, is important (D’Aveni, 2002). Under competitive pressure, management leadership and the business model of the firm are constantly pressed to adapt to changing market conditions (Amit and Zott, 2012). Accordingly, and frequently, competitive pressures are traditionally measured by buyer power, supplier power, barriers to entry, threat of substitution, and intra-industry rivalry, environmental elements known to dynamically affect a firm’s strategies (Porter, 1991; Porter and Millar, 1985; Porter and Heppelmann, 2015). Data and DA are assets and capabilities that, when understood and applied appropriately, work to support strategies and sustain competitive advantage, vehicles that allow for embracing competitive pressures. Competitive pressures are vaguely defined through literature; yes, many an article has been written on competitive advantages, measuring competitive pressures, and offering insights on understanding the topic, but there is no clear agreement on a definition. Competitive pressure can be summarized as changes in business environments through actions taken by one actor and felt by another actor to be disruptive to their status quo, requiring a reaction or response to the action that sustains the status quo. Status quo not in terms of complacency, but in the way a business operates, its responsiveness to changing environments, its modification of the status quo. Wrapped into competitive pressures, and to extents relative to performance pressures, are those institutional pressures (regulatory, policy driven, environmental) placed on the adoption of externally motivated initiatives, affecting overall business performance and woven into the supply chain (Gimenez, et al., 2012). The case-study definition of competitive pressure is the external motivations and ambitions of senior management to stay technologically ahead of competitors; recombined, it is the external motivations placed on senior managers to invest in data-technologies to meet changing market environments.

H4a: Competitive pressure (CP) will have a direct and positive effect on EMI.
H4b: CP will have a direct and positive effect on DAUM.
H4c: CP when moderated by EMI and DAUM will have a positive effect on data integration practices (DIP): strategic-level data (SLDIP), operation-level (OLDIP), data-security level (DSLIP), and Industrial Internet of Things level (IIOTLIP).
H4d: CP when not moderated by EMI and or DAUM will have a lesser effect on DIP.

3.3.3 Innovation pressure

Innovation pressure as a term is not identified in literature; when innovation is discussed, it is within the realm of competitive pressure, generally understood to mean that competition spurs innovation, subject to the intensity of competition in the arena where the company operates. In commonly understood theory, lesser competition yields lesser motivation to innovate; greater competition increases innovation (Barbos, 2015). Revealed in the case-study research is this definition of innovation pressure: the internal motivations and ambitions of senior managers to build unique and differentiated capabilities. The internalized context is framed by the thought that not all motivations for innovation are initially grounded in competitive pressure, albeit their outcome may have an indirect effect on competitiveness; the pressure to innovate is a dimension viewed as an internal desire to do something better, just for the sake of innovation itself.

Stampfl (2014, p12) identified four types of innovation criteria applicable to defining innovation pressure: domain, degree of novelty, degree of change, and trigger47. Of particular interest in discussing innovation pressure is ‘trigger’. Accordingly, the author defines ‘trigger’ in terms of ‘pull innovation (market-demand motivated)’ and ‘push innovation (new-technology motivated)’, followed by ‘degree of novelty’, whether the innovation is, as the author states, “new to the world” or “new to the organization (firm-only)”. Literature is replete with innovation terminology, and enumerating it here would be of little benefit to the term innovation pressure, sans to say that innovation pressure could be examined at finer granularity in further research. Returning to the case-study definition and its internalized motivation, the push aspect of adopting data-technologies could be solely from a ‘new technology’ perspective, one that simply aids the organization in performing better. The innovation pressures described in the case studies primarily focused on the impact of innovation on increasing or enhancing organization capabilities.

The pressures to adopt data-technologies, improving organizational decision-making, maintaining and sustaining operations, and motivations to stay on top of new technologies, are all centered in pushed and subjective characteristics of the organization. Innovation pressure is defined in combinative form as the internalized push and pull motivations on senior managers to objectively or subjectively build unique and differentiated organization capabilities.

47 Domain references the type of innovation: product, process, market, social; degree of novelty as objective innovation being new to the world and subjective being new to the organization; degree of change references radical or incremental change; trigger as motivations that initiate the innovation, pull and push (Stampfl, 2014, p12; Vahls and Burmester, 2002; Kupsch et al., 1991).

H5a: Innovation pressure (IP) will have a direct and positive effect on EMI.
H5b: IP will have a direct and positive effect on DAUM.
H5c: IP when moderated by EMI and DAUM will have a positive effect on data integration practices (DIP): strategic-level data (SLDIP), operation-level (OLDIP), data-security level (DSLIP), and Industrial Internet of Things level (IIOTLIP).
H5d: IP when not moderated by EMI and or DAUM will have a lesser effect on DIP.

3.3.4 Cyber-security pressure

Embedded within the case studies are references to data-security, or cyber-security, and concerns over the unwanted departure of data from the company. Securing data assets, as a pressure or motivation among managers, is significantly relevant to this research, especially in light of the effect of IIOT (discussed in section 3.4.3) on manufacturers deploying advanced technologies and networked data-technologies. The opportunity for the hacking of servers, lost mobile devices, and interruption of business activities continues to grow and can only be forestalled through management’s desire to protect the company’s data assets. Mangelsdorf (2017), in an interview with Stuart E. Madnick48, reports an “estimated 50% to 80% of all cyber-attacks are aided or abetted by insiders, usually unintentionally” (p23); this occurs from employees unknowingly opening phishing49 e-mails and clicking on links that allow a hacker access to company data.

48 Stuart E. Madnick, Director of the MIT Interdisciplinary Consortium for Improving Critical Infrastructure Cybersecurity.

Madnick furthers the finding by stating “a lot of vulnerabilities come from corporate culture” (p23), not limited to the company itself but extending to the value-chain in which it operates; a culture of data-asset security must become end-to-end (e2e) with customers, suppliers, and related stakeholders. The pressure to prevent a cyber-attack, the pressure to discover vulnerabilities, and the pressure to have recovery plans in place in the event of an attack are fundamental to developing a data-security strategy and making it a part of company culture, with “cybersecurity, a pillar of enterprise risk management” (Barbier, et al., 2016, p2).

Rothrock, et al. (2018) report on the findings of a survey50 conducted among 200 CEOs: 87% of respondents believed better ways to measure the effectiveness of cyber-security investments are needed, and 72% seek meaningful metrics by which to measure investments, an additional pressure managers encounter in assessing the value received on data-asset protection. Gartner, Inc. (2017) projected cyber-security spending would represent $96 billion in 2018; yet while the survey revealed most CEOs “believe cyber-security to be a strategic function” beginning with executive involvement, 89% of respondents treat it “as a function of IT”, subject to IT budgets and not differentiated as a separate strategic focus. A 2016 Cisco survey reports cybersecurity responsibility as being held by CEO/Board Members (39%), Chief Risk Officer (35%), the CIO/CISO (30%), Function Heads (20%), CFO (19%), and Individual Employees (4%). Somewhat consistent with the case studies, the CEO/President/Owner felt the most responsibility for protecting data assets.

49 Madnick states, in general observation, only 1% to 3% of phishing e-mails are opened; the greater challenge is in the opening of what are termed ‘spear-phishing’ e-mails, which appear to originate from company executives asking the employee to take a job-related action; the open rate in these situations is 70%. Hackers can emulate e-mails that have the look and feel of a company communication. 50 Survey conducted by RedSeal Inc., a cyber-analytics company, Sunnyvale, CA.

On the effect of cybersecurity on innovation, the same Cisco survey found 71% of respondents believed “concerns over cybersecurity are impeding innovation” and 39% had “halted mission-critical initiatives due to cybersecurity issues” (p4). That being said, 69% of respondents believed digitization [innovation] is important to growth strategies, with 64% seeing cybersecurity as significant to digitization; 28% of responding manufacturers believed cybersecurity enables growth, and 72% saw it as a function of risk reduction. The report also finds, among manufacturers, that cybersecurity initiatives create technology-adoption lags as companies redesign operating environments to accommodate IIOT, analytics, and related applications51; delaying these activities “lessen[s] the company’s ability to innovate and grow”; “I think it’s really important that we stop thinking about security as a defense-centric approach … we need to think about it as an enabler that supports innovation” (Barbier, et al., 2016, p ii; Dahn, 2016)52.

51 The survey found the adoption of digital use cases to be: predictive maintenance, analytics, quality and defect-control automation, energy management, connected-products maintenance, assembly-line changeover, remote maintenance, visual factory. 52 Mike Dahn, head of data security and industry relations at Square Inc., one of the case interviews conducted by Cisco in the report ‘Cybersecurity as a growth advantage’, Barbier, et al. (2016).

On the effect on competitive advantage: for most companies moving into a digital transformation (data-centricity), data-security is many times an afterthought in the implementation of new technologies. Investments in data-analytic technologies, advanced manufacturing systems, and IIOT technologies require protection, and competitors may or may not weave data-security initiatives into newly adopted technologies. “As companies continue their digital transformations, they need to adopt flexible and ubiquitous defense measures … unanticipated costs, operational shutdowns, reputational damage, legal consequences” (Wellers, 2017, pp2-3) become the result of not taking appropriate measures on data-security. Competitors not taking the extra steps to incorporate data-asset protection, as an ongoing process, leave themselves open to competitive disadvantage. The case studies defined cyber-security pressure as the external motivations and ambitions of senior management to protect the data-assets of the company. The recombinative definition is the external and internal threats of data-asset loss placed on managers to protect data assets and company sustainability.

H6a: Cyber-security pressure (CYBP) will have a direct and positive effect on EMI.
H6b: CYBP will have a direct and positive effect on DAUM.
H6c: CYBP when moderated by EMI and DAUM will have a positive effect on data integration practices (DIP): strategic-level data (SLDIP), operation-level (OLDIP), data-security level (DSLIP), and Industrial Internet of Things level (IIOTLIP).
H6d: CYBP when not moderated by EMI and or DAUM will have a lesser effect on DIP.


3.3.5 Constructs and definitions

Table 19: Data-centric pressures constructs, definitions

Item: Performance Pressure
Case Study Definition: The internal motivations and ambitions on senior management to invest in data-technologies.
Combinative Definition: The internalized productivity motivations placed on senior managers to organize, allocate, and deploy data asset resources.
Sources: Hong and Stout, 2017; Parmenter, 2015; Gardner, 2012; Kiron, et al., 2014; Baumeister, 1984; Brynjolfsson and McAfee, 2012; Davenport, 1993

Item: Cyber Pressure
Case Study Definition: The external motivations and ambitions of senior management to protect the data-assets of the company.
Combinative Definition: The external and internal threats of data-asset loss placed on managers to protect data-assets and company sustainability.
Sources: Mangelsdorf, 2017; Barbier, et al., 2016; Rothrock, et al., 2018; Gartner, Inc., 2017; Dahn in Barbier, et al., 2016; Wellers and Somaini, 2017

Item: Innovation Pressure
Case Study Definition: The internal motivations and ambitions on senior managers to build unique and differentiated capabilities.
Combinative Definition: The internalized push and pull motivations on senior managers to objectively or subjectively build unique and differentiated capabilities.
Sources: Barbos, 2015; Stampfl, 2014; Davenport, 1993; Milliou and Petrakis, 2011

Item: Competitive Pressure
Case Study Definition: The external motivations and ambitions of senior management to stay technologically ahead of competitors.
Combinative Definition: The external motivations placed on senior managers on investing in technologies to meet changing market environments.
Sources: D’Aveni, 2002; Amit and Zott, 2012; Porter, 1980, 1987; Davenport, 1993; Gimenez, et al., 2012

3.4 Data-analytics integration practices

“Big Data allows us to leverage both prediction and causal analysis” (Baesens, et al., 2016, p810) that affect decisions; “bad decisions impact a firm’s bottom-line and allocation of critical resources”, and “drawing out decision-ready inferences from data-analytics (e.g. analytic models53)” influences and enhances the firm’s decision-making ability (Baesens, et al., 2016, p814). Trusting the reliability of data, and the analytics of the data, to communicate its value means translating it into a quantifiable return on investment54 relative to the investment made in data analysis (or business intelligence) and the decision-making insights being provided; in other words, how is the data to be used, and for what outcome? “Data has become a critical asset, and business leaders want to know what the information they hold is worth … new [business] leaders invest heavily in digital platforms, data, and analytic talent” (Bughin, et al., 2016, p9) to extract the value found in data while understanding the effects of its integration.

53 Analytic models reference those mathematical models originating from statistics, econometrics, machine learning, and or artificial intelligence disciplines. 54 Return on investment can be interpreted and or calculated in many ways: the costs associated with collecting and interpreting data (human capital and technology capital), opportunity costs associated with alternative choices, discounted net-present-value analysis of how the data-analytics is affecting cash-flows by the decisions it invites.

General literature review outlines several dimensions of putting data into organizational play: first, the organization-effect, the management of data to provide for its access and use; second, the technology-effect, those artifact information systems and resources that enable data-analysis to be conducted; third, the human-effect, the tacit and explicit knowledge of analytics and its applied use. Regarding the organization-effect, access and use is a function of cross-discipline integration of data-initiatives, or managing that capability across business and operation functions (Davenport, 2012, 2013; McAfee and Brynjolfsson, 2012; Barton and Court, 2012). The technology-effect is the degree and frequency of adoption, financial investments made, and resources allocated by management to data-technologies (Kiron et al., 2014; Arifin and Frmanzah, 2015). The human-effect, the knowledge-intense environment in which a company operates, is under constant change, and as change occurs the requirements on human capital also change (Davenport, 2014; McAfee and Brynjolfsson, 2012). Data integration practices balance these effects; defining and categorizing the data to capture is critical to integration.

Data is generally discussed as Big Data and Small Data; Davenport (2014) seeks to clarify the generally applied term ‘Big Data’ by stating that the type of data and its use defines its ‘Big or Small’ characteristics: “few organizations confess to working with ‘small data’ even though it’s a perfectly respectable activity … for the term [Big Data] to be truly useful its opposite needs also to be valid” (Davenport, 2014, p9). Hence this paper’s classification of strategic and operational data: both contain data internally and externally captured, with strategic data having enterprise effects and operation data having, simply, operating effects.

3.4.1 Strategic-level practice

What is strategic-level data? Case studies and literature research suggest a definition meaning data-variety in support of enterprise goals (Bogetoft, 2012; Baan, 2009; Demartini, 2014); data-variety types originating from market, consumer, industry, government, and other relevant data captured from external as well as internal sources of information. Convention categorizes strategic-level data in the realm of ‘big-data’, with defined characteristics of volume, velocity, and variety (Chen, et al., 2012), veracity and value (Baesens, et al., 2016; Baesens, 2014)55, and viscosity, variability, and volatility (Desouza and Smith, 2014)56; where the volume and variety of data are too large for ordinary data-processing technologies and require statistical tools to make the data useful (Hoerl, et al., 2014; Elgendy and Elragal, 2014). Big data describes the continued growth in data variety and its productive use, challenging companies to “find economical ways of integrating heterogeneous data sets” (Desouza and Smith, 2014, p40); data is viewed in two forms, structured and unstructured, which plays into how the data is captured, analyzed, and organizationally integrated57.

55 Chen, et al., 2012: volume describes the physical size of the data being collected (e.g. bytes of data, gigabits, terabits, etc.); velocity describes the speed of accessing data, in real or near-real time; variety represents the number and type of sources generating data. Baesens et al., 2016 add to the accepted 3-Vs characteristics of Big-data: veracity, meaning the quality, accuracy, trustworthiness, and integrity of the data; and value, how the data is being analyzed, the construction, performance, contextual appropriateness, and evaluation of an analytical model (p814). 56 Desouza and Smith speak to viscosity, which measures the resistance on data flow; variability, the predictability in rates of data flow; and volatility, the shelf-life of data. 57 Structured data are data that are highly organized and can be formed into a relational data-base that is easily searchable and configured to provide insights (i.e. structured query language (SQL) spreadsheets, data views generated by ERP systems). Unstructured data are data that are unorganized and require considerable time to organize and cleanse; the highly complex nature of unstructured data makes it challenging (difficult to assemble) to integrate into a data-base; examples would be text-related communications (e-mails, chats, texting) that may deal with more than one topic or issue embedded in the communication.

As observed in the case-interviews, the organizations studied routinely analyze operational data (e.g. production, transactional, shipping, inventory, finance, etc.), with a few accessing external data characterized as big-data, such as consumer-market data, supply-chain data (albeit to a constrained extent), and industry data. “Most organizations collect data to meet operational needs … buried in the organization’s administrative systems”, and integrating data variety from external sources is difficult given data characteristics58 and the analytic tools available for integration (Desouza and Smith, 2014, p41). Per Ross, et al. (2013, p92), companies “need to first learn how to use the data already embedded in their core operating systems … otherwise they will not be in a position to benefit from big data”. Important among the data types embedded in Big data (and small data) are leading and lagging indicators; reviewing lagging indicators (long-term view, outcome context) without leading indicators (short-term view, performance context) will not speak to how outcomes will be realized or indicate progression towards those outcomes, and vice-versa (Kaplan and Norton, 1992). Baesens, et al. (2016) state that big-data flows from five heterogeneous sources, and combining this data-variety through business analytics59 is transformational to business, enabling “better data-driven decision-making in an organizational context” (p808). The data-generating systems are large-scale enterprise systems (i.e. ERP, CRM, SCM, MES), social media (contextual, unstructured media derived from sources such as Facebook, Twitter, or internal chat-rooms and internal electronic communications), mobile devices (smart devices, iPads, smart phones, wearables), internet-of-things (sensor-enabled systems and devices), and publicly available data (i.e. open-data such as public US Census data, or purchased data, business data, trade data, environment, weather, etc.); systems that generate big and small data.

58 Data integration requires data quality and good meta-data, data that describes data (Desouza and Smith, 2014).

Big data supports new types of decisions that could alter the way a company operates based on how data is viewed from enterprise perspectives and integrated cross-organizationally (Davenport, 2014); integrating both big and small data reveals strategic insights. Grover, et al. (2018) consider data in terms of strategic value, its functional and symbolic application: functional as in data-analytics-tied performance improvement, and symbolic as in those attributes representing a company’s perceived image. A recombinative definition is enterprise-level data organizationally integrated, affecting strategies and operations.

59 Baesens, et al. (2016) suggest the term ‘business analytics (BA)’ is becoming a staple function of data-driven decision making for many businesses. The term transcends the oft-used term ‘business intelligence (BI)’, where the use of advanced reporting and visualization tools is fundamental to delivering useful insights on data; analytics seeks to drill deeper, going below the surface of BI by linking data to ‘explanatory variables’ and their causal inference to predictive decision-making.

H7a: SLDIP will have a direct and positive effect on EMI.
H7b: SLDIP will have a direct and positive effect on DAUM.
H7c: SLDIP when moderated by EMI and DAUM will have a positive effect on data actuation processes (DAP): productivity-actuated processes (PAP), planning-actuated processes (PLAP), data-governance-actuated processes (DGAP), and innovation-actuated processes (IAP).
H7d: SLDIP when not moderated by EMI and or DAUM will have a lesser effect on DAP.

3.4.2 Operation-level practice

The term ‘Small Data’ granularizes from Big Data, “consisting of usable chunks” (Banafa, 2016), manageable versus the enormity, or ‘managing complexity’, of Big Data. Small Data is structured, easily accessible, understandable, and actionable; derived from traditional system data sources such as enterprise resource planning, customer relationship management, product life-cycle applications, manufacturing execution systems, material resource planning, point-of-sale applications, financial/accounting packages, etc. Friedman (2015) characterizes Big Data from an IIoT perspective on the organization as falling into four information-baskets: status data (data generated by devices capable of gathering and transmitting information related to the performance of certain processes), location data (data transmitted by devices that identifies the locale where a performed action is taking place), automation data (data gathered and relayed by automated equipment such as robots, or sensors where non-human actions are occurring), and actionable data (the combination of the three aforementioned data types which, when configured, provides management with performance insights to make informed decisions). These characterizations fit well with this research’s definition of small data, even though umbrellaed under Friedman’s definition of Big Data.
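Friedman’s four baskets can be illustrated with a small sketch that folds status, location, and automation messages into an ‘actionable’ summary. The message fields and the decision rule are hypothetical, invented for illustration rather than drawn from any case company.

```python
# Hypothetical IIoT messages sorted into Friedman's (2015) first three
# baskets; 'actionable' data is their configured combination.
messages = [
    {"basket": "status",     "machine": "press-1", "cycle_ms": 950},
    {"basket": "location",   "machine": "press-1", "cell": "A3"},
    {"basket": "automation", "machine": "press-1", "robot_faults": 2},
]

def to_actionable(msgs: list) -> dict:
    """Combine the three data baskets into one actionable view."""
    view = {}
    for m in msgs:
        view.update({k: v for k, v in m.items() if k != "basket"})
    # Illustrative decision rule: flag slow cycles with robot faults.
    view["needs_attention"] = (view.get("cycle_ms", 0) > 900
                               and view.get("robot_faults", 0) > 0)
    return view

print(to_actionable(messages))
# {'machine': 'press-1', 'cycle_ms': 950, 'cell': 'A3',
#  'robot_faults': 2, 'needs_attention': True}
```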

Small Data, classified as operation-level data, also captures ‘human interactions and or behavior’ (Baesens, et al., 2016) leading to insights for analysis (Bughin, et al., 2016). Small Data does what Big Data cannot accomplish, that is, combine data that may seem unrelated, exploring data across several data sources (Lindstrom, 2016; Tetlock, 2015) to uncover trends shielded by Big Data.

Data variety can be analyzed to reveal Small Data facts, given the availability of tools and techniques to see the stories in the data; Big Data is just data, cold, hard, and emotionless, while Small Data allows for a qualitative, emotional understanding of data relationships (Heath, C., 2016). This is especially important when considering business processes and innovations identified through DA that reduce process complexity and improve value-added processes.

Small Data, for this research, is operationally understood through data integrated by business and production processes, those formalized steps designed to produce, sell, and service a product, the manner in which this is accomplished, and where data from those activities is generated and collected for analysis. Business-process data is composed of two dimensions, customer-facing tasks and operation-facing tasks (Schoenherr and Swink, 2012), occurring at the non-manufacturing functional level.

Customer-facing tasks are those involving proximity contact with the customer, such as transaction-order-fulfilment management, credit management, delivery-logistics management, and after-sale-service management tasks. Operation-facing tasks are those related to the functions of financial management (product-costing, warranty cost, cash-cycle, inventory control, and asset-resource utilization), human-resource management, and information-system management. Business-process data is the digitized information generated from non-manufacturing-function-level activities by systems and system-devices for collection and analysis.

Production-process data is composed of machine-facing and material-facing tasks occurring at the manufacturing functional level. Machine-facing tasks are related to machine-capacity management, machine-maintenance management, quality management, and energy management. Material-facing tasks are related to raw-material-supply management, waste-material management, and finished-stock management. Production-process data is the digitized information generated from manufacturing-function-level activities by systems and system-devices for collection and analysis.

H8a: OLDIP will have a direct and positive effect on EMI.
H8b: OLDIP will have a direct and positive effect on DAUM.
H8c: OLDIP when moderated by EMI and DAUM will have a positive effect on data actuation processes (DAP): productivity-actuated processes (PAP), planning-actuated processes (PLAP), data-governance-actuated processes (DGAP), and innovation-actuated processes (IAP).
H8d: OLDIP when not moderated by EMI and or DAUM will have a lesser effect on DAP.


3.4.3 Data-security level practice

According to a 2017 United Kingdom Government cyber-security office survey60, “breaches are linked to human factors” (p40), yet few businesses provide data-security training (20% do so) or have established policies on data-security (33% do so). Simmonds (2018) reported a Kaspersky-sponsored poll (2017) stating 59% of respondents believed data breaches are caused by “careless or uninformed employee actions”, exacerbated by the increased use of mobile technologies and remote-working situations (controls in company environments are better positioned for data-security containment than those off-site). Misplaced mobile devices and lost USB sticks are the most common breach mechanisms in data-security lapses. Accordingly, the author finds the role of data-security champion to be rare, and at times vaguely defined, especially given the author’s reporting that, across a broad number of companies surveyed in diverse industries, the number of IT personnel per 100 employees is one.

With limited IT resources and a lack of data-security champions, lapses in data-security are bound to occur. Controlling data-flow to places not authorized by the company is one means to limit data-breach events. Other measures, such as multi-factor authentication on data access, restricting unauthorized movement of data to cloud services, and frequently upgrading anti-virus and anti-malware software, better protect data from unwanted departure (as also noted in the case studies, recalling one particular company that instituted an automated method for remotely ‘wiping clean’ the drives of company-owned mobile devices should management believe a need to do so).
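As a purely hypothetical sketch, the controls just listed can be expressed as a simple device-compliance check; every field name, the 30-day anti-malware threshold, and the sample device record below are invented for illustration rather than taken from the case studies.

    from datetime import date, timedelta

    MAX_AV_AGE_DAYS = 30  # assumed policy: anti-malware definitions refreshed within 30 days

    def compliance_gaps(device):
        """Return the list of data-security controls a device fails to meet."""
        gaps = []
        if not device.get("mfa_enabled"):
            gaps.append("multi-factor authentication disabled")
        if not device.get("remote_wipe_enabled"):
            gaps.append("remote wipe not provisioned")
        updated = device.get("av_definitions_updated")
        if updated is None or (date.today() - updated) > timedelta(days=MAX_AV_AGE_DAYS):
            gaps.append("anti-virus/anti-malware out of date")
        return gaps

    laptop = {"mfa_enabled": True, "remote_wipe_enabled": False,
              "av_definitions_updated": date(2018, 6, 1)}
    print(compliance_gaps(laptop))  # lists the lapses a data-security champion would chase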

60 Klahr, et al. (2017), United Kingdom Department for Culture, Media & Sport, University of Portsmouth; conducted by the Ipsos MORI Social Research Institute.


The importance of communicating the need for data-security cannot be overstated; educating employees on cyber threats and on ways they can take responsibility for data protection is critical to establishing cyber-security protocols.

While not manufacturing focused, Accenture conducted a study among electric utility companies and found ‘interruption to supply’ (in manufacturing terms, a stop in the ability to produce products), compromised employee or customer safety, theft of sensitive customer or employee data, and theft of company data to be the top four response items of the survey. Notably, the departure of data from the company is a critical and ongoing concern for any company. Tacking back to the discussion on IIOT-level integration, the tie to data-security becomes more pronounced. Without moving into specific narratives regarding IIOT security technologies, the mesh protocols on device connectivity and the networks in which devices communicate should be built around the IIOT environment, allowing for rich interconnection among devices while securing the wireless and uninterrupted distribution of data (Beecher, 2018).

Anderson, et al. (2017) present an information security control theory (ISCT), positing the tensions management experiences between the “need to share information” and the “need to protect information.” Conducted in the health-care industry, the authors’ research concludes the importance of establishing an information security policy bridging management’s understanding of “exposure control reasoning” and “ethical control reasoning”61; managing the need for information with the control of information. ISCT

61 Exposure control reasoning is the management of security risk exposures “inherent on information assets” (end-user devices, servers, networks, any information-system hardware that contains data) from internal or external threat sources. Ethical control reasoning is ‘right or wrong action’ determination on privacy controls, augmenting the hard-actioned controls of exposure reasoning. Dimensioned as ‘utilitarian reasoning’: controls for the greater good and risk management on the degree of harm to stakeholders due to data-breach events. ‘Deontological reasoning’ relies on individual interpretation of “moral duty, adherence to rules, compliance with regulations and laws”. Anderson, C., Baskerville, R.L., Kaul, M. (2017) Journal of Management Information Systems, 34 (2), pp 1082-1112.


ties to the findings in the case studies concerning the distribution of job-related data in context with reducing job-performance frustrations while maintaining controls on where the data is located (mobile devices) and the level of access to that data. The case study definition of data-security level integration is technology-integrated data-mechanisms to protect data assets from unwanted departure. The re-combinative definition is the integration of data-asset protection mechanisms into data-generating technologies to prevent the unwanted departure of data-assets.

H9a: DSLIP will have a direct and positive effect on EMI.
H9b: DSLIP will have a direct and positive effect on DAUM.
H9c: DSLIP, when moderated by EMI and DAUM, will have a positive effect on the data actuation processes: Productivity actuated processes (PAP), Planning actuated processes (PLAP), Data governance actuated processes (DGAP), and Innovation actuated processes (IAP).
H9d: DSLIP, when not moderated by EMI and/or DAUM, will have a lesser effect on DAP.

3.4.4 Industrial Internet of Things (IIOT) level practice

Culled from operation-level data discussions in the case studies, the Industrial Internet of Things (IIOT) is further examined through its digitization effect on the organization and the connective meaning of IIOT. A “digital transformation begins with the executive mandate” (Dunbrack, et al., 2016) where a “strong sense of urgency” exists among executives to adopt IoT strategies. The Industrial Internet of Everything comprises interconnected devices that generate data, connected to create value (Kreidler and Wascow, 2014) and communicating with each other to aid management by providing real-time information for insightful decision-making (i.e., through business intelligence and data-analytics), making it possible to transform business processes (Lee and Lee, 2015). It is a technological phenomenon seeded in “ubiquitous communication connectivity” (Li, et al., 2012), “a new technology paradigm envisioned as a [global] network of machines and devices capable of interacting with each other,” where the “true value of the IoT” is realized through improved equipment-use monitoring and control, helping to manage buyer-supplier inventory and customer relationships and providing data for effective business intelligence (Lee and Lee, 2015). The “virtual world of information technology integrates seamlessly with the real world of things (i.e. computers, networked devices)” (Uckelmann, et al., 2011), and these things (objects) exchange data over a network with or without human intervention (Manzoor, 2016).

A 2015 survey conducted by International Data Corporation (IDC) found that 33% of executives believed their industry would be competitively disrupted by 2018; that millennials, 16% of the population (experiencing connectivity as a lifestyle translated into the business world), will accelerate the adoption of IoT; that 58% believe IoT is a strategic necessity; and that 24% view adopting IoT technologies as transformative. The same report projected that by 2017 “sixty percent of global manufacturers will use analytics to sense and analyze data from connected products, manufacturing while optimizing discrete portfolios of products” and that the same percentage of companies by 2018 “will integrate information technologies and operation technologies to fully realize the value of their IoT investments”.


Supply & Demand Chain Executive (June 2016) reported a survey among manufacturers in which 24% of respondents reported having no plans to implement IoT technologies, 24% were implementing IoT technologies over an unspecified period of time, 33% were currently using IoT technologies, and the remainder were not sure. In crafting an IIoT strategy, a phased-in approach can be taken versus all-at-once; a phase-in provides the opportunity to test processes and ensure the accuracy of retrieved data.

Bono of PricewaterhouseCoopers (2016) recommends the company first consider the ecosystem of the devices to be connected: what information is expected from the network; whether the network is based on internet-protocol-enabled products (transmitting sensors placed on devices whose performance information can be retrieved over the internet through web-based applications); what data is to be collected; and the storage location (i.e., cloud-based or housed on internal servers). Lastly, the data itself: how will the data be extracted, and what analytics applications will be used in its interpretation?
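For illustration only, these due-diligence questions can be captured as a simple configuration record that a planning team might fill in before deployment; the keys and values below are invented for the sketch, not a PwC artifact.

    iiot_plan = {
        "device_ecosystem": ["spindle vibration sensors", "conveyor RFID readers"],
        "ip_enabled_products": True,          # performance retrievable via web applications?
        "data_collected": ["vibration", "throughput", "energy_kwh"],
        "storage_location": "cloud",          # or "internal_servers"
        "extraction_method": "periodic pull over the plant network",
        "analytics_applications": ["dashboards", "predictive maintenance model"],
    }

    for question, answer in iiot_plan.items():
        print(f"{question}: {answer}")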

An IIoT strategy is about integrating data from the business, machines, and shop floor to make better business decisions (Kletti, 2007). When considering a digital strategy (data-analytics and the IIoT), the ecosystem within and without the organization needs to be assessed; Bharadwaj, et al. (2013) view this process through understanding the scope of the digital business strategy, its scale, its speed, and its sources of value creation62.

62 Scope being the effectiveness among relationships (i.e., to firms, industries, information technology infrastructures, and the external environment); scale meaning infrastructure dynamic capabilities through on-demand access to resources, network communications among stakeholders, harnessing massive amounts of heterogeneous data, and modularization of business processes where ‘plug-and-play’ capabilities link to digital assets; speed as in product launches and responding to rapidly changing technologies, decision making where real-time response to data-analytics creates a competitive advantage, supply chain orchestration that innovates product portfolios, and network formation, where the digital world is based on technical platform preferences; value creation and capture experienced through leveraging the value of data, as in learning customer preferences and targeting product / service offerings, digital strategies that allow for ‘multi-sided’ business models to occur through affinitive relationships (in the case of mobile devices: hardware manufacturers, telecom operators, service providers [e.g. Facebook]), and the digital nature of the industry as to what architectures become standards.


A company should perform a cost-benefit analysis as part of any due diligence prior to adopting IIoT technologies and a digital strategy; Lee and Lee (2015) recommend conducting a ‘real option approach’63, given that “managers intuitively” understand the value of an initiative in both profit and opportunity. Li, et al. (2012) state that management’s strategic intent when getting on the ‘IIoT bandwagon’ is either to ‘get ahead of’ or ‘catch up with’ competitors (p 206). To ‘get ahead in the market’ is to lead the competition through early adoption of a new technology that creates a competitive advantage; the ‘catch-up with the market’ impetus arises from the fear of losing market share, from simply following the success of the industry leader, or from finding a more effective deployment of the technology to unseat the leader.64
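Footnote 63 lists the parameters of such a real-option valuation: an estimate of the monetary investment, a quantitative risk assessment, and a ramp-up time-line. The sketch below is a minimal net-present-value calculation under those parameters; the discount rate and all cash-flow figures are invented for illustration, not drawn from the study.

    def npv(rate, cash_flows):
        """Discount a series of yearly cash flows (year 0 first) at the given rate."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    investment = -250_000                         # year-0 IIoT investment estimate
    ramp_up = [40_000, 90_000, 120_000, 120_000]  # benefits grow over the ramp-up years
    print(f"NPV at 10%: {npv(0.10, [investment] + ramp_up):,.0f}")  # positive NPV favors the 'expand' option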

63 Real option valuation considers four approaches: the option to abandon or switch, walking away from financially losing projects or redeploying those resources; the option to contract, allowing for project scalability; the option to defer or postpone, where management takes a ‘wait and see’ position based on project profitability; and the option to expand or scale up based on the success of the project. Real option valuation parameters include a net present value analysis of the project based on estimates of the monetary investment, a quantitative risk assessment, and a ramp-up time-line of the project.

64 Li, et al. cite Haier, a Chinese appliance manufacturer and one of the first to incorporate sensor technologies allowing mobile-device accessibility capable of turning appliances off and on remotely, as using a ‘get ahead in the market’ strategy. Competitor supermarkets to Walmart ‘catch up with the market’ by employing similar IoT technologies to improve product turnover and reduce food spoilage. A ‘get-ahead in technology’ strategy focuses on proactively developing and keeping a unique technology to ‘create technological advantages’ through protected intellectual properties; they cite a machine-to-machine technology developed by Cinteron that led to a competitive advantage through an industry-changing wireless communication technology. Conversely, a ‘catch-up in technology’ strategy is much the same as ‘catch-up in the market’, where fear of being ‘pushed aside’ by new technologies pressures the company into adoption; they reference an RFID technology provider (Junmp) with limited R&D resources that collaborated with a competitor possessing a more advanced technology to remain competitive.


“As perhaps the biggest of the latest technology trends, the IoT is going to give us the most disruption” and will provide businesses with new opportunities (Mahmood, 2016, p vii). Internet protocol65 technology has enabled this connectivity to occur, combining heterogeneous networks where data is retrieved from heterogeneous connected devices (Gershenfeld, et al., 2004) such as machine sensors and emerging nanotechnologies (Lee and Lee, 2016). Productivity applications for the IIoT are emerging daily and, given their “interoperability and integration” with existing business information system platforms, greater data-access and report functionality is being made possible through “real-time data analytics and business intelligence” (Uckelmann, et al., 2011). The adoption of IIoT technologies can be segmented into three motivations: connectivity to supply chains, asset management, or product / production-process monitoring (Dunbrack, et al., 2016), as in optimizing distribution costs, improving material tracking, and redesigning factory work-flows (Lee and Lee, 2015)66.

When deployed, manufacturers can “fuse Data (Big and Small) with automation activities to create measurable business value”67; as innovations, providing for rapid costing, non-conformance report analytics, plant-load optimization, shop-floor operational improvements, supplier and supply chain value enhancements, and employee welfare in the work environment (Jamwal, 2016). The case study definition of

65 Internet Protocol (IP) specifies the format in which data packets (data packaged into smaller transmittable segments, or datagrams) are addressed [much like a postal letter] for delivery in disparate ‘chunks’ through the internet; on arrival at the destination they are reassembled through the Transmission Control Protocol (TCP).

66 Five technologies are essential to adopting the IIoT: radio frequency identification (RFID), wireless sensor networks (WSN), middleware or application program interfaces (API), cloud computing, and IoT integration software (Lee and Lee, 2015).

67 Jamwal, A. (2016) The Industrial Internet of Things: 6 Ways Manufacturers Can Fuse Big Data, Automation and IoT for Better Operations, November 15, Internet of Things Institute. www.ioti.com


data connectivity is internally sourced data that, upon analysis, reveals insights on altering the performance of existing resources and capabilities. The re-combinative definition is the data-interconnectivity among technologies implemented for the purpose of generating, collecting, analyzing, and operationalizing data.

H10a: IIOTIP will have a direct and positive effect on EMI.
H10b: IIOTIP will have a direct and positive effect on DAUM.
H10c: IIOTIP, when moderated by EMI and DAUM, will have a positive effect on the data actuation processes: Productivity actuated processes (PAP), Planning actuated processes (PLAP), Data governance actuated processes (DGAP), and Innovation actuated processes (IAP).
H10d: IIOTIP, when not moderated by EMI and/or DAUM, will have a lesser effect on DAP.


3.4.5 Constructs and definitions

Table 20: Data-centric integration practices constructs and definitions

Item: Strategic Level Integration
Case study definition: Data in support of enterprise goals to sustain organization strategies and operations
Combinative definition: Enterprise level data organizationally integrated effecting competitiveness
Sources: Bogetoft, 2012; Baan, 2009; Demartini, 2014; Chen, et al. 2012; Baesens, et al. 2016; Baesens, 2014; Hoerl, et al. 2014; Tetlock, 2015; Elgendy and Elragal, 2014; Ross et al. 2013; Desouza and Smith, 2014

Item: Operation Level Integration
Case study definition: Data in support of maintaining organization capabilities to achieve strategic objectives and goals
Combinative definition: Function area level data used on maintaining capabilities to achieve strategic and operation goals
Sources: Banafa, 2016; Freidman, 2015; Baesens, et al. 2016; Bughin et al. 2016; Lindstrom, 2016; Heath, 2016; Schoenherr and Swink, 2012; Ross et al. 2013

Item: Data-security Level Integration
Case study definition: Technology integrated data-mechanisms to protect data assets from unwanted departure
Combinative definition: The integration of data-asset protection mechanisms into data-generating technologies to prevent the unwanted departure of data-assets
Sources: Beecher, 2018; Simmonds, 2018; Anderson et al. 2017

Item: IIOT Level Integration
Case study definition: Internally sourced data for the purpose of revealing insights on altering the performance of existing resources and capabilities
Combinative definition: The data-interconnectivity among technologies implemented for the purpose of generating, collecting, analyzing, and operationalizing data
Sources: Li, et al. 2012; Lee and Lee, 2015; Uckelmann, et al. 2011; Manzoor, 2016; Bono, 2016; Bharadwaj, et al. 2013; Mahmood, 2016; Gershenfeld, et al. 2004; Dunbrack, et al. 2016; Jamwal, 2016; Kreidler and Wascow, 2014


3.5 Data-analytics actuated processes

A process is defined as an organized activity occurring at the organization’s function-area, department, and/or work-station level, segmented and linked to corporate objectives through internal processes and employee competency to achieve expectations (Kaplan and Norton, 1992; Bughin, et al., 2016). To actuate68, or operationalize, is to put into action, to put into motion, to make a machine work, to start a process; each similar definition indicates that a process begins with an action taken. The action in this reference is the data-integrated practices, which start the data-actuated processes.

As interpreted from the case interviews, a data-analytic actuated process reflects the organization’s strategic and operation goals and objectives made visible through data-use practices and mechanisms. Analyzing leading and lagging data is important to facilitating data-integration of strategic goals and objectives and to actuating a process. Understanding how each indicates whether the process is performing towards an expected outcome is contextual, requiring co-analysis; one without the other will provide misleading or incomplete information on which to make decisions.

Data-analysis on leading data aids the organization in visualizing trends or patterns indicating the necessity of altering or realigning a process to meet a strategic or operational goal or objective. Lagging data speaks to what has occurred, the outcome; looking only at lagging data will not tell a complete performance story, nor will it allow the data to clearly identify questions as to why performance did not meet expectations.
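The co-analysis argued for here can be made concrete with a toy example: a leading series (monthly quote requests) trending upward while a lagging series (orders shipped) stays flat signals a process needing realignment before the lagging outcome confirms it. All figures below are invented.

    quotes  = [110, 118, 125, 131, 142, 150]  # leading: monthly quote requests
    shipped = [ 90,  92,  91,  93,  92,  94]  # lagging: monthly orders shipped

    def trend(series):
        """Average month-over-month change in a series."""
        return sum(b - a for a, b in zip(series, series[1:])) / (len(series) - 1)

    print(f"leading trend: {trend(quotes):+.1f} units/month")   # demand is rising
    print(f"lagging trend: {trend(shipped):+.1f} units/month")  # output is nearly flat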

68 Definitions taken from a variety of online dictionary reference sites, reflecting the commonly referenced general understanding of the terms actuation and actuate.


Deeper discussion of leading and lagging indicators is held for future research; they are referenced herein to add context to the use of data in processes. As leading and lagging data are shared across function areas and processes, knowledge gleaned from the data “is the basis for important decision making … assessing the [data] strategic fit … to fulfill specific goals” (Hong, et al. 2011, p 187)69.

Performance stories are told by the processes used to achieve outcomes. The case studies revealed three key processes on which companies operate: those pertaining to productivity, planning, and innovation. This literature review extends the process discussion to a fourth, data governance; a topic either inferred or briefly discussed in the case studies but not significant enough for inclusion among the key processes in chapter 2. Its extension in the literature-discussed processes, however, is a natural build onto the data-security integration practices found through the case studies.

3.5.1 Key productivity process

The case studies defined a key productivity process as the data-measurable relationship of inputs and outputs on an operation activity towards an expected outcome, comprised of safety (worker welfare), through-put (manufacturing efficiency), and financial (costs of manufacturing) measures. Literature presents a large and varied bag of thoughts on defining the term productivity, with its meaning many times foreshadowed

69 Taken contextually from the authors’ study on product development and project-team involvement: the sharing of information cross-functionally, from varying sources of data contributed by “sharing communities” (those external and internal to the organization), requires alignment with strategic objectives. Similarly, processes require alignment with strategic and operation objectives.


by mathematical applications serving as definitions, at times confusing productivity and the use of resources contextually with performance expectations and actual performance.

Productivity measures the “physical inputs to the factory with the physical outputs” (Tangen, 2004, p 36; Kaplan and Cooper, 1998). Data inputs (as a resource) into a process also have data outputs, in terms of how well the process utilized data in its performance to achieve or exceed an expected outcome. Productivity thereby maximizes the usefulness of resources to extract as great an output as possible, producing more goods from the same or fewer resources (Bernolak, 1997) and using available resources to create greater value (Tangen, 2004). Lessening the physical inputs to the manufacturing process increases the through-put of the process, possibly lessening the cost of manufacturing per unit, increasing profitability, and growing capabilities.
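The output-over-input relationship reduces to a simple ratio; the sketch below, with invented units and dollar figures, computes the ratio for two periods and its change.

    def productivity(output_units, input_cost):
        """Physical output per unit of physical input (here, units per input dollar)."""
        return output_units / input_cost

    before = productivity(10_000, 42_000)  # baseline period
    after  = productivity(10_500, 40_000)  # after a process change
    print(f"productivity change: {100 * (after / before - 1):+.1f}%")  # roughly +10%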

Literature is resplendent with many forms of productivity measurements and related outcomes, most of which are tied to some form of system designed to derive greater value from productivity processes by identifying opportunities for improvement through greater use of data-analytics. Common to productivity improvement is lean manufacturing, representing several methods of examining value-added and non-value-added processes to reduce waste by identifying bottlenecks, snags, resource usage, material movement, and worker involvement that impede value creation; value-added improvements on productivity positively impact product quality, product profitability, environmental performance, and market acceptance (Yang, et al. 2011)70. Other forms of operational excellence programs center on data-analytics to operationalize value creation; companies

70 The authors reference TPS, the Toyota Production System as representative of lean manufacturing processes.


deploying Six-Sigma do so for the expressed purpose of improving profit margins, primarily through increased capacity, labor reallocation, and capital reductions (Harry and Schroeder, 2000); statistical process control (SPC) is used to view processes longitudinally, identifying changes in productivity performance, or generically as any data-analytics method that examines processes over periods of time (Woodall and Montgomery, 1999). Regardless of the method of analysis, data actuates productivity for the purpose of determining how well the process is performing.
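As one minimal illustration of SPC’s longitudinal view, the sketch below derives the conventional mean ± 3-sigma control limits from an in-control baseline run and flags later observations falling outside them; the cycle-time data are invented.

    from statistics import mean, stdev

    baseline = [61, 59, 60, 62, 58, 61, 60, 59, 60, 61]  # in-control cycle times (seconds)
    m, s = mean(baseline), stdev(baseline)
    ucl, lcl = m + 3 * s, m - 3 * s  # conventional 3-sigma control limits

    new_points = [60, 62, 74, 59]
    flagged = [(i, x) for i, x in enumerate(new_points) if not lcl <= x <= ucl]
    print(f"limits: [{lcl:.1f}, {ucl:.1f}]  flagged: {flagged}")  # flags the 74-second run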

Other manufacturing information systems have made the actuation of data-analytics more feasible. For example, Manufacturing Execution Systems (MES) have developed into a near-total IS connectivity platform serving production preparation, production activity, transportation and logistics, material management, and quality assurance; providing automated and immediate sensing of data on order fulfillment, process delays, output constraints, material usage, labor allocations, and safety issues, all of which in combination affect productivity processes (Kletti, 2007; Lindau, 1997). Productivity from this additive data-analytics perspective becomes a set of processes focused on delivery reliability, product lifecycles, interconnectivity among resources, and adapting to market dynamics.

The combination of literature and case studies indicates that productivity is a series of data-analyzed processes that, when combined and aligned with strategic and operation objectives, optimize the use of resources to profitably extract the greatest amount of output.


H11a: PAP will have a direct and positive effect on EMI.
H11b: PAP will have a direct and positive effect on DAUM.
H11c: PAP, when moderated by EMI and DAUM, will have a positive effect on organization capabilities growth (OCG): Current organization capabilities growth (COCG), Planning organization capabilities growth (POCG), and Current cost improvements (CCI).
H11d: PAP, when not moderated by EMI and/or DAUM, will have a lesser effect on OCG.

3.5.2 Key planning process

The case study definition is the integration of organized activities on current and future business elements aligned with strategies and objectives. Sales and Operations Planning (S&OP), a key business process aligning customer demand with supply capabilities (Tuomikangas and Kaipia, 2014), is common among manufacturers; the un-commonality is in the varying degree of utilization of this planning process. Advancements in data-systems (as just described for MES) are making available the tools for more efficient and effective planning; that aside, not all manufacturers embrace these technologies, sans the use of an ERP or like IS that allows for planning purposes. Ivert and Jonsson (2010, 2014) found the learning effects of S&OP and planning efficiencies to improve decision making and cost savings, supported by literature on large organizations utilizing sophisticated LP optimization models useful in strategic, tactical, and operation planning (Brown, et al. 2001; Gupta, et al. 2002). Sophisticated planning models are giving way to more intuitive planning systems, making data-analytics-tied planning more affordable and accessible for many smaller sized manufacturers and yielding the same benefits larger organizations enjoy.


The APICS dictionary defines S&OP as “a process to develop tactical plans that provide management the ability to strategically direct its business to achieve competitive advantage on a continuous basis by integrating customer-focused marketing plans for new and existing products” (p 6). It is framed by ‘demand management’, consisting of marketing and sales plans predicting demand forecasts connecting to S&OP (demand and supply planning). Supply planning manages the alignment of production, resources, inventory, and distribution processes71 with the objectives of demand management. Data streams on material availability and cost, analysis of social media to sense trends affecting demand, data scanning on skills and talent availability to meet human-capital requirements, and consumer data collected for planning new products are but a few of the many data points, or data variety, entering into the planning characteristics of S&OP. Accordingly, S&OP allows management to monitor how these processes are achieving the strategic and operational plans of the organization, detecting planning and progress gaps that may prevent S&OP from meeting those expectations. Olhager, et al. (2001) outlined the decision categories in a manufacturing strategy: capacity, facilities, production process, vertical integration, quality, organization, personnel, and information control systems; similar in context to those framed in the APICS S&OP model.
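The balancing step at the heart of S&OP, comparing forecast demand against planned supply period by period, can be sketched in a few lines; the quarterly figures below are invented for illustration.

    forecast_demand = {"Q1": 1200, "Q2": 1350, "Q3": 1500, "Q4": 1400}  # units
    planned_supply  = {"Q1": 1250, "Q2": 1300, "Q3": 1300, "Q4": 1450}

    for period in forecast_demand:
        gap = planned_supply[period] - forecast_demand[period]
        status = "balanced" if gap >= 0 else f"shortfall of {-gap} units"
        print(f"{period}: {status}")  # gaps are what demand and supply planning reconcile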

The APICS definition is comprehensive in its planning-process explanation. Other literature capsulizes S&OP as a dynamic, collaborative planning and decision-making process among functions: “a method-oriented perspective on planning” (Tuomikangas and Kaipia, 2014, p 253); sets of business and technological processes enabling the company to allocate resources to meet market demands (Adamczak, et al. 2013; Muzumdar and Fontanella, 2007); a bridge between operations and strategic plans (Thome, et al. 2014; Olhager, et al. 2001); cross-functional collaboration on a tactical planning process aligning all business plans with near- and long-term objectives, creating value for the firm (Thome, et al. 2011); and manufacturing planning decisions impacting demand plans (O’Leary and Flores, 2002). When considering company size and the use of S&OP, Adamczak, et al. (2013), in a survey of small and medium sized European-area manufacturers, found that the smaller the firm, the greater the reliance on the immediacy of orders as data feeds for planning; as company size increases, the retrieval of discrete data used in forecasting and planning becomes greater. The case study interviews found a similar pattern, at least in the Midwest United States. Outcomes of S&OP in literature primarily consider organization performance as the outcome ‘catch-all’; however, when thinking on the granular meaning of the term, S&OP is a data-fostered demonstration of organization capabilities, determining current and planned capabilities to achieve company strategies. The more efficient and productive S&OP is, the greater the yield in cost reductions and improved profitability.

71 The APICS S&OP model segments production processes into production planning, master scheduling, and detailed planning and scheduling; resources are categorized as facilities planning, labor planning, and machine planning; inventory is classified by investment targets, channel distribution, and shipment plans; distribution is classified by transportation, warehousing, and related labor and equipment requirements.

Commercially available cloud-based S&OP data systems are becoming more prevalent; for example, Oracle provides a platform on which to automate much of the planning processes found in S&OP. Critical to planning is the availability of multiple methods of information-system data feeds into S&OP. MES has been mentioned; ERP or like enterprise systems, warehouse management systems, product lifecycle management systems, additive manufacturing systems, flexible manufacturing systems, and computer-assisted design systems are but a few that coordinate into the S&OP process. As more effective, efficient, and economical data-technology means to execute S&OP are adopted, the greater the impact this critical element of creating value in the manufacturing environment will have. Re-combinating a key planning process definition: a key set of organized, data-dependent activities centered on aligning manufacturing capabilities with demand and supply management.

H12a: PLAP will have a direct and positive effect on EMI.
H12b: PLAP will have a direct and positive effect on DAUM.
H12c: PLAP, when moderated by EMI and DAUM, will have a positive effect on OCG: Current organization capabilities growth (COCG), Planning organization capabilities growth (POCG), and Current cost improvements (CCI).
H12d: PLAP, when not moderated by EMI and/or DAUM, will have a lesser effect on OCG.

3.5.3 Key data governance process

“Governance refers to what decisions must be made to ensure effective management and use of IT [data assets] (decision domains) and who makes the decisions (locus of accountability)” (Khatri and Brown, 2010, p 148); an exercise of authority and control over the management of data assets, “a unifying oversight mechanism in data management” (Cupoli, et al., DAMA-DMBOK, 2014, p 26); the discipline of administering data and information assets (Orr, 2012); and “the allocation of decision-making rights and related duties in the management and use of enterprise data” (Otto, 2013, p 95).

Data governance standards define the capture, storage, distribution, and curation of data (Desouza and Smith, 2014). Interactions among actors sharing data may cause negative consequences in the form of data-breaches; control of data assets is therefore critical to governance, and that control is managed by governing bodies within the organization responsible for determining the rights of accessibility and use of data assets (Lee, et al. 2018). Data governance takes shape through IT strategies defining roles and responsibilities for decisions on the management of data (Weber, et al. 2009) and how strategies are monitored and accomplished (Al-Ruithe, et al. 2017), disseminating into seven fundamental governance disciplines: organization, metadata, privacy, data-quality, business process integration, master data integration, and data lifecycle management (Soares, 2012)72. DAMA-DMBOK73 (2014) provides for data governance as a series of process dimensions on the “planning, oversight, and control of data and data-related resources, covering the processes related to the management of data” (p 10); influencing data and information outcomes to drive value, which can be realized through the design and infrastructure of the governance program, data-asset management operations, data-asset management projects, data-asset management business operations,

72 Soares, S. (2012) Big Data Governance: An Emerging Imperative describes the information governance disciplines (pp 12-13) as: organization, the structure, roles, and responsibilities of data-governance; metadata, the integration of all data within the organization’s data repository; privacy, identifying degrees of data sensitivity and establishing policies on use; data-quality, methods to ensure the integrity of data; business process integration, identifying key processes and their data-variety requirements; master-data integration, the management of data variety within the organization’s operating environment; and information (data) lifecycle management, determining the business and regulatory use of data in operational and analytics systems and the archiving and deletion of data.

73 DMBOK (2014), the Data Management Body of Knowledge published by the Data Administration Management Association (DAMA), provides a framework on areas of data knowledge management, centering on data governance in the dimensions of data-architecture, data-modeling & design, data-storage & operations, data-security, data-integration & interoperability, documents & content, reference & master-data, data-warehousing & business intelligence, meta-data, and data-quality.


and alignment with the company’s governance strategies and policies74 (Orr, 2012).

The data governance hierarchy begins with the ‘executive level’ sponsoring and leading governance initiatives, flowing down to the management level establishing data-management policies and standards, then to data-stewards as implementers of management’s policies and standards, and on to data-users and external actors aligning accessibility and use with established policies and standards (Fleckenstein and Fellows, 2018, Chapter 8, p 71)75. Following this hierarchical illustration, data governance applied to the centralization or decentralization of data accessibility and use for decision-making is leadership dependent. Decentralized means policies allowing data for decision making to be provided at the “lowest point in the organization where the needed skills and competence and the needed information can be reasonably brought together” (Frey, T., 2014, p 11; Burlingame, 1961, p 121). Centralization, the opposite, is defined by policies relegating data accessibility and use to the highest levels of management in the organization, eliminating or significantly reducing the role of data in decision-making at lower points in the organization. This discussion is shaped by the advances being made in data accessibility, namely cloud computing. The contemporary effects of using cloud-based technologies present another layer of governance beyond those established and require rethinking governance protocols around the centralized and decentralized mind-sets of data accessibility. In other words, cloud

74 Orr, J.C. (2011), in the book titled Data Governance For The Executive, details how data governance provides valued outcomes when data assets are effectively governed (chapters 3-8).

75 The authors (Fleckenstein and Fellows) present a pyramid hierarchy developed through the MITRE Corporation, a not-for-profit organization operating research and development centers for the federal government (p viii).


technologies may have the effect of moving managers from a centralized data-use position to a decentralized one.
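One way to picture this governance point: an access policy can be expressed as rules mapping roles to data domains, so centralizing or decentralizing accessibility becomes a policy change rather than a technology change. The roles and domains below are invented for the sketch.

    POLICY = {
        "executive":     {"finance", "operations", "customer"},
        "data_steward":  {"finance", "operations", "customer"},
        "plant_manager": {"operations"},
        "line_lead":     set(),  # a centralized stance: no direct data access
    }

    def can_access(role, domain):
        """True if the governance policy grants the role access to the data domain."""
        return domain in POLICY.get(role, set())

    # Decentralizing to the 'lowest reasonable point' is then a one-line policy change:
    POLICY["line_lead"].add("operations")
    print(can_access("line_lead", "operations"))  # True after the change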

Data governance complexity increases with greater use of cloud-based computing and data-storage. The flow of company information, possibly including customer-sensitive data, into remote, third-party data-warehouse locations presents transparency and control issues affecting how the organization manages the risks associated with data movement and use among stakeholders. Felici and Pearson (2015) identify cloud-based data protection issues as multi-tenancy, elasticity, abstraction, automation, duplication, multi-locus accessibility, and sub-processing76, each a dimension compounding how the organization manages data. Cloud computing is impacting our personal and business lives; applications such as Adobe Creative Cloud, ADP, Amazon Cloud, Apple iTunes, Box for document storage, GoToMeeting, Microsoft 365, Oracle Cloud Services, SAP, Twitter, and hundreds of others allow us to quickly access and use productivity tools in the form of software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS)77; doing so allows companies and individuals the ability to develop, operate, and manage IT applications without having to build and maintain expensive infrastructures. Adopting these less costly services has great benefits

76 (Felici and Pearson, 2015, p 8) Multi-tenancy refers to cloud applications where co-tenants in the cloud space gain unauthorized access to data by means of another application; elasticity, the attack surfaces on which data can be accessed; abstraction, the reliability of physical controls on data accessibility; automation, movement away from or decrease in human involvement in data protection; multi-locus access, internal risks promoted by employee access to cloud services and data-flow boundaries; sub-processing, reliance on and reliability of third parties for data processing and the lack of related transparency and compliance protocols.

77 Generally accepted understandings of the terms SaaS, PaaS, and IaaS are: SaaS, web access provides the point of interaction with software running on a centralized server located away from the user; PaaS sits under SaaS and provides the platform on which software is developed and operating systems reside, allowing businesses the ability to scale resource requirements; IaaS forms the foundation on which PaaS and SaaS rest, automated and scalable computing resources accessing data through network capabilities on server deployments on an on-demand basis.


in providing state-of-the-art technologies; the drawback, from a governance perspective, is the locus of data control, to certain extents moving data-security from the organization to a third party. This movement necessitates that companies develop stricter and stronger data governance to protect data-assets from unwanted departure.

Data governance processes vary in design based on the company and its level of sophistication in using data-analytics. Transparency and control factors are well noted in literature as fundamental to design, requiring regular communication across the organization on data-security matters and data-breach events. Noted is the impact of cloud computing technologies on data-security and the need to continuously re-think governance as new data-technologies are adopted. Data governance is infrastructure dependent, as illustrated in how management views the importance of data-governance investment. Data governance was a loosely referenced item in the case interviews, while the topic of data-security, a blending of practice integration and actuated process, was more prominent. Literature served to break out the definitions, with data-security meaning the integration of data-asset protection mechanisms and data governance meaning the authority and control over the management of data assets.

H13a: DGAP will have a direct and positive effect on EMI.
H13b: DGAP will have a direct and positive effect on DAUM.
H13c: DGAP, when moderated by EMI and DAUM, will have a positive effect on OCG: Current organization capabilities growth (COCG), Planning organization capabilities growth (POCG), and Current cost improvements (CCI).
H13d: DGAP, when not moderated by EMI and/or DAUM, will have a lesser effect on OCG.


3.5.4 Key innovation processes

“Innovation is a very difficult thing in the real world” (Feynman, 1985, p 36). Data-analytics creates insightful value on business process improvements, process innovation, product innovation, organization process improvements, and others affecting process efficiencies, productivity, accessibility, and availability (Grover, et al. 2018); working to make the processes of innovation and innovativeness a bit less difficult.

Managing innovation is a continuous, iterative process (Kusiak, 2009), fostered by an openness to new concepts, developing creativity among employees, and limiting organization hierarchy to increase innovation resiliency (disseminating the power for innovation among many in the organization versus holding it among a few) (Hamel, 2007).

Innovation is longitudinal in measurement (past, present, and future), where data-analytics provides insights on new methods of production and organization processes enabling capabilities. Davenport (1993) adopts “a process view of business with the application of innovation on key processes … as a mechanism … to reduce process costs, improve quality, operating flexibility, service levels, and other business objectives” (p 1). Davenport characterizes the definition by identifying key business processes found in ‘leading companies’, many of which are a direct reflection of S&OP; in this sense, innovation on processes is a function of available technologies and organizational resources. Data-driven innovation processes take varying forms. As discussed on S&OP, analysis of the data outputted by these core business processes reveals insights on which managers can alter, modify, or innovate a process to improve its ability to achieve strategic and operation objectives. In other words, one form of an innovation process is the analysis of S&OP processes to reveal new ways of doing things.


Innovation processes and process innovation differ in dimension and understanding. Process innovation creates new data-technology protocols designed to improve manufacturing capacities and efficiencies, “demanding interfaces between functional or product units to be either improved or eliminated … parallel through rapid and broad movement of information” (Davenport, 1993, p 8). Innovation processes, in contrast, are structured and organized activities, comprised of inputs and outputs, framed by time, for the purpose of achieving a desired outcome. Innovation processes therefore shape process innovation in different and unique ways. Innovation processes are categorized as those used for invention (idea emergence), development (idea elaboration), and implementation (idea acceptance) (Garud, et al. 2013). These macro-view categories are supported by innovation process mechanisms: organization structure, technologies (and or material), and people interaction to catalyze innovativeness.

Thiele, et al. (2016) promote an organization-structure innovation process whereby feeds of text-analyzed documents serve as a “data-driven process extracting inter and intra organizational synergies within network structures” (p 318) to visually demonstrate interactions among actors and make use of them to detect innovation opportunities. Innovation processes can be external to the company, in essence a process of “shopping for ideas” (Nambisan and Sawhney, 2007, p 112): those raw, market-ready ideas and products purchased or licensed from third parties when internal idea-generating processes are not enough. Data-analytics, the technology basis for this paper’s research, provides platforms to identify innovation opportunities and manage innovation processes; e.g., digital technologies, artificial intelligence, and machine learning, to name a few. When the term innovation is raised, many times our minds think of R&D. While


processes such as these emanate from technology applications, non-technology applications are equally of interest in this research. Innovation and innovation processes are in many ways forms of research and development, sans the R&D label. Bäckström and Bengtsson (2018), in a recent AOM proceedings announcement, plan to study employee innovation relative to innovative work behavior, its impact on firm innovation performance, innovation processes, and management tools for employee innovation involvement [data-technologies, as discussed throughout this article, provide the tools for greater employee involvement in innovation processes]. Evident in the case studies, employee, customer, and cross-function collaboration are important to innovation processes and, in and of themselves, become an innovation process deployed through data-analytic technologies utilized for innovativeness. For Garud, et al. (2013), innovation processes “serve as an engine of organic growth, the invention, development, and implementation of new ideas” (p 776), with over-arching process complexities affecting the innovation process: evolutionary, relational, temporal, and cultural.78 Li, et al. (2018) posit high-involvement work systems (HIWS) as a process promoting innovation through knowledge exchange

(data-technologies facilitate such activities). Cross-functional teams form an innovation process: employees from other departments form decision-making or recommendation-making bodies on ideas and initiatives through connectedness among team members, where gaps in connectedness are narrowed through data and communication technologies.

78 The authors define innovation process complexities as evolutionary, subject to path-dependent events that create innovation; relational, subject to organizational interplay among actors and technologies creating the innovation; temporal, innovation is not time linear, subject to starts and stops, involvement or interruption on the innovation process by expected and unexpected events; cultural, subject on organizational nuances on


Synthesizing this brief literature review with the case study definition re-combinates the definition of key innovation processes to mean those sets of structured and organized activities using organization resources on the promotion of creating novelty and organization value.

H14a: IAP will have a direct and positive effect on EMI.
H14b: IAP will have a direct and positive effect on DAUM.
H14c: IAP, when moderated by EMI and DAUM, will have a positive effect on organization capabilities growth: Current organization capabilities growth (COCG), Planning organization capabilities growth (POCG), and Current cost improvements (CCI).
H14d: IAP, when not moderated by EMI and/or DAUM, will have a lesser effect on OCG.


3.5.5 Constructs and definitions

Table 21: Data-centric actuated processes constructs and definitions

Item: Productivity Actuated Processes
Case study definition: The data-measurable relationship of inputs and outputs towards an expected outcome
Combinative definition: A set of data-analyzed organized activities optimized in the use of resources to profitably extract the greatest amount of output
Sources: Kaplan and Norton, 1992; Bughin, et al. 2016; Hong, et al. 2011; Tangen, 2004; Bernolak, 1997; Yang, et al. 2011; Harry and Schroeder, 2000; Woodall and Montgomery, 1999; Kletti, 2007; Lindau, 1997

Item: Planning Actuated Processes
Case study definition: The data-integration on organized activities on current and future business elements aligned with strategies and objectives
Combinative definition: A set of organized, data-dependent activities centered on aligning manufacturing capabilities with demand and supply management to accomplish current and planned strategies and objectives
Sources: Tuomikangas and Kaipia, 2014; Ivert and Jonsson, 2010, 2014; Brown, et al. 2001; Gupta, et al. 2002; APICS; Adamczak, et al. 2013; Muzumdar and Fontanella, 2007; Thome, et al. 2011, 2014; O’Leary and Flores, 2002; Olhager, et al. 2001, 2007

Item: Data Governance Actuated Processes
Case study definition: None derived
Combinative definition: A set of structured and organized activities on authority, control, and management of data assets
Sources: Khatri and Brown, 2010; Orr, 2012; Otto, 2013; DAMA, 2014; Desouza and Smith, 2014; Lee et al. 2018; Weber, et al. 2009; Al-Ruithe, et al. 2017; Soares, 2012; Fleckenstein and Fellows, 2018; Frey and Osborne, 2014; Burlingame, 1961; Felici and Pearson, 2015

Item: Innovation Actuated Processes
Case study definition: The integration of new or modified mechanisms to create novelty and value
Combinative definition: A set of structured and organized activities using organization resources on the promotion of creating novelty and organization value
Sources: Feynman, 1985; Grover, et al. 2018; Kusiak, 2009; Hamel, 2007; Davenport, 1993; Wang, et al. 2015; Garud, et al. 2013; Thiele, et al. 2016; Nambisan and Sawhney, 2007; Bäckström and Bengtsson, 2018; Li, et al. 2018


3.6 Making a difference on organizational performance

Organization capabilities, in a general definition, are relatively simple; their level of application or degree of use in research becomes multi-faceted. A basic meaning is the know-how of the organization that, over time, has developed ways of doing certain things, reliant on available organization resources to achieve an expected outcome. Companies are built on core, idiosyncratic capabilities, with proprietary knowledge thought superior to competitors (Dosi, et al. 2000, pp 25-26), as “manifestations of observable corporate structures … processes … culture … network of employee relations” (Collis, 1994, p 145). Organizational capabilities are slow to change, stable in a reliable, satisfactory manner (Schienstock, 2009; Helfat, 2011).

Companies work to explore and exploit capabilities to their competitive advantage. Literature suggests capabilities segment into operational, dynamic, and strategic. Operational capabilities are core to ‘what the company does’ and ‘how it makes money’, its ordinary function activities (e.g., production, operations, administration, marketing, logistics, planning); dynamic capabilities are activities that make changes to operation functions and process innovativeness on capabilities, its extraordinary activities; strategic capabilities are activities on what the company plans to do, its dynamic development of novel strategies, acquisitions of new business, and establishment of new product lines to sustain competitiveness (Cepeda and Vera, 2007; Collis, 1994; Helfat and Peteraf, 2003; Helfat and Winter, 2011; Winter, 2003; Eisenhardt and Martin, 2000; Amit and Shoemaker, 1993). Dynamic capabilities can be dimensionalized by competencies: Hong and Park (2015, p 8) transformed dynamic capabilities to also mean network capabilities, framed by market competence, the effects of the external environment on capabilities; technology competence, resources allocated to enhance technology capabilities; and linkage competence, the capability to combine internal and external resources transforming ideas into tangible things (Park and Hong, 2012). Teece (2007) partitions capabilities into capacities: sensing and shaping opportunities and threats, seizing opportunities and threats, and maintaining competitiveness.

Kim et al. (2012) speak to the firm’s competence in addressing changing business environments through process improvements, cost improvements, superior business intelligence, and organization learning as defining performance advantages when compared with competitors. Thinking from this paper’s research on the data-centric competence or data-centric point of view, Ross et al. (2013) found “companies with an evidence-based decision-making culture tend to be more profitable than companies who do not” (p 92). “The more companies characterized themselves as data-driven, the better they performed on objective measures of financial and operational results” (McAfee and Brynjolfsson, 2012, p 64)79. Davenport and Harris (2007) assessed that firms with higher levels of data-use intensity also demonstrate higher growth rates. LaValle, et al. (2011) categorize a company’s level of data-analytics maturity as aspirational, experienced, or transformed, measured against company performance in terms of ‘where analytics is performed’ (p 28): at the IT-department level, the point-of-need level, the line-of-business level, or the centralized level80; finding “capabilities grow and deepen within the

79 McAfee and Brynjolfsson found “companies in the top third of their industry using data-driven decision-making were, on average, 5% more productive and 6% more profitable than competitors” (p 64).

80 Aspirational: companies in pursuit of data-analytics, focused on efficiency, automation of existing processes, and cutting costs. Experienced: companies that have integrated data-analytics, moving past cost-cutting and seeking ways to optimize the organization. Transformed: companies that are highly experienced users of data-analytics and data capture, focused on driving customer profitability and targeted investments; transformed companies are believed three times more likely to substantially outperform industry peers (pp 22-23). IT-department level: data-analytics is contained within the department. Point of need: data-analytics at the function level, where some data insights become cross-function-area. Line of business: data insights broaden in cross-function reach. Centralized: data-analytic resources and insights are shared efficiently across the organization. These are built layers, each additive to the other in the application and use of data-analytics (p 28).


organization … disciplines like finance and supply chain are inherently data intensive … often where data-analytics takes root” (LaValle, et al. 2011, p 28), and data-analytic capabilities become additive to existing capabilities, increasing organization performance.

3.6.1 Organization capabilities growth

When thinking of organization capabilities growth, it seems natural to tie the term to data-analytics: Davenport and Harris (2007) found that as intensity of use increases, companies experience increasing capabilities and higher annual growth rates. Capabilities, “the ability of an organization to perform a coordinated set of tasks, utilizing organizational resources, for the purpose of achieving a particular end result” (Helfat and Peteraf, 2003, p 999), and the thinking of data as a resource owned, controlled, and accessible on a semi-permanent basis, promulgate data-analytics as important to organization capabilities growth. Data is dynamic, ever changing across the seven ‘Vs’ of its characteristics; data-analytics contributes to capability dynamism by providing insights for building, integrating, and reconfiguring operation capabilities (Teece et al., 1997), directly and indirectly contributing to organization performance through its intensity of use.

An organization’s capability, in the abstract, is arguably subjective in beliefs about its functionality; the purpose of this paper’s research is not to gauge levels of capability



superiority, only to state that organizations have capabilities and that data-analytics serves to enhance them. Like data, which has a shelf-life of useful application, capabilities demonstrate a life-cycle staged through phases of founding, development, and maturity, with maturity signaling a possible decline in the capability’s value-creating propensity (Helfat and Peteraf, 2003).81 Measuring capabilities growth along this continuum is somewhat problematic given the heterogeneity of capabilities among organizations. Literature speaks little on how to measure capability growth. Capabilities can be measured by productivity, progressive accomplishment of an objective, usefulness, changes made to the capability demonstrated by improved outcomes, investments made into capabilities, allocation of resources to capabilities, and the like.

For the purpose of this paper’s research, organization capabilities growth is segmented into those current and those planned. Current organization capabilities growth means the current performance state of a capability measured against historical performance, represented by changes in employment population, changes in the number of customers served, changes in current product lines, changes in manufacturing technologies, and changes in capabilities through acquiring capabilities. Planned organization capabilities growth comprises those anticipated future changes, greater than one year out, to current organization capabilities growth, using the same set of metrics as for current organization capabilities. In essence, planned organization capabilities growth serves as a proxy for market capabilities growth, in that as the company invests in adding to or making

81 While not an immediate interest of this research (but soon to be), aligning data-analytics with the capability lifecycle would show the interplay of how DA affects each stage of the capability cycle. Founding stage: initial organization of resources and objectives in creating a new capability; development stage: considering capability alternatives to achieve the objective; maturity stage: exercising the capability and maintaining its usefulness.

184

advanced changes to current capabilities, it is premised on sustaining a competitive advantage, affecting the company’s market position. Hence this paper’s research considers capabilities as those organization competencies currently in use and those planned, where data-actuated processes effect the performance and growth of existing capabilities and provide insights on planning new capabilities fit well into definition of the afore described capability lifecycle.

3.6.2 Innovation capabilities growth

Extending this discussion to how the innovative part of capabilities can be measured: the simplest approach is to analyze historical changes in manufacturing costs aligned with organization capabilities growth. The premise herein is that the impact of innovation on capabilities would be realized through changes in core capability performance attributes. For example, implementing new advanced manufacturing systems would improve manufacturing capabilities and therefore possibly affect the labor portion of manufacturing costs; or total unit production would improve, raw material costs per unit might lessen, or the amount of energy consumed to manufacture a product would decrease. The other measure against these function metrics is the investment component: changes on the balance sheet in property, plant, and equipment may indicate capital expenditures for innovative technology applications such as robotics, or new production facilities to house new processes.
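To make the arithmetic concrete, a toy calculation of these per-unit proxies is sketched below (Python; all figures are hypothetical, not drawn from the study data).

```python
# A toy illustration (hypothetical figures) of the cost-based innovation
# proxies described above: year-over-year changes in per-unit metrics.
labor_cost_per_unit = {"2015": 12.40, "2018": 11.10}   # assumed dollars per unit
energy_per_unit     = {"2015": 3.2,  "2018": 2.7}      # assumed kWh per unit

def pct_change(metric: dict) -> float:
    first, last = metric["2015"], metric["2018"]
    return (last - first) / first * 100

print(f"labor cost/unit: {pct_change(labor_cost_per_unit):+.1f}%")   # -10.5%
print(f"energy/unit:     {pct_change(energy_per_unit):+.1f}%")       # -15.6%
```

Declines in such per-unit metrics, concurrent with capability additions, would be read here as evidence of innovation acting on core capabilities.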


A 2016 McKinsey Global Institute report82 stated that the potential outcome of capturing value from data-analytics for manufacturers is lowering product development costs by 50%, lowering operating costs by 25%, and increasing gross margin profitability by 30%. The report further identified legacy information technology systems, siloed data, and skeptical leadership as challenges manufacturers face in benefiting from data-analytics (p. 2). Engle et al. (2006) put forth that innovation is continuous in sustaining high-performance organizations, "composed of reinforcing practices and processes" (Lambrou, 2016, p. 41). Data disrupts the status quo (internally and externally to the organization) and changes things, operationalized through cost, accessibility, and model-structure mechanisms (Johnson et al., 2017; Wessel, 2016) in new process and product development. The capability to combine data from various sources increases the capacity for innovation to occur, resulting in capability growth and value creation, hence reducing operating costs and increasing profitability. Productivity growth (cost and profitability improvements) relies on data-driven innovativeness in processes: changes to work-flows, reallocating resources, inviting new resources, and repositioning organization structure. Product innovation benefits from material fungibility, feature-benefit enhancements, and the fostering of new production methods and processes.

H15a: Current organization capabilities growth has a direct and positive effect on planned organization capabilities growth.

H15b: Data-actuated processes have a direct and positive effect on cost improvements.

H15c: Innovation-actuated processes have a direct and positive effect on cost improvements.

82 Bughin, J., Manyika, J., and Woetzel, J. (December 2016). The age of analytics: Competing in a data-driven world. McKinsey Global Institute.


3.7 Summary

As alluded to in the introduction of this chapter, the literature review is meant to augment the case-study phase of this research. Comparative treatments of like topics were reviewed, and construct definitions were recombined to form a deeper understanding of the elements of the data-centric ecosystem. The propositions stated in chapter two were repositioned in chapter three and now serve as hypotheses in the survey portion of this paper's research.

CHAPTER 4: VOICE OF MANY

What broad-segment implications does a large-scale survey offer, and what do the findings indicate? Two pieces of this research quest have been completed; in this third phase, the topics and queries of interest are expanded upon through a survey vehicle designed to feasibly ask as many relevant questions as possible of a sample population of nation-wide manufacturers. The major constraint of a survey is the framing of items to reflect a topic of interest: even though significant effort has been made to clearly define each topic, a larger audience may not always interpret it in the same manner, especially when conducting exploratory and theory-building research. Chapter two essentially provides the framework by which independent measures reflect the latent constructs used in building the survey, and chapter three served to augment the case-study findings. Theoretical literature on the resource-based view, the knowledge-based view, organizational learning, and the technology-organization-environment model aligns with the case-study findings as illustrated in figure 22; i.e., data as a resource, when integrated


and placed into action, works to increase the value of capabilities. Typical of survey research, the survey instrument was developed with respect to conventional tests of validity and reliability, independent-variable predictive power, multicollinearity, and unidimensionality. This chapter discusses the survey development, the methods of analysis, and the results.

4.1 General facts on the final survey instrument and sample characteristics

In total, 117 items were fielded to 4,200 potential respondents, yielding 333 validated responses; 16 questions related to company and respondent demographics, and 101 items to topics of research interest. One proxy question was inserted to filter out any survey completed too quickly or without reading: respondents were directed to simply select 'strongly disagree', indicating comprehension of the requested action. If a respondent consistently selected the same response for all items, including the proxy question, the survey was rejected. The average completion time was 24.7 minutes, with the median at 16.7 minutes.
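As an aside for replication, this response-quality screen reduces to a few lines of data-frame logic; the sketch below (Python/pandas) assumes a hypothetical proxy column named PROXY1 and the 1-7 coding used throughout the survey (1 = strongly disagree).

```python
# A minimal sketch of the response screen described above; the column name
# PROXY1 and the 1-7 Likert coding are assumptions for illustration.
import pandas as pd

def screen_responses(df: pd.DataFrame, proxy_col: str = "PROXY1") -> pd.DataFrame:
    """Drop surveys that fail the proxy item or straight-line every item."""
    items = [c for c in df.columns if c != proxy_col]
    passed_proxy = df[proxy_col] == 1                 # directed answer: 'strongly disagree'
    straight_lined = df[items].nunique(axis=1) == 1   # one identical response throughout
    return df[passed_proxy & ~straight_lined]
```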

Distribution of the survey was managed through Qualtrics, with 275 validated surveys originating from a Qualtrics-selected manufacturing panel and 56 from random LinkedIn contacts among C-suite and senior management personnel in operations, information systems, executive management, finance, production/manufacturing, business intelligence, marketing, and product development within manufacturing companies. Respondent filters selected only mid- to senior-level manufacturing executives within the functions mentioned, nationally distributed. In keeping with this paper's theme of data capture and integration within the


manufacturing environment, figures 20 and 21 illustrate selected survey demographics in dashboard form.

Details of note: the Northeast USA represents 17% of respondents, the Southeast 28%, the Central Midwest 24%, the North Central Midwest 6%, the South Central 8%, and the West-Pacific regions 18%. Regarding respondent executive-management level, 24% were company presidents or CEOs, 15% other C-suite members or those with senior VP titles, and 61% mid-level executives. Executive management as a function area of responsibility represented 26%, IT 20%, operations 15%, manufacturing and/or production 12%, finance 11%, strategy and planning 6%, business intelligence 4%, marketing 4%, and product development 2%.

The company mix is represented by publicly traded companies at 26% and privately held companies at 74%. Company size was measured by employment and revenue: 13% of respondents stated company employment at less than 100, 35% between 101 and 250, 14% at 501 to 750, 10% at 1,001 to 2,000, 7% at 2,001 to 5,000, and 8% at 5,000 or greater. Regarding revenue size, 34% reported revenues less than $10MM, 13% $11 to $25MM, 10% $51 to $100MM, 8% $101 to $250MM, 7% $251 to $500MM, 8% $501MM to $1B, 7% $1 to $2B, and 8% greater than $2B.

The mix of industry types surveyed was based on the respondent selecting from a drop-down list of 4-digit NAICS codes with corresponding manufacturing descriptions. Food manufacturing represented 16%, followed by Miscellaneous at 13%, Computer and electronics at 9%, and fabricated metals at 7%, with the remaining participant industries listed on the Figure 20 dashboard.


Figure 21 illustrates the types of technologies currently implemented by respondents, with some form of business analytics utilized by 89% of respondents. Of the top five implemented technologies, business analytics ranked first, incorporating social media into analytics second at 67%, wireless network connectivity (IIOT) third at 52%, visual data analytics fourth at 50%, and cyber-security systems fifth at 50%. Overall, the survey asked respondents to select from 17 different technologies, all of which are high data-generating sources of information used in decision-making.

Overall, the sample population of this research fairly represents the industry mix found across the USA by employment and revenue size, in context with levels of executive management and function-level responsibility.



Figure 20: Survey demographics dashboard


Figure 21: Demographics dashboard, technologies implemented

Figure 20 and Figure 21 graphics created in the data software program 'Tableau' by Blaine Stout, licensed to Blaine Stout.


4.2 Research model on data-analytics on organization outcomes

Recalling figure 18, as established in the case studies, the research model is re-illustrated with the proposed relationships among the variables.

Figure 22: Research Model

[Figure: research model diagram. Data-centric pressures (performance, competitive, innovation/technology, and cyber-security) drive data-centric integration practices (strategic level, operation level, IIOT level, and data-security level), which drive data-centric actuated processes (productivity, planning, innovation, and data governance), which in turn drive organization performance outcomes (organization capabilities growth, market capabilities growth, and innovation impact). Executive management influence (EMI) and data accessibility and use mechanisms (DAUM) moderate each linkage.]


4.3 Item generation and pre-testing

Examining data-centric pressures, data-centric integrated practices, data-centric actuated processes, and relative organization performance outcomes is fundamental to this research. In the survey instrument, developing scales and measures that provide reliability and validity is critical to research quality, and the same rigor performed in the case studies is applied in this phase. The literature is replete with discussions of measurement characteristics, the primary ones under review being construct validity (the effectiveness of the instrument in describing discrete domains), convergent validity (agreement among construct measures), discriminant validity (non-agreement among construct measures), predictive validity (the performance of measures in predicting outcomes), and reliability (consistency of measurement of the variables of interest), conventionally assessed with Cronbach's alpha, where the acceptable minimum threshold is 0.70 for preliminary research, moving to 0.80 for basic research (Nunnally, 1978; Peterson, 2013).
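For reference, Cronbach's alpha is straightforward to compute from a respondents-by-items matrix; a minimal sketch (Python/NumPy, hypothetical data) follows.

```python
# A minimal sketch of the Cronbach's alpha reliability statistic referenced
# above, computed from a respondents x items matrix for one construct.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]                              # number of items
    item_var = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the summed scale
    return k / (k - 1) * (1 - item_var / total_var)

# Alpha >= 0.70 would meet the preliminary-research threshold cited here.
```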

Construct validity was fashioned first through the grounded theory of the case-study research and then augmented with the literature reviewed; of note, the dominant influence on the survey instrument development derived from the case studies. Content validity was secured through interpretation of the case studies on the topics of interest as illustrated in figure 18, matching statements on the topics with references on how those topics may be measured, as illustrated in chapter two. Item definitions and measures were taken from first-order concepts, aggregated to second-order themes, and then aggregated to dimensions or discrete domains. The initial pilot questionnaire was developed on the topic of data-centric pressures comprising three dimensions:


technology with 6 items (changed to innovation in the final survey instrument), performance with 6 items, and competitive with 6 items. Data-centric integration practices comprised strategic-level data (5 items), operation-level data (5 items), IIOT-level data (6 items), and cyber-security (later changed to data-security level; 6 items). Data-centric influences comprised executive management (6 items) and data accessibility and use mechanisms (6 items). Key data-centric actuated processes were productivity (5 items), planning (5 items), innovation (5 items), and cyber-security (5 items; later changed to data governance). Organization performance related to organization growth (6 items), market growth (5 items), and innovation growth (5 items). A seven-point Likert scale was employed: 1 strongly disagree, 2 disagree, 3 somewhat disagree, 4 neither agree nor disagree, 5 somewhat agree, 6 agree, and 7 strongly agree.

The survey was iteratively reviewed with this research's academic mentor on three occasions, each meeting refining the questions based on feedback from case-study participants. The final pilot revision involved feedback from two case-study participants upon taking the survey, with follow-up on their responses to resolve any question-comprehension issues (prior versions had six respondent feedbacks; a researcher has to be conscious of wearing out one's welcome in repeatedly reaching out to willing participants, so caution is advised to constrain the activity and make each interaction as value-added as possible to maintain a favorable relationship). The preliminary pilot was distributed to 21 participants working in manufacturing settings regionally local to the researcher. After review with the research mentor and another professor familiar with managing a Qualtrics survey event, the question structure appeared problematic, in part due to a lack of commonality in


opening phrases within topic groupings, topic-terminology consistency, and question length; it was also advised to provide 'i.e.' or 'e.g.' parenthetical references to add clarity to questions.

The majority of changes made in the final pilot version centered on wording and on grouping items to elevate their measurability against the intended construct. Each section of the survey began with "In this section, the research examines the [discrete domain construct name] placed on management to become a data-driven organization". Some items also included parenthetical remarks to help define the question, with the intent of further reducing confusion. The number of items did not materially change, with the exception of adding a few to arrive at 98 items, excluding those demographic in nature. Section 4.5 details the domain constructs and items.

4.4 Final pilot-study

As mentioned in section 4.1, Qualtrics was engaged to assist with the survey and to contract survey panelists. Two pre-launch meetings were held to determine survey layout and respondent filtering. The opening page of the survey began with items related to role, function, and industry type to serve as filters, and asked respondents, upon agreeing to take the survey, to answer the questions with utmost honesty and diligence. A soft launch was scheduled to gather 60 respondents.

In addition, the pilot survey was sent to randomly chosen names who received and accepted a LinkedIn invitation. In total, 81 responses were received, a sufficient sample size to conduct a preliminary audit and factor analysis. The soft-launch results were reviewed prior to acceptance from Qualtrics, filtering out unsuitable responses not caught during the initial filter stage; nine were replaced with more qualified respondents. Upon review of the final pilot, a few wording changes, more grammatical in nature, were made affecting 10 questions. A preliminary principal component analysis was conducted, and most groupings aligned as expected; bearing in mind that the sample size is small compared to the number of items in the survey, the analysis at this stage serves only as a guide for finalizing the instrument.

4.5 Methods and Analysis

Section 4.1 details the sample population of the survey research. In this section, we review the results from two perspectives: multiple regression and structural equation modeling. With the aid of SPSS 25, a principal components analysis with varimax rotation was used to aggregate the items under their relative constructs and domains as mentioned in section 4.3. The minimum loading was set at 0.5; items thought to be related to one construct but loading on another were evaluated to determine whether they should be moved to that construct. This occurred in two instances. First, the definition of strategic-level data contains primarily externally sourced data in context with some internal-source data; IIOT5 loaded on SLD at .60, and even though this is below a .70 threshold, the item was kept aggregated with SLD because it integrates an internal data source with external ones when used for strategic decision-making. Second, three items related to data-centric pressures and specifically focused on cyber security loaded as a fourth construct (originally these items were embedded in the other three dimensions); upon review, it was determined allowable to keep this new construct,


given that the data-security and data-governance constructs examined in other domains of the model are natural extensions of cyber pressures.
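Readers replicating this extraction outside SPSS could use the open-source factor_analyzer package; the sketch below is illustrative only, with the DataFrame name `survey` and the four-factor setting assumed rather than taken from the SPSS session.

```python
# A sketch of the PCA/varimax extraction step described above, using the
# open-source `factor_analyzer` package; `survey` (a DataFrame of Likert-coded
# items) and n_factors=4 are assumptions for illustration.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

_, kmo_total = calculate_kmo(survey)            # sampling adequacy (KMO)
fa = FactorAnalyzer(n_factors=4, rotation="varimax", method="principal")
fa.fit(survey)
loadings = pd.DataFrame(fa.loadings_, index=survey.columns)

# Items whose maximum absolute loading falls below the 0.50 threshold
# would be candidates for removal, mirroring the screen applied here.
weak_items = loadings[loadings.abs().max(axis=1) < 0.50].index.tolist()
```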

A dimension reduction was performed using principal components analysis with varimax rotation at a minimum threshold of 0.50 (described in section 4.4); per Hair et al. (2010, p. 686), "a good rule of thumb for convergent validity is that standardized loading estimates should be .5 or higher ... ideally .7 or higher ... statistically significant ... items converging on the latent construct, with the average variance extracted (AVE) > .50 is considered adequate" for analysis. All constructs achieved average factor loadings > .70, with the exception of one at > .60 (competitive pressure). AVE for all constructs was > .50, with the exception of one at > .40 (competitive pressure); even though loadings on competitive pressure were lower, it was determined to be an important latent construct given its relevance to data-centric pressures as ascribed in the case studies and literature review. A supervised principal components technique was performed on the items and aligned with the identified discrete domains, those aggregated dimensions established in the case studies; "when the task is regression or classification, it would be preferable to project the explanatory [independent] variables along the directions that are related to the response [dependent] variables" (Barshan et al., 2010, p. 2). The reasoning is to isolate and compact those items representative of the construct variable and most relevant to the regression analysis performed. For example, items related to pressures were gathered under one domain and dimensionally reduced to keep only those with factor scores above 0.50; in actuality, each of the construct variables averaged a factor score of 0.70, with the exception of one at > .60 as previously discussed.
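Because AVE recurs throughout the tables that follow, its computation is worth making explicit: it is the mean of the squared standardized loadings of a construct's retained items. A one-function sketch (Python; the example loadings are illustrative, not taken from the tables):

```python
# AVE as used in Hair et al.'s convergent-validity rule: the mean of the
# squared standardized loadings of the items retained on one construct.
import numpy as np

def average_variance_extracted(loadings) -> float:
    loadings = np.asarray(loadings, dtype=float)
    return float(np.mean(loadings ** 2))

# Illustrative loadings only (not taken from the tables below):
print(average_variance_extracted([0.73, 0.81, 0.70, 0.59]))  # ~0.51, above .50
```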


Cronbach's α tests for reliability (high construct reliability indicates all item measures consistently represent the same latent construct; Hair et al., 2010, p. 687) showed 2 latent constructs with α > .90, 12 with α > .80, and 3 with α > .70; α > .70 is considered acceptable for preliminary and basic research (Nunnally, 1967, 1978; Kaplan and Saccuzzo, 1982).

Variable dimension reduction follows the survey format itself: each section was divided along a topic of research interest, informing the survey respondent of the topic's definition, followed by related items on that topic. Doing so was advised to reduce respondent frustration when completing a many-item survey, valuing respondents' time and aiding item interpretation with the intent of minimizing confusion. In addition, the outcome variables were consolidated into those representing current organization capabilities growth; planned organization capabilities growth (serving as a proxy for market growth, noting that the impact of data use on current capabilities drives future investments into technologies that improve competitive advantage); current cost improvements (serving as a proxy for innovation, noting that data use provides insights leading to process improvements materializing in cost savings and/or improved production); and planned cost improvements. The scales used on several items varied, with some questions longitudinal, asking the respondent to assess a metric measure rather than selecting from a range of disagree-agree options. When aggregating items to reflect a construct, items were culled based on time-frame: 'current' represented questions leading with 'in the past three years', and 'planned' those leading with 'in the next three years'. Additionally, response structures within a construct may differ. In other words, questions asking a respondent to select from a range of 'strongly disagree to strongly agree' choices may be included with response selections asking whether some measure has 'fallen by more than 5% to climbed more than 5%', or 'declined by 5% or more to grown by 5% or more', or 'negative growth to growth of 10% or more'. At face value, this research notes this may be somewhat problematic; however, outcome items were grouped according to time relevance and each response selection contained 7 choices, so all questions are analyzed on the same numbered scale, mitigating over- and under-weighting when performing the PCA dimension reduction.

The following tables contain the construct variables and measurement items, inclusive of factor loadings, communalities, and reliability assessments, as used in the survey. Each table also lists the items not retained in the analysis. Only the retained items were used when conducting the structural equation modeling.



4.5.1 Principal Components

Table 22: Data-centric pressures factors and scores

| Data-centric pressure items | Loading | Communality |
| --- | --- | --- |
| Performance Pressure (α = .856, avg loading .722, AVE = .521) | | |
| PP1 Our company evaluates management using strategy-linked performance indicators | .73 | .77 |
| PP2 Our company closely monitors operation processes against performance expectations | .81 | .79 |
| PP3 Our company allocates resources according to performance outcomes | .70 | .66 |
| PP6 Our company reduces employee task-performance frustrations with data-analytics (i.e., technologies that make tasks more productive and efficient) | .59 | .61 |
| Cyber-security Pressure (α = .856, avg loading .752, AVE = .565) | | |
| CYBERP1 Our company adopts the latest cyber-threat technologies for security innovation | .70 | .75 |
| CYBERP2 Our company sets high cyber-security goals to meet stakeholder performance requirements | .79 | .80 |
| CYBERP3 Our company exceeds industry competitors in cyber-security protection | .76 | .82 |
| Innovation Pressure (α = .832, avg loading .722, AVE = .521) | | |
| IP3 Our company creates manufacturing innovation through advanced technologies (e.g., computer-integrated systems, machine-learning cognitive systems, CAD-CAE, flexible manufacturing) | .62 | .65 |
| IP4 Our company tests new digital technologies for innovation goals (e.g., iPads, wearable technologies, smart devices) | .82 | .82 |
| IP5 Our company depends on DA investments for creating innovation (i.e., in terms of 'RODA', return on data investments, the outcome value of innovations) | .72 | .78 |
| Competitive Pressure (α = .895, avg loading .632, AVE = .400) | | |
| CP1 Our company uses competitive benchmarking data | .51 | .61 |
| CP2 Our company employs advanced technologies to stay competitive | .69 | .77 |
| CP3 Our company is competitive using data-analytics to support customer buying decisions | .50 | .69 |
| CP5 Our company remains competitive on changing industry trends (e.g., adapting business and production models to changing environments) | .83 | .81 |

Items removed due to low factor loadings: PP5_Q17_11 Our company performs in compliance with government regulatory performance requirements; CP4_Q17 Our company improves its competitive position through data-analytics; IP2_Q2 Our company uses data-analytic capabilities for product innovation (i.e., analytics that improve existing products or bring about new products); IP1_Q1 Our company recognizes data-analytics as critical for process innovation (i.e., analytics to reduce process complexities, reveal value-added processes).

Extraction method: principal component analysis; rotation method: varimax with Kaiser normalization; KMO = .946.


Table 23: Data-centric integration practice items

| Data-centric integration practice items | Loading | Communality |
| --- | --- | --- |
| Strategic Level (α = .809, avg loading .728, AVE = .531) | | |
| SLD1 Our company accesses external data for long-term objectives (e.g., context-related data from sources outside the company) | .68 | .66 |
| SLD2 Our company depends on supplier data for making source-material changes | .81 | .71 |
| SLD3 Our company relies on leading indicators to support strategic decisions | .83 | .75 |
| SLD6_IIOT3 Our company monitors production performance (i.e., data on production activities can be readily accessed, analyzed, and acted upon) | .60 | .59 |
| Operation Level (α = .839, avg loading .716, AVE = .512) | | |
| OLD1 Our company routinely reviews business-process data against customer objectives (e.g., data from order, planning, and control tasks that affect response to customer needs) | .62 | .59 |
| OLD2 Our company provides data in real-time to identify order-fulfillment problems | .65 | .59 |
| OLD3 Our company relies on machine-data to reveal quality issues | .70 | .59 |
| OLD4 Our company regularly reviews production-process data to improve efficiencies (e.g., data from production tasks, internal and external, that affect production output) | .79 | .74 |
| OLD5 Our company uses a standardized problem-analysis method across the organization (e.g., formalized steps to identify and solve a problem, recommending solutions) | .83 | .72 |
| KMO = .921 | | |
| Data-security Level (α = .910, avg loading .772, AVE = .595) | | |
| DSL1 Our company communicates data-security issues through organization-wide meetings | .64 | .65 |
| DSL2 Our company actively works to prevent cyber-hacking attempts | .88 | .84 |
| DSL3 Our company installed strong data-breach mechanisms | .87 | .82 |
| DSL4 Our company has a senior position for managing data-security risks | .74 | .72 |
| DSL5 Our company provides stakeholder updates on data-security performance | .74 | .72 |
| IIOT Level (α = .818, avg loading .740, AVE = .547) | | |
| IIOT1 Our company has adopted a cloud-based data-connectivity management system (i.e., ERP, MES, PLM, and devices are interconnected so data can be collectively stored, retrieved, and analyzed) | .66 | .56 |
| IIOT2 Our company uses mobile devices to notify of equipment interruptions (i.e., remote machine monitoring) | .72 | .62 |
| IIOT4 Our company controls inventory with smart technologies (e.g., RFID, bar-code and/or QR-code technologies, Bluetooth technologies) | .78 | .66 |
| IIOT5 Our company applies factory simulations for production efficiencies (e.g., to study machine connectivity, manufacturing processes) | .79 | .72 |
| KMO = .905 | | |

Items removed due to low factor loadings: SLD4_Q42 Our company places great reliance on 'gut feeling' in planning decisions; SLD5_Q43 Our company uncovers strategic business opportunities with data-analytics.

Extraction method: principal component analysis; rotation method: varimax with Kaiser normalization.


Table 24: Data-centric actuated process items

| Data-centric actuated process items | Loading | Communality |
| --- | --- | --- |
| Productivity (α = .852, avg loading .761, AVE = .579) | | |
| PAP1 Our company uses labor data to monitor safety performance (e.g., data concerning employee welfare in the manufacturing environment) | .85 | .72 |
| PAP2 Our company surveys customers for satisfaction data | .68 | .72 |
| PAP3 Our company analyzes customer segments for improving profit performance | .80 | .62 |
| PAP4 Our company uses industry data for comparing productivity performance | .72 | .76 |
| Planning (α = .732, avg loading .714, AVE = .510) | | |
| PLAP2 Our company senses trends with social media | .90 | .83 |
| PLAP3 Our company anticipates challenges in securing talent with the right skill-set (i.e., personnel with critical-thinking traits, technical knowledge, business mind-set, and analytical skills) | .66 | .64 |
| PLAP4 Our company plans future products based on customer input | .58 | .55 |
| KMO = .886 | | |
| Data Governance (α = .823, avg loading .774, AVE = .600) | | |
| DGAP1 Our company authorizes data-access privileges for security (i.e., assigned levels of data access) | .74 | .46 |
| DGAP2 Our company excels against competitors in preventing data-breaches | .76 | .64 |
| DGAP3 Our company communicates on data-breach audits | .80 | .71 |
| DGAP4 Our company uses cloud-based data-storage systems for security | .74 | .54 |
| DGAP5 Our company upgrades cyber-security technology infrastructure | .83 | .70 |
| Innovation (α = .879, avg loading .713, AVE = .508) | | |
| IAP1 Our company develops prototypes with data-analytics (i.e., could be a process or product, measured against analyzed outcomes) | .56 | .67 |
| IAP2 Our company integrates new process ideas from employees | .74 | .69 |
| IAP3 Our company develops new products with customer ideas | .84 | .72 |
| IAP4 Our company compares competitors' products | .68 | .58 |
| IAP5 Our company innovates with cross-function ideas | .74 | .74 |
| KMO = .907 | | |

Items removed due to low factor loadings: PAP5_Q64 Our company utilizes waste materials for economic value (e.g., re-purpose, re-manufacture, re-cycle); PLAP1_Q65 Our company plans material costs with commodity data; PLAP5_Q59 Our company prepares for government policy changes (i.e., awareness of effects on the company, deliberating the impact and acting accordingly).

Extraction method: principal component analysis; rotation method: varimax with Kaiser normalization.


Table 25: Data-centric management influence and data use mechanisms

| Data-centric influence and mechanism items | Loading | Communality |
| --- | --- | --- |
| Executive Management Influence (α = .945, avg loading .756, AVE = .571) | | |
| EMI1 Our senior management leads on implementing data-analytic initiatives (i.e., the commitment to data-analytics use is felt at all organization levels) | .81 | .74 |
| EMI2 Our senior management allocates significant resources for data-analytic technologies (e.g., financial, people, organization, and physical resources annually allocated) | .80 | .77 |
| EMI3 Our senior management values data-analytics as important to decision-making | .81 | .78 |
| EMI4 Our senior management expects cross-function area problem-solving efforts (i.e., data-sharing among cross-function areas) | .77 | .71 |
| EMI5 Our senior management supports changing roles of data-technologies (i.e., technology influence on dynamic changes affecting the organization and its operation) | .75 | .72 |
| EMI6 Our senior management recruits talent with business-analytic skills | .68 | .68 |
| EMI7 Our senior management's investments in data-analytic technologies are achieving high ROI | .72 | .71 |
| EMI8 Our company's president and/or CEO sets the tone for using data-analytic technologies (i.e., leads by example, uses data-analytics and expects data-supported recommendations) | .71 | .67 |
| Data Accessibility and Use Mechanisms (α = .850, avg loading .704, AVE = .500) | | |
| DAUM1 Our company enables employee access to job-related data (i.e., to aid in decision-making, goal achievement) | .63 | .48 |
| DAUM2 Our company captures the right data at the right time for decision-making (i.e., data in context with the need, delivered in real or near-time) | .66 | .68 |
| DAUM3 Our company relies on data accessed from automated-generating systems (i.e., data that is self-delivered to the user, without manual intervention) | .69 | .59 |
| DAUM4 Our company provides user-training on information-system technologies | .80 | .73 |
| DAUM5 Our company uses data-visualizations to tell progress stories (e.g., dashboards, graphical data illustrations, data story boards) | .74 | .67 |

Items removed due to low factor loadings: DAU6_Q25 Our company provides data-security on mobile devices (e.g., iPads, laptops, smart-phones).

Extraction method: principal component analysis; rotation method: varimax with Kaiser normalization; KMO = .969.


Table 26: Organization capabilities growth

| Organization capabilities growth items | Loading | Communality |
| --- | --- | --- |
| Current Organization Capabilities Growth (α = .756, avg loading .711, AVE = .505) | | |
| COCG1 In the past three years, our company employment has grown | .76 | .58 |
| COCG2 In the past three years, our company's number of customers has grown | .75 | .56 |
| COCG3 In the past three years, our company's current product lines have shown growth | .71 | .51 |
| COCG4 In the past three years, our company has added advanced manufacturing capabilities | .71 | .50 |
| COCG5 In the past three years, our company's growth is attributable to acquisitions | .62 | .39 |
| KMO = .791 | | |
| Planned Organization Capabilities Growth (α = .836, avg loading .741, AVE = .549) | | |
| POCG1 In the next three years, our company expects its customer-base to grow | .79 | .63 |
| POCG2 In the next three years, our company plans on adding more technology-skilled employees | .75 | .57 |
| POCG3 In the next three years, our company expects to expand customized product offerings | .75 | .56 |
| POCG4 In the next three years, our company plans on adding more advanced manufacturing capabilities | .73 | .53 |
| POCG5 In the next three years, our company will grow by acquiring new market segments | .73 | .53 |
| POCG6 In the next three years, our company's core businesses are expected to show growth | .70 | .49 |
| KMO = .846 | | |
| Current Cost Improvements (α = .774, avg loading .771, AVE = .594) | | |
| CCI1 In the past three years, our company's labor portion of manufacturing costs has improved by (increase or decrease range, 1-5% lesser or greater) | .73 | .53 |
| CCI3 In the past three years, our company's raw material cost per unit has improved by (increase or decrease range, 1-5% lesser or greater) | .79 | .63 |
| CCI4 In the past three years, our company's energy consumption per unit has improved by (increase or decrease range, 1-5% lesser or greater) | .81 | .66 |
| CCI5 In the past three years, our company's annual unit-production has improved (increase or decrease range, 1-5% lesser or greater) | .75 | .57 |
| KMO = .781 | | |

Items not used in the analysis: CCI2 In the past three years, our company's total per-unit cost has improved; FIN1 In the past three years, our company has averaged a ROA of:; PCI1 In the next three years, our company's average profit per customer is expected to:; PCI2 In the next three years, our company's production-labor cost per unit is expected to improve; PCI3 In the next three years, our company's planned manufactured units is expected to; PCI4 In the next three years, our company adopts new production methods to reduce unit


4.6 Multiple regression model and results

A primary insight derived from the case studies is the influence of management on investing in technologies and ensuring their use to build organization capabilities and organization value. A priori, managers wish to succeed, and are ambitious for success, when they have made capital decisions that affect the sustainability of the organization.

The first stage of analyzing the data from the large-scale survey is to view the effects of domain constructs on one another. With 98 items and 57 demographic items, performing a regression on each item is a vastly time-consuming task; hence the use of dimension reduction (as discussed earlier) and saving the reduced factors as regression variables simplifies the task. In this manner, the characteristics of the items within each construct are retained and provide a means to measure the effect size of the relationship between dependent and independent variables. In performing a regression analysis, the researcher must be aware of the study sample size, variation among variables, scale type, and collinearity (Mooi et al., 2018).

The sample size of this research's study (n = 333) is sufficient for this type of analysis; variation is satisfied through dimension reduction; on scale type, all items are interval scaled; and collinearity arises where two independent variables are highly correlated. Convention measures collinearity with the variance inflation factor (VIF = 1/(1-R²)), interpreted in light of the sample size; a generally accepted guideline is that collinearity is not present at values < 10, and the closer the value is to 1, the smaller the collinearity effect. PCA works to reduce collinearity: since each factor is composed of the original variables, collinearity among those original items no longer exists.
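A sketch of this collinearity check outside SPSS (Python/statsmodels; `predictors` is a hypothetical DataFrame holding the saved factor-score variables):

```python
# A sketch of the VIF collinearity screen described above; `predictors` is
# a hypothetical DataFrame of the saved factor-score regression variables.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = sm.add_constant(predictors)
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
# Values well below 10 (ideally near 1) indicate collinearity is not a concern.
print(vif.drop("const"))
```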


"The regression model should be simple, yet complete" (Mooi et al., 2018, p. 222). With these thoughts in mind, the research examines, by domain, the effect of independent construct variables on dependent construct variables. Each relationship between an independent and dependent variable is moderated first by executive management influence (EMI) and second by data accessibility and use mechanisms (DAUM). A brief discussion of moderation and mediation is in order. A moderating variable interacts with an independent and a dependent variable, modifying the strength of the relationship. The moderating variable can also serve as a mediator; in this regard, the two interacting variables (EMI and DAUM) can be viewed as moderated-mediation and mediated-moderation effects (Baron and Kenny, 1986) on the independent and dependent variable. Mediated moderation and moderated mediation conventionally involve four variables: a moderator, a mediator, an independent variable, and a dependent variable.
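One conventional way to operationalize this stepwise moderation comparison is sketched below (Python/statsmodels; this is illustrative, not the SPSS procedure itself, and the column names PP, SLD, EMI, and DAUM denote hypothetical factor scores in a DataFrame named `scores`).

```python
# A sketch of the stepwise moderation comparison reported in the tables below;
# `scores` and its column names are assumptions for illustration.
import statsmodels.formula.api as smf

base = smf.ols("SLD ~ PP", data=scores).fit()                    # unmoderated
emi  = smf.ols("SLD ~ PP * EMI", data=scores).fit()              # EMI moderation
both = smf.ols("SLD ~ PP * EMI + PP * DAUM", data=scores).fit()  # EMI + DAUM

for model, label in [(base, "none"), (emi, "EMI"), (both, "EMI+DAUM")]:
    print(f"{label:8s} adj. R^2 = {model.rsquared_adj:.2f}, F = {model.fvalue:.2f}")
```

Comparing adjusted R² across the three fits shows whether the moderators strengthen the linkage, which is the pattern the tables below summarize.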

The first series of analyses takes pressures as the independent variables and data-centric practices as the dependent variables. The second series takes data-centric practices as the independent variables and data-centric processes as the dependent variables. The third series takes data-centric processes on organization capabilities outcomes. To gauge effect size, this research uses Cohen's benchmarks for multiple regression (Cohen, 1988, pp. 413-414; Ellis, 2010, p. 41): an R² of .02 to .12 represents a small effect, .13 to .25 a medium effect, and .26 or greater a large effect. The discussion presents, for each linkage, a table summarizing the multiple regression followed by a brief report of the results. To avoid repetitive opening narrative of the form "a multiple regression was carried out to investigate whether data-centric adoptive drivers moderated by EMI and DAUM could significantly influence the linkage on integrated data practices", the reader can assess the regression effects from the tables. Reporting focuses on the interaction of EMI and DAUM as moderators of the linkage between data-centric adoption pressures and data-integration practices.
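Expressed as a lookup, the effect-size bands applied in the discussion below are (the function name is illustrative):

```python
# The effect-size bands applied below (Cohen, 1988; Ellis, 2010),
# expressed as a small helper; the function name is illustrative.
def cohen_r2_label(r2_adj: float) -> str:
    if r2_adj >= 0.26:
        return "large"
    if r2_adj >= 0.13:
        return "medium"
    if r2_adj >= 0.02:
        return "small"
    return "negligible"

print(cohen_r2_label(0.32))  # "large", e.g. PP on SLD under EMI + DAUM moderation
```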

4.6.1 Pressures on data integration practices regression model

Table 27: Performance pressure on integration practices

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Strategic Level | Performance Pressure | R²a = .17, F(3,329) = 68.35, p < .001 | R²a = .24, F(3,329) = 52.63, p < .001 | R²a = .32, F(3,329) = 53.17, p < .001 |
| Operation Level | Performance Pressure | R²a = .27, F(3,329) = 123.04, p < .001 | R²a = .32, F(3,329) = 80.97, p < .001 | R²a = .46, F(3,329) = 96.02, p < .001 |
| Data-security Level | Performance Pressure | R²a = .14, F(3,329) = 55.48, p < .001 | R²a = .22, F(3,329) = 46.46, p < .001 | R²a = .32, F(3,329) = 52.23, p < .001 |
| IIOT Level | Performance Pressure | R²a = .14, F(3,329) = 55.74, p < .001 | R²a = .24, F(3,329) = 54.24, p < .001 | R²a = .33, F(3,329) = 55.41, p < .001 |

Regarding the linkage of performance pressure (PP) on SLD, the results indicate the model explained 32% of the variance when moderated by EMI (b = .408, p < .001) and DAUM (b = .328, p < .001), significantly predicting SLD integration practices, F(3,329) = 53.17, p < .001. For the linkage of PP on OLD, 46% of the variance significantly predicts OLD integration practices, F(3,329) = 96.02, p < .001; EMI (b = .400, p < .001), DAUM (b = .422, p < .001). For the linkage of PP on DSL, 32% of the variance significantly predicts DSL integration practices, F(3,329) = 52.23, p < .001; EMI (b = .431, p < .001), DAUM (b = .360, p < .001). For the linkage of PP on IIOT, 33% of the variance significantly predicts IIOT integration practices, F(3,329) = 55.41, p < .001; EMI (b = .476, p < .001), DAUM (b = .335, p < .001).


Table 28: Innovation pressure on integration practices

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Strategic Level | Innovation Pressure | R²a = .03, F(3,329) = 12.15, p < .001 | R²a = .20, F(3,329) = 42.34, p < .001 | R²a = .32, F(3,329) = 53.49, p < .001 |
| Operation Level | Innovation Pressure | R²a = .14, F(3,329) = 53.35, p < .001 | R²a = .29, F(3,329) = 68.96, p < .001 | R²a = .46, F(3,329) = 94.19, p < .001 |
| Data-security Level | Innovation Pressure | R²a = .08, F(3,329) = 28.24, p < .001 | R²a = .22, F(3,329) = 46.88, p < .001 | R²a = .32, F(3,329) = 52.58, p < .001 |
| IIOT Level | Innovation Pressure | R²a = .15, F(3,329) = 57.87, p < .001 | R²a = .30, F(3,329) = 70.77, p < .001 | R²a = .35, F(3,329) = 61.91, p < .001 |

Regarding the linkage of innovation pressure (IP) on SLD, the results indicate the model explained 32% of the variance when moderated by EMI (b = .463, p < .001) and DAUM (b = .381, p < .001), significantly predicting SLD integration practices, F(3,329) = 53.49, p < .001. For the linkage of IP on OLD, 46% of the variance significantly predicts OLD integration practices, F(3,329) = 94.19, p < .001; EMI (b = .453, p < .001), DAUM (b = .443, p < .001). For the linkage of IP on DSL, 32% of the variance significantly predicts DSL integration practices, F(3,329) = 52.58, p < .001; EMI (b = .425, p < .001), DAUM (b = .347, p < .001). For the linkage of IP on IIOT, 35% of the variance significantly predicts IIOT integration practices, F(3,329) = 61.91, p < .001; EMI (b = .429, p < .001), DAUM (b = .267, p < .001).



Table 29: Competitive pressure on integration practices

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Strategic Level | Competitive Pressure | R²a = .15, F(3,329) = 59.94, p < .001 | R²a = .23, F(3,329) = 50.19, p < .001 | R²a = .32, F(3,329) = 52.53, p < .001 |
| Operation Level | Competitive Pressure | R²a = .33, F(3,329) = 166.67, p < .001 | R²a = .37, F(3,329) = 99.87, p < .001 | R²a = .47, F(3,329) = 100.6, p < .001 |
| Data-security Level | Competitive Pressure | R²a = .10, F(3,329) = 39.43, p < .001 | R²a = .20, F(3,329) = 42.26, p < .001 | R²a = .33, F(3,329) = 54.46, p < .001 |
| IIOT Level | Competitive Pressure | R²a = .21, F(3,329) = 90.15, p < .001 | R²a = .28, F(3,329) = 69.97, p < .001 | R²a = .33, F(3,329) = 57.28, p < .001 |

Regarding the linkage of competitive pressure (CP) on SLD, the results indicate the model explained 32% of the variance when moderated by EMI (b = .449, p < .001) and DAUM (b = .358, p < .001), significantly predicting SLD integration practices, F(3,329) = 52.53, p < .001. For the linkage of CP on OLD, 47% of the variance significantly predicts OLD integration practices, F(3,329) = 100.64, p < .001; EMI (b = .366, p < .001), DAUM (b = .443, p < .001). For the linkage of CP on DSL, 33% of the variance significantly predicts DSL integration practices, F(3,329) = 54.46, p < .001; EMI (b = .506, p < .001), DAUM (b = .426, p < .001). For the linkage of CP on IIOT, 33% of the variance significantly predicts IIOT integration practices, F(3,329) = 57.28, p < .001; EMI (b = .412, p < .001), DAUM (b = .278, p < .001).



Table 30: Cyber-security pressure on integration practices

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Strategic Level | Cyber-security Pressure | R²a = .05, F(3,329) = 18.80, p < .001 | R²a = .20, F(3,329) = 42.70, p < .001 | R²a = .32, F(3,329) = 52.90, p < .001 |
| Operation Level | Cyber-security Pressure | R²a = .12, F(3,329) = 47.72, p < .001 | R²a = .26, F(3,329) = 60.85, p < .001 | R²a = .45, F(3,329) = 92.27, p < .001 |
| Data-security Level | Cyber-security Pressure | R²a = .33, F(3,329) = 167.81, p < .001 | R²a = .40, F(3,329) = 110.77, p < .001 | R²a = .44, F(3,329) = 89.18, p < .001 |
| IIOT Level | Cyber-security Pressure | R²a = .07, F(3,329) = 25.70, p < .001 | R²a = .23, F(3,329) = 51.51, p < .001 | R²a = .33, F(3,329) = 55.41, p < .001 |

Regarding the linkage of cyber-security pressure (CYBP) on SLD, the results indicate the model explained 32% of the variance when moderated by EMI (b = .460, p < .001) and DAUM (b = .370, p < .001), significantly predicting SLD integration practices, F(3,329) = 52.90, p < .001. For the linkage of CYBP on OLD, 45% of the variance significantly predicts OLD integration practices, F(3,329) = 92.27, p < .001; EMI (b = .460, p < .001), DAUM (b = .462, p < .001). For the linkage of CYBP on DSL, 44% of the variance significantly predicts DSL integration practices, F(3,329) = 89.18, p < .001; EMI (b = .300, p < .001), DAUM (b = .231, p < .001). For the linkage of CYBP on IIOT, 33% of the variance significantly predicts IIOT integration practices, F(3,329) = 55.41, p < .001; EMI (b = .412, p < .001), DAUM (b = .278, p < .001).



4.6.2 Practices on key processes regression model

Table 31: Strategic level data integration on key processes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Productivity Actuated Processes | Strategic Level Integration | R²a = .24, F(3,329) = 106.45, p < .001 | R²a = .36, F(3,329) = 96.03, p < .001 | R²a = .51, F(3,329) = 114.8, p < .001 |
| Planning Actuated Processes | Strategic Level Integration | R²a = .09, F(3,329) = 35.31, p < .001 | R²a = .15, F(3,329) = 30.29, p < .001 | R²a = .15, F(3,329) = 22.07, p < .001 |
| Data Governance Processes | Strategic Level Integration | R²a = .19, F(3,329) = 79.60, p < .001 | R²a = .31, F(3,329) = 76.64, p < .001 | R²a = .39, F(3,329) = 72.52, p < .001 |
| Innovation Actuated Processes | Strategic Level Integration | R²a = .14, F(3,329) = 54.98, p < .001 | R²a = .20, F(3,329) = 42.22, p < .001 | R²a = .29, F(3,329) = 46.42, p < .001 |

Regarding the linkage of strategic-level data integration (SLD) on key productivity processes (PAP), the results indicate the model explained 51% of the variance when moderated by EMI (b = .475, p < .001) and DAUM (b = .413, p < .001), significantly predicting key PAP processes, F(3,329) = 114.84, p < .001. For the linkage of SLD on key planning processes (PLAP), 15% of the variance significantly predicts key PLAP processes, F(3,329) = 22.07, p < .001; EMI (b = .295, p < .001), DAUM (b = .121, p < .05). For the linkage of SLD on data governance processes (DGA), 39% of the variance significantly predicts key DGA processes, F(3,329) = 72.52, p < .001; EMI (b = .453, p < .001), DAUM (b = .310, p < .001). For the linkage of SLD on innovation processes (IAP), 29% of the variance significantly predicts key IAP processes, F(3,329) = 46.42, p < .001; EMI (b = .342, p < .001), DAUM (b = .333, p < .001).


Table 32: Operation level data integration on key processes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Productivity Actuated Processes | Operation Level Integration | R²a = .30, F(3,329) = 145.24, p < .001 | R²a = .40, F(3,329) = 110.48, p < .001 | R²a = .51, F(3,329) = 114.1, p < .001 |
| Planning Actuated Processes | Operation Level Integration | R²a = .05, F(3,329) = 19.39, p < .001 | R²a = .13, F(3,329) = 25.05, p < .001 | R²a = .15, F(3,329) = 20.17, p < .001 |
| Data Governance Processes | Operation Level Integration | R²a = .17, F(3,329) = 68.39, p < .001 | R²a = .29, F(3,329) = 70.11, p < .001 | R²a = .38, F(3,329) = 69.19, p < .001 |
| Innovation Actuated Processes | Operation Level Integration | R²a = .17, F(3,329) = 71.13, p < .001 | R²a = .22, F(3,329) = 47.43, p < .001 | R²a = .29, F(3,329) = 46.05, p < .001 |

Regarding the linkage of operation-level data integration (OLD) on PAP, the results indicate the model explained 51% of the variance when moderated by EMI (b = .468, p < .001) and DAUM (b = .393, p < .001), significantly predicting key PAP processes, F(3,329) = 114.14, p < .001. For the linkage of OLD on PLAP, 15% of the variance significantly predicts key PLAP processes, F(3,329) = 20.17, p < .001; EMI (b = .368, p < .001), DAUM (b = .183, p < .005). For the linkage of OLD on DGA, 38% of the variance significantly predicts key DGA processes, F(3,329) = 69.19, p < .001; EMI (b = .510, p < .001), DAUM (b = .355, p < .001). For the linkage of OLD on IAP, 29% of the variance significantly predicts key IAP processes, F(3,329) = 46.05, p < .001; EMI (b = .339, p < .001), DAUM (b = .321, p < .001).



Table 33: Data-security level integration on key processes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Productivity Actuated Processes | Data-security Level Integration | R²a = .25, F(3,329) = 113.79, p < .001 | R²a = .37, F(3,329) = 100.40, p < .001 | R²a = .51, F(3,329) = 116.6, p < .001 |
| Planning Actuated Processes | Data-security Level Integration | R²a = .09, F(3,329) = 31.01, p < .001 | R²a = .14, F(3,329) = 29.02, p < .001 | R²a = .16, F(3,329) = 21.43, p < .001 |
| Data Governance Processes | Data-security Level Integration | R²a = .35, F(3,329) = 181.44, p < .001 | R²a = .43, F(3,329) = 125.36, p < .001 | R²a = .47, F(3,329) = 98.86, p < .001 |
| Innovation Actuated Processes | Data-security Level Integration | R²a = .09, F(3,329) = 32.76, p < .001 | R²a = .17, F(3,329) = 34.46, p < .001 | R²a = .28, F(3,329) = 44.73, p < .001 |

Regarding the linkage of data-security level integration (DSL) on PAP, the results indicate the model explained 51% of the variance when moderated by EMI (b = .468, p < .001) and DAUM (b = .405, p < .001), significantly predicting key PAP processes, F(3,329) = 116.59, p < .001. For the linkage of DSL on PLAP, 16% of the variance significantly predicts key PLAP processes, F(3,329) = 21.43, p < .001; EMI (b = .306, p < .001), DAUM (b = .129, p < .05). For the linkage of DSL on DGA, 47% of the variance significantly predicts key DGA processes, F(3,329) = 98.86, p < .001; EMI (b = .354, p < .001), DAUM (b = .225, p < .001). For the linkage of DSL on IAP, 28% of the variance significantly predicts key IAP processes, F(3,329) = 44.73, p < .001; EMI (b = .393, p < .001), DAUM (b = .374, p < .001).



Table 34: IIOT level integration on key processes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Productivity Actuated Processes | IIOT Level Integration | R²a = .20, F(3,329) = 83.20, p < .001 | R²a = .33, F(3,329) = 42.70, p < .001 | R²a = .50, F(3,329) = 110.4, p < .001 |
| Planning Actuated Processes | IIOT Level Integration | R²a = .09, F(3,329) = 33.26, p < .001 | R²a = .14, F(3,329) = 28.95, p < .001 | R²a = .16, F(3,329) = 21.50, p < .001 |
| Data Governance Processes | IIOT Level Integration | R²a = .20, F(3,329) = 82.20, p < .001 | R²a = .31, F(3,329) = 75.61, p < .001 | R²a = .39, F(3,329) = 72.47, p < .001 |
| Innovation Actuated Processes | IIOT Level Integration | R²a = .10, F(3,329) = 40.01, p < .001 | R²a = .17, F(3,329) = 35.85, p < .001 | R²a = .28, F(3,329) = 44.85, p < .001 |

Regarding the linkage of IIOT-level integration (IIOT) on PAP, the results indicate the model explained 50% of the variance when moderated by EMI (b = .506, p < .001) and DAUM (b = .440, p < .001), significantly predicting key PAP processes, F(3,329) = 110.39, p < .001. For the linkage of IIOT on PLAP, 16% of the variance significantly predicts key PLAP processes, F(3,329) = 21.50, p < .001; EMI (b = .300, p < .001), DAUM (b = .131, p < .05). For the linkage of IIOT on DGA, 39% of the variance significantly predicts key DGA processes, F(3,329) = 72.47, p < .001; EMI (b = .449, p < .001), DAUM (b = .312, p < .001). For the linkage of IIOT on IAP, 28% of the variance significantly predicts key IAP processes, F(3,329) = 44.85, p < .001; EMI (b = .375, p < .001), DAUM (b = .361, p < .001).



4.6.3 Processes on organization outcomes

Table 35: Key productivity actuated processes on organization outcomes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Current Organization Capabilities Growth | Productivity Actuated Processes | R²a = .22, F(3,329) = 96.37, p < .001 | R²a = .26, F(3,329) = 60.78, p < .001 | R²a = .33, F(3,329) = 56.08, p < .001 |
| Planned Organization Capabilities Growth | Productivity Actuated Processes | R²a = .34, F(3,329) = 172.86, p < .001 | R²a = .41, F(3,329) = 114.87, p < .001 | R²a = .47, F(3,329) = 99.58, p < .001 |
| Current Cost Improvements | Productivity Actuated Processes | R²a = .19, F(3,329) = 77.22, p < .001 | R²a = .20, F(3,329) = 43.40, p < .001 | R²a = .23, F(3,329) = 33.88, p < .001 |

Regarding the linkage of PAP on current organization capabilities growth (COCG), the results indicate the model explained 33% of the variance when moderated by EMI (b = .356, p < .001) and DAUM (b = .314, p < .001), significantly predicting COCG outcomes, F(3,329) = 56.08, p < .001. For the linkage of PAP on planned organization capabilities growth (POCG), 47% of the variance significantly predicts POCG outcomes, F(3,329) = 99.58, p < .001; EMI (b = .300, p < .001), DAUM (b = .131, p < .05). For the linkage of PAP on current cost improvements (CCI), 23% of the variance significantly predicts CCI outcomes, F(3,329) = 33.88, p < .001; EMI (b = .232, p < .001), DAUM (b = .199, p < .001).



Table 36: Key planning actuated processes on organization outcomes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Current Organization Capabilities Growth | Planning Actuated Processes | R²a = .15, F(3,329) = 60.24, p < .001 | R²a = .25, F(3,329) = 55.35, p < .001 | R²a = .36, F(3,329) = 63.02, p < .001 |
| Planned Organization Capabilities Growth | Planning Actuated Processes | R²a = .16, F(3,329) = 65.14, p < .001 | R²a = .33, F(3,329) = 84.05, p < .001 | R²a = .47, F(3,329) = 99.75, p < .001 |
| Current Cost Improvements | Planning Actuated Processes | R²a = .06, F(3,329) = 22.76, p < .001 | R²a = .14, F(3,329) = 27.17, p < .001 | R²a = .21, F(3,329) = 30.82, p < .001 |

Regarding the linkage of PLAP on current organization capabilities growth (COCG), the results indicate the model explained 36% of the variance when moderated by EMI (b = .357, p < .001) and DAUM (b = .343, p < .001), significantly predicting COCG outcomes, F(3,329) = 63.02, p < .001. For the linkage of PLAP on planned organization capabilities growth (POCG), 47% of the variance significantly predicts POCG outcomes, F(3,329) = 99.75, p < .001; EMI (b = .471, p < .001), DAUM (b = .379, p < .001). For the linkage of PLAP on current cost improvements (CCI), 21% of the variance significantly predicts CCI outcomes, F(3,329) = 30.82, p < .001; EMI (b = .317, p < .001), DAUM (b = .284, p < .001).



Table 37: Key data governance actuated processes on organization outcomes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Current Organization Capabilities Growth | Data Governance Processes | R²a = .18, F(3,329) = 74.33, p < .001 | R²a = .24, F(3,329) = 53.35, p < .001 | R²a = .33, F(3,329) = 55.87, p < .001 |
| Planned Organization Capabilities Growth | Data Governance Processes | R²a = .23, F(3,329) = 101.28, p < .001 | R²a = .34, F(3,329) = 86.83, p < .001 | R²a = .45, F(3,329) = 92.70, p < .001 |
| Current Cost Improvements | Data Governance Processes | R²a = .07, F(3,329) = 26.81, p < .001 | R²a = .13, F(3,329) = 25.64, p < .001 | R²a = .20, F(3,329) = 29.55, p < .001 |

Regarding the linkage of DGA on current organization capabilities growth (COCG), the results indicate the model explained 33% of the variance when moderated by EMI (b = .369, p < .001) and DAUM (b = .353, p < .001), significantly predicting COCG outcomes, F(3,329) = 55.87, p < .001. For the linkage of DGA on planned organization capabilities growth (POCG), 45% of the variance significantly predicts POCG outcomes, F(3,329) = 92.70, p < .001; EMI (b = .477, p < .001), DAUM (b = .370, p < .001). For the linkage of DGA on current cost improvements (CCI), 20% of the variance significantly predicts CCI outcomes, F(3,329) = 29.55, p < .001; EMI (b = .359, p < .001), DAUM (b = .306, p < .001).



Table 38: Key innovation processes on organization outcomes

| Dependent Variable | Independent Variable | Without Moderation | With EMI Moderation | With EMI + DAUM Moderation |
| --- | --- | --- | --- | --- |
| Current Organization Capabilities Growth | Innovation Actuated Processes | R²a = .22, F(3,329) = 96.04, p < .001 | R²a = .29, F(3,329) = 69.47, p < .001 | R²a = .36, F(3,329) = 63.69, p < .001 |
| Planned Organization Capabilities Growth | Innovation Actuated Processes | R²a = .32, F(3,329) = 154.14, p < .001 | R²a = .43, F(3,329) = 126.21, p < .001 | R²a = .50, F(3,329) = 113.9, p < .001 |
| Current Cost Improvements | Innovation Actuated Processes | R²a = .23, F(3,329) = 100.93, p < .001 | R²a = .26, F(3,329) = 59.28, p < .001 | R²a = .28, F(3,329) = 44.88, p < .001 |

Regarding the linkage of IAP on current organization capabilities growth (COCG), the results indicate the model explained 36% of the variance when moderated by EMI (b = .340, p<.001) and DAUM (b = .291, p<.001), significantly predicting COCG outcomes, F(3,329) = 63.69, p < .001. For the linkage of IAP on planned organization capabilities growth (POCG), the model explained 50% of the variance and significantly predicted POCG outcomes, F(3,329) = 113.96, p < .001; EMI (b = .421, p<.001), DAUM (b = .302, p<.001). For the linkage of IAP on current cost improvements (CCI), the model explained 28% of the variance and significantly predicted CCI outcomes, F(3,329) = 44.88, p < .001; EMI (b = .220, p<.001), DAUM (b = .176, p<.001).

4.6.4 Discussion

Linear regressions were performed on the PCA regression saved variables using the 'enter' method of linear regression available in SPSS 25. Durbin-Watson tests for autocorrelation (the statistic, measuring correlation among adjacent residuals, ranges between 0 and 4) fell within the conventionally accepted range of 1.5 to 2.5 and specifically clustered around 2, indicating the tests have been favorably met; values ≤ 1 or ≥ 3 are problematic (Field, 2013, p. 311). F-ratios are statistically significant throughout the analysis, and R²a clearly demonstrates effect size movement between the independent variable and the dependent variable when EMI and DAUM moderate.
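For readers wishing to replicate the residual diagnostic, a minimal sketch in Python with statsmodels is shown below; the dissertation's analysis was run in SPSS 25, and the file and column names (survey_scores.csv, PLAP, EMI, DAUM, COCG) are hypothetical stand-ins for the PCA-saved regression scores.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

df = pd.read_csv("survey_scores.csv")                # hypothetical factor scores
X = sm.add_constant(df[["PLAP", "EMI", "DAUM"]])     # 'enter' method: all predictors at once
fit = sm.OLS(df["COCG"], X).fit()

dw = durbin_watson(fit.resid)
# The statistic ranges 0-4; values near 2 indicate no autocorrelation
# among adjacent residuals, while <= 1 or >= 3 are problematic (Field, 2013).
print(f"Durbin-Watson = {dw:.2f}:",
      "acceptable" if 1.5 <= dw <= 2.5 else "investigate")
```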

Notable in this series of multiple regression models is evidence in support of a main proposition of this research: the influence of executive management on the adoption of a data-centric culture, through investments in data-technologies, and its impact on organization performance outcomes. The modeling process viewed each domain separately to understand discrete effects among independent and dependent variables prior to and during moderation by EMI and DAUM. Not all results are discussed in detail; suffice to say, in general, the evidence indicates that as the degree of involvement by executive management increases, through allocating resources for data-technologies adoption and organizationally leading the initiatives to do so, data integration increases within the organization. Mechanisms that enable data accessibility and use likewise increase data integration in practices and processes, affecting decision-making and organization outcomes.

A key finding when viewing the relationship between data-centric adoptive pressures and data-integration practices is that most pressures demonstrate small R²a effects prior to moderation, signaling that pressures in and of themselves are inert until executive management is motivated by the pressure to take action to address it. This is visible in the relationship between PP and data-integration practices, where most effect sizes are small (.02 ≤ R²a ≤ .12), indicating management feels the pressure to do something; management involvement through resource allocation on data-technologies increases the effect size of the relationship (.13 ≤ R²a ≤ .25). Albeit, by allocating resources to relieve pressure through data-technologies, the investments made also add pressure on management (who may have championed an investment and is ambitious to see it deliver positive returns) to ensure the technologies are used to produce desired outcomes. EMI requires the tools and mechanisms (DAUM) with which to enable technologies to meet outcome expectations. When DAUM is added to the moderating effect on performance pressures and data-integration practices, the effect size becomes large (R²a ≥ .26), indicating that as management allocates resources, it also provides the tools with which to operationalize those resources. The same stream of logic applies to the relationships of data-integration practices on data-actuated processes, and of data-actuated processes on organization outcomes.
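This baseline-then-EMI-then-EMI+DAUM logic can be operationalized as a nested set of regressions. The sketch below is only an assumed analogue of the SPSS models: the interaction-term specification is one common way to express moderation, and the data file and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_scores.csv")  # hypothetical factor scores

# Three nested specifications mirroring the comparison in Tables 36-38.
specs = {
    "no moderation":         "COCG ~ PLAP",
    "EMI moderation":        "COCG ~ PLAP * EMI",
    "EMI + DAUM moderation": "COCG ~ PLAP * EMI + PLAP * DAUM",
}
prev = 0.0
for name, formula in specs.items():
    fit = smf.ols(formula, data=df).fit()
    # The change in adjusted R^2 across specifications is the effect
    # size movement discussed in the text.
    print(f"{name:22s} adj. R2 = {fit.rsquared_adj:.2f} "
          f"(change {fit.rsquared_adj - prev:+.2f})")
    prev = fit.rsquared_adj
```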

Of the relationships among data-centric pressures on data-integrated practices, PP on OLD moderated by EMI and DAUM demonstrates the largest effect size, R²a = .46, F(3,329) = 96.02, p<.001; without moderation, this relationship remains the largest of the four analyzed, R²a = .27, F(3,329) = 123.04, p<.001. This is representative of the positive and direct proposition between performance pressures and operation-level data integration, as premised in this research. Total effect size change among the other moderated relationships shows PP on SLD, R²a = .32, F(3,329) = 53.17, p<.001, with a change from the non-moderated model of ΔR²a = .15; PP on DSL, R²a = .32, F(3,329) = 52.23, p<.001, ΔR²a = .18; and PP on IIOT, R²a = .33, F(3,329) = 55.41, p<.001, ΔR²a = .19. Moving forward in this discussion, this level of detail will not be repeated; it is provided only to show the moderated changes in effect size. The remainder of the discussion focuses on a few key observations.


IP on data-centric practices illustrated the relationship with OLD as having the largest moderated effect size, R²a = .46, F(3,329) = 94.19, p<.001, with a change from the non-moderated model of ΔR²a = .32. Given the case studies and discussions on innovation, which in the manufacturing environment are most relevant to process improvement and ways to reduce unit costs to sustain competitiveness, this relationship confirms a proposition of this research. Interesting in the relationship between IP and SLD: unmoderated, IP has a small effect size, R²a = .03, F(3,329) = 12.15, p<.001; once moderated, and the pressure to innovate is integrated at the strategic data level, the effect size becomes large, R²a = .32, F(3,329) = 53.49, p<.001, indicating management's recognition of the motivations prompting innovation may become strategic initiatives. CP (R²a = .47, F(3,329) = 100.6, p<.001) and CYBP (R²a = .45, F(3,329) = 92.27, p<.001) also demonstrated large effect sizes on OLD, indicating a natural alignment with OLD, which integrates many forms of data into decision-making as critical to maintaining and sustaining organization capabilities.

Regarding data-integration practices on data-actuated processes: SLD demonstrates the largest moderated effect sizes on PAP, R²a = .51, F(3,329) = 114.8, p<.001, and DGA, R²a = .39, F(3,329) = 72.52, p<.001. As noted in the case studies and literature, productivity is proposed to be significantly related to SLD, given that productivity channels into building organization capabilities and organization performance. This is borne out in the analysis; in addition, DGA presents a top-of-mind and strategic concern for managers, so having data on the movement of data is strategically important when engaging in a digitized, knowledge-intensive environment. OLD likewise demonstrates large moderated effect sizes on PAP, R²a = .51, F(3,329) = 114.1, p<.001, and DGA, R²a = .38, F(3,329) = 69.19, p<.001; since OLD shares the same goals and objectives as SLD, it should demonstrate a similar effect size relationship. DSL, in similar fashion, follows suit in the moderated relationships with PAP, R²a = .51, F(3,329) = 116.6, p<.001, and DGA, R²a = .47, F(3,329) = 98.86, p<.001; albeit the DGA non-moderated relationship is already large, R²a = .35, F(3,329) = 181.44, p<.001. The relationship between DSL and DGA is practical, bridging CYBP and DGA; the relationships among these variables demonstrate a consistently large effect size, R²a ≥ .26. IIOT moves in similar moderated fashion on PAP, F(3,329) = 110.39, p<.001, and DGA, R²a = .39, F(3,329) = 72.47, p<.001. IIOT, given its role as a network and connectivity technology, supports the proposition that PAP and DGA in a data-actuated situation should be closely related: data flows from IIOT into the organization to enable more informed, data-supported decision-making, whereas DGA manages the flow of that data, to whom and by whom, regulating distribution to improve PAP and increase capabilities strength.

POCG is used as a proxy for market growth, on the premise that a company planning to grow capabilities will, in turn, have a growth effect on the markets being served.

Regarding organization outcomes, POCG witnesses the largest moderated effect sizes among the data-actuated processes on organization capabilities growth: PAP moderated on POCG, R²a = .47, F(3,329) = 99.58, p<.001; PLAP on POCG, R²a = .47, F(3,329) = 99.75, p<.001; DGA on POCG, R²a = .45, F(3,329) = 92.70, p<.001; IAP on POCG, R²a = .50, F(3,329) = 113.9, p<.001. COCG was proposed to carry the larger effect sizes in these relationships; however, the moderated effect sizes remain significant, indicating data-actuated processes positively and directly affect organization capabilities: PAP on COCG, R²a = .33, F(3,329) = 56.08, p<.001; PLAP on COCG, R²a = .36, F(3,329) = 63.02, p<.001; DGA on COCG, R²a = .33, F(3,329) = 55.87, p<.001; IAP on COCG, R²a = .36, F(3,329) = 63.69, p<.001. The impact of data-actuated processes on CCI demonstrated the smallest moderated effect sizes. CCI is used as a proxy for innovation, and given prior narratives on innovation being instrumental to process improvement and reducing costs, innovativeness is required to realize these benefits: PAP moderated on CCI, R²a = .23, F(3,329) = 33.88, p<.001; PLAP on CCI, R²a = .21, F(3,329) = 30.82, p<.001; DGA on CCI, R²a = .20, F(3,329) = 29.55, p<.001; IAP on CCI, R²a = .28, F(3,329) = 44.88, p<.001. IAP on CCI demonstrates the largest effect size, and the proposition that data-actuated innovation processes have a positive and direct effect is borne out in this analysis.

All relationships within this multiple regression analysis were proposed as direct and having a positive effect size. With moderation by EMI and DAUM, the research showed the majority of these relationships moved to larger effect sizes, supporting the proposed moderation.

To further this research, viewing the analysis through another method may deepen the understanding of these relationships. Hence, the study employs AMOS 25 to construct a structural equation model.

4.7 Structural model and results

The multiple regression analysis followed the research model as presented in Figure 22: three models within one, data-centric pressures (DCP) on data-integration practices (DIP), data-integration practices on data-actuated processes (DAP), and data-actuated processes on organization capabilities performance (OCP) in the form of capabilities growth. EMI and DAUM moderated each linear relationship among construct variables. To accurately assess this research model on goodness of fit and unidimensionality, it is examined through structural equation modeling (SEM). SEM "takes a confirmatory approach to the analysis of a structural theory … demanding set patterns of intervariable a priori relationships (as established in the case study research model, Figures 18, 22) … SEM lends itself well to inferential analysis of data … popular for non-experimental research" (Byrne, 2010, pp. 3-4). Discussion of validities and reliabilities is not repeated herein; these have been adequately addressed in section 4.5.

Four reporting model fit indices are presented: χ²/df, CFI, SRMR, and RMSEA. Of these four, RMSEA "is one of the most important fit indices" (Hooper et al., 2008, p. 54; Diamantopoulos and Siguaw, 2000, p. 85), "a measure of approximate fit of a population, the square root of estimated discrepancy due to approximation per degree of freedom" (Schermelleh-Engel et al., 2003, p. 36), and is "sensitive to the number of model parameters … choosing those with lesser parameters." Scores below .08 are indicative of a good fit (MacCallum et al., 1996); Hu and Bentler (1999) suggest scores closer to .06 as a threshold of good model fit, and Steiger (1990) holds a close fit should be ≤ .05 (Browne and Cudeck, 1993). PClose values are not reported, although each of the hypothesized fully moderated models demonstrates values ≥ .05. SRMR, the standardized root mean square residual, is regularly reported with RMSEA; it is a "badness of fit measure based on fitted residuals" (Jöreskog and Sörbom, 1989, p. 41; Schermelleh-Engel et al., 2003, p. 37), with "rule of thumb SRMR values" ≤ .05 considered a good fit and values ≤ .10 possibly acceptable (Hu and Bentler, 1995). CFI, the comparative fit index (Bentler, 1990), "assumes all latent variables are uncorrelated and compares sample covariance matrix with the null model … CFI is less affected by sample size" (Hooper et al., 2008, p. 55); values ≥ .90 have traditionally been acceptable, however values ≥ .95 are considered indicative of a good fit (Schermelleh-Engel et al., 2003, p. 43). The last index, χ²/df, considers a ratio ≤ 2 a good model fit, with scores between 2 and 3 an acceptable data fit. All models presented in this research fall within the good to acceptable fit range. Unstandardized coefficients 'B' are used in path reporting alongside t and p values.
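As a compact summary of the cut-offs cited above, the illustrative helper below (not part of the dissertation's analysis) classifies a model against the Hu and Bentler (1999) and Schermelleh-Engel et al. (2003) thresholds; the function name and labels are assumptions.

```python
def assess_fit(chi2_df: float, cfi: float, srmr: float, rmsea: float) -> dict:
    """Classify SEM fit indices against the thresholds cited in the text."""
    return {
        "chi2/df": "good" if chi2_df <= 2 else "acceptable" if chi2_df <= 3 else "poor",
        "CFI":     "good" if cfi >= 0.95 else "acceptable" if cfi >= 0.90 else "poor",
        "SRMR":    "good" if srmr <= 0.05 else "acceptable" if srmr <= 0.10 else "poor",
        "RMSEA":   "good" if rmsea <= 0.06 else "acceptable" if rmsea <= 0.08 else "poor",
    }

# Stage-1 fully moderated model (Model 1) from section 4.7.1:
print(assess_fit(chi2_df=1.816, cfi=0.941, srmr=0.037, rmsea=0.050))
# {'chi2/df': 'good', 'CFI': 'acceptable', 'SRMR': 'good', 'RMSEA': 'good'}
```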

Three stages, [DCP-EMI-DAUM-DIP], [DIP-EMI-DAUM-DAP], and [DAP-EMI-DAUM-OCP], are examined individually due to the model's complexity; when combined, they comprise the overall research model. Comparisons are made between the fully moderated model and models sequentially removing the EMI and DAUM moderators to view the unmoderated model; in this manner, the moderating effect is viewed through changes occurring in SRMR and RMSEA (two-index reporting, Hu and Bentler, 1999).


4.7.1 [DCP-EMI-DAUM-DIP] model

Figure 23: DCP→EMI→DAUM→DIP model stage. [Path diagram: performance, innovation, competitive, and cyber-security pressures → Executive Management Influence (IP→EMI: B=-.31, t=-3.12**) → Data Accessibility & Use Mechanisms (EMI→DAUM: B=.46, t=6.84***) → strategic level, operation level, data-security level, and IIOT level data integration.] Fit: χ²/df = 1.816, CFI = 0.941, SRMR = 0.037, RMSEA = 0.050.

This model hypothesizes positive and direct relationships between data-centric pressures and data-integration practices, moderated by EMI and DAUM. Four models are tested: Model 1 with all construct variables, Model 2 keeping EMI and removing DAUM, Model 3 keeping DAUM and removing EMI, and Model 4 removing both EMI and DAUM. Testing in this manner allows the research to view the moderating impact on the model.
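A sketch of this four-model strategy in code may help orient the reader; the dissertation fit its models in AMOS 25, so the semopy-based Python below is only an assumed analogue, with hypothetical composite-score variable names and the measurement model omitted.

```python
import pandas as pd
import semopy

df = pd.read_csv("survey_scores.csv")  # hypothetical composite scores

# Model 1: full moderation, pressures -> EMI -> DAUM -> integration practices.
full_model = """
EMI ~ PP + IP + CP + CYBP
DAUM ~ EMI
SLDIP ~ DAUM
OLDIP ~ DAUM
DSLIP ~ DAUM
IIOTIP ~ DAUM
"""
m = semopy.Model(full_model)
m.fit(df)
print(semopy.calc_stats(m))  # chi2, CFI, RMSEA, among other indices

# Models 2-4 would re-fit the same description with the DAUM and/or EMI
# paths removed, then compare SRMR/RMSEA across the four fits
# (two-index reporting, Hu and Bentler, 1999).
```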

Model 1 demonstrated the best overall goodness of fit (χ²/df = 1.816, CFI = 0.941, SRMR = 0.037, RMSEA = 0.050). Model 2 keeps EMI and removes DAUM (χ²/df = 1.921, CFI = 0.941, SRMR = 0.040, RMSEA = 0.053). Model 3 keeps DAUM and removes EMI (χ²/df = 1.873, CFI = 0.944, SRMR = 0.040, RMSEA = 0.051). Model 4 removes both moderators (χ²/df = 1.931, CFI = 0.951, SRMR = 0.036, RMSEA = 0.053).

Interpreting the results

Model 1 illustrates the moderating effect of EMI and DAUM on DIP; managers allocating the necessary resources and making available the DAUM mechanisms enabling the use of data is evident. DCP has a significant effect on EMI (PP→EMI, B = .50, t = 3.81***; IP→EMI, B = -.31, t = -3.12**; CP→EMI, B = .68, t = 3.93***; CYBP→EMI, B = .15, t = 1.86, p > .05); CYBP slightly exceeded p = .05 yet was retained as a factor of data-centric pressures on managers. As data-centric pressures increase, the level of management influence increases to meet the demands of a knowledge-intensive environment. Not expected, and unexplained, is the negative strength of IP→EMI: as IP increases, management influence lessens. The hypothesis, drawn from the case studies, that IP creates greater management influence on innovativeness within the organization is a conundrum when explaining the survey results of IP's effect on EMI. CP→EMI (B = .68, t = 3.93***) demonstrates positive influence on management as hypothesized, the same as PP→EMI (B = .50, t = 3.81***); PP↔CP (r = .89) illustrates PP and CP moving in the same direction, each positively influencing the other and their effect on management. DCP correlations with EMI indicate the same movement: PP↔EMI (r = .82), CP↔EMI (r = .84), IP↔EMI (r = .66), CYBP↔EMI (r = .74).

EMI→DAUM (B = .46, t = 6.84***), as hypothesized, finds that management influence on resource allocations and leading data-centric initiatives requires the necessary mechanisms (DA tools, education, etc.) for organizational integration. DCP on DAUM showed CP→DAUM as significant (B = .51, t = 3.76); all other DCP on DAUM paths were not significant. As expected, DCP should first travel through EMI in deploying DAUM mechanisms; unexpected and unexplained is CP→DAUM. DAUM on DIP illustrates the hypothesized significance of mechanisms on data integration (DAUM→SLDIP, B = .90, t = 13.22***; DAUM→OLDIP, B = .86, t = 12.95***; DAUM→DSLIP, B = .98, t = 12.64***; DAUM→IIOTIP, B = .90, t = 11.45***).

Model 2, removing DAUM and keeping EMI, suffers from the reduced moderating effect (SRMR = .040, RMSEA = .053), indicating DAUM mechanisms are necessary for DIP. Model 3, removing EMI and keeping DAUM, improves by comparison, though not to as good a fit as Model 1 (SRMR = .040, RMSEA = .051). Model 4 removes all moderation (SRMR = .036, RMSEA = .053); the increase in RMSEA and near-identical SRMR compared with Model 1 (SRMR = .037, RMSEA = .050) indicate the moderation effect is demonstrated when both EMI and DAUM are present. The changes in SRMR and RMSEA are slight but noticeable, and indicate the premise of EMI and DAUM having a positive and direct effect on adopting a data-centric mindset culturally integrated into the organization.


4.7.2 [DIP-EMI-DAUM-DAP] model

Figure 24: DIP→EMI→DAUM→DAP model stage. [Path diagram: strategic, operation, data-security, and IIOT level data integration → Executive Management Influence (OLDIP→EMI: B=.36, t=2.91***) → Data Accessibility & Use Mechanisms (EMI→DAUM: B=.05, t=.81, ns) → productivity, planning, data governance, and innovation actuated processes (PLAP→IAP: B=1.05, t=3.89***).] Fit: χ²/df = 1.814, CFI = 0.934, SRMR = 0.039, RMSEA = 0.050.

In this examination of moderation, the models view the interaction between DIP and DAP. Model 1 with all latent constructs (χ²/df = 1.814, CFI = 0.934, SRMR = 0.039, RMSEA = 0.050). Model 2 removing DAUM and keeping EMI (χ²/df = 1.949, CFI = 0.932, SRMR = 0.040, RMSEA = 0.053). Model 3 removing EMI and keeping DAUM (χ²/df = 1.887, CFI = 0.934, SRMR = 0.041, RMSEA = 0.052). Model 4 removing all moderation (χ²/df = 1.863, CFI = 0.945, SRMR = 0.039, RMSEA = 0.051).

Interpreting the results

Model 1 demonstrated the best overall fit (SRMR = .039, RMSEA = .050). In this model, DIP has significant relationships with both EMI and DAUM: SLDIP→EMI (B = .64, t = 4.40***); OLDIP→EMI (B = .36, t = 2.91***); DSLIP→EMI (B = .13, t = 2.11*); IIOTIP→EMI not significant; SLDIP→DAUM (B = .36, t = 3.19***); OLDIP→DAUM (B = .57, t = 5.70***); DSLIP→DAUM (B = .24, t = 4.93***); IIOTIP→DAUM (B = -.15, t = -2.64**).

EMI's effect on DAUM is non-significant, and puzzling. Understanding from the DCP on DIP model in section 4.5.1.1 shows pressures motivating management to adopt a data-centric mindset; once resource allocations are made and data use mechanisms are in place, the role of management may lessen. The significance of DIP's relationships with both EMI and DAUM indicates the value management places on data integration and its actuation. In this regard, a lessening between EMI and DAUM appears plausible, though unexpected: once investments are made in data-technologies and mechanisms are in place, the EMI→DAUM relationship weakens, with management influence focusing on outcomes of use rather than involvement with mechanism choices.

DAUM's effect on DAP is significant for DAUM→PAP (B = 1.06, t = 12.95***), DAUM→PLAP (B = .83, t = 9.61***), and DAUM→DGAP (B = .97, t = 12.15); DAUM→IAP is not significant. Unexpected is the non-significance of DAUM→IAP, which had proposed data-mechanisms as a requirement for creating innovation opportunities. Also unexpected, when reviewing the intra-relationships among DAP constructs, PLAP showed significance on IAP (B = 1.05, t = 3.89***). This relationship makes sense given the impact of extraordinary processes on ordinary processes: innovation is dependent on planning, the path being DAUM→PLAP→IAP, with data mechanisms providing insights that are actuated in planning, leading to innovation. The relationship between DAUM and DAP demonstrates that as the level of data-mechanism use increases, processes affected by those mechanisms increase in performance, thereby executing on company strategies and objectives.

Model 2 (SRMR = .040, RMSEA = .053) is significantly impacted by removing the DAUM moderation, in part due to the necessity of data mechanisms to actuate processes. Model 3 (SRMR = .041, RMSEA = .052) improves when EMI is removed, which lends itself to interpretation as the lessening influence on DAUM once resources have been allocated. Model 4 (SRMR = .039, RMSEA = .051), without moderation, performs nearly the same as Model 1, possibly indicating continuous moderation is not required once management allocates resources and sets in place data accessibility and use mechanisms (stage one of the model). When EMI and DAUM are organizationally embedded, continuous moderation may lessen, with management's expectation being that the organization will utilize the mechanisms to improve performance and increase the value of outcomes.


4.7.3 [DAP-EMI-DAUM-OCG] model

Figure 25: DAP→EMI→DAUM→OCG model stage. [Path diagram: productivity, planning, data governance, and innovation actuated processes → Executive Management Influence → Data Accessibility & Use Mechanisms → current organization capabilities growth, planned organization capabilities growth, and current cost improvements; labeled paths B=.43 (t=4.35***), B=.38 (t=2.44**), and B=1.09 (t=4.90***).] Fit: χ²/df = 1.811, CFI = 0.928, SRMR = 0.046, RMSEA = 0.049.

In this final examination, the research views the effect of moderation on outcomes, reflected in organization capabilities growth. Model 1 with full moderation (χ²/df = 1.811, CFI = 0.928, SRMR = 0.046, RMSEA = 0.049). Model 2 removing DAUM and keeping EMI (χ²/df = 1.826, CFI = 0.936, SRMR = 0.046, RMSEA = 0.050). Model 3 removing EMI and keeping DAUM (χ²/df = 1.787, CFI = 0.935, SRMR = 0.047, RMSEA = 0.049). Model 4 removing all moderation (χ²/df = 1.774, CFI = 0.945, SRMR = 0.045, RMSEA = 0.048).

Interpreting the results: hypothesized Model 1 proves not to be the best fit. Model 4, removing EMI and DAUM (SRMR = .045, RMSEA = .048), performs better than Model 1; however, this performance alone does not make it the replacement model. The model suggests that at this stage, the need for EMI and/or DAUM is noticeably lessened, owing to those moderating effects taking hold in stages one and two of the model, and not being required on the outcome.

Significant relationships in Model 1 interact between DAP→EMI: PAP→EMI (B = .71, t = 4.71***); PLAP→EMI, not significant; DGAP→EMI (B = .29, t = 2.55*); IAP→EMI (B = .34, t = 2.89**). Interactions between DAP→DAUM: PAP→DAUM (B = .64, t = 3.57***); PLAP→DAUM (B = -1.07, t = -3.89***); DGAP→DAUM, not significant; IAP→DAUM (B = .61, t = 5.07***). PAP shows significance on both EMI and DAUM; productivity is important to capabilities growth, and data is operationalized through mechanisms as well as by managers who make decisions with supporting data that affect capabilities. PLAP→EMI being not significant may indicate planning at this stage of the model should not be a factor on EMI, premised on plans being in place during stages one and two of the model, with expected outcomes not requiring planning in stage three. PLAP→DAUM (B = -1.07, t = -3.89***) is a conundrum: as planning increases, DAUM decreases, which reads as counterintuitive, since planning should have a continuous need for mechanisms enabling data-supported decision-making. Possibly, this is not as hypothesized because planning processes were formed in stages one and two of the model; similar to management expectations, the role of DAUM lessens. The non-significant relationship of PLAP→EMI (B = -.38, t = -1.55, p > .05) is not disconcerting, since PAP, DGAP, and IAP are significant inputs on EMI for decision-making affecting organization capabilities and also flow to DAUM (B = .43, t = 4.35***). DAUM→COCG (B = .65, t = 9.72***) indicates the importance of having the right data mechanisms to facilitate growing organization capabilities. Interesting and not expected is the relationship COCG→POCG (B = .31, t = 2.44**), albeit not surprising given the data insights (part and parcel of this research) of this analysis. Improving COCG should lead to confidence in increasing organization capabilities; hence COCG should have an effect on POCG. Capabilities grown by successful use of data-technologies set the tone for continued investment in resources to continuously build organization capabilities. DAUM→POCG (B = .40, t = 4.33***), as hypothesized, should have a direct and positive influence on planning future organization capabilities growth. DAUM→CCI was not significant and does not support the hypothesis. CCI is a proxy for testing innovation through cost improvements realized by the data use mechanisms; regardless of testing with or without moderation, the results were not significant.

4.7.4 Discussion

A three-stage examination was made of the data-centric management influence model. Stage one, [DCP-EMI-DAUM-DIP], found the fully moderated Model 1 demonstrated the best goodness of fit (χ²/df = 1.816, CFI = 0.941, SRMR = 0.037, RMSEA = 0.050), affirming the positive and direct hypotheses that data-centric pressures are moderated by levels of management influence (EMI) making available the tools and infrastructure (DAUM) required for data integration within the organization, as indicated by EMI→DAUM (B = .46, t = 6.84***). Collectively, in this stage of establishing a data-centric organization, EMI is critical to both resource allocation for adopting data-technologies and leading adoption initiatives. Performance pressures (B = .50, t = 3.81***) and competitive pressures (B = .68, t = 3.93***) have the largest influence on managers adopting data-technologies.


Stage two, [DIP-EMI-DAUM-DAP], found the fully moderated Model 1 demonstrating the best goodness of fit (χ²/df = 1.814, CFI = 0.934, SRMR = 0.039, RMSEA = 0.050), representing the continuous effect of EMI and DAUM as hypothesized. Albeit, EMI→DAUM (B = .05, t = .81, ns) shows a lessening effect, indicating that once resources are allocated and mechanisms are in place, the relationship tide of EMI on DAUM may recede.

Stage three, [DAP-EMI-DAUM-OCP], found Model 4 (χ²/df = 1.774, CFI = 0.945, SRMR = 0.045, RMSEA = 0.048), without the benefit of moderation, appears to perform the best. Lost in this model are the effects of DGAP→COCG and DGAP→POCG, as well as the relationship COCG→POCG (B = .29, t = 1.70, p > .05, although very close to p < .05). This research is reluctant to abandon the moderating effect, due to the arguments made in the case studies on the importance of EMI and DAUM to capabilities growth, and as demonstrated in the effect size examination of these moderating variables on outcomes (section 4.5.2.3).

Hypotheses results (section 4.5.1.5) are presented in Tables 39, 40, and 41; narrative on the hypotheses was presented heretofore.


4.7.4.1 Hypotheses Results

Table 39: DCP-EMI-DAUM-DIP Model Hypothesis Results

| Hypothesis | Path | Model 1: Full Moderation | Model 2: Keep EMI | Model 3: Keep DAUM |
|---|---|---|---|---|
| Hm1a, b | EMI→DAUM | + | NA | NA |
| H3a | PP→EMI | + | + | NA |
| H4a | IP→EMI | + | + | NA |
| H5a | CP→EMI | + | + | NA |
| H6a | CYBP→EMI | - | - | NA |
| H3b | PP→DAUM | - | NA | + |
| H4b | IP→DAUM | - | NA | - |
| H5b | CP→DAUM | + | NA | + |
| H6b | CYBP→DAUM | - | NA | - |
| H3c | DAUM→SLDIP | + | NA | + |
| H4c | DAUM→OLDIP | + | NA | + |
| H5c | DAUM→DSLIP | + | NA | + |
| H6c | DAUM→IIOTIP | + | NA | + |
| | EMI→SLDIP | NA | + | NA |
| | EMI→OLDIP | NA | + | NA |
| | EMI→DSLIP | NA | + | NA |
| | EMI→IIOTIP | NA | + | NA |

Table 40: DIP-EMI-DAUM-DAP Stage 2 Hypotheses Result

| Hypothesis | Path | Model 1: Full Moderation | Model 2: Keep EMI | Model 3: Keep DAUM |
|---|---|---|---|---|
| Hm1a, b | EMI→DAUM | - | NA | NA |
| H7a | SLDIP→EMI | + | + | NA |
| H8a | OLDIP→EMI | + | + | NA |
| H9a | DSLIP→EMI | + | + | NA |
| H10a | IIOTIP→EMI | - | + | NA |
| H7b | SLDIP→DAUM | + | NA | + |
| H8b | OLDIP→DAUM | + | NA | + |
| H9b | DSLIP→DAUM | + | NA | + |
| H10b | IIOTIP→DAUM | + | NA | + |
| H7c | DAUM→PAP | + | NA | + |
| H8c | DAUM→PLAP | + | NA | + |
| H9c | DAUM→DGAP | + | NA | + |
| H10c | DAUM→IAP | + | NA | - |
| | EMI→PAP | NA | + | NA |
| | EMI→PLAP | NA | + | NA |
| | EMI→DGAP | NA | + | NA |
| | EMI→IAP | NA | - | NA |


Table 41: DAP-EMI-DAUM-OCG Stage 3 Hypotheses Result

| Hypothesis | Path | Model 1: Full Moderation | Model 2: Keep EMI | Model 3: Keep DAUM |
|---|---|---|---|---|
| Hm1a, b | EMI→DAUM | + | NA | NA |
| H11a | PAP→EMI | + | + | NA |
| H12a | PLAP→EMI | - | + | NA |
| H13a | DGAP→EMI | + | + | NA |
| H14a | IAP→EMI | + | + | NA |
| H11b | PAP→DAUM | + | NA | + |
| H12b | PLAP→DAUM | + | NA | + |
| H13b | DGAP→DAUM | - | NA | - |
| H14b | IAP→DAUM | + | NA | + |
| H11c | DAUM→COCG | + | NA | + |
| H12c | DAUM→POCG | + | NA | + |
| H13c | DAUM→CCI | + | NA | - |
| | EMI→COCG | NA | + | NA |
| | EMI→POCG | NA | + | NA |
| | EMI→CCI | NA | - | NA |


CHAPTER 5: ARIA

5.0 Summary and management implications

Firstly, through case studies, the research explored whether business analytics makes a difference in company performance in today's information-, knowledge-, and digitization-intensive environment. Building theory by analyzing thousands of words of text, a data-centric management influence research model was developed, demonstrating the data-centric ecosystem in which executive management influence (EMI), as a continuous moderating effect, successfully leads data-integration practices and forms use strategies for actuating data to create new, or enhance current, organization capabilities.

EMI contains several attributes, primarily leadership on data-analytic initiatives, resource allocation, and expectations on data-technology ROI in both economic and non-economic terms. A second moderating factor is data accessibility and use mechanisms (DAUM), resulting from EMI resource allocations and allowing for the integration and actuation of data, creating value benefits through capabilities growth. Primary attributes of DAUM are job-related data accessibility, user education on data technologies, data visualization technologies, and automated data-generating systems.

Both moderators demonstrated strengthening effects on the research model, since the level of EMI and DAUM indicates the degree of data-centricity adopted by the organization. Moderators, third-variable interactors, modify relationships between an independent and a dependent variable (Kenny, 2018; Baron and Kenny, 1986). In this case, EMI and DAUM at varying levels strengthen the relationships between variables; differing values of a moderating variable alter the relationship strength. Hence the level of resource allocation or management influence offered by EMI alters the degree of data-centric adoption within the organization. Similarly, DAUM is dependent on EMI for implementation of the tools and mechanisms that bring about a data-centric culture. This being said, EMI can also mediate the relationship between the independent and dependent variable, where data-centric pressures (DCP) cause EMI, EMI causes DIP, DAP, OCG, and so forth, with EMI then mediating through DAUM and DAUM causing DIP, DAP, OCG.
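For concreteness, a minimal sketch of this moderator-versus-mediator distinction in regression terms follows; the variable names are illustrative composites, not the study's item-level data, and the three-step test is the classic Baron and Kenny (1986) procedure rather than the dissertation's own computation.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_scores.csv")  # hypothetical composite scores

# Moderation: the DCP -> DIP slope varies with the level of EMI,
# captured by the interaction term.
moderation = smf.ols("DIP ~ DCP * EMI", data=df).fit()
print("interaction b =", round(moderation.params["DCP:EMI"], 3))

# Mediation (Baron and Kenny, 1986): DCP -> EMI is path a; regressing
# DIP on both EMI and DCP gives path b and the direct effect c'.
a = smf.ols("EMI ~ DCP", data=df).fit()
b = smf.ols("DIP ~ EMI + DCP", data=df).fit()
print("indirect effect a*b =", round(a.params["DCP"] * b.params["EMI"], 3))
```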

The case studies also provided insights into the 'black boxes' of what is contained within the topics of DIP, DAP, and OCG. Rather than modeling the research at high-level constructs, the case studies opened in detail how organizations view data-centricity as a management practice. Secondly, a literature review was conducted to augment the findings of the case studies, serving to support the case study research model.

Thirdly, combining the findings of the case studies and literature review, a survey instrument was developed on which to confirm the case study theory. Two forms of analysis of the survey data were performed, the first a multiple regression on the individual effects of the model dimensions, as discussed and reported in section 4.5.2. The hypotheses proposed that EMI and DAUM would positively moderate the relationships, and the findings support the proposition by illustrating changing effect sizes between the independent and dependent variables. The research model is examined in three stages: the effects of data-centric pressures (DCP) on data-integrated practices (DIP), DIP on data-actuated process strategies (DAP), and DAP on organization capabilities growth (OCG). The majority of situations illustrate increasing effect size movement, from small or no effect between independent and dependent variables to large effects when both moderators are present.

240

Stage one demonstrates moderation has the largest increasing effect in the presence of both moderators. In stages two and three the dual moderation effect remains, albeit smaller, yet significant changes occur.

A second analysis was performed by means of structural equation modeling (section 4.5.3). Given the model's complexity, examination was made in similar stages as the multiple regression, using model fit as the proxy for viewing moderating effects. In the DCP on DIP stage, the moderating effect is evident, demonstrating the need for EMI and DAUM in this initial stage of a data-centric organization. In stage two, DIP on DAP, the moderating effect appears to lessen; removing moderation / mediation, the results appear to favor a lessening effect, yet moderation remains evident. In stage three, DAP on OCG, the model argues the moderating / mediating effect lessens further and possibly is not required at this stage, theorizing that once EMI and DAUM are firmly acting on stages one and two, outcomes may not require moderation as continuous as hypothesized. In each of the analyses, continuous EMI and DAUM moderation is demonstrated, supporting the hypotheses as presented in the case study research model.

Reading comparatives on topics of interest across a broad spectrum of manufacturers benefits managers by revealing different ways to conduct business. Managers should appreciate the pressures a business faces in adopting a data-centric culture to sustain competitiveness. Understanding the economic and non-economic benefits of data-analytics and data-technology investments, the push and pull of internal and external influences on those investments, and clearly differentiating both short-term and long-term expectations of the technology's organizational impact is critical to realizing favorable returns on the technologies' application and use.

A few key takeaways from each section are discussed. Key takeaway discussion points on data-adoptive pressures for managers are: enabling talent by unlocking employee performance and reducing task frustrations through data-technology investments; pushing information to the organization and function levels where decision-making is best understood and made, with cross-function collaboration on data-sharing to innovate processes and increase the productive use of talent; and understanding the degrees of innovation pressure externally applied by the marketplace, and those that are an internal function of management's ambitions to make novel investments for maintaining the company's technology capabilities. Learning the levels of influence within the organization on technology adoption is helpful when implementing a new technology.

Key takeaway discussion points on executive management influence and data use practice mechanisms: both dimensions are continuous in their influence on data-centricity. Deepening the levels of data access by employees leads to greater use of data-supported decision-making and thereby aids in improving critical thinking skills. Companies that organizationally educate on data use and problem-solving find data accessibility and timeliness helpful in decision-making processes. Executives who 'practice what they preach' by allocating sufficient resources to data-technologies and demonstrating the importance of technology use by self-prescription are better positioned to realize higher returns on technology investment.

242

Governance of data accessibility is an increasingly critical component of data and cyber security; managers, when selecting a new technology, should include assessments of privileges that may invite data-breach risks, making known potential vulnerabilities to the protection of data assets. Managers who regularly integrate and analyze data from external or strategic sources (industry, regulatory, supplier, customer, market) in decision-making and strategy planning are better positioned to sustain the organization within changing business environments; data integration and analysis also benefit the management of raw-material fungibility, making managers aware of the competitive substitutability of materials when scarcity is present or marketplace changes require the use of new materials in the manufacturing process. Putting into place data-analytic mechanisms to find things new and novel works to strengthen innovation and R&D capabilities; establishing cross-function teams and the sharing of data enhances organizational innovativeness. Managers who understand the importance of critical success factors and remain disciplined in communicating these through a commonized approach to decision-making and problem-solving better position the company to sustain operational capabilities.

Key takeaway discussion points on data-applied processes: regarding productivity, managers who regularly integrate safety data as a component of productivity analysis bring focus to employee welfare and the importance of protecting the labor pool from unwanted reduction. Manufacturing is about maximizing production capabilities and optimizing capacities or unit through-put; managers who regularly monitor processes through data-analytics tend to innovate on processes to improve performance. Managers should consider analyzing processes from a cost-of-activity and profit-contribution perspective to better understand the value created by making process improvements. Regarding planning, managers who regularly incorporate external data into predictive analysis are more adaptable in making changes to competitive strategies. Projecting resource requirements is critical to understanding capabilities; managers should regularly analyze resource strengths and weaknesses to reallocate and utilize the capabilities most beneficial to value-creating activities. Maintaining connectivity with customers and suppliers and analyzing shared data predictive of trends should be a regular activity of managers in assessing demand and market changes affecting capabilities.

Key discussion points on organization outcomes emphasize growth as the operational mechanism by which to measure performance. Managers ought to incorporate growth themes of employment, capabilities, cyber-security, and finance into the organizational environment, as measures relative to the introduction of technologies, to understand economic and non-economic returns (i.e., the impact of technologies on employment dynamics, capabilities reconfiguration or addition, the organization's mindset on data protection, and technology-influenced economic returns). Managers should analyze internal and external data to reveal opportunities for growth in existing markets and to identify opportunities in new markets. Managers need to sustain innovation by creating momentum that continuously drives new ideas into the organization. This is accomplished through mechanisms enabled by technologies: building on successes experienced in improving productivity, taking action on employee-generated ideas, and consistently creating concepts by prototyping new ideas and integrating data from customers and other external sources of information.


5.1 Recommendations on the research

Regardless of the research type, limiting the creep of personal biases, prior learned knowledge, and experience into interpreting and reporting findings is critical to objective study. Although well guarded against in this research through applied rigor, no guarantee against unconscious predispositions can be explicitly warranted; claiming otherwise would be unrealistic given our human nature, individual levels of comprehension, and the premise that no research is perfect.

Academic criticisms of case study research have centered on limited sample size, rigor, and the qualitative interpretation of results. In this regard, the research herein made consistent efforts to sample a representative population reflective of the topics under study. Interpreting the transcript data, collectively containing hundreds of thousands of words, followed a defined process of comparing and coding explicit and implicit topic references found through manual reading and through text analysis software.

The study sample size of ten companies and thirteen interviewees is greater than the acceptable standards offered by proponents of case study research (Yin, 2014; Dul and Hak, 2008; Miles and Huberman, 1994) for providing sufficient evidence on topics of interest. The limitation here is the regional focus on mid-west USA manufacturers, which raises the question of whether the results would differ if expanded to other geographic manufacturing centers. To mitigate some of this geographic constraint, the sample group deliberately contained a mix of industry types, culling from the primary metals, food products, wood products, metal fabrication, home furnishings, tobacco, and other consumer products industries. This mix of industries provided a broader context for the study, thereby increasing the randomness of responses for exploring commonalities. Had the investigation been single-industry focused, the evidence found may not have been as revealing; i.e., if the sample group were strictly automotive focused, would the findings reveal the degree of variation found in a mixed-industry examination versus those that are industry specific? In this sense, the research found both industry-specific and cross-industry commonalities, heightening the level of analysis. Of note, conducting several in-depth studies of single industries would be of value to this research, drilling deeper into the presented propositions from industry-specific perspectives.

Time is the most significant constraint on case-study research. Interviewee availability, time allocation for the interview, and timely response to follow-up questions are limiting factors if not managed properly. An interviewee's time is valuable and not to be wasted; providing the participant with an outline of general topics beforehand aided in respecting time allocations.

The majority of interviews extended beyond appointed times, attributable to the semi-structured forum on topics and the desire of the interviewee to continue the dialogue. Respondents in face-to-face venues willingly exceeded the two-hour requested time, at times veering onto affinitive topics worthy of discussion. The telephone interviews were 'hard-stopped'83 at the designated time. Face-to-face situations are much preferred, with the majority of explicitly referenced dialogue within this article generated from this form of interaction. The ability to listen, see, and read human behavioral reactions and responses to queries with focused attention cannot be imitated in telephone interviews. While telephone participants agree to devote a period of time to the interview, distractions occur with the interviewee, unseen by the interviewer84, interfering with time allocations. Post-interview clarifications of interviewee comments were managed through e-mail communication, with some responding immediately, others delayed in response, or not responding at all, retarding planned analysis.

83 Hard-stopped references staying within designated time allocations; discussions cease regardless of the point in the discussion of a topic.

Limiting exposure to sensitive information is of competitive concern to some interviewees and, to certain extents, restricts the level of detail on discussion points. The depth of references made by an interviewee on a topic varied among the cross-case comparisons, reliant upon the interviewee's comfort level with revealing information. Of the eight face-to-face interviews conducted at plant offices, factory tours were conducted in five where the facility was in proximity to the place of interview. The opportunity to tour all factories of the sample companies would have benefited further topic comparisons; however, this was not a constraining factor in conducting the research. The five tours provided a randomness of observations on cross-industry manufacturer data use practices and processes.

84 A choice of video conference or phone conference was offered, with the latter chosen in all related situations.

Chapter four, through a survey instrument, examines the topics revealed in the case studies. Given the depth of exploration herein, deeply researching every construct and relationship presented is not possible in one survey research event; several would be required to complete a full appreciation of the research model. Given the scale and complexity of the research model, further research would benefit from studying it in the three stages as presented, versus digesting the entire model in one research effort. Albeit, the advantage of conducting research this broad is that it opens the opportunity for multiple paths of further inquiry that may not have been possible with a more limited research effort.

The literature review in chapter three combines academic and non-academic research with the findings to build a more comprehensive definition of constructs versus those singularly defined through the case studies. The method of survey development and statistical analysis was conducted to the best of the researcher's knowledge; the accuracy of such improves with time and experience. Caution was taken when reporting the results, which are presented as accurately as possible; however, no guarantee is implied that error did not occur. Upon continued examination, other insights into the data may be found, or those herein reported may be refuted.

5.2 Recommendations on further research

Any one of the fifteen sub-models and thirty-five granulized themes is apt for additional research. Of specific interest to this researcher are studies on innovation momentum: the digitally enabled mechanisms manufacturers employ to sustain the constant and continuous flow of value-creating ideas into the organization. Organizations that ignore the need for innovation become stale and lose competitiveness; researching the creative ways companies innovate through data-analytics is critical to sustaining organization growth. Studying the cross-function collaborative sharing of data to create new and novel ideas fits nicely into themes of innovation momentum. Innovation momentum dovetails into the innovation pressures managers face when adopting a data-centric organization culture and becomes another topic for deeper exploration. Regarding the use of data, especially in this age of knowledge intensity and information availability, determining the right data type and methods of timely delivery for decision-making warrants additional research.

Levels of management or organization influence on initiating, implementing, and interacting with data-technologies would reap much-needed insights for deepening the organizational use of data in problem-solving and data-supported decision-making. These are but a few of the research extensions available from this research. Big data, or any collection of data, flows through an organization for use in decision-making and problem-solving; the viscosity of that flow is enabled by technologies and by education on the use of data. Researching data-viscosity in this regard will provide insights into how managers may better move data through the organization and realize greater value from its useful application.


References

Adamczak, M., Domański, R., Cyplik, P. (2013). Use of Sales and Operations Planning in Small and Medium-sized Enterprises. Log Forum, Scientific Journal of Logistics, 9 (1), pp 11-19.

Ackoff, R.L. (1989). From data to wisdom. Journal of Applied Systems Analysis, 16, pp 3-9

Akter, S. (2016). How to improve firm performance using big data analytics capability? PEARL, University of Plymouth. International Journal of Production Economics, 10.1016/j.ijpe.2016.08.018

Agrawal, A. (2016). 2016 US SMB IT spend growth rate to remain flat at US$188B. Techaisle.com https://techaisle.com/blog/248-2016-us-smb-it-spend-growth-rate-to-remain-flat

Agrawal, A. (2016). 2016 Analytics & Big Data in the US SMB Market. Techaisle.com, A Techaisle Report

Alamar, B.C. (2013). Sports Analytics: A Guide for Coaches, Managers, and Other Decision Makers. Columbia University Press, ISBN: 978023116292

Alavi, M., Leidner, D.E. (2001). Review: Knowledge Management and Knowledge Management Systems: Conceptual Foundations and Research Issues. MIS Quarterly, 25 (1), pp 107-136

Alavi, M., Tiwana, A. (2002). Knowledge integration in virtual teams: The potential role of KMS. Journal of the Association for Information Science and Technology, 53 (12), pp 1029-1037

Albright, S.C., Winston, W.L. (2017). Business Analytics, Data Analysis and Decision Making. 6th edition, Cengage Learning, ISBN: 978-1-305-94754-2

Al-Ruithe, K., Benkhelifa, E., Hameed, K. (2017). A systematic literature review of data governance and cloud governance. Personal and Ubiquitous Computing, Springer-Verlag London Ltd., part of Springer Nature 2018, January

Amit, R., Shoemaker, P.J. (1993). Strategic assets and organizational rent. Strategic Management Journal, 14 (1), pp 33-46

Amit, R., Zott, C., (2001), Value Creation in E-Business, Strategic Management Journal, 22, pp 493-520


Anderson, D., Baskerville, R.L., Kaul, M. (2017). Information Security Control Theory: Achieving a Sustainable Reconciliation Between Sharing and Protecting the Privacy of Information. Journal of Management Information Systems, 34 (4), pp 1082-1112

Ansoff, H.I. (1965). Corporate Strategy: an analytic approach to business policy for growth and expansion. McGraw-Hill Companies, US ISBN: 978007002112

Ansoff, H.I. (1988). The New Corporate Strategy. Revised Edition. Wiley ISBN: 9780471629504

APICS Foundation. APICS S&OP Performance: Advancing Sales and Operations Planning. APICS Insights and Innovations online https://www.apics.org/docs/default-source/industry-content/apics-sop-performance-report.pdf?sfvrsn=0

Baan, P., & Homburg, R. (2013). Information Productivity: An Introduction to Enterprise Information Management. In Enterprise Information Management (pp. 1-42). Springer New York.

Bäckström, I., Bengtsson, L.G. (2018). Employee Involvement in Firm Innovation – A Mapping Study of Research on Employee Innovation. Academy of Management, July, Abstract online https://journals.aom.org/doi/abs/10.5465/AMBPP.2018.16710abstract

Baesens, B. (2014). Analytics in a Big Data World: The essential guide to data science and its applications. John Wiley & Sons, Inc., ISBN 978-1-118-89270

Baesens, B., Bapna, R., Mardsen, J.R., Vanthienen, J., Zhao, J.L. (2016). Transformational Issues of Big Data and Analytics in Networked Business. MIS Quarterly, 40 (4), pp 807-818

Baker, J. (2012). The Technology-Organization-Environment Framework, in Information Systems Theory: Explaining and Predicting Our Digital Society, Vol. 1, Springer New York pp 231-245, e-ISBN 9781441961082

Banafa, A. (2016). Small Data vs. Big Data: Back to the basics. Dataflog.com. https://dataflog.com/read/small-data-vs-big-data-back-to-the-basic/706

Bandura, A. (1993). Perceived Self-Efficacy in Cognitive Development and Functioning. Educational Psychologist, 28 (2), pp 117-148

Bapuji, H., Crossan, M. (2004). From Questions to Answers: Reviewing Organizational Learning Research. Management Learning, 35 (2), pp 397-417

Barbier, J., Buckalew, L., Loucks, J., Moriarty, R., O’Connell, K., Riegel, M. (2016). Cybersecurity as a growth advantage. Cisco


Barbos, A. (2015). Information Acquisition and Innovation Under Competitive Pressure. Journal of Economics & Management Strategy, 24 (2), pp 325-347

Barney, J.B. (1986). Organization Culture: Can It Be a Source of Sustained Competitive Advantage? Academy of Management, 11 (3), pp 656-665

Barney, J.B. (1991). Firm Resources and Sustained Competitive Advantage. Journal of Management, 17 (1), pp 99-120

Baron, R.M., Kenny, D.A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, pp 1173-1182

Barshan, E., Ghodsi, A., Azimifar, S., Jahromi, M.Z. (2010). Supervised Principal Components Analysis: Visualization, Classification and Regression on Subspaces and Submanifolds. Preprint submitted to Elsevier.

Barton, D., Court, D. (2012). Making advanced analytics work for you. Harvard Business Review, 90 (10), pp 78-83

Bass, B.M., Avolio, B.J. (1993). Transformational Leadership and Organization Culture. Public Administrative Quarterly, Spring, pp 112- 121

Baumeister, R.F. (1984). Choking Under Pressure: Self-Consciousness and Paradoxical Effects of Incentives on Skillful Performance. Journal of Personality and Social Psychology, 46 (3), pp 610-620

Beaudry, A., Pinsonneault, A. (2010). The Other Side of Acceptance: Studying the Direct and Indirect Effects of Emotions on Information Technology Use. MIS Quarterly, 34 (4), pp 689-710

Beecher, P. (2018). Enterprise-grade networks: the answer to IoT security challenges. In Network Security. July, pp 6-9.

Benbasat, I., Goldstein, D.K., Mead, M. (1987). The Case Research Strategy in Studies of Information Systems. MIS Quarterly, September, pp 369-386

Bernolak, I. (1997). Effective measurement and successful elements of company productivity: the basis of competitiveness and world prosperity. International Journal of Production Economics, 52 (1-2), pp 203-213

Bharadwaj, A., El Sawy, O.A., Pavlou, P.A., Venkatraman, N. (2013). Digital Business Strategy: Toward a Next Generation of Insights. MIS Quarterly, 37 (2), pp 471- 482

Bogetoft, P., & Otto, L. (2010). Benchmarking with Data Envelope Analysis, Stochastic Frontier Models and R (Vol. 157). Springer Science & Business Media.


Bontis, N. (2000). Assessing Knowledge Assets: A Review of the Models Used to Measure Intellectual Capital. Closing Keynote Presentation, KM World 2000

Brown, B., Gottlieb, J. (2016). The need to lead in data and analytics. McKinsey & Company, Digital McKinsey.

Brown, G., Keegan, J., Vigus, B., Wood, K. (2001). The Kellogg company optimizes production, inventory and distribution. Interfaces, 31 (6), pp 1-15

Browne, M.W., Cudeck, R. (1993). Alternative ways of assessing model fit. In Testing structural equation models, pp 136-162, Sage Publications, Newbury Park, CA.

Brynjolfsson, E., McAfee, A. (2012). Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and The Economy. The MIT Center for Digital Business, January

Bughin, J., Livingston, J., Marwaha, S. (2017). Seizing the potential of ‘Big Data’. McKinsey Quarterly, 28 (4), pp 103-109

Bughin, J. (2016). Big Data, Big Bang? Journal of Big Data, 3 (2), pp 1-14

Bughin, J., LaBerge, L., Mellbye, A. (2017). The case for digital reinvention. McKinsey and Company, Digital McKinsey, February

Bugliarello, G., Doner, D.B. (1973). The History and Philosophy of Technology. University of Illinois Press ISBN-0-252-00462-0

Burlingame, J. F. (1961). Information technology and decentralization. Harvard Business Review, 39(6), pp 121–126.

Brooking, A., Motta, E. (1996). A taxonomy of intellectual capital and a methodology for auditing it. 17th Annual National Business Conference


Byrne, B.M. (2010). Structural Equation Modeling with AMOS. Second edition, Routledge, Taylor and Francis Group, New York. ISBN 9780805863734

Camm, J.D., Cochran, J.J., Fry, M.J., Ohlmann, J.W., Anderson, D.R., Sweeney, D.J., Williams, T.A. (2017). In Essentials of Business Analytics, 2nd edition. Cengage Learning ISBN978-1-305-62773-4

Capon, N., Glazer, R. (1987). Marketing and Technology: a strategic coalignment. Journal of Marketing, 51 (3), pp 1-14


Cepeda, G., Vera, D. (2007). Dynamic capabilities and operational capabilities: A knowledge management perspective. Journal of Business Research, 60 (5), pp 426-437

Chadee, D.D., Pang, B. (2008). Technology strategy and performance: a study of information technology service providers from selected Asian countries. Service Business, 2 pp 109-126

Chen, H., Chiang, R.H.L., Storey, V.C. (2012). Business Intelligence and Analytics: From Big Data to Big Impact. MIS Quarterly, 36 (4), pp 1165-1188

Chen, D. Q., Preston, D. S., & Swink, M. (2015). How the Use of Big Data Analytics Affects Value Creation in Supply Chain Management. Journal of Management Information Systems, 32(4), pp 4–39

Collis, D.J. (1994). Research Note: How Valuable are Organizational Capabilities. Strategic Management Journal, 15, pp 143-152.

Cooper, R.B. (1994). The inertial impact of culture on IT implementation. Information & Management, 27, pp 17-31

Corley, K.G., Gioia, D.A. (2004). Identity Ambiguity and Change in the Wake of a Corporate Spin-off. Administrative Science Quarterly, 49, pp 173-208

Corley, K.G., Gioia, D.A. (2011). Building theory about theory building: What constitutes a theoretical contribution? Academy of Management Review, 36 (1), pp 12-32

Cyert, R.M., March, J.G. (1963). A Behavioral Theory of the Firm. Prentice Hall, Englewood Cliffs, New Jersey

D’Aveni, R.D. (2002). Competitive Pressure Systems: Mapping and Managing Multimarket Contact. MIT Sloan Management Review, Fall

Daft, R.L., Weick, K.E. (1984). Toward a model of organizations as interpretation systems. Academy of Management Review, 9 (2), pp 284-295

Cupoli, P., Earley, S., Henderson, D. (2014) DAMA-DMBOK2 Framework. The Data Management Association.

Davenport, T.H. (1993). Process Innovation: Reengineering Work through Information Technology. Harvard Business School Press, Boston. eISBN: 9780875843667

Davenport, T.H., Harris, J.G. (2007). Competing on Analytics: The New Science of Winning. Harvard Business School Publishing Company. ISBN-13: 9781422103326

Davenport, T.H. (2010). Are you ready to Reengineer Your Decision Making? An interview with Thomas H. Davenport. MIT Sloan Management Review, July

Davenport, T.H., Barth, P., Bean, R. (2012). How ‘Big Data’ Is Different. MIT Sloan Management Review, 54 (1), pp 22-24

Davenport, T.H. (2013). Analytics 3.0. Harvard Business Review, December, pp 65-72

Davenport, T.H. (2014). How strategists use ‘big data’ to support internal business decisions, discovery, and production. Strategy & Leadership, 42 (2), pp 45-50.

Davenport, T.H. (2014). Big Data at Work: Dispelling the myths, uncovering the opportunities. Harvard Business School Publishing Corporation. ISBN: 9781422168165

Davis, F.D. (1989). Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Quarterly, 13 (3), pp 319-339

De Clercq, D., Thongpapanl, N., Dimov, D. (2011). A Closer Look at Cross-Functional Collaboration and Product Innovativeness: Contingency Effects of Structural and Relational Context. Journal of Product Innovation Management, 28, pp 680-697

Del Canto, J.G., González, I.S. (1999). A resource-based analysis of factors determining a firm’s R&D activities. Research Policy, 28 (8), pp 891-905

DeLone, W.H., McLean, E.R. (2003). The DeLone and McLean Model of Information Systems Success: A Ten-Year Update. Journal of Management Information Systems, 19 (4), pp 9-30

DeSanctis, G., Poole, M.S. (1994). Capturing the Complexity in Advanced Technology Use: Adaptive Structuration Theory. Organization Science, 5 (2), pp 121-147

Demartini, C. (2014). Performance Management Systems: Design, Diagnosis, and Use. Physica, Springer Science and Business Media. ISBN 978-3-642-36683-3

Depietro, R., Wiarda, E., Fleischer, M. (1990). The Context for Change: Organization, Technology, and Environment, in Tornatzky, L.G. and Fleischer, M. (Eds.) The Process of Technological Innovation, Lexington, MA: Lexington Books, pp 151-175

Desouza, K.C., Smith, K.L. (2014). Big Data for Social Innovation. Stanford Social Innovation Review, Summer pp 39-43

Diamantopoulos, A., Siguaw, J.A. (2000). Introducing LISREL. Sage Publications

Dietterich, T.G. (2000). Ensemble Methods in Machine Learning. Oregon State University

Dosi, G., Nelson, R.R., Winter, S.G. (2000). Introduction: The Nature and Dynamics of Organizational Capabilities. Oxford University Press.

Driver, M.J., Mock, T.J. (1975). Human Information Processing, Decision Style Theory, and Accounting Information Systems. The Accounting Review, 50 (3), pp 490-508

Dul, J., Hak, T. (2008). Case Study Methodology in Business Research, 1st edition. Elsevier, Butterworth-Heinemann Publications. ISBN: 978-0-7506-8196-4

Dunbrack, L., Hand, L., Turner, V., Ellis, S., Knickle, K. (2016). IoT and Digital Transformation: A Tale of Four Industries. IDC White Paper, March

Edvinsson, L., Malone, M.S. (1997). Intellectual Capital: Realizing Your Company’s True Value by Finding Its Hidden Brainpower. HarperCollins, ISBN: 9780887308413

Edvinsson, L. (2002). The New Knowledge Economics. London Business School Review, 13 (3), pp 72-76

Eisenhardt, K.M. (1989a). Building Theories from Case Study Research. Academy of Management Review, 14 (4), pp 532-550

Eisenhardt, K.M. (1989b). Making Fast Strategic Decisions in High-Velocity Environments. Academy of Management Journal, 32 (3), pp 543-576

Eisenhardt, K.M., Martin, J. (2000). Dynamic Capabilities: what are they? Strategic Management Journal, Special Issue 21 (10-11), pp 1105-1121

Elgendy, N., Elragal, A. (2014). Big Data Analytics: A Literature Review Paper. Published in Advances in Data Mining: Applications and Theoretical Aspects, 14th Industrial Conference, ICDM, St. Petersburg, Russia, July 16-20, Proceedings, pp 214-217. Springer International Publishing.

Eppler, M.J., Mengis, J. (2004). The concept of information overload: A review of literature from organization science, accounting, marketing, MIS, and related disciplines. The Information Society, 20 (5), pp 325-344

Felici, M., Pearson, S. (2015). Accountability for Data Governance in the Cloud. In Accountability and Security in the Cloud. Springer International Publishing Switzerland, pp 3-42

Feynman, R.P. (1985). Surely You’re Joking, Mr. Feynman!: Adventures of a Curious Character. W.W. Norton & Company, Inc., New York, N.Y. ISBN: 9780393355628

Fiol, C.M., Lyles, M.A. (1985). Organizational Learning. Academy of Management Review, 10 (4), pp 803-813

Fitzgerald, M. (2013). Turning Big Data into Smart Data. Big Idea: Data & Analytics Blog, December, MIT Center for Information Systems Research

Fitzgerald, M. (2015). Viewing Data as a Liquid Asset. MIT Sloan Management Review.

Fleckenstein, M., Fellows, L. (2018). Modern Data Strategies. Springer International Publishing AG. ISBN 9783319689937 (eBook).

French, J.R.P. Jr., Raven, B. (1959). The bases of social power. In D. Cartwright (Ed.), Studies in Social Power (pp 150-167). Ann Arbor, MI: University of Michigan, Institute for Social Research.

Frey, T. (2014). Governance Arrangements for IT Project Portfolio Management: Qualitative Insights and a Quantitative Modeling Approach. Springer Gabler, Wiesbaden. ISBN: 9783658056612

Friedman, D. (2015). Get to Know the Four Types of Data in The Internet of Things. https://readwrite.com/2015/08/13/five-types-data-internet-of-things/

Galbraith, J.R. (2014). Organization Design Challenges Resulting from Big Data. Journal of Organization Design 3 (1), pp 2-13

Gardner, H.K. (2012). Performance Pressure as a Double-edged Sword: Enhancing Team Motivation but Undermining the Use of Team Knowledge. Administrative Science Quarterly, 57 (1), pp 1-46

Garud, R., Tuertscher, P., Van de Ven, A.H. (2013). Perspectives on Innovation Processes. Academy of Management Annals, 7 (1), pp 775-819

Gershenfeld, N., Krikorian, R., Cohen, D. (2004). The Internet of Things. Scientific American, October, pp 76-81.

Gioia, D.A., Chittipeddi, K. (1991). Sensemaking and Sensegiving in Strategic Initiation. Strategic Management Journal, 12, pp 433-448

Gioia, D.A., Price, K.N., Hamilton, A.L., Thomas, J.B. (2010). Forging an Identity: An Insider-Outsider Study of Processes Involved in the Formation of Organizational Identity. Administrative Science Quarterly, 55, pp 1-46

Gioia, D.A., Corley, K.G., Hamilton, A.L. (2012). Seeking Qualitative Rigor in Inductive Research: Notes on the Gioia Methodology. Organizational Research Methods, 16 (1), pp 15-31

Glikman, P., Glady, N. (2015). What’s the value of your data? TechCrunch, October 13

Gimenez, C., Sierra, V., Rodon, J. (2012). Sustainable Operations: Their impact on the triple bottom line. Int. J. Production Economics, 140, pp 149-159

Gladwell, M. (2000). The Tipping Point. Little, Brown and Company. ISBN 0-316-31696-2

Glaser, B.G., Strauss, A.L. (1967,1995,1999). The Discovery of Grounded Theory: Strategies for Qualitative Research. AldineTransaction, Transaction Publishers, ISBN: 0-202-30260-1

Grant, R.M. (1991). The Resource-Based Theory of Competitive Advantage: Implications for Strategy Formulation. California Management Review, Spring, pp 114-135

Grant, R.M. (1996). Toward a Knowledge-Based Theory of the Firm. Strategic Management Journal. 17, Winter Special Issue, pp 109-122

Grant, R.M. (1997). The Knowledge-based View of the firm: Implications for Management Practice. Long Range Planning, 30 (3), pp 450-454

Grover, V., Chiang, R.H.L., Liang, T-P., Zhang, D. (2018). Creating Strategic Business Value from Big Data Analytics: A Research Framework. Journal of Management Information Systems, 35 (2), pp 388-423

Gupta, V., Peter, E., Miller, T., Blyden, K. (2002). Implementing a distribution-network decision-support systems at Pfizer/Warner-Lambert, Interfaces, 32 (4), pp 28-45

Hair, J.F. Jr., Black, W.C., Babin, B.J., Anderson, R.E. (2010). Multivariate Data Analysis. Seventh Edition, Prentice Hall, Pearson. ISBN-13: 9780138132637

Hamel, G. (2007). The Future of Management. First edition, Harvard Business School Press, Boston, ISBN: 9781422102503

Harper, G.R., Utley, D.R. (2001). Organizational Culture and Successful Information Technology Implementation. Engineering Management Journal, 13 (2), pp 11-15

Harry, M., Schroeder, R. (2000). Six Sigma: The breakthrough management strategy revolutionizing the world’s top organizations. Doubleday Publishing, a Division of Random House, Inc., New York, ISBN: 0385494378

Heath, C. (2016). Foreword. In Lindstrom, M., Small Data: The Tiny Clues That Uncover Huge Trends. St. Martin’s Press, New York

Heisig P., Vorbeck J., Niebuhr J. (2001) Intellectual Capital. In: Mertins, Heisig, and Vorbeck (eds) Knowledge Management, Springer International Series, Berlin

Helfat, C.E., Peteraf, M.A. (2003). The Dynamic Resource-Based View: Capability Lifecycles. Strategic Management Journal, 24, pp 997-1010

Helfat, C.E., Winter, S.G. (2011). Untangling Dynamic and Operational Capabilities: Strategy for the (N)ever-Changing World. Strategic Management Journal, 32, pp 1243-1250.

Henfridsson, O., Lind, M. (2016). Information systems strategizing, organizational sub-communities, and the emergence of a sustainability strategy. Journal of Strategic Information Systems, 23, pp 11-28

Henke, N., Libarikian, A., Wiseman, B. (2016). Straight talk about big data. McKinsey Quarterly, October

Higson, C., Waltho, D. (2009). Valuing information as an asset. SAS White paper, London

Hong, P. (2000). Knowledge Integration in Product Development. PhD Dissertation, The University of Toledo.

Hong, P., Doll, W.J., Nahm, A., Li, X. (2004a). Knowledge Sharing in integrated product development. European Journal of Innovation Management, 7 (2), pp 102-112.

Hong, P., Doll, W.J., Revilla, E., Nahm, A.Y. (2011). Knowledge sharing and strategic fit in integrated product development projects: An empirical study. International J. Production Economics, 132, pp 186-196

Hong, P., Park, YW. (2014). Building Network Capabilities in Turbulent Competitive Environments: Business Success Stories from the BRICs. CRC Press, Taylor & Francis Group. ISBN-13: 9781466515758

Hong, P., Stout, B.D. (2017). Discussions on survey items and construct definitions for dissertation research

Hoerl, R.W., Snee, R.D., DeVeaux, R.D. (2014). Applying statistical thinking to ‘Big Data’ problems. WIREs Comput Stat, 6, pp 222-232, doi:10.1002/wics.1306

Holland, S., Gaston, K., Gomes, J. (2003). Critical success factors for cross-functional teamwork in product development. International Journal of Management Reviews, 2 (3), pp 231-259

Hooper, D. Coughlan, J., Mullen, M.R. (2008). Structural Equation Modeling: Guidelines for Determining Model Fit. The Electronic Journal of Business Research Methods, 6 (1), pp 53-60. Online at www.ejbrm.com

Hu, L., Bentler, P.M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, pp 1-55

Hu, L., Bentler, P.M. (1995). Evaluating model fit. In Structural Equation Modeling: Concepts, issues, and applications, pp 76-99. Sage Publications: London. ISBN: 9780803953185

Hunt, E. (1995). Will We Be Smart Enough? A Cognitive Analysis of the Coming Workforce. Russell Sage Foundation. ISBN 9781610443005

Hunt, S. (1997). Competing Through Relationships: Grounding Relationship Marketing in Resource-Advantage Theory. Journal of Marketing Management, 13, pp 431-445

Hwang, D., Yang, M., Hong, P. (2015). Mediating effect of IT-enabled capabilities on competitive performance outcomes: An empirical investigation of ERP implementation. Journal of Engineering and Technology Management, 36, pp 1-23

Ivert, L.K., Jonsson, P. (2010). The potential benefits of advanced planning and scheduling systems in sales and operations planning. Industrial Management & Data Systems, 110 (5), pp 659-681

Ivert, L.K., Jonsson, P. (2014). When should advanced planning and scheduling systems be used in sales and operations planning? International Journal of Operations and Production Management. 34 (10), pp 1338-1362

Jamwal, A. (2016). The Fourth Industrial Revolution: Challenges for Enterprises and Their Stakeholders. IoT and the Digitization of Manufacturing. Industry Week, April 18. https://www.industryweek.com/emerging-technologies/fourth-industrial-revolution-challenges-enterprises-and-their-stakeholders

Johnson, J.S., Friend, S.B., Lee, H.S. (2017). Big Data Facilitation, Utilization, and Monetization: Exploring the 3Vs in a New Product Development Process. Journal of Product Innovation Management, 34 (5), pp 640-658.

Jöreskog, K.G., Sörbom, D. (1996). LISREL 8: Users reference guide. Second Edition Chicago: Scientific Software. ISBN: 9780894980404

Kandemir, D., Hult, G.T.M. (2005). A conceptualization of an organizational learning culture in international joint ventures. Industrial Marketing Management, 34, pp 430-439

Kane, G.C., Palmer, D., Phillips, A.N., Kiron, D., Buckley, N. (2017). Achieving Digital Maturity. MIT Sloan Management Review, Research Report #59181

Kaplan, R.S., Norton, D.P. (1992). The Balanced Scorecard: Measures that Drive Performance. Harvard Business Review, January-February, pp 71-79

Kaushik, A. (2009). Slay the Analytics Data Quality Dragon & Win Your HiPPO’s Love! Occam’s Razor online https://www.kaushik.net/avinash/10-tips-best-practices-overcome-web-metrics-data-quality-challenge/

Kenny, D.A. (2018). Measuring Model Fit. Online davidkenny.net

Kenny, D.A. (2018). Mediation and Moderation. Online davidkenny.net

Khatri, V., Brown, C.V. (2010). Designing Data Governance. Communications of the ACM, 53 (1), pp 148-152

Kim, G., Shin, B., Kwon, O. (2012). Investigating the value of sociomaterialism in conceptualizing IT capability of a firm. Journal of Management Information Systems, 12, pp 327-362

Kiron, D., Prentice, P.K., Ferguson, R.B. (2014), Raising the Bar with Analytics. MIT Sloan Review, 55 (2) pp 29-32

Kiron, D., Prentice, P.K., Ferguson, R.B. (2014), The Analytics Mandate: Research Report. MIT Sloan Review, May

Klahr, R., Shah, J.N., Sheriffs, P., Rossington, T., Pestell, G., Ipsos MORI Social Research Institute, in conjunction with Button, M., Wang, V., Institute of Criminal Justice Studies, University of Portsmouth (2017). Cyber security breaches survey, online https://www.ipsos.com/sites/default/files/2017-04/sri-cybersecurity-breaches-survey-2017.pdf

Kletti, J. (2007). Manufacturing Execution Systems – MES. Springer-Verlag Heidelberg. ISBN: 978-3-540-49743-1

Klotz, F. (2017). The Unique Challenges of Cross-Boundary Collaboration, an Interview with Amy Edmondson. MIT Sloan Management Review. http://mitsmr.com/2BA7UM4

Kupsch, P., Marr, R., Picot, A. (1991). Innovationswirtschaft (Innovation Economy). In E. Heinen (ed.), Industriebetriebslehre: Entscheidungen im Industriebetrieb (Industrial Management: Decisions in the Industrial Enterprise). Wiesbaden: Gabler

Kwon, O., Lee, N., Shin, B. (2014). Data quality management, data usage experience and acquisition intention of Big Data analytics. International Journal of Information Management, 34, pp 387-394

Lambrou, M.A. (2016). Innovation Capability, Knowledge Management and Big Data Technology: A Maritime Business Case. International Journal of Advanced Corporate Learning, 9 (2) pp 40-44

Langley, A. (1999). Strategies for Theorizing from Process Data. Academy of Management Review, 24 (4), pp 691-710

Laney, D. (2011). Infonomics: The Economics of Information and Principles of Information Asset Management. The Fifth MIT Information Quality Symposium, July 13-15

Laughlin, S.P. (1999). An ERP Game Plan. Journal of Business Strategy, 20 (1), pp 32-37.

LaValle, S., Lesser, E., Shockley, R., Hopkins, M.S., Kruschwitz, N. (2011). Big Data, Analytics and the Path From Insights to Value. MIT Sloan Management Review 52 (2), pp 21-31

Larson, B. (2012). The many levels of SQL Server 2012 business intelligence. TechTarget, https://searchsqlserver.techtarget.com/feature/Excerpt-The-many-levels-of-SQL-Server-2012-business-intelligence

Larson, D., Chang, V. (2016). A review and future direction of agile, business intelligence, analytics and data science. International Journal of Information Management, 36, pp 700-710

Laskowski, N. (2014). Infonomics treats data as a business asset. TechTarget CIO, https://searchcio.techtarget.com/feature/Infonomics-treats-data-as-a-business-asset

LeCun, Y., Bengio, Y., Hinton, G. (2015). Deep Learning. Nature, 521, pp 436-444, May 28

Lee, C., Lee, K., Pennings, J.M. (2001). Internal Capabilities, External Linkages and Performance: A Study on Technology-based ventures. Strategic Management Journal, 22 (6-7), pp 615-640

Lee, S.U., Zhu, L., Jeffery, R. (2018). A Data Governance Framework for Platform Ecosystem Process Management. Springer Nature Switzerland AG, pp 211-227

Lee, I., Lee, K. (2015). The Internet of Things (IoT): Applications, investments, and challenges for enterprises. Business Horizons, 58, pp 431-440

Li, Y., Hou, M., Liu, H., Liu, Y. (2012). Towards a theoretical framework of strategic decision, supporting capability and information sharing under the context of the Internet of Things. Information Technology Management, 13, pp 205-216

Li, Y., Wang, M., van Jaarsveld, D.D., Lee, G.K., Ma, D.G. (2018). From Employee-experienced High-involvement Work System to Innovation: An Emergence-based Human Resource Management Framework. Academy of Management Journal, 61 (5), pp 2000-2019

Lindstrom, M. (2016). Small Data: The Tiny Clues That Uncover Huge Trends. St. Martin’s Press, New York ISBN: 9781250080684

Luthy, D.H. (1998). Intellectual Capital and Its Measurement. Utah State University

Lindau, R.A. (1997). Automatic data capture and its impact on productivity. International J. Production Economics, 52, pp 91-103

Mahmood, Z. (2016). Connectivity Frameworks for Smart Devices. Springer International Publishing, Switzerland. ISBN: 9783319331249

Mangelsdorf, M.E. (2017). What Executives Get Wrong About Cybersecurity. MIT Sloan Management Review, 58 (2) pp 22-24

Manzoor, A. (2016). Securing Device Connectivity in the Industrial Internet of Things (IoT). In Connectivity Frameworks for Smart Devices. Springer International Publishing, Switzerland ISBN: 9783319331249

Marr, B. (2017). Data-Driven Decision Making: Beware of The HIPPO Effect! Forbes online. https://www.forbes.com/sites/bernardmarr/2017/10/26/data-driven-decision-making-beware-of-the-hippo-effect/#4d32009480f9

Mason, R.O. (1978). Measuring Information Output: A Communication Systems Approach. Information and Management, 1 (4), pp 219-234

Matthias, O., Fouweather, I., Gregory, I., Vernon, G.A. (2017). Making Sense of Big Data – can it transform operations management? International Journal of Operations & Production Management, 37 (1), pp 37-55

Mauboussin, M.J. (2012). The True Measures of Success. Harvard Business Review, Spring

McAfee, A., Brynjolfsson, E. (2012). Big Data: The Management Revolution. Harvard Business Review, October, pp 60-68

MacCallum, R.C., Browne, M.W., Sugawara, H.M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, pp 130-149.

McCutcheon, D.M., Meredith, J.R. (1993). Conducting case study research in operations management. Journal of Operations Management, 11, pp 239-256

McDermott, C.M., Stock, G.N. (1999). Organizational culture and advanced manufacturing technology implementation. Journal of Operations Management, 17, pp 521-533.

Miles, M.B., Huberman, M.A. (1994). Qualitative Data Analysis, 2nd edition. Sage Publications

Miller, D., Shamsie, J. (1996). The Resource-based View of the Firm in Two Environments: The Hollywood Film Studios from 1936 to 1965. The Academy of Management Journal, 39 (3), pp 519-543

Moges, H.T., Dejaeger, K., Lemahieu, W., Baesens, B. (2013). A Multidimensional Analysis of Data Quality for Credit Risk Management: New Insights and Challenges. Information and Management, 50 (1), pp 43-58

Moody, D., Walsh, P. (1999). Measuring the Value of Information: An Asset Valuation Approach. European Conference on Information Systems

Mooi, E., et al. (2018). Market Research, Chapter 7. Springer Texts in Business and Economics. DOI 10.1007/978-981-10-5218-7_7

Moon, B-J. (2013). Antecedents and outcomes of strategic thinking. Journal of Business Research, 66, pp 1698-1708

Moore, G.C., Benbasat, I. (1991). Development of an Instrument to Measure Perceptions of Adopting an Information Technology Innovation. Information Systems Research, 2 (3), pp 192-222

Muzumdar, M., Fontanella, J. (2007). The Secrets to S&OP Success. Supply Chain Management Review, April, pp 34-51

Nambisan, S., Sawhney, M., (2007). A Buyer’s Guide to the Innovation Bazaar. Harvard Business Review, June, pp 109-118

Nonaka, I. (1994). A Dynamic Theory of Organizational Knowledge Creation. Organization Science, 5 (1), pp 14-37

O’Leary, D. (2000). Enterprise resource planning: systems, life cycle, electronic commerce, and risk. Cambridge University Press

O’Leary-Kelly, S.W., Flores, B.E. (2002). The integration of manufacturing and marketing/sales decisions: impact on organizational performance. Journal of Operations Management, 20, pp 221-240

O’Reilly, C.A., Chatman, J., Caldwell, D.F. (1991). People and Organizational Culture: A Profile Comparison Approach to Assessing Person-Organization Fit. Academy of Management Journal, 34 (3), pp 487-516

Olhager, J., Rudberg, M., Wikner, J. (2001). Long-term capacity management: Linking the perspectives from manufacturing strategy and sales and operations planning. Int. J. Production Economics, 69, pp 215-225

Olhager, J., Selldin, E. (2007). Manufacturing planning and control approaches: market alignment and performance. International Journal of Production Research, 45 (6), pp 1469-1484

Orlikowski, W.J., (1992). The duality of technology: Rethinking the concept of technology in organizations. Organization Science, 3 (2), pp 398-426

Orlikowski, W.J. (1996). Improvising Organizational Transformation Over Time: A Situated Change Perspective. Information Systems Research, 7 (1), pp 63-92

Orlikowski, W.J., Iacono, C.S. (2001). Research Commentary: Desperately Seeking the “IT” in IT Research – A Call to Theorizing the IT Artifact. Information Systems Research, 12 (2), pp 121-134

Orr, J.C. (2012). Data Governance for the Executive. Senna Publishing, LLC. ISBN-13: 9780615531915

Otto, B. (2013). On the Evolution of data governance in firms: the case of Johnson & Johnson consumer products in North America. In Handbook of Data Quality, Springer, Berlin, Heidelberg

Park, Y., Hong, P., Fujimoto, T. (2017). Reshoring Strategy: Case Illustrations of Japanese Manufacturing Firms. Found in Reshoring of Manufacturing, Measuring Operations Performance. Springer International Publishing, Chapter 2 Literature Survey, DOI 10.1007/978-3-319-38883-4_7

Parker, G. (1994). Cross-Functional Collaboration. Training & Development, October, pp 49-53

Parmenter, D. (2015). Key Performance Indicators: Developing, Implementing, and Using Winning KPIs. John Wiley & Sons, Inc., Hoboken, New Jersey. ISBN: 9781118925102

Parnell, G.S., Bresnick, T.A. (2013). Introduction to Decision Analysis. Found in The Handbook of Decision Analysis, John Wiley & Sons, Inc. ISBN: 9781118173138

Penrose, E. (1959, 2009). The Theory of the Growth of the Firm. Fourth Edition, Oxford University Press

Peppard, J., Ward, J. (2004). Beyond strategic information systems: towards an IS capability. Journal of Strategic Information Systems, 13, pp 167-194

Pipino, L., Lee, Y.W., Wang, R.Y. (2002). Data Quality Assessment. Communications of the ACM, 45 (4), pp 211-218

Porras, J.I., Silvers, R.C. (1991). Organization Development and Transformation. Annual Review of Psychology, 42, pp 51-78

Porter, M.E., Millar, V.E. (1985). How information gives you competitive advantage. Harvard Business Review, July-August, pp 149-160

Porter, M.E. (1991). Towards a Dynamic Theory of Strategy. Strategic Management Journal, 12, pp 95-117

Porter, M.E., Heppelmann, J.E. (2015). How Smart, Connected Products Are Transforming Companies. Harvard Business Review, October, pp 97-114

Prahalad, C.K., Hamel, G. (1990). The Core Competence of the Corporation. Harvard Business Review, May-June

Prahalad, C.K., Hamel, G. (1994). Competing for the future. Harvard Business School Press, Boston

Ransbotham, S. (2015). Better Decision Making with Objective Data is Impossible, Found in Competing with Data & Analytics, MIT Sloan Management Review, July 28, 2015

Ransbotham, S., Kiron, D., Prentice, P.K. (2016). Beyond the Hype: The Hard Work Behind Analytics Success. MIT Sloan Management Review, Spring

Ross, J.W., Beath, C.M., Quaadgras, A. (2013). You May Not Need Big Data After All. Harvard Business Review, December, pp 90-98

Rothrock, R.A., Kaplan, J., Van der Oord, F. (2018). The Board’s Role in Managing Cybersecurity Risks. MIT Sloan Management Review. 59 (2), pp 12-15

Rowley, J. (2006). The wisdom hierarchy: representations of the DIKW hierarchy. Journal of Information Science, 33 (2), pp 163-180

Rubin, R.S., Munz, D.C., Bommer, W.H. (2005). Leading from Within: The effects of emotion recognition and personality on transformational leadership behavior. Academy of Management Journal, 48 (5), pp 845-858

Schermelleh-Engel, K., Moosbrugger, H., Müller, H. (2003). Evaluating the Fit of Structural Equation Models: Tests of Significance and Descriptive Goodness-of-Fit Measures. Methods of Psychological Research Online, 8 (2), pp 23-74

Schienstock, G. (2009). Organizational Capabilities: Some reflections on the concept. IAREG – Intangible Assets and regional Economic Growth, Research Unit for Technology, Science, and Innovation Studies, University of Tampere

Schmidhuber, J. (2014). Deep Learning in Neural Networks: An Overview. Technical Report IDSIA-03-14 / arXiv:1404.7828v4 [cs.NE]

Schoenherr, T., Swink, M. (2012). Revisiting the arcs of integration: Cross-validations and extensions. Journal of Operations Management, 30, pp 99-115.

Schwab, K. (2016). The Fourth Industrial Revolution. World Economic Forum. ISBN-13: 9781944835002

Shields, B. (2017). Integrating Analytics in Your Organization: Lessons from the Sports Industry. MIT Sloan Management Review

Short, J.E., Todd, S. (2017). What’s Your Data Worth? MIT Sloan Management Review, Spring

Simmonds, M. (2018). Instilling a culture of data security throughout the organisation. In Network Security, June, pp 9-12

Singh, J.V. (1986). Performance, Slack, and Risk taking in Organizational Decision Making. Academy of Management Journal, 29 (3), pp 562-585

Slinger, G., Morrison, R. (2014). Will Organization Design be Affected by Big Data? Journal of Organization Design, 3 (3), pp 17-26

Soares, S., (2012). Big Data Governance: An Emerging Imperative. First edition, MC Press. ISBN: 9781583473771

Stampfl, G. (2014). The Process of Business Model Innovation. Springer Gabler, Vienna, Austria. ISBN: 9783658112660

Steiger, J.H. (1990). Structural model evaluation and modification: An interval estimation approach. Multivariate Behavioral Research, 25, pp 173-180

Stewart, W.H., Watson, W.E., Carland, J.C., Carland, J.W. (1997). A Proclivity for Entrepreneurship: A comparison of entrepreneurs, small business owners, and corporate managers. Journal of Business Venturing, 14, pp 189-214

Strauss, A., Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Thousand Oaks, CA, US: Sage Publications, Inc.

Stuart, I., McCutcheon, D., Handfield, R., McLachlin, R., Samson, D. (2002). Effective case research in operations management: a process perspective. Journal of Operations Management, 20, pp 419-433

Sveiby, K-E., Arnell, E., Vikström, S. (1990). The Invisible Balance Sheet. www.researchgate.net/publication/301548879

Tangen, S. (2004). Demystifying productivity and performance. International Journal of Productivity and Performance Management, 54 (1), pp 34-46

Tarafdar, M., Tu, Q., Ragu-Nathan, T.S. (2010-2011). Impact of Technostress on End- user Satisfaction and Performance. Journal of Management Information Systems. 27 (3), pp 303-334.

Teece, D.J., Pisano, G., Shuen, A. (1997). Dynamic Capabilities and Strategic Management. Strategic Management Journal, 18 (7), pp 509-533

Teece, D.J. (2007). Explicating Dynamic Capabilities: The Nature and Microfoundations of (Sustainable) Enterprise Performance, Strategic Management Journal, 28, pp 1319-1350

Tetlock, P.E., Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. Crown Publishing Group, New York. ISBN: 9780804136693

Thaler, R.H. (2015). Misbehaving: The making of behavioral economics. W.W. Norton & Company, New York, ISBN: 9780393080940

Thiele, T., Stiehm, S., Richert, A., Jeschke, S. (2016). Data-Driven Organization Engineering: Detection of Innovation Synergies With Data Analytics. 13th International Conference on Intellectual Capital, Knowledge Management & Organisational Learning, ACPI

Thome, A.M.T., Scavarda, L.F., Fernandez, N.S., Scavarda, A.J., (2011). Sales and operations planning and the firm performance. International Journal of Productivity and Performance Management, 61 (4), pp 359-381.

Thome, A.M.T., Sousa, R.S., Scavarda do Carmo, L.F.R.R. (2014). The impact of sales and operation planning practices on manufacturing operational performance. International Journal of Production Research, 52 (7), pp 2108-2121

Thompson, R.L., Higgins, C.A., Howell, J.M. (1991). Personal Computing: Toward a Conceptual Model of Utilization. MIS Quarterly, 15 (1), pp 124-143

Tornatzky, L., Fleischer, M. (1990). The process of technology innovation. Lexington Books, Lexington, MA.

Tuomikangas, N., Kaipia, R. (2014). A coordination framework for sales and operations planning (S&OP): Synthesis from the literature. International J. Production Economics. 154, pp 243-262.

Uckelmann, D., Harrison, M., Michahelles, F. (2011). An architectural approach towards the future of the Internet of Things. In Architecting the Internet of Things, pp 1-24, Springer, Berlin

Vahs, D., Burmester, R. (2002). Innovationsmanagement: Von der Produktidee zur erfolgreichen Vermarktung (Innovation Management: From Product Idea to Successful Marketing). Stuttgart: Schäffer-Poeschel

Vales, E. (2007). Employees CAN Make a Difference! Involving Employees in Change at Allstate Insurance. Organizational Development Journal, 25 (4), pp 27-31

Venkatesh, V., Morris, M.G., Davis, G.B., Davis, F.D. (2003). User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly, 27 (3), pp 425-478

Voss, C., Tsikriktsis, N., Frohlich, M. (2002). Case Research in Operations Management. International Journal of Operations & Production Management, 22 (2), pp 195-219

Wade, M., Hulland, J. (2004). Review: The Resource-Based View and Information Systems Research: Review, Extension, and Suggestions for Future Research. MIS Quarterly, 28 (1), pp 107-142

Wang, C.L., Ahmed, P.K. (2004). Dynamic Capabilities: A Review and Research Agenda. International Journal of Management Reviews, 9 (1), pp 31-51

Weick, K.E. (1979). The Social Psychology of Organizing, Second Edition. McGraw-Hill. ISBN: 978-0075548089

Weick, K.E., Sutcliffe, K.M., Obstfeld, D. (2005). Organizing and the Process of Sensemaking. Organization Science, 16 (4), pp 409-421

Wellers, D., Somaini, J. (2017). Securing Your Digital Future: Cyber Trust as Competitive Advantage. SAP BrandVoice, October 13

Wernerfelt, B. (1984). A Resource-based View of the Firm. Strategic Management Journal, 5, pp 171-180

Wessel, M. (2016). How big data is changing disruptive innovation. Harvard Business Review

Winter, S.G. (2003). Understanding Dynamic Capabilities. Strategic Management Journal. 24, pp 991-995

Wittmer, J. (2014). Organizational Behavior Issues in Implementing Technologies, Lecture notes on Leadership and Organization Culture, MFGM 8830. University of Toledo

Woodall, W.H., Montgomery, D.C. (1999). Research Issues and Ideas in Statistical Process Control, Journal of Quality Technology, 31 (4), pp 376-386

Yang, M.G., Hong, P., Modi, S.B. (2011). Impact of lean manufacturing and environmental management on business performance: An empirical study on manufacturing firms. International J. Production Economics, 129, pp 251-261.

Yin, R.K. (2014). Case Study Research: Design and Methods, 5th edition. Sage Publications

Zack, M.H. (1999). Developing a Knowledge Strategy. California Management Review, 41 (3), pp 125-145
