
Systems Design Thinking: Identification and Measurement of Attitudes for Systems Engineering, Systems Thinking, and Design Thinking

by

Melissa T. Greene

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Design Science) in the University of Michigan 2019

Doctoral Committee:

Professor Richard Gonzalez, Co-Chair
Professor Panos Y. Papalambros, Co-Chair
Associate Professor Eytan Adar
Dr. Anna-Maria Rivas McGowan, National Aeronautics and Space Administration
Professor Oscar Ybarra

Melissa T. Greene

[email protected]

ORCID iD: 0000-0002-7063-2500

© Melissa T. Greene 2019

DEDICATION

To Dad

Love, Tess


ACKNOWLEDGMENTS

First and foremost, my sincerest thanks to Anna McGowan for recognizing my potential, introducing me to like minds at NASA Langley, and connecting me with the thought leaders in Design Science at the University of Michigan. Without you, I’d have never found “my people.”

To Panos Papalambros, my advisor, chair and mentor: you have helped me grow intellectually, professionally, and personally. Thank you for challenging me to be the best version of myself, for encouraging me to be confident, and for your empathy and kindness through it all. It has been an honor and a privilege to learn from you.

To Rich Gonzalez: learning to leverage the incredible amount of knowledge, experience, and skill available to me at Michigan was and continues to be a challenge. Thank you for allowing me the time and space to figure it out. I sincerely appreciate your intellectual contributions to the work and your patience and support as I learn how to communicate.

To DESCI and ODE, past and present: thank you all for being wonderful friends and colleagues. I have many fond memories and have learned so much from each of you. To Vignesh and Sanjana, thank you for your intellectual contributions. Aria, thank you for sharing this journey with me – what a long, strange trip it’s been.

Finally, and most importantly, my deepest gratitude to my family, for believing in me every step of the way. Mom and Dad, thank you for your unwavering support through every decision, pivot, change of plan, and change of location over the last ten years. You have enabled me to realize my talents and fulfill my potential. I will be forever grateful. To Doug and Steve, my brothers and best friends, thank you for always reminding me what’s truly important, for keeping me grounded, and for being a constant source of fun and light in my life. To Patty and Ryanne, my soul sisters, thank you for being the wonderful women you are. I’m so thankful to know you.

To Andy, my boots on the ground: thank you for living the “every day” with me, for truly understanding what it took to get here, and for helping me make it happen. You are the best teammate and partner and I am so grateful for you.

To all: thanks again. I couldn’t have done it without you.


TABLE OF CONTENTS

DEDICATION ...... ii

ACKNOWLEDGMENTS ...... iii

LIST OF TABLES ...... ix

LIST OF FIGURES ...... xi

LIST OF APPENDICES ...... xii

ABSTRACT ...... xiii

CHAPTER

I. What is “Systems Design Thinking?” ...... 1

1.1 Introduction ...... 1

1.2 Research Questions and Methodology...... 7

1.2.1 Understanding Systems Design Thinking Attitudes ...... 8

1.2.2 Measuring Systems Design Thinking Attitudes ...... 10

1.3 Dissertation Overview ...... 10

II. Understanding Systems Design Thinking Attitudes ...... 13

2.1 Introduction ...... 13

2.2 Literature Review...... 14

2.2.1 An Introduction to the “Systems Approach” for Dealing with Complexity ...... 14


2.2.2 Systems Engineering ...... 16

2.2.3 Systems Thinking...... 19

2.2.4 Design Thinking...... 25

2.3 Developing a Codebook for Interview Analysis ...... 27

2.3.1 Systems Engineering Codes ...... 28

2.3.2 Systems Thinking Codes...... 30

2.3.3 Design Thinking Codes...... 32

2.4 From Frameworks to Attitudes: Interviews with Systems Engineers ...... 34

2.4.1 Method ...... 35

2.4.2 Analysis...... 35

2.4.3 Findings...... 38

2.5 Summary ...... 41

III. Modeling Systems Design Thinking Attitudes ...... 43

3.1 Introduction ...... 43

3.2 Study 1: Technical, Organizational, and Social Systems Thinking ...... 45

3.2.1 Scale Development ...... 45

3.2.2 Pilot Test, Factor Analysis, and Results for Study 1 ...... 48

3.2.3 Discussion ...... 52

3.3 Study 2: Systems Thinking and Design Thinking ...... 55

3.3.1 Comparing Systems Thinking and Design Thinking ...... 56


3.3.2 Scale Development ...... 59

3.3.3 Pilot Test, Factor Analysis, and Results for Study 2 ...... 62

3.3.4 Discussion ...... 64

3.4 Study 3: Systems Engineering and Design Thinking ...... 65

3.4.1 Comparing Systems Engineering and Design Thinking ...... 66

3.4.2 Scale Development and Pilot Test ...... 70

3.4.3 Exploratory Factor Analyses...... 77

3.4.4 Confirmatory Factor Analyses ...... 83

3.4.5 Multigroup CFA...... 87

3.4.6 Tests for Measurement Invariance ...... 91

3.4.7 Additional Qualitative Findings ...... 92

3.4.8 Discussion ...... 94

3.5 Summary ...... 96

IV. Validating the Systems Design Thinking Scale...... 98

4.1 Introduction ...... 98

4.2 Behavioral Research in Systems Engineering and Design Thinking ...... 99

4.3 Pilot Validation Study ...... 101

4.3.1 Overview and Objectives ...... 101

4.3.2 Methods...... 101

4.3.3 Behavioral Task Selection ...... 102


4.3.4 Study Population and Recruitment Strategy ...... 104

4.3.5 Pilot Test, Factor Analysis, and Results for Validation Study ...... 104

4.3.6 Findings and Lessons Learned ...... 107

4.4 Validation Opportunities ...... 109

4.5 Summary ...... 110

V. Conclusion ...... 111

5.1 Summary of Dissertation ...... 111

5.2 Contributions to Design Science ...... 112

5.3 Limitations and Opportunities for Future Work ...... 113

Appendices ...... 116

Bibliography ...... 126


LIST OF TABLES

Table

2.1 Five themes were coded in all ten interviews…………………………………………….36

2.2 Design thinking themes, number of interviews including each theme, and total number of references to each theme……..…………………………………………………………………..37

2.3 Systems thinking themes, number of interviews including each theme, and total number of references to each theme…………………………………………………………………………37

2.4 Systems engineering themes, number of interviews including each theme, and total number of references to each theme...…………………………………………………………...38

3.1 Technical systems thinking attitude items tested in Study 1……………………………..46

3.2 Organizational systems thinking attitude items tested in Study 1………………………..47

3.3 Social systems thinking attitude items tested in Study 1…………………………………47

3.4 Post-hoc exploratory factor analysis results for technical systems thinking attitude items in Study 1……………………………………………………………………………………..49

3.5 Post-hoc exploratory factor analysis results for social systems thinking attitude items in Study 1…………………………………………………………………………………………..50

3.6 Post-hoc exploratory factor analysis results for organizational systems thinking attitude items in Study 1 …………………………………………………………………………………51

3.7 Summary of factor structure after post-hoc exploratory factor analysis (Study 1)………51

3.8 Example systems thinking attitude items and themes from Study 2 …………………….61

3.9 Example design thinking attitude items and themes from Study 2………………………62

3.10 Two-factor EFA results with varimax rotated loadings (Study 2)………………………63

3.11 Systems engineering attitude items from Study 3.………………………………………71

3.12 Design thinking attitude items from Study 3……………………………………………72

3.13 Posting the Systems Design Thinking Scale on Reddit……...………………………….77

3.14 Varimax rotated loadings for four factors (Study 3)…………………………………….80

3.15 Post-hoc EFA results: Varimax rotated loadings for three factors (Study 3)……………81

3.16 Two-factor EFA results (Study 3)…….………………………………………………...82

3.17 Final factor loadings (Study 3)….………………………………………………………83

3.18 Model results from confirmatory factor analysis with 12 items…..…………………….85

3.19 Model results from confirmatory factor analysis with 9 items...………………………..86

3.20 Multigroup CFA results: Reddit vs. known expert sample……………………………...88

3.21 CFA for Reddit group…………………………………………………………………...88

3.22 CFA for known expert sample………………………………………………………...... 89

3.23 Multigroup CFA results: Entry-level vs. senior-level (combined sample)……………89

3.24 CFA for entry-level group………………………………………………………………90

3.25 CFA for senior-level group……………………………………………………………...90

4.1 Model results from confirmatory factor analysis (validation study)...………………….105

4.2 Comparison in factor loadings between validation study and Study 3 for items DT7, DT9, and DT10……………………………………………………………………….………………105

4.3 Covariances of factors and tasks………………………………………………………...106

4.4 Regression model results………………………………………………………………..106


LIST OF FIGURES

Figure

2.1 Design thinking process models from IDEO, Stanford d-school, and Google Design Sprint…………………………………………………………………………………………..27

2.2 Systems engineering process models. On the left is the INCOSE systems engineering “vee” and the NASA “systems engineering engine” is on the right….………………………….29

2.3 Systems design thinking attitude model…...……………………………………………..40

3.1 A graphical representation of the three types of systems thinking—technical, social, and organizational………………………………………………………..…………………………...45

3.2 Findings from Study 1 suggest that social systems thinking items seem to overlap with the design thinking framework………………………………………………………………………55

3.3 In Study 2, systems thinking items from Study 1 are redistributed and a new model is tested……………………………………………………………………………………………..56

3.4 Systems engineering and design thinking frameworks each include elements of systems thinking…………………………………………………………………………………………..64

3.5 Two-factor model with parameter values and standard error…………………………….84

3.6 Systems Design Thinking Classification of 458 survey participants……………………..87


LIST OF APPENDICES

Appendix

A. Systems Design Thinking Codebook…………………………………………………...117

B. Semi-Structured Interview Questions.………………………………………………….122


ABSTRACT

Systems engineering, systems thinking, and design thinking are frameworks for understanding complex problems and developing effective, holistic solutions. Each framework is comprised of assumptions, concepts, values, and practices that affect the design of products, systems, and services. In this dissertation, we explore the assumptions, concepts, values, and practices that define systems engineering, systems thinking, and design thinking, and compare them using a mixed methods approach. This dissertation also explores the existence and definition of systems design thinking—an integrated framework for systems engineering, systems thinking, and design thinking—along with the development of the Systems Design Thinking Scale. The Systems Design Thinking Scale is a 5-point Likert scale survey that measures attitudes about systems engineering, systems thinking, and design thinking, and is used to provide insight about potential relationships between these attitudes. Such a scale may be used for categorizing individuals based on these attitudes, which could be useful for informing teaming and other management decisions in design organizations.

The development of the Systems Design Thinking Scale in this dissertation was conducted as follows. First, thematic analysis of the systems engineering, systems thinking, and design thinking literature was used to generate codes that reflect core assumptions, concepts, values, and practices of each framework. These codes were then compiled into a systems design thinking codebook, and used to analyze data from semi-structured interviews with experienced systems engineers who were also recognized as strong systems thinkers by a technical leader within their organization. Interview data was used to identify common attitudes reflecting systems engineering, systems thinking, and design thinking in practice, and to generate hypotheses about how the frameworks are related. These attitudes were represented as statements on a 5-point Likert scale and distributed to a diverse sample of engineers and designers. Exploratory and confirmatory factor analysis were used to determine how well the attitudes reflected systems engineering, systems thinking, and design thinking; and to test the hypothesized relationships between these frameworks quantitatively. Ethnography informs the research throughout.

Findings suggest several nuances that distinguish systems engineering, systems thinking, and design thinking. Findings also suggest that systems thinking attitudes exist within both systems engineering and design thinking frameworks. Results from the factor analyses suggest that systems engineering and design thinking attitudes are independent, and individuals may have systems engineering attitudes, design thinking attitudes, or both. A higher correlation between these attitudes is observed for experts in engineering design.

The final version of the scale is a 9-item questionnaire about systems engineering and design thinking attitudes. An exploratory study for validating the scale is described, in which correlations between scale scores and performance on analytical reasoning and divergent thinking tasks are examined. While no significant correlation was observed between the subscales and performance on the analytical reasoning task, some correlation between the design thinking subscale and divergent thinking measure suggests that the Systems Design Thinking Scale may be useful for predicting certain behaviors. Further validation through gamification and other opportunities for future work are discussed.


CHAPTER I

What is “Systems Design Thinking?”

1.1 Introduction

Systems engineering and design thinking are methodologies for developing products, systems, and services. INCOSE, the International Council on Systems Engineering, defines systems engineering as “an interdisciplinary approach and means to enable the realization of successful systems” that “focuses on defining customer needs and required functionality early in the development cycle, documenting requirements, then proceeding with design synthesis and validation (INCOSE, 2015).” At NASA, emphasis is on “the satisfaction of stakeholder functional, physical, and operational performance requirements in the intended use environment over the planned life cycle within cost and schedule constraints (Conner, 2015).” Systems engineering arose with the increase in complexity of military-industrial systems in the 1940s. As defense projects increased in size and scope, computational tools for modeling and simulation, coordination, and scheduling were necessary for successful system design, implementation, and decommission (Yassine and Braha, 2003; Braha and Bar-Yam, 2007).

Design thinking, now commonly applied as a product development framework, also arose in response to increasing complexity. Design thinking is “a human-centered approach to innovation that draws from the designer’s toolkit to integrate the needs of the people, the possibilities of technology, and the requirements for business success (Brown, 2008).” Design thinking is a methodology for defining and solving problems. It is particularly useful for tackling problems that are ill-defined or unknown, by “understanding the human needs involved, by reframing the problem in human-centric ways, by creating many ideas in brainstorming sessions, and by adopting a hands-on approach in prototyping and testing (Dam and Siang, 2019).” The recognition of “wicked problems” in social planning and policy (Buchanan, 1992), where solutions required a great number of people to change their mindsets and behavior, helped first draw attention to the fact that comprehensive needs assessment and problem definition are critical first steps in designing successful solutions. Design thinking evolved as a way to accomplish these goals and maintain human-centered values from ideation through to embodiment.

Systems engineering and design thinking were developed with different applications, approaches, and goals in mind. However, the value of integrating their principles and processes is becoming increasingly recognized in both communities (Dym et al., 2005; IndustryWeek, 2018; Liedtka and MacLaren, 2018). Systems engineering organizations like governmental mission agencies and global manufacturing corporations are exploring opportunities for applying design thinking principles in systems engineering projects (Souza and Barnhöfer, 2015; Darrin and Devereux, 2017; McGowan et al., 2013; McGowan et al., 2017). Short courses where engineers can learn design thinking methods through a hands-on design process and reflection are common. Some organizations have dedicated research groups for developing multidisciplinary approaches that include design thinking (Grace et al., 2017). Similarly, the product design community has recognized the need for a more systematic approach as the complexity of design tasks at hand has increased (Greene et al., 2017). With the pervasiveness of "smart" technology, mechatronic products requiring a combination of mechanical, electronics, and software design also require the performance of systems engineering sub-processes for successful integration. The expansion of product-service systems, in which products and the services that support them are integrated and delivered simultaneously, has created a further need for a systematic approach to product design (Baines et al., 2007; Cavalieri and Pezzotta, 2012).

Many companies that are producing highly complex products are already doing systems engineering, although some do this more formally than others and with varying levels of completeness (IndustryWeek, 2018).

Both approaches have value, but integrating the two is not without its challenges. For some systems engineers, design thinking represents a non-traditional and unfamiliar approach. The emphasis on defining the problem over defining requirements for a solution creates dissimilar interpretations of the problem scope and project objectives (McGowan et al., 2017).

Likewise, pursuing the fully integrated scope of systems engineering processes from design to operation to decommission may seem overwhelming or unnecessary to many product designers designing on a smaller scale with less complexity and risk. However, the ultimate goal of both types of designers is the same: developing fully verified and validated products that result in higher quality and customer acceptance (IndustryWeek, 2018).

The challenge in integrating systems engineering and design thinking processes may be attributed in part to a difference in attitudes held by engineers and designers. Attitudes are sets of beliefs, emotions, and behaviors toward a particular object, person, process, or event. Attitudes develop over time as the result of experience, education, and upbringing, and can have a powerful influence over behavior (Allport, 1935). Attitudes can serve a variety of functions for individuals, such as providing general approach or avoidance tendencies, helping organize and interpret new information, protecting self-esteem, and expressing central values or beliefs (Eagly and Chaiken, 1998; Katz, 1960). An individual’s attitudes about systems engineering and design thinking can therefore influence both the implementation of these processes and their outcomes.

Why might these attitudes be different and how? According to Doob (1947), learning can account for most of an individual's attitudes. Identifying the differences between engineering education and design education can be a useful starting point for understanding the different attitudes held by each professional community. A typical undergraduate engineering science curriculum in the United States consists of mathematics, general chemistry, physics, computer science, and other introductory engineering courses (Panel on Undergraduate Engineering Education, 1986; Tribus, 2005), while design curricula are more likely to include courses like psychology, sociology, anthropology, marketing, and other social and behavioral sciences (Ilhan, 2017; Self and Baek, 2017). Students in each curriculum regularly address different types of problems and research questions, using different theoretical frameworks, knowledge, and methods. Students in each curriculum likely get some exposure to courses in the other, but courses and curricula are often not well-integrated. Design programs typically offer few engineering courses. More engineering programs are expanding to include rigorous design courses, but engagement with other disciplines is still limited, and design as it is taught in engineering curricula differs significantly from design as it is taught in design school curricula.

Engineering students learn engineering and manufacturing processes, nearly to the exclusion of social, cultural, economic, and other design needs. Design students learn aesthetic design, design processes, and design theory, but often do not learn the engineering and manufacturing processes required for embodiment.

After graduation, job function, responsibilities, and training are also different. Systems engineering is concerned with assessing the technical and logistical feasibility of potential solutions to given problems. Responsibilities include defining system requirements and developing verification and validation plans for those requirements, with the goal of delivering a fully integrated product that is reliable and safe in all operational conditions. Product designers are more likely to consider business viability first. In the product design context, identifying the right problem to solve is important and can be very lucrative. For this reason, product design processes are more likely to include market research, user needs assessment, and other human-centered analysis, and product design organizations are more likely to encourage an expansive, socially and economically rich problem analysis and problem definition.

Because of these differences, systems engineers are often characterized as methodical, analytical, and data-driven, while designers are often described as having a more flexible, creative, and human-centered perspective. With the increased awareness that systems engineering and design thinking need each other, the effects of a possibly persisting distinction on systems engineers’ attitudes toward design, and designers’ attitudes toward systems engineering, are not well understood. This dissertation seeks to explore these attitudes as they relate to a concept we will call systems design thinking. Systems design thinking is a hypothesized set of attitudes that reflect an integrated perspective on systems engineering, design thinking, and systems thinking, a related but distinct framework. Systems engineers who employ a human-centered approach during complex systems design are believed to have systems design thinking attitudes. Product designers who intuitively follow systems engineering processes to integrate, verify, and validate complex consumer products may also have systems design thinking attitudes.

In this work, the term ‘design thinking’ is used to represent the plurality of human-centered processes that drive design decision-making. These include creativity, intuition, and empathy, and also include design activities such as prototyping and testing with users and collaborating and communicating with other designers. Systems engineering attitudes include some fundamental attitudes about engineering that may be generalizable to any engineering discipline. These include attitudes about mathematical modeling, simulation, analysis, and other engineering processes. This is because systems engineers are discipline engineers first, trained to analyze and solve problems using engineering science and mathematics. In their systems design work, they learn to make analogies and draw connections to other disciplines, and to pursue social interactions with other engineers in those disciplines, strengthening these connections and gaining additional knowledge. This “philosophy” of systems engineering, described as the “systems perspective” or “systems view,” is what distinguishes systems engineering from other engineering disciplines (Frank, 2012). The “systems view” describes the ability to identify and manage interactions between sub-systems in the technical system, as well as interactions between individuals, disciplinary working groups, and organizations, to facilitate complex systems design.

In this work, systems design thinking is explored through the development and validation of a new instrument that we will call the Systems Design Thinking Scale. The Systems Design Thinking Scale measures attitudes about systems engineering, systems thinking, and design thinking, and is used to provide insight about potential relationships between these attitudes.

Scale development in this dissertation was conducted as follows. First, thematic analysis of the systems engineering, systems thinking, and design thinking literature was used to generate codes that reflect core assumptions, concepts, values, and practices of each framework. These codes were then used to analyze data from semi-structured interviews with experienced systems engineers, who were also recognized as strong systems thinkers by a technical leader within their organization. Interview data was used to identify common attitudes reflecting systems engineering, systems thinking, and design thinking in actual practice, and to generate hypotheses about how the frameworks are related. These attitudes were represented as statements on a 5-point Likert scale and distributed to a diverse sample of engineers and designers. Factor analysis was used to determine how well the attitudes reflected systems engineering, systems thinking, and design thinking; and to test the hypothesized relationships between these frameworks quantitatively. Ethnography informs the research throughout.

In the remainder of this chapter, an overview of the research questions and methodology is presented. We describe systems engineering, systems thinking, and design thinking frameworks; how we get from frameworks to attitudes; and the approach for quantifying and measuring them. Then, the larger dissertation overview is presented.

1.2 Research Questions and Methodology

This dissertation seeks to better define both the boundaries between systems engineering, systems thinking, and design thinking frameworks, and their intersection at systems design thinking. The following research questions are addressed:

• What are the core assumptions, concepts, values, and practices of systems engineering, systems thinking, and design thinking frameworks?

• How are these frameworks related?

• What attitudes reflect these frameworks and their relationships in practice?

• Can these attitudes be represented and measured in a meaningful way?

By framework, we mean a set of assumptions, concepts, values, and practices that constitutes a way of viewing reality (American Heritage Dictionary, 2016). The goal of the dissertation is to identify some of these assumptions, concepts, values, and practices for systems engineering, systems thinking, and design thinking, and compare them using a mixed methods approach (Crede and Borrego, 2013; Watkins and Gioia, 2015).

We hypothesize that systems thinking and design thinking are or can be integrated within the context of systems engineering practice, based on our interpretation of relevant literature and observation and interviews with systems engineering practitioners. We create and describe the systems design thinking framework based on this hypothesis. Systems design thinking is an integrated framework that situates systems thinking and design thinking within the context of systems engineering practice. Our motivation for creating and studying this framework is to expand existing systems engineering process models to include relevant cognitive and social processes from systems thinking and design thinking frameworks. This is an important first step towards appropriately applying and maximizing the potential of systems thinking and design thinking as fully integrated systems engineering subprocesses.

1.2.1 Understanding Systems Design Thinking Attitudes

Qualitative methods are useful for conducting exploratory research. This dissertation describes qualitative research for exploring the relationship between systems engineering, systems thinking, and design thinking attitudes in the systems engineering context.

Informal ethnographic research informed many aspects of the dissertation (Griffin and Bengry-Howell, 2017). The research questions were developed after spending several months in a government laboratory and participating in complex systems research with engineers and technologists with varying education, experience, and training. Relationships with experienced systems engineering and design professionals were cultivated over the course of the research. These relationships were useful for “member-checking,” to ensure accuracy, credibility, validity, and transferability of the findings throughout, and to guide the development of the dissertation.

Thematic analysis of systems engineering, [engineering] systems thinking, and design thinking literature was used to identify key assumptions, concepts, values, and practices of each framework (Braun and Clarke, 2006). These assumptions, concepts, values, and practices were integrated into a codebook for understanding systems design thinking attitudes (DeCuir-Gunby, Marshall, and McCulloch, 2011). A codebook is a set of codes, definitions, and examples used as a guide to help analyze interview data. Then, semi-structured, individual interviews were conducted with a small, non-random sample of systems engineers. The codebook was used to identify attitudes that reflect systems engineering, systems thinking, design thinking, and their relationships in practice. Interviews also informed ideas and hypotheses about systems design thinking for the quantitative study described in the next section.
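To make the codebook idea concrete, the sketch below shows one way a single codebook entry might be structured for computer-assisted analysis. The code name, definition, example quote, and keywords are invented placeholders, not entries from the actual systems design thinking codebook (which appears in Appendix A).

```python
# Hypothetical structure for one codebook entry used during interview
# analysis; all content here is an invented placeholder, not taken from
# the dissertation's actual codebook (Appendix A).
codebook_entry = {
    "code": "ST-Flexibility",
    "framework": "systems thinking",
    "definition": "Statements valuing adaptable requirements or plans",
    "example_quote": "Requirements should be expected to change as we learn.",
    "keywords": ["flexible", "adapt", "change"],  # cues only, not a substitute for judgment
}

def matches(entry, excerpt):
    # Crude keyword screen standing in for a human coder's judgment.
    return any(word in excerpt.lower() for word in entry["keywords"])

print(matches(codebook_entry, "Our requirements had to adapt constantly."))  # True
```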

The individuals selected for interview were recruited for their experience as systems engineers, and also for their expertise in systems thinking as recognized by a technical leader in the organization. Although the sample was small, the experience represented was rich and diverse. Participants related to and were engaged with the research questions, and they described complex cognitive, behavioral, and organizational processes and their relationships. They articulated their thoughts, feelings, and strategies clearly, and direct quotations were often used to reflect systems engineering, systems thinking, and design thinking attitudes in the Systems Design Thinking Scale. Generalizability and transferability of these attitudes were tested quantitatively when the scale was distributed to a larger, balanced sample.


1.2.2 Measuring Systems Design Thinking Attitudes

Quantitative analysis was used to test hypotheses about systems engineering, systems thinking, and design thinking derived from the qualitative study findings. Theory and methods from psychometrics were used to model systems engineering, systems thinking, and design thinking as latent psychological constructs, measured by observable attitudes identified in interviews. Attitudes are declarative statements that reflect key assumptions, concepts, values, and practices of systems engineering, systems thinking, and design thinking. For example, the statement “system requirements should be flexible and expected to change” is an attitude that reflects systems thinking and the core value “flexibility.”

Systems engineering, systems thinking, and design thinking constructs were represented by networks of survey items (attitude statements) in a 5-point Likert scale. Numerals were assigned to each attitude statement to measure the degree to which participants agree or disagree with each attitude. Structural Equation Modeling was used to determine how well each attitude statement reflected the underlying construct, and to explore relationships between constructs in a quantitative way. We hypothesize that systems design thinkers will endorse most systems engineering, systems thinking, and design thinking attitudes. This would provide support for the existence of systems design thinking, an integrated framework consisting of systems engineering, systems thinking, and design thinking attitudes.
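As a concrete illustration, a measurement model of this kind can be specified in a few lines of code. The sketch below is illustrative only: the item names (se1–se3, dt1–dt3), the data file, and the choice of the open-source semopy package are assumptions, not the dissertation's actual items or software.

```python
# Illustrative sketch of a two-factor measurement model for Likert data,
# assuming hypothetical items se1-se3 and dt1-dt3 in likert_responses.csv.
import pandas as pd
from semopy import Model

MODEL_DESC = """
SE =~ se1 + se2 + se3
DT =~ dt1 + dt2 + dt3
SE ~~ DT
"""  # "=~" defines each latent construct's indicators; "~~" frees their covariance

responses = pd.read_csv("likert_responses.csv")  # rows: participants; 1-5 item codes

model = Model(MODEL_DESC)
model.fit(responses)
print(model.inspect())  # estimated loadings show how well items reflect constructs
```

The freely estimated SE–DT covariance is what allows the hypothesized relationship between the constructs to be tested rather than assumed.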

1.3 Dissertation Overview

In this chapter we introduced systems engineering, systems thinking, and design thinking frameworks, and proposed a new integrated framework we call systems design thinking. In Chapter 2, Understanding Systems Design Thinking Attitudes, “understanding” begins with a review of systems engineering, systems thinking, and design thinking literature. Thematic analysis is used to develop a codebook for interpreting differences between systems engineering, systems thinking, and design thinking throughout the research. An exploratory qualitative study is also described, in which semi-structured interviews are conducted with a small sample of systems engineers. The purpose of the exploratory qualitative study is to identify attitudes that reflect systems engineering, systems thinking, and design thinking frameworks, and how they are related and integrated to reflect systems design thinking in practice.

In Chapter 3, Measuring Systems Design Thinking Attitudes, findings from the qualitative studies guide quantitative comparison of systems engineering, systems thinking, and design thinking attitudes. Three factor analysis studies are described. Study 1 explores systems thinking attitudes and their relationships. Systems thinking is defined along three dimensions—technical, social, and organizational—as suggested in interview findings. Attitude statements reflecting these dimensions are derived from literature and interview findings and represented as three factors in a 5-point Likert scale. This scale was distributed to a small sample of practicing engineers and engineering researchers in industry, academia, and government. Data was analyzed using exploratory factor analysis. The data did not support the three-factor model of technical, social, and organizational systems thinking, but provided other informative results. Significant items in the social systems thinking factor closely resembled design thinking. This comparison was explored further in Study 2, in which systems thinking and design thinking are compared. In Study 2, significant systems thinking attitudes from Study 1 are retained and reorganized, and new statements are added to reflect design thinking better. Technical and organizational systems thinking attitudes grouped into one factor, and design thinking attitudes grouped into a second factor. Significant technical and organizational systems thinking items closely resemble a systems engineering framework when the systems design thinking codebook is used to interpret results. This finding is explored further in Study 3. In Study 3, systems engineering attitudes are studied as they relate to design thinking attitudes. Significant items from Study 2 are retained and improved and additional systems engineering items are added. This model was tested on a larger sample recruited through the social media platform Reddit and yielded promising results for measuring systems design thinking.
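As a concrete sketch of this analysis pipeline, the snippet below runs an exploratory factor analysis with varimax rotation on a table of Likert responses, in the spirit of the analyses reported in Studies 1–3. The file name, the number of factors, and the use of the factor_analyzer package are illustrative assumptions, not the dissertation's actual setup.

```python
# Sketch of an exploratory factor analysis with varimax rotation, assuming
# responses.csv holds one row per participant and one column per 5-point
# Likert item (hypothetical file and columns).
import pandas as pd
from factor_analyzer import FactorAnalyzer

responses = pd.read_csv("responses.csv")

fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(responses)

loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings.round(2))         # items loading strongly on a factor define it
print(fa.get_factor_variance())  # variance explained by each rotated factor
```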

Findings suggest many nuances that differentiate systems engineering, systems thinking, and design thinking. Findings also suggest that systems thinking attitudes exist within both systems engineering and design thinking frameworks. Results from the factor analyses suggest that systems engineering and design thinking attitudes are independent, and individuals may have systems engineering attitudes, design thinking attitudes, or both. A higher correlation between these attitudes is observed for experts in engineering design, suggesting that the integrated systems design thinking perspective may develop with education and experience.

Chapter 4 describes a first attempt at validating the Systems Design Thinking Scale by studying the relationship between scale scores and performance on divergent thinking and analytical reasoning tasks. While no significant correlation was observed between the subscales and performance on the analytical reasoning task, some correlation between design thinking subscale scores and performance on the divergent thinking measure suggests that the Systems Design Thinking Scale may be useful for predicting some behaviors. Additional tasks and methods are identified for their relevance and potential usefulness in validating the scale, and a validation plan is outlined and discussed. Chapter 5 summarizes the main contributions of the work and highlights some opportunities for future research.


CHAPTER II

Understanding Systems Design Thinking Attitudes

2.1 Introduction

This chapter describes a literature review and exploratory qualitative study. The goal is to begin to understand systems design thinking by understanding systems engineering, systems thinking, design thinking, and their relationship. A two-step research process is described. First, a literature review and thematic analysis are conducted with the objective of producing a codebook for systems design thinking. This codebook describes differentiating features of systems engineering, systems thinking, and design thinking frameworks, for the purpose of informing the rest of the research. In the second step, individual semi-structured interviews are conducted with a small sample of practicing systems engineers. The codebook is used to identify and interpret attitudes about each of the three frameworks and their relationships in practice.

The work described in this chapter is intended to address a gap in the existing literature. While considerable effort is dedicated to understanding the individual nuances of systems engineering, systems thinking, and design thinking frameworks, discussion about the relationships between them is limited (Greene et al., 2017). Relatively little work has been done to understand what distinguishes a “systems engineering” framework from a “systems thinking” framework, or “systems thinking” from “design thinking.” These questions will be explored throughout.


2.2 Literature Review

2.2.1 An Introduction to the “Systems Approach” for Dealing with Complexity

The history of the “systems approach” for dealing with design complexity is long and diverse. The following section provides a brief overview of several major schools of thought, beginning with Ludwig von Bertalanffy’s General Systems Theory (von Bertalanffy, 1940). von Bertalanffy’s work was quickly expanded to describe cybernetic systems (Wiener, 1948; Ashby, 1956) and dynamic systems (Forrester, 1961; Boulding, 1964; Meadows, 2000; Sterman, 2000), and has also informed systems engineering and management science (Checkland, 1981). Later work examined the application of general systems concepts to human social systems (Barker, 1968; Bateson, 1972; Luhmann, 1984; Parsons, 1951). General systems theory was somewhat useful for analyzing social and organizational systems, but traditional physics-based models did not always adequately represent human factors that influence system performance (Checkland, 1981; Forrester, 1994). Bateson, for example, argues that cybernetic principles have direct mappings in social systems (1972). Barker’s work, however, suggests that human behavior is radically situated, and that predictions about human behavior can only be made if the situation, context, and environment in which the human is operating can be sufficiently understood (1968).

The term “systems theory,” coined by Ludwig von Bertalanffy in 1937, describes the interdisciplinary study of systems in general. Systems theory emerged as an attempt to uncover patterns and principles common to all levels of all types of systems, again with an emphasis on generality. The primary goal in developing systems theory was to provide a useful framework for describing a broad range of systems using the same terminology, in contrast to existing discipline-specific systems models in biology, engineering, and psychology (von Bertalanffy, 1940). von Bertalanffy divided systems inquiry into three major domains: philosophy, science, and technology (1968). Similar domains were also explored by Béla H. Bánáthy (1967), whose discussion of systems theory included philosophy, theory, methodology, and application. Philosophy refers to the ontology, epistemology, and axiology of systems; theory refers to a set of interrelated concepts and principles that apply to all systems; methodology refers to the set of models, tools, and strategies that operationalize systems theory and philosophy; and application refers to the context and interaction of the domains. Systems inquiry is described as “knowledgeable action:” philosophy and theory are integrated as knowledge, and method and application are integrated as action.

A central tenet of systems theory is self-regulation, and systems theory is often applied to describe systems that self-correct through feedback. These types of systems are found in nature, in local and global ecosystems, and in human learning processes at both the individual and organizational level (Laszlo, Levine, and Milsum, 1974). Early work in self-regulating systems eventually led to the development of cybernetics – the formal study of communication and control of regulatory feedback (Wiener, 1948). Cybernetics offers approaches for exploring natural systems, mechanical systems, physical systems, cognitive systems, and social systems. Cybernetics is applicable in the analysis of any system that incorporates a closed signaling loop. While the cybernetics framework can be applied to analyze non-engineering systems, engineering theory and methods are still used to represent the system.
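As a toy illustration of the self-regulation idea, consider a thermostat-style negative feedback loop: the system senses its own output, compares it to a goal, and feeds the error back into its next action. The sketch below is an illustrative simulation only, with made-up parameter values, not a model drawn from the cited literature.

```python
# Minimal sketch of a self-regulating (negative feedback) system:
# a thermostat correcting room temperature toward a setpoint.
# All parameter values are illustrative.

def simulate_thermostat(setpoint=21.0, temp=15.0, gain=0.3, steps=20):
    history = []
    for _ in range(steps):
        error = setpoint - temp   # sensed deviation from the goal
        temp += gain * error      # corrective action proportional to the error
        history.append(round(temp, 2))
    return history

print(simulate_thermostat())  # temperature converges toward the setpoint
```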

System dynamics emerged in the mid-1950s as an application of electrical control theory to the analysis of business systems. Forrester (1961) developed a mathematical modeling technique to help corporate managers improve their understanding of industrial practices. By simulating the stock-flow-feedback structure of an organization, Forrester demonstrated that instability in organizational employment was due to the internal structure of the firm, and not to an external force such as a business cycle. From the late 1950s to the late 1960s, system dynamics was applied almost exclusively to corporate and managerial problems (Radzicki and Taylor, 2008). In 1969, Forrester extended the system dynamics model beyond its corporate application in Urban Dynamics. In this book, Forrester presents a simulation model that describes the major internal forces controlling the balance of population, housing, and industry within an urban area (Forrester, 1969).
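A stock-flow-feedback structure of the kind Forrester simulated can be sketched in a few lines: a stock (here, workforce) accumulates the difference between an inflow (hiring) and an outflow (attrition), and the inflow is itself driven by feedback from the gap between the stock and a target. This is a generic illustration with invented numbers, not a reproduction of Forrester's models.

```python
# Minimal stock-flow-feedback sketch in the spirit of system dynamics:
# workforce is a stock; hiring (inflow) responds to the gap between the
# current workforce and a target, while attrition (outflow) drains it.
# All rates and targets are invented for illustration.

def simulate_workforce(workforce=80.0, target=100.0,
                       hire_fraction=0.25, attrition_rate=0.05, steps=24):
    trajectory = []
    for _ in range(steps):
        hiring = hire_fraction * (target - workforce)  # feedback on the gap
        attrition = attrition_rate * workforce         # proportional outflow
        workforce += hiring - attrition                # stock accumulates net flow
        trajectory.append(round(workforce, 1))
    return trajectory

print(simulate_workforce())  # the stock settles where hiring balances attrition
```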

2.2.2 Systems Engineering

Many decades ago, system dynamics formed the core of some systems engineering concepts. Systems engineering today describes an interdisciplinary field of formalized approaches for designing and managing large-scale, complex engineered systems (LSCES) throughout the life cycle. Systems engineering methodology offers a process for technical management of LSCES. Sophisticated quantitative techniques are used to organize and coordinate work activities, evaluate technical systems interactions, and assure system quality and performance. To address personnel issues that influence LSCES design, systems engineering has drawn from operations research and management science. Management involves identifying “the mission, objective, procedures, rules and manipulation of the human capital of an enterprise to contribute to the success of the enterprise” (P. Frank, 2006; Oliver, 1997).

Systems engineering considers issues such as requirements development and verification, work-process management, and system safety/reliability, and utilizes methods such as probabilistic risk assessment, modeling and simulations, and design optimization (Goode and Machol, 1957; Papalambros and Wilde, 2017). The systems engineering approach is generally reductionist in nature, and offers several different tools and methods for decomposing large, complex systems into smaller, more manageable subsystems (Altus, Kroo, and Gage, 1996; Browning, 2001). An aircraft, for example, can be deconstructed into subsystems such as structures, controls, propulsion, etc., and strategies for arriving at this particular partitioning fall within the scope of systems engineering. This type of decomposition-based approach to the design of engineered systems requires significant forethought, as different partitioning strategies can determine the effectiveness of the design process (Allison, 2008). Likewise, the way in which subsystems are reintegrated, or coordinated, can similarly impact project success (Forsberg and Mooz, 1992; Lake, 1992).

Systems engineering methods for identifying and minimizing the effects of interactions between technical system elements are of particular interest in this work. Interactions between technical system components are widely studied in systems engineering and systems design optimization, most notably in the multidisciplinary design and optimization (MDO) literature, through concepts and tools such as design structure matrices (Eppinger and Browning, 2012; Steward, 1981), global sensitivity equations (Hajela, Bloebaum, and Sobieszczanski-Sobieski, 1990; Sobieszczanski-Sobieski, 1990), coupling metrics (Alyaqout et al., 2011; Kannan, Bloebaum, and Mesmer, 2014), and partitioning and coordination methods for decomposition-based design optimization (Allison, 2008; Lasdon, 1970).
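For readers unfamiliar with design structure matrices, the sketch below shows the basic idea: a square binary matrix records which subsystems interact, and a good partitioning groups strongly coupled subsystems into the same block so that cross-block interactions, which require coordination between design groups, are minimized. The subsystem names and couplings are invented for illustration.

```python
# Minimal design structure matrix (DSM) sketch with invented couplings.
# dsm[i][j] == 1 means subsystem i depends on (or interacts with) subsystem j.
import numpy as np

subsystems = ["structures", "aero", "propulsion", "controls"]
dsm = np.array([
    [0, 1, 0, 0],   # structures <-> aero are coupled
    [1, 0, 0, 0],
    [0, 0, 0, 1],   # propulsion <-> controls are coupled
    [0, 0, 1, 0],
])

def cross_block_interactions(dsm, block):
    # Count interactions that cross a proposed partition; fewer crossings
    # means less coordination effort between design groups.
    n = len(block)
    return sum(dsm[i, j] for i in range(n) for j in range(n)
               if block[i] != block[j] and dsm[i, j])

good = [0, 0, 1, 1]   # {structures, aero} and {propulsion, controls}
poor = [0, 1, 0, 1]   # splits both coupled pairs across blocks
print(cross_block_interactions(dsm, good))  # 0 crossings
print(cross_block_interactions(dsm, poor))  # 4 crossings
```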

Decomposition-based design optimization has been established as a valuable tool for systems engineering design (Papalambros and Wilde, 2017). Complicated products or systems can be simulated and designed using optimization, which accelerates product development and drastically reduces the need for expensive physical prototypes (Allison, 2008). Because the modeled products are inherently complicated, a single optimization cannot usually accommodate the large number of design variables and constraints operating simultaneously. The simulation-based design process must first be partitioned into smaller and easier to solve subproblems, and the solutions of these subproblems must all be consistent and system-optimal. Because optimization requires a parsing of the simulation into subproblems, decisions about system decomposition are made before any formal system design activities actually commence (Allison, 2008). Decomposition-based design optimization therefore depends on a priori definition of partitioning and coordination strategies.
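The consistency requirement mentioned above can be illustrated with a deliberately tiny example: two subproblems each prefer a different value of a shared variable, and a coordination loop with a quadratic consistency penalty pulls their local copies together, in the spirit of (but far simpler than) the formal coordination methods cited above. The objectives, weights, and update rule are all invented for illustration.

```python
# Toy sketch of coordinating two partitioned subproblems that share one
# design variable z. Subproblem i minimizes (z - a_i)^2 plus a quadratic
# penalty for deviating from a system-level target; the target is then
# updated to reconcile the local copies. All values are invented.

def local_min(a, target, weight):
    # Closed-form argmin of (z - a)^2 + weight * (z - target)^2
    return (a + weight * target) / (1 + weight)

a1, a2 = 1.0, 3.0        # each subproblem "prefers" a different z
target, weight = 0.0, 1.0
for _ in range(30):
    z1 = local_min(a1, target, weight)   # subproblem 1 solves locally
    z2 = local_min(a2, target, weight)   # subproblem 2 solves locally
    target = 0.5 * (z1 + z2)             # coordination: reconcile the copies
    weight *= 1.2                        # gradually enforce consistency

print(round(z1, 3), round(z2, 3), round(target, 3))  # all approach z = 2
```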

In design optimization, subproblems are linked through common design variables and interactions, but generic approaches for specifying a partitioned problem are rare (Tosserams et al., 2010). Seasoned engineers concede that a certain “element of artistry” is required for the process to be successful (Buede, 2009). Systems engineers are expected to apply systems thinking, and some aspire to become big picture “visionary designers” who manage technical processes as well as social and organizational interactions in dynamic environments (Brooks et al., 2011). The complexity of this task and its implications for system performance, cost, and schedule led to the development of “soft” operations research (OR)—an extension of traditional operations research that places less emphasis on mathematical modelling of business and social systems and more on thoroughly defining system boundaries and problems, resolving conflicting viewpoints, and reaching consensus on future action (Forrester, 1994). Soft OR methods characterize systems on a variety of qualitative dimensions—e.g., physical vs. social, causal vs. probabilistic, degree of complexity, or susceptibility to control—and utilize discussion and intuition rather than quantitative methods to analyze systems engineering and design processes (Forrester, 1994). While soft operations research and systems engineering are related in theory, the two are rarely related in practice today.


2.2.3 Systems Thinking

Early realizations of the systems thinking concept in management and organization are Karl Weick’s sensemaking framework and Peter Senge’s “learning organization” (Weick, 1979; Senge, 1990). Sensemaking is the process of “creating situational awareness and understanding in situations of high complexity in order to make decisions,” or, more simply, making sense of the world in order to act in it (Klein, 2006). Sensemaking is a powerful bridging concept that connects individual cognitive and affective processes with organizational structures and behavior, to create a ‘systems model’ of an individual’s behavior within the organization (Manning, 2013).

Sensemaking has seven individual and social properties: personal identity, retrospection, enactment, social action, projection, cue extraction, and plausibility (Weick, 1995). These seven elements interact to form the narrative individuals use to interpret events. In the organizational context, understanding the individual narrative is critical for understanding “how organizational structures, processes, and practices are constructed, and how these, in turn, shape social relations and create institutions that ultimately influence people" (Clegg and Bailey, 2008).

Peter Senge’s ‘learning organization’ describes “a group of people working together collectively to enhance their capabilities, to create results they really care about (Fulmer and Keys, 1998).” The learning organization has five characteristics; similar to the properties of sensemaking, these characteristics describe both individual and social factors that combine to explain human behavior in organizations. The properties of the learning organization are systems thinking, personal mastery, mental models, shared vision, and team learning.

In this dissertation, special attention is given to Engineering Systems Thinking (EST), which describes the study of systems thinking in the systems engineering context. Today’s systems engineers and LSCES designers often find themselves in highly complex, highly ambiguous situations, technically and socially, and are responsible for making decisions about both. Technical problems are becoming more challenging, as new materials, technologies, and regulatory environments influence design capabilities, and solving these problems requires engineers and scientists from different disciplines, organizations, etc., to quickly overcome substantial cultural differences and become productive with one another (Cummings, 2002). The modern systems engineering context describes both physical engineering systems as well as the logical human organization of data, and systems engineering methodology has expanded over time to include work-process models along with optimization methods, tools, etc. Contemporary descriptions categorize systems engineering as both a technical process and a management process, in which the goal of the management process is to organize technical efforts throughout the life cycle (Oliver et al., 1997).

While this divide-and-conquer approach to systems engineering was once sufficient, a holistic view of LSCES and their operational environments has become increasingly important to engineering decision-making (McDermott and Freeman, 2016). Systems engineers in the 21st century must understand technical models of work flow, element couplings, interactions, uncertainty, risk, etc., but must also appreciate the social context in which these models are created, interpreted, and acted upon. To do so requires a unique cognitive and social skill set—engineering systems thinking—that has attracted the attention of systems engineering researchers in more recent years. Systems thinking has been described as “what makes systems engineering different from other kinds of engineering” and as “the underpinning skill required to do systems engineering (Beasley & Partridge, 2011).”

Research efforts in recent decades have explored this activity, attempting to elucidate both individual traits, skills, and attitudes as well as team properties that contribute to the “capacity for engineering systems thinking (CEST) (Frank, 2000; Frank, 2006; Frank, 2012; Williams and Derro, 2008; Brooks et al., 2011).” Frank’s studies in engineering systems thinking first appeared in 2000. A substantial body of research followed (Frank, 2000; Frank and Waks, 2001; Frank, 2002; Frank, 2006; Frank, 2007; Frank and Kordova, 2009; Frank, Sadeh, and Ashkenasi, 2011; Frank, 2012; Kordova, Frank, and Nissel, 2018), focusing on engineering systems thinking as it exists in both education and practice. This work is important in its early advocacy of updating the engineering curricula to incorporate systems thinking skills as part of standard engineering education.

In the professional context, the work by Frank, Sadeh, and Ashkenasi (2011) demonstrates a correlation between CEST and project success. This work is valuable for conveying the potential for the improvement of engineering practice that could result from a rigorous study of systems thinking. However, this work is yet to be substantially tested and validated, and the traits that Frank identifies as the “capacity” for or “cognitive characteristics of” engineering systems thinking are rather loosely defined. For example, in describing the cognitive characteristics of engineering systems thinkers in his 2012 paper, each one of the ten characteristics Frank identifies begins with “understanding.” What constitutes this understanding and how engineering systems thinkers come to understand things in this way remain open questions.

Frank describes engineering systems thinking in terms of behaviors that result from systems thinking or demonstrations of the act of engineering systems thinking, rather than the underlying psychological processes required for doing engineering systems thinking. Frank’s work is beneficial in recognizing the behaviors or tendencies of systems thinking that can be useful in practice, but much work remains to define ways to develop the skills required to become an engineering systems thinker.

Frank and Waks (2001) refer to the capacity for engineering systems thinking as a distinct personality trait. Psychology, however, defines personality traits and thinking abilities as distinctly different aspects. Personality traits are relatively stable over time, and are not expected to change as a function of experience (Doob, 1947). Cognitive skills and strategies like systems thinking can be developed through education, training, and experience. By describing the capacity for systems thinking as an innate ability or personality characteristic, the potential for methodically teaching systems thinking is lost, reducing a rather sophisticated concept to a simple interaction of personality and experience. The psychological distinction is important if the study and advancement of systems thinking is to progress in beneficial ways.

Work by Rhodes and co-workers Lamb, Nightingale, and Davidz is also important for opening up the discussion about systems thinking in engineering, but may be subject to a similar critique. In their 2008 paper, Davidz and Nightingale suggest that enabling systems thinking is a critical step in advancing the development of senior systems engineers. They recognize at the same time that fundamental questions still remain about how systems thinking develops in engineers (Davidz and Nightingale, 2008). The authors attempted to answer these questions through field studies and interviews with 205 engineers across 10 host companies in the US aerospace sector (Davidz, Nightingale, and Rhodes, 2004). Engineers of various levels of expertise and experience and with varying levels of proficiency in systems thinking were asked how they define systems thinking, and were also given a definition of systems thinking and were then asked to comment on what aspects of the definition they agreed and disagreed with. This approach resulted in divergent definitions of systems thinking, which did not help in developing a single, unified framework from which to advance its study. The authors organized their findings into five broad foundational elements that explain what perspectives and behaviors constitute systems thinking, but do not address the underlying commonalities or constructs.

Only one element, deemed the “modal” element, describes how an individual performs systems thinking, but the authors described this “how” in the context of tools, methods, models, and simulations, and did not address the actual cognitive processes required to do systems thinking.

Despite this shortcoming, the work identifies some important enablers to the development of systems thinking, such as experiential learning, education, interpersonal interactions, and training in a supportive environment. Other research by Rhodes, Lamb, and Nightingale (2008) also describes methods for studying systems thinking empirically. As in the work by Davidz and Nightingale (2008), the authors seek to uncover the enablers, barriers, and precursors to engineering systems thinking. The authors recognize that an in-depth understanding of engineering practice coupled with an orientation in social science is necessary to properly capture the essence of engineering systems thinking.

In other work, Lamb, Nightingale, and Rhodes (2008) offered a different explanation for engineering systems thinking by suggesting that it is perhaps not something that can be evaluated at the individual level at all. Instead, this paper and others by the same group (Lamb and Rhodes, 2008; Lamb and Rhodes, 2010) suggest that systems thinking may be better understood as “an emergent behavior of teams resulting from the interactions of the team members… utilizing a variety of thinking styles, design processes, tools, and languages to consider system attributes, interrelationships, context, and dynamics towards executing systems design.”

While social context is certainly a relevant and important factor in systems thinking, one can argue against describing systems thinking as an emergent behavior of teams (Greene and Papalambros, 2016). The study by Davidz and Nightingale (2008) relied on testimony from “proven stellar systems thinkers.” If systems thinking were simply an emergent property of teams, these individuals could not exist independent of the teams in which they work. Clearly, certain individuals have a more refined systems thinking skill set than others, and understanding why and how this occurs is important. A cognitive psychological approach at the individual level of analysis is one first step in this direction (Cagan, 2007; Greene and Papalambros, 2016). This dissertation represents such an approach. Offering additional support for this strategy, Davidz and Nightingale (2008) recognized the importance of addressing systems thinking at the level of the individual. They argue that understanding how systems thinking develops in an individual is important for subsequently understanding how systems thinking develops in a team. If systems thinking is to be described in terms of emergence, it is more appropriate to summarize it as an emergent feature of a highly refined set of individual cognitive processes or attitudes rather than an emergent feature of teams (Greene and Papalambros, 2016).

Contemporary research in engineering systems thinking (EST) explores some human-centered assumptions, concepts, values, and practices (Hutchison, Henry, and Pyster, 2016). Successful engineering systems thinkers are consistently recognized as being good leaders and communicators and as naturally "curious" or "innovative." Studies suggest that systems thinkers can see and define boundaries; understand system synergy; and balance reductionist and holistic viewpoints (Frank, 2012). They think creatively, overcome fixation, and tolerate ambiguity. ESTs ask "good questions"; can understand new systems and concepts quickly; can consider non-engineering factors that influence system performance; and understand analogies and parallelism between systems (Davidz and Nightingale, 2008; Davidz et al., 2008; Frank, 2012; Madni, 2015; McGowan, 2014; Rhodes et al., 2008; Williams and Derro, 2008). This prior research provides valuable contributions and strong evidence for the importance of studying systems thinking in engineering, but contributions from extant knowledge in psychology and cognitive science are not included. This dissertation seeks to address this gap.

2.2.4 Design Thinking

Design thinking is a recent popular topic in academic research, pedagogy and education, engineering, and business. Rapid technological development throughout the twentieth century generated a need for formal academic study of "the science of design" (Simon, 1969). Two important periods in the modern history of design science are identified by Cross (1982). The first, the "design products movement" of the 1920s, sought to "produce works of art and design based on objectivity and rationality;" that is, on the values of science. The second, the "design methods movement" of the 1960s, sought to establish design processes—in addition to the products of design—based on similar scientific principles. Despite some backlash against design methodology in the 1970s, the tradition continued to flourish in engineering, and several prominent academic journals for design research, theory, and methodology emerged during the 1980s and 1990s.

Design methodology is defined by Cross (1982) as "the study of the principles, practices, and procedures of design" and "includes the study of how designers work and think." Over the past several decades, engineering researchers have successfully leveraged cognitive and social science approaches to study how designers think through engineering design problems, exploring a breadth of topics including creativity, ideation in early conceptual design, the role of analogies in creative problem solving, differences between novices and experts, and strategies for overcoming fixation and mental blocking. Verbal protocol analysis, cognitive ethnography, controlled laboratory experiments, and other formal methods from cognitive science have been rigorously applied to the study of designer thinking in engineering (Dinar et al., 2015; Shah et al., 2012). Results of these studies and others suggest that design thinking approaches use solution-based methods to explore human-centered values throughout the engineering design process. This finding is reflected in many applications of design thinking: prototyping, a solution-based method, is often cited as a useful way to encourage inspiration, ideation, and organizational learning, all human-centered values (Brown, 2009; McGowan et al., 2017).

Design thinking emerged as a formalism of successful practice in identifying the right design problem, generating creative solutions, and making design decisions through rapid prototyping and user testing (Papalambros, 2018). Design thinking, as a formalism of user-centered design practice, emerged in stark contrast to engineering design obsessed with functionality. Design thinking frameworks typically do not include analytical-quantitative methods, dealing instead with sociocognitive processes such as creativity and empathy. Today’s industrial designers and product designers are relatively free from the complexity of engineering, including analysis, and are free to focus instead on user needs, desires, and experience with designed artifacts (Papalambros, 2018).

Many applications of the design thinking framework exist in the literature and in practice. Examples include Herbert Simon's design thinking process (1969), which suggests seven stages of design thinking for product design, including defining the problem, researching, ideating, prototyping, choosing a solution, implementing the solution, and learning. Plattner, Meinel, and Leifer (2011) propose a five-step version of the design thinking process that includes redefining the problem, need finding and benchmarking, ideating, building, and testing. International design and consulting firm IDEO applies a four-phase process that includes gathering inspiration, generating ideas, making ideas tangible, and sharing your story (Brown, 2008). Graphic representations of these processes are included in Figure 2.1.

Figure 2.1 Design Thinking Process Models from IDEO, Stanford d-school, and Google

While each interpretation differs slightly from the others, important foundational values of design thinking persist. First, while design thinking frameworks emphasize the importance of problem definition, the process is solution-driven, and most design thinking methods include prototyping and iteration phases for generating solutions that meet customer needs. These solutions are human-centered products or services, developed through designers' personal experiences, empathy, and engagement with stakeholders. The design thinking process itself is also human-centered, offering methods for inspiration, ideation, and learning to designers (Brown, 2008). Design thinking has been described as a "high order intellectual activity" that "requires practice and is learnable" (Plattner et al., 2011).

2.3 Developing a Codebook for Interview Analysis

In this section, themes from the literature are integrated and described in depth to develop a “systems design thinking codebook.” This codebook informs interview analysis, as well as quantitative analyses in later chapters. It is used to understand individuals’ attitudes about systems engineering, systems thinking, and design thinking; how they are done or demonstrated in practice; how they are related; and the perceived effect each has on projects and outcomes. The codebook is a contribution to the interdisciplinary study of these concepts, with the goal of addressing the gap in the literature comparing them.

2.3.1 Systems Engineering Codes

Systems engineering codes reflect organizational and process elements, such as planning, documenting, and managing. Systems engineering often begins with system requirements, and describes all processes related to verifying and validating that those requirements have been met on time and within budget. This process is large-scale and complex, requiring the coordination of many individuals over long periods of time, and extensive documentation to retain and communicate information effectively. Systems engineering codes reflect an organized, detail-oriented process for accomplishing these tasks.

Documentation

A key function of systems engineers is ensuring proper documentation throughout systems design. This includes documenting requirements, verification and validation plans, design changes, etc., in accordance with ISO 9000 requirements (Recker, 2002). This can be very challenging, as there are many engineering groups working together on systems projects, and each group manages and maintains their documentation in their own way. Systems engineers are responsible for integrating these documents and delivering them on time.

Systems engineers also exhibit informal documentation behaviors, such as personal note-taking, recording meeting minutes, etc. This reflects attention to detail and enables information retention over time and space.


Planning and scheduling

Systems engineers are responsible for delivering products on time and within budget constraints. This includes long-term as well as short-term project planning, and scheduling meetings between individuals. This can be done formally and informally.

Methodical/process-driven

Systems engineering is methodical and follows the same general process for every project. This is captured in the “systems vee,” although many organizations have a slightly modified or more personalized version that reflects the unique values of the organization. The INCOSE systems engineering process model and the NASA “systems engineering engine” are pictured in Figure 2.2 below.

Figure 2.2 Systems engineering process models. On the left is the INCOSE systems engineering “vee”; the NASA “systems engineering engine” is on the right.

Requirements

Requirements definition, verification, and validation are key elements in systems engineering. Much of the systems engineering process is devoted to ensuring that requirements are clearly defined and easily interpreted by designers and customers alike. Systems engineering processes typically begin at requirements definition, and requirements then serve as a contract between engineers and customers through the remainder of the design process.

Management

Differences exist between systems engineers and project managers, but technical management is at the heart of the systems engineering process (see Figure 2.2). Technical management includes ensuring that all subsystems are developed in a cohesive way that allows for easy integration and alignment.

2.3.2 Systems Thinking Codes

Engineering systems thinking shares a foundation with systems science; thus, its assumptions, concepts, values, and practices bear some resemblance to those of general systems theory, cybernetics, and systems dynamics. The systems thinking framework states that a system is composed of parts, the system is greater than the sum of its parts, and all parts of the system are interrelated. Systems receive inputs from the environment, execute processes that transform these inputs into outputs, and send these outputs back into the environment in feedback loops.

Systems are dynamic and complex, interactions may be difficult to identify or quantify, and emergence is common. Systems thinking themes reflect an appreciation for system complexity, with specific emphasis on understanding interactions between system elements, and integrating and aligning these elements. Flexibility, adaptability, and a tolerance for ambiguity reflect the systems thinking approach for dealing with complexity.

Contemporary research in engineering systems thinking seeks to make the approach more human-centered. Successful engineering systems thinkers are consistently recognized as being good leaders and communicators and as naturally "curious" or "innovative." They think creatively, overcome fixation, and tolerate ambiguity. ESTs ask "good questions"; can understand new systems and concepts quickly; can consider non-engineering factors that influence system performance; and understand analogies and parallelism between systems (Frank, 2012; Williams and Derro, 2008; Davidz and Nightingale, 2008; Davidz et al., 2008; Rhodes et al., 2008; McGowan, 2014). Systems thinking codes also capture these human-centered elements.

Big picture view

Systems thinking codes reflect the “big picture view,” but also emphasize an appreciation for details about individual system elements and interactions between them. Studies suggest that systems thinkers can see and define boundaries; understand system synergy; and balance reductionist and holistic viewpoints (Frank, 2006).

Interactions

Systems thinking involves the constant search for interactions: technical, social, and organizational. This is due in part to the systems thinker’s natural curiosity about how technical systems work and how individual elements are related. Systems thinkers are also interested in social interactions and their effects on systems design, and leverage social relationships to deepen their understanding of the technical system. They do this within organizational constraints, following organizational cultural norms, processes, and practices.

Integration/alignment

This code emphasizes the importance of mitigating interactions and preventing emergence. Systems design activities and their resulting artifacts must be aligned and integrated throughout the systems engineering process. Each element in a system must be designed to function holistically. Individual optimization of subsystems is not enough. Socially, this involves “getting everyone on the same page,” as integrating technical subsystems requires integrating the social systems that work on them.

Flexibility/adaptability

The systems thinking framework addresses complexity through flexibility and tolerance of ambiguity. This differs from the systems engineering framework, where the approach involves reducing ambiguity and risk through analysis and management.

2.3.3 Design Thinking Codes

Design thinking codes are human-centered and related to understanding user and other stakeholder needs through interaction and engagement. Shared experience is a common theme in design thinking, for its ability to generate empathy and insight into uncovered needs. Design thinking also includes prototyping as a form of shared experience and communication, as well as an opportunity to test ideas and move from conceptual design to embodiment.

Empathetic [human-centered]

Various design thinking processes cited in literature start with identifying and immersing oneself in the end user’s position. This phase has also been described as “immersion,” “awareness,” and “inspiration,” etc. (Fleury, Stabile, and Carvalho, 2016). The core function of the designer in this phase is to build empathy for their end users. Empathy in this context refers to imagining the world from the perspective of multiple stakeholders and prioritizing the latent needs of the people (Brown, 2008). Plattner et al. (2011) claim that empathy in design thinking is the process of ‘needfinding,’ or discovering both implicit and explicit needs before working on a design problem. Empathy forms a core component of the design process regardless of the steps taken to achieve the solution.


Intuitive/experience-driven

This code shares some overlaps with empathy (e.g., intuition, awareness, immersion). Immersive experiences like co-design, prototyping/user testing (frequent customer interaction & feedback), etc., are useful for developing empathy and design intuition (Tonetto and Tamminen, 2015). An important function of the designer is to intuitively shape the problem at hand by identifying the view of participants and the underlying issues that concern them (Buchanan, 1992). Gasparini describes emotional empathy as an instinctive, affective, shared, and mirrored experience in which one feels what other people experience (Gasparini, 2015).

Ambiguous problems & solutions

Design problems are now widely recognized as ill-defined or “wicked” (Cross, 1982; Rittel and Webber, 1973, 1974; Kuhn, 1962; Buchanan, 1992). Cross describes scientific problems as ‘tame’ and analogous to puzzles, which can be solved by applying well-known rules to given data, while design problems are ill-structured with ambiguous requirements. Buchanan (1992) supports this claim by stating that the underlying reason for the ‘wickedness’ of design problems lies in the fact that design has no subject matter of its own other than what the designer conceives it to be. In light of this, design thinking requires the designer to embrace and preserve ambiguity (Plattner et al., 2011). Plattner claims that when constraints are explicitly defined, there is minimal chance for the discovery of the latent needs that lead to innovative solutions. Modifying the nature of the problem to find a solution is a challenging yet imperative aspect of the designer’s role, and this in turn reflects the ambiguous nature of problems in design thinking (Jones, 1992).


Innovative/creative

Design thinking is widely claimed to be a driver of innovation (Brown, 2008; Cross, 2001; Plattner, Meinel, and Leifer, 2012). Plattner et al. (2012) claim that creative design takes center stage in all design thinking activities. Anderson et al. (2014) define creative design as something that is novel and useful, or appropriate and adaptive. Shah (2012) claims that the discipline of design requires abductive reasoning, divergent thinking, and creative thinking, and that this differs from the science-based regimen that promotes convergent thinking and deductive reasoning through closed-end problem solving. Shah also claims that while these skills are core to engineers, they may be insufficient for the field of design. Buchanan (1992) claims that design thinking requires designers to think beyond their personal preferences and visualize novel possibilities through conceptual placements.

2.4 From Frameworks to Attitudes: Interviews with Systems Engineers

This section describes an informal pilot study, in which semi-structured interviews are conducted with practicing systems engineers. The goal is to further explore the relationship between “systems engineering” and “systems thinking,” and understand if/how design thinking themes are discussed, based on our understanding of the literature. The interview sample is small and not well-balanced, and findings are neither generalizable nor transferable from a research methodology perspective, nor are they meant to be. The interviews added depth to literature findings, provided useful insights and clarifications about systems engineering, systems thinking, and design thinking, and generated “attitude statements” for use in Likert scales to test generalizability and transferability quantitatively, as described in the next chapter.


2.4.1 Method

The interview setting was a government laboratory for large-scale systems design. The subject population for the study included ten experienced adult engineers (more than ten years work experience) working in organizations that are responsible for the design and management of large-scale, complex engineered systems. The subject population included senior systems engineers, chief engineers, project managers, and related roles. Participants were identified as exceptional systems thinkers and recruited by a technical leader within the organization, who asked them to participate in “an interview about systems thinking.”

The cognitive work analysis framework was used to structure the interview (Vicente, 1999). Interview questions roughly followed the critical decision method (Klein and Armstrong, 2005; Klein, Calderwood, and MacGregor, 1989). This method uses cognitive probes to elicit information regarding naturalistic expert decision-making. In this work, cognitive probes were used to elicit attitudes, beliefs, and approaches for systems engineering and systems thinking. Interviews also included questions about communication preferences, based on theories that suggest social coordination is closely related to technical coordination (Cataldo, Herbsleb, and Carley, 2008; Colfer and Baldwin, 2016; Conway, 1968; de Souza et al., 2007). Interview questions can be found in the Appendix.

The codebook from Section 2.3 is imported into NVivo 12 and used to analyze interview data. Additional codes and themes are also generated from the data. Several word frequency queries and matrix coding methods are used to understand significant themes and relationships between these themes.
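The counting and cross-tabulation behind these queries can also be reproduced outside NVivo. The following is a minimal sketch, assuming a hypothetical CSV export of coded segments with interview_id and theme columns; the file name and column names are illustrative, not part of the dissertation's actual pipeline.

```python
import pandas as pd

# Hypothetical export of coded interview segments: one row per coded
# reference, with the interview it came from and the theme it was coded to.
segments = pd.read_csv("coded_segments.csv")  # columns: interview_id, theme

# Total references to each theme (cf. Tables 2.1-2.4).
references = segments.groupby("theme").size().rename("references")

# Number of distinct interviews in which each theme appears.
interviews = segments.groupby("theme")["interview_id"].nunique().rename("interviews")

# Matrix coding analogue: themes cross-tabulated against interviews.
matrix = pd.crosstab(segments["theme"], segments["interview_id"])

summary = pd.concat([interviews, references], axis=1)
print(summary.sort_values("references", ascending=False))
```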

2.4.2 Analysis

Thematic analysis is used to identify patterned meaning across a dataset (Braun and Clarke, 2006). In the first step, a deductive approach is used, where coding and theme development are directed by existing concepts and ideas. The codebook described in Section 2.3 is used for deductive coding. Inductive coding was also conducted within the results of the deductive coding to organize emergent concepts. The goal of interviews is to identify attitudes that reflect systems engineering, systems thinking, and design thinking in practice, and understand the relationships between them.

The first analysis conducted was a count of the number of interviews including each theme and the number of references to each theme. Five themes were coded in all ten interviews. These are shown in Table 2.1 below. Communication was first, referenced in all ten interviews with a total of 96 references. Interactions and empathy were next, both referenced in all ten interviews, 56 and 42 times respectively.

Table 2.1 Five themes were coded in all ten interviews. These five themes, with total number of references in parentheses, are: communication (96), interactions (56), empathy (42), complexity (30), and learning and information (29).

Theme                     Interviews  References
Communication             10          96
Interactions              10          56
Empathy                   10          42
Complexity                10          30
Learning and information  10          29


Table 2.2 presents design thinking themes, the number of interviews including each theme, and the number of references to each theme:

Table 2.2 Design thinking themes, number of interviews including each theme, and total number of references to each theme. Design thinking themes had the highest number of total references (297).

Theme                   Interviews  References
Communication           10          96
Empathy                 10          42
Human-centered          10          40
People & personalities   8          45
Experience               8          39
Awareness & intuition    8          35
Total                               297

Design thinking themes had the highest number of references overall. Communication, a sub-theme of “human-centeredness,” and empathy together had 138 references.

Systems thinking themes in Table 2.3 had the second highest number of overall references. Interactions and complexity had 86 combined references.

Table 2.3 Systems thinking themes, number of interviews including each theme, and total number of references to each theme. Systems thinking themes had the second highest number of total references (238).

Theme                     Interviews  References
Interactions              10          56
Complexity                10          30
Integrate & align          9          56
Big picture                9          53
Flexibility/adaptability   8          26
Ambiguity/uncertainty      8          17
Total                                 238

Learning and information, an inductive code referenced in all ten interviews, refers to the approach for gathering and managing information about the technical system. Learning and information includes themes from systems thinking and systems engineering, such as “asking questions” (about relationships, interactions) and “gathering data”. To do this effectively requires design thinking skills. Empathy is helpful but not always required; “awareness” (of tone, reception, etc.), emotional intelligence, and approaching conversations as “qualitative data” are all important precursors to learning.

Systems engineering themes are in Table 2.4 below:

Table 2.4 Systems engineering themes, number of interviews including each theme, and total number of references to each theme. Systems engineering themes had the fewest number of total references (190).

Theme                Interviews  References
General SE            8          31
Management            8          23
Document              7          42
Requirements          7          25
Risk                  7          17
Planning/scheduling   6          29
Process               6          23
Total                            190

“Document” had the highest number of references among systems engineering themes. “Planning/scheduling” was the second most referenced specific systems engineering theme. “Requirements” and “process” were third and fourth most referenced, respectively. Systems engineering themes and attitudes were mentioned the fewest overall.

2.4.3 Findings

Systems engineering reflects necessary occupational processes; systems thinking reflects the underlying cognitive processes that support them. One participant describes the relationship between systems engineering and systems thinking attitudes in the following way:


“People think systems engineering, they [think] this guy does requirements, does , does this paperwork. But a systems engineer is really—to be one—you’ve got to have command of how the system interacts with each other, what the sensitivities are, and be able to carve that system up and manage those interactions.”

Another interview offers a similar perspective on this, suggesting that the focus on policy and procedures is more of an “academic perspective” to systems engineering, rather than a good representation of systems engineering practice, which relies more heavily on systems thinking.

However, these processes make up the fundamental responsibilities of systems engineers, which influences their assumptions and values (Elsbach, Barr, and Hargadon, 2005). We therefore define “systems engineering attitudes” as attitudes about requirements, scheduling, planning, and documentation. This overlaps with technical and organizational systems thinking (e.g., scheduling, planning, and documentation are organizational systems thinking processes).

Systems thinking is organizational, in that it focuses on identifying and understanding relationships and interactions, and systems engineers see relationships between humans as equally important as relationships between technical system elements. Systems thinking is used for identifying what the problems are, where they are, and who can help. Solutions mostly involve getting people in a room to discuss, which has human-centered themes in common with the design thinking framework.

Design thinking attitudes reflect empathy and understanding, the act of hearing and listening, and other human-centered beliefs and approaches. There appears to be some overlap with the social element of systems thinking. However, in social systems thinking as described in interviews, human-centeredness is not limited to understanding user needs as it is in design thinking/product design, but also includes understanding the personalities, needs, concerns, etc. of other engineers and designers within the organization in order to get design done.

Interviews also offered insights into the relationship between engineering and design thinking. Data captured differences in interpretation and preference for design thinking and systems engineering processes:

“Most people think the design, analysis, test and build is the cool stuff, so, let's just get to the cool stuff right away. So, I think having some rigor in the system from a systems engineering perspective forces us to make more deliberate design development decisions that you might not make otherwise.”

Interview findings are summarized in the attitude model in Figure 2.3 below.

Figure 2.3 Systems Design Thinking Attitude Model. Systems engineering attitudes describe feelings about requirements, analysis, documentation, scheduling, and management processes. Design thinking attitudes are human-centered, describing feelings about people/personalities, empathy, and communication. Systems Design Thinking uses the holistic systems thinking approach (appreciation for ambiguity, complexity, interactions, and integration) for understanding both technical systems and social systems.

Systems thinking is positioned between systems engineering and design thinking, as systems design thinkers will apply a “systems philosophy” to technical as well as human-centered systems, and understand the relationships between them.

2.5 Summary

This chapter presented a review of relevant literature on systems engineering, systems thinking, and design thinking. The outcome of this review was a set of codes that reflect core assumptions, concepts, values, and practices of systems engineering, systems thinking, and design thinking frameworks. These codes were compiled into a codebook for systems design thinking, and used to analyze data from semi-structured interviews with experienced systems engineers.

Consistent with the literature, interview findings suggest that engineering systems thinking is an integral skill for systems engineering. The systems engineering perspective is based on systems thinking (INCOSE, 2015). Systems thinking is necessary for identifying and understanding interactions between technical system elements, and for identifying and coordinating the corresponding social units within the engineering organization. This multidimensional concept is formalized in interview findings, where three “types” of systems thinking—technical, social, and organizational—are identified and described.

While design thinking is not mentioned explicitly in the interviews, systems engineers make references to concepts, values, and practices from design thinking. These include human- centered practices such as active listening and communication, empathy, and shared experience.


“Communication,” a subset of the design thinking code “human-centered,” was the most frequently referenced code in the qualitative analysis.

Systems engineering, systems thinking, and design thinking frameworks overlap in the context of complex systems design. All of these frameworks are useful at different stages and for different activities within the systems engineering process. Not surprisingly, systems engineers’ attitudes reflect different assumptions, concepts, values, and practices from these frameworks. In the next chapter, these attitudes are explored quantitatively. Factor analysis is used to determine whether these attitudes can be clearly differentiated, according to systems engineering, systems thinking, and design thinking frameworks.


CHAPTER III

Modeling Systems Design Thinking Attitudes

3.1 Introduction

Findings from interviews suggest different ways to define and relate systems engineering, systems thinking, and design thinking frameworks. In this chapter, theory and methods from psychometrics are used to make quantitative comparisons between systems engineering, systems thinking, and design thinking frameworks. Psychometrics is the field of study concerned with the theory and technique of objective psychological measurement (Furr and Bacharach, 2013). This includes the assessment of skills, knowledge, abilities, attitudes, personality traits, and educational achievement. Psychometrics includes the construction and validation of assessment instruments such as questionnaires, scales, and tests.

Some work in recent years has explored psychometric approaches for modeling systems thinking and design thinking (Castelle and Jaradat, 2016; Chesson, 2017; Davis and Stroink, 2016; Davis et al., 2018; Dosi, Rosati, and Vignoli, 2018; Jaradat, 2014; Thibodeau, Frantz, and Stroink, 2016). Few studies explore the psychology of systems engineering. None of the identified studies have attempted to integrate these frameworks into a single measure. In this chapter, psychometrics is used to identify the key assumptions, concepts, values, and practices of systems engineering, systems thinking, and design thinking, and integrate them through the development of the Systems Design Thinking Scale. A series of three iterative studies is described. In Study 1, factor analysis is used to test a model of systems design thinking consisting of technical systems thinking attitudes, social systems thinking attitudes, and organizational systems thinking attitudes. In Study 2, a two-factor model of systems design thinking consisting of systems thinking and design thinking attitudes is tested. In Study 3, we test a two-factor model of systems engineering and design thinking attitudes.

Structural Equation Modeling (SEM) is used to fit systems engineering, systems thinking, and design thinking frameworks to data (Kline, 2015). SEM is a popular technique in the social sciences, where unobservable constructs such as intelligence or self-esteem are more commonly studied than directly measurable variables such as volume or mass. SEM can be described as a two-step hypothesis-testing technique. First, social scientists develop hypotheses about a construct (e.g., intelligence), and write measurement instruments (e.g., an IQ test) with questions designed to measure intelligence according to their hypotheses. Then, statistical methods are used to assess the validity of the hypotheses, using data gathered from people who took the intelligence test. In this example, “intelligence” is the latent variable, and test questions, referred to as ‘items,’ are the observed variables.

“Intelligence” in this example can be replaced with “systems design thinking.” Systems design thinking is a latent construct that is believed to exist, but it is not directly measurable in the same way that volume and mass are directly measurable. Thus, in order to “measure” systems design thinking, the network of constructs that comprise systems design thinking must first be decomposed into items that are directly measurable. In this case, items reflect assumptions, concepts, values, and practices in systems engineering, systems thinking, and design thinking, and the degree to which subjects agree or disagree is measured. Several network models are developed based on findings from the qualitative research described in Chapter II. These models are tested using exploratory and confirmatory factor analysis as described in the following sections.
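As a concrete illustration of this measurement setup, the sketch below specifies a one-factor measurement model in semopy, an open-source Python SEM package. The package choice, the item names, and the data file are assumptions for illustration; the dissertation does not name its SEM software or report this particular model.

```python
import pandas as pd
import semopy

# Hypothetical one-factor measurement model: four observed Likert items
# (item1..item4) loading on a single latent systems design thinking factor.
model_desc = """
SDT =~ item1 + item2 + item3 + item4
"""

data = pd.read_csv("responses.csv")  # one column per item, one row per respondent

model = semopy.Model(model_desc)
model.fit(data)

print(model.inspect())           # factor loadings and variances
print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA
```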

3.2 Study 1: Technical, Organizational, and Social Systems Thinking

3.2.1 Scale Development

Interviews with systems engineers suggested three “types” of systems thinking, depicted in Figure 3.1. These are technical, organizational, and social systems thinking. Items representing technical, social, and organizational systems thinking are organized into three respective factors. This three-factor model of systems design thinking is tested first.

Figure 3.1 A graphical representation of the three types of systems thinking—technical, social, and organizational. This categorization was suggested in interviews with professional systems engineers. These individuals were recruited to participate in the study based on their designation as “exceptional systems thinkers” by a technical leader within the organization.

Tables 3.1, 3.2, and 3.3 include 57 attitude statements that were included in the pilot test.

These statements are grouped into three factors according to systems thinking types. It is important to note that many of the items are intended to be “reverse worded,” i.e., meant to represent the ‘opposite’ of a systems design thinking attitude. This is indicated in the tables with a (-) following each reverse-worded statement. There are issues with this choice, which will be discussed in the findings, Section 3.2.3.
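For reference, reverse-worded items on a 5-point Likert scale are conventionally re-coded before analysis so that high scores consistently indicate the construct. A minimal sketch follows; the item IDs are the technical items marked (-) in Table 3.1, while the data frame itself is an assumption.

```python
import pandas as pd

# Reverse-worded technical items marked with (-) in Table 3.1.
reverse_items = ["T1", "T2", "T3", "T5", "T7", "T8", "T9", "T11", "T14"]

def reverse_score(responses: pd.DataFrame, items: list[str], points: int = 5) -> pd.DataFrame:
    """Re-code reverse-worded Likert items: on a 5-point scale, 1 <-> 5 and 2 <-> 4."""
    recoded = responses.copy()
    recoded[items] = (points + 1) - recoded[items]
    return recoded
```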


Table 3.1 Technical systems thinking attitude items tested in Study 1. Items marked with a (-) are reverse-worded, i.e., meant to represent the ‘opposite’ of a systems design thinking attitude.

T1. I would prefer to design & manufacture a single part rather than analyze interactions between two parts of a system. (-)
T2. I tend to focus on the nuances of a problem, rather than the big picture. (-)
T3. For best system performance, subsystems should be as independent as possible. (-)
T4. I test several ways to solve a problem before choosing the best one.
T5. Modifications and adjustments made after a system is deployed indicate that the design was inadequate. (-)
T6. Design decisions should be made based on the problem a system was designed to address, rather than the system that currently exists.
T7. A system will perform optimally if each of its subsystems is designed optimally. (-)
T8. I prefer to work on problems with objective solutions. (-)
T9. Comprehensive understanding of a system can be achieved by analyzing each individual subsystem. (-)
T10. I like to receive a detailed set of requirements before beginning a project.
T11. Once desired performance is achieved, a system should be left alone. (-)
T12. When designing a system, plans should be tentative and expected to change.
T13. Once successful, a technical solution should result in similar success in other applications.
T14. I prefer to work on technical problems rather than non-technical problems (e.g., fix a part vs. negotiate a contract). (-)
T15. It is better to try proven solutions before pursuing new solutions to a problem.
T16. It is important that I understand how my work contributes to the larger system or mission.
T17. Prototyping speeds up the process of innovation.
T18. I break problems into tasks/steps before beginning work.
T19. Problems at technical interfaces take longer to resolve than problems in other areas.
T20. Engineering and design are different.
T21. I can recall at least one “a-ha moment” I’ve had at work.
T22. Design decisions should be made based on data.


Table 3.2 Organizational systems thinking attitude items tested in Study 1

O1. Engineering organizations should ensure that their divisions are integrated, even if that means limiting each division's freedom to make decisions.
O2. Working in geographically dispersed teams is harder than working in co-located teams.
O3. Planning is wasteful in uncertain situations.
O4. My organization's values are important to me.
O5. Organizations should follow the same design process for every project.
O6. I believe that control of my work environment is possible.
O7. Task-focused individuals are just as valuable to organizations as innovators are.
O8. I ask myself if what I'm learning is related to what I already know.
O9. Engineering organizations should distribute decision authority equally between discipline engineers and .
O10. Long-term planning and short-term planning are equally important.
O11. I delay making plans rather than make plans I know will change later.
O12. It is important that my job offers flexible scheduling.
O13. Organizations should support interdisciplinary collaboration.

Table 3.3 Social systems thinking attitude items tested in Study 1

S1. It is important to consider stakeholders’ values and emotions in addition to technical system requirements.
S2. When I have a question about my work, I try to figure it out by myself before asking anyone for help.
S3. I enjoy using group creativity techniques such as mind mapping/brainstorming.
S4. I seek out others' opinions when deciding how to approach a problem.
S5. I prefer to have my own office rather than shared workspace.
S6. I enjoy working with people outside my discipline.
S7. I like to know what my colleagues are doing, even if it doesn't relate directly to my work.
S8. I prefer to meet as needed rather than attend regularly scheduled meetings.
S9. Collaborating and working independently are equally important.
S10. I use storytelling and/or analogies to describe my work to others.
S11. It is important that I know my colleagues personally as well as professionally.
S12. I am comfortable applying my technical knowledge and skills in unfamiliar situations.
S13. Engineering is a creative process.
S14. I enjoy taking on leadership roles.
S15. My colleagues know about my life outside of work.
S16. I enjoy sharing my ideas with others.
S17. I feel most creative around other people.
S18. I prefer to work with colleagues that have more experience than I do.
S19. I wish I had more time to learn about my colleagues' work.
S20. It is difficult to establish a common vocabulary with colleagues from other disciplines.
S21. Team-building exercises are valuable.
S22. I prefer to work with people with whom I've worked successfully before.


3.2.2 Pilot Test, Factor Analysis, and Results for Study 1

Exploratory factor analysis (EFA) was used to study all 57 items in Tables 3.1, 3.2, and 3.3 in Section 3.2.1 (Mulaik, 2009). These items were arranged on a 5-point Likert scale and evaluated by a sample of 20 professional engineers and engineering researchers from industry, academia, and government. It is important to note that a sample size of 20 is very small for this type of study. The recommended sample size for conducting exploratory factor analysis is at least 100 subjects (Kline, 2015). Minimum ratios of sample size to the number of variables have also been proposed (Pearson and Mundform, 2010). However, Study 1 was intended to serve only as a pilot test for identifying and removing items that function particularly poorly, and to identify preliminary patterns in the data for further analysis.

The EFA was specified to explore up to three factors in Study 1. Maximum likelihood estimation and varimax (orthogonal) rotation were used in the analysis, as it is expected that an individual can have any combination of systems design thinking attitudes. Based on the interview findings, in which systems thinking was partitioned into technical, social, and organizational systems thinking, the EFA was expected to produce a corresponding three-factor solution. However, no interpretable results were observed for the three-factor model.

Additional analysis was conducted on the two-factor model, to interpret each factor and refine the hypotheses for further study. Variables with loadings less than 0.300 were dropped from the model due to poor fit, and the data were analyzed again. Variables included in this second analysis can be seen with their varimax rotated loadings in Tables 3.4, 3.5, and 3.6.
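This analysis pipeline can be reproduced in a few lines. The sketch below uses the open-source factor_analyzer Python package as one possible implementation; the package and the input file are assumptions (the dissertation does not name its software), with responses assumed to be a respondents-by-items data frame of Likert ratings.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical input: one row per respondent, one column per item
# (T1..T22, O1..O13, S1..S22), reverse-worded items already re-coded.
responses = pd.read_csv("study1_responses.csv")

# Maximum likelihood EFA with varimax (orthogonal) rotation, up to three factors.
fa = FactorAnalyzer(n_factors=3, rotation="varimax", method="ml")
fa.fit(responses)
loadings = pd.DataFrame(fa.loadings_, index=responses.columns,
                        columns=["F1", "F2", "F3"])

# Post-hoc step: drop items whose largest absolute loading is below 0.300,
# then refit a two-factor model on the retained items (cf. Tables 3.4-3.6).
retained = responses.loc[:, loadings.abs().max(axis=1) >= 0.300]
fa2 = FactorAnalyzer(n_factors=2, rotation="varimax", method="ml")
fa2.fit(retained)
```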


Table 3.4 Post-hoc exploratory factor analysis results for technical systems thinking attitude items in Study 1

Item ID  F1      F2      Attitude Statement
T2       -0.028   0.670  I tend to focus on the nuances of a problem, rather than the big picture.
T3        0.344  -0.184  For best system performance, subsystems should be as independent as possible.
T4        0.149  -0.309  I test several ways to solve a problem before choosing the best one.
T5       -0.334   0.145  Modifications and adjustments made after a system is deployed indicate that the design was inadequate.
T7       -0.148  -0.382  A system will perform optimally if each of its subsystems is designed optimally.
T8       -0.718   0.375  I prefer to work on problems with objective solutions.
T9       -0.578  -0.217  Comprehensive understanding of a system can be achieved by analyzing each individual subsystem.
T11       0.314  -0.065  Once desired performance is achieved, a system should be left alone.
T13       0.623   0.153  Once successful, a technical solution should result in similar success in other applications.
T14      -0.072   0.407  I prefer to work on technical problems rather than non-technical problems (e.g., fix a part vs. negotiate a contract).
T17      -0.327   0.341  Prototyping speeds up the process of innovation.
T18      -0.351   0.339  I break problems into tasks/steps before beginning work.
T22      -0.304   0.301  Design decisions should be made based on data.


Table 3.5 Post-hoc exploratory factor analysis results for social systems thinking attitude items in Study 1

Item ID  F1      F2      Attitude Statement
S2        0.258  -0.674  When I have a question about my work, I try to figure it out by myself before asking anyone for help.
S3       -0.869   0.039  I enjoy using group creativity techniques such as mind mapping/brainstorming.
S5        0.670  -0.093  I prefer to have my own office rather than shared workspace.
S7       -0.142  -0.764  I like to know what my colleagues are doing, even if it doesn't relate directly to my work.
S8        0.721  -0.057  I prefer to meet as needed rather than attend regularly scheduled meetings.
S10       0.065  -0.863  I use storytelling and/or analogies to describe my work to others.
S13      -0.793   0.249  Engineering is a creative process.
S14      -0.645   0.196  I enjoy taking on leadership roles.
S15       0.433   0.013  My colleagues know about my life outside of work.
S17      -0.556   0.188  I feel most creative around other people.
S18       0.789  -0.219  I prefer to work with colleagues that have more experience than I do.
S20      -0.160   0.465  It is difficult to establish a common vocabulary with colleagues from other disciplines.
S21      -0.729  -0.336  Team-building exercises are valuable.
S22       0.850   0.019  I prefer to work with people with whom I've worked successfully before.


Table 3.6 Post-hoc exploratory factor analysis results for organizational systems thinking attitude items in Study 1

Item ID  F1      F2      Attitude Statement
O3       -0.359   0.101  Planning is wasteful in uncertain situations.
O4       -0.309   0.308  My organization's values are important to me.
O5        0.503   0.220  Organizations should follow the same design process for every project.
O7       -0.216  -0.756  Task-focused individuals are just as valuable to organizations as innovators are.
O8        0.387   0.180  I ask myself if what I'm learning is related to what I already know.
O10       0.078  -0.682  Long-term planning and short-term planning are equally important.
O12       0.055  -0.473  It is important that my job offers flexible scheduling.
O13      -0.119  -0.906  Organizations should support interdisciplinary collaboration.

Again, the three-factor model did not produce interpretable results. The two-factor model demonstrated slightly better fit than the one-factor model, although both models result in poor fit according to conventional model fit indices (Hooper, Coughlan, and Mullen, 2008). The two-factor model is summarized in Table 3.7, and is analyzed and interpreted as follows.
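For reference, one conventional index discussed by Hooper et al. is the root mean square error of approximation (RMSEA); its standard definition (not a quantity reported here) is

\[ \mathrm{RMSEA} = \sqrt{\max\!\left(\frac{\chi^2 - df}{df\,(N-1)},\; 0\right)} \]

where \(\chi^2\) and \(df\) are the model chi-square and degrees of freedom and \(N\) is the sample size; values below roughly 0.06 to 0.08 are commonly treated as indicating acceptable fit.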

Table 3.7 Summary of factor structure after post-hoc exploratory factor analysis (Study 1)

Factor  Technical ST                Social ST                            Organizational ST
1       3, 5, 8, 9, 11, 13, 17, 18  3, 5, 8, 13, 14, 15, 17, 18, 21, 22  3, 4, 5, 8
2       2, 7, 14, 17, 18            2, 7, 19, 20                         4, 7, 10, 12, 13

Factor 1 seems to represent systems design thinking attitudes. Technical systems thinking items in Factor 1 reflect a balance between reductionism and holism; support for iteration and improvement after system deployment; and comfort with ambiguity and subjective problem-solving. Social attitudes in Factor 1 reflect a similar balance. Collaboration and independent work are both important, and individual preferences reflect the need for both shared time and space, and personal time and space. Organizational systems thinking items are not readily interpretable in this factor.

Factor 2 seems to represent general discipline engineering attitudes. These include an appreciation for nuance and detail, a preference for technical work (versus non-technical work like negotiation), and a reductionist approach to design based on organizational partitioning. Social attitudes in this factor are also less representative of systems design thinking, reflecting introversion and difficulty establishing common vocabulary across disciplines despite interest and effort. Organizational systems thinking items are also difficult to interpret in this factor.

3.2.3 Discussion

EFA results for the three-factor model of technical, social, and organizational systems thinking, as written, seem to reflect the style and quality of the attitude statements rather than the structure of the systems design thinking framework. Almost half of the items in this analysis were reverse-worded, meant to represent the ‘opposite’ of a systems design thinking attitude. However, these items did not function as the opposite of a systems design thinking attitude, and some reverse-worded items were consistently endorsed. This is interpreted to mean that systems design thinkers are cognitively flexible and context-sensitive, and alternate between a holistic and reductionist view as necessary. This is consistent with literature findings about engineering systems thinking (Brooks, Carroll, and Beard, 2011).

Study 1 also highlighted other important methodological issues. First, 57 attitude statements are too many to include on a scale of this type. The average response time for this survey was 15 minutes, more than double the target time of 7 minutes. Data indicate that surveys longer than 9 minutes start to see substantial levels of respondent break-off (Qualtrics Support, 2018). Second, the item display may have affected response time and user experience in a meaningful way. Items were presented in 10 consecutive matrices, each containing 5-6 items. Research has shown that response quality and completion rates both decline when questions use the matrix format (Qualtrics Support, 2018). The high number of matrix rows contained in the survey may have negatively impacted data quality and quantity.

Study 1 yielded additional valuable theoretical insights. First, the study did not provide any evidence for a model of systems thinking with technical, social, and organizational factors. Technical and social items that reflect the systems design thinking framework grouped into the first factor, and technical and social items that represent a more localized disciplinary perspective grouped into the second factor. Organizational systems thinking items were not easily interpreted within either factor. This could be because technical, social, and organizational systems thinking function similarly, although the subject is different; i.e., “systems thinking” is a big-picture, holistic framework for analyzing technical, social, and organizational systems alike. There may not be any functional difference between how each type of system is conceptualized.

This could be because technical subsystems and the social subsystems that design and deliver them are two sides of the same coin. To effectively integrate technical subsystems, systems engineers rely on methods for proactively engaging with and organizing the corresponding individuals and groups. Interview findings suggest that systems engineers build personal connections with discipline engineers through empathy, questioning techniques, and active listening. They facilitate meetings and interactions between disciplines to support additional information exchange. Additionally, systems engineers have formal methods for recording, integrating, and distributing this information according to organizational processes. This includes requirements definition and validation plans, schedules, change documentation, etc. (Collopy, 2019).

To explain the lack of support for “organizational systems thinking” as an independent factor, we interpret the findings to mean that systems thinking in systems engineering is a framework for organizing technology, people, and information. In other words, organizational systems thinking does not exist as an independent construct; rather, systems thinking is itself an organizational framework. Systems, whether technical or social, require organization. Technical systems thinking and social systems thinking both include organizational themes: e.g., partitioning subsystems and delegating work; identifying interactions and coordinating discipline engineers. The systems thinking framework can be applied to organize, understand, and influence relationships both between elements in systems and between individuals in design teams and organizations. This is related to the “learning and information” theme from the thematic analysis in Chapter 2.

Another major finding is that significant social systems thinking items bear strong resemblance to design thinking concepts, as depicted in Figure 3.2. Interview data suggested many instances of design thinking concepts in systems engineering. “Empathy/understanding” and “human-centered” were among the most common codes in the thematic analysis in Chapter II. Design thinking attitudes include many of the significant social and social/organizational systems thinking attitudes, such as “I enjoy using group creativity techniques such as mind mapping/brainstorming” and “Engineering is a creative process.”


Figure 3.2 Findings from Study 1 suggest that social systems thinking items seem to overlap with the design thinking framework. This finding is explored further in Study 2. Technical and organizational systems thinking items from Study 1 are used again in a new systems thinking factor. Social systems thinking items from Study 1 are reorganized into a design thinking factor that includes several new items for testing in Study 2.

This finding is explored further in Study 2. The goal of Study 2 is to understand the relationship between systems thinking and design thinking, which includes social systems thinking. Many items from Study 1 were used again in Study 2. Additional thematic analysis of the design thinking literature is discussed in Section 3.3. The goal of this analysis was to identify and include in Study 2 additional design thinking items which reference design thinking practices, such as prototyping, that were not included as a part of social or organizational systems thinking in Study 1.

3.3 Study 2: Systems Thinking and Design Thinking

The previous section described an exploratory study designed to test the quantitative research methodology and inform preliminary hypotheses about the latent factors underlying a construct we called systems design thinking. A three-factor model derived from interview data was tested, consisting of technical, social, and organizational factors. Findings suggested that this model was not supported in the quantitative analysis. In Study 2, systems thinking items are redistributed as depicted in Figure 3.3. The social systems thinking factor is restructured as a design thinking factor based on the findings in Section 3.2.3. The design thinking factor also includes themes like interpersonal relationships, empathy, understanding, questioning, curiosity, prototyping, etc., not included as part of social or organizational systems thinking in Study 1.

Figure 3.3 In Study 2, systems thinking items from Study 1 are redistributed and a new model is tested. The social systems thinking factor from Study 1 is included as part of a new design thinking factor. The design thinking factor also includes additional themes like empathy, questioning, prototyping, etc. not included as a part of social or organizational systems thinking items in Study 1.

3.3.1 Comparing Systems Thinking and Design Thinking

Chapter 2 described codes that capture major themes in design thinking and engineering systems thinking. This section expands on these themes, describing similarities and differences between themes in design thinking and engineering systems thinking. These similarities and differences are represented as items and explored quantitatively in subsequent sections.

Differences Between Systems Thinking and Design Thinking

Systems thinking and design thinking have different theoretical backgrounds and goals, which influence problem framing, approach, and solution methods. Systems thinking is based on systems science, systems theory, cybernetics, management science, and operations research; design thinking is based initially on experience from practice and later on social and behavioral science, art, and the humanities. These differences result in some of the stereotypes described in earlier sections.


Systems thinking is more associated with mathematical modeling and analysis than design thinking, and objectivity drives the problem framing approach. In systems thinking, the focus is on the technical solution, dealing with quantifiable relationships between elements. Because of the complexity of these models, necessary information is not always available, and accepting this ambiguity is part of the process. Emergent behavior may occur, and managing the related risks is imperative.

In design thinking, the problem solving approach is informed by industrial design, product design, psychology, art and the humanities, and many other disciplines. These disciplines make use of qualitative analysis, where subjectivity is sought out and embraced, especially as it reflects the human experience. In design thinking, the focus is on problem definition. Uncovering latent needs by sharing and understanding diverse personal experiences is of particular importance. Innovative, human-centered solutions are then generated based on these rich descriptions. Methods for design thinking are also human-centered, and include practices such as storytelling, empathy-building, and creativity techniques. Generally speaking, there is less cost and risk involved in consumer product design, so less emphasis is placed on planning, scheduling, coordinating, and managing in the design thinking framework.

Systems thinking and design thinking are also found to have different goals and processes, based on the analysis in Chapter 2. For example, systems thinking requires partitioning and coordination processes. The systems thinking framework is holistic, but a key practice is understanding elements both individually and as they interact and relate to the larger system. The different processes can be summarized as follows:

In systems thinking:

• Identify system elements and their interactions/interdependencies


• Partition solution

• Coordinate design

• Verify/validate requirements have been met

In design thinking:

• Identify needs, wants, and desires of users, stakeholders, and beneficiaries

• Synthesize

• Define problem

• Generate alternative solutions

• Prototype and test

Similarities Between Systems Thinking and Design Thinking

The “human element” is a key feature of both the systems thinking and design thinking frameworks. Both deal with people, but with different people and in different ways. In systems thinking, the people are the systems designers – those responsible for designing the technical system elements. Coordinating technical systems requires coordinating the related individuals and groups. This is done formally and informally, and requires empathy and understanding, communication skills, etc., similar to design thinking. In design thinking (for product design), the people of interest are the users of the product being designed. Communication, empathy, and understanding are still necessary for engaging them, but the nature of the user/designer relationship is different from the designer/designer relationship. This adds a “leadership and management layer” to engineering systems thinking.

Leadership and management skills are not necessary for engaging and empathizing with users as described in the design thinking framework, but leadership and management skills are necessary to organize designers and their work activities.


Another common theme is the exploration of relationships and interactions. In systems thinking, the focus is on each system element and how it affects and is affected by the others.

Change propagation is a particularly salient concern, as the effects of change in one subsystem may be disruptive to the functioning of other subsystems. Relationships and interactions also have implications for partitioning and coordination approaches, and can dictate the best approach for integrating elements to produce a cohesive, fully-functioning system. The design thinking framework seeks to uncover important factors in the problem space, and understand how their interactions create problems and unmet needs. Design thinking also attempts to address these factors holistically, through the development of a single, unified solution.

3.3.2 Scale Development

The scale for comparing systems thinking and design thinking in Study 2 was developed using codebook findings from Chapter 2 and insights from Study 1. Systems thinking is represented as a single factor, consisting of the following subthemes:

• “Big picture” thinking

• Tolerant of change/uncertainty

• Managing risk

• Coordination/communication

In Study 1, systems thinking was deconstructed into technical, social, and organizational factors. In Study 2, technical, social, and organizational systems are treated as attitude objects, while the attitudes themselves are reflected in the subthemes above. That is, regardless of whether the technical, social, or organizational system is being discussed, systems design thinkers should take a “big picture” approach, tolerate ambiguity/uncertainty, manage risk, and coordinate.

Design thinking themes include the following:


• Human-centered

• Creativity

• Prototyping/testing

• Iteration

These concepts, values, and practices are more human-centered and more related to understanding and working within social systems. Empathy, creativity, and storytelling enable communication across disciplines, organizations, and levels of education and experience.

Prototyping, testing, and iterating with users deepen the relationships between designers, and between designers and users. These practices focus on the human-centered process of designing, rather than on the technical, social, and organizational systems on and in which design happens.

In Study 2, 40 items were examined in a second exploratory factor analysis. A total of 29 items were retained from Study 1, and 11 new items were added. Example items are included in Tables 3.8 and 3.9. New items are listed with their corresponding systems thinking/design thinking theme. Items that were adapted from Study 1 and reused are labelled technical/social/organizational in parentheses underneath the new systems thinking/design thinking themes to reflect their categorization in Study 1.


Table 3.8 Example systems thinking attitude items and themes from Study 2. A total of 29 items were retained from Study 1, and 11 new items were added. New items are listed with their corresponding systems thinking theme. Items that were adapted from Study 1 and reused are labelled technical/social/organizational in parentheses underneath the new systems thinking themes to reflect their categorization in Study 1.

Systems Thinking Attitude Statements Themes

I need to know how my technical decisions affect the bigger system architecture. "Big picture" thinking (Technical)

I like to know what my colleagues are working on, even if it isn't directly related to my work. "Big picture" thinking (Social)

I am comfortable working with flexible/changing system requirements. Manage uncertainty/risk (Technical)

I am comfortable making technical decisions with incomplete information. Manage uncertainty/risk

I always have a backup plan in case something goes wrong. Manage uncertainty/risk

I enjoy taking on leadership roles. Coordination (Social)

It is easy for me to establish a common vocabulary with colleagues from other disciplines. Coordination (Social)

I am comfortable delegating tasks to others. Coordination

I prefer to resolve minor work issues in person rather than over email. Coordination


Table 3.9 Example design thinking attitude items and themes from Study 2. A total of 29 items were retained from Study 1, and 11 new items were added. New items are listed with their corresponding design thinking theme. Items that were adapted from Study 1 and reused are labelled technical/social/organizational in parentheses underneath the new design thinking themes to reflect their categorization in Study 1.

Design Thinking Attitude Statements Themes

I consider myself to be an empathetic person. Human-centered

I enjoy co-designing products/systems with customers. Human-centered

Workplace mentoring programs are important. Human-centered

I use storytelling and/or analogies to describe my work to others. Human-centered (Social)

I consider myself to be a creative person. Creativity

I enjoy using group creativity techniques such as mind mapping/brainstorming with my team. Creativity (Social)

Prototyping speeds up the design process. Prototyping/testing (Technical)

Higher-fidelity prototypes are always best. Prototyping/testing (Technical)

I see iteration to be an improvement of an idea rather than a setback. Iteration

3.3.3 Pilot Test, Factor Analysis, and Results for Study 2

Exploratory factor analysis (EFA) was used to study 40 attitude statements in Study 2.

These items were arranged in a 5-point Likert scale and evaluated by a second sample of 16 professional engineers and engineering researchers from industry, academia, and government.

Again, a sample size of 16 is very small for this type of study, but Study 2 is also considered to be a pilot study in these analyses. We use maximum likelihood estimation and varimax (orthogonal) rotation again in this analysis, as it is expected that an individual can have any combination of systems design thinking attitudes. We specify the model to explore solutions with up to three factors. We expect to see support for a two-factor model, with distinct systems thinking and design thinking factors.
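As a concrete illustration of this pipeline, the sketch below shows how such an EFA could be run in Python with the factor_analyzer package. The file name and DataFrame layout are assumptions for illustration, not the dissertation's actual tooling.

    # Hedged sketch of the Study 2 EFA: maximum likelihood extraction with
    # varimax (orthogonal) rotation, followed by the drop-and-refit step for
    # items whose largest loading falls below .300. The file name and column
    # layout (one column per Likert item) are illustrative assumptions.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    responses = pd.read_csv("study2_responses.csv")  # hypothetical data file

    fa = FactorAnalyzer(n_factors=2, rotation="varimax", method="ml")
    fa.fit(responses)
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns,
                            columns=["F1", "F2"])

    # Drop variables with no loading of at least .300, then reanalyze.
    retained = loadings[loadings.abs().max(axis=1) >= 0.300].index
    fa.fit(responses[retained])
    print(pd.DataFrame(fa.loadings_, index=retained, columns=["F1", "F2"]))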

The three-factor model did not produce any interpretable results. The 2-factor model fit slightly better than the 1-factor model, although both models are considered poor according to standard goodness of fit indices. Following the same protocol from Study 1, all variables in the 2-factor model with loadings below .300 were dropped, and the data reanalyzed. Variables with loadings above .300, retained in the post-hoc analysis, are listed with varimax rotated loadings in Table 3.10.

Table 3.10 Two-factor EFA results with varimax rotated loadings (Study 2)

Item ID F1 F2 Attitude Statement

T2 0.383 0.267 It is important that I understand how my work contributes to the overall system design.

T4 0.672 0.263 I like to receive a detailed set of requirements before beginning a project.

T5 0.201 0.532 I have received formal education in more than one discipline.

T10 0.328 -0.469 Higher-fidelity prototypes are always best.

S1 -0.030 0.662 I consider myself to be an empathetic person.

S2 -0.908 0.380 I enjoy co-designing products/systems with stakeholders outside my organization.

S3 -0.123 0.860 I use storytelling and/or analogies to describe my work to others.

S5 0.001 0.362 I enjoy using creativity techniques such as mind mapping/brainstorming with my team.

O2 0.510 0.112 Decision authority should be equally distributed between engineering and project management.

O6 0.512 -0.138 I prefer to resolve minor work issues in person rather than over email.

O7 0.426 -0.134 I feel comfortable delegating tasks to others.


In this analysis, Factor 2 can clearly be identified as design thinking. Empathy, co-design, storytelling, and creativity are key themes, as identified in the codebook in Chapter II. Many of the social systems thinking items from Study 1 loaded significantly onto the Design Thinking factor in Study 2. Items in Factor 1 appear to represent technical and organizational themes of systems thinking. These include requirements definition, delegation, partitioning, etc., and seem to more closely reflect systems engineering, as described in the codebook in Chapter II. This finding, and findings from Study 1, are combined in Figure 3.4.

Figure 3.4 Systems engineering and design thinking frameworks each include elements of systems thinking. Study 1 indicated that social and some organizational systems thinking items had much in common with the design thinking framework. Study 2 suggested that technical and some organizational systems thinking items may be closely related to the systems engineering framework. This finding is explored further in Study 3.

3.3.4 Discussion

As described in Section 3.3.1, systems thinking and design thinking share some fundamental similarities. Both are holistic approaches for understanding problems, engaging with stakeholders, and generating and realizing potential solutions. While design thinking originally evolved in business as a method for developing consumer products, it is now more generally applied as an innovation framework for technical, social, and economic problems. In today’s landscape, in which the internet, big data, and globalization have made complexity the norm, it is difficult to do design thinking without also doing systems thinking. While design thinking alone provides a compelling process for innovation, consideration of systemic

complexity and systems dynamics is necessary to ensure that potential solutions to these problems are viable and feasible. Thus, there exists an opportunity to integrate systems engineering and design thinking more formally. Complex social challenges like education and healthcare require systems engineering to design and diffuse innovations at scale (Tjendra, 2018). The relationship between systems engineering and design thinking attitudes is explored in Study 3.

3.4 Study 3: Systems Engineering and Design Thinking

Section 3.3 described development and testing of a two-factor model of systems design thinking, consisting of a systems thinking subscale and a design thinking subscale. This model was not supported by the data, possibly due to the similarities between systems thinking and design thinking frameworks. Systems thinking and design thinking are both holistic problem solving approaches that emphasize careful analysis of the problem space, with specific focus on interactions between elements of the system. Systems thinking is concerned with partitioning and coordination of these elements, while design thinking focuses primarily on synthesis between elements (Spacey, 2016). Significant systems thinking items from Study 2 more closely reflected the processes of systems engineering, focusing on modeling, analysis, requirements definition, and planning rather than the epistemological representation of systems.

Substantial differences exist between systems engineering and design thinking, especially in the problem definition phase (whereas systems thinking and design thinking have a similar approach to problem definition). In design thinking, it is assumed that the problem is not well-understood, and uncovering the problem is a key part of the process. Systems engineering processes do not formally begin until the project, problem, and requirements are already defined. These processes require different skills; in design thinking, ideas are integrated; in systems engineering, hardware and software are integrated. Design thinking is an open-ended, iterative process; systems engineering is a linear, stepwise process with clear answers and next steps. These processes ask different questions: design thinking asks, “what is the need?” to generate requirements, and systems engineering asks, “what can we build?” to meet the given requirements.

This section explores systems engineering attitudes and their relationship to design thinking attitudes, structured as follows. First, a literature review is conducted to understand contemporary descriptions of the SE/DT distinction: the “stereotypes” of engineers vs. designers.

Then, scale development and factor analysis are discussed, in which we test a two-factor model of systems design thinking consisting of systems engineering and design thinking subscales. A pilot test is conducted first using Amazon MTurk and produces promising results. The scale is then distributed to broader audiences through snowball sampling and on Reddit, a social media platform. Reddit’s usefulness for survey research is discussed in Section 3.4.4. Good model results are then confirmed, first quantitatively using confirmatory factor analysis, and then qualitatively, through additional analysis of feedback from Reddit users.

3.4.1 Comparing Systems Engineering and Design Thinking

Thematic analysis of the literature reflecting the traditional systems engineering vs. design thinking distinction is used to derive conceptual models of the relationship between systems engineering and design thinking attitudes. Systems engineering and design thinking have been widely seen as distinctly different processes, systems engineering being more data-driven and analytical, and design thinking being more human-centered and creative.

Classical work in systems engineering and operations research by Checkland (1981) describes this duality as systems practice and systems thinking. The application of systems thinking in systems engineering practice is described as part of soft systems methodology (Checkland and Scholes, 1990). Soft systems methodology is a sense-making approach to solving engineering problems that addresses human and social aspects of problem situations in addition to engineering objectives. Holwell (1997) describes soft systems methodology as “classic systems engineering with the transforming addition of human activity systems modelling.” Our themes reflect this characterization by capturing “classic systems engineering” (i.e., methodical, analytical, and data-driven) attitudes and attitudes about human-centered (i.e., empathetic, innovative, and creative) modelling and design processes.

Pidd (1996) summarizes the differences between “hard” engineering approaches and “soft” analytical approaches like design thinking along four dimensions: problem definition, model, the organization, and outcomes. From Pidd’s work we take the following description of differences related to problem definition:

“Problems are social or psychological constructs that are the result of framing and naming (Schön, 1982). This contrasts with the view, common in engineering, that work begins once a need is established. Thus...in soft analysis, the work focuses on ends as well as means to those ends. In hard systems engineering, the idea is to provide “something to meet the need” and the concern is with “how [do we meet the need]...not what [is the need]?” (Checkland and Scholes, 1990)

A common attitude in engineering is that design problems are taken as given, along with their parameters, constraints, objectives, and requirements. These problems are unambiguous, involve known physical variables, and mathematical and simulation-based analyses are necessary and sufficient for solving them. In “soft analysis,” problems, parameters, and solutions are ambiguous, and qualitative methods are used to consider social and psychological contexts while defining the design problem and engineering parameters.


A similar discussion of requirements definition approaches appears as part of a research agenda on the “top 10 illusions of systems engineering” (Pennock and Wade, 2015). This work describes classical assumptions of systems engineering, and ways in which these assumptions are illusory in today’s systems environment. Requirements definition is described as traditionally absolute and unambiguous:

“Traditional systems engineering assumes that there is a “right” or optimal answer....this is one of the major assumptions that often separates traditional systems engineering from systems thinking which embraces the soft, and often inconclusive, nature of systems. Systems engineering operates under the illusion that it is possible to specify unambiguous requirements using human language. Of course, experience tells us that it is quite common for reasonable people to hold differing interpretations of a [requirement].” (Pennock and Wade, 2015)

The authors go on to describe systems engineering as traditionally mechanistic:

“Traditional systems engineering is blind to this human element, including culture and history. [In systems engineering] it is implicitly assumed that these factors will not substantially influence outcomes. However, norms and values can affect both what potential solutions are considered and how a systems engineering program is executed.” (Pennock and Wade, 2015)

“Empathetic” and “relationship-driven” are included as complementary dimensions of the design thinking process. In addition to empathy-driven processes, design thinking also includes strategic and practical processes for the innovation of products and services within business contexts. Inspiration, creativity, iteration, and prototyping are human-centered processes for stimulating innovation and increasing business viability (Brown and Katz, 2011).

Strategic and practical processes in systems engineering (we refer to these as “methodical/systematic”) more closely resemble project management processes such as scheduling, budgeting, and change documentation.

Other Differences Between Systems Engineering and Design Thinking

Like systems thinking and design thinking, systems engineering and design thinking have different theoretical backgrounds and methodologies. Systems engineering includes theory and methods for design optimization, operations research, and other processes, which are often mathematics-based and model-driven. Systems engineering is concerned with “solving the problem right.” Design thinking is intended to address other considerations, and ensure that designers are “solving the right problem.” Design thinking moves away from mathematical modeling, favoring theory from disciplines such as psychology, sociology, and anthropology and methods such as field studies, interviews, ethnography, etc.

Systems engineering and design thinking also have different starting points. While design thinking can be applied in different ways throughout a design process, it is typically considered to be a part of front-end design. Again, the goal of design thinking is to provide insight into stakeholder values for the purpose of problem definition; thus, design thinking begins with needs assessment and “desirability” of the proposed solution. Design thinking goes from this stage through to concept generation, prototyping, and testing.

Systems engineering starts with “solving the problem right.” The problem has already been defined and the design concept has largely been selected; systems engineering is concerned with optimizing and implementing the desired solution.


3.4.2 Scale Development and Pilot Test

The following themes were selected to represent systems engineering attitudes in Study 3:

• Unambiguous problem definition, objectives, and requirements

• Analytical/data-driven

• Methodical/systematic

The following themes were selected to represent design thinking attitudes in Study 3:

• Ambiguous problem definition, objectives, and requirements

• Empathetic/relationship-driven

• Innovative/creative

Ethnographic findings suggest that these characteristics are not necessarily “opposites” or mutually exclusive, although they are often represented as such. The “methodical/systematic” engineering process is not the opposite of an “innovative/creative” design process, and many systems engineers are both methodical/systematic and innovative/creative. Similarly, one can be both empathetic/relationship-driven and analytical/data-driven, and research suggests that a human-centered approach is actually necessary for coordinating complex sociotechnical systems (Williams and Derro, 2008). Experienced systems engineers are also able to challenge assumptions about problems and constraints while working within process boundaries.

Statements reflecting these six themes were written using language from professional, academic, and open-source materials on systems engineering and design thinking (INCOSE, 2015; Shea, 2017). These statements were included as items on a five-point Likert scale ranging from “1 - strongly disagree” to “5 - strongly agree”. Systems engineering items and themes are listed in Table 3.11. Design thinking items and themes are listed in Table 3.12.


Table 3.11 Systems engineering attitude items from Study 3

Assume unambiguous problems, objectives, and solutions:
• I like to receive a detailed set of requirements before beginning a project.
• I generate better ideas when I have a defined problem statement and objectives.
• I can infer a customer's expectations based on the project goals.

Analytical/data-driven:
• I build simulations and/or models to test my ideas.
• I use quantitative methods to compare different ideas.
• I use mathematical modelling/analysis to predict whether my designs will meet customer expectations.
• I make design decisions based on data/analytical results.
• I evaluate the success of my designs using quantifiable performance measures.

Methodical/systematic:
• I document every change I make to my designs.
• I always compare my final design to the initial project goals.
• I evaluate designs based on cost and schedule.

Table 3.12 Design thinking attitude items from Study 3

Assume ambiguous problems/solutions:
• I am comfortable working with changing project requirements.
• I like to redefine or restructure the problems I am given to work on.
• I like to find unconventional ways to solve problems instead of relying on past methods.
• I am a curious person.

Empathetic/relationship-driven:
• I am an empathetic person.
• I like to speak directly with my customers to ensure that my design meets expectations.
• I like to interact with customers frequently throughout the design process.
• I use storytelling techniques to understand the problems I am given to work on.

Innovative/creative:
• Iteration is an improvement of a design rather than a setback.
• I am a creative person.
• I find inspiration for my work in my everyday life.
• I use intuition to make design decisions.


A pilot test was conducted through Amazon Mechanical Turk (MTurk), a crowdsourcing marketplace for distributing processes and jobs to a decentralized human workforce (Amazon, 2018). MTurk enables “Requesters” to coordinate the use of human intelligence to perform tasks that computers are unable to do. These tasks include things like simple data validation and research, in addition to more subjective tasks like survey participation, content moderation, and more (Amazon, 2018).

We chose to use MTurk to recruit participants to limit demands on our known pool of experts after the first two pilot studies. Amazon Mechanical Turk allows Requesters to qualify users before they work on tasks; for this study, we requested that all users have degrees in engineering. Data was collected from 32 individuals with bachelor’s degrees in engineering.

Results were promising, so we proceeded to recruit a larger panel of experts for model testing as described in Section 3.4.3.

3.4.3 Subject Populations and Recruitment

The survey was designed for an expert sample working in a professional systems engineering or design context. The survey was developed using interview data from experts, expert materials such as professional systems engineering handbooks, and published research on experts conducted in professional settings. For this study, we recruited a small sample of known experts in systems engineering and design thinking research. This panel consisted of 88 self- and peer-identified experts, recruited through personal referral, listserv data, and from participation in engineering design conferences and events. Of the 88 expert participants, 66 held doctorate degrees and 15 held master’s degrees. Because a sample size of 88 is still too small to draw meaningful conclusions using exploratory and confirmatory factor analysis (Kline, 2015), a second group of participants was also recruited. The second group consisted of a panel of 369 participants recruited through the social media site Reddit. We use multigroup confirmatory factor analysis to compare the two recruited panels as a consistency check that the same model emerges across both samples, even if one sample is relatively small.

3.4.4 Crowdsourcing for Data Collection: Recruiting From Reddit

In this study we explore the use of crowdsourcing in the context of data collection. Crowdsourcing is a term that refers to tasks performed by an undefined network of people (Howe, 2006). Survey distribution and data collection over the internet can be viewed as crowdsourcing, but modern definitions of crowdsourcing have shifted away from a large untargeted network of people contributing to a body of knowledge toward a targeted group of individuals that participate in organizational decision-making processes (Kietzmann, 2016). There are several contextual variations of the term “crowdsourcing,” such as Open Innovation Challenges, Data Collection, and Analysis (Hill, Dean, and Murphy, 2014).

While there are popular options such as Amazon Mechanical Turk in the crowdsourcing marketplace (Landers and Behrend, 2015), measuring engineering and design attitudes requires a targeted distribution strategy to individuals with appropriate background experience. We seek other convenient distribution strategies and look to Reddit as a source for study participants with appropriate background in engineering. Reddit is an online forum, organized into smaller communities called subreddits. There are several engineering and design-focused subreddits to which the survey recruitment script was posted, described in detail in the results section.

There currently exist too few published articles where participants were recruited through Reddit to reliably gauge the quality of data collected through it. However, using Reddit to recruit study participants is potentially advantageous for several reasons (Shatz, 2017). First, it is possible to recruit large samples in a short amount of time. Second, Reddit’s community of subforums, or subreddits, makes it possible to recruit participants from specific demographics and special interest groups. Distributing a survey via Reddit also adds the following layers to the distribution process:

• Upvotes and downvotes: An upvote (positive vote) or downvote (negative vote) adds to a score that indicates the overall quality of a post on Reddit. Early votes signal the quality of the post to future readers and hence influence the popularity of that post on a particular subreddit (Birman, 2018).

• Academic discourse with users: The incentive for a user to engage with quality academic research is to gain an understanding of the topic that they did not have before. Reddit has evolved by relying on external information to satisfy an ever-increasing need for original and self-referential content (Singer et al., 2014). Users look to provide feedback on their personal experiences via feedback in the survey or in the comments section. The quality of these comments and feedback can span a wide range owing to the disinhibition effect (Koivu, 2015).

• Targeted distribution: Surveys can be presented to selected subreddits. Selection of subreddits is based on factors such as quality of content on a subreddit, day-to-day activity, number of subscriptions for the subreddit, number of Redditors online who are subscribed to the subreddit, age of the subreddit, and other similar metrics.

There are general guidelines for posting content to Reddit, informally known as Reddiquette (Reddit, 2018). The following Reddiquette guidelines should be adhered to while posting to Reddit:


• Having a Reddit account with sufficient age/activity to begin posting: Creating several posts with hyperlinks in them on a new account will trigger auto-moderation and may even result in a ban on the account.

• Removing personally identifying information from the survey recruitment script: This includes names and email addresses that link the Reddit user account to a real person. This is done to prevent bots/scavenging algorithms from collecting sensitive information. This can also be grounds for automatic deletion of a post. This information should be presented inside the survey itself only if required.

• Using full-length links: Shortened links hide the original URL and will cause automatic deletion of the post in some subreddits.

• Assigning appropriate flair: Flair for a post on a subreddit indicates the nature of the post to users that read it. These are tags attached to the original post that help Redditors on a subreddit determine the nature of a post. Default flairs are unique to every subreddit. Flair can be used to indicate a post intended for discussion, or a post that conveys theoretical or factual knowledge.

• Searching for appropriate subreddits: Allow sufficient time to engage with a subreddit and determine its true nature. Spamming several unrelated subreddits will lead to backlash from the community through comments and downvoting.

• Tagging: The title of the post should include tags such as [Survey] and [Intended Demographic] to clearly indicate the purpose of the post. This helps users avoid reading a post about a survey when they are not interested in taking a survey.

• Time for completion: The script should also include an accurate representation of the amount of time needed to complete the survey. Underestimation can lead to frustrated users, and overestimation can lead to a decrease in the number of survey attempts (Galesic and Bosnjak, 2009).

• Preparing to engage actively with Redditors over the life of a post: A post is most active in the first 2-3 hours. Tips on maximizing the utility of a Reddit post and maximizing engagement are summarized by Shatz (2017).

The Systems Design Thinking Scale on Reddit

The Systems Design Thinking Scale was posted to 25 unique subreddits. A total of 369 Redditors provided scores for the 23 variables in Tables 3.11 and 3.12. In this sample, 35% reported their job title as senior level or above (management, senior management, director, and professorship positions); 37% as entry-level or analyst/associate; and 17% as student/intern. The survey completion rate was ~40%.

Table 3.13 summarizes the results from the top 3 posts created on Reddit. Although it is difficult to determine what constitutes a successful post, posts that had positive community engagement early on saw higher view counts. The best posts are those in which users leave highly personalized feedback in the comments section of the post, and researchers respond to these comments individually.


Table 3.13 Posting the Systems Design Thinking Scale on Reddit. The “r/” preceding Subreddit names indicates “Subreddit.” “Subscriptions” refers to the number of users following the Subreddit. “Upvotes” refers to the number of positive votes the survey post received on each subreddit.

Subreddit Subscriptions Upvotes % Upvoted Views

r/MechanicalEngineering 20000 27 89 1500

r/userexperience 34700 24 100 1500

r/ElectricalEngineering 23000 16 86 1400

All of the subreddits in the table above expressed interest in viewing or learning the results from the survey. Other subreddits showed little to no engagement, likely because of a poor initial reaction or because the survey did not resonate with the nature of the subreddit.

The mean response time in seconds for a 5% trimmed sample (to eliminate extremities) of the subset of fully completed responses was 291 seconds, with a standard deviation of 138 seconds. This is in line with the advertised survey duration of 300 seconds. The total percentage of all survey attempts that were fully completed was ~40%.
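For reference, this trimmed summary can be computed as in the following sketch; the array name and data source are hypothetical.

    # Hedged sketch of the trimmed response-time summary. `seconds` is a
    # hypothetical array of completion times for fully completed responses;
    # trim_mean cuts the stated proportion from each tail, which is one
    # reading of "5% trimmed" (the dissertation's exact convention is not
    # specified).
    import numpy as np
    from scipy.stats import trim_mean, trimboth

    seconds = np.loadtxt("completion_times.txt")  # illustrative source

    mean_5pct = trim_mean(seconds, proportiontocut=0.05)
    sd_5pct = np.std(trimboth(seconds, 0.05), ddof=1)
    print(f"trimmed mean = {mean_5pct:.0f} s, sd = {sd_5pct:.0f} s")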

3.4.5 Exploratory Factor Analyses

Partial responses with missing data were deleted from the data set. All 23 items in Tables 3.11 and 3.12 were included in the EFA. The analysis was conducted on a data set consisting of 457 observations of these 23 variables (the expert and Reddit data were combined).

Factors were extracted using maximum likelihood estimation. An orthogonal rotational method (varimax) was used because we did not expect the obtained factors to be correlated based on the theoretical evidence described in Section 3.1.

A factor solution was obtained by considering Kaiser’s criterion (retaining factors with eigenvalues greater than one), the interpretability of obtained factor solutions, the internal consistency of the obtained factors, and model fit indices (Worthington and Whittaker, 2006). Items were removed if they did not load onto a distinct factor consisting of at least three items, or if they did not have a primary factor loading of 0.40 or above with no significant cross-loadings onto other factors.
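A sketch of these retention checks in Python (again with factor_analyzer; the file name, and the exact reading of the cross-loading rule, are assumptions) might look like this:

    # Hedged sketch of the Study 3 factor- and item-retention checks:
    # Kaiser's criterion plus the loading rules described above. The data
    # file is illustrative, and "no significant cross-loadings" is read
    # here as a .20 gap between primary and secondary loadings.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    items = pd.read_csv("study3_items.csv")  # 23 Likert items per row

    fa = FactorAnalyzer(rotation=None, method="ml")
    fa.fit(items)
    eigenvalues, _ = fa.get_eigenvalues()
    n_factors = int((eigenvalues > 1).sum())  # Kaiser's criterion

    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="ml")
    fa.fit(items)
    loads = pd.DataFrame(fa.loadings_, index=items.columns).abs()

    primary = loads.max(axis=1)
    secondary = loads.apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
    keep = loads[(primary >= 0.40) & (primary - secondary >= 0.20)].index
    print(sorted(keep))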

Four-, three-, two-, and one-factor models were compared in Table 3.14. In the 4-factor model, systems engineering variables SE1 and SE2 group together (with loadings of 0.844 and 0.516); these are the only two items loading greater than 0.400 on Factor 1. Factor 2 includes design thinking items 5-10 and 12, providing strong evidence for a “Design Thinking Attitudes” factor. Factor 3 includes systems engineering items 5-8, providing evidence for a “Systems Engineering Attitudes” factor. Factor 4 includes design thinking items 2 and 3; these are the only two items loading greater than 0.400 on Factor 4. Systems engineering items 3, 4, 9, 10, and 11 and design thinking items 1, 4, and 11 did not load onto any factor with a loading higher than 0.400. These results can be seen in Table 3.14.

A second analysis was conducted. First, systems engineering items SE1 and SE2 were dropped from the model. These two items grouped together into a single factor in the first analysis. These items describe attitudes about problem statements, objectives, and requirements. Preferences for these vary considerably among systems engineers; more qualitative analysis is required to fully understand the perception of requirements in systems engineering and to draft good survey questions about it. Design thinking items DT2 and DT3 were also dropped in the post-hoc analysis for similar reasons. These items grouped together in a single factor describing preferences for speaking and working directly with customers. In a second EFA, the first two eigenvalues were high (3.303 and 2.574), with lower values for the third and fourth (1.222 and 1.160). A clear interpretation exists for the 2-factor model. There is no clear interpretation of the 3-factor model, and no items in the third factor have loadings greater than 0.400. Results are captured in Tables 3.15 and 3.16.

Model fit indices suggested that the two-factor solution is a relatively good fit to the data. The Root Mean Square Residual (RMR) value of 0.046 was below the suggested 0.08 cut-off for very good fit. The Root Mean Square Error of Approximation (RMSEA) value of 0.047 was below the recommended 0.06 cut-off value for “very good” fit (Hu & Bentler, 1999; Kline, 2015). While there is some disagreement regarding exact cut-offs for fit indices and the RMR estimate of model fit, the two-factor EFA model was determined to be the best model. This solution suggests two conceptually meaningful factors, reflecting the underlying systems engineering and design thinking frameworks.


Table 3.14 Varimax rotated loadings for four factors (Study 3)

Item ID   Factor1   Factor2   Factor3   Factor4   Attitude Statement
SE1    0.844    0.187    0.025    0.032   I like to receive a detailed set of requirements before beginning a project.
SE2    0.516   -0.012    0.000   -0.021   I generate better ideas when I have a defined problem statement and objectives.
SE3    0.237   -0.123    0.167   -0.005   I can infer a customer's expectations based on the project goals.
SE4    0.015   -0.218    0.359   -0.008   I build simulations and/or models to test my ideas.
SE5   -0.057   -0.066    0.660   -0.001   I use quantitative methods to compare different ideas.
SE6    0.128    0.040    0.684    0.089   I use mathematical modeling/analysis to predict whether my designs will meet customer expectations.
SE7   -0.012    0.027    0.588   -0.050   I make design decisions based on data/analytical results.
SE8    0.052   -0.046    0.553    0.059   I evaluate the success of my designs using quantifiable performance measures.
SE9    0.168    0.009    0.268   -0.035   I document every change I make to my designs.
SE10   0.236   -0.124    0.239   -0.051   I always compare my final design to the initial project goals.
SE11   0.265    0.028    0.284   -0.150   I evaluate ideas based on cost and schedule.
DT1    0.050   -0.297   -0.092   -0.186   I am an empathetic person.
DT2    0.086   -0.212    0.050   -0.697   I like to speak directly with my customers to ensure that my design meets expectations.
DT3    0.070   -0.218    0.021   -0.808   I like to interact with customers frequently throughout the design process.
DT4   -0.107   -0.367    0.031   -0.247   I am comfortable working with changing project requirements.
DT5   -0.060   -0.486   -0.014   -0.152   I like to redefine or restructure the problems I am given to work on.
DT6   -0.016   -0.512    0.153   -0.008   I like to find unconventional ways to solve problems instead of relying on past methods.
DT7    0.135   -0.570    0.089   -0.068   I am a curious person.
DT8   -0.029   -0.402    0.137   -0.086   Iteration is an improvement of a design rather than a setback.
DT9    0.026   -0.589    0.049   -0.074   I am a creative person.
DT10  -0.044   -0.554    0.079   -0.219   I find inspiration for my work in my everyday life.
DT11  -0.082   -0.384   -0.141   -0.313   I use storytelling techniques to understand the problems I am given to work on.
DT12   0.089   -0.423   -0.047    0.075   I use intuition to make design decisions.


Table 3.15 Post-hoc EFA results: Varimax rotated loadings for three factors (Study 3)

Item ID   Factor1   Factor2   Factor3   Attitude Statement
SE3    0.212    0.027    0.097   I can infer a customer's expectations based on the project goals.
SE4    0.383    0.085    0.172   I build simulations and/or models to test my ideas.
SE5    0.636   -0.031    0.025   I use quantitative methods to compare different ideas.
SE6    0.690   -0.043   -0.107   I use mathematical modeling/analysis to predict whether my designs will meet customer expectations.
SE7    0.565   -0.187   -0.035   I make design decisions based on data/analytical results.
SE8    0.571    0.049   -0.012   I evaluate the success of my designs using quantifiable performance measures.
SE9    0.281   -0.098   -0.006   I document every change I make to my designs.
SE10   0.281    0.059    0.099   I always compare my final design to the initial project goals.
SE11   0.287   -0.274    0.005   I evaluate ideas based on cost and schedule.
DT1   -0.079   -0.095    0.354   I am an empathetic person.
DT4    0.003   -0.394    0.470   I am comfortable working with changing project requirements.
DT5   -0.006   -0.145    0.523   I like to redefine or restructure the problems I am given to work on.
DT6    0.188    0.052    0.471   I like to find unconventional ways to solve problems instead of relying on past methods.
DT7    0.140   -0.058    0.570   I am a curious person.
DT8    0.154   -0.063    0.410   Iteration is an improvement of a design rather than a setback.
DT9    0.129    0.387    0.590   I am a creative person.
DT10   0.113    0.101    0.568   I find inspiration for my work in my everyday life.
DT11  -0.124    0.071    0.447   I use storytelling techniques to understand the problems I am given to work on.
DT12   0.007    0.064    0.380   I use intuition to make design decisions.


Table 3.16 Two-factor EFA results (Study 3)

Item ID SYSENG DESIGN Attitude Statement

SE3    0.206    0.103   I can infer a customer's expectations based on the project goals.
SE4    0.374    0.184   I build simulations and/or models to test my ideas.
SE5    0.638    0.034   I use quantitative methods to compare different ideas.
SE6    0.697   -0.100   I use mathematical modeling/analysis to predict whether my designs will meet customer expectations.
SE7    0.570   -0.032   I make design decisions based on data/analytical results.
SE8    0.561   -0.003   I evaluate the success of my designs using quantifiable performance measures.
SE9    0.288   -0.012   I document every change I make to my designs.
SE10   0.272    0.115   I always compare my final design to the initial project goals.
SE11   0.301   -0.002   I evaluate ideas based on cost and schedule.
DT1   -0.076    0.350   I am an empathetic person.
DT4    0.031    0.419   I am comfortable working with changing project requirements.
DT5    0.001    0.519   I like to redefine or restructure the problems I am given to work on.
DT6    0.181    0.484   I like to find unconventional ways to solve problems instead of relying on past methods.
DT7    0.137    0.563   I am a curious person.
DT8    0.153    0.410   Iteration is an improvement of a design rather than a setback.
DT9    0.090    0.568   I am a creative person.
DT10   0.099    0.586   I find inspiration for my work in my everyday life.
DT11  -0.137    0.454   I use storytelling techniques to understand the problems I am given to work on.
DT12  -0.002    0.386   I use intuition to make design decisions.

Systems engineering items SE3, SE4, SE9, SE10, and SE11 and design thinking items DT1 and DT12 did not load on either factor with a loading greater than 0.400. These items were dropped from the model after the second analysis. A total of 11 items were removed from the measure after failing to meet the minimum criteria. The final factor loadings for the 12 retained items are provided in Table 3.17. These 12 items were included in a confirmatory factor analysis, described in Section 3.4.6.

Table 3.17 Final factor loadings (Study 3)

Item ID SYSENG DESIGN Attitude Statement

SE5    0.638    0.034   I use quantitative methods to compare different ideas.
SE6    0.697   -0.100   I use mathematical modeling/analysis to predict whether my designs will meet customer expectations.
SE7    0.570   -0.032   I make design decisions based on data/analytical results.
SE8    0.561   -0.003   I evaluate the success of my designs using quantifiable performance measures.
DT4    0.031    0.419   I am comfortable working with changing project requirements.
DT5    0.001    0.519   I like to redefine or restructure the problems I am given to work on.
DT6    0.181    0.484   I like to find unconventional ways to solve problems instead of relying on past methods.
DT7    0.137    0.563   I am a curious person.
DT8    0.153    0.410   Iteration is an improvement of a design rather than a setback.
DT9    0.090    0.568   I am a creative person.
DT10   0.099    0.586   I find inspiration for my work in my everyday life.
DT11  -0.137    0.454   I use storytelling techniques to understand the problems I am given to work on.

3.4.6 Confirmatory Factor Analyses

A confirmatory factor analysis was conducted with 12 items and two factors. The systems engineering factor includes items SE5, SE6, SE7, and SE8 from Table 3.17. The design thinking factor includes items DT4-DT11. The model is shown in Figure 3.5.


Figure 3.5 Two-factor model with parameter values and standard error
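The CFAs in this section were estimated with Mplus-style software (the invariance analyses below cite Mplus v. 8). The following is a hedged Python equivalent of this two-factor specification using semopy's lavaan-style syntax; the data file path is an assumption for illustration.

    # Hedged semopy sketch of the 12-item, two-factor CFA. Item IDs match
    # Table 3.17; the file path is illustrative, not the actual data set.
    import pandas as pd
    import semopy

    data = pd.read_csv("study3_items.csv")

    model_desc = """
    SYSENG =~ SE5 + SE6 + SE7 + SE8
    DESIGN =~ DT4 + DT5 + DT6 + DT7 + DT8 + DT9 + DT10 + DT11
    SYSENG ~~ DESIGN
    """

    model = semopy.Model(model_desc)
    model.fit(data)
    print(model.inspect())           # loadings, factor covariance, standard errors
    print(semopy.calc_stats(model))  # chi-square, RMSEA, CFI/TLI, and other indices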


Results of this analysis, reported in Table 3.18, suggest an acceptable model fit:

Table 3.18 Model results from confirmatory factor analysis with 12 items

SYSENG by   Estimate   S.E.    Est./S.E.   P-value
SE5         0.702      0.039   18.035      0.000
SE6         0.633      0.040   15.787      0.000
SE7         0.602      0.042   14.449      0.000
SE8         0.557      0.043   12.959      0.000

DESIGN by   Estimate   S.E.    Est./S.E.   P-value
DT4         0.423      0.047    8.967      0.000
DT5         0.509      0.044   11.594      0.000
DT6         0.505      0.044   11.559      0.000
DT7         0.585      0.041   14.355      0.000
DT8         0.431      0.047    9.271      0.000
DT9         0.566      0.042   13.625      0.000
DT10        0.601      0.040   15.016      0.000
DT11        0.413      0.047    8.735      0.000

DESIGN WITH SYSENG   0.118   0.064   1.862   0.063

χ²        df   p       RMSEA   CFI/TLI       SRMR
131.777   53   0.000   0.057   0.912/0.890   0.050

Modification indices suggested some issues with variables DT4 and DT11, in addition to their low loadings. DT4, DT8, and DT11 were all dropped due to low loadings (less than .500), and the analysis was run again with 9 total items: systems engineering items SE5, SE6, SE7, and SE8, and design thinking items DT5, DT6, DT7, DT9, and DT10. Model results are reported in Table 3.19.


Table 3.19 Model results from confirmatory factor analysis with 9 items

SYSENG by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
SE5         0.705      0.039   18.122      0.000
SE6         0.634      0.040   15.854      0.000
SE7         0.598      0.042   14.333      0.000
SE8         0.557      0.043   12.949      0.000

DESIGN by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
DT5         0.478      0.048   10.029      0.000
DT6         0.551      0.045   12.167      0.000
DT7         0.575      0.044   12.953      0.000
DT9         0.585      0.044   13.227      0.000
DT10        0.586      0.044   13.311      0.000

DESIGN WITH SYSENG   0.149   0.065   2.285   0.022

χ²       df   p       RMSEA   CFI/TLI       SRMR
43.128   26   0.019   0.038   0.974/0.964   0.034

As shown in Table 3.19, the CFA shows a good fit to the data. The RMSEA of 0.038 is below the suggested 0.08 cut-off value. The CFI and TLI values of 0.974 and 0.964, respectively, are above the .90 cut-off value for good fit suggested by Kline (2015). The SRMR value of 0.034 is below the 0.08 cut-off for good fit. The low correlation (0.149) between the factors suggests that systems engineering attitudes and design thinking attitudes are largely independent: an individual can hold engineering attitudes, design attitudes, both, or neither.

The distribution of individuals examined is summarized in Figure 3.6. Scores were calculated for the systems engineering and design thinking subscales (sample means of 15 and 20, respectively). Individuals were categorized as “high” in a category if they scored above the mean on that subscale, and “low” if they scored below the mean.
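One reading of this classification rule, as a sketch behind Figure 3.6 below (item sets are the final nine retained items; the cut scores of 15 and 20 come from the text; file and column names are illustrative):

    # Hedged sketch of the high/low mean-split classification. Subscale
    # scores are item sums; the sample means of 15 and 20 are taken from
    # the text. Data file and column names are illustrative assumptions.
    import pandas as pd

    data = pd.read_csv("study3_items.csv")

    se_items = ["SE5", "SE6", "SE7", "SE8"]
    dt_items = ["DT5", "DT6", "DT7", "DT9", "DT10"]

    se_group = (data[se_items].sum(axis=1) > 15).map({True: "High", False: "Low"})
    dt_group = (data[dt_items].sum(axis=1) > 20).map({True: "High", False: "Low"})

    # 2x2 table of participants, mirroring Figure 3.6.
    print(pd.crosstab(se_group, dt_group,
                      rownames=["Systems Engineering"],
                      colnames=["Design Thinking"]))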


                        Design Thinking
                        Low        High

Systems Engineering
  Low                   58         116
  High                  96         188

Figure 3.6 Systems Design Thinking classification of 458 survey participants. Mean scores were calculated for the systems engineering and design thinking subscales. Individuals were categorized as “high” in a category if they scored above the mean, and “low” if they scored below the mean.

Overall, model fit indices indicated that the hypothesized relationships between observed variables and their corresponding latent constructs were a good fit to the data. All variables loaded significantly onto the same factor in the CFA as they had in the EFA, which provides psychometric support for the Systems Design Thinking Scale and its factor structure using an alternative modelling approach.

3.4.7 Multigroup CFA

The survey was designed for an expert sample working in a professional setting. The survey was developed using interview data from experts, expert materials (e.g., systems engineering handbooks), and published research from expert settings. An expert sample was recruited for the study; however, this sample was small, as experts are difficult to find. To survey the recommended number of participants for EFA/CFA, the sample was expanded to include the Reddit group. Demographic information was recorded for both groups. Information collected included “job title,” with the following choices: student, intern, entry-level, analyst/associate, senior level, management, senior management, director, professor, or other with an option to write in.

Two multigroup CFAs were conducted. First, the Reddit sample was compared to the expert participants we contacted directly. Then, both samples were combined, and entry-level participants were compared to expert participants from both samples.
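The multigroup models themselves were fit in Mplus-style software; as a simplified consistency check of the same kind, one can refit the final two-factor model separately within each group and compare the estimates, as in this hedged sketch (the `group` column is a hypothetical sample tag, not a variable from the actual data set).

    # Hedged sketch approximating the configural step of a multigroup CFA:
    # the same semopy model is refit within each group so loadings can be
    # compared side by side. This is a simplification of the multigroup
    # analysis reported below; `group` is a hypothetical column.
    import pandas as pd
    import semopy

    data = pd.read_csv("study3_items.csv")

    model_desc = """
    SYSENG =~ SE5 + SE6 + SE7 + SE8
    DESIGN =~ DT5 + DT6 + DT7 + DT9 + DT10
    SYSENG ~~ DESIGN
    """

    for name, grp in data.groupby("group"):  # e.g. "reddit" vs. "known"
        model = semopy.Model(model_desc)
        model.fit(grp.drop(columns=["group"]))
        print(name)
        print(model.inspect())  # per-group loadings and factor covariance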

Reddit vs. Known Sample

In this analysis, the Reddit sample (n=371) and the known expert sample (n=87) are compared. Model results are reported in Tables 3.20, 3.21, and 3.22 below:

Table 3.20 Multigroup CFA results: Reddit vs. known expert sample

χ²        Reddit sample   Known sample   df   p       RMSEA   CFI/TLI       SRMR
126.354   51.819          74.535         66   0.000   0.063   0.915/0.907   0.074

Table 3.21 CFA for Reddit group

SYSENG by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
SE5         0.683      0.043   16.063      0.000
SE6         0.621      0.040   15.352      0.000
SE7         0.585      0.042   13.833      0.000
SE8         0.548      0.043   12.631      0.000

DESIGN by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
DT5         0.489      0.049    9.942      0.000
DT6         0.554      0.042   13.099      0.000
DT7         0.600      0.047   12.895      0.000
DT9         0.555      0.044   12.687      0.000
DT10        0.534      0.046   11.654      0.000

DESIGN WITH SYSENG   0.086   0.075   1.143   0.253


Table 3.22 CFA for known expert sample

SYSENG by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
SE5         0.717      0.053   13.492      0.000
SE6         0.769      0.057   13.384      0.000
SE7         0.670      0.062   10.790      0.000
SE8         0.597      0.067    8.973      0.000

DESIGN by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
DT5         0.463      0.060    7.762      0.000
DT6         0.691      0.069   10.051      0.000
DT7         0.478      0.061    7.818      0.000
DT9         0.749      0.059   12.711      0.000
DT10        0.723      0.056   12.827      0.000

DESIGN WITH SYSENG   0.349   0.124   2.808   0.005

The model performs well for both groups. Systems engineering items have higher loadings among known experts, and the correlation between systems engineering and design thinking attitudes is higher in the known expert group (0.349 vs. 0.086).

Entry vs. Senior Level: Both Groups

In this analysis, Reddit data was combined with data collected through snowball sampling. Entry-level and senior (expert) level participants were compared. Entry level includes student, intern, entry-level, and analyst/associate. Senior level includes the following job titles: senior level, management, senior management, director, and professor. The entry-level group included 213 observations. The senior-level group included 245 observations. Total sample size was 458 observations. Model results are presented in Tables 3.23, 3.24, and 3.25 below:

Table 3.23 Multigroup CFA results: Entry vs. senior level (combined sample)

χ²       Entry-level   Senior   df   p        RMSEA   CFI/TLI       SRMR
60.469   31.464        29.005   50   0.1475   0.030   0.981/0.979   0.054


Table 3.24 CFA for entry-level group

SYSENG by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
SE5         0.681      0.050   13.597      0.000
SE6         0.638      0.047   13.667      0.000
SE7         0.563      0.050   11.233      0.000
SE8         0.549      0.049   11.273      0.000

DESIGN by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
DT5         0.480      0.056    8.599      0.000
DT6         0.528      0.057    0.199      0.000
DT7         0.595      0.051   11.601      0.000
DT9         0.540      0.047   11.437      0.000
DT10        0.543      0.048   11.277      0.000

DESIGN WITH SYSENG   0.182   0.097   1.867   0.062

Table 3.25 CFA for senior-level group

SYSENG by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
SE5         0.711      0.043   16.391      0.000
SE6         0.632      0.047   13.309      0.000
SE7         0.627      0.047   13.447      0.000
SE8         0.574      0.049   11.615      0.000

DESIGN by   Estimate   S.E.    Est./S.E.   Two-Tailed P-value
DT5         0.483      0.049    9.885      0.000
DT6         0.543      0.046   11.705      0.000
DT7         0.545      0.050   10.798      0.000
DT9         0.670      0.050   13.527      0.000
DT10        0.641      0.049   12.973      0.000

DESIGN WITH SYSENG   0.117   0.087   1.342   0.180

The model performs well for both groups, but fits the senior (expert) group slightly better. The largest differences are observed on design thinking items DT9 (0.525 for entry-level vs. 0.702 for senior level, ∆ = 0.177) and DT10 (0.511 for entry-level vs. 0.658 for senior level, ∆ = 0.147).

3.4.8 Tests for Measurement Invariance

The extent to which this model exhibited measurement and structural invariance between novices and experts was examined using Mplus v. 8 (Muthén & Muthén, 2017), following the method of Hoffman (2018). Robust maximum likelihood (MLR) estimation was used for all analyses. Novices were used as the reference group in all invariance models. A configural invariance model was initially specified, in which single-factor models were estimated simultaneously within each group. The factor mean was fixed to 0 and the factor variance was fixed to 1 for identification within each group. The configural model had good fit. A series of model constraints were then applied in successive models to examine potential decreases in fit due to measurement or structural non-invariance.

Equality of the unstandardized item factor loadings across all groups was then examined in a metric invariance model in which the factor variance was fixed to 1 in novices but was freely estimated in experts. Factor means were fixed to 0 in both groups. All factor loadings were constrained to be equal across groups. All intercepts and residual variances were permitted to vary across groups. The metric invariance model fit well and did not result in a significant decrease in fit relative to the configural model. The modification indices suggested no points of localized strain among the constrained loadings. The fact that metric invariance held indicates that the items were related to the latent factors equivalently across groups; i.e., the same latent factors were being measured in each group.

Equality of the unstandardized item intercepts across groups was then examined in a scalar invariance model. The factor mean and variance were fixed to 0 and 1, respectively, for identification in the novice group. The factor mean and variance were estimated in the expert group. Factor loadings and item intercepts were constrained to be equal across groups. All residual variances were permitted to differ across groups. The scalar invariance model fit well and did not result in a significant decrease in fit relative to the metric invariance model.

Equality of the unstandardized residual variances across groups was then examined in a residual variance invariance model. The factor mean and variance were fixed to 0 and 1, respectively, for identification in the novice group. The factor mean and variance were estimated in the expert group. All factor loadings, item intercepts, and residual variances were constrained to be equal across groups. The residual variance invariance model fit well, and did not result in a significant decrease in fit relative to the previous models.

After achieving measurement invariance as described, structural invariance was then tested with two additional models. First, the factor variance in experts, which had been estimated freely, was constrained to 1 to be equal to the factor variance in novices. Second, the factor mean in experts, which had been estimated freely, was constrained to 0.

These analyses showed that full measurement and structural invariance was obtained between novices and experts.

3.4.7 Additional Qualitative Findings

In addition to the quantitative results, the Reddit analysis yielded additional qualitative data for validating the underlying theory and hypotheses. A feedback box was included in the Qualtrics form for both samples. Redditors were also able to post comments on the survey thread. The following responses were received from Reddit and are organized by the hypothesis they support.


On systems engineering:

• “When working with engineers (I am trained as a designer) I often find their process very linear and practical. They often will get the job done, but will also sometimes miss opportunities.”

On design thinking:

• “In my experience the major difference between a designer and an engineer is that a designer has a better vision and understanding of how the product will impact the consumer in their use.”

• “I often create new goals for a project because my clients often have a vague sense of it. Thinking outside my client’s box is why I get a lot more work, compared to other designers I work with. And when I do my best work, I’m also presenting clients something unique, and much needed.”

On engineering and design being perceived as [stereotypically] different:

• “Some questions are strange - like quantitative / data analysis for design.”

• “My perspective is not all things need to look good. They just need to work. Where needed aesthetics are extra time and need to be discussed with the customer beforehand.”

On possible relationships between frameworks:

• “I like to think of myself not as an engineer, but as a technical designer.”

• “As a mechanical [engineer], I use engineering tools (computer analysis, calculations, etc.) to solve design problems (form, fit, function, manufacturability, cost). They are two sides of a coin.”

• “I personally have worked quite well in collaboration with several actual engineers; while our skills are different, our mindsets both point to analysis and testing uncharted waters.”

• “In my world engineers ARE designers. There is nearly 100% overlap. We do not have anyone employed to do design that isn't either a degreed engineer or a senior technician with decades of experience.”

• “I think design and even art are very important to engineering in providing a base for creativity and expression. And it goes both ways. Knowing the analytical aspects of engineering and something as simple as a design of experiments process is highly useful for ID and product designers.”

In summary:

• “Similarities: Engineering and Design both employ creativity, input and interaction with user/customer, and an investment of personal passion. Differences: Engineering makes use of math, science, and rigor to balance multiple and sometimes conflicting requirements to achieve a successful result. Design may employ engineering methods but often is driven by other factors. An engineering solution may be successful but not viewed as an elegant design. A design solution may appear elegant but not necessarily be a sound engineering solution.”

3.4.8 Discussion

The goal of the studies described in this chapter was to understand systems engineering attitudes, design thinking attitudes, and the relationship between these two constructs through the development and validation of the Systems Design Thinking Scale. Traditional representations of systems engineering and design thinking were used to develop structural models of systems engineering and design thinking attitudes, which were evaluated quantitatively using exploratory and confirmatory factor analyses.

Findings support the traditional representation of systems engineering and design attitudes as two distinct latent constructs, but do not support the stereotype that these two constructs are mutually exclusive. Consistent with contemporary observation and experience, systems engineering and design thinking attitudes can be complementary. Design thinking can be used to innovate new solutions based on a "bottom-up" human-centered approach, while systems engineering processes support change management and integration by maintaining a “top-down,” big-picture view. This is especially important when applying design thinking to systems-level problems. While human-centered design processes often generate innovations that meet human needs, there is no guarantee that their diffusion into a large-scale, complex system will mirror diffusion into consumer markets. Systems engineering, which includes technical and organizational elements of systems thinking, is an approach for designing and deploying these types of solutions and ensuring that they perform optimally in their intended environment.

Systems engineering adds key values and practices (synchronicity, consistency, integration, and optimization) to the design thinking process (Tjendra, 2018).

The Systems Design Thinking Scale shows promise as a tool for capturing systems engineering and design thinking attitudes along a spectrum. Subscale scores for engineering and design attitudes may be useful for identifying and balancing perspectives within engineering design teams. This possibility presents an opportunity for an observational study in the future.

Another promising direction for this work is a behavioral study, in which scores on the Systems Design Thinking Scale are used to predict behaviors with known implications for the success of systems engineering projects. Understanding the relationship between attitudes and behaviors in this context would be useful for education and training.

Using Reddit to collect survey data had the unintended benefit of enabling concurrent collection of qualitative data for validating the underlying theory and hypotheses, and also for improving them and refining vocabulary. Through user feedback, we received the following recommendations for improving the questionnaire:

• For some items, frequency (e.g., rarely → often) would have been a better indicator than agree/disagree;

• Additional items describing delegation of responsibilities would have been useful (e.g., “I direct people to do X simulation/analysis.”).

Additional feedback about the questionnaire shared through open-response survey items, along with the Reddit comment sections, provided information that would be interesting to include in a future qualitative analysis.

3.5 Summary

In this chapter, quantitative analysis guided several iterations of the Systems Design Thinking Scale. In Study 1, systems design thinking was represented by three factors: technical systems thinking, social systems thinking, and organizational systems thinking. Findings suggested that technical and social systems thinking are important factors in systems design thinking. Findings related to the organizational systems thinking factor were difficult to interpret. Significant social systems thinking items appeared similar to design thinking concepts when the codebook from Chapter II was used to interpret the findings. In Study 2, the difference between systems thinking and design thinking was explored. Systems thinking items included significant technical and organizational items from Study 1. Design thinking items included significant social systems thinking items, and several additional items reflecting the design thinking themes from Chapter II.

Significant systems thinking items appeared similar to systems engineering concepts, when the codebook from Chapter II was used to interpret the findings. Almost all of the design thinking items were significant. Systems engineering and design thinking subscales were analyzed in Study 3. Results suggested that this model of systems design thinking was a good fit.

In the next chapter, we attempt to validate the Systems Design Thinking Scale by studying correlations between scale scores and performance on analytical reasoning and divergent thinking tasks.


CHAPTER IV

Validating the Systems Design Thinking Scale

4.1 Introduction

Chapter II described the development of systems design thinking theory, and Chapter III described the development of a scale for measuring systems design thinking. In this chapter, a pilot validation study is described, in which the scale’s ability to predict performance on systems engineering and design tasks is explored.

The validation study has several goals. The first is to establish construct validity; i.e., that the systems design thinking scale measures what it claims to be measuring. The second is to establish criterion validity; i.e., to determine the extent to which the systems design thinking measure is related to an outcome. The systems design thinking scale should be operationally useful for identifying occupational strengths, like the Jung Typology Profiler for Workplace (Kerbel & Wainstein, 2013), the Clifton StrengthsFinder (Rath, 2007), and other similar scales. It should mean something to be high on systems engineering attitudes, high on design thinking attitudes, or both. Therefore, it is necessary to identify what skills and behaviors are correlated with these self-reported attitudes, and their impact on work, in order for the scale to be operationally useful.

Then, the scale can be used to identify or categorize individuals based on their perspectives and strengths in a meaningful way, as “playing to strengths” is a time-effective way to improve performance and engagement at work (Rath, 2007).


Identifying the right behaviors to observe and the right way to measure them is a major challenge. In this chapter, a first attempt is made, guided by related work. Scale scores were used to predict performance on analytical reasoning and divergent thinking tasks. While no correlation was observed between scale scores and performance on the analytical reasoning task, some correlation was observed between design thinking subscale scores and performance on the divergent thinking task. This study yielded other important findings, which are interpreted, summarized as lessons learned, and used to contribute a validation plan for the Systems Design Thinking Scale.

4.2 Behavioral Research in Systems Engineering and Design Thinking

Controlled experiments, including laboratory and field experiments, are becoming increasingly important in systems engineering. While design of experiments is not unique to systems engineering research, certain features of the systems engineering context pose unique experimental design challenges that require special consideration (Panchal and Szajnfarber, 2017). First, complex systems are designed and developed by large, geographically dispersed teams over spans of years or even decades, while controlled experiments are typically performed in brief sessions with one or a few subjects. It is difficult to recreate complex systems problems in an experimental setting without significant loss of relevant information. Second, for experienced systems engineers in real organizations, transactional knowledge of how things get done in their particular organization is an important aspect of expertise. Recreating this knowledge in a short laboratory session is difficult.

Cognitive and behavioral research in systems engineering studies topics such as problem solving, mental model formation, teams, and team effectiveness (Avnet, 2016; DeFranco et al., 2011; de Graaf and Loonen, 2018). In these studies, surveys, interviews, observation, and documentation are used to study engineering design teams working in aerospace laboratories, academia, construction, and other settings. Avnet (2016) explores team coordination and shared cognition in engineering design teams using a network-based approach. DeFranco et al. (2011) describe the importance of shared mental models for team performance. De Graaf and Loonen (2018) explore team effectiveness, and the extent to which differences in team effectiveness can be explained by characteristics of systems engineering teams and organizations.

Other work in systems research and behavioral science describes a similar approach to the development and validation of a systems thinking scale (Davis and Stroink, 2016; Randle and Stroink, 2018; Thibodeau, Frantz, and Stroink, 2016). This work describes the development of a psychometric instrument for measuring systems thinking, and a study of this instrument in relation to well-studied constructs (e.g., holistic and relational thinking; creativity) and decision-making tasks in the psychological literature. Other studies of systems thinking list decision-making tasks, measures of holistic thinking (Maddux and Yuki, 2006; Choi et al., 2007; Chiu et al., 2000), and measures of relational reasoning (Thibodeau, Frantz, and Stroink, 2016; Vendetti, Wu, and Holyoak, 2014) as options for validating measures of systems thinking.

Design thinking research studies many aspects of design thinking, including the psychological and cognitive processes of design (Dinar et al., 2015; Plattner, Meinel, and Leifer, 2011; Shah, 2012; Toh and Miller, 2016). These processes include creativity, divergent thinking, analogical reasoning, sketching, prototyping, and visual representation, among others. Case studies, think-aloud protocols, controlled experiments, psychometric measurement, and, more recently, physiological measurement techniques are used to study these processes (Dinar et al., 2015).


4.3 Pilot Validation Study

4.3.1 Overview and Objectives

This study represents the first attempt at understanding the relationship between systems design thinking, behavior, and related psychological constructs. The goal of the study is to validate the Systems Design Thinking Scale by identifying correlations between scale scores and observable behaviors. This is the first step in operationalizing the Systems Design Thinking Scale.

4.3.2 Methods

In this study, the Systems Design Thinking Scale from Chapter III was delivered to participants on Reddit, along with analytical reasoning and divergent thinking tasks. The analytical reasoning tasks used in this study were adapted from Frederick’s Cognitive Reflection Test (2005). This task is designed to measure a person’s tendency to override an incorrect “gut” response and engage in higher-level analysis to find the correct answer to a problem. This task is taken here to represent the analytical and mathematical systems engineering approach to problem solving, versus the more intuitive approach of design. Design thinking is measured by a classical test of divergent thinking (Guilford et al., 1960). Divergent thinking is an integral process for generating many possible solutions in creative problem solving. Linear regression modeling is used to analyze the correlation between scores on the Systems Design Thinking Scale and the number of correct responses and unique responses on the analytical reasoning and divergent thinking tasks.


4.3.3 Behavioral Task Selection

Systems Engineering Tasks

Problem solving lies at the core of engineering practice and education alike. Word problems are a traditional instructional mechanism for learning how to apply mathematics to solving problems in the educational setting (Salado, Chowdhury, and Norton, 2018). In this study, word problems are used to explore the relationship between scores on the Systems Design Thinking Scale, analytical reasoning, and mathematical problem solving. Participants complete three short word problems, based on the cognitive reflection test by Frederick (2005). The cognitive reflection test is designed to measure a person’s tendency to override an incorrect “gut” response and engage in further analysis to find the correct answer to a problem.

It is expected that systems engineers are more likely to activate what Frederick and others describe as “System 2”: a deliberate and analytical cognitive process. System 1, by contrast, describes an intuitive, immediate response that is executed quickly and without reflection. As designers are often categorized as more intuitive and less analytical, we expect designers to answer these questions intuitively, while systems engineers are more likely to apply an analytical process, writing mathematical formulas to find the correct solution. The following questions are used for the analytical reasoning task in this study:

• If Rhonda is the 50th fastest and 50th slowest runner in her school, how many students are there in her school? [Correct answer: 99; Intuitive answer: 100]

• You have 100 pounds of potatoes, which are 99% water by weight. You let them dehydrate until they are 98% water by weight. Now how much do they weigh? [Correct answer: 50 pounds; Intuitive answer: 99 pounds]

• At a party, everyone shook hands with everybody else. There were 6 handshakes. How many people were at the party? [Correct answer: 4 people; Intuitive answer: 5 people]

Performance on these tasks is measured by the number of correct responses. Correlation is expected between systems engineering subscale scores and the number of correct responses: individuals high in systems engineering subscale scores are expected to generate more correct responses on these questions.
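A minimal scorer for these three items is sketched below. The answer key mirrors the bracketed answers above, and the response parsing is deliberately simplistic (an assumption, since the survey's actual response handling is not described). The potato item illustrates why the intuitive answer fails: at 99% water the 100 pounds contain 1 pound of solids, and after drying that same 1 pound must be 2% of the total, so the total is 1 / 0.02 = 50 pounds.

```python
# Illustrative scorer for the three analytical reasoning (CRT-style) items.
# The keys mirror the bracketed answers in the text; parsing is simplified.
import re

ANSWER_KEY = {        # item id -> (correct, intuitive)
    "rhonda": (99, 100),
    "potatoes": (50, 99),   # 1 lb of solids must be 2% of the total: 1 / 0.02 = 50
    "handshakes": (4, 5),   # C(4, 2) = 6 handshakes
}

def first_number(text: str) -> float | None:
    """Pull the first number out of a free-text response, if any."""
    match = re.search(r"-?\d+(?:\.\d+)?", text)
    return float(match.group()) if match else None

def score_analytical(responses: dict[str, str]) -> int:
    """Count correct responses (0-3) across the three items."""
    return sum(
        first_number(responses.get(item, "")) == correct
        for item, (correct, _intuitive) in ANSWER_KEY.items()
    )

# Example: score_analytical({"rhonda": "100", "potatoes": "50 lbs", "handshakes": "4"}) -> 2
```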

Design Thinking Tasks

Scores on the design thinking subscale should correlate with design thinking behaviors. In systems engineering, these include creating and generating divergent opinions and ideas, encouraging differing opinions, driving convergence on decisions, and others (Williams and Derro, 2008). Studies in systems thinking, which shares commonalities with design thinking as described in Chapter III, have explored the relationship between systems thinking and creativity. One study uses a simple measure of creativity in which participants are presented with three hypothetical situations and are asked to list as many consequences of the situations as possible (Randle and Stroink, 2018; Furnham and Nederstrom, 2010). Participants who scored higher on Randle and Stroink’s Systems Thinking Scale tended to generate more creative responses on the consequence measure.

In this study, a classic measure of divergent thinking (Guilford et al., 1960) is used to evaluate the design thinking subscale of the Systems Design Thinking Scale. Divergent thinking is the cognitive process used to generate creative ideas by exploring many possible solutions (Furnham and Nederstrom, 2010). Participants were asked to generate as many uses as they could think of for two common household items, a newspaper and a coffee mug.
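The task is scored by the number of unique responses (fluency). A minimal sketch of such a scorer is below; the normalization rules are assumptions, since the text does not specify how duplicate or near-duplicate uses were counted.

```python
# Illustrative fluency scorer for the alternate-uses task.
# Normalization (lowercasing, stripping punctuation/articles) is an assumed
# convention; the dissertation does not specify its deduplication rules.
import re

def normalize(use: str) -> str:
    """Canonicalize one free-text 'use' so trivial variants collapse together."""
    text = re.sub(r"[^a-z0-9 ]", "", use.lower()).strip()
    words = [w for w in text.split() if w not in {"a", "an", "the"}]
    return " ".join(words)

def fluency(uses: list[str]) -> int:
    """Count unique non-empty uses after normalization."""
    return len({normalize(u) for u in uses if normalize(u)})

# Example: fluency(["Fly swatter", "a fly swatter!", "hat", ""]) -> 2
```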


4.3.4 Study Population and Recruitment Strategy

In this study, the process of recruiting from Reddit was refined. The recruitment strategy now accounted for the best time of day to post, based on the analysis by Candocia (2017). A similar process was applied to timing posts on Facebook and LinkedIn as well (Hootsuite, 2018; Kolowich, 2019).

The best time to post to Reddit was estimated to be 8:00 AM EST (Candocia, 2017); for this study, the process began at 6:00 AM EST because there is a 10-minute delay when posting to multiple subreddits and eleven subreddits were targeted in total. Starting early also allowed time to resolve mistakes that caused auto-moderators to remove posts. Posts were made on the following subreddits: r/architecture; r/designthought; r/ComputerEngineering; r/aerospaceengineering; r/ChemicalEngineering; r/SoftwareEngineering; r/MechanicalEngineering; r/userexperience; r/ElectricalEngineering; r/productdesign; and r/EngineeringStudents. The posts were monitored for 24 hours, during which time researchers interacted with participants via forum comments to answer questions and increase the visibility of the post.

4.3.5 Pilot Test, Factor Analysis, and Results for Validation Study

A pilot study was conducted on Reddit using Qualtrics. This study enabled testing of the Systems Design Thinking Scale on a new population, and a test for correlation between the scale and the analytical reasoning and divergent thinking tasks. 136 observations were recorded. Model results are reported in Table 4.1 and summarized below.


Table 4.1 Model results from confirmatory factor analysis (validation study)

SYSENG by    Estimate    S.E.     Est./S.E.    P-value
SE5          0.672       0.039    18.035       0.000
SE6          0.713       0.040    15.787       0.000
SE7          0.588       0.042    14.449       0.000
SE8          0.577       0.043    12.959       0.000

DESIGN by    Estimate    S.E.     Est./S.E.    P-value
DT5          0.460       0.047    8.967        0.000
DT6          0.575       0.044    11.594       0.000
DT7          0.466       0.044    11.559       0.000
DT9          0.450       0.041    14.355       0.000
DT10         0.379       0.047    9.271        0.000

χ²           df          P        RMSEA        CFI/TLI        SRMR
55.629       42          0.077    0.049        0.918/0.893    0.067
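The RMSEA in Table 4.1 can be reproduced from the reported χ², degrees of freedom, and sample size. A quick check, using the common formula RMSEA = sqrt(max(χ² − df, 0) / (df · (N − 1))) with N = 136, is sketched below; whether the software used N or N − 1 in the denominator is an assumption, but either way the result rounds to the reported 0.049.

```python
# Quick sanity check of the reported RMSEA from chi-square, df, and N.
from math import sqrt

def rmsea(chi2: float, df: int, n: int) -> float:
    """RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

print(round(rmsea(55.629, 42, 136), 3))  # -> 0.049, matching Table 4.1
```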

Findings suggest that the specified model is a good fit to the data for the new population. Factor loading coefficients in the validation study are similar to those in the previous studies in which items were selected. However, coefficients for DT7, DT9, and DT10 are slightly lower in the validation study than in Study 3, as indicated in Table 4.2. The p-values for these items are less than .001 in both models. Factors and task variables are independent and do not vary together, as indicated in Table 4.3.

Table 4.2 Comparison of factor loadings between validation study and Study 3 for items DT7, DT9, and DT10

Item    Study 3 Loading    Validation Study Loading    Δ
DT7     0.575              0.475                       0.100
DT9     0.585              0.442                       0.143
DT10    0.586              0.383                       0.203


Table 4.3 Covariances of factors and tasks

                                               Estimate    Std. Err    z-value    P(>|z|)    Std.lv    Std.all
SYSENG ~~ DESIGN                               -0.012      0.031       -0.396     0.692      -0.051    -0.051
Analytical Reasoning ~~ Divergent Thinking     1.142       0.893       1.278      0.201      1.142     0.115

Participants were given scores on both the analytical reasoning and divergent thinking tasks. The tasks were then treated as observed variables in the model and regressed on the systems engineering and design thinking factors. The mean score for the analytical reasoning task is 1.686 correct answers (out of 3 possible correct answers), with a standard deviation of 1.034. The mean score for the divergent thinking task is 13.730 unique answers, with a standard deviation of 10.376. The regression model results are reported in Table 4.4.

Table 4.4 Regression model results

                        Estimate    Std. Err    z-value    P(>|z|)    Std.lv    Std.all
Analytical Reasoning
  on SYSENG             0.237       0.188       1.262      0.207      0.130     0.127
  on DESIGN             -0.050      0.259       -0.193     0.847      -0.022    -0.021
Divergent Thinking
  on SYSENG             2.481       1.866       1.330      0.184      1.359     0.132
  on DESIGN             6.946       2.988       2.325      0.020      3.063     0.297
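The standardized coefficients (Std.all) in Table 4.4 can be cross-checked against the task standard deviations reported above: dividing the latent-standardized estimate (Std.lv) by the outcome's standard deviation approximately recovers Std.all. A sketch of this check is below; small discrepancies (e.g., 0.295 vs. the reported 0.297) are expected because the model-implied standard deviations differ slightly from the sample values.

```python
# Cross-check: Std.all ≈ Std.lv / SD(outcome), using sample SDs from the text.
rows = [
    # (outcome, predictor, std_lv, reported_std_all, sd_outcome)
    ("Divergent Thinking", "DESIGN", 3.063, 0.297, 10.376),
    ("Divergent Thinking", "SYSENG", 1.359, 0.132, 10.376),
    ("Analytical Reasoning", "SYSENG", 0.130, 0.127, 1.034),
]

for outcome, predictor, std_lv, reported, sd_y in rows:
    approx = std_lv / sd_y
    print(f"{outcome} on {predictor}: {approx:.3f} (reported {reported})")
```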

We expected scores on the systems engineering subscale to correlate with scores on the analytical reasoning task. We also expected scores on the design thinking subscale to correlate with scores on the divergent thinking tasks. We did not expect to see any correlation between scores on the systems engineering subscale and scores on the divergent thinking task; nor did we expect to see any correlation between scores on the design thinking subscale and scores on the analytical reasoning task.


Scores on the systems engineering subscale were not correlated with scores on either task. As expected, no correlation was observed between scores on the design thinking subscale and the analytical reasoning task. Some correlation (0.288, p < .05) between the design thinking subscale and the divergent thinking measure suggests that the Systems Design Thinking Scale may be useful for predicting certain behaviors.

Most participants were able to solve the analytical reasoning questions correctly. Most participants provided some answers to the divergent thinking task, although some indicated that they did not feel incentivized to complete this task. Others put in considerable effort, generating as many as 50 unique answers.

4.3.6 Findings and Lessons Learned

No significant correlation was observed between either subscale and performance on the analytical reasoning task. The mean score on this task was 1.686 correct answers out of 3 possible, indicating that most individuals got close to 2 correct. This task may have been too easy, making it difficult to predict performance using the systems engineering subscale. No correlation was expected between the analytical reasoning task and design thinking subscale scores.

Some correlation between the design thinking subscale and the divergent thinking measure was observed. Divergent thinking is a thought process used to generate creative ideas by exploring a large number of possible solutions. While creativity is a part of design thinking, design thinking includes many additional key features, such as empathy and human-centeredness. The low correlation observed in this analysis could be due to the fact that creativity is only one part of design thinking. A multi-dimensional measure that captures more than one feature of design thinking would be useful for fully validating the scale. Additionally, feedback from participants suggested that the divergent thinking tasks were boring and not engaging, and participants had no incentive to put in effort (participants were not compensated for completing the study). It is likely that task performance captured participants’ level of interest and engagement, rather than their divergent thinking abilities.

The lack of desired results is probably also due in part to the artificiality of the tasks and of the information provided to the participants. The tasks were one-dimensional and did not require any interaction, which does not adequately represent systems engineering and design tasks. The analytical reasoning and divergent thinking tasks were not complex, little information was given or required, and the skills for those tasks differ from the skills required for systems design. Similarly, the artificiality of the incentives and environment likely contributed to the lack of correlation. These tasks did not include incentive conditions similar to those of most systems engineering and design tasks: in this experimental setting, subjects were neither rewarded for high performance nor penalized for poor performance, whereas in a real systems engineering environment the cost of poor performance can be very high.

Construct validity seems to be the major issue with the pilot study. Operationalized measures need to proxy the measures of interest in validation studies. In this case, the operationalized measures—the analytical reasoning and divergent thinking tasks—do not seem to accurately proxy systems design thinking behavior. A systems problem would be a better choice for future work, and the problem should depend on the specific aspect of systems engineering being investigated, such as problem partitioning, concept generation, decision-making, etc. (Panchal and Szajnfarber, 2017).


4.4 Validation Opportunities

Like the work by Thibodeau, Frantz, and Stroink (2016), this work attempted to situate the Systems Design Thinking Scale in the landscape of existing psychological constructs and measurement instruments. An attempt was made to study systems design thinking relative to analytical reasoning and divergent thinking. Findings did not demonstrate any correlation between scores on the systems engineering subscale and the selected tasks. Findings suggested some correlation between scores on the design thinking subscale and the divergent thinking task. Additional findings suggest that a more interactive and engaging, systems-level task would be more appropriate for studying complex behaviors like systems design thinking. This section suggests additional opportunities for validation. The goal is to still be able to use Reddit as a way of recruiting a large number of participants quickly; an interactive and engaging task that can be distributed to a large number of people online is therefore needed.

Gamification is a good approach for doing this. According to Farber (2017), “all games are systems: the rules (or constraints), components, space, and goal interconnect. A game’s interconnected system is driven by player actions; as players learn a game, they also learn its system.” Games encourage players to think about relationships, not isolated events, facts, and skills (Goodwin and Franklin, 1994). Games like Rise of Nations, for example, require players to think about how each action taken might impact their future actions and the actions of the other players playing against them. Similarly, the city management series SimCity tasks players with balancing a complex urban system, and Plague, Inc. also requires systems thinking. These games can be used to teach systems thinking, and could also be used to measure it.

Similarly, “play is essential to the design thinking process,” and “a playful and exploratory attitude leads to more innovative, competitive, and breakthrough ideas” (Silvers, 2016). There is a great deal of academic research describing the value of play and its importance not just to childhood development, but to adult life. Play, games, and the principles that underlie them have vital roles in “building critical skills like systems thinking, creative problem solving, collaboration, empathy and innovation,” according to the National Institute of Play. Gamification could also be used to improve interest and effort on design thinking tasks.

4.5 Summary

In this chapter, a study for validating the Systems Design Thinking Scale was described. Scale scores were compared with performance on analytical reasoning and divergent thinking tasks. While no correlations were observed between subscales and performance on the analytical reasoning task, some correlation was observed between scores on the design thinking subscale and performance on the divergent thinking task. This suggests that the Systems Design Thinking Scale is useful for measuring certain behaviors, but additional work is required to find a suitable multi-dimensional measure for validating the scale fully.

Because the Systems Design Thinking Scale was only partially validated, some ideas for additional validation were offered. We suggest validating the Systems Design Thinking Scale through gamification, which will present more realistic environments, tasks, and incentives. Several games were identified for their potential usefulness in validating the scale, and performance metrics for these games were also identified. Correlation between these performance metrics and subscale scores will be useful for further validating the scale.


CHAPTER V

Conclusion

5.1 Summary of Dissertation

This dissertation explored systems engineering, systems thinking, design thinking, and their relationships using a mixed methods approach. Qualitative analysis was used to identify key assumptions, concepts, values, and practices for each framework, and to identify individual attitudes that reflect each of these frameworks. Quantitative analysis was used to develop an instrument for measuring these attitudes, called the Systems Design Thinking Scale.

The scale was developed in three major iterations. The first iteration captured systems thinking attitudes along three dimensions: technical, social, and organizational. This hypothesis was based on interview findings and then tested. While technical and social items grouped together in meaningful ways, organizational items did not appear to follow any clear pattern. We explain this finding by defining systems thinking itself as an organizational framework. Systems thinking is useful for organizing both technical and social system elements.

“Social systems thinking” in Study 1 demonstrated many similarities with design thinking, when the codebook in Chapter II was used to interpret findings. The relationship between systems thinking and design thinking was explored further in Study 2. Systems thinking and design thinking were not clearly distinguishable. However, interesting patterns were discovered in the systems thinking factor. Significant items appeared to reflect systems engineering more than systems thinking.


In Study 3, the relationship between systems engineering and design thinking was explored further. This model was a good fit to the data. While the scale functioned well theoretically, a goal of the work was to develop an instrument with practical use. We attempted to link scale scores with performance outcomes on analytical reasoning and divergent thinking tasks in Chapter IV. While some correlation was observed between design thinking subscale scores and performance on the divergent thinking task, additional work is required before the Systems Design Thinking Scale can be considered fully validated.

Several important lessons were learned throughout the research process. First, the application of psychometrics within the domain of systems engineering represented a unique challenge, as few precedents existed for identifying and measuring many relevant concepts. Consistent qualitative and quantitative data collection and analysis were required for interpreting findings and advancing hypotheses throughout the dissertation. Methodological lessons were related to the use of reverse-coding techniques, as well as survey design and user experience.

5.2 Contributions to Design Science

The main contribution of this work to the discipline of Design Science is the development and demonstration of an integrated framework of systems engineering, systems thinking, and design thinking that we call “systems design thinking.” Systems design thinking describes systems engineers who use a human-centered approach to complex systems design. It also describes product designers who have a natural inclination to follow systematic, analytical processes during design. The work presented in this dissertation suggests that the analytical and systematic attitudes of engineering and the holistic and human-centered attitudes of design can coexist, contrary to existing stereotypes that suggest an individual can only have one set of attitudes or the other. The use of psychometric methods to make this claim is a novel approach.

Another major contribution of this work is the Systems Design Thinking Scale, a 9-item questionnaire that measures systems engineering and design thinking attitudes in two subscales. In this work, we demonstrate that this model of systems design thinking is a good fit to two different samples. Additional work is required to demonstrate the usefulness of the scale in practice, as described in Section 5.3; however, it is believed that the scale could be useful for categorizing individuals based on their attitudes, with the ultimate goal of developing effective management strategies, for example, in systems design teams.

In this work, psychometric methods were used to solve engineering problems. It is difficult for organizations to identify systems design thinkers, despite their known value. By making theory and methods from psychology available for use in systems engineering organizations, we hope to offer an innovative method for identifying valuable individuals within an organization, allowing for the maximization of human potential.

5.3 Limitations and Opportunities for Future Work

Several limitations exist in the current study. First, throughout the research, items were dropped from the Systems Design Thinking Scale in order to converge on a solution. Revisiting certain concepts, items, and potential factor structures would be useful for building a more representative model. A good example of this occurs during the scale development process in Study 3. In the exploratory factor analysis, systems engineering items SE1 and SE2 group together in a single factor, both statistically and theoretically. Both items ask about preferences for requirements, a core concept in systems engineering. However, because conventions of structural equation modeling suggest that a factor should consist of a minimum of three items, these two items were dropped from the final model. The same happens for design thinking items DT2 and DT3, which ask about preferences for working with customers, reflecting the relationship-driven practice of design thinking. In future work, these items should be reintroduced into the final model, with additional test items included to explore these concepts further. It would also be worth investigating small differences in the phrasing of items and their impact on results. In the systems engineering subscale, the attitude items are heavily behavior focused; examples include “I use quantitative methods” and “I use mathematical modeling/analysis.” Few affective or cognitive items were included (e.g., “I prefer X”; “I think X”). For the behavioral items as written, frequency is another metric worth exploring (i.e., “I use quantitative methods frequently/infrequently” rather than agree/disagree).

More work is needed to validate the Systems Design Thinking Scale, and to determine whether the scale is actually measuring systems design thinking. The scale could be capturing differences between creative and analytical thinkers more generally, for example. There is evidence in neuroscience suggesting this difference is real and due to differences in neural activity that can be observed even when people are not working on a problem (Erickson et al., 2018).

A related scale is the Rational and Intuitive Decision Styles Scale (Epstein et al., 1996; Hamilton, Shih, and Mohammed, 2014). Our scale appears different, as it is context-rich and specifically focused on quantitative methods, data, and analysis, rather than a broad “rational” approach to problem solving (e.g., “I gather all necessary information” and “I thoroughly evaluate alternatives”). Also, while design thinking includes intuition, design thinking is not totally intuitive. Design thinking is also creative (it involves restructuring problems, trying new approaches, and finding inspiration in everyday life) and social (it involves empathy, codesign, and collaboration). An intuition item was included in pilot testing (“I make [design] decisions based on intuition”), but was dropped from the final scale due to low loadings. Another related scale is the Systems Thinking Scale by Randle and Stroink (2018). Additional work that compares the Systems Design Thinking Scale to this and other psychological constructs would be useful for understanding what makes systems design thinking unique.

It is anticipated that the Systems Design Thinking Scale will be useful for identifying individual strengths in practice after full validation is achieved. This could be useful for building teams with desired skill compositions. The scale could then also be used as a way of individualizing training and skill-building programs based on interests, preferences, and abilities. The games described in the validation plan could be used for both measurement and training purposes, to record and improve an individual’s performance over time.


Appendices

A. Systems Design Thinking Codebook……………………………………………………117

B. Semi-Structured Interview Questions……………………………………………………122


Appendix A: Systems Design Thinking Codebook

Name Files References

Analysts 3 4

Collaboration 4 5

Design Thinking 5 8

Awareness&Intuition 8 35

Creative&Innovative 5 14

Brainstorming 2 3

Mental blocks 1 1

Curiosity 3 18

Human-centered 10 40

Communication 10 96

Face to face 6 10

Formal&Informal 6 11

Hearing&listening 8 21


Meetings 9 23

Questions 7 38

Culture 8 23

Empathy&Understanding 10 42

Experience 8 39

People&Personalities 8 45

Conflict 4 5

Relationships 3 9

Team 2 3

Trust 5 14

Insight 2 3

Problem definition 5 9

Prototyping 0 0

Storytelling&analogy 8 13

Visual 1 1

Leadership 5 9

Leadership&techskills 1 1


Learning&Information 10 29

Mediate&Negotiate 8 21

Observation&Participation 2 8

Organization 4 7

Colocation 2 3

Proactive 2 5

Problem solving 5 11

Alternative methods and approaches 2 6

Ambiguous problems & solutions 0 0

Unambiguous problems & solutions 3 3

SE v DTorDiscipline 7 21

Systems Engineering 8 31

Analysis 6 15

Data 7 13

Tools&Methods 2 4


Trade studies 2 3

Coordinate 5 21

Cost 6 12

Delegate&Assign 6 14

Document 7 42

Elements&Subsystems 2 3

Goals&objectives 2 4

Interface 7 16

Managing 8 23

Methodical (process) 6 26

MinimizeReducePartition 6 15

Organize 2 4

Planning&Scheduling 6 29

Requirements 7 25

Risk 7 17

Robust 2 3

Strategy 7 14


Technical 8 18

Systems Thinking 7 17

AmbiguityUncertainty 8 17

Big picture 9 53

Complexity 10 30

Flexibility&adaptability 8 26

Holistic 7 12

Integrate&align 9 56

Interactions 10 56

Large-scale 8 18

Synergy 2 3

Training 6 8

Education 5 10

Mentorship 4 10

Translate 4 7

Understanding details 6 9


Appendix B: Semi-Structured Interview Questions

Please consider your first-hand experiences with designing and maintaining large-scale, complex engineered systems.

1. Please provide some background context for your experience. Where do you work? What is your formal title? How many years of work experience do you have? How many years have you had your current job?

2. Walk me through a typical workday. In a typical week, what are the top 3-5 tasks you spend the most time on?

3. Which of the following would you say has contributed most to your ability to complete these tasks/do your job effectively: formal education, on-the-job training, mentoring, or something else? Could you elaborate?

4. Please describe a specific project in which you participated in the design and management of a complex system. Can you draw the general architecture of the system? Using this sketch, can you tell me which groups, teams, or divisions within the organization work on which parts of the technical system? Is this consistent with the original [plan] for the project? How many engineers were involved in the project? How many engineers were in each group you drew in the sketch? Where was each group physically located?

5. How did you arrive at this partitioning (in the sketch)? Is there a typical partitioning common to your organization, or is each project broken down differently? Can you describe it? Is this partitioning reflected in the structure of your organization? Who (what title) is responsible for deciding how the work gets done/what the subsystems are? Can you describe how these decisions are made? How are the design teams selected? Are you directly involved in making these decisions?

5a. (If participant makes partitioning decisions): Generally speaking, what heuristics or processes did you use to decide how to distribute the work for this project? At what stage did you finalize this breakdown? What information did you have available to you at the time you were making decisions about how to break down the project? Did you use all of the information available to you when making decisions about how best to distribute the work?

5b. (If participant does not make partitioning decisions): How was this work breakdown structure presented or communicated to you? Who delivered it to you? Do you feel that this was the best way to break down the project/distribute the work? To communicate the project structure? Why or why not? Would you have broken things down differently? How? Why? Would you have communicated this information differently?

6. Going back to your sketch, what was your role in the project? Did your role change throughout the design of this system? Can you indicate which subsystem(s) you worked on, and their relationship to the other systems in the project? How would you characterize your work on these subsystems (design, interface management, analysis, something else)? Which of the groups you indicated did you feel you “belonged” to?

7. Which groups in the sketches often found your work relevant to theirs? Which groups in the sketches rarely found your work relevant? How did you know?

8. Which groups did you work with frequently (e.g., several times per week)? Infrequently (e.g., a few times per month)? How would you characterize your interactions with these groups, e.g., requesting information, or providing information? Would you characterize these interactions as primarily formal or informal? Did you routinely anticipate any requests for information from other groups or the need to provide information to other groups? How do you work this into your personal process?

9. Did subsystems communicate with one another consistently throughout the design process (early conceptual stages through to final design)? Which ones? Why? Did patterns of communication change during different design stages? How? Is there a single person or group with which all subsystems regularly communicated? What is the role of direct communication between groups as compared to communication with this central individual/group? Do these interactions tend to be through scheduled meetings or informal conversation? Something else?

10. What methods of communication does your organization use (email, meetings/face-to-face/documents/coffee breaks/other)? What role does each of these communication methods have? Do you find that the methods of communication you mentioned promote or inhibit your ability to do your job? In what way? How could these communication channels be improved? Is there a common technology or software that you use to keep track of documentation or host meetings? Do you feel that this technology or software is useful/effective? Why/why not? Is there something you would do differently, or another tool that you would use?

11. At what stage in the design process do systems engineers/integrators attempt to coordinate the design of the subsystems? What about the integration of the subsystems? Can you describe how this was done in this project? Who was involved? What information was available to [the systems engineer or systems integrator] in each of these cases? How was this information presented to the systems engineer? Was this information presented to design groups?

12. At any stage in system design or subsystem coordination, were you uncertain about either the reliability or the relevance of the information that you had available? At any stage, were you uncertain about the appropriateness of the decisions you made based on this information? How did you handle this situation?

13. Was there any stage during the system design process in which you found it difficult to process and integrate the information available? Describe precisely the nature of the situation.

14. Were you reminded of similar experiences/projects at any point during your work on this project? Were you at any point reminded of different experiences/projects? Were you at any point reminded of a project that succeeded? Were you at any point reminded of a project that failed? Did these experiences affect the decisions you made or actions that you took? How?

15. Do you think that you could develop a rule, based on your experience, which could assist another person to make the same design decisions successfully? Why/why not? What advice would you give to someone new to the role you had on this project?

Is there anything I might have missed? Do you have any other thoughts about systems design that you’d like to share?


Bibliography

Allison, J. T. (2008). Optimal Partitioning and Coordination Decisions in Decomposition-based Design Optimization (Doctoral dissertation). University of Michigan, Ann Arbor, USA. Retrieved from ResearchGate database.

Allport, G. (1935). "Attitudes," in A Handbook of Social Psychology, ed. C. Murchison. Worcester, MA: Clark University Press, 789–844.

Altus, S.S., Kroo, I.M., and Gage, P.J. (1996). A genetic algorithm for scheduling and decomposition of multidisciplinary design problems. Trans. ASME (118), 486-489.

Alyaqout, S. F., Peters, D. L., Papalambros, P. Y., and Ulsoy, A.G. (2011). Generalized coupling management in complex engineering systems optimization. Journal of Mechanical Design, 133(9), 1864-1869.

Amazon. (2018). Amazon Mechanical Turk. Retrieved April 3, 2019 from https://www.mturk.com/.

American Heritage Dictionary. (2016). “Framework.” Retrieved April 1, 2019 from https://ahdictionary.com/word/search.html?q=framework.

Anderson, N., Potočnik, K., & Zhou, J. (2014). Innovation and creativity in organizations: A state-of-the-science review, prospective commentary, and guiding framework. Journal of Management, 40(5), 1297–1333.

Ashby, W. R. (1965). An Introduction to Cybernetics. Chapman & Hall.

Avnet, M. S. (2016). A network-based analysis of team coordination and shared cognition in systems engineering. Systems Engineering, 19(5), 395–408.

Baines, T. S., Lightfoot, H. W., Evans, S., Neely, A., Greenough, R., Peppard, J., … Wilson, H. (2007). State-of-the-art in product-service systems. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 221(10), 1543–1552.

Banathy, B. H. (1967). The systems approach. The Modern Language Journal, 51(5), 281-289.


Barker, R. G. (1968). Ecological Psychology: Concepts and Methods for Studying the Environment of Human Behavior. Stanford, CA.: Stanford University Press.

Bateson, G. (1972). Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology. University of Chicago Press.

Beasley, R., and Partridge, R. (2011). The three T’s of systems engineering – trading, tailoring, and thinking. In Proceedings of the 21st Annual Symposium of the International Council on Systems Engineering (INCOSE). Denver, CO, USA. June 20-23, 2011.

Birman, I. (2018). Moderation in Different Communities on Reddit – A Qualitative Analysis Study (Undergraduate thesis). Georgia Institute of Technology, Atlanta, GA, USA. Retrieved from Georgia Institute of Technology SMARTech Repository.

Boulding, K. E. (1964). The Meaning of the Twentieth Century: the Great Transition. Harper & Row.

Braha, D. and Bar-Yam, Y. (2007). The statistical mechanics of complex product development: Empirical and analytical results. Management Science, 53(7), 1127-1145.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

Brooks, J. M., Carroll, J. S., & Beard, J. W. (2011). Dueling stakeholders and dual-hatted systems engineers: Engineering challenges, capabilities, and skills in government infrastructure technology projects. IEEE Transactions on Engineering Management, 58(3), 589–601.

Brown, T. (2008). Design thinking. Harvard Business Review, 86(6), 84–92, 141.

Brown, T. and Katz, B. (2011). Change by design. Journal of Product Innovation Management, 28(3), 381-383.

Browning, T. (2001). Applying the design structure matrix to system decomposition and integration problems: A review and new directions. IEEE Transactions on Engineering Management, 48(3), 292-306.

Buchanan, R. (1992). Wicked Problems in Design Thinking. Design Issues, 8(2), 5–21.

Buede, D.M. (2009). The Engineering Design of Systems: Models and Methods, Second Edition. John Wiley & Sons.

Cagan, J. (2007). The cognition of engineering design—An opportunity of impact. Cognitive Science, 31, 193–195.

Candocia, M. (2017). What Time Should You Post to Reddit (Part 2). Retrieved April 4, 2019, from https://maxcandocia.com/article/2017/Oct/12/what-time-should-you-post-to-reddit-pt-2/

Castelle, K. M., & Jaradat, R. M. (2016). Development of an instrument to assess capacity for systems thinking. Procedia Computer Science, 95, 80–86.

Cataldo, M., Herbsleb, J., & Carley, K. (2008). Socio-technical congruence: A framework for assessing the impact of technical and work dependencies on productivity. In Proceedings of the 2nd International Symposium on Empirical Software Engineering and Measurement, Kaiserslautern, Germany, Oct. 9-10, 2008.

Cavalieri, S., & Pezzotta, G. (2012). Product–service systems engineering: State of the art and research challenges. Computers in Industry, 63(4), 278–288.

Checkland, P.B. (1981). Systems Thinking, Systems Practice. John Wiley & Sons Ltd., West Sussex, England, UK.

Checkland, P.B. and Scholes, J. (1990). Soft Systems Methodology in Action. John Wiley & Sons, West Sussex, England, UK.

Chesson, D. (2017). Design Thinker Profile: Creating and Validating a Scale for Measuring Design Thinking Capabilities (Doctoral dissertation). Antioch University, Yellow Springs, OH.

Chiu, C. Y., Morris, M. W., Hong, Y. Y., & Menon, T. (2000). Motivated cultural cognition: the impact of implicit cultural theories on dispositional attribution varies as a function of need for closure. Journal of Personality and Social Psychology, 78(2), 247–259.

Choi, I., Koo, M., & Choi, J. A. (2007). Individual differences in analytic versus holistic thinking. Personality & Social Psychology Bulletin, 33(5), 691–705.

Clegg, S. and Bailey, J.R. (2008). International Encyclopedia of Organization Studies. Sage Publications.

Colfer, L.J. and Baldwin, C.Y. (2016). The mirroring hypothesis: Theory, evidence, and exceptions. Industrial and Corporate Change, 25(5), 709-738.

Collopy, A.X. (2019). Coordination Strategies and Individual Behavior in Complex Engineered Systems Design (Doctoral dissertation). University of Michigan, Ann Arbor, USA.

Conner, M. (2015). Systems Engineering and Integration. Retrieved from http://www.nasa.gov/centers/armstrong/capabilities/CodeR/flight/systems_engineering.html


Conway, M.E. (1968). How do committees invent? Datamation, 14(5), 28-31.

Crede, E., & Borrego, M. (2013). From ethnography to items: A mixed methods approach to developing a survey to examine graduate engineering student retention. Journal of Mixed Methods Research, 7(1), 62–80.

Cross, N. (1982). Designerly ways of knowing. Design Studies, 3(4), 221–227.

Cross, N. (2001). Designerly ways of knowing: Design discipline versus design science. Design Issues, 17(3), 49–55.

Cumming, M. (2002). Flexible and distributed coordination models for collaborative design. In Proceedings of the 20th eCAADe Conference, Warsaw, Poland.

Dam, R., & Siang, T. (2019). 5 Stages in the Design Thinking Process. Retrieved April 9, 2019, from https://www.interaction-design.org/literature/article/5-stages-in-the-design-thinking-process

Darrin, M. A. G., & Devereux, W. S. (2017). The agile manifesto, design thinking and systems engineering. In Proceedings of the 2017 Annual IEEE International Systems Conference (SysCon), 1–5.

Davidz, H. L., & Nightingale, D. J. (2008). Enabling systems thinking to accelerate the development of senior systems engineers. Systems Engineering, 11(1), 1–14.

Davidz, H. L., Nightingale, D. J., & Rhodes, D. H. (2004). Enablers, barriers, and precursors to systems thinking development: The urgent need for more information. In Proceedings of the 2004 Conference on Systems Engineering Research, Los Angeles, CA.

Davis, A. C., Leppanen, W., Mularczyk, K. P., Bedard, T., & Stroink, M. L. (2018). Systems thinkers express an elevated capacity for the allocentric components of cognitive and affective empathy: Systems thinking and empathy. Systems Research: The Official Journal of the International Federation for Systems Research, 35(2), 216–229.

Davis, A. C., & Stroink, M. L. (2016). The relationship between systems thinking and the new ecological paradigm: Systems thinking and environmental worldview. Systems Research: The Official Journal of the International Federation for Systems Research, 33(4), 575–586.

DeCuir-Gunby, J.T., Marshall, P.L., and McCulloch, A.W. (2011). Developing and using a codebook for the analysis of interview data: An example from a professional development research project. Field Methods, 23(2), 136-155.

de Graaf, R. S., & Loonen, M. L. A. (2018). Exploring team effectiveness in systems engineering construction projects: Explanations why some SE teams are more effective than others. Systems Research: The Official Journal of the International Federation for Systems Research, 35(6), 687–702.


de Souza, C. R., Quirk, S., Trainer, E., & Redmiles, D. F. (2007). Supporting collaborative software development through the visualization of socio-technical dependencies. In Proceedings of the 2007 International ACM Conference on Supporting Group Work. New York, NY, USA.

DeFranco, J. F., Neill, C. J., & Clariana, R. B. (2011). A cognitive collaborative model to improve performance in engineering teams-A study of team outcomes and mental model sharing. Systems Engineering, 14(3), 267–278.

Dinar, M., Shah, J. J., Cagan, J., Leifer, L., Linsey, J., Smith, S. M., & Hernandez, N. V. (2015). Empirical studies of designer thinking: Past, present, and future. Journal of Mechanical Design, 137(2), 021101.

Doob, L. W. (1947). The behavior of attitudes. Psychological Review, 54(3), 135–156.

Dosi, C., Rosati, F., and Vignoli, M. (2018). Measuring design thinking mindset. In Proceedings of the DESIGN 2018 15th International Design Conference. Dubrovnik, Croatia, May 21-24, 2018.

Dym, C. L., Agogino, A. M., Eris, O., Frey, D. D., & Leifer, L. J. (2005). Engineering design thinking, teaching, and learning. Journal of Engineering Education, 94(1), 103–120.

Eagly, A., & Chaiken, S. (1995). Attitude strength, attitude structure and resistance to change. In R. E. Petty & J. A. Krosnick (Eds.), Attitude Strength: Antecedents and Consequences (pp. 413–432). Mahwah, NJ: Erlbaum.

Elsbach, K. D., Barr, P. S., & Hargadon, A. B. (2005). Identifying situated cognition in organizations. Organization Science, 16(4), 422-433.

Eppinger, S. D., & Browning, T. R. (2012). Design Structure Matrix Methods and Applications. Cambridge, MA: MIT Press.

Epstein, S., Pacini, R., Denes-Raj, V., & Heier, H. (1996). Individual differences in intuitive-experiential and analytical-rational thinking styles. Journal of Personality and Social Psychology, 71(2), 390–405.

Erickson, B., Truelove-Hill, M., Oh, Y., Anderson, J., Zhang, F. Z., & Kounios, J. (2018). Resting-state brain oscillations predict trait-like cognitive styles. Neuropsychologia, 120, 1–8.

Farber, M. (2017). Games, hacking, and the 21st century skill of systems thinking. [Blog post]. Retrieved February 12, 2019 from http://info.thinkfun.com/stem-education/blog/games-hacking-and-the-21st-century-skill-of-systems-thinking.

Fleury, A., Stabile, H., and Carvalho, M. (2016). An overview of the literature on design thinking: Trends and contributions. International Journal of Engineering Education, 32(4), 1704-1718.

Forrester, J. (1961). Industrial Dynamics. Pegasus Communications, Waltham, MA.

Forrester, J. (1969). Urban Dynamics. Pegasus Communications, Waltham, MA.

Forrester, J. W. (1994). System dynamics, systems thinking, and soft OR. System Dynamics Review, 10(2-3), 245–256.

Forsberg, K., & Mooz, H. (1992). The relationship of systems engineering to the project cycle. Engineering Management Journal, 4(3), 36–43.

Frank, M. (2000). Engineering systems thinking and systems thinking. Systems Engineering, 3(3), 163–168.

Frank, M. (2002). Characteristics of engineering systems thinking: A 3D approach for curriculum content. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 32(3), 203–214.

Frank, M. (2006). Knowledge, abilities, cognitive characteristics and behavioral competences of engineers with high capacity for engineering systems thinking (CEST). Systems Engineering, 9(2), 91–103.

Frank, M. (2007). Towards a quantitative tool for assessing the capacity for engineering systems thinking. International Journal of Human Resources Development and Management, 7(3-4), 240–253.

Frank, M. (2012). Engineering systems thinking: Cognitive competencies of successful systems engineers. Procedia Computer Science, 8, 273–278.

Frank, M., & Kordova, S. (2009). Developing the capacity for engineering systems thinking (CEST) of senior engineering management students: Learning in a project-based learning (PBL) environment. In Proceedings of the 7th Annual Conference on Systems Engineering Research, Loughborough, England.

Frank, M., Sadeh, A., & Ashkenasi, S. (2011). The relationship among systems engineers’ capacity for engineering systems thinking, project types, and project success. Project Management Journal, 42(5), 31–41.

Frank, M., & Waks, S. (2001). Engineering systems thinking: A multifunctional definition. Systemic Practice and Action Research, 14(3).

Frank, P. (2006). People Manipulation: A Positive Approach (2nd ed.). New Delhi: Sterling Publishers Pvt. Ltd.

Frederick, S. (2005). Cognitive reflection and decision making. The Journal of Economic Perspectives, 19(4), 25–42.

Fulmer, R.M. and Keys, J. B. (1998). A conversation with Peter Senge: New developments in organizational learning. Organizational Dynamics, 27(2), 33-42.

Furnham, A., & Nederstrom, M. (2010). Ability, demographic and personality predictors of creativity. Personality and Individual Differences, 48(8), 957–961.

Furr, R. M., & Bacharach, V. R. (2013). Psychometrics: An Introduction (2nd ed.). SAGE Publications, Inc.

Galesic, M., & Bosnjak, M. (2009). Effects of questionnaire length on participation and indicators of response quality in a web survey. Public Opinion Quarterly, 73(2), 349-360.

Gasparini, A. A. (2015). Perspective and use of empathy in design thinking. In Proceedings of the Eighth International Conference on Advances in Computer-Human Interactions. Lisbon, Portugal.

Goode, H.H. and Machol, R.E. (1957). System Engineering: An Introduction to the Design of Large-scale Systems. McGraw-Hill.

Goodwin, J. S., & Franklin, S. G. (1994). The beer distribution game: Using simulation to teach systems thinking. Journal of Management Development, 13(8), 7–15.

Grace, M., Smith, D. L., Juhnke, L., & Dalton, S. (2017). Lean by Design: The Synthesis of Lean and Design Thinking. Boeing.

Greene, M. T., Gonzalez, R., Papalambros, P. Y., & McGowan, A. R. (2017). Design thinking vs. systems thinking for engineering design: What's the difference? In Proceedings of the 21st International Conference on Engineering Design, Vancouver, British Columbia, Canada, Aug. 21-25, 2017.

Greene, M. T., & Papalambros, P. Y. (2016). A cognitive framework for engineering systems thinking. In Proceedings of the 2016 Conference on Systems Engineering Research, Huntsville, AL, Mar. 22-24, 2016.

Griffin, C., & Bengry-Howell, A. (2017). Ethnography. In C. Willig & W. Rogers (Eds.), The SAGE Handbook of Qualitative Research in Psychology (pp. 38-54). London: SAGE Publications Ltd.

Guilford, J.P., Christensen, P.R., Merrifield, P.R., and Wilson, R.C. (1960). Alternative Uses Manual. Sheridan Supply Co.

Hajela, P., Bloebaum, C. L., & Sobieszczanski-Sobieski, J. (1990). Application of global sensitivity equations in multidisciplinary aircraft synthesis. Journal of Aircraft, 27(12), 1002–1010.

Hamilton, K., Shih, S.-I., & Mohammed, S. (2014). The development and validation of the rational and intuitive decision styles scale. Journal of Personality Assessment, 98(5), 523–535.

Hill, C. A., Dean, E., & Murphy, J. (2014). Social Media, Sociality, and Survey Research. Hoboken, NJ: John Wiley & Sons.

Hoffman, L. (2018). “Multiple-Group Measurement Invariance in CFA using Mplus.” Retrieved April 9, 2019, from http://www.lesahoffman.com/CLDP948/index.html

Holwell, S.E. (1997). Soft Systems Methodology and its Role in Information Systems (Doctoral dissertation). Lancaster University, Lancaster, Lancashire, England.

Hootsuite. (2018, March 5). The Best Time to Post on Instagram, Facebook, Twitter, and LinkedIn. Retrieved April 6, 2019, from https://blog.hootsuite.com/best-time-to-post-on-facebook-twitter-instagram/

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55.

Hutchison, N., Henry, D., & Pyster, A. (2016). Atlas: Understanding what makes systems engineers effective in the U.S. defense community. Systems Engineering, 19(6), 510–521.

Ilhan, A. O. (2017). Growth of undergraduate education in design in the United States, 1988–2012. Design Issues, 33(4), 17–29.

INCOSE. (2015). INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities (4th edition). Wiley.

IndustryWeek. (2018). Who Needs Systems Engineering? You Likely Do. Retrieved April 3, 2019, from https://www.industryweek.com/technology-and-iiot/who-needs-systems-engineering-you-likely-do.

Jaradat, R.M. (2014). An Instrument to Assess Individual Capacity for Engineering Systems Thinking (Doctoral dissertation). Old Dominion University, Norfolk, VA.

Jones, J. C. (1992). Design Methods. John Wiley & Sons.

Kannan, H., Bloebaum, C. L., and Mesmer, B. (2014). Incorporation of coupling strength models in decomposition strategies for value-based MDO. In Proceedings of the 15th Annual AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, Atlanta, Georgia, June 16-20, 2014.

Katz, D. (1960). The functional approach to the study of attitudes. Public Opinion Quarterly, 24(2), 163–204.

Kerbel, S. and Wainstein, A. (2013). Jung Typology Profiler for Workplace Assessment User Handbook. Humanmetrics.

Kietzmann, J. (2016). Crowdsourcing: A revised definition and introduction to new research. Business Horizons, 60(2), 151-153.

Klein, G., & Armstrong, A. A. (2005). Critical decision method. In Handbook of Human Factors and Ergonomics Methods (pp. 58-1–58-6). Boca Raton, FL: CRC Press.

Klein, G. A., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19(3), 462–472.

Klein, G., Moon, B., and Hoffman, R.R. (2006). Making sense of sensemaking. IEEE Intelligent Systems, 21(4), 70-73.

Kline, R. B. (2015). Principles and Practice of Structural Equation Modeling, Fourth Edition. The Guilford Press.

Koivu, S. (2015). Why Do People Share Online? Online Disinhibition Effect in the Context of the Virtual Community of Reddit (Master’s thesis). Aalto University, Finland.

Kolowich, L. (2019). The Best Time to Post on Instagram, Facebook, Twitter, LinkedIn, & Pinterest. Retrieved April 6, 2019, from https://blog.hubspot.com/marketing/best-times-post-pin-tweet-social-media-infographic

Kordova, S., Frank, M., & Nissel Miller, A. (2018). Systems thinking education—Seeing the forest through the trees. Systems, 6(3), 29.

Kuhn, T.S. (1962). The Structure of Scientific Revolutions. University of Chicago Press, Chicago.

Lake, J. G. (1992). Systems engineering re-energized: Impacts of the revised DoD acquisition process. Engineering Management Journal, 4(3), 8–14.

Lamb, C.T. and Rhodes, D.H. (2008). Systems thinking as an emergent team property: Ongoing research into the enablers and barriers to team-level systems thinking. In Proceedings of SysCon2008: IEEE International Systems Conference.

Lamb, C. T., & Rhodes, D. H. (2010). Collaborative systems thinking: Uncovering the rules. IEEE Aerospace and Electronic Systems Magazine, 25(11), 4–10.

Landers, R. N., & Behrend, T. S. (2015). An inconvenient truth: Arbitrary distinctions between organizational, Mechanical Turk, and other convenience samples. Industrial and Organizational Psychology, 8(2), 142–164.

Lasdon, L. S. (1970). Optimization Theory for Large Systems. London: Macmillan.

Laszlo, C. A., Levine, M. D., & Milsum, J. H. (1974). A general systems framework for social systems. Behavioral Science, 19(2), 79–92.

Liedtka, J., & MacLaren, E. (2018, November 7). How Children’s Health System of Texas Is Improving Care with Design Thinking. Harvard Business Review. Retrieved from https://hbr.org/2018/11/how-childrens-health-system-of-texas-is-improving-care-with-design-thinking

Luhmann, N. (1984). Soziale Systeme. Frankfurt/Main: Suhrkamp.

Maddux, W. W., & Yuki, M. (2006). The “ripple effect”: cultural differences in perceptions of the consequences of events. Personality & Social Psychology Bulletin, 32(5), 669–683.

Madni, A. M. (2015). Expanding stakeholder participation in upfront system engineering through storytelling in virtual worlds. Systems Engineering, 18(1), 16–27.

Manning, P. K. (2013). Sensemaking. In Encyclopedia of Management Theory. Sage Publications.

McDermott, T. and Freeman, D. (2016). Systems thinking in the systems engineering process: New methods and tools. In Systems Thinking: Foundation, Uses, and Challenges. Nova Publishing.

McGowan, A.-M. R. (2014). Interdisciplinary Interactions During R&D and Early Design of Large Engineered Systems (Doctoral dissertation). University of Michigan, Ann Arbor, MI.

McGowan, A.-M. R., Bakula, C., & Castner, R. S. (2017). Lessons learned from applying design thinking in a NASA rapid design study in aeronautics. In Proceedings of the 58th AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference.

McGowan, A.-M. R., Daly, S., Baker, W., Papalambros, P., & Seifert, C. (2013). A socio-technical perspective on interdisciplinary interactions during the development of complex engineered systems. Procedia Computer Science, 16, 1142–1151.

Meadows, D. H. (2008). Thinking in Systems: A Primer. White River Junction, VT: Chelsea Green Publishing.

Mulaik, S. A. (2009). Foundations of Factor Analysis. Boca Raton, FL: Chapman & Hall/CRC.

Muthén, L. K., & Muthén, B. O. (2017). Mplus User’s Guide (8th ed.). Los Angeles, CA: Muthén & Muthén. Retrieved from https://www.statmodel.com/download/usersguide/MplusUserGuideVer_8.pdf

Oliver, D.W., Kelliher, T.P., and Keegan, J. G. (1997). Engineering Complex Systems with Models and Objects. McGraw-Hill.

Panchal, J. H., & Szajnfarber, Z. (2017). Experiments in systems engineering and design research. Systems Engineering, 20(6), 529–541.

Panel on Undergraduate Engineering Education. (1986). Engineering Undergraduate Education. Washington, D.C., United States: National Academies Press.

Papalambros, P. Y. (2018). From design optimization to design science: An evolution in design thinking. In D. Marjanovic, M. Štorga, & S. Škec (Eds.), Design Research: The Sociotechnical Aspects of Quality, Creativity, and Innovation. Springer-Verlag, Berlin, 2019 (in press).

Papalambros, P.Y. and Wilde, D.J. (2017). Principles of Optimal Design. Cambridge University Press.

Parsons, T. (1951). The Social System. Abingdon, Oxon: Routledge.

Pearson, R. H., & Mundfrom, D. J. (2010). Recommended sample size for conducting exploratory factor analysis on dichotomous data. Journal of Modern Applied Statistical Methods, 9(2), 359–368.

Pennock, M. J., & Wade, J. P. (2015). The top 10 illusions of systems engineering: A research agenda. Procedia Computer Science, 44, 147–154.

Pidd, M. (1996). Tools for Thinking: Modelling in Management Science. Wiley, Chichester.

Plattner, H., Meinel, C., & Leifer, L. (Eds.). (2011). Design Thinking Research: Measuring Performance in Context. Springer, Berlin, Heidelberg.

Qualtrics Support. (2018). “Survey Methodology and Compliance Best Practices.” Retrieved 01 May 2019 from https://www.qualtrics.com/support/.

Radzicki, M. J., & Taylor, R. A. (2008). Origin of system dynamics: Jay W. Forrester and the history of system dynamics. In U.S. Department of Energy’s Introduction to System Dynamics.

Randle, J. M., & Stroink, M. L. (2018). The development and initial validation of the paradigm of systems thinking. Systems Research and Behavioral Science, 35(6), 645–657.

Rath, T. (2007). StrengthsFinder 2.0. Simon and Schuster.

Recker, S. A. (2002). Architecting an engineering documentation system. In Proceedings of the 2002 INCOSE International Symposium, Las Vegas, NV, July 28-Aug. 1, 2002.

Reddit. (2018). “Reddiquette.” Retrieved 01 May 2019 from https://www.reddit.com/wiki/reddiquette.

Rhodes, D. H., Lamb, C. T., & Nightingale, D. J. (2008). Empirical research on systems thinking and practice in the engineering enterprise. In Proceedings of the 2008 2nd Annual IEEE Systems Conference.

Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Design: Critical and Primary Sources.

Rittel, H. W. J., & Webber, M. M. (1974). Wicked problems. Man-Made Futures, 26(1), 272– 280.

Salado, A., Chowdhury, A. H., & Norton, A. (2019). Systems thinking and mathematical problem solving. School Science and Mathematics, 119(1), 49–58.

Self, J. A., & Baek, J. S. (2017). Interdisciplinarity in design education: Understanding the undergraduate student experience. International Journal of Technology and Design Education, 27(3), 459–480.

Senge, P. (1990). The Fifth Discipline: The Art and Practice of the Learning Organization. Doubleday, New York, NY, USA.

Shah, J. J., Millsap, R. E., Woodward, J., & Smith, S. M. (2012). Applied tests of design skills— Part 1: divergent thinking. Journal of Mechanical Design, 134(2).

Shatz, I. (2017). Fast, free, and targeted: Reddit as a source for recruiting participants online. Social Science Computer Review, 35(4), 537–549.

Shea, G. (2017). NASA Systems Engineering Handbook Revision 2. Retrieved from http://www.nasa.gov/connect/ebooks/nasa-systems-engineering-handbook.

Silvers, D. M. (2016, October 31). Why play is essential to the design thinking process. Retrieved April 3, 2019, from https://designthinkingformuseums.net/2016/10/31/why-play-is-essential-to-the-design-thinking-process/

Simon, H. A. (1996). The Sciences of the Artificial. MIT Press.

Singer, P., Flöck, F., Meinhart, C., Zeitfogel, E., & Strohmaier, M. (2014). Evolution of Reddit: From the front page of the internet to a self-referential community? In Proceedings of the 23rd International Conference on World Wide Web, Seoul, South Korea.

Sobieszczanski-Sobieski, J. (1990). Sensitivity of complex, internally coupled systems. AIAA Journal, 28(1), 153-160.

Souza, J., & Barnhöfer, U. (2015). Design thinking: It’s the flare that adds another dimension to systems engineering. Insight, 18(3), 25–27.

Spacey, J. (2016). “Design Thinking vs Systems Thinking”. Retrieved April 3, 2019, from https://simplicable.com/new/design-thinking-vs-systems-thinking

Sterman, J. D. (2000). Business Dynamics: Systems Thinking and Modeling for a Complex World. Boston: Irwin/McGraw-Hill.

Steward, D. V. (1981). Systems Analysis and Management: Structure, Strategy, Design. Petrocelli Books, San Francisco, CA.

Thibodeau, P. H., Frantz, C. M., & Stroink, M. L. (2016). Situating a measure of systems thinking in a landscape of psychological constructs. Systems Research and Behavioral Science, 33(6), 753–769.

Tjendra, J. (2018, April 25). “Systems Thinking is the New Design Thinking.” Retrieved April 3, 2019, from http://businessinnovation.design/blog/2018/4/25/systems-thinking-is-the-new-design-thinking.

Toh, C. A., & Miller, S. R. (2016). Creativity in design teams: the influence of personality traits and risk attitudes on creative concept selection. Research in Engineering Design, 27(1), 73–89.

Tonetto, L. M., & Tamminen, P. (2015). Understanding the role of intuition in decision-making when designing for experiences: contributions from cognitive psychology. Theoretical Issues in Ergonomics Science, 16(6), 631–642.

Tosserams, S., Hofkamp, A.T., Etman, L.F.P., and Rooda, J.E. (2010). A specification language for problem partitioning in decomposition-based design optimization. Structural and Multidisciplinary Optimization, 42(1), 707-723.

Tribus, M. (2005). Some remarks on the improvement of engineering education. Journal of Science Education and Technology, 14(1), 1.

Vendetti, M. S., Wu, A., & Holyoak, K. J. (2014). Far-out thinking: Generating solutions to distant analogies promotes relational thinking. Psychological Science, 25(4), 928–933.

Vicente, K. J. (1999). Cognitive Work Analysis: Toward Safe, Productive, and Healthy Computer-Based Work. CRC Press.

von Bertalanffy, L. (1940). An outline of general systems theory. British Journal for the Philosophy of Science, 134–165.

Watkins, D., & Gioia, D. (2015). Mixed Methods Research. USA: Oxford University Press.

Weick, K. E. (1979). The Social Psychology of Organizing. New York, NY: Random House.

Weick, K. E. (1995). Sensemaking in Organizations. Thousand Oaks, CA: Sage.

Wiener, N. (1948). Cybernetics: Or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press.

Williams, C. and Derro, M. (2008). NASA systems engineering behavior study. NASA Office of the Chief Engineer, Washington, DC, USA.

Worthington, R., & Whittaker, T. (2006). Scale development research: A content analysis and recommendations for best practices. The Counseling Psychologist, 34, 806-838.

Yassine, A., & Braha, D. (2003). Complex concurrent engineering and the design structure matrix approach. Concurrent Engineering: Research and Applications, 11(3), 165–177.
