Towards Zero Defects in the Aerospace Industry through Statistical Process Control: A Case Study at GKN Aerospace Engine Systems

Hugo Andrén

Industrial and Management Engineering, master's level 2020

Luleå University of Technology, Department of Business Administration, Technology and Social Sciences

Acknowledgements

Among those who have helped me along the way, making it possible for me to conduct this master’s thesis and complete my degree in engineering, several people have my gratitude. I owe special thanks and appreciation to Sören Knuts, my supervisor and representative at GKN Aerospace Trollhättan. Without Sören’s tireless interest and efforts in pointing me in the right direction, this would not have been possible. Second, I would like to thank Professor Erik Vanhatalo at Luleå University of Technology for supervising my work and providing invaluable feedback through an academic lens. I would also like to thank my friend and opponent Albert Stenbäck Juhrich for his feedback throughout the duration of the work. Together, Sören, Erik, and Albert have helped me enhance both the relevance and quality of my thesis. Lastly, I would like to thank all the people who participated in the interviews, enabling me to collect the qualitative data, which I see as a cornerstone of the thesis, and of course GKN Aerospace Trollhättan for letting me conduct my thesis despite the outbreak of the COVID-19 pandemic.

Abstract

With the ongoing transformation of modern systems in an Industry 4.0 environment, industrial actors may see great improvements in quality towards a state of near zero defects. For the aerospace industry, where increased quality and reduced risk are strongly related, new technologies may be used in manufacturing to meet the increasing demands on products. The safety requirements, as well as the manufacturing complexity of products and processes, make the collected measurement data an integral asset for enterprises within the aerospace industry. Collected data may be analysed using statistical tools and methods to improve process capability and, by extension, product quality. Communicating the need for zero defects, original equipment manufacturers demand increased capability from product and component manufacturers. Hence, zero defects is typically operationalised as exhibiting a process capability of Cpk = 2.0. In response to this challenge, GKN Aerospace needs to raise the traditional process capability target of Cpk = 1.33.

By employing an exploratory research strategy with a deductive approach, the thesis combines theoretical knowledge from the literature with empirical findings in a thematic analysis. The thematic analysis was conducted by employing the six phases suggested by Braun and Clarke (2006) and resulted in the identification of 50 codes from a total of 459 data extracts. Based on the empirical interview data, a framework for how zero defects is interpreted at GKN Aerospace was developed, which describes zero defects as a cycle. Taking into account that zero defects is operationalised through Cpk = 2.0, the cycle consists of six phases that start with a vision and are completed by delivering a true and reliable Cpk of 2.0. In addition, the codes from the thematic analysis were collated into a thematic mind map, focusing on key aspects of working with statistical process control (SPC) to support zero defects. Two main themes are presented in the mind map: statistical approach to improvement work, highlighting necessary aspects of statistical process control and measurability; and removing barriers to improvement, highlighting fundamental organisational barriers that impede proactive quality improvement.

To support the findings and give a practical example of how process data may be presented and analysed using tools and methods within statistical process control, an SPC study was conducted on a set of data. In the SPC study, the construction and analysis of individuals Shewhart control charts and moving range charts were described in detail. These procedures provide better insights about process behaviour through statistical thinking and thus better knowledge on how to approach more proactive process improvements.

KEYWORDS: Zero Defects, Statistical Process Control, Quality Management, Industry 4.0, Aerospace Industry.

Abbreviations

Abbreviation Meaning

AI Artificial Intelligence

CPS Cyber Physical Systems

GAS GKN Aerospace Sweden, Trollhättan

IoT Internet of Things

KC Key Characteristic

KPI Key Performance Indicator

SPC Statistical Process Control

ZD Zero Defects

ZDM Zero Defects Manufacturing

QSYS Internal database of process data at GAS

Q3 Suspected or confirmed internal non-conformance

Contents

1 Introduction ...... 1 1.1 Background ...... 1 1.2 GKN Aerospace Engine Systems ...... 3 1.3 Problem Discussion ...... 3 1.4 Aim...... 5 1.5 Delimitations ...... 5

2 Literature Overview ...... 6 2.1 Industry 4.0 ...... 6 2.2 A Review on Quality within Industry ...... 6 2.3 Statistical Process Control ...... 8 2.4 Zero Defects ...... 11 2.5 Organisational Implications ...... 14

3 Methodology ...... 16 3.1 Research Approach ...... 16 3.2 Literature Overview ...... 17 3.3 Interviews ...... 18 3.4 Thematic Analysis ...... 19 3.5 SPC Study ...... 22 3.6 Research Quality ...... 23

4 Results and Analysis ...... 25 4.1 Zero Defects at GKN Aerospace ...... 25 4.2 Thematic Analysis ...... 33

5 SPC Study ...... 42 5.1 Current Process Control ...... 42 5.2 Distribution Fitting ...... 43 5.3 Control Charts and Analysis ...... 45 5.4 Capability Study ...... 48

6 Findings and Recommendations ...... 49

7 Discussion ...... 51

8 References ...... 52

Appendix A Interview Guide ...... i

Appendix B Complete List of Codes ...... iv

1 Introduction

The introduction provides a background on zero defects and quality management as well as how Industry 4.0 is transforming modern manufacturing systems and affecting quality management practices. What follows is a short introduction to GKN Aerospace Engine Systems, a problem discussion, and finally the aim of the thesis, the research questions, and the delimitations.

1.1 Background

Quality is of critical concern for organisations in order to meet customer expectations and the threat of competitors’ products (Dogan & Gurcan, 2018). Foidl and Felderer (2015) argue that increasing customer requirements and competitiveness make quality management an essential prerequisite and key to sustained economic performance. For original equipment manufacturers (OEMs) supplying products that are subject to meticulous safety requirements, the concern for product quality is amplified. The aerospace industry is an example where the safety of products is of great concern. Within the aerospace industry, the quality of a product is measured by data on multiple geometric specifications, and the quality of the manufacturing processes is characterised by process data sets (Wang, 2013). Quality can be referred to as the ability to reduce variation to the point where customer expectations are met or even exceeded. Within industrial manufacturing, however, defects are a fairly tangible way to measure quality. Defects are described by Montgomery (2012) as ”nonconformities that are serious enough to significantly affect the safe and effective use of the product” (p. 9). Being affected by facilities, equipment, and manufacturing processes, the quality of a product is subject to several sources that have the potential to cause errors that generate defects (Wang, 2013). To supply an organisation with the right prerequisites for enhancing quality monitoring and optimisation, Wang (2013) suggests shifting the focus from product data to process data.

Zero defects (ZD) is a concept practised within manufacturing for the purpose of minimising defects in a process by doing things right from the very beginning, ultimately aiming for zero defective products (Wang, 2013). According to Tatipala et al. (2018), it is the increasing demands on produced parts that have led to the escalating importance of zero defects in manufacturing. Although it can be traced back to the 1960s (Montgomery, 2012), it was during the 1990s that ZD saw wide efforts of implementation, when automotive companies wanted to cut costs by reducing quality inspections while simultaneously increasing demands on quality from suppliers. Wang (2013) addresses the necessity of a system for zero defect manufacturing (ZDM) to prevent failures and increase reliability and safety in manufacturing industries. Tatipala et al. (2018) suggest that part of such a system is the ability to control product and process parameters with the use of connected manufacturing technologies and control systems that handle machine and other process data. It is, however, implied that such systems require the ability to collect and handle large amounts of data supported by advanced and reliable internet, IT, and other technologies. These technologies have only recently become advanced and reliable enough to support the required scalability within industrial manufacturing systems.

In 2014, Lasi, Fettke, Kemper, Feld, and Hoffmann anticipated that the industrial community was soon to experience a new paradigm. It was expected that ”smart” machinery and products would emerge as a result of recent advances in digitalisation within industries combined with internet technologies (Lasi et al., 2014). The German government started preparing for a fourth industrial revolution and coined the term ”Industrie 4.0”. They expected that, within the near future, industrial actors could have the economic benefits of mass production while producing single products in a batch size of one. Producing with a high degree of customisation was to be made feasible by manufacturing systems where products control their own manufacturing (Foidl & Felderer, 2015; Lasi et al., 2014; Rüßmann et al., 2015). Lasi et al. (2014) described Industry 4.0 as a future project defined by two directions of development: application-pull

triggered by social, political, and economic changes, and technology-push. The push for new technology can take different forms, one of them being a result of digitalisation and networking. According to Lasi et al. (2014), the increasing digitalisation of manufacturing systems provides increasing amounts of data that may be used to control and analyse industrial processes. However, analysing massive sets of raw data is a major challenge for ZDM, as stressed by Wang (2013). Sall (2018) addresses the fact that many manufacturing processes are so complex that there are simply too many steps and responses to measure, underlining the importance of finding out which processes need monitoring and which ones to leave out. In doing so, Sall (2018) suggests that statistical process control (SPC) may be used to approach the problem through three areas:

• Process Change: concerning methods for examining how engineering changes result in different process variable changes.
• Process Stability: concerning methods for identifying unstable processes and how these shift over time. Quality engineers need to find patterns across multiple process variables such as simultaneous shifts within or between systems.
• Process Capability: concerning ways of determining which processes are meeting customer specifications.

Resource-intensive compensation strategies that focus on rectifying quality post-production have led both practitioners and researchers to study methodologies that simultaneously increase product quality and process capability (Magnanini et al., 2019). Being a structured and scientific collection of improvement tools that provide a statistical approach to process improvements, SPC has traditionally been considered a good option for identifying nonconforming products and process deviations through the identification of special (or assignable) causes of variability for a given quality characteristic (Eleftheriadis & Myklebust, 2016). Quality or key characteristics (KCs) are variables or attributes whose variation significantly affects product fit, form, function, performance, or producibility (SAE International, 2018a).

A common method for achieving higher quality is the use of sigma limits. The Greek letter sigma (σ) is often used as a statistical term for measuring how much a process deviates from its target, described in multiples of standard deviations, where six standard deviations result in reducing the defects in manufacturing to 3.4 parts per million (ppm) (Dogan & Gurcan, 2018). Products consisting of many components tend to provide many opportunities for defects or other failures along the different process stages (Arsuaga Berrueta et al., 2012; Magnanini et al., 2019), not least as a result of variation (Montgomery, 2012). In order to meet strict demands on such products, the six sigma concept was developed in the 1980s by Motorola (Montgomery, 2012). The six sigma concept seeks to reduce the variability of a given process by having statistically computed control limits located six standard deviations from the mean. According to Gultom and Wibisono (2019), six sigma seeks to reduce process variation and increase process capability by aiming for zero defects. However, having limitations in dealing with complex data sets, six sigma approaches are not sufficient to reach ZDM (Wang, 2013).
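To make the arithmetic behind the 3.4 ppm figure concrete, the following Python sketch (an illustration added here, not part of the original thesis material) computes the expected defect rate for a normally distributed characteristic at a given sigma level. The 3.4 ppm figure assumes the conventional 1.5-sigma shift of the process mean; a perfectly centred six sigma process would yield roughly 0.002 ppm.

```python
from scipy.stats import norm

def defect_rate_ppm(sigma_level: float, shift: float = 0.0) -> float:
    """Expected defects per million for a normally distributed characteristic
    with specification limits at +/- sigma_level standard deviations,
    after the process mean has shifted by `shift` standard deviations."""
    upper_tail = norm.sf(sigma_level - shift)    # fraction beyond the upper limit
    lower_tail = norm.cdf(-sigma_level - shift)  # fraction beyond the lower limit
    return (upper_tail + lower_tail) * 1e6

print(defect_rate_ppm(6.0))       # ~0.002 ppm for a perfectly centred process
print(defect_rate_ppm(6.0, 1.5))  # ~3.4 ppm with the conventional 1.5-sigma shift
```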

Within the aerospace industry, quality is often defined as conformance to specifications (Rolls Royce, 2016). It is thus customary to describe zero defects in terms of the capability of a given process (AESQ, 2020), implying that a capable process is bound to produce non-defective components or products. Finding frequent application, the process capability ratio Cp can be used to evaluate the potential capability of process variables but does not supply any information about why a process is not meeting specifications (Montgomery, 2012; Sall, 2018). To measure the actual capability, Cpk can be used and compared to Cp to consider the centring of the process (Montgomery, 2012). In addition, Sall (2018) suggests that Ppk describes process performance (i.e. how well it is doing) and that Cpk describes the capability of a process (i.e.

how well it could perform if process drift issues are fixed). The difference between these measures is whether a long-run classical estimate of the standard deviation is used (Ppk) or a short-run local estimate (Cpk) (Sall, 2018). On the contrary, Montgomery (2012) points out that Pp and Ppk were developed for describing processes that are not in statistical control, resulting in these indices essentially telling you nothing. Montgomery (2012) further claims that all process capability ratios (including Cp and Cpk) should be used with great care, stating that they describe complex circumstances in an oversimplified way. Thus, producing products that are altogether free from defects may not be possible solely by utilising tools and methods within SPC. Therefore, in order to make certain that nonconforming products are identified so that defects are not delivered to customers, inspection must be present.

1.2 GKN Aerospace Engine Systems

As part of GKN Aerospace, Engine Systems is a division manufacturing aircraft engines, engine components, space rockets, and stationary gas turbines. In addition, they provide maintenance work on aircraft engines for military use (GKN, 2020a). GKN Aerospace is an important actor on the market, responsible for the production and maintenance of engines for JAS 39 Gripen. Having specialised in specific products, GKN Aerospace has components present in over 90% of new larger commercial aircraft (GKN, 2020a). GKN Aerospace Sweden (GAS) has a manufacturing plant with workshops and national offices located in Trollhättan. Since the time for developing and manufacturing products and components is limited at GAS, the time for analysing process capability is also limited within the organisation. GAS has started the journey towards an Industry 4.0 environment, having equipped new manufacturing processes with sensors for controlling and measuring process outcomes (GKN, 2020b). Compared to processes requiring more manual operations, as well as semi-automated processes, these new sensor-equipped processes have resulted in increasing amounts of data being collected.

1.3 Problem Discussion

Ideally, any manufacturing enterprise would prefer to produce products completely free from defects. Generating no defects whatsoever is, however, rarely the case, since all systems and processes are inevitably subject to some variation (Montgomery, 2012). Employing six sigma limits and having 3.4 defective products for every one million produced is in many regards considered a good performance. However, depending on the number of components in a product or the number of process steps, one may still end up producing several defective products (Eleftheriadis & Myklebust, 2016; Montgomery, 2012). It is also necessary to point out that some industries simply cannot allow any defects at all due to the final products being used in high-performing systems or in situations where defective components imply catastrophic outcomes such as putting lives at stake. The aerospace industry is such an industry, where the quality of the end product is heavily dependent on inspection of products and components. From a quality engineering perspective, controlling and monitoring processes to the point where zero defects is achieved is therefore of interest. Yet it seems insufficient to employ the traditional six sigma limits. Although both Dogan and Gurcan (2018) and Gultom and Wibisono (2019) suggest that the aim of six sigma limits is to move towards zero defects, Montgomery (2012) simply points out that the goal is a 3.4 ppm defect level. Attaining a Cpk equal to 2.0 is often viewed as having zero defects (Martens, 2011), but a Cpk = 2.0 implies the employment of six sigma limits, which only delivers a defect level of 3.4 per million units produced (Dogan & Gurcan, 2018; Martens, 2011). It follows that one cannot expect a guarantee of zero defects solely by employing six sigma limits or achieving Cpk = 2.0. From a quality perspective, it is therefore necessary to establish how zero defects should be quantified, interpreted, and pursued in an organisation such as GAS.

Since no clear or quantifiable definition has been found in the literature, ZD may be considered nothing more nor less than what its name implies, i.e. 100% non-defective. Thus, ZD may be quantified as a defect rate of 0 ppm. Even though it is not synonymous with ZD, operationalising it through Cpk = 2.0 is considered a reasonable trade-off within the industry, where it is expected to give a maximum of 3.4 defective products for every one million produced. Given that the goal of ZD in reality is operationalised through Cpk, GAS needs to follow strict product specifications to achieve the required quality. The capability index is enhanced by eliminating excess variability and maintaining a random fluctuation around the mean of each process (Montgomery, 2012). Capability studies are a well-established concept of SPC methodology. Since the only terms capable of describing variability are statistical, Montgomery (2012) claims that statistical methods are key to improving quality. Foidl and Felderer (2015) stressed that several elements of Industry 4.0 may provide great benefits for quality management. It was, however, discovered by the authors that one of the main challenges following the technological advance is the immense amount of information that needs filtering and processing (Foidl & Felderer, 2015). As a result of increasing data collection within manufacturing industries, larger amounts and more types of measurements follow, which can be used to evaluate processes. According to Sall (2018), such amounts quickly become too much to handle from a quality perspective.

In combination, considering customer and safety requirements together with the ongoing transformation, it is of interest to study how customer demands on ZD are interpreted within GKN Aerospace Engine Systems. Traditionally, ZD has been described in terms of a Cpk equal to 2.0, but from an organisational perspective it is necessary to evaluate whether other SPC measures, such as Ppk, Pp, Cp, and the stability index, can be used to quantify and describe zero defects. For GKN Aerospace, having processes that are mainly operated manually, semi-automated processes, as well as sensor-equipped automated processes, an efficient way of collecting and monitoring process data continuously is paramount for meeting the customer demand of ZD and preparing for a future Industry 4.0 environment. Figure 1 illustrates a framework in which different actors, systems, concepts, and methods may describe and/or interpret the concept of zero defects in different ways.

Figure 1: Representation of different actors/systems/concepts describing ZD.

In the literature, ZD has been described in various ways over time. Present literature has mainly focused on managerial aspects to achieve the outcome. Emerging literature predicts that ZD might be fully realisable with the right technology in an Industry 4.0 environment, where big data and new technologies such as artificial intelligence (AI), machine learning, and cyber-physical systems (CPS) can enable smarter manufacturing by interconnecting the physical world with digital systems. Another aspect is the customers, continuously demanding better and better products, often with differing demands. GKN Aerospace must try to meet these demands by providing proper training, management, and a culture where work is done correctly, to make sure no defective products are delivered to customers. Focusing on defects, SPC plays a central part in explaining what defects are in terms of variation and thus what ZD actually means in statistical terms.

1.4 Aim

The aim of the master’s thesis is to investigate how methods for statistical process control (SPC) can support zero defects within manufacturing processes at GKN Aerospace Trollhättan. To support the above aim, the following research questions have been formulated:

• RQ1: How is ZD interpreted at GKN Aerospace Sweden?

• RQ2: How can tools within SPC support the work towards ZD?

• RQ3: How should tools and methods for SPC be used to approach ZD in complex/modern manufacturing systems?

1.5 Delimitations

There are, according to Wang (2013), two reasons for pursuing products and processes of zero defects: safety and customer expectations. This thesis will focus on the latter, looking at how customer requirements on zero defects can be secured using tools and methods within SPC. The main focus of the thesis will not concern the details of Industry 4.0. However, since GAS and the industrial community at large are preparing to operate in such an environment within the near future, this thesis and the recommendations made herein have been developed with Industry 4.0 in consideration.

2 Literature Overview

The following chapter is a literature overview treating different areas that help describe the topic of the master’s thesis. The literature overview will be used for analysing empirical data later in this report. The main sections of the chapter concern Industry 4.0, a brief review on quality management within industry, statistical process control, zero defects, and managerial implications of improving quality towards ZD.

2.1 Industry 4.0

With roots stemming from a project undertaken by the Federal Ministry of Education and Research in Germany, Industry 4.0 is a term denoting what has commonly become known as the fourth industrial revolution (Lasi et al., 2014). This new paradigm within the industrial community is going to be powered by technological advances, mainly within digitalisation: a broader digitalisation of manufacturing industries that, together with internet technologies, allows factories, machinery, processes, and products to communicate with one another. According to Lasi et al. (2014), industrial actors, if capable of setting up such systems, can expect the economic benefits of mass production even though they maintain a high degree of product and process customisation in small batches (Foidl & Felderer, 2015; Lasi et al., 2014; Rüßmann et al., 2015). Lasi et al. (2014) defined Industry 4.0 by two directions of development: application-pull triggered by social, political, and economic changes, and technology-push. The push for new technology can take different forms, one of the most important being a result of digitalisation and networking. According to Lasi et al. (2014), the increasing digitalisation of manufacturing systems provides increasing amounts of data that can be used in efforts to control and analyse industrial processes.

Ferreira et al. (2018) suggest that Industry 4.0, considered an advanced manufacturing initiative, is focused on achieving zero defective products throughout manufacturing processes, implying that ZDM is feasible once manufacturing industries have developed environments of advanced and interconnected technologies. In such environments, enabled by new technologies, devices, innovations, and contexts, humans can act on and drive the production of data. The result, according to Ferreira et al. (2018), is a quick and radical change to the perspective and the capacity we have to deal with both known and unknown data. In order to do so, the collection of data must be planned to meet requirements and enable engineers to analyse and control the variation, stability, and capability of processes (SAE International, 2018a). The capacity to better and more efficiently discover and use data should be viewed as a key add-on to emerging decision systems, according to Ferreira et al. (2018).

Aircraft, machine tools, and other high-technology industries depend largely on manufacturing processes that are safe, efficient, and adaptive. The development of products and processes within these industries is therefore vital, especially in the aerospace industry, considering its need for cutting-edge technology and its ability to support growth in Europe (Eleftheriadis & Myklebust, 2016). Further, production within the aerospace industry must continually improve to ensure reliable and safe products that meet or exceed customer and regulatory requirements (SAE International, 2018a).

2.2 A Review on Quality within Industry

The quality of a product or service may be defined in different ways using different perspectives (Bergman & Klefsjö, 2012). A traditional definition of quality, based on the somewhat narrow viewpoint that products and services need to meet user requirements, is fitness for use as suggested by Juran (1999). Montgomery (2012) suggests that there are two aspects of that definition: quality of design and quality of conformance.

Montgomery (2012) stresses that the definition suggested by Juran (1999) has become more associated with the latter, leading to quality work solely focusing on conformance to specifications and thus reduced customer focus. Instead, Montgomery (2012) prefers to define quality as inversely proportional to variability. The preference is based on reduced variability being directly translated into cost reduction as a result of less waste, rework, effort, and time. Another, and perhaps broader, definition of quality is provided by Bergman and Klefsjö (2012), suggesting that the quality of a product is its ability to satisfy and preferably exceed customers’ needs and expectations.

It is also of interest to consider how what quality encompasses has changed over time, as suggested by Radziwill (2018), who proposes four phases of progression:

1. Quality as inspection: seeking to sort out nonconforming products once they have been produced.

2. Quality as design: preventive efforts inspired by W. Edwards Deming to consider quality in the design of products and services.

3. Quality as empowerment: continuous improvement through empowering individuals, creating a concern for quality as everyone’s responsibility by applying approaches such as total quality management (TQM) and six sigma.

4. Quality as discovery: where Industry 4.0 enables adaptive and intelligent environments.

In the fourth phase, Radziwill (2018) suggests that quality depends on the ability to find and combine new sources of data as well as the effectiveness of discovering root causes and new insights about products, organisations, and ourselves.

We can see that the above definitions of quality may fit into one or more of these progressions. It is also apparent that the concern for quality has broadened gradually from focusing on product inspection, process design, and organisation and people, to discovering how situational factors may be utilised in favour of quality improvement. Quality could thus take the shape of both incremental and breakthrough innovations, as indicated by Box and Woodall (2012), who propose that quality and innovation have developed strong similarities from an engineering viewpoint. For example, Box and Woodall (2012) suggest six sigma to be an approach that, from the outset, was primarily focused on reducing defects. Since then, it has evolved, focusing more on process efficiency, into what is now more of a methodology for developing new products and services for new markets. Thus, Box and Woodall (2012) suggest that six sigma is more focused on innovation today, while stating that it is still widely associated with defect reduction.

It is the author’s view that the definition proposed by Bergman and Klefsjö (2012) is broad enough to encompass all four of Radziwill’s (2018) progressions, yet vague enough to raise questions as to how customer needs and expectations should be met and exceeded. Montgomery (2012) offers a more hands-on approach. Thus, the thesis will use the following definition of quality:

”Quality is the ability to reduce variation to the point where customer expectations are met and preferably exceeded.”

The above definition is considered suitable because of its relevance with regards to how ZD is described in the thesis. Operationalising ZD through Cpk = 2.0 as a result of customer requirements, the above definition relates both to the aspect of customers’ needs as well as the importance of reducing variation as a means to enhance the Cpk index and quality in general.

Being an integral part of economic performance in an environment where competitiveness and customer requirements are continually increasing (Foidl & Felderer, 2015), quality management is of critical concern for any organisation (Dogan & Gurcan, 2018). This is even more so for organisations producing components and products that are subject to a high degree of stress or are part of systems relying on safety. Aerospace, for example, is an industry where product safety is especially important (Magnanini et al., 2019; SAE International, 2018a). The quality of a product or component is described by data on multiple geometric specifications, and the quality of a process is characterised by process data sets (Wang, 2013). Being affected by facilities, equipment, and manufacturing processes, the quality of a product is subject to several sources that have the potential to cause errors or defects (Wang, 2013). Papacharalampopoulos et al. (2019) argue that the quality of a product is dependent on the quality and efficiency of the manufacturing process, such that deviations in production can result in defective products. Montgomery (2012) suggests that a defect is one or a set of nonconformities serious enough to have a significant impact on the safe and effective use of the product.

Box and Woodall (2012) highlighted how quality and innovation may be viewed as closely related, stating that an innovative system can be developed by combining statistical tools for continuously added business value. Information technology and the combination of several sources of data are often used in such systems. However, innovation usually finds greater interest among business leaders than quality. Still, traditional quality tools are often useful for important incremental innovations. Combining different key ideas of SPC with other areas of statistics has resulted in innovations such as time series methods, change-point methods, and operations research (Box & Woodall, 2012). For the most part, SPC methods are based on the assumption that observed data are stationary and independent over time; Box and Woodall (2012) acknowledge that such assumptions often describe reality inadequately. Further, they argue that the 3.4 ppm defect metric, distributional assumptions, the 1.5 sigma shift, and specification limits are doing more harm than good today.

2.3 Statistical Process Control

Continuous improvement is an overall philosophy for improving quality, and part of that philosophy pertains to Statistical Process Control (SPC) (Mohammed et al., 2008). The objective of SPC is to increase the knowledge of a process so that it can be directed towards a specific target or desired behaviour. This is done by reducing the variation of the process or product so that performance can be enhanced (SAE International, 2018b). As nonconformities can be reduced by having less variation, SPC methods are suitable for improving quality. Further, since variability can only be described in statistical terms, statistical methods are in fact an integral component of quality work (Montgomery, 2012). In statistical science, one primary driving force has been the need to adapt and develop methods and theory that can handle practical problems in various areas (Box & Woodall, 2012). For new products and processes, Box and Woodall (2012) suggest that statistical thinking and methods are often necessary for handling both the design and analysis of the measurement systems needed. The increasing amount of data has also served as a force driving developments in statistical science, together with increases in computational power, making computationally intensive methods and analysis of larger data sets possible (Box & Woodall, 2012).

Utilising observed data to monitor and control processes requires some groundwork. First, appropriate methods and tools must be selected that comply with the design characteristics and process variables that represent product quality (SAE International, 2018a). Common practice is to identify measurable key characteristics (KCs), which are variables or attributes whose variation significantly affects product fit, form, function, performance, or producibility. In the second step, analytical studies of process effectiveness are conducted with regards to process stability and capability, as well as the actions needed to handle potential problems. In order to do so, the collection of data must be planned so that it can be used to understand the

process. The data must also be generated and analysed using statistical techniques to interpret stability, capability, and variation (SAE International, 2018a). Before continuing on to the third and last step, improvements are conducted on the process in order to address problems with stability, capability, and variation due to special causes. Once a state of statistical control is established, one can continue to the third step, where process monitoring and control are applied continually in production with the goal of maintaining the stability and capability of the process. Figure 2 illustrates the different steps of process control (SAE International, 2018a).

Figure 2: Process Control Activities (inspired by SAE International (2018a), p.7)

In figure 2, the second and third step may be interpreted as phase I and phase II control chart applications respectively. Phase I involves gathering and analysing data in retrospect to determine whether or not the process has been in a state of statistical control. If not, efforts to identify and eliminate assignable causes of variation are carried out so that reliable control limits can be established for future monitoring. Once the process is brought to a state of statistical control, phase II utilises control charts to monitor the process and maintain its stability.
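As a concrete illustration of the phase I step, the following Python sketch (an added example using hypothetical data, not data from the thesis) establishes trial control limits for an individuals chart, estimating the process standard deviation from the average moving range as is customary for individuals charts.

```python
import numpy as np

def individuals_limits(x):
    """Phase I trial limits for an individuals (X) chart. The process
    standard deviation is estimated from the average moving range:
    sigma_hat = MR_bar / d2, where d2 = 1.128 for moving ranges of size 2."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))        # moving ranges of consecutive points
    sigma_hat = mr.mean() / 1.128
    cl = x.mean()
    return cl - 3 * sigma_hat, cl, cl + 3 * sigma_hat

# Hypothetical measurements of a key characteristic
data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4]
lcl, cl, ucl = individuals_limits(data)
print(f"LCL = {lcl:.3f}, CL = {cl:.3f}, UCL = {ucl:.3f}")
```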

According to the aerospace standard on process control methods (AS13006) developed by the Aerospace Engine Supplier Quality (AESQ) group, proactive process controls are preferred over reactive approaches such as inspection (SAE International, 2018a). Post-production identification (inspection) of product defects and process deviations often result in compensation strategies that consume excess resources such as time, material, and energy (Magnanini et al., 2019). Thus, when possible, processes should be managed into a state of control as early as possible in the production cycle rather than employing quality control by inspection at the end of production (SAE International, 2018b). Quality inspection will not enable a defect level of zero. Rather, it motivates a culture of only finding solutions to urgent problems and not long-term quality and customer satisfaction (SAE International, 2018b).

2.3.1 Control Charts

The typical control chart, often referred to as a Shewhart control chart after its inventor Walter A. Shewhart, represents a plot of data over time (Shewhart, 1931). A centre line (CL) and two control limits are added to the control chart horizontally. The centre line is often represented by the mean of the data set, and the control limits are usually computed by adding and subtracting three standard deviations to and from that mean (Mohammed et al., 2008). The Greek letter sigma (σ) is often used for denoting standard deviation; therefore, these control limits are sometimes referred to as three-sigma limits. For practical reasons, it is not

always possible to have negative data points. In that case, it is customary to set the lower control limit to 0. If the data points are plotted within these limits and without any unusual pattern, the process is considered to be in a state of statistical control. Statistical control is an important construct within SPC, meaning that a process is only showing variation due to common causes (Mohammed et al., 2008) and exhibits a random and predictable pattern within its natural way (SAE International, 2018b). Variation caused by sources that are not considered part of the system or process itself is commonly referred to as special (or assignable) causes (SAE International, 2018a). If such variation is present, the process is said to be out of (statistical) control. There is a variety of additional guidelines to determine how a process is performing based on these types of control charts. However, as more rules are used, one can expect an increase in false alarms. Three main rules are widely accepted (Mohammed et al., 2008):

• Eight or more consecutive points on one side of the centre line.
• Two out of three consecutive points plotted beyond either the upper or lower two-sigma limit.
• A trend of eight or more consecutive points either increasing or decreasing.

The observations or collected data that serve as data points in a control chart can be divided into two main categories: variable data that are continuous in nature, and discrete attribute data that are based on counts or classifications (SAE International, 2018a). These types of data can be divided further based on sample size, data type, and the size of the shift in mean that is to be detected (Montgomery, 2012), see figure 3.
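The three rules above lend themselves to a mechanical check. The sketch below (an added, simplified illustration; commercial SPC software typically applies larger rule sets) scans a series of plotted points for each rule, given the centre line and an estimate of sigma.

```python
import numpy as np

def check_run_rules(x, cl, sigma):
    """Scan a control chart series for the three widely accepted run rules.
    Returns a list of (rule, index) tuples, where index is the point at
    which the rule fires."""
    x = np.asarray(x, dtype=float)
    alarms = []
    side = np.sign(x - cl)
    # Rule 1: eight or more consecutive points on one side of the centre line
    for i in range(len(x) - 7):
        if abs(side[i:i + 8].sum()) == 8:
            alarms.append(("rule 1", i + 7))
    # Rule 2: two out of three consecutive points beyond the same two-sigma limit
    for i in range(len(x) - 2):
        if (x[i:i + 3] > cl + 2 * sigma).sum() >= 2 or \
           (x[i:i + 3] < cl - 2 * sigma).sum() >= 2:
            alarms.append(("rule 2", i + 2))
    # Rule 3: a trend of eight or more consecutive points increasing or decreasing
    trend = np.sign(np.diff(x))
    for i in range(len(trend) - 6):
        if abs(trend[i:i + 7].sum()) == 7:
            alarms.append(("rule 3", i + 7))
    return alarms

# Example: a sustained upward shift triggers rules 1 and 2 here
series = [10.0, 10.1, 9.9, 10.4, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9, 11.0]
print(check_run_rules(series, cl=10.05, sigma=0.27))
```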

Figure 3: Univariate Control Charts Decision Tree (inspired by Montgomery (2012))

All the charts in figure 3 have in common that the data are collected and presented sequentially over time. Mohammed et al. (2008) suggest that it is necessary to make certain that data are independent over time. In that regard, independence implies that successive data points do not show any relationship or autocorrelation. There are two types of autocorrelation according to Mohammed et al. (2008). Positive autocorrelation is present when high values tend to be followed by high values and low values tend to be followed by low values. Negative autocorrelation is present in a data set when low values are followed by high values and vice versa (Mohammed et al., 2008).
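A simple first check of this independence assumption is the lag-1 sample autocorrelation, sketched below (an added illustration; the 0.3 threshold mentioned in the comment is a rough rule of thumb, not a value taken from the thesis).

```python
import numpy as np

def lag1_autocorrelation(x):
    """Sample autocorrelation at lag 1. Positive values indicate that high
    values tend to follow high values; negative values indicate alternation."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

# Hypothetical process data; a coefficient well away from zero
# (say |r| > 0.3) would cast doubt on the independence assumption.
print(lag1_autocorrelation([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]))
```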

2.3.2 Process Capability

In order to provide a quantitative measure of process potential and performance, capability indices such as Cp and Cpk are used within manufacturing industries (Pearn & Chen, 1999). Cp may be used to determine the potential capability of a process if it is centred about the mean but does not supply any information about why the process is not meeting specifications (Montgomery, 2012; Sall, 2018). For measuring the actual capability, Cpk may be used and compared to Cp in order to give an idea of process centring (Montgomery, 2012). In order to provide information about process capability, i.e. how well it could perform if process drift issues are fixed, Sall (2018) suggests using Cpk. For determining how well a process is doing, Sall (2018) suggests using the performance index Ppk. Cpk is computed using a short-run local estimate of the process standard deviation, while Ppk is computed using a long-run classical estimate (Sall, 2018). Being easy to apply and understand, indices are often used to draw conclusions without considering the underlying data, distribution, and sampling errors, according to Pearn and Chen (1999). One of the main prerequisites for computing reliable capability indices is that the studied process is in a state of statistical control. Montgomery (2012) argues that the indices Pp and Ppk were developed to describe processes that are out of statistical control, implying that they essentially tell you nothing. According to Deming (1986), it is pointless to predict the outcome of a process that is out of statistical control since it has no capability. Describing complex phenomena in a very simplified way, both Montgomery (2012) and Pearn and Chen (1999) stress that indices of process performance and capability should be used with great care. Formulas for computing the above-mentioned indices are provided below, where USL is the upper specification limit, LSL is the lower specification limit, µ is the process mean, σ is the short-run local standard deviation, and s is the long-run sample standard deviation.

$$C_p = \frac{USL - LSL}{6\sigma} \tag{1}$$

$$C_{pk} = \min\left\{\frac{USL - \mu}{3\sigma},\ \frac{\mu - LSL}{3\sigma}\right\} \tag{2}$$

$$P_p = \frac{USL - LSL}{6s} \tag{3}$$

$$P_{pk} = \min\left\{\frac{USL - \mu}{3s},\ \frac{\mu - LSL}{3s}\right\} \tag{4}$$
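Equations (1)–(4) translate directly into code. The following sketch (an added illustration with hypothetical data) computes all four indices, taking the short-run sigma from the average moving range and the long-run s as the classical sample standard deviation, in line with Sall's (2018) distinction.

```python
import numpy as np

def capability_indices(x, lsl, usl):
    """Cp and Cpk use a short-run sigma estimated from the average moving
    range (sigma_hat = MR_bar / 1.128); Pp and Ppk use the long-run
    classical sample standard deviation s."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = np.abs(np.diff(x)).mean() / 1.128   # short-run local estimate
    s = x.std(ddof=1)                           # long-run classical estimate
    cp = (usl - lsl) / (6 * sigma)
    cpk = min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))
    pp = (usl - lsl) / (6 * s)
    ppk = min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))
    return cp, cpk, pp, ppk

# Hypothetical key characteristic with specification limits 9.4 and 10.6
cp, cpk, pp, ppk = capability_indices(
    [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4], 9.4, 10.6)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, Pp = {pp:.2f}, Ppk = {ppk:.2f}")
```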

2.4 Zero Defects

Increasing demands on produced parts have led to an escalation in the importance of reducing defects drastically within manufacturing (Tatipala et al., 2018). In fact, to gain customers and market shares in modern competitive markets, Wang (2013) claims that reaching the goal of so-called zero defect products is essential. Zero Defect Manufacturing (ZDM) is a concept practised within manufacturing that seeks to minimise defects in processes by doing things right from the very beginning, ultimately seeking to reduce the defect rate to zero (Wang, 2013). By reducing variability during production, Papacharalampopoulos et al. (2019) suggest that ZDM attempts to achieve better and more sustainable production systems on both economic and environmental grounds. By minimising defective output, cost, time, raw materials, and other resources are reduced (Papacharalampopoulos et al., 2019).

Although it can be traced back to the 1960s (Eleftheriadis & Myklebust, 2016; Montgomery, 2012), it was during the 1990s that zero defects saw wide efforts of implementation, when automotive companies wanted to cut costs by reducing quality inspections while simultaneously increasing the demands on quality from suppliers. When it was introduced by the US Secretary of Defence, zero defects (ZD) was an approach for eliminating defects derived from human error (Eleftheriadis & Myklebust, 2016). Meant to inspire all levels of the organisation to do their jobs right the first time, ZD heavily underlined that it was not meant to be viewed as a technique for evaluating employees, a speed-up program, nor a substitute for quality control (Eleftheriadis & Myklebust, 2016). An important point made by Eleftheriadis and Myklebust (2016) is that, as a concept, ZD took notice that dedication, sufficient training, and tool proficiency among employees did not necessarily guarantee products and processes free from defects. However, the focus on each and everyone’s individual accountability for quality on a personal and collective level was later disclaimed by Deming (1986), who stated that the majority of causes of low quality are derived from the system itself and not the workforce. Wang (2013) addresses the necessity of a system for ZDM to prevent failures and increase the reliability and safety of manufacturing industries. Tatipala et al. (2018) suggest that part of such a system is the ability to control product and process parameters with the use of connected manufacturing technologies and control systems that handle machine and other process data. The modern technologies that make up an Industry 4.0 environment supply the right conditions for the concept of ZDM to emerge and, as described by Papacharalampopoulos et al. (2019), ”...be established as a basic pillar of the new era”. By increasing the digitalisation and inter-connectivity of processes and production lines, Industry 4.0 can provide strong tools for achieving ZDM. Papacharalampopoulos et al. (2019) mention data-based models, simulation-based engineering, online measurement, and data gathering systems as tools that can be used to work towards ZD.

Although quite a lot has been written on the subject, no clear or quantifiable definition of ZD or ZDM has been found in the literature. Therefore, it is considered to be nothing more nor less than what its name implies, i.e. 100% non-defective. Thus, ZD may be quantified as a defect rate of 0 ppm. It is however important, as suggested by Deming (1986), to consider where in the chain of process steps defects are recorded and how they are measured. Likewise, it is of interest to determine who decides what is defective and how the interpretation of defects is shared among management, worker, and inspector. When considering how Montgomery (2012) defines defects, it is implied that such are recorded when a product has been delivered to the customer. With good process knowledge, it is possible to identify these sooner rather than later. Altogether, we may view ZD as a concept which seeks to reach a rate of 100% conforming and deliverable products or services. One could also argue that ZD is a state of being for an organisation or a process, producing 100% conforming and deliverable products or services. Nevertheless, without advanced and interconnected manufacturing systems that can control production to perfection, guaranteeing customers deliveries of 100% conforming products may only be feasible with the use of suitable tools and methods within SPC accompanied by thorough inspection.

2.4.1 Modern Manufacturing Requirements

Up until recently, statistical tools and the collection of data have been considered excellent, or at least sufficient, indicators for potentially improving quality. Today’s complex manufacturing systems, enabled by technologies such as the internet of things (IoT) and cyber-physical systems (CPS), as well as increasing amounts of data, require a better understanding of the inter-operability between the different elements of a manufacturing system (Eleftheriadis & Myklebust, 2016). With the increasing complexity of multi-stage manufacturing systems, product quality characteristics have developed inter-dependency among process stages (Magnanini et al., 2019). Despite rigorous tests and controls during manufacturing, there is still a proportion of nonconforming products that are cleared and delivered to customers (Papacharalampopoulos et al., 2019). Therefore, Papacharalampopoulos et al. (2019) suggest that there is a need for real-time monitoring and

more accurate, adaptive process control methods. Considering the steadily increasing customer demands and forecasts that an Industry 4.0 environment will facilitate a higher degree of customisation and a reduction in batch sizes down to one (Lasi et al., 2014), traditional methods for process control need to be supplemented. Although viewed as a necessary control instance allowing users to measure, evaluate, and assist long-term improvements, Statistical Process Control (SPC) is at best partially applicable in small-batch production (Eleftheriadis & Myklebust, 2016). SPC requires a certain number of measurement values of a quality characteristic and is traditionally used for monitoring the characteristic so that it is possible to distinguish between the natural variability of a process and the variability assignable to special causes (Eleftheriadis & Myklebust, 2016). Assignable (special) causes are then to be eliminated and can provide information about additional improvement of the process (Montgomery, 2012). Thus, zero defect approaches such as enhanced real-time process control and improved medium- and long-term process improvement are also necessary for achieving ZDM (Eleftheriadis & Myklebust, 2016).

Looking at multi-stage manufacturing systems for producing products with complex features, such as engine components for aircraft, Magnanini et al. (2019) suggest that it is important to consider the quality strategy together with the overall production strategy. Due to large sizes and high production costs, Arsuaga Berrueta et al. (2012) state that the impact of defective parts in aerospace production is paramount. Final product quality is a result of the ability of the different operations in different stages of the process chain to produce satisfactory output. Further, Magnanini et al. (2019) argue that the capability of detecting defects, or the possibility of them, along the process chain is related to time and cost savings. Controlling quality at the end of the manufacturing line allows defects produced in earlier process stages to spread onto the following ones (Arsuaga Berrueta et al., 2012). For the aircraft industry, complex parts, a large number of parameters, and generally short series imply that it is insufficient to adjust production by employing SPC methods alone (Arsuaga Berrueta et al., 2012).

As a result of what are considered feeble attempts to improve quality post-production, practitioners and researchers have started to focus on developing methodologies for simultaneously increasing product quality and process capability, rather than treating these two separately (Magnanini et al., 2019). Since the different stages of a manufacturing process are subject to variability that can result in deviations or defects, Magnanini et al. (2019) suggest that characterising the effect of controllable and uncontrollable causes of variability on the final product is a key challenge. Magnanini et al. (2019) explain that the aim of ZDM is to reduce defective product by having an integrated strategy capable of identifying errors sooner rather than later, to successfully avoid the propagation of defects along the process chain. Thus, strategies for ZDM aim at improving the quality of both product and process. However, developing quality-oriented strategies that integrate products and systems requires a good understanding of the manufacturing system with regards to both organisational and technological complexity. According to Magnanini et al. (2019), that understanding provides better reactiveness to customer requirements and thus market competitiveness.

As proclaimed by Eleftheriadis and Myklebust (2016), creating a guideline of suitable quality tools for new and complex ZDM systems should not be considered straightforward. It was however discovered that systematic approaches to validating, structuring, and storing acquired data proved to be important. Eleftheriadis and Myklebust (2016) suggest selecting control tools that are critical to quality with respect to machine tolerance and collecting end-user process knowledge. By doing so, a guideline for handling what is referred to as Vital Process Parameters (VPP) can be developed. The development of new and intelligent ZDM systems is not likely to take place in one instance. Rather, the process is likely to take form gradually over time, where the organisation and management will learn concurrently. Although Arsuaga Berrueta et al. (2012) suggest SPC methods are insufficient for reaching a state of zero defects, several authors indicate that key issues are possible to handle using these methods. At the very least, typical components of SPC, such as control charts and capability studies, seem necessary to remedy existing quality problems.

2.5 Organisational Implications

As noted, several different points are made within the literature that try to describe what ZD is and how it is obtained. Regardless, there are organisational aspects that need to be considered in order to support the wanted trajectory of an enterprise. One important factor discussed by Deming (1986) is the adoption and institution of leadership, where the focus on outcome must be relinquished. Among the examples mentioned are work standards, management by numbers, meeting specifications, and zero defects. Supervision of outcome should be replaced by a focus on sources of improvement, the intent of product quality, and on translating that intent into design. In order to do so, barriers that hinder employees from being proud of their work must be removed. Further, it is necessary that leaders are familiar with the work they are supervising so that they in turn are able to inform their managers of what needs to be corrected, such as inherited defects or maintenance of tools and machines. An example described by Deming (1986) is treating every non-conformance as a special cause when the process is stable. The result is recurring troubles, when the only remedy should be improving the system by reducing process variation or adjusting the level.

Deming (1986) suggested that only acting on the system itself can result in substantial improvement. Exhortations, slogans, and targets are therefore not advised. Not only must the specific process or organisation in question be free from defects, all the suppliers down the chain must supply no defectives either. ”Do it right the first time” is a phrase related to ZD (Eleftheriadis & Myklebust, 2016; Wang, 2013). According to Deming (1986), doing right the first time is nearly impossible, since one has to be able to make certain that inputs are totally free from defects, and the tools and machines for producing and measuring must also be flawless. Further, besides denouncing rates and incentive pay, Deming (1986) suggests that exhortations and targets are directed at the wrong audience, only showing that management has not identified the barriers that prevent the work from being carried out properly, as a result of management expecting that employees can improve quality and accomplish ZD solely by trying harder. It is the responsibility of management to improve the system so that it can enable employees to do their jobs. What management should communicate to the employees is what they are doing for the employees, and how they are improving the system so that the job can be carried out and provide better quality. Setting up and communicating numerical goals to the people within the organisation is therefore self-defeating without a clear and detailed road map on how to reach them. Deming (1986) suggests that quotas affect productivity and quality negatively, since they only result in increased costs of operation and curb pride of workmanship. Further, Deming (1986) suggests that more engineers are often occupied with developing work standards and counting defectives than engaged in production.

Numerical goals should not only be eliminated among the employees, but for management too. Employees, including management, need to be allowed to be proud of their work. Therefore, resources and prerequisites need to be secured in order for the people in the organisation to produce the right quality. The right quality cannot be produced if employees have to spend their time inspecting and correcting defective product. Inspection cannot create or improve quality because it is too late, as well as ineffective and costly. Inspection is planning for defects, accepting that the process does not have the capability for the required specifications. But if there is no quality, inspection may be the only alternative. It is sometimes inevitable to have inspection for economic reasons or because of customer demands. Inspecting smaller samples is also necessary for producing control charts that help achieve or maintain statistical control of processes.

If systems and processes are not stable, and machines, gauges, and other equipment are out of order, the only thing that can be produced is defective product. Referring to such efforts as ”putting out fires”, Deming (1986) stresses that this is not improving a process. Identifying and removing a special cause detected by a point out of control is not improvement either; it merely resets the process to where it should be. By removing barriers that inhibit the organisation from producing quality product, people will feel important to the job they are executing, resulting in pride of accomplishment and a willingness to improve the system.

According to Deming (1986), typical obstacles are leaders who do not have sufficient knowledge to provide leadership, inadequate training in technology, and documentation on how the job is done that is either lacking or excessively complicated, resulting in people not knowing what their jobs are. Management can remove these barriers by listening to advice and taking action on suggestions from the people who do the job. Referring once again to system stability, Deming (1986) describes that it is useless to set up a goal if the system is stable: the outcome will be what the system is capable of delivering. If the output is normally distributed, there will always be a percentage of output below the mean, just as there will always be a percentage above it. On the other hand, if the system is not stable it has no capability, and there is therefore no way of knowing what it will put out; setting a goal is pointless in this case as well. In other words, outcome is an ineffective measure to focus on when seeking to improve processes.

As implied by both Montgomery (2012) and Deming (1986), variation is an aspect that greatly affects the quality of products, processes, and the business as a whole. As mentioned, statistical methods are key to describing variation and the issues arising from it. Continuous improvement is not derived from conformance to specifications; it is a result of the ability to reduce variation about the nominal value. In order to learn about the different sources of variation in an organisation, Deming (1986) suggests that barriers between staff areas need to be broken down. According to Deming (1986), production and delivery are often prioritised to the extent that engineers have to make production trade-offs, which inhibits them from learning about production and design problems. To a large extent, this is a result of controlling the workforce on metrics such as KPIs or ratios, ultimately leading them to produce merely to specification. The remedy, as described by Deming (1986) in point nine of the 14 points for management, is that people in research, design, purchasing, sales, etc. must learn from each other and from one another's struggles. Quality should be built in at the design stage; collaboration between departments is therefore fundamental. Quality starts with the intent, which comes from management according to Deming (1986). For the aerospace industry, where product and design complexity is a fact, transferring the intent of quality is highly relevant, especially between design and manufacturing (Arsuaga Berrueta et al., 2012; Magnanini et al., 2019).

3 Methodology

The following chapter presents the methodology of the thesis. First, the research approach is described. Choices regarding data collection and analysis approaches are then motivated, and the methods that have been used are discussed in relation to alternative methods. Lastly, research quality in terms of the reliability and validity of the thesis is treated. Figure 4 represents the main steps of the thesis methodology.

Figure 4: Main methodological process steps.

3.1 Research Approach

The research approach of the master's thesis may be considered deductive. A deductive approach is usually focused on causality (Deborah, 2013). The approach may be considered deductive since ZD is a concept which, especially during the last two decades, has been studied extensively. The empirical data collected for the thesis are considered sufficiently detailed and rich to be used for the purpose of exploring the phenomenon of ZD and explaining it by developing themes in the thematic analysis.

In order to answer the research questions and fulfil the aim of the thesis, the research strategy is of exploratory character with a qualitative approach to collecting data. First, a literature overview was conducted in order to map out different views of and approaches to ZD and SPC. The literature overview resulted in the model described by the framework in figure 1. The framework depicts different actors, systems, and concepts that may describe ZD in different ways. The thesis used existing theory from the literature to describe the concept of ZD in general, as well as how ZD may be approached by utilising tools and methods for SPC, answering the third research question. The general description was then compared to a qualitative study consisting of 14 interviews in order to gain deeper and more specific knowledge of the phenomenon at GKN Aerospace. The comparison was carried out through a thematic analysis, addressing the first, and partially the second and third, research questions. Developing a general view through literature and comparing it to empirical observations implies that the research is of more deductive character according to Saunders et al. (2007), who view deduction as theory generating data and induction as data generating theory. In addition, Saunders et al. (2007) suggest that an important characteristic of deductive approaches is that concepts have to be operationalised to provide measurability. As the thesis assumes that ZD may be operationalised through Cpk, the deductive approach is considered suitable.

In order to support the second research question with a practical example, an SPC study was also conducted. The SPC study provided understanding of how GAS is working with collection, reporting, and analysis of process data. The quantitative data used were collected from QSYS, the internal system for process data. Thus, the data were not generated for the purpose of the study but rather collected as a means of identifying and analysing how process data are currently managed within the organisation. In combination with the interviews and the case study, the framework was then used as an aid in assessing the current situation at GKN Aerospace Trollhättan with regard to working towards zero defects from an SPC perspective. Table 1 summarises how the different studies address the three research questions of the thesis.

Table 1: Relation between the studies of the thesis and the research questions

Study                  Research Questions
Literature Overview    1, 2, 3
Initial Analysis       1, 3
Thematic Analysis      2, 3
SPC Study              2

3.2 Literature Overview

It is, according to Eisenhardt (1989), essential to compare emerging concepts and theories with existing literature for the purpose of internal validity and generalisability. Additionally, if different sources of literature are conflicting, it may drive the research to become more creative and frame-breaking, resulting in deeper insight and a sharpening of the limits to generalisability of the study (Eisenhardt, 1989). The literature used in the thesis mainly consists of peer-reviewed articles in scientific journals. Articles from established journals were of special importance, as were more frequently cited articles, since these are more likely to be part of the contemporary views on the topic within the literature. Additionally, the articles may be considered quite recent, since none of them are more than 20 years old, thus portraying contemporary concepts and theories on the subject. These articles have been compared with existing theories and concepts from earlier literature, among which Deming (1986) and Montgomery (2012) are considered well-established sources that are broadly regarded as valid and reliable. In addition, international aerospace standards have been used in order to represent an industry perspective, covering the theories and concepts that are practised within industry.

Two databases were used for finding articles in scientific journals: Scopus and Google Scholar. Scopus served as the primary database on the basis of its structure and citation traceability; it makes it easy to toggle back and forth between different articles through citations. Another reason for choosing Scopus as the primary database was the ability to formulate more specific search strings using Boolean operators and truncation. Restrictions in accessibility through Scopus resulted in some articles being collected through Google Scholar, given its extensive coverage.

In order to find articles treating subjects such as ZDM, SPC, quality management, and Industry 4.0, different keywords were used. Some of the more frequent ones included defect, zero defects, manufacturing, aerospace, process control, statistical process control, process monitoring, industry 4.0, and quality improvement. Some of the keywords generated large numbers of results from an array of industries and practices. Filtering on aerospace or vehicle transport, statistics, as well as manufacturing and engineering was therefore employed in order to reduce the number of less relevant hits and focus the results towards the project's aim and the specific industry. A wide variety of combinations of the keywords were also employed, using Boolean operators and truncation.
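As an illustration of what such a search string could look like, the following is a hypothetical example in Scopus field-search syntax, combining phrase searching, Boolean operators, and truncation; it is not a string recorded from the actual search log:

TITLE-ABS-KEY ( "zero defect*" AND ( manufactur* OR aerospace ) AND ( "statistical process control" OR "process monitoring" ) )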

3.3 Interviews

The collection of qualitative data consisted of 14 semi-structured interviews conducted with company representatives who were either involved in quality work or had good knowledge of particular processes. Considering the literature, working towards ZD requires engagement and action from management on all levels. The employees chosen for the interviews in this thesis are therefore represented by top management, managers from areas such as design, manufacturing, and quality, as well as managers of specific products, processes, and projects. The aim of interviewing employees in different types of managerial positions was to cover different areas of the organisation in an attempt to gain a comprehensive collection of data on the views within the organisation as a whole. Further, engineers within both design and manufacturing as well as production technicians were interviewed to get an understanding of the more operative aspects of quality work, as these individuals work close to the problems and the more practical aspects of the business.

According to Longhurst (2003), semi-structured interviews are partially structured and self-conscious verbal interchanges where information is extracted through questions. Semi-structured interviews can be used as a supplementary method, as a way to triangulate in multi-methods research, or as a stand-alone method for collecting empirical data (Longhurst, 2003). The interviews conducted in this thesis were a way of gaining deeper knowledge of process management with regard to data collection and analysis, as well as of more organisational aspects such as work routines and communication, to get a better understanding of how the interviewees work towards zero defects. Table 2 lists the interviewees in the order the interviews were conducted, the approximate number of years they have worked for GKN Aerospace, their years in their current position, and the approximate interview duration.

Table 2: Interviewee information and interview duration.

Number  Title                                          Years in organisation  Years in position  Duration
1       Manager Quality Assurance                      35                     1                  55 min
2       Quality Assurance Product Development          38                     13                 51 min
3       Engineering Management & Support               36                     3                  59 min
4       Engineering Management & Support               3                      1                  54 min
5       Manager Area Manufacturing Engineer            35                     0.5                46 min
6       Quality Engineer                               24                     8                  58 min
7       Robust Design Engineer                         8                      8                  60 min
8       Director Operational Excellence                12                     1                  85 min
9       Director Manufacturing Engineering & Quality   25                     1                  60 min
10      Director Quality                               36                     4                  71 min
11      Chief Manufacturing Engineer                   25                     3                  64 min
12      Chief Manufacturing Engineer                   13                     1                  44 min
13      Technology Insertion Project Manager           34                     0.5                48 min
14      Product Manufacturing Engineer                 32                     24                 59 min

The qualitative approach with semi-structured interviews was chosen to obtain empirical data which could be compared during analysis, while still enabling the interviewees to speak freely about what they felt was most important for approaching ZD by utilising tools and methods within SPC. Kallio et al. (2016) suggest that interview guides for qualitative semi-structured interviews reflect the objectivity and trustworthiness of a study as well as the plausibility of its results. In order to develop a rigorous interview guide, the five steps suggested by Kallio et al. (2016) were followed:

1. Identifying prerequisites for using semi-structured interviews
2. Retrieving and using previous knowledge
3. Formulating the preliminary semi-structured interview guide
4. Pilot testing of the interview guide
5. Presenting the complete semi-structured interview guide

The interview guide used during the interviews can be found in Appendix A and contained four general and 33 more specific questions, as well as three statements on which the interviewees were asked to share their opinions. Both the general and the specific questions were developed with the help of the literature to enable comparative analysis of the literature and the collected interview data. The literature overview identified key areas related to ZD; these areas are broadly represented by the subsections of the literature overview. First, 20 questions were developed that related to how ZD could be interpreted within GKN Aerospace. These questions were mainly developed with the help of company representatives and related to earlier improvement initiatives and the internal systems at GKN Aerospace; the literature overview helped define them with regard to improvement initiatives, measurability, and data reliability. Three specific questions were developed for the purpose of defining what defects are and how they are detected. Six questions related to SPC as well as how Cpk is interpreted and enhanced within the organisation. Four questions related to Industry 4.0 and how modern technologies may affect the work towards ZD. The three statements were based on difficulties in working with SPC within the aerospace industry, identified in AESQ's Guidance Materials standard (SAE International, 2018b). Follow-up questions to some of the more specific questions were asked in an attempt to obtain further explanations.

Each interview was initiated with the general questions, followed by the first couple of specific ones, after which it unfolded in a conversational manner, letting the interviewees speak more freely (Longhurst, 2003). The less structured approach to interviewing was chosen for the purpose of gaining deeper knowledge about how the company representatives think about, work with, and interpret process control methods and the measures used, as well as their interpretation of how these aspects relate to ZD. Each interview was recorded and transcribed within 24 hours, while the specific interview was still fresh in the interviewer's mind. Transcription was conducted to develop a documented data set for the thematic analysis. Each recorded interview was first played back once to serve as a reminder of the interview, during which notes were taken. The interview was then played back a second time and transcribed in detail, for the purpose of clearly depicting what each interviewee shared.

3.4 Thematic Analysis

As a relatively straightforward form of qualitative analysis, thematic analysis is a method used for identifying, analysing, and highlighting patterns within data (Braun & Clarke, 2006). These patterns are the themes formed through the analysis. Thematic analysis can be used for different purposes according to Braun and Clarke (2006). Riessman (1993) suggests that the thematic approach can be employed to theorise by identifying common themes across the participants in the research and what they share. Although it may be used with one or more theoretical frameworks, it is not bound to a pre-existing theoretical framework for conducting analysis. Thematic analysis can be used to report meanings, experiences, and reality (i.e. an essentialist or realist method) or to examine how meanings, experiences, realities, or events are the effects of different aspects of society (i.e. a constructionist method). Thematic analysis can thus either reflect reality or explain how reality is constructed (Braun & Clarke, 2006). The thematic analysis conducted here uses theories and concepts from the literature to explain the nature of the transcribed interview data. Therefore, it can be viewed as utilising a realist method.

There are a number of choices to consider and discuss before starting a thematic analysis according to Braun and Clarke (2006). The researcher should also engage in reflexive dialogue throughout the analytic process. The choices to consider are the following:

What counts as a theme? A theme represents some patterned meaning within the data set. It should capture what is considered important in the data, with strong ties to the research questions. This is a subjective matter depending on the researcher's judgement, since the 'keyness' of a theme, as described by Braun and Clarke (2006), often does not rely on a measured quantity of occurrence in the data but rather on the theme's ability to capture importance related to the research questions.

Should a rich description of the data set be reported, or a detailed account of one particular aspect? Since the analysis in this thesis is conducted with ties to theory, it provides a detailed analysis of some aspects of the data rather than a rich description of the data set as a whole.

Should the thematic analysis be inductive or theoretical? The thematic analysis in this thesis can be considered theoretical since it is driven by the researcher's theoretical and analytic interest in the specific topic (Braun & Clarke, 2006). The coding may be considered quite specific, being based on the research questions and various theoretical constructs from the literature. It was, however, discovered during the construction of themes and codes that review of additional literature was necessary, making the approach somewhat more inductive.

Should the identified themes be semantic or latent? The themes of a thematic analysis can be identified at either a latent or semantic level. Analysis on a latent level implies looking for underlying ideas, conceptualisations, or assumptions within the data. Analysis on a semantic level involves identifying themes based on explicit meanings within the data. By identifying themes based on the explicit meanings of the transcribed data, the themes identified in this thesis can be considered semantic.

Should the epistemology of the analysis be of essentialist/realist or constructionist character? According to Braun and Clarke (2006), the research epistemology informs how meaning is theorised and guides what can be said about the data. Employing an essentialist/realist approach implies theorising meaning, motivations, and experience in a straightforward way, assuming that mainly unidirectional relationships exist between meaning, experience, and language. A constructionist approach does not, and cannot, focus on individual psychologies or motivations; it rather seeks to theorise the socio-cultural contexts and structural conditions that enable the individual accounts shared within the data. Since it only considers unidirectional relationships between meaning, experience, and language, the thematic analysis in this thesis should be considered to have an essentialist/realist epistemology.

The thematic analysis was conducted by freely working through the six phases suggested by Braun and Clarke (2006):

Phase 1: Getting familiar with the data The thematic analysis process is initiated when the researcher or analyst begins to look for meaningful patterns and issues of potential interest in the data. As noted by Braun and Clarke (2006), this might be as early as during data collection. To some extent, such patterns and issues were noticed during the interviews, and even more so during transcription. The transcription of interviews served as a familiarising step for understanding the data, informing the first stages of analysis.

Phase 2: Generating initial codes By reading through the entire data set, initial codes were generated. Once the entire data set had been treated, 50 codes had been developed. In order to pay equal attention to each data extract and each code, the extracts were then read once again and additional codes were applied where they fit. This was done because the first round of coding developed the codes gradually: a data extract handled early on might not have been coded with a code that was introduced at a later stage, even though the code fitted the extract. According to Braun and Clarke (2006), coding is an ongoing organic process. At the end of this phase, codes were translated from Swedish to English and the number of times each code was used was recorded. An example of a coded data extract is presented in table 3.

Table 3: Example data extract and codes

Data extract: ”To a very large extent, we rely on the requirements of customer and design. I don’t really know, but I think you expect to be able to control and analyse the outcome of the process after that. I don’t think you use control limits and statistical control as much as you should. You are probably relying on experience; it went well last time so we’ll try it this time as well.” (I2)

Coded with:
10. Need for more statistical work with processes
16. Tolerance limits are used for controlling
42. Customer/delivery is prioritised
46. Process knowledge replacing standardised quality work
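As a small illustration of the bookkeeping in phase 2, recording how many times each code was used, the following Python sketch counts code occurrences across a set of extracts. The extract identifiers and their code assignments are hypothetical stand-ins, not the actual interview data; only the four code labels are taken from table 3.

import collections

# Hypothetical mapping from data extract IDs to the codes applied to them
coded_extracts = {
    "I2-07": [10, 16, 42, 46],
    "I2-08": [10, 42],
    "I5-01": [16],
}

code_labels = {
    10: "Need for more statistical work with processes",
    16: "Tolerance limits are used for controlling",
    42: "Customer/delivery is prioritised",
    46: "Process knowledge replacing standardised quality work",
}

# Count how often each code occurs across all extracts
counts = collections.Counter(
    code for codes in coded_extracts.values() for code in codes
)
for code, n in counts.most_common():
    print(f"{code:2d}  {code_labels[code]}: {n}")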

Phase 3: Searching for themes Once the initial codes had been generated, they were sorted into potential themes. This step included collating all the data extracts to see whether they would fit together within the themes. Some of the codes were found to work well as main themes or sub-themes at this point. Some names were therefore changed to serve better as themes, since themes are broader than codes. In some cases, codes did not seem to fit any specific theme; these were put in a theme called miscellaneous. A thematic map was created based on the initial themes, showing the main views in the transcribed data.

Phase 4: Reviewing themes Once the initial themes had been created, they were reviewed in order to make certain that each theme had a clear distinction and that the data within each theme cohered with meaning. The review resulted in some themes collapsing into each other and others being divided into separate themes. This was done by first reviewing at the level of coded data extracts, reading through all the extracts once again to make sure each theme represented a coherent pattern. Once satisfied, the next step of this phase was to consider the validity of the themes in relation to the entire data set and the thematic map's accuracy in reflecting the meanings of the data set as a whole (Braun & Clarke, 2006). This was both an iterative and a subjective process; it was considered finished when further analysis and refinement of themes was no longer adding anything substantial.

Phase 5: Defining and naming themes The purpose of this phase was to define the themes so that the overall essence of the data set was represented. Also, the essence of each individual theme was identified to determine which aspect of the data it captured. This phase involved considering the data extracts and codes for each theme to make sure that each theme represented a coherent account (Braun & Clarke, 2006). For each theme, a detailed analysis was written based on what the data extracts and codes within the theme portrayed. This resulted in the development of sub-themes, which, according to Braun and Clarke (2006), can be useful for structuring large and complex themes and demonstrating the hierarchy within the data set. At the end of this phase, the themes were named so as to give a concise and immediate sense of what they were about.

Phase 6: Compiling the material in one's own words According to Braun and Clarke (2006), the purpose of the write-up of a thematic analysis is to tell the story of the data so that the reader is convinced of the merit and validity of the analysis. In practice, this was done by explaining each theme, including its sub-themes, what they are about, and how they build up the data set as a whole. Data extracts were used to highlight or exemplify points discovered in the analysis, and these points were analysed in relation to views and concepts from the literature overview.

In practice, the six phases of thematic analysis suggested by Braun and Clarke (2006) were carried out in six steps. The first three steps focused on getting familiar with the data (phase 1) through conducting the interviews, transcribing them, and identifying data extracts. In the fourth step, initial codes were generated (phase 2) and reviewed. Step 5 consisted of phases 3, 4, and 5, where themes were identified, reviewed, defined, and named. In the sixth step, the material was compiled in a written analysis (phase 6). Figure 5 illustrates the thematic analysis process in relation to the six phases developed by Braun and Clarke (2006).

Figure 5: Thematic analysis steps.

3.5 SPC Study

An SPC study was carried out for the purpose of providing a practical example of how process data may be presented and statistically analysed. By showing how tools and methods within SPC may be applied to identify and remedy process issues, such as assignable causes of variation, the SPC study helps answer the second research question. The SPC study was conducted on a data set consisting of a height measurement from an engine component. The process producing the height measurements was not considered stable and thus resulted in nonconforming product. This was established with the help of employees who had process knowledge or were involved in work where the processes delivered nonconforming or defective products. This process was chosen in order for the author to get an understanding of what kind of data were collected and how they were used to improve processes with statistical tools. It also served the purpose of understanding how problematic processes resulted in problems for the business.

First, the process control report currently in use was presented, containing statistics such as capability indices and the standard deviation. Since the measurement process delivered two values for each measurement, the next step involved computing the average of each observation's double measurements; 100 measurements from 50 observations were thus reduced to 50 averages. Afterwards, distribution fitting was conducted, first on the original 100 measurements and then on the 50 averages, using the software Statgraphics Centurion. The distribution fitting procedure was done in order to check whether the data sets came from a normal distribution. Once completed, a Shewhart chart for individual variable data and a moving range chart for the 50 averages were constructed in order to analyse the process statistically. Figure 6 shows the main phases of the SPC study.

Figure 6: Phases of the SPC Study.
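The same sequence of steps can be sketched outside Statgraphics. The following Python example uses simulated double measurements as stand-ins for the actual QSYS data: it averages the pairs, checks normality with a Shapiro-Wilk test, and computes the limits of an individuals and moving range (I-MR) chart. The constants 2.66 and 3.267 are the standard Shewhart factors for a moving range of size two; all variable names and data here are illustrative only.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated stand-in for 50 observations with double measurements (shape 50 x 2)
doubles = rng.normal(loc=120.0, scale=0.05, size=(50, 2))

# Reduce the 100 measurements to 50 averages, one per observation
averages = doubles.mean(axis=1)

# Check whether the averages are consistent with a normal distribution
w_stat, p_value = stats.shapiro(averages)
print(f"Shapiro-Wilk: W={w_stat:.3f}, p={p_value:.3f}")

# Individuals (I) chart: center line and 3-sigma limits from the average moving range
moving_ranges = np.abs(np.diff(averages))
mr_bar = moving_ranges.mean()
center = averages.mean()
ucl_i = center + 2.66 * mr_bar   # 2.66 = 3 / d2, with d2 = 1.128 for n = 2
lcl_i = center - 2.66 * mr_bar

# Moving range (MR) chart: upper limit only (the lower limit is zero)
ucl_mr = 3.267 * mr_bar          # 3.267 = D4 for n = 2

print(f"I chart:  CL={center:.4f}, LCL={lcl_i:.4f}, UCL={ucl_i:.4f}")
print(f"MR chart: MRbar={mr_bar:.4f}, UCL={ucl_mr:.4f}")
print("Signals:", np.flatnonzero((averages > ucl_i) | (averages < lcl_i)))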

3.6 Research Quality

The following section addresses the research quality of the thesis. First, the internal validity of the empirical data is discussed. Second, the external validity and generalisability of the thesis are discussed. Third, the reliability of the empirical data and methodology is treated.

3.6.1 Internal Validity

According to Guba (1981), internal validity relates to the correspondence between the data of a study and what the data actually represent. For the quality of this thesis, internal validity may be viewed as the credibility of what the data represent. Three main actions were taken to improve the internal validity of the interview data. First, interviewees were initially selected with the help of company representatives who possessed knowledge about the organisation, process improvement, quality work, or specific design or manufacturing processes. Second, during or after interviews, additional interviewees were recommended based on their expertise and knowledge within specific areas; these recommendations were considered valuable and helped build the complete sample. Third, the interview questions were initially based on the literature as well as the help of a company representative with long experience and knowledge of the subject. In addition, after the interviews, the interviewees were asked whether they thought additional questions were needed to cover the different aspects of the topic; including such questions would then be considered.

The internal validity might have been further improved by using different methods of data collection. Qualitative data are, however, considered suitable for the purpose of answering the research questions. Alternative methods of collecting qualitative data, such as workshops or focus groups, might have provided useful discussions between individuals, but such methods are more difficult to document in a controlled and reliable way, and reliable documentation was prioritised in order to conduct a thorough thematic analysis. Questionnaires could have been used as a means of collecting qualitative as well as quantitative data. Qualitative data from questionnaires were, however, considered less in-depth than interview data, and it is believed that the research questions would not be answered better by relying on quantitative questionnaire data, since interpretation is more easily explained with words than with numbers.

3.6.2 External Validity

External validity is described by Guba (1981) as the extent to which the results of a study can be generalised without being affected by aspects of variation or specific situations. Partly, external validity for this thesis relates to how the results from the thematic analysis of transcribed interview data may be generalised with the help of the studied literature. The results of the thematic analysis may also be generalised to broader contexts such as the aerospace industry, or other industries. In order to facilitate generalisation, it is also important to consider the selection of data and its size (Saunders et al., 2007). For this thesis, this relates to both the interview data and the data used in the case study. The interviewees were selected based on their knowledge and proficiency in the studied fields of manufacturing processes, management, and quality engineering, with the help of company representatives and recommendations from interviewees. In particular, interviewees recommended by earlier interviewees may be considered especially important, since they were often recommended based on the character of the interview questions. For an interview study, 14 interviewees may be viewed as a sufficient number, given that ZD is a concept spanning a certain degree of cross-functionality. The length of the interviews was considered long enough to supply sufficient, in-depth information, yet short enough to maintain active and fruitful conversations. As noted, the case study was conducted for the purpose of portraying an example of how process data are presented and analysed, as well as alternative ways of doing this. The case study data were selected with the help of company representatives possessing process and product knowledge, for the purpose of portraying how issues may be handled at GAS. Therefore, the case study data are not believed to have a high degree of generalisability with regard to the overall data collected at GAS or in similar organisations. With regard to how GAS, and similar actors within the aerospace industry, experience problems in applying statistical process control, these data may, however, be seen as generalisable.

3.6.3 Reliability

Reliability refers to the trustworthiness or authenticity of the collected data used in a study (Guba, 1981). Since qualitative interview data may be more subjective than quantitative data, all interviews in this study were audio recorded. Each recording was played back several times in order to make sure that it was transcribed in a way that represented the meanings and essence of the interview. Saunders et al. (2007) suggest that a structured methodology is important for facilitating reliability. The methodology of this thesis, in particular the thematic analysis, may be viewed as a structured approach to analysing qualitative data for the purpose of explaining the concept of ZD in a reliable way.

4 Results and Analysis

The following chapter contains the analysis of the qualitative interview data. In order to answer the first and partially the third research question, an initial analysis is presented that seeks to identify what ZD is and how it is interpreted at GAS. The following sections present a thematic analysis where codes identified in the transcribed interview data are collated into themes related to how GAS can work towards ZD, partially addressing the second and third research questions.

4.1 Zero Defects at GKN Aerospace

In order to identify and define how ZD is viewed within the organisation at GAS, an initial analysis was carried out, addressing the first research question of the thesis. The initial analysis identified six phases that may describe ZD at GAS as a cycle. These phases are based on codes from the thematic analysis, which is elaborated further in section 4.2 Thematic Analysis. The six codes relate to how the interviewees interpret ZD within the organisation at GAS. According to the interviewees, ZD is:

• a vision
• requiring a cultural change
• a foundation for quality work
• working with risks and quality in early phases
• working with SPC
• being able to show Cpk = 2.0

Apart from the fact that they all relate to ZD, these six points are quite different, although some are easier to relate to each other than others. Working with SPC can, for example, easily be connected to being able to show a Cpk of 2.0, since capability studies are a traditional concept within SPC. Working with SPC may also be seen as a foundation for quality work, and working with quality and risk in early phases may very well be viewed as a foundation for quality work too. A vision might inspire a cultural change, and working with risk and quality in early phases may require one. As interpreted in the thesis, ZD can be described as the cycle represented in figure 7, assuming that quality is the ability to meet or exceed customer expectations by reducing variation in both manufacturing and business processes. Each step of the cycle highlights which areas to focus on in order to create the best possible circumstances for working towards ZD. The different steps of the cycle do not have clearly defined endpoints at which the work can be considered completed and discontinued; arranged in successive order, the steps rather emphasise what needs to be considered before carrying on with the following steps. For example, the second step, cultural change, is likely to require continuous efforts and resources.

Figure 7: Zero defects described as a cycle at GAS.

4.1.1 ZD is a Vision

For GAS, ZD was introduced as a result of customers stepping up their requirements on product quality to reduce risks. In reality, 100% non-defective product is extremely difficult, if not impossible, to achieve. Therefore, ZD can be viewed as a vision, a point communicated by several interviewees. For example, Interviewee 7 described it as:

”Zero defects is some kind of vision, a good expression, but we will never have zero errors and I do not think it is profitable either. We actually use SPC, where in reality we say that a certain number of percentages will fall outside sometimes.” - Interviewee 7

Focusing a vision on an outcome such as ZD may be a good way to latch on to a future goal or ideal state for an organisation. However, Deming (1986) pointed out that focusing on outcome is an ineffective approach that should be replaced by a focus on the intent of quality throughout design and production. With that in mind, a vision focusing on ZD may still not be a bad idea for GAS. According to Magnanini et al. (2019), with multi-stage manufacturing systems producing complex aircraft engine components, GAS should consider the quality strategy together with the overall production strategy. Doing so may be viewed as a modern interpretation of keeping focus on the intent of quality in design and production, as discussed by Deming (1986). In this particular situation, quality is closely related to the safety of the product, where a defect might have catastrophic outcomes. For that reason, communicating that the organisation is striving towards zero defective products may have positive effects both internally and externally. Since deviating processes can result in defective products, Papacharalampopoulos et al. (2019) suggest that the quality of products depends on the quality and efficiency of the manufacturing processes. Wang (2013) suggests that the focus on quality should shift from product data to process data. Even though 100% conforming products may be impossible, the efforts should emphasise improvement of the process rather than the product in order to move in the desired direction. In a sense, the product measurements may be used as a means to monitor and improve the quality and efficiency of processes; by doing so, product quality is also improved.

4.1.2 ZD Requires a Cultural Change

In order to work in accordance with the vision of zero defects, GAS would have to go through a cultural change in which the focus on the intent of quality would need to shift. A cultural change as an enabler of ZD was one of the most frequently used codes: the interviewees indicated that a cultural change is necessary in over 50 of the 459 coded data extracts. Such results indeed indicate the need for a change in culture, but also that the engineers, technicians, managers, and the organisation as a whole are aware that the existing culture cannot support what the customers are requiring. In other words, the people of the organisation are fully aware that the existing organisational culture does not support employees in doing their work right. The essence of the issue is captured in a statement made by one of the interviewees.

”For our company, quality is by far and on its own our biggest challenge, really all dimensions. Financially, in terms of delivery, great place to work. There are extremely many people at this company who work with non-stimulating tasks. That’s exactly what you generate when you have lack of quality.” - Interviewee 8

The above extract describes how quality can come to affect not only the external relations and financial aspects of an organisation but also the psychological well-being of the people carrying out the work. As described by Deming (1986), without barriers to pride of workmanship, quality can be boosted as a result of individuals feeling important to the work they are doing, creating a willingness to change and improve the processes and products to achieve better quality. In many cases, an improvement may seem insignificant yet have great effects; even improvements with incremental effects are necessary to tune a manufacturing organisation towards ZD. The key to unlocking these incremental improvements is a strategy that exposes management to the struggles in production and design. In order for management to request what is wanted, they need to understand what affects reaching the desired state. Often, it is the people working with the actual process or product who have the expertise; management can therefore only attain the knowledge by asking what the expert needs in order to improve in the wanted direction (Deming, 1986). In order to develop a culture and strategy that allows identification of errors in early phases, avoiding propagation of errors along the process chain, Magnanini et al. (2019) stressed the necessity of a good understanding of the manufacturing system. A good understanding of both the organisational and technological aspects of the manufacturing system provides better reactiveness to customer requirements and market competitiveness.

”Just like for our client’s ZD project, you need a change in culture where leadership and management are the most important factors in making it work. Also knowledge, being able to ask the right questions, demand the right things from employees, and change, otherwise it will not happen.” - Interviewee 3

It is clearly indicated in both the literature and the transcribed interview data that management plays a vital role in creating a culture that can enable working towards ZD. Deming (1986) suggested that the intent of quality is very much the responsibility of management, as is removing the barriers to cross-functional work between design and manufacturing. Quality has to be built in at the design stage, and the intent of that quality also needs to be understood in production. Likewise, the design entity needs to have an understanding of how the intended quality may be produced, and thus design with that in consideration.

”It is about fundamentally changing our way of working and describing the working methods in our processes so that they are confirmed. That you are constantly working to remind yourself that this is what we have to do. After all, it is a journey of change in the culture really, which requires leadership to a large extent.” - Interviewee 10

4.1.3 ZD is a Foundation for Quality Work

For GAS, product quality is of great importance. The traditional way of securing product quality within the organisation has to a large extent been to control product dimensions using specification limits and to evaluate them on Cpk. Specifications and Cpk are well known throughout the organisation, affecting many employees in their daily work; in that sense, they are part of the culture at GAS. Such a culture seeks to increase product quality by conforming to specifications and improving process capability. Magnanini et al. (2019) suggested that in order to reach ZD, methodologies are needed that can increase product quality and process capability simultaneously rather than treating them separately. The issue here is that quality should not be focused on conformance to specifications, as argued by Deming (1986). By defining quality as conformance to specification or a sufficient value of Cpk, customer expectations may be met, but the chances of customer expectations being exceeded are very small. In extension, the approach may stifle the continuous improvement efforts within the organisation, resulting in sufficient quality but never superior quality. Specification limits are of course important to consider, and Cpk is a useful index for evaluating processes. But if quality is considered to be the ability to reduce variation to the point where customer expectations are met or exceeded, such a focus is too narrow to enable engineers, managers, and technicians to feel important to their work and have the willingness to improve. Interviewee 7 touches on the subject in the following data extract:

”Sometimes it feels like everything we do here is so special. We talk about continuous improvement, that’s something you work with every day, it’s nothing special! If you have a way of working, you are obliged to work with it and constantly improve it. That’s what’s fun! How can I make things better and smarter? That’s what’s fun and drives you forward.” - Interviewee 7

The above quote clearly suggests that improving is part of the job, which can be related to the importance of considering the quality strategy together with the overall production strategy, as suggested by Magnanini et al. (2019), as well as to the barriers to pride of workmanship discussed by Deming (1986). By developing strategies that take quality into consideration and allow individuals on every level to improve without unnecessary obstacles, the organisation can become an environment where the culture is focused on the intent of quality. Doing so will make individuals feel important to their work and create a willingness to improve. Such a culture can form a foundation for quality work where more ideas concerning improvement and quality may be unlocked, uncovering more alternatives for meeting and exceeding customer expectations.

When asked about ongoing initiatives for quality work at GAS, several interviewees described that the organisation was currently rolling out a lean initiative called the Lean Operating Model (LOM). Discussing lean in relation to ZD, the interviewees described that lean can generally be seen as a way of being as an organisation, while ZD can enable that state of being by providing the right quality. In a sense, ZD then provides a foundation for other quality work.

”My spontaneous perception is crawl, walk and run. Crawling and walking is what you do when you start with ZD, you do right from the beginning. You can then speed up and slim down costs, simplify.” - Interviewee 13

The above quote suggests that if ZD can guide the organisation in the right direction by creating a culture for improvement, further tuning is possible in order to ramp up effectiveness and efficiency within the organisation. The next quote suggests that, by making processes more stable, i.e. more predictable, a foundation for streamlining production is created.

”If you think lean should enable streamlined production, you often need to have stable processes. There, I definitely see that ZD is an enabler for being able to work on such things.” - Interviewee 14

What the quotes from these and other interviewees suggest is that, if successful, ZD can provide a culture and a way of working with quality through process improvement, so that other improvement efforts may be carried out more efficiently. If the organisation as a whole has a common interpretation of what it is to achieve, i.e. the intent of quality, and that interpretation is further defined into specific ways of working in different areas, quality may serve as the everyday goal for everyone. Much of this is related to reducing variation in production as well as in more administrative processes. In a not so distant future, emerging technologies creating an Industry 4.0 environment may further strengthen ZD as a foundation for quality work. According to Papacharalampopoulos et al. (2019), many of the modern technologies making up Industry 4.0 supply the right conditions for ZD to become more established as a foundation in industrial manufacturing; ZD is supported by technologies that provide increased digitalisation and inter-connectivity of processes and production lines. Again, it should be emphasised that the intent of quality should be the focus when utilising such technologies.

4.1.4 ZD is Working with Risk and Quality in Early Phases

Since great risks are the reality of aerospace products, safety in using these products needs to be an integral part of quality. In such an environment, a system for ZDM is necessary to prevent failures and ensure the reliability of manufacturing within the industry (Wang, 2013). As a result of strict safety requirements, quality and risk mitigation are often closely related and need to be considered in early phases. According to Wang (2013), ZD is doing things right from the very beginning. Because of the high demands on safety, doing things right from the beginning is especially important within the aerospace industry. The increasing complexity of multi-stage manufacturing systems has, according to Magnanini et al. (2019), resulted in product characteristics developing inter-dependencies among process stages. These inter-dependencies further indicate the necessity of cross-functional collaboration between process stages throughout the manufacturing system. Also, since each component is usually very expensive to produce, the impact of defects in the aerospace industry is of great importance (Arsuaga Berrueta et al., 2012). Prioritising proactive work over reactive work is therefore essential, so that defects produced in earlier process stages do not spread to the following ones. Rework and inspection are, however, always present in the industry, partly because products and components simply cannot be nonconforming, and partly because OEMs can only be sure of this by inspecting thoroughly. Nevertheless, by tirelessly focusing on identifying and eliminating risks and constantly improving quality, the non-conformities and the control instances can be reduced. The way to go about it is, again, to make the different departments, especially design and manufacturing, aware of and concerned with each other's issues. One interviewee described how the lack of communication and collaboration between design and manufacturing harms the work towards ZD.

”We have been in a situation where design has made a design foundation to manufacture around. Then they are satisfied with theirs and there is very little communication in between. It’s devastating. ZD cannot be made in the production section alone, you must have the design side with you.” - Interviewee 10

There are of course examples of situations where a project to manufacture a product has good communication between design and manufacturing. The quote below refers to the process that has to be run in order for a specification change to be made. The change case process is a complicated matter in both administrative and technical terms: changing the specifications of one component may require re-evaluation of both the component and the assembled product, and sometimes even a completely new certification of the product.

”The change case process is very much about collaboration where we present an idea, reason with the Chief Engineer of Design and the engineer in charge and get feedback, then we almost write a change case together.” - Interviewee 11

Another key part of working with risk and quality in early phases is using data. Large amounts of data are collected, covering up to thousands of measurements for each product at GAS. Such data should be used when making decisions about specification changes, new product introductions, and product and process improvements. The use of data introduces the next phase of the cycle: working with SPC.

4.1.5 ZD is Working with SPC

Much of the production at GAS is complicated and heavy on measurements and numbers. Since extensive inspection by measurement is the industry standard, a lot of measurement data are available, as well as process data from automated and sensor-equipped processes. Such data are to be considered a valuable asset, and they are, to a large part, collected because GAS needs to be able to show the OEMs product dimensions and process capabilities. As stressed by Magnanini et al. (2019), characterising controllable and uncontrollable causes of variability in the final product is a key challenge in approaching ZD. Tools and methods within SPC are generally considered suitable for controlling and reducing variability (Montgomery, 2012). Even though Arsuaga Berrueta et al. (2012) suggest that SPC methods are insufficient within the aerospace industry due to complex parts, large numbers of parameters, and short production series, SPC is applicable since variability can only be described in statistical terms (Montgomery, 2012). Of course, a lot of the data at GKN Aerospace are used to monitor process output and make adjustments when necessary, but it is indicated in numerous instances in the transcribed interviews that statistical process control is mainly focused on capability indices and conformance to specifications.

”We also have goals for something called SPC, which is the statistical process outcome for the products. There, at least 90% of all requirements on the drawing must have a tolerance that does not utilise more than 80% of the tolerance range. It is about Cpk = 1.33, not quite a hundred per cent applicable but around there. After all, it is a proactive metric, to have a robust process that does not use more than 80% of its tolerance range to ensure that we do not fall outside of tolerance.” - Interviewee 5

The above quote suggests that the operators and engineers responsible for a process are measured and controlled on the Cpk it provides. Improving the index is done by utilising tolerances, or a percentage of the tolerance range, as control limits. Indeed, it is advisable to check for out-of-specification points; if points are beyond the tolerance limits, the product is likely not accepted by the customer. But using tolerance limits for control is not a proactive metric. What would be proactive is to use statistically computed control limits, where the standard deviation and the mean of the output are used, as suggested by Mohammed et al. (2008). Of course, one has to assume that the process is in statistical control, i.e. only showing variation due to common causes and exhibiting a random pattern within its natural limits (Mohammed et al., 2008). Either way, Deming (1986) suggests that setting up numerical goals and focusing on outcome is useless, since the system will put out what it is capable of producing. If the process is under statistical control and control limits are used, the work is of a more proactive character. Output is still the metric referred to here, but the statistically computed control limits will provide scientific indications of what has happened or what is about to happen to the process. Specification limits are not action limits (Deming, 1986).
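The interviewee's rule of thumb above can be made explicit with a short calculation, a sketch assuming a centred process so that Cpk equals Cp. If the natural process spread 6σ is allowed to occupy at most 80% of the tolerance range, then

C_p = \frac{USL - LSL}{6\sigma} \ge \frac{USL - LSL}{0.8\,(USL - LSL)} = 1.25

which is close to, but not exactly, the traditional target of Cpk = 1.33; hence the reservation in the quote that the 80% rule is ”not quite a hundred per cent applicable but around there”.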

”We have very good dimensional data via QSYS. Through that I have seen a lot of examples when we got into trouble where we see that if we had only worked with our process data we would have seen it coming and could have acted on it before.” - Interviewee 10

Even though the data are available for use, they are not always utilised in a way that serves the goal of better product quality, as described by Interviewee 10 in the above quote. To a large extent, this is a result of the heavy focus on delivering to customers on time. In order to achieve more sustainable production systems on both economic and environmental grounds, ZD needs to be approached by reducing variability during production (Papacharalampopoulos et al., 2019). A big portion of the time, effort, and resources is, however, dedicated to handling defects that have already been produced, leaving few resources for proactiveness. The quote below portrays such an insight from a manager at GAS:

”Today, a lot of our Manufacturing Engineers and Quality Technicians sit and handle errors, deviations to try to get things moving. I want that gang to move from that world to instead sit and work with our SPC data for example, to see when things start happening. At least you get a little closer, even if you see it when a product is already run.” - Interviewee 10

On several occasions, the interviewees discussed the need for a culture and a foundation for quality work that allows statistical process control to be used as a way to improve processes. As indicated at several points in the transcribed data, such a foundation can be achieved by management inquiries that drive process improvement through statistical analysis. The aerospace industry is subject to some difficulties in working with SPC; according to Eleftheriadis and Myklebust (2016), small batch production implies restrictions in working with SPC. Either way, working with SPC in order to move towards ZD is in many situations the best available route to proactivity. In addition, since OEMs have operationalised the goal of ZD as Cpk = 2.0, SPC also represents a set of available tools to meet the goal. The fact that manufacturers need to show their customers a Cpk of 2.0 is discussed in the following section, which also covers the final phase of describing ZD as a cycle at GAS.

4.1.6 ZD is Being Able to Show a Cpk of 2.0

The focus on capability and conformance to specification has been adopted widely. In fact, the ZD concept as a whole is operationalised by working towards the point where the process in question can show a Cpk = 2.0.

As discovered in the literature, Cpk = 2.0 may in many regards be considered to indicate a very capable process, providing a stable stream of conforming products where not even a handful out of one million units produced will turn out defective. As indicated by the interviewees, absolutely zero defects are considered impossible. Therefore, one would probably consider what the next best thing would be: if not completely realisable in practice, what is the best that can be done in reality? Operationalising ZD through Cpk = 2.0 could therefore be considered a good industrial trade-off between quality on the one hand and economy and producibility on the other. What needs to be emphasised is how to work towards achieving a Cpk = 2.0. Starting with the intent of the quality of the component or product, managers must make certain that the intent is translated into design specifications that meet the needs on safety and quality of the end product, as suggested by Deming (1986). At the same time, they need to make sure that it is producible for the manufacturing organisation. This might not be successful or satisfactory from the start, but with proper SPC methods and statistical thinking in place, the process can be tuned to produce a true Cpk of 2.0.
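To make the scale concrete, the expected defect rate at Cpk = 2.0 can be sketched under the usual assumptions of a stable, normally distributed process. With

C_{pk} = \min\left( \frac{USL - \mu}{3\sigma},\ \frac{\mu - LSL}{3\sigma} \right) = 2.0

the nearest specification limit lies 6σ from the process mean, so the expected nonconforming fraction is at most about 2\Phi(-6) \approx 2 \times 10^{-9}, where Φ is the standard normal distribution function: on the order of two units per billion, far below a handful per million.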

Within the aerospace industry, quality is commonly defined as conformance to specification (Rolls Royce, 2016). In a broader perspective, however, quality is about reducing variation, and SPC is about increasing knowledge of a process by using tools to reduce variation, not about conforming to specifications (SAE International, 2018b). It is nevertheless customary within the industry to describe ZD in terms of capability (AESQ, 2020). That interpretation has also been adopted within the organisation at GAS, as indicated in the following quote:

”If we say ZD then we mean Cpk= 2.0. This can be discussed a lot, but I say that 1.33 is not enough. We have processes that have Cpk= 1 and even lower. But that is not enough because then we get non-conformance, meetings with suspected deviation, we lose money. Even if we capture them via measurement / visual control, the details are delayed and get longer lead times than planned. Therefore, it is fundamental if we are to reach ZD that one must have the attitude that 1.33 is not enough. Does that mean that we can stop everything that does not reach 2.0? No. In practice, it does not, because there are simply no economic preconditions.” - Interviewee 3

The above quote implies that achieving Cpk=2.0 is considered necessary for delivering components to customers; a lower Cpk will result in increased costs as components have to be reworked into conformance. The quote pictures the amount of pressure that is put on the capability index. Such pressure may have negative effects. As deliveries are prioritised, time and money are clearly related to the value of Cpk. If pushed too far, the actual quality may be affected, as described by Interviewee 1:

”I feel that we sometimes focus too much on Cp and Cpk without doing any checks. Is it really normally distributed or do we have any conscious or unconscious influence when controlling a diameter that is very tight for example? Maybe you want to control towards one side of the tolerance so as not to risk anything. In that case, Cp and Cpk falls since you are deliberately controlling towards one side of the tolerances. There I have a feeling that we throw up the index a bit flippant. We may have Cpk = 0.87 but the explanation is that we consciously control to be quite close to the maximum limit so as not to risk getting too small a diameter.” - Interviewee 1

In summary, ZD at GAS may be described as a vision that requires a cultural change, which lays a foundation for quality work by considering risks and quality in early phases, where SPC is used for reducing variation and for showing OEMs that processes are capable of manufacturing safe and flightworthy components of high quality. GAS is only one part of the aerospace value chain. Their part is to make sure that what they receive from their suppliers is suitable, i.e. the right material with low variation, in order to secure the same for their customers, who are the next link in the chain.

4.2 Thematic Analysis

The purpose of the thematic analysis was to find answers to the second and third research questions by relating empirical data from the interviews to the literature. The thematic analysis resulted in 50 codes generated from 459 extracts of data. Table 4 shows the number of data extracts from each interviewee. Each interviewee provided between 23 and 50 extracts for the analysis.

Table 4: Number of data extracts per interviewee.

Interviewee  I.1  I.2  I.3  I.4  I.5  I.6  I.7  I.8  I.9  I.10  I.11  I.12  I.13  I.14  Total
Extracts      26   30   26   24   31   30   43   50   37    40    23    26    46    27    459

The number of times each code was used differed greatly. Code 47: Need for additional/different data, used only twice, was the most sparsely used code. The most frequently used code, 10: Need for more statistical work with processes, was used for coding a total of 107 data extracts. A total of 22 codes were used for coding more than 20 data extracts each. Table 5 summarises these codes and the number of data extracts they were used for, in descending order.

Table 5: Codes used more than 20 times.

Code  Description                                                           Used in number of extracts
10    Need for more statistical work with processes                         107
24    Managers must request the right things                                 84
7     Lack of resources / prerequisites for proactive quality work           69
28    Change daily working methods                                           62
3     Knowledge transfer between projects / processes / organisations        53
9     Quality work is focused on product characteristics and Cpk             53
27    ZD requires a cultural change                                          53
16    Tolerance limits are used for controlling                              50
22    Measurability is important                                             45
21    Managers must enable measurability and process control                 40
4     Collaboration between design and manufacturing                         38
5     Quality initiatives do not get foothold                                37
42    Customer / delivery is prioritised                                     35
1     ZD is working with risks and quality in early phases                   33
8     Process health is related to product deviations                        32
48    Industrial trade-offs                                                  30
49    Persistence in managers and management                                 29
15    Cp and Cpk are used for identifying process improvements               26
13    Defects are considered product deviations                              23
25    Combining different data / systems                                     22
30    ZD (Cpk=2.0) is very difficult for older and / or complex products     22
32    Existing data needs to be used differently                             21

As evident in the interview data, the most frequently used codes are interconnected to some extent. For example, ZD was viewed as requiring a cultural change within the organisation, where daily working methods would also have to change. Another example is that in order to meet the goal of Cpk=2.0, more analysis of a statistical character would be needed. In some cases, a sound statistical approach to working with manufacturing data was not in place because of insufficient knowledge. In other cases, the specifications from the design organisation were too strict or otherwise too difficult to work with in manufacturing, resulting in Cpk=2.0 being unattainable. A third insight is that the focus on Cpk and on controlling processes against specification limits has made the work too reactive; in extension, defects were produced and detected when it was too late. Being forced to attend to these problems, proactive initiatives become unfeasible, since time and resources do not suffice for both. Lastly, as ZD is operationalised through Cpk=2.0 due to stricter customer requirements, it is understandable that managers ask their subordinates to take the necessary measures to increase the capability of manufacturing processes. Since subordinates have different expertise and knowledge of statistical process improvement, such inquiries may result in an array of different actions. Regardless of whether Cpk reaches 2.0, it is not certain that the processes are maintained under statistical control (Mohammed et al., 2008). Therefore, Cpk might drop to an even lower level than before, since a change might put the process out of control, not only reducing the value of the capability index due to increased variation but also the reliability of the index itself (Montgomery, 2012).

Figure 8: Mind map of key codes.

The developed codes and the coding frequency served as the basis for creating an initial mind map of the interrelationships between key codes, see figure 8. Some codes were used less frequently but represented strong or interesting statements and were therefore also considered to some extent when creating the mind map, in an attempt to map out the meaning and views within the entire data set (Braun & Clarke, 2006). The grey bubbles may be viewed as nodes that are especially important since they are interconnected to several areas of the phenomenon. Figure 8 depicts the complexity of the interrelationships between codes and extracts of the data set, implying that it is a complicated task to grasp how ZD should be interpreted and worked towards within the organisation.

In order to develop themes that were internally homogeneous and externally heterogeneous, the codes and data extracts were reviewed again, combining, replacing, and/or removing codes. New potential themes were also added in the process, resulting in the initial thematic map in figure 9. The map consists of three main themes represented by the grey bubbles. The main themes are built up of the different components represented by the white bubbles. One of the main themes contains two sub-themes, represented by the rectangles.

Figure 9: Initial themes.

After returning to analyse these themes, as well as the codes and data extracts, once again, the thematic map was refined further. The main change was repositioning the main theme knowledge transfer as a sub-theme within the main theme called remove barriers for improvement. In practice, inadequate knowledge transfer may be interpreted as a barrier for improvement, as suggested by Deming (1986). As the theme remove barriers for improvement is heavily dependent on managers enabling an environment where improvements are supported, knowledge transfer was considered an important aspect of it. Figure 10 depicts the final thematic map, which is discussed in the following sections.

Figure 10: Final themes.

4.2.1 Statistical Approach to Improvement Work

The theme statistical approach to improvement work may be viewed as one of the key pillars of reaching ZD. Statistical knowledge varied greatly among the interviewees, implying that daily working methods also vary. In extension, variation within the organisation increases, affecting quality negatively (Montgomery, 2012). On only a few occasions did the interviewees mention variation and special causes of variation, or normality in the distribution of characteristics and metrics. Such aspects are integral components of product quality (SAE International, 2018a) and of having a statistical approach to improvement work (Eleftheriadis & Myklebust, 2016; Gultom & Wibisono, 2019). However, what was mentioned by every interviewee, often several times, was how they work with Cpk and the specifications they are forced to comply with. As noted by some of the interviewees, basic training in statistical analysis of process data is needed to understand how these aspects affect the Cpk. Since Cpk is the main metric for evaluating processes (Rolls Royce, 2016), employees seek to increase or maintain it at a desired level by using specification/tolerance limits to control the processes. Many processes do not deliver the desired Cpk, resulting in internal non-conformities, so-called Q3s. The implication is ineffective and inefficient ways of working with process data. As stressed by Deming (1986), controlling on specifications and supervising on outcome only serves to deteriorate workforce morale and to foster reactiveness.

The sub-theme deliveries prioritised over proactiveness reflects how difficult it is to take proactive measures because of the customers' specific demands on Cpk. The demands from customers have resulted in a strong focus on controlling processes against Cpk and specifications, leaving few resources for proactive measures. The issue is expressed by Interviewee 13 in the following quote:

”If you ask me, it should be the production technician’s most important means of achieving good quality, and understanding what the person needs to work with to improve the quality. There, he should look at his skill and ability and understand what is causing the problems and use the statistics to work with the right things. It is done to a certain extent, but it is not done to a sufficient extent. The priorities out in production may not always be that type of systematic work, many times it is to fix today’s delivery rather than maybe get a faultless production.” - Interviewee 13

As suggested by Deming (1986), many of the recurring problems are likely the result of every non-conformance produced in a stable process being treated as a special cause. What needs to improve in such cases is the system itself, by reducing variation or adjusting the level, since the majority of causes of poor quality derive from the system. Since the industry standard is to rely heavily on inspection, chances are that proactive efforts are not prioritised, as the inspection instances will most likely identify non-conforming products anyway. According to SAE International (2018a), inspection will not enable a defect level of zero; rather, it fosters a culture of only finding solutions to urgent problems instead of pursuing long-term quality and customer satisfaction. Further, relying heavily on post-production inspection often results in compensation strategies that consume excess resources (Magnanini et al., 2019; SAE International, 2018a).

Naturally, it is important to prioritise deliveries. But time and resources need to be assigned not only to making improvements but also to providing training and guidance on basic statistical work with process data. Doing so is important in order to gain knowledge about processes so that variation can be reduced and a true Cpk secured. A more statistical approach to process control would result in more reliable capability figures as well as more proactiveness. In the quote below, Interviewee 6 describes how statistical tools and methods may be used insufficiently.

”We are not very good at SPC, we have almost no control over such things as trends and stuff. We have problems with the distribution of processes, that is why the warning system where you set boundaries yourself came to be, where you have to judge what is normal and what you want. It is rather ’unstatistical’ but quite practical.” - Interviewee 6

The above quote implies that it is often unclear whether a specific process is normally distributed or even in statistical control. This issue is important to handle, since Cpk indices are computed regardless within the organisation. In such cases, what is actually computed may very well be a Ppk. According to Montgomery (2012), the Ppk index provides no reliable information, since it was developed for describing processes that are not in statistical control; it serves no purpose to predict the outcome of a process that is out of statistical control, since such a process has no capability (Deming, 1986). Sall (2018), on the other hand, suggested that Cpk is a measure of how well a process could perform while Ppk is a measure of how well it is performing. Regardless, it is vital to communicate with the customers that are demanding a certain level of Cpk. In cases where Ppk is used instead of Cpk, they must be informed so that they do not receive something other than what they are requesting. In order to gain customers and market shares in modern competitive markets, ZD products are essential according to Wang (2013). As a ZD product is defined by GAS' customers as a product manufactured in a process with Cpk=2.0, that is what GAS must offer.

Another important sub-theme is measurability. It relates to statistical process control in many ways, but perhaps especially with regard to the type of data. There are enormous amounts of variable data collected, stored, and used within GAS; the main issues concern how those data are used properly and effectively. Attribute data, on the other hand, were discussed in one way or another by several interviewees. Processes involving casting, welding, or moulding are often incapable of supplying any variable data. What can be done is to collect attributes such as conforming/nonconforming, counts, and percentages. No good way of analysing these types of data seems to be in place. Further, several interviewees stated that such processes are responsible for a large share of the total number of defective products. The issue is described by Interviewee 9 in the quote below.

”We measure statistical outcome on manufacturing processes where we have variable data. In cases where we have attribute data, it is difficult because it is simply within or beyond tolerance. I’ve tried to talk to the engineers if we can measure anything other than the dimensions that are in the drawing to get variable data. If we say that the welding width must be a maximum of 10 mm then you could of course measure what it actually is, but in the measurement protocol it says ’meets requirements’.” - Interviewee 9

The above quote is one of many examples where the interviewees stressed the issues with attribute data. There are of course ways to control these using the SPC methods described by Montgomery (2012) and SAE International (2018a), but as suggested by Interviewee 9, measurements that provide variable data are generally more informative. Variable data are especially useful for the purpose of conducting capability studies; capability studies cannot be conducted for attribute data without specification limits (Montgomery, 2012). For that reason, it is hard to determine whether or not such products meet the demands on zero defects. What needs to be prioritised is planning the collection of data so that they may be used to understand processes (SAE International, 2018a). As suggested by Eleftheriadis and Myklebust (2016), a guideline for handling vital process parameters may be developed by selecting control tools that are critical to quality with respect to machine tolerance and by collecting end-user process knowledge.
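For illustration, one standard attribute chart of the kind described by Montgomery (2012) is the p-chart for the fraction nonconforming. The following is a minimal sketch, assuming batches of constant size; the function name and inputs are illustrative rather than part of any GAS system:

```python
import numpy as np

def p_chart_limits(defectives, n):
    """Sketch: centre line and 3-sigma limits for a p-chart.

    defectives -- number of nonconforming units per batch (illustrative input)
    n          -- constant batch size (assumption for this sketch)
    """
    p = np.asarray(defectives) / n      # fraction nonconforming per batch
    p_bar = p.mean()                    # centre line
    width = 3 * np.sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + width
    lcl = max(0.0, p_bar - width)       # a fraction cannot be negative
    return p_bar, ucl, lcl
```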

4.2.2 Remove Barriers for Improvement

On a broad level, much of what was inhibiting individuals and the organisation as a whole consisted of fundamental organisational barriers that can only be affected by management. The theme named remove barriers for improvement encapsulates the codes and extracts relating to such barriers. One part of the main theme is active management, involving managers requesting and asking for the right things, not least to support measurability and better process control. The quote below describes how different requests from management will result in different behaviours and thus different outcomes.

”We have statistical follow-up on almost all our products, or the opportunity to get it through the systems we have. But often the focus is on, I think, perhaps more how many deviations we have instead of pushing the question on what capability we have. I think these are two different things that result into two different behaviours. If you focus on capability, then you work with process improvements. If you focus on how many deviations you have, then that is what is important. Which can result in two different behaviour’s, I think.” - Interviewee 13

Regardless of what managers require from their subordinates, they need to be persistent. The interviews showed that, on many occasions, new directives or quality initiatives were rolled out, creating confusion among individuals. Persistence is needed to communicate and make everyone aware of what is expected in order to meet specific and common goals, which involves management tirelessly asking their subordinates what they require to do their jobs and improve what they can. The expert knowledge resides within the workforce, not the management (Deming, 1986). The quote below, provided by Interviewee 8, describes that such an approach may very well be needed but is not in place or widely adopted within the organisation.

”Why not start with a round like in all other normal contexts where you start talking to those who are working with the problem. To start with their opinions and what help they need for it to work. That’s when you build ownership, we are overlooking such simple psychological mechanisms.” - Interviewee 8

The second part of the theme relates to what the interviewees described as obstacles to doing the work right and improving. Such issues are out of the employees' hands; with persistence and engagement for the employees and for the quality of product and process, management can change this. One example of such obstacles is the administrative processes that often stifle effective work. To a large degree, the work is heavy on administration because that is the industry norm: all actors within the aerospace industry need to comply with the meticulous safety requirements that have to be in place, and administration therefore serves as an instance of control or validation. Administration is thus assumed to be helpful to some extent; too much of it might, however, stifle value creation or even result in waste within the organisation, as indicated by Interviewee 8 in the following quote.

”For years we are stuck with recurring deviations, we know exactly what we should do but because of all the systems we do not succeed. We squeeze the problem to death. We believe that there is a template, method, or technique missing for us to fix everything. At the same time, those who are actually dealing with the problem are fully aware of what we should do. But they need 8 signatures and 37 steps in different systems to go through. Then it is easier to close deviations when you arrive in the morning, it takes a minute. Getting into this journey and changing it takes months.” - Interviewee 8

Administrative work needs to serve the purpose of better quality, and that purpose has to be communicated by managers. Doing so may be a difficult task, but by focusing on the intent of quality as suggested by Deming (1986), administrative work may be re-evaluated and legitimised among managers and employees, as indicated by Interviewee 7 in the quote below.

”If you do not show that there is pressure on it then it is easy to withdraw. We always have to work with quality, it is very important. But at the same time, it is not only creating a folder. It is important to do it right, that you do not see it as an extra job but as an aid on the road. If you do right from the start and have control over things, then you should save time in the project. You should not do it just to get it done.” - Interviewee 7

The sub-theme knowledge transfer relates to the importance of not isolating departments, teams, individuals, projects, and products when carrying out the work. In order to build in quality from the beginning, cross-functionality is essential (Deming, 1986), even more so when transforming into a more modern and complex manufacturing environment. One main focus here should be better integration and collaboration between the manufacturing and design organisations. On several accounts, interviewees stressed that they cannot improve or do better work than they already do because the design is fixed and sometimes impossible to meet satisfactorily. At the same time, one can only go so far with tolerance relief; the safe and effective use of the end product is always prioritised. To enable conditions that more effectively support the work towards ZD, all elements of the theme, including organisations, individuals, projects, processes, risk, and quality, need synergy. Best practice from a project needs to be transferred in a logical and effective way throughout the organisation, so that improvements can be made even where the same problem has not been identified. At the same time, the individuals of the organisation possess great expertise. Their expertise is naturally of a varied nature, but it was indicated in the interviews that no good way is in place for transferring knowledge and applying the expertise where it might be needed. Radziwill (2018) suggests that quality depends on an organisation's ability to find and combine new sources of data as well as the effectiveness of discovering root causes. Data may very well be the knowledge among individuals and the best possible practices for carrying out different work; best practices may be shared information that helps identify root causes. Thus, it is necessary to enable and improve cross-functionality within the organisation in order to achieve better quality in both manufacturing and business processes.

”I believe that a weakness is not utilising our common knowledge enough. We have no system for gathering knowledge and best practices around methods and processes.” - Interviewee 6

It is an established view among several interviewees that many of the root causes of problems are a result of too little collaboration between the design and manufacturing organisations. Even where root causes are identified, the effectiveness of discovering them could likely increase. As indicated by interviewees, when the design-manufacturing collaboration was not working satisfactorily, it was often characterised by discussions back and forth between the organisations. Manufacturing might struggle to meet their capability target due to strict specifications. As specifications are often tightly connected to the safety of the product, changing them requires lengthy administrative cycles and sometimes re-certification of the product and manufacturing process. If manufacturing requests an easing of the specifications from design, the designing organisation might want to see that every suitable process improvement measure has been taken before going through with a change in specification. Manufacturing is then forced to go back and make what process improvements it can. Sometimes these improvements are satisfactory, but when they are not, the design organisation will have to look into opportunities for changing specifications. As described in the quote below, changing the specifications of a design is also more difficult for older products.

”Once you have done your design and started producing it, a major design change may come in that require re-certification of the engine costing many millions. Often the resources are also moved to new projects, then it can be difficult to come back and make a big change even if it would be good. Priorities within the company result in them being ignored. It is therefore very important to do right from start.” - Interviewee 13

5 SPC Study

In order to illustrate how process data can be analysed statistically at GAS, this chapter provides an SPC study in which a statistical data analysis is conducted on a set of example data. Assuming that the behaviour of the process is unknown, a phase I SPC study seeks to identify and eliminate special causes of variation. By doing so, the process may be improved to a state of statistical control, so that knowledge about what should be considered normal behaviour can be established. Phase II then seeks to monitor the process so that statistical control is maintained. First, a presentation of how the data are currently presented and analysed is provided. What follows is an approach based on applying tools and methods within statistical process control.

5.1 Current Process Control

The data used in the case study are the 50 most recent height measurements of an engine component, collected from the end of December 2019 to the beginning of May 2020. The internal database QSYS supplies double measurements that are conducted nearly simultaneously (the time between a pair of measurements may differ by a few seconds). These measurements are usually displayed within the QSYS system by plotting both measurements for every instance in a run chart in which the upper and lower tolerance limits are visualised. The red horizontal lines are the specification (tolerance) limits and the orange horizontal lines are so-called warning limits. For the height measurements, the warning limits are set so that they enclose 70% of the tolerance range, see table 6 below.

Table 6: Limits and values

Metric                           Value
Nominal                          402,33
Lower Specification Limit (LSL)  399,83
Upper Specification Limit (USL)  404,83
Upper Warning Limit (UWL)        404,08
Lower Warning Limit (LWL)        400,58
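For clarity, the warning limits in table 6 can be reproduced with a few lines of code. The following is a minimal sketch, assuming (as stated above) that the warning band covers 70% of the tolerance range, centred on the nominal value:

```python
# Sketch: reproducing the QSYS warning limits in table 6.
NOMINAL, LSL, USL = 402.33, 399.83, 404.83

tolerance_range = USL - LSL              # 5.0 mm
half_band = 0.70 * tolerance_range / 2   # 1.75 mm on each side of nominal

UWL = NOMINAL + half_band                # 404.08
LWL = NOMINAL - half_band                # 400.58
print(f"UWL = {UWL:.2f}, LWL = {LWL:.2f}")
```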

Figure 11 below represents the plot of the data generated through QSYS. As indicated by the orange circles, several points are above the upper warning limit, implying that the person responsible for monitoring the process was made aware that it was approaching the specification limit after the first few observations. Unfortunately, it is difficult to distinguish between the observations because the x-axis has no data labels. When studying these types of charts in QSYS, however, one can hover over each data point to see information such as the measurement value for each observation and the time it was measured.

Figure 11: QSYS Run Chart.

Control charts for statistical process control are often used to describe a set of observations, or data points, over time. By visualising a process measure in that way, it is possible to see how the process behaves over time (Mohammed et al., 2008; Montgomery, 2012). The data points in figure 11 are plotted in order of serial number, providing little information about how the process is performing with respect to time. Also notice that the data are plotted above the upper warning limit for ten of the observations. In eight of these ten cases, both measurement values for the observation are above the warning limit; for the remaining two, the lower measure is below the limit. Using the chart in figure 11, it is difficult to draw any conclusions about what might happen in the future or to make any recommendations for remedial action. The first warning at observation 5 may suggest that the observations are approaching the tolerance limit. For observations 6-9, all measurements are within the warning limits, only to go above them again at observation 10. Another thing to consider in figure 11 is that the data are not centred between the tolerances. No data point is close to the lower specification limit and only two measurements are located below the nominal value. The variation of the data set appears quite random, which could indicate that no special causes are present. However, since the majority of the observations are plotted between the nominal value and the USL, the process should not be considered capable. Table 7 presents statistics from QSYS for the data set. Cp is fairly high, which confirms the apparent low variability among observations. Cpk is, however, quite low because the data are not centred between the specification limits.

Table 7: QSYS Data Statistics

Statistic  Value
Min        402,152
Max        404,793
Average    403,613
Sigma      0,478
Cp         1,745
Cpk        0,850
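The Cp and Cpk values in table 7 can be reproduced, up to rounding of sigma, with the standard capability formulas. The sketch below assumes that QSYS uses these textbook definitions:

```python
# Sketch: textbook capability formulas applied to the QSYS statistics
# in table 7 (sigma as reported by QSYS, rounded to 0.478).
LSL, USL = 399.83, 404.83
mean, sigma = 403.613, 0.478

Cp = (USL - LSL) / (6 * sigma)              # ~1.74
Cpk = min((USL - mean) / (3 * sigma),
          (mean - LSL) / (3 * sigma))       # ~0.85
print(f"Cp = {Cp:.3f}, Cpk = {Cpk:.3f}")
```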

5.2 Distribution Fitting

When conducting statistical analysis on process data, it is advisable to start by fitting a distribution to the data set, since control charts are often constructed under the assumption that the data come from a normal distribution. A distribution fitting of the data was therefore conducted. Figure 12 represents a histogram of the measurement data, where the red curve represents a normal distribution. Judging by the histogram, the data might come from a normal distribution. Figure 13 represents a quantile-quantile plot, which plots the sample quantiles against the quantiles of a normal distribution. If the two sets of quantiles come from the same distribution, the points should roughly form a straight line that follows the blue normal distribution line in the plot.

Figure 12: Histogram for measurements. Figure 13: Q-Q plot for measurements.

A Shapiro-Wilk test for normality resulted in a P-value equal to 0,51386. A P-value from a Shapiro-Wilk test that is greater than or equal to 0,05 indicates that the idea that the data come from a normal distribution should not be rejected. A Kolmogorov-Smirnov test was also conducted to see if the measurements could be adequately modelled by a normal distribution. This test likewise suggests that we should not reject the idea that the data come from a normal distribution, since the Kolmogorov-Smirnov P-value of 0,954084 is greater than 0,05.
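A sketch of how these normality checks could be scripted is given below; the file name is hypothetical and stands in for the raw height measurements exported from QSYS:

```python
import numpy as np
from scipy import stats

# Hypothetical export of the raw height measurements.
measurements = np.loadtxt("height_measurements.txt")

# Shapiro-Wilk test: p >= 0.05 means we do not reject normality.
sw_stat, sw_p = stats.shapiro(measurements)

# Kolmogorov-Smirnov test against a normal distribution with the
# sample mean and standard deviation as parameters.
ks_stat, ks_p = stats.kstest(
    measurements, "norm",
    args=(measurements.mean(), measurements.std(ddof=1)))

print(f"Shapiro-Wilk p = {sw_p:.5f}, Kolmogorov-Smirnov p = {ks_p:.5f}")
```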

In conclusion, the P-values and plots indicate that we should not reject the idea that the measurement data come from a normal distribution. If the measurement data had not appeared to come from a normal distribution, it might have been necessary to transform the data; there are also specific types of control charts that do not depend on a normal distribution. For these data, we may assume that Shewhart control charts are reliable enough for analysis. For the purpose of providing a way of analysing the data that is visual and easy to interpret, averages are computed for each pair of measurements. An alternative way of visualising and applying SPC to the data set is then to plot these averages in a run chart. Averages were computed and used instead of the measurement pairs because they are thought to be more easily interpreted visually. Although it is customary to use rational sub-groups when constructing control charts, each measurement pair should not be considered a sub-group but rather two measurements of the same observation. Since the time between observations varies, creating sub-groups of averages may also be misleading.

A distribution fitting of the averages resulted in a mean of 403,614 and a standard deviation equal to 0,332248. The standard deviation was estimated from the mean moving range. The moving range is simply the absolute difference between two successive observations, as shown in equation 5. To compute the standard deviation, the average of these moving ranges was divided by the tabulated constant d2=1,128 (for n=2).

$MR_i = |x_i - x_{i-1}|$ (5)
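The estimate can be reproduced as in the following sketch, where the file name is hypothetical and `averages` is assumed to hold the 50 pairwise averages in time order:

```python
import numpy as np

averages = np.loadtxt("height_averages.txt")  # hypothetical export

moving_range = np.abs(np.diff(averages))  # |x_i - x_{i-1}|, equation 5
mr_bar = moving_range.mean()              # ~0.3748 for these data
d2 = 1.128                                # tabulated constant for n = 2
sigma_hat = mr_bar / d2                   # ~0.3322, as reported above
```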

The Shapiro-Wilk test for the averages resulted in a P-value equal to 0,0229592, indicating that the averages may not come from a normal distribution. However, the Kolmogorov-Smirnov test provided a P-value of 0,265382; based on that P-value, the idea that the averages come from a normal distribution should not be rejected. In the histogram for the averages (figure 14), the bars do not follow the red normal distribution curve very well. In the quantile-quantile plot of the averages in figure 15, the points seem to follow the normal distribution line quite well.

Figure 14: Histogram for averages. Figure 15: Q-Q plot for averages.

These types of charts should not be used on their own, since interpreting them is a subjective matter; the goodness-of-fit tests and the charts should therefore be used together. In combination, rejecting or accepting the idea that the computed averages come from a normal distribution is difficult, as the P-values and the charts provide differing indications. If the idea that the averages come from a normal distribution is not rejected, control charts can be constructed.

5.3 Control Charts and Analysis

The chart in figure 16 below represents a Shewhart control chart for the individual averages. The red horizontal lines are 3-sigma control limits, computed by adding three standard deviations to, and subtracting them from, the mean, as shown in equations 6 and 8, where $\bar{x}$ is the mean of the averages. The yellow and green horizontal lines represent the 2-sigma and 1-sigma limits, respectively. The values for these limits can be found in table 8, where UCL is the Upper Control Limit and LCL the Lower Control Limit for each multiple of sigma. The mean of the individual averages, computed through equation 7, is represented by the dotted red centre line (CL). In the chart in figure 16, data are plotted from left to right based on when the measurements were conducted. That way, the measurement process may be visualised in a time-oriented fashion.

$UCL = \bar{x} + 3\frac{\overline{MR}}{d_2}$ (6)

$CL = \bar{x}$ (7)

$LCL = \bar{x} - 3\frac{\overline{MR}}{d_2}$ (8)

Table 8: Limits for Individuals Shewhart Control Charts of Averages

Sigma    UCL     CL      LCL
1 sigma  403,95  403,61  403,28
2 sigma  404,28  403,61  402,95
3 sigma  404,61  403,61  402,62
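The limits in table 8 follow directly from the mean of the averages and the moving-range estimate of sigma, as in this minimal sketch:

```python
# Sketch: the 1-, 2- and 3-sigma limits in table 8.
x_bar, sigma_hat = 403.614, 0.332248

for k in (1, 2, 3):
    ucl = x_bar + k * sigma_hat
    lcl = x_bar - k * sigma_hat
    print(f"{k}-sigma: UCL = {ucl:.2f}, CL = {x_bar:.2f}, LCL = {lcl:.2f}")
```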

Figure 16: Individuals Shewhart control chart for averages.

Using these types of charts at this stage should be viewed as a phase I study. At this point, it is not known what can be considered normal process behaviour. In a phase I study, control charts are used to find assignable causes of variation through out-of-control signals and other alarms. By doing so, one can gradually develop an idea of what should be considered normal and adapt the control limits accordingly. Once the process is considered to be in a state of statistical control, phase II SPC can be employed, where control charts with properly adjusted control limits are used to monitor the process.

Based on the plotted data in figure 16, there are eight points indicating that the process is not in statistical control. First, observations 13, 34, and 36 are plotted beyond the control limits, indicating that these observations differ too much from the process mean. Second, observations 9, 37, 38, 39, and 40 are marked with a red asterisk because they represent unusual patterns, even though these points are within the 3-sigma control limits. Observation 9 indicates unusual behaviour because, by the ninth observation, a run of eight or more points has been plotted on the same side of the CL. Observation 36 is not only beyond the lower control limit; it also represents the point where two out of three consecutive points are plotted beyond the 2-sigma limit, as is the case for observation 37. In addition, observations 37-40 are each marked with a red asterisk because four out of five consecutive observations are beyond the lower 1-sigma limit, which should not be considered random behaviour.
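For illustration, the runs tests described above (a subset of the Western Electric-style rules) could be scripted as in the following sketch; the function name and structure are illustrative, not part of QSYS:

```python
import numpy as np

def runs_test_alarms(x, cl, sigma):
    """Sketch: flag the rule violations discussed above.

    x     -- plotted values (e.g. the pairwise averages), in time order
    cl    -- centre line (process mean)
    sigma -- estimated standard deviation
    """
    z = (x - cl) / sigma  # distance from centre line in sigma units
    alarms = []
    for i in range(len(z)):
        if abs(z[i]) > 3:
            alarms.append((i + 1, "beyond 3-sigma limit"))
        # Run of eight points on the same side of the centre line.
        if i >= 7 and ((z[i-7:i+1] > 0).all() or (z[i-7:i+1] < 0).all()):
            alarms.append((i + 1, "8 points on one side of CL"))
        # Two of three consecutive points beyond 2-sigma (same side).
        if i >= 2 and max(np.sum(z[i-2:i+1] > 2),
                          np.sum(z[i-2:i+1] < -2)) >= 2:
            alarms.append((i + 1, "2 of 3 beyond 2-sigma"))
        # Four of five consecutive points beyond 1-sigma (same side).
        if i >= 4 and max(np.sum(z[i-4:i+1] > 1),
                          np.sum(z[i-4:i+1] < -1)) >= 4:
            alarms.append((i + 1, "4 of 5 beyond 1-sigma"))
    return alarms
```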

In order to study variation and determine whether the process may be in a state of statistical control, it is also customary to study the moving range chart when using Shewhart control charts for individuals. The control limits and centre line for the moving range chart are calculated as shown in equations 9, 10, and 11, where the tabulated constants D3=0 and D4=3,267 have been used. The values for these limits can be found in table 9.

$UCL_{MR(2)} = D_4\overline{MR}$ (9)

$CL_{MR(2)} = \overline{MR}$ (10)

$LCL_{MR(2)} = D_3\overline{MR}$ (11)

Table 9: Limits for MR(2) Chart

Limit                      Value
Upper Control Limit (UCL)  1,2245
Centre Line                0,374776
Lower Control Limit (LCL)  0,0
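The limits in table 9 can be reproduced as follows:

```python
# Sketch: the MR(2) chart limits in table 9, using the tabulated
# constants D3 = 0 and D4 = 3.267 for n = 2.
mr_bar = 0.374776
D3, D4 = 0.0, 3.267

UCL_MR = D4 * mr_bar   # ~1.2245
CL_MR = mr_bar         # 0.374776
LCL_MR = D3 * mr_bar   # 0.0
```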

Figure 17: Moving Range Chart for Individuals.

In the above MR(2) chart, the difference, or range, between each successive average is plotted. Consequently, there is no data point for the first observation, since the first available measure is the difference between the first and second observations. Since the differences calculated through equation 5 cannot be negative, it is customary to set LCL equal to 0. The chart in figure 17 is useful for analysing the variation between observations. Points plotted beyond the UCL indicate that the difference between two successive data points is too large. Points plotted on the LCL indicate that the difference between two successive points is very small. In the chart, four points are located on the LCL, indicating that the two successive points are of equal magnitude. Such points should not be interpreted as issues by themselves, since low variation is generally considered positive. If points present an unusual pattern or trend, however, they may indicate that the process is not operating in a state of statistical control. Observation 35 is marked with a red asterisk in the chart, implying that the difference between the average measurements of observations 34 and 35 is statistically significant. The point may thus indicate that a special cause was present around that time.

5.4 Capability Study

Based on the individuals Shewhart control chart for the averages and the moving range chart, the process does not seem to be in a state of statistical control. According to Montgomery (2012), conducting a capability study for the process is therefore meaningless; Deming (1986) likewise suggested that a process that is not in statistical control has no capability. An alternative may be to refer to the performance of the process instead, by using a long-term standard deviation and computing Pp/Ppk as described by Sall (2018). One should, however, use all these indices with great care, as highlighted by Montgomery (2012), who suggests that Ppk essentially provides no reliable information.

Table 10: Statistics for Individuals Shewhart control chart

Statistic   Short-Term Capability   Long-Term Performance
Est. sigma  0,332248                0,431111
Cp/Pp       Cp = 2,50817            Pp = 1,93299
Cpk/Ppk     Cpk = 1,22011           Ppk = 0,940316
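The indices in table 10 differ only in which sigma estimate is inserted into the same formulas, as the following sketch illustrates:

```python
# Sketch: short-term (capability) vs long-term (performance) indices.
LSL, USL = 399.83, 404.83
mean = 403.614
sigma_short = 0.332248   # moving-range estimate
sigma_long = 0.431111    # overall sample standard deviation

def indices(sigma):
    cp = (USL - LSL) / (6 * sigma)
    cpk = min((USL - mean) / (3 * sigma), (mean - LSL) / (3 * sigma))
    return cp, cpk

Cp, Cpk = indices(sigma_short)   # ~2.508, ~1.220
Pp, Ppk = indices(sigma_long)    # ~1.933, ~0.940
```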

There are no clear trends apparent in the two charts. Out-of-control signals and alarms from the runs tests, however, suggest that the process is not in statistical control. Between observations 15 and 33 in figure 16, the process shows little variation about the mean; any improvement efforts may investigate that period further to see what might have produced that outcome. The main conclusion from the analysis is that poor centring is keeping the process from a state of statistical control. It is therefore advisable to investigate ways of shifting, and then maintaining, the mean of the process closer to the nominal value.

6 Findings and Recommendations

The aim of the master's thesis was to investigate how methods for statistical process control (SPC) can support zero defects within manufacturing processes at GKN Aerospace Trollhättan. Based on the aim and the research questions, the following chapter provides specific recommendations for GKN Aerospace, recommendations for the aerospace industry in general, as well as recommendations for future research.

RQ1: How is ZD interpreted at GKN Aerospace Sweden?

Based on the literature overview and the analysis of qualitative data, the first research question can be answered by referring to figure 7, where ZD is described as a cycle at GAS. The cycle starts by accepting that, in its full extent, ZD is very difficult to achieve in reality. By viewing ZD as a vision that seeks to reach the goal of completely conforming products, a mindset is established within the organisation that is needed for the next phase of the cycle. That phase depends on a mindset that, although it might be impossible to reach, ZD is about constantly improving by reducing sources of variation in processes. In doing so, the phase relates to creating a culture where the focus is on the intent of quality. Such a culture is heavily dependent on managers requesting the right things by asking experts within different areas what they need in order to support better quality and improvement. In that way, the next phase relies on the culture to serve as a foundation for quality work. Quality work relates to both manufacturing processes and the products produced, but also to the administrative processes required for supporting the business. By developing such a foundation, with a constant focus on the intent of quality, better prerequisites for working with risk and quality in early phases are enabled, which introduces the next phase of the cycle. Integrating the concern for quality and risk is especially important within the aerospace industry, so that actors supply customers with products and solutions that enable safe air travel. Since ZD is communicated and operationalised through a fixed value of the capability index Cpk, SPC offers suitable methodology and a set of tools for improving quality by reducing process variation and thus enabling increased capability. By utilising tools and methods for SPC, better analysis and control of manufacturing processes can be practised. In turn, GAS will become more efficient and effective in providing a true Cpk=2.0, as requested by their customers.

RQ2: How can tools for SPC support the work towards ZD?

Considering that ZD is translated into Cpk=2.0 within the aerospace industry, the SPC study provided in the thesis illustrates an example of how tools for SPC may support the work towards ZD. As noted, providing grounds for conducting capability analysis requires data that come from a normal distribution as well as a process that is in a state of statistical control. In order to enable the best possible circumstances for deriving valuable insights from the data, it is also important to plan the collection of data so that it may be used to understand the specific process. A process that is under statistical control may be achieved by following the steps of the phase I SPC study provided in the thesis, summarised in the list below (see also the sketch after the list). Computing and using statistical control limits in control charts based on a sufficient amount of data is recommended. Out-of-control points and trend patterns may then provide statistically reliable information that indicates whether or not a given process is in a state of statistical control. Once statistical control is established, a phase II study may be initiated where process data are monitored over time. It is also during phase II that capability studies may be conducted that result in a true Cpk. To provide increasing process knowledge and better process control, it is recommended to shift focus from Cpk and specifications onto the data and statistical work. By doing so, Cpk will improve as well.

• Data collection – Collection of data needs to be planned so that it can be used to understand the process. Characteristics whose variability affects product form, fit, and function should be selected.

• Distribution Fitting – Check for normality through distribution fitting with histograms, Q-Q plots, half-normal plots or similar. Refer to tests such as Anderson-Darling, Shapiro-Wilk, or Kolmogorov-Smirnov. If it is not likely that the data come from a normal distribution, transform the data and control limits or use other control charts such as EWMA.

• Construct Control Limits – Use three standard deviations as a standard, modify if suitable. It is also often useful to compute 1-sigma and 2-sigma limits for the purpose of visualising and interpreting the control charts.

• Control Charts – Always plot the data points as they are collected with respect to time. If the data show obvious or unusual trend patterns, consider rational sub-grouping if possible.

• Analysis – If points are plotted beyond the control limits or violate additional rules, investigate any apparent special causes and remove them. Re-compute the control limits using the updated standard deviation.

• Phase II – Once there are no special causes of variation nor out-of-control points, the process may be considered to be in a state of statistical control. Phase II monitoring of the process may then be initiated, and capability studies conducted to compute Cp and Cpk.
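As a minimal sketch of how the listed steps could be combined in practice (the function and threshold choices are illustrative, not a prescribed implementation):

```python
import numpy as np
from scipy import stats

def phase_one_study(x, d2=1.128):
    """Sketch of a phase I pass over time-ordered measurements x."""
    # Distribution fitting / normality check (Shapiro-Wilk).
    _, p_value = stats.shapiro(x)
    if p_value < 0.05:
        print("Normality doubtful; consider transformation or an EWMA chart.")

    # Control limits from the moving-range estimate of sigma.
    sigma_hat = np.abs(np.diff(x)).mean() / d2
    cl = x.mean()
    ucl, lcl = cl + 3 * sigma_hat, cl - 3 * sigma_hat

    # Flag out-of-control points for special-cause investigation;
    # after removal, the limits would be re-computed.
    out_of_control = np.where((x > ucl) | (x < lcl))[0] + 1
    return cl, ucl, lcl, out_of_control
```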

RQ3: How should tools and methods for SPC be used to approach ZD in complex/modern manufacturing systems?

Operating in an Industry 4.0 environment characterised by complex systems, GAS may find the results from the thematic analysis and the SPC study useful in approaching ZD. The SPC study provides practical guidelines on how to consider fundamental aspects of SPC when presenting and analysing process data. These guidelines may be used to develop a more standardised approach to working with SPC within manufacturing processes at GAS. It is recommended to develop a guideline for suitable quality tools that enables effective approaches to structuring, validating, combining, and storing acquired data. Likewise, it is recommended to investigate further how inter-dependencies between process variables throughout different process stages may affect the quality of processes and end products. It is also advisable to investigate alternative ways of obtaining variable data from processes involving welding, moulding, and casting, which today only provide attribute data, or the possibilities of combining attribute and variable data to enable better process knowledge and control. Such investigations may be approached by integrating internal systems containing different types of data, such as QSYS, SAP/R3, and Co-Pilot.

The thematic analysis indicates important organisational factors that affect the work towards ZD at GAS. When evolving towards more modern and complex manufacturing systems, it is key to find new ways of combining the sources of data within the system in order to provide interconnected control of the system as a whole. There, it is also paramount to identify and define the purpose of new technology with the intent of quality in consideration. In order to consider the intent of quality while finding new ways of combining different systems and data to provide effective identification of root causes and insights, it is recommended that GAS develop a quality strategy that is integrated with the overall design and production strategy. By doing so, barriers to quality improvement may be removed, establishing better ways of enhancing quality more proactively. One important area in which it may be suitable to initiate such work is the knowledge transfer between the designing and manufacturing organisations.

7 Discussion

The methodology used in the thesis was considered suitable with regard to the aim and the purpose of answering the research questions. The 14 interviews were carried out with equal attention and dedication. However, interpretation being a subjective matter, chances are that the interviewer may have misinterpreted interviewees' feelings and communicated thoughts to an extent that reduces reliability. For the most part, the methodology of the thesis was carried out as planned. The outbreak of the COVID-19 pandemic resulted in the decision to conduct a majority of the interviews through conference calls instead of in person. Doing so may also have affected the interviewer's ability to understand or interpret the interviewees' feelings regarding what was discussed. The risk is, however, considered small, since the interviews were of similar duration and covered roughly the same ground and topics regardless of whether they were carried out in person or not.

The data collected may be considered valid and to some extent transferable within the aerospace industry. Since a variety of individuals within the organisation at GAS were interviewed, the thesis may be considered to provide a broad view of process and quality management. Most interviewees with long experience within the industry claimed that most of what they do at GAS is the industry norm. GAS and other actors within the aerospace industry work closely with their customer OEMs in order to provide safe end products, which is considered a result of the strict international safety regulations for aerospace products. Therefore, the findings and recommendations provided in the thesis may be generalisable and valuable for other practitioners within the aerospace industry. In addition, the findings and recommendations may be generalised to other manufacturing industries for the purpose of approaching ZD by applying tools and methods within SPC.

To a large extent, the findings of the thesis rely on qualitative data. As SPC is a structured, statistical approach to improvement work that utilises quantitative data, the qualitative study provides explanation and deeper insights into how and why tools and methods within SPC are suitable for approaching ZD. Exactly zero defects may be viewed as quite an absolute and quantitative concept; however, applying thematic analysis to the qualitative interview data showed that interpretation to be too narrow and therefore ineffective in practice. The thematic analysis provided ways of analysing qualitative interview data with a somewhat quantitative approach. The coding frequency is one example where data of a qualitative nature are quantified, at least to some extent.

As ZD is operationalised through Cpk within the industry, the thesis provides ways of working with process data through SPC so that process capability may be increased. A recommendation for future work on the topic is a closer investigation of other measures, metrics, and/or indices that may encapsulate a better interpretation of ZD. Additionally, as the findings of the thesis include the importance of 'softer' aspects in moving towards ZD, future work may involve identifying key focus areas or ways of working for establishing a corporate culture that supports ZD.

8 References

AESQ. (2020). AESQ Zero Defects. Retrieved March 18, 2020, from https://aesq.sae-itc.com/
Arsuaga Berrueta, M., Ortiz, J. & Lobato, R. (2012). Instrumentation and Control Methodology for Zero Defect Manufacturing in Boring Operations. Proceedings of the 23rd International DAAAM Symposium, Vienna, Austria, 2012: Volume 23, No. 1, ISSN 2304-1382.
Bergman, B. & Klefsjö, B. (2012). Kvalitet från behov till användning (5th ed.). Lund: Studentlitteratur AB.
Box, G. E. & Woodall, W. H. (2012). Innovation, quality engineering, and statistics. Quality Engineering, 24(1), 20–29.
Braun, V. & Clarke, V. (2006). Using Thematic Analysis in Psychology. Qualitative Research in Psychology, 3(2), 77–101.
Deborah, G. (2013). Inductive and Deductive Approaches to Research. Retrieved from https://deborahgabriel.com/2013/03/17/inductive-and-deductive-approaches-to-research/
Deming, W. E. (1986). Out of the Crisis. Cambridge, MA: MIT, Center for Advanced Engineering Study.
Dogan, O. & Gurcan, O. F. (2018). Data Perspective of Lean Six Sigma in Industry 4.0 Era: A Guide to Improve Quality. Proceedings of the International Conference on Industrial Engineering and Operations Management, Paris, France, July 26–27, 2018: pp. 943–953.
Eisenhardt, K. M. (1989). Building Theories from Case Study Research. Academy of Management Review, 14(4), 532–550.
Eleftheriadis, R. & Myklebust, O. (2016). A Guideline of Quality Steps Towards Zero Defect Manufacturing in Industry. Proceedings of the International Conference on Industrial Engineering and Operations Management, Detroit, MI, United States, September 23–25, 2016: pp. 332–340.
Ferreira, L., Putnik, G. D., Lopes, N., Garcia, W., Cruz-Cunha, M. M., Castro, H., Varela, M. L., Moura, J. M., Shah, V., Alves, C. et al. (2018). Disruptive Data Visualization Towards Zero-defects Diagnostics. Procedia CIRP, 67, 374–379.
Foidl, H. & Felderer, M. (2015). Research Challenges of Industry 4.0 for Quality Management. International Conference on Enterprise Resource Planning Systems, Munich, Germany, November 16–17, 2015: pp. 121–137.
GKN. (2020a). GKN Aerospace in Sweden. Retrieved February 17, 2020, from https://www.gknaerospace.com/en/about-gkn-aerospace/locations/gkn-aerospace-in-europe/gkn-aerospace-in-sweden/
GKN. (2020b). GKN Aerospace Pushing Boundaries of Industrialised Additive Manufacturing Through New Research Programmes. Retrieved February 17, 2020, from https://www.gknaerospace.com/en/newsroom/news-releases/2019/gkn-aerospace-pushing-boundaries-of-industrialised-additive-manufacturing-through-new-research-programmes//
Guba, E. G. (1981). Criteria for Assessing the Trustworthiness of Naturalistic Inquiries. ECTJ, 29(2), 75–91.
Gultom, G. D. P. & Wibisono, E. (2019). A Framework for the Impact of Lean Six Sigma on Supply Chain Performance in Manufacturing Companies. IOP Conference Series: Materials Science and Engineering, 528(1), 012089, Makasar, South Sulawesi, Indonesia, November 27–29, 2018. https://doi.org/10.1088/1757-899X/528/1/012089
Juran, J. M. (1999). How to Think about Quality. In J. M. Juran, A. B. Godfrey, R. E. Hoogstoel & E. G. Schilling (Eds.), Quality-Control Handbook. New York: McGraw-Hill.
Kallio, H., Pietilä, A.-M., Johnson, M. & Kangasniemi, M. (2016). Systematic Methodological Review: Developing a Framework for a Qualitative Semi-structured Interview Guide. Journal of Advanced Nursing, 72(12), 2954–2965.
Lasi, H., Fettke, P., Kemper, H.-G., Feld, T. & Hoffmann, M. (2014). Industry 4.0. Business & Information Systems Engineering, 6(4), 239–242. https://doi.org/10.1007/s12599-014-0334-4
Longhurst, R. (2003). Semi-structured Interviews and Focus Groups. Key Methods in Geography, 3(2), 143–156.
Magnanini, M. C., Eger, F., Reiff, C., Colledani, M. & Verl, A. (2019). A Control Model for Downstream Compensation Strategy in Multi-stage Manufacturing Systems of Complex Parts. IFAC-PapersOnLine, 52(13), 1473–1478.
Martens, P. (2011). The Quest for Excellence: Exceptional Performance Using Real Process Thinking. Chicago, IL: FPA Publishing.
Mohammed, M. A., Worthington, P. & Woodall, W. H. (2008). Plotting Basic Control Charts: Tutorial Notes for Healthcare Practitioners. BMJ Quality & Safety, 17(2), 137–145.
Montgomery, D. C. (2012). Statistical Quality Control: A Modern Introduction (7th ed.). Hoboken, NJ: John Wiley & Sons, Inc.
Papacharalampopoulos, A., Petrides, D. & Stavropoulos, P. (2019). A Defect Tracking Tool Framework for Multi-process Products. Procedia CIRP, 79, 523–527.
Pearn, W. & Chen, K. (1999). Making Decisions in Assessing Process Capability Index Cpk. Quality and Reliability Engineering International, 15(4), 321–326.
Radziwill, N. M. (2018). Quality 4.0: Let's Get Digital - The Many Ways the Fourth Industrial Revolution is Reshaping the Way We Think About Quality. Quality Progress.
Riessman, C. K. (1993). Narrative Analysis (Vol. 30). Sage.
Rolls Royce. (2016). Working Together to Deliver a Competitive Supply Chain: Drive For Zero Defects. Retrieved March 18, 2020, from https://suppliers.rolls-royce.com/GSPWeb/ShowProperty?nodePath=/BEA%20Repository/Global%20Supplier%20Portal/Section%20DocLink%20Lists/Drive%20for%20Zero%20Defects/Main/Column%201/Section%201/Documents/Zero%20Defects%20Initial%20Supplier%20Engagement%20Material//file
Rußmann, M., Lorenz, M., Gerbert, P., Waldner, M., Justus, J., Engel, P. & Harnisch, M. (2015). Industry 4.0: The Future of Productivity and Growth in Manufacturing Industries. Boston Consulting Group, 9(1), 54–89.
SAE International. (2018a). Aerospace Standard on Process Control Methods (AS13006). Retrieved from https://www.sae.org/publications/collections/content/asquality/
SAE International. (2018b). Process Control Methods (AS13006): Appendix D - Guidance Materials. Retrieved from https://www.sae.org/publications/collections/content/asquality/
Sall, J. (2018). Scaling-up Process Characterization. Quality Engineering, 30(1), 62–78. https://doi.org/10.1080/08982112.2017.1361539
Saunders, M., Lewis, P. & Thornhill, A. (2007). Research Methods for Business Students (4th ed.). Harlow, United Kingdom: Pearson Education Limited.
Shewhart, W. A. (1931). Economic Control of Quality of Manufactured Product. New York, NY: D. Van Nostrand Company, Inc.
Tatipala, S., Wall, J., Johansson, C. M. & Sigvant, M. (2018). Data-driven Modelling in the Era of Industry 4.0: A Case Study of Friction Modelling in Sheet Metal Forming Simulations. Journal of Physics: Proceedings of the International Conference and Workshop on Numerical Simulation of 3D Sheet Metal Forming Processes, 1063(1), 012135, Tokyo, Japan, 30 July–3 August, 2018. https://doi.org/10.1088/1742-6596/1063/1/012135
Wang, K.-S. (2013). Towards Zero-defect Manufacturing (ZDM): A Data Mining Approach. Advances in Manufacturing, 1(1), 62–74. https://doi.org/10.1007/s40436-013-0010-9

Appendix A Interview Guide

Purpose/background

I am writing a master's thesis on how statistical process control can help manufacturing companies reach zero defects. I have understood that an initiative was recently launched because customers have imposed stricter requirements, moving from Cpk 1.33 to 2.0.

With this interview, I want to go a little deeper to understand how process control and zero defects are viewed, and what is done based on these views in relation to different processes.

(Ask whether it is OK to record the interview for transcription. The recording will not be reused for anything other than transcription and analysis. All answers and persons will be anonymous.)

General

• What is your name?
• What is your role/position?
• How long have you worked at the company?
• How long in your current role?

Zero Defects

• What is zero defects? To you?
• How are you affected?
• How do you work to measure goals?
• ZD vs. LOM, what is the difference?
• Other initiatives, Road to Zero/robustness, Lean?
• Do the initiatives gain a foothold?
• Do different initiatives compete with each other?
• Are the prerequisites in place for working towards goals such as ZD?
• How do you ensure better products/processes?
• How does design take in the needs from production when we see that we need to correct (nominal) dimensions, tolerances, or the design solution?
• Who is responsible for controlling processes / who should work with process control?
• How are processes/(products) doing today?
• How do you know?

• How do you measure it?
• Can you determine/analyse how processes are doing by looking at data? QSYS, Co-Pilot, SAP R3?
• Does this differ depending on whether it concerns final dimensions (QSYS), machine data (Co-Pilot), or operator data (SAP R3)?
• Is there a difference in the value or the reliability of these data?
• Do you consider the existing data to be sufficient?
• What data do you lack?
• How do you work with continuous improvement?

Defects

• What is a defect?
• A defect in a product/process?
• How do you measure/capture defects? In a product/process?

SPC

• Do you use QSYS (or other systems) to evaluate process outcomes?
• How should one think about Cp and Cpk (Cpk = 1.33–2.0; see the note after this list)? Should one settle for 1.33?
• How should one think about the tolerance/requirement limits?
• Can these be used to identify improvement actions for a process?
• Can one start from Cpk = 1.33–2.0 to identify improvement actions for a process?
• Is the voice of the process (VOP) discussed when a process is run in?
  – Control limits
  – Statistical equilibrium
  – How are assignable (special) causes of variation handled?
  – How do you ensure that the standard deviation describes the natural variation of a process (common causes)?
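Note: for reference, the capability indices mentioned in the list above follow the standard textbook definitions (see, e.g., Montgomery, 2012). With $USL$ and $LSL$ denoting the upper and lower specification limits, $\mu$ the process mean, and $\sigma$ the process standard deviation:

$$C_p = \frac{USL - LSL}{6\sigma}, \qquad C_{pk} = \min\left(\frac{USL - \mu}{3\sigma},\ \frac{\mu - LSL}{3\sigma}\right)$$

A centred process with a specification width of $12\sigma$ thus gives $C_p = C_{pk} = 2.0$, the level through which zero defects is operationalised, while a width of $8\sigma$ gives the traditional target of $C_{pk} = 1.33$.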

What is your view on the following statements? (From the standard AS13006)

“SPC is not suitable for low-volume production”

• Is it easy/difficult to measure and control given the fact that most products/components are manufactured in rather low volumes?

“SPC is only suitable for simple products”

• Since it often concerns complex products/components, does that entail complexity in measuring/controlling?

“We already inspect everything we make”

• Is it easy/difficult to measure and control given the fact that most products/components are manufactured in rather low volumes?

Industry 4.0

• How do you think ZD affects the implementation of Industry 4.0?
• Regarding I4.0 and ZD, does succeeding with one of the concepts presuppose/require having “reached” the other?
• Are there difficulties in knowing what needs to be measured?
• How do managers support/enable increased measurability/the ability to control processes?

Appendix B Complete List of Codes

Code number | Code | Used in number of extracts
1 | ZD is working with risks and quality in early phases | 33
2 | ZD is working with SPC | 10
3 | Knowledge transfer between individuals/processes/organisations | 53
4 | Collaboration between design and manufacturing | 38
5 | Quality initiatives do not gain a foothold | 37
6 | Quality initiatives can compete against each other | 20
7 | No resources/prerequisites for proactive quality work | 69
8 | Process health is related to product deviations | 32
9 | Quality work is heavily focused on product characteristics and Cpk | 52
10 | There is a need to become better at working statistically with processes | 107
11 | Data reliability is considered good | 8
12 | Existing data is enough for now | 12
13 | Defects are considered product deviations | 23
14 | Manufacturing Engineers (MEs) evaluate process outcome with QSYS | 18
15 | Cp and Cpk are used for identifying process improvements | 26
16 | Tolerance (specification) limits are used for controlling | 50
17 | It varies whether special causes are part of the standard deviation | 16
18 | Low-volume production requires adaptation of SPC methods | 18
19 | Complexity of products/processes is not an obstacle for process control | 14
20 | ZD and I4.0 support each other | 14
21 | Managers must enable increased measurability and process control | 40
22 | Measurability is important | 45
23 | ZD is being able to show Cpk = 2.0 | 19
24 | Managers must request the right things | 84
25 | Combining different data/systems | 22
26 | Defects can be related to processes | 15
27 | ZD requires a cultural change | 53
28 | Change daily working methods | 62
29 | Administrative work stifles value creation | 19
30 | ZD (Cpk = 2.0) is very difficult for older and/or complex products | 22
31 | QSYS is used for analysis and identification of improvements | 11
32 | Existing data needs to be used differently | 21
33 | ZD lays the foundation for quality work | 20
34 | I4.0 supports quality work | 10
35 | ZD is a vision | 8
36 | Ratios/KPIs | 18
37 | Several parties should work with process control | 14
38 | Inspection | 14
39 | Continuous improvement | 8
40 | In some cases one can settle for a lower Cpk | 16
41 | ZD (Cpk = 2.0) is not fully possible or profitable | 20
42 | Customers/deliveries are prioritised | 35
43 | Normal distribution is considered | 7
44 | Attribute data is difficult to measure/analyse | 13
45 | Keep it simple | 19
46 | Process knowledge replaces standardised quality work | 14
47 | Need for additional/different data | 2
48 | Industrial adaptation/balancing | 30
49 | Persistence in managers and management | 29
50 | Human factors complicate | 7
