Mälardalen University School of Innovation Design and Technology Västerås, Sweden

Course Code: DVA503 Master Thesis in Intelligent Embedded Systems

TEST PROCESS ASSESSMENT OF INDUSTRIAL CONTROL SYSTEMS VIA SAFETY STANDARDS

Ladan Pourvatan [email protected]

Examiner: Thomas Nolte Mälardalen University, Västerås, Sweden

Supervisors: Wasif Afzal, Eduard Paul Enoiu Mälardalen University, Västerås, Sweden

6th June 2021

Abstract

Context: As more systems become embedded and hardware-based, challenges arise regarding software safety and the considerable consequences of their failure. Various safety standards assure certain safety aspects of systems, addressing areas including testing. The safety standards chosen for this thesis are ISO/IEC/IEEE 29119-2 & 3, IEC 61508-1 & 3, ISO 13849-1 & 2, and ISO/IEC/IEEE 12207:2017. Objective: This thesis tackles the problem of compliance with safety standards by utilising a lightweight assessment method, leading to recommendations for improving the test process of an industrial control system. Method: A case study is performed at an automation company to achieve the objectives of this thesis. The method used for the qualitative data analysis results in recommendations regarding the compliance of the company’s test process with the selected safety standards. As the final step, a focus group provides an industrial evaluation of the recommendations and assessment results. Results: The company’s development process fully complies with 22% and fails to comply with 58% of the requirements extracted from the selected safety standards. Furthermore, the thesis results in recommendations for improving the test process of an industrial control system. As a result of performing the case study, a method for a lightweight assessment of the development process of industrial control systems is obtained. The generic method follows five steps: the data is first tabulated to obtain assessment criteria and items, which the assessment step uses to derive a compliance degree per requirement; the analysis step then sheds light on areas of strength and weakness, leading to recommendations; the final step evaluates and refines the recommendations according to the results of a focus group. Conclusion: Further development of the method used in this thesis can lead to a generic method for assessing development processes against safety standards using limited resources. The results of this generic method can lead to recommendations for test process improvements of control systems via safety standards.


Acknowledgements

I would like to thank my supervisors Eduard Paul Enoiu, Wasif Afzal, and my industry supervisor for their advice throughout the process of writing this thesis. They never failed to answer my endless questions and kept me going despite the challenges. This thesis would have been impossible without the help of all three supervisors, and I am greatly appreciative of them all for their time and contribution.


Contents

1 Introduction
2 Background
   2.1 V-Model Process
   2.2 Safety Standards
      2.2.1 IEC 61508 Standard
      2.2.2 ISO/IEC/IEEE 12207 Standard
      2.2.3 ISO/IEC 29119 Standard
      2.2.4 ISO 13849 Standard
      2.2.5 ISO/IEC 33063 Standard
3 Related Works
4 Problem formulation
   4.1 Scope Definition
   4.2 Limitations
5 Method
6 Ethical and Societal Considerations
7 Case Study Design
   7.1 Data Collection
      7.1.1 Document Review
      7.1.2 Interview
   7.2 Data Analysis
      7.2.1 Tabulating the Data
      7.2.2 Assessment
      7.2.3 Analysis
      7.2.4 Recommendation
      7.2.5 Industrial Evaluation
   7.3 Threats to Validity
8 Results
   8.1 Tabulation of Data
   8.2 Assessment
   8.3 Analysis
      8.3.1 Use Cases
      8.3.2 Safety Standards
      8.3.3 V-Model Categorisation
      8.3.4 Keyword Classification
      8.3.5 Analysis Resolutions
   8.4 Recommendations
      8.4.1 Test Plan
      8.4.2 Test Strategy
      8.4.3 Test Status Report
      8.4.4 Test Completion
      8.4.5 Test Design Specifications
      8.4.6 Test Environment Set-up and Monitoring
      8.4.7 Test Execution
   8.5 Industrial Evaluation
      8.5.1 Refined Recommendations
9 Discussions
10 Future Works
11 Conclusions
References
Appendices
   A. Appendix A - Safety Standard Requirement Extraction
   B. Appendix B - Interview Questions
   C. Appendix C - Data Sources and Types
   D. Appendix D - Condensed Assessment Results
   E. Appendix E - Focus Group Procedure
   F. Appendix F - Recommendations
   G. Appendix G - Refined Recommendations


1 Introduction

In the modern-day world, many systems depend on the correct operation of machines and autonomous systems. As many industries develop safety-critical systems, one of the main concerns affecting our everyday lives is the considerable consequences of their failure. These consequences may involve loss of life, property damage, or environmental damage [1]. As electronics technology advances, more and more systems are becoming embedded and hardware-based, leading to challenges in software safety and system performance quality [2]. The quality of a product consists of many aspects, one of which is the reliability and completeness of testing [3]. Throughout the years, different safety standards have been created to assure certain safety aspects of control systems in different domains, including testing and validation. Optimal test processes can assure the completeness of testing according to testing standards. Test processes are known to be costly as well as resource-consuming [4].

This thesis intends to tackle the problem of compliance of safety-related systems’ test processes with relevant standards. The assessment of the compliance of a process with certain criteria entails reviewing the documentation produced by the process and verifying that it adheres to the instructions in the standard [5]. The fundamental question directing this research concerns the recommendations, derived from such criteria for a test process, that assist industrial control systems in achieving higher conformity with specific standards. One step towards the main goal of the research is gathering a collection of relevant instructions concerning test processes and the validation of safety-related control systems. Using this collection, a case study is performed to assess the compliance of a specific case with safety standards. The case selected for this case study consists of the development process documentation for two use cases, provided by an automation company in Sweden. The results achieved from studying the compliance of the process in place at the company can be transferable to other similar cases. The answer to the first research question is a collection of instructions that assist the test process of an example control system in complying with safety-related standards. Answers to the research questions have led to the utilisation of a lightweight method for generating recommendations for improving the test processes of industrial control systems.

This thesis aims to investigate relevant safety standards concerning control systems with regard to their application to safety and validation in the test processes of industrial control systems. Furthermore, the compliance of a test process with relevant safety standards is investigated. The study then conducts a thorough case study at a large automation company to assess the degree to which the existing processes comply with the safety standards. The extent of a company’s compliance with relevant safety standards is decided by reviewing the documentation provided by the company. The review has been done in multiple stages, applicable to other companies. Moreover, the review ensures the thoroughness of the justifications of the conformity of the company’s process with the relevant standards. The steps performed for answering the research questions in this thesis include tabulating the data, assessment, analysis, recommendation, and finally industrial evaluation.
Data collection, which takes place prior to and in parallel with these steps, determines the development process that is to be assessed. In addition, the data collection results in the assessment criteria, which are a selection of requirements from the safety standards. Careful consideration of the desired safety standards leads to the assessment criteria, while collecting documentation on the development process of a company leads to evidence for the compliance of the process with said standards. Tabulation of the data sources and the assessment criteria into the V-Model Categorisation and Keyword Classification schemes further highlights the areas in need of improvement. This tabulation can be customised to a development process, as it uses keywords and qualitative analysis of texts. Once tabulated, the evidence and criteria are matched and assessed. Assessing the example industrial control system resulted in a 58% failure of compliance with the selected safety standards. Furthermore, the analysis of these results revealed the areas for growth to be integration testing, traceability of the documentation, and the performance of analyses for deriving test cases. The recommendation step uses the results of the assessment and analysis phases to cluster and summarise a series of suggestions for improving the test process in place at a company. The recommendations generated in this specific case were focused on Test Planning activities. Furthermore, the recommendations, assessment, and analysis results are to be evaluated by those affected. The case under study utilised

a small focus group from the company to present the results and received feedback on improving the recommendations by prioritising and further clustering them. The focus group also revealed the usability and value of the results achieved through the assessment and analysis stages. As a result of its exploratory case study, this thesis has put forth a set of steps for a lightweight assessment of the test processes of industrial control systems. The steps in this method require further refinement and evaluation before they can be applied generally, and they can contribute to industry and academia by limiting the resources needed for identifying areas of improvement in the test process of industrial control systems. The thesis focuses mainly on the results of assessing an example industrial control system’s development process, with an emphasis on testing, and on the recommendations made for the test process in question.
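
To make the assessment and analysis steps summarised above more concrete, the sketch below shows one possible way of recording a compliance degree per requirement and aggregating the degrees into the percentages reported in this thesis. It is a minimal sketch: the three-level compliance scale, the field names, and the example records are assumptions for illustration, not the thesis's actual assessment tables (see Section 8.2 and Appendix D).

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical compliance scale; the thesis reports "fully complies" and
# "fails to comply" shares, so an intermediate partial level is assumed here.
FULL, PARTIAL, FAIL = "full", "partial", "fail"

@dataclass
class AssessedRequirement:
    requirement_id: str    # identifier of an extracted requirement (illustrative)
    standard: str          # safety standard the requirement was extracted from
    v_model_category: str  # V-Model Categorisation group
    keyword_group: str     # Keyword Classification group
    compliance: str        # FULL, PARTIAL, or FAIL (justification kept elsewhere)

def compliance_breakdown(assessed: list[AssessedRequirement]) -> dict[str, float]:
    """Return the share of requirements per compliance degree, in percent."""
    counts = Counter(req.compliance for req in assessed)
    total = len(assessed)
    return {degree: 100.0 * counts.get(degree, 0) / total for degree in (FULL, PARTIAL, FAIL)}

# Illustrative records only; real identifiers, groups, and results come from the case study.
example = [
    AssessedRequirement("IEC 61508-3, module testing", "IEC 61508-3", "Unit Testing", "Module Test", FAIL),
    AssessedRequirement("ISO/IEC/IEEE 29119-2, test planning", "ISO/IEC/IEEE 29119-2", "Test Planning", "Test Plan", FULL),
    AssessedRequirement("ISO 13849-1, software safety", "ISO 13849-1", "Integration Testing", "Traceability", PARTIAL),
]
print(compliance_breakdown(example))  # each degree appears once, so roughly 33.3% each
```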

2 Background

The thesis aims to make recommendations for improving test processes, based on the assessment of development processes. The steps taken in this thesis for assessing development processes are applied at a large automation company in Sweden. To achieve the goals of the thesis, the work is carried out in close collaboration with this company, and various safety standards are taken into consideration and analysed. These standards are regulated, and compliance with them is usually either required or advised [6]. This study will also focus on the analysis and improvement of test processes, namely the test process in place at the company. The software development process used at the company is the V-Model.

As testing is a vital yet demanding part of every organisation, appropriate and mature test processes are needed [7]. Test process improvement approaches guide the improvement of different software testing processes [4]. In this thesis, the assessment of the processes is done by reviewing documentation provided by the company and checking its compliance with specific standards. As test process improvement is of great importance, it should be grounded in compliance with appropriate safety standards. The compliance of a test process with safety standards is determined through the review of the documentation provided. The documentation provided by the company describes their development process, which follows the V-Model. Compliance follows from how adherence to the recommended guidelines is justified in the documentation. Compliance with a standard is of particularly high importance in safety-critical systems, since the test processes recommended in these standards ensure the safety of human lives as well as avoiding loss of equipment and harm to the environment.

2.1 V-Model Process

The software development life cycle model used by the company is the V-Model. The book “Software Testing: An ISTQB-BCS Certified Tester Foundation Guide” is used as a reference for the definition of the V-Model [8]. The book [8] states that a software development life cycle is essentially a set of processes containing tasks and activities, covering the entire cycle from the gathering of requirements to the release of the system. One of the simple, traditional development life cycle models is the Waterfall model. The V-Model is an extension of the Waterfall model and has many different variants [8].

The Waterfall model proceeds sequentially from the first step, requirement specification, all the way to the last step, which is testing. The steps in the Waterfall model are as follows: requirement specification, functional specification, technical specification, program specification, coding, and finally testing [8]. Testing acts as a final quality check in the Waterfall model, and the checks done throughout the life cycle are known as verification and validation. Verification includes confirming that the system meets the requirements and that the system is being developed correctly. Validation, on the other hand, is the set of tasks that ensure the conformity of the system with customer requirements, meaning confirmation that the correct system is being developed. The V-Model is known as an extension of the Waterfall model. Figure 1 shows a variant of the V-Model for software development.

2 Ladan Pourvatan Test Process Assessment of Industrial Control Systems

Figure 1: A variant of the sequential V-Model presented by [8]. The figure demonstrates a model for a software development life cycle, showing the different phases or activities that must be completed. The order in which to follow the sequential V-Model of software development is also shown.

As can be seen in Figure 1, the V-Model contains all the steps that the Waterfall model does. However, in the V-Model, the planning for testing is done at the start [8]. The first step of the V-Model is requirement specification, in which the needs of the user or customer are captured. Plans for acceptance testing must be made at this stage, since acceptance testing includes testing the system against the requirement specifications. The next step is functional specification, during which the system design is done. This step is when the functions aimed to fulfil the requirements are defined. The functional specification stage is parallel to system test planning, since the system tests are designed based on the functional specifications. The third stage is technical specification, during which the previously defined functions are designed at a high level. The technical design of the functions helps clearly define the connections between the internal entities and other systems or the environment, which provides the necessary information for designing the integration tests. Once the technical specifications and the integration test planning are done, the program specification phase begins. Program specification includes a low-level design, or an internal design, of all the modules; this is the phase that makes the system ready for unit test planning. Once all the specifications, and with them the test plans, are completed, the coding is performed [8].

The right side of the V-Model focuses on testing. Note that by the time coding starts, all the test plans are ready. The tests are done from the bottom up. First, the unit tests are executed, making sure that each module functions correctly. Next, the relationships between the modules and the technical specifications are tested via integration testing. Once the technical aspects of the modules are confirmed to work properly, the system is tested against the functional specifications; this is called system testing. The final step of testing according to the V-Model is acceptance testing, which aims to test the system with respect to the requirements presented by the users [8].

In summary, the V-Model is an extension of the Waterfall model and is used as a software development life cycle model [8]. The V-Model specifies that the tests for each phase of the system are planned during that phase's specification and executed later. The planning of the tests helps with better specifications and faster execution of tests once the code is done. The development life cycle model used at the company is a variation of the V-Model: a simplified variant is utilised, which allows the safety requirements to be satisfied.
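
As a compact restatement of the pairing described above, the sketch below maps each V-Model specification phase to the test level whose planning accompanies it (based on [8]); the dictionary and helper names are illustrative, not part of the reference.

```python
# Pairing of V-Model specification phases with the test levels planned
# alongside them, as described above (based on [8]).
V_MODEL_TEST_LEVELS = {
    "requirement specification": "acceptance testing",
    "functional specification": "system testing",
    "technical specification": "integration testing",
    "program specification": "unit testing",
}

def planned_test_level(phase: str) -> str:
    """Return the test level whose planning accompanies the given specification phase."""
    return V_MODEL_TEST_LEVELS[phase.lower()]

assert planned_test_level("Functional specification") == "system testing"
```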

2.2 Safety Standards

Safety standards can be categorised into two groups: those which focus on practical guidance on methods to be used during development for achieving the system objectives, and those which apply constraints on the development process without practical guidance [9]. The standards taken into consideration for this thesis are among those considered highly relevant in the context of process assessment of safety-critical control systems, and mostly belong to the first group mentioned.

Safety standards are applied to the development processes of safety-critical systems in order to reduce risks to a tolerable level. Generally, safety standards in industrial control systems tend to specify safety requirements in addition to a guideline for their implementation through safety functions [10]. The criteria for choosing the standards used during this case study include their involvement in the safety of machinery, software testing, and development process assessment, as well as their being specified as mandatory by the company. The standards chosen are to be concerned with safety, clearly address test process assessment and the safety of machinery in control systems, and be highly regarded in their respective domains.

2.2.1 IEC 61508 Standard

One of the standards used in this work is IEC 61508. This is a standard for the functional safety of electrical/electronic/programmable electronic safety-related systems. What brings this standard to attention for this thesis is its applicability to control systems. The standard is primarily concerned with the safety of systems. Safety in this context refers to the absence of failures that may cause harm to persons or the environment, or have economic implications [11]. Furthermore, Nevalainen et al. have identified this standard as being useful when it comes to using methods and techniques as evidence for evaluating the achievement of safety [12].

IEC 61508 is an international standard addressing systems performing safety functions that have electrical, electronic, or programmable electronic (E/E/PE) elements [11]. The standard contains recommendations and instructions for all safety lifecycle activities of such systems. The standard covers a specific scope, namely the characteristics that E/E/PE systems must have if they carry out safety functions. As IEC 61508 is generic, it is applicable to all E/E/PE safety-related systems, regardless of their application, with the exception of medical equipment in compliance with the IEC 60601 series. Compliance with this standard entails satisfying all the relevant requirements, with their respective criteria, and meeting the objective of each specified clause [11].

The standard IEC 61508 has many clauses, of which the software safety lifecycle requirements in part 3 are of interest. However, it is useful to briefly look at the other clauses related to software requirements. Apart from the scope, documentation, and management of functional safety, part one of the standard focuses mainly on the overall safety lifecycle requirements of the systems under interest. The overall safety lifecycle aims to systematically deal with the activities necessary for achieving the required safety integrity in a system. The lifecycle has sixteen main phases in addition to Verification, Management of Functional Safety, and Functional Safety Assessment; these final three are relevant to all sixteen main phases. For each phase, the standard includes the following elements, adherence to which leads to the compliance of the phase with the standard: the objectives, the scope, sub-clauses with requirements, the required inputs, and the required outputs. All these aspects are documented in detail in IEC 61508-1:2010. The phases in the overall safety lifecycle are: Concept; Overall scope definition; Hazard and risk analysis; Overall safety requirements; Overall safety requirements allocation; Overall operation and maintenance planning; Overall safety validation planning; Overall installation and commissioning planning; E/E/PE system safety requirements specification; E/E/PE safety-related systems – realisation; Other risk reduction measures – specification and realisation; Overall installation and commissioning; Overall safety validation; Overall operation, maintenance and repair; Overall modification and retrofit; and Decommissioning or disposal.

Even though the standard recommends overall safety lifecycle requirements, the main point of interest for this thesis is the software safety lifecycle requirements presented in IEC 61508-3:2010 [13]. The document includes nine general requirements for the software safety lifecycle, which lead to the software development process being structured into defined phases and activities.
Once the general requirements are stated, the standard moves on to the requirements and objectives of the software safety requirements specification. The sub-clause that follows includes guidelines for the validation plan for software aspects of system safety. The requirements in this sub-clause include the specific technical and procedural steps to be carried out for planning so that the safety requirements are satisfied. More precisely, the standard continues with the aspects of system safety that must be considered in this phase. The sub-clause includes the required information for the technical strategy that has been chosen for validation, as well as criteria for accomplishing software validation [13].

After the three phases explained above (General requirements, Software safety requirements specification, and Validation plan for software aspects of system safety) comes the stage of Software design and development. The objectives that must be achieved for the software design and development phase to be compliant with the standard are stated clearly, along with some general requirements. These general requirements precede specific criteria for software architecture design, provisions for support tools including programming languages, conditions for detailed software design and development, specifications for code implementation, requirements for software module testing, and, last but not least, necessities for software integration testing. All the sub-clauses in the software design and development phase include their objectives and requirements, which will be employed for deciding on the compliance of the company’s documentation for the two use cases with the instructions provided by IEC 61508.

2.2.2 ISO/IEC/IEEE 12207 Standard

ISO/IEC/IEEE 12207, titled Systems and software engineering — Software life cycle processes, is a standard used by the company under study. The compliance of the software development process at the company with the standard will be thoroughly assessed. Although ISO/IEC/IEEE 12207 is not a safety standard per se, it is concerned with critical quality characteristics. These quality characteristics subsume aspects related to health, safety, security assurance, reliability, availability, and supportability [5]. Safety in this standard is referred to generally as an expectation of not endangering human life, health, property, or the environment. Therefore, the standard ISO/IEC/IEEE 12207 is utilised to assess the software development processes for the use cases under interest, due to its safety measures in addition to its usage by the company.

The document for the standard ISO/IEC/IEEE 12207 includes instructions and requirements for various processes [5]. The standard acknowledges its vastness and states that all the provided processes may not be needed for a particular project. Therefore, the document provides instructions for full conformance and tailored conformance, meaning the readers have ways to assess a life cycle’s compliance with this standard tailored to the needs of the system under interest. Full conformance is a union of conforming to the outcomes and the tasks in a process. There exists a prescribed tailoring process in the standard, which assists with modifying the outcomes and tasks to the needs of a specific system under interest. One may claim tailored conformance to the standard by demonstrating that the outcomes, activities, and tasks, as modified, have been achieved [5].

The documentation for the standard ISO/IEC/IEEE 12207 uses some essential concepts as its basis, which are thoroughly described [5]. Generally, the concepts can be divided into four groups: software system concepts, organisation and project concepts, lifecycle concepts, and process concepts. Software system concepts involve explanations about a software system, its structure, its lifecycle processes, as well as enabling systems. Organisation and project concepts clarify that how organisations, or parts of organisations, execute the processes is not the concern of this document. Lifecycle concepts include explanations of software lifecycle stages and lifecycle models for software systems. Lifecycle stages portray the major progress and achievement milestones of the software system through its life cycle. A lifecycle is additionally explained as an abstract functional model, which must embody the need for a system, and the system’s realisation, utilisation, evolution, and disposal [5].

The descriptions of the concepts for the processes in the documentation of the standard are of most interest to this piece of research. The process concepts model three basic principles for determining what is identified as a life cycle process. According to ISO/IEC/IEEE 12207, there is a strong relationship between the tasks, outcomes, and activities in a life cycle process [5]. Each process is to have the ability to be executed by a single organisation in the life cycle. Also, the processes are to have as few dependencies amongst them as feasible. Besides fulfilling these three basic criteria, a process in the documentation of ISO/IEC/IEEE 12207 has the following attributes: a title, a purpose, outcomes, activities, and tasks.
The title of a process demonstrates the domain and scope of the process. A process’s purpose is a description of the goals that are to be achieved by performing the process. The outcomes of a process are the expected results that can be observed once the process is successfully executed. The set of related tasks that make up a process is specified as its activities.

The tasks themselves include permissible actions, requirements, or recommendations that assist in achieving the desired results from the process [5].

Now that the concepts have been explained to the extent needed to understand the work, the software life cycle processes are taken into consideration. The processes are classified into four groups: Agreement Processes, Organizational Project-Enabling Processes, Technical Management Processes, and Technical Processes. Furthermore, there are requirements for the processes in each of the groups. For each of the processes, requirements and instructions are provided for their purpose, tasks, activities, and outcomes. The requirements themselves are not reproduced in this text but may be accessed in the documentation for the ISO/IEC/IEEE 12207 standard [5].

The Agreement Processes group includes the acquisition process and the supply process. The Organizational Project-Enabling Processes consist of six processes, each with their own recommendations regarding their attributes. The processes in this group are the Life Cycle Model Management process, Infrastructure Management process, Portfolio Management process, Human Resource Management process, Quality Management process, and Knowledge Management process. The following eight processes are grouped into the Technical Management Processes: Project Planning process, Project Assessment and Control process, Decision Management process, Risk Management process, Configuration Management process, Information Management process, Measurement process, and Quality Assurance process. Finally, the Technical Processes consist of the Business or Mission Analysis process, Stakeholder Needs and Requirements Definition process, System/Software Requirements Definition process, Architecture Definition process, Design Definition process, System Analysis process, Implementation process, Integration process, Verification process, Transition process, Validation process, Operation process, Maintenance process, and Disposal process.

To sum up, the document for the ISO/IEC/IEEE 12207 standard used by the company comprises requirements, definitions, and recommendations regarding software life cycle processes and serves as a reference for them. The total software life cycle is divided into groups of processes, and each group has multiple processes. The processes have basic attributes, requirements for which are provided in the standard documentation to assure quality and safety for the software-under-interest.
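
The sketch below shows one possible way to capture the process attributes and process groups described above when tabulating assessment criteria. It is a minimal sketch under stated assumptions: the field names, the group membership shown, and the example entry are illustrative paraphrases, not text taken from the standard.

```python
from dataclasses import dataclass, field

@dataclass
class LifeCycleProcess:
    """A process as characterised in ISO/IEC/IEEE 12207: a title, a purpose,
    outcomes, and activities made up of tasks."""
    title: str
    purpose: str
    outcomes: list[str]
    activities: dict[str, list[str]] = field(default_factory=dict)  # activity -> its tasks

# The four process groups named in the standard; only a few of the processes
# mentioned in the text above are listed per group.
PROCESS_GROUPS = {
    "Agreement Processes": ["Acquisition", "Supply"],
    "Organizational Project-Enabling Processes": ["Life Cycle Model Management", "Quality Management"],
    "Technical Management Processes": ["Project Planning", "Quality Assurance"],
    "Technical Processes": ["Integration", "Verification", "Validation"],
}

# Illustrative paraphrase of one Technical Process entry, as it might be
# tabulated for the assessment; not a quotation from the standard.
verification = LifeCycleProcess(
    title="Verification process",
    purpose="Confirm that specified requirements have been fulfilled",
    outcomes=["Verification criteria are defined", "Anomalies are identified and recorded"],
    activities={"Perform verification": ["Define the verification strategy", "Verify the work products"]},
)
```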

2.2.3 ISO/IEC 29119 Standard

ISO/IEC/IEEE 29119, titled Software and systems engineering — Software testing, is a standard with four parts, each containing requirements about different aspects of software testing [14]. Part 1 of ISO/IEC/IEEE 29119, titled Concepts and definitions, specifies the concepts and definitions in software testing which are vital for understanding the other parts of the series. ISO/IEC/IEEE 29119-2, Test processes, describes the test processes that can be utilised for the implementation, governance, and management of software testing, and works independently of which software development life cycle model is utilised. Part 3 of the series is titled Test documentation, and it comprises templates for various software test documentation, specified as outputs of the processes described in part 2. Part 4 includes descriptions and definitions of test design techniques that may be used for the processes in ISO/IEC/IEEE 29119-2. The thesis at hand focuses mainly on parts 2 and 3, using them for the assessment of the process in place at the company, as well as for making safety-related recommendations for a control system’s test process.

ISO/IEC/IEEE 29119 classifies the testing activities in a software life cycle into three layers: the Organisational Test Process, the Test Management Processes, and the Dynamic Test Processes [15]. The standard further describes the processes in each group in terms of three attributes: the purpose, the activities and tasks, and the desired outcomes of the process. The document explains which activities and tasks must be performed per process, and defines the purpose of the processes clearly. To conform to this standard, the desired outcomes of each process are to be assessed as well. The activities, tasks, and desired outcomes may be modified based on the needs of the system. When modifications are made, they must be justified and recorded, and include the applicable risks of the tailoring. The justification of the tailoring decisions may lead to tailored conformance of test processes with the ISO/IEC/IEEE 29119 standard [14].


The Organisational Test Process layer aims to define processes for creating and maintaining organisational test specifications [15]. The organisational test specifications may include test policies, strategies, procedures, and so on at the organisational level. The layer involves a single process with an expected information item as a result, desired outcomes, a purpose, and three basic activities, each with their tasks. The details of each attribute are not repeated in this thesis but can be read in the documentation of the standard [14].

The second layer of the test process model used in ISO/IEC/IEEE 29119 is the Test Management Processes layer. There are three processes, each having the aforementioned attributes. The test management processes are applied generically at different levels and test phases in a development process. The three processes are the Test Planning process, the Test Monitoring and Control process, and the Test Completion process.

The final layer, namely the Dynamic Test Processes layer, involves four processes. Again, each process has a purpose, a set of activities with tasks, desired outcomes, and an information item that is produced as a result of executing the process. The attributes involve the details of the requirements and recommendations given by the standard. The processes in the final layer are the Test Design and Implementation process, the Test Environment Set-Up and Maintenance process, the Test Execution process, and finally the Test Incident Reporting process. The details of each of these layers and their respective processes assist in having a safe and secure overall test process [15].

As previously mentioned, ISO/IEC/IEEE 29119-2 has an information item per process. These information items are documents resulting from the successful execution of each process at each level. Part 3 of the ISO/IEC/IEEE 29119 series provides templates for these documents [16]. Following these templates, or justifying the specific changes made to tailor them to the needs of the system-of-interest, leads to compliance with ISO/IEC/IEEE 29119-3. The details of these templates are not given in this section but are used to decide on the conformity of the test process at the company, as well as to make relevant recommendations for a safe test process.
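
To summarise the relationship between the 29119-2 process layers and the 29119-3 information items discussed above, the mapping below pairs the processes named in the text with the kind of document each one produces. It is a sketch only: the item names follow the document headings used later in this thesis (Section 8.4) rather than the exact template titles of ISO/IEC/IEEE 29119-3.

```python
# ISO/IEC/IEEE 29119-2 test process layers and the processes named above,
# each paired with an indicative 29119-3 information item. Item names follow
# the headings used in Section 8.4 and are indicative, not exact template titles.
TEST_PROCESS_LAYERS = {
    "Organisational Test Process": {
        "Organisational Test Process": "Organisational test specification (e.g. test policy, test strategy)",
    },
    "Test Management Processes": {
        "Test Planning": "Test plan",
        "Test Monitoring and Control": "Test status report",
        "Test Completion": "Test completion report",
    },
    "Dynamic Test Processes": {
        "Test Design and Implementation": "Test design specification",
        "Test Environment Set-Up and Maintenance": "Test environment readiness report",
        "Test Execution": "Test execution log",
        "Test Incident Reporting": "Test incident report",
    },
}

def information_items() -> list[str]:
    """List the documents a fully executed test process would be expected to produce."""
    return [item for processes in TEST_PROCESS_LAYERS.values() for item in processes.values()]
```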

2.2.4 ISO 13849 Standard

The safety standard of central interest in this thesis is ISO 13849, both parts 1 and 2. This standard is used by the company for their software development process along with ISO/IEC/IEEE 12207. As this research aims to assess the test process of industrial control systems, the main part considered from ISO 13849 is the software safety requirements clause. The software safety requirements in ISO 13849 seek to achieve software with high readability and understandability, as well as maintainability and, most of all, testability [17]. The standard aims to achieve this objective by requiring that faults introduced during the software life cycle be avoided throughout the life cycle activities of safety-related application or embedded software. The requirements in the standard are divided into those for safety-related application software and those for safety-related embedded software [17].

Fundamental measures are to be taken for safety-related embedded and application software, as recommended by ISO 13849. The first measure is a software safety life cycle with verification and validation activities. Secondly, design and specification documentation is required. Modular and structured design and coding is the third measure to be taken. Control of systematic failures and functional testing are among the further steps recommended by the standard. Finally, appropriate software safety life cycle activities following modifications of the system are required as expected actions for safety-related software.

In addition to these instructions, safety-related embedded systems containing components whose performance levels imply a high probability of dangerous failure shall take further measures. Such high-risk safety-related software should pay more attention to project, quality, and configuration management, and to quality management systems, along with more structured code and testing criteria, among other measures. Safety-related application software should apply the previously mentioned fundamental measures, together with further actions for components with a higher risk rate [17].

The safety standard makes recommendations in ten different areas. These areas include reviewing the safety-related software specifications; the selection of tools, libraries, and languages; features mandatory for the software design; and recommendations for cases where both safety-related and non-safety-related application software are combined in one component. The clauses also include instructions regarding software implementation and coding, testing, documentation, verification, configuration management, and finally modifications of the software [18].

The recommendations and instructions given for each of these areas are explained further in the document for ISO 13849-2:2012 [18].

2.2.5 ISO/IEC 33063 Standard

ISO/IEC 33063 is a standard titled “Information technology — Process assessment — Process assessment model for software testing”. This standard has been suggested as a process assessment model for software testing by Garcia et al. [19]. ISO/IEC 33063 helps with the assessment of the actual process in place at the company and, given its assessment criteria, assists with future recommendations regarding a test process compliant with safety standards. This standard also uses the process reference model in ISO/IEC/IEEE 29119-2 as its basis [20][15].

The fourth clause of ISO/IEC 33063 gives an overview of the process assessment model. The structure of the process assessment model is explained in the standard through the outlining of the assessment criteria for different processes [20]. The standard is strictly used as a reference guideline for process assessment, as opposed to the other standards, which are used for their safety-related requirements. The document continues by discussing process dimensions and process performance indicators. ISO/IEC 33063 includes assessment guidelines for the compliance and conformity of the process assessment model [20]. This standard will be studied in later phases of the thesis and used as a reference assessment model.

3 Related Works

The thesis at hand focuses on the test process assessment of industrial control systems via safety standards. The objective of the thesis can be broken down, starting with how test process assessment is conducted in general. The assessment of test processes in industry is performed in various ways, and Toroi et al. [21] have done interesting work in identifying areas of improvement in test processes. Industrial control systems generally take a particular approach to their test processes to assure safety; this is discussed by Nunns et al. [22]. Finally, the use of safety standards in assessing test processes is considered by researchers. Safety standards, in general, help with the assessment of various phenomena. The way standards are applied to test processes is discussed by Panesar-Walawege et al. in two of their works [23][24].

Many methods are used for assessing and improving test processes. The standard ISO/IEC 15504 gives guidelines for the software testing life cycle processes, and CMMI determines how verification and validation should be performed [21]. The research done by Toroi et al. states that test improvement models are found to be too difficult by software companies, producing the need for a lightweight approach to test process improvement [21]. This research used the LAPPI (A Lightweight Technique to Practical Process Modeling and Improvement Target Identification) technique for modelling the test processes studied and proceeded to identify improvement targets in those processes. The case study was performed in three case organisations. All cases had issues with inadequate unit testing, handling code modifications, and a lack of exit criteria for testing. The study helps this thesis place more of a focus on these areas as well. Another issue found by Toroi et al. was the integration of automated and manual test processes. The study suggests that there is a need for test process standardisation and confirms this by identifying the common, test-related problems that reappear in the software industry [21].

In order to assess the safety of a system, a consistent approach is needed. Many different methods are being used for the assurance of safety across industry, which may lead to confusion [22]. Nunns et al. identify the use of safety standards as one of the approaches helping industry face various challenges. One of the safety standards used for safety assessment is IEC 1508, an International Standard by the International Electrotechnical Commission, which addresses safety-related systems [22]. IEC 1508 contains general documentation of standards used in different sectors of industry. The seven parts of IEC 1508 provide various instructions regarding requirements, definitions, and the application of those instructions. The standard consists of seven parts: Part 1 General requirements, Part 2 Requirements for electrical/electronic/programmable electronic systems, Part 3 Software requirements, Part 4 Definitions, Part 5 Guidelines on the application of Part 1, Part 6 Guidelines on the application of Parts 2 and 3, and Part 7 Bibliography of techniques.

Nunns et al. demonstrate how IEC 1508 helps with the assessment of safety in different sectors of industry by involving four key concepts. The concepts are the Safety Life Cycle, Safety Management, Competencies, and the Design of safety-related control and protective systems. The guidelines for the structured approach to the safety life cycle assist with the assessment of processes [22].

Panesar-Walawege et al. [23] describe a Model-Driven Engineering approach to support the verification of compliance with safety standards. The methods used to ensure compliance with safety standards inform this thesis’s approach to the assessment of the test process via safety standards. The work by Panesar-Walawege et al. addresses the complications arising from a company’s risk of lacking essential recorded details during development [23]. The work also takes into account the loss of productivity of an assessor, leading to delays, due to incomplete and poorly structured evidence and documentation provided by the company. To face these challenges, Panesar-Walawege et al. use various Model-Driven Engineering technologies to propose an approach for system designers to link what the standards require as evidence to the concepts of their application domain. The challenge of making such relations threatens the progress of this thesis; therefore, the methods used in the research by Panesar-Walawege et al. are of great help. The thesis at hand also faces the challenges of understanding the documentation provided by the company, as well as justifying that the collected evidence is complete with respect to the standard’s requirements. The work by Panesar-Walawege et al. helps to face these challenges by using UML profiles and developing a profile of the system according to a standard’s conceptual model. This modelling helps to link the safety standard’s concepts to the application’s domain in a systematic way, leading to a clear route for showing the relationship between the company’s documentation and the standard [23].

Panesar-Walawege et al. use Model-Driven Engineering techniques to verify a system’s compliance with safety standards in later work [24]. As previously mentioned, the presentation of evidence by the suppliers of safety-critical systems is essential for demonstrating their compliance with certain safety standards. A clear-cut interpretation of the standards is needed to avoid missing crucial details during development [24]. The work by Panesar-Walawege et al. states the importance of a systematic approach to assessment, which is also supported by automation. Panesar-Walawege et al. suggest a method for linking the concepts of an application domain to the requirements of relevant standards. Just like their previous work, Panesar-Walawege et al. use MDE technologies. UML is used to create a profile for the domain model of the system, which is then augmented with constraints expressed in the Object Constraint Language (OCL) [24][23]. Panesar-Walawege et al. performed a case study showing how their approach of using general modelling tools can assist in the production of adequate evidence for verifying the compliance of a system with standards. The approach proposed by Panesar-Walawege et al. establishes a relationship between the domain model of a system and the model of a safety standard.
The constraints put forth by the standards and the profile are automatically verified using existing OCL constraint engines [24]. The studied works suggest the need for a systematic and disciplined approach to the collection and analysis of data in this thesis. The related works concern the test process assessment of industrial control systems and the role safety standards play in this assessment. These pieces of research inspire a clear path for assessing development processes in industry when a company aims to follow specific safety standards. They help set a path for creating a systematic view of the evidence provided by the company and the requirements put forth by the standards. Another outcome of studying the related works is insight into how the improvement targets in the test processes of industrial control systems are identified and which areas need more focus.


4 Problem formulation

All safety-critical systems are required to meet specific safety standards. Safety standards ensure that the appropriate validation mechanisms for a control system are in place and that the system meets its safety requirements. By conducting an exploratory case study, one can assess the test process of an industrial control system with regard to safety standards and develop the steps for doing so. This thesis investigates various standard documentation concerned with safety and process assessment and, by applying them, evaluates the compliance of an existing development process at a large automation company in Sweden.

The thesis investigates four different safety standards. The selection of safety standards aims to include ones that concern test processes, safety, and activities within the development process that affect test processes. The safety standards are explored in the context of the test processes of safety-critical control systems. Furthermore, the standards were reviewed and chosen according to their relevance to the functional safety of machinery and to testing standards. The problem addressed is the compliance of a development process, with attention to testing, with the noted safety and process standards. A lightweight method is used for thoroughly studying and investigating development processes with respect to the chosen standard documentation. The use of a lightweight method is beneficial as opposed to a full assessment, as it utilises fewer resources and identifies the areas needing more effort faster [25]. The method is applied to the development process in place at an automation company. Furthermore, the recommendations made for improving the process will aid the efficiency of the testing process and its compliance with the selected safety standards, as they bring the relevant requirements together in one place. This will contribute to research by exploring new methods for making beneficial recommendations for test processes and for assessing development processes that aim to follow specific safety standards. The exploratory aspect of the case study performed in this thesis uncovers areas that need consideration in test process assessment and improvement. This study aims to answer the following research questions:

RQ1. What recommendations further improve safety compliance for the test process of an industrial control system?

RQ2. To what extent does the current development process in place at an automation company comply with specific safety standards?

The first research question is the main problem addressed in this study. The thesis answers research question RQ1 by answering the second research question, RQ2. The main research question results in a collection of recommendations that, if followed, will lead to a test process compliant with a selection of safety standards. The method used to assess the development process is generic and may be applied to other cases. To arrive at the desired set of recommendations for the test process, the suggestions made by the relevant safety standards are first gathered. Secondly, the assessment of the development process, with attention to testing, leads to valuable information regarding the areas of improvement in the development process, and thereby to a more compliant test process. Hence, answering the second research question yields a concise collection of instructions from the preferred safety standards, which are then used as more distinct assessment criteria for compliance. The exploratory nature of this study has led to a lightweight method for assessing the development process of industrial control systems, which is applied at an automation company to answer the research questions. This methodology for a lightweight assessment of a test process is a by-product of answering these research questions.

4.1 Scope Definition

This thesis focuses on standards concerned with safety and test processes in control systems. It is paramount to assign boundaries to such a broad problem. The scope of this study is limited to safety-related recommendations for the test process of industrial control systems. The recommendations are specific to the test process within the software development process of control systems. To bound the domain of the standards, they are narrowed to those related to the functional safety of machinery and specific to software development processes, testing, and validation. Therefore, while studying the standards, only elements concerning the safety-related parts of control systems are considered.

In addition, the requirements extracted from the safety standards are those that in some way affect the test process of an industrial control system. The scope of the thesis does not include instructions from the standards that address areas unrelated to testing. The test process researched is specific to two safety functions, in the form of use cases provided by the company, which will yield transferable results.

4.2 Limitations

One remarkable limitation affecting the world at the moment is the COVID-19 pandemic. This pandemic has caused restrictions concerning access to the on-site resources of the company and delays in information gathering.

The issue of confidentiality is considered, as the thesis includes data from an automation company in Sweden. Initially, the confidentiality issue was dealt with by the student signing a Non-Disclosure Agreement (NDA). Under the NDA, access is granted to the information needed for performing the thesis work. Furthermore, the company has provided an excerpt from the proprietary software which is allowed to be included in the public thesis report. Another approach for addressing the confidentiality issue has been to avoid sharing any sensitive data on any cloud platform and to use local rather than cloud-based backup methods for the work.

5 Method

This thesis is performed as a case study, validating industry needs concerning test process assessment. As the thesis hopes to contribute to both academia and the needs of the industry partner, the method chosen is a case study executed at an example company, the results of which will be transferable to other cases with a similar development process. The data collection methods in this case study can be used for other cases, as the gathering of archival data and interviews are applicable to other companies. Furthermore, the data analysis steps are transferable, as they are clearly defined in the following sections and can be applied to other companies with similar development processes. Multiple sources of assessment criteria are utilised on two different use cases, and data triangulation is considered by using observation, documentation, and interviews as data collection methods [26]. The case study will assess the compliance of the test process of safety-critical control systems with relevant safety standards.

The case study method starts with a planning stage and then iterates through the phases of design, preparation, data collection, analysis, and sharing [27][28]. Once planning and time management were thoroughly complete, the case study design stage led to the definition of objectives and a more reliable overall plan. Subsequently, the preparations for data collection solidified the descriptions of the procedures and protocols for collecting the data [28]. The execution of data collection on the development process at the company followed the preparation stage. Once the needed data was collected, it was investigated with qualitative data analysis methods, with attention to the construct validity, internal validity, and external validity of the study [29]. The thesis performed the analysis by studying the compliance of the test process with specific standards. Reporting or sharing the results of the analysis is the final stage; in this stage, recommendations were made for the test process at the company. The general flow of the case study procedure, inspired by Runeson et al. and Baškarada, is demonstrated in Figure 2 [28].


Figure 2: This diagram is inspired by the recommendations of Runeson et al. [28] and Baškarada [27] regarding case study guidelines. The diagram demonstrates the flow of the different stages of a case study, where case definition and the case study protocol are sub-steps of the design phase.

The first stage of performing a case study is the case study design and planning. The design of a case study involves case and subject selection. This step includes precisely defining the objective, the case, the theory, and the research questions. The case in this context is defined as what is being studied. The theory is the frame of reference for the case study since, in the field of software engineering, theories are not commonly used. Once the design of a case study is complete, the preparation is performed. The preparation part of the case study design stage involves the arrangements for the data collection procedures [28]. The methods for collecting the data at the design level for this study are in the independent and the direct categories, namely document analysis and interview, respectively. The selection strategy is to seek data from two specific safety functions, in the form of use cases, and from the development process in place at the company, with definite attention to testing. The relevant documents regarding the development process of the company, specifically for the two safety functions, were provided by the company. Likewise, an interview conducted with an R&D engineer at the company contributed supplemental information regarding the parts of the development process at the company related to testing.


Figure 3: The steps taken in the data collection phase. In step (1), the documents to be assessed were requested and acquired from the company. Step (2) involved choosing the safety standards according to their relevance to the scope of the thesis. During step (3), all the acquired safety standards were thoroughly read to achieve sufficient familiarity with them. In step (4), the generic exclusion and inclusion criteria for the instructions and requirements were produced and then applied to the requirements in step (5) to find those relevant to the scope of the thesis. Finally, the selected requirements were evaluated and confirmed in step (6), with results shown in Tables 1, 2, 3, 4, 5, 6, and 7.

The data collection stage in this thesis was performed through document review and interview; the various steps of the data collection are shown in Figure 3. The document review was executed by acquiring documentation concerning the company's development process and becoming familiar with the documents (1) by studying them carefully. Another aspect of the document review stage of the data collection phase is (2) the acquisition and selection of the safety standards. Once the safety standards were read carefully (3), generic exclusion and inclusion criteria for the instructions and requirements were determined (4), which led to their extraction (5) from the safety standards. Finally, the selected requirements were confirmed by the supervisors (6), leading to a set of instructions and requirements within the scope of the thesis, from relevant safety standards, as well as documentation from the company. The process of performing the independent document review in the data collection stage is shown in Figure 3.

After the independent collection of data through document review, the first two steps of analysis, explained further in the following paragraph, were performed. Following these two steps, the data collection was continued through an interview, a direct method for data collection used in this thesis [26]. A semi-structured interview was performed, with questions about the general instructions of the safety standards. The questions also referred to the sub-clauses of the safety standards' instructions for which the development process had a failing compliance degree. The interview was executed in two stages, one with broad questions about the development process and the other with respect to the requirements failed by the provided documentation. Subsequently, the results of the interview were put through the first two steps of the analysis, and then the analysis was continued. A more detailed description of the steps of the data collection stage may be found in Section 7.1.

Another stage of the case study is data analysis. Data analysis is performed by initially (1) tabulating the collected data from the safety standards and company documents and grouping them based on two grouping methods: their application to the V-Model, and abstract keywording. Once all the data was tabulated, and each requirement and data source was assigned to a group, the development process at the company was assessed. The assessment (2) was performed on the company's documentation, based on the extracted and tabulated requirements. The results of the assessment were then presented in tables assigning a compliance degree to each requirement and referring to a justification. Next, the results of the assessment were analysed via their compliance degree, based on the different safety standards, V-Model categories, keyword classification groups, and the two use cases, to find patterns in their evaluation (3). The patterns were then assembled and led to conclusions (3). Following the conclusions, recommendations were made (4) concerning the test process of the control system. The recommendations were then presented to the company and evaluated in the form of a focus group (5). Finally, all the steps of the analysis result in a set of recommendations evaluated by the company, the assessment of the company's development process, and a list of instructions from safety standards that are within the scope of test process assessment in safety-related control systems. A more detailed description of the data analysis, along with a visualisation of its parallelism with the case study method in Figure 5, may be found in Section 7.2.

The final stage of a case study, according to Runeson et al. [28], is reporting the procedure. The reporting stage was done iteratively throughout the entire process of the case study. The report structure and characteristics considered for this thesis were determined according to the guidelines proposed by Runeson et al. [28] and Baškarada [27]. Section 7 (Case Study Design) follows the propositions of Runeson et al. [28], including descriptions of the selection of case and subject and the procedures for data collection, data analysis, and the validity of the study. The collected data and the results of the analysis procedure are presented in Section 8 (Results). Section 9 (Discussions) presents the speculations made from the results of each stage of the analysis, and finally, Section 11 (Conclusions) concludes the thesis by summarising the work and presenting the connections between the results and the research statement.

6 Ethical and Societal Considerations

As this thesis addresses safety-critical systems and is performed as a case study, there are ethical considerations on two fronts. Firstly, the assessment carried out in this thesis must be precise, as its results may affect how the development process is viewed within the company; if the assessment is treated too leniently, it may put future users of the systems at risk. Furthermore, the case study is performed on a very real subject which must remain confidential to preserve the integrity of the company and avoid possible negative consequences [26].

The collection of data in the forms of document review, interview, and focus groups each comes with its own ethical issues. This research involves sensitive data from a large automation company, leading to the need to maintain confidentiality [29]. The thesis assesses and analyses the company's development process, which produces results that could have negative consequences in the public domain. These consequences are avoided by keeping everything confidential, beyond not naming the company or its members [26]. An effort has been made to redact any word or small piece of information that may lead to the exposure of the company's identity or its specific domain [28]. As the thesis has scientific value, the company is motivated to reveal the information to the student, considering unknown future risks [29]. To avoid further ethical issues, the case study was executed with the informed consent of the company. Furthermore, the company has been continuously informed of the progress of the thesis via weekly meetings and has access to all the results, data gathered, and analysis performed, to avoid possible deception of participants [26].

Data collection in the forms of document review and interview may lead to ethical issues concerning consent, board approval, confidentiality, inducement, and feedback [28]. Board approval and informed consent regarding data collection were acquired before the start of the thesis activities. Furthermore, the documents are referred to by codes and types so as not to reveal any specific information. The interview results are redacted from the public report, and the information is only referenced. Another aspect of the ethical considerations in this thesis is performing the (5) Industrial Evaluation step of the data analysis in the form of a focus group. Dealing with ethical issues in focus group research is similar to doing so for data collection through interviews or archival documents [30]. To address common ethical issues, the interview and the focus group session were both performed online in a safe environment to put the participants at ease [31]. The participants were also assured of confidentiality and informed of the recording of the session. Once transcribed, the recordings were deleted to prevent further distribution. The focus group session was kept as unbiased as possible, and the questions leading the session were kept simple, short, and distributed in such a manner as to avoid confusion [31]. In conclusion, with the actions taken to address the ethical considerations, the risks of this thesis are minimal and are outweighed by its benefits: it will help improve the safety of control systems for the company and, through the methods presented, perhaps the safety of other systems as well [26].


7 Case Study Design

The first stage of performing a case study is the case study design and planning. Following the guidelines provided by Runeson et al. [28], the design stage includes a clear definition of the case and its units of analysis. This stage also clarifies the problem formulation, the author's intentions, the presence of data or method triangulation, and the rationale behind the case selection. The design phase also addresses the construct validity of the study, as well as the integrity of the company under study.

Figure 4: The alignment of the case study procedure with the procedures in this thesis

The objective of this case study is of an exploratory and improving nature [28]. The expected result is a set of recommendations leading to a test process that complies with specific safety standards. The case studied for achieving this goal is the development process carried out at a large company for two specific use cases. The frame of reference for the case study is the testing process and safety, meaning that the process is viewed from a safety perspective and the main focus is on the testing process. The questions answered as a result of this case study are found in detail in Section 4 (Problem Formulation). In short, the questions involve recommendations for a test process that complies with safety standards, and the assessment of the current development process at an automation company.

The design of this case study utilises data triangulation as well as method triangulation, and multiple sources are considered for executing the case study. The assessment is performed on two different use cases, as well as four different safety standards. In addition to a literature review, an interview is conducted, increasing the precision of the study by collecting data on different occasions [28]. The rationale behind choosing these standards is to select those which are concerned with safety, clearly address test process assessment and the safety of machinery in control systems, and are highly regarded in their respective domains.

According to Runeson et al. [28], construct validity is to be addressed in the design phase. Construct validity is the determination of the valid relevance of the specified case to the research questions. Research questions RQ1 and RQ2 address the selected case specifically. The selection criteria for the safety standards make them relevant to the scope of this research, as explained in detail in Section 2.2. The relevance of the two use cases chosen from the company to RQ2 is determined by the company itself. The main research question, RQ1, uses the results from the other research question to generate a set of suggestions for a test process compliant with the selected safety standards. The generalisation of the issues in place at the company under study may be done in further research by applying the same process of generating recommendations to other industrial control systems which use the same development process model. The company's integrity is taken into account by removing their specific data and name, avoiding their identification in the public domain.


7.1 Data Collection

The case study design explains the process of preparing for and performing the data collection stage [29]. This thesis acquires the relevant data to analyse through independent document review and an interview. The document review is executed in six steps, shown in Figure 3. Furthermore, after the analysis of the data obtained by document review, conducting an interview confirms the results of the analysis and assessment, constituting the second form of data collection. The data collected from the interview is analysed once more, going through the steps explained in Section 7.2. Each of these data collection methods is further described in this section.

7.1.1 Document Review

The data collection stage in this thesis involves obtaining the appropriate documentation from the company, selecting the standards, and extracting the relevant requirements from the specified safety standards. A visualisation of the different steps of data collection in this thesis is shown in Figure 3. Firstly, the relevant documentation is obtained from the company (1). This step was performed at the beginning by receiving the documentation about the two specific use cases from the company, and then iteratively throughout the process as the need for more documentation was established. The acquisition of company documentation (1) is followed by selecting the desired safety standards (2) according to specific criteria, described in more detail in Section 7.1.1.2. After reading and becoming familiar with the selected safety standards (3), generic criteria for including and excluding clauses and sub-clauses of the safety standards are produced (4). Using the generic inclusion and exclusion criteria, each clause and sub-clause of the safety standards is determined to be included in or excluded from (5) the assessment, with a specific justification per clause or sub-clause and a reference to the generic inclusion or exclusion criteria. Once the extraction of the relevant requirements from each safety standard was finished, these requirements were further evaluated (6) and refined.

The guidelines proposed by Runeson et al. [28] require the data collection to include the specification of triangulation as well as measurement definitions. The guidelines also specify that the gathered evidence must enable further analysis, identify sensitive data, be traceable, and address the research questions. To assure the clarity, transferability, and reliability of the study, data triangulation has been performed [28]. The selection of multiple data sources for this study leads to data triangulation. The data include various documentation provided by the company concerning two different use cases, as well as four different standards concerned with safety and software lifecycles. Considering two use cases rather than one, and including multiple safety standards for the process assessment, gives a larger sample for the case under study, which is the development process in place at the company. Having multiple sources leads to a broader application of the data, which also assists with the transferability of the study. The recorded data also allows further analysis to verify the performance of the case study. As some of the data is sensitive to the organisation, the data is labelled vaguely in the public report and the name of the organisation is not mentioned. The text snippets extracted from the documentation provided by the company are checked to ensure they do not contain information allowing an onlooker to identify the company. Finally, the collected data provides the ability to address the research questions in various ways. RQ1 is answered by the outcomes of RQ2. As RQ2 refers to the assessment of the development process at the company, certain assessment criteria must be specified. The recommendations of a selection of safety standards lead to the assessment criteria; therefore, the principle under which the standards are chosen is important in the data collection phase. The data provided by the company, and their relation to the requirements of the standards, help answer RQ2.
The measurement definitions for the data are the criteria under which the standards and the company documents have been chosen. The measurement procedures for performing the analysis must be specified in the data collection phase [28]. The analysis of the collected evidence is based on the conformance of the provided documentation with the selected instructions and requirements from the specified standards. This conformance is measured on three levels, Pass, Fail, and Partial, referred to as the compliance degree throughout the thesis. The extent to which the company conforms with the standards is measured based on what is specified in approved documentation. If a document has not been approved according to the company's system and is a work in progress, a Partial grade is granted, with comments on what is missing. A Partial grade is also granted when some of the sub-clauses of a requirement clause are met and some are not. Other than in these two instances, the requirements and instructions pass or fail in a binary fashion.

As previously mentioned, the measurement definitions for the data include the criteria under which the standards and the company documents have been chosen. The standards have been selected because they address software testing, development process assessment, and the functional safety of machinery; the selection also takes into account the mandatory standards identified by the company, and that the standards be highly regarded in their respective domains. The documentation was provided by the company, initially according to what was deemed appropriate for the development process of the two use cases. As the process carried on, further documents were added as they were needed for the assessment of the company's process against the chosen requirements and instructions from the selected standards.
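
To make the three-level scale concrete, the sketch below shows one way the Pass/Partial/Fail decision described above could be encoded. It is a minimal illustration, not part of the thesis tooling or the company's process; the field names are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    """Illustrative evidence collected for one requirement (hypothetical fields)."""
    document_approved: bool   # evidence found in an approved company document
    work_in_progress: bool    # evidence only in a draft or unapproved document
    subclauses_met: int       # number of sub-clauses with supporting evidence
    subclauses_total: int     # total number of sub-clauses in the requirement

def compliance_degree(ev: Evidence) -> str:
    """Map collected evidence to the Pass / Partial / Fail scale used in the thesis."""
    if 0 < ev.subclauses_met < ev.subclauses_total:
        return "Partial"      # some sub-clauses are met and some are not
    if ev.work_in_progress and not ev.document_approved:
        return "Partial"      # documentation is planned or drafted but not approved
    if ev.document_approved and ev.subclauses_met == ev.subclauses_total:
        return "Pass"         # explicit evidence in approved documentation
    return "Fail"             # no sufficient evidence of compliance

# Example: three of five sub-clauses are evidenced in approved documents
print(compliance_degree(Evidence(True, False, 3, 5)))  # -> Partial
```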

7.1.1.1 Selection of the Safety Standards

Four standards have been chosen for this thesis: more would broaden the scope of the thesis to an extent that would cause timing issues, and fewer would not be enough to gather sufficient assessment criteria. IEC 61508:2010 addresses the functional safety of electrical/electronic/programmable electronic safety-related systems, and ISO/IEC/IEEE 29119:2013 addresses software and systems engineering (software testing). These two standards are both considered to be among the most popular approaches used in the improvement of software test processes [32][4][19]. As the study at hand focuses on assessing test processes, these two standards were deemed appropriate. The other two standards used are ISO 13849:2016, parts 1 and 2, and ISO/IEC/IEEE 12207:2017, which contain recommendations concerning the safety of machinery and software lifecycle processes. ISO 13849 and ISO/IEC/IEEE 12207 are used by the company and are also among the most prevalent standards used by industry to ensure the safety of industrial control systems [33].

7.1.1.2 Selection of Requirements and Instructions

In addition to the selection of the actual standards, the relevant instructions and requirements from the specified standards must also be extracted. As the scope of this thesis focuses on test processes, and assessment of the company’s overall process in terms of verification and validation, not all the requirements from all the standards have been chosen for this assessment. The list below gives an overall view of the limitations considered for removing certain clauses of the standards from the assessment criteria used in this thesis.

Generic Exclusion Criteria

1. Some parts of the standards must be considered while performing the assessment, but they are not selected as requirements for assessment. For example, the Conformance clauses are considered in detail, as they are vital for understanding how the compliance of a process with the requirements of a standard shall be assessed; however, such clauses are marked as "not included" in the tables. Conformance clauses explain how to read the standards and do not include instructions or requirements.

2. As the thesis is concerned specifically with the two use cases, and not the entire organisation, requirements for certain processes do not apply to the scope of this thesis.

3. All stakeholder interactions are ignored in this thesis, since (according to the company) customer requirements are at a very high level, and the company writes the software requirements to support the development process. Communications with stakeholders, and between members of the organisation, are not in the scope of this thesis and therefore are not considered in the assessment.


4. The scope is limited to the software parts, not the mechanical, electrical, or hardware aspects. The requirements in some clauses are not within the software development scope and apply to the machine as a whole; therefore, they are not considered in this thesis.

5. Some requirements and instructions do not concern the test process or the software development process of the safety functions and are therefore out of the scope of this thesis.

6. As some clauses are references to other clauses that have already been included, or are references to other standards not in the scope of this thesis, they are marked as not included.

Generic Inclusion Criteria

1. Clauses involving processes that are directly part of the V-Model, or that are essential for assessing the software development process of a safety-related system, are included. This involvement includes, but is not limited to, processes and clauses concerned with the development process in terms of, among other things, documentation and validation.

2. As this thesis is concerned with safety-related control systems, the instructions for functional safety management and software safety, the specifications for the safety functions, the design of the safety-related parts, and the validation of safety-related software are considered.

3. Given that the scope of the thesis addresses test processes, requirements and instructions regarding test activities, the documentation of test activities (e.g. the test plan), and validation by testing are considered.

A complete list of what has been chosen from each standard, what has not, and the reason for each choice can be seen in the following tables: Table 1 (IEC 61508-1:2010), Table 2 (IEC 61508-3:2010), Table 3 (ISO/IEC/IEEE 29119-2:2013), Table 4 (ISO/IEC/IEEE 29119-3:2013), Table 5 (ISO 13849-1:2016), Table 6 (ISO 13849-2:2012), and Table 7 (ISO/IEC/IEEE 12207:2017). The tables below only show whether an entire clause is included completely, partially, or not at all. For clauses and sub-clauses that are partially included, their items are extracted in more detail in Appendix A. Please keep in mind that these tables are only informative about which instructions have been chosen as part of the assessment criteria in this thesis.
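
To illustrate how the generic criteria and the tables fit together, the sketch below models one inclusion decision per clause and filters out the clauses that feed the assessment criteria. The data structure and the two example rows are illustrative assumptions, not an extract of the thesis artefacts.

```python
from dataclasses import dataclass

@dataclass
class ClauseDecision:
    """One row of the inclusion tables (Tables 1-7); the field names are illustrative."""
    standard: str
    clause: int
    title: str
    status: str          # "Included", "Excluded", or "Partial"
    criterion: str       # reference to a generic inclusion/exclusion criterion
    justification: str

# Hypothetical rows mirroring the shape of Table 1 (IEC 61508-1:2010)
decisions = [
    ClauseDecision("IEC 61508-1:2010", 1, "Scope", "Excluded",
                   "Exclusion Criterion 1",
                   "Read and used, but contains no extractable requirements."),
    ClauseDecision("IEC 61508-1:2010", 5, "Documentation", "Included",
                   "Inclusion Criteria 1 & 2",
                   "Instructions for documentation required for safety-related software."),
]

# Only Included and Partial clauses contribute to the assessment criteria
assessment_criteria = [d for d in decisions if d.status in ("Included", "Partial")]
print([d.title for d in assessment_criteria])  # -> ['Documentation']
```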


Clause 1 (Scope): Excluded. This clause does not include instructions or assessment criteria; it is read and used but contains no extractable requirements, so it is not included in the assessment criteria. Exclusion Criterion 1.
Clause 2 (Normative References): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 3 (Definitions and abbreviations): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 4 (Conformance to this standard): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 5 (Documentation): Included. All of clause 5 is relevant, as it gives instructions for the documentation required for the software development of a safety-related system. Inclusion Criteria 1 & 2.
Clause 6 (Management of functional safety): Included. All of clause 6 is relevant, as it gives instructions for the proper management of functional safety in software development for a safety-related control system. Inclusion Criterion 2.
Clause 7 (Overall safety lifecycle requirements): Excluded. The requirements in this clause are not within the software development scope and apply to the machine as a whole, so they are not considered in this thesis. Exclusion Criterion 4.
Clause 8 (Functional safety assessment): Included. The objective of this clause is to specify the activities necessary to investigate and arrive at a judgement on the adequacy of the functional safety achieved by the E/E/PE safety-related system(s) or compliant items, which makes it relevant to the scope of the thesis. Inclusion Criterion 2.

Table 1: The list of requirements extracted from the safety standard IEC 61508-1:2010 [11]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included).


Clause 1 (Scope): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 2 (Normative References): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 3 (Definitions and abbreviations): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 4 (Conformance to this standard): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 5 (Documentation): Excluded. This clause simply refers to clause 5 of IEC 61508-1, which is already included. Exclusion Criterion 6.
Clause 6 (Additional requirements for management of safety-related software): Included. Clause 6 consists of instructions regarding software safety, making it fully relevant to the thesis. Inclusion Criterion 2.
Clause 7 (Software safety lifecycle requirements): Included. Instructions regarding the software safety lifecycle are all relevant, as the thesis intends to assess the software development process at the company. Inclusion Criteria 1 & 2.
Clause 8 (Functional safety assessment): Excluded. References clause 8 of IEC 61508-1. Exclusion Criterion 6.

Table 2: The list of requirements extracted from the safety standard IEC 61508-3:2010 [13]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included).


Clause 1 (Scope): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 2 (Conformance): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 3 (Normative References): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 4 (Terms and Definitions): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 5 (Multi-Layer Test Process Model): Excluded. Read and used, as it explains the process model utilised, but contains no extractable requirements. Exclusion Criterion 1.
Clause 6 (Organizational Test Process): Excluded. The organizational test process is used to develop and manage organizational test specifications; as this thesis is concerned with only two use cases and not the entire organisation, this clause does not apply. Exclusion Criterion 2.
Clause 7 (Test Management Processes): Partial. Some of the sub-clauses are included and some are not; a more detailed version of this table can be viewed in Table 14.
Clause 8 (Dynamic Test Processes): Partial. Some of the sub-clauses are included and some are not; a more detailed version of this table can be viewed in Table 14.

Table 3: The list of requirements extracted from the safety standard ISO/IEC/IEEE 29119-2 [15]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included). A more detailed list of the requirements in this standard is provided in Appendix A, Table 14.


Clause 1 (Scope): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 2 (Conformance): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 3 (Normative References): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 4 (Terms and Definitions): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 5 (Organizational Test Process): Excluded. The organizational test process is used to develop and manage organizational test specifications; as this thesis is concerned with only two use cases and not the entire organisation, this clause does not apply. Exclusion Criterion 2.
Clause 6 (Test Management Processes Documentation): Included. Directly related to the thesis, as it is an essential requirement for the sub-processes of an appropriate test process. Inclusion Criterion 3.
Clause 7 (Dynamic Test Processes Documentation): Included. Directly related to the thesis, as it is an essential requirement for the sub-processes of an appropriate test process. Inclusion Criterion 3.

Table 4: The list of requirements extracted from the safety standard ISO/IEC/IEEE 29119-3 [16]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included).


Clause 1 (Scope): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 2 (Normative references): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 3 (Terms, definitions, symbols, and abbreviated terms): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 4 (Design considerations): Partial. Some sub-clauses are included and some are not, leading to this clause's partial inclusion; a more detailed explanation may be found in Table 15.
Clause 5 (Safety functions): Partial. This clause provides a list and details of the safety functions which can be provided by the SRP/CS; the designer (or type-C standard maker) shall include those necessary to achieve the measures of safety required of the control system for the specific application. As the scope of this thesis is limited to software artefacts, only those are considered. The inclusion and exclusion of this clause's sub-clauses can be found in detail in Table 15.
Clause 6 (Categories and their relation to MTTFD of each channel, DCavg and CCF): Excluded. Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criterion 4.
Clause 7 (Fault consideration, fault exclusion): Excluded. Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criterion 4.
Clause 8 (Validation): Excluded. References ISO 13849-2:2012; the clause does not include any requirements in this part of the standard. Exclusion Criterion 6.
Clause 9 (Maintenance): Excluded. Not related to software; the scope of this thesis is limited to software artefacts. The standard ISO 12100:2010 is referenced in this section, and as that standard is not part of the scope of the thesis, this clause has been disregarded. Exclusion Criteria 5 & 6.


Clause 10 (Technical documentation): Included. The technical documentation that must be produced during the design of a safety-related machine is directly applicable to the development process of the use cases, making this clause relevant. Inclusion Criterion 1.
Clause 11 (Information for use): Excluded. Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criteria 2, 4, & 5.

Table 5: The list of requirements extracted from the safety standard ISO 13849-1:2016 [17]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included). A more detailed list of the requirements in this standard is provided in Appendix A, Table 15.

Clause 1 (Scope): Excluded. Read and used, but contains no extractable instructions. Exclusion Criterion 1.
Clause 2 (Normative references): Excluded. Read and used, but contains no extractable instructions. Exclusion Criterion 1.
Clause 3 (Terms, definitions, symbols, and abbreviated terms): Excluded. Read and used, but contains no extractable instructions. Exclusion Criterion 1.
Clause 4 (Validation process): Partial. Some of the sub-clauses are included and some are not; for more details refer to Table 16.
Clause 5 (Validation by analysis): Excluded. Validation by analysis does not concern the test process or the software development process of the safety functions and is therefore out of the scope of this thesis. Exclusion Criterion 5.
Clause 6 (Validation by testing): Included. Validation by testing is concerned with the test process of safety-related systems, so this clause is highly relevant and is taken into consideration. Inclusion Criteria 1 & 3.
Clause 7 (Validation of safety requirements specification for safety functions): Included. This clause is concerned with the reviewing processes in the software development lifecycle of a safety-related system and is therefore relevant to this thesis. Inclusion Criteria 1 & 2.


Clause 8 (Validation of safety functions): Excluded. The requirements for the validation of safety functions in this standard cover both software and hardware functions, as well as testing, so the sections applicable to the machine as a whole or to hardware specifications are disregarded. The part concerning software references sub-clause 9.5 of the standard; since that clause is included, including this clause would be redundant. Exclusion Criteria 4 & 6.
Clause 9 (Validation of performance levels and categories): Partial. Some of the sub-clauses are concerned with the software parts of the safety-related system and some are not; for more details refer to Table 16.
Clause 10 (Validation of environmental requirements): Excluded. The validation of environmental requirements is not concerned with software artefacts and is therefore out of the scope of this thesis. Exclusion Criterion 4.
Clause 11 (Validation of maintenance requirements): Excluded. The validation of maintenance requirements is not concerned with software artefacts and is therefore out of the scope of this thesis. Exclusion Criterion 5.
Clause 12 (Validation of technical documentation and information for use): Excluded. This clause is concerned with many different aspects of the system, only one of which is software documentation; as there are no specific instructions in this clause regarding software documentation (this is covered elsewhere), the clause is not of concern in this thesis. Exclusion Criteria 2, 4, & 5.

Table 6: The list of requirements extracted from the safety standard ISO 13849-2:2012 [18]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included). A more detailed list of the requirements in this standard is provided in Appendix A, Table 16.


Clause 1 (Scope): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 2 (Normative References): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 3 (Terms, definitions, and abbreviated terms): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 4 (Conformance to this standard): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 5 (Key concepts and application): Excluded. Read and used, but contains no extractable instructions or assessment criteria. Exclusion Criterion 1.
Clause 6 (Software life cycle processes): Partial. Some processes are included and some are not, leading to the partial inclusion of this clause; for more details refer to Table 17.

Table 7: The list of requirements extracted from the safety standard ISO/IEC/IEEE 12207:2017 [5]. Each row represents a clause of the standard, its inclusion status, the justification for that status, and a reference to a generic inclusion or exclusion criterion. The inclusion status of each clause or sub-clause is given as Included, Excluded, or Partial (partially included). A more detailed list of the requirements in this standard is provided in Appendix A, Table 17.

Reviewing, tabulating, and assessing the archival data led to results that raised questions about compliance with certain requirements and revealed a lack of some documentation. An interview was therefore conducted after the first two steps of data analysis had been performed on the documents, to clarify these results. The interview helped improve and, where necessary, change the compliance degrees obtained from the document review.

7.1.2 Interview

Once the independent data collection is done, the collected data is put through the first two steps of data analysis. These two steps, tabulation and assessment, are explained in detail in Section 7.2; they result in a series of requirements from the safety standards, each with a specific compliance degree. Based on the assessment results, a semi-structured interview was constructed and performed to confirm the assumptions made about missing documentation and procedures. The data collected from the interview led to the company partially complying with many clauses due to a lack of complete documentation, as opposed to failing them completely.

The interview is semi-structured as it includes a mixture of open and closed questions [28]. The closed questions pointed specifically to certain failing sub-clauses of the instructions, extracted from the assessment results. The open questions were created based on larger clauses that were determined to fail based on the assessment of the provided documentation. The interview was performed in two phases, as it was rather long. Both phases of the interview were transcribed, and the results were added to the data collection tables as an extra data source, in addition to the company's documentation. The data collected from the interview, along with the company documentation, is kept confidential to safeguard the integrity of the company. The questions asked in the interview can be found in Appendix B.

7.2 Data Analysis

The data analysis for the case study in this thesis is conducted in a total of five steps, each inspired by the steps of qualitative analysis methods [34]. The steps are (1) Tabulating the Data, (2) Assessment, (3) Analysis, (4) Recommendations, and finally (5) Industrial Evaluation. The visualisation of these steps, and their parallelism with the case study procedure proposed by Runeson et al. [28], is shown in Figures 4 and 5.

Figure 5: Alignment of the procedure for this study with the case study procedure, as well as with the analysis process using the content analysis technique [34]
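
As a reading aid, the sketch below strings the five analysis steps together as functions. It is a purely illustrative outline of the flow described in this section; every function body is a hypothetical placeholder for the manual activities detailed in Sections 7.2.1 to 7.2.5, not an implementation used in the thesis.

```python
# Hypothetical placeholders for the five analysis steps; the return values are
# simplified stand-ins for the tables, grades, and recommendations produced manually.

def tabulate(requirements, company_data):            # (1) Tabulating the Data
    """Group requirements and data sources by V-Model category and keyword class."""
    return {"requirements": requirements, "sources": company_data}

def assess(tabulated):                                # (2) Assessment
    """Assign a Pass/Partial/Fail compliance degree and a justification per requirement."""
    return [{"id": r, "degree": "Partial", "justification": "draft documentation only"}
            for r in tabulated["requirements"]]

def analyse(assessed):                                # (3) Analysis
    """Look for patterns in the compliance degrees across standards and categories."""
    return ["test documentation is often only partially compliant"]

def recommend(patterns):                              # (4) Recommendations
    return ["complete and approve the test plan documentation"]

def industrial_evaluation(recommendations):           # (5) Industrial Evaluation
    """Refine the recommendations using focus group feedback."""
    return {"refined": recommendations}

result = industrial_evaluation(recommend(analyse(assess(tabulate(["R-1"], ["DOC-A"])))))
print(result["refined"])
```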

The first step, Tabulating the Data ((1) in Figure 5), comprises categorising all the instructions proposed by the standards based on their application in the V-Model, as well as on keywords. Once the grouping of the standards' instructions is complete, the data provided by the company are grouped into the same categories, in the form of references to various parts of the provided documentation or the information gathered in the interview.

Once the data is tabulated, the Assessment step ((2) in Figure 5) commences. At this point there exist groups of the standards' instructions as well as items from the company documentation, in the form of text snippets, to assess against the standards. The instructions in the standards are gone through one by one, and the relevant parts of the use cases' processes are assessed accordingly. The results of this assessment are recorded as text snippets to be analysed in the following steps. Each instruction of the standards results in a pass, fail, or partial grade awarded to the company's process, plus a justification for the awarded grade.


Now that there exists a complete collection of passed, failed, and partially passed requirements for the use cases' processes, the Analysis step ((3) in Figure 5) is performed. This step comprises analysing the results of the assessment based on the compliance degree of each requirement. Possible trends from this analysis can be pointed out and investigated to understand the reasons for the compliance degrees of the standard instructions. Furthermore, this step aims to use connections and reasoning to draw conclusions as to which requirements proposed by the safety standards are commonly met and which are not. In this step, the differences between the two use cases, the different safety standards, and the different categories are analysed in terms of their compliance.

Making Recommendations ((4) in Figure 5) is the fourth step taken in this study. Based on the conclusions of the third step, a set of recommendations for the test process of the company is made. These recommendations focus less on the parts that already conform and more on the non-compliant parts, to make the whole process more compliant with the standards in general. Once the recommendations are made, they must be evaluated to ensure their usefulness and practicality. The fifth step, Industrial Evaluation ((5) in Figure 5), involves conducting a session with a focus group from the company, in which the recommendations are presented and discussed. The results of the focus group lead to an evaluation of the recommendations.

Finally, the desired results of the case study are given at the end of this thesis. Once steps one through five are done, there exists a collection of what the safety standards recommend in terms of testing and validation, giving the assessment criteria that help answer the second research question (RQ2). An evaluation of the extent of the company's compliance with the specific standards, together with justifications for the compliance or lack thereof, resolves the second research question (RQ2). Using the assessment and analysis results, and ending up with a collection of recommendations that help comply with the safety standards, answers the first and main research question (RQ1).

7.2.1 Tabulating the Data

The data analysis phase of this thesis starts with (1) Tabulating the Data. During this step, the extracted safety instructions, the company's documentation, and the interview transcription are tabulated. The data sources and instructions are then categorised in two ways. The first categorisation is based on the application of the item in the V-Model. The second form of classification uses abstract keywording [35]. From this point forward, the two categorisation schemes are referred to as V-Model Categorisation and Keyword-Classification.


Figure 6: The groupings done to tabulate the data. Data Tabulation has been done based on two grouping schemes: Keyword Classification and V-Model Categorisation. Keywords and contexts that will lead to data falling under each group are also shown in the fourth layer of this figure.


The Keyword-Classification scheme has 15 categories and was created through abstract keywording [35]. The classes in this scheme are (1.1) Documentation, (1.2) Management, (1.3) Analysis, (1.4) Safety, (1.5) Performance, (1.6) Configuration, (1.7) Verification and Validation, (1.8) Modification, (1.9) Design, (1.10) Implementation, (1.11) Architecture, (1.12) Module Test, (1.13) Integration Test, (1.14) Requirements, and (1.15) Test Documentation. The process of deriving these categories started from the documentation provided by the company. Based on an understanding of the content and context of each document, keywords were extracted. The abstract keywording of the documents led to many keywords, each corresponding to a group relevant to the process assessment of the company. The keywords were clustered based on relevance, meaning, and their use within the documents, and the clusters matured into classes. Figure 6 shows the result of the clustering of the keywords. Once the classification scheme was established from the clustering of keywords, the extracted requirements from the safety standards were classified into the categories by keyword extraction: if the text representing a requirement included one of the keywords, it was classified into the corresponding category of the Keyword-Classification scheme.

The V-Model categorisation scheme has ten categories selected from the propositions of ISO/IEC/IEEE 12207:2017 [5]. ISO/IEC/IEEE 12207:2017 divides the software lifecycle processes into four groups: Agreement Processes, Organisational Project-Enabling Processes, Technical Management Processes, and Technical Processes. With attention to the problem formulation of this thesis, the scope only applies to the technical processes. The technical processes from ISO/IEC/IEEE 12207:2017 are as follows [5]: (1) Business or Mission Analysis process, (2) Stakeholder Needs and Requirements Definition process, (3) System/Software Requirements Definition process, (4) Architecture Definition process, (5) Design Definition process, (6) System Analysis process, (7) Implementation process, (8) Integration process, (9) Verification process, (10) Transition process, (11) Validation process, (12) Operation process, (13) Maintenance process, and (14) Disposal process. By considering the above categories, the V-Model demonstrated in Figure 1, and the simplified V-Model from the safety standard ISO 13849-1:2016 [17], ten categories were extracted. The ten categories all refer to the different processes within the V-Model which are related to testing in various ways. The ten categories, with their application to the test process, are as follows: (2.1) Requirements Definition Process, which leads to acceptance testing [8]; (2.2) Architecture Definition Process, which influences integration tests [36]; (2.3) Design Definition Process and (2.4) System Analysis Process, related to system tests [36]; (2.5) Implementation Process, which along with program specifications leads to unit tests [8]; (2.6) Integration Process, accompanying integration tests; (2.7) Verification and (2.8) Validation Processes, which cover all testing; (2.9) Maintenance Process, which includes regression tests; and finally (2.10) Overall Software Safety, which refers to the overall management, planning, and processes that assist in the creation and execution of a test process.
Based on the content and the structure of the sections of the safety standards, one can determine to which of these categories each clause or sub-clause of a safety standard belongs.
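
The keyword-based classification described above can be pictured as a simple lookup. The sketch below is a hedged illustration: the keyword lists are shortened, invented examples loosely based on the class names in Figure 6, not the actual keyword sets used in the thesis.

```python
# Illustrative (abridged) keyword sets; the real clustering in the thesis was done manually.
KEYWORD_CLASSES = {
    "1.12 Module Test": ["module test", "unit test"],
    "1.13 Integration Test": ["integration test", "software integration"],
    "1.7 Verification and Validation": ["verification", "validation", "test case"],
    "1.15 Test Documentation": ["test plan", "test report"],
}

def classify(requirement_text: str) -> list[str]:
    """Return every Keyword-Classification category whose keywords appear in the text."""
    text = requirement_text.lower()
    return [category for category, keywords in KEYWORD_CLASSES.items()
            if any(keyword in text for keyword in keywords)]

print(classify("The software integration test plan shall be documented and reviewed."))
# -> ['1.13 Integration Test', '1.15 Test Documentation']
```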

7.2.2 Assessment

The second step of the data analysis in this thesis is (2) Assessment. The assessment stage consists of evaluating the compliance of the development process at the company with each extracted safety standard requirement. This step uses the instructions and requirements extracted in Section 7.1 to assess the company's development process. According to the categorisations made in Section 7.2.1, it is determined which documents to go through to decide whether the company's process complies with a specific instruction. The instructions, in the form of clauses and sub-clauses, are then marked as passed, failed, or partially complied with by the company, and the justification for this level, the compliance degree, is given per instruction. The compliance degree of an instruction is determined by how well the data sources demonstrate compliance with the selected sub-clause. If all aspects of a clause or sub-clause are clearly and explicitly met by the documentation, and evidence of this compliance can be visibly pointed to, the instruction passes. If the documentation points to plans, with specific timelines, to fulfil the requirement, or the interview indicates that the tasks are being followed but documentation providing evidence of such tasks is lacking, the sub-clause partially passes. If there is not sufficient evidence, through the documentation or the interview, supporting the company's compliance with the instruction, the sub-clause fails.


The results of the second step of the data analysis, (2) Assessment, take the form of a table. The table has the IDs of the sub-clauses selected during data collection (Section 7.1) in one column, their titles in another, the compliance degree in a third column, and the justification for the compliance degree in the final column. The justifications are texts referring to the Data Source IDs giving evidence for the compliance degree. The compliance degree is given as Pass, Partial, or Fail. The results of this step are presented in Section 8.2.
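
For illustration, one row of such a results table could be represented as below. The identifiers and the justification text are invented for the example and are not taken from the company data or the actual assessment.

```python
# A hypothetical assessment-results row; IDs and text are placeholders, not thesis data.
row = {
    "Sub-clause ID": "REQ-A-07",
    "Title": "Software test planning",
    "Compliance degree": "Partial",
    "Justification": "Test plan exists only as a draft; see Data Sources DS-04 and DS-07.",
}

print(" | ".join(f"{key}: {value}" for key, value in row.items()))
```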

7.2.3 Analysis

The third step of the data analysis stage of this case study is (3) Analysis. This step consists of analysing the results of the assessment step, presented in Section 8.2. The analysis is of a quantitative nature, since it uses different types of graphs to find trends in the number and percentage of the instructions complied with by the company's development process. The analysis is performed by comparing the numbers of passed, failed, and partially passed instructions in terms of different criteria: the compliance is compared across the V-Model Categorisation, the Keyword-Classification, the different standards, and the two use cases. The results of this analysis are demonstrated in Section 8.3 and are further discussed in Section 9.
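
The quantitative comparison boils down to counting compliance degrees per grouping and expressing them as percentages. The sketch below shows this calculation on invented records; the category names follow the V-Model categorisation, but the data are assumptions made for the example.

```python
from collections import Counter

# Invented (category, compliance degree) pairs standing in for the assessment results.
assessed = [
    ("2.7 Verification Process", "Pass"),
    ("2.7 Verification Process", "Fail"),
    ("2.8 Validation Process", "Partial"),
    ("2.8 Validation Process", "Fail"),
    ("2.8 Validation Process", "Fail"),
]

def compliance_share(records, category):
    """Return the percentage of Pass/Partial/Fail grades within one category."""
    degrees = [degree for cat, degree in records if cat == category]
    counts = Counter(degrees)
    return {degree: round(100 * count / len(degrees)) for degree, count in counts.items()}

print(compliance_share(assessed, "2.8 Validation Process"))  # -> {'Partial': 33, 'Fail': 67}
```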

7.2.4 Recommendation

Once the analysis of the assessment results is performed, step (4), Making Recommendations, takes place. The results of the analysis point out areas that may need more concrete documentation or attention from the company. The analysis results are based on the entire development process at the company; however, the recommendations are made for the test process alone. As the test process is dependent on many different parts of the V-Model, the analysis of the development process helps lead to more insightful recommendations for the test process. Initially, the requirements categorised under (1.12) Module Test, (1.13) Integration Test, (1.7) Verification and Validation, and (1.15) Test Documentation in the Keyword-Classification scheme were extracted. These requirements were studied in detail, and those with a compliance degree of Fail or Partial were grouped. This grouping gave a set of general requirements drawn from all the different standards. The general requirements were then summarised and put together as a checklist to be provided to the company. The set of recommendations made for improving the test process at the company can be found in Section 8.4.
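
The selection rule behind the recommendations, pulling out test-related requirements that did not fully pass and turning them into checklist items, can be sketched as below. The requirement records are invented examples, not extracts from the assessment.

```python
# Hypothetical assessed requirements; only the filtering rule mirrors the text above.
TEST_CATEGORIES = {"1.12 Module Test", "1.13 Integration Test",
                   "1.7 Verification and Validation", "1.15 Test Documentation"}

assessed_requirements = [
    {"id": "R-01", "category": "1.15 Test Documentation", "degree": "Fail",
     "summary": "maintain an approved test plan per safety function"},
    {"id": "R-02", "category": "1.4 Safety", "degree": "Pass",
     "summary": "define the safety requirements specification"},
    {"id": "R-03", "category": "1.13 Integration Test", "degree": "Partial",
     "summary": "record integration test results and deviations"},
]

checklist = [f"[ ] {req['summary']} ({req['id']})"
             for req in assessed_requirements
             if req["category"] in TEST_CATEGORIES and req["degree"] in {"Fail", "Partial"}]

for item in checklist:
    print(item)
# [ ] maintain an approved test plan per safety function (R-01)
# [ ] record integration test results and deviations (R-03)
```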

7.2.5 Industrial Evaluation

Industrial Evaluation (5) is the fifth step of data analysis in this thesis. This step is performed to evaluate the results of the assessment and the recommendations with regard to the needs of the company. To perform such an evaluation, the focus group method is used. The focus group method may be used in the evaluation phase of a study to help with receiving feedback and supporting data [37]. To use the focus group method, the research problem is first formulated, followed by preparation and planning, recruitment and selection of participants, and finally the conduct of the session [38]. The session was recorded; once the recording was transcribed, the transcription underwent thematic analysis [39]: it was reviewed and coded, and themes emerged [34]. The process of performing the industrial evaluation in the form of a focus group is represented in Figure 7.


Figure 7: The process followed to perform industrial evaluation using the focus group research method. This process follows a combination of the instructions put forth by Kontio et al. [37] and Breen [31] for conducting a focus group. The instructions put forth by Vaismoradi et al. [34] have been used for the thematic analysis process.

A simplified, small-scale version of a focus group is used to evaluate the assessment and recommendation results. To formulate the research problem, the following questions are posed:

FGQ1. How do industrial practitioners view these assessment results?
FGQ2. How do industrial practitioners view these recommendations?
FGQ3. How can these recommendations support the improvement of the current testing process?

Following the formulation of the questions to be answered through the focus group, preparation and planning for the session were done. In preparation, a presentation containing the necessary information to demonstrate the assessment results, the analysis of the assessment, and the resulting recommendations was created. Meanwhile, a group of engineers from the company was recruited to participate. The session was finally conducted online with this group of engineers. The Q&A session after the presentation was used for gathering answers to the questions. The session was moderated through a discussion about the assessment method, the assessment and analysis results, and finally the recommendations. At certain points, more detailed questions, presented in Appendix E, helped to guide the course of the discussion, thereby moderating the session. The session was recorded and transcribed, and the transcription was then put through thematic analysis to identify the important points made [39]. The results achieved from this focus group are presented in Section 8.5, along with the refined recommendations.

7.3 Threats to Validity

As instructed by the guidelines proposed by Runeson and Höst [28], the threats to validity are to be addressed when designing the case study. Firstly, construct validity is considered. Construct validity refers to making sure the research questions are addressed sufficiently throughout this research [28][26]. Next, internal validity is considered, concerning the plausible demonstration of a relationship between what has been done and the resulting outcome [26]. Threats to the external validity of the study concern the level of generalisability of the results of the thesis.


As the procedure followed for performing the thesis is directly linked to the research questions presented in Section 4, construct validity is considered. The research question RQ1 aims to find recommendations for improving the safety compliance of the test process in place at an example of an industrial control system. The company selected as the subject of the case study has provided two use cases as examples of industrial control systems. Furthermore, the recommendations presented in Section 8.4 are a clear and direct answer to this question. RQ2 investigates the extent of compliance of the current development process of a company with specific safety standards. The safety standards were specified in Section 7.1.1.1, and the company being studied is a large automation company in Sweden. The results achieved and discussed in Section 8.2 provide an answer to RQ2, contributing to construct validity.

Threats to internal validity include changes that occur while the study is in progress [26]. To mitigate such threats, the documentation provided by the company was continuously supplied as it was updated. However, it is worth mentioning that the last version of the documentation studied was that of 1st May 2021, so any updates made after this date were not considered in the analysis of data. As the case study does not involve rivalry, such threats are not considered. Furthermore, the growth and evolution of the development process at the company have been continuously observed through the updated documentation, but other aspects that may affect this growth have not been considered. Threats to internal validity may also include a poor selection of documentation, which was avoided to the best of the student's ability by the continuous acquisition of documents. Moreover, ambiguity about the causality of the compliance degrees of the development process with the requirements must be considered [26]. As assessing such an advanced development process against these safety standards requires qualifications and experience to understand both the development process and the safety standards, there is a considerable possibility of misunderstandings. This threat was mitigated by having continuous meetings with experienced persons and supervisors; however, more meetings with experts in safety assurance could further strengthen internal validity.

The assessment results and recommendations in this study cannot be generalised beyond the studied case; however, they can be replicated for a similar case, with a possibility of different outcomes. Furthermore, regarding the external validity of the study, the method generated as a result of answering the two research questions is generic, has detailed descriptions for every step, and is easily adaptable to other systems. This adaptability supports the generalisability of the method. Further application of the method in other cases would strengthen the claim of its generalisability, mitigating the threat to external validity. As the findings are specific to the case studied under a very specific context, external validity has not been achieved with regard to the answers to the research questions [26]. Future studies may apply the methods used in this study to more industrial control systems across a wider range of domains to achieve generalised results. By transferring the results of this study to other systems and companies, the generalisability of this thesis can be improved.

8 Results

The results of the data analysis phase of the case study performed for this thesis have been divided into sections, each representing the results of one step of the data analysis. This partitioning has been made to ease the reading of the results. Initially, the results of Tabulating the Data ((1) in Figure 5) are presented in Section 8.1, showing the Keyword-Classification and V-Model Categorisation of the requirements extracted from the safety standards, as well as of the data sources from the company. Section 8.2 presents the Assessment ((2) in Figure 5) results. The assessment results are demonstrated in a condensed version with respect to the requirements extracted, referencing the codes of the data sources from the company and the compliance degree, along with a justification for that degree. The results of the Analysis step ((3) in Figure 5) take the form of different charts, showing the compliance degree of the data sources with respect to each safety standard, the V-Model categories, the Keyword-Classification classes, and each use case. The charts are explained and presented in Section 8.3. Recommendations ((4) in Figure 5) are made in the form of a checklist, along with a glossary table, and are presented in Section 8.4. The next step of the data analysis is Industrial Evaluation ((5) in Figure 5), which uses a focus group to evaluate the recommendations, assessment, and analysis results.

The results of this evaluation are presented in Section 8.5. Finally, all the results of these steps are put together, providing a final result for the data analysis.

8.1 Tabulation of Data

As further explained in Section 7.2.1, the first step of data analysis performed was Tabulating the Data ((1) in Figure 5). Data tabulation was performed to categorise the assessment criteria and the data sources from the company, easing the assessment process and leading to better analysis results. The results of tabulating the data show the clauses selected from the safety standards for the assessment, categorised into the Keyword-Classification scheme and the V-Model Categorisation categories. Table 8 demonstrates the classification of the requirements from the safety standards. Justifications for the requirements belonging to the categories of the V-Model Categorisation scheme come from their titles, their place within the standard, and their content. In a similar manner, the results of grouping the data sources into the Keyword-Classification scheme and the V-Model Categorisation are shown in Tables 10 and 9, respectively. Appendix C supplies a guide to the data sources. As each data source may belong to more than one group, these tables differ in structure from the results of tabulating the safety standard instructions. The keywords justifying each data source's membership in the classes and categories have been redacted for confidentiality reasons.

Standard | Req Num | V-Model Categorisation | Keyword Classification | Keyword
61508-1 | 5 | Overall Software Safety | Documentation | Documentation
61508-1 | 6 | Overall Software Safety | Management | Functional Safety Management
61508-1 | 8 | Overall Software Safety | Analysis | Functional Safety Assessment
61508-3 | 6 | Overall Software Safety | Management | Management of Safety-related Software
61508-3 | 7.1 | Overall Software Safety | Safety | Safety Lifecycle
61508-3 | 7.2.2.2 | Requirements Definition Process | Management | Safety planning
61508-3 | 7.2.2.3 | Requirements Definition Process | Safety | Safety integrity
61508-3 | 7.2.2.4 | Requirements Definition Process | Analysis | Failure analysis
61508-3 | 7.2.2.5 | Requirements Definition Process | Analysis | Evaluation for adequate specification
61508-3 | 7.2.2.6 | Requirements Definition Process | Safety | Detailed specification of modes of operation and equipment in Safety-related sw
61508-3 | 7.2.2.7 | Integration Process | Safety | Relevant constraints between the hardware and the software
61508-3 | 7.2.2.8 | Requirements Definition Process | Performance | Complexity
61508-3 | 7.2.2.9 | Requirements Definition Process | Safety | Identification of non-Safety functions by Safety functions
61508-3 | 7.2.2.10 | Requirements Definition Process | Safety | Safety properties of the product
61508-3 | 7.2.2.11 | Requirements Definition Process | Configuration | Requirements regarding configuration data


61508-3 | 7.2.2.12 | Verification Process | Implementation | Performance characteristics
61508-3 | 7.2.2.13 | Verification Process | Configuration | Operational parameters
61508-3 | 7.3.2.1 | Validation Process | Management | Planning
61508-3 | 7.3.2.2 | Validation Process | Management | Validation plan
61508-3 | 7.3.2.3 | Validation Process | Documentation | Technical strategy
61508-3 | 7.3.2.4 | Validation Process | Management | Validation plan
61508-3 | 7.3.2.5 | Validation Process | Verification and Validation | Validation
61508-3 | 7.4.2.1 | Design Definition Process | Management | Safety Planning
61508-3 | 7.4.2.2 | Design Definition Process | Safety | Safety integrity level
61508-3 | 7.4.2.3 | Design Definition Process | Modification | Safe Modification
61508-3 | 7.4.2.4 | Design Definition Process | Modification | Modification
61508-3 | 7.4.2.5 | Design Definition Process | Design | Design representation
61508-3 | 7.4.2.6 | Design Definition Process | Safety | Safety-related parts
61508-3 | 7.4.2.7 | Design Definition Process | Safety | Safety integrity level
61508-3 | 7.4.2.8 | Design Definition Process | Safety | Safety and non-Safety-related parts
61508-3 | 7.4.2.9 | Design Definition Process | Safety | Safety integrity level
61508-3 | 7.4.2.10 | Design Definition Process | Safety | Safety integrity level
61508-3 | 7.4.2.11 | Design Definition Process | Implementation | Implemented
61508-3 | 7.4.2.12 | Design Definition Process | Modification | Reuse
61508-3 | 7.4.2.13 | Design Definition Process | Modification | Pre-existing software element
61508-3 | 7.4.2.14 | Design Definition Process | Configuration | Languages
61508-3 | 7.4.3.1 | Architecture Definition Process | Management | Safety Planning
61508-3 | 7.4.3.2 | Architecture Definition Process | Architecture | Architecture
61508-3 | 7.4.3.3 | Architecture Definition Process | Modification | Changes required
61508-3 | 7.4.4 | Requirements Definition Process | Configuration | Support Tool
61508-3 | 7.4.5.1 | Design Definition Process | Management | Safety Planning
61508-3 | 7.4.5.2 | Design Definition Process | Management | Validation plan
61508-3 | 7.4.5.3 | Design Definition Process | Modification | Safe Modification
61508-3 | 7.4.5.4 | Design Definition Process | Design | Design
61508-3 | 7.4.5.5 | Design Definition Process | Safety | Safety integrity level
61508-3 | 7.4.6 | Implementation Process | Implementation | Code Review
61508-3 | 7.4.7 | Verification Process | Module Test | Module Test
61508-3 | 7.4.8 | Verification Process | Integration Test | Integration Test
61508-3 | 7.5.2 | Integration Process | Integration Test | Integration Test
61508-3 | 7.7.2.1 | Validation Process | Management | Validation planning
61508-3 | 7.7.2.2 | Validation Process | Management | Safety planning
61508-3 | 7.7.2.3 | Validation Process | Management | Safety planning
61508-3 | 7.7.2.4 | Validation Process | Safety | Aspects of system Safety
61508-3 | 7.7.2.5 | Validation Process | Safety | Safety function
61508-3 | 7.7.2.6 | Validation Process | Safety | Aspects of system Safety
61508-3 | 7.7.2.7 | Validation Process | Safety | Safety-related software aspects
61508-3 | 7.7.2.8 | Validation Process | Configuration | Support tools


61508-3 | 7.7.2.9 | Validation Process | Safety | Safety-related software aspects
61508-3 | 7.8.2.1 | Maintenance Process | Modification | Software modification
61508-3 | 7.8.2.2 | Maintenance Process | Modification | Software modification
61508-3 | 7.8.2.3 | Maintenance Process | Modification | Software modification
61508-3 | 7.8.2.4 | Maintenance Process | Analysis | Impact analysis
61508-3 | 7.8.2.5 | Maintenance Process | Modification | Software modification
61508-3 | 7.8.2.6 | Maintenance Process | Management | Safety planning
61508-3 | 7.8.2.7 | Maintenance Process | Management | Plan
61508-3 | 7.8.2.8 | Maintenance Process | Modification | Software modification
61508-3 | 7.8.2.9 | Maintenance Process | Modification | Software modification
61508-3 | 7.8.2.10 | Maintenance Process | Analysis | Assessment
61508-3 | 7.9.2.1 | Verification Process | Management | Plan
61508-3 | 7.9.2.2 | Verification Process | Management | Verification plan
61508-3 | 7.9.2.3 | Verification Process | Management | Plan
61508-3 | 7.9.2.4 | Verification Process | Documentation | Documented
61508-3 | 7.9.2.5 | Verification Process | Documentation | Documented
61508-3 | 7.9.2.6 | Verification Process | Verification and Validation | Software Safety lifecycle
61508-3 | 7.9.2.7 | Verification Process | Verification and Validation | Software Safety lifecycle
61508-3 | 7.9.2.8 | Verification Process | Requirements | Software safety requirements
61508-3 | 7.9.2.9 | Verification Process | Architecture | Software architecture
61508-3 | 7.9.2.10 | Verification Process | Design | Software system design
61508-3 | 7.9.2.11 | Verification Process | Design | Software module design
61508-3 | 7.9.2.12 | Verification Process | Implementation | Verification of code
61508-3 | 7.9.2.13 | Verification Process | Configuration | Verification of data
61508-3 | 7.9.2.14 | Verification Process | Performance | Performance
29119-2 | 7.2 | Verification Process | Management | Test Planning Process
29119-2 | 7.3 | Verification Process | Test Documentation | Test Monitoring and Control Process
29119-2 | 7.4 | Verification Process | Test Documentation | Test Completion Process
29119-2 | 8 | Verification Process | Test Documentation | Report
29119-3 | 6 | Verification Process | Test Documentation | Test Plan, Documentation, etc.
29119-3 | 7 | Verification Process | Test Documentation | Test Design Specification, Reporting, etc.
13849-1 | 4.6.2 | Overall Software Safety | Safety | SREW
13849-1 | 4.6.3.a | Requirements Definition Process | Requirements | Safety-related software specification
13849-1 | 4.6.3.b | Overall Software Safety | Configuration | Tools & languages
13849-1 | 4.6.3.c | Design Definition Process | Design | Software design
13849-1 | 4.6.3.d | Integration Process | Safety | Safety and non-Safety components together
13849-1 | 4.6.3.e | Implementation Process | Implementation | Software implementation/coding
13849-1 | 4.6.3.f | Verification Process | Verification and Validation | Testing
13849-1 | 4.6.3.g | Overall Software Safety | Documentation | Documentation
13849-1 | 4.6.3.h | Verification Process | Verification and Validation | Verification


13849-1 | 4.6.3.i | Overall Software Safety | Configuration | Configuration management
13849-1 | 4.6.3.j | Maintenance Process | Modification | Modification process
13849-1 | 4.6.4 | Verification Process | Analysis | Parameterization
13849-1 | 5.1 | Requirements Definition Process | Requirements | Specification of Safety functions
13849-1 | 5.2.4 | Overall Software Safety | Safety | Danger zone
13849-1 | 5.2.6 | Requirements Definition Process | Performance | Response time
13849-1 | 10 | Overall Software Safety | Documentation | Technical documentation
13849-2 | 4.2 | Validation Process | Management | Validation plan
13849-2 | 6 | Validation Process | Verification and Validation | Validation by testing
13849-2 | 7 | Validation Process | Requirements | Safety requirement specifications
13849-2 | 9.5 | Validation Process | Safety | Safety-related software
12207 | 6.4.3 | Requirements Definition Process | Requirements | System/Software requirements definition process
12207 | 6.4.4 | Architecture Definition Process | Architecture | Architecture Definition process
12207 | 6.4.5 | Design Definition Process | Design | Design Definition process
12207 | 6.4.6 | System Analysis Process | Analysis | System Analysis process
12207 | 6.4.7 | Implementation Process | Implementation | Implementation process
12207 | 6.4.8 | Integration Process | Integration Test | Integration process
12207 | 6.4.9 | Verification Process | Verification and Validation | Verification process
12207 | 6.4.11 | Validation Process | Verification and Validation | Validation Process
12207 | 6.4.13 | Maintenance Process | Modification | Maintenance Process

Table 8: The results of tabulating the selected requirements from the specific safety standards. The "Standard" column gives the shortened name of the safety standard from which the requirement has been extracted. The "Req Num" column gives the number of the clause or sub-clause selected. The "V-Model Categorisation" and "Keyword Classification" columns each contain the name of the group the requirement belongs to within each grouping scheme. The last column, "Keyword", gives the keywords in the requirement that place it in the corresponding group of the Keyword Classification.

Table 8 shows the results of grouping the extracted requirements from the selected safety standards into the V-Model Categorisation and Keyword-Classification grouping schemes. Tabulating the data concerns grouping both the assessment criteria and the evidence, to ease their matching for further analysis. The evidence to be matched with the assessment criteria is in the form of text snippets extracted from multiple data sources. The data sources are also grouped into the two grouping schemes. Tables 9 and 10 show the relationship between the grouping schemes and the data sources. As each data source may belong to more than one group from each scheme, the structure of Tables 9 and 10 differs from that of Table 8. The membership of each data source in the categories and classes is marked in the tables in a binary fashion (belongs / does not belong).

[Table 9 column headers (V-Model Categorisation): Requirements Definition Process, Architecture Definition Process, Design Definition Process, System Analysis Process, Implementation Process, Overall Software Safety, Maintenance Process, Verification Process, Integration Process, Validation Process.]

[Table 9 body: one row per Source ID (DOC-1806, DOC-0001, DOC-1600, DOC-1557, DOC-1142, DOC-1808, DOC-0002, DOC-1111, DOC-1480, DOC-1527, DOC-1576, DOC-1598, DOC-1818, DOC-1865, DOC-1487, DOC-1003, DOC-1526, DOC-1554, DOC-2025, DOC-1525, DOC-1544, DOC-1553, INT-0319, INT-0326), with a binary membership mark per category; the individual marks are not reproduced here.]

Table 9: The mapping showing which groups within the V-Model Categorisation scheme each data source from the company belongs to. Each source may belong to more than one category; membership is therefore marked in a binary fashion (belongs / does not belong).

Since the grouping of the data sources into each scheme can be shown in binary form, the results of tabulating them are shown in two tables. Table 9 shows the membership of each data source in the categories of the V-Model Categorisation scheme. Table 10 shows the membership of the data sources in the classes of the Keyword-Classification scheme in the same manner, meaning that each column represents a class from the Keyword-Classification scheme, each row a data source, and each cell a binary mark indicating whether the data source belongs to the class.
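Purely as an illustration of the structure of Tables 9 and 10, the binary membership can be thought of as a mapping from each Source ID to the set of groups it belongs to; the memberships shown in the sketch below are invented and do not reproduce the company's actual (confidential) mapping.

```python
from typing import Dict, Set

# Invented membership map mirroring the structure of Tables 9 and 10.
vmodel_membership: Dict[str, Set[str]] = {
    "DOC-0000": {"Verification Process", "Validation Process"},
    "INT-0000": {"Overall Software Safety"},
}

def belongs(source_id: str, group: str) -> bool:
    """Binary membership check, equivalent to reading one cell of Table 9 or 10."""
    return group in vmodel_membership.get(source_id, set())
```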

[Table 10 column headers (Keyword-Classification): Verification and Validation, Test Documentation, Integration Test, Documentation, Implementation, Configuration, Requirements, Modification, Module Test, Performance, Architecture, Management, Analysis, Design, Safety.]

[Table 10 body: one row per Source ID (the same sources as in Table 9), with a binary membership mark per class; the individual marks are not reproduced here.]

Table 10: Table showing which groups within the Keyword-Classification scheme each data source from the company belongs to. Each source may belong to more than one class; membership is therefore marked in a binary fashion (belongs / does not belong).

The results presented in this section capture the recommendations and instructions given by the safety standards regarding the safety of control systems. These results are applied to the development process of the company, with the test process in mind. Table 8 lists the selected instructions from the safety standards and, to further simplify the assessment and analysis of data, also presents their categorisation and classification.

8.2 Assessment

The results of the second step of data analysis in the case study performed for this thesis, (2) Assessment, are presented in this section. The method for arriving at the assessment results is explained in Section 7.2.2. The assessment is performed on the data sources provided by the company, including documentation and the interview. Section 7.1 explains the methods and reasons for attaining the assessment criteria. The data sources are assessed against the assessment criteria, that is, the requirements extracted from the safety standards listed in Table 8. The compliance of the data sources with each instruction has been assessed. As a result of this assessment, a compliance degree, a justification for that degree, and references to data sources are assigned per assessment criterion.

The detailed outcomes of the assessment are redacted from this thesis due to confidentiality and ethical reasons. Appendix D supplies a condensed version of the assessment results. Table 19 shows the selected requirements with which the company's development process did not comply. Table 20 presents those requirements partially met, as deduced from studying the provided information. The company's development process was deemed to fully follow the requirements in Table 21. Each table includes the number of sub-clauses within each clause or group of sub-clauses belonging to a selected requirement from a safety standard. The justifications for the level of compliance are given in the tables in Appendix D as well.

The standard IEC 61508-1:2010 addresses the general requirements for assuring the functional safety of control systems [11]. Out of the 81 requirements selected from this standard's documentation, the company's development process fully complies with 17, partially with 34, and does not comply with 30. The data sources that comply with the sub-clauses of IEC 61508-1:2010 included the Document Review Checklist, the Software Development Process documentation, the CR-Tool Guidelines, and the second phase of the interview. Furthermore, it is deduced that the information acquired in the first phase of the interview partially complied with the selected requirements. The failure of compliance with IEC 61508-1:2010 is shown to be due to a lack of sufficient documentation, confirmed by the interview. The phases of the overall software safety lifecycle are recorded incompletely. The documentation does not contain adequate information for the management and implementation of a functional safety assessment. Even though the functional safety assessment is currently being performed, the lack of such documentation leads to failure of compliance. No quality assurance or quality management activities are precisely defined in the development process of this company.

Software requirements for the functional safety of programmable electronic safety-related systems are presented in IEC 61508-3:2010. The company's development process fully complies with 66 of the 354 requirements selected from this standard, partially complies with 66, and fails to comply with 222 sub-clauses. The test specifications and test reports for the two safety functions provided by the company give evidence of compliance with IEC 61508-3:2010, along with the general description of the development process and the architectural design of the system. Even though the documents mentioned prove a certain conformity, they are missing some explicit information, which prevents full compliance with the standard's requirements. The lack of precise documentation for the functional assessment plan and configuration management activities, together with references to the interview, leads to a lack of complete compliance of the company's development process with the selected requirements from IEC 61508-3:2010.

The results of assessing the development process of the company with respect to IEC 61508-1:2010 and IEC 61508-3:2010 may be an indication of the process's strengths and weaknesses. The results in Table 21 show that the documentation provided by the company is accessible, maintainable, accurate, easy to understand, and well-structured. Furthermore, the documentation identifies all persons in charge of different activities and confirms the communication of their responsibilities to them.
In addition to the management of people and documentation, the inclusion of an integrated set of activities during the software safety lifecycle stages satisfies the software safety requirements specification at the required safety integrity level. Furthermore, the notations and techniques for representing the design and architecture of the system are defined. Concerning verification, the results show that each software module is verified as required by the software module test specification developed during software system design. The clear documentation of the test specifications adds to the compliance of the test process. However, the lack of some documentation due to the ongoing development process has led to partial compliance and failure of compliance with some sub-clauses in IEC 61508:2010 parts 1 and 3, as shown in Tables 20 and 19. It is also deduced that more explicit documentation of certain aspects, which shall remain confidential, along with following the plans stated in the provided documentation and adding quality assurance measures, will lead to compliance of the company's development process with the selected requirements of IEC 61508.

ISO/IEC/IEEE 29119-2:2013 is a safety standard with requirements for the test processes in software and systems engineering [15]. As this standard focuses on software testing, the data sources used for the assessment are the test reports and test specification documentation for the two safety functions, along with the interview. The selected requirements from ISO/IEC/IEEE 29119-2:2013 add up to a total of 127 instructions. Out of the chosen instructions, the data sources provide evidence of full compliance with 26, partial compliance with 4, and failure to comply with 97.


Reviewing the documentation indicates that risk assessment and incident reporting during testing do not exist, and the interview results confirm this. The lack of sufficient analysis for test purposes, control directives, a test strategy, test control and monitoring activities, test planning activities, and traceability between the various activities is another cause of failure to comply with ISO/IEC/IEEE 29119-2:2013.

Requirements for test documentation in software testing within the scope of software and systems engineering are given by ISO/IEC/IEEE 29119-3:2013. Part 3 of ISO/IEC/IEEE 29119:2013 has a total of 14 relevant requirements extracted, as further justified in Table 14. The compliance degree of the company's test documentation with this standard has been determined by assessing the information in the interview, the test reports, and the test specifications. Missing test documents and the lack of necessary characteristics in the existing documents have led to a failure to comply with 10 out of the 14 requirements. The coverage items are not identified in the test specification document. Test cases do not have a priority or a unique ID. The absence of individual documents for the Test Procedure Specification, Test Data Requirements, Test Environment Requirements, Test Data Readiness Report, and a Test Plan has led to this high rate of failure. Partial compliance has been decided for two of the requirements due to partial information, and full compliance for another two.

The safety standards for software testing chosen for this thesis, as justified in Section 7.1.1.1, are parts 2 and 3 of ISO/IEC/IEEE 29119:2013. ISO/IEC/IEEE 29119:2013 is a standard for software and systems engineering, with a focus on software testing [14]. Requirements for the test processes are given in ISO/IEC/IEEE 29119-2:2013 [15], and those for test documentation in ISO/IEC/IEEE 29119-3:2013 [16]. The assessment results presented in Tables 21, 20, and 19 demonstrate that the test process at the company, however solid, needs some polishing, which is the purpose of this thesis. One can deduce from these results that although the basic characteristics of the test reports and test specification documents meet the requirements of this standard, further explicit documentation is needed to assure compliance. The lack of clear reporting of the relationships between the actual and expected results of test cases in the reports, along with insufficient general detail in the documentation, has been inferred to be the reason for the partial failure of compliance with certain sub-clauses of ISO/IEC/IEEE 29119:2013. Also, to achieve compliance with this standard, it is necessary to produce an explicit test plan, a risk assessment, and incident reporting during testing. In addition to the need for further documentation and phases of the test process, analyses must be conducted to derive test cases, which has not been done. The lack of these analyses has led to a lack of documentation of certain details of a test process that are outcomes of such analyses. One can deduce from the results of the assessment with regard to ISO/IEC/IEEE 29119:2013 that further analyses for finding test cases can lead to the identification of test coverage items, the assignment of priorities to test cases, and concise documentation of a test plan, incident reports, and risk analysis.

ISO 13849 is the safety standard for the safety of machinery.
Part one of this standard, ISO 13849-1:2016, gives the general principles for the design of the safety-related parts of control systems [17]. Those phases of the company's development process that relate to its test process were assessed against the requirements selected from ISO 13849-1:2016. The assessment criteria extracted from ISO 13849-1:2016 amount to a total of 69 instructions, of which 13 were completely met by the company. The evidence gathered through multiple data sources from the company also suggests that the development process is partially compliant with 17 of these instructions and fails to comply with 39 of them. The results come from assessing the documentation of the overall development process requirements for the system, the design documentation of each of the two safety functions studied, code review records, coding guidelines, test specifications, the architecture definition, and finally, the requirement definition of the safety functions. All these documents provided evidence justifying the compliance degree results and were confirmed in the interview. The documents showed no mention of modification activities, impact analysis, analysis techniques used for verification purposes, detection and control of external failure, configuration management, or risk analysis.

ISO 13849-2:2012 is the second part of the standard for the safety of machinery, which covers the requirements for the validation of safety-related parts of control systems. The development process of the company is inferred to be compliant with 21 out of 43 selected requirements of this standard. Furthermore, compliance with ISO 13849-2:2012 is partial for 5 of the sub-clauses and fails for 17 of them.

The documents leading to the compliance degrees in the assessment concerning ISO 13849-2:2012 were the validation report and the test reports. The focus of ISO 13849-2:2012 on requirements for validation, and its application to testing, makes these documents relevant for the assessment. The failure of compliance with this standard has multiple reasons. A validation plan does not exist, and information regarding the operational and environmental conditions during testing, and the analyses and tests to be applied, is missing from the validation report. Information about the accuracy of time, pressure, force, electrical, relative humidity, and linear measurements during validation by testing is also missing from the validation reports. And, as previously mentioned, a test plan does not exist.

Safety of machinery, more precisely the safety-related parts of control systems, is addressed by ISO 13849. Parts 1 and 2 of this standard are of interest, as ISO 13849-1:2016 gives the general principles for design and ISO 13849-2:2012 the instructions for validation processes. The results achieved from assessing the development process, presented in Tables 21, 20, and 19, show a relatively high compliance rate with ISO 13849-2:2012, which is likely due to the sufficient information included in the company's validation report. The results suggest that the validation report, though incomplete, contains much of the information that should be in a validation plan. Furthermore, the test reports provided for the two use cases of interest are complete in regard to the information required by ISO 13849-2:2012. The assessment results show that the process in place at the company would have a noticeably higher compliance degree with both parts of ISO 13849 if a test plan existed. Also, performing analyses for finding test cases, explicitly mentioning modification activities, and documenting measures for performance characteristics would lead to higher compliance. Even though there is information for the verification and validation of testing and implementation in the validation report, these guidelines must be added to the actual coding and review guidelines to keep consistency. These results demonstrate that the lack of consistency in the documentation leads to partial compliance of much of the development process with ISO 13849-1:2016. Achieving consistency and traceability in verification and validation activities will go a long way towards improving the compliance degrees.

ISO/IEC/IEEE 12207:2017, the standard for software life cycle processes in systems and software engineering, has provided 192 requirements used as assessment criteria in this thesis. As ISO/IEC/IEEE 12207:2017 addresses the entire software life cycle, all the documentation provided by the company was studied when performing the assessment. The provided documentation has failed 94 of the criteria, partially complied with 53, and passed 45. This low compliance degree has been deduced from the lack of information in the documentation, and is further confirmed by the interview. In the requirement definition documentation, some requirements relate to the risks and criticality of the software system; however, they are not identified as such in any of the documentation.
No mention of cost targets, critical quality characteristics, performance measures, or roadmap definition is made in the documentation. In like manner, the reasoning for the choice of architecture method, design techniques, or boundaries is currently missing from the documentation. Concerning verification, even though there is a record of the actions that must be taken before performing verification by testing, constraints, priorities, and risks are not identified in the relevant test documents. Furthermore, a verification strategy is not defined, which has led to failing multiple assessment criteria.

ISO/IEC/IEEE 12207:2017 is used as a safety standard for software life cycle processes in the domain of systems and software engineering [5]. The results achieved from assessing the development process of the company against this standard support observations in many different areas. The development process is very strong in explaining functional requirements and interfaces in various documents, with attention to the performance level of the system. In addition to the requirement definition process being strongly compliant with ISO/IEC/IEEE 12207:2017, the design definition strategy is defined, utilising logic diagrams and activity diagrams, and including the rationales for the design artefacts. Although the development process has many strengths, some gaps are identified from the assessment results presented in Tables 21, 20, and 19. One weakness is the lack of explicit documentation about risk management, risk analysis, critical quality characteristics, and performance measures of the software system. Furthermore, most of the identified shortcomings are addressed in the company in various ways, as confirmed in the interview. However, the lack of clear documentation of such measures leads to a lack of compliance.

The assessment of the development process of the company led to many realisations. The assessment was performed using the criteria extracted from the safety standards, amounting to a total of 880 instructions. The data sources from the company demonstrated full compliance with 190 of these instructions and partial compliance with a further 181, while a total of 509 were not sufficiently covered by the documentation. A condensed version of the assessment results is presented in Appendix D. The results have been grouped based on their compliance degree, and the reasons for the given compliance degrees have been assembled, summarised, and condensed. The justifications aim to be as precise as possible whilst not revealing sensitive information that might disclose the company's identity.

8.3 Analysis

In this stage, the results of the Assessment step ((2) in Figure 5), presented in Section 8.2, were analysed. The analysis considered the compliance of the company's development process with the requirements extracted from the safety standards (presented in Section 7.1). Conformity is defined in terms of the compliance degree of the process with each requirement. The compliance degree has three levels: Pass, Partial, and Fail. Pass means there is sufficient evidence in the documentation to show conformity with the sub-clauses represented by the selected requirement from the safety standards. Partial is assigned to requirements that are either planned to be fulfilled, according to the documentation or the interview, or are partially fulfilled, meaning that the development process currently in place at the company conforms to only some of the sub-clauses of a requirement. The Fail grade is given to requirements that are not complied with, as there is no evidence in the documentation to support their compliance, nor a plan with a timeline to comply with them in the interview or the documentation. Instructions that are followed but lack documentation are also assigned a Fail compliance degree. The total percentages of passing, failing, and partially passing requirements are shown in Figure 8. The compliance degree of the requirements is shown in multiple charts to better represent the differences and similarities in terms of the different safety standards, the two use cases, and the different groups.

Figure 8: The total compliance degree of the company’s development process with selected requirements from safety standards which relate to testing.

The analysis performed on the assessment results led to further findings, demonstrated in the charts in Section 8.3. Firstly, putting the assessment results together shows a compliance rate of 21.59% pass, 20.57% partial, and 57.84% failure of compliance. Furthermore, the assessment results were compared based on their groupings according to each use case, the safety standards, the V-Model Categorisation, and the Keyword-Classification. The analysis of the results within each grouping has led to realisations and assumptions about the areas of the company's development process in need of improvement.
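As a quick sanity check, these percentages follow directly from the counts reported in Section 8.2 (190 passed, 181 partially passed, and 509 failed instructions out of 880); the snippet below only reproduces that arithmetic.

```python
counts = {"Pass": 190, "Partial": 181, "Fail": 509}  # counts from Section 8.2
total = sum(counts.values())                          # 880 instructions

rates = {degree: round(100 * n / total, 2) for degree, n in counts.items()}
print(total, rates)  # 880 {'Pass': 21.59, 'Partial': 20.57, 'Fail': 57.84}
```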


Answer to RQ2: By using the lightweight assessment method, the extent of the compliance of the current development process in place at an automation company with specific safety standards is 22% full compliance and 20% partial compliance.

8.3.1 Use Cases

Firstly, the compliance degrees of the two use cases provided by the company are compared. Figures 9 and 10 demonstrate that the number and percentage of requirements at each compliance degree are equal between the two use cases. The assessment of the development process of the two safety functions revealed the number of data sources shared between the two use cases, as well as equal compliance of the documents that were not shared with the requirements extracted from the safety standards. Separate assessment of the safety functions, in addition to the assessment of the shared parts, led to full compliance with 190 of the total of 880 requirements. In total, the development process for both safety functions was partially compliant with 181 of the extracted requirements and failed a total of 509 of them.

Figure 9: The total compliance degree of the company’s development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the number of requirements with different compliance degrees, based on each safety function/use case.

Figure 10: The rate of compliance degree of the company’s development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the percentage of requirements with different compliance degrees, based on each safety function/use case.

The assessment results of the two use cases showed consistency in the company's development process.

As the analysis of the two safety functions showed them to be equally compliant, uniformity in the process is concluded. The documentation specific to each use case includes the safety requirement specifications, software design specifications, test specifications, test reports, and I/O mapping documentation. These documents had an equal level of compliance with the requirements extracted from the safety standards. This result shows a significant strength of the company's development process. Consistency is an important aspect, and it shows that the tasks are carried out uniformly and managed well enough to maintain an equal level of conformity to safety.

8.3.2 Safety Standards

As the thesis looks into various safety standards to assess the safety of the test process, the compliance level of the company's development process is analysed in terms of the different safety standards. Figure 11 represents the proportions of the selected requirements with respect to each safety standard. As can be seen, IEC 61508-3 has the most selected requirements, dominating with 40% of the requirements, followed by ISO/IEC/IEEE 12207 with 22%. The lowest numbers of extracted requirements belong to ISO/IEC/IEEE 29119-3, ISO 13849-2, and ISO 13849-1, with 2%, 5%, and 8% respectively. Given that the standards specified by the company are ISO 13849-2 and ISO 13849-1, further analysis is done individually on these two safety standards. These results are discussed further in Section 9.
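These shares follow from the per-standard counts of selected requirements reported in Section 8.2 (81, 354, 127, 14, 69, 43, and 192 respectively, totalling 880); the snippet below is only a check of that arithmetic, with the rounded shares matching the percentages quoted above.

```python
selected = {  # selected requirements per standard, counts from Section 8.2
    "IEC 61508-1": 81, "IEC 61508-3": 354,
    "ISO/IEC/IEEE 29119-2": 127, "ISO/IEC/IEEE 29119-3": 14,
    "ISO 13849-1": 69, "ISO 13849-2": 43,
    "ISO/IEC/IEEE 12207": 192,
}
total = sum(selected.values())  # 880
shares = {std: round(100 * n / total) for std, n in selected.items()}
# e.g. IEC 61508-3 -> 40, ISO/IEC/IEEE 12207 -> 22, ISO/IEC/IEEE 29119-3 -> 2
```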

Figure 11: The grouping of the selected requirements from the safety standards.

Figures 12 and 13 show the compliance degrees of the company's development process with the different safety standards used. The level of compliance with each safety standard is determined by the compliance degrees Pass, Fail, and Partial. As can be seen in Figure 12, IEC 61508-3:2010 has the highest number of both passed and failed requirements. Given that IEC 61508-3:2010 also has the highest number of selected requirements, as seen in Figure 11, Figure 13 is better suited for further interpretation. The highest percentage of failed requirements belongs to ISO/IEC/IEEE 29119-2:2013 and the highest passed percentage to ISO 13849-2:2012. These results are discussed in Section 9.


Figure 12: The total compliance degree of the company's development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the number of requirements with different compliance degrees, grouped based on each safety standard.

Figure 13: The rate of compliance degree of the company's development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the percentage of requirements with different compliance degrees, grouped based on each safety standard.

Grouping the results based on the safety standards shows significant results. Figure 13 shows the rate of full, partial, and failed compliance for each safety standard. In descending order, the development process has the highest compliance rates with ISO 13849-2:2012 at 48.84%, ISO/IEC/IEEE 12207:2017 at 23.44%, and IEC 61508-1:2010 at 20.99%. The reason for such high compliance scores with these safety standards is inferred to be the company's use of ISO 13849-2:2012 and ISO/IEC/IEEE 12207:2017 in their development process.

The lowest compliance rate belongs to ISO/IEC/IEEE 29119-3:2013, which addresses test documentation. Such low compliance with ISO/IEC/IEEE 29119-3:2013 suggests a need for the company to improve its test documentation. The highest partial compliance rate is with IEC 61508-1:2010, at 41.98%. As IEC 61508-1:2010 addresses general requirements regarding functional safety, its full and partial compliance rates are indicative of the company's attention to safety. Comparing the compliance rates of the various standards shown in Figure 13 suggests the necessity of improvement in the documentation areas, as partial compliance is rather high, and partial compliance implies the existence of measures but a lack of precise and explicit documentation. Safety is of utmost importance to the company's values, and this can be seen in the compliance rates of the different standards, as those addressing safety have a much higher compliance rate than those concerning documentation. The results suggest that further attention to documentation will greatly improve the compliance of the company's development process with all the safety standards.

8.3.3 V-Model Categorisation

One of the ways to make good recommendations about a company's compliance with safety standards is to identify the areas in which the documentation is insufficient. By categorising the requirements according to the V-Model, one can draw conclusions about the processes that may need more attention. The partitioning of the safety standard requirements into V-Model categories is further explained in Section 7.2.1. Figure 14 gives an overall view of how the requirements have been categorised into the V-Model Categorisation scheme's groups.

Figure 14: The grouping of the selected requirements from the safety standards in terms of the V-Model Categorisation scheme.

The results of assessing the company's development process are divided according to the V-Model Categorisation categories with respect to their degree of compliance. Figures 15 and 16 show the compliance degree of each V-Model category with all the selected safety standard requirements. Figure 15 indicates that the highest numbers of failed and passed requirements belong to the Verification process. Meanwhile, Figure 16 shows that the System Analysis and Maintenance processes have the highest percentages of failures, while the Validation process remains among the most compliant. Further discussion of the reasons for these results is given in Section 9.


Figure 15: The total compliance degree of the company's development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the number of requirements with different compliance degrees, grouped based on V-Model Categorisation.

Figure 16: The rate of compliance degree of the company's development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the percentage of requirements with different compliance degrees, grouped based on V-Model Categorisation.

The analysis of the results based on each use case and safety standard provides insight into the general performance of the company's development process when assessed in the manner of this thesis. The V-Model Categorisation helps with understanding in more detail the activities within the processes of the development lifecycle. The analysis based on the V-Model Categorisation shows areas of improvement and strength within the different stages of the company's development process. Figure 16 displays the company's high compliance with the standards in the Validation process.

The validation report provided by the company includes ample useful information, which has led to the company's development process being highly compliant with the requirements for the validation process put forth by the selected safety standards. Although Figure 16 shows the highest rates of failure of compliance for the System Analysis and Maintenance processes, Figure 14 demonstrates that these have the lowest percentages of requirements among the V-Model Categorisation groups. Figure 15 shows that although the Maintenance process has the highest failure rate, it also has the lowest number of failures. This oddity is possibly due to the selection of the parts of the maintenance process requirements concerning regression testing, for which the company provided no test documentation. Looking past the System Analysis and Maintenance processes, the highest rates of failure of compliance are found in the Integration and Verification processes. The company's aim to improve its test processes explains the high failure rate of the Verification process; the company had acknowledged the need to improve its verification before this study, which led to this thesis. The parts of the V-Model processes that affect the test process of a company were assessed, and among these parts, the company has the highest rates of full compliance in its Validation, Design Definition, and Requirements Definition processes. This analysis shows that, to improve the company's verification and testing, most of the changes need to be made to the actual testing process rather than to the other processes within the company's development process. Looking into the detailed assessment results, one can infer that once the documentation is completed as indicated in the validation report, the compliance degree of the V-Model Categorisation groups with the selected requirements will improve considerably.

8.3.4 Keyword Classification

Along with the V-Model Categorisation, the Keyword-Classification was applied to the requirements in (1) Tabulating the Data, with the results presented in Section 8.1. The way each requirement was classified using the Keyword-Classification scheme is explained in Section 7.2.1. The Keyword-Classification aids in identifying areas that need more attention, not with regard to the V-Model, but with regard to what is deemed important to the company, as the keywords were initially extracted from the company documentation. Figure 17 shows how the requirements are distributed among the different classes of the Keyword-Classification scheme. As expected, most of the requirements fall under Verification and Validation, as the keywords clustered into that class are the ones most associated with test processes. It is worth repeating that even though a data source may belong to more than one class in this scheme, each safety standard requirement is assigned only one class.

Figure 17: The grouping of the selected requirements from the safety standards in terms of the Keyword-Classification scheme.

The assessment results, categorised with respect to the Keyword-Classification scheme, show the compliance degree of the company's development process with the selected safety standards. Figures 18 and 19 show the compliance degrees of the requirements belonging to each class. The class with the highest number of failing requirements is Management, while Verification and Validation takes the lead in the passing requirements, as shown in Figure 18.

However, Figure 19 indicates that most of the requirements belonging to the Integration Test class failed, and that the Module Test class has the highest percentage of passing requirements. Discussions of these results are presented in Section 9.

Figure 18: The total compliance degree of the company's development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the number of requirements with different compliance degrees, grouped based on Keyword-Classification.

Figure 19: The rate of compliance degree of the company's development process with selected requirements from safety standards which relate to testing. The chart shows the assessment results in terms of the percentage of requirements with different compliance degrees, grouped based on Keyword-Classification.

To make decent recommendations, aspects that seem important to the company must be considered in addition to viewing the compliance of the development process as a whole or in terms of the V-Model. The Keyword-Classification scheme gives a new view of the analysis of the assessment results, as it details the parts needing improvement that are shared among the various processes of the V-Model. Figure 17 shows that the standards focus more on documentation and on performing analyses, and less on performance and module tests. Moreover, Figure 18 shows the high number of failures of compliance with the documentation and management activities performed by the company. This high failure rate does not indicate that the company is lacking in these areas; it only reveals that unfinished and continuous changes are being made in the management of the documentation. The partial compliance with such criteria likewise suggests that these activities are still ongoing. Though the company admittedly needs to spend more time on documentation, especially in the performance and configuration areas as indicated by Figure 19, it is highly compliant with the standards in the area of module testing.

8.3.5 Analysis Resolutions

The company has admittedly neglected the documentation (Figure 19), especially in the system analysis and verification processes (Figure 16), and has reached out for recommendations for improvement. The high percentage of failure of compliance in the management area (Figure 19), referring to Figure 6, also points to a lack of documentation of traceability between the documents. Most of the problems that fall within the management area can be addressed by further documentation of rules and guidelines and by explicit expression of the connections and bridges between the various stages of the development process. The high percentage of partial compliance and failure of compliance with requirements falling under the Analysis keyword, shown in Figure 19, together with the results for the verification process and test documentation in Figures 16 and 13, exhibits a lack of the analyses needed to reach adequate test cases, which may threaten safety. Furthermore, combining the previous points with the compliance rates for safety, performance, and configuration revealed in Figure 19, the documentation needs to address how each activity and analysis relates to performance measures as well as to safety integrity and safety levels, however well these relations are understood among the engineers. The compliance of validation and verification activities is addressed in all analysis stages: in the form of the Validation and Verification processes in Figure 15, in the form of the Verification and Validation keyword in Figure 18, and finally, as ISO 13849-2:2012 is a standard specific to validation activities, in Figure 2.2. Verification and validation require some improvement, which is approached by the recommendations made in this thesis. Verification and validation activities performed through testing that follow the recommendations in Section 7.2.4 will likely increase the compliance degrees in these areas greatly. When it comes to (1.8) Modification, the assessment results presented in Table 20 indicate that modifications are treated as new projects; if only this were mentioned in the documentation, the development process would pass all the requirements in this area, bringing one back to an issue in the documentation rather than in the actual modification process. Analysis with respect to (1.14) Requirements, (1.9) Design, (1.10) Implementation, and (1.11) Architecture has also been addressed in the V-Model context in the form of the (2.1) Systems/Software Requirements Definition Process, (2.3) Design Definition Process, (2.5) Implementation Process, and (2.2) Architecture Definition Process, the compliance rates of which are demonstrated in Figures 19 and 16. Further investigation into the detailed assessment results, redacted for the ethical reasons stated in Section 6, indicates that the failures and partial compliance in these areas mostly concern requirements put forth by ISO/IEC/IEEE 12207, which also explains the rates presented for that standard in Figure 13. Detailed analysis shows the failures to be due to a lack of consideration of alternative methods for design, architecture, and implementation. The interview, however, clarified that this stems from the company's experience with the selected methods and a deliberate, well-founded decision not to consider alternatives.
The results of the interview, although not included in the analysis, indicate that briefly documenting these reasons would lead to full compliance with the aforementioned requirements. Consequently, the analysis leads to (1.12) Module Test, (1.13) Integration Test, and (1.15) Test Documentation. The failure rate for integration tests shown in Figure 19 is due to a lack of integration tests in general. Although the reasoning for not performing integration tests presented in the interview is valid, those reasons must be mentioned in the documentation.

The module tests have been documented well, but the lack of analysis performed to reach the test cases leads to a failure rate of 25%. Analysis of the assessment results leads to the deduction that the company needs to improve its documentation. In addition to documentation, an observed weakness is the lack of sufficient analysis performed to arrive at test cases and the lack of explicit documentation of links between the different phases of the company's development process. The lack of documentation is most visible within the test process, which has been acknowledged by the company and will be improved using the recommendations presented in Section 8.4. Although the numbers and rates of failure of compliance of the development process seem high, looking at the reasons suggests that a few minor changes will improve compliance considerably and quickly, and that the company is in an acceptable state.

8.4 Recommendations

The recommendations phase uses the process model proposed in ISO/IEC 33063 [20]. The test process is divided into two main groups of processes: (1) the Test Management Process Group and (2) the Dynamic Test Process Group. The Test Management Process Group involves three processes: the (1.1) Test Planning Process, (1.2) Test Monitoring and Control Process, and (1.3) Test Completion Process. The Dynamic Test Process Group is divided into four processes: the (2.1) Test Design and Implementation Process, (2.2) Test Environment and Set-up Process, (2.3) Test Execution Process, and (2.4) Test Incident Reporting Process [20]. Each of the four safety standards selected for this thesis has recommendations and instructions that fall under these processes. The instructions are first assigned to their corresponding process and then merged and summarised into a coherent set of recommendations that can be used by the company and assist in its compliance with the selected safety standards. The checklist resulting from the produced recommendations can be found in Appendix F. A glossary table is also given in the appendix to assist with the terminology and avoid possible confusion.
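As a rough illustration of the first part of this step, the Python sketch below groups extracted instructions under the ISO/IEC 33063 process they correlate with, before they are merged and summarised; the example instructions and their process tags are hypothetical.

```python
# A minimal sketch, assuming instructions extracted from the standards are
# tagged with the ISO/IEC 33063 process they correlate with; the example
# instructions below are hypothetical.
from collections import defaultdict

TEST_PROCESS_MODEL = {
    "Test Management Process Group": [
        "Test Planning Process",
        "Test Monitoring and Control Process",
        "Test Completion Process",
    ],
    "Dynamic Test Process Group": [
        "Test Design and Implementation Process",
        "Test Environment and Set-up Process",
        "Test Execution Process",
        "Test Incident Reporting Process",
    ],
}

def bucket_instructions(instructions):
    """Group (process, instruction) pairs per process so that each bucket can
    later be merged and summarised into one set of recommendations."""
    buckets = defaultdict(list)
    for process, text in instructions:
        buckets[process].append(text)
    return buckets

if __name__ == "__main__":
    extracted = [
        ("Test Planning Process", "Review previously identified risks."),
        ("Test Planning Process", "Record the order of test execution."),
        ("Test Execution Process", "Do not modify safety-related parts under test."),
    ]
    buckets = bucket_instructions(extracted)
    # Processes with no extracted instruction are still listed, making gaps visible.
    for group, processes in TEST_PROCESS_MODEL.items():
        for process in processes:
            print(f"{group} / {process}: {len(buckets[process])} instruction(s)")
```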

8.4.1 Test Plan

As shown in Table 23, recommendations regarding the Test Planning Process, which lead to the activities under the headings “Test Plan” and “Test Strategy”, are addressed first. As discussed in the analysis of the assessment results, the company currently lacks a test plan. The safety standards ISO 13849-1:2016, ISO/IEC/IEEE 29119-3:2013, ISO/IEC/IEEE 29119-2:2013, and IEC 61508-3:2010 each propose recommendations concerning the execution of a test planning process and the production of test plan documentation. A test planning process complying with all the selected standards includes five general activities. Firstly, all previously identified risks must be reviewed so that those related to software testing can be identified. Next, the test strategy, explained further in Section 8.4.2, must be incorporated into the test plan documentation. Once the activities in the test strategy are performed and recorded, the test specifications stated in Section 8.4.5 must be included. The test plan must also specify the order in which the tests are to be executed. Finally, the test plan documentation should record the decisions made on the required outcomes of the tests, which define completeness: the Test Planning Process decides what must be done in order for a test execution activity to be called complete, and the test plan documentation must record these decisions.

8.4.2 Test Strategy

The activities under the “Test Strategy” heading in Table 23 are among the activities that lie under the (1.1) Test Planning Process. The test strategy consists of agreeing on the staffing and scheduling of the tests and specifying all the procedures for corrective action upon failure of a test case (TG_13). In addition, the identification and documentation of general testing requirements (TG_1), test data (TG_2), test environment requirements (TG_11), and test tool requirements fall under the test strategy activity. To complete the test strategy, the resources required to perform the complete set of actions in the test strategy must be estimated and recorded.
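A minimal sketch of the kind of record the test strategy could be captured in is shown below; the field values are placeholders, and the mapping of fields to TG_* items follows the description above rather than any template from the standards.

```python
# A minimal sketch of a record for what the test strategy must agree on and
# document; all values are placeholders.
from dataclasses import dataclass

@dataclass
class TestStrategy:
    staffing: list                  # agreed staffing of the tests
    schedule: str                   # agreed scheduling of the tests
    corrective_action: str          # procedure upon failure of a test case (TG_13)
    test_requirements: list         # general testing requirements (TG_1)
    test_data: list                 # identified test data (TG_2)
    environment_requirements: list  # test environment requirements (TG_11)
    tool_requirements: list         # test tool requirements
    estimated_resources: str        # estimate for completing the activities above

strategy = TestStrategy(
    staffing=["test engineer A", "reviewer B"],
    schedule="weeks 10-14",
    corrective_action="log an incident, fix, then re-run the failing case",
    test_requirements=["cover both safety functions"],
    test_data=["nominal and out-of-range sensor inputs"],
    environment_requirements=["hardware-in-the-loop rig"],
    tool_requirements=["logic analyser"],
    estimated_resources="2 person-weeks",
)
print(strategy.estimated_resources)
```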


8.4.3 Test Status Report

The (1.2) Test Monitoring and Control Process involves activities that lead to the production of a “Test Status Report” [20]. This process involves identifying the metrics used for monitoring and controlling the tests (TG_3) and documenting them in the test status report. Furthermore, all the test measures (TG_14) must be collected and recorded. Monitoring the progress of the test activities against the test plan is a crucial part of the test status report activities. Factors that block the progress of the tests (TG_4), as well as any divergence from the activities in the test plan, must be identified and recorded. As testing proceeds, new and changed risks will emerge; these must be identified, and the means of treating them must be recorded. The Test Monitoring and Control Process also involves ensuring the release and availability of control directives so that changes made to the testing, the test plan, the test data, the test environment, and the staffing remain traceable.
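The monitoring of progress against the test plan can be illustrated with the small Python sketch below; the test-set names and figures are invented for the example.

```python
# A minimal sketch of monitoring test progress against the test plan;
# the test-set names and counts are hypothetical.

def progress_against_plan(planned: dict, executed: dict) -> dict:
    """Flag every test set whose executed count diverges from the plan."""
    report = {}
    for test_set, planned_count in planned.items():
        done = executed.get(test_set, 0)
        report[test_set] = {
            "planned": planned_count,
            "executed": done,
            "diverges": done < planned_count,
        }
    return report

if __name__ == "__main__":
    plan = {"safety_function_A": 40, "safety_function_B": 25}
    done = {"safety_function_A": 40, "safety_function_B": 18}
    for test_set, status in progress_against_plan(plan, done).items():
        print(test_set, status)
```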

8.4.4 Test Completion

The (1.3) Test Completion Process follows the (1.2) Test Monitoring and Control and (2.3) Test Execution Processes. During this process, the test records are compared to the test plan to ensure a lack of divergence. A finalised test completion report must be produced, and once all testing activities specified in Section 8.4.7 (Test Execution) are finished, the test environment must be restored to a predefined state. The test completion report must be complete and include information on the test procedures (TG_9) and the equipment used. The test completion report is to record whether or not the specified functional and performance targets were achieved. Moreover, test assets that may be of use at a later date or on other projects must be recorded. The report must also include records of the lessons learned during the project execution, as well as records and identification of any recommended improvements to the testing and other processes, such as the development process.

8.4.5 Test Design Specifications

The (2.1) Test Design and Implementation Process is parallel to the (1.1) Test Planning Process and is to be included in the test plan documentation. The activities in this process involve performing various analyses to specify test cases (TG_13). One activity is to perform control flow analysis to specify test cases and to record the analysis and the resulting test cases. Test cases for black-box testing of functional behaviour and performance criteria (e.g. timing performance) must also be specified. As the system is safety-related, boundary value analysis and limit value analysis must be performed and recorded to specify test cases. Test cases for I/O testing should be specified to ensure that safety-related signals are correctly used. Once the test cases are specified, one must prioritise the testing of the feature sets (TG_6) using the risk exposure levels documented in the Risk Analysis report. Furthermore, the feature set(s) must be documented, and the traceability between the test basis (TG_5), feature sets (TG_6), test conditions (TG_7), test coverage items (TG_8), and test cases (TG_13) must be recorded. Determining the test conditions (TG_7) for each feature, based on the test completion criteria specified in the test plan, also falls under the activities in this process. The Risk Analysis report is needed for the activities in the test design specification. The details of such a report are not given in these recommendations; it is assumed that a sufficient risk analysis report is created. Once a risk analysis report is produced, the test conditions (TG_7) are prioritised using the risk exposure levels and recorded, and the same is done for the test coverage items (TG_8) and the test cases (TG_13). Once prioritisation using the risk exposure levels is done, the test cases (TG_13) are distributed into one or more test sets (TG_12) based on constraints on their execution. The test cases (TG_13) within a test set are then ordered according to dependencies described by preconditions and postconditions and other testing requirements to derive test procedures (TG_9). The test sets (TG_12) and test procedures (TG_9) are recorded, and finally, the test procedures are prioritised using the risk exposure levels documented in the Risk Analysis report.
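To illustrate the prioritisation by risk exposure and the ordering of test cases within a test set by precondition/postcondition dependencies, the following Python sketch gives a minimal, hypothetical example; the test-case names, risk levels, and conditions are placeholders, and the topological ordering is one straightforward way to realise the dependency ordering described above.

```python
# A minimal sketch of risk-based prioritisation and dependency ordering of
# test cases; all identifiers are hypothetical. Requires Python 3.9+ for graphlib.
from graphlib import TopologicalSorter

test_cases = {
    # name: (risk_exposure, preconditions, postconditions)
    "TC_1": (3, set(), {"device_configured"}),
    "TC_2": (5, {"device_configured"}, {"safety_output_enabled"}),
    "TC_3": (1, {"safety_output_enabled"}, set()),
}

# 1. Prioritise by risk exposure (higher exposure first), as documented in
#    the Risk Analysis report.
by_risk = sorted(test_cases, key=lambda tc: test_cases[tc][0], reverse=True)
print("Risk priority:", by_risk)

# 2. Derive a test procedure: order cases within the test set so that a case
#    establishing a postcondition runs before any case requiring it.
graph = {
    tc: {
        other
        for other, (_, _, post) in test_cases.items()
        if post & test_cases[tc][1]   # 'other' establishes a precondition of 'tc'
    }
    for tc in test_cases
}
procedure = list(TopologicalSorter(graph).static_order())
print("Test procedure order:", procedure)  # -> ['TC_1', 'TC_2', 'TC_3']
```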


8.4.6 Test Environment Set-up and Monitoring

The (1.2) Test Monitoring and Control Process goes alongside the (2.2) Test Environment and Set-up Process, and together they lead to the activities listed under “Test Environment Set-up and Monitoring” in Table 23 [20]. These activities call for planning the set-up of the test environment (TG_10), including the test environment requirements (TG_11) and the schedules and costs of setting up the test environment. Once planned, the test environment is to be set up and implemented, determining the degree of configuration management to be applied. Implementation of the test environment (TG_10) is followed by setting up test data to support the testing and by installing and configuring the test items on the test environment. Lastly, it is to be verified that the test environment meets the test environment requirements (TG_11) stated at the beginning of the process.
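The final verification that the implemented environment meets the stated requirements (TG_11) can be as simple as a set comparison, as in the illustrative sketch below; the listed environment items are hypothetical.

```python
# A minimal sketch of verifying the implemented test environment against the
# stated requirements (TG_11); the listed items are hypothetical.

required = {"hardware-in-the-loop rig", "safety PLC firmware v2.1", "fault injector"}
implemented = {"hardware-in-the-loop rig", "safety PLC firmware v2.1"}

missing = required - implemented
print("environment verified" if not missing else f"missing items: {sorted(missing)}")
```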

8.4.7 Test Execution

The test execution activities lie under the (2.3) Test Execution Process and the (2.4) Test Incident Reporting Process, which run parallel to the (1.2) Test Monitoring and Control Process and take place before the (1.3) Test Completion Process. These activities state that the safety-related part(s) under test shall not be modified during the course of the tests. If a test can permanently change the performance of some components such that it causes the safety-related part to be incapable of meeting the requirements of further tests, a new sample or samples shall be used for subsequent tests. Test execution activities also include the detection, control, and recording of external failures as well as systematic faults. To comply with the test standards, the code is to be tested by simulation during test execution. The Test Execution Log, Actual Results, and Test Result must be recorded during test execution. Where a test result relates to a previously raised incident, the test result shall be analysed and the incident details shall be updated. In like manner, where a test result indicates that a new issue has been identified, the test result shall be analysed and it shall be determined whether it is an incident that requires reporting, an action item that will be resolved without incident reporting, or an item that requires no further action. During software integration, any modification to the software shall be subject to an impact analysis which shall determine all software modules impacted and the necessary re-verification and re-design activities. The test incidents, as well as the results of the impact analysis and any activities performed, must be recorded.
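The impact analysis required during software integration is not prescribed in detail by the standards cited here; one plausible realisation, assuming a module dependency map is available, is sketched below with hypothetical module names: every module that directly or transitively depends on the modified module is flagged for re-verification.

```python
# A minimal sketch of an impact analysis, assuming a module dependency map
# (which module uses which) is available; module names are hypothetical.

# depends_on[m] = modules that m calls or reads from.
depends_on = {
    "io_driver": set(),
    "safety_monitor": {"io_driver"},
    "motion_control": {"safety_monitor", "io_driver"},
    "hmi": {"motion_control"},
}

def impacted_modules(modified: str) -> set:
    """Return every module that directly or transitively depends on the
    modified module and therefore needs re-verification."""
    impacted, frontier = set(), {modified}
    while frontier:
        current = frontier.pop()
        for module, deps in depends_on.items():
            if current in deps and module not in impacted:
                impacted.add(module)
                frontier.add(module)
    return impacted

if __name__ == "__main__":
    # A change to the I/O driver ripples up to everything built on it.
    print(impacted_modules("io_driver"))
    # -> {'safety_monitor', 'motion_control', 'hmi'}
```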

8.5 Industrial Evaluation

The final stage of data analysis in this thesis is Industrial Evaluation ((5) in Figure 5). The evaluation of the recommendations, assessment, and analysis results was carried out by executing a focus group study, the process of which is shown in Figure 7. The session took 90 minutes and involved 11 participants. The focus group session was transcribed with the assurance of anonymity and led to the evaluation results. The transcription of the session was analysed using the thematic analysis method. Firstly, the transcription was reviewed to ensure familiarity. Next, the interesting features of the transcript were coded. The codes were put together to create potential themes, and finally, the themes were reviewed [34]. As a result of this thematic analysis, emerging themes and sub-themes were recorded [31]; they are shown with respect to the focus group questions FGQ1, FGQ2, and FGQ3 in Tables 11, 12, and 13, respectively. The results of the focus group shed light on areas of improvement for the recommendations and confirmed the usability of the presentation of the assessment and analysis results.
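The grouping of coded transcript features into themes can be illustrated by the small Python sketch below; the excerpts, codes, and theme assignments are invented stand-ins and are not quotes from the actual session.

```python
# A minimal sketch of the coding step of the thematic analysis, assuming
# transcript excerpts have already been tagged with codes; the excerpts,
# codes, and theme mapping below are illustrative only.
from collections import defaultdict

coded_excerpts = [
    ("the results are easy to navigate", "usability"),
    ("some documents were simply not available", "missing documentation"),
    ("traceability would help maintenance", "traceability"),
]

# Codes are clustered into candidate themes and then reviewed.
code_to_theme = {
    "usability": "Satisfaction with the assessment results",
    "missing documentation": "Satisfaction with the assessment results",
    "traceability": "Areas which need improvement",
}

themes = defaultdict(list)
for excerpt, code in coded_excerpts:
    themes[code_to_theme[code]].append((code, excerpt))

for theme, items in themes.items():
    print(theme, "->", [code for code, _ in items])
```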


Theme 1: Satisfaction with the assessment results
    1.1 Satisfied despite the low compliance degree
    1.2 Failure due to unavailability of certain documents
    1.3 Satisfied in terms of usability
    1.4 Accurate results of parts with available documentation

Theme 2: Presentation of the results
    2.1 Valuable form of presentation of the assessment results
    2.2 Useful to have the V-Model Categorisation presentation of analysis results
    2.3 Interesting to see the analysis results with respect to the safety standards

Theme 3: Areas which need improvement
    3.1 Traceability
    3.2 Documentation within the development process

Table 11: Identified themes and sub-themes regarding the assessment results (FGQ1), achieved through thematic analysis of the focus group session’s transcript.

Theme 4: Understandability
    4.1 Clear
    4.2 Understandable

Theme 5: Areas for improving the recommendations
    5.1 Further clustering
    5.2 Prioritisation
    5.3 Identification of immediate actions to be made

Theme 6: Integration in the current development process
    6.1 Recommendations are compatible with the current test process
    6.2 Integration requires too much documentation
    6.3 Finding a way to further compliance without losing efficiency

Theme 7: Validity
    7.1 The recommendations are consistent
    7.2 The recommendations are justified by referring to the safety standards

Table 12: Identified themes and sub-themes regarding the recommendations (FGQ2), achieved through thematic analysis of the focus group session’s transcript.


Theme 8: Improvement of the current test process
    8.1 Improve the level of safety assurance as they increase compliance with selected safety standards
    8.2 Improve the reliability of the system as they instruct towards testing for safety in addition to testing the functionality
    8.3 Improvement of the performance needs further internal debate within the company

Theme 9: Length of documentation
    9.1 The test plan is deemed necessary
    9.2 Certain aspects of the test status reporting can be of a lower priority
    9.3 Certain recordings of test environment set-up and monitoring can be of lower priority
    9.4 Certain recordings of the test execution process can be of lower priority
    9.5 Documentation of risks, faults, and failures is of high importance
    9.6 Risk analysis is deemed important

Table 13: Identified themes and sub-themes regarding the effects of the recommendations on improving the test process (FGQ3), achieved through thematic analysis of the focus group session’s transcript.

Regarding the assessment results, answering FGQ1, the engineers and managers from the company were more satisfied than one could expect, given the low compliance degree presented. According to the participants, the assessment results are presented in a systematic way that can be useful for internal debates in the company. Furthermore, the participants were pleased with the presentation of the analysis results, especially the grouping of the results based on the V-Model Categorisation. The various groupings of the requirements in the V-Model Categorisation and Keyword-Classification schemes led to discussions about the overlapping groups, demonstrating areas of weakness and strength in the company. The results presented based on the V-Model Categorisation scheme, demonstrated in Figure 16, revealed convenient targets for improvement. The discussions also led to the realisation that the high degree of failure with requirements in the System Analysis ((2.4) in Figure 6) and Integration Test ((1.13) in Figure 6) groups, shown in Figures 16 and 19, was due to the unavailability of certain documentation. The assessment and analysis results thus clarified the need for further documentation. The results in Figure 19 further led to discussions about traceability, which falls under the keyword Management ((1.2) in Figure 6). The conversation about traceability showed that although documenting traceability may not seem interesting during development, it is essential for making the maintenance processes more efficient. Altogether, the analysis and assessment results shed light on areas worth debating internally within the company and were agreeable to the participants regarding usability and accuracy.
The focus group session also involved considerations about the recommendations, answering FGQ2. The recommendations were deemed clear and understandable, as they were a simplified and condensed version of the instructions from the safety standards. However, it was resolved that further clustering and prioritisation of the recommendations would make them more time-efficient. Though the compatibility of the recommendations with the current test process was not an issue, integrating all of the recommendations was considered to lead to too much documentation, which further confirmed the need for clustering and prioritising the recommendations.
The recommendations were said to be valuable to the company as they provided insight into the areas of improvement, answering FGQ3.

It was concluded that the recommendations do increase the level of safety assurance, as they move the focus of the test process from testing for functionality to testing for safety. The participants also expressed their approval of the presented recommendations regarding the increase in the overall reliability of the developed system. The recommendations, as they reference the safety standards, were also considered to improve the overall compliance of the company's test process with the selected safety standards. The focus group study led to many constructive results that help improve the recommendations and include further considerations in the overall method of reaching them. The participants also approved the assessment and analysis steps of the study, providing assurance of the relevance and prominence of the steps taken to reach the recommendations. The refined recommendations provide insight into improving the compliance of the company's test process with the selected safety standards without losing efficiency.

8.5.1 Refined Recommendations

The recommendations made in Section 8.4 are a result of following all the steps explained in Section 7.2 as well as the guidelines put forth by the ISO/IEC 33063 standard [20]. The recommendations are divided into activities for the Test Plan, Test Strategy, Test Status Report, Test Completion, Test Design Specification, Test Environment Set-up and Monitoring, and Test Execution. This classification helps apply the recommended activities more easily. The recommendations were further improved by the feedback received in the industrial evaluation step. The refined recommendations are discussed further in this section, and a list of the immediate actions for the company to take is shown below.

Answer to RQ1: By using the lightweight assessment method, the following recommendations have been produced, and can further improve safety compliance for the test process of the industrial control system being studied.

• Actions for creating a Test Plan

– Review all previously identified risks to identify those that relate to and/or can be treated by software testing [15].

– Incorporate the test design specifications in the test plan as follows [18][16]:
    – Perform control flow analysis to specify test cases [17].
    – Specify test cases for black-box testing of functional behaviour [17].
    – Specify test cases for black-box testing of performance criteria (e.g. timing performance) [17].
    – Perform boundary value analysis for testing the safety-related application and embedded software in the system, to specify test cases [17] (a sketch of this analysis is given after these action lists).
    – Perform limit value analysis to specify test cases [18].
    – Specify test cases for I/O testing to ensure that safety-related signals are correctly used [17].
    – Record the analyses performed and their resulting test cases [15].
    – Prioritise the test coverage items using the risk exposure levels in the Risk Analysis report [15].
    – Prioritise the test cases using the risk exposure levels in the Risk Analysis report [15][16].

• Actions before Test Execution (Test Environment Set-up and Monitoring)

– Implement the test environment and record the process [15].
– Install and configure the test item on the test environment [15].


• Actions during Test Execution

– Test the code by simulation and record the outcomes [17].
– Make sure the safety-related part(s) under test are not modified during the course of the tests [18].
– Use a new sample or samples for tests that can permanently change the performance of some components such that the safety-related part becomes incapable of meeting the requirements of further tests [18].
– Detect and control external failures while testing and record them [18].
– Detect systematic faults (errors, omissions or inconsistencies) while testing [18].
– Record the Test Execution Log, Actual Results, and Test Result [15][16].
– Where a test result relates to a previously raised incident, the test result shall be analysed and the incident details shall be updated [5].
– Where a test result indicates that a new issue has been identified, the test result shall be analysed and it shall be determined whether it is an incident that requires reporting, an action item that will be resolved without incident reporting, or an item that requires no further action [5].
– During software integration, any modification to the software shall be subject to an impact analysis which shall determine all software modules impacted and the necessary re-verification and re-design activities [5][13].
– Report test incidents [5].
– Make sure of the release and availability of control directives to ensure the traceability of changes made to the testing, the test plan, the test data, and the test environment [15][16].

• Actions upon Test Completion

– Restore the test environment to a predefined state [15].
– Create a Test Completion Report which has the following information [16]:
    – The test procedures and equipment used [15][16].
    – Whether or not specified functional and performance targets were achieved [13].
    – Test assets which may be of use at a later date, or on other projects [15].
    – Records of the lessons learned during the project execution (what went well and what did not, during testing and associated activities) [15].
    – Records and identification of any recommended improvements to the testing and other processes, such as the development process [5].
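The boundary value analysis referred to in the Test Plan actions above can be sketched as follows; the parameter and its accepted range are hypothetical, and the sketch only illustrates the classic below/on/above boundary pattern.

```python
# A minimal sketch of boundary value analysis for one input range; the
# parameter name and range are hypothetical.

def boundary_values(low: int, high: int) -> list:
    """Classic boundary values for an accepted range [low, high]:
    just below, at, and just above each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

if __name__ == "__main__":
    # e.g. a hypothetical speed-limit parameter accepted in [0, 500] mm/s
    for value in boundary_values(0, 500):
        expected = "accept" if 0 <= value <= 500 else "reject"
        print(f"test case: input {value:4d} -> expected {expected}")
```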

Based on the results of the focus group, it was concluded that the most important part of the recommendations is the Test Plan. The test planning activities involving test design specification and risk analysis are of high importance both for compliance with the safety standards and from the point of view of the participants in the focus group. The test completion report and the recording of incidents, lessons learned, and traceability of changes were also considered essential. The refined recommendations include the test planning activities concerning the review of all previously identified risks to identify those that relate to and can be treated by software testing. The test design specifications are to be included in the test plan. The test design specifications must involve performing and recording control flow analysis, boundary value analysis, and limit value analysis to specify test cases. Test cases for black-box testing of functional behaviour and performance criteria must also be specified. Moreover, the test coverage items are to be prioritised using the risk exposure levels in the Risk Analysis report. The implementation of the test environment, along with installing and configuring the test item on the test environment, is to be recorded in this phase as well. Once test planning is complete, the test execution phase must ensure testing by simulation and that the safety-related part(s) under test are not modified during the course of the tests.

Test execution activities should also consider that if a test can permanently change the performance of some components such that the safety-related part becomes incapable of meeting the requirements of further tests, a new sample or samples shall be used for subsequent tests. Detecting, controlling, and recording external failures and systematic faults while testing remains an important recommendation for the company's test process. Moreover, the records of the Test Execution Log, Actual Results, and Test Result must be made available, and test incidents should undergo the necessary analysis and be reported. It is also crucial to make sure of the release and availability of control directives to ensure the traceability of changes made to the testing, the test plan, the test data, and the test environment. Test completion activities are the last to be performed. Once all test execution activities are finished, the test environment is restored to a predefined state, and a finalised test completion report is produced, detailing the test procedures and equipment used, whether the specified functional and performance targets were met, and any test assets that may be of use at a later date or on other projects. Another important recommendation for the test process is to record the lessons learned during the project execution, along with identifying any recommended improvements to the testing and other processes. The list above shows only the immediate actions for the company to take; the complete, newly clustered and prioritised list of recommendations can be found in Appendix G. The industrial evaluation step concludes the data analysis of the case study performed for this thesis by presenting a set of refined recommendations that are usable by the company and improve the safety compliance of its test process with the selected safety standards.

9 Discussions

Using the steps put forth in Section 7.2, Data Analysis, recommendations can be derived for improving the test processes of industrial control systems. Conformance to various safety standards is required by different sectors of industry and academia to ensure that the necessary measures for the safety of control systems are in place. A lightweight assessment process for the development process of control systems, with attention to testing, is necessary for achieving test processes that are compliant with safety standards. This thesis has put together multiple general steps for assessing the compliance of development processes and, by applying these steps, answered the research questions for an example of an industrial control system. By following these steps, one can evaluate the development process of an industrial control system with regard to test processes and produce recommendations that help supply a test process compliant with safety standards. RQ1 addresses recommendations for the further improvement of the safety compliance of test processes. To achieve useful recommendations for improvement, one must assess the existing process and make suggestions. To produce the recommendations, the methods put forth in Section 7.2 were applied to an example of an industrial control system. The example consists of two safety functions selected from a bigger system at a large automation company. Applying the data collection in the manner explained in Section 7.1 and the data analysis steps to the system has led to recommendations that assist the company's test process in complying with certain safety standards. The thesis aims to answer RQ2 by performing assessment and analysis to find the extent to which the current development process in place at an automation company in Sweden complies with specific safety standards. To answer this question, data were collected from the relevant safety standards and the company and then tabulated. Gathering the relevant requirements from the safety standards led to the assessment criteria. The tabulation of data assisted in matching evidence, in the form of the gathered data, to the assessment criteria. Following this, the assessment outcomes were defined as compliance degrees of Pass, Partial, and Fail. With the data to be assessed, the assessment criteria, and the form of the assessment outcome in place, the assessment was performed. The development process of two safety functions provided by an automation company in Sweden was assessed at this stage, as an example of an industrial control system. The results of the assessment were then grouped and explained in four ways: with respect to the use cases, each safety standard, the classes put forth in the Keyword-Classification scheme, and the categories of the V-Model Categorisation. These analysis results showed the extent of compliance of the development process with each safety standard and highlighted the areas in which there is room for growth. The case study performed to answer the research questions RQ1 and RQ2 has led to a lightweight method for producing recommendations.

The method can be applied to any industrial control system, as it consists of general instructions that may be followed. Afzal et al. classify the approaches for software test process improvement into four categories: TMM and related approaches, TPI and related approaches, standards and related approaches, and individual approaches [4]. This thesis takes an individual approach that uses safety standards to improve test processes. Applying the steps taken in the data analysis phase of this case study to other subjects can help develop these steps into a technique for the test process assessment of control systems. Techniques currently existing in this area include the use of Model-Driven Engineering techniques for process assessment [24][23]. Furthermore, the LAPPI method has been proposed for identifying improvement targets in test processes [21]. These methods confirm the necessity of a systematic and disciplined approach to performing test process assessment. The techniques used for the data collection and analysis in this thesis use limited resources and can be performed easily. By further refining, evolving, and combining these techniques, an effective way for teams to assess a company's test process using limited resources can be developed.

10 Future Works

Test process assessment of safety-related industrial control systems is of high importance. Safety standards provide criteria whose fulfilment leads to systems that avoid harm to people, the environment, or resources [1]. This thesis presents the assessment results for the development of two safety functions at one company. Utilising these results, the thesis provides recommendations that will help bring the company's test process into conformance with the selected safety standards. Actions taken to generalise the recommendations achieved through this thesis will aid both industry and academia in creating test processes that assure safety via their conformance to safety standards. Furthermore, as the case study performed in this thesis was exploratory, new problems and hypotheses arose during its execution [26]. The case study led to the realisation of a need for a technique that could ease the assessment of development processes against safety standards with limited time and resources. Generalising the results of this thesis will result in recommendations for the test processes of safety-related industrial control systems. The results of this thesis are transferable to other companies with a similar development process; however, they are not generalisable due to a lack of sufficient data. By applying the steps taken in this study to the development processes of other companies, the results could converge to a generalised set of recommendations. Such a generalised set of recommendations would provide an easy way for test processes to conform to selected safety standards. Further development of the steps taken in this study can contribute to both academia and industry. The academic domain can benefit through research conducted towards a generalised method of test process assessment, uncovering new challenges and shortcomings in the current techniques. Industry will be provided with an efficient way of performing test process assessment if the results are justified. Furthermore, the technique can be automated to be even more efficient and less time-consuming. As the generation of appropriate documentation has been concluded to be of great importance from the results of this thesis, an automated form of document generation for the test process, ensuring the compliance of a test process with safety standards, is an ambitious continuation of the work presented in this thesis. The work of this thesis has opened the way for furthering research in the area of test process assessment techniques and recommendations in multiple ways. Generalising the recommendations is one path to take. A refined method for assessing the development process of industrial control systems is another. Automating the generation of test process documentation using the steps in this thesis is another idea worth following. Using the results and process of this thesis can lead to exciting new research and contribute to academia and industry.


11 Conclusions

Consequences of the failure of safety-related systems may involve loss of life, property damage, or environmental damage [1]. Therefore, assuring the safety of autonomous systems in industry is of utmost importance [2]. Ensuring the compliance of the test process of a control system with safety standards is one of the methods for ensuring safety [4]. This thesis has supplied recommendations for improving the safety compliance of a test process with selected safety standards to help prevent harmful outcomes of using an industrial control system. The recommendations were made by studying the compliance of the current development process in place at an automation company in Sweden with specific safety standards. As a by-product of answering the research questions of this thesis, a generic method for a lightweight assessment of the development process with regard to testing has been created, which leads to recommendations. The use of a lightweight method gives insight into the areas in need of improvement with the use of limited resources [25]. The method can be applied to various industrial control systems with development processes that follow the V-Model. Two research questions have been addressed in this thesis. To answer these research questions, a case study has been executed. With a large automation company in Sweden as the subject of the case study, two safety functions were selected as use cases. The results of the case study culminated in a set of recommendations that help with the compliance of the test process of the case being studied, answering RQ1. These recommendations were based on the results of assessing the extent of compliance of the development process of the two use cases with selected safety standards. The results of this assessment answered RQ2. Answering the two research questions called for a method for a lightweight assessment of the development process, leading to the steps taken for performing the case study. The steps taken for the data collection and data analysis phases of the case study resulted in a method that can be applied to other cases and produce useful results. Data have been gathered from the specific case in the form of documentation and an interview. Based on some generic selection criteria, specific requirements were extracted from the selected safety standards, leading to a set of assessment criteria. The data collection can be applied to other cases, as the criteria are generic and may be applied to other safety standards to obtain a different set of assessment criteria, or to other use cases to assess a different development process. Data analysis was performed on the collected data through five steps: (1) Tabulating the Data, (2) Assessment, (3) Analysis, (4) Recommendation, and (5) Industrial Evaluation. Initially, the collected data were organised into different grouping schemes in step (1) Tabulating the Data. The V-Model Categorisation scheme is based on a simplified version of the software safety lifecycle put forth by ISO/IEC/IEEE 12207:2017 and ISO 13849-1:2016 [5][17]. The Keyword-Classification scheme emerged from abstract keywording of the documentation provided and from clustering the keywords [35]. These two grouping schemes helped divide the collected data understandably, match the evidence to the assessment criteria, and arrive at more comprehensible analysis results in step 3 of the data analysis.
Once the data are tabulated, (2) Assessment is performed by first clearly defining an assessment outcome and then applying the assessment criteria to text snippets from the data sources. The assessment outcome is defined as a compliance degree that is applied to each assessment criterion. The compliance degree has three levels, Pass, Partial, and Fail, representing the level of conformance of the development process under evaluation with that specific requirement. The assessment of the specific case covered a total of 880 requirements, of which 509 were assigned Fail, 181 Partial, and 190 Pass. These assessment results were further analysed in the next step of the data analysis. (3) Analysis comes after the assessment step and puts the results in a form that can be used for identifying areas of weakness and strength within a development process. In this step, the assessment results are grouped based on each safety function, each safety standard, and the V-Model Categorisation and Keyword-Classification groups. The division of data into the use cases helps confirm the consistency of development among different safety functions within a system. The analysis step also demonstrates which safety standards require more attention and which are least or most conformed to. The expression of data in terms of the two grouping schemes assists in identifying the areas of the development process that need further attention from the organisation.

The example of an industrial control system chosen as the subject of the case study in this thesis has shown consistency between its two different safety functions. Furthermore, the company's development process conforms highly with ISO 13849-2:2012 but requires further attention to ISO/IEC/IEEE 29119-2:2013 and ISO 13849-1:2016, which respectively focus on test planning activities and on analyses for the creation of test cases. The high rate of the Fail compliance degree among the System Analysis and Maintenance processes points to areas of weakness in activities regarding regression tests and integration tests. Moreover, the need for more documentation of integration tests is confirmed by the compliance degree of the Integration Test class of the Keyword-Classification scheme, in addition to the need for further planning, organisation, and documentation of traceability. These analysis results, presented in the manners above, lead to conclusions that help make better recommendations in the next step of the data analysis. Step (4) Recommendation uses the analysis results to put forth recommendations useful for improving the safety compliance of test processes. By utilising the results from the third step of the data analysis, the areas of weakness have been identified and suggestions in those regards have been made for the test process. The company's case has shown weaknesses in multiple areas, which have led to recommendations mostly in the areas of test planning and test design specification. The recommendations focus on the documentation of risk analysis, performing different types of analyses to arrive at test cases, and creating a test plan to follow throughout the various testing activities. Furthermore, the recommendations suggest further documentation of the set-up of the test environment and of the test execution and test completion processes. The final step of the data analysis is (5) Industrial Evaluation. The industrial evaluation step weighs the contribution of the results of the previous steps to the test process improvement of industrial control systems via their compliance with safety standards. By applying the focus group research method, the results of the steps are presented to those who are affected by them, and their thoughts on the results are recorded and applied to arrive at refined recommendations. The participants in the focus group executed for this thesis voiced their preference for a more prioritised list of recommendations. The discussions from the focus group led to a set of recommendations with more focus on the test plan and test case generation and with lower priority on environment set-up and status report activities.
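The compliance-rate figures quoted throughout this thesis follow directly from the per-requirement tallies of step (2); the Python sketch below reproduces that arithmetic from the reported counts (190 Pass, 181 Partial, 509 Fail). Simple rounding gives 21.6%, 20.6%, and 57.8%, i.e. approximately the 22%/20%/58% split reported above.

```python
# A minimal sketch of the compliance-rate computation behind the reported
# figures; only the three counts from the thesis are used as input.
from collections import Counter

def compliance_rates(degrees):
    """Tally compliance degrees and express them as percentages."""
    counts = Counter(degrees)
    total = sum(counts.values())
    return {degree: round(100 * n / total, 1) for degree, n in counts.items()}

if __name__ == "__main__":
    degrees = ["Pass"] * 190 + ["Partial"] * 181 + ["Fail"] * 509
    print(compliance_rates(degrees))
    # -> {'Pass': 21.6, 'Partial': 20.6, 'Fail': 57.8}
```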

Lightweight Assessment Method
1. Tabulating the Data
2. Assessment
3. Analysis
4. Recommendations
5. Industrial Evaluation

Answer to RQ1: What recommendations further improve safety compliance for the test process of an industrial control system?
• Actions to take towards the creation of a Test Plan, involving test design specifications through performing analyses.
• Actions regarding implementation, installation, and configuration of the test environment prior to test execution.
• Actions concerning records, detections, and analyses to be performed during test execution.
• Actions regarding test completion activities in the form of a Test Completion Report.


Answer to RQ2: To what extent does the current development process in place at an automation company comply with specific safety standards?
• Total compliance rate of 22% Pass, 20% Partial, and 58% Fail with the selected standards.
• Full compliance rate (Pass compliance degree) of 23% with ISO/IEC/IEEE 12207:2017, 19% with ISO 13849-1:2016, 49% with ISO 13849-2:2012, 20% with ISO/IEC/IEEE 29119-2:2013, 14% with ISO/IEC/IEEE 29119-3:2013, 21% with IEC 61508-1:2010, and 19% with IEC 61508-3:2010.

This thesis concludes by answering two research questions regarding the compliance of the development process of an automation company with selected safety standards and by suggesting improvements for its test process in the same respect. The results of the assessment show 22% complete compliance, 20% partial compliance, and 58% failure of compliance with the selected requirements. The analysis results showed a need for further documentation and analysis in the test process. The assessment and analysis results answered RQ2. Using these results, recommendations were made and evaluated. As a result of the evaluation, the recommendations focused more on a test plan and risk analysis. The refined recommendations answered RQ1. Answering these research questions has led to the formation of a method for generating recommendations that make test processes more compliant with selected safety standards. The resulting method may be validated by performing experiments using other assessment methods. Further development of these steps will lead to advances in research on test process improvement via safety standards. The lightweight assessment of the development processes of industrial control systems via safety standards is also addressed by this outcome.


References

[1] J. C. Knight, ‘Safety critical systems: Challenges and directions,’ in Proceedings of the 24th International Conference on Software Engineering, ICSE 2002, 2002, pp. 547–550.
[2] H. Wang and N. Liang, ‘A software diversity model for embedded safety-critical system,’ in 2009 International Conference on Wireless Networks and Information Systems, 2009, pp. 106–109. doi: 10.1109/WNIS.2009.52.
[3] E. P. Jharko, ‘The methodology of software quality assurance for safety-critical systems,’ in 2015 International Siberian Conference on Control and Communications (SIBCON), 2015, pp. 1–5. doi: 10.1109/SIBCON.2015.7147057.
[4] W. Afzal, S. Alone, K. Glocksien and R. Torkar, ‘Software test process improvement approaches: A systematic literature review and an industrial case study,’ Journal of Systems and Software, vol. 111, pp. 1–33, 2016, issn: 0164-1212. doi: 10.1016/j.jss.2015.08.048.
[5] ‘Systems and software engineering — software life cycle processes,’ Institute of Electrical and Electronics Engineers, Inc., Switzerland, Standard, 2017.
[6] J. Pedersen Notander, M. Höst and P. Runeson, ‘Challenges in flexible safety-critical software development – an industrial qualitative survey,’ in Product-Focused Software Process Improvement, J. Heidrich, M. Oivo, A. Jedlitschka and M. T. Baldassarre, Eds., Berlin, Heidelberg: Springer Berlin Heidelberg, 2013, pp. 283–297, isbn: 978-3-642-39259-7.
[7] J. Jacobs, J. Moll and T. Stokes, ‘The process of test process improvement,’ vol. 8, pp. 23–29, Jan. 2000.
[8] P. Morgan, A. Samaroo, G. Thompson and P. Williams, Software Testing: An ISTQB-BCS Certified Tester Foundation Guide, 3rd ed. BCS Learning & Development Limited, 2015, pp. 39–59, isbn: 978-1-78017-300-9.
[9] J. Thorn, ‘Test framework quality assurance: Augmenting agile processes with safety standards,’ Ph.D. dissertation, 2020.
[10] J. Heidrich, M. Oivo, A. Jedlitschka and M. Baldassarre, Product-Focused Software Process Improvement: 14th International Conference, PROFES 2013, Paphos, Cyprus, June 12-14, 2013, Proceedings. Jan. 2013, vol. 7983, isbn: 978-3-642-39258-0. doi: 10.1007/978-3-642-39259-7.
[11] ‘Functional safety of electrical/electronic/programmable electronic safety-related systems - part 1: General requirements,’ European Committee for Electrotechnical Standardization, Geneva, Switzerland, Standard BS EN 61508-1:2010, May 2010. [Online]. Available: https://www.iso.org/standard/62711.html.
[12] R. Nevalainen and T. Varkoi, ‘A safety-critical assessment process,’ vol. 477, Nov. 2014, pp. 157–164. doi: 10.1007/978-3-319-13036-1_14.
[13] ‘Functional safety of electrical/electronic/programmable electronic safety-related systems - part 3: Software requirements,’ European Committee for Electrotechnical Standardization, Geneva, Switzerland, Standard BS EN 61508-3:2010, May 2010. [Online]. Available: https://www.iso.org/standard/62711.html.
[14] ‘Software and systems engineering - software testing - part 1: Concepts and definitions,’ SIS Swedish Standards Institute, Stockholm, Sweden, Standard, Nov. 2013.
[15] ‘Software and systems engineering - software testing - part 2: Test processes,’ SIS Swedish Standards Institute, Stockholm, Sweden, Standard, Nov. 2013.
[16] ‘Software and systems engineering - software testing - part 3: Test documentation,’ SIS Swedish Standards Institute, Stockholm, Sweden, Standard, Nov. 2013.
[17] ‘Safety of machinery – safety-related parts of control systems – part 1: General principles for design,’ European Committee for Standardization, Brussels, Standard, Jan. 2016.


[18] ‘Safety of machinery – safety-related parts of control systems – part 2: Validation,’ European Committee for Standardization, Brussels, Standard, Oct. 2012.
[19] C. Garcia, A. Dávila and M. Pessoa, ‘Test process models: Systematic literature review,’ vol. 477, Nov. 2014. doi: 10.1007/978-3-319-13036-1_8.
[20] ‘Information technology - process assessment - process assessment model for software testing,’ ISO, Geneva, Switzerland, Standard, Aug. 2015.
[21] T. Toroi, A. Raninen and L. Väätäinen, ‘Identifying process improvement targets in test processes: A case study,’ in 2013 IEEE International Conference on Software Maintenance, 2013, pp. 11–19. doi: 10.1109/ICSM.2013.12.
[22] S. R. Nunns, ‘Functional safety of safety-related systems: The influence of IEC 1508 and developments in conformity assessment schemes on business drivers,’ in Proceedings of IEEE International Symposium on Software Engineering Standards, 1997, pp. 110–118. doi: 10.1109/SESS.1997.595922.
[23] R. K. Panesar-Walawege, M. Sabetzadeh and L. Briand, ‘A model-driven engineering approach to support the verification of compliance to safety standards,’ in 2011 IEEE 22nd International Symposium on Software Reliability Engineering, 2011, pp. 30–39. doi: 10.1109/ISSRE.2011.11.
[24] ——, ‘Supporting the verification of compliance to safety standards via model-driven engineering: Approach, tool-support and empirical validation,’ Information and Software Technology, vol. 55, no. 5, pp. 836–864, 2013, cited by 37. doi: 10.1016/j.infsof.2012.11.009.
[25] A. Raninen, J. Ahonen, H.-M. Sihvonen, P. Savolainen and S. Beecham, ‘LAPPI: A light-weight technique to practical process modeling and improvement target identification,’ Journal of Software: Evolution and Process (JSEP), Wiley, vol. 25, Sep. 2013. doi: 10.1002/smr.1571.
[26] C. Robson and K. McCartan, Real World Research, 4th ed. John Wiley and Sons Ltd., 2011, isbn: 9781119144854.
[27] S. Baskarada, ‘Qualitative case study guidelines,’ Qualitative Report, vol. 19, pp. 1–25, Oct. 2014.
[28] P. Runeson and M. Höst, ‘Guidelines for conducting and reporting case study research in software engineering,’ Empirical Softw. Engg., vol. 14, no. 2, pp. 131–164, Apr. 2009, issn: 1382-3256. doi: 10.1007/s10664-008-9102-8.
[29] C. Wohlin, P. Runeson, H. Martin, M. C. Ohlsson, R. Björn and W. Anders, Experimentation in Software Engineering: An Introduction. Springer-Verlag New York, 2012, isbn: 978-3-642-29043-5.
[30] A. Gibbs, ‘Focus groups,’ Social Research Update, vol. 9, pp. 1–8, 8 Mar. 1997.
[31] R. L. Breen, ‘A practical guide to focus-group research,’ Journal of Geography in Higher Education, vol. 30, no. 3, pp. 463–475, 2006. doi: 10.1080/03098260600927575.
[32] M. A. T. Laksono, E. K. Budiardjo and A. Ferdinansyah, ‘Assessment of test maturity model: A comparative study for process improvement,’ in Proceedings of the 2nd International Conference on Software Engineering and Information Management, ser. ICSIM 2019, Bali, Indonesia: Association for Computing Machinery, 2019, pp. 110–118, isbn: 9781450366427. doi: 10.1145/3305160.3305203.
[33] B. A., F. Y., L. Landi and H. Mödden, ‘Probabilities in safety of machinery - part 1: Risk profiling and farmer matrix,’ Jan. 2014.
[34] M. Vaismoradi, H. Turunen and T. Bondas, ‘Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study,’ Nursing & Health Sciences, vol. 15, Mar. 2013. doi: 10.1111/nhs.12048.
[35] K. Petersen, R. Feldt, S. Mujtaba and M. Mattsson, ‘Systematic mapping studies in software engineering,’ Proceedings of the 12th International Conference on Evaluation and Assessment in Software Engineering, vol. 17, Jun. 2008. [Online]. Available: https://dl.acm.org/doi/10.5555/2227115.2227123.


[36] A. Roman, A Study Guide to the ISTQB Foundation Level 2018 Syllabus: Test Techniques and Sample Mock Exams, 1st ed. Springer Publishing Company, Incorporated, 2018, isbn: 3319987399.
[37] J. Kontio, L. Lehtola and J. Bragge, ‘Using the focus group method in software engineering: Obtaining practitioner and user experiences,’ Sep. 2004, pp. 271–280, isbn: 0-7695-2165-7. doi: 10.1109/ISESE.2004.1334914.
[38] J. Kontio, J. Bragge and L. Lehtola, ‘The focus group method as an empirical tool in software engineering,’ Jan. 2008, pp. 93–116, isbn: 978-1-84800-043-8. doi: 10.1007/978-1-84800-044-5_4.
[39] M. Abbas, R. Jongeling, C. Lindskog, E. P. Enoiu, M. Saadatmand and D. Sundmark, ‘Product line adoption in industry: An experience report from the railway domain,’ in Proceedings of the 24th ACM Conference on Systems and Software Product Line: Volume A - Volume A, ser. SPLC ’20, Montreal, Quebec, Canada: Association for Computing Machinery, 2020, isbn: 9781450375696. doi: 10.1145/3382025.3414953. [Online]. Available: https://doi.org/10.1145/3382025.3414953.


Appendices

A. Appendix A - Safety Standard Requirement Extraction

This appendix includes all the extraction criteria for the requirements in the standards IEC 61508-1:2010, IEC 61508-3:2010, ISO/IEC/IEEE 29119-2, ISO/IEC/IEEE 29119-3, ISO 13849-1:2016, ISO 13849-2:2012, and ISO/IEC/IEEE 12207:2017, which are used for the assessment of the process in place at the company under study.
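As an illustration of how rows such as those in Tables 14 to 17 can be tabulated for the assessment step, the following minimal Python sketch encodes a clause with its inclusion status and criterion references and then filters out the excluded clauses. The structure, field names and example rows are assumptions made purely for illustration; they are not an artefact produced for this thesis.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ClauseRow:
    # One row of the requirement-extraction tables (illustrative structure).
    num: str          # clause or sub-clause number, e.g. "7.2.4.1"
    title: str        # clause title as given in the standard
    included: str     # "Included", "Excluded" or "Partial"
    criteria: List[str] = field(default_factory=list)  # e.g. ["Inclusion Criterion 3"]
    justification: str = ""

def assessment_items(rows: List[ClauseRow]) -> List[ClauseRow]:
    # Keep only the clauses that contribute requirements to the assessment.
    return [r for r in rows if r.included in ("Included", "Partial")]

# Hypothetical rows mirroring the structure of Table 14.
rows = [
    ClauseRow("7.2.4.7", "Record Test Plan (TP7)", "Included",
              ["Inclusion Criterion 3"], "Test plan is within the thesis scope."),
    ClauseRow("7.2.4.8", "Gain Consensus on Test Plan (TP8)", "Excluded",
              ["Exclusion Criterion 3"], "Stakeholder communication is out of scope."),
]
print([r.num for r in assessment_items(rows)])  # -> ['7.2.4.7']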

Num | Title | Included | Justification
1 | Scope | Excluded | This clause does not include instructions or assessment criteria; it is read and used, but does not contain extractable requirements and is therefore not included in the assessment criteria. Exclusion Criterion 1.
2 | Conformance | Excluded | Same justification as clause 1. Exclusion Criterion 1.
3 | Normative References | Excluded | Same justification as clause 1. Exclusion Criterion 1.
4 | Terms and Definitions | Excluded | Same justification as clause 1. Exclusion Criterion 1.
5 | Multi-Layer Test Process Model | Excluded | This clause does not include instructions or assessment criteria; it is read and used, as it explains the process model utilised, but does not contain extractable requirements and is therefore not included in the assessment criteria. Exclusion Criterion 1.
6 | Organizational Test Process | Excluded | The organizational test process is used to develop and manage organizational test specifications. As this thesis is concerned with only two use-cases and not the entire organization, this clause does not apply. Exclusion Criterion 2.
7 | Test Management Processes | Partial | Some of the sub-clauses are included and some are not. The excluded sub-clauses are due to Exclusion Criteria 3 and 1. Inclusion Criterion 3.
7.1 | Introduction | Excluded | The introduction does not involve instructions, but it is read and considered. Exclusion Criterion 1.


7.2.4.1 | Understand Context (TP1) | Partial | Parts of the process TP1 are directly related to the thesis, but the parts that involve stakeholder communications have been removed. Exclusion Criterion 3. Inclusion Criterion 3.
7.2.4.1.a | Understanding of the context and the software testing requirements shall be obtained to support the preparation of the Test Plan | Included | This sub-clause is essential for assessing the software development process of a safety-related system and it directly affects the test process. Inclusion Criterion 3.
7.2.4.1.b | An understanding of the context and the software testing requirements should be obtained by identifying and interacting with the relevant stakeholders | Excluded | Communications with stakeholders and between members of the organisation are not in the scope of this thesis and are therefore not considered in the assessment. Exclusion Criterion 3.
7.2.4.1.c | A communication plan should be initiated, and lines of communication recorded | Excluded | Same justification as 7.2.4.1.b. Exclusion Criterion 3.
7.2.4.2 | Organize Test Plan Development (TP2) | Partial | Parts of the process TP2 are directly related to the thesis, but the parts that involve stakeholder communications or other forms of communication between the members of the project have been removed. Exclusion Criterion 3. Inclusion Criterion 3.
7.2.4.2.a | Based on the testing requirements identified in TP1, those activities that need to be performed to complete test planning shall be identified and scheduled | Included | This sub-clause is essential for assessing the software development process of a safety-related system and it directly affects the test process. Inclusion Criterion 3.
7.2.4.2.b | The stakeholders required to participate in these activities should be identified | Excluded | Same justification as 7.2.4.1.b. Exclusion Criterion 3.
7.2.4.2.c | Approval of the activities, schedule and participants shall be obtained from the relevant stakeholders | Excluded | Same justification as 7.2.4.1.b. Exclusion Criterion 3.
7.2.4.2.d | Stakeholder involvement should be organised | Excluded | Same justification as 7.2.4.1.b. Exclusion Criterion 3.


7.2.4.3 | Identify and Analyse Risks (TP3) | Included | This sub-clause is essential for assessing the software development process of a safety-related system and it directly affects the test process. Inclusion Criterion 1.
7.2.4.4 | Identify Risk Mitigation Approaches (TP4) | Included | Same justification as 7.2.4.3. Inclusion Criterion 1.
7.2.4.5 | Design Test Strategy (TP5) | Included | Same justification as 7.2.4.3. Inclusion Criteria 1 & 3.
7.2.4.6 | Determine Staffing and Scheduling (TP6) | Excluded | Communications with stakeholders and between members of the organisation are not in the scope of this thesis and are therefore not considered in the assessment. Exclusion Criterion 3.
7.2.4.7 | Record Test Plan (TP7) | Included | The test plan is directly within the scope of this thesis. Inclusion Criterion 3.
7.2.4.8 | Gain Consensus on Test Plan (TP8) | Excluded | Same justification as 7.2.4.6. Exclusion Criterion 3.
7.2.4.8.a | The views of the stakeholders on the test plan shall be gathered | Excluded | Same justification as 7.2.4.6. Exclusion Criterion 3.
7.2.4.9 | Communicate Test Plan and Make Available (TP9) | Partial | Communications with stakeholders and between members of the organisation are not in the scope of this thesis and are therefore not considered in the assessment. However, the availability of the test plan is directly in the scope of the thesis, so this sub-clause is partially considered. Exclusion Criterion 3. Inclusion Criterion 3.
7.2.4.9.a | The Test Plan shall be made available | Included | The availability of the test plan is required for the desired test plan, and a test plan is directly related to the scope of the thesis. Inclusion Criterion 3.
7.2.4.9.b | The availability of the Test Plan shall be communicated to the stakeholders | Excluded | Same justification as 7.2.4.6. Exclusion Criterion 3.


7.2.5 | Information items | Included | The instructions for the documentation produced from the test management process are directly related to the scope of the thesis. Inclusion Criterion 1.
7.3.4.1 | Set-Up (TMC1) | Included | Process TMC1 is directly related to the thesis, as it is an essential subprocess of an appropriate test process. Inclusion Criterion 3.
7.3.4.2 | Monitor (TMC2) | Included | Process TMC2 is directly related to the thesis, as it is an essential subprocess of an appropriate test process. This process is not related to maintenance, making its activities relevant to the scope. Inclusion Criterion 3.
7.3.4.3 | Control (TMC3) | Partial | Parts of the process TMC3 are directly related to the thesis, but the parts that involve stakeholder communications or other forms of communication between the members of the project have been removed. Exclusion Criterion 3. Inclusion Criterion 1.
7.3.4.3.a | Those actions necessary to implement the test plan shall be performed | Included | Directly related to the thesis, as it is an essential requirement for subprocesses of an appropriate test process. Inclusion Criterion 3.
7.3.4.3.b | Those actions necessary to implement control directives received from higher level management processes shall be performed | Included | Same justification as 7.3.4.3.a. Inclusion Criterion 1.
7.3.4.3.c | Those actions necessary to manage the divergence of actual testing from planned testing shall be identified | Included | Same justification as 7.3.4.3.a. Inclusion Criterion 3.
7.3.4.3.d | Means of treating newly identified and changed risks shall be identified | Included | Same justification as 7.3.4.3.a. Inclusion Criterion 1.
7.3.4.3.e | As appropriate: | Partial | The sub-clause is directly related to the thesis, as it involves essential requirements for subprocesses of an appropriate test process. However, the sub-parts involving stakeholder communications have been disregarded. Exclusion Criterion 3. Inclusion Criterion 3.
7.3.4.3.e.1 | Control directives shall be issued to make changes to the way testing is performed | Included | Same justification as 7.3.4.3.a. Inclusion Criterion 3.


7.3.4.3.e.2 | Changes to the test plan shall be in the form of test plan updates | Included | Directly related to the thesis, as it is an essential requirement for subprocesses of an appropriate test process. Inclusion Criterion 3.
7.3.4.3.e.3 | Recommended changes shall be communicated to the relevant stakeholders | Excluded | Communications with stakeholders and between members of the organisation are not in the scope of this thesis and are therefore not considered in the assessment. Exclusion Criterion 3.
7.3.4.3.f | Readiness for commencing any assigned test activity shall be established before commencing that activity, if not already done | Included | As this clause addresses requirements regarding test activities, it falls directly under the scope of the thesis. Inclusion Criterion 3.
7.3.4.3.g | Approval shall be granted at the completion of assigned test activities | Excluded | Same justification as 7.3.4.3.e.3. Exclusion Criterion 3.
7.3.4.3.h | When the testing has met its completion criteria, approval for the test completion decision shall be obtained | Excluded | Same justification as 7.3.4.3.e.3. Exclusion Criterion 3.
7.3.4.4 | Report (TMC4) | Included | Process TMC4 is directly related to the thesis, as it is an essential subprocess of an appropriate test process. Inclusion Criterion 3.
7.3.5 | Information Items | Included | The instructions for the documentation produced from the test monitoring process are directly related to the scope of the thesis. Inclusion Criteria 3 & 1.
7.4.4.1 | Archive Test Assets (TC1) | Included | Directly related to the thesis, as it is an essential requirement for subprocesses of an appropriate test process. Inclusion Criterion 3.
7.4.4.1.a | Those test assets which may be of use later should be identified and made available using appropriate means | Included | Same justification as 7.4.4.1. Inclusion Criterion 3.
7.4.4.1.b | Those test assets which may be reused on other projects should be identified and archived | Included | Identification and archival of documents fall under within-company communications; however, this is directly related to the activities of an appropriate test process and is therefore considered. Inclusion Criterion 3.


7.4.4.1.c | The availability of reusable test assets shall be recorded in the Test Completion Report and communicated to the relevant stakeholders | Partial | Directly related to the thesis, as it is an essential requirement for subprocesses of an appropriate test process; however, the parts concerning communication with stakeholders are not considered. Exclusion Criterion 3. Inclusion Criterion 3.
7.4.4.2 | Clean Up Test Environment (TC2) | Included | Directly related to the thesis, as it is an essential requirement for subprocesses of an appropriate test process. Inclusion Criterion 3.
7.4.4.3 | Identify Lessons Learned (TC3) | Included | Same justification as 7.4.4.2. Inclusion Criterion 3.
7.4.4.4 | Report Test Completion (TC4) | Included | Same justification as 7.4.4.2. Inclusion Criterion 3.
7.4.4.4.a | Relevant information shall be collected from, but not limited to, the following documents: 1) Test Plans, 2) Test Results, 3) Test Status Reports, 4) Test Completion Reports from test phase or test type, 5) Incident Reports | Included | Same justification as 7.4.4.2. Inclusion Criterion 3.
7.4.4.4.b | The collected information shall be evaluated and summarized in the Test Completion Report | Included | Same justification as 7.4.4.2. Inclusion Criterion 3.
7.4.4.4.c | Approval for the Test Completion Report shall be obtained from the responsible stakeholder(s) | Excluded | Communications with stakeholders and between members of the organisation are not in the scope of this thesis and are therefore not considered in the assessment. Exclusion Criterion 3.
7.4.4.4.d | The approved Test Completion Report shall be distributed to the relevant stakeholders | Excluded | Same justification as 7.4.4.4.c. Exclusion Criterion 3.
7.4.5 | Information Items | Included | The instructions for the documentation produced from the test completion process are directly related to the scope of the thesis. Inclusion Criterion 3.
8 | Dynamic Test Processes | Partial | Some of the sub-clauses are included and some are not. The excluded sub-clauses are due to Exclusion Criterion 1. Inclusion Criterion 3.


8.1 | Introduction | Excluded | Does not include requirements or instructions. Exclusion Criterion 1.
8.2 | Test Design & Implementation Process | Included | Directly related to the thesis, as it is an essential requirement for subprocesses of an appropriate test process. Inclusion Criterion 3.
8.3 | Test Environment Set-Up & Maintenance Process | Included | Same justification as 8.2. Inclusion Criterion 3.
8.4 | Test Execution Process | Included | Same justification as 8.2. Inclusion Criterion 3.
8.5 | Test Incident Reporting Process | Included | Same justification as 8.2. Inclusion Criterion 3.

Table 14: Detailed list of requirements extracted from the safety standard ISO/IEC/IEEE 29119-2 [15]. Each row gives a clause or sub-clause of the standard, its inclusion status, the justification for that status, and a reference to the generic inclusion or exclusion criteria. The inclusion status of each clause or sub-clause is one of Included, Excluded, or Partial (partially included).

Num | Title | Included | Justification
1 | Scope | Excluded | This clause does not include instructions or assessment criteria; it is read and used, but does not contain extractable requirements and is therefore not included in the assessment criteria. Exclusion Criterion 1.
2 | Normative references | Excluded | Same justification as clause 1. Exclusion Criterion 1.
3 | Terms, definitions, symbols, and abbreviated terms | Excluded | Same justification as clause 1. Exclusion Criterion 1.
4 | Design Considerations | Partial | Some sub-clauses are included and some are not, leading to this clause's partial inclusion. The excluded sub-clauses are due to Exclusion Criterion 4. Inclusion Criterion 1.
4.1 | Safety objectives in design | Excluded | Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criterion 4.
4.2 | Strategy for risk reduction | Excluded | Same justification as 4.1. Exclusion Criterion 4.
4.3 | Determination of required performance level (PLr) | Excluded | Same justification as 4.1. Exclusion Criterion 4.


4.4 | Design of SRP/CS | Excluded | Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criterion 4.
4.5 | Evaluation of the achieved performance level PL and relationship with SIL | Excluded | Same justification as 4.4. Exclusion Criterion 4.
4.6 | Software safety requirements | Included | The main objective of the following requirements is to have readable, understandable, testable and maintainable software, making them directly relevant to the scope of the thesis. Inclusion Criterion 1.
4.7 | Verification that achieved PL meets PLr | Excluded | Same justification as 4.4. Exclusion Criterion 4.
4.8 | Ergonomic aspects of design | Excluded | Same justification as 4.4. Exclusion Criterion 4.
5 | Safety functions | Partial | This clause provides a list and details of safety functions which can be provided by the SRP/CS. The designer (or type-C standard maker) shall include those necessary to achieve the measures of safety required of the control system for the specific application. As the scope of this thesis is limited to software artefacts, only those are considered. Exclusion Criterion 4. Inclusion Criteria 1 & 2.
5.1 | Specification of safety functions | Included | As the two use-cases under study are safety functions, the specifications for the safety functions must be considered. However, as hardware is outside the scope of this thesis, some specifications are not considered. Inclusion Criterion 2.
5.2 | Details of safety functions | Partial | This clause covers details for various safety functions. Some of these requirements apply to safety functions that are not part of the two use-cases under study; therefore, some of them are removed and those that can be applied to the use cases are included. Exclusion Criterion 4. Inclusion Criterion 2.
5.2.1 | Safety-related stop function | Excluded | This safety function is not part of the two use cases. Exclusion Criterion 4.
5.2.2 | Manual reset function | Excluded | Same justification as 5.2.1. Exclusion Criterion 4.
5.2.3 | Start/restart function | Excluded | Same justification as 5.2.1. Exclusion Criterion 4.
5.2.4 | Local control function | Included | The two safety functions should have the possibility of being controlled locally. Inclusion Criterion 2.
5.2.5 | Muting function | Excluded | The muting function does not apply to the use cases being studied. Exclusion Criterion 4.


5.2.6 | Response time | Included | The response time of the control system is part of the overall response time of the machine, and the required overall response time of the machine can influence the design of the safety-related part. However, the response time of the actual safety function also matters, leading to this requirement being included. Inclusion Criterion 2.
5.2.7 | Safety-related parameters | Excluded | This sub-clause requires that when safety-related parameters, e.g. position, speed, temperature or pressure, deviate from preset limits, the control system shall initiate appropriate measures (e.g. actuation of stopping, warning signal, alarm). This makes it apply to the entirety of the machine instead of specific safety-related functions, leaving it outside the scope of this thesis. Exclusion Criterion 4.
5.2.8 | Fluctuations, loss and restoration of power sources | Excluded | This sub-clause is applicable to the entirety of the machine instead of specific safety-related functions, leaving it outside the scope of this thesis. Exclusion Criterion 4.
6 | Categories and their relation to MTTFD of each channel, DCavg and CCF | Excluded | Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criteria 4 and 5.
7 | Fault consideration, fault exclusion | Excluded | Same justification as clause 6. Exclusion Criteria 4 and 5.
8 | Validation | Excluded | References ISO 13849-2:2012; the clause does not include any requirements in this part of the standard. Exclusion Criterion 6.
9 | Maintenance | Excluded | Not related to software; the scope of this thesis is limited to software artefacts. The standard ISO 12100:2010 is referenced in this section; as that standard is not part of the scope of the thesis, this clause has been disregarded. Exclusion Criteria 6 and 5.
10 | Technical documentation | Included | The technical documentation that must be produced during the design of a safety-related machine is directly applicable to the development process of the use cases, making this clause relevant. Inclusion Criterion 1.
11 | Information for use | Excluded | Not related to software; the scope of this thesis is limited to software artefacts. Exclusion Criteria 4, 5, and 2.

Table 15: Detailed list of requirements extracted from the safety standard ISO 13849-1:2016 [17]. Each row gives a clause or sub-clause of the standard, its inclusion status, the justification for that status, and a reference to the generic inclusion or exclusion criteria. The inclusion status of each clause or sub-clause is one of Included, Excluded, or Partial (partially included).

Num | Title | Included | Justification
1 | Scope | Excluded | This clause does not include instructions; it is read and used, but does not contain extractable requirements. Exclusion Criterion 1.


2 | Normative references | Excluded | This clause does not include instructions; it is read and used, but does not contain extractable requirements. Exclusion Criterion 1.
3 | Terms, definitions, symbols, and abbreviated terms | Excluded | Same justification as clause 2. Exclusion Criterion 1.
4 | Validation Process | Partial | Some of the sub-clauses are included and some are not. Exclusion Criteria 4 & 1. Inclusion Criterion 1.
4.1 | Validation Principles | Excluded | This sub-clause only explains important information, but does not include extractable instructions. Exclusion Criterion 1.
4.2 | Validation Plan | Included | Where the validation plan is related to the development process at the company and can aid the test process, it is relevant. However, the validation plan concerned with the hardware parts of the system is not considered. Inclusion Criterion 1.
4.3 | Generic fault lists | Excluded | The generic fault lists are concerned with hardware and are not in the scope of this thesis. Exclusion Criterion 4.
4.4 | Specific fault lists | Excluded | Specific product-related fault lists in this standard are concerned with the hardware of the safety-related parts and are therefore out of the scope of this thesis. Exclusion Criterion 4.
4.5 | Information for validation | Excluded | The information required for validation will vary with the technology used, the category or categories and performance level(s) to be demonstrated, the design rationale of the system, and the contribution of the SRP/CS to the reduction of the risk. The requirements stated in this clause are concerned with hardware and firmware, while the scope of this thesis is the software development and testing processes. Exclusion Criterion 4.
4.6 | Validation record | Excluded | The validation record is a result of performing the actions in the previous clauses; as those are not relevant to the thesis, neither is their record. Exclusion Criterion 4.
5 | Validation by analysis | Excluded | Validation by analysis does not concern the test process or the software development process of the safety functions, and is therefore out of the scope of this thesis. Exclusion Criteria 4 and 5.
6 | Validation by testing | Included | Validation by testing is concerned with the test process of safety-related systems, so this clause is highly relevant and is taken into consideration. Inclusion Criteria 1 & 3.
7 | Validation of safety requirements specification for safety functions | Included | This clause is concerned with reviewing processes in the software development lifecycle of a safety-related system and is therefore relevant to this thesis. Inclusion Criteria 1 & 2.


8 | Validation of safety functions | Excluded | The requirements for the validation of safety functions in this standard cover both software and hardware functions, as well as testing; the sections applicable to the entirety of the machine or to hardware specifications are disregarded. The part concerning software references sub-clause 9.5 of the standard, and since that clause is included, including this clause as well would be redundant. Exclusion Criteria 4 and 6.
9 | Validation of Performance Levels and Categories | Partial | Some of the sub-clauses are concerned with the software parts of the safety-related system, and some are not. Exclusion Criteria 4 & 5. Inclusion Criteria 1 & 2.
9.1 | Analysis and testing | Excluded | These parts are not concerned with the software parts of the safety-related system and are therefore not in the scope of this thesis. Exclusion Criteria 4 and 5.
9.2 | Validation of category specifications | Excluded | Same justification as 9.1. Exclusion Criterion 4.
9.3 | Validation of MTTFd, DCavg and CCF | Excluded | Same justification as 9.1. Exclusion Criterion 4.
9.4 | Validation of measures against systematic failures related to performance level | Excluded | Same justification as 9.1. Exclusion Criterion 4.
9.5 | Validation of safety-related software | Included | The validation of both safety-related embedded software (SRESW) and safety-related application software (SRASW) is covered by this clause, making it applicable to the software development and testing processes of a safety-related control system. The requirements and instructions in this clause are therefore included. Inclusion Criteria 1 & 2.
9.6 | Validation and verification of performance level | Excluded | Same justification as 9.1. Exclusion Criterion 4.
9.7 | Validation of combination of safety-related parts | Excluded | Same justification as 9.1. Exclusion Criterion 4.
10 | Validation of environmental requirements | Excluded | Validation of environmental requirements is not concerned with software artefacts and is therefore out of the scope of this thesis. Exclusion Criterion 4.
11 | Validation of maintenance requirements | Excluded | Validation of maintenance requirements is not concerned with software artefacts and is therefore out of the scope of this thesis. Exclusion Criterion 5.


12 | Validation of technical documentation and information for use | Excluded | Validation of technical documentation and information for use is concerned with many different aspects of the system, only one of which is software documentation, which is relevant to this thesis. As there are no specific instructions in this clause regarding software documentation (this is done elsewhere), the clause is not of concern in this thesis. Exclusion Criteria 4, 5, and 2.

Table 16: Detailed list of requirements extracted from the safety standard ISO 13849-2:2012 [18]. Each row gives a clause or sub-clause of the standard, its inclusion status, the justification for that status, and a reference to the generic inclusion or exclusion criteria. The inclusion status of each clause or sub-clause is one of Included, Excluded, or Partial (partially included).

Num | Title | Included | Justification
1 | Scope | Excluded | This clause does not include instructions or assessment criteria; it is read and used, but does not contain extractable requirements and is therefore not included in the assessment criteria. Exclusion Criterion 1.
2 | Normative References | Excluded | Same justification as clause 1. Exclusion Criterion 1.
3 | Terms, definitions, and abbreviated terms | Excluded | Same justification as clause 1. Exclusion Criterion 1.
4 | Conformance to this standard | Excluded | Same justification as clause 1. Exclusion Criterion 1.
5 | Key concepts and application | Excluded | Same justification as clause 1. Exclusion Criterion 1.
6 | Software life cycle process | Partial | Some processes are included and some are not, leading to a partial inclusion of this clause. Exclusion Criteria 2, 3, and 5.
6.1.1 | Acquisition process | Excluded | The Agreement processes are organizational processes that apply outside of the span of a project's life, as well as for a project's lifespan. As the thesis is concerned specifically with the two use-cases, this process does not apply to the scope of this thesis. Exclusion Criterion 2.
6.1.2 | Supply process | Excluded | Same justification as 6.1.1. Exclusion Criterion 2.


6.2.1 | Life Cycle Model Management process | Excluded | The Organizational Project-Enabling processes are concerned with providing the resources needed to enable the project to meet the needs and expectations of the organization's stakeholders. All stakeholder interactions are ignored in this thesis since, according to the company, customer requirements are at a very high level and the company writes the software requirements to support the development process. Exclusion Criteria 2 and 3.
6.2.2 | Infrastructure Management process | Excluded | Same justification as 6.2.1. Exclusion Criteria 2 and 3.
6.2.3 | Portfolio Management process | Excluded | Same justification as 6.2.1. Exclusion Criteria 2 and 3.
6.2.4 | Human Resource Management process | Excluded | Same justification as 6.2.1. Exclusion Criteria 2 and 3.
6.2.5 | Quality Management process | Excluded | Same justification as 6.2.1. Exclusion Criteria 2 and 3.
6.2.6 | Knowledge Management process | Excluded | Same justification as 6.2.1. Exclusion Criteria 2 and 3.


6.3.1 | Project Planning process | Excluded | The Technical Management processes are concerned with managing the resources and assets allocated by organization management and with applying them to fulfil the agreements into which the organization or organizations enter. As the thesis is concerned specifically with the two use-cases, and the requirements for the processes in this group concern organizational-level activities, this process does not apply to the scope of this thesis. Exclusion Criterion 2.
6.3.2 | Project Assessment and Control process | Excluded | Same justification as 6.3.1. Exclusion Criterion 2.
6.3.3 | Decision Management process | Excluded | Same justification as 6.3.1. Exclusion Criterion 2.
6.3.4 | Risk Management process | Excluded | Same justification as 6.3.1. Exclusion Criterion 2.
6.3.5 | Configuration Management process | Excluded | Same justification as 6.3.1. Exclusion Criterion 2.


6.3.6 | Information Management process | Excluded | Same justification as 6.3.1. Exclusion Criterion 2.
6.3.7 | Measurement process | Excluded | Same justification as 6.3.1. Exclusion Criterion 2.
6.3.8 | Quality Assurance process | Excluded | Same justification as 6.3.1. Exclusion Criteria 2 and 5.
6.4.1 | Business or Mission Analysis process | Excluded | Business and Mission Analysis is related to the organization encompassing the stakeholders concerned by the activities of the software life cycle, and this process interacts with the organization's strategy. As the thesis is concerned specifically with the two use-cases, this process is not applicable to the scope of this thesis. Exclusion Criteria 2 and 5.

6.4.2 | Stakeholder Needs and Requirements Definition process | Excluded | This process identifies the stakeholders, or stakeholder classes, involved with the system throughout its life cycle, and their needs; it analyses and transforms these needs into a common set of stakeholder requirements. All stakeholder interactions are ignored in this thesis since, according to the company, customer requirements are at a very high level and the company writes the software requirements to support the development process. Exclusion Criteria 2, 3, and 5.

6.4.3 | System/Software Requirements Definition process | Included | This process is directly part of the V-model, and its outcomes are directly relevant to testing. Inclusion Criterion 1.


6.4.4 | Architecture Definition process | Included | This process is directly part of the V-model, and its outcomes are directly relevant to testing. Inclusion Criterion 1.
6.4.5 | Design Definition process | Included | Same justification as 6.4.4. Inclusion Criterion 1.
6.4.6 | System Analysis process | Included | Same justification as 6.4.4. Inclusion Criterion 1.
6.4.7 | Implementation process | Included | Same justification as 6.4.4. Inclusion Criterion 1.
6.4.8 | Integration process | Included | Same justification as 6.4.4. Inclusion Criterion 1.
6.4.9 | Verification process | Included | Same justification as 6.4.4. Inclusion Criterion 1.
6.4.10 | Transition process | Excluded | The purpose of the Transition process is to establish a capability for a system to provide services specified by stakeholder requirements in the operational environment. All stakeholder interactions are ignored in this thesis since, according to the company, customer requirements are at a very high level and the company writes the software requirements to support the development process. Exclusion Criteria 2 and 3.

6.4.11 | Validation process | Included | The purpose of the Validation process is to provide objective evidence that the system, when in use, fulfils its business or mission objectives and stakeholder requirements. All stakeholder interactions are ignored in this thesis since, according to the company, customer requirements are at a very high level and the company writes the software requirements to support the development process. Inclusion Criterion 1.
6.4.12 | Operation process | Excluded | This process establishes requirements for and assigns personnel to operate the system, and monitors the services and operator-system performance. To sustain services, it identifies and analyses operational anomalies in relation to agreements, stakeholder requirements and organizational constraints. As this becomes an organizational matter, and the scope of the thesis is limited to the two use cases, assessment of this process does not apply. Exclusion Criteria 2, 3, and 5.


6.4.13 | Maintenance process | Partial | This process monitors the system's capability to deliver services, records incidents for analysis, takes corrective, adaptive, perfective and preventive actions, and confirms restored capability. As the thesis is limited to the assessment of test processes and processes directly related to testing, this process does not apply, except for sub-clause b.4, which is concerned with regression testing, making it partially relevant. Exclusion Criteria 2, 3, & 5. Inclusion Criterion 3.

6.4.14 | Disposal process | Excluded | The purpose of the Disposal process is to end the existence of a system element or system for a specified intended use. As the thesis is limited to the assessment of test processes and processes directly related to testing, this process does not apply. Exclusion Criterion 5.

Table 17: Detailed list of requirements extracted from the safety standard ISO/IEC/IEEE 12207:2017 [5]. Each row gives a clause or sub-clause of the standard, its inclusion status, the justification for that status, and a reference to the generic inclusion or exclusion criteria. The inclusion status of each clause or sub-clause is one of Included, Excluded, or Partial (partially included).


B. Appendix B - Interview Questions

Introductory Questions
1. What is your position in the company?
2. How many years of experience do you have in your field?

Closed Questions
1. (QID-ACT_28): When assigning responsibility, what factors are considered to determine the appropriateness of competence of the people?
2. (QID-ACT_52): What are the necessary and distinct software environments, including enabling systems or services needed to support development and testing?
3. (QID-Arch_01): How would you describe the alignment and relation of the architecture to the design, stakeholder requirements, and critical safety concerns?
4. (QID-Arch_02): While preparing for architecture definition, what was your strategy and what were your architecture evaluation criteria?
5. (QID-Arch_03): While preparing for architecture definition, how did you assure the alignment of requirements with architectural entities?

6. (QID-Arch_05): What architecture candidates were considered for the preparation of the architecture, and how were they assessed?
7. (QID-Arch_06): How was the management of the selected architecture performed (in terms of the architecture governance approach, governance-related roles and responsibilities, acceptance of the architecture by stakeholders, completeness of the architectural entities and their architectural characteristics, organisation, assessment and control of the evolution of the architecture models and views, the architecture definition and evaluation strategy, traceability of the architecture, and key artefacts and information items that have been selected for baselines)?
8. (QID-Arch_07): What candidate architectures were considered before selecting the one in use? How was the development of models for candidate architectures performed?
9. (QID-DES_29): Have alternative Design Definitions for obtaining software system elements been considered? If yes, how were they assessed?
10. (QID-DES_30): In regards to managing the design, has a periodic assessment of the design characteristics, in case of evolution of the software system and its architecture, been performed?

11. (QID-DES_49): How would you argue that the software design specifications achieve feasibility, testability, and the capability for safe modification?
12. (QID-Imp_10): As there currently are no specific code review criteria (other than the design specifications, which are used for code review), how would you describe the characteristics of the function blocks in your code?
13. (QID-IMP_46): Is there software being reused or adapted? If yes, how is this handled?
14. (QID-IMP_51): I cannot find the implementation priorities to support data and software migration and transition, along with the retirement of legacy systems. Could you elaborate?

15. (QID-PER_24): (Not sure what this means) Has the interface between software and external systems been defined by data? If yes, what performance characteristics have been considered?
16. (QID-Req_04): What was the plan to demonstrate that the software satisfies its safety requirements, and how were the responsibilities divided (in more detail, as in a schedule and list of people)?


17. (QID-REQ_23): Are software safety requirements expressed or implemented by configuration data? If yes, what are the characteristics of the data?

18. (QID-REQ_27): The software safety requirements specification shall specify and document any safety-related or relevant constraints between the hardware and the software. Has this been done? If so, could you provide proof?
19. (QID-REQ_44): In defining the system/software requirements, what critical quality characteristics and risks were considered?

20. (QID-SAF_16): In regards to functional assessment (I am doing the assessment, but), are the activities taking place in this assessment kept in mind throughout the phases? Is there a policy in place in regards to functional safety assessment? If yes, please elaborate.
21. (QID-SAF_17): Where, and how, have the following safety functions been considered? (loss of power, frequency of action, simultaneously active functions)
22. (QID-VAL_26): Does the validation report address accuracy of measurements during the validation by testing? If yes, what measurements are considered?
23. (QID-VER_11): You have mentioned that no analysis has been performed for the selection of test cases; is that correct? How about considerations regarding risk exposure when designing the tests?
24. (QID-VER_12): Do you test functional behaviour, performance criteria (e.g. timing performance), and safety-related signals (I/O testing)? Can you share evidence for that?
25. (QID-VER_13): Are there techniques for the detection of external failures while testing? How do you keep track of incidents and failures that may be detected during testing?
26. (QID-VER_35): The traceability between the test basis, feature sets, test conditions, test coverage items, test cases, test sets and test procedures (and/or automated test scripts) could not be found in the documentation. How is this handled?
27. (QID-VER_38): How would you defend the lack of integration tests for the software?

28. (QID-VER_41): As there is no evidence of a test strategy in the documentation, is there a plan to have one? Is there an "understood" test strategy? 29. (QID-VER_45): Are there test assets that can be reused? If yes, how is this handled? 30. (QID-VER_56): How is the verification of data performed with respect to Hardware aspects?

Open Questions
1. (QID-ACT_14): How is the maintenance activity performed for this system?
2. (QID-ACT_15): How is configuration management done in this project? (If necessary, ask in more detail to guide the discussion)

3. (QID-ACT_31): How has the system analysis activity been performed in detail? (No documentation regarding the system analysis strategy or the actual system analysis was provided.)
4. (QID-ACT_33): Please explain your usage of the CR-Tool. (Make sure the lessons learned, traceability of design and architecture, risk assessment, change management, software integration, and the test execution log are considered.)

5. (QID-ACT_34): How is integration done? (No documentation provided.)
6. (QID-DES_39): Certain design characteristics and principles have not been expressed in the documentation. Let’s discuss them.


7. (QID-DES_48): To the extent required by the safety integrity level, what does the software or design representation (including a programming language) have?

8. (QID-IMP_42): As information couldn’t be found on this topic, please elaborate on your requirements for support tools, including programming languages. (If needed, ask more detailed questions to guide the discussion)
9. (QID-MOD_21): How are modifications handled in general? What are the plans and procedures for modifications, their traceability, impact analysis, documentation, and verification? If applicable, ask in detail about impact analysis (61508_3:7.8.2.3).
10. (QID-MOP_18): What are the required states or modes of operation of the software system?
11. (QID-MOP_19): What parameters are considered for the operation of the system, to ensure safety?

12. (QID-REQ_22): The software safety requirements specification shall express the required safety properties of the product (not the project); what safety requirements are considered? (As this is about the product, maybe not in the scope of the thesis - keep the question but double-check with supervisors if it should be included)
13. (QID-REQ_25): Certain safety-related requirements regarding monitoring and testing could not be found in the provided documentation. What can you tell me about such requirements?
14. (QID-REQ_57): Could you elaborate on how the safety requirements have been assessed? (completeness, critical performance measures, issues, deficiencies, conflicts, and weaknesses)
15. (QID-VAL_55): Please elaborate on what else will be included in the validation report. (If needed, ask in detail with reference to each requirement)
16. (QID-VER_20): As previously mentioned by you, a verification report will not be provided; however, the validation report will include information about verification activities. Will this document include your verification plan? If yes, please elaborate on the plan.
17. (QID-VER_43): Will the validation report include verification of the software architecture? Please elaborate.
18. (QID-VER_47): How would you describe the process of test monitoring and test control activities for your system?
19. (QID-VER_50): As there is no verification report, could you tell me the information that is verified throughout the software lifecycle? Could you specify the verification activities? (If needed, go into more detail to guide the discussion)
20. (QID-VER_53): Were verification constraints and priorities identified before performing verification? Please elaborate.
21. (QID-VER_54): Please elaborate on your implementation of the test design and implementation process. The process as explained in DOC-1557 is incomplete, and more information regarding the test conditions, procedures, and test data is needed.


C. Appendix C - Data Sources and Types

Source ID | Source Type
DOC-1806 | Test Report - Use Case 2
DOC-0001 | Software Architecture
DOC-1600 | Application Design Description
DOC-1557 | Software Development Process
DOC-1142 | Programming Rules and Guidelines
DOC-1808 | Test Report - Use Case 1
DOC-0002 | End User Guide
DOC-1111 | Document Review Checklist
DOC-1480 | I/O Mapping
DOC-1527 | Module Design Specification
DOC-1576 | Code Review Record
DOC-1598 | CR-Tool Guidelines
DOC-1818 | I/O Mapping
DOC-1865 | I/O Mapping
DOC-1487 | Traceability Matrix
DOC-1003 | Requirement Specification - Use Case 1
DOC-1526 | Software Design Specification - Use Case 1
DOC-1554 | Test Report - Use Case 1
DOC-2025 | Requirement Specification - Use Case 2
DOC-1525 | Software Design Specification - Use Case 2
DOC-1544 | Test Specification - Use Case 2
DOC-1553 | Test Specification - Use Case 1
INT-0319 | Interview
INT-0326 | Interview

Table 18: Table showing the ID of the data sources collected from the company, and each data source’s type.


D. Appendix D - Condensed Assessment Results

A condensed version of the assessment results is presented in this appendix. The results have been grouped based on their compliance degree, and the reasons for the given compliance degrees have been assembled, summarised, and condensed. The justifications are kept as precise as possible without revealing sensitive information that could disclose the company's identity. Table 19 shows the selected requirements with which the company's development process failed to comply. Table 20 presents the requirements that were partially met, as deduced from studying the provided information. The company's development process was deemed to fully comply with the requirements in Table 21. Each table gives, for every selected requirement from a safety standard, the number of sub-clauses within the corresponding clause or group of sub-clauses, together with the justification for the assigned level of compliance.
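As a minimal sketch of how the condensed grouping in this appendix can be produced, the following Python fragment collects per-requirement compliance degrees and reports the size and share of each group. The example data, labels and function name are hypothetical and only mirror the structure of Tables 19 to 21; they are not the actual assessment data.

from collections import defaultdict

# Hypothetical per-requirement results: (standard, requirement group, compliance degree).
results = [
    ("ISO/IEC/IEEE 12207", "6.4.3", "fail"),
    ("ISO 13849-1", "4.6.3", "fail"),
    ("ISO/IEC/IEEE 29119-2", "7", "partial"),
    ("ISO 13849-2", "9.5", "full"),
]

def condense(results):
    # Group requirements by compliance degree and compute each group's count and share (%).
    groups = defaultdict(list)
    for standard, req, degree in results:
        groups[degree].append((standard, req))
    total = len(results)
    return {degree: (len(reqs), round(100 * len(reqs) / total))
            for degree, reqs in groups.items()}

print(condense(results))  # e.g. {'fail': (2, 50), 'partial': (1, 25), 'full': (1, 25)}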

Standard | Req Group Num | Count | Justification
12207 | 6.4.3 | 14 | Although there are clearly requirements that relate to risks and the criticality of the software system, they are not identified as such in any of the documentation. No mention of cost targets, critical quality characteristics or performance measures (other than one) in the documentation.
12207 | 6.4.4 | 23 | No specific roadmap definition, reasoning for the choice of architecture, boundaries, or relation of the design to the architecture in the documentation, only generic ideas. According to INT-0319, since the hardware and software frameworks are already in place, these aspects have not been documented but are considered. The sub-clause would pass if explicit documentation on the roadmap, approach, rationale, boundaries, and strategy existed. As the architecture documentation (DOC-0001) is unfinished, it does not include information regarding the assessment of architecture candidates or the management of the selected architecture.
12207 | 6.4.5 | 10 | No evidence of prioritisation, of examining the feasibility of implementation, or of the status of design principles and design characteristics. No evidence of assessment of alternatives for obtaining software system elements.
12207 | 6.4.6 | 17 | No documentation regarding system analysis has been provided by the company.
12207 | 6.4.7 | 10 | Evaluation of software units and affiliated data or other information according to the implementation strategy and criteria is evidently planned based on DOC-1576, but code review has not been performed for the use cases under study according to DOC-1565. No information regarding reused or adapted software.
12207 | 6.4.8 | 8 | No quality assurance activity. No documentation of the management of integration results.
12207 | 6.4.9 | 4 | Even though there are actions that must be taken before performing verification by testing, such constraints, priorities, or risks are not identified in the test specification documents for the use cases (DOC-1553 and DOC-1544) or anywhere else. As the verification strategy is not defined, neither are the constraints.
12207 | 6.4.11 | 8 | INT-0319 specifies that a Validation Plan is to be completed; however, no accurate plan or timeline is given.

xxv Ladan Pourvatan Test Process Assessment of Industrial Control Systems

6.4.13 1 No identification of procedures for correction of flaws (defects) and errors, or for replacement or upgrade of system elements. (Using regression tests) Modification and impact analysis are not mentioned in the documentation in any detail and it should be considered in 4.6.2 5 DOC-1557. No reference or evidence of control flow analysis for verification purposes having been performed 13849-1 Although testing of the functions is done according to the test specification documentation, not evidence of actual functional testing or analytical methods for reaching the test cases is given. DOC-1553 and DOC-1544 DOC-1526 and DOC-1525 provide modular and structured design for each use case . As the code review is done accord- ing to these documents, and it is decided that the code is also modular and structured. However according to DOC-1576, the two use cases have not been reviewed, therefore failing this clause. No evidence od Detection and control of external failure, or configuration management 4.6.3 17 Coding guidelines, documented in DOC-1142, do not specify modular and structured programming predominantly realized by function blocks deriving from safety-related validated func- tion block libraries, function blocks of limited size of coding, code execution inside function block which should have one entry and one exit point, assignment of a safety output at only one program location. The guidelines must specifically mention that code shall be readable, understandable and test- able and, because of this symbolic variables (instead of explicit hardware addresses) should be used; data integrity and plaus- ibility checks (e.g. range checks.) available on application layer (defensive programming) should be used; and that veri- fication should be by control and data flow analysis for PL = d or e. Testing does not include the appropriate validation method, black-box testing of functional behaviour and performance criteria (e.g. timing performance); test case execution from boundary value analysis; test planning, or I/O testing. Documentation is not complete. Code documentation within source text was not provided to contain module headers with a legal entity, functional and I/O description, version and ver- sion of used library function blocks, and sufficient comments of networks/statement and declaration lines. Such character- istics must be mentioned in coding guidelines DOC-1142 4.6.4 1 No evidence supporting that verification that the data/signals for parameterization are generated and processed in such a way that faults cannot lead to a loss of the safety function. The test cases may do this, but no reasoning to support this is provided and should be documented. 5 8 No evidence of results of the risk assessment for each specific hazard or hazardous situation; machine operating character- istics, the behaviour of the machine on the loss of power; the frequency of operation; priority of those functions that can be simultaneously active and that can cause conflicting action.

xxvi Ladan Pourvatan Test Process Assessment of Industrial Control Systems

10 8 The following information could not be found explicitly in the documentation: the exact points at which the safety-related part(s) start and end; environmental conditions; the paramet- ers relevant to the reliability; measures against systematic fail- ure; all safety-relevant faults considered; justification for fault exclusions; the design rationale or measures against reasonably foreseeable misuse. 4.2 2 A validation plan does not exist, and information regarding operational and environmental conditions during testing, and 13849-2 analyses and tests to be applied is missing from documenta- tion. 6 11 Information about accuracy of time measurements, pressure measurements, force measurements, electrical measurements, relative humidity measurements, linear measurements while validation by testing is missing. 7 1 Documentation in regards to measures to detect systematic faults (errors, omissions or inconsistencies) cannot be identi- fied. 9.5 3 ?Nothing in the Validation Report Doc-1830 in regards to ex- tended test cases, limit value analysis, software-based meas- ures for control of failures, or prevention of systematic faults. 6 11 A Test Plan does not exist. Reviewing the documentation indicates that risk assessment 7 51 and incident reporting during testing doesn’t exist. INT-0326 29119-2 confirms this. Reviewing the documentation indicates the lack of sufficient analysis for test purposes, control directives, test strategy, test control and monitoring activities, test planning activities, and traceability between various activities. INT-0326 confirms these. INT-0319 specifies that a Test Plan does not exist. Reviewing the documentation indicates the lack of sufficient 8 46 analysis for test purposes, control directives, test strategy, test control and monitoring activities, test planning activities, and traceability between various activities. INT-0326 confirms these INT-0319 specifies that a Test Plan does not exist. INT-0319 specifies that a Test Plan does not exist. 6 2 29119-3 The coverage items are not identified in the test specification document. Test cases don’t have a priority or a unique id. 7 8 The coverage items are not identified in the test specification document. Test cases don’t have a priority or a unique id. Individual Documents of Test Procedure Specification, Test Data Requirements, Test Environment Requirements, Test Data Readiness Report, Test Environment Readiness Report, Test Execution Log Test Incident Reporting do not exist 5 4 The phases of the overall software safety lifecycles are incom- plete and the documentation does not contain sufficient in- formation required for the management of functional safety, or 61508-1 implementation of a functional safety assessment, even though the functional safety assessment is currently being performed. The verification activities specified in 7.18 are not completed by the company and no plan for such completion has been 6 25 identified. No quality assurance or quality management activities are pre- cisely defined in the process of this company.

xxvii Ladan Pourvatan Test Process Assessment of Industrial Control Systems

INT-0326 indicates that the competence of members is only measured by a general set of criteria and then the members are trained. Therefore none of the sub-clauses regarding com- petence measurement pass. No quality assurance, quality management, or configuration management activities are defined in the process of this com- pany. INT-0326 confirms this. 8 1 Although the functional safety assessment is ongoing, a specific documented plan for this assessment is not provided. 6 9 No mention or precise documentation of a functional assess- ment plan or configuration management activities. 7.1 4 Failure of related clauses (Clause 6), lack of Quality Assurance activities confirmed in INT-0319, and no impact analysis. "No documentation of failure analysis, hardware safety integrity requirements, reasonably foreseeable misuse of software, any safety-related or relevant constraints between the hardware and the software can be identified in the documentation. INT-0326 indicates that due to the nature of the project, these are deemed unnecessary." Functions related to the detection, annunciation and man- 7.2 24 agement of faults in the programmable electronics hardware, sensor and actuators faults, programmable electronics hard- ware; the software itself (software self-monitoring), periodic 61508-3 testing of safety functions on-line (i.e. in the intended op- erational environment); Interfaces to non-safety-related func- tions; Capacity and response time performance are not identi- fied in detail in the documentation. Parts can be found in the Architecture documentation DOC-0001, however, passing this clause requires completion and the parts are not sufficient to grant partial compliance. Where data defines the interface between software and ex- ternal systems, performance characteristics are not systemat- ically considered or documented. No evidence of operational parameters being protected against: Invalid, out of range or untimely values; Unauthorized changes; or Corruption can be identified. INT-0319 confirms the lack of documentation of such evidence. A planning specifying the steps, both procedural and tech- 7.3 18 nical, that will be used to demonstrate that the software sat- isfies its safety requirements is not provided in the document- ation. Missing the identification of the relevant modes of the EUC operation from documentation, including: Preparation for use including setting and adjustment; Start up, teach, automatic, manual, semi-automatic, steady state operation; Re-setting, shut down, maintenance; Reasonably foreseeable abnormal conditions and reasonably foreseeable operator misuse Nothing about timing constraints, shared resources, exception handling or comments are in guidelines DOC-1142, or any 7.4 56 other documentation. Code Review records DOC-1576 indicates the functions have not been reviewed. There is no evidence of integration tests in the documentation and INT-0316 confirms this. As the validation plan is still ongoing, there is no evidence of any validation activities having been performed. No evidence of configuration management can be found. 7.5 16 There is no evidence of integration tests in the documentation and INT-0316 confirms this. 7.8 24 No documentation on software modification activities. INT- 0326 indicated that modifications are treated as new projects, but there is no evidence to support this.

xxviii Ladan Pourvatan Test Process Assessment of Industrial Control Systems

Although DOC-1557 specifies the plan for producing a verifica- 7.9 58 tion report, INT-0326 denies any plan for having a verification report or verification plan report. No analysis has been performed for testing or any other verific- ation activities, this is confirmed by INT-0319. Such analyses must be performed and documented. There is no evidence of integration tests in the documentation and INT-0319 confirms this.

Table 19: Condensed assessment results grouped by failed compliance. For each entry, the shortened name of the standard, the number of the clause or sub-clause group, and the count of sub-clauses within the requirement that failed to comply are given, followed by the justification for the failed compliance with references to data sources.

Standard, Clause (number of sub-clauses): Justification

61508-1, 5 (1): The information to be documented is partially as stated in the various clauses of this standard; further detail is given in different clauses. INT-0319 specifies that a Validation Plan is to be completed.

61508-1, 6 (14): According to INT-0319, incident reporting and other procedures are done via the CR-tool; however, explicit documentation of this is needed in order to pass the requirements. INT-0319 specified procedures compliant with the standard, including the diagnostics system; however, to pass the relevant requirements, explicit documentation is needed. As the functional assessment is currently being performed, the criteria in relation to it are deemed partially passed.

61508-1, 8 (19): The functional safety assessment is ongoing, so the results are still incomplete and partial.

61508-3, 6 (1): Dependent on 61508-1_6.2.

61508-3, 7.1 (2): In theory, according to DOC-1557, the software safety lifecycle satisfies the requirements: it is acceptable to tailor the V-model to take account of the safety integrity and the complexity of the project. However, based on further investigation of other relevant sub-clauses, the follow-through is not sufficient. The results of the activities in the software safety lifecycle are documented, however the documentation is incomplete.

61508-3, 7.2 (15): INT-0319 indicates that the software safety requirements specification considers software self-monitoring; monitoring of the programmable electronics hardware, sensors, and actuators; periodic testing of safety functions while the system is running; enabling safety functions to be testable when the EUC is operational; and software functions to execute proof tests and all diagnostic tests in order to fulfil the safety integrity requirement. However, as documentation providing concrete evidence of this consideration is missing, the sub-clauses only partially pass.

61508-3, 7.3 (2): It is specified in DOC-1557 that a validation report must be an output of the validation process; INT-0319 specifies that a Validation Plan is to be completed.

61508-3, 7.4 (15): In accordance with the required safety integrity level and the specific technical requirements of the safety function, the chosen design method possesses features that facilitate the expression of testability, but the capacity for safe modification is not documented as having been considered. The chosen design method does possess features that facilitate software modification; however, modification is not mentioned anywhere in the documentation. No documentation of reuse of pre-existing software elements or of their previous verification. The requirements are available before performing design activities; however, the unfinished architecture document DOC-0001 and the unavailability of the validation report suggest that this information was not available prior to detailed design. Test specifications DOC-1544 and DOC-1553 and the test reports DOC-1806 and DOC-1808 show that the functions are checked to do what they are intended to do, but not what they should not do. The tests are incomplete; some form of analysis should be done.

61508-3, 7.7 (7): INT-0319 specifies that a Validation Plan is to be completed.

61508-3, 7.9 (24): INT-0319 specifies that a Validation Plan is to be completed.

29119-2, 8 (4): Test design specifications and test case specifications are put together; they may be identified as the test procedure, although incomplete. DOC-1806 and DOC-1808 show that the tests passed, and it is safe to assume the actual results are the same as expected; however, this is not recorded in the report.

29119-3, 6 (1): DOC-1806 and DOC-1808 include certain information. Failure reasons: the scope should be explained in more detail, a glossary should be included, and deviations from planned testing, factors that blocked progress, residual risks, reusable test assets, and lessons learned are missing.

29119-3, 7 (1): DOC-1544 and DOC-1553 are missing notation conventions, a glossary, feature sets with a specific strategy and traceability, and test conditions with assigned priority and traceability.

13849-1, 4.6.2 (1): Measures for the control of systematic failures consist of a constantly running diagnostics system, according to INT-0326. Failure detection by automatic tests is performed according to the test specifications and the development process documentation. Measures for controlling the effects of errors and other effects arising from any data communication process could not be identified anywhere in the documentation.

13849-1, 4.6.3 (10): The selection of tools, libraries, and languages meets the requirements according to INT-0326 and INT-0319; however, there needs to be specific documentation of their characteristics. Verification activities are listed to be followed, however there is no evidence of them actually being followed, and in fact there is evidence to the contrary. According to the interview, modifications are treated as new projects; this must be explicitly stated in the documentation to pass.

13849-1, 5 (3): DOC-0001 states that an emergency stop operation will be specified, and describes the input block of the programs. For the two use cases, the state the machine should be in is stated in the test specifications, but not for the overall safety function activities, leading to partial failure.

13849-1, 10 (3): The technology or technologies used are mentioned in various documents; however, a full account of these in a single place could not be identified.

13849-2, 4.2 (2): Though a validation plan does not exist, certain information is provided in the validation report that satisfies the requirements for a validation plan. The information about the validation process and scheduling in DOC-1830 and DOC-1557 is not complete.

13849-2, 6 (2): A mixture of passing and failing sub-clauses.

13849-2, 9.5 (1): Black-box testing is mentioned in the Validation Report DOC-1830, however there is no evidence in other documentation of this actually being performed. Interfaces and functions are defined but boundaries are not; if there are no boundaries, this should be mentioned.

12207, 6.4.3 (10): It is stated in DOC-1557 that stakeholder/customer input requirements specifications are used as an input for the requirement definition process in the development lifecycle. The traceability of requirements to stakeholder requirements is also supposed to be done as a task in this process; an actual link between each safety requirement and its relevant stakeholder requirement is, however, missing. Enabling systems for requirements definition include tools for facilitation and requirements management; these tools were identified in INT-0326 and INT-0319, however a plan for using them or an official identification of them is not mentioned in the documentation provided. DOC-0001 has a general description of the data structures and formats; however, the definitions for the requirements do not include information about data elements, structures, formats, databases, or data retention requirements. The rationale for the requirements is given in DOC-1003, and traceability to software system elements is also included; no traceability to test cases, information items or methods of verification can be found, and there is no documentation of any evaluated risk.

12207, 6.4.4 (8): Key drivers are identified by reviewing market studies, organisational strategies, the mission or business concept of operations, etc. Unfortunately, none of this information is included in DOC-1557, which identifies the tasks needed for the architecture process, or in any of the architecture documentation, namely DOC-0001. According to INT-0319, since the hardware and software frameworks are already in place, these aspects have not been documented but are considered; the sub-clause would pass if there were explicit documentation of the pertinent information identifying the key drivers of the architecture. The connection between requirements and architectural entities is missing in the architecture documentation. There is alignment between the design and the elements; however, no clear linkage is specified today. According to INT-0319, there is a plan to make this bridge between the artefacts, but currently there is no such linkage.

12207, 6.4.5 (5): Enabling systems for design definition include tools for facilitation and design management; these tools were identified in INT-0326 and INT-0319, however a plan for using them or an official identification of them is not mentioned in the documentation provided. Traceability between the detailed design elements and the system/software requirements is established in DOC-1487, but traceability with the architectural entities of the software system architecture is not documented.

12207, 6.4.7 (9): Standards governing safety are given (distributed) in DOC-1142; nothing about environmental practices, except for the simulation set-up. Programming and coding standards are given to an extent in DOC-1142 and DOC-1600. The lifecycle is well defined in DOC-1557 for development; however, nothing is given for support environments.

12207, 6.4.8 (5): Integration activities are performed according to INT-0326 and INT-0319, and are distributed across various documents; however, nowhere in the documentation are they grouped as integration activities.

12207, 6.4.9 (10): The purpose and conditions are explained for the tests; however, no conformance criteria are mentioned. Enabling systems for requirements definition include tools for facilitation and requirements management; these tools were identified in INT-0319, however a plan for using them or an official identification of them is not mentioned in the documentation provided. As a result of these procedures, it is stated in DOC-1557 that Test Specifications, a Test Implementation, a Verification report for safety-related inputs, outputs and parameters, and a Test report are produced. All of these documents except for the verification report are produced and approved, since they have a release ID, showing that the verification procedures have been performed (test specifications: DOC-1553 and DOC-1544; test implementation: DOC-1556 and DOC-1546; test reports: DOC-1808 and DOC-1806). It was stated in INT-0319 that there will not be such an overall document for the verification; the Validation report will list all individual artefacts/documents.

12207, 6.4.11 (6): A mixture of passing and failing sub-clauses.

Table 20: Condensed assessment results grouped by partial compliance. For each entry, the shortened name of the standard, the number of the clause or sub-clause group, and the count of sub-clauses within the requirement that were partially complied with are given, followed by the justification for the partial compliance with references to data sources.

Standard, Clause (number of sub-clauses): Justification

61508-1, 5 (6): All the provided documentation is accessible, maintainable, accurate and easy to understand, has titles, an index arrangement and a revision index, takes account of company procedures, and is structured, revised, amended, reviewed and approved under an appropriate document control scheme. Reference to DOC-1111.

61508-1, 6 (6): DOC-1557 shows, and INT-0326 confirms, that there are persons appointed to carry out one or more functional safety assessments with access to the relevant safety lifecycle activities, and that all persons in charge of different activities are identified and their responsibilities clearly communicated to them. Procedures for defining what information is to be communicated, and for ensuring prompt follow-up and satisfactory resolution of recommendations relating to E/E/PE safety-related systems, including those arising from hazard and risk analysis, are handled in the CR-Tool, according to INT-0326 and DOC-1598. The operation manual includes training and information for the emergency services.

61508-1, 8 (5): Persons are appointed to carry out the functional safety assessments, and access to the relevant documentation of the different phases is granted.

61508-3, 7.1 (3): DOC-1557 and DOC-0001 include sufficient information regarding the division of the software safety lifecycle into activities, with the scope, inputs and outputs specified for each phase.

61508-3, 7.2 (11): If the requirements for safety-related software have already been specified for the E/E/PE safety-related system (see Clause 7 of IEC 61508-2), then the specification of software safety requirements need not be repeated. DOC-1557, DOC-1447, DOC-2025 and DOC-1003 include the specification of the requirements for safety-related software, which is evident to have been derived from the specified safety requirements, is made available, is sufficiently detailed to allow the design and implementation to achieve the required safety integrity, and allows an assessment of functional safety to be carried out.

61508-3, 7.3 (5): It is stated in DOC-1557 that the validator must carry out all the tasks in the validation process, except for the approval of the validation report, which is to be carried out by the line manager. The required input and output signals, with their sequences and their values, are found in DOC-1818 and DOC-1865.

61508-3, 7.4 (31): The design documents DOC-1525, DOC-1526, DOC-1600 and DOC-1487 include the use of UML, which helps with understanding; based on the document reviews stated in the documents, the contents are comprehended. Abstraction, modularity and other features that control complexity; functionality; information flow between elements; data structures and their properties; and design assumptions and their dependencies are all included in the aforementioned documents. The division of responsibility is stated in DOC-1557. DOC-0001 includes an integrated set of techniques and measures necessary during the software safety lifecycle phases to satisfy the software safety requirements specification at the required safety integrity level; the notation for representing the architecture is unambiguously defined. INT-0326 and DOC-1142 provide evidence that the programming languages follow a suitable programming language coding standard. Each software module is verified as required by the software module test specification that was developed during software system design, and the results of the software module testing are documented, as evident from the test specification documents DOC-1544 and DOC-1553 and the test reports DOC-1806 and DOC-1808.

61508-3, 7.7 (8): Test cases and their results are documented in DOC-1544 and DOC-1553 and in the test reports DOC-1806 and DOC-1808. The division of responsibility is documented in DOC-1557.

61508-3, 7.9 (7): The test specifications DOC-1544 and DOC-1553 and the test reports DOC-1808 and DOC-1806 are compatible, readable, and dependent on the software architecture design and the software safety requirements specification.

29119-2, 7 (11): Test reports DOC-1806 and DOC-1808 exist with release dates, test data, test environment requirements, test deliverables, and test results.

29119-2, 8 (15): Test specifications DOC-1544 and DOC-1554 exist with release dates, test conditions, test cases, approval, and test environment and data information. Test reports DOC-1808 and DOC-1806 include test results.

29119-3, 7 (2): The test results exist in the test reports for each use case, DOC-1806 and DOC-1808.

13849-1, 4.6.3 (7): Safety functions with the required PL and associated operating modes, performance criteria, the hardware architecture with external signal interfaces, semi-formal methods to describe data and control flow, and the three-stage architecture model (Inputs, Processing, Outputs) are all presented in the architecture document DOC-0001. Justified or accepted coding guidelines are used, and the code is tested by simulation according to DOC-1142 and the test specification documents DOC-1544 and DOC-1553.

13849-1, 5 (1): The desired behaviour of the system is stated in DOC-1487, DOC-1003, and DOC-2025.

13849-1, 10 (5): Documentation of the safety function(s) provided by the SRP/CS, the characteristics of each safety function, the performance level (PL), and the category or categories selected is in the software documentation (specifically DOC-0001).

13849-2, 4.2 (2): Though a validation plan does not exist, sufficient information about the identification of specification documents and standards is presented in the Validation Report DOC-1830.

13849-2, 6 (11): The name of the person carrying out the test, the environmental conditions, the test procedures and equipment used, the date of the test, and the results of the test are all included in the test reports DOC-1806 and DOC-1808.

13849-2, 7 (4): Sufficiently explicit information in the validation report DOC-1830.

13849-2, 9.5 (4): Validation activities in DOC-1830, compliant with the sub-clauses.

12207, 6.4.3 (12): Functional requirements and the interface are explained in DOC-1487 and DOC-1003. Performance requirements are those of PLd. Design constraints are defined in DOC-1526, and the software is classified as category 3 for safety design. Non-functional requirements are included in the documentation. Critical performance measures are not defined in the documentation, but since the level is PLd and it is referenced where to find critical performance measures for this group of systems, this is considered passed. The definition strategy for the requirements is defined in DOC-1557, Section 3. Access to the enabling systems or services is acquired by the company, as they are currently in use. The functions that the element is required to perform are defined in detail in DOC-1487.

12207, 6.4.4 (8): Access to the enabling systems or services is acquired by the company, as they are currently in use. Architecture viewpoints have been developed, and no candidate architectures were considered, according to INT-0326.

12207, 6.4.5 (6): The design definition strategy is defined in DOC-1557, and logic diagrams and activity diagrams are utilised in DOC-1526. DOC-1480, DOC-1526, and DOC-1487 include the design artefacts and rationales.

12207, 6.4.7 (5): Procedures and methods for software development (construction) and the development of unit tests; the use of peer reviews, unit tests, and walkthroughs during implementation; and the use of CM control during software construction are provided in DOC-1577 and DOC-1598.

12207, 6.4.8 (5): DOC-1557 includes information regarding the Implementation process, the Verification process, and the Validation process.

12207, 6.4.9 (4): For the specified test cases, what should be verified is specified along with the expected results in documents DOC-1553 and DOC-1544. Testing has been chosen as a verification method, as stated in DOC-1557.

12207, 6.4.11 (5): Sufficient information in the validation report DOC-1830.

Table 21: Condensed assessment results grouped by full compliance. For each entry, the shortened name of the standard, the number of the clause or sub-clause group, and the count of sub-clauses within the requirement that were complied with are given, followed by the justification for the compliance with references to data sources.
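The compliance degrees reported in this thesis are aggregated from counts such as those in Tables 19 to 21. The following minimal sketch, in Python, shows one way to roll per-clause sub-clause counts up into overall percentages. The records listed are only an illustrative subset of the data, and the (standard, clause, count, degree) tuple layout is an assumption made for the example rather than the format used during the assessment.

    from collections import defaultdict

    # Illustrative subset of the condensed results: (standard, clause, sub-clause count, degree).
    # The counts below are examples taken from Tables 19-21; the full data set is larger.
    records = [
        ("12207", "6.4.3", 14, "fail"),
        ("12207", "6.4.3", 10, "partial"),
        ("12207", "6.4.3", 12, "full"),
        ("61508-1", "5", 4, "fail"),
        ("61508-1", "5", 1, "partial"),
        ("61508-1", "5", 6, "full"),
    ]

    def compliance_summary(rows):
        """Sum sub-clause counts per compliance degree and return percentages."""
        totals = defaultdict(int)
        for _standard, _clause, count, degree in rows:
            totals[degree] += count
        overall = sum(totals.values())
        return {degree: round(100 * n / overall, 1) for degree, n in totals.items()}

    if __name__ == "__main__":
        # Prints a mapping such as {'fail': ..., 'partial': ..., 'full': ...} for the sample rows.
        print(compliance_summary(records))

Applied to the full set of extracted requirements, the same calculation yields the overall compliance figures discussed in the main text.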


E. Appendix E - Focus Group Procedure
The procedure of this focus group follows the guidelines proposed by Gibbs [30] and Breen [31]. The research problem formulation for executing the focus group is as follows:
• How do people from this company view these recommendations?

• How can these recommendations support the improvement of the current testing process?
Planning and preparation for the focus group were performed by preparing a session to present the process of the thesis, the assessment results, the analysis results, and the recommendations. General questions for moderating the session were created to guide the discussion towards evaluating the recommendations made for the company's test process. Selection and recruitment of participants were done by the company. The session was conducted through a presentation and a moderated discussion, which was recorded; the recording was then transcribed and the data analysed. The ethical considerations for the focus group are addressed in Section 6. The questions for guiding the focus group session are as follows:

• How do you view these assessment results?
  – To the best of your knowledge, are these assessment results accurate?
  – Does the presentation of the results meet your needs concerning the areas of improvement in your development process?
• How do you view these recommendations?
  – Are these recommendations sufficiently understandable and clear?
  – (In your opinion) How does applying these recommendations affect the time efficiency of the test process?
  – How compatible are these recommendations with the current test process? Are they easy to apply?
  – Do you believe these recommendations will lead to a test process that is easy to integrate with the other processes at the company?
  – Are these recommendations sufficiently consistent with each other?
  – Are these recommendations justified to your satisfaction?
• How can these recommendations support the improvement of the current testing process?

  – Do you believe applying these recommendations to the test process will improve the level of safety assurance of the developed system?
  – Do you believe applying these recommendations to the test process will improve the overall reliability of the developed system?
  – Do you believe applying these recommendations to the test process will improve the overall performance of the developed system?


F. Appendix F - Recommendations

ID, Term: Definition. Notes.

TG_1, Testing Requirements: Identification of test item(s) and test conditions. Notes: Test conditions can be used to derive coverage items, or can themselves constitute coverage items.

TG_2, Test Data: Data which has been selected, or created, for the purpose of satisfying the input requirements for the execution of test cases. Notes: Test data can be stored within the product under test (e.g., in arrays, flat files, or a database), or can be available from or supplied by external sources, such as other systems, other system components, hardware devices, or human operators.

TG_3, Test Monitoring and Control Process: The process which ensures that testing is performed in line with the Test Plan.

TG_4, Factors blocking progress: Identifies those factors that impeded progress during the reporting period and the corresponding solutions that were implemented to remove them. Outstanding (unsolved) issues still impeding progress should be recorded and possible solutions identified.

TG_5, Test Basis: The body of knowledge used as the basis for the design of tests and test cases. Notes: Documentation of the requirements specification, design specification, or module specification.

TG_6, Feature Set: The logical subset of the test item(s) which may be treated independently of other feature sets in various test design activities. Notes: This could be the set of all features for the item (its full feature set) or a subset identified for a specific purpose (the functional feature set, etc.).

TG_7, Test Condition: The testable aspect of a component or a system identified as a basis for testing. Notes: A function, transaction, feature, quality attribute, or structural element.

TG_8, Test Coverage Items: Attribute or combination of attributes derived from test conditions by using a test design technique. Notes: Items enabling the measurement of the thoroughness of the test execution.

TG_9, Test Procedures: The sequence of test cases in execution order, along with actions required for setting up the initial preconditions, in addition to activities that must be done after completion of execution. Notes: Test procedures include detailed instructions for how to run a set of one or more test cases selected to be run consecutively, including setting up of common preconditions, and providing input and evaluating the actual results for each included test case.

TG_10, Test Environment: Facilities, hardware, software, firmware, procedures, and documentation intended for or used for performing testing of software.

TG_11, Test Environment Requirements: Description of the necessary properties of the test environment. Notes: All or parts of the test environment requirements can reference where the information can be found, e.g. in the appropriate Organizational Test Strategy, Test Plan, and/or Test Specification.

TG_12, Test Set: The collection of test cases for the purpose of testing a specific test objective. Notes: The test sets will typically reflect the feature sets, but they could contain test cases for a number of feature sets.

TG_13, Test Case: The set of test case preconditions, inputs (including actions, where applicable), and expected results, developed to drive the execution of a test item to meet test objectives, including correct implementation, error identification, checking quality, and other valued information.

TG_14, Test Measures: The collated measures taken in testing which are related to the end of the reporting period. Notes: Measures on test cases, defects, incidents, test coverage (TG_8), activity progress and resource consumption.

Table 22: Table presenting the terminology used for the test process recommendations. The ID column includes identifiers representing each term, the terms are recorded in the second column, the Definition column includes the meaning of each term within the context, and the Notes column shows notes or examples on each term. These descriptions are derived from ISO/IEC/IEEE 29119:2013 [14][15][16].
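To show how the terms in Table 22 fit together, the sketch below models a test case (TG_13), a test set (TG_12), and a test procedure (TG_9) as simple Python data structures. The field names and the example values are illustrative assumptions based on the definitions above, not structures prescribed by ISO/IEC/IEEE 29119 or used by the company.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCase:            # TG_13
        case_id: str           # unique identifier (flagged as missing in the assessment)
        priority: int          # risk-based priority (also flagged as missing)
        preconditions: List[str]
        inputs: List[str]
        expected_results: List[str]
        traces_to: List[str] = field(default_factory=list)  # test conditions / coverage items (TG_7, TG_8)

    @dataclass
    class TestSet:             # TG_12: test cases grouped for one test objective
        objective: str
        cases: List[TestCase]

    @dataclass
    class TestProcedure:       # TG_9: test cases in execution order plus set-up/tear-down actions
        setup_actions: List[str]
        ordered_cases: List[TestCase]
        teardown_actions: List[str]

    # Hypothetical example: one minimal test case placed in a set and a procedure.
    tc = TestCase("TC-001", priority=1, preconditions=["PLC in safe state"],
                  inputs=["activate emergency stop"], expected_results=["outputs de-energised"],
                  traces_to=["TG_7: emergency stop condition"])
    procedure = TestProcedure(["power up test rig"],
                              TestSet("emergency stop", [tc]).cases,
                              ["restore environment"])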

Recommendations (references to standard sub-clauses in parentheses)

Test Plan (13849-1_4.6.3.f, 29119-2_7.2.4.9, 29119-2_7.4.4, 29119-3_6.2, 61508-3_6.2.3)
1. Review all previously identified risks to identify those that relate to and/or can be treated by software testing. (29119-2_7.2.4.3, 29119-2_7.3.4.1)
2. Incorporate the test strategy in the test plan. (29119-2_7.2.4.7)
3. Include the test specifications as stated in the Test Specifications Section. (13849-2_6.1.a.1, 29119-3_7.2)
4. Specify the chronology of the tests in terms of execution. (13849-2_6.1.a.3)
5. Decide on and document the required outcome of the tests to decide on completeness. (13849-2_6.1.a.2)

Test Strategy (29119-2_7.2.4.5, 12207_6.4.9.3)
6. Agree on the staffing and scheduling of the tests. (29119-2_7.2.4.7)
7. Identify general testing requirements (TG_1). (29119-2_7.2.4.1, 29119-2_7.2.4.2)
8. Specify all the procedures for corrective action upon failure of a test case (TG_13). (61508-3_7.4.8)
9. Identify the test data (TG_2). (29119-3_7.5)
10. Identify the test environment requirements and test tool requirements. (29119-2_8.3)
11. Estimate and record the required resources to perform the complete set of actions in the test strategy. (12207_6.4.9.3, 29119-2_7.2.4.7)

Test Status Report (29119-2_7.3.4.1, 29119-3_6.3)
12. Identify the metrics that are used for monitoring and controlling the tests (TG_3). (29119-2_7.3.4.1)
13. Collect and record all the test measures (TG_14). (29119-2_7.3.4)
14. Monitor the progress against the test plan. (29119-2_7.3.4.2, 29119-3_6.2)
15. Identify and record any factors that block the progress of the tests, with respect to the test plan (TG_4). (29119-3_6.4, 29119-2_7.3.4)
16. Identify and record any divergence from the activities in the test plan. (29119-2_7.3.4.3, 29119-3_6.2)
17. Identify any means of treating newly-identified and changed risks while testing. (12207_6.4.9.3)
18. Ensure the release and availability of control directives to ensure the traceability of changes made to the testing, the test plan, test data, test environment and staffing. (29119-2_7.3.5, 29119-2_7.2.4.7, 29119-2_7.3.4.3, 29119-3_7.5)

Test Completion (29119-3_6.4)
19. Compare the test records to the test plan to ensure lack of divergence. (13849-2_6.1.c)
20. Create a finalised test completion report with the details described in the Test Completion section. (29119-3_6.4)
21. Restore the test environment to a predefined state once all testing activities specified in the Test Execution section are finished. (29119-2_7.4.4)
22. The Test Completion Report must be complete and have the following information: (29119-3_6.4)
23. The test procedures (TG_9) and equipment used. (29119-2_8.2.4, 29119-3_7.4)
24. Whether or not specified functional and performance targets were achieved. (61508-3_7.9.2)
25. Test assets which may be of use at a later date, or on other projects, in no uncertain terms. (29119-2_7.4.4)
26. Records of the lessons learned during the project execution (what went well and what did not, during testing and associated activities). (29119-2_7.4.4)
27. Records and identification of any recommended improvements to the testing and other processes, such as the development process. (12207_6.4.9.3, 29119-2_7.4.4)

Test Design Specification (29119-3_7.2)
28. Perform control flow analysis to specify test cases (TG_13). (13849-1_4.6.2)
29. Record the analysis performed and the test cases specified in requirement 28. (13849-1_4.6.2)
30. Specify test cases (TG_13) for black-box testing of functional behaviour. (13849-1_4.6.3.f)
31. Specify test cases (TG_13) for black-box testing of performance criteria (e.g. timing performance). (13849-1_4.6.3.f)
32. Perform boundary value analysis for testing the safety-related application and embedded software in the system, to specify test cases (TG_13); see the sketch after this table. (13849-1_4.6.3.f)
33. Record the analysis performed and the test cases specified in requirement 32. (13849-1_4.6.3.f)
34. Perform limit value analysis to specify test cases (TG_13). Record the analysis and the test cases. (13849-2_9.5)
35. Record the analysis performed and the test cases specified in requirement 34. (13849-2_9.5)
36. Specify test cases (TG_13) for I/O testing to ensure that safety-related signals are correctly used. (13849-1_4.6.3.f)
37. Prioritise the testing of the feature sets (TG_6) using the risk exposure levels documented in the Risk Analysis report. (29119-2_8.2.4)
38. Document the feature sets (TG_6). (29119-2_8.2.4)
39. Record the traceability between the test basis (TG_5) and feature sets (TG_6) and test conditions (TG_7) and test coverage items (TG_8) and test cases (TG_13). (29119-2_8.2.4, 29119-3_7.3)
40. Determine the test conditions (TG_7) for each feature based on the test completion criteria specified in the Test Plan. (29119-2_8.2.4, 29119-2_7.2.4.5)
41. Prioritise the test conditions (TG_7) using the risk exposure levels in the Risk Analysis report. (29119-2_8.2.4)
42. Record the test conditions (TG_7). (29119-2_8.2.4)
43. Prioritise the test coverage items using the risk exposure levels in the Risk Analysis report. (29119-2_8.2.4)
44. Prioritise the test cases (TG_13) using the risk exposure levels in the Risk Analysis report. (29119-2_8.2.4, 29119-3_7.3)
45. Distribute the test cases (TG_13) into one or more test sets (TG_12) based on constraints on their execution, and record the test sets. (29119-2_8.2.4.5, 29119-3_7.3)
46. Order the test cases (TG_13) within a test set according to dependencies described by preconditions and postconditions and other testing requirements, to derive test procedures (TG_9). (29119-2_8.2.4, 29119-3_7.3, 29119-3_7.4)
47. Prioritise the test procedures (TG_9) using the risk exposure levels documented in the Risk Analysis report. (29119-2_8.2.4, 29119-3_7.4)

Test Environment Set-up and Monitoring (29119-2_8.2.4, 29119-3_7.6)
48. Plan the set-up of the test environment (TG_10), including the test environment requirements (TG_11), and the schedules and costs of setting up the test environment. (29119-2_8.2.4, 29119-3_7.6)
49. Set up the test environment (TG_10) as planned.
50. Determine the degree of configuration management to be applied (where appropriate). (29119-2_8.2.4)
51. Implement the test environment. (29119-2_8.2.4)
52. Set up test data to support the testing (where appropriate). (29119-2_8.2.4, 29119-3_7.5)
53. Install and configure the test item on the test environment. (29119-2_8.2.4)
54. Verify that the test environment meets the test environment requirements (TG_11) stated in recommendation 45. (29119-2_7.2.4.5, 29119-2_8.2.4, 29119-3_7.6)

Test Execution (29119-2_8.4.4)
55. Safety-related part(s) under test shall not be modified during the course of the tests. (13849-2_6.4)
56. If a test can permanently change the performance of some components such that it causes the safety-related part to be incapable of meeting the requirements of further tests, a new sample or samples shall be used for subsequent tests. (13849-2_6.4)
57. Detect and control external failure while testing, and record it. (13849-2_9.5)
58. Detect systematic faults (errors, omissions or inconsistencies) while testing. (13849-2_7)
59. Test the code by simulation. (13849-1_4.6.2)
60. Record the Test Execution Log, Actual Results, and Test Result. (29119-2_8.2.4, 29119-3_7.9)
61. Where a test result relates to a previously-raised incident, the test result shall be analysed and the incident details shall be updated. (12207_6.4.9.3)
62. Where a test result indicates that a new issue has been identified, the test result shall be analysed and it shall be determined whether it is an incident that requires reporting, an action item that will be resolved without incident reporting, or an item requiring no further action. (12207_6.4.9.3)
63. During software integration, any modification to the software shall be subject to an impact analysis which shall determine all software modules impacted and the necessary re-verification and re-design activities. (12207_6.4.8.3, 61508-3_7.4.7, 61508-3_7.4.8)
64. Report test incidents. (12207_6.4.9.3)

Table 23: Table containing the recommendations for improving the compliance of the test process of the company. As the recommendations are created based on the combination of different instructions, the references to sub-clauses of standards are given as well.
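Recommendations 32 and 34 call for boundary and limit value analysis to derive test cases. The sketch below, in Python, illustrates the idea for a single numeric parameter: test inputs are taken at, just inside, and just outside each range boundary. The range limits and the accept/reject rule are hypothetical, so this is only a sketch of the technique, not a test design for the company's system.

    def boundary_values(low, high, step=1):
        """Return candidate test inputs at and around the boundaries of [low, high]."""
        return sorted({low - step, low, low + step, high - step, high, high + step})

    # Hypothetical safety-related input: a speed limit parameter accepted between 0 and 500.
    for value in boundary_values(0, 500):
        expected = "accepted" if 0 <= value <= 500 else "rejected"
        print(f"test case: input={value}, expected={expected}")

Each printed line corresponds to one candidate test case (TG_13); recording the analysis and the resulting cases, as asked for in recommendations 33 and 35, amounts to keeping this derivation alongside the test specification.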


G. Appendix G - Refined Recommendations
The recommendations presented in this appendix are refined with attention to the company's needs. They have been further clustered into actions that should be taken immediately and those with lower priorities. The terminology used is presented in Table 22.

High Priority Recommendations

• Actions for creating a Test Plan
  – Review all previously identified risks to identify those that relate to and/or can be treated by software testing [15].
  – Incorporate the test design specifications in the test plan as follows [18][16]:
    * Perform control flow analysis to specify test cases [17].
    * Specify test cases for black-box testing of functional behaviour [17].
    * Specify test cases for black-box testing of performance criteria (e.g. timing performance) [17].
    * Perform boundary value analysis for testing the safety-related application and embedded software in the system, to specify test cases [17].
    * Perform limit value analysis to specify test cases [18].
    * Specify test cases for I/O testing to ensure that safety-related signals are correctly used [17].
    * Record the analyses performed and their resulting test cases [15].
    * Prioritise the test coverage items using the risk exposure levels in the Risk Analysis report [15].
    * Prioritise the test cases using the risk exposure levels in the Risk Analysis report [15][16].

• Actions before Test Execution (Test Environment Set-up and Monitoring)
  – Implement the test environment and record the process [15].
  – Install and configure the test item on the test environment [15].

• Actions during Test Execution
  – Test the code by simulation and record the outcomes [17].
  – Make sure safety-related part(s) under test are not modified during the course of the tests [18].
  – Use a new sample or samples for tests that can permanently change the performance of some components such that the safety-related part becomes incapable of meeting the requirements of further tests [18].
  – Detect and control external failure while testing and record it [18].
  – Detect systematic faults (errors, omissions or inconsistencies) while testing [18].
  – Record the Test Execution Log, Actual Results, and Test Result [15][16] (a minimal logging sketch follows this priority group).
  – Where a test result relates to a previously-raised incident, the test result shall be analysed and the incident details shall be updated [5].
  – Where a test result indicates that a new issue has been identified, the test result shall be analysed and it shall be determined whether it is an incident that requires reporting, an action item that will be resolved without incident reporting, or an item requiring no further action [5].
  – During software integration, any modification to the software shall be subject to an impact analysis which shall determine all software modules impacted and the necessary re-verification and re-design activities [5][13].
  – Report test incidents [5].
  – Ensure the release and availability of control directives to ensure the traceability of changes made to the testing, the test plan, test data, and test environment [15][16].

• Actions upon Test Completion
  – Restore the test environment to a predefined state [15].
  – Create a Test Completion Report which has the following information [16]:
    * The test procedures and equipment used [15][16].
    * Whether or not specified functional and performance targets were achieved [13].
    * Test assets which may be of use at a later date, or on other projects [15].
    * Records of the lessons learned during the project execution (what went well and what did not, during testing and associated activities) [15].
    * Records and identification of any recommended improvements to the testing and other processes, such as the development process [5].
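The execution actions above ask for a recorded Test Execution Log and for routing failed results either to an existing incident or to a new incident report. The sketch below is a minimal illustration of that flow in Python; the entry fields, identifiers, and pass/fail rule are assumptions made for the example rather than the company's procedure or a structure mandated by the standards.

    from datetime import datetime, timezone

    execution_log = []   # Test Execution Log entries
    incidents = {}       # incident id -> list of related results

    def record_result(case_id, actual, expected, incident_id=None):
        """Append an execution log entry and route failures to incident handling."""
        status = "pass" if actual == expected else "fail"
        entry = {"time": datetime.now(timezone.utc).isoformat(), "case": case_id,
                 "actual": actual, "expected": expected, "status": status}
        execution_log.append(entry)
        if status == "fail":
            if incident_id and incident_id in incidents:      # previously raised incident
                incidents[incident_id].append(entry)
            else:                                             # new issue: report a new incident
                incidents[f"INC-{len(incidents) + 1}"] = [entry]
        return entry

    record_result("TC-001", actual="outputs de-energised", expected="outputs de-energised")
    record_result("TC-002", actual="no response", expected="alarm raised")
    print(len(execution_log), "log entries,", len(incidents), "incident(s)")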

Medium Priority Recommendations

• Additional actions for creating a Test Plan
  – Decide on, and document the required outcome of the test activities to decide on completeness [18].
  – Identify general testing requirements [15].
  – Specify all the procedures for corrective action upon failure of a test case [13].
  – Plan the set-up of the test environment, including the test environment requirements, and the schedules and costs of setting up the test environment [15].
  – Additions to the Test Design Specification documentation:
    * Document the feature set(s) [15].
    * Prioritise the testing of the feature sets using the risk exposure levels documented in the Risk Analysis report [15].
    * Prioritise the test conditions using the risk exposure levels in the Risk Analysis report [15].
    * Record the test conditions [15].

• Additional actions for Test Environment Set-up and Monitoring:
  – Verify that the test environment meets the test environment requirements stated in the Test Plan [15][16].

• Additional actions during Test Execution:
  – Monitor the progress against the test plan [15] (a monitoring sketch follows this group).
  – Identify and record any factors that block the progress of the tests, with respect to the test plan [15].
  – Identify any means of treating newly-identified and changed risks while testing [5].
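As a concrete reading of the monitoring actions above, the following Python sketch compares the executed test cases against those in the test plan and reports the progress and any divergence. The planned and executed sets are illustrative placeholders, not data from the company's process.

    planned = {"TC-001", "TC-002", "TC-003", "TC-004"}   # from the Test Plan (illustrative)
    executed = {"TC-001", "TC-003"}                      # from the Test Execution Log

    progress = 100 * len(executed & planned) / len(planned)
    not_yet_run = sorted(planned - executed)
    divergence = sorted(executed - planned)              # executed but never planned

    print(f"progress against plan: {progress:.0f}%")
    print("still to run:", not_yet_run)
    print("divergence from plan:", divergence)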

Low Priority Recommendations

• Further actions for creating a Test Plan
  – Specify the chronology of the tests in terms of execution [18].
  – Further additions to the Test Design Specification documentation:
    * Record the traceability between the test basis and feature set(s) and test conditions and test coverage items and test cases [15][16].
    * Determine the test conditions for each feature based on the test completion criteria specified in the Test Plan [15].
    * Distribute the test cases into one or more test sets based on constraints on their execution and record the test sets [15][16].
    * Order the test cases within a test set according to dependencies described by preconditions and postconditions and other testing requirements to derive test procedures [15][16] (see the sketch at the end of this appendix).
    * Prioritise the test procedures using the risk exposure levels documented in the Risk Analysis report [15][16].
  – Incorporate the Test Strategy into the Test Plan with the following additional information:
    * Agree on the staffing and scheduling of the tests [15].
    * Identify the test data [16].
    * Identify the test environment requirements and test tool requirements [15].
    * Estimate and record the required resources to perform the complete set of actions in the test strategy [5][15].

• Further actions for Test Environment Set-up and Monitoring:
  – Determine the degree of configuration management to be applied (where appropriate) [15].

• Further actions during Test Execution:
  – Identify the metrics that are used for monitoring and controlling the tests [15].
  – Collect and record all the test measures [15].
  – Identify and record any divergence from the activities in the test plan [15][16].

• Further actions upon Test Completion:
  – Compare the test records to the test plan to ensure lack of divergence [18].
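The low-priority action on ordering test cases by their precondition/postcondition dependencies can be treated as a topological sort of a dependency graph. The sketch below uses hypothetical case identifiers and Python's standard graphlib module to derive one valid execution order; it illustrates the ordering step only and is not a prescribed implementation.

    from graphlib import TopologicalSorter

    # Hypothetical dependencies: each test case maps to the cases whose postconditions it relies on.
    depends_on = {
        "TC-003": {"TC-001"},           # TC-003 needs the state established by TC-001
        "TC-004": {"TC-001", "TC-002"},
        "TC-001": set(),
        "TC-002": set(),
    }

    # A test procedure is the resulting execution order of the test set.
    procedure = list(TopologicalSorter(depends_on).static_order())
    print("derived test procedure:", procedure)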
