

Test Accelerator for Service Oriented Architectures (SOA-Test Accelerator)

João Filipe Garrett Paixão Florêncio

Thesis to obtain the Master of Science Degree in Information Systems and Computer Engineering

Supervisor: Doctor José Carlos Campos Costa

Examination Committee
Chairperson: Doctor Paolo Romano
Supervisor: Doctor José Carlos Campos Costa
Member of Committee: Doctor António Manuel Ferreira Rito da Silva

October 2015

Abstract

This document describes the design and development of a tool, "SOA-Test Accelerator" (SOA-TA), created to automate intelligent test creation in service-oriented architectures. The need for such a tool comes from the growing gap between testing independent services in isolation and testing their overall interaction. As the system architecture grows in number of services, manually creating test case scenarios becomes a heavy burden. SOA-TA's ultimate goal is to reduce the time spent combining and orchestrating service calls to simulate a business process. The work presented here is organized in five stages, whose outcome depends on the exhaustiveness level chosen by the user. First, the automatic generation of test cases through the analysis of process descriptions, taking business requirements into consideration. Second, the generation of the input set required to execute these test cases. Third, the production of specific service calls, by means of test scripts to be run on Apache JMeter. Fourth, the execution of these scripts; and fifth, the presentation of the results. SOA-TA will be useful for operations that rely on consecutive service calls and need to ensure overall system compliance with previously set requirements.

Keywords: SOA testing, test case generation, test case execution, web services, service testing, coverage
Resumo

This document describes the design and development of the tool "SOA-Test Accelerator", created to automate the intelligent creation of tests for service-oriented architectures. The need for this tool stems from the growing gap between testing a set of independent services and testing their interactions. As the system grows in number of services, the time spent creating test scenarios grows proportionally, a factor we attempt to mitigate here. SOA-TA's main goal is therefore to reduce the time spent combining and orchestrating the service calls that simulate a business process. Its operation is based on five stages, and its outcome depends on the exhaustiveness level chosen by the user. Namely: first, the automatic generation of test cases from a description of the processes, taking business requirements into consideration; second, the creation of the list of input data that feeds these test cases; third, the generation of the service calls, through Apache JMeter scripts; fourth, the execution of those scripts; and finally, the presentation of the results. SOA-TA will be useful in operations that depend on consecutive service calls and that need to guarantee that previously established requirements are respected.

Keywords: testing, test generation, test execution, SOA, coverage

Acknowledgements

I would like to thank INESC-I&D for giving me the opportunity to learn with the best; Wintrust for the vision and for believing in our work; and my thesis supervisor, Professor José Costa, for the continuous support of my study and related research, and for his patience, motivation, and immense knowledge. Last but not least, I would like to thank my family and friends, not only for their willingness to help but especially for their support. They made it all worth it.
Contents

List of Tables
List of Figures
Acronyms List

1 Introduction
  1.1 Motivation
  1.2 Objectives
  1.3 Thesis Outline
  1.4 Summary
2 Related Work
  2.1 Academic
    2.1.1 Test-Case Generation
    2.1.2 Testing SOA
  2.2 Commercial Testing Frameworks
    2.2.1 HP Unified Functional Testing (UFT)
    2.2.2 Oracle Testing Accelerators for Web Services
    2.2.3 Parasoft SOAtest
    2.2.4 LISA for SOA Testing
    2.2.5 Apache JMeter
  2.3 Related Work Critical Analysis
  2.4 Summary
3 Background
  3.1 Coverage Metrics
    3.1.1 The Model
    3.1.2 Diagram
    3.1.3 Switch Coverage
  3.2 Summary
4 Solution Description
  4.1 General Description
  4.2 Modelling the Process
    4.2.1 Tibco Logs Exception Case
  4.3 Graph Generation
  4.4 Test Path Generation
  4.5 Input Data
  4.6 Test Script Generation
  4.7 Test Script Execution
  4.8 Reading Results
  4.9 Summary
5 Implementation
  5.1 User Interface
  5.2 Business Logic
  5.3 Database and Data Access
  5.4 Summary
6 Evaluation Methods
7 Results
  7.1 Path Generation Results
  7.2 End-To-End
8 Conclusions and Future Work
Bibliography

List of Tables

3.1 Chow-0 Test Paths
3.2 Chow-1 Test Paths
7.1 Generated Test Paths - Example graph 6
7.2 TAP Staff Booking Generated Paths - Chow-0
7.3 Process Example 2 - Generated Paths
7.4 Process Example 4 - Generated Paths
7.5 Example 5 - Chow-0 Test Paths
7.6 Example 5 - Chow-1 Test Paths
7.7 GetStockValue - Unit test results (16 samples)
7.8 ConvertUsdToEur - Unit test results (Input from table 7.7)
8.1 Attachment A - TAP Staff Booking Chow-1 Path

List of Figures

2.1 HP UFT Screenshot
2.2 Oracle Testing Accelerator Screenshot
3.1 Example graph - 1
3.2 Example graph - 2
3.3 Example graph - 3
4.1 General System Description
4.2 Process Example 1 - Bizagi Modeler
4.3 Bizagi Modeler BPMN Export
4.4 Bizagi Process Example 1's Graph; Left: before modification; Right: after correction
4.5 Example graph - 4 Before
4.6 Example graph - 4 After
4.7 Example graph - 5
4.8 JMeter Web Service Sampler
5.1 SOA-TA Communications Diagram
5.2 SOA-TA Screenshots
5.3 Business Logic structure
5.4 Database schema
7.1 Example Graph 6
7.2 Example graph 7 - TAP Staff Booking
7.3 Process Example 2
7.4 Process Example 3
7.5 Process Example 4
7.6 Process Example 5

Acronyms List

BPD Business Process Diagram
BPEL Business Process Execution Language
BSF Bean Scripting Framework
CA Computer Associates International Inc.
DG Directed Graph
EA Enterprise Architect (software)
GUI Graphical User Interface
HTTP HyperText Transfer Protocol
ISTQB International Software Testing Qualifications Board
OASIS Organization for the Advancement of Structured Information Standards
REST Representational State Transfer
SCR Software Cost Reduction
SMT Satisfiability Modulo Theories
SOA Service-Oriented Architecture
SOAP Simple Object Access Protocol
TAP Transportes Aéreos Portugueses
UDDI Universal Description, Discovery and Integration
V&V Verification and Validation
WADL Web Application Description Language
WSDL Web Services Description Language
WSIL Web Services Inspection Language
XML Extensible Markup Language

1 Introduction

"We cannot solve our problems with the same thinking we used when we created them."
- Albert Einstein

Nobody is perfect. Humans make mistakes all the time. While some of those mistakes are negligible, others are expensive or dangerous. We are used to assuming that our work may contain mistakes, even minor ones, and that it needs testing. Software testing in particular is now recognized as a key activity of large systems development. Companies have come to understand that, in order to achieve a higher-quality product, efforts in Verification and Validation (V&V) [1] save time and money. Service-Oriented Architecture