Test Automation in Web Environment
FACULDADE DE ENGENHARIA DA UNIVERSIDADE DO PORTO

Test Automation in Web Environment

Jorge Miguel Guerra Santos

For Jury Evaluation
Mestrado Integrado em Engenharia Informática e Computação

Supervisor: Prof.ª Ana Paiva
Proponent: Eng.º Joel Campos

June 27, 2016

Test Automation in Web Environment

Jorge Miguel Guerra Santos

Mestrado Integrado em Engenharia Informática e Computação

Approved in oral examination by the committee:
Chair:
External Examiner:
Supervisor:

June 27, 2016

Abstract

In today's fast-moving world, it is a challenge for any company to continuously maintain and improve the quality and efficiency of software systems development. In many software projects, testing is neglected because of time or cost constraints. This leads to a lack of product quality, followed by customer dissatisfaction and, ultimately, increased overall quality costs. Additionally, as software projects become increasingly complex, the number of hours spent on testing grows as well; without the support of suitable tools, test efficiency and validity tend to decline.

Some software testing tasks, such as extensive low-level interface regression testing, can be laborious and time-consuming to do manually. In addition, a manual approach might not always be effective in finding certain classes of defects. Test automation offers a possibility to perform these types of testing effectively. Once automated tests have been developed, they can be run quickly and repeatedly. However, test automation systems usually lack reporting, analysis and meaningful information about project status.

The end goal of this research work is to create a prototype that can build and organize test batteries by recording user interaction, replay the recorded actions automatically, detect failures during test execution and generate reports, while also setting up the test environment, all in an automatic fashion, and to develop techniques for creating more maintainable test cases. This tool can bring a technical advantage in automated web testing by creating new, more maintainable recorded test cases with minimal user action and by allowing testers to better evaluate the status of the software project.
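As a rough illustration of what the abstract describes, the sketch below shows the kind of test that results from recording user actions and replaying them automatically, written with Selenium WebDriver and NUnit in C#, the technologies discussed later in this document. It is a minimal, hypothetical example (the URL and element identifiers are invented), not code taken from the prototype.

```csharp
// Minimal sketch of a recorded-and-replayed web test, assuming Selenium
// WebDriver and NUnit in C#. The URL and element IDs are hypothetical;
// this illustrates the idea, it is not code from the prototype.
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;

[TestFixture]
public class RecordedLoginTest
{
    private IWebDriver driver;

    [SetUp]
    public void StartBrowser()
    {
        // Test environment setup: open a fresh browser session for each test.
        driver = new FirefoxDriver();
    }

    [Test]
    public void ReplaysRecordedUserActions()
    {
        // Replay the recorded steps: navigate, type, click.
        driver.Navigate().GoToUrl("http://example.com/login");
        driver.FindElement(By.Id("username")).SendKeys("tester");
        driver.FindElement(By.Id("password")).SendKeys("secret");
        driver.FindElement(By.Id("login-button")).Click();

        // Failure detection: the test fails (and is reported) if the
        // expected element is not shown after the replayed actions.
        Assert.IsTrue(driver.FindElement(By.Id("welcome-message")).Displayed);
    }

    [TearDown]
    public void CloseBrowser()
    {
        driver.Quit();
    }
}
```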
Acknowledgements

First of all, I would like to thank FEUP for serving as a place for personal and interpersonal growth, as well as for providing all the conditions for me to focus on my personal goals and to meet incredible people.

I thank Prof.ª Ana Cristina Ramada Paiva for guiding me throughout this whole research work, making sure that I stayed focused on the main goal, and for always managing her time in order to give me feedback about the work being developed and to clarify any doubts that I had.

Additionally, I would like to thank Glintt HS and its crew for allowing me to fit right in and for providing all the conditions I needed during this research work. I hope the work done throughout my stay will be of value to the company for a long time. More specifically, I thank Joel for always being available for any questions that I had, even on the busiest days, and for letting me have the freedom to experiment with new ideas for the research work while giving me his feedback.

A special shout-out to the Andrés group for being such a tight-knit group of friends. You keep my sanity in check (most of the time) and I thank you for all the moments during our stay at FEUP; I hope this is just the beginning.

A special thanks to Simão for helping me throughout these academic years. I am eternally grateful for the fact that you had the initiative to work with me and for influencing me in a way that allowed me to adapt to the new academic life and form a work ethic that has, without a doubt, led me this far.

Another individual thanks goes to JP. Thanks for being such an upbeat guy and for, on a random day, asking me to study together at FEUP at night. This turned into a tradition that helped me manage my time so I could do a little bit of everything.

I thank my parents for always supporting me through thick and thin, while also allowing me the freedom to explore and focus on my interests and always giving me advice. I am thankful for having had the chance to be born to parents as incredible as you, and sorry you had to deal with me and my brother. Speaking of which, thank you, my brother. You were always at my side throughout my life and I hope it stays that way no matter where our paths go.

Thank you,
Jorge Miguel Guerra Santos

"Only those who have patience to do simple things perfectly ever acquire the skill to do difficult things easily."
James J. Corbett

Contents

1 Introduction
  1.1 Context and Background
  1.2 Motivation
  1.3 Goals
  1.4 Structure
2 Automated Web Application Testing
  2.1 Software Testing
  2.2 Web Application Testing
  2.3 Test Automation
    2.3.1 Automated Web Application Testing Techniques
  2.4 Software Testing Metrics
    2.4.1 Automated Software Testing Metrics
  2.5 Tools Survey
    2.5.1 Automated Software Testing Tools
    2.5.2 Test Logging and Reporting Tools
  2.6 Conclusions
3 Methodology
  3.1 Requirements
  3.2 Technologies Comparison
    3.2.1 Automated Web Application Testing Tools
    3.2.2 Continuous Integration Tools
    3.2.3 Additional Technologies
  3.3 Test Cases
    3.3.1 Locators
    3.3.2 Recurrent Steps
    3.3.3 Implicit Waits
  3.4 Test Environment
    3.4.1 Selection of Test Cases
    3.4.2 Automation
    3.4.3 Test Reports
  3.5 Conclusions
4 Implementation
  4.1 Method
    4.1.1 Test Case Structure
    4.1.2 Test Case Life Cycle
    4.1.3 Test Results File Structure
    4.1.4 Test Selector File Structure
    4.1.5 Test Environment Setup
  4.2 Architecture
    4.2.1 Selenium IDE
    4.2.2 Jenkins
  4.3 Prototype
    4.3.1 Selenium IDE
    4.3.2 Jenkins
  4.4 Discussion
    4.4.1 Method
    4.4.2 Results
5 Conclusions and Future Work
  5.1 Final Remarks
  5.2 Future Work
References
A Configuration
  A.1 Selenium IDE Configuration
  A.2 Jenkins Configuration
B Procedure
  B.1 Translated Selenese Commands in NUnit C#
  B.2 Gsearcher's Locator Strategy

List of Figures

2.1 Mike Cohn's test automation pyramid
3.1 Integration between technologies
3.2 RecurrentSteps case used by multiple test cases
3.3 Jenkins Build Process
4.1 Jenkins Workspace
4.2 Overview of the developed system
4.3 Implementation model of the Test Suite Chooser plugin
4.4 Implementation model of the Command Builders
4.5 Implementation model of the Selenium IDE's user extension
4.6 Implementation model of the saveTestCaseToProperties executable
4.7 Implementation model of the C# NUnit Formatters
4.8 Use Case diagram of Selenium IDE
4.9 Implementation model of the test selector Jenkins plugin
4.10 Implementation model of the transferTestsToBuild executable
4.11 Implementation model of the parseTestsProperties executable
4.12 Implementation model of the extractTestResults executable
4.13 Use Case diagram of Jenkins
4.14 Context menu with Selenium IDE commands in Selenese
4.15 Selenium IDE interface with a select command
4.16 Test suite tree view
4.17 Locator's format in Selenium IDE
4.18 Jenkins main interface
4.19 Jenkins test selector interface with a screenshot
4.20 Jenkins test selector interface with a failed screenshot
4.21 Jenkins test result interface
4.22 Cause of the failed test presented in the test result interface
A.1 Selenium IDE's general settings
A.2 Selenium IDE's format settings
A.3 Jenkins' Tests folder
A.4 Jenkins' Project Pre-Build Configuration
A.5 Jenkins' Project Build Configuration
A.6 Jenkins' Project Post-Build Configuration
A.7 Jenkins' Project Node Configuration

List of Tables

3.1 Web Automated Testing Tools Comparison
3.2 Continuous Integration Tools Comparison
4.1 Rules to handle user actions in Selenese commands and in NUnit C#

Abbreviations

UI    User Interface
GUI   Graphical User Interface
IDE   Integrated Development Environment
IT    Information Technology
CI    Continuous Integration
C&R   Capture & Replay
API   Application Programming Interface
URL   Uniform Resource Locator
CSS   Cascading Style Sheets