Thesis no: MSSE-2016-14

Performance, Scalability, and Reliability (PSR) challenges, metrics and tools for web testing: A Case Study

Akshay Kumar Magapu
Nikhil Yarlagadda

Faculty of Computing
Blekinge Institute of Technology
SE–371 79 Karlskrona, Sweden

This thesis is submitted to the Faculty of Computing at Blekinge Institute of Technology in partial fulfillment of the requirements for the degree of Master of Science in Software Engineering. The thesis is equivalent to 20 weeks of full-time studies.

Contact Information:

Authors:
Akshay Kumar Magapu, E-mail: [email protected]
Nikhil Yarlagadda, E-mail: [email protected]

External advisor: Saket Rustagi, Project Manager, Ericsson India Global Services Pvt. Ltd., Gurgaon, India

University advisor: Michael Unterkalmsteiner, Department of Software Engineering, Faculty of Computing

Blekinge Institute of Technology, SE–371 79 Karlskrona, Sweden
Internet: www.bth.se
Phone: +46 455 38 50 00
Fax: +46 455 38 50 57

Abstract

Context. Testing of web applications is an important task, as it ensures the functionality and quality of web applications. The quality of a web application is assessed through non-functional testing, which covers quality attributes such as performance, scalability, reliability, usability, accessibility and security. Among these, performance, scalability and reliability (PSR) are the most important and most commonly considered attributes in practice. However, very few empirical studies have been conducted on these three attributes.

Objectives. The purpose of this study is to identify the metrics and tools that are available for testing these three attributes, and to identify the challenges faced while testing them, both in the literature and in practice.

Methods. In this research, a systematic mapping study was conducted in order to collect information regarding the metrics, tools, challenges and mitigations related to the PSR attributes. The required information was gathered by searching five scientific databases.
We also conducted a case study to identify the metrics, tools and challenges of the PSR attributes in practice. The case study was conducted at Ericsson, India, where eight subjects were interviewed. Four subjects working in other companies (in India) were also interviewed in order to validate the results obtained from the case company. In addition, a few documents from previous projects at the case company were collected for data triangulation.

Results. A total of 69 metrics, 54 tools and 18 challenges were identified from the systematic mapping study, and 30 metrics, 18 tools and 13 challenges were identified from the interviews. Data was also collected through documents, from which a total of 16 metrics, 4 tools and 3 challenges were identified. Based on the analysis of this data, we formed a consolidated list of tools, metrics and challenges.

Conclusions. We found that the metrics available in the literature overlap with the metrics used in practice. However, the tools found in the literature overlap only to some extent with practice. The main reason for this deviation is the limitations identified for the existing tools, which led the case company to develop its own in-house tool. We also found that the challenges partially overlap between the state of the art and practice. We were unable to collect mitigations for all of these challenges from the literature, so further research is needed. Among the PSR attributes, most of the literature addresses the performance attribute, and most interviewees were most comfortable answering questions related to performance. Thus, we conclude that there is a lack of empirical research on the scalability and reliability attributes. Our research deals with the PSR attributes in particular, and there is scope for further research in this area.
Future work can extend this research to other quality attributes, and the research can be carried out on a larger scale (considering more companies).

Keywords: Web applications, Web testing, Performance, Scalability, Reliability, Quality.

Acknowledgments

We would like to thank our supervisor Michael Unterkalmsteiner for his tremendous and quick support whenever needed. We also thank Ericsson for providing us the opportunity to conduct the case study, and the interviewees from other organizations for participating in the interviews. Special thanks go to our family and friends for their support in completing this thesis.

The authors

Contents

Abstract
Acknowledgments
1 Introduction
  1.1 Web testing
    1.1.1 Functional testing
    1.1.2 Non-functional testing
  1.2 Problem statement
  1.3 Thesis structure
2 Background and Related Work
  2.1 Web applications
  2.2 Web testing
    2.2.1 Functional testing
    2.2.2 Non-functional testing
  2.3 Selected attributes
  2.4 Research scope
  2.5 Related work
    2.5.1 Literature related to metrics
    2.5.2 Literature related to tools
    2.5.3 Literature related to challenges
    2.5.4 Research gap
3 Method
  3.1 Research purpose
    3.1.1 Objectives
  3.2 Research questions
    3.2.1 Motivation
  3.3 Research method
    3.3.1 Systematic mapping study
    3.3.2 Case study
  3.4 Data analysis
    3.4.1 Familiarizing yourself with the data
    3.4.2 Generating initial codes
    3.4.3 Searching for themes
    3.4.4 Reviewing themes
    3.4.5 Defining and naming themes
    3.4.6 Producing the report
  3.5 Validity threats
    3.5.1 Construct validity
    3.5.2 Internal validity
    3.5.3 External validity
    3.5.4 Reliability
4 Results and Analysis
  4.1 Facet 1: Metrics for testing PSR attributes
    4.1.1 Systematic mapping study
    4.1.2 Interviews and documents
    4.1.3 Criteria for selection of metrics
  4.2 Facet 2: Tools for testing PSR attributes
    4.2.1 Systematic mapping study
    4.2.2 Interviews and documents
    4.2.3 Tool drawbacks and improvements
  4.3 Facet 3: Challenges faced by software testers
    4.3.1 Systematic mapping study
    4.3.2 Interviews and documents
    4.3.3 Do mitigations available in the literature mitigate challenges in practice?
  4.4 Facet 4: Important attribute among PSR
    4.4.1 Interviews
5 Discussion
  5.1 Metrics for testing PSR attributes of web applications
  5.2 Tools for testing PSR attributes of web applications
  5.3 Challenges in PSR testing of web applications
  5.4 Most important attribute among PSR
  5.5 Implications
6 Conclusions and Future Work
  6.1 Research questions and answers
    6.1.1 RQ 1: Metrics used for testing the PSR attributes
    6.1.2 RQ 2: Tools used for testing the PSR attributes
    6.1.3 RQ 3: Challenges identified while testing the PSR attributes
    6.1.4 RQ 4: Important attribute among PSR
  6.2 Conclusion
  6.3 Research contribution
  6.4 Future work
Appendices
A Systematic maps
B SMS overview
C List of metrics
D List of tools
E List of challenges
F Interview questions
  F.1 Technical Questions
    F.1.1 Tools
    F.1.2 Metrics
    F.1.3 Challenges
    F.1.4 General
G MTC and IA identified between case company and other companies
  G.1 Metrics
  G.2 Tools
  G.3 Challenges
  G.4 Important attribute
H Consent form

List of Figures

1.1 Types of testing
1.2 Thesis structure
2.1 Requirements classification
2.2 Types in functional testing
2.3 Types in non-functional testing
2.4 Research scope
3.1 Systematic mapping study process
3.2 Case study process steps
3.3 Pyramid model for interview questions
3.4 Steps for thematic analysis
3.5 Themes formed in Nvivo tool for interviews
4.1 Number of sources addressing the research attributes
4.2 Thematic map for metrics from SMS
4.3 Thematic map for metrics from interviews
4.4 Thematic map for metrics from documents
4.5 Thematic map for tools from SMS
4.6 Thematic map for tools from interviews
4.7 Types of tools obtained from interviews
4.8 Thematic map for tools from documents
4.9 Thematic map for challenges from SMS
4.10 Number of articles addressing each theme from SMS
4.11 Thematic map for challenges from interviews
4.12 Number of interviewees addressing the themes
4.13 Thematic map for challenges from documents
4.14 Thematic map for important attribute from interviews
5.1 Overlap and differences in metrics among all data sources
5.2 Overlap and differences in tools among all data sources
5.3 Overlap and differences in challenges among all data sources
5.4 Overlap and differences in challenge areas among all data sources
5.5 Overlap and differences in metrics between state of art and state of practice
5.6 Overlap and differences in tools between state of art and state of practice
5.7 Overlap and differences in challenges between state of art and state of practice
A.1 Research parameters vs research attributes in SMS
A.2 Research methods vs research attributes in SMS
A.3 Research methods vs research parameters in SMS