
Masaryk University
Faculty of Informatics

Performance Testing Automation of Apache Qpid Messaging Libraries

Bachelor's Thesis

Jiří Daněk

Brno, Fall 2020

This is where a copy of the official signed thesis assignment and a copy of the Statement of an Author is located in the printed version of the document.

Declaration

Hereby I declare that this paper is my original authorial work, which I have worked out on my own. All sources, references, and literature used or excerpted during the elaboration of this work are properly cited and listed in complete reference to the due source.

Jiří Daněk

Advisor: Mgr. Martin Večeřa

Acknowledgements

Software performance testing has proved to be a deep and intriguing area in the field of software quality engineering to me. I would therefore like to thank my fellow quality engineers on the Messaging QE team at Red Hat Czech, s. r. o. for providing a friendly work environment in which I could pursue this topic. Special thanks belong to my thesis consultant, Ing. Zdeněk Kraus, who works as the team's manager.

I am fortunate that there has been considerable prior work in the development of performance measurement tooling applicable to the software I am focusing on. When making the final step of integrating the tools into a performance regression test suite, I was able to refer to tools built by Justin R. Ross and Otavio R. Piske, both fellow Red Hat employees. Some of the bibliographical references I have used were recommended by yet another Red Hat employee, Francesco Nigro, in e-mail discussions.

Finally, I must not forget to thank my parents for their support, and for helping to keep me motivated during my university studies.

Abstract

Event-driven architecture has proven to be an effective way of designing software systems composed of loosely coupled software components.
These designs are often implemented using message-oriented middleware, which puts messaging into a fundamental role and places great demands on the reliability and performance of the messaging solution being used. This thesis focuses on messaging libraries developed under the Apache Qpid Proton project, and proposes a method for an automated measurement of their performance in peer-to-peer mode using an open-source tool called Quiver.

The proposed performance measurement method has been implemented in the form of a Jenkins pipeline and is suitable for inclusion in a continuous delivery pipeline of a corporate software development process, serving as a performance regression test.

Keywords

software testing, performance testing, continuous integration, network technologies, Qpid Proton, AMQP 1.0

Contents

Introduction
  0.1 Note on Terminology
1 Software Testing
  1.1 Software engineering
  1.2 Software Quality Engineering
  1.3 Testing process
    1.3.1 Testing terminology
    1.3.2 Types of tests
    1.3.3 Purposes of testing
  1.4 When to test
    1.4.1 Continuous integration
    1.4.2 Testing is Alerting
  1.5 When to stop testing
2 Performance testing
  2.1 Importance of performance and performance testing
  2.2 Key Performance Indicators (KPIs)
  2.3 When to test performance
  2.4 Automation in performance testing
  2.5 Types of performance tests
  2.6 Performance testing approaches
    2.6.1 Microbenchmarks
    2.6.2 System-level performance testing
    2.6.3 Performance monitoring in production
  2.7 Industrial benchmarking
3 Maestro, Quiver, and Additional Tooling
  3.1 Software Under Test: Apache Qpid
  3.2 Performance measurement frameworks
    3.2.1 Maestro
    3.2.2 Quiver
  3.3 Additional tooling
    3.3.1 Ansible
    3.3.2 Jenkins
    3.3.3 Google Benchmark
    3.3.4 Docker
4 Automation Design and Implementation
  4.1 Microbenchmarking
    4.1.1 Implementation
  4.2 Quiver automation
    4.2.1 Design
    4.2.2 Implementation
  4.3 Result reporting
    4.3.1 Statistical analysis
5 Performance regression testing evaluation
  5.1 Test results
  5.2 Result analysis
    5.2.1 Qpid Proton C
    5.2.2 Qpid Proton C++
    5.2.3 Qpid Proton Python
    5.2.4 Qpid JMS
  5.3 Future work
Bibliography

List of Figures

3.1 Maestro HTML report
3.2 Performance Co-Pilot disk throughput chart
5.1 Qpid Proton C P2P Throughput per Release Version
5.2 Qpid Proton C++ P2P Throughput per Release Version
5.3 Qpid Proton Python P2P Throughput per Release Version
5.4 Qpid JMS P2P Throughput per Release Version

Introduction

Messaging libraries are part of middleware, a software layer located between application code and the operating system, which provides the application with additional services beyond what the operating system itself offers. Messaging belongs to the area of Interprocess Communication and Enterprise Integration techniques. It allows creating distributed software with a decentralized, decoupled flow of information, based on the notion of a logical address. The main applications of messaging include the Internet of Things (IoT), Event-Driven Architectures, and application integration using the Enterprise Service Bus (ESB) pattern.

Messaging libraries developed in the Apache Qpid project use a standardized protocol called AMQP 1.0 (Advanced Message Queueing Protocol) to exchange messages. Moreover, the Apache Qpid JMS library also implements a standardized Application Programming Interface (API) called JMS 2.0 (Java Message Service). As a result, the external interfaces of the Apache Qpid libraries are largely fixed, and the focus of the subprojects is to implement them in an efficient and performant manner for their respective supported programming languages.
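Whether an implementation is "efficient and performant" is ultimately a question of measurement. A common way to summarize repeated benchmark runs, and the one used later in this thesis, is the sample mean together with a confidence interval. The following Python sketch illustrates the idea; the throughput values and the function name are hypothetical, invented for the example:

```python
import math
import statistics

def mean_ci(samples, t_crit=2.776):
    """Return (mean, half-width) of a confidence interval for the mean.

    t_crit is the two-sided Student-t critical value; 2.776 is the
    value for 4 degrees of freedom (5 samples) at the 95% level.
    """
    m = statistics.mean(samples)
    s = statistics.stdev(samples)  # sample standard deviation
    half = t_crit * s / math.sqrt(len(samples))
    return m, half

# Five hypothetical throughput measurements in messages per second.
runs = [98_200, 101_500, 99_800, 100_400, 99_100]
m, h = mean_ci(runs)
print(f"throughput: {m:.0f} ± {h:.0f} msg/s")
```

Reporting the half-width alongside the mean makes it possible to judge whether a difference between two library versions is larger than the run-to-run noise.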
Performant messaging is a prerequisite for building demanding, data-intensive applications.

Continuous Integration and Continuous Delivery (CI/CD) are two techniques of modern software development. Changes to the software are automatically integrated into the master repository, and unit tests and integration tests are run for each change. Continuous Delivery then automates the process of creating deliverable artifacts at release time. Continuous Deployment extends this to the automatic deployment of the artifacts to the production environment, making the software development project more agile and efficient. The CI/CD pipeline becomes a point of communication and collaboration for all the diverse roles on the project development team working on the release. Building an automated Continuous Delivery pipeline that includes nonfunctional requirement checks, such as performance, is highly desirable for project health and velocity.

Timely information gained from performance testing can prevent performance regressions, that is, degradation in application performance from one version of the application to the next; aid in capacity planning, making it possible to estimate the amount of hardware needed to support a particular application deployment; inform future optimization efforts; and, last but not least, provide backing for marketing claims used to promote the product.

In my thesis, I have focused on examining two preexisting performance testing solutions for AMQP client libraries and decided on using one of them, called Quiver, initially developed by Justin R. Ross. I have then designed and built a job in the Jenkins continuous integration system which automatically sets up an environment and runs a Quiver performance test against an Apache Qpid client library the user specifies.
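A job of this kind is typically expressed as a Jenkins declarative pipeline. The following sketch illustrates only the overall shape of such a pipeline; the node label, parameter names, and script paths are hypothetical, not the actual pipeline built in this thesis:

```groovy
pipeline {
    // Hypothetical label of the dedicated benchmarking machine.
    agent { label 'perf-bare-metal' }
    parameters {
        choice(name: 'CLIENT',
               choices: ['qpid-proton-c', 'qpid-proton-cpp',
                         'qpid-proton-python', 'qpid-jms'],
               description: 'Library under test')
        string(name: 'RUNS', defaultValue: '5',
               description: 'Number of repeated Quiver runs')
    }
    stages {
        stage('Provision') {
            steps {
                // Hypothetical playbook installing Quiver and the client.
                sh 'ansible-playbook provision.yml'
            }
        }
        stage('Benchmark') {
            steps {
                sh "./run-quiver.sh ${params.CLIENT} ${params.RUNS}"
            }
        }
        stage('Report') {
            steps {
                archiveArtifacts artifacts: 'results/**'
            }
        }
    }
}
```

Parameterizing the pipeline lets one Jenkins job cover all of the client libraries, with the dedicated machine managed as an ordinary Jenkins node.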
The stability and reliability of performance results from this job stem from the use of a dedicated physical machine to perform the test, from running the Quiver test multiple times, and from calculating a mean throughput and the associated confidence interval from the data. The Jenkins job has been implemented in the form of a Jenkins declarative pipeline and is suitable for inclusion in a continuous delivery pipeline of a corporate software development process, serving as a performance regression test for peer-to-peer message exchange using the Apache Qpid libraries.

My work presents an improvement over the previous state of affairs, which required each developer or quality engineer to procure their own hardware and set up the test in an ad-hoc fashion. Previously shared performance measurement data were never accompanied by confidence intervals or other similar means that would allow judging their reliability and the significance of changes in the values over time. Using the job developed in this thesis is not only more straightforward, it also provides more meaningful and repeatable results, and improves utilization of the dedicated hardware running the benchmark, which is now managed as a Jenkins node.

This thesis is structured as follows. Chapter 1 introduces the discipline of software testing. Chapter 2 discusses the specifics of performance testing. Chapter 3 describes available performance measurement tooling with a view towards comparing their capabilities. Chapter 4 presents a design for a performance test and discusses its concrete implementation in Jenkins CI. Chapter 5 concludes the thesis with an evaluation of the implemented solution and proposals for future expansions of the performance test to cover additional messaging patterns and messaging middleware.

Note on Terminology

This thesis distinguishes between machines (physical or virtualized computers) and servers (processes providing network services, running on machines) [1].
Another useful distinction, although less relevant for the present discussion, is that between a client (a process connecting to a network server) and a customer (a physical or corporate entity which is using our software).

1 Software Testing

Beware of bugs in the above code; I have only proved it correct, not tried it.

— Donald E. Knuth