On Agile Performance Requirements Specification and Testing
Chih-Wei Ho (1), Michael J. Johnson (2), Laurie Williams (1), and E. Michael Maximilien (2)
(1) Department of Computer Science, North Carolina State University, {cho, lawilli3}@ncsu.edu
(2) IBM Corporation, {mjj, maxim}@us.ibm.com

Abstract

Underspecified performance requirements can cause performance issues in a software system. However, a complete, upfront analysis of a software system is difficult, and usually not desirable. We propose an evolutionary model for performance requirements specification and corresponding validation testing. The principles of the model can be integrated into agile development methods. Using this approach, the performance requirements and test cases can be specified incrementally, without big upfront analysis. We also provide a post hoc examination of a development effort at IBM that had a high focus on performance requirements. The examination indicates that our evolutionary model can be used to specify performance requirements such that the level of detail is commensurate with the nature of the project. Additionally, the IBM experience indicates that test-driven-development-type validation testing corresponding to the model can be used to determine whether performance objectives have been met.

1. Introduction

Agile development methods are sometimes criticized for not having explicit practices for eliciting non-functional requirements (NFRs) [13]. In this paper, we discuss an agile approach to address the specification and testing of an important NFR: performance. Performance is highly visible to the software user. A system that runs too slowly is likely to be rejected by all users. Failure to achieve some expected performance level might make the system unusable, and the project might fail or be cancelled if the system performance objective is not met [18]. Research shows that performance issues should be dealt with early; otherwise, performance problems are more difficult and expensive to fix in later stages of the development process [6, 7, 17]. Performance engineering activities start with performance requirements (PRs) definition [10]. However, some of the performance factors are not predictable (e.g., an application may be run on an inferior processor with slower speed than initially anticipated) or not available at the early stages of development (e.g., the number of hits of a web server). Conducting a complete upfront analysis of performance issues is usually undesirable, especially for agile practitioners.

Therefore, we propose the software Performance Requirements Evolution Model (PREM) as a guideline for PRs specification and validation. Using the model, development can start with quick, simple PRs and related performance test cases. During development, more precise and realistic performance characteristics are obtained through customer communication or performance model solving. These performance characteristics can be added to the PRs, and the test cases can be made more detailed, until the PRs are "good enough" for the project and its functionalities. In this paper, we provide an overview of PREM and a post hoc examination of a development effort at IBM that had a high focus on performance requirements.

While software performance concerns include time (e.g., throughput, response time) and space (e.g., memory usage, storage usage), we limit our discussion to time-related performance issues. However, we believe that our model is also applicable to space-related performance issues.

The rest of the paper is organized as follows: Section 2 provides the background and related work for PRs and testing tools; Section 3 gives a description of PREM; Section 4 presents an IBM experience with PRs specification and testing; and Section 5 concludes this paper with a summary and future work.

IBM is a trademark of International Business Machines Corporation in the United States, other countries, or both. Other company, product, and service names may be trademarks or service marks of others.

2. Background

In this section, we present some background information and related work on agile-style software performance and performance testing tools.

2.1. Software performance and Agile Methods

At a glance, agile methods do not have explicit practices for eliciting overarching non-functional system properties. However, Auer and Beck list a family of software efficiency patterns called Lazy Optimization [1], which reflects both the famous quote from Knuth's 1974 Turing Award lecture, "Computer Programming as an Art," that "premature optimization is the root of all evil in programming," and the "You Aren't Gonna Need It" (YAGNI) philosophy. These patterns can be summarized as follows. Early in development, the system performance is estimated with a short performance assessment. Rough performance criteria are specified to show performance concerns in the system, and are evolved as the system matures. Performance is tuned only when the functionality works but does not pass the performance criteria.

Another camp claims that performance is mostly determined during the architecture stages [8]. Smith and Williams criticize the "fix-it-later" attitude as one of the causes of performance failures [17]. Their position is that performance models built during the architecture and early phases can predict the system performance. Fixing software problems later is difficult and expensive; therefore, performance decisions should be made as early as the architecture phase.

We believe the disagreement is rooted in different philosophies of software development. For agile practitioners, software architecture is an instance of the big design up front (BDUF) that is avoided. Lazy Optimization aims for agility, but may not be sufficient for high-criticality software projects, for which predictability is more desirable.

2.2. Performance testing tools

JUnit (http://www.junit.org) is the most popular unit testing framework in the agile community. Beginning with Version 4, which was released in 2006, JUnit provides a "timeout" parameter to support performance-related testing. A test method with the timeout parameter passes only if the test finishes within the specified amount of time. Figure 1 shows an example of a test method with the timeout parameter. JUnitPerf (http://www.clarkware.com/software/JUnitPerf.html) is a library of JUnit decorators that perform both timed and load tests. Figure 2 shows an example of JUnitPerf performance testing code.

//JUnit 4 timed test: only passes if it
//finishes in 200 ms.
@Test(timeout=200)
public void perfTest() {
    //test scenario
    ...
}
Figure 1: JUnit timeout parameter

//JUnitPerf load test: run 10 instances
//of test, with 100 ms intervals between
//them – max elapsed time is 1000 ms.
Timer timer = new ConstantTimer(100);
Test test = new MyTest("Perf Test");
Test loadTest = new LoadTest(test, 10, timer);
Test timeLoadTest =
    new TimedTest(loadTest, 1000);
Figure 2: JUnitPerf example

These JUnit-based test frameworks provide an easy, programmatic way to write performance test cases. However, complex test cases with complicated workloads may require accurate probability distributions [19], and JUnit-based test frameworks may be insufficient in such situations. Additionally, the resulting test code would be difficult to understand. To design more complicated performance test cases, one should consider more advanced performance testing tools, for example, script-based tools (e.g., The Grinder, http://grinder.sourceforge.net/ [21]), user action recording tools (e.g., Apache JMeter, http://jakarta.apache.org/jmeter/), or other commercial, high-end tools.
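To illustrate why such workloads quickly outgrow the JUnit-based approach, the sketch below (ours, not part of the original paper) shows a custom JUnitPerf timer that draws exponentially distributed delays, approximating Poisson arrivals with a given mean inter-arrival time. It assumes JUnitPerf's Timer interface, which declares a single getDelay() method returning a delay in milliseconds; ExponentialTimer is a hypothetical name.

import com.clarkware.junitperf.Timer;
import java.util.Random;

//Sketch: delays drawn from an exponential distribution, so that test
//instances start according to an approximate Poisson arrival process.
public class ExponentialTimer implements Timer {

    private final Random random = new Random();
    private final double meanDelayMs;  //mean inter-arrival time in ms

    public ExponentialTimer(double meanDelayMs) {
        this.meanDelayMs = meanDelayMs;
    }

    //Inverse-transform sampling: delay = -mean * ln(U), with U uniform in (0, 1].
    public long getDelay() {
        double u = 1.0 - random.nextDouble();  //nextDouble() is in [0, 1); avoid ln(0)
        return (long) (-meanDelayMs * Math.log(u));
    }
}

Such a timer could stand in for the ConstantTimer of Figure 2 (e.g., new LoadTest(test, 10, new ExponentialTimer(100))), but as the workload model grows richer, the test code becomes correspondingly harder to read, which is exactly the situation in which the more advanced tools above are a better fit.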
Agile approaches rely heavily on testing for software quality assurance [9]. Although, as stated in the previous section, Auer and Beck argue that performance concerns should be dealt with after the functionality is complete [1], we posit that a development team can benefit from early performance specification and testing.

3. Performance requirements evolution model

In this section, we provide an overview of PREM. PREM is an evolutionary model for PRs specification. PREM provides guidelines on the level of detail needed in a PR, helping development teams specify the necessary performance characteristics and the form of validation for the PR. Before explaining the model, we provide a short example to show how a performance objective can fail because of underspecified PRs.

A PR might be as simple as "The authentication process shall be complete in 0.2 seconds." An intuitive way to write a test case for this requirement is to run the authentication process several times and compute the average.
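A minimal sketch of such an average-based test appears below (ours, not part of the original paper); AuthenticationService and its authenticate() method are hypothetical placeholders for the system under test.

import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class AuthenticationPerfTest {

    private static final int RUNS = 20;

    //Run the scenario several times sequentially and assert on the average.
    @Test
    public void averageAuthenticationTimeIsBelow200Ms() {
        AuthenticationService service = new AuthenticationService();  //hypothetical service
        long totalMs = 0;
        for (int i = 0; i < RUNS; i++) {
            long start = System.currentTimeMillis();
            service.authenticate("user", "secret");  //one request at a time, no concurrency
            totalMs += System.currentTimeMillis() - start;
        }
        assertTrue("average exceeded 200 ms", totalMs / RUNS <= 200);
    }
}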
However, even if the average is below 0.2 seconds, when the system is live and concurrent authentication requests come in, the users may experience more than 0.2 seconds of waiting time. In this example, the performance objective is not achieved because the workload characteristic is not specified in the requirement and not accounted for in the test case.

3.1.1. Level 0. Level 0 PRs are usually specified qualitatively. As a result, they can usually be validated with qualitative evaluation. In the authentication process example, the development team might provide a prototype to the customer and see whether the user is satisfied with the response time.

3.1.2. Level 1. Successful agile testing relies heavily on test automation [9]; human testing is undesirable and should be limited whenever possible. To make the requirement testable with automation, the first step is to provide quantitative expectations in the specification.