Chapter 9 Quality and Change Management
MACIASZEK, L.A. (2007): Requirements Analysis and System Design, 3rd ed. Addison Wesley, Harlow, England. ISBN 978-0-321-44036-5
© Pearson Education Limited 2007
Topics
• Quality assurance
• Quality control
• Change management
• Change requests
• Traceability
1. Quality management
Quality management is part of overall software process management, alongside such other activities as people, risk and change management. Quality management divides into quality assurance and quality control.
Main concepts
Testing is not just the debugging of programs – it is part of quality management:
• Quality assurance is about proactive ways of building quality into a software system
• Quality control is about (mostly reactive) ways of testing the quality of a software system
Test-driven development builds quality into a software system by an outright demand that test code be written before the application code and that the application must pass the test to be quality assured
Change management is a fundamental aspect of the overall project management
Traceability underlies testing and change management
Quality management
Part of an overall software process management, along with:
• people management,
• risk management, and
• change management.
To be performed in parallel with and in addition to project management (scheduling, budget estimation, tracking project progress):
• should have its own budget and schedule
• one of its tasks should be to ensure quality in project management per se
• the actions and outcomes of quality and change management may involve changes to the project schedule and budget baselines, also known as the performance measurement baselines
Software qualities
correctness, reliability, robustness, performance, usability, understandability, maintainability (repairability), scalability (evolvability), reusability, portability, interoperability, productivity, timeliness, visibility
Quality assurance
Software Quality Assurance (SQA) team
Techniques:
• Checklists
• Reviews
– Walkthroughs
– Inspections
• Audits
Checklists
A predefined list of to-do points that need to be scrupulously checked off in the development process.
Processes differ from project to project → checklists cannot be fixed for ever
• Also, the “baseline” checklists need to be modified to account for new IT technologies and changes in IT development paradigms.
Walkthroughs
A type of formal brainstorming review that can be conducted in any development phase.
A friendly meeting of developers, carefully planned and with clear objectives, an agenda, duration and membership.
A few days prior to the meeting, the participants are handed the materials to be reviewed.
During the meeting the problems need to be pinpointed, but no solutions attempted.
Acknowledged problems are entered on a walkthrough issues list.
The list is used by the developer to make corrections to the reviewed software product or process.
A follow-up walkthrough may be necessary.
Inspections
Like the walkthrough, an inspection is a friendly meeting, but done under close supervision by the project management.
The purpose is also to identify defects, validate that they are in fact defects, record them, and schedule when and by whom they have to be fixed.
Unlike walkthroughs, inspections are conducted less frequently, may target only selected and critical issues, and are more formal and more rigorous.
An informational meeting usually takes place one week before the inspection meeting.
During the meeting, the defects are identified, recorded and numbered.
Immediately after the meeting, the moderator prepares the defect log – recorded in a change management tool.
The developer is normally requested to resolve the defects quickly and record the resolution in the change management tool.
The moderator – in consultation with the project manager – submits the development module to the Software Quality Assurance (SQA) group in the organization.
Audits
A quality assurance process that is modeled on, and similar in required resources to, traditional accounting audits.
Positioned within the overall software process management:
• it places risk management in its view
• it relates to the strategic importance of IT to the enterprise and addresses IT governance
• it looks at the alignment of the audited project with IT investments in the context of the enterprise mission and business objectives
Differences between an audit and other QA techniques:
• the producer of the audited product or process is usually a team, not a person
• an audit can be performed without the presence of the producer
• audits make extensive use of checklists and interviews, and less so of reviews
• audits can be external, i.e. conducted by auditors external to the organization
• an audit can last from a day to a week or more (but is always restricted to a strictly defined scope and objectives)
Test-driven development
The idea is to write test cases and scripts, as well as test programs, before the application code (the unit under test) is developed (designed and programmed):
• application code is written as a response to a test code, and the test code can be used to test the application code as soon as it is available
Advantages:
• clarifies the user requirements (and the use case specifications) before the programmer writes the first line of the application code
• in fact drives the software development, not just the software verification
• supported by testing frameworks, such as JUnit
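The test-first cycle can be sketched in plain Java. The PriceCalculator class and its 25% discount are illustrative assumptions, and simple assertions stand in for a real JUnit dependency; the point is only that the test exists before the production code, and the production code is written to make it pass.

```java
// Test-driven development sketch: the test below is conceived first;
// PriceCalculator is then implemented until the test passes.
public class PriceCalculatorTest {

    // Hypothetical unit under test, written in response to the test code.
    static class PriceCalculator {
        double price(double base, double discountRate) {
            return base * (1.0 - discountRate);
        }
    }

    static void check(boolean condition, String name) {
        if (!condition) throw new AssertionError("failed: " + name);
    }

    public static void main(String[] args) {
        PriceCalculator calc = new PriceCalculator();
        // Requirements captured as executable checks before coding began.
        check(calc.price(100.0, 0.25) == 75.0, "25% discount applied");
        check(calc.price(200.0, 0.0) == 200.0, "no discount leaves price unchanged");
        System.out.println("all tests passed");
    }
}
```

In a real project the checks would be JUnit test methods, but the cycle is the same: red (failing test), green (minimal code to pass), refactor.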
JUnit testing
public class XXXTest extends TestCase {
    public XXXTest(String testName) {
        super(testName);
    }
    public void testSomething() {
        ...
    }
    public void testSomethingElse() {
        ...
    }
}
[Figure: JUnit framework class diagram – the Test interface declares run(TestResult); TestCase implements Test and adds setUp(), runTest() and tearDown(); TestSuite implements Test and adds addTest(Test); XXXTest extends TestCase; test outcomes are collected in a TestResult]
Testing types and development phases
[Figure: testing types mapped to development phases – design specifications → unit test and integration test; analysis specifications → functional test; system constraints → constraint test; customer requirements → acceptance test; user environment → installation test; reusable components undergo their own unit tests; the deployed system undergoes performance, security, scalability tests, etc.]
Test concepts
[Figure: test concepts and their documents – the Business Use Case Document lists features (Feature 1, Feature 2, …); the Use Case Documents list use case requirements (Use Case Req 1, Use Case Req 2, …); the Enhancements Document lists enhancements (Enhancement 1, Enhancement 2, …); the Test Plan Document lists test cases (Test Case 1, Test Case 2, …); the Test Case Documents list test requirements (Test Req 1, Test Req 2, …); the Defects Document lists defects (Defect 1, Defect 2, …)]
Test case document
Test environment
Testing system services
Informal testing
Methodical testing
• Non-execution-based (formal reviews)
– Walkthroughs
– Inspections
• Execution-based
– Testing to specs
– Testing to code
Testing to specs
Execution-based test type:
• applies to executable software products, not to documents or models
• also known as:
– black-box testing
– functional testing
– input/output driven testing
The test module is treated as a black box that takes some input and produces some output (no need to understand the program logic or computational algorithms).
Requires that the test requirements are derived from the use case requirements, and then identified and documented in separate test plan and test case documents.
Test scenarios can be recorded in a capture-playback tool and used repeatedly for regression testing.
Likely to discover defects normally difficult to catch by other means, such as missing functionality.
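The black-box idea can be sketched in Java. The validateQuantity method is an illustrative assumption (its 1 through 16,000 range matches the equivalence-testing example later in the chapter); the tester derives every case from the specification alone and never looks at the method body.

```java
// Testing to specs: cases come from the spec "accept 1..16000,
// reject everything else", not from the code of the unit under test.
public class BlackBoxTest {

    // Hypothetical unit under test, treated as a black box.
    static boolean validateQuantity(int qty) {
        return qty >= 1 && qty <= 16000;
    }

    static void check(boolean condition, String name) {
        if (!condition) throw new AssertionError("failed: " + name);
    }

    public static void main(String[] args) {
        // Equivalence classes: below range, in range, above range.
        check(!validateQuantity(0), "below range rejected");
        check(validateQuantity(723), "typical in-range value accepted");
        check(!validateQuantity(16001), "above range rejected");
        // Boundary values get their own cases.
        check(validateQuantity(1), "lower bound accepted");
        check(validateQuantity(16000), "upper bound accepted");
        System.out.println("black-box tests passed");
    }
}
```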
Testing to specs
[Figure: the test unit as a black box – the input set Ie contains inputs causing anomalous behavior; the output set Oe contains outputs which reveal the presence of defects]
[1735]
Feasibility of testing to specs
Example 1:
Consider price calculation based on two factors: 5 types of commission and 7 types of discount → 5 × 7 = 35 test cases
Example 2:
Consider 20 factors, each taking on 4 values → 4^20 ≈ 1.1 × 10^12 test cases
Conclusion:
Combinatorial explosion [1816]
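The arithmetic of the second example (20 factors with 4 values each, i.e. 4^20 combinations) can be checked directly; BigInteger avoids any overflow concerns.

```java
import java.math.BigInteger;

// Combinatorial explosion: covering every combination of 20 factors,
// each taking 4 values, needs 4^20 test cases.
public class CombinatorialExplosion {
    public static void main(String[] args) {
        BigInteger combinations = BigInteger.valueOf(4).pow(20);
        System.out.println(combinations); // 1099511627776, i.e. about 1.1 * 10^12
    }
}
```

At one test per millisecond this suite would run for roughly 35 years, which is why exhaustive black-box testing is infeasible.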
Testing to code
Execution-based testing, also known as:
• white-box testing
• glass-box testing
• logic-driven testing
• path-oriented testing
Starts with a careful analysis of the program’s algorithms.
Test cases are derived to exercise the code – i.e. to guarantee that all possible execution paths in the program are verified:
• the test data are specially contrived to exercise the code
Can be supported by capture-playback tools and then used for regression testing:
• playback scripts need to be written by the programmer rather than generated by the tool
Like all other forms of execution-based testing, testing to code cannot be exhaustive.
Testing to code
[Figure: the test unit as a white box – the tests are derived from the code of the test unit; test inputs exercise the unit to produce test outputs] [1735]
Feasibility of testing to code
[Figure: a flow graph with a loop executed up to 18 times – the number of distinct execution paths explodes combinatorially]
It is possible to test every path without detecting every fault. A path can be tested only if it is present.
Weakening white-box testing:
• exercise both, and only, the true branch and the false branch of all conditional statements (branch coverage)
• execute every statement, and nothing more (statement coverage)
[1816]
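A concrete illustration of "every path tested, fault undetected", using a hypothetical half method: the spec (assumed here) asks for division by two rounded to the nearest integer, but integer division truncates. A test suite that happens to use only even inputs covers every path and still passes.

```java
public class PathCoverageDemo {
    // Hypothetical spec: return x / 2 rounded to the NEAREST integer.
    // Fault: integer division truncates, so odd x is handled wrongly.
    static int half(int x) {
        if (x < 0) {
            return -half(-x);   // path 1: negative input
        }
        return x / 2;           // path 2: non-negative input (truncates!)
    }

    public static void main(String[] args) {
        // These two cases exercise every path -- and both pass.
        if (half(4) != 2) throw new AssertionError();
        if (half(-4) != -2) throw new AssertionError();
        // Yet the fault survives: half(7) returns 3, the spec wants 4.
        System.out.println("all paths covered, fault undetected: half(7) = " + half(7));
    }
}
```

Path coverage guarantees that each path was executed, not that it was executed with the data that exposes its defects.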
Techniques of testing to specs
Equivalence testing:
• suppose the range 1 through 16,000 is an equivalence class
• test cases:
– less than 1 (error message)
– from 1 through 16,000 (correct)
– more than 16,000 (error message)
Boundary value analysis:
– 0
– 1
– 2
– 723
– 15,999
– 16,000
– 16,001
[1816]
Techniques of testing to code
Structural testing:
• statement coverage
• branch coverage
• path coverage
• all-definition-use-path coverage
Complexity metrics:
• lines of code
• cyclomatic complexity
[1816]
Approaches to integration testing
Bottom-up testing • requires drivers (routines that call a particular component and pass a test case to it)
Top-down testing • requires stubs (routines to simulate the activity of the missing component; they answer the calling sequence and pass back output data that lets the testing process continue)
Big-bang integration
Sandwich integration [2110]
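The driver and stub roles above can be sketched in Java, assuming a hypothetical OrderProcessor that depends on a TaxService not yet written: the stub answers the calling sequence with canned output (top-down), while the main method acts as a driver that calls the component and passes it a test case (bottom-up).

```java
public class IntegrationTestSketch {

    // Interface the top-level component depends on.
    interface TaxService {
        double taxFor(double amount);
    }

    // Component already written; its real TaxService collaborator is not.
    static class OrderProcessor {
        private final TaxService taxService;
        OrderProcessor(TaxService taxService) { this.taxService = taxService; }
        double total(double net) { return net + taxService.taxFor(net); }
    }

    // Stub: simulates the missing TaxService, passing back canned
    // output so the testing process can continue (top-down testing).
    static class TaxServiceStub implements TaxService {
        public double taxFor(double amount) { return amount * 0.25; } // canned 25% rate
    }

    // Driver: a routine that calls the component under test and
    // passes a test case to it (bottom-up testing).
    public static void main(String[] args) {
        OrderProcessor processor = new OrderProcessor(new TaxServiceStub());
        double total = processor.total(200.0);
        if (total != 250.0) throw new AssertionError("expected 250.0, got " + total);
        System.out.println("integration test passed: total = " + total);
    }
}
```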
Testing system constraints
Predominantly execution-based. Includes such issues as:
• user interface testing
• database testing
• authorization testing
• performance testing
• stress testing
• failover testing
• configuration testing
• installation testing
Graphical user interface testing
Is the window modal or modeless? Which should it be?
Is a visual distinction made between the required and optional fields?
Are any fields missing?
Are there any spelling mistakes in titles, labels, prompt names, etc.?
Are command buttons (OK, Cancel, Save, Clear, etc.) used consistently across all dialog boxes?
Is it always possible to abort the current operation (including the delete operation)?
Are all static fields protected from editing by users? If the application can change the static text, is this being done correctly?
Do the sizes of edit boxes correspond to the ranges of values that they take?
Are the values entered into edit boxes validated by the client program?
Are the values in drop-down lists populated correctly from the database?
Are edit masks used in entry fields as specified?
...
Database testing
Verify that the transaction executes as expected with correct input. Is the system’s feedback to the UI correct? Is the database content correct after the transaction?
Verify that the transaction executes as expected with incorrect input. Is the system’s feedback to the UI correct? Is the database content correct after the transaction?
Abort the transaction before it finishes. Is the system’s feedback to the UI correct? Is the database content correct after the transaction?
Run the same transaction concurrently in many processes. Deliberately make one transaction hold a lock on a data resource needed by other transactions. Are the users getting understandable explanations from the system? Is the database content correct after the transactions have terminated?
Extract every client SQL statement from the client program and execute it interactively on the database. Are the results as expected, and the same as when the SQL is executed from the program?
...
Authorization testing
The user interface of the program should be able to configure itself dynamically to correspond to the authorization level of the current user (authenticated by the user id and password).
Server permissions (privileges):
• to access individual server objects (tables, views, columns, stored procedures, etc.)
• to execute SQL statements (select, update, insert, delete, etc.)
Permissions for a user may be assigned at a user level or at a group level; most DBMSs also support the role level.
An authorization database may need to be set up alongside the application database to store and manipulate the client and server permissions.
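A minimal sketch of role-level permissions for such tests (the roles, statements and in-memory map are illustrative assumptions, not a particular DBMS's API): authorization tests then verify that each level is honoured.

```java
import java.util.Map;
import java.util.Set;

public class AuthorizationSketch {

    // Role-level permissions: each role maps to the SQL statement
    // types it may execute (roles and grants are hypothetical).
    static final Map<String, Set<String>> ROLE_PERMISSIONS = Map.of(
        "clerk",   Set.of("select", "insert"),
        "manager", Set.of("select", "insert", "update", "delete")
    );

    static boolean allowed(String role, String statement) {
        return ROLE_PERMISSIONS.getOrDefault(role, Set.of()).contains(statement);
    }

    public static void main(String[] args) {
        // Authorization tests: each authorization level is exercised.
        if (!allowed("clerk", "select")) throw new AssertionError();
        if (allowed("clerk", "delete")) throw new AssertionError();
        if (!allowed("manager", "delete")) throw new AssertionError();
        if (allowed("intruder", "select")) throw new AssertionError();
        System.out.println("authorization tests passed");
    }
}
```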
Testing of other constraints
Performance testing:
• transaction speed and throughput
• peak loads
Stress testing:
• to break the system when abnormal demands are placed on it
• frequently coupled with performance testing
Failover testing:
• the system’s response to a variety of hardware, network or software malfunctions
• closely related to the DBMS recovery procedures
Configuration testing:
• how the system operates on various software and hardware configurations
Installation testing:
• extends the configuration testing
• verifies that the system operates properly on every platform installed
Review Quiz 9.1
1. Which technique of quality assurance is also known as non-execution-based testing?
2. Which quality assurance technique produces a defect log to act on?
3. What is a popular testing framework for Java applications?
4. How are test cases realized (specified)?
5. Which kind of testing can discover missing functionality?
2. Change management
Change management is an underlying aspect of the overall project management. Change requests must be documented, and the impact of each change on development artifacts must be tracked and retested after the change has been realized.
Change and quality management
[Figure: change and quality management – change requests are handled by change management; use case requirements, model artifacts, enhancements, test requirements and defects are interlinked under quality management]
Managing change
[Screenshot: managing change in a change management tool – a workspace browser with predefined queries, charts and reports; a list of defects; details of the selected defect; possible actions (state changes)]
Submitting change request
Defect management
Allowed actions for submitted defect
Keeping track of change requests
Traceability
There is a significant cost to the project associated with traceability, testing and change management:
• a cost-benefit analysis should determine the scope and depth of project traceability
– as a minimum, traceability should be maintained between the use case requirements and defects
What can be traced?
• features are linked to test cases and to use case requirements in the use case documents
• test requirements in the test case documents can be traced back to test cases and use case requirements
• test requirements are linked to defects, and enhancements are traced to use case requirements
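The minimum traceability path described above can be sketched as simple link tables (the requirement, test and defect identifiers are illustrative): following the links from a use case requirement through its test requirements to defects shows the impact of a change.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class TraceabilitySketch {

    // Forward links (hypothetical identifiers):
    // use case requirement -> test requirements -> defects.
    static final Map<String, List<String>> REQ_TO_TESTS = Map.of(
        "UC-REQ-1", List.of("T-REQ-1", "T-REQ-2"),
        "UC-REQ-2", List.of("T-REQ-3")
    );
    static final Map<String, List<String>> TEST_TO_DEFECTS = Map.of(
        "T-REQ-2", List.of("DEF-7")
    );

    // Trace a use case requirement down to the defects raised against it.
    static List<String> defectsFor(String useCaseReq) {
        List<String> defects = new ArrayList<>();
        for (String testReq : REQ_TO_TESTS.getOrDefault(useCaseReq, List.of())) {
            defects.addAll(TEST_TO_DEFECTS.getOrDefault(testReq, List.of()));
        }
        return defects;
    }

    public static void main(String[] args) {
        System.out.println("UC-REQ-1 defects: " + defectsFor("UC-REQ-1")); // [DEF-7]
        System.out.println("UC-REQ-2 defects: " + defectsFor("UC-REQ-2")); // []
    }
}
```

A change management tool maintains the same links in a database, usually in both directions.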
System features to use cases and use case requirements
Test plans to test cases and test requirements
UML diagrams to documents
UML diagrams to requirements
Use case requirements to test requirements
Test requirements to defects
Use case requirements to enhancements
Review Quiz 9.2
1. How does an activity of managing a change originate?
2. What technique is used to determine the scope and depth of project traceability?
3. What tools can automate test scripts?
Summary
Testing and change management span the development lifecycle.
Testing has two dimensions:
• it is a reactive (post factum) activity when used as a quality control mechanism
• it can, however, be a very proactive quality assurance activity when used within the framework of test-driven development
Testing and change management assume that the traceability links between system artifacts exist and have been properly maintained during the development.
Testing divides into the testing of system services and the testing of system constraints.
Test requirements are identified in the test case documents and linked to the use case requirements in the use case documents.
A change request is normally either a defect or an enhancement.