Isn’t an Enterprise Test Strategy just for an Enterprise Project Release?
CHICAGO, April 18th – April 22nd, Renaissance Hotel, 1 West Wacker Drive, Chicago, IL 60601

Speaker: Susan Schanta, Director, Cognizant Technology Solutions
When: April 22, 2016, 10:00 am – 11:00 am

© 2015 Cognizant

Why do I need an Enterprise Test Strategy?
Because quality can’t be tested into the product… We need a shared vision to achieve quality…

What is the Challenge We’re Solving for?

• Lack of shared understanding regarding the role of the Quality organization
  − We pay a lot for QA, why isn’t quality better?
  − Why do we need QA so early in the lifecycle?
  − QA just tests at the end, right?
  − Why does it take QA so long to test?

• How do I change our persona from a Testing Department to a Quality Center of Excellence?
  − Does the organization understand QA’s role in the development lifecycle?
  − Have I set expectations for how and when project stakeholders should engage QA?
  − How can I establish a collaborative relationship where cross-functional teams understand the interdependencies between their work product and the QA work product?

What is an Enterprise Test Strategy?

The Enterprise Test Strategy establishes a framework for how the QA organization operates and interacts with project team members. When developed collaboratively with Project Management, Business Analysts and Development, it provides a foundation for how the organization will build quality into product releases while reducing the cost of quality.

• Establishes standards for test analysis, planning and validation
• Helps drive behavioral change in QA and cross-functional teams
• Introduces shared responsibility for best practices
• Drives defect reduction in the Requirements, Design and Construction Phases of the lifecycle
• Institutes a shared and disciplined approach to automation standards to achieve automation sustainability
• Defines performance standards for critical applications to address operational and business continuity goals
• Aligns test data management to corporate security policies
• Creates strategies where gaps exist for specialty testing such as mobility, big data, automation and performance

The Value Proposition

The Enterprise Test Strategy provides a framework for how QA drives quality throughout the lifecycle, and a foundation for cross-team collaboration, quality disciplines and operating guidelines. The ultimate goal is to deliver the best quality while driving down the cost of quality.

• Limit Scope Creep
• Increase Test Coverage
• Increase velocity of Test Case creation
• Increase Requirements Traceability to Test Cases
• Reduce Defect Leakage to Production
• Reduce Cost of Maintenance
• Reduce rework

An Examination of Strategy Types

Enterprise Test Strategy
• Mission Statement
• Standards for test analysis, planning and validation
  − Definition of Test Types
  − RACI for Test Phases – Unit through UAT
  − Tiered approach for test documentation
• Automation Disciplines – ROI driven automation, development disciplines
• Performance Disciplines – load/stress, usability and business continuity
• Defect Management Guidelines
• Test Data Management Guidelines
• Test Environment Management
• Test Tool
• Metrics beyond Defect Rates

Program / Project Test Strategy
• Project scope and objectives
  − Risks & Mitigation
  − Assumptions, Dependencies & Constraints
• Test Scope – what will be tested
  − Limitations to testing based on tools needed
  − Limitations to testing based on environment availability
• Test Approach
  − Manual test activities
  − Automated test activities
• Test Environment Requirements
  − Required hardware, software and licenses
• Test Data Requirements
  − Data extraction requirements from production
  − Parameters for manipulation of test data (such as aging)

Establishing the Enterprise Quality Framework

Defining the Mission

Create a definitive statement of your organizational goals…

Quality Assurance is dedicated to reducing the cost of quality by improving overall product reliability. Our mission to attain quality lies in defect prevention: establishing a precisely measurable process to ensure conformance to requirements. We believe that quality is the all-important catalyst that makes the difference between success and failure.

Our mission is to control, manage and drive all testing services; ensure quality in our software solutions; deliver on-time, on-budget, goal-oriented and cost-effective solutions for the business; satisfy customer requirements; strive for continuous process improvement; and contribute to the overall growth of the organization through streamlined, efficient and best-in-class testing practices and a governance model with highly skilled and motivated people.

Tiered Approach to Test Documentation

After defining the full library of QA artifacts, determine mandatory use based on set criteria:
• Based on project size & complexity
• Based on project duration
• Based on test scope

| # | Deliverable | SDLC Phase | PMO | Maintenance Releases |
|---|---|---|---|---|
| 1 | L0 QA Project Estimate | Initiation | Y | Y |
| 2 | QCOE Project Plan (Schedule) | Requirements | Y | N |
| 3 | L1 QA Project Estimate | Requirements | Y | N |
| 4 | Requirements Traceability Matrix | Requirements | Y | N |
| 5 | Test Scenario | Requirements | Y | Y |
| 6 | Master Test Plan | Requirements | Y | N |
| 7 | L2 QA Project Estimate | Design | Y | N |
| 8 | Project Level Test Plan | Design | Y | N |
| 9 | Regression Test Case Selection | Design | Y | Y |
| 10 | Test Case | Design | Y | Y |
| 11 | Test Data Requirements Template | Design | Y | Y |
| 12 | Test Environment Readiness Checklist | Construction | Y | N |
| 13 | Defect Report Template | Validation | Y | Y |
| 14 | Productivity Loss Log | Validation | Y | Y |
| 15 | Test Summary & Closure Report | Validation | Y | N |
| 16 | Lessons Learnt Document | Validation | Y | N |
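As an illustration of the tier criteria above, here is a sketch of how mandatory-deliverable selection might be encoded. The thresholds, tier names and per-tier deliverable sets are hypothetical assumptions for illustration, not values defined by the strategy:

```python
# Hypothetical sketch: selecting mandatory QA deliverables by project tier.
# Thresholds and per-tier deliverable sets are illustrative assumptions.

def project_tier(size_points: int, duration_weeks: int) -> str:
    """Classify a project by size/complexity and duration (assumed thresholds)."""
    if size_points >= 100 or duration_weeks >= 26:
        return "large"
    if size_points >= 30 or duration_weeks >= 8:
        return "medium"
    return "small"

# Mandatory artifacts per tier (illustrative subset of the deliverable library).
MANDATORY = {
    "small":  ["L0 QA Project Estimate", "Test Case", "Defect Report Template"],
    "medium": ["L0 QA Project Estimate", "L1 QA Project Estimate",
               "Requirements Traceability Matrix", "Test Case",
               "Defect Report Template", "Test Summary & Closure Report"],
    "large":  ["L0 QA Project Estimate", "L1 QA Project Estimate",
               "L2 QA Project Estimate", "Requirements Traceability Matrix",
               "Master Test Plan", "Test Case", "Defect Report Template",
               "Test Summary & Closure Report", "Lessons Learnt Document"],
}

def mandatory_deliverables(size_points: int, duration_weeks: int) -> list[str]:
    """Return the artifact list a project of this size/duration must produce."""
    return MANDATORY[project_tier(size_points, duration_weeks)]
```

In practice the same lookup could also key on test scope, the third criterion the slide names.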

Communication Guidelines

| Communication Deliverable | Objective | Owner | Frequency |
|---|---|---|---|
| Business/Test Scenario | Test conditions based on requirements, business rules, constraints, etc. | QA Lead | Once during Requirements Phase |
| Business/Test Scenario Stakeholder Review | Stakeholder feedback collected and incorporated | QA Manager | As scheduled |
| Code Turnover Notes | Code turnover notes of functions ready to test, workarounds and open defects | Dev Lead | For each code turnover |
| L0 QCOE Project Estimate | Estimate based on business case and discussion; no requirements defined | QA Manager | Once in Proposal Phase |
| L1 QCOE Project Estimate | Estimate based on elicitation sessions and requirements | QA Lead | Once in Requirements Phase |
| Requirements Traceability Matrix | Trace requirements to test cases to defects | QA Lead | Weekly from Test Design forward |
| Test Case | Steps to validate a test condition based on business, user, functional/nonfunctional requirements | QA Lead | Once; stored in Test Repository |
| Test Case Peer Review | Peer feedback collected and incorporated | QA Lead | As scheduled |
| Test Plan | Defines the tactical and operational approach to validation of test conditions | QA Lead | Initiated in Requirements and updated as needed |
| Test Plan Stakeholder Review | Stakeholder feedback collected and incorporated | QA Manager | As scheduled |

Definition of Test Types

| Test Type | Test Definition |
|---|---|
| Unit Test | Performed by the developer to validate units of source code, modules and functions using controlled data, verifying that each unit of code operates as designed. |
| Integrated Unit Test | A logical extension of unit testing. In its simplest form, two units that were individually tested are combined into a component and the interface between them is tested. A component, in this sense, refers to an integration of one or more units. |
| Unit Regression Test | When developers modify code, regression unit testing is required to evaluate code quality. The developer can reuse the original unit test cases; in some cases, existing tests must be modified or new ones created depending on the extent of the changes. |
| Smoke Test | A preliminary evaluation to identify failures severe enough to reject a prospective code turnover, using a subset of test cases that covers the most important functionality. |

Definition of Test Types (continued)

| Test Type | Test Definition |
|---|---|
| System Test | Compares a program’s behavior against the functional business and technical specifications. Positive system testing verifies that users can successfully exercise all paths of functionality with expected results; negative system testing checks that users are not allowed to use improper paths. |
| System Integration Test | Performed on the complete, fully integrated system with a focus on role-based testing to emulate real-life scenarios; sometimes called end-to-end testing. The purpose is to detect any inconsistencies between the functions that are integrated together. |
| Regression Test | Any type of testing that seeks to uncover software errors by partially retesting a modified program. The intent is to provide general assurance that no additional errors were introduced in the process of fixing other problems. |
| User Acceptance Test | User Acceptance Testing (UAT) is a process to obtain confirmation that a system meets mutually agreed-upon requirements. UAT acts as a final verification of the required business function and proper functioning of the system, emulating real-world usage conditions on behalf of the end user. |

Defect Management

• Enforce a single standard for logging and tracking defects
• Traceability of defects through resolution and Root Cause Analysis
• Business Priority vs. Technical Severity
  − Implementation provides a preventative remedy to avoid downgrading defects in order to release to production
• Establish SLAs based on business priority and technical severity, by lifecycle phase
  − Expected turnaround time < 12 hrs.
  − Expected turnaround time < 48 hrs.
  − Expected turnaround time < 72 hrs.
  − Expected turnaround time – prior to production release
• Restricted permissions for select defect states
  − Closed, cancelled and deferred states
• Go/No Go Decision
  − Zero uninspected defects at release
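The SLA tiers above can be sketched as a priority/severity lookup. The tier numbering and the rule that the worse of the two dimensions wins are illustrative assumptions; only the turnaround windows themselves come from the strategy:

```python
# Illustrative sketch of priority/severity-driven defect SLAs.
# Tier numbering and the combination rule are assumptions; the strategy
# defines only the turnaround windows (<12h, <48h, <72h, prior to release).

SLA_HOURS = {1: 12, 2: 48, 3: 72}  # tier -> expected turnaround in hours

def sla_tier(business_priority: int, technical_severity: int) -> int:
    """Combine business priority and technical severity (1 = highest)
    into one SLA tier; the more urgent of the two dimensions wins."""
    return min(business_priority, technical_severity, 4)

def turnaround(business_priority: int, technical_severity: int) -> str:
    """Map a defect to its expected turnaround window."""
    tier = sla_tier(business_priority, technical_severity)
    if tier in SLA_HOURS:
        return f"< {SLA_HOURS[tier]} hrs"
    return "prior to production release"
```

A lookup like this also supports the anti-downgrade point above: the SLA is derived from both dimensions, so lowering technical severity alone cannot relax the window when business priority stays high.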

Standards for Test Analysis, Planning & Validation

Enterprise Approach to Requirements Phase

• Requirements Elicitation
  − Participate in elicitation sessions
  − Create business/test scenarios
  − Ensure requirements have an expected outcome
• Requirements Analysis
  − Ambiguity Reviews to identify unclear, incomplete and missing requirements
  − Feedback loop with BA to address ambiguities
  − Unresolved ambiguities documented as requirements defects
• Requirements Phase Signoff
  − QA certifies requirements are testable
  − QA certifies requirements are traceable to test cases
  − QA obtains signoff from business that business/test scenarios provide appropriate test coverage

Enterprise Approach to Design Phase

• Design Sessions
  − Participate in design sessions
  − Add/modify business/test scenarios
  − Ensure design delivers documented requirements and provides the expected outcome
• Design Analysis
  − Identify gaps/conflicts between requirements and design
  − Feedback loop with BAs/Development to address gaps/conflicts between requirements and design
• Design Phase Signoff
  − QA certifies design addresses defined requirements
  − QA certifies design is traceable to requirements and test cases

Enterprise Approach to Construction Phase

• Test Case Creation
  − Create test cases based on requirements and business/test scenarios
  − Automation script creation
  − Performance script creation
• Regression Suite Analysis
  − Examine the regression test case suite and select test cases for execution during regression test
• Test Case Review
  − Internal test case peer reviews with Test Leads
  − Project stakeholder review with BAs and Dev Leads
• Test Case Signoff
  − QA secures signoff from BAs and Dev Leads
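Regression suite analysis can be sketched as a simple selection over the test case library. The fields (`areas`, `priority`) and the selection rule are hypothetical, shown only to illustrate the activity:

```python
# Hypothetical sketch of regression suite analysis: pick regression
# candidates from the full suite by impacted area and priority.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str
    areas: set[str]   # functional areas the case covers (assumed metadata)
    priority: int     # 1 = critical ... 4 = low (assumed scale)

def select_regression(suite: list[TestCase], impacted: set[str],
                      max_priority: int = 1) -> list[TestCase]:
    """Keep every critical case, plus any case touching an impacted area."""
    return [tc for tc in suite
            if tc.priority <= max_priority or (tc.areas & impacted)]
```

A real selection would also weigh execution history and automation status, but the shape is the same: the suite is filtered, not rerun wholesale.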

Enterprise Approach to Validation Phase

• Smoke Test
  − Execute as entrance gate to Validation Phase
  − Execute as entrance gate for every code turnover
• System Testing
  − Test execution for system test cases
  − Test execution for system integration test cases
  − Defect fix and retest cycles
• Regression Test
  − Execute manual regression test cases
  − Execute automated regression test cases
• Test Closure
  − Final update of test cases where needed
  − Test case selection for Regression Suite
  − Target test cases for next stage of automation
  − Lessons Learned

Introducing the Enterprise Test Strategy

Introducing an Enterprise Test Strategy to Your Organization

• Define the purpose and mission
  − Decide what you want to accomplish
  − Engage key stakeholders to gain acceptance and support
  − Create a template outline in alignment with your goals
  − Draft standards and guidelines as needed where gaps exist
  − Implement in a phased approach
• Introduce governance councils
  − Set goals based on governance council purpose, such as:
    − Automation goals based on calculated ROI
    − Development disciplines to achieve automation sustainability
  − Perform collaborative reviews with cross-functional stakeholders
• Introduce to senior management as a program
  − Determine if the creation of a steering committee will garner greater support
  − Highlight the approach as an initiative to reduce the cost of quality
  − Create metrics to benchmark present state and measure progress going forward
  − Set and keep a schedule for publishing metrics, such as a monthly scorecard

Automation & Performance Governance Councils

Automation Governance Council
Provide oversight and direction to facilitate expansion of automated test coverage and sustainability to achieve long-term ROI.
• Develop code disciplines to support automation sustainability
• Collaborate to design an automation coding approach that aligns with EH system architecture
• Council governs cross-functional collaboration for tool/process
• Prioritization of test case automation
• ROI review against goals
• Set standards to facilitate use of automation benchmarks in development
• Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management

Performance Governance Council
Provide oversight and direction to facilitate expansion of performance test coverage and sustainability to achieve long-term ROI.
• Code disciplines to support performance sustainability
• Prioritization of applications to performance test
• ROI review against goals
• Evaluate environmental requirements to support performance as a discipline
• Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management
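ROI review against goals presupposes an agreed ROI formula. Here is a minimal sketch, assuming ROI is computed as manual cost avoided, net of automation cost, divided by automation cost; all figures are hypothetical inputs, not numbers from the deck:

```python
# Illustrative automation ROI calculation to support "ROI review against
# goals". The formula and all inputs are assumptions for illustration.

def automation_roi(build_cost: float, maintain_cost_per_cycle: float,
                   manual_cost_per_cycle: float, cycles: int) -> float:
    """ROI = (manual cost avoided - total automation cost) / automation cost."""
    automation_cost = build_cost + maintain_cost_per_cycle * cycles
    savings = manual_cost_per_cycle * cycles
    return (savings - automation_cost) / automation_cost

# e.g. a suite costing 200 hrs to build and 5 hrs/cycle to maintain,
# replacing 40 hrs of manual execution per cycle, over 12 cycles:
roi = automation_roi(200, 5, 40, 12)
```

Note that ROI grows with the number of cycles, which is why the councils tie automation priorities to sustainability: a suite that decays after a few cycles never amortizes its build cost.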

Test Data & Test Environment Management Governance Councils

Test Data Management Governance Council
Provide oversight and direction to facilitate transfer, use, maintenance and disposal of production test data within the guidelines of IT Governance.
• Align IT Governance Standards to the test data management discipline for masking, manipulation and disposal of production data extracts used for testing
• Govern use of production data to perform break/fix testing
• Set standards for capacity management (of data) to ensure storage use is optimized
• Implement guidelines for test coverage to ensure IT Governance Standards for data are observed
• Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management

Test Environment Governance Council
Provide oversight and governance to normalize test environment configuration, usage and expansion to support E2E testing.
• Unify management of test environments under one governance structure
• Establish standards for configuration to align as closely as possible to production
• Govern release standards to ensure code promotion in test environments mirrors production
• Set standards for capacity management to ensure new hardware/tools are not requested until current system usage is maximized
• Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management

Measurement

Project Level Measurements

• Project Budget: Estimated vs. Actual
• % Milestone Adherence
• Rate of Change Requests
• Maintenance as % of IT Budget
• Schedule Variance
• Defect Aging
• Compliance to SDLC Processes
• Effort Variance
• Production Defect Leakage

Requirements Measurements

• % Requirements Schedule Adherence
• # Clarifications due to Unclear Requirements
• # Ambiguities per Requirement
• Rate of Defects with Root Cause as Requirements
• Requirements Volatility Index
• # Incomplete Requirements
• Defect Leakage to Design & Construction
• Rate of Change Requests
• # Missing Requirements

Development Measurements

• % Development Schedule Adherence
• Defects per Function Point
• Code Turnover Build Failures
• Defect Leakage to Validation Phase
• Defects per Lines of Code
• Mean Time to Repair
• Adherence to Automation Development Disciplines
• # Times Defect Rejected/Reopened
• Technical Debt due to Poor Design

Test Measurements

• % QA Schedule Adherence
• Automation ROI
• Defect Detection Rate
• Requirements Traceability to Test Cases
• % Automated Regression Coverage
• Defect Removal Efficiency
• Test Effectiveness
• Productivity Loss Log
• Production Defect Leakage
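Two of the metrics listed here, Defect Removal Efficiency and Defect Detection Rate, can be expressed with the standard industry formulas; the deck names the metrics but does not define them, so these definitions are assumptions based on common usage:

```python
# Sketch of two common metric formulas. Definitions follow standard
# industry usage, since the deck lists only the metric names.

def defect_removal_efficiency(found_before_release: int,
                              found_after_release: int) -> float:
    """DRE = defects removed before release / total defects, as a percentage."""
    total = found_before_release + found_after_release
    return 100.0 * found_before_release / total if total else 100.0

def defect_detection_rate(defects_found: int,
                          test_cases_executed: int) -> float:
    """Defects detected per executed test case."""
    return defects_found / test_cases_executed if test_cases_executed else 0.0

# e.g. 190 defects caught in test, 10 leaked to production -> DRE = 95%
```

Tracked release over release, DRE ties the Test Measurements directly back to Production Defect Leakage on the project scorecard.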

Appendix

Enterprise Test Strategy – Sample Template

1. Executive Summary
   a. QCOE Mission Statement
   b. QCOE Scope of Testing
2. Test Levels
   a. Unit Test
   b. Integrated Unit Test
   c. Unit Regression Test
   d. System Test
   e. System Integration Test
   f. Smoke Test
   g. Regression Test
      1) Manual vs. Automated
   h. Nonfunctional Test
   i. UAT
3. Tiered Approach to Projects
4. Minimum Threshold for Entrance to QCOE
5. Entrance & Exit Criteria
   a. Proposal Phase
   b. Initiation & Planning Phase
   c. Requirements Phase

   d. Design Phase
   e. Construction Phase
   f. Validation Phase
6. Suspension & Resumption Criteria
   a. Test Halt
      1) Smoke Test Fails
      2) Module/Function Failure or Roadblock
   b. Test Resumption
7. Program Assumptions, Constraints, Risks & Dependencies
   a. Assumptions
   b. Constraints
   c. Risks
   d. Dependencies
   e. Regulatory & Compliance
8. Systems Supported
   a. Operating Systems
   b. Web Browsers
9. QCOE Deliverables
10. QCOE Communication Guidelines

11. Test Estimation Model
   a. L0 Test Estimation
   b. L1 Test Estimation
   c. L2 Test Estimation
   d. Impact Analysis – Change in Scope
12. Strategic Program Approach to Test Activities
   a. Functional Test
   b. System Test
   c. System Integration Test
   d. Regression Test
13. Functional vs. User Acceptance Test
14. Strategic Program Approach for Automation
   a. Guidelines for Automation
   b. Automation Phase & Activities
      1) Requirements Phase
      2) Design Phase
      3) Construction Phase
      4) Validation Phase
   c. IT Development Standards
   d. Automation Coding Standards

15. Strategic Program Approach for Performance
   a. Types of Performance Testing
   b. Performance Standards
   c. Guidelines for Performance
   d. Performance Phase & Activities
16. Test Data Management
   a. TDM Standards
   b. Guidelines for Test Data Use
   c. TDM Framework
   d. TDM Phase & Activities
17. Test Environment Management
   a. TEMs Standards
   b. Guidelines for Test Environment Use
   c. TEMs Framework
   d. TEMs Phase & Activities
   e. System Environment Limitations & Missing Environments
18. QCOE Tools
   a. Commercial Tools
   b. Proprietary Tools

19. Defect Management Process
   a. Defect State
   b. Business Priority
   c. Defect Technical Severity
   d. Defect Prioritization Based on Business Priority & Technical Severity
   e. Restricted Permissions – Closing/Cancelling/Deferred Defects
20. Go/No Go Decision
   a. Zero Uninspected Defects at Release
   b. Go/No Go Decision
21. QCOE Test Management Tool Set
22. QCOE Program Reporting
   a. Reports (Traceability, Daily, Weekly, Productivity Loss Log)
   b. KPIs, SLAs & Metrics
23. Reference Documents
24. Appendix
   a. Glossary


Susan Schanta
Director, Cognizant Technology Solutions
[email protected]
201-478-0571