A QA’s Role in a DevOps World
Quality Initiatives in the DevOps Toolchain

Nordic Testing Days 2018, Tallinn, Estonia
Jan Sahlström, Betsson Technologies, @testsson_se
Photo: Jan Sahlström, 2017

DevOps

“A term used to refer to a set of practices that emphasize the collaboration and communication of both developers and information technology (IT) professionals while automating the process of software delivery and infrastructure changes. It aims at establishing a culture and environment where building, testing, and releasing software can happen rapidly, frequently, and more reliably.” Source: Wikipedia

DevOps is a culture, not (only) a role.

Are there any problems with DevOps?
• There is no clear definition
• Developed by developers and ops, no testers involved
• ”Everything can be automated”

DevOps Toolchain

Test here! (at every stage of the toolchain)

Inspired by: @DanAshby04

DevOps Toolchain

1. Understand the activity in each stage

Create

Create is composed of the building, coding, and configuring of the software development process. The stage-specific activities are:
• Design of the software and configuration
• Coding, including code quality and performance
• Software build and build performance
• Release candidate

a.k.a: Coding

DevOps Toolchain

1. Understand the activity in each stage
2. Identify quality initiatives

Create | QA initiatives

What | Comment | How
TDD | Help drive the testing and understanding of the feature | Sapient
Unit Test | Verify the functionality of a specific section of code | Machine
Integration Test | Verify the interfaces between components | Machine
Component interface | Check the handling of data passed between various units | Machine
Test Code Quality | Configured and measured using tools | Machine
Code Review | Any form of four-eye principle | Sapient
Code Coverage | Where are the weak spots? | Machine

DevOps Toolchain

Repeat for each section of the pipeline
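As an illustration of how a machine-checked initiative from the table might look, here is a minimal unit-test sketch in Python. The function `apply_discount` is a hypothetical unit under test, not from the presentation; in TDD these test cases would be written first and drive the implementation.

```python
import unittest

# Hypothetical unit under test; in TDD the tests below would come first.
def apply_discount(price, percent):
    """Return the price after deducting a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_happy_path(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_returns_original_price(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite programmatically so the sketch is self-contained.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a pipeline this suite would run on every build ("Machine"), while reviewing both the code and its tests stays a "Sapient" activity.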

Define where the various initiatives fit, and what should happen if a verification step fails.

Some initiatives

Adapting to the increasing speed of deliveries and ways of working.

Environments

Dev PC → PreProd → PROD

Frequent, controlled deployment from Dev PC to PreProd; highly controlled, guarded deployment into PROD.

LOCAL | TEST | QA | PRODUCTION

Test activities across LOCAL, TEST and QA: Unit Test, Smoke Test, System Test, Integration Test, RST, Load Test(!), UAT, Acceptance Test, Regression Test, Performance Test, 4 Eye Principle, System Test. MONITORING in the TEST, QA and PRODUCTION environments.

Everyone has a Test environment. Some are lucky enough to have another environment for their production code.

Environments: LOCAL, TEST, QA, PROD

Test Doubles: Fake, Mock, Stub

[Diagram: the Test Environment talks to Test Doubles standing in for the 3rd-party systems; the Live Environment talks to the real 3rd parties]
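A small sketch of the idea using Python's `unittest.mock`. `CheckoutService` and its payment gateway are hypothetical stand-ins for a system with a 3rd-party dependency; in the test environment the real client is replaced by a double.

```python
from unittest.mock import Mock

# Hypothetical service with a 3rd-party dependency (a payment gateway).
class CheckoutService:
    def __init__(self, gateway):
        self.gateway = gateway  # real client in the live environment

    def pay(self, amount):
        response = self.gateway.charge(amount)
        return response["status"] == "ok"

# In the test environment a Mock stands in for the 3rd party:
# no network calls, and the responses are fully under our control.
gateway_double = Mock()
gateway_double.charge.return_value = {"status": "ok"}

service = CheckoutService(gateway_double)
assert service.pay(100) is True
gateway_double.charge.assert_called_once_with(100)
```

A stub would only return canned data, a fake would carry a lightweight working implementation, and a mock additionally verifies how it was called, as in the last line.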



Linting

• Linting – running a program that analyzes code for potential errors
• Start small with just a few rules and expand as you go

Unit Test / Integration Test
• Run on LOCAL and TEST
• Code Coverage – what percentage should you aim for?
• Code Coverage – shouldn’t deteriorate
• Should be fast; if not, try to parallelize
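In practice linting is done with an existing tool (e.g. flake8 or ESLint), but the principle of starting small with a few rules can be illustrated with a toy linter built on Python's `ast` module, here with one hypothetical rule that flags bare `except:` clauses:

```python
import ast

def lint(source):
    """Return (line, message) findings for a single rule:
    bare 'except:' clauses, which silently swallow all errors."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append((node.lineno, "bare 'except:' hides errors"))
    return findings

code = """
try:
    risky()
except:
    pass
"""
print(lint(code))  # one finding, pointing at the bare except
```

New rules can then be added one at a time as the team agrees on them, which keeps the lint step from failing half the codebase on day one.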

[Diagram: a serial pipeline, Build → Test Set 01 → Test Set 02 → Test Set 03, versus a parallel pipeline where the Build feeds Test Set 01, Test Set 02 and Test Set 03 side by side]

Test Data – Setup and Teardown

Initial State → Setup → Execute → Teardown → Return to Initial State

Why? Ending and starting tests in a known state avoids tests failing due to other tests breaking.

Branches – When to test

Master #00 Master #01 Master #02

A #1

Test Deep

Branches – When to test

B #1

Master #00 Master #01 Master #02 Master #03

A #1 Test Deep

Tests done in the branch (Test Deep) might not be valid after B #1 due to breaking changes; tests done in Master might be too late.

Branches – When to test

A #1

Master #00 Master #01 Master #02 Master #03

1. Test Deep (on the branch A #1)
2. Test Shallow on Master, i.e. Smoke Test

Middle of the road approach

Branches – When to test

A #1

Master #00 Master #01 Master #02 Master #03

A #1
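The middle-of-the-road approach above can be expressed as a simple suite-selection rule in CI. This sketch uses hypothetical suite names and assumes the integration branch is called master, as in the diagrams:

```python
# Deep tests on feature branches, shallow smoke tests on master:
# the branch gets early, thorough feedback, while every merge to
# master is still sanity-checked quickly.
def select_suites(branch):
    if branch == "master":
        return ["smoke"]
    return ["unit", "integration", "system"]

assert select_suites("feature/A") == ["unit", "integration", "system"]
assert select_suites("master") == ["smoke"]
```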

“Master should always be deployable”
Jez Humble, DevOps Research and Assessment LLC

“Master should never contain known errors”
Jakob Alander, Spotify

On (checks)
• Writing automated checks is bug prevention, if it’s done by a developer, i.e. bugs are found while writing the tests.
• Let new tests qualify for the CI by monitoring them first (watch out for non-deterministic tests)

Code Review
• Review all code, including tests
• Don’t just read the code. Run the code and the applicable tests during the review
• Testers can join in on code reviews even if they cannot code

Canarying / Canary Deploy
• Deploy / switch on and verify one instance / section at a time
• Start with the “cheapest” one…
• Monitor closely

The Test column – who owns it?

ToDo | In Progress | Test | Done


Empty the board right-to-left, i.e. empty the Test column before starting a new feature.

Pair/Mob Testing
• Join up with someone in the team and do a joint test session
• Gather the team in front of a big monitor and do a mob session

Image source: Taipei Times, Dec 24, 2014. AFP / KCNA via KNS

Done. Or almost done?

In Progress | Test | Almost Done... | Done

Make sure that there is a DoD (Definition of Done) before starting on a new feature.

Monitor the builds

• The state of the build should be everyone’s primary interest
• Must always be green – avoid the broken-window syndrome
• Display on a dedicated monitor
• All green != no defects…

Monitor your systems

• Look into different strategies for monitoring (RED, USE, others)
• Use the same monitoring across the different environments
• Share useful monitoring

Monitoring vs. Testing

[Diagram: coverage as a sliding scale between Testing and Monitoring]

Increased risk of defects in PROD →

Monitoring vs. Testing. Why not? If:

✓ Possibility to perform A/B testing, Canary Release or equivalent
✓ Hidden PROD environment (AWS Green & Blue)
✓ Support & Tech organization available to act quickly
✓ Extremely loyal users

DevOps Toolchain

1. Understand the activity in each stage
2. Identify quality initiatives
3. Prioritize and implement

TeQuILA

Bringing it all together

Modeling

Google Test Certified – a 5-level model developed at Google for enhancing quality

Level 1
• Set up test coverage bundles
• Set up a continuous build
• Classify your tests as Small, Medium and Large
• Identify non-deterministic tests
• Create a smoke test suite

Level 2
• No releases with red tests
• Require a smoke test suite to pass before a submit
• Incremental coverage by all tests >= 50%
• Incremental coverage by small tests >= 10%
• At least one feature tested by an integration test

Level 3
• Require tests for all nontrivial changes
• Incremental coverage by small tests >= 50%
• New significant features are tested by integration tests

Level 4
• Automate running of smoke tests before submitting new code
• Smoke tests should take less than 30 minutes to run
• No non-deterministic tests
• Total test coverage should be at least 40%
• Test coverage for small tests alone should be at least 25%
• All significant features are tested by integration tests

Level 5
• Add a test for each nontrivial bug fix
• Actively use available analysis tools
• Total test coverage should be at least 60%
• Test coverage for small tests alone should be at least 40%

Modeling

Personalize the model

Google Test Certified – a 5-level model developed at Google for enhancing quality

Added:
• Define testable requirements for all new stories
• Guarded commits
• Four-eyes principle
• Own metrics
• ...


Dependencies

Sort out the internal dependencies for the identified initiatives
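The incremental-coverage gates that recur through the levels above (e.g. "incremental coverage by all tests >= 50%") could be enforced by a small CI check. A sketch under the assumption that the lines changed by a commit and the lines covered by tests are already known, e.g. from a diff and a coverage report:

```python
def coverage_gate(covered_lines, changed_lines, threshold=0.5):
    """Pass only if enough of the changed lines are covered by tests."""
    if not changed_lines:
        return True  # nothing changed, nothing to gate
    covered = len(set(changed_lines) & set(covered_lines))
    return covered / len(changed_lines) >= threshold

# 3 of 4 changed lines are exercised by tests: 75% >= 50%, gate passes
assert coverage_gate(covered_lines=[10, 11, 12, 40], changed_lines=[10, 11, 12, 99])
# 1 of 4 covered: 25% < 50%, gate fails
assert not coverage_gate(covered_lines=[10], changed_lines=[10, 11, 12, 13])
```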

1 → 2 → 3 → 4 → ...

Implementation

Driver
Each team needs a person within the team to drive the initiative. This can be any team member.

Get started!
Set up workshops and go through each step, deciding on how to implement them.

Team effort!
None of this will be possible without the participation of the whole team.

Working software over new functionality

DevOps Culture

[Diagram: Team at the center, overlapping with Architect, Developer and QA – this is where the magic happens]

SEAL – Software Engineering ALchemist | All roles somewhat overlapping

Caveat

“Malum est consilium, quod mutari non potest”
Publilius Syrus (85-43 BC)

It’s a bad plan if it admits no changes

Thank you!

Q&A

Temptare necesse est, navigare non est necesse
(To test is necessary; to sail is not necessary)

@testsson_se
Image: FamilyGuyYourself

Inspirations and sources

Besides all the bits and pieces picked up at conferences, meetups, in tweets and various other places (too many and too scattered to list) that have gone into this presentation, there are some that stand out.

Publications
• World Quality Report 17/18, 9th edition
• How Google Tests Software by James A. Whittaker, Jason Arbon, Jeff Carollo
• Continuous Delivery: Reliable Software Releases Through Build, Test, and Deployment Automation by Jez Humble
• A Practical Guide to Testing in DevOps by Katrina Clokie
• Blog post “Continuous Testing in DevOps…” by Dan Ashby

Talks & Discussions
• Anna Sundin Almén, AddQ
• Jakob Alander, Spotify
• Henrik Andersson, Betsson Technologies AB
• Sigurdur Birgisson, Avensia
• Richard Bradshaw, Ministry of Testing
• Angie Jones, Twitter