Business Value of CI, CD, & DevOpsSec: Scaling Up to Billion-User Global Systems of Systems Using END-TO-END AUTOMATION & CONTAINERIZED DOCKER UBUNTU IMAGES

Dr. David F. Rico, PMP, CSEP, FCP, FCT, ACP, CSM, SAFe
Twitter: @dr_david_f_rico
Website: http://www.davidfrico.com
LinkedIn: http://www.linkedin.com/in/davidfrico
Agile Capabilities: http://davidfrico.com/rico-capability-agile.pdf
Agile Resources: http://www.davidfrico.com/daves-agile-resources.htm
Agile Cheat Sheet: http://davidfrico.com/key-agile-theories-ideas-and-principles.pdf
Dave’s NEW Business Agility Video: https://www.youtube.com/watch?v=-wTXqN-OBzA
DoD Fighter Jets vs. Amazon Web Services: http://davidfrico.com/dod-agile-principles.pdf

Author Background

 Gov’t contractor with 33+ years of IT experience
 B.S. Comp. Sci., M.S. Soft. Eng., & D.M. Info. Sys.
 Large gov’t projects in U.S., Far/Mid-East, & Europe
 Career systems & engineering methodologist
 Lean-Agile, Six Sigma, CMMI, ISO 9001, DoD 5000
 NASA, USAF, Navy, Army, DISA, & DARPA projects
 Published seven books & numerous journal articles
 Int’l keynote speaker, 160 talks to 13,000+ people
 Specializes in metrics, models, & cost engineering
 Cloud Computing, SOA, Web Services, FOSS, etc.
 Adjunct at six Washington, DC-area universities

Agile Testing—Global Scaling

 Most of world’s population connected to Internet
 Systems must support billions of simultaneous users
 New approaches are needed to scale to global market



Kemp, S. (2016). Digital in 2016: We are social's compendium of global digital, social, and mobile data, trends, and statistics. New York, NY: We Are Social, Inc.

Agile Testing—What is it?

 Test-ing (tĕst′ĭng) An early, iterative, and automated V&V of customer requirements; incremental testing
 A testing approach embracing principles & values of lean thinking, product development flow, & agile methods
 Early, collaborative, and automated form of incremental development, integration, system, & operational testing
 Testing method that supports collaboration, teamwork, iterative development, & responding to change
 Multi-tiered automated framework for TDD, CI, BDD, CD, & DevOps
 Maximizes BUSINESS VALUE of organizations, portfolios, & projects by enabling buyers-suppliers to scale globally

Crispin, L., & Gregory, J. (2009). Agile testing: A practical guide for testers and agile teams. Boston, MA: Addison-Wesley.
Crispin, L., & Gregory, J. (2015). More agile testing: Learning journeys for the whole team. Boston, MA: Addison-Wesley.

Agile Testing—How it works?

 Agile requirements implemented in slices vs. layers
 User needs with higher business value are done first
 Reduces cost & risk while increasing business success

[Diagram: system layers (GUI, APIs, Applications, Middleware, Operating System, Computer, Network) built in vertical slices (Agile) vs. horizontal layers (Traditional)]
Agile (slices 1, 2, 3): Faster · Early ROI · Lower Costs · Fewer Defects · Manageable Risk · Better Performance · Smaller Attack Surface
Traditional (layers): Late · No Value · Cost Overruns · Very Poor Quality · Uncontrollable Risk · Slowest Performance · More Security Incidents

Agile MINIMIZES the Seven Wastes (1. Rework, 2. Motion, 3. Waiting, 4. Inventory, 5. Transportation, 6. Overprocessing, 7. Overproduction) and MAXIMIZES: JIT, just-enough architecture · Early, in-process system V&V · Fast continuous improvement · Scalable to systems of systems · Maximizes successful outcomes
Traditional: Myth of perfect architecture · Late big-bang integration tests · Year-long improvement cycles · Breaks down on large projects · Undermines business success

Shore, J. (2011). Evolutionary design illustrated. Norwegian Developers Conference, Oslo, Norway.

Agile Testing—Workflow

 User needs designed & developed one-at-a-time
 Changes automatically detected, built, and tested
 System fully tested and deployed as changes occur

[Callouts: Early, Automated, Fast, Thousands of Tests · Lean, Waste-Free, Low WIP, Efficient, & Repeatable · Continuously Executed · No Deadlocked Test Queues]

[Diagram: Developers A, B, and C commit changes to version control; a build integration server watches for changes, uses build scripts, and runs builds that feed a database with build status, analysis, testing, reporting, documentation, and deployment results]
[Callouts: Constant Readiness State & CM Control · No More Late Big Bang Integration · Rapidly & Successfully Dev. Complex Systems]

Humble, J., & Farley, D. (2011). Continuous delivery. Boston, MA: Pearson Education.
Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration. Boston, MA: Addison-Wesley.

Agile Testing—Workflow—Results

 Late big bang integration increases WIP backlog
 Agile testing early and often reduces WIP backlog
 Improves workflow and reduces WIP & lead times

Traditional vs. Agile Cumulative Flow

[Charts: work (stories, points, tasks) or effort vs. time unit (roadmap, release, iteration, month, week, day, hour); the traditional cumulative-flow chart shows a wide WIP band, the agile chart a narrow one]
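The cumulative-flow relationship between WIP and lead time can be sketched with Little's Law (a standard lean-flow result, not from the slides; the numbers below are illustrative assumptions):

```python
# Little's Law ties the cumulative-flow quantities together:
# average lead time = average work in process (WIP) / average throughput.

def lead_time(avg_wip: float, avg_throughput: float) -> float:
    """Average lead time (time units) = WIP (items) / throughput (items per time unit)."""
    return avg_wip / avg_throughput

# Traditional late big-bang integration lets a large WIP backlog queue up.
traditional = lead_time(avg_wip=120, avg_throughput=10)  # 12 time units

# Agile testing keeps WIP low at the same throughput, so lead time drops.
agile = lead_time(avg_wip=20, avg_throughput=10)  # 2 time units
```

At equal throughput, cutting WIP sixfold cuts lead time sixfold, which is the gap the two cumulative-flow charts depict.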

Anderson, D. J. (2004). Agile management for software engineering. Upper Saddle River, NJ: Pearson Education.
Anderson, D. J. (2010). Kanban: Successful evolutionary change for your technology business. Sequim, WA: Blue Hole Press.

Agile Testing—MMF, MVP, MVA, etc.

 Methods to “scope” project, product, or system
 “Key” is smallest possible scope with highest value
 Reduces cost, risk, time, failure, & tech. obsolescence

MINIMUM MARKETABLE FEATURE (MMF): Advantage · Difference · Revenue · Profit · Savings · Brand · Loyalty
MINIMUM VIABLE PRODUCT (MVP): Goal · Process · Features · Priorities · Story Map · Architecture
STORY MAP OR IMPACT MAP (SM or IM): Goal · Actors · Impacts · Deliverables · Measures · Milestones
VISION STATEMENT (VS): For · Who · The · Is a · That · Unlike · Ours
MICROSERVICE (MS): Purpose · Automated · Unique · Independent · Resilient · Ecosystem · Consumer

 INCREASES TESTABILITY, QUALITY, RELIABILITY, SECURITY, MORALE, MAINTAINABILITY, & SUCCESS

Denne, M., & Cleland-Huang, J. (2004). Software by numbers: Low-risk, high-return development. Santa Clara, CA: Sun Microsystems.
Ries, E. (2011). The lean startup: How today's entrepreneurs use continuous innovation. New York, NY: Crown Publishing.
Patton, J. (2014). User story mapping: Discover the whole story, build the right product. Sebastopol, CA: O'Reilly Media.
Layton, M. C., & Maurer, R. (2011). Agile for dummies. Hoboken, NJ: Wiley Publishing.
Krause, L. (2014). Microservices: Patterns and applications. Paris, France: Lucas Krause.

Agile Testing—Microservices

 Lightweight, fast, disposable virtual environments
 Set of isolated processes running on shared kernel
 Efficient way for building, delivering, & running apps

Minimal - Typically single-process entities
Declarative - Built from layered Docker images
Immutable - Do exactly same thing from run to kill
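The "minimal, single-process" container idea above can be sketched as a just-enough microservice: one self-contained process exposing a language-agnostic HTTP API. This is an illustrative stand-in, not from the slides; the `/health` endpoint and port are assumptions. In a Docker image, this would be the single process the container runs.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """One tiny capability: report service health as JSON over HTTP."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "up"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve(port: int = 8080) -> HTTPServer:
    """Build the server; the container entrypoint would call serve_forever()."""
    return HTTPServer(("127.0.0.1", port), HealthHandler)
```

Because the process is self-contained and talks only over HTTP, it can be built, delivered, and killed/restarted independently, which is the "immutable, disposable" property the slide describes.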

[Diagram: Monolithic Applications vs. Just-Enough Applications vs. Containerized Apps; monolithic OS’s have 30-50 unreported CVEs]
• Small autonomous services that work together
• Self-contained process that provides a unique capability
• Loosely coupled service-oriented architecture with bounded contexts
• Small independent processes communicating with each other using language-agnostic APIs
• Fine-grained independent services running in their own processes that are developed and deployed independently
• Suite of services running in their own process, exposing APIs, and doing one thing well (independently developed and deployable)
• Single app as a suite of small services, each running in its own process and communicating with lightweight mechanisms (HTTP APIs)

Krause, L. (2014). Microservices: Patterns and applications. Paris, France: Lucas Krause.

Agile Testing—Various Models

 Numerous models of lean-agile testing emerging
 Based on principles of lean & agile one-piece flow
 Include software, hardware, system, & portfolio testing

TDD (2003) · CI (2006) · BDD (2008) · CD (2011) · DEVOPS (2012) · DEVOPSSEC (2014)

 User Story  Building  Analyze Feature  Packaging  Sys Admin  Sec. Engineer.

 Acc Criteria  Database  Acc Criteria  Acceptance  Config. Mgt.  Sec. Containers

 Dev Unit Test  Inspections  Dev Feat. Test  Load Test  Host Builds  Sec. Evaluation

 Run Unit Test  Testing  Run Feat. Test  Performance  Virtualization  Sec. Deploy.

 Write SW Unit  Feedback  Develop Feature  Pre-Production  Containerization  Runtime Prot.

 Re-Run Unit Test  Documentation  Re-Run Feature  Certification  Deployment  Sec. Monitoring

 Refactor Unit  Deployment  Refactor Feat.  Deployment  Monitor & Supp  Response Mgt.

Beck, K. (2003). Test-driven development: By example. Boston, MA: Addison-Wesley.
Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration. Boston, MA: Addison-Wesley.
Barker, K., & Humphries, C. (2008). Foundations of rspec: Behavior driven development with ruby and rails. New York, NY: Apress.
Humble, J., & Farley, D. (2011). Continuous delivery. Boston, MA: Pearson Education.
Huttermann, M. (2012). Devops for developers: Integrate development and operations the agile way. New York, NY: Apress.
Bird, J. (2016). Devopssec: Delivering secure software through continuous delivery. Sebastopol, CA: O'Reilly Media.

BASIC—Test Driven Development

 Term coined by Kent Beck in 2003
 Consists of writing all tests before design
 Ensures all components are verified and validated
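The test-first cycle described above can be sketched in a few lines; the `shipping_cost` example and its rates are hypothetical, purely for illustration:

```python
# TDD cycle: write the unit test first, watch it fail (red), write just
# enough code to make it pass (green), then refactor and re-run.

def shipping_cost(weight_kg: float) -> float:
    """Production code, written only AFTER test_shipping_cost existed
    (and failed): flat rate of 5.0 plus 2.0 per kilogram."""
    return 5.0 + 2.0 * weight_kg

def test_shipping_cost():
    # Written first, before shipping_cost; it defines the expected behavior.
    assert shipping_cost(0) == 5.0
    assert shipping_cost(3) == 11.0

test_shipping_cost()  # re-run automatically after every change
```

The same red-green-refactor loop applies at unit scope here and at feature scope under BDD on the next slide.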

Beck, K. (2003). Test-driven development: By example. Boston, MA: Addison-Wesley.

BASIC—Behavior Driven Develop.

 Term coined by Dan North in 2006
 Consists of writing feature tests before design
 Ensures all system features are verified and validated
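A feature test in the BDD style phrases acceptance criteria as Given/When/Then and automates them before the feature is built. The shopping-cart feature and class names below are illustrative assumptions, not from the slides:

```python
class Cart:
    """Minimal feature under test: a shopping cart with a running total."""

    def __init__(self):
        self.items = []

    def add(self, name: str, price: float) -> None:
        self.items.append((name, price))

    def total(self) -> float:
        return sum(price for _, price in self.items)

def test_feature_cart_total():
    # Given an empty shopping cart
    cart = Cart()
    # When the customer adds two items
    cart.add("book", 12.50)
    cart.add("pen", 1.50)
    # Then the cart total is the sum of their prices
    assert cart.total() == 14.0

test_feature_cart_total()
```

Tools such as Cucumber or RSpec express the Given/When/Then clauses in plain language; the executable structure is the same.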

Smart, J. F. (2014). BDD in action: Behavior-driven development for the whole software lifecycle. Shelter Island, NY: Manning Publications.

ADVANCED—Continuous Integration

 Term coined by Martin Fowler in 1998
 Process of automated build/regression testing
 Evaluates impact of changes against entire system

 ALL DEVELOPERS RUN PRIVATE BUILDS
 DEVELOPERS COMMIT CODE TO VERSION CONTROL
 INTEGRATION BUILDS OCCUR SEVERAL TIMES PER DAY
 100% OF SYSTEM TESTS MUST PASS FOR EVERY BUILD
 A SHIPPABLE PRODUCT RESULTS FROM EVERY BUILD
 FIXING BROKEN BUILDS IS OF THE HIGHEST PRIORITY
 REPORTS AUTOMATICALLY GENERATED & REVIEWED
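The commit, integration build, 100%-pass, and automated-report rules above can be sketched as a toy loop (an illustrative stand-in with assumed names, not a real CI server):

```python
from typing import Callable, Dict, List

def integration_build(commit: str, tests: List[Callable[[], bool]]) -> Dict:
    """Run every system test against the committed code; 100% must pass
    for the build to count as shippable."""
    results = [test() for test in tests]
    passed = all(results)
    return {
        "commit": commit,
        "tests_run": len(results),
        "passed": passed,
        # Fixing a broken build is the highest priority, so surface it loudly.
        "status": "SHIPPABLE" if passed else "BROKEN - FIX IMMEDIATELY",
    }

# Hypothetical test suite: each callable returns True on success.
suite = [lambda: 1 + 1 == 2, lambda: "ci".upper() == "CI"]
report = integration_build("abc123", suite)  # automated report for review
```

A real server (Jenkins, and similar tools named later in the deck) adds the "watch version control, trigger several builds per day" part around this same build-test-report core.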

Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration: Improving software quality and reducing risk. Boston, MA: Addison-Wesley.

ENTERPRISE—Continuous Delivery

 Created by Jez Humble of ThoughtWorks in 2011
 Includes CM, build, testing, integration, release, etc.
 Goal is one-touch automation of deployment pipeline

[Pipeline: Source Code Control → Build Automation → Test Automation → Continuous Integration → Release Automation → Continuous Delivery]
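The staged deployment pipeline (source control through release) can be sketched as a chain where each stage must pass before the next runs; the stage bodies below are stand-in lambdas, not real build steps:

```python
def run_pipeline(stages):
    """Run stages in order; stop at the first failure (no partial releases)."""
    completed = []
    for name, stage in stages:
        if not stage():
            return {"delivered": False, "failed_at": name, "completed": completed}
        completed.append(name)
    return {"delivered": True, "failed_at": None, "completed": completed}

stages = [
    ("source_code_control",    lambda: True),  # fetch the committed change
    ("build_automation",       lambda: True),  # compile & package
    ("test_automation",        lambda: True),  # unit/system tests
    ("continuous_integration", lambda: True),  # full integration build
    ("release_automation",     lambda: True),  # stage & certify the release
]
result = run_pipeline(stages)
```

"One-touch" means a single trigger (one commit, one button) runs this whole chain; a failure anywhere halts delivery rather than shipping a partially verified build.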

Humble, J., & Farley, D. (2011). Continuous delivery. Boston, MA: Pearson Education.
Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration. Boston, MA: Addison-Wesley.
Ohara, D. (2012). Continuous delivery and the world of . San Francisco, CA: GigaOM Pro.

GLOBAL—Development Operations

 Created by Patrick Debois of Jedi BVBA in 2007
 Collaboration of developers & infrastructure people
 Goal is to automate deployment to end-user devices

[Diagram: collaboration between development and operations]

Bass, L., Weber, I., & Zhu, L. (2015). Devops: A software architect's perspective. Old Tappan, NJ: Pearson Education.
Gruver, G., & Mouser, T. (2015). Leading the transformation: Applying agile and devops at scale. Portland, OR: IT Revolution Press.
Humble, J., Molesky, J., & O'Reilly, B. (2015). Lean enterprise: How high performance organizations innovate at scale. Sebastopol, CA: O'Reilly Media.

GLOBAL—Development Ops Sec

 DevOpsSec coined by Shannon Lietz in 2014
 Rugged devops, devsecops, secdevops, devopssec
 Microservices, security engineering & operations keys

Secure Microservices: Docker App · Docker Bins · Docker Files · Docker Images · Docker Scanning · Docker Registry · Docker Host · Docker Hub · Docker Monitoring
Secure Engineering: Security Champions · Security Planning · Security Training · Security Requirements · Security Architecture · Security Analysis · Security Testing · Security Review · Security Response

Secure Operations: Activity Logging · Patch Management · Vulnerability Mgt. · Event Monitoring · User Access Control · Response Mgt. · Configuration Mgt. · Privilege Management · Performance Mgt.

Bird, J. (2016). Devopssec: Delivering secure software through continuous delivery. Sebastopol, CA: O'Reilly Media.

Agile Testing—Basic Metrics

[Chart: goals are to increase test coverage, increase automation, increase integrations, and decrease defects]

Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration: Improving software quality and reducing risk. Boston, MA: Addison-Wesley.

Agile Testing—CD Statistics

 Hewlett-Packard is a major user of CI, CD, & DevOps
 400 engineers developed 10 million LOC in 4 years
 Major gains in testing, deployment, & innovation

TYPE / METRIC: MANUAL → DEVOPS (MAJOR GAINS)
CYCLE TIME IMPROVEMENTS: Build Time 40 Hours → 3 Hours (13x) · No. Builds 1-2 per Day → 10-15 per Day (8x) · Feedback 1 per Day → 100 per Day (100x) · Regression Testing 240 Hours → 24 Hours (10x)
DEVELOPMENT COST EFFORT DISTRIBUTION: Integration 10% → 2% (5x) · Planning 20% → 5% (4x) · Porting 25% → 15% (2x) · Support 25% → 5% (5x) · Testing 15% → 5% (3x) · Innovation 5% → 40% (8x)

Gruver, G., Young, M., & Fulghum, P. (2013). A practical approach to large-scale agile development. Upper Saddle River, NJ: Pearson Education.

Agile Testing—DevOps Statistics

 Assembla went from 2 to 45 releases every month
 15K Google developers run 120 million tests per day
 30K+ Amazon developers deliver 136K releases a day

[Chart callouts: 3,645x faster than a U.S. DoD IT project; 62x faster than a U.S. DoD IT project]

Singleton, A. (2014). Unblock: A guide to the new continuous agile. Needham, MA: Assembla, Inc.

Agile Testing—DevOps Example

 Simple example of a DevOps reference architecture
 Includes CM, continuous integration, & deployment
 Code automatically built/tested/deployed to users

Morris, B., & Cassatt, C. (2015). Devops for the rest of us. Proceedings of the Agile DC Conference, Washington, DC, USA.
Weeks, D. E. (2014). Devops and continuous delivery reference architectures (volume 1 & 2). Fulton, MD: Sonatype.

Agile Testing—DevOps Ecosystem

 Numerous tools to automate DevOps pipeline
 People can piece together toolset along with hubs
 Tools include version control, testing, & deployment

Juengst, D. (2015). Deliver better software faster: With the cloudbees jenkins platform. San Francisco, CA: CloudBees.
Weeks, D. E. (2014). Devops and continuous delivery reference architectures (volume 1 & 2). Fulton, MD: Sonatype.

Agile Testing—DevOps Adoption

 DevOps adoption growing fast in spite of slow start
 35% using, 27% thinking about it, & 38% in the dark
 DevOps a global industry-wide extinction-level event

Brown, A. (2016). Devops and the need for speed, quality, and security: Do organizations really have to pick two out of three. Portland, OR: Puppet Labs.

Agile Testing—5 Keys to Success

 Everything begins with lean & agile principles
 Next step is smaller portfolio & simpler designs
 Final step is modular interfaces & E2E automation

Kim, G., Debois, P., Willis, J., & Humble, J. (2016). The devops handbook: How to create world-class agility, reliability, and security in technology organizations. Portland, OR: IT Revolution Press.

Agile Testing—Bottom Line?

 Eliminates big-bang integration in the 11th hour
 Creates a repeatable and reliable testing process
 Evaluates system-wide changes throughout project

Traditional Testing: Late defect discovery · Low quality software · Poor project visibility · Lack of deployability · Late big-bang integration · Testing is a bottleneck · Poor customer satisfaction · Outright project failure
Agile Testing: Dramatically reduces risks · Automates manual processes · Instant verification & validation · High project visibility · Greater confidence and morale · Incremental business value · 24x7 deployability to users · High quality and reliability

Maeda, M. K. (2009). Agile testing: Early, often, and smart. Arlington, MA: Cutter Consortium.

Agile Testing—Foundational Books

 Thousands of textbooks on agile methods
 Include requirements, design, coding, test, etc.
 Continuous Integration, Delivery, & DevOps best

Beck, K. (2003). Test-driven development: By example. Boston, MA: Addison-Wesley.
Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration. Boston, MA: Addison-Wesley.
Smart, J. F. (2014). BDD in action: Behavior-driven development for the whole software lifecycle. Shelter Island, NY: Manning Publications.
Humble, J., & Farley, D. (2011). Continuous delivery. Boston, MA: Pearson Education.
Kim, G., Debois, P., Willis, J., & Humble, J. (2016). The devops handbook: How to create world-class agility, reliability, and security in technology organizations. Portland, OR: IT Revolution Press.

Dave’s PROFESSIONAL CAPABILITIES

[Diagram: capability map spanning Strategy & Organization Change, Acquisition & Contracting, Cost Estimates & Scheduling, Systems Engineering, BPR, IDEF0, Innovation & DoDAF Management, CMMI & ISO 9001 Quality Mgt., Evolutionary Development (PSP, TSP), Technical Research & Code Reviews, Software Project Mgt. Methods, Lean, Kanban, & Six Sigma, Metrics, Models, & SPC, Workflow Automation, Big Data, Cloud, NoSQL, Modeling & Simulations, DoD 5000, TRA, & SRA, and Statistics, CFA, EFA, & SEM]
Valuation: Cost-Benefit Analysis, B/CR, ROI, NPV, BEP, Real Options, etc.
Lean-Agile: Scrum, SAFe, Continuous Integration & Delivery, DevOps, etc.

STRENGTHS: Data Mining · Gathering & Reporting Performance Data · Strategic Planning · Executive & Management Briefs · Brownbags & Webinars · White Papers · Tiger-Teams · Short-Fuse Tasking · Audits & Reviews · Etc.

PMP, CSEP, FCP, FCT, ACP, CSM, & SAFe; 33 years in IT industry
● Data mining. Metrics, benchmarks, & performance.
● Simplification. Refactoring, refinement, & streamlining.
● Assessments. Audits, reviews, appraisals, & risk analysis.
● Coaching. Diagnosing & restarting stalled projects.
● Business cases. Cost, benefit, & return-on-investment (ROI) analysis.
● Communications. Executive summaries, white papers, & lightning talks.
● Strategy & tactics. Program, project, task, & activity scoping, charters, & plans.

Backup Slides

Software in U.S. DoD Systems

 No. of software-intensive systems is growing
 80% of US DoD functions performed in software
 Major driver of cost, schedule, & tech. performance

Kennedy, M. P., & Umphress, D. A. (2011). An agile process: The missing link. Crosstalk, 24(3), 16-20.

Requirements Defects & Waste

 Requirements defects are #1 reason projects fail
 Traditional projects specify too many requirements
 More than 65% of requirements are never used at all

[Pie charts: Defects by source (Requirements 47%, Design 28%, Implementation 18%, Other 7%) and Waste by feature usage (Never 45%, Rarely 19%, Sometimes 16%, Often 13%, Always 7%)]

Sheldon, F. T. et al. (1992). Reliability measurement: From theory to practice. IEEE Software, 9(4), 13-20.
Johnson, J. (2002). ROI: It's your job. 2002 Conference, Alghero, Sardinia, Italy.

Large Traditional Projects

 Big projects result in poor quality and scope changes
 Productivity declines with long queues/wait times
 Large projects are unsuccessful or canceled

[Charts: as project size grows from 0 to 400, defects rise (to 16.00) and scope change rises (to 40%), while productivity falls (from 5.00) and success falls (from 60% toward 0%)]

Jones, C. (1991). Applied software measurement: Assuring productivity and quality. New York, NY: McGraw-Hill.

Global Project Failures

 Challenged and failed projects hover at 67%
 Big projects fail more often, which is 5% to 10%
 Of $1.7T spent on IT projects, over $858B were lost

Year: Successful / Challenged / Failed
2015: 29% / 52% / 19%
2014: 28% / 55% / 17%
2012: 27% / 56% / 17%
2010: 33% / 41% / 26%
2008: 32% / 44% / 24%
2006: 35% / 46% / 19%
2004: 29% / 53% / 18%
2002: 34% / 51% / 15%
2000: 28% / 49% / 23%

[Chart: IT expenditures vs. failed investments, 2002-2010, in trillions of US dollars ($0.0 to $1.8)]

Standish Group. (2015). Chaos summary 2015. Boston, MA: Author.
Sessions, R. (2009). The IT complexity crisis: Danger and opportunity. Houston, TX: Object Watch.

Agile Testing—Road/Story Map, Arch

 Organize needs into capabilities, features, and stories
 Prioritize features, group releases, and initiate sprints
 Develop minimum set of features with highest value (for example, assume 25 user stories per feature, 175 user stories per capability, and 1,225 user stories total)

[Story map: Capabilities #1-#7 across the top, each decomposed into seven features (Features 1-49); features are grouped into releases, the minimum high-value feature set forms the MMF or MVP, and the capabilities are developed in Sprints 1-7]

[Diagram: evolution from disparate legacy MS SQL Server stovepipes over legacy system databases (and data models), through ETL to inter-departmental Linux blade/Oracle/Java/WebSphere servers, toward an evolving unified/integrated enterprise on leased DWA/HPC/cloud services]

Bente, S., Bombosch, U., & Langade, S. (2012). Collaborative enterprise architecture: Enriching EA with lean, agile, and enterprise 2.0 practices. Waltham, MA: Elsevier.

Agile Testing—Workflow—ScrumBan

 Created by Corey Ladas of Modus Cooperandi (2008)
 Hybrid of Agile (Scrum) and Lean (Kanban) methods
 Scrum with one-piece workflow vs. sprints (batches)

Ladas, C. (2008). Scrumban: Essays on kanban systems for lean software development. Seattle, WA: Modus Cooperandi.
Reddy, A. (2016). The scrumban revolution: Getting the most out of agile, scrum, and lean-kanban. New York, NY: Addison-Wesley.

PRACTICES—Behavior Driven Dev.

 Agile BDD consists of seven broad practices
 Document test criteria, tests, syst. features, etc.
 Include refactoring, verification, optimization, etc.

Practice / Description

Feature Read feature, analyze meaning, ask questions, and clarify understanding

Acc Criteria Identify, verify, and document acceptance criteria for each feature

Dev Test Design, develop, code, and verify automated feature test for feature

Run Test Run automated feature test to verify that it fails the first time (sanity check)

Dev Feature Design, develop, code, and verify the feature software to satisfy feature

Rerun Test Rerun automated feature test to see if code satisfies automated feature test

Refac Feature Refine, reduce, and simplify code to remove waste and optimize performance

Smart, J. F. (2014). BDD in action: Behavior-driven development for the whole software lifecycle. Shelter Island, NY: Manning Publications.

PRACTICES—Continuous Integration

 Agile CI consists of seven broad practices
 Automated build, database, inspection, tests, etc.
 Include reporting, documentation, deployment, etc.

Practice / Description

Building Frequently assembling products and services to ensure delivery readiness

Database Frequently generating/analyzing database schemas, queries, and forms

Inspections Frequently performing automated static analysis of product/service quality

Testing Frequently performing automated dynamic product and service evaluation

Feedback Frequently generating automated status reports/messages for all stakeholders

Documentation Frequently performing automated technical/customer document generation

Deployment Frequently performing automated delivery of products/services to end users

Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration: Improving software quality and reducing risk. Boston, MA: Addison-Wesley.
Humble, J., & Farley, D. (2011). Continuous delivery. Boston, MA: Pearson Education.

PRACTICES—Continuous Delivery

 Agile CD consists of seven broad practices
 Automated acceptance, load, performance, etc.
 Include packaging, pre-production test, C&A, etc.

Practice / Description

Packaging Frequently generating system images for pre-production testing & checkout

Acceptance Frequently performing automated system & user acceptance testing

Load Test Frequently performing automated system load, stress, & capacity testing

Performance Frequently performing automated system user & technical performance testing

Pre-Production Frequently performing automated pre-production tests prior to final deployment

Certification Frequently performing automated system certification & accreditation tests

Deployment Frequently generating product images for pre-deployment testing & checkout

Mukherjee, J. (2015). Continuous delivery pipeline: Where does it choke. Charleston, SC: CreateSpace.
Swartout, P. (2014). Continuous delivery and devops: A quickstart guide. Birmingham, UK: Packt Publishing.

PRACTICES—Development Operations

 Agile DevOps consists of seven broad practices
 Automated sys admin, CM, building, monitor, etc.
 Include virtualization, containerize, deployment, etc.

Practice / Description

Sys Admin Frequently performing automated system administration tasks, i.e., scripting

Config. Mgt. Frequently performing automated infrastructure config. mgt./version control

Host Builds Frequently performing automated system and server host builds and config.

Virtualization Frequently performing automated system, server, & net virtualization services

Containerize Frequently performing automated software and Microservices containerization

Deployment Frequently generating final end-user system & software images for distribution

Monitor & Supp Frequently performing automated metrics collection & deployment monitoring

Duffy, M. (2015). Devops automation cookbook: Over 120 recipes covering key automation techniques. Birmingham, UK: Packt Publishing.
Farcic, V. (2016). The devops 2.0 toolkit: Automating the continuous deployment pipelines with containerized microservices. Victoria, CA: LeanPub.

PRACTICES—Development Ops Sec

 DevOpsSec consists of seven broad practices
 Automated secure build, analysis, & deployment
 Includes containerization, engineering & operations

Practice / Description

Engineering Frequently performing “baked-in” lean and agile security engineering practices

Containers Frequently performing automated microservices containerization practices

Evaluation Frequently performing automated static and dynamic vulnerability analysis

Deployment Frequently performing automated digitally signed security deployment practices

Protection Frequently performing automated real-time self-security protection practices

Monitoring Frequently performing automated real-time security monitoring practices

Responses Frequently performing automated trigger-based rollback response practices
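The "digitally signed security deployment" practice above can be sketched as verifying an artifact's signature before it ships, so tampered builds are rejected. HMAC-SHA256 with a shared key stands in for real public-key signing; all names and values here are illustrative assumptions:

```python
import hashlib
import hmac

def sign_artifact(artifact: bytes, key: bytes) -> str:
    """Produce a signature for a build artifact (e.g., a container image)."""
    return hmac.new(key, artifact, hashlib.sha256).hexdigest()

def verify_before_deploy(artifact: bytes, signature: str, key: bytes) -> bool:
    """Deployment gate: recompute and compare in constant time."""
    expected = sign_artifact(artifact, key)
    return hmac.compare_digest(expected, signature)

key = b"build-server-secret"    # assumed shared secret
artifact = b"app-image-v1.2.3"  # stand-in for a container image
sig = sign_artifact(artifact, key)

assert verify_before_deploy(artifact, sig, key)         # clean image deploys
assert not verify_before_deploy(b"tampered", sig, key)  # tampered image blocked
```

In a production pipeline this gate sits between release automation and deployment, paired with the monitoring and trigger-based rollback practices in the table above.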

Bird, J. (2016). Devopssec: Delivering secure software through continuous delivery. Sebastopol, CA: O'Reilly Media.

Agile Testing—CI Statistics

 Fewer integrations result in higher bug counts
 Frequent, early integrations eliminate most defects
 Goal is to have as many early integrations as possible

[Chart: defects vs. number of integrations; more and earlier integrations yield fewer defects, while few and late integrations yield more defects]



Lacoste, F. J. (2009). Killing the gatekeeper: Introducing a continuous integration system. Proceedings of the Agile 2009 Conference, Chicago, Illinois, USA, 387-392.

Agile Testing—DevOps CoQ

 Agile testing is orders-of-magnitude more efficient
 Based on millions of automated tests run in seconds
 One-touch auto-delivery to billions of global end-users

Activity: Defects / CoQ (hours per defect) / Economics / Hours / ROI
Development Operations: 100 / 0.001 / 100 Defects x 70% Efficiency x 0.001 Hours / 0.070 / 72,900%
Continuous Delivery: 30 / 0.01 / 30 Defects x 70% Efficiency x 0.01 Hours / 0.210 / 24,300%
Continuous Integration: 9 / 0.1 / 9 Defects x 70% Efficiency x 0.1 Hours / 0.630 / 8,100%
Software Inspections: 2.7 / 1 / 2.7 Defects x 70% Efficiency x 1 Hours / 1.890 / 2,700%
"Traditional" Testing: 0.81 / 10 / 0.81 Defects x 70% Efficiency x 10 Hours / 5.670 / 900%
Manual Debugging: 0.243 / 100 / 0.243 Defects x 70% Efficiency x 100 Hours / 17.010 / 300%
Operations & Maintenance: 0.0729 / 1,000 / 0.0729 Defects x 70% Efficiency x 1,000 Hours / 51.030 / n/a
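The table's arithmetic can be reproduced directly as a sketch of the phase-based defect-removal model: 70% of the defects entering each phase are removed at that phase's hours-per-defect cost, and the remaining 30% escape to the next, more expensive phase:

```python
EFFICIENCY = 0.70  # fraction of incoming defects removed at each phase

phases = [  # (activity, hours to remove one defect at this phase)
    ("Development Operations",   0.001),
    ("Continuous Delivery",      0.01),
    ("Continuous Integration",   0.1),
    ("Software Inspections",     1),
    ('"Traditional" Testing',    10),
    ("Manual Debugging",         100),
    ("Operations & Maintenance", 1000),
]

def cost_of_quality(defects: float):
    """Return (activity, defects entering, removal hours) per phase."""
    rows = []
    for activity, hours_per_defect in phases:
        hours = defects * EFFICIENCY * hours_per_defect
        rows.append((activity, round(defects, 4), round(hours, 3)))
        defects *= 1 - EFFICIENCY  # 30% of defects escape to the next phase
    return rows

rows = cost_of_quality(100)
# First row: 100 defects x 70% x 0.001 hours = 0.07 hours, matching the table
```

The economics come from the cost column growing tenfold per phase: removing a defect in development operations costs 0.001 hours, while the same defect escaping all the way to operations & maintenance costs 1,000 hours.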

[Callout: Under 4 Minutes]

Rico, D. F. (2016). Devops cost of quality (CoQ): Phase-based defect removal model. Retrieved May 10, 2016, from http://davidfrico.com

Agile Testing—DevOps Security

 Many tools emerging for DevOps application security
 Begins-ends with microservices—tiny attack surface
 Includes containers, testing, & real-time monitoring

Tesauro, M. (2016). Taking appsec to 11: Appsec pipelines, devops, and making things better. Denver, CO: SnowFROC 2016.
Weeks, D. E. (2014). Devops and continuous delivery reference architectures (volume 1 & 2). Fulton, MD: Sonatype.

Agile Testing—DevOps Periodic Tab.

XebiaLabs. (2016). Periodic table of devops tools. Retrieved April 11, 2016, from https://xebialabs.com/periodic-table-of-devops-tools.
Weeks, D. E. (2014). Devops and continuous delivery reference architectures (volume 1 & 2). Fulton, MD: Sonatype.

Agile Testing—CI Stats—Cont’d

 Most agile testing tools are “free” open source
 Build server costs no more than a commodity PC
 10x more efficient/effective than traditional testing

Grant, T. (2005). Continuous integration using cruise control. Northern Virginia Java Users Group (Novajug), Reston, Virginia, USA.
Fredrick, J. (2008). Accelerate software delivery with continuous integration and testing. Japanese Symposium on , Tokyo, Japan.

Agile Testing—vs. Traditional

 Traditional testing is a late, manual process
 Agile testing is an early and automated process
 Goal is to deliver early & often and V&V components

TRADITIONAL TESTING (Late Big Bang Integration Testing):
 Test Criteria Written After Fact
 Manual Tests Written Much Later
 Units Coded Late All at One Time
 Code Checked In Late in Project
 Code Manually Submitted to Test
 Code Manually Compiled & Built
 Tests Manually Executed Late
 Late Project Feedback & Reports
 Late Defects Freeze Projects

AGILE TESTING (Early Incremental Testing):
 Test Criteria Accompany Stories
 Automated Tests Written First
 Units Coded-Tested One at a Time
 Code is Frequently Checked In
 Code Automatically Retrieved
 Code Automatically Compiled
 Tests Automatically Executed
 Instant Feedback & Test Reports
 Code Automatically Deployed

Rico, D. F. (2012). Agile testing resources. Retrieved Sep. 9, 2012, from http://davidfrico.com/agile-testing-resources.txt
Crispin, L., & Gregory, J. (2009). Agile testing: A practical guide for testers and agile teams. Boston, MA: Addison-Wesley.
Grant, T. (2005). Continuous integration using cruise control. Northern Virginia Java Users Group (Novajug), Reston, Virginia, USA.

Agile Testing—Anti-Patterns

 Agile teams often don’t use TDD, CI, CD, & DevOps
 Implement independent test teams after Sprints done
 Sprint Waterfalling, Scrummerfalling, & Wagile result

Incorrect: Phased Testing · Separate Teams · Delayed Testing
Correct: Integrated Testing · Integrated Teams · Continuous Testing

Heusser, M. (2015). 12 years of agile testing: What do we know now. Proceedings of the Agile Gathering, Grand Rapids, Michigan, USA.

45 Agile Testing—Scaling Practices

 Agile testing slows down with very large systems
 Slow testing slows integration and increases bugs
 Agile testing can speed back up with targeted tuning

MACRO ADJUSTMENTS                        MICRO ADJUSTMENTS
- Wide Impact Tuning -                   - Focused Impact Tuning -
 In-Memory Compilation                  Add More CPUs & Memory
 Parallelize Test Runs                  Parallelize System Builds
 Pre-Install Test Libraries             Replace 3rd Party Test Libraries
 Remove Process Randomness              Reduce or Remove Test Timeouts
 Use Faster Code & Test Tools           Select Different Tests
 Incremental vs. Big Bang Tests         Refactor Code & Components
 Parallelize Build & Install            Tune Network & Software
 Tune & Optimize Build Process          Tune Database & Middleware
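The “Parallelize Test Runs” adjustment listed above can be sketched in Python; the three test functions are hypothetical stand-ins for an I/O-bound suite, and the sketch assumes the tests are independent and safe to run concurrently:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical tests; sleep() simulates I/O-bound work such as network
# calls or database fixtures, where threads overlap well.
def test_login():
    time.sleep(0.2)
    return ("test_login", "PASS")

def test_checkout():
    time.sleep(0.2)
    return ("test_checkout", "PASS")

def test_search():
    time.sleep(0.2)
    return ("test_search", "PASS")

def run_parallel(tests):
    # Run independent tests concurrently instead of one after another.
    with ThreadPoolExecutor(max_workers=len(tests)) as pool:
        return list(pool.map(lambda test: test(), tests))

start = time.perf_counter()
results = run_parallel([test_login, test_checkout, test_search])
elapsed = time.perf_counter() - start  # roughly 0.2s rather than 0.6s serially
```

Real test runners (e.g., pytest-xdist) apply the same idea across processes or whole build machines, which is where the large-system speedups come from.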

Kokko, H. (2009). Increase productivity with large scale continuous integration. Proceedings of the Agile 2009 Conference, Chicago, Illinois, USA.

46 Agile Testing—Common Barriers

 Industry very slow in adopting agile testing model
 Cost, difficulty, and territorialism are common issues
 Developers must take initiative for disciplined testing

Organizational Barriers                  Technical Barriers
· Resistance to change                   · Developers don’t want to test
· Fear of investment costs               · Infrequently committing code
· Fear of learning new skills            · Committing broken code
· Test group territorialism              · Failing to immediately fix builds
· Organizational policy conflicts        · Not writing automated tests
· Overhead of maintaining CI             · Not ensuring 100% of tests pass
· Complexity and scaling                 · Not running private builds
· Not developing a quality culture       · Resorting to traditional testing

Duvall, P., Matyas, S., & Glover, A. (2006). Continuous integration: Improving software quality and reducing risk. Boston, MA: Addison-Wesley.

47 Agile Testing—Usage Statistics

 Agile test use is low in spite of its age, i.e., 15 years
 Many do not understand its utter simplicity and power
 Failure to use agile testing undermines project success

[Chart: adoption rates of agile practices, including continuous integration (from never, to 2-3 times weekly, to daily, to 2-3 times daily), retrospectives, agile testing, refactoring, definition of done, test tools, test-driven development, CM tools, pair programming, simplicity, and technical debt; some rates as low as 13%]

Kim, D. (2013). The state of scrum: Benchmarks and guidelines. Indianapolis, IN: Scrum Alliance.

48 Agile Testing—Security Life Cycle

 Microsoft created software security life cycle in 2002
 Waterfall approach tailored for Scrum sprints in 2009
 Uses security requirements, threat modeling & security testing

SEE DETAILED SECURITY LIFE CYCLE STEPS: http://davidfrico.com/agile-security-lifecycle.txt
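The security testing emphasized above can be partly illustrated with a small Python sketch; the `sanitize` helper and its tests are hypothetical, showing only how a security requirement becomes an automated test that runs on every build:

```python
import html

# Hypothetical helper for one common security requirement: untrusted
# input must be HTML-escaped before it is rendered in a page.
def sanitize(user_input: str) -> str:
    return html.escape(user_input, quote=True)

# Security tests run automatically in CI alongside functional tests,
# so a regression that reintroduces the flaw fails the build.
def test_script_tags_are_neutralized():
    out = sanitize('<script>alert("xss")</script>')
    assert "<script>" not in out

def test_quotes_are_escaped():
    assert '"' not in sanitize('" onmouseover="evil()"')

test_script_tags_are_neutralized()
test_quotes_are_escaped()
```

Threat modeling identifies requirements like this one; encoding them as automated checks is what lets the SDL fit an iterative Scrum cadence instead of a late security review.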

Microsoft. (2011). Security development lifecycle: SDL process guidance (Version 5.1). Redmond, WA: Author.
Microsoft. (2010). Security development lifecycle: Simplified implementation of the Microsoft SDL. Redmond, WA: Author.
Microsoft. (2009). Security development lifecycle: Security development lifecycle for agile development (Version 1.0). Redmond, WA: Author.
Bidstrup, E., & Kowalczyk, E. C. (2005). Security development lifecycle: Changing the software development process to build in security from the start. Security Summit West.

49 Agile for EMBEDDED SYSTEMS

 1st-generation systems used hardwired logic
 2nd-generation systems used PROMs & FPGAs
 3rd-generation systems use APP. SW & COTS HW

[Chart axes: risk of embedded systems with more HW than SW, vs. iterations, integrations, & validations]

AGILE                      NEO-TRADITIONAL            TRADITIONAL
“Software Model”           “FPGA Model”               “Hardwired Model”
- MOST FLEXIBLE -          - MALLEABLE -              - LEAST FLEXIBLE -
● Short Lead               ● Moderate Lead            ● Long Lead
● Least Cost               ● Moderate Cost            ● Highest Cost
● Lowest Risk              ● Moderate Risk            ● Highest Risk
● 90% Software             ● 50% Hardware             ● 90% Hardware
● COTS Hardware            ● COTS Components          ● Custom Hardware
● Early, Iterative Dev.    ● Midpoint Testing         ● Linear, Staged Dev.
● Continuous V&V           ● “Some” Early V&V         ● Late Big-Bang I&T

GOAL – SHIFT FROM LATE HARDWARE TO EARLIER SOFTWARE SOLUTION
(START Competing With SW; STOP Competing With HW)

Pries, K. H., & Quigley, J. M. (2010). Scrum project management. Boca Raton, FL: CRC Press.
Pries, K. H., & Quigley, J. M. (2009). Project management of complex and embedded systems. Boca Raton, FL: Auerbach Publications.
Thomke, S. (2003). Experimentation matters: Unlocking the potential of new technologies for innovation. Boston, MA: Harvard Business School Press.

50 Key Agile SCALING POINTERS

 One must think and act small to accomplish big things
 Slow down to speed up, speed up ‘til wheels come off
 Scaling up lowers productivity, quality, & business value

 EMPOWER WORKFORCE - Allow workers to help establish enterprise business goals and objectives.

 ALIGN BUSINESS VALUE - Align and focus agile teams on delivering business value to the enterprise.

 PERFORM VISIONING - Frequently communicate portfolio, project, and team vision on a continuous basis.

 REDUCE SIZE - Reduce sizes of agile portfolios, acquisitions, products, programs, projects, and teams.

 ACT SMALL - Get large agile teams to act, behave, collaborate, communicate, and perform like small ones.

 BE SMALL - Get small projects to act, behave, and collaborate like small ones instead of trying to act larger.

 ACT COLLOCATED - Get virtual distributed teams to act, behave, communicate, and perform like collocated ones.

 USE SMALL ACQUISITION BATCHES - Organize suppliers to rapidly deliver new capabilities and quickly reprioritize.

 USE LEAN-AGILE CONTRACTS - Use collaborative contracts to share responsibility instead of adversarial legal ones.

 USE ENTERPRISE AUTOMATION - Automate everything with Continuous Integration, Continuous Delivery, & DevOps.

Rico, D. F. (2014). Dave's notes: For scaling with SAFe, DaD, LeSS, RAGE, ScrumPLoP, Enterprise Scrum, etc. Retrieved March 28, 2014, from http://davidfrico.com

51 Models of AGILE ORG. CHANGE

 Change, no matter how small or large, is difficult
 Smaller focused changes help to cross the chasm
 Validating, simplifying, & incrementalism are keys

INFLUENCER (Patterson et al., 2008)
 Make It Desirable: Create new experiences; Create new motives
 Surpass Your Limits: Perfect complex skills; Build emotional skills
 Use Peer Pressure: Recruit public figures; Recruit influential leaders; Accountable to someone
 Strength in Numbers: Utilize teamwork; Power of social capital
 Design Rewards: Use incentives wisely; Use punishment sparingly
 Change Environment: Make it easy; Make it unavoidable

DRIVE (Pink, 2009)
 Purpose: Purpose-profit equality; Business & societal benefit; Share control of profits; Culture & goal alignment; Remake society-globe; Learning over profits
 Autonomy: Self-select work tasks; Self-directed work tasks; Self-selected timelines; Self-selected teams; Self-selected implementation; Delegate implementation
 Mastery: Align tasks to abilities; Experiment & innovate; Continuously improve; Create challenging tasks; Set high expectations

SWITCH (Heath & Heath, 2010)
 Direct the Rider: Follow the bright spots; Script the critical moves; Point to the destination
 Motivate the Elephant: Find the feeling; Shrink the change; Grow your people
 Shape the Path: Tweak the environment; Build habits; Rally the herd

A-B-C (Pink, 2012)
 Attunement: Reduce your power; Take their perspective; Use strategic mimicry
 Buoyancy: Use interrogative self-talk; Optimize positivity ratios; Offer explanatory style
 Clarity: Find the right problem; Find your frames; Find an easy path

DECISIVE (Heath & Heath, 2013)
 Common Errors: Narrow framing; Confirmation bias; Short-term emotion; Overconfidence
 Widen Options: Avoid a narrow frame; Multi-track; Find out who solved it
 Test Assumptions: Consider the opposite; Zoom out & zoom in; Ooch
 Attain Distance: Overcome emotion; Gather & shift perspective
 Prepare to Be Wrong: Bookend the future; Set a tripwire; Trust the process

Patterson, K., et al. (2008). Influencer: The power to change anything. New York, NY: McGraw-Hill.
Pink, D. H. (2009). Drive: The surprising truth about what motivates us. New York, NY: Riverhead Books.
Heath, C., & Heath, D. (2010). Switch: How to change things when change is hard. New York, NY: Random House.
Pink, D. H. (2012). To sell is human: The surprising truth about moving others. New York, NY: Riverhead Books.
Heath, C., & Heath, D. (2013). Decisive: How to make better choices in life and work. New York, NY: Random House.

52 Agile vs. Traditional Success

 Traditional projects succeed at 50% industry avg.
 Traditional projects are challenged 20% more often
 Agile projects succeed 3x more and fail 3x less often

              Agile     Traditional
Success        42%          14%
Challenged     49%          57%
Failed          9%          29%

Standish Group. (2012). Chaos manifesto. Boston, MA: Author.

53 Fin. Benefits to ENTERPRISE AGILITY

 Study of 15 agile vs. non-agile Fortune 500 firms
 Based on models to measure organizational agility
 Agile firms outperform non-agile firms by up to 36%

Hoque, F., et al. (2007). Business technology convergence: The role of business technology convergence in innovation and adaptability and its effect on financial performance. Stamford, CT: BTM Corporation.

54 Nat’l Benefits to ENTERPRISE AGILITY

 U.S. gov’t agile jobs grew by 13,000% from 2006-2013
 Adoption is higher in U.S. DoD than Civilian Agencies
 GDP of countries with high adoption rates is greater

[Charts: government agile job growth, percentage vs. years, rising 13,000% from 2006 to 2013; government competitiveness vs. agility, low to high, positively correlated]

Suhy, S. (2014). Has the U.S. government moved to agile without telling anyone? Retrieved April 24, 2015, from http://agileingov.com
Porter, M. E., & Schwab, K. (2008). The global competitiveness report: 2008 to 2009. Geneva, Switzerland: World Economic Forum.

55