An NCC Group Publication

Best Practices for the Use of Static Code Analysis within a Real-World Secure Development Lifecycle

Prepared by: Jeremy Boone

© Copyright 2015 NCC Group

Contents

1 Executive Summary
2 Purpose and Motivation
3 Why SAST Often Fails
4 Methodology
5 Integration with the Secure Development Lifecycle
   5.1 Training
   5.2 Requirements
   5.3 Design
   5.4 Implementation
   5.5 Verification
   5.6 Release
   5.7 Response and Sustainment
6 Evaluating and Deploying a SAST Solution
   6.1 Programming Language Support
   6.2 Selecting Natural and Artificial Code Bases
   6.3 Development Support Systems
      6.3.1 Compiler and Build System
      6.3.2 Integrated Development Environment
      6.3.3 Continuous Integration
      6.3.4 Collaborative Peer Code Review Systems
      6.3.5 Nightly Builds
      6.3.6 Distributed Analysis
      6.3.7 Bug Trackers
      6.3.8 Revision Control System
   6.4 The Analysis Engine
      6.4.1 Defect Classes
      6.4.2 Accuracy and Precision
      6.4.3 Sensitivity
      6.4.4 False Positive Suppression
      6.4.5 Dealing with Legacy Defects
      6.4.6 Creating Custom Rulesets
      6.4.7 Analysis Speed
   6.5 Defect Triage Maturity
      6.5.1 Severity
      6.5.2 Detailed Trace and Remediation Guidance
      6.5.3 Collapse Similar Defects
      6.5.4 Support for Multiple Branches
      6.5.5 Triage Workflow and Multi-User Collaboration
      6.5.6 Isolate Users or Teams
      6.5.7 Report Generation
   6.6 Vendor Roadmap
7 Final Words

1 Executive Summary

Static application security testing (SAST) is the analysis of computer software that is performed without the need to actually execute the program. The term is usually applied to analysis performed by an automated tool, whereas human analysis is typically called security-focused code review. The primary objective of SAST is to gain an understanding of the software's behaviour, usually with the aim of uncovering security, privacy, and quality defects.
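To make this concrete, the short C program below (a hypothetical example, not taken from the paper) shows the kind of defect a typical SAST engine reports through data-flow and taint analysis, together with the remediation such a tool would usually suggest.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical login handler, used purely for illustration. A SAST
     * tool tracking tainted data would flag the strcpy() below: the
     * attacker-controlled `username` can exceed the 16-byte destination
     * buffer, a classic stack buffer overflow (CWE-121). */
    static void handle_login(const char *username)
    {
        char buf[16];
        strcpy(buf, username);   /* defect: unbounded copy of untrusted input */
        printf("Logging in: %s\n", buf);
    }

    /* The bounded copy most tools recommend as remediation. */
    static void handle_login_fixed(const char *username)
    {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", username);   /* truncates safely */
        printf("Logging in: %s\n", buf);
    }

    int main(int argc, char **argv)
    {
        if (argc > 1) {
            handle_login(argv[1]);        /* argv is untrusted input */
            handle_login_fixed(argv[1]);
        }
        return 0;
    }

Crucially, the defect is found without ever running the program: the analyser reasons over the source, following the untrusted argv value into strcpy() and comparing its possible length against the size of the destination buffer.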
In recent years, commercial SAST solutions have matured considerably, and they now offer numerous methods of integrating with various development processes and support systems: continuous integration, bug trackers, revision control, peer code review tools, and so on. However, NCC Group routinely encounters ineffective or suboptimal static analysis deployments that either fail to accommodate the requirements of a secure development lifecycle (SDLC) or impose a significant burden on development staff, leading to disengagement and patterns of misuse. These shortcomings frequently result in the SAST solution failing to serve its primary purpose: to improve software security.

In this paper we describe a methodology for evaluating and selecting the most appropriate static code analysis solution for your software organisation, as well as best practice guidance for effectively integrating that solution with your development procedures as part of a mature secure development lifecycle. Unfortunately, we must intentionally avoid recommending a specific SAST solution: in our experience there is no "one size fits all" solution, and vendors typically place DeWitt clauses [1] in their EULAs to prevent the publication of benchmark data.

2 Purpose and Motivation

The primary motivation for adopting static code analysis within a software development organisation is the ability to deliver higher-quality, more secure products at lower cost. Studies indicate that automated static code analysis can detect up to 60% of post-release failures [2]. This metric becomes significant when you consider that a single failure can lead to widespread and costly product recalls. For example:

- In 2002, the National Institute of Standards and Technology (NIST) estimated the annual cost of software defects to the American economy at approximately $59.5 billion [3].
- In 2015, Charlie Miller and Chris Valasek discovered security defects that led to the recall of 1.4 million Fiat Chrysler vehicles [4].
- In 2011, the FDA stated that 24% of medical device recalls could be attributed to software defects [5].

It is therefore easy to see how a modest investment in SAST can help avoid expensive recalls or the patching of in-field products. Fearmongering aside, it should be recognised that static code analysis is not a silver bullet. Forrester's TechRadar Q2 2015 Application Security report [6] summarised it well:

"SAST provides a significant value to its customers; it has a proven track record and provides extensive benefits. To gain maximum advantage from SAST […] teams must be fairly mature and capable of systematic follow-through on remediation. SAST will continue to see significant success […] for another five to 10 years as application environments become