
INSTITUTE FOR DEFENSE ANALYSES

State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation

Gregory Larsen, Task Leader
E. Kenneth Hong Fong, Project Leader
David A. Wheeler
Rama S. Moorthy

July 2014

Approved for public release; distribution is unlimited.

IDA Paper P-5061
Log: H 13-001130

INSTITUTE FOR DEFENSE ANALYSES
4850 Mark Center Drive
Alexandria, Virginia 22311-1882

About This Publication

This work was conducted by the Institute for Defense Analyses (IDA) under contract DASW01-04-C-0003, Task AU-5-3595, “State of the Art Resource for Hardware and Software Assurance,” for the Office of the Deputy Assistant Secretary of Defense for Systems Engineering; the Office of the Deputy CIO for Identity and Information Assurance (DCIO(IIA)), Director of Trusted Mission Systems and Networks (TMSN); and the Technical Director for the Center for Assured Software, National Security Agency. The views, opinions, and findings should not be construed as representing the official position of either the Department of Defense or the sponsoring organization.

Copyright Notice

© 2014 Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, Virginia 22311-1882 • (703) 845-2000. This material may be reproduced by or for the U.S. Government pursuant to the copyright license under the clause at DFARS 252.227-7013(a)(16) [Sep 2011].

Executive Summary

Nearly all modern systems depend on software. It may be embedded within the system, delivering capability; used in the design and development of the system; or used to manage and control the system, possibly through other systems. Software may be acquired as a commercial off-the-shelf component, custom developed for the system, or embedded within subcomponents by their manufacturers. Modern systems often perform the majority of their functions through software and can easily include millions of lines of code.

Although functionality is often created through software, this software can also introduce risks. Unintentional or intentionally inserted vulnerabilities (including previously known vulnerabilities) can provide adversaries with various avenues to reduce system effectiveness, render systems useless, or even turn our systems against us. Department of Defense (DoD) software, in particular, is subject to attack. Analyzing DoD software to identify and remove weaknesses is a critical program protection countermeasure. Unfortunately, it can be difficult to determine what types of tools and techniques exist for analyzing software and where their use is appropriate.

The purpose of this paper is to assist DoD program managers (PMs) and their staffs in making effective software assurance (SwA) and software supply chain risk management (SCRM) decisions, particularly when they are developing their program protection plan (PPP). A secondary purpose is to inform DoD policymakers who are developing software policies.

This paper defines and describes the following overall process for selecting and using appropriate analysis tools and techniques for evaluating software:

1. Select technical objectives based on context. This paper identifies a set of 10 major technical objectives and subdivides them further into up to 3 more levels of progressively more detailed objectives.
For example, the major technical objective “counter unintentional-‘like’ weaknesses” is subdivided into a second level of 12 sub-categories, and some of these second-level objectives are subdivided still further. This multi-stage breakdown of technical objectives is captured in Appendix E, Software State-of-the-Art Resources (SOAR) Matrix.

2. Select tool/technique types to address those technical objectives. This paper identifies 56 types of tools and techniques available for analyzing software. The supporting “Software SOAR Matrix” provides a detailed mapping between these tool/technique types and the technical objectives, to help readers identify and select the types of tools and techniques that meet the technical objectives (a hypothetical sketch of such a mapping appears at the end of this summary).

3. Select tools/techniques. This paper identifies, in some cases, where additional information is available to help the selection process.

4. Summarize the selection as part of a Program Protection Plan. This paper provides guidance on how to summarize the information derived from the selection of tool/technique types, and later the planned use of the tools/techniques, in a PPP.

5. Apply the tools/techniques and report the results. Here the selected tools and techniques are applied, including the selection, modification, or risk mitigation of software based on tool/technique results. Reports are provided to support oversight and governance.

Vignettes in Section 8 provide examples of this process.

This paper also describes some key gaps that were identified in the course of this study, including difficulties in finding unknown malicious code, obtaining quantitative data, analyzing binaries without debug symbols, and obtaining assurance of development tools. Additional challenges were found in the mobile environment; examples include lack of maturity in many tools, expectations of time constraints that preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and applicability to DoD systems. These would be plausible areas to consider as part of a research program.

Appendices provide additional detail, including more information on each type of tool and technique. Appendix D, for example, describes how we believe analysis should be continuously applied and integrated into the entire software lifecycle, creating a feedback loop for better-informed risk management decisions.

The information provided here was gathered from a variety of sources, including many interviews of subject matter experts. These experts identified a number of key topics, some of which are also captured in this paper.

This paper extends the earlier 2013 draft by adding information specifically focused on mobile platforms (e.g., smartphones and tablets running operating systems such as iOS and Android). It also includes various incremental improvements, including those suggested by reviewer comments from the sponsors, the Software Engineering Institute (SEI), and the MITRE Corporation. In particular, the set of technical objectives was expanded and slightly reorganized per reviewer comments.

Software analysis is a large and dynamic field, and this paper represents one step in capturing and organizing a wide range of diverse information. We hope that this material will be refined through feedback from the larger community. We recommend piloting the approach described in this document to determine its utility and to evolve it.
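The following is a minimal, hypothetical sketch (in Python) of the matrix-style lookup described in steps 1 and 2 above: a table mapping technical objectives to the tool/technique types that can address them, and a helper that gathers candidate tool/technique types for a program's selected objectives. All objective and tool/technique names below are illustrative examples, not the actual contents of the Software SOAR Matrix; the authoritative mapping is Appendix E.

    # Hypothetical illustration of a matrix-style mapping from technical
    # objectives to tool/technique types. Names are examples only, not the
    # actual SOAR matrix contents.
    OBJECTIVE_TO_TOOL_TYPES = {
        "counter known vulnerabilities": [
            "origin analyzer",
            "known-vulnerability scanner",
        ],
        "counter unintentional-'like' weaknesses": [
            "source code weakness analyzer",
            "fuzz tester",
        ],
        "counter intentional-'like'/malicious logic": [
            "binary/bytecode analyzer",
            "focused manual code review",
        ],
    }

    def candidate_tool_types(selected_objectives):
        """Return the set of tool/technique types covering the selected objectives."""
        candidates = set()
        for objective in selected_objectives:
            candidates.update(OBJECTIVE_TO_TOOL_TYPES.get(objective, []))
        return candidates

    # Example: a program that selected two technical objectives.
    print(candidate_tool_types([
        "counter known vulnerabilities",
        "counter unintentional-'like' weaknesses",
    ]))

In practice, a program would work from the full matrix in Appendix E, which covers all 56 tool/technique types, rather than from a small hand-built table such as this.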
Contents

Executive Summary
1. Introduction
2. Background
3. Overall Process for Selecting and Reporting Results from Appropriate Tools and Techniques
   A. General Approach
   B. Matrix to Help Select Tool/Technique Types to Address Technical Objectives
   C. Using the Matrix
4. Technical Objectives
   A. Technical Objectives’ Development Approach
   B. Technical Objectives – Main Categories
5. Types of Tools and Techniques
   A. Static Analysis
   B. Dynamic Analysis
   C. Hybrid Analysis
   D. Combining Tools and Techniques
6. Software Component Context
   A. General Factors
   B. PPP Contexts
7. Program Protection Plan Roll-up
8. Vignettes
   A. OTS Proprietary Software Critical Component