Air Force Life Cycle Management Center
Simulation and Analysis (S&A) Division
Wright-Patterson AFB OH 45433-7204

SIMAF Master Process (SMP)

Version 4.4 September 2019

Prepared by: AFLCMC/XZS Simulation and Analysis Facility 2303 8th St WPAFB OH 45433

Distribution C: Distribution authorized to U.S. Government Agencies and their contractors for administrative or operational use. Other requests for this document shall be referred to the S&A Division.


Signature Page (Reviewers)

Prepared by: Mr. Clifford Patterson, Systems Integration Engineer, AFLCMC/XZSI (digitally signed 9/26/19)

Reviewed by: Mr. Charles Kessel, Configuration Manager, AFLCMC/XZSI (Contractor) (digitally signed 9/26/19)

Reviewed by: Ms. Penny Mason, Program Management and Integration Branch Chief, AFLCMC/XZSM (digitally signed 10/23/19)

Reviewed by: Mr. Richard Graeff, Infrastructure Branch Chief, AFLCMC/XZSI (digitally signed 9/26/19)

Reviewed by: Ms. Awilda Santana, Analysis Branch Chief, AFLCMC/XZSA (digitally signed 9/26/19)

Reviewed by: Mr. Ralph Korthauer, Technical Expert, AFLCMC/XZS (digitally signed 10/3/19)


Signature Page (Approvers)

Approved by: Ms. Kelly Navarra, Chief Engineer, AFLCMC/XZS

Approved by: Mr. Dean Berry, S&A Division Chief, AFLCMC/XZS


Change Control Page

Revision   Date             Change Control Notes
V2.0       February 2011    FY 2011 Update
V3.0       January 2013     FY 2013 Update – Phase 7 added to include Configuration Management section. Verification testing activities and deliverables expanded the test reporting requirement to include test planning and development.
V4.0       January 2014     FY 2014 Update – Expanded CM, Requirements Definition, and Tailoring sections. Incorporated Lessons Learned from FY13 Project Closeout Reviews.
V4.1       July 2015        Added guidance for applying SMP to various development models.
V4.2       March 2016       Added activities and deliverables related to validation testing. Expanded Closeout Review activity, checklist, and briefing.
V4.3       September 2017   Updated SMP to reflect changes in procurement and contracting processes for new CDIP contract.
V4.4       February 2018    Integrated new CDIP DIDs. Replaced PRR with SRR.


TABLE OF CONTENTS

1 INTRODUCTION
1.1 Mission Statements
1.2 Document Structure and Purpose
1.3 Owner
1.4 Scope
1.5 Project Planning and Tailoring
1.5.1 Project Planning Considerations
1.5.2 Project Execution Models
1.6 Risk Management
1.6.1 Definition of Risk
1.6.1.1 Risk Components
1.6.1.2 Risk Identification
1.6.1.3 Risk Analysis
1.6.1.4 Risk Mitigation and Acceptance
1.6.1.5 Risk Tracking and Reporting
1.7 Reference Documents
1.8 Distribution
1.9 Maintenance
1.10 Policy Compliance

2 SIMAF PROJECT TEAM STRUCTURE
2.1 S&A Division (SIMAF) Senior Management Team
2.1.1 Division Chief
2.1.2 Technical Director
2.1.3 Chief Engineer
2.1.4 Security Officer
2.1.5 Project Management and Integration Branch Chief (XZSM)
2.1.6 Analysis Branch Chief (XZSA)
2.1.7 Infrastructure Branch Chief (XZSI)
2.1.8 Technical Expert
2.2 Project Execution Team
2.2.1 The Project Manager (PjM)
2.2.2 The Technical Lead
2.2.3 Financial Management Representative
2.2.4 Contracting Officer
2.2.5 Configuration Manager
2.2.6 Subject Matter Expert(s)
2.2.7 Operations Lead
2.2.8 Architecture Lead
2.2.9 Analysis Lead
2.2.10 Network Lead
2.2.11 Information Assurance/Cyber Lead


3 THE SIMAF MASTER PROCESS (SMP)
3.1 Initiate Project
3.1.1 Define Project Requirements
3.1.1.1 Establish Project Integrated Product Team (IPT)
3.1.1.2 Define Customer Requirements
3.1.1.3 Perform Gap Analysis
3.1.1.4 Develop Statement of Capability (SOC)
3.1.1.5 Develop Project Study Plan
3.1.1.6 Develop Verification & Validation (V&V) Plan
3.1.1.7 Develop Activity Plan
3.1.1.8 Develop Project Order Document
3.1.2 Establish Contractor Support
3.1.2.1 Process Contract Order
3.1.2.1.1 Coordinate Contract Order Documents
3.1.2.1.2 Release RFP/FOPR Package
3.1.2.1.3 Proposal Technical Evaluation
3.1.2.1.5 Contract Order Award
3.1.2.2 Conduct Kick-Off Meeting
3.1.2.6 Monitor Project Execution
3.2 Develop Analytical Design
3.2.1 Define Analysis Requirements
3.2.1.1 Develop Analysis Objectives
3.2.1.2 Define Measures of Effectiveness & Performance (MOEs/MOPs)
3.2.1.3 Define Data Elements
3.2.1.4 Produce Analysis Traceability Matrix (ATM)
3.2.2 Define Experiment
3.2.2.1 Develop High-Level Assessment Description
3.2.2.2 Develop Experimental Design Method
3.2.2.3 Identify Data Source
3.2.2.4 Define Data Collection Plan and Analysis Methodology
3.2.2.5 Produce Technical Assessment Plan (TAP)
3.2.3 Conduct Analysis Requirements Review (ARR)
3.3 Specify Operational Environment
3.3.1 Develop Live, Virtual, Constructive (LVC) List
3.3.2 Develop Integration Plan
3.3.3 Define Mission Use Cases
3.3.3.1 Decompose Integration Plan into Mission Use Cases
3.3.3.2 Produce DoDAF Viewpoints
3.3.4 Conduct Operational Requirements Review (ORR)
3.4 Develop Capability
3.4.1 Define System/Subsystem Requirements
3.4.2 Generate Requirements Traceability Matrix (RTM)
3.4.3 Conduct System Requirements Review (SRR)
3.4.4 Develop System/Subsystem Design
3.4.5 Define Software Requirements


3.4.6 Conduct Software Specification Review (SSR)
3.4.7 Develop Software Design
3.4.8 Conduct Software Design Review (SDR)
3.4.9 Develop Software
3.4.9.1 Code Computer Software CIs (CSCIs)
3.4.9.2 Develop Functional Unit Test Cases
3.4.9.3 Conduct Functional Unit Tests
3.4.9.4 Perform Pre-Integration Component-Level Testing
3.4.10 Conduct Integration Test Readiness Review (ITRR)
3.5 Integrate Battlespace
3.5.1 Establish Relevant Environment
3.5.2 Conduct Integration Testing
3.5.2.1 Prepare the SIMAF Facility
3.5.2.2 Establish Network Connections
3.5.2.2.1 Connect Participating Sites to the Network
3.5.2.2.2 Verify Network Operation
3.5.2.2.3 Verify Security
3.5.2.2.4 Interface to Operational Systems
3.5.2.3 Test Integrated Environment
3.5.3 Conduct Dry Run
3.5.4 Conduct Event Readiness Review (ERR)
3.6 Execute Battlespace
3.6.1 Conduct System Testing
3.6.1.1 Perform System Verification Tests
3.6.1.2 Perform System Validation Tests
3.6.2 Conduct Experiment
3.6.2.1 Conduct Pre-Event Planning
3.6.2.2 Conduct Event Pre-Briefs
3.6.2.3 Execute Event Runs
3.6.2.4 Perform Quick Look Analysis
3.6.2.5 Archive Daily Output Data
3.6.2.6 Conduct Daily Hotwash
3.6.2.7 Conduct Event Debrief
3.6.2.8 Analyze Results
3.6.2.8.1 Process Data
3.6.2.8.2 Assess Results
3.6.3 Deliver Capability
3.6.3.1 Document Results
3.6.3.2 Archive Data
3.6.4 Conduct Project Closeout Review
3.7 Deploy/Sustain Capability
3.7.1 Establish Product Baseline
3.7.2 Deploy Product
3.7.3 Manage Configuration

List of Appendices


Appendix A – Terms and Definitions
Appendix B – Acronyms


List of Figures

Figure 1: Risk Reporting Matrix
Figure 2: S&A Division Team Structure
Figure 3: Project Execution Workflow
Figure 3.1 – Initiate Project Overview
Figure 3.1.1 – Define Project Requirements Activity Flow
Figure 3.1.2 – Establish Contractor Support Activity Flow
Figure 3.1.2.1 – Process Contract Order Activity Flow
Figure 3.2 – Develop Analytical Design Overview
Figure 3.2.1 – Design Analysis Requirements Activity Flow
Figure 3.2.2 – Define Experiment Activity Flow
Figure 3.2.3 – Conduct Analysis Requirements Review (ARR) Activity Flow
Figure 3.3 – Specify Operational Environment Overview
Figure 3.3.4 – Conduct Operational Readiness Review (ORR)
Figure 3.4 – Develop Capability Overview
Figure 3.4.3 – Conduct System Requirements Review (SRR) Activity Flow
Figure 3.4.6 – Conduct Software Specification Review (SSR) Activity Flow
Figure 3.4.8 – Conduct Software Design Review (SDR) Activity Flow
Figure 3.4.10 – Conduct Integration Test Readiness Review (ITRR) Activity Flow
Figure 3.5.1 – Integrate Battlespace Overview Activity Flow
Figure 3.5.2 – Conduct Integration Testing Activity Flow
Figure 3.5.2.2 – Establish Network Connections Activity Flow
Figure 3.5.4 – Conduct Event Readiness Review (ERR) Activity Flow
Figure 3.6 – Execute Battlespace Overview
Figure 3.6.2 – Conduct Experiment Activity Flow


1 Introduction

Increasingly complex technical requirements from the warfighter, combined with sweeping changes throughout the Department of Defense (DoD) to improve efficiency in acquisition processes, led to the activation of the Simulation and Analysis Facility (SIMAF) at Wright-Patterson Air Force Base (WPAFB) in 1999.

Through partnerships with engineering, analysis, and simulation offices across the DoD and industry, SIMAF conducts simulations and analyses for system and requirements trades in accordance with the DoD vision for Simulation-Based Acquisition (SBA). It provides an integrated modeling, simulation, and analysis capability that includes a real-time, high-fidelity, human-in-the-loop virtual battlespace suitable for design, development, and operational evaluations during any phase of a weapon system’s life cycle. With its team of virtual simulation scientists, engineers, and analysts, SIMAF’s capability enables collaborative teams of senior decision-makers, warfighters, developers, engineers, and analysts to:
• Better and more quickly evaluate the impact of emerging technologies and systems on the warfighter’s mission effectiveness
• Develop advanced concepts and technologies for military applications
• Provide the Air Force Materiel Command (AFMC) Commander, Air Force Life Cycle Management Center (AFLCMC) Commander, Program Executive Officers (PEOs), the Service Acquisition Executive (SAE), and USAF Program Directors with detailed information that supports acquisition decisions.

1.1 Mission Statements

XZS, Simulation and Analysis Division

Provide Customers and Decision Makers with independent government analyses and simulation software throughout the aerospace acquisition life cycle to develop, test, and assess war-fighting requirements, solutions, and capabilities in integrated, cross-domain Live-Virtual-Constructive battlespace environments.

XZSA, Analysis Branch

Provide analytic underpinnings for live, virtual, constructive modeling and simulation events and studies through: requirements development; representations, tools, and processes; and engineering, HSI, and cost-benefit assessments.

XZSI, Infrastructure Branch

Provide expertise to enable constructive and virtual simulation development and analysis by rapidly delivering: secure facility physical plant and planning; secure stand-alone/distributed networks; hardware and software integration; and credible simulation environments for both development testbeds and customer simulation needs.


XZSM, Project Management and Integration Branch

Provide Project Management expertise to successfully execute an Enterprise-wide approach to live-virtual-constructive modeling, simulation, and analysis assessments and solutions.

1.2 Document Structure and Purpose

The SIMAF Master Process (SMP) provides a standardized approach for the management of SIMAF projects from initial project planning to closeout. A project can be an event or series of events, an analysis study, the creation of or modification to models and simulations, a simulator hardware upgrade or procurement, a facility modification, a verification/validation test activity, or any combination of these elements. Not all projects culminate in an Event, which is defined as a Live Virtual Constructive (LVC) demonstration, test, and evaluation activity, or analysis supported by exercises.

1.3 Owner

The Simulation and Analysis (S&A) Division Chief Engineer is the owner of this process.

1.4 Scope

The SMP provides a standardized approach for the management and execution of SIMAF projects from initial project planning to closeout. A project can be an event or series of events, an analysis study, the creation of or modification to models and simulations, a simulator hardware upgrade or procurement, a facility modification, or any combination of these elements. Not all projects culminate in an event, which is defined as a Live Virtual Constructive (LVC) demonstration, test, and evaluation activity, or analysis supported by exercises.

This SMP makes available a comprehensive set of activities applicable to a broad range of S&A projects. Adherence to the SMP is required for all projects in SIMAF; however, it is not intended that every activity be applied to every project. The purpose of the SMP is to provide a framework and set of processes from which a given project can draw.

1.5 Project Planning and Tailoring

This SMP contains a comprehensive set of processes which can be tailored and applied to a variety of project types, as listed in paragraph 1.5.2 below. A project may tailor these processes to fit type, scope, timeline, or financial constraints. A project’s process set is tailored initially by the Government during development of the Statement of Capability (SOC). Contractors may suggest further tailoring in their formal proposal; however, any deviations from the tailoring defined by the Government must be justified and will be evaluated by the Government IPT evaluation team during the technical evaluation of the proposal. All reviews and deliverables at the time of contract award become the baseline for the project and will be reflected in the Contractor’s Project Management Plan (PMP) and Integrated Master Schedule (IMS). Any changes to the SMP baseline must be formally requested and approved by the Project Execution Team before incorporation into the PMP. Once the PMP is approved, further SMP tailoring is not permitted.

SIMAF manages two (2) main project types, analysis efforts and capability builds, which are distinguished as follows:

a. Analysis Effort: Projects focused on analysis involving study planning and design of experiments.
b. Capability Build: Projects focused on capability development, such as simulation models and applications, cockpit builds, infrastructure updates, builds, and procurements.

Frequently a project combines elements of both project types. That is, a project may require that capabilities be developed to satisfy the analysis objectives for the project. In that case the project will include processes associated with both analysis and capability build.

Regardless of the project type, it is the responsibility of the Government to determine the scope of tailoring for a given project and ensure the resulting tasking documents (SOW Amendment, Order Assignment, PWS, etc.) accurately and completely reflect the Government’s expectations regarding project deliverables and work scope.

1.5.1 Project Planning Considerations

It is the responsibility of the Project Execution Team to plan the project properly to ensure the resulting tasking accurately and completely reflects the Government’s expectations regarding deliverables and work scope. Project Execution Team planning activities and products are further discussed in the Initiate Project section of this SMP. Topics to be considered during project planning include:

a. Identify the capabilities, products, and/or services the project will develop or provide and the end user of those products. This will influence the level of process rigor related to product development (e.g., Verification & Validation (V&V) testing), management, and sustainment (e.g., configuration management). For example, products developed primarily for SIMAF use are subject to strict configuration management processes, whereas management responsibility for products delivered to an external customer transitions to the customer, although there may be higher expectations for V&V testing.
b. Determine what data, products (tools), and/or services are being provided by the Government for the project and which of these the contractor will be tasked to provide. The resulting split in responsibilities and work scope shall be documented with all associated costs/risks and submitted to SIMAF leadership for approval.
c. Select a project execution model and determine the resulting tailoring of milestones, deliverables, and/or workflow.


1.5.2 Project Execution Models

Although the activity workflows are presently visualized as sequential processes in the SMP, it is not intended that a monolithic or waterfall process be applied to all projects. While some activities are always dependent upon outputs from a preceding task (e.g., requirements must exist before a design can be started, a design must exist before code can be written), application of the SMP is intended to be adaptive to the project’s specific needs. In tailoring an execution strategy for a given project, the objective is always to improve product quality and turnaround, leverage reuse, reduce risk, and eliminate unnecessary process overhead. In the end, one should not be able to tell which software development model was employed for a given project or capability by looking at the documentation.

Many factors can influence a project’s execution model. Project type, scope and clarity of customer requirements, customer-driven response timelines, or significant tailoring of the SMP itself may justify alternative development strategies. Types of alternative process models include:

a. Concurrent Development. All activities exist concurrently but reside in different states. The concurrent process model defines a series of events that trigger transitions from state to state for each of the master process activities defined for the project.

b. Parallel Development. Applies primarily to larger software development efforts where portions of the software products are developed simultaneously by different individuals or teams. It may also apply if the Specify and Build Battlespace threads of the SMP workflow can be executed in parallel.

c. Incremental Build. The incremental build model is a method of software development in which the product is designed, implemented, and tested incrementally. Each development phase results in a complete implementation of the selected requirements and has some standalone value. Development of each increment can occur in either concurrent or sequential phases and continues until the product is finished, that is, when it satisfies all of its requirements.

d. Iterative Build. The iterative build model differs from the incremental model in that each build phase attempts to address the complete set of requirements, with successive build phases refining and adding detail to the overall design. The advantage of iterative development is that defects can be identified and tracked at early stages of development. This model combines the elements of parallel development with the iterative philosophy of prototyping.

e. Spiral Development. The spiral model is a risk-driven process that specifically introduces risk assessment as part of the development model. As an example of a spiral process, an initial version of the product may be developed and then repetitively modified through prototyping based on input received from customer evaluations. Thus, the final product is developed as part of the final spiral.

f. Agile Process. In general, the Agile process model describes a set of values and principles for software development under which requirements and solutions evolve through the collaborative effort of self-organizing, cross-functional teams. It advocates adaptive (vice predictive) planning, evolutionary development, early delivery, and continuous improvement. One very popular Agile framework for managing software development is Scrum. Scrum consists of small teams that break the work into actions that can be completed within fixed-duration cycles (“sprints”), track progress, and re-plan daily, with workable software delivered at the end of every sprint.

SIMAF encourages execution of an agile process model where appropriate to promote innovative design and rapid prototyping. However, when executing an agile process model it is imperative that a layer of process discipline be applied so the good ideas that come out of model-storming are applied to a controlled baseline through the configuration management process to maximize reusability across projects. During product development, communication among contractor and Government technical teams should occur on a regular, if not daily, basis to avoid requirements drift and to ensure the project team remains focused. Agile is not an ad hoc process; it typically takes more oversight, communication, and teamwork to successfully execute an agile process model than any other process model. One common myth of the Agile process is that it obviates the need for documentation, placing emphasis instead on working code. At SIMAF, consistent and thorough documentation is critically important for sustainment of our product line regardless of the software development model employed. With an Agile process, the documentation tends to evolve with the sprints as the end product matures.

1.6 Risk Management

Risk management is the process that encompasses risk identification, analysis, mitigation planning, mitigation plan implementation, and tracking. Risk management should begin at the onset of a project, be outlined in the PMP, and be revisited at each major review or milestone. Note: risks should not be confused with issues; issue management applies resources to address and resolve current issues or problems, while risk management applies resources to mitigate future potential root causes and their consequences.

The objective of a well-managed risk management program is to provide a repeatable process for balancing cost, schedule, and performance goals within available funding, especially for designs that approach or exceed the state of the art or that have tightly constrained or optimistic cost, schedule, and performance goals. Without effective risk management the project team may find itself doing crisis management, a resource-intensive process that is typically constrained by a restricted set of available options.

Successful risk management depends on the knowledge gleaned from assessments of all aspects of the project, coupled with appropriate mitigations applied to the specific root causes and consequences. A key concept is that the Government shares the risk and does not transfer all risks to the contractor. All program risks are of concern and must be assessed and managed by the SIMAF project team. Successful mitigation requires that the Government and the contractor communicate all program risks and project options. Risk mitigation involves selecting the option that best balances performance and cost.

1.6.1 Definition of Risk

Risk is defined as a future uncertainty in achieving project execution goals within defined cost, schedule, and performance constraints.


1.6.1.1 Risk Components

Risks have three components: a future event (the action yet to happen), the likelihood of that event, and its consequence. Each risk can be stated in “if action, then result” terms, together with an assessment of its probability (likelihood) and consequence. A future root cause is the most basic reason for the presence of a risk; accordingly, risks should be tied to future root causes and their effects.

1.6.1.2 Risk Identification

The intent of risk identification is to answer the question “What can go wrong?”, followed by asking “why” until the source(s) of the risk is discovered. Risk identification is the activity that examines each element of the project to identify associated root causes, begin their documentation, and set the stage for their successful management. Consequently, it is important to recognize that risk identification is the responsibility of every member of the Project Execution Team.

Examination of a program is accomplished by decomposing it into relevant elements or areas. Teams should break down projects to a level of detail that permits an evaluator to understand the significance of any risk and identify its causes. Risks can be identified based on prior experience, brainstorming, lessons learned, or comparison with similar projects. An examination of each project element against each risk area is an exploratory exercise to identify the critical root causes. Root causes are those potential events that evaluators determine would adversely affect the program at any time in its life cycle. Examples of risk areas include requirements uncertainty, model development, facilities, contractor or government resources, schedule, and security.

1.6.1.3 Risk Analysis

The intent of risk analysis is to answer the question “How big is the risk?” This is done by considering the likelihood of the root cause occurring; identifying the possible consequences in terms of performance, schedule, and cost; and identifying the risk level using the Risk Reporting Matrix shown below.


Figure 1: Risk Reporting Matrix

Each undesirable event that might affect the success of the program (performance, schedule, and cost) should be identified and assessed as to the probability and consequence of its occurrence. A standard format for evaluating and reporting program risk assessment findings facilitates a common understanding of program risks at all levels of management. The risk level for each root cause is reported on the Risk Reporting Matrix as low (green), moderate (yellow), or high (red). The likelihood of each root cause is established using the following criteria.

Probability (1-5 scale, by probability of occurrence):
1. Not Likely: 5-20%
2. Low: 21-40%
3. Likely: 41-60%
4. High: 61-80%
5. Near Certain: 81-99%

The consequences of each risk are established utilizing the following criteria.

Consequence (1-5 scale):
1. Negligible: Negligible schedule slip; minimal technical impact
2. Minor: Schedule slip, but able to meet key milestones/deliverables; minor reduction in technical performance
3. Moderate: Schedule slip impacts key milestones/deliverables; moderate shortfall in technical performance
4. Serious: Will require a change to the critical path; significant degradation in technical performance
5. Critical: Cannot meet key milestones; severe degradation in technical performance
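The two ratings are combined through the Risk Reporting Matrix to arrive at a reported risk level. As a hedged illustration only (the matrix in Figure 1 is authoritative and is not reproduced in this text), the following Python sketch shows the shape of that lookup; the threshold rule inside `assess_risk` is a placeholder assumption, not the SMP’s actual cell boundaries.

```python
# Illustrative sketch only: the authoritative low/moderate/high boundaries are
# the cells of the Risk Reporting Matrix in Figure 1. The simple threshold
# rule below is a placeholder used to make the lookup concrete.

LIKELIHOOD = {1: "Not Likely", 2: "Low", 3: "Likely", 4: "High", 5: "Near Certain"}
CONSEQUENCE = {1: "Negligible", 2: "Minor", 3: "Moderate", 4: "Serious", 5: "Critical"}

def assess_risk(likelihood: int, consequence: int) -> str:
    """Return the reported risk level (Low/Moderate/High) for one root cause."""
    if likelihood not in LIKELIHOOD or consequence not in CONSEQUENCE:
        raise ValueError("likelihood and consequence must each be rated on the 1-5 scale")
    score = likelihood + consequence          # placeholder roll-up, not Figure 1
    if score >= 8:
        return "High"
    if score >= 5:
        return "Moderate"
    return "Low"

# Example: a root cause rated Likely (3) with Serious (4) consequences
print(assess_risk(3, 4))   # -> "Moderate" under the placeholder boundaries
```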


1.6.1.4 Risk Mitigation and Acceptance

Risk mitigation is the activity that identifies, evaluates, and selects options to set risk at acceptable levels given program constraints and objectives. It is intended to enable program success, and it includes the specifics of what should be done, when it should be accomplished, who is responsible, and the funding required to implement the risk mitigation plan. The intent of risk mitigation is to answer the question “What is the approach for addressing this potential unfavorable consequence?” Once alternatives have been analyzed, the most appropriate approach is selected, documented in a risk mitigation plan, and incorporated into program planning.

A risk mitigation plan needs to be realistic, achievable, measurable, and documented. The plan does not need to be complex, but it should be a written summary containing the following items: date of the plan; description of the risk; root causes leading to the risk; possible alternatives to alleviate the risk; activities intended to reduce the risk; success criteria for each plan event and the subsequent “risk level if successful” values; fallback approaches; and expected decision dates. When determining that it may be appropriate to lower the consequence of a risk, careful consideration should be given to the current risk state to justify such a reassessment.

It is imperative that the Technical Lead and the Project Manager understand and approve the mitigation plan and examine it for secondary, unforeseen impacts to other elements of the project. As part of this effort, SIMAF requires that a project’s risk management be documented and briefed, as appropriate, during program and technical reviews.

1.6.1.5 Risk Tracking and Reporting

The intent of risk tracking is to ensure successful risk mitigation. It answers the question “How are things going?” An event’s likelihood and consequences may change as the project proceeds and updated information becomes available, so the project team should re-examine risk assessments and risk mitigation approaches concurrently. As the design matures, more information becomes available to assess the degree of risk inherent in the effort. If the risk changes significantly, the risk mitigation approaches should be adjusted; if the risks are found to be lower than previously assessed, specific risk mitigation actions may be reduced or canceled. In addition to reassessing, the team should look for new risk mitigation options.

The purpose of risk reporting is to ensure management receives all necessary information to make timely and effective decisions. This allows for coordination of actions by the risk team, allocation of resources, and a consistent, disciplined approach. A primary goal of risk reporting is to provide the PjM with effective early warning of developing risks. The table below can be used to summarize the risks of a project during a project review or weekly status meetings. The goal is to keep the team focused on upcoming risk mitigation efforts and to think through how to execute the plans as needs arise.

Risk Statement   Impact Statement   Probability   Consequence   Risk Result   Mitigation Plan   Closure Date
If action        Then response      3-Likely      4-Serious     Mod           Plan of action    1 Nov 99
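As a purely illustrative sketch (not part of the SMP), the same summary fields can be carried in a simple risk register structure so that each entry rolls up cleanly for weekly status meetings or project reviews. The field names and the example risk below are hypothetical, and the level roll-up repeats the placeholder rule from the sketch in 1.6.1.3 rather than the Figure 1 matrix.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One row of the risk summary table (field names are illustrative)."""
    risk_statement: str       # "If <action> ..."
    impact_statement: str     # "... then <response>"
    probability: int          # 1-5 likelihood rating
    consequence: int          # 1-5 consequence rating
    mitigation_plan: str
    closure_date: str

    def risk_result(self) -> str:
        # Placeholder roll-up; the reported level actually comes from the
        # Figure 1 Risk Reporting Matrix (see the sketch in 1.6.1.3).
        score = self.probability + self.consequence
        return "High" if score >= 8 else "Moderate" if score >= 5 else "Low"

# Hypothetical example entry, invented for illustration only
register = [
    RiskEntry(
        risk_statement="If the threat model is not delivered by the planned date",
        impact_statement="then integration testing slips and key milestones are at risk",
        probability=3,            # Likely
        consequence=4,            # Serious
        mitigation_plan="Begin integration with a surrogate model; track delivery weekly",
        closure_date="1 Nov 99",
    ),
]

for entry in register:
    print(f"{entry.risk_result():8s} {entry.risk_statement} / {entry.impact_statement}")
```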


1.7 Reference Documents

The following documents form a part of the SMP and shall be complied with for every project. The documents listed below can be found on the AFLCMC/XZS SharePoint:

1) SIMAF Data Management and Archiving Plan (DMAP)
2) SIMAF Configuration Management Plan (CMP)
3) SIMAF Standard Operating Procedures (SOPs)

1.8 Distribution

The SMP, associated templates, work products, and training materials are released under the authority of the Technical Director. No changes to this process shall be made without the approval of the SMP Point of Contact (POC). Whenever the document is updated, approved, and released as a new version, the SMP POC will ensure that:
1. The document is placed on the AFLCMC/XZS SharePoint and is available to all S&A Division personnel for use.
2. The S&A Division community is notified via e-mail that the document is available on the AFLCMC/XZS SharePoint, and a link is provided.
3. Training is provided upon request and/or if extensive document revisions have been made.

The SMP is also available to organizations external to SIMAF. Requests for a copy of the SMP can be submitted to the SIMAF Technical Director, the S&A Division Chief, or the SMP POC.

1.9 Maintenance

User feedback on this process is welcome and can be provided to AFLCMC/XZS. The SMP will be reviewed on an annual basis and updated/distributed accordingly.

1.10 Policy Compliance

The S&A Division Government PjMs monitor process compliance using a standard checklist that focuses on items concerning cost, schedule, and performance, and ensure that:
• The requirements of the project contracts are being met
• The project cost estimates baselined in the Project Management Plan (PMP) are updated monthly in the Funds and Man-hour Expenditure Report and Monthly Status Reports (MSRs) and are on track
• The Integrated Master Schedule tracks to the milestone review dates and due dates of associated products
• Project deliverables are submitted and tracked against the Contract Data Requirements List (CDRL) requirements

The Government Technical Lead ensures that the day-to-day technical execution of the project is consistent with the policies, processes, and procedures documented in this process.


2 SIMAF Project Team Structure

The SIMAF team structure described below contains all the potential roles and relationships for projects and events controlled by SIMAF. Team composition may be tailored depending on the size and scope of a given project.

Figure 2: S&A Division Team Structure

2.1 S&A Division (SIMAF) Senior Management Team

This team consists of the Division Chief, Technical Director, Chief Engineer, Security Officer, Project Management and Integration Branch Chief, Analysis Branch Chief, Infrastructure Branch Chief, and Technical Expert.

Collectively, the Senior Management Team forms the SIMAF Oversight Board (OSB) and has the following responsibilities:
1. Conduct planning and programming necessary to ensure adequate funds are available to execute all Division activities.
2. Determine what projects will be accomplished during the Fiscal Year.


3. Ensure contracting vehicles exist to execute S&A Division projects.
4. Resolve all issues, such as project schedule and resource conflicts, that cannot be resolved at the lower levels.
5. Review and approve the resource allocations such as facilities, schedule, and personnel for all projects to be initiated.
6. Review draft Project Initiation Documents (PIDs) and prioritize them for implementation.
7. Review and resolve all issues elevated from the Project Execution Team.

2.1.1 Division Chief

The Division Chief is responsible for the overall management of SIMAF and will:
1. Provide oversight and direction of facility operations, maintenance, and general support.
2. Provide oversight and direction of facility and security support within the division.
3. Review and provide guidance concerning the management of contracts supporting the facility.
4. Advocate for funding to support facility operations and maintenance.
5. Provide oversight and direction to branch chiefs regarding personnel actions and resource management.
6. Provide general guidance and oversight for all projects and events.
7. Serve as an external interface to USAF, DoD, and other government agencies regarding S&A.
8. Foster organizational advocacy.

2.1.2 Technical Director

The Technical Director, in support of the PjM, is responsible for the strategic technical direction of the division and will:
1. Advance the state of the art in Capability-Based Planning, Development, and Assessment.
2. Support and lead Modeling and Simulation activities that align with the SIMAF strategic vision.
3. Ensure all technical support delivered by the division is authoritative, accurate, documented, and coordinated.
4. Coordinate technical support at the branch level in collaboration with the Division and Branch Chiefs and the Technical Expert.
5. Serve as an external interface to USAF, DoD, and other government agencies regarding S&A.
6. Oversee Systems Engineering process development, execution, and process improvement.

2.1.3 Chief Engineer

The Chief Engineer will:

1. Develop, disseminate, and implement technical policies and direction.
2. Lead and advise personnel in the implementation of technical policies.


2.1.4 Security Officer

The Security Officer is a government employee and shall be Special Access Program (SAP) qualified. The Security Officer is responsible for security oversight of the SIMAF facility, encompassing all security disciplines applicable to SIMAF. These requirements are documented in the Security Standard Operating Procedures (SOP) and the Fixed Facility Checklist. The Security Officer shall:
1. Ensure SIMAF rooms are maintained to meet the security requirements needed for events.
2. Obtain and maintain facility accreditation through development of standard operating procedures, fixed facility checklists, requests for approvals, and documentation required for special access programs and collateral security.
3. Ensure personnel are accessed/de-accessed to SIMAF room(s) alarms as appropriate per SOP requirements.
4. Ensure personnel are issued a badge per SOP requirements which allows access to the SIMAF room(s) they are approved for.
5. Provide a complete security education and training program at classification levels appropriate for and required by specific projects to ensure that personnel are properly briefed, and maintain supporting documentation/records of training.
6. Ensure classified information is properly marked, controlled, and secured and that the document control program is administered.
7. Provide support for event security planning and execution, to include all aspects of physical, technical, and administrative security issues.
8. Ensure a badge control program is administered and executed properly to guarantee personnel are issued required badges and that badges are maintained and inventoried per established policy.
9. Support all security inspections from external organizations and assist in internal self-inspection programs for all security disciplines.
10. Ensure reproduction, destruction, and transmission (mailing and faxing) procedures are implemented.
11. Review existing policies/procedures to achieve integration and consistency between information system security and facility security.
12. Execute SIMAF DD-254s.

2.1.5 Project Management and Integration Branch Chief (XZSM)

The Project Management and Integration Branch Chief is responsible for the overall program management of the S&A Division and will:
1. Be accountable for assigned programs on all matters of program cost, schedule, and performance.
2. Plan, program, and budget for adequate funds to execute SIMAF operations.
3. Work with the Technical Director to transition customers to SIMAF.
4. Set criteria and guidelines for all supplier agreement activities.
5. Provide program management and functional resources to support development of project requirements, acquisition strategies, and acquisition contract documentation and execution.
6. Oversee SIMAF project contracts and ensure execution.
7. Periodically review the status of contracts and conduct CPAR assessments annually on each prime SIMAF contractor.


8. Review and advise on contract change requests.
9. Identify requirements, and the risk associated with unmet requirements, for SIMAF to include facilities, personnel, and resources, and provide them to AFLCMC/XZ.
10. Assign a Contracting Officer Representative (COR) with the appropriate level of clearance, or require an accessed SIMAF team member who meets eligibility requirements to become a COR for that effort.
11. Manage AFLCMC/XZSM day-to-day operations.

2.1.6 Analysis Branch Chief (XZSA)

The Analysis Branch Chief manages the Analysis Branch and will:
1. Assign engineers to serve as Technical Leads and/or Project Support for SIMAF projects.
2. Oversee analytic requirements and activities for all events/projects.
3. Manage development of models and simulations, analytical methods, and tools consistent with the event objectives and functional requirements.
4. Ensure implementation of a Systems Engineering approach to event planning, design, V&V, and execution.
5. Serve as Government OPR for Test and Analysis Plans and V&V plans.
6. Review the analysis of event data for effectiveness in answering the analytical questions of the study.
7. Review final reports for compliance with customer expectations and impacts to customer and/or Warfighter decisions.

2.1.7 Infrastructure Branch Chief (XZSI)

The Infrastructure Branch Chief manages the Infrastructure Branch and will:
1. Manage the SIMAF physical plant and planning.
2. Manage simulation infrastructure requirements, planning, and design (aircraft cockpits, computers, networks, operating system software, etc.).
3. Assign Branch support to oversee Cyber/Information Assurance (IA) management (oversee certification and accreditation of computers, internal and external networks, etc.).
4. Oversee virtual simulation event integration management.
5. Provide technical project oversight as required.
6. Work with the Analysis Branch Chief to identify the Government Technical Leads and/or Project Support for a project/event.
7. Plan and manage the entire SIMAF Information Technology infrastructure in support of SIMAF business requirements.

2.1.8 Technical Expert

The Technical Expert is the primary technical focal point for all SIMAF projects and will:
1. Work with the Technical Director and Branch Chiefs to support customer needs and develop SIMAF capabilities.


2. In support of the PjM, serve as the primary technical focal point for Project Planning, Project Monitoring and Control, and Supplier Agreement Management processes and activities.
3. In support of the PjM, provide overall technical oversight for all active projects and coordinate technical activities across projects to ensure technical products are being developed consistently and efficiently.
4. Review and approve all technical deliverables.
5. Ensure the SMP is followed across all projects/events executed in the SIMAF.

2.2 Project Execution Team

A Project Execution Team is established for each project. The Project Execution Team consists of a Project Manager, Technical Lead, Financial Management Representative, Contracting Officer, and Configuration Manager. Depending on the type and scope of the project, the Project Execution Team is supported by one or more of the following: Subject Matter Experts (SMEs), Operations Lead, Architecture Lead, Analysis Lead, Network Lead, and Information Assurance/Cyber Lead.

The Project Execution Team is the collective body of all stakeholders for a given project and is responsible for the programmatic execution and technical performance of the project. All projects within SIMAF will maintain a working Project Execution Team and use it as the primary mechanism for conveying project status, identifying and mitigating risk factors, resolving issues, and, in general, facilitating communication between the Government and contractor staff as the project executes.

The Project Execution Team is formed as soon as a project need is identified. Prior to contract award the objectives of the Project Execution Team are to define the requirements, work scope, and process tailoring for the project.

2.2.1 The Project Manager (PjM)

The PjM is responsible for the cost, schedule, performance, and risk associated with executing a project and shall:
1. Manage the cost, schedule, and performance of each project.
2. Work closely with contracting to place identified projects on contract in a timely manner.
3. Chair the Project Execution Team.
4. Deliver/disseminate contractual project documentation to the customer and/or other stakeholders, as required.
5. Act as the customer and contractor interface for cost, schedule, and risk issues.
6. Work with the Government Technical Lead to perform conflict resolution among stakeholders when required.
7. Review/assist with and prepare pre-contract and contractual documents (PID, SOC, SOW Amendment/Order Assignment/PWS/etc., and CDRL).
8. Coordinate the review, evaluation, and delivery of all project/event CDRLs/deliverables.
9. Resolve funding, schedule, risk management, or resource issues, or request assistance from the Senior Management Team.


10. Work with the customer and Financial Officer (FO) to facilitate funds transfer and timely expenditure to support completion of the project/event requirements within the scheduled timeline, regardless of the contract vehicle.
11. Provide financial data to the FO for tracking purposes for those projects that are executed on contracts external to SIMAF ownership.
12. Track budget and technical progress and report business performance to management and the customer.
13. Coordinate as required with the Government Technical Lead to facilitate the scheduling of all project reviews.
14. Record and disseminate minutes of all project meetings and reviews.
15. Support event closeout activities (reference para. 3.6.5 for PjM closeout responsibilities).
16. Assess customer satisfaction.
17. Assess contractor performance based on input from the Project Execution Team members.

2.2.2 The Technical Lead

The Technical Lead will:
a. Serve as a member of the Project Execution Team.
b. Serve as the Technical Lead for the project, acting as the final engineering authority on technical issues/requirements.
c. Review/assist with and prepare pre-contract and contractual documents (PID, SOC, SOW Amendment/Order Assignment/PWS/etc., and CDRL).
d. Work with the PjM to ensure technical resources (hardware, software, personnel, etc.) are allocated/committed.
e. Advise the PjM of significant technical issues which have the potential for cost, schedule, contracting, or other project performance impacts.
f. Work closely with the security office to ensure the integrity of project security.
g. Assist the PjM in reviewing the technical sections of the PMP.
h. Ensure that any Memoranda of Agreement (MOAs) needed for a distributed event are properly coordinated and approved.
i. Coordinate with the Division Chief to ensure that official invitations are sent to very important people (VIPs) 4-6 weeks prior to an event.
j. Review and approve the Event Announcement for dissemination.
k. Oversee, review, and approve mitigation strategies for meeting project technical requirements when initial plans fail or fall short of objectives.
l. Oversee the test planning process.
m. Oversee, review, and approve all technical products.
n. Oversee the software development process.
o. Coordinate software deliveries with the CM Manager/CI Librarian to ensure delivered software is properly versioned and baselined.
p. Work in conjunction with the Technical Expert and the PjM to ensure that the project is in compliance with the SMP and assist with tailoring the process for the project, if required.


q. Work in conjunction with the PjM to ensure all project deliverables per the SOW Amendment/Order Assignment/PWS/etc. are named and stored in accordance with the SIMAF file and folder naming convention.

2.2.3 Financial Management Representative

The Financial Management Representative will work closely with the Project Manager and aid in the following activities:
1. Serve as a member of the Project Execution Team.
2. Assist the PjM in identifying the source of and/or dissemination of funds for the project.
3. Assist the PjM by reviewing the obligations and expenditures for each project to ensure projects are being executed to meet the customer’s expectations.
4. Assist the PjM in resolving financial issues.

2.2.4 Contracting Officer

The Contracting Officer (CO) will work closely with the Project Manager and aid in the following activities:
1. Serve as a member of the Project Execution Team.
2. Provide delivery/task order guidance and direction.
3. Support the Project Execution Team by reviewing acquisition strategies and offering feedback on acquisition contract documentation (PID/SOC/SOW/PWS/IGCE, etc.) to ensure proper requirements documentation.
4. Prepare contract documentation and ensure compliance with Federal Acquisition Regulation (FAR) requirements.
5. Review contract proposal submissions.
6. Conduct contract negotiations.
7. Prepare and finalize contract modifications.
8. Issue formal contract(s) and/or contract modifications.
9. Issue contract direction to the supplier throughout the life of the contract.
10. Provide guidance to the XZSM Branch Chief and/or PjM on contractual issues.
11. Review and coordinate contractor Contract Order/Contract Change Requests.
12. Monitor completion of contractual requirements.
13. Review completion of contract closure actions.
14. Issue formal acceptance of products/services and closure of the contract.

2.2.5 Configuration Manager

The Configuration Manager role may be executed by a contract employee working exclusively and independently on behalf of the Government. The Configuration Manager will:
1. Serve as the primary focal point for management and control of configuration items (CIs).
2. Ensure the SIMAF Configuration Management Process is followed across all projects/events executed in the SIMAF.
3. Assist the Government Technical Leads in resolving project-level CM issues.
4. Assist the Technical Expert in resolving product line CM issues.
5. Ensure that Change Requests (CRs) are monitored and controlled.
6. Identify and maintain functional, allocated, and product baselines.
7. Chair the Configuration Control Board (CCB).
8. Conduct CM audits (FCA and PCA).
9. Ensure that SIMAF CIs are under formal configuration management.

2.2.6 Subject Matter Expert(s)

Subject Matter Experts (SMEs) may be drawn from the local SIMAF engineering pool or from outside SIMAF, depending on the type and scope of service required. The SME(s) will:

1. Support the Project Execution Team and serve as the technical authority for a given product capability.
2. Review documentation in the areas of expertise.
3. Oversee development, under the guidance of the Technical Lead, of specific capabilities relative to his/her expertise.

2.2.7 Operations Lead

The Operations Lead manages the team of operational weapon system (blue, red, gray, white) SMEs and intelligence experts involved in the event and will:
1. Coordinate the activities of the operational weapon system and intelligence SMEs.
2. Develop and define requirements from an operational employment perspective in support of the Government and Contractor Technical Leads.
3. Develop Concept of Operations (CONOPS)/Techniques, Tactics, and Procedures (TTP) for the event.
4. Develop event scenarios and vignettes.
5. Develop the Live, Virtual, Constructive (LVC) List.
6. Develop the Network requirements.
7. Develop operational requirements and ensure their implementation into the simulations.
8. Provide event pilot training.
9. Develop use cases.
10. Develop the Information Exchange Requirements (IERs).
11. Develop Department of Defense Architecture Framework (DoDAF) products [Operational Views (OVs) and System Views (SVs)].
12. Develop the Weapon System Requirements Traceability Matrix (RTM).

2.2.8 Architecture Lead

Depending on the type and scope of the project, there may be both hardware and software responsibilities. These responsibilities are therefore categorized separately below and may be allocated to different personnel at the discretion of the Technical Lead:

For software:

1. Provide leadership and understanding of the SIMAF simulation framework (EAAGLES), models, and applications.
2. Provide support and subject matter expertise to all project events related to software development.
3. Review all documentation related to software requirements and design.
4. Review and assess results of code analysis tool(s) to ensure delivered code complies with coding standards and processes.
5. Assess Verification and Validation (V&V) activities to include testing requirements and sufficiency of tests.
6. Oversee SIMAF code(s) release to DoD users. Advise SIMAF Leadership in the determination of requests for SIMAF code through public release and/or to open source software repositories.
7. Facilitate and lead the Change Review Board (CRB) to ensure proposed changes to the software baseline are in compliance with the software architecture.

For hardware:

1. Provide hardware/facility leadership, including supporting management’s understanding of current MS&A hardware capabilities and near-term potential MS&A hardware capabilities and constraints.
2. Develop and maintain a SIMAF hardware and facility architecture.
3. Work closely with the Technical Expert to ensure the hardware architecture aligns with and supports project simulation needs.
4. Oversee the decomposition of requirements into hardware solutions.
5. Perform hardware development activities, prepare schedules, and integrate and coordinate activities.
6. Provide support and subject matter expertise to all project events related to hardware development, procurement, installation, and operation.
7. Approve all hardware procurements to ensure they fit into the current SIMAF hardware/facility architecture.
8. Review all documentation related to hardware requirements and design.
9. Provide low-level schedule tasking and estimates to the Government Technical Lead and implement scheduled activities.
10. Review all CRs to determine if hardware action(s) are required to resolve the problem/deficiency.

2.2.9 Analysis Lead

The Analysis Lead manages the team that works the technical details of the analysis and will:
1. Develop the Study Plan.
2. Develop the Analysis Traceability Matrix (ATM).
3. Assist the Contractor Technical Lead in writing the TAP and defining the Measures of Performance (MOPs)/Measures of Effectiveness (MOEs) needed to answer customer questions.
4. Provide the experimental design, as required.
5. Provide data requirements definition and/or decomposition.
6. Provide model and simulation functionality requirements and/or selection criteria (e.g., definition of “red” and “blue” models, system/subsystems, constructive simulations, etc.) to meet analytic and operational requirements and coordinate this activity with the SAIL.
7. Map event data to the defined MOPs/MOEs, and use data reduction and data analysis tools to answer the measures.
8. Calculate the MOPs/MOEs.
9. Draft analysis findings and conclusions.
10. Prepare the Final Technical Report and brief results.
11. Perform data collection archiving activities.
12. Assess all analysis-related event DRs to determine appropriate action(s) to resolve the problem/deficiency.
13. Assess Verification and Validation (V&V) Reports for adequacy, as required.

2.2.10 Network Lead

The Network Lead coordinates network communications and simulator equipment requirements with the SAIL and other members of the Event Execution Team and will:
1. Design the network architecture.
2. Implement Communications Security (COMSEC) and resolve related issues.
3. Oversee and coordinate network connectivity with other sites.
4. Facilitate the distributed network connectivity approval process.
5. Ensure that latencies and other network error sources are not detrimental to the analysis problems being addressed, or take corrective action to mitigate their impacts.
6. Assess all network/hardware-related event DRs to determine appropriate action(s) to resolve the problem/deficiency.

2.2.11 Information Assurance/Cyber Lead

The IA/Cyber Lead coordinates all activities associated with product compliance with established standards and policies.

3 The SIMAF Master Process (SMP)

The SIMAF Master Process is based on a standard Systems Engineering (SE) process, which has been adapted for a Modeling and Simulation environment. The workflow and processes are organized in the following functional categories: Project Initiation, Analysis Development, Mission Development, Capability Development, and Operational Test & Assessment as indicated in Figure 3. Activities within the functional process categories are summarized as follows:

• Initiate Project: Contains activities and workflow required to define the type and scope of the project and get the project on contract.
• Develop Analytical Design: Contains activities and workflow required to develop the experimental design methodology and metrics based on the customer’s analysis requirements and questions.
• Specify Operational Environment: Contains activities and workflow required to provide the operational context for the analysis and/or test environment. Typical elements of the operational environment are blue and red force composition, laydowns, Concept of Operations (CONOPs) and Techniques, Tactics, and Procedures (TTPs), routes, timing, and weapons pairing. This functional process category also includes use cases based on the weapon systems used in the experiment and weapon system interoperability requirements.
• Develop Capability: Contains activities and workflow required to develop capabilities needed to support the analysis and/or capability build. This process phase also includes development of the individual subsystem components (CSCIs).
• Integrate Battlespace: Contains activities and workflow required to integrate and test the capabilities built during the Develop Capability phase with the operational environment. This process phase also includes testing of the integrated subsystem components.
• Execute/Assess/Deliver Capability: Contains activities and workflow required to validate the relevant environment, perform system-level testing, conduct the analytical experiment, and prepare the products for delivery to the Government. If no analytical experiment is to be performed for the project, then only validation of the models, applications, and relevant environment and system test activities will be performed.
• Deploy/Sustain Capability: Contains activities and workflow to baseline the delivered capability, build the product for the customer from a Government-controlled workspace, deliver/install the product at the customer site, and manage changes to the product baseline.

Figure 3: Project Execution Workflow

Also indicated in Figure 3 are the major reviews and deliverables associated with the workflow activities. Each Technical Review has well-defined entry and exit criteria and is considered successful when all the criteria are satisfied. Each Technical Review shall be event-driven rather than held on a predetermined schedule.

A project shall be ready to conduct a particular review when all applicable entry criteria are satisfied. All deliverables required for a particular review shall be submitted to the Government with sufficient time to review as specified in the contract. Additionally, all deliverables shall be named and organized in a standardized manner across projects.

Technical reviews shall be held at the classification level appropriate for the project. For example, if the project’s effort/artifacts are at the SAP/SAR level, then all technical reviews of that effort and of those artifacts shall be held at the SAP/SAR level. Under no circumstances shall unclassified technical reviews be attempted for classified project material.

Entry criteria are designed to ensure specific actions and progress have been completed so that the program is ready to expend the time and resources to conduct the review.

Exit criteria are designed to review the current status of the program and determine whether sufficient progress has been made to justify moving forward to the next phase of work.

The following subsections provide the activity descriptions, including inputs, associated templates and tools, expected output, Office of Primary Responsibility (OPR), and personnel involved in performing each activity.

3.1 Initiate Project

Initiating the project is primarily a Government activity. The purpose of this activity is to define the project requirements, and establish contractor support.

Figure 3.1 – Initiate Project Overview

3.1.1 Define Project Requirements

The Technical Director and/or a Government representative meet with potential customers to discuss the benefits and cost effectiveness of using the SIMAF System of Systems (SoS) environment to address the customer’s requirements. The Technical Director conducts an initial review of the customer’s Project Introduction Document (PID) or high-level project requirements to determine if the project is feasible given the proposed schedule and requirements. If it is determined to be feasible, the Technical Director works with the Senior Management Team to establish a Project Execution Team. The PjM works with the assigned IPT to document the proposed effort and present the information to the AFLCMC/XZ Business Review Board for New Work for approval based on availability of resources (funding, manpower, equipment, and facilities), acceptability of risk, and stakeholder support. If approved, the customer, Technical Director, Government PjM, and Technical Lead first meet to define high-level customer expectations and requirements. The SIMAF technical SMEs then perform a gap analysis against the customer requirements to determine what capabilities in the SIMAF repository can be reused and what capabilities need to be developed or modified. Risks are then identified and cost estimates developed based on the gap analysis and further planning activities defined in paragraphs 3.1.1.1 through 3.1.1.7.

The project planning activity results in an outline of work scope, cost, and risks and forms the basis for tasks to be specified in the Project Tasking Document (Paragraph 3.1.1.8).

Figure 3.1.1 – Define Project Requirements Activity Flow

3.1.1.1 Establish Project Integrated Product Team (IPT)

Input: Project Commitment/Customer PID (if available)
Available Templates/Tools: N/A
Output: Project Execution Team in place
OPR: Technical Director, Branch Chiefs

Activity Description: Once the customer and SIMAF tentatively commit to the project, the government’s Project Execution Team is formed. The first priorities of the Project Execution Team are to:
1. Establish the customer’s analysis and/or capability requirements
2. Apply the customer’s requirements to existing SIMAF modeling and simulation capability requirements and identify gaps
3. Develop the infrastructure requirements to support analysis and capability development in the form of an Activity Plan
4. Develop the AFLCMC/XZ Business Review Board briefing for the contract acquisition

Once the contract Order has been awarded, the Contractor Program Manager and contractor technical leads are also part of this team. The team ensures that the project stays on track by monitoring project cost, schedule, and performance.

3.1.1.2 Define Customer Requirements

Input: Customer participation in Project meeting(s)
Available Templates/Tools: Project Introduction Document (PID) Template; historical project data
Output: Approved PID
OPR: Customer, Technical Director, Government Technical Lead, Division Chief

Activity Description: The Project Execution Team, led by the SIMAF Government Project Manager and Technical Lead and/or the SIMAF Technical Director, engages in a series of meetings with the customer to review the PID and/or define/refine the following requirements:
1. Overall goal(s), objective(s), and purpose of the project
2. Customer constraints, including estimated funding and estimated project need-by date
3. High-level analytic questions and objectives
4. Project security level and any unique hardware needs
5. Blue, Red, or other Weapon Systems needed to support the analysis
6. High-level Operational Concept, including the high-level scenario/vignette, area of interest, and high-level missions such as Close Air Support (CAS)
7. Project deliverables, including hardware, software, and documentation
8. High-level schedule, including the estimated Statement of Capability (SOC) completion date, the anticipated funding availability date, the project start date, and any other key milestones anticipated at the time of PID development

Using the information collected above, the Project Execution Team works with the customer to draft the Project Introduction Document (PID) if a customer-generated document is not provided to AFLCMC/XZS. Upon mutual agreement, both the customer and the Division Chief sign the PID, which represents the customer requirements baseline. If a section of the PID template is not applicable to the project, it will not be removed from the document, but will be marked as N/A. The completed PID will be provided to the PjM for archival in the appropriate electronic project folder in the AFLCMC/XZS repository. If there are any changes to the scope of the project, the PID must be re-baselined and agreed to by both parties. Note that all initial projects require a PID and that follow-on projects may or may not require a PID as determined by the Technical Director.

3.1.1.3 Perform Gap Analysis

Input: Approved PID
Available Templates/Tools: Software Repositories
Output: High-Level Gaps Identified
OPR: Technical Director, Government Technical Lead, Hardware Architecture Integration Lead (HAIL), SAIL

Activity Description: Using the approved PID as the customer requirements baseline, the Project Execution Team performs a gap analysis. If a PID was not developed for the project, then the Technical Director will prepare and present the project requirements to the Project Execution Team. Key stakeholders in the gap analysis process include the Technical Director, Government Technical Lead, HAIL and SAIL. They map existing SIMAF hardware, software, facility, and network capabilities to project requirements. They review current network and facility diagrams maintained by security to determine if the project will require additions or modifications to the current facility and/or networks. They review the software repositories to determine what current models can be reused or modified to support the event and what new models must be developed. In addition, they review current terrain and/or other databases and analysis tools to determine reuse, modification or new development. A Gap Analysis is required for follow-on projects whether introduced with or without a PID.

3.1.1.4 Develop Statement of Capability (SOC)

Input: Approved PID or requirements document, Gap Analysis Results
Available Templates/Tools: Statement of Capability (SOC) Template, Cost Estimating Form
Output: Approved SOC, SMP tailoring, Risk Management Plan
OPR: Technical Director, Division Chief

Activity Description: The Project Execution Team uses the information from the gap analysis as the basis for determining the resources needed to implement the event and draft a government-only Statement of Capability (SOC). Based upon the scope of customer requirements specified in the PID and capabilities required to be built and/or procured as identified in the Gap Analysis, the Project Execution Team categorizes the project type and performs a recommended project tailoring using the tailoring guidelines as discussed in paragraph 1.5.

The SOC contains information describing the personnel required for the project, cost data estimates (material, travel, etc), facility and infrastructure support required, risk assessments (cost, schedule, and technical), and a Work Breakdown Structure (WBS) or similar description of all tasks that will be assigned, performed, and tracked based on the tailored process.

If the cost of the project is estimated to be greater than customer funding allows, the Technical Director renegotiates project requirements with the customer and re-baselines the PID before updating the SOC. The final draft of the SOC shall be reviewed by the Division Chief for approval and commitment of resources and project execution. Upon mutual agreement, the customer, Technical Director, and Division Chief sign the SOC, which, upon approval, represents the project baseline. If a section is not applicable to the project, it will not be removed from the document, but will be marked as N/A. The approved SOC, signed by the customer, will be provided to the Project Manager for archival in the appropriate electronic project folder on the AFLCMC/XZS SharePoint. If there are any changes to the scope of the project, the SOC must be re-baselined and agreed to by both parties. Based upon the scope of customer requirements specified in the PID and the capabilities required to be built and/or procured as specified in the SOC, the Project Management Team then categorizes the project type and performs a recommended project tailoring using the tailoring guidelines as discussed in paragraph 1.5.

3.1.1.5 Develop Project Study Plan

Input: Approved PID, approved SOC
Available Templates/Tools: Study Plan Template
Output: Study Plan
OPR: Government Analysis Lead

Activity Description: Using the analytic questions and objectives identified in the PID and/or SOC, the Project Execution Team develops the analysis requirements for the project. Key stakeholders in the development of the analysis requirements are the Government Analysis Lead and Government Technical Lead. The project’s analysis requirements are documented in the Project Study Plan. The purpose of the Project Study Plan is to convey high-level guidance and analysis requirements to the contractor regarding the analysis portion of a project. Ideally, the Project Study Plan is developed in conjunction with the V&V Plan to ensure the simulation and modeling capabilities (LVC elements and levels of fidelity) employed or developed for the project are aligned with the analysis objectives for the project. After contract award, the Study Plan will be used by the contractor to provide input into the project’s analysis artifacts (ATM and TAP).

3.1.1.6 Develop Verification & Validation (V&V) Plan

Input: Approved PID and SOC (containing high level capabilities required)
Available Templates/Tools: Validation & Verification Plan template
Output: V&V Plan (Initial)
OPR: Government Technical Lead

Activity Description: The Project Execution Team develops the V&V testing requirements for the project. Key stakeholders in the development of V&V planning are the Analysis Lead and Government Technical Lead. The project’s V&V testing requirements are documented in the V&V Plan. Project V&V requirements are intended to be developed in conjunction with the Study Plan to ensure that 1) adequate design information is available to conduct proper verification testing and 2) the project’s simulation and modeling capabilities (LVC elements and levels of fidelity) represent real-world performance and align with the analysis objectives for the project. The V&V requirements shall be used by the contractor to create the implementation portion of the V&V Plan, which is delivered for review at Project SRR.

Major activities related to V&V test requirements development include:
1. Develop the V&V methodology and issues and document in the V&V Plan Executive Summary (refer to MIL-STD-3022 Appendix B, paragraph B.5 as a guide).
2. Define the problem addressed by models and simulations to be developed for the project and their intended use. This will determine the scope of the V&V effort. Document in the V&V Problem Statement section of the V&V Plan (refer to MIL-STD-3022 Appendix B, paragraph B.6 as a guide).
3. Define the requirements for the models and simulations to be developed for the project, the acceptability criteria, and the qualitative/quantitative metrics used to measure success. Document in the M&S Requirements and Acceptability Criteria section of the V&V Plan (refer to MIL-STD-3022 Appendix B, paragraph B.7 as a guide).
4. Identify any assumptions, limitations, and/or risks associated with the models and simulations to be developed for the project. Document in the M&S Assumptions, Capabilities, Limitations & Risks/Impacts section of the V&V Plan (refer to MIL-STD-3022 Appendix B, paragraph B.8 as a guide).

3.1.1.7 Develop Activity Plan

Input: Approved PID, SOC
Available Templates/Tools: Activity Plan Template
Output: Preliminary Activity Plan
OPR: Government Technical Lead, SIMAF Security Officer

Activity Description: Often there are long lead times associated with procuring, configuring, and/or deconflicting facility resources to support a project. This includes work areas, computational system assets, and personnel clearances. An early assessment of security, facility & infrastructure, and network requirements is critical to the successful execution of a project. These requirements are documented in the Activity Plan for the project, which is maintained by the Security Officer. As soon as practicable after the project requirements have been defined and prior to preparation of the contract order package, the Government Technical Lead conducts a review of the project’s facility, security, and infrastructure requirements with the SIMAF Security Officer and Infrastructure Branch (XZSI) personnel and oversees preparation of the initial Activity Plan for the project. Not all information contained in the Activity Plan as listed below will be available at this point (pre-contract award) in the project. As the project executes and more detailed information becomes available, the Government Technical Lead will refine the Activity Plan as appropriate. After contract award and kickoff, the Government and Contractor Tech Leads will work together to build the Final Activity Plan, which must be completed before any software development or capability build can begin. Examples of information contained in the Activity Plan:

Security requirements:

1. The event classification level that will be used.
2. Identification of all documents that identify security requirements and/or classification guidelines related to the project.
3. Any modifications required to existing security Certification and Accreditation (C&A) documents.
4. Organizations that will be participating in the event, including both local and non-local participants.
5. A list of visitors and approximate dates for visits, and ensure that visitors are aware of procedures for providing clearance verification.
6. The amount of safe (container) space required to store media/information in support of the event.
7. The storage location for the material, identified by safe number.
8. The names of individuals supporting the event who require accounts on the systems.
9. The individuals who are required to complete Information Assurance (IA) Security Training and to ensure that this training is accomplished prior to establishing accounts.

Facility requirements:

1. The areas (vaults/rooms) within the SIMAF where the development and/or event will occur.
2. A list of rooms within the SIMAF that need to be connected.
3. A list of computer hardware required, by system and identification number, and, in addition, any new hardware required.
4. A list of all software required. If new software, indicate whether COTS, GOTS, open source, freeware, shareware, etc. Note: new software requirements, to include cost estimates, should be identified in the SOC.
5. A list of sites that will be participating, if the event is externally distributed.
6. A graphical layout of each SIMAF room configuration required (i.e., what stations are to be used) and the participating external locations if the event is externally distributed.
7. A description of each participating facility to support the graphic layout.

Network requirements:

1. A list of external sites which will connect to the SIMAF network and how they will be connected (DREN/SDREN/Other).
2. A list of hardware, software, and simulation application assets that are to access the network.
3. Which network transport protocol will be used for the passing of simulation data (DIS/HLA/TENA/Other).
4. If non-simulation data such as Voice Comm will be passed and how.
5. How data will be encrypted [Communications Security (COMSEC) keys].
6. A network connectivity diagram.
7. Network services required at each location.
8. Memoranda of Agreement (MOA), which must be approved before connections to sites outside of the SIMAF can occur. The Information Assurance Officer (IAO) initiates all MOAs. An MOA is an agreement between each participant site's Designated Approving Authority (DAA) concerning procedures and items used for protecting the data that is transferred between sites. Although there are suggested formats for data transfer, it is ultimately the DAA for each site who makes the final format decision. Each MOA is refined until both parties are in agreement and sign the document. If agreement cannot be reached, the site will not be a player in the event. A signed MOA is the requirement for establishing connectivity.

3.1.1.8 Develop Project Order Document

Input: Approved PID, SOC, Activity Plan, Study Plan
Available Templates/Tools: Project Tasking Document (SOWA/OA as appropriate) Template
Output: Project Tasking Document
OPR: Government Project Manager

The Government PjM works closely with the project’s Technical Lead to develop the project tasking document. The project tasking document defines the work scope for the project and requires that all tailoring, capability, analysis, security, and infrastructure requirements have been developed per the preceding sections of the SMP.

[Note: The project tasking document can be a SOWA/OA (as appropriate), or a different document type (PWS, SOO, etc.) specified on a future contract. It is not within the scope of the SMP to duplicate in detail specifics of a particular contract vehicle, but rather discuss the intent in broad terms; for example in this case how SIMAF conveys project tasking to the contractor. This allows separation of the SMP from the contract vehicle while still allowing the intent of the activity to be included with the overall project process.]

3.1.2 Establish Contractor Support

The project’s PjM assists contracting in processing a contract Order to procure contractor support for the effort. After the Contract Order is awarded, the PjM calls a Kick-Off meeting to ensure that both the Government and the contractor have a thorough understanding of project requirements in accordance with the project tasking document. The contractor then documents his management strategy in the Project Management Plan (PMP). The PjM monitors the project throughout its life cycle.

Figure 3.1.2 – Establish Contractor Support Activity Flow

3.1.2.1 Process Contract Order

Upon PID/SOC approval, the PjM works with FM to procure customer funding and prepare the Military Interdepartmental Purchase Request (MIPR), Purchase Request, or other funding documents as required. The PjM works with the Contracting Office to prepare an Order Package. The completed package is coordinated with XZ per guidance in effect at the time of project execution and then submitted to the contractor(s) for bid; the contractor(s), in turn, respond to the Government with proposals. The Evaluation Team assesses the contractor proposal(s) against predefined evaluation criteria and selects the contractor(s) that best meet(s) these criteria. The team documents the rationale for contractor selection and provides it to the Contracting Office. The Contracting Office notifies the contractor(s) of the contract order award and funds the contract order. Note that multiple contract orders are awarded over the duration of the contract and become part of the approved contract baseline.

Figure 3.1.2.1 – Process Contract Order Activity Flow

3.1.2.1.1 Coordinate Contract Order Documents

Input: Approved PID/SOC, requirements document, MIPR/funding document
Available Templates/Tools: SOW/SOO/PWS, IGCE, BRB briefing, DD-254, CDRL listing
Output: Contract Order Documents
OPR: PjM, Project Execution Team

Activity Description: Using the information in the PID/SOC, the PjM and Project Execution Team prepare a package containing all the contractual documents required by the contract vehicle in place at the time of the project and work with the Contracting Officer (CO) to get it finalized. Examples of documents required for contract execution include:
1. Acquisition Plan
2. Justification and Approval Document (if required)
3. Purchase Request/Availability of Funds Letter/MIPR
4. COR Self-Nomination (for services contracts)
5. Approved Requirement Approval Document (RAD) or Waiver (for services contracts)
6. Quality Assurance Surveillance Plan (QASP) (for services contracts)
7. Independent Government Cost Estimate (IGCE)
8. Government Furnished Property/Equipment/Information List and Determination and Finding – FAR 45.301
9. DD-254 Contract Security Classification Specification
10. AFLCMC/XZ Business Review Board Approval

3.1.2.1.2 Release RFP/FOPR package

Input: Approved contract order documents
Available Templates/Tools: Sample project tasking document; RFP, IGCE, DD-254, CDRL listing
Output: RFP letter and project order documents to contractors
OPR: PjM, Contracting Officer

Activity Description: Once the contract order documents are approved and funding is provided to finance, the contracting officer can release the RFP or Fair Opportunity Proposal Request (FOPR) package. The RFP/FOPR package includes the RFP letter, the project tasking document, Evaluation criteria, Period of Performance, and the Solicitation DD-254. Contractors submit their proposals to the contracting officer, who forwards them to the PjM for evaluation.

3.1.2.1.3 Proposal Technical Evaluation

Input: Contractor proposals
Available Templates/Tools: Technical Evaluation Guide/Template; Sample Evaluation Input Sheet, Non-Disclosure Agreement (NDA)
Output: Completed Technical Evaluation
OPR: PjM, Technical Evaluation Team

Activity Description: Prior to receipt of proposals, the Branch Chiefs will assign evaluation team members in sufficient number and skillset breadth to support a timely evaluation. Evaluation team members must sign a Non-Disclosure Agreement (NDA). The PjM assembles the team members to provide evaluation training/refresher, the evaluation timeline, and expectations, and ensures members have the RFP/FOPR package specific to the task. The PjM receives proposal information from the contracting officer, reviews it, and provides name-redacted versions to evaluation team members. Using the Technical Evaluation Guide/Template, team members review the proposals and prepare an evaluation sheet for each proposal. Individual evaluations are provided to the PjM for consolidation and review with the IPT. The PjM consolidates all inputs, provides feedback as necessary, then conducts an evaluation team meeting to review final recommendations. This information is captured in the Technical Evaluation Memorandum with supporting rationale and provided to the Project Management and Integration Branch Chief for final review and forwarded to the Division Chief for final approval. Once complete, the Technical Evaluation package is sent to the Contracting Officer.

3.1.2.1.5 Contract Order Award

Input: Order Award Memorandum for Record (MFR)
Available Templates/Tools: N/A
Output: Awarded Contract Order
OPR: Contracting Officer

Activity Description: Upon receipt of the Technical Evaluation Memorandum for Record, the Contracting Officer reviews the Technical Evaluation to ensure that the evaluation contains the required information and is complete. The Contracting Officer begins negotiations with the contractor and, if needed, works with the PjM for clarification/additional details to support negotiations. Upon successful negotiation, the Contracting Officer issues a formal notice of award to the Contractor and notifies the PjM.

3.1.2.2 Conduct Kick-Off Meeting

Input: Contract Award
Available Templates/Tools: Kick-Off Meeting Agenda Template, Draft Activity Plan
Output: Kick-off Meeting schedule, Meeting minutes and Action Items
OPR: PjM, Government Technical Lead, Contractor PM, Contractor Technical Lead

Activity Description: As soon as practical after Order award, the Government and contractor team leads meet to coordinate the schedule for the kick-off meeting. The Government provides the Kickoff chart template and any GFI/GFE required to support the Kickoff meeting. The Government also answers any outstanding questions the contractor may have, and coordinates/facilitates personnel in-processing, building access, clearances, and MIS support as required.

At a time specified in the contract, the Government PjM holds a kick-off meeting with the contractor to introduce team members and ensure that both the contractor and the Government fully understand and agree on the project requirements. The contractor and the Government review the project tasking document and determine what requirements remain intact and what requirements, if any, may need to be dropped or adjusted given time constraints, etc. The team reviews the CDRL, draft IMS, milestone review, and event or demo due dates to determine if any adjustments need to be made. The Government PjM takes meeting minutes and creates a list of any Action Items that have been assigned as a result of the meeting. The meeting minutes are provided to the participants for review and comment. Approved meeting minutes are filed in the designated project folder.

3.1.2.6 Monitor Project Execution

Input: Successful Kick-Off Meeting; Approved PMP, Approved IMS
Available Templates/Tools: CPAR Template
Output: Project Status
OPR: Government PjM; Project Execution Team

Activity Description: The Government PjM monitors the project during all phases of the SIMAF Master Process and conducts the following general activities:
1. Participates in weekly project status meetings.
2. Reviews the Monthly Status Reports (MSRs) submitted by the contractor to ensure that any changes in schedule have been identified and approved by the Government and that any new project risks have been identified.
3. Reviews risks and risk mitigation efforts.
4. Reviews the project’s Integrated Master Schedule (IMS) to ensure that the project milestone reviews occur on schedule.
5. Reviews the monthly Financial Reports submitted by the contractor to ensure that the Spend Plan is on track.

3.2 Develop Analytical Design

The purpose of this activity is to capture all requirements associated with establishing the Modeling & Simulation (M&S) environment needed for the project. To accomplish this goal, the following elements must be defined in detail:
1. Analysis Requirements
2. Experiment Design
3. Support Requirements (security, facility, networks)
4. Operations Concept
5. Mission Use Cases
6. Simulated Weapon System Functional Requirements
7. Simulated Weapon System Interoperability Requirements

In order to ensure that all elements are defined to a sufficient level of detail and in a timely manner, the following milestone reviews are conducted during the Specify Battlespace phase, as appropriate:
1. Analysis Requirements Review (ARR)
2. Operational Requirements Review (ORR)

Figure 3.2 – Develop Analytical Design Overview

3.2.1 Define Analysis Requirements

The Analysis Lead reviews the problem statement, the study purpose, and the customer questions as documented in the PID and/or Study Plan if one was developed for the project; develops the associated analytic elements required to support the analysis; and documents this information in the Analysis Traceability Matrix (ATM) for the project.

Figure 3.2.1 – Design Analysis Requirements Activity Flow

3.2.1.1 Develop Analysis Objectives

Input: Approved Study Plan
Available Templates/Tools: Analysis Traceability Matrix (ATM) Template
Output: Defined Analysis Objectives
OPR: Contractor and Government Analysis Leads

Activity Description: The Analysis Team decomposes and allocates the project analysis requirements/analysis questions to analysis objectives, which are normally described within the context of a mission type. If analysis objectives were defined in the project Study Plan, then the analysis objectives documented in the ATM shall be consistent with the project Study Plan.

3.2.1.2 Define Measures of Effectiveness & Performance (MOEs/MOPs)

Input: Defined Analysis Objectives
Available Templates/Tools: Analysis Traceability Matrix (ATM) Template
Output: Defined MOPs/MOEs
OPR: Analysis Lead

Activity Description: Once the analytic objectives are defined, the team develops Measures of Effectiveness (MOEs) to support each objective and Measures of Performance (MOPs), if required.

An MOE is defined as a qualitative or quantitative measure of the performance of a model or simulation or a characteristic that indicates the degree to which it performs the task or meets an operational objective or requirement under specified conditions. Simulation MOEs typically involve lethality and survivability metrics and are expressed in terms of a ratio; for example, % of targets detected, % of targets killed, and % of weapons expended against a target.

MOEs are often supported by one or more MOPs. Specifically related to Modeling and Simulation (M&S), an MOP is defined as a quantitative measure of how the system performs its functions in a given environment and is closely related to inherent parameters of the system such as physical performance and structural characteristics. For example, for a data-link system, typical MOPs are message transfer rate, frame rate, and bandwidth.

If MOEs/MOPs were specified in a Study Plan for the project, the project’s MOEs/MOPs defined in the ATM shall be consistent with the Study Plan.
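
As a purely illustrative aid (the field names and numbers below are hypothetical and are not drawn from any SIMAF project or data reduction tool), the following Python sketch shows how ratio-style MOEs such as percent of targets detected might be computed from reduced event data. A companion MOP, by contrast, would be computed from system-level quantities such as message transfer rate rather than mission outcomes.

    # Illustrative sketch only: field names and values are hypothetical,
    # not taken from any SIMAF project or analysis tool.

    def percent(numerator, denominator):
        """Return a percentage, guarding against division by zero."""
        return 100.0 * numerator / denominator if denominator else 0.0

    # Hypothetical per-run counts, e.g., reduced from a data recorder log.
    run_data = {
        "targets_presented": 24,
        "targets_detected": 21,
        "targets_killed": 17,
        "weapons_expended": 30,
    }

    moes = {
        "% targets detected": percent(run_data["targets_detected"], run_data["targets_presented"]),
        "% targets killed": percent(run_data["targets_killed"], run_data["targets_presented"]),
    }
    weapons_per_kill = (run_data["weapons_expended"] / run_data["targets_killed"]
                        if run_data["targets_killed"] else float("inf"))

    for name, value in moes.items():
        print(f"{name}: {value:.1f}")
    print(f"weapons expended per target killed: {weapons_per_kill:.2f}")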

3.2.1.3 Define Data Elements

Input: Defined MOEs/MOPs
Available Templates/Tools: Analysis Traceability Matrix (ATM) Template
Output: Defined Data Elements
OPR: Analysis Lead

Activity Description: The Analysis Lead and SMEs analyze the MOEs/MOPs to determine what types of data must be collected to calculate the measure. Possible data types include such items as number of targets in the scenario, number of weapons on the aircraft, number of operationally relevant messages received and/or sent, number of truth tracks, etc.

3.2.1.4 Produce Analysis Traceability Matrix (ATM)

Input: Analysis Requirements listed above
Available Templates/Tools: Analysis Traceability Matrix (ATM) Template
Output: Completed ATM
OPR: Analysis Lead

Activity Description: The purpose of the ATM is to document the analysis requirements for the project and to provide traceability from the customer questions to the actual weapon system performing the required functionality to support the analysis study. Note that, depending on the project’s particular security requirements (scenario and/or system pairings, etc.), this information may become classified and must be secured appropriately. The Analysis Lead or one of the analysis SMEs enters the analysis information into the project’s ATM.

3.2.2 Define Experiment

The Analysis Team designs the experiment that will be used to support the analysis study. It begins with a high-level description that provides the overall context for the assessment. The Contractor Technical Lead and Analysis Lead then work together to develop the Live-Virtual-Constructive (LVC) list. The LVC list describes the weapon systems that will be used in the study and indicates if the systems will be live, virtual, or constructive. The Contractor Technical Lead uses the LVC List to develop a Software Concept for the project, which specifies what models and/or applications can be used as is, must be modified, or developed. This information forms the basis for the detailed software development schedule and supports the project Integrated Master Schedule (IMS). The high-level assessment description, LVC List, and Software Concept are documented in the TAP.

Figure 3.2.2 – Define Experiment Activity Flow

3.2.2.1 Develop High-Level Assessment Description

Input: Study Plan, PID/SOC, ATM
Available Templates/Tools: TAP – Main Body Template
Output: Completed TAP (main body only)
OPR: Analysis Lead

Activity Description: Using information in the PID/SOC, Study Plan, and ATM, the Analysis Team develops a high-level description of the assessment to be conducted. This includes the following information:
1. A description of the problem being addressed by the study.
2. A description of the purpose of the study.
3. Pertinent background information concerning the study, including why it is important, who will use it, and how it will be used.
4. The primary objectives of the study, which trace back to the customer’s questions documented in the PID and the analytical objectives documented in the ATM.
5. A high-level description of the scenario and Area of Interest (AOI) that will be used, including the blue players and red threats that will be simulated. Note that, depending on the project’s security requirements, the scenario chosen and the blue and red systems selected, this information may become classified and shall be appropriately protected as such.
6. A high-level description of the operational context for the assessment, including a brief description of the types of missions that will be accomplished; for example, Close Air Support (CAS) and Time Sensitive Targeting (TST).
7. A schedule for the event that supports the assessment, including the event start and end dates, and the activities that will occur on each day.
8. A list of the Points of Contact (POCs) for the event, including the role (example: Event Director), the name of the person serving in the role, his phone number, and e-mail address.

If a Study Plan was developed for the project, the project’s high-level assessment defined in the TAP shall be consistent with the Study Plan.

3.2.2.2 Develop Experimental Design Method

Input: PID, SOC; Study Plan; ATM
Available Templates/Tools: TAP Template
Output: Updated Study Plan
OPR: Analysis Lead

Activity Description: The Analysis Lead and the analysis SMEs draw on the analysis information captured to date to design the experiment method that supports the analysis. The goal is to answer the analysis question(s) both effectively and efficiently. The Design of Experiments (DOE) methodology combines the tenets of sound statistical practice, combat modeling, and the design of meaningful experiments. The experiment design method is a structured, organized approach to the study that involves designing a set of experiments. A well-designed experiment will define the variables that are to be controlled, called control factors, the levels to which the control factors are to be set, and the responses to collect. In addition, the design should include any variables that might influence the responses, but may not be able to be controlled effectively, called nuisance variables. When the results of these experiments are analyzed, they help to identify optimal conditions, the factors that most influenced the results, and those that did not, as well as other details such as the existence of any interactions and synergies between factors. If a Study Plan was developed for the project, the project’s DOE methodology defined in the TAP shall be consistent with the Study Plan.

The team also develops a Design Matrix, which lists the design cases that will be run. A design case is defined as the set of values that each control factor will have during a particular run instance. A design or case matrix will be developed for the basic experiment that systematically varies all of the control factors through all of the chosen factor levels. The case matrix will define which factors to control, to what levels the factors are controlled, and the responses to collect. Note that a complete (or full) factorial design is a design containing every possible combination of the levels of all factors. Running each and every case is not always practical or cost/schedule efficient, especially for live-virtual tests. In a fractional or mixed-level factorial design, a select subset of all possible combinations of the levels of all factors is used to create runs that accentuate the main effects and interactions of significance.

A series of run matrices will be developed, using the case matrix as its basis, to meet the analytic requirements of the study. The run matrix defines which case to execute, and the values of any nuisance variables that will be varied.
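
To make the full versus fractional factorial distinction concrete, the following Python sketch (the factor names and levels are invented for illustration and do not come from any SIMAF study or tool) enumerates a full factorial case matrix and then draws a simple subset; an actual fractional design would be constructed deliberately so that main effects and the interactions of interest remain estimable.

    # Illustrative sketch: the factors and levels are hypothetical examples.
    from itertools import product

    # Control factors and the levels to which each will be set.
    factors = {
        "sensor_mode": ["baseline", "upgraded"],
        "threat_density": ["low", "medium", "high"],
        "weather": ["clear", "degraded"],
    }

    # Full factorial case matrix: every combination of factor levels.
    names = list(factors)
    case_matrix = [dict(zip(names, combo)) for combo in product(*factors.values())]
    print(f"Full factorial case matrix: {len(case_matrix)} cases")  # 2 x 3 x 2 = 12

    # A naive fractional subset (every other case), shown only to illustrate
    # running fewer cases; real fractional designs are chosen to preserve
    # main effects and significant interactions.
    fractional_subset = case_matrix[::2]
    print(f"Fractional subset: {len(fractional_subset)} cases")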

3.2.2.3 Identify Data Source

Input: Defined Data Elements, LVC List
Available Templates/Tools: Analysis Traceability Matrix (ATM) Template
Output: Defined Data Sources
OPR: Analysis Lead

Activity Description: Using information from the ATM (Data Elements) and the LVC List, the Analysis Lead and SMEs further refine the information in the ATM. They identify the specific data source, such as a specific platform, human operator, data link, weapon, etc., that will provide the required data element for the event. Data sources shall be linked to the Requirements Traceability Matrix (RTM), where the functional requirements for each system are further defined. Mapping data sources in the ATM to RTM requirements is one way the linkage between the analysis and capability development threads in the SMP is maintained.

3.2.2.4 Define Data Collection Plan and Analysis Methodology

Input: Defined Data Sources
Available Templates/Tools: Analysis Traceability Matrix (ATM) Template
Output: Defined Data Collection Plan and Analysis Methodology
OPR: Analysis Lead, Contractor Technical Lead, and Software Architecture Lead

Activity Description: The Analysis Lead and the analysis SMEs, working with the Contractor Technical Lead and the Software Architecture Lead, define the data collection plan and the analysis methodology, which include the following information:
1. Data Source - name of the entity that will provide the data (e.g., a pilot, platform, data link, etc.).
2. Data Collection Method - name of the tool that is used to collect the required information (e.g., Pilot Survey, RedSim, TabLogger, Data Recorder, etc.).
3. Instrumentation Requirement - refers to any data recording functionality that must be developed to support the analysis and the name of the tool. This includes modifications to existing tools and the development of new analysis tools.
4. Analysis Methodology - a description of how the data will be assessed to support calculation of the MOEs/MOPs.
5. Data Product - the product type that will be used to show the data assessment results (barchart, timeline chart, etc.).

3.2.2.5 Produce Technical Assessment Plan (TAP)

Input: Analytical Elements depicted in the table below
Available Templates/Tools: TAP Template
Output: Completed TAP
OPR: Analysis Lead, Operations Lead

Activity Description: The TAP serves as the basis for the modeling and simulation experiment. It defines the resources, simulation environment, experiment design method, and use cases that will be executed during the event. In cases where a Study Plan has been developed for the project, some sections of the TAP may be replaced by, or refer to, the Study Plan as coordinated during the project tailoring activities prior to contract award.

The table below depicts a suggested delivery format for the TAP by arranging the various analytical elements into appendices, and maps each analytical element to the milestone review at which it is due. Note that the TAP main body is due at ARR and that several appendices contain more than one part; in that case, each part is due at a different milestone review.

Appendix | Title | Milestone Review
A-1 | Analysis Plan | ARR
A-2 | Analysis Traceability Matrix (ATM) | ARR
B | LVC List | Draft at ARR; Final at ORR
C | Software Concept | ARR
D | Activity Plan – Included in the TAP by reference only. The Activity Plan is maintained by SIMAF security. See paragraph 3.1.1.6. | Kickoff (updates as req)
E | Integration Plan | ORR
E-1 | Mission Use Cases | ORR
E-2 | DoDAF Architecture Products | ORR
E-2.1 | Operational Viewpoints (OVs) | ORR
E-2.2 | System Viewpoints (SVs) | ORR
E-3 | Requirements Traceability Matrix (RTM) | Multiple (updates as req)
F | Event Execution Products [Run Matrix; Time Ordered Event List (TOEL)] | ORR


3.2.3 Conduct Analysis Requirements Review (ARR)

Figure 3.2.3 – Conduct Analysis Requirements Review (ARR) Activity Flow

Input:
1. TAP, Main Body
2. ATM
Available Templates/Tools: ARR Briefing Template
Output: Successful ARR and/or Corrective Action Plan
OPR: Government Technical Lead

Activity Description: An Analysis Requirements Review (ARR) is conducted at the conclusion of the preceding activities to ensure that:
1. The analysis objectives in the ATM are traceable to the analysis requirements in the PID/SOC and have been defined in sufficient detail for the operations SMEs to develop the operational concept for the project.
2. The ATM and the TAP Main Body high-level assessment description contain the required level of detail.
3. The Data Collection Plan and Analysis Methodology are clearly defined and documented in the ATM.

The documentation listed above as input shall be delivered to the Government in preparation for the meeting. The contractor team prepares the ARR briefing using the template provided by the Government. The briefing template includes a top-level review of the products listed above and provides a list of success criteria for each product as well as overall success criteria for the review itself. The Government expects the contractor to present any unresolved issues along with a mitigation plan for each.

Required participants in the ARR include the PjM, Technical Lead, Contractor Technical Lead, Operations Lead, Analysis Lead, Network Lead, Architecture Lead, Security Officer, and IA/Cyber Lead. Optional participants include the Technical Director, Division Chief, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed with the next stage of the project, which is to define the operational concept.

If all of the success criteria for both the products and the ARR itself are met, the ARR is declared successful. If all success criteria are not met, the Government Technical Lead determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the project needs to be re-scoped, the documentation listed above and the PID/SOC must be re-baselined.

The Government PjM takes minutes at the review to provide the official record of decisions made and any resulting action items and disseminates them to the ARR participants for review and comment. The Government Technical Lead is the final approval authority for the minutes and is responsible for providing the approved minutes to the PjM for filing in the official project folder on the AFLCMC/XZS SharePoint.

Exit Criteria:
1. ATM approved
2. TAP (Main Body) approved
3. Corrective Action Plan, if all items are not complete

3.3 Specify Operational Environment


Figure 3.3 – Specify Operational Environment Overview

3.3.1 Develop Live, Virtual, Constructive (LVC) List

Input: Approved PID, SOC, Study Plan, ATM
Available Templates/Tools: LVC List Template
Output: Draft LVC List
OPR: Government Technical Lead

Activity Description: The Contractor Technical Lead reviews the information provided in the PID, SOW, Study Plan, and preliminary ATM as the starting point for the development of a comprehensive list of weapon systems and subsystems (e.g., specific platform, data link, etc.) that are required for the analysis event. The LVC List includes detailed information about each system, such as the items below (a notional record sketch follows the list):
• Whether it is a threat (red) system, a blue system, or other system
• What type of system it is – aircraft, missile, etc.
• The number of players that will be used in the scenario
• Whether the weapon system/subsystem will be represented as live, virtual, or constructive in the simulation
• If the event is externally distributed or uses externally provided systems, the source of the system
• Whether the weapon system can be reused, needs to be modified, or must be developed. If the weapon system is reused or modified, then a reference to the appropriate high level/functional analysis requirement shall be established and presented at ARR. If the weapon system is being developed for the project, then a new high level/functional requirement must be established and presented at ARR.
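
The sketch below shows one way a single LVC List entry could be represented in code; the field names, enumerations, and example values are assumptions made for illustration and do not reflect the actual LVC List template.

    #include <string>

    // Illustrative only: one row of an LVC List.
    enum class Force { Red, Blue, Other };
    enum class Representation { Live, Virtual, Constructive };
    enum class DevStatus { Reused, Modified, NewDevelopment };

    struct LvcEntry {
        std::string systemName;        // specific platform, data link, etc.
        std::string systemType;        // aircraft, missile, etc.
        Force force;                   // threat (red), blue, or other
        int playerCount;               // number of players in the scenario
        Representation representation; // live, virtual, or constructive
        std::string externalSource;    // providing site/system, if externally provided
        DevStatus status;              // reused, modified, or new development
        std::string requirementRef;    // high level/functional requirement presented at ARR
    };

    int main() {
        // Example entry (values are notional).
        LvcEntry entry{"F-16C", "aircraft", Force::Blue, 4,
                       Representation::Virtual, "", DevStatus::Reused, "REQ-LVC-001"};
        (void)entry;
        return 0;
    }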


This information is due at the Analysis Requirements Review (ARR). Further information, such as asset location, simulation system, and call signs, is due at the Operational Requirements Review (ORR). The completed LVC List and all required information are also due at ORR.

3.3.2 Develop Integration Plan

Input: Approved LVC List
Available Templates/Tools: Integration Plan Template
Output: Integration Plan, Final LVC List
OPR: OPS Lead

Activity Description: The Integration Plan documents the high-level scenarios [e.g., Offensive Counter Air (OCA); Close Air Support (CAS); Defensive Counter Air (DCA); Suppression of Enemy Air Defense (SEAD) and Battlespace Interdiction (BAI)] that are required to support the analysis.

The Operations Lead and Operations SMEs review the high level operational description provided in the Study Plan, which includes the geographical region of interest and the missions to be carried out in the area of interest. The LVC List is also completed at this time.

The Operations SMEs further develop the Integration Plan to include the following information:
1. The Red Force (enemy) composition, including the number and type of aircraft and ground threats as well as the geographical locations (laydowns) of both the airborne and Integrated Air Defense System (IADS) threats. Note that, depending on a project's security requirements, the scenario chosen, and the blue and red systems selected, this description may become classified and must be protected as such.
2. The Blue Force composition, including the number and type of aircraft and ground support vehicles, bases, etc., as well as the geographical locations (laydowns) of both the airborne and ground assets.
3. The Blue and Red force mission routes, including the identification and mapping of initial coordination lines and zones, and the selection of airspace descriptions and start points. A routing tool should be used to develop and plot this information.

3.3.3 Define Mission Use Cases

In general, the use case technique is used to capture a system's behavioral requirements in a given scenario. A use case can be conceptually described as an ordered set of related activities grouped according to function or desired objective. The following generic mission use cases describing fundamental functionality for most systems have been developed by SIMAF:
• Command and Control (C2)
• Collection
• Communications
• Engagement
• Movement


The mission use cases are decomposed into specific mission activities (e.g., for C2: Plan Mission, Develop Targets, etc.). The mission activities are further refined into sub-activities (e.g., for Plan Mission: Plan Collection, Plan effects, Develop Movement Plan). The Mission Use Cases are documented in a series of Department of Defense Architecture Framework (DoDAF) products. The contractor will propose, prepare, and submit an appropriate set of DoDAF 2.0 Viewpoints based on project requirements. The required DoDAF viewpoints may also be specified in the PWS.

3.3.3.1 Decompose Integration Plan into Mission Use Cases

Input: Integration Plan
Available Templates/Tools: Mission Use Case Template
Output: Mission Use Cases
OPR: Operations Lead

Activity Description: To define the mission use cases for the project, the Operations SMEs execute the following steps:
1. Review the high-level mission information provided in the Integration Plan.
2. Select the mission use case templates that apply to the project from the standard set of generic mission use case templates developed by SIMAF. This ensures that the mission information they produce for the project takes all factors into consideration and, therefore, has no gaps.
3. Define the Tactics, Techniques and Procedures (TTPs), which describe how the forces will be employed in the mission as well as the roles and tasks of each weapon system involved in the mission and the weapon pairing of the assets (for example, in a fighter sweep, laser against fighter).
4. Decompose the mission into the following mission phases - setup, ingress, engagement, egress, post egress - as appropriate.
5. Decompose each mission phase into mission tasks. Each mission task represents a weapon system use case. For example, the setup mission phase might have 3 mission tasks - early setup; establish communication; build Situational Awareness (SA).
6. Apply the Find, Fix, Track, Target, Engage, Assess (F2T2EA) structure to each mission use case.
7. Document the mission use cases.

3.3.3.2 Produce DoDAF Viewpoints

Input: Mission Use Cases
Available Templates/Tools: Department of Defense Architecture Framework (DoDAF) Version 2.0
Output: DoDAF Viewpoints
OPR: Operations Lead

Activity Description: Using the information documented in the Integration Plan, the Operations SMEs develop DoDAF Viewpoints (e.g., OVs, SVs, SvcVs, DIVs, etc.).


DoDAF-described Models in the Operational Viewpoint describe the tasks and activities, operational elements, and resource flow exchanges required to conduct operations.

The Operations SMEs represent the weapon system interoperability in DoDAF SV products. The DoDAF SVs in particular provide the basis for the interoperability requirements developed in paragraph 3.4.1 Define System/Subsystem Requirements.

The DoDAF-described models within the Systems Viewpoint describe the systems and interconnections provided for, or supporting, the mission use cases. The System Viewpoints associate system resources with the operational and capability requirements. These system resources support the operational activities and facilitate the exchange of information.

Note that, depending on project-specific needs and scope, not all DoDAF Viewpoints described in DoDAF 2.0 may apply. Guidance for which DoDAF Viewpoints apply to a specific project will be given in the PID, SOC, Study Plan, and/or PWS.

3.3.4 Conduct Operational Requirements Review (ORR)

Figure 3.3.4 – Conduct Operational Requirements Review (ORR)


Input:
1. LVC List
2. Integration Plan
3. Mission Use Cases
4. DoDAF Products
Available Templates/Tools: ORR Briefing Template
Output: Successful ORR and/or Corrective Action Plan
OPR: Government Technical Lead

Activity Description: An ORR is conducted at the conclusion of the preceding activities to ensure that:
1. The Experiment Design is complete and documented.
2. The Integration Plan is clearly defined and documented.
3. The Mission Use Cases are clearly defined and documented.
4. The DoDAF Viewpoints that support the Mission Use Cases are documented.

The intent is to demonstrate operational linkage with the ATM by providing the operational context necessary to satisfy the mission needs in support of collecting and analyzing the ATM requirements. The ORR is intended to provide the developers insight into how the experiment will be conducted (as discussed in the main body of the TAP and subsequent appendices) so that the software development teams have the operational context when designing, building, and testing their configuration items.

The documentation listed above as input shall be delivered to the Government in preparation for the meeting. The contractor team prepares the ORR briefing using the template provided by the Government. The briefing template provides a list of success criteria for each product as well as the overall success criteria for the review itself. The Government expects the contractor to present any unresolved issues along with a mitigation plan for each.

Required participants in the ORR include the Government Technical Lead, Contractor Technical Lead, Operations Lead, and the Analysis Lead. Optional participants include the Technical Director, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed to the next stage of the project.

If all of the success criteria for both the products and the ORR itself are met, the ORR is declared successful. If all success criteria are not met, the Government Technical Lead determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the event needs to be re-scoped, the documentation listed above and the PID/SOC must be rebaselined.

The Government takes minutes at the review to provide the official record of decisions made and any resulting action items and disseminates them to the ORR participants for approval. Upon approval of the minutes, the Government Technical Lead provides them to the Government PjM to be filed in the official project folder on the AFLCMC/XZS SharePoint.


Exit Criteria:

Approval of the following:
1. Integration Plan
2. Mission Use Cases
3. DoDAF Viewpoints
4. LVC List
5. Corrective Action Plan, if all items are not complete

3.4 Develop Capability

The activities within the Develop Capability process set apply classic systems engineering discipline to the specification, development, and test of models, applications, and simulations for the project. Included in the Develop Capability project phase are:

1. Development of system requirements for the overall capability, subsystem requirements for each Configuration Item (CI), interface requirements for interactions and exchange of data among CIs, and software requirements that drive the design and implementation approach
2. From the requirements, development of design and code for the models, databases, and interfaces that comprise a capability
3. Development and execution of test plans, descriptions, and results to demonstrate how the delivered product meets the requirements
4. Development of user instructions to build, install, and/or operate the delivered capability

The contractor shall follow best engineering practices while defining, designing, coding, and testing each CI. These best practices (e.g. peer reviews, walkthroughs, and quality reviews) shall be applied as appropriate for the level and complexity of software being developed. While the SMP does not explicitly require delivery of “best practice” artifacts, the Government Technical Lead is encouraged to participate in these internal reviews, and as a minimum, ensure they are happening.

In order to ensure that each major activity within this phase occurs in a timely manner and produces the correct information at a sufficient level of detail, the following milestone reviews are conducted:

1. System Requirements Review (SRR)
2. Software Specification Review (SSR)
3. Software Design Review (SDR)
4. Integration Test Readiness Review (ITRR)


Figure 3.4 – Develop Capability Overview

3.4.1 Define System/Subsystem Requirements

Input: PID, SOC, SOW/PWS, TAP (Mission Use Cases, DoDAF Viewpoints)
Available Templates/Tools: System/Subsystem Specification (SSS) Data Item Description (DID) (DI-IPSC-81431A), Interface Requirements Specification (IRS) DID (DI-IPSC-81434A), Requirements Traceability Matrix (RTM) Template
Output: System/Subsystem Requirements
OPR: Contractor Technical Lead

Activity Description: The SOW/PWS, PID, SOC, and TAP (Mission Use Cases/DoDAF Viewpoints) provide the basis for defining the system and/or subsystem requirements for the project's capability development. The system/subsystem level requirements specify what capabilities the system must have to support the analysis and/or test objectives. The system/subsystem requirements for the capability being developed are documented in the project's System/Subsystem Specification (SSS). The types of requirements defined in the SSS for the project's capabilities include functional requirements, non-functional requirements (e.g., performance and quality requirements), interface (external and internal) requirements, and interoperability requirements.

Regarding external interfaces, the Mission Use Cases and DoDAF Viewpoint(s) developed during specification of the operational environment are reviewed to determine what capabilities each simulated system must have to support the missions described in the use cases and DoDAF artifacts. Information in the Mission Use Cases and DoDAF Viewpoints is very useful in defining the interfaces between the systems/subsystems and the environment, as well as the external interfaces and interoperability requirements among the systems themselves. An example of an external interface between the system and environment would be the ability for an aircraft sensor model to consider the effects of weather or signal density on emissions and returns. An example of an external interface between systems would be the ability for the system to transmit/receive specific messages for the purpose of conducting mission operations. External interfaces can be defined either in the SSS for the project or in a separate Interface Requirements Specification (IRS) if the interface requirements are extensive or complex.

Internal interfaces are defined within the context of the capability being developed. For example, if the system is an aircraft, then internal interfaces could be defined among the component subsystems such as the sensor, fire control, and weapon subsystems. If the system is a sensor, then internal interfaces could exist among the receiver, transmitter, and antenna components. Internal interface requirements are normally documented in the SSS. The documentation strategy for system-level interface requirements should be specified in the contract tasking document.

As each system/subsystem requirement is defined, it shall be tagged with information to further categorize and describe it. The requirement shall be tagged with its type (functional, performance, interface, etc.) and with a category from the lexicon below. The following categories serve as a guideline to develop detailed functional requirements and ensure that all possible system functionality is taken into consideration (a notional tagging sketch follows the list):

1. Command and Control (C2)
2. Fly and Maneuver (FM)
3. Communicate (CM)
4. Understand, Predict, React (UP)
5. Sense and Detect (SD)
6. Special category (SC)
7. Launch Munitions (LM)
8. Electronic Warfare (EW)
9. Directed Energy Attack (DE)
10. Infrared Attack & Support (IR)
11. Instrumentation (IN)
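
A notional sketch of a tagged requirement record is shown below; the enumeration mirrors the category lexicon above, while the struct fields and identifiers are illustrative assumptions rather than the SIMAF SSS or RTM format.

    #include <string>

    // Category lexicon from the list above.
    enum class Category { C2, FM, CM, UP, SD, SC, LM, EW, DE, IR, IN };

    // Requirement types named in the text.
    enum class ReqType { Functional, Performance, Interface, Interoperability };

    // Illustrative tagged requirement record (not the actual SSS/RTM schema).
    struct TaggedRequirement {
        std::string id;        // e.g., "SYS-042" (hypothetical numbering)
        std::string text;      // "Platform xxx shall ..."
        ReqType     type;
        Category    category;
        std::string sourceRef; // trace back to the PID, SOC, SOW, etc.
    };

    int main() {
        TaggedRequirement req{"SYS-042",
                              "Platform xxx shall send UHF radio signals",
                              ReqType::Functional, Category::CM, "PID"};
        (void)req;
        return 0;
    }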

3.4.2 Generate Requirements Traceability Matrix (RTM)

Input: Defined System/Subsystem Functional Requirements
Available Templates/Tools: System Requirements Traceability Matrix (RTM) Template
Output: Completed System RTM
OPR: Operations Lead

Activity Description: The project RTM is used as a traceability tool throughout the Develop Capability phase. As the project progresses through the Develop Capability phase, updated RTMs will be generated to confirm the existence of, and traceability among, requirements and other modeling artifacts such as CSCI names, test cases, etc.

The project RTM shall indicate which system requirements are reused by the project, which system requirements require modification for the project, and which system requirements are new for the project.

If the project type is an Analysis Effort, then requirements and other information in the RTM are also linked to the Analysis Traceability Matrix (ATM), providing bi-directional traceability upward from the simulated system to the Analysis Question(s) contained in the PID and downward from the system to the software Configuration Item (CI) that represents it. To create the project RTM, the Operations SMEs use the generic RTM templates developed by SIMAF as a starting point to ensure that they consider all primary functions of a system.

The initial RTM is due at the System Requirements Review (SRR). The initial RTM contains:

1. A system-level requirement description. (Example: Platform xxx shall communicate with other virtual Blue Players via voice communications (e.g., radios).)
2. A subsystem-level requirement description. (Example: Communication subsystem on platform xxx shall send UHF radio signals.)
3. Traceability from the system and/or subsystem level requirements to the source requirement (PID, SOC, SOW, etc.)
4. Traceability from the system and/or subsystem level requirements to the analysis requirement(s) (if an Analysis Effort)
5. Traceability from the system and/or subsystem level requirements to the mission use case(s)

3.4.3 Conduct System Requirements Review (SRR)


Figure 3.4.3 – Conduct System Requirements Review (SRR) Activity Flow

Input:
1. SSS
2. IRS
3. RTM (initial)
Available Templates/Tools: SRR Briefing Template
Output: Successful SRR and/or Corrective Action Plan
OPR: Government Technical Lead

Activity Description: The SRR is a formal review conducted to ensure that system requirements have been completely and properly identified and that a mutual understanding between the government and contractor exists. It ensures that the system under review can proceed into initial systems development and that all system and performance requirements are defined and testable, and are consistent with cost, schedule, risk, technology readiness, and other system constraints.

The SRR also ensures the system and/or subsystem level requirements are defined in sufficient detail to proceed with System/Subsystem Design and for the software developers to derive the software requirements for the software Configuration Items (CIs).

The documentation listed above as input shall be delivered to the Government in preparation for the meeting. Using the SRR briefing template provided by the Government, the contractor team prepares the SRR briefing. The briefing template provides a list of success criteria for each product as well as overall success criteria for the review itself and the expectation that the contractor will present any unresolved issues along with a mitigation plan for each.

Required participants in the SRR include the Government Technical Lead, Contractor Technical Lead, Operations Lead, and Analysis Lead. Optional participants include the S&A Division Chief, Technical Director, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed to the ‘Develop System/Subsystem Design’ phase of the project.

If all of the success criteria for both the products and the SRR itself are met, the SRR is declared successful. If all success criteria are not met, the Technical Director determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the event needs to be re-scoped, the documentation listed above and the PID/SOC must be re-baselined.

The Government takes minutes at the review to provide the official record of decisions made and disseminates them to the SRR participants for approval. Upon approval of the minutes, the Government Technical Lead provides the briefing charts and the minutes to the Government PjM to be filed in the official project folder on the AFLCMC/XZS SharePoint.

Exit Criteria:
1. Accepted System/Subsystem Requirements Specification (SSS)
2. Accepted Project Requirements Traceability Matrix (RTM)
3. Corrective Action Plan, if all items are not complete

3.4.4 Develop System/Subsystem Design

Input:
1. SSS
2. IRS
Available Templates/Tools: System/Subsystem Design Description (SSDD) DID (DI-IPSC-81432A), Interface Design Description (IDD) DID (DI-IPSC-81436A), Data Base Design Description (DBDD) DID (DI-IPSC-81437A)
Output: SSDD, RTM
OPR: Contractor Technical Lead

Activity Description: From the SSS (and, if applicable, the IRS), the Contractor Technical Lead develops a high-level system design and architecture for the project's capability. Architecture development includes identification of all components (hardware and software), the static ("consists-of") relationship among components, the concept of execution among system components, and a description of interface characteristics among the components. Also documented are the system-wide design decisions specifying how the system will behave from a user's point of view, and other decisions affecting the selection and design of system components. Refer to the SSDD DID, DI-IPSC-81432A, as a guide for the scope of activities to be performed during system/subsystem level design.

At the completion of system/subsystem design, the RTM shall be updated to include bilateral traceability between each system component identified and the system requirement allocated to it.
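
As an illustration of what bilateral traceability checking can look like in practice, the sketch below verifies an RTM-like mapping in both directions; the container layout, names, and requirement identifiers are assumptions made for the example, not the SIMAF RTM template.

    #include <iostream>
    #include <map>
    #include <set>
    #include <string>
    #include <vector>

    // Hypothetical RTM slice: each system component mapped to the requirements allocated to it.
    using Rtm = std::map<std::string, std::vector<std::string>>;

    // Report requirements with no allocated component and components with no allocation.
    void checkBilateralTraceability(const Rtm& rtm, const std::set<std::string>& allRequirements) {
        std::set<std::string> covered;
        for (const auto& [component, requirements] : rtm) {
            if (requirements.empty())
                std::cout << "Component with no allocated requirement: " << component << "\n";
            covered.insert(requirements.begin(), requirements.end());
        }
        for (const auto& requirement : allRequirements)
            if (covered.count(requirement) == 0)
                std::cout << "Requirement with no allocated component: " << requirement << "\n";
    }

    int main() {
        // Notional data for the example.
        Rtm rtm = { {"SensorModel", {"SYS-001", "SYS-002"}}, {"FireControlModel", {}} };
        checkBilateralTraceability(rtm, {"SYS-001", "SYS-002", "SYS-003"});
        return 0;
    }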

3.4.5 Define Software Requirements

Input:
1. SSS
2. IRS
3. SSDD
Available Templates/Tools: Software Requirements Specification (SRS) DID (DI-IPSC-81433A), Software Test Plan (STP) DID (DI-IPSC-81438A)
Output: SRS, STP
OPR: Contractor Technical Lead

Activity Description: The Contractor Technical Lead and Software SMEs review the SSS, IRS, and SSDD for the project to ensure that they have the proper context for development of the software requirements that support development. Derived requirements emerge from analysis of the system requirements, high-level design, and architecture and are at a sufficient level of detail to represent the functionality required by individual CSCIs. The Contractor Technical Lead and software SMEs derive the software requirements and update the SRS accordingly. The SRS shall also trace the software requirements to the system-level requirements in the RTM and, if applicable, the analysis objectives contained in the ATM.

Software test planning occurs concurrently with development of the software requirements. The Contractor Technical Lead and software SMEs shall prepare the Software Test Plan for delivery at SSR. For SSR the STP shall include a description of the planned tests and traceability to the SRS for each test planned. A complete description of the STP content required for SSR can be found in the STP DID, DI-IPSC-81438A.

At the completion of software requirements definition, the RTM shall be updated to include:
1. Traceability between each system-level requirement in the SSS and the software requirement derived from it as defined in the SRS.
2. Traceability between each software requirement and the software test case identified in the STP.


3.4.6 Conduct Software Specification Review (SSR)

Figure 3.4.6 – Conduct Software Specification Review (SSR) Activity Flow

Input:
1. SRS
2. STP
3. RTM update
Available Templates/Tools: SSR Briefing Template, RTM
Output: Approved SRS, Approved STP, Corrective Action Plan (if required)
OPR: Contractor Technical Lead

Activity Description: The purpose of this review is to ensure that the software requirements for each CSCI to be created or modified for the project have been described in sufficient detail to proceed with the software design. The SSR is a formal government review and must be approved before the contractor can proceed with software design.


The documentation listed above as input shall be delivered to the Government in preparation for the meeting. Using the SSR briefing template provided by the Government, the contractor team prepares the SSR briefing. The briefing template provides a list of success criteria for each product as well as overall success criteria for the review itself and the expectation that the contractor will present any unresolved issues along with a mitigation plan for each.

Required participants in the SSR include the Government Technical Lead, Contractor Technical Lead, Operations Lead, and Analysis Lead. Optional participants include the S&A Division Chief, Technical Director, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed to the ‘Develop Software Design’ phase of the project. The Government Technical Lead also notifies the Architecture Lead if software requirements impact design(s) of the EAAGLES framework and/or applications and model architectures, necessitating review and/or CRB action.

If all of the success criteria for both the products and the SSR itself are met, the SSR is declared successful. If all success criteria are not met, the Technical Director determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the event needs to be re-scoped, the documentation listed above and the PID/SOC must be re-baselined.

The Government takes minutes at the review to provide the official record of decisions made and disseminates them to the SSR participants for approval. Upon approval of the minutes, the Government Technical Lead provides the briefing charts and the minutes to the Government PjM to be filed in the official project folder on the AFLCMC/XZS SharePoint.

Exit Criteria:
1. Approved SRS
2. Approved STP
3. Corrective Action Plan, if all items are not complete

3.4.7 Develop Software Design

Input: SRS, STP
Available Templates/Tools: Software Design Description (SDD) DID (DI-IPSC-81435A), Interface Design Description (IDD) DID (DI-IPSC-81436A), Data Base Design Description (DBDD) DID (DI-IPSC-81437A), Software Test Description (STD) DID (DI-IPSC-81439A)
Output: SDD, IDD, DBDD, STD
OPR: Contractor Technical Lead

Activity Description: The Contractor Technical Lead and the Software SMEs review the functional and derived requirements to develop the top-level software architectural design, which is documented in an object-oriented Conceptual Model. It is intended that this release of the SMP begin to emphasize the importance of Model Based Systems Engineering (MBSE) in the detailed design of systems and components to facilitate a product line philosophy in the maintenance and reusability of MS&A artifacts.

The conceptual model is further decomposed into a detailed software design communicated through a series of diagrams implemented in a visual modeling language (e.g., Unified Modeling Language (UML), Systems Modeling Language (SysML)), including, as appropriate and not limited to, the following (a small code correspondence sketch follows the list):
• A class diagram that describes the structure of the system by showing the system's classes, their attributes, and the relationships between the classes.
• A sequence diagram that describes the messages sent between participating objects in a scenario.
• A collaboration diagram that displays an interaction organized around the objects and their links to one another. Numbers are used to show the sequence of messages.
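
For readers less familiar with how such diagrams correspond to code, the minimal sketch below shows two classes with an association, roughly what a class diagram relationship depicts and what a sequence diagram message would exercise; the class names are invented for illustration only.

    #include <iostream>
    #include <memory>
    #include <string>

    // Illustrative classes: a class diagram would show Sensor associated with Track.
    class Track {
    public:
        explicit Track(std::string id) : id_(std::move(id)) {}
        const std::string& id() const { return id_; }
    private:
        std::string id_;   // an attribute that would appear in the class diagram
    };

    class Sensor {
    public:
        // In a sequence diagram this call appears as a message from Sensor to Track.
        std::shared_ptr<Track> detect(const std::string& targetId) {
            return std::make_shared<Track>(targetId);
        }
    };

    int main() {
        Sensor sensor;
        auto track = sensor.detect("T-042");
        std::cout << "Detected track " << track->id() << "\n";
        return 0;
    }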

The software design is documented in the SDD, the interface design is documented in the IDD, and the data base design is documented in the DBDD.

The STD includes a description of the CSCI-level software test cases to be executed along with the expected results. It is recognized that the actual step-by-step test procedures may not be written until the code is developed. The CSCI-level test descriptions documented for SDR are intended to be at a level below the test plan and provide insight into how the test cases will be constructed to ensure the software requirements in the SRS are satisfied. The step-by-step test procedures and actual code developed to drive the test cases shall be delivered at ITRR.

At the completion of software design, the RTM shall be updated to include:
1. Traceability between each software unit/component identified in the SDD, IDD, and/or DBDD and the software requirement identified in the SRS.
2. Traceability between each software unit/component identified in the SDD, IDD, and/or DBDD and the CSCI-level test description identified in the STD.

3.4.8 Conduct Software Design Review (SDR)


Figure 3.4.8 – Conduct Software Design Review (SDR) Activity Flow

Input:
1. SDD
2. IDD
3. DBDD
4. STD
5. RTM update
Available Templates/Tools: SDR Briefing Template, RTM
Output: Successful SDR and/or Corrective Action Plan
OPR: Contractor Technical Lead

Activity Description: The purpose of this review is to ensure that the software design is consistent with the existing system and software architecture and supports the software requirements. The SDR is a formal government review and must be approved before the contractor can proceed with software coding. The Contractor Technical Lead presents the software design(s) for the configuration items to be developed within the project. The contractor shall demonstrate during this review that the SW design documented in the SDD traces to the SW requirements contained in the SRS and that the test cases developed and documented in the STD will adequately test the new or modified CSCI(s) in a standalone test environment. The Architecture Lead reviews the SDD and advises the Government Technical Lead regarding compliance of the software design with the SIMAF SysML/UML modeling style guide. SDDs shall correspond to and be delivered for each CSCI developed or modified for the project, rather than a single SDD for the entire project.

The documentation listed above as input shall be delivered to the Government in preparation for the meeting. Using the SDR briefing template provided by the Government, the contractor team prepares the SDR briefing. The briefing template provides a list of success criteria for each product as well as overall success criteria for the review itself and the expectation that the contractor will present any unresolved issues along with a mitigation plan for each.

Required participants in the SDR include the Government Technical Lead, Contractor Technical Lead, Operations Lead, Architecture Lead, and Analysis Lead. Optional participants include the S&A Division Chief, Technical Director, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed to the ‘Develop Software’ phase of the project.

If all of the success criteria for both the products and the SDR itself are met, the SDR is declared successful. If all success criteria are not met, the Technical Director determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the event needs to be re-scoped, the documentation listed above and the PID/SOC must be re-baselined.

The Government takes minutes at the review to provide the official record of decisions made and disseminates them to the SDR participants for approval. Upon approval of the minutes, the Government Technical Lead provides the briefing charts and the minutes to the Government PjM to be filed in the official project folder on the AFLCMC/XZS SharePoint.

Exit Criteria:
• Approved SDD
• Approved IDD
• Approved DBDD
• Approved STD
• Corrective Action Plan, if all items are not complete

3.4.9 Develop Software

The software developers code the Configuration Items (CIs), develop functional unit test cases to support them, and conduct unit tests to ensure that each CI functions according to the requirements documented in the SRS. Subsequently, they conduct pre-integration component-level testing. The contractor, with Government participation, conducts an Integration Test Readiness Review (ITRR) to verify that each CI conforms to coding standards and is ready to be tested in the integrated test environment. During this phase, the Software User Manual (SUM) is also drafted, which includes any operational and installation aspects of the software of interest to the user.

3.4.9.1 Code Computer Software CIs (CSCIs)

Input: Completed SDD, RTM, SRS
Available Templates/Tools: C++ coding standards, C++ code analysis tool, RTM
Additional Processes/Procedures: 0020_sop_sd-001_gitflow, 0020_sop_sd-002_gitlab-issues
Output: Coded/Validated Computer Software Configuration Items (CSCIs), Code Analysis Report, RTM update, Software Version Description (SVD)
OPR: Software Developers

Activity Description: During this phase, the developers use the software design information contained in the SDD and develop the CSCIs assigned to them. C++ is the predominant language used at SIMAF. If a language other than C++ is to be used for a project, it must be specifically approved prior to contract award. Each CSCI is named in accordance with the SIMAF naming convention and is coded in accordance with industry-standard C++ coding standards. Contractors are expected to apply software development best practices to the coding effort, to include walkthroughs, desk audits, and peer reviews. Upon completion of coding (or as the CSCI is being coded), the contractor runs a SIMAF-approved code analysis tool on the CSCI under development. The results of code analysis are retained and presented at ITRR as evidence that the CSCI meets SIMAF quality standards. Each CSCI is designated as 'completed' when it passes code analysis and implements all software requirements allocated to it. It should be noted that final CSCI deliverables must be reviewed and assured as Cybersecurity Compliant by the Contractor's senior personnel possessing the required certifications per AFMAN 33-285, "Information Assurance Workforce Improvement Training," as specified by AFI 17-1303/AFMAN 33-285 Sections 3.2.12 and 3.2.12.3.
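
One lightweight practice that supports the RTM updates described below is annotating each CSCI with the requirements it implements. The sketch below is purely illustrative: the CSCI name, requirement identifiers, and comment style are assumptions, not the SIMAF naming convention or coding standard.

    // UhfRadioModel.cpp - hypothetical CSCI name, used only for illustration.
    #include <iostream>
    #include <string>

    // Implements (illustrative trace tags): SRS-COMM-010 (transmit UHF voice message),
    // SRS-COMM-011 (report last transmitted message). Keeping tags like these in the
    // source makes the CSCI-to-requirement entries in the RTM easy to confirm at ITRR.
    class UhfRadioModel {
    public:
        // SRS-COMM-010: transmit a voice message on the configured UHF net.
        bool transmit(const std::string& message) {
            if (message.empty()) return false;
            lastSent_ = message;
            return true;
        }
        // SRS-COMM-011: report the last message transmitted.
        const std::string& lastSent() const { return lastSent_; }
    private:
        std::string lastSent_;
    };

    int main() {
        UhfRadioModel radio;
        radio.transmit("CHECK-IN");
        std::cout << "Last sent: " << radio.lastSent() << "\n";
        return 0;
    }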

In addition to coding the software, the developer creates a set of unit test cases/procedures for each CSCI based on the test descriptions contained in the STD. In addition, the software developers prepare a Software Version Description (SVD) that describes the functionality contained in each new or modified CSCI.

At the completion of software coding, the RTM shall be updated to include:
1. Traceability between each CSCI and the corresponding software design unit/component identified in the SDD, IDD, and/or DBDD.

3.4.9.2 Develop Functional Unit Test Cases

Input: STD (test descriptions), SRS, Validated CSCIs
Available Templates/Tools: N/A
Output: Updated STD (unit test cases/procedures)
OPR: Contractor Technical Lead

Activity Description: Under direction from the Contractor Technical Lead, the Software Developer develops step-by-step test procedures for each new and/or modified CSCI and documents these test procedures in the STD. The unit test cases/procedures shall include inputs and expected results. Government Technical Leads are encouraged to participate in peer reviews/walkthroughs related to development of unit test cases.

At the completion of software coding, the RTM shall be updated to include:
1. Traceability between each CSCI and the test case(s).
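
Translated into code, a step-by-step unit test procedure might look like the sketch below; it assumes a bare assert-based harness (no particular test framework is named in the SMP), and the CSCI, inputs, and expected results are invented for illustration.

    #include <cassert>
    #include <iostream>

    // Minimal stand-in for a CSCI under test (illustrative only).
    class FuelModel {
    public:
        explicit FuelModel(double initialKg) : remainingKg_(initialKg) {}
        // Burn fuel for one time step; remaining fuel never goes below zero.
        void burn(double rateKgPerSec, double dtSec) {
            remainingKg_ -= rateKgPerSec * dtSec;
            if (remainingKg_ < 0.0) remainingKg_ = 0.0;
        }
        double remainingKg() const { return remainingKg_; }
    private:
        double remainingKg_;
    };

    // Unit test case written as a step-by-step procedure; the inputs and expected
    // results would be recorded in the STD, and the code automates the steps.
    void testBurnNeverGoesNegative() {
        FuelModel fuel(10.0);                 // Step 1: initialize the CSCI with 10 kg
        fuel.burn(2.0, 3.0);                  // Step 2: burn 6 kg (2 kg/s for 3 s)
        assert(fuel.remainingKg() == 4.0);    // Step 3: verify the expected result
        fuel.burn(2.0, 10.0);                 // Step 4: attempt to burn more than remains
        assert(fuel.remainingKg() == 0.0);    // Step 5: verify clamping to zero
        std::cout << "testBurnNeverGoesNegative: PASS\n";
    }

    int main() {
        testBurnNeverGoesNegative();
        return 0;
    }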

3.4.9.3 Conduct Functional Unit Tests

Input: Validated CSCIs, STD (unit test cases/procedures), SRS, SDD, IDD, DBDD
Available Templates/Tools: Software Test Report (STR) DID (DI-IPSC-81440A)
Output: Initial Software Test Results (STR)
OPR: Contractor Technical Lead, Software Developers

Activity Description: After all the required functionality has been developed for a given CSCI, the developer conducts verification testing to ensure that the CSCI works in accordance with the SDD/IDD/DBDD. If necessary, the developer revises the CSCI and retests until a positive result is obtained. If the CSCI has interface requirements, they are tabled until code and unit test completion of the interfacing CSCI. The contractor is expected to conduct internal peer reviews on the unit code as a quality assurance function to verify that the requirements are met in the code. Government Technical Leads are encouraged to engage in the execution, or review the results, of functional unit testing. Further, it is required that all test case descriptions and results be documented in the STR, and that the test code be delivered with the CSCI via CSPEI at project end.

3.4.9.4 Perform Pre-Integration Component-Level Testing

Input: Tested CSCIs
Available Templates/Tools: STD, STR, Test rig
Output: Pre-integration Test Results
OPR: Contractor Technical Lead, Software Developers

Activity Description: The purpose of pre-integration component-level testing is to ensure that the individual CSCIs work with their interfacing CSCIs as documented in the requirements and design documentation. Pre-integration component testing is intended to reduce risk by exercising the interface exchanges among related CSCIs prior to full-up integration of the software with the hardware. Pre-integration testing can be performed on the actual end-item hardware, if available, or on a workstation designed to emulate the end item. The strategy for pre-integration testing (i.e., which CSCIs will be exercised as a group) shall be contained in the Software Test Plan (STP). Ideally, the unit test cases should be reused for pre-integration testing, replacing test input files with the actual inputs from the interfacing CSCIs. During pre-integration testing, the developer demonstrates to the Government Technical Lead and the Contractor Technical Lead that the CSCI is in compliance with the requirements listed in the RTM. The Contractor Technical Lead updates the STR with the pre-integration interface test results. If a CSCI has requirements that can only be fully tested in the integrated environment, then those requirements are tabled for ERR and recorded in the integration test description portion of the STD. Government Technical Leads are encouraged to witness the execution or review the results of pre-integration component-level testing.
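
The sketch below illustrates the idea of replacing unit-test input files with the actual outputs of an interfacing CSCI: two invented components are wired directly together in a minimal test rig. The component names and message format are assumptions made for the example.

    #include <cassert>
    #include <iostream>
    #include <string>

    // Two invented CSCIs with a producer/consumer interface.
    class SensorModel {
    public:
        std::string buildTrackMessage() const { return "TRACK,ID=42,RANGE=12.5"; }
    };

    class FireControlModel {
    public:
        bool ingest(const std::string& trackMessage) {
            return trackMessage.rfind("TRACK,", 0) == 0;   // accept messages with the expected header
        }
    };

    int main() {
        // Pre-integration test: the real SensorModel output feeds FireControlModel
        // directly, instead of the canned input file used during unit testing.
        SensorModel sensor;
        FireControlModel fireControl;
        bool accepted = fireControl.ingest(sensor.buildTrackMessage());
        assert(accepted);
        std::cout << "Sensor-to-FireControl interface exchange: PASS\n";
        return 0;
    }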

3.4.10 Conduct Integration Test Readiness Review (ITRR)

Figure 3.4.10 – Conduct Integration Test Readiness Review (ITRR) Activity Flow

Input:
1. Coded Software CIs
2. STR completed (Step-by-step Test Procedures, Functional Unit Test Case Results, Component-Level Test Results)
3. STD completed (Integration-level test case descriptions)
4. Software Version Description (SVD) – Draft
5. Computer Software Product End Item (CSPEI) delivery for each CI
6. Software User Manual (SUM) – Draft
7. Results of code style and structure analysis
Available Templates/Tools: ITRR Briefing Template
Output: Successful ITRR and/or Corrective Action Plan
OPR: Contractor Technical Lead

Activity Description: The purpose of the ITRR is to determine the readiness of each individual Configuration Item (CI) for integration test in the SIMAF. The ITRR is a formal review to assess functional unit and component-level test results, contained in the STR, which are based on the software requirements contained in the SRS. The ITRR answers the following questions:

1. Has the code been assessed against the C++ coding standard?
2. Do the software components appear to work together in a consistent and stable manner?
3. Do all modified and new software CIs contain the required functionality, and are they verified as pass/fail against specific requirements in the SRS?
4. Are all event execution products complete?
5. Have all changes to modified CSCIs been communicated to the Government, through either the design review process or the submission of Change Requests (CRs)?

The documentation listed above as input shall be delivered to the Government in preparation for the meeting. Using the ITRR briefing template provided by the Government, the contractor team prepares the ITRR briefing. The briefing template provides a list of success criteria for each product as well as overall success criteria for the review itself and the expectation that the contractor will present any unresolved issues along with a mitigation plan for each.

Required participants in the ITRR include the Government Technical Lead, Contractor Technical Lead, Operations Lead, and Analysis Lead. Optional participants include the S&A Division Chief, Technical Director, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed to the ‘Integrate Battlespace’ phase of the project.

If all of the success criteria for both the products and the ITRR itself are met, the ITRR is declared successful. If all success criteria are not met, the Technical Director determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the event needs to be re-scoped, the documentation listed above and the PID/SOC must be re-baselined.

The Government takes minutes at the review to provide the official record of decisions made and disseminates them to the ITRR participants for approval. Upon approval of the minutes, the Government Technical Lead provides the briefing charts and the minutes to the Government PjM to be filed in the official project folder on the AFLCMC/XZS SharePoint.

Exit Criteria:
1. CSPEI complete.
2. Draft SVD contains sufficient detail.
3. Approved draft SUM.
4. Unit and pre-integration test procedures documented in the STD.
5. Unit and pre-integration test results documented in the STR.
6. The code meets SIMAF standards for style and structure.

Required Documentation:
1. Updated STD with step-by-step test procedures
2. STR with results of unit and pre-integration level test cases
3. Draft Software Version Description (SVD)
4. Draft Software User Manual (SUM)
5. Computer Software Product End Item (CSPEI) delivery for each CSCI

3.5 Integrate Battlespace

Figure 3.5.1 – Integrate Battlespace Overview Activity Flow

3.5.1 Establish Relevant Environment

The battlespace is made ready for the event. The integrated environment is established. Rooms and computers are set up in accordance with the requirements documented in the Activity Plan. Networks are established and tested in accordance with security and Information Assurance (IA) requirements. Dry Runs are conducted prior to the formal event to ensure that the simulation environment is fully operational and that the data needed to support the event can be collected and is reasonable.

Recall that a project "event" may be a single event or series of events, an analysis study, creation of or modifications to models and simulations, a simulator hardware upgrade or procurement, a facility modification, a verification/validation test activity, or any combination of these elements.


The following subsections provide the activity descriptions, including inputs, associated templates and tools, expected output, Office of Primary Responsibility (OPR), and personnel involved in performing each process activity.

3.5.2 Conduct Integration Testing

In preparation for integration testing, the battlespace must be established. The SIMAF is set up, networks are prepared, and network connectivity is tested.

Figure 3.5.2 – Conduct Integration Testing Activity Flow

3.5.2.1 Prepare the SIMAF Facility

Input: Activity Plan, STP
Available Templates/Tools: Facility Diagram
Output: Facility configured to meet the event requirements
OPR: Facility Lead

Activity Description: In order to properly prepare the SIMAF facility for integration testing, the Facility Lead and facility SMEs review the Activity Plan and the facility diagram, which shows which SIMAF rooms are to be used for the event, where the various computers will be located, and what software will be loaded on each computer. They then configure the physical facility in accordance with the diagram. Additional site configuration requirements may be contained in the Site Test Planning portion of the STP, under the Hardware and Software Preparation paragraphs.

3.5.2.2 Establish Network Connections

Using the information on network requirements provided in the Activity Plan, the Network Team connects systems and/or outside sites that are participating in the event to the network; verifies that the networks are functioning properly; verifies that multi-level security is in place, if required; and establishes interfaces to live, operational systems.



Figure 3.5.2.2 – Establish Network Connections Activity Flow

3.5.2.2.1 Connect Participating Sites to the Network

Input: Activity Plan Available Templates/Tools: N/A Output: Outside sites connected to the Network OPR: Network Team

Activity Description: All workstations and/or participating sites and connectivity points are identified in the Activity Plan. To exchange information and collaborate on mission accomplishment during the event, all participating sites that are remote from the control facility must be physically and logically connected, in accordance with the event network diagram. The following minimum actions are performed:
1. Assign IP addressing.
2. Establish voice, video, and data physical and logical connections to facilities.
3. Verify persistent connections via test and tune-up.

3.5.2.2.2 Verify Network Operation

Input: Activity Plan Available Templates/Tools: N/A Output: Correct network operation verified OPR: Network Team

Activity Description: The Network Team verifies that all network services that support the event are functioning properly. The following minimum actions are performed:
1. Verify network and encryption operations with ping tests (a scripted reachability sweep is sketched below).
2. Verify voice communications.
3. Set up video teleconferencing, if required.
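The reachability portion of this check lends itself to simple scripting. The following Python sketch is illustrative only; the participant names, addresses, and port are hypothetical placeholders, and any actual verification would use the tools and procedures approved for the event network.

import socket

# Hypothetical participant list; real entries would come from the event network diagram.
PARTICIPANTS = {
    "control-facility": ("192.0.2.10", 3000),
    "remote-site-a": ("192.0.2.20", 3000),
}

def reachable(host, port, timeout=3.0):
    # Attempt a TCP connection to confirm the path to the participant is up.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in PARTICIPANTS.items():
    status = "UP" if reachable(host, port) else "DOWN"
    print(f"{name:20s} {host}:{port}  {status}")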

3.5.2.2.3 Verify Security

Input: Activity Plan Available Templates/Tools: N/A Output: Correct Security Verified OPR: Network Team

Activity Description: The Network Team works closely with the event Security Team to verify and maintain the required level of classification over the distributed network. If required, multi-level security connections are maintained and monitored throughout event build and integration activities so that encryption rules and security processing requirements are always met. The following minimum actions are performed:
1. Maintain authentication.
2. Maintain authorization.
3. Maintain digital rights management.
4. Maintain user access.
5. Enforce encryption rules.
6. Monitor transport mechanism.
7. Monitor firewall loads.
8. Maintain export controls.

3.5.2.2.4 Interface to Operational Systems

Input: Activity Plan Available Templates/Tools: N/A Output: Interface to Operational Systems Established OPR: Network Team

Activity Description: If live entities will be in the event battlespace, the Network Team establishes the necessary interfaces. New interface capabilities are tested during integration testing. Interfaces to C4ISR systems involve translation of a significant amount of data between the simulation formats and the C4ISR format. The following minimum actions are performed:
1. Connect through data and video protocols.
2. Maintain protocols.
3. Exchange data per the protocols.
4. Process Time-Space-Position-Indicator (TSPI) standards (an illustrative translation sketch follows this list).
5. Maintain frame-based synchronization, as required.
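As an illustration of the kind of translation these interfaces require, the Python sketch below maps a hypothetical simulation entity-state record into a flat TSPI-style record. All field and type names are invented for this example and do not represent any specific DIS or C4ISR interface specification.

from dataclasses import dataclass

@dataclass
class SimEntityState:
    # Hypothetical simulation-side record (not a real DIS PDU layout).
    entity_id: int
    sim_time_s: float
    lat_deg: float
    lon_deg: float
    alt_m: float

def to_tspi_record(state: SimEntityState) -> dict:
    # Flatten the entity state into a time-space-position style record.
    return {
        "id": state.entity_id,
        "time_s": state.sim_time_s,
        "position": (state.lat_deg, state.lon_deg, state.alt_m),
    }

print(to_tspi_record(SimEntityState(42, 120.5, 39.78, -84.05, 3048.0)))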

3.5.2.3 Test Integrated Environment

Input: Activity Plan, STPDR Available Templates/Tools: N/A Output: Verification performed on integrated environment OPR: Contractor Technical Lead, Government Technical Lead



Activity Description: The Contractor Technical Lead, supported by the Event Director, performs a walk-through of the event environment. The objective is to ensure that the event battlespace is compliant with the Activity Plan and that the following items have been completed:
1. All models, databases, simulators, facilities, and support are in place for the event.
2. All interfaces are operational.
3. Hardware and software components (e.g., controllers, displays, window displays) are assembled, interconnected, and fully operational.
4. The proper software and databases are loaded and running.
5. All networks are operational (network connectivity to participating sites is verified).
6. Timing is synchronized, data protocols are verified, scenario execution steps are verified, and data capture methods are verified.
7. All security issues have been resolved and implemented.

3.5.3 Conduct Dry Run

Input: Run Matrix; ATM Available Templates/Tools: N/A Output: STR (deficiency reports), V&V Report OPR: Analyst Lead

Activity Description: Prior to the formal event or runs-for-record, a dry run is conducted to ensure that the required data can be collected and that it is sufficient to support the analysis and answer the customer’s questions. The team executes the Run Matrix and, using the Data Collection Plan captured in the ATM, Part 2, collects the data generated, computes MOEs/MOPs and assesses the basic analytic objectives against the data. A summary of the run cases (contractor results from having executed the run matrix and a calculation of the expected results as compared to the ATM requirements) shall be documented in the STPDR. The STPDR shall also include a disposition summary of problems encountered during integration testing and/or execution of the event Dry Run. Deficiencies noted during the Dry Run are corrected and documented in CRs. However, any deficiencies that cannot be corrected during the dry run period are categorized as Deficiency Reports (DRs) and submitted as CRs. They are then reviewed by the SAIL and discussed at a Configuration Review Board (CRB) if necessary.
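The run matrix and its execution order can be represented very simply in software. The Python sketch below is illustrative only: the run matrix entries and the execute_run function are hypothetical placeholders, and the actual cases, repetitions, and collection mechanisms are defined in the ATM and the Data Collection Plan.

# Hypothetical run matrix: each case names the scenario variant and number of repetitions.
RUN_MATRIX = [
    {"case_id": "C1", "variant": "baseline", "reps": 3},
    {"case_id": "C2", "variant": "excursion-a", "reps": 3},
]

def execute_run(case, rep):
    # Placeholder for launching the simulation and collecting its output;
    # here it simply returns a dummy output file name keyed to the case and repetition.
    return f"{case['case_id']}_rep{rep}_output.csv"

collected = {}
for case in RUN_MATRIX:
    for rep in range(1, case["reps"] + 1):
        output_file = execute_run(case, rep)
        # Tag every output with its run-matrix case so the analysis can trace results back.
        collected.setdefault(case["case_id"], []).append(output_file)

print(collected)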

The dry run is also the best time to conduct validation testing because at this point the models requiring validation are integrated with and executing in the relevant environment. The appropriate models shall be executed against the validation data and/or validation method described in the V&V Plan and the results documented in the Validation Report.

3.5.4 Conduct Event Readiness Review (ERR)



Figure 3.5.4 – Conduct Event Readiness Review (ERR) Activity Flow

Input: Dry Run Complete; Dispositioned Change Requests (Deficiency Reports), STD, STR Available Templates/Tools: ERR Briefing Template Output: Successful ERR and/or Corrective Action Plan OPR: Technical Director

Activity Description: The purpose of the ERR is to ensure that the analysis questions documented in the ATM can be answered using the actual data collected from the Dry Run to support the MOEs. The contractor presents a top-level review of the ATM requirements, the run matrix, and a summary of which requirements are met. For those requirements not met, the contractor presents a more detailed discussion of why the requirement is not met, whether it will be met in this event, whether the requirement is deferred, or whether the contractor is asking permission to delete the requirement. The government expects to see a summary of the run cases (contractor results from having executed the run matrix and a calculation of the expected results as compared to the ATM requirements) as documented in the STPDR. A validation report is also required if previously unvalidated models are being used for the analysis. In addition, the status of the items on the ERR checklist is presented to ensure that everything required to execute the event is in place.



The documentation listed above as input shall be delivered to the Government in preparation for the meeting. Using the ERR briefing template provided by the Government, the contractor team prepares the ERR briefing. The briefing template provides a list of success criteria for each product as well as overall success criteria for the review itself and the expectation that the contractor will present any unresolved issues along with a mitigation plan for each.

Required participants in the ERR include the Government Technical Lead, Contractor Technical Lead, Operations Lead, and Analysis Lead. Optional participants include the S&A Division Chief, Technical Director, Technical Expert, and Branch Chiefs. The Government Technical Lead determines if the products presented at this milestone review are complete and contain enough detailed information to proceed to the ‘Execute Battlespace’ phase of the project.

If all of the success criteria for both the products and the ERR itself are met, the ERR is declared successful. If any of the success criteria are not met, the Technical Director determines the proper course of action. If the decision is made not to proceed to the next stage of the project on schedule, and the event needs to be re-scoped, the documentation listed above and the PID/SOC must be re-baselined.

The Government takes minutes at the review to provide the official record of decisions made and disseminates them to the ERR participants for approval. Upon approval of the minutes, the Government Technical Lead provides the briefing charts and the minutes to the Government PjM to be filed in the official project folder on the AFLCMC/XZS SharePoint.

Exit Criteria:
1. Calculation of the MOPs/MOEs successfully verified in accordance with the TAP requirements.
2. Dry Run produced properly formatted data with values within expected ranges.
3. All required data can be collected and processed by the data collection and analysis tools.
4. The analytical tools reduced the data correctly.
5. Integrated test environment verification complete.
6. Dry run test results documented in the STR.
7. Validation test results documented in the Validation Report.
8. ERR Checklist complete with no critical outstanding issues.



3.6 Execute Battlespace

The purpose of this activity is to execute the event in accordance with the TAP and the documented event scenarios. The Event Director and the Execution Team conduct pre-event briefs for the stakeholders and operators to familiarize them with the mission. The Execution Team exercises the test cases in accordance with the run matrix. After each run, the Analysis Team collects the data and performs a Quick Look Analysis to assess how the event is progressing in accordance with the TAP requirements. The Analysis Lead ensures that event data is collected and archived daily. At the end of the day, a ‘Hotwash’ is conducted to discuss any anomalies encountered. The Contractor Technical Lead, Government Technical Lead, Analysis Lead, and Event Director discuss the corrective actions and agree on a resolution. The Government Technical Lead is the final decision authority. At the conclusion of the event, an Event Debrief is held to summarize the entire event.

Figure 3.6 – Execute Battlespace Overview

The following subsections provide the activity descriptions, including inputs, associated templates and tools, expected output, Office of Primary Responsibility (OPR), and personnel involved in performing each process activity.

3.6.1 Conduct System Testing

3.6.1.1 Perform System Verification Tests

Input: STP, STD (for system test), baselined code repositories Available Templates/Tools: N/A Output: STR (for system test), RTM (final) OPR: Government Technical Lead

Activity Description: Executing the system test procedures documented in the Software Test Description (STD), the complete system, consisting of the software (models, applications, and simulations) and hardware (computational systems, cockpits, visual systems, etc.), is verified against the system-level requirements using the qualification method(s) (inspection, analysis, demonstration, test) indicated for each requirement. The Data Item Description (DID) for the System/Subsystem Specification (SSS) requires that each requirement be annotated with the qualification method. Any special test equipment required for verification of system requirements shall be identified in the STP and provided as part of the system test suite. The results of executing the system test procedures are documented in the STR and shall include a final RTM update indicating compliance traceability from the system test procedure back to the original system or customer-level requirement. Any discrepancies noted during system test shall be fully discussed in the STR and also in the Final Report if the deficiency is serious enough to have an effect on the overall product.
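Conceptually, the final RTM update is a traceability table keyed by requirement. The Python sketch below is illustrative only; the requirement identifiers, test procedure names, and CSV layout are hypothetical and would in practice follow the RTM format already established for the project.

import csv

# Hypothetical traceability rows: requirement -> qualification method -> test procedure -> result.
rtm_rows = [
    {"req_id": "SSS-001", "method": "Test", "test_proc": "STD-SYS-010", "result": "Pass"},
    {"req_id": "SSS-002", "method": "Demonstration", "test_proc": "STD-SYS-011", "result": "Fail"},
]

with open("rtm_final.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["req_id", "method", "test_proc", "result"])
    writer.writeheader()
    writer.writerows(rtm_rows)

# Any failed row would be written up as a discrepancy in the STR.
print([row["req_id"] for row in rtm_rows if row["result"] == "Fail"])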

3.6.1.2 Perform System Validation Tests

Input: V&V Plan Available Templates/Tools: N/A Output: V&V Report OPR: Government Technical Lead

Activity Description: The complete system, as described for verification testing, including environment, is validated using the techniques, methods, data, criteria, and resources described in the V&V Plan. The results of validation testing are documented in the V&V Report, which shall include a detailed evaluation of the system under test regarding the real world representation or operation of the delivered capability. Any deviations from real world representation noted during validation test shall be fully discussed in the V&V Report and also in the Final Report if the deficiency is serious enough to have an effect on the suitability/usability of the overall product.

3.6.2 Conduct Experiment

The Execution Team conducts the experiment that supports the analysis study. Prior to event execution, they conduct a series of briefs for both stakeholders and operators to set the stage for the event. They execute the event runs for the day, perform a Quick Look Analysis of the data collected, archive the output data, and conduct a daily Hotwash. At the conclusion of the event, they conduct an Event Debrief.

Figure 3.6.2 – Conduct Experiment Activity Flow



3.6.2.1 Conduct Pre-Event Planning

Input: Event Execution Product Checklist Available Templates/Tools: Product templates Output: Completed Event Execution Products OPR: Event Director

Activity Description: The Event Director reviews the following list and selects the products required to support the execution of his particular event. Each product has an associated template, which may be tailored, if needed.
1. Event Announcement (EA)
Four to six weeks prior to a Distributed Event, the Event Director and Contractor Technical Lead prepare an Event Announcement based on the information contained in the TAP. The Event Announcement is coordinated with the customer and subsequently provided to all event participants, sponsors, and stakeholders. The purpose of the announcement is to provide a high-level overview of the event and to communicate participation across multiple agencies. The announcement contains such items as a list of the event participants; an event description; event goals/objectives; contact information for the customer and team leaders; simulation information (aircraft type and number); pilot/operator information; a schedule; a mission description; and the time and locations where the event may be viewed. Any required Distinguished Visitor (DV)/VIP notices shall be included with the EA.
2. Battle Rhythm
The Battle Rhythm provides the daily event schedule by hour.
3. Event Pre-Brief
When the operators (if applicable) and stakeholders are in place for the event, the Event Director briefs them on the mission prior to the start of the event. The purpose of the Event Pre-Brief is to perform a walk-through of the mission, scenario, and vignettes, and to address additional information associated with the event (e.g., call signs, frequency assignments). The event pre-brief for each event session is typically performed for morning and afternoon runs and contains the following sections:
a. Administrative Information - this information includes such items as the daily simulation schedule, location of restrooms, and breaks.
b. Analysis Goals - this section presents the customer objectives and requirements, the event analysis objectives, and the MOEs/MOPs.
c. OPS Context - this section provides information on the operations, missions, and level of decision-making control or injections required from the operators. The briefing also covers the scenario resources, the Comm Plan, and the Blue, Red, and White Concept of Operations (CONOPS). Blue, red, and white operators are briefed separately.
d. Infrastructure Status - this section presents network status, remote facility status, LVC, and communications status.
e. Security Information - this section covers the general SIMAF security requirements and any security requirements specific to the project/event.
4. Run Matrix



The run matrix consists of a list of the test cases associated with the analysis experiment and the order in which they are to be executed.
5. Time Ordered Events List (TOEL)
The TOEL consists of an event schedule and overview of the sequence of injects or key triggers that occur prior to each run, during each run, and after each run.
6. Pilot Briefing Package (if applicable)
The package consists of maps which show the ground depiction, airspace control, holding points, and the pilot mission briefing card.

3.6.2.2 Conduct Event Pre-Briefs

Input: TAP Available Templates/Tools: Event Briefing Templates Output: Briefed Operators and Stakeholders OPR: Event Director

Activity Description: When the operators and stakeholders are in place for the event, the Event Director briefs them on the mission prior to the start of the event. The purpose of the event brief is to conduct any required training on the simulators/stations; to perform a walk-through of the mission, scenario, and sequencing; and to address additional information associated with the event (e.g., call signs, frequency assignments). The event pre-brief for each event session is typically performed for morning and afternoon runs. Four separate briefs are conducted:
1. Administrative Brief – This brief covers the following items: daily schedule, modeling and simulation overview, rules of engagement, facility, security, and communication plan details.
2. Analysis Brief – This brief presents a discussion of the customer objectives and requirements, the event analysis objectives, and the MOPs/MOEs.
3. Ops Brief – This brief contains the relevant information to conduct the operations, missions, and level of decision-making control or injections required from the operators. The briefing also covers the Blue, Red, and White Concept of Operations (CONOPS). Blue, red, and white operators are briefed separately.
4. Tech Support Brief – This brief presents network status, remote facility status, and communications status.

3.6.2.3 Execute Event Runs

Input: M&S Environment; Run Matrix; TAP Available Templates/Tools: None Output: DRs; Test Results OPR: Event Director

Activity Description: Each day’s event runs (trials) begin with an operations briefing to discuss any scenario changes that were made based on the TOEL. In addition, in some cases, the full sequence of a scenario may not be conducted, or the same sequence with different triggers, key events, or variants may be performed. The briefing keeps all the event players on track and focused on the event objectives under study. The Execution Team brings the M&S environment to ready status and performs functionality checks on systems, communications, and simulation applications. This typically occurs prior to or concurrently with the operations briefing. The Event Director announces commencement of each trial. The Event Director and analysts monitor entity control translations and ensure that event data are being captured during each trial. The analysts document pass/fail results and record any anomalies noted during the run. Post-event, the Event Director, Contractor Technical Lead, and Government Technical Lead make a determination as to whether or not a DR should be written against the anomaly. However, if an anomaly results in a shut-down of the event, it is immediately recorded as a DR and worked.

3.6.2.4 Perform Quick Look Analysis

Input: Test Run Results; TAP Available Templates/Tools: N/A Output: Quick Look Briefing OPR: Lead Analyst

Activity Description: Data is collected from the event runs locally and/or from the entire distributed participant group to perform a daily Quick Look. The Quick Look Analysis is designed to give a daily perspective on how the event is progressing against the goals and objectives defined for the analysis in the TAP. In particular, the Quick Look Analysis focuses on the MOPs/MOEs. A daily Observation Log is kept by the Lead Analyst and the Analysis Team to ensure that the day’s event activities and problems have been adequately captured. In addition, event data for each day is archived to ensure the information is available for the analysis phase. Upon completion of the event, the operators/pilots are provided with a survey to solicit additional feedback and data regarding the event scenario results. This information is included with the test data and quick look results for further analysis. The Analysis Team performs the following steps in support of the Quick Look:
1. Monitors, records, and captures scenario configuration, intermediate results, and results during each event run in accordance with the data collection plan. At the conclusion of the day or at predetermined intervals, this data is collected from all the stations/distributed participants.
2. Processes the raw data and/or intermediate results to meet the interface specifications for the automated MOP/MOE calculating tools.
3. Imports the properly formatted data into the automated MOP/MOE calculating tools where MOPs/MOEs are computed (a minimal illustrative sketch of this computation follows the list).
4. Hand processes data in cases where the raw data is either not digitally produced or must be deduced from event execution, and computes the MOPs/MOEs.
5. Performs an analytical assessment of all calculated MOPs/MOEs to evaluate how the event is progressing in terms of the analysis plan documented in the TAP.
6. Identifies any risks or issues that arose during the event runs.
7. Documents (interim) conclusions, lessons learned, watch items, and DRs.
8. Organizes the (interim) conclusions, lessons learned, watch items, DRs, and MOP/MOE assessment into presentation format.



9. Presents the Quick Look Analysis to decision makers. Depending on the interim results, the run matrix may be adjusted to achieve customer and analysis objectives.
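For illustration, the sketch below computes a single hypothetical MOP (mean time to detect) from per-run records. The record fields, file name, and measure itself are invented for this example; the actual measures, formats, and tools are those defined in the TAP and the data collection plan.

import csv
from statistics import mean

def mean_time_to_detect(csv_path):
    # Reads per-run detection records and averages the time-to-detect values.
    # Expected (hypothetical) columns: run_id, target_id, time_to_detect_s
    times = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            times.append(float(row["time_to_detect_s"]))
    return mean(times) if times else None

# Example usage against a hypothetical daily output file:
# print(mean_time_to_detect("C1_rep1_output.csv"))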

3.6.2.5 Archive Daily Output Data

Input: Data Output Files Available Templates/Tools: Observation Log Output: Archived Data Output files and Observation Log OPR: Analysis Lead

Activity Description: Event data for each day is captured and archived to ensure the information is available for the post-event analysis phase. The Analysis Team establishes a “working” directory for data processing using copies of the raw data as the starting point. The original raw data is kept in its original archival location until the final analysis report is complete. At that point, the data is archived to CD/DVDs or a separate hard drive that is stored in a safe location. After delivery of the Final Report, the raw data may be delivered, deleted, or destroyed at the customer’s discretion.
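Producing a date-stamped copy of each day's raw output is straightforward to automate. The Python sketch below is a minimal example using hypothetical directory names; retention, delivery, or destruction of the raw data remains governed by the Final Report delivery and the customer's direction described above.

import shutil
from datetime import date
from pathlib import Path

RAW_DIR = Path("event_data/raw")            # hypothetical location of the day's raw output
ARCHIVE_ROOT = Path("event_data/archive")   # hypothetical archive root
WORK_ROOT = Path("event_data/working")      # analysts process copies here, never the originals

def archive_daily_data(run_day=None):
    run_day = run_day or date.today()
    stamp = run_day.isoformat()
    # Preserve the raw data unmodified in a date-stamped archive folder.
    archive_dir = shutil.copytree(RAW_DIR, ARCHIVE_ROOT / stamp)
    # Give the Analysis Team a working copy so the archived originals stay untouched.
    work_dir = shutil.copytree(RAW_DIR, WORK_ROOT / stamp)
    return archive_dir, work_dir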

3.6.2.6 Conduct Daily Hotwash

Input: Mission Use Cases/Scenarios, Run Matrix, DRs Available Templates/Tools: N/A Output: Completed Hotwash OPR: Event Director

Activity Description: At the end of each day, a review is performed to capture and log observations. This includes discussing and documenting any scenario-related problems that occurred and documenting any applicable lessons learned, open and fixed DRs as they relate to the planned activities for the day, and any desired improvements. The Analysis Team ensures the analysis data is assessed for completeness in accordance with the Data Collection Plan, and the Event Director prepares a daily Event Activity Report. At the conclusion of every event run the operators may provide additional feedback and event-related information via a survey. This information is included with the event data results for further analysis. The following activities occur at the Daily Hotwash:
1. Review Mission Objectives
The team reviews the mission objectives as briefed prior to event start as the basis for determining the success of the event.
2. Conduct Mission Reviews
The team compares the mission results, the scenarios run, and the battlefield expectations against the mission objectives to determine if the objectives were met.
3. Review Blue Force CONOPS
The Blue Force operators review the Blue Force CONOPS and the operations conducted during the event runs, and report on mission successes and failures.
4. Review Red Force CONOPS
The Red Force operators review the Red Force CONOPS and the operations conducted during the event runs, and report on mission successes and failures.
5. Review White Force CONOPS
The White Force operators review the White Force CONOPS and the operations conducted during the event runs, and report on mission successes and failures.
6. Review Combined Force (Blue, Red, White) CONOPS
The operators review the CONOPS from the combined force perspective and the operations conducted during the event runs, and report on mission successes and failures as a combined force, vice the individual forces. The results for the combined force may differ from the individual results.
7. Check/Assess Data
The Analysis Team reviews the data produced and assesses the data against the TAP.
8. Document DRs, Lessons Learned, and Desired Improvements
The team documents DRs, lessons learned, real-time fixes, and desired improvements, and enters them into the appropriate project management tool.

3.6.2.7 Conduct Event Debrief

Input: Event Run Data, Deficiency Reports (DRs) Available Templates/Tools: Event Debrief Template Output: Completed Event Debrief OPR: Event Director

Activity Description: After all event runs are completed, an Event Debrief is held to summarize the entire event. The debrief aggregates, collates, and evaluates key event information to synthesize the event-level results. The results will be preliminary if the complete analysis has not been finished. The following information is discussed at the Event Debrief:
1. Review Event Objectives
The event objectives (customer and event analysis) are reviewed. The interim event results are compared against the event objectives, where possible, to determine if the objectives were met.
2. Conduct Mission Review
The event mission results, the scenarios run, and the battlefield expectations are compared against the mission objectives to determine if the objectives were met.
3. Review Combined Force (Blue, Red, White) CONOPS and Note Successes and Failures
The operators review the CONOPS from the combined force perspective and the operations conducted during the event runs. They report on mission successes and failures as a combined force, vice the individual forces. The results for the combined force may differ from the individual results.
4. Present Quick Look Analysis Results
The Analysis Team reviews the raw data and makes preliminary assessments of data trends, which it presents to the stakeholders for the purpose of gathering additional insight into the customer's objectives for consideration in accomplishing the final analysis. The team also assesses how well the event objectives were met.
5. Review DRs, Lessons Learned, and Desired Improvements
The team documents DRs, lessons learned, and desired improvements.
6. Send Event Summary Report
The Event Director provides an Event Summary Report to the Government Technical Lead. The Event Summary Report contains a summary of the event activities and an assessment of the customer objectives and whether or not they were met.

3.6.2.8 Analyze Results

The purpose of this process is to perform the post-event analysis and develop the Event Analysis Report. The Analysis Team uses analytical techniques and tools to produce findings/results and present them to the customer/sponsor.

The following subsections provide the activity descriptions, including inputs, associated templates and tools, expected output, Office of Primary Responsibility (OPR), and personnel involved in performing each process activity.

3.6.2.8.1 Process Data

Input: Event Run Data (output files), TAP Available Templates/Tools: N/A Output: Post-processed data OPR: Analysis Lead

Activity Description: The Analysis Team is responsible for the post processing of all event data and the quality control of that process. The following activities are conducted in support of this process:
1. Process/Copy Data Archive
The Analysis Team copies the event data from the data store and verifies that all participant (distributed) data is accounted for. Note that the event data is raw, meaning no modifications have been made to the datasets.
2. Collect Any Remaining/Missing Data and Archive; and Open Data for Review & Assessment
In cases where data may have been missing (e.g., network connectivity lost during the event data collection period), the missing data is acquired from the appropriate source(s). All raw datasets are archived in the appropriate location. If there is new data, it is verified by the Analysis Team. If event data exists in non-electronic format, such as survey forms completed by the operators, the appropriate information from those event data items is recorded in electronic form (e.g., tabulated in a spreadsheet) and the files are archived in the appropriate location.
3. Verify Data by Review and Assessment
The Analysis Team reviews and assesses the event data for completeness and usefulness based on the requirements documented in the Data Collection Plan.
4. Sort, Categorize and Distribute Raw Data to all Participant Analysts



The Analysis Team sorts and categorizes the event data by file structure and source. The data is transferred to the appropriate media (DVD, FTP site, etc.) and then, if the event was distributed, provided as required to all participant analysis organizations.
5. Parse/Refine/Format for and Port to Analytic Software Tools
The raw data are parsed, refined, and/or reformatted to meet the interface specifications for any analysis tools needed to calculate the MOPs/MOEs of the event (a minimal sketch of this step follows).
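As a minimal illustration of the parse/refine/format step, the sketch below converts hypothetical raw log lines into a flat CSV layout that an analysis tool might expect. The raw line format, field names, and file names are invented for this example; the real interface specifications come from the analysis tools selected for the event.

import csv

def raw_log_to_csv(raw_path, csv_path):
    # Hypothetical raw line format: "<sim_time> <entity_id> <event_type>", whitespace separated.
    with open(raw_path) as raw, open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["sim_time_s", "entity_id", "event_type"])
        for line in raw:
            line = line.strip()
            if not line or line.startswith("#"):   # skip blanks and comment lines
                continue
            sim_time, entity_id, event_type = line.split(maxsplit=2)
            writer.writerow([float(sim_time), entity_id, event_type])

# Example usage against hypothetical file names:
# raw_log_to_csv("C1_rep1_output.log", "C1_rep1_formatted.csv")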

3.6.2.8.2 Assess Results

Input: Event Test Data Available Templates/Tools: Analysis Tools Output: Test Analysis Results OPR: Analysis Lead

Activity Description: The Analysis Team performs detailed analysis in accordance with the methodology and procedures laid out in the TAP and produces findings/results. The Analysis Team coordinates the findings/results with the Analysis Lead and the Government Technical Lead, finalizing these findings/results in the Technical Assessment Report. The following activities are conducted in support of this process:
1. Perform Quantitative & Qualitative Data Analysis
The Analysis Team performs the quantitative and qualitative data analysis as specified and prioritized in the TAP. They assess the test results and calculate the final event MOPs/MOEs. Once the preliminary findings/results are drafted, the Analysis Team begins the comparison phase of the detailed analysis.
2. Compare/Merge to Data Journal Observations
During the course of the event runs, analysts document their observations in data journals, and operators provide feedback. Quantitative and qualitative analysis from the post event is compared to the analysts’ data journals and observations to identify contradictory and/or subjective information. The Analysis Lead documents any anomalies and/or gaps.
3. Compare/Merge to Pre-Event Constructive Analysis/Prior Event Results
Preliminary findings/results are also compared to any pre-event predictive runs and any relevant historical event results. The intent is to further identify contradictory and corroborating information to substantiate event results. If necessary, the Analysis Team conducts additional post-event virtual and/or constructive analysis to substantiate or refute event findings. (A simple screening sketch for this comparison follows the list.)
4. Conduct Constructive Analysis
The Analysis Team may conduct constructive analysis in this post-event phase to further validate or substantiate the findings. In some cases, the findings/results may have been inconclusive and further investigation is required. This post-event constructive analysis may be required to extend the evaluation to other simulations. Steps include matching event conditions in the constructive mode, design of experiments, the run matrix, analyzing outputs, and incorporating the outputs into the analysis results. If additional virtual analysis is necessary, another event may need to be planned, which would require additional funding from the customer.
5. Format Findings/Results for Final Reporting
The findings/results are formatted into graphical or statistical formats for the final report in accordance with customer requirements.


6. Peer Review/Coordinate Findings with the Lead Analyst and Government Technical Lead
The Lead Analyst peer reviews the Analysis Team’s findings/results and coordinates them with the Government Technical Lead. The Government Technical Lead’s feedback is documented and the feedback’s impact on the findings is researched. The findings/results are updated, as appropriate.
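One simple way to flag contradictions between event results and pre-event predictions is a percentage-difference screen, sketched below with invented measure names and an arbitrary threshold; the actual comparison criteria are those laid out in the TAP.

# Hypothetical MOE values: event results vs. pre-event constructive predictions.
event_results = {"mean_time_to_detect_s": 42.0, "weapon_effectiveness": 0.81}
predicted = {"mean_time_to_detect_s": 35.0, "weapon_effectiveness": 0.78}

THRESHOLD_PCT = 15.0   # example screening threshold, not a SIMAF standard

for measure, event_value in event_results.items():
    predicted_value = predicted.get(measure)
    if predicted_value is None or predicted_value == 0:
        continue
    diff_pct = 100.0 * abs(event_value - predicted_value) / abs(predicted_value)
    flag = "INVESTIGATE" if diff_pct > THRESHOLD_PCT else "consistent"
    print(f"{measure}: event={event_value} predicted={predicted_value} diff={diff_pct:.1f}% {flag}")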

3.6.3 Deliver Capability

3.6.3.1 Document Results

Input: Test/Analysis Results and Data Available Templates/Tools: Technical Assessment Report and Briefing Template Output: Completed Technical Assessment Report and Briefing OPR: Analysis Lead, Contractor Technical Lead, Government Technical Lead

Activity Description: Using the results of the previous task, the Analysis Lead prepares a Technical Assessment Report to document the event results. The Technical Assessment Report is intended to document the results of the event with a focus on whether or not event objectives and requirements were met. The report includes findings and recommendations, as appropriate. Before the report is provided to the customer, a Peer Review is conducted to ensure its accuracy, quality, and validity. The report is delivered to the customer and appropriate stakeholders for review. If the customer provides feedback that impacts the final report, it is updated and resubmitted to the customer for review and/or acceptance. The customer may elect to have the Technical Assessment Report presented as a PowerPoint briefing. If this is the case, the Government Technical Lead will schedule the briefing and notify the attendees. The approved final Technical Assessment Report is filed in the appropriate project folder.

3.6.3.2 Archive Data

Input: Event Artifacts and Technical Assessment Report Available Templates/Tools: CM Tool Output: Archived Artifacts OPR: Government PjM, Government Technical Lead, Analysis Lead, Contractor Technical Lead

Activity Description: The objective of this activity is to ensure that event artifacts are captured and archived for future review and reuse. The Government PjM and Government Technical Lead ensure that all project documentation (briefings, milestone review minutes, action items, deliverables, etc.) is located in the appropriate location. The Analysis Lead archives the processed data used to generate the charts, tables, etc. presented in the Final Assessment Report. The Contractor Technical Lead archives the “as built” configuration of the event M&S environment. The Contractor Technical Lead is also responsible for creating Change Requests (CRs) for all CSPEIs and/or CIs developed or modified for the project. The Government Technical Lead and SAIL schedule a Configuration Review Board (CRB) to review each CR. A Configuration Control Board (CCB) is then conducted to bring each CI under configuration control. Classified documents are stored in accordance with SIMAF information security policies.

3.6.4 Conduct Project Closeout Review

Input: Completed Closeout Review Checklist Available Templates/Tools: Closeout Presentation, Closeout Review Checklist Output: Closed Project OPR: Government Project Execution Team

Activity Description: The Government Project Execution Team collaborates to perform a programmatic assessment of the project. The purpose is to examine how well the project was executed based on the PWS/SOW and PMP and to ensure all deliverables have been received and archived. This includes a review of schedule and cost variances, and technical results. This information is documented and used by the Government as feedback to refine project management and technical processes and estimation methods. The results of this assessment, including any lessons learned and/or recommended process improvements, are documented in the Closeout briefing. In the case of process improvement recommendations, the project team works with the Technical Director to ensure that these improvements are documented and implemented.

The Government PjM and Government Technical Lead complete the project closeout checklist and prepare the Closeout Review briefing. The Government PjM, Government Technical Lead, and Configuration Manager also perform a final review of the project folder to ensure that the following items are present:
1. Approved PID, SOC, and PWS.
2. Approved project deliverables per the SOW/PWS. If any of the project documentation is classified, the unclassified project folder shall contain a list of classified documents and their location.
3. Signed DD250 – before the project is closed, the Government PjM ensures that all contract obligations have been satisfied.

3.7 Deploy/Sustain Capability

To be supplied in the next version of the SMP. In the interim, refer to the appropriate Standard Operating Procedures and/or processes as referenced in the following subparagraphs.

3.7.1 Establish Product Baseline SOP_CM_006_Product_Baseline_Creation

3.7.2 Deploy Product SOP_CM_002_Software_Distribution_Management



3.7.3 Manage Configuration
Management of approved product baselines shall be performed by the Configuration Manager in accordance with the Configuration Management Plan (CMP).



Appendix A – Terms and Definitions

Analysis Requirements Review (ARR) - assesses the top level analytic objectives, measures and data elements for a study and determines if they are adequate to support the study and defined in sufficient detail.

Analysis Traceability Matrix (ATM) - traces the analysis requirements for the project from the customer question(s) to analysis objectives to Measures of Effectiveness (MOEs)/Measures of Performance (MOPs) to the data element(s) needed to perform the measure.

Concept of Operations (CONOPS) - a verbal or graphic statement, in broad outline, of a commander's assumptions or intent in regard to an operation or series of operations. The concept of operations frequently is embodied in campaign plans and operation plans; in the latter case, particularly when the plans cover a series of connected operations to be carried out simultaneously or in succession. The concept is designed to give an overall picture of the operation. It is included primarily for additional clarity of purpose. It is also called commander's concept.

Configuration Item (CI) - any component of an information technology infrastructure that is under the control of configuration management. Configuration items (CIs) can be individually managed and versioned, and they are usually treated as self-contained units for the purposes of identification and change control.

Design of Experiments (DOE) - the process of planning an experiment so that appropriate data (which can be analyzed by statistical methods) will be collected through economical (efficient) testing, resulting in valid and objective conclusions.

Event – in the context of this SMP an event is any formal demonstration of capability that is conducted in accordance with the project’s contract and governed by process. Examples of events are capability demonstrations, qualification, validation or acceptance tests, and runs for record associated with analysis or experimentation.

Event Readiness Review (ERR) - a review to ensure that the integrated dry runs support the calculation of the analytic objectives and to ensure that everything necessary to support the full-up event is in place.

Information Exchange Requirements (IERs) - information exchange requirements characterize the information exchanges to be performed by the proposed system(s). For Capability Development Documents (CDDs), top-level IERs are information exchanges between systems of combatant command/Service/agency, allied, and coalition partners. For Capability Production Documents (CPDs), top-level IERs are information exchanges external to the system (i.e., with other combatant commands/services/agencies, allied and coalition systems). IERs identify who exchanges what information with whom, why the information is necessary, and how the information exchange must occur. Top-level IERs identify warfighter information used to support a particular mission-related task and exchanged between at least two operational systems supporting a joint or combined mission. The quality (i.e., frequency, timeliness, security) and quantity (i.e., volume, speed, and type of information such as data, voice, and video) are attributes of the information exchange included in the IER. IERs are architecturally represented through the Operational View-3 (OV-3), Operational Information Exchange Matrix, in both the DoD Architecture Framework and the Ministry of Defence Architecture Framework (MODAF).

Integrated Master Schedule (IMS) - defines the overall development and execution schedule for the project and includes, by phase, the applicable milestone reviews and associated deliverables.

Integration Test Readiness Review (ITRR) - review of the unit and system level test results prior to full integration and testing.

Live, Virtual, and Constructive (LVC) Simulation - the categorization of simulation into live, virtual, and constructive is problematic, because there is no clear division between these categories. The degree of human participation in the simulation is infinitely variable, as is the degree of equipment realism. This categorization of simulations also suffers by excluding a category for simulated people working real equipment (e.g., smart vehicles). (DOD 5000.59-P)
• Constructive Model or Simulation - models and simulations that involve simulated people operating simulated systems. Real people stimulate (make inputs to) such simulations, but are not involved in determining the outcomes. (DOD 5000.59-P)
• Live Simulation - a simulation involving real people operating real systems. (DOD 5000.59-P)
• Virtual Simulation - a simulation involving real people operating simulated systems. Virtual simulations inject human-in-the-loop in a central role by exercising motor control skills (e.g., flying an airplane), decision skills (e.g., committing fire control resources to action), or communication skills (e.g., as members of a C4I team). (DOD 5000.59-P)

Measure of Effectiveness (MOE) – measure of operational success that must be closely related to the objective of the mission or operation being evaluated. For example, the number of enemy submarines sunk or enemy tanks destroyed may be satisfactory MOEs if the objective is to destroy such weapon systems. However, if the real objective is to protect shipping or an infantry battalion, then the best course of action might be one that results in fewer friendly submarines or tanks actually killed. A meaningful MOE must be quantifiable and measure to what degree the real objective is achieved. (Defense Acquisition Deskbook)

Measure of Performance (MOP) - a measure of performance reflects a system’s technical capabilities and may be expressed in terms of speed, payload, range, time on station, survivability (susceptibility, vulnerability, and recoverability), or other distinctively quantifiable performance feature. (Defense Acquisition Deskbook)

Mission - a finite defined set of tasks necessary to achieve a desired mission level objective and support a vignette or scenario level objective.

Mission Task (MT) - MTs are “military operational tasks” that are performed while executing the mission and are derived directly from the capability requirements identified in the ICD or CDD. They are usually expressed in terms of general tasks to be performed or effects to be achieved (e.g., hold targets at risk, provide countermeasures against surface-to-air missiles, or communicate in a jamming environment). The MoEs are then developed to measure “how well” each alternative performs the tasks or achieves the desired effects. Because MTs are tasks, cost is never an MT or MoE, and cost is never considered in the effectiveness analysis. All capabilities discussed in the ICD or CDD should be addressed in the MTs and MoEs for the AoA. MoEs are supported by one or more MoPs. Because the AoA or analytical study tries to identify the most promising solution(s), MTs must not be stated in solution-specific language. MoEs should not call for optimizing aspects of a task or effect, because this often has unintended impacts on cost or other aspects of the alternatives’ performance. For example, one solution to minimizing aircraft attrition could be not flying missions; this solution would hardly be conducive to placing targets at risk. Similarly, maximizing targets destroyed may result in unacceptable attrition.

Operational Requirements Review (ORR) - assesses the Operational Requirements/Products before the construction of the Use Cases.

Operational View (OV) - derived from the operational architecture, the OV is a description of the tasks and activities, operational elements, and information exchanges required to accomplish DOD missions. DOD missions include both warfighting missions and business processes. The OV contains graphical and textual products comprising an identification of the operational nodes and elements, assigned tasks and activities, and information flows required between nodes. It defines the types of information exchanged, the frequency of exchange, which tasks and activities are supported by the information exchanges, and the nature of information exchanges.

Project Introduction Document (PID) - serves as the agreement between SIMAF and its customers on the project's overall analytic goals.

Project Management Plan (PMP) – the contractor’s plan for project execution which includes a brief project description, a list of the software Configuration Items (CIs) to be developed or modified, a list of required deliverables, a matrix of required skills, a high level Integrated Master Schedule (IMS), project cost/performance reporting information, and risk identification and mitigation information.

Protocol Data Unit (PDU) Standards - formally defined data exchange standards established for each of the several primary classes of functionality that is represented in the DIS synthetic environment; e.g., movement, weapons, firing effects, collisions, etc.

Protocol Data Unit (PDU) - Distributed Interactive Simulation (DIS) terminology for a unit of data that is passed on a network between simulation applications.

Protocol Entity - object that exchanges information with other protocol entities in a network via Protocol Data Units in accordance with an established protocol. A key attribute of a protocol entity is its state. State transitions occur in a given protocol entity in accordance with the established protocol as the result of: a. Protocol Data Units received from other protocol entities, and b. occurrence of an external event (e.g., expiration of a time-out counter.)

Protocol - set of rules and formats (semantic and syntactic) that defines the communication behavior of simulation applications.



Requirements Traceability Matrix (RTM) - contains traceability information among requirements (system, subsystem, and software), design components (Configuration Items (CIs)), and test cases.

Rules of Engagement (ROEs) - the circumstances under which troops/forces will initiate and/or continue combat. The ROEs determine when, where, and how force shall be used.

Scenario - a geographic region of interest that encompasses political objectives and aspirations and includes: nationalities, military forces/bases/targets, civilian infrastructure (for control or targeting), and potential targets of political value.

Software Design Review (SDR) - review of the software design prior to coding the design for the virtual battlespace.

Software Requirement Review (SRR) - review of all derived software requirements prior to the development of the design.

Statement of Capability (SOC) - documents the agreement between SIMAF and its customer on the scope of activity, type of activity, and costs.

Systems View (SV) - derived from the systems architecture, the SV is a set of graphic and text products describing systems and interconnections providing for, or supporting, DOD functions. DOD functions include both warfighting and business functions. The SV associates systems resources to the OV. These systems resources support operational activities and facilitate the exchange of information among operational nodes.

Tactics, Techniques, and Procedures – the actions and methods that implement joint doctrine (via an Operational Concept or CONOPS) and describe how forces will be employed in joint operations.

Technical Standards View (TV) - derived from the technical architecture, the TV represents the minimal set of rules governing the arrangement, interaction, and interdependence of system parts or elements. It ensures a system satisfies a specified set of operational requirements. The TV provides the technical systems implementation guidelines upon which engineering specifications are based, common building blocks are established, and product lines are developed. The TV includes a collection of the technical standards, implementation conventions, standards options, rules, and criteria organized into profile(s) to govern systems and system elements for a given architecture.

Technical Assessment Plan (TAP) – documents the overall modeling and simulation environment.

Use Case – links the operational activities to the weapon system requirements necessary to achieve those activities. Each use case is composed of tasks within a mission phase, which are linked to the weapon system and/or subsystem.

Validation - The process of determining the degree to which a model, simulation, or federation of models and simulations, and their associated data are accurate representations of the real world from the perspective of the intended use(s).


Verification – The process of determining that a model, simulation, or federation of models and simulations, and their associated data, accurately represent the developer's conceptual description and specifications.
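
The distinction between the two terms can be made concrete with a notional sketch: verification compares the implementation against the developer's specification, while validation compares simulation output against real-world reference data for the intended use. The checks, tolerances, and values below are illustrative only.

    def verify(measured_rate_hz, specified_rate_hz, tol=0.01):
        # Verification: does the implementation meet the developer's specification?
        return abs(measured_rate_hz - specified_rate_hz) <= tol * specified_rate_hz

    def validate(simulated_ranges, measured_ranges, max_rel_error=0.10):
        # Validation: is the model accurate with respect to real-world data
        # for the intended use (here, within 10% of measured detection ranges)?
        return all(abs(s - m) <= max_rel_error * m
                   for s, m in zip(simulated_ranges, measured_ranges))

    # Illustrative values only.
    print(verify(measured_rate_hz=19.9, specified_rate_hz=20.0))   # True
    print(validate([48.0, 61.5], [50.0, 60.0]))                    # True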

Vignette – a set of conditions, within a finite and defined geographic scenario, that provides constraints for coordinated missions to achieve one or more objectives.

Appendix B – Acronyms

AFLCMC – Air Force Life Cycle Management Center
AFMC – Air Force Materiel Command
AOI – Area of Interest
AOR – Area of Responsibility
ARR – Analysis Requirements Review
ATM – Analysis Traceability Matrix
BA – Budget Authority
BAI – Battlespace Air Interdiction
BDA – Battle Damage Assessment
C2 – Command and Control
CA – Counter Air
C&A – Certification and Accreditation
CAS – Close Air Support
CCaRS – Comprehensive Cost and Requirement System
CCB – Configuration Control Board
CDRL – Contract Data Requirements List
CI – Configuration Item
CM – Configuration Management
CMM – Capability Maturity Model
CMMI – Capability Maturity Model Integrated
CO – Contracting Officer
COMSEC – Communications Security
CONOPS – Concept of Operations
COTS – Commercial-off-the-Shelf
CRB – Configuration Review Board
CR – Change Request
CROP – Change Request Oversight Program
CSAF – Chief of Staff, United States Air Force
CSPEI – Computer Software Product End Item
DAA – Designated Approving Authority
DCA – Defensive Counter Air
DE – Directed Energy
DFAS – Defense Financial Accounting System
DID – Data Item Description
DIS – Distributed Interactive Simulation
DIV – Data and Information Viewpoint
DO – Director of Operations
DoD – Department of Defense
DR – Deficiency Report
DREN – Defense Research and Engineering Network
DT&E – Development Test and Evaluation
DVD – Digital Video Disc


EA – Event Announcement
EAAGLES – Extensible Architecture for the Analysis and Generation of Linked Simulations
ERR – Event Readiness Review
EW – Electronic Warfare
F2T2EA – Find, Fix, Track, Target, Engage, Assess
FAE – Functional Area Evaluator
FCA – Functional Configuration Audit
FM – Financial Management
FM – Fly and Maneuver
FOPR – Fair Opportunity Proposal Request
FPC – Final Planning Conference
FS – Fighter Sweep
FTP – File Transfer Protocol
FY – Fiscal Year
GOTS – Government-off-the-Shelf
HLA – High Level Architecture
H/W – Hardware
IA – Information Assurance
IADS – Integrated Air Defense System
IAO – Information Assurance Officer
IAM – Information Assurance Manager
IAW – In Accordance With
IBA – Integration Branch Assignee
IDD – Interface Design Description
IER – Information Exchange Requirement
IMS – Integrated Master Schedule
IN – Instrumentation
IP – Internet Protocol
IPC – Initial Planning Conference
IPT – Integrated Product Team
IR – Infrared
IRS – Interface Requirements Specification
ITRR – Integration Test Readiness Review
LM – Launch Munitions
LVC – Live-Virtual-Constructive
MCO – Major Combat Operation
MFR – Memorandum for Record
MIL-HDBK – Military Handbook
MIPR – Military Interdepartmental Purchase Request
MOA – Memorandum of Agreement
MOE – Measures of Effectiveness
MOP – Measures of Performance
MOU – Memorandum of Understanding
M&S – Modeling and Simulation
MS&A – Modeling, Simulation, and Analysis

MSR – Monthly Status Report
N/A – Not Applicable
OA – Order Assignment
OCA – Offensive Counter Air
OPR – Office of Primary Responsibility
ORR – Operational Requirements Review
OT&E – Operational Test and Evaluation
OV – Operational View
PAT – Process Action Team
PCA – Physical Configuration Audit
PDU – Protocol Data Unit
PI – Process Improvement
PID – Project Introduction Document
PjM – Project Manager
PM – Program Manager
PMP – Project Management Plan
PMR – Program Management Reviews
POC – Point of Contact
PRR – Project Requirements Review
PWS – Performance Work Statement
QAP – Quality Assurance Personnel
ROE – Rules of Engagement
RTM – Requirements Traceability Matrix
SAE – Service Acquisition Executive
SAP – Special Access Program
SAR – Special Access Required
SBA – Simulation-Based Acquisition
SC – Special Category
SD – Sense and Detect
SDD – Software Design Description
SDR – Software Design Review
SDREN – Secure Defense Research and Engineering Network
SE – Systems Engineering
SEAD – Suppression of Enemy Air Defense
SECAF – Secretary of the Air Force
SIMAF – Simulation and Analysis Facility
SME – Subject Matter Expert
SOC – Statement of Capability
SOJ – Stand-Off Jammer
SOP – Standard Operating Procedures
SOS – System of Systems
SOW – Statement of Work
SOW-A – Statement of Work Amendment
SPWG – SIMAF Process Working Group
SRR – Software Requirements Review

SRS – Software Requirements Specification
SSG – Senior Steering Group
STP – Software Test Plan
STR – Software Test Report
SUM – Software User Manual
SV – Systems View
SvcV – Services View
SVD – Software Version Description
SSAA – System Security Authorization Agreement
TAP – Technical Assessment Plan
TBD – To Be Determined
TBR – To Be Resolved
TENA – Test and Training Enabling Architecture
TOEL – Time Ordered Event List
TSPI – Time-Space-Position-Indicator
TTP – Tactics, Techniques, and Procedures
UML – Unified Modeling Language
UP – Understand, Predict, React
UR – Unfunded Requirement
V&V – Verification and Validation
VIP – Very Important Person
WBS – Work Breakdown Structure
WPAFB – Wright-Patterson Air Force Base

Appendix C – Directory and File Naming Convention

Refer to sop_cm_003_directory_and_file_naming

Appendix D – Configuration Management Plan

Refer to the SIMAF Configuration Management Plan (CMP)
