
Guidelines for Lean Model-Based (System) Architecting and Software Engineering (LeanMBASE)

General permission to make fair use in teaching or research of all or part of these guidelines is granted to individual readers, provided that the copyright notice of the Center for Software Engineering at the University of Southern California is given, and that reference is made to this publication. To otherwise use substantial excerpts or the entire work requires specific permission, as does reprint or republication of this material.

Version Date: 9/28/2005
© 2005 Center for Software Engineering, University of Southern California. All Rights Reserved.

Sections

SECTIONS
VERSION HISTORY
TABLE OF CONTENTS
TABLE OF FIGURES
TABLE OF TABLES
GENERAL GUIDELINES
OPERATIONAL CONCEPT DESCRIPTION (OCD)
SYSTEM AND SOFTWARE REQUIREMENTS DEFINITION (SSRD)
SYSTEM AND SOFTWARE ARCHITECTURE DESCRIPTION (SSAD)
LIFE CYCLE PLAN (LCP)
FEASIBILITY RATIONALE DESCRIPTION (FRD)
CONSTRUCTION, TRANSITION, & SUPPORT (CTS)
ITERATION PLAN
ITERATION ASSESSMENT REPORT
RELEASE DESCRIPTION
QUALITY MANAGEMENT PLAN (QMP)
TEST PLAN AND CASES
TEST PROCEDURES AND RESULTS
PEER REVIEW PLAN
PEER REVIEW REPORT
TRANSITION PLAN
SOFTWARE USER'S MANUAL
SYSTEM AND SOFTWARE SUPPORT PLAN (SSSP)
TRAINING PLAN
SOURCES AND REFERENCES
APPENDICES

APPENDIX A. SUGGESTED WIN–WIN TAXONOMY FOR MBASE
APPENDIX B. LEVEL OF SERVICE REQUIREMENTS
APPENDIX C. COMMON DEFINITION LANGUAGE (CDL) FOR MBASE
APPENDIX D. STATUS REPORTS
INDEX

Version History

Version 1.0 (08/20/05). Authors: Barry Boehm, David Klappholz, Ed Colbert, Prajakta Puri, Apurva Jain, Jesal Bhuta, Hasan Kitapci. Changes: Initial version.

Version 1.1 (09/17/05). Authors: Barry Boehm, Prajakta Puri. Changes: Updated CTS documents.

Version 1.2 (09/28/05). Authors: Barry Boehm, Jesal Bhuta, Alex Lam, Apurva Jain, Prajakta Puri. Changes: Updated OCD guideline (added Section 3.2, Organizational Goals); updated SSAD Sections 3.1.4.1.4 and 4.1.4.1.4; deleted SSAD Sections 3.1.4.1.5, 3.1.5.1.3, 3.1.6.1.3, 4.1.4.1.5, 4.1.5.1.3, 4.1.6.1.3, 4.1.7.1.8, and 4.1.8.1.7; changed the name of Section 3.2.1; updated Figure 2.

Table of Contents

SECTIONS
VERSION HISTORY
TABLE OF CONTENTS
TABLE OF FIGURES
TABLE OF TABLES

GENERAL GUIDELINES
A. Overview of LeanMBASE
B. General Formatting Guidelines
C. Final Remark

OPERATIONAL CONCEPT DESCRIPTION (OCD)
A. Description of the OCD
  1. Purpose of the OCD
  2. OCD Life Cycle Process
  3. Completion Criteria for the OCD
    3.1 Inception Readiness
    3.2 Life Cycle Objectives (LCO)
    3.3 Life Cycle Architecture (LCA)
    3.4 Initial Operational Capability (IOC)
B. Sections of the OCD Document
  1. OCD Overview
    1.1 Introduction to the OCD
    1.2 References
  2. Shared Vision
    2.1 System Capability Description
    2.2 Expected Benefits
    2.3 Benefits Chain (Initiatives, Expected Outcomes, and Assumptions)
    2.4 Success Critical Stakeholders
    2.5 System Boundary and Environment
  3. System Transformation
    3.1 Capability Goals
    3.2 Organizational Goals
    3.3 Proposed New Operational Concept
    3.4 Organizational and Operational Implications
  4. Easy Win-Win Results

SYSTEM AND SOFTWARE REQUIREMENTS DEFINITION (SSRD)

A. Description of the SSRD
  1. Purpose of the SSRD
  2. Completion Criteria for the SSRD
    2.1 Life Cycle Objectives (LCO)
    2.2 Life Cycle Architecture (LCA)
    2.3 Initial Operational Capability (IOC)
B. Sections of the SSRD Document
  1. SSRD Overview
    1.1 Status of the SSRD
    1.2 References
  2. Project Requirements
    2.1 Budget and Schedule
    2.2 Development Requirements
    2.3 Deployment Requirements
    2.4 Transition Requirements
    2.5 Support Environment Requirements
  3. Capability (Functional/Product) Requirements
  4. System Interface Requirements
    4.1 User Interface Standards Requirements
    4.2 Hardware Interface Requirements
    4.3 Communications Interface Requirements
    4.4 Other Software Interface Requirements
  5. Level of Service (L.O.S.) Requirements
  6. Versions, States and Modes
  7. Evolutionary Requirements
  8. Appendices

SYSTEM AND SOFTWARE ARCHITECTURE DESCRIPTION (SSAD)
A. Description
  1. Purpose
  2. Completion Criteria
    2.1 Life Cycle Objectives (LCO)
    2.2 Life Cycle Architecture (LCA)
    2.3 Initial Operational Capability (IOC)
B. Document Sections
  1. Introduction
    1.1 Purpose of the SSAD Document
    1.2 Standards and Conventions
    1.3 References
  2. System Analysis
    2.1 Structure
    2.2 Artifacts & Information
    2.3 Behavior
  3. Platform Independent Model
    3.1 Structure
    3.2 Data Model
    3.3 Behavior
    3.4 Architectural Styles, Patterns & Frameworks
  4. Platform Specific Model
    4.1 Structure
    4.2 Behavior
    4.3 Patterns & Frameworks
    4.4 Project Artifacts
  5. Glossary for System Analysis and Design
  6. Appendices

LIFE CYCLE PLAN (LCP)

A. Description of the LCP
  1. Purpose
  2. LCP Life Cycle Process
  3. Completion Criteria
    3.1 Life Cycle Objectives (LCO)
    3.2 Life Cycle Architecture (LCA)
    3.3 Initial Operational Capability (IOC)
B. Sections of the LCP Document
  1. Introduction
    1.1 Status of the LCP Document
    1.2 Assumptions
    1.3 References
  2. Milestones and Products
    2.1 Overall Strategy
    2.2 Phases
    2.3 Project Deliverables
  3. Responsibilities
    3.1 Overall Summary
    3.2 By Phase / Stage
  4. Approach
  5. Resources
  6. Appendix

FEASIBILITY RATIONALE DESCRIPTION (FRD)
A. Description of the FRD
  1. Purpose
  2. FRD Life Cycle Process
  3. Completion Criteria
    3.1 Life Cycle Objectives (LCO)
    3.2 Life Cycle Architecture (LCA)
    3.3 Initial Operational Capability (IOC)
B. Sections of the FRD Document
  1. Introduction
    1.1 Status of the FRD Document
    1.2 References
  2. Business Case Analysis
  3. Requirements Traceability
  4. Level of Service Feasibility
  5. Process Feasibility
  6. Risk Assessment
  7. Analysis Results
    7.1 Introduction
    7.2 System Structure
    7.3 COTS, Legacy, Custom Components and Connectors Description
    7.4 Component Interaction Evaluation
    7.5 Evaluation Summary
  8. Appendix

CONSTRUCTION, TRANSITION, & SUPPORT (CTS)
A. General Construction Process Guidelines
B. Guidelines for the Deliverables

C. General Guidelines for Plans and Reports

ITERATION PLAN
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Sections
  1. Iteration Overview
    1.1 Capabilities to be Implemented
    1.2 Capabilities to be Tested
    1.3 Capabilities not to be tested
  2. Plan
    2.1 Schedule and Resources for Activities
    2.2 Team Responsibilities
  3. Assumptions

ITERATION ASSESSMENT REPORT
A. Description
  1. Purpose
  2. Additional Information
B. Document Sections
  1. Overview
    1.1 Capabilities Implemented
    1.2 Summary of Test Results
  2. Adherence to Plan
  3. External Changes Occurred
  4. Suggested Actions

RELEASE DESCRIPTION
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Sections
  1. About This Release
    1.1 Physical Inventory of materials released
    1.2 Inventory of software contents
  2. Compatibility Notes
  3. Upgrading
  4. New Features and Important Changes
    4.1 New Features
    4.2 Changes since previous release
    4.3 Upcoming Changes
  5. Known Bugs and Limitations

QUALITY MANAGEMENT PLAN (QMP)
A. Description
  1. Purpose
  2. Quality Focal Point
  3. Additional Information

  4. High-Level Dependencies
B. Document Sections
  1. Purpose
    1.1 Overview
    1.2 References
  2. Quality Guidelines
    2.1 Design Guidelines
    2.2 Coding Guidelines
    2.3 Quality Assessment Methods
    2.4 Process Assurance
    2.5 Product Assurance
    2.6 Problem Reporting and Tracking System
    2.7 Configuration Management

TEST PLAN AND CASES
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Section
  1. Introduction
    1.1 Purpose of the Test Plan
    1.2 References
  2. Environment Preparation
    2.1 Hardware preparation
    2.2 Software preparation
    2.3 Other pre-test preparations
  3. Test Identification
    3.1 Test Identifier
  4. Resources
    4.1 Responsibilities
    4.2 Staffing and Training Needs
    4.3 Other resource allocations
  5. Test Completion Criteria

TEST PROCEDURES AND RESULTS
A. Description
  1. Purpose
  2. High Level Dependencies
B. Document Sections
  1. Introduction
    1.1 Purpose
    1.2 References
  2. Test Preparation
  3. Test Case Description
  4. Test Incident Reports
    4.1 Incident Description
    4.2 Impact
  5. Test Log
  6. Test Summary
    6.1 Overall assessment of the software tested
    6.2 Impact of test environment

    6.3 Recommended improvements
  7. Operational Test and Evaluation
    7.1 Evaluation Criteria
    7.2 Outline of Operational Test and Evaluation Report

PEER REVIEW PLAN
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Sections
  1. Purpose
    1.1 Overview
    1.2 References
  2. Peer Review Items, Participants and Roles
  3. Peer Review Process
  4. Classification of Defects
    4.1 Priority and Criticality
    4.2 Classification
  5. Appendices: Forms for Peer Review Data Gathering

PEER REVIEW REPORT
A. Description
  1. Purpose
B. Document Sections
  1. Purpose
    1.1 Overview
    1.2 References
  2. Individual Inspection-like Peer Reviews
    2.1 Participants and Roles
    2.2 Peer Review Items
    2.3 Pre-Peer Review Meeting Defect Data
    2.4 Peer Review Meeting Defect Data
    2.5 Post-Peer Review Meeting Defect Data
    2.6 Peer Review Statistics
    2.7 Defect Correction and Rework
  3. Peer Review Summary Information
    3.1 Peer Reviews Performed
    3.2 Peer Review Results
  4. Appendix

TRANSITION PLAN
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Sections
  1. Transition Strategy
    1.1 Transition Objectives
    1.2 Transition Process Strategy
  2. Preparing for Transition
    2.1 Hardware Preparation

    2.2 Software Preparation
    2.3 Site Preparation
  3. Stakeholder Roles and Responsibilities

SOFTWARE USER'S MANUAL
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Sections
  1. Introduction
    1.1 System Overview
    1.2 System Requirements
  2. Operational Procedures
  3. Installation Procedures
    3.1 Initialization Procedures
    3.2 Re-installation
    3.3 De-installation
  4. Troubleshooting
    4.1 Frequently Asked Questions
    4.2 Error Codes and Messages
  5. Notes

SYSTEM AND SOFTWARE SUPPORT PLAN (SSSP)
A. Description
  1. Purpose
  2. High-Level Dependencies
B. Document Sections
  1. Support Objectives and Assumptions
    1.1 Support Objectives
    1.2 Support
  2. Support Strategy
  3. Support Environment
  4. Support Responsibilities

TRAINING PLAN
A. Description
  1. Purpose
B. Document Sections
  1. Schedule and Participants
    1.1 Training Schedule
    1.2 Measure of Success
    1.3 Training of Others
  2. Tutorial and Sample Data
  3. Training Feedback

SOURCES AND REFERENCES

A. Overall
B. Operational Concept Description
C. System and Software Requirements Definition
D. System and Software Architecture Description
E. Life Cycle Plan

APPENDICES
APPENDIX A. SUGGESTED WIN–WIN TAXONOMY FOR MBASE
APPENDIX B. LEVEL OF SERVICE REQUIREMENTS
APPENDIX C. COMMON DEFINITION LANGUAGE (CDL) FOR MBASE
APPENDIX D. STATUS REPORTS
INDEX

Table of Figures

Figure 1: Benefits Chain Notation
Figure 2: Example Sierra Mountainbikes Order Processing Benefits Chain
Figure 3: Notation to be used for System Boundary and Context
Figure 4: Example System Boundary and Context
Figure 5: Example Entity Relationship Diagram
Figure 6: Example Business Workflow
Figure 7: Architecture 1: Images stored on FTP Server
Figure 8: Architecture 2: Image stored in the database
Figure 9: Sequence diagram for Java business logic, Java-MySQL database connector, and MySQL database
Figure 10: Area of Concern Log Form
Figure 11: Problems/Issues Log Cover Sheet Form
Figure 12: Field Descriptions for the Areas of Concern Log
Figure 13: Field Descriptions for Artifact Review Problem List

Table of Tables

Table 1: Differences between Current and Proposed System
Table 2: Table of Level of Service Goals
Table 3: Requirements and Attributes Documentation Overview
Table 4: Project Requirement
Table 5: Capability Requirement
Table 6: System Interface Requirement
Table 7: Level of Service Requirement
Table 8: MARS
Table 9: Stakeholder responsibilities during software life cycle
Table 10: Traceability Matrix
Table 11: Level of Service Product and Process Strategies
Table 12: Process Model Decision Table
Table 13: Conditions for Additional Complementary Process Model Options
Table 14: Software Risk Management Techniques
Table 15: COTS Description
Table 16: MySQL Database Management System Component Description
Table 17: MSAccess Database Management System Component Description
Table 18: MySQL-JDBC Database Connector Description
Table 19: Test Table
Table 20: Test Table for a test of the interaction of Java Business Logic, MySQL-JDBC Database Connector, and MySQL database
Table 21: Example of Evaluation Summary
Table 22: Test Cases
Table 23: Test Completion Criteria
Table 24: Test Procedure Template
Table 25: Training Schedule

General Guidelines

Every CS577a student, regardless of which system definition element (OCD, SSRD, SSAD, LCP, FRD) he or she is going to concentrate on writing, should read this document, through the end of the section entitled Feasibility Rationale Description, before starting to write anything. The reason is that consistency among the various elements is critical to project success, and it's hard to maintain consistency if you don't know what type of information will appear in all the elements. Additionally, each student should read these general guidelines carefully before proceeding to the guidelines for the individual software system definition elements that must be produced and delivered by each CS577 team.

A. Overview of LeanMBASE

“Lean Model-Based System Architecting and Software Engineering” (LeanMBASE) is an approach to the development of software systems that integrates the system’s process (PS), product (PD), property (PY) and success (SS) models, models that are documented in the following system definition elements (also referred to as “artifacts” or “deliverables”):

 Operational Concept Description (OCD)

 System and Software Requirements Definition (SSRD)

 System and Software Architecture Description (SSAD)

 Life Cycle Plan (LCP)

 Feasibility Rationale Description (FRD)

 Construction, Transition, Support (CTS) plans and reports

 Risk-driven prototypes

The essence of the LeanMBASE approach is to develop the system definition elements concurrently, through iterative refinement, using the risk-driven, three-anchor point, Win–Win Spiral approach defined in [Boehm, 1996].

The three anchor points are milestones within a software system’s development at which it is logical for all critical stakeholders to decide whether continuation of the project will produce a win situation or whether the project should be stopped or radically altered. The three anchor points, or milestones, used in LeanMBASE software development, are referred to as the Life Cycle Objectives (LCO) anchor point, the Life Cycle Architecture (LCA) anchor point, and the Initial Operating Capability (IOC) anchor point. As far as the system definition elements are concerned, they have to satisfy the specific completion criteria described below at the three anchor points:

Life Cycle Objectives (LCO):

 Less structured, with information moving around

 Focus on the strategy or "vision" (e.g., for the Operational Concept Description and Life Cycle Plan), as opposed to the details

 Could have some mismatches (indicating unresolved issues or items)

 No need for complete forward and backward traceability

 May still include "possible" or "potential" elements (e.g., Entities, Components, …)


 Some sections could be left as TBD, particularly Construction, Transition, and Support plans

Life Cycle Architecture (LCA):

 More formal, with solid tracing upward and downward

 No major unresolved issues or items, and closure mechanisms identified for any unresolved issues or items (e.g., “detailed data entry capabilities will be specified once the Library chooses a Forms Management package on February 15”)

 No more TBDs except possibly within Construction, Transition, and Support plans

 Basic elements from the Life Cycle Plan are indicated within the Construction, Transition, and Support plans

 There should no longer be any "possible" or "potential" elements (e.g., Entities, Components, …)

 No more superfluous, unreferenced items: each element (e.g., Entities, Components, …) should either reference, or be referenced by, another element. Items that are not referenced should be eliminated, or documented as irrelevant

Initial Operating Capability (IOC):

 Complete tracings within and between models and the delivered software (e.g., comments in code trace to SSAD design elements; see the sketch after this list)

 LeanMBASE models are updated to be consistent (but not necessarily complete) with delivered system, that is “as built” OCD, SSRD, SSAD, etc. models

 Core system capability requirements have been implemented and tested

 At least one construction iteration

 Complete set of CTS plans and reports consistent with the development completed
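The following is a minimal, hypothetical sketch of what "comments in code trace to SSAD design elements" might look like in practice; the class name and SSAD element numbers are illustrative assumptions, not taken from any actual project's SSAD.

    # Hypothetical example: the component and behavior identifiers below are
    # placeholders for whatever element numbers the project's SSAD actually uses.
    class OrderValidator:
        """Implements SSAD 3.1.x Component: OrderValidator (traces to OC-1 Central Order Processing)."""

        def validate(self, order: dict) -> bool:
            # SSAD 3.3.x Behavior: interactive validation against administrator-editable criteria.
            required_fields = ("customer_id", "items", "payment")
            return all(field in order for field in required_fields)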

 The system definition elements are strongly integrated and a strong traceability thread ties the various sections. Therefore, to enforce conceptual integrity, it is essential that team members work collaboratively, particularly on strongly related sections.

 Due to the strong interdependencies, it may be a good idea to follow some order when producing the deliverables, at least initially: e.g., write core sections of the OCD before the SSRD. During successive iterations, the documents generally should not be traversed in a linear fashion. Forward consistency should always be enforced (if an Entity is added to the Entity Model, then it should be examined as to how it affects the Component Model). Backward consistency can be less strongly enforced, but is useful to do where feasible.

 Strongly dependent sections are indicated by [Consistent with DDD x.x.x] where DDD is the LCO/LCA deliverable, and x.x.x the section number. When reviewing the deliverables and checking the overall conceptual integrity, it is very helpful to review strongly connected sections in sequence, as opposed to reviewing the deliverables in a linear fashion.

 Conceptual integrity and consistency between the various deliverables, at a given milestone (LCO/LCA), is critical. In particular, a system definition element should not be "incomplete" with respect to the remaining ones. For instance, if the SSRD specifies more requirements than the architecture described in the SSAD supports, but the FRD claims that the architecture will satisfy all the requirements, the SSAD would be


considered incomplete. It is important to reconcile the deliverables, and make sure that one deliverable is not "one iteration ahead" of the other deliverables.

Additional detail on the Completion Criteria for each of the artifacts at each anchor point may be found in the guidelines for the various artifacts.

Note that achievement of the Completion Criteria for the LCO deliverables constitutes the "exit criteria" for the LCO, or "inception," phase of a development project, and achievement of the Completion Criteria for the LCA deliverables constitutes the "exit criteria" for the project's "elaboration" phase; i.e., achievement of the LCA Completion Criteria is an indication that implementation of the software system may reasonably begin.

It is strongly recommended that the principles listed below be followed by development teams in constructing the various deliverables:

 Each deliverable must meet the Completion Criteria for the anchor point for which it is being developed, i.e., the exit criteria for the development phase terminated by that anchor point. This is the one and only criterion for completion of that deliverable. One thing that this implies is that there is no mandated minimum or maximum number of pages for any deliverable; rather, the size of a deliverable is the number of pages needed to concisely document what is necessary for the project at hand – and can differ radically from project to project. (An important consequence of this is that a good grade can just as easily be jeopardized by putting unnecessary, useless information in a deliverable as by omitting information critical to the proper modeling of the software system and of the plan to be used for its development.)

 A question that frequently arises in software development projects is that of when each particular aspect of the software being developed (the “product”) or of the way it is being developed (the “project”) should be specified/finalized and “cast in stone.” A primary characteristic of the LeanMBASE process is to be risk driven at all times – because risk-driven development greatly increases the probability of project success. Tricky specification / finalization issues should, therefore, be resolved on the basis of risk considerations. (This flies in the face of the frequent assumption that “the earlier something is completed the better” or “the more that has been ‘accomplished’ to date, the better.”) The proper approach is:

 If it’s risky not to specify / finalize now Do specify / finalize now (e.g., a safety-critical hardware-software interface)

 If it’s risky to specify / finalize now, Don’t specify / finalize yet. For example:

 Early rigorous specification (casting in stone) of user screen layouts should be avoided because:

. On the one hand, it is risky to lock these in before users have had a chance to play with them and decide what they actually want/need

. And, on the other hand, GUI-builder tools make it very low risk to redo (iteratively refine) screen layouts with the aid of actual users.

 Another implication of the principle of risk-driven development, already stated above, but without relating it to risk considerations, and, therefore, worth repeating, is that the amount of text that you put into each section of each artifact should be determined by the notion that:

 It’s risky to omit information that will be required for the successful completion of the product’s development and/or for its later maintenance, so put include such information. (And, it’s also risky to omit such information because instructors, TA’s, and graders are good judges of what’s necessary, and will lower your grade if something necessary is missing.)

 It’s risky to include information that is repetitive or irrelevant – “more” is not necessarily “better” – because doing so results in wasted time, or, worse yet, confusion, on the part of other developers and maintainers, so leave out such information. (And, it’s also risky to include repetitive or irrelevant


information because instructors, TA’s, and graders are good judges of what’s repetitive or irrelevant, and will lower your grade if such material is included.)

 Part of what you’re expected to learn in CS577 is how to judge relevance, repetitiveness, and irrelevance in the execution of a software development project.

 Visual models should be used wherever possible, e.g.:

 In the OCD/SSRD: block diagrams, context diagrams

 In the SSAD: UML diagrams

 In the LCP: tables, Gantt charts, PERT charts

 Information specified in any section/model within any one of the deliverables should occur in only one place within all the deliverables. The reason is that:

 Correct development of a software system can be severely compromised if different versions of a section/model exist in different artifacts or in different parts of the same artifact. (A developer may read an outdated version, mistakenly think that it details the proper way to proceed, and waste time pursuing a useless – or, in the worst case, project-compromising – activity.)

 Given that different team members are typically either leads for, or participants in, the writing of different artifacts, it is extremely easy for a duplicated section/model to be (quite correctly) updated in one artifact, but not in another. (In fact, experience suggests that even duplication within a single artifact is likely to result in inconsistent updates.)

 Since it is often necessary, within one section of one artifact, to refer to something in another section of the same, or of a different, artifact, the way to do it is just that, i.e., not to copy that which must be referred to, but, rather, to give it a unique identifier, and a short name, and to refer to it by that identifier and name. For example, in the OCD, “capability goals” for the software system being developed are listed. The following is an example of an operational capability for an order processing system to be developed for a manufacturing company:

OC-1 Central Order Processing: Orders may be (i) entered and processed directly via the Sierra Mountainbikes (SMB) central website and Enterprise Resource Planning (ERP) system, or, in the case of telephone or fax orders, (ii) entered by SMB service personnel. Orders are validated interactively, using validation criteria editable by administrators.

Should it be necessary to reference this operational capability in the SSRD or in the FRD – both of which are likely -- it should be referred to simply as “OC-1 Central Order Processing.” Doing so eliminates the update inconsistency that would likely occur if OC-1 had to be updated/changed, for good reason, by the writers of the OCD.
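The following is a minimal sketch (not part of the LeanMBASE guidelines themselves) of how a team might keep each capability goal defined in exactly one place and cite it from other artifacts by identifier and short name only; the helper function and the second entry are hypothetical illustrations.

    # Single source of truth: the full text of each capability goal lives only in the OCD.
    CAPABILITY_GOALS = {
        "OC-1": "Central Order Processing",
        "OC-6": "In-Transit Visibility",
    }

    def cite(goal_id: str) -> str:
        """Return the canonical 'identifier + short name' form used in other artifacts."""
        return f"{goal_id} {CAPABILITY_GOALS[goal_id]}"

    # In the SSRD or FRD, refer to the goal instead of copying its full text:
    print(f"This requirement traces to {cite('OC-1')}.")  # "... traces to OC-1 Central Order Processing."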

 When creating a reference:

 Avoid a "broken" reference – a reference to something that no longer exists – or an invalid reference by checking that the writers of the item referenced have not deleted it or changed it since the last version that you read of the artifact in which you last saw it. For example, examine the latest version of the OCD before adding a reference to "OC-1 Central Order Processing," as the OCD's authors may have changed the details of the operational capability, or may even have deleted it entirely, since the last version of the OCD that you saw. (The criticality to project success of avoiding this problem is a very strong argument for putting all artifacts under version control.)
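As one possible way to mechanize this check, the sketch below scans an artifact's text for identifiers of the form OC-n, OG-n, or LOS-n and flags any that the current OCD no longer defines. The identifier pattern, the set of defined identifiers, and the sample text are all illustrative assumptions.

    import re

    # Identifiers currently defined in the latest OCD (hypothetical values).
    DEFINED_IDS = {"OC-1", "OC-6", "OG-1", "OG-2", "OG-3", "OG-4"}

    def broken_references(artifact_text: str) -> set:
        """Return cited identifiers that are not defined in the OCD (i.e., broken references)."""
        cited = set(re.findall(r"\b(?:OC|OG|LOS)-\d+\b", artifact_text))
        return cited - DEFINED_IDS

    ssrd_excerpt = "CR-3 traces to OC-1 Central Order Processing and OC-7 Reporting."
    print(broken_references(ssrd_excerpt))  # {'OC-7'} -- OC-7 no longer exists in the OCD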


 If assumptions are made in the LCO/LCA packages, it is important to reality-check them as much as possible. For example, if you write "This assumes that the COTS package will do X", determine the likelihood that it actually will. If the likelihood is low that the package will do X, identify this as a risk, and create a risk management strategy for dealing with it. Additionally, avoid introducing non-customer and non-domain-expert assumptions.

 Do not include text from the guidelines or from outside sources in your deliverables without relating the text to your project's specifics. And, don't repeat in great detail software engineering principles and explanations from the lectures or readings in your deliverables as:

 The purpose of the deliverables is to convey product- and project-specific information.

 Those who use – and those who grade – the deliverables may be assumed to know the fundamentals of software engineering.

B. General Formatting Guidelines

 Every document should have the following information on the cover page

 Document Title

 Project Title

 Team number

 Team Members and Roles

 Date

 Document Version Control Information

 and every document should include a version history, a Table of Contents, a List of Figures, and a List of Tables

 In general, use an outline form, e.g., for Organization Activities, instead of wordy paragraphs. In an outline form, items are easier to read, and important points stand out.

 Use numbered lists as opposed to bulleted lists to be able to reference items by their number, e.g., 'Organization Goal #2', which helps traceability.

 Include captions for each figure, table, etc., to encourage referencing and enforce traceability.

C. Final Remark

We can only suggest outlines for the LCO/LCA/IOC deliverables; for example, there is no one-size-fits-all structure/ outline for a Requirements Description, or a Life Cycle Plan. If a particular item in one of the artifact guidelines is not applicable to your project, it should be noted as "Not applicable" or "N/A" with a short explanation of why it isn’t applicable. (Do not feel compelled to artificially create information simply to fill in a section that is not applicable to your project; recall the question of risk – to both the success of your project and to your grade.) Similarly, you may add sections to an artifact’s outline if there is a project-specific need to do so.

It is not, however, recommended that you change the order of the various sections or that you freely delete critical sections – again a judgment call. The overriding goal is clear, concise communication, and standardized guidelines

help accomplish this: if you do make substantial alterations, make sure they are clear, and well justified. Haphazard documentation is a major point of project failure. (And, if in doubt, consult an instructor or a TA.)

Operational Concept Description (OCD)

A. Description of the OCD

Sections A1-A3 of this Operational Concept Description (OCD) section of the LeanMBASE guidelines are intended to be read by CS577 students -- to help them understand what the OCD is about and how to write their projects' OCD's -- and are not intended to be included in submitted OCD's.

A.1. Purpose of the OCD

The purpose of a development project’s OCD is to describe the success critical stakeholders’ (also known as “key stakeholders”) shared vision of the project being undertaken. Key stakeholders typically include the system’s users, the client, the customer, if different from the client, and the developers. Depending on the nature of the project, others, including the following, may also be key stakeholders: investors, administrators, operators, maintainers, component suppliers (e.g., suppliers of DBMS’s, middleware, and other COTS products), and providers of other systems with which the current system communicates or inter-operates. In the case of a safety-critical system such as air traffic control, members of the general public are also key stakeholders.

The OCD describes how the new, or modified/enhanced, software system that is the subject of the project – henceforth referred to simply as “the system” -- will operate within its environment. Where necessary, it includes prototypes to help flesh out details of the operational concept in such a way that it will meet the win conditions of all key stakeholders. (Note that in CS577 projects screen prototypes are necessary.)

The OCD is written by developers, with the collaboration of representatives of each category of key stakeholder. Unlike most of the other MBASE documents, which include software-related technical detail that will not be understood by non-developer stakeholders, the OCD must be written in language understandable by all categories of stakeholder. If a particular project’s OCD has to contain terminology specific to any category of stakeholder, that terminology must be defined explicitly. (This includes technical/business terminology from the client’s/customer’s field/domain, which may not be familiar to software developers.)

Key executive stakeholders use the OCD to verify that the operational concept is of benefit to their organization. Key operational stakeholders use the OCD to understand, at a high level, how their activities will be affected by the new system. Development-side stakeholders use the OCD for the purpose of understanding, at a high level, what they are to develop and, in some cases, constraints under which they must perform their development activities.

A.2. OCD Life Cycle Process

You should start by writing a draft of Section 2 (Shared Vision) of the OCD and some rough scenarios of use cases. This will provide the context for the concurrent prototyping and Win-Win negotiation activities. The results of these latter two activities are used to develop the Life Cycle Objective (LCO) and Life Cycle Architecture (LCA) versions of the OCD, System and Software Requirements Definition (SSRD), System and Software Architecture Description (SSAD), Life Cycle Plan (LCP), and Feasibility Rationale Description (FRD). During this process, the prototypes should be extended and refined, but in general the Win-Win results are not. The LCA version of the OCD is submitted as a part of the LCA package to the CS577 Archive.

During CS577b, the OCD is updated as appropriate, but only change summaries are presented at the Rebaselined Life Cycle Architecture (RLCA) and Transition Release Readiness (TRR) reviews. The final version of the OCD is delivered as part of the client’s Initial Operational Capability (IOC) package and submitted to the CS577 Archive.

A.3. Completion Criteria for the OCD

This section (A.3) describes what your team should do before you start your WinWin (EasyWinWin) negotiations (in section A.3.1) and what the versions of the OCD for the three milestones (LCO OCD, LCA OCD, and IOC OCD) should contain (in sections A.3.2-A.3.4, respectively).

A.3.1 Inception Readiness

Before starting your Win-Win negotiations and your detailed prototypes, work with your client to develop a draft version of Section 2 (i.e., the section whose description you'll find in B.2 below: Shared Vision) of the LCO OCD. If possible get representatives of all categories of success-critical stakeholders – not just your client – to participate in the Win-Win negotiations and in demos of the prototype. (The participation of end users is often very important; the importance of participation by other types of critical stakeholder varies from project to project, and from very important to optional. Since you may not be able to identify all categories of critical stakeholder by yourself, involve the client, as well as other already-identified stakeholders, in this process.)

A.3.2 Life Cycle Objectives (LCO)

The following items must be described in the LCO OCD.

 Shared vision and context for success-critical stakeholders

 System transformation strategy : All nominal scenarios and critical off-nominal scenarios as coordinated with operational stakeholders

 Draft operational concept for IOC

 High level operational and organizational transformations

 Link to easy Win-Win results

A.3.3 Life Cycle Architecture (LCA)

At LCA, the following items need to be described.

 Operational concept for IOC

 System transformation strategy : All critical nominal and off-nominal scenarios as coordinated with operational stakeholders

 All operational and organizational transformations A.3.4 Initial Operational Capability (IOC)

At IOC:

 Change summary and changes from LCA version must be described.

 Additionally, the OCD must be consistent with the IOC versions of other artifacts (IOC SSRD, IOC SSAD, IOC LCP, and IOC FRD).

B. Sections of the OCD Document

The (sub-) sections listed below describe the base format for the OCD. For readability by faculty, TA's, graders, and CS577b students, every submitted OCD document (LCO OCD, LCA OCD, and IOC OCD) should contain each of the indicated sections, with the title and section numbering unchanged. (See the General Guidelines for exceptions.) The text under each (sub-) heading describes what should be in the corresponding (sub-) section. (The texts in these (sub-) sections of the present document are to be read by CS577 students -- to help them prepare the contents of the corresponding (sub-) sections of their OCD documents -- and are not intended to be included in submitted OCDs.)

B.1. OCD Overview

B.1.1 Introduction to the OCD

This section of the submitted OCD introduces the project/system by identifying:

 The system by name or function

 The team developing the new system

 The client, the organization s/he represents, and the customer, if different from the client

 The project milestone/anchor point (LCO, LCA, or IOC) for which this document is being prepared

 The number of this version of the OCD, and the most significant changes since the previous version. (Each CS577 artifact should be written / edited by at least two team members. All documents written/edited by more than one person should be under some type of version identification / numbering / control, either formal or informal.)

 Definitions of terms, if any, that are not in the common vocabulary of all key stakeholders

The following are sample OCD introductions:

 This document contains the LCO anchor point description of the operational concept behind the Conference Room Scheduling System that Team 5 is doing for Jane Smith (the customer), head of the User Services Department of the university libraries. We will be working directly with Jane Doe (client), a User Services librarian who reports directly to Ms. Smith.

 Team 6 will be involved in completely revamping the university’s Career Services Department’s web site to make it competitive with other major universities’ career services web sites. We will be working directly with John Doe, the director of the department. This section contains the LCA milestone operational concept for the desired new web site.

 John Smith is the sole proprietor of Smith's Fabulous Collectibles, a bricks and mortar retail business in Encino, CA. Team 10 will be investigating the development of a web site for Smith's Fabulous Collectibles using COTS products to the extent possible. The current document is version 1.1 of the project's LCO OCD. It updates version 1.0, in which key stakeholders were not adequately identified.

B.1.2 References

This section contains references, if any are needed, to books, documents, web sites, meetings, tools used or referenced, etc., that will help the reader understand the system’s requirements in greater detail. Note that the degree

to which any MBASE model -- document section -- is elaborated should be determined using risk considerations. To repeat an admonition from the General Guidelines:

 If it’s risky not to include something, then include it

 If it’s risky to include something, don’t include it.

It is risky not to write and include information that’s critical to success and that will likely be forgotten if it isn’t explicitly stated; it is risky to write and include something if doing so takes significant time away from doing something considerably more important to the success of the project. Thus, on the one hand, your grade may suffer if you omit references critical to the understanding of the project; it may, on the other hand, suffer if you include unnecessary references just for the sake of including references.

The following are examples of references:

 The university’s Career Services Department’s current web site may be found at www.careerservices.university.edu.

 The following are the URLs of some other universities’ career services web sites that are perceived to be better than our client’s web site:

www.university1.edu/career_services

www.university2.edu/career_services

www.university3.edu/career_services

If the nature of the project is such that there is nothing project-relevant to put in this section, simply write "no relevant references."

B.2. Shared Vision

B.2.1 System Capability Description

Provide an “elevator test” summary of the project’s expected benefits, i.e., a short executive summary of the sort that could convince a potential funder (venture capitalist) “during the course of a short elevator ride.” (See Geoffrey Moore’s Crossing the Chasm (Harper Collins, 1991, p.161)). Such an “elevator test” summary includes:

 The type of system to be built

 The target customer(s) for the system

 The need or opportunity that will be satisfied by the system

 A compelling reason for the customer to buy/use the system

 The closest competitor of the system

 The system's primary differentiation from or benefit over the closest competitor or alternative approach, if there are competitors or alternative approaches

The following is an “elevator test” summary for a corporate order-entry system:

 Sierra Mountainbikes, Inc’s Sales Department needs a faster, more integrated order entry system to increase sales. The proposed Web Order System will give us an e-commerce order entry system similar to


Amazon.com's that will fit the special needs of ordering mountain bicycles and their aftermarket components. Unlike the template-based system that our main competitor bought, ours would be faster, more user friendly, and better integrated with our order fulfillment system.

B.2.2 Expected Benefits

This section provides a more detailed list of benefits that the stakeholders can expect the new system to provide. The following is an example for the Sierra Mountainbikes’ order entry system:

The proposed new order entry system will:

 Enable employees of Sierra Mountainbikes, their customers, and their distributors to track the current location and expected delivery dates of shipments

 Reduce the number of clerks required to perform order entry

 Improve Sierra Mountainbikes’ ability to manage customer relations

 Reduce the error rate in orders

 Enable Sierra Mountainbikes to maintain better data on order rates of various products, and demographics of customers for their various products

B.2.3 Benefits Chain (Initiatives, Expected Outcomes, and Assumptions)

The key stakeholders of a software development project engage in the project in order to achieve certain desired (beneficial) outcomes, with the expectation that successful completion of the development team’s “initiative” of designing and building the software will result in the achievement of these benefits. It often isn’t quite as obvious that:

 Achievement of the desired outcomes might depend upon the validity of certain unstated assumptions; frequently such assumptions are unknown to one category of stakeholders, and so trivially obvious to another category of stakeholders that members of the second category don’t even bother mentioning them – often to the detriment, or outright failure, of the project. (So, be sure to discuss this issue with representatives of all identified categories of key stakeholders.)

 Additional initiatives over and above the design and construction of the software – and usually on the part of stakeholders other than the developers -- might be required for the beneficial outcomes to be achieved. (So, again, be sure to discuss this issue with representatives of all identified categories of key stakeholders.)

For example, a number of years ago a CS577 team worked on a system that would enable scholars all over the world to perform web-based searches for, and retrieval of, Chinese, Japanese, Korean, and Indian films from a USC archive of such films. The initiative of building a web-based (search and retrieval) system was certainly necessary to the achievement of this outcome, but, as it turned out, not sufficient; an additional initiative of acquiring a donor to supply the significant funds required to digitize the large number of films in the archive was also necessary for the project to succeed.

The purpose of this section is to specify any additional initiatives required to achieve the desired outcome(s), and the assumptions under which the successful pursuit of all the recognized initiatives will actually produce these beneficial outcomes. In this section of the OCD you are to specify all of this in the form of a diagram known as a “Benefits Chain.” (The Benefits Chain is an extended version of the Results Chain in John Thorp’s The Information Paradox (McGraw Hill, 1998, p.98)). The Benefits Chain is a notation for linking Initiatives (that consume resources) to Contributions (desired capabilities) and Outcomes (desired states), that may lead either to further contributions or to added value (benefits). An important aspect of the Benefits Chain notation is Assumptions, which

condition the realization of the Outcomes. Stakeholders shown in the benefits chain should include all those development-stage and operational-stage stakeholders who are critical to the success of the initiatives and of the resulting outcome(s). Figure 1 shows the Benefits Chain notation; Figure 2 shows an example of the Benefits Chain for a Sierra Mountainbikes order processing system.

Figure 1: Benefits Chain Notation

[Diagram: Initiatives, Contributions, and Outcomes linked in a chain, with Assumptions conditioning the Outcomes and the responsible Stakeholders attached to each Initiative and Outcome.]

Figure 2: Example Sierra Mountainbikes Order Processing Benefits Chain

[Diagram: initiatives (new order-entry system; new order-entry processes, outreach, and training; new order fulfillment system, processes, outreach, and training) carried out by developers, sales personnel, distributors, retailers, suppliers, and customers, leading through faster order entry, fewer errors, and improved supplier coordination to increased customer satisfaction, increased sales, profitability, and growth, and decreased operations costs. Assumptions: increasing market size; continuing consumer satisfaction with product; relatively stable e-commerce infrastructure; continued high staff performance.]
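As a textual companion to Figures 1 and 2, the sketch below shows one possible way to record a Benefits Chain as plain data so that assumptions, initiatives, and stakeholders can be reviewed together; the structure and field names are assumptions, and the entries only loosely paraphrase Figure 2.

    # Hypothetical representation of the Sierra Mountainbikes Benefits Chain.
    benefits_chain = {
        "assumptions": [
            "Increasing market size",
            "Continuing consumer satisfaction with product",
            "Relatively stable e-commerce infrastructure",
            "Continued high staff performance",
        ],
        "initiatives": [
            {"name": "New order-entry system", "stakeholders": ["Developers"]},
            {"name": "New order-entry processes, outreach, training",
             "stakeholders": ["Sales personnel", "Distributors"]},
            {"name": "New order fulfillment system, processes, outreach, training",
             "stakeholders": ["Distributors", "Retailers", "Customers", "Suppliers"]},
        ],
        "outcomes": [
            "Less time, fewer errors per order-entry step",
            "Increased customer satisfaction, decreased operations costs",
            "Increased sales, profitability, customer satisfaction",
        ],
    }

    # A quick review aid: every initiative should name at least one success-critical stakeholder.
    for initiative in benefits_chain["initiatives"]:
        assert initiative["stakeholders"], f"No stakeholders listed for: {initiative['name']}"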

Note that:

 Required initiatives and assumptions are often impossible to uncover without significant collaboration between developers and other key stakeholders, often including key operational stakeholders.

 The purpose of this section is not to draw a diagram, but to uncover non-obvious assumptions, additional required initiatives, and additional success-critical stakeholders. The Benefits Chain notation provides a convenient way to represent such uncovered information.

B.2.4 Success Critical Stakeholders

List each category of success critical stakeholder for your project by home organization, authorized representative for project activities, and place in the Benefits Chain. The four classic categories are the software system's users, customers/clients, developers, and maintainers. Be sure to identify additional stakeholders critical to your system's success. These could include data and infrastructure providers, and proprietors of critical interfacing systems. Be sure to investigate the issue of where the system will be hosted, and by whom, when fleshing out this list. (See Section A.1 of these OCD Guidelines for a short discussion of success critical stakeholders.)

B.2.5 System Boundary and Environment

The system boundary contains a list of services / functions that the project team will be responsible for developing and delivering, and the system environment shows the stakeholder organizations and other systems for which the project has no authority or responsibility, but with which the delivered system must interface in order to deliver the desired benefits. Figure 3 shows the notation to be used in describing the system boundary and environment; Figure 4 shows an example of the system boundary and environment for a Sierra Mountainbikes order processing system.

Figure 3: Notation to be used for System Boundary and Context

[Diagram: the system boundary, surrounded by critical users, service interfacing systems, data sources, administration users, and infrastructure.]

Figure 4: Example System Boundary and Context

[Diagram: the system boundary contains order processing, payment processing, sales credits, customer concerns processing, order fulfillment processing, and in-transit visibility. The environment includes sales personnel, customers, distributors and retailers, administrators and service personnel, the SMB Manufacturing System, the SMB Financial System, and the network/ERP infrastructure. SMB: Sierra Mountainbikes; ERP: Enterprise Resource Planning.]
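A brief, hypothetical sketch of the same boundary/environment split captured as plain data, which some teams may find handy for cross-checking scope against the capability goals; the grouping and names below paraphrase Figure 4 and are not a required LeanMBASE artifact.

    # In-scope services (inside the system boundary).
    system_boundary = [
        "Order processing",
        "Payment processing",
        "Sales credits",
        "Customer concerns processing",
        "Order fulfillment processing",
        "In-transit visibility",
    ]

    # Environment: actors and systems the project does not control but must interface with.
    system_environment = {
        "Critical users": ["Sales personnel", "Customers", "Distributors", "Retailers",
                           "Administrators", "Service personnel"],
        "Interfacing systems": ["SMB Manufacturing System", "SMB Financial System"],
        "Infrastructure": ["Network", "ERP system"],
    }

    print(f"{len(system_boundary)} services in scope; "
          f"{sum(len(v) for v in system_environment.values())} external actors/systems.")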

B.3. System Transformation

B.3.1 Capability Goals

Provide a brief enumeration of the most important operational capability goals. A "capability" is simply a function that the system performs or enables users to perform. To facilitate traceability between capability goals listed in the OCD and references to them from other artifacts (SSRD, SSAD, LCP, and FRD), assign each capability a unique designation (e.g., OC-1) and a short descriptive name.

The following are examples of capability goals for the Sierra Mountainbikes order-entry system:

OC-1 Central Order Processing: Orders may be (i) entered and processed directly via the Sierra Mountainbikes (SMB) central website and Enterprise Resource Planning (ERP) system, or, in the case of telephone or fax orders (ii) entered by SMB service personnel. Orders are validated interactively, using validation criteria editable by administrators.

OC-6 In-Transit Visibility: The system shall enable SMB employees, customers, and distributors to determine where an in-transit shipment is and when it is likely to arrive. (If a shipment is delayed, affected parties need to know where it is and when it is likely to arrive, to enable them to make alternative plans, if necessary.)

B.3.1.1 Relation to Current System

The purpose of this section is to describe those limitations of the current organizational environment and/or of the current system (if one exists) and/or of the organization's way of doing business that motivate the development of the new system. Focus should be on ways in which the organization's environment and the current system do not provide the needed capabilities (OCD 3.1). (As indicated in the Benefits Chain example, new Initiatives to achieve needed capabilities and beneficial outcomes will require some transformation of the current system and its relation to stakeholders.)

Summarize the relations between the current and new systems in a table of the form that is shown, in table 1, for the Sierra Mountainbikes order-entry system. Include key differences between current and new roles, responsibilities, user interactions, infrastructure, stakeholder essentials, etc.

Table 1: Differences between Current and Proposed System

Roles and Responsibilities
  Current System: Diffuse responsibilities; many clerks performing tedious, repetitive tasks
  New System: Central order-processing; special-case routing to cognizant responsible personnel; clerks trained for online operation or other jobs

User Interactions
  Current System: Clerical order-processing; redundant data entry; batch file updates; slow error notification
  New System: Point-of-entry validation and error notification; minimum redundancy; prepackaged frequent transactions

Infrastructure
  Current System: Incompatible files; batch data processing
  New System: ERP system; interactive data processing

Stakeholder Essentials and Amenities
  New System: Assured sales for salespersons; rapid acknowledgement of concerns; in-transit visibility

Future Capabilities
  New System: Just-in-time manufacturing supplier support; trend analysis; customer relations management

B.3.2 Organizational Goals

Identify the broad, high–level objectives and aspirations of the sponsoring organization(s) and of the organizations that will be using and maintaining the new system. The goals should be expressed in terms of (or reference) the Expected Benefits (OCD 2.2), and should only include the goals that indicate what the organization wishes to achieve by having the proposed system (e.g., increase sales, profits, and customer satisfaction). Each goal in this section should relate to one or more of the Expected Benefits (OCD 2.2).

Provide a brief enumerated list of goals. To facilitate traceability, assign each goal a unique number (e.g., OG-1). For example:

OG-1: Increase sales and profits via more efficient order processing.

OG-2: Improve speed via faster order entry.

OG-3: Improve quality via better in-process order visibility, reduced order defects.

OG-4: Improve customer satisfaction via better and faster operations.

B.3.3 Proposed New Operational Concept

Summarize the key aspects of the proposed new operational concept, including appropriate entity-relation diagrams, business workflows, desired and acceptable levels of service, and key prototype results, as indicated in the following sub-sections.

B.3.3.1 Entity Relationship Diagrams

Summarize the major relationships among the primary entities involved in the proposed new system. An order-processing example is provided in Figure 5. (Note that the example is more in the style of a data flow diagram than in the style of Chen's ER diagram or of an EER diagram; any of these notations is fine, as the content is far more important than the style.) If necessary, also draw a diagram for:

 The relevant aspects of the organization as they function before the proposed system is put into use.


 The relevant aspects of the organization as they will function when all evolutionary goals for the system have been met.

Figure 5: Example Entity Relationship Diagram

[Figure: entity-relationship-style diagram for Sierra Mountainbikes, showing Customers, Sales, Distribution and Inventory Management, Manufacturing, Finance, Marketing, Customer Services, Component Suppliers, and Component Distributors, connected by flows of orders, payments, invoices, bicycles, parts, components, schedules, product planning trends, and customer desires/complaints.]

B.3.3.2 Business Workflows

Characterize the new operational concept in terms of the flow of work through the proposed new system. As appropriate, indicate future capabilities of the new system or major differences from the current system as well. An order-processing example is provided in figure 6.

Figure 6: Example Business Workflow

[Figure: workflow in which an incoming order is validated against administrator-editable validation criteria; invalid orders produce an error message, while valid orders are processed concurrently by financial processing (with an invalid-payment message path), delivery processing (with out-of-stock actions), and administrator processing.]

B.3.3.3 Level of Service (L.O.S.) Goals

Identify the desired and acceptable goals for the proposed new system's important levels of service, that is, the system's quality attributes or "-ilities." Indicate desired and acceptable levels of service of the new system, or major differences from the current system if an increase in level of service is an important motivation for the new system. Indicate how performance with respect to these goals will be measured (an illustrative measurement sketch follows Table 2). If/where appropriate, identify different classes of service (e.g., for simple vs. complex queries). If relevant, state the priority for each level of service (L.O.S.) attribute. To facilitate traceability, assign each level of service attribute a unique identifier (e.g., LOS-1). Table 2 shows an example for the proposed order-entry system.

Table 2: Table of Level of Service Goals

Attribute (units): Desired / Acceptable; Notes

Response time, entries (sec): 0.1 / 0.5

Response time, simple transactions (sec): 0.3 / 1.0

Response time, complex transactions (sec): 3.0 / 10.0

Throughput (average no. of transactions/sec): 40 / 20; Current system: 3; After 2 years: (60, 40)

Late order delivery (%): 5 / 9; Current system: 12.4; After 2 years: (2, 5)
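The following is a minimal, illustrative sketch (not part of the LeanMBASE template) of how a response-time goal such as the "simple transactions" row of Table 2 might be checked automatically. The class name, the stand-in transaction, and the simulated delay are assumptions for illustration only.

    // Illustrative sketch only: checks one response-time L.O.S. goal against the
    // desired and acceptable levels from Table 2. The "transaction" below is a
    // stand-in; a real check would invoke the actual system under test.
    public class ResponseTimeCheck {
        static final long DESIRED_MS = 300;     // desired level: 0.3 sec
        static final long ACCEPTABLE_MS = 1000; // acceptable level: 1.0 sec

        static void submitSimpleTransaction() throws InterruptedException {
            Thread.sleep(120); // simulated processing time (assumption)
        }

        public static void main(String[] args) throws InterruptedException {
            long start = System.currentTimeMillis();
            submitSimpleTransaction();
            long elapsed = System.currentTimeMillis() - start;

            if (elapsed <= DESIRED_MS) {
                System.out.println("Meets desired level (" + elapsed + " ms)");
            } else if (elapsed <= ACCEPTABLE_MS) {
                System.out.println("Meets acceptable level only (" + elapsed + " ms)");
            } else {
                System.out.println("Fails the L.O.S. goal (" + elapsed + " ms)");
            }
        }
    }

In practice such a check would be run repeatedly under a representative workload, since a single measurement says little about whether the goal is met.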

Note that in the case of many CS577 projects, especially web services projects:

 Required reliability is simply (can’t be greater than) the reliability of the server’s and/or the client’s connection to the web, and the reliability of the web.

 There is little server-side computation, so the best achievable response time is expected time through the web.

So, in the case of many CS577 projects, not very much need be said about level of service goals. (That is, don't struggle to find something to include here if there is nothing that is relevant.)

B.3.3.4 Prototype

Developer-side and non-developer-side stakeholders have different professional backgrounds. Prototyping can serve to avoid misguided development activities by providing a common platform for comprehension of the way the new system will operate. This section should include:

 Prototype screen shots of the most important screens/web pages.

 Explanations of how each screen/web page operates, possibly using a flowchart showing the conditions under which one screen leads to another.


In case final versions of screens have not yet been agreed upon, multiple alternatives that are under consideration may be shown here.

Note that in the case of some types of project, for example, pure COTS assessment projects, prototyping may be unnecessary, in which case this section should simply indicate that prototyping is unnecessary, and the reason it is unnecessary.

B.3.4 Organizational and Operational Implications

One of the main sources of failed software systems has been that the new system changes success-critical operational stakeholders' roles, responsibilities, authority, and operational procedures in ways that are (personally) unacceptable to them. This section identifies the stakeholders affected by such changes and indicates their concurrence with the changes. (Note that the changes may not be what the developers initially thought they would be, but, rather, that they may require significant (WinWin) negotiation efforts to resolve – and can easily cause project failure, especially failure to adopt the software, as spiffy as it may turn out to be – if not resolved to the satisfaction of all key stakeholders.)

B.3.4.1 Organizational Transformations

Identify any significant changes in organizational structure, authority, roles, and responsibilities that will result from transitioning to the new system. Identify the major operational stakeholders affected by the changes, and indicate their concurrence with the changes.

An example from Figure 6 would be the need for the delivery organization to check with the financial organization for payment validity before shipping an order. Another would be the elimination of the need for current time-consuming management approvals before initiating delivery actions.

B.3.4.2 Operational Transformations

Identify any significant changes in operational procedures and workflows that will result from transitioning to the new system. Identify major operational stakeholders affected by the changes, and indicate their concurrence with the changes.

An example from Figure 6 would be having the financial, delivery, and administrative processing progress concurrently rather than sequentially to decrease response time, subject to the check for payment validity before shipping an order. Another example would be the elimination of the management approval steps before initiating delivery actions.

B.4. Easy Win-Win Results

Provide a link to the EasyWinWin results. Summarize any major subsequent changes to the EasyWinWin results, with a simple explanation of the rationale for each change.

System and Software Requirements Definition (SSRD)

A. Description of the SSRD

Sections A.1 and A.2 of this System and Software Requirements Definition (SSRD) section of the LeanMBASE guidelines are intended to be read by CS577 students -- to help them understand what the SSRD is about and how to write their projects' SSRD -- and are not intended to be included in the submitted SSRD.

A.1. Purpose of the SSRD

The purpose of a development project’s SSRD is to enumerate and explain the various requirements that follow from (elaborate upon) the results of the WinWin negotiations among key stakeholders. Requirements are features of, attributes of, or constraints on, the new system or of/on the way it is to be designed and implemented – i.e., of/on the product or the project. To be a “requirement,” a feature, attribute, or constraint must actually be “required” in the sense of having been explicitly agreed to as necessary (required) in the EasyWinWin (or follow-up) negotiations or must be a logical derivative (refinement) of a feature, attribute, or constraint agreed to as necessary in the EasyWinWin (or follow-up) negotiations. (Note that the fact that a feature, attribute, or constraint has been agreed to means that critical stakeholders are committed to its realization.)

In other words, the SSRD specifies what the delivered system will do and how it will do it (behavior). Decisions about the system are included in the SSRD only if all the stakeholders explicitly agree with them. For example, if the development team wants to use Oracle for a project's database, possibly because those members of the team who have database experience are most familiar with Oracle – or familiar with only Oracle -- then:

- If the desire to use Oracle has been raised, and agreed to, in WinWin negotiations – as a developers' win condition -- then "Oracle shall be used for the system's database" is a valid requirement.

- If the issue of using Oracle hasn't been both raised, and agreed to, in WinWin negotiations, then "Oracle shall be used for the system's database" is not a valid requirement and may not be listed as a requirement in the SSRD. (It may, however, be listed, in the SSAD, as a design decision.)

The same is true of any programming language, server software, etc., for-profit or open-source, that the developers prefer to use. (Note that, in the case of a COTS product, like Oracle, that has a non-trivial licensing fee, if the developers are strongly in favor of using that product, then the issue must be raised in WinWin negotiations because:

o The customer/client may not be willing, or able, to pay the fee

o The client's hosting/maintenance staff may not be willing or able to support the specific COTS product, or may already be committed to using a different COTS product.)

Regarding derived requirements, if it has been agreed that the system capability of "managing orders" is a requirement, then the logically derivative capabilities of creating orders, modifying orders, and deleting orders are also requirements; a minimal illustrative sketch of such a decomposition is shown below.
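As a minimal sketch of this decomposition (illustrative only; Order, OrderId, and OrderManager are hypothetical names, not taken from any actual project), the agreed capability maps naturally onto three derived operations:

    // Illustrative sketch only: the agreed "managing orders" capability decomposed
    // into its logically derived requirements. All names are hypothetical.
    class Order { /* order details omitted */ }
    class OrderId { /* identifier details omitted */ }

    interface OrderManager {
        OrderId createOrder(Order order);              // derived requirement: creating orders
        void modifyOrder(OrderId id, Order newValues); // derived requirement: modifying orders
        void deleteOrder(OrderId id);                  // derived requirement: deleting orders
    }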

A.2. Completion Criteria for the SSRD

Table 3 provides an overview of which attributes are mandatory (M), optional (O), or recommended (in the comments column) for each type of requirement, and some LCO / LCA distinctions.


Table 3: Requirements and Attributes Documentation Overview
(M: Mandatory; O: Optional. Requirement types: Project, Capability, Interface, and Level of Service requirements.)

Number, Name: M for all four requirement types.

Description: M for all four requirement types. Unambiguous, testable.

Priority: M for all four requirement types. MSCW (Must / Should / Could / Want to have) – with respect to Initial Operational Capability (IOC).

Input(s): M for Capability and Interface requirements. Type and units recommended; other attributes optional.

Source(s): M for Capability and Interface requirements.

Output(s): M for Capability and Interface requirements. Type and units recommended; other attributes optional.

Destination(s): M for Capability and Interface requirements.

Precondition(s): M for Capability requirements. Nominal by LCO; critical off-nominal by LCA; others optional.

Post condition(s): M for Capability requirements. Nominal by LCO; critical off-nominal by LCA; others optional.

Measurable: O. Templates only for Level of Service requirements; optional application to other requirements types.

Achievable: O. Templates only for Level of Service requirements; optional application to other requirements types.

Relevant: O. Templates only for Level of Service requirements; optional application to other requirements types.

Specific: O. Templates only for Level of Service requirements; optional application to other requirements types.

API's: O for Interface requirements.

Block Diagram: O for Interface requirements.

States and Modes: O. Put in Capability for convenience.

Scenarios: O for Capability requirements. Nominal by LCO; critical off-nominal by LCA; others optional.

Traceability: M for all four requirement types. Documented in FRD.

Evolutionary requirements may be specified with less detail than their IOC counterparts.

All content is risk-driven: if it’s low risk to leave it out, leave it out.

Requirements should have testable definitions. Avoid using vague, un-testable qualifiers such as "adequate," "appropriate," "sufficient," "satisfactory," etc.

A.2.1 Life Cycle Objectives (LCO)

 Most significant Project requirements, Interface requirements, Level of Service requirements

 Capability requirements: All nominal scenarios

 High-level evolutionary requirements; versions, states and modes

 Requirements satisfiable by at least one system/software architecture

A.2.2 Life Cycle Architecture (LCA)

 Capability requirements: All nominal scenarios and critical off-nominal scenarios

 Elaboration of Project requirements, Interface requirements, Level of Service requirements, Evolution requirements; versions, states and modes

 Requirements satisfiable by the architecture in the SSAD, with evidence of satisfaction in the FRD

A.2.3 Initial Operational Capability (IOC)

 Change summary and changes from LCA versions

 At IOC, the SSRD should be consistent with the IOC versions of other artifacts

B. Sections of the SSRD Document

The (sub-) sections listed below describe the base format for the SSRD. For readability by faculty, TA's, graders, and CS577b students, every submitted SSRD document (LCO SSRD, LCA SSRD, and IOC SSRD) should contain each of the indicated sections, with the title and section numbering unchanged. (See the General Guidelines for exceptions.) The text under each (sub-) heading describes what should be in the corresponding (sub-) section. (The texts in these (sub-) sections of the present document are to be read by CS577 students -- to help them prepare the contents of the corresponding (sub-) sections of their SSRD documents -- and are not intended to be included in submitted SSRD.)

B.1. SSRD Overview

B.1.1 Status of the SSRD

Summarize any significant differences between the content of the SSRD and the Win-Win negotiated agreements. Identify any major requirements issues that have not yet been finalized.

B.1.2 References

This section contains references, if any are needed, to books, documents, web sites, meetings, tools used or referenced, etc., that will help the reader understand the system’s requirements in greater detail. Note that the degree to which any MBASE model -- document section -- is elaborated should be determined using risk considerations. To repeat an admonition from the General Guidelines:

 If it’s risky not to include something, then include it

 If it’s risky to include something, don’t include it.

It is risky not to write and include information that's critical to success and that will likely be forgotten if it isn't explicitly stated; it is risky to write and include something if doing so takes significant time away from doing something considerably more important to the success of the project. Thus, on the one hand, your grade may suffer if you omit references critical to the understanding of the project; it may, on the other hand, suffer if you include unnecessary references just for the sake of including references.

B.2. Project Requirements

This section describes the project requirements for the proposed system. A project requirement is a (WinWin- negotiated) mandated constraint on how the project is to be executed. More specifically, project requirements are constraints on such issues as: the time by which (schedule) the system must be completed and delivered; the amount of money (budget) available for such items as salaries and software license purchases; etc. (The various sub-sections of this section enumerate and explain the various sub-categories into which project requirements fall.)

The following are examples of short versions of project requirements. (Note that in a CS577 SSRD short versions are not sufficient; rather, each project requirement must be documented in a table (Table 4) of the form described later in this section):

- PR-1: Delivery in 24 weeks: The system shall be delivered and running on the customer organization's servers within 24 weeks of the start of the project. (This is a "schedule" requirement)

- PR-2: Limited COTS budget: The total of all (annual) costs of licenses for COTS products incorporated into the system shall not exceed $50,000. (This is a "budget" requirement)

- PR-3: Design using UML: The system's design documents shall use UML as the main modeling notation. (This is a "development" requirement; it was, very likely, imposed because the customer organization's maintenance staff is trained in UML.)

- PR-4: Browser compatibility: The system shall run on all versions of Internet Explorer, v5 and higher, and on all versions of Netscape, v4.3 and higher. (This is a "deployment" requirement)

- PR-5: Operator training: The developers shall provide ten hours of hands-on operator training in the use of the system to 20 employees of the client organization, such training to start the day after initial deployment testing has been completed. (This is a "transition" requirement)

Occasionally, the optional Measurable, Achievable, Relevant, and Specific attributes from the Level of Service requirements (section B.5) should be added, e.g., to state how hardware speed or communications bandwidth is measured.


Table 4: Project Requirement

Project Requirement: << (i) A unique identifier of the project requirement of the form “PR-n,” where n is an integer; (ii) A short title of the Project requirement e.g., “24 Week Schedule” >>

Description: <>

Priority: <>

Win-Win Agreement(s): <>

B.2.1 Budget and Schedule

This section should contain a table, in the (project requirement) format shown above, for:

- Each schedule requirement. In CS577 it is typically sufficient to have only one schedule requirement, identifying the 24 weeks available for developing and delivering the system. For projects completing in 577a, the schedule requirement should be 12 weeks.

- The budget available for such items as purchases of: COTS products and COTS licenses, if any; hardware, if any, that will be required for the system to be put into operation; etc. Note that:

o If appropriate – especially when the budget is zero – only a single budget table is required.

o If there are separate WinWin agreements for specific budget items, it is best to include a table for each.

o In non-educational contexts, salaries are typically important budget items.

B.2.2 Development Requirements

This section should contain a table, in the (project requirement) format shown above (Table 4), for each development requirement; each table should be placed in the sub-section (below) for the appropriate (sub-) type of development requirement.

B.2.2.1 Tools Requirements

A tools requirement is a requirement to the effect that a specific tool(s) must be used for the design, implementation, or documentation of the system. For example:

- PR-20: Rational Rose for drawing UML diagrams: All UML diagrams produced during the course of this project shall be drawn using Rational Rose. (This is a "design tools" requirement.) Note that in CS577 (see the SSAD guidelines) there is such a requirement; it arises not from explicit WinWin negotiations, but, rather, from the implicit win condition that CS577 grades depend upon learning all the course content, and Rational Rose is part of the course content.

- PR-30: Development using Eclipse v2.1: All software development shall be done using Eclipse version 2.1 as the IDE. (This is an "implementation tools" requirement) Recall that "Development using Eclipse v2.1" must be listed as a tools requirement only if this was agreed to in WinWin negotiations – but not if the development team made the decision, on its own, and in the absence of a requirement to use some other IDE.

B.2.2.2 Language Requirements

A language requirement is a requirement to the effect that a specific language/notation must be used for the design, implementation, or documentation of the system. For example:

- PR-40: Analysis and Design in UML: All diagrams used in documenting the analysis and design of the system shall be UML diagrams. (This is a "documentation notation" requirement) Note that in CS577 (see the SSAD guidelines) there is such a requirement – in fact there are even more specific requirements of particular UML diagrams to be used for specific analysis and design issues; it arises not from explicit WinWin negotiations, but, rather, from the implicit win condition that CS577 grades depend upon compliance with MBASE guidelines.

- PR-50: Client-side software in Java: All client-side software shall be written in Java. (This is an "implementation language" requirement) Recall that "Client-side software in Java" must be listed as a language requirement only if this was agreed to in WinWin negotiations – but not if the development team made the decision, on its own, and in the absence of a requirement to use some other programming language(s).

B.2.2.3 Computer Hardware Requirements

List any requirements regarding computer hardware that must be used by the system. The requirements shall include, as applicable, the number of each type of equipment, and the type, size, capacity, and other required characteristics of processors, memory, input/output devices, auxiliary storage, communications/network equipment, and other required equipment.

B.2.2.4 Computer Software Requirements

List any requirements regarding computer software that must be used by, or incorporated into, the system. Examples include operating systems, database management systems, communications/network software, utility software, and business software packages.

B.2.2.5 Computer Communication Requirements

List any requirements concerning the computer communications that must be used by the system. Examples include geographic locations to be linked; configuration and network topology; transmission techniques; data transfer rates; gateways; required system use times; type and volume of data to be transmitted/received; time boundaries for transmission/reception/response; peak volumes of data; and diagnostic features.

B.2.3 Deployment Requirements

Describe any requirements for packaging, labeling, and handling the system for delivery. These should reflect site-specific variations, and be consistent with the Transition Plan. Deployment requirements may include:

 Installation

o Assumptions

o Deployment hardware and software

o Installer experience/skills

 Post-installation requirements

o Re-packaging


o Uninstall

o Transport and delivery

B.2.4 Transition Requirements

Describe any requirements for system transition; these should be consistent with the personnel and training identified in LCP 3 and in the Transition Plan. Transition requirements may include staffing and training of operational personnel and software maintainers; acquisition and installation of equipment, COTS software, or developed software; adaptation of existing software; and data conversion.

B.2.5 Support Environment Requirements

 Describe any required Software Support Environments to be used for the support of the delivered system

 Describe the skill levels of required support personnel (e.g., for programming languages or software packages) and the frequency of support interactions required in terms of bug fixes, future software releases, and the reporting and tracking of problems.

B.3. Capability (Functional/Product) Requirements

This section describes the capability (functional) requirements of the proposed system. A capability requirement is a function that the proposed system must (be designed and implemented to) perform. All capability requirements must be specified in a sufficiently concrete way that they can be implemented and tested.

The following are examples of short versions of capability requirements. (Note that in a CS577 SSRD short versions are not sufficient; rather, each capability requirement must be documented in a table (Table 5) of the form described later in this section):

- CR-1: The system shall enable both customers and system administrators to create new orders

- CR-2: The system shall enable both customers and system administrators to modify orders that have not yet been shipped

- CR-3: The system shall enable both customers and system administrators to delete orders that have not yet been shipped

- CR-4: The system shall enable customers, customer-relations personnel, and administrators to track orders, i.e., to determine:

o If an order has been shipped

o When an as yet un-shipped order is expected to be shipped, and the reason for delay in shipping if the order will not be shipped within the advertised window

o Where a shipped, but not yet delivered, order is located, and the expected delivery date

- CR-5: The system shall take payment via Visa, MasterCard, American Express, and Diners Club

This section should include both:

- Nominal capability (functional) requirements, that is, system functionality/behavior required when everything goes right

- Off-nominal capability (functional) requirements, that is, system functionality/behavior required in the case of undesired events, errors, exceptions, and abnormal conditions

In general there are a number of different formats for specifying capability requirements. In CS577 the following tabular format must be used (Table 5), one table per capability requirement:


Table 5: Capability Requirement

Capability Requirement: <>

Priority: <>

Description: <>

Input(s): <>

Source(s): <>

Output(s): <>

Destination(s): <>

Precondition(s): <>

Post condition(s): <>

Win-Win Agreement(s): <>

B.4. System Interface Requirements

 In the following sections, describe any applicable requirements on how the software should interface with other software systems or users for input or output. Examples of such interfaces include library routines, token streams, shared memory, data streams, and so forth.

In general there are a number of different formats for specifying system interface requirements. In CS577 the following tabular format must be used (Table 6), one table per system interface requirement:

Table 6: System Interface Requirement

System Interface Requirement: <>

Description: <>

Priority: <<M (Must have), S (Should have), C (Could have), W (Want to have)>>

Win-Win Agreement(s): <>

B.4.1 User Interface Standards Requirements

 Describe any requirements on the various User Interfaces that the system presents to the users (who may belong to various user classes, such as end-user, programmer, etc.), which can be any of the following:

a) Graphical User Interface(s) Requirements or standard style guides

. Describe any Graphical User Interface (GUI) standards to which the proposed system should adhere.

b) Command-Line Interface(s) Requirements

. Describe any Command-Line Interface (CLI) requirements

c) Diagnostics Requirements

. Describe any requirements for obtaining debugging information or other diagnostic data

 To facilitate traceability, assign each system requirement a unique number e.g. IR-1.

 Usability requirements (learning time, skill levels supported) should be included under Level of Service (section B.5)

B.4.2 Hardware Interface Requirements

 Describe any requirements on the interfaces to hardware devices (if the system's performance depends on them)

 Such devices include scanners, bar code readers, and printers; or sensors and actuators in an embedded software system

B.4.3 Communications Interface Requirements

 Describe any requirements on the interfaces with any communications devices (e.g., network interfaces, audio, video, or image formats) if the system's performance depends upon them.

B.4.4 Other Software Interface Requirements

 Application Programming Interface(s) Requirements

 APIs used and provided to external systems

 Describe any requirements on the remaining software interfaces not included above

 Device drivers for special hardware devices

B.5. Level of Service (L.O.S.) Requirements

 Specify both the desired and acceptable levels of service of the System (i.e., “how well” the system should perform a given Capability Requirement; an acceptable level of service may be a “must have” system requirement while a desired level of service is a lower priority system requirement.)

 Trace the Level of Service Requirements back to the Levels of Service (OCD 3.3.3) in the FRD, as appropriate.

 To satisfy some Level of Service Requirements you may need to add to or modify the Needed Capabilities (and hence Capability Requirements)

 To facilitate traceability, assign each system requirement a unique number e.g. LOS-1.

Use the following taxonomy of Level of Service requirements as a checklist.

1. Dependability

1.1 Reliability/Accuracy

1.2 Correctness

1.3 Survivability/Availability

1.4 Integrity

1.5 Verifiability

2. Interoperability

3. Usability

4. Performance (Efficiency)

5. Adaptability

5.1 Verifiability

5.2 Flexibility

5.3 Expandability

5.4 Maintainability/Debuggability

6. Reusability

In general there are a number of different formats for specifying level of service requirements. In CS577 the six table entries below must be used (Table 7 and Table 8), one table per level of service requirement:

Table 7: Level of Service Requirement

Level of Service Requirement: <>

Description: <>

Priority: <<M (Must have), S (Should have), C (Could have), W (Want to have)>>

Desired Level: <>

Accepted Level: << Describe what the acceptable level of service is for this service requirement>>

Win-Win Agreement(s): <>

In some cases (e.g., defining response time end points and workload characteristics) the following four table entries should also be used (when it is risky not to):

Table 8: MARS

Measurable: <>

Achievable: <>

Relevant: <>

Specific: <>

B.6. Versions, States and Modes

 Some systems respond quite differently to the same stimulus depending on the operational mode. If that's the case, identify the various modes, and organize the System Requirements (Nominal and Off-Nominal) around the operational modes, to avoid forgetting some critical system requirement.

 For example, a voice-operated computer system may have two operational modes:

o Operational Mode: where the system is actually being used to perform productive work

o Training Mode: where the operators are training the Voice Recognition module in the system to properly interpret their voice commands, to be saved for later use.

o In operational mode, the response to the voice stimulus "Quit Application" would be to quit the application. In Training mode, the response might be to ask what action should be taken for the voice command "Quit Application". (A minimal sketch of such mode-dependent behavior is shown below.)
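The following is a minimal, illustrative sketch of such mode-dependent behavior (the class, the enum, and the returned strings are assumptions, not a prescribed design):

    // Illustrative sketch only: the same stimulus ("Quit Application") produces
    // different responses depending on the operational mode.
    public class VoiceCommandHandler {
        enum Mode { OPERATIONAL, TRAINING }

        private Mode mode = Mode.OPERATIONAL;

        void setMode(Mode mode) { this.mode = mode; }

        String handle(String command) {
            if ("Quit Application".equals(command)) {
                if (mode == Mode.OPERATIONAL) {
                    return "Quitting application"; // carry out the command
                } else {
                    return "What action should be taken for 'Quit Application'?"; // training prompt
                }
            }
            return "Unrecognized command: " + command;
        }

        public static void main(String[] args) {
            VoiceCommandHandler handler = new VoiceCommandHandler();
            System.out.println(handler.handle("Quit Application")); // operational-mode response
            handler.setMode(Mode.TRAINING);
            System.out.println(handler.handle("Quit Application")); // training-mode response
        }
    }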

B.7. Evolutionary Requirements

 Describe any requirements on the flexibility and expandability that must be provided to support anticipated areas of growth or changes in the proposed system or the domain itself

 Describe foreseeable directions of the system growth and change

 Include Capability, interface, technology and level of service evolution requirements

 Describe how the software and data assets will be maintained


 To facilitate traceability, assign each system requirement a unique number e.g. ER-1.

 Use the tables described above, depending upon the category of requirement (i.e., Capability Requirements, Interface Requirements, Level of Service Requirements, etc.)

B.8. Appendices

Create appendices as needed. Each appendix shall be referenced in the main body of the document where the data would normally have been provided. Appendices might, for example, characterize the workload with respect to which Level of Service requirements such as throughput and response time are measured.

System and Software Architecture Description (SSAD)

A. Description

A.1. Purpose

The purpose of the SSAD is to document the results of analyzing the organizational concept of operation for the system, designing an architecture, and designing an implementation. The SSAD serves as a bridge between the Operational Concept defined during the Inception phase and the Construction phase. The SSAD is started during the Inception phase, refined during the Elaboration phase, and completed during the Construction phase.

A.2. Completion Criteria

The following criteria must be satisfied at all milestones.

 The OCD must be defined using language appropriate to the intended audience (i.e., the operational stakeholders).

 All domain– or project–specific terms and acronyms, and any common terms with domain– or project– specific definitions must be described in “Glossary for System Analysis and Design” (SSAD B.5).

The following paragraphs describe the specific completion criteria for the SSAD at the three project milestones.

A.2.1 Life Cycle Objectives (LCO)

At LCO, the following items need to be described.

 Top-level definition of at least one feasible architecture:

Feasibility Criterion: a system built to the architecture would support the operational concept, satisfy the requirements, be faithful to the prototypes, and be built within the budgets and schedules in the Life Cycle Plan

Physical and logical elements, and relationships

 Must provide essential features of likely components, behaviors, objects, operations

Choices of COTS and reusable software elements

Detailed analysis, high-level Design

 Identification of infeasible architecture options

A.2.2 Life Cycle Architecture (LCA)

At LCA, the following items need to be described.

 Choice of architecture and iterations


Physical and logical components, connectors, configurations, constraints

 Must have precise descriptions of likely components, behaviors, objects, operations

COTS and technology reuse choices

Architectural style choices, deployment considerations

Critical algorithms and analysis issues must be resolved

 Architecture evolution parameters

 Complete design for each component of the system

 Tracing to and from OCD/SSRD

 Assurance of satisfaction of the Feasibility Criterion

A.2.3 Initial Operational Capability (IOC)

At IOC, the following items need to be described.

 The design of a technology–specific implementation for the system

Update LCA to reflect current implementation (“as built” specifications)

 Tracing to and from CTS

B. Document Sections

The following subsections describe the base format for the SSAD. The section headers should appear in your document. The text shown here describes the content of the section that should appear in your document.

Note: For additional details regarding 'Representation, Recommendations, 577 specific guidelines, Model Integration rules, Trade Offs, and Common pitfalls' refer to the LeanMBASE Additional SSAD Guideline document.

B.1. Introduction

B.1.1 Purpose of the SSAD Document

Summarize the purpose and contents of this document with respect to your project and the people involved. Describe how your SSAD meets the completion criteria for the current milestone.

B.1.2 Standards and Conventions

Describe any standards (e.g. DOD, IEEE), notations (e.g. UML), and conventions used to produce the SSAD. For example

 Standards used (DOD, IEEE)

 Notation used (UML)


Any exceptions to the standard notation employed

 Naming Conventions

Nouns for Components and Objects; verbs for Behaviors, Operations

Consistent use of naming style for elements

B.1.3 References

Provide usable citations for the most significant prior and current related work and artifacts, documents, meetings, and external tools referenced or used in the preparation of this document.

B.2. System Analysis

Refine the operational concept of the proposed system (OCD 3.3) into a model that focuses on the system and its requirements, as defined in the SSRD. Describe what developers need to know in order to build the system, and eliminate things that the developers need not worry about. The model should answer the following questions:

 Which of the organization’s workers (person roles & systems) and outside actors will this system interact with when it is operating (for software only systems, we use the term executing)?

 What specific capabilities does this system contribute to the organization’s processes?

 What services will this system provide, and what services will it require of other workers and of the outside actors?

 How does the system interact with the other workers and the outside actors to implement the processes in which the system participates?

 What artifacts and information does this system inspect, manipulate, or produce? If the system uses an artifact or information when communicating with other workers or outside actors, what is the required format?

Representation

Create a UML package with the stereotype <> the name “System Analysis”. Create all models described in the following sections in this package.

577 Guidelines:

Using Rose, create a UML package with the stereotype <> and the name "System Analysis" in the top–level package with the name Logical View.

B.2.1 Structure

Briefly describe the workers (e.g. people roles, systems) of the organization and the outside actors (e.g. customers, suppliers, partners, prospects) with which the system interacts. Identify which of the organization's workers and outside actors interact with the system. Describe the services provided by the system, and expected by the system of the workers and outside actors with which the system interacts.


Representation:

Describe the context of the system using either:

 A Block Diagram

 An Architecture Description Language (ADL).

 UML's Static–Structure and Collaboration Diagrams.

B.2.1.1 System

Provide a brief description of the system's role, purpose, and responsibilities. List the services of the system, which are used in the system's behavior (SSAD B.2.3).

B.2.1.2 Actor X

Create a section at this level for each actor classifier in the figure(s) shown in SSAD 2.1. The header of this section should be the name of the worker or outside actor class and its unique designator, if you have assigned an identifier and it is different from the name.

Provide a brief description of the actor's role, purpose, and responsibilities. List any attributes of the actor that the system processes (SSAD 2.3) are known to use. List any services provided by the actor that are used in a system process (SSAD 2.3). List any processes in which the actor participates. If the actor is a system and the current instance used is a COTS product, identify the product's name, version, and creator or distributor.

B.2.2 Artifacts & Information

Briefly describe the artifacts (e.g. documents, products, resources) inspected, manipulated, or produced by the system, the kinds of information that the system must inspect, manipulate, produce, or maintain (e.g. customer record, part description, event description), and the relations among the artifacts and the information.

Representation

Create either

 A Block Diagram

 A Business–Artifact Model

B.2.2.1 Artifact or Information Class X

Create a section at this level for each artifact in the figure(s) shown in SSAD B.2.2. The header of this section should be the name of the artifact or information class and its unique designator, if you have assigned an identifier and it is different from the name.

Provide a brief description of its role, purpose, and responsibilities. List any system capabilities that use the artifact.

B.2.3 Behavior

Describe the processes that the system uses to implement its capabilities (OCD 3.1), including how it works with the actors, and how it uses the artifacts and information; and if significant, how the system’s behavior depends on its state (also called mode).


For each process, identify which actors participate in the process, and which artifacts and information are inspected, manipulated, or produced by the process; and describe at a high level the actions performed by the system and each actor during the process.

Representation:

 For each process, create a subsection numbered 2.3.x with the name of the process in the header.

 For each high priority process create a Use Case Model, a Use-Case Description (Table 1 – LeanMBASE Additional SSAD Guideline), and an Activity Diagram and Description (Table 2, 3, 4 – LeanMBASE Additional SSAD Guideline)

 For other processes it is sufficient to have just a Use-Case Description (Table 1 – LeanMBASE Additional SSAD Guideline)

 Modes: create a State Model

B.3. Platform Independent Model

Design a high–level, general architecture for the system that is independent of the implementation technology. Describe the work units (components) of the system, what the components are expected to do, how the components are connected, and what the components communicate.

The form of the component depends on the abstraction and architectural style employed. The following are the three categories of systems.

 System of systems

 Composite system

 Simple system

Representation

Create a UML package with the stereotype <> the name “Architecture Design”. Create all models described in the following sections in this package.

577 Guidelines:

Using Rose, create a UML package with the stereotype <> and the name "Architecture Design" in the top–level package with the name Logical View.

B.3.1 Structure

In the following subsections, describe the hardware and software components of the system, the actors for the system, any layering or partitioning of the components, and the connections among the components and actors. (In section B.3.4, describe the architecture style(s), pattern(s), and framework(s) used, and any constraints that they impose.)

UML Guideline:

Create a UML package with the stereotype <> the name “Architecture”. Create all models described in the following sections in this package.


577 Guidelines:

Using Rose, create the UML package with the stereotype <> and the name "Architecture" in the <> package with the name "Architecture Design". Create subpackages as described above for simple systems. (Note: do not use Rose's Component View or Deployment View.)

B.3.1.1 Hardware Classifier Model

Describe the kinds of hardware components that are either part of the system or on which this system will run, the actors for the system with which the components interact, and the kinds of connectors that will be used to connect them.

Representation:

Describe the hardware classes of the system using either:

 An Architecture Description Language (ADL).

 UML's Deployment Diagrams.

B.3.1.2 Software Classifier Model

Describe the kinds of software components that are part of the system, the actors for the system with which the system interacts, and the kinds of connectors that will be used to connect them.

Representation:

Describe the software classes of the system using either:

 An Architecture Description Language (ADL).

 UML's Component Diagrams.

B.3.1.3 Deployment Model

Describe the component and connector configuration(s) that make up a working version of the system. (There may be more than one configuration, e.g., for different modes.) For each configuration, describe the instances of hardware and software component classes that participate in the configuration, the allocation of software components to the hardware components, the instances of hardware connector classes that link the hardware components, and the instances of software connector classes that link the software components.

Representation:

Describe the deployment configurations using either:

 An Architecture Description Language (ADL).

 UML's Deployment Model.

B.3.1.4 Software Component Classifiers

For each software component classifier defined in the figure(s) shown in the Software Classifier Model (SSAD 3.1.2), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the software component classifier in the header.


Representation:

The description of software components classifiers depends on the type of system that you are defining, on the architecture language that you are using to represent your system, and possibly on architecture style considerations.

577 Guidelines:

Create a UML package, with the stereotype <> and the name "Component Classifier Name AD", in the package that holds the classifier that represents this software component classifier.

B.3.1.4.1 Component Classifier X

Based on your chosen architectural style and language (and, as always, on your risk analysis), fill in the details for the following sections.

B.3.1.4.1.1 Purpose

Describe the purpose of this class of component.

B.3.1.4.1.2 Interface(s)

Describe the visible features of the component classifier.

What features can be specified in an interface depend on your architecture style and language (inc. UML). Some common features include operations, attributes, and subcomponents.

Representation:

Describe the visible features of the component classifier using either:

 An Architecture Description Language (ADL).

 UML’s Static–Structure Diagrams & Component Diagrams

B.3.1.4.1.2.1 Feature or Feature Set X

Create a section at this level for each feature or feature set described above. The header of this section should be the name of the feature or feature set and its unique designator, if you have assigned an identifier and it is different from the name. The contents of this section should include a description of the feature’s purpose and the following:

 If the feature is an attribute, include its purpose, its classifier, and any default value;

 If the feature is a component that is a part, include a reference to its description in the Internal Architecture (SSAD B.3.1.4.1.6) section;

 If the feature is an operation, include a description of its purpose, its parameters and result (called a signature), and any pre– and post–condition;

 If describing an interface, describe each member of the interface. (A minimal illustrative sketch of such an interface description is shown below.)
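The following is a minimal, hedged sketch of one way such a feature description might look in code form; InventoryQuery and its members are hypothetical names, and the pre-/post-conditions are recorded as comments rather than in any prescribed notation:

    // Illustrative sketch only: an interface feature set with an attribute-like
    // feature and two operations, each documented with its signature and
    // pre-/post-conditions. All names are hypothetical.
    public interface InventoryQuery {

        // Attribute feature: default reporting currency (default value "USD").
        String DEFAULT_CURRENCY = "USD";

        // Operation: quantityOnHand
        // Signature: (String partNumber) -> int
        // Precondition: partNumber identifies a part known to the inventory system.
        // Postcondition: the returned value is >= 0 and reflects the latest committed stock level.
        int quantityOnHand(String partNumber);

        // Operation: reservePart
        // Signature: (String partNumber, int quantity) -> boolean
        // Precondition: quantity > 0.
        // Postcondition: returns true only if the requested quantity has been reserved.
        boolean reservePart(String partNumber, int quantity);
    }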

B.3.1.4.1.3 Parameters

Describe any parameters of the software component classifier. The parameters need to be set when an instance of the software component classifier is created.

What parameters can be specified, if any, depend on your architecture style and language (inc. UML). Some common parameters include values, objects, classifiers, operations, and other components.


Representation:

Describe the parameters of the component classifier using either:

 An Architecture Description Language (ADL).

 UML's Component Diagrams.

B.3.1.4.1.4 Behavior

Describe behavior of the instances of this component classifier.

Processes

Describe the processes of this component classifier. For each process, identify which other components and actors participate in the process, and which artifacts and information are inspected, manipulated, or produced by the process; and describe at a high level the actions performed by this component and each actor during the process.

Representation:

For each high priority process create a Use Case Model, a Use-Case Description (Table 1 – LeanMBASE Additional SSAD Guideline), and an Activity Diagram and Description (Table 2, 3, 4 – LeanMBASE Additional SSAD Guideline)

For other processes it is sufficient to have just a Use-Case Description (Table 1 – LeanMBASE Additional SSAD Guideline)

Process X

Create a section at this level for each process of the component. The header of this section should be the name of the process and its unique designator, if you have assigned an identifier and it is different from the name.

For each process, create a subsection numbered 3.1.4.1.4.x with the name of the process in the header.

B.3.1.4.1.5 Constraints

Describe any constraints on the use and implementation of this component that are not captured in other sections of the component description. The following paragraphs describe some typical constraints.

 Rules that the component must implement in support of rules that the system must satisfy;

 Constraints imposed on component’s implementation by the architecture style(s), patterns, or frameworks used for system architecture;

 Constraints imposed on component’s implementation by architecture design notation used for system architecture. (These are less likely.)

The goal is to accurately document important rules that affected the component's design and implementation, without getting into implementation details.

B.3.1.4.1.6 Internal Architecture

Describe the architecture of this software component classifier that is independent of the implementation technology. Describe the subcomponents of this component, what they are expected to do, how they are connected, and what they communicate.


Representation:

Apply the same guidelines to describe the architecture of a component as were used to describe the system architecture in the Platform Independent Model section (SSAD B.3).

B.3.1.5 Hardware Components

For each hardware component defined in the figure(s) shown in the Hardware Classifier Model (SSAD 3.1.1), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the hardware component classifier in the header.

Note:

The description of hardware component classifiers depends on the type of system that you are defining, on the architecture language that you are using to represent your system, and possibly on architecture style considerations.

B.3.1.5.1 Component X

Based on your chosen architectural style and language (and, as always, on your risk analysis), fill in the details for the following sections.

B.3.1.5.1.1 Purpose

Describe the purpose of this component.

B.3.1.5.1.2 Classifier

Identify the classifier of this component.

B.3.1.6 Software Components

For each software component defined in the figure(s) shown in the Software Classifier Model (SSAD 3.1.2), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the software component classifier in the header.

Note:

The description of software component classifiers depends on the type of system that you are defining, on the architecture language that you are using to represent your system, and possibly on architecture style considerations.

B.3.1.6.1 Component X

Based on your chosen architectural style and language (and, as always, on your risk analysis), fill in the details for the following sections.

B.3.1.6.1.1 Purpose

Describe the purpose of this component.

B.3.1.6.1.2 Classifier

Identify the classifier of this component.

B.3.2 Data Model

Create a model of the information classes that are needed to support the architectural structure (SSAD B.3.1) and implement the system behavior (SSAD B.3.3). This class model may include classes representing the following:

 Artifacts and information required by the system (SSAD B.2.2);

 Forms defined in the current prototype (OCD 3.3.4);

 Information needed for communication among components in the architectural structure (SSAD B.3.1);

 Control behavior specific to one or a few processes performed by architectural units (i.e. “recipes” that define how to do something).

Representation

Create a class model using either:

 An Architecture Description Language (ADL).

 UML's Class Model.

B.3.2.1 Data Class X

Create a section at this level for each artifact in the figure(s) shown in SSAD B.3.2. The header of this section should be the name of the class and its unique designator, if you have assigned an identifier and it is different from the name.

Provide a brief description of its role, purpose, and responsibilities. List any attributes and operations that are used to describe either the structure (SSAD B.3.1) or behavior (SSAD B.3.3) of the architecture. List any system capabilities that use the artifact.

B.3.3 Behavior

Describe how the components work with each other and with the actors to implement the required behavior of the system (SSAD B.2.3).

Describe how each system process (SSAD B.2.3) is implemented by the components described in the Deployment Model (SSAD 3.1.3). For each process, identify which components participate in the system process, and which instances of the analysis classes (SSAD B.3.2) are inspected, manipulated, or produced in the process; and describe the interactions among the components and the analysis classes.

Representation:

Create a behavior model using either:

 An Architecture Description Language (ADL).

 Create a Use–Case Model that describes the system’s processes, the actors that participate in each process and the relations among the processes and the outside actors. Represent the model as a package with the stereotype <> with the package name “Process Implementations”.

For each high priority process implementation, create a Use–Case Realization Description (along with Use Case Model – Table 12 – LeanMBASE Additional SSAD Guideline), and one or more Interaction Diagram.


For other processes it is sufficient to create a Use–Case Realization Description (Table 12 – LeanMBASE Additional SSAD Guideline).

B.3.4 Architectural Styles, Patterns & Frameworks

Describe any architectural styles (e.g. the Prism style [http://sunset.usc.edu/~softarch/Prism/]), patterns (e.g., pipe–and–filter and client–server), or frameworks used to describe the system architecture.

B.4. Platform Specific Model

Design a technology–specific implementation for the system by refining the general architecture defined during Platform Independent Model (SSAD B.3).

 Describe the development technologies that are to be used in the implementation, including: hardware types (e.g. SUN Servers, 1553 Buses), languages (e.g. Java, XML/HTML), database managers (e.g. Oracle 8), communication tools (e.g. HTTP servers, CORBA ORBs), frameworks (e.g. CORBA, .Net, JDK 1.4), class libraries, and design patterns.

 Describe how each component and connector defined in architecture structure B.3.1 will be implemented (e.g. hand–coded, COTS product used, tool–generated). Refine the detailed description (i.e. interfaces, behavior, L.O.S., internal structure) of each component and connector, as appropriate, based on the implementation technology used.

 Describe how each analysis class will be implemented (e.g. language–specific classes, relational database tables) on appropriate components.

 Describe how instances of each class will be used to implement each component.

 Refine the descriptions of the behavior of the architecture and of the L.O.S. provided by the architecture based on the implementation technology.

UML Guideline:

Create a UML package with the stereotype <> and the name “Implementation Design”. Create all models described in the following sections in this package.

577 Guidelines:

Using Rose, create the UML package with the stereotype <> and the name “Implementation Design” in the top–level package with the name Logical View. Create subpackages as described above for simple systems. (Note: do not use Rose’s Component View or Deployment View.)

B.4.1 Structure

Describe how the architecture and the analysis classes will be implemented. Describe how each component and connector will be implemented.

 For each “hand–coded” component or connector, describe the objects and classes that will be used to implement the component or connector.

 For components that are implemented using a COTS product, describe the values that will be used for any parameters of the COTS product and any tailoring that will be performed. Limit the description of the COTS product to features that must be known or used by developers to implement other components in the system.


 For tool–generated components, describe inputs to the tools and the interfaces of the components that will be produced.

Describe how the implementation components and connectors are organized into layers, partitions, and subsystems. Describe which components will need to implement which analysis classes, and how the analysis classes will be realized using the technologies (e.g. programming language, database tables, web–pages) that are being used to implement the component.
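For instance, a hand-coded persistence component might realize one of the analysis classes as a relational table, as in the following sketch (the component, class, table, and column names are assumptions, and JDBC is shown only because relational database tables were listed above as one possible realization technology):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Hypothetical hand-coded persistence component realizing the LoanRecord
// analysis class as a relational table; all names are illustrative.
public class LoanRecordStore {
    private final Connection connection; // supplied by the deployment configuration

    public LoanRecordStore(Connection connection) {
        this.connection = connection;
    }

    // Persists one analysis-class instance into the LOAN_RECORD table.
    public void save(String borrowerId, String itemId, java.sql.Date dueDate) throws SQLException {
        try (PreparedStatement stmt = connection.prepareStatement(
                "INSERT INTO LOAN_RECORD (BORROWER_ID, ITEM_ID, DUE_DATE) VALUES (?, ?, ?)")) {
            stmt.setString(1, borrowerId);
            stmt.setString(2, itemId);
            stmt.setDate(3, dueDate);
            stmt.executeUpdate();
        }
    }
}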

577 Guidelines:

Using Rose, create a UML package with the stereotype <> and the name “Implementation Design” in the top–level package with the name Logical View.

B.4.1.1 Hardware Classifier Model

Describe the kinds of hardware components that are either part of the system or on which this system will run, the actors for the system with which the system interacts, and the kinds of connectors that will be used to connect their parts.

Refine the Hardware Classifier Model specified in architecture design (SSAD 3.1.1) by describing how each hardware component and connector classifier described during architecture design is implemented, and by adding any implementation–specific hardware component and connector classifiers that are needed for support.

Representation:

See Representation in the architecture–design Hardware Classifier Model (SSAD 3.1.1).

B.4.1.2 Software Classifier Model

Describe the kinds of implementation–specific software components that are part of the system, the actors for the system with which the components interact, and the kinds of connectors that will be used to connect their parts.

Refine the Software Classifier Model specified in architecture design (SSAD 3.1.2) by describing how each software component and connector classifier described during architecture design is implemented, and by adding any implementation–specific software component and connector classifiers that are needed for support (e.g. frameworks).

Representation:

See the Representation discussion in the architecture–design Software Classifier Model (SSAD 3.1.2).

B.4.1.3 Deployment Model

Describe implementation–specific component and connector configuration(s) that make a working version of the system. (There may be more than one configuration, e.g. for different modes or involving different platforms.) For each configuration, describe the instances of hardware and software component classes that participate in the configuration, the allocation of software components to the hardware components, the instances of hardware connector classes that link the hardware components, and the instances of software connector classes that link the software components.

Refine the Deployment Model specified in architecture design (SSAD 3.1.3) by describing how each component and connector classifier described during architecture design is implemented, and by adding any implementation– specific software components and connectors that are needed for support (e.g. frameworks).


Representation:

See the Representation discussion in the Deployment Model specified in architecture design (SSAD 3.1.3).

B.4.1.4 Software Component Classifiers

For each implementation–specific software component classifier defined in the figure(s) shown in the Software Classifier Model (SSAD 4.1.2), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the software component classifier in the header.

Representation:

The description of software components classifiers depends on the type of system that you are defining, on the architecture language that you are using to represent your system, and possibly on architecture style considerations.

577 Guidelines:

Create a UML package, with the stereotype <> and the name “Component Classifier Name ID”, in the package that holds the classifier that represents this software component classifier.

B.4.1.4.1 Component Classifier X

Based on your chosen architectural style and language (and, as always, on your risk analysis) fill in the details for the following sections.

B.4.1.4.1.1 Purpose

Describe the purpose of this class of component.

B.4.1.4.1.2 Interface(s)

Describe the visible features of the component classifier.

What features can be specified in an interface depends on your architecture style and language (inc. UML). Some common features include operations, attributes, and subcomponents.

Representation:

See Representation for Interface(s) of architecture–design Software Component Classifiers (SSAD B.3.1.4.1.2).

B.4.1.4.1.2.1 Feature or Feature Set X

See Feature or Feature Set X for Interface(s) of architecture–design Software Component Classifiers (SSAD B.3.1.4.1.2).

B.4.1.4.1.3 Parameters

Describe any parameters of the software component classifier. The parameters need to be set when an instance of the software component classifier is created.

What parameters can be specified, if any, depends on your architecture style and language (inc. UML). Some common parameters include values, objects, classifiers, operations, and other components.

Representation:

See Representation for Parameters of architecture–design Software Component Classifiers (SSAD B.3.1.4.1.3).
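As an illustration of value parameters that must be supplied when a component instance is created (the component and parameter names below are hypothetical):

// Hypothetical software component whose configuration parameters are fixed
// at instantiation time; the values would come from the Deployment Model.
public class ReportGenerator {
    private final String outputDirectory;   // value parameter: where generated reports are written
    private final int maxRowsPerReport;     // value parameter: report size limit

    public ReportGenerator(String outputDirectory, int maxRowsPerReport) {
        this.outputDirectory = outputDirectory;
        this.maxRowsPerReport = maxRowsPerReport;
    }

    public String getOutputDirectory() { return outputDirectory; }
    public int getMaxRowsPerReport() { return maxRowsPerReport; }
}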

B.4.1.4.1.4 Behavior

Describe the behavior of the instances of this component classifier.

Processes

Describe the processes of this component classifier. For each process, identify which other components and actors participate in the process, and which artifacts and information are inspected, manipulated, or produced by the process; and describe at a high level the actions performed by this component and each actor during the process.

Representation:

For each high priority process, create a Use Case Model, a Use-Case Description (Table 1 – LeanMBASE Additional SSAD Guideline), and an Activity Diagram and Description (Table 2, 3, 4 – LeanMBASE Additional SSAD Guideline).

For other processes it is sufficient to have just a Use-Case Description (Table 1 – LeanMBASE Additional SSAD Guideline).

Process X

Create a section at this level for each process of the component. The header of this section should be the name of the process and its unique designator, if you have assigned an identifier and it is different from the name.

B.4.1.4.1.5 Constraints

Describe any constraints on the use and implementation of this component that are not captured in other sections of the component description. The following paragraphs describe some typical constraints.

 Rules that the component must implement in support of rules that the system must satisfy;

 Constraints imposed on component’s implementation by the architecture style(s), patterns, or frameworks used for system architecture;

 Constraints imposed on component’s implementation by architecture design notation used for system architecture. (These are less likely.)

The goal is to document accurately important rules that affected the component’s design and implementation, without getting into implementation details.

B.4.1.4.1.6 Internal Architecture

Describe the architecture of this software component classifier that is independent of the implementation technology.

 For components that represent the lowest–level modular, deployable, and replaceable parts of a system (e.g. the representation of an executable, a link–library, a Java Bean), describe the objects and classes that are used to create the component.

 For other components, describe the subcomponents of this component, what they are expected to do, how they are connected, and what they communicate.

Representation:

Apply the same guidelines to describe the architecture of this component as were used to describe the system architecture in the Platform Specific Model (SSAD B.4).

B.4.1.5 Hardware Components

For each hardware component defined in the figure(s) shown in the Hardware Classifier Model (SSAD 4.1.1), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the hardware component in the header.

Representation:

The description of hardware components depends on the type of system that you are defining, on the architecture language that you are using to represent your system, and possibly on architecture style considerations.

B.4.1.5.1 Component X

Based on your chosen architectural style and language (and, as always, on your risk analysis) fill in the details for the following sections.

B.4.1.5.1.1 Purpose

Describe the purpose of this component.

B.4.1.5.1.2 Classifier

Identify the classifier of this component.

B.4.1.6 Software Components

For each software component defined in the figure(s) shown in the Software Classifier Model (SSAD 4.1.2), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the software component in the header.

Representation:

The description of software components depends on the type of system that you are defining, on the architecture language that you are using to represent your system, and possibly on architecture style considerations.

B.4.1.6.1 Component X

Based on your chosen architectural style and language (and, as always, on your risk analysis) fill in the details for the following sections.

B.4.1.6.1.1 Purpose

Describe the purpose of this component.

B.4.1.6.1.2 Classifier

Identify the classifier of this component.

B.4.1.7 Implementation Classes

For each implementation class defined in the figure(s) describing the Internal Architecture (SSAD B.4.1.4.1.6) of software components in the Software Classifier Model (SSAD 4.1.2), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the implementation class in the header.


Representation:

For each implementation class, describe its purpose, the component that defines it, its interfaces, its operations, its parameters, its state behavior, and its level of services.

577 Guidelines:

Create a UML package, with the stereotype <> and the name “Implementation Class Name Details”, in the package that holds the classifier that represents this software component classifier.

B.4.1.7.1 Implementation Class X

Based on UML and your chosen implementation language (and, as always, on your risk analysis) fill in the details for the following sections.

B.4.1.7.1.1 Purpose

Describe the purpose of this class.

B.4.1.7.1.2 Defined In: Component Name

Put the full name of the component in which this class resides in the header.

B.4.1.7.1.3 Interface(s)

Describe the interfaces that are realized by this implementation class.

Representation:

Describe the visible features of the implementation class by creating a Static–Structure Diagram named “Interfaces”.

B.4.1.7.1.3.1 Interfaces X

Create a section at this level for each interface described above. The header of this section should be the name of the interface and its unique designator, if you have assigned an identifier and it is different from the name. The contents of this section should include a description of the interface’s purpose and a list of all operations defined. For each operation, describe its purpose, its parameters and result (called a signature), and any pre– and post–conditions.
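For example, a single operation of a hypothetical interface might be documented as follows (the interface name, signature, and conditions are illustrative only):

// Hypothetical interface realized by an implementation class.
public interface AccountService {
    /**
     * Purpose: withdraws the given amount from the identified account.
     * Signature: withdraw(accountId : String, amount : double) : double
     * Pre-condition: amount > 0 and amount is not greater than the current balance.
     * Post-condition: the returned value is the new balance, reduced by amount.
     */
    double withdraw(String accountId, double amount);
}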

B.4.1.7.1.4 Parameters

Describe any parameters of the implementation class. The parameters need to be set when an instance of the implementation class is created.

Representation:

 Create a Static–Structure Diagram that shows the implementation class.

B.4.1.7.1.5 Attributes

Describe the attributes of this implementation class. For each attribute, describe its purpose, its class, its visibility, any default value, and any stereotype.
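A small sketch of how such attribute descriptions map onto code (the class and attribute names, defaults, and visibilities are hypothetical):

// Hypothetical implementation class used only to illustrate attribute documentation.
public class SessionInfo {
    private String userId;                   // purpose: identifies the logged-in user; class: String; private; no default
    private int timeoutMinutes = 30;         // purpose: idle timeout; class: int; private; default value 30
    protected boolean auditEnabled = false;  // purpose: toggles audit logging; class: boolean; protected; default false
}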

B.4.1.7.1.6 Operations

Describe the operations of this implementation class. For each operation, describe its purpose, its parameters and result (called a signature), any pre– and post–condition, and its behavior.

B.4.1.7.1.6.1 Operation X

Create a section at this level for each operation of the class. The header of this section should be the name of the operation and its unique designator, if you have assigned an identifier and it is different from the name.

B.4.1.7.1.7 State Behavior

If significant, describe the state behavior of the class, i.e. how the behavior of a class instance depends on the state it is in. Describe the states of the class instance, the events that cause the class instance to change states, and how the processing is different in each state.

Representation:

 Create a State Model that describes the class instance’s state behavior.
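A minimal sketch of state-dependent behavior, assuming a hypothetical Order class (the states and events are illustrative; a UML State Model conveys the same information graphically):

// States, the events that change them, and processing that differs by state.
public class Order {
    enum State { NEW, PAID, SHIPPED, CANCELLED }

    private State state = State.NEW;

    public void pay() {        // event: payment received; only meaningful in NEW
        if (state == State.NEW) state = State.PAID;
    }

    public void ship() {       // event: shipment dispatched; only meaningful in PAID
        if (state == State.PAID) state = State.SHIPPED;
    }

    public void cancel() {     // event: cancellation request; ignored once shipped
        if (state == State.NEW || state == State.PAID) state = State.CANCELLED;
    }

    public State getState() { return state; }
}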

B.4.1.7.1.8 Constraints

B.4.1.8 Objects

For each object defined in the figure(s) shown in the Software Classifier Model (SSAD 4.1.2), create a subsection with the name and unique identifier (if you have assigned an identifier and it is different from the name) of the object in the header.

Representation:

For each object, describe its purpose, its classifier, and any L.O.S. goals.

B.4.1.8.1 Object X

Based on your chosen architectural style and language (and, as always, on your risk analysis) fill in the details for the following sections.

B.4.1.8.1.1 Purpose

Describe the purpose of this object.

B.4.1.8.1.2 Classifier

Identify the classifier of this object.

B.4.2 Behavior

Describe how the components work with each other and with the actors to implement the required behavior of the system (SSAD B.2.3).

Describe how each system process (SSAD B.2.3) is implemented by the components described in the Deployment Model (SSAD 4.1.3). For each process, identify which components participate in the system process, and which instances of the implementation classes (SSAD 4.1.7) that realize the Data Model (SSAD B.3.2) are inspected, manipulated, or produced in the process; and describe the interactions among the components and the classes that are implementations of analysis classes.


Representation:

See Representation in the Behavior section (SSAD B.3.3) of architecture design.

B.4.3 Patterns & Frameworks

Describe any implementation architecture styles (e.g. the Prism style [http://sunset.usc.edu/~softarch/Prism/]), patterns (e.g., pipe–and–filter and client–server), or frameworks (e.g. Java and CORBA) used to describe the system architecture.

B.4.4 Project Artifacts

Describe how the software components and classes defined in the implementation Structure (SSAD B.4.1) are assigned to project artifacts (e.g. files, database tables, web pages) that will be used to produce code; how the files are organized into directories; and the dependency relations among the artifacts and the directory structure.

Representation:

Create a hierarchical list of directories, their files, and the classes or components in each file.
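A purely illustrative layout (all directory, file, and component names are hypothetical):

src/order/          Order.java, OrderService.java   (Order Management component)
src/persistence/    OrderStore.java                 (Persistence component; depends on src/order/)
web/                index.jsp                       (entry page served by the application server)
sql/                schema.sql                      (table definitions for the persistent classes)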

B.5. Glossary for System Analysis and Design

Create an alphabetical listing of all uncommon or organization-specific terms, acronyms, and abbreviations, and their meanings and definitions, needed to understand the Domain Description.

Recommendation:

Glossary items are often answers to questions that you ask the client: “What does this mean?”

B.6. Appendices

Create appendices as needed. Each appendix shall be referenced in the main body of the document where the data would normally have been provided.

Include supporting documentation or pointers to electronic files containing:

 Descriptions of capabilities of similar systems

 Additional background information

Life Cycle Plan (LCP)

A. Description of the LCP

Sections A1-A3 of this Life Cycle Plan (LCP) section of the LeanMBASE guidelines are intended to be read by CS577 students -- to help them understand what the LCP is about and how to write their projects’ LCPs -- and are not intended to be included in submitted LCPs.

A.1. Purpose

The purpose of a development project’s LCP is to:

 Serve as a basis for monitoring and controlling the project’s progress

 Help make the best use of people and resources throughout the system’s life cycle

 Provide evidence to other key stakeholders that the developers have thought through the major life cycle issues in advance

The LCP is organized to answer the most common questions about a project or activity: what?, why?, when?, who?, where?, how?, how much?, and whereas?

A.2. LCP Life Cycle Process

As the team is forming you should:

 Identify those skills necessary for the execution of your project, e.g.: managing, prototyping, architecting, customer relations, etc.

 Identify those team members who possess each of the requisite skills

 Assign team members to the various project activities based upon their skills and interests, and the project’s needs

The need to properly allocate team members’ time suggests that for each artifact you should check the course schedule to determine when its LCO and LCA versions are due and what the completion criteria are. This will enable you to understand how much work will have to be done to complete each version of each artifact. Since the amount of effort required for the development of each version of each of the artifacts will vary depending upon your project type (1-semester prototyping, COTS-intensive, 2-semester custom software development), it is best to wait until your project has been assigned to work out detailed artifact development plans.

Once your project has been assigned, either the project leader or the LCP lead author should take the lead in starting to develop the schedule of early tasks and responsibilities. Once the Win-Win negotiations have been completed, the initial draft of Section 3 (Responsibilities) can be completed, and Section 2.1 (overall Strategy) can be developed.

Different types of projects will require very different types of activities, e.g.:

 COTS assessment and tailoring

 Or COTS glue code development

 Or custom code development;


 Or legacy code understanding and modification

 Or some combination of the above

Once the Win-Win negotiations have been completed you will know which types of activity will be required and can use some combination of COCOMO II and COCOTS to initially scope the amount of functionality that can be delivered with one or two semesters (12 or 24 weeks) of effort, and can write an initial version of Section 5 (Resources).
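As a sketch of how such an initial scoping estimate might look (the size, scale-factor, and effort-multiplier values below are hypothetical; the constants A = 2.94 and B = 0.91 are the published COCOMO II.2000 calibration values):

E  = B + 0.01 × (sum of scale factors) = 0.91 + 0.01 × 19 = 1.10
PM = A × (KSLOC)^E × (product of effort multipliers) = 2.94 × 6^1.10 × 0.9 ≈ 19 person-months

If an estimate like this exceeds the effort that your fixed team size and 12- or 24-week schedule can actually supply, that is the signal to descope to a smaller set of core capabilities.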

At this point you should have enough information to develop an initial task schedule Gantt chart for Section 2.2 (Phases) using Microsoft Project. (The COCOMO II phase and activity distributions and a Microsoft Project Gantt chart generator tool can help you do this. The average effort distributions on CS577 projects, described in EP – 12 (http://greenbay.usc.edu/csci577/fall2005/site/coursenotes/ep/index.html) can also be helpful.)

Because CS577a students often misunderstand which task schedules (Gantt charts) should be presented in each of the anchor point (LCO and LCA) LCP’s, we emphasize the following points:

 While it is important to develop good Gantt charts for the Inception (Life Cycle Objective (LCO)) phase, the most important Gantt charts to present at the LCO architecture review are the ones for the Elaboration (Life Cycle Architecture (LCA)) phase, including those for a COTS Assessment Plan (CAP), if necessary.

 For the LCA architecture review, the most important Gantt charts, organization charts, roles and responsibilities and risk management plans are those for the 577b (Construction and Transition) part of the project, including scoping and planning for an initial Core Capability increment and for a set of Transition (of product)-to-client activities.

 That is, since:

o The purpose of the LCO review is to enable all key stakeholders to decide whether the project is well enough along to enter the elaboration phase

o The purpose of the LCA review is to enable all key stakeholders to decide whether the project is well enough along to enter the implementation phase

o Stakeholders are not very interested in how completed tasks were scheduled, but, rather, whether, and how successfully, they were completed; they are, however, critically interested in whether there is a plausible plan/schedule for executing the next phase.

 In each set of artifacts the stakeholders are critically interested in seeing Gantt charts, organization charts, roles and responsibilities and risk management plans for the next stage.

Depending on the amount of change in team composition and in client priorities between CS577a and CS577b, there may be a good deal of planning needed for the Rebaselined LCA review. If that is the case, the LCP is updated to cover your new increment plans and any further project redirection. It is not a client deliverable at IOC, although the final 577a and 577b versions must be packaged for the archived version of the project artifacts.

A.3. Completion Criteria

A.3.1 Life Cycle Objectives (LCO)

 Identification of primary life-cycle stakeholder roles and responsibilities


 Identification of life-cycle process strategy : phases and increments

 Top-level WWWWWWHH (Why, Whereas, What, When, Who, Where, How, How Much) by phase

 Detailed plans for Elaboration phase

 Deliverables, budgets and schedules achievable by developing to at least one system/software architecture and life-cycle process strategy.

A.3.2 Life Cycle Architecture (LCA)

 Elaboration of WWWWWWHH for Initial Operational Capability (IOC)

 Deliverables, budgets and schedules achievable by developing to the architecture in the SSAD and detailed development plans for the Construction phase.

A.3.3 Initial Operational Capability (IOC)

 Plans for future increments beyond IOC

B. Sections of the LCP Document

The (sub-)sections listed below describe the base format for the LCP. For readability by faculty, TA’s, graders, and CS577b students, every submitted LCP document (LCO LCP, LCA LCP, and IOC LCP) should contain each of the indicated sections, with the title and section numbering unchanged. (See the General Guidelines for exceptions.) The text under each (sub-)heading describes what should be in the corresponding (sub-)section. (The texts in these (sub-)sections of the present document are to be read by CS577 students -- to help them prepare the contents of the corresponding (sub-)sections of their LCP documents -- and are not intended to be included in submitted LCPs.)

B.1. Introduction

B.1.1 Status of the LCP Document

Summarize any significant difference between the content of the LCP and the Win-Win negotiated Agreements. Identify any major LCP issues that have not yet been resolved.

B.1.2 Assumptions

This section identifies the conditions that must be maintained in order to implement the plans described in the rest of the LCP using the resources, human and other, specified.

Assumptions might cover such items as:

 Stability of software product requirements, including external interfaces

 Stability of required development schedules

 Continuity of funding

 On-schedule provision of customer-furnished facilities, equipment, and services

 On-schedule, definitive customer response to review issues and proposed changes


 What the developers expect to be ready in time for them to use. For example, other parts of your products, the completion of other projects, software tools, software components, test data, etc.

 Dependencies on computer systems or people external to this project

 Any tasks which are on the critical path of the project

The above assumptions can be used as a reference and tailored as per project requirements. That is:

 Stability of software product requirements is a condition that must be maintained (in order to implement the plans described in the rest of the LCP using the resources, human and other, specified) for every project, so it shouldn’t be listed here unless there is some reason to believe that for your particular project product requirements are more likely to change than for just any project. If this is the case, then the reason that you believe that product requirements are likely to change should be included here.

 The comments about stability of software product requirements apply equally well to virtually all the other examples of “condition that must be maintained,” so the comments regarding which, if any, to list apply to all of them.

 So, at the risk of belaboring the point, only those conditions for which you have good reason to suspect change should be listed here – and with an indication of your reasons.

B.1.3 References

This section contains references, if any are needed, to books, documents, web sites, meetings, tools used or referenced, etc., that will help the reader understand the system’s requirements in greater detail. Note that the degree to which any MBASE model -- document section -- is elaborated should be determined using risk considerations. To repeat an admonition from the General Guidelines:

 If it’s risky not to include something, then include it

 If it’s risky to include something, don’t include it.

It is risky not to write and include information that’s critical to success and that will likely be forgotten if it isn’t explicitly stated; it is risky to write and include something if doing so takes significant time away from doing something considerably more important to the success of the project. Thus, on the one hand, your grade may suffer if you omit references critical to the understanding of the project; it may, on the other hand, suffer if you include unnecessary references just for the sake of including references.

B.2. Milestones and Products

This section lists the tasks to be performed, the work products that will be produced, and the dates by which the tasks and work products will be completed.

B.2.1 Overall Strategy

In this section you should describe the overall strategy that will be used for executing your project.

Most CS577 projects will use some form of Schedule As Independent Variable (SAIV) strategy, in which the 12– or 24–week schedule drives development of a set of (top priority) core capabilities. The overall strategy will, of course, vary depending on the type of project, e.g., feasibility study project, COTS assessment project, COTS-based development project, custom development project, etc.

B.2.2 Phases

In this section you provide a list of major milestones and dates by which they are expected to be completed – in the form of a Gantt chart, which you are to construct using Microsoft Project.

 The schedule should focus on the next phase of the development.

 Include a link to the Microsoft Project file for the detailed (Gantt chart) schedule rather than inserting it here.

 For COTS–intensive projects, a concise COTS Assessment Plan (CAP) is also needed. For COTS assessment projects, it replaces the LCP.

B.2.3 Project Deliverables

In this section you:

 Provide a list of artifacts to be delivered at the end of each stage (see below). (Note that the precise artifacts to be delivered depend upon the project’s type, with certain COTS projects having different artifact delivery requirements than custom development projects, so an indication of the project type would be appropriate here.)

 For each artifact, provide its due date, format (Word, PDF, etc.) and delivery medium (hard copy, soft copy, etc.).

B.2.3.1 Engineering Stage

In this section you provide a list of artifacts to be delivered at the end of the Engineering Stage. This list typically includes (see above for minor exceptions) the LCO and LCA versions of the:

 Operational Concept Description (OCD)

 System and Software Requirements Description (SSRD)

 System and Software Architecture Description (SSAD)

 Life Cycle Plan (LCP)

 Feasibility Rationale Description (FRD)

 Prototypes (See OCD Section 4.)

B.2.3.2 Production Stage

In this section you provide a list of artifacts to be delivered at the end of the Production Stage. This list typically includes the following (technical) documents plus the added transition package that is to be delivered to the client:

 LCA package (kept up-to-date with "as-built" architecture and design)

 Iteration Plans

 Iteration Assessment Reports

 Release Description (for each internal release)


 Quality Management Plan

 Test Plan and Cases

 Test Procedures and Results

 Peer Review Plan

 Peer Review Reports

The transition package includes the software library’s:

 Source Code

 Load Modules

 Installation Scripts

 Software Development Files (e.g. test modules)

Client-side deliverables include:

 User Manual

 Training Plan

 Transition Plan

 Support Plan

 Training materials (including tutorials and sample data)

 Regression Test Package

 Tools and Procedures

 Version Description Document (or Final Release Notes)

 Facilities, Equipment and Data (these may not be the responsibility of the team)

B.3. Responsibilities

In this section you specify who will be responsible for performing each of the software life cycle functions, and by which organization each will be performed.

B.3.1 Overall Summary

In this section you provide an overall summary of the various stakeholders’ responsibilities during the development of the project. You should include responsibilities of all stakeholders across all project phases.

The following table (Table 9) may be used as a reference, but must be tailored to your project.


Table 9: Stakeholder responsibilities during software life cycle

Users
 Inception: Support definition of requirements specification, operational concept, and plan. Review prototype and exercise it if available.
 Elaboration: Review designs and prototypes during ARB. Help provide test data and test scenarios.
 Construction: Review and test product (or its increment) in development environment. Provide test support.
 Transition: Review and test product (or its increment) in operational environment. Provide usage feedback to maintainer.

Customer/client
 Inception: Support definition and review of requirements specification, operational concept, and plan; accept or reject options.
 Elaboration: Monitor progress at milestones. Review designs, prototypes, plans, and feasibility during ARB. Help provide test data and test scenarios.
 Construction: Monitor progress at milestones. Review and test product. Provide administrative support. Review system performance.
 Transition: Monitor progress. Provide administrative support in transitioning the product. Review system performance.

Developer/maintainer
 Inception: Prepare requirements specification, operational concept, architectural sketches, and plan.
 Elaboration: Refine architecture and design and present them during ARB. Refine or rebuild further prototype to investigate risks. Build user interface prototype. Prepare test plan.
 Construction: Refine design, implement, and integrate product. Perform and support reviews and test.
 Transition: Provide development support in transitioning the product. Adapt product if development environment differs from operational one.

Interface
 Inception: Support definition of requirements specification and interface specification.
 Elaboration: Refine interface specification and review design. Build prototype to investigate risks.
 Construction: Review interface design and implementation. Validate interface in development environment.
 Transition: Validate interface in operational environment.

B.3.2 By Phase / Stage

In this section you provide full detail of responsibilities, organized by phase, i.e.,

 For each development team member, identify the member, his/her role, and his/her primary and secondary responsibilities in each of the project phases..


 For every stakeholder representative who is authorized to approve changes in project scope, budget (for COTS licenses, etc., in CS577) or schedule (not applicable in CS577), identify the stakeholder and his/her stakeholder category (customer, client, system administrator, maintainer, etc.) and his/her organization.

B.4. Approach

A reference set of good development practices for Monitoring and Control (Section 4.1), Reviews (Section 4.1.2), Status Reporting (Section 4.1.3), Risk Monitoring and Control (Section 4.1.4), Project Communication (Section 4.1.5), Methods, Tools, and Facilities (section 4.2), Configuration Management (Section 4.3), and Quality Management (Section 4.4) is provided in the following examples:

 Fulltext Title Database Project http://ebase.usc.edu/eservices/cs577b_2001/team08b/default.aspx

 USC Maternal, Child and Adolescent Center for Infectious Disease and Virology Project http://greenbay.usc.edu/csci577/spring2005/projects/team2/FD/FD.html

 Online Bibliographies on Chinese Religions in Western Language http://greenbay.usc.edu/csci577/spring2005/projects/team3/

Refer to MBASE guidelines for specific details regarding each section. http://cse.usc.edu/classes/cs577b_2005/guidelines/MBASE_Guidelines_v2.4.2.pdf

Using this numbering scheme, simply summarize how your project’s development practices will vary from the reference set. Some particular items to include explicitly are:

 Your particular Top-N risk list for the next phase in section 4.1.4

 Your particular choice of Configuration Management and Problem Closure Tracking tools, with any special arrangements necessary to satisfy the good development practices for Configuration Management (Section 4.3) and Quality Management (Section 4.4)

 Your particular approach to involving your offsite IV & V personnel in reviews, prototype exercising, and tailoring in Section 4.4.2.1

It would also be good to review the best-practice guidelines in the previous set of MBASE Guidelines (Version 2.4.2) at http://cse.usc.edu/classes/cs577b_2005/guidelines/index.html

B.5. Resources

For this section you will:

 Use COCOMOII / COCOTS (as appropriate to your project) to create an effort and schedule estimate for your project. Rather than including the COCOMOII / COCOTS files here, include a link to them. (Note that the COCOMOII / COCOTS estimates are not made to decide how much effort and how much time the project will take to complete; rather:

o Total effort is fixed by the number of team members and the number of hours/week a CS577 student is expected to devote to the project

o Schedule is dictated by CS577 rules


o So, the effort and schedule estimation are made to enable you to scope the project to a size that is likely to be achievable in the limited time available

 Provide an explanation as to how you selected specific values for each of the COCOMOII / COCOTS parameters.

B.6. Appendix

Create an appendix as needed. Each section of the appendix shall be referenced in the main body of the document where the data would normally have been provided.

Feasibility Rationale Description (FRD)

A. Description of the FRD

Sections A1-A3 of this Feasibility Rationale Description (FRD) section of the LeanMBASE guidelines are intended to be read by CS577 students -- to help them understand what the FRD is about and how to write their projects’ FRDs -- and are not intended to be included in submitted FRDs.

A.1. Purpose

The purpose of the Feasibility Rationale Description (FRD) is to show that the other artifacts (OCD, SSRD, SSAD, and LCP) are consistent and complete, in the sense that:

 Satisfaction of all the requirements specified in the SSRD will result in satisfaction of the Win-Win agreements made by the critical stakeholders – including the organizational goals (benefits, ROI, etc.) desired by the customer and the client.

 An implementation of the system whose architecture / design is specified in the SSAD will satisfy all product-related requirements specified in the SSRD and the Prototypes

 Execution of the project described in the LCP will:

o Result in the production of the system whose architecture / design is specified in the SSAD.

o Satisfy all project-related requirements specified in the SSRD, including the requirement of completion within schedule and within budget

I.e., the purpose of the FRD is to ensure that the system developers have not just created a number of system definition elements, but have also demonstrated the completeness and consistency of these elements in ensuring, with high probability, the feasibility of accomplishing the desired goals – both product-wise and project-wise.

That such a document is critical to project success is motivated by the fact that:

 In industrial and governmental software development projects – and, of course, in CS577 projects – different development team members write different ones of the op con, requirements, architecture / design, and project management artifacts. (Note that the terms “Operational Concept Definition,” “System and Software Requirements Description,” System and Software Architecture Description,” and “Life Cycle Plan” are MBASE terms for these artifacts.)

 There is often no team member who reads the two or more artifacts needed to check any one of the four points listed above – and rarely any team member who reviews the complete set of artifacts.

 As a result, project success can be severely jeopardized. (Construction of the FRD should alert its writer(s) to inconsistencies within and among project artifacts and various types of incompleteness; these should result in urging authors of those artifacts involved to edit them in order to remove the problems identified, and such editing should be followed up by the FRD’s authors to ensure that their feasibility arguments are correct with respect to the actual contents of the other artifacts.)

A.2. FRD Life Cycle Process

The Expected Benefits (Section 2.2) and Benefits Chain (Section 2.3) sections of the Operational Concept Description (OCD) should provide starting points for estimating the required client investments, the added value or cost savings resulting from the product’s use, and the resulting business-case return on investment (ROI)

The LCO version of the FRD will usually be incomplete, as key decisions usually remain to be made during the Elaboration phase. Any shortfalls in the evidence for the feasibility with respect to the criteria above should be treated as risks, documented in FRD section 6, and reflected in the Top–N risk resolution status summary in Life Cycle Plan (LCP) section 4.1.4. Shortfalls in subsequent versions of the FRD should follow the same procedure.

A.3. Completion Criteria

A.3.1 Life Cycle Objectives (LCO)

 Assurance of consistency among the system definition elements above for at least one feasible architecture (this should be done via appropriate combinations of analysis, measurement, prototyping, simulation, modeling, reference checking, etc.)

 Assurance of a viable business case analysis for the system as defined

 Requirements traceability and Level of Service feasibility

 Assurance that all major risks are either resolved or covered by a risk management plan

A.3.2 Life Cycle Architecture (LCA)

 Assurance of consistency among the system definition elements above for the architecture specified in the SSAD

 Assurance of a viable business case analysis for the system as defined

 Assurance that all major risks are either resolved or covered by a risk management plan

 Elaboration of Requirements traceability and Level of Service feasibility

A.3.3 Initial Operational Capability (IOC)

 Feasibility rationale for future increments beyond IOC

 Validation of business case and Benefits Chain (OCD 2.3) assumptions

B. Sections of the FRD Document

The (sub-) sections listed below describe the base format for the FRD. For readability by faculty, TA’s, graders, and CS577b students, every submitted FRD document (LCO FRD, LCA FRD, and IOC FRD) should contain each of the indicated sections, with the title and section numbering unchanged. (See the General Guidelines for exceptions.) The text under each (sub-) heading describes what should be in the corresponding (sub-) section. (The texts in these (sub) -sections of the present document are to be read by CS577 students -- to help them prepare the contents of the corresponding (sub-) sections of their FRD documents -- and are not intended to be included in submitted FRD’s.)

B.1. Introduction

B.1.1 Status of the FRD Document

Summarize any significant difference between the content of the FRD and the Win-Win negotiated Agreements. Identify any major FRD issues that have not yet been resolved.

B.1.2 References

This section contains references, if any are needed, to books, documents, web sites, meetings, tools used or referenced, etc., that will help the reader understand the system’s requirements in greater detail. Note that the degree to which any MBASE model -- document section -- is elaborated should be determined using risk considerations. To repeat an admonition from the General Guidelines:

 If it’s risky not to include something, then include it

 If it’s risky to include something, don’t include it.

It is risky not to write and include information that’s critical to success and that will likely be forgotten if it isn’t explicitly stated; it is risky to write and include something if doing so takes significant time away from doing something considerably more important to the success of the project. Thus, on the one hand, your grade may suffer if you omit references critical to the understanding of the project; it may, on the other hand, suffer if you include unnecessary references just for the sake of including references.

B.2. Business Case Analysis

In this section you analyze the software system’s return on investment (ROI): ROI= (Benefits-Costs)/Costs.

 Cost Analysis

o Costs include actual client costs for system development, transition, operations, and maintenance. Project team costs are zero, but participation by non-developer stakeholders does cost (salary, overhead, etc.). Transition costs can include equipment purchase, facilities preparation, COTS licenses, training, conversion, and data preparation costs. Operation costs can include COTS licenses, supplies, system administration, and database administration costs. Maintenance costs can include hardware and software maintenance. The COCOMO II maintenance estimator can be helpful in producing the cost analysis.

o Cost incurred should include one-time and recurring costs of personnel, hardware, software, etc.

o Personnel costs should be estimated in terms of effort. One–time effort includes development and transition effort by clients, users, etc.; recurring effort includes operational and maintenance effort.

 Benefit Analysis

o Possible benefits are expressed in financial terms compared to costs, such as increased sales and profits, or reduced operating costs.

o Non-financial benefits and costs should also be included.

o The value added may also describe non-monetary improvements (e.g. quality, response time, etc.), which can be critical in customer support and satisfaction


 ROI Analysis

o Include a Return-On-Investment (ROI) analysis

o Include Breakeven point analysis as appropriate (graph showing benefits-minus-costs vs. time will be added soon)
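A purely illustrative ROI computation (all dollar figures are hypothetical):

Costs    = $20,000 (transition) + $5,000/year (operations) × 2 years = $30,000
Benefits = $25,000/year (reduced operating costs) × 2 years          = $50,000
ROI      = (Benefits - Costs) / Costs = (50,000 - 30,000) / 30,000   ≈ 0.67

In this hypothetical case, cumulative benefits first match cumulative costs at the end of the first year of operation, which would be the breakeven point on a benefits-minus-costs vs. time graph.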

B.3. Requirements Traceability

In this section you summarize the various traceability concerns between and among the LeanMBASE artifacts, i.e. OCD, SSRD, SSAD, LCP, FRD. Table 10 provides a reference set of traceability relations. Do not just copy it, but construct your own table from your actual numbers, goals, requirements, and architecture elements.

Table 10: Traceability Matrix

Document/ From Section 1 From Section 2 From Section 3 From Section 4 From Section 5 Section (Glossary)

of OCD Should reflect Initial document compiled from client meetings and Win-Win negotiation the same sessions structure across of SSRD all the To: To: To: To: documents. OCD 4.2 OCD 3.2 OCD 3.2 OCD 3.2

§ 3.1 ← OCD 2.1, OCD 5 OCD 4.4 Glossary should OCD 4.5.1 show the same OCD 4.5.3 terminology with addition to specific for each § 3.2 ← OCD 4.3 document, if of SSAD To: To: To: SSAD has no needed. Section 5 § 2.1 ← OCD § 3.1.6 - § 3.1.8 ← SSAD 3 4.5.1, OCD 2.3 OCD 4.4, OCD 3.5, SSRD 5

§ 2.2 ← OCD 4.5.2 § 3.2 ← OCD 5

§ 2.2.1 ← OCD 2.3 § 3.4 ← OCD 4.4, SSRD 5

§ 2.3.1 ← OCD 4.3

§ 2.3.2 ← SSRD 3.2.1, OCD 4.3, SSAD 2.3.1


§ 2.4 ← OCD 4.4, SSRD 5. SSAD 2.2, SSAD 2.3.1, SSAD 2.3.2

§ 2.5 ← OCD 4.5.4, OCD 3.5

of LCP Should reflect To: To: To: To: the same structure across § 2.1 ← FRD 3.2 § 3 ← OCD 4.2.7 § 4.1.4 ← FRD § 5.2 ← FRD all the 4 3.3, FRD 2.1.1, documents. § 3.1.2 - § 3.1.4 ← SSRD 2.1 OCD 2.1 § 2.2 ← LCP 2.1

§ 2.1 ← FRD 3.2, SRD 2, OCD 2.1

of FRD Should reflect To: To: To: Dependent on the same the architecture structure across § 2.1.1 - § 2.1.4 ← § 3.1 ← SSRD 3, LCP 4.1 elucidated in all the LCP 5.2 SSRD 3.2, LCP SSAD documents. 2.2, OCD 3.2, LCP 2.1 OCD 4.2 § 2.1.5← OCD Glossary should 2.1, OCD 3.2 show the same § 3.2 ← LCP 2.1, terminology LCP 2.2 with addition to specific for each § 2.2.1 ← OCD document, if 4.2, OCD 4.4, needed. OCD 3.4.3 § 3.3 ← LCP 4, LCP 5.2

§ 2.2.2 ← SSRD 2, LCP 4

§ 2.2.3 ← SSRD 3, SSRD 3.2, SSRD 3.3

§ 2.2.4 ← SSRD 4

§ 2.2.5 ← SSRD 5


§ 2.2.6 ← SSRD 6

B.4. Level of Service Feasibility

In this section you provide arguments to justify the feasibility of achieving the Level of Service (l.o.s.) requirements specified in the SSRD. You do this by demonstrating -- through analysis, detailed references to prototypes, models, simulations, etc. -- how the designs will satisfy the SSRD L.O.S. Requirements (SSRD 5). Complete coverage of the L.O.S. Requirements is essential. If applicable, provide links to files containing detailed evidence of the feasibility of required levels of performance, interoperability, and dependability.

Note that ambitious Level of Service requirements and their tradeoffs can be the most difficult requirements to satisfy, and the most difficult requirements for which to argue satisfiability in advance of actual implementation. Table 11 summarizes some useful product and process strategies for addressing Level of Service requirements. Additional useful tables are in the FRD section 2.2.5 of the previous set of MBASE (not LeanMBASE) Guidelines (version 2.4.2) at http://cse.usc.edu/classes/cs577a_2004/guidelines/index.html.

Table 11: Level of Service Product and Process Strategies

Dependability
 Product Strategies: Accuracy Optimization, Backup/Recovery, Diagnostics, Error-reducing User Input/output, Fault-tolerance Functions, Input Acceptability Checking, Integrity Functions, Intrusion Detection & Handling, Layering, Modularity, Monitoring & Control, Redundancy
 Process Strategies: Failure Modes & Effects Analysis, Fault Tree Analysis, Formal Specification & Verification, Peer Reviews, Penetration, Regression Test, Requirements/Design V&V, Stress Testing, Test Plans & Tools

Interoperability
 Product Strategies: Generality, Integrity Functions, Interface Specification, Layering, Modularity, Self-containedness
 Process Strategies: Interface Change Control, Interface Definition Tools, Interoperator Involvement, Specification Verification

Usability
 Product Strategies: Error-reducing User Input/output, Help/explanation, Modularity, Navigation, Parametrization, UI Consistency, UI Flexibility, Undo, User-programmability, User-tailoring
 Process Strategies: Prototyping, Usage Monitoring & Analysis, User Engineering, User Interface Tools, User Involvement

Performance
 Product Strategies: Descoping, Domain Architecture-driven, Optimization (Code/Algorithm), Tuning, Platform-feature Exploitation
 Process Strategies: Benchmarking, Modeling, Performance Analysis, Prototyping, Simulation, User Involvement

Adaptability (Evolvability / Portability)
 Product Strategies: Generality, Input Assertion/type Checking, Layering, Modularity, Parameterization, Self-containedness, Understandability, User-programmability, User-tailorability, Verifiability
 Process Strategies: Benchmarking, Maintainers & User Involvement, Portability Vector Specification, Prototyping, Requirement Growth Vector Specification & Verification

Development Cost / Schedule
 Product Strategies: Descoping, Domain Architecture-driven, Modularity, Reuse
 Process Strategies: Design To Cost/schedule, Early Error Elimination Tools And Techniques, Personnel/Management, Process Automation, Reuse-oriented Processes, User & Customer Involvement

Reusability
 Product Strategies: Domain Architecture-driven, Portability Functions
 Process Strategies: Domain Architecting, Reuser Involvement, Reuse Vector Specification & Verification

All of Above
 Product Strategies: Descoping, Domain Architecture-driven, Reuse (For Attributes Possessed By Reusable Assets)
 Process Strategies: Analysis, Continuous Process Improvement, Incentivization, Peer Reviews, Personnel/Management Focus, Planning Focus, Requirement/design V&V, Review Emphases, Tool Focus, Total Quality Management

B.5. Process Feasibility

In this section you

 Provide rationale for:

 Choice of process model. Tables 12 and Table 13 provide top-level help in choosing a process model.

 Choice of increment, block, or build sequence in incremental development.

 Show specific content of prioritized capabilities by increment

 Provide evidence that priorities, process and resources match, in the sense that:

o Budgeted cost and schedule are achievable

o No single person is involved in two or more full-time tasks at any given time

o Design/architecture is such that low priority features can be feasibly dropped to meet budget or schedule constraints

 Use the estimated Effort (Person-months) and Schedule from Budgets (LCP 5.2) to show that the staffing levels are sufficient, and that the project is achievable within the schedule. (It is important to use a credible and repeatable estimation technique for the Effort and the Schedule.)

Table 12: Process Model Decision Table

Objectives, Constraints (Growth Envelope; Understanding of Requirements; Robustness; Available Technology; Architecture Understanding) | Alternatives | Model | Example

Limited COTS Buy COTS Simple Inventory Control


Limited 4GL, Transform or Small Transform Evolutionary Business - DP Development Application

Limited Low Low Low Evolutionary Advanced Prototype Pattern Recognition

Limited High High High Waterfall Rebuild of old to Large system

Low High Risk Complex Reduction Situation followed by Assessment Waterfall High Low High- performance Avionics

Limited Low Low- High Evolutionary Data to Medium Development Exploitation Medium

Limited Large Medium to Capabilities- Electronic to Large Reusable High to- Publishing Components Requirements

Very High Risk Air Traffic Large Reduction Control &Waterfall

Medium Low Medium Partial Low to Spiral Software to Large COTS Medium Support Environment

Table 13: Conditions for Additional Complementary Process Model Options

Design-to-cost or Design-to-schedule: Fixed Budget or Schedule Available.

Incremental Development (only one condition is sufficient): Fixed Budget or Schedule Available; Early Capability Needed; Limited Staff or Budget Available; Downstream Requirements Poorly Understood; High-Risk System Nucleus; Large to Very Large Application; Required Phasing With System Increments.


Risk Assessment consists of risk identification, risk analysis and risk prioritization. In this section you identify the major sources of risk in the project and how you will deal with them. Two of the ways that you might use to identify risks are:

 To start with the list of the top-10 risks (ten most frequent risks) shown in Table 14. Note that table 14 also provides high-level suggestions as to how to deal with each type of risk

 To consider as a risk any serious uncertainty (shortfall) in the content of any of the previous sections of the FRD.

Table 14: Software Risk Management Techniques

Source of Risk: Risk Management Techniques

Personnel shortfalls: Staffing with top talent; key personnel agreements; team-building; training; tailoring process to skill mix; peer reviews.

Schedules, budgets, process: Detailed, multi-source cost and schedule estimation; cost/schedule as independent variable; incremental development; software reuse; requirements descoping; adding more budget and schedule; outside reviews.

COTS, external components: Benchmarking; peer reviews; reference checking; compatibility prototyping and analysis; usability prototyping.

Requirements mismatch: Requirements scrubbing; prototyping; cost-benefit analysis; design to cost; user surveys.

User interface mismatch: Prototyping; scenarios; user characterization (functionality, style, workload); identifying the real users.

Architecture, performance, quality: Simulation; benchmarking; modeling; prototyping; instrumentation; tuning.

Requirements changes: High change threshold; information hiding; incremental development (defer changes to later increments).

Legacy software: Reengineering; code analysis; interviewing; wrappers; incremental deconstruction.

Externally-performed tasks: Pre-award audits; award-fee contracts; competitive design or prototyping.

Straining computer science: Technical analysis; cost-benefit analysis; prototyping; reference checking.

Be sure to:

 Provide a description of all identified risks.

 Provide all of the following for every critical risk:

o A clear description of the risk

o The Risk Exposure: Potential Magnitude and Probability of Loss


o Actions you will take to mitigate the risk should it materialize

 Identify low-priority requirements that can be left out in the case of schedule slippage caused by the realization of specific risks.

B.7. Analysis Results

The section of the FRD guidelines entitled “Purpose of the FRD”, above, listed a number of issues that must be convincingly argued in the FRD. In previous sections of your FRD you will have provided arguments for most of them. In this section you will complete the feasibility rationale by:

 Providing evidence that (and how) a system built to the architecture specified in the SSAD will satisfy the requirements defined in the SSRD. (You do this by showing: software that has already been implemented – possibly prototype software rather than software that will be part of the system; non-executable prototypes; simulations (for performance and other level-of-service requirements); other types of models, perhaps mathematical; etc.) In the case of files that are too large, or inappropriate, to insert here directly, provide links if not already done elsewhere (e.g., Section 4 on Level of Service Feasibility).

 Describing how the final architecture/design was decided upon, by:

o Describing all relevant trade-offs, and why each specific decision was made

o Describing alternative architectures/designs that were considered, but rejected, and the reasons they were rejected – primarily to avoid the need to consider them again should there be thoughts of architectural change. (This type of information is especially useful, to prevent wasted time, during the Inception and Elaboration Phases.)

B.7.1 Introduction

Most projects involve the use of one or more COTS packages along with custom components. Some projects may also include legacy components which must be integrated with the system. This document is intended to guide you in performing and documenting the activities involved in selecting a set of COTS components that can be integrated with minimum effort. The primary audience for this document is the system’s technical development team and Independent Validation & Verification personnel.

B.7.1.1 Definitions

The following definitions have been included for further clarity:

Commercial Off The Shelf (COTS) (software) product:

A COTS (software) product is a product that:
 Was developed by a third party (who controls its ongoing support and evolution),
 Must be bought, licensed, or otherwise acquired,
 Will be integrated into a larger system as a component, i.e.:
o Will be delivered, to the customer, as part of the larger system
o Isn’t simply a software tool used in developing the larger system
 Might or might not be modifiable, by the developers of the larger system, at the source code level,
 May include mechanisms for customization,
 Is bought and used by a significant number of system developers.

Examples of COTS products:


Database management systems such as MySQL, MSSQL, MS Access, and Oracle

Application Servers such as Apache, Microsoft Internet Information Server, Oracle AIS, and IBM Websphere

Connectors:

A connector is an architectural element that:
 Models (mediates) interaction between and/or among components
 Enforces rules that govern those interactions
Each connector provides one or more interaction ducts and transfer of control and/or data.

Examples of Connectors:

Simple connectors: Procedure calls, shared variable access

Complex connectors: Client-server protocols, database access protocols

Legacy system:

An older computer system or application program that continues to be used:
 Because of the prohibitive cost of replacing or redesigning and re-implementing it,
 And despite its poor competitiveness and poor compatibility with modern equivalents.
The implication is that a legacy system is large, monolithic and difficult to modify.1

Examples of Legacy Systems:

The most common examples are COBOL mainframe computers and software applications used in the 1980s for business data processing.

B.7.2 System Structure

In this section you will identify the high-level, overall structure of your system, including place-holders for COTS components, by drawing a System Structure Diagram. If there is more than one system architecture being considered / evaluated, identify (and label) each one and draw a System Structure Diagram for each one. Each System Structure Diagram should clearly indicate:
 Data transfer, control transfer, or both
 Overview of the data being transferred (if any)
 Information exchange protocols being used (HTTP, FTP, ODBC, JDBC, …)
Please note: You do not need to use any specific architecture description language to draw the System Structure Diagram. Rather, a simple diagram with boxes to represent components and complex connectors, and lines to indicate interaction, is sufficient.

Example of a System Structure Diagram:

System Structure Diagrams for two different architectures for an online photo library system are shown below. In both cases the system provides its users the capability to store, retrieve and transform photographs using a browser. (An example of an existing photo library system may be found at http://photos.yahoo.com/).

Depending on the architecture chosen, the system will have 5 or 6 major components and connectors from among the following:
 Database Management System
 Database Connector

1 Defined in the Free On-Line Dictionary Of Computing (FOLDOC)


 Business Logic
 Image Manipulation Toolkit
 Browser
 FTP Server
The two possible architectures being assessed are:
 One in which images are stored on an FTP server and a link to the image is stored in the database
 One in which the images are stored in the database itself
System Structure Diagrams for both architectures are shown below:

Figure 7: Architecture 1: Images stored on FTP Server

[System Structure Diagram: the Browser interacts with the Business Logic via an HTTP request/response connector and with the FTP Server via FTP; the Business Logic uses the Image Manipulation Toolkit through object calls, and accesses the Database Management System through the Database Connector via a remote query interface, which returns query results as a RecordSet.]


Figure 8: Architecture 2: Images stored in the database

[System Structure Diagram: the Browser interacts with the Business Logic via an HTTP request/response connector; the Business Logic uses the Image Manipulation Toolkit through object calls and accesses the Database Management System through the Database Connector via a remote query interface, which returns query results as a RecordSet. Images are stored in the database itself, so there is no FTP Server.]

B.7.3 COTS, Legacy, Custom Components and Connectors Description

In this section you will describe the attributes of each COTS, legacy, and custom component being considered, and of every connector being considered. The steps to be followed in doing this are:
 For each architectural role (component) in each System Structure Diagram for which COTS products are being considered (e.g., DBMS, application server, image processing toolkit, etc.) – or, in the case of a COTS-based application, COTS products that have passed the initial evaluation:
o List the role (e.g., DBMS)
o List each COTS product and/or legacy system that can serve in that role (e.g., MySQL)
o For each COTS product, list the connectors that may be used with it
o If you are also considering developing a custom component for the role, list it as well
 For each COTS product and each possibly-custom component (architectural entity) identified and described in the previous step, construct a table (Table 15) of the following form:


Table 15: COTS Description

Name | Name of the COTS product, or the designated name of the custom or legacy component
Version | Version of the component (COTS and legacy only)
Function | Functional specifications of the COTS product, or the functional requirements of the custom component
Type | Component or Connector
Granularity & Packaging | Components can be integrated at many levels of the system architecture. Types of granularity include: Service Provider (a third-party service such as Microsoft .NET Passport); Software Package (a component that can run independently of other components); Class Library (a set of classes with source code available); Object Library (a set of classes without source code); Communication Framework (CORBA, DCOM). Packaging: the types and representations of the component to be integrated (for custom components, indicate the programming language to be used when developing the component)
Inputs | Input interfaces provided by the component
Outputs | Output interfaces provided by the component
Binding | How the component is connected to the connector. Types of binding can be divided into five classes: Static Binding (e.g., procedure call); Topological Dynamic Binding (e.g., pipe and filter, database connections); Runtime Dynamic Binding (e.g., CORBA or COM); Compile-time Dynamic Binding (e.g., object-oriented languages); Mixed Binding (a combination of two or more of the bindings above)
Information flow | Synchronous or Asynchronous
Software Dependencies | Known software dependencies (e.g., JRE 1.4.1, MS .NET Framework)
Hardware Dependencies | Known hardware dependencies (e.g., Intel Pentium processor, Mac PowerPC)
Target Platform | For COTS: the platform required for using the COTS product. For custom: the platform for which the system is designed
Architecture Style | The architecture style used by the specific component
Non-Functional Requirements | Level of service objectives for the component to adhere to (e.g., security, performance)
Known Issues | Any known issues with the component


Initially not all the required information may be available to you, so research may be required to identify possible system components and connectors.

Example of a component description table:

Two components that could serve in the DBMS role in the architectures shown above for the Online Photo Archive System are MySQL and MS Access. Component Description Tables for the two, and a Connector Description Table for one possible connector, are shown below.

Table 16: MySQL Database Management System Component Description

Name MySQL Database Management System

Version 4.1

Function Storage and retrieval of data

Type Component

Granularity & Packaging Software Package (independently executing)

Inputs MySQL queries transmitted via ODBC, or JDBC

Outputs Record Set depending upon the programming language and connection driver used

Binding Topologically Dynamic Binding

Information flow Asynchronous and Synchronous

Software Dependencies No known dependencies

Hardware Dependencies No known dependencies

Target Platform Windows NT, 2000, Linux, Unix

Architecture Style Client-Server

Non-Functional Requirements:
 Response Time: Query response for a database sized 1 GB within 5 seconds
 Data Security: Only authorized users should be able to access the data in the DBMS

Known Issues No known issues

Table 17: MSAccess Database Management System Component Description

Name MSAccess Database Management System

Version 2003

Function Storage and retrieval of data

Type Component

Granularity & Packaging Software Package (independently executing)

Inputs SQL queries transmitted via ODBC, or ADODB


Outputs Record Set depending upon the programming language and connection driver used

Binding Topologically Dynamic Binding, Runtime Dynamic Binding

Information flow Synchronous

Software Dependencies Microsoft Windows

Hardware Dependencies No known dependencies

Target Platform Internet Information Services 6.0

Architecture Style Client-Server

Non-Functional Requirements:
 Response Time: Query response for a database sized 1 GB within 5 seconds
 Data Security: Only authorized users should be able to access the data in the DBMS
 Database Scalability: Limit of 1 GB

Known Issues No known issues

Table 18: MySQL- JDBC Database Connector Description

Name MySQL-JDBC Database Connector

Version 3.1

Function Connects a Java program to MySQL database

Type Connector

Granularity & Packaging Object Library (.jar file) used with Java

Inputs Database information (name, location, username, password) and queries

Outputs Java RecordSet

Binding Mixed (Compile time Dynamic Binding and Runtime Dynamic Binding)

Information flow Synchronous

Software Dependencies JRE 1.4.1

Hardware Dependencies No known dependencies

Target Platform Windows NT, 2000, Linux, Unix

Architecture Style Object calls

Non-Functional Requirements: Response Time: Query response for a database sized 1 GB within 5 seconds

Known Issues No known issues

B.7.4 Component Interaction Evaluation

In this section you will use the System Structure Diagrams from Section 7.2 and the Component and Connector Description Tables from Section 7.3 to develop a combination of components and connectors that satisfies the system requirements and does not have any visible interoperability problems. Known interoperability problems include:
 Platform incompatibilities: Two components that depend upon incompatible platforms. E.g., Apache Server, which depends upon the Linux platform, and MS Access DBMS, which does not run on Linux.
 Software and hardware dependencies: Two components that have conflicting dependencies. E.g., one Java library requires JRE 1.4.1 while another requires JRE 1.4.2.
 Incompatible bindings: One component requires a specific type of binding while the other component does not support it.
 Input/output incompatibilities: Components that need to interact have different input and output interfaces.
 Architecture style clashes: Components that need to interact do not support the same architecture style.
A number of the above incompatibilities can be resolved. The following are some component conflict resolution strategies:
 Glueware development: Certain minor problems, such as input/output interface incompatibilities, can be resolved by developing simple glueware (a minimal sketch of such an adapter appears after Table 19).
 Architecture modification: Certain problems, such as software and platform dependencies, can be resolved by modifying the architecture of the system. For example, to make a MySQL database work with a .NET program, one could host the MySQL database on a Linux machine and the .NET program on a Windows machine.
Examine the tables created in Section 7.3 and identify the combinations that appear to require the least effort to integrate. For each major interaction for which the effect of integrating the two components is not perfectly well understood by any of your team members:
 Create a sequence diagram describing the elements of the interaction
 Prototype the interaction, i.e., write code to perform sample interactions
 Document the results in a Test Table (Table 19) of the following form:

Table 19: Test Table

Name Name of the components interacting in this test

Test Purpose Brief explanation of what is being tested in this test case

Test procedure Brief description of the activities involved in testing the interaction

Input specifications Inputs (single or a range) required to execute this test.

Expected output Output expected for a successful test

Architectural changes made Changes, if any, made to the system structure for successful test execution

Glueware developed Glueware, if any, developed during this test. If glueware was developed, indicate what interoperability problems the glueware was intended to resolve. Also indicate the amount of effort that was required to create the glueware.

Test Results Was the interaction successful or did it fail?

Notes Any additional notes or observations regarding this test
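As a concrete illustration of the glueware strategy described above, the following is a minimal sketch, in Java, of an adapter that resolves a simple input/output interface mismatch between two components. All of the interface, class, and method names here are hypothetical; the point is only to show the kind of thin translation layer whose development you would record under "Glueware developed" in the Test Table.

// Hypothetical example: the producing component returns its results as a List of
// delimited String rows, while the consuming component expects an Iterator of Record
// objects. A thin adapter (glueware) translates between the two interfaces without
// modifying either component.
import java.util.Iterator;
import java.util.List;

// Interface expected by the consuming component (hypothetical).
interface RecordSource {
    Iterator<Record> records();
}

// Simple record type assumed by the consuming component (hypothetical).
class Record {
    private final String[] fields;
    Record(String[] fields) { this.fields = fields; }
    String field(int i) { return fields[i]; }
}

// Glueware: adapts the producer's List<String> output to the consumer's RecordSource interface.
class ListToRecordSourceAdapter implements RecordSource {
    private final List<String> rows;
    private final String delimiter;

    ListToRecordSourceAdapter(List<String> rows, String delimiter) {
        this.rows = rows;
        this.delimiter = delimiter;
    }

    public Iterator<Record> records() {
        final Iterator<String> it = rows.iterator();
        return new Iterator<Record>() {
            public boolean hasNext() { return it.hasNext(); }
            public Record next() { return new Record(it.next().split(delimiter)); }
            public void remove() { throw new UnsupportedOperationException(); }
        };
    }
}

The effort to build and test such an adapter is exactly what the "Glueware developed" row of the corresponding Test Table should capture.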


Examples of combinations for the online photo library system are listed below:

Combination 1: for architecture 2

Database: MySQL

Database Connector: Java-MySQL Connector

Business Logic language: Java

Image manipulation toolkit: Java Image Manipulation toolkit

FTP Server: Windows, Linux, Unix

Browser: Internet Explorer, Mozilla Firefox, Safari

Combination 2: for architecture 2

Database: MS Access

Database Connector: ADODB Connection

Business Logic language: Visual Basic, C#, Visual J++

Image manipulation toolkit: MS Image Manipulation Classes

FTP Server: Windows, Linux, Unix

Browser: Internet Explorer, Mozilla Firefox, Safari

Combination 3: for architecture 2

Database: MSSQL

Database Connector: PHP-ODBC

Business Logic Language: PHP

Image manipulation toolkit: PHP Image Manipulation classes

FTP Server: Windows, Linux, Unix

Browser: Internet Explorer, Mozilla Firefox, Safari

B.7.4.1 Interaction Testing

Provide a sequence diagram and test table (as shown below) for the interaction between various components.

The following is a sequence diagram for the Java business logic, the MySQL-JDBC database connector, and the MySQL database.


Figure 9: Sequence diagram for the Java business logic, the MySQL-JDBC database connector, and the MySQL database.

[Sequence diagram: the Business Logic (Java) issues a Connect To Database procedure call to the Database Connector and receives a database object; it then issues an Execute Query procedure call, which the connector forwards to the MySQL database as a JDBC Execute Query call; the query results flow back through the connector and are returned to the Business Logic as a RecordSet.]

The following is a Test Table for a test of the interaction of Java Business Logic, MySQL-JDBC Database Connector, and MySQL database

Table 20: Test Table for a test of the interaction of Java Business Logic, MySQL-JDBC Database Connector, and MySQL database

Name Java Business Logic, MySQL-JDBC Database Connector, MySQL database

Test Purpose:
 To check whether the Java business logic can interact with a MySQL database via the MySQL-JDBC database connector
 To determine the data types that can be exchanged

Test procedure Developed test code in Java that creates a compile-time static connection to the classes provided by the MySQL-JDBC connector. Provided database information to establish the connection. Upon successful connection, ran a series of insert, select, update and delete queries for data types involving strings, date & time, integer, decimal, and binary (for images).

Input specifications Dummy data for strings, date & time, integer, decimal, and binary (image – JPG, BMP, GIF, and WMF)

Expected output Data as inserted or updated

Architectural changes made None

Glueware developed No significant glueware developed

Test Results The interaction successfully returned string, integer, and decimal data. However, it did not successfully return binary or date & time information.

Notes None.
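The prototype code behind a Test Table entry such as the one above can be quite small. The following is a minimal sketch of such an interaction prototype, assuming the MySQL Connector/J (JDBC) driver is on the classpath; the database URL, credentials, table, and column names are hypothetical placeholders to be replaced with your own.

// Minimal interaction prototype: Java business logic <-> MySQL via the JDBC connector.
// The connection URL, credentials, and table/column names below are hypothetical placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MySqlInteractionPrototype {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/phototest";  // placeholder database
        try (Connection con = DriverManager.getConnection(url, "testuser", "testpass")) {
            // Insert a sample row to exercise string and integer data types.
            PreparedStatement insert =
                con.prepareStatement("INSERT INTO photos (title, width) VALUES (?, ?)");
            insert.setString(1, "sample");
            insert.setInt(2, 640);
            insert.executeUpdate();

            // Read the row back to confirm the round trip through the connector.
            PreparedStatement select =
                con.prepareStatement("SELECT title, width FROM photos WHERE title = ?");
            select.setString(1, "sample");
            try (ResultSet rs = select.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("title") + " / " + rs.getInt("width"));
                }
            }
        }
    }
}

Each additional data type of interest (date & time, binary, and so on) would get its own insert/select pair, and the observed behavior is what gets recorded under "Test Results".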

B.7.5 Evaluation Summary

In this section you will create an Evaluation Summary Table, in the format shown below, of the results shown in Section 7.4. The individual evaluations should be in priority order, starting with the most successful combination.

Table 21: Example of Evaluation Summary

1. Combination 1: for architecture 2
Database: MySQL
Database Connector: Java-MySQL Connector
Business Logic language: Java
Image manipulation toolkit: Java Image Manipulation Toolkit
FTP Server: Windows, Linux, Unix
Browser: Internet Explorer, Mozilla Firefox, Safari

Evaluation: The system with the given components is scalable, and while the system could not return date & time and image information, there are easy workarounds to both problems. For date & time, one can use string data in yyyy-mm-dd format instead, and images could be stored on an FTP storage location. No glueware or architectural changes are required for implementing the system.

2. Combination 2: for architecture 2
Database: MS Access
Database Connector: ADODB Connection
Business Logic language: Visual Basic, C#, Visual J++
Image manipulation toolkit: MS Image Manipulation Classes
FTP Server: Windows, Linux, Unix
Browser: Internet Explorer, Mozilla Firefox, Safari

Evaluation: The system can store and retrieve all data types, and minimal glueware is required. However, the system has limited scalability.

B.8. Appendix

Create an appendix as needed. The Appendix may be used to provide additional information like COCOMO results. Each section of the appendix shall be referenced in the main body of the document where the data would normally have been provided.

Construction, Transition, & Support (CTS)

A. General Construction Process Guidelines

 The process for the Construction and Transition Phases should be risk-driven: in particular, you should select process strategies that accommodate your particular requirements. For instance, if you have stringent performance requirements, you should plan for process strategies such as Benchmarking, Modeling, Performance Analysis, Prototyping, Simulation, Code Profiling, Tuning and Optimization (see Table 8 in Hoh In's dissertation).

 It is critical to keep all the artifacts properly baselined. In particular, at the end of each iteration, the Operational Concept Description (OCD), System and Software Requirements Definition (SSRD), System and Software Architecture Description (SSAD), Life Cycle Plan (LCP), and Feasibility Rationale Description (FRD) must be consistent with the IOC plans and implementation documentation (e.g., source code comments, component and object names, etc.). This is consistent with the "continuous integration" aspect of iterative and incremental application development.

 As part of making winners of all the success-critical stakeholders, it is recommended that your clients assess and evaluate each of the intermediate or incremental releases. This helps avoid changes introduced late in the process, which can cause schedule slippage--something you want to avoid in a design-to-schedule situation.

 Reference information where applicable, as opposed to repeating it. Provide references when the reader would otherwise have to look at multiple documents to get hold of the information. It is recommended to use hyperlinks with traceability matrices that reference specific related areas in the documentation.

 Although the purpose of concurrent engineering is to have coding and testing proceed in parallel, it is advisable to have a functional freeze at some point during the iteration. Ideally, the release at the end of the iteration should have thoroughly tested the features implemented in that increment. If the programmers don't stop adding features at some point before the end of the iteration, the added features may not be thoroughly tested and, furthermore, may compromise the quality or correctness of the current feature set, leading to an unusable increment.

577b Guidelines:

 Rose Model Files should also be kept up-to-date. The code generation as well as the round-trip engineering capabilities of Rose (Java, C++, …) should be used, where applicable.

 Ideally, during the rebaselining of the LCA packages, you should set your own dates for performing peer reviews: make sure that you turn in the required deliverables by the date indicated on the class schedule.

 Make sure that your project plan also identifies which increments are important to be evaluated by the customer and the users. It is important that the client periodically reviews the software as it is being developed, in particular regarding user interface considerations. Due to the short schedule, it is very important to minimize rework by avoiding unverified assumptions: when in doubt, refer to your customer. We recommend that most teams adopt the practice of delivering intermediate working increments to the clients and keep incorporating the feedback.


During Construction, you will be performing the following activities:

 Requirements Management

 Detailed Design

 Coding

 Unit and Integration Testing

 Peer Reviews

 Configuration Management

 Quality Management

You will be generating the following artifacts:

 Iteration Plan

 Iteration Assessment Report

 Release Descriptions and Notes

 Test Plans and Results

 Peer Review Plans and Reports

 Quality Management Plans

 Training Materials

Requirements Management

Changes in the requirements will be documented, as appropriate, in the Operational Concept Description and the System and Software Requirements Definition. Subsequently, this may affect the architecture in the SSAD and impact the schedule in the LCP, for which the FRD will then need to be updated. For instance, some of the changes might be moved to the Changes Considered but not Included or Evolutionary Requirements (SSRD). Accordingly, the Feasibility Rationale Description should be updated to reflect the impacts of the changes on the feasibility criteria of the project (e.g., is the value of the revised system greater than the cost?).

Design Creation or Modification

During Construction, a lot of effort will be spent on developing, detailing or changing the system design. The design activities are reflected in the System and Software Architecture Description and the associated model files (e.g., the Rational Rose model). Low-level design details should also be included as comments in the source code and be consistent with the SSAD (especially naming). Another related activity is to review the design and implementation (including design meetings and consultations, as well as formal and informal reviews, walkthroughs, and inspections) and to generate Inspection Reports accordingly.

Since there is no separate Detailed Design artifact, the Detailed Design information is documented in the following 3 artifacts:

 SSAD

 Rose MDL Files


 Source Code (comments always related back to SSAD)

577 Guidelines:

You should try to strike a good balance as to what goes in the SSAD, versus what goes in the MDL files, versus what goes in the source code. Because having many objects/operations without tool support can lead to an unwieldy SSAD, you may want to leave very detailed information (e.g., method argument types, return values, ...) in the Rose MDL file. Once you have the information in the Rose model file, you can generate a report out of it (e.g., using SoDA), and include that information in the SSAD, as Operation Specification Templates, and so forth.

At any point in time, you should make sure that there are no glaring problems, such as having your architecture/design as represented in the MDL file conflict with what is represented in the SSAD (e.g., the block diagram), or with how the system was built.

Code Generation/Modification

During construction, most of the effort will be spent on actually coding the system. During coding, care should be taken to follow proper coding standards and programming style, emphasizing code readability and maintainability, including the proper use of comments in the source code. An associated activity will consist of code reviews or inspections, where you will be assessing the code for defects and generating Peer Review Reports. WARNING: risk-managed scheduling and resource allocation, as well as careful and regular assessment and control, are essential for a successful outcome.

Some related activities during the Construction stage will include creating or modifying prototypes, assessing various Commercial Off The Shelf (COTS) components for the application, tailoring COTS products, and integrating COTS products into the application, including glue code design, development and test.

Testing

Testing is an integral part of the Construction stage. This includes testing individual components of the system, writing test drivers, simulations, and gages, generating regression test packages, writing test descriptions, matching with requirements scenarios and reporting test results.

Project Management and Special Functions

Throughout the project, you will be performing planning and control activities, such as creating or modifying plans, reporting status, collecting and analyzing metrics, managing or coordinating work (configuration management, quality control). In particular, the Project Manager will be generating Iteration Plans and Iteration Assessment Reports.

Configuration Management and Quality Management: Hours spent performing configuration management and quality management functions, including developing the Quality Management Plan and the Peer Review Plan, coordinating tools, and the like, must be planned and accounted for on the construction schedule.

In preparation for, and during, the Transition Phase, you will be performing several activities, such as developing and executing the transition plan, coordinating deliverables with the client, and meeting with key personnel for transition strategy and readiness discussions. You will also be training the users on the application, developing training material, and developing user documentation (e.g., user's manual and online help). Finally, you will need to spend some time coordinating, preparing and packaging customer deliverables for delivery (source code files, installation scripts, maintenance package, regression test package, support tools and environment, etc.). Most importantly, document how the system must be supported and how it will support anticipated evolutionary changes.

B. Guidelines for the Deliverables

The artifacts of the process are grouped in "logical" sets. These groupings do not imply physical document groupings. They indicate what areas are the main focus within a particular phase.


Requirements, Architecture, Design and Management Set

 Operational Concept Description (OCD)

 System and Software Requirements Definition (SSRD)

 System and Software Architecture Description (SSAD) and Rose Model Files (MDL)

 Feasibility Rationale Description (FRD)

 Life Cycle Plan (LCP). Must include effort/cost estimates such as:

COCOMO II run

COCOMO II Data Collection Form (as an Appendix) as a rationale capture for the COCOMO Estimate

 Risk-Driven Prototype(s)

Construction Planning Set

 Life Cycle Plan (LCP)

 Quality Management Plan (including guidelines for Configuration Management, Testing and Peer Reviews)

 Peer Review Plan

 Test Plan

Status Assessment Set

 Weekly Effort Forms

 Weekly Progress Reports

Construction Working Set

One Construction Set is delivered at the end of each iteration.

 Documentation

As-built specs

 As-built Operational Concept Description (OCD)

 As-built System and Software Requirements Definition (SSRD)

 As-built System and Software Architecture Description (SSAD)

 As-built Rose Model Files (MDL)

 As-built Feasibility Rationale Description (FRD)

 Updated Risk Management Plans

 Summary of the revisions


Iteration Plans (one per iteration)

Peer Review Reports (at least 1 peer review report per iteration)

Test Reports (at least 1 test report per iteration)

Release Description (one per iteration)

Iteration Assessment Reports (one per iteration)

 Implementation

Source Code Baselines (including comments in the source files and “Read Me” files)

Associated Compile-Time Files

Component Executables

Test drivers and simulations

Transition Set

 Transition Plan (including some Training planning)

 User Manual

 Transition readiness assessment

Support Set

 Support Plan (including evolution support plan)

 Training materials (including tutorials and sample data)

Data Collection Set

 Size Report (including Source Lines Of Code (SLOC) estimates) (one size report per iteration)

 Other data collection items such as:

COCOMO Data Collection Form (including actuals)

COCOTS Data Collection Form

C. General Guidelines for Plans and Reports

The following guidelines for plans and reports are very general. As such, some items and activities they describe may not apply to the particular project at hand. In some cases items will need to be added. The choice as to what to include, and at what level of detail, should be risk-driven, in accordance with achieving high system assurance and effective development team communication given the constraints of the project. Below are some questions that may be helpful in determining the appropriate items to include and their respective level of detail:

What should be documented to indicate that the correct system would be constructed?


What should be documented to indicate that the system would be constructed correctly?

Are the plans and processes helping to guide and control the construction or do they hinder it?

Are all the development team members being utilized effectively?

Do the plans address all the significant construction issues (in accordance with the FRD feasibility analysis)?

Try to keep plans and reports short, tightly focused, and as concise as possible. Keep in mind that the audience is generally developers who are likely already familiar with the system concept; as such, extended explanations, justifications, and so forth are unnecessary. Be as direct, brief, and clear as possible about what is asked for or being reported. Consider the use of tables, diagrams, and bullet lists over large blocks of text.

The following table, presented within the "High-Level Dependencies" section of each plan or report, indicates the general level of integration that plan or report has with the LCO/LCA MBASE deliverables:

OCD | SSRD | SSAD | LCP | FRD | Prototype

X | X | X | X | X | X

Where X is one of:

 = direct integration

+ = strong influence

~ = moderate integration

- = indirect integration

Iteration Plan

A. Description

A.1. Purpose

Overall, the purpose of the Iteration Plan is to detail the incremental implementation and control of the SSRD requirements, following the designs within the SSAD, according to the schedule and approach specified in the LCP. The Iteration Plan for an upcoming iteration is planned in the current iteration and is modified as needed during the iteration. The current iteration plan is an input to the next iteration plan. There are often two such plans: one for the current iteration, and one under construction for the next iteration. An iteration plan is realized and frozen after the scheduled iteration time is exhausted. The next iteration plan is then executed, and an iteration assessment report is generated for the previous iteration. The Iteration Plan corresponds to 'Establish next-level objectives, constraints and alternatives' in the Win–Win Spiral Model.

A.2. High-Level Dependencies

The Iteration Plan requires the following as inputs:

 Life Cycle Plan for the overall milestones to be achieved for each iteration (i.e. schedule estimates, dependencies, etc.)

 Life Cycle Plan and Feasibility Rationale for the identification and assessment of the risks and the risk management strategy to be implemented during each iteration

 System and Software Requirements Definition (SSRD) for the list of requirements that must be completed

 Current status of the project (as represented by the set of Weekly Progress Reports), to-do's, and unrealized tasks from previous iterations.

 Current Test Reports and Peer Review Reports for a summary of the defects that must be removed prior to the next release.

OCD | SSRD | SSAD | LCP | FRD | Prototype

- | + | + |  | + | +

B. Document Sections

B.1. Iteration Overview

Provide a high-level overview of the content of the given iteration. Indicate which LCP milestones will be addressed.

B.1.1 Capabilities to be Implemented

 Identify the features, requirements or use–cases that are being developed (implemented, tested, ...) for this iteration.


 Each component should be accounted for in at least one iteration. All requirements should be implemented and tested (or re-negotiated) by the completion of all the iterations. Be mindful of implementation dependencies. Document complex dependencies and communicate them to the appropriate development staff.

B.1.2 Capabilities to be Tested

 Identify the software features and combinations of software features to be tested this iteration. This may also include non-functional requirements or extra-functional requirements, such as performance, portability, and so forth.

 Test every requirement listed in the SSRD LCA package. Additionally, you may need to test non-requirement component features such as COTS capabilities and quality, API functionality, etc.

B.1.3 Capabilities not to be tested

Identify notable features, and significant combinations of features, which will not be tested this iteration, and why (e.g., a given feature uses a feature which will be implemented in a following iteration).

B.2. Plan

This is the core section of the Iteration Plan. It is important to keep the Plan up-to-date during a given iteration. Thus, this section should be written so that it is very easily modified and updated. Be sure to keep careful version control.

B.2.1 Schedule and Resources for Activities

List the major activities and milestones in this iteration. Indicate the resources needed for completing the activities. Provide links to the MS Project file as appropriate. This should detail the major milestones indicated on the life cycle schedule within the LCP.

B.2.2 Team Responsibilities

 Provide detailed team responsibilities, covering the possible range of activities for this particular iteration.

B.3. Assumptions

Briefly describe the specific (significant) assumptions under which this plan will hold: i.e., if those assumptions were no longer satisfied, the Iteration Plan would have to be revisited.

Iteration Assessment Report

A. Description

A.1. Purpose

An iteration is concluded by an iteration assessment, where the actual results of the construction activity are assessed in the light of the evaluation criteria that were established within the Iteration Plan. Iteration Assessments are not updated, but should be maintained for future reference. One aspect of the Iteration Assessment Report is to come up with "Lessons Learned", which corresponds to 'Evaluate Product and Process Alternatives' in the Win–Win Spiral Model.

A.2. Additional Information

This assessment is a critical step in an iteration and should not be skipped. If the iteration assessment is not done properly, many of the benefits of an iterative approach will be lost. Note that sometimes the right thing to do in this step is to revise the evaluation criteria rather than reworking the system. Sometimes the benefit of the iteration is in revealing that a particular requirement is not important, is too expensive to implement, or creates an unmaintainable architecture. In these cases, a cost/benefit analysis must be done and a business decision must be made. Sound metrics should be used as the basis of this assessment.

B. Document Sections

B.1. Overview

B.1.1 Capabilities Implemented

 List the features, use–cases and scenarios and their respective requirements (SSRD), components and objects that were actually implemented.

 Indicate divergence from the items planned to be implemented within Section B.1.1 of the Iteration Plan.

B.1.2 Summary of Test Results

 Provide an overall assessment of the system as demonstrated by the test results.

 Identify the features or combinations which were not sufficiently tested and highlight those as needing further testing in the following iteration

 Identify all resolved and unresolved incidents and summarize their resolutions.

B.2. Adherence to Plan

Describe how well the iteration ran according to plan. Was it on budget and on time? Provide some insight to avoid mistakes for future iterations.

B.3. External Changes Occurred

 Describe any changes that have occurred with respect to the original assumptions in the Iteration Plan (Section B.3): e.g., changes in requirements, new user needs, a competitor's plan, discovery of a more efficient algorithm, …

 Provide an indication of the amount of rework required for the following iteration.

B.4. Suggested Actions

 State any actions suggested due to unexpected results of the analysis.

 Provide any recommended improvements in the design, operation, or testing of the system tested. For each recommendation, describe the impact on the system.

Release Description

A. Description

A.1. Purpose

The purpose of the Release Description is to describe the items, particularly executables, that will be made available after the completion of a development increment. A Release Description is prepared right at the end of an iteration. In particular, the final Release Description before the Product Release is the most critical one, as it details the system outcome and aids in transition. Release descriptions are good candidates for "Read Me" files.

A.2. High-Level Dependencies

OCD | SSRD | SSAD | LCP | FRD | Prototype

- | ~ | + | ~ | - | -

B. Document Sections

B.1. About This Release

Provide version information, what the release consists of, documentation, licensing, etc. Include the following information, as applicable.

B.1.1 Physical Inventory of materials released

List all the physical media, documentation, and hardware that make up the software version being released by identifying titles, abbreviations, dates, version numbers, and authors, as applicable.

B.1.2 Inventory of software contents

 List all computer files that make up the software version being released by identifying numbers, titles, abbreviations, dates, version numbers, and release numbers as applicable.

 List the number, title, revision and date of all documents pertinent to this software. This should include applicable requirements (SSRD), design (SSAD), test (Test Procedures and Results) and user documents.

B.2. Compatibility Notes

 Describe software (including versions) that interacts with the system and is known to be compatible.

 Describe significant software that is known to be incompatible, and note any workarounds.

B.3. Upgrading

Describe installation, data conversion from information produced by earlier versions, etc.

B.4. New Features and Important Changes

Provide an accounting of the differences between this release and the previous one (or describe what is included if this is the first release) for the following areas:

B.4.1 New Features

List all the new features incorporated into the software since the previous version.

B.4.2 Changes since previous release

 List all changes incorporated into the software since the previous version.

 Identify as applicable the problem reports, peer review reports, test results and change notices for each change.

B.4.3 Upcoming Changes

List all new features and changes that will be incorporated in future releases.

B.5. Known Bugs and Limitations

Identify any possible problems or known errors with the software at the time of release, and instructions for recognizing, avoiding, correcting or handling each error (if known).

Quality Management Plan (QMP)

A. Description

A.1. Purpose

The objective of construction is to follow a high-quality process and deliver high-quality products. Quality is elusive and poses particularly challenging issues when addressed only after the majority of a system has been implemented. It is therefore difficult to achieve directly; a sound set of guidelines, established prior to implementation and followed during it, can help achieve quality indirectly and incrementally. It is also difficult to ensure that quality will be achieved as a matter of course; the main concern is to avoid unspecified, haphazard, ad-hoc, and often clashing approaches.

A.2. Quality Focal Point

It is the responsibility of the Quality Focal Point to implement the plan so that each team member can carry out their designated tasks for achieving required quality without significant additional efforts. The Quality Focal Point performs the quality management activities for the project and is responsible for the overall quality of the product.

577 Guidelines:

For versions of the Quality Management Plan since the last ARB, include a summary of changes made in the document to ease the review process. It is sufficient to identify the Quality Focal Point for the project and his/her associated responsibilities.

A.3. Additional Information

577b Guidelines:

For CS 577b, this document should help the members of the project team understand each person's contribution to quality. Each team has the flexibility to choose its own standards for quality management and should take the initiative in defining additional quality mechanisms as dictated by project requirements. For instance, it should include the related trade-offs as the degree of testing decreases and the degree of review increases, the methodology used to control the injection and removal of defects, etc. It should be clear from the plan who is primarily responsible for the implementation of the various parts of the plan.

A.4. High-Level Dependencies

 The Quality Management Plan has the following dependencies:

OCD | SSRD | SSAD | LCP | FRD | Prototype

- |  | + | ~ | ~ | -

B. Document Sections

B.1. Purpose

B.1.1 Overview

Summarize the purpose and contents of this document with respect to the particular project and stakeholders involved.

B.1.2 References

Provide usable citations to significant prior and current related work and artifacts, documents, meetings and external tools referenced or used in the preparation of this document.

B.2. Quality Guidelines

This section describes the guidelines for quality management. This section should be very brief and only cover those quality tasks that are significant and meaningful to the project.

B.2.1 Design Guidelines

Briefly describe design guidelines to improve or maintain modularity, reuse, maintenance, etc. In particular, indicate how the designs in the SSAD will map to the implementation of those designs.

B.2.2 Coding Guidelines

Specify the coding guidelines the development team will use. It is not important which coding convention is chosen, but it is very important that the project identifies and complies with defined coding standards (for each language used); you can develop your own coding conventions. It is necessary to follow the same spirit of documentation and coding throughout the system, for ease of understanding among all the team members and to ensure a smooth transition to maintenance. There should also be sufficient comments in the code to support maintenance of the software, for instance the change history and modification details (a minimal sketch of such a header comment is shown below).
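A minimal sketch of the kind of header comment and change history this implies is shown below, assuming Java and a hypothetical team convention; the class name, requirement ID, and author names are made up, and the exact fields should follow whatever standard your team adopts.

/**
 * PhotoCatalog.java
 *
 * Purpose: maintains the in-memory index of photo metadata
 * (maps to the "Business Logic" component in the SSAD).
 *
 * Change history:
 *   2005-10-03  A. Developer   Initial version (Iteration 1).
 *   2005-10-17  B. Developer   Added search by date range (hypothetical SSRD requirement CR-7).
 */
public class PhotoCatalog {

    /** Returns the number of photos currently indexed. */
    public int size() {
        return 0;  // placeholder body for this sketch
    }
}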

We are providing links to some industry conventions for coding in some of the implementation languages:

C: http://www.gnu.org/prep/standards_toc.html

C++: http://www.nfra.nl/~seg/cppStdDoc.htmll

Java: http://www.infospheres.caltech.edu/resources/code_standards/java_standard.html

Visual Basic: http://construxsoftware.com/carmac/DocumentationProject/vbcodestd.pdf

B.2.3 Quality Assessment Methods

Quality Assessment methods refer to all the methods, techniques, tools, processes, etc. that can identify (or measure) "problems". Some are used to detect the problems earlier than testing: they generate problem reports, issues lists and/or defect lists. Two methods of early (before testing) identification of defects are Peer Reviews and Independent Verification and Validation.

B.2.3.1 Peer Reviews

Quality "assessment" methods are generically called peer reviews, since they are performed by peers in the development team. Identify which techniques are to be used, when, and where. List the types of peer reviews, their definitions, the types of issues or defects identified, and the schedule of reviews.

B.2.3.2 Degree of data gathering

 Identify the target data that will be detected during the peer reviews.

 Describe each defect and its classification as to severity and avoidability.

B.2.4 Process Assurance

The objective of software quality management is to ensure the delivery of a high-quality software product. This section should elaborate on Level of Services, and roles & responsibilities of team members in achieving them.

Traditional quality assurance functions listed below can be used and tailored as per project.

 Auditing the project's compliance with its plans, policies, and procedures;

 Monitoring the performance of reviews and tests;

 Monitoring corrective actions taken to eliminate reported quality related deficiencies

Due to the significant schedule constraints, no explicit process assurance related activities are done by the teams. The instructional staff does provide some process assurance in terms of scheduling deliverables, tasks and assignments.

One of the prime objectives of software quality management is to ensure that the software product meets the client requirements and expectations. This section should elaborate on Level of Services, and roles & responsibilities of team members in achieving them.

Traditional quality assurance functions listed below can be used and tailored as per project:

 Development of documentation and code standards

 Verification of the project's compliance with its documentation and code standards;

 Requirement specification, design and architecture.

 Coding

 Development of Test Plans, Quality Plans, Transition Plan etc

 Testing

 Verification & Validation

Provide the list of assurance functions and the technique used by your project.

B.2.5.1 Requirement Verification

 Include a Requirements Verification Matrix specifying how each requirement from the SSRD will be verified, whether by Testing, Demonstration, Analysis, Simulation, Inspection, etc.

B.2.5.2 Independent Verification and Validation

Independent Verification and Validation (IV&V) generates independent feedback (defect lists and/or problem reports). IV&V also needs to review and test all deliverables for completeness and correctness, verify the project's compliance with its documentation and code standards, and assess the feasibility of the solution.

B.2.6 Problem Reporting and Tracking System

This section should state the methods to be used for Problem, Issue and Defect reporting and tracking.

 Identify the system for receiving reports from the team or IV&Vers

 Identify how the reports will be monitored and by whom, including frequency of analysis, and weekly summary report generation.

B.2.7 Configuration Management

During the project, various versions of documents and the software product are created. Configuration management is needed in order to avoid costly rework and hidden defects, and to deliver a software system that is consistent with the stage of development and traceable to the needs of the customer.

B.2.7.1 Configuration Item and Rationale

Each project churns out a number of documents, but not all are required for continued development and system maintenance. Provide a list of configuration items (including both documents and software), the rationale behind the selection of each item, its volatility, and its impact severity. Artifacts suited for configuration control include plans, specifications, designs, code, tests, manuals, object code, defect reports and change requests. You may also provide a type for each item, such as MinBASE document, plan, report, code, etc.

B.2.7.2 Identification System

Provide the identification system used, in particular the naming convention for the various versions of all the artifacts.

B.2.7.3 Storage of Configuration Items

The storage location for each artifact should be identified. Provide the directory structure for the artifacts.

B.2.7.4 Configuration Control

Configuration control includes (for both documents and software, as applicable):

 How changes to the baseline artifacts will be coordinated with the client (e.g., meeting for major changes, email for moderate changes, none for trivial changes).

 How outstanding problem reports will be tracked.

 Who will be the custodian of the master baseline versions, and how will he/she preserve their integrity and ensure that version control is maintained?


 How will the changes be made? Provide a flowchart.

B.2.7.5 Resources and Personnel

Identify the software and human resources required to perform configuration management (CM) functions. Designate specific project personnel to perform the tasks of CM and to act as the main project librarian or archivist.

B.2.7.6 Tools

Each of the tasks mentioned above may necessitate the use of tools for efficiency and effectiveness.

Test Plan and Cases

A. Description

A.1. Purpose

 To prescribe the scope, approach, resources, and schedule of testing activities. To identify the items being tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with testing.

 To detail the activities required to prepare for and conduct the system test

 To communicate to all responsible parties the tasks which they are to perform, and the schedule to be followed in performing the tasks

 To define the sources of the information used to prepare the plan

 To define the test tools and environment needed to conduct the system test

 Test Description. To describe the test cases needed for each test identifier.

 Test Initialization. Describe how to set up the conditions for the test; identify any flags, breakpoints, or data to be set/reset prior to starting the test.

 Describe the expected output specification and the pass/fail criteria.

 Describe assumptions and constraints, dependencies among the test cases, and traceability to other resources (e.g., the SSAD).

A.2. High-Level Dependencies

The Test Plan and Cases requires the following as inputs:

 Life Cycle Plan (LCP)

 Quality Management Plan (QMP)

 System and Software Requirements Definition (SSRD)

 Configuration Management Plan

 System and Software Architecture Description (SSAD)

 Relevant organizational policies

 Relevant organizational standards

The Test Plan and Cases will generate outputs for the following:

 Test Procedures and Results (TP&R)


OCD SSRD SSAD LCP FRD UML Model Prototype IP IAR RD QMP TP&C TP&R PRP PRR

-  +  ------+  - -

B. Document Sections

B.1. Introduction

B.1.1 Purpose of the Test Plan

 Provide the purpose, background, and scope of testing within this project.

 Provide the description of test cases within this project.

B.1.2 References

Provide usable citations to significant prior and current related work and artifacts, documents, meetings and external tools referenced or used in the preparation of this document.

B.2. Environment Preparation

Specify how to prepare the test environment for testing.

B.2.1 Hardware preparation

Describe the procedures needed to prepare the hardware for the test, including support hardware (e.g., test equipment). Reference any operating manuals, if applicable.

Provide the following as applicable:

 The specific hardware to be used, identified by name and, if applicable, number

 Purpose of each hardware item, security and privacy considerations.

B.2.2 Software preparation

Describe the procedures needed to prepare the software for the test, including support software (e.g., simulators, data recording/reduction software). Reference any software manuals, if applicable.

Provide the following as applicable:

 The specific software to be used, identified by name and, if applicable, version number

 Purpose of each software item, security and privacy considerations

B.2.3 Other pre-test preparations

Describe any other pre-test personnel actions, preparations, or procedures needed to perform the test not accounted for in 2.1 or 2.2 above.

B.3. Test Identification

This section should contain the following information:

B.3.1 Test Identifier

This identifies the test by a project-unique identifier. It shall provide a brief description of the test.

B.3.1.1 Test Level

This section shall describe the level at which the testing will be performed, for example, software item level or system qualification level.

B.3.1.2 Test Class

This section shall describe the type or class of the test that will be performed, for example, timing tests, erroneous-input tests, maximum capacity tests, etc.

B.3.1.3 Requirements Traceability

This section shall include traceability from each test identified to the software item requirements and, if applicable, software system requirements it addresses.

B.3.1.4 Test Cases

A test case specification specifies inputs, expected results, and a set of execution conditions for a test item. For each test case, create a sub-heading using the following structure:

Identify a test (one of the tests in the test set comprising the application testing addressed by this test description) by a project-unique identifier and provide the information specified below for the test. The name includes the identification of the applicable unit. There may be several test cases for one test identifier.

The following table (Table 22) can be used as a reference.

Table 22: Test Cases

Test Case Number <>

Test Item <>

Pre-conditions << Describe the prerequisite conditions that must be established prior to performing this test case>>

Post-conditions << Describe the conditions that must be established after performing this test case>>

Input Specifications <<Describe each input required to execute the test case>>


Expected Output Specifications <>

Pass/Fail Criteria << Identify all the pass/fail criteria to be used for evaluating the results of this test case>>

Assumptions and Constraints << Identify any assumptions made or constraints imposed in this test case>>

Dependencies <<List the identifiers of the test cases that this test case depends on, i.e., the test cases that must be executed prior to the execution of this test case>>

Traceability <<Provide mapping to OCD, SSRD, SSAD>>
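As an illustration only, a team that keeps its test cases in machine-readable form might mirror the fields of Table 22 in a small record type. The Python sketch below is hypothetical; the field names, ID scheme, and sample values are not prescribed by these guidelines.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCase:
        # Fields mirror Table 22; names and the ID scheme are illustrative only.
        number: str                      # Test Case Number, e.g. "TC-01-01"
        test_item: str                   # Test Item being exercised
        preconditions: List[str]         # Conditions established before the test
        postconditions: List[str]        # Conditions expected after the test
        inputs: List[str]                # Input Specifications
        expected_outputs: List[str]      # Expected Output Specifications
        pass_fail_criteria: List[str]    # Criteria for evaluating the results
        assumptions: List[str] = field(default_factory=list)   # Assumptions and constraints
        dependencies: List[str] = field(default_factory=list)  # Test cases that must run first
        traceability: List[str] = field(default_factory=list)  # Mapping to OCD/SSRD/SSAD items

    # Hypothetical example entry
    login_test = TestCase(
        number="TC-01-01",
        test_item="User login",
        preconditions=["System installed", "A registered user account exists"],
        postconditions=["A user session is active"],
        inputs=["Valid user name and password"],
        expected_outputs=["Main menu is displayed"],
        pass_fail_criteria=["Main menu appears without error"],
        traceability=["SSRD requirement 3.2.1 (hypothetical)"],
    )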

Note: The following template shows a way of organizing Section 3.x (this depends on whether there is more than one test):

3.1 Test Identifier 1

3.1.1 Test Level

3.1.2 Test Class

3.1.3 Requirements Traceability

3.1.4 Test Case

3.2 Test Identifier 2

3.2.1 Test Level

3.2.2 Test Class

3.2.3 Requirements Traceability

3.2.4 Test Case

...

B.4. Resources

Specify the people, time, budget, and other resources allocated for testing.

B.4.1 Responsibilities

Identify the stakeholders responsible for managing, designing, preparing, executing, witnessing, inspecting, and resolving test items. In addition, identify the groups responsible for providing the items to be tested.

B.4.2 Staffing and Training Needs

Specify test-staffing needs by skill level. Identify training options for providing the necessary skills.

B.4.3 Other resource allocations

Include all other resources required for the testing activities.

B.5. Test Completion Criteria

Provide a statement identifying the recommended test completion and evaluation criteria. Individual tests may have completion criteria separate from the overall criteria.

Representation:

Table 23: Test Completion Criteria

Test Identifier: Unique identifier for traceability

Description: Brief description of the purpose

Completion Criteria: List of completion criteria specific to this test identifier.
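A hypothetical example entry, for illustration only: Test Identifier TC-01 (login tests); Description: verify the login capability; Completion Criteria: all TC-01 test cases executed, all pass/fail criteria met, and no open major defects.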

Test Procedures and Results

A. Description

A.1. Purpose

 Detail the activities required to prepare for and conduct specific system tests and document their results.

 Communicate to all responsible parties the tasks which they are to perform, and the schedule to be followed in performing the tasks

 Test procedure specification. A document specifying a sequence of actions for the execution of a test.

 Tester. Identify the person who prepared the test description.

 Test Preparation. Identify the test tools to be used; the names of files in which test cases and/or data reside; the hardware and software required for the test.

 Test Result Report. Include the test incidents, test log, test summary, and rationale for this test.

A.2. High-Level Dependencies

The Test Procedures and Results document requires the following as inputs:

 Test Plan and Cases

 System Requirements (SSRD)

 Behavior Model (SSAD)

OCD SSRD SSAD LCP FRD UML Model Prototype IP IAR RD QMP TP&C TP&R PRP PRR

-  + ------ - -

B. Document Sections

B.1. Introduction

B.1.1 Purpose

Provide the purpose, background, and scope of test descriptions and results within this project.

B.1.2 References

Provide usable citations to significant prior and current related work and artifacts, documents, meetings and external tools referenced or used in the preparation of this document.

B.2. Test Preparation

For each test provide the following:

 Test identifier: This identifies the test by a project unique identifier.

 Hardware preparation: Describe the procedures needed to prepare the hardware for the test, including support hardware (e.g., test equipment). Reference any operating manuals, if applicable.

 Software preparation: Describe the procedures needed to prepare the software for the test, including support software (e.g., simulators, gages, data recording/reduction software). Reference any software manuals, if applicable.

 Other pre-test preparation: Describe any other pre-test personnel actions, preparations, or procedures needed to perform the test.

B.3. Test Case Description

For each test case provide: Test Identifier, Test Case, and Test Process (Table 24 can be used for describing the test process)

Table 24: Test Procedure Template

Step No. Step Description Expected Result Observed Result Pass/Fail
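A hypothetical filled-in step, for illustration only: Step 1 – start the application and log in with a registered account; Expected Result – the main menu is displayed; Observed Result and Pass/Fail are recorded by the tester during execution.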

B.4. Test Incident Reports

A test incident report is a document reporting on any event that occurs during the testing process that requires further investigation.

B.4.1 Incident Description

Provide a detailed description of the incident. This description should include the following items:

 Unique identifier for each test incident

 Inputs

 Expected results, Actual results

 Tester

B.4.2 Impact

If known, indicate what impact this incident will have on test plans, test-design specifications, test-procedure specifications, or test-case specifications.

B.5. Test Log

The purpose of the test logs is to provide a chronological record of relevant details about the execution of tests.


This test log shall include:

 The date(s), time(s), and location(s) of the tests performed

 The hardware and software configurations used for each test including, as applicable,

 The date and time of each test-related activity, the identity of the individual(s) who performed the activity, and the identities of witnesses, as applicable.

B.6. Test Summary

This provides an overview of test results.

B.6.1 Overall assessment of the software tested:

 Provide an overall assessment of the software as demonstrated by the test results in this report.

 Identify any remaining deficiencies, limitations, or constraints that were detected by the testing performed.

B.6.2 Impact of test environment:

This section shall provide an assessment of the manner in which the test environment may be different from the operational environment and the effect of this difference on the test results.

B.6.3 Recommended improvements:

This section shall provide any recommended improvements in the design, operation, or testing of the software tested. A discussion of each recommendation and its impact on the software may be provided.

B.7. Operational Test and Evaluation

Assurance of adequate transition readiness is essential for successful transition. Operational test and evaluation helps identify risks and problem areas, set baselines, and manage stakeholder expectations. Address the following items:

B.7.1 Evaluation Criteria

Identify the criteria that address issues such as improved efficiency, quality of service, mission effectiveness, stakeholder satisfaction, and return on investment.

B.7.2 Outline of Operational Test and Evaluation Report

The report should include an update of the evaluation criteria and procedures above, and various summaries of the results, conclusions, and recommendations.


Peer Review Plan

A. Description

A.1. Purpose

The peer review plan describes the peer [or people-based] review process(es) and actual peer review checklists or Fagan's Inspection exit criteria (or references to them), and identifies the people involved and the schedule of peer reviews for the project. Since IV&V reviews are also "people-based" reviews, they should be covered by this document, although it may not be possible to provide all the same information.

A.2. High-Level Dependencies

OCD SSRD SSAD LCP FRD Prototype

~ + +  ~ ~

B. Document Sections

B.1. Purpose

B.1.1 Overview

Describe the intended purpose of the in-process, people-based reviews and expected outcomes for this project.

577b Guidelines:

Since there can be both team and independent (IV&Ver) in-process, people-based reviews for CS577, both should be described if you have an IV&Ver assigned to your project. For sections 2 through 4, this can be done using two separate sub-sections.

B.1.2 References

Provide usable citations to significant prior and current related work and artifacts, documents, meetings and external tools referenced or used in the preparation of this document.

B.2. Peer Review Items, Participants and Roles

 Describe the artifacts (documents, etc.) that will be subjected to peer review, taking into account schedule or staffing constraints.

 For each artifact, indicate the primary author, and identify who will play the various roles. This information will be required to help the participants involved prepare for their respective roles.

B.3. Peer Review Process

 Describe the peer review process for each type of item to be reviewed

 Reference any inspection process (e.g., Fagan Code Inspection); if that is the case, simply indicate any variations introduced particularly for your project, such as Perspective-Based Reading or the Object-Oriented Reading Technique.

B.4. Classification of Defects

All defects identified during the peer review meeting shall be classified by severity and type. This serves as a metric for future projects. Example severity levels and types of defects are presented in the tables below. Consider using existing defect classifications such as Orthogonal Defect Classification (ODC). Describe the classification scheme to be used for this project. In addition to an overall description, define the form below from the project perspective.

B.4.1 Priority and Criticality

 Priority

Priority indicates the importance of the issue to the client/stakeholder. It is decided based on the stakeholders’ value propositions. In CS577, the priority will be determined from WinWin negotiation client meetings and recorded in OCD, SSRD, and SSAD templates. The values of priority are: High, Medium, and Low.

 Criticality

The criticality value indicates the degree of loss to the system if the issue is not fixed. The value is based on the judgment of the software engineer. The values of criticality are High, Medium, and Low.

B.4.2 Classification

Define the type of defect

 Missing

Information that is specified in the requirements or standard, but is not present in the document

 Wrong

Information that is specified in the requirements or standards and is present in the document, but the information is incorrect

 Extra

Information that is not specified in the requirements or standards but is present in the document

B.5. Appendices: Forms for Peer Review Data Gathering

Include all the forms and instruction sets (either directly or by reference) to be used in gathering information from the Peer Review activities on your project.


Figure 10: Area of Concern Log Form


Figure 11: Problems/Issues Log Cover Sheet Form

The field descriptions for the forms shown in Figures 12 and 13 can also be found in the following worksheets of the Microsoft Excel spreadsheet file named "AgileArtifactReview_Form-FieldDesc_v6.xls".

 Areas of Concern Log Field Desc

 Problem List Field Description


Figure 12: Field Descriptions for the Areas of Concern Log


Figure 13: Field Descriptions for Artifact Review Problem List

Peer Review Report

A. Description

A.1. Purpose

The peer review report summarizes the results of the peer review process. It identifies the peer review defect data and statistics, thus indicating the importance of peer review for a high-quality project.

B. Document Sections

B.1. Purpose

B.1.1 Overview

This document provides information on what peer review information is reported for the various types of peer reviews identified in the Peer Review Plan.

B.1.2 References

Provide usable citations to significant prior and current related work and artifacts, documents, meetings and external tools referenced or used in the preparation of this document.

B.2. Individual Inspection-like Peer Reviews

The following information is generally provided as part of any Peer Review.

B.2.1 Participants and Roles

Indicate the primary author, and identify who will play the various roles for the peer review.

B.2.2 Peer Review Items

 Describe the item(s) that were subjected to this peer review.

 Describe the exit criteria used for each Peer Review Item.

B.2.3 Pre-Peer Review Meeting Defect Data

Report the pre-peer review meeting data, such as preparation time and the major and minor defects found per inspector.

577b Guidelines:

Indicate your tailoring

B.2.4 Peer Review Meeting Defect Data

Report the meeting defect data: size of artifact, major and minor counts, time of peer review, and new defects found during the meeting that were not found in preparation.

577b Guidelines:

Links to the Quality Reports are sufficient. B.2.5 Post-Peer Review Meeting Defect Data

Report the post-meeting data: rework hours and the final count of defects.

577b Guidelines:

Provide a summarized report of open defects, issues and defect count using Weekly Progress Reports. Also indicate the number of open issues and defects between anchor points.

B.2.6 Peer Review Statistics

From the above, compute the following peer review statistics:

 Total effort consumed for each peer reviewer

 Defect density of the inspected artifacts

 Defects asserted per minute of the meeting

 Defect removal effectiveness

These are defined as follows:

 Meeting effort = Meeting time * Persons

 Total defects found = Items from preparation + New items

 Defects asserted per minute of meeting = Total Defects / Time Meeting

 Peer review effort = Preparation effort + Meeting effort + rework effort

 Defect removal effectiveness = Defects found / Peer review effort

 Defect density = Defects / SLOC or Pages

 Peer review rate = Inspected pages / Meeting time
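As a minimal sketch only (every input value below is hypothetical, and the variable names are simply one way of writing down the formulas above), the statistics could be computed as follows:

    # Illustrative computation of the peer review statistics defined above;
    # all input values here are hypothetical.
    persons = 4                    # reviewers attending the meeting
    meeting_time_min = 60          # length of the peer review meeting, in minutes
    preparation_effort_hr = 6.0    # total individual preparation effort, person-hours
    rework_effort_hr = 3.0         # effort spent fixing the reported defects, person-hours
    defects_from_preparation = 18  # items brought into the meeting from preparation
    new_defects_in_meeting = 4     # new defects found during the meeting itself
    inspected_pages = 30           # size of the inspected artifact, in pages

    meeting_effort_hr = (meeting_time_min / 60.0) * persons       # meeting effort = meeting time x persons
    total_defects = defects_from_preparation + new_defects_in_meeting
    defects_per_minute = total_defects / meeting_time_min         # defects asserted per minute of meeting
    peer_review_effort_hr = preparation_effort_hr + meeting_effort_hr + rework_effort_hr
    removal_effectiveness = total_defects / peer_review_effort_hr  # defects found per person-hour
    defect_density = total_defects / inspected_pages              # defects per page (or per SLOC for code)
    review_rate_pages_per_hr = inspected_pages / (meeting_time_min / 60.0)

    print(total_defects, defects_per_minute, removal_effectiveness,
          defect_density, review_rate_pages_per_hr)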

B.2.7 Defect Correction and Rework

Provide the status of defect correction, including:

 Technical approach of outstanding rework

 Due date of outstanding rework if any


Provide information for tracking that the defects were properly removed.

577b Guidelines:

Since most of this information is captured on the supplied forms, it is sufficient to reference them.

B.3. Peer Review Summary Information

B.3.1 Peer Reviews Performed

For Artifacts prior to Construction, provide a reference to the information in the LCP §4.4.

For Artifacts during the Construction phase, including critical technical documents (e.g. SSRD and SSAD), code and possibly tests, provide information similar to that in the "Example of Table of Assessment Activities History" of LCP §4.4.

B.3.2 Peer Review Results

Provide any information that is deemed useful to your project to assist with closed-loop control of the project.

577b Guidelines:

At the very least, this might include

 The COQUALMO data recording form, TableSPD-8b.doc, available from the CS577b website.

 The aggregated weekly open problem report and issues list identified in the Weekly Status Report.

B.4. Appendix

Include in the Appendix all variations of the following forms (either directly or by reference), which are to be completed by the various participants in the in-process team-based Peer Review(s) Meeting(s) for your project.

 Announcement

 Individual Preparation Log

 Defect List

 Detailed Report

 Summary Report

Examples should range from full-up Fagan's Inspections to Agile Artifact Review Forms.

Also include in the Appendix all variations of the forms (either directly or by reference) which are to be completed by an IV&V person assigned to your project. Examples range from Formal Review to Agile Artifact Review.

Transition Plan

A. Description

A.1. Purpose

The purpose of the transition plan is to ensure that the system’s operational stakeholders (users, administrators, operators, maintainers) are able to successfully operate and maintain the system.

A.2. High-Level Dependencies

The Transition Plan draws on and elaborates the transition aspects of the Operational Concept (OCD 4), and the Milestones and Products (LCP 2) and Responsibilities (LCP 3) from the Life Cycle Plan. It prepares for the successful execution of the life cycle support activities and responsibilities in the Support Plan.

OCD SSRD SSAD LCP FRD Prototype

~ - -  + -

B. Document Sections

B.1. Transition Strategy

This section shall be divided into paragraphs as needed to describe the developer's plans for transitioning the deliverable software to the support agency. This section shall address the following:

B.1.1 Transition Objectives

Establish the various dimensions of success associated with the particular transition activity. These may include the following classes of transition choices:

 Extent of capability transitioned: limited pilot operation; full operation; intermediate.

 Number and nature of transition sites: one vs. many; homogeneous vs. heterogeneous.

 Degree of post-transition developer support: none (pure turnkey), full (facilities management by developer), intermediate.

 Degree of validation of operational satisfaction of stakeholder objectives.

 Nature of product transition: shrink-wrapped product; new system; improvement in existing operation.

 Relation to support objectives, e.g., data collection to support evolution planning.

B.1.2 Transition Process Strategy

This section establishes the major strategic transition choices:


 Phasing of cutover: instantaneous (cold turkey); incremental; parallel operation; other.

 Phasing of transition of multiple increments to multiple sites.

 Role of alpha-testing, beta-testing, independent operational testing and evaluation.

B.2. Preparing for Transition

Preparing for transition is a significant, nontrivial task that is critical to the ultimately successful operation of the system.

B.2.1 Hardware Preparation

 Indicate additional hardware that needs to be purchased if any

 Indicate any special or additional instructions for placing the hardware in a state of readiness

 Staff time required for hardware preparation

 Organizational approvals for nonstandard hardware installation

B.2.2 Software Preparation

 Appropriate packaging of deliverable support software (tools; infrastructure software).

 Licenses for commercial software.

 Preparation and installation of necessary data.

 Conversion of legacy software for compatibility with the new system.

 Organizational approvals for nonstandard software installation

B.2.3 Site Preparation

Facilities may include:

 Computer rooms, flooring, power supply

 Computers, peripherals, and supplies

 Data communications capabilities

 Security and backup facilities

B.3. Stakeholder Roles and Responsibilities

Identify a significant transition role for each stakeholder category. Besides developers, maintainers, users, customers, and interfacers, these could include operational testers, labor unions, and safety officials for example.

Software User's Manual

A. Description

A.1. Purpose

The purpose of the Software User's Manual is to teach and guide the user in using the product: it gives the steps for running the software, describes the expected output(s), and describes the measures to be taken if errors occur.

A.2. High-Level Dependencies

The following table shows the dependencies of the Software User’s Manual on other MBASE documents.

OCD SSRD SSAD LCP FRD Prototype

+ + ~ - ~ +

B. Document Sections

B.1. Introduction

B.1.1 System Overview

State the brief purpose of the system to which this manual applies.

B.1.2 System Requirements

Describe the minimum hardware and software (Operating System, etc.) requirements for the system.

B.2. Operational Procedures

Briefly describe functions available to the user and describe the step-by-step procedures for accessing these functions.

Include the following information, as applicable:

 Initialization: Describe any procedures needed to initialize the software. Identify any initialization options available to the user.

 User Functions: Describe the functions available to the user. For each function, describe the user inputs (including format, limitations, input method, etc.) and the operational results.

B.3. Installation Procedures

In a system where the end user is expected to install the product, the installation instructions can be included in the user's guide. For complicated installations where qualified service staff are needed, a separate Installation Manual should be provided.

B.3.1 Initialization Procedures

Describe first-time installation procedures.

B.3.2 Re-installation

Describe procedures for reinstalling the system (e.g., to recover from a corrupt installation).

B.3.3 De-installation

Describe procedures for removing the system.

B.4. Troubleshooting

B.4.1 Frequently Asked Questions

List Frequently Asked Questions by operators, and answers to those questions.

B.4.2 Error Codes and Messages

List and identify all error codes and messages generated by the software, the meaning of each message, and the action to be taken when each message appears.

B.5. Notes

Include any general information that aids in the understanding of the document. All acronyms, abbreviations, and their meaning as used in this document should be listed.

System and Software Support Plan (SSSP)

A. Description

A.1. Purpose

The purpose of the Support Plan is to guide the system’s support stakeholders (administrators, operators, maintainers) in successfully operating and maintaining the system.

A.2. High-Level Dependencies

The following table shows the dependencies of the System and Software Support Plan on other MBASE documents.

OCD SSRD SSAD LCP FRD Prototype

+ - +  + -

B. Document Sections

B.1. Support Objectives and Assumptions

B.1.1 Support Objectives

Identify the key driving objectives for software support. Some possible examples are:

 The top priority objective is to ensure [system safety; Platinum customer satisfaction; speed of competitive reaction in the marketplace].

 Replace inefficient legacy systems as quickly as possible.

 Provide more promising and stimulating career paths for support personnel.

B.1.2 Support Assumptions

State any assumptions which, if invalid, would make the support plan unworkable. Some possible examples are:

 Continuity of funding, staffing, and upper management support.

 Controllable/negotiable interfaces with interoperating systems.

 Stability of requirements and schedules for each release.

B.2. Support Strategy

B.2.1.1 Support Lifetime

Provide a best estimate of the envisioned support lifetime. It is acceptable to provide approximate estimates (less than one year; over 5 years) or relative estimates (until the current Master Status package is phased out).

B.2.1.2 Release Strategy

Identify the overall release strategy (continuous as-needed small fixes; major annual releases and as-needed small fixes; releases synchronized with major COTS upgrades, etc.). Identify any known content of early releases (evolution requirements in SSRD). Describe the transition strategy for new releases (alpha and beta testing; pilot operation; sequencing of multisite or multinational releases).

B.2.1.3 Release Requirements Determination

Identify the primary drivers of new release content (budget, schedule, staffing, legal, business opportunity). Describe the process by which release requirements are determined (executive prioritization; stakeholder win-win; bidding; organizational sub-allocations, etc.).

B.2.1.4 Release Process

Describe the process for each release and how its phases overlap with neighboring releases.

B.3. Support Environment

Document the anticipated environmental concerns for the long-term use of the system.

B.3.1.1 Hardware

Describe the hardware and associated documentation needed to maintain the deliverable software. This hardware may include computers, peripheral equipment, and non-computer equipment.

B.3.1.2 Software

Identify and describe the software and associated documentation needed to maintain the deliverable software. This software may include computer-aided software engineering (CASE) tools, compilers, test tools, test data, utilities, configuration management tools, databases and other software. The description shall include:

 Specific names, identification numbers, version numbers, release numbers, and configurations, as applicable

 Rationale for the selected software

 Reference to user/operator manuals or instructions for each item, as applicable

 Availability information about vendor support, licensing, and data rights, including whether the item is currently supported by the vendor, whether it is expected to be supported at the time of delivery, whether licenses will be assigned to the support agency, and the terms of such licenses

B.3.1.3 Facilities

Describe the facilities needed to maintain the deliverable software. These facilities may include special building features, such as cabling, special power requirements, etc.

B.4. Support Responsibilities

Identify the major support stakeholder roles, and the organizations and individuals (where possible) responsible for each. Roles may include:

 Software maintenance (corrective, adaptive, perfective)

 System administration (version control, backup/recovery, authorizations, performance management, etc.)

 Operational and user support (help desks, training, etc.)

 Database administration

 Data acquisition

Include anticipated number of personnel, types and levels of skills and expertise.

Training Plan

Description

A.1. Purpose

The purpose of the Training Plan is to help the developers provide effective training to the stakeholders (administrators, operators, maintainers) in successfully operating and maintaining the system.

B. Document Sections

B.1. Schedule and Participants

B.1.1 Training Schedule

Use the following table (Table 25) to identify the following for each training session the team will have:

Table 25: Training Schedule

Date, Time, Location, Contents, Person being trained, Responsible Personnel, Training Materials

B.1.2 Measure of Success

State measurable goals to be achieved during each training session.

B.1.3 Training of Others

Identify any stakeholders you will not train during these training sessions, who will be responsible for training them, and how they will be trained.

B.2. Tutorial and Sample Data

Include any tutorial presentations and sample data used for the training sessions in this section. For presentations, just indicate what presentations were used and where they can be found. You do not need to paste the presentations in this section.

B.3. Training Feedback

 Indicate how each training session went. Were your goals accomplished in each session?

 Identify any issues the trainees had during training with your system.

Sources and References

The guidelines have drawn upon material from the following:

A. Overall

[Boehm, 1996] Boehm, B., “Anchoring the Software Process,” IEEE Software, July 1996, pp. 73-82.

[Boehm, 1989] Boehm, Software Risk Management, IEEE Computer Society Press, 1989.

[Royce, 1998] Royce, W. Software Project Management: A Unified Framework. Addison Wesley, 1998.

Boehm, B., Egyed, A., Kwan, J., and Madachy, R. (1997), “Developing Multimedia Applications with the Win– Win Spiral Model,” Proceedings, ESEC/ FSE 97, Springer Verlag.

Boehm, B., Egyed, A., Kwan, J., and Madachy, R. (1998), “Using the Win–Win Spiral Model: A Case Study,” IEEE Computer, July, pp. 33-44.

[Laprie 1992] J.C. Laprie. "For a product-in-a-process approach to software reliability evaluation", Proc. Third International Symposium on Software Reliability Engineering (ISSRE92), pp. 134-139, Research Triangle Park, North Carolina, 1992.

[MIL-STD-498] Defense Standards MIL-STD-498

[EIA/IEEE J-STD-016] Commercial Standard EIA/IEEE J-STD-016

[Cichocki et al., 1997] Cichocki, A., Abdelsalam, A., Woelk, D., Workflow and Process Automation : Concepts and Technology (Kluwer International Series in Engineering and Computer Science, Secs 432), Kluwer Academic Pub, 1997.

[Potts, 1994] Potts, C., Takahashi, K., and Anton, A., "Inquiry-Based Requirements Analysis," IEEE Software, March 1994, pp. 21-32.

Sommerville, I., Software Engineering, Addison Wesley, 1995.

IEEE, IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990, February 1991.

Since the guidelines have been integrated from multiple sources, they will not necessarily be fully consistent with the guidelines in any individual source. See the following resources for useful perspectives and additional guidelines.

B. Operational Concept Description

[AIAA, 1992] AIAA Recommended Technical Practice, Operational Concept Description Document (OCD), Preparation Guidelines, Software Systems Technical Committee, American Institute of Aeronautics and Astronautics (AIAA), March 1, 1992.

[Fairley et al, 1994] Fairley, R., Thayer R, and Bjorke P., The Concept of Operations: The Bridge from Operational Requirements to Technical Specifications, IEEE, 1994.

[Lano, 1988] Lano, R.J., A Structured Approach for Operational Concept Formulation (OCF), In Tutorial: software Engineering Project Management, edited by R. Thayer, Computer Society Press, 1988.

C. System and Software Requirements Definition

In, H. (1998). Conflict Identification and Resolution for Software Attribute Requirements. Ph.D. Thesis, University of Southern California, Los Angeles, CA.

Template. James & Suzanne Robertson. Atlantic Systems Guild http://www.atlsysguild.com/Site/Robs/Templsects.html

Typical list of data items for system/software development: http://www.airtime.co.uk/users/wysywig/didlist.htm

D. System and Software Architecture Description

C2 Architectural style (see http://www.ics.uci.edu/pub/arch/c2.html)

Port, D., Integrated Systems Development Methodology, Telos Press (to appear).

E. Life Cycle Plan

[AT&T, 1993] Best Current Practices: Software Architecture Validation, Lucent/AT&T, 1993.

[CMU-SEI, 1995] Paulk, M., Weber, C., Curtis, B. The Capability Maturity Model : Guidelines for Improving the Software Process (SEI Series in Software Engineering), Addison-Wesley, 1995.

[Thorp, 1999] Thorp, J., The Information Paradox: Realizing the Business Benefits of Information Technology, McGraw-Hill, 1999.

Appendices

A. Suggested Win–Win Taxonomy for MBASE

B. Level of Service Requirements

C. Common Definition Language (CDL) For MBASE

Appendix A. Suggested Win–Win Taxonomy for MBASE

The following domain taxonomy is suggested as a checklist and organizing structure for the Win–Win requirements negotiation. Each Win–Win stakeholder artifact should point to at least one taxonomy element (modify the taxonomy as appropriate). Each taxonomy element should be considered as a source of potential stakeholder win conditions and agreements. The Win–Win taxonomy roughly corresponds to the table of contents of the System and Software Requirements Definition (SSRD). Mapping the Win–Win taxonomy to the SSRD outline is straightforward, but in some cases, some sections need to be combined. In particular, Operational Modes are described in the SSRD with System Requirements. The reason is that the same system functionality may lead to different results depending on the mode.

1. Project Constraints (===> SSRD 2. Project Requirements)
1.1. Budget Constraints
1.2. Schedule Constraints
1.3. Staffing Constraints
2. Application Capabilities (===> SSRD 3.2 System Requirements)
2.1. Operational Modes
2.2. User Classes
2.3. Mission Capabilities. These will vary depending on whether the mission involves a multimedia archive, selective dissemination of information, data analysis, etc.
2.4. Support Capabilities
2.4.1. Help
2.4.2. Administration
2.4.2.1. User Account Management
2.4.2.2. Usage Monitoring and Analysis
2.4.3. Maintenance and Diagnostics
3. Level of Services (===> SSRD 5. Level of Service Requirements)
3.1. General Qualities
3.1.1. Correctness
3.1.2. Simplicity
3.1.3. Consistency
3.1.4. Completeness
3.2. Coherence
3.3. Dependability
3.3.1. Reliability
3.3.2. Accuracy
3.3.3. Availability
3.3.4. Survivability
3.3.5. Serviceability
3.3.6. Verifiability
3.3.7. Resilience
3.4. Security
3.4.1. Integrity
3.4.2. Privacy
3.4.3. Audit
3.5. Confidentiality
3.6. Safety
3.7. Interoperability
3.7.1. Compatibility
3.8. Usability
3.8.1. Mission Orientation
3.8.2. Comprehensiveness


3.8.3. Controllability
3.8.4. Ease of Learning
3.8.5. Ease of Use
3.8.6. Help requirements
3.9. Performance
3.9.1. Processing Efficiency
3.9.2. Memory Efficiency
3.9.3. Storage Efficiency
3.9.4. Network Efficiency
3.10. Adaptability (Evolve-ability)
3.10.1. Portability
3.10.2. Flexibility
3.10.3. Scalability/Expandability/Extendability/Extensibility
3.10.4. Modifiability
3.10.5. Maintainability
3.10.6. Reconfigurability
3.11. Reusability
4. Interfaces (===> SSRD 4. System Interface Requirements)
4.1. User Interfaces Requirements
4.1.1. Graphical User Interfaces
4.1.2. Command-Line Interfaces
4.1.3. Application Programming Interfaces
4.1.4. Diagnostics
4.2. Hardware Interfaces
4.3. Communications Interfaces
4.4. Other Software Interfaces
5. Environment and Data (===> SSRD 2. Project Requirements)
5.1. Design and Construction Constraints
5.1.1. Tools
5.1.2. Programming Languages
5.1.3. Computer Resources
5.1.4. Standards Compliance
5.2. Packaging
5.3. Implementation
5.4. Software Support Environment Requirements
6. Evolution (===> SSRD 6. Evolution Requirements)
6.1. Capability Evolution
6.2. Interface Evolution
6.3. Technology Evolution
6.4. Environment Evolution
6.5. Workload Evolution

Appendix B. Level of Service Requirements

The following glossary is based on the IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990, February 1991.

Accuracy

(1) A qualitative assessment of correctness, or freedom from error; (2) A quantitative measure of the magnitude of error

Adaptability

Adaptability is defined by the ease with which a system or component can be modified for use in applications or environments other than those for which it was specifically designed. Syn: Flexibility

Audit

Specification of the required audit checks or various audit trails the system should keep to build a system that complies with the appropriate audit rules. This section may have legal implications

Availability

The degree to which a system or component is operational and accessible when required for use. Often expressed as a probability. See also: Error Tolerance; Fault-tolerance; Robustness

Compatibility

(1) The ability of two or more systems or components to perform their required functions while sharing the same hardware or software environment (2) The ability of two or more systems or components to exchange information (See also: Interoperability)

Complexity

(1) The degree to which a system or component has a design or implementation that is difficult to understand and verify. Contrast with: simplicity

Consistency

The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a system or component.

Correctness

Correctness is defined by: (1) The degree to which a system or component is free from faults in its specification, design, and implementation; (2) The degree to which software, documentation, or other items meet specified requirements; (3) The degree to which software, documentation, or other items meet user needs and expectations, whether specified or not.

Dependability

Dependability is defined as “that property of a computer system such that reliance can justifiably be placed on the service it delivers” [Laprie, 1992]. Depending on the intended application of the system dependability is usually expressed as a number of inter-dependent properties such as reliability, maintainability and safety. It refers to a broad notion of what has historically been referred to as “fault tolerance”, “reliability”, or “robustness”


Efficiency

Efficiency is defined by the degree to which a system or component performs its designated functions with minimum consumption of resources.

Error Tolerance

The ability of a system or component to continue normal operation despite the presence of erroneous inputs. See: Fault tolerance, robustness

Expandability/Extendability/Extensibility

Expandability is defined by the ease with which a system or component can be modified to increase its storage or functional capability.

Fault-tolerance

(1) The ability of a system or component to continue normal operation despite the presence of hardware or software faults. See also: error tolerance; fail safe; fail soft; fault secure; robustness

Flexibility

Flexibility is defined by the ease with which a system or component can be modified for use in applications or environments other than those for which it was specifically designed.

Integrity

Integrity is defined by the degree to which a system or component prevents unauthorized access to, or modification of, computer programs or data.

Example: "Identical up-to-date booking information must be available to all users of the system."

Interoperability

Interoperability is defined by the ability of two or more systems or components to exchange information and to use the information that has been exchanged.

Describe other platforms or environments on which the system is expected to run without recompilation.

Example: "The program should be binary compatible with Windows 3.1, Windows 95 and Windows 98 "

Legality

Describe any legal requirements for this system, to comply with the law to avoid later delays, lawsuits and legal fees.

If the legal requirements are above average, then this section might need to be entirely revisited

Example: "Personal information must be implemented so as to comply with the data protection act."

Example: "The system shall not use any image formats that might infringe with existing copyrights or pending legislation (e.g., GIF)"

Maintainability

Maintainability is defined by: (1) The ease with which a software system or component can be modified to correct faults, improve performance or other attributes, or adapt to a changed environment; (2) The ease with which a hardware system or component can be retained in, or restored to, a state in which it can perform its required functions.


Memory Efficiency

List of memory usage requirements that have a genuine effect on the system's ability to fit into the intended environment to set the client and user expectations.

Example: "The system should be able to run on a multi-tasking system with 4MB of free memory"

Example: "Upon exit, the server shall return all the memory it allocates to the pool of free memory on the host computer without any memory leaks".

Modularity

The degree to which a system or computer program is composed of discrete components such that a change to one component has minimal impact on other components.

Network Efficiency

List of network usage requirements that have a genuine effect on the system's ability to fit into the intended environment to set the client and user expectations.

Example: "The system should not increase the traffic on the current network by more than 10% "

Performance

Performance is defined by the degree to which a system or component accomplishes its designated functions within given constraints, such as speed, accuracy, or memory usage.

Political Correctness

Describe any special factors about the product that are necessary for some political or socioeconomic reason: the reality is that the system has to comply with political requirements even if you can find a better/more efficient/more economical solution.

Example: "Our company policy says that we must buy our hardware from Unisys."

Portability

The ease with which a system or component can be transferred from one hardware or software environment to another.

 Describe other platforms or environments to which the system must be ported to quantify client and user expectations about the platforms and future environments in which the system is expected to run.

 Example: "The source code should compile correctly on Solaris and Linux"

Privacy/Confidentiality

Specification of who has authorized access to the system, and under what circumstances that access is granted.

Processing Efficiency

List of response time requirements that have a genuine effect on the system's ability to fit into the intended environment to set the client and user expectations for the response time of the system.

Reliability

Reliability is defined by the ability of a system or component to perform its required functions under stated conditions for a specified period of time.


Reusability

Reusability is the degree to which a software module or other work product can be used in more than one computer program or software system.

Software reusability means that ideas and code are developed once and then used to solve many software problems, thus enhancing productivity, reliability, and quality. Reuse applies not only to source-code fragments, but to all the intermediate work products generated during software development, including requirements documentation, system specifications, design structures, and any information the developer needs to create software.

Robustness

The degree to which a system or component can function correctly in the presence of invalid inputs or stressful environmental conditions. See also: error tolerance; fault tolerance

Scalability

A quantification of how the system should be able to adapt to an increase in the workload without imposing additional overhead or administrative burden.

Storage Efficiency

The degree to which a system or component performs its designated functions with minimum consumption of available storage.

Example: "The system should be able to run with only 40MB of available disk space "

Usability

Usability is defined by the ease with which a user can learn to operate, prepare inputs for, and interpret outputs of a system or component.

Ease of Learning

A statement of the expected learning time, and any special training needed, for the expected users of the system. This guides the designers of the system's interface and functionality, and determines whether or not each type of user can use the system after a stated number of hours of training/familiarization/use (plus a description of the training program, if applicable).

Example: "The average user should be able to produce a virtual tour within 1 hour of beginning to use the system, without resorting to the manual."

Make sure that you have considered the ease of learning requirements from the perspective of all the different types of users.

Ease of Use

A statement of how easy the system is to use to guide the system's designers.

Example: "The average user must have an error rate less than 2%."

Make sure that you have considered the usability requirements from the perspective of all the different types of users. It is necessary to make some measure of the system's intended usability. This may be that the user labs pronounce that the system is usable, or that it follows all the Apple/Windows interface guidelines, or simply that it must be popular with the majority of users.


Help requirements

A description of the help that the system will provide. The help requirements might become so complex that it is better to treat help as a separately specified system.

Example: "The system must provide context specific help. The user must be able to select an artifact and receive instruction about its use."

These might be requirements that relate to all events (globally accessible help facilities) or they might be requirements that relate to individual events or functional requirements.

Appendix C. Common Definition Language (CDL) for MBASE

Abbreviations: A: Analysis DD: Domain Description D: Design

Abstraction DD: A simple interface on complex Classification: Organizing a set of abstractions information. An abstraction is a representation of according to their common qualities. Classification something, either tangible or conceptual. allows you to reduce complexity in an object model. By making general statements about groups of Algorithm D: That portion of a behavior that objects. actually carries out the work of the behavior, independent of any decision-making (policy making) CLI: Command Line Interface necessary to determine which algorithm to execute. CM: Configuration Management Analogy: The identification of groups of similar relationships between abstractions. An operation Coherence: A measure of an abstraction’s elegance. abstraction. An abstraction is coherent if all of its quality resolutions are both correct and consistent. Analysis: The part of the software development process whose primary purpose is to formulate a Cohesiveness: A measure of an abstraction’s model of the problem domain. Analysis focuses on elegance. The degree to which an abstraction’s what to do, design focuses on how to do it. See qualities fit with the defining quality and with each design. other.

ARB: Architectural Review Board Common Definitional Language (CDL): A glossary. A common and consistent set of problem Architect: The person or persons responsible for space and solution space terminology developed evolving and maintaining the system’s architecture. during modeling. Used as a catalog and thesaurus Ultimately, the architect gives the system its during systematic reuse domain engineering to conceptual integrity. describe the key concepts.

Architecture: A description of the organization and Comparator: A similarity between two abstractions. structure of a system. Many different levels of If the similarity between the two abstractions is architectures are involved in developing software expressed in terms of each abstraction’s defining systems, from physical hardware architectures to the quality, the Comparator is referred to as the Main logical architecture of an application framework.. Comparator. Contrast with Discriminator.

Describes the static organization of software into Completeness: A measure of elegance describing subsystems interconnected through interfaces and whether all of an abstraction’s information can be defines at a significant level how nodes executing accessed from the interface. those software subsystems interact with each other. Component: A meta-type whose instances are ‘part- Audience: The person or persons who act as the of’ another abstraction. Components are needed to consumers of an abstraction’s interface. describe the system to domain experts. Components are compositions of objects (sometimes only one). Behavior Model: A representation of the behaviors What an entity is in the domain description, a in a system. component is in the system analysis. Components are nouns. Behavior: Maps to objects. Composition: Creating a new abstraction by Boundaries of Control: The point at which a combining smaller components into a larger behavior requires interaction with users or other abstraction. Contrast with Decomposition. An elements outside the system. operation on an abstraction.

01111e796d662a64ee23970f5be9f23a.doc 132 Version Date: 9/28/2005 Guidelines for MBASE Appendix D. Status Reports

Conceptualization: The earliest phase of abstraction’s defining quality, the discriminator is development, focused upon providing a proof of referred to as the Main Discriminator. concept for the system and characterized by a largely unrestrained activity directed toward the delivery of a Domain: The part of the real world that a particular prototype whose goals and schedules are clearly software project is intended to model. Compare with defined. Context.

Constraint: A restriction on the resolution of a Domain Expert: A person very familiar with the component’s or an object’s quality. For instance, an subject matter of the domain and how it fits together. attribute might have a minimum value, or a set of illegal values. An attribute quality Domain Model: The sea of classes in a system that serve to capture the vocabulary of the problem space; Context: The surrounding environment of a software also known as a conceptual model. A domain model project. can often be expressed in a set of class diagrams whose purpose is to visualize all of the central classes COTS: Commercial Off The Shelf responsible for the essential behavior of the system, together with a specification of the distribution of Coverage: A measure of elegance, with regard to an roles and responsibilities among such classes. object’s defining quality. A defining quality has good coverage if it includes everything that should be Elegance: An abstraction that conveys its underlying a part of the abstraction. information using the simplest possible interface. A measure of an abstraction’s elegance is its CTS: Construction Transition Support package Information-to-interface Ratio. Qualities that directly impact elegance are: Completeness, Cohesiveness. Customer: Customers request application systems, Sufficiency, and Coherence. place requirements on them, and usually pay for the systems. Customers also interact when deciding on Encapsulation: A property of object-oriented needed features, priorities, and roll-out plans when programs in which an object’s implementation is developing new versions of component systems and hidden behind its interface. the layered system as a whole. Customers can be both internal, such as business process owner, or Engineering: Creating cost-effective solutions to external, such as another company. practical problems by applying scientific knowledge to building things of value. Decomposition: Breaking an abstraction into smaller components. An operation on an abstraction. Engineering Abstractions: Construction of elegant Dependency: A requirement that specifies how one abstractions to increase the information/interface. element of a system affects another. There are three possible dependencies. Dependencies can be Enterprise Classification Model: The complete Semantic – how a quality of an abstraction( object, model of a domain, including object structures and attribute, behavior, relationship) is resolved under a behaviors. given set of conditions, e.g. vacation depends on start date; Functional – describes how a component uses Entity (Organization): Any identifiable set of other components to assist in providing behavior, e.g. individuals, policies or systems. alarm – response; or Conceptual – effects the defining of qualities, e.g. what a car is may depend Entities Model: Entities are the fundamental on what model it is. Dependency is an attribute building blocks of the Domain Description that quality. represent information stored in the system.

Design: The part of the software development Factoring: Identifying common elements of two or process whose primary purpose is to decide how the more abstractions, typically decomposition followed system will be implemented. During design, strategic by composition. An operation abstraction. and tactical decisions are made to meet the required Fixed: A value that once set remains unchanging. A functional and quality requirements of a system. See quality of an attribute. A measure of accessibility. analysis. Discriminator: A difference between two abstractions. If the difference between the two Framework: A collection of interdependent classes abstractions is expressed in terms of each that provides a set of services for a particular domain;

Framework: A collection of interdependent classes that provides a set of services for a particular domain; a framework thus exports a number of individual classes and mechanisms that clients can use or adapt.
FRD: Feasibility Rationale Description.
Functional: See Functional Dependency under Dependency.
Generalization: Creating a more inclusive classification. Within an abstraction hierarchy, generalization results in "kind of" relationships. Contrast with Specialization. An operation on an abstraction. There are three special cases of generalization: Leading Special Case – easy to handle and very accessible, in which it is seen that the other cases follow; Representative Special Case – a specialization achieved by resolving some of the abstraction's qualities in an arbitrary way; Extreme Special Case – sets boundaries for the other cases.
Goal: A motivation neither expressed nor implied by responsibilities; "factors that contribute to the choices and aspirations of the organization."
Hierarchy: A directed tree of abstractions with transitive relationships between levels.
Identity: Designation of a component, such as a name or phone number. An attribute quality.
Item Qualification Testing: See Software Item Qualification Testing.
Information: Processed data that conveys more than the data itself; relationships or descriptions of data.
Input: An operation quality. Any data that is required to carry out the operation.
Interaction: Mutual or reciprocal action or influence between entities and/or systems.
Interface: The set of qualities of an object/entity that may be extracted or changed; refers to that part of an object/entity which is accessible to others.
Law of Demeter: A hierarchy of operations with respect to where messages should be sent, e.g. first to itself.
Meta Level: Abstractions that describe other abstractions.
Metric: Measures, e.g. elegance metrics such as cohesiveness, consistency, etc.
Model: An organized collection of abstraction levels.
Object: An encapsulated packet of data and behavior that acts as an abstraction of a particular domain element or programming construct.
OCD: Operational Concept Description.
Operation: A task which is executed in response to the stimulus of an event; the functionality of a component. Operations involve data flow, control flow, and state diagrams, and map to abstractions (see behavior).
Operations Engineering: Facilitates ways of organizing and managing complex operations, e.g. through assembling operations into hierarchies.
Operations Qualities: Trigger, Scenario, Preconditions, Postconditions, Inputs, Outputs, Actions, Exceptions.
Operations Classification: Organizing operations according to abstractions.
Output: An operation quality. Any data that is produced by the operation.
Participating Agent: A non-developer stakeholder in the system.
Performing Agent: One who expends effort to realize the solution to the problem being solved in the project (manager, architect, analyst, designer).
Policy: That portion of a given behavior that decides what the behavior should be doing. Contrast with algorithm.
Precision: A measure of elegance, with regard to an object's defining quality. A defining quality is precise if it excludes everything that is not part of the abstraction.
Precondition: An operation quality. A prerequisite set of conditions that must be true in order for an operation to proceed.
Primitive Method: A method that directly accesses an instance variable.
Readability: Visibility of the value. All attributes should be readable. A quality of an attribute. A measure of accessibility.
Reflexivity: Indicates whether a relationship can have the same object at both ends. Reflexive relationships can; irreflexive relationships cannot.
Regression Testing: Selective testing of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements.
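The Generalization and Hierarchy entries describe "kind of" relationships between abstractions. A brief Java sketch of one such hierarchy follows; the Account and SavingsAccount classes are hypothetical illustrations, not names taken from any MBASE model.

/** Generalization: Account is the more inclusive classification. */
public abstract class Account {

    protected long balanceInCents;

    /** Behavior shared by every kind of account. */
    public long getBalanceInCents() {
        return balanceInCents;
    }
}

/** Specialization: a SavingsAccount is a "kind of" Account with an added quality. */
class SavingsAccount extends Account {

    private final double annualInterestRate;   // quality introduced by the subclass

    SavingsAccount(double annualInterestRate) {
        this.annualInterestRate = annualInterestRate;
    }

    public void applyMonthlyInterest() {
        balanceInCents += Math.round(balanceInCents * annualInterestRate / 12.0);
    }
}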

Relationship: A conceptual connection between two or more components or objects. A complex relationship is a composition of simple relationships, and includes bidirectional relationships (simple "one to many" relationships) and symmetric relationships (relationships that have the same qualities when viewed from either direction, e.g. "next to").
Relationship Constraints: There are three: Reflexivity – relationships that have the same component as both the source and destination, e.g. lawyers as their own clients; Directed – limits the possible relationships between two components to one or more relationships, e.g. "selected from"; Mappings – the existence of a relationship requires the existence of another, e.g. "works for" implies "employed by".
Relationship Types: The 'part of' relationship (the relationship that objects within a component have to their container). The 'contains' relationship is the reverse of the 'part of' relationship (deletion of the container deletes the parts). There are four 'composite' relationships: Collections – e.g. an address book; Aggregations – e.g. an automobile engine with its aggregation of parts; Groupings – like an aggregation, but if you delete all the parts the group is deleted as well; Partnerships – where the deletion of a relationship between objects causes the container to be deleted.
Relevance: The degree to which a quality is important in the current context.
Responsibility: System responsibilities are a list of tasks the final system will be responsible for; something to be done.
Reuse: Further use or repeated use of an artifact; typically software programs that were designed for use outside their original context.
RLCA: Rebaselined LCA.
ROI: Return on Investment.
Scenario: An operation quality. A description of the steps taken to complete the operation.
Software Item: An aggregation of software, such as a computer program or database, that satisfies an end use function and is designated for purposes of specification, qualification testing, interfacing, configuration management, or other purposes. Software items are selected based on trade-offs among software function, size, host or target computers, developer, support strategy, plans for reuse, criticality, interface considerations, need to be separately documented and controlled, and other factors. A software item is made up of one or more software units.
Software Item Qualification Testing: The developer shall perform software item qualification testing in accordance with the following requirements. Notes: (1) Software item qualification testing is performed to demonstrate to the acquirer that software item requirements have been met. It covers the software item and interface requirements in the System and Software Requirements Definition (SSRD). This testing contrasts with developer-internal software item testing, performed as the final stage of unit integration and testing. (2) If a software item is developed in multiple builds, its software item qualification testing will not be completed until the final build for that software item, or possibly until later builds involving items with which the software item is required to interface. Software item qualification testing in each build is interpreted to mean planning and performing tests of the current build of each software item to ensure that the software item requirements to be implemented in that build have been met.
Software Unit: An element in the design of a software item; for example, a major subdivision of a software item, a component of that subdivision, a class, object, module, function, routine, or database. Software units may occur at different levels of a hierarchy and may consist of other software units. Software units in the design may or may not have a one-to-one relationship with the code and data entities (routines, procedures, databases, data files, etc.) that implement them or with the computer files containing those entities.
Subsystem: A model element which has the semantics of a package, such that it can contain other model elements, and of a class, such that it has behavior. (The behavior of the subsystem is provided by the classes or other subsystems it contains.) A subsystem realizes one or more interfaces, which define the behavior it can perform.
Sufficiency: A measure of an abstraction's elegance. The degree to which all of an abstraction's information can be accessed in a reasonable amount of time, or with a reasonable amount of knowledge about the domain.
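The 'part of', 'contains', and composite relationship types above can be sketched directly in code. The following hedged Java example uses the glossary's own automobile-engine aggregation; the Engine and Part class names are illustrative assumptions, not prescribed by these guidelines.

import java.util.ArrayList;
import java.util.List;

/** Aggregation ('contains'): an engine is assembled from its parts. */
public class Engine {

    private final List<Part> parts = new ArrayList<>();

    /** Each Part added here stands in a 'part of' relationship to this Engine. */
    public void add(Part part) {
        parts.add(part);
    }

    public int partCount() {
        return parts.size();
    }
}

/** The destination of the 'part of' relationship. */
class Part {
    final String name;

    Part(String name) {
        this.name = name;
    }
}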

Scope: The value of an attribute and whether or not another component of the same type can have the same value. A quality of an attribute.
SDP: Software Development Plan.
Selector: A relational attribute which uniquely identifies an object in the context of the relationship to which it belongs.
Semantic: Semantic equivalence is one component simply representing the state of another component, e.g. a bank account can be represented by an open account.
Source Component: The component that originates the relationship. (See also Destination Component.)
Specialization: Refining a classification by adding additional qualities to it (creating a sub-class). Contrast with Generalization. An operation on an abstraction.
Spiral: A model of system development which repeatedly cycles from function to form, build, test, and back to function.
SSAD: System and Software Architecture Description.
SSRD: System and Software Requirements Definition.
State: A combination of attributes and relationships that affects the behavior of an object and has well-defined transitions to or from other discrete states, e.g. the solvency of a bank account. State may be represented by an attribute.
Super-type: Similar to generalization: creating a less-specific, more inclusive class from a set of subclasses.
System: As an instance, an executable configuration of a software application or software application family; the execution is done on a hardware platform. As a class, a particular software application or software application family that can be configured and installed on a hardware platform. In a general sense, an arbitrary system instance.
System Qualification Testing: The developer shall participate in system qualification testing in accordance with the following requirements. Notes: (1) System qualification testing is performed to demonstrate to the acquirer that system requirements have been met. It covers the system requirements in the System/Subsystem Specification (SSS) and in associated Interface Requirements Specifications (IRSs). This testing contrasts with developer-internal system testing, performed as the final stage of software/hardware item integration and testing. (2) If a system is developed in multiple builds, qualification testing of the completed system will not occur until the final build. System qualification testing in each build is interpreted to mean planning and performing tests of the current build of the system to ensure that the system requirements to be implemented in that build have been met.
Test case: A set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement. [Note: not the same definition as used in IEEE J-016.]
Test class: An optional stereotype of Class in the design model, used only if designing and implementing test-specific functionality.
Test model: A collection of test cases and test procedures and their relationships. A test case can be implemented by one or more test procedures, and a test procedure may implement (the whole or parts of) one or more test cases.
Test procedure: A set of detailed instructions for the set-up, execution, and evaluation of results for a given test case (or set of test cases).
Transitive (Transitivity): The relationship: if A then B, and if B then C; therefore, if A then C.
Trigger: An operation quality. A set of conditions which, when true, cause an event to be sent to stimulate an operation.
Types: Types of components in the same class must share all qualities of other components in that class, e.g. names.
UTCR: Unit Test Completion Review.
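The Test case and Test procedure entries can be illustrated with a small, framework-free Java sketch that reuses the hypothetical BankAccount class from the Encapsulation example earlier in this appendix; a real project would more likely express this in a unit-testing framework. The constants and class name here are assumptions made for illustration only.

/** One test case: fixed inputs, execution conditions, and an expected result. */
public class OverdraftRejectionTest {

    // Test inputs for a single objective: verify that withdrawing more
    // than the balance is rejected.
    static final long OPENING_BALANCE_CENTS = 100;
    static final long WITHDRAWAL_CENTS      = 150;

    /** The test procedure: set-up, execution, and evaluation of results. */
    public static void main(String[] args) {
        BankAccount account = new BankAccount();     // set-up
        account.deposit(OPENING_BALANCE_CENTS);

        boolean rejected = false;
        try {
            account.withdraw(WITHDRAWAL_CENTS);      // execution
        } catch (IllegalArgumentException expected) {
            rejected = true;
        }

        // evaluation against the expected result
        System.out.println(rejected ? "PASS" : "FAIL: overdraft was allowed");
    }
}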


Value Class: A class that is used to represent an atomic value, such as a string or a number.
Waterfall: A development model based on a single sequence of steps.
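A minimal Java sketch of the Value Class entry; the Money class is a hypothetical example of an atomic value and is not prescribed by these guidelines.

/** A value class: represents an atomic value, is immutable, and is
 *  compared by value rather than by object identity. */
public final class Money {

    private final long cents;

    public Money(long cents) {
        this.cents = cents;
    }

    public long cents() {
        return cents;
    }

    @Override
    public boolean equals(Object other) {
        return (other instanceof Money) && ((Money) other).cents == cents;
    }

    @Override
    public int hashCode() {
        return Long.hashCode(cents);
    }
}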

Appendix D. Status Reports

The CS577 [weekly] status reports are shown in this appendix.

Weekly Effort Form

Week #:          Name:          Team:

Record effort against each activity under the appropriate day (Mon, Tue, Wed, Thu, Fri, Sat, Sun):

Management
   Life-Cycle Planning
   Control and Monitoring
   Client Interaction
   Team Interaction
   Project Web Site
   ARB Review
Environment and Configuration Management
   Training and Preparation
   Installation and Administration
   Configuration Management
   Custom Toolsmithing
   COTS Initial Filtering (Assessment)
Requirements
   Win–Win Negotiation
   Prototyping for Requirements
   Modeling for Operational Concept Description
   Documenting of Operational Concept Description
   Modeling for System and Software Requirements Definition
   Documenting of System and Software Requirements Definition
Design
   Modeling for System and Software Architecture Description
   Documenting of System and Software Architecture Description
   COTS Final Selection (Evaluation)
   COTS Tailoring
Implementation
   Component Prototyping
   Critical Component Implementation
   COTS Glue Code Development
Assessment
   Business Case Analysis
   Feasibility Rationale Description
   Test Planning
   Inspections and Peer Reviews
Deployment
   Transition and Support Planning
Other Effort Not Covered Here
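Teams that tally this form in a small tool could represent one member's week with a structure along the following lines. This is only an illustrative sketch; the WeeklyEffort type and its activity categories are assumptions drawn from the form above, not something the guidelines prescribe.

import java.util.EnumMap;
import java.util.Map;

/** One team member's effort for one week, in hours per activity group per day. */
public class WeeklyEffort {

    /** Top-level activity groups of the Weekly Effort Form. */
    public enum Activity {
        MANAGEMENT, ENVIRONMENT_AND_CM, REQUIREMENTS, DESIGN,
        IMPLEMENTATION, ASSESSMENT, DEPLOYMENT, OTHER
    }

    // hours[0] .. hours[6] correspond to Monday .. Sunday
    private final Map<Activity, double[]> hours = new EnumMap<>(Activity.class);

    /** Adds h hours for the given activity on the given day (0 = Monday). */
    public void record(Activity activity, int dayOfWeek, double h) {
        hours.computeIfAbsent(activity, a -> new double[7])[dayOfWeek] += h;
    }

    /** Total hours reported for the week across all activities and days. */
    public double total() {
        double sum = 0;
        for (double[] perDay : hours.values()) {
            for (double h : perDay) {
                sum += h;
            }
        }
        return sum;
    }
}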

Weekly Status Report

Team #:          Team Members:          Project Title:          Week #:

This is a suggested Status Report. Feel free to expand or collapse the various categories.

Progress

- Describe the progress on the project during the past week, both in qualitative and quantitative terms.

- List any significant accomplishments since the last status report. This can include the completion/revision of project deliverables.

- Include metrics to describe the progress.

- Attach a copy of the change history of the major documents and code (easily provided from the configuration management system).

- If problem reports and/or defect/issue lists were exchanged electronically, either within the team or with the IV&V person, please provide a list of the message subjects and dates.

Problems

- List any problems the team is experiencing. This includes things that have caused delays, or any other roadblocks to anticipated progress.

- Provide a weekly Top-N risk items list as follows:

Risk Items        Weekly Ranking (Current / Previous / # Weeks)        Risk Resolution Progress


- Provide a weekly count of open problem reports and issues, as follows:

Type               New² opened    New closed    Old³ closed    Current
Issues⁴
Defects⁵
Problem Reports
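One plausible reading of this table, given footnotes 2 and 3 below, is that each week's Current count carries the previous week's open items forward. The following Java sketch makes that arithmetic explicit; it is an interpretation offered for illustration, not a calculation the guidelines define.

/** Rolls the open-item count forward for one reporting week. */
public final class OpenItemCount {

    /**
     * @param previousCurrent items open ("Current") in the previous report
     * @param newOpened       items generated since the last report (footnote 2)
     * @param newClosed       of those newly opened items, the ones already closed
     * @param oldClosed       previously open ("Old", footnote 3) items closed this week
     * @return items still open at the end of this report
     */
    public static int current(int previousCurrent, int newOpened,
                              int newClosed, int oldClosed) {
        return previousCurrent + newOpened - newClosed - oldClosed;
    }
}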

Immediate Plans

List your team's plans (or to-do items) for the next week. This section can capture significant action items raised during your team status meeting, as a reminder to team members. You are encouraged to maintain meeting minutes separately, to record important discussions and action items raised during your team meetings.

Comments

Include here any items not covered above.

² Problem reports and issues generated since last report.
³ "Old" are those in "Current" in the previous report.
⁴ See QMP for a definition. Usually potential defects identified in other artifacts during a review (i.e., artifacts not the primary focus of the review).
⁵ See QMP for a definition. Usually changes made by the "author" of a reviewed artifact as a result of an "issue", or those identified in an inspection (where a "Defect" is objectively defined and the author concurs with the designation).

