Achieving CMMI Level 5 Improvements with MBASE and the CeBASE Method

Dr. Barry Boehm, Dr. Daniel Port, and Apurva Jain, University of Southern California
Dr. Victor Basili, University of Maryland

Each branch of service in the Department of Defense has major initiatives to pursue more advanced software-intensive systems concepts involving network-centric warfare with self-adaptive networks and cooperating human and autonomous agents. The ability to balance discipline and flexibility is critically important to developing such highly dependable software-intensive systems in an environment of rapid change. Risk-management orientation enables users of Capability Maturity Model® Integration℠ (CMMI℠) to apply risk considerations to determine how much discipline and how much flexibility is enough in a given situation. The risk-driven nature of the spiral model and MBASE enables them to achieve a similar balance of discipline and flexibility. When these project-level approaches are combined with the organization-level approaches in the Experience Factory, the result is the unified Center for Empirically Based Software Engineering (CeBASE) method described in this article.

Recent events in Afghanistan have convincingly demonstrated the value of software and information technology in achieving military superiority. Each of the Department of Defense (DoD) services has major initiatives to pursue even more advanced software-intensive systems concepts involving network-centric warfare with self-adaptive networks and cooperating human and autonomous agents.

However, there are tremendous challenges in providing the software product and process capabilities necessary to realize these concepts. These challenges include security, scalability, interoperability, legacy systems transition, uncontrollable commercial off-the-shelf (COTS) products, and synchronizing dozens if not hundreds of independently evolving systems in a world of increasingly rapid change.

In particular, the processes for managing these complex systems of systems will require both highly disciplined methods to ensure dependable operations and highly flexible methods to adapt to change. Fortunately, DoD's new 5000-series policies on evolutionary acquisition and spiral development provide the acquisition and program management framework to achieve this balance of discipline and flexibility. Also, the recent Capability Maturity Model® (CMM®) Integration℠ (CMMI℠) provides a development framework for integrating software and systems considerations with degrees of freedom for tailoring development processes to achieve appropriate balances of discipline and flexibility. However, these initiatives fall short of providing specific techniques for achieving and maintaining the right balance of discipline and flexibility for a particular program's evolving situation.

In our previous three CrossTalk articles, we provided some specific methods that programs can use to achieve this balance. In "Understanding the Spiral Model as a Tool for Evolutionary Acquisition" [1], we showed how appropriate use of the spiral model enables programs to achieve the flexibility needed for evolutionary acquisition, while applying risk-management principles to retain an appropriate level of program discipline.

In "Balancing Discipline and Flexibility with the Spiral Model and MBASE" [2], we showed how risk considerations could be used to realize appropriate but different process models for different program situations. We also elaborated on some of the specific practices in Model-Based (system) Architecting and Software Engineering (MBASE), such as the use of life-cycle anchor-point milestones to keep the program on track during its evolution.

In "Using the Spiral Model and MBASE to Generate New Acquisition Process Models: SAIV, CAIV and SCQAIV" [3], we showed how programs could use MBASE techniques to avoid many overruns of fixed schedules and budgets. This is done by prioritizing desired features and inverting the development process to deliver the most important features within the available schedule or budget.

MBASE and the CeBASE Method

However, the spiral, MBASE, and Schedule As Independent Variable (SAIV) approaches all operate at the individual project level. This still leaves open the coverage of the counterpart CMMI organization-level process areas, particularly those of achieving continuous improvement of the organization's processes.

In this article, we show how MBASE has been integrated with the University of Maryland's (UMD) organization-level Quality Improvement Paradigm (QIP), Experience Factory (EF), and Goal-Question-Metric (GQM) approaches into a Center for Empirically Based Software Engineering (CeBASE) method, which successfully addresses these challenges. CeBASE is sponsored by the National Science Foundation, NASA, and the DoD, and jointly led by the UMD and the University of Southern California (USC).

As we explored the details of Maryland's QIP, EF, and GQM approaches and USC's MBASE approach, we found that they were expressing very similar principles and practices. The spiral model's initial focus on system objectives was consistent with the QIP's initial focus on organizational and project-specific goals expressed in context using the GQM approach. The EF's focus on organizational learning to understand a system's operational stakeholders and their goals corresponds strongly with MBASE's stakeholder win-win approach to mutual stakeholder understanding and development of a shared system vision.

® Capability Maturity Model, CMM, Software Capability Maturity Model, and SW-CMM are registered in the U.S. Patent and Trademark Office.
℠ CMM Integration and CMMI are service marks of Carnegie Mellon University.


In the next section of this article, we summarize the key principles and practices of the QIP, the EF, and the GQM approaches; provide evidence of their successful application over 25 years of practice in the NASA Goddard-University of Maryland-Computer Sciences Corp. (CSC) Software Engineering Laboratory (SEL); and provide an example of their application at a systems as well as software level. In the section "The CeBASE Method and the CMMI," we present the CeBASE method and show how its process elements cover the process areas of the CMMI. In "Using the CeBASE Method," we show a version of the CeBASE method that has been successfully applied to more than 100 electronic services applications over six years' practice at USC. Our conclusions include a diagram summarizing the process model distinctions among traditional approaches such as the waterfall model and the Software Capability Maturity Model® (SW-CMM®); project-oriented approaches such as the spiral model, MBASE, and the Rational Unified Process (RUP); and integrated project/organization approaches such as the CMMI and the CeBASE method.

The QIP, GQM, and EF Approach

Framework and Methods
Since 1976, the UMD has been collaborating with NASA-Goddard and CSC on the SEL. The UMD and the SEL have developed and refined a series of closed-loop feedback processes that have resulted in significant improvements in software quality across more than 100 large software applications in the last 25 years.

The formulation of these feedback processes is called the QIP [4]. It uses six steps to provide an organized approach to continuous improvement: 1) characterizing the organization, 2) setting goals, 3) choosing and instrumenting an appropriate process, 4) executing and monitoring the process, 5) analyzing the data to identify improvements, and 6) packaging the experience and improvements for future use.

The QIP makes use of the GQM approach, which is a mechanism for defining and evaluating a set of operational goals using measurement [5]. It ensures that your general goals are elaborated into specific questions and metrics for tracking progress and evaluating success, and that your people do not waste effort collecting and analyzing data weakly related to your goals.

The GQM approach can be applied at both the project level and the organization level. The EF [6] provides a consistent way of operating at both levels, as shown in Figure 1. Organization and project goals are determined by involving the relevant success-critical stakeholders in negotiating mutually satisfactory (win-win) and achievable goals.

Figure 1: Experience Factory Framework (organization-level shared vision, improvement strategy, and improvement-initiative planning and control at the top; the experience base of analyzed models and data in the center; project-level shared vision, strategy, and planning and control at the bottom; all linked by progress/plan/goal-mismatch feedback)

For example, the organization may set a goal to reduce its projects' software cycle time by 50 percent. The initial implementing project may set goals and plans to have each project activity reduce its calendar time by 50 percent. As the project proceeds, its progress is monitored for progress/plan/goal mismatches, as shown at the bottom of Figure 1. While design, code, and test planning may finish in 50 percent less time, integration and test may start showing a 50 percent increase rather than decrease in duration. Analyzing this progress/plan/goal mismatch would determine the root cause to be delays due to inadequate test planning and preparation of test tools, test drivers, and test data. Further, shortening the test plan activity had produced no cycle timesaving, as test planning was not on the project's critical path.

The results of this analysis would be fed into the organization's experience base: Future cycle-time reduction strategies should focus on reducing the duration of critical-path activities, and options for doing this include increasing the thoroughness and duration of noncritical-path activities.
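To make the goal-question-metric elaboration concrete, here is a minimal sketch, in Python, of how the cycle-time goal above might be decomposed into questions and metrics. The article describes GQM only in prose, so the data layout, question wording, and metric names here are our illustration rather than a prescribed CeBASE artifact.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str   # what is measured
    unit: str   # how it is expressed

@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: list[Question] = field(default_factory=list)

# The organization-level goal from the article's example, elaborated
# into questions and metrics (hypothetical wording).
goal = Goal(
    "Reduce projects' software cycle time by 50 percent",
    questions=[
        Question(
            "Is each project activity on track for a 50 percent calendar-time reduction?",
            [Metric("activity duration vs. baseline", "calendar days")],
        ),
        Question(
            "Which activities lie on the project's critical path?",
            [Metric("critical-path membership", "yes/no per activity")],
        ),
        Question(
            "Where do progress/plan/goal mismatches occur?",
            [Metric("planned vs. actual duration delta", "percent")],
        ),
    ],
)

for q in goal.questions:
    print(q.text, "->", [m.name for m in q.metrics])
```

The point of the structure is the discipline it enforces: no metric is collected unless it answers a question, and no question is asked unless it serves the stated goal.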
Overall then, as shown in the center of Figure 1, the EF analyzes and synthesizes such kinds of experience, acts as a repository for the experiences, and supplies relevant experience to the organization on demand. The EF packages experience by building informal and formal models and measures of various processes, products, and other forms of knowledge via people, documents, and automated support.

QIP, GQM, and EF in Practice
The application of this integrated set of methods is referred to as the Experience Factory Organization, which resulted in continuous improvement in software quality and cost reduction during the quarter-century life span of the SEL. When measured during three baseline periods in 1987, 1991, and 1995 (each representing about three years of development efforts), the demonstrated improvements included a 75 percent decrease in development defect rates from 1987 to 1991, and a 37 percent decrease from 1991 to 1995. We also observed a reduced development cost of 55 percent from 1987 to 1991 and of 42 percent from 1991 to 1995 [7, 8].

A more detailed example of improvement over time, in Figure 2, shows the defect rates in defects per thousand delivered lines of code (K-DLOC) for similar classes of projects at CSC during the application of the EF concepts. Over time, the defect models became well established and the range of variation (indicated by the upper and lower lines) narrowed, allowing managers to better manage quality [9]. Thus, the EF approach enabled the SEL portion of CSC to achieve SW-CMM Level 5 improvements well before CSC became a Level 5 organization in 1998. With the CMMI's emphasis on measurement and analysis as early as Level 2, the opportunity is here for other organizations to use the EF approach to achieve CMMI Level 5 benefits well before reaching Level 4.

Figure 2: Defect Rate Improvements in Software Engineering Laboratory/Computer Sciences Corporation Projects (errors per K-DLOC for FORTRAN and non-FORTRAN projects plotted by project midpoint, 1986-2000, with narrowing upper and lower bounds around the data points)
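The upper and lower lines of Figure 2 amount to an empirical band around the defect-rate model. Below is a minimal sketch of how such a band could be computed from baseline data; the defect rates and the choice of a two-standard-deviation band are invented for illustration, and the SEL's actual models [7, 8, 9] were considerably richer.

```python
import statistics

# Hypothetical defect rates (errors per K-DLOC) for similar classes of
# projects in two baseline periods; real SEL data appear in [7, 8, 9].
baselines = {
    "1987-1991": [8.9, 7.4, 9.6, 6.8, 8.1],
    "1991-1995": [2.3, 1.8, 2.9, 2.1],
}

for period, rates in baselines.items():
    mean = statistics.mean(rates)
    spread = statistics.stdev(rates)
    # Upper/lower lines analogous to Figure 2's range of variation.
    print(f"{period}: model {mean:.1f} +/- {2 * spread:.1f} errors/K-DLOC")
```

As the band narrows from one baseline period to the next, a project falling outside it becomes a meaningful signal rather than noise, which is what allowed SEL managers to manage quality against the model.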


Various EF concepts have been successfully applied in other organizations, including DaimlerChrysler, Robert Bosch, TRW, and Allianz.

Applying EF Concepts at the Systems Level

Systems-Level Goals and Questions
Many software organizations interpret the EF concepts at just the software level. They miss many opportunities to reap much more significant returns on investment at the systems level. For example, suppose that your software-intensive project has a proposed goal for an improvement initiative to reduce the project's software defect rates. What should be your next step? Usually it would be to look down from the software goal into the software details: How will we define defect? What are the counting rules for overlapping defects? What are our current defect rates?

With EF at the systems level, your next step is to look upward and sideways from the software and ask system-level questions: Why do we want to reduce software defect rates? What system goals are being frustrated by software defects? Where are the frustrations the greatest?

For example, in an operational order-processing system, the answers may be that the software defects are causing 1) too much downtime in the operation's critical path, 2) too many defects in the system's operational plans, and 3) too many new-release operational problems.

These insights enable you to reformulate your improvement initiative goal to decrease the organization's software defect-related losses in operational cost effectiveness. Items one through three become initial high-payoff target sub-goals for the initiative. Given this new goal and context, what should be your next step?

Sub-Goal Level Questions, Models, and Metrics
Again, a good next step is to ask why the software defects are causing operational problems, often with the help of models. For example, Figure 3 shows a critical-path model for analyzing the order-processing downtime and delays caused by software defects. Analyzing this model may lead to several valuable insights, improvement strategies, and system payoffs:
1. Often, major sources of delay are additional manual processing delays caused by software or non-software problems, as with the Scientific American order-processing system discussed in Boehm [10].
2. The logic for packaging and delivery scheduling can become quite complex when only part of an order is in stock. (It is generally okay to send a partial shipment at Amazon.com; however, not for jet engine repair spare parts.) Software defects can again cause considerable operational delays.
3. "Produce status reports" defects should not be on the operational critical path. This module was probably put on the critical path by a programmer's detailed design coupling and cohesion decision without considering its potential system effect, resulting in status-report defects causing order-delivery delays.
4. The overall legacy order-processing system may just be too slow and difficult to modify and should be replaced downstream by a new Web-based order-processing system. It is generally good to be asking why-not as well as why questions.

Figure 3: Order-Processing System Critical Path Model (task flow: Order items, Validate order, Validate items in stock, Schedule packaging/delivery, Produce status reports, Prepare delivery packages, Deliver order)
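To illustrate the kind of analysis behind insight 3 above, the following sketch computes order-delivery time over the Figure 3 tasks with and without "Produce status reports" on the critical path. The durations and the specific dependencies are hypothetical; the article gives the model's structure but no numbers.

```python
# Tasks from the Figure 3 order-processing model, with hypothetical
# durations in hours.
durations = {
    "order items": 1, "validate order": 2, "validate items in stock": 3,
    "schedule packaging/delivery": 2, "produce status reports": 4,
    "prepare delivery packages": 3, "deliver order": 8,
}

def finish_time(task, deps):
    """Earliest finish of `task` given predecessor dependencies."""
    preds = deps.get(task, [])
    start = max((finish_time(p, deps) for p in preds), default=0)
    return start + durations[task]

# As built: status reporting blocks package preparation (insight 3).
as_built = {
    "validate order": ["order items"],
    "validate items in stock": ["validate order"],
    "schedule packaging/delivery": ["validate items in stock"],
    "produce status reports": ["schedule packaging/delivery"],
    "prepare delivery packages": ["produce status reports"],
    "deliver order": ["prepare delivery packages"],
}
# Reworked: status reports run in parallel, off the critical path.
reworked = dict(as_built)
reworked["prepare delivery packages"] = ["schedule packaging/delivery"]

print(finish_time("deliver order", as_built))   # 23 hours end to end
print(finish_time("deliver order", reworked))   # 19 hours end to end
```

With these illustrative numbers, moving the status-report module off the critical path shortens order delivery by four hours per order, a system-level payoff that a purely software-level defect analysis would never surface.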
Putting It All Together
Each of these sub-goal-related initiatives needs to be monitored and controlled with respect to improvement-related metrics such as order-processing cycle time and user satisfaction. The results need to be integrated with other ongoing improvement initiatives to ensure synergy and integration with the overall organizational experience base discussed earlier in the "Framework and Methods" section. The accompanying sidebar shows the resulting systems-level EF-GQM initiative steps.

These system-level EF-GQM approaches are already being practiced by leading-edge software organizations. Two of the recent Institute of Electrical and Electronics Engineers' Software Process Achievement Award winners, Advanced Information Services, Inc. (AIS) and Tinker Air Force Base, are good examples [11, 12].

AIS uses Balanced Scorecard techniques to integrate its software, systems, project, and organizational goals in such areas as customer satisfaction, financial performance, employee growth, process improvement, and organizational learning capability. Specifically, AIS periodically assesses its performance and rate of progress in these areas on a Balanced Scorecard form and uses the results to adjust its improvement efforts in each area.

Tinker has used its software insights to stimulate systems-level initiatives with its counterpart hardware and test organizations to improve system-level cycle time and to deliver quality in such areas as B-2 Test Program Sets.

This kind of approach is what transitioning from the software CMM to the CMMI is all about. It requires software organizations to be more proactive than reactive in interacting with the operational system stakeholders. It gets software people applying their necessary expertise to system issues. It results in much larger bottom-line payoffs for the operational system stakeholders. The next two sections discuss how the CeBASE method integrates software and system-level activities as well as project- and organizational-level activities, and how its practices map to the process areas and practices in the CMMI.


Sidebar: The Systems-Level Experience Factory Goal-Question-Metric Initiative Steps
1. Identify a software-related improvement initiative goal.
2. Relate this to system-level goals: Ask questions about why the initiative is needed.
3. Use the results to identify the related system-level improvement initiative goal and high-payoff sub-goal initiatives.
4. Perform a systems-level root-cause analysis: Construct relevant models, ask questions about why the current-system shortfalls cause problems and whether or not to try alternative system approaches.
5. Identify the system improvement initiative's key stakeholders; achieve a shared vision of and commitment to the initiative goals and strategies.
6. Establish improvement initiative plans and progress metrics for each sub-goal initiative and the overall initiative.
7. Execute, monitor, and control the initiative plans with key stakeholder participation. Feed the resulting experiences into the organization's experience base for future benefits.

The CeBASE Method and the CMMI

The CeBASE Method Framework
Overall, we found that both EF-GQM and MBASE could be integrated into a common CeBASE method. Its framework is organized around a trio of common strategic themes, shown by the vertical pairs in Figure 4. These three themes are the stakeholders' shared vision for the organization or project; risk-driven plans for process, product, and people; and continuous monitoring and control. As seen in Figure 4, these themes express both the operation of EF-GQM at the organizational level and the operation of MBASE-GQM at the project level. Within a large, diverse organization, we may wish to consider a particular set of projects within a portfolio or product line of related products or services.

The CeBASE Method at the Organizational Level
To start at the upper left of Figure 4, the organization's value propositions are often contained in an organizational mission statement. This will cover the organizational stakeholders' agreed-upon win conditions and will be expressed in terms of such Balanced Scorecard elements as customer satisfaction, financial performance, employee growth, process improvement, and organizational learning capability.

Improvement goals and priorities will come from Balanced Scorecard assessments. These might include such goals as reducing software development cycle time or reducing average order-delivery time. The specific quantitative goals, e.g., reduce software development cycle time by 50 percent, would be based on initial cost/value analyses. These are developed using such techniques as the DMR Consulting Group's Results Chains linking improvement initiatives to contributions and benefits-realized outcomes [14], and associated business-case models linking the value of benefits realized to the costs invested in the initiatives (a sketch of such a model appears at the end of this subsection).

The resulting organization (or portfolio) shared vision (OSV) (Figure 4) drives two sets of initiatives. Horizontally in Figure 4, it drives initiatives to improve software cycle time or to reduce order-delivery time across the organization. These initiatives will have strategy elements and their associated organization-level improvement plans (OP), such as reducing delays in order-delivery time due to software defects.

Following the GQM paradigm, the goal of reducing order-delivery time is related to organization plan questions such as, "What is our current record on delivery times?" "What distinguishes orders with significantly better or worse delivery times?" "What are the costs and benefits resulting from an improvement initiative?" These questions are related to metrics such as overall order-delivery time, critical-path task times, costs and benefits of eliminating related software defects, and customer order-delivery satisfaction ratings. These are used to monitor and control organization-level progress (OMC), and to adjust the strategies and goals based on the organizational feedback of progress/plan/goal achievements and mismatches.
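As a toy version of the business-case models mentioned above, the sketch below links the dollars invested in an order-delivery improvement initiative to the value of benefits realized. All figures are invented for illustration; a real Results Chain analysis [14] would also capture the assumptions and intermediate contributions behind each number.

```python
# Hypothetical business case for a "reduce order-delivery time" initiative.
initiative_cost = 400_000          # dollars invested in the initiative
orders_per_year = 50_000
hours_saved_per_order = 6          # from removing defect-driven delays
value_per_order_hour_saved = 0.40  # dollars of benefit per order-hour saved

annual_benefit = (orders_per_year * hours_saved_per_order
                  * value_per_order_hour_saved)
print(f"Annual benefit:       ${annual_benefit:,.0f}")
print(f"Return on investment: {annual_benefit / initiative_cost:.1%} per year")
```

Even a model this crude forces the question the GQM paradigm keeps asking: are the costs and benefits of the initiative actually known, or merely assumed?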
The CeBASE Method at the Project Level
Vertically in Figure 4, the OSV drives the nature of each project's shared vision (PSV) and its associated goals and priorities. Thus, for example, an organizational improvement goal to reduce software development cycle time by 50 percent will be reflected in the project's value propositions and improvement goals, or the organization-project shared vision (O-PSV) (see arrow in Figure 4). This will lead to project-level goals, models, questions, and metrics such as reducing the duration of each project task by 50 percent.

This in turn leads horizontally across the bottom of Figure 4 to project-level plans (PP), project monitoring and control activities (PMC), and to the determination and feedback of project-level progress/plan/goal mismatches. This mismatch feedback could be negative, such as increased integration and test task durations, or it could be positive. For example, the project might incorporate a new in-transit-visibility COTS package for order delivery tracking that both helps in delay diagnosis and improves customer satisfaction by answering questions about delivery delays.

This project feedback propagates upward to the organizational level along all three lines of traceability. The shortfalls or opportunities with respect to the organizational shared vision and goals are fed back along the project OSV (P-OSV) arrow at the left. The corresponding plan-context feedback occurs along the project OP (P-OP) arrow in the center, and the monitoring and control feedback at the right is used to update the organization's detailed experience base on best practices for achieving goals.

Figure 4: The CeBASE Method Framework (organization level: Org-Portfolio Shared Vision (OSV), Org. Plans (OP), and Org. Monitoring and Control (OMC); project level: Project Shared Vision (PSV), Project Plans (PP), and Project Monitoring and Control (PMC); connected by planning-context arrows downward and progress/plan/goal-mismatch feedback, with its shortfalls, opportunities, and risks, upward. LCO: Life-Cycle Objectives; LCA: Life-Cycle Architecture; IOC: Initial Operational Capability; T/S: Transition/Support)

As a final observation, note that the content of the PP element consists of the Spiral/MBASE Life-Cycle Objectives (LCO), Life-Cycle Architecture (LCA), and Initial Operational Capability (IOC) anchor-point milestone content that we discussed in our May 2001 and December 2001 CrossTalk articles [1, 2]. Thus, the Spiral/MBASE guidelines in those articles have become the project-level guidelines for the CeBASE method. A detailed example of these guidelines is shown next.

CeBASE Guidelines Example: Shared Vision
The CeBASE project-level and organization-level shared vision guidelines are quite similar. Their main difference is one of context: The project-level shared vision has the organization-level shared vision as context and shows traceability to it, but not vice versa. Figure 5 shows the table of contents and example text from the project-level shared vision guidelines. In the CeBASE method, it is the first item to be drafted by the project's Integrated Product Team of success-critical stakeholders or its equivalent. It sets the stage for subsequent Inception Phase prototyping and stakeholder win-win requirements negotiation.

The shared vision guidelines are adopted from best commercial practices in ways that apply to public service applications as well. The system capability "elevator" description comes from Geoffrey Moore's classic Crossing the Chasm [13]. The "Benefits Realized" and "Results Chain" sections are adapted from the DMR Consulting Group's Benefits Realization Approach [14]. The Results Chain identifies the full set of initiatives necessary to realize the proposed system's benefits; this also identifies the full set of success-critical stakeholders who should be involved in the system's definition. The current version of the guidelines is at .

Figure 5: Example CeBASE System-Level Shared Vision Content

Table of Contents
2. Shared Vision
  2.1 System Capability Description
    2.1.1 Benefits Realized
    2.1.2 Results Chain
  2.2 Key Stakeholders
  2.3 System Boundary and Environment
  2.4 Major Project Constraints
  2.5 Top-Level Business Case
  2.6 Inception Phase Plan and Required Resources
  2.7 Initial Spiral Objectives, Constraints, Alternatives, and Risks

2.1 System Capability Description
A concise description of the system that can pass the "elevator test" described in Geoffrey Moore's Crossing the Chasm [13]. This would enable you to explain why the system should be built to an executive while riding up or down an elevator. It should take the following form:
• For (target customer)
• Who (statement of the need or opportunity)
• The (product name) is a (product category)
• That (statement of key benefit, that is, compelling reason to buy)
• Unlike (primary competitive alternative)
• Our product (statement of primary differentiation)

Here is an example for a corporate order-entry system: "Our sales people need a faster, more integrated order-entry system to increase sales and customer satisfaction. Our proposed Web order system would give us an e-commerce order-entry system similar to Amazon.com's that will fit the special needs of ordering mobile homes and their after-market components. Unlike the template-based system our main competitor bought, ours would be faster, more user friendly, and better integrated with our order fulfillment system."

Common Pitfalls:
• Not relating the need or opportunity to the goals in the organization's Shared Vision.
• Being too verbose about "our product" or its key benefits.

2.2 Key Stakeholders
Identify each stakeholder by their home organization, their authorized representative for project activities, and their relation to the Results Chain. The four classic stakeholders are the software/IT system's users, customers, developers, and maintainers. Additional stakeholders may be system interfacers, subcontractors, suppliers, venture capitalists, independent testers, and the general public (where safety or information protection issues may be involved).

Common Pitfalls:
• Being too pushy or not pushy enough in getting your immediate clients to involve the other success-critical stakeholders. Often, this involves fairly delicate negotiations among operational organizations. If things are going slowly and you are on a tight schedule, seek the help of your higher-level managers.
• Accepting just anybody as an authorized stakeholder representative. You don't want the stakeholder organization to give you somebody they feel they can live without. Some good criteria for effective stakeholder representatives are that they be empowered, representative, knowledgeable, collaborative, and committed.

CeBASE Method Coverage of the CMMI

Example Mapping: Requirements Development
To test its coverage of critical issues, we have done a mapping of the CeBASE method onto the CMMI's 24 process areas using the CMMI summary tables in Ahern, et al. [15]. A mapping example is provided in a longer version of this paper available at .

Overall, the mapping indicated that the CeBASE method covered the CMMI goals and practices well. It provided the CeBASE team with some action items to address missing elements covered in the CMMI. Most significantly, though, it identified items that we have found important to software and that were missing in the CMMI. These included a business case justifying the need for required features, having a stakeholder win-win prioritization of requirements (for coping with new requirements and fixed budgets or schedules), coverage of project requirements (required platforms, resource constraints), level-of-service requirements (the -ilities), and evolution requirements (to avoid point-solution architectures).

Overall CeBASE Method Coverage of the CMMI
Overall, we found not only a strong correspondence but also an almost complete coverage of the CMMI's practices by the organizational and project components of the CeBASE method. We are extending the CeBASE method to cover the specific CMMI processes not currently covered. A summary of the percentage of the CMMI process areas covered by the CeBASE method is shown in Table 1.

Table 1: CeBASE Method Coverage of CMMI (all amounts are percentages)

Process Management
• Organizational Process Focus: 100+
• Organizational Process Definition: 100+
• Organizational Training: 100-
• Organizational Process Performance: 100-
• Organizational Innovation and Deployment: 100+

Project Management
• Project Planning: 100
• Project Monitoring and Control: 100+
• Supplier Agreement Management: 50-
• Integrated Project Management: 100-
• Risk Management: 100
• Integrated Teaming: 100
• Quantitative Project Management: 70-

Engineering
• Requirements Management: 100
• Requirements Development: 100+
• Technical Solution: 60+
• Product Integration: 70+
• Verification: 70-
• Validation: 80+

Support
• Configuration Management: 70-
• Process/Product Quality Assurance: 70-
• Measurement and Analysis: 100-
• Decision Analysis and Resolution: 100-
• Organizational Environment for Integration: 80-
• Causal Analysis and Resolution: 100

The "+" annotations in Table 1 indicate that the CeBASE method's coverage goes considerably beyond that of the CMMI. For example, it covers not just an organizational process focus but also an organizational product and people focus. The "-" annotations in Table 1 indicate that some areas in the CeBASE method still remain to be fleshed out, such as detailed guidelines for organizational training plans, although they are covered in principle.


The CeBASE method also provides a prescriptive approach for an organization to use in tailoring the CMMI's generic practices to its particular culture, environment, and value propositions. Thus, an e-commerce organization's value propositions (rapid time to market, rapid adaptation to change) will cause it to adopt more flexible processes. However, such elements as the anchor-point milestones will balance this flexibility with sufficient discipline to keep the overall process under control. The value propositions of an organization developing safety-critical products or services will cause it to emphasize more rigorous specifications, processes, and practices, but in ways that enable it to cope with rapid change. Examples include capturing evolution requirements, designing systems to accommodate future change, building in buffer periods to synchronize and stabilize processes [16], or adapting to potential schedule or budget slips by dropping lower-priority product features [3].

Another point worth emphasizing is that the EF component of the CeBASE method supports a continuous vs. staged approach to process improvement. You do not need to be a CMM Level 4 organization to begin realizing significant benefits from organizational innovation or causal analysis.

Using the CeBASE Method

Since 1996, we have been applying the EF and GQM approaches to improve the project-oriented MBASE aspects of the CeBASE method by using them to improve an annual series of USC electronic services projects. These are developed using annually improved MBASE guidelines by teams of five master's-level students as developers, and staff members of USC's Information Services Division as clients (customers, users or user representatives, and maintainers). Each year, we have 15 to 20 teams execute the MBASE inception and elaboration phases in the fall semester to develop and validate life-cycle architecture packages for USC electronic services application candidates. The top six to 10 of these applications are then selected for spring semester teams who execute the MBASE construction and transition phases and deliver initial operational capability application systems.

Our shared vision for the USC Center for Software Engineering's research and education goals incorporates the win conditions of not only our students and their project clients, but also other stakeholders such as the center's staff and prospective technology users, represented by our 35 industry and government affiliate organizations [17]. Our questions and metrics include stakeholder critiques of each project and extensive instrumentation of the projects' effort, schedule, quality, productivity, and behavioral characteristics [18, 19, 20].

Table 2 summarizes four years' experience to date in applying and refining CeBASE on an annual selection of real-client projects.

Table 2: Annual University of Southern California E-Services Project Outcomes

                     1996-1997   1997-1998   1998-1999   1999-2000
LCA Teams                   15          16          19          22
Failing LCO                27%         25%          5%          5%
Failing LCA                 0%          0%          0%          0%
LCA Client Score          4.46        4.67        4.74        4.48
IOC Teams                    6           5           6           8
Failing IOC                16%          0%          0%         12%
IOC Client Score           n/a        4.15         4.3        4.75
IOC Regularly Used         16%         60%         50%         62%

A few explanatory comments on Table 2 are in order. The number of LCA teams is larger than the number of IOC teams because USC's fall course is a core course for the USC master of science degree in computer science and has a much larger enrollment than the spring course, which is only required for a few specialization areas.

In 1996-97, the subset of projects to be continued in the spring was primarily those having students continuing from the fall course. After we found that most of the 1996-97 products went unused, we performed a critical success factor analysis and determined a set of spring project selection criteria (e.g., library commitment to product use, empowered clients) that increased the project adoption rate. Even then, unforeseen circumstances such as the inevitable changes in library infrastructure and organizational responsibilities have caused some applications' usage commitments to be overtaken by events. This is a frequent phenomenon for electronic services applications [21].

In general, the EF improvements on MBASE have effected a uniform improvement in outcome, but there are some anomalies. For example, the 12 percent of projects failing IOC in 1999-2000 were due to a team who botched their product transition when their client was unexpectedly called out of town during the transition period. Another example was our introduction of midcourse client briefings on core capability expectations in 1999-2000 as part of our SAIV process [3]. SAIV only guarantees the delivery of a highest-priority core capability set of features, with further features added as time is available (a simple sketch of this prioritization appears at the end of this section). While this resulted in early client disappointments at LCA, where client success scores dropped from 4.74 in 1998-1999 to 4.48 in 1999-2000, there was a dramatic increase in the clients' success score for the delivered product (4.3 in 1998-1999 to 4.75 in 1999-2000).

The 1998-1999 improvement in the "Failing LCO" criterion shown in Table 2 resulted primarily from our introduction of a simplifier and complicator (S&C) expectations management activity. This helped the developers to have a better understanding of the system and the stakeholders by leveraging an experience base of designs that help simplify the architecture (the simplifiers) and applying a risk-driven approach to the architectural areas that may cause significant complications (the complicators). Involving the clients in risk management activities throughout the process clearly contributed to their rating virtually all delivered applications as highly satisfactory.
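The SAIV mechanism discussed above can be sketched as a simple schedule-constrained prioritization: the highest-priority core capabilities are guaranteed within the fixed schedule, and lower-priority features are deferred until time remains. The feature names and effort estimates below are hypothetical; the full SAIV process [3] also covers estimation, architecting for ease of dropping features, and expectations management.

```python
# Features ordered by stakeholder win-win priority, highest first,
# with hypothetical effort estimates in team-weeks.
features = [
    ("order entry", 4),
    ("order tracking", 3),
    ("status reports", 2),
    ("usage analytics", 3),
]
schedule_budget = 10  # the fixed schedule, in team-weeks

core, deferred, used = [], [], 0
for name, effort in features:
    if used + effort <= schedule_budget:
        core.append(name)        # guaranteed within the fixed schedule
        used += effort
    else:
        deferred.append(name)    # added later only as time allows

print("Guaranteed core capability:", core)   # 9 of 10 team-weeks committed
print("Deferred features:", deferred)
```

The inversion is the point: instead of fixing the feature set and letting the schedule slip, the schedule is fixed and the feature set flexes, which is what traded the early LCA disappointment for the higher delivered-product scores in Table 2.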


Conclusions

Figure 6 summarizes the distinctions among maturity models such as the SW-CMM and the CMMI; process models such as the waterfall model; and process model generators such as MBASE, RUP, and the CeBASE method. It shows where each model fits with respect to organizational focus (project vs. organization), application focus (software vs. system), and operational focus (practice vs. assessment).

Figure 6: Process Model Coverage Distinctions
• Software focus, project level: Software CMM (assessment); waterfall, early incremental (practice)
• Software focus, organization level: Software CMM (assessment); EF, GQM (practice)
• Systems focus, project level: CMMI (assessment); spiral, MBASE, RUP (practice)
• Systems focus, organization level: CMMI (assessment); CeBASE method (practice)

From Figure 6, we can see that the SW-CMM covers both project and organization considerations, but it has shortfalls in both application focus (software, not systems) and operational focus (assessment, not practice). We can also see that solutions focused on redressing one of the two shortfall dimensions will still have shortfalls of their own in another dimension. Thus, the CMMI redresses the systems shortfall in the SW-CMM, but it still has the shortfall of providing explicit guidelines for assessment but not for project practices. While MBASE and RUP provide explicit guidelines for project practices, they do not provide counterparts for an organization's practices. However, the combination of the CMMI and the CeBASE method covers all aspects of operational, organizational, and application focus.

In terms of future software-intensive system challenges, the ability to balance discipline and flexibility is critically important to the development of highly dependable software-intensive systems in an environment of rapid change. The CMMI's risk-management orientation enables its users to apply risk considerations to determine how much discipline and how much flexibility is enough in a given situation. The risk-driven nature of the spiral model and MBASE enables them to achieve a similar balance of discipline and flexibility. When these project-level approaches are combined with the organization-level approaches in the EF, the result is the unified CeBASE method summarized in the section "The CeBASE Method and the CMMI." It currently implements most of the CMMI, is being extended to cover the full CMMI, and has a strong track record of continuous process improvement at USC's and UMD's software engineering laboratories and industry adopters elsewhere.◆

Acknowledgements
We would like to acknowledge the support of the National Science Foundation in establishing CeBASE, the DoD Software Intensive Systems Directorate in supporting its application to DoD projects and organizations, and the affiliates of the USC Center for Software Engineering and the University of Maryland's software engineering program for their contributions to MBASE and CeBASE.

References
1. Boehm, B., and W. Hansen. "Understanding the Spiral Model as a Tool for Evolutionary Acquisition." CrossTalk May 2001.
2. Boehm, B., and D. Port. "Balancing Discipline and Flexibility with the Spiral Model and MBASE." CrossTalk Dec. 2001.
3. Boehm, B., D. Port, L. Huang, and A. W. Brown. "Using the Spiral Model and MBASE to Generate New Acquisition Process Models: SAIV, CAIV, and SCQAIV." CrossTalk Jan. 2002.
4. Basili, Victor R., and Gianluigi Caldiera. "Improve Software Quality by Reusing Knowledge and Experience." Sloan Management Review 37.1 (1995).
5. Basili, Victor R., Gianluigi Caldiera, and H. D. Rombach. "The Goal Question Metric Approach." Encyclopedia of Software Engineering. Ed. J. Marciniak. Wiley, 1994.
6. Basili, Victor R., Gianluigi Caldiera, and H. D. Rombach. "The Experience Factory." Encyclopedia of Software Engineering. Ed. J. Marciniak. Wiley, 1994.
7. Basili, Victor R., Gianluigi Caldiera, Frank McGarry, Rose Pajerski, Gerald Page, and Sharon Waligora. "The Software Engineering Laboratory: An Operational Software Experience Factory." 14th International Conference on Software Engineering, May 1992.
8. Basili, Victor R., Marvin Zelkowitz, Frank McGarry, Jerry Page, Sharon Waligora, and Rose Pajerski. "Special Report: SEL's Software Process-Improvement Program." IEEE Software 12.6 (1995): 83-87.
9. McGarry, F. "What Is a Level 5 Organization? Lessons from 10 Years of Process Improvement Experiences at CSC." Proceedings of the Twenty-Sixth NASA Software Engineering Workshop, Nov. 2001.
10. Boehm, B. Software Engineering Economics. Prentice Hall, 1981.
11. Ferguson, P., et al. "Software Process Improvement Works!" Advanced Information Services, Inc. CMU/SEI-99-TR-027, Nov. 1999.
12. Butler, K., and W. Lipke. "Software Process Achievement at Tinker Air Force Base, Oklahoma." CMU/SEI-2000-TR-014, Sept. 2000.
13. Moore, Geoffrey. Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers. New York: Harper Business, 1991: 161.
14. Thorp, J., and DMR Consulting Group. The Information Paradox. McGraw Hill, 1998.
15. Ahern, D., A. Clouse, and R. Turner. CMMI Distilled. Addison Wesley, 2001.
16. Cusumano, Michael A., and Richard W. Selby. Microsoft Secrets: How the World's Most Powerful Software Company Creates Technology, Shapes Markets, and Manages People. New York: Simon & Schuster, 1996.
17. Boehm, B., A. Egyed, D. Port, A. Shah, J. Kwan, and R. Madachy. "A Stakeholder Win-Win Approach to Software Engineering Education." Annals of Software Engineering 6 (1998): 295-321.
18. Egyed, A., and B. Boehm. "Comparing Software System Requirements Negotiation Patterns." Systems Engineering 2.1 (1999): 1-14.
19. Port, D., and B. Boehm. "Introducing Risk Management Techniques Within Project-Based Software Engineering Courses." Computer Science Education 2002 (to appear).
20. Majchrzak, A., and C. Beath. "A Framework for Studying Learning and Participation in Software Development Projects." Management Information Systems Quarterly, under review.
21. Boehm, B. "Project Termination Doesn't Equal Project Failure." IEEE Computer Sept. 2000: 94-96.


About the Authors

Barry Boehm, Ph.D., is the TRW professor of software engineering and director of the Center for Software Engineering at the University of Southern California. He was previously in technical and management positions at General Dynamics, Rand Corp., TRW, the Defense Advanced Research Projects Agency, and the Office of the Secretary of Defense as the director of the Defense Research and Engineering Software and Computer Technology Office. Dr. Boehm originated the spiral model, the Constructive Cost Model, and the stakeholder win-win approach to software management and requirements negotiation.

University of Southern California
Center for Software Engineering
Los Angeles, CA 90089-0781
Phone: (213) 740-8163
Fax: (213) 740-4927
E-mail: [email protected]

Daniel Port, Ph.D., is a research assistant professor of Computer Science and an associate of the Center for Software Engineering at the University of Southern California (USC). Dr. Port's previous positions were assistant professor of Computer Science at Columbia University, director of Technology at the USC Annenberg Center EC2 Technology Incubator, co-founder of Tech Tactics, Inc., and a project lead and technology trainer for NeXT Computers, Inc. He received a doctorate from the Massachusetts Institute of Technology in applied mathematics with an emphasis on theoretical computer science in 1994 and a bachelor's degree in mathematics from the University of California in Los Angeles.

University of Southern California
Center for Software Engineering
Los Angeles, CA 90089-0781
Phone: (213) 740-7275
Fax: (213) 740-4927
E-mail: [email protected]

Apurva Jain is a doctoral student at the University of Southern California's Center for Software Engineering. Previously he was a project manager at SpruceSoft Inc. His research interests are software process management and pervasive computing. He received a bachelor's degree from Curtin University of Technology, Perth, Australia, and a professional honors diploma from Informatics, Singapore.

University of Southern California
Center for Software Engineering
941 W. 37th Place, SAL 329
Los Angeles, CA 90089-0781
Phone: (213) 740-6505
Fax: (213) 740-4927
E-mail: [email protected]

Victor R. Basili, Ph.D., is a professor of Computer Science at the University of Maryland, the Executive Director of the Fraunhofer Center, Maryland, and one of the founders and principals in the Software Engineering Laboratory. He works on measuring, evaluating, and improving the software development process and product and has consulted for many organizations. Dr. Basili is a recipient of a 1989 NASA Group Achievement Award, a 1990 NASA/GSFC Productivity Improvement and Quality Enhancement Award, the 1997 Award for Outstanding Achievement in Mathematics and Computer Science by the Washington Academy of Sciences, and the 2000 Outstanding Research Award from the ACM Special Interest Group on Software Engineering.

Computer Science Department
4111 AV Williams Building
University of Maryland
College Park, MD 20742
Phone: (301) 405-2668
Fax: (301) 405-2691
E-mail: [email protected]
