Process Areas by Category

Process Management
OPD Organizational Process Definition
OPF Organizational Process Focus
OPM Organizational Performance Management
OPP Organizational Process Performance
OT Organizational Training

Project Management
IPM Integrated Project Management
PMC Project Monitoring and Control
PP Project Planning
QPM Quantitative Project Management
REQM Requirements Management
RSKM Risk Management
SAM Supplier Agreement Management

Engineering
PI Product Integration
RD Requirements Development
TS Technical Solution
VAL Validation
VER Verification

Support
CAR Causal Analysis and Resolution
CM Configuration Management
DAR Decision Analysis and Resolution
MA Measurement and Analysis
PPQA Process and Product Quality Assurance

Generic Goals and Practices
GG1 Achieve Specific Goals
GP 1.1 Perform Specific Practices
GG2 Institutionalize a Managed Process
GP 2.1 Establish an Organizational Policy
GP 2.2 Plan the Process
GP 2.3 Provide Resources
GP 2.4 Assign Responsibility
GP 2.5 Train People
GP 2.6 Control Work Products
GP 2.7 Identify and Involve Relevant Stakeholders
GP 2.8 Monitor and Control the Process
GP 2.9 Objectively Evaluate Adherence
GP 2.10 Review Status with Higher Level Management
GG3 Institutionalize a Defined Process
GP 3.1 Establish a Defined Process
GP 3.2 Collect Process Related Experiences

CMMI® for Development
Third Edition


The SEI Series in Software Engineering

Visit informit.com/sei for a complete list of available products.

The SEI Series in Software Engineering is a collaborative undertaking of the Carnegie Mellon Software Engineering Institute (SEI) and Addison-Wesley to develop and publish books on software engineering and related topics. The common goal of the SEI and Addison-Wesley is to provide the most current information on these topics in a form that is easily usable by practitioners and students.

Books in the series describe frameworks, tools, methods, and technologies designed to help organizations, teams, and individuals improve their technical or management capabilities. Some books describe processes and practices for developing higher-quality software, acquiring programs for complex systems, or delivering services more effectively. Other books focus on software and system architecture and product-line development. Still others, from the SEI’s CERT Program, describe technologies and practices needed to manage software and network security risk. These and all books in the series address critical problems in software engineering for which practical solutions are available.

CMMI® for Development
Guidelines for Process Integration and Product Improvement

Third Edition


Mary Beth Chrissis
Mike Konrad
Sandy Shrum

Upper Saddle River, NJ • Boston • Indianapolis • San Francisco • New York • Toronto • Montreal • London • Munich • Paris • Madrid • Capetown • Sydney • Tokyo • Singapore • Mexico City

Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those designations appear in this book, and the publisher was aware of a trademark claim, the designations have been printed with initial capital letters or in all capitals.

CMM, CMMI, Capability Maturity Model, Capability Maturity Modeling, Carnegie Mellon, CERT, and CERT Coordination Center are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

ATAM; Architecture Tradeoff Analysis Method; CMM Integration; COTS Usage-Risk Evaluation; CURE; EPIC; Evolutionary Process for Integrating COTS Based Systems; Framework for Software Product Line Practice; IDEAL; Interim Profile; OAR; OCTAVE; Operationally Critical Threat, Asset, and Vulnerability Evaluation; Options Analysis for Reengineering; Personal Software Process; PLTP; Product Line Technical Probe; PSP; SCAMPI; SCAMPI Lead Appraiser; SCAMPI Lead Assessor; SCE; SEI; SEPG; Team Software Process; and TSP are service marks of Carnegie Mellon University.

Special permission to reproduce portions of CMMI for Development (CMU/SEI-2010-TR-035), © 2010 by Carnegie Mellon University, has been granted by the Software Engineering Institute.

The authors and publisher have taken care in the preparation of this book, but make no expressed or implied warranty of any kind and assume no responsibility for errors or omissions. No liability is assumed for incidental or consequential damages in connection with or arising out of the use of the information or programs contained herein.

The publisher offers excellent discounts on this book when ordered in quantity for bulk purchases or special sales, which may include electronic versions and/or custom covers and content particular to your business, training goals, marketing focus, and branding interests. For more information, please contact:

U.S. Corporate and Government Sales
(800) 382-3419
[email protected]

For sales outside the United States, please contact:

International Sales
[email protected]

Visit us on the Web: informit.com/aw

Library of Congress Cataloging-in-Publication Data

Chrissis, Mary Beth.
CMMI for development : guidelines for process integration and product improvement / Mary Beth Chrissis, Mike Konrad, Sandy Shrum.—3rd ed.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-321-71150-2 (hardcover : alk. paper)
1. Capability maturity model (Computer software) 2. Software engineering. 3. Production engineering. 4. Manufacturing processes. I. Konrad, Mike. II. Shrum, Sandy. III. Title.
QA76.758.C518 2011
005.1—dc22
2010049515

Copyright © 2011 Pearson Education, Inc.

All rights reserved. Printed in the United States of America. This publication is protected by copyright, and permission must be obtained from the publisher prior to any prohibited reproduction, storage in a retrieval system, or transmission in any form or by any means, electronic, mechanical, photocopying, recording, or likewise. For information regarding permissions, write to:

Pearson Education, Inc.
Rights and Contracts Department
501 Boylston Street, Suite 900
Boston, MA 02116
Fax: (617) 671-3447

ISBN-13: 978-0-321-71150-2
ISBN-10: 0-321-71150-5

Text printed in the United States on recycled paper at Courier in Westford, Massachusetts.
First printing, March 2011


This book is dedicated to Watts Humphrey, in appreciation for all he accomplished as a leader, visionary, and teacher. You only needed to be in a room with Watts Humphrey a short time to realize what a special person he was. Watts' leadership, vision, and insights helped many over his lifetime. He was a student of learning and he shared that quest for learning with everyone with whom he came into contact. He had a vision that he shared with the world and the world became a better place. CMMI would not have been possible without Watts Humphrey. May he continue to inspire us for years to come.



CONTENTS

LIST OF PERSPECTIVES xiii
PREFACE xv
BOOK ACKNOWLEDGMENTS xxi

PART ONE—ABOUT CMMI FOR DEVELOPMENT 1

1 INTRODUCTION 3
  About Process Improvement 4
  About Capability Maturity Models 9
  Evolution of CMMI 10
  CMMI Framework 14
  CMMI for Development 18
2 PROCESS AREA COMPONENTS 19
  Core Process Areas and CMMI Models 19
  Required, Expected, and Informative Components 19
    Required Components 19
    Expected Components 20
    Informative Components 20


  Components Associated with Part Two 20
    Process Areas 20
    Purpose Statements 22
    Introductory Notes 22
    Related Process Areas 22
    Specific Goals 22
    Generic Goals 23
    Specific Goal and Practice Summaries 23
    Specific Practices 23
    Example Work Products 24
    Subpractices 24
    Generic Practices 24
    Generic Practice Elaborations 25
    Additions 25
  Supporting Informative Components 25
    Notes 25
    Examples 25
    References 26

  Numbering Scheme 26
  Typographical Conventions 27
3 TYING IT ALL TOGETHER 31
  Understanding Levels 31
  Structures of the Continuous and Staged Representations 32
  Understanding Capability Levels 34
    Capability Level 0: Incomplete 35
    Capability Level 1: Performed 35
    Capability Level 2: Managed 35
    Capability Level 3: Defined 35
    Advancing Through Capability Levels 36
  Understanding Maturity Levels 41
    Maturity Level 1: Initial 42
    Maturity Level 2: Managed 42
    Maturity Level 3: Defined 43
    Maturity Level 4: Quantitatively Managed 43
    Maturity Level 5: Optimizing 44
    Advancing Through Maturity Levels 45

  Process Areas 46
  Equivalent Staging 49
  Achieving High Maturity 52
4 RELATIONSHIPS AMONG PROCESS AREAS 59
  Process Management 60
    Basic Process Management Process Areas 60
    Advanced Process Management Process Areas 62
  Project Management 64
    Basic Project Management Process Areas 64
    Advanced Project Management Process Areas 66
  Engineering 68
    Recursion and Iteration of Engineering Processes 74
  Support 77
    Basic Support Process Areas 78
    Advanced Support Process Areas 79
5 USING CMMI MODELS 85
  Adopting CMMI 90
  Your Process Improvement Program 94
  Selections that Influence Your Program 98
  CMMI Models 99
  Interpreting CMMI When Using Agile Approaches 100
  Using CMMI Appraisals 104
  Appraisal Requirements for CMMI 105
  SCAMPI Appraisal Methods 105
  Appraisal Considerations 106
  CMMI Related Training 107
6 ESSAYS AND CASE STUDIES 113
  Case Studies 113
  Essays 137

PART TWO—GENERIC GOALS AND GENERIC PRACTICES, AND THE PROCESS AREAS 163

GENERIC GOALS AND GENERIC PRACTICES 165
CAUSAL ANALYSIS AND RESOLUTION 233
CONFIGURATION MANAGEMENT 243
DECISION ANALYSIS AND RESOLUTION 257
INTEGRATED PROJECT MANAGEMENT 267
MEASUREMENT AND ANALYSIS 287
ORGANIZATIONAL PROCESS DEFINITION 303
ORGANIZATIONAL PROCESS FOCUS 317
ORGANIZATIONAL PERFORMANCE MANAGEMENT 331
ORGANIZATIONAL PROCESS PERFORMANCE 351
ORGANIZATIONAL TRAINING 365
PRODUCT INTEGRATION 377
PROJECT MONITORING AND CONTROL 393

PROJECT PLANNING 403
PROCESS AND PRODUCT QUALITY ASSURANCE 425
QUANTITATIVE PROJECT MANAGEMENT 433
REQUIREMENTS DEVELOPMENT 455
REQUIREMENTS MANAGEMENT 473
RISK MANAGEMENT 481
SUPPLIER AGREEMENT MANAGEMENT 497
TECHNICAL SOLUTION 509
VALIDATION 531
VERIFICATION 541

PART THREE—THE APPENDICES 553

A REFERENCES 555
B ACRONYMS 561
C CMMI VERSION 1.3 PROJECT PARTICIPANTS 565
D GLOSSARY 573

BOOK CONTRIBUTORS 605

INDEX 623




PERSPECTIVES

LOOKING AHEAD Watts Humphrey 5
CMMI: INTEGRATION AND IMPROVEMENT CONTINUES Bob Rassa 12
THE ARCHITECTURE OF THE CMMI FRAMEWORK Roger Bate with a postscript by Mike Konrad 14
APPLYING PRINCIPLES OF EMPIRICISM Victor R. Basili, Kathleen C. Dangle, and Michele A. Shaw 37
USING PROCESS PERFORMANCE BASELINES AND PROCESS PERFORMANCE MODELS TO ENABLE SUCCESS Michael Campo, Neal Mackertich, and Peter Kraus 53
EXPANDING CAPABILITIES ACROSS THE "CONSTELLATIONS" Mike Phillips 72
MEASUREMENT MAKES IMPROVEMENT MEANINGFUL David N. Card 75
PEOPLE, PROCESS, TECHNOLOGY, AND CMMI Gargi Keeni 80
THE ROLE OF PROCESS STANDARDS IN PROCESS DEFINITION James W. Moore 85
EXECUTIVE RESPONSIBILITIES IN PROCESS IMPROVEMENT Bill Curtis 91


IMPLEMENTING ENGINEERING CULTURE FOR SUCCESSFUL PROCESS IMPROVEMENT Tomoo Matsubara 95
PROCESS IMPROVEMENT IN A SMALL COMPANY Khaled El Emam 101
IMPROVING INDUSTRIAL PRACTICE Hans-Jürgen Kugler 107
SWITCHING TRACKS: FINDING THE RIGHT WAY TO GET TO MATURITY LEVEL 2 Heather Oppenheimer and Steve Baldassano 113
USING CMMI AS A BASIS FOR COMPETITIVE ADVANTAGE AT ABB Aldo Dagnino 123
LEVERAGING CMMI AND LSS TO ACCELERATE THE PROCESS IMPROVEMENT JOURNEY Kileen Harrison and Anne Prem 130
GETTING STARTED IN PROCESS IMPROVEMENT: TIPS AND PITFALLS Judah Mogilensky 137
AVOIDING TYPICAL PROCESS IMPROVEMENT PITFALLS Pat O'Toole 141
TEN MISSING LINKS TO CMMI SUCCESS Hillel Glazer 146
ADOPTING CMMI: HIRING A CMMI CONSULTANT OR LEAD APPRAISER Rawdon Young, Will Hayes, Kevin Schaaff, and Alexander Stall 152
FROM DOUBTER TO BELIEVER: MY JOURNEY TO CMMI Joseph Morin 157

PREFACE

CMMI (Capability Maturity Model Integration) models are collections of best practices that help organizations to improve their processes. These models are developed by product teams with members from industry, government, and the Software Engineering Institute (SEI). This model, called CMMI for Development (CMMI-DEV), provides a comprehensive integrated set of guidelines for developing products and services.

Purpose

The CMMI-DEV model provides guidance for applying CMMI best practices in a development organization. Best practices in the model focus on activities for developing quality products and services to meet the needs of customers and end users.

The CMMI-DEV V1.3 model is a collection of development best practices from government and industry that is generated from the CMMI V1.3 Architecture and Framework.1 CMMI-DEV is based on the CMMI Model Foundation or CMF (i.e., model components common

1. The CMMI Framework is the basic structure that organizes CMMI components and combines them into CMMI constellations and models.



to all CMMI models and constellations2) and incorporates work by development organizations to adapt CMMI for use in the development of products and services.

Model Acknowledgments

Many talented people were involved in the development of the V1.3 CMMI Product Suite. Three primary groups were the CMMI Steering Group, Product Team, and Configuration Control Board (CCB).

The Steering Group guided and approved the plans of the Product Team, provided consultation on significant CMMI project issues, and ensured involvement from a variety of interested communities. The Steering Group oversaw the development of the Development constellation, recognizing the importance of providing best practices to development organizations.

The Product Team wrote, reviewed, revised, discussed, and agreed on the structure and technical content of the CMMI Product Suite, including the framework, models, training, and appraisal materials. Development activities were based on multiple inputs. These inputs included an A-Specification and guidance specific to each release provided by the Steering Group, source models, change requests received from the user community, and input received from pilots and other stakeholders.

The CCB is the official mechanism for controlling changes to CMMI models, appraisal related documents, and Introduction to CMMI training. As such, this group ensures integrity over the life of the product suite by reviewing all proposed changes to the baseline and approving only those changes that satisfy identified issues and meet criteria for the upcoming release.

Members of the groups involved in developing CMMI-DEV V1.3 are listed in Appendix C.

Audience

The audience for CMMI-DEV includes anyone interested in process improvement in a development environment. Whether you are familiar with the concept of Capability Maturity Models or are seeking information to begin improving your development processes, CMMI-DEV

2. A constellation is a collection of CMMI components that are used to construct models, training materials, and appraisal related documents for an area of interest (e.g., development, acquisition, services).


will be useful to you. This model is also intended for organizations that want to use a reference model for an appraisal of their development related processes.3

Organization of this Document

This document is organized into three main parts:

• Part One: About CMMI for Development
• Part Two: Generic Goals and Generic Practices, and the Process Areas
• Part Three: The Appendices and Glossary

Part One: About CMMI for Development, consists of five chapters:

• Chapter 1, Introduction, offers a broad view of CMMI and the CMMI for Development constellation, concepts of process improvement, and the history of models used for process improvement and different process improvement approaches.
• Chapter 2, Process Area Components, describes all of the components of the CMMI for Development process areas.4
• Chapter 3, Tying It All Together, assembles the model components and explains the concepts of maturity levels and capability levels.
• Chapter 4, Relationships Among Process Areas, provides insight into the meaning and interactions among the CMMI-DEV process areas.
• Chapter 5, Using CMMI Models, describes paths to adoption and the use of CMMI for process improvement and benchmarking of practices in a development organization.
• Chapter 6, Essays and Case Studies, contains essays and case studies contributed by invited authors from a variety of backgrounds and organizations.

Part Two: Generic Goals and Generic Practices, and the Process Areas, contains all of this CMMI model's required and expected components. It also contains related informative components, including subpractices, notes, examples, and example work products.

3. An appraisal is an examination of one or more processes by a trained team of professionals using a reference model (e.g., CMMI-DEV) as the basis for determining strengths and weaknesses.
4. A process area is a cluster of related practices in an area that, when implemented collectively, satisfies a set of goals considered important for making improvement in that area. This concept is covered in detail in Chapter 2.


Part Two contains 23 sections. The first section contains the generic goals and practices. The remaining 22 sections each represent one of the CMMI-DEV process areas. To make these process areas easy to find, they are organized alphabetically by process area acronym. Each section contains descriptions of goals, best practices, and examples.

Part Three: The Appendices and Glossary, consists of four sections:

• Appendix A: References, contains references you can use to locate documented sources of information such as reports, process improvement models, industry standards, and books that are related to CMMI-DEV.
• Appendix B: Acronyms, defines the acronyms used in the model.
• Appendix C: CMMI Version 1.3 Project Participants, contains lists of team members who participated in the development of CMMI-DEV V1.3.
• Appendix D: Glossary, defines many of the terms used in CMMI-DEV.

How to Use this Document

Whether you are new to process improvement, new to CMMI, or already familiar with CMMI, Part One can help you understand why CMMI-DEV is the model to use for improving your development processes.

Readers New to Process Improvement

If you are new to process improvement or new to the Capability Maturity Model (CMM) concept, we suggest that you read Chapter 1 first. Chapter 1 contains an overview of process improvement that explains what CMMI is all about.

Next, skim Part Two, including generic goals and practices and specific goals and practices, to get a feel for the scope of the best practices contained in the model. Pay close attention to the purpose and introductory notes at the beginning of each process area.

In Part Three, look through the references in Appendix A and select additional sources you think would be beneficial to read before moving forward with using CMMI-DEV. Read through the acronyms and glossary to become familiar with the language of CMMI. Then, go back and read the details of Part Two.

Readers Experienced with Process Improvement

If you are new to CMMI but have experience with other process improvement models, such as the Software CMM or the Systems Engineering Capability Model (i.e., EIA 731), you will immediately recognize many similarities in their structure and content [EIA 2002a].

We recommend that you read Part One to understand how CMMI is different from other process improvement models. If you have experience with other models, you may want to select which sections to read first. Read Part Two with an eye for best practices you recognize from the models that you have already used. By identifying familiar material, you will gain an understanding of what is new, what has been carried over, and what is familiar from the models you already know.

Next, review the glossary to understand how some terminology can differ from that used in the process improvement models you know. Many concepts are repeated, but they may be called something different.

Readers Familiar with CMMI

If you have reviewed or used a CMMI model before, you will quickly recognize the CMMI concepts discussed and the best practices presented. As always, the improvements that the CMMI Product Team made to CMMI for the V1.3 release were driven by user input. Change requests were carefully considered, analyzed, and implemented. Some significant improvements you can expect in CMMI-DEV V1.3 include the following:

• High maturity process areas are significantly improved to reflect industry best practices, including a new specific goal and several new specific practices in the process area that was renamed from Organizational Innovation and Deployment (OID) to Organizational Performance Management (OPM).
• Improvements were made to the model architecture that simplify the use of multiple models.
• Informative material was improved, including revising the engineering practices to reflect industry best practice and adding guidance for organizations that use Agile methods.
• Glossary definitions and model terminology were improved to enhance the clarity, accuracy, and usability of the model.

Wow! eBook xx Preface

• Level 4 and 5 generic goals and practices were eliminated, as were capability levels 4 and 5, to appropriately focus high maturity on the achievement of business objectives, which is accomplished by applying capability levels 1-3 to the high maturity process areas (Causal Analysis and Resolution, Quantitative Project Management, Organizational Performance Management, and Organizational Process Performance).

For a more complete and detailed list of improvements, see http://www.sei.cmu.edu/cmmi/tools/cmmiv1-3/.

Additional Information and Reader Feedback

Many sources of information about CMMI are listed in Appendix A and are also published on the CMMI website: http://www.sei.cmu.edu/cmmi/.

Your suggestions for improving CMMI are welcome. For information on how to provide feedback, see the CMMI website at http://www.sei.cmu.edu/cmmi/tools/cr/. If you have questions about CMMI, send email to [email protected].

BOOK ACKNOWLEDGMENTS

This book wouldn’t be possible without the support of the CMMI user community and the work of a multitude of dedicated people working together on CMMI-based process improvement. Ultimately, without the work of those involved in the CMMI project since it began in 1998, this book would not exist. We would like to specially thank our many CMMI partners who help organizations apply CMMI practices.

The complete CMMI-DEV model is contained in the book, which was created based on more than one thousand change requests submitted by CMMI users. The CMMI Product Team, which included members from different organizations and backgrounds, used these change requests to improve the model to what it is today.

We would also like to acknowledge those who directly contributed to this book. All of these authors were willing to share their insights and experiences and met aggressive deadlines to do so: Steve Baldassano, Victor Basili, Michael Campo, David Card, Bill Curtis, Aldo Dagnino, Kathleen Dangle, Khaled El Emam, Hillel Glazer, Kileen Harrison, Will Hayes, Watts Humphrey, Gargi Keeni, Peter Kraus, Hans-Jürgen Kugler, Neal Mackertich, Tomoo Matsubara, Judah Mogilensky, James Moore, Joseph Morin, Heather Oppenheimer, Mike Phillips, Pat O’Toole, Anne Prem, Robert Rassa, Kevin Schaaff, Michele Shaw, Alex Stall, and Rusty Young. We are delighted that they agreed to contribute their experiences to our book and we



hope that you find their insights valuable and applicable to your process improvement activities.

Special thanks go to Addison-Wesley Publishing Partner, Peter Gordon, for his assistance, experience, and advice. We’d also like to thank Kim Boedigheimer, Curt Johnson, Stephane Nakib, and Julie Nahil for their help with the book’s publication and promotion.

From Mary Beth Chrissis

I am so very blessed and humbled to be fortunate enough to be part of the third edition of this book. Working on CMMI has allowed me the opportunity to interact with many people. From the SEI Partners who work with the community daily, to the many individuals I’ve had the privilege to teach, to the people in organizations that believe CMMI will help them improve their business, I’ve learned so much from you. I’d like you all to know that I realize although my name appears on this book, I’m really just representing all of you. I don’t know why I have been given this opportunity, but please know I am greatly appreciative.

I have a very special thank you for my colleagues at the SEI and especially those in the SEPM Program. You’ve encouraged and supported me. Thanks to Anita Carleton, Mike Phillips, Barbara Tyson, Bob McFeeley, Stacey Cope, Mary Lou Russo, and Barbara Baldwin for all of your day-to-day assistance. I can’t forget Bill Peterson who has supported CMMI and the work on this book since its inception.

Version 1.3 wouldn’t have been accomplished without the many volunteers who worked on it. I’d like to recognize the training team, particularly Mike Campo and Katie Smith, for spending many hours writing and reviewing the model and training materials. Thanks to Diane Mizukami Williams who had the vision and expertise to improve the CMMI-DEV Intro course. I’d also like to thank Bonnie Bollinger for her sage wisdom and comments. Lastly, thanks to Eric Dorsett, Steve Masters, and Dan Foster for their help with the training materials.

To my coauthors Sandy and Mike, you’re the best. Each time we write a book, it is a different experience and I wouldn’t want to work with anyone else. Thanks for your patience and understanding during this busy time.

If you know me, you know my family is the center of my life. To my husband, Chuck, and my children, Adam, Pamela, and Kevin, thank you again for supporting me with this book. I love you and I am grateful to have you in my life.


From Mike Konrad

I came to the SEI 23 years ago hoping to contribute to an international initiative that would fundamentally improve the way systems and software are built. CMMI has become that initiative. For this, I thank my many colleagues at the SEI and industry, past and present.

Over the years I’ve been honored to work with some of the most talented individuals in systems and software engineering. My favorite teachers have been Roger Bate, Jack Ferguson, Watts Humphrey, and Bill Peterson. For Version 1.3, I’ve been particularly honored to work with Jim Armstrong, Richard Basque, Rhonda Brown, Brandon Buteau, Mike Campo, Sandra Cepeda, Mary Beth Chrissis, Mike D’Ambrosa, Eileen Forrester, Brian Gallagher, Will Hayes, Larry Jones, So Norimatsu, Parry, Lynn Penn, Mike Phillips, Karen Richter, Mary Lou Russo, Mary Lynn Russo, Winfried Russwurm, John Scibilia, Sandy Shrum, Kathy Smith, Katie Smith-McGarty, Barbara Tyson, and Rusty Young. I also continue to learn from my two coauthors, Mary Beth and Sandy; one could not hope for better or more talented coauthors.

With respect to my family, words cannot express my heartfelt thanks to my wife, Patti, and our family, Paul, Jill, Christian, Katie, Alison, Tim, David, and Walter, for their patience while I was working on the book and for sharing their time and insights of the world we all share; to my father, Walter, and my late mother, Renée, for their years of nurturing and sacrifice; and to my sister, Corinne, for her encouragement over the years.

From Sandy Shrum

Working simultaneously on three CMMI books has tested my limits in many ways. Those that have helped me along the journey provided both professional and personal support.

Many thanks to Rhonda Brown and Mike Konrad for their partnership during CMMI model development. They are peerless as team members and friends. Our joint management of the CMMI Core Model Team was not only effective, but enjoyable.

Affectionate thanks to my boyfriend, Jimmy Orsag, for his loving support and for helping me keep my focus and sense of humor through all the hours of work preparing three manuscripts.

Heartfelt thanks to my parents, John and Eileen Maruca, for always being there for me no matter what and instilling my strong work ethic.


Finally, thanks to the coauthors of all three CMMI books: Brandon Buteau, Mary Beth Chrissis, Eileen Forrester, Brian Gallagher, Mike Konrad, Mike Phillips, and Karen Richter. They are all terrific to work with. Without their understanding, excellent coordination, and hard work, I would never have been able to participate.


PART ONE

About CMMI for Development



CHAPTER 1

INTRODUCTION

Now more than ever, companies want to deliver products and services better, faster, and cheaper. At the same time, in the high-technology environment of the twenty-first century, nearly all organizations have found themselves building increasingly complex products and services. It is unusual today for a single organization to develop all the components that compose a complex product or service. More commonly, some components are built in-house and some are acquired; then all the components are integrated into the final product or service. Organizations must be able to manage and control this complex development and maintenance process.

The problems these organizations address today involve enterprise-wide solutions that require an integrated approach. Effective management of organizational assets is critical to business success. In essence, these organizations are product and service developers that need a way to manage their development activities as part of achieving their business objectives.

In the current marketplace, maturity models, standards, methodologies, and guidelines exist that can help an organization improve the way it does business. However, most available improvement approaches focus on a specific part of the business and do not take a systemic approach to the problems that most organizations are facing. By focusing on improving one area of a business, these models have unfortunately perpetuated the stovepipes and barriers that exist in organizations.

CMMI for Development (CMMI-DEV) provides an opportunity to avoid or eliminate these stovepipes and barriers. CMMI for Development consists of best practices that address development activities applied to products and services. It addresses practices that cover the product’s lifecycle from conception through delivery and maintenance.



The emphasis is on the work necessary to build and maintain the total product.

CMMI-DEV contains 22 process areas. Of those process areas, 16 are core process areas, 1 is a shared process area, and 5 are development specific process areas.1

All CMMI-DEV model practices focus on the activities of the developer organization. Five process areas focus on practices specific to development: addressing requirements development, technical solution, product integration, verification, and validation.

About Process Improvement

In its research to help organizations to develop and maintain quality products and services, the Software Engineering Institute (SEI) has found several dimensions that an organization can focus on to improve its business. Figure 1.1 illustrates the three critical dimensions that organizations typically focus on: people, procedures and methods, and tools and equipment.

What holds everything together? It is the processes used in your organization. Processes allow you to align the way you do business.

[Figure omitted: process ties together people with skills, training, and motivation; procedures and methods defining the relationship of tasks; and tools and equipment.]

FIGURE 1.1 The Three Critical Dimensions

1. A core process area is a process area that is common to all CMMI models. A shared process area is shared by at least two CMMI models, but not all of them.


They allow you to address scalability and provide a way to incorporate knowledge of how to do things better. Processes allow you to leverage your resources and to examine business trends.

This is not to say that people and technology are not important. We are living in a world where technology is changing at an incredible speed. Similarly, people typically work for many companies throughout their careers. We live in a dynamic world. A focus on process provides the infrastructure and stability necessary to deal with an ever-changing world and to maximize the productivity of people and the use of technology to be competitive.

Manufacturing has long recognized the importance of process effectiveness and efficiency. Today, many organizations in manufacturing and service industries recognize the importance of quality processes. Process helps an organization’s workforce to meet business objectives by helping them to work smarter, not harder, and with improved consistency. Effective processes also provide a vehicle for introducing and using new technology in a way that best meets the business objectives of the organization.

Looking Ahead

by Watts Humphrey

Nearly 25 years ago when we first started the maturity model work that led to the CMM and CMMI, we took a diagnostic approach. What are the characteristics of organizations that performed good software work? We were following the principle that Tolstoy stated in Anna Karenina.2

“Happy families are all alike; every unhappy family is unhappy in its own way.”

In applying the Tolstoy principle to software organizations, we looked for those markers that characterized effective software operations and then formed these characteristics into a maturity model. The logic for the maturity model was that, to do good work, organizations must do everything right. However, since no one could possibly fix all of their problems at once, our strategy was to determine what organizations were doing and compare that to what they

2. Leo Tolstoy, Anna Karenina, Modern Library, New York.


should be doing to be a “happy family.” Then, using the model as a guide, they should start fixing the omissions and mistakes in the order defined by the model.

Keep Doing the Good Things

Unfortunately, we were not sufficiently clear with our initial guidance, and some people felt that, if they were only trying to get to maturity level 2, they should not do things that were at levels 3, 4, and 5. That is not what we meant. Organizations should continue doing all of the “good” things they now are doing and only focus on the problem areas. The objective at level 2 is to address those level 2 things that are missing or inadequate, and fix them first. Then the organization should consider addressing the level 3, 4, and 5 gaps. Again, they should not stop doing any things that work.

The Logic for Maturity Level 2

The logic that we followed in establishing the maturity levels was as follows. First, organizations cannot do good work, and they certainly cannot improve, if they are in a perpetual state of crisis. Therefore, the first improvement efforts should focus on those things that, if done poorly or not at all, will result in crises. These are planning, configuration management, requirements management, subcontract management, quality assurance, and the like.

The Logic for Maturity Level 3

Second, after the crises are largely controlled and the organization is plan-driven rather than crisis-driven, the next step is learning. How can people learn from each other rather than having to learn from their own mistakes? Again, from Tolstoy’s principle, there is an infinite number of ways to fail, so improving by reacting to failures is a never-ending and fruitless struggle. The key is to find out what works best in the organization and to spread that across all groups. Therefore, maturity level 3 focuses on learning-oriented things like process definition, training, and defined ways to make decisions and evaluate alternatives.

The Logic for Maturity Levels 4 and 5

At level 4, the focus turns to quantitative management and quality control, and at level 5, the efforts include continuous improvement, technological innovations, and defect prevention. Unfortunately, we never included an explicit requirement in CMM or CMMI that high-maturity organizations must look outside of their own laboratories

to identify best industry practices. Then, after they find these practices, they should measure, evaluate, and prototype them to see if these practices would help improve their operations. This seemed like such an obvious step that we never explicitly required it. However, judging from the slow rate of adoption of new and well-proven software and systems development innovations, such a requirement should have been included in the model.

Continuous Improvement

At this point, of the many organizations that have been evaluated at CMMI level 5, too many have essentially stopped working on improvement. Their objective was to get to level 5 and they are there, so why should they keep improving? This is both an unfortunate and an unacceptable attitude. It is unfortunate because there are many new concepts and methods that these organizations would find beneficial, and it is unacceptable because the essence of level 5 is continuous improvement.

Unfortunately, many organizations have become entangled in the weeds of process improvement and have lost sight of the forest and even of the trees. Six Sigma is a powerful and enormously helpful statistically based method for improvement, but it is easy for people to become so enamored with these sophisticated methods that they lose sight of the objective. This is a mistake. The key is priorities and what will help organizations to improve their business performance. This is where external benchmarking and internal performance measures are needed. Use them to establish improvement priorities and then focus your improvement muscle on those areas that will substantially improve business performance.

Next Steps

While CMMI has been enormously successful, we have learned a great deal in the last 25 years, and there are now important new concepts and methods that were not available when we started. The key new concept concerns knowledge work. Peter Drucker, the leading management thinker of the twentieth century, defined knowledge work as work that is done with ideas and concepts rather than with things. While a great deal of today’s technical work is knowledge work, large-scale knowledge work is a relatively new phenomenon. Except for software, until recently, the really big projects all concerned hardware systems. Now, however, software work pervades most parts of modern systems, and even the work of hardware designers more closely resembles that of software developers than traditional hardware breadboarding and manufacturing.


To properly manage this new kind of work, new methods are needed, and Drucker enunciated the key new management concept. This is that knowledge work can’t be managed with traditional methods; the knowledge workers must manage themselves.3 The logic behind Drucker’s view is compelling, but the story is too extensive to cover in this short perspective. However, there is a growing number of publications that describe knowledge work and how and why new management methods are needed.4

In summary, the key point is that software and complex systems development projects are large-scale knowledge work, and the reason such projects have long been troubled is that they have not been managed with suitable methods. The first method that has been designed to follow Drucker’s knowledge-management principles is the Team Software Process (TSP), but there will almost certainly be more such methods in the future.

Using the TSP to guide software and systems development projects turns out to be highly effective, and TSP projects are typically delivered on schedule, within budget, and with substantially improved quality and productivity.5 To assist CMMI users in continuously improving their performance, the SEI has defined a new CMMI-based strategy and a family of practices to guide them in evaluating and piloting these methods. This method is called CMMI-AIM (Accelerated Improvement Method), and it is currently in use by a growing number of organizations.

Conclusions

As we continue refining our processes and methods to address the needs and practices of creative teams and people, new opportunities will keep showing up for broadening the scope of our processes and including new methods and technologies as they become available. Because many of these advances will be new to most users, users will need specific guidance on what these new methods are and how to best use them. The SEI strategy has been to provide this guidance for each new family of methods as it becomes available and is proven in practice.

3. Peter Drucker, “Knowledge-Worker Productivity: The Biggest Challenge,” California Management Review, Winter 1999, 41, 2, ABI/INFORM Global.
4. Watts S. Humphrey, “Why Can’t We Manage Large Projects?” CrossTalk, July/August 2010, pp. 4–7; and Watts S. Humphrey and James W. Over, Leadership, Teamwork, and Trust: Building a Competitive Software Capability, Reading, MA: Addison Wesley, 2011.
5. Noopur Davis and Julia Mullaney, Team Software Process (TSP) in Practice, SEI Technical Report CMU/SEI-2003-TR-014, September 2003.


Two examples of such new methods are CMMI- and TSP-related guidance on how to develop secure systems and on how to architect complex systems. As we look to the future, there will be many more opportunities for improving the performance of our systems and software engineering work. The key is to couple these methods into a coherent improvement framework such as TSP-CMMI and to provide the explicit guidance organizations need to obtain the potential benefits of these new methods. To avoid chasing the latest fads, however, organizations should measure their own operations, evaluate where they stand relative to their leading peers and competitors, and focus on those improvements that will measurably improve their business performance.

About Capability Maturity Models

A Capability Maturity Model (CMM), including CMMI, is a simplified representation of the world. CMMs contain the essential elements of effective processes. These elements are based on the concepts developed by Crosby, Deming, Juran, and Humphrey.

In the 1930s, Walter Shewhart began work in process improvement with his principles of statistical quality control [Shewhart 1931]. These principles were refined by W. Edwards Deming [Deming 1986], Phillip Crosby [Crosby 1979], and Joseph Juran [Juran 1988]. Watts Humphrey, Ron Radice, and others extended these principles further and began applying them to software in their work at IBM (International Business Machines) and the SEI [Humphrey 1989]. Humphrey’s book, Managing the Software Process, provides a description of the basic principles and concepts on which many of the Capability Maturity Models (CMMs) are based.

The SEI has taken the process management premise, “the quality of a system or product is highly influenced by the quality of the process used to develop and maintain it,” and defined CMMs that embody this premise. The belief in this premise is seen worldwide in quality movements, as evidenced by the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards.

CMMs focus on improving processes in an organization. They contain the essential elements of effective processes for one or more disciplines and describe an evolutionary improvement path from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness.


Like other CMMs, CMMI models provide guidance to use when developing processes. CMMI models are not processes or process descriptions. The actual processes used in an organization depend on many factors, including application domains and organization structure and size. In particular, the process areas of a CMMI model typically do not map one to one with the processes used in your organization.

The SEI created the first CMM designed for software organizations and published it in a book, The Capability Maturity Model: Guidelines for Improving the Software Process [SEI 1995].

Today, CMMI is an application of the principles introduced almost a century ago to this never-ending cycle of process improvement. The value of this process improvement approach has been confirmed over time. Organizations have experienced increased productivity and quality, improved cycle time, and more accurate and predictable schedules and budgets [Gibson 2006].

Evolution of CMMI

The CMM Integration project was formed to sort out the problem of using multiple CMMs. The combination of selected models into a single improvement framework was intended for use by organizations in their pursuit of enterprise-wide process improvement.

Developing a set of integrated models involved more than simply combining existing model materials. Using processes that promote consensus, the CMMI Product Team built a framework that accommodates multiple constellations.

The first model to be developed was the CMMI for Development model (then simply called “CMMI”). Figure 1.2 illustrates the models that led to CMMI Version 1.3.

Initially, CMMI was one model that combined three source models: the Capability Maturity Model for Software (SW-CMM) v2.0 draft C, the Systems Engineering Capability Model (SECM) [EIA 2002a], and the Integrated Product Development Capability Maturity Model (IPD-CMM) v0.98. These three source models were selected because of their successful adoption or promising approach to improving processes in an organization.

The first CMMI model (V1.02) was designed for use by development organizations in their pursuit of enterprise-wide process improvement. It was released in 2000. Two years later Version 1.1 was released and four years after that, Version 1.2 was released.


[Figure omitted: a history of CMMs in which the source models (CMM for Software V1.1, 1993; Systems Engineering CMM V1.1, 1995; INCOSE SECAM, 1996; Software CMM V2 draft C, 1997; Integrated Product Development CMM, 1997; EIA 731 SECM, 1998; Software Acquisition CMM V1.03, 2002) lead to CMMI V1.02 (2000) and V1.1 (2002), then to CMMI for Development V1.2 (2006), CMMI for Acquisition V1.2 (2007), and CMMI for Services V1.2 (2009), and finally to the V1.3 releases of all three models in 2010.]

FIGURE 1.2 The History of CMMs6

By the time that Version 1.2 was released, two other CMMI models were being planned. Because of this planned expansion, the name of the first CMMI model had to change to become CMMI for Development and the concept of constellations was created.

The CMMI for Acquisition model was released in 2007. Since it built on the CMMI for Development Version 1.2 model, it also was named Version 1.2. Two years later the CMMI for Services model was released. It built on the other two models and also was named Version 1.2.

In 2008 plans were drawn to begin developing Version 1.3, which would ensure consistency among all three models and improve high maturity material in all of the models. Version 1.3 of CMMI for Acquisition [Gallagher 2011, SEI 2010b], CMMI for Development [Chrissis 2011, SEI 2010c], and CMMI for Services [Forrester 2011, SEI 2010a] were released in November 2010.

6. EIA 731 SECM is the Electronic Industries Alliance standard 731, or the Systems Engineering Capability Model. INCOSE SECAM is the International Council on Systems Engineering Systems Engineering Capability Assessment Model [EIA 2002a].

CMMI: Integration and Improvement Continues

by Bob Rassa

CMMI is almost 15 years old, and has clearly become the worldwide de facto standard for process improvement in the development of systems, including systems engineering, software engineering, design engineering, subcontractor management, and program management. Since the release of CMMI V1.2 (for Development) almost 5 years ago, CMMI has embraced process improvement for Acquisition as well as the delivery of Services.

The full product suite of CMMI-DEV, CMMI-ACQ, and CMMI-SVC covers the complete spectrum of process improvement for the entire business, including commercial and defense industry, governments, and even military organizations. After the initial release of CMMI in November 2000, well over 1,000 Class A appraisals were reported in just four years—very successful numbers by our measures at that time; whereas recently almost 1,400 Class A appraisals were conducted in 2009 alone—quite a significant improvement.

As of January 2006, more than 45,000 individuals had received Introduction to CMMI training. As of July 2010, that number has exceeded 117,000 students. CMMI-DEV has been translated into Japanese, Chinese, French, German, Spanish, and Portuguese. Translation of CMMI-SVC into Arabic is beginning. The success in CMMI recognition and adoption worldwide is undeniable.

The CMMI V1.2 architecture was altered slightly to accommodate two additional CMMI constellations, which we designated CMMI-ACQ (CMMI for Acquisition) and CMMI-SVC (CMMI for Services). CMMI V1.3 focuses on providing some degree of simplification as well as adding more integrity to the overall product suite. V1.3 model improvements have a heavy concentration on the high maturity aspects embodied in levels 4 and 5, in both the model structure as well as the appraisal method.

We learned that there were certain ambiguities within the V1.2 product suite, and the areas affected are now clarified in V1.3 to achieve greater consistency in overall model deployment and appraisal conduct of CMMI. The criteria that are used in the appraisal audit process, which was implemented in 2008, have now been incorporated in the product suite where appropriate. We have also provided clarification on the sampling of “focus programs” in

the appraised organization to reduce the complexity and time involved in conducting Class A appraisals, thereby reducing the cost of implementing CMMI.

It has been noted by some that CMMI is only for large organizations, but the data tells a different story. In fact, a large number of small organizations have been appraised and have told us that they reap benefits of CMMI far beyond the investment. A comprehensive Benefits of CMMI report is now on the website of the designated CMMI Steward, the Software Engineering Institute of Carnegie Mellon University (http://www.sei.cmu.edu/cmmi). This report, essentially a compendium of real benefits provided by users, clearly shows positive effects such as reduced defects on delivery, reduced time to identify defects, and more. The data tells us that CMMI is truly state-of-the-art in process improvement, and the substantive benefits reported confirm this.

However, to be truly effective, CMMI must be applied conscientiously within the organization. When we started the initial development of CMMI, it was well-publicized that its purpose was to integrate the divergent maturity models that existed at the time. We soon realized that the real purpose that should have been communicated as the ultimate benefit of CMMI was that this integrated model would integrate the design and management disciplines in terms of both process and performance.

To achieve this ultimate benefit, care is needed to ensure that integrated processes are put into place within the organization, that such processes are implemented across the enterprise on all new programs and projects, and that such implementation is done in a thorough manner to assure that new programs start out on the right foot.

This book provides the latest expert and detailed guidance for effective CMMI implementation. It covers all the specifics of V1.3 and addresses nuances of interpretation as well as expert advice useful to the new and experienced practitioner. Hundreds of process improvement experts have contributed to the overall CMMI development and update, and many of them contributed their expertise to this volume for the benefit of the worldwide user community. We trust you will enjoy their work and find it useful as you continue your journey along the path of continuous process improvement. Remember, great designers and great managers will still likely fail without a proven process framework, and this is what CMMI provides.


CMMI Framework

The CMMI Framework provides the structure needed to produce CMMI models, training, and appraisal components. To allow the use of multiple models within the CMMI Framework, model components are classified as either common to all CMMI models or applicable to a specific model. The common material is called the “CMMI Model Foundation” or “CMF.”

The components of the CMF are part of every model generated from the CMMI Framework. Those components are combined with material applicable to an area of interest (e.g., acquisition, development, services) to produce a model.

A “constellation” is defined as a collection of CMMI components that are used to construct models, training materials, and appraisal related documents for an area of interest (e.g., acquisition, development, services). The Development constellation’s model is called “CMMI for Development” or “CMMI-DEV.”

The Architecture of the CMMI Framework

by Roger Bate with a postscript by Mike Konrad

Over the years, as the CMMI Product Suite has been used in disparate industries and organizations, it became apparent that CMMI could be applied to all kinds of product development, especially if the terminology was kept general for similar practices.

A further revelation was that the process and project management practices of the model are suitable for a wide range of activities besides product development. This discovery led me to propose that we should enable the expansion of CMMI, including the extension of the scope of CMMI, by creating a new architecture for the CMMI Framework.

This new architecture would accommodate other areas of interest (e.g., Services and Acquisition). I was musing one day about the valuable best practices that were contained in models. I began to think of them as the stars of process improvement. I pushed this metaphor a little further to call the collection of components that would be useful in building a model, its training materials, and appraisal documents for an area of interest a constellation. This was the beginning of the architecture that was eventually created.


There are two primary objectives for the CMMI Framework architecture:

• Enable the coverage of selected areas of interest to make useful and effective processes.
• Promote maximum commonality of goals and practices across models, training materials, and appraisal methods.

These objectives pull in opposite directions; therefore, the architecture was designed as a bit of a compromise.

The CMMI Framework will be used in the future to accommodate additional content that the user community indicates is desirable. The framework contains components used to construct models and their corresponding training and appraisal materials. The framework is organized so that the models constructed will benefit from common terminology and common practices that have proven to be valuable in previous models.

The CMMI Framework is a collection of all model components, training material components, and appraisal components. These components are organized into groupings, called constellations, which facilitate construction of approved models and preserve the legacy of existing CMM and CMMI models. In the framework, there are constellations of components that are used to construct models in an area of interest (e.g., Acquisition, Development, and Services).

Also in the framework, there is a CMMI model foundation. This foundation is a skeleton model that contains the core model components in a CMMI model structure. The content of the CMMI model foundation is apropos to all areas of interest addressed by the constellations. A CMMI model for a constellation is constructed by inserting additional model components into the CMMI model foundation.

Because the CMMI architecture is designed to encourage preserving as much common material as is reasonable in a multiple constellation environment, the framework contains and controls all CMMI material that can be used to produce any constellation or model.

CMMI models have a defined structure. This structure is designed to provide familiar placement of model components of various constellations and versions. If you look at the structure of a process area, you’ll see components including Process Area Name, Category, Maturity Level, Purpose, Introductory Notes, References, and Specific Goals. You will also find that every process area in this

Wow! eBook 16 PART ONE ABOUT CMMI FOR DEVELOPMENT

model (i.e., CMMI for Development) and all other CMMI models produced from the CMMI Framework have the same structure. This feature helps you to understand quickly where to look for informa- tion in any CMMI model. One of the benefits of having a common architecture and a large portion of common content in the various models is that the effort required to write models, train users, and appraise organizations is greatly reduced. The ability to add model components to the com- mon process areas permits them to expand their scope of coverage to a greater variety of needs. In addition, whole new process areas may be added to provide greater coverage of different areas of inter- est in the constellations. CMMI models have a great deal of well-tested content that can be used to guide the creation of high performance processes. The CMMI architecture permits that valuable content to continue to work in different areas of interest, while allowing for innovation and agility in responding to new needs. You can see that CMMI has grown beyond the star practices of the three original source models to constellations. This expansion into the galaxy is only possible with a well-thought-out and designed architecture to support it. The CMMI architecture has ptg been designed to provide such support and will grow as needed to continue into the future.

Postscript

Roger passed away in 2009, about two years after this perspective was written for the second edition of this book. Roger's grand vision of constellations and a flexible architecture that established a deep level of commonality among CMMI models was largely realized in the sequential release of the three V1.2 CMMI models published between 2006 and 2009.

One of many anecdotes that reveals Roger's transcending vision and greatness, and that recalls the early days of CMMI, began when I was contacted by friend and colleague Tomoo Matsubara-san in 1997. He asked me to provide a short opinion piece for his Soapbox column in IEEE Software magazine. At the time, we at the SEI were confronted with consternation from many Software CMM transition partners because they felt that we were abandoning the Software CMM product line. Instead, we were pursuing the convergence of the software engineering and systems engineering process improvement communities by creating a product line that both could use.


We hoped such a convergence would not only eliminate the need to separately maintain two models and associated product lines (those for the Software CMM and the EIA 731 Systems Engineering Capability Model), but also would encourage synergism within organizations addressing system-wide performance issues. To me, the Soapbox column provided an opportunity to communicate what we were hoping to accomplish with CMMI, so I suggested to Roger that he write something for Matsubara-san's column on why software engineering would benefit from a "systems engineering" perspective. This chain of events led to Roger's Soapbox piece, "Do Systems Engineering? Who, Me?"7 This anecdote typifies Roger's brilliance, ability to span multiple disciplines, and talent for motivating multiple stakeholders with a unifying vision and a shared mission to change the organization's behavior and achieve superior performance.

As mentioned, Roger's vision was largely realized with the release of the Version 1.2 CMMI Product Suite. For V1.3, however, we confronted a challenge that required some modification to Roger's initial vision for the CMMI architecture. The challenge was the term "project," which commonly refers to an endeavor whose purpose and end are marked by the delivery of something tangible (e.g., a product). This term appeared throughout many of the process areas (PAs) and was generally a good fit for CMMI-ACQ and CMMI-DEV but not for CMMI-SVC. After researching the problem, we devised a solution: for V1.3, we would allow the CMF to vary across constellations in a very limited way. CMF material in CMMI-DEV and CMMI-ACQ could continue to refer to "projects," whereas in CMMI-SVC, the reference would instead be to "work," "work activities," "services," or similar. In this way, the CMMI user community could continue to use common English terms in a familiar and consistent way while benefitting from a sharing of CMMI terminology and best practices (for 17 PAs) across a broad range of process improvement communities. (This idea was researched by exploratory implementation, a survey, and pilots; and it worked well.)

V1.3 benefits from many synergies. The sequential release of new constellations for V1.2 between 2006 and 2009 provided the opportunity to evaluate what the CMF should be, one context at a time.

7. Roger R. Bate, “Do Systems Engineering? Who, Me?” IEEE Software, vol. 15, no. 4, pp. 65–66, July/Aug. 1998, doi:10.1109/52.687947.


However, in V1.3 development, model material originally proposed for one area of interest was sometimes extended to all three. Examples include the concept of quality attributes; prioritization of requirements; an ontology for products and services; a richer set of examples for measurement and analysis; a sharper focus on what is essential to team performance (i.e., team composition, empowerment, and operational discipline); and alignment of high maturity practices with business objectives enabled by analytics. The V1.3 user may be unaware of these synergies or their sources, but may derive benefit from them.

With V1.3, we've only begun to explore ideas for the future of CMMI. Beyond V1.3, there are many promising directions forward, and they exist only because of Roger's original pioneering vision for CMMI.

With Roger's passing, his Chief Architect mantle has been passed on to someone many years his junior (me). I can assure our readers that well into his mid-80s, Roger remained a truly brilliant man, retaining his clear thinking and ear for nuance while maintaining a broad perspective when rendering an opinion; and so I recognize almost as much as anyone how big a space he has left behind. Although he was the one who conceived of the term constellations as a way of thinking about the CMMI architecture, to us who knew him well, he was the real star of CMMI.

CMMI for Development

CMMI for Development is a reference model that covers activities for developing both products and services. Organizations from many industries, including aerospace, banking, computer hardware, software, defense, automobile manufacturing, and telecommunications, use CMMI for Development.

CMMI for Development contains practices that cover project management, process management, systems engineering, hardware engineering, software engineering, and other supporting processes used in development and maintenance.

Use professional judgment and common sense to interpret the model for your organization. That is, although the process areas described in this model depict behaviors considered best practices for most users, process areas and practices should be interpreted using an in-depth knowledge of CMMI-DEV, your organizational constraints, and your business environment.

CHAPTER 2

PROCESS AREA COMPONENTS

This chapter describes the components found in each process area and in the generic goals and generic practices. Understanding these components is critical to using the information in Part Two effectively. If you are unfamiliar with Part Two, you may want to skim the Generic Goals and Generic Practices section and a couple of process area sections to get a general feel for the content and layout before reading this chapter.

Core Process Areas and CMMI Models

All CMMI models are produced from the CMMI Framework. This framework contains all of the goals and practices that are used to produce CMMI models that belong to CMMI constellations. All CMMI models contain 16 core process areas. These process areas cover basic concepts that are fundamental to process improvement in any area of interest (i.e., acquisition, development, services). Some of the material in the core process areas is the same in all constellations. Other material may be adjusted to address a specific area of interest. Consequently, the material in the core process areas may not be exactly the same.

Required, Expected, and Informative Components

Model components are grouped into three categories (required, expected, and informative) that reflect how to interpret them.

Required Components

Required components are CMMI components that are essential to achieving process improvement in a given process area. This achievement must be visibly implemented in an organization's processes. The required components in CMMI are the specific and generic goals. Goal satisfaction is used in appraisals as the basis for deciding whether a process area has been satisfied.

Expected Components

Expected components are CMMI components that describe the activities that are important in achieving a required CMMI component. Expected components guide those who implement improvements or perform appraisals. The expected components in CMMI are the specific and generic practices.

Before goals can be considered to be satisfied, either their practices as described, or acceptable alternatives to them, must be present in the planned and implemented processes of the organization.

Informative Components

Informative components are CMMI components that help model users understand CMMI required and expected components. These components can be example boxes, detailed explanations, or other helpful information. Subpractices, notes, references, goal titles, practice titles, sources, example work products, and generic practice elaborations are informative model components.

The informative material plays an important role in understanding the model. It is often impossible to adequately describe the behavior required or expected of an organization using only a single goal or practice statement. The model's informative material provides information necessary to achieve the correct understanding of goals and practices and thus cannot be ignored.

Components Associated with Part Two

The model components associated with Part Two are summarized in Figure 2.1 to illustrate their relationships. The following sections provide detailed descriptions of CMMI model components.
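The relationships summarized in Figure 2.1 can also be pictured as a simple containment hierarchy. The sketch below is purely illustrative (it is not part of the model, and all class and field names are invented); it shows one way the components and their required/expected/informative categories might be represented in code. Generic goals and practices are omitted because they are shared across process areas.

```
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Category(Enum):
    REQUIRED = "required"        # specific and generic goals
    EXPECTED = "expected"        # specific and generic practices
    INFORMATIVE = "informative"  # notes, examples, subpractices, and so on


@dataclass
class SpecificPractice:
    number: str                  # e.g., "SP 1.1"
    statement: str
    category: Category = Category.EXPECTED
    example_work_products: List[str] = field(default_factory=list)  # informative
    subpractices: List[str] = field(default_factory=list)           # informative


@dataclass
class SpecificGoal:
    number: str                  # e.g., "SG 1"
    statement: str
    category: Category = Category.REQUIRED
    practices: List[SpecificPractice] = field(default_factory=list)


@dataclass
class ProcessArea:
    name: str                    # e.g., "Configuration Management"
    acronym: str                 # e.g., "CM"
    purpose: str                 # informative
    specific_goals: List[SpecificGoal] = field(default_factory=list)
```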

Process Areas

A process area is a cluster of related practices in an area that, when implemented collectively, satisfies a set of goals considered important for making improvement in that area. (See the definition of "process area" in the glossary.)


[Figure: a process area contains a purpose statement, introductory notes, related process areas, specific goals (with specific practices, example work products, and subpractices), and generic goals (with generic practices, subpractices, and generic practice elaborations). Goals are required components, practices are expected components, and all other components are informative.]

FIGURE 2.1 CMMI Model Components

The 22 process areas are presented in alphabetical order by acronym:

• Causal Analysis and Resolution (CAR)
• Configuration Management (CM)
• Decision Analysis and Resolution (DAR)
• Integrated Project Management (IPM)
• Measurement and Analysis (MA)
• Organizational Process Definition (OPD)
• Organizational Process Focus (OPF)
• Organizational Performance Management (OPM)
• Organizational Process Performance (OPP)
• Organizational Training (OT)
• Product Integration (PI)
• Project Monitoring and Control (PMC)


• Project Planning (PP)
• Process and Product Quality Assurance (PPQA)
• Quantitative Project Management (QPM)
• Requirements Development (RD)
• Requirements Management (REQM)
• Risk Management (RSKM)
• Supplier Agreement Management (SAM)
• Technical Solution (TS)
• Validation (VAL)
• Verification (VER)

Purpose Statements

A purpose statement describes the purpose of the process area and is an informative component.

For example, the purpose statement of the Organizational Process Definition process area is "The purpose of Organizational Process Definition (OPD) is to establish and maintain a usable set of organizational process assets, work environment standards, and rules and guidelines for teams."

Introductory Notes

The introductory notes section of the process area describes the major concepts covered in the process area and is an informative component.

An example from the introductory notes of the Project Monitoring and Control process area is "When actual status deviates significantly from expected values, corrective actions are taken as appropriate."

Related Process Areas

The Related Process Areas section lists references to related process areas and reflects the high-level relationships among the process areas. The Related Process Areas section is an informative component.

An example of a reference found in the Related Process Areas section of the Project Planning process area is "Refer to the Risk Management process area for more information about identifying and analyzing risks and mitigating risks."

Specific Goals

A specific goal describes the unique characteristics that must be present to satisfy the process area. A specific goal is a required model component and is used in appraisals to help determine whether a process area is satisfied. (See the definition of "specific goal" in the glossary.)

For example, a specific goal from the Configuration Management process area is "Integrity of baselines is established and maintained."

Only the statement of the specific goal is a required model component. The title of a specific goal (preceded by the goal number) and notes associated with the goal are considered informative model components.

Generic Goals

Generic goals are called "generic" because the same goal statement applies to multiple process areas. A generic goal describes the characteristics that must be present to institutionalize processes that implement a process area. A generic goal is a required model component and is used in appraisals to determine whether a process area is satisfied. (See the Generic Goals and Generic Practices section in Part Two for a more detailed description of generic goals. See the definition of "generic goal" in the glossary.)

An example of a generic goal is "The process is institutionalized as a defined process."

Only the statement of the generic goal is a required model component. The title of a generic goal (preceded by the goal number) and notes associated with the goal are considered informative model components.

Specific Goal and Practice Summaries

The specific goal and practice summary provides a high-level summary of the specific goals and specific practices. The specific goal and practice summary is an informative component.

Specific Practices

A specific practice is the description of an activity that is considered important in achieving the associated specific goal. The specific practices describe the activities that are expected to result in achievement of the specific goals of a process area. A specific practice is an expected model component. (See the definition of "specific practice" in the glossary.)

For example, a specific practice from the Project Monitoring and Control process area is "Monitor commitments against those identified in the project plan."


Only the statement of the specific practice is an expected model component. The title of a specific practice (preceded by the practice number) and notes associated with the specific practice are considered informative model components.

Example Work Products

The example work products section lists sample outputs from a specific practice. An example work product is an informative model component. (See the definition of "example work product" in the glossary.)

For instance, an example work product for the specific practice "Monitor Project Planning Parameters" in the Project Monitoring and Control process area is "Records of significant deviations."

Subpractices

A subpractice is a detailed description that provides guidance for interpreting and implementing a specific or generic practice. Subpractices can be worded as if prescriptive, but they are actually an informative component meant only to provide ideas that may be useful for process improvement. (See the definition of "subpractice" in the glossary.)

For example, a subpractice for the specific practice "Take Corrective Action" in the Project Monitoring and Control process area is "Determine and document the appropriate actions needed to address identified issues."

Generic Practices

Generic practices are called "generic" because the same practice applies to multiple process areas. The generic practices associated with a generic goal describe the activities that are considered important in achieving the generic goal and contribute to the institutionalization of the processes associated with a process area. A generic practice is an expected model component. (See the definition of "generic practice" in the glossary.)

For example, a generic practice for the generic goal "The process is institutionalized as a managed process" is "Provide adequate resources for performing the process, developing the work products, and providing the services of the process."

Only the statement of the generic practice is an expected model component. The title of a generic practice (preceded by the practice number) and notes associated with the practice are considered informative model components.

Generic Practice Elaborations

Generic practice elaborations appear after generic practices to provide guidance on how the generic practices can be applied uniquely to process areas. A generic practice elaboration is an informative model component. (See the definition of "generic practice elaboration" in the glossary.)

For example, a generic practice elaboration after the generic practice "Establish and maintain an organizational policy for planning and performing the process" for the Project Planning process area is "This policy establishes organizational expectations for estimating the planning parameters, making internal and external commitments, and developing the plan for managing the project."

Additions

Additions are clearly marked model components that contain information of interest to particular users. An addition can be informative material, a specific practice, a specific goal, or an entire process area that extends the scope of a model or emphasizes a particular aspect of its use. There are no additions in the CMMI-DEV model.

Supporting Informative Components

In many places in the model, further information is needed to describe a concept. This informative material is provided in the form of the following components:

• Notes
• Examples
• References

Notes

A note is text that can accompany nearly any other model component. It may provide detail, background, or rationale. A note is an informative model component.

For example, a note that accompanies the specific practice "Implement Action Proposals" in the Causal Analysis and Resolution process area is "Only changes that prove to be of value should be considered for broad implementation."

Examples

An example is a component comprising text and often a list of items, usually in a box, that can accompany nearly any other component and provides one or more examples to clarify a concept or described activity. An example is an informative model component.

The following is an example that accompanies the subpractice "Document noncompliance issues when they cannot be resolved in the project" under the specific practice "Communicate and Ensure the Resolution of Noncompliance Issues" in the Process and Product Quality Assurance process area.

Examples of ways to resolve noncompliance in the project include the following:
• Fixing the noncompliance
• Changing the process descriptions, standards, or procedures that were violated
• Obtaining a waiver to cover the noncompliance

References

A reference is a pointer to additional or more detailed information in related process areas and can accompany nearly any other model component. A reference is an informative model component. (See the definition of "reference" in the glossary.)

For example, a reference that accompanies the specific practice "Compose the Defined Process" in the Quantitative Project Management process area is "Refer to the Organizational Process Definition process area for more information about establishing organizational process assets."

Numbering Scheme

Specific and generic goals are numbered sequentially. Each specific goal begins with the prefix "SG" (e.g., SG 1). Each generic goal begins with the prefix "GG" (e.g., GG 2).

Specific and generic practices are also numbered sequentially. Each specific practice begins with the prefix "SP," followed by a number in the form "x.y" (e.g., SP 1.1). The x is the same number as the goal to which the specific practice maps. The y is the sequence number of the specific practice under the specific goal.

An example of specific practice numbering is in the Project Planning process area. The first specific practice is numbered SP 1.1 and the second is SP 1.2.


Each generic practice begins with the prefix “GP,” followed by a number in the form “x.y” (e.g., GP 1.1). The x corresponds to the number of the generic goal. The y is the sequence number of the generic practice under the generic goal. For example, the first generic practice associated with GG 2 is numbered GP 2.1 and the second is GP 2.2.
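As a small illustration of this numbering convention, the hypothetical helper below splits a practice identifier such as "SP 1.1" or "GP 2.3" into its prefix, goal number, and sequence number. It only encodes the convention described above; the function name and its output format are invented.

```
import re


def parse_practice_id(practice_id: str) -> dict:
    """Split a CMMI practice identifier (e.g., 'SP 1.1' or 'GP 2.3')
    into its prefix, goal number (x), and sequence number (y)."""
    match = re.fullmatch(r"(SP|GP)\s*(\d+)\.(\d+)", practice_id.strip())
    if match is None:
        raise ValueError(f"Not a valid practice identifier: {practice_id!r}")
    prefix, goal, sequence = match.groups()
    return {
        "prefix": prefix,            # "SP" (specific) or "GP" (generic)
        "goal_number": int(goal),    # maps to SG x or GG x
        "sequence_number": int(sequence),
    }


# Example: the first specific practice under SG 1 in Project Planning
print(parse_practice_id("SP 1.1"))
# {'prefix': 'SP', 'goal_number': 1, 'sequence_number': 1}
```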

Typographical Conventions

The typographical conventions used in this model were designed to enable you to easily identify and select model components by presenting them in formats that allow you to find them quickly on the page.

Figures 2.2, 2.3, and 2.4 are sample pages from process areas in Part Two; they show the different process area components, labeled so that you can identify them. Notice that components differ typographically so that you can easily identify each one.



[Figure: an annotated sample page from the Decision Analysis and Resolution process area, labeling the process area name, maturity level, process area category, purpose statement, and introductory notes as they appear typographically in Part Two.]

FIGURE 2.2 Sample Page from Decision Analysis and Resolution


[Figure: an annotated sample page from the Causal Analysis and Resolution process area, labeling a specific goal (SG 2), a specific practice (SP 2.1), example work products, subpractices, an example box, and a reference as they appear typographically in Part Two.]

FIGURE 2.3 Sample Page from Causal Analysis and Resolution


[Figure: an annotated sample page from the Generic Goals and Generic Practices section, labeling a generic goal (GG 2), a generic practice (GP 2.1), and generic practice elaborations (for CAR, CM, DAR, and IPM) as they appear typographically in Part Two.]

FIGURE 2.4 Sample Page from the Generic Goals and Generic Practices

CHAPTER 3

TYING IT ALL TOGETHER

Now that you have been introduced to the components of CMMI models, you need to understand how they fit together to meet your process improvement needs. This chapter introduces the concept of levels and shows how the process areas are organized and used.

CMMI-DEV does not specify that a project or organization must follow a particular process flow, or that a certain number of products be developed per day, or that specific performance targets be achieved. The model does specify that a project or organization should have processes that address development related practices. To determine whether these processes are in place, a project or organization maps its processes to the process areas in this model.

The mapping of processes to process areas enables the organization to track its progress against the CMMI-DEV model as it updates or creates processes. Do not expect that every CMMI-DEV process area will map one to one with your organization's or project's processes.
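One lightweight way to record such a mapping is a simple table from organizational processes to the process area acronyms they address. The sketch below uses invented process names and is only meant to illustrate the bookkeeping, not to prescribe how a mapping should be done.

```
# Hypothetical mapping from an organization's processes to CMMI-DEV
# process area acronyms; the process names on the left are invented examples.
process_to_process_areas = {
    "Release planning": ["PP"],
    "Weekly status review": ["PMC"],
    "Change control board": ["CM"],
    "Peer review procedure": ["VER"],
}

all_process_areas = {
    "CAR", "CM", "DAR", "IPM", "MA", "OPD", "OPF", "OPM", "OPP", "OT",
    "PI", "PMC", "PP", "PPQA", "QPM", "RD", "REQM", "RSKM", "SAM",
    "TS", "VAL", "VER",
}

# Which process areas have no organizational process mapped to them yet?
covered = {pa for pas in process_to_process_areas.values() for pa in pas}
uncovered = sorted(all_process_areas - covered)
print("Process areas with no mapped process:", uncovered)
```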

Understanding Levels

Levels are used in CMMI-DEV to describe an evolutionary path recommended for an organization that wants to improve the processes it uses to develop products or services. Levels can also be the outcome of the rating activity in appraisals.1 Appraisals can apply to entire organizations or to smaller groups such as a group of projects or a division.

1. For more information about appraisals, refer to Appraisal Requirements for CMMI and the Standard CMMI Appraisal Method for Process Improvement Method Definition Document [SEI 2006a, SEI 2006b].



CMMI supports two improvement paths using levels. One path enables organizations to incrementally improve processes corresponding to an individual process area (or group of process areas) selected by the organization. The other path enables organizations to improve a set of related processes by incrementally addressing successive sets of process areas.

These two improvement paths are associated with the two types of levels: capability levels and maturity levels. These levels correspond to two approaches to process improvement called "representations." The two representations are called "continuous" and "staged." Using the continuous representation enables you to achieve "capability levels." Using the staged representation enables you to achieve "maturity levels."

To reach a particular level, an organization must satisfy all of the goals of the process area or set of process areas that are targeted for improvement, regardless of whether it is a capability or a maturity level. Both representations provide ways to improve your processes to achieve business objectives, and both provide the same essential content and use the same model components.

Structures of the Continuous and Staged Representations

Figure 3.1 illustrates the structures of the continuous and staged representations. The differences between the structures are subtle but significant. The staged representation uses maturity levels to characterize the overall state of the organization's processes relative to the model as a whole, whereas the continuous representation uses capability levels to characterize the state of the organization's processes relative to an individual process area.

What may strike you as you compare these two representations is their similarity. Both have many of the same components (e.g., process areas, specific goals, specific practices), and these components have the same hierarchy and configuration.

What is not readily apparent from the high-level view in Figure 3.1 is that the continuous representation focuses on process area capability as measured by capability levels and the staged representation focuses on overall maturity as measured by maturity levels. This dimension (the capability/maturity dimension) of CMMI is used for benchmarking and appraisal activities, as well as guiding an organization's improvement efforts.


[Figure: in the continuous representation, each process area has capability levels and contains specific goals (with specific practices) and generic goals (with generic practices); in the staged representation, maturity levels group process areas, each of which likewise contains specific and generic goals and practices.]

FIGURE 3.1 Structure of the Continuous and Staged Representations

Capability levels apply to an organization's process improvement achievement in individual process areas. These levels are a means for incrementally improving the processes corresponding to a given process area. The four capability levels are numbered 0 through 3.

Maturity levels apply to an organization's process improvement achievement across multiple process areas. These levels are a means of improving the processes corresponding to a given set of process areas (i.e., a maturity level). The five maturity levels are numbered 1 through 5.

Table 3.1 compares the four capability levels to the five maturity levels. Notice that the names of two of the levels are the same in both representations (i.e., Managed and Defined). The differences are that there is no maturity level 0; there are no capability levels 4 and 5; and at level 1, the names used for capability level 1 and maturity level 1 are different.

TABLE 3.1 Comparison of Capability and Maturity Levels

Level      Continuous Representation     Staged Representation
           (Capability Levels)           (Maturity Levels)
Level 0    Incomplete                    (not applicable)
Level 1    Performed                     Initial
Level 2    Managed                       Managed
Level 3    Defined                       Defined
Level 4    (not applicable)              Quantitatively Managed
Level 5    (not applicable)              Optimizing

The continuous representation is concerned with selecting both a particular process area to improve and the desired capability level for that process area. In this context, whether a process is performed or incomplete is important. Therefore, the name "Incomplete" is given to the continuous representation starting point.

The staged representation is concerned with selecting multiple process areas to improve within a maturity level; whether individual processes are performed or incomplete is not the primary focus. Therefore, the name "Initial" is given to the staged representation starting point.

Both capability levels and maturity levels provide a way to improve the processes of an organization and measure how well organizations can and do improve their processes. However, the associated approach to process improvement is different.

Understanding Capability Levels

To support those who use the continuous representation, all CMMI models reflect capability levels in their design and content.

The four capability levels, each a layer in the foundation for ongoing process improvement, are designated by the numbers 0 through 3:

0. Incomplete
1. Performed
2. Managed
3. Defined

A capability level for a process area is achieved when all of the generic goals are satisfied up to that level. The fact that capability levels 2 and 3 use the same terms as generic goals 2 and 3 is intentional because each of these generic goals and practices reflects the meaning of the corresponding capability level. (See the Generic Goals and Generic Practices section in Part Two for more information about generic goals and practices.) A short description of each capability level follows.
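The achievement rule just described can be expressed directly in code. The following sketch is illustrative only (it is not an appraisal method, and the function and its inputs are invented): it assumes the specific goals of the process area are reported as satisfied or not, and that satisfied generic goals 2 and 3 are reported as a set of goal numbers.

```
def capability_level(satisfied_generic_goals: set, specific_goals_satisfied: bool) -> int:
    """Return the capability level (0-3) implied by goal satisfaction
    for a single process area, following the rule described in the text."""
    if not specific_goals_satisfied:
        return 0  # Incomplete: the specific goals are not all satisfied
    level = 1     # Performed: the specific goals are satisfied
    # Levels 2 and 3 require every generic goal up to that level.
    for gg in (2, 3):
        if gg in satisfied_generic_goals:
            level = gg
        else:
            break
    return level


# Example: specific goals met, GG 2 satisfied, GG 3 not yet satisfied
print(capability_level({2}, specific_goals_satisfied=True))  # 2
```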

Capability Level 0: Incomplete

An incomplete process is a process that either is not performed or is partially performed. One or more of the specific goals of the process area are not satisfied, and no generic goals exist for this level since there is no reason to institutionalize a partially performed process.

Capability Level 1: Performed

A capability level 1 process is characterized as a performed process. A performed process is a process that accomplishes the needed work to produce work products; the specific goals of the process area are satisfied.

Although capability level 1 results in important improvements, those improvements can be lost over time if they are not institutionalized. The application of institutionalization (the CMMI generic practices at capability levels 2 and 3) helps to ensure that improvements are maintained.

Capability Level 2: Managed

A capability level 2 process is characterized as a managed process. A managed process is a performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description. The process discipline reflected by capability level 2 helps to ensure that existing practices are retained during times of stress.

Capability Level 3: Defined

A capability level 3 process is characterized as a defined process. A defined process is a managed process that is tailored from the organization's set of standard processes according to the organization's tailoring guidelines; has a maintained process description; and contributes process related experiences to the organizational process assets.


A critical distinction between capability levels 2 and 3 is the scope of standards, process descriptions, and procedures. At capability level 2, the standards, process descriptions, and procedures can be quite different in each specific instance of the process (e.g., on a particular project). At capability level 3, the standards, process descriptions, and procedures for a project are tailored from the organization's set of standard processes to suit a particular project or organizational unit and therefore are more consistent, except for the differences allowed by the tailoring guidelines.

Another critical distinction is that at capability level 3, processes are typically described more rigorously than at capability level 2. A defined process clearly states the purpose, inputs, entry criteria, activities, roles, measures, verification steps, outputs, and exit criteria. At capability level 3, processes are managed more proactively using an understanding of the interrelationships of the process activities and detailed measures of the process and its work products.

Advancing Through Capability Levels

The capability levels of a process area are achieved through the application of generic practices or suitable alternatives to the processes associated with that process area.

Reaching capability level 1 for a process area is equivalent to saying that the processes associated with that process area are performed processes.

Reaching capability level 2 for a process area is equivalent to saying that there is a policy that indicates you will perform the process. There is a plan for performing it, resources are provided, responsibilities are assigned, training to perform it is provided, selected work products related to performing the process are controlled, and so on. In other words, a capability level 2 process can be planned and monitored just like any project or support activity.

Reaching capability level 3 for a process area is equivalent to saying that an organizational standard process exists associated with that process area, which can be tailored to the needs of the project. The processes in the organization are now more consistently defined and applied because they are based on organizational standard processes.

After an organization has reached capability level 3 in the process areas it has selected for improvement, it can continue its improvement journey by addressing high maturity process areas (Organizational Process Performance, Quantitative Project Management, Causal Analysis and Resolution, and Organizational Performance Management).


The high maturity process areas focus on improving the performance of those processes already implemented. The high maturity process areas describe the use of statistical and other quantitative techniques to improve organizational and project processes to better achieve business objectives.

When continuing its improvement journey in this way, an organization can derive the most benefit by first selecting the OPP and QPM process areas and bringing those process areas to capability levels 1, 2, and 3. In doing so, projects and organizations align the selection and analyses of processes more closely with their business objectives.

After the organization attains capability level 3 in the OPP and QPM process areas, the organization can continue its improvement path by selecting the CAR and OPM process areas. In doing so, the organization analyzes business performance using statistical and other quantitative techniques to determine performance shortfalls, and identifies and deploys process and technology improvements that contribute to meeting quality and process-performance objectives. Projects and the organization use causal analysis to identify and resolve issues affecting performance and promote the dissemination of best practices.

Applying Principles of Empiricism

by Victor R. Basili, Kathleen C. Dangle, and Michele A. Shaw

Thinking Empirically

Thinking empirically about software engineering changes the way you think about process improvement. Software engineering is an engineering discipline. Like other disciplines, software engineering requires an empirical paradigm that involves observing, building models, analyzing, and experimenting so that we can learn. We need to model the products, the processes, and the cause/effect relationships between them in the context of the organization and the project set. This empirical mindset provides a basis for choosing the appropriate processes, analyzing the effects of those selections, and packaging the resulting knowledge for reuse and evolution; it drives an effective process improvement initiative.


Note: Systems and software engineering have a lot in common; both require human-intensive implementation approaches and fundamentally focus on the issue of design, unlike manufacturing. As such, this empirical thinking can be applied to systems engineering as an underlying approach to improving systems engineering outcomes.

Empirical Principles

There are several principles associated with software engineering. In what follows, we discuss a few of these principles as they relate to process improvement.

P1. Observe your business. Organizations have different characteristics, goals, and cultures; stakeholders have different and competing needs. Well-engineered systems and software depend on many variables, and context plays a significant role in defining goals and objectives for what can be and what must be achieved. Organizations must strive to build quantitative and qualitative models to understand the cause and effect relationships between processes and products in the context of the development/maintenance efforts. How else can these organizations articulate the differences and similarities among projects in the organization so they have a basis for selecting the processes to use to achieve their goals?

P2. Measurement is fundamental. Measurement is a standard abstraction process that allows us to build models or representations of what we observe so we can reason about relationships in context. The use of models in conjunction with experience, judgment, and intuition can guide decision-making. Measurement through models provides a mechanism for an evidence-based investigation so that decisions are supported with facts rather than a system of pure beliefs.

P3. Process is a variable. Processes need to be selected and tailored to solve the problem at hand. In order to find the right process for the right situation, organizations must understand the effects of the process under differing conditions. This means a process must be measurable so that its effects can be quantified. It also means organizations must compile evidence that shows what works under what circumstances.

P4. Stakeholders must make their goals explicit. There is a wide range of stakeholders for any project (e.g., customers, end users, contract managers, practitioners, and managers). The organization itself and different stakeholders have different goals and needs. Organizations must make these goals and needs explicit through models and measures so they can be communicated, analyzed, synthesized, evaluated, and used to select and tailor the right processes. Making them explicit allows them to be packaged so they can be remembered and used again.

P5. Learn from experience. Organizations have the opportunity to learn from their experiences and build their core competence in systems and software engineering. For process improvement, the focus should be on learning about processes and how they interact with the environment on each project. This learning is evolutionary, and each project should make the organization smarter about how to do the next project. But this learning must be deliberate and explicit or it will not be available for the organization to leverage.

P6. Software engineering is "big science." Improving software engineering processes must be done through observation and experimentation in the context of where the actual products are being developed. There is a synergistic relationship between practice and research. Industrial, government, and academic organizations must partner to expand and evolve systems and software competencies. There are so many facets to software engineering that it requires multiple talents and differing expertise. We need real-world laboratories that allow us to see the interactions among teams, processes, and products.

The Role of Empiricism in CMMI

At CMMI level 5, process improvement is intended to be an empirically based activity. Each project is planned and executed using practices that are selected based on the context of the environment, the project needs, and past experiences. A level 5 organization understands the relationship between process and product and is capable of manipulating process to achieve various product characteristics. It is this capability that provides the greatest value from process improvement to the organization.

Empirical thinking shifts the process-improvement mindset from "putting processes in place" to "understanding the effects of processes so that appropriate processes can be adopted." Different and better decisions are made regarding how we choose improvement initiatives (prioritize), how we implement practices, and how we manage efforts in projects and in organizations when our explicit approach is based on empirical principles. That mindset should be in place at the beginning of the process improvement initiative, thereby focusing the effort on the real objectives of the project or the organization, the specific product and process problems, relevant experience with methods, and so on.

Organizations that are effective at implementing process improvement understand and apply empirical principles. Additionally, practices that support these principles are evident within CMMI. CMMI prescribes that data be used to make decisions about process definition at the project level as well as process change at the organizational level. Measurement and learning are catalysts for all of the practices; that is, they provide the bases for why specific practices are selected and how processes are implemented. Systems and software engineering practices are refined and fine-tuned as their effects are better understood.

Some Empirical Techniques That Support CMMI

Many techniques and methods exist to assist organizations and projects in realizing their goals by implementing CMMI practices. Below are some techniques that the Fraunhofer Center has effectively implemented that you may wish to consider:

Goal/Question/Metric (GQM) Approach: An essential technique for measurement in any context, GQM can support realization of P2 and P4. Originally, GQM was defined for NASA Goddard Space Flight Center to evaluate software defects on a set of projects.2

Quality Improvement Paradigm (QIP): A phased process for organizational improvement that integrates the experience of individual projects with the corporate learning process, and can help realize Empirical Principles P1, P3, and P5. QIP includes characterizing the organization through models, setting goals, choosing appropriate processes for implementation on projects, analyzing results, and packaging the experience for future use in the organization.3

Experience Factory (EF): A concept based on the continual accumulation of project experiences that are essential to organizational and project improvement, EF highlights the logical separation between the project organization and the factory that processes the project experiences in an organization experience base to make it reusable. EF can help realize Empirical Principles P1, P2, and P5.4

GQM+Strategies: An approach that supports strategic measurement by extending GQM to support goal definition and alignment, strategy development, measurement implementation, and assessment across an enterprise.5 GQM+Strategies assists the organization in tying together its approach and the motivations underlying its implementation of the Empirical Principles.

Adopting these empirical principles early, understanding their role in CMMI, and deliberately selecting techniques to support your effort can have a profound effect on the success of the organization as you embark on the process improvement path.

2. V. Basili, G. Caldiera, and H. D. Rombach, "Goal Question Metric Approach," Encyclopedia of Software Engineering, pp. 528–532, John Wiley & Sons, Inc., 1994.
3. V. Basili and G. Caldiera, "Improve Software Quality by Reusing Knowledge and Experience," Sloan Management Review, MIT Press, vol. 37(1): 55–64, Fall 1995.

4. V. Basili, G. Caldiera, and H. D. Rombach, "The Experience Factory," Encyclopedia of Software Engineering, pp. 469–476, John Wiley & Sons, Inc., 1994.
5. Victor R. Basili, Mikael Lindvall, Myrna Regardie, Carolyn Seaman, Jens Heidrich, Jurgen Munch, Dieter Rombach, and Adam Trendowicz, "Linking Software Development and Business Strategy Through Measurement," IEEE Computer, pp. 57–65, April 2010.
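As a small, invented illustration of the GQM structure mentioned above (a measurement goal refined into questions, each answered by one or more metrics), the following sketch shows the shape of such a tree; the goal, questions, and metrics are made up for the example and are not drawn from any official template.

```
# A minimal, hypothetical GQM tree: one goal, refined into questions,
# each answered by one or more metrics. All content is invented.
gqm_example = {
    "goal": "Understand the causes of post-release defects in product X",
    "questions": [
        {
            "question": "Where in the life cycle are defects introduced?",
            "metrics": ["defects by injection phase", "defects by detection phase"],
        },
        {
            "question": "Which defect types dominate?",
            "metrics": ["defect count by type", "rework hours by type"],
        },
    ],
}

for q in gqm_example["questions"]:
    print(q["question"], "->", ", ".join(q["metrics"]))
```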

Understanding Maturity Levels

To support those who use the staged representation, all CMMI models reflect maturity levels in their design and content. A maturity level consists of related specific and generic practices for a predefined set of process areas that improve the organization's overall performance.

The maturity level of an organization provides a way to characterize its performance. Experience has shown that organizations do their best when they focus their process improvement efforts on a manageable number of process areas at a time and that those areas require increasing sophistication as the organization improves.

A maturity level is a defined evolutionary plateau for organizational process improvement. Each maturity level matures an important subset of the organization's processes, preparing it to move to the next maturity level. The maturity levels are measured by the achievement of the specific and generic goals associated with each predefined set of process areas.

The five maturity levels, each a layer in the foundation for ongoing process improvement, are designated by the numbers 1 through 5:

1. Initial
2. Managed
3. Defined
4. Quantitatively Managed
5. Optimizing

Remember that maturity levels 2 and 3 use the same terms as capability levels 2 and 3. This consistency of terminology was intentional because the concepts of maturity levels and capability levels are complementary. Maturity levels are used to characterize organizational improvement relative to a set of process areas, and capability levels characterize organizational improvement relative to an individual process area.
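Because a maturity level is reached only when all goals are satisfied for its predefined set of process areas and those of all lower levels, the staged rating can be sketched as follows. The grouping shown is abbreviated (only the maturity level 2 process areas are listed) and the function is illustrative, not an appraisal method.

```
# Staged-rating sketch: a maturity level is reached only if every process
# area at that level, and at all lower levels, has its goals satisfied.
# The grouping below is abbreviated for illustration (maturity level 2 only);
# consult the model itself for the full assignment of process areas to levels.
PROCESS_AREAS_BY_LEVEL = {
    2: ["CM", "MA", "PMC", "PP", "PPQA", "REQM", "SAM"],
    # levels 3, 4, and 5 omitted here for brevity
}


def maturity_level(satisfied_process_areas: set) -> int:
    """Return the highest maturity level whose process areas (and those of
    all lower levels) all have their goals satisfied; otherwise level 1."""
    level = 1  # Initial: no process area achievement is required
    for ml in sorted(PROCESS_AREAS_BY_LEVEL):
        if all(pa in satisfied_process_areas for pa in PROCESS_AREAS_BY_LEVEL[ml]):
            level = ml
        else:
            break
    return level


# Example: all maturity level 2 process areas have their goals satisfied.
print(maturity_level({"CM", "MA", "PMC", "PP", "PPQA", "REQM", "SAM"}))  # 2
```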

Maturity Level 1: Initial

At maturity level 1, processes are usually ad hoc and chaotic. The organization usually does not provide a stable environment to support processes. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this chaos, maturity level 1 organizations often produce products and services that work, but they frequently exceed the budget and schedule documented in their plans. Maturity level 1 organizations are characterized by a tendency to overcommit, abandon their processes in a time of crisis, and be unable to repeat their successes.

Maturity Level 2: Managed

At maturity level 2, the projects have ensured that processes are planned and executed in accordance with policy; the projects employ skilled people who have adequate resources to produce controlled outputs; involve relevant stakeholders; are monitored, controlled, and reviewed; and are evaluated for adherence to their process descriptions. The process discipline reflected by maturity level 2 helps to ensure that existing practices are retained during times of stress. When these practices are in place, projects are performed and managed according to their documented plans.

Also at maturity level 2, the status of the work products is visible to management at defined points (e.g., at major milestones, at the completion of major tasks). Commitments are established among relevant stakeholders and are revised as needed. Work products are appropriately controlled. The work products and services satisfy their specified process descriptions, standards, and procedures.

Maturity Level 3: Defined

At maturity level 3, processes are well characterized and understood, and are described in standards, procedures, tools, and methods. The organization's set of standard processes, which is the basis for maturity level 3, is established and improved over time. These standard processes are used to establish consistency across the organization. Projects establish their defined processes by tailoring the organization's set of standard processes according to tailoring guidelines. (See the definition of "organization's set of standard processes" in the glossary.)

A critical distinction between maturity levels 2 and 3 is the scope of standards, process descriptions, and procedures. At maturity level 2, the standards, process descriptions, and procedures can be quite different in each specific instance of the process (e.g., on a particular project). At maturity level 3, the standards, process descriptions, and procedures for a project are tailored from the organization's set of standard processes to suit a particular project or organizational unit and therefore are more consistent except for the differences allowed by the tailoring guidelines.

Another critical distinction is that at maturity level 3, processes are typically described more rigorously than at maturity level 2. A defined process clearly states the purpose, inputs, entry criteria, activities, roles, measures, verification steps, outputs, and exit criteria. At maturity level 3, processes are managed more proactively using an understanding of the interrelationships of process activities and detailed measures of the process, its work products, and its services.

At maturity level 3, the organization further improves its processes that are related to the maturity level 2 process areas. Generic practices associated with generic goal 3 that were not addressed at maturity level 2 are applied to achieve maturity level 3.

Maturity Level 4: Quantitatively Managed

At maturity level 4, the organization and projects establish quantitative objectives for quality and process performance and use them as criteria in managing projects. Quantitative objectives are based on the needs of the customer, end users, organization, and process implementers. Quality and process performance is understood in statistical terms and is managed throughout the life of projects.

For selected subprocesses, specific measures of process performance are collected and statistically analyzed. When selecting subprocesses for analyses, it is critical to understand the relationships between different subprocesses and their impact on achieving the objectives for quality and process performance. Such an approach helps to ensure that subprocess monitoring using statistical and other quantitative techniques is applied where it has the most overall value to the business. Process performance baselines and models can be used to help set quality and process performance objectives that help achieve business objectives.

A critical distinction between maturity levels 3 and 4 is the predictability of process performance. At maturity level 4, the performance of projects and selected subprocesses is controlled using statistical and other quantitative techniques, and predictions are based, in part, on a statistical analysis of fine-grained process data.

Maturity Level 5: Optimizing

At maturity level 5, an organization continually improves its processes based on a quantitative understanding of its business objectives and performance needs. The organization uses a quantitative approach to understand the variation inherent in the process and the causes of process outcomes.

Maturity level 5 focuses on continually improving process performance through incremental and innovative process and technological improvements. The organization's quality and process performance objectives are established, continually revised to reflect changing business objectives and organizational performance, and used as criteria in managing process improvement. The effects of deployed process improvements are measured using statistical and other quantitative techniques and compared to quality and process performance objectives. The project's defined processes, the organization's set of standard processes, and supporting technology are targets of measurable improvement activities.

A critical distinction between maturity levels 4 and 5 is the focus on managing and improving organizational performance. At maturity level 4, the organization and projects focus on understanding and controlling performance at the subprocess level and using the results to manage projects. At maturity level 5, the organization is concerned with overall organizational performance using data collected from multiple projects. Analysis of the data identifies shortfalls or gaps in performance. These gaps are used to drive organizational process improvement that generates measurable improvement in performance.

Advancing Through Maturity Levels

Organizations can achieve progressive improvements in their maturity by achieving control first at the project level and continuing to the most advanced level—organization-wide performance management and continuous process improvement—using both qualitative and quantitative data to make decisions.

Since improved organizational maturity is associated with improvement in the range of expected results that can be achieved by an organization, maturity is one way of predicting general outcomes of the organization's next project. For instance, at maturity level 2, the organization has been elevated from ad hoc to disciplined by establishing sound project management. As the organization achieves generic and specific goals for the set of process areas in a maturity level, it increases its organizational maturity and reaps the benefits of process improvement. Because each maturity level forms a necessary foundation for the next level, trying to skip maturity levels is usually counterproductive.

At the same time, recognize that process improvement efforts should focus on the needs of the organization in the context of its business environment and that process areas at higher maturity levels can address the current and future needs of an organization or project. For example, organizations seeking to move from maturity level 1 to maturity level 2 are frequently encouraged to establish a process group, which is addressed by the Organizational Process Focus process area at maturity level 3. Although a process group is not a necessary characteristic of a maturity level 2 organization, it can be a useful part of the organization's approach to achieving maturity level 2.

This situation is sometimes characterized as establishing a maturity level 1 process group to bootstrap the maturity level 1 organization to maturity level 2. Maturity level 1 process improvement activities may depend primarily on the insight and competence of the process group until an infrastructure to support more disciplined and widespread improvement is in place.

Organizations can institute process improvements anytime they choose, even before they are prepared to advance to the maturity level at which the specific practice is recommended. In such situations, however, organizations should understand that the success of these improvements is at risk because the foundation for their successful institutionalization has not been completed. Processes without the proper foundation can fail at the point they are needed most—under stress.


A defined process that is characteristic of a maturity level 3 organization can be placed at great risk if maturity level 2 management practices are deficient. For example, management may commit to a poorly planned schedule or fail to control changes to baselined requirements. Similarly, many organizations prematurely collect the detailed data characteristic of maturity level 4 only to find the data uninterpretable because of inconsistencies in processes and measurement definitions.

Another example of using processes associated with higher maturity level process areas is in the building of products. Certainly, we would expect maturity level 1 organizations to perform requirements analysis, design, product integration, and verification. However, these activities are not described until maturity level 3, where they are defined as coherent, well-integrated engineering processes. The maturity level 3 engineering process complements a maturing project management capability put in place so that the engineering improvements are not lost by an ad hoc management process.

Process Areas

Process areas are viewed differently in the two representations. Figure 3.2 compares views of how process areas are used in the continuous representation and the staged representation.

The continuous representation enables the organization to choose the focus of its process improvement efforts by choosing those process areas, or sets of interrelated process areas, that best benefit the organization and its business objectives. Although there are some limits on what an organization can choose because of the dependencies among process areas, the organization has considerable freedom in its selection.

To support those who use the continuous representation, process areas are organized into four categories: Process Management, Project Management, Engineering, and Support. These categories emphasize some of the key relationships that exist among the process areas.

Sometimes an informal grouping of process areas is mentioned: high maturity process areas. The four high maturity process areas are: Organizational Process Performance, Quantitative Project Management, Organizational Performance Management, and Causal Analysis and Resolution. These process areas focus on improving the performance of implemented processes that most closely relate to the organization's business objectives.


[Figure: in the continuous representation, a target profile maps selected process areas (Process Area 1 through Process Area N) to targeted capability levels (CL1 through CL3); in the staged representation, a selected maturity level (maturity levels 2 through 5) identifies groups of process areas (e.g., SAM, REQM, PPQA, PP, PMC, MA, and CM for maturity level 2) chosen for process improvement to achieve maturity level 3.]

FIGURE 3.2 Process Areas in the Continuous and Staged Representations


Once you select process areas, you must also select how much you would like to mature processes associated with those process areas (i.e., select the appropriate capability level). Capability levels and generic goals and practices support the improvement of processes associated with individual process areas. For example, an organization may wish to reach capability level 2 in one process area and capability level 3 in another. As the organization reaches a capability level, it sets its sights on the next capability level for one of these same process areas or decides to widen its view and address a larger number of process areas. Once it reaches capability level 3 in most of the process areas, the organization can shift its attention to the high maturity process areas and can track the capability of each through capability level 3.

The selection of a combination of process areas and capability levels is typically described in a "target profile." A target profile defines all of the process areas to be addressed and the targeted capability level for each. This profile governs which goals and practices the organization will address in its process improvement efforts.

Most organizations, at minimum, target capability level 1 for the process areas they select, which requires that all of these process areas' specific goals be achieved. However, organizations that target capability levels higher than 1 concentrate on the institutionalization of selected processes in the organization by implementing generic goals and practices.

The staged representation provides a path of improvement from maturity level 1 to maturity level 5 that involves achieving the goals of the process areas at each maturity level. To support those who use the staged representation, process areas are grouped by maturity level, indicating which process areas to implement to achieve each maturity level. For example, at maturity level 2, there is a set of process areas that an organization would use to guide its process improvement until it could achieve all the goals of all these process areas. Once maturity level 2 is achieved, the organization focuses its efforts on maturity level 3 process areas, and so on. The generic goals that apply to each process area are also predetermined. Generic goal 2 applies to maturity level 2 and generic goal 3 applies to maturity levels 3 through 5. Table 3.2 provides a list of CMMI-DEV process areas and their associated categories and maturity levels.

TABLE 3.2 Process Areas, Categories, and Maturity Levels

Process Area                                        Category            Maturity Level
Causal Analysis and Resolution (CAR)                Support             5
Configuration Management (CM)                       Support             2
Decision Analysis and Resolution (DAR)              Support             3
Integrated Project Management (IPM)                 Project Management  3
Measurement and Analysis (MA)                       Support             2
Organizational Process Definition (OPD)             Process Management  3
Organizational Process Focus (OPF)                  Process Management  3
Organizational Performance Management (OPM)         Process Management  5
Organizational Process Performance (OPP)            Process Management  4
Organizational Training (OT)                        Process Management  3
Product Integration (PI)                            Engineering         3
Project Monitoring and Control (PMC)                Project Management  2
Project Planning (PP)                               Project Management  2
Process and Product Quality Assurance (PPQA)        Support             2
Quantitative Project Management (QPM)               Project Management  4
Requirements Development (RD)                       Engineering         3
Requirements Management (REQM)                      Project Management  2
Risk Management (RSKM)                              Project Management  3
Supplier Agreement Management (SAM)                 Project Management  2
Technical Solution (TS)                             Engineering         3
Validation (VAL)                                    Engineering         3
Verification (VER)                                  Engineering         3

Equivalent Staging

Equivalent staging is a way to compare results from using the continuous representation to results from using the staged representation. In essence, if you measure improvement relative to selected process areas using capability levels in the continuous representation, how do you translate that work into maturity levels? Is this translation possible?

Up to this point, we have not discussed process appraisals in much detail. The SCAMPI method6 is used to appraise organizations using CMMI, and one result of an appraisal is a rating [SEI 2011a,

6. The Standard CMMI Appraisal Method for Process Improvement (SCAMPI) method is described in Chapter 5.


Ahern 2005]. If the continuous representation is used for an appraisal, the rating is a "capability level profile." If the staged representation is used for an appraisal, the rating is a "maturity level rating" (e.g., maturity level 3).

A capability level profile is a list of process areas and the corresponding capability level achieved for each. This profile enables an organization to track its capability level by process area. The profile is called an "achievement profile" when it represents the organization's actual progress for each process area. Alternatively, the profile is called a "target profile" when it represents the organization's planned process improvement objectives. Figure 3.3 illustrates a combined target and achievement profile. The blue portion of each bar represents what has been achieved. The unshaded portion represents what remains to be accomplished to meet the target profile.

[Figure: horizontal bars for Requirements Management, Project Planning, Project Monitoring and Control, Supplier Agreement Management, Measurement and Analysis, Process and Product Quality Assurance, and Configuration Management, each plotted against Capability Level 1 through Capability Level 3.]

FIGURE 3.3 Example Combined Target and Achievement Profile
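To make the profile idea concrete, here is a minimal sketch in Python (not from the book) that compares a hypothetical achievement profile against a target profile for the maturity level 2 process areas; the capability levels shown are invented for illustration.

    # Hypothetical target and achievement capability level profiles for the
    # maturity level 2 process areas; real values would come from an appraisal.
    target_profile = {"REQM": 3, "PP": 3, "PMC": 3, "SAM": 2, "MA": 3, "PPQA": 2, "CM": 3}
    achievement_profile = {"REQM": 3, "PP": 2, "PMC": 2, "SAM": 2, "MA": 1, "PPQA": 2, "CM": 2}

    for process_area, target_cl in target_profile.items():
        achieved_cl = achievement_profile.get(process_area, 0)
        gap = max(target_cl - achieved_cl, 0)
        status = "target met" if gap == 0 else f"{gap} capability level(s) remaining"
        print(f"{process_area:5s} achieved CL{achieved_cl} / target CL{target_cl}: {status}")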


An achievement profile, when compared with a target profile, enables an organization to plan and track its progress for each selected process area. Maintaining capability level profiles is advisable when using the continuous representation.

Target staging is a sequence of target profiles that describes the path of process improvement to be followed by the organization. When building target profiles, the organization should pay attention to the dependencies between generic practices and process areas. If a generic practice depends on a process area, either to carry out the generic practice or to provide a prerequisite work product, the generic practice can be much less effective when the process area is not implemented.7

Although the reasons to use the continuous representation are many, ratings consisting of capability level profiles are limited in their ability to provide organizations with a way to generally compare themselves with other organizations. Capability level profiles can be used if each organization selects the same process areas; however, maturity levels have been used to compare organizations for years and already provide predefined sets of process areas.

Because of this situation, equivalent staging was created. Equivalent staging enables an organization using the continuous representation to convert a capability level profile to the associated maturity level rating. The most effective way to depict equivalent staging is to provide a sequence of target profiles, each of which is equivalent to a maturity level rating of the staged representation reflected in the process areas listed in the target profile. The result is a target staging that is equivalent to the maturity levels of the staged representation.

Figure 3.4 shows a summary of the target profiles that must be achieved when using the continuous representation to be equivalent to maturity levels 2 through 5. Each shaded area in the capability level columns represents a target profile that is equivalent to a maturity level. The following rules summarize equivalent staging:

• To achieve maturity level 2, all process areas assigned to maturity level 2 must achieve capability level 2 or 3.
• To achieve maturity level 3, all process areas assigned to maturity levels 2 and 3 must achieve capability level 3.

7. See Table 7.2 in the Generic Goals and Generic Practices section of Part Two for more information about the dependencies between generic practices and process areas.


[Figure: equivalent staging shown as a table of process areas (Name, Abbr., ML) with shaded capability level columns (CL1, CL2, CL3) forming a sequence of target profiles. Target Profile 2 covers the maturity level 2 process areas (CM, MA, PMC, PP, PPQA, REQM, SAM); Target Profile 3 adds the maturity level 3 process areas (DAR, IPM, OPD, OPF, OT, PI, RD, RSKM, TS, VAL, VER); Target Profile 4 adds OPP and QPM; Target Profile 5 adds CAR and OPM.]

FIGURE 3.4 Target Profiles and Equivalent Staging

• To achieve maturity level 4, all process areas assigned to maturity levels 2, 3, and 4 must achieve capability level 3.
• To achieve maturity level 5, all process areas must achieve capability level 3.
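The rules above can be read as a simple mapping from a capability level profile to an equivalent maturity level. The following minimal sketch in Python (not part of the model) illustrates that mapping; the example profile is hypothetical.

    # Process areas grouped by the maturity level to which they are assigned (CMMI-DEV V1.3).
    ML_ASSIGNMENT = {
        2: ["CM", "MA", "PMC", "PP", "PPQA", "REQM", "SAM"],
        3: ["DAR", "IPM", "OPD", "OPF", "OT", "PI", "RD", "RSKM", "TS", "VAL", "VER"],
        4: ["OPP", "QPM"],
        5: ["CAR", "OPM"],
    }

    def equivalent_maturity_level(profile):
        """Translate a capability level profile (process area -> capability level)
        into the equivalent maturity level rating."""
        def meets(max_ml, required_cl):
            # Every process area assigned to maturity levels 2..max_ml must reach required_cl.
            areas = [pa for ml, pas in ML_ASSIGNMENT.items() if ml <= max_ml for pa in pas]
            return all(profile.get(pa, 0) >= required_cl for pa in areas)

        achieved = 1
        if meets(2, 2):                 # ML2: all ML2 process areas at CL2 or CL3
            achieved = 2
        for level in (3, 4, 5):         # ML3-ML5: all assigned process areas at CL3
            if achieved == level - 1 and meets(level, 3):
                achieved = level
        return achieved

    # Hypothetical profile: ML2 process areas at CL3, ML3 process areas at CL2.
    profile = {pa: 3 for pa in ML_ASSIGNMENT[2]}
    profile.update({pa: 2 for pa in ML_ASSIGNMENT[3]})
    print(equivalent_maturity_level(profile))  # prints 2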

Achieving High Maturity

When using the staged representation, you attain high maturity when you achieve maturity level 4 or 5. Achieving maturity level 4 involves implementing all process areas for maturity levels 2, 3, and 4. Likewise, achieving maturity level 5 involves implementing all process areas for maturity levels 2, 3, 4, and 5.


When using the continuous representation, you attain high maturity using the equivalent staging concept. High maturity that is equivalent to staged maturity level 4 using equivalent staging is attained when you achieve capability level 3 for all process areas except for Organizational Performance Management (OPM) and Causal Analysis and Resolution (CAR). High maturity that is equivalent to staged maturity level 5 using equivalent staging is attained when you achieve capability level 3 for all process areas.

Using Process Performance Baselines and Process Performance Models to Enable Success

by Michael Campo, Neal Mackertich, and Peter Kraus

Process performance baselines and process performance models are expected CMMI high maturity (maturity levels 4 and 5) artifacts that build on the measurement and analysis activities established at CMMI maturity level 2 to lift an organization from a reactive management state to a proactive management state. At maturity level 2, measurements are analyzed and action taken based on trend analysis or thresholds being crossed. By contrast, the process performance baselines and models developed using high maturity practices enable an organization to predict the ability of its processes to perform in relationship to its business objectives. By aligning process performance and business objectives, process performance baselines and models act as facilitators to organizational and project success. In this essay, we examine the concepts of process performance baselines and models based on our experiences at Raytheon Integrated Defense Systems (IDS).

Before discussing process performance baselines and models, it's important to understand the concept of "quality and process performance objectives." Quality and process performance objectives derived from business objectives are used to leverage value from process performance baselines and models. This connection between quality and process performance objectives, baselines, and models is key. Without it, baselines and models may still provide some benefit to an organization, but not the systematic optimization that comes from using the baselines and models as enabling tools to focus process performance on achievement of business objectives.


In the Measurement and Analysis process area, an organization and its projects develop a measurement system used to support management information needs that are derived from organizational and project objectives. For example, a project with an objective to meet or beat its customer delivery date has an information need to understand its progress toward achieving that milestone. Measures supporting that information need might include schedule performance, productivity, and size. By collecting and analyzing such measures, the project can react to trends that indicate risk to the delivery date. These activities align with the Goal Question Metric (GQM) approach developed by Dr. Victor Basili and others from his work with the United States National Aeronautics and Space Administration.

Quality and process performance objectives build on the GQM approach by explicitly analyzing business, organizational, or project objectives and creating quantified "goals" related to processes that support those objectives. These goals become the quality and process performance objectives. Quality and process performance objectives are typically more strategically aligned than maturity level 2 Measurement and Analysis objectives and are developed from an understanding of historical process performance data.

Often, business and organizational objectives may be expressed in nonquantitative terms. For example, "Improve Customer Satisfaction" may be a business objective. In such cases, it may be necessary to derive quality and process performance objectives by asking additional questions, such as "What does it take to improve customer satisfaction?" The answer may be meeting cost, schedule, or quality targets from which quantitative quality and process performance objectives can be established. Individual projects may also establish quality and process performance objectives related to their specific project context, such as objectives related to award fee criteria.

Raytheon IDS has specific business objectives associated with cost and schedule performance. Although many things impact cost and schedule performance on a project, drivers from a process perspective involved improving productivity and reducing rework. Quantitative quality and process performance objectives were established for productivity, defect containment, and defect density in support of the cost and schedule performance objectives. Process performance baselines were then created to understand our ability to achieve the quality and process performance objectives. Historical data related to productivity, defect containment, and defect density was statistically analyzed. Prediction intervals and statistical process control charts were established at both the organizational and individual project levels. Analysis results graphically depict the process performance capability of the organization and its projects as related to the quality and process performance objectives. They represent our process performance baselines.

Process performance baselines are integral enabling management tools. Projects maintain their process performance baselines using ongoing project data as part of their normal ongoing management activity. Process performance baselines are compared against the quality and process performance objectives. Causal analysis is performed and corrective action taken when process performance baselines indicate risk that the quality and process performance objectives may not be achieved. In some cases, analysis involves using statistical techniques to evaluate lower level measures that support a quality and process performance objective. For example, using statistical techniques to analyze peer review data supports quality and process performance objectives related to defect containment and defect density. Organizational process performance baselines are used to maintain quality and process performance objectives by relating current organizational process performance capability to changing business objectives.

The direct relationship between business objectives and quality and process performance objectives, and the use of process performance baselines to quantitatively manage our progress toward achieving these objectives, focused our organization on the characteristics of success. If organizational objectives are subjective (e.g., "Improve customer satisfaction"), projects may become disengaged with those objectives. Making process performance baselines and quality and process performance objectives an integral part of project management clarifies each project's role in business success. The projects become enlisted in a grass-roots effort to help achieve the business objectives. An organization where all individuals recognize their role and responsibility for business success is an organization that is more likely to achieve success.

The versatility of the Goal Question Metric approach further enabled our development and deployment of process performance models in the form of a Goal Question Model approach. As with process performance baselines, all process performance model efforts initiate through the linkage and alignment with quality and process performance objectives derived from business objectives. This approach is absolutely critical to effective process performance modeling since without this up-front business alignment, there is a tendency to create elegant models rather than effective models that support business objectives. Organizational leadership engagement is integral to this process.

Development of our Systems Lifecycle Analysis Model (SLAM) process performance model was a direct result of a concern expressed by Raytheon IDS Engineering Leadership about the productivity and rework risks associated with accelerated concurrent engineering efforts and their impact on downstream cost performance. This concern naturally led to our generation of questions around what factors related to our concurrent engineering efforts influence our achievement of cost performance, and what controllable subprocesses relate to those factors.

Of specific interest in the development of the SLAM model was the potential statistical relationship of requirements volatility and the degree of requirements-design overlap with downstream software and hardware development cost performance. Modeling the relationship between potential input factors and project outcomes involved the use of statistical methods such as regression analysis and Monte Carlo simulation. In the specific case of the SLAM model, a mathematical function of the input factors was reasonably well correlated with the output responses using linear regression techniques (with an adjusted r-squared value = 0.65, p = .000). See Figure 3.5. Additionally collected project data from SLAM piloting and deployment further confirmed the strength of this underlying relationship. The regression equation associated with this statistical correlation was the building block for our SLAM model development efforts.

This is a nice point in our Perspective to reflect on the healthy ingredients of process performance models developed by the Software Engineering Institute:

• Are statistical, probabilistic, or simulation in nature
• Predict interim and/or final project outcomes
• Use controllable factors tied to subprocesses to conduct the prediction
• Model the variation of factors and understand the predicted range or variation of the outcomes
• Enable "what-if" analysis for project planning, dynamic replanning, and problem resolution during project execution
• Connect "upstream" activity with "downstream" activity
• Enable projects to achieve midcourse corrections to ensure project success
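The process performance baselines described earlier in this essay rested on prediction intervals and statistical process control charts over historical productivity and defect data. As a minimal illustration of that kind of computation (not the actual Raytheon IDS analysis), the following Python sketch derives individuals-chart limits and an approximate prediction interval from hypothetical productivity data.

    import numpy as np

    # Hypothetical historical productivity observations (e.g., units of work per staff-week).
    productivity = np.array([11.2, 10.5, 12.1, 9.8, 11.7, 10.9, 12.4, 10.2, 11.0, 11.5])

    # Individuals (XmR) control chart limits estimated from the average moving range.
    moving_range = np.abs(np.diff(productivity))
    center = productivity.mean()
    sigma_est = moving_range.mean() / 1.128          # d2 constant for subgroups of size 2
    ucl, lcl = center + 3 * sigma_est, center - 3 * sigma_est

    # A simple prediction interval for the next observation, assuming rough normality.
    pred_sd = productivity.std(ddof=1) * np.sqrt(1 + 1 / len(productivity))
    pi_low, pi_high = center - 2 * pred_sd, center + 2 * pred_sd

    print(f"Control limits: LCL={lcl:.2f}, CL={center:.2f}, UCL={ucl:.2f}")
    print(f"Approximate 95% prediction interval for next project: [{pi_low:.2f}, {pi_high:.2f}]")
    # A quality and process performance objective (e.g., productivity >= 10) can then be
    # compared against these limits to judge whether the objective is at risk.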


[Figure: fitted relationship of f(% Design Complete, System Volatility) versus HW/SW CPI, with R-squared = 0.67 and adjusted R-squared = 0.65.]

FIGURE 3.5 Relationship of Hardware/Software Cost Performance and Design Completion/Requirements Volatility
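To suggest the kind of regression fit behind Figure 3.5, here is a minimal Python sketch using synthetic data; the factor names, coefficients, and noise level are hypothetical stand-ins, since the actual SLAM relationship was fit from Raytheon IDS project data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic historical projects: percent design complete at requirements release,
    # requirements volatility, and the observed hardware/software cost performance index (CPI).
    n = 40
    pct_design_complete = rng.uniform(0.2, 0.9, n)
    requirements_volatility = rng.uniform(0.05, 0.40, n)
    cpi = (1.0 + 0.3 * pct_design_complete - 0.8 * requirements_volatility
           + rng.normal(0.0, 0.05, n))

    # Ordinary least squares fit: CPI ~ b0 + b1 * design_complete + b2 * volatility
    X = np.column_stack([np.ones(n), pct_design_complete, requirements_volatility])
    coefficients, *_ = np.linalg.lstsq(X, cpi, rcond=None)
    residuals = cpi - X @ coefficients
    r_squared = 1.0 - residuals.var() / cpi.var()

    print("intercept, design, volatility coefficients:", np.round(coefficients, 3))
    print("R-squared:", round(r_squared, 2))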

These captured ingredients have served us well in guiding our process performance model development and deployment efforts. The SLAM model leverages the statistical correlation of controllable factors to an outcome prediction in the form of a regression equation. A user-friendly Excel-based interface was then created using Crystal Ball (an industry-available software package) to model and statistically generate a cost performance prediction interval using Monte Carlo simulation. See Figure 3.6.

Note that the SLAM model user interface also includes worksheets containing Crystal Ball download instructions, step-by-step guidance for projects on Running SLAM, an Interpreting the Results guide, and a listing of potential mitigation Strategies based on best practices and lessons learned from previous deployment efforts.

FIGURE 3.6 SLAM Model User Interface


Information on these worksheets has proven itself to be invaluable in enabling "what-if" analysis for project planning, dynamic replanning, and problem resolution during project execution. Projects with aggressive schedules where hardware and software design activities begin prior to requirements release can quantify the associated cost performance risks, determine the requirements volatility level that must be maintained to meet the cost performance objectives, and identify the process changes necessary to manage requirements accordingly. The SLAM model has been effectively used by integrated project teams made up of Systems, Software, Hardware, and Quality Engineering during project planning and execution to predict, manage, and mitigate risk in achieving project cost objectives. Integrated SLAM project deployment has delivered significant cost and cycle time benefits for our programs and has become an integral part of our risk management and decision-making processes.

The SLAM model has helped projects address real issues regarding their ability to meet their cost performance objectives. The success of SLAM promoted further business investment in the process performance modeling of the product development lifecycle. This investment has led to the development of a family of process performance models that support Raytheon IDS cost and schedule business objectives.

Effective development and deployment of process performance baselines and models have significantly improved our process alignment and performance against Raytheon Integrated Defense Systems business and engineering objectives. Resulting efforts in the areas of statistical process management, root cause analysis and corrective action, interdependent execution, and statistically-based risk assessment have resulted in increased productivity, reduced rework, and improved cost and schedule performance. Return-on-investment analysis pertaining to our Raytheon IDS CMMI high maturity efforts has indicated a 24:1 return from our process investment. Process performance baselines and models are the spark that ignited these results and fuels our drive for more.
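To give a flavor of the Monte Carlo "what-if" analysis described in this essay, here is a minimal Python sketch. It assumes a fitted regression of the form sketched above; the planning ranges, distributions, and CPI threshold are hypothetical, and the actual SLAM model runs in Excel with Crystal Ball.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical fitted regression: CPI = b0 + b1 * design_complete + b2 * volatility + error
    b0, b1, b2, residual_sd = 1.0, 0.3, -0.8, 0.05

    # Planned project inputs are uncertain, so sample them rather than using point estimates.
    trials = 10_000
    design_complete = rng.triangular(0.30, 0.50, 0.70, trials)   # planned % design complete
    volatility = rng.triangular(0.10, 0.20, 0.35, trials)        # expected requirements volatility
    predicted_cpi = (b0 + b1 * design_complete + b2 * volatility
                     + rng.normal(0.0, residual_sd, trials))

    low, high = np.percentile(predicted_cpi, [5, 95])
    print(f"90% prediction interval for CPI: [{low:.2f}, {high:.2f}]")
    print(f"Probability of missing a CPI objective of 0.95: {np.mean(predicted_cpi < 0.95):.1%}")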

CHAPTER 4

RELATIONSHIPS AMONG PROCESS AREAS

In this chapter we describe the key relationships among process areas to help you see the organization's view of process improvement and how process areas depend on the implementation of other process areas.

The relationships among multiple process areas, including the information and artifacts that flow from one process area to another—illustrated by the figures and descriptions in this chapter—help you to see a larger view of process implementation and improvement.

Successful process improvement initiatives must be driven by the business objectives of the organization. For example, a common business objective is to reduce the time it takes to get a product to market. The process improvement objective derived from that might be to improve the project management processes to ensure on-time delivery; those improvements rely on best practices in the Project Planning and Project Monitoring and Control process areas.

Although we group process areas in this chapter to simplify the discussion of their relationships, process areas often interact and have an effect on one another regardless of their group, category, or level. For example, the Decision Analysis and Resolution process area (a Support process area at maturity level 3) contains specific practices that address the formal evaluation process used in the Technical Solution process area for selecting a technical solution from alternative solutions.

Being aware of the key relationships that exist among CMMI process areas will help you apply CMMI in a useful and productive way. Relationships among process areas are described in more detail in the references of each process area and specifically in the Related Process Areas section of each process area in Part Two. Refer to Chapter 2 for more information about references.



Process Management

Process Management process areas contain the cross-project activities related to defining, planning, deploying, implementing, monitoring, controlling, appraising, measuring, and improving processes.

The five Process Management process areas in CMMI-DEV are as follows:

• Organizational Process Definition (OPD)
• Organizational Process Focus (OPF)
• Organizational Performance Management (OPM)
• Organizational Process Performance (OPP)
• Organizational Training (OT)

Basic Process Management Process Areas

The Basic Process Management process areas provide the organization with a capability to document and share best practices, organizational process assets, and learning across the organization.

Figure 4.1 provides a bird's-eye view of the interactions among the Basic Process Management process areas and with other process area categories. As illustrated in Figure 4.1, the Organizational Process Focus process area helps the organization to plan, implement, and deploy organizational process improvements based on an understanding of the current strengths and weaknesses of the organization's processes and process assets.

Candidate improvements to the organization's processes are obtained through various sources. These activities include process improvement proposals, measurement of the processes, lessons learned in implementing the processes, and results of process appraisal and product evaluation activities.

The Organizational Process Definition process area establishes and maintains the organization's set of standard processes, work environment standards, and other assets based on the process needs and objectives of the organization. These other assets include descriptions of lifecycle models, process tailoring guidelines, and process related documentation and data.

Projects tailor the organization's set of standard processes to create their defined processes. The other assets support tailoring as well as implementation of the defined processes.

[Figure: interactions among the Basic Process Management process areas (OPF, OPD, OT), senior management, and the Project Management, Support, and Engineering process areas. Senior management provides the organization's business objectives, resources, and coordination; OPF gathers the organization's process needs and objectives, process improvement proposals, and improvement information (e.g., lessons learned, data, and artifacts); OPD maintains the standard process, work environment standards, and other assets; OT provides training for projects and support groups in the standard process and assets, based on identified training needs.]

FIGURE 4.1 Basic Process Management Process Areas


Experiences and work products from performing these defined processes, including measurement data, process descriptions, process artifacts, and lessons learned, are incorporated as appropriate into the organization's set of standard processes and other assets.

The Organizational Training process area identifies the strategic training needs of the organization as well as the tactical training needs that are common across projects and support groups. In particular, training is developed or obtained to develop the skills required to perform the organization's set of standard processes. The main components of training include a managed training development program, documented plans, staff with appropriate knowledge, and mechanisms for measuring the effectiveness of the training program.

Advanced Process Management Process Areas

The Advanced Process Management process areas provide the organization with an improved capability to achieve its quantitative objectives for quality and process performance.

Figure 4.2 provides a bird's-eye view of the interactions among the Advanced Process Management process areas and with other process area categories. Each of the Advanced Process Management process areas depends on the ability to develop and deploy processes and supporting assets. The Basic Process Management process areas provide this ability.

As illustrated in Figure 4.2, the Organizational Process Performance process area derives quantitative objectives for quality and process performance from the organization's business objectives. The organization provides projects and support groups with common measures, process performance baselines, and process performance models. These additional organizational assets support composing a defined process that can achieve the project's quality and process performance objectives and support quantitative management. The organization analyzes the process performance data collected from these defined processes to develop a quantitative understanding of product quality, service quality, and process performance of the organization's set of standard processes.

In Organizational Performance Management, process performance baselines and models are analyzed to understand the organization's ability to meet its business objectives and to derive quality and process performance objectives. Based on this understanding, the organization proactively selects and deploys incremental and innovative improvements that measurably improve the organization's performance.

Wow! eBook process areas Project Management, Support, and Engineering Cost and benefit data from piloted improvements Performance capability data Quality and process measures, baselines, performance objectives, and models

Common measures ptg OPM OPP Basic process areas Process Management Business objectives Improvements Quality and process objectives Performance capability Senior Management Organization Ability to develop and deploy standard processes and other assets OPM = Organizational Performance Management = Organizational Process Performance OPP FIGURE 4.2 Advanced Process Management Process Areas


The selection of improvements to deploy is based on a quantitative understanding of the likely benefits and predicted costs of deploying candidate improvements. The organization can also adjust business objectives and quality and process performance objectives as appropriate.

Project Management

Project Management process areas cover the project management activities related to planning, monitoring, and controlling the project.

The seven Project Management process areas in CMMI-DEV are as follows:

• Integrated Project Management (IPM)
• Project Monitoring and Control (PMC)
• Project Planning (PP)
• Quantitative Project Management (QPM)
• Requirements Management (REQM)
• Risk Management (RSKM)
• Supplier Agreement Management (SAM)

Basic Project Management Process Areas

The Basic Project Management process areas address the activities related to establishing and maintaining the project plan, establishing and maintaining commitments, monitoring progress against the plan, taking corrective action, and managing supplier agreements.

Figure 4.3 provides a bird's-eye view of the interactions among the Basic Project Management process areas and with other process area categories. As illustrated in Figure 4.3, the Project Planning process area includes developing the project plan, involving relevant stakeholders, obtaining commitment to the plan, and maintaining the plan.

Planning begins with requirements that define the product and project ("What to Build" in Figure 4.3). The project plan covers the various project management and development activities performed by the project. The project reviews other plans that affect the project from various relevant stakeholders and establishes commitments with those stakeholders for their contributions to the project. For example, these plans cover configuration management, verification, and measurement and analysis.

[Figure: interactions among the Basic Project Management process areas (PP, PMC, REQM, SAM), the Engineering and Support process areas, and suppliers. PP establishes plans, commitments, what to build, and what to monitor; PMC tracks status, issues, and results of reviews and monitoring and directs corrective action, including replanning; REQM manages product and product component requirements; SAM manages the supplier agreement, supplier progress, and acceptance reviews and tests of supplier-produced product components.]

FIGURE 4.3 Basic Project Management Process Areas


The Project Monitoring and Control process area contains practices for monitoring and controlling activities and taking corrective action. The project plan specifies the frequency of progress reviews and the measures used to monitor progress. Progress is determined primarily by comparing project status to the plan. When the actual status deviates significantly from the expected values, corrective actions are taken as appropriate. These actions can include replanning, which requires using Project Planning practices.

The Requirements Management process area maintains the requirements. It describes activities for obtaining and controlling requirement changes and ensuring that other relevant plans and data are kept current. It provides traceability of requirements from customer requirements to product requirements to product component requirements.

Requirements Management ensures that changes to requirements are reflected in project plans, activities, and work products. This cycle of changes can affect the Engineering process areas; thus, requirements management is a dynamic and often recursive sequence of events. The Requirements Management process area is fundamental to a controlled and disciplined engineering process.

The Supplier Agreement Management process area addresses the need of the project to acquire those portions of work that are produced by suppliers. Sources of products that can be used to satisfy project requirements are proactively identified. The supplier is selected, and a supplier agreement is established to manage the supplier. The supplier's progress and performance are tracked as specified in the supplier agreement, and the supplier agreement is revised as appropriate. Acceptance reviews and tests are conducted on the supplier-produced product component.

Advanced Project Management Process Areas

The Advanced Project Management process areas address activities such as establishing a defined process that is tailored from the organization's set of standard processes, establishing the project work environment from the organization's work environment standards, coordinating and collaborating with relevant stakeholders, forming and sustaining teams for the conduct of projects, quantitatively managing the project, and managing risk.

Figure 4.4 provides a bird's-eye view of the interactions among the Advanced Project Management process areas and with other process area categories. Each Advanced Project Management process area depends on the ability to plan, monitor, and control the project. The Basic Project Management process areas provide this ability.

[Figure: interactions among the Advanced Project Management process areas (IPM, QPM, RSKM), the Basic Project Management process areas, the Process Management process areas, and the Engineering and Support process areas. IPM establishes the project's defined process and work environment from the organization's standard processes, work environment standards, and teaming rules and guidelines, and coordinates commitments and issues to resolve; QPM uses process performance objectives, baselines, and models to quantitatively and statistically manage selected subprocesses; RSKM manages identified risks, risk taxonomies and parameters, risk status, mitigation plans, and corrective action.]

FIGURE 4.4 Advanced Project Management Process Areas


The Integrated Project Management process area establishes and maintains the project's defined process that is tailored from the organization's set of standard processes (Organizational Process Definition). The project is managed using the project's defined process. The project uses and contributes to the organizational process assets, the project's work environment is established and maintained from the organization's work environment standards, and teams are established using the organization's rules and guidelines. The project's relevant stakeholders coordinate their efforts in a timely manner through the identification, negotiation, and tracking of critical dependencies and the resolution of coordination issues.

Although risk identification and monitoring are covered in the Project Planning and Project Monitoring and Control process areas, the Risk Management process area takes a continuing, forward-looking approach to managing risks with activities that include identification of risk parameters, risk assessments, and risk mitigation.

The Quantitative Project Management process area establishes objectives for quality and process performance, composes a defined process that can help achieve those objectives, and quantitatively manages the project. The project's quality and process performance objectives are based on the objectives established by the organization and the customer. The project's defined process is composed using statistical and other quantitative techniques. Such an analysis enables the project to predict whether it will achieve its quality and process performance objectives. Based on the prediction, the project can adjust the defined process or can negotiate changes to quality and process performance objectives. As the project progresses, the performance of selected subprocesses is carefully monitored to help evaluate whether the project is on track to achieving its objectives.

Engineering

Engineering process areas cover the development and maintenance activities that are shared across engineering disciplines. The Engineering process areas were written using general engineering terminology so that any technical discipline involved in the product development process (e.g., software engineering, mechanical engineering) can use them for process improvement.

The Engineering process areas also integrate the processes associated with different engineering disciplines into a single product development process, supporting a product-oriented process improvement strategy. Such a strategy targets essential business objectives rather than specific technical disciplines. This approach to processes effectively avoids the tendency toward an organizational "stovepipe" mentality.

The Engineering process areas apply to the development of any product or service in the development domain (e.g., software products, hardware products, services, processes).

The five Engineering process areas in CMMI-DEV are as follows:

• Product Integration (PI)
• Requirements Development (RD)
• Technical Solution (TS)
• Validation (VAL)
• Verification (VER)

Figure 4.5 provides a bird's-eye view of the interactions among the five Engineering process areas.

The Requirements Development process area identifies customer needs and translates these needs into product requirements. The set of product requirements is analyzed to produce a high-level conceptual solution. This set of requirements is then allocated to establish an initial set of product component requirements.

Other requirements that help define the product are derived and allocated to product components. This set of product and product component requirements clearly describes the product's performance, quality attributes, design features, verification requirements, etc., in terms the developer understands and uses.

The Requirements Development process area supplies requirements to the Technical Solution process area, where the requirements are converted into the product architecture, product component designs, and product components (e.g., by coding, fabrication). Requirements are also supplied to the Product Integration process area, where product components are combined and interfaces are verified to ensure that they meet the interface requirements supplied by Requirements Development.

The Technical Solution process area develops technical data packages for product components to be used by the Product Integration or Supplier Agreement Management process area. Alternative solutions are examined to select the optimum design based on established criteria.

[Figure: interactions among the Engineering process areas. RD turns customer needs into product and product component requirements; TS turns requirements into alternative solutions and product components; PI assembles product components into the product delivered to the customer; VER and VAL verify and validate requirements, product components, and work products, producing verification and validation reports.]

FIGURE 4.5 Engineering Process Areas

These criteria can be significantly different across products, depending on product type, operational environment, performance requirements, support requirements, and cost or delivery schedules. The task of selecting the final solution makes use of the specific practices in the Decision Analysis and Resolution process area.

The Technical Solution process area relies on the specific practices in the Verification process area to perform design verification and peer reviews during design and prior to final build.

The Verification process area ensures that selected work products meet the specified requirements. The Verification process area selects work products and verification methods that will be used to verify work products against specified requirements. Verification is generally an incremental process, starting with product component verification and usually concluding with verification of fully assembled products.

Verification also addresses peer reviews. Peer reviews are a proven method for removing defects early and provide valuable insight into the work products and product components being developed and maintained.

The Validation process area incrementally validates products against the customer's needs. Validation can be performed in the operational environment or in a simulated operational environment. Coordination with the customer on validation requirements is an important element of this process area.

The scope of the Validation process area includes validation of products, product components, selected intermediate work products, and processes. These validated elements can often require reverification and revalidation. Issues discovered during validation are usually resolved in the Requirements Development or Technical Solution process area.

The Product Integration process area contains the specific practices associated with generating an integration strategy, integrating product components, and delivering the product to the customer. Product Integration uses the specific practices of both Verification and Validation in implementing the product integration process. Verification practices verify the interfaces and interface requirements of product components prior to product integration. Interface verification is an essential event in the integration process. During product integration in the operational environment, the specific practices of the Validation process area are used.

Expanding Capabilities Across the "Constellations"

by Mike Phillips

As we are finishing the details of the current collection of process areas that span three CMMI constellations, this perspective is my opportunity to encourage "continuous thinking."

My esteemed mentor as we began and then evolved the CMMI Product Suite was our Chief Architect, Dr. Roger Bate. Roger left us with an amazing legacy. He imagined that organizations could look at a collection of "process areas" and choose ones they might wish to use to facilitate their process improvement journey. Maturity levels for organizations were acceptable, but not as interesting to him as being able to focus attention on a collection of process areas for business benefit.

Small businesses have been the first to see the advantage of this approach, as they often find the full collection of process areas in any constellation daunting. An SEI report, "CMMI Roadmaps," describes some ways to construct thematic approaches to effective use of process areas from the CMMI for Development constellation. This report can be found on the SEI website at http://www.sei.cmu.edu/library/abstracts/reports/08tn010.cfm.

As we created the two new constellations, we took care to refer back to the predecessor collection of process areas in CMMI for Development. For example, in CMMI for Acquisition, we note that some acquisition organizations might need more technical detail in the requirements development effort than what we provided in Acquisition Requirements Development (ARD), and to "reach back" to CMMI-DEV's Requirements Development (RD) process area for more assistance. In CMMI for Services, we suggest that the Service System Development (SSD) process area is useful when the development efforts are relatively limited, but the full engineering process area category in CMMI-DEV may be useful if major service systems are being created and delivered.

Now with three full constellations to use for process improvement, many additional "refer to" possibilities exist. With the release of the V1.3 Product Suite, we offer the option to declare satisfaction of any of the process areas from the CMMI portfolio. What are some of the more obvious expansions?

We have already mentioned two expansions: ARD using RD and SSD expanded to capture RD, TS, PI, VER, and VAL. What about

situations in which most of the development is done outside the organization, but final responsibility for effective systems integration remains with your organization? Perhaps a few of the acquisition PAs would be useful beyond SAM. A simple start would be to investigate using SSAD and AM as a replacement for SAM to get the additional detailed help. And ATM might give some good technical assistance in monitoring the technical progress of the elements being developed by specific partners.

As we add the contributions of CMMI-SVC to the mix, several process areas offer more ways to expand. For example, in V1.2 of CMMI-DEV, we added informative material in Risk Management to address beforehand concerns about continuity of operations after some significant disruption occurs. Now, with CMMI-SVC, we have a full process area, Service Continuity (SCON), to provide robust coverage of continuity concerns. (And for those who need even more coverage, the SEI now has the Resilience Management Model (RMM) to give the greater attention that some financial institutions and similar organizations have expressed as necessary for their process improvement endeavors. For more, see http://www.cert.org/resilience/rmm.html.)

Another expansion worthy of consideration is to include the Service Systems Transition (SST) process area. Organizations that are responsible for development of new systems—and maintenance of existing systems until the new system can be brought to full capability—may find the practices contained in SST to be a useful expansion because the transition part of the product lifecycle has limited coverage in CMMI-DEV. In addition, CMMI-ACQ added two practices to PP and PMC to address planning for and monitoring transition into use, so the CMMI-ACQ versions of these two core process areas might couple nicely with SST.

A topic that challenged the development team for V1.3 was improved coverage of "strategy." Those of us with acquisition experience knew the criticality of an effective acquisition strategy to program success, so the practice was added to the CMMI-ACQ version of PP. In the CMMI-SVC constellation, Strategic Service Management (STSM) has the objective "to get the information needed to make effective strategic decisions about the set of standard services the organization maintains." With minor interpretation, this process area could assist a development organization in determining what types of development projects should be in its product development line.

Two process areas essential for service work were seriously considered for insertion into CMMI-DEV V1.3: Capacity and Availability Management (CAM) and Incident Resolution and Prevention (IRP). In the end, expansion of the CMMI-DEV constellation from 22 to 24 process areas was determined to be less valuable than continuing our efforts to streamline coverage. In any case, these two process areas offer another opportunity for the type of expansion I am exploring in this essay.

Those of you who have experienced appraisals have likely seen the use of target profiles that gather the collection of process areas to be examined. Often these profiles specifically address the necessary collections of process areas associated with maturity levels, but this need not be the case. With the release of V1.3, we have ensured that the reporting system (SCAMPI Appraisal System, or SAS) is robust enough to allow depiction of process areas from multiple CMMI constellations. As other architecturally similar SEI models, such as the RMM mentioned above and the People CMM, grow in use, we will be able to depict profiles using mixtures of process areas or even practices from multiple models, giving greater value to the process improvement efforts of a growing range of complex organizations.

Recursion and Iteration of Engineering Processes

Most process standards agree that there are two ways that processes can be applied. These two ways are called recursion and iteration.

Recursion occurs when a process is applied to successive levels of system elements within a system structure. The outcomes of one application are used as inputs to the next level in the system structure. For example, the verification process is designed to apply to the entire assembled product, the major product components, and even components of components. How far into the product you apply the verification process depends entirely on the size and complexity of the end product.

Iteration occurs when processes are repeated at the same system level. New information is created by the implementation of one process that feeds that information back into a related process. This new information typically raises questions that must be resolved before completing the processes.


For example, iteration will most likely occur between requirements development and technical solution. Reapplication of the processes can resolve the questions that are raised. Iteration can ensure quality prior to applying the next process.

Engineering processes (e.g., requirements development, verification) are implemented repeatedly on a product to ensure that these engineering processes have been adequately addressed before delivery to the customer. Further, engineering processes are applied to components of the product.

For example, some questions that are raised by processes associated with the Verification and Validation process areas can be resolved by processes associated with the Requirements Development or Product Integration process area. Recursion and iteration of these processes enable the project to ensure quality in all components of the product before it is delivered to the customer.

The project management process areas can likewise be recursive because sometimes projects are nested within projects.
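The two modes of application can be pictured in a few lines of code. The sketch below is not part of the CMMI model; the Component class and the placeholder process functions are hypothetical, and the example merely assumes that verification recurses over the system structure while requirements development and technical solution iterate at one level until their open questions are resolved.

class Component:
    """One element in a system structure (hypothetical, for illustration only)."""
    def __init__(self, name, subcomponents=None):
        self.name = name
        self.subcomponents = subcomponents or []

def verify(component):
    # Recursion: the same process is applied at every level of the system
    # structure, and lower-level results become inputs to the level above.
    lower_level_results = [verify(sub) for sub in component.subcomponents]
    print(f"verifying {component.name} using {len(lower_level_results)} lower-level results")
    return {"component": component.name, "inputs": lower_level_results}

def develop_requirements(open_questions):
    # Placeholder for requirements development at a single system level.
    return max(0, open_questions - 1)

def design_solution(open_questions):
    # Placeholder for technical solution work at the same system level;
    # in practice this step may raise new questions of its own.
    return open_questions

def iterate(component, open_questions=3):
    # Iteration: related processes repeat at one level until the questions
    # each raises for the other are resolved.
    while open_questions > 0:
        open_questions = design_solution(develop_requirements(open_questions))
    print(f"{component.name}: requirements and technical solution are consistent")

product = Component("product", [Component("subsystem", [Component("unit")])])
verify(product)   # applied recursively down the system structure
iterate(product)  # applied iteratively at a single level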

Measurement Makes Improvement Meaningful

by David N. Card

Measurement is an essential component of engineering, management, and process improvement. CMMI defines measurement requirements in Measurement and Analysis, Project Planning, Project Monitoring and Control, Quantitative Project Management, Organizational Process Performance, and, to lesser degrees, in other process areas. Measurement and analysis are two of the five steps in the Six Sigma DMAIC cycle. Lean is based on the application of the mathematical principles of queuing theory to processes. Unless performance can be quantified in terms of productivity, quality, or cycle time, it is hard to gauge improvement, regardless of the improvement approach adopted.

An effective measurement and analysis program is essential to the long-term business success of CMMI users. Measurement is the mechanism by which an organization gains insight into its performance and confronts “goodness.” A minimal measurement program may pass an appraisal but will limit the organization’s opportunities for real improvement. Moreover, without meaningful feedback, a maturity level is not likely to be maintained.


What roles should measurement and analysis play in an organization? Measurement often is defined in terms of collecting data and thus is distinguished from analysis, the interpretation and use of data. Clearly, the collection of data must be driven by its intended use. As Mark Twain observed, “Data is like garbage. You better know what you are going to do with it, before you start collecting it.” Four classes (or levels) of measurement users or decision-making processes may be defined. CMMI introduces these decision-making processes at different maturity levels. Each class of decision-making has a different focus, as described below:

• Enterprise. The long-term survival of the organization. The enterprise offers products and/or services in a market. Survival typically means maintaining or increasing profitability and market share. (Organizational Performance Management, a CMMI maturity level 5 process area.)
• Process. The efficiency and effectiveness of the organization’s means of accomplishing work. The processes of a typical organization may include software engineering, information technology, marketing, and administration. (CMMI maturity levels 4 and 5.)
• Project. Realization of a specific product or service for a specific customer or class of customers. Projects use organizational processes and resources to produce products and/or services for the enterprise to market. (CMMI maturity levels 2 and 3.)
• Product. Resolution of the technical aspects of the product or service necessary to meet the customer’s requirements. Projects manage the resources necessary to deliver the desired product or service. (CMMI maturity level 3.)

As indicated in the preceding descriptions, these areas of concern are interrelated. Moreover, measurement and analysis at all of these levels depend on many of the same sources of data. However, the level of detail and the manner in which the data is analyzed or reported for different purposes may vary.1

1. For a more detailed discussion, see D. Card, “A Practical Framework for Software Measurement and Analysis,” Auerbach, Systems Management Strategies, October 2000.

Measurement specialists often focus on the technical challenges of statistical analysis rather than addressing some of the human and social reasons why measurement systems sometimes fail. Many of these problems are common to organizational change initiatives. Three issues specific to measurement are as follows:

• The cost of data collection and processing is always visible, but the benefits of measurement are not. The benefits depend on what decisions are made based on the data. No decisions—no action—no benefits. Organizations must establish an action orientation.
• Measurement and analysis is new to many software and systems engineers and managers. Training and coaching in these new skills are essential to their successful implementation. As the scope of measurement increases with higher maturity, more sophisticated techniques must be mastered, making training even more critical.
• Measurement provides insight into performance. However, gaining and using that insight may be threatening to those whose performance is measured, and even to those who are just the messengers of performance information. Measurement systems must protect the participants.

All of these issues can be, and have been, overcome. CMMI’s Measurement and Analysis process area provides a good starting point for introducing measurement into an organization. Building on that with skills and techniques appropriate to the target maturity level and intended uses of measurement, as well as confronting the human issues that are sure to be encountered, facilitates the transition to management by fact, rather than opinion. Without an appropriate and effective measurement program, maturity levels are insignificant assertions.

Support

Support process areas cover the activities that support product development and maintenance. The Support process areas address processes that are used in the context of performing other processes. In general, the Support process areas address processes that are targeted toward the project and can address processes that apply more generally to the organization. For example, Process and Product Quality Assurance can be used with all the process areas to provide an objective evaluation of the processes and work products described in all the process areas.


The five Support process areas in CMMI-DEV are as follows:

• Causal Analysis and Resolution (CAR)
• Configuration Management (CM)
• Decision Analysis and Resolution (DAR)
• Measurement and Analysis (MA)
• Process and Product Quality Assurance (PPQA)

Basic Support Process Areas

The Basic Support process areas address fundamental support functions that are used by all process areas. Although all Support process areas rely on the other process areas for input, the Basic Support process areas provide support functions that also help implement several generic practices. Figure 4.6 provides a bird’s-eye view of the interactions among the Basic Support process areas and with all other process areas.

The Measurement and Analysis process area supports all process areas by providing specific practices that guide projects and organizations in aligning measurement needs and objectives with a measurement approach that is used to support management information needs.

[Figure 4.6 shows MA, PPQA, and CM exchanging measurements and analyses, quality and noncompliance issues, and controlled configuration items, baselines, and audit reports with all process areas.]

CM = Configuration Management
MA = Measurement and Analysis
PPQA = Process and Product Quality Assurance

FIGURE 4.6 Basic Support Process Areas

The results can be used in making informed decisions and taking appropriate corrective actions.

The Process and Product Quality Assurance process area supports all process areas by providing specific practices for objectively evaluating performed processes, work products, and services against the applicable process descriptions, standards, and procedures, and ensuring that any issues arising from these reviews are addressed. Process and Product Quality Assurance supports the delivery of high quality products and services by providing the project staff and all levels of management with appropriate visibility into, and feedback on, the processes and associated work products throughout the life of the project.

The Configuration Management process area supports all process areas by establishing and maintaining the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits. The work products placed under configuration management include the products that are delivered to the customer, designated internal work products, acquired products, tools, and other items that are used in creating and describing these work products.

Examples of work products that can be placed under configuration management include plans, process descriptions, requirements, design data, drawings, product specifications, code, compilers, product data files, and product technical publications.

Advanced Support Process Areas

The Advanced Support process areas provide the projects and organization with an improved support capability. Each of these process areas relies on specific inputs or practices from other process areas. Figure 4.7 provides a bird’s-eye view of the interactions among the Advanced Support process areas and with all other process areas.

Using the Causal Analysis and Resolution process area, project members identify causes of selected outcomes and take action to prevent negative outcomes from occurring in the future or to leverage positive outcomes. While the project’s defined processes are the initial targets for root cause analysis and action plans, effective process changes can result in process improvement proposals submitted to the organization’s set of standard processes.

The Decision Analysis and Resolution process area supports all the process areas by determining which issues should be subjected to a formal evaluation process and then applying a formal evaluation process to them.


[Figure 4.7 shows CAR receiving defects, other problems, and successes from all process areas and returning process improvement proposals, and DAR receiving selected issues from all process areas and returning formal evaluations.]

CAR = Causal Analysis and Resolution
DAR = Decision Analysis and Resolution

FIGURE 4.7 Advanced Support Process Areas

People, Process, Technology, and CMMI

by Gargi Keeni

Organizations that have been successful in their process improvement initiatives are known to have taken a holistic approach. Any discussion on process improvement would invariably emphasize the importance of addressing the people, process, and technology (PPT) triad. However, in reality, it is not so common to see synergy between the individuals or groups responsible for these PPT aspects in an organization. Synchronizing the goals of an individual or team with those of the organization is the key to any successful improvement program. In the context of CMMI, this becomes more evident for the process areas whose activities involve an inherent conflict between individual/team priorities and the organization’s priorities.

Organizations may unknowingly amplify this conflict by having reward and recognition systems in which individual incentives and career goals are only tied to individual/team performance, thereby inhibiting realization of organizational goals.

Let’s take an example of an organization that is trying to implement a Process and Product Quality Assurance (PPQA) process. PPQA is a process area at maturity level 2 whose purpose is to provide objective insight into processes and their associated work products. Now consider each component of the PPT triad:

• Process. Though it may seem relatively simple to just define the processes that need to be implemented and with which projects are thus expected to comply, that alone is rarely sufficient to ensure that the intent of the process will be understood or achieved. Possible evidence for this is when individuals/teams work overtime to get records in place just before the compliance check is scheduled!
• People. Further probing might highlight the fact that an incentive system links the number of noncompliances to the performance of the project leader or project team. This linkage motivates the individual or team to spend nonvalue-adding effort just to ensure that the number of noncompliances is as low as possible. The reason for the noncompliance in the first place could have been any of the following:
  • The processes are not suitable for the activities done by the individuals/teams.
  • The process definition is too complicated to be followed.
  • The individuals/teams are not aware of the process and its intent.
  In this case, the organization cannot reap the benefits of the PPQA activities (because the team will “fake it” to “get by” the audit) unless it revisits its people practice of linking the number of noncompliances to the performance of an individual or team. In other words, the organization needs to promote an environment that encourages employee participation in identifying and reporting quality issues (which can include issues with the processes that have been defined as well as noncompliances), which is subpractice 1 of PPQA SP 1.1.


• Technology. To obtain effective and timely conclusions with respect to the preceding reasons cited for possible noncompliance, organizations need to have systems for tracking and analyzing in near real time the noncompliances and the associated corrective actions (a minimal sketch of such tracking follows this list). This tracking and analysis, in turn, will provide managers at all levels with appropriate visibility into the processes and work products. However, just procuring a tool for managing noncompliances without proper deployment of processes turns out to be counterproductive in most cases.
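As a minimal sketch only, and not anything prescribed by PPQA, the following hypothetical record and summary illustrate the kind of near-real-time tracking the Technology bullet describes; the field names, reasons, and data are invented for illustration.

from collections import Counter
from dataclasses import dataclass

@dataclass
class Noncompliance:
    # Hypothetical fields; an organization would define its own.
    process: str           # e.g., "peer review"
    reason: str            # e.g., "process not suitable", "not aware of process"
    corrective_action: str
    closed: bool = False

def summarize(items):
    """Group open noncompliances by reported reason so recurring process
    problems (not just individual lapses) become visible to management."""
    open_items = [n for n in items if not n.closed]
    return Counter(n.reason for n in open_items)

log = [
    Noncompliance("peer review", "process not suitable", "tailor checklist"),
    Noncompliance("peer review", "process not suitable", "tailor checklist"),
    Noncompliance("unit test", "not aware of process", "schedule training", closed=True),
]
print(summarize(log))  # Counter({'process not suitable': 2})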

Here’s what needs to be realized in the situation just described: Objectivity in PPQA evaluations is critical and can be achieved by both independence and the use of criteria; however, the credibility and motivation of the independent entity have a major impact on the quality of PPQA evaluations. In organizations where individuals from other teams conduct the PPQA activities, it is very important to ensure that the right individual is made available for the task. However, as experience shows, the right people are usually not available unless the right motivation is provided. If individuals are always evaluated based on the work they do for their team, there is no motivation for them to provide any service external to the team. However, because PPQA is a Support PA that runs across the organization, the organization needs to review its incentive measures to provide the right motivation for these activities, which are beyond the individuals’/teams’ current functions and performance. Incentives that are more fully aligned with organizational objectives will motivate early identification of noncompliances, which in turn will help identify possible problem areas in the processes that the organization is trying to deploy.

As you can see from the PPQA example, it is important to consider all three dimensions of people, process, and technology when addressing any of the activities described in the process areas in CMMI. The PPT impact becomes even more apparent for PAs at higher maturity levels, which have inherently built-in organizational perspectives irrespective of whether it is a large or small organization.

It is essential for an organization to provide an environment that fosters effective deployment of processes. Analysis of the outcomes/results on a regular basis provides insights into the effectiveness of the processes and the changes required to meet business objectives.


The PAs at higher maturity levels often directly address people and technology in the practices. However, when implementing these practices, it is important to understand the natural relationships that exist to nurture these behaviors so that an organization can readily adapt to improvement and change, which is necessary in today’s environment.




CHAPTER 5

USING CMMI MODELS

The complexity of products today demands an integrated view of how organizations do business. CMMI can reduce the cost of process improvement across enterprises that depend on multiple functions or groups to achieve their objectives.

To achieve this integrated view, the CMMI Framework includes common terminology, common model components, common appraisal methods, and common training materials. This chapter describes how organizations can use the CMMI Product Suite not only to improve their quality, reduce their costs, and optimize their schedules, but also to gauge how well their process improvement program is working.

The Role of Process Standards in Process Definition†

by James W. Moore

Suppose you decide to take a physics course at a university. Naturally, you want to get a good grade. You faithfully attend the lectures, read the textbooks, attend the lab sessions, do the homework, take the final examination, and hope for the best outcome. Alternatively, if you can find a willing instructor, you could ask the instructor for a copy of the final exam on the first day of the course. As time permits, you could figure out the answers to the questions and, by the end of the course, turn in your answers to the exam questions with a request for the corresponding grade. Which of these two approaches is the more effective way to learn physics? We’ll return to this question at the end of this article.

† © 2010 The MITRE Corporation. All rights reserved. [The author’s affiliation with The MITRE Corporation is provided for identification purposes only, and is not intended to convey or imply MITRE’s concurrence with, or support for, the positions, opinions or viewpoints expressed by the author.]



There is a long tradition of prescriptive process definition standards, like DoD-Std-2167A, that were intended to tell the software developers what they must do to achieve an acceptable level of professional practice; one conformed to such a standard by implementing processes meeting all of its provisions.1 In the last dozen or so years, that binary form of pass-fail evaluation has lost favor to a more graded approach of assessing the “maturity” or “capability” of the developer’s processes. ISO/IEC 15504, Process Assessment (commonly called SPICE), describes this approach. ISO/IEC 15504 is “agnostic” with respect to the processes being employed. Methods conforming to the standard can be used to assess any set of processes that are described in a particular stylized manner, called the process reference model. One such process reference model is provided by ISO/IEC/IEEE 12207, Software Lifecycle Processes. Another is provided by ISO/IEC/IEEE 15288, System Lifecycle Processes.

The 15504 standard is used in some nations, but CMMI may have greater recognition in others, especially the U.S. marketplace. Unlike 15504, CMMI provides a great deal of guidance regarding the definition of processes. The criteria that are applied to assess the processes implicitly specify practices that should be embedded in the developer’s processes. So this begs the question of whether any other standards are needed for process definition. After all, why not simply implement processes meeting the CMMI criteria and be done with it? There are four reasons why one should consider using process standards for defining an organization’s processes: scope, suitability, communication, and robustness.

First, let’s look at scope. Although CMMI is broad, it isn’t broad enough for all possible uses. An obvious example is quality management. Although CMMI includes provisions addressing the quality of software products and systems, it does not address the organization-level quality management systems specified by ISO 9001, Quality management systems—Requirements.

1. Because they were generally overly prescriptive, these standards were often “tailored” by modifying their detailed provisions. Of course, conformance to a tailored standard always begs the question of whether the tailoring was done responsibly, or simply consisted of deleting all inconvenient provisions.


Because ISO 9001 conformance is regarded as highly important in many countries, multinational suppliers will find it appropriate to consider that standard in defining their processes—even software and system processes—related to quality. Although less prominent than ISO 9001, there are many other standards describing processes of various kinds. In particular situations, suppliers should consider the provisions of those standards.

The second reason for considering process standards is suitability. It should be obvious that not every organization needs the same set of processes. After all, every organization has some unique characteristics that cause it to differ from its competitors. That competitive advantage should be addressed in organizational processes so that the advantage is relevantly applied to every project of the organization—maintaining its competitive edge. Defining a set of organizational processes (e.g., for software and systems engineering) should consider a number of factors, including:

• Size of the organization
• The nature of the products and services provided by the organization
• The competitive structure of the relevant marketplace
• The nature of the competition
• The regulatory structure that is relevant, including regulatory requirements
• Customs, traditions, and regulation in the relevant industry sector
• The “culture” of the organization, including the degree of regimentation that the workforce will tolerate
• Organizational attitudes toward quality
• The amount of investment capital available for process definition and the extent to which it can be amortized over a base of projects
• The uniformity (or diversity) of anticipated project requirements
• The competitive advantages to be “captured” by the processes
• The form in which documentation (either paper or electronic) is to be captured, saved, and possibly delivered

All of these considerations, and many more, affect the definition of organizational processes, leading to the conclusion that the selection and implementation of cost-effective processes requires a different solution for every organization.


Of course, available process standards do not directly account for all of these factors. They simply provide alternatives to be considered in the overall process definition. In some cases, they describe specific practices that are important in regulatory environments—for example, the Modified Condition/Decision Coverage criterion (of RTCA DO-178B, Software Considerations in Airborne Systems and Equipment Certification) that is applied to the testing of avionics systems. More generally, they provide a starting point, a shortcut that allows the process definers to avoid a large volume of relatively straightforward decisions. Perhaps most importantly, they record a baseline of responsible practice, a “safety net,” that is overlooked only at some risk.

The third reason for considering process standards is communication. Terminology in software engineering is notoriously vague and flexible—a sign of an emerging profession. This can lead to important misunderstandings: “Oh, when you paid me to ‘implement’ the software, you meant I should ‘test’ it also? I didn’t include that in the price; you have to pay me extra for that!” Even the consideration of proposals is difficult without a baseline of terminology for comparing processes. Suppose the buyer asks the supplier what sort of design review practices are performed by the supplier. One possible answer is to provide dozens of pages of proposal prose; another possible answer is for the supplier to say, “We use IEEE Std 1028, Software Reviews.” The second answer has the advantage of being succinct, of being precise in terms of what is included, and of being generally accepted as responsible practice.2

The international standards for lifecycle processes, ISO/IEC/IEEE 12207 and ISO/IEC/IEEE 15288, provide a comprehensive set of processes that span the lifecycles of software products and systems. If a supplier states that its software development process conforms to that of 12207, a buyer knows the answer to a large number of pertinent questions. To pick only one example, the buyer would know that the software requirements, the architectural design, the detailed design, and the code will all be evaluated for feasibility of operation and maintenance—important downstream activities.

2. By the way, I avoid the term best practice. Responsible practice is as good as it gets with me. I simply don’t believe that any single practice can be regarded as being the best in all possible situations. I believe that most organizations should strive to use responsible practices that are suited to their particular needs. —James W. Moore


The final reason for considering process standards is robustness. Reliance on a single source may lead to defining processes that are inadvertently oriented toward a particular concept of usage. Applying the processes in a different context may lead to unintended results. Despite best attempts at generality, it is impossible to anticipate all possible situations; so it is up to the process designer to consult multiple sources to find processes that are suitable for the particular organization. The level of prescription is an important consideration here. Those who define processes (including the people who wrote CMMI) necessarily have to make a trade-off of generality versus specificity. Many very good practices were omitted from CMMI because they weren’t generally applicable; and many were “watered down” to improve their generality. By consulting multiple sources, including multiple standards, process definers will find practices that are specifically suitable and effective in the intended context even though they were omitted from CMMI because they could not be generalized to the broad CMMI audience. Those who must practice at a more demanding level—e.g., developing safety-critical software—can supplement their baseline processes by applying standards that provide more detailed or stringent treatments of selected practices, e.g., IEEE Std 1012, Software Verification and Validation.

We are now ready to return to the original question of whether a person who wants to learn physics should sit through the course or ask for a copy of the final exam on the first day. The answer to the question: “It depends!” It depends on whether the goal is to gain a broad and useful understanding of physics or to get the highest possible grade with the least possible effort. In business, both answers are reasonable ones depending on the circumstances.

Applying this analogy to the subject at hand, we can conclude (with some cynicism) that if the organization’s goal is to attain the highest possible CMMI rating with the smallest possible investment, it is probably appropriate to accept the final exam on the first day of the course—that is, to define a set of organizational processes that directly and straightforwardly address the evaluation criteria of CMMI. On the other hand, if the organization’s goal is to define a set of organizational processes that are robust, descriptive, and suitable to the scope and circumstances of the organization—that is, to gain the intended advantages of process improvement—the organization should consult multiple sources, including process definition standards, select suitable ones, and use them as a baseline for defining a cost-effective solution to the organization’s needs.


Adopting CMMI

Research has shown that the most powerful initial step to process improvement is to build organizational support through strong senior management sponsorship. To gain the sponsorship of senior management, it is often beneficial to expose them to the performance results experienced by others who have used CMMI to improve their processes [Gibson 2006].

For more information about CMMI performance results, see the SEI website at http://www.sei.cmu.edu/cmmi/research/results/.

The senior manager, once committed as the process improvement sponsor, must be actively involved in the CMMI-based process improvement effort. Activities performed by the senior management sponsor include but are not limited to the following:

• Influence the organization to adopt CMMI
• Choose the best people to manage the process improvement effort
• Monitor the process improvement effort personally
• Be a visible advocate and spokesperson for the process improvement effort
• Ensure that adequate resources are available to enable the process improvement effort to be successful

Given sufficient senior management sponsorship, the next step is establishing a strong, technically competent process group that represents relevant stakeholders to guide process improvement efforts [Ahern 2008, Dymond 2005].

For an organization with a mission to develop software-intensive systems, the process group might include those who represent different disciplines across the organization and other selected members based on the business needs driving improvement. For example, a systems administrator may focus on information technology support, whereas a marketing representative may focus on integrating customers’ needs. Both members could make powerful contributions to the process group.

Once your organization decides to adopt CMMI, planning can begin with an improvement approach such as the IDEAL (Initiating, Diagnosing, Establishing, Acting, and Learning) model [McFeeley 1996]. For more information about the IDEAL model, see the SEI website at http://www.sei.cmu.edu/library/abstracts/reports/96hb001.cfm.

Executive Responsibilities in Process Improvement

by Bill Curtis

We constantly hear that the primary reason process improvement programs fail is a lack of executive leadership. That said, those responsible for facilitating process improvement programs are hard-pressed to specify exactly what they expect from executives. Here are 12 critical actions executives should take to ensure the success of process improvement programs.

1. Take personal responsibility. Executives do not deliver systems to customers. Project managers deliver them. Executives build organizations that deliver systems to customers, and this responsibility cannot be delegated. At the core of Watts Humphrey’s Process Maturity Framework that underlies CMMI is a unique model of organizational change and development. Because the responsibility for organizational transformation rests in the executive office, CMMI is a tool for executives to use in improving the performance of their organizations. CMMI will not only transform development practices, but also the organization’s culture and the way business results are attained. Executives should not launch an improvement program until they are willing to become personally accountable for its success—or failure.
2. Set realistic goals. Executives must initiate process improvement with a clear statement of the issues driving change and the objectives to be achieved. Slogans such as “Level 2 in 2 years” do little more than reinforce the very behaviors that Level 2 was designed to eliminate. If the improvement objectives are unrealistic, the improvement program will be just one more of the organization’s chaotic death march projects. Process improvement programs must model the behaviors that executives want the organization to adopt, especially plan-driven commitments. Schedules for attaining maturity levels should result from planning, not catchy slogans or bonus cycles. Rewards and bonuses should be based on accomplishing planned behavioral or performance improvements, rather than arbitrary dates for appraisal results.
3. Establish an improvement project. Process improvement must be conducted as a project. Executives must assign responsibility for managing the project, provide funding and resources, approve improvement plans, review status reports, and measure results.


The person assigned to lead the improvement project must be a strong role model for other project managers. Executives should ask frequent questions about project plans and the assumptions underlying them. The guidebooks, defined processes, measures, checklists, and other artifacts produced through process improvement are organizational assets. They should be treated as products, albeit for internal use, and be produced with the same project discipline used in producing any other product.
4. Manage change. How many initiatives can dance on the head of a project manager? Many organizations have multiple improvement programs underway simultaneously. In the initial stages of these programs, project managers are the people most affected and are often inundated with the number of changes expected. Executives must determine the amount of change the organization can absorb, prioritize the changes to be made, and shield the organization from improvement overkill. The good news is that some improvement programs such as Six Sigma and CMMI can be synergized, since they both evolved from the concepts initiated by Shewhart and Deming.
5. Align management. Process groups have no power to enforce improvements or change management behavior. If middle managers resist improvements, only executives can force them to align with the program. Executives must build consensus among managers on the objectives and tactics for improvement and hold them accountable for achieving improvement objectives. In particular, middle managers must develop the skills of the project or program managers who report to them. Management steering committees reporting to executives are one way to ensure middle and project managers share responsibility for improvement success.
6. Align incentives. Executives must ensure that incentives are aligned with improvement goals and do not send mixed messages about the behaviors the organization values. Incentives must shift from rewarding heroes to rewarding those whose sound practices avoid the need for heroes. Promotions should go to those who are strong role models of the behaviors the improvement program is trying to instill. Incentives must send a message that management values contributions to building a strong organization just as much as it values individual virtuosity.


7. Establish policies and empower assurance. Policies that merely regurgitate goals from CMMI process areas represent a lost opportunity for executives to communicate their expectations for behavior in their organizations. After policies are established, executives need visibility into compliance. Assurance groups have influence only to the extent that executives enforce policies and address noncompliance. However, the greatest value of assurance groups, and this is subtle in PPQA, is when they serve as mentors to project managers and technical staff on practices that support compliance. Consequently, assurance groups need to be staffed with competent developers and managers so that they are credible in transferring knowledge of best practices across the organization.
8. Involve customers. Unless customers understand how they will benefit from new practices, they may perceive the changes as making projects bureaucratic and inflexible. Consequently, project managers are often trapped in a conflict between their improvement objectives and the demands of their customers. Executives own the relationship with customers and must meet with their peers to explain the improvement strategy and how it will make the development organization a more reliable business partner. Involving customers in improving requirements practices is a good initial step for engaging them in the improvement program.
9. Involve developers. The Process Maturity Framework begins the empowerment of developers by involving them in estimating and planning at Level 2 and it increases their responsibility at each subsequent level. Executives must understand and encourage this cultural transition. They must also ensure that developers are involved in improvement activities because developers have the freshest knowledge of best practices. Developers are also more realistic about how much change they can absorb in an improvement cycle. The role of the process assurance groups is to assist managers and developers in identifying and deploying best practices.
10. Review status. Executives own the organization’s commitments. They must approve external commitments and review progress in accomplishing them. Project reviews should not only focus on work status, but also on the progress being made in adopting improvements and on impending risks. Executives highlight their commitment through measurement and should add indicators of improvement progress and results to their dashboards.


Improvement measures should reflect that benefits accrue project by project, rather than in one big bang.
11. Replace laggards. The process group owns responsibility for assisting improvements with the innovators, early adopters, and early and late majority. Executives own the problem of laggards, especially if they are in management. For an improvement program to succeed, executives must be willing to remove even friends for failure to make progress. Successful organizational change programs often end up with different management teams than they started with.
12. Never relent. True leadership begins under stress. With all the pressures generated by demanding business schedules and cost cutting, executives must nevertheless stand firm in driving the improvements they know the organization must make. If they relent under pressure, the organization learns the art of excuses. The ultimate appraisal of maturity is determined by which practices the organization refuses to sacrifice under grinding pressure.

There are other responsibilities executives can assume in supporting process improvement. Nevertheless, these 12 have proven critical since they require executive authority and represent acts of leadership around which the improvement program can galvanize. Executives with little experience in process-disciplined environments are understandably concerned about risking their career on practices that have not contributed significantly in their advancement. Fortunately, there is a growing body of data and community of mature organizations to attest that faith in sensible CMMI-based improvement programs is well placed. It does not take leadership to follow the trodden path. It takes leadership to pursue the promise of new ways.

Your Process Improvement Program

Use the CMMI Product Suite to help establish your organization’s process improvement program. Using the product suite for this purpose can be a relatively informal process that involves understanding and applying CMMI best practices to your organization. Or, it can be a formal process that involves extensive training, creation of a process improvement infrastructure, appraisals, and more.

Implementing Engineering Culture for Successful Process Improvement

by Tomoo Matsubara

Like most early software professionals, I came from another professional area: mechanical engineering. While I worked at Hitachi’s machine factory, which manufactured cranes, pumps, and bulldozers, the factory was in the midst of the Kaizen (i.e., improvement) movement that reduced direct and indirect costs and improved quality. Because the factory was an engineering community, all of the improvement activities were done in an engineering way. Engineers are very practical and don’t decide anything without seeing, touching, and measuring things.

The typical problem-solving process used in this environment was to (1) identify problems at the factory floor or in front of real things and phenomena, (2) discuss presumable causes, and (3) conduct experiments and build prototypes.

For mechanical engineers, conducting experiments, measuring things, and drawing charts are a way of life. In this environment, to understand a problem and to observe the progress of improvement, they produced a lot of charts. For efficient and precise fabrication, they developed and applied jigs (i.e., tools used to create specific shapes) and then used small tools for specific fabrication. In some cases, a component with a complex shape, such as convoluted beams for a monorail, was impossible to fabricate without well-designed jigs.

The culture of the factory was very exciting. Designers and workers were friendly and interacted well together. Once a month, there were sports and cultural events such as a lunchtime (one hour) marathon, boat races, and musical concerts. Twice a year, there were safety-work related competitive games such as measuring distances, heights, and weights without using gauges, scales, or other measuring devices. Most of the games involved competition among divisions including design, fabrication, and raw materials (i.e., casting and welding). They were like the “constructive disorders” that Tom DeMarco and Tim Lister wrote about in “Peopleware.”3

3. Tom DeMarco and Timothy Lister, “Peopleware: Productive Projects and Teams,” Dorset House, 1999.


After nine years at the machine factory, I moved to a computer factory in 1965. As the general manager of the development organization, I was responsible for developing a banking system. The development organization consisted of hardware makers, software developers, and service providers. I had to anticipate all the risks involved in such a development. One of the biggest potential problems I saw was meeting the system’s high-performance requirements.

To begin meeting these requirements, I provided a scheme that consisted of a series of dynamic steps. These steps were allocated to each of the development groups to use, and measures associated with each of these steps were defined. I performed a drastic triage on the requirements to meet time constraints. In the final stage of development, I had to deal with troubleshooting problems created as the result of being saddled with components developed by different organizations. The system was successfully delivered in 1969.

While I was in charge, I had to determine the causes of problems that occurred among these intricately interacting components and prioritize action items to address them. I learned much about system engineering and other fields of engineering from this task and other real-world experiences and thus became acquainted with a large number of talented individuals having differing expertise.

After a short stint working with banking systems management, I moved again to a software organization called the “software factory” in 1966 immediately after it was formed. It was named by Hitachi’s then-CEO Mr. Komai since his philosophy was that software should be treated the same as other industrial products. When I started working at the software factory, it was a culture shock. Everything seemed to be different from the way the machine factory or the computer factory worked. I found no experiment-related approach to problem solving, and, in turn, there was minimal use of measurement, numbers, and charts. Designers boldly adopted new ideas and made decisions without using any experiments. Tools were rarely used. What most impressed me was that they didn’t boast about the products and systems that they designed and developed.

When Hitachi created its software subsidiary, Hitachi Software Engineering (HSE), in 1970, I was asked to join and plan for its operation. The term software engineering had been coined as a new engineering field at a NATO conference just two years earlier. At that time, HSE initiated its software development with about 200 programmers who had been working at the software factory.

These programmers were scattered on multiple projects in manpower contracts with Hitachi. Because nobody had project management skills and experience, “death march” projects were rampant. Top management worried about the company’s survival. While we were coping with these incessant problems, we were speculating about how to eliminate the root causes of problems. Eventually, the company improved its practices enough to deserve to call what it was doing “software engineering.”

It was obvious that we needed to train project managers and engineers to give them the power to detect, control, and resolve problems. To do this, management (including me) tried to create a culture of self-reliance, engineering attitude, and systematic thinking. The first thing we did was to better align the people. We gathered the people who were scattered onto various projects and grouped them into teams that were headed by fledgling HSE project managers. Then, I provided these project managers with three types of charts for them to use and trained them how to plot the current status of the project on the charts weekly. This approach allowed us to see the project’s status and its deviation from target curves for schedule, quality, and cost from a controllability point of view (you can see some of these charts in Tom DeMarco’s classic book,4 Controlling Software Projects, and our papers5). This idea is based on my belief that most people spontaneously act to take control once they realize the situation. We trained the project managers not only how to plot status on the charts, but also what typically is going on in the project when specific curve patterns appear. This strategy also succeeded, and within a few years we achieved profitable, punctual, and higher quality projects.

Besides these charts, I introduced a lot of mechanisms and systems to help attack problems. I introduced a tool proposal system, borrowed from the idea of applying jigs that I mentioned earlier. We also introduced war game competitions on coding within the software factory. The primary principles behind these ideas are very simple: First, we observed real processes in real projects (i.e., don’t decide anything without observation). Second, take an engineering attitude to deal with any and all phenomena.

4. Tom DeMarco, “Controlling Software Projects: Management, Measurement & Estimation,” Yourdon Inc., 1982.
5. D. Tajima and T. Matsubara, “The Computer Software Industry in Japan,” IEEE Computer, Vol. 14, no. 5 (May 1981), pp. 89–96.


Finally, think systematically and build in feedback mechanisms that lead to steady and stable improvement (i.e., all the principles that I experienced at the machine and computer factories).

About 30 years after HSE was created, its five divisions achieved CMMI maturity level 3 in 2002 and 2003, and one of them achieved maturity level 5 in 2004. It was one of the earliest such achievements in the world.

Software process is based on software engineering, which is a branch of engineering. However, when I visited some software organizations, I was often surprised that they lacked the engineering expertise, as part of software engineering, needed to deal with problems. Developing software requires a complex system in which human beings are critical elements. Consequently, I believe that we must implement appropriate mechanisms for continuous improvement.

Currently, many organizations are coping with integrated process improvement that includes the development of software, hardware, and other related work. To streamline integrated process improvement, an engineering attitude and system-level thinking are common and critical keys.

Selections that Influence Your Program

You must make three selections to apply CMMI to your organization for process improvement:

1. Select a part of the organization.
2. Select a model.
3. Select a representation.

Selecting the projects to be involved in your process improvement program is critical. If you select a group that is too large, it may be too much for the initial improvement effort. The selection should also consider organizational, product, and work homogeneity (i.e., whether the group’s members all are experts in the same discipline, whether they all work on the same product or business line, and so on).

Selecting an appropriate model is also essential to a successful process improvement program. The CMMI-DEV model focuses on activities for developing quality products and services. The CMMI-ACQ model focuses on activities for initiating and managing the acquisition of products and services. The CMMI-SVC model focuses on activities for providing quality services to the customer and end users.


When selecting a model, appropriate consideration should be given to the primary focus of the organization and projects, as well as to the processes necessary to satisfy business objectives. The lifecycle processes (e.g., conception, design, manufacture, deployment, operations, maintenance, disposal) on which an organization concentrates should also be considered when selecting an appropriate model.

Select the representation (capability or maturity levels) that fits your concept of process improvement. Regardless of which you choose, you can select nearly any process area or group of process areas to guide improvement, although dependencies among process areas should be considered when making such a selection.

As process improvement plans and activities progress, other important selections must be made, including whether to use an appraisal, which appraisal method should be used, which projects should be appraised, how training for staff should be secured, and which staff members should be trained.

CMMI Models

CMMI models describe best practices that organizations have found to be productive and useful to achieving their business objectives. Regardless of your organization, you must use professional judgment when interpreting CMMI best practices for your situation, needs, and business objectives. This use of judgment is reinforced when you see words such as “adequate,” “appropriate,” or “as needed” in a goal or practice. These words are used for activities that may not be equally relevant in all situations. Interpret these goals and practices in ways that work for your organization.

Although process areas depict the characteristics of an organization committed to process improvement, you must interpret the process areas using an in-depth knowledge of CMMI, your organization, the business environment, and the specific circumstances involved.

As you begin using a CMMI model to improve your organization’s processes, map your real world processes to CMMI process areas. This mapping enables you to initially judge and later track your organization’s level of conformance to the CMMI model you are using and to identify opportunities for improvement.

To interpret practices, it is important to consider the overall context in which these practices are used and to determine how well the practices satisfy the goals of a process area in that context.


CMMI models neither prescribe nor imply processes that are right for any organization or project. Instead, CMMI describes minimal criteria necessary to plan and implement processes selected by the organization for improvement based on business objectives.

CMMI practices purposely use nonspecific phrases such as “relevant stakeholders,” “as appropriate,” and “as necessary” to accommodate the needs of different organizations and projects. The specific needs of a project can also differ at various points in its life.
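As one possible way to keep track of the mapping from real-world processes to process areas described above, the short sketch below records each organizational process against the process areas it addresses and reports which targeted areas remain uncovered. The process names and the mapping are hypothetical illustrations, not anything CMMI prescribes.

# A hypothetical mapping from an organization's real-world processes to the
# CMMI-DEV process areas they address, used to spot coverage gaps. The
# process names and mapping below are invented for illustration only.

SUPPORT_PROCESS_AREAS = {"CAR", "CM", "DAR", "MA", "PPQA"}

process_mapping = {
    "change control board": {"CM"},
    "metrics program": {"MA"},
    "internal audit procedure": {"PPQA"},
    "trade study procedure": {"DAR"},
}

def uncovered(mapping, target_areas):
    """Return the targeted process areas not addressed by any mapped process."""
    covered = set()
    for areas in mapping.values():
        covered |= areas
    return sorted(target_areas - covered)

print(uncovered(process_mapping, SUPPORT_PROCESS_AREAS))  # ['CAR']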

Interpreting CMMI When Using Agile Approaches

CMMI practices are designed to provide value across a range of different situations and thus are stated in general terms. Because CMMI does not endorse any particular approach to development, little information that is approach-specific is provided. Therefore, those who don’t have prior experience implementing CMMI in situations similar to the one they are now in may find interpretation non-intuitive. To help those who use Agile methods to interpret CMMI practices in their environments, notes have been added to selected process areas. These notes are added, usually in the introductory notes, to the following process areas in CMMI-DEV: CM, PI, PMC, PP, PPQA, RD, REQM, RSKM, TS, and VER.

All of the notes begin with the words “In Agile environments” and are in example boxes to help you easily recognize them and remind you that these notes are examples of how to interpret practices and therefore are neither necessary nor sufficient for implementing the process area.

Multiple Agile approaches exist. The phrases “Agile environment” and “Agile method” are shorthand for any development or management approach that adheres to the Manifesto for Agile Development [Beck 2001]. Such approaches are characterized by the following:

• Direct involvement of the customer in product development
• Use of multiple development iterations to learn about and evolve the product
• Customer willingness to share in the responsibility for decisions and risk

Many development and management approaches can share one or more of these characteristics and yet not be called “Agile.” For example, some teams are arguably “Agile” even though the term Agile is not used.


Even if you are not using an Agile approach, you might still find value in these notes. Be cautious when using these notes. Your ultimate interpretation of the process area should fit the specifics of your situation, including your organization’s business, project, work group, or team objectives, while fully meeting a CMMI process area’s goals and practices. As mentioned earlier, the notes should be taken as examples and are neither necessary nor sufficient for implementing the process area. Some general background and motivation for the guidance given on Agile development approaches can be found in the SEI technical note CMMI or Agile: Why Not Embrace Both! [Glazer 2008].

Process Improvement in a Small Company

by Khaled El Emam

In this brief perspective, I will describe how a small company implemented maturity level 2 and 3 processes of a CMMI Staged model. In this case, both the organization (30 staff members) and the organizational unit developing software (a 10-member software development group and a five-member quality assurance group) were small.

Context

TrialStat Corporation develops software used in clinical trials for drug development. A number of FDA regulations apply to both the software and the processes used to develop and maintain that software. Because the software is used to collect and store sensitive personal health information, the security of the application is critical and is usually included in the scope of regulatory audits.

Implementing an Iterative Process

Because of competitive pressures, the release cycle for the application had to be short. A decision was made to adopt an Agile methodology, which promised rapid releases—a release cycle lasting three weeks or less. At the outset, the three-week release cycle created many problems and resulted in rapid burnout of the development team. The company was at risk of losing key developers, who were unwilling to work overtime and weekends to maintain the rapid release cycle.


The short cycle also required curtailing many requirements analysis activities and minimizing quality assurance of the product—both of which were unacceptable. The development team then experimented with increasing the release interval. After a few attempts, it was decided that a three-month interval was sufficient. This interval was short enough to address the rapidly changing business needs, but long enough to avoid some of the problems encountered with the shorter interval. The longer interval resulted in a development team that was not overburdened, early and sufficient requirements analysis work, and effective quality assurance. This process demonstrated that there is no inconsistency between using CMMI and implementing an Agile development process with short release cycles. In fact, it was an advantage to implement process improvements within this framework.

Introducing CMMI Practices Iteratively

Because of the regulated nature of the company’s business, a strong process focus (OPD) was necessary from the start. Standard operating procedures documenting all of the engineering and business processes were developed at the same time as project management practices were being implemented. Regular internal audits (PPQA) ensured process compliance. Because of TrialStat’s small size, there was no difference between organizational and project procedures. Training capabilities were not developed in-house, but were outsourced. However, training plans and records were maintained for all staff as part of the company’s regulatory requirements (OT). The iterative development process enabled the continuous introduction of new project practices (in the Project Management, Engineering, and Support process areas), and rapid feedback on their effectiveness. Each iteration represented an opportunity to introduce new processes, a new technology, or expertise (e.g., an individual with specialized skills). After three months, it was possible to determine whether the intervention succeeded and had the desired impact. If it did, it was kept for subsequent iterations. Those that caused problems were either adjusted or eliminated. This mode of introducing changes imposed some constraints. The interventions could not be large; the development team had to learn, master, and apply them well enough to provide feedback. Therefore, new practices had to be introduced gradually. For example, when peer reviews were introduced, they initially focused only on requirements. Only one or two interventions could be introduced in each iteration.

If there were too many interventions, the development team could not focus on delivering features. Formal CMMI-based process appraisals were not utilized. However, audits against FDA regulations and guidelines were commonly performed by third parties and clients to ensure compliance.

Measurement and Decision Making

The collection and use of metrics for decision making (MA) started from the very beginning of the project, and was subsequently expanded in a series of iterations. First, data on post-release defects were collected. This data was necessary to manage and prioritize defect correction activities and resources. After that system was in place, metrics related to the ability to meet schedule targets were collected. Scope management was the next issue. Because the delivery date of each iteration was fixed, flexibility involved controlling the scope. Features scheduled for each iteration were sized for a three-month cycle. In some cases, features were split and implemented over multiple releases. The challenge was coming up with an appropriate approach to measuring the size of the requirements early. Currently, using the number of use cases to measure the size of requirements has worked well in this environment. Size and complexity became the subsequent focus of measurement. As the system grew, it became critical to manage its size and complexity. One approach used was to refactor specific parts of the system to reduce complexity.
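As an illustration of the sizing approach described above, the following is a minimal, hypothetical sketch of how planned features might be checked against a fixed three-month iteration using use-case counts. The effort-per-use-case figure, the capacity, and the feature names are assumptions for illustration only, not TrialStat data.

```python
# Hypothetical sketch of use-case-based iteration sizing.
# All figures and names are illustrative assumptions, not TrialStat data.

HISTORICAL_DAYS_PER_USE_CASE = 4.5   # assumed average effort from past releases
ITERATION_CAPACITY_DAYS = 180        # assumed team capacity for one three-month cycle

def iteration_fits(features):
    """Given (feature_name, use_case_count) pairs, return the effort estimate
    and whether it fits within the iteration capacity."""
    total_days = sum(count * HISTORICAL_DAYS_PER_USE_CASE for _, count in features)
    return total_days, total_days <= ITERATION_CAPACITY_DAYS

planned = [("online data capture", 12), ("audit trail export", 7), ("role management", 9)]
days, fits = iteration_fits(planned)
print(f"Estimated effort: {days:.0f} days; fits in iteration: {fits}")
```

A feature set that does not fit would be split across releases, mirroring the scope-control practice described in the case study.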

Lessons Learned

There were surprisingly few fundamental changes needed to apply CMMI in a regulated environment. Many of the practices required in larger settings still applied. Because the company operates in a regulated environment, executive and investor support for process improvement existed from the outset. In a different environment, additional effort may be required to obtain such support. The iterative approach to implementing process improvements allowed the organization to incrementally and continuously improve its practices, and provided a rapid feedback mechanism for evaluating process changes. As more metrics are collected, it will become possible to quantitatively evaluate the improvements. Process documentation proved helpful for this small organization, making it easier to integrate new staff and ensure that they contributed sooner.


Without that documentation, corporate growth would have been more painful. The people typically attracted to a small organization are not necessarily process oriented. Process documentation contributed to establishing clear ground rules for new staff and enforcing a process-oriented corporate culture. Requirements development and requirements management processes ensured predictability. These processes were addressed early in the improvement effort and served as a foundation for other engineering processes. For a company in the early stages of market penetration, it is critical that the requirements are right for each iteration. Measurement began, and was needed, from the start. As the organization’s practices matured, the types of things that were measured changed, as did the types of decisions made based on that data. The experience of TrialStat has demonstrated that CMMI can work well in a small organizational setting. Process improvement in a small setting requires a gradual and incremental approach to introducing change to ensure that the organization is not overburdened with an amount of change that affects its ability to deliver its product. A pure Agile approach did not seem to work well; however, with some modification and in combination with CMMI, some Agile practices could benefit the organization.

Using CMMI Appraisals

Many organizations find value in measuring their progress by conducting an appraisal and earning a maturity level rating or a capability level achievement profile. These types of appraisals are typically conducted for one or more of the following reasons:

• To determine how well the organization’s processes compare to CMMI best practices and identify areas where improvement can be made
• To inform external customers and suppliers about how well the organization’s processes compare to CMMI best practices
• To meet the contractual requirements of one or more customers

Appraisals of organizations using a CMMI model must conform to the requirements defined in the Appraisal Requirements for CMMI (ARC) [SEI 2011b] document. Appraisals focus on identifying improvement opportunities and comparing the organization’s processes to CMMI best practices.


Appraisal teams use a CMMI model and an ARC-conformant appraisal method to guide their evaluation of the organization and their reporting of conclusions. The appraisal results are used (e.g., by a process group) to plan improvements for the organization.

Appraisal Requirements for CMMI

The Appraisal Requirements for CMMI (ARC) document describes the requirements for several types of appraisals. A full benchmarking appraisal is defined as a Class A appraisal method. Less formal methods are defined as Class B or Class C methods. The ARC document was designed to help improve consistency across appraisal methods and to help appraisal method developers, sponsors, and users understand the tradeoffs associated with various methods. Depending on the purpose of the appraisal and the nature of the circumstances, one class may be preferred over the others. Sometimes self-assessments, initial appraisals, quick-look or mini appraisals, or external appraisals are appropriate; at other times a formal benchmarking appraisal is appropriate. A particular appraisal method is declared an ARC Class A, B, or C appraisal method based on the sets of ARC requirements that the method developer addressed when designing the method. More information about the ARC is available on the SEI website at http://www.sei.cmu.edu/cmmi/tools/appraisals/.

SCAMPI Appraisal Methods

The SCAMPI A appraisal method is the generally accepted method used for conducting ARC Class A appraisals using CMMI models. The SCAMPI A Method Definition Document (MDD) defines rules for ensuring the consistency of SCAMPI A appraisal ratings [SEI 2011a]. For benchmarking against other organizations, appraisals must ensure consistent ratings. The achievement of a specific maturity level or the satisfaction of a process area must mean the same thing for different appraised organizations. The SCAMPI family of appraisals includes Class A, B, and C appraisal methods. The SCAMPI A appraisal method is the officially recognized and most rigorous method. It is the only method that can result in benchmark quality ratings. SCAMPI B and C appraisal methods provide organizations with improvement information that is less formal than the results of a SCAMPI A appraisal, but nonetheless helps the organization to identify improvement opportunities.


More information about SCAMPI methods is available on the SEI website at http://www.sei.cmu.edu/cmmi/tools/appraisals/.

Appraisal Considerations

Choices that affect a CMMI-based appraisal include the following:

• CMMI model
• Appraisal scope, including the organizational unit to be appraised, the CMMI process areas to be investigated, and the maturity level or capability levels to be appraised
• Appraisal method
• Appraisal team leader and team members
• Appraisal participants selected from the appraisal entities to be interviewed
• Appraisal outputs (e.g., ratings, instantiation-specific findings)
• Appraisal constraints (e.g., time spent on site)

The SCAMPI MDD allows the selection of predefined options for use in an appraisal. These appraisal options are designed to help organizations align CMMI with their business needs and objectives. CMMI appraisal plans and results should always include a description of the appraisal options, model scope, and organizational scope selected. This documentation confirms whether an appraisal meets the requirements for benchmarking. For organizations that wish to appraise multiple functions or groups, the integrated approach of CMMI enables some economy of scale in model and appraisal training. One appraisal method can provide separate or combined results for multiple functions. The following appraisal principles for CMMI are the same as those principles used in appraisals for other process improvement models:

• Senior management sponsorship6
• A focus on the organization’s business objectives
• Confidentiality for interviewees
• Use of a documented appraisal method
• Use of a process reference model (e.g., a CMMI model)
• A collaborative team approach
• A focus on actions for process improvement

6. Experience has shown that the most critical factor influencing successful process improvement and appraisals is senior management sponsorship.


CMMI-Related Training

Whether your organization is new to process improvement or is already familiar with process improvement models, training is a key element in the ability of organizations to adopt CMMI. An initial set of courses is provided by the SEI and its Partner Network, but your organization may wish to supplement these courses with its own instruction. This approach allows your organization to focus on areas that provide the greatest business value. The SEI and its Partner Network offer the introductory course, Introduction to CMMI for Development. The SEI also offers advanced training to those who plan to become more deeply involved in CMMI adoption or appraisal—for example, those who will guide improvement as part of a process group, those who will lead SCAMPI appraisals, and those who will teach the Introduction to CMMI for Development course. Current information about CMMI-related training is available on the SEI website at http://www.sei.cmu.edu/training/.

Improving Industrial Practice

by Hans-Jürgen Kugler

How many software products have a design that can be maintained over the lifetime of the product? How many have a design that exhibits elegance and embodies the principles of good software engineering as, for instance, laid down in Dave Parnas’ seminal 1972 paper?7 How many projects can truly trace the relationship between customer requirements and the features that they are implementing? We now have the second or even third generation of developers—depending on your viewpoint—contributing to or suffering in the software crisis. Do we ever learn? The answer cannot come only from the developer community. The recent TechRepublic Live 2010 Conference8 asked, “Buggy Software: Why do we accept it?”

7. D. L. Parnas, “On the Criteria to Be Used in Decomposing Systems into Modules,” Communications of the ACM 15, no. 12 (1972), pp. 1053–1058.
8. TechRepublic Live 2010: The Changing Face of IT conference, Louisville, Kentucky, 30 June to 2 July 2010.


The answers made a clear distinction between IT software and other products (e.g., automobiles), with IT software suffering from a continuous feature increase irrespective of the dangers of increased complexity. This situation does not hold only for the end consumer market, but also for enterprise IT. Mies van der Rohe—“less is more”—must be turning in his grave. Numbers of features are used as apparent market differentiators, since they are easily highlighted. It was even suggested that the user community of the IT movement showed the same signs of a Stockholm Syndrome9 that hostages may develop. In a market that does not value product qualities other than cost, it is difficult to develop the urge for quality initiatives. Discipline in consistently applying known good practices is often seen as interfering with the time and cost dimensions. A twisted version of Crosby’s “quality is free” statement is what seems to drive the actions: There is rarely time to do it right, but always time to do it again. However, IT systems are not the only systems that contain software. “Embedded Software Rules”10 is an article that discusses how software is everywhere: from shavers and most kitchen equipment to airplanes and automobiles. In the automotive industry, 80% to 90% of all in-vehicle innovations are software-driven. Software is the “glue” between functional groups inside and outside the vehicle, but at a cost: 50 to 100 interconnected special-purpose computers cooperate in a vehicle, running a software load measuring on the order of 10⁷ lines of code and many thousands of (interdependent) calibration parameters. “The dynaxity is actually increasing exponentially,” according to Prof. Heinecke, CEO of BMW Car IT,11 where dynaxity is the compound effect of dynamics of change (in market and technology) and complexity of the system under development. The automotive industry is a competitive market, and a market governed by product warranty legislation. Many of the functions of a vehicle are safety relevant, and disciplined and traceable engineering is required. If the innovative capability of a whole market is affected, individual improvement efforts by companies, however large they may be, are not sufficient. The whole value-creation network needs to be addressed.

9. Stockholm Syndrome, Wikipedia, July 2010.
10. S. Ferber, H.-J. Kugler, Robert Bosch GmbH, and KUGLER MAAG CIE GmbH, “Embedded Software Rules,” White Paper, 2006.
11. H. Heinecke, “Innovation Development and Community Sources. The Survival Strategy,” International SPICE Days, June 2010, Stuttgart.
12. T. Wagner, Robert Bosch GmbH, “Bringing Software on the Road,” SEPG 2004, Orlando, Florida, March 2004.

Many automotive companies have their own strategic improvement initiatives, such as Bosch,12 with significant positive effect.13 However, industry-wide initiatives are also needed. The automotive industry association (Verband deutscher Automobilindustrie, VDA)14 issued guidelines, and a software initiative (HIS) was formed by the OEMs.15 The automotive OEMs focus on the engineering quality of the development process at supplier organisations. Development projects at suppliers are appraised with respect to process maturity, and the results allow the OEM in question to manage risks better, to develop joint improvement actions, and to increase the urgency and importance of process improvement in the supplier organisation. The model chosen by the HIS is based on “Automotive SPICE,”16 an adaptation of ISO 15504, which is—whilst not equivalent—at least compatible with the continuous representation of CMMI.17 CMMI remains the approach of choice for many organisational improvement programmes. Results show that the overall strategy “improve a whole industry” does actually work: whilst software in the early 2000s caused many a driver to resort to walking, these incidents have decreased while process maturity increased18 and while dynaxity climbed significantly. Dr. Weißbrich, Volkswagen, argued that the active and passive safety features of vehicles show a statistically relevant correlation with the significantly reduced road deaths in Germany.19 These safety features are realised in software, of course.

13. H.-J. Kugler et al., “Critical Success Factors for Software Process Improvement at Bosch GS,” Electronic Systems for Vehicles, 11th International Congress, VDI, Baden-Baden, 2003.
14. Verband deutscher Automobilindustrie (German Association of the Automotive Industry), VDA Volume 13, Development of Software-Driven Systems – Requirements for Processes and Products, 2004.
15. HIS – Herstellerinitiative Software (OEM Software Initiative by Audi, BMW, Daimler, Porsche and Volkswagen), www.automotive-his.de.
16. Automotive SPICE, www.automotive-his.de.
17. K. Hörmann, “Comparison of CMMI and ISO 15504,” KUGLER MAAG CIE White Paper.
18. L. Dittmann, Nutzen von SPICE aus der Sicht eines OEMs, Volkswagen AG, IAA Workshop, Frankfurt, September 2005.
19. Weißbrich, “Spicy Cars—the Value of Automotive SPICE to the German Automotive Industry,” International SPICE Days, Stuttgart, June 2010.


In the past, the automotive industry has focussed on its traditional V-shaped iterative development processes, with projects and organisational line functions organised in a matrix fashion. Interdependence of projects and the fact that project acquisition, launching, and tear down (“close out”) are very dynamic mean that stable teams are more a wish than a reality. Many companies hoped that maturity levels 3 and beyond would provide stability to the organisation, and therefore provide a respite from constant change. Dynaxity plays havoc with this. The ability to change, to flexibly adapt to new context conditions, and still run disciplined and traceable processes leading to high-quality products is the market differentiator today. Model-based development is the mainstay of automotive development methods,20 but Agile development and Agile project management, Lean, and Six Sigma techniques are increasingly used within the context of process improvement programmes, and used with good results, as expected.21 No doubt the automotive industry will continue improving industrial practice. It needs to, in the light of what Lero22 calls evolving critical systems.23 The market mechanisms enforce that. Open tool platforms adapted to these needs are beginning to take shape24 and will strengthen consistent application of good practices. Can the same principles also be applied to different markets or industry domains—for example, medical devices or even ultraportable netbooks? The medical devices market may stand a chance to follow suit, since product warranty legislation applies here too, and since deployment is regulated—and the process plays an important part in the deployment decision. Importance and urgency must be felt in a market; only then will its value-creation network exhibit a willingness to change. The pressure of community-based software development on traditional software development may help,25 provided the rift that exists between many proponents of CMMI and many of those involved in community-based software can be reconciled. In my opinion, this is just a question of time, because the evidence of compatibility exists. Maybe this is even a moot argument. Experience with the “software car” firmly positions software to be the innovation driver.

20. AUTOSAR, www.autosar.org.
21. B. Boehm and R. Turner, “Balancing Agility and Discipline,” Addison-Wesley, 7th printing, October 2009.
22. Lero, “Research Area: Evolving Critical Systems,” The Irish Software Engineering Research Centre, http://www.lero.ie/research/researchareas/evolvingcriticalsystems.html.
23. L. Coyle et al., “Evolving Critical Systems,” IEEE Computer, May 2010.
24. B. Lange, “Eclipse auf Rädern,” iX 01/2010.
25. H.-J. Kugler, “Open Source Teams entwickeln schneller,” Automobil-Elektronik, Dec. 2005.

No other “material” has such potential and such “weird” (non-Newtonian) properties. Quantum mechanics is the only thing that comes close.26 Maybe Tom DeMarco was right when he wrote about the end of software engineering:

Consistency and predictability are still desirable, but they haven’t ever been the most important things. . . . We’ve tortured ourselves over our inability to finish a software project on time and on budget. But . . . this never should have been the supreme goal. The more important goal is transformation, creating software that changes the world or that transforms a company or how it does business. We’ve been rather successful at transformation, often while operating outside our control envelope.27

A very valid point, I find. In this case, the two sides have nothing to argue. Organisational capability, good processes, and individual competence would still be required to learn and improve in achieving this transformation. Heisenberg’s uncertainty principle may also prevent us from objectively controlling software and its development, because there is no objectivity. But this state does not prevent us from building ecosystems on software platforms that flourish and change the world. Either way, the future of process improvement, and of the CMMI, looks bright. There is a lot to be done. Let us begin.

26. H.-J. Kugler, “What Is Software? An Excursion into Software Physics” Workshop, KUGLER MAAG CIE GmbH, July 2008.
27. T. DeMarco, “Software Engineering – An Idea Whose Time Has Come and Gone?” Viewpoints, IEEE Software, June/July 2009.



CHAPTER 6

ESSAYS AND CASE STUDIES

For the third edition of this book, we asked contributors to write short case studies and essays instead of one large case study. The case studies are an excellent mix of experiences involving a variety of organizations and contexts.

Case Studies

The following three case studies each offer a unique perspective on using CMMI for Development. The first focuses on how to use the model to achieve maturity level 2 and describes the lessons learned when a process improvement project must change course to achieve its objectives. The second case study follows the experience of an organization that has used CMMI over a long period of time and the competitive advantages it enjoys. The third and final case study describes how combining the use of CMMI for Development and Lean Six Sigma can accelerate and improve process improvement.

Switching Tracks: Finding the Right Way to Get to Maturity Level 2

by Heather Oppenheimer and Steve Baldassano

When your customer contract requires that your “software development process must be CMMI Level 2,” there’s a right way and a wrong way to go about meeting that demand. KNORR Brake tried both ways and has the scars and the rewards to show for it. This is the story of how a very small organization first headed down the wrong track, then changed direction—resulting in a successful maturity level 2 appraisal less than nine months later.



KNORR Brake Corporation supplies pneumatic and hydraulic braking systems, air conditioning equipment (HVAC), and door operators, as well as repair, overhaul, and maintenance services, for all types of mass transit rail vehicles, including Metro, Light Rail, High-Speed Trains, Commuter Rail, and Monorail. Key characteristics of KNORR Brake products include high reliability, safety, integrated operations and communications, and specialized functionality not available from other vendors. Of about 250 employees, fewer than 30 are involved in systems and software engineering; the others are employed in the manufacturing facility. Each software project involves one or two highly experienced electrical engineers who develop customized embedded software that is integrated with the hardware at the systems level. KNORR Brake has maintained ISO-9001-2000 registration since 1994 and must also adhere to IRIS (International Railway Industry Standard) and follow the quality management standards of its parent company, KNORR-Bremse.

The Wrong Route

With such a strong quality management system in place, when a customer required “CMMI Level 2,” we thought it would be easy, only requiring the addition of a few new artifacts. No one expected any significant gaps, but if gaps were found, everyone assumed there would be an opportunity to fix minor issues and still achieve the level. Unfortunately, this misunderstanding resulted in a SCAMPI A appraisal that was an expensive way to discover that the organization was rated “mud-sucking, bottom-dwelling Level 1.” It also cost the goodwill and enthusiasm of many of the people involved. Based on our assumption that little or no preparation would be needed, we scheduled our SCAMPI A appraisal almost immediately. Several managers and engineers went to the SEI-authorized Introduction to CMMI course so they could learn what small additions CMMI required them to make to the current quality management system, as well as to meet the requirements for participating as appraisal team members. However, the course examples and model informative material were not very relevant to the world of embedded software, and we were still confused about what the model required. Based on the course information and advice from our lead appraiser, we put in place some new processes and procedures, even though none of them provided any value to the business. People were willing to use these new “CMMI processes,” but considered them to be useless overhead. Assuming that all lead appraisers are equivalent, KNORR Brake hired the same lead appraiser who was working with several other subcontractors of the customer.

Unfortunately, although he was very experienced and knowledgeable, the lead appraiser tried to assess our implementation of CMMI practices as if KNORR Brake were a large software application development organization—not a group of very small teams developing embedded software. Although the customer contract specified that “software development must be CMMI Level 2,” we assumed that the entire organization would need to comply with the CMMI standard—as if it were ISO or IRIS. As a result, the entire organization, including systems engineering and manufacturing as well as software engineering, was included in the scope of the appraisal. The large scope increased the duration and cost of the appraisal and made it more difficult to find the right evidence of implementation for each practice. At the same time, the areas the customer really cared about were not appraised. Even though we weren’t trying to improve any processes beyond adding some documents, the staff spent hundreds of unplanned hours just preparing for the appraisal, doing things like collecting artifacts. Based on our understanding of the SCAMPI A requirements for “objective evidence,” the team wrongly assumed that each practice required at least two unique pieces of evidence; they copied and stored hundreds of documents in Practice Folders within Process Area Directories. Most of the artifacts collected were useless to the appraisal team, who ended up looking for all of the evidence again after being unable to make sense of the information so laboriously provided. The appraisal itself was an agonizing experience. People were nervous about preparation, and then once they were in an interview, they felt like they were on trial. Questions were asked in CMMI-speak rather than in the terminology used in KNORR Brake processes, so the answer was often, “We don’t do that,” when a particular CMMI practice was something they really did implement. Because no one in KNORR Brake really understood the CMMI model, the assumption was that there was nothing open to question in the lead appraiser’s expectation of how practices should be implemented. No one could translate the KNORR Brake processes for embedded software development into CMMI terminology that the lead appraiser would understand. As a result, the organization was rated maturity level 1, the recommendations for addressing gaps weren’t useful or implementable, and the whole organization felt that CMMI was a waste of effort.

Switching to the Right Way

Because the contract still required CMMI Level 2 for Software Development, senior management decided to learn more about CMMI and try again. It wasn’t easy because we had to overcome some very negative attitudes toward CMMI and SCAMPI in particular, as well as consultants, process improvement, and appraisals in general. However, this time, we treated the effort as a project, not as an activity to be done in the margins. As long as we had to do it, we decided that we might as well do something useful. We looked at the contract requirement for CMMI Level 2 as an opportunity for understanding KNORR Brake’s written and unwritten processes and then improving them as needed to add business value. Senior management demonstrated their commitment by allocating an experienced project manager to manage and facilitate the process improvement effort full-time and granting him the authority and resources necessary to make it happen. This person not only had quality assurance experience, project management experience, and some CMMI training, he was highly respected and trusted by everyone based on his technical experience and skill in delivering a product. All of the software engineering staff had time allocated to the process improvement program and were involved and engaged in clarifying, understanding, defining, and improving the program lifecycle, operational procedures, work instructions, and the flow and dependencies among them. As an added benefit, the engineers ended up learning a lot from each other, as well as defining processes that worked. The first step was to look carefully at the contract requirement for CMMI Level 2 to determine exactly where to focus. At the same time, the CMMI Project Manager began to look for a certified lead appraiser to serve as a consultant; someone who would understand the needs and context of a very small organization that does embedded software and who could coach us in interpreting the CMMI model for implementation in that context. The initial phase of the process improvement project included working with the selected consultant to review the contract language to ensure that we really understood what was required, and then to confirm that understanding with the customer. This narrowed the scope of the effort to the software engineering group processes, rather than trying to change the whole enterprise. This approach made the process improvement project more manageable and increased the likelihood of success. On any journey, whether it’s an actual journey by rail or a metaphorical journey of process improvement, it’s important to start by understanding where you are and where you want to go.

We worked with the new consultant on an in-depth current state analysis to understand the overall KNORR Brake software engineering process architecture, map “what software engineers do and how they do it” to the CMMI model practices to identify possible gaps, and then decide what to do to address those gaps. CMMI was used as a guide, but all implementation decisions were based on the KNORR Brake process context and culture. A highly interactive version of the SCAMPI B methodology that didn’t require the organization to prepare evidence ahead of time was used for the current state analysis to reduce adverse impact on current product development activities without impacting the quality and usefulness of the appraisal results, to increase the learning opportunities during the appraisal, and to be able to tell the customer about process improvement progress in terms of CMMI and their contract requirements. Over the next several months, the CMMI Project Manager met twice a week for two-hour sessions (lunch provided!) with members of the software engineering group to understand, clarify, and enhance the processes and procedures related to developing embedded software. The CMMI Project Manager reported progress regularly at management status meetings and also met weekly by phone for coaching with the consultant to go over progress, issues, and achievements. The consultant returned twice for a few days each time to provide customized training and lead hands-on workshop sessions with the team to cover some specific areas of need related to process architecture definition, estimation, risk management, and measurement and analysis. Although everyone was fully engaged with defining processes and procedures, the CMMI Project Manager was responsible for actually documenting them and for ensuring that the KNORR Brake document manager formatted them appropriately and submitted them to the corporate repository correctly. Our initial focus was to define and document the work flow in the KNORR V-Model (a view of the project lifecycle). When the effort started, program management had monitored only customer document deliverables and three other software engineering tasks: design, code, and test. No one other than software engineers could understand why the software engineering effort and duration estimates were so high, when there appeared to be so little work to do. After we finished, all of the key activities, dependencies, and deliverables were captured in the V-Model; as a result, program management and senior management now had a much better understanding of the software engineering work.


CMMI practices in the Requirements Management, Project Planning, Project Monitoring and Control, and Measurement and Analysis process areas provided guidance for our implementation of the KNORR V-Model. The KNORR V-Model provided the foundation for an improved and more useful work breakdown structure (WBS), which in turn was used as a starting point for identifying and estimating the factors that impact how much effort each task or deliverable would require. Although the CMMI model informative material and formal training provided many examples of these factors, such as number of functions, source lines of code, and numbers of pages, it turned out that the examples given were not the primary factors affecting effort and schedule estimates in KNORR embedded development. Engineers knew what factors impacted their effort, but had never captured or shared this tacit knowledge. For example, the amount of work required for customer-deliverable documents is affected by the number of circuit boards and the number of revisions a particular customer is likely to require, rather than the number of pages in a document. With some initial coaching from the consultant, the team was able to determine what factors did make a difference, and then use Wideband Delphi techniques to estimate typical effort ranges for tasks, depending on the values of those factors and historical effort data for similar activities. The typical effort ranges were used in the updated project plans and as input to new bids and projects. As a result, KNORR Brake was able to more accurately estimate the cost of new software engineering work, and provide customers with a more accurate prediction of delivery dates. CMMI practices in the Project Planning, Project Monitoring and Control, and Measurement and Analysis process areas helped us implement the new WBS. After the KNORR V-Model was established, the software project attributes identified, and the WBS completed, the Program Management Schedule was updated to reflect the actual activities, so it was much easier to predict how long a project would take and monitor progress during the project. An additional benefit was that the rest of the organization gained a much better appreciation of the value the software engineering team was providing to the business. Understanding the real project attributes and WBS made a huge improvement in managing our software engineering resources across projects and provided valuable input when bidding new projects.
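The estimation approach described above can be illustrated with a small sketch. The following is a minimal, hypothetical example of a Wideband Delphi-style round in which several estimators converge on an effort range for a factor-driven task; the task description, hour values, and team size are assumptions for illustration only, not KNORR Brake data.

```python
# Hypothetical sketch of Wideband Delphi-style effort estimation.
# Estimators submit anonymous estimates, discuss the driving factors
# (e.g., number of circuit boards, expected customer revisions),
# then re-estimate; the narrowed range feeds the project plan.
# All figures are illustrative assumptions, not KNORR Brake data.

from statistics import mean

def summarize_round(estimates_hours):
    """Summarize one round of estimates (in hours) as a planning range."""
    low, high = min(estimates_hours), max(estimates_hours)
    return {"low": low, "high": high, "mean": round(mean(estimates_hours), 1), "spread": high - low}

# Round 1: independent estimates for a customer-deliverable document (3 boards, 2 expected revisions)
round1 = summarize_round([40, 72, 55, 90])
# Round 2: after discussing the factors, the estimators converge
round2 = summarize_round([60, 70, 64, 68])

print("Round 1:", round1)
print("Round 2:", round2)  # the narrower range becomes the typical effort range for planning
```

The converged range from the final round plays the role of the "typical effort range" mentioned in the text, which is then reused for project plans and new bids.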


Some KNORR Brake customers required a Software Requirements Traceability Matrix (SRTM) as a deliverable, but the SRTM was often created after all the work was completed and used to document what existed, rather than as a means to manage the work. However, one software engineer had discovered that it was a useful tool, and shared that experience with the other team members during the weekly meetings. The KNORR V-Model was then used to update the structure of the SRTM so that it helps ensure requirements coverage and assessment of the impact of changes throughout a project. KNORR Brake already had a process for capturing requirements changes that were charged to the customer. However, if a change seemed small or the program manager decided to give it to the customer without charge, it wasn’t recorded, and the impact on project schedules, effort, and cost was not assessed. We enhanced and institutionalized the Project Change Request (PCR) process with a new PCR Procedure that requires systems and software engineers to evaluate all changes to determine impact on the schedule, cost, and/or scope and identify any risks related to the change. We quickly saw the value in the new procedure because several requests that would have been assumed to be small turned into revenue-producing changes, whereas others did not generate a customer charge, but still required replanning of the project resources because they affected the effort and schedule. In conjunction with the KNORR V-Model and the new WBS, CMMI practices in the Requirements Management, Project Planning, Project Monitoring and Control, and Configuration Management process areas helped guide the changes made in the SRTM and the new PCR change request template and procedure. All of the process and procedure changes interacted synergistically, each supporting and enhancing the others. All either clarified or enhanced the way work was already being done; none added useless overhead. Because the changes were a natural extension of the current processes, it was easy to institutionalize them within the KNORR Quality Management System. After the process changes were established and as they were becoming institutionalized, the organization began planning for the SCAMPI A appraisal. Working with the consultant, the CMMI Project Manager defined a set of questions to use when interviewing potential lead appraisers to determine whether the candidate would be capable of understanding how CMMI practices can be implemented in a very small organization that creates embedded software.


The lead appraiser needed to be able to adapt to the KNORR Brake organizational and process culture, rather than expecting KNORR Brake to do things in a particular way. Because KNORR Brake wanted to have the appraisal before the end of the calendar year, a time when many other organizations are also trying to schedule appraisals, the selection process began in the late summer to increase the likelihood that the lead appraiser chosen would be available when needed. Each new process advancement and tool improvement was merged into our existing Quality Management System by writing new work instructions and/or operational procedures and storing them in an online repository we call REX (Rail Excellence). This allows both existing and new employees to easily find the appropriate procedures and templates within the context of the process maps that clearly define the development and design lifecycles. Preparing evidence for the SCAMPI A appraisal turned out to be simple. Because the CMMI Project Manager knew which KNORR Brake processes implemented which CMMI practices, it was just a matter of pointing to the actual work products once they were created. Most of the artifacts selected for evidence had sections that were appropriate for multiple CMMI process areas and practices. The consultant reviewed the evidence with the program manager to make sure the work products selected were appropriate and complete, and that there were clear explanations of the context so that the appraisal team would understand why those particular artifacts were selected. It didn’t matter which projects were eventually chosen by the lead appraiser to represent the organization, because the work product links pointed to directory structures that included all projects. The appraisal was interactive and cooperative rather than confrontational. Work outputs were projected on a screen and the lead appraiser asked participants to explain their work for the entire appraisal team to review. If questions or concerns were raised, they were answered immediately rather than waiting for a separate validation session. Because the appraisal focused on KNORR Brake processes and evaluated CMMI model implementation within that context, rather than focusing on CMMI itself, no one felt threatened and the information was complete and accurate. The appraisal was finished in less than four days, and KNORR Brake software engineering achieved their goal of being rated maturity level 2. Although achieving that rating was certainly important for contractual reasons, perhaps more significantly, staff and management learned that CMMI practices could add value without adding overhead. Members of other functional groups and product lines were asking, “Why do only the software engineers get to do CMMI? We want to do it too!”

Moving Forward (The Journey Continues . . . )

Based on the successful SCAMPI A appraisal with the KNORR Brake Software Development Group and the noticeable value added by the new processes and procedures, the company decided to continue the process improvement effort. The CMMI Project Manager was promoted to a newly formed position as Manager, Business Process Improvement, with a mandate from senior staff to make improvements throughout KNORR business and engineering processes. Currently, KNORR has four process improvement teams in action. One team is extending the Brakes software engineering process improvement work into the domain of HVAC software engineering. The team has already completed tailoring the KNORR V-Model lifecycle and WBS for developing embedded code in the HVAC domain. Project schedules have been updated accordingly with milestones based on a more realistic view of the work required. The HVAC team improved on the Brakes software engineering process with the implementation of a new template for identifying the key project attributes that allows for multiple variations. With this new tool, the HVAC Software Engineer can more accurately characterize the project type, which is a key factor in estimation of effort and schedule. We are also extending the process improvement work into systems engineering in both the Brakes and HVAC domains. This team is reviewing the systems engineering lifecycle and mapping CMMI practices to the systems engineering process. We are looking at the maturity level 2 process areas, as well as key maturity level 3 process areas where the organization is already strong or where there are issues that need to be addressed in the near term. Due to the successful process improvement effort completed by the Brakes software engineering group, and because the team approach is working extremely well with the newly formed cross-domain software and systems engineering teams, senior management supported formation of an HVAC New Product Design Team that will use CMMI ML2 and ML3 practices in designing new HVAC equipment. After the systems and software engineering design lifecycles were well understood, we realized that the proposal process needed to be integrated into the overall product lifecycle. A proposal team has completed mapping the entire proposal process and clarifying how it feeds into the systems and software design lifecycle when proposals are awarded. The effort and documentation within the proposal process have proved extremely valuable by providing the framework for future bids as well as helping us predict the resource and time allocation needed for bid preparation itself.


In addition, when the contract is awarded, the product and customer details from the bid process are directly transferable into estimates of project size, tasks, and deliverables, thereby saving planning time and effort. At the end of the year, we will conduct a SCAMPI A that includes systems engineering and software engineering processes for both Brakes and HVAC. We will still use some assistance from an outside consultant to validate our assumptions and decisions, but now that we are able to understand how to apply the CMMI model within the KNORR context, we need much less direct guidance. Based on our experience with process improvement in Brakes software engineering, we predict a high likelihood of success, which will put KNORR at a competitive advantage in our market domain.

Lessons Learned

As a result of travelling first down the wrong track and then down the right one, we have learned some key lessons, which we will apply as we continue on our journey of process improvement:

1. Question assumptions and trust your instincts. Don’t assume that you have to do something because CMMI or some expert “requires” it. If it doesn’t seem to make any business sense, it probably isn’t a good implementation for your current organizational context.
2. When trying to meet contract requirements, use the contract language to help specify organization and model scope for a formal appraisal. Understanding the contract requirements focuses the improvement effort and helps achieve immediate value.
3. Especially if you have had a previous negative experience with process improvement, demonstrate value early and often. When staff and managers see that a change makes their own work easier or more effective, they are more likely to be willing to embrace further changes.
4. Process improvement using CMMI is not something that can be done in the margins. It requires dedicated effort by a respected and trusted project manager as well as involvement and engagement from everyone at every level in the organization.
5. The CMMI model provides value only if it is used as a guide for process improvement, not as a checkbox exercise. Process improvement is not a matter of documenting what “should” be done and forcing everyone to comply, whether or not it provides any business value. Changing the paradigm from “getting by for certification” to “clarifying and improving our processes and using the CMMI model to help” validates what is already being done, makes any changes “stick,” and ensures that those changes add business value.


6. All CMMI goals and practices are useful and do not add unnecessary overhead—as long as you interpret them and implement them within your own business and technical context. Within your organizational context, each process improvement will often implement several CMMI practices, and synergy among the improvements increases the benefit of each.
7. A consultant who understands the organizational context can provide guidance in areas where the organization doesn’t have expertise and can help the program stay on track. However, process changes need to come from inside the organization for the changes to be institutionalized. Be sure to choose a consultant who will coach rather than prescribe.
8. Choosing a lead appraiser who understands the business context ensures that the appraisal will be valid. If the lead appraiser doesn’t really understand the business context and there is no one to help translate between CMMI-ese and the organization’s processes and terminology, there’s a high likelihood that the appraisal results will be invalid due to miscommunication and you will not achieve your goal.
9. When process improvement activities bring clear business value to one part of the organization, other parts of the organization will ask to get on board too. Success breeds success.

Since switching to the right track for our process improvement journey using CMMI, we have had an enjoyable trip, arrived on schedule, and are looking forward to travelling to the next destination!

Using CMMI as a Basis for Competitive Advantage at ABB

by Aldo Dagnino

ABB is a multinational organization and market leader in power and automation technologies that enable utilities and industries to improve performance while reducing environmental impact. The vast majority of ABB’s products consist of complex systems that have both hardware and software components (e.g., robot systems, SCADA systems, relays, control systems, substation automation systems). Some products are purely software (e.g., control systems, substation automation, SCADA systems), whereas others are purely hardware (e.g., power transformers, circuit breakers).


The primary customers of ABB are commercial organizations that are important players in sectors such as utilities, petrochemicals, pharmaceuticals, automotive, manufacturing, chemical, and oil and gas. Although ABB’s customers do not require their suppliers to demonstrate CMMI maturity levels, the company has been utilizing CMMI for almost ten years. CMMI is used for continuous process improvement of the corporation’s product development processes. ABB recognized that CMMI provides a good framework for process improvement because it contains industry best practices for the development of complex systems. Following CMMI’s principles has helped and continues to help ABB product development business units (BUs) to be leaders in their respective market segments. Process improvement has been an important element in the history of ABB and can be viewed as a two-stage process. During the first stage, continuous process improvement was promoted by a corporate process organization. The second stage of process improvement was spawned as a result of the positive results observed in stage one and because members of the corporate process improvement organization from stage one moved to several development BUs and became the members primarily responsible for the BUs’ process improvement teams.

Adopting CMMI

The official journey for ABB in continuous process improvement of product development began in 1999 when the Software CMM (SW-CMM)-based initiative was launched and BUs that developed software products began voluntary participation in the initiative. The name given to this initiative was the ABB Software Process Initiative, or ASPI. There were a few pilot process improvement projects that “signed up” as participants in the initiative. ASPI was a global and centralized program, and a Corporate Software Engineering Process Group (CSEPG) was created to move the initiative forward. The CSEPG worked in a geographically dispersed fashion with teams in countries where the major software development groups of the corporation were located. The primary objectives of the CSEPG were to develop a group of continuous improvement experts with deep knowledge of the CMM model and to develop the basic methods and tools to be employed at each development BU for improvement activities.

Soon after the initial pilot projects began, it became clear that, because ABB’s products have a combination of hardware and software components, SW-CMM did not cover practices that fully embraced both types of components. For this reason, adopting CMMI as the framework for continuous process improvement was a natural option. Hence, the CSEPG became a Corporate Engineering Process Group (CEPG). The strategy for the CEPG was to develop deep expertise in CMMI and to promote and increase awareness of the value of CMMI among product development units. The objective was for members of the CEPG to work closely with the product development BUs as trainers, mentors, and coaches to the BUs’ process improvement teams. In this way, the first stage of the continuous process improvement initiative at ABB began. As the first pilot projects were completed, the number of product development BUs in the corporation interested in CMMI increased by an order of magnitude. Additionally, the experience of the CEPG team working with the initial pilots resulted in a need to develop key robust methods and tools that enabled working more systematically with a larger number of product development BUs. Because few development BUs needed to demonstrate a maturity level, the approach taken was instead to demonstrate the economic benefits from making changes to the development processes using CMMI.

Standard Tools and Methods
Standardized internal methods and tools thus needed to be developed to ensure consistency when engaging multiple product development units. Furthermore, the use of a continuous improvement framework was identified as an important asset. Because development units did not need to demonstrate a maturity level, the continuous process improvement activity became cyclic. Cycles were typically one year long and corresponded to the budgeting cycles of the organization. The IDEAL model was selected as the continuous process improvement cycle framework.

The tools and methods developed by the global CEPG team members, and then employed during their engagements with product development units in their geographic areas of responsibility, included primarily the following:

• A tool to assess the readiness of an organization to implement a continuous process improvement initiative
• An internal appraisal methodology, including training and tools
• A methodology to translate the appraisal results into a prioritized process improvement work breakdown structure
• Methods and templates for developing a process improvement plan to manage the improvement activities in the cycle
• A method for measuring the economic benefits associated with the changes made as a result of the process improvement activity
• A data repository that summarized a set of improvement measures for each product development BU engaged
• A repository of lessons learned, good practices, and general knowledge gained in the interactions with product development units

The remainder of this case study briefly describes these tools and methods and explains their usefulness in implementing CMMI.

An organization that decides to implement a CMMI continuous process improvement program needs to establish the "right" infrastructure, dedicate human and financial resources, have support from senior management, have a competent change agent, train the organization's members in CMMI, and in general be "ready" for the initiative. A readiness assessment tool and method were developed and are used to quantitatively and qualitatively measure the readiness of an organization to conduct a continuous process improvement program using CMMI.

The readiness assessment is conducted at the beginning of each process improvement cycle, and reevaluation is conducted within the cycle as needed. The results of readiness assessments have been used to ensure that senior management, change agents, and the organization are fully committed to the CMMI initiative. When deviations occur, the results of a reassessment show potential areas that need to be addressed to move the CMMI initiative forward.

The use of an internally developed CMMI appraisal methodology and tools has been essential to evaluate and diagnose ABB development BUs against CMMI best practices. This internal appraisal methodology is based on Version 1.1 of the Appraisal Requirements for CMMI (ARC 1.1) and for many years has been used to appraise product development BUs. The ABB internal appraisal methodology uses both Class B and Class C internal CMMI appraisals. Class B internal CMMI appraisals have been employed primarily when a development organization begins a continuous process improvement cycle and after the improvements made have been institutionalized in the BU. Class C internal CMMI appraisals are used in between Class B appraisals to evaluate one or two CMMI process areas (PAs).

All members of ABB's internal appraisal teams must take the SEI-taught Introduction to CMMI-DEV course as well as the ABB appraisal training course. To lead a CMMI internal appraisal, the appraisal leader must have participated in at least three internal Class B appraisals and must have taken the SEI-taught Intermediate Concepts of CMMI course. It is also an important and valued requirement that at least one member of the product development BU participate in the appraisal, because ultimately the development BU will be responsible for moving the process improvement project forward.

The results of an internal CMMI appraisal consist of a final presentation that defines the strengths and weaknesses of the organization with respect to CMMI PAs. The final presentation is consolidated by the appraisal team and is delivered by the internal appraisal leader to the participating organization at the end of the appraisal. A second deliverable generated after completing the internal appraisal is a detailed observations document in which observations made by the appraisal team are detailed at the practice level for each PA included in the appraisal activity.

After the appraisal is completed, the observations document is used to generate a work breakdown structure that serves as a basis for the generation of the improvement plan. At this stage, the particular product development BU's annual/biannual objectives play a central role in the selection and prioritization of PAs that will be included in the improvement cycle. Those PAs that directly contribute to the achievement of the development BU's annual/biannual objectives are selected. For example, a development BU may have as part of its business objectives to reduce the cost of poor quality (COPQ) by 15%. At the same time, the results of the appraisal may show that the BU's process, when compared to the Requirements Development (RD) PA, has many weaknesses that seem to contribute to poor documentation and capture of requirements and hence to a high degree of rework, which raises the COPQ. Under these circumstances, a WBS (work breakdown structure) that addresses what improvement is needed relative to the RD PA is then generated using the observations document. The generation of the process improvement WBS is a difficult task for the BU's process improvement team to carry out, and thus the CEPG coach usually works closely with the improvement team to fully develop the WBS.

After the WBS for process improvement has been completed, the process improvement plan (PIP) is developed. The development of the PIP closely follows the practices that CMMI-DEV recommends in the PP (Project Planning) and PMC (Project Monitoring and Control) process areas, and it serves the purpose of properly managing the process improvement project throughout the cycle.
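Although the case study does not publish ABB's internal tooling, the PA selection step described above (keeping only the observations whose process areas bear on the BU's business objectives, and turning them into WBS items) can be sketched in a few lines of Python. The observations, findings, objective, and PA tags below are hypothetical illustrations, not ABB data.

# Hypothetical sketch: filtering appraisal observations against a BU's
# business objectives to seed an improvement work breakdown structure (WBS).
# The findings and the objective are invented for illustration.

appraisal_observations = [
    {"pa": "RD", "practice": "SP 1.1", "finding": "Customer needs captured informally"},
    {"pa": "RD", "practice": "SP 3.3", "finding": "Requirements changes not analyzed for rework impact"},
    {"pa": "CM", "practice": "SP 1.3", "finding": "No baselines for design documents"},
]

business_objectives = [
    # Each objective is tagged with the PAs judged to influence it.
    {"text": "Reduce cost of poor quality (COPQ) by 15%", "related_pas": {"RD", "VER"}},
]

def build_improvement_wbs(observations, objectives):
    """Keep only observations whose PA supports at least one business objective."""
    selected_pas = set().union(*(obj["related_pas"] for obj in objectives))
    return [
        f"Address {o['pa']} {o['practice']}: {o['finding']}"
        for o in observations
        if o["pa"] in selected_pas
    ]

for item in build_improvement_wbs(appraisal_observations, business_objectives):
    print(item)

In practice, of course, the CEPG coach and the BU's improvement team refine each such item into scoped, planned tasks in the PIP rather than relying on a mechanical filter.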


The CEPG coach works closely with the process improvement project leader in the development BU to define the PIP, but the development BU ultimately "owns" the process improvement plan.

An important aspect of the improvement cycle is defining how to measure the impact of CMMI process improvement activities. As explained previously, each development BU defines annual/biannual business objectives, and a subset of those objectives is expected to be affected by process improvement activities. The CEPG developed a methodology that measures key performance indicators (KPIs) before the improvements have taken place and after the improvements have been institutionalized. An important aspect of this methodology is consistency in how the KPIs are measured and at what points in time. This methodology also has a "rewards and recognition" component that provides incentives to the organization to sustain continuous process improvement. An interesting result in terms of the benefit/cost ratio of several process improvement projects is that ratios in the range of 3:1 to 5:1 have typically been observed. These results mean that for every dollar spent on a process improvement project, the return is typically three to five dollars in cost savings, new market opportunities created, or revenues generated by the development BU.

A database that provided KPIs on the process improvement activities of each product development BU was created and maintained by CEPG members. Typical KPIs included the name of the development BU, how many improvement cycles it had conducted, the types of appraisals conducted, and benefit measures. This database was updated as improvement activities were conducted at each participating development BU.

A CMMI knowledge base was also created; it is organized so that it provides a graphic perspective of improvement against CMMI PAs. The objective of this knowledge base is to store and make available to the whole corporation good practices implemented by the development BUs engaged in a continuous process improvement program, including artifacts such as templates, lessons learned, answers to frequently asked questions, and any other type of information that can be useful to development BUs.
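As a back-of-the-envelope illustration of the before/after KPI comparison and the 3:1 to 5:1 benefit/cost ratios mentioned above, here is a minimal Python sketch. All figures and field names are invented, not ABB measurements.

# Hypothetical sketch: reducing before/after KPI measurements from one
# improvement cycle to a benefit/cost ratio. All numbers are invented.

kpi_before = {"annual_rework_cost_usd": 400_000, "field_defects": 120}
kpi_after  = {"annual_rework_cost_usd": 250_000, "field_defects": 70}

improvement_cost_usd = 50_000  # cost of running the improvement project itself

savings = kpi_before["annual_rework_cost_usd"] - kpi_after["annual_rework_cost_usd"]
benefit_cost_ratio = savings / improvement_cost_usd

print(f"Benefit/cost ratio: {benefit_cost_ratio:.0f}:1")  # prints "Benefit/cost ratio: 3:1"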

The First Stage
During the first stage of our continuous process improvement initiative, the previously mentioned tools, methods, and infrastructure were created and employed by the CEPG and participating development BUs. The first stage of the initiative was characterized by the following important elements:

1. A clear business case had to be defined and aligned with the full set of objectives of the development BU carrying out the improvement project.
2. The CEPG was then in a better position to "sell" continuous process improvement to the development BUs.
3. The commitment of a development BU to continuous process improvement depended heavily on the local senior management's background and level of knowledge of software engineering concepts.
4. There was a need to continuously monitor the level of commitment of each development BU to its improvement initiative.
5. There was little need for synchronization of improvement activities, solutions, and tool usage among development units carrying out the improvement initiative.

The Second Stage
Later, several strategic changes were made to the infrastructure used to support continuous process improvement at ABB: the majority of CEPG members were assigned to development BUs and were given responsibility for each BU's improvement initiative. Concurrently, there was also a realignment in the perspective of senior management to make the continuous process improvement initiative more global. The target of global process improvement is to significantly extend the reach of corporate product development practices and to use common methods and tools where possible.

Today
The global initiative is now firmly grounded in all ABB divisions. The divisions directly own and sustain their improvements. CMMI continues to be the primary framework used for continuous process improvement, but the focus is on ensuring attention to lower level practices and on standard methods and tools that can be used and customized by all product development units. Internal CMMI appraisals continue to be conducted on a periodic basis.

In summary, ABB has been engaged in a 10-year journey to continually improve its product development processes. The outcomes have continued to be good results, economic benefits, improved morale, and the recognition that CMMI has been a valuable tool all along the way. Adjustments in the journey were required due to changes brought about by increased globalization of markets and the need to increase the uniformity of process solutions, but the benefits continue.

Leveraging CMMI and LSS to Accelerate the Process Improvement Journey

by Kileen Harrison and Anne Prem

Although Capability Maturity Model Integration (CMMI) and Lean Six Sigma (LSS) are often viewed as competing initiatives, contending for the same process improvement funding and priorities, CMMI and LSS are actually complementary. Implementing these methodologies in tandem can result in a synergistic effect that accelerates and enhances the benefits of process improvement.

The Synergistic Effect
One way to consider how CMMI and LSS relate is to associate CMMI with providing the "what" and LSS with supplying the "how." CMMI identifies "what" activities are expected, but does not specify techniques for how to accomplish those activities. LSS, in contrast, identifies "how" activities can be optimized, including specifying tools and techniques. For example, the CMMI model can be leveraged to conduct a gap analysis of an organization's business practices to determine what opportunities for improvement exist. Then, LSS can be applied to provide the organization with tools and techniques for achieving the necessary improvements. Because of their complementary nature, leveraging the two models together can lay a solid foundation for performance-driven improvement that accelerates an overall improvement journey.

Applying the two initiatives together encourages a balance between a focus on process comprehensiveness through CMMI and a focus on business efficiency through LSS. For example, in a typical implementation of CMMI at the managed and defined levels, the focus is on process definition that includes all of the necessary elements. Improvement and greater efficiency are then achieved through process maturity, which is generally associated with advancing through the levels in the model over time. In contrast, LSS is focused on efficiency and greater effectiveness from the onset of the improvement journey. To realize the benefits of this complementary nature, the Space and Naval Warfare Systems Command (SPAWAR) Headquarters Organizational Process Management Office (OPMO) created an approach that utilizes both initiatives, targeting discrete business unit challenges within the organization.

Background Context
SPAWAR is one of the Department of the Navy's (DoN) major acquisition commands. The SPAWAR Command specifically procures higher-end United States Navy (USN) information technology products and services for the USN fleet and other armed forces. Headquartered in San Diego, Team SPAWAR includes SPAWAR Headquarters, two SPAWAR Systems Centers (SSCs), three Program Executive Offices (PEOs), one Joint Program Executive Office (JPEO), and one liaison office located in Washington, D.C. In total, the organization consists of more than 12,000 employees and contractors deployed globally and near the fleet.

Since 2005, SPAWAR Headquarters' OPMO, supported by Booz Allen Hamilton, has been integral in establishing a successful LSS program across Team SPAWAR. Although the two SPAWAR Systems Centers, focused on engineering, have been engaged with both LSS and CMMI for years and have successfully achieved CMMI ratings through formal appraisals, SPAWAR Headquarters' OPMO, a service-focused business unit, began incorporating CMMI in 2008. At that time, OPMO identified a need to integrate CMMI as a process improvement methodology to complement the existing LSS program and the other process improvement tools and techniques it manages and provides throughout Team SPAWAR.

The Challenge
Instilling a culture of process improvement and making positive progress are challenging in any organization, and SPAWAR is no exception. The ability to gain and maintain process improvement buy-in from stakeholders within the organization presents an immediate hurdle. The government's focus on doing more with less while continuing to meet mission objectives provides an additional hurdle. These challenges, coupled with the desire to see tangible results, led to the formation of an approach that leverages both LSS and CMMI.

The Approach
The following paragraphs detail OPMO's approach, including the four major steps the OPMO CMMI assessment team utilizes:


1. Identify Mission Critical Risks
2. Select Relevant CMMI Process Areas
3. Conduct a CMMI Gap Analysis
4. Apply LSS Tools

Step 1: Identify Mission Critical Risks
The OPMO CMMI assessment team initiates the assessment of a business unit by leveraging information contained within a SPAWAR tool referred to as the Policy, Procedures, Processes, Practices & Instructions (P4I) workbook system. Each business unit within SPAWAR is required to update the P4I workbook at least annually. This information is used to identify the business unit's mission and its associated policies and procedures, as well as any risk that may critically impact the mission, thereby creating a snapshot baseline of the business unit's as-is state. The assessment team leverages the risk information identified in the P4I workbook to identify the business unit's possible weaknesses or risk areas. These critical risks are mapped to an associated functional area (e.g., mail processing, records management) to identify a discrete focal area for process improvement activities.

Step 2: Select Relevant CMMI Process Areas
After the focal area is selected, the assessment team conducts interviews with the functional subject matter experts and stakeholders to gain a better understanding of the as-is process. The team also reviews supporting process documentation as well as any other relevant assets or tools. Based on this research, the assessment team selects the CMMI process areas that are most relevant to the focal area. Due to the cross-functional nature of Team SPAWAR's business, the team considers all three of the CMMI constellations (CMMI for Development, CMMI for Acquisition, and CMMI for Services) when selecting the relevant process areas. The team selects the process areas that have the strongest tie to the specific mission-critical risks identified in the previous step.

SPAWAR Headquarters does not currently require attainment of a particular CMMI maturity or capability level. Therefore, the approach focuses on identifying and utilizing key actions or "best practices" from the CMMI models that the business unit can implement to mitigate its mission-critical risks and improve operating efficiencies. The improvement actions that result from the best practice assessment assist in building or enhancing the business unit's process framework, which could be leveraged to obtain a CMMI maturity or capability level, if desired.

Step 3: Conduct a CMMI Gap Analysis
Now that the assessment team has reviewed the as-is state and selected the process areas, a detailed gap analysis is conducted. The team leverages an evidence matrix to systematically compare the as-is processes of the business unit's targeted functional area against the selected CMMI process areas. As noted previously, because SPAWAR Headquarters does not have a requirement to meet a maturity or capability level, the assessment team focuses on selected process areas to address mission-critical risks, rather than using a full CMMI model and appraisal methodology.

Based on the gap analysis, the assessment team documents the findings in a detailed CMMI Assessment Report. The report identifies the business unit's opportunities for improvement, the team's recommendations, and associated benefits. Before delivering the final report, the assessment team validates the findings and recommendations with key stakeholders and subject matter experts from the business unit. These validation activities have proven effective in ensuring accuracy as well as promoting stakeholder buy-in.
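The evidence matrix used in the gap analysis can be pictured as a simple table of selected practices versus the evidence found in the as-is process, with empty cells becoming findings. The Python sketch below illustrates the idea under assumed data; it is not OPMO's actual tool, and the evidence entries are invented.

# Hypothetical sketch: an evidence matrix comparing a functional area's
# as-is process against selected CMMI practices and flagging gaps.
# The evidence entries are invented for illustration.

evidence_matrix = [
    # (process area, practice, evidence found in the as-is process, or None)
    ("CM", "SP 1.1 Identify Configuration Items", "Partial list of official records"),
    ("CM", "SP 1.3 Create or Release Baselines",  None),
    ("CM", "SP 3.2 Perform Configuration Audits", None),
]

def find_gaps(matrix):
    """Practices with no supporting evidence become candidate findings."""
    return [(pa, practice) for pa, practice, evidence in matrix if evidence is None]

for pa, practice in find_gaps(evidence_matrix):
    print(f"Gap: {pa} {practice} - no evidence in the as-is process")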

Step 4: Apply LSS Tools
Because the CMMI Assessment Reports are detailed and typically contain five to six primary process areas that translate into numerous recommendations for improvement, the business unit typically requires assistance in identifying a "starting point" for addressing the improvement activities. The assessment team helps the business unit prioritize these activities by using LSS tools, such as the Analytical Hierarchy Process (AHP) and a scoring matrix. Applying these tools provides a quantitative basis for a task (in this case, prioritizing activities) that can be qualitative in nature.

The AHP is a mathematical decision-making technique that leverages weighted criteria. Specifically, the business unit identifies four to six criteria that are relevant to its business objectives, such as risk or level of effort. These criteria are compared head-to-head to determine a weight that represents the level of importance of each criterion. Finally, these weights are input into a scoring matrix to objectively prioritize opportunities for improvement. Leveraging the AHP and scoring matrix provides the opportunity to involve the business unit in prioritization and ultimately promotes stakeholder buy-in. This objective methodology also helps build confidence that the most business-critical improvement areas are addressed first.

After a prioritized list of improvements is defined, the top priorities are reviewed to consider the best course of action. A Plan of Action and Milestones is developed, which may include the kickoff of an LSS RIE (Rapid Improvement Event), an LSS DMAIC (Define, Measure, Analyze, Improve, Control), or simply a "Just Do It." The as-is process documentation as well as the findings identified from the CMMI gap analysis activities are required inputs to the follow-on LSS events.
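To make the prioritization mechanics in Step 4 concrete, here is a simplified Python sketch of AHP-style weighting followed by a weighted scoring matrix. The criteria, pairwise judgments, candidate recommendations, and 1-5 scores are all invented for illustration, and the weight calculation uses the common column-normalization approximation rather than a full eigenvector solution.

# Hypothetical sketch: derive criterion weights from pairwise comparisons
# (simplified AHP), then rank improvement recommendations with a weighted
# scoring matrix. All judgments and scores are invented for illustration.

criteria = ["risk reduction", "ease of implementation", "mission impact"]

# pairwise[i][j] = how much more important criterion i is than criterion j
# (1 = equally important, 3 = moderately more, and so on).
pairwise = [
    [1,   3,   1/2],
    [1/3, 1,   1/4],
    [2,   4,   1],
]

# Approximate AHP weights: normalize each column, then average across each row.
n = len(criteria)
col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
weights = [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# Scoring matrix: each recommendation scored 1-5 against each criterion
# (higher is better on every criterion, so "ease" rewards low effort).
recommendations = {
    "Consolidate records into one configuration item list": [5, 3, 4],
    "Streamline official document routing":                 [3, 4, 5],
    "Define a standard CM process for existing tools":      [4, 2, 3],
}

def total_score(scores):
    return sum(w * s for w, s in zip(weights, scores))

for name, scores in sorted(recommendations.items(), key=lambda kv: total_score(kv[1]), reverse=True):
    print(f"{total_score(scores):.2f}  {name}")

The sorted output is the prioritized list; involving the business unit in setting the pairwise judgments is what gives the ranking its credibility.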

Tying It All Together
The assessment team has worked with several different business units. The team and business unit stakeholders have experienced valuable results from the implementation of the CMMI and LSS approach. One specific business unit that the assessment team worked with is SPAWAR's Administrative Services & Support. Its business objectives are to support Team SPAWAR through the establishment of policy, guidance, and oversight for administrative functions including official mail, records management, directives, and correspondence administration.

During the initial analysis activities, the team identified two functional areas that were associated with the business unit's mission-critical risks: 1) Official Mail and 2) Records Management. Because Administrative Services & Support's focus is providing these services to SPAWAR, the assessment team selected the CMMI-SVC constellation. In the CMMI-SVC constellation, one of the process areas deemed relevant to the Records Management functional area was Configuration Management. After completing the gap analysis, the assessment team identified various opportunities for improvement and recommendations regarding Administrative Services & Support's Configuration Management activities. A sampling of the recommendations includes the following:

• Create and maintain one all-inclusive configuration item list of all records managed, along with key characteristics (e.g., owner, signature authority, required reviewers, expiration date, update frequency).
• Clearly define the purpose and role of each records management tool and document a process for tool use.
• Identify the authoritative source for managing all records and conduct periodic audits to ensure record integrity is maintained.


After the final CMMI Assessment Report for Records Management was delivered, the business unit and assessment team prioritized the opportunities for improvement using the AHP and scoring matrix in accordance with Step 4 of the approach. Based on this prioritized list, the Administrative Services & Support business unit created an LSS Rapid Improvement Plan that included the following activities:

• Streamline the official document review process.
• Streamline the official document routing process throughout the Command.
• Define a standard configuration management process and associated business rules for use of existing tools to reduce version redundancy and to identify an authoritative source for official documents.

Similarly for Official Mail, in response to the highest priority recommendations for improvement, Step 4 identified the need for an LSS RIE to improve mail handling processes. The LSS RIE resolved duplicative responsibility for mail sorting between an in-house resource and a service provider. This resolution eliminated 90% of the in-house resource's workload, which allowed the individual's time to be realigned to focus on reducing backlog in other work areas. In addition to this specific example of savings realized within the Administrative Services & Support business unit, a few further examples of benefits experienced by other business units include the following:

• Recommendations from a CMMI Assessment Report resulted in the creation of process documentation that will assist with upcoming staff transition activities, including a staff member's retirement.
• Content from the detailed CMMI Assessment Report and the prioritized list of improvement activities were leveraged as critical inputs to a business unit's annual strategic planning session.
• Recommendations from a CMMI Assessment Report are planned to be leveraged to improve quality of contractor deliverables and associated communications.

Lessons Learned and Critical Success Factors
The assessment team has conducted multiple assessments and gained valuable lessons-learned feedback as well as knowledge of critical success factors. Two of the most critical challenges that the team faced were the following:

1. Business unit acceptance of ownership for improvement actions
2. Resource availability issues due to major initiatives competing for resources (e.g., Navy Enterprise Resource Planning (ERP) implementation)

The assessment team deployed several techniques to overcome the challenge of business unit ownership. After the assessment was complete, the business unit was responsible for chartering and initiating LSS improvement events. However, these activities sometimes required gentle and persistent encouragement from the assessment team to maintain forward progress. In response, the team found that conducting an offsite session was an effective way to build momentum for the improvement activities by allowing individuals to have focused time away from their day-to-day responsibilities.

Additionally, the team found that the use of a RASCI (Responsible, Accountable, Support, Consulted, Informed) chart helped the business unit understand the alignment of resources to roles and provided clear ownership of each of the process improvement activities. To mitigate the impact of resource availability issues, the assessment team discovered that communication with senior leadership and managing a realistic schedule were the most critical factors contributing to success. The assessment team kept senior leadership informed throughout the assessment activities; their support and direction were integral to the success of the project.

Clear communication of the CMMI assessment goals was another critical success factor the team recognized. Due to resource constraints and competing priorities, SPAWAR Headquarters determined upfront that the approach would focus on leveraging CMMI best practices and not on achieving a particular capability or maturity level. By communicating this message early, the assessment team was able to allay fears and address resistance associated with concern over the level of effort required to reach a particular CMMI maturity or capability level. The assessment team emphasized CMMI "best practices" from the onset of the project to appropriately communicate the scope and focus of the project to the participants.

In conclusion, the sum of CMMI plus LSS does not merely equal the combined benefits of two process improvement approaches; instead, it equals a synergistic result that improves process efficiency and effectiveness faster. A significant factor in achieving maximum benefits of this approach is having participants trained in the appropriate elements of both models and their associated nuances. Getting the right resources involved and implementing an approach that leverages both CMMI and LSS can accelerate the realization of process improvement benefits.

Essays
The following essays cover a variety of experiences. The first several essays provide advice for both new and experienced users of CMMI for Development. The fourth essay provides practical advice for hiring a consultant or lead appraiser. The fifth and final essay is one to share with those who are not "sold" on CMMI, written by a believer who was once skeptical of CMMI's value.

Getting Started in Process Improvement: Tips and Pitfalls

by Judah Mogilensky

So your organization has decided to embark on a process improvement journey. First of all, congratulations on your decision! Perhaps your journey has been triggered by some project disaster you never want to repeat, or perhaps you have realized that taking the next step up in project size and complexity will lead to success only if your processes are strong enough. In any case, you are probably wondering how to get started.

Here's a bad way to start, one that I call "having a process lobotomy." You forget any aspect of process you were doing up to now, and you say, "We have to build everything new, from scratch." I've never seen this work. A much better approach is to recognize that any level of project success you have enjoyed up to now has been based in part on whatever processes, however loose and informal, you have in place. One good strategy is to have a few key people from each current project simply write down, in any form, what they are doing today. Gather these descriptions, noting similarities and differences. Then, convene a meeting of key project people to review the descriptions and identify "best practices" and "lessons learned." (Best practices work; we always want projects to do these. Lessons learned are bad things; we never want projects to do these.) The results of this meeting will be your first documented process description. Your journey will focus on refining this description, adding to it as gaps are identified, and getting people to follow it consistently.

"Well," you say, "we've got very little process in place. There's almost nothing in the CMMI-DEV model that we are doing consistently. Where should we start?" My suggestion is that if you really have almost nothing in place, start with Configuration Management. Every process you undertake, including the writing of your process descriptions, will produce documents, drawings, or some other artifacts. You need to be able to keep track of these, know where they are, know what changes have been made to them, and know what the current versions are. Configuration Management helps you manage your artifacts.

Another good activity to start with is peer reviews. You will be producing descriptions of your processes so that people can follow them consistently and so that you can deliberately and visibly improve them. It's really helpful to have a peer review process documented at a very early stage so that you can use it to peer review the rest of your process descriptions. And later, you will find that performing peer reviews on the products of your projects will be really useful too.

What is the most important factor contributing to success in a process improvement journey? Anyone with experience will give you the same answer: the strength of your sponsorship. The sponsor is the person in the organization with the power and authority to direct that process improvement actions be taken and to provide the resources (funding and people) needed to carry out those actions. However, the strength of sponsorship is not defined just by providing a budget and saying, "Go do it." Rather, sponsors, whether they realize it or not, are constantly sending three types of messages to their organizations: expressed (what they say), demonstrated (what they do), and reinforced (what they reward).

Weak sponsors are those who say they want process improvement—even loudly and publicly—but whose own actions and the behaviors that they reward (e.g., badly estimated projects that get into deep trouble but then are saved at the end by late-night heroic efforts) show that they really do not care very much about process improvement. The organization's people will always, in case of conflict, believe the demonstrated and reinforced messages before the expressed messages. Therefore, one key role of the process group (i.e., the people charged with developing and deploying processes) is to provide feedback to the sponsor about the messages that are being sent and whether they are consistent. This role usually requires some up-front agreement, because upward feedback is not common in most organizations.

What is the next most important success factor? It is an atmosphere that encourages and fosters open and honest feedback from the process users. The only way to tell if processes are working well and helping people is to get comments and feedback from those people. To keep the flow of feedback going, it is vital that the feedback received be acknowledged, valued, and acted on, especially the feedback that says "this is not working for us." Attempts to dismiss or ignore complaints will not cause them to go away, but will only drive the complaints underground where they will be less visible and more damaging.

What are some of the pitfalls to watch out for? There are two that seem to come up frequently: one is loss of sponsor interest, and the other is something that I call "pathological box-checking." The senior executives who generally serve as sponsors tend to live in a very busy, pressure-filled world with short time horizons. Process improvement is a long-term journey with rewards that develop slowly. It is very helpful to set up some regular opportunity, no less often than quarterly, to meet with your sponsor to review progress, celebrate victories, identify obstacles, and provide the upward feedback mentioned earlier. It is also very helpful to build and regularly update your business case, showing the benefits (e.g., cost savings, time savings, better milestone performance, improved delivered quality) of the process improvement effort. Remember, the worse things are when you start, the easier it is to show improvement.

The other pitfall comes about when the objective of achieving some maturity level rating in an appraisal is seen as the primary organizational goal, and anything that interferes with this goal, or puts it at risk, is seen as an obstacle to overcome. This "pathological box-checking" is different from true process improvement in that true process improvement puts the primary emphasis on measurable project performance, measurable business performance, customer satisfaction, employee satisfaction, and continuous improvement. Pathological box-checking, on the other hand, views maturity level ratings as the ultimate objectives and seeks to achieve them even if the result is negative impacts on project performance, business performance, customer and employee satisfaction, and continuous improvement. Some of the indicators that an organization has gone over to this "dark side" are the following:


• Maturity level targets and associated dates are widely publicized and discussed, but there is little or no corresponding publicity and discussion devoted to project or business performance targets and their associated dates.
• A lot of time and energy is devoted to determining the "minimum acceptable" process implementation that will result in a rating of "satisfied," regardless of the process actually needed by projects or the organization. In severe cases, organizational advocates (perhaps even lawyers) will loudly demand that the lead appraiser point out "where in the model does it explicitly say that I have to do something."
• Because the focus is on achieving maturity level targets as quickly as possible, there is "no time" for broad participation, review, or piloting of new processes. Instead, the processes are developed by a small group, or even imported from outside, and imposed on groups and projects with little regard for suitability or applicability. Objections to these processes are viewed as "resistance to be overcome." Feedback is discouraged; again, there is "no time" to update or revise processes in response to user experiences.
• After a formal appraisal, or perhaps even an informal appraisal, has been conducted and a particular process is rated as "satisfied," changes to that process are effectively (though not explicitly) prohibited. ("If we change it, the lead appraiser might not rate it the same anymore.") In this way, the stated model objective of continuous process improvement is essentially abandoned.
• A final symptom, and perhaps the most serious one, is that effort and energy are devoted to building a "false façade" of documents to present as evidence to a lead appraiser, documents that have little or nothing to do with the actual work being done. The sad thing is that, had an effort like this been put into genuine process improvement instead, the organization would see actual business benefits that "box-checking" will never achieve.

So this brings us to the final question, namely, "How will I know when I have been successful?" Formal appraisals leading to official maturity level ratings (or capability level profiles) can be useful, especially if there is a need to show process progress to higher levels of management or to customers. Even an informal appraisal can provide very useful feedback, and can help make sure that the improvement effort has not missed something important.

However, the real measure of process improvement success is the impact on the business and the impact on its workers. In terms of business impact, look for bottom-line measured improvements in cost and schedule predictability, in project efficiency, and in delivered quality. In terms of impact on workers, look for less pressure, less overtime, more opportunity for people to have real lives outside of work. (One organization expressed it as "reduced late-night pizza orders.") For me, the beauty of process improvement has always been that there is no trade-off between project performance and developer quality of life. The only way to be truly successful is to measurably improve both at the same time.

Avoiding Typical Process Improvement Pitfalls

by Pat O’Toole

When it comes to "process," we commonly encounter three types of people: process zealots, process agnostics, and process atheists. The zealots see "process" as the path to engineering enlightenment—and typically, the only path. The agnostics typically don't know, and certainly don't care, whether "process" is a good or a bad thing—just as long as it doesn't get in their way. The process atheists, on the other hand, perceive that the "p word" borders on vulgarity, believing that process hampers creativity, stifles productivity, and ultimately rots the brain—but other than that, process is generally okay for others to follow.

One would expect that SCAMPI Lead Appraisers—or as some might call us, "Lead Oppressors"—fall squarely in the process zealot camp. However, after having seen too many organizations inadvertently harm themselves through the misimplementation of CMMI-based processes, I find myself leaning toward agnostic. In this article, I will provide some insight into what I perceive to be the two most common process improvement pitfalls, as well as steps your organization can take to avoid them.

The Alignment Principle: Defining the Word "Improvement"
In my opinion, the greatest single point of failure for most improvement programs is the organization's inability or unwillingness to define the word "improvement" within its own unique business context. In too many organizations, improvement is defined as "achieving Maturity Level X." (The fact that the organization capitalizes "Maturity Level" may indicate that it is paying it undue homage.)


Unfortunately, this misperception elevates the implementation of CMMI-suggested practices to the level of business strategy—a role it was never intended for and is ill prepared to fulfill. A better approach is to have senior management establish a comprehensive business strategy and then employ CMMI as one of many tactics that enable the organization to achieve it.

A critical component of this preferred approach is the Alignment Principle. Wouldn't it be wonderful if the business strategy, project objectives, and improvement initiatives were all aligned? By articulating an organizational Alignment Principle, senior management helps to synchronize these three elements. An organization can use CMMI to help it become the high-quality producer of goods and services in its selected market segment. Alternatively, CMMI might be used to help the organization become the low-cost producer, or the quickest to market, or the most technically innovative, or the most environmentally friendly, or . . . Although the model can be used to help an organization achieve any of these strategic positions, the processes employed for each are likely to be vastly different. For example, high-quality producers might have checks and balances all along the development life cycle to detect and eliminate defects as close as possible to the point where they were introduced, whereas the quick-to-market leaders would typically have the thinnest layer of process that keeps the project from spiraling out of control.

Therefore, knowing the strategic position that management is trying to achieve for the business is a critical prerequisite to establishing and maintaining processes that will help the organization achieve it. Left to its own devices, an Engineering Process Group (EPG) will build "CMMI-compliant processes." However, given proper direction and motivation from senior management, those same processes will be honed to meet the organization's strategic business objectives. After all, as long as the EPG is going to the trouble of establishing a process infrastructure to achieve maturity level X, wouldn't it be nice if it actually helped the organization accomplish its strategic mission while doing so? (For some, this question borders on insightful rather than rhetorical.)

It seems fairly obvious that if your firm manufactures pacemakers, "quality" is the attribute to be maximized. When your primary metric is the "plop factor," you are usually willing to overrun a bit on schedule and cost to reduce the number of field-reported defects—especially those reported by the relatives of your former customers.


But you’d be surprised by how few executives can articulate a clear strategy and how little time they invest trying to do so. My favorite consulting engagement of all time stemmed from my first presentation at a major industry conference. After the talk, I was col- lared by an executive from a 300-person company who asked if I would be willing to facilitate a session in which his company would establish their Alignment Principle. He suggested a three-day offsite meeting in which his boss, the CEO, and the top nine executives in the organization would participate. After I agreed to help him out, he brought up one little problem— the CEO doesn’t listen to consultants! Unfazed, I informed him that I have a way to overcome such an issue—I simply triple my consulting rate. I’ve determined that if you charge enough money even the biggest of bigwigs has to listen! To start the session, I spent 15 minutes explaining the basic con- cepts of the Alignment Principle. I then handed each of the 10 atten- dees a blank index card and asked them to write a single word on the card—”Faster,” “Better,” “Cheaper,” or some other word that best describes the most important attribute for the organization’s products and services. I collected the cards and was dismayed to find 3 cards read ptg “Faster,” 3 “Better,” 3 “Cheaper,” and the last card—the one from the CEO (OK, I kept hers separate) —read “Innovative.” After sharing the results with the group, and finding myself somewhat uncertain as to how to proceed, I suggested that we refill our coffee cups. During the break, the CEO asked if we could speak privately. When we were alone she informed me that my services were no longer required. She said that the exercise pointed out that her Lead- ership Team obviously did not understand the direction in which she was trying to take the company. Because she had them offsite for three full days, she would invest the time in rectifying this situation. Rather than having a session facilitated by “one of you consultants,” she felt that it was necessary for her to demonstrate leadership by taking ownership of the meeting’s successful resolution. As she bid me farewell she concluded with “. . . and please submit your invoice for the full three days; I will make sure that you are paid promptly.” Nine days of pay for about a half hour of work—it don’t get no better than that! But I digress . . . Suppose your senior management, like that of the pace-maker company, has just informed you that quality as defined by customer- reported defects is the single most important competitive dimension in the minds of your customers. It’s time to craft the Alignment Principle:

Wow! eBook 144 PART ONE ABOUT CMMI FOR DEVELOPMENT

“Achieve an annual, sustainable X% reduction in field reported defects without degrading current levels of cost, schedule, and feature variance.” Now the projects know what is most important to senior manage- ment, and the EPG knows what it means to help the projects “improve.” It’s not that cost, schedule, and functionality aren’t important; it’s just that they are not deemed as critical as customer- perceived quality. And now the EPG can build a process that helps move the organization in that strategic direction.

Now the projects know what is most important to senior management, and the EPG knows what it means to help the projects "improve." It's not that cost, schedule, and functionality aren't important; it's just that they are not deemed as critical as customer-perceived quality. And now the EPG can build a process that helps move the organization in that strategic direction.

Avoiding Reverse Exploitation: Focusing on Performance
After you've addressed Pitfall #1—lack of alignment—you'll find that you've simply promoted Pitfall #2—focusing on "implementing the model." The mantra of process agnostics and atheists is "process = administrivia," and in some misguided organizations, they're right! One might gain insight into the real problem in such organizations by distinguishing among three related concepts: process, process documentation, and process model.

The process is the way that work is actually being performed on today's projects. It is nothing more or less than the day-to-day activities carried out by the developers, testers, project managers, and so on as they do their jobs. To say that you are "process averse" is to contend that all project activity should stop and, unless you're like Wally from Dilbert, this is not a position that most people would advocate.

The process documentation is the infrastructure being provided to the projects to help them perform their work in a consistent manner. It's the stuff sitting on the proverbial shelf just waiting to be employed by the projects—process descriptions, procedures, guidelines, templates, forms, metrics, checklists, tailoring guidelines, good case examples, and so on. It should provide the projects with a robust set of materials that enables them to conduct their work in the most effective and efficient manner (well, at least in theory!).

Finally, the process model, in the current case the CMMI model, is a collection of good practices that can be exploited by the organization to heighten the probability of achieving success—where success is defined by the Alignment Principle and project-specific customer objectives. Please note the direction of this exploitation—the organization should exploit the model and not, as too often happens, the other way around.


How does one avoid reverse exploitation? Let me make three suggestions.

• Recognize that the current projects are enjoying some level of success. Some of them have actually completed and shipped products, right? Some of their customers are reasonably pleased with the work that the projects performed on their behalf, aren't they? Responding "yes" to either question implies that the existing processes (i.e., existing project activities) are working fairly well. The organization should not abandon everything it is currently doing to implement some goofy model just because it is trying to achieve some arbitrary "Maturity Level" (note the caps). Rather, it should determine how best to integrate some of the model's more value-added practices into its existing processes and process documentation to enhance project performance. For some organizations, a "find-the-pain, fix-the-pain" approach may be best to kick-start the improvement effort. This approach uses the model like a medical reference guide for addressing chronic project pain. Applying a bit of process aspirin makes a lot more sense than hitting people over the head with a 500+ page book like this and telling them, "Trust me. This is really good for you!"
• To avoid religious warfare among the process zealots, agnostics, and atheists, abandon the term "process improvement" in favor of the term "performance improvement." It's hard for even the most radical process atheist to argue that improving project performance is a bad thing.
• The EPG (retitled from Engineering Process Group to Engineering Performance Group) should change its focus to align with the new name. Rather than obsessing on higher maturity or capability levels—appropriately spelled with lowercase letters—the EPG should focus on helping the projects achieve ever-higher levels of success. Focusing on project performance may not be the quickest path to maturity level X, but it is a sure-fire way to overcome project resistance and actually build momentum and, dare I say, excitement for the improvement initiative.

Summary
Now that you've addressed pitfalls #1 and #2, you'll find that you've promoted pitfalls #3, #4, #5, . . . Truth be told, there is an unlimited number of pitfalls to avoid or overcome as you traverse the ongoing path of performance improvement (most of which, unfortunately, are of your own making).


Someone could write a rather voluminous book describing ways that you could improve project performance. Hold on a second—they already did, and you're holding it! Although reading and understanding the model itself is a good start, you also need to recognize that writing process descriptions, procedures, templates, and so on is the easy part; doing so in a way that provides value, and getting people to change their behavior as they adopt these new process elements, is the really tough part. Just because you build it does not mean they will come!

You are strongly encouraged to learn and steal shamelessly from those who have gone before, and are even more strongly encouraged to contribute your experience to the growing body of knowledge for those who are yet to come. Enjoy the journey; the rewards are worth it—provided you don't mess it up!

Ten Missing Links to CMMI Success

by Hillel Glazer

Starting out with CMMI can be particularly daunting to individuals and organizations who don't know what they don't know about CMMI. Often, they set out not knowing they're missing a few bits of information that could make their journeys less arduous and possibly enjoyable. Absent these bits, people find themselves struggling to incorporate CMMI into their work. Between bouts of bashing their heads against the wall, they often find themselves blaming CMMI itself (or some personification of CMMI such as their consultant or lead appraiser), believing there's something clearly "wrong" with the model. Hint: It's not the model.

Whether it's good or bad luck, or some other characteristic of nature, I have had the privilege of working with many companies starting out with CMMI for the first time. These companies (large or small, traditional or Agile) don't know what to expect, and they don't really understand what using CMMI is all about. If you are just starting out with CMMI, consider yourselves lucky: you don't have as many bad habits or as much history to undo. Those experienced with CMMI might find themselves reaching for the Excedrin—realizing, for the first time, the "missing links" behind their CMMI challenges. The rather unorthodox approach to using CMMI described below has resulted in lasting, positive effects.


I must acknowledge my thorough appreciation for the competitive nature of organizations vying for attention and business from customers expecting to see some sort of CMMI "rating" (e.g., "Maturity Level 3"). This is reality. It's a destructive and counterproductive reality because it can cause organizations to pursue CMMI ratings at the expense of people, productivity, and profit. How silly is that? The purpose of incorporating CMMI into your work is to provide a systematic mechanism to improve performance. Increasing overhead, decimating morale, and lowering productivity are not "performance improvements." Consider this your first missing link.

Missing Link #1: CMMI improves performance. Find aspects of your development operation that can stand to improve in terms of time, cost, and quality, and CMMI will make more sense.

Given that CMMI is a framework to improve performance, it stands to reason that success with CMMI comes more easily to those organizations that want to improve, can identify aspects of their operation they'd like to improve, and have the culture and leadership to make appropriate changes to facilitate such improvements. Organizations resistant to change (even beneficial changes), that have a culture of blame and mistrust, and whose idea of "measures" starts and ends with top-line sales are going to find it difficult to use or see any benefit from CMMI. The right culture and leadership will cause the organization's behaviors to align well with CMMI goals. Organizations with the "right" culture and leadership find themselves achieving CMMI ratings without having to think about it.

Missing Link #2: CMMI alone will not make things much better. For CMMI to work well, it needs (1) a culture of learning, empowerment, and trust, and (2) leadership with the courage, vision, and direction to eliminate the fear that often accompanies change.

CMMI is unlike any other body of work out there. It's different in how it's supposed to be used and in how we determine how well it's been used. What makes for success with other bodies of work does not necessarily work well with CMMI.

In many parts of our work and personal lives, when we see a virtuous thing to do, we often find ways to adopt it into our daily lives. At first, it takes a conscious effort to get it going; then, if we like it, after a while it becomes second nature. In fact, we'll naturally find ways to keep doing it and improving what it is that we're doing. In many ways, CMMI is a lot like this, but in some important ways CMMI is different.

Many organizations misconstrue CMMI practices as achievements in and of themselves rather than enablers of long-term results. In other words, they make the purpose of incorporating CMMI practices the presence or absence of the practices rather than the business benefits of the practices. Using personal health as an analogy, it's like saying, "Hey! Look at me! I'm eating lettuce! Ignore the cup of blue cheese dressing. I'm eating lettuce!"

Missing Link #3: Don't make it about CMMI. Your efforts with CMMI are to help you improve your business. CMMI practices are not the ends; they're the means to the ends.

CMMI is full of practices. Practices are not processes. Practices are activities that improve processes wherever, whenever, and however they may appear in your organization in reality. In fact, it's probably just best to stop thinking about processes entirely. Processes confuse people. Just think about your work. That leaves us asking questions like, "What can we do to make our work better?" You can't actually build anything by following CMMI. There simply is not enough stuff in it that you can actually follow along to run your operation. You need to have some sort of work flow that works for you first, and then use CMMI practices to improve it.

Missing Link #4: Know your work. Know how you get stuff done. Then look into CMMI for ideas on where and how to improve how you get stuff done. That's what CMMI's practices are for: not to define your work, but to refine it.

Speaking of practices, there's frequently much confusion over the coincidental similarity between the names of CMMI process areas and various aspects of how work commonly gets done. The names by which you refer to activities in your work flow may also be words used in CMMI. These activities may be specific tasks, supporting work, or even steps in your development lifecycle. This coincidence does not make everything in CMMI part of your lifecycle. That many activities you do are already in CMMI is testament to the fact that CMMI is full of smart things to do (and yet there are companies out there that don't do them). However, the converse is not true. If something is in CMMI that you don't think you do, it does not mean that you must work it into similarly named parts of your work. More importantly, each practice is an opportunity to leverage activities toward improvement. That these activities exist in CMMI is more about the role they play in improvement than about defining how work is being done.

Missing Link #5: Don't confuse the names of CMMI process areas with processes you might be using. What separates your conduct of a practice from its appearance in CMMI is whether you're leveraging that practice to improve your work.

Do not pass go. Do not collect $200. Do not even begin to work with CMMI until you completely understand the role and limitations of models, how models are used, and the sort of model that CMMI is. All models share certain inalienable characteristics. Foremost is that they're all fake. They're not the real thing. They're mere representations of reality, which does not make them real. A horse might have 5 m² of surface area, but to model the horse we'd probably just represent her as a sphere, cube, or box with 5 m² of surface area, depending on the application we're planning to use her in. Horses aren't spheres, but for many models, spheres are good enough.

Good models help us understand complex ideas, test them, and avoid cost and/or danger compared to trying these ideas in the real world. Models are also used to build other things, like object models in software. CMMI is a model for building process performance improvement systems. As such, the model isn't the improvement system; it's just a list of characteristics that you use when building your improvement system. To describe these characteristics, the model includes examples and other narrative content to inform us of what's meant by the terse summary of characteristics. Then, when benchmarking against CMMI (e.g., in a SCAMPI appraisal), what's being investigated is how many of these characteristics your improvement system has compared to the ones listed in the model. The names of the model characteristics in CMMI are goals and practices.

Missing Link #6: CMMI is a model for building improvement systems. Goals and practices in CMMI are the characteristics that help us know whether our system is consistent with the model, and all the "informative" material helps us understand what's meant by the goals and practices. Think: thousands of words in lieu of pictures.


Earlier I alluded to an experience many of us have shared: the notion of working on bringing some virtuous behavior into our lives, then growing with it until it becomes second nature. CMMI is about improving performance, and we'd eventually like improving performance to be second nature to our work. CMMI provides a systematic way of doing just that, called generic practices. Generic practices are CMMI's way of helping organizations systematically make improvement practices part of everyday work.

At first, you need awareness of the practice(s), and with this awareness usually comes some level of accountability. Typically, we make sure someone knows what needs to get done, much like starting a health program by "just doing it." Then, when "just doing it" doesn't cut it, we need to think about it; we'll start to plan ahead, think things through, set aside time and resources, put in checks and balances, get others involved, and get more organized. In other words, we "manage" it; the same goes for the practices in CMMI. As we get even more refined, we'll come up with better definitions for ourselves so we can compare how we're doing to some prior experience. You know the saying, "that which is measured is improved"? It's true.

Missing Link #7: Generic practices are CMMI's way of helping you make performance improvement practices second nature. They're not intended to be separate and unique activities, and they're not the same as what you do to make your products. They're what you do to ensure that the way you want things done shows up every day as "business as usual."

OK, so this next "missing link" may be a bit controversial, but it's always worked for me and my clients. When you recognize and apply this thinking, you can avoid all sorts of heartburn. CMMI is easiest to use for experts. In fact, if you think of CMMI's "maturity levels" as follows, it might actually start to make sense: Put a project management professional and a configuration specialist to work together on a project, and they'll naturally generate practices of maturity level 2. Add an organizational change professional and an experienced engineer to these two, and much (if not all) of maturity level 3 will show up for the team. Add a process modeler and a statistical expert to this high-powered team, and it shouldn't be surprising to see maturity level 4 and 5 behaviors appear.

Missing Link #8: Pursue the internal capabilities of project management, engineering, and the other experts described here, and you'll discover the origins of many of the innate features of CMMI. Until you've grown those capabilities, you might want some help.

Many failures en route to realizing benefits with CMMI can be tied to not seeing the complete picture of the organization's processes, how they work together, and how they operate on all project, product, and supporting activities simultaneously and over time. Such organizations see very narrow, shallow, and temporary views of their processes, and they often fail to connect their processes to organizational norms, culture, behaviors, or business performance. To complete the picture, define an architecture for your processes, including a taxonomy that characterizes the type of work you do and the types of process components (e.g., process descriptions, procedures, guidelines, measures, templates) that you will use to add discipline to your process improvement journey. How much sense does it make to be ad hoc in how you pursue improvements?

Missing Link #9: A process architecture and common taxonomy are needed so that you can communicate consistently about your processes, and so that you experience your process as a solution to how you get stuff done. Your processes don't happen in a vacuum or in a different dimension of reality by other people. Treat your process solution using systems thinking and all that it entails.

Understand the origin of CMMI: specifically, that CMMI has its roots in Lean, TQM, and high-performance cultures. CMMI is a systematic, measurable guide to installing the many attributes of these roots. The existence of a guide against which measurable progress can be made does not obviate the validity of the basics. Creating such a guide is not easy, and following it might be tricky, but that doesn't mean you need to be overwhelmed by it. In fact, if you go back to the basic Values, Principles, and Practices of Lean, TQM, and high-performance cultures, there's almost nothing in CMMI that will thwart you.

Missing Link #10: Start with the basics. Get grounded, firmly, in what Lean, TQM, and high-performance cultures are all about. Become experts in Lean Systems Thinking, TQM, empowered organizations, and value. CMMI will follow.

I hope these missing links will help you find and complete other links of your own. If I were pressed for a summary of these missing links, it would be this: Recognize the psychology and the engineering in performance improvement. Take them both seriously and you'll be fine. Take either lightly and you'll face an uphill struggle. Good luck on your CMMI journey.

Adopting CMMI: Hiring a CMMI Consultant or Lead Appraiser

by Rawdon Young, Will Hayes, Kevin Schaaff, and Alexander Stall

Many organizations embarking on process improvement initiatives discover that anyone can be called a consultant. There is typically no test, no accrediting organization, and no governing body to guide consultants' actions; a person can simply open a business and start providing services. Many organizations have learned the hard way about the wide range of expertise available. There are differing levels of competence, specialization, and experience. A consultant who is competent in one discipline (e.g., software engineering) or domain (e.g., accounting) may have relatively little to offer in another discipline (e.g., hardware engineering) or domain (e.g., embedded systems). The experience of individuals offering consulting services must be carefully considered before inviting one of them to help steer the improvement program of an organization. A bad choice can be very costly, have long-term effects on the viability of the program, and affect the credibility of the leadership in an organization.

In this essay, we identify some areas to emphasize when considering prospective consultants and lead appraisers. Our intent is to help maximize the chances that you will hire a professional who can help you achieve your performance goals as well as your goals for process improvement. This essay is based on our own personal experiences as process improvement professionals, as well as observations made during the conduct of quality activities for the CMMI Appraisal Program at the SEI.

Experience Drives Success

"You can't teach experience." (Coaching adage)

Experience is perhaps the single most important characteristic of effective CMMI consultants and lead appraisers, and the one that typically distinguishes a particular professional from his or her colleagues and competitors. The content of CMMI constellations and appraisal methods can be gained from careful study of the model documents and through training. However, a real understanding of how they are beneficially applied, and of the value that an organization can gain from using them, comes from understanding organizations, typically as a result of actually having done that type of work.

But what is experience? By itself, a count of the number of years someone has done something is not sufficient. When looking for an outside expert to help you achieve your performance goals, we have found that the breadth of an individual's experience is usually more helpful. Has the professional gained experience by working in or with different organizations, perhaps in different roles? Or has all of his or her experience been in the same organization, or in the same type of work? Stated simply: "Does the person have X years of experience or 1 year of experience X times?"

With broad experience, the consultant has a better understanding of how different kinds of organizations work. This greater repertoire aids in diagnosing issues as well as in determining a variety of potential solutions. Insight about the drivers within an organization (the underlying pattern of incentives and disincentives) helps in deriving custom solutions to problems. Drawing on a breadth of experience, the consultant or lead appraiser is able to see a greater variety of possibilities and can foresee a variety of plausible patterns of future action in the organization.

Reliance on a narrower range of experience can lead to force-fitting a specific solution because it has worked so well in the past; it might even be used as a selling point for the consultant's business. We have encountered many professionals with "the solution." Very often, "the solution" is based on the techniques they have seen implemented several times within a single organization. Because they have only seen it done in a very narrow context, the approach tends to be less flexible. Many of these professionals have worked hard to develop "the complete solution" with templates, tools, and training prepackaged and ready to be delivered to the next client. Tailoring this type of solution is often difficult and can result in experimentation that translates into significant effort and risk to the process improvement objectives.

Lead appraisers with narrower experience often struggle with implementations that are unusual or different from what they have seen in the past. Because of this, appraisal teams can arrive at findings that understate the true achievement of the organization, or miss weaknesses that represent significant barriers to achieving performance goals.


In the extreme case, a team may identify something as a weakness because it is different from what the lead appraiser has seen in the past, or different from "what the model says."

Remember, though, that there is a tradeoff of breadth versus depth. As the aphorism goes, "A specialist knows more and more about less and less until he knows everything about nothing, and a generalist is someone who knows less and less about more and more until he knows nothing about everything."

How Do We Find the Professional We Need?

The following is advice you should consider when selecting lead appraisers or consultants, to help you determine whether they have the requisite mix of breadth and depth of experience.

You are looking for both breadth and depth of experience. Breadth comes from working with a variety of different systems, domains, platforms, services, acquisition types, and so on. With this breadth of experience, professionals will have seen a variety of solutions in a variety of situations, and thus the toolbox of solutions (for consultants) or capability of interpretation (for lead appraisers) they bring to help you is greater than that of someone with less breadth. Depth of experience comes from working hands-on through the entire lifecycle. For example, have they done requirements, design, build, test, and delivery of engineering products, and managed projects that did the same? In other words, have they done the things you are hiring them to help you improve or to evaluate against CMMI?

You are looking for someone who understands the work your organization does. It is possible that you will not always be able to hire a consultant who has experience in what your organization does. In these situations, you should hire someone who understands your business. Do prospective lead appraisers or consultants understand the kind of work you do? Do they understand the language of your business, and not just process improvement or appraisal language? In other words, if you are an accounting company, can the consultant or lead appraiser speak cogently and coherently about accounting services? If you develop products using Agile methods, does the consultant or lead appraiser understand the concepts of agility? A good consultant or lead appraiser doesn't necessarily have to be a CPA, a Scrum Master, or a certified expert in your specific domain. However, if you find yourself teaching them an entirely new language or explaining the nature of the competitive market in which the company operates, you should consider the possibility that they do not understand your business context and may have a difficult time developing an effective solution for your organization.


You are looking for someone who balances process improvement and model compliance. When talking to prospective consultants or lead appraisers, try to get an understanding of how they balance process improvement and model compliance. Do they recognize that the process implemented should reflect the work actually being done, and that it should provide positive value either to the performer of the process or to the larger project or organization? Do they understand that changing an organization can be very difficult and time consuming? How can they help you institutionalize the process and not backslide? Key areas of discussion could be building visible and active senior management support, instilling ownership of the process in those required to follow it, identifying consequences for not following the process, identifying rewards for improving the process (not for simply following the process), removing excuses for not following the process, and documenting processes to the level of granularity you need to ensure they are consistently followed.

Beware of consultants and lead appraisers who overemphasize compliance with the model or the standard. A template of "the standard process" that requires little input from projects and staff, a single documented process for all types of projects, or the development of work products simply because "the model says you have to" are not conducive to obtaining value. Worse, they can actually have a negative impact on the success of the organization, both in terms of improvement and business objectives. George Box has a quote that summarizes this point nicely: "All models are wrong, some models are useful." Consultants who are unable to help you see the value proposition from the business perspective, or lead appraisers who cannot explain the business consequences of a weakness and can only cite the model, may have insufficient grounding in your context. In follow-up questions with candidates, try to determine how flexible they are in their process improvement philosophy. Will they insist on remaking your organization to reflect their solution, or are they willing to change their solution to meet the needs of your business?

You are looking for someone who will provide some suggested guidance on how to evaluate your processes effectively. At some point, an organization that undertakes process improvement will probably want (or require) some level of evaluation of how well it has done, whether it be a formal benchmark like a SCAMPI A or a less formal evaluation of strengths and weaknesses. Lead appraisers or consultants should at least have some approach to these events. Do they view appraisal results as the end goal or as a way to indicate where you can improve?


Is their emphasis on improving the performance of the organization, or do they focus solely on achieving appraisal results? For lead appraisers particularly, the ability to explain, with appropriate linkage to business drivers, why an appraisal is valuable to an organization indicates that they understand how organizations can use appraisals to drive improvements, and not just produce numbers for use in proposals. Similarly, the ability to recognize that not all implementations are identical (sometimes even in the same organization) can mean the difference between a successful, value-added appraisal and a simple exercise to produce a rating number.

You are looking for someone who will provide more than just a "level." None of this is to say that achieving a maturity or capability level is not a worthy goal, but if "Maturity Level x" is the only goal of the consultant you are considering hiring, be aware of the potential impact. You may well achieve the goal (almost any organization can be forced to change behavior for a short time), but your performance probably won't increase and may in fact decrease. It is also likely that the changes won't be institutionalized (i.e., they won't become the habit of the organization) and that they will quickly disappear as the organization reverts to its old behaviors after the appraisal.

You are looking for someone who understands the realities of process improvement. Finally, it is usually worthwhile to ask prospective lead appraisers or consultants some questions regarding their experiences related to developing, implementing, and following processes. These questions can provide insight into how well they understand the realities of process improvement. Consider using the following questions:

• Describe situations in the past where you have actually developed processes for an organization.
• Were those processes ever appraised? If so, with what results?
• Have you ever worked in an organization that followed the processes you helped to develop? Have you followed these processes?

Conclusion: Your Mileage May Vary

Of course, no one can guarantee that an organization will truly improve or always derive value from the use of model-based improvement simply because it hired a particular external professional to help along the way. Many other factors affect process improvement initiatives, such as senior management support, the allocation of organizational resources, and the willingness of the leadership in the organization to accept changes; all must be addressed to have a successful improvement effort. However, the right external expert can greatly facilitate the solutions to these complex issues and enhance the likelihood of success.

When seeking to engage an external professional, always remember these two things:

1. It's not just knowing the content of the model or the method; it's knowing how to apply them to a real organization in its context that results in real, valuable process improvement.
2. Past application and experience in multiple contexts helps increase assurance that a consultant or lead appraiser can help you implement a solution that is effective in your situation. A lot of people can do it right once in a specific context, but no two organizations are ever the same; the ability to see the issues from a variety of past experiences is important.

Model content and method structure can be taught and represented in diplomas and certifications, but you can't teach experience. If the only way consultants or lead appraisers can explain their service is by quoting phrases or content from the model or method, proceed with caution. At the end of the day, it's your organization and not theirs that is trying to improve, and process improvement is about competitive advantage, not "checking boxes." If you choose your process consultant or lead appraiser wisely, set measurable objectives, fix real problems relevant to the business of the organization, and demonstrate added value, your process improvement program will be sustained and your efforts will be successful.

From Doubter to Believer: My Journey to CMMI

by Joseph Morin

It was an honor to receive the invitation to contribute an essay for inclusion in this third edition of the CMMI for Development book. It was also somewhat ironic considering that I had mostly avoided involvement with the Process Program1 during my tenure at the SEI in the late 80s and early 90s.

1. The Process Program (now called the Software Engineering Process Management program) is the SEI group that serves as the steward of CMMI and that focuses its work on helping organizations improve their processes as a way of reaching their business objectives.


On the other hand, it now seems appropriate because I have become an outspoken process advocate and enthusiastic contributor to the evolution of the models and appraisal methods over the 16 years since Integrated System Diagnostics was spun off from the SEI as one of its original transition partners.2 I guess it is true that converts to an idea become its most enthusiastic promoters.

Like many of my generation, my involvement in computer science and software engineering was ancillary to my original career objectives. After graduating from Caltech in 1973 with a degree in Astronomy and Physics, I had the enviable opportunity to work as part of the research group responsible for the first three small astronomy satellites. The mission was to identify and study sources of X-ray emissions coming from objects in space.

To do that, we needed to develop systems and software for the real-time command, control, and communication of the satellites and the on-board instrument packages. The ground-based part of these systems utilized computers from Digital Equipment and Data General, requiring a mastery of their unique operating systems, assembly languages, and communications interfaces. We also needed to develop back-end data reduction and analysis systems to extract and understand the scientific data coming from the instruments. These systems involved programming in Fortran and PL/I and becoming an expert in the JCL needed to manipulate the IBM 360/195 at Goddard Space Flight Center as a single-user machine from Friday evenings through Monday mornings.

Those were exciting times from the research perspective. We identified many more X-ray sources than we had anticipated, and our work in understanding them led to the first experimental support for the existence of black holes. That said, and much to my surprise, I found myself drawn more and more to the challenges of developing, operating, and maintaining the systems and software than to those associated with the science.

When the opportunity presented itself to join some of my old classmates at Intermetrics, a Cambridge, Massachusetts, startup founded by the developers of the guidance, navigation, and control software for NASA's Apollo spacecraft, I enthusiastically made the career change. As had been the case in my field, computers and software were becoming increasingly important in many mission- and life-critical systems. Computers and software were also becoming increasingly complex.

2. An SEI transition partner is an organization that licenses SEI products to deliver services to clients.

Improved methods, tools, and techniques for software development and maintenance were a need recognized by policy makers at the highest levels of the U.S. government as well as by developers "in the trenches." Intermetrics contributed to meeting that need through the development of programming languages, compilers, and support software environments for NASA, the U.S. Navy, GM's automotive electronics division, several telecommunications companies, and a number of computer manufacturers.

As we developed systems for these customers, it seemed only natural that we should try to apply best practices to our own work. We adopted the principles and best practices advocated by Fred Brooks and Harlan Mills. We applied some of the empirical techniques coming out of Victor Basili's group at the University of Maryland and used Terry Snyder's Rate Charting technique to track our progress and communicate with management, customers, and other stakeholders. We actively participated in the R&D and industry standards communities to identify, pilot, and deploy innovative technologies in the performance of our work. It was through this latter activity that I was afforded the privilege of meeting and interacting with a number of the people who would later emerge as key players in the founding and evolution of the SEI and, later, the CMMI Product Suite. At the time, these colleagues were brought together through the DoD Software Initiative program, which included the development of the Ada programming language and lifecycle support environments that were designed to enforce engineering discipline.

Some years passed before I joined the SEI and had the opportunity to reunite with these respected colleagues. My career had diverged into other niches of computer science, and my management responsibilities had increased as I worked in several more startup companies and did some independent management and marketing consulting for other companies or their venture capital investors. The technological aspect of the work remained fascinating, but the new career direction for me was the challenge that executive management faces in meeting business or mission objectives.

I had seen many good ideas and technologies fall by the wayside or undergo a meteoric rise and fall as a fad. To be successful, organizations need to understand the market and accommodate the wants and needs of stakeholders, including those with competing ideas or technology. Sound plans for piloting and rollout are as important as sound plans for development of systems and software.


Clear communication of achievable expectations within predictable schedule and cost constraints is critical for investors, senior managers, developers, and customers alike. Appropriate training and resources are required to allow personnel to effectively accomplish their roles. Objective measures of quality, progress, and success need to be defined, collected, analyzed, and communicated in ways that enable timely corrective actions and provide a factual basis for key business decisions. All of those factors and others need to be balanced and coordinated to achieve the business objective.

When I received an invitation to join the SEI and work on these issues as part of the Institute's technology transition mission, I eagerly accepted. Groups within the Institute were generating many promising technical ideas. Priscilla Fowler, John Maher, and Cecil Martin were assembling a body of knowledge for successfully managing change in the transition and adoption of innovative technology.

I was thoroughly enjoying applying that body of knowledge in helping the SEI move its ideas into practice with early adopters, particularly in the area of domain-specific system architectures. I was aware of the work in the Process Program but had a lingering prejudice that "real" developers and seasoned managers did not need to be taught what seemed to be common sense presented in broad-brush terms.

That viewpoint changed dramatically when I was recruited to be the executive manager in establishing ISD as an SEI spinoff to commercialize the SCE appraisal method. Before I knew it, I was drawn into my first rounds of CBA-IPI assessments and SCE evaluations, predecessors of today's SCAMPI appraisals. I quickly developed a new appreciation of the model and of the variations in practice implementations across diverse organizations. I realized that my earlier perceptions were woefully misguided. The overall state of the practice was not characterized by the relatively small groups of "heroes" working to implement a clearly shared vision in the startups or R&D centers with which I was most familiar. What had appeared to be common sense was indeed its own extensive body of knowledge that needed to be codified and deployed in the same fashion as the technologies it enabled.

Working with organizations to help them improve their processes has not always been a smooth or easy endeavor. The CMMI Product Development Team deserves special recognition for persevering in the evolution of the model. They have had to reconcile and harmonize myriad viewpoints on what best practice is and how to describe it without being prescriptive. They have weathered uncertainties in funding, aggressive schedules, and changes in emphasis directed by committee.

They have overcome those and other challenges and obstacles to produce a work that can benefit us all. I can attest to those benefits firsthand, in small organizations applying Agile and Six Sigma techniques in attaining maturity level 5 capability, and in large bureaucratic ones more slowly improving predictability and performance as they implement managed and defined processes across the organization.

I encourage you to realize the benefits in your own organization. And for those of you with doubts, as I had in my earlier days, I encourage you to take a fresh look at what you can gain from this work. I believe that you will find, as I have, that the model captures much of your hard-earned knowledge in a way that allows you to effectively communicate and apply that knowledge in improving the performance of your organization.




PART TWO

Generic Goals and Generic Practices, and the Process Areas




GENERIC GOALS AND GENERIC PRACTICES

Overview

This section describes in detail all the generic goals and generic practices of CMMI: model components that directly address process institutionalization. As you address each process area, refer to this section for the details of all generic practices. Generic practice elaborations appear after generic practices to provide guidance on how the generic practice can be applied uniquely to process areas.

Process Institutionalization

Institutionalization is an important concept in process improvement. When mentioned in the generic goal and generic practice descriptions, institutionalization implies that the process is ingrained in the way the work is performed and there is commitment and consistency to performing (i.e., executing) the process.

An institutionalized process is more likely to be retained during times of stress. When the requirements and objectives for the process change, however, the implementation of the process may also need to change to ensure that it remains effective. The generic practices describe activities that address these aspects of institutionalization.

The degree of institutionalization is embodied in the generic goals and expressed in the names of the processes associated with each goal, as indicated in Table 7.1.

Table 7.1 Generic Goals and Process Names

Generic Goal    Progression of Processes
GG 1            Performed process
GG 2            Managed process
GG 3            Defined process



The progression of process institutionalization is characterized in the following descriptions of each process.

Performed Process

A performed process is a process that accomplishes the work necessary to satisfy the specific goals of a process area.

Managed Process

A managed process is a performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description.

The process can be instantiated by a project, group, or organizational function. Management of the process is concerned with institutionalization and the achievement of other specific objectives established for the process, such as cost, schedule, and quality objectives. The control provided by a managed process helps to ensure that the established process is retained during times of stress.

The requirements and objectives for the process are established by the organization. The status of the work products and services is visible to management at defined points (e.g., at major milestones, on completion of major tasks). Commitments are established among those who perform the work and the relevant stakeholders and are revised as necessary. Work products are reviewed with relevant stakeholders and are controlled. The work products and services satisfy their specified requirements.

A critical distinction between a performed process and a managed process is the extent to which the process is managed. A managed process is planned (the plan can be part of a more encompassing plan) and the execution of the process is managed against the plan. Corrective actions are taken when the actual results and execution deviate significantly from the plan. A managed process achieves the objectives of the plan and is institutionalized for consistent execution.

Defined Process

A defined process is a managed process that is tailored from the organization's set of standard processes according to the organization's tailoring guidelines; has a maintained process description; and contributes process related experiences to the organizational process assets.


Organizational process assets are artifacts that relate to describing, implementing, and improving processes. These artifacts are assets because they are developed or acquired to meet the business objectives of the organization, and they represent investments by the organization that are expected to provide current and future business value.

The organization's set of standard processes, which are the basis of the defined process, are established and improved over time. Standard processes describe the fundamental process elements that are expected in the defined processes. Standard processes also describe the relationships (e.g., the ordering, the interfaces) among these process elements. The organization-level infrastructure to support current and future use of the organization's set of standard processes is established and improved over time. (See the definition of "standard process" in the glossary.)

A project's defined process provides a basis for planning, performing, and improving the project's tasks and activities. A project can have more than one defined process (e.g., one for developing the product and another for testing the product). A defined process clearly states the following (a minimal sketch of one way to record these elements appears after this list):

• Purpose
• Inputs
• Entry criteria
• Activities
• Roles
• Measures
• Verification steps
• Outputs
• Exit criteria
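CMMI does not prescribe any particular format or tooling for a process description; documents, wikis, and workflow tools are all common. Purely as an illustration, the following Python sketch shows one hypothetical way a team might capture these nine elements in a structured, checkable form. The class name, field names, and the peer review example values are assumptions made for this sketch, not part of the model.

from dataclasses import dataclass
from typing import List

@dataclass
class DefinedProcessDescription:
    # Fields mirror the nine elements a defined process clearly states.
    purpose: str
    inputs: List[str]
    entry_criteria: List[str]
    activities: List[str]
    roles: List[str]
    measures: List[str]
    verification_steps: List[str]
    outputs: List[str]
    exit_criteria: List[str]

# Illustrative instance: a peer review process description (values invented).
peer_review = DefinedProcessDescription(
    purpose="Find and remove defects in work products before integration",
    inputs=["work product to be reviewed", "review checklist"],
    entry_criteria=["work product compiles and passes static checks"],
    activities=["prepare", "hold review meeting", "rework", "verify rework"],
    roles=["author", "moderator", "reviewers"],
    measures=["preparation effort", "defects found per page"],
    verification_steps=["moderator confirms all major defects are resolved"],
    outputs=["defect list", "reviewed work product"],
    exit_criteria=["all major defects closed", "review record stored"],
)

However the elements are recorded, the point is the same: each element is stated explicitly so the process can be planned against, measured, and improved.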

A critical distinction between a managed process and a defined process is the scope of application of the process descriptions, standards, and procedures. For a managed process, the process descriptions, standards, and procedures are applicable to a particular project, group, or organizational function. As a result, the managed processes of two projects in one organization can be different.

Another critical distinction is that a defined process is described in more detail and is performed more rigorously than a managed process. This distinction means that improvement information is easier to understand, analyze, and use. Finally, management of the defined process is based on the additional insight provided by an understanding of the interrelationships of the process activities and detailed measures of the process, its work products, and its services.

Relationships Among Processes

The generic goals evolve so that each goal provides a foundation for the next. Therefore, the following conclusions can be made:

• A managed process is a performed process.
• A defined process is a managed process.

Thus, applied sequentially and in order, the generic goals describe a process that is increasingly institutionalized, from a performed process to a defined process.

Achieving GG 1 for a process area is equivalent to saying you achieve the specific goals of the process area.

Achieving GG 2 for a process area is equivalent to saying you manage the execution of processes associated with the process area. There is a policy that indicates you will perform the process. There is a plan for performing it. Resources are provided, responsibilities are assigned, training on how to perform the process is provided, selected work products from performing the process are controlled, and so on. In other words, the process is planned and monitored just like any project or support activity.

Achieving GG 3 for a process area is equivalent to saying that an organizational standard process exists that can be tailored to result in the process you will use. Tailoring might result in making no changes to the standard process. In other words, the process used and the standard process can be identical. Using the standard process "as is" is tailoring because the choice is made that no modification is required.

Each process area describes multiple activities, some of which are repeatedly performed. You may need to tailor the way one of these activities is performed to account for new capabilities or circumstances. For example, you may have a standard for developing or obtaining organizational training that does not consider web-based training. When preparing to develop or obtain a web-based course, you may need to tailor the standard process to account for the particular challenges and benefits of web-based training.
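Tailoring can be thought of as copying the organization's standard process and applying documented, guideline-conformant changes for the situation at hand. The short Python sketch below illustrates the idea for the web-based training example above; the process steps, the replacement, and the rationale log are invented for illustration and are not drawn from the model.

# Hypothetical standard process for developing a training course.
STANDARD_TRAINING_PROCESS = [
    "analyze training needs",
    "design course",
    "develop classroom materials",
    "pilot course",
    "deliver course",
    "evaluate effectiveness",
]

def tailor(standard_steps, replacements, rationale_log):
    """Copy the standard process, applying documented step replacements."""
    tailored = []
    for step in standard_steps:
        if step in replacements:
            rationale_log.append(f"replaced '{step}' with '{replacements[step]}'")
            tailored.append(replacements[step])
        else:
            tailored.append(step)
    return tailored

log = []
web_based_course_process = tailor(
    STANDARD_TRAINING_PROCESS,
    {"develop classroom materials": "develop and host web-based modules"},
    log,
)
# web_based_course_process is the project's defined process for this course;
# log records the tailoring decisions for review against the tailoring guidelines.

Passing an empty replacements dictionary returns the standard process unchanged, which corresponds to the "as is" tailoring case described above.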

Generic Goals and Generic Practices

This section describes all of the generic goals and generic practices, as well as their associated subpractices, notes, examples, and references. The generic goals are organized in numerical order, GG 1 through GG 3. The generic practices are also organized in numerical order under the generic goal they support.

GG 1 ACHIEVE SPECIFIC GOALS

The specific goals of the process area are supported by the process by transforming identifiable input work products into identifiable output work products.

GP 1.1 PERFORM SPECIFIC PRACTICES

Perform the specific practices of the process area to develop work products and provide services to achieve the specific goals of the process area.

The purpose of this generic practice is to produce the work products and deliver the services that are expected by performing (i.e., executing) the process. These practices can be done informally without following a documented process description or plan. The rigor with which these practices are performed depends on the individuals managing and performing the work and can vary considerably.

GG 2 INSTITUTIONALIZE A MANAGED PROCESS

The process is institutionalized as a managed process.

GP 2.1 ESTABLISH AN ORGANIZATIONAL POLICY

Establish and maintain an organizational policy for planning and performing the process.

The purpose of this generic practice is to define the organizational expectations for the process and make these expectations visible to those members of the organization who are affected. In general, senior management is responsible for establishing and communicating guiding principles, direction, and expectations for the organization. Not all direction from senior management will bear the label "policy." The existence of appropriate organizational direction is the expectation of this generic practice, regardless of what it is called or how it is imparted.

CAR Elaboration

This policy establishes organizational expectations for identifying and systematically addressing causal analysis of selected outcomes.


CM Elaboration

This policy establishes organizational expectations for establishing and maintaining baselines, tracking and controlling changes to work products (under configuration management), and establishing and maintaining integrity of the baselines.

DAR Elaboration

This policy establishes organizational expectations for selectively analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria. The policy should also provide guidance on which decisions require a formal evaluation process.

IPM Elaboration

This policy establishes organizational expectations for establishing and maintaining the project's defined process from project startup through the life of the project, using the project's defined process in managing the project, and coordinating and collaborating with relevant stakeholders.

MA Elaboration

This policy establishes organizational expectations for aligning measurement objectives and activities with identified information needs and project, organizational, or business objectives and for providing measurement results.

OPD Elaboration

This policy establishes organizational expectations for establishing and maintaining a set of standard processes for use by the organization, making organizational process assets available across the organization, and establishing rules and guidelines for teams.

OPF Elaboration

This policy establishes organizational expectations for determining process improvement opportunities for the processes being used and for planning, implementing, and deploying process improvements across the organization.

OPM Elaboration

This policy establishes organizational expectations for analyzing the organization's business performance using statistical and other quantitative techniques to determine performance shortfalls, and identifying and deploying process and technology improvements that contribute to meeting quality and process performance objectives.

OPP Elaboration

This policy establishes organizational expectations for establishing and maintaining process performance baselines and process performance models for the organization's set of standard processes.

OT Elaboration

This policy establishes organizational expectations for identifying the strategic training needs of the organization and providing that training.

PI Elaboration

This policy establishes organizational expectations for developing product integration strategies, procedures, and an environment; ensuring interface compatibility among product components; assembling the product components; and delivering the product and product components.

PMC Elaboration

This policy establishes organizational expectations for monitoring project progress and performance against the project plan and managing corrective action to closure when actual results deviate significantly from the plan.

PP Elaboration

This policy establishes organizational expectations for estimating the planning parameters, making internal and external commitments, and developing the plan for managing the project.

PPQA Elaboration

This policy establishes organizational expectations for objectively evaluating whether processes and associated work products adhere to applicable process descriptions, standards, and procedures, and ensuring that noncompliance is addressed.

This policy also establishes organizational expectations for process and product quality assurance being in place for all projects. Process and product quality assurance must possess sufficient independence from project management to provide objectivity in identifying and reporting noncompliance issues.


QPM Elaboration

This policy establishes organizational expectations for using statistical and other quantitative techniques and historical data when: establishing quality and process performance objectives, composing the project's defined process, selecting subprocess attributes critical to understanding process performance, monitoring subprocess and project performance, and performing root cause analysis to address process performance deficiencies. In particular, this policy establishes organizational expectations for use of process performance measures, baselines, and models.

RD Elaboration

This policy establishes organizational expectations for collecting stakeholder needs, formulating product and product component requirements, and analyzing and validating those requirements.

REQM Elaboration

This policy establishes organizational expectations for managing requirements and identifying inconsistencies between the requirements and the project plans and work products.

RSKM Elaboration

This policy establishes organizational expectations for defining a risk management strategy and identifying, analyzing, and mitigating risks.

SAM Elaboration

This policy establishes organizational expectations for establishing, maintaining, and satisfying supplier agreements.

TS Elaboration

This policy establishes organizational expectations for addressing the iterative cycle in which product or product component solutions are selected, designs are developed, and designs are implemented.

VAL Elaboration

This policy establishes organizational expectations for selecting products and product components for validation; for selecting validation methods; and for establishing and maintaining validation procedures, criteria, and environments that ensure the products and product components satisfy end user needs in their intended operating environment.


VER Elaboration

This policy establishes organizational expectations for establishing and maintaining verification methods, procedures, criteria, and the verification environment, as well as for performing peer reviews and verifying selected work products.

GP 2.2 PLAN THE PROCESS

Establish and maintain the plan for performing the process.

The purpose of this generic practice is to determine what is needed to perform the process and to achieve the established objectives, to prepare a plan for performing the process, to prepare a process description, and to get agreement on the plan from relevant stakeholders. The practical implications of applying a generic practice vary for each process area.

For example, the planning described by this generic practice as applied to the Project Monitoring and Control process area can be carried out in full by the processes associated with the Project Planning process area. However, this generic practice, when applied to the Project Planning process area, sets an expectation that the project planning process itself be planned.

Therefore, this generic practice can either reinforce expectations set elsewhere in CMMI or set new expectations that should be addressed. Refer to the Project Planning process area for more information about establishing and maintaining plans that define project activities.

Establishing a plan includes documenting the plan and a process description. Maintaining the plan includes updating it to reflect corrective actions or changes in requirements or objectives.

The plan for performing the process typically includes the following:

• Process description
• Standards and requirements for the work products and services of the process
• Specific objectives for the execution of the process and its results (e.g., quality, time scale, cycle time, use of resources)
• Dependencies among the activities, work products, and services of the process
• Resources (e.g., funding, people, tools) needed to perform the process
• Assignment of responsibility and authority
• Training needed for performing and supporting the process
• Work products to be controlled and the level of control to be applied
• Measurement requirements to provide insight into the execution of the process, its work products, and its services
• Involvement of relevant stakeholders
• Activities for monitoring and controlling the process
• Objective evaluation activities of the process
• Management review activities for the process and the work products

Subpractices

1. Define and document the plan for performing the process.
This plan can be a stand-alone document, embedded in a more comprehensive document, or distributed among multiple documents. In the case of the plan being distributed among multiple documents, ensure that a coherent picture of who does what is preserved. Documents can be hardcopy or softcopy.
2. Define and document the process description.
The process description, which includes relevant standards and procedures, can be included as part of the plan for performing the process or can be included in the plan by reference.
3. Review the plan with relevant stakeholders and get their agreement.
This review of the plan includes reviewing that the planned process satisfies the applicable policies, plans, requirements, and standards to provide assurance to relevant stakeholders.
4. Revise the plan as necessary.
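The model does not mandate a plan format, and plans are often distributed across several documents. Purely as an illustrative aid for subpractice 1, the following hedged Python sketch checks a draft plan's section headings against the typical plan contents listed above; the condensed section names and the draft plan are invented for this sketch, not taken from the model.

# Typical plan contents from the list above, condensed to short section names.
EXPECTED_PLAN_SECTIONS = [
    "process description",
    "standards and requirements",
    "objectives",
    "dependencies",
    "resources",
    "responsibility and authority",
    "training",
    "controlled work products",
    "measurement requirements",
    "stakeholder involvement",
    "monitoring and control",
    "objective evaluation",
    "management review",
]

def missing_sections(draft_section_names):
    """Return the expected sections not yet covered by a draft plan."""
    covered = {name.strip().lower() for name in draft_section_names}
    return [s for s in EXPECTED_PLAN_SECTIONS if s not in covered]

# Example: a draft plan that so far documents only three sections.
draft = ["Process description", "Resources", "Training"]
print(missing_sections(draft))

Whether such a check is automated or performed by inspection during the stakeholder review in subpractice 3 is a local choice; the point is that the plan's coverage of these elements is verified rather than assumed.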

CAR Elaboration

This plan for performing the causal analysis and resolution process can be included in (or referenced by) the project plan, which is described in the Project Planning process area. This plan differs from the action proposals and associated action plans described in several specific practices in this process area. The plan called for in this generic practice would address the project's overall causal analysis and resolution process (perhaps tailored from a standard process maintained by the organization). In contrast, the process action proposals and associated action items address the activities needed to address a specific root cause under study.


CM Elaboration

This plan for performing the configuration management process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

DAR Elaboration

This plan for performing the decision analysis and resolution process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

IPM Elaboration

This plan for the integrated project management process unites the planning for the project planning and monitor and control processes. The planning for performing the planning related practices in Integrated Project Management is addressed as part of planning the project planning process. This plan for performing the monitor-and-control related practices in Integrated Project Management can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

MA Elaboration

This plan for performing the measurement and analysis process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

OPD Elaboration

This plan for performing the organizational process definition process can be part of (or referenced by) the organization's process improvement plan.

OPF Elaboration

This plan for performing the organizational process focus process, which is often called "the process improvement plan," differs from the process action plans described in specific practices in this process area. The plan called for in this generic practice addresses the comprehensive planning for all of the specific practices in this process area, from establishing organizational process needs through incorporating process related experiences into organizational process assets.

OPM Elaboration

This plan for performing the organizational performance management process differs from the deployment plans described in a specific practice in this process area. The plan called for in this generic practice addresses the comprehensive planning for all of the specific practices in this process area, from maintaining business objectives to evaluating improvement effects. In contrast, the deployment plans called for in the specific practice would address the planning needed for the deployment of selected improvements.

OPP Elaboration

This plan for performing the organizational process performance process can be included in (or referenced by) the organization's process improvement plan, which is described in the Organizational Process Focus process area. Or it may be documented in a separate plan that describes only the plan for the organizational process performance process.

OT Elaboration

This plan for performing the organizational training process differs from the tactical plan for organizational training described in a specific practice in this process area. The plan called for in this generic practice addresses the comprehensive planning for all of the specific practices in this process area, from establishing strategic training needs through assessing the effectiveness of organizational training. In contrast, the organizational training tactical plan called for in the specific practice of this process area addresses the periodic planning for the delivery of training offerings.

PI Elaboration

This plan for performing the product integration process addresses the comprehensive planning for all of the specific practices in this process area, from the preparation for product integration all the way through to the delivery of the final product. This plan for performing the product integration process can be part of (or referenced by) the project plan as described in the Project Planning process area.

PMC Elaboration

This plan for performing the project monitoring and control process can be part of (or referenced by) the project plan, as described in the Project Planning process area.

PP Elaboration

Refer to Table 7.2 in Generic Goals and Generic Practices for more information about the relationship between generic practice 2.2 and the Project Planning process area.


PPQA Elaboration
This plan for performing the process and product quality assurance process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

QPM Elaboration
This plan for performing the quantitative project management process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

RD Elaboration
This plan for performing the requirements development process can be part of (or referenced by) the project plan as described in the Project Planning process area.

REQM Elaboration
This plan for performing the requirements management process can be part of (or referenced by) the project plan as described in the Project Planning process area.

RSKM Elaboration
This plan for performing the risk management process can be included in (or referenced by) the project plan, which is described in the Project Planning process area. The plan called for in this generic practice addresses the comprehensive planning for all of the specific practices in this process area. In particular, this plan provides the overall approach for risk mitigation, but is distinct from mitigation plans (including contingency plans) for specific risks. In contrast, the risk mitigation plans called for in the specific practices of this process area address more focused items such as the levels that trigger risk handling activities.

SAM Elaboration
Portions of this plan for performing the supplier agreement management process can be part of (or referenced by) the project plan as described in the Project Planning process area. Often, however, some portions of the plan reside outside of the project with a group such as contract management.

TS Elaboration
This plan for performing the technical solution process can be part of (or referenced by) the project plan as described in the Project Planning process area.


VAL Elaboration
This plan for performing the validation process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

VER Elaboration
This plan for performing the verification process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

GP 2.3 PROVIDE RESOURCES
Provide adequate resources for performing the process, developing the work products, and providing the services of the process.

The purpose of this generic practice is to ensure that the resources necessary to perform the process as defined by the plan are available when they are needed. Resources include adequate funding, appropriate physical facilities, skilled people, and appropriate tools.
The interpretation of the term "adequate" depends on many factors and can change over time. Inadequate resources may be addressed by increasing resources or by removing requirements, constraints, and commitments.

CAR Elaboration

Examples of resources provided include the following: • Database management systems • Process modeling tools • Statistical analysis packages

CM Elaboration

Examples of resources provided include the following: • Configuration management tools • Data management tools • Archiving and reproduction tools • Database management systems


DAR Elaboration

Examples of resources provided include the following:
• Simulators and modeling tools
• Prototyping tools
• Tools for conducting surveys

IPM Elaboration

Examples of resources provided include the following: • Problem tracking and trouble reporting packages • Groupware • Video conferencing • Integrated decision database • Integrated product support environments

MA Elaboration
Staff with appropriate expertise provide support for measurement and analysis activities. A measurement group with such a role may exist.

Examples of resources provided include the following: • Statistical packages • Packages that support data collection over networks

OPD Elaboration
A process group typically manages organizational process definition activities. This group typically is staffed by a core of professionals whose primary responsibility is coordinating organizational process improvement.

This group is supported by process owners and people with expertise in various disciplines such as the following: • Project management • The appropriate engineering disciplines • Configuration management • Quality assurance

Examples of resources provided include the following:
• Database management systems
• Process modeling tools
• Web page builders and browsers

OPF Elaboration

Examples of resources provided include the following:
• Database management systems
• Process improvement tools
• Web page builders and browsers
• Groupware
• Quality improvement tools (e.g., cause-and-effect diagrams, affinity diagrams, Pareto charts)

OPM Elaboration

Examples of resources provided include the following:
• Simulation packages
• Prototyping tools
• Statistical packages
• Dynamic systems modeling
• Subscriptions to online technology databases and publications
• Process modeling tools

OPP Elaboration
Special expertise in statistical and other quantitative techniques may be needed to establish process performance baselines for the organization's set of standard processes.

Examples of resources provided include the following: • Database management systems • System dynamics models • Process modeling tools • Statistical analysis packages • Problem tracking packages


OT Elaboration

Examples of resources provided include the following:
• Subject matter experts
• Curriculum designers
• Instructional designers
• Instructors
• Training administrators

Special facilities may be required for training. When necessary, the facilities required for the activities in the Organizational Training process area are developed or purchased.

Examples of resources provided include the following: • Instruments for analyzing training needs • Workstations to be used for training • Instructional design tools • Packages for developing presentation materials

PI Elaboration
Product component interface coordination can be accomplished with an Interface Control Working Group consisting of people who represent external and internal interfaces. Such groups can be used to elicit needs for interface requirements development.
Special facilities may be required for assembling and delivering the product. When necessary, the facilities required for the activities in the Product Integration process area are developed or purchased.

Examples of resources provided include the following: • Prototyping tools • Analysis tools • Simulation tools • Interface management tools • Assembly tools (e.g., compilers, make files, joining tools, jigs, fixtures)


PMC Elaboration

Examples of resources provided include the following: • Cost tracking systems • Effort reporting systems • Action item tracking systems • Project management and scheduling programs

PP Elaboration
Special expertise, equipment, and facilities in project planning may be required.

Special expertise in project planning can include the following: • Experienced estimators • Schedulers • Technical experts in applicable areas (e.g., product domain, technology)

Examples of resources provided include the following:
• Spreadsheet programs
• Estimating models
• Project planning and scheduling packages

PPQA Elaboration

Examples of resources provided include the following: • Evaluation tools • Noncompliance tracking tools

QPM Elaboration
Special expertise in statistics and its use in analyzing process performance may be needed to define the analytic techniques used in quantitative management. Special expertise in statistics can also be needed for analyzing and interpreting the measures resulting from statistical analyses; however, teams need sufficient expertise to support a basic understanding of their process performance as they perform their daily work.

Examples of resources provided include the following:
• Statistical analysis packages
• Statistical process and quality control packages
• Scripts and tools that assist teams in analyzing their own process performance with minimal need for additional expert assistance

RD Elaboration
Special expertise in the application domain, methods for eliciting stakeholder needs, and methods and tools for specifying and analyzing customer, product, and product component requirements may be required.

Examples of resources provided include the following: • Requirements specification tools • Simulators and modeling tools • Prototyping tools • Scenario definition and management tools • Requirements tracking tools

REQM Elaboration

Examples of resources provided include the following: • Requirements tracking tools • Traceability tools

RSKM Elaboration

Examples of resources provided include the following: • Risk management databases • Risk mitigation tools • Prototyping tools • Modeling and simulation tools

SAM Elaboration

Examples of resources provided include the following: • Preferred supplier lists • Requirements tracking tools • Project management and scheduling programs


TS Elaboration
Special facilities may be required for developing, designing, and implementing solutions to requirements. When necessary, the facilities required for the activities in the Technical Solution process area are developed or purchased.

Examples of resources provided include the following: • Design specification tools • Simulators and modeling tools • Prototyping tools • Scenario definition and management tools • Requirements tracking tools • Interactive documentation tools

VAL Elaboration
Special facilities may be required for validating the product or product components. When necessary, the facilities required for validation are developed or purchased.

Examples of resources provided include the following:
• Test management tools
• Test case generators
• Test coverage analyzers
• Simulators
• Load, stress, and performance testing tools

VER Elaboration
Special facilities may be required for verifying selected work products. When necessary, the facilities required for the activities in the Verification process area are developed or purchased.
Certain verification methods can require special tools, equipment, facilities, and training (e.g., peer reviews can require meeting rooms and trained moderators; certain verification tests can require special test equipment and people skilled in the use of the equipment).

Examples of resources provided include the following: • Test management tools • Test case generators • Test coverage analyzers • Simulators


GP 2.4 ASSIGN RESPONSIBILITY

Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process.

The purpose of this generic practice is to ensure that there is accountability for performing the process and achieving the specified results throughout the life of the process. The people assigned must have the appropriate authority to perform the assigned responsibilities.
Responsibility can be assigned using detailed job descriptions or in living documents, such as the plan for performing the process. Dynamic assignment of responsibility is another legitimate way to implement this generic practice, as long as the assignment and acceptance of responsibility are ensured throughout the life of the process.

Subpractices
1. Assign overall responsibility and authority for performing the process.
2. Assign responsibility and authority for performing the specific tasks of the process.
3. Confirm that the people assigned to the responsibilities and authorities understand and accept them.

OPF Elaboration
Two groups are typically established and assigned responsibility for process improvement: (1) a management steering committee for process improvement to provide senior management sponsorship, and (2) a process group to facilitate and manage the process improvement activities.

PPQA Elaboration
Responsibility is assigned to those who can perform process and product quality assurance evaluations with sufficient independence and objectivity to guard against subjectivity or bias.

TS Elaboration
Appointing a lead or chief architect who oversees the technical solution and has authority over design decisions helps to maintain consistency in product design and evolution.


GP 2.5 TRAIN PEOPLE
Train the people performing or supporting the process as needed.

The purpose of this generic practice is to ensure that people have the necessary skills and expertise to perform or support the process.
Appropriate training is provided to those who will be performing the work. Overview training is provided to orient people who interact with those who perform the work.

Examples of methods for providing training include self study; self-directed training; self-paced, programmed instruction; formalized on-the-job training; mentoring; and formal and classroom training.

Training supports the successful execution of the process by establishing a common understanding of the process and by imparting the skills and knowledge needed to perform the process.
Refer to the Organizational Training process area for more information about developing skills and knowledge of people so they can perform their roles effectively and efficiently.

CAR Elaboration

Examples of training topics include the following: • Quality management methods (e.g., root cause analysis)

CM Elaboration

Examples of training topics include the following: • Roles, responsibilities, and authority of the configuration management staff • Configuration management standards, procedures, and methods • Configuration library system

DAR Elaboration

Examples of training topics include the following: • Formal decision analysis • Methods for evaluating alternative solutions against criteria


IPM Elaboration

Examples of training topics include the following:
• Tailoring the organization's set of standard processes to meet the needs of the project
• Managing the project based on the project's defined process
• Using the organization's measurement repository
• Using the organizational process assets
• Integrated management
• Intergroup coordination
• Group problem solving

MA Elaboration

Examples of training topics include the following: • Statistical techniques • Data collection, analysis, and reporting processes • Development of goal related measurements (e.g., Goal Question Metric)

OPD Elaboration

Examples of training topics include the following: • CMMI and other process and process improvement reference models • Planning, managing, and monitoring processes • Process modeling and definition • Developing a tailorable standard process • Developing work environment standards • Ergonomics

OPF Elaboration

Examples of training topics include the following: • CMMI and other process improvement reference models • Planning and managing process improvement • Tools, methods, and analysis techniques • Process modeling • Facilitation techniques • Change management


OPM Elaboration

Examples of training topics include the following: • Cost benefit analysis • Planning, designing, and conducting pilots • Technology transition • Change management

OPP Elaboration

Examples of training topics include the following: • Process and process improvement modeling • Statistical and other quantitative methods (e.g., estimating models, Pareto analysis, control charts)

OT Elaboration

Examples of training topics include the following:
• Knowledge and skills needs analysis
• Instructional design
• Instructional techniques (e.g., train the trainer)
• Refresher training on subject matter

PI Elaboration

Examples of training topics include the following: • Application domain • Product integration procedures and criteria • Organization’s facilities for integration and assembly • Assembly methods • Packaging standards

PMC Elaboration

Examples of training topics include the following: • Monitoring and control of projects • Risk management • Data management


PP Elaboration

Examples of training topics include the following:
• Estimating
• Budgeting
• Negotiating
• Identifying and analyzing risks
• Managing data
• Planning
• Scheduling

PPQA Elaboration

Examples of training topics include the following:
• Application domain
• Customer relations
• Process descriptions, standards, procedures, and methods for the project
• Quality assurance objectives, process descriptions, standards, procedures, methods, and tools

QPM Elaboration

Examples of training topics include the following:
• Basic quantitative (including statistical) analyses that help in analyzing process performance, using historical data, and identifying when corrective action is warranted
• Process modeling and analysis
• Process measurement data selection, definition, and collection

RD Elaboration

Examples of training topics include the following: • Application domain • Requirements definition and analysis • Requirements elicitation • Requirements specification and modeling • Requirements tracking


REQM Elaboration

Examples of training topics include the following: • Application domain • Requirements definition, analysis, review, and management • Requirements management tools • Configuration management • Negotiation and conflict resolution

RSKM Elaboration

Examples of training topics include the following:
• Risk management concepts and activities (e.g., risk identification, evaluation, monitoring, mitigation)
• Measure selection for risk mitigation

SAM Elaboration

Examples of training topics include the following:
• Regulations and business practices related to negotiating and working with suppliers
• Acquisition planning and preparation
• Commercial off-the-shelf product acquisition
• Supplier evaluation and selection
• Negotiation and conflict resolution
• Supplier management
• Testing and transition of acquired products
• Receiving, storing, using, and maintaining acquired products

TS Elaboration

Examples of training topics include the following: • Application domain of the product and product components • Design methods • Architecture methods • Interface design • Unit testing techniques • Standards (e.g., product, safety, human factors, environmental)


VAL Elaboration

Examples of training topics include the following:
• Application domain
• Validation principles, standards, and methods
• Intended-use environment

VER Elaboration

Examples of training topics include the following:
• Application or service domain
• Verification principles, standards, and methods (e.g., analysis, demonstration, inspection, test)
• Verification tools and facilities
• Peer review preparation and procedures
• Meeting facilitation

GP 2.6 CONTROL WORK PRODUCTS

Place selected work products of the process under appropriate levels of control.

The purpose of this generic practice is to establish and maintain the integrity of the selected work products of the process (or their descriptions) throughout their useful life.
The selected work products are specifically identified in the plan for performing the process, along with a specification of the appropriate level of control.
Different levels of control are appropriate for different work products and for different points in time. For some work products, it may be sufficient to maintain version control so that the version of the work product in use at a given time, past or present, is known and changes are incorporated in a controlled manner. Version control is usually under the sole control of the work product owner (which can be an individual, group, or team).
Sometimes, it can be critical that work products be placed under formal or baseline configuration management. This type of control includes defining and establishing baselines at predetermined points. These baselines are formally reviewed and approved, and serve as the basis for further development of the designated work products.


Refer to the Configuration Management process area for more information about establishing and maintaining the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.
Additional levels of control between version control and formal configuration management are possible. An identified work product can be under various levels of control at different points in time.

CAR Elaboration

Examples of work products placed under control include the following: • Action proposals • Action plans • Causal analysis and resolution records

CM Elaboration

Examples of work products placed under control include the following:
• Access lists
• Change status reports
• Change request database
• CCB meeting minutes
• Archived baselines

DAR Elaboration

Examples of work products placed under control include the following: • Guidelines for when to apply a formal evaluation process • Evaluation reports containing recommended solutions

IPM Elaboration

Examples of work products placed under control include the following: • The project’s defined process • Project plans • Other plans that affect the project • Integrated plans • Actual process and product measurements collected from the project • Project’s shared vision • Team structure • Team charters


MA Elaboration

Examples of work products placed under control include the following:
• Measurement objectives
• Specifications of base and derived measures
• Data collection and storage procedures
• Base and derived measurement data sets
• Analysis results and draft reports
• Data analysis tools

OPD Elaboration

Examples of work products placed under control include the following:
• Organization's set of standard processes
• Descriptions of lifecycle models
• Tailoring guidelines for the organization's set of standard processes
• Definitions of the common set of product and process measures
• Organization's measurement data
• Rules and guidelines for structuring and forming teams

OPF Elaboration

Examples of work products placed under control include the following: • Process improvement proposals • Organization’s approved process action plans • Training materials used for deploying organizational process assets • Guidelines for deploying the organization’s set of standard processes on new projects • Plans for the organization’s process appraisals

OPM Elaboration

Examples of work products placed under control include the following: • Documented lessons learned from improvement validation • Deployment plans • Revised improvement measures, objectives, priorities • Updated process documentation and training material


OPP Elaboration

Examples of work products placed under control include the following: • Organization’s quality and process performance objectives • Definitions of the selected measures of process performance • Baseline data on the organization’s process performance • Process performance models

OT Elaboration

Examples of work products placed under control include the following: • Organizational training tactical plan • Training records • Training materials and supporting artifacts • Instructor evaluation forms

PI Elaboration

Examples of work products placed under control include the following:
• Acceptance documents for the received product components
• Evaluated assembled product and product components
• Product integration strategy
• Product integration procedures and criteria
• Updated interface description or agreement

PMC Elaboration

Examples of work products placed under control include the following: • Project schedules with status • Project measurement data and analysis • Earned value reports

PP Elaboration

Examples of work products placed under control include the following: • Work breakdown structure • Project plan • Data management plan • Stakeholder involvement plan


PPQA Elaboration

Examples of work products placed under control include the following:
• Noncompliance reports
• Evaluation logs and reports

QPM Elaboration

Examples of work products placed under control include the following: • Subprocesses to be included in the project’s defined process • Operational definitions of the measures, their collection points in the subprocesses, and how the integrity of the measures will be determined • Collected measurements

RD Elaboration

Examples of work products placed under control include the following:
• Customer functional and quality attribute requirements
• Definition of required functionality and quality attributes
• Product and product component requirements
• Interface requirements

REQM Elaboration

Examples of work products placed under control include the following: • Requirements • Requirements traceability matrix

RSKM Elaboration

Examples of work products placed under control include the following: • Risk management strategy • Identified risk items • Risk mitigation plans


SAM Elaboration

Examples of work products placed under control include the following:
• Statements of work
• Supplier agreements
• Memoranda of agreement
• Subcontracts
• Preferred supplier lists

TS Elaboration

Examples of work products placed under control include the following: • Product, product component, and interface designs • Technical data packages • Interface design documents • Criteria for design and product component reuse • Implemented designs (e.g., software code, fabricated product components) • User, installation, operation, and maintenance documentation

VAL Elaboration

Examples of work products placed under control include the following: • Lists of products and product components selected for validation • Validation methods, procedures, and criteria • Validation reports

VER Elaboration

Examples of work products placed under control include the following: • Verification procedures and criteria • Peer review training material • Peer review data • Verification reports

GP 2.7 IDENTIFY AND INVOLVE RELEVANT STAKEHOLDERS
Identify and involve the relevant stakeholders of the process as planned.

The purpose of this generic practice is to establish and maintain the expected involvement of relevant stakeholders during the execution of the process.


Involve relevant stakeholders as described in an appropriate plan for stakeholder involvement. Involve stakeholders appropriately in activities such as the following:

• Planning
• Decisions
• Commitments
• Communications
• Coordination
• Reviews
• Appraisals
• Requirements definitions
• Resolution of problems and issues

Refer to the Project Planning process area for more information about planning stakeholder involvement.
The objective of planning stakeholder involvement is to ensure that interactions necessary to the process are accomplished, while not allowing excessive numbers of affected groups and individuals to impede process execution.

Examples of stakeholders that might serve as relevant stakeholders for specific tasks, depending on context, include individuals, teams, management, customers, suppliers, end users, operations and support staff, other projects, and government regulators.

Subpractices
1. Identify stakeholders relevant to this process and their appropriate involvement.
Relevant stakeholders are identified among the suppliers of inputs to, the users of outputs from, and the performers of the activities in the process. Once the relevant stakeholders are identified, the appropriate level of their involvement in process activities is planned.
2. Share these identifications with project planners or other planners as appropriate.
3. Involve relevant stakeholders as planned.


CAR Elaboration

Examples of activities for stakeholder involvement include the following: • Conducting causal analysis • Assessing action proposals

CM Elaboration

Examples of activities for stakeholder involvement include the following: • Establishing baselines • Reviewing configuration management system reports and resolving issues • Assessing the impact of changes for configuration items • Performing configuration audits • Reviewing results of configuration management audits

DAR Elaboration

Examples of activities for stakeholder involvement include the following:
• Establishing guidelines for which issues are subject to a formal evaluation process
• Defining the issue to be addressed
• Establishing evaluation criteria
• Identifying and evaluating alternatives
• Selecting evaluation methods
• Selecting solutions

IPM Elaboration

Examples of activities for stakeholder involvement include the following: • Resolving issues about the tailoring of organizational process assets • Resolving issues among the project plan and other plans that affect the project • Reviewing project progress and performance to align with current and projected needs, objectives, and requirements • Creating the project’s shared vision • Defining the team structure for the project • Populating teams


MA Elaboration

Examples of activities for stakeholder involvement include the following:
• Establishing measurement objectives and procedures
• Assessing measurement data
• Providing meaningful feedback to those who are responsible for providing the raw data on which the analysis and results depend

OPD Elaboration

Examples of activities for stakeholder involvement include the following:
• Reviewing the organization's set of standard processes
• Reviewing the organization's lifecycle models
• Resolving issues related to the tailoring guidelines
• Assessing definitions of the common set of process and product measures
• Reviewing work environment standards
• Establishing and maintaining empowerment mechanisms
• Establishing and maintaining organizational rules and guidelines for structuring and forming teams

OPF Elaboration

Examples of activities for stakeholder involvement include the following:
• Coordinating and collaborating on process improvement activities with process owners, those who are or will be performing the process, and support organizations (e.g., training staff, quality assurance representatives)
• Establishing the organizational process needs and objectives
• Appraising the organization's processes
• Implementing process action plans
• Coordinating and collaborating on the execution of pilots to test selected improvements
• Deploying organizational process assets and changes to organizational process assets
• Communicating the plans, status, activities, and results related to planning, implementing, and deploying process improvements


OPM Elaboration

Examples of activities for stakeholder involvement include the following:
• Reviewing improvement proposals that could contribute to meeting business objectives
• Providing feedback to the organization on the readiness, status, and results of the improvement deployment activities

The feedback typically involves the following:
• Informing the people who submit improvement proposals about the disposition of their proposals
• Regularly communicating the results of comparing business performance against the business objectives
• Regularly informing relevant stakeholders about the plans and status for selecting and deploying improvements
• Preparing and distributing a summary of improvement selection and deployment activities

OPP Elaboration

Examples of activities for stakeholder involvement include the following:
• Establishing the organization's quality and process performance objectives and their priorities
• Reviewing and resolving issues on the organization's process performance baselines
• Reviewing and resolving issues on the organization's process performance models

OT Elaboration

Examples of activities for stakeholder involvement include the following: • Establishing a collaborative environment for discussion of training needs and training effectiveness to ensure that the organization’s training needs are met • Identifying training needs • Reviewing the organizational training tactical plan • Assessing training effectiveness


PI Elaboration

Examples of activities for stakeholder involvement include the following:
• Establishing the product integration strategy
• Reviewing interface descriptions for completeness
• Establishing the product integration procedures and criteria
• Assembling and delivering the product and product components
• Communicating the results after evaluation
• Communicating new, effective product integration processes to give affected people the opportunity to improve their process performance

PMC Elaboration

Examples of activities for stakeholder involvement include the following:
• Assessing the project against the plan
• Reviewing commitments and resolving issues
• Reviewing project risks
• Reviewing data management activities
• Reviewing project progress
• Managing corrective actions to closure

PP Elaboration

Examples of activities for stakeholder involvement include the following: • Establishing estimates • Reviewing and resolving issues on the completeness and correctness of the project risks • Reviewing data management plans • Establishing project plans • Reviewing project plans and resolving issues on work and resource issues

PPQA Elaboration

Examples of activities for stakeholder involvement include the following: • Establishing criteria for the objective evaluations of processes and work products • Evaluating processes and work products • Resolving noncompliance issues • Tracking noncompliance issues to closure


QPM Elaboration

Examples of activities for stakeholder involvement include the following: • Establishing project objectives • Resolving issues among the project’s quality and process performance objectives • Selecting analytic techniques to be used • Evaluating the process performance of selected subprocesses • Identifying and managing the risks in achieving the project’s quality and process performance objectives • Identifying what corrective action should be taken

RD Elaboration

Examples of activities for stakeholder involvement include the following:
• Reviewing the adequacy of requirements in meeting needs, expectations, constraints, and interfaces
• Establishing operational concepts and operational, sustainment, and development scenarios
• Assessing the adequacy of requirements
• Prioritizing customer requirements
• Establishing product and product component functional and quality attribute requirements
• Assessing product cost, schedule, and risk

REQM Elaboration

Examples of activities for stakeholder involvement include the following: • Resolving issues on the understanding of requirements • Assessing the impact of requirements changes • Communicating bidirectional traceability • Identifying inconsistencies among requirements, project plans, and work products

RSKM Elaboration

Examples of activities for stakeholder involvement include the following: • Establishing a collaborative environment for free and open discussion of risk • Reviewing the risk management strategy and risk mitigation plans • Participating in risk identification, analysis, and mitigation activities • Communicating and reporting risk management status


SAM Elaboration

Examples of activities for stakeholder involvement include the following:
• Establishing criteria for evaluation of potential suppliers
• Reviewing potential suppliers
• Establishing supplier agreements
• Resolving issues with suppliers
• Reviewing supplier performance

TS Elaboration

Examples of activities for stakeholder involvement include the following: • Developing alternative solutions and selection criteria • Obtaining approval on external interface specifications and design descriptions • Developing the technical data package • Assessing the make, buy, or reuse alternatives for product components • Implementing the design

VAL Elaboration

Examples of activities for stakeholder involvement include the following: • Selecting the products and product components to be validated • Establishing the validation methods, procedures, and criteria • Reviewing results of product and product component validation and resolving issues • Resolving issues with the customers or end users

Issues with the customers or end users are resolved particularly when there are significant deviations from their baseline needs. Examples of resolutions include the following: • Waivers on the contract or agreement (what, when, and for which products) • Additional in-depth studies, trials, tests, or evaluations • Possible changes in the contracts or agreements


VER Elaboration

Examples of activities for stakeholder involvement include the following: • Selecting work products and methods for verification • Establishing verification procedures and criteria • Conducting peer reviews • Assessing verification results and identifying corrective action

GP 2.8 MONITOR AND CONTROL THE PROCESS
Monitor and control the process against the plan for performing the process and take appropriate corrective action.

The purpose of this generic practice is to perform the direct day-to-day monitoring and controlling of the process. Appropriate visibility into the process is maintained so that appropriate corrective action can be taken when necessary. Monitoring and controlling the process can involve measuring appropriate attributes of the process or work products produced by the process.
Refer to the Measurement and Analysis process area for more information about developing and sustaining a measurement capability used to support management information needs.
Refer to the Project Monitoring and Control process area for more information about providing an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan.

Subpractices
1. Evaluate actual progress and performance against the plan for performing the process.
The evaluations are of the process, its work products, and its services.
2. Review accomplishments and results of the process against the plan for performing the process.
3. Review activities, status, and results of the process with the immediate level of management responsible for the process and identify issues.
These reviews are intended to provide the immediate level of management with appropriate visibility into the process based on the day-to-day monitoring and controlling of the process, and are supplemented by periodic and event-driven reviews with higher level management as described in GP 2.10.


4. Identify and evaluate the effects of significant deviations from the plan for performing the process.
5. Identify problems in the plan for performing the process and in the execution of the process.
6. Take corrective action when requirements and objectives are not being satisfied, when issues are identified, or when progress differs significantly from the plan for performing the process.
Inherent risks should be considered before any corrective action is taken.

Corrective action can include the following: • Taking remedial action to repair defective work products or services • Changing the plan for performing the process • Adjusting resources, including people, tools, and other resources • Negotiating changes to the established commitments • Securing change to the requirements and objectives that must be satisfied • Terminating the effort

7. Track corrective action to closure.

CAR Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of outcomes analyzed
• Change in quality or process performance per instance of the causal analysis and resolution process
• Schedule of activities for implementing a selected action proposal

CM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of changes to configuration items
• Number of configuration audits conducted
• Schedule of CCB or audit activities


DAR Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Cost-to-benefit ratio of using formal evaluation processes
• Schedule for the execution of a trade study

IPM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of changes to the project's defined process
• Schedule and effort to tailor the organization's set of standard processes
• Interface coordination issue trends (i.e., number identified and number closed)
• Schedule for project tailoring activities
• Project's shared vision usage and effectiveness
• Team structure usage and effectiveness
• Team charters usage and effectiveness

MA Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Percentage of projects using progress and performance measures
• Percentage of measurement objectives addressed
• Schedule for collection and review of measurement data

OPD Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Percentage of projects using the process architectures and process elements of the organization's set of standard processes
• Defect density of each process element of the organization's set of standard processes
• Schedule for development of a process or process change


OPF Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of process improvement proposals submitted, accepted, or implemented
• CMMI maturity level or capability level earned
• Schedule for deployment of an organizational process asset
• Percentage of projects using the current organization's set of standard processes (or tailored version of the current set)
• Issue trends associated with implementing the organization's set of standard processes (i.e., number of issues identified, number closed)
• Progress toward achievement of process needs and objectives

OPM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Change in quality and process performance related to business objectives
• Schedule for implementing and validating an improvement
• Schedule for activities to deploy a selected improvement

OPP Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Trends in the organization's process performance with respect to changes in work products and task attributes (e.g., size growth, effort, schedule, quality)
• Schedule for collecting and reviewing measures to be used for establishing a process performance baseline

OT Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of training courses delivered (e.g., planned versus actual)
• Post-training evaluation ratings
• Training program quality survey ratings
• Schedule for delivery of training
• Schedule for development of a course


PI Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Product component integration profile (e.g., product component assemblies planned and performed, number of exceptions found)
• Integration evaluation problem report trends (e.g., number written and number closed)
• Integration evaluation problem report aging (i.e., how long each problem report has been open)
• Schedule for conduct of specific integration activities

PMC Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of open and closed corrective actions
• Schedule with status for monthly financial data collection, analysis, and reporting
• Number and types of reviews performed
• Review schedule (planned versus actual and slipped target dates)
• Schedule for collection and analysis of monitoring data

PP Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of revisions to the plan
• Cost, schedule, and effort variance per plan revision
• Schedule for development and maintenance of program plans

PPQA Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Variance of objective process evaluations planned and performed
• Variance of objective work product evaluations planned and performed
• Schedule for objective evaluations


QPM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Profile of subprocess attributes whose process performance provide insight about the risk to, or are key contributors to, achieving project objectives (e.g., number selected for monitoring through statistical techniques, number currently being monitored, number whose process performance is stable)
• Number of special causes of variation identified
• Schedule of data collection, analysis, and reporting activities in a measurement and analysis cycle as it relates to quantitative management activities

RD Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Cost, schedule, and effort expended for rework
• Defect density of requirements specifications
• Schedule for activities to develop a set of requirements

REQM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Requirements volatility (percentage of requirements changed)
• Schedule for coordination of requirements
• Schedule for analysis of a proposed requirements change

RSKM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of risks identified, managed, tracked, and controlled
• Risk exposure and changes to the risk exposure for each assessed risk, and as a summary percentage of management reserve
• Change activity for risk mitigation plans (e.g., processes, schedule, funding)
• Occurrence of unanticipated risks
• Risk categorization volatility
• Comparison of estimated versus actual risk mitigation effort and impact
• Schedule for risk analysis activities
• Schedule of actions for a specific mitigation

SAM Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of changes made to the requirements for the supplier
• Cost and schedule variance in accordance with the supplier agreement
• Schedule for selecting a supplier and establishing an agreement

TS Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Cost, schedule, and effort expended for rework
• Percentage of requirements addressed in the product or product component design
• Size and complexity of the product, product components, interfaces, and documentation
• Defect density of technical solutions work products
• Schedule for design activities

VAL Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Number of validation activities completed (planned versus actual)
• Validation problem report trends (e.g., number written, number closed)
• Validation problem report aging (i.e., how long each problem report has been open)
• Schedule for a specific validation activity


VER Elaboration

Examples of measures and work products used in monitoring and controlling include the following:
• Verification profile (e.g., the number of verifications planned and performed, and the defects found; or defects categorized by verification method or type)
• Number of defects detected by defect category
• Verification problem report trends (e.g., number written, number closed)
• Verification problem report status (i.e., how long each problem report has been open)
• Schedule for a specific verification activity
• Peer review effectiveness

GP 2.9 OBJECTIVELY EVALUATE ADHERENCE
Objectively evaluate adherence of the process and selected work products against the process description, standards, and procedures, and address noncompliance.

The purpose of this generic practice is to provide credible assurance that the process and selected work products are implemented as planned and adhere to the process description, standards, and procedures. (See the definition of "objectively evaluate" in the glossary.)
Refer to the Process and Product Quality Assurance process area for more information about objectively evaluating processes and work products.
People not directly responsible for managing or performing the activities of the process typically evaluate adherence. In many cases, adherence is evaluated by people in the organization, but external to the process or project, or by people external to the organization. As a result, credible assurance of adherence can be provided even during times when the process is under stress (e.g., when the effort is behind schedule, when the effort is over budget).

CAR Elaboration

Examples of activities reviewed include the following: • Determining causes of outcomes • Evaluating results of action plans

Examples of work products reviewed include the following: • Action proposals selected for implementation • Causal analysis and resolution records


CM Elaboration

Examples of activities reviewed include the following: • Establishing baselines • Tracking and controlling changes • Establishing and maintaining the integrity of baselines

Examples of work products reviewed include the following: • Archives of baselines • Change request database

DAR Elaboration

Examples of activities reviewed include the following: • Evaluating alternatives using established criteria and methods

Examples of work products reviewed include the following:
• Guidelines for when to apply a formal evaluation process
• Evaluation reports containing recommended solutions

IPM Elaboration

Examples of activities reviewed include the following: • Establishing, maintaining, and using the project’s defined process • Coordinating and collaborating with relevant stakeholders • Using the project’s shared vision • Organizing teams

Examples of work products reviewed include the following: • Project’s defined process • Project plans • Other plans that affect the project • Work environment standards • Shared vision statements • Team structure • Team charters


MA Elaboration

Examples of activities reviewed include the following:
• Aligning measurement and analysis activities
• Providing measurement results

Examples of work products reviewed include the following: • Specifications of base and derived measures • Data collection and storage procedures • Analysis results and draft reports

OPD Elaboration

Examples of activities reviewed include the following: • Establishing organizational process assets • Determining rules and guidelines for structuring and forming teams

Examples of work products reviewed include the following:
• Organization's set of standard processes
• Descriptions of lifecycle models
• Tailoring guidelines for the organization's set of standard processes
• Organization's measurement data
• Empowerment rules and guidelines for people and teams
• Organizational process documentation

OPF Elaboration

Examples of activities reviewed include the following: • Determining process improvement opportunities • Planning and coordinating process improvement activities • Deploying the organization’s set of standard processes on projects at their startup

Examples of work products reviewed include the following: • Process improvement plans • Process action plans • Process deployment plans • Plans for the organization’s process appraisals


OPM Elaboration

Examples of activities reviewed include the following: • Analyzing process performance data to determine the organization’s ability to meet identified business objectives • Selecting improvements using quantitative analysis • Deploying improvements • Measuring effectiveness of the deployed improvements using statistical and other quantitative techniques

Examples of work products reviewed include the following: • Improvement proposals • Deployment plans • Revised improvement measures, objectives, priorities, and deployment plans • Updated process documentation and training material

OPP Elaboration

Examples of activities reviewed include the following:
• Establishing process performance baselines and models

Examples of work products reviewed include the following: • Process performance baselines • Organization’s quality and process performance objectives • Definitions of the selected measures of process performance

OT Elaboration

Examples of activities reviewed include the following: • Identifying training needs and making training available • Providing necessary training

Examples of work products reviewed include the following: • Organizational training tactical plan • Training materials and supporting artifacts • Instructor evaluation forms


PI Elaboration

Examples of activities reviewed include the following:
• Establishing and maintaining a product integration strategy
• Ensuring interface compatibility
• Assembling product components and delivering the product

Examples of work products reviewed include the following: • Product integration strategy • Product integration procedures and criteria • Acceptance documents for the received product components • Assembled product and product components

PMC Elaboration

Examples of activities reviewed include the following: • Monitoring project progress and performance against the project plan • Managing corrective actions to closure

Examples of work products reviewed include the following:
• Records of project progress and performance
• Project review results

PP Elaboration

Examples of activities reviewed include the following: • Establishing estimates • Developing the project plan • Obtaining commitments to the project plan

Examples of work products reviewed include the following:
• WBS
• Project plan
• Data management plan
• Stakeholder involvement plan


PPQA Elaboration

Examples of activities reviewed include the following: • Objectively evaluating processes and work products • Tracking and communicating noncompliance issues

Examples of work products reviewed include the following: • Noncompliance reports • Evaluation logs and reports

QPM Elaboration

Examples of activities reviewed include the following: • Managing the project using quality and process performance objectives • Managing selected subprocesses using statistical and other quantitative techniques

Examples of work products reviewed include the following:
• Compositions of the project's defined process
• Operational definitions of the measures
• Process performance analyses reports
• Collected measurements

RD Elaboration

Examples of activities reviewed include the following:
• Collecting stakeholder needs
• Formulating product and product component functional and quality attribute requirements
• Formulating architectural requirements that specify how product components are organized and designed to achieve particular end-to-end functional and quality attribute requirements
• Analyzing and validating product and product component requirements

Examples of work products reviewed include the following: • Product requirements • Product component requirements • Interface requirements • Definition of required functionality and quality attributes • Architecturally significant quality attribute requirements


REQM Elaboration

Examples of activities reviewed include the following: • Managing requirements • Ensuring alignment among project plans, work products, and requirements

Examples of work products reviewed include the following: • Requirements • Requirements traceability matrix

RSKM Elaboration

Examples of activities reviewed include the following: • Establishing and maintaining a risk management strategy • Identifying and analyzing risks • Mitigating risks

Examples of work products reviewed include the following: • Risk management strategy • Risk mitigation plans

SAM Elaboration

Examples of activities reviewed include the following: • Establishing and maintaining supplier agreements • Satisfying supplier agreements

Examples of work products reviewed include the following: • Plan for supplier agreement management • Supplier agreements

TS Elaboration

Examples of activities reviewed include the following: • Selecting product component solutions • Developing product and product component designs • Implementing product component designs

Examples of work products reviewed include the following: • Technical data packages • Product, product component, and interface designs • Implemented designs (e.g., software code, fabricated product components) • User, installation, operation, and maintenance documentation

VAL Elaboration

Examples of activities reviewed include the following: • Selecting the products and product components to be validated • Establishing and maintaining validation methods, procedures, and criteria • Validating products or product components

Examples of work products reviewed include the following: • Validation methods • Validation procedures • Validation criteria

VER Elaboration

Examples of activities reviewed include the following: • Selecting work products for verification • Establishing and maintaining verification procedures and criteria • Performing peer reviews • Verifying selected work products

Examples of work products reviewed include the following: • Verification procedures and criteria • Peer review checklists • Verification reports

GP 2.10 REVIEW STATUS WITH HIGHER LEVEL MANAGEMENT Review the activities, status, and results of the process with higher level management and resolve issues.

The purpose of this generic practice is to provide higher level management with the appropriate visibility into the process.


Higher level management includes those levels of management in the organization above the immediate level of management responsible for the process. In particular, higher level management can include senior management. These reviews are for managers who provide the policy and overall guidance for the process and not for those who perform the direct day-to-day monitoring and controlling of the process.
Different managers have different needs for information about the process. These reviews help ensure that informed decisions on the planning and performing of the process can be made. Therefore, these reviews are expected to be both periodic and event driven.

OPF Elaboration

These reviews are typically in the form of a briefing presented to the management steering committee by the process group and the process action teams.

Examples of presentation topics include the following: • Status of improvements being developed by process action teams • Results of pilots • Results of deployments • Schedule status for achieving significant milestones (e.g., readiness for an appraisal, progress toward achieving a targeted organizational maturity level or capability level profile)

OPM Elaboration

These reviews are typically in the form of a briefing presented to higher level management by those responsible for performance improvement.

Examples of presentation topics include the following: • Improvement areas identified from analysis of current performance compared to business objectives • Results of process improvement elicitation and analysis activities • Results from validation activities (e.g., pilots) compared to expected benefits • Performance data after deployment of improvements • Deployment cost, schedule, and risk • Risks of not achieving business objectives


REQM Elaboration

Proposed changes to commitments to be made external to the organization are reviewed with higher level management to ensure that all commitments can be accomplished.

RSKM Elaboration

Reviews of the project risk status are held on a periodic and event driven basis, with appropriate levels of management, to provide visibility into the potential for project risk exposure and appropriate corrective action. Typically, these reviews include a summary of the most critical risks, key risk parameters (such as likelihood and consequence of the risks), and the status of risk mitigation efforts.

GG 3 INSTITUTIONALIZE A DEFINED PROCESS The process is institutionalized as a defined process.

GP 3.1 ESTABLISH A DEFINED PROCESS Establish and maintain the description of a defined process.
The purpose of this generic practice is to establish and maintain a description of the process that is tailored from the organization’s set of standard processes to address the needs of a specific instantiation. The organization should have standard processes that cover the process area, as well as have guidelines for tailoring these standard processes to meet the needs of a project or organizational function. With a defined process, variability in how the processes are performed across the organization is reduced and process assets, data, and learning can be effectively shared.
Refer to the Integrated Project Management process area for more information about establishing the project’s defined process. Refer to the Organizational Process Definition process area for more information about establishing standard processes and establishing tailoring criteria and guidelines.
The descriptions of the defined processes provide the basis for planning, performing, and managing the activities, work products, and services associated with the process.
Subpractices
1. Select from the organization’s set of standard processes those processes that cover the process area and best meet the needs of the project or organizational function.


2. Establish the defined process by tailoring the selected processes according to the organization’s tailoring guidelines.
3. Ensure that the organization’s process objectives are appropriately addressed in the defined process.
4. Document the defined process and the records of the tailoring.
5. Revise the description of the defined process as necessary.

GP 3.2 COLLECT PROCESS RELATED EXPERIENCES Collect process related experiences derived from planning and performing the process to support the future use and improvement of the organization’s processes and process assets.

The purpose of this generic practice is to collect process related experiences, including information and artifacts derived from planning and performing the process. Examples of process related experiences include work products, measures, measurement results, lessons learned, and process improvement suggestions. The information and artifacts are collected so that they can be included in the organizational process assets and made available to those who are (or who will be) planning and performing the same or similar processes. The information and artifacts are stored in the organization’s measurement repository and the organization’s process asset library.

Examples of relevant information include the effort expended for the various activities, defects injected or removed in a particular activity, and lessons learned.

Refer to the Integrated Project Management process area for more information about contributing to organizational process assets. Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.

Subpractices
1. Store process and product measures in the organization’s measurement repository.
The process and product measures are primarily those measures that are defined in the common set of measures for the organization’s set of standard processes.
2. Submit documentation for inclusion in the organization’s process asset library.


3. Document lessons learned from the process for inclusion in the organization’s process asset library. 4. Propose improvements to the organizational process assets.

CAR Elaboration

Examples of process related experiences include the following: • Action proposals • Number of action plans that are open and for how long • Action plan status reports

CM Elaboration

Examples of process related experiences include the following: • Trends in the status of configuration items • Configuration audit results • Change request aging reports

DAR Elaboration

Examples of process related experiences include the following: • Number of alternatives considered • Evaluation results • Recommended solutions to address significant issues

IPM Elaboration

Examples of process related experiences include the following: • Project’s defined process • Number of tailoring options exercised by the project to create its defined process • Interface coordination issue trends (i.e., number identified, number closed) • Number of times the process asset library is accessed for assets related to project planning by project members • Records of expenses related to holding face-to-face meetings versus holding meetings using collaborative equipment such as teleconferencing and videoconferencing • Project shared vision • Team charters


MA Elaboration

Examples of process related experiences include the following: • Data currency status • Results of data integrity tests • Data analysis reports

OPD Elaboration

Examples of process related experiences include the following: • Submission of lessons learned to the organization’s process asset library • Submission of measurement data to the organization’s measurement repository • Status of the change requests submitted to modify the organization’s standard process • Record of non-standard tailoring requests

OPF Elaboration

Examples of process related experiences include the following: • Criteria used to prioritize candidate process improvements • Appraisal findings that address strengths and weaknesses of the organization’s processes • Status of improvement activities against the schedule • Records of tailoring the organization’s set of standard processes and implementing them on identified projects

OPM Elaboration

Examples of process related experiences include the following: • Lessons learned captured from analysis of process performance data compared to business objectives • Documented measures of the costs and benefits resulting from implementing and deploying improvements • Report of a comparison of similar development processes to identify the potential for improving efficiency


OPP Elaboration

Examples of process related experiences include the following: • Process performance baselines • Percentage of measurement data that is rejected because of inconsistencies with the process performance measurement definitions

OT Elaboration

Examples of process related experiences include the following: • Results of training effectiveness surveys • Training program performance assessment results • Course evaluations • Training requirements from an advisory group

PI Elaboration

Examples of process related experiences include the following: • Records of the receipt of product components, exception reports, confirmation of configuration status, and results of readiness checking • Percentage of total development effort spent in product integration (actual to date plus estimate to complete) • Defects found in the product and test environment during product integration • Problem reports resulting from product integration

PMC Elaboration

Examples of process related experiences include the following: • Records of significant deviations • Criteria for what constitutes a deviation • Corrective action results

PP Elaboration

Examples of process related experiences include the following: • Project data library structure • Project attribute estimates • Risk impacts and probability of occurrence


PPQA Elaboration

Examples of process related experiences include the following: • Evaluation logs • Quality trends • Noncompliance reports • Status reports of corrective actions • Cost of quality reports for the project

QPM Elaboration

Examples of process related experiences include the following: • Records of quantitative management data from the project, including results from the periodic review of the process performance of the subprocesses selected for management against established interim objectives of the project • Suggested improvements to process performance models

RD Elaboration

Examples of process related experiences include the following: • List of the requirements for a product that are found to be ambiguous • Number of requirements introduced at each phase of the project lifecycle • Lessons learned from the requirements allocation process

REQM Elaboration

Examples of process related experiences include the following: • Requirements traceability matrix • Number of unfunded requirements changes after baselining • Lessons learned in resolving ambiguous requirements

RSKM Elaboration

Examples of process related experiences include the following: • Risk parameters • Risk categories • Risk status reports


SAM Elaboration

Examples of process related experiences include the following: • Results of supplier reviews • Trade studies used to select suppliers • Revision history of supplier agreements • Supplier performance reports

TS Elaboration

Examples of process related experiences include the following: • Results of the make, buy, or reuse analysis • Design defect density • Results of applying new methods and tools

VAL Elaboration

Examples of process related experiences include the following: • Product component prototype • Percentage of time the validation environment is available • Number of product defects found through validation per development phase • Validation analysis report

VER Elaboration

Examples of process related experiences include the following: • Peer review records that include conduct time and average preparation time • Number of product defects found through verification per development phase • Verification and analysis report

Applying Generic Practices

Generic practices are components that can be applied to all process areas. Think of generic practices as reminders. They serve the purpose of reminding you to do things right and are expected model components.


For example, consider the generic practice, “Establish and maintain the plan for performing the process” (GP 2.2). When applied to the Project Planning process area, this generic practice reminds you to plan the activities involved in creating the plan for the project. When applied to the Organizational Training process area, this same generic practice reminds you to plan the activities involved in developing the skills and knowledge of people in the organization.

Process Areas that Support Generic Practices

While generic goals and generic practices are the model components that directly address the institutionalization of a process across the organization, many process areas likewise address institutionalization by supporting the implementation of the generic practices. Knowing these relationships will help you effectively implement the generic practices.
Such process areas contain one or more specific practices that when implemented can also fully implement a generic practice or generate a work product that is used in the implementation of a generic practice.
An example is the Configuration Management process area and GP 2.6, “Place selected work products of the process under appropriate levels of control.” To implement the generic practice for one or more process areas, you might choose to implement the Configuration Management process area, all or in part, to implement the generic practice.
Another example is the Organizational Process Definition process area and GP 3.1, “Establish and maintain the description of a defined process.” To implement this generic practice for one or more process areas, you should first implement the Organizational Process Definition process area, all or in part, to establish the organizational process assets that are needed to implement the generic practice.
Table 7.2 describes (1) the process areas that support the implementation of generic practices and (2) the recursive relationships between generic practices and their closely related process areas. Both types of relationships are important to remember during process improvement to take advantage of the natural synergies that exist between the generic practices and their related process areas.
Given the dependencies that generic practices have on these process areas, and given the more holistic view that many of these process areas provide, these process areas are often implemented early, in whole or in part, before or concurrent with implementing the associated generic practices.

TABLE 7.2 Generic Practice and Process Area Relationships

GP 2.2 Plan the Process
Roles of process areas in implementation of the generic practice: Project Planning: The project planning process can implement GP 2.2 in full for all project-related process areas (except for Project Planning itself).
How the generic practice recursively applies to its related process area(s): GP 2.2 applied to the project planning process can be characterized as “plan the plan” and covers planning project planning activities.

GP 2.3 Provide Resources
GP 2.4 Assign Responsibility
Roles of process areas in implementation of the generic practice: Project Planning: The part of the project planning process that implements Project Planning SP 2.4, “Plan the Project’s Resources,” supports the implementation of GP 2.3 and GP 2.4 for all project-related process areas (except perhaps initially for Project Planning itself) by identifying needed processes, roles, and responsibilities to ensure the proper staffing, facilities, equipment, and other assets needed by the project are secured.

GP 2.5 Train People
Roles of process areas in implementation of the generic practice: Organizational Training: The organizational training process supports the implementation of GP 2.5 as applied to all process areas by making the training that addresses strategic or organization-wide training needs available to those who will perform or support the process. Project Planning: The part of the project planning process that implements Project Planning SP 2.5, “Plan Needed Knowledge and Skills,” and the organizational training process, supports the implementation of GP 2.5 in full for all project-related process areas.
How the generic practice recursively applies to its related process area(s): GP 2.5 applied to the organizational training process covers training for performing the organizational training activities, which addresses the skills required to manage, create, and accomplish the training.

GP 2.6 Control Work Products
Roles of process areas in implementation of the generic practice: Configuration Management: The configuration management process can implement GP 2.6 in full for all project-related process areas as well as some of the organizational process areas.
How the generic practice recursively applies to its related process area(s): GP 2.6 applied to the configuration management process covers change and version control for the work products produced by configuration management activities.

GP 2.7 Identify and Involve Relevant Stakeholders
Roles of process areas in implementation of the generic practice: Project Planning: The part of the project planning process that implements Project Planning SP 2.6, “Plan Stakeholder Involvement,” can implement the stakeholder identification part (first two subpractices) of GP 2.7 in full for all project-related process areas. Project Monitoring and Control: The part of the project monitoring and control process that implements Project Monitoring and Control SP 1.5, “Monitor Stakeholder Involvement,” can aid in implementing the third subpractice of GP 2.7 for all project-related process areas. Integrated Project Management: The part of the integrated project management process that implements Integrated Project Management SP 2.1, “Manage Stakeholder Involvement,” can aid in implementing the third subpractice of GP 2.7 for all project-related process areas.
How the generic practice recursively applies to its related process area(s): GP 2.7 applied to the project planning process covers the involvement of relevant stakeholders in project planning activities. GP 2.7 applied to the project monitoring and control process covers the involvement of relevant stakeholders in project monitoring and control activities. GP 2.7 applied to the integrated project management process covers the involvement of relevant stakeholders in integrated project management activities.

GP 2.8 Monitor and Control the Process
Roles of process areas in implementation of the generic practice: Project Monitoring and Control: The project monitoring and control process can implement GP 2.8 in full for all project-related process areas. Measurement and Analysis: For all processes, not just project-related processes, the Measurement and Analysis process area provides general guidance about measuring, analyzing, and recording information that can be used in establishing measures for monitoring actual performance of the process.
How the generic practice recursively applies to its related process area(s): GP 2.8 applied to the project monitoring and control process covers the monitoring and controlling of the project’s monitor and control activities.

GP 2.9 Objectively Evaluate Adherence
Roles of process areas in implementation of the generic practice: Process and Product Quality Assurance: The process and product quality assurance process can implement GP 2.9 in full for all process areas (except perhaps for Process and Product Quality Assurance itself).
How the generic practice recursively applies to its related process area(s): GP 2.9 applied to the process and product quality assurance process covers the objective evaluation of quality assurance activities and selected work products.

GP 2.10 Review Status with Higher Level Management
Roles of process areas in implementation of the generic practice: Project Monitoring and Control: The part of the project monitoring and control process that implements Project Monitoring and Control SP 1.6, “Conduct Progress Reviews,” and SP 1.7, “Conduct Milestone Reviews,” supports the implementation of GP 2.10 for all project-related process areas, perhaps in full, depending on higher level management involvement in these reviews.

GP 3.1 Establish a Defined Process
Roles of process areas in implementation of the generic practice: Integrated Project Management: The part of the integrated project management process that implements Integrated Project Management SP 1.1, “Establish the Project’s Defined Process,” can implement GP 3.1 in full for all project-related process areas. Organizational Process Definition: For all processes, not just project-related processes, the organizational process definition process establishes the organizational process assets needed to implement GP 3.1.
How the generic practice recursively applies to its related process area(s): GP 3.1 applied to the integrated project management process covers establishing defined processes for integrated project management activities.

GP 3.2 Collect Process Related Experiences
Roles of process areas in implementation of the generic practice: Integrated Project Management: The part of the integrated project management process that implements Integrated Project Management SP 1.7, “Contribute to the Organizational Process Assets,” can implement GP 3.2 in part or full for all project-related process areas. Organizational Process Focus: The part of the organizational process focus process that implements Organizational Process Focus SP 3.4, “Incorporate Experiences into the Organizational Process Assets,” can implement GP 3.2 in part or full for all process areas. Organizational Process Definition: For all processes, the organizational process definition process establishes the organizational process assets needed to implement GP 3.2.
How the generic practice recursively applies to its related process area(s): GP 3.2 applied to the integrated project management process covers collecting process related experiences derived from planning and performing integrated project management activities.

1. When the relationship between a generic practice and a process area is less direct, the risk of confusion is reduced; therefore, we do not describe all recursive relationships in the table (e.g., for generic practices 2.3, 2.4, and 2.10).


There are also a few situations where the result of applying a generic practice to a particular process area would seem to make a whole process area redundant, but, in fact, it does not. It can be natural to think that applying GP 3.1, “Establish a Defined Process,” to the Project Planning and Project Monitoring and Control process areas gives the same effect as the first specific goal of Integrated Project Management, “Use the Project’s Defined Process.”
Although it is true that there is some overlap, the application of the generic practice to these two process areas provides defined processes covering project planning and project monitoring and control activities. These defined processes do not necessarily cover support activities (e.g., configuration management), other project management processes (e.g., integrated project management), or other processes. In contrast, the project’s defined process, provided by the Integrated Project Management process area, covers all appropriate processes.




CAUSAL ANALYSIS AND RESOLUTION
A Support Process Area at Maturity Level 5

Purpose

The purpose of Causal Analysis and Resolution (CAR) is to identify causes of selected outcomes and take action to improve process performance.

TIP: CAR helps you establish a disciplined approach to analyzing the causes of outcomes, both positive and negative, of your processes. You can analyze defects or defect trends, problems such as schedule overruns or inter-organizational conflicts, as well as positive outcomes that you want to replicate elsewhere.

Introductory Notes

Causal analysis and resolution improves quality and productivity by preventing the introduction of defects or problems and by identifying and appropriately incorporating the causes of superior process performance.
The Causal Analysis and Resolution process area involves the following activities:

• Identifying and analyzing causes of selected outcomes. The selected outcomes can represent defects and problems that can be prevented from happening in the future or successes that can be implemented in projects or the organization.
• Taking actions to complete the following:
• Remove causes and prevent the recurrence of those types of defects and problems in the future
• Proactively analyze data to identify potential problems and prevent them from occurring
• Incorporate the causes of successes into the process to improve future process performance

Reliance on detecting defects and problems after they have been introduced is not cost effective. It is more effective to prevent defects and problems by integrating Causal Analysis and Resolution activities into each phase of the project.

HINT: Integrating CAR activities into each project phase will help (1) prevent many defects from being introduced and (2) facilitate repetition of those conditions that enable superior performance, thus improving the likelihood of project success.



Since similar outcomes may have been previously encountered in other projects or in earlier phases or tasks of the current project, Causal Analysis and Resolution activities are mechanisms for communicating lessons learned among projects.
Types of outcomes encountered are analyzed to identify trends. Based on an understanding of the defined process and how it is implemented, root causes of these outcomes and future implications of them are determined.

HINT: You also can apply causal analysis to outcomes of concern to senior management.

Since it is impractical to perform causal analysis on all outcomes, targets are selected by tradeoffs on estimated investments and estimated returns of quality, productivity, and cycle time.

HINT: It is impossible to analyze all outcomes; instead, focus on outcomes that have the largest risk or present the greatest opportunity.

Measurement and analysis processes should already be in place. Existing defined measures can be used, though in some instances new measurement definitions, redefinitions, or clarified definitions may be needed to analyze the effects of a process change.
Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
Causal Analysis and Resolution activities provide a mechanism for projects to evaluate their processes at the local level and look for improvements that can be implemented. When improvements are judged to be effective, the information is submitted to the organizational level for potential deployment in the organizational processes.

TIP: Successful implementation of CAR requires significant management commitment and process maturity to ensure that process data is accurately and consistently recorded, causal analysis meetings are adequately supported, and CAR activities are consistently performed across the organization.

The specific practices of this process area apply to a process that is selected for quantitative management. Use of the specific practices of this process area can add value in other situations, but the results may not provide the same degree of impact to the organization’s quality and process performance objectives.

Related Process Areas

X-REF: Causal analysis is mentioned elsewhere in CMMI: in an IPM SP 1.5 subpractice to address causes of selected issues that can affect project objectives; in the notes of OPM SP 1.3 Identify Potential Areas for Improvement; in QPM SP 2.3 Perform Root Cause Analysis; and as an example work product in VER SP 3.2 Analyze Verification Results.

Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
Refer to the Organizational Performance Management process area for more information about selecting and implementing improvements for deployment.
Refer to the Quantitative Project Management process area for more information about quantitatively managing the project to achieve the project’s established quality and process performance objectives.

Specific Goal and Practice Summary
SG 1 Determine Causes of Selected Outcomes
SP 1.1 Select Outcomes for Analysis
SP 1.2 Analyze Causes
SG 2 Address Causes of Selected Outcomes
SP 2.1 Implement Action Proposals
SP 2.2 Evaluate the Effect of Implemented Actions
SP 2.3 Record Causal Analysis Data

Specific Practices by Goal

SG 1 DETERMINE CAUSES OF SELECTED OUTCOMES Root causes of selected outcomes are systematically determined.

A root cause is an initiating element in a causal chain which leads to an outcome of interest.

SP 1.1 SELECT OUTCOMES FOR ANALYSIS
Select outcomes for analysis.
This activity could be triggered by an event (reactive) or could be planned periodically, such as at the beginning of a new phase or task (proactive).

HINT: Let your data help you determine which outcomes, if addressed, will realize the most benefit to your organization. Process Performance Baselines (PPBs) and Models (PPMs) may help in this determination.

Example Work Products
1. Data to be used in the initial analysis
2. Initial analysis results data
3. Outcomes selected for further analysis

Subpractices 1. Gather relevant data.

Examples of relevant data include the following: • Defects reported by customers or end users • Defects found in peer reviews or testing • Productivity measures that are higher than expected • Project management problem reports requiring corrective action • Process capability problems • Earned value measurements by process (e.g., cost performance index) • Resource throughput, utilization, or response time measurements • Service fulfillment or service satisfaction problems


2. Determine which outcomes to analyze further.
When determining which outcomes to analyze further, consider their source, impact, frequency of occurrence, similarity, the cost of analysis, the time and resources needed, safety considerations, etc.

TIP: It is ineffective to look at every outcome. Therefore, you should establish criteria to help you prioritize and categorize outcomes.

Examples of methods for selecting outcomes include the following:
• Pareto analysis
• Histograms
• Box and whisker plots for attributes
• Failure mode and effects analysis (FMEA)
• Process capability analysis
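As an illustration of the first method in the list above, the following minimal sketch performs a Pareto analysis over defect categories; the categories and counts are hypothetical illustration data, not anything drawn from the model.

# Pareto analysis sketch: rank outcome categories by frequency and report
# the cumulative percentage; the few categories that account for most
# occurrences are candidates for causal analysis.
from collections import Counter

reported_outcomes = [
    "interface defect", "requirements defect", "interface defect",
    "coding defect", "interface defect", "requirements defect",
    "coding defect", "interface defect", "documentation defect",
]

counts = Counter(reported_outcomes)
total = sum(counts.values())

cumulative = 0
for category, count in counts.most_common():
    cumulative += count
    print(f"{category:22s} {count:3d}  cumulative {100 * cumulative / total:5.1f}%")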

3. Formally define the scope of the analysis, including a clear definition of the improvement needed or expected, stakeholders affected, target affected, etc.

TIP: An example method for analyzing and categorizing defects is Orthogonal Defect Classification (see Wikipedia), which provides standard taxonomies for classifying defects and their resolution.

Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

SP 1.2 ANALYZE CAUSES
Perform causal analysis of selected outcomes and propose actions to address them.

HINT: To identify actions that address an outcome, you need to understand its root causes.

The purpose of this analysis is to define actions that will address selected outcomes by analyzing relevant outcome data and producing action proposals for implementation.

TIP: An action proposal typically documents the outcome to be investigated, its causes, and specific actions that, when taken, will either mitigate the causes to prevent the outcome from reoccurring or increase the likelihood that conditions conducive to superior performance will occur.

X-REF: Action proposals are prioritized, selected, and implemented in SG 2.

Example Work Products
1. Root cause analysis results
2. Action proposal
Subpractices
1. Conduct causal analysis with those who are responsible for performing the task.
Causal analysis is performed, typically in meetings, with those who understand the selected outcome under study. Those who have the best understanding of the selected outcome are typically those who are responsible for performing the task. The analysis is most effective when applied to real time data, as close as possible to the event which triggered the outcome.


Examples of when to perform causal analysis include the following:
• When a stable subprocess does not meet its specified quality and process performance objectives, or when a subprocess needs to be stabilized
• During the task, if and when problems warrant a causal analysis meeting
• When a work product exhibits an unexpected deviation from its requirements
• When more defects than anticipated escape from earlier phases to the current phase
• When process performance exceeds expectations
• At the start of a new phase or task

TIP: There are secondary benefits to causal analysis meetings. Participants develop an appreciation for how upstream activities affect downstream activities, as well as a sense of responsibility and accountability for outcomes that might otherwise remain unanalyzed.

Refer to the Quantitative Project Management process area for more information about performing root cause analysis.
2. Analyze selected outcomes to determine their root causes.
Analysis of process performance baselines and models can aid in the identification of potential root causes.
Depending on the type and number of outcomes, it can be beneficial to look at the outcomes in several ways to ensure all potential root causes are investigated. Consider looking at individual outcomes as well as grouping the outcomes.

TIP: In their book Managing the Unexpected: Assuring High Performance in an Age of Complexity, Weick and Sutcliffe identify “mindfulness” qualities important in high-reliability organizations, including preoccupation with failure and reluctance to simplify. These imply attention to detail when communicating about an individual situation as well as seeking to understand possible systemic causes to a range of apparently unrelated small problems.

Examples of methods to determine root causes include the following:
• Cause-and-effect (fishbone) diagrams
• Check sheets

3. Combine selected outcomes into groups based on their root causes.
In some cases, outcomes can be influenced by multiple root causes.

HINT: You develop cause-and-effect diagrams using iterative brainstorming (i.e., the “Five Whys”). This process may terminate when it reaches root causes outside the experience of the group or outside the control of its management.

Examples of cause groups or categories include the following:
• Inadequate training and skills
• Breakdown of communication
• Not accounting for all details of a task
• Making mistakes in manual procedures (e.g., keyboard entry)
• Process deficiency
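The grouping called for in subpractice 3 can be as simple as a table keyed by cause category. The sketch below is illustrative only; the outcome names and root causes are hypothetical, and nothing here is prescribed by the model.

# Group analyzed outcomes by their identified root cause so that trends
# across a category can be spotted and addressed with a single proposal.
from collections import defaultdict

analyzed_outcomes = [
    ("late build delivery", "Breakdown of communication"),
    ("field defect 1042", "Inadequate training and skills"),
    ("field defect 1077", "Inadequate training and skills"),
    ("wrong parameter in release script", "Making mistakes in manual procedures"),
]

groups = defaultdict(list)
for outcome, root_cause in analyzed_outcomes:
    groups[root_cause].append(outcome)

for root_cause, outcomes in sorted(groups.items(), key=lambda kv: -len(kv[1])):
    print(f"{root_cause}: {len(outcomes)} outcome(s) -> {outcomes}")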

Where appropriate, look for trends or symptoms in or across groupings.
4. Create an action proposal that documents actions to be taken to prevent the future occurrence of similar outcomes or to incorporate best practices into processes.


Process performance models can support cost benefit analysis of action proposals through prediction of impacts and return on investment.

Examples of proposed preventative actions include changes to the following: • The process in question • Training • Tools • Methods • Work products

Examples of incorporating best practices include the following:
• Creating activity checklists, which reinforce training or communications related to common problems and techniques for preventing them
• Changing a process so that error-prone steps do not occur
• Automating all or part of a process
• Reordering process activities
• Adding process steps, such as task kickoff meetings to review common problems as well as actions to prevent them

TIP: In his book The Checklist Manifesto: How to Get Things Right, Atul Gawande describes the role that properly designed checklists can play in preventing common problems. He relates his experiences in designing such checklists and the impact they have had in significantly reducing complications and fatalities from surgery.

An action proposal usually documents the following:
• Originator of the action proposal
• Description of the outcome to be addressed
• Description of the cause
• Cause category
• Phase identified
• Description of the action
• Time, cost, and other resources required to implement the action proposal
• Expected benefits from implementing the action proposal
• Estimated cost of not fixing the problem
• Action proposal category
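As a sketch of how such a record might be captured in a simple tool, the following uses a Python dataclass whose field names mirror the items above; all names and sample values are illustrative assumptions rather than anything prescribed by the model.

# Minimal action proposal record; fields follow the list above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActionProposal:
    originator: str
    outcome_description: str
    cause_description: str
    cause_category: str
    phase_identified: str
    action_description: str
    estimated_effort_hours: float      # time, cost, and other resources
    expected_benefits: str
    estimated_cost_of_not_fixing: str
    proposal_category: str
    status: str = "open"
    related_outcomes: List[str] = field(default_factory=list)

proposal = ActionProposal(
    originator="QA lead",
    outcome_description="Escaped interface defects found during integration",
    cause_description="Interface changes not communicated to the test team",
    cause_category="Breakdown of communication",
    phase_identified="Product integration",
    action_description="Add an interface-change notification step and checklist",
    estimated_effort_hours=16,
    expected_benefits="Fewer escaped interface defects per iteration",
    estimated_cost_of_not_fixing="Roughly one staff-week of rework per release",
    proposal_category="Process change",
)
print(proposal.status, "-", proposal.action_description)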

SG 2 ADDRESS CAUSES OF SELECTED OUTCOMES
Root causes of selected outcomes are systematically addressed.

TIP: The focus of this goal is implementing process changes that address root causes of selected outcomes, both positive and negative.

Projects operating according to a well-defined process systematically analyze where improvements are needed and implement process changes to address root causes of selected outcomes.


SP 2.1 IMPLEMENT ACTION PROPOSALS Implement selected action proposals developed in causal analysis.

Action proposals describe tasks necessary to address root causes of analyzed outcomes to prevent or reduce the occurrence or recurrence of negative outcomes, or incorporate realized successes. Action plans are developed and implemented for selected action proposals. Only changes that prove to be of value should be considered for broad implementation.
Example Work Products
1. Action proposals selected for implementation
2. Action plans

Subpractices 1. Analyze action proposals and determine their priorities.

Criteria for prioritizing action proposals include the following: • Implications of not addressing the outcome • Cost to implement process improvements to address the outcome • Expected impact on quality

Process performance models can be used to help identify interactions among multiple action proposals.
2. Select action proposals to be implemented.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
3. Create action plans for implementing the selected action proposals.

Examples of information provided in an action plan include the following: • Person responsible for implementation • Detailed description of the improvement • Description of the affected areas • People who are to be kept informed of status • Schedule • Cost expended • Next date that status will be reviewed • Rationale for key decisions • Description of implementation actions


4. Implement action plans.
To implement action plans, the following tasks should be performed:
• Make assignments.
• Coordinate the people doing the work.
• Review the results.
• Track action items to closure.
Experiments may be conducted for particularly complex changes.

X-REF: For more information on designing experiments to understand the impact of certain changes, consult good references on Six Sigma and Experimental Design.

Examples of experiments include the following:
• Using a temporarily modified process
• Using a new tool

Actions may be assigned to members of the causal analysis team, members of the project team, or other members of the organization.
5. Look for similar causes that may exist in other processes and work products and take action as appropriate.

HINT: When a resolution has more general applicability, don’t document it in a “lessons learned” document; document it in an improvement proposal.

X-REF: For more information on improvement proposals and suggestions, see OPF SP 2.4 and OPM SP 2.1.

SP 2.2 EVALUATE THE EFFECT OF IMPLEMENTED ACTIONS
Evaluate the effect of implemented actions on process performance.
Refer to the Quantitative Project Management process area for more information about selecting measures and analytic techniques.

Once the changed process is deployed across the project, the effect of changes is evaluated to verify that the process change has improved process performance.

HINT: Use the measures associated with a process or subprocess (perhaps supplemented by other measures) to evaluate the effect of changes.

Example Work Products
1. Analysis of process performance and change in process performance

Subpractices
1. Measure and analyze the change in process performance of the project’s affected processes or subprocesses.
This subpractice determines whether the selected change has positively influenced process performance and by how much.

An example of a change in the process performance of the project’s defined design process would be a change in the predicted ability of the design to meet the quality and process performance objectives. Another example would be a change in the defect density of the design documentation, as statistically measured through peer reviews before and after the improvement has been made. On a statistical process control chart, this change in process performance would be represented by an improvement in the mean, a reduction in variation, or both.


Statistical and other quantitative techniques (e.g., hypothesis testing) can be used to compare the before and after baselines to assess the statistical significance of the change (a sketch of such a comparison follows these subpractices).
2. Determine the impact of the change on achieving the project’s quality and process performance objectives.
This subpractice determines whether the selected change has positively influenced the ability of the project to meet its quality and process performance objectives by understanding how changes in the process performance data have affected the objectives. Process performance models can aid in the evaluation through prediction of impacts and return on investment.
3. Determine and document appropriate actions if the process or subprocess improvements did not result in expected project benefits.
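As one way to run the before-and-after comparison mentioned above, the sketch below applies a two-sample hypothesis test; it assumes SciPy is available and uses hypothetical defect-density samples, so it is an illustration rather than a required technique.

# Compare before/after process performance baselines with Welch's t-test.
from scipy import stats

# Defect density (defects per KLOC) from peer reviews before and after
# the process change was deployed (hypothetical data).
before = [4.1, 3.8, 4.4, 4.0, 4.6, 3.9, 4.2]
after = [3.2, 3.5, 3.0, 3.6, 3.1, 3.4, 3.3]

t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Change in process performance is statistically significant.")
else:
    print("No statistically significant change detected; investigate further.")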

SP 2.3 RECORD CAUSAL ANALYSIS DATA

Record causal analysis and resolution data for use across projects and the organization.

HINT: Collect data to know that you are improving project performance relative to your objectives, to know that you have prevented selected problems from reoccurring (or have enabled conditions conducive to superior performance to recur), and to provide sufficient context for organizational evaluation of improvement proposals for possible deployment across the organization (OPM SP 2.2).

Example Work Products
1. Causal analysis and resolution records
2. Organizational improvement proposals
Subpractices
1. Record causal analysis data and make the data available so that other projects can make appropriate process changes and achieve similar results.
Record the following:
• Data on outcomes that were analyzed
• Rationale for decisions
• Action proposals from causal analysis meetings
• Action plans resulting from action proposals
• Cost of analysis and resolution activities
• Measures of changes to the process performance of the defined process resulting from resolutions
2. Submit process improvement proposals for the organization when the implemented actions are effective for the project as appropriate.
When improvements are judged to be effective, the information can be submitted to the organizational level for potential inclusion in the organizational processes.
Refer to the Organizational Performance Management process area for more information about selecting improvements.



CONFIGURATION MANAGEMENT
A Support Process Area at Maturity Level 2

Purpose

The purpose of Configuration Management (CM) is to establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.

TIP: Because this is a support process area, it is up to the project and organization to decide which work products are subject to CM, and the level of control needed.

Introductory Notes

The Configuration Management process area involves the following activities:

• Identifying the configuration of selected work products that compose baselines at given points in time
• Controlling changes to configuration items
• Building or providing specifications to build work products from the configuration management system
• Maintaining the integrity of baselines
• Providing accurate status and current configuration data to developers, end users, and customers

HINT: CM should capture enough information to identify and maintain the configuration item after those who have developed it have gone.

The work products placed under configuration management include the products that are delivered to the customer, designated internal work products, acquired products, tools, and other items used in creating and describing these work products. (See the definition of “configuration management” in the glossary.)



Examples of work products that can be placed under configuration management include the following:
• Hardware and equipment
• Drawings
• Product specifications
• Tool configurations
• Code and libraries
• Compilers
• Test tools and test scripts
• Installation logs
• Product data files
• Product technical publications
• Plans
• User stories
• Iteration backlogs
• Process descriptions
• Requirements
• Architecture documentation and design data
• Product line plans, processes, and core assets

TIP: Other items may include repair manuals and test results that provide additional insight into the work product.

Acquired products may need to be placed under configuration management by both the supplier and the project. Provisions for conducting configuration management should be established in supplier agreements. Methods to ensure that data are complete and consistent should be established and maintained.
Refer to the Supplier Agreement Management process area for more information about establishing supplier agreements.
Configuration management of work products can be performed at several levels of granularity. Configuration items can be decomposed into configuration components and configuration units. Only the term “configuration item” is used in this process area. Therefore, in these practices, “configuration item” may be interpreted as “configuration component” or “configuration unit” as appropriate. (See the definition of “configuration item” in the glossary.)

HINT: Typically, you determine the levels of granularity during planning. Select the configuration items to define for CM based on technical and business needs.

Baselines provide a stable basis for the continuing evolution of configuration items.

An example of a baseline is an approved description of a product that includes internally consistent versions of requirements, requirement traceability matrices, design, discipline-specific items, and end-user documentation.


Baselines are added to the configuration management system as they are developed. Changes to baselines and the release of work products built from the configuration management system are systematically controlled and monitored via the configuration control, change management, and configuration auditing functions of configuration management.

HINT: Review and approve baselines before they are added to (or promoted within) the CM system.

This process area applies not only to configuration management on projects but also to configuration management of organizational work products such as standards, procedures, reuse libraries, and other shared supporting assets.

TIP: Anything important enough to ensure its integrity can be subject to CM practices.

Configuration management is focused on the rigorous control of the managerial and technical aspects of work products, including the delivered product or service.
This process area covers the practices for performing the configuration management function and is applicable to all work products that are placed under configuration management.
For product lines, configuration management involves additional considerations due to the sharing of core assets across the products in the product line and across multiple versions of core assets and products. (See the definition of “product line” in the glossary.)

Related Process Areas

Refer to the Project Monitoring and Control process area for more information about monitoring the project against the plan and managing corrective action to closure. Refer to the Project Planning process area for more information about develop- ing a project plan.

Wow! eBook 246 PART TWO THE PROCESS AREAS Specific Goal and Practice Summary SG 1 Establish Baselines SP 1.1 Identify Configuration Items SP 1.2 Establish a Configuration Management System SP 1.3 Create or Release Baselines SG 2 Track and Control Changes SP 2.1 Track Change Requests SP 2.2 Control Configuration Items SG 3 Establish Integrity SP 3.1 Establish Configuration Management Records SP 3.2 Perform Configuration Audits

Specific Practices by Goal

SG 1 ESTABLISH BASELINES Baselines of identified work products are established.

Specific practices to establish baselines are covered by this specific goal. The specific practices under the Track and Control Changes specific goal serve to maintain the baselines. The specific practices of the Establish Integrity specific goal document and audit the integrity of the baselines. ptg

SP 1.1 IDENTIFY CONFIGURATION ITEMS Identify configuration items, components, and related work products to be placed under configuration management.

Configuration identification is the selection and specification of the following:

• Products delivered to the customer • Designated internal work products • Acquired products • Tools and other capital assets of the project’s work environment • Other items used in creating and describing these work products

Configuration items can include hardware, equipment, and tangi- ble assets as well as software and documentation. Documentation can include requirements specifications and interface documents. Other documents that serve to identify the configuration of the prod- uct or service, such as test results, may also be included.

Wow! eBook Configuration Management 247

A “configuration item” is an entity designated for configuration TIP management, which may consist of multiple related work products An example of a group of that form a baseline. This logical grouping provides ease of identifi- related work products is a cation and controlled access. The selection of work products for con- product component and its requirements, design, test case, figuration management should be based on criteria established verification procedure, and during planning. verification report. Example Work Products HINT 1. Identified configuration items When developing your project plan, document what is impor- Subpractices tant for you to control. This part of the plan provides guidance 1. Select configuration items and work products that compose them for the project team when based on documented criteria. identifying configuration items.

Example criteria for selecting configuration items at the appropriate work HINT product level include the following: Use criteria to select configura-

• Work products that can be used by two or more groups tion items to ensure that the CM • Work products that are expected to change over time either because of selection is consistent and errors or changes in requirements thorough. • Work products that are dependent on each other (i.e., a change in one mandates a change in the others) ptg • Work products critical to project success

Examples of work products that may be part of a configuration item include the following: •Design • Test plans and procedures • Test results • Interface descriptions •Drawings • Source code • User stories or story cards • The declared business case, logic, or value • Tools (e.g., compilers) • Process descriptions • Requirements HINT If you use a CM tool, it will sometimes assign unique iden- 2. Assign unique identifiers to configuration items. tifiers to configuration items 3. Specify the important characteristics of each configuration item. for you.

Example characteristics of configuration items include author, document or file type, programming language for software code files, minimum marketable features, and the purpose the configuration item serves.

4. Specify when each configuration item is placed under configuration management.

TIP: Specifying when a configuration item must be placed under CM sets expectations among team members and establishes clear objectives for controlling the project's work products.

Example criteria for determining when to place work products under configuration management include the following:
• When the work product is ready for test
• Stage of the project lifecycle
• Degree of control desired on the work product
• Cost and schedule limitations
• Stakeholder requirements

5. Identify the owner responsible for each configuration item.
6. Specify relationships among configuration items.
Incorporating the types of relationships (e.g., parent-child, dependency) that exist among configuration items into the configuration management structure (e.g., configuration management database) assists in managing the effects and impacts of changes.
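The model does not prescribe any particular data structure for recording configuration items. The following minimal Python sketch shows one way a project might capture the identifiers, characteristics, owners, and relationships called for in the subpractices above; all names and values are illustrative assumptions, not part of CMMI.

from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    """Illustrative record for one configuration item (CI)."""
    ci_id: str            # unique identifier (subpractice 2)
    name: str
    ci_type: str          # e.g., "document", "source code", "tool"
    owner: str            # responsible owner (subpractice 5)
    control_point: str    # when the item comes under CM (subpractice 4)
    parents: list = field(default_factory=list)      # parent-child relationships
    depends_on: list = field(default_factory=list)   # dependency relationships (subpractice 6)

# Hypothetical example: a component's requirements, design, and source code
req = ConfigurationItem("CI-001", "Component X Requirements", "document",
                        "analyst.lead", "after peer review")
design = ConfigurationItem("CI-002", "Component X Design", "document",
                           "design.lead", "after peer review", depends_on=["CI-001"])
code = ConfigurationItem("CI-003", "Component X Source", "source code",
                         "dev.lead", "ready for test", depends_on=["CI-002"])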

SP 1.2 ESTABLISH A CONFIGURATION MANAGEMENT SYSTEM
Establish and maintain a configuration management and change management system for controlling work products.

A configuration management system includes the storage media, procedures, and tools for accessing the system. A configuration management system can consist of multiple subsystems with different implementations that are appropriate for each configuration management environment.
A change management system includes the storage media, procedures, and tools for recording and accessing change requests.

TIP: Most organizations use an automated tool for CM. Because of the many aspects to consider, it is difficult to maintain a manual CM process.

HINT: Many turnkey systems are available to help you with CM. Before you purchase one, identify your needs and compare them to the system's features.

Example Work Products
1. Configuration management system with controlled work products
2. Configuration management system access control procedures
3. Change request database


Subpractices
1. Establish a mechanism to manage multiple levels of control.
The level of control is typically selected based on project objectives, risk, and resources. Control levels can vary in relation to the project lifecycle, type of system under development, and specific project requirements.

TIP: Not all configuration items require the same level of control. Some may require more control as they move through the project lifecycle.

Example levels of control include the following:
• Uncontrolled: Anyone can make changes.
• Work-in-progress: Authors control changes.
• Released: A designated authority authorizes and controls changes and relevant stakeholders are notified when changes are made.

Levels of control can range from informal control that simply tracks changes made when configuration items are being developed to formal configuration control using baselines that can only be changed as part of a formal configuration management process.

TIP: A formal CM process is typically change-request based and requires extensive tracking and review of all changes.

2. Provide access control to ensure authorized access to the configuration management system.
3. Store and retrieve configuration items in a configuration management system.
4. Share and transfer configuration items between control levels in the configuration management system.
5. Store and recover archived versions of configuration items.
6. Store, update, and retrieve configuration management records.
7. Create configuration management reports from the configuration management system.
8. Preserve the contents of the configuration management system.

HINT: You must find the balance of ensuring that all who need access to configuration items have it, while confirming that the need for access is genuine.

Examples of preservation functions of the configuration management system include the following:
• Backup and restoration of configuration management files
• Archive of configuration management files
• Recovery from configuration management errors

9. Revise the configuration management structure as necessary.

HINT: Review the CM system regularly to ensure that it is meeting the needs of the projects it serves.


SP 1.3 CREATE OR RELEASE BASELINES
Create or release baselines for internal use and for delivery to the customer.

A baseline is represented by the assignment of an identifier to a configuration item or a collection of configuration items and associated entities at a distinct point in time. As a product or service evolves, multiple baselines can be used to control development and testing. (See the definition of "baseline" in the glossary.)
Hardware products as well as software and documentation should also be included in baselines for infrastructure related configurations (e.g., software, hardware) and in preparation for system tests that include interfacing hardware and software.
One common set of baselines includes the system level requirements, system element level design requirements, and the product definition at the end of development/beginning of production. These baselines are typically referred to respectively as the "functional baseline," "allocated baseline," and "product baseline."
A software baseline can be a set of requirements, design, source code files and the associated executable code, build files, and user documentation (associated entities) that have been assigned a unique identifier.

TIP: CCB authorization should be formal and documented in some way.

TIP: If the baselines released or used do not come from the CM system, the system is not serving its purpose and there is a high risk of losing baseline integrity.

Example Work Products
1. Baselines
2. Description of baselines
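As a concrete illustration of the definition above, the following minimal Python sketch represents a baseline as an identifier assigned to a fixed set of configuration item versions at a point in time; the identifiers and dates are invented for illustration and are not prescribed by the model.

from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Baseline:
    """A baseline: an identifier assigned to a set of CI versions at a distinct point in time."""
    baseline_id: str
    created: date
    items: tuple          # (ci_id, version) pairs; frozen so the recorded set cannot drift

product_baseline = Baseline(
    baseline_id="PB-2.1",
    created=date(2011, 3, 1),
    items=(("CI-001", "1.4"), ("CI-002", "2.0"), ("CI-003", "2.0")),
)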

Subpractices
1. Obtain authorization from the CCB before creating or releasing baselines of configuration items.
2. Create or release baselines only from configuration items in the configuration management system.
3. Document the set of configuration items that are contained in a baseline.
4. Make the current set of baselines readily available.

TIP: If the project or organization uses multiple baselines, it is even more critical to ensure that everyone is using the correct baseline.

HINT: To ensure that project members use the CM system, make sure the system is easy to use.

SG 2 TRACK AND CONTROL CHANGES
Changes to the work products under configuration management are tracked and controlled.

TIP: This goal is typically implemented by establishing a change request system and forming a CCB whose primary role is to review and approve baseline changes.

The specific practices under this specific goal serve to maintain baselines after they are established by specific practices under the Establish Baselines specific goal.


SP 2.1 TRACK CHANGE REQUESTS
Track change requests for configuration items.

TIP: Change requests are formally submitted descriptions of desired modifications to work products. If change requests are not documented consistently, they are difficult to analyze and track.

Change requests address not only new or changed requirements but also failures and defects in work products.
Change requests are analyzed to determine the impact that the change will have on the work product, related work products, the budget, and the schedule.

TIP: A change request system is usually used after an initial baseline is created. A change request system allows you to track every change that is made to work products.

Example Work Products
1. Change requests

Subpractices
1. Initiate and record change requests in the change request database.

2. Analyze the impact of changes and fixes proposed in change requests.
Changes are evaluated through activities that ensure that they are consistent with all technical and project requirements.
Changes are evaluated for their impact beyond immediate project or contract requirements. Changes to an item used in multiple products can resolve an immediate issue while causing a problem in other applications.
Changes are evaluated for their impact on release plans.

TIP: A database provides a flexible environment for storing and tracking change requests.

3. Categorize and prioritize change requests.
Emergency requests are identified and referred to an emergency authority if appropriate.
Changes are allocated to future baselines.

TIP: Organizing your change requests enables you to analyze the relationships among them and the order in which these requests should be implemented.

4. Review change requests to be addressed in the next baseline with relevant stakeholders and get their agreement.
Conduct the change request review with appropriate participants. Record the disposition of each change request and the rationale for the decision, including success criteria, a brief action plan if appropriate, and needs met or unmet by the change. Perform the actions required in the disposition and report results to relevant stakeholders.

HINT: Track change requests to closure to ensure that if a change request is not addressed, it was not lost or missed.

5. Track the status of change requests to closure.
Change requests brought into the system should be handled in an efficient and timely manner. Once a change request has been processed, it is critical to close the request with the appropriate approved action as soon as it is practical. Actions left open result in larger than necessary status lists, which in turn result in added costs and confusion.
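A change request database can be as simple as a record with a status that is tracked to closure. The following Python sketch is one hypothetical shape for such a record and a query for open requests; the states and field names are assumptions, not mandated by the model.

from dataclasses import dataclass, field
from enum import Enum

class CRStatus(Enum):
    SUBMITTED = "submitted"
    ANALYZED = "analyzed"        # impact on product, budget, and schedule assessed
    APPROVED = "approved"        # disposition and rationale recorded at review
    REJECTED = "rejected"
    IMPLEMENTED = "implemented"
    CLOSED = "closed"

@dataclass
class ChangeRequest:
    cr_id: str
    description: str
    priority: str = "medium"                           # categorization and prioritization
    affected_items: list = field(default_factory=list)
    status: CRStatus = CRStatus.SUBMITTED
    disposition_rationale: str = ""

def open_requests(database):
    """Return change requests not yet tracked to closure."""
    return [cr for cr in database if cr.status not in (CRStatus.CLOSED, CRStatus.REJECTED)]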


SP 2.2 CONTROL CONFIGURATION ITEMS
Control changes to configuration items.

Control is maintained over the configuration of the work product baseline. This control includes tracking the configuration of each configuration item, approving a new configuration if necessary, and updating the baseline.

TIP: A revision history usually contains not only what was changed, but also who made the changes, and when and why they were made.

Example Work Products
1. Revision history of configuration items
2. Archives of baselines

Subpractices
1. Control changes to configuration items throughout the life of the product or service.
2. Obtain appropriate authorization before changed configuration items are entered into the configuration management system.

TIP: The life of the product is typically longer than the life of the development project that created it. The responsibility for configuration items may change over time.

For example, authorization can come from the CCB, the project manager, product owner, or the customer.

HINT: Define authorization procedures so that it is clear how to receive authorization to enter an updated configuration item into the CM system.

3. Check in and check out configuration items in the configuration management system for incorporation of changes in a manner that maintains the correctness and integrity of configuration items. (A minimal sketch of this check-out/check-in discipline appears at the end of this specific practice.)

TIP: An important part of check in and check out is ensuring that only one copy of a configuration item is authorized for update at one time.

Examples of check-in and check-out steps include the following:
• Confirming that the revisions are authorized
• Updating the configuration items
• Archiving the replaced baseline and retrieving the new baseline
• Commenting on the changes made to the item
• Tying changes to related work products such as requirements, user stories, and tests

4. Perform reviews to ensure that changes have not caused unintended effects on the baselines (e.g., ensure that changes have not compromised the safety or security of the system).
5. Record changes to configuration items and reasons for changes as appropriate.
If a proposed change to the work product is accepted, a schedule is identified for incorporating the change into the work product and other affected areas.


Configuration control mechanisms can be tailored to categories of changes. For example, the approval considerations could be less stringent for component changes that do not affect other components. Changed configuration items are released after review and approval of configuration changes. Changes are not official until they are released.
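The check-out/check-in discipline referred to under subpractice 3 can be sketched as follows. This is an illustrative toy model in Python, not a description of any particular CM tool; it simply enforces that only one user holds an item for update at a time and that check-in requires authorization.

class CheckoutError(Exception):
    pass

class CMSystem:
    """Toy CM store: each item maps to its current version and who, if anyone, holds it."""

    def __init__(self):
        self._items = {}  # ci_id -> {"version": int, "checked_out_by": None or user}

    def add(self, ci_id):
        self._items[ci_id] = {"version": 1, "checked_out_by": None}

    def check_out(self, ci_id, user):
        item = self._items[ci_id]
        if item["checked_out_by"] is not None:
            # only one copy of a configuration item is authorized for update at a time
            raise CheckoutError(f"{ci_id} is already checked out by {item['checked_out_by']}")
        item["checked_out_by"] = user

    def check_in(self, ci_id, user, authorized):
        item = self._items[ci_id]
        if item["checked_out_by"] != user:
            raise CheckoutError(f"{user} does not hold {ci_id}")
        if not authorized:
            # revisions must be authorized (e.g., by the CCB) before incorporation
            raise CheckoutError("change is not authorized")
        item["version"] += 1            # update the configuration item
        item["checked_out_by"] = None   # release it for the next change

cm = CMSystem()
cm.add("CI-003")
cm.check_out("CI-003", "dev.lead")
cm.check_in("CI-003", "dev.lead", authorized=True)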

SG 3 ESTABLISH INTEGRITY
Integrity of baselines is established and maintained.

The integrity of baselines, established by processes associated with the Establish Baselines specific goal, and maintained by processes associated with the Track and Control Changes specific goal, is addressed by the specific practices under this specific goal.

SP 3.1 ESTABLISH CONFIGURATION MANAGEMENT RECORDS
Establish and maintain records describing configuration items.

TIP: Without descriptions of the configuration items, it is difficult and time consuming to understand the status of these items relative to the project plan.

Example Work Products
1. Revision history of configuration items
2. Change log
3. Change request records
4. Status of configuration items
5. Differences between baselines

Subpractices
1. Record configuration management actions in sufficient detail so the content and status of each configuration item is known and previous versions can be recovered.
2. Ensure that relevant stakeholders have access to and knowledge of the configuration status of configuration items.

Examples of activities for communicating configuration status include the following:
• Providing access permissions to authorized end users
• Making baseline copies readily available to authorized end users
• Automatically alerting relevant stakeholders when items are checked in or out or changed, or of decisions made regarding change requests

3. Specify the latest version of baselines.


4. Identify the version of configuration items that constitute a particular baseline.
5. Describe differences between successive baselines.
6. Revise the status and history (i.e., changes, other actions) of each configuration item as necessary.

TIP: Version control is an important part of CM. There are different ways to identify versions. A standard way is using sequential numbering.
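Describing the differences between successive baselines (subpractice 5) reduces to comparing the configuration item versions that each baseline contains. A minimal Python sketch, assuming each baseline is recorded as a map from CI identifier to version:

def baseline_diff(old, new):
    """Describe differences between two baselines given as {ci_id: version} maps."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(ci for ci in set(old) & set(new) if old[ci] != new[ci])
    return {"added": added, "removed": removed, "changed": changed}

allocated = {"CI-001": "1.4", "CI-002": "2.0"}
product = {"CI-001": "1.4", "CI-002": "2.1", "CI-003": "1.0"}
print(baseline_diff(allocated, product))
# {'added': ['CI-003'], 'removed': [], 'changed': ['CI-002']}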

HINT: When describing the differences between baselines, you should be detailed enough so that users of the baselines can differentiate them easily.

SP 3.2 PERFORM CONFIGURATION AUDITS
Perform configuration audits to maintain the integrity of configuration baselines.

Configuration audits confirm that the resulting baselines and documentation conform to a specified standard or requirement. Configuration item related records can exist in multiple databases or configuration management systems. In such instances, configuration audits should extend to these other databases as appropriate to ensure accuracy, consistency, and completeness of configuration item information. (See the definition of "configuration audit" in the glossary.)

TIP: Audits provide an objective way to know that what is supposed to be controlled is controlled and can be retrieved. Retrieval of accurate information is one of CM's many benefits.

Examples of audit types include the following:
• Functional configuration audits (FCAs): Audits conducted to verify that the development of a configuration item has been completed satisfactorily, that the item has achieved the functional and quality attribute characteristics specified in the functional or allocated baseline, and that its operational and support documents are complete and satisfactory.
• Physical configuration audits (PCAs): Audits conducted to verify that a configuration item, as built, conforms to the technical documentation that defines and describes it.
• Configuration management audits: Audits conducted to confirm that configuration management records and configuration items are complete, consistent, and accurate.

Example Work Products
1. Configuration audit results
2. Action items

Subpractices
1. Assess the integrity of baselines.
2. Confirm that configuration management records correctly identify configuration items.
3. Review the structure and integrity of items in the configuration management system.

TIP: Integrity includes accuracy, consistency, and completeness.


4. Confirm the completeness, correctness, and consistency of items in the configuration management system.
Completeness, correctness, and consistency of the configuration management system's content are based on requirements as stated in the plan and the disposition of approved change requests.
5. Confirm compliance with applicable configuration management standards and procedures.
6. Track action items from the audit to closure.

TIP: An audit is effective only when all action items from the audit are addressed.
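One of the checks a configuration management audit performs, confirming that records correctly identify the items actually held in the CM system, lends itself to automation. A minimal Python sketch, assuming both the records and the system contents are available as {ci_id: version} maps; the layout is an assumption for illustration only.

def audit_records(records, cm_items):
    """Compare CM records against the items in the CM system; an empty result means they agree."""
    findings = []
    for ci_id, version in records.items():
        if ci_id not in cm_items:
            findings.append(f"record for {ci_id} has no matching item in the CM system")
        elif cm_items[ci_id] != version:
            findings.append(f"{ci_id}: record says version {version}, system holds {cm_items[ci_id]}")
    for ci_id in cm_items:
        if ci_id not in records:
            findings.append(f"{ci_id} is in the CM system but has no record")
    return findings  # each finding becomes an audit action item to track to closure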




DECISION ANALYSIS AND RESOLUTION
A Support Process Area at Maturity Level 3

Purpose

The purpose of Decision Analysis and Resolution (DAR) is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

TIP: DAR provides organizations with a criterion-based approach to making important decisions objectively.

Introductory Notes

The Decision Analysis and Resolution process area involves establishing guidelines to determine which issues should be subject to a formal evaluation process and applying formal evaluation processes to these issues.

HINT: Initially, use this process only for decisions that will have a major impact on the project. After you are accustomed to using a formal evaluation process, you may find value in using it for selected day-to-day decisions.

A formal evaluation process is a structured approach to evaluating alternative solutions against established criteria to determine a recommended solution.
A formal evaluation process involves the following actions:

• Establishing the criteria for evaluating alternatives
• Identifying alternative solutions
• Selecting methods for evaluating alternatives
• Evaluating alternative solutions using established criteria and methods
• Selecting recommended solutions from alternatives based on evaluation criteria

Rather than using the phrase "alternative solutions to address issues" each time, in this process area one of two shorter phrases is used: "alternative solutions" or "alternatives."
A formal evaluation process reduces the subjective nature of a decision and provides a higher probability of selecting a solution that meets multiple demands of relevant stakeholders.

TIP: DAR takes the blame out of decision making. A bad decision is made when all the necessary information that might impact the decision was not considered and relevant stakeholders were not consulted.



While the primary application of this process area is to technical concerns, formal evaluation processes can be applied to many nontechnical issues, particularly when a project is being planned. Issues that have multiple alternative solutions and evaluation criteria lend themselves to a formal evaluation process.

TIP: Examples of nontechnical issues include adding resources and determining delivery approaches for a training class.

Trade studies of equipment or software are typical examples of formal evaluation processes.

During planning, specific issues requiring a formal evaluation process are identified. Typical issues include selection among architectural or design alternatives, use of reusable or commercial off-the-shelf (COTS) components, supplier selection, engineering support environments or associated tools, test environments, delivery alternatives, and logistics and production. A formal evaluation process can also be used to address a make-or-buy decision, the development of manufacturing processes, the selection of distribution locations, and other decisions.

X-REF: Many of the technical issues that may benefit from a formal evaluation process are addressed in TS, PI, VER, and VAL.

Guidelines are created for deciding when to use formal evaluation processes to address unplanned issues. Guidelines often suggest using formal evaluation processes when issues are associated with medium-to-high-impact risks or when issues affect the ability to achieve project objectives.
Defining an issue well helps to define the scope of alternatives to be considered. The right scope (i.e., not too broad, not too narrow) will aid in making an appropriate decision for resolving the defined issue.
Formal evaluation processes can vary in formality, type of criteria, and methods employed. Less formal decisions can be analyzed in a few hours, use few criteria (e.g., effectiveness, cost to implement), and result in a one- or two-page report. More formal decisions can require separate plans, months of effort, meetings to develop and approve criteria, simulations, prototypes, piloting, and extensive documentation.
Both numeric and non-numeric criteria can be used in a formal evaluation process. Numeric criteria use weights to reflect the relative importance of criteria. Non-numeric criteria use a subjective ranking scale (e.g., high, medium, low). More formal decisions can require a full trade study.

TIP: Tools such as the Analytic Hierarchy Process (AHP), Quality Function Deployment (QFD), the Pugh Method, the Delphi Method, prioritization matrices, cause-and-effect diagrams, decision trees, weighted criteria spreadsheets, and simulations exist to help with weighted decision making.

A formal evaluation process identifies and evaluates alternative solutions. The eventual selection of a solution can involve iterative activities of identification and evaluation. Portions of identified alternatives can be combined, emerging technologies can change alternatives, and the business situation of suppliers can change during the evaluation period.
A recommended alternative is accompanied by documentation of selected methods, criteria, alternatives, and rationale for the recommendation. The documentation is distributed to relevant stakeholders; it provides a record of the formal evaluation process and rationale, which are useful to other projects that encounter a similar issue.
While some of the decisions made throughout the life of the project involve the use of a formal evaluation process, others do not. As mentioned earlier, guidelines should be established to determine which issues should be subject to a formal evaluation process.

Related Process Areas

Refer to the Integrated Project Management process area for more information about establishing the project's defined process.
Refer to the Risk Management process area for more information about identifying and analyzing risks and mitigating risks.

Specific Goal and Practice Summary
SG 1 Evaluate Alternatives
SP 1.1 Establish Guidelines for Decision Analysis
SP 1.2 Establish Evaluation Criteria
SP 1.3 Identify Alternative Solutions
SP 1.4 Select Evaluation Methods
SP 1.5 Evaluate Alternative Solutions
SP 1.6 Select Solutions

Specific Practices by Goal

SG 1 EVALUATE ALTERNATIVES
Decisions are based on an evaluation of alternatives using established criteria.

Issues requiring a formal evaluation process can be identified at any time. The objective should be to identify issues as early as possible to maximize the time available to resolve them.


SP 1.1 ESTABLISH GUIDELINES FOR DECISION ANALYSIS
Establish and maintain guidelines to determine which issues are subject to a formal evaluation process.

Not every decision is significant enough to require a formal evaluation process. The choice between the trivial and the truly important is unclear without explicit guidance. Whether a decision is significant or not is dependent on the project and circumstances and is determined by established guidelines.

Typical guidelines for determining when to require a formal evaluation process include the following:
• A decision is directly related to issues that are medium-to-high-impact risk.
• A decision is related to changing work products under configuration management.
• A decision would cause schedule delays over a certain percentage or amount of time.
• A decision affects the ability of the project to achieve its objectives.
• The costs of the formal evaluation process are reasonable when compared to the decision's impact.
• A legal obligation exists during a solicitation.
• Competing quality attribute requirements would result in significantly different alternative architectures.

Refer to the Risk Management process area for more information about evaluating, categorizing, and prioritizing risks.

Examples of activities for which you may use a formal evaluation process include the following:
• Making decisions involving the procurement of material when 20 percent of the material parts constitute 80 percent of the total material costs
• Making design-implementation decisions when technical performance failure can cause a catastrophic failure (e.g., safety-of-flight item)
• Making decisions with the potential to significantly reduce design risk, engineering changes, cycle time, response time, and production costs (e.g., to use lithography models to assess form and fit capability before releasing engineering drawings and production builds)


Example Work Products
1. Guidelines for when to apply a formal evaluation process

Subpractices
1. Establish guidelines for when to use a formal evaluation process.
2. Incorporate the use of guidelines into the defined process as appropriate.

HINT: Make sure the guidelines are accessible and understood by everyone in the organization (or project).

Refer to the Integrated Project Management process area for more information about establishing the project's defined process.

SP 1.2 ESTABLISH EVALUATION CRITERIA
Establish and maintain criteria for evaluating alternatives and the relative ranking of these criteria.

Evaluation criteria provide the basis for evaluating alternative solutions. Criteria are ranked so that the highest ranked criteria exert the most influence on the evaluation.

TIP: In some cases, all criteria may be roughly equal and ranking may not be necessary.

This process area is referenced by many other process areas in the model, and there are many contexts in which a formal evaluation process can be used. Therefore, in some situations you may find that criteria have already been defined as part of another process. This specific practice does not suggest that a second development of criteria be conducted.
A well-defined statement of the issue to be addressed and the decision to be made focuses the analysis to be performed. Such a statement also aids in defining evaluation criteria that minimize the possibility that decisions will be second guessed or that the reason for making the decision will be forgotten. Decisions based on criteria that are explicitly defined and established remove barriers to stakeholder buy-in.

TIP: Documentation helps to provide an understanding of which criteria were considered and how they were ranked.

Example Work Products
1. Documented evaluation criteria
2. Rankings of criteria importance

Subpractices
1. Define the criteria for evaluating alternative solutions.
Criteria should be traceable to requirements, scenarios, business case assumptions, business objectives, or other documented sources.

Types of criteria to consider include the following:
• Technology limitations
• Environmental impact
• Risks
• Business value
• Impact on priorities
• Total ownership and lifecycle costs

2. Define the range and scale for ranking the evaluation criteria.
Scales of relative importance for evaluation criteria can be established with non-numeric values or with formulas that relate the evaluation parameter to a numeric weight.

TIP: An example of a non-numeric scale is one that ranks criteria based on low, medium, or high importance.

3. Rank the criteria.
The criteria are ranked according to the defined range and scale to reflect the needs, objectives, and priorities of the relevant stakeholders.
4. Assess the criteria and their relative importance.
5. Evolve the evaluation criteria to improve their validity.
6. Document the rationale for the selection and rejection of evaluation criteria.
Documentation of selection criteria and rationale may be needed to justify solutions or for future reference and use.

SP 1.3 IDENTIFY ALTERNATIVE SOLUTIONS
Identify alternative solutions to address issues.

A wider range of alternatives can surface by soliciting as many stakeholders as practical for input. Input from stakeholders with diverse skills and backgrounds can help teams identify and address assumptions, constraints, and biases. Brainstorming sessions can stimulate innovative alternatives through rapid interaction and feedback.
Sufficient candidate solutions may not be furnished for analysis. As the analysis proceeds, other alternatives should be added to the list of potential candidate solutions. The generation and consideration of multiple alternatives early in a decision analysis and resolution process increases the likelihood that an acceptable decision will be made and that consequences of the decision will be understood.

HINT: You can use many brainstorming techniques to identify alternative solutions. However, you should establish a few rules before your brainstorming session to ensure that the widest possible range of alternatives is considered.

Example Work Products
1. Identified alternatives


Subpractices
1. Perform a literature search.
A literature search can uncover what others have done both inside and outside the organization. Such a search can provide a deeper understanding of the problem, alternatives to consider, barriers to implementation, existing trade studies, and lessons learned from similar decisions.
2. Identify alternatives for consideration in addition to the alternatives that may be provided with the issue.
Evaluation criteria are an effective starting point for identifying alternatives. Evaluation criteria identify priorities of relevant stakeholders and the importance of technical, logistical, or other challenges.
Combining key attributes of existing alternatives can generate additional and sometimes stronger alternatives.
Solicit alternatives from relevant stakeholders. Brainstorming sessions, interviews, and working groups can be used effectively to uncover alternatives.
3. Document proposed alternatives.

SP 1.4 SELECT EVALUATION METHODS
Select evaluation methods.

Methods for evaluating alternative solutions against established criteria can range from simulations to the use of probabilistic models and decision theory. These methods should be carefully selected. The level of detail of a method should be commensurate with cost, schedule, performance, and risk impacts.
While many problems may require only one evaluation method, some problems may require multiple methods. For example, simulations may augment a trade study to determine which design alternative best meets a given criterion.

TIP: Different evaluation methods require different investments of resources and training.

TIP: Selecting solutions also involves risk assessment.

Example Work Products
1. Selected evaluation methods

Subpractices
1. Select methods based on the purpose for analyzing a decision and on the availability of the information used to support the method.

For example, the methods used for evaluating a solution when requirements are weakly defined may be different from the methods used when the requirements are well defined.

Typical evaluation methods include the following:
• Testing
• Modeling and simulation
• Engineering studies
• Manufacturing studies
• Cost studies
• Business opportunity studies
• Surveys
• Extrapolations based on field experience and prototypes
• End-user review and comment
• Judgment provided by an expert or group of experts (e.g., Delphi method)

2. Select evaluation methods based on their ability to focus on the issues at hand without being overly influenced by side issues.
Results of simulations can be skewed by random activities in the solution that are not directly related to the issues at hand.
3. Determine the measures needed to support the evaluation method.
Consider the impact on cost, schedule, performance, and risks.

SP 1.5 EVALUATE ALTERNATIVE SOLUTIONS
Evaluate alternative solutions using established criteria and methods.

Evaluating alternative solutions involves analysis, discussion, and review. Iterative cycles of analysis are sometimes necessary. Supporting analyses, experimentation, prototyping, piloting, or simulations may be needed to substantiate scoring and conclusions.
Often, the relative importance of criteria is imprecise and the total effect on a solution is not apparent until after the analysis is performed. In cases where the resulting scores differ by relatively small amounts, the best selection among alternative solutions may not be clear. Challenges to criteria and assumptions should be encouraged.

Example Work Products
1. Evaluation results

Subpractices
1. Evaluate proposed alternative solutions using the established evaluation criteria and selected methods.
2. Evaluate assumptions related to the evaluation criteria and the evidence that supports the assumptions.


3. Evaluate whether uncertainty in the values for alternative solutions affects the evaluation and address these uncertainties as appropriate.
For instance, if the score varies between two values, is the difference significant enough to make a difference in the final solution set? Does the variation in score represent a high-impact risk? To address these concerns, simulations may be run, further studies may be performed, or evaluation criteria may be modified, among other things.
4. Perform simulations, modeling, prototypes, and pilots as necessary to exercise the evaluation criteria, methods, and alternative solutions.
Untested criteria, their relative importance, and supporting data or functions can cause the validity of solutions to be questioned. Criteria and their relative priorities and scales can be tested with trial runs against a set of alternatives. These trial runs of a select set of criteria allow for the evaluation of the cumulative impact of criteria on a solution. If trials reveal problems, different criteria or alternatives might be considered to avoid biases.
5. Consider new alternative solutions, criteria, or methods if proposed alternatives do not test well; repeat evaluations until alternatives do test well.
6. Document the results of the evaluation.
Document the rationale for the addition of new alternatives or methods and changes to criteria, as well as the results of interim evaluations.
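The TIP earlier in this process area mentions weighted criteria spreadsheets among the tools that support weighted decision making. A minimal Python sketch of that idea follows, assuming numeric weights and raw scores supplied by the evaluators; the criteria, weights, and alternatives are invented for illustration.

def score_alternatives(weights, scores):
    """Weighted-sum evaluation: rank alternatives by total weighted score, highest first."""
    totals = {
        alt: sum(weights[c] * ratings.get(c, 0) for c in weights)
        for alt, ratings in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

weights = {"business value": 0.4, "lifecycle cost": 0.3, "risk": 0.3}
scores = {
    "COTS component": {"business value": 7, "lifecycle cost": 8, "risk": 5},
    "In-house build": {"business value": 8, "lifecycle cost": 4, "risk": 6},
}
print(score_alternatives(weights, scores))
# "COTS component" ranks first (about 6.7 versus about 6.2)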

SP 1.6 SELECT SOLUTIONS
Select solutions from alternatives based on evaluation criteria.

Selecting solutions involves weighing results from the evaluation of alternatives. Risks associated with the implementation of solutions should be assessed.

Example Work Products
1. Recommended solutions to address significant issues

Subpractices
1. Assess the risks associated with implementing the recommended solution.
Refer to the Risk Management process area for more information about identifying and analyzing risks and mitigating risks.
Decisions must often be made with incomplete information. There can be substantial risk associated with the decision because of having incomplete information.


When decisions must be made according to a specific schedule, time and resources may not be available for gathering complete information. Consequently, risky decisions made with incomplete information can require re-analysis later. Identified risks should be monitored.
2. Document and communicate to relevant stakeholders the results and rationale for the recommended solution.
It is important to record both why a solution is selected and why another solution was rejected.


INTEGRATED PROJECT MANAGEMENT
A Project Management Process Area at Maturity Level 3

TIP: In the CMMI for Services Model, this process area is called Integrated Work Management.

Purpose

The purpose of Integrated Project Management (IPM) is to establish and manage the project and the involvement of relevant stakeholders according to an integrated and defined process that is tailored from the organization's set of standard processes.

TIP: IPM matures the project management activities described in PP and PMC so that they address the organizational requirements for projects described in OPF and OPD.

Introductory Notes

Integrated Project Management involves the following activities:

• Establishing the project's defined process at project startup by tailoring the organization's set of standard processes
• Managing the project using the project's defined process
• Establishing the work environment for the project based on the organization's work environment standards
• Establishing teams that are tasked to accomplish project objectives
• Using and contributing to organizational process assets
• Enabling relevant stakeholders' concerns to be identified, considered, and, when appropriate, addressed during the project

• Ensuring that relevant stakeholders (1) perform their tasks in a coordinated and timely manner; (2) address project requirements, plans, objectives, problems, and risks; (3) fulfill their commitments; and (4) identify, track, and resolve coordination issues

TIP: Using IPM to guide project management activities enables project plans to be consistent with project activities because they are both derived from standard processes created by the organization. Further, plans tend to be more reliable and are developed more quickly, and new projects learn more quickly.

The integrated and defined process that is tailored from the organization's set of standard processes is called the project's defined process. (See the definition of "project" in the glossary.)
Managing the project's effort, cost, schedule, staffing, risks, and other factors is tied to the tasks of the project's defined process. The implementation and management of the project's defined process are typically described in the project plan. Certain activities may be covered in other plans that affect the project, such as the quality assurance plan, risk management strategy, and the configuration management plan.
Since the defined process for each project is tailored from the organization's set of standard processes, variability among projects is typically reduced and projects can easily share process assets, data, and lessons learned.

TIP: It is also easier to share resources (e.g., training and software tools) and to "load balance" staff across projects.

This process area also addresses the coordination of all activities associated with the project such as the following:
• Development activities (e.g., requirements development, design, verification)
• Service activities (e.g., delivery, help desk, operations, customer contact)
• Acquisition activities (e.g., solicitation, agreement monitoring, transition to operations)
• Support activities (e.g., configuration management, documentation, marketing, training)

The working interfaces and interactions among relevant stakeholders internal and external to the project are planned and managed to ensure the quality and integrity of the overall endeavor. Relevant stakeholders participate as appropriate in defining the project's defined process and the project plan. Reviews and exchanges are regularly conducted with relevant stakeholders to ensure that coordination issues receive appropriate attention and everyone involved with the project is appropriately aware of status, plans, and activities. (See the definition of "relevant stakeholder" in the glossary.) In defining the project's defined process, formal interfaces are created as necessary to ensure that appropriate coordination and collaboration occurs.

TIP: A proactive approach to integrating plans and coordinating with relevant stakeholders outside the project is a key activity.

This process area applies in any organizational structure, including projects that are structured as line organizations, matrix organizations, or teams. The terminology should be appropriately interpreted for the organizational structure in place.

Related Process Areas

Refer to the Verification process area for more information about performing peer reviews. Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.


Refer to the Organizational Process Definition process area for more information about establishing and maintaining a usable set of organizational process assets, work environment standards, and rules and guidelines for teams.
Refer to the Project Monitoring and Control process area for more information about monitoring the project against the plan.
Refer to the Project Planning process area for more information about developing a project plan.

Specific Goal and Practice Summary
SG 1 Use the Project's Defined Process
SP 1.1 Establish the Project's Defined Process
SP 1.2 Use Organizational Process Assets for Planning Project Activities
SP 1.3 Establish the Project's Work Environment
SP 1.4 Integrate Plans
SP 1.5 Manage the Project Using Integrated Plans
SP 1.6 Establish Teams
SP 1.7 Contribute to Organizational Process Assets
SG 2 Coordinate and Collaborate with Relevant Stakeholders
SP 2.1 Manage Stakeholder Involvement
SP 2.2 Manage Dependencies
SP 2.3 Resolve Coordination Issues

TIP: In Version 1.3, the IPPD material has been removed and SP 1.6 was added to address teams.

Specific Practices by Goal

SG 1 USE THE PROJECT'S DEFINED PROCESS
The project is conducted using a defined process tailored from the organization's set of standard processes.

The project's defined process includes those processes from the organization's set of standard processes that address all processes necessary to acquire, develop, maintain, or deliver the product.
The product related lifecycle processes, such as manufacturing and support processes, are developed concurrently with the product.

TIP: All projects that use IPM have the organization's set of standard processes as a basis to begin planning all project activities.

SP 1.1 ESTABLISH THE PROJECT'S DEFINED PROCESS
Establish and maintain the project's defined process from project startup through the life of the project.

Refer to the Organizational Process Definition process area for more information about establishing organizational process assets and establishing the organization's measurement repository.
Refer to the Organizational Process Focus process area for more information about deploying organizational process assets and deploying standard processes.


The project's defined process consists of defined processes that form an integrated, coherent lifecycle for the project.
The project's defined process should satisfy the project's contractual requirements, operational needs, opportunities, and constraints. It is designed to provide a best fit for project needs. A project's defined process is based on the following factors:

• Stakeholder requirements
• Commitments
• Organizational process needs and objectives
• The organization's set of standard processes and tailoring guidelines
• The operational environment
• The business environment

Establishing the project's defined process at project startup helps to ensure that project staff and relevant stakeholders implement a set of activities needed to efficiently establish an initial set of requirements and plans for the project. As the project progresses, the description of the project's defined process is elaborated and revised to better meet project requirements and the organization's process needs and objectives. Also, as the organization's set of standard processes changes, the project's defined process may need to be revised.

Example Work Products
1. The project's defined process

Subpractices
1. Select a lifecycle model from the ones available in organizational process assets.

TIP: IPM depends strongly on OPD. It is impossible to implement the specific practices in IPM without having in place the organizational infrastructure described in OPD.

Examples of project characteristics that could affect the selection of lifecycle models include the following:
• Size or complexity of the project
• Project strategy
• Experience and familiarity of staff with implementing the process
• Constraints such as cycle time and acceptable defect levels
• Availability of customers to answer questions and provide feedback on increments
• Clarity of requirements
• Customer expectations


2. Select standard processes from the organization's set of standard processes that best fit the needs of the project.
3. Tailor the organization's set of standard processes and other organizational process assets according to tailoring guidelines to produce the project's defined process.
Sometimes the available lifecycle models and standard processes are inadequate to meet project needs. In such circumstances, the project should seek approval to deviate from what is required by the organization. Waivers are provided for this purpose.
Tailoring can include adapting the organization's common measures and specifying additional measures to meet the information needs of the project.

TIP: The organization's set of standard processes are tailored to address the project's specific needs and situation. For example, are stringent quality, safety, or security requirements in place? Is the risk of delivering the wrong product high? Are we working with a new customer, product line, or line of business? Are there stringent schedule constraints?

4. Use other artifacts from the organization's process asset library as appropriate.

Other artifacts can include the following:
• Lessons learned documents
• Templates
• Example documents
• Estimating models

HINT: Maintain the process asset library to keep it current, or else it could become the dumping ground for all project information and may quickly become unusable.

5. Document the project's defined process.
The project's defined process covers all of the activities for the project and its interfaces to relevant stakeholders.

Examples of project activities include the following:
• Project planning
• Project monitoring
• Supplier management
• Quality assurance
• Risk management
• Decision analysis and resolution
• Requirements development
• Requirements management
• Configuration management
• Product development and support
• Code review
• Solicitation

TIP: Remember the point from the Introductory Notes that states, "Relevant stakeholders participate as appropriate in defining the project's defined process and the project plan."

6. Conduct peer reviews of the project's defined process.
Refer to the Verification process area for more information about performing peer reviews.


7. Revise the project’s defined process as necessary.

SP 1.2 USE ORGANIZATIONAL PROCESS ASSETS FOR PLANNING PROJECT ACTIVITIES

Use organizational process assets and the measurement repository for estimating and planning project activities.

TIP: These activities may be similar to what you were doing to address PP, but because data and experiences from other projects are now more readily available and applicable, the accuracy of project planning should improve.

Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.

When available, use results of previous planning and execution activities as predictors of the relative scope and risk of the effort being estimated.

Example Work Products
1. Project estimates
2. Project plans

Subpractices
1. Use the tasks and work products of the project's defined process as a basis for estimating and planning project activities.
An understanding of the relationships among tasks and work products of the project's defined process, and of the roles to be performed by relevant stakeholders, is a basis for developing a realistic plan.
2. Use the organization's measurement repository in estimating the project's planning parameters.

This estimate typically includes the following:
• Appropriate historical data from this project or similar projects
• Similarities and differences between the current project and those projects whose historical data will be used
• Validated historical data
• Reasoning, assumptions, and rationale used to select the historical data
• Reasoning of a broad base of experienced project participants

TIP: Independent validation of data confirms that the historical data used is applicable to the project.

Examples of parameters that are considered for similarities and differences include the following:
• Work product and task attributes
• Application domain
• Experience of the people
• Design and development approaches
• Operational environment

Examples of data contained in the organization's measurement repository include the following:
• Size of work products or other work product attributes
• Effort
• Cost
• Schedule
• Staffing
• Response time
• Service capacity
• Supplier performance
• Defects
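Historical data like the examples above supports simple analogy-based estimates. A minimal Python sketch, assuming size and effort pairs drawn from similar past projects in the measurement repository; the units and numbers are invented for illustration and the model does not mandate any particular estimating technique.

def estimate_effort(new_size, history):
    """Analogy-based effort estimate from historical (size, effort_hours) pairs."""
    hours_per_unit = sum(effort / size for size, effort in history) / len(history)
    return new_size * hours_per_unit

# Hypothetical data from the organization's measurement repository
similar_projects = [(120, 2400), (200, 4600), (90, 1700)]
print(round(estimate_effort(150, similar_projects)))  # roughly 3,100 hours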

SP 1.3 ESTABLISH THE PROJECT'S WORK ENVIRONMENT
Establish and maintain the project's work environment based on the organization's work environment standards.

An appropriate work environment for a project comprises an infrastructure of facilities, tools, and equipment that people need to perform their jobs effectively in support of business and project objectives. The work environment and its components are maintained at a level of work environment performance and reliability indicated by organizational work environment standards. As required, the project's work environment or some of its components can be developed internally or acquired from external sources.
The project's work environment might encompass environments for product integration, verification, and validation or they might be separate environments.

TIP: Often, the project's work environment contains components that are common to the organization's overall work environment. Many of these components may be provided by IT or the facilities group.

Refer to the Establish the Product Integration Environment specific practice in the Product Integration process area for more information about establishing and maintaining the product integration environment for the project.
Refer to the Establish the Validation Environment specific practice in the Validation process area for more information about establishing and maintaining the validation environment for the project.
Refer to the Establish the Verification Environment specific practice in the Verification process area for more information about establishing and maintaining the verification environment for the project.
Refer to the Establish Work Environment Standards specific practice in the Organizational Process Definition process area for more information about work environment standards.


Example Work Products
1. Equipment and tools for the project
2. Installation, operation, and maintenance manuals for the project work environment
3. User surveys and results
4. Use, performance, and maintenance records
5. Support services for the project's work environment

Subpractices
1. Plan, design, and install a work environment for the project.
The critical aspects of the project work environment are, like any other product, requirements driven. Functionality and quality attributes of the work environment are explored with the same rigor as is done for any other product development project.

TIP: A facilities group can use input from the project to create the work environment.

It may be necessary to make tradeoffs among quality attributes, costs, and risks. The following are examples of each:
• Quality attribute considerations can include timely communication, safety, security, and maintainability.
• Costs can include capital outlays, training, a support structure; disassembly and disposal of existing environments; and the operation and maintenance of the environment.
• Risks can include workflow and project disruptions.

Examples of equipment and tools include the following:
• Office software
• Decision support software
• Project management tools
• Test and evaluation equipment
• Requirements management tools and design tools
• Configuration management tools
• Evaluation tools
• Integration tools
• Automated test tools

2. Provide ongoing maintenance and operational support for the project's work environment.
Maintenance and support of the work environment can be accomplished either with capabilities found inside the organization or hired from outside the organization.

Examples of maintenance and support approaches include the following:
• Hiring people to perform maintenance and support
• Training people to perform maintenance and support
• Contracting maintenance and support
• Developing expert users for selected tools

3. Maintain the qualification of components of the project's work environment.
Components include software, databases, hardware, tools, test equipment, and appropriate documentation. Qualification of software includes appropriate certifications. Hardware and test equipment qualification includes calibration and adjustment records and traceability to calibration standards.
4. Periodically review how well the work environment is meeting project needs and supporting collaboration, and take action as appropriate.

Examples of actions that might be taken include the following:
• Adding new tools
• Acquiring additional networks, equipment, training, and support

SP 1.4 INTEGRATE PLANS
Integrate the project plan and other plans that affect the project to describe the project's defined process.

TIP: One of the main differences between IPM and PP is that IPM is more proactive in coordinating with relevant stakeholders, both internal (different teams) and external (organizational functions and support groups) to the project, and is concerned with the integration of plans.

Refer to the Organizational Process Definition process area for more information about establishing organizational process assets and, in particular, establishing the organization's measurement repository.
Refer to the Organizational Process Focus process area for more information about establishing organizational process needs and determining process improvement opportunities.
Refer to the Project Planning process area for more information about developing a project plan.

TIP: To help formulate estimates, applicable data may be found in the organization's measurement repository. Additionally, applicable templates, examples, and lessons-learned documents may be found in the organization's process asset library.

This specific practice extends the specific practices for establishing and maintaining a project plan to address additional planning activities such as incorporating the project's defined process, coordinating with relevant stakeholders, using organizational process assets, incorporating plans for peer reviews, and establishing objective entry and exit criteria for tasks.


The development of the project plan should account for current and projected needs, objectives, and requirements of the organization, customer, suppliers, and end users as appropriate.

Example Work Products
1. Integrated plans

Subpractices
1. Integrate other plans that affect the project with the project plan.

Other plans that affect the project plan can include the following:
• Quality assurance plans
• Risk management strategy
• Verification and validation plans
• Transition to operations and support plans
• Configuration management plans
• Documentation plans
• Staff training plans
• Facilities and logistics plans

2. Incorporate into the project plan the definitions of measures and measurement activities for managing the project.

Examples of measures that would be incorporated include the following:
• Organization's common set of measures
• Additional project specific measures

Refer to the Measurement and Analysis process area for more information about developing and sustaining a measurement capability used to support management information needs.
3. Identify and analyze product and project interface risks.
Refer to the Risk Management process area for more information about identifying and analyzing risks.

Examples of product and project interface risks include the following:
• Incomplete interface descriptions
• Unavailability of tools, suppliers, or test equipment
• Unavailability of COTS components
• Inadequate or ineffective team interfaces


4. Schedule tasks in a sequence that accounts for critical development and delivery factors and project risks.

Examples of factors considered in scheduling include the following:
• Size and complexity of tasks
• Needs of the customer and end users
• Availability of critical resources
• Availability of key staff
• Integration and test issues

5. Incorporate plans for performing peer reviews on work products of the project's defined process.
Refer to the Verification process area for more information about performing peer reviews.

6. Incorporate the training needed to perform the project's defined process in the project's training plans.
This task typically includes negotiating with the organizational training group on the support they will provide.
X-REF: Refer to OT for more information about organizational training.
7. Establish objective entry and exit criteria to authorize the initiation and completion of tasks described in the work breakdown structure (WBS).
Refer to the Project Planning process area for more information about estimating the scope of the project.
8. Ensure that the project plan is appropriately compatible with the plans of relevant stakeholders.
Typically the plan and changes to the plan will be reviewed for compatibility.
9. Identify how conflicts will be resolved that arise among relevant stakeholders.

SP 1.5 MANAGE THE PROJECT USING INTEGRATED PLANS
Manage the project using the project plan, other plans that affect the project, and the project's defined process.
TIP: Walk your talk. The prior SPs established the plan—this SP implements and manages the project against that plan.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
Refer to the Organizational Process Focus process area for more information about establishing organizational process needs, deploying organizational process assets, and deploying standard processes.
Refer to the Project Monitoring and Control process area for more information about providing an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan.
Refer to the Risk Management process area for more information about identifying and analyzing risks and mitigating risks.

Example Work Products
1. Work products created by performing the project's defined process
2. Collected measures (i.e., actuals) and status records or reports
3. Revised requirements, plans, and commitments
4. Integrated plans

Subpractices 1. Implement the project’s defined process using the organization’s process asset library.

This task typically includes the following activities:
• Incorporating artifacts from the organization's process asset library into the project as appropriate
• Using lessons learned from the organization's process asset library to manage the project

2. Monitor and control the project's activities and work products using the project's defined process, project plan, and other plans that affect the project.
TIP: The organization's process improvement plan is another plan that might affect the project.
This task typically includes the following activities:
• Using the defined entry and exit criteria to authorize the initiation and determine the completion of tasks
• Monitoring activities that could significantly affect actual values of the project's planning parameters
• Tracking project planning parameters using measurable thresholds that will trigger investigation and appropriate actions (see the sketch after these subpractices)
• Monitoring product and project interface risks
• Managing external and internal commitments based on plans for tasks and work products of the project's defined process

An understanding of the relationships among tasks and work products of the project's defined process and of the roles to be performed by relevant stakeholders, along with well-defined control mechanisms (e.g., peer reviews), achieves better visibility into project performance and better control of the project.


3. Obtain and analyze selected measurements to manage the project and support organization needs.
Refer to the Measurement and Analysis process area for more information about obtaining measurement data and analyzing measurement data.
4. Periodically review and align the project's performance with current and anticipated needs, objectives, and requirements of the organization, customer, and end users as appropriate.
This review includes alignment with organizational process needs and objectives.

Examples of actions that achieve alignment include the following:
• Changing the schedule with appropriate adjustments to other planning parameters and project risks
• Changing requirements or commitments in response to a change in market opportunities or customer and end-user needs
• Terminating the project, iteration, or release

5. Address causes of selected issues that can affect project objectives.
Issues that require corrective action are determined and analyzed as in the Analyze Issues and Take Corrective Actions specific practices of the Project Monitoring and Control process area. As appropriate, the project may periodically review issues previously encountered on other projects or in earlier phases of the project, and conduct causal analysis of selected issues to determine how to prevent recurrence of issues that can significantly affect project objectives. Project process changes implemented as a result of causal analysis activities should be evaluated for effectiveness to ensure that the process change has prevented recurrence and improved performance.
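The model does not prescribe any tooling for tracking planning parameters against thresholds, but a small illustration may help. The following sketch is hypothetical: the parameter names, planned values, and threshold percentages are made up, and are shown only to make the idea of a measurable threshold that triggers investigation concrete.

```python
# Hypothetical sketch: flag planning parameters whose actual values
# deviate from plan by more than an agreed threshold, triggering
# investigation as described in subpractice 2 above.

PLAN = {"effort_hours": 1200, "cost_usd": 150000, "duration_weeks": 26}
ACTUALS = {"effort_hours": 1420, "cost_usd": 151000, "duration_weeks": 27}
THRESHOLDS = {"effort_hours": 0.10, "cost_usd": 0.05, "duration_weeks": 0.10}

def parameters_needing_investigation(plan, actuals, thresholds):
    """Return parameters whose relative deviation from plan exceeds the threshold."""
    flagged = {}
    for name, planned in plan.items():
        deviation = abs(actuals[name] - planned) / planned
        if deviation > thresholds[name]:
            flagged[name] = round(deviation, 3)
    return flagged

print(parameters_needing_investigation(PLAN, ACTUALS, THRESHOLDS))
# e.g. {'effort_hours': 0.183}
```

In practice the thresholds themselves would come from the project plan or the organization's standard process, not from code.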

SP 1.6 ESTABLISH TEAMS

Establish and maintain teams.

The project is managed using teams that reflect the organizational rules and guidelines for team structuring, formation, and operation. (See the definition of "team" in the glossary.)
The project's shared vision is established prior to establishing the team structure, which can be based on the WBS. For small organizations, the whole organization and relevant external stakeholders can be treated as a team.
Refer to the Establish Rules and Guidelines for Teams specific practice in the Organizational Process Definition process area for more information about establishing and maintaining organizational rules and guidelines for the structure, formation, and operation of teams.


One of the best ways to ensure coordination and collaboration with relevant stakeholders is to include them on the team.
In a customer environment that requires coordination among multiple product or service development organizations, it is important to establish a team with representation from all parties that affect overall success. Such representation helps to ensure effective collaboration across these organizations, including the timely resolution of coordination issues.
TIP: Teams can have a "sponsor." Although the sponsor helps to form the team, he is usually not involved in daily activities. The sponsor may be part of the management team.
TIP: The shared vision is often captured in a team charter. The team charter is reviewed by all members of the team to ensure buy-in.
Example Work Products
1. Documented shared vision
2. List of members assigned to each team
3. Team charters
4. Periodic team status reports

Subpractices
1. Establish and maintain the project's shared vision.
When creating a shared vision, it is critical to understand the interfaces between the project and stakeholders external to the project. The vision should be shared among relevant stakeholders to obtain their agreement and commitment.
TIP: CMMI development is one example of a team. The CMMI project communicated its shared vision broadly so that the management team guiding CMMI activities, the home organizations that sponsored project team members, and the community had a clear understanding of the project's objectives, values, and intended outcomes.
2. Establish and maintain the team structure.
The project WBS, cost, schedule, project risks, resources, interfaces, the project's defined process, and organizational guidelines are evaluated to establish an appropriate team structure, including team responsibilities, authorities, and interrelationships.
3. Establish and maintain each team.
Establishing and maintaining teams encompasses choosing team leaders and team members and establishing team charters for each team. It also involves providing resources required to accomplish tasks assigned to the team.
TIP: Over the life of a long project, different team structures can be used to reflect the changing needs of the project.
4. Periodically evaluate the team structure and composition.
Teams should be monitored to detect misalignment of work across different teams, mismanaged interfaces, and mismatches of tasks to team members. Take corrective action when team or project performance does not meet expectations.
TIP: If this is an integrated team composed of members from different parts of the organization or different organizations, each member should have dual aspirations and expectations: those specific to the project and those related to their home organization.

SP 1.7 CONTRIBUTE TO ORGANIZATIONAL PROCESS ASSETS
Contribute process related experiences to organizational process assets.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets, establishing the organization's measurement repository, and establishing the organization's process asset library.


Refer to the Organizational Process Focus process area for more information about incorporating experiences into organizational process assets.

This specific practice addresses contributing information from processes in the project's defined process to organizational process assets.
TIP: This SP provides feedback to the organization so that the organizational assets can be improved, and data and experiences can be shared with other projects.
Example Work Products
1. Proposed improvements to organizational process assets
2. Actual process and product measures collected from the project
3. Documentation (e.g., exemplary process descriptions, plans, training modules, checklists, lessons learned)
4. Process artifacts associated with tailoring and implementing the organization's set of standard processes on the project

Subpractices
1. Propose improvements to the organizational process assets.
X-REF: Improvements are proposed using "process improvement proposals." For more information, see OPF SP 1.3.
2. Store process and product measures in the organization's measurement repository.
Refer to the Measurement and Analysis process area for more information about obtaining measurement data.
Refer to the Project Monitoring and Control process area for more information about monitoring project planning parameters.
Refer to the Project Planning process area for more information about planning data management.

These process and product measures typically include the following:
• Planning data
• Replanning data
Examples of data recorded by the project include the following:
• Task descriptions
• Assumptions
• Estimates
• Revised estimates
• Definitions of recorded data and measures
• Measures
• Context information that relates the measures to the activities performed and work products produced
• Associated information needed to reconstruct the estimates, assess their reasonableness, and derive estimates for new work


3. Submit documentation for possible inclusion in the organization’s process asset library.

Examples of documentation include the following:
• Exemplary process descriptions
• Training modules
• Exemplary plans
• Checklists and templates
• Project repository shells
• Tool configurations

4. Document lessons learned from the project for inclusion in the organization's process asset library.
5. Provide process artifacts associated with tailoring and implementing the organization's set of standard processes in support of the organization's process monitoring activities.
Refer to the Monitor the Implementation specific practice in the Organizational Process Focus process area for more information about the organization's activities to understand the extent of deployment of standard processes on new and existing projects.

SG 2 COORDINATE AND COLLABORATE WITH RELEVANT STAKEHOLDERS

Coordination and collaboration between the project and relevant stakeholders are conducted.
X-REF: Relevant stakeholders are identified in GP 2.7 and PP SP 2.6.

SP 2.1 MANAGE STAKEHOLDER INVOLVEMENT
Manage the involvement of relevant stakeholders in the project.

Stakeholder involvement is managed according to the project's integrated plan and defined process.
Refer to the Project Planning process area for more information about planning stakeholder involvement and obtaining plan commitment.

Example Work Products
1. Agendas and schedules for collaborative activities
2. Recommendations for resolving relevant stakeholder issues
3. Documented issues (e.g., issues with stakeholder requirements, product and product component requirements, product architecture, product design)


Subpractices
1. Coordinate with relevant stakeholders who should participate in project activities.
The relevant stakeholders should already be identified in the project plan.
2. Ensure work products that are produced to satisfy commitments meet the requirements of the recipients.
Refer to the Verification process area for more information about verifying selected work products.
The work products produced to satisfy commitments can be services.
TIP: These commitments may be external commitments that project staff is addressing.
This task typically includes the following:
• Reviewing, demonstrating, or testing, as appropriate, each work product produced by relevant stakeholders
• Reviewing, demonstrating, or testing, as appropriate, each work product produced by the project for other projects with representatives of the projects receiving the work product
• Resolving issues related to the acceptance of the work products
3. Develop recommendations and coordinate actions to resolve misunderstandings and problems with requirements.

SP 2.2 MANAGE DEPENDENCIES
Participate with relevant stakeholders to identify, negotiate, and track critical dependencies.
TIP: Too often, individuals assume critical dependencies identified at the beginning of a project don't change and are someone else's job.
TIP: Ironically, when time and money are limited, integration, coordination, and collaboration activities become more critical. Coordination helps ensure that all involved parties contribute to the product in a timely way to minimize rework and delays.
Example Work Products
1. Defects, issues, and action items resulting from reviews with relevant stakeholders
2. Critical dependencies
3. Commitments to address critical dependencies
4. Status of critical dependencies
Subpractices
1. Conduct reviews with relevant stakeholders.
2. Identify each critical dependency.
3. Establish need dates and plan dates for each critical dependency based on the project schedule.
4. Review and get agreement on commitments to address each critical dependency with those who are responsible for providing or receiving the work product.
5. Document critical dependencies and commitments.
HINT: You can facilitate coordination on critical dependencies by determining need and plan dates for each critical dependency and then establishing and managing commitments as described in these subpractices and in PP and PMC.

Documentation of commitments typically includes the following:
• Describing the commitment
• Identifying who made the commitment
• Identifying who is responsible for satisfying the commitment
• Specifying when the commitment will be satisfied
• Specifying the criteria for determining if the commitment has been satisfied

6. Track the critical dependencies and commitments and take corrective action as appropriate.
Refer to the Project Monitoring and Control process area for more information about monitoring commitments.

Tracking critical dependencies typically includes the following:
• Evaluating the effects of late and early completion for impacts on future activities and milestones
• Resolving actual and potential problems with responsible parties whenever possible
• Escalating to the appropriate party the actual and potential problems not resolvable by the responsible individual or group
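As an illustration only (the model does not prescribe any format or tooling for this), critical dependencies might be recorded with the need and plan dates discussed above and periodically scanned for those at risk. The names, dates, and field layout below are hypothetical.

```python
# Hypothetical sketch: record critical dependencies with need and plan
# dates (per the subpractices above) and flag those whose plan date
# falls after the need date so corrective action can be coordinated.

from dataclasses import dataclass
from datetime import date

@dataclass
class CriticalDependency:
    description: str
    provider: str      # who committed to supply the work product
    receiver: str      # who needs it
    need_date: date    # when the receiver must have it
    plan_date: date    # when the provider plans to deliver it

    def at_risk(self) -> bool:
        return self.plan_date > self.need_date

deps = [
    CriticalDependency("Interface control document", "Team A", "Team B",
                       date(2024, 3, 1), date(2024, 2, 20)),
    CriticalDependency("Calibrated test rig", "Lab group", "Integration team",
                       date(2024, 4, 15), date(2024, 5, 2)),
]

for d in deps:
    if d.at_risk():
        print(f"Escalate: '{d.description}' planned {d.plan_date}, needed {d.need_date}")
```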

SP 2.3 RESOLVE COORDINATION ISSUES

Resolve issues with relevant stakeholders.
TIP: These issues are expected to be resolved at the project level. However, because stakeholders may be from outside the project, issues may need to be escalated to the appropriate level of management to be resolved.
Examples of coordination issues include the following:
• Product and product component requirements and design defects
• Late critical dependencies and commitments
• Product level problems
• Unavailability of critical resources or staff

Example Work Products 1. Relevant stakeholder coordination issues 2. Status of relevant stakeholder coordination issues

Subpractices
1. Identify and document issues.
2. Communicate issues to relevant stakeholders.
3. Resolve issues with relevant stakeholders.


4. Escalate to appropriate managers the issues not resolvable with relevant stakeholders.
5. Track issues to closure.
6. Communicate with relevant stakeholders on the status and resolution of issues.




MEASUREMENT AND ANALYSIS
A Support Process Area at Maturity Level 2

Purpose
The purpose of Measurement and Analysis (MA) is to develop and sustain a measurement capability used to support management information needs.
HINT: Use this process area whenever you need to measure project progress, product size or quality, or process performance in support of making decisions and taking corrective action.

Introductory Notes
The Measurement and Analysis process area involves the following activities:
• Specifying objectives of measurement and analysis so that they are aligned with identified information needs and project, organizational, or business objectives
• Specifying measures, analysis techniques, and mechanisms for data collection, data storage, reporting, and feedback
• Implementing the analysis techniques and mechanisms for data collection, data reporting, and feedback
• Providing objective results that can be used in making informed decisions and taking appropriate corrective action
TIP: This process area uses the term objective both as a noun meaning "a goal to be attained" (first bullet) and as an adjective meaning "unbiased" (fourth bullet).

The integration of measurement and analysis activities into the processes of the project supports the following:

• Objective planning and estimating
• Tracking actual progress and performance against established plans and objectives
• Identifying and resolving process related issues

• Providing a basis for incorporating measurement into additional processes in the future



The staff required to implement a measurement capability may or may not be employed in a separate organization-wide program. Measurement capability may be integrated into individual projects or other organizational functions (e.g., quality assurance).
TIP: Measurement involves everyone. A centralized group, such as a process group or a measurement group, may provide help in defining the measures, the analyses to perform, and the reporting content and charts used.
The initial focus for measurement activities is at the project level. However, a measurement capability can prove useful for addressing organization- and enterprise-wide information needs. To support this capability, measurement activities should support information needs at multiple levels, including the business, organizational unit, and project to minimize rework as the organization matures.
Projects can store project specific data and results in a project specific repository, but when data are to be used widely or are to be analyzed in support of determining data trends or benchmarks, data may reside in the organization's measurement repository.
Measurement and analysis of product components provided by suppliers is essential for effective management of the quality and costs of the project. It is possible, with careful management of supplier agreements, to provide insight into data that support supplier performance analysis.
X-REF: It is important to include in the supplier agreement any measurement and analysis activities that you want the supplier to perform. Refer to SAM for more information about supplier activities.
Measurement objectives are derived from information needs that come from project, organizational, or business objectives. In this process area, when the term "objectives" is used without the "measurement" qualifier, it indicates either project, organizational, or business objectives.

Related Process Areas

Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Configuration Management process area for more information about establishing and maintaining the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.
Refer to the Organizational Process Definition process area for more information about establishing the organization's measurement repository.
Refer to the Project Monitoring and Control process area for more information about monitoring project planning parameters.
Refer to the Project Planning process area for more information about establishing estimates.
Refer to the Quantitative Project Management process area for more information about quantitatively managing the project.


Refer to the Requirements Management process area for more information about maintaining bidirectional traceability of requirements.

Specific Goal and Practice Summary
SG 1 Align Measurement and Analysis Activities
SP 1.1 Establish Measurement Objectives
SP 1.2 Specify Measures
SP 1.3 Specify Data Collection and Storage Procedures
SP 1.4 Specify Analysis Procedures
SG 2 Provide Measurement Results
SP 2.1 Obtain Measurement Data
SP 2.2 Analyze Measurement Data
SP 2.3 Store Data and Results
SP 2.4 Communicate Results

Specific Practices by Goal

SG 1 ALIGN MEASUREMENT AND ANALYSIS ACTIVITIES
Measurement objectives and activities are aligned with identified information needs and objectives.
TIP: When starting a measurement program, an iterative process is usually helpful because you often do not know all of your objectives initially.
The specific practices under this specific goal can be addressed concurrently or in any order.

• When establishing measurement objectives, experts often think ahead about necessary criteria for specifying measures and analysis procedures. They also think concurrently about the constraints imposed by data collection and storage procedures.
• Often it is important to specify the essential analyses to be conducted before attending to details of measurement specification, data collection, or storage.

SP 1.1 ESTABLISH MEASUREMENT OBJECTIVES
Establish and maintain measurement objectives derived from identified information needs and objectives.
HINT: When establishing measurement objectives, ask yourself what question you are answering with the data, why you are measuring something, and how these measurements will affect project behavior.
Measurement objectives document the purposes for which measurement and analysis are done and specify the kinds of actions that can be taken based on results of data analyses. Measurement objectives can also identify the change in behavior desired as a result of implementing a measurement and analysis activity.


Measurement objectives may be constrained by existing processes, available resources, or other measurement considerations. Judgments may need to be made about whether the value of the result is commensurate with resources devoted to doing the work.
HINT: Measurement can be an expensive activity. Think about why you are collecting the measures and what need the measure will be addressing.
Modifications to identified information needs and objectives can, in turn, be indicated as a consequence of the process and results of measurement and analysis.
TIP: It is important to consider staff in understanding information needs. You want everyone to feel ownership of your measurement program.
Sources of information needs and objectives can include the following:
• Project plans
• Project performance monitoring
• Interviews with managers and others who have information needs
• Established management objectives
• Strategic plans
• Business plans
• Formal requirements or contractual obligations
• Recurring or other troublesome management or technical problems
• Experiences of other projects or organizational entities
• External industry benchmarks
• Process improvement plans

Example measurement objectives include the following:
• Provide insight into schedule fluctuations and progress
• Provide insight into actual size compared to plan
• Identify unplanned growth
• Evaluate the effectiveness of defect detection throughout the product development lifecycle
• Determine the cost of correcting defects
• Provide insight into actual costs compared to plan
• Evaluate supplier progress against the plan
• Evaluate the effectiveness of mitigating information system vulnerabilities

Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Project Monitoring and Control process area for more information about monitoring project planning parameters.
Refer to the Project Planning process area for more information about establishing estimates.
Refer to the Requirements Management process area for more information about maintaining bidirectional traceability of requirements.


Example Work Products 1. Measurement objectives

Subpractices
1. Document information needs and objectives.
2. Prioritize information needs and objectives.
It can be neither possible nor desirable to subject all initially identified information needs to measurement and analysis. Priorities may also need to be set within the limits of available resources.
3. Document, review, and update measurement objectives.
Carefully consider the purposes and intended uses of measurement and analysis.
The measurement objectives are documented, reviewed by management and other relevant stakeholders, and updated as necessary. Doing so enables traceability to subsequent measurement and analysis activities, and helps to ensure that analyses will properly address identified information needs and objectives.
It is important that users of measurement and analysis results be involved in setting measurement objectives and deciding on plans of action. It may also be appropriate to involve those who provide the measurement data.
TIP: Many times, measures are collected and not used. If you have adequate reviews of the objectives, you will minimize the risk of a measure not being used.
4. Provide feedback for refining and clarifying information needs and objectives as necessary.
Identified information needs and objectives can be refined and clarified as a result of setting measurement objectives. Initial descriptions of information needs may be ambiguous. Conflicts can arise between existing needs and objectives. Precise targets on an already existing measure may be unrealistic.
5. Maintain traceability of measurement objectives to identified information needs and objectives.
There should always be a good answer to the question, "Why are we measuring this?" Of course, measurement objectives can also change to reflect evolving information needs and objectives.

SP 1.2 SPECIFY MEASURES
Specify measures to address measurement objectives.
HINT: When specifying measures, ask yourself which specific measures you will collect.
Measurement objectives are refined into precise, quantifiable measures. Measurement of project and organizational work can typically be traced to one or more measurement information categories. These categories include the following: schedule and progress, effort and cost, size and stability, and quality.


Measures can be either base or derived. Data for base measures are obtained by direct measurement. Data for derived measures come from other data, typically by combining two or more base measures.

Examples of commonly used base measures include the following:
• Estimates and actual measures of work product size (e.g., number of pages)
• Estimates and actual measures of effort and cost (e.g., number of person hours)
• Quality measures (e.g., number of defects by severity)
• Information security measures (e.g., number of system vulnerabilities identified)
• Customer satisfaction survey scores

Examples of commonly used derived measures include the following:
• Earned value
• Schedule performance index
• Defect density
• Peer review coverage
• Test or verification coverage
• Reliability measures (e.g., mean time to failure)
• Quality measures (e.g., number of defects by severity/total number of defects)
• Information security measures (e.g., percentage of system vulnerabilities mitigated)
• Customer satisfaction trends
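A derived measure is typically a simple calculation over base measures. The following sketch is illustrative only; the input values are made up and the two functions simply restate the standard definitions of defect density and the schedule performance index.

```python
# Illustrative only: two commonly used derived measures computed from
# base measures. The numbers are made up.

def defect_density(defects_found: int, size_ksloc: float) -> float:
    """Defects per thousand source lines of code."""
    return defects_found / size_ksloc

def schedule_performance_index(earned_value: float, planned_value: float) -> float:
    """SPI = earned value / planned value; a value below 1.0 indicates slippage."""
    return earned_value / planned_value

print(defect_density(defects_found=42, size_ksloc=17.5))                     # 2.4
print(schedule_performance_index(earned_value=380.0, planned_value=400.0))   # 0.95
```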

Derived measures typically are expressed as ratios, composite indices, or other aggregate summary measures. They are often more quantitatively reliable and meaningfully interpretable than the base measures used to generate them.
There are direct relationships among information needs, measurement objectives, measurement categories, base measures, and derived measures. This direct relationship is depicted using some common examples in Table 12.1.
Example Work Products
1. Specifications of base and derived measures

Subpractices 1. Identify candidate measures based on documented measurement objectives.

TABLE 12.1 Example Measurement Relationships

Business objective: Shorten time to delivery; be first to market.
Information need: What is the estimated delivery time?
Measurement objective: Provide insight into schedule fluctuations and progress.
Measurement information category: Schedule and progress.
Example base measures: Estimated and actual start and end dates by task; percentage of project on time.
Example derived measures: Milestone performance; schedule estimation accuracy.

Business objective: Increase market share by reducing costs of products and services.
Information need: How accurate are the size and cost estimates?
Measurement objective: Provide insight into actual size and costs compared to plan.
Measurement information categories: Size and effort; effort and cost.
Example base measures: Estimated and actual effort and size; estimated and actual cost.
Example derived measures: Productivity; cost performance; cost variance.

Business objective: Deliver specified functionality.
Information need: Has scope or project size grown?
Measurement objective: Provide insight into actual size compared to plan; identify unplanned growth.
Measurement information category: Size and stability.
Example base measures: Requirements count; function point count; lines of code count.
Example derived measures: Requirements volatility; size estimation accuracy; estimated vs. actual function points; amount of new, modified, and reused code.

Business objective: Reduce defects in products delivered to the customer by 10% without affecting cost.
Information need: Where are defects being inserted and detected prior to delivery?
Measurement objective: Evaluate the effectiveness of defect detection throughout the product lifecycle.
Measurement information category: Quality.
Example base measures: Number of defects inserted and detected by lifecycle phase; product size.
Example derived measures: Defect containment by lifecycle phase; defect density.

Business objective: Reduce defects in products delivered to the customer by 10% without affecting cost (continued).
Information need: What is the cost of rework?
Measurement objective: Determine the cost of correcting defects.
Measurement information category: Cost.
Example base measures: Number of defects inserted and detected by lifecycle phase; effort hours to correct defects; labor rates.
Example derived measure: Rework costs.

Business objective: Reduce information system vulnerabilities.
Information need: What is the magnitude of open system vulnerabilities?
Measurement objective: Evaluate the effectiveness of mitigating system vulnerabilities.
Measurement information category: Information assurance.
Example base measures: Number of system vulnerabilities identified and number of system vulnerabilities mitigated.
Example derived measure: Percentage of system vulnerabilities mitigated.


Measurement objectives are refined into measures. Identified candidate measures are categorized and specified by name and unit of measure.
2. Maintain traceability of measures to measurement objectives.
Interdependencies among candidate measures are identified to enable later data validation and candidate analyses in support of measurement objectives.
3. Identify existing measures that already address measurement objectives.
Specifications for measures may already exist, perhaps established for other purposes earlier or elsewhere in the organization.
4. Specify operational definitions for measures.
TIP: Operational definitions are key to effective specification of measures.
Operational definitions are stated in precise and unambiguous terms. They address two important criteria:
• Communication: What has been measured, how was it measured, what are the units of measure, and what has been included or excluded?
• Repeatability: Can the measurement be repeated, given the same definition, to get the same results?
A sketch of one such definition appears after these subpractices.
X-REF: For additional information on specifying measures, see SEI's Software Engineering Measurement and Analysis website (www.sei.cmu.edu/sema), the Practical Software & Systems Measurement website (www.psmsc.com), and the iSixSigma website (www.isixsigma.com).
5. Prioritize, review, and update measures.
Proposed specifications of measures are reviewed for their appropriateness with potential end users and other relevant stakeholders. Priorities are set or changed, and specifications of measures are updated as necessary.
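As a purely illustrative aid (the field names below are not part of the model and are assumptions), an operational definition for a base measure might be captured as a structured record that answers the communication and repeatability questions above.

```python
# Hypothetical sketch of an operational definition for one base measure.
# Field names are illustrative only.

effort_measure_spec = {
    "name": "actual_effort",
    "unit": "person-hours",
    "what_is_measured": "Hours charged to project tasks",
    "how_collected": "Exported weekly from the time-reporting system",
    "included": ["engineering tasks", "peer reviews"],
    "excluded": ["training", "organizational overhead"],
    "collection_point": "end of each reporting week",
    "owner": "project planner",
}
```

Keeping such definitions in one place supports both criteria: anyone reading the record knows what was measured and how, and repeating the collection under the same definition should yield the same result.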

SP 1.3 SPECIFY DATA COLLECTION AND STORAGE PROCEDURES Specify how measurement data are obtained and stored.

Explicit specification of collection methods helps to ensure that the right data are collected properly. This specification can also help further clarify information needs and measurement objectives.
Proper attention to storage and retrieval procedures helps to ensure that data are available and accessible for future use.
TIP: Ensuring appropriate accessibility and maintenance of data integrity are two key concerns related to data storage and retrieval.
Example Work Products
1. Data collection and storage procedures
2. Data collection tools

Subpractices 1. Identify existing sources of data that are generated from current work products, processes, or transactions. Existing sources of data may have been identified when specifying the measures. Appropriate collection mechanisms may exist whether or not pertinent data have already been collected.


2. Identify measures for which data are needed but are not currently available. 3. Specify how to collect and store the data for each required measure. Explicit specifications are made of what, how, where, and when data will be collected and stored to ensure its validity and to support later use for analysis and documentation purposes.

Questions to be considered typically include the following:
• Have the frequency of collection and the points in the process where measurements will be made been determined?
• Has the timeline that is required to move measurement results from points of collection to repositories, other databases, or end users been established?
• Who is responsible for obtaining data?
• Who is responsible for data storage, retrieval, and security?
• Have necessary supporting tools been developed or acquired?

4. Create data collection mechanisms and process guidance.
Data collection and storage mechanisms are well integrated with other normal work processes. Data collection mechanisms can include manual or automated forms and templates. Clear, concise guidance on correct procedures is available to those who are responsible for doing the work. Training is provided as needed to clarify processes required for the collection of complete and accurate data and to minimize the burden on those who provide and record data.
5. Support automatic collection of data as appropriate and feasible.
TIP: In today's environment, automation is often used. However, some organizations use several tools and databases to address their measurement needs. You need to manage the compatibility among the tools and databases carefully.
Examples of such automated support include the following:
• Time stamped activity logs
• Static or dynamic analyses of artifacts
A minimal sketch of a time-stamped activity log appears after these subpractices.
6. Prioritize, review, and update data collection and storage procedures.
Proposed procedures are reviewed for their appropriateness and feasibility with those who are responsible for providing, collecting, and storing data. They also may have useful insights about how to improve existing processes or may be able to suggest other useful measures or analyses.

7. Update measures and measurement objectives as necessary.
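One possible form of the automated support mentioned in subpractice 5 is a time-stamped activity log. The sketch below is illustrative only; it assumes a local CSV file is an acceptable collection mechanism, and the task identifier and activity names are hypothetical.

```python
# Minimal sketch of a time-stamped activity log used as an automated
# data collection mechanism (assumption: a local CSV file suffices).

import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("activity_log.csv")  # hypothetical location

def record_activity(task_id: str, activity: str, effort_hours: float) -> None:
    """Append one time-stamped activity record to the collection file."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp_utc", "task_id", "activity", "effort_hours"])
        writer.writerow([datetime.now(timezone.utc).isoformat(), task_id,
                         activity, effort_hours])

record_activity("WBS-4.2", "peer review", 1.5)
```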


SP 1.4 SPECIFY ANALYSIS PROCEDURES Specify how measurement data are analyzed and communicated.

Specifying analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address documented measurement objectives (and thereby the information needs and objectives on which they are based). This approach also provides a check that necessary data will, in fact, be collected. Analysis procedures should account for the quality (e.g., age, reliability) of all data that enter into an analysis (whether from the project, organization's measurement repository, or other source). The quality of data should be considered to help select the appropriate analysis procedure and evaluate the results of the analysis.
TIP: Often, someone can manipulate data to provide the picture he wants to convey. By specifying the analysis procedures in advance, you can minimize this type of abuse.
Example Work Products
1. Analysis specifications and procedures
2. Data analysis tools

Subpractices
1. Specify and prioritize the analyses to be conducted and the reports to be prepared.
Early on, pay attention to the analyses to be conducted and to the manner in which results will be reported. These analyses and reports should meet the following criteria:
• The analyses explicitly address the documented measurement objectives.
• Presentation of results is clearly understandable by the audiences to whom the results are addressed.
Priorities may have to be set for available resources.
2. Select appropriate data analysis methods and tools.

Issues to be considered typically include the following:
• Choice of visual display and other presentation techniques (e.g., pie charts, bar charts, histograms, radar charts, line graphs, scatter plots, tables)
• Choice of appropriate descriptive statistics (e.g., arithmetic mean, median, mode)
• Decisions about statistical sampling criteria when it is impossible or unnecessary to examine every data element
• Decisions about how to handle analysis in the presence of missing data elements
• Selection of appropriate analysis tools

Descriptive statistics are typically used in data analysis to do the following:
• Examine distributions of specified measures (e.g., central tendency, extent of variation, data points exhibiting unusual variation)
• Examine interrelationships among specified measures (e.g., comparisons of defects by phase of the product's lifecycle, comparisons of defects by product component)
• Display changes over time
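As a small illustration (made-up data, standard-library functions only, and a simple rule of thumb rather than any technique prescribed by the model), descriptive statistics of this kind can be computed and data points exhibiting unusual variation flagged as follows.

```python
# Illustrative sketch: descriptive statistics over made-up effort data,
# plus a simple flag for data points exhibiting unusual variation.

import statistics

effort_hours = [12.0, 15.5, 11.0, 14.0, 38.0, 13.5, 12.5]

mean = statistics.mean(effort_hours)
median = statistics.median(effort_hours)
stdev = statistics.stdev(effort_hours)

# Rule of thumb only: flag values more than two standard deviations from the mean.
outliers = [x for x in effort_hours if abs(x - mean) > 2 * stdev]

print(f"mean={mean:.1f}, median={median}, stdev={stdev:.1f}, outliers={outliers}")
```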

Refer to the Select Measures and Analytic Techniques specific practice and Monitor the Performance of Selected Subprocesses specific practice in the Quantitative Project Management process area for more information about the appropriate use of statistical techniques and understanding variation.
3. Specify administrative procedures for analyzing data and communicating results.
TIP: Those responsible for analyzing the data and presenting the results should include those whose activities generated the measurement data or their management whenever possible, with support provided by a process group, QA group, or measurement experts.
Issues to be considered typically include the following:
• Identifying the persons and groups responsible for analyzing the data and presenting the results
• Determining the timeline to analyze the data and present the results
• Determining the venues for communicating the results (e.g., progress reports, transmittal memos, written reports, staff meetings)

4. Review and update the proposed content and format of specified analyses and reports.
All of the proposed content and format are subject to review and revision, including analytic methods and tools, administrative procedures, and priorities. Relevant stakeholders consulted should include end users, sponsors, data analysts, and data providers.
5. Update measures and measurement objectives as necessary.
Just as measurement needs drive data analysis, clarification of analysis criteria can affect measurement. Specifications for some measures may be refined further based on specifications established for data analysis procedures. Other measures may prove unnecessary or a need for additional measures may be recognized. Specifying how measures will be analyzed and reported can also suggest the need for refining measurement objectives themselves.
X-REF: Refer to SP 1.1 when refining your measurement objectives.
6. Specify criteria for evaluating the utility of analysis results and for evaluating the conduct of measurement and analysis activities.

Criteria for evaluating the utility of the analysis might address the extent to which the following apply:
• The results are provided in a timely manner, understandable, and used for decision making.
• The work does not cost more to perform than is justified by the benefits it provides.

Criteria for evaluating the conduct of the measurement and analysis might include the extent to which the following apply:
• The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.
• There is selection bias in sampling (e.g., only satisfied end users are surveyed to evaluate end-user satisfaction, only unsuccessful projects are evaluated to determine overall productivity).
• Measurement data are repeatable (e.g., statistically reliable).
• Statistical assumptions have been satisfied (e.g., about the distribution of data, about appropriate measurement scales).
TIP: The criteria are divided into two lists. The first comprises criteria that any organization can use. The second is a bit more sophisticated and might be used by organizations after they establish their measurement program.

SG 2 PROVIDE MEASUREMENT RESULTS
Measurement results, which address identified information needs and objectives, are provided.

The primary reason for conducting measurement and analysis is to address identified information needs derived from project, organizational, and business objectives. Measurement results based on objective evidence can help to monitor progress and performance, fulfill obligations documented in a supplier agreement, make informed management and technical decisions, and enable corrective actions to be taken.
TIP: MA provides the foundation for the behavior required in high-maturity organizations. By the time organizations reach maturity level 4, management and staff will use measurement results as part of their daily work.

SP 2.1 OBTAIN MEASUREMENT DATA
Obtain specified measurement data.

The data necessary for analysis are obtained and checked for completeness and integrity.

Example Work Products 1. Base and derived measurement data sets 2. Results of data integrity tests


Subpractices
1. Obtain data for base measures.
Data are collected as necessary for previously used and newly specified base measures. Existing data are gathered from project records or elsewhere in the organization.
2. Generate data for derived measures.
Values are newly calculated for all derived measures.
3. Perform data integrity checks as close to the source of data as possible.
All measurements are subject to error in specifying or recording data. It is always better to identify these errors and sources of missing data early in the measurement and analysis cycle.
TIP: If too much time has passed, it may be inefficient or even impossible to verify the integrity of a measure or to identify the source of missing data.
Checks can include scans for missing data, out-of-bounds data values, and unusual patterns and correlation across measures. It is particularly important to do the following:
• Test and correct for inconsistency of classifications made by human judgment (i.e., to determine how frequently people make differing classification decisions based on the same information, otherwise known as "inter-coder reliability").
• Empirically examine the relationships among measures that are used to calculate additional derived measures. Doing so can ensure that important distinctions are not overlooked and that derived measures convey their intended meanings (otherwise known as "criterion validity").
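As an illustration of the simpler kinds of checks named above (the record layout and bounds are hypothetical), a scan for missing and out-of-bounds values performed close to the data source might look like this.

```python
# Hypothetical sketch of simple data integrity checks: scan for missing
# values and out-of-bounds values before data enter the repository.

records = [
    {"task": "WBS-1.1", "effort_hours": 14.0, "defects": 2},
    {"task": "WBS-1.2", "effort_hours": None, "defects": 1},   # missing value
    {"task": "WBS-1.3", "effort_hours": 9.5,  "defects": -3},  # out of bounds
]

def integrity_issues(record):
    issues = []
    if record["effort_hours"] is None:
        issues.append("missing effort_hours")
    elif not (0 <= record["effort_hours"] <= 200):   # assumed plausible range
        issues.append("effort_hours out of bounds")
    if record["defects"] is None or record["defects"] < 0:
        issues.append("invalid defect count")
    return issues

for r in records:
    problems = integrity_issues(r)
    if problems:
        print(r["task"], "->", ", ".join(problems))
```

Checks for unusual patterns, inter-coder reliability, or criterion validity would require more than this sketch shows, but the principle of checking as close to the source as possible is the same.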

SP 2.2 ANALYZE MEASUREMENT DATA Analyze and interpret measurement data.

Measurement data are analyzed as planned, additional analyses are conducted as necessary, results are reviewed with relevant stakeholders, and necessary revisions for future analyses are noted.
Example Work Products
1. Analysis results and draft reports

Subpractices
1. Conduct initial analyses, interpret results, and draw preliminary conclusions.
The results of data analyses are rarely self evident. Criteria for interpreting results and drawing conclusions should be stated explicitly.
TIP: Often, someone can misinterpret analyses and draw incorrect conclusions. By specifying criteria for interpreting results in advance, you can minimize the risk of drawing incorrect conclusions.
2. Conduct additional measurement and analysis as necessary and prepare results for presentation.


Results of planned analyses can suggest (or require) additional, unanticipated analyses. In addition, these analyses can identify needs to refine existing measures, to calculate additional derived measures, or even to collect data for additional base measures to properly complete the planned analysis. Similarly, preparing initial results for presentation can identify the need for additional, unanticipated analyses.
3. Review initial results with relevant stakeholders.
It may be appropriate to review initial interpretations of results and the way in which these results are presented before disseminating and communicating them widely. Reviewing the initial results before their release can prevent needless misunderstandings and lead to improvements in the data analysis and presentation.
Relevant stakeholders with whom reviews may be conducted include intended end users, sponsors, data analysts, and data providers.
4. Refine criteria for future analyses.
Lessons that can improve future efforts are often learned from conducting data analyses and preparing results. Similarly, ways to improve measurement specifications and data collection procedures can become apparent, as can ideas for refining identified information needs and objectives.
TIP: Measurement and analysis is a learning process. You will typically go through many cycles before the measures and analyses are fine tuned.

SP 2.3 STORE DATA AND RESULTS
Manage and store measurement data, measurement specifications, and analysis results.

Storing measurement related information enables its timely and cost effective use as historical data and results. The information also is needed to provide sufficient context for interpretation of data, measurement criteria, and analysis results.

Information stored typically includes the following:
• Measurement plans
• Specifications of measures
• Sets of data that were collected
• Analysis reports and presentations
• Retention period for data stored

Stored information contains or refers to other information needed to understand and interpret the measures and to assess them for reasonableness and applicability (e.g., measurement specifications used on different projects when comparing across projects).


Typically, data sets for derived measures can be recalculated and need not be stored. However, it may be appropriate to store summaries based on derived measures (e.g., charts, tables of results, report text). Interim analysis results need not be stored separately if they can be efficiently reconstructed.
HINT: Understand what to store, what can be recalculated or reconstructed, and what to discard.
Projects can choose to store project specific data and results in a project specific repository. When data are shared across projects, the data can reside in the organization's measurement repository.
Refer to the Configuration Management process area for more information about establishing a configuration management system.
Refer to the Establish the Organization's Measurement Repository specific practice in the Organizational Process Definition process area for more information about establishing the organization's measurement repository.

Example Work Products 1. Stored data inventory

Subpractices
1. Review data to ensure their completeness, integrity, accuracy, and currency.
2. Store data according to data storage procedures.
3. Make stored contents available for use only to appropriate groups and staff members.
4. Prevent stored information from being used inappropriately.

Examples of ways to prevent the inappropriate use of data and related information include controlling access to data and educating people on the appropriate use of data.
TIP: Inappropriate use of data will seriously undermine the credibility of your MA implementation.
Examples of the inappropriate use of data include the following:
• Disclosure of information provided in confidence
• Faulty interpretations based on incomplete, out-of-context, or otherwise misleading information
• Measures used to improperly evaluate the performance of people or to rank projects
• Impugning the integrity of individuals


SP 2.4 COMMUNICATE RESULTS Communicate results of measurement and analysis activities to all relevant stakeholders.

The results of the measurement and analysis process are communicated to relevant stakeholders in a timely and usable fashion to support decision making and assist in taking corrective action. Relevant stakeholders include intended end users, sponsors, data analysts, and data providers.
TIP: An indicator of a mature organization is the daily use of measurement data by both staff and management to guide their activities. This requires effective communication of measurement data and the results of analyses.
Example Work Products
1. Delivered reports and related analysis results
2. Contextual information or guidance to help interpret analysis results

Subpractices
1. Keep relevant stakeholders informed of measurement results in a timely manner.
To the extent possible and as part of the normal way they do business, users of measurement results are kept personally involved in setting objectives and deciding on plans of action for measurement and analysis. Users are regularly kept informed of progress and interim results.
Refer to the Project Monitoring and Control process area for more information about conducting progress reviews.
2. Assist relevant stakeholders in understanding results.
Results are communicated in a clear and concise manner appropriate to relevant stakeholders. Results are understandable, easily interpretable, and clearly tied to identified information needs and objectives.
TIP: As organizations mature, management and staff should become more comfortable with measurement, be more likely to interpret the analyses correctly, and be able to ask the right questions to help them draw the right conclusions.
The data analyzed are often not self evident to practitioners who are not measurement experts. The communication of results should be clear about the following:
• How and why base and derived measures were specified
• How data were obtained
• How to interpret results based on the data analysis methods used
• How results address information needs

Examples of actions taken to help others to understand results include the following:
• Discussing the results with relevant stakeholders
• Providing background and explanation in a document
• Briefing users on results
• Providing training on the appropriate use and understanding of measurement results


ORGANIZATIONAL PROCESS DEFINITION
A Process Management Process Area at Maturity Level 3

Purpose
The purpose of Organizational Process Definition (OPD) is to establish and maintain a usable set of organizational process assets, work environment standards, and rules and guidelines for teams.

Introductory Notes Organizational process assets enable consistent process execution TIP across the organization and provide a basis for cumulative, long-term OPD contains the specific ptg benefits to the organization. (See the definition of “organizational practices that capture the process assets” in the glossary.) organization’s requirements, standards, and guidelines to be The organization’s process asset library supports organizational used by all projects across the learning and process improvement by allowing the sharing of best organization. practices and lessons learned across the organization. (See the defini- tion of “organizational process assets” in the glossary.) HINT The organization’s set of standard processes also describes stan- As with any library, a key chal- dard interactions with suppliers. Supplier interactions are character- lenge is to enable staff to locate ized by the following typical items: deliverables expected from information quickly. Therefore, it is necessary to catalog, main- suppliers, acceptance criteria applicable to those deliverables, stan- tain, and archive information. dards (e.g., architecture and technology standards), and standard milestone and progress reviews. The organization’s “set of standard processes” is tailored by proj- ects to create their defined processes. Other organizational process assets are used to support tailoring and implementing defined processes. Work environment standards are used to guide the cre- ation of project work environments. Rules and guidelines for teams are used to aid in their structuring, formation, and operation. A “standard process” is composed of other processes (i.e., sub- processes) or process elements. A “process element” is the fundamen- tal (i.e., atomic) unit of process definition that describes activities and tasks to consistently perform work. The process architecture



provides rules for connecting the process elements of a standard process. The organization’s set of standard processes can include multiple process architectures. (See the definitions of “standard process,” “process architecture,” “subprocess,” and “process element” in the glossary.)

TIP: CMMI models try to capture the “what” and not the “how.” However, in notes and examples, guidance is provided to give you some tips on the interpretation and implementation of the concepts.
Organizational process assets can be organized in many ways, depending on the implementation of the Organizational Process Definition process area. Examples include the following:
• Descriptions of lifecycle models can be part of the organization’s set of standard processes or they can be documented separately.
• The organization’s set of standard processes can be stored in the organization’s process asset library or it can be stored separately.
• A single repository can contain both measurements and process related documentation, or they can be stored separately.

Related Process Areas

Refer to the Organizational Process Focus process area for more information about deploying organizational process assets.

Specific Goal and Practice Summary
SG 1 Establish Organizational Process Assets
SP 1.1 Establish Standard Processes
SP 1.2 Establish Lifecycle Model Descriptions
SP 1.3 Establish Tailoring Criteria and Guidelines
SP 1.4 Establish the Organization’s Measurement Repository
SP 1.5 Establish the Organization’s Process Asset Library
SP 1.6 Establish Work Environment Standards
SP 1.7 Establish Rules and Guidelines for Teams
TIP: In Version 1.3 IPPD has been eliminated and SP 1.7 was added that addresses teams.

Specific Practices by Goal
SG 1 ESTABLISH ORGANIZATIONAL PROCESS ASSETS
A set of organizational process assets is established and maintained.
TIP: Organizational process assets support a fundamental change in behavior. Projects no longer create their processes from scratch but instead use the best practices of the organization, thus improving quality and saving time and money.
SP 1.1 ESTABLISH STANDARD PROCESSES
Establish and maintain the organization’s set of standard processes.


Standard processes can be defined at multiple levels in an enterprise and they can be related hierarchically. For example, an enterprise can have a set of standard processes that is tailored by individual organizations (e.g., a division, a site) in the enterprise to establish their set of standard processes. The set of standard processes can also be tailored for each of the organization’s business areas, product lines, or standard services. Thus the organization’s set of standard processes can refer to the standard processes established at the organization level and standard processes that may be established at lower levels, although some organizations may have only one level of standard processes. (See the definitions of “standard process” and “organization’s set of standard processes” in the glossary.)
TIP: Standard processes define the key activities performed in an organization. Some examples of standard processes include requirements elicitation, design, and testing; planning, estimating, monitoring, and control; and product delivery and support.
Multiple standard processes may be needed to address the needs of different application domains, lifecycle models, methodologies, and tools. The organization’s set of standard processes contains process elements (e.g., a work product size estimating element) that may be interconnected according to one or more process architectures that describe relationships among process elements.
The organization’s set of standard processes typically includes technical, management, administrative, support, and organizational processes.
TIP: The OSSP can include processes that are not directly addressed by CMMI, such as proposal development, project approval, financial management, and procurement.
The organization’s set of standard processes should collectively cover all processes needed by the organization and projects, including those processes addressed by the process areas at maturity level 2.
Example Work Products
1. Organization’s set of standard processes
HINT: Often, organizations look at the exemplar processes from their projects as a starting point to populate the OSSP.
Subpractices
1. Decompose each standard process into constituent process elements to the detail needed to understand and describe the process.
Each process element covers a closely related set of activities. The descriptions of process elements may be templates to be filled in, fragments to be completed, abstractions to be refined, or complete descriptions to be tailored or used unmodified. These elements are described in such detail that the process, when fully defined, can be consistently performed by appropriately trained and skilled people.
TIP: The objective is to decompose and define the process so that it can be performed consistently across projects but will allow enough flexibility to meet the unique requirements of each project.

Examples of process elements include the following:
• Template for generating work product size estimates
• Description of work product design methodology
• Tailorable peer review methodology
• Template for conducting management reviews
• Templates or task flows embedded in workflow tools
• Description of methods for prequalifying suppliers as preferred suppliers

2. Specify the critical attributes of each process element.

Examples of critical attributes include the following:
• Process roles
• Applicable standards
• Applicable procedures, methods, tools, and resources
• Process performance objectives
• Entry criteria
• Inputs
• Verification points (e.g., peer reviews)
• Outputs
• Interfaces
• Exit criteria
• Product and process measures

3. Specify relationships among process elements.

Examples of relationships include the following: • Order of the process elements • Interfaces among process elements • Interfaces with external processes • Interdependencies among process elements

The rules for describing relationships among process elements are referred to as the “process architecture.” The process architecture covers essential requirements and guidelines. Detailed specifications of these relationships are covered in descriptions of defined processes that are tailored from the organization’s set of standard processes.
4. Ensure that the organization’s set of standard processes adheres to applicable policies, standards, and models.
Adherence to applicable process standards and models is typically demonstrated by developing a mapping from the organization’s set of standard processes to relevant process standards and models. This mapping is a useful input to future appraisals.
HINT: Your initial focus should first be on standardizing what you already do well.
5. Ensure that the organization’s set of standard processes satisfies process needs and objectives of the organization.
Refer to the Organizational Process Focus process area for more information about establishing organizational process needs.
6. Ensure that there is appropriate integration among processes that are included in the organization’s set of standard processes.
HINT: Break down stovepipes: When capabilities residing in different organizations are routinely needed to understand tradeoffs and resolve system-level problems, consider establishing a standard end-to-end process for performing joint work.
7. Document the organization’s set of standard processes.
8. Conduct peer reviews on the organization’s set of standard processes.
Refer to the Verification process area for more information about performing peer reviews.
9. Revise the organization’s set of standard processes as necessary.
Examples of when the organization’s set of standard processes may need to be revised include the following:
• When improvements to the process are identified
• When causal analysis and resolution data indicate that a process change is needed
• When process improvement proposals are selected for deployment across the organization
• When the organization’s process needs and objectives are updated
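Subpractices 1 through 3 above describe process elements, their critical attributes, and their relationships in prose. As one possible way to record that information uniformly, the following Python sketch shows a hypothetical process element record; the structure and field names are illustrative assumptions, not part of the CMMI model.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ProcessElement:
        """Hypothetical record for one atomic unit of process definition."""
        name: str
        roles: List[str]                 # process roles
        entry_criteria: List[str]
        inputs: List[str]
        outputs: List[str]
        exit_criteria: List[str]
        verification_points: List[str]   # e.g., peer reviews
        measures: List[str]              # product and process measures
        standards: List[str] = field(default_factory=list)
        predecessors: List[str] = field(default_factory=list)  # ordering relationships to other elements

    # Example: a peer review element that follows the design element in the process architecture
    peer_review = ProcessElement(
        name="Tailorable peer review",
        roles=["Author", "Reviewer", "Moderator"],
        entry_criteria=["Work product ready for review"],
        inputs=["Work product", "Review checklist"],
        outputs=["Defect list", "Review report"],
        exit_criteria=["Defects dispositioned"],
        verification_points=["Moderator sign-off"],
        measures=["Defects found", "Review effort (person hours)"],
        predecessors=["Work product design"],
    )

Keeping such records in one format makes it easier to check that each element specifies the attributes the organization requires.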

SP 1.2 ESTABLISH LIFECYCLE MODEL DESCRIPTIONS Establish and maintain descriptions of lifecycle models approved for use in the organization.

Lifecycle models can be developed for a variety of customers or in a variety of situations, since one lifecycle model may not be appropriate for all situations. Lifecycle models are often used to define phases of the project. Also, the organization can define different lifecycle models for each type of product and service it delivers.
TIP: When managing a project, it is helpful to have a standard description for the phases the project moves through (i.e., project lifecycle model) to organize and assess the adequacy of project activities and to monitor progress.
Example Work Products
1. Descriptions of lifecycle models

Subpractices 1. Select lifecycle models based on the needs of projects and the organization.

Examples of project lifecycle models include the following:
• Waterfall or Serial
• Spiral
• Evolutionary
• Incremental
• Iterative

2. Document descriptions of lifecycle models.
Lifecycle models can be documented as part of the organization’s standard process descriptions or they can be documented separately.
TIP: It helps to provide guidance about which lifecycle models work best with which types of projects.
3. Conduct peer reviews on lifecycle models.
Refer to the Verification process area for more information about performing peer reviews.
4. Revise the descriptions of lifecycle models as necessary.

SP 1.3 ESTABLISH TAILORING CRITERIA AND GUIDELINES

Establish and maintain tailoring criteria and guidelines for the organization’s set of standard processes.
TIP: Tailoring allows projects to adapt the OSSP and other assets to meet their needs. The challenge is to provide guidance that has sufficient flexibility to meet the unique needs of each project but at the same time ensure meaningful consistency.
Tailoring criteria and guidelines describe the following:
• How the organization’s set of standard processes and organizational process assets are used to create defined processes
• Requirements that must be satisfied by defined processes (e.g., the subset of organizational process assets that are essential for any defined process)
• Options that can be exercised and criteria for selecting among options
• Procedures that must be followed in performing and documenting process tailoring

Examples of reasons for tailoring include the following: • Adapting the process to a new product line or work environment • Elaborating the process description so that the resulting defined process can be performed • Customizing the process for an application or class of similar applications

Flexibility in tailoring and defining processes is balanced with ensuring appropriate consistency of processes across the organization. Flexibility is needed to address contextual variables such as the domain; the nature of the customer; cost, schedule, and quality tradeoffs; the technical difficulty of the work; and the experience of the people implementing the process. Consistency across the organization is needed so that organizational standards, objectives, and strategies are appropriately addressed, and process data and lessons learned can be shared.
TIP: Finding this balance usually takes time, as the organization gains experience from using these assets.
Tailoring is a critical activity that allows controlled changes to processes due to the specific needs of a project or a part of the organization. Processes and process elements that are directly related to critical business objectives should usually be defined as mandatory, but processes and process elements that are less critical or only indirectly affect business objectives may allow for more tailoring.
The amount of tailoring could also depend on the project’s lifecycle model, the use of suppliers, and other factors. Tailoring criteria and guidelines can allow for using a standard process “as is,” with no tailoring.
Example Work Products
1. Tailoring guidelines for the organization’s set of standard processes

Subpractices
1. Specify selection criteria and procedures for tailoring the organization’s set of standard processes.

Examples of criteria and procedures include the following:
• Criteria for selecting lifecycle models from the ones approved by the organization
• Criteria for selecting process elements from the organization’s set of standard processes
• Procedures for tailoring selected lifecycle models and process elements to accommodate process characteristics and needs
• Procedures for adapting the organization’s common measures to address information needs

Examples of tailoring include the following: • Modifying a lifecycle model • Combining elements of different lifecycle models • Modifying process elements • Replacing process elements • Reordering process elements


2. Specify the standards used for documenting defined processes.
3. Specify the procedures used for submitting and obtaining approval of waivers from the organization’s set of standard processes.
HINT: Streamline the waiver process to enable new projects to establish their defined process quickly, and to avoid stalling.
4. Document tailoring guidelines for the organization’s set of standard processes.
5. Conduct peer reviews on the tailoring guidelines.
Refer to the Verification process area for more information about performing peer reviews.
TIP: Both the tailoring process and guidelines may be documented as part of the OSSP.
6. Revise tailoring guidelines as necessary.
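Tailoring guidelines are easier to apply, and needed waivers easier to spot, when they are recorded in a structured form that a project can check its defined process against. The Python sketch below is a hypothetical illustration; the specific mandatory elements, options, and rule names are assumptions, not prescribed content.

    # Hypothetical tailoring guideline: mandatory elements, approved options,
    # and changes that require a waiver.
    tailoring_guidelines = {
        "mandatory_elements": ["Project planning", "Configuration management", "Peer review"],
        "approved_options": {
            "lifecycle_model": ["Waterfall", "Incremental", "Iterative"],
            "peer_review_type": ["Inspection", "Walkthrough"],
        },
        "waiver_required_for": ["Omitting peer reviews", "Replacing the standard estimating procedure"],
    }

    def check_tailoring(defined_elements, selections):
        """Return issues a project must resolve or obtain waivers for."""
        issues = [f"Missing mandatory element: {e}"
                  for e in tailoring_guidelines["mandatory_elements"]
                  if e not in defined_elements]
        for option, choice in selections.items():
            allowed = tailoring_guidelines["approved_options"].get(option, [])
            if choice not in allowed:
                issues.append(f"{option} = {choice} is not an approved option; submit a waiver")
        return issues

    # Example: a project that dropped peer reviews and chose an unapproved lifecycle model
    print(check_tailoring(
        defined_elements=["Project planning", "Configuration management"],
        selections={"lifecycle_model": "Agile-Scrum"},
    ))

A check like this only supports the approval procedures described above; the decision to grant a waiver remains with the people designated in the guidelines.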

SP 1.4 ESTABLISH THE ORGANIZATION’S MEASUREMENT REPOSITORY
Establish and maintain the organization’s measurement repository.
TIP: The organization’s measurement repository is a critical resource that helps new projects plan by providing answers to questions about projects similar to their own (e.g., How long did it take? How much effort was expended? What was the resulting quality?).
Refer to the Use Organizational Process Assets for Planning Project Activities specific practice in the Integrated Project Management process area for more information about the use of the organization’s measurement repository in planning project activities.
The repository contains both product and process measures that are related to the organization’s set of standard processes. It also contains or refers to information needed to understand and interpret measures and to assess them for reasonableness and applicability. For example, the definitions of measures are used to compare similar measures from different processes.
TIP: Although this practice concentrates on repository establishment and maintenance, the real value occurs when the people in the organization begin using the data when establishing defined processes and plans.
Example Work Products
1. Definition of the common set of product and process measures for the organization’s set of standard processes
2. Design of the organization’s measurement repository
3. Organization’s measurement repository (i.e., the repository structure, support environment)
4. Organization’s measurement data

Subpractices
1. Determine the organization’s needs for storing, retrieving, and analyzing measurements.
2. Define a common set of process and product measures for the organization’s set of standard processes.
Measures in the common set are selected for their ability to provide visibility into processes critical to achieving business objectives and to focus on process elements significantly impacting cost, schedule, and performance within a project and across the organization. The common set of measures can vary for different standard processes.
TIP: These measures change over time and therefore should be reviewed periodically.


Measures defined include the ones related to agreement management, some of which may need to be collected from suppliers.
Operational definitions for measures specify procedures for collecting valid data and the point in the process where data will be collected.

Examples of classes of commonly used measures include the following:
• Estimates of work product size (e.g., pages)
• Estimates of effort and cost (e.g., person hours)
• Actual measures of size, effort, and cost
• Test coverage
• Reliability measures (e.g., mean time to failure)
• Quality measures (e.g., number of defects found, severity of defects)
• Peer review coverage
X-REF: Measurement and analysis practices (see MA) are a prerequisite to establishing the organization’s measurement repository.

3. Design and implement the measurement repository.
Functions of the measurement repository include the following:
• Supporting effective comparison and interpretation of measurement data among projects
• Providing sufficient context to allow a new project to quickly identify and access data in the repository for similar projects
• Enabling projects to improve the accuracy of their estimates by using their own and other projects’ historical data
• Aiding in the understanding of process performance
• Supporting potential statistical management of processes or subprocesses, as needed
4. Specify procedures for storing, updating, and retrieving measures.
Refer to the Measurement and Analysis process area for more information about specifying data collection and storage procedures.
5. Conduct peer reviews on definitions of the common set of measures and procedures for storing, updating, and retrieving measures.
Refer to the Verification process area for more information about performing peer reviews.
6. Enter specified measures into the repository.
Refer to the Measurement and Analysis process area for more information about specifying measures.
TIP: Entering measurements into a repository is commonly an automated process; however, when not automated, it should be done by the person collecting the measurements.
7. Make the contents of the measurement repository available for use by the organization and projects as appropriate.
8. Revise the measurement repository, the common set of measures, and procedures as the organization’s needs change.

Examples of when the common set of measures may need to be revised include the following:
• New processes are added
• Processes are revised and new measures are needed
• Finer granularity of data is required
• Greater visibility into the process is required
• Measures are retired
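The repository functions listed in subpractice 3 above amount to storing measures with enough project context that a new project can find comparable historical data. The Python sketch below is a minimal illustration under assumed field names and invented data; it is not a prescribed repository design.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class MeasurementRecord:
        project: str
        domain: str                  # context used to find similar projects
        lifecycle_model: str
        size_pages: int              # actual work product size
        effort_person_hours: float
        defects_found: int

    repository: List[MeasurementRecord] = [
        MeasurementRecord("Project A", "avionics", "Incremental", 420, 3100.0, 57),
        MeasurementRecord("Project B", "avionics", "Waterfall", 380, 2800.0, 64),
        MeasurementRecord("Project C", "web", "Iterative", 150, 900.0, 21),
    ]

    def similar_projects(domain: str) -> List[MeasurementRecord]:
        """Give a new project context-matched historical data for its estimates."""
        return [r for r in repository if r.domain == domain]

    def average_effort(domain: str) -> float:
        matches = similar_projects(domain)
        return sum(r.effort_person_hours for r in matches) / len(matches) if matches else 0.0

    print(average_effort("avionics"))   # 2950.0

Operational definitions (how and where each measure is collected) would accompany such records so that data from different projects remain comparable.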

SP 1.5 ESTABLISH THE ORGANIZATION’S PROCESS ASSET LIBRARY Establish and maintain the organization’s process asset library.

HINT: Think of why you are storing this information and how often it will be retrieved.
Examples of items to be stored in the organization’s process asset library include the following:
• Organizational policies
• Process descriptions
• Procedures (e.g., estimating procedure)
• Development plans
• Acquisition plans
• Quality assurance plans
• Training materials
• Process aids (e.g., checklists)
• Lessons learned reports

Example Work Products 1. Design of the organization’s process asset library 2. The organization’s process asset library 3. Selected items to be included in the organization’s process asset library 4. The catalog of items in the organization’s process asset library

Subpractices
1. Design and implement the organization’s process asset library, including the library structure and support environment.
TIP: A major objective of the process asset library (PAL) is to ensure that information is easy to locate and use.
2. Specify criteria for including items in the library.
Items are selected based primarily on their relationship to the organization’s set of standard processes.
3. Specify procedures for storing, updating, and retrieving items.
4. Enter selected items into the library and catalog them for easy reference and retrieval.


5. Make items available for use by projects.
6. Periodically review the use of each item.
TIP: Library maintenance can quickly become an issue if all documents from every project are stored in the library.
7. Revise the organization’s process asset library as necessary.
Examples of when the library may need to be revised include the following:
• New items are added
• Items are retired
• Current versions of items are changed
TIP: Some organizations regularly review their PAL contents every 12 to 18 months to decide what to discard or archive.
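Because the main objective of the library is that items be easy to locate, use, and eventually retire, each stored item is typically cataloged with descriptive metadata. The Python sketch below shows one hypothetical catalog entry; the fields and values are illustrative assumptions.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class CatalogEntry:
        item_id: str
        title: str
        asset_type: str        # e.g., "policy", "procedure", "checklist", "lessons learned"
        owner: str
        version: str
        last_reviewed: date    # supports periodic review and archive/retire decisions
        location: str          # where the item itself is stored

    entry = CatalogEntry(
        item_id="PAL-0042",
        title="Estimating procedure",
        asset_type="procedure",
        owner="Engineering Process Group",
        version="2.1",
        last_reviewed=date(2010, 11, 15),
        location="pal/procedures/estimating-procedure-v2.1.pdf",
    )

Queries over such metadata (by type, owner, or review date) support both quick retrieval and the periodic review described in subpractice 6.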

SP 1.6 ESTABLISH WORK ENVIRONMENT STANDARDS
Establish and maintain work environment standards.
TIP: Work environment standards must make sense for your organization, its line of business, the degree of collaboration to be supported, and so on.
Work environment standards allow the organization and projects to benefit from common tools, training, and maintenance, as well as cost savings from volume purchases. Work environment standards address the needs of all stakeholders and consider productivity, cost, availability, security, and workplace health, safety, and ergonomic factors. Work environment standards can include guidelines for tailoring and the use of waivers that allow adaptation of the project’s work environment to meet needs.
HINT: If your organization has a shared vision, your work environment must support it.

TIP: Typically, projects have additional requirements for their work environment. This specific practice establishes the standards to be addressed across the organization.
Examples of work environment standards include the following:
• Procedures for the operation, safety, and security of the work environment
• Standard workstation hardware and software
• Standard application software and tailoring guidelines for it
• Standard production and calibration equipment
• Process for requesting and approving tailoring or waivers

Example Work Products 1. Work environment standards

Subpractices
1. Evaluate commercially available work environment standards appropriate for the organization.
2. Adopt existing work environment standards and develop new ones to fill gaps based on the organization’s process needs and objectives.


SP 1.7 ESTABLISH RULES AND GUIDELINES FOR TEAMS
Establish and maintain organizational rules and guidelines for the structure, formation, and operation of teams.
TIP: Much of the power of teams comes from the excitement that team members share in working together to overcome challenges. However, over time, team members can neglect the needs of their profession and career growth. If this happens across the organization, the organization will gradually lose its core competencies.
Operating rules and guidelines for teams define and control how teams are created and how they interact to accomplish objectives. Team members should understand the standards for work and participate according to those standards.
When establishing rules and guidelines for teams, ensure they comply with all local and national regulations or laws that can affect the use of teams.
Structuring teams involves defining the number of teams, the type of each team, and how each team relates to the others in the structure. Forming teams involves chartering each team, assigning team members and team leaders, and providing resources to each team to accomplish work.
TIP: Teams cannot operate as “high-performance teams” if they have to go to management for approval of every action or decision.
Example Work Products
1. Rules and guidelines for structuring and forming teams
2. Operating rules for teams
Subpractices
1. Establish and maintain empowerment mechanisms to enable timely decision making.
In a successful teaming environment, clear channels of responsibility and authority are established by documenting and deploying organizational guidelines that clearly define the empowerment of teams.
TIP: The authority initially given to a newly formed team may be expanded later as project phases are completed and the team demonstrates mature use of the authority granted. The rules for the degree of empowerment should support making such adjustments.
2. Establish and maintain rules and guidelines for structuring and forming teams.
Organizational process assets can help the project to structure and implement teams. Such assets can include the following:
• Team structure guidelines
• Team formation guidelines
• Team authority and responsibility guidelines
• Guidelines for establishing lines of communication, authority, and escalation
• Team leader selection criteria

3. Define the expectations, rules, and guidelines that guide how teams work collectively.

These rules and guidelines establish organizational practices for consistency across teams and can include the following:
• How interfaces among teams are established and maintained
• How assignments are accepted and transferred
• How resources and inputs are accessed
• How work gets done
• Who checks, reviews, and approves work
• How work is approved
• How work is delivered and communicated
• Who reports to whom
• What the reporting requirements (e.g., cost, schedule, performance status), measures, and methods are
• Which progress reporting measures and methods are used




ORGANIZATIONAL PROCESS FOCUS A Process Management Process Area at Maturity Level 3

Purpose
The purpose of Organizational Process Focus (OPF) is to plan, implement, and deploy organizational process improvements based on a thorough understanding of current strengths and weaknesses of the organization’s processes and process assets.
TIP: As Watts Humphrey said, “If you don’t know where you are, a map won’t help.” Benchmark the processes and practices in your organization before you begin to improve them.
Introductory Notes
The organization’s processes include all processes used by the organization and its projects. Candidate improvements to the organization’s processes and process assets are obtained from various sources, including the measurement of processes, lessons learned in implementing processes, results of process appraisals, results of product and service evaluation activities, results of customer satisfaction evaluations, results of benchmarking against other organizations’ processes, and recommendations from other improvement initiatives in the organization.
TIP: Although CMMI describes many of the processes that are critical to success, it does not contain everything. Therefore, you may improve processes, such as meeting management and proposal development, which might not be discussed in CMMI.
Process improvement occurs in the context of the organization’s needs and is used to address the organization’s objectives. The organization encourages participation in process improvement activities by those who perform the process. The responsibility for facilitating and managing the organization’s process improvement activities, including coordinating the participation of others, is typically assigned to a process group. The organization provides the long-term commitment and resources required to sponsor this group and to ensure the effective and timely deployment of improvements.
TIP: Project participation is essential to any process improvement effort.
TIP: Especially in the early phases of process improvement, the process group must visibly demonstrate the organization’s investment in process improvement.
Careful planning is required to ensure that process improvement efforts across the organization are adequately managed and implemented. Results of the organization’s process improvement planning are documented in a process improvement plan.



The “organization’s process improvement plan” addresses appraisal planning, process action planning, pilot planning, and deployment planning. Appraisal plans describe the appraisal timeline and schedule, the scope of the appraisal, resources required to perform the appraisal, the reference model against which the appraisal will be performed, and logistics for the appraisal.
HINT: Run your process improvement like a project or series of projects. Use CMMI practices to help you plan, implement, and manage your process improvement activities.
Process action plans usually result from appraisals and document how improvements targeting weaknesses uncovered by an appraisal will be implemented. Sometimes the improvement described in the process action plan should be tested on a small group before deploying it across the organization. In these cases, a pilot plan is generated.
When the improvement is to be deployed, a deployment plan is created. This plan describes when and how the improvement will be deployed across the organization.
TIP: With any type of change, an investment is required. These activities may require weeks, months, or even years. One challenge is to demonstrate improvements the organization can see quickly.
Organizational process assets are used to describe, implement, and improve the organization’s processes. (See the definition of “organizational process assets” in the glossary.)

Related Process Areas

Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.

Specific Goal and Practice Summary
SG 1 Determine Process Improvement Opportunities
SP 1.1 Establish Organizational Process Needs
SP 1.2 Appraise the Organization’s Processes
SP 1.3 Identify the Organization’s Process Improvements
SG 2 Plan and Implement Process Actions
SP 2.1 Establish Process Action Plans
SP 2.2 Implement Process Action Plans
SG 3 Deploy Organizational Process Assets and Incorporate Experiences
SP 3.1 Deploy Organizational Process Assets
SP 3.2 Deploy Standard Processes
SP 3.3 Monitor the Implementation
SP 3.4 Incorporate Experiences into Organizational Process Assets

Specific Practices by Goal

SG 1 DETERMINE PROCESS IMPROVEMENT OPPORTUNITIES Strengths, weaknesses, and improvement opportunities for the organization’s processes are identified periodically and as needed.


Strengths, weaknesses, and improvement opportunities can be determined relative to a process standard or model such as a CMMI model or ISO standard. Process improvements should be selected to address the organization’s needs.
Process improvement opportunities can arise as a result of changing business objectives, legal and regulatory requirements, and results of benchmarking studies.

SP 1.1 ESTABLISH ORGANIZATIONAL PROCESS NEEDS
Establish and maintain the description of process needs and objectives for the organization.

The organization’s processes operate in a business context that should be understood. The organization’s business objectives, needs, and constraints determine the needs and objectives for the organization’s processes. Typically, issues related to customer satisfaction, finance, technology, quality, human resources, and marketing are important process considerations.
TIP: Process improvement must relate directly to the business’s objectives.

The organization’s process needs and objectives cover aspects that include the following:
• Characteristics of processes
• Process performance objectives, such as time-to-market and delivered quality
• Process effectiveness

Example Work Products 1. The organization’s process needs and objectives

Subpractices
1. Identify policies, standards, and business objectives that are applicable to the organization’s processes.

Examples of standards include the following:
• ISO/IEC 12207:2008 Systems and Software Engineering – Software Life Cycle Processes [ISO 2008a]
• ISO/IEC 15288:2008 Systems and Software Engineering – System Life Cycle Processes [ISO 2008b]
• ISO/IEC 27001:2005 Information technology – Security techniques – Information Security Management Systems – Requirements [ISO/IEC 2005]
• ISO/IEC 14764:2006 Software Engineering – Software Life Cycle Processes – Maintenance [ISO 2006b]
• ISO/IEC 20000 Information Technology – Service Management [ISO 2005b]
• Assurance Focus for CMMI [DHS 2009]
• NDIA Engineering for System Assurance Guidebook [NDIA 2008]
• Resiliency Management Model [SEI 2010d]

2. Examine relevant process standards and models for best practices. 3. Determine the organization’s process performance objectives. Process performance objectives can be expressed in quantitative or qualitative terms. Refer to the Measurement and Analysis process area for more information about establishing measurement objectives. Refer to the Organizational Process Performance process area for more information about establishing quality and process performance objectives.

Examples of process performance objectives include the following:
• Achieve a customer satisfaction rating of a certain value
• Ensure product reliability is at least a certain percentage
• Reduce defect insertion rate by a certain percentage
• Achieve a certain cycle time for a given activity
• Improve productivity by a given percentage
• Simplify the requirements approval workflow
• Improve quality of products delivered to customer

4. Define essential characteristics of the organization’s processes. Essential characteristics of the organization’s processes are deter- mined based on the following: • Processes currently being used in the organization • Standards imposed by the organization • Standards commonly imposed by customers of the organization

Examples of process characteristics include the following:
• Level of detail
• Process notation
• Granularity

5. Document the organization’s process needs and objectives. 6. Revise the organization’s process needs and objectives as needed.


SP 1.2 APPRAISE THE ORGANIZATION’S PROCESSES
Appraise the organization’s processes periodically and as needed to maintain an understanding of their strengths and weaknesses.
HINT: Select the appraisal method that matches the purpose and information needed. Guide your selection by knowing the amount of information needed and the importance of its accuracy.
Process appraisals can be performed for the following reasons:
• To identify processes to be improved
• To confirm progress and make the benefits of process improvement visible
• To satisfy the needs of a customer-supplier relationship
• To motivate and facilitate buy-in

The buy-in gained during a process appraisal can be eroded significantly if it is not followed by an appraisal based action plan.
Example Work Products
1. Plans for the organization’s process appraisals
2. Appraisal findings that address strengths and weaknesses of the organization’s processes
3. Improvement recommendations for the organization’s processes
Subpractices
1. Obtain sponsorship of the process appraisal from senior management.
Senior management sponsorship includes the commitment to have the organization’s managers and staff participate in the process appraisal and to provide resources and funding to analyze and communicate findings of the appraisal.
TIP: The commitment of resources to the appraisal must be visible throughout the organization.
2. Define the scope of the process appraisal.
Process appraisals can be performed on the entire organization or can be performed on a smaller part of an organization such as a single project or business area. The scope of the process appraisal addresses the following:
• Definition of the organization (e.g., sites, business areas) to be covered by the appraisal
• Identification of the project and support functions that will represent the organization in the appraisal
• Processes to be appraised
3. Determine the method and criteria to be used for the process appraisal.
Process appraisals can occur in many forms. They should address the needs and objectives of the organization, which can change over time.


For example, the appraisal can be based on a process model, such as a CMMI model, or on a national or international standard, such as ISO 9001 [ISO 2008c]. Appraisals can also be based on a benchmark comparison with other organizations in which practices that can contribute to improved organizational performance are identified. The characteristics of the appraisal method may vary, including time and effort, makeup of the appraisal team, and the method and depth of investigation.
TIP: Examples of appraisal methods include SCAMPI A, B, and C, as well as gap analysis and surveys.
4. Plan, schedule, and prepare for the process appraisal.
5. Conduct the process appraisal.
6. Document and deliver the appraisal’s activities and findings.

SP 1.3 IDENTIFY THE ORGANIZATION’S PROCESS IMPROVEMENTS Identify improvements to the organization’s processes and process assets.

Example Work Products 1. Analysis of candidate process improvements 2. Identification of improvements for the organization’s processes

Subpractices
1. Determine candidate process improvements.
HINT: In the early stages of process improvement, there are more candidate improvements than resources to address them. Prioritize these opportunities to be most effective.
Candidate process improvements are typically determined by doing the following:
• Measuring processes and analyzing measurement results
• Reviewing processes for effectiveness and suitability
• Assessing customer satisfaction
• Reviewing lessons learned from tailoring the organization’s set of standard processes
• Reviewing lessons learned from implementing processes
• Reviewing process improvement proposals submitted by the organization’s managers, staff, and other relevant stakeholders
• Soliciting inputs on process improvements from senior management and other leaders in the organization
• Examining results of process appraisals and other process related reviews
• Reviewing results of other organizational improvement initiatives
2. Prioritize candidate process improvements.
HINT: Choose improvements that are visible to the organization, have a defined scope, and can be addressed successfully by available resources. If you try to do too much too quickly, it may result in failure and cause the improvement program to be questioned.
Criteria for prioritization are as follows:
• Consider the estimated cost and effort to implement the process improvements.


• Evaluate the expected improvement against the organization’s improvement objectives and priorities. • Determine the potential barriers to the process improvements and develop strategies for overcoming these barriers.

Examples of techniques to help determine and prioritize possible improvements to be implemented include the following:
• A cost benefit analysis that compares the estimated cost and effort to implement the process improvements and their associated benefits
• A gap analysis that compares current conditions in the organization with optimal conditions
• Force field analysis of potential improvements to identify potential barriers and strategies for overcoming those barriers
• Cause-and-effect analyses to provide information on the potential effects of different improvements that can then be compared

3. Identify and document the process improvements to be implemented. 4. Revise the list of planned process improvements to keep it current.
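A lightweight way to apply the prioritization criteria and techniques listed above is to score each candidate improvement on expected benefit, implementation cost, and barrier risk, then rank the results as one input to selection. The weights and 1-to-5 scales in the Python sketch below are assumptions for illustration, not part of the model.

    def priority_score(benefit, cost, barrier_risk, weights=(0.5, 0.3, 0.2)):
        """Higher is better: reward expected benefit, penalize cost and barriers (all rated 1-5)."""
        w_benefit, w_cost, w_risk = weights
        return w_benefit * benefit - w_cost * cost - w_risk * barrier_risk

    candidates = {
        "Automate regression testing": priority_score(benefit=5, cost=3, barrier_risk=2),
        "Standardize peer review checklist": priority_score(benefit=3, cost=1, barrier_risk=1),
        "Replace defect tracking tool": priority_score(benefit=4, cost=5, barrier_risk=4),
    }

    # Rank candidates; the highest-scoring improvements are considered first.
    for name, score in sorted(candidates.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{score:5.2f}  {name}")

Such a score only complements, and does not replace, the cost benefit, gap, force field, and cause-and-effect analyses mentioned above.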

SG 2 PLAN AND IMPLEMENT PROCESS ACTIONS
Process actions that address improvements to the organization’s processes and process assets are planned and implemented.
TIP: Organizational process assets are those created by the activities in OPD.
The successful implementation of improvements requires participation in process action planning and implementation by process owners, those who perform the process, and support organizations.
TIP: Most of the organization should be involved in these activities.
SP 2.1 ESTABLISH PROCESS ACTION PLANS
Establish and maintain process action plans to address improvements to the organization’s processes and process assets.

Establishing and maintaining process action plans typically involves the following roles: • Management steering committees that set strategies and oversee process improvement activities • Process groups that facilitate and manage process improvement activities • Process action teams that define and implement process actions • Process owners that manage deployment • Practitioners that perform the process


Stakeholder involvement helps to obtain buy-in on process improvements and increases the likelihood of effective deployment.
Process action plans are detailed implementation plans. These plans differ from the organization’s process improvement plan by targeting improvements that were defined to address weaknesses and that were usually uncovered by appraisals.
TIP: Depending on the magnitude of the improvement, a process action plan can look similar to a project plan. If the improvement is small, the plan can look similar to a plan for a routine maintenance activity.
Example Work Products
1. The organization’s approved process action plans

Subpractices
1. Identify strategies, approaches, and actions to address identified process improvements.
New, unproven, and major changes are piloted before they are incorporated into normal use.
2. Establish process action teams to implement actions.
The teams and people performing the process improvement actions are called “process action teams.” Process action teams typically include process owners and those who perform the process.
3. Document process action plans.
Process action plans typically cover the following:
• Process improvement infrastructure
• Process improvement objectives
• Process improvements to be addressed
• Procedures for planning and tracking process actions
• Strategies for piloting and implementing process actions
• Responsibility and authority for implementing process actions
• Resources, schedules, and assignments for implementing process actions
• Methods for determining the effectiveness of process actions
• Risks associated with process action plans

4. Review and negotiate process action plans with relevant stakeholders. 5. Revise process action plans as necessary.

SP 2.2 IMPLEMENT PROCESS ACTION PLANS
Implement process action plans.
TIP: Depending on the size of the organization and the extent of the change, the implementation activity can take days, weeks, months, or even years.
Example Work Products
1. Commitments among process action teams


2. Status and results of implementing process action plans 3. Plans for pilots

Subpractices 1. Make process action plans readily available to relevant stakeholders. 2. Negotiate and document commitments among process action teams and revise their process action plans as necessary. 3. Track progress and commitments against process action plans.

4. Conduct joint reviews with process action teams and relevant stakeholders to monitor the progress and results of process actions.
5. Plan pilots needed to test selected process improvements.
6. Review the activities and work products of process action teams.
7. Identify, document, and track to closure issues encountered when implementing process action plans.
8. Ensure that results of implementing process action plans satisfy the organization’s process improvement objectives.

SG 3 DEPLOY ORGANIZATIONAL PROCESS ASSETS AND INCORPORATE EXPERIENCES
Organizational process assets are deployed across the organization and process related experiences are incorporated into organizational process assets.
TIP: IPM, OPF, and OPD are tightly related. OPD defines the organizational assets. OPF manages them, deploys them across the organization, and collects feedback. IPM uses the assets on the project and provides feedback to the organization.
The specific practices under this specific goal describe ongoing activities. New opportunities to benefit from organizational process assets and changes to them can arise throughout the life of each project. Deployment of standard processes and other organizational process assets should be continually supported in the organization, particularly for new projects at startup.

SP 3.1 DEPLOY ORGANIZATIONAL PROCESS ASSETS Deploy organizational process assets across the organization.

Deploying organizational process assets or changes to them should be performed in an orderly manner. Some organizational process assets or changes to them may not be appropriate for use in some parts of the organization (e.g., because of stakeholder requirements, or the current lifecycle phase being implemented). It is therefore important that those who are or will be executing the process, as well as other organization functions (e.g., training, quality assurance), be involved in deployment as necessary.
HINT: Be sure to think about retiring the assets and work products that the change replaces.


Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.

Example Work Products 1. Plans for deploying organizational process assets and changes to them across the organization 2. Training materials for deploying organizational process assets and changes to them 3. Documentation of changes to organizational process assets 4. Support materials for deploying organizational process assets and changes to them

Subpractices 1. Deploy organizational process assets across the organization.

Typical activities performed as a part of the deployment of process assets include the following:
• Identifying organizational process assets that should be adopted by those who perform the process
• Determining how organizational process assets are made available (e.g., via a website)
• Identifying how changes to organizational process assets are communicated
• Identifying resources (e.g., methods, tools) needed to support the use of organizational process assets
• Planning the deployment
• Assisting those who use organizational process assets
• Ensuring that training is available for those who use organizational process assets
TIP: When planning the deployment of organizational process assets, you may want to consider a staggered release to address the needs of projects at different stages in the lifecycle.

Refer to the Organizational Training process area for more information about establishing an organizational training capability.
2. Document changes to organizational process assets.
Documenting changes to organizational process assets serves two main purposes:
• To enable the communication of changes
• To understand the relationship of changes in the organizational process assets to changes in process performance and results
TIP: You may find that you have multiple versions of an asset in use in your organization because of suggested changes. Just like a project, you will need to have change control practices in place for your organizational assets.
3. Deploy changes that were made to organizational process assets across the organization.

Typical activities performed as a part of deploying changes include the following:
• Determining which changes are appropriate for those who perform the process
• Planning the deployment
• Arranging for the support needed for the successful transition of changes

4. Provide guidance and consultation on the use of organizational process assets.

SP 3.2 DEPLOY STANDARD PROCESSES
Deploy the organization’s set of standard processes to projects at their startup and deploy changes to them as appropriate throughout the life of each project.

It is important that new projects use proven and effective processes to perform critical early activities (e.g., project planning, receiving requirements, obtaining resources).
Projects should also periodically update their defined processes to incorporate the latest changes made to the organization’s set of standard processes when it will benefit them. This periodic update helps to ensure that all project activities derive the full benefit of what other projects have learned.
Refer to the Organizational Process Definition process area for more information about establishing standard processes and establishing tailoring criteria and guidelines.

Example Work Products
1. The organization’s list of projects and the status of process deployment on each (i.e., existing and planned projects)
2. Guidelines for deploying the organization’s set of standard processes on new projects
3. Records of tailoring and implementing the organization’s set of standard processes

Subpractices
1. Identify projects in the organization that are starting up.
2. Identify active projects that would benefit from implementing the organization’s current set of standard processes.
3. Establish plans to implement the organization’s current set of standard processes on the identified projects.


4. Assist projects in tailoring the organization’s set of standard processes to meet their needs.
Refer to the Integrated Project Management process area for more information about establishing the project’s defined process.
5. Maintain records of tailoring and implementing processes on the identified projects.
TIP: Records of tailoring and implementing processes will be useful when the organization determines what changes to make to the organizational process assets and provides guidance on using the changed assets.
6. Ensure that the defined processes resulting from process tailoring are incorporated into plans for process compliance audits.
Process compliance audits are objective evaluations of project activities against the project’s defined process.
7. As the organization’s set of standard processes is updated, identify which projects should implement the changes.

SP 3.3 MONITOR THE IMPLEMENTATION Monitor the implementation of the organization’s set of standard processes and use of process assets on all projects.

By monitoring implementation, the organization ensures that the organization’s set of standard processes and other process assets are appropriately deployed to all projects. Monitoring implementation also helps the organization to develop an understanding of the organizational process assets being used and where they are used in the organization. Monitoring also helps to establish a broader context for interpreting and using process and product measures, lessons learned, and improvement information obtained from projects.
Example Work Products
1. Results of monitoring process implementation on projects
2. Status and results of process compliance audits
3. Results of reviewing selected process artifacts created as part of process tailoring and implementation

Subpractices
1. Monitor the projects’ use of organizational process assets and changes to them.
2. Review selected process artifacts created during the life of each project.
Reviewing selected process artifacts created during the life of a project ensures that all projects are making appropriate use of the organization’s set of standard processes.
3. Review results of process compliance audits to determine how well the organization’s set of standard processes has been deployed.


Refer to the Process and Product Quality Assurance process area for more information about objectively evaluating processes.
4. Identify, document, and track to closure issues related to implementing the organization’s set of standard processes.

SP 3.4 INCORPORATE EXPERIENCES INTO ORGANIZATIONAL PROCESS ASSETS
Incorporate process related experiences derived from planning and performing the process into organizational process assets.
Example Work Products
1. Process improvement proposals
2. Process lessons learned
3. Measurements of organizational process assets
4. Improvement recommendations for organizational process assets
5. Records of the organization’s process improvement activities
6. Information on organizational process assets and improvements to them

Subpractices
1. Conduct periodic reviews of the effectiveness and suitability of the organization’s set of standard processes and related organizational process assets relative to the process needs and objectives derived from the organization’s business objectives.
2. Obtain feedback about the use of organizational process assets.
TIP: Some feedback may be collected as part of QA activities.
3. Derive lessons learned from defining, piloting, implementing, and deploying organizational process assets.
4. Make lessons learned available to people in the organization as appropriate.
TIP: Lessons learned are usually made available through the library established in OPD.
Actions may be necessary to ensure that lessons learned are used appropriately.

Examples of the inappropriate use of lessons learned include the following: • Evaluating the performance of people • Judging process performance or results

Examples of ways to prevent the inappropriate use of lessons learned include the following: • Controlling access to lessons learned • Educating people about the appropriate use of lessons learned


5. Analyze measurement data obtained from the use of the organization’s common set of measures.
Refer to the Measurement and Analysis process area for more information about analyzing measurement data.
Refer to the Organizational Process Definition process area for more information about establishing the organization’s measurement repository.
TIP: Common sets of measures are usually kept in the organization’s measurement repository, established in OPD.
6. Appraise processes, methods, and tools in use in the organization and develop recommendations for improving organizational process assets.
TIP: We use the word appraise here more in line with Webster’s definition of the word, rather than the appraisal term we use in CMMI: “to set a value on; estimate the amount of; or to evaluate the worth, significance or status of; especially, to give an expert judgment of the value or merit of.”
This appraisal typically includes the following:
• Determining which processes, methods, and tools are of potential use to other parts of the organization
• Appraising the quality and effectiveness of organizational process assets
• Identifying candidate improvements to organizational process assets
• Determining compliance with the organization’s set of standard processes and tailoring guidelines

7. Make the best of the organization’s processes, methods, and tools available to people in the organization as appropriate.
8. Manage process improvement proposals.
Process improvement proposals can address both process and technology improvements.

The activities for managing process improvement proposals typically include the following: • Soliciting process improvement proposals • Collecting process improvement proposals • Reviewing process improvement proposals • Selecting the process improvement proposals to be implemented • Tracking the implementation of process improvement proposals

Process improvement proposals are documented as process change requests or problem reports as appropriate. Some process improvement proposals can be incorporated into the organization’s process action plans. 9. Establish and maintain records of the organization’s process improvement activities.

ORGANIZATIONAL PERFORMANCE MANAGEMENT A Process Management Process Area at Maturity Level 5

Purpose
The purpose of Organizational Performance Management (OPM) is to proactively manage the organization’s performance to meet its business objectives.
TIP: OPM focuses the organization on improving its performance to meet business objectives.
Introductory Notes
The Organizational Performance Management process area enables the organization to manage organizational performance by iteratively analyzing aggregated project data, identifying gaps in performance against the business objectives, and selecting and deploying improvements to close the gaps.
In this process area, the term “improvement” includes all incremental and innovative process and technology improvements, including those improvements made to project work environments. “Improvement” refers to all ideas that would change the organization’s processes, technologies, and performance to better meet the organization’s business objectives and associated quality and process performance objectives.
TIP: Changes must be measurably better. In early improvement efforts, it is more difficult to determine the effects of changes and whether things have really become better.
Business objectives that this process area might address include the following:

• Improved product quality (e.g., functionality, quality attributes)
• Increased productivity
• Increased process efficiency and effectiveness
• Increased consistency in meeting budget and schedule
• Decreased cycle time
• Greater customer and end-user satisfaction
• Shorter development or production time to change functionality, add new features, or adapt to new technologies



• Improved performance of a supply chain involving multiple suppliers
• Improved use of resources across the organization

TIP: Those who execute a process are the best source for suggestions on how it can be improved. It is thus important to motivate everyone in the organization to reflect on their work and suggest potential improvements.

The organization analyzes product and process performance data from the projects to determine if it is capable of meeting the quality and process performance objectives. Process performance baselines and process performance models, developed using Organizational Process Performance processes, are used as part of the analysis. Causal Analysis and Resolution processes can also be used to identify potential areas of improvement or specific improvement proposals.
The organization identifies and proactively solicits incremental and innovative improvements from within the organization and from external sources such as academia, competitive intelligence, and successful improvements implemented elsewhere.

TIP: The ability to manage change is one of the key characteristics of a mature organization; another is when the majority (70% to 90%) of the work force is involved in proposing and evaluating changes.

Realization of the improvements and their effects on the quality and process performance objectives depends on being able to effectively identify, evaluate, implement, and deploy improvements to the organization's processes and technologies.
Realization of the improvements and beneficial effects also depends on engaging the workforce in identifying and evaluating possible improvements and maintaining a focus on long-term planning that includes the identification of innovations.

HINT: Customers and suppliers can also participate in the organization's process improvement activities. Processes and technologies that improve the relationships across those boundaries can be improved as well.

Improvement proposals are evaluated and validated for their effectiveness in the target environment. Based on this evaluation, improvements are prioritized and selected for deployment to new and ongoing projects. Deployment is managed in accordance with the deployment plan and performance data are analyzed using statistical and other quantitative techniques to determine the effects of the improvement on quality and process performance objectives.
This improvement cycle continually optimizes organizational processes based on quality and process performance objectives. Business objectives are periodically reviewed to ensure they are current and quality and process performance objectives are updated as appropriate.

HINT: If a proposed improvement will have little impact on achieving the organization's objectives, it is probably not worth pursuing. However, occasionally, it may indicate an opportunity missed by those who created the objectives; so if appropriate, revisit the objectives to see whether they should be updated.

The Organizational Process Focus process area includes no assumptions about the quantitative basis for identifying improvements, nor their expected results. This process area extends the Organizational Process Focus practices by focusing on process improvement based on a quantitative understanding of the organization's set of standard processes and technologies and their expected quality and process performance.

TIP: Although many changes may individually have merit, the organization can expend only so much attention and resources to performance improvement. Some kind of evaluation and ranking leading to a selection is necessary.


The specific practices of this process area apply to organizations whose projects are quantitatively managed. Use of the specific practices of this process area can add value in other situations, but the results may not provide the same degree of impact to the organization's quality and process performance objectives.

TIP: OPF may provide sufficient assistance to your initial process improvement efforts. Implement OPM after the organization has established assets (OPP) that enable quantitative project management (QPM) and has developed some facility in the use of statistical and other quantitative techniques in both project and organizational settings.

Related Process Areas
Refer to the Causal Analysis and Resolution process area for more information about identifying causes of selected outcomes and taking action to improve process performance.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
Refer to the Organizational Process Focus process area for more information about planning, implementing, and deploying organizational process improvements based on a thorough understanding of current strengths and weaknesses of the organization's processes and process assets.
Refer to the Organizational Process Performance process area for more information about establishing quality and process performance objectives and establishing process performance baselines and models.
Refer to the Organizational Training process area for more information about providing training.

X-REF: See The Innovator's Solution: Creating and Sustaining Successful Growth by Clayton M. Christensen and Michael E. Raynor (Harvard Business Press).
X-REF: See The Strategy Paradox: Why Committing to Success Leads to Failure (And What to Do About It) by Michael E. Raynor (Crown Business).

Specific Goal and Practice Summary
SG 1 Manage Business Performance
  SP 1.1 Maintain Business Objectives
  SP 1.2 Analyze Process Performance Data
  SP 1.3 Identify Potential Areas for Improvement
SG 2 Select Improvements
  SP 2.1 Elicit Suggested Improvements
  SP 2.2 Analyze Suggested Improvements
  SP 2.3 Validate Improvements
  SP 2.4 Select and Implement Improvements for Deployment
SG 3 Deploy Improvements
  SP 3.1 Plan the Deployment
  SP 3.2 Manage the Deployment
  SP 3.3 Evaluate Improvement Effects

Specific Practices by Goal

SG 1 MANAGE BUSINESS PERFORMANCE
The organization's business performance is managed using statistical and other quantitative techniques to understand process performance shortfalls, and to identify areas for process improvement.

Managing business performance requires the following:
• Maintaining the organization's business objectives
• Understanding the organization's ability to meet the business objectives
• Continually improving processes related to achieving the business objectives

X-REF: These three bullets correspond respectively to SPs 1.1, 1.2, and 1.3 through 3.3 (i.e., the rest of the PA).
TIP: In government, NGOs, and nonprofit organizations, the term "business performance" may include both "business performance" as well as "mission performance."

The organization uses defined process performance baselines to determine if the current and projected organizational business objectives are being met. Shortfalls in process performance are identified and analyzed to determine potential areas for process improvement.
Refer to the Organizational Process Performance process area for more information about establishing performance baselines and models.

TIP: The term shortfall in OPM refers to the gap between desired business performance (as characterized in business objectives or quality and process performance objectives [QPPOs]) and actual business performance (as characterized in process performance baselines [PPBs]).

As the organization improves its process performance or as business strategies change, new business objectives are identified and associated quality and process performance objectives are derived. Specific goal 2 addresses eliciting and analyzing improvement suggestions that address shortfalls in achieving quality and process performance objectives.

SP 1.1 MAINTAIN BUSINESS OBJECTIVES
Maintain business objectives based on an understanding of business strategies and actual performance results.

TIP: Without a well defined and communicated set of objectives for the organization, improvements are less likely to be mutually reinforcing and can even be at cross purposes.
TIP: This practice is the responsibility of senior management and should not be delegated.

Organizational performance data, characterized by process performance baselines, are used to evaluate whether business objectives are realistic and aligned with business strategies. After business objectives have been revised and prioritized by senior management, quality and process performance objectives may need to be created or maintained and re-communicated.

Example Work Products
1. Revised business objectives


2. Revised quality and process performance objectives
3. Senior management approval of revised business objectives and quality and process performance objectives
4. Communication of all revised objectives
5. Updated process performance measures

TIP: Understanding actual performance is key to establishing realistic objectives with some "stretch" in them (e.g., as in Hoshin planning; see Wikipedia).

Subpractices
1. Evaluate business objectives periodically to ensure they are aligned with business strategies.
Senior management is responsible for understanding the marketplace, establishing business strategies, and establishing business objectives. Because business strategies and organizational performance evolve, business objectives should be reviewed periodically to determine whether they should be updated. For example, a business objective might be retired when process performance data show that the business objective is being met consistently over time or when the associated business strategy has changed.

X-REF: The business strategy literature is vast, an early milestone being Michael Porter's framework for business strategy development (see "Porter five forces analysis" in Wikipedia). The Christensen and Raynor books mentioned earlier are more recent, addressing how to nurture innovations to create sustained growth. Both books summarize many findings from the broader business strategy literature.

2. Compare business objectives with actual process performance results to ensure they are realistic.
Business objectives can set the bar too high to motivate real improvement. Using process performance baselines helps balance desires and reality. If process performance baselines are unavailable, sampling techniques can be used to develop a quantitative basis for comparison in a short period of time.
3. Prioritize business objectives based on documented criteria, such as the ability to win new business, retain existing clients, or accomplish other key business strategies.
4. Maintain quality and process performance objectives to address changes in business objectives.
Business objectives and quality and process performance objectives will typically evolve over time. As existing objectives are achieved, they will be monitored to ensure they continue to be met, while new business objectives and associated quality and process performance objectives are identified and managed.
Refer to the Organizational Process Performance process area for more information about establishing quality and process performance objectives.

TIP: Business objectives are refined into QPPOs (e.g., as described in OPP SP 1.1), which then serve as the "touchpoint" between senior management and the organization's performance management and analysis activities as described in the rest of this PA and OPP.

5. Revise process performance measures to align with quality and process performance objectives.
Refer to the Organizational Process Performance process area for more information about establishing process performance measures.


SP 1.2 ANALYZE PROCESS PERFORMANCE DATA

Analyze process performance data to determine the organization's ability to meet identified business objectives.

TIP: Is the organization achieving its QPPOs? Where are the shortfalls?

The data that result from applying the process performance measures, which are defined using Organizational Process Performance processes, are analyzed to create process performance baselines that help in understanding the current capability of the organization. Comparing process performance baselines to quality and process performance objectives helps the organization to determine its ability to meet business objectives. These data typically are collected from project level process performance data to enable organizational analysis.

HINT: Poor supplier performance can impact achievement of the QPPOs of the organization. Make sure you clearly understand the contribution of suppliers to achieving your QPPOs and understand where you have direct control of outcomes versus influence.

Example Work Products
1. Analysis of current capability vs. business objectives
2. Process performance shortfalls
3. Risks associated with meeting business objectives

Subpractices
1. Periodically compare quality and process performance objectives to current process performance baselines to evaluate the ability of the organization to meet its business objectives.
For example, if cycle time is a critical business need, many different cycle time measures may be collected by the organization. Overall cycle time performance data should be compared to the business objectives to understand if expected performance will satisfy business objectives.

HINT: Analyses using PPBs, process simulations, and process performance models (PPMs) can assist in evaluating the ability of the organization to meet its business objectives.

2. Identify shortfalls where the actual process performance is not satisfying the business objectives.
3. Identify and analyze risks associated with not meeting business objectives.
4. Report results of the process performance and risk analyses to organizational leadership.

TIP: Process performance shortfalls limit the ability of the organization to pursue its chosen business strategies.
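As a purely illustrative sketch (not part of the model text), the comparison in subpractice 1 might look like the following Python fragment, which summarizes a hypothetical cycle-time baseline from project data and checks it against an assumed business objective. The data, the 40-day threshold, and the variable names are all invented for this example.

```python
import statistics

# Hypothetical cycle times (in days) reported by recent projects;
# in practice these would come from the organization's measurement repository.
cycle_times = [42, 38, 51, 45, 40, 47, 55, 43, 39, 48]

objective_days = 40  # assumed business objective: average cycle time of 40 days or less

baseline_mean = statistics.mean(cycle_times)
baseline_stdev = statistics.stdev(cycle_times)

# A simple characterization of the baseline and of the shortfall, if any.
shortfall = max(0.0, baseline_mean - objective_days)

print(f"Baseline: mean={baseline_mean:.1f} days, stdev={baseline_stdev:.1f} days")
if shortfall:
    print(f"Shortfall of {shortfall:.1f} days against the {objective_days}-day objective")
else:
    print("Current performance meets the cycle-time objective")
```

In practice, an organization would characterize the baseline more fully (e.g., with control charts or percentiles) rather than a single mean, but the basic idea of comparing a baseline statistic to a quantitative objective is the same.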

SP 1.3 IDENTIFY POTENTIAL AREAS FOR IMPROVEMENT

Identify potential areas for improvement that could contribute to meeting business objectives.

TIP: Which areas should the organization target to resolve shortfalls?

Potential areas for improvement are identified through a proactive analysis to determine areas that could address process performance shortfalls. Causal Analysis and Resolution processes can be used to diagnose and resolve root causes.
The output from this activity is used to evaluate and prioritize potential improvements, and can result in either incremental or innovative improvement suggestions as described in specific goal 2.

HINT: PPBs, process simulations, and PPMs can assist in these analyses.

Example Work Products
1. Potential areas for improvement

Subpractices
1. Identify potential improvement areas based on the analysis of process performance shortfalls.
Performance shortfalls include not meeting productivity, cycle time, or customer satisfaction objectives. Examples of areas to consider for improvement include product technology, process technology, staffing and staff development, team structures, supplier selection and management, and other organizational infrastructures.
2. Document the rationale for the potential improvement areas, including references to applicable business objectives and process performance data.
3. Document anticipated costs and benefits associated with addressing potential improvement areas.
4. Communicate the set of potential improvement areas for further evaluation, prioritization, and use.

SG 2 SELECT IMPROVEMENTS
Improvements are proactively identified, evaluated using statistical and other quantitative techniques, and selected for deployment based on their contribution to meeting quality and process performance objectives.

Improvements to be deployed across the organization are selected from improvement suggestions which have been evaluated for effectiveness in the target deployment environment. These improvement suggestions are elicited and submitted from across the organization to address the improvement areas identified in specific goal 1. Evaluations of improvement suggestions are based on the following:

• A quantitative understanding of the organization's current quality and process performance
• Satisfaction of the organization's quality and process performance objectives


• Estimated costs and impacts of developing and deploying the improvements, resources, and funding available for deployment
• Estimated benefits in quality and process performance resulting from deploying the improvements

SP 2.1 ELICIT SUGGESTED IMPROVEMENTS
Elicit and categorize suggested improvements.

TIP: Improvement suggestions, whether incremental or innovative, are elicited.

This practice focuses on eliciting suggested improvements and includes categorizing suggested improvements as incremental or innovative.

HINT: Everyone in the organization should be motivated to submit improvement suggestions, and know how to submit one.

Incremental improvements generally originate with those who do the work (i.e., users of the process or technology). Incremental improvements can be simple and inexpensive to implement and deploy. Incremental improvement suggestions are analyzed, but, if selected, may not need rigorous validation or piloting. Innovative improvements such as new or redesigned processes are more transformational than incremental improvements.

TIP: This practice is similar in approach to the Elicit Requirements specific practice in RD. Potential improvements are proactively sought after rather than only passively collected.

Innovative improvements often arise out of a systematic search for solutions to particular performance issues or opportunities to improve performance. They are identified by those who are trained and experienced with the maturation of particular technologies or whose job it is to track or directly contribute to increased performance.
Innovations can be found externally by actively monitoring innovations used in other organizations or documented in the research literature. Innovations can also be found by looking internally (e.g., by examining project lessons learned). Innovations are inspired by the need to achieve quality and process performance objectives, the need to improve performance baselines, or the external business environment.

TIP: Investigating innovative improvements is an ongoing activity that involves monitoring the marketplace for innovations that could benefit the organization and help it achieve its objectives.

Examples of incremental improvements include the following:
• Adding an item to a peer review checklist.
• Combining the technical review and management review for suppliers into a single review.
• Introducing an incident workaround.
• Substituting a new component.
• Making minor updates to a tool.

Examples of innovative improvements typically include additions or major updates to the following:
• Computer and related hardware products
• Transformational support tools
• New or redesigned workflows
• Processes or lifecycle models
• Interface standards
• Reusable components
• Management techniques and methodologies
• Quality improvement techniques and methodologies
• Development techniques and methodologies

Some suggested improvements may be received in the form of a proposal (e.g., an organizational improvement proposal arising from a causal analysis and resolution activity). These suggested improvements will have been analyzed and documented prior to input to Organizational Performance Management processes. When suggested improvements are received as proposals, the proposals are reviewed for completeness and are evaluated as part of the selection process for implementation.
Improvement searches can involve looking outside the organization, deriving innovations from projects using Causal Analysis and Resolution processes, using competitive business intelligence, or analyzing existing organizational performance.

Example Work Products
1. Suggested incremental improvements
2. Suggested innovative improvements

Subpractices

1. Elicit suggested improvements.
These suggestions document potential improvements to processes and technologies. Managers and staff in the organization as well as customers, end users, and suppliers can submit suggestions. The organization can also search the academic and technology communities for suggested improvements. Some suggested improvements may have been implemented at the project level before being proposed for the organization.

HINT: You can collect suggestions using open-ended mechanisms, surveys, or focus groups.

Examples of sources for improvements include the following:
• Findings and recommendations from process appraisals
• The organization's quality and process performance objectives
• Analysis of data about customer and end-user problems as well as customer and end-user satisfaction
• Results of process and product benchmarking efforts
• Measured effectiveness of process activities
• Measured effectiveness of project work environments
• Examples of improvements that were successfully adopted elsewhere
• Feedback on previous improvements
• Spontaneous ideas from managers and staff
• Improvement proposals from Causal Analysis and Resolution processes resulting from implemented actions with proven effectiveness
• Analysis of technical performance measures
• Analysis of data on defect causes
• Analysis of project and organizational performance compared to quality and productivity objectives

Refer to the Organizational Process Focus process area for more information about deploying organizational process assets and incorporating experiences.
2. Identify suggested improvements as incremental or innovative.
3. Investigate innovative improvements that may improve the organization's processes and technologies.

Investigating innovative improvements typically involves the following:
• Maintaining awareness of leading relevant technical work and technology trends
• Searching for commercially available innovative improvements
• Collecting proposals for innovative improvements from the projects and the organization
• Reviewing processes and technologies used externally and comparing them to the processes and technologies used in the organization
• Identifying areas where innovative improvements have been used successfully, and reviewing data and documentation of experience using these improvements
• Identifying improvements that integrate new technology into products and project work environments


SP 2.2 ANALYZE SUGGESTED IMPROVEMENTS
Analyze suggested improvements for their possible impact on achieving the organization's quality and process performance objectives.

TIP: Elicited improvement suggestions are analyzed (evaluated) for impacts.

Suggested improvements are incremental and innovative improvements that are analyzed and possibly selected for validation, implementation, and deployment throughout the organization.

X-REF: When analyzing innovations, it is important to consider their potential contribution to business strategy and growth. For more information, refer to the Christensen and Raynor books mentioned near the Related Process Areas section.

Example Work Products
1. Suggested improvement proposals
2. Selected improvements to be validated

Subpractices
1. Analyze the costs and benefits of suggested improvements.
Process performance models provide insight into the effect of process changes on process capability and performance.
Refer to the Organizational Process Performance process area for more information about establishing process performance models.
Improvement suggestions that have a large cost-to-benefit ratio or that would not improve the organization's processes may be rejected.

TIP: These SP 2.2 subpractices represent the discipline and rigor that is expected of high-maturity organizations and is typically not possible at earlier stages of process improvement.

Criteria for evaluating costs and benefits include the following:
• Contribution toward meeting the organization's quality and process performance objectives
• Effect on mitigating identified project and organizational risks
• Ability to respond quickly to changes in project requirements, market situations, and the business environment
• Effect on related processes and associated assets
• Cost of defining and collecting data that support the measurement and analysis of the process and technology improvement
• Expected life span of the improvement
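A hypothetical sketch of how such a cost-benefit comparison could be supported is shown below. Everything in it (the suggestion names, figures, and the benefit-to-cost threshold) is invented for illustration and is not prescribed by the model.

```python
# Rank suggested improvements by estimated benefit-to-cost ratio with respect
# to a quality and process performance objective. Illustrative data only.
suggestions = [
    {"name": "Automate regression tests", "est_benefit": 120_000, "est_cost": 40_000},
    {"name": "Add peer review checklist item", "est_benefit": 15_000, "est_cost": 2_000},
    {"name": "Replace build toolchain", "est_benefit": 60_000, "est_cost": 90_000},
]

MIN_RATIO = 1.0  # assumed threshold: estimated benefits must at least cover costs

for s in suggestions:
    s["ratio"] = s["est_benefit"] / s["est_cost"]

# Reject suggestions whose benefit does not justify the cost, then rank the rest.
candidates = sorted(
    (s for s in suggestions if s["ratio"] >= MIN_RATIO),
    key=lambda s: s["ratio"],
    reverse=True,
)

for s in candidates:
    print(f'{s["name"]}: benefit/cost = {s["ratio"]:.1f}')
```

A real evaluation would of course weigh the other criteria listed above (risk, responsiveness, data collection cost, life span), not just a single ratio.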

2. Identify potential barriers and risks to deploying each suggested improvement.

TIP: To identify barriers to deployment, it is helpful to understand the organization's attitude toward change and its ability to change. Such knowledge should guide how changes, especially large or complicated ones, are implemented.

Examples of barriers to deploying improvements include the following:
• Turf guarding and parochial perspectives
• Unclear or weak business rationale
• Lack of short-term benefits and visible successes
• Unclear picture of what is expected from everyone
• Too many changes at the same time
• Lack of involvement and support from relevant stakeholders

Examples of risk factors that affect the deployment of improvements include the following:
• Compatibility of the improvement with existing processes, values, and skills of potential end users
• Complexity of the improvement
• Difficulty implementing the improvement
• Ability to demonstrate the value of the improvement before widespread deployment
• Justification for large, up-front investments in areas such as tools and training
• Inability to overcome "technology drag" where the current implementation is used successfully by a large and mature installed base of end users

3. Estimate the cost, effort, and schedule required for implementing, verifying, and deploying each suggested improvement.
4. Select suggested improvements for validation and possible implementation and deployment based on the evaluations.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
5. Document the evaluation results of each selected improvement suggestion in an improvement proposal.
The proposal should include a problem statement, a plan (including cost and schedule, risk handling, and the method for evaluating effectiveness in the target environment) for implementing the improvement, and quantitative success criteria for evaluating actual results of the deployment.

TIP: For selected improvement suggestions, an "improvement proposal" incorporates the results of the SP 2.2 subpractices including analyses and evaluations, a plan for implementing the improvement, changes needed, validation method, and success criteria.

6. Determine the detailed changes needed to implement the improvement and document them in the improvement proposal.
7. Determine the validation method that will be used before broad-scale deployment of the change and document it in the improvement proposal.
• Determining the validation method includes defining the quantitative success criteria that will be used to evaluate results of the validation.
• Since innovations, by definition, represent a major change with high impact, most innovative improvements will be piloted. Other validation methods, including modeling and simulation, can be used as appropriate.
8. Document results of the selection process.

Results of the selection process usually include the following:
• The disposition of each suggested improvement
• The rationale for the disposition of each suggested improvement

SP 2.3 VALIDATE IMPROVEMENTS
Validate selected improvements.

TIP: Selected improvement suggestions are further evaluated through a validation.

Selected improvements are validated in accordance with their improvement proposals.

Examples of validation methods include the following:
• Discussions with stakeholders, perhaps in the context of a formal review
• Prototype demonstrations
• Pilots of suggested improvements
• Modeling and simulation

Pilots can be conducted to evaluate significant changes involving untried, high-risk, or innovative improvements before they are broadly deployed. Not all improvements need the rigor of a pilot. Criteria for selecting improvements for piloting are defined and used.

TIP: Another purpose of a pilot is to gauge a change's applicability to other projects.

Factors such as risk, transformational nature of change, or number of functional areas affected will determine the need for a pilot of the improvement.

TIP: A pilot may involve a single project or a group of projects.

Red-lined or rough-draft process documentation can be made available for use in piloting.

Example Work Products
1. Validation plans
2. Validation evaluation reports
3. Documented lessons learned from validation

Subpractices
1. Plan the validation.
Quantitative success criteria documented in the improvement proposal can be useful when planning validation.

TIP: Because of the need for careful coordination, validation may be planned in the same manner as projects.

Wow! eBook 344 PART TWO THE PROCESS AREAS

Validation plans for selected improvements to be piloted should include target projects, project characteristics, a schedule for reporting results, and measurement activities.

HINT: When planning a validation, decide who should participate, how it should be conducted, how to collect results, and what information to collect to decide on broad-scale deployment.

2. Review and get relevant stakeholder agreement on validation plans.
3. Consult with and assist those who perform the validation.
4. Create a trial implementation, in accordance with the validation plan, for selected improvements to be piloted.
5. Perform each validation in an environment that is similar to the environment present in a broad scale deployment.
6. Track validation against validation plans.
7. Review and document the results of validation.
Validation results are evaluated using the quantitative criteria defined in the improvement proposal.

Reviewing and documenting results of pilots typically involves the following activities:
• Reviewing pilot results with stakeholders
• Deciding whether to terminate the pilot, rework implementation of the improvement, replan and continue the pilot, or proceed with deployment
• Updating the disposition of improvement proposals associated with the pilot
• Identifying and documenting new improvement proposals as appropriate
• Identifying and documenting lessons learned and problems encountered during the pilot including feedback to the improvement team and changes to the improvement

SP 2.4 SELECT AND IMPLEMENT IMPROVEMENTS FOR DEPLOYMENT
Select and implement improvements for deployment throughout the organization based on an evaluation of costs, benefits, and other factors.

TIP: This practice not only selects improvements for deployment, it prepares the organization for their deployment by reviewing impacts to commitments and skills; and identifying and making changes to training, work environments, and process assets.

Selection of suggested improvements for deployment is based on cost-to-benefit ratios with regard to quality and process performance objectives, available resources, and the results of improvement proposal evaluation and validation activities.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

Example Work Products
1. Improvements selected for deployment
2. Updated process documentation and training


Subpractices
1. Prioritize improvements for deployment.
The priority of an improvement is based on an evaluation of its estimated cost-to-benefit ratio with regard to the quality and process performance objectives as compared to the performance baselines. Return on investment can be used as a basis of comparison.

TIP: There is a need to balance stability with change. You can't afford to make every promising change; therefore, you must be selective about which changes you deploy across the organization.

2. Select improvements to be deployed.
Selection of improvements to be deployed is based on their priorities, available resources, and results of improvement proposal evaluation and validation activities.
3. Determine how to deploy each improvement.

Examples of where the improvements may be deployed include the following:
• Project specific or common work environments
• Product families
• Organization's projects
• Organizational groups

4. Document results of the selection process.

Results of the selection process usually include the following:
• The selection criteria for suggested improvements
• The characteristics of the target projects
• The disposition of each improvement proposal
• The rationale for the disposition of each improvement proposal

HINT: Documenting the results of selection can help if the business environment changes enough for you to reconsider the decision and make a different selection.

5. Review any changes needed to implement the improvements.

Examples of changes needed to deploy an improvement include the following:
• Process descriptions, standards, and procedures
• Work environments
• Education and training
• Skills
• Existing commitments
• Existing activities
• Continuing support to end users
• Organizational culture and characteristics


6. Update the organizational process assets.
Updating the organizational process assets typically includes reviewing them, gaining approval for them, and communicating them.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.

TIP: PPBs and PPMs are examples of the organizational process assets that will need to be updated. A review of the validation data may help in identifying what revisions will be needed to these assets to support the "first wave" of projects that will be using the to-be-deployed improvements.

SG 3 DEPLOY IMPROVEMENTS
Measurable improvements to the organization's processes and technologies are deployed and evaluated using statistical and other quantitative techniques.

TIP: Whereas the activities described in SG 2 are typically "ongoing," the detailed planning, deployment, and evaluation of improvements called for in SG 3 are typically performed periodically (e.g., annually).

Once improvements are selected for deployment, a plan for deployment is created and executed. The deployment of improvements is managed and the effects of the improvements are measured and evaluated as to how well they contribute to meeting quality and process performance objectives.

SP 3.1 PLAN THE DEPLOYMENT
Establish and maintain plans for deploying selected improvements.

TIP: Planning the deployment may repeat many of the considerations described in SP 2.4 to reflect changes to assumptions about the conditions, target, pacing, and timing of the deployment.

The plans for deploying selected improvements can be included in the plan for organizational performance management, in improvement proposals, or in separate deployment documents.
This specific practice complements the Deploy Organizational Process Assets specific practice in the Organizational Process Focus process area and adds the use of quantitative data to guide the deployment and to determine the value of improvements.
Refer to the Organizational Process Focus process area for more information about deploying organizational process assets and incorporating experiences.

HINT: Depending on the magnitude of the change, it could take months or years before the change is fully deployed. Therefore, it is important to think about the retirement of those processes and products that the change will replace.

Example Work Products
1. Deployment plans for selected improvements

Subpractices
1. Determine how each improvement should be adjusted for deployment.
Improvements identified in a limited context (e.g., for a single improvement proposal) might need to be modified for a selected portion of the organization.


2. Identify strategies that address the potential barriers to deploying each improvement that were defined in the improvement proposals.

TIP: For example, one strategy could be to deploy sets of related improvements incrementally across the organization, so that PPBs and PPMs established in early increments are available for use in later increments.

3. Identify the target project population for deployment of the improvement.
Not all projects are good candidates for all improvements. For example, improvements may be targeted to software only projects, COTS integration projects, or operations and support projects.
4. Establish measures and objectives for determining the value of each improvement with respect to the organization's quality and process performance objectives.
Measures can be based on the quantitative success criteria documented in the improvement proposal or derived from organizational objectives.

Examples of measures for determining the value of an improvement include the following:

• Measured improvement in the project’s or organization’s process per- OPM formance • Time to recover the cost of the improvement • Number and types of project and organizational risks mitigated by the process or technology improvement ptg • Average time required to respond to changes in project requirements, market situations, and the business environment

Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
5. Document the plans for deploying selected improvements.
The deployment plans should include relevant stakeholders, risk strategies, target projects, measures of success, and schedule.
6. Review and get agreement with relevant stakeholders on the plans for deploying selected improvements.
Relevant stakeholders include the improvement sponsor, target projects, support organizations, etc.
7. Revise the plans for deploying selected improvements as necessary.

SP 3.2 MANAGE THE DEPLOYMENT
Manage the deployment of selected improvements.

This specific practice can overlap with the Implement Action Proposals specific practice in the Causal Analysis and Resolution process area (e.g., when causal analysis and resolution is used organizationally or across multiple projects).

Example Work Products
1. Updated training materials (to reflect deployed improvements)
2. Documented results of improvement deployment activities
3. Revised improvement measures, objectives, priorities, and deployment plans

Subpractices
1. Monitor the deployment of improvements using deployment plans.
2. Coordinate the deployment of improvements across the organization.

TIP: One of the goals of most organizations is to be nimble and agile. Therefore, it is necessary to learn how to introduce changes quickly and yet correctly.

Coordinating deployment includes the following activities:
• Coordinating activities of projects, support groups, and organizational groups for each improvement
• Coordinating activities for deploying related improvements

3. Deploy improvements in a controlled and disciplined manner.

Examples of methods for deploying improvements include the following:
• Deploying improvements incrementally rather than as a single deployment
• Providing comprehensive consulting to early adopters of improvement in lieu of revised formal training

4. Coordinate the deployment of improvements into the projects' defined processes as appropriate.
Refer to the Organizational Process Focus process area for more information about deploying organizational process assets and incorporating experiences.
5. Provide consulting as appropriate to support deployment of improvements.

TIP: Extensive or complex improvements may require much support, such as training, user support, or feedback on use of the new or updated process.

6. Provide updated training materials or develop communication packages to reflect improvements to organizational process assets.
Refer to the Organizational Training process area for more information about providing training.
7. Confirm that the deployment of all improvements is completed in accordance with the deployment plan.

TIP: Ensure that any unanticipated consequences have been addressed.


8. Document and review results of improvement deployment.

Documenting and reviewing results includes the following:
• Identifying and documenting lessons learned
• Revising improvement measures, objectives, priorities, and deployment plans

SP 3.3 EVALUATE IMPROVEMENT EFFECTS

Evaluate the effects of deployed improvements on quality and process performance using statistical and other quantitative techniques.

TIP: This specific practice looks at measuring the effects of improvements being deployed across the organization. In particular, what is the impact on achieving QPPOs? Is it time to move the "goal post" and introduce new "stretch" QPPOs (see SP 1.1)?

Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
This specific practice can overlap with the Evaluate the Effect of Implemented Actions specific practice in the Causal Analysis and Resolution process area (e.g., when causal analysis and resolution is applied organizationally or across multiple projects).

Example Work Products
1. Documented measures of the effects resulting from deployed improvements

Subpractices
1. Measure the results of each improvement as implemented on the target projects, using the measures defined in the deployment plans.
2. Measure and analyze progress toward achieving the organization's quality and process performance objectives using statistical and other quantitative techniques and take corrective action as needed.
Refer to the Organizational Process Performance process area for more information about establishing quality and process performance objectives and establishing process performance baselines and models.
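To make "statistical and other quantitative techniques" concrete, the following Python fragment sketches one common approach: comparing a measure before and after an improvement is deployed with a two-sample test. The data, the choice of measure (defect density), and the 0.05 significance threshold are assumptions made for this illustration only; the model does not prescribe a particular technique.

```python
# Illustrative only: compare defect density before and after deployment
# using Welch's t-test as one example of a statistical technique.
from scipy import stats

before = [2.4, 2.1, 2.8, 2.6, 2.3, 2.7, 2.5]  # defects/KLOC prior to deployment
after = [1.9, 2.0, 1.7, 2.2, 1.8, 1.6, 2.1]   # defects/KLOC after deployment

t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:  # assumed significance threshold
    print("The observed difference is unlikely to be due to chance alone.")
else:
    print("No statistically significant effect detected; investigate further.")
```

Control charts, confidence intervals, or process performance models could be used instead of (or alongside) a hypothesis test, depending on the data and the question being asked.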



ORGANIZATIONAL PROCESS PERFORMANCE
A Process Management Process Area at Maturity Level 4

Purpose
The purpose of Organizational Process Performance (OPP) is to establish and maintain a quantitative understanding of the performance of selected processes in the organization's set of standard processes in support of achieving quality and process performance objectives, and to provide process performance data, baselines, and models to quantitatively manage the organization's projects.

Introductory Notes
The Organizational Process Performance process area involves the following activities:
• Establishing organizational quantitative quality and process performance objectives based on business objectives (See the definition of "quality and process performance objectives" in the glossary.)
• Selecting processes or subprocesses for process performance analyses
• Establishing definitions of the measures to be used in process performance analyses (See the definition of "process performance" in the glossary.)
• Establishing process performance baselines and process performance models (See the definitions of "process performance baselines" and "process performance models" in the glossary.)

TIP: OPP establishes assets for use by projects (QPM and CAR) and the organization (OPM). Consult these PAs to better understand what roles these assets play. This understanding, as well as your business objectives and project needs, will help you design appropriate assets as you implement OPP.
X-REF: In particular, OPP and QPM are tightly coupled process areas. Refer to QPM extensively when considering how you should implement OPP. In particular, note the correspondence between OPP SP 1.1, 1.2, 1.3 and QPM SP 1.1, 1.3, and 1.4, respectively.

The collection and analysis of the data and creation of the process performance baselines and models can be performed at different levels of the organization, including individual projects or groups of related projects as appropriate based on the needs of the projects and organization.



The common measures for the organization consist of process and product measures that can be used to characterize the actual performance of processes in the organization's individual projects. By analyzing the resulting measurements, a distribution or range of results can be established that characterizes the expected performance of the process when used on an individual project.

X-REF: The CMMI principle that SPs are agnostic as to who performs them applies here as well. The OPP SPs, including those establishing Process Performance Baselines (PPBs) and Models (PPMs) (SPs 1.4 and 1.5), may in some cases provide more benefit when implemented for an individual project(s).

Measuring quality and process performance can involve combining existing measures into additional derived measures to provide more insight into overall efficiency and effectiveness at a project or organization level. The analysis at the organization level can be used to study productivity, improve efficiencies, and increase throughput across projects in the organization.
The expected process performance can be used in establishing the project's quality and process performance objectives and can be used as a baseline against which actual project performance can be compared. This information is used to quantitatively manage the project. Each quantitatively managed project, in turn, provides actual performance results that become a part of organizational process assets that are made available to all projects.

TIP: A subprocess measure's "central tendency and spread," normalized appropriately (e.g., for work product size), and determined for a set of similar projects, teams, or perhaps just for a single team, together with sufficient contextual data to interpret the individual measurements (to allow subsetting the data from which these statistics are derived), may compose a PPB.
HINT: PPBs can be displayed or summarized in different ways, depending on the nature of the data and the contexts from which they arose. These might include control charts, box plots, or histograms.
TIP: This paragraph describes in a simplified way how the organization's PPBs (referred to here as "expected process performance") support the activities described in QPM, which, in turn, provide data used to update these PPBs.

Process performance models are used to represent past and current process performance and to predict future results of the process. For example, the latent defects in the delivered product can be predicted using measurements of work product attributes such as complexity and process attributes such as preparation time for peer reviews.
When the organization has sufficient measures, data, and analytical techniques for critical process, product, and service characteristics, it is able to do the following:
• Determine whether processes are behaving consistently or have stable trends (i.e., are predictable)
• Identify processes in which performance is within natural bounds that are consistent across projects and could potentially be aggregated
• Identify processes that show unusual (e.g., sporadic, unpredictable) behavior
• Identify aspects of processes that can be improved in the organization's set of standard processes
• Identify the implementation of a process that performs best

This process area interfaces with and supports the implementation of other high maturity process areas. The assets established and maintained as part of implementing this process area (e.g., the measures to be used to characterize subprocess behavior, process performance baselines, process performance models) are inputs to the quantitative project management, causal analysis and resolution, and organizational performance management processes in support of the analyses described there. Quantitative project management processes provide the quality and process performance data needed to maintain the assets described in this process area.
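The latent-defect example above is the kind of relationship a process performance model captures. Purely as an illustration (with invented data and ordinary least squares standing in for whatever modeling technique an organization actually chooses), such a model might be sketched as follows.

```python
# Illustrative sketch of a simple process performance model: predict latent
# defects from work product complexity and peer review preparation time.
# The data below are invented for the example.
import numpy as np

# Historical observations: [complexity, review prep hours] per work product.
X = np.array([
    [12.0, 1.5],
    [25.0, 2.0],
    [8.0, 3.0],
    [30.0, 1.0],
    [18.0, 2.5],
    [22.0, 1.8],
])
latent_defects = np.array([3.0, 7.0, 1.0, 10.0, 4.0, 6.0])

# Add an intercept column and fit coefficients with least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, latent_defects, rcond=None)

# Predict latent defects for a new work product (complexity 20, 2.0 prep hours).
new_item = np.array([1.0, 20.0, 2.0])
print(f"Predicted latent defects: {new_item @ coef:.1f}")
```

In practice such a model would be calibrated on far more data, validated against held-out projects, and accompanied by an assessment of prediction uncertainty before it is used to inform decisions.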

Related Process Areas
Refer to the Measurement and Analysis process area for more information about specifying measures, obtaining measurement data, and analyzing measurement data.
Refer to the Organizational Performance Management process area for more information about proactively managing the organization's performance to meet its business objectives.
Refer to the Quantitative Project Management process area for more information about quantitatively managing the project to achieve the project's established quality and process performance objectives.

TIP: Mastering the practices in MA is a prerequisite to effectively implementing OPP and QPM. In particular, it is important to evaluate the effectiveness of your measurement system. Are subprocess measures sufficiently accurate and precise to be useful for your intended purposes? Techniques such as ANOVA Gage Repeatability and Reproducibility (GR&R) and Fleiss' kappa (see Wikipedia) may help answer these questions.
X-REF: Another suitable reference is "Measurement and Analysis Infrastructure Diagnostic, Version 1.0: Method Definition Document" by Mark Kasunic. See http://www.sei.cmu.edu/library/abstracts/reports/10tr035.cfm.

Specific Goal and Practice Summary
SG 1 Establish Performance Baselines and Models
  SP 1.1 Establish Quality and Process Performance Objectives
  SP 1.2 Select Processes
  SP 1.3 Establish Process Performance Measures
  SP 1.4 Analyze Process Performance and Establish Process Performance Baselines
  SP 1.5 Establish Process Performance Models

Specific Practices by Goal

Prior to establishing process performance baselines and models, it is necessary to determine the quality and process performance objec- tives for those processes (the Establish Quality and Process Perform- ance Objectives specific practice), which processes are suitable to be measured (the Select Processes specific practice), and which mea - sures are useful for determining process performance (the Establish Process Performance Measures specific practice).


The first three practices of this goal are interrelated and often need to be performed concurrently and iteratively to select quality and process performance objectives, processes, and measures. Often, the selection of one quality and process performance objective, process, or measure will constrain the selection of the others. For example, selecting a quality and process performance objective relat- ing to defects delivered to the customer will almost certainly require selecting the verification processes and defect related measures. The intent of this goal is to provide projects with the process per- formance baselines and models they need to perform quantitative project management. Many times these baselines and models are col- lected or created by the organization, but there are circumstances in which a project may need to create the baselines and models for them- selves. These circumstances include projects that are not covered by the organization’s baselines and models. For these cases the project follows the practices in this goal to create its baselines and models.

SP 1.1 ESTABLISH QUALITY AND PROCESS PERFORMANCE OBJECTIVES
Establish and maintain the organization's quantitative objectives for quality and process performance, which are traceable to business objectives.

The organization's quality and process performance objectives can be established for different levels in the organizational structure (e.g., business area, product line, function, project) as well as at different levels in the process hierarchy. When establishing quality and process performance objectives, consider the following:
• Traceability to the organization's business objectives
• Past performance of the selected processes or subprocesses in context (e.g., on projects)
• Multiple attributes of process performance (e.g., product quality, productivity, cycle time, response time)
• Inherent variability or natural bounds of the selected processes or subprocesses

TIP: By ensuring traceability to business objectives, the organization's quality and process performance objectives (QPPOs) become the mechanism for aligning the activities described in OPP and QPM with the organization's business objectives.

The organization's quality and process performance objectives provide focus and direction to the process performance analysis and quantitative project management activities. However, it should be noted that achieving quality and process performance objectives that are significantly different from current process capability requires use of techniques found in Causal Analysis and Resolution and Organizational Performance Management.


Example Work Products 1. Organization’s quality and process performance objectives

Subpractices 1. Review the organization’s business objectives related to quality and process performance.

Examples of business objectives include the following:
• Deliver products within budget and on time
• Improve product quality by a specified percent in a specified timeframe
• Improve productivity by a specified percent in a specified timeframe
• Maintain customer satisfaction ratings
• Improve time-to-market for new product or service releases by a specified percent in a specified timeframe
• Reduce deferred product functionality by a specified percent in a specified timeframe
• Reduce the rate of product recalls by a specified percent in a specified timeframe
• Reduce customer total cost of ownership by a specified percent in a specified timeframe
• Decrease the cost of maintaining legacy products by a specified percent in a specified timeframe


3. Define the priorities of the organization's objectives for quality and process performance.
4. Review, negotiate, and obtain commitment to the organization's quality and process performance objectives and their priorities from relevant stakeholders.
5. Revise the organization's quantitative objectives for quality and process performance as necessary.

TIP: Commitment to the organization's QPPOs means senior management periodically reviews how well projects are performing relative to them. Project management and senior technical staff members incorporate them into their projects and strive to achieve them.

Examples of when the organization's quantitative objectives for quality and process performance may need to be revised include the following:
• When the organization's business objectives change
• When the organization's set of standard processes changes
• When actual quality and process performance differ significantly from objectives

SP 1.2 SELECT PROCESSES
Select processes or subprocesses in the organization's set of standard processes to be included in the organization's process performance analyses and maintain traceability to business objectives.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
The organization's set of standard processes consists of a set of standard processes that, in turn, are composed of subprocesses.
Typically, it is not possible, useful, or economically justifiable to apply statistical management techniques to all processes or subprocesses of the organization's set of standard processes. Selection of processes or subprocesses is based on the quality and process performance objectives of the organization, which are derived from business objectives as described in the previous specific practice.

HINT: Because not all processes contribute equally to a business objective, and all such analyses consume time, effort, and money, you should start with a small selection. As you learn more, you can modify and expand the selection accordingly.

HINT: Select process elements, not composite processes (processes composed of multiple elements). Multiple sources of variation lie hidden in a composite process and are not easily analyzed. What might appear to be a stable process may in fact be a collection of unstable process elements. This would leave you unable to correctly predict future behavior or accurately identify opportunities for improvement.

Example Work Products
1. List of processes or subprocesses identified for process performance analyses with rationale for their selection including traceability to business objectives

Subpractices
1. Establish the criteria to use when selecting subprocesses.

Examples of criteria that can be used for the selection of a process or subprocess for the organization's process performance analysis include the following:
• The process or subprocess is strongly related to key business objectives.
• The process or subprocess has demonstrated stability in the past.
• Valid historical data are currently available that are relevant to the process or subprocess.
• The process or subprocess will generate data with sufficient frequency to allow for statistical management.
• The process or subprocess is an important contributor to quality and process performance.
• The process or subprocess is an important predictor of quality and process performance.
• The process or subprocess is a factor important to understanding the risk associated with achieving the quality and process performance objectives.
• The quality of the measures and measurements associated with the process or subprocess (e.g., measurement system error) is adequate.
• Multiple measurable attributes that characterize process or subprocess behavior are available.

2. Select the subprocesses and document the rationale for their selection.

Example approaches to identifying and evaluating subprocess alternatives as part of a selection include the following:
• Causal analysis
• Sensitivity analysis

Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
3. Establish and maintain traceability between the selected subprocesses, quality and process performance objectives, and business objectives.

Examples of ways in which traceability can be expressed include the following:
• Mapping of subprocesses to quality and process performance objectives
• Mapping of subprocesses to business objectives
• Objective flow-down (e.g., Big Y to Vital X, Hoshin planning)
• Balanced scorecard
• Quality Function Deployment (QFD)
• Goal Question Metric
• Documentation for a process performance model
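One lightweight way to keep such traceability current is to record it as structured data that can be checked automatically. The sketch below is a hypothetical illustration only; the objectives, subprocesses, and thresholds are invented and are not taken from the model.

```python
# Hypothetical illustration of recording and checking traceability from
# selected subprocesses up through QPPOs to business objectives.
business_objectives = {
    "BO-1": "Deliver products within budget and on time",
    "BO-2": "Improve product quality by a specified percent within 12 months",
}

qppos = {
    # QPPO id -> (description, business objective it supports)
    "QPPO-1": ("Reduce defect escape rate to <= 0.5 defects/KLOC", "BO-2"),
    "QPPO-2": ("Keep effort variance within +/- 10%", "BO-1"),
}

subprocess_traceability = {
    # selected subprocess -> QPPOs it is expected to influence
    "Peer review": ["QPPO-1"],
    "Unit test": ["QPPO-1"],
    "Estimation": ["QPPO-2"],
}

def check_traceability():
    """Flag any selected subprocess or QPPO that does not trace upward."""
    for sub, linked in subprocess_traceability.items():
        for q in linked:
            assert q in qppos, f"{sub} traces to unknown QPPO {q}"
    for q, (_, bo) in qppos.items():
        assert bo in business_objectives, f"{q} traces to unknown objective {bo}"

check_traceability()
```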


4. Revise the selection as necessary.
It may be necessary to revise the selection in the following situations:
• The predictions made by process performance models result in too much variation to make them useful.
• The objectives for quality and process performance change.
• The organization's set of standard processes changes.
• The underlying quality and process performance changes.

SP 1.3 ESTABLISH PROCESS PERFORMANCE MEASURES
Establish and maintain definitions of measures to be included in the organization's process performance analyses.
Refer to the Measurement and Analysis process area for more information about specifying measures.

Example Work Products
1. Definitions of selected measures of process performance with rationale for their selection including traceability to selected processes or subprocesses

X-REF: A commonly heard objection is that it is not possible to measure certain attributes. In his book How to Measure Anything: Finding the Value of "Intangibles" in Business, Doug Hubbard provides strategies for how to proceed.

X-REF: The Goal Question Metric (GQM) is a well-known approach to deriving measures that provide insight into issues of interest. See www.cs.umd.edu/~mvz/handouts/gqm.pdf. The SEI's variant of GQM is called the Goal Question Indicator Metric (GQIM). See http://www.sei.cmu.edu/training/p06.cfm. Also, see the perspective "Applying Principles of Empiricism" in this book for more information about GQM and CMMI (one of that perspective's co-authors is one of the creators of GQM).

Subpractices
1. Select measures that reflect appropriate attributes of the selected processes or subprocesses to provide insight into the organization's quality and process performance.
It is often helpful to define multiple measures for a process or subprocess to understand the impact of changes to the process and avoid sub-optimization. Also, it is often helpful to establish measures for both product and process attributes for the selected process and subprocess, as well as its inputs, outputs, and resources (including people and the skill they bring) consumed.
The Goal Question Metric paradigm is an approach that can be used to select measures that provide insight into the organization's quality and process performance objectives. It is often useful to analyze how these quality and process performance objectives can be achieved based on an understanding of process performance provided by the selected measures.

Examples of criteria used to select measures include the following:
• Relationship of measures to the organization's quality and process performance objectives
• Coverage that measures provide over the life of the product or service
• Visibility that measures provide into process performance
• Availability of measures
• Frequency at which observations of the measure can be collected
• Extent to which measures are controllable by changes to the process or subprocess
• Extent to which measures represent the end users' view of effective process performance
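As a small, hypothetical illustration of the Goal Question Metric paradigm described above, the sketch below derives candidate measures from one goal; the goal, questions, and metric names are invented for the example and are not prescribed by the model.

```python
# Hypothetical GQM sketch: one goal, the questions it raises, and the
# measures implied by those questions.
gqm = {
    "goal": "Improve defect removal effectiveness of peer reviews",
    "questions": {
        "Q1": {
            "text": "How many defects do peer reviews find per unit of product?",
            "metrics": ["defects found per review", "review effort (hours)",
                        "size reviewed (pages or KLOC)"],
        },
        "Q2": {
            "text": "How many defects escape to later phases?",
            "metrics": ["defects found in test attributable to reviewed items",
                        "defect escape rate"],
        },
    },
}

def derived_measures(model):
    """Collect the flat list of measures implied by the questions."""
    return sorted({m for q in model["questions"].values() for m in q["metrics"]})

print(derived_measures(gqm))
```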

2. Establish operational definitions for the selected measures.
Refer to the Measurement and Analysis process area for more information about specifying measures.
3. Incorporate selected measures into the organization's set of common measures.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
4. Revise the set of measures as necessary.
Measures are periodically evaluated for their continued usefulness and ability to indicate process effectiveness.

HINT: To begin systematic collection of these measures from new projects, incorporate them into the organization's set of common measures (OPD SP 1.4).

SP 1.4 ANALYZE PROCESS PERFORMANCE AND ESTABLISH PROCESS PERFORMANCE BASELINES
Analyze the performance of the selected processes, and establish and maintain the process performance baselines.
The selected measures are analyzed to characterize the performance of the selected processes or subprocesses achieved on projects. This characterization is used to establish and maintain process performance baselines. (See the definition of "process performance baseline" in the glossary.) These baselines are used to determine the expected results of the process or subprocess when used on a project under a given set of circumstances.
Process performance baselines are compared to the organization's quality and process performance objectives to determine if the quality and process performance objectives are being achieved.
The process performance baselines are a measurement of performance for the organization's set of standard processes at various levels of detail. The processes that the process performance baselines can address include the following:
• Sequence of connected processes
• Processes that cover the entire life of the project
• Processes for developing individual work products

TIP: QPPOs should motivate superior performance. QPPOs that set the bar too high may demoralize more than they motivate. What does the performance data say about how well a project can do relative to the QPPOs? There is a need for balance between "desires" and "reality."


There can be several process performance baselines to characterize performance for subgroups of the organization.

Examples of criteria used to categorize subgroups include the following:
• Product line
• Line of business
• Application domain
• Complexity
• Team size
• Work product size
• Process elements from the organization's set of standard processes

Tailoring the organization’s set of standard processes can signifi- cantly affect the comparability of data for inclusion in process per- formance baselines. Effects of tailoring should be considered in establishing baselines. Depending on the tailoring allowed, separate performance baselines may exist for each type of tailoring. Refer to the Quantitative Project Management process area for more informa- tion about quantitatively managing the project to achieve the project’s estab- lished quality and process performance objectives. ptg

Example Work Products
1. Analysis of process performance data
2. Baseline data on the organization's process performance

Subpractices
1. Collect the selected measurements for the selected processes and subprocesses.
The process or subprocess in use when the measurement was taken is recorded to enable its use later.
Refer to the Measurement and Analysis process area for more information about specifying measurement data collection and storage procedures.

HINT: Record sufficient contextual information with a measurement to enable identification of when it was generated, by whom, and the PPB it should be included in (or regenerating a PPB for a different class of process instances).

2. Analyze the collected measures to establish a distribution or range of results that characterize the expected performance of selected processes or subprocesses when used on a project.
This analysis should include the stability of the related process or subprocess, and the impacts of associated factors and context. Related factors include inputs to the process and other attributes that can affect the results obtained. The context includes the business context (e.g., domain) and significant tailoring of the organization's set of standard processes.
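A minimal sketch of the kind of analysis described in subpractice 2 appears below. It uses invented defect-density measurements and the common mean plus-or-minus three standard deviations heuristic for natural bounds; a real analysis would also examine stability (e.g., with control charts) and subgrouping.

```python
# Hypothetical sketch of deriving a simple process performance baseline
# from collected measurements (values invented).
import statistics

# Defect density (defects/KLOC) observed for one subprocess across projects
measurements = [2.1, 1.8, 2.4, 2.0, 2.6, 1.9, 2.3, 2.2, 2.5, 2.0]

mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)

baseline = {
    "mean": round(mean, 2),
    "stdev": round(stdev, 2),
    "lower_natural_bound": round(max(0.0, mean - 3 * stdev), 2),
    "upper_natural_bound": round(mean + 3 * stdev, 2),
    "n": len(measurements),
}
print(baseline)
```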


The measurements from stable subprocesses in projects should be used when possible; other data may not be reliable.
3. Establish and maintain the process performance baselines from collected measurements and analyses.
Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
Process performance baselines are derived by analyzing collected measures to establish a distribution or range of results that characterize the expected performance for selected processes or subprocesses when used on a project in the organization.

TIP: Unless the process is stable, the data from the process may actually be a mixture of measurements taken from different processes. PPBs developed from such data are limited in their usefulness to projects (e.g., estimated natural bounds are likely to be far apart).

4. Review and get agreement with relevant stakeholders about the process performance baselines.
5. Make the process performance information available across the organization in the measurement repository.
The organization's process performance baselines are used by projects to estimate the natural bounds for process performance.

HINT: Investigate subgrouping when incorporating data from multiple projects (and teams) into the same PPB. Even if the process is stable within individual projects, its performance across projects may vary so much that the resulting single PPB will be too "wide" to be useful.

HINT: Even when organizational PPBs are established, individual projects may still benefit from establishing their own individual PPBs when they have accumulated sufficient data.

6. Compare the process performance baselines to associated quality and process performance objectives to determine if those quality and process performance objectives are being achieved.
These comparisons should use statistical techniques beyond a simple comparison of the mean to gauge the extent of quality and process performance objective achievement. If the quality and process performance objectives are not being achieved, corrective actions should be considered.
Refer to the Causal Analysis and Resolution process area for more information about determining causes of selected outcomes.
Refer to the Organizational Process Focus process area for more information about planning and implementing process actions.
Refer to the Organizational Performance Management process area for more information about analyzing process performance data and identifying potential areas for improvement.

TIP: Ideally, the QPPOs are attainable, but perhaps are a "stretch" beyond the current PPBs. This comparison can help establish feasible objectives. CAR or OPM can also be invoked to help improve process performance.

7. Revise the process performance baselines as necessary.

Examples of when the organization's process performance baselines may need to be revised include the following:
• When processes change
• When the organization's results change
• When the organization's needs change
• When suppliers' processes change
• When suppliers change
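As a concrete illustration of the comparison described in subpractice 6, the sketch below estimates the probability that a process instance will meet a QPPO, assuming for illustration only that performance follows a normal distribution fitted to the baseline; all numbers are invented.

```python
# Hypothetical sketch of comparing a baseline to a QPPO with more than a
# comparison of means: estimate the probability of meeting the objective.
from statistics import NormalDist

baseline_mean, baseline_stdev = 2.2, 0.25   # defects/KLOC from the PPB
qppo_upper_limit = 2.5                      # objective: at most 2.5 defects/KLOC

dist = NormalDist(mu=baseline_mean, sigma=baseline_stdev)
p_meet = dist.cdf(qppo_upper_limit)

print(f"Estimated probability of meeting the objective: {p_meet:.2f}")
if p_meet < 0.9:
    print("Consider corrective action (e.g., CAR/OPM) or renegotiating the QPPO.")
```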


SP 1.5 ESTABLISH PROCESS PERFORMANCE MODELS
Establish and maintain process performance models for the organization's set of standard processes.
High maturity organizations generally establish and maintain a set of process performance models at various levels of detail that cover a range of activities that are common across the organization and address the organization's quality and process performance objectives. (See the definition of "process performance model" in the glossary.) Under some circumstances, projects may need to create their own process performance models.
Process performance models are used to estimate or predict the value of a process performance measure from the values of other process, product, and service measurements. These process performance models typically use process and product measurements collected throughout the life of the project to estimate progress toward achieving quality and process performance objectives that cannot be measured until later in the project's life.
Process performance models are used as follows:
• The organization uses them for estimating, analyzing, and predicting the process performance associated with processes in and changes to the organization's set of standard processes.
• The organization uses them to assess the (potential) return on investment for process improvement activities.
• Projects use them for estimating, analyzing, and predicting the process performance of their defined processes.
• Projects use them for selecting processes or subprocesses for use.
• Projects use them for estimating progress toward achieving the project's quality and process performance objectives.

TIP: Part of mastering any discipline is developing "nuance" for factors that matter in how a situation will unfold, yet realizing every situation is somewhat different. Although PPMs can never be perfectly accurate, they offer a systematic approach to using the data available in a situation and similar situations to make, scientifically, the best possible judgment or prediction.

HINT: Establish PPMs that provide insight at different points in a project that help it assess progress toward achieving its QPPOs (QPM SP 2.2).

TIP: In general, data-driven approaches to judgment and prediction should, over the long term, perform better than human judgment alone. Perhaps the performance of both should be analyzed and contrasted; then how to best use both can be determined. But in situations unlike those experienced before, neither is likely to perform well.

These measures and models are defined to provide insight into and to provide the ability to predict critical process and product characteristics that are relevant to the organization's quality and process performance objectives.

X-REF: There are now many books on the limitations of human judgment. These include an update to a classic text, Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making by Hastie and Dawes. A more recent book is The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us by Chabris and Simons.

Examples of process performance models include the following:
• System dynamics models
• Regression models
• Complexity models
• Discrete event simulation models
• Monte Carlo simulation models
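The sketch below is a minimal, hypothetical example of the last category, a Monte Carlo simulation model: it predicts defects escaping to the customer from invented distributions for injected defects and the removal effectiveness of two downstream subprocesses. It illustrates the mechanics only and is not a calibrated model.

```python
# Hypothetical Monte Carlo sketch of a process performance model.
import random

def simulate_once():
    injected = random.gauss(100, 15)                 # defects injected
    review_effectiveness = random.uniform(0.5, 0.7)  # fraction removed in review
    test_effectiveness = random.uniform(0.6, 0.8)    # fraction removed in test
    remaining = injected * (1 - review_effectiveness) * (1 - test_effectiveness)
    return max(0.0, remaining)

random.seed(0)
runs = sorted(simulate_once() for _ in range(10_000))
print("median escaped defects:", round(runs[len(runs) // 2], 1))
print("90th percentile:", round(runs[int(0.9 * len(runs))], 1))
```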


Refer to the Quantitative Project Management process area for more information about quantitatively managing the project to achieve the project's established quality and process performance objectives.

HINT: Subject matter experts from different relevant disciplines and functions can help identify process and product characteristics important to prediction. A variety of analytic techniques such as ANOVA and sensitivity analysis can also help screen factors for potential inclusion in PPMs. PPBs provide a primary source for the information needed for this analysis.

Example Work Products
1. Process performance models

Subpractices
1. Establish process performance models based on the organization's set of standard processes and process performance baselines.
2. Calibrate process performance models based on the past results and current needs.
3. Review process performance models and get agreement with relevant stakeholders.
4. Support the projects' use of process performance models.
5. Revise process performance models as necessary.

HINT: Meet with relevant stakeholders to discuss PPMs (e.g., their usefulness and limitations) and what will enable them to make effective use of such models on projects.

Examples of when process performance models may need to be revised include the following:
• When processes change
• When the organization's results change
• When the organization's quality and process performance objectives change

TIP: To use PPMs effectively, project staff members and management may need significant support. There may be an initial tendency to dismiss or misuse PPMs or their results.

X-REF: Example PPMs are described in "Approaches to Process Performance Modeling: A Summary from the SEI Series of Workshops on CMMI High Maturity Measurement and Analysis," found at http://www.sei.cmu.edu/library/abstracts/reports/09tr021.cfm. Experiences from using PPMs are described in "Performance Effects of Measurement and Analysis: Perspectives from CMMI High Maturity Organizations and Appraisers," found at http://www.sei.cmu.edu/library/abstracts/reports/10tr022.cfm.



ORGANIZATIONAL TRAINING
A Process Management Process Area at Maturity Level 3

Purpose
The purpose of Organizational Training (OT) is to develop skills and knowledge of people so they can perform their roles effectively and efficiently.

Introductory Notes

Organizational Training addresses training provided to support the organization's strategic business objectives and to meet the tactical training needs that are common across projects and support groups. Training needs identified by individual projects and support groups to meet their specific needs are handled at the project and support group level and are outside the scope of the Organizational Training process area.

TIP: OT addresses the organization's training needs. The project's training needs are addressed in PP, PMC, and IPM.

Refer to the Project Planning process area for more information about planning needed knowledge and skills.
An organizational training program involves the following activities:

• Identifying the training needed by the organization
• Obtaining and providing training to address those needs
• Establishing and maintaining a training capability
• Establishing and maintaining training records
• Assessing training effectiveness

Effective training requires the assessment of needs, planning, instructional design, and appropriate training media (e.g., workbooks, computer software), as well as a repository of training process data. As an organizational process, the main components of training include a managed training development program, documented plans, staff with an appropriate mastery of disciplines and other areas of knowledge, and mechanisms for measuring the effectiveness of the training program.

TIP: Training data includes student training records, dates of classes, and other training information.

Identifying process training needs is based primarily on the skills required to perform the organization's set of standard processes.
Refer to the Organizational Process Definition process area for more information about establishing standard processes.

TIP: To deploy these processes effectively across the organization, training is typically required.

Certain skills can be effectively and efficiently imparted through vehicles other than classroom training experiences (e.g., informal mentoring). Other skills require more formalized training vehicles, such as in a classroom, by web-based training, through guided self study, or via a formalized on-the-job training program. The formal or informal training vehicles employed for each situation should be based on an assessment of the need for training and the performance gap to be addressed. The term "training" is used broadly throughout this process area to include all of these learning options.

TIP: Remember, CMMI sets expectations on what needs to be done, not how to do it. Therefore, each organization must decide what type of training is best for any situation.

Success in training is indicated by the availability of opportunities to acquire the skills and knowledge needed to perform new and ongoing enterprise activities.
Skills and knowledge can be technical, organizational, or contextual. Technical skills pertain to the ability to use equipment, tools, materials, data, and processes required by a project or process. Organizational skills pertain to behavior within and according to the staff members' organization structure, role and responsibilities, and general operating principles and methods. Contextual skills are the self-management, communication, and interpersonal abilities needed to successfully perform work in the organizational and social context of the project and support groups.

Related Process Areas

Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
Refer to the Project Planning process area for more information about planning needed knowledge and skills.

Specific Goal and Practice Summary
SG 1 Establish an Organizational Training Capability
SP 1.1 Establish Strategic Training Needs
SP 1.2 Determine Which Training Needs Are the Responsibility of the Organization
SP 1.3 Establish an Organizational Training Tactical Plan
SP 1.4 Establish a Training Capability
SG 2 Provide Training
SP 2.1 Deliver Training
SP 2.2 Establish Training Records
SP 2.3 Assess Training Effectiveness

Specific Practices by Goal

SG 1 ESTABLISH AN ORGANIZATIONAL TRAINING CAPABILITY
A training capability, which supports the roles in the organization, is established and maintained.

The organization identifies training required to develop the skills and knowledge necessary to perform enterprise activities. Once the needs are identified, a training program addressing those needs is developed.

SP 1.1 ESTABLISH STRATEGIC TRAINING NEEDS
Establish and maintain strategic training needs of the organization.

Strategic training needs address long-term objectives to build a capability by filling significant knowledge gaps, introducing new technologies, or implementing major changes in behavior. Strategic planning typically looks two to five years into the future.

HINT: Use strategic training to ensure that the organization continues as a learning organization, strengthens its core competencies, and remains competitive.

Examples of sources of strategic training needs include the following:
• The organization's standard processes
• The organization's strategic business plan
• The organization's process improvement plan
• Enterprise level initiatives
• Skill assessments
• Risk analyses
• Acquisition and supplier management

Example Work Products
1. Training needs
2. Assessment analysis

Subpractices 1. Analyze the organization’s strategic business objectives and process improvement plan to identify potential training needs.


2. Document the strategic training needs of the organization.

Examples of categories of training needs include the following:
• Process analysis and documentation
• Engineering (e.g., requirements analysis, design, testing, configuration management, quality assurance)
• Selection and management of suppliers
• Team building
• Management (e.g., estimating, tracking, risk management)
• Leadership
• Disaster recovery and continuity of operations
• Communication and negotiation skills

3. Determine the roles and skills needed to perform the organization's set of standard processes.
4. Document the training needed to perform roles in the organization's set of standard processes.
5. Document the training needed to maintain the safe, secure, and continued operation of the business.
6. Revise the organization's strategic needs and required training as necessary.

SP 1.2 DETERMINE WHICH TRAINING NEEDS ARE THE RESPONSIBILITY OF THE ORGANIZATION
Determine which training needs are the responsibility of the organization and which are left to the individual project or support group.
Refer to the Project Planning process area for more information about planning needed knowledge and skills.

In addition to strategic training needs, organizational training addresses training requirements that are common across projects and support groups. Projects and support groups have the primary responsibility for identifying and addressing their training needs. The organization's training staff is responsible for addressing only common cross-project and support group training needs (e.g., training in work environments common to multiple projects). In some cases, however, the organization's training staff may address additional training needs of projects and support groups, as negotiated with them, in the context of the training resources available and the organization's training priorities.

TIP: Small organizations may choose to use the practices in this PA to address all of their training. If so, the scope and intent of the practices should be expanded appropriately.


Example Work Products
1. Common project and support group training needs
2. Training commitments

Subpractices
1. Analyze the training needs identified by projects and support groups.
Analysis of project and support group needs is intended to identify common training needs that can be most efficiently addressed organization wide. These needs analysis activities are used to anticipate future training needs that are first visible at the project and support group level.
2. Negotiate with projects and support groups on how their training needs will be satisfied.
The support provided by the organization's training staff depends on the training resources available and the organization's training priorities.

Examples of training appropriately performed by the project or support group include the following:
• Training in the application or service domain of the project
• Training in the unique tools and methods used by the project or support group
• Training in safety, security, and human factors

3. Document commitments for providing training support to projects and support groups.

SP 1.3 ESTABLISH AN ORGANIZATIONAL TRAINING TACTICAL PLAN
Establish and maintain an organizational training tactical plan.

The organizational training tactical plan is the plan to deliver the training that is the responsibility of the organization and is necessary for individuals to perform their roles effectively. This plan addresses the near-term execution of training and is adjusted periodically in response to changes (e.g., in needs, in resources) and to evaluations of effectiveness.

TIP: For many organizations, this planning is performed annually, with a review each quarter.

Example Work Products
1. Organizational training tactical plan


Subpractices
1. Establish the content of the plan.

Organizational training tactical plans typically contain the following:
• Training needs
• Training topics
• Schedules based on training activities and their dependencies
• Methods used for training
• Requirements and quality standards for training materials
• Training tasks, roles, and responsibilities
• Required resources including tools, facilities, environments, staffing, skills, and knowledge

2. Establish commitments to the plan.
Documented commitments by those who are responsible for implementing and supporting the plan are essential for the plan to be effective.
3. Revise the plan and commitments as necessary.

SP 1.4 ESTABLISH A TRAINING CAPABILITY
Establish and maintain a training capability to address organizational training needs.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

Example Work Products
1. Training materials and supporting artifacts

Subpractices
1. Select appropriate approaches to satisfy organizational training needs.
Many factors may affect the selection of training approaches, including audience specific knowledge, costs, schedule, and the work environment. Selecting an approach requires consideration of the means to provide skills and knowledge in the most effective way possible given the constraints.

Examples of training approaches include the following:
• Classroom training
• Computer aided instruction
• Guided self study
• Formal apprenticeship and mentoring programs
• Facilitated videos
• Chalk talks
• Brown bag lunch seminars
• Structured on-the-job training

2. Determine whether to develop training materials internally or to acquire them externally. Determine the costs and benefits of internal training development and of acquiring training externally.

Example criteria that can be used to determine the most effective mode of knowledge or skill acquisition include the following:
• Applicability to work or process performance objectives
• Availability of time to prepare for project execution
• Applicability to business objectives
• Availability of in-house expertise
• Availability of training from external sources

Examples of external sources of training include the following:
• Customer provided training
• Commercially available training courses
• Academic programs
• Professional conferences
• Seminars

3. Develop or obtain training materials.
Training can be provided by the project, support groups, the organization, or an external organization. The organization's training staff coordinates the acquisition and delivery of training regardless of its source.

Examples of training materials include the following:
• Courses
• Computer-aided instruction
• Videos


4. Develop or obtain qualified instructors, instructional designers, or mentors.
To ensure that those who develop and deliver internal training have the necessary knowledge and training skills, criteria can be defined to identify, develop, and qualify them. The development of training, including self study and online training, should involve those who have experience in instructional design. In the case of external training, the organization's training staff can investigate how the training provider determines which instructors will deliver the training. This selection of qualified instructors can also be a factor in selecting or continuing to use a training provider.

TIP: Content experts should work with the instructional designers to ensure accuracy and thoroughness. However, content expertise does not qualify anyone as an instructional designer.

5. Describe the training in the organization's training curriculum.

Examples of the information provided in training descriptions for each course include the following:
• Topics covered in the training
• Intended audience
• Prerequisites and preparation for participating
• Training objectives
• Length of the training
• Lesson plans
• Completion criteria for the course
• Criteria for granting training waivers

6. Revise training materials and supporting artifacts as necessary.

Examples of situations in which training materials and supporting artifacts may need to be revised include the following:
• Training needs change (e.g., when new technology associated with the training topic is available)
• An evaluation of the training identifies the need for change (e.g., evaluations of training effectiveness surveys, training program performance assessments, instructor evaluation forms)

SG 2 PROVIDE TRAINING
Training for individuals to perform their roles effectively is provided.

When selecting people to be trained, the following should be considered:
• Background of the target population of training participants
• Prerequisite background to receive training
• Skills and abilities needed by people to perform their roles
• Need for cross-discipline training for all disciplines, including project management
• Need for managers to have training in appropriate organizational processes
• Need for training in basic principles of all appropriate disciplines or services to support staff in quality management, configuration management, and other related support functions
• Need to provide competency development for critical functional areas
• Need to maintain competencies and qualifications of staff to operate and maintain work environments common to multiple projects

SP 2.1 DELIVER TRAINING
Deliver training following the organizational training tactical plan.

Example Work Products
1. Delivered training course

Subpractices

1. Select those who will receive the training necessary to perform their roles effectively.
Training is intended to impart knowledge and skills to people performing various roles in the organization. Some people already possess the knowledge and skills required to perform well in their designated roles. Training can be waived for these people, but care should be taken that training waivers are not abused.
2. Schedule the training, including any resources, as necessary (e.g., facilities, instructors).
Training should be planned and scheduled. Training is provided that has a direct bearing on work performance expectations. Therefore, optimal training occurs in a timely manner with regard to imminent job performance expectations.

These performance expectations often include the following:
• Training in the use of specialized tools
• Training in procedures that are new to the person who will perform them

3. Deliver the training.
If the training is delivered by a person, then appropriate training professionals (e.g., experienced instructors, mentors) should deliver the training. When possible, training is delivered in settings that closely resemble the actual work environment and includes activities to simulate actual work situations. This approach includes integration of tools, methods, and procedures for competency development. Training is tied to work responsibilities so that on-the-job activities or other outside experiences will reinforce the training within a reasonable time after the training was delivered.
4. Track the delivery of training against the plan.

SP 2.2 ESTABLISH TRAINING RECORDS
Establish and maintain records of organizational training.

This practice applies to the training performed at the organizational level. Establishment and maintenance of training records for project or support group sponsored training is the responsibility of each individual project or support group.

TIP: To provide consistent and complete information on each employee, the training records may include all training, whether performed at the organization's level or by a project or support group.

Example Work Products
1. Training records
2. Training updates to the organizational repository

Subpractices
1. Keep records of all students who successfully complete each training course or other approved training activity as well as those who are unsuccessful.
2. Keep records of all staff who are waived from training.
The rationale for granting a waiver should be documented, and both the manager responsible and the manager of the excepted individual should approve the waiver.
3. Keep records of all students who successfully complete their required training.
4. Make training records available to the appropriate people for consideration in assignments.
Training records may be part of a skills matrix developed by the training organization to provide a summary of the experience and education of people, as well as training sponsored by the organization.

TIP: To ensure that training records are accurate, you may want to use some CM practices.
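The sketch below is a hypothetical illustration of the record keeping these subpractices describe: completions, failures, and waivers per person, plus a simple query that could feed a skills matrix. The classes, people, and courses are invented for the example.

```python
# Hypothetical sketch of a minimal training-record store.
from dataclasses import dataclass, field

@dataclass
class TrainingRecord:
    person: str
    course: str
    status: str          # "completed", "unsuccessful", or "waived"
    waiver_rationale: str = ""

@dataclass
class TrainingRepository:
    records: list = field(default_factory=list)

    def add(self, record: TrainingRecord):
        # A waiver must document its rationale before it is recorded.
        if record.status == "waived" and not record.waiver_rationale:
            raise ValueError("A waiver must document its rationale.")
        self.records.append(record)

    def satisfied_courses(self, person: str):
        """Courses a person has completed or been waived from (skills-matrix input)."""
        return [r.course for r in self.records
                if r.person == person and r.status in ("completed", "waived")]

repo = TrainingRepository()
repo.add(TrainingRecord("A. Lee", "Peer Review Basics", "completed"))
repo.add(TrainingRecord("A. Lee", "Estimation", "waived", "Prior equivalent experience"))
print(repo.satisfied_courses("A. Lee"))
```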

SP 2.3 ASSESS TRAINING EFFECTIVENESS
Assess the effectiveness of the organization's training program.

A process should exist to determine the effectiveness of training (i.e., how well training is meeting the organization’s needs).


Examples of methods used to assess training effectiveness include the following:
• Testing in the training context
• Post-training surveys of training participants
• Surveys of manager satisfaction with post-training effects
• Assessment mechanisms embedded in courseware

TIP: Training effectiveness can change over time. Initially, training may be done using one medium or mode of delivery to train large numbers of people and another medium or mode of delivery to deliver ongoing training to new employees or transfers from other departments.

Measures can be taken to assess the benefits of training against both the project's and organization's objectives. Particular attention should be paid to the need for various training methods, such as training teams as integral work units. When used, work or process performance objectives should be unambiguous, observable, verifiable, and shared with course participants. The results of the training effectiveness assessment should be used to revise training materials as described in the Establish a Training Capability specific practice.

Example Work Products
1. Training effectiveness surveys
2. Training program performance assessments
3. Instructor evaluation forms
4. Training examinations

Subpractices
1. Assess in-progress or completed projects to determine whether staff knowledge is adequate for performing project tasks.
2. Provide a mechanism for assessing the effectiveness of each training course with respect to established organizational, project, or individual learning (or performance) objectives.
3. Obtain student evaluations of how well training activities met their needs.



PRODUCT INTEGRATION
An Engineering Process Area at Maturity Level 3

Purpose
The purpose of Product Integration (PI) is to assemble the product from the product components, ensure that the product, as integrated, behaves properly (i.e., possesses the required functionality and quality attributes), and deliver the product.

HINT: To achieve problem-free integration, you need to prepare for product integration activities early in the project.

Introductory Notes
This process area addresses the integration of product components into more complex product components or into complete products. The scope of this process area is to achieve complete product integration through progressive assembly of product components, in one stage or in incremental stages, according to a defined integration strategy and procedures. Throughout the process areas, where the terms "product" and "product component" are used, their intended meanings also encompass services, service systems, and their components.

TIP: In some engineering disciplines (e.g., mechanical engineering), the term assembly is generally preferred over integration. In PI, these two words are used interchangeably.

A critical aspect of product integration is the management of internal and external interfaces of the products and product components to ensure compatibility among the interfaces. These interfaces are not limited to user interfaces, but also apply to interfaces among components of the product, including internal and external data sources, middleware, and other components that may or may not be within the development organization's control but on which the product relies. Attention should be paid to interface management throughout the project.

HINT: Many problems encountered in product integration are due to interface incompatibilities. Therefore, PI also addresses interface management.

Product integration is more than just a one-time assembly of the product components at the conclusion of design and fabrication. Product integration can be conducted incrementally, using an iterative process of assembling product components, evaluating them, and then assembling more product components. It can be conducted using highly automated builds and continuous integration of the completed unit tested product. This process can begin with analysis and simulations (e.g., threads, rapid prototypes, virtual prototypes, physical prototypes) and steadily progress through increasingly more realistic increments until the final product is achieved. In each successive build, prototypes (virtual, rapid, or physical) are constructed, evaluated, improved, and reconstructed based on knowledge gained in the evaluation process. The degree of virtual versus physical prototyping required depends on the functionality of the design tools, the complexity of the product, and its associated risk. There is a high probability that the product, integrated in this manner, will pass product verification and validation. For some products and services, the last integration phase will occur when they are deployed at the intended operational site.

X-REF: See the definition of "product line" in the glossary.

For product lines, products are assembled according to the product line production plan. The product line production plan specifies the assembly process, including which core assets to use and how product line variation is resolved within those core assets.

TIP: Product integration typically proceeds in incremental stages. At each stage, a partial assembly is followed by evaluation. In early stages, prototypes may be used in place of product components that are unavailable. The last stage may integrate the complete product with the end user in the intended environment.

In Agile environments, product integration is a frequent, often daily, activity. For example, for software, working code is continuously added to the code base in a process called "continuous integration." In addition to addressing continuous integration, the product integration strategy can address how supplier supplied components will be incorporated, how functionality will be built (in layers vs. "vertical slices"), and when to "refactor." The strategy should be established early in the project and be revised to reflect evolving and emerging component interfaces, external feeds, data exchange, and application program interfaces. (See "Interpreting CMMI When Using Agile Approaches" in Part I.)

Related Process Areas

Refer to the Requirements Development process area for more information about identifying interface requirements.
Refer to the Technical Solution process area for more information about designing interfaces using criteria.
Refer to the Validation process area for more information about performing validation.
Refer to the Verification process area for more information about performing verification.


Refer to the Configuration Management process area for more information about tracking and controlling changes.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
Refer to the Risk Management process area for more information about identifying risks and mitigating risks.
Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers.

Specific Goal and Practice Summary
SG 1 Prepare for Product Integration
SP 1.1 Establish an Integration Strategy
SP 1.2 Establish the Product Integration Environment
SP 1.3 Establish Product Integration Procedures and Criteria
SG 2 Ensure Interface Compatibility
SP 2.1 Review Interface Descriptions for Completeness
SP 2.2 Manage Interfaces
SG 3 Assemble Product Components and Deliver the Product
SP 3.1 Confirm Readiness of Product Components for Integration
SP 3.2 Assemble Product Components
SP 3.3 Evaluate Assembled Product Components
SP 3.4 Package and Deliver the Product or Product Component

Specific Practices by Goal

SG 1 PREPARE FOR PRODUCT INTEGRATION
Preparation for product integration is conducted.
Preparing for the integration of product components involves establishing an integration strategy, establishing the environment for performing the integration, and establishing integration procedures and criteria. Preparation for integration starts early in the project.

TIP: A passive approach to product integration is to wait until product components are ready to integrate to begin assembly and testing. PI recommends a proactive approach that initiates integration activities early in the project and performs integration activities concurrently with the practices of TS and SAM. Such an approach can affect the development schedules of the project and its suppliers.

SP 1.1 ESTABLISH AN INTEGRATION STRATEGY
Establish and maintain a product integration strategy.
The product integration strategy describes the approach for receiving, assembling, and evaluating the product components that comprise the product.


TIP: There are many possible approaches to receiving, assembling, and evaluating components. An integration strategy should provide a coherent end-to-end approach to accomplishing these activities.

A product integration strategy addresses items such as the following:
• Making product components available for integration (e.g., in what sequence)
• Assembling and evaluating as a single build or as a progression of incremental builds
• Including and testing features in each iteration when using iterative development
• Managing interfaces
• Using models, prototypes, and simulations to assist in evaluating an assembly, including its interfaces
• Establishing the product integration environment
• Defining procedures and criteria
• Making available the appropriate test tools and equipment
• Managing product hierarchy, architecture, and complexity
• Recording results of evaluations
• Handling exceptions
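One way to make such a strategy explicit and reviewable is to record it as structured data. The sketch below is a hypothetical illustration only; the stages, components, and evaluations are invented and do not represent a prescribed format.

```python
# Hypothetical sketch of recording an incremental integration strategy as
# data: ordered stages, the components each stage assembles, and the
# evaluations to run on the resulting assembly.
from dataclasses import dataclass

@dataclass
class IntegrationStage:
    name: str
    components: list         # components assembled at this stage
    evaluations: list        # tests/verifications to run on the assembly

strategy = [
    IntegrationStage("Stage 1", ["sensor_driver", "data_bus"],
                     ["interface check", "smoke test"]),
    IntegrationStage("Stage 2", ["data_bus", "control_logic"],
                     ["interface check", "timing test"]),
    IntegrationStage("Stage 3", ["control_logic", "operator_ui"],
                     ["end-to-end scenario test"]),
]

def components_needed_by(stage_name: str, stages) -> set:
    """Every component that must be available before the named stage completes."""
    needed = set()
    for stage in stages:
        needed.update(stage.components)
        if stage.name == stage_name:
            break
    return needed

print(sorted(components_needed_by("Stage 2", strategy)))
```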

The integration strategy should also be aligned with the technical approach described in the Project Planning process area and harmo- ptg nized with the selection of solutions and the design of product and product components in the Technical Solution process area. TIP Refer to the Technical Solution process area for more information about select- To be available for integration, ing product component solutions and implementing the design. product components need to Refer to the Decision Analysis and Resolution process area for more informa- be completed on time. Thus, the nature and timing of inte- tion about analyzing possible decisions using a formal evaluation process that gration drives project sched- evaluates identified alternatives against established criteria. ules and when resources need Refer to the Project Planning process area for more information about establish- to be available. Therefore, ing and maintaining plans that define project activities. many management and engi- neering processes depend on Refer to the Risk Management process area for more information about identi- the product integration fying risks and mitigating risks. strategy. Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers. The results of developing a product integration strategy are typi- cally documented in a product integration plan, which is reviewed with stakeholders to promote commitment and understanding. Some of the items addressed in a product integration strategy are covered in more detail in the other specific practices and generic practices of this process area (e.g., environment, procedures and criteria, train- ing, roles and responsibilities, involvement of relevant stakeholders).


Example Work Products
1. Product integration strategy
2. Rationale for selecting or rejecting alternative product integration strategies

Subpractices
1. Identify the product components to be integrated.
2. Identify the verifications to be performed during the integration of the product components.
This identification includes verifications to be performed on interfaces.
3. Identify alternative product component integration strategies.
Developing an integration strategy can involve specifying and evaluating several alternative integration strategies or sequences.
4. Select the best integration strategy.
The availability of the following will need to be aligned or harmonized with the integration strategy: product components; the integration environment; test tools and equipment; procedures and criteria; relevant stakeholders; and staff who possess the appropriate skills.
5. Periodically review the product integration strategy and revise as needed.
Assess the product integration strategy to ensure that variations in production and delivery schedules have not had an adverse impact on the integration sequence or compromised the factors on which earlier decisions were made.
6. Record the rationale for decisions made and deferred.

TIP: Besides product components, the integration strategy can also address integration with test equipment and assembly fixtures.

HINT: At each stage following an assembly, test whether the resulting assembly behaves as expected.

TIP: You can use a formal evaluation process to select an integration strategy or sequence from among alternatives. Criteria for selection can include the impact of the alternative on cost, schedule, performance, and risk.

SP 1.2 ESTABLISH THE PRODUCT INTEGRATION ENVIRONMENT
Establish and maintain the environment needed to support the integration of the product components.
The environment for product integration can either be acquired or developed. To establish an environment, requirements for the purchase or development of equipment, software, or other resources will need to be developed. These requirements are gathered when implementing the processes associated with the Requirements Development process area. The product integration environment can include the reuse of existing organizational resources. The decision to acquire or develop the product integration environment is addressed in the processes associated with the Technical Solution process area.
Refer to the Technical Solution process area for more information about performing make, buy, or reuse analyses.

TIP: Providing an integration environment to support product integration may be a significant endeavor best treated as a project.


The environment required at each step of the product integration process can include test equipment, simulators (taking the place of unavailable product components), pieces of real equipment, and recording devices.

TIP: Consider what is required of the integration environment at each stage of integration according to the integration strategy established in SP 1.1.

Example Work Products
1. Verified environment for product integration
2. Support documentation for the product integration environment

Subpractices
1. Identify the requirements for the product integration environment.
2. Identify verification procedures and criteria for the product integration environment.
3. Decide whether to make or buy the needed product integration environment.
Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers.

4. Develop an integration environment if a suitable environment cannot be acquired.
For unprecedented, complex projects, the product integration environment can be a major development. As such, it would involve project planning, requirements development, technical solutions, verification, validation, and risk management.
5. Maintain the product integration environment throughout the project.
6. Dispose of those portions of the environment that are no longer useful.

HINT: Revise the integration environment if the integration strategy significantly changes, additional prototypes and simulations are needed to replace product components that are delivered late, or the understanding of the operational environment changes significantly.

SP 1.3 ESTABLISH PRODUCT INTEGRATION PROCEDURES AND CRITERIA
Establish and maintain procedures and criteria for integration of the product components.

TIP: This specific practice answers questions such as how to obtain product components to be assembled, how to assemble them, and what testing to perform. Also, criteria are established to answer questions such as when a component is ready for integration, what level of performance is required on each test, and what is acceptable validation and delivery of the final product.

Procedures for the integration of the product components can include such things as the number of incremental iterations to be performed and details of the expected tests and other evaluations to be carried out at each stage. Criteria can indicate the readiness of a product component for integration or its acceptability.
Procedures and criteria for product integration address the following:


• Level of testing for build components
• Verification of interfaces
• Thresholds of performance deviation
• Derived requirements for the assembly and its external interfaces
• Allowable substitutions of components
• Testing environment parameters
• Limits on cost of testing
• Quality/cost tradeoffs for integration operations
• Probability of proper functioning
• Delivery rate and its variation
• Lead time from order to delivery
• Staff member availability
• Availability of the integration facility/line/environment

Criteria can be defined for how the product components are to be verified and the behaviors (functionality and quality attributes) they are expected to have. Criteria can be defined for how the assembled product components and final integrated product are to be validated and delivered.
Criteria can also constrain the degree of simulation permitted for a product component to pass a test, or can constrain the environment to be used for the integration test.

TIP: Criteria may impose limits or constraints on how testing is performed.

Pertinent parts of the schedule and criteria for assembly should be shared with suppliers of work products to reduce the occurrence of delays and component failure.
Refer to the Supplier Agreement Management process area for more information about executing the supplier agreement.
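Teams sometimes find such procedures and criteria easier to apply consistently when the criteria are recorded in a structured, checkable form rather than scattered across documents. The following Python sketch is purely illustrative: the field names, threshold values, and readiness check are hypothetical examples, not elements defined by the model.

# Illustrative sketch only: integration criteria captured as data so they can
# be checked the same way at every integration stage. All names and values
# are hypothetical examples.
from dataclasses import dataclass


@dataclass
class IntegrationCriteria:
    required_test_level: str           # e.g., "unit", "subsystem", "system"
    max_perf_deviation_pct: float      # threshold of performance deviation
    allow_simulated_components: bool   # degree of simulation permitted


def meets_criteria(test_level: str, perf_deviation_pct: float,
                   uses_simulation: bool, criteria: IntegrationCriteria) -> bool:
    """Return True if an evaluation result satisfies the stated criteria."""
    return (test_level == criteria.required_test_level
            and perf_deviation_pct <= criteria.max_perf_deviation_pct
            and (criteria.allow_simulated_components or not uses_simulation))


# Example: a stage that requires subsystem-level tests and at most 5% deviation.
stage_criteria = IntegrationCriteria("subsystem", 5.0, True)
print(meets_criteria("subsystem", 3.2, True, stage_criteria))   # True
print(meets_criteria("unit", 3.2, False, stage_criteria))       # False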

Example Work Products
1. Product integration procedures
2. Product integration criteria

Subpractices
1. Establish and maintain product integration procedures for the product components.
2. Establish and maintain criteria for product component integration and evaluation.

3. Establish and maintain criteria for validation and delivery of the integrated product.


SG 2 ENSURE INTERFACE COMPATIBILITY

The product component interfaces, both internal and external, are compatible.

TIP: The source of many product integration problems lies in incompatible interfaces. SG 2 is designed to help ensure that interfaces are compatible.

Many product integration problems arise from unknown or uncontrolled aspects of both internal and external interfaces. Effective management of product component interface requirements, specifications, and designs helps ensure that implemented interfaces will be complete and compatible.

X-REF: Because of the importance of interfaces to effective product development, interfaces are addressed in multiple places: interface requirements are covered in RD SP 2.3, design of interfaces is covered in TS SP 2.3, and ensuring interface compatibility is covered in PI SG 2.

SP 2.1 REVIEW INTERFACE DESCRIPTIONS FOR COMPLETENESS

Review interface descriptions for coverage and completeness.

The interfaces should include, in addition to product component interfaces, all the interfaces with the product integration environment.

TIP: These reviews should be periodic to ensure consistency is maintained between interface descriptions and product components (see Subpractice 3) and that new interfaces are not overlooked.

Example Work Products
1. Categories of interfaces
2. List of interfaces per category
3. Mapping of the interfaces to the product components and the product integration environment

Subpractices

1. Review interface data for completeness and ensure complete coverage of all interfaces.
Consider all the product components and prepare a relationship table. Interfaces are usually classified in three main classes: environmental, physical, and functional. Typical categories for these classes include the following: mechanical, fluid, sound, electrical, climatic, electromagnetic, thermal, message, and the human-machine or human interface.

TIP: Interfaces with environments such as product integration, verification, and validation environments, as well as operational, maintenance, and support environments, should be addressed.

TIP: Interface data is all the data associated with product component interfaces, including requirements, designs, and interface descriptions.

Examples of interfaces (e.g., for mechanical or electronic components) that can be classified within these three classes include the following:
• Mechanical interfaces (e.g., weight and size, center of gravity, clearance of parts in operation, space required for maintenance, fixed links, mobile links, shocks and vibrations received from the bearing structure)
• Noise interfaces (e.g., noise transmitted by the structure, noise transmitted in the air, acoustics)
• Climatic interfaces (e.g., temperature, humidity, pressure, salinity)


• Thermal interfaces (e.g., heat dissipation, transmission of heat to the bearing structure, air conditioning characteristics)
• Fluid interfaces (e.g., fresh water inlet/outlet, seawater inlet/outlet for a naval/coastal product, air conditioning, compressed air, nitrogen, fuel, lubricating oil, exhaust gas outlet)
• Electrical interfaces (e.g., power supply consumption by network with transients and peak values; nonsensitive control signal for power supply and communications; sensitive signal [e.g., analog links]; disturbing signal [e.g., microwave]; grounding signal to comply with the TEMPEST standard)
• Electromagnetic interfaces (e.g., magnetic field, radio and radar links, optical band link wave guides, coaxial and optical fibers)
• Human-machine interface (e.g., audio or voice synthesis, audio or voice recognition, display [analog dial, liquid crystal display, indicators’ light emitting diodes], manual controls [pedal, joystick, track ball, keyboard, push buttons, touch screen])
• Message interfaces (e.g., origination, destination, stimulus, protocols, data characteristics)

TIP: Overlooked or incompletely described interfaces are risks to be identified and managed (RSKM). The realization of these risks in some business situations may lead to safety recalls and be very costly.

HINT: Use these examples (and others) to ensure that all aspects of an interface are addressed in an interface description.

2. Ensure that product components and interfaces are marked to ensure easy and correct connection to the joining product component.

TIP: For mechanical or electrical product components, markings can help ensure correct connections.

3. Periodically review the adequacy of interface descriptions.
Once established, the interface descriptions should be periodically reviewed to ensure there is no deviation between the existing descriptions and the products being developed, processed, produced, or bought.
The interface descriptions for product components should be reviewed with relevant stakeholders to avoid misinterpretations, reduce delays, and prevent the development of interfaces that do not work properly.

HINT: Include stakeholders internal and external to the project (e.g., suppliers).

SP 2.2 MANAGE INTERFACES

Manage internal and external interface definitions, designs, and changes for products and product components.

Interface requirements drive the development of the interfaces necessary to integrate product components. Managing product and product component interfaces starts early in the development of the product.

HINT: Manage interfaces early in the project to help prevent inconsistencies from arising between product components.

The definitions and designs for interfaces affect not only the product components and external systems, but can also affect the verification and validation environments.


Refer to the Requirements Development process area for more information about identifying interface requirements.
Refer to the Technical Solution process area for more information about designing interfaces using criteria.
Refer to the Configuration Management process area for more information about establishing and maintaining the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.
Refer to the Manage Requirements Changes specific practice in the Requirements Management process area for more information about managing changes to the interface requirements.

X-REF: Interface descriptions are typically placed under configuration management (see GP 2.6) so that changes in status are recorded and communicated (see CM SP 3.1).

Management of the interfaces includes maintenance of the consistency of the interfaces throughout the life of the product, compliance with architectural decisions and constraints, and resolution of conflict, noncompliance, and change issues. The management of interfaces between products acquired from suppliers and other products or product components is critical for success of the project.
Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers.
The interfaces should include, in addition to product component interfaces, all the interfaces with the environment as well as other environments for verification, validation, operations, and support. The interface changes are documented, maintained, and readily accessible.

Example Work Products
1. Table of relationships among the product components and the external environment (e.g., main power supply, fastening product, computer bus system)
2. Table of relationships among the different product components
3. List of agreed-to interfaces defined for each pair of product components, when applicable
4. Reports from the interface control working group meetings
5. Action items for updating interfaces
6. Application program interface (API)
7. Updated interface description or agreement

TIP: These tables in the example work products are developed as part of the previous specific practice (SP 2.1), but here they have a role in managing interfaces.

TIP: An API is the interface that a computer system, library, or application provides to allow other computer programs to make requests for service and/or exchange data.

Subpractices
1. Ensure the compatibility of the interfaces throughout the life of the product.
2. Resolve conflict, noncompliance, and change issues.


3. Maintain a repository for interface data accessible to project participants.
A common accessible repository for interface data provides a mechanism to ensure that everyone knows where the current interface data reside and can access them for use.
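As a hedged illustration only, such a repository can be as simple as a table of agreed-to interfaces keyed by the pair of product components each interface connects. Every component name, field, and value in the Python sketch below is invented for the example.

# Minimal sketch of an interface-data repository: a lookup keyed by the pair
# of product components an interface connects. All names are hypothetical.
interface_repository = {
    ("guidance_unit", "power_bus"): {
        "class": "physical",            # environmental, physical, or functional
        "category": "electrical",
        "description": "28 V DC supply, peak draw 3 A",
        "version": "1.2",
        "status": "agreed",
    },
    ("guidance_unit", "flight_computer"): {
        "class": "functional",
        "category": "message",
        "description": "Telemetry frames, 10 Hz, protocol rev B",
        "version": "2.0",
        "status": "under change review",
    },
}

def interfaces_for(component: str):
    """Return all interface records that involve the given product component."""
    return {pair: data for pair, data in interface_repository.items()
            if component in pair}

print(interfaces_for("guidance_unit"))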

SG 3 ASSEMBLE PRODUCT COMPONENTS AND DELIVER THE PRODUCT

Verified product components are assembled and the integrated, verified, and validated product is delivered.

TIP: Although the primary focus of SG 3 is on PI, it summarizes concepts from three PAs: PI, VER, and VAL.

Integration of product components proceeds according to the product integration strategy and procedures. Before integration, each product component should be confirmed to be compliant with its interface requirements. Product components are assembled into larger, more complex product components. These assembled product components are checked for correct interoperation. This process continues until product integration is complete. If, during this process, problems are identified, the problem should be documented and a corrective action process initiated.

X-REF: If problems are identified before, during, or after an integration stage, an exception report is written (see PMC SG 2).

The timely receipt of needed product components and the involvement of the right people contribute to the successful integration of the product components that compose the product.

TIP: The first three specific practices of SG 3 progressively integrate product components in the integration environment using the integration strategy, procedures, and criteria established in SG 1 until the complete product is assembled. The final specific practice is responsible for product packaging, delivery, and installation.

SP 3.1 CONFIRM READINESS OF PRODUCT COMPONENTS FOR INTEGRATION

Confirm, prior to assembly, that each product component required to assemble the product has been properly identified, behaves according to its description, and that the product component interfaces comply with the interface descriptions.

Refer to the Verification process area for more information about performing verification.

The purpose of this specific practice is to ensure that the properly identified product component that meets its description can actually be assembled according to the product integration strategy and procedures. The product components are checked for quantity, obvious damage, and consistency between the product component and interface descriptions.
Those who conduct product integration are ultimately responsible for checking to make sure everything is proper with the product components before assembly.

TIP: It may seem as if these practices were written for assembling mechanical or electronic product components, but with the exception of a few notes and subpractices, they also apply more generally, including to software.


Example Work Products
1. Acceptance documents for the received product components
2. Delivery receipts
3. Checked packing lists
4. Exception reports
5. Waivers

Subpractices
1. Track the status of all product components as soon as they become available for integration.
2. Ensure that product components are delivered to the product integration environment in accordance with the product integration strategy and procedures.
3. Confirm the receipt of each properly identified product component.
4. Ensure that each received product component meets its description.
5. Check the configuration status against the expected configuration.
6. Perform a pre-check (e.g., by a visual inspection, using basic measures) of all the physical interfaces before connecting product components together.

SP 3.2 ASSEMBLE PRODUCT COMPONENTS

Assemble product components according to the product integration strategy and procedures.

TIP: Assembly (SP 3.2) and evaluation (SP 3.3) occur at each integration stage. Evaluate the assembled product components before proceeding to the next integration stage.

The assembly activities of this specific practice and the evaluation activities of the next specific practice are conducted iteratively, from the initial product components, through the interim assemblies of product components, to the product as a whole.

Example Work Products
1. Assembled product or product components

Subpractices
1. Ensure the readiness of the product integration environment.

2. Conduct integration in accordance with the product integration strategy, procedures, and criteria.
Record all appropriate information (e.g., configuration status, serial numbers of the product components, types, calibration date of the meters).
3. Revise the product integration strategy, procedures, and criteria as appropriate.

HINT: If the integration does not proceed as planned (e.g., a product component is not ready for integration), you may have to revise the integration strategy, procedures, criteria, and plans.


SP 3.3 EVALUATE ASSEMBLED PRODUCT COMPONENTS

Evaluate assembled product components for interface compatibility.

TIP: We also addressed interface compatibility in SG 2. Here in SP 3.3, the focus is on evaluating an assembly for interface compatibility. In this practice, the evaluations indicated by the integration procedures and criteria established in SP 1.3 are conducted.

Refer to the Validation process area for more information about performing validation.
Refer to the Verification process area for more information about performing verification.

This evaluation involves examining and testing assembled product components for performance, suitability, or readiness using the product integration procedures, criteria, and environment. It is performed as appropriate for different stages of assembly of product components as identified in the product integration strategy and procedures. The product integration strategy and procedures can define a more refined integration and evaluation sequence than might be envisioned just by examining the product hierarchy or architecture. For example, if an assembly of product components is composed of four less complex product components, the integration strategy will not necessarily call for the simultaneous integration and evaluation of the four units as one. Rather, the four less complex units can be integrated progressively, one at a time, with an evaluation after each assembly operation prior to realizing the more complex product component that matched the specification in the product architecture. Alternatively, the product integration strategy and procedures could have determined that only a final evaluation was the best one to perform.

TIP: Integration may be accomplished in multiple stages as opposed to a single stage.

TIP: Evaluations include tests for interface compatibility and of how well the assembled product components interoperate, and may include tests involving end users. Thus, the evaluations may involve both verification and validation.

Example Work Products
1. Exception reports
2. Interface evaluation reports
3. Product integration summary reports

Subpractices
1. Conduct the evaluation of assembled product components following the product integration strategy, procedures, and criteria.
2. Record the evaluation results.

Example results include the following:
• Any adaptation required to the integration procedure or criteria
• Any change to the product configuration (spare parts, new release)
• Evaluation procedure or criteria deviations

HINT: Remember to document any adaptation, change, or deviation from the established integration procedure or criteria.


SP 3.4 PACKAGE AND DELIVER THE PRODUCT OR PRODUCT COMPONENT

Package the assembled product or product component and deliver it to the customer.

Refer to the Validation process area for more information about performing validation.
Refer to the Verification process area for more information about performing verification.

The packaging requirements for some products can be addressed in their specifications and verification criteria. This handling of requirements is especially important when items are stored and transported by the customer. In such cases, there can be a spectrum of environmental and stress conditions specified for the package. In other circumstances, factors such as the following can become important:

• Economy and ease of transportation (e.g., containerization)
• Accountability (e.g., shrink wrapping)
• Ease and safety of unpacking (e.g., sharp edges, strength of binding methods, childproofing, environmental friendliness of packing material, weight)

TIP: Depending on the nature of the product, packaging may be relatively straightforward (e.g., distributing software on CDs) or complicated (e.g., packaging and transporting large HVAC systems to office buildings).

The adjustment required to fit product components together in the factory could be different from the one required to fit product components together when installed on the operational site. In that case, the product’s logbook for the customer should be used to record such specific parameters.

Example Work Products
1. Packaged product or product components
2. Delivery documentation

Subpractices
1. Review the requirements, design, product, verification results, and documentation to ensure that issues affecting the packaging and delivery of the product are identified and resolved.
2. Use effective methods to package and deliver the assembled product.

Examples of software packaging and delivery methods include the following:
• Magnetic tape
• Diskettes
• Hardcopy documents
• Compact disks
• Other electronic distribution such as the Internet

3. Satisfy the applicable requirements and standards for packaging and delivering the product.

Examples of requirements and standards include ones for safety, the environment, security, transportability, and disposal.

Examples of requirements and standards for packaging and delivering software include the following:
• Type of storage and delivery media
• Custodians of the master and backup copies
• Required documentation
• Copyrights
• License provisions
• Security of the software

4. Prepare the operational site for installation of the product.
Preparing the operational site can be the responsibility of the customer or end users.

X-REF: A product installation process (TS SP 3.2) should describe how to prepare a site for product installation.

5. Deliver the product and related documentation and confirm receipt.
6. Install the product at the operational site and confirm correct operation.
Installing the product can be the responsibility of the customer or the end users. In some circumstances, little may need to be done to confirm correct operation. In other circumstances, final verification of the integrated product occurs at the operational site.




PROJECT MONITORING AND CONTROL
A Project Management Process Area at Maturity Level 2

TIP: In the CMMI for Services Model, this process area is called Work Monitoring and Control.

Purpose

The purpose of Project Monitoring and Control (PMC) is to provide an understanding of the project’s progress so that appropriate corrective actions can be taken when the project’s performance deviates significantly from the plan.

TIP: PP provides the overall plan and PMC tracks activities against the plan.

Introductory Notes

A project’s documented plan is the basis for monitoring activities, communicating status, and taking corrective action. Progress is primarily determined by comparing actual work product and task attributes, effort, cost, and schedule to the plan at prescribed milestones or control levels in the project schedule or WBS. Appropriate visibility of progress enables timely corrective action to be taken when performance deviates significantly from the plan. A deviation is significant if, when left unresolved, it precludes the project from meeting its objectives.
The term “project plan” is used throughout this process area to refer to the overall plan for controlling the project.
When actual status deviates significantly from expected values, corrective actions are taken as appropriate. These actions can require replanning, which can include revising the original plan, establishing new agreements, or including additional mitigation activities in the current plan.

TIP: Initially, the organization may have a reactive culture. However, as monitoring and control of activities become routine, project managers and staff begin to anticipate problems and success in advance.

TIP: Throughout the other PAs, when corrective action is needed, PMC is referenced.

Related Process Areas

Refer to the Measurement and Analysis process area for more information about providing measurement results.



Refer to the Project Planning process area for more information about establishing and maintaining plans that define project activities.

Specific Goal and Practice Summary
SG 1 Monitor the Project Against the Plan
SP 1.1 Monitor Project Planning Parameters
SP 1.2 Monitor Commitments
SP 1.3 Monitor Project Risks
SP 1.4 Monitor Data Management
SP 1.5 Monitor Stakeholder Involvement
SP 1.6 Conduct Progress Reviews
SP 1.7 Conduct Milestone Reviews
SG 2 Manage Corrective Action to Closure
SP 2.1 Analyze Issues
SP 2.2 Take Corrective Action
SP 2.3 Manage Corrective Actions

Specific Practices by Goal

SG 1 MONITOR THE PROJECT AGAINST THE PLAN

Actual project progress and performance are monitored against the project plan.

X-REF: The project plan was developed in PP.

SP 1.1 MONITOR PROJECT PLANNING PARAMETERS

Monitor actual values of project planning parameters against the project plan.

Project planning parameters constitute typical indicators of project progress and performance and include attributes of work products and tasks, costs, effort, and schedule. Attributes of the work products and tasks include size, complexity, service level, availability, weight, form, fit, and function. The frequency of monitoring parameters should be considered.
Monitoring typically involves measuring actual values of project planning parameters, comparing actual values to estimates in the plan, and identifying significant deviations. Recording actual values of project planning parameters includes recording associated contextual information to help understand measures.
An analysis of the impact that significant deviations have on determining the corrective actions to take is handled in specific goal 2 and its specific practices in this process area.

TIP: Actual values are used as “historical data” to provide a basis for estimates in planning.


Example Work Products

1. Records of project performance
2. Records of significant deviations
3. Cost performance reports

Subpractices

TIP: These subpractices mirror the specific practices in PP.

1. Monitor progress against the schedule.

Progress monitoring typically includes the following:
• Periodically measuring the actual completion of activities and milestones
• Comparing actual completion of activities and milestones against the project plan schedule
• Identifying significant deviations from the project plan schedule estimates

2. Monitor the project’s costs and expended effort.

Effort and cost monitoring typically includes the following:
• Periodically measuring the actual effort and costs expended and staff assigned
• Comparing actual effort, costs, staffing, and training to the project plan budget and estimates
• Identifying significant deviations from the project plan budget and estimates

3. Monitor the attributes of work products and tasks.
Refer to the Measurement and Analysis process area for more information about developing and sustaining a measurement capability used to support management information needs.
Refer to the Project Planning process area for more information about establishing estimates of work product and task attributes.

Monitoring the attributes of work products and tasks typically includes the following:
• Periodically measuring the actual attributes of work products and tasks, such as size, complexity, or service levels (and changes to these attributes)
• Comparing the actual attributes of work products and tasks (and changes to these attributes) to the project plan estimates
• Identifying significant deviations from the project plan estimates


4. Monitor resources provided and used.
Refer to the Project Planning process area for more information about planning the project’s resources.

Examples of resources include the following:
• Physical facilities
• Computers, peripherals, and software
• Networks
• Security environment
• Project staff
• Processes

5. Monitor the knowledge and skills of project staff.
Refer to the Project Planning process area for more information about planning needed knowledge and skills.

Monitoring the knowledge and skills of project staff typically includes the following:
• Periodically measuring the acquisition of knowledge and skills by project staff
• Comparing the actual training obtained to that documented in the project plan
• Identifying significant deviations from the project plan estimates

6. Document significant deviations in project planning parameters.
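The subpractices above reduce to comparing actual values against planned values and flagging significant deviations. The Python sketch below illustrates that comparison; the parameter names, planned and actual values, and the 10 percent threshold are hypothetical examples, not figures prescribed by the model.

# Illustrative sketch: compare actual project planning parameters to the plan
# and flag significant deviations. The threshold and values are hypothetical.
def significant_deviations(planned: dict, actual: dict, threshold_pct: float = 10.0):
    """Return parameters whose actual value deviates from plan by more than threshold_pct."""
    deviations = {}
    for name, plan_value in planned.items():
        actual_value = actual.get(name)
        if actual_value is None or plan_value == 0:
            continue  # nothing measured yet, or no basis for a percentage
        pct = abs(actual_value - plan_value) / plan_value * 100.0
        if pct > threshold_pct:
            deviations[name] = round(pct, 1)
    return deviations

planned = {"effort_hours": 1200, "cost_usd": 150000, "milestones_done": 4}
actual  = {"effort_hours": 1420, "cost_usd": 152000, "milestones_done": 3}

print(significant_deviations(planned, actual))
# {'effort_hours': 18.3, 'milestones_done': 25.0}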

SP 1.2 MONITOR COMMITMENTS

Monitor commitments against those identified in the project plan.

TIP: Situations may prevent appropriate follow-through with commitments, especially in an immature organization. Therefore, it is necessary to monitor commitments and take corrective action when commitments change.

Example Work Products
1. Records of commitment reviews

Subpractices
1. Regularly review commitments (both external and internal).
2. Identify commitments that have not been satisfied or are at significant risk of not being satisfied.
3. Document the results of commitment reviews.


SP 1.3 MONITOR PROJECT RISKS

Monitor risks against those identified in the project plan.

TIP: SP 1.3 checks the status of the risks that were identified in PP. This practice is reactive and involves minimal risk management activities. For more complete and proactive handling of project risks, refer to RSKM.

Refer to the Project Planning process area for more information about identifying project risks.
Refer to the Risk Management process area for more information about identifying potential problems before they occur so that risk handling activities can be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives.

Example Work Products
1. Records of project risk monitoring

Subpractices
1. Periodically review the documentation of risks in the context of the project’s current status and circumstances.
2. Revise the documentation of risks as additional information becomes available.
As projects progress (especially projects of long duration or continuous operation), new risks arise. It is important to identify and analyze these new risks. For example, software, equipment, and tools in use can become obsolete; or key staff can gradually lose skills in areas of particular long-term importance to the project and organization.
3. Communicate the risk status to relevant stakeholders.

Examples of risk status include the following:
• A change in the probability that the risk occurs
• A change in risk priority

SP 1.4 MONITOR DATA MANAGEMENT

Monitor the management of project data against the project plan.

Refer to the Plan Data Management specific practice in the Project Planning process area for more information about identifying types of data to be managed and how to plan for their management.

Data management activities should be monitored to ensure that data management requirements are being satisfied. Depending on the results of monitoring and changes in project requirements, situation, or status, it may be necessary to re-plan the project’s data management activities.


Example Work Products
1. Records of data management

Subpractices
1. Periodically review data management activities against their description in the project plan.
2. Identify and document significant issues and their impacts.

An example of a significant issue is when stakeholders do not have the access to project data they need to fulfill their roles as relevant stakeholders.

3. Document results of data management activity reviews.

SP 1.5 MONITOR STAKEHOLDER INVOLVEMENT

Monitor stakeholder involvement against the project plan.

Refer to the Plan Stakeholder Involvement specific practice in the Project Planning process area for more information about identifying relevant stakeholders and planning appropriate involvement with them.

TIP: In some plans, a specific stakeholder may be identified, whereas other stakeholders will be identified by role, such as quality assurance representative.

Stakeholder involvement should be monitored to ensure that appropriate interactions occur. Depending on the results of monitoring and changes in project requirements, situation, or status, it may be necessary to re-plan stakeholder involvement.

In Agile environments, the sustained involvement of customer and potential end users in the project’s product development activities can be crucial to project success; thus, customer and end-user involvement in project activities should be monitored. (See “Interpreting CMMI When Using Agile Approaches” in Part I.)

Example Work Products
1. Records of stakeholder involvement

Subpractices
1. Periodically review the status of stakeholder involvement.
2. Identify and document significant issues and their impacts.
3. Document the results of stakeholder involvement status reviews.


SP 1.6 CONDUCT PROGRESS REVIEWS

Periodically review the project’s progress, performance, and issues.

A “project’s progress” is the project’s status as viewed at a particular time when the project activities performed so far and their results and impacts are reviewed with relevant stakeholders (especially project representatives and project management) to determine whether there are significant issues or performance shortfalls to be addressed.
Progress reviews are project reviews to keep relevant stakeholders informed. These project reviews can be informal and may not be specified explicitly in project plans.

TIP: Progress reviews are held regularly (e.g., weekly, monthly, or quarterly).

Example Work Products
1. Documented project review results

Subpractices
1. Regularly communicate status on assigned activities and work products to relevant stakeholders.
Managers, staff, customers, end users, suppliers, and other relevant stakeholders are included in reviews as appropriate.
2. Review the results of collecting and analyzing measures for controlling the project.
The measurements reviewed can include measures of customer satisfaction.
Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
3. Identify and document significant issues and deviations from the plan.
4. Document change requests and problems identified in work products and processes.
Refer to the Configuration Management process area for more information about tracking and controlling changes.
5. Document the results of reviews.
6. Track change requests and problem reports to closure.

SP 1.7 CONDUCT MILESTONE REVIEWS

Review the project’s accomplishments and results at selected project milestones.

TIP: Milestones are major events in a project. If you are using a project lifecycle model, milestones may be predetermined.

Refer to the Establish the Budget and Schedule specific practice in the Project Planning process area for more information about identifying major milestones.


Milestones are pre-planned events or points in time at which a thorough review of status is conducted to understand how well stakeholder requirements are being met. (If the project includes a developmental milestone, then the review is conducted to ensure that the assumptions and requirements associated with that milestone are being met.) Milestones can be associated with the overall project or a particular service type or instance. Milestones can thus be event based or calendar based.
Milestone reviews are planned during project planning and are typically formal reviews.
Progress reviews and milestone reviews need not be held separately. A single review can address the intent of both. For example, a single pre-planned review can evaluate progress, issues, and performance up through a planned time period (or milestone) against the plan’s expectations.
Depending on the project, “project startup” and “project close-out” could be phases covered by milestone reviews.

Example Work Products
1. Documented milestone review results

Subpractices
1. Conduct milestone reviews with relevant stakeholders at meaningful points in the project’s schedule, such as the completion of selected phases.
Managers, staff, customers, end users, suppliers, and other relevant stakeholders are included in milestone reviews as appropriate.
2. Review commitments, the plan, status, and risks of the project.
3. Identify and document significant issues and their impacts.
4. Document results of the review, action items, and decisions.
5. Track action items to closure.

SG 2 MANAGE CORRECTIVE ACTION TO CLOSURE

Corrective actions are managed to closure when the project’s performance or results deviate significantly from the plan.

HINT: Managing corrective action to closure is critical. It is not enough to identify the action item; you must confirm that it has been completed.

SP 2.1 ANALYZE ISSUES

Collect and analyze issues and determine corrective actions to address them.

Example Work Products
1. List of issues requiring corrective actions


Subpractices

1. Gather issues for analysis.
Issues are collected from reviews and the execution of other processes.

Examples of issues to be gathered include the following:
• Issues discovered when performing technical reviews, verification, and validation
• Significant deviations in project planning parameters from estimates in the project plan
• Commitments (either internal or external) that have not been satisfied
• Significant changes in risk status
• Data access, collection, privacy, or security issues
• Stakeholder representation or involvement issues
• Product, tool, or environment transition assumptions (or other customer or supplier commitments) that have not been achieved

2. Analyze issues to determine the need for corrective action.
Refer to the Establish the Budget and Schedule specific practice in the Project Planning process area for more information about corrective action criteria.
Corrective action is required when the issue, if left unresolved, may prevent the project from meeting its objectives.

SP 2.2 TAKE CORRECTIVE ACTION

Take corrective action on identified issues.

TIP: In some cases, the corrective action can be to monitor the situation. A corrective action does not always result in a complete solution to the problem.

Example Work Products
1. Corrective action plans

Subpractices
1. Determine and document the appropriate actions needed to address identified issues.
Refer to the Project Planning process area for more information about developing a project plan.

Examples of potential actions include the following:
• Modifying the statement of work
• Modifying requirements
• Revising estimates and plans
• Renegotiating commitments
• Adding resources
• Changing processes
• Revising project risks

2. Review and get agreement with relevant stakeholders on the actions to be taken.
3. Negotiate changes to internal and external commitments.

SP 2.3 MANAGE CORRECTIVE ACTIONS

Manage corrective actions to closure.

Example Work Products
1. Corrective action results

Subpractices
1. Monitor corrective actions for their completion.
2. Analyze results of corrective actions to determine the effectiveness of the corrective actions.
3. Determine and document appropriate actions to correct deviations from planned results from performing corrective actions.
Lessons learned as a result of taking corrective action can be inputs to planning and risk management processes.

PROJECT PLANNING
A Project Management Process Area at Maturity Level 2

TIP: In the CMMI for Services Model, this process area is called Work Planning.

Purpose

The purpose of Project Planning (PP) is to establish and maintain plans that define project activities.

TIP: In planning, you determine the requirements to be fulfilled, the tasks to perform, and the resources and coordination required. This information is the basis for obtaining the needed resources and commitments.

Introductory Notes

One of the keys to effectively managing a project is project planning. The Project Planning process area involves the following activities:
• Developing the project plan
• Interacting with relevant stakeholders appropriately
• Getting commitment to the plan
• Maintaining the plan

TIP: The plan is a declaration that the work has been rationally thought through and requests for resources are credible. If you ask management to commit resources, they want to know it is worth the investment. A project plan helps you convince them.

Planning includes estimating the attributes of work products and tasks, determining the resources needed, negotiating commitments, producing a schedule, and identifying and analyzing project risks. Iterating through these activities may be necessary to establish the project plan. The project plan provides the basis for performing and controlling project activities that address commitments with the project’s customer. (See the definition of “project” in the glossary.)

TIP: Project planning is not just for large projects. Research with the Personal Software Process (PSP) has demonstrated that individuals working on tasks lasting only a few hours increase overall quality and productivity by making time to plan.

The project plan is usually revised as the project progresses to address changes in requirements and commitments, inaccurate estimates, corrective actions, and process changes. Specific practices describing both planning and replanning are contained in this process area.
The term “project plan” is used throughout this process area to refer to the overall plan for controlling the project. The project plan can be a stand-alone document or be distributed across multiple documents. In either case, a coherent picture of who does what should be included. Likewise, monitoring and control can be centralized or distributed, as long as at the project level a coherent picture of project status can be maintained.

X-REF: PMC addresses tracking of project activities in the plan.

HINT: To limit change in the plan, determine early in the project when and what kind of changes will be accepted.

HINT: Both the organization and the project should define triggers for replanning.

For product lines, there are multiple sets of work activities that would benefit from the practices of this process area. These work activities include the creation and maintenance of the core assets, developing products to be built using the core assets, and orchestrating the overall product line effort to support and coordinate the operations of the inter-related work groups and their activities.

In Agile environments, performing incremental development involves planning, monitoring, controlling, and re-planning more frequently than in more traditional development environments. While a high-level plan for the overall project or work effort is typically established, teams will estimate, plan, and carry out the actual work an increment or iteration at a time. Teams typically do not forecast beyond what is known about the project or iteration, except for anticipating risks, major events, and large-scale influences and constraints. Estimates reflect iteration and team specific factors that influence the time, effort, resources, and risks to accomplish the iteration. Teams plan, monitor, and adjust plans during each iteration as often as it takes (e.g., daily). Commitments to plans are demonstrated when tasks are assigned and accepted during iteration planning, user stories are elaborated or estimated, and iterations are populated with tasks from a maintained backlog of work. (See “Interpreting CMMI When Using Agile Approaches” in Part I.)
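As one hedged illustration of populating an iteration from a maintained backlog, the Python sketch below selects items in priority order until an assumed team velocity is reached; the stories, point estimates, and velocity are invented for the example and are not part of the model.

# Illustrative sketch: fill an iteration from a prioritized backlog without
# exceeding the team's velocity. All story names and numbers are hypothetical.
backlog = [
    {"story": "User login", "points": 5},
    {"story": "Password reset", "points": 3},
    {"story": "Audit logging", "points": 8},
    {"story": "Profile page", "points": 5},
]

def plan_iteration(backlog, velocity):
    """Select backlog items in priority order until the velocity budget is used."""
    selected, remaining_capacity = [], velocity
    for item in backlog:
        if item["points"] <= remaining_capacity:
            selected.append(item)
            remaining_capacity -= item["points"]
    return selected

print([item["story"] for item in plan_iteration(backlog, velocity=13)])
# ['User login', 'Password reset', 'Profile page']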

Related Process Areas

Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Technical Solution process area for more information about selecting, designing, and implementing solutions to requirements.
Refer to the Measurement and Analysis process area for more information about specifying measures.
Refer to the Requirements Management process area for more information about managing requirements.
Refer to the Risk Management process area for more information about identifying and analyzing risks and mitigating risks.

Specific Goal and Practice Summary
SG 1 Establish Estimates
SP 1.1 Estimate the Scope of the Project
SP 1.2 Establish Estimates of Work Product and Task Attributes
SP 1.3 Define Project Lifecycle Phases
SP 1.4 Estimate Effort and Cost
SG 2 Develop a Project Plan
SP 2.1 Establish the Budget and Schedule
SP 2.2 Identify Project Risks
SP 2.3 Plan Data Management
SP 2.4 Plan the Project’s Resources
SP 2.5 Plan Needed Knowledge and Skills
SP 2.6 Plan Stakeholder Involvement
SP 2.7 Establish the Project Plan
SG 3 Obtain Commitment to the Plan
SP 3.1 Review Plans That Affect the Project
SP 3.2 Reconcile Work and Resource Levels
SP 3.3 Obtain Plan Commitment

Specific Practices by Goal

SG 1 ESTABLISH ESTIMATES

Estimates of project planning parameters are established and maintained.

X-REF: SG 1 focuses on providing estimates of project planning parameters; actual values are monitored in PMC SP 1.1.

Project planning parameters include all information needed by the project to perform necessary planning, organizing, staffing, directing, coordinating, reporting, and budgeting.

TIP: Project planning parameters are a key to managing a project.

Estimates of planning parameters should have a sound basis to instill confidence that plans based on these estimates are capable of supporting project objectives. Planning parameters primarily include size, effort, and cost.
Factors to consider when estimating these parameters include project requirements, including product requirements, requirements imposed by the organization, requirements imposed by the customer, and other requirements that affect the project.
Documentation of the estimating rationale and supporting data is needed for stakeholder review and commitment to the plan and for maintenance of the plan as the project progresses.

TIP: The basis for estimates can include historical data, the judgment of experienced estimators, and other factors.

HINT: Use this rationale to help justify to management why you need resources and why the effort and schedule estimates are appropriate.


SP 1.1 ESTIMATE THE SCOPE OF THE PROJECT

Establish a top-level work breakdown structure (WBS) to estimate the scope of the project.

HINT: To develop estimates, decompose the project into smaller work items (the WBS), estimate the resources needed by each item, and then roll these up. This will result in more accurate estimates.

The WBS evolves with the project. A top-level WBS can serve to structure initial estimating. The development of a WBS divides the overall project into an interconnected set of manageable components.
Typically, the WBS is a product, work product, or task oriented structure that provides a scheme for identifying and organizing the logical units of work to be managed, which are called “work packages.” The WBS provides a reference and organizational mechanism for assigning effort, schedule, and responsibility and is used as the underlying framework to plan, organize, and control the work done on the project.
Some projects use the term “contract WBS” to refer to the portion of the WBS placed under contract (possibly the entire WBS). Not all projects have a contract WBS (e.g., internally funded development).

TIP: Interaction and iteration among planning, requirements definition, and design are often necessary. A project can learn a lot from each iteration and can use this knowledge to update the plan, requirements, and design for the next iteration.

TIP: The WBS is the basis for many management-related tasks (not just estimating). The Product Breakdown Structure (PBS) may be part of the WBS and defines the structure of products and product components that will be developed for a specific project. The WBS details the activities required to achieve a goal, whereas the PBS defines the desired outcomes (products) needed to achieve the goal.

Example Work Products
1. Task descriptions
2. Work package descriptions
3. WBS

Subpractices
1. Develop a WBS.
The WBS provides a scheme for organizing the project’s work. The WBS should permit the identification of the following items:
• Risks and their mitigation tasks
• Tasks for deliverables and supporting activities
• Tasks for skill and knowledge acquisition
• Tasks for the development of needed support plans, such as configuration management, quality assurance, and verification plans
• Tasks for the integration and management of nondevelopmental items
2. Define the work packages in sufficient detail so that estimates of project tasks, responsibilities, and schedule can be specified.
The top-level WBS is intended to help gauge the project work effort for tasks and organizational roles and responsibilities. The amount of detail in the WBS at this level helps in developing realistic schedules, thereby minimizing the need for management reserve.

TIP: The level of detail often depends on the level and completeness of the requirements. Often, the work packages and estimates evolve as the requirements evolve.

HINT: Consider establishing a management reserve commensurate with the overall uncertainty that allows for the efficient allocation of resources to address the uncertainty of estimates.


3. Identify products and product components to be externally acquired.
Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers.
4. Identify work products to be reused.

X-REF: Reuse is also addressed in TS.
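The earlier hint about decomposing the project into smaller work items and rolling the estimates back up can be pictured as a simple walk over the WBS. The Python sketch below is illustrative only; the element names and effort figures are invented.

# Illustrative sketch: a tiny WBS where effort estimates attached to work
# packages roll up through parent elements. Names and hours are hypothetical.
wbs = {
    "name": "Flight software",
    "children": [
        {"name": "Requirements analysis", "effort_hours": 120},
        {"name": "Design", "effort_hours": 200},
        {"name": "Implementation", "children": [
            {"name": "Guidance module", "effort_hours": 350},
            {"name": "Telemetry module", "effort_hours": 180},
        ]},
        {"name": "Integration and verification", "effort_hours": 240},
    ],
}

def rolled_up_effort(element):
    """Sum leaf work-package estimates up through the WBS element."""
    if "children" in element:
        return sum(rolled_up_effort(child) for child in element["children"])
    return element["effort_hours"]

print(rolled_up_effort(wbs))  # 1090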

SP 1.2 ESTABLISH ESTIMATES OF WORK PRODUCT AND TASK ATTRIBUTES

Establish and maintain estimates of work product and task attributes.

HINT: Learn to quantify the resources needed for particular tasks by associating size measures with each type of work product and building historical data. By collecting historical data from projects, you can learn how measured size relates to the resources consumed by tasks. This knowledge can then be used when planning the next project.

Size is the primary input to many models used to estimate effort, cost, and schedule. Models can also be based on other attributes such as service level, connectivity, complexity, availability, and structure.

X-REF: For more information on estimating, see Richard D. Stutzke, Estimating Software-Intensive Systems: Projects, Products, and Processes, SEI Series in Software Engineering, 2005.

Examples of attributes to estimate include the following:
• Number and complexity of requirements
• Number and complexity of interfaces
• Volume of data
• Number of functions
• Function points
• Source lines of code
• Number of classes and objects
• Team velocity and complexity
• Number of pages
• Number of inputs and outputs
• Number of technical risk items
• Number of database tables
• Number of fields in data tables
• Architecture elements
• Experience of project participants
• Amount of code to be reused versus created
• Number of logic gates for integrated circuits
• Number of parts (e.g., printed circuit boards, components, mechanical parts)
• Physical constraints (e.g., weight, volume)
• Geographic dispersal of project members
• Proximity of customers, end users, and suppliers
• How agreeable or difficult the customer is
• Quality and “cleanliness” of the existing code base

TIP: This list purposely provides a lengthy list of examples. Rarely would you see all of these attributes addressed in any plan.


The estimates should be consistent with project requirements to determine the project’s effort, cost, and schedule. A relative level of difficulty or complexity should be assigned for each size attribute.

HINT: Consider providing guidelines on how to estimate the difficulty or complexity of a task to improve estimation accuracy, especially when size measures are not available.

Example Work Products
1. Size and complexity of tasks and work products
2. Estimating models
3. Attribute estimates
4. Technical approach

Subpractices
1. Determine the technical approach for the project.
The technical approach defines a top-level strategy for development of the product. It includes decisions on architectural features, such as distributed or client/server; state-of-the-art or established technologies to be applied, such as robotics, composite materials, or artificial intelligence; and the functionality and quality attributes expected in the final products, such as safety, security, and ergonomics.
2. Use appropriate methods to determine the attributes of the work products and tasks to be used to estimate resource requirements.
Methods for determining size and complexity should be based on validated models or historical data.
The methods for determining attributes evolve as the understanding of the relationship of product characteristics to attributes increases.

X-REF: Mature organizations maintain historical data to help projects establish reasonable estimates (see SP 1.4, MA, and IPM SP 1.2).

Examples of work products for which size estimates are made include the following: • Deliverable and nondeliverable work products • Documents and files • Operational and support hardware, firmware, and software

SP 1.3 DEFINE PROJECT LIFECYCLE PHASES Define project lifecycle phases on which to scope the planning effort.

The determination of a project’s lifecycle phases provides for planned periods of evaluation and decision making. These periods are nor- mally defined to support logical decision points at which the appro- priateness of continued reliance on the project plan and strategy is determined and significant commitments are made concerning

Wow! eBook Project Planning 409 resources. Such points provide planned events at which project TIP course corrections and determinations of future scope and cost can For example, in waterfall devel- be made. opment, at the end of the Understanding the project lifecycle is crucial in determining the requirements analysis phase, the requirements are evaluated scope of the planning effort and the timing of initial planning, as well to assess consistency, com- as the timing and criteria (critical milestones) for replanning. pleteness, and feasibility, and The project lifecycle phases need to be defined depending on the to decide whether the project scope of requirements, the estimates for project resources, and the is ready (from a technical and nature of the project. Larger projects can contain multiple phases, risk perspective) to commit resources to the design phase. such as concept exploration, development, production, operations, PP and disposal. Within these phases, subphases may be needed. A development phase can include subphases such as requirements analysis, design, fabrication, integration, and verification. The deter- mination of project phases typically includes selection and refinement of one or more development models to address interdependencies and appropriate sequencing of the activities in the phases. Depending on the strategy for development, there can be interme- diate phases for the creation of prototypes, increments of capability, or spiral model cycles. In addition, explicit phases for “project startup” and “project close-out” can be included. Example Work Products ptg 1. Project lifecycle phases

SP 1.4 ESTIMATE EFFORT AND COST Estimate the project’s effort and cost for work products and tasks based on estimation rationale.

Estimates of effort and cost are generally based on results of analysis X-REF using models or historical data applied to size, activities, and other An example of a parametric planning parameters. Confidence in these estimates is based on software cost estimation rationale for the selected model and the nature of the data. There can model is COCOMO II (see be occasions when available historical data do not apply, such as http://sunset.usc.edu/csse/ when efforts are unprecedented or when the type of task does not fit research/COCOMOII/ cocomo_main.html). available models. For example, an effort can be considered unprece- dented if the organization has no experience with such a product or TIP task. Unprecedented efforts require Unprecedented efforts are more risky, require more research to an iterative or spiral develop- develop reasonable bases of estimate, and require more management ment model that provides reserve. The uniqueness of the project should be documented when frequent opportunities for feedback used to resolve using these models to ensure a common understanding of any issues or risks and to plan the assumptions made in the initial planning phases. next iteration.


Example Work Products
1. Estimation rationale
2. Project effort estimates
3. Project cost estimates

Subpractices
1. Collect models or historical data to be used to transform the attributes of work products and tasks into estimates of labor hours and costs.
Many parametric models have been developed to help estimate cost and schedule. The use of these models as the sole source of estimation is not recommended because these models are based on historical project data that may or may not be pertinent to the project. Multiple models and methods can be used to ensure a high level of confidence in the estimate.
HINT: If you are using only one parametric model, make sure it is calibrated to your project's characteristics.
Historical data should include the cost, effort, and schedule data from previously executed projects and appropriate scaling data to account for differing sizes and complexity.
TIP: Scaling can be reliable when applied from experiences similar to the one at hand.
2. Include supporting infrastructure needs when estimating effort and cost.
The supporting infrastructure includes resources needed from a development and sustainment perspective for the product.
Consider the infrastructure resource needs in the development environment, the test environment, the production environment, the operational environment, or any appropriate combination of these environments when estimating effort and cost.

Examples of infrastructure resources include the following:
• Critical computer resources (e.g., memory, disk and network capacity, peripherals, communication channels, the capacities of these resources)
• Engineering environments and tools (e.g., tools for prototyping, testing, integration, assembly, computer-aided design [CAD], simulation)
• Facilities, machinery, and equipment (e.g., test benches, recording devices)

3. Estimate effort and cost using models, historical data, or a combination of both.

Examples of effort and cost inputs used for estimating typically include the following:
• Estimates provided by an expert or group of experts (e.g., Delphi method, Extreme Programming's Planning Game)
• Risks, including the extent to which the effort is unprecedented
• Critical competencies and roles needed to perform the work
• Travel
• WBS
• Selected project lifecycle model and processes
• Lifecycle cost estimates
• Skill levels of managers and staff needed to perform the work
• Knowledge, skill, and training needs
• Direct labor and overhead
• Service agreements for call centers and warranty work
• Level of security required for tasks, work products, hardware, software, staff, and work environment
• Facilities needed (e.g., office and meeting space and workstations)
• Product and product component requirements
• Size estimates of work products, tasks, and anticipated changes
• Cost of externally acquired products
• Capability of manufacturing processes
• Engineering facilities needed
• Capability of tools provided in engineering environment
• Technical approach

TIP: The common phrase "walk the talk" is often used when talking about how a project is conducted. The project plan is "the talk."

SG 2 DEVELOP A PROJECT PLAN
A project plan is established and maintained as the basis for managing the project.

TIP: In some cases, each project phase may have a more detailed and focused plan of its own, in addition to the overall project plan. Also, a detailed plan typically is provided for each iteration of development focused on particular requirements issues, design issues, or other risks.

A project plan is a formal, approved document used to manage and control the execution of the project. It is based on project requirements and established estimates.
The project plan should consider all phases of the project lifecycle. Project planning should ensure that all plans affecting the project are consistent with the overall project plan.

TIP: Plans that may affect the project plan include configuration management plans, quality assurance plans, the organization's process improvement plan, and the organization's training plan.


SP 2.1 ESTABLISH THE BUDGET AND SCHEDULE
Establish and maintain the project's budget and schedule.

The project's budget and schedule are based on developed estimates and ensure that budget allocation, task complexity, and task dependencies are appropriately addressed.

HINT: If the budget is dictated by others and it doesn't cover your estimated resource needs, replan to ensure that the project will be within budget. Likewise, if the schedule is dictated by others and it isn't consistent with your plan, replan to ensure that the project will be able to deliver the product on time (perhaps with fewer features).

Event-driven, resource-limited schedules have proven to be effective in dealing with project risk. Identifying accomplishments to be demonstrated before initiation of an event provides some flexibility in the timing of the event, a common understanding of what is expected, a better vision of the state of the project, and a more accurate status of the project's tasks.

TIP: In an event-driven schedule, tasks can be initiated only after certain criteria are met.

Example Work Products
1. Project schedules
2. Schedule dependencies
3. Project budget

Subpractices
1. Identify major milestones.
Milestones are pre-planned events or points in time at which a thorough review of status is conducted to understand how well stakeholder requirements are being met. (If the project includes a developmental milestone, then the review is conducted to ensure that the assumptions and requirements associated with that milestone are being met.) Milestones can be associated with the overall project or a particular service type or instance. Milestones can thus be event based or calendar based. If calendar based, once agreed, milestone dates are often difficult to change.
TIP: Defining event-based milestones and monitoring their completion (PMC SP 1.1 Subpractice 1) provides visibility into the project's progress.
2. Identify schedule assumptions.
When schedules are initially developed, it is common to make assumptions about the duration of certain activities. These assumptions are frequently made on items for which little if any estimation data are available. Identifying these assumptions provides insight into the level of confidence (i.e., uncertainties) in the overall schedule.
3. Identify constraints.
Factors that limit the flexibility of management options should be identified as early as possible. The examination of the attributes of work products and tasks often brings these issues to the surface. Such attributes can include task duration, resources, inputs, and outputs.
4. Identify task dependencies.
Frequently, the tasks for a project or service can be accomplished in some ordered sequence that minimizes the duration. This sequencing involves the identification of predecessor and successor tasks to determine optimal ordering.

Examples of tools and inputs that can help determine optimal ordering of task activities include the following:
• Critical Path Method (CPM)
• Program Evaluation and Review Technique (PERT)
• Resource limited scheduling
• Customer priorities
• Marketable features
• End-user value
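As a concrete illustration of the first item in the list above, the following minimal Python sketch (not part of the CMMI model) computes a critical path over a hypothetical task network; task names and durations are invented for the example.

# Illustrative sketch only: a minimal forward-pass Critical Path Method (CPM)
# calculation over a hypothetical task network. Durations are in working days.
durations = {"reqs": 10, "design": 15, "code": 20, "test": 12, "docs": 8}
predecessors = {                     # task -> tasks that must finish first
    "design": ["reqs"],
    "code": ["design"],
    "docs": ["design"],
    "test": ["code", "docs"],
}

def critical_path(durations, predecessors):
    earliest_finish = {}
    def finish(task):
        if task not in earliest_finish:
            start = max((finish(p) for p in predecessors.get(task, [])), default=0)
            earliest_finish[task] = start + durations[task]
        return earliest_finish[task]
    for t in durations:
        finish(t)
    # Walk back from the task that finishes last to recover the critical path.
    path, current = [], max(earliest_finish, key=earliest_finish.get)
    while current:
        path.append(current)
        preds = predecessors.get(current, [])
        current = max(preds, key=earliest_finish.get) if preds else None
    return list(reversed(path)), max(earliest_finish.values())

path, total = critical_path(durations, predecessors)
print(path, total)   # ['reqs', 'design', 'code', 'test'] 57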

5. Establish and maintain the budget and schedule.

Establishing and maintaining the project's budget and schedule typically includes the following:
• Defining the committed or expected availability of resources and facilities
• Determining the time phasing of activities
• Determining a breakout of subordinate schedules
• Defining dependencies among activities (predecessor or successor relationships)
• Defining schedule activities and milestones to support project monitoring and control
• Identifying milestones, releases, or increments for the delivery of products to the customer
• Defining activities of appropriate duration
• Defining milestones of appropriate time separation
• Defining a management reserve based on the confidence level in meeting the schedule and budget
• Using appropriate historical data to verify the schedule
• Defining incremental funding requirements
• Documenting project assumptions and rationale

6. Establish corrective action criteria.
Criteria are established for determining what constitutes a significant deviation from the project plan. A basis for gauging issues and problems is necessary to determine when corrective action should be taken. Corrective actions can lead to replanning, which may include revising the original plan, establishing new agreements, or including mitigation activities in the current plan. The project plan defines when (e.g., under what circumstances, with what frequency) the criteria will be applied and by whom.

HINT: Establish corrective action criteria early in the project to ensure that issues are addressed appropriately and consistently.

SP 2.2 IDENTIFY PROJECT RISKS
Identify and analyze project risks.

TIP: Risk management is a key to project success.

Refer to the Monitor Project Risks specific practice in the Project Monitoring and Control process area for more information about risk monitoring activities.
Refer to the Risk Management process area for more information about identifying potential problems before they occur so that risk handling activities can be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives.

Risks are identified or discovered and analyzed to support project planning. This specific practice should be extended to all plans that affect the project to ensure that appropriate interfacing is taking place among all relevant stakeholders on identified risks.

Project planning risk identification and analysis typically include the following:
• Identifying risks
• Analyzing risks to determine the impact, probability of occurrence, and time frame in which problems are likely to occur
• Prioritizing risks
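The following minimal Python sketch (not part of the CMMI model) shows one common way to analyze and prioritize identified risks by exposure, the product of probability and impact; the risk entries and scales are hypothetical.

# Illustrative sketch only: prioritizing identified risks by exposure.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    probability: float   # likelihood of occurrence, 0.0 to 1.0
    impact: int          # relative impact if it occurs, e.g., 1 (low) to 5 (high)
    timeframe: str       # period in which the problem is likely to occur

    @property
    def exposure(self) -> float:
        return self.probability * self.impact

risks = [
    Risk("Key supplier delivers late", 0.4, 5, "integration phase"),
    Risk("Staff turnover in test team", 0.2, 3, "verification phase"),
    Risk("Requirements volatility", 0.6, 4, "whole lifecycle"),
]

# Highest exposure first; these values would feed the project's risk priorities.
for r in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"{r.exposure:4.1f}  {r.description} ({r.timeframe})")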

Example Work Products
1. Identified risks
2. Risk impacts and probability of occurrence
3. Risk priorities

Subpractices
1. Identify risks.
The identification of risks involves the identification of potential issues, hazards, threats, vulnerabilities, and so on that could negatively affect work efforts and plans. Risks should be identified and described understandably before they can be analyzed and managed properly. When identifying risks, it is a good idea to use a standard method for defining risks. Risk identification and analysis tools can be used to help identify possible problems.

TIP: Risk identification and analysis tools help to identify risks by identifying them more completely and rapidly, analyzing them more consistently, and allowing what has been learned on previous projects to be applied to new projects.

Examples of risk identification and analysis tools include the following:
• Risk taxonomies
• Risk assessments
• Checklists
• Structured interviews
• Brainstorming
• Process, project, and product performance models
• Cost models
• Network analysis
• Quality factor analysis

2. Document risks.
3. Review and obtain agreement with relevant stakeholders on the completeness and correctness of documented risks.
4. Revise risks as appropriate.

Examples of when identified risks may need to be revised include the following:
• When new risks are identified
• When risks become problems
• When risks are retired
• When project circumstances change significantly

SP 2.3 PLAN DATA MANAGEMENT
Plan for the management of project data.

TIP: This SP helps answer questions such as what data the project should collect, distribute, deliver, and archive; how and when it should do this; who should be able to access it; and how data will be stored to address the need for privacy and security, yet give access to those who need it.

Data are forms of documentation required to support a project in all of its areas (e.g., administration, engineering, configuration management, finance, logistics, quality, safety, manufacturing, procurement). The data can take any form (e.g., reports, manuals, notebooks, charts, drawings, specifications, files, correspondence). The data can exist in any medium (e.g., printed or drawn on various materials, photographs, electronic, multimedia).
Data can be deliverable (e.g., items identified by a project's contract data requirements) or data can be nondeliverable (e.g., informal data, trade studies, analyses, internal meeting minutes, internal design review documentation, lessons learned, action items). Distribution can take many forms, including electronic transmission.

HINT: Interpret data broadly to benefit from data management more fully.


Data requirements for the project should be established for both data items to be created and their content and form, based on a common or standard set of data requirements. Uniform content and format requirements for data items facilitate understanding of data content and help with consistent management of data resources.

TIP: Selecting a standard form can facilitate communication and understanding.

The reason for collecting each document should be clear. This task includes the analysis and verification of project deliverables and nondeliverables, data requirements, and customer supplied data. Often, data are collected with no clear understanding of how they will be used. Data are costly and should be collected only when needed.

TIP: Providing a reason for the data being collected encourages cooperation from those providing the data.

TIP: The data management plan defines the data necessary for the project, who owns it, where it is stored, and how it is used. Also, it may specify what happens to the data after the project terminates.

Example Work Products
1. Data management plan
2. Master list of managed data
3. Data content and format description
4. Lists of data requirements for acquirers and suppliers
5. Privacy requirements
6. Security requirements
7. Security procedures
8. Mechanisms for data retrieval, reproduction, and distribution
9. Schedule for the collection of project data
10. List of project data to be collected

TIP: Data privacy and security should be considered, but may not be an applicable consideration for certain types of projects.

X-REF: Measurement data is a subset of project data. See MA SPs 1.3 and 2.3 for more information on collecting, storing, and controlling access to measurement data.

Subpractices
1. Establish requirements and procedures to ensure privacy and the security of data.
Not everyone will have the need or clearance necessary to access project data. Procedures should be established to identify who has access to which data as well as when they have access to which data.
2. Establish a mechanism to archive data and to access archived data.
Accessed information should be in an understandable form (e.g., electronic or computer output from a database) or represented as originally generated.
3. Determine the project data to be identified, collected, and distributed.
4. Determine the requirements for providing access to and distribution of data to relevant stakeholders.
A review of other elements of the project plan can help to determine who requires access to or receipt of project data as well as which data are involved.
X-REF: These requirements are often developed when you are eliciting stakeholder needs and developing customer requirements, which is addressed in RD SP 1.1 and SP 1.2.


5. Decide which project data and plans require version control or other levels of configuration control and establish mechanisms to ensure project data are controlled.
X-REF: Levels of control are addressed in CM SP 1.2.
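As an illustration only, the following Python sketch shows one possible record format for a master list of managed project data, capturing ownership, access, storage, version control, and retention decisions of the kind described above; the field names and sample entry are hypothetical and not prescribed by the model.

# Illustrative sketch only: a possible record format for a master list of
# managed project data. Field names and the sample entry are hypothetical.
from dataclasses import dataclass

@dataclass
class ManagedDataItem:
    name: str
    form: str                 # e.g., report, drawing, meeting minutes
    deliverable: bool         # deliverable vs. nondeliverable data
    owner: str
    access: list              # roles permitted to access the item
    storage: str              # where the item is stored or archived
    version_controlled: bool  # subject to version or configuration control
    retention: str            # what happens to the data after project close-out

master_list = [
    ManagedDataItem(
        name="System requirements specification",
        form="document",
        deliverable=True,
        owner="requirements lead",
        access=["project staff", "customer"],
        storage="project repository",
        version_controlled=True,
        retention="archive for 7 years",
    ),
]
print(master_list[0].name, "->", master_list[0].storage)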

SP 2.4 PLAN THE PROJECT'S RESOURCES
Plan for resources to perform the project.

TIP: This practice addresses all resources, not just personnel.

Defining project resources (e.g., labor, equipment, materials, methods) and quantities needed to perform project activities builds on initial estimates and provides additional information that can be applied to expand the WBS used to manage the project.
The top-level WBS developed earlier as an estimation mechanism is typically expanded by decomposing these top levels into work packages that represent single work units that can be separately assigned, performed, and tracked. This subdivision is done to distribute management responsibility and provide better management control.
Each work package in the WBS should be assigned a unique identifier (e.g., number) to permit tracking. A WBS can be based on requirements, activities, work products, services, or a combination of these items. A dictionary that describes the work for each work package in the WBS should accompany the work breakdown structure.

TIP: The WBS established in SP 1.1 is expanded to help identify roles as well as staffing, process, facility, and tool requirements; assign work; obtain commitment to perform the work; and track it to completion. Automated tools can help you with this activity.

Example Work Products
1. Work packages
2. WBS task dictionary
3. Staffing requirements based on project size and scope
4. Critical facilities and equipment list
5. Process and workflow definitions and diagrams
6. Project administration requirements list
7. Status reports
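The following minimal Python sketch (not part of the CMMI model) illustrates work packages keyed by unique WBS identifiers together with a small WBS dictionary entry; the identifiers, names, and effort figures are invented for the example.

# Illustrative sketch only: work packages with unique WBS identifiers and a
# small dictionary describing the work for one package.
work_packages = {
    "1.1":   {"name": "Requirements analysis", "owner": "analysis team", "effort_pm": 4},
    "1.2":   {"name": "Design",                "owner": "design team",   "effort_pm": 6},
    "1.2.1": {"name": "Interface design",      "owner": "design team",   "effort_pm": 2},
}

wbs_dictionary = {
    "1.2.1": "Define and document external and internal interfaces, "
             "including review and approval by relevant stakeholders.",
}

def children(wbs_id, packages):
    """Return the work packages directly below a given WBS identifier."""
    prefix = wbs_id + "."
    return {k: v for k, v in packages.items()
            if k.startswith(prefix) and "." not in k[len(prefix):]}

print(children("1.2", work_packages))   # {'1.2.1': {...}}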

Subpractices
1. Determine process requirements.
The processes used to manage a project are identified, defined, and coordinated with all relevant stakeholders to ensure efficient operations during project execution.
TIP: At maturity level 3, the organization is typically the main source of process requirements, standard processes, and process assets that aid in their use (see OPD).
2. Determine communication requirements.
These requirements address the kinds of mechanisms to be used for communicating with customers, end users, project staff, and other relevant stakeholders.


3. Determine staffing requirements.
The staffing of a project depends on the decomposition of project requirements into tasks, roles, and responsibilities for accomplishing project requirements as laid out in the work packages of the WBS.
Staffing requirements should consider the knowledge and skills required for each identified position as defined in the Plan Needed Knowledge and Skills specific practice.
4. Determine facility, equipment, and component requirements.
Most projects are unique in some way and require a set of unique assets to accomplish project objectives. The determination and acquisition of these assets in a timely manner are crucial to project success. It is best to identify lead-time items early to determine how they will be addressed. Even when required assets are not unique, compiling a list of all facilities, equipment, and parts (e.g., number of computers for the staff working on the project, software applications, office space) provides insight into aspects of the scope of an effort that are often overlooked.
TIP: Some items (e.g., unusual skills) take time to obtain, and need for these should be identified early.
5. Determine other continuing resource requirements.
Beyond determining processes, reporting templates, staffing, facilities, and equipment, there may be a continuing need for other types of resources to effectively carry out project activities, including the following:
• Consumables (e.g., electricity, office supplies)
• Access to intellectual property
• Access to transportation (for people and equipment)
The requirements for such resources are derived from the requirements found in (existing and future) agreements (e.g., customer agreements, service agreements, supplier agreements), the project's strategic approach, and the need to manage and maintain the project's operations for a period of time.

SP 2.5 PLAN NEEDED KNOWLEDGE AND SKILLS
Plan for knowledge and skills needed to perform the project.

TIP: This practice addresses the training that is specific to the project.

TIP: At maturity level 2, the organization may not be capable of providing much training for its projects. Each project might address all of its knowledge and skill needs. At maturity level 3, the organization takes responsibility for addressing common training needs (e.g., training in the organization's set of standard processes).

Refer to the Organizational Training process area for more information about developing skills and knowledge of people so they can perform their roles effectively and efficiently.

Knowledge delivery to projects involves training project staff and acquiring knowledge from outside sources.
Staffing requirements are dependent on the knowledge and skills available to support the execution of the project.


Example Work Products
1. Inventory of skill needs
2. Staffing and new hire plans
3. Databases (e.g., skills, training)
4. Training plans

TIP: Either the project or the organization can maintain these example work products.

Subpractices
1. Identify the knowledge and skills needed to perform the project.
HINT: Consider all knowledge and skills required for the project, not just the technical aspects.
2. Assess the knowledge and skills available.
3. Select mechanisms for providing needed knowledge and skills.

Example mechanisms include the following:
• In-house training (both organizational and project)
• External training
• Staffing and new hires
• External skill acquisition

TIP: If a skill is needed for the current project, but is not expected to be needed for future projects, external skill acquisition may be the best choice. However, if the skill needed for the project is expected to continue, training existing employees or hiring a new employee should be explored.

The choice of in-house training or outsourced training for needed knowledge and skills is determined by the availability of training expertise, the project's schedule, and business objectives.
4. Incorporate selected mechanisms into the project plan.

SP 2.6 PLAN STAKEHOLDER INVOLVEMENT
Plan the involvement of identified stakeholders.

X-REF: Identifying and involving relevant stakeholders is also addressed in PMC SP 1.5, IPM, and GP 2.7.

Stakeholders are identified from all phases of the project lifecycle by identifying the people and functions that should be represented in the project and describing their relevance and the degree of interaction for project activities. A two-dimensional matrix with stakeholders along one axis and project activities along the other axis is a convenient format for accomplishing this identification. Relevance of the stakeholder to the activity in a particular project phase and the amount of interaction expected would be shown at the intersection of the project phase activity axis and the stakeholder axis.

HINT: For each project phase, identify stakeholders important to the success of that phase and their role (e.g., implementer, reviewer, or consultant). Arrange this information into a matrix to aid in communication, obtain their commitment (SP 3.3), and monitor status (PMC SP 1.5).

For inputs of stakeholders to be useful, careful selection of relevant stakeholders is necessary. For each major activity, identify stakeholders who are affected by the activity and those who have expertise that is needed to conduct the activity. This list of relevant stakeholders will probably change as the project moves through phases of the project lifecycle. It is important, however, to ensure that relevant stakeholders in the latter phases of the lifecycle have early input to requirements and design decisions that affect them.

TIP: Not all stakeholders identified will be relevant stakeholders. Only a limited number of stakeholders are selected for interaction with the project as work progresses.

Examples of the type of material that should be included in a plan for stakeholder interaction include the following:
• List of all relevant stakeholders
• Rationale for stakeholder involvement
• Relationships among stakeholders
• Resources (e.g., training, materials, time, funding) needed to ensure stakeholder interaction
• Schedule for the phasing of stakeholder interaction
• Roles and responsibilities of relevant stakeholders with respect to the project, by project lifecycle phase
• Relative importance of the stakeholder to the success of the project, by project lifecycle phase
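As an illustration of the two-dimensional matrix described above, the following minimal Python sketch (not part of the CMMI model) records the expected role of each stakeholder at the intersection of stakeholder and project activity; all entries are hypothetical.

# Illustrative sketch only: a stakeholder involvement matrix, with stakeholders
# on one axis and project activities on the other.
involvement = {
    ("Customer",      "Requirements"): "reviewer",
    ("Customer",      "Design"):       "consulted",
    ("Test lead",     "Requirements"): "reviewer",
    ("Test lead",     "Verification"): "implementer",
    ("Manufacturing", "Design"):       "consulted",
}

def stakeholders_for(activity, matrix):
    """List the relevant stakeholders and their roles for a given activity."""
    return {s: role for (s, a), role in matrix.items() if a == activity}

print(stakeholders_for("Design", involvement))
# {'Customer': 'consulted', 'Manufacturing': 'consulted'}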

Implementing this specific practice relies on shared or exchanged information with the previous Plan Needed Knowledge and Skills specific practice.

Example Work Products
1. Stakeholder involvement plan

SP 2.7 ESTABLISH THE PROJECT PLAN
Establish and maintain the overall project plan.

HINT: Most project plans change over time as requirements are better understood, so plan how and when you will maintain the plan.

TIP: The plan document should reflect the project's status as requirements and the project environment change.

A documented plan that addresses all relevant planning items is necessary to achieve the mutual understanding and commitment of individuals, groups, and organizations that execute or support the plans. The plan generated for the project defines all aspects of the effort, tying together the following in a logical manner:

• Project lifecycle considerations
• Project tasks
• Budgets and schedules
• Milestones
• Data management
• Risk identification
• Resource and skill requirements
• Stakeholder identification and interaction
• Infrastructure considerations

TIP: A documented plan communicates resources needed, expectations, and commitments; contains a game plan for relevant stakeholders, including the project team (SP 3.3); and is the basis for managing the project.


Infrastructure considerations include responsibility and authority relationships for project staff, management, and support organizations.
Lifecycle considerations can include coverage of later phases of the product or service life (that might be beyond the life of the project), especially transition to another phase or party (e.g., transition to manufacturing, training, operations, a service provider).

For software, the planning document is often referred to as one of the following:
• Software development plan
• Software project plan
• Software plan

For hardware, the planning document is often referred to as a hardware development plan. Development activities in preparation for production can be included in the hardware development plan or defined in a separate production plan.

Examples of plans that have been used in the U.S. Department of Defense community include the following:
• Integrated Master Plan: an event driven plan that documents significant accomplishments with pass/fail criteria for both business and technical elements of the project and that ties each accomplishment to a key project event.
• Integrated Master Schedule: an integrated and networked multi-layered schedule of project tasks required to complete the work effort documented in a related Integrated Master Plan.
• Systems Engineering Management Plan: a plan that details the integrated technical effort across the project.
• Systems Engineering Master Schedule: an event based schedule that contains a compilation of key technical accomplishments, each with measurable criteria, requiring successful completion to pass identified events.
• Systems Engineering Detailed Schedule: a detailed, time dependent, task oriented schedule that associates dates and milestones with the Systems Engineering Master Schedule.

Example Work Products
1. Overall project plan


SG 3 OBTAIN COMMITMENT TO THE PLAN
Commitments to the project plan are established and maintained.

TIP: Before making a commitment, a project member analyzes what it will take to meet that commitment. Project members who make commitments should continually evaluate their ability to meet their commitments, communicate immediately to those affected when they cannot meet their commitments, and mitigate the impacts of being unable to meet their commitments.

To be effective, plans require commitment by those who are responsible for implementing and supporting the plan.

SP 3.1 REVIEW PLANS THAT AFFECT THE PROJECT
Review all plans that affect the project to understand project commitments.

TIP: Commitments are a recurring theme in CMMI. Requirements are committed to in REQM, documented and reconciled in PP, monitored in PMC, and addressed more thoroughly in IPM.

Plans developed in other process areas typically contain information similar to that called for in the overall project plan. These plans can provide additional detailed guidance and should be compatible with and support the overall project plan to indicate who has the authority, responsibility, accountability, and control. All plans that affect the project should be reviewed to ensure they contain a common understanding of the scope, objectives, roles, and relationships that are required for the project to be successful. Many of these plans are described by the Plan the Process generic practice.

Example Work Products
1. Record of the reviews of plans that affect the project

HINT: Beware of commitments that are given freely. A favorite quote that applies is "How bad do you want it? That is how bad you will get it!" If you do not allow commitments to be made freely, staff most likely will try to provide the commitment you want to hear instead of a well-thought out answer that is accurate.

SP 3.2 RECONCILE WORK AND RESOURCE LEVELS
Adjust the project plan to reconcile available and estimated resources.

To establish a project that is feasible, obtain commitment from relevant stakeholders and reconcile differences between estimates and available resources. Reconciliation is typically accomplished by modifying or deferring requirements, negotiating more resources, finding ways to increase productivity, outsourcing, adjusting the staff skill mix, or revising all plans that affect the project or its schedules.

TIP: Because resource availability can change, such reconciliation will likely need to be done multiple times during the life of the project.

Example Work Products
1. Revised methods and corresponding estimating parameters (e.g., better tools, the use of off-the-shelf components)
2. Renegotiated budgets
3. Revised schedules
4. Revised requirements list
5. Renegotiated stakeholder agreements


SP 3.3 OBTAIN PLAN COMMITMENT
Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution.

TIP: Commitments are a two-way form of communication.

TIP: In maturity level 1 organizations, management often communicates a different picture of the project than the staff does. This inconsistency indicates that commitments were not obtained.

Obtaining commitment involves interaction among all relevant stakeholders, both internal and external to the project. The individual or group making a commitment should have confidence that the work can be performed within cost, schedule, and performance constraints. Often, a provisional commitment is adequate to allow the effort to begin and to permit research to be performed to increase confidence to the appropriate level needed to obtain a full commitment.

Example Work Products
1. Documented requests for commitments
2. Documented commitments

TIP: Documenting commitments makes clear the responsibilities of those involved with the project.

Subpractices
1. Identify needed support and negotiate commitments with relevant stakeholders.
The WBS can be used as a checklist for ensuring that commitments are obtained for all tasks.
The plan for stakeholder interaction should identify all parties from whom commitment should be obtained.
TIP: A commitment not documented is a commitment not made (memories are imperfect and thus unreliable).
2. Document all organizational commitments, both full and provisional, ensuring the appropriate level of signatories.
Commitments should be documented to ensure a consistent mutual understanding and for project tracking and maintenance. Provisional commitments should be accompanied by a description of risks associated with the relationship.
3. Review internal commitments with senior management as appropriate.
4. Review external commitments with senior management as appropriate.
TIP: Senior management must be informed of external commitments (especially those with customers, end users, and suppliers), as they can expose the organization to unnecessary risk. Management can have the necessary insight and authority to reduce risks associated with external commitments.
5. Identify commitments regarding interfaces between project elements and other projects and organizational units so that these commitments can be monitored.
Well-defined interface specifications form the basis for commitments.
X-REF: For more information on managing commitments, dependencies, and coordination issues among relevant stakeholders, see IPM SG 2. For more information on identifying and managing interfaces, see PI SG 2.



PROCESS AND PRODUCT QUALITY ASSURANCE
A Support Process Area at Maturity Level 2

Purpose
The purpose of Process and Product Quality Assurance (PPQA) is to provide staff and management with objective insight into processes and associated work products.

TIP: PPQA is often referred to as the "eyes and ears" of the organization. It ensures that the organization's policies, practices, and processes are followed.

Introductory Notes
The Process and Product Quality Assurance process area involves the following activities:
• Objectively evaluating performed processes and work products against applicable process descriptions, standards, and procedures
• Identifying and documenting noncompliance issues
• Providing feedback to project staff and managers on the results of quality assurance activities
• Ensuring that noncompliance issues are addressed

TIP: The phrase "process descriptions, standards, and procedures" is used in PPQA (and GP 2.9) to represent management's expectations as to how project work will be performed. The applicable process descriptions, standards, and procedures are typically identified during project planning.

The Process and Product Quality Assurance process area supports the delivery of high-quality products by providing project staff and managers at all levels with appropriate visibility into, and feedback on, processes and associated work products throughout the life of the project.
The practices in the Process and Product Quality Assurance process area ensure that planned processes are implemented, while the practices in the Verification process area ensure that specified requirements are satisfied. These two process areas can on occasion address the same work product but from different perspectives. Projects should take advantage of the overlap to minimize duplication of effort while taking care to maintain separate perspectives.
Objectivity in process and product quality assurance evaluations is critical to the success of the project. (See the definition of "objectively evaluate" in the glossary.) Objectivity is achieved by both independence and the use of criteria. A combination of methods providing evaluations against criteria by those who do not produce the work product is often used. Less formal methods can be used to provide broad day-to-day coverage. More formal methods can be used periodically to assure objectivity.

TIP: Objectivity is required. Independence is not required but is often the means used to assure objectivity.

Examples of ways to perform objective evaluations include the following:
• Formal audits by organizationally separate quality assurance organizations
• Peer reviews, which can be performed at various levels of formality
• In-depth review of work at the place it is performed (i.e., desk audits)
• Distributed review and comment of work products
• Process checks built into the processes such as a fail-safe for processes when they are done incorrectly (e.g., Poka-Yoke)

TIP: For less mature organizations, formal audits conducted by a separate QA group may be best. Implementing peer reviews as an objective evaluation method (second bullet) requires care; see upcoming notes and example box.

Traditionally, a quality assurance group that is independent of the project provides objectivity. However, another approach may be appropriate in some organizations to implement the process and product quality assurance role without that kind of independence.

TIP: When an independent approach is not used, it is important to be able to demonstrate how the objectivity was achieved.

For example, in an organization with an open, quality oriented culture, the process and product quality assurance role can be performed, partially or completely, by peers and the quality assurance function can be embedded in the process. For small organizations, this embedded approach might be the most feasible approach.

If quality assurance is embedded in the process, several issues should be addressed to ensure objectivity. Everyone performing quality assurance activities should be trained in quality assurance. Those who perform quality assurance activities for a work product should be separate from those who are directly involved in developing or maintaining the work product. An independent reporting channel to the appropriate level of organizational management should be available so that noncompliance issues can be escalated as necessary.

For example, when implementing peer reviews as an objective evaluation method, the following issues should be addressed:
• Members are trained and roles are assigned for people attending the peer reviews.
• A member of the peer review who did not produce this work product is assigned to perform the quality assurance role.
• Checklists based on process descriptions, standards, and procedures are available to support the quality assurance activity.
• Noncompliance issues are recorded as part of the peer review report and are tracked and escalated outside the project when necessary.

Quality assurance should begin in the early phases of a project to establish plans, processes, standards, and procedures that will add value to the project and satisfy the requirements of the project and organizational policies. Those who perform quality assurance activities participate in establishing plans, processes, standards, and procedures to ensure that they fit project needs and that they will be usable for performing quality assurance evaluations. In addition, processes and associated work products to be evaluated during the project are designated. This designation can be based on sampling or on objective criteria that are consistent with organizational policies, project requirements, and project needs.

HINT: Start QA early in your project.

TIP: For organizations just beginning to establish QA, sampling is usually 100%. However, as an organization becomes more experienced in QA, it becomes possible to select a smaller sample of the processes and work products to evaluate without compromising QA effectiveness.

When noncompliance issues are identified, they are first addressed in the project and resolved there if possible. Noncompliance issues that cannot be resolved in the project are escalated to an appropriate level of management for resolution.
This process area applies to evaluations of project activities and work products, and to organizational (e.g., process group, organizational training) activities and work products. For organizational activities and work products, the term "project" should be appropriately interpreted.

In Agile environments, teams tend to focus on immediate needs of the iteration rather than on longer term and broader organizational needs. To ensure that objective evaluations are perceived to have value and are efficient, discuss the following early: (1) how objective evaluations are to be done, (2) which processes and work products will be evaluated, (3) how results of evaluations will be integrated into the team's rhythms (e.g., as part of daily meetings, checklists, peer reviews, tools, continuous integration, retrospectives). (See "Interpreting CMMI When Using Agile Approaches" in Part I.)

Related Process Areas

Refer to the Verification process area for more information about ensuring that selected work products meet their specified requirements.

Specific Goal and Practice Summary
SG 1 Objectively Evaluate Processes and Work Products
SP 1.1 Objectively Evaluate Processes
SP 1.2 Objectively Evaluate Work Products
SG 2 Provide Objective Insight
SP 2.1 Communicate and Resolve Noncompliance Issues
SP 2.2 Establish Records

Specific Practices by Goal

SG 1 OBJECTIVELY EVALUATE PROCESSES AND WORK PRODUCTS
Adherence of the performed process and associated work products to applicable process descriptions, standards, and procedures is objectively evaluated.

SP 1.1 OBJECTIVELY EVALUATE PROCESSES
Objectively evaluate selected performed processes against applicable process descriptions, standards, and procedures.

Objectivity in quality assurance evaluations is critical to the success of the project. A description of the quality assurance reporting chain and how it ensures objectivity should be defined.

Example Work Products
1. Evaluation reports
2. Noncompliance reports
3. Corrective actions

Subpractices
1. Promote an environment (created as part of project management) that encourages staff participation in identifying and reporting quality issues.
TIP: Quality is everyone's job. It is important that everyone in the organization be comfortable identifying and openly discussing quality concerns.
2. Establish and maintain clearly stated criteria for evaluations.
The intent of this subpractice is to provide criteria, based on business needs, such as the following:
• What will be evaluated
• When or how often a process will be evaluated
• How the evaluation will be conducted
• Who must be involved in the evaluation
3. Use the stated criteria to evaluate selected performed processes for adherence to process descriptions, standards, and procedures.


4. Identify each noncompliance found during the evaluation.
5. Identify lessons learned that could improve processes.

HINT: If you are using a traditional QA approach that involves a QA group, assign that group the responsibility for sharing new best practices with projects during project planning.

SP 1.2 OBJECTIVELY EVALUATE WORK PRODUCTS
Objectively evaluate selected work products against applicable process descriptions, standards, and procedures.

HINT: You can embed objective evaluations of work products in some verification activities, particularly peer reviews, although doing so requires care. (See the Introductory Notes for more information.)

Example Work Products
1. Evaluation reports
2. Noncompliance reports
3. Corrective actions

Subpractices
1. Select work products to be evaluated based on documented sampling criteria if sampling is used.

Work products can include services produced by a process whether the recipient of the service is internal or external to the project or organization.
2. Establish and maintain clearly stated criteria for the evaluation of selected work products.
The intent of this subpractice is to provide criteria, based on business needs, such as the following:
• What will be evaluated during the evaluation of a work product
• When or how often a work product will be evaluated
• How the evaluation will be conducted
• Who must be involved in the evaluation
3. Use the stated criteria during evaluations of selected work products.
4. Evaluate selected work products at selected times.

HINT: Subpractice 4 recommends evaluation of work products at different times and from different perspectives. The important point is that you think broadly about what will best give you the objective insight you need during your project.

Examples of when work products can be evaluated against process descriptions, standards, or procedures include the following:
• Before delivery to the customer
• During delivery to the customer
• Incrementally, when it is appropriate
• During unit testing
• During integration
• When demonstrating an increment

5. Identify each case of noncompliance found during evaluations.
6. Identify lessons learned that could improve processes.

TIP: Reports and feedback from QA help the organization identify what is and is not working.
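The following minimal Python sketch (not part of the CMMI model) illustrates one possible documented sampling criterion for subpractice 1, a fixed, seeded random sample per work product type; the work product identifiers and the sampling fraction are hypothetical.

# Illustrative sketch only: selecting a sample of work products for objective
# evaluation using a documented, repeatable sampling rule.
import random

work_products = {
    "design documents": ["DD-01", "DD-02", "DD-03", "DD-04"],
    "test procedures":  ["TP-01", "TP-02", "TP-03"],
}

def select_sample(items_by_type, fraction=0.5, seed=42):
    """Sample each work product type at the documented fraction (minimum 1)."""
    rng = random.Random(seed)   # seeded so the selection can be reproduced
    sample = {}
    for wp_type, items in items_by_type.items():
        count = max(1, round(len(items) * fraction))
        sample[wp_type] = sorted(rng.sample(items, count))
    return sample

print(select_sample(work_products))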


SG 2 PROVIDE OBJECTIVE INSIGHT
Noncompliance issues are objectively tracked and communicated, and resolution is ensured.

SP 2.1 COMMUNICATE AND RESOLVE NONCOMPLIANCE ISSUES
Communicate quality issues and ensure the resolution of noncompliance issues with the staff and managers.

Noncompliance issues are problems identified in evaluations that reflect a lack of adherence to applicable standards, process descriptions, or procedures. The status of noncompliance issues provides an indication of quality trends. Quality issues include noncompliance issues and trend analysis results.

HINT: Noncompliance issues may be common in low-maturity organizations and should be addressed at the lowest possible level. Don't "criminalize" those responsible for noncompliance.

When noncompliance issues cannot be resolved in the project, use established escalation mechanisms to ensure that the appropriate level of management can resolve the issue. Track noncompliance issues to resolution.

Example Work Products

1. Corrective action reports
2. Evaluation reports
3. Quality trends

Subpractices
1. Resolve each noncompliance with the appropriate members of the staff if possible.
2. Document noncompliance issues when they cannot be resolved in the project.

TIP: In some cases, organizational requirements and project requirements conflict. Such situations can require noncompliance issues to be escalated to a level that includes the project as well as QA.

Examples of ways to resolve noncompliance in the project include the following:
• Fixing the noncompliance
• Changing the process descriptions, standards, or procedures that were violated
• Obtaining a waiver to cover the noncompliance

3. Escalate noncompliance issues that cannot be resolved in the project to the appropriate level of management designated to receive and act on noncompliance issues.


4. Analyze noncompliance issues to see if there are quality trends that can be identified and addressed.
5. Ensure that relevant stakeholders are aware of results of evaluations and quality trends in a timely manner.
6. Periodically review open noncompliance issues and trends with the manager designated to receive and act on noncompliance issues.
7. Track noncompliance issues to resolution.

HINT: QA activities are often unappreciated. To address this issue, consider reporting a compliance percentage rather than the number of noncompliance issues for projects. This puts a positive spin on QA and may encourage friendly competition.

SP 2.2 ESTABLISH RECORDS
Establish and maintain records of quality assurance activities.

Example Work Products
1. Evaluation logs
2. Quality assurance reports
3. Status reports of corrective actions
4. Reports of quality trends

Subpractices
1. Record process and product quality assurance activities in sufficient detail so that status and results are known.
2. Revise the status and history of quality assurance activities as necessary.

TIP: Records provide a way to identify trends in QA activities (including noncompliance issues) that allow the organization to identify where additional guidance or process changes are needed.
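Building on the hint about reporting a compliance percentage and the tip about trends, the following minimal Python sketch (not part of the CMMI model) derives a compliance percentage and a simple trend indicator from hypothetical evaluation records; the record format and data are invented for the example.

# Illustrative sketch only: deriving a compliance percentage and a simple trend
# from quality assurance evaluation records.
evaluations = [
    {"period": "2024-Q1", "items_evaluated": 40, "noncompliances": 8},
    {"period": "2024-Q2", "items_evaluated": 45, "noncompliances": 5},
    {"period": "2024-Q3", "items_evaluated": 50, "noncompliances": 3},
]

for record in evaluations:
    compliant = record["items_evaluated"] - record["noncompliances"]
    record["compliance_pct"] = 100.0 * compliant / record["items_evaluated"]
    print(f"{record['period']}: {record['compliance_pct']:.1f}% compliant")

# A rising percentage across periods is one simple indicator of a quality trend.
trend = evaluations[-1]["compliance_pct"] - evaluations[0]["compliance_pct"]
print(f"Change over the reporting periods: {trend:+.1f} percentage points")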



QUANTITATIVE PROJECT MANAGEMENT
A Project Management Process Area at Maturity Level 4

TIP: In the CMMI for Services Model, this process area is called Quantitative Work Management.

Purpose
The purpose of Quantitative Project Management (QPM) is to quantitatively manage the project to achieve the project's established quality and process performance objectives.

TIP: QPM helps a project migrate from using the rear view mirror to manage the project to using predictive, forward-looking approaches to help steer the project to success.

TIP: When asked to state why QPM is important, one colleague replied, "The enterprise needs data for the next generation!"

Introductory Notes
The Quantitative Project Management process area involves the following activities:
• Establishing and maintaining the project's quality and process performance objectives
• Composing a defined process for the project to help to achieve the project's quality and process performance objectives
• Selecting subprocesses and attributes critical to understanding performance and that help to achieve the project's quality and process performance objectives
• Selecting measures and analytic techniques to be used in quantitative management
• Monitoring the performance of selected subprocesses using statistical and other quantitative techniques
• Managing the project using statistical and other quantitative techniques to determine whether or not the project's objectives for quality and process performance are being satisfied
• Performing root cause analysis of selected issues to address deficiencies in achieving the project's quality and process performance objectives

TIP: Although CMMI does not define who performs the SPs of a PA, the activities described in QPM are often best performed by those who actually execute the project's defined process as they are in the best position to understand the details of the situation they are in. Access to those with special expertise in measurement and quantitative techniques (e.g., Six Sigma black belts) can, however, be beneficial.

TIP: When effectively implemented, QPM empowers individuals and teams by enabling them to accurately estimate (make predictions) and thus make sounder commitments based on these estimates (predictions).

Organizational process assets used to achieve high maturity, including quality and process performance objectives, selected processes, measures, baselines, and models, are established using organizational process performance processes and used in quantitative project management processes. The project can use organizational process performance processes to define additional objectives, measures, baselines, and models as needed to effectively analyze and manage performance. The measures, measurements, and other data resulting from quantitative project management processes are incorporated into the organizational process assets. In this way, the organization and its projects derive benefit from assets improved through use.

X-REF: QPM and OPP are tightly coupled process areas. Each produces work products used by the other. Refer to OPP extensively when considering how to implement QPM.

The project's defined process is a set of interrelated subprocesses that form an integrated and coherent process for the project. The Integrated Project Management practices describe establishing the project's defined process by selecting and tailoring processes from the organization's set of standard processes. (See the definition of "defined process" in the glossary.)
Quantitative Project Management practices, unlike Integrated Project Management practices, help you to develop a quantitative understanding of the expected performance of processes or subprocesses. This understanding is used as a basis for establishing the project's defined process by evaluating alternative processes or subprocesses for the project and selecting the ones that will best achieve the quality and process performance objectives.
Establishing effective relationships with suppliers is also important to the successful implementation of this process area. Establishing effective relationships can involve establishing quality and process performance objectives for suppliers, determining the measures and analytic techniques to be used to gain insight into supplier progress and performance, and monitoring progress toward achieving those objectives.
An essential element of quantitative management is having confidence in predictions (i.e., the ability to accurately predict the extent to which the project can fulfill its quality and process performance objectives). Subprocesses to be managed through the use of statistical and other quantitative techniques are chosen based on the needs for predictable process performance.
Another essential element of quantitative management is understanding the nature and extent of the variation experienced in process performance and recognizing when the project's actual performance may not be adequate to achieve the project's quality and process performance objectives.
Thus, quantitative management includes statistical thinking and the correct use of a variety of statistical techniques. (See the definition of "quantitative management" in the glossary.)

TIP: By "statistical thinking," we mean using statistical techniques as tools in appropriate ways to assess the variation in the performance of a process, to investigate its causes, and to recognize from the data when the process is not performing as it should. (See the definition of "statistical techniques" in the glossary.)


Statistical and other quantitative techniques are used to develop an understanding of the actual performance or to predict the performance of processes. Such techniques can be applied at multiple levels, from a focus on individual subprocesses to analyses that span lifecycle phases, projects, and support functions. Non-statistical techniques provide a less rigorous but still useful set of approaches that together with statistical techniques help the project to understand whether or not quality and process performance objectives are being satisfied and to identify any needed corrective actions.

TIP: An important term is "statistical and other quantitative techniques." (See its definition in the glossary.) High maturity organizations and projects leverage an understanding of what factors matter in an unfolding situation to control its future course. Statistical and other quantitative techniques provide the insight needed to accomplish this, thus helping "steer" the project toward achieving its objectives.

This process area applies to managing a project. Applying these concepts to managing other groups and functions can help to link different aspects of performance in the organization to provide a basis for balancing and reconciling competing priorities to address a broader set of business objectives.
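As an illustration of one widely used statistical technique for understanding variation, the following minimal Python sketch (not prescribed by the CMMI model) computes the natural process limits of an individuals (XmR) control chart for a hypothetical set of subprocess measurements, such as review effort hours per work product.

# Illustrative sketch only: individuals (XmR) control chart limits for a
# hypothetical subprocess measure. Values outside the natural process limits
# are signals of assignable-cause variation that warrant investigation.
measurements = [5.2, 4.8, 6.1, 5.5, 5.0, 10.5, 5.3, 4.9, 5.6, 5.1]

mean = sum(measurements) / len(measurements)
moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
avg_moving_range = sum(moving_ranges) / len(moving_ranges)

# Standard XmR constant: natural process limits are mean +/- 2.66 * average mR.
upper_limit = mean + 2.66 * avg_moving_range
lower_limit = mean - 2.66 * avg_moving_range

out_of_control = [x for x in measurements if not lower_limit <= x <= upper_limit]
print(f"mean={mean:.2f}, limits=({lower_limit:.2f}, {upper_limit:.2f})")
print("signals needing investigation:", out_of_control)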

Examples of other groups and functions that could benefit from using this process area include the following:
• Quality assurance or quality control functions
• Process definition and improvement
• Internal research and development functions
• Risk identification and management functions
• Technology scouting functions
• Market research
• Customer satisfaction assessment
• Problem tracking and reporting

X-REF: See How to Measure Anything: Finding the Value of "Intangibles" in Business by Douglas W. Hubbard (John Wiley & Sons, Inc.).

X-REF: See Moving Up the CMMI Capability and Maturity Levels Using Simulation by David M. Raffo and Wayne Wakeland (found at http://www.sei.cmu.edu/reports/08tr002.pdf).

Related Process Areas
Refer to the Causal Analysis and Resolution process area for more information about identifying causes of selected outcomes and taking action to improve process performance.
Refer to the Integrated Project Management process area for more information about establishing the project's defined process.
Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.

Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
Refer to the Organizational Performance Management process area for more information about proactively managing the organization's performance to meet its business objectives.

X-REF: See Understanding Variation: The Key to Managing Chaos (Second Edition) by Donald J. Wheeler (SPC Press, Inc.).


Refer to the Organizational Process Performance process area for more information about establishing and maintaining a quantitative understanding of the performance of selected processes in the organization's set of standard processes in support of achieving quality and process performance objectives, and providing process performance data, baselines, and models to quantitatively manage the organization's projects.
Refer to the Project Monitoring and Control process area for more information about providing an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan.
Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers.

Specific Goal and Practice Summary
SG 1 Prepare for Quantitative Management
SP 1.1 Establish the Project's Objectives
SP 1.2 Compose the Defined Process
SP 1.3 Select Subprocesses and Attributes
SP 1.4 Select Measures and Analytic Techniques
SG 2 Quantitatively Manage the Project
SP 2.1 Monitor the Performance of Selected Subprocesses
SP 2.2 Manage Project Performance
SP 2.3 Perform Root Cause Analysis

Specific Practices by Goal

SG 1 PREPARE FOR QUANTITATIVE MANAGEMENT
Preparation for quantitative management is conducted.

Preparation activities include establishing quantitative objectives for the project, composing a defined process for the project that can help to achieve those objectives, selecting subprocesses and attributes critical to understanding performance and achieving the objectives, and selecting measures and analytic techniques that support quantitative management.
These activities may need to be repeated when needs and priorities change, when there is an improved understanding of process performance, or as part of risk mitigation or corrective action.

HINT: You can perform SPs 1.1 through 1.4 concurrently and iteratively.


SP 1.1 ESTABLISH THE PROJECT'S OBJECTIVES
Establish and maintain the project's quality and process performance objectives.

TIP: Generally, these objectives are established early during project planning as customer requirements relating to product quality, service quality, and process performance are being established and analyzed.

X-REF: Quality and process performance objectives (QPPOs) should satisfy certain criteria such as "SMART criteria." SMART is an abbreviation for specific, measurable, attainable, relevant, and time-bound (see Wikipedia for more information).

When establishing the project's quality and process performance objectives, think about the processes that will be included in the project's defined process and what the historical data indicate regarding their process performance. These considerations, along with others such as technical capability, will help in establishing realistic objectives for the project.
The project's objectives for quality and process performance are established and negotiated at an appropriate level of detail (e.g., for individual product components, subprocesses, project teams) to permit an overall evaluation of the objectives and risks at the project level. As the project progresses, project objectives can be updated as the project's actual performance becomes known and more predictable, and to reflect changing needs and priorities of relevant stakeholders.

Example Work Products
1. The project's quality and process performance objectives
2. Assessment of the risk of not achieving the project's objectives

Subpractices
1. Review the organization's objectives for quality and process performance.
This review ensures that project members understand the broader business context in which the project operates. The project's objectives for quality and process performance are developed in the context of these overarching organizational objectives.
Refer to the Organizational Process Performance process area for more information about establishing quality and process performance objectives.

TIP: The project's QPPOs are based, in part, on those of the organization. This approach helps to ensure that the project's QPPOs are aligned with those of the organization.

2. Identify the quality and process performance needs and priorities of the customer, suppliers, end users, and other relevant stakeholders.
Typically, the identification of relevant stakeholders' needs will begin early (e.g., during development of the statement of work). Needs are further elicited, analyzed, refined, prioritized, and balanced during requirements development.

Examples of quality and process performance attributes for which needs and priorities might be identified include the following:
• Duration
• Predictability
• Reliability
• Maintainability
• Usability
• Timeliness
• Functionality
• Accuracy

3. Define and document measurable quality and process performance objectives for the project.
Defining and documenting objectives for the project involve the following:
• Incorporating appropriate organizational quality and process performance objectives
• Writing objectives that reflect the quality and process performance needs and priorities of the customer, end users, and other relevant stakeholders
• Determining how each objective will be achieved
• Reviewing the objectives to ensure they are sufficiently specific, measurable, attainable, relevant, and time-bound

Examples of measurable quality attributes include the following:
• Mean time between failures
• Number and severity of defects in the released product
• Critical resource utilization
• Number and severity of customer complaints concerning the provided service

Examples of measurable process performance attributes include the following:
• Cycle time
• Percentage of rework time
• Percentage of defects removed by product verification activities (perhaps by type of verification, such as peer reviews and testing)
• Defect escape rates
• Number and severity of defects found (or incidents reported) in first year following product delivery (or start of service)

Examples of project quality and process performance objectives include:
• Maintain change request backlog size below a target value.
• Improve velocity in an Agile environment to a target value by a target date.
• Reduce idle time by x% by a target date.
• Maintain schedule slippage below a specified percent.
• Reduce the total lifecycle cost by a specified percent by a target date.
• Reduce defects in products delivered to the customer by 10% without affecting cost.

4. Derive interim objectives to monitor progress toward achieving the project's objectives.
Interim objectives can be established for attributes of selected lifecycle phases, milestones, work products, and subprocesses.
Since process performance models characterize relationships among product and process attributes, these models can be used to help derive interim objectives that guide the project toward achieving its objectives.
5. Determine the risk of not achieving the project's quality and process performance objectives.
The risk is a function of the established objectives, the product architecture, the project's defined process, availability of needed knowledge and skills, etc. Process performance baselines and models can be used to evaluate the likelihood of achieving a set of objectives and provide guidance in negotiating objectives and commitments. The assessment of risk can involve various project stakeholders and can be conducted as part of the conflict resolution described in the next subpractice.
6. Resolve conflicts among the project's quality and process performance objectives (e.g., if one objective cannot be achieved without compromising another).
Process performance models can help to identify conflicts and help to ensure that the resolution of conflicts does not introduce new conflicts or risks.
Resolving conflicts involves the following activities:
• Setting relative priorities for objectives
• Considering alternative objectives in light of long-term business strategies as well as short-term needs
• Involving the customer, end users, senior management, project management, and other relevant stakeholders in tradeoff decisions
• Revising objectives as necessary to reflect results of conflict resolution


7. Establish traceability to the project's quality and process performance objectives from their sources.

Examples of sources of objectives include the following:
• Requirements
• The organization's quality and process performance objectives
• The customer's quality and process performance objectives
• Business objectives
• Discussions with customers and potential customers
• Market surveys
• Product architecture

An example of a method to identify and trace these needs and priorities is Quality Function Deployment (QFD).

8. Define and negotiate quality and process performance objectives for suppliers.
9. Revise the project's quality and process performance objectives as necessary.

HINT: After a supplier or set of suppliers is selected, revisit these objectives. Teaming with suppliers who have quantitatively managed processes may present opportunities to optimize across organizational boundaries.

SP 1.2 COMPOSE THE DEFINED PROCESS
Using statistical and other quantitative techniques, compose a defined process that enables the project to achieve its quality and process performance objectives.

Refer to the Integrated Project Management process area for more information about establishing the project's defined process.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
Refer to the Organizational Process Performance process area for more information about establishing performance baselines and models.

Composing the project's defined process goes beyond the process selection and tailoring described in the Integrated Project Management process area. It involves identifying alternatives to one or more processes or subprocesses, performing quantitative analysis of performance, and selecting the alternatives that are best able to help the project to achieve its quality and process performance objectives.

TIP: Some organizations may not need alternative subprocesses (e.g., because the projects are sufficiently similar to one another). In such cases, there are still some alternatives to explore (e.g., one can adjust the level or depth of intensity with which a subprocess is applied; see Subpractice 2).


Example Work Products
1. Criteria used to evaluate alternatives for the project
2. Alternative subprocesses
3. Subprocesses to be included in the project's defined process
4. Assessment of risk of not achieving the project's objectives

TIP: Process performance baselines and models (PPBs and PPMs) can help the project evaluate alternatives to determine which alternatives best help it achieve its QPPOs.

Subpractices
1. Establish the criteria to use in evaluating process alternatives for the project.

Criteria can be based on the following:
• Quality and process performance objectives
• Availability of process performance data and the relevance of the data to evaluating an alternative
• Familiarity with an alternative or with alternatives similar in composition
• Existence of process performance models that can be used in evaluating an alternative
• Product line standards
• Project lifecycle models
• Stakeholder requirements
• Laws and regulations

2. Identify alternative processes and subprocesses for the project.
Identifying alternatives can include one or more of the following:
• Analyzing organizational process performance baselines to identify candidate subprocesses that would help achieve the project's quality and process performance objectives
• Identifying subprocesses from the organization's set of standard processes as well as tailored processes in the process asset library that can help to achieve the objectives
• Identifying processes from external sources (e.g., other organizations, professional conferences, academic research)
• Adjusting the level or depth of intensity with which a subprocess is applied (as described in further detail in a subpractice that follows)
Adjusting the level or depth of intensity with which the subprocesses are applied can involve the following choices:
• Number and type of peer reviews to be held and when
• Amount of effort or calendar time devoted to particular tasks
• Number and selection of people involved
• Skill level requirements for performing specific tasks
• Selective application of specialized construction or verification techniques
• Reuse decisions and associated risk mitigation strategies
• The product and process attributes to be measured
• Sampling rate for management data
Refer to the Integrated Project Management process area for more information about using organizational process assets for planning project activities.
3. Analyze the interaction of alternative subprocesses to understand relationships among the subprocesses, including their attributes.
An analysis of the interaction will provide insight into the relative strengths and weaknesses of particular alternatives. This analysis can be supported by a calibration of the organization's process performance models with process performance data (e.g., as characterized in process performance baselines).
Additional modeling may be needed if existing process performance models cannot address significant relationships among the alternative subprocesses under consideration and there is high risk of not achieving objectives.

HINT: When project subprocesses interact with supplier or end-user subprocesses, the dynamics may not be obvious. Use process simulation in concert with PPMs to uncover hidden behavior and unintended consequences.

X-REF: For more information about process simulation, see the Raffo and Wakeland report mentioned earlier.

4. Evaluate alternative subprocesses against the criteria.
Use historical data, process performance baselines, and process performance models as appropriate to assist in evaluating alternatives against the criteria. These evaluations can include use of a sensitivity analysis, particularly in high risk situations.
Refer to the Decision Analysis and Resolution process area for more information about evaluating alternatives.
5. Select the alternative subprocesses that best meet the criteria.
It may be necessary to iterate through the activities described in the previous subpractices several times before confidence is achieved that the best available alternatives have been identified.
6. Evaluate the risk of not achieving the project's quality and process performance objectives.
An analysis of risk associated with the selected alternative defined process can lead to identifying new alternatives to be evaluated, as well as areas requiring more management attention.
Refer to the Risk Management process area for more information about identifying and analyzing risks.


SP 1.3 SELECT SUBPROCESSES AND ATTRIBUTES
Select subprocesses and attributes critical to evaluating performance and that help to achieve the project's quality and process performance objectives.

TIP: The focus of SP 1.2 was on determining what process will best help achieve QPPOs. In SP 1.3, the focus shifts to identifying those critical subprocesses within that process that need to be performed correctly and consistently (that need to be "mastered") or project performance suffers, as well as to identifying those attributes that give an early indication (or "signal") that the project has gone "off course."

Some subprocesses are critical because their performance significantly influences or contributes to achieving the project's objectives. These subprocesses may be good candidates for monitoring and control using statistical and other quantitative techniques as described in the first specific practice of the second specific goal.
Also, some attributes of these subprocesses can serve as leading indicators of the process performance to expect of subprocesses that are further downstream and can be used to assess the risk of not achieving the project's objectives (e.g., by using process performance models).
Subprocesses and attributes that play such critical roles may have already been identified as part of the analyses described in the previous specific practice.
For small projects, and other circumstances in which subprocess data may not be generated frequently enough in the project to support a sufficiently sensitive statistical inference, it may still be possible to understand performance by examining process performance across similar iterations, teams, or projects.

Example Work Products
1. Criteria used to select subprocesses that are key contributors to achieving the project's objectives
2. Selected subprocesses
3. Attributes of selected subprocesses that help in predicting future project performance

Subpractices
1. Analyze how subprocesses, their attributes, other factors, and project performance results relate to each other.
A root cause analysis, sensitivity analysis, or process performance model can help to identify the subprocesses and attributes that most contribute to achieving particular performance results (and variation in performance results) or that are useful indicators of future achievement of performance results.
Refer to the Causal Analysis and Resolution process area for more information about determining causes of selected outcomes.

HINT: Learn which "knobs" you can turn at different points within a project to desirably affect project performance and achievement of outcomes. Such learning goes on at the organizational level (OPP SPs 1.2 and 1.5) and continues within individual projects.


2. Identify criteria to be used in selecting subprocesses that are key contributors to achieving the project's quality and process performance objectives.

Examples of criteria used to select subprocesses include the following:
• There is a strong correlation with performance results that are addressed in the project's objectives.
• Stable performance of the subprocess is important.
• Poor subprocess performance is associated with major risks to the project.
• One or more attributes of the subprocess serve as key inputs to process performance models used in the project.
• The subprocess will be executed frequently enough to provide sufficient data for analysis.

TIP: The project's selection of subprocesses and attributes is based, in part, on those selected by the organization (OPP SP 1.2).

3. Select subprocesses using the identified criteria.
Historical data, process performance models, and process performance baselines can help in evaluating candidate subprocesses against selection criteria.
Refer to the Decision Analysis and Resolution process area for more information about evaluating alternatives.
4. Identify product and process attributes to be monitored.
These attributes may have been identified as part of performing the previous subpractices.
Attributes that provide insight into current or future subprocess performance are candidates for monitoring, whether or not the associated subprocesses are under the control of the project. Also, some of these same attributes may serve other roles (e.g., to help in monitoring project progress and performance as described in Project Monitoring and Control [PMC]).

Examples of product and process attributes include the following:
• Effort consumed to perform the subprocess
• The rate at which the subprocess is performed
• Cycle time for process elements that make up the subprocess
• Resource or materials consumed as input to the subprocess
• Skill level of the staff member performing the subprocess
• Quality of the work environment used to perform the subprocess
• Volume of outputs of the subprocess (e.g., intermediate work products)
• Quality attributes of outputs of the subprocess (e.g., reliability, testability)


SP 1.4 SELECT MEASURES AND ANALYTIC TECHNIQUES
Select measures and analytic techniques to be used in quantitative management.

TIP: This practice has a dynamic aspect to it. Although many measures and analytic techniques can be identified in advance, and thus the project team is prepared to apply these and interpret results, specific situations may arise during the project that may require more in-depth analysis to diagnose the situation and evaluate potential resolutions (e.g., see SP 2.3). Access to special expertise in analytic techniques may be needed.

Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.

Example Work Products
1. Definitions of measures and analytic techniques to be used in quantitative management
2. Traceability of measures back to the project's quality and process performance objectives
3. Quality and process performance objectives for selected subprocesses and their attributes
4. Process performance baselines and models for use by the project

Subpractices
1. Identify common measures from the organizational process assets that support quantitative management.
Refer to the Organizational Process Definition process area for more information about establishing organizational process assets.
Refer to the Organizational Process Performance process area for more information about establishing performance baselines and models.
Product lines or other stratification criteria can categorize common measures.

TIP: For example, the identification of measures to support monitoring of selected subprocesses (SP 2.1) is based on the attributes selected (SP 1.3) and on measures established by the organization to be included in its process performance analyses (OPP SP 1.3).

X-REF: Much of the material found in the remaining subpractices (2 through 9) is a more sophisticated application of MA SG 1, Align Measurement and Analysis Activities.

2. Identify additional measures that may be needed to cover critical product and process attributes of the selected subprocesses.
In some cases, measures can be research oriented. Such measures should be explicitly identified.

TIP: For example, additional measures may be needed to address unique customer requirements or supplier approaches.

3. Identify the measures to be used in managing subprocesses.
When selecting measures, keep the following considerations in mind:
• Measures that aggregate data from multiple sources (e.g., different processes, input sources, environments) or over time (e.g., at a phase level) can mask underlying problems, making problem identification and resolution difficult.
• For short-term projects, it may be necessary to aggregate data across similar instances of a process to enable analysis of its process performance while continuing to use the unaggregated data in support of individual projects.
• Selection should not be limited to progress or performance measures only. "Analysis measures" (e.g., inspection preparation rates, staff member skill levels, path coverage in testing) may provide better insight into process performance.
4. Specify the operational definitions of measures, their collection points in subprocesses, and how the integrity of measures will be determined.
5. Analyze the relationship of identified measures to the project quality and process performance objectives and derive subprocess quality and process performance objectives that state targets (e.g., thresholds, ranges) to be met for each measured attribute of each selected subprocess.

Examples of derived subprocess quality and process performance objectives include the following:
• Maintain a code review rate between 75 and 100 lines of code per hour
• Keep requirements gathering sessions to under three hours
• Keep test rate over a specified number of test cases per day
• Maintain rework levels below a specified percent
• Maintain productivity in generating use cases per day
• Keep design complexity (fan-out rate) below a specified threshold
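A minimal sketch of how such derived targets might be recorded and checked against incoming measurements. The attribute names and target values below are hypothetical, not taken from the model; the point is only that each selected attribute carries an explicit threshold or range.

```python
# Hypothetical derived targets for selected subprocess attributes.
# Each entry is (lower bound, upper bound); None means unbounded on that side.
DERIVED_TARGETS = {
    "code_review_rate_loc_per_hour": (75.0, 100.0),
    "requirements_session_hours":    (None, 3.0),
    "rework_percent":                (None, 10.0),
}

def check_against_targets(measurements):
    """Return (attribute, value, target) tuples for measurements that miss their target."""
    misses = []
    for attribute, value in measurements.items():
        low, high = DERIVED_TARGETS[attribute]
        if (low is not None and value < low) or (high is not None and value > high):
            misses.append((attribute, value, (low, high)))
    return misses

# Example use with hypothetical measured values.
print(check_against_targets({
    "code_review_rate_loc_per_hour": 120.0,  # faster than the target range
    "requirements_session_hours":    2.5,
    "rework_percent":                12.0,
}))
```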

6. Identify the statistical and other quantitative techniques to be used in quantitative management.
In quantitative management, the process performance of selected subprocesses is analyzed using statistical and other quantitative techniques that help to characterize subprocess variation, identify when statistically unexpected behavior occurs, recognize when variation is excessive, and investigate why. Examples of statistical techniques that can be used in the analysis of process performance include statistical process control charts, regression analysis, analysis of variance, and time series analysis.
The project can benefit from analyzing the performance of subprocesses not selected for their impact on project performance. Statistical and other quantitative techniques can be identified to address these subprocesses as well.
Statistical and other quantitative techniques sometimes involve the use of graphical displays that help visualize associations among the data and results of analyses. Such graphical displays can help visualize process performance and variation over time (i.e., trends), identify problems or opportunities, and evaluate the effects of particular factors.

TIP: See the definitions of "statistical and other quantitative techniques" and "statistical techniques" in the glossary.

Examples of graphical displays include the following:
• Scatterplots
• Histograms
• Box and whiskers plots
• Run charts
• Ishikawa diagrams

Examples of other techniques used to analyze process performance include the following:
• Tally sheets
• Classification schemas (e.g., Orthogonal Defect Classification)
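As one concrete illustration of a technique named in subpractice 6 (regression analysis), the sketch below fits a simple least-squares line relating a subprocess attribute to a downstream performance result. The data, attribute names, and the assumed linear relationship are hypothetical; a project would use its own historical measurements.

```python
import numpy as np

# Hypothetical historical observations from completed work packages:
# peer review preparation rate versus escaped defect density.
prep_rate_pages_per_hour = np.array([5, 8, 10, 12, 15, 18, 20, 25], dtype=float)
escaped_defects_per_kloc = np.array([1.1, 1.4, 1.6, 1.9, 2.4, 2.8, 3.1, 3.9])

# Ordinary least-squares fit of a first-degree polynomial (a straight line).
slope, intercept = np.polyfit(prep_rate_pages_per_hour, escaped_defects_per_kloc, 1)

def predicted_escapes(prep_rate):
    """Rough prediction of escaped defect density for a given preparation rate."""
    return slope * prep_rate + intercept

print(f"slope={slope:.3f}, intercept={intercept:.3f}")
print(f"predicted escapes at 22 pages/hour: {predicted_escapes(22):.2f} per KLOC")
```

The fitted line is only as good as the data behind it; checking residuals and the strength of the correlation (as the evaluation criteria in SP 1.3 suggest) is part of deciding whether the relationship is usable.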

7. Determine what process performance baselines and models may be needed to support identified analyses.
In some situations, the set of baselines and models provided as described in Organizational Process Performance may be inadequate to support quantitative project management. This situation can happen when the objectives, processes, stakeholders, skill levels, or environment for the project are different from other projects for which baselines and models were established.
As the project progresses, data from the project can serve as a more representative data set for establishing missing baselines and models or a project-specific set of process performance baselines and models.
Hypothesis testing comparing project data to prior historical data can confirm the need to establish additional baselines and models specific to the project.
8. Instrument the organizational or project support environment to support collection, derivation, and analysis of measures.
This instrumentation is based on the following:
• Description of the organization's set of standard processes
• Description of the project's defined process
• Capabilities of the organizational or project support environment
9. Revise measures and statistical analysis techniques as necessary.
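A minimal sketch of the hypothesis-testing idea mentioned in subpractice 7, using hypothetical cycle-time data and SciPy: compare the project's observations against the sample behind the organizational baseline. A small p-value suggests the project's performance differs from the projects used to build the organizational baselines, which can support establishing project-specific baselines or models.

```python
from scipy import stats

# Hypothetical cycle times (days) behind the organizational baseline,
# and the cycle times observed so far on this project.
org_baseline_cycle_days = [12.1, 10.8, 11.5, 13.0, 12.4, 11.9, 12.7, 10.5, 11.2, 12.9]
project_cycle_days      = [14.2, 13.8, 15.1, 14.7, 13.5, 14.9]

# Welch's two-sample t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(project_cycle_days, org_baseline_cycle_days,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Project data differ from the baseline; consider project-specific baselines.")
```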

SG 2 QUANTITATIVELY MANAGE THE PROJECT
The project is quantitatively managed.

TIP: We prepare for quantitative management in SG 1. We perform quantitative management in SG 2. Under some circumstances, it may be necessary to revisit the practices under SG 1 (e.g., to address changes to QPPOs or select a superior subprocess alternative).

Quantitatively managing the project involves the use of statistical and other quantitative techniques to do the following:
• Monitor the selected subprocesses using statistical and other quantitative techniques
• Determine whether or not the project's quality and process performance objectives are being satisfied
• Perform root cause analysis of selected issues to address deficiencies

TIP: These three bullets correspond to SPs 2.1 through 2.3.

SP 2.1 MONITOR THE PERFORMANCE OF SELECTED SUBPROCESSES
Monitor the performance of selected subprocesses using statistical and other quantitative techniques.

TIP: The subprocesses being monitored are those selected in SP 1.3.

The intent of this specific practice is to use statistical and other quantitative techniques to analyze variation in subprocess performance and to determine actions necessary to achieve each subprocess's quality and process performance objectives.

Example Work Products
1. Natural bounds of process performance for each selected subprocess attribute
2. The actions needed to address deficiencies in the process stability or capability of each selected subprocess

Subpractices
1. Collect data, as defined by the selected measures, on the subprocesses as they execute.
2. Monitor the variation and stability of the selected subprocesses and address deficiencies.
This analysis involves evaluating measurements in relation to the natural bounds calculated for each selected measure and identifying outliers or other signals of potential non-random behavior, determining their causes, and preventing or mitigating the effects of their recurrence (i.e., addressing special causes of variation).
During such analysis, be sensitive to the sufficiency of the data and to shifts in process performance that can affect the ability to achieve or maintain process stability.
Analytic techniques for identifying outliers or signals include statistical process control charts, prediction intervals, and analysis of variance. Some of these techniques involve graphical displays.
Other deficiencies in process performance to consider include when variation is too large to have confidence that the subprocess is stable, or too great to assess its capability (next subpractice) of achieving the objectives established for each selected attribute.

TIP: This subpractice addresses the question "Is the subprocess stable?"

HINT: When subprocess variation is too large, investigate process and measurement fidelity. Also, investigate what is happening "upstream" and at a lower level of granularity. You may need to restratify your data (e.g., to match input types) or disaggregate the measure (so it is no longer reporting totals) to achieve control.

X-REF: See Example 7.1 in Measuring the Software Process: Statistical Process Control for Software Process Improvement by William A. Florac and Anita D. Carleton (Addison-Wesley).

3. Monitor the capability and performance of the selected subprocesses and address deficiencies.
The intent of this subpractice is to identify what actions to take to help the subprocess achieve its quality and process performance objectives. Be sure that the subprocess performance is stable relative to the selected measures (previous subpractice) before comparing its capability to its quality and process performance objectives.

TIP: This subpractice addresses the question "Is the subprocess capable?"

Examples of actions that can be taken when the performance of a selected subprocess fails to satisfy its objectives include the following:
• Improving the implementation of the existing subprocess to reduce its variation or improve its performance (i.e., addressing common causes of variation)
• Identifying and implementing an alternative subprocess through identifying and adopting new process elements, subprocesses, and technologies that may help better align with objectives
• Identifying risks and risk mitigation strategies for each deficiency in subprocess capability
• Renegotiating or re-deriving objectives for each selected attribute of a subprocess so that they can be met by the subprocess

HINT: Regularly compare the natural bounds for the subprocess ("voice of the process") against the associated QPPOs ("voice of the customer") and take corrective action as appropriate.

TIP: Sometimes, you can change QPPOs associated with a particular subprocess by investigating whether other subprocesses can give up some needed slack. (Or by differentiating by type or size of work products it is applied to.)

Some actions can involve the use of root cause analysis, which is further described in SP 2.3.
Refer to the Project Monitoring and Control process area for more information about managing corrective action to closure.
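A minimal sketch (hypothetical data and QPPO limits) of the monitoring described in subpractices 2 and 3 above: compute the natural process limits of an individuals (XmR) control chart for a subprocess attribute, flag points outside them as possible special causes, and, when no such signals are present, compare the natural bounds ("voice of the process") to the QPPO ("voice of the customer").

```python
def xmr_natural_bounds(values):
    """Return (lower, center, upper) natural process limits for an XmR individuals chart."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(values) / len(values)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for subgroups of 2).
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical peer review rates (lines of code per hour) from recent reviews.
review_rates = [92, 101, 97, 88, 105, 99, 94, 110, 96, 102]

lower, center, upper = xmr_natural_bounds(review_rates)
signals = [r for r in review_rates if r < lower or r > upper]
print(f"natural bounds: [{lower:.1f}, {upper:.1f}], center {center:.1f}")
print("possible special causes:", signals or "none (subprocess appears stable)")

# Hypothetical QPPO for this attribute: keep the review rate between 75 and 150 LOC/hour.
qppo_low, qppo_high = 75, 150
capable = qppo_low <= lower and upper <= qppo_high
print("natural bounds fall within the QPPO (subprocess appears capable):", capable)
```

If the natural bounds extend outside the QPPO even though the subprocess is stable, the actions listed above (reducing common-cause variation, substituting an alternative subprocess, or renegotiating the objective) are the kinds of responses this subpractice calls for.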

SP 2.2 MANAGE PROJECT PERFORMANCE
Manage the project using statistical and other quantitative techniques to determine whether or not the project's objectives for quality and process performance will be satisfied.

TIP: Expand your process improvement toolkit to include techniques such as Six Sigma or Theory of Constraints when managing project performance.

X-REF: To see how to use Six Sigma with CMMI, see "CMMI and Six Sigma: Partners in Process Improvement," at www.sei.cmu.edu/library/abstracts/books/0321516087.cfm.

X-REF: For more information on using Theory of Constraints and the Critical Thinking Tools, see Thinking for a Change: Putting the TOC Thinking Processes to Use by Lisa Scheinkopf (CRC).

Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.
Refer to the Organizational Performance Management process area for more information about managing business performance.

This specific practice is project focused and uses multiple inputs to predict if the project's quality and process performance objectives will be satisfied. Based on this prediction, risks associated with not meeting the project's quality and process performance objectives are identified and managed, and actions to address deficiencies are defined as appropriate.
Key inputs to this analysis include the individual subprocess stability and capability data derived from the previous specific practice, as well as performance data from monitoring other subprocesses, risks, and suppliers' progress.


Example Work Products
1. Predictions of results to be achieved relative to the project's quality and process performance objectives
2. Graphical displays and data tabulations for other subprocesses, which support quantitative management
3. Assessment of risks of not achieving the project's quality and process performance objectives
4. Actions needed to address deficiencies in achieving project objectives

Subpractices
1. Periodically review the performance of subprocesses.
Stability and capability data from monitoring selected subprocesses, as described in SP 2.1, are a key input into understanding the project's overall ability to meet quality and process performance objectives.
In addition, subprocesses not selected for their impact on project objectives can still create problems or risks for the project, and thus some level of monitoring for these subprocesses may be desired as well. Analytic techniques involving the use of graphical displays can also prove to be useful to understanding subprocess performance.
2. Monitor and analyze suppliers' progress toward achieving their quality and process performance objectives.
3. Periodically review and analyze actual results achieved against established interim objectives.
4. Use process performance models calibrated with project data to assess progress toward achieving the project's quality and process performance objectives.
Process performance models are used to assess progress toward achieving objectives that cannot be measured until a future phase in the project lifecycle. Objectives can either be interim objectives or overall objectives.

TIP: PPMs (OPP SP 1.5) calibrated with project-specific data or PPBs can help the project determine whether it will be able to achieve its QPPOs.

An example is the use of process performance models to predict the latent defects in work products in future phases or in the delivered product.
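A minimal sketch of the kind of prediction described in the example above, using hypothetical calibration values: a simple phase-based model of defect injection and removal, calibrated from project and organizational data, used to estimate latent defects in the delivered product. Real process performance models would be calibrated and validated against the project's own measurements.

```python
# Hypothetical per-phase calibration: defects injected per KLOC during the phase,
# and the fraction of defects present that the phase's verification removes.
phases = [
    ("requirements", 2.0, 0.55),
    ("design",       4.0, 0.60),
    ("code",        12.0, 0.70),
    ("test",         0.5, 0.85),
]

def predict_latent_defects(ksloc):
    """Estimate defects remaining in the delivered product for a release of `ksloc` KLOC."""
    remaining = 0.0
    for _name, injected_per_ksloc, removal_fraction in phases:
        remaining += injected_per_ksloc * ksloc   # defects injected in this phase
        remaining *= (1.0 - removal_fraction)     # defects that escape this phase
    return remaining

print(f"predicted latent defects for a 40 KLOC release: {predict_latent_defects(40):.1f}")
```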

Calibration of process performance models is based on the results obtained from performing the activities described in the previous subpractices and specific practices.
5. Identify and manage risks associated with achieving the project's quality and process performance objectives.
Refer to the Risk Management process area for more information about identifying and analyzing risks and mitigating risks.

Example sources of risks include the following:
• Subprocesses having inadequate performance or capability
• Suppliers not achieving their quality and process performance objectives
• Lack of visibility into supplier capability
• Inaccuracies in the process performance models used for predicting performance
• Deficiencies in predicted process performance (estimated progress)
• Other identified risks associated with identified deficiencies

6. Determine and implement actions needed to address deficiencies in achieving the project’s quality and process performance objectives. The intent of this subpractice is to identify and implement the right set of actions, resources, and schedule to place the project back on a path toward achieving its objectives.

Examples of actions that can be taken to address deficiencies in achieving the project's objectives include the following:
• Changing quality and process performance objectives so that they are within the expected range of the project's defined process
• Improving the implementation of the project's defined process
• Adopting new subprocesses and technologies that have the potential for satisfying objectives and managing associated risks
• Identifying the risk and risk mitigation strategies for deficiencies
• Terminating the project

Some actions can involve the use of root cause analysis, which is addressed in the next specific practice.
Refer to the Project Monitoring and Control process area for more information about managing corrective action to closure.
When corrective actions result in changes to attributes or measures related to adjustable factors in a process performance model, the model can be used to predict the effects of the actions. When undertaking critical corrective actions in high risk situations, a process performance model can be created to predict the effects of the change.

SP 2.3 PERFORM ROOT CAUSE ANALYSIS
Perform root cause analysis of selected issues to address deficiencies in achieving the project's quality and process performance objectives.

TIP: The deficiencies mentioned in the SP statement are those arising out of the analyses described in SP 2.1 (subprocess performance deficiencies) and SP 2.2 (predicted deficiencies in achieving project QPPOs).

Issues to address include deficiencies in subprocess stability and capability, and deficiencies in project performance relative to its objectives.


Root cause analysis of selected issues is best performed shortly after the problem is first identified, while the event is still recent enough to be carefully investigated.
The formality of and effort required for a root cause analysis can vary greatly and can be determined by such factors as the stakeholders who are involved; the risk or opportunity that is present; the complexity of the situation; the frequency with which the situation could recur; the availability of data, baselines, and models that can be used in the analysis; and how much time has passed since the events triggering the deficiency. In the case of a subprocess that exhibits too much variation, is performed rarely, and involves different stakeholders, it could take weeks or months to identify root causes.
Likewise, the actions to take can range significantly in terms of effort and time needed to determine, plan, and implement them. It is often difficult to know how much time is needed unless an initial analysis of the deficiencies is undertaken.
Refer to the Causal Analysis and Resolution process area for more information about identifying causes of selected outcomes and taking action to improve process performance.
Refer to the Measurement and Analysis process area for more information about aligning measurement and analysis activities and providing measurement results.

Example Work Products
1. Subprocess and project performance measurements and analyses (including statistical analyses) recorded in the organization's measurement repository
2. Graphical displays of data used to understand subprocess and project performance and performance trends
3. Identified root causes and potential actions to take

X-REF: Although an Ishikawa diagram (i.e., fishbone diagram) is a common tool for simple root cause analysis, deficiencies affecting achievement of the project's QPPOs often require more in-depth analysis. In addition to the use of PPBs and PPMs, consider tools such as the Current Reality Tree from the Theory of Constraints. For more information, visit the Goldratt Institute at www.goldratt.com/ or consult Wikipedia.

Subpractices
1. Perform root cause analysis, as appropriate, to diagnose process performance deficiencies.
Process performance baselines and models are used in diagnosing deficiencies; identifying possible solutions; predicting future project and process performance; and evaluating potential actions as appropriate.
The use of process performance models in predicting future project and process performance is described in a subpractice of the previous specific practice.


2. Identify and analyze potential actions.
3. Implement selected actions.
4. Assess the impact of the actions on subprocess performance.
This assessment of impact can include an evaluation of the statistical significance of the impacts resulting from the actions taken to improve process performance.




REQUIREMENTS DEVELOPMENT
An Engineering Process Area at Maturity Level 3

Purpose

The purpose of Requirements Development (RD) is to elicit, analyze, and establish customer, product, and product component requirements.

TIP: REQM works alongside RD to manage requirements, changes to requirements, and traceability and to ensure alignment of work products as requirements are developed.

Introductory Notes

This process area describes three types of requirements: customer requirements, product requirements, and product component requirements. Taken together, these requirements address the needs of relevant stakeholders, including needs pertinent to various product lifecycle phases (e.g., acceptance testing criteria) and product attributes (e.g., responsiveness, safety, reliability, maintainability). Requirements also address constraints caused by the selection of design solutions (e.g., integration of commercial off-the-shelf products, use of a particular architecture pattern).
All development projects have requirements. Requirements are the basis for design. The development of requirements includes the following activities:

• Elicitation, analysis, validation, and communication of customer needs, expectations, and constraints to obtain prioritized customer requirements that constitute an understanding of what will satisfy stakeholders
• Collection and coordination of stakeholder needs
• Development of the lifecycle requirements of the product
• Establishment of the customer functional and quality attribute requirements
• Establishment of initial product and product component requirements consistent with customer requirements

TIP: Customer needs can prescribe particular solutions (e.g., a client-server application) in addition to describing the problem to be solved.

TIP: See the definition of "quality attributes" in the glossary. Quality attribute requirements are nonfunctional requirements.



• This process area addresses all customer requirements rather than only product level requirements because the customer can also provide specific design requirements.

Customer requirements are further refined into product and product component requirements. In addition to customer requirements, product and product component requirements are derived from the selected design solutions. Throughout the process areas, where the terms "product" and "product component" are used, their intended meanings also encompass services, service systems, and their components.

TIP: See the definition of "service system" in the glossary. For more information about developing service systems, see the Service System Development process area of the CMMI for Services model.

Requirements are identified and refined throughout the phases of the product lifecycle. Design decisions, subsequent corrective actions, and feedback during each phase of the product's lifecycle are analyzed for impact on derived and allocated requirements.

TIP: As long as they continue to be maintained, requirements provide value to those supporting the product throughout its life. For example, requirements remind implementers, maintainers, and trainers of what the product is supposed to do so that their actions contribute to requirements satisfaction.

TIP: See the definitions of "derived requirements" and "allocated requirements" in the glossary.

The Requirements Development process area includes three specific goals. The Develop Customer Requirements specific goal addresses defining a set of customer requirements to use in the development of product requirements. The Develop Product Requirements specific goal addresses defining a set of product or product component requirements to use in the design of products and product components. The Analyze and Validate Requirements specific goal addresses the analysis of customer, product, and product component requirements to define, derive, and understand the requirements. The specific practices of the third specific goal are intended to assist the specific practices in the first two specific goals. The processes associated with the Requirements Development process area and processes associated with the Technical Solution process area can interact recursively with one another.

TIP: For more on recursion, see "Recursion and Iteration of Engineering Processes" in section 4.

TIP: Requirements validation is intended to uncover critical stakeholder needs, expectations, interfaces, and constraints. The practices in VAL may provide additional insight into how RD validation activities can be performed.

Analyses are used to understand, define, and select the requirements at all levels from competing alternatives. These analyses include the following:

• Analysis of needs and requirements for each product lifecycle phase, including needs of relevant stakeholders, the operational environment, and factors that reflect overall customer and end-user expectations and satisfaction, such as safety, security, and affordability
• Development of an operational concept
• Definition of the required functionality and quality attributes

TIP: Safety, security, and affordability are quality attributes. Note that they may have different implications for different phases of the product's life and for different stakeholders.

This definition of required functionality and quality attributes describes what the product is to do. (See the definition of "definition of required functionality and quality attributes" in the glossary.) This definition can include descriptions, decompositions, and a partitioning of the functions (or in object oriented analysis what has been referred to as "services" or "methods") of the product.

TIP: The term "definition of required functionality and quality attributes" is new in CMMI V1.3. It extends the concept "definition of required functionality" to include quality attributes, recognizing that architectural tradeoffs can span both.

In addition, the definition specifies design considerations or constraints on how the required functionality will be realized in the product. Quality attributes address such things as product availability; maintainability; modifiability; timeliness, throughput, and responsiveness; reliability; security; and scalability. Some quality attributes will emerge as architecturally significant and thus drive the development of the product architecture.
Such analyses occur recursively at successively more detailed layers of a product's architecture until sufficient detail is available to enable detailed design, acquisition, and testing of the product to proceed. As a result of the analysis of requirements and the operational concept (including functionality, support, maintenance, and disposal), the manufacturing or production concept produces more derived requirements, including consideration of the following:

• Constraints of various types
• Technological limitations
• Cost and cost drivers
• Time constraints and schedule drivers
• Risks
• Consideration of issues implied but not explicitly stated by the customer or end user
• Factors introduced by the developer's unique business considerations, regulations, and laws

A hierarchy of logical entities (e.g., functions and subfunctions; object classes and subclasses; processes; other architectural entities) is established through iteration with the evolving operational concept. Requirements are refined, derived, and allocated to these logical entities. Requirements and logical entities are allocated to products, product components, people, or associated processes. In the case of iterative or incremental development, the requirements are also allocated to iterations or increments.
Involvement of relevant stakeholders in both requirements development and analysis gives them visibility into the evolution of requirements. This activity continually assures them that the requirements are being properly defined.
For product lines, engineering processes (including requirements development) may be applied to at least two levels in the organization.


At an organizational or product line level, a "commonality and variation analysis" is performed to help elicit, analyze, and establish core assets for use by projects within the product line. At the project level, these core assets are then used as per the product line production plan as part of the project's engineering activities.

X-REF: See the definition of "product line" in the glossary.

In Agile environments, customer needs and ideas are iteratively elicited, elaborated, analyzed, and validated. Requirements are documented in forms such as user stories, scenarios, use cases, product backlogs, and the results of iterations (working code in the case of software). Which requirements will be addressed in a given iteration is driven by an assessment of risk and by the priorities associated with what is left on the product backlog. What details of requirements (and other artifacts) to document is driven by the need for coordination (among team members, teams, and later iterations) and the risk of losing what was learned. When the customer is on the team, there can still be a need for separate customer and product documentation to allow multiple solutions to be explored. As the solution emerges, responsibilities for derived requirements are allocated to the appropriate teams. (See "Interpreting CMMI When Using Agile Approaches" in Part I.)

TIP: The discipline implied by the practices in RD may help projects (not just those employing Agile methods) adequately attend to the needs of other stakeholders, other lifecycle phases, and needed quality attributes so that critical needs and risks are not overlooked or forgotten.

Related Process Areas

Refer to the Product Integration process area for more information about ensuring interface compatibility.
Refer to the Technical Solution process area for more information about selecting product component solutions and developing the design.
Refer to the Validation process area for more information about validating product or product components.
Refer to the Verification process area for more information about verifying selected work products.
Refer to the Configuration Management process area for more information about tracking and controlling changes.
Refer to the Requirements Management process area for more information about managing requirements.
Refer to the Risk Management process area for more information about identifying and analyzing risks.

Specific Goal and Practice Summary
SG 1 Develop Customer Requirements
SP 1.1 Elicit Needs
SP 1.2 Transform Stakeholder Needs into Customer Requirements
SG 2 Develop Product Requirements
SP 2.1 Establish Product and Product Component Requirements
SP 2.2 Allocate Product Component Requirements
SP 2.3 Identify Interface Requirements
SG 3 Analyze and Validate Requirements
SP 3.1 Establish Operational Concepts and Scenarios
SP 3.2 Establish a Definition of Required Functionality and Quality Attributes
SP 3.3 Analyze Requirements
SP 3.4 Analyze Requirements to Achieve Balance
SP 3.5 Validate Requirements

Specific Practices by Goal

SG 1 DEVELOP CUSTOMER REQUIREMENTS
Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements.

TIP: In some situations, project work is authorized through a supplier agreement that documents the requirements (see SAM SP 1.3). Additional effort may be needed to determine a more refined and complete set of customer requirements.

The needs of stakeholders (e.g., customers, end users, suppliers, builders, testers, manufacturers, logistics support staff) are the basis for determining customer requirements. The stakeholder needs, expectations, constraints, interfaces, operational concepts, and product concepts are analyzed, harmonized, refined, and elaborated for translation into a set of customer requirements.

TIP: Some stakeholder needs are rarely communicated in an official document. They are communicated in documentation, conversations, meetings, demonstrations, and so on. Therefore, this information must be translated into requirements that the project and stakeholders can agree to.

Frequently, stakeholder needs, expectations, constraints, and interfaces are poorly identified or conflicting. Since stakeholder needs, expectations, constraints, and limitations should be clearly identified and understood, an iterative process is used throughout the life of the project to accomplish this objective. To facilitate the required interaction, a surrogate for the end user or customer is frequently involved to represent their needs and help resolve conflicts. The customer relations or marketing part of the organization as well as members of the development team from disciplines such as human engineering or support can be used as surrogates. Environmental, legal, and other constraints should be considered when creating and resolving the set of customer requirements.

HINT: Rarely does a customer know exactly what he needs. Prototypes or an iterative development process can sometimes help you more effectively learn the requirements for the product.


SP 1.1 ELICIT NEEDS
Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product lifecycle.

Eliciting goes beyond collecting requirements by proactively identifying additional requirements not explicitly provided by customers. Additional requirements should address the various product lifecycle activities and their impact on the product.

X-REF: In the case of product lines or product families, SP 1.1 is sometimes implemented across the product line or product family. See the penultimate paragraph of the RD Intro Notes for more information.

HINT: Determine what the product must do and how it should behave. Also determine what is required to produce it, license it, market it, distribute it, install it, train end users, repair it, maintain it, migrate to new versions, support its use, retire it, and dispose of it.

Examples of techniques to elicit needs include the following:
• Technology demonstrations
• Interface control working groups
• Technical control working groups
• Interim project reviews
• Questionnaires, interviews, and scenarios (operational, sustainment, and development) obtained from end users
• Operational, sustainment, and development walkthroughs and end-user task analysis
• Quality attribute elicitation workshops with stakeholders
• Prototypes and models
• Brainstorming
• Quality Function Deployment
• Market surveys
• Beta testing
• Extraction from sources such as documents, standards, or specifications
• Observation of existing products, environments, and workflow patterns
• Use cases
• User stories
• Delivering small incremental "vertical slices" of product functionality
• Business case analysis
• Reverse engineering (for legacy products)
• Customer satisfaction surveys

Examples of sources of requirements that may not be identified by the customer include the following:
• Business policies
• Standards
• Previous architectural design decisions and principles
• Business environmental requirements (e.g., laboratories, testing and other facilities, information technology infrastructure)
• Technology
• Legacy products or product components (reuse product components)
• Regulatory statutes

Example Work Products
1. Results of requirements elicitation activities

Subpractices
1. Engage relevant stakeholders using methods for eliciting needs, expectations, constraints, and external interfaces.

SP 1.2 TRANSFORM STAKEHOLDER NEEDS INTO CUSTOMER REQUIREMENTS
Transform stakeholder needs, expectations, constraints, and interfaces into prioritized customer requirements.

The various inputs from the relevant stakeholders should be consolidated, missing information should be obtained, and conflicts should be resolved as customer requirements are developed and prioritized. The customer requirements can include needs, expectations, and constraints with regard to verification and validation.

TIP: Neither the acquirer nor the project should abrogate or diminish project responsibility to implement appropriate verifications and validations.

In some situations, the customer provides a set of requirements to the project, or the requirements exist as an output of a previous project’s activities. In these situations, the customer requirements could conflict with the relevant stakeholders’ needs, expectations, constraints, and interfaces and will need to be transformed into the recognized set of customer requirements after appropriate resolution of conflicts.

Relevant stakeholders representing all phases of the product’s lifecycle should include business as well as technical functions. In this way, concepts for all product related lifecycle processes are considered concurrently with the concepts for the products. Customer requirements result from informed decisions on the business as well as technical effects of their requirements.

Example Work Products
1. Prioritized customer requirements
2. Customer constraints on the conduct of verification
3. Customer constraints on the conduct of validation


Subpractices
1. Translate stakeholder needs, expectations, constraints, and interfaces into documented customer requirements.
2. Establish and maintain a prioritization of customer functional and quality attribute requirements.

TIP: Prioritization also provides guidance when making architectural tradeoffs and ensures priorities are not later overlooked or forgotten.

Having prioritized customer requirements helps to determine project, iteration, or increment scope. This prioritization ensures that functional and quality attribute requirements critical to the customer and other stakeholders are addressed quickly.

X-REF: The Kano model is an approach to classifying and prioritizing functionality and attributes according to their impact on customer perceptions of quality. One source of information on the Kano model is the Wikipedia website (en.wikipedia.org/wiki/Kano_model).

3. Define constraints for verification and validation.

SG 2 DEVELOP PRODUCT REQUIREMENTS
Customer requirements are refined and elaborated to develop product and product component requirements.

Customer requirements are analyzed in conjunction with the development of the operational concept to derive more detailed and precise sets of requirements called “product and product component requirements.” Product and product component requirements address the needs associated with each product lifecycle phase. Derived requirements arise from constraints; consideration of issues implied but not explicitly stated in the customer requirements baseline; factors introduced by the selected architecture, product lifecycle, and design; and the developer’s unique business considerations. The requirements are reexamined with each successive, lower level set of requirements and architecture, and the preferred product concept is refined.

TIP: See the definitions of “product lifecycle” and “architecture” in the glossary.

The requirements are allocated to product functions and product components including objects, people, and processes. In the case of iterative or incremental development, the requirements are also allocated to iterations or increments based on customer priorities, technology issues, and project objectives. The traceability of requirements to functions, objects, tests, issues, or other entities is documented. The allocated requirements and functions (or other logical entities) are the basis for the synthesis of the technical solution; however, as the architecture is defined or emerges, it serves as the ultimate basis for directing the allocation of requirements to the solution. As internal components are developed, additional interfaces are defined and interface requirements are established.

Refer to the Requirements Management process area for more information about maintaining bidirectional traceability of requirements.


SP 2.1 ESTABLISH PRODUCT AND PRODUCT COMPONENT REQUIREMENTS
Establish and maintain product and product component requirements, which are based on the customer requirements.

TIP: Recursion is built into SP 2.1. Product requirements are associated with the top level of the product hierarchy. Product component requirements are recursively developed for each lower level in parallel with the recursive development of TS.

The customer functional and quality attribute requirements can be expressed in the customer’s terms and can be nontechnical descriptions. The product requirements are the expression of these requirements in technical terms that can be used for design decisions. An example of this translation is found in the first House of Quality Function Deployment, which maps customer desires into technical parameters. For instance, “solid sounding door” may be mapped to size, weight, fit, dampening, and resonant frequencies.

X-REF: There are many sources of information on QFD, including the iSixSigma website (www.isixsigma.com) and the Quality Function Deployment Institute website (www.qfdi.org).

Product and product component requirements address the satisfaction of customer, business, and project objectives and associated attributes, such as effectiveness and affordability.

Derived requirements also address the needs of other lifecycle phases (e.g., production, operations, disposal) to the extent compatible with business objectives.

The modification of requirements due to approved requirement changes is covered by the “maintain” aspect of this specific practice; whereas, the administration of requirement changes is covered by the Requirements Management process area.

TIP: The wording in SP 2.1, “. . . maintain product and product component requirements,” should be interpreted to cover the modification of requirements, not the administration of such changes (REQM SP 1.3).

Refer to the Requirements Management process area for more information about managing requirements.

Example Work Products
1. Derived requirements
2. Product requirements
3. Product component requirements
4. Architectural requirements, which specify or constrain the relationships among product components

Subpractices
1. Develop requirements in technical terms necessary for product and product component design.
2. Derive requirements that result from design decisions.
Refer to the Technical Solution process area for more information about selecting product component solutions and developing the design.
Selection of a technology brings with it additional requirements. For instance, use of electronics requires additional technology specific requirements such as electromagnetic interference limits.


Architectural decisions, such as selection of architecture patterns, introduce additional derived requirements for product components. For example, the Layers Pattern will constrain dependencies between certain product components.

X-REF: For sources of information on architecture patterns, see the x-ref for TS SP 2.1.

3. Develop architectural requirements capturing critical quality attributes and quality attribute measures necessary for establishing the product architecture and design.

Examples of quality attribute measures include the following:
• Respond within 1 second
• System is available 99% of the time
• Implement a change with no more than one staff week of effort

TIP: Architecture requirements express the product qualities and performance points that are critical to customer satisfaction (also see TS SP 2.1).

4. Establish and maintain relationships between requirements for consideration during change management and requirements allocation.
Refer to the Requirements Management process area for more information about maintaining bidirectional traceability of requirements.
Relationships between requirements can aid in evaluating the impact of changes.

SP 2.2 ALLOCATE PRODUCT COMPONENT REQUIREMENTS
Allocate the requirements for each product component.

Refer to the Technical Solution process area for more information about selecting product component solutions.

The product architecture provides the basis for allocating product requirements to product components. The requirements for product components of the defined solution include allocation of product performance; design constraints; and fit, form, and function to meet requirements and facilitate production. In cases where a higher level requirement specifies a quality attribute that will be the responsibility of more than one product component, the quality attribute can sometimes be partitioned for unique allocation to each product component as a derived requirement; however, at other times the shared requirement should instead be allocated directly to the architecture. For example, allocation of shared requirements to the architecture would describe how a performance requirement (e.g., on responsiveness) is budgeted among components so as to account in an end-to-end manner for realization of the requirement. This concept of shared requirements can extend to other architecturally significant quality attributes (e.g., security, reliability).

TIP: Other examples of shared requirements include a constraint on memory that is to be met by a collection of components and a safety requirement that is to be achieved by the components of a product.

TIP: Sometimes, a provisional allocation of a higher-level requirement to product components is made, but is later revised to account for the unique or emerging capabilities of individual suppliers, teams, or new COTS products.
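To make this budgeting idea concrete, the following is a minimal Python sketch, not part of the CMMI model itself, of allocating an end-to-end response-time requirement (for example, "respond within 1 second") across product components and checking the allocation. The component names and millisecond figures are invented for illustration.

# Illustrative only: budgeting a shared end-to-end quality attribute
# requirement across product components. Names and values are hypothetical.
END_TO_END_BUDGET_MS = 1000  # shared requirement: respond within 1 second

component_budgets_ms = {
    "user interface": 150,
    "application service": 350,
    "database access": 300,
    "network transport": 150,
}

def check_allocation(budgets: dict[str, int], end_to_end_ms: int) -> None:
    """Verify that the per-component budgets account for the shared requirement."""
    total = sum(budgets.values())
    slack = end_to_end_ms - total
    status = "OK" if slack >= 0 else "OVER BUDGET"
    print(f"Allocated {total} ms of {end_to_end_ms} ms ({status}, slack {slack} ms)")
    for name, budget in budgets.items():
        print(f"  {name}: {budget} ms")

check_allocation(component_budgets_ms, END_TO_END_BUDGET_MS)

A negative slack in such a check is one trigger for revisiting either the component allocations or the shared requirement with relevant stakeholders.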


Example Work Products
1. Requirement allocation sheets
2. Provisional requirement allocations
3. Design constraints
4. Derived requirements
5. Relationships among derived requirements

Subpractices
1. Allocate requirements to functions.
2. Allocate requirements to product components and the architecture.
3. Allocate design constraints to product components and the architecture.
4. Allocate requirements to delivery increments.
5. Document relationships among allocated requirements.
Relationships include dependencies in which a change in one requirement can affect other requirements.

SP 2.3 IDENTIFY INTERFACE REQUIREMENTS
Identify interface requirements.

Interfaces between functions (or between objects or other logical entities) are identified. Interfaces can drive the development of alternative solutions described in the Technical Solution process area.

Refer to the Product Integration process area for more information about ensuring interface compatibility.

Interface requirements between products or product components identified in the product architecture are defined. They are controlled as part of product and product component integration and are an integral part of the architecture definition.

TIP: Because of the importance of interfaces to effective product development, interfaces are addressed in multiple places: interface requirements are covered in RD SP 2.3, design of interfaces is covered in TS SP 2.3, and ensuring interface compatibility is covered in PI SG 2.

Example Work Products
1. Interface requirements

Subpractices

1. Identify interfaces both external to the product and internal to the product (e.g., between functional partitions or objects).
As the design progresses, the product architecture will be altered by technical solution processes, creating new interfaces between product components and components external to the product. Interfaces with product related lifecycle processes should also be identified.

TIP: Identifying the interfaces for which requirements will be developed is not a one-time event, but may continue for as long as design progresses or in response to changes.

Examples of these interfaces include interfaces with test equipment, transportation systems, support systems, and manufacturing facilities.

2. Develop the requirements for the identified interfaces.
Refer to the Technical Solution process area for more information about designing interfaces using criteria.
Requirements for interfaces are defined in terms such as origination, destination, stimulus, data characteristics for software, and electrical and mechanical characteristics for hardware.
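As a rough illustration of the kind of record such interface requirements might produce, the following Python sketch captures origination, destination, stimulus, and data characteristics. The field names and example values are assumptions made for illustration, not prescribed by the model.

# Illustrative only: one way to record an interface requirement in the terms
# mentioned above (origination, destination, stimulus, data characteristics).
from dataclasses import dataclass

@dataclass
class InterfaceRequirement:
    identifier: str
    origination: str            # component or external system that initiates the exchange
    destination: str            # component or external system that receives it
    stimulus: str               # event or condition that triggers the exchange
    data_characteristics: str   # format, units, ranges, timing, etc.
    external: bool              # True if the interface crosses the product boundary

ir_001 = InterfaceRequirement(
    identifier="IR-001",
    origination="sensor subsystem",
    destination="data logger",
    stimulus="temperature sample ready",
    data_characteristics="16-bit signed integer, degrees C x 100, every 500 ms",
    external=False,
)
print(ir_001)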

SG 3 ANALYZE AND VALIDATE REQUIREMENTS
The requirements are analyzed and validated.

TIP: The purpose of requirements validation is to make sure you have a clear understanding of what the customer wants and needs. Often, this understanding evolves over time and requires a series of requirements validation activities.

The specific practices of the Analyze and Validate Requirements specific goal support the development of the requirements in both the Develop Customer Requirements specific goal and the Develop Product Requirements specific goal. The specific practices associated with this specific goal cover analyzing and validating the requirements with respect to the end user’s intended environment.

Analyses are performed to determine what impact the intended operational environment will have on the ability to satisfy the stakeholders’ needs, expectations, constraints, and interfaces. Considerations, such as feasibility, mission needs, cost constraints, potential market size, and acquisition strategy, should all be taken into account, depending on the product context. Architecturally significant quality attributes are identified based on mission and business drivers. A definition of required functionality and quality attributes is also established. All specified usage modes for the product are considered.

TIP: Requirements analyses examine requirements from different perspectives (e.g., feasibility, cost, and risk) and often use different abstractions (e.g., process, functional, data flow, entity-relationship, state diagrams, and temporal).

The objectives of the analyses are to determine candidate requirements for product concepts that will satisfy stakeholder needs, expectations, and constraints and then to translate these concepts into requirements. In parallel with this activity, the parameters that will be used to evaluate the effectiveness of the product are determined based on customer input and the preliminary product concept.

HINT: Technical performance parameters (TPPs) can be used to help in assessing or predicting quality attributes, cost, schedule, risk, etc.

X-REF: For more information about technical performance parameters, see “Using CMMI to Improve Earned Value Management” (www.sei.cmu.edu/publications/documents/02.reports/02tn016.html).

Requirements are validated to increase the probability that the resulting product will perform as intended in the use environment.


SP 3.1 ESTABLISH OPERATIONAL CONCEPTS AND SCENARIOS
Establish and maintain operational concepts and associated scenarios.

A scenario is typically a sequence of events that may occur in the development, use, or sustainment of the product, which is used to make explicit some of the functional or quality attribute needs of the stakeholders. In contrast, an operational concept for a product usually depends on both the design solution and the scenario. For example, the operational concept for a satellite based communications product is quite different from one based on landlines. Since the alternative solutions have not usually been defined when preparing the initial operational concepts, conceptual solutions are developed for use when analyzing the requirements. The operational concepts are refined as solution decisions are made and lower level detailed requirements are developed.

HINT: Think of an operational concept as a picture that portrays the product, end user, and other entities in the intended environment. Think of a scenario as a story describing a sequence of events and end-user and product interactions. An operational concept provides a context for developing or evaluating a set of scenarios.

Just as a design decision for a product can become a requirement for a product component, the operational concept can become the scenarios (requirements) for product components. Operational concepts and scenarios are evolved to facilitate the selection of product component solutions that, when implemented, will satisfy the intended use of the product or facilitate its development and sustainment. Operational concepts and scenarios document the interaction of the product components with the environment, end users, and other product components, regardless of engineering discipline. They should be documented for all modes and states within operations, product development, deployment, delivery, support (including maintenance and sustainment), training, and disposal. Scenarios can be developed to address operational, sustainment, development, or other event sequences.

TIP: Operational concepts and scenarios are a way to demonstrate or bring to life what the requirements are trying to capture.

Example Work Products

1. Operational concept
2. Product or product component development, installation, operational, maintenance, and support concepts
3. Disposal concepts
4. Use cases
5. Timeline scenarios
6. New requirements

Subpractices
1. Develop operational concepts and scenarios that include operations, installation, development, maintenance, support, and disposal as appropriate.


Identify and develop scenarios, consistent with the level of detail in the stakeholder needs, expectations, and constraints in which the proposed product or product component is expected to operate. Augment scenarios with quality attribute considerations for the functions (or other logical entities) described in the scenario.
2. Define the environment in which the product or product component will operate, including boundaries and constraints.
3. Review operational concepts and scenarios to refine and discover requirements.
Operational concept and scenario development is an iterative process. The reviews should be held periodically to ensure that they agree with the requirements. The review can be in the form of a walkthrough.
4. Develop a detailed operational concept, as products and product components are selected, that defines the interaction of the product, the end user, and the environment, and that satisfies the operational, maintenance, support, and disposal needs.

X-REF: Component selection is further described in TS.

SP 3.2 ESTABLISH A DEFINITION OF REQUIRED FUNCTIONALITY AND QUALITY ATTRIBUTES
Establish and maintain a definition of required functionality and quality attributes.

One approach to defining required functionality and quality attributes is to analyze scenarios using what some have called a “functional analysis” to describe what the product is intended to do. This functional description can include actions, sequence, inputs, outputs, or other information that communicates the manner in which the product will be used. The resulting description of functions, logical groupings of functions, and their association with requirements is referred to as a functional architecture. (See the definitions of “functional analysis” and “functional architecture” in the glossary.)

Such approaches have evolved in recent years through the introduction of architecture description languages, methods, and tools to more fully address and characterize the quality attributes, allowing a richer (e.g., multi-dimensional) specification of constraints on how the defined functionality will be realized in the product, and facilitating additional analyses of the requirements and technical solutions. Some quality attributes will emerge as architecturally significant and thus drive the development of the product architecture. These quality attributes often reflect cross-cutting concerns that may not be allocatable to lower level elements of a solution. A clear understanding of the quality attributes and their importance based on mission or business needs is an essential input to the design process.

TIP: As the activities described in RD and TS progress, the definition of required functionality and quality attributes is refined to reflect decisions made.


Example Work Products
1. Definition of required functionality and quality attributes
2. Functional architecture
3. Activity diagrams and use cases
4. Object oriented analysis with services or methods identified
5. Architecturally significant quality attribute requirements

Subpractices
1. Determine key mission and business drivers.
2. Identify desirable functionality and quality attributes.
Functionality and quality attributes can be identified and defined through an analysis of various scenarios with relevant stakeholders as described in the previous specific practice.
3. Determine architecturally significant quality attributes based on key mission and business drivers.
4. Analyze and quantify functionality required by end users.
This analysis can involve considering the sequencing of time critical functions.
5. Analyze requirements to identify logical or functional partitions (e.g., subfunctions).
6. Partition requirements into groups, based on established criteria (e.g., similar functionality, similar quality attribute requirements, coupling), to facilitate and focus the requirements analysis.
7. Allocate customer requirements to functional partitions, objects, people, or support elements to support the synthesis of solutions.
8. Allocate requirements to functions and subfunctions (or other logical entities).

SP 3.3 ANALYZE REQUIREMENTS
Analyze requirements to ensure that they are necessary and sufficient.

TIP: Requirements analyses help answer questions such as whether all requirements are necessary, whether any are missing, whether they are consistent with one another, and whether they can be implemented and verified.

In light of the operational concept and scenarios, the requirements for one level of the product hierarchy are analyzed to determine whether they are necessary and sufficient to meet the objectives of higher levels of the product hierarchy. The analyzed requirements then provide the basis for more detailed and precise requirements for lower levels of the product hierarchy.

X-REF: When analyzing requirements, look at some of the characteristics described in REQM SP 1.1 Subpractice 2 to understand the many factors that can be considered.

As requirements are defined, their relationship to higher level requirements and the higher level definition of required functionality and quality attributes should be understood. Also, the key requirements used to track progress are determined. For instance, the weight of a product or size of a software product can be monitored through development based on its risk or its criticality to the customer.

TIP: The relationships among requirements up and down the product hierarchy are investigated and recorded (see REQM SP 1.4).

Refer to the Verification process area for more information about establishing verification procedures and criteria.

Example Work Products
1. Requirements defects reports
2. Proposed requirements changes to resolve defects
3. Key requirements
4. Technical performance measures

Subpractices
1. Analyze stakeholder needs, expectations, constraints, and external interfaces to organize them into related subjects and remove conflicts.
HINT: As conflicts are removed, inform the relevant stakeholders of changes that affect the requirements they provided.
2. Analyze requirements to determine whether they satisfy the objectives of higher level requirements.
3. Analyze requirements to ensure that they are complete, feasible, realizable, and verifiable.
While design determines the feasibility of a particular solution, this subpractice addresses knowing which requirements affect feasibility.
4. Identify key requirements that have a strong influence on cost, schedule, performance, or risk.
5. Identify technical performance measures that will be tracked during the development effort.
Refer to the Measurement and Analysis process area for more information about developing and sustaining a measurement capability used to support management information needs.
6. Analyze operational concepts and scenarios to refine the customer needs, constraints, and interfaces and to discover new requirements.
This analysis can result in more detailed operational concepts and scenarios as well as supporting the derivation of new requirements.

TIP: In these subpractices, you determine whether the requirements serve as an adequate basis for product development and how to track progress in achieving key requirements.
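A minimal Python sketch of tracking one technical performance measure against a key requirement across development milestones, in the spirit of subpractice 5 and the weight-monitoring example above; the measure, limit, milestone names, and estimates are hypothetical.

# Illustrative only: tracking one technical performance measure (here,
# estimated product weight) against its requirement across milestones.
WEIGHT_LIMIT_KG = 12.0  # key requirement used to track progress (hypothetical)

weight_estimates_kg = {  # milestone -> current best estimate
    "PDR": 13.1,
    "CDR": 12.6,
    "Build 1": 12.2,
    "Build 2": 11.8,
}

for milestone, estimate in weight_estimates_kg.items():
    margin = WEIGHT_LIMIT_KG - estimate
    flag = "within limit" if margin >= 0 else "EXCEEDS LIMIT"
    print(f"{milestone}: {estimate:.1f} kg (margin {margin:+.1f} kg, {flag})")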

SP 3.4 ANALYZE REQUIREMENTS TO ACHIEVE BALANCE
Analyze requirements to balance stakeholder needs and constraints.

TIP: Often, the wish list is too large. It is necessary to understand the tradeoffs and what is truly important.

Stakeholder needs and constraints can address such things as cost, schedule, product or project performance, functionality, priorities, reusable components, maintainability, or risk.


Example Work Products
1. Assessment of risks related to requirements

Subpractices
1. Use proven models, simulations, and prototyping to analyze the balance of stakeholder needs and constraints.
Results of the analyses can be used to reduce the cost of the product and the risk in developing the product.
2. Perform a risk assessment on the requirements and definition of required functionality and quality attributes.
Refer to the Risk Management process area for more information about identifying and analyzing risks.
3. Examine product lifecycle concepts for impacts of requirements on risks.
4. Assess the impact of the architecturally significant quality attribute requirements on the product and product development costs and risks.
When the impact of requirements on costs and risks seems to outweigh the perceived benefit, relevant stakeholders should be consulted to determine what changes may be needed. As an example, a really tight response time requirement or a high availability requirement could prove expensive to implement. Perhaps the requirement could be relaxed once the impacts (e.g., on cost) are understood.

SP 3.5 VALIDATE REQUIREMENTS
Validate requirements to ensure the resulting product will perform as intended in the end user’s environment.

Requirements validation is performed early in the development effort with end users to gain confidence that the requirements are capable of guiding a development that results in successful final validation. This activity should be integrated with risk management activities. Mature organizations will typically perform requirements validation in a more sophisticated way using multiple techniques and will broaden the basis of the validation to include other stakeholder needs and expectations.

TIP: Requirements validation should begin early and should be repeated to limit the risk of having inadequate requirements.

Examples of techniques used for requirements validation include the following:
• Analysis
• Simulations
• Prototyping
• Demonstrations

Example Work Products
1. Record of analysis methods and results

Subpractices
1. Analyze the requirements to determine the risk that the resulting product will not perform appropriately in its intended use environment.
2. Explore the adequacy and completeness of requirements by developing product representations (e.g., prototypes, simulations, models, scenarios, storyboards) and by obtaining feedback about them from relevant stakeholders.
Refer to the Validation process area for more information about preparing for validation and validating product or product components.
3. Assess the design as it matures in the context of the requirements validation environment to identify validation issues and expose unstated needs and customer requirements.

TIP: Requirements validation in this SP and product validation in VAL have the same goal, though different perspectives. Requirements validation focuses on the adequacy and completeness of the requirements. VAL focuses on predicting at multiple points in product development how well the product will satisfy user needs.

REQUIREMENTS MANAGEMENT
A Project Management Process Area at Maturity Level 2

Purpose
The purpose of Requirements Management (REQM) is to manage requirements of the project’s products and product components and to ensure alignment between those requirements and the project’s plans and work products.

X-REF: REQM does not address eliciting or developing requirements. For more information on these topics, refer to the RD process area.

Introductory Notes
Requirements management processes manage all requirements received or generated by the project, including both technical and nontechnical requirements as well as requirements levied on the project by the organization.

TIP: REQM addresses all requirements handled by the project, thus providing a stable foundation for project planning, development, testing, and delivery.

TIP: Nontechnical requirements include requirements that address cost and schedule.

In particular, if the Requirements Development process area is implemented, its processes will generate product and product component requirements that will also be managed by the requirements management processes.

Throughout the process areas, where the terms “product” and “product component” are used, their intended meanings also encompass services, service systems, and their components.

When the Requirements Management, Requirements Development, and Technical Solution process areas are all implemented, their associated processes can be closely tied and be performed concurrently.

TIP: RD develops requirements, TS implements them, and REQM manages them from development through implementation.

The project takes appropriate steps to ensure that the set of approved requirements is managed to support the planning and execution needs of the project. When a project receives requirements from an approved requirements provider, these requirements are reviewed with the requirements provider to resolve issues and prevent misunderstanding before requirements are incorporated into project plans. Once the requirements provider and the requirements receiver reach an agreement, commitment to the requirements is obtained from project participants. The project manages changes to requirements as they evolve and identifies inconsistencies that occur among plans, work products, and requirements.

TIP: Requirements providers can include customers, end users, suppliers, management, regulatory agencies, or standards bodies.

TIP: Managing changes to requirements ensures that all parts of the project are working on the most current set of requirements.

Part of managing requirements is documenting requirements changes and their rationale and maintaining bidirectional traceability between source requirements, all product and product component requirements, and other specified work products. (See the definition of “bidirectional traceability” in the glossary.)

TIP: Bidirectional traceability, explained in SP 1.4, is a concept that is critical to implementing RD and TS.

All projects have requirements. In the case of maintenance activities, changes are based on changes to the existing requirements, design, or implementation. In projects that deliver increments of product capability, the changes can also be due to evolving customer needs, technology maturation and obsolescence, and standards evolution. In both cases, the requirements changes, if any, might be documented in change requests from the customer or end users, or they might take the form of new requirements received from the requirements development process. Regardless of their source or form, activities that are driven by changes to requirements are managed accordingly.

TIP: In maintenance projects, requirements changes still need to be managed, regardless of their source.

In Agile environments, requirements are communicated and tracked through mechanisms such as product backlogs, story cards, and screen mock-ups. Commitments to requirements are made either collectively by the team or by an empowered team leader. Work assignments are regularly (e.g., daily, weekly) adjusted based on progress made and as an improved understanding of the requirements and solution emerges. Traceability and consistency across requirements and work products is addressed through the mechanisms already mentioned as well as during start-of-iteration or end-of-iteration activities such as “retrospectives” and “demo days.” (See “Interpreting CMMI When Using Agile Approaches” in Part I.)

TIP: The practices and principles in a process area are often more broadly applicable than first imagined. REQM is often valuable to an organization’s training department, quality assurance group, marketing group, and so on.

Related Process Areas

TIP In Agile environments, requirements are communicated and tracked In maintenance projects, through mechanisms such as product backlogs, story cards, and screen ptg requirements changes still mock-ups. Commitments to requirements are either made collectively by need to be managed, regard- the team or an empowered team leader. Work assignments are regularly less of their source. (e.g., daily, weekly) adjusted based on progress made and as an improved TIP understanding of the requirements and solution emerge. Traceability and consistency across requirements and work products is addressed through The practices and principles in a process area are often more the mechanisms already mentioned as well as during start-of-iteration or broadly applicable than first end-of-iteration activities such as “retrospectives” and “demo days.” (See imagined. REQM is often valu- “Interpreting CMMI When Using Agile Approaches” in Part I.) able to an organization’s train- ing department, quality assurance group, marketing group, and so on. Related Process Areas

Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements. Refer to the Technical Solution process area for more information about select- ing, designing, and implementing solutions to requirements. Refer to the Configuration Management process area for more information about establishing baselines and tracking and controlling changes.

Wow! eBook Requirements Management 475

Refer to the Project Monitoring and Control process area for more information about monitoring the project against the plan and managing corrective action to closure. Refer to the Project Planning process area for more information about establish- ing and maintaining plans that define project activities. Refer to the Risk Management process area for more information about identi- fying and analyzing risks.

Specific Goal and Practice Summary SG 1 Manage Requirements SP 1.1 Understand Requirements SP 1.2 Obtain Commitment to Requirements SP 1.3 Manage Requirements Changes SP 1.4 Maintain Bidirectional Traceability of Requirements SP 1.5 Ensure Alignment Between Project Work and Requirements

Specific Practices by Goal

SG 1 MANAGE REQUIREMENTS Requirements are managed and inconsistencies with project plans and work TIP ptg products are identified. Requirements communicate expectations for the final prod- The project maintains a current and approved set of requirements uct. These expectations evolve as the project progresses. over the life of the project by doing the following:

• Managing all changes to requirements • Maintaining relationships among requirements, project plans, and work products • Ensuring alignment among requirements, project plans, and work products • Taking corrective action

Refer to the Requirements Development process area for more information about analyzing and validating requirements. Refer to the Develop Alternative Solutions and Selection Criteria specific prac- tice in the Technical Solution process area for more information about deter-

mining the feasibility of the requirements. REQM Refer to the Project Monitoring and Control process area for more information about managing corrective action to closure.

Wow! eBook 476 PART TWO THE PROCESS AREAS

SP 1.1 UNDERSTAND REQUIREMENTS Develop an understanding with the requirements providers on the meaning of the requirements.

TIP As the project matures and requirements are derived, all activities or “Requirements creep” is the disciplines will receive requirements. To avoid requirements creep, tendency for requirements to criteria are established to designate appropriate channels or official continually flow into a project sources from which to receive requirements. Those who receive (often from multiple sources) and expand the project’s scope requirements conduct analyses of them with the provider to ensure beyond what was planned. that a compatible, shared understanding is reached on the meaning of requirements. The result of these analyses and dialogs is a set of TIP approved requirements. Official sources of require- ments are the people you Example Work Products select to get requirements from. 1. Lists of criteria for distinguishing appropriate requirements Unofficial discussions that result in additional require- providers ments can cause trouble. 2. Criteria for evaluation and acceptance of requirements 3. Results of analyses against criteria 4. A set of approved requirements

Subpractices ptg X-REF 1. Establish criteria for distinguishing appropriate requirements Refer to GP 2.7 for examples providers. of requirements providers (and other relevant stake- 2. Establish objective criteria for the evaluation and acceptance of holders) that may be requirements. involved with the REQM Lack of evaluation and acceptance criteria often results in inadequate process. verification, costly rework, or customer rejection.

Examples of evaluation and acceptance criteria include the following: • Clearly and properly stated • Complete • Consistent with one another • Uniquely identified • Consistent with architectural approach and quality attribute priorities • Appropriate to implement • Verifiable (i.e., testable) •Traceable •Achievable • Tied to business value • Identified as a priority for the customer

Wow! eBook Requirements Management 477

3. Analyze requirements to ensure that established criteria are met. TIP 4. Reach an understanding of requirements with requirements Two-way communication is crit- providers so that project participants can commit to them. ical to ensuring that a shared understanding of requirements exists between the project and SP 1.2 OBTAIN COMMITMENT TO REQUIREMENTS the requirements providers.

Obtain commitment to requirements from project participants. TIP Refer to the Project Monitoring and Control process area for more information Typically, at the beginning of a about monitoring commitments. project, only about 50% of the requirements are known. The previous specific practice dealt with reaching an understanding TIP with requirements providers. This specific practice deals with agree- Project members who make ments and commitments among those who carry out activities neces- commitments must continually sary to implement requirements. Requirements evolve throughout evaluate whether they can the project. As requirements evolve, this specific practice ensures meet their commitments, com- that project participants commit to the current and approved require- municate immediately when ments and the resulting changes in project plans, activities, and work they realize they cannot meet a products. commitment, and mitigate the impacts of not being able to Example Work Products meet a commitment. TIP 1. Requirements impact assessments ptg 2. Documented commitments to requirements and requirements Commitments are a recurring changes theme in CMMI. They are also documented and reconciled in Subpractices PP, monitored in PMC, and addressed more thoroughly 1. Assess the impact of requirements on existing commitments. in IPM. The impact on the project participants should be evaluated when the requirements change or at the start of a new requirement. TIP 2. Negotiate and record commitments. Documented commitments can be in the form of meeting min- Changes to existing commitments should be negotiated before project utes, signed-off documents, or participants commit to a new requirement or requirement change. email.

SP 1.3 MANAGE REQUIREMENTS CHANGES TIP Manage changes to requirements as they evolve during the project. Commitments comprise both the resources involved and the Refer to the Configuration Management process area for more information schedule for completion. about tracking and controlling changes. TIP Requirements change for a variety of reasons. As needs change and as Controlling changes ensures that project members and REQM work proceeds, changes may have to be made to existing require- customers have a clear and ments. It is essential to manage these additions and changes effi- shared understanding of the ciently and effectively. To effectively analyze the impact of changes, it requirements. is necessary that the source of each requirement is known and the

Wow! eBook 478 PART TWO THE PROCESS AREAS

TIP rationale for the change is documented. The project may want to “Requirements volatility” is the track appropriate measures of requirements volatility to judge rate at which requirements whether new or revised approach to change control is necessary. change after implementation begins. When requirements Example Work Products volatility is high, the project cannot progress because it con- 1. Requirements change requests tinually adjusts to requirements 2. Requirements change impact reports changes. It is important to iden- 3. Requirements status tify when high requirements volatility occurs, so appropriate 4. Requirements database action can be taken. Subpractices TIP 1. Document all requirements and requirements changes that are given A database is an example of a to or generated by the project. tool used to manage the change history of the requirements. 2. Maintain a requirements change history, including the rationale for changes. TIP Maintaining the change history helps to track requirements volatility. The source of requirements 3. Evaluate the impact of requirement changes from the standpoint of changes can be either internal relevant stakeholders. or external to the project. Requirements changes that affect the product architecture can affect TIP many stakeholders. ptg A change history allows project 4. Make requirements and change data available to the project. members to review why deci- sions were made when they SP 1.4 MAINTAIN BIDIRECTIONAL TRACEABILITY OF REQUIREMENTS are questioned later. Maintain bidirectional traceability among requirements and work products. TIP Tracing source requirements to The intent of this specific practice is to maintain the bidirectional lower-level requirements and traceability of requirements. (See the definition of “bidirectional trace- then to the product component demonstrates where and how ability” in the glossary.) When requirements are managed well, traceabil- each requirement is met ity can be established from a source requirement to its lower level through product decomposi- requirements and from those lower level requirements back to their tion, and identifies which source requirements. Such bidirectional traceability helps to determine requirements are to be evalu- whether all source requirements have been completely addressed and ated in work product verifica- tion (VER). whether all lower level requirements can be traced to a valid source. Requirements traceability also covers relationships to other enti- ties such as intermediate and final work products, changes in design documentation, and test plans. Traceability can cover horizontal rela- tionships, such as across interfaces, as well as vertical relationships. Traceability is particularly needed when assessing the impact of requirements changes on project activities and work products.

Wow! eBook Requirements Management 479 Examples of what aspects of traceability to consider include the following: • Scope of traceability: The boundaries within which traceability is needed • Definition of traceability: The elements that need logical relationships • Type of traceability: When horizontal and vertical traceability is needed

Such bidirectional traceability is not always automated. It can be TIP done manually using spreadsheets, databases, and other common Maintaining traceability tools. across horizontal relationships can greatly reduce problems Example Work Products encountered in product integration. 1. Requirements traceability matrix 2. Requirements tracking system TIP A traceability matrix can take Subpractices many forms: a spreadsheet, a database, and so on. 1. Maintain requirements traceability to ensure that the source of lower level (i.e., derived) requirements is documented. HINT 2. Maintain requirements traceability from a requirement to its derived When a requirement changes, requirements and allocation to work products. update the traceability matrix Work products for which traceability may be maintained include the to retain insight into the require- ments-to-work product rela- architecture, product components, development iterations (or incre- ptg ments), functions, interfaces, objects, people, processes, and other tionships mentioned earlier. work products. 3. Generate a requirements traceability matrix.

SP 1.5 ENSURE ALIGNMENT BETWEEN PROJECT WORK AND REQUIREMENTS Ensure that project plans and work products remain aligned with TIP r e q u i r e m e n t s . Especially for larger projects, product components are devel- oped in parallel and it is chal- This specific practice finds inconsistencies between requirements lenging to keep all work and project plans and work products and initiates corrective actions products fully consistent with to resolve them. changes to the requirements. Example Work Products 1. Documentation of inconsistencies between requirements and project plans and work products, including sources and conditions 2. Corrective actions

Subpractices TIP REQM A traceability matrix can help 1. Review project plans, activities, and work products for consistency with this review of project with requirements and changes made to them. plans, activities, and work 2. Identify the source of the inconsistency (if any). products.

Wow! eBook 480 PART TWO THE PROCESS AREAS

3. Identify any changes that should be made to plans and work prod- ucts resulting from changes to the requirements baseline. 4. Initiate any necessary corrective actions.

ptg

Wow! eBook RSKM

RISK MANAGEMENT A Project Management Process Area at Maturity Level 3

TIP Purpose In any project, problems impact the achievement of objectives. The purpose of Risk Management (RSKM) is to identify potential Resolving problems after they occur can be disruptive and problems before they occur so that risk handling activities can be expensive. RSKM is a system- planned and invoked as needed across the life of the product or proj- atic approach to identifying ect to mitigate adverse impacts on achieving objectives. potential problems before they occur to provide time to deter- mine how best to address Introductory Notes them. Risk management is a continuous, forward-looking process that is an TIP ptg important part of project management. Risk management should RSKM also can apply to identi- address issues that could endanger achievement of critical objectives. fying, evaluating, and maximiz- A continuous risk management approach effectively anticipates and ing (or realizing) opportunities. mitigates risks that can have a critical impact on a project. Effective risk management includes early and aggressive risk TIP identification through collaboration and the involvement of relevant In a dynamic environment, risk stakeholders as described in the stakeholder involvement plan management must be a contin- uous process of identifying, addressed in the Project Planning process area. Strong leadership analyzing, and mitigating risks. among all relevant stakeholders is needed to establish an environ- ment for free and open disclosure and discussion of risk. TIP Risk management should consider both internal and external, as Without a free and open envi- well as both technical and non-technical, sources of cost, schedule, ronment, many risks remain performance, and other risks. Early and aggressive detection of risk undisclosed until they surface as problems. is important because it is typically easier, less costly, and less disrup- tive to make changes and correct work efforts during the earlier, HINT rather than the later, phases of the project. Relevant stakeholders external For example, decisions related to product architecture are often to the project bring perspec- made early before their impacts can be fully understood, and thus the tives and insight into identify- risk implications of such choices should be carefully considered. ing and evaluating risks. They Industry standards can help when determining how to prevent or also may control the resources needed by the project, so it is mitigate specific risks commonly found in a particular industry. Cer- important to maintain a dialog tain risks can be proactively managed or mitigated by reviewing on related risks. industry best practices and lessons learned. 481

Wow! eBook 482 PART TWO THE PROCESS AREAS

TIP Risk management can be divided into the following parts: Sources of risks are not just technical, but can be program- • Defining a risk management strategy matic (e.g., cost, schedule, sup- • Identifying and analyzing risks plier risks) or business related (e.g., competitor getting to mar- • Handling identified risks, including the implementation of risk miti- ket first). gation plans as needed

TIP As represented in the Project Planning and Project Monitoring A precondition to efficient risk and Control process areas, organizations initially may focus on risk management is having shared and consistent project objec- identification for awareness and react to the realization of these risks tives. Project objectives pro- as they occur. The Risk Management process area describes an evolu- vide focus to risk management tion of these specific practices to systematically plan, anticipate, and and help guide its activities. mitigate risks to proactively minimize their impact on the project. Although the primary emphasis of the Risk Management process X-REF area is on the project, these concepts can also be applied to manage PP and PMC have risk-man- organizational risks. agement-related practices: See PP, SP 2.2 Identify Project Risks, and PMC, SP 1.3 Moni- In Agile environments, some risk management activities are inherently tor Project Risks. embedded in the Agile method used. For example, some technical risks can be addressed by encouraging experimentation (early “failures”) or by X-REF executing a “spike” outside of the routine iteration. However, the Risk ptg For more information, see B. Management process area encourages a more systematic approach to Boehm, “Software Risk Man- managing risks, both technical and non-technical. Such an approach can agement: Principles and Prac- be integrated into Agile’s typical iteration and meeting rhythms; more tices,” IEEE Software, 1990; specifically, during iteration planning, task estimating, and acceptance of Project Management Institute, A Guide to the Project Man- tasks. (See “Interpreting CMMI When Using Agile Approaches” in Part I.) agement Body of Knowledge (PMBOK Guide) Fourth Edi- tion, 2008 (Chapter 11 deals with risk management); and Related Process Areas www.sei.cmu.edu/risk/ index.cfm. Refer to the Decision Analysis and Resolution process area for more informa- tion about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria. Refer to the Project Monitoring and Control process area for more information about monitoring project risks. Refer to the Project Planning process area for more information about identify- ing project risks and planning stakeholder involvement.

Specific Goal and Practice Summary SG 1 Prepare for Risk Management SP 1.1 Determine Risk Sources and Categories SP 1.2 Define Risk Parameters SP 1.3 Establish a Risk Management Strategy

Wow! eBook Risk Management 483

SG 2 Identify and Analyze Risks SP 2.1 Identify Risks RSKM SP 2.2 Evaluate, Categorize, and Prioritize Risks SG 3 Mitigate Risks SP 3.1 Develop Risk Mitigation Plans SP 3.2 Implement Risk Mitigation Plans

Specific Practices by Goal

SG 1 PREPARE FOR RISK MANAGEMENT TIP Preparation for risk management is conducted. Without a risk management strategy, projects fail to identify critical risks, inconsistently Prepare for risk management by establishing and maintaining a strat- evaluate them, and ineffec- egy for identifying, analyzing, and mitigating risks. Typically, this tively and inefficiently handle strategy is documented in a risk management plan. The risk manage- them. ment strategy addresses specific actions and the management approach used to apply and control the risk management program. TIP The strategy typically includes identifying sources of risk, the When there is a lot of homo- scheme used to categorize risks, and parameters used to evaluate, geneity in an organization’s projects, SG 1 may be cost- bound, and control risks for effective handling. effectively implemented at the organizational level (especially ptg SP 1.1 DETERMINE RISK SOURCES AND CATEGORIES if the projects are small and short in duration). Such an Determine risk sources and categories. approach helps senior man- agers evaluate risk exposure Identifying risk sources provides a basis for systematically examining across multiple projects. changing situations over time to uncover circumstances that affect the ability of the project to meet its objectives. Risk sources are both HINT internal and external to the project. As the project progresses, addi- If a project uses safety and security analysis methods, you tional sources of risk can be identified. Establishing categories for should incorporate these meth- risks provides a mechanism for collecting and organizing risks as ods as part of the risk manage- well as ensuring appropriate scrutiny and management attention to ment strategy. risks that can have serious consequences on meeting project objectives. TIP Under stress, people lose per- Example Work Products spective when it comes to risks. Some risks (e.g., those external 1. Risk source lists (external and internal) to the team) may be given too 2. Risk categories list much emphasis and others too little. Having a list of risk Subpractices sources, both internal and external, helps bring objectivity 1. Determine risk sources. to the identification of risks. Risk sources are fundamental drivers that cause risks in a project or organization. There are many sources of risks, both internal and external to a project. Risk sources identify where risks can originate.

Wow! eBook 484 PART TWO THE PROCESS AREAS

TIP Typical internal and external risk sources include the following: In a typical project, risks and • Uncertain requirements their status change. A standard • Unprecedented efforts (i.e., estimates unavailable) list of risk sources enables a • Infeasible design project to be thorough in its identification of risks at each • Competing quality attribute requirements that affect solution selection point in the project. and design • Unavailable technology TIP • Unrealistic schedule estimates or allocation Disruptions to the continuity • Inadequate staffing and skills of operations is a risk source • Cost or funding issues (generally) that is often neg- • Uncertain or inadequate subcontractor capability lected. It may have huge • Uncertain or inadequate supplier capability consequences, but also has • Inadequate communication with actual or potential customers or with cost-effective mitigation their representatives techniques. Refer to the Serv - ice Continuity PA of CMMI for • Disruptions to the continuity of operations Serices for more information. • Regulatory constraints (e.g., security, safety, environment)

Many of these sources of risk are accepted without adequately planning for them. Early identification of both internal and external sources of risk can lead to early identification of risks. Risk mitigation plans can then be implemented early in the project to preclude occurrence of risks or reduce consequences of their occurrence.

TIP: A taxonomy of risk sources can provide a project with a reasonably thorough checklist for identifying new risks, even as situations change.

2. Determine risk categories.
Risk categories are “bins” used for collecting and organizing risks. Identifying risk categories aids the future consolidation of activities in risk mitigation plans.

TIP: Categories are used to group related risks that can often be addressed by the same mitigation activities, thereby increasing risk management efficiency.

The following factors can be considered when determining risk categories:
• Phases of the project’s lifecycle model (e.g., requirements, design, manufacturing, test and evaluation, delivery, disposal)
• Types of processes used
• Types of products used
• Project management risks (e.g., contract risks, budget risks, schedule risks, resource risks)
• Technical performance risks (e.g., quality attribute related risks, supportability risks)

TIP: Risks are often grouped by lifecycle phase.

A risk taxonomy can be used to provide a framework for determining risk sources and categories.


SP 1.2 DEFINE RISK PARAMETERS

Define parameters used to analyze and categorize risks and to control the risk management effort.

Parameters for evaluating, categorizing, and prioritizing risks include the following:

• Risk likelihood (i.e., probability of risk occurrence)
• Risk consequence (i.e., impact and severity of risk occurrence)
• Thresholds to trigger management activities

Risk parameters are used to provide common and consistent criteria for comparing risks to be managed. Without these parameters, it is difficult to gauge the severity of an unwanted change caused by a risk and to prioritize the actions required for risk mitigation planning.
Projects should document the parameters used to analyze and categorize risks so that they are available for reference throughout the life of the project because circumstances change over time. Using these parameters, risks can easily be re-categorized and analyzed when changes occur.
The project can use techniques such as failure mode and effects analysis (FMEA) to examine risks of potential failures in the product or in selected product development processes. Such techniques can help to provide discipline in working with risk parameters (an illustrative sketch appears after the example work products below).

Example Work Products
1. Risk evaluation, categorization, and prioritization criteria
2. Risk management requirements (e.g., control and approval levels, reassessment intervals)

TIP: These requirements include the thresholds associated with each risk category.
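For illustration only: one widely used FMEA convention (not prescribed by CMMI) scores each failure mode on 1 to 10 scales for severity, occurrence, and detection and multiplies them into a risk priority number (RPN). The failure modes and scores below are hypothetical, and a real FMEA would follow the scales defined in the project's criteria.

    # Minimal sketch of an RPN computation; the example failure modes are invented.
    def risk_priority_number(severity: int, occurrence: int, detection: int) -> int:
        """Compute the RPN for one failure mode (higher means more attention needed)."""
        for value in (severity, occurrence, detection):
            if not 1 <= value <= 10:
                raise ValueError("FMEA scores are expected on a 1-10 scale")
        return severity * occurrence * detection

    failure_modes = [
        ("Sensor reports stale data", 8, 3, 4),
        ("Installer overwrites user configuration", 5, 6, 2),
    ]
    for name, sev, occ, det in failure_modes:
        print(f"{name}: RPN = {risk_priority_number(sev, occ, det)}")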

Subpractices
1. Define consistent criteria for evaluating and quantifying risk likelihood and severity levels.
Consistently used criteria (e.g., bounds on likelihood, severity levels) allow impacts of different risks to be commonly understood, to receive the appropriate level of scrutiny, and to obtain the management attention warranted. In managing dissimilar risks (e.g., staff safety versus environmental pollution), it is important to ensure consistency in the end result. (For example, a high-impact risk of environmental pollution is as important as a high-impact risk to staff safety.) One way of providing a common basis for comparing dissimilar risks is assigning dollar values to risks (e.g., through a process of risk monetization).

TIP: To be effective, risk management must be objective and quantitative. Therefore, treat risks consistently with respect to key parameters. Defining criteria for evaluating risks helps to ensure consistency (e.g., helping to decide which risks get escalated, mitigated, and so on, based only on the values of their parameters).


2. Define thresholds for each risk category.
For each risk category, thresholds can be established to determine acceptability or unacceptability of risks, prioritization of risks, or triggers for management action.

TIP: In the middle of a project, you often lose perspective. Defining thresholds in advance enables a more objective treatment of risks.

Examples of thresholds include the following:
• Project-wide thresholds could be established to involve senior management when product costs exceed 10 percent of the target cost or when cost performance indices (CPIs) fall below 0.95.
• Schedule thresholds could be established to involve senior management when schedule performance indices (SPIs) fall below 0.95.
• Performance thresholds could be established to involve senior management when specified key items (e.g., processor utilization, average response times) exceed 125 percent of the intended design.

TIP: These thresholds may need to be refined later as part of risk mitigation planning.
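As an illustration only, the example thresholds above lend themselves to a mechanical check. The following minimal sketch assumes the measurement names and sample values shown (they are hypothetical); a real project would draw these values from its measurement repository.

    # Minimal sketch: flag threshold breaches that should involve senior management.
    THRESHOLDS = {
        "cpi_min": 0.95,            # cost performance index floor
        "spi_min": 0.95,            # schedule performance index floor
        "cost_overrun_max": 0.10,   # fraction of target cost
        "design_load_max": 1.25,    # fraction of the intended design value
    }

    def management_triggers(measures: dict) -> list:
        """Return descriptions of the example thresholds that are currently exceeded."""
        breaches = []
        if measures["cpi"] < THRESHOLDS["cpi_min"]:
            breaches.append("CPI below 0.95")
        if measures["spi"] < THRESHOLDS["spi_min"]:
            breaches.append("SPI below 0.95")
        if measures["cost_overrun"] > THRESHOLDS["cost_overrun_max"]:
            breaches.append("product cost exceeds target by more than 10 percent")
        if measures["processor_utilization_ratio"] > THRESHOLDS["design_load_max"]:
            breaches.append("processor utilization exceeds 125 percent of design intent")
        return breaches

    print(management_triggers({"cpi": 0.93, "spi": 0.97,
                               "cost_overrun": 0.04,
                               "processor_utilization_ratio": 1.3}))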

3. Define bounds on the extent to which thresholds are applied against or within a category.
There are few limits to which risks can be assessed in either a quantitative or qualitative fashion. Definition of bounds (or boundary conditions) can be used to help define the extent of the risk management effort and avoid excessive resource expenditures. Bounds can include the exclusion of a risk source from a category. These bounds can also exclude conditions that occur below a given frequency.

TIP: Bounds are intended to scope the risk management effort in sensible ways that help conserve project resources.

SP 1.3 ESTABLISH A RISK MANAGEMENT STRATEGY

Establish and maintain the strategy to be used for risk management.

TIP: The risk management strategy documents the results of the first two specific practices.

A comprehensive risk management strategy addresses items such as the following:

• The scope of the risk management effort
• Methods and tools to be used for risk identification, risk analysis, risk mitigation, risk monitoring, and communication
• Project specific sources of risks
• How risks are to be organized, categorized, compared, and consolidated
• Parameters used for taking action on identified risks, including likelihood, consequence, and thresholds
• Risk mitigation techniques to be used, such as prototyping, piloting, simulation, alternative designs, or evolutionary development
• The definition of risk measures used to monitor the status of risks
• Time intervals for risk monitoring or reassessment

TIP: An important part of the strategy is to determine the frequency of monitoring activities and of reassessing risks for changes in status.
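For illustration only, the items above could be captured as a single structured record at the core of a risk management plan. In this minimal sketch every value is a hypothetical example, not content prescribed by the model.

    # Minimal sketch of a risk management strategy captured as structured data.
    risk_management_strategy = {
        "scope": "All work on the payload control software project",          # hypothetical
        "methods_and_tools": ["risk taxonomy workshop", "risk register"],
        "project_specific_sources": ["unprecedented flight algorithm", "new supplier"],
        "categories": ["cost", "schedule", "performance", "continuity of operations"],
        "parameters": {"likelihood_scale": 5, "consequence_scale": 5,
                       "escalation_threshold_exposure": 15},
        "mitigation_techniques": ["prototyping", "simulation", "evolutionary development"],
        "risk_measures": ["number of open risks", "total risk exposure"],
        "monitoring_interval_weeks": 4,
    }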


The risk management strategy should be guided by a common vision of success that describes desired future project outcomes in terms of the product delivered, its cost, and its fitness for the task. The risk management strategy is often documented in a risk management plan for the organization or project. This strategy is reviewed with relevant stakeholders to promote commitment and understanding.

TIP: All relevant stakeholders must understand the risk management strategy fully.

A risk management strategy should be developed early in the project, so that relevant risks are identified and managed proactively. Early identification and assessment of critical risks allows the project to formulate risk handling approaches and adjust project definition and allocation of resources based on critical risks.

TIP: Some organizations develop a standard risk management strategy template that is tailored to meet the needs of individual projects.

Example Work Products
1. Project risk management strategy

TIP: The risk management strategy is often documented as part of a risk management plan, or as a section in the project plan.

SG 2 IDENTIFY AND ANALYZE RISKS

Risks are identified and analyzed to determine their relative importance.

TIP: Risk identification and analysis is a continuing activity for the duration of the project.

The degree of risk affects the resources assigned to handle the risk and the timing of when appropriate management attention is required.
Risk analysis entails identifying risks from identified internal and external sources and evaluating each identified risk to determine its likelihood and consequences. Risk categorization, based on an evaluation against established risk categories and criteria developed for the risk management strategy, provides information needed for risk handling. Related risks can be grouped to enable efficient handling and effective use of risk management resources.

TIP: Describing the context, conditions, and consequences of a risk in a risk statement provides much of the information that is needed later to understand and evaluate the risk.

SP 2.1 IDENTIFY RISKS

Identify and document risks.

Identifying potential issues, hazards, threats, and vulnerabilities that could negatively affect work efforts or plans is the basis for sound and successful risk management. Risks should be identified and described understandably before they can be analyzed and managed properly. Risks are documented in a concise statement that includes the context, conditions, and consequences of risk occurrence.
Risk identification should be an organized, thorough approach to seek out probable or realistic risks in achieving objectives. To be effective, risk identification should not attempt to address every possible event. Using categories and parameters developed in the risk management strategy and identified sources of risk can provide the discipline and streamlining appropriate for risk identification.

TIP: In OPD, process assets can include risk taxonomies, templates for describing risks, lessons learned, and exemplar risk management strategies that can benefit a new project.


Identified risks form a baseline for initiating risk management activities. Risks should be reviewed periodically to reexamine possible sources of risk and changing conditions to uncover sources and risks previously overlooked or nonexistent when the risk management strategy was last updated.
Risk identification focuses on the identification of risks, not the placement of blame. The results of risk identification activities should never be used by management to evaluate the performance of individuals.

HINT: Establish a work environment in which risks can be disclosed and discussed openly, without fear of repercussions. (Don’t shoot the messenger!)

Many methods are used for identifying risks. Typical identification methods include the following:
• Examine each element of the project work breakdown structure.
• Conduct a risk assessment using a risk taxonomy.
• Interview subject matter experts.
• Review risk management efforts from similar products.
• Examine lessons learned documents or databases.
• Examine design specifications and agreement requirements.

TIP: Many of the methods used for identifying risks use organizational process assets such as a risk taxonomy, risk repository, and lessons learned from past projects. Subject-matter experts may also be utilized. Other methods involve reviewing project artifacts such as the project shared vision, WBS, designs, project interfaces, and contractual requirements.

Example Work Products
1. List of identified risks, including the context, conditions, and consequences of risk occurrence

Subpractices
1. Identify the risks associated with cost, schedule, and performance.
Risks associated with cost, schedule, performance, and other business objectives should be examined to understand their effect on project objectives. Risk candidates can be discovered that are outside the scope of project objectives but vital to customer interests. For example, risks in development costs, product acquisition costs, cost of spare (or replacement) products, and product disposition (or disposal) costs have design implications.
The customer may not have considered the full cost of supporting a fielded product or using a delivered service. The customer should be informed of such risks, but actively managing those risks may not be necessary. Mechanisms for making such decisions should be examined at project and organization levels and put in place if deemed appropriate, especially for risks that affect the project’s ability to verify and validate the product.

HINT: Sometimes, customers are short-sighted about their requirements. Explain to customers the implications of their requirements and associated risks. However, the result may be a decision to not mitigate the associated risks (unless required or desired for other reasons).

TIP: One approach to identifying risks is to consider the cost, schedule, and performance issues associated with each lifecycle phase. As each phase typically has a clear set of objectives and a completion milestone, a phase is a suitable context for identifying the risks associated with that phase.


In addition to the cost risks identified above, other cost risks can include the ones associated with funding levels, funding estimates, and distributed budgets.
Schedule risks can include risks associated with planned activities, key events, and milestones.

TIP: “Can a project have no risks?” “Can multiple projects have the same risk lists?” If these questions are asked, someone does not understand risks. Some risks may be product specific, contract specific, or staff specific.

Performance risks can include risks associated with the following:
• Requirements
• Analysis and design
• Application of new technology
• Physical size
• Shape
• Weight
• Manufacturing and fabrication
• Product behavior and operation with respect to functionality or quality attributes
• Verification
• Validation
• Performance maintenance attributes

TIP: Performance risks may be associated with lifecycle phases, new technology, or desired quality attributes of the product.

Performance maintenance attributes are those characteristics that enable an in-use product or service to provide required performance, such as maintaining safety and security performance.
There are risks that do not fall into cost, schedule, or performance categories, but can be associated with other aspects of the organization’s operation.

Examples of these other risks include risks related to the following:
• Strikes
• Diminishing sources of supply
• Technology cycle time
• Competition

TIP: Not all risks fall into the cost, schedule, and performance subcategories.

2. Review environmental elements that can affect the project.
Risks to a project that frequently are missed include risks supposedly outside the scope of the project (i.e., the project does not control whether they occur but can mitigate their impact). These risks can include weather or natural disasters, political changes, and telecommunications failures.

TIP: Environmental risks are often ignored, even though some cost-effective mitigation activities are available.

3. Review all elements of the work breakdown structure as part of identifying risks to help ensure that all aspects of the work effort have been considered.


4. Review all elements of the project plan as part of identifying risks to help ensure that all aspects of the project have been considered.
Refer to the Project Planning process area for more information about identifying project risks.

TIP: A good risk statement is fact based, actionable, and brief.

5. Document the context, conditions, and potential consequences of each risk.
Risk statements are typically documented in a standard format that contains the risk context, conditions, and consequences of occurrence. The risk context provides additional information about the risk such as the relative time frame of the risk, the circumstances or conditions surrounding the risk that has brought about the concern, and any doubt or uncertainty.

TIP: A standard format for documenting risks makes it easier to train staff on what is needed in risk statements. Also, senior managers have a quicker understanding of risk statements across different projects if they are all in the same format.

6. Identify the relevant stakeholders associated with each risk.

HINT: It is important to identify who is affected by each risk.

SP 2.2 EVALUATE, CATEGORIZE, AND PRIORITIZE RISKS

Evaluate and categorize each identified risk using defined risk categories and parameters, and determine its relative priority.

The evaluation of risks is needed to assign a relative importance to each identified risk and is used in determining when appropriate management attention is required. Often it is useful to aggregate risks based on their interrelationships and develop options at an aggregate level. When an aggregate risk is formed by a roll up of lower level risks, care should be taken to ensure that important lower level risks are not ignored.
Collectively, the activities of risk evaluation, categorization, and prioritization are sometimes called a “risk assessment” or “risk analysis.”

HINT: You can simplify risk management when you can treat a number of risks as one group from evaluation and mitigation perspectives.

TIP: Risk evaluation is often called risk assessment. (CMMI uses risk analysis instead.)

Example Work Products
1. List of risks and their assigned priority

Subpractices
1. Evaluate identified risks using defined risk parameters.
Each risk is evaluated and assigned values according to defined risk parameters, which can include likelihood, consequence (i.e., severity, impact), and thresholds. The assigned risk parameter values can be integrated to produce additional measures, such as risk exposure (i.e., the combination of likelihood and consequence), which can be used to prioritize risks for handling.
Often, a scale with three to five values is used to evaluate both likelihood and consequence.

TIP: Risk exposure is the result of the combination of the likelihood and the consequence of a risk (expressed quantitatively).

Likelihood, for example, can be categorized as remote, unlikely, likely, highly likely, or nearly certain.

Example categories for consequence include the following:
• Low
• Medium
• High
• Negligible
• Marginal
• Significant
• Critical
• Catastrophic

Probability values are frequently used to quantify likelihood. Consequences are generally related to cost, schedule, environmental impact, or human measures (e.g., labor hours lost, severity of injury).
Risk evaluation is often a difficult and time consuming task. Specific expertise or group techniques may be needed to assess risks and gain confidence in the prioritization. In addition, priorities can require reevaluation as time progresses. To provide a basis for comparing the impact of the realization of identified risks, consequences of the risks can be monetized.

TIP: Determining values for risk likelihood and consequence is easier and more repeatable if there are clear statements of objectives, criteria exist for assigning values, there is appropriate representation of relevant stakeholders, and staff has been trained.

2. Categorize and group risks according to defined risk categories.
Risks are categorized into defined risk categories, providing a means to review them according to their source, taxonomy, or project component. Related or equivalent risks can be grouped for efficient handling. The cause-and-effect relationships between related risks are documented.
3. Prioritize risks for mitigation.
A relative priority is determined for each risk based on assigned risk parameters. Clear criteria should be used to determine risk priority. Risk prioritization helps to determine the most effective areas to which resources for risk mitigation can be applied with the greatest positive impact on the project.

TIP: Priority assignment can likewise be a repeatable process.
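For illustration only, the following minimal sketch assumes 1 to 5 ordinal scales for likelihood and consequence, treats exposure as their product, and combines the risk-statement fields from SP 2.1 with the parameters described here. The sample risks and scale choices are hypothetical, not prescribed by the model.

    # Minimal sketch: evaluate each documented risk and order the list by exposure.
    from dataclasses import dataclass

    @dataclass
    class Risk:
        statement: str      # context, conditions, and consequences in one concise sentence
        category: str       # one of the categories defined in SP 1.1
        likelihood: int     # 1 (remote) .. 5 (nearly certain)
        consequence: int    # 1 (negligible) .. 5 (catastrophic)

        @property
        def exposure(self) -> int:
            return self.likelihood * self.consequence

    risks = [
        Risk("Key supplier may slip delivery, delaying integration by a month",
             "schedule", likelihood=3, consequence=4),
        Risk("New middleware may not meet the response-time requirement",
             "performance", likelihood=2, consequence=5),
    ]
    for risk in sorted(risks, key=lambda r: r.exposure, reverse=True):
        print(f"exposure={risk.exposure:2d}  [{risk.category}]  {risk.statement}")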

SG 3 MITIGATE RISKS

Risks are handled and mitigated as appropriate to reduce adverse impacts on achieving objectives.

The steps in handling risks include developing risk handling options, monitoring risks, and performing risk handling activities when defined thresholds are exceeded. Risk mitigation plans are developed and implemented for selected risks to proactively reduce the potential impact of risk occurrence. Risk mitigation planning can also include contingency plans to deal with the impact of selected risks that can occur despite attempts to mitigate them. Risk parameters used to trigger risk handling activities are defined by the risk management strategy.

X-REF: RSKM heavily influences both engineering (to adjust requirements, design, implementation, verification, and validation in light of risks and risk mitigation) and project management (to plan for these activities and monitor thresholds that trigger the deployment of mitigation plans).

SP 3.1 DEVELOP RISK MITIGATION PLANS

Develop a risk mitigation plan in accordance with the risk management strategy.

TIP: The risk management literature uses the term contingency plans in various ways. In CMMI, a risk contingency plan is the part of a risk mitigation plan that addresses what actions to take after a risk is realized.

A critical component of risk mitigation planning is developing alternative courses of action, workarounds, and fallback positions, and a recommended course of action for each critical risk. The risk mitigation plan for a given risk includes techniques and methods used to avoid, reduce, and control the probability of risk occurrence; the extent of damage incurred should the risk occur (sometimes called a “contingency plan”); or both. Risks are monitored and when they exceed established thresholds, risk mitigation plans are deployed to return the affected effort to an acceptable risk level. If the risk cannot be mitigated, a contingency plan can be invoked. Both risk mitigation and contingency plans often are generated only for selected risks for which consequences of the risks are high or unacceptable. Other risks may be accepted and simply monitored.

HINT: Reward staff members who prevent crises, as opposed to those who allow a crisis to happen and then heroically resolve it.

HINT: One way to identify actions to avoid, reduce, or control a risk is to perform a causal analysis on the sources of the risk. Actions that eliminate or reduce causes may be suitable candidates for inclusion in a risk mitigation plan.

Options for handling risks typically include alternatives such as the following:
• Risk avoidance: changing or lowering requirements while still meeting end user needs
• Risk control: taking active steps to minimize risks
• Risk transfer: reallocating requirements to lower risks
• Risk monitoring: watching and periodically reevaluating the risk for changes in assigned risk parameters
• Risk acceptance: acknowledging risk but not taking action

TIP: A risk mitigation plan should, when its associated threshold is exceeded, return the project to an acceptable risk level.

Often, especially for high-impact risks, more than one approach to handling a risk should be generated.

HINT: Generate more than one approach or option for high-priority risks.

For example, in the case of an event that disrupts the continuity of operations, approaches to risk management can include establishing the following:
• Resource reserves to respond to disruptive events
• Lists of available backup equipment
• Backups to key staff
• Plans for testing emergency response systems
• Posted procedures for emergencies
• Disseminated lists of key contacts and information resources for emergencies

TIP: These are cost-effective mitigations to disruptions to continuity of operations.

In many cases, risks are accepted or watched. Risk acceptance is usually done when the risk is judged too low for formal mitigation or when there appears to be no viable way to reduce the risk. If a risk is accepted, the rationale for this decision should be documented. Risks are watched when there is an objectively defined, verifiable, and documented threshold (e.g., for cost, schedule, performance, risk exposure) that will trigger risk mitigation planning or invoke a contingency plan.
Refer to the Decision Analysis and Resolution process area for more information about evaluating alternatives and selecting solutions.

HINT: You can establish thresholds for different attributes (cost, schedule, and performance) and trigger different activities (contingency plan deployment, risk mitigation planning).

Adequate consideration should be given early to technology demonstrations, models, simulations, pilots, and prototypes as part of risk mitigation planning.

Example Work Products
1. Documented handling options for each identified risk
2. Risk mitigation plans
3. Contingency plans
4. List of those who are responsible for tracking and addressing each risk

HINT: You can use thresholds in multiple ways. A risk is discovered and watched (a schedule threshold is set to invoke mitigation), the risk is controlled (the schedule threshold was exceeded and a mitigation plan is invoked; the plan sets a quality threshold), the risk is watched (until poor quality invokes the mitigation plan), or the risk is controlled (the mitigation activities address the quality problem).

Subpractices
1. Determine the levels and thresholds that define when a risk becomes unacceptable and triggers the execution of a risk mitigation plan or contingency plan.
Risk level (derived using a risk model) is a measure combining the uncertainty of reaching an objective with the consequences of failing to reach the objective.
Risk levels and thresholds that bound planned or acceptable cost, schedule, or performance should be clearly understood and defined to provide a means with which risk can be understood.


Proper categorization of risk is essential for ensuring an appropriate priority based on severity and the associated management response. There can be multiple thresholds employed to initiate varying levels of management response. Typically, thresholds for the execution of risk mitigation plans are set to engage before the execution of contingency plans.

TIP: Thresholds trigger actions that may impact the work of staff and relevant stakeholders. Thus, it is important that the motivation for particular thresholds and the actions they invoke are well understood by all who will be affected.

2. Identify the person or group responsible for addressing each risk.

TIP: Assigning responsibility for risk mitigation is easier to do if mitigation activities and responsibilities are documented in a plan, such as the project plan.

3. Determine the costs and benefits of implementing the risk mitigation plan for each risk.
Risk mitigation activities should be examined for benefits they provide versus resources they will expend. Just like any other design activity, alternative plans may need to be developed and costs and benefits of each alternative assessed. The most appropriate plan is selected for implementation.

TIP: Alternative risk mitigation plans may be formally evaluated (using DAR) to choose the best one. Relevant stakeholders may play an important role in such an evaluation (particularly for risks that impact them).

4. Develop an overall risk mitigation plan for the project to orchestrate the implementation of individual risk mitigation and contingency plans.
The complete set of risk mitigation plans may not be affordable. A tradeoff analysis should be performed to prioritize risk mitigation plans for implementation.

HINT: If you integrate all the risk mitigation plans together and find that the result is not affordable, trim the list using the priorities assigned in SP 2.2 or reprioritize them using a group-consensus approach (e.g., multivoting within risk categories).

5. Develop contingency plans for selected critical risks in the event their impacts are realized.
Risk mitigation plans are developed and implemented as needed to proactively reduce risks before they become problems. Despite best efforts, some risks can be unavoidable and will become problems that affect the project. Contingency plans can be developed for critical risks to describe actions a project can take to deal with the occurrence of this impact.
The intent is to define a proactive plan for handling the risk. Either the risk is reduced (mitigation) or addressed (contingency). In either event, the risk is managed.
Some risk management literature may consider contingency plans a synonym or subset of risk mitigation plans. These plans also can be addressed together as risk handling or risk action plans.

TIP: In spite of best efforts, some risks may be realized and contingency plans are invoked.

SP 3.2 IMPLEMENT RISK MITIGATION PLANS

Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate.

TIP: Effective monitoring of risk status includes communicating the results with relevant stakeholders. Data may be collected through risk monitoring and mitigation activities. Automated support can help in collecting and managing this data.

To effectively control and manage risks during the work effort, follow a proactive program to regularly monitor risks and the status and results of risk handling actions. The risk management strategy defines the intervals at which risk status should be revisited. This activity can result in the discovery of new risks or new risk handling options that can require replanning and reassessment.

In either event, acceptability thresholds associated with the risk should be compared to the risk status to determine the need for implementing a risk mitigation plan.

Example Work Products
1. Updated lists of risk status
2. Updated assessments of risk likelihood, consequence, and thresholds
3. Updated list of risk handling options
4. Updated list of actions taken to handle risks
5. Risk mitigation plans of risk handling options

HINT: Weekly or monthly updates to risk status are typical.

TIP: Incorporating risk mitigation plans into the project plan enables their status to be reviewed as part of the project’s regular progress reviews.

Subpractices
1. Monitor risk status.
After a risk mitigation plan is initiated, the risk is still monitored. Thresholds are assessed to check for the potential execution of a contingency plan. A mechanism for monitoring should be employed.
2. Provide a method for tracking open risk handling action items to closure.
Refer to the Project Monitoring and Control process area for more information about managing corrective action to closure.

HINT: Ensure that risk management activities are consolidated from the top, all the way down, at each level of the project hierarchy. Also ensure that risks are escalated to the level relevant to the exposure they create. Their mitigation is then delegated to the appropriate project or team.

3. Invoke selected risk handling options when monitored risks exceed defined thresholds.
Often, risk handling is only performed for risks judged to be high and medium. The risk handling strategy for a given risk can include techniques and methods to avoid, reduce, and control the likelihood of the risk or the extent of damage incurred should the risk occur, or both. In this context, risk handling includes both risk mitigation plans and contingency plans.
Risk handling techniques are developed to avoid, reduce, and control adverse impact to project objectives and to bring about acceptable outcomes in light of probable impacts. Actions generated to handle a risk require proper resource loading and scheduling in plans and baseline schedules. This replanning should closely consider the effects on adjacent or dependent work initiatives or activities.

TIP: To be sure risk mitigation activities are performed properly, they must be planned, scheduled, and resourced, just as any other project activity.

4. Establish a schedule or period of performance for each risk handling activity that includes a start date and anticipated completion date.
5. Provide a continued commitment of resources for each plan to allow the successful execution of risk handling activities.
6. Collect performance measures on risk handling activities.
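For illustration only, the monitoring cycle described in this practice can be sketched as a periodic comparison of each risk's current exposure to its defined thresholds. In this minimal sketch the mitigation and contingency "plans" are hypothetical callables standing in for the real planned activities, and the thresholds follow the parameters defined under SG 1.

    # Minimal sketch of periodic risk monitoring against defined thresholds.
    def monitor_risks(risk_register):
        """Compare each risk's current exposure to its thresholds and act accordingly."""
        for risk in risk_register:
            exposure = risk["likelihood"] * risk["consequence"]
            if exposure >= risk["contingency_threshold"] and risk.get("contingency_plan"):
                risk["contingency_plan"]()      # the risk is realized or imminent
            elif exposure >= risk["mitigation_threshold"]:
                risk["mitigation_plan"]()       # deploy the mitigation plan
            # below both thresholds: keep watching the risk

    register = [{
        "name": "supplier slip",                # hypothetical example risk
        "likelihood": 4, "consequence": 4,
        "mitigation_threshold": 12, "contingency_threshold": 20,
        "mitigation_plan": lambda: print("start second-source qualification"),
        "contingency_plan": lambda: print("re-plan integration around late delivery"),
    }]
    monitor_risks(register)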


SUPPLIER AGREEMENT MANAGEMENT
A Project Management Process Area at Maturity Level 2

Purpose

The purpose of Supplier Agreement Management (SAM) is to manage the acquisition of products and services from suppliers.

X-REF: If you are using SAM to understand best practices for purchasing, outsourcing, and acquiring products and services to deliver or assemble and deliver to your customer, you may want to refer to the CMMI for Acquisition model for more detailed guidance on improving your acquisition processes.

Introductory Notes

The scope of this process area addresses the acquisition of products, services, and product and service components that can be delivered to the project’s customer or included in a product or service system. This process area’s practices can also be used for other purposes that benefit the project (e.g., purchasing consumables).
This process area does not apply in all contexts in which commercial off-the-shelf (COTS) components are acquired but does apply in cases where there are modifications to COTS components, government off-the-shelf components, or freeware, that are of significant value to the project or that represent significant project risk.
Throughout the process areas, where the terms “product” and “product component” are used, their intended meanings also encompass services, service systems, and their components.

TIP: SAM helps prevent problems such as suppliers who can’t meet requirements, supplier agreements that prevent a proactive approach to supplier management, poor visibility into supplier activities, and failing to address risks associated with the use of in-house vendors and COTS.

The Supplier Agreement Management process area involves the following activities:
• Determining the type of acquisition
• Selecting suppliers
• Establishing and maintaining agreements with suppliers
• Executing supplier agreements
• Accepting delivery of acquired products
• Ensuring successful transition of acquired products

TIP: Although SAM primarily addresses the acquisition of products and product components that are delivered to your customer, SAM can be of benefit in other acquisitions critical to the success of the business such as training.

This process area primarily addresses the acquisition of products and product components that are delivered to the project’s customer.



Examples of products and product components that can be acquired by the project include the following:
• Subsystems (e.g., navigational system on an airplane)
• Software
• Hardware
• Documentation (e.g., installation, operator’s, and user’s manuals)
• Parts and materials (e.g., gauges, switches, wheels, steel, raw materials)

HINT: Take a broad view when applying SAM to reduce business-critical risks in obtaining products from suppliers. However, also be aware that not all practices apply equally to all situations.

To minimize risks to the project, this process area can also address the acquisition of significant products and product components not delivered to the project’s customer but used to develop and maintain the product or service (for example, development tools and test environments).
Typically, the products to be acquired by the project are determined during the early stages of planning and development.

HINT: If you need to acquire a product, the earlier you prepare, the more likely you are to provide the right product of the right quality at the right time for project success.

The Technical Solution process area provides practices for determining the products and product components that can be acquired from suppliers.
This process area does not directly address arrangements in which the supplier is integrated into the project team and uses the same processes and reports to the same management as the project team members (e.g., integrated teams). Typically, these situations are handled by other processes or functions (e.g., project management processes, processes or functions external to the project) though some of the specific practices of this process area can be useful in managing the supplier agreement.

HINT: When a supplier is integrated into the project team, pick the best process for the situation and then determine whether it maps to SAM, teaming practices (in OPD and IPM), or both.

This process area typically is not implemented to address arrangements in which the project’s customer is also a supplier. These situations are usually handled by either informal agreements with the customer or by specification of the customer furnished items in the overall agreement that the project has with the customer. In the latter case, some of the specific practices of this process area can be useful in managing the agreement, although others may not, due to the fundamentally different relationship that exists with a customer as opposed to an ordinary supplier. See the CMMI-ACQ model for more information about other types of agreements.

TIP: Acquisitions from in-house vendors often proceed without well-defined requirements, a supplier agreement, and agreed-to acceptance tests. Is it any wonder that relationships with in-house vendors can be problematic?

Suppliers can take many forms depending on business needs, including in-house suppliers (i.e., suppliers that are in the same organization but are external to the project), fabrication departments, suppliers of reuse libraries, and commercial suppliers. (See the definition of “supplier” in the glossary.)
A supplier agreement is established to manage the relationship between the organization and the supplier. A supplier agreement is


any written agreement between the organization (representing the project) and the supplier. This agreement can be a contract, license, service level agreement, or memorandum of agreement. The acquired product is delivered to the project from the supplier according to the supplier agreement. (See the definition of “supplier agreement” in the glossary.)

Related Process Areas

Refer to the Technical Solution process area for more information about performing make, buy, or reuse analysis.
Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Project Monitoring and Control process area for more information about monitoring the project against the plan and managing corrective action to closure.
Refer to the Requirements Management process area for more information about maintaining bidirectional traceability of requirements.

Specific Practices by Goal X-REF The entry point for SAM is SG 1 ESTABLISH SUPPLIER AGREEMENTS normally TS SP 2.4, when the decision is made to buy Agreements with the suppliers are established and maintained. rather than build a product component. SP 1.1 DETERMINE ACQUISITION TYPE TIP Determine the type of acquisition for each product or product component to The type of acquisition varies be acquired. according to the nature of prod- Refer to the Technical Solution process area for more information about per- ucts that are available to satisfy forming make, buy, or reuse analyses. the project’s needs and requirements (including COTS).

Wow! eBook 500 PART TWO THE PROCESS AREAS

Many different types of acquisitions can be used to acquire products and product components that can be used by the project.

Examples of types of acquisitions include the following: • Purchasing modified COTS products of significant value to the project • Obtaining products through a supplier agreement • Obtaining products from an in-house supplier • Obtaining products from the customer • Obtaining products from a preferred supplier • Combining some of the above (e.g., contracting for a modification to a COTS product, having another part of the business enterprise co-develop products with an external supplier)

X-REF If acquiring modified COTS products of significant value to the The use of COTS involves project or that represent significant project risk, care in evaluating additional considerations. and selecting these products and the supplier can be critical to the See TS, SAM SP 1.2 Subprac- tice 5, and SAM SP 1.3 Sub- project. Aspects to consider in the selection decision include propri- practice 3. Also, see www.sei etary issues and the availability of the products. .cmu.edu/acquisition/tools/ methods/cotshome.cfm. Example Work Products ptg 1. List of the acquisition types that will be used for all products and product components to be acquired

SP 1.2 SELECT SUPPLIERS Select suppliers based on an evaluation of their ability to meet the specified requirements and established criteria. Refer to the Decision Analysis and Resolution process area for more informa- tion about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria. Refer to the Requirements Management process area for more information about obtaining commitment to requirements. TIP The criteria used to select a Criteria should be established to address factors that are important to supplier depend on the proj- the project. ect, its requirements, and other factors. If you enter “supplier selection criteria” into your Examples of factors that can be important to the project include the favorite search engine, you will f o l l o w i n g : be amazed by both the com- • Geographical location of the supplier monality and the variety of supplier selection criteria used • Supplier’s performance records on similar work in different industries. Continues


Continued • Engineering capabilities • Staff and facilities available to perform the work • Prior experience in similar situations • Customer satisfaction with similar products delivered by the supplier

Example Work Products 1. Market studies SAM 2. List of candidate suppliers 3. Preferred supplier list 4. Trade study or other record of evaluation criteria, advantages and disadvantages of candidate suppliers, and rationale for selection of suppliers 5. Solicitation materials and requirements

Subpractices HINT 1. Establish and document criteria for evaluating potential suppliers. Consider using DAR when eval- 2. Identify potential suppliers and distribute solicitation material and uating potential suppliers. requirements to them. These subpractices describe ptg A proactive manner of performing this activity is to conduct market some of what is involved in research to identify potential sources of candidate products to be such an evaluation. acquired, including candidates from suppliers of custom made prod- TIP ucts and suppliers of COTS products. A proactive approach provides Refer to the Organizational Performance Management process area for benefits such as addressing a more information about selecting improvements and validating capability gap of the organiza- i m p r o v e m e n t s . tion uniformly, reducing time 3. Evaluate proposals according to evaluation criteria. that projects take to select sup- pliers, establishing a more effi- 4. Evaluate risks associated with each proposed supplier. cient umbrella agreement with Refer to the Risk Management process area for more information about a preferred supplier, and pro- identifying and analyzing risks. tecting core competencies.

5. Evaluate proposed suppliers’ abilities to perform the work. TIP Risks are typically included as Examples of methods used to evaluate the proposed supplier’s abilities to criteria in a formal evaluation. perform the work include the following: • Evaluation of prior experience in similar applications • Evaluation of customer satisfaction with similar products provided • Evaluation of prior performance on similar work • Evaluation of management capabilities • Capability evaluations Continues


• Evaluation of staff available to perform the work
• Evaluation of available facilities and resources
• Evaluation of the project’s ability to work with the proposed supplier
• Evaluation of the impact of candidate COTS products on the project’s plan and commitments

X-REF: Capability evaluation methods associated with CMMI include the SCAMPI A and B appraisal methods. See http://www.sei.cmu.edu/cmmi/tools/appraisals/index.cfm.

When modified COTS products are being evaluated, consider the following:
• Cost of the modified COTS products
• Cost and effort to incorporate the modified COTS products into the project
• Security requirements
• Benefits and impacts that can result from future product releases

Future releases of the modified COTS product can provide additional features that support planned or anticipated enhancements for the project, but can result in the supplier discontinuing support of its current release.
6. Select the supplier.

SP 1.3 ESTABLISH SUPPLIER AGREEMENTS

Establish and maintain supplier agreements.

HINT: If it’s not documented in the supplier agreement, don’t count on it happening! Renegotiating an agreement can be expensive, so make sure it covers everything that is important to you for managing the supplier and receiving the product that you are expecting.

A supplier agreement is any written agreement between the organization (representing the project) and the supplier. This agreement can be a contract, license, service level agreement, or memorandum of agreement.
The content of the supplier agreement should specify the arrangement for selecting supplier processes and work products to be monitored, analyzed, and evaluated, if the arrangement is appropriate to the acquisition or product being acquired. The supplier agreement should also specify the reviews, monitoring, evaluations, and acceptance testing to be performed.
Supplier processes that are critical to the success of the project (e.g., due to complexity, due to importance) should be monitored.
Supplier agreements between independent legal entities are typically reviewed by legal or contract advisors prior to approval.

HINT: Create a supplier agreement in the form suitable to the nature of the business transaction. The agreement does not need to be lengthy but usually has a section with signatures.

TIP: The project must engage the supplier in reviews, monitoring, and evaluations to a depth and breadth appropriate to the circumstances and risks. The supplier agreement must cover details of these reviews.

Example Work Products
1. Statements of work
2. Contracts
3. Memoranda of agreement
4. Licensing agreement


Subpractices
1. Revise the requirements (e.g., product requirements, service level requirements) to be fulfilled by the supplier to reflect negotiations with the supplier when necessary.
Refer to the Requirements Development process area for more information about developing product requirements.
Refer to the Requirements Management process area for more information about managing requirements of the project’s products and product components and to ensure alignment between those requirements and the project’s plans and work products.

TIP: Establishing (and revising) an agreement often requires negotiation skills. See Getting to Yes: Negotiating Agreement Without Giving In, Revised 2nd Edition by William Ury, Roger Fisher, and Bruce Patton (Penguin USA, 1991).

2. Document what the project will provide to the supplier.
Include the following:
• Project furnished facilities
• Documentation
• Services
3. Document the supplier agreement.
The supplier agreement should include a statement of work, a specification, terms and conditions, a list of deliverables, a schedule, a budget, and a defined acceptance process (an illustrative sketch follows the task list below).

HINT: The supplier agreement is the basis for monitoring your supplier and accepting the product. Make sure it covers all critical information.

This subpractice typically includes the following tasks:
• Identifying the type and depth of project oversight of the supplier, procedures, and evaluation criteria to be used in monitoring supplier performance including selection of processes to be monitored and work products to be evaluated
• Establishing the statement of work, specification, terms and conditions, list of deliverables, schedule, budget, and acceptance process
• Identifying who from the project and supplier are responsible and authorized to make changes to the supplier agreement
• Identifying how requirements changes and changes to the supplier agreement are to be determined, communicated, and addressed
• Identifying standards and procedures that will be followed
• Identifying critical dependencies between the project and the supplier
• Identifying the types of reviews that will be conducted with the supplier
• Identifying the supplier’s responsibilities for ongoing maintenance and support of the acquired products
• Identifying warranty, ownership, and rights of use for the acquired products
• Identifying acceptance criteria

TIP: Often, projects overlook a supplier’s responsibilities for ongoing maintenance and support. Some companies have tied staff compensation, in part, to maintenance and support costs incurred following product delivery, thus motivating the staff to consider longer-term support needs when establishing an agreement with a supplier.
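For illustration only, the agreement elements named in subpractice 3 can be treated as a simple completeness checklist before the agreement is approved. In this minimal sketch the section names mirror the list above and the sample content is hypothetical.

    # Minimal sketch: flag agreement sections that are still missing from a draft.
    REQUIRED_SECTIONS = [
        "statement_of_work", "specification", "terms_and_conditions",
        "deliverables", "schedule", "budget", "acceptance_process",
    ]

    draft_agreement = {
        "statement_of_work": "Develop and deliver the telemetry subsystem",  # hypothetical
        "deliverables": ["subsystem binary", "interface documentation"],
        "schedule": "Milestones M1-M4 per attached plan",
    }

    missing = [section for section in REQUIRED_SECTIONS if section not in draft_agreement]
    print("Sections still to be negotiated:", missing)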

In some cases, selection of modified COTS products can require a supplier agreement in addition to the agreements in the product’s license. Examples of what could be covered in an agreement with a COTS supplier include the following:
• Discounts for large quantity purchases
• Coverage of relevant stakeholders under the licensing agreement, including project suppliers, team members, and the project’s customer
• Plans for future enhancements
• On-site support, such as responses to queries and problem reports
• Additional capabilities that are not in the product
• Maintenance support, including support after the product is withdrawn from general availability

4. Periodically review the supplier agreement to ensure it accurately reflects the project’s relationship with the supplier and current risks and market conditions.
5. Ensure that all parties to the supplier agreement understand and agree to all requirements before implementing the agreement or any changes.
6. Revise the supplier agreement as necessary to reflect changes to the supplier’s processes or work products.

TIP: Especially with long-term agreements (more than one year), technical and nontechnical requirements may change. It is necessary to address how such changes will be handled in the supplier agreement because this is often the legal document that will make these significant changes binding.

7. Revise the project’s plans and commitments, including changes to the project’s processes or work products, as necessary to reflect the supplier agreement.
Refer to the Project Monitoring and Control process area for more information about monitoring commitments.

SG 2 SATISFY SUPPLIER AGREEMENTS

Agreements with suppliers are satisfied by both the project and the supplier.

SP 2.1 EXECUTE THE SUPPLIER AGREEMENT

Perform activities with the supplier as specified in the supplier agreement.

Refer to the Project Monitoring and Control process area for more information about providing an understanding of the project’s progress so that appropriate corrective actions can be taken when the project’s performance deviates significantly from the plan.

Example Work Products
1. Supplier progress reports and performance measures
2. Supplier review materials and reports


3. Action items tracked to closure
4. Product and documentation deliveries

Subpractices
1. Monitor supplier progress and performance (e.g., schedule, effort, cost, technical performance) as defined in the supplier agreement.

TIP: In V1.2, subpractices 2 and 3 were SP 2.2 and 2.3.

2. Select, monitor, and analyze processes used by the supplier as defined in the supplier agreement.
Supplier processes that are critical to the success of the project (e.g., due to complexity, due to importance) should be monitored. The selection of processes to monitor should consider the impact of the selection on the supplier.

HINT: Select appropriate processes for monitoring and evaluating work products to obtain visibility into supplier progress and performance and to identify and mitigate risks.

TIP: Monitoring is a cost to both parties, so which processes to select for monitoring depends on which ones provide the most insight into supplier activities, pose the most risk, and provide an early indication of problems. In cases of low risk, no processes are monitored.

3. Select and evaluate work products from the supplier as defined in the supplier agreement.
The work products selected for evaluation should include critical products, product components, and work products that provide insight into quality issues as early as possible. In situations of low risk, it may not be necessary to select any work products for evaluation.

HINT: There is often more risk with a custom-made product. The requirements may not be defined fully or the supplier may not address all the requirements fully. Evaluate selected work products to uncover issues early in the lifecycle.

4. Conduct reviews with the supplier as specified in the supplier agreement.
Refer to the Project Monitoring and Control process area for more information about conducting milestone reviews and conducting progress reviews.

X-REF: Including suppliers in the project’s progress and milestone reviews is covered in PMC SP 1.6 and 1.7.

Reviews cover both formal and informal reviews and include the following steps:
• Preparing for the review
• Ensuring that relevant stakeholders participate
• Conducting the review
• Identifying, documenting, and tracking all action items to closure
• Preparing and distributing to the relevant stakeholders a summary report of the review

5. Conduct technical reviews with the supplier as defined in the supplier agreement.

TIP: The purpose of a technical review is to review the supplier’s technical progress and identify and resolve issues if they arise.

Technical reviews typically include the following:
• Providing the supplier with visibility into the needs and desires of the project’s customers and end users as appropriate
• Reviewing the supplier’s technical activities and verifying that the supplier’s interpretation and implementation of the requirements are consistent with the project’s interpretation


• Ensuring that technical commitments are being met and that technical issues are communicated and resolved in a timely manner
• Obtaining technical information about the supplier’s products
• Providing appropriate technical information and support to the supplier

6. Conduct management reviews with the supplier as defined in the supplier agreement.

TIP: The purpose of a management review is to review the supplier’s progress against the plan and identify and resolve issues (usually risks) when they arise.

Management reviews typically include the following:
• Reviewing critical dependencies
• Reviewing project risks involving the supplier
• Reviewing schedule and budget
• Reviewing the supplier’s compliance with legal and regulatory requirements

Technical and management reviews can be coordinated and held jointly.
7. Use the results of reviews to improve the supplier’s performance and to establish and nurture long-term relationships with preferred suppliers.
8. Monitor risks involving the supplier and take corrective action as necessary.
Refer to the Project Monitoring and Control process area for more information about monitoring project risks.

TIP: Just as peer reviews provide secondary benefits, so do reviews conducted with the supplier. Establishing a good relationship with your suppliers is often key to having a successful product component delivered on time according to the agreement.

SP 2.2 ACCEPT THE ACQUIRED PRODUCT

Ensure that the supplier agreement is satisfied before accepting the acquired product.
HINT: Before accepting the product, be sure you are getting what you wanted (which should be documented in the supplier agreement).
Acceptance reviews, tests, and configuration audits should be completed before accepting the product as defined in the supplier agreement.

Example Work Products
1. Acceptance procedures
2. Acceptance reviews or test results
3. Discrepancy reports or corrective action plans
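The acceptance results and discrepancy reports listed above can be captured in a simple structured form so that failed or unevaluated criteria feed directly into corrective action plans. The sketch below is illustrative only and is not part of the model; the product name, criterion identifiers, and field names are hypothetical.

```python
# Minimal sketch, not from the model: recording acceptance criteria and
# results for an acquired product and listing discrepancies that could feed
# a corrective action plan. All identifiers and values are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AcceptanceCriterion:
    identifier: str              # requirement or agreement clause it traces to
    description: str
    passed: Optional[bool] = None
    notes: str = ""

@dataclass
class AcceptanceRecord:
    product: str
    criteria: list = field(default_factory=list)

    def record(self, identifier: str, passed: bool, notes: str = "") -> None:
        for criterion in self.criteria:
            if criterion.identifier == identifier:
                criterion.passed, criterion.notes = passed, notes
                return
        raise KeyError(f"unknown criterion: {identifier}")

    def discrepancies(self) -> list:
        # Criteria that failed or were never evaluated become candidate
        # discrepancy reports or corrective action items.
        return [c for c in self.criteria if c.passed is not True]

record = AcceptanceRecord(
    product="Acquired navigation module",
    criteria=[
        AcceptanceCriterion("REQ-041", "Position accuracy within 3 m"),
        AcceptanceCriterion("AGR-7.2", "Warranty and license documents delivered"),
    ],
)
record.record("REQ-041", passed=True)
print([c.identifier for c in record.discrepancies()])  # ['AGR-7.2']
```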


Subpractices
1. Define the acceptance procedures.
2. Review and obtain agreement from relevant stakeholders on the acceptance procedures before the acceptance review or test.
3. Verify that the acquired products satisfy their requirements.
Refer to the Verification process area for more information about verifying selected work products.
X-REF: Acceptance procedures, reviews, and tests are also covered in PI, VER, and VAL, so consult their practices for more information about establishing the appropriate environment, procedures, and criteria for verification and validation (and thus for accepting the acquired product).
4. Confirm that the nontechnical commitments associated with the acquired work product are satisfied.
This confirmation can include confirming that the appropriate license, warranty, ownership, use, and support or maintenance agreements are in place and that all supporting materials are received.
5. Document the results of the acceptance review or test.
6. Establish an action plan and obtain supplier agreement to take action to correct acquired work products that do not pass their acceptance review or test.
7. Identify, document, and track action items to closure.
Refer to the Project Monitoring and Control process area for more information about managing corrective action to closure.
HINT: Share pertinent parts of the schedule and criteria for assembly with suppliers of work products to reduce the occurrence of delays and component failure (PI SP 3.1).
HINT: Be sure to address proprietary issues related to the acquired product before the project accepts the product.

SP 2.3 ENSURE TRANSITION OF PRODUCTS
Ensure the transition of products acquired from the supplier.
Before the acquired product is transferred to the project, customer, or end user, appropriate preparation and evaluation should occur to ensure a smooth transition.
TIP: A successful transition involves planning for the appropriate facilities, training, use, maintenance, and support.

Refer to the Product Integration process area for more information about assembling product components.
TIP: In some cases, the supplier may deliver the product directly to the customer or end user.

Example Work Products
1. Transition plans
2. Training reports
3. Support and maintenance reports

Subpractices
1. Ensure that facilities exist to receive, store, integrate, and maintain the acquired products as appropriate.
2. Ensure that appropriate training is provided for those who are involved in receiving, storing, integrating, and maintaining acquired products.


3. Ensure that acquired products are stored, distributed, and integrated according to the terms and conditions specified in the supplier agreement or license.


TECHNICAL SOLUTION
An Engineering Process Area at Maturity Level 3

Purpose
The purpose of Technical Solution (TS) is to select, design, and implement solutions to requirements. Solutions, designs, and implementations encompass products, product components, and product related lifecycle processes either singly or in combination as appropriate.

Introductory Notes
The Technical Solution process area is applicable at any level of the product architecture and to every product, product component, and product related lifecycle process. Throughout the process areas, where the terms "product" and "product component" are used, their intended meanings also encompass services, service systems, and their components. This process area focuses on the following:

• Evaluating and selecting solutions (sometimes referred to as "design approaches," "design concepts," or "preliminary designs") that potentially satisfy an appropriate set of allocated functional and quality attribute requirements
• Developing detailed designs for the selected solutions (detailed in the context of containing all the information needed to manufacture, code, or otherwise implement the design as a product or product component)
• Implementing the designs as a product or product component

Typically, these activities interactively support each other. Some level of design, at times fairly detailed, can be needed to select solutions. Prototypes or pilots can be used as a means of gaining sufficient knowledge to develop a technical data package or a complete set of requirements. Quality attribute models, simulations, prototypes, or pilots can be used to provide additional information about the properties of the potential design solutions to aid in the selection of solutions. Simulations can be particularly useful for projects developing systems-of-systems.
HINT: It is not enough to consider product behavior in the intended operational environment when developing a solution. Also ask questions about other phases in the life of the product, including whether the solution can be manufactured; whether it is easy to test, install, repair, migrate to new versions or platforms, and support; and what the costs and legal implications will be.
Technical Solution specific practices apply not only to the product and product components but also to product related lifecycle processes. The product related lifecycle processes are developed in concert with the product or product component. Such development can include selecting and adapting existing processes (including standard processes) for use as well as developing new processes.
Processes associated with the Technical Solution process area receive the product and product component requirements from the requirements management processes. The requirements management processes place the requirements, which originate in requirements development processes, under appropriate configuration management and maintain their traceability to previous requirements.
X-REF: TS is driven by the requirements established by RD, which are managed by REQM. The processes associated with these PAs interact significantly to accomplish their purposes.
For a maintenance or sustainment project, the requirements in need of maintenance actions or redesign can be driven by user needs, technology maturation and obsolescence, or latent defects in the product components. New requirements can arise from changes in the operating environment. Such requirements can be uncovered during verification of the product(s), where actual performance can be compared against specified performance and unacceptable degradation can be identified. Processes associated with the Technical Solution process area should be used to perform the maintenance or sustainment design efforts.
TIP: TS applies not only to developing a product, but also to maintaining a product (i.e., corrective, adaptive, and perfective maintenance).
For product lines, these practices apply to both core asset development (i.e., building for reuse) and product development (i.e., building with reuse). Core asset development additionally requires product line variation management (the selection and implementation of product line variation mechanisms) and product line production planning (the development of processes and other work products that define how products will be built to make best use of these core assets).
X-REF: See the definition of "product line" in the glossary.

In Agile environments, the focus is on early solution exploration. By making the selection and tradeoff decisions more explicit, the Technical Solution process area helps improve the quality of those decisions, both individually and over time. Solutions can be defined in terms of functions, feature sets, releases, or any other components that facilitate product development. When someone other than the team will be working on the product in the future, release information, maintenance logs, and other data are typically included with the installed product. To support future product updates, rationale (for trade-offs, interfaces, and purchased parts) is captured so that why the product exists can be better understood. If there is low risk in the selected solution, the need to formally capture decisions is significantly reduced. (See "Interpreting CMMI When Using Agile Approaches" in Part I.)

Related Process Areas
Refer to the Requirements Development process area for more information about allocating product component requirements, establishing operational concepts and scenarios, and identifying interface requirements.
Refer to the Verification process area for more information about performing peer reviews and verifying selected work products.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
Refer to the Organizational Performance Management process area for more information about selecting improvements and deploying improvements.
Refer to the Requirements Management process area for more information about managing requirements of the project's products and product components and ensuring alignment between those requirements and the project's plans and work products.
X-REF: See Software Architecture in Practice, Second Edition by L. Bass, P. Clements, and R. Kazman (Addison-Wesley).
X-REF: See Evaluating Software Architectures: Methods and Case Studies by P. Clements, R. Kazman, and M. Klein (Addison-Wesley).
X-REF: See Documenting Software Architectures: Views and Beyond by P. Clements, F. Bachmann, L. Bass, D. Garlan, J. Ivers, R. Little, R. Nord, and J. Stafford (Addison-Wesley).
X-REF: See Software Product Lines: Practices and Patterns by P. Clements and L. Northrop (Addison-Wesley).
X-REF: See Handbook of Software Architecture by G. Booch (found at http://www.handbookofsoftwarearchitecture.com/).
X-REF: See Software Architecture Essential Bookshelf (found at http://www.sei.cmu.edu/architecture/start/publications/bookshelf.cfm).

Specific Goal and Practice Summary
SG 1 Select Product Component Solutions
SP 1.1 Develop Alternative Solutions and Selection Criteria
SP 1.2 Select Product Component Solutions
SG 2 Develop the Design
SP 2.1 Design the Product or Product Component
SP 2.2 Establish a Technical Data Package
SP 2.3 Design Interfaces Using Criteria
SP 2.4 Perform Make, Buy, or Reuse Analyses
SG 3 Implement the Product Design
SP 3.1 Implement the Design
SP 3.2 Develop Product Support Documentation


Specific Practices by Goal

SG 1 SELECT PRODUCT COMPONENT SOLUTIONS
Product or product component solutions are selected from alternative solutions.

Alternative solutions and their relative merits are considered in advance of selecting a solution. Key requirements, design issues, and constraints are established for use in alternative solution analysis. Architectural choices and patterns that support achievement of quality attribute requirements are considered. Also, the use of commercial off-the-shelf (COTS) product components is considered relative to cost, schedule, performance, and risk. COTS alternatives can be used with or without modification. Sometimes such items can require modifications to aspects such as interfaces or a customization of some of the features to correct a mismatch with functional or quality attribute requirements, or with architectural designs.
HINT: When you consider the successful products or services you use, ask what makes them successful. Is it features, usability, cost, customer support, response time, reliability, etc.? These characteristics were achieved through careful identification and evaluation of alternative design approaches.
TIP: Sometimes an insignificant requirements change can greatly improve the merits of a COTS-based solution, especially with regard to cost and schedule risks. However, the use of COTS may constrain the overall solution's performance and the support that can be offered later in the product's life. A relationship with a vendor may need to be maintained.
One indicator of a good design process is that the design was chosen after comparing and evaluating it against alternative solutions. Decisions about architecture, custom development versus off the shelf, and product component modularization are typical of the design choices that are addressed. Some of these decisions can require the use of a formal evaluation process.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
X-REF: See www.sei.cmu.edu/acquisition/tools/methods/cotshome.cfm for more information about using COTS.
Sometimes the search for solutions examines alternative instances of the same requirements with no allocations needed for lower level product components. Such is the case at the bottom of the product architecture. There are also cases where one or more of the solutions are fixed (e.g., a specific solution is directed or available product components, such as COTS, are investigated for use).
TIP: The iteration in these practices eventually results in a product component solution and design that are either implemented (TS SG 3), acquired (SAM), or reused (TS SP 2.4).
In the general case, solutions are defined as a set. That is, when defining the next layer of product components, the solution for each of the product components in the set is established. The alternative solutions are not only different ways of addressing the same requirements, but they also reflect a different allocation of requirements among the product components comprising the solution set. The objective is to optimize the set as a whole and not the individual pieces.
HINT: Whether you are defining or evaluating an alternative solution, treat its components together, not individually. For example, do not rush to select a promising COTS component or new technology without considering the other components it will need to work with, as well as their impacts and risks.
There will be significant interaction with processes associated with the Requirements Development process area to support the provisional allocations to product components until a solution set is selected and final allocations are established.


Product related lifecycle processes are among the product component solutions that are selected from alternative solutions.

Examples of these product related lifecycle processes are the manufacturing, delivery, and support processes.

SP 1.1 DEVELOP ALTERNATIVE SOLUTIONS AND SELECTION CRITERIA
Develop alternative solutions and selection criteria.
Refer to the Allocate Product Component Requirements specific practice in the Requirements Development process area for more information about obtaining allocations of requirements to solution alternatives for the product components.
Refer to the Decision Analysis and Resolution process area for more information about establishing evaluation criteria.
X-REF: RD SP 2.2 describes allocating requirements after a solution has been selected. However, it can also be applied prior to selection (a "provisional allocation") to gain insight into the relative merits of an alternative solution.

Alternative solutions should be identified and analyzed to enable the selection of a balanced solution across the life of the product in terms of cost, schedule, performance, and risk. These solutions are based on proposed product architectures that address critical product quality attribute requirements and span a design space of feasible solutions. Specific practices associated with the Develop the Design specific goal provide more information on developing potential product architectures that can be incorporated into alternative solutions for the product.
HINT: It is important to involve relevant stakeholders in establishing selection criteria and alternative solutions.
Alternative solutions frequently encompass alternative requirement allocations to different product components. These alternative solutions can also include the use of COTS solutions in the product architecture. Processes associated with the Requirements Development process area would then be employed to provide a more complete and robust provisional allocation of requirements to the alternative solutions.
TIP: Selection criteria typically include cost, schedule, performance, and risk. How these criteria are defined in detail, however, depends on the requirements.
HINT: Consider the entire life of the product when selecting, designing, and implementing a solution to ensure that it will have the desired versatility and market endurance.
HINT: Consider a range of alternative solutions. Input from stakeholders with diverse skills and backgrounds can help teams identify and address assumptions, constraints, and biases. Brainstorming sessions may stimulate innovative alternatives. Also, consider which quality attributes are critical to product success.
Alternative solutions span the acceptable range of cost, schedule, and performance. The product component requirements are received and used along with design issues, constraints, and criteria to develop the alternative solutions. Selection criteria would typically address costs (e.g., time, people, money), benefits (e.g., product performance, capability, effectiveness), and risks (e.g., technical, cost, schedule). Considerations for alternative solutions and selection criteria include the following:
• Cost of development, manufacturing, procurement, maintenance, and support
• Achievement of key quality attribute requirements, such as product timeliness, safety, reliability, and maintainability


• Complexity of the product component and product related lifecycle processes
• Robustness to product operating and use conditions, operating modes, environments, and variations in product related lifecycle processes
• Product expansion and growth
• Technology limitations
• Sensitivity to construction methods and materials
• Risk
• Evolution of requirements and technology
• Disposal
• Capabilities and limitations of end users and operators
• Characteristics of COTS products
TIP: Complexity is a "double-edged sword." Sometimes today's high-performance solution becomes tomorrow's high-maintenance solution.

The considerations listed here are a basic set; organizations should develop screening criteria to narrow down the list of alternatives that are consistent with their business objectives. Product lifecycle cost, while being a desirable parameter to minimize, can be outside the control of development organizations. A customer may not be willing to pay for features that cost more in the short term but ultimately decrease cost over the life of the product. In such cases, customers should at least be advised of any potential for reducing lifecycle costs. The criteria used to select final solutions should provide a balanced approach to costs, benefits, and risks.
HINT: You may need to demonstrate to cost-conscious customers how your product will result in reduced costs over the product's life.

Example Work Products
1. Alternative solution screening criteria
2. Evaluation reports of new technologies
3. Alternative solutions
4. Selection criteria for final selection
5. Evaluation reports of COTS products

Subpractices
1. Identify screening criteria to select a set of alternative solutions for consideration.
TIP: Screening criteria may involve setting thresholds for selected quality attributes (e.g., response time) that must be met by alternative solutions.
2. Identify technologies currently in use and new product technologies for competitive advantage.
Refer to the Organizational Performance Management process area for more information about selecting improvements and deploying improvements.
TIP: It may be possible to exploit a technology not available to competitors.


The project should identify technologies applied to current products and processes and monitor the progress of currently used technologies throughout the life of the project. The project should identify, select, evaluate, and invest in new technologies to achieve competitive advantage. Alternative solutions could include newly developed technologies, but could also include applying mature technologies in different applications or to maintain current methods.
3. Identify candidate COTS products that satisfy the requirements.
Refer to the Supplier Agreement Management process area for more information about selecting suppliers.
HINT: Explore the use of COTS (or open source or new technology) early in product development because, to use COTS effectively, you may need to consider changes to requirements. Fully understand such requirements-and-design tradeoffs early, before committing to (and putting under contract) a particular development approach.
The supplier of the COTS product will need to meet requirements that include the following:
• Product functionality and quality attributes
• Terms and conditions of warranties for the products
• Expectations (e.g., for review activities), constraints, or checkpoints to help mitigate suppliers' responsibilities for ongoing maintenance and support of the products
4. Identify re-usable solution components or applicable architecture patterns.
For product lines, the organization's core assets can be used as a basis for a solution.
5. Generate alternative solutions.
6. Obtain a complete requirements allocation for each alternative.
TIP: Because a solution has yet to be selected, this requirements allocation is referred to as a "provisional allocation of requirements." Such allocation provides better insight into the solutions' pros and cons.
7. Develop the criteria for selecting the best alternative solution.
Criteria should be included that address design issues for the life of the product, such as provisions for more easily inserting new technologies or the ability to better exploit commercial products. Examples include criteria related to open design or open architecture concepts for the alternatives being evaluated.
HINT: You can incorporate in subsequent releases new technologies, COTS, or open source components that were too immature to incorporate into a product's first release.

SP 1.2 SELECT PRODUCT COMPONENT SOLUTIONS
Select the product component solutions based on selection criteria.
TIP: DAR supports the selection of product component solutions from among alternative solutions, especially in novel situations.
Refer to the Allocate Product Component Requirements and Identify Interface Requirements specific practices of the Requirements Development process area for more information about establishing the allocated requirements for product components and interface requirements among product components.
Selecting product components that best satisfy the criteria establishes the requirement allocations to product components. Lower level requirements are generated from the selected alternative and used to develop product component designs. Interfaces among product components are described. Physical interface descriptions are included in the documentation for interfaces to items and activities external to the product.
TIP: Selecting product component solutions from alternative solutions positions us for further requirements development and design.
The description of the solutions and the rationale for selection are documented. The documentation evolves throughout development as solutions and detailed designs are developed and those designs are implemented. Maintaining a record of rationale is critical to downstream decision making. Such records keep downstream stakeholders from redoing work and provide insights to apply technology as it becomes available in applicable circumstances.
HINT: You may want to examine the rationale for a particular selection when you later learn that a promising technology or COTS component is now available. It may be unnecessary to interrupt product development to explore the implications if they were already explored earlier and records were maintained.

Example Work Products
1. Product component selection decisions and rationale
2. Documented relationships between requirements and product components
3. Documented solutions, evaluations, and rationale

Subpractices
1. Evaluate each alternative solution/set of solutions against the selection criteria established in the context of the operational concepts and scenarios.
Develop timeline scenarios for product operation and user interaction for each alternative solution.
HINT: To gain the insight needed to fully evaluate alternative solutions, you may need to refine operational concepts and scenarios to understand the implications of each alternative solution.
X-REF: You may also consider different user roles and types. See RD SP 3.1 for more information.
2. Based on the evaluation of alternatives, assess the adequacy of the selection criteria and update these criteria as necessary.
3. Identify and resolve issues with the alternative solutions and requirements.
4. Select the best set of alternative solutions that satisfy the established selection criteria.
HINT: Following the evaluation, you may conclude that the selection criteria were not complete or detailed enough to adequately differentiate alternative solutions. If so, you may need to iterate through SP 1.1 and 1.2.
5. Establish the functional and quality attribute requirements associated with the selected set of alternatives as the set of allocated requirements to those product components.
6. Identify the product component solutions that will be reused or acquired.
Refer to the Supplier Agreement Management process area for more information about managing the acquisition of products and services from suppliers.
7. Establish and maintain the documentation of the solutions, evaluations, and rationale.
TIP: Documentation assists product maintenance and support later in the life of the product.
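The evaluation in subpractice 1 is often supported by a simple weighted scoring matrix that ranks alternatives against the selection criteria from SP 1.1. The sketch below is illustrative only; the criteria, weights, alternatives, and scores are hypothetical, and a real evaluation would follow the project's DAR-based process and record the selection rationale.

```python
# Illustrative sketch only: scoring alternative solutions against weighted
# selection criteria. Criteria, weights, and scores are hypothetical.
criteria_weights = {"cost": 0.30, "schedule": 0.20, "performance": 0.35, "risk": 0.15}

# Raw scores on a 1 (poor) to 5 (excellent) scale for each alternative.
alternatives = {
    "Custom development": {"cost": 2, "schedule": 2, "performance": 5, "risk": 3},
    "Modified COTS": {"cost": 4, "schedule": 4, "performance": 3, "risk": 3},
    "Reuse of core asset": {"cost": 5, "schedule": 4, "performance": 3, "risk": 4},
}

def weighted_score(scores):
    return sum(criteria_weights[name] * value for name, value in scores.items())

# Rank the alternatives; the ranking informs, but does not replace, the
# documented selection decision and rationale (example work product 1).
for name, scores in sorted(alternatives.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name:20s} {weighted_score(scores):.2f}")
```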


SG 2 DEVELOP THE DESIGN
Product or product component designs are developed.
TIP: A design describes a product's components' behavior and their interconnections. It guides the activities of a broad range of stakeholders, including implementers, testers, installers, and maintainers. Thus, stakeholders will use a design over the life of the product.
Product or product component designs should provide the appropriate content not only for implementation, but also for other phases of the product lifecycle such as modification, reprocurement, maintenance, sustainment, and installation. The design documentation provides a reference to support mutual understanding of the design by relevant stakeholders and supports future changes to the design both during development and in subsequent phases of the product lifecycle. A complete design description is documented in a technical data package that includes a full range of features and parameters including form, fit, function, interface, manufacturing process characteristics, and other parameters. Established organizational or project design standards (e.g., checklists, templates, object frameworks) form the basis for achieving a high degree of definition and completeness in design documentation.
TIP: Organizational standards can help achieve consistency and completeness in design.

SP 2.1 DESIGN THE PRODUCT OR PRODUCT COMPONENT
Develop a design for the product or product component.
TIP: A good design does more than identify functionality. In the words of Steve Jobs, "Design is not just what it looks like and feels like. Design is how it works. . . . Design is the fundamental soul of a human-made creation that ends up expressing itself in successive outer layers of the product or service."
Product design consists of two broad phases that can overlap in execution: preliminary and detailed design. Preliminary design establishes product capabilities and the product architecture, including architectural styles and patterns, product partitions, product component identifications, system states and modes, major intercomponent interfaces, and external product interfaces. Detailed design fully defines the structure and capabilities of the product components.
TIP: The design activity is often divided into two phases: preliminary design and detailed design. In preliminary design, the product architecture is established.
Refer to the Establish a Definition of Required Functionality and Quality Attributes specific practice in the Requirements Development process area for more information about developing architectural requirements.
Architecture definition is driven from a set of architectural requirements developed during the requirements development processes. These requirements identify the quality attributes that are critical to the success of the product. The architecture defines structural elements and coordination mechanisms that either directly satisfy requirements or support the achievement of the requirements as the details of the product design are established. Architectures can include standards and design rules governing development of product components and their interfaces as well as guidance to aid product developers. Specific practices in the Select Product Component Solutions specific goal contain more information about using product architectures as a basis for alternative solutions.
X-REF: Architecture requirements express the qualities and performance points (i.e., thresholds) critical to the success of the product. They are developed in RD SP 2.1.
TIP: An architecture includes structural elements (e.g., product partitions and components), coordination mechanisms (e.g., interfaces), design rules and principles, standards, and guidance. An architecture may also include views that support reasoning about particular quality attributes or other features, issues, or perspectives.
Architects postulate and develop a model of the product, making judgments about allocation of functional and quality attribute requirements to product components, including hardware and software. Multiple architectures, supporting alternative solutions, can be developed and analyzed to determine the advantages and disadvantages in the context of the architectural requirements.
Operational concepts and operational, sustainment, and development scenarios are used to generate use cases and quality attribute related scenarios that are used to refine the architecture. They are also used as a means to evaluate the suitability of the architecture for its intended purpose during architecture evaluations, which are conducted periodically throughout product design.
Refer to the Establish Operational Concepts and Scenarios specific practice in the Requirements Development process area for more information about developing operational concepts and scenarios used in architecture evaluation.

HINT: Scenarios can be used to help refine the architecture and evaluate it against the architecture requirements.

Examples of architecture definition tasks include the following:
• Establishing the structural relations of partitions and rules regarding interfaces between elements within partitions, and between partitions
• Selecting architectural patterns that support the functional and quality attribute requirements, and instantiating or composing those patterns to create the product architecture
• Identifying major internal interfaces and all external interfaces
• Identifying product components and interfaces between them
• Formally defining component behavior and interaction using an architecture description language
• Defining coordination mechanisms (e.g., for software, hardware)
• Establishing infrastructure capabilities and services
• Developing product component templates or classes and frameworks
• Establishing design rules and authority for making decisions
• Defining a process/thread model
• Defining physical deployment of software to hardware
• Identifying major reuse approaches and sources
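For software-intensive products, several of the tasks above (identifying product components, the interfaces between them, and their coordination mechanisms) can be captured in a lightweight, machine-checkable form well before a full architecture description language is adopted. The sketch below is illustrative only and is not part of the model; the component names, interface names, and protocols are hypothetical.

```python
# Illustrative only: a lightweight record of product components and the
# interfaces between them, with a consistency check that every interface
# endpoint refers to a declared component. All names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interface:
    name: str
    provider: str   # component supplying the interface
    consumer: str   # component using the interface
    protocol: str   # coordination mechanism, e.g., message queue, shared memory

components = {"GuidanceSW", "SensorIO", "TelemetryLink"}

interfaces = [
    Interface("nav-updates", provider="SensorIO", consumer="GuidanceSW",
              protocol="message queue"),
    Interface("downlink", provider="GuidanceSW", consumer="TelemetryLink",
              protocol="framed serial"),
]

# Simple architecture sanity check: no interface may reference an
# undeclared component.
undeclared = {end for i in interfaces
              for end in (i.provider, i.consumer) if end not in components}
assert not undeclared, f"interfaces reference undeclared components: {undeclared}"
print(f"{len(components)} components, {len(interfaces)} interfaces checked")
```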

During detailed design, the product architecture details are finalized, product components are completely defined, and interfaces are fully characterized. Product component designs can be optimized for certain quality attributes. Designers can evaluate the use of legacy or COTS products for the product components. As the design matures, the requirements assigned to lower level product components are tracked to ensure that those requirements are satisfied.
TIP: In detailed design, we fully define the structure and capabilities of the product components. Product component designs may be optimized relative to particular quality attributes (e.g., response time).


Refer to the Requirements Management process area for more information about ensuring alignment between project work and requirements.
For software engineering, detailed design is focused on software product component development. The internal structure of product components is defined, data schemas are generated, algorithms are developed, and heuristics are established to provide product component capabilities that satisfy allocated requirements.
For hardware engineering, detailed design is focused on product development of electronic, mechanical, electro-optical, and other hardware products and their components. Electrical schematics and interconnection diagrams are developed, mechanical and optical assembly models are generated, and fabrication and assembly processes are developed.

Example Work Products
1. Product architecture
2. Product component design

Subpractices
1. Establish and maintain criteria against which the design can be evaluated.
HINT: A design is a document used by stakeholders over the life of the product, and thus must communicate clearly and accommodate change. Consider this when selecting criteria to be used in evaluating a design.

Examples of quality attributes, in addition to expected product performance, for which design criteria can be established, include the following:
• Modular
• Clear
• Simple
• Maintainable
• Verifiable
• Portable
• Reliable
• Accurate
• Secure
• Scalable
• Usable

2. Identify, develop, or acquire the design methods appropriate for the product.
Effective design methods can embody a wide range of activities, tools, and descriptive techniques. Whether a given method is effective or not depends on the situation. Two companies may have effective design methods for products in which they specialize, but these methods may not be effective in cooperative ventures. Highly sophisticated methods are not necessarily effective in the hands of designers who have not been trained in the use of the methods.
TIP: An effective design method (and design language and tool) enables a designer to describe the entities that comprise a design and its connections, analyze attributes of interest (e.g., design cohesiveness, presence of race conditions), test the design against use cases or scenarios, and revise the design to reflect decisions made.
Whether a method is effective also depends on how much assistance it provides the designer, and the cost effectiveness of that assistance. For example, a multiyear prototyping effort may not be appropriate for a simple product component but might be the right thing to do for an unprecedented, expensive, and complex product development. Rapid prototyping techniques, however, can be highly effective for many product components. Methods that use tools to ensure that a design will encompass all the necessary attributes needed to implement the product component design can be effective. For example, a design tool that "knows" the capabilities of the manufacturing processes can allow the variability of the manufacturing process to be accounted for in the design tolerances.
TIP: Prototyping is an important technique that can help you discover design deficiencies early, before fully committing to implementing a particular design.
X-REF: For more information about design approaches, see "Architecture description language," "Unified Modeling Language," "Prototype," and "Design Pattern (computer science)" on the Wikipedia site, www.wikipedia.org.

Examples of techniques and methods that facilitate effective design include the following:
• Prototypes
• Structural models
• Object oriented design
• Essential systems analysis
• Entity relationship models
• Design reuse
• Design patterns

3. Ensure that the design adheres to applicable design standards and criteria.
TIP: Verification methods that help ensure that a design adheres to applicable standards and criteria include peer reviews, modeling, simulation, and prototyping. (See VER for more information.)

Examples of design standards include the following (some or all of these standards may be design criteria, particularly in circumstances where the standards have not been established):
• Operator interface standards
• Test scenarios
• Safety standards
• Design constraints (e.g., electromagnetic compatibility, signal integrity, environmental)
• Production constraints
• Design tolerances
• Parts standards (e.g., production scrap, waste)


4. Ensure that the design adheres to allocated requirements.
Identified COTS product components should be taken into account. For example, putting existing product components into the product architecture might modify the requirements and the requirements allocation.
TIP: Verification methods that help ensure that a design adheres to allocated requirements are identified in the previous tip. (See VER for more information.)
5. Document the design.

SP 2.2 ESTABLISH A TECHNICAL DATA PACKAGE

Establish and maintain a technical data package.
TIP: A technical data package is the complete design documentation for a product or product component and the additional information needed to support its effective use.
A technical data package provides the developer with a comprehensive description of the product or product component as it is developed. Such a package also provides procurement flexibility in a variety of circumstances such as performance based contracting or build-to-print. (See the definition of "technical data package" in the glossary.)
The design is recorded in a technical data package that is created during preliminary design to document the architecture definition. This technical data package is maintained throughout the life of the product to record essential details of the product design. The technical data package provides the description of a product or product component (including product related lifecycle processes if not handled as separate product components) that supports an acquisition strategy, or the implementation, production, engineering, and logistics support phases of the product lifecycle. The description includes the definition of the required design configuration and procedures to ensure adequacy of product or product component performance. It includes all applicable technical data such as drawings, associated lists, specifications, design descriptions, design databases, standards, quality attribute requirements, quality assurance provisions, and packaging details. The technical data package includes a description of the selected alternative solution that was chosen for implementation.
TIP: A technical data package is maintained throughout the life of the product. It is an essential input to the acquisition of the product (if it is to be "bought") as well as to implementation, production, maintenance, and support (if it is to be "built").
Because design descriptions can involve a large amount of data and can be crucial to successful product component development, it is advisable to establish criteria for organizing the data and for selecting the data content. It is particularly useful to use the product architecture as a means of organizing this data and abstracting views that are clear and relevant to an issue or feature of interest. These views include the following:

• Customers
• Requirements
• The environment
• Functional
• Logical
• Security
• Data
• States/modes
• Construction
• Management

These views are documented in the technical data package.

Example Work Products
1. Technical data package

Subpractices

HINT: Determine early in the project how much documentation is needed for each level of design. Perhaps use cases are sufficient for designs to be fully implemented in software or human operations.
1. Determine the number of levels of design and the appropriate level of documentation for each design level.
Determining the number of levels of product components (e.g., subsystem, hardware configuration item, circuit board, computer software configuration item [CSCI], computer software product component, computer software unit) that require documentation and requirements traceability is important to manage documentation costs and to support integration and verification plans.
2. Determine the views to be used to document the architecture.
Views are selected to document the structures inherent in the product and to address particular stakeholder concerns.
3. Base detailed design descriptions on the allocated product component requirements, architecture, and higher level designs.
4. Document the design in the technical data package.
5. Document the key decisions made or defined (i.e., those with a significant effect on cost, schedule, or technical performance), including their rationale.
6. Revise the technical data package as necessary.

SP 2.3 DESIGN INTERFACES USING CRITERIA
Design product component interfaces using established criteria.
HINT: Where possible, reach early agreement on interface design. Lack of agreement leads to wasted time, as assumptions made by the teams on each side of an interface need to be validated. Project costs increase, there may be rework, and the schedule may slip.
Interface designs include the following:
• Origination
• Destination
• Stimulus and data characteristics for software, including sequencing constraints or protocols
• Resources consumed processing a particular stimulus
• Exception or error handling behavior for stimuli that are erroneous or out of specified limits
• Electrical, mechanical, and functional characteristics for hardware
• Service lines of communication

TIP: Interface design may sometimes follow a formal evaluation (DAR) process: establish criteria for desired attributes (e.g., correctness) of an interface (based in part on interface requirements), identify design alternatives, and so on.
The criteria for interfaces frequently reflect critical parameters that should be defined, or at least investigated, to ascertain their applicability. These parameters are often peculiar to a given type of product (e.g., software, mechanical, electrical, service) and are often associated with safety, security, durability, and mission critical characteristics.
TIP: Because of the importance of interfaces to effective product development, interfaces are addressed in multiple places: interface requirements are covered in RD SP 2.3, design of interfaces is covered in TS SP 2.3, and ensuring interface compatibility is covered in PI SG 2.
Refer to the Identify Interface Requirements specific practice in the Requirements Development process area for more information about identifying product and product component interface requirements.

Example Work Products face compatibility is covered TS 1. Interface design specifications in PI SG 2.

2. Interface control documents TIP 3. Interface specification criteria Interface control documents ptg 4. Rationale for selected interface design define interfaces in terms of data items passed, protocols Subpractices used for interaction, and so on. These documents are particu- 1. Define interface criteria. larly useful in controlling prod- These criteria can be a part of the organizational process assets. uct components being built by Refer to the Organizational Process Definition process area for more different teams. information about establishing and maintaining a usable set of organiza- tional process assets and work environment standards. 2. Identify interfaces associated with other product components. X-REF 3. Identify interfaces associated with external items. For examples of interfaces, see PI SP 2.1. 4. Identify interfaces between product components and the product related lifecycle processes. For example, such interfaces could include the ones between a prod- uct component to be fabricated and the jigs and fixtures used to enable that fabrication during the manufacturing process. 5. Apply the criteria to the interface design alternatives. Refer to the Decision Analysis and Resolution process area for more infor- mation about analyzing possible decisions using a formal evaluation HINT process that evaluates identified alternatives against established criteria. Interface designs should be 6. Document the selected interface designs and the rationale for the documented in the technical data package. selection.
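An interface control document entry can be recorded as structured data covering the design elements listed earlier (origination, destination, data characteristics, error handling) along with the rationale for the selection. The sketch below is illustrative only; the interface, components, fields, and values are hypothetical.

```python
# Illustrative only: one interface control document (ICD) entry captured as
# structured data. Field names and values are hypothetical; a real entry
# would trace to the interface requirements identified in RD SP 2.3.
import json

icd_entry = {
    "interface_id": "ICD-017",
    "origination": "FlightComputer",
    "destination": "ActuatorController",
    "data_characteristics": {
        "message": "ACTUATOR_CMD",
        "fields": [
            {"name": "channel", "type": "uint8"},
            {"name": "position_deg", "type": "float32"},
        ],
        "rate_hz": 50,
        "sequencing": "each command is acknowledged before the next is sent",
    },
    "error_handling": "out-of-range position_deg is rejected and logged; "
                      "three consecutive rejections raise a fault",
    "rationale": "message-based design selected over shared memory for "
                 "testability across teams",
}

print(json.dumps(icd_entry, indent=2))
```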


SP 2.4 PERFORM MAKE, BUY, OR REUSE ANALYSES
Evaluate whether the product components should be developed, purchased, or reused based on established criteria.
TIP: This is also known as make-or-buy analysis (or make-buy-or-reuse analysis).

The determination of what products or product components will be acquired is frequently referred to as a "make-or-buy analysis." It is based on an analysis of the needs of the project. This make-or-buy analysis begins early in the project during the first iteration of design; continues during the design process; and is completed with the decision to develop, acquire, or reuse the product.
TIP: Organizations in all kinds of industries (even public institutions such as school systems) increasingly engage in such analyses to reduce costs or improve time-to-market of their products and services.
Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Requirements Management process area for more information about managing requirements.
Factors affecting the make-or-buy decision include the following:

• Functions the products will provide and how these functions will fit into the project
• Available project resources and skills
• Costs of acquiring versus developing internally
• Critical delivery and integration dates
• Strategic business alliances, including high-level business requirements
• Market research of available products, including COTS products
• Functionality and quality of available products
• Skills and capabilities of potential suppliers
• Impact on core competencies
• Licenses, warranties, responsibilities, and limitations associated with products being acquired
• Product availability
• Proprietary issues
• Risk reduction
• Match between needs and product line core assets

The make-or-buy decision can be conducted using a formal evaluation approach.
Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
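One input to such an evaluation is a rough lifecycle cost comparison between developing a component and acquiring it. The sketch below is illustrative only; all figures, names, and the five-year horizon are hypothetical, and a real analysis would also weigh the non-cost factors listed above.

```python
# Illustrative only: a rough lifecycle cost comparison supporting a
# make-or-buy decision. All figures are hypothetical.
YEARS = 5  # assumed support horizon

def make_cost(development, yearly_maintenance):
    # Cost of developing in-house and maintaining it over the horizon.
    return development + YEARS * yearly_maintenance

def buy_cost(integration, yearly_license, yearly_support):
    # Cost of integrating a purchased component plus recurring fees.
    return integration + YEARS * (yearly_license + yearly_support)

options = {
    "Develop in-house": make_cost(development=900_000, yearly_maintenance=120_000),
    "Purchase COTS": buy_cost(integration=250_000, yearly_license=80_000,
                              yearly_support=40_000),
}

for name, cost in options.items():
    print(f"{name:18s} ${cost:,.0f} over {YEARS} years")
```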


As technology evolves, so does the rationale for choosing to develop or purchase a product component. While complex development efforts can favor purchasing an off-the-shelf product component, advances in productivity and tools can provide an opposing rationale. Off-the-shelf products can have incomplete or inaccurate documentation and may or may not be supported in the future.
TIP: As technology evolves, a make-or-buy analysis might lead to a different decision.
Once the decision is made to purchase an off-the-shelf product component, how to implement that decision depends on the type of item being acquired. There are times when "off the shelf" refers to an existing item that is not readily available because it must first be customized to meet particular purchaser specified requirements for performance and other product characteristics as part of its procurement (e.g., aircraft engines). To manage such procurements, a supplier agreement is established that includes these requirements and the acceptance criteria to be met. In other cases, the off-the-shelf product is literally off the shelf (word processing software, for example) and there is no agreement with the supplier that needs to be managed.
TIP: If an off-the-shelf product component cannot be readily purchased but must be procured from a supplier, a supplier agreement may be used to manage the procurement.
Refer to the Establish Supplier Agreements specific goal in the Supplier Agreement Management process area for more information about handling supplier agreements for modified COTS products.

Example Work Products
1. Criteria for design and product component reuse
2. Make-or-buy analyses
3. Guidelines for choosing COTS product components

Subpractices
1. Develop criteria for the reuse of product component designs.
TIP: If the organization has already been using design methods as part of a larger systematic reuse effort, it has already established the reuse criteria. Otherwise, reuse is unlikely to significantly improve the economics of design.
2. Analyze designs to determine if product components should be developed, reused, or purchased.
3. Analyze implications for maintenance when considering purchased or nondevelopmental (e.g., COTS, government off the shelf, reuse) items.

Examples of implications for maintenance include the following:
• Compatibility with future releases of COTS products
• Configuration management of supplier changes
• Defects in the nondevelopmental item and their resolution
• Unplanned obsolescence
HINT: When using off-the-shelf and other nondevelopmental items, plan how they will be maintained.


SG 3 IMPLEMENT THE PRODUCT DESIGN
Product components, and associated support documentation, are implemented from their designs.

Product components are implemented from the designs established by the specific practices in the Develop the Design specific goal. The implementation usually includes unit testing of the product components before sending them to product integration, as well as development of end-user documentation.

SP 3.1 IMPLEMENT THE DESIGN
Implement the designs of the product components.

Once the design has been completed, it is implemented as a product component. The characteristics of that implementation depend on the type of product component.
Design implementation at the top level of the product hierarchy involves the specification of each of the product components at the next level of the product hierarchy. This activity includes the allocation, refinement, and verification of each product component. It also involves the coordination between the various product component development efforts.
TIP: At an upper level of the product hierarchy, implementing the design implies recursion (i.e., repeating RD SG 2-3 and TS SG 1-2) to establish the next level of product components. At the lowest level of the product hierarchy, there is no more decomposition or recursion; the design is directly implemented.
Refer to the Product Integration process area for more information about managing interfaces and assembling product components.
Refer to the Requirements Development process area for more information about allocating product component requirements and analyzing requirements.

Example characteristics of this implementation are as follows:
• Software is coded.
• Data are documented.
• Services are documented.
• Electrical and mechanical parts are fabricated.
• Product-unique manufacturing processes are put into operation.
• Processes are documented.
• Facilities are constructed.
• Materials are produced (e.g., a product-unique material could be petroleum, oil, a lubricant, a new alloy).


Example Work Products
1. Implemented design

Subpractices
1. Use effective methods to implement the product components.
TIP: Which implementation method to use depends on the type of product component being implemented. An organization may establish its own implementation methods.

Examples of software coding methods include the following:
• Structured programming
• Object oriented programming
• Aspect oriented programming
• Automatic code generation
• Software code reuse
• Use of applicable design patterns

Examples of hardware implementation methods include the following:

• Gate level synthesis
• Circuit board layout (place and route)
• Computer aided design drawing
• Post layout simulation
• Fabrication methods

2. Adhere to applicable standards and criteria.
TIP: The implementation standards and criteria used depend on the type of product component being implemented. An organization may establish its own implementation standards and criteria.

Examples of implementation standards include the following:
• Language standards (e.g., standards for software programming languages, hardware description languages)
• Drawing requirements
• Standard parts lists
• Manufactured parts
• Structure and hierarchy of software product components
• Process and quality standards

Examples of criteria include the following:
• Modularity
• Clarity
• Simplicity
• Reliability
• Safety
• Maintainability


3. Conduct peer reviews of the selected product components.
Refer to the Verification process area for more information about performing peer reviews.
X-REF: In VER SP 1.1, you select work products to be verified and the verification method to use for each. In particular, you select which product components are to be verified through peer review (e.g., software code).
4. Perform unit testing of the product component as appropriate.
Note that unit testing is not limited to software. Unit testing involves the testing of individual hardware or software units or groups of related items prior to integration of those items.
Refer to the Verification process area for more information about verifying selected work products.

Examples of unit testing methods (manual or automated) include the following:
• Statement coverage testing
• Branch coverage testing
• Predicate coverage testing
• Path coverage testing
• Boundary value testing
• Special value testing

Examples of unit testing methods include the following:
• Functional testing
• Radiation inspection testing
• Environmental testing
TIP: The product component may now be ready for product integration (see PI).
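As an illustration of the software methods listed above, boundary value tests exercise values at and just beyond the edges of an input range. The sketch below is illustrative only; the function under test and its limits are invented for the example, and the tests are written as pytest-style functions.

```python
# Illustrative only: boundary value unit tests for a hypothetical function
# that clamps a commanded position to its allowed range.
def clamp_position(position_deg: float, low: float = -30.0, high: float = 30.0) -> float:
    """Return the position limited to the closed interval [low, high]."""
    if position_deg < low:
        return low
    if position_deg > high:
        return high
    return position_deg

def test_boundary_values():
    # Values at, just inside, and just outside each boundary.
    assert clamp_position(-30.0) == -30.0
    assert clamp_position(-30.1) == -30.0
    assert clamp_position(29.9) == 29.9
    assert clamp_position(30.0) == 30.0
    assert clamp_position(30.1) == 30.0

def test_nominal_value():
    # A representative in-range value (special value testing would add
    # further cases, such as 0.0, if required).
    assert clamp_position(12.5) == 12.5
```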

5. Revise the product component as necessary.
TIP: Another reason to revise the product component is to remove defects discovered through peer reviews or unit testing.

An example of when the product component may need to be revised is when problems surface during implementation that could not be foreseen during design.

SP 3.2 DEVELOP PRODUCT SUPPORT DOCUMENTATION
Develop and maintain the end-use documentation.
TIP: Documentation can be treated as a type of product component for which a solution may be selected, designed, and implemented. There are design and implementation methods and standards for documentation.
This specific practice develops and maintains the documentation that will be used to install, operate, and maintain the product.

Example Work Products
1. End-user training materials


2. User's manual
3. Operator's manual
4. Maintenance manual
5. Online help
TIP: Installers, operators, end users, and maintainers may have different documentation needs that may be addressed in different documents.

Subpractices
1. Review the requirements, design, product, and test results to ensure that issues affecting the installation, operation, and maintenance documentation are identified and resolved.
TIP: Relevant stakeholders (product installers, field support staff, trainers, technical writers, and so on) can reduce the number of serious issues that must be resolved through documentation by participating in reviews of intermediate work products. In these reviews, issues impacting installation, operation, and so on are identified and resolved.
2. Use effective methods to develop the installation, operation, and maintenance documentation.
3. Adhere to the applicable documentation standards.

Examples of documentation standards include the following:
• Compatibility with designated word processors
• Acceptable fonts
• Numbering of pages, sections, and paragraphs
• Consistency with a designated style manual
• Use of abbreviations
• Security classification markings
• Internationalization requirements

4. Develop preliminary versions of the installation, operation, and maintenance documentation in early phases of the project lifecycle for review by the relevant stakeholders.
5. Conduct peer reviews of the installation, operation, and maintenance documentation.
Refer to the Verification process area for more information about performing peer reviews.
6. Revise the installation, operation, and maintenance documentation as necessary.

TIP: A preliminary version of a document produced for review by relevant stakeholders can be considered a prototype of the final document.

Examples of when documentation may need to be revised include when the following events occur:
• Requirements changes are made
• Design changes are made
• Product changes are made
• Documentation errors are identified
• Workaround fixes are identified



VALIDATION
An Engineering Process Area at Maturity Level 3

Purpose
The purpose of Validation (VAL) is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.

TIP: Validation is a series of evaluations in which end users (or proxies), the product (or prototype), and other stakeholders or systems in the intended environment (real or simulated) interact with each other in various typical and not-so-typical ways to determine whether the product will fulfill its intended use. The results may increase confidence in the product or identify issues to be resolved.

Introductory Notes
Validation activities can be applied to all aspects of the product in any of its intended environments, such as operation, training, manufacturing, maintenance, and support services. The methods employed to accomplish validation can be applied to work products as well as to the product and product components. (Throughout the process areas, where the terms “product” and “product component” are used, their intended meanings also encompass services, service systems, and their components.) The work products (e.g., requirements, designs, prototypes) should be selected on the basis of which are the best predictors of how well the product and product component will satisfy end user needs and thus validation is performed early (concept/exploration phases) and incrementally throughout the product lifecycle (including transition to operations and sustainment).

HINT: If you wait until the acceptance test to find issues, you may be in big trouble.

The validation environment should represent the intended environment for the product and product components as well as represent the intended environment suitable for validation activities with work products.
Validation demonstrates that the product, as provided, will fulfill its intended use; whereas, verification addresses whether the work product properly reflects the specified requirements. In other words, verification ensures that “you built it right”; whereas, validation ensures that “you built the right thing.” Validation activities use approaches similar to verification (e.g., test, analysis, inspection, demonstration, simulation).



Often, the end users and other relevant stakeholders are involved in the validation activities. Both validation and verification activities often run concurrently and can use portions of the same environment.

HINT: Typically, only a well-verified work product should be used in validation; otherwise, you may lose valuable time with disruptions or rediscovering requirements you already knew. This is particularly true when validation opportunities are rare or expensive. Conversely, validation helps you to uncover missing needs that you can incorporate into requirements.

Refer to the Verification process area for more information about ensuring that selected work products meet their specified requirements.
Whenever possible, validation should be accomplished using the product or product component operating in its intended environment. The entire environment can be used or only part of it. However, validation issues can be discovered early in the life of the project using work products by involving relevant stakeholders. Validation activities for services can be applied to work products such as proposals, service catalogs, statements of work, and service records. When validation issues are identified, they are referred to processes associated with the Requirements Development, Technical Solution, or Project Monitoring and Control process areas for resolution.

TIP: RD addresses requirements validation. Requirements validation determines the adequacy and completeness of the requirements.

TIP: The validation activities inherent to an Agile method are often insufficient to address all desired functionality and quality attributes. Use the practices of this process area to determine and address gaps in validation coverage.

The specific practices of this process area build on each other in the following way:
• The Select Products for Validation specific practice enables the identification of the product or product component to be validated and methods to be used to perform the validation.
• The Establish the Validation Environment specific practice enables the determination of the environment to be used to carry out the validation.
• The Establish Validation Procedures and Criteria specific practice enables the development of validation procedures and criteria that are aligned with the characteristics of selected products, customer constraints on validation, methods, and the validation environment.
• The Perform Validation specific practice enables the performance of validation according to methods, procedures, and criteria.

Related Process Areas

Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Technical Solution process area for more information about selecting, designing, and implementing solutions to requirements.
Refer to the Verification process area for more information about ensuring that selected work products meet their specified requirements.

Specific Goal and Practice Summary
SG 1 Prepare for Validation
SP 1.1 Select Products for Validation
SP 1.2 Establish the Validation Environment
SP 1.3 Establish Validation Procedures and Criteria
SG 2 Validate Product or Product Components
SP 2.1 Perform Validation
SP 2.2 Analyze Validation Results

Specific Practices by Goal

SG 1 PREPARE FOR VALIDATION Preparation for validation is conducted.

Preparation activities include selecting products and product components for validation and establishing and maintaining the validation environment, procedures, and criteria. Items selected for validation can include only the product or it can include appropriate levels of the product components used to build the product. Any product or product component can be subject to validation, including replacement, maintenance, and training products, to name a few.

HINT: Any product or product component can benefit from validation. What you select to validate should depend on the issues relating to user needs that pose the highest risk to project success and on available resources.

The environment required to validate the product or product component is prepared. The environment can be purchased or can be specified, designed, and built. Environments used for product integration and verification can be considered in collaboration with the validation environment to reduce cost and improve efficiency or productivity.

TIP: Integration tests can address validation-type activities (with an end user present to evaluate the integrated product under different scenarios). Thus, for some product components, product integration, verification, and validation activities may be addressed together.

SP 1.1 SELECT PRODUCTS FOR VALIDATION
Select products and product components to be validated and validation methods to be used.

Products and product components are selected for validation based on their relationship to end user needs. For each product component, the scope of the validation (e.g., operational behavior, maintenance, training, user interface) should be determined.

TIP: Validation can be an expensive activity. It takes good judgment to select (and limit) what needs to be validated.

Examples of products and product components that can be validated include the following:
• Product and product component requirements and designs
• Product and product components (e.g., system, hardware units, software, service documentation)


• User interfaces
• User manuals
• Training materials
• Process documentation
• Access protocols
• Data interchange reporting formats

HINT: When you seek to validate a product component that has not yet been built, consider developing a prototype based on the component’s requirements and design. The timely end-user feedback you obtain may more than compensate for the expense of the validation exercise.

The requirements and constraints for performing validation are collected. Then, validation methods are selected based on their ability to demonstrate that end user needs are satisfied. The validation methods not only define the approach to product validation, but also drive the needs for the facilities, equipment, and environments. The validation approach and needs can result in the generation of lower level product component requirements that are handled by the requirements development processes. Derived requirements, such as interface requirements to test sets and test equipment, can be generated. These requirements are also passed to the requirements development processes to ensure that the product or product components can be validated in an environment that supports the methods.

X-REF: Customer constraints are briefly mentioned as part of the development of customer requirements in RD SP 1.2.

HINT: The product (or prototype) may need special interfaces and functionality to properly interact with elements of the validation environment (e.g., data recording equipment). You can develop their requirements and incorporate them with other product requirements.

Validation methods should be selected early in the life of the project so they are clearly understood and agreed to by relevant stakeholders.
Validation methods address the development, maintenance, support, and training for the product or product component as appropriate.

TIP: Because validation generally involves stakeholders external to the project, it is important early in the project to identify and communicate with them about validation methods so that appropriate preparations can begin.

Examples of validation methods include the following:
• Discussions with end users, perhaps in the context of a formal review
• Prototype demonstrations
• Functional demonstrations (e.g., system, hardware units, software, service documentation, user interfaces)
• Pilots of training materials
• Tests of products and product components by end users and other relevant stakeholders
• Incremental delivery of working and potentially acceptable product
• Analyses of product and product components (e.g., simulations, modeling, user analyses)

Hardware validation activities include modeling to validate form, fit, and function of mechanical designs; thermal modeling; maintainability and reliability analysis; timeline demonstrations; and electrical design simulations of electronic or mechanical product components.


Example Work Products
1. Lists of products and product components selected for validation
2. Validation methods for each product or product component
3. Requirements for performing validation for each product or product component
4. Validation constraints for each product or product component

HINT: Items selected for validation might be shown as a table with columns identifying items to be validated, issues to be investigated, related requirements and constraints, and validation methods. The table might also list the work products to be verified and the verification methods to be used. Using one table to address both may lead you to discover opportunities to combine verification and validation. (A sketch of such a table appears after the subpractices below.)

Subpractices
1. Identify the key principles, features, and phases for product or product component validation throughout the life of the project.
2. Determine which categories of end user needs (operational, maintenance, training, or support) are to be validated.
The product or product component should be maintainable and supportable in its intended operational environment. This specific practice also addresses the actual maintenance, training, and support services that can be delivered with the product.

TIP: Validation is not only applied to discover missing functionality; nor limited to only the end-user operational environment. Quality attributes, other environments, and other phases of the product lifecycle should be considered.

An example of evaluation of maintenance concepts in the operational environment is a demonstration that maintenance tools are operating with the actual product.

3. Select the product and product components to be validated.
4. Select the evaluation methods for product or product component validation.
5. Review the validation selection, constraints, and methods with relevant stakeholders.

HINT: Preparing for and conducting validation requires coordination with many external groups. Obtain commitment from these groups to support the planned validation efforts.
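As a rough sketch of the selection table suggested in the margin note above, the structure below uses hypothetical names and entries to record each item selected for validation together with the issues to investigate, related requirements or constraints, and the chosen validation methods.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ValidationSelection:
    """One row of a validation selection table (illustrative only)."""
    item: str                  # product or product component selected
    issues: List[str]          # end-user issues to investigate
    requirements: List[str]    # related requirements and constraints
    methods: List[str]         # selected validation methods

selections = [
    ValidationSelection(
        item="Operator console prototype",
        issues=["alarm visibility under night-shift lighting"],
        requirements=["REQ-UI-012", "customer constraint: on-site trial only"],
        methods=["end-user trial in a simulated control room"],
    ),
    ValidationSelection(
        item="Maintenance manual (draft)",
        issues=["completeness of fault isolation steps"],
        requirements=["REQ-MAINT-003"],
        methods=["walkthrough with field support staff"],
    ),
]

# Simple report: one line per selected item and its methods.
for row in selections:
    print(f"{row.item}: methods = {', '.join(row.methods)}")
```

The same structure could gain a column for verification methods if, as the margin note suggests, one table is used to look for opportunities to combine verification and validation.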

SP 1.2 ESTABLISH THE VALIDATION ENVIRONMENT
Establish and maintain the environment needed to support validation.
The requirements for the validation environment are driven by the product or product components selected, by the type of the work products (e.g., design, prototype, final version), and by the methods of validation. These selections can yield requirements for the purchase or development of equipment, software, or other resources. These requirements are provided to the requirements development processes for development.
The validation environment can include the reuse of existing resources. In this case, arrangements for the use of these resources should be made.

Example types of elements in a validation environment include the following:
• Test tools interfaced with the product being validated (e.g., scope, electronic devices, probes)
• Temporary embedded test software
• Recording tools for dump or further analysis and replay
• Simulated subsystems or components (e.g., software, electronics, mechanics)
• Simulated interfaced systems (e.g., a dummy warship for testing a naval radar)
• Real interfaced systems (e.g., aircraft for testing a radar with trajectory tracking facilities)
• Facilities and customer supplied products
• Skilled people to operate or use all the preceding elements
• Dedicated computing or network test environment (e.g., pseudo-operational telecommunications network test bed or facility with actual trunks, switches, and systems established for realistic integration and validation trials)

Early selection of products or product components to be validated, work products to be used in validation, and validation methods is needed to ensure that the validation environment will be available when necessary.
The validation environment should be carefully controlled to provide for replication, results analysis, and revalidation of problem areas.

TIP: Validation, verification, and product integration environments can sometimes be one and the same, or at a minimum, can share some of the same resources.

Example Work Products
1. Validation environment

Subpractices
1. Identify requirements for the validation environment.
2. Identify customer supplied products.
3. Identify test equipment and tools.
4. Identify validation resources that are available for reuse and modification.
5. Plan the availability of resources in detail.

TIP: Because validation resembles a controlled experiment and because of the need for fidelity with the operational environment, many tools, simulations, computers, networks, and skilled people may need to be involved. Thus, validation planning may itself be challenging.


SP 1.3 ESTABLISH VALIDATION PROCEDURES AND CRITERIA
Establish and maintain procedures and criteria for validation.

TIP: This practice helps answer questions such as how you will exercise the product prototype to better understand a particular issue (validation procedures) and how you will know whether the behavior is acceptable (validation criteria).

Validation procedures and criteria are defined to ensure the product or product component will fulfill its intended use when placed in its intended environment. Test cases and procedures for acceptance testing can be used for validation procedures.
The validation procedures and criteria include test and evaluation of maintenance, training, and support services.

TIP: Validation procedures and criteria should address operations, but can address other product lifecycle phases as well, such as transition and sustainment.

Examples of sources for validation criteria include the following:
• Product and product component requirements
• Standards
• Customer acceptance criteria
• Environmental performance
• Thresholds of performance deviation

Example Work Products

1. Validation procedures
2. Validation criteria
3. Test and evaluation procedures for maintenance, training, and support

Subpractices
1. Review the product requirements to ensure that issues affecting validation of the product or product component are identified and resolved.
2. Document the environment, operational scenario, procedures, inputs, outputs, and criteria for the validation of the selected product or product component.
3. Assess the design as it matures in the context of the validation environment to identify validation issues.

SG 2 VALIDATE PRODUCT OR PRODUCT COMPONENTS The product or product components are validated to ensure they are suitable for use in their intended operating environment.

The validation methods, procedures, and criteria are used to validate the selected products and product components and any associated maintenance, training, and support services using the appropriate validation environment. Validation activities are performed throughout the product lifecycle.

SP 2.1 PERFORM VALIDATION Perform validation on selected products and product components.

To be acceptable to stakeholders, a product or product component should perform as expected in its intended operational environment. Validation activities are performed and the resulting data are collected according to established methods, procedures, and criteria.
The as-run validation procedures should be documented and the deviations occurring during the execution should be noted as appropriate.

HINT: The bottom line is to determine whether the product will perform as expected.

TIP: The validation environment may support the automatic collection of much of the data.

Example Work Products
1. Validation reports
2. Validation results
3. Validation cross reference matrix
4. As-run procedures log
5. Operational demonstrations

SP 2.2 ANALYZE VALIDATION RESULTS
Analyze results of validation activities.

HINT: Validation activities are expensive and it is important to maximize learning. Therefore, analyzing the results of the validation activities may help you to discover missing requirements, features in the product that delight the customer, lingering issues, and risks.

The data resulting from validation tests, inspections, demonstrations, or evaluations are analyzed against defined validation criteria. Analysis reports indicate whether needs were met. In the case of deficiencies, these reports document the degree of success or failure and categorize probable causes of failure. The collected test, inspection, or review results are compared with established evaluation criteria to determine whether to proceed or to address requirements or design issues in the requirements development or technical solution processes.
Analysis reports or as-run validation documentation can also indicate that bad test results are due to a validation procedure problem or a validation environment problem.

Example Work Products
1. Validation deficiency reports
2. Validation issues
3. Procedure change request


Subpractices
1. Compare actual results to expected results.
2. Based on the established validation criteria, identify products and product components that do not perform suitably in their intended operating environments, or identify problems with methods, criteria, or the environment.
3. Analyze validation data for defects.
4. Record results of the analysis and identify issues.
5. Use validation results to compare actual measurements and performance to the intended use or operational need.
6. Provide information on how defects can be resolved (including validation methods, criteria, and validation environment) and initiate corrective action.
Refer to the Project Monitoring and Control process area for more information about managing corrective actions.

HINT: If requirements are missing, you need to revisit your engineering processes. If there are problems with the validation methods, environment, procedures, or criteria, you need to revisit project activities that correspond to the specific practices of SG 1.
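A minimal sketch of the comparison step in these subpractices, using hypothetical criteria and measured results: each criterion carries an expected value and an allowed deviation, and results outside that threshold are recorded as deficiencies for corrective action.

```python
from dataclasses import dataclass

@dataclass
class ValidationCriterion:
    name: str
    expected: float
    allowed_deviation: float   # threshold of performance deviation

def find_deficiencies(criteria, actuals):
    """Compare actual results to expected results and flag unsuitable performance."""
    deficiencies = []
    for c in criteria:
        actual = actuals[c.name]
        if abs(actual - c.expected) > c.allowed_deviation:
            deficiencies.append({"criterion": c.name,
                                 "expected": c.expected,
                                 "actual": actual})
    return deficiencies

criteria = [
    ValidationCriterion("task_completion_time_s", expected=30.0, allowed_deviation=5.0),
    ValidationCriterion("user_error_rate", expected=0.02, allowed_deviation=0.01),
]
actuals = {"task_completion_time_s": 42.0, "user_error_rate": 0.015}

for d in find_deficiencies(criteria, actuals):
    print(f"Deficiency: {d['criterion']} expected {d['expected']}, observed {d['actual']}")
```

A record like each dictionary above could feed a validation deficiency report, with probable causes and corrective actions added during analysis.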




VERIFICATION
An Engineering Process Area at Maturity Level 3

Purpose
The purpose of Verification (VER) is to ensure that selected work products meet their specified requirements.

TIP: Testing and peer reviews are verification methods covered in this process area.

Introductory Notes
The Verification process area involves the following: verification preparation, verification performance, and identification of corrective action.

X-REF: Managing corrective action to closure is addressed in PMC SG 2.

Verification includes verification of the product and intermediate work products against all selected requirements, including customer, product, and product component requirements. For product lines, core assets and their associated product line variation mechanisms should also be verified. Throughout the process areas, where the terms “product” and “product component” are used, their intended meanings also encompass services, service systems, and their components.
Verification is inherently an incremental process because it occurs throughout the development of the product and work products, beginning with verification of requirements, progressing through the verification of evolving work products, and culminating in the verification of the completed product.
The specific practices of this process area build on each other in the following way:

• The Select Work Products for Verification specific practice enables the identification of work products to be verified, methods to be used to perform the verification, and the requirements to be satisfied by each selected work product.
• The Establish the Verification Environment specific practice enables the determination of the environment to be used to carry out the verification.



• The Establish Verification Procedures and Criteria specific practice enables the development of verification procedures and criteria that are aligned with selected work products, requirements, methods, and characteristics of the verification environment.
• The Perform Verification specific practice conducts the verification according to available methods, procedures, and criteria.

Verification of work products substantially increases the likelihood that the product will meet the customer, product, and product component requirements.
The Verification and Validation process areas are similar, but they address different issues. Validation demonstrates that the product, as provided (or as it will be provided), will fulfill its intended use, whereas verification addresses whether the work product properly reflects the specified requirements. In other words, verification ensures that “you built it right”; whereas, validation ensures that “you built the right thing.”
Peer reviews are an important part of verification and are a proven mechanism for effective defect removal. An important corollary is to develop a better understanding of the work products and the processes that produced them so that defects can be prevented and process improvement opportunities can be identified.

TIP: Peer reviews also focus on obtaining the data necessary to prevent defects and improve the process.

Peer reviews involve a methodical examination of work products by the producers’ peers to identify defects and other changes that are needed.

Examples of peer review methods include the following:
• Inspections
• Structured walkthroughs
• Deliberate refactoring
• Pair programming

TIP: “Inspections” and “structured walkthroughs” are peer review methods that have been widely used for decades and have many variants. “Deliberate refactoring” and “pair programming” are techniques characteristic of some Agile methods.

In Agile environments, because of customer involvement and frequent releases, verification and validation mutually support each other. For example, a defect can cause a prototype or early release to fail validation prematurely. Conversely, early and continuous validation helps ensure verification is applied to the right product. The Verification and Validation process areas help ensure a systematic approach to selecting the work products to be reviewed and tested, the methods and environments to be used, and the interfaces to be managed, which help ensure that defects are identified and addressed early. The more complex the product, the more systematic the approach needs to be to ensure compatibility among requirements and solutions, and consistency with how the product will be used. (See “Interpreting CMMI When Using Agile Approaches” in Part I.)

Related Process Areas
Refer to the Requirements Development process area for more information about eliciting, analyzing, and establishing customer, product, and product component requirements.
Refer to the Validation process area for more information about demonstrating that a product or product component fulfills its intended use when placed in its intended environment.
Refer to the Requirements Management process area for more information about ensuring alignment between project work and requirements.

X-REF: Many books and other sources describe testing and peer reviews (sometimes known as inspections). For a summary of software testing principles and terminology, see the entry for “Software Testing” on the Wikipedia site, www.wikipedia.org. Also, many other forums exist that may be of benefit, e.g., http://spac.wordpress.com.

Specific Goal and Practice Summary
SG 1 Prepare for Verification
SP 1.1 Select Work Products for Verification
SP 1.2 Establish the Verification Environment
SP 1.3 Establish Verification Procedures and Criteria
SG 2 Perform Peer Reviews
SP 2.1 Prepare for Peer Reviews
SP 2.2 Conduct Peer Reviews
SP 2.3 Analyze Peer Review Data
SG 3 Verify Selected Work Products
SP 3.1 Perform Verification
SP 3.2 Analyze Verification Results

Specific Practices by Goal

SG 1 PREPARE FOR VERIFICATION Preparation for verification is conducted.

Up-front preparation is necessary to ensure that verification provisions are embedded in product and product component requirements, designs, developmental plans, and schedules. Verification includes the selection, inspection, testing, analysis, and demonstration of work products.
Methods of verification include, but are not limited to, inspections, peer reviews, audits, walkthroughs, analyses, architecture evaluations, simulations, testing, and demonstrations. Practices related to peer reviews as a specific verification method are included in specific goal 2.
Preparation also entails the definition of support tools, test equipment and software, simulations, prototypes, and facilities.


SP 1.1 SELECT WORK PRODUCTS FOR VERIFICATION
Select work products to be verified and verification methods to be used.

HINT: Identify which work products put the project (or product) at the highest risk.

Work products are selected based on their contribution to meeting project objectives and requirements, and to addressing project risks.
The work products to be verified can include the ones associated with maintenance, training, and support services. The work product requirements for verification are included with the verification methods. The verification methods address the approach to work product verification and the specific approaches that will be used to verify that specific work products meet their requirements.

HINT: Don’t forget to verify work products important to other phases of the product lifecycle, such as maintenance documentation, installation services, and operator training.

TIP: When a work product cannot be executed (e.g., designs), testing is not an option. It may be possible to develop prototypes and models, and “execute” these to gain insight into product characteristics. Peer reviews may be another option.

Examples of verification methods include the following:
• Software architecture evaluation and implementation conformance evaluation
• Path coverage testing
• Load, stress, and performance testing
• Decision table based testing
• Functional decomposition based testing
• Test case reuse
• Acceptance testing
• Continuous integration (i.e., Agile approach that identifies integration issues early)

X-REF: One source of information on software architecture evaluations is Clements, P., Kazman, R., Klein, M. Evaluating Software Architectures: Methods and Case Studies, Addison-Wesley, 2001.

Verification for systems engineering typically includes prototyping, modeling, and simulation to verify adequacy of system design (and allocation).

Verification for hardware engineering typically requires a parametric approach that considers various environmental conditions (e.g., pressure, temperature, vibration, humidity), various input ranges (e.g., input power could be rated at 20V to 32V for a planned nominal of 28V), variations induced from part to part tolerance issues, and many other variables. Hardware verification normally tests most variables separately except when problematic interactions are suspected.

TIP: Reverification is not called out separately in this process area, because reverification is actually an iteration of the verification process. However, reverification should be considered when planning verification activities.

Selection of verification methods typically begins with the definition of product and product component requirements to ensure that the requirements are verifiable. Re-verification should be addressed by verification methods to ensure that rework performed on work products does not cause unintended defects. Suppliers should be involved in this selection to ensure that the project’s methods are appropriate for the supplier’s environment.

HINT: Work products that have been “reworked” need to be reverified.


Example Work Products
1. Lists of work products selected for verification
2. Verification methods for each selected work product

TIP: Methods used for each work product may be shown in a table with columns identifying the work product to be verified, requirements to be satisfied, and verification methods to be used.

Subpractices
1. Identify work products for verification.
2. Identify requirements to be satisfied by each selected work product.
Refer to the Maintain Bidirectional Traceability of Requirements specific practice in the Requirements Management process area for more information about tracing requirements to work products.
3. Identify verification methods available for use.
4. Define verification methods to be used for each selected work product.
5. Submit for integration with the project plan the identification of work products to be verified, the requirements to be satisfied, and the methods to be used.
Refer to the Project Planning process area for more information about developing the project plan.

TIP: By incorporating such a table into the project plan (perhaps by reference), resources can be provided and commitments made to perform the appropriate verification activities.
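A minimal sketch, with hypothetical work products and requirement identifiers, of the selection table the margin notes above describe: each selected work product is mapped to the requirements it must satisfy and the verification methods to be used, and a simple completeness check is run before the table is submitted for integration with the project plan.

```python
# Hypothetical verification selection table: work product -> requirements and methods.
verification_selection = {
    "software architecture description": {
        "requirements": ["REQ-PERF-001", "REQ-AVAIL-002"],
        "methods": ["architecture evaluation", "peer review"],
    },
    "motor control module": {
        "requirements": ["REQ-CTRL-014", "REQ-CTRL-015"],
        "methods": ["unit testing (branch coverage)", "load and stress testing"],
    },
    "installation guide": {
        "requirements": ["REQ-DOC-003"],
        "methods": ["structured walkthrough"],
    },
}

# Completeness check: every selected work product has at least one requirement
# to satisfy and at least one verification method defined.
for work_product, row in verification_selection.items():
    assert row["requirements"], f"{work_product}: no requirements identified"
    assert row["methods"], f"{work_product}: no verification method defined"
    print(f"{work_product}: {len(row['requirements'])} requirement(s); "
          f"methods: {', '.join(row['methods'])}")
```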

SP 1.2 ESTABLISH THE VERIFICATION ENVIRONMENT
Establish and maintain the environment needed to support verification.

TIP: Some work products and verification methods may require special facilities and tools. These should be identified and obtained in advance.

An environment should be established to enable verification to take place. The verification environment can be acquired, developed, reused, modified, or obtained using a combination of these activities, depending on the needs of the project.
The type of environment required depends on the work products selected for verification and the verification methods used. A peer review can require little more than a package of materials, reviewers, and a room. A product test can require simulators, emulators, scenario generators, data reduction tools, environmental controls, and interfaces with other systems.

TIP: In the case of peer reviews, a co-located team might meet in a room where the document being peer reviewed can be displayed. Remote team members might participate through teleconferencing and use of a web-based collaboration tool that allows them to see the document and engage in the discussion.

Example Work Products
1. Verification environment

Subpractices
1. Identify verification environment requirements.
2. Identify verification resources that are available for reuse or modification.


3. Identify verification equipment and tools.
4. Acquire verification support equipment and an environment (e.g., test equipment, software).

TIP: The organization’s IT or Facilities Group, or perhaps other projects, might have some of the verification resources that a project might need. In some cases, special facilities such as environmental labs or antenna ranges used by multiple projects must be reserved well in advance for use.

SP 1.3 ESTABLISH VERIFICATION PROCEDURES AND CRITERIA
Establish and maintain verification procedures and criteria for the selected work products.

HINT: The bottom line is to determine verification methods, environments, and procedures early to allow time for preparation and coordination.

Verification criteria are defined to ensure that work products meet their requirements.

TIP: In the case of engineering artifacts (e.g., architectures, designs, and implementations), the primary source for verification criteria is likely to be the requirements assigned to the work product being verified.

Examples of sources for verification criteria include the following:
• Product and product component requirements
• Standards
• Organizational policies
• Test type
• Test parameters
• Parameters for tradeoff between quality and cost of testing
• Type of work products
• Suppliers
• Proposals and agreements
• Customers reviewing work products collaboratively with developers

Example Work Products
1. Verification procedures
2. Verification criteria

Subpractices
1. Generate a set of comprehensive, integrated verification procedures for work products and commercial off-the-shelf products, as necessary.
2. Develop and refine verification criteria as necessary.
3. Identify the expected results, tolerances allowed, and other criteria for satisfying the requirements.
4. Identify equipment and environmental components needed to support verification.

HINT: Remember to establish comprehensive verification procedures and criteria for nondevelopmental items (such as off-the-shelf products) that subject the project to moderate or high risk.

HINT: Often, there isn’t a single right result but a range of results that might be acceptable. In such instances, specify how much variability from the expected answer is still acceptable.
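A brief sketch of subpractice 3, with hypothetical steps and numbers: each step of a verification procedure records the expected result and the tolerance still considered acceptable, so the as-run results can be judged objectively.

```python
# Hypothetical verification procedure: ordered steps, each with an expected
# result and the tolerance allowed around it.
procedure = {
    "id": "VER-PROC-007",
    "work_product": "power supply unit",
    "steps": [
        {"action": "apply nominal 28 V input and measure output ripple",
         "expected": 0.10, "tolerance": 0.05, "unit": "V"},
        {"action": "apply 20 V input (low line) and measure output voltage",
         "expected": 5.00, "tolerance": 0.25, "unit": "V"},
    ],
}

def step_passes(step, measured):
    """A step passes if the measured value lies within expected +/- tolerance."""
    return abs(measured - step["expected"]) <= step["tolerance"]

# As-run example: compare measured values against each step's criterion.
measured_values = [0.12, 5.40]
for step, measured in zip(procedure["steps"], measured_values):
    status = "pass" if step_passes(step, measured) else "fail"
    print(f"{procedure['id']}: {step['action']} -> {measured} {step['unit']} ({status})")
```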


SG 2 PERFORM PEER REVIEWS
Peer reviews are performed on selected work products.

TIP: Peers are generally coworkers from your project who have an interest in the item under peer review. Peer review participants should generally not include your management because that could restrain open and beneficial dialog.

Peer reviews involve a methodical examination of work products by the producers’ peers to identify defects for removal and to recommend other changes that are needed.
The peer review is an important and effective verification method implemented via inspections, structured walkthroughs, or a number of other collegial review methods.
Peer reviews are primarily applied to work products developed by the projects, but they can also be applied to other work products such as documentation and training work products that are typically developed by support groups.

TIP: Peer reviews provide opportunities to learn and share information across the team.

HINT: Use peer reviews not just for development artifacts, but also for project management artifacts (e.g., plans), process management artifacts (e.g., process descriptions), and support artifacts (e.g., measure definitions).

SP 2.1 PREPARE FOR PEER REVIEWS
Prepare for peer reviews of selected work products.
Preparation activities for peer reviews typically include identifying the staff to be invited to participate in the peer review of each work product; identifying key reviewers who should participate in the peer review; preparing and updating materials to be used during peer reviews, such as checklists and review criteria; and scheduling peer reviews.

Example Work Products
1. Peer review schedule
2. Peer review checklist
3. Entry and exit criteria for work products
4. Criteria for requiring another peer review
5. Peer review training material
6. Selected work products to be reviewed

TIP: Easily overlooked, training improves the effectiveness of peer reviews.

TIP: Although many different types of reviews might be considered “peer reviews” (e.g., informal walkthroughs), these specific practices focus on those having some formality and discipline in their performance.

Subpractices
1. Determine the type of peer review to be conducted.

Examples of types of peer reviews include the following:
• Inspections
• Structured walkthroughs
• Active reviews
• Architecture implementation conformance evaluation

X-REF: For a summary of peer review principles and terminology, see the entry for “Software Inspections” on the Wikipedia site, www.wikipedia.org.


2. Define requirements for collecting data during the peer review.
Refer to the Measurement and Analysis process area for more information about obtaining measurement data.
3. Establish and maintain entry and exit criteria for the peer review.
4. Establish and maintain criteria for requiring another peer review.
5. Establish and maintain checklists to ensure that work products are reviewed consistently.

HINT: A checklist identifies the classes of defects that commonly occur for a type of work product. By using the checklist, you’re less likely to overlook certain classes of defects. Also, some classes of defects might be assigned to different peer review participants, helping to ensure adequate coverage of the checklist. (A sketch of such a checklist appears after the subpractices below.)

Examples of items addressed by the checklists include the following:
• Rules of construction
• Design guidelines
• Completeness
• Correctness
• Maintainability
• Common defect types

The checklists are modified as necessary to address the specific type of work product and peer review. The peers of the checklist developers and potential end users review the checklists.
6. Develop a detailed peer review schedule, including the dates for peer review training and for when materials for peer reviews will be available.
7. Ensure that the work product satisfies the peer review entry criteria prior to distribution.
8. Distribute the work product to be reviewed and related information to participants early enough to enable them to adequately prepare for the peer review.
9. Assign roles for the peer review as appropriate.

TIP: A classic problem that arises in low-maturity organizations is that participants skip preparation when under schedule pressure. One way to address this is to incorporate the peer review schedule into the development plans and schedule to help ensure adequate allocation of time. Another way is to postpone the peer review if participants have not prepared.

Examples of roles include the following:
• Leader
• Reader
• Recorder
• Author

10. Prepare for the peer review by reviewing the work product prior to conducting the peer review.

HINT: To maximize the effectiveness of the peer review, participants need to prepare prior to the meeting.
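The checklist mentioned under subpractice 5 can be kept as a simple structured artifact. The sketch below is illustrative only, assuming a design document as the work product; the items, categories, and finding format are hypothetical.

```python
# A minimal peer review checklist for one type of work product (a design
# document), grouped by the classes of defects it is meant to catch.
design_review_checklist = {
    "completeness": [
        "Every allocated requirement is addressed by a design element",
        "All external interfaces are described",
    ],
    "correctness": [
        "Algorithms handle boundary and error cases",
        "Units and data ranges are consistent across interfaces",
    ],
    "maintainability": [
        "Design follows the project's structure and naming guidelines",
    ],
}

def record_finding(findings, category, item, location, note):
    """Record a defect or issue found while walking the checklist."""
    findings.append({"category": category, "item": item,
                     "location": location, "note": note})

findings = []
record_finding(findings, "completeness",
               design_review_checklist["completeness"][1],
               location="section 3.2",
               note="bus interface timing not described")
print(f"{len(findings)} finding(s) recorded")
```

Assigning different categories to different reviewers, as the margin note suggests, helps ensure the whole checklist is covered.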


SP 2.2 CONDUCT PEER REVIEWS Conduct peer reviews of selected work products and identify issues resulting from these reviews.

One of the purposes of conducting a peer review is to find and remove defects early. Peer reviews are performed incrementally as work products are being developed. These reviews are structured and are not management reviews.

TIP: The ratios often reported that compare the cost to find a defect late in the project to the cost to find a defect early in the project range from 5:1 to 100:1.

Peer reviews can be performed on key work products of specification, design, test, and implementation activities and specific planning work products.
The focus of the peer review should be on the work product in review, not on the person who produced it.

HINT: Provide a nonthreatening environment so that open discussions can occur.

When issues arise during the peer review, they should be communicated to the primary developer of the work product for correction.
Refer to the Project Monitoring and Control process area for more information about monitoring the project against the plan.
Peer reviews should address the following guidelines: there should be sufficient preparation, the conduct should be managed and controlled, consistent and sufficient data should be recorded (an example is conducting a formal inspection), and action items should be recorded.

Example Work Products
1. Peer review results
2. Peer review issues
3. Peer review data

HINT: During peer review training, train the staff in all of the roles necessary to conduct the peer reviews. These roles can vary from one peer review to the next.

Subpractices
1. Perform the assigned roles in the peer review.
2. Identify and document defects and other issues in the work product.
3. Record results of the peer review, including action items.
4. Collect peer review data.

Refer to the Measurement and Analysis process area for more information about obtaining measurement data.
5. Identify action items and communicate issues to relevant stakeholders.
6. Conduct an additional peer review if needed.
7. Ensure that the exit criteria for the peer review are satisfied.

TIP: Defects and issues are often recorded in a table with columns for such things as location of defect, defect description (including type and origin), discussion, and action items.


SP 2.3 ANALYZE PEER REVIEW DATA
Analyze data about the preparation, conduct, and results of the peer reviews.

TIP: The analysis can help answer questions such as what types of defects you are encountering, their severity, in which phases they are being injected and detected, the peer review “yield” (percent of defects detected), the review rate (pages per hour), and the cost (or hours expended) per defect found.

Refer to the Measurement and Analysis process area for more information about obtaining measurement data and analyzing measurement data.

Example Work Products
1. Peer review data
2. Peer review action items

Subpractices
1. Record data related to the preparation, conduct, and results of the peer reviews.
Typical data are product name, product size, composition of the peer review team, type of peer review, preparation time per reviewer, length of the review meeting, number of defects found, type and origin of defect, and so on. Additional information on the work product being peer reviewed can be collected, such as size, development stage, operating modes examined, and requirements being evaluated.
2. Store the data for future reference and analysis.

3. Protect the data to ensure that peer review data are not used inappropriately.

Examples of the inappropriate use of peer review data include using data to evaluate the performance of people and using data for attribution.

4. Analyze the peer review data.

Examples of peer review data that can be analyzed include the following:
• Phase defect was injected
• Preparation time or rate versus expected time or rate
• Number of defects versus number expected
• Types of defects detected
• Causes of defects
• Defect resolution impact
• User stories or case studies associated with a defect
• The end users and customers who are associated with defects
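As an illustrative sketch of this analysis, the records and derived measures below use hypothetical field names and numbers; they show how data captured in subpractice 1 can yield the review rate and defect density figures mentioned in the margin note on this practice.

```python
# Peer review records (illustrative): size, effort, and defects found.
reviews = [
    {"work_product": "design document v0.3", "size_pages": 24,
     "prep_hours_total": 6.0, "meeting_hours": 1.5, "defects_found": 18},
    {"work_product": "module X source code", "size_pages": 15,
     "prep_hours_total": 4.0, "meeting_hours": 1.0, "defects_found": 7},
]

for r in reviews:
    review_hours = r["prep_hours_total"] + r["meeting_hours"]
    rate = r["size_pages"] / review_hours            # pages reviewed per hour
    density = r["defects_found"] / r["size_pages"]   # defects found per page
    print(f"{r['work_product']}: {rate:.1f} pages/hour, {density:.2f} defects/page")
```

Comparing such figures with expected values (for example, preparation rate versus the organization's norm) supports the kinds of analyses listed above.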


SG 3 VERIFY SELECTED WORK PRODUCTS

Selected work products are verified against their specified requirements.

TIP: This goal should align with the planning and preparation activities addressed in SG 1.

Verification methods, procedures, and criteria are used to verify selected work products and associated maintenance, training, and support services using the appropriate verification environment. Verification activities should be performed throughout the product lifecycle. Practices related to peer reviews as a specific verification method are included in specific goal 2.

SP 3.1 PERFORM VERIFICATION
Perform verification on selected work products.

TIP: Performing peer reviews is addressed in SG 2; this specific practice addresses all other forms of verification.

TIP: Verification often means “testing.” However, verification methods may also include analyses, simulations, demonstrations, and formal methods.

X-REF: For more information, see the entries for “Software Testing,” “Computer Simulation,” and “Formal Methods” on the Wikipedia site, www.wikipedia.org.

Verifying products and work products incrementally promotes early detection of problems and can result in the early removal of defects. The results of verification save the considerable cost of fault isolation and rework associated with troubleshooting problems.

Example Work Products
1. Verification results
2. Verification reports
3. Demonstrations
4. As-run procedures log

TIP: When requirements change, it is necessary to determine the work products affected (REQM), rework them (RD, TS, and PI), and reverify them against the changed requirements. Thus, requirements volatility can significantly increase the cost of verification activities and can lead to schedule slips.

Subpractices
1. Perform the verification of selected work products against their requirements.
2. Record the results of verification activities.
3. Identify action items resulting from the verification of work products.
4. Document the “as-run” verification method and deviations from available methods and procedures discovered during its performance.

HINT: Sometimes, the verification procedure cannot be run as defined (e.g., incorrect assumptions were made as to the nature of the work product or verification environment). If so, record any deviations.

SP 3.2 ANALYZE VERIFICATION RESULTS
Analyze results of all verification activities.
Actual results should be compared to established verification criteria to determine acceptability.


The results of the analysis are recorded as evidence that verification was conducted.
For each work product, all available verification results are incrementally analyzed to ensure that requirements have been met. Since a peer review is one of several verification methods, peer review data should be included in this analysis activity to ensure that verification results are analyzed sufficiently.

TIP: Analysis helps to identify areas (or risks) on which to focus limited resources.

Analysis reports or “as-run” method documentation can also indicate that bad verification results are due to method problems, criteria problems, or a verification environment problem.

HINT: When piloting a new verification tool, analyze results to help identify ways to adjust the process or tool to increase the effectiveness of verification activities.

Example Work Products
1. Analysis report (e.g., statistics on performance, causal analysis of nonconformances, comparison of the behavior between the real product and models, trends)
2. Trouble reports
3. Change requests for verification methods, criteria, and the environment

Subpractices
1. Compare actual results to expected results.
2. Based on the established verification criteria, identify products that do not meet their requirements or identify problems with methods, procedures, criteria, and the verification environment.
3. Analyze defect data.
4. Record all results of the analysis in a report.
5. Use verification results to compare actual measurements and performance to technical performance parameters.
6. Provide information on how defects can be resolved (including verification methods, criteria, and verification environment) and initiate corrective action.
Refer to the Project Monitoring and Control process area for more information about taking corrective action.

TIP: The verification criteria established in SP 1.3 play an important role in determining where problems lie.

TIP: Technical performance parameters (see RD SG 3) help to monitor key characteristics of the emerging product.

PART THREE
The Appendices




APPENDIX A

REFERENCES

Ahern 2005 Ahern, Dennis M.; Armstrong, Jim; Clouse, Aaron; Ferguson, Jack R.; Hayes, Will; & Nidiffer, Kenneth E. CMMI SCAMPI Distilled: Appraisals for Process Improvement. Boston: Addison-Wesley, 2005. Ahern 2008 Ahern, Dennis M.; Clouse, Aaron; & Turner, Richard. CMMI Distilled: A Practical Introduction to Integrated Process Improvement, Third Edition. Boston: Addison-Wesley, 2008. Beck 2001 Beck, Kent et al. Manifesto for Agile Software Develop- ptg ment. 2001. agilemanifesto.org/. Chrissis 2011 Chrissis, Mary Beth; Konrad, Mike; & Shrum, Sandy. CMMI: Guidelines for Process Integration and Product Improvement, Third Edition. Boston: Addison-Wesley, 2011. Crosby 1979 Crosby, Philip B. Quality Is Free: The Art of Making Quality Certain. New York: McGraw-Hill, 1979. Curtis 2009 Curtis, Bill; Hefley, William E.; & Miller, Sally A. The People CMM: A Framework for Human Capital Management, Second Edition. Boston: Addison-Wesley, 2009. Deming 1986 Deming, W. Edwards. Out of the Crisis. Cambridge, MA: MIT Center for Advanced Engineering, 1986. DoD 1996 Department of Defense. DoD Guide to Integrated Product and Process Development (Version 1.0). Washington, DC: Office of the Under Secretary of Defense (Acquisition and Technology), February 5, 1996. https://www.acquisition.gov/sevensteps/library/dod-guide-to- integrated.pdf.

555

Wow! eBook 556 PART THREE THE APPENDICES

Dymond 2005 Dymond, Kenneth M. A Guide to the CMMI: Interpreting the Capability Maturity Model Integration, Second Edi- tion. Annapolis, MD: Process Transition International Inc., 2005. EIA 2002a Electronic Industries Alliance. Systems Engineering Capability Model (EIA/IS-731.1). Washington, DC, 2002. EIA 2002b Government Electronics and Information Technology Alliance. Earned Value Management Systems (ANSI/EIA-748). New York, NY, 2002. webstore.ansi.org/RecordDetail.aspx?sku=ANSI%2FEIA-748-B. EIA 2003 Electronic Industries Alliance. EIA Interim Standard: Systems Engineering (EIA/IS-632). Washington, DC, 2003. Forrester 2011 Forrester, Eileen; Buteau, Brandon; & Shrum, Sandy. CMMI for Services: Guidelines for Superior Service, 2nd Edition. Boston: Addison-Wesley, 2011. Gallagher 2011 Gallagher, Brian; Phillips, Mike; Richter, Karen; & Shrum, Sandy. CMMI-ACQ: Guidelines for Improving the Acquisi- tion of Products and Services, 2nd Edition. Boston: Addison-Wesley, 2011. GEIA 2004 Government Electronic Industries Alliance. Data Management (GEIA-859). Washington, DC, 2004. ptg webstore.ansi.org/RecordDetail.aspx?sku=ANSI%2FGEIA+859- 2009. Gibson 2006 Gibson, Diane L.; Goldenson, Dennis R. & Kost, Keith. Performance Results of CMMI-Based Process Improvement. (CMU/SEI-2006-TR-004, ESC-TR-2006-004). : Soft- ware Engineering Institute, Carnegie Mellon University, August 2006. www.sei.cmu.edu/library/abstracts/reports/06tr004.cfm. Glazer 2008 Glazer, Hillel; Dalton, Jeff; Anderson, David; Konrad, Mike; & Shrum, Sandy. CMMI or Agile: Why Not Embrace Both! (CMU/SEI-2008-TN-003). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, November 2008. www.sei.cmu.edu/library/abstracts/reports/08tn003.cfm. Humphrey 1989 Humphrey, Watts S. Managing the Software Process. Reading: Addison-Wesley, 1989. IEEE 1991 Institute of Electrical and Electronics Engineers. IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York: IEEE, 1991.

Wow! eBook Appendix A References 557

ISO 2005a International Organization for Standardization. ISO 9000: International Standard. 2005. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm? csnumber=42180. ISO 2005b International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 20000-1 Information Technology – Service Management, Part 1: Specification; ISO/IEC 20000-2 Information Technology – Service Management, Part 2: Code of Practice, 2005. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_ browse.htm?commid=45086. ISO 2006a International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 15504 Information Technology—Process Assessment Part 1: Concepts and Vocabulary, Part 2: Performing an Assessment, Part 3: Guidance on Performing an Assessment, Part 4: Guidance on Use for Process Improvement and Process Capability Determination, Part 5: An Exemplar Process Assessment Model, 2003–2006. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_ browse.htm?commid=45086. ptg ISO 2006b International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 14764 Soft- ware Engineering – Software Life Cycle Processes – Maintenance, 2006. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_ browse.htm?commid=45086. ISO 2007 International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 15939 Systems and Software Engineering—Measurement Process , 2007. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_ browse.htm?commid=45086. ISO 2008a International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 12207 Systems and Software Engineering—Software Life Cycle Processes, 2008. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_ tc_browse.htm?commid=45086. ISO 2008b International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 15288 Sys- tems and Software Engineering—System Life Cycle Processes, 2008.

Wow! eBook 558 PART THREE THE APPENDICES

www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_ browse.htm?commid=45086. ISO 2008c International Organization for Standardization. ISO 9001, Quality Management Systems—Requirements, 2008. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_ browse.htm?commid=53896. IT Governance 2005 IT Governance Institute. CobiT 4.0. Rolling Meadows, IL: IT Governance Institute, 2005. www.isaca.org/Content/NavigationMenu/Members_and_ Leaders/COBIT6/Obtain_COBIT/Obtain_COBIT.htm. Juran 1988 Juran, Joseph M. Juran on Planning for Quality. New York: Macmillan, 1988. McFeeley 1996 McFeeley, Robert. IDEAL: A User’s Guide for Soft- ware Process Improvement (CMU/SEI-96-HB-001, ADA305472). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, February 1996. www.sei.cmu.edu/library/abstracts/reports/96hb001.cfm. McGarry 2001 McGarry, John; Card, David; Jones, Cheryl; Layman, Beth; Clark, Elizabeth; Dean, Joseph; & Hall, Fred. Practical Software Measurement: Objective Information for Decision ptg Makers. Boston: Addison-Wesley, 2001. Office of Government Commerce 2007a Office of Government Commerce. ITIL: Continual Service Improvement. London: Office of Government Commerce, 2007. Office of Government Commerce 2007b Office of Government Commerce. ITIL: Service Design. London: Office of Government Commerce, 2007. Office of Government Commerce 2007c Office of Government Commerce. ITIL: Service Operation. London: Office of Govern- ment Commerce, 2007. Office of Government Commerce 2007d Office of Government Commerce. ITIL: Service Strategy. London: Office of Government Commerce, 2007. Office of Government Commerce 2007e Office of Government Commerce. ITIL: Service Transition. London: Office of Govern- ment Commerce, 2007. SEI 1995 Software Engineering Institute. The Capability Maturity Model: Guidelines for Improving the Software Process. Reading, MA: Addison-Wesley, 1995.


SEI 2002 Software Engineering Institute. Software Acquisition Capability Maturity Model (SA-CMM) Version 1.03 (CMU/SEI-2002-TR-010, ESC-TR-2002-010). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, March 2002. www.sei.cmu.edu/publications/documents/02.reports/02tr010.html.

SEI 2006 CMMI Product Team. CMMI for Development, Version 1.2 (CMU/SEI-2006-TR-008, ADA455858). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, August 2006. www.sei.cmu.edu/library/abstracts/reports/06tr008.cfm.

SEI 2010a CMMI Product Team. CMMI for Services, Version 1.3 (CMU/SEI-2010-TR-034). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, November 2010. www.sei.cmu.edu/library/abstracts/reports/10tr034.cfm.

SEI 2010b CMMI Product Team. CMMI for Acquisition, Version 1.3 (CMU/SEI-2010-TR-032). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, November 2010. www.sei.cmu.edu/library/abstracts/reports/10tr032.cfm.

SEI 2010c CMMI Product Team. CMMI for Development, Version 1.3 (CMU/SEI-2010-TR-033). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, November 2010. www.sei.cmu.edu/library/abstracts/reports/10tr033.cfm.

SEI 2010d Caralli, Richard; Allen, Julia; Curtis, Pamela; White, David; and Young, Lisa. CERT Resilience Management Model, Version 1.0 (CMU/SEI-2010-TR-012). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, May 2010. www.sei.cmu.edu/library/abstracts/reports/10tr012.cfm.

SEI 2011a SCAMPI Upgrade Team. Standard CMMI Appraisal Method for Process Improvement (SCAMPI) A, Version 1.3: Method Definition Document (CMU/SEI-2011-HB-001). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, expected January 2011. www.sei.cmu.edu/library/abstracts/reports/11hb001.cfm.

SEI 2011b SCAMPI Upgrade Team. Appraisal Requirements for CMMI, Version 1.3 (ARC, V1.3) (CMU/SEI-2011-TR-001). Pittsburgh: Software Engineering Institute, Carnegie Mellon University, expected January 2011. www.sei.cmu.edu/library/abstracts/reports/11tr0101.cfm.

Shewhart 1931 Shewhart, Walter A. Economic Control of Quality of Manufactured Product. New York: Van Nostrand, 1931.


Information Assurance/Information Security Related Sources

DHS 2009 Department of Homeland Security. Assurance Focus for CMMI (Summary of Assurance for CMMI Efforts), 2009. https://buildsecurityin.us-cert.gov/swa/proself_assm.html.

DoD and DHS 2008 Department of Defense and Department of Homeland Security. Software Assurance in Acquisition: Mitigating Risks to the Enterprise, 2008. https://buildsecurityin.us-cert.gov/swa/downloads/SwA_in_Acquisition_102208.pdf.

ISO/IEC 2005 International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques – Information Security Management Systems – Requirements, 2005. www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=42103.

NDIA 2008 NDIA System Assurance Committee. Engineering for System Assurance. Arlington, VA: NDIA, 2008. www.ndia.org/Divisions/Divisions/SystemsEngineering/Documents/Studies/SA-Guidebook-v1-Oct2008-REV.pdf.

APPENDIX B

ACRONYMS

ANSI American National Standards Institute
API application program interface
ARC Appraisal Requirements for CMMI
CAD computer-aided design
CAR Causal Analysis and Resolution (process area)
CCB configuration control board
CL capability level
CM Configuration Management (process area)
CMU Carnegie Mellon University
CMF CMMI Model Foundation
CMM Capability Maturity Model
CMMI Capability Maturity Model Integration
CMMI-ACQ CMMI for Acquisition
CMMI-DEV CMMI for Development
CMMI-SVC CMMI for Services
CobiT Control Objectives for Information and related Technology
COTS commercial off-the-shelf
CPI cost performance index
CPM critical path method
CSCI computer software configuration item
DAR Decision Analysis and Resolution (process area)
DHS Department of Homeland Security
DoD Department of Defense
EIA Electronic Industries Alliance
EIA/IS Electronic Industries Alliance/Interim Standard
FCA functional configuration audit
FMEA failure mode and effects analysis
GG generic goal
GP generic practice



IBM International Business Machines
IDEAL Initiating, Diagnosing, Establishing, Acting, Learning
IEEE Institute of Electrical and Electronics Engineers
INCOSE International Council on Systems Engineering
IPD-CMM Integrated Product Development Capability Maturity Model
IPM Integrated Project Management (process area)
ISO International Organization for Standardization
ISO/IEC International Organization for Standardization and International Electrotechnical Commission
ITIL Information Technology Infrastructure Library
MA Measurement and Analysis (process area)
MDD Method Definition Document
ML maturity level
NDIA National Defense Industrial Association
OID Organizational Innovation and Deployment (former process area)
OPD Organizational Process Definition (process area)
OPF Organizational Process Focus (process area)
OPM Organizational Performance Management (process area)
OPP Organizational Process Performance (process area)
OT Organizational Training (process area)
P-CMM People Capability Maturity Model
PCA physical configuration audit
PERT Program Evaluation and Review Technique
PI Product Integration (process area)
PMC Project Monitoring and Control (process area)
PP Project Planning (process area)
PPQA Process and Product Quality Assurance (process area)
QFD Quality Function Deployment
QPM Quantitative Project Management (process area)
RD Requirements Development (process area)
REQM Requirements Management (process area)
RSKM Risk Management (process area)
SA-CMM Software Acquisition Capability Maturity Model
SAM Supplier Agreement Management (process area)
SCAMPI Standard CMMI Appraisal Method for Process Improvement
SECAM Systems Engineering Capability Assessment Model
SECM Systems Engineering Capability Model
SEI Software Engineering Institute
SG specific goal


SP specific practice
SPI schedule performance index
SSD Service System Development (process area in CMMI-SVC)
SSE-CMM Systems Security Engineering Capability Maturity Model
SW-CMM Capability Maturity Model for Software or Software Capability Maturity Model
TS Technical Solution (process area)
VAL Validation (process area)
VER Verification (process area)
WBS work breakdown structure




APPENDIX C

CMMI VERSION 1.3 PROJECT PARTICIPANTS

Many talented people were part of the product team that developed CMMI Version 1.3 models. Listed here are those who participated in one or more of the following teams during the development of CMMI Version 1.3. The organizations listed by members’ names are those they represented at the time of their team membership. The following are the primary groups involved in the development of this model:

• CMMI Steering Group
• CMMI for Services Advisory Group
• CMMI V1.3 Coordination Team
• CMMI V1.3 Configuration Control Board
• CMMI V1.3 Core Model Team
• CMMI V1.3 Translation Team
• CMMI V1.3 High Maturity Team
• CMMI V1.3 Acquisition Mini Team
• CMMI V1.3 Services Mini Team
• CMMI V1.3 SCAMPI Upgrade Team
• CMMI V1.3 Training Teams
• CMMI V1.3 Quality Team

CMMI Steering Group
The CMMI Steering Group guides and approves the plans of the CMMI Product Team, provides consultation on significant CMMI project issues, ensures involvement from a variety of interested communities, and approves the final release of the model.


Steering Group Members
Alan Bemish, U.S. Air Force
Anita Carleton, Software Engineering Institute
Clyde Chittister, Software Engineering Institute
James Gill, Boeing Integrated Defense Systems
John C. Kelly, NASA
Kathryn Lundeen, Defense Contract Management Agency
Larry McCarthy, Motorola
Lawrence Osiecki, U.S. Army
Robert Rassa, Raytheon Space and Airborne Systems (lead)
Karen Richter, Institute for Defense Analyses
Joan Weszka, Lockheed Martin Corporation
Harold Wilson, Northrop Grumman Corporation
Brenda Zettervall, U.S. Navy

Ex-Officio Steering Group Members
Mike Konrad, Software Engineering Institute
Susan LaFortune, National Security Agency
David (Mike) Phillips, Software Engineering Institute

Steering Group Support
Mary Beth Chrissis, Software Engineering Institute (CCB)
Eric Hayes, Software Engineering Institute (secretary)
Rawdon Young, Software Engineering Institute (Appraisal program)

CMMI for Services Advisory Group
The Services Advisory Group provides advice to the product development team about service industries.
Brandon Buteau, Northrop Grumman Corporation
Christian Carmody, University of Pittsburgh Medical Center
Sandra Cepeda, Cepeda Systems & Software Analysis/RDECOM SED
Annie Combelles, DNV IT Global Services
Jeff Dutton, Jacobs Technology, Inc.
Eileen Forrester, Software Engineering Institute
Craig Hollenbach, Northrop Grumman Corporation (lead)
Bradley Nelson, Department of Defense


Lawrence Osiecki, U.S. Army ARDEC
David (Mike) Phillips, Software Engineering Institute
Timothy Salerno, Lockheed Martin Corporation
Sandy Shrum, Software Engineering Institute
Nidhi Srivastava, Tata Consultancy Services
Elizabeth Sumpter, NSA
David Swidorsky, Bank of America

CMMI V1.3 Coordination Team
The Coordination Team brings together members of other product development teams to ensure coordination across the project.
Rhonda Brown, Software Engineering Institute
Mary Beth Chrissis, Software Engineering Institute
Eileen Forrester, Software Engineering Institute
Will Hayes, Software Engineering Institute
Mike Konrad, Software Engineering Institute
So Norimatsu, Norimatsu Process Engineering Lab, Inc.
Mary Lynn Penn, Lockheed Martin Corporation
David (Mike) Phillips, Software Engineering Institute (lead)
Mary Lynn Russo, Software Engineering Institute (nonvoting member)
Sandy Shrum, Software Engineering Institute
Kathy Smith, Hewlett-Packard
Barbara Tyson, Software Engineering Institute
Rawdon Young, Software Engineering Institute

CMMI V1.3 Configuration Control Board
The Configuration Control Board approves all changes to CMMI materials, including the models, the SCAMPI MDD, and introductory model training.
Rhonda Brown, Software Engineering Institute (nonvoting member)
Michael Campo, Raytheon
Mary Beth Chrissis, Software Engineering Institute (chair)
Kirsten Dauplaise, NAVAIR
Mike Evanoo, Systems and Software Consortium, Inc.
Rich Frost, General Motors


Brian Gallagher, Northrop Grumman Corporation
Sally Godfrey, NASA
Stephen Gristock, JP Morgan Chase and Co.
Eric Hayes, Software Engineering Institute (nonvoting member)
Nils Jacobsen, Motorola
Steve Kapurch, NASA
Mike Konrad, Software Engineering Institute
Chris Moore, U.S. Air Force
Wendell Mullison, General Dynamics Land Systems
David (Mike) Phillips, Software Engineering Institute
Robert Rassa, Raytheon Space and Airborne Systems
Karen Richter, Institute for Defense Analyses
Mary Lou Russo, Software Engineering Institute (nonvoting member)
Warren Schwoemeyer, Lockheed Martin Corporation
John Scibilia, U.S. Army
Dave Swidorsky, Bank of America
Barbara Tyson, Software Engineering Institute
Mary Van Tyne, Software Engineering Institute (nonvoting member)
Rawdon Young, Software Engineering Institute

CMMI V1.3 Core Model Team
The Core Model Team develops the model material for all three constellations.
Jim Armstrong, Stevens Institute of Technology
Rhonda Brown, Software Engineering Institute (co-lead)
Brandon Buteau, Northrop Grumman Corporation
Michael Campo, Raytheon
Sandra Cepeda, Cepeda Systems & Software Analysis/RDECOM SED
Mary Beth Chrissis, Software Engineering Institute
Mike D’Ambrosa, Process Performance Professionals
Eileen Forrester, Software Engineering Institute
Will Hayes, Software Engineering Institute
Mike Konrad, Software Engineering Institute (co-lead)
So Norimatsu, Norimatsu Process Engineering Lab, Inc.
Mary Lynn Penn, Lockheed Martin Corporation
David (Mike) Phillips, Software Engineering Institute


Karen Richter, Institute for Defense Analyses
Mary Lynn Russo, Software Engineering Institute (nonvoting member)
John Scibilia, U.S. Army
Sandy Shrum, Software Engineering Institute (co-lead)
Kathy Smith, Hewlett-Packard
Katie Smith-McGarty, U.S. Navy

CMMI V1.3 Translation Team
The Translation Team coordinates translation work on CMMI materials.
Richard Basque, Alcyonix
Jose Antonio Calvo-Manzano, Universidad Politecnica de Madrid
Carlos Caram, Integrated Systems Diagnostics Brazil
Gonzalo Cuevas, Universidad Politecnica de Madrid
Mike Konrad, Software Engineering Institute
Antoine Nardeze, Alcyonix
So Norimatsu, Norimatsu Process Engineering Lab, Inc. (lead)
Seven Ou, Institute for Information Industry
Ricardo Panero Lamothe, Accenture
Mary Lynn Russo, Software Engineering Institute (nonvoting member)
Winfried Russwurm, Siemens AG
Tomas San Feliu, Universidad Politecnica de Madrid

CMMI V1.3 High Maturity Team
The High Maturity Team develops high maturity model material.
Dan Bennett, U.S. Air Force
Will Hayes, Software Engineering Institute
Rick Hefner, Northrop Grumman Corporation
Jim Kubeck, Lockheed Martin Corporation
Alice Parry, Raytheon
Mary Lynn Penn, Lockheed Martin Corporation (lead)
Kathy Smith, Hewlett-Packard
Rawdon Young, Software Engineering Institute


CMMI V1.3 Acquisition Mini Team
The Acquisition Mini Team provides acquisition expertise for model development work.
Rich Frost, General Motors
Tom Keuten, Keuten and Associates
David (Mike) Phillips, Software Engineering Institute (lead)
Karen Richter, Institute for Defense Analyses
John Scibilia, U.S. Army

CMMI V1.3 Services Mini Team
The Services Mini Team provides service expertise for model development work.
Drew Allison, Systems and Software Consortium, Inc.
Brandon Buteau, Northrop Grumman Corporation
Eileen Forrester, Software Engineering Institute (lead)
Christian Hertneck, Anywhere.24 GmbH
Pam Schoppert, Science Applications International Corporation

CMMI V1.3 SCAMPI Upgrade Team
The SCAMPI Upgrade Team develops the Appraisal Requirements for CMMI (ARC) document and SCAMPI Method Definition Document (MDD).
Mary Busby, Lockheed Martin Corporation
Palma Buttles-Valdez, Software Engineering Institute
Paul Byrnes, Integrated System Diagnostics
Will Hayes, Software Engineering Institute (lead)
Ravi Khetan, Northrop Grumman Corporation
Denise Kirkham, The Boeing Company
Lisa Ming, BAE Systems
Charlie Ryan, Software Engineering Institute
Kevin Schaaff, Software Engineering Institute
Alexander Stall, Software Engineering Institute
Agapi Svolou, Software Engineering Institute
Ron Ulrich, Northrop Grumman Corporation


CMMI Version 1.3 Training Teams
The two training teams (one for CMMI-DEV and CMMI-ACQ and the other for CMMI-SVC) develop model training materials.

ACQ and DEV Training Team
Barbara Baldwin, Software Engineering Institute
Bonnie Bollinger, Process Focus Management
Cat Brandt-Zaccardi, Software Engineering Institute
Rhonda Brown, Software Engineering Institute
Michael Campo, Raytheon
Mary Beth Chrissis, Software Engineering Institute (lead)
Stacey Cope, Software Engineering Institute
Eric Dorsett, Jeppesen
Dan Foster, PF Williamson
Eric Hayes, Software Engineering Institute
Kurt Hess, Software Engineering Institute
Mike Konrad, Software Engineering Institute

Steve Masters, Software Engineering Institute
Robert McFeeley, Software Engineering Institute
Diane Mizukami-Williams, Northrop Grumman Corporation
Daniel Pipitone, Software Engineering Institute
Mary Lou Russo, Software Engineering Institute (nonvoting member)
Sandy Shrum, Software Engineering Institute
Katie Smith-McGarty, U.S. Navy
Barbara Tyson, Software Engineering Institute

SVC Training Team
Drew Allison, Systems and Software Consortium, Inc.
Mike Bridges, University of Pittsburgh Medical Center
Paul Byrnes, Integrated System Diagnostics
Sandra Cepeda, Cepeda Systems & Software Analysis/RDECOM SED
Eileen Clark, Tidewaters Consulting
Kieran Doyle, Excellence in Measurement
Eileen Forrester, Software Engineering Institute (lead of SVC training)
Hillel Glazer, Entinex
Christian Hertneck, Anywhere.24 GmbH
Pat Kirwan, Software Engineering Institute


Suzanne Miller, Software Engineering Institute
Judah Mogilensky, PEP
Heather Oppenheimer, Oppenheimer Partners
Pat O’Toole, PACT
Agapi Svolou, Alexanna
Jeff Welch, Software Engineering Institute

CMMI V1.3 Quality Team
The Quality Team conducts various quality assurance checks on the model material to ensure its accuracy, readability, and consistency.
Rhonda Brown, Software Engineering Institute (co-lead)
Erin Harper, Software Engineering Institute
Mike Konrad, Software Engineering Institute
Mary Lou Russo, Software Engineering Institute
Mary Lynn Russo, Software Engineering Institute
Sandy Shrum, Software Engineering Institute (co-lead)


APPENDIX D

GLOSSARY

The glossary defines the basic terms used in CMMI models. Glossary entries are typically multiple-word terms consisting of a noun and one or more restrictive modifiers. (There are some exceptions to this rule that account for one-word terms in the glossary.)

The CMMI glossary of terms is not a required, expected, or informative component of CMMI models. Interpret the terms in the glossary in the context of the model component in which they appear.

To formulate definitions appropriate for CMMI, we consulted multiple sources. We first consulted the Merriam-Webster Online dictionary (www.merriam-webster.com/). We also consulted other standards as needed, including the following:

• ISO 9000 [ISO 2005a]
• ISO/IEC 12207 [ISO 2008a]
• ISO/IEC 15504 [ISO 2006a]
• ISO/IEC 15288 [ISO 2008b]
• ISO/IEC 15939 [ISO 2007]
• ISO 20000-1 [ISO 2005b]
• IEEE [IEEE 1991]
• CMM for Software (SW-CMM) V1.1
• EIA 632 [EIA 2003]
• SA-CMM [SEI 2002]
• People CMM (P-CMM) [Curtis 2009]
• CobiT v. 4.0 [IT Governance 2005]
• ITIL v3 (Service Improvement, Service Design, Service Operation, Service Strategy, and Service Transition) [Office of Government Commerce 2007]



We developed the glossary recognizing the importance of using terminology that all model users can understand. We also recognized that words and terms can have different meanings in different contexts and environments. The glossary in CMMI models is designed to document the meanings of words and terms that should have the widest use and understanding by users of CMMI products.

Even though the term “product” includes services as well as products and the term “service” is defined as a type of product, many of the terms in the glossary contain both the words “product” and “service” to emphasize that CMMI applies to both products and services.

Every glossary entry has two to three components. There is always a term and always a definition. Sometimes additional notes are provided.

The term defined is listed on the left side of the page. The definition appears first in a type size similar to the term listed. Glossary notes follow the definition and are in a smaller type size.

acceptance criteria The criteria that a deliverable must satisfy to be accepted by a user, customer, or other authorized entity. (See also “deliverable.”) Refer to the Decision Analysis and Resolution process area for more information about analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

acceptance testing Formal testing conducted to enable a user, customer, or other authorized entity to determine whether to accept a deliverable. (See also “unit testing.”)

achievement profile A list of process areas and their corresponding capability levels that represent the organization’s progress for each process area while advancing through the capability levels. (See also “capability level profile,” “target profile,” and “target staging.”)

acquirer The stakeholder that acquires or procures a product or service from a supplier. (See also “stakeholder.”)

acquisition The process of obtaining products or services through supplier agreements. (See also “supplier agreement.”)

acquisition strategy The specific approach to acquiring products and services that is based on considerations of supply sources, acquisition methods, requirements specification types, agreement types, and related acquisition risks.

addition A clearly marked model component that contains information of interest to particular users.


In a CMMI model, all additions bearing the same name can be optionally selected as a group for use. In CMMI for Services, the Service System Development (SSD) process area is an addition. allocated requirement Requirement that results from levying all or part of a higher level requirement on a lower level architectural element or design component. More generally, requirements can be allocated to other logical or physical components including people, consumables, delivery increments, or the architecture as a whole, depending on what best enables the product or serv- ice to achieve the requirements. appraisal An examination of one or more processes by a trained team of professionals using an appraisal reference model as the basis for determining, at a minimum, strengths and weaknesses. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning. appraisal findings The results of an appraisal that identify the most important issues, problems, or opportunities for process improvement within the appraisal scope. Appraisal findings are inferences drawn from corroborated objective evidence. appraisal participants Members of the organizational unit who ptg participate in providing information during an appraisal. appraisal rating The value assigned by an appraisal team to (a) a CMMI goal or process area, (b) the capability level of a process area, or (c) the maturity level of an organizational unit. This term is used in CMMI appraisal materials such as the SCAMPI MDD. A rating is determined by enacting the defined rating process for the appraisal method being employed. appraisal reference model The CMMI model to which an appraisal team correlates implemented process activities. This term is used in CMMI appraisal materials such as the SCAMPI MDD. appraisal scope The definition of the boundaries of an appraisal encompassing the organizational limits and CMMI model limits within which the processes to be investigated operate. This term is used in CMMI appraisal materials such as the SCAMPI MDD. architecture The set of structures needed to reason about a prod- uct. These structures are comprised of elements, relations among them, and properties of both. In a service context, the architecture is often applied to the service system. Note that functionality is only one aspect of the product. Quality attributes, such as responsiveness, reliability, and security, are also important to reason


about. Structures provide the means for highlighting different portions of the architecture. (See also “functional architecture.”) audit An objective examination of a work product or set of work products against specific criteria (e.g., requirements). (See also “objectively evaluate.”) This is a term used in several ways in CMMI, including configuration audits and process compliance audits. baseline A set of specifications or work products that has been formally reviewed and agreed on, which thereafter serves as the basis for further development, and which can be changed only through change control procedures. (See also “configuration baseline” and “product baseline.”) base measure Measure defined in terms of an attribute and the method for quantifying it. (See also “derived measure.”) A base measure is functionally independent of other measures. bidirectional traceability An association among two or more logi- cal entities that is discernable in either direction (i.e., to and from an entity). (See also “requirements traceability” and “traceability.”) business objectives (See “organization’s business objectives.”) ptg capability level Achievement of process improvement within an individual process area. (See also “generic goal,” “specific goal,” “maturity level,” and “process area.”) A capability level is defined by appropriate specific and generic goals for a process area. capability level profile A list of process areas and their correspond- ing capability levels. (See also “achievement profile,” “target pro- file,” and “target staging.”) A capability level profile can be an “achievement profile” when it represents the organization’s progress for each process area while advancing through the capability levels. Or, it can be a “target profile” when it represents an objective for process improvement. capability maturity model A model that contains the essential ele- ments of effective processes for one or more areas of interest and describes an evolutionary improvement path from ad hoc, imma- ture processes to disciplined, mature processes with improved quality and effectiveness. capable process A process that can satisfy its specified product quality, service quality, and process performance objectives. (See also “stable process” and “standard process.”)

causal analysis The analysis of outcomes to determine their causes.

change management Judicious use of means to effect a change, or a proposed change, to a product or service. (See also “configuration management.”)

CMMI Framework The basic structure that organizes CMMI components, including elements of current CMMI models as well as rules and methods for generating models, appraisal methods (including associated artifacts), and training materials. (See also “CMMI model” and “CMMI Product Suite.”) The framework enables new areas of interest to be added to CMMI so that they will integrate with the existing ones.

CMMI model A model generated from the CMMI Framework. (See also “CMMI Framework” and “CMMI Product Suite.”)

CMMI model component Any of the main architectural elements that compose a CMMI model. Some of the main elements of a CMMI model include specific practices, generic practices, specific goals, generic goals, process areas, capability levels, and maturity levels.

CMMI Product Suite The complete set of products developed around the CMMI concept. (See also “CMMI Framework” and “CMMI model.”) These products include the framework itself, models, appraisal methods, appraisal materials, and training materials.

commercial off-the-shelf Items that can be purchased from a commercial supplier.

common cause of variation The variation of a process that exists because of normal and expected interactions among components of a process. (See also “special cause of variation.”)

configuration audit An audit conducted to verify that a configuration item or a collection of configuration items that make up a baseline conforms to a specified standard or requirement. (See also “audit” and “configuration item.”)

configuration baseline The configuration information formally designated at a specific time during a product’s or product component’s life. (See also “product lifecycle.”) Configuration baselines plus approved changes from those baselines constitute the current configuration information.

configuration control An element of configuration management consisting of the evaluation, coordination, approval or disapproval, and implementation of changes to configuration items


after formal establishment of their configuration identification. (See also “configuration identification,” “configuration item,” and “configuration management.”)

configuration control board A group of people responsible for evaluating and approving or disapproving proposed changes to configuration items and for ensuring implementation of approved changes. (See also “configuration item.”) Configuration control boards are also known as “change control boards.”

configuration identification An element of configuration management consisting of selecting the configuration items for a product, assigning unique identifiers to them, and recording their functional and physical characteristics in technical documentation. (See also “configuration item,” “configuration management,” and “product.”)

configuration item An aggregation of work products that is designated for configuration management and treated as a single entity in the configuration management process. (See also “configuration management.”)

configuration management A discipline applying technical and administrative direction and surveillance to (1) identify and document the functional and physical characteristics of a configuration item, (2) control changes to those characteristics, (3) record and report change processing and implementation status, and (4) verify compliance with specified requirements. (See also “configuration audit,” “configuration control,” “configuration identification,” and “configuration status accounting.”)

configuration status accounting An element of configuration management consisting of the recording and reporting of information needed to manage a configuration effectively. (See also “configuration identification” and “configuration management.”) This information includes a list of the approved configuration, the status of proposed changes to the configuration, and the implementation status of approved changes.

constellation A collection of CMMI components that are used to construct models, training materials, and appraisal related documents for an area of interest (e.g., acquisition, development, services).

continuous representation A capability maturity model structure wherein capability levels provide a recommended order for approaching process improvement within each specified process


area. (See also “capability level,” “process area,” and “staged representation.”)

contractor (See “supplier.”)

contractual requirements The result of the analysis and refinement of customer requirements into a set of requirements suitable to be included in one or more solicitation packages, or supplier agreements. (See also “acquirer,” “customer requirement,” “supplier agreement,” and “solicitation package.”) Contractual requirements include both technical and nontechnical requirements necessary for the acquisition of a product or service.

corrective action Acts or deeds used to remedy a situation or remove an error.

customer The party responsible for accepting the product or for authorizing payment. The customer is external to the project or work group (except possibly in certain project structures in which the customer effectively is on the project team or in the work group) but not necessarily external to the organization. The customer can be a higher level project or work group. Customers are a subset of stakeholders. (See also “stakeholder.”) In most cases where this term is used, the preceding definition is intended; however, in some contexts, the term “customer” is intended to include other relevant stakeholders. (See also “customer requirement.”) End users can be distinguished from customers if the parties that directly receive the value of products and services are not the same as the parties that arrange for, pay for, or negotiate agreements. In contexts where customers and end users are essentially the same parties, the term “customer” can encompass both types. (See also “end user.”)

customer requirement The result of eliciting, consolidating, and resolving conflicts among the needs, expectations, constraints, and interfaces of the product’s relevant stakeholders in a way that is acceptable to the customer. (See also “customer.”)

data Recorded information. Recorded information can include technical data, computer software documents, financial information, management information, representation of facts, numbers, or datum of any nature that can be communicated, stored, and processed.

data management The disciplined processes and systems that plan for, acquire, and provide stewardship for business and technical data, consistent with data requirements, throughout the data lifecycle.

defect density Number of defects per unit of product size. An example is the number of problem reports per thousand lines of code.
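For illustration only (this worked example is not part of the CMMI glossary text, and the figures are hypothetical), the defect density example above can be written as a simple calculation:

\[
\text{defect density} \;=\; \frac{\text{number of defects}}{\text{product size}},
\qquad\text{e.g.,}\quad
\frac{45\ \text{problem reports}}{30\ \text{thousand lines of code}} \;=\; 1.5\ \text{defects per thousand lines of code}.
\]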


defined process A managed process that is tailored from the orga- nization’s set of standard processes according to the organiza- tion’s tailoring guidelines; has a maintained process description; and contributes process related experiences to the organizational process assets. (See also “managed process.”) definition of required functionality and quality attributes A char- acterization of required functionality and quality attributes obtained through “chunking,” organizing, annotating, structuring, or formalizing the requirements (functional and non-functional) to facilitate further refinement and reasoning about the require- ments as well as (possibly, initial) solution exploration, defini- tion, and evaluation. (See also “architecture,” “functional architecture,” and “quality attribute.”) As technical solution processes progress, this characterization can be fur- ther evolved into a description of the architecture versus simply helping scope and guide its development, depending on the engineering processes used; requirements specification and architectural languages used; and the tools and the environment used for product or service system development. deliverable An item to be provided to an acquirer or other desig- nated recipient as specified in an agreement. (See also “acquirer.”) ptg This item can be a document, hardware item, software item, service, or any type of work product. delivery environment The complete set of circumstances and con- ditions under which services are delivered in accordance with service agreements. (See also “service” and “service agreement.”) The delivery environment encompasses everything that has or can have a significant effect on service delivery, including but not limited to service sys- tem operation, natural phenomena, and the behavior of all parties, whether or not they intend to have such an effect. For example, consider the effect of weather or traffic patterns on a transportation service. (See also “service system.”) The delivery environment is uniquely distinguished from other environ- ments (e.g., simulation environments, testing environments). The delivery environment is the one in which services are actually delivered and count as satisfying a service agreement. derived measure Measure that is defined as a function of two or more values of base measures. (See also “base measure.”) derived requirements Requirements that are not explicitly stated in customer requirements but are inferred (1) from contextual requirements (e.g., applicable standards, laws, policies, common


practices, management decisions) or (2) from requirements needed to specify a product or service component. Derived requirements can also arise during analysis and design of compo- nents of the product or service. (See also “product requirements.”) design review A formal, documented, comprehensive, and system- atic examination of a design to determine if the design meets the applicable requirements, to identify problems, and to propose solutions. development To create a product or service system by deliberate effort. In some contexts, development can include the maintenance of the developed product. document A collection of data, regardless of the medium on which it is recorded, that generally has permanence and can be read by humans or machines. Documents include both paper and electronic documents. end user A party that ultimately uses a delivered product or that receives the benefit of a delivered service. (See also “customer.”) End users may or may not also be customers (who can establish and accept agreements or authorize payments). ptg In contexts where a single service agreement covers multiple service deliver- ies, any party that initiates a service request can be considered an end user. (See also “service agreement” and “service request.”) enterprise The full composition of a company. (See also “organization.”) A company can consist of many organizations in many locations with dif- ferent customers. entry criteria States of being that must be present before an effort can begin successfully. equivalent staging A target staging, created using the continuous representation that is defined so that the results of using the tar- get staging can be compared to maturity levels of the staged rep- resentation. (See also “capability level profile,” “maturity level,” “target profile,” and “target staging.”) Such staging permits benchmarking of progress among organizations, enterprises, projects, and work groups, regardless of the CMMI representa- tion used. The organization can implement components of CMMI models beyond the ones reported as part of equivalent staging. Equivalent staging relates how the organization compares to other organizations in terms of maturity levels.


establish and maintain Create, document, use, and revise work products as necessary to ensure they remain useful. The phrase “establish and maintain” plays a special role in communicating a deeper principle in CMMI: work products that have a central or key role in work group, project, and organizational performance should be given attention to ensure they are used and useful in that role. This phrase has particular significance in CMMI because it often appears in goal and practice statements (though in the former as “established and maintained”) and should be taken as shorthand for applying the principle to whatever work product is the object of the phrase. example work product An informative model component that provides sample outputs from a specific practice. executive (See “senior manager.”) exit criteria States of being that must be present before an effort can end successfully. expected CMMI components CMMI components that describe the activities that are important in achieving a required CMMI component. Model users can implement the expected components explicitly or imple- ment equivalent practices to these components. Specific and generic prac- ptg tices are expected model components. findings (See “appraisal findings.”) formal evaluation process A structured approach to evaluating alternative solutions against established criteria to determine a recommended solution to address an issue. framework (See “CMMI Framework.”) functional analysis Examination of a defined function to identify all the subfunctions necessary to accomplish that function; iden- tification of functional relationships and interfaces (internal and external) and capturing these relationships and interfaces in a functional architecture; and flow down of upper level require- ments and assignment of these requirements to lower level sub- functions. (See also “functional architecture.”) functional architecture The hierarchical arrangement of functions, their internal and external (external to the aggregation itself) functional interfaces and external physical interfaces, their respective requirements, and their design constraints. (See also “architecture,” “functional analysis,” and “definition of required functionality and quality attributes.”)

generic goal A required model component that describes characteristics that must be present to institutionalize processes that implement a process area. (See also “institutionalization.”)

generic practice An expected model component that is considered important in achieving the associated generic goal. The generic practices associated with a generic goal describe the activities that are expected to result in achievement of the generic goal and contribute to the institutionalization of the processes associated with a process area.

generic practice elaboration An informative model component that appears after a generic practice to provide guidance on how the generic practice could be applied uniquely to a process area. (This model component is not present in all CMMI models.)

hardware engineering The application of a systematic, disciplined, and quantifiable approach to transforming a set of requirements that represent the collection of stakeholder needs, expectations, and constraints, using documented techniques and technology to design, implement, and maintain a tangible product. (See also “software engineering” and “systems engineering.”) In CMMI, hardware engineering represents all technical fields (e.g., electrical, mechanical) that transform requirements and ideas into tangible products.

higher level management The person or persons who provide the policy and overall guidance for the process but do not provide the direct day-to-day monitoring and controlling of the process. (See also “senior manager.”) Such persons belong to a level of management in the organization above the immediate level responsible for the process and can be (but are not necessarily) senior managers.

incomplete process A process that is not performed or is performed only partially; one or more of the specific goals of the process area are not satisfied. An incomplete process is also known as capability level 0.

informative CMMI components CMMI components that help model users understand the required and expected components of a model. These components can be examples, detailed explanations, or other helpful information. Subpractices, notes, references, goal titles, practice titles, sources, example work products, and generic practice elaborations are informative model components.

institutionalization The ingrained way of doing business that an organization follows routinely as part of its corporate culture.


interface control In configuration management, the process of (1) identifying all functional and physical characteristics relevant to the interfacing of two or more configuration items provided by one or more organizations and (2) ensuring that proposed changes to these characteristics are evaluated and approved prior to implementation. (See also “configuration item” and “configuration management.”)

lifecycle model A partitioning of the life of a product, service, project, work group, or set of work activities into phases.

managed process A performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description. (See also “performed process.”)

manager A person who provides technical and administrative direction and control to those who perform tasks or activities within the manager’s area of responsibility. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning. The traditional functions of a manager include planning, organizing, directing, and controlling work within an area of responsibility.

maturity level Degree of process improvement across a predefined set of process areas in which all goals in the set are attained. (See also “capability level” and “process area.”)

measure (noun) Variable to which a value is assigned as a result of measurement. (See also “base measure,” “derived measure,” and “measurement.”) The definition of this term in CMMI is consistent with the definition of this term in ISO 15939.

measurement A set of operations to determine the value of a measure. (See also “measure.”) The definition of this term in CMMI is consistent with the definition of this term in ISO 15939.

measurement result A value determined by performing a measurement. (See also “measurement.”)

memorandum of agreement Binding document of understanding or agreement between two or more parties. A memorandum of agreement is also known as a “memorandum of understanding.”

natural bounds The inherent range of variation in a process, as determined by process performance measures. Natural bounds are sometimes referred to as “voice of the process.” Techniques such as control charts, confidence intervals, and prediction intervals are used to determine whether the variation is due to common causes (i.e., the process is predictable or stable) or is due to some special cause that can and should be identified and removed. (See also “measure” and “process performance.”)

nondevelopmental item An item that was developed prior to its current use in an acquisition or development process. Such an item can require minor modifications to meet the requirements of its current intended use.

nontechnical requirements Requirements affecting product and service acquisition or development that are not properties of the product or service. Examples include numbers of products or services to be delivered, data rights for delivered COTS and nondevelopmental items, delivery dates, and milestones with exit criteria. Other nontechnical requirements include work constraints associated with training, site provisions, and deployment schedules.

objectively evaluate To review activities and work products against criteria that minimize subjectivity and bias by the reviewer. (See also “audit.”) An example of an objective evaluation is an audit against requirements, standards, or procedures by an independent quality assurance function.

operational concept A general description of the way in which an entity is used or operates. An operational concept is also known as “concept of operations.”

operational scenario A description of an imagined sequence of events that includes the interaction of the product or service with its environment and users, as well as interaction among its product or service components. Operational scenarios are used to evaluate the requirements and design of the system and to verify and validate the system.

organization An administrative structure in which people collectively manage one or more projects or work groups as a whole, share a senior manager, and operate under the same policies. However, the word “organization” as used throughout CMMI models can also apply to one person who performs a function in a small organization that might be performed by a group of people in a large organization. (See also “enterprise.”)


organizational maturity The extent to which an organization has explicitly and consistently deployed processes that are documented, managed, measured, controlled, and continually improved. Organizational maturity can be measured via appraisals. organizational policy A guiding principle typically established by senior management that is adopted by an organization to influ- ence and determine decisions. organizational process assets Artifacts that relate to describing, implementing, and improving processes. Examples of these artifacts include policies, measurement descriptions, process descriptions, process implementation support tools. The term “process assets” is used to indicate that these artifacts are devel- oped or acquired to meet the business objectives of the organization and that they represent investments by the organization that are expected to provide current and future business value. (See also “process asset library.”) organization’s business objectives Senior-management-developed objectives designed to ensure an organization’s continued exis- tence and enhance its profitability, market share, and other fac- tors influencing the organization’s success. (See also “quality and process performance objectives” and “quantitative objective.”) ptg organization’s measurement repository A repository used to col- lect and make measurement results available on processes and work products, particularly as they relate to the organization’s set of standard processes. This repository contains or references actual measurement results and related information needed to understand and analyze measurement results. organization’s process asset library A library of information used to store and make process assets available that are useful to those who are defining, implementing, and managing processes in the organization. This library contains process assets that include process related documenta- tion such as policies, defined processes, checklists, lessons learned docu- ments, templates, standards, procedures, plans, and training materials. organization’s set of standard processes A collection of defini- tions of the processes that guide activities in an organization. These process descriptions cover the fundamental process elements (and their relationships to each other such as ordering and interfaces) that should be incorporated into the defined processes that are implemented in projects, work groups, and work across the organization. A standard process enables consistent development and maintenance activities across the organization and is essential for long-term stability and improvement. (See also “defined process” and “process element.”)

outsourcing (See “acquisition.”)

peer review The review of work products performed by peers during the development of work products to identify defects for removal. (See also “work product.”) The term “peer review” is used in the CMMI Product Suite instead of the term “work product inspection.”

performance parameters The measures of effectiveness and other key measures used to guide and control progressive development.

performed process A process that accomplishes the needed work to produce work products; the specific goals of the process area are satisfied.

planned process A process that is documented by both a description and a plan. The description and plan should be coordinated and the plan should include standards, requirements, objectives, resources, and assignments.

policy (See “organizational policy.”)

process A set of interrelated activities, which transform inputs into outputs, to achieve a given purpose. (See also “process area,” “subprocess,” and “process element.”) There is a special use of the phrase “the process” in the statements and descriptions of the generic goals and generic practices. “The process,” as used in Part Two, is the process or processes that implement the process area. The terms “process,” “subprocess,” and “process element” form a hierarchy with “process” as the highest, most general term, “subprocesses” below it, and “process element” as the most specific. A particular process can be called a subprocess if it is part of another larger process. It can also be called a process element if it is not decomposed into subprocesses. This definition of process is consistent with the definition of process in ISO 9000, ISO 12207, ISO 15504, and EIA 731.

process action plan A plan, usually resulting from appraisals, that documents how specific improvements targeting the weaknesses uncovered by an appraisal will be implemented.

process action team A team that has the responsibility to develop and implement process improvement activities for an organization as documented in a process action plan.

process and technology improvements Incremental and innovative improvements to processes and to process, product, or service technologies.

process architecture (1) The ordering, interfaces, interdependencies, and other relationships among the process elements in a


standard process, or (2) the interfaces, interdependencies, and other relationships between process elements and external processes. process area A cluster of related practices in an area that, when implemented collectively, satisfies a set of goals considered important for making improvement in that area. process asset Anything the organization considers useful in attain- ing the goals of a process area. (See also “organizational process assets.”) process asset library A collection of process asset holdings that can be used by an organization, project, or work group. (See also “organization’s process asset library.”) process attribute A measurable characteristic of process capability applicable to any process. process capability The range of expected results that can be achieved by following a process. process definition The act of defining and describing a process. The result of process definition is a process description. (See also “process description.”) ptg process description A documented expression of a set of activities performed to achieve a given purpose. A process description provides an operational definition of the major com- ponents of a process. The description specifies, in a complete, precise, and verifiable manner, the requirements, design, behavior, or other characteris- tics of a process. It also can include procedures for determining whether these provisions have been satisfied. Process descriptions can be found at the activity, project, work group, or organizational level. process element The fundamental unit of a process. A process can be defined in terms of subprocesses or process elements. A subprocess is a process element when it is not further decomposed into sub- processes or process elements. (See also “process” and “subprocess.”) Each process element covers a closely related set of activities (e.g., estimat- ing element, peer review element). Process elements can be portrayed using templates to be completed, abstractions to be refined, or descriptions to be modified or used. A process element can be an activity or task. The terms “process,” “subprocess,” and “process element” form a hierarchy with “process” as the highest, most general term, “subprocesses” below it, and “process element” as the most specific. process group A collection of specialists who facilitate the defini- tion, maintenance, and improvement of processes used by the organization.

process improvement A program of activities designed to improve the process performance and maturity of the organization’s processes, and the results of such a program.

process improvement objectives A set of target characteristics established to guide the effort to improve an existing process in a specific, measurable way either in terms of resultant product or service characteristics (e.g., quality, product performance, conformance to standards) or in the way in which the process is executed (e.g., elimination of redundant process steps, combination of process steps, improvement of cycle time). (See also “organization’s business objectives” and “quantitative objective.”)

process improvement plan A plan for achieving organizational process improvement objectives based on a thorough understanding of current strengths and weaknesses of the organization’s processes and process assets.

process measurement A set of operations used to determine values of measures of a process and its resulting products or services for the purpose of characterizing and understanding the process. (See also “measurement.”)

process owner The person (or team) responsible for defining and maintaining a process. At the organizational level, the process owner is the person (or team) responsible for the description of a standard process; at the project or work group level, the process owner is the person (or team) responsible for the description of the defined process. A process can therefore have multiple owners at different levels of responsibility. (See also “defined process” and “standard process.”)

process performance A measure of results achieved by following a process. (See also “measure.”) Process performance is characterized by both process measures (e.g., effort, cycle time, defect removal efficiency) and product or service measures (e.g., reliability, defect density, response time).

process performance baseline A documented characterization of process performance, which can include central tendency and variation. (See also “process performance.”) A process performance baseline can be used as a benchmark for comparing actual process performance against expected process performance.

process performance model A description of relationships among the measurable attributes of one or more processes or work products that is developed from historical process performance


One or more of the measurable attributes represent controllable inputs tied to a subprocess to enable performance of "what-if" analyses for planning, dynamic re-planning, and problem resolution. Process performance models include statistical, probabilistic, and simulation-based models that predict interim or final results by connecting past performance with future outcomes. They model the variation of the factors, and provide insight into the expected range and variation of predicted results. A process performance model can be a collection of models that (when combined) meet the criteria of a process performance model.

process tailoring Making, altering, or adapting a process description for a particular end. For example, a project or work group tailors its defined process from the organization's set of standard processes to meet objectives, constraints, and the environment of the project or work group. (See also "defined process," "organization's set of standard processes," and "process description.")

product A work product that is intended for delivery to a customer or end user. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning. The form of a product can vary in different contexts. (See also "customer," "product component," "service," and "work product.")

product baseline The initial approved technical data package defining a configuration item during the production, operation, maintenance, and logistic support of its lifecycle. (See also "configuration item," "configuration management," and "technical data package.") This term is related to configuration management.

product component A work product that is a lower level component of the product. (See also "product" and "work product.") Product components are integrated to produce the product. There can be multiple levels of product components. Throughout the process areas, where the terms "product" and "product component" are used, their intended meanings also encompass services, service systems, and their components. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning.

product component requirements A complete specification of a product or service component, including fit, form, function, performance, and any other requirement.

product lifecycle The period of time, consisting of phases, that begins when a product or service is conceived and ends when the product or service is no longer available for use. Since an organization can be producing multiple products or services for multiple customers, one description of a product lifecycle may not be adequate. Therefore, the organization can define a set of approved product lifecycle models. These models are typically found in published literature and are likely to be tailored for use in an organization. A product lifecycle could consist of the following phases: (1) concept and vision, (2) feasibility, (3) design/development, (4) production, and (5) phase out.

product line A group of products sharing a common, managed set of features that satisfy specific needs of a selected market or mission and that are developed from a common set of core assets in a prescribed way. (See also "service line.") The development or acquisition of products for the product line is based on exploiting commonality and bounding variation (i.e., restricting unnecessary product variation) across the group of products. The managed set of core assets (e.g., requirements, architectures, components, tools, testing artifacts, operating procedures, software) includes prescriptive guidance for their use in product development. Product line operations involve interlocking execution of the broad activities of core asset development, product development, and management. Many people use "product line" just to mean the set of products produced by a particular business unit, whether they are built with shared assets or not. We call that collection a "portfolio," and reserve "product line" to have the technical meaning given here.

product related lifecycle processes Processes associated with a product or service throughout one or more phases of its life (e.g., from conception through disposal), such as manufacturing and support processes.

product requirements A refinement of customer requirements into the developers' language, making implicit requirements into explicit derived requirements. (See also "derived requirements" and "product component requirements.") The developer uses product requirements to guide the design and building of the product or service.

product suite (See "CMMI Product Suite.")

project A managed set of interrelated activities and resources, including people, that delivers one or more products or services to a customer or end user. A project has an intended beginning (i.e., project startup) and end. Projects typically operate according to a plan.


Such a plan is frequently documented and specifies what is to be delivered or implemented, the resources and funds to be used, the work to be done, and a schedule for doing the work. A project can be composed of projects. (See also "project startup.") In some contexts, the term "program" is used to refer to a project.

project plan A plan that provides the basis for performing and controlling the project's activities, which addresses the commitments to the project's customer. Project planning includes estimating the attributes of work products and tasks, determining the resources needed, negotiating commitments, producing a schedule, and identifying and analyzing project risks. Iterating through these activities may be necessary to establish the project plan.

project progress and performance What a project achieves with respect to implementing project plans, including effort, cost, schedule, and technical performance. (See also "technical performance.")

project startup When a set of interrelated resources for a project is directed to develop or deliver one or more products or services for a customer or end user. (See also "project.")

prototype A preliminary type, form, or instance of a product, service, product component, or service component that serves as a model for later stages or for the final, complete version of the product or service. This model of the product or service (e.g., physical, electronic, digital, analytical) can be used for the following (and other) purposes:
• Assessing the feasibility of a new or unfamiliar technology
• Assessing or mitigating technical risk
• Validating requirements
• Demonstrating critical features
• Qualifying a product or service
• Qualifying a process
• Characterizing performance or features of the product or service
• Elucidating physical principles

quality The degree to which a set of inherent characteristics fulfills requirements.

quality and process performance objectives Quantitative objectives and requirements for product quality, service quality, and process performance. Quantitative process performance objectives include quality; however, to emphasize the importance of quality in the CMMI Product Suite, the phrase "quality and process performance objectives" is used.


"Process performance objectives" are referenced in maturity level 3; the term "quality and process performance objectives" implies the use of quantitative data and is only used in maturity levels 4 and 5.

quality assurance A planned and systematic means for assuring management that the defined standards, practices, procedures, and methods of the process are applied.

quality attribute A property of a product or service by which its quality will be judged by relevant stakeholders. Quality attributes are characterizable by some appropriate measure. Quality attributes are non-functional, such as timeliness, throughput, responsiveness, security, modifiability, reliability, and usability. They have a significant influence on the architecture.

quality control The operational techniques and activities that are used to fulfill requirements for quality. (See also "quality assurance.")

quantitative management Managing a project or work group using statistical and other quantitative techniques to build an understanding of the performance or predicted performance of processes in comparison to the project's or work group's quality and process performance objectives, and identifying corrective action that may need to be taken. (See also "statistical techniques.") Statistical techniques used in quantitative management include analysis, creation, or use of process performance models; analysis, creation, or use of process performance baselines; use of control charts; analysis of variance; regression analysis; and use of confidence intervals or prediction intervals, sensitivity analysis, simulations, and tests of hypotheses.

quantitative objective Desired target value expressed using quantitative measures. (See also "measure," "process improvement objectives," and "quality and process performance objectives.")

quantitatively managed (See "quantitative management.")

reference model A model that is used as a benchmark for measuring an attribute.

relevant stakeholder A stakeholder that is identified for involvement in specified activities and is included in a plan. (See also "stakeholder.")

representation The organization, use, and presentation of a CMM's components. Overall, two types of approaches to presenting best practices are evident: the staged representation and the continuous representation.


required CMMI components CMMI components that are essential to achieving process improvement in a given process area. Specific goals and generic goals are required model components. Goal satisfaction is used in appraisals as the basis for deciding whether a process area has been satisfied.

requirement (1) A condition or capability needed by a user to solve a problem or achieve an objective. (2) A condition or capability that must be met or possessed by a product, service, product component, or service component to satisfy a supplier agreement, standard, specification, or other formally imposed documents. (3) A documented representation of a condition or capability as in (1) or (2). (See also "supplier agreement.")

requirements analysis The determination of product or service specific functional and quality attribute characteristics based on analyses of customer needs, expectations, and constraints; operational concept; projected utilization environments for people, products, services, and processes; and measures of effectiveness. (See also "operational concept.")

requirements elicitation Using systematic techniques such as prototypes and structured surveys to proactively identify and document customer and end-user needs.

requirements management The management of all requirements received by or generated by the project or work group, including both technical and nontechnical requirements as well as those requirements levied on the project or work group by the organization. (See also "nontechnical requirements.")

requirements traceability A discernable association between requirements and related requirements, implementations, and verifications. (See also "bidirectional traceability" and "traceability.")

return on investment The ratio of revenue from output (product or service) to production costs, which determines whether an organization benefits from performing an action to produce something.

risk analysis The evaluation, classification, and prioritization of risks.

risk identification An organized, thorough approach used to seek out probable or realistic risks in achieving objectives.


risk management An organized, analytic process used to identify what might cause harm or loss (identify risks); to assess and quantify the identified risks; and to develop and, if needed, implement an appropriate approach to prevent or handle causes of risk that could result in significant harm or loss. Typically, risk management is performed for the activities of a project, a work group, an organization, or other organizational units that are developing or delivering products or services.

senior manager A management role at a high enough level in an organization that the primary focus of the person filling the role is the long-term vitality of the organization rather than short-term concerns and pressures. (See also "higher level management.") A senior manager has authority to direct the allocation or reallocation of resources in support of organizational process improvement effectiveness. A senior manager can be any manager who satisfies this description, including the head of the organization. Synonyms for senior manager include "executive" and "top-level manager." However, to ensure consistency and usability, these synonyms are not used in CMMI models. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning.

service A product that is intangible and non-storable. (See also "product," "customer," and "work product.") Services are delivered through the use of service systems that have been designed to satisfy service requirements. (See also "service system.") Many service providers deliver combinations of services and goods. A single service system can deliver both types of products. For example, a training organization can deliver training materials along with its training services. Services may be delivered through combinations of manual and automated processes. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning.

service agreement A binding, written record of a promised exchange of value between a service provider and a customer. (See also "customer.") Service agreements can be fully negotiable, partially negotiable, or non-negotiable, and they can be drafted either by the service provider, the customer, or both, depending on the situation. A "promised exchange of value" means a joint recognition and acceptance of what each party will provide to the other to satisfy the agreement. Typically, the customer provides payment in return for delivered services, but other arrangements are possible. A "written" record need not be contained in a single document or other artifact. Alternatively, it may be extremely brief for some types of services (e.g., a receipt that identifies a service, its price, its recipient).


service catalog A list or repository of standardized service definitions. Service catalogs can include varying degrees of detail about available service levels, quality, prices, negotiable/tailorable items, and terms and conditions. A service catalog need not be contained in a single document or other artifact, and can be a combination of items that provide equivalent information (such as web pages linked to a database). Alternatively, for some services an effective catalog can be a simple printed menu of available services and their prices. Service catalog information can be partitioned into distinct subsets to support different types of stakeholders (e.g., customers, end users, provider staff, suppliers).

service incident An indication of an actual or potential interference with a service. Service incidents can occur in any service domain because customer and end-user complaints are types of incidents and even the simplest of services can generate complaints. The word "incident" can be used in place of "service incident" for brevity when the context makes the meaning clear.

service level A defined magnitude, degree, or quality of service delivery performance. (See also "service" and "service level measure.")

service level agreement A service agreement that specifies delivered services; service measures; levels of acceptable and unacceptable services; and expected responsibilities, liabilities, and actions of both the provider and customer in anticipated situations. (See also "measure," "service," and "service agreement.") A service level agreement is a kind of service agreement that documents the details indicated in the definition. The use of the term "service agreement" always includes "service level agreement" as a subcategory and the former may be used in place of the latter for brevity. However, "service level agreement" is the preferred term when it is desired to emphasize situations in which distinct levels of acceptable services exist, or other details of a service level agreement are likely to be important to the discussion.

service level measure A measure of service delivery performance associated with a service level. (See also "measure" and "service level.")

service line A consolidated and standardized set of services and service levels that satisfy specific needs of a selected market or mission area. (See also "product line" and "service level.")

service request A communication from a customer or end user that one or more specific instances of service delivery are desired. (See also "service agreement.") These requests are made within the context of a service agreement. In cases where services are to be delivered continuously or periodically, some service requests may be explicitly identified in the service agreement itself. In other cases, service requests that fall within the scope of a previously established service agreement are generated over time by customers or end users as their needs develop.

service requirements The complete set of requirements that affect service delivery and service system development. (See also "service system.") Service requirements include both technical and nontechnical requirements. Technical requirements are properties of the service to be delivered and the service system needed to enable delivery. Nontechnical requirements may include additional conditions, provisions, commitments, and terms identified by agreements and regulations, as well as needed capabilities and conditions derived from business objectives.

service system An integrated and interdependent combination of component resources that satisfies service requirements. (See also "service system component" and "service requirements.") A service system encompasses everything required for service delivery, including work products, processes, facilities, tools, consumables, and human resources. Note that a service system includes the people necessary to perform the service system's processes. In contexts where end users perform some processes for service delivery to be accomplished, those end users are also part of the service system (at least for the duration of those interactions). A complex service system may be divisible into multiple distinct delivery and support systems or subsystems. While these divisions and distinctions may be significant to the service provider organization, they may not be as meaningful to other stakeholders.

service system component A resource required for a service system to successfully deliver services. Some components can remain owned by a customer, end user, or third party before service delivery begins and after service delivery ends. (See also "customer" and "end user.") Some components can be transient resources that are part of the service system for a limited time (e.g., items that are under repair in a maintenance shop). Components can include processes and people.


The word "component" can be used in place of "service system component" for brevity when the context makes the meaning clear. The word "infrastructure" can be used to refer collectively to service system components that are tangible and essentially permanent. Depending on the context and type of service, infrastructure can include human resources.

service system consumable A service system component that ceases to be available or becomes permanently changed by its use during the delivery of a service. Fuel, office supplies, and disposable containers are examples of commonly used consumables. Particular types of services can have their own specialized consumables (e.g., a health care service may require medications or blood supplies). People are not consumables, but their labor time is a consumable.

shared vision A common understanding of guiding principles, including mission, objectives, expected behavior, values, and final outcomes, which are developed and used by a project or work group.

software engineering (1) The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. (2) The study of approaches as in (1). (See also "hardware engineering" and "systems engineering.")

solicitation The process of preparing a package to be used in selecting a supplier. (See also "solicitation package.")

solicitation package A collection of formal documents that includes a description of the desired form of response from a potential supplier, the relevant statement of work for the supplier, and required provisions in the supplier agreement.

special cause of variation A cause of a defect that is specific to some transient circumstance and is not an inherent part of a process. (See also "common cause of variation.")

specific goal A required model component that describes the unique characteristics that must be present to satisfy the process area. (See also "capability level," "generic goal," "organization's business objectives," and "process area.")

specific practice An expected model component that is considered important in achieving the associated specific goal. (See also "process area" and "specific goal.") The specific practices describe the activities expected to result in achievement of the specific goals of a process area.

stable process The state in which special causes of process variation have been removed and prevented from recurring so that only common causes of process variation of the process remain. (See also "capable process," "common cause of variation," "special cause of variation," and "standard process.")

staged representation A model structure wherein attaining the goals of a set of process areas establishes a maturity level; each level builds a foundation for subsequent levels. (See also "maturity level" and "process area.")

stakeholder A group or individual that is affected by or is in some way accountable for the outcome of an undertaking. (See also "customer" and "relevant stakeholder.") Stakeholders may include project or work group members, suppliers, customers, end users, and others. This term has a special meaning in the CMMI Product Suite besides its common standard English meaning.

standard (noun) Formal requirements developed and used to prescribe consistent approaches to acquisition, development, or service. Examples of standards include ISO/IEC standards, IEEE standards, and organizational standards.

standard process An operational definition of the basic process that guides the establishment of a common process in an organization. A standard process describes the fundamental process elements that are expected to be incorporated into any defined process. It also describes relationships (e.g., ordering, interfaces) among these process elements. (See also "defined process.")

statement of work A description of work to be performed.

statistical and other quantitative techniques Analytic techniques that enable accomplishing an activity by quantifying parameters of the task (e.g., inputs, size, effort, and performance). (See also "statistical techniques" and "quantitative management.") This term is used in the high maturity process areas where the use of statistical and other quantitative techniques to improve understanding of project, work, and organizational processes is described. Examples of non-statistical quantitative techniques include trend analysis, run charts, Pareto analysis, bar charts, radar charts, and data averaging. The reason for using the compound term "statistical and other quantitative techniques" in CMMI is to acknowledge that while statistical techniques are expected, other quantitative techniques can also be used effectively.


statistical process control Statistically based analysis of a process and measures of process performance, which identify common and special causes of variation in process performance and maintain process performance within limits. (See also "common cause of variation," "special cause of variation," and "statistical techniques.")

statistical techniques Techniques adapted from the field of mathematical statistics used for activities such as characterizing process performance, understanding process variation, and predicting outcomes. Examples of statistical techniques include sampling techniques, analysis of variance, chi-squared tests, and process control charts.

subpractice An informative model component that provides guidance for interpreting and implementing specific or generic practices. Subpractices may be worded as if prescriptive, but they are actually meant only to provide ideas that can be useful for process improvement.

subprocess A process that is part of a larger process. (See also "process," "process description," and "process element.") A subprocess may or may not be further decomposed into more granular subprocesses or process elements. The terms "process," "subprocess," and "process element" form a hierarchy with "process" as the highest, most general term, "subprocesses" below it, and "process element" as the most specific. A subprocess can also be called a process element if it is not decomposed into further subprocesses.

supplier (1) An entity delivering products or performing services being acquired. (2) An individual, partnership, company, corporation, association, or other entity having an agreement with an acquirer for the design, development, manufacture, maintenance, modification, or supply of items under the terms of an agreement. (See also "acquirer.")

supplier agreement A documented agreement between the acquirer and supplier. (See also "supplier.") Supplier agreements are also known as contracts, licenses, and memoranda of agreement.

sustainment The processes used to ensure that a product or service remains operational.

system of systems A set or arrangement of systems that results when independent and useful systems are integrated into a large system that delivers unique capabilities.

systems engineering The interdisciplinary approach governing the total technical and managerial effort required to transform a set of customer needs, expectations, and constraints into a solution and to support that solution throughout its life. (See also "hardware engineering" and "software engineering.") This approach includes the definition of technical performance measures, the integration of engineering specialties toward the establishment of an architecture, and the definition of supporting lifecycle processes that balance cost, schedule, and performance objectives.

tailoring The act of making, altering, or adapting something for a particular end. For example, a project or work group establishes its defined process by tailoring from the organization's set of standard processes to meet its objectives, constraints, and environment. Likewise, a service provider tailors standard services for a particular service agreement.

tailoring guidelines Organizational guidelines that enable projects, work groups, and organizational functions to appropriately adapt standard processes for their use. The organization's set of standard processes is described at a general level that may not be directly usable to perform a process. Tailoring guidelines aid those who establish the defined processes for projects or work groups. Tailoring guidelines cover (1) selecting a standard process, (2) selecting an approved lifecycle model, and (3) tailoring the selected standard process and lifecycle model to fit project or work group needs. Tailoring guidelines describe what can and cannot be modified and identify process components that are candidates for modification.

target profile A list of process areas and their corresponding capability levels that represent an objective for process improvement. (See also "achievement profile" and "capability level profile.") Target profiles are only available when using the continuous representation.

target staging A sequence of target profiles that describes the path of process improvement to be followed by the organization. (See also "achievement profile," "capability level profile," and "target profile.") Target staging is only available when using the continuous representation.

team A group of people with complementary skills and expertise who work together to accomplish specified objectives. A team establishes and maintains a process that identifies roles, responsibilities, and interfaces; is sufficiently precise to enable the team to measure, manage, and improve their work performance; and enables the team to make and defend their commitments.


Collectively, team members provide skills and advocacy appropriate to all aspects of their work (e.g., for the different phases of a work product's life) and are responsible for accomplishing the specified objectives. Not every project or work group member must belong to a team (e.g., a person staffed to accomplish a task that is largely self-contained). Thus, a large project or work group can consist of many teams as well as project staff not belonging to any team. A smaller project or work group can consist of only a single team (or a single individual).

technical data package A collection of items that can include the following if such information is appropriate to the type of product and product component (e.g., material and manufacturing requirements may not be useful for product components associated with software services or processes):
• Product architecture description
• Allocated requirements
• Product component descriptions
• Product related lifecycle process descriptions if not described as separate product components
• Key product characteristics
• Required physical characteristics and constraints
• Interface requirements
• Materials requirements (bills of material and material characteristics)
• Fabrication and manufacturing requirements (for both the original equipment manufacturer and field support)
• Verification criteria used to ensure requirements have been achieved
• Conditions of use (environments) and operating/usage scenarios, modes and states for operations, support, training, manufacturing, disposal, and verifications throughout the life of the product
• Rationale for decisions and characteristics (e.g., requirements, requirement allocations, design choices)

technical performance Characteristic of a process, product, or service, generally defined by a functional or technical requirement. Examples of technical performance types include estimating accuracy, end-user functions, security functions, response time, component accuracy, maximum weight, minimum throughput, and allowable range.

technical performance measure Precisely defined technical measure of a requirement, capability, or some combination of requirements and capabilities. (See also "measure.")

technical requirements Properties (i.e., attributes) of products or services to be acquired or developed.

traceability A discernable association among two or more logical entities such as requirements, system elements, verifications, or tasks. (See also "bidirectional traceability" and "requirements traceability.")

trade study An evaluation of alternatives, based on criteria and systematic analysis, to select the best alternative for attaining determined objectives.

training Formal and informal learning options. These learning options can include classroom training, informal mentoring, web-based training, guided self study, and formalized on-the-job training programs. The learning options selected for each situation are based on an assessment of the need for training and the performance gap to be addressed.

unit testing Testing of individual hardware or software units or groups of related units. (See also "acceptance testing.")

validation Confirmation that the product or service, as provided (or as it will be provided), will fulfill its intended use. In other words, validation ensures that "you built the right thing." (See also "verification.")

verification Confirmation that work products properly reflect the requirements specified for them. In other words, verification ensures that "you built it right." (See also "validation.")

version control The establishment and maintenance of baselines and the identification of changes to baselines that make it possible to return to the previous baseline. In some contexts, an individual work product may have its own baseline and a level of control less than formal configuration control may be sufficient.

work breakdown structure (WBS) An arrangement of work elements and their relationship to each other and to the end product or service.

work group A managed set of people and other assigned resources that delivers one or more products or services to a customer or end user. (See also "project.") A work group can be any organizational entity with a defined purpose, whether or not that entity appears on an organization chart. Work groups can appear at any level of an organization, can contain other work groups, and can span organizational boundaries. A work group together with its work can be considered the same as a project if it has an intentionally limited lifetime.


work plan A plan of activities and related resource allocations for a work group. Work planning includes estimating the attributes of work products and tasks, determining the resources needed, negotiating commitments, producing a schedule, and identifying and analyzing risks. Iterating through these activities can be necessary to establish the work plan.

work product A useful result of a process. This result can include files, documents, products, parts of a product, services, process descriptions, specifications, and invoices. A key distinction between a work product and a product component is that a work product is not necessarily part of the end product. (See also "product" and "product component.") In CMMI models, the definition of "work product" includes services; however, the phrase "work products and services" is sometimes used to emphasize the inclusion of services in the discussion.

work product and task attributes Characteristics of products, services, and tasks used to help in estimating work. These characteristics include items such as size, complexity, weight, form, fit, and function. They are typically used as one input to deriving other resource estimates (e.g., effort, cost, schedule).

work startup When a set of interrelated resources for a work group is directed to develop or deliver one or more products or services for a customer or end user. (See also "work group.")

Book Contributors




BOOK AUTHORS


Sandy Shrum, Mary Beth Chrissis, Mike Konrad (l. to r.)

Mary Beth Chrissis
Senior Member of the Technical Staff
CMMI Initiative
Software Engineering Institute

Mary Beth Chrissis is a senior member of the technical staff at the Software Engineering Institute (SEI). Since joining the SEI in 1988, Chrissis has been a coauthor of the Capability Maturity Model Integration for Development (CMMI-DEV) and Capability Maturity Model for Software (SW-CMM) models. Currently, Chrissis chairs the CMMI Configuration Control Board (CCB), is a member of the IEEE Software and Systems Engineering Standards Executive Committee, and is an instructor of various CMMI model-related courses at the SEI. Prior to joining the SEI, Chrissis worked at GTE Government Systems in Rockville, MD; Dravo Automation Sciences in Pittsburgh, PA; and Sperry Corporation in Great Neck, NY. Before coming to the SEI, Mary Beth was pursuing her master's in Computer Science from Johns Hopkins University, and in 1983 she received a B.S. from Carnegie Mellon University.



Mike Konrad
Senior Member of the Technical Staff
CMMI Initiative
Software Engineering Institute

Mike Konrad has been with the Software Engineering Institute (SEI) at Carnegie Mellon University since 1988. Dr. Konrad is the Chief Architect of CMMI and Manager of SEI's CMMI Modeling Team. Previously, he was Chair of the CMMI Configuration Control Board (2001–2006) and a member of the International Process Research Consortium (2004–2006). Also, he was a member of the teams that developed the original Software CMM Version 1.0 (1988–1991) and ISO 15504 (1993–1997). Prior to the SEI, Mike worked with several companies in computer science-related positions, including ISSI, SAIC, and Honeywell, and briefly with George Mason University and the University of Maryland. Mike obtained his Ph.D. in Mathematics in 1978 from Ohio University, Athens, Ohio.

Sandy Shrum
Senior Writer/Editor
Communications
Software Engineering Institute

Sandy Shrum is a senior writer/editor and communications point of contact for the Software Engineering Process Management program at the Software Engineering Institute. In addition to this book, she has coauthored two other CMMI books: CMMI-ACQ: Guidelines for Improving the Acquisition of Products and Services and CMMI for Services: Guidelines for Superior Service. She has been with the SEI since 1995 and has been a member of the CMMI Development Team since the CMMI project's inception in 1998. Her roles on the project have included model author, small review team member, reviewer, editor, model development process coordinator, and quality assurance process owner. Before joining the SEI, Sandy worked for eight years with Legent Corporation, a Virginia-based software company. Her experience as a technical communicator dates back to 1988, when she earned her M.A. in Professional Writing from Carnegie Mellon University. Her undergraduate degree, a B.S. in Business Administration, was earned at Gannon University, Erie, Pennsylvania.

CONTRIBUTING AUTHORS


Steve Baldassano
Manager of Business Process Improvement
KNORR Brake Corporation

Steve Baldassano is the Manager of Business Process Improvement at KNORR Brake Corporation. Steve has experience and training in quality practices for process improvement, and is a Certified Quality Manager from ASQ and a student of TQM (Deming/Juran). He has more than 25 years of experience in program management and in facilitating process improvement teams, managing both projects and processes. Since January 2009, he has been assigned full time to facilitate the improvement efforts for the Software Development Group, using the CMMI-DEV V1.2 model as a guide. Steve held positions at KNORR as Quality Manager for five years and Program Manager for six years prior to his current assignment.


Wow! eBook 610 Contributing Authors Victor R. Basili Professor Emeritus of Computer Science University of Maryland Senior Research Fellow Fraunhofer Center Dr. Victor R. Basili is Professor Emeritus of Computer Science at the University of Maryland, College Park, and Senior Research Fellow at the Fraunhofer Center–Maryland. He was one of the founders and principals in the Software Engineering Laboratory (SEL). He works on measuring, evaluating, and improving the software development process and product via mechanisms for observing and evolving knowledge through empirical research (e.g., the Goal/Question/Met- ric Approach, the Quality Improvement Paradigm, the Experience Factory, GQM+Strategies). Kathleen C. Dangle and Michele Shaw Applied Technology Engineer Fraunhofer Center for Experimental Software Engineering Kathleen C. Dangle and Michele A. Shaw are Applied Technology Engineers at the Fraunhofer ptg Center for Experimental Software Engineering, Maryland, a not-for- profit affiliate of the Computer Science Department of the Univer- sity of Maryland, College Park. They are responsible for helping organizations improve their software business through process improvement initiatives. They work closely with Dr. Basili to apply empirical thinking in real-world software contexts. Roger Bate Late CMMI Chief Architect Software Engineering Institute Roger Bate was a visiting scientist at the Software Engineer- ing Institute (SEI) at Carnegie Mellon University. He was the visionary behind CMMI and the chief architect of the CMMI Framework. Before that, Roger was chief computer scientist for Texas Instruments and a TI fellow, chief of the Reactor Theory Branch of the Oak Ridge National Labora- tory, and a professor at the U.S. Air Force Academy. He was a fellow of the ACM, a fellow of the Society for Design and Process Science, and a recipient of the SEI Leadership Award. Roger passed away in March 2009.

Wow! eBook Contributing Authors 611 Michael Campo Principal Engineering Fellow Raytheon Company Michael Campo is a Principal Engineering Fellow at Raytheon Company. As process group leader for Raytheon Integrated Defense Systems (IDS), Mike developed and deployed processes that led to the achievement of CMMI Maturity Level 3 in 2003, Maturity Level 4 in 2005, and Maturity Level 5 in 2008. Mike’s present position is IDS Process Technical Director. Mike is a certified CMMI instructor, and a member of the CMMI V1.3 Core Model Team, the CMMI V1.3 Training Team, the CMMI Configuration Control Board, and the National Defense Industrial Association CMMI Working Group. David N. Card Fellow Q-Labs Editor-in-Chief Journal of Systems and Software

David N. Card is a fellow of Q-Labs. He is the author of Meas- ptg uring Software Design Quality (Prentice Hall, 1990), coauthor of Practical Software Measurement (Addison-Wesley, 2002), coeditor of ISO/IEC Standard 15939: Software Measurement Process (International Organization for Standardization, 2002), and Editor-in-Chief of the Journal of Systems and Software. Bill Curtis Senior Vice President and Chief Scientist CAST Software Founding Director Consortium for IT Software Quality Bill Curtis is Senior Vice President and Chief Scientist at CAST Software and the Founding Director of the Consor- tium for IT Software Quality. He coauthored the Capability Maturity Model (CMM), the People CMM, and the Busi- ness Process MM. Until its acquisition by Borland, he was Cofounder and Chief Scientist of TeraQuest, the global leader in providing CMM-based services. He is a former Director of the Software Process Program in the Software Engineer- ing Institute at Carnegie Mellon University. Prior to joining the SEI, Dr. Curtis worked for MCC, ITT’s Programming Technology Center, GE Space Division, and taught statistics at the University of Wash- ington. He has published five books, more than 150 articles, and


was elected a Fellow of the IEEE for his contributions to software process improvement and measurement. Aldo Dagnino System Architectures and Technology Group Lead US Corporate Research Center ABB, Inc. Dr. Aldo Dagnino has more than 20 years of experience in the software industry and close to 10 years of experience using the CMMI model. He is currently leading the System Architectures and Technologies Group at the US Corporate Research Center of ABB Inc. Dr. Dagnino is also an Adjunct Assistant Professor in the Department of Computer Science at North Carolina State University. He has given several presenta- tions at software engineering conferences such as IEEE, NDIA, and SEPG, among others. Dr. Dagnino was honored to receive the 2008 SEI Member Award Winner for most Outstanding Technical Con- tributor at the 2008 SEPG Conference in Tampa, FL. His primary areas of interest include CMMI models, process improvement, Empirical Software Engineering, system architectures, data mining, and knowledge-intensive systems. ptg Khaled El Emam Associate Professor Canada Research Chair in Electronic Health Information University of Ottawa Senior Investigator Children’s Hospital of Eastern Ontario Research Institute Dr. Khaled El Emam is an Associate Professor at the Uni- versity of Ottawa, Faculty of Medicine, and the School of Information Technology and Engineering, a senior investi- gator at the Children’s Hospital of Eastern Ontario Research Institute, and a Canada Research Chair in Electronic Health Information at the University of Ottawa. His main area of research is developing techniques for health data anonymization. Previously, Khaled was a Senior Research Officer at the National Research Council of Canada; and prior to that, he was head of the Quantitative Methods Group at the Fraunhofer Institute in Kaiserslautern, Germany. He has (co)founded two companies to commercialize the results of his research work. In 2003 and 2004, he was ranked as the top systems and software engineering scholar worldwide by the Journal of Systems and Software based on his research on measurement and quality evaluation and improvement,

Wow! eBook Contributing Authors 613 and ranked second in 2002 and 2005. He holds a Ph.D. from the Department of Electrical and Electronics, King’s College, at the Uni- versity of London (UK). His lab’s website is www.ehealthinforma- tion.ca/. Hillel Glazer Principal and CEO Entinex, Inc. Hillel has been working in process improvement since his first job out of college. He is a CMMI High Maturity Lead Appraiser and CMMI Instructor working with Agile teams as well as an SEI Visiting Scientist. Hillel is the lead author on the SEI’s first-ever official publication addressing Agile development. His diverse experience base, which includes aerospace/defense and systems engineering, large and small consulting practices, federal agencies, dot-com operations, and financial systems support, is probably what gave him the necessary perspective to pioneer how to bring CMMI and Agile together as far back as his 2001 CrossTalk article highlighting the compatibilities of (then) CMM and XP. ptg Kileen Harrison Lead Associate Booz Allen Hamilton Kileen Harrison is a Lead Associate with Booz Allen Hamil- ton, specializing in Process Improvement. She has a Mas- ters of Business Administration from Johns Hopkins University and more than 15 years of experience in the industry. Throughout her career, Kileen has played key leadership roles in the development, documentation, main- tenance, and implementation of process, to enable projects and organizations to achieve their process improvement goals. Her experience includes working with external clients as well as helping to ensure Booz Allen Hamilton maintains its CMMI-DEV +IPPD, Maturity Level 3 rating. She is an authorized SCAMPI B&C Team Lead for CMMI-DEV, CMMI-ACQ, and CMMI-SVC; and a certified instructor for Introduction to CMMI, Acquisition Supplement, and Services Supplement. She is a trained Lean Six Sigma Green Belt and ITIL Foundation Certified.

Wow! eBook 614 Contributing Authors Will Hayes SCAMPI Upgrade Team Leader Software Engineering Institute Will Hayes has served in a variety of roles at the Software Engineering Institute for nearly twenty years. His present focus is on leading the team updating the Standard CMMI Appraisal Method for Process Improvement (SCAMPI) for Version 1.3 of the CMMI product suite. He recently served on the team updating the High Maturity Process Areas of CMMI. Prior to these assignments, Will was the leader of the SEI Appraisal Program’s Quality Management Team and initiated the process for auditing SCAMPI Appraisals. He has been a frequent presenter at conferences throughout the world on topics relating to Process Improvement, Measurement, and High Maturity Practices. Watts Humphrey Founder of Software Process Program Software Engineering Institute Watts S. Humphrey founded the Software Process Program of the Software Engineering Institute at Carnegie Mellon ptg University, led the initial development of the Capability Maturity Model (CMM), and was a Fellow of the Institute. From 1959 to 1986, he was at IBM where he was director of programming and VP of Technical Development. His publi- cations include many technical papers and thirteen books, including Winning With Software: An Executive Strategy (2002), PSP: A Self-Improvement Process for Software Engineers (2005), and TSP: Coaching Development Teams (2006). He held five U.S. patents and served on the Malcolm Baldridge National Quality Award Board of Examiners. He held degrees in physics, a master’s degree in Busi- ness Administration, and an honorary Ph.D. degree in Software Engineering. The Watts Humphrey Software Quality Institute in Chennai, India was named in his honor; and, in 2005, the President of the United States awarded Mr. Humphrey the National Medal of Technology. Watts passed away in October 2010.

Wow! eBook Contributing Authors 615 Gargi Keeni Vice President Tata Consultancy Services, Ltd. Gargi Keeni, Vice President at Tata Consultancy Services Ltd., serves on the Industry Advisory Board (IAB) of IEEE Computer Society, the business planning group (SWG1) of ISO/IEC JTC1 SC7, and is the cochair of the advisory panel of NASSCOM Quality forum. An SEI authorized lead appraiser and instructor for CMMI and an examiner for JRD-QV (in lines of MBNQA), she was also a lead appraiser for Peo- ple CMM. Dr Keeni’s research interests include process improve- ment, quality management systems, and business excellence. For more information, see Gargi Keeni, “The Evolution of Quality Processes at Tata Consultancy Services” IEEE Software (July/ August 2000). Peter Kraus Engineering Fellow Raytheon Integrated Defense Systems Dr. Peter Kraus is an Engineering Fellow focusing on statis- tical engineering training and consulting efforts within ptg Raytheon Integrated Defense Systems. Peter is responsible for implementing Design for Six Sigma techniques across Engineering and Operations to achieve Mission Assurance. Peter earned a master’s degree in Mathematics from North- eastern University and a Ph.D. in Operations Research from the University of Massachusetts in Amherst. Hans-Jürgen Kugler Principal and Chief Scientist KUGLER MAAG CIE, GmbH Industry Advisory Lero Hans-Jürgen Kugler is Principal and Chief Scientist of KUGLER MAAG CIE GmbH. In addition, he is industry advisor to Lero, the Irish Software Engineering Research Centre. Mr. Kugler was previously a lecturer at Trinity Col- lege Dublin, a director of software product and services companies, and Technical Director of the European Soft- ware Institute. He was involved in the design industry competence development in the automotive sector, and conducted the first inde- pendent CMM assessment in the automotive industry in Germany.

Wow! eBook 616 Contributing Authors Neal Mackertich Engineering Fellow and Founder Raytheon Six Sigma Institute Dr. Neal Mackertich is an Engineering Fellow and founder of the Raytheon Six Sigma Institute. A Six Sigma Master Black Belt versatile in technical background and applica- tion experience, Neal is presently responsible for the Sys- tems Engineering enablement of Mission Assurance through product and process optimization strategies within Raytheon’s Integrated Defense Systems business. Dr. Mack- ertich holds a B.S. in Chemical Engineering, an M.S. in Engineering Management, and a Ph.D. in Engineering Operations Research from the University of Massachusetts-Amherst. Tomoo Matsubara Independent Consultant Software Technology, Management, and Business Tomoo Matsubara is an independent consultant of software technology, management, and business. Currently, he is an Editorial Board member of the Cutter IT Journal and an ptg Advisory Board member of IEEE Software. For six years, he served as an editor of the “Soapbox” column. He started his career at the Hitachi’s machine factory. When Hitachi initi- ated its computer business, he was conscripted from its computer department. For a period of time, he was involved in almost any kind of software development projects. He left Hitachi Software Engineering at the end of 1991 and initiated a consulting business. He became actively involved in many international activi- ties, wrote a score of English papers, and made presentations at international conferences. Two of his papers were chosen for a book of influential papers “Software State-of-the-Art: Selected Papers,” (Dorset House) compiled by Tom DeMarco and Timothy Lister. He has published a dozen books, including translation of software engineering books. Judah Mogilensky Owner/Partner Process Enhancement Partners, Inc. Judah Mogilensky, an Owner/Partner of Process Enhancement Part- ners, Inc., is a leader in the process improvement field. He is a


Certified High Maturity Lead Appraiser for the SCAMPI method (DEV, SVC, and People CMM) and a trainer for Introduction to CMMI-DEV, Introduction to CMMI-SVC, and Introduction to the People CMM courses. Judah is a cofounder of the first SPIN, in the Washington, D.C. area. He is a qualified user of the MBTI and a graduate of Satir System Training. He is also a Visiting Scientist at the SEI. James W. Moore Software Engineer MITRE Corporation Vice President for Professional Activities Member of Board of Governors IEEE Computer Society James W. Moore is a 40-year veteran of software engineer- ing in IBM and now the MITRE Corporation. He serves as the IEEE Computer Society’s Vice President for Profes- sional Activities and as a member of its Board of Governors. He was an Executive Editor of the Society’s 2004 “Guide to the Software Engineering Body of Knowledge” and a mem- ber of the Editorial Board of the recent revision of the ptg “Encyclopedia of Software Engineering.” He performs software and systems engineering standardization for the IEEE, serving as its liai- son to ISO/IEC JTC1/SC7 and as a member of the Executive Com- mittee of the IEEE Software and Systems Engineering Standards Committee. The IEEE Computer Society has recognized him as a Charter Member of their Golden Core, and the IEEE has named him a Fellow of the IEEE. His work on software engineering stan- dards has been recognized by the International Committee on Information Technology Standards (INCITS) with their Interna- tional Award, by the Computer Society with the Hans Karlsson Award, and by the IEEE with the Charles Proteus Steinmetz Award. His latest book on software engineering standards was published in 2006 by John Wiley and Sons. He holds two U.S. patents and, dat- ing to times when software was not regarded as patentable, two “defensive publications.” He graduated from the University of North Carolina with a B.S. in Mathematics and Syracuse University with an M.S. in Systems and Information Science.

Wow! eBook 618 Contributing Authors Joseph Morin Principal and CEO Integrated System Diagnostics, Inc. Joseph Morin is principal and CEO of Integrated System Diagnostics, Inc., one of the original SEI Partners. He has more than 35 years of experience in systems and software engineering, management, and process improvement. He is a Certified High Maturity Lead Appraiser and Instructor for CMMI. Since leaving the SEI in 1994 to found ISD, Mr. Morin has continued to contribute to SEI initiatives including par- ticipation as a member of the core author team for the SCAMPI V1.1 method and serving as Chairman of the SEI Partner Advisory Board. Heather Oppenheimer Senior Partner Oppenheimer Partners, LLC Heather Oppenheimer is a Senior Partner in Oppenheimer Partners, LLC, a process improvement consulting company. She is a certified CMMI instructor and SCAMPI Lead Appraiser, with more than 20 years of experience in all areas of product development, service delivery, and software engi- ptg neering. She has cochaired workshops for the International Conference on Systems Engineering, reviews software engi- neering journals, and referees grants for the Natural Sciences and Engineering Research Council of Canada. She is a member of the team that developed the SEI-authorized 3-day Introduction to CMMI-SVC course. Although she is known for her work with large, globally distributed development projects, she is currently focusing on coaching very small service delivery and embedded software development organizations in how to apply Agile methodologies along with the CMMI model for lightweight, but robust, processes. Pat O’Toole Principal Consultant PACT As Principal Consultant, Pat O’Toole is PACT’s only, and certainly most highly regarded, employee. He is a certified high maturity SCAMPI Lead Appraiser as well as a certified instructor for a variety of CMMI-related courses. Pat has published more than 50 articles related to process and per- formance improvement and, on behalf of the SEI-certified community, facilitates the ATLAS forum (ATLAS = Ask The Lead AppraiserS). Pat is also a Visiting Scientist at the SEI and

coteaches a course in the University of Minnesota's Master of Science in Software Engineering degree program.

Mike Phillips
Program Manager, CMMI Initiative
Software Engineering Institute

Mike Phillips is the Program Manager for CMMI at the Software Engineering Institute (SEI), a position created to lead the Capability Maturity Model Integration (CMMI) product suite evolution. He has led the team, which spans government, industry, and the SEI, through three significant upgrades to the original version of the integrated model, which now covers engineering, acquisition, and services. He was previously responsible for Transition Enabling activities at the SEI. He has authored Technical Reports, Technical Notes, CMMI Columns, and various articles, in addition to presenting CMMI material at conferences around the world. He is a coauthor of the first edition of the Addison-Wesley book on CMMI for Acquisition. Prior to his retirement as a colonel from the U.S. Air Force, he was the program manager of the $36B development program for the B-2 stealth bomber in the B-2 System Program Office at Wright-Patterson AFB, Ohio. In addition to more than five years of B-2 experience, he has four years of experience guiding acquisition programs in the Pentagon for both the Air Force and the Office of the Secretary of Defense. His bachelor's degree in Astronautical Engineering is from the Air Force Academy, and his master's degrees are in Nuclear Engineering from Georgia Tech, Systems Management from the University of Southern California, and International Affairs from Salve Regina College and the Naval War College. He is a graduate of the Program Management Course at the Defense Systems Management College and of the Air Force Test Pilot School.

Anne Prem
Process Improvement Associate
Booz Allen Hamilton

Anne Prem is an Associate at Booz Allen Hamilton specializing in process improvement. She has more than eight years of experience supporting the Department of Defense (DoD) in systems/software design and development, leveraging various process improvement disciplines. Anne's current focus is process definition and deployment using CMMI-DEV, CMMI-ACQ, and CMMI-SVC, while capitalizing on


the benefits of an integrated CMMI and Lean Six Sigma (LSS) approach. She has a Bachelor of Arts in Psychology and a Certificate in Markets & Management from Duke University. Anne is also certified by the Project Management Institute (PMI) as a Project Management Professional (PMP) and is LSS Green Belt trained.

Bob Rassa
Director of Engineering Programs
Space and Airborne Systems, Raytheon
Founder and Chairman
National Defense Industrial Association (NDIA) Systems Engineering Division

Bob Rassa is Director of Engineering Programs at Raytheon's Space and Airborne Systems (formerly Hughes Aircraft) in El Segundo, California. He is founder and Chairman of the National Defense Industrial Association (NDIA) Systems Engineering Division (SE Div), as well as a founding member of its Automatic Test Committee. Mr. Rassa is the Industry Sponsor of CMMI (Capability Maturity Model Integration, the worldwide-adopted model for process improvement for systems engineering, software, project management, and hardware design), and serves as the Chair of the CMMI Steering Group. He is an elected Fellow of the IEEE and has served the IEEE as Past President and Founder of the IEEE Systems Council, Past President of the IEEE Aerospace & Electronic Systems Society, and Past President of the IEEE Instrumentation & Measurement Society. He holds a BSEE from the University of California–Berkeley and has numerous published papers and delivered presentations. He holds the U.S. patent for a satellite-based Advanced Maintenance System for Aircraft & Military Weapons, issued in August 1999. He is a recipient of the Westinghouse Order of Merit, the IEEE Third Millennium Medal, the Man of The Year Award from IEEE-AUTOTESTCON, the Lt. General Thomas Ferguson Award for Systems Engineering Excellence from NDIA, and the Raytheon Award for Excellence in Engineering Process Improvement.

Kevin Schaaff
Senior Member of the Technical Staff
Software Engineering Institute

Kevin Schaaff is a senior member of the technical staff at the Software Engineering Institute. He is a certified High Maturity Lead Appraiser and Instructor, with more than 30 years of experience in

systems engineering and project management. His previous experience includes managing a variety of programs ranging from large Navy R&D efforts to implementing small to medium IT systems. Before joining the SEI, Kevin led other organizations' quality programs to SW-CMM and CMMI Maturity Level 3 and CMMI Maturity Level 5, and he was responsible for numerous sites becoming both ISO and AS9100 registered. He currently serves as a process improvement consultant to several commercial and government organizations implementing a variety of models and standards.

Alexander Stall
Senior Member of the Technical Staff
Software Engineering Institute

Alexander Stall is a senior member of the technical staff at the Software Engineering Institute. He is a certified High Maturity Lead Appraiser and Instructor, and a member of the SCAMPI Quality Management Team. He has more than 20 years of experience as a software engineer, manager, and consultant across the systems development life cycle, and he has worked in both the public and private sectors. As a process improvement specialist, he has worked in and led efforts for process development, deployment, implementation, and appraisal. He currently works with several commercial and government organizations to implement a variety of models and standards.

Rawdon Young
Appraisal Manager
Software Engineering Institute

Rawdon Young has worked in systems and software development for more than 35 years, with extensive experience at all levels from entry-level programmer/analyst through senior manager. Additionally, he has both provided and managed engineering and management consulting services. He has worked in small organizations, large organizations, government, and private industry. In 2006, he joined the SEI and has since concentrated on appraisal quality, high maturity, and Lead Appraiser certification and training. He is a SCAMPI High Maturity Lead Appraiser and an Instructor for the Introduction to CMMI course, and he is an expert in high maturity development and management. He is currently the SEI Appraisal Manager.


Index

A Acquisition Mini Team, 570 ptg ABB case study, 123–124 Acquisition process, 574 CMMI adoption, 124–125 Acquisition strategies and activities current status, 129–130 defined, 574 first stage, 128–129 Integrated Project Management, 268 second stage, 129 Requirements Development, 466 tools and methods, 125–128 suppliers. See Supplier Agreement Management Accelerated Improvement Method (AIM), 8 (SAM) process area Accept the Acquired Product practice, 506–507 Technical Solution, 521 Acceptance criteria Acquisitions Requirements Development (ARD), defined, 574 72–73 Requirements Management, 476 Acronyms list, 561–563 Supplier Agreement Management, 503 Action teams, 323 Technical Solution, 525 Actions and action plans Validation, 537 Causal Analysis and Resolution, 239–240 Acceptance tests Integrated Project Management, 284 defined, 574 Organizational Process Focus, 318, 323–325 Requirements Development, 455 Product Integration, 387 Supplier Agreement Management, 502 Project Monitoring and Control, 394–402 Accountability, practice for, 185, 228 Project Planning, 413–414 Achieve Specific Goals goal, 169 Quantitative Project Management, 435 Achievement profiles Requirements Development, 456 defined, 574 Requirements Management, 475 purpose, 50 Supplier Agreement Management, 506 Acquirers, 574 Verification, 541


Additions, 25, 574–575 Analysis Address Causes of Defects goal, 238 causal. See Causal Analysis and Resolution (CAR) Evaluate the Effect of Implemented Actions process area practice, 240–241 decision. See Decision Analysis and Resolution Implement Action Proposals practice, 239–240 (DAR) process area Record Causal Analysis Data practice, 241 functional, 468 Adopting CMMI, 90 measurement. See Measurement and Analysis Advanced process areas (MA) process area Process Management, 62–64 Quantitative Project Management, 445–447 Project Management, 66–68 risk Support, 79–80 Organizational Training, 367 Agile approaches, 100–101, 110 Risk Management, 487–491 case study, 101-104 Analytic Hierarchy process (AHP), 133 Configuration Management, 245 Analyze and Validate Requirements goal, 466 Product Integration, 378 Analyze Requirements practice, 469–470 Process and Product Quality Assurance, 427 Analyze Requirements to Achieve Balance Project Monitoring and Control, 398 practice, 470–471 Project Planning, 404 Establish a Definition of Required Functionality Requirements Development, 458 and Attributes practice, 468–469 Requirements Management, 474 Establish Operational Concepts and Scenarios Risk Management, 506 practice, 467–468 Technical Solution, 510–511 Validate Requirements practice, 471–472 Verification, 542 Analyze Causes practice, 236–238 Agnostics, process, 141 Analyze Issues practice, 400–401 Agreements, supplier. See Supplier Agreement Analyze Measurement Data practice, 299–300 ptg Management (SAM) process area Analyze Peer Review Data practice, 550 AIM (Accelerated Improvement Method), 8 Analyze Process Performance and Establish Align Measurement and Analysis Activities goal, Performance Baselines practice, 359–361 289 Analyze Process Performance Data practice, 336 Establish Measurement Objectives practice, Analyze Requirements practice, 469–470 289–291 Analyze Requirements to Achieve Balance practice, Specify Analysis Procedures practice, 296–298 470–471 Specify Data Collection and Storage Procedures Analyze Suggested Improvements practice, practice, 294–295 341–343 Specify Measures practice, 291–294 Analyze Validation Results practice, 538–539 Alignment of project work and requirements, Analyze Verification Results practice, 551–552 479–480 Applying generic practices, 226–227 Alignment Principle, 141–144 Appraisal Requirements for CMMI (ARC) Allocate Product Component Requirements document, 105 practice, 464–465 Appraisals Allocated baselines, 250 ABB, 126–127 Allocated requirements, 575 considerations for, 106 Allocating requirements defined, 575 Requirements Development, 456, 464–465 findings, 575 Technical Solution, 509 KNORR Brake Corporation, 120 Alternative solutions participants, 575 Decision Analysis and Resolution, 259–266 ratings, 575 Technical Solution, 513–515 reference models, 575


scope, 575 Project Planning, 407–408 working with, 104–106 Quantitative Project Management, 444 Appraise the Organization’s Processes practice, Audits 321–322 Configuration Management, 254–255 Appropriate resolution of conflicts, 461 defined, 576 ARC (Appraisal Requirements for CMMI) Process and Product Quality Assurance, 426 document, 105 Supplier Agreement Management, 506 Architecture Verification, 543 allocation of requirements, 462, 464-465 Authors, biographies, 607–608 architecture definition, 517-518, 580 Automotive industry, 108–109 architecture description languages, 468, 518, 520, Automotive SPICE, 109 580 architectural requirements, 463-464, 518 basis for when to require a formal evaluation, 260 B defined, 575–576 Baldassano, Steve derived requirements, 462, 464 biography, 609 documentation, 521-522, 602 KNORR Brake case study, 113–123 evaluation, 457, 518, 543, 544, 547 Barriers to improvements, 341–342 example of attributes to estimate in project Base measures, 576 planning, 407 Baselines implementation, 522, 547 Configuration Management, 244–245, 250 interface requirements and interfaces, 465, 518 defined, 576 open architectures, 515 Measurement and Analysis, 292 patterns, 455, 464, 515, 518 Organizational Performance Management, 332 product integration strategy, 380, 389 Organizational Process Performance, 351, ptg quality attributes, 457, 468-469 353–363 references to key books and Web sites, 511 Quantitative Project Management, 433, 447 requirements changes, 478 Requirements Development, 462, 480 risk sources, 439, 481technical solution, 512, 515 Risk Management, 488 ARD (Acquisitions Requirements Development), for success, 53–58 72–73 Basic process areas Assemble Product Components and Deliver the Process Management, 60–62 Product goal, 387 Project Management, 64–66 Assemble Product Components practice, 388 Support, 78–79 Confirm Readiness of Product Components for Basili, Victor R. Integration practice, 387–388 biography, 610 Evaluate Assembled Product Components empirical techniques, 159 practice, 389 empirical thinking essay, 37–41 Package and Deliver the Product or Product GQM approach, 54 Component practice, 390–391 Bate, Roger, xxiii, 17, 72 Assemble Product Components practice, 388 biography, 610 Assess Training Effectiveness practice, 374–375 CMMI framework essay, 14-16 Assets. See Process assets Benefits of CMMI report, 13 Assign Responsibility practice, 185, 228 Bidirectional traceability Assumptions, questioning, 122 defined, 576 Assurance empowerment, 93 maintaining, 478–479 Atheists, process, 141 Bosch company, 108–109 Attributes Box, George, 155 Project Monitoring and Control, 393 Box-checking, pathological, 139


Brainstorming sessions, 262 Monitor and Control the Process practice Broad experience factor for consultants and lead elaboration, 205 appraisers, 153–154 Objectively Evaluate Adherence practice Brooks, Fred, 159 elaboration, 211 Budgets, 412–414 overview, 79 “Buggy Software: Why do we accept it?”, 107–108 Plan the Process practice elaboration, 174 Business objectives, 435 Provide Resources practice elaboration, 178 Business performance, 334–337 purpose, 233 related process areas, 234 Train People practice elaboration, 186 C Verification, 552 Campo, Michael Change management biography, 611 defined, 577 models for success essay, 53–58 systems for, 248–249 Capability and capability levels, 31–34 Changes defined, 576 Configuration Management, 250–253 designations, 34–37 in process improvement, 92 in equivalent staging, 50–51 Requirements Development, 464 profiles, 576 Requirements Management, 477–478 Supplier Agreement Management, 501 Chrissis, Mary Beth training, 367–372 acknowledgments, xxii Capability Maturity Model: Guidelines for Improving biography, 607–608 the Software Process, 10 CMF (CMMI Model Foundation), 14 Capability Maturity Model for Software (SW-CMM), CMMI-ACQ, 11, 73 10 CMMI-AIM, 8 ptg Capability maturity models, 576 CMMI-DEV, 11, 73–74 Capable processes, 576 CMMI Framework, 14–18, 577 Capacity and Availability Management (CAM) CMMI model components, 577 process area, 74 CMMI Model Foundation (CMF), 14 Card, David N. CMMI models, 577 biography, 611 CMMI or Agile: Why Not Embrace Both!, 101 measurement essay, 75–77 CMMI overview Case studies CMMI for Development, 18 ABB, 123–130 evolution, 10–11 KNORR Brake Corporation, 113–123 framework architecture, 14–18 LSS, 130–137 future and direction, 5–9 Categories of Risk Management, 483–484, 490–491 integration and improvement, 12–13 Causal analysis, 577 CMMI Product Suite, 577 Causal Analysis and Resolution (CAR) process area “CMMI Roadmaps” report, 72 Address Causes of Defects goal, 238–241 CMMI-SVC, 11, 73 Collect Process Related Experiences practice CMMI Version 1.3 project participants, 565 elaboration, 222 Acquisition Mini Team, 570 Control Work Products practice elaboration, 192 Configuration Control Board, 567–568 Determine Causes of Defects goal, 235–238 Coordination Team, 567 Establish an Organizational Policy practice Core Model Team, 568–569 elaboration, 169 High Maturity Team, 569 Identify and Involve Relevant Stakeholders Quality Team, 572 practice elaboration, 198 SCAMPI Upgrade Team, 570 introductory notes, 233–234 Services Advisory Group, 566–567


Services Mini Team, 570 Establish Baselines goal, 246–250 Steering Group, 565–566 Establish Integrity goal, 253–255 SVC Training Teams, 571–572 generic goals and practices, 227–228 Training Teams, 571 Identify and Involve Relevant Stakeholders Translation Team, 569 practice elaboration, 198 Collaboration, 282–285 introductory notes, 243–245 Collect Process Related Experiences practice Monitor and Control the Process practice overview, 221–226 elaboration, 205 process area roles in, 230 Objectively Evaluate Adherence practice Commercial off-the-shelf (COTS) product elaboration, 212 components. See COTS Organizational Training, 373 Commitments Plan the Process practice elaboration, 175 Project Monitoring and Control, 396 Project Planning, 415 Project Planning, 422–423 Provide Resources practice elaboration, 178 Common cause of variation, 577 purpose, 243 Communicate and Resolve Noncompliance Issues related process areas, 245 practice, 430–431 starting with, 138 Communicate Results practice, 302 support by, 79 Communications, process standards for, 88 Technical Solution, 510, 522 Compatibility, interface, 384–387 Track and Control Changes goal, 250–253 Competitive business intelligence, 339 Train People practice elaboration, 186 Component requirements Configuration status Requirements Development, 455, 463–465 accounting of, 578 Requirements Management, 473 Product Integration, 388 Technical Solution, 510 Confirm Readiness of Product Components for ptg Validation, 533 Integration practice, 387–388 Verification, 541 Conflict resolution, 461 Compose the Defined Process practice, 440–442 Constellations Computer factory, 96 CMMI for Development, 18 Conduct Milestone Reviews practice, 399–400 CMMI Framework, 14–16 Conduct Peer Reviews practice, 549 defined, 578 Conduct Progress Reviews practice, 399 expanding capabilities across, 72–74 Configuration audits Consultants defined, 577 hiring, 152–157 Supplier Agreement Management, 506 limitations, 123 Configuration baselines, 577 Context in small company process improvement, 101 Configuration control, 577–578 Continuous improvement, 7 Configuration control boards Continuous representation of CMMs CMMI Version 1.3 project, 567–568 defined, 578–579 defined, 578 levels in, 32–34 Configuration identification, 578 process areas in, 46–49 Configuration items, 578 Continuous thinking, 72 Configuration management, 578 Contractual requirements, 579 Configuration Management (CM) process area Contribute to Organizational Process Assets Collect Process Related Experiences practice practice, 280–282 elaboration, 222 contributing authors, biographies, 609–620 Control Work Products practice elaboration, Control Configuration Items practice, 252–253 192 Control Work Products practice Establish an Organizational Policy practice overview, 191–196 elaboration, 170 process area roles in, 228


Coordinate and Collaborate with Relevant Curriculum, training, 372 Stakeholders goal, 282 Curtis, Bill Manage Dependencies practice, 283–284 biography, 611 Manage Stakeholder Involvement practice, executive responsibilities essay, 91–94 282–283 Customers Resolve Coordination Issues practice, 284–285 defined, 579 Coordination Team, 567 in process improvement, 93 Core Model Team, 568–569 requirements, 579 Core process areas, 20 Requirements Development, 455, 459–462 Corporate Engineering Process Group (CEPG) at Customer Satisfaction ABB, 125 business objectives, 54, 55, 139, 319, 355 Corporate Software Engineering Process Group evaluations, 322, 435, 501 (CSEPG) at ABB, 124–125 measures, 292, 399, 501 Corrective actions process performance objectives, 320, 337, 464 defined, 579 surveys, 292, 317, 460 Integrated Project Management, 284 Product Integration, 387 Project Monitoring and Control, 394–402 D Project Planning, 413–414 Dagnino, Aldo Quantitative Project Management, 435 ABB case study, 123–130 Requirements Development, 456 biography, 612 Requirements Management, 475 Dangle, Kathleen C. Supplier Agreement Management, 506 biography, 610 Verification, 541 empirical thinking essay, 37–41 Costs Data, defined, 579 ptg data collection and processing, 77 Data collection and management Project Planning, 409–411 costs, 77 COTS (commercial off-the-shelf) product defined, 579 components Measurement and Analysis, 294–295 alternative solutions, 512, 513, 515, 524, 525 Project Monitoring and Control, 397–398 applicability of SAM, 497 Requirements Management, 478 defined, 577 Decision Analysis and Resolution (DAR) process evaluation, 502, 512, 514, 518, 525 area modified COTS, 497, 500, 502, 504 Collect Process Related Experiences practice risk of unavailability, 276, 497 elaboration, 222 type of acquisition, 500, 504 Control Work Products practice elaboration, 192 Create or Release Baselines practice, 250 Establish an Organizational Policy practice Crises, focus on, 6 elaboration, 170 Criteria Evaluate Alternatives goal, 259–266 Organizational Process Definition, 308–310 Identify and Involve Relevant Stakeholders Product Integration, 382–383 practice elaboration, 198 Requirements Management, 476 introductory notes, 257–259 Supplier Agreement Management, 503 Monitor and Control the Process practice Technical Solution, 522–523 elaboration, 206 Validation, 537 Objectively Evaluate Adherence practice Verification, 546, 548 elaboration, 212 Crosby, Phillip, 9 overview, 79–80 Culture, improvements from, 147 Plan the Process practice elaboration, 175


Provide Resources practice elaboration, 179 Deployment in Organizational Process Focus, purpose, 257 325–330 related process areas, 259 Derived measures Train People practice elaboration, 186 defined, 580 Defects and defect density expressing, 292 Causal Analysis and Resolution, 235–241 Derived requirements defined, 579 defined, 580–581 Measurement and Analysis, 292 Requirements Development, 462 Organizational Process Performance, 352 Requirements Management, 479 Deficiencies, root cause analysis for, 451–453 Validation, 534 Define Project Lifecycle practice, 408–409 Descriptions, process, 220–221 Define Risk Parameters practice, 485–486 Organizational Process Definition, 307–308 Defined capability level, 35 Process and Product Quality Assurance, 425 Defined maturity level, 43, 46 Design Interfaces Using Criteria practice, 522–523 Defined processes Design reviews defined, 580 defined, 581 Integrated Project Management, 267–282 documentation, 415 Organizational Process Definition, 303 Design the Product or Product Component practice, overview, 166–168 517–521 Quantitative Project Management, 433–434, Determine Acquisition Type practice, 499–500 440–442 Determine Causes of Defects goal, 235 Definition of required functionality and quality Analyze Causes practice, 236–238 attributes. See also Establish a Definition of Select Outcomes for Analysis practice, 235–236 Required Functionality and Quality Attributes Determine Process Improvement Opportunities practice goal, 318–319 ptg defined, 580 Appraise the Organization’s Processes practice, discussed, 456-457, 468 321–322 Deliver Training practice, 373–374 Establish Organizational Process Needs practice, Deliverable items, 580 319–320 Delivery environments, 580 Identify the Organization’s Process Improvements DeMarco, Tom, 94, 97, 111 practice, 322–323 Deming, W. Edwards, 9 Determine Risk Sources and Categories practice, Dependencies, 283–284 483–484 Deploy Improvements goal, 346 Determine Which Training Needs Are the Evaluate Improvement Effects practice, 349 Responsibility of the Organization practice, Manage the Deployment practice, 347–349 368–369 Plan the Deployment practice, 346–347 Develop a Project Plan goal, 411 Deploy Organizational Process Assets and Establish the Budget and Schedule practice, Incorporate Experiences goal, 325 412–414 Deploy Organizational Process Assets practice, Establish the Project Plan practice, 420–421 325–327 Identify Project Risks practice, 414–415 Deploy Standard Processes practice, 327–328 Plan for Data Management practice, 415–417 Incorporate Experiences into Organizational Plan Needed Knowledge and Skills practice, Process Assets practice, 329–330 418–419 Monitor Implementation practice, 328–329 Plan Stakeholder Involvement practice, 419–420 Deploy Organizational Process Assets practice, Plan the Project’s Resources practice, 417–418 325–327 Develop Alternative Solutions and Selection Criteria Deploy Standard Processes practice, 327–328 practice, 513–515


Develop Customer Requirements goal, 459 Empirical software engineering principles, 37–41 Elicit Needs practice, 460–461 End users, 581 Transform Stakeholder Needs into Customer Engineering Requirements practice, 461–462 empirical, 37–41 Develop Product Requirements goal, 462 principles, 37–41 Allocate Product Component Requirements in successful process improvement, 95–98 practice, 464–465 Technical Solution, 519, 527 Establish Product and Product Component Verification, 544 Requirements practice, 463–464 Engineering process areas Identify Interface Requirements practice, 465–466 overview, 68–71 Develop Product Support Documentation practice, recursion and iteration of, 74–75 528–529 Ensure Alignment Between Project Work and Develop Risk Mitigation Plans practice, 492–494 Requirements practice, 479–480 Develop the Design goal, 517 Ensure Interface Compatibility goal, 384 Design Interfaces Using Criteria practice, Manage Interfaces practice, 385–387 522–523 Review Interface Descriptions for Completeness Design the Product or Product Component practice, 384–385 practice, 517–521 Ensure Transition of Products practice, 507–508 Establish a Technical Data Package practice, Enterprise class measurements, 76 521–522 Enterprises Perform Make, Buy, or Reuse Analyses practice, defined, 581 524–525 standard processes for, 305 Developer involvement, 93 Entry criteria Development defined, 581 defined, 581 peer reviews, 548 ptg Integrated Project Management for, 268 Equivalent staging Developmental plans, 543 defined, 581 Disciplines, Requirements Development, 459 overview, 49–52 “Do Systems Engineering? Who, Me?”, 17 Establish a Configuration Management System Documentation practice, 248–249 process, 103–104 Establish a Defined Process practice, 220–221, 230 Project Planning, 415 Establish a Definition of Required Functionality and Requirements Management, 474 Attributes practice, 468–469 Technical Solution, 528–529 Establish a Risk Management Strategy practice, Documents, 581 486–487 Doubter to believer process, 157–161 Establish a Technical Data Package practice, Drucker, Peter, 8 521–522 Dynaxity, 108–110 Establish a Training Capability practice, 370–372 Establish an Integration Strategy practice, 379–381 Establish an Organizational Policy practice, E 169–173 EF (Experience Factory) concept, 40–41 Establish an Organizational Training Capability El Emam, Khaled goal, 367 biography, 612–613 Determine Which Training Needs Are the process improvement in small companies essay, Responsibility of the Organization practice, 101–104 368–369 Elicit Needs practice, 460–461 Establish a Training Capability practice, 370–372 Elicit Suggested Improvements practice, 338–340 Establish an Organizational Training Tactical Plan “Embedded Software Rules,” 108 practice, 369–370


Establish the Strategic Training Needs practice, Establish Organization’s Measurement Repository 367–368 practice, 310–312 Establish an Organizational Training Tactical Plan Establish Performance Baselines and Models goal, practice, 369–370 353–354 “Establish and maintain,” 582 Analyze Process Performance and Establish Establish Baselines goal, 246–250 Performance Baselines practice, 359–361 Create or Release Baselines practice, 250 Establish Process Performance Measures practice, Establish a Configuration Management System 358–359 practice, 248–249 Establish Process Performance Models practice, Identify Configuration Items practice, 246–248 362–363 Establish Configuration Management Records Establish Quality and Process Performance practice, 253–254 Objectives practice, 354–356 Establish Estimates goal, 405 Select Processes practice, 356–358 Define Project Lifecycle practice, 408–409 Establish Process Action Plans practice, 323–324 Establish Estimates of Work Product and Task Establish Process Performance Measures practice, Attributes practice, 407–408 358–359 Estimate Effort and Cost practice, 409–411 Establish Process Performance Models practice, Estimate the Scope of the Project practice, 362–363 406–407 Establish Product and Product Component Establish Estimates of Work Product and Task Requirements practice, 463–464 Attributes practice, 407–408 Establish Product Integration Procedures and Establish Evaluation Criteria practice, 261–262 Criteria practice, 382–383 Establish Guidelines for Decision Analysis practice, Establish Quality and Process Performance 260–261 Objectives practice, 354–356 Establish Integrity goal, 253 Establish Records practice, 431 ptg Establish Configuration Management Records Establish Rules and Guidelines for Teams practice, practice, 253–254 314–315 Perform Configuration Audits practice, 254–255 Establish Standard Processes practice, 304–307 Establish Lifecycle Model Descriptions practice, Establish Supplier Agreements goal, 499 307–308 Determine Acquisition Type practice, 499–500 Establish Measurement Objectives practice, Establish Supplier Agreements practice, 502–504 289–291 Select Suppliers practice, 500–502 Establish Operational Concepts and Scenarios Establish Supplier Agreements practice, 502–504 practice, 467–468 Establish Tailoring Criteria and Guidelines Process Establish Organizational Process Assets goal, 304 practice, 308–310 Establish Lifecycle Model Descriptions practice, Establish Teams practice, 279–280 307–308 Establish the Budget and Schedule practice, Establish Organization’s Measurement Repository 412–414 practice, 310–312 Establish the Organization’s Process Asset Library Establish Rules and Guidelines for Teams practice, 312–313 practice, 314–315 Establish the Product Integration Environment Establish Standard Processes practice, 304–307 practice, 381–382 Establish Tailoring Criteria and Guidelines Establish the Project Plan practice, 420–421 Process practice, 308–310 Establish the Project’s Defined Process practice, Establish the Organization’s Process Asset Library 269–272 practice, 312–313 Establish the Project’s Objectives practice, 437–440 Establish the Work Environment Standards Establish the Project’s Work Environment practice, practice, 313 273–275 Establish Organizational Process Needs practice, Establish the Strategic Training Needs practice, 319–320 367–368


Establish the Validation Environment practice, Experience factor for consultants and lead 535–536 appraisers, 152–154 Establish the Verification Environment practice, Experience Factory (EF) concept, 40–41 545–546 Experiences. See also Collect Process Related Establish the Work Environment Standards practice, Experiences practice and Defined processes 313 Integrated Project Management, 280-282 Establish Training Records practice, 374 Organizational Process Focus, 325–330 Establish Validation Procedures and Criteria Exploitation, reverse, 144–145 practice, 537 Establish Verification Procedures and Criteria practice, 546 F Estimate Effort and Cost practice, 409–411 Factory culture, 95–98 Estimate the Scope of the Project practice, 406–407 False facades, 140 Estimates Formal evaluation processes effort and cost, 409–411 DAR. See Decision Analysis and Resolution scope, 406–407 (DAR) process area work product, 407–408 defined, 582 Evaluate, Categorize, and Prioritize Risks practice, Foundation, CMMI model, 15 490–491 Fowler, Priscilla, 160 Evaluate Alternatives goal, 259 Framework architecture for CMMI, 14–18 Establish Evaluation Criteria practice, 261–262 Functional analysis Establish Guidelines for Decision Analysis defined, 582 practice, 260–261 discussed, 468 Evaluate Alternatives practice, 264–265 Functional architecture. See also Architecture Identify Alternative Solutions practice, 262–263 defined, 582 ptg Select Evaluation Methods practice, 263–264 discussed, 468 Select Solutions practice, 265–266 Functional baselines, 250 Evaluate Alternatives practice, 264–265 Future and direction of CMMI, 5–9 Evaluate Assembled Product Components practice, 389 Evaluate Improvement Effects practice, 349 G Evaluate the Effect of Implemented Actions gap analysis in SPAWAR,133 practice, 240–241 Generic goals and practices, 23–25 Evaluation Achieve Specific Goals, 169 Decision Analysis and Resolution, 263–264 applying, 226–227 processes and work products, 428–429 defined, 583 risk, 490–491 elaboration, 583 Evolution of CMMI, 10–11 Institutionalize a Defined Process Evolving critical systems, 110 Collect Process Related Experiences practice, Example work products, 24, 582 221–226 Examples for process areas, 25–26 Establish a Defined Process practice, 220–221 Execute the Supplier Agreement practice, 504–506 Institutionalize a Managed Process Executive responsibilities in process improvement, Assign Responsibility practice, 185 91–94 Control Work Products practice, 191–196 Exit criteria Establish an Organizational Policy practice, defined, 582 169–173 Verification, 549 Identify and Involve Relevant Stakeholders Expected components practice, 196–204 defined, 582 Monitor and Control the Process practice, in process areas, 20 204–211


Objectively Evaluate Adherence practice, Review Status with Higher Level Management 211–218 practice, 218–220, 230 Plan the Process practice, 173–178 Hitachi Software Engineering (HSE), 96–98 Provide Resources practice, 178–184 Humphrey, Watts, v, 9, 91, 317,556 Review Status with Higher Level Management biography, 614 practice, 218–220 CMMI history essay, 5–9 Train People practice, 186–191 Hypothesis testing, Quantitative Project overview, 165 Management, 447 processes in defined, 166–168 institutionalization, 165–166 I managed, 166 IDEAL (Initiating, Diagnosing, Establishing, Acting, performed, 166 and Learning) model relationships, 168 at ABB, 125 support for, 227–231 for planning, 90 purpose, 150 Identify Alternative Solutions practice, 262–263 Glazer, Hillel Identify and Analyze Risks goal, 487 biography, 613 Evaluate, Categorize, and Prioritize Risks missing links essay, 146–152 practice, 490–491 Glossary, 573–604 Identify Risks practice, 487–490 Goal Question Model approach, 55 Identify and Involve Relevant Stakeholders practice Goals. See also specific goals by name overview, 196–204 in empirical software engineering, 38–39 process area roles in, 229 generic. See Generic goals and practices Identify Configuration Items practice, 246–248 in process improvement, 91 Identify Interface Requirements practice, 465–466 ptg Good/Question/Metric (GQM) approach, 40 Identify Potential Areas for Improvement practice, GQM+Strategies approach, 41 336–337 Guidelines Identify Project Risks practice, 414–415 missing links, 146–152 Identify Risks practice, 487–490 startup, 137–141 Identify the Organization’s Process Improvements practice, 322–323 IDS (Integrated Defense Systems), 53–54, 58 H IEEE Standard 1012, 89 Happy family model, 5–6 Implement Action Proposals practice, 239–240 Hardware development plans, 421 Implement Process Action Plans practice, 324–325 Hardware engineering Implement Risk Mitigation Plans practice, 494–495 defined, 583 Implement the Design practice, 526–528 Technical Solution, 519, 527 Implement the Product Design goal, 526 Validation, 534 Develop Product Support Documentation Verification, 544 practice, 528–529 Harrison, Kileen Implement the Design practice, 526–528 biography, 613 Implementation LSS essay, 130–137 Organizational Process Focus, 328–329 Hayes, Will process improvements, 323–325 biography, 614 Improvement suggestion. See also Suggested hiring process essay,152–157 improvements Heinecke, H., 108 address identified improvement areas, 337 High Maturity Team, 569 example of process related experience, 221 Higher level management Improvements defined, 583 defining, 141–144


Improvements (continued) Integrated Master Plans, 421 OPM. See Organizational Performance Integrated Master Schedules, 421 Management (OPM) process area Integrated Product Development Capability process. See Process improvements Maturity Model (IPD-CMM), 10 Incentives alignment, 92 Integrated Project Management (IPM) process area, Incident Resolution and Prevention (IRP) process 68 area, 74 Collect Process Related Experiences practice Incomplete capability level, 35 elaboration, 222 Incomplete processes, 583 Control Work Products practice elaboration, 192 Inconsistencies in project work and requirements, Coordinate and Collaborate with Relevant 479–480 Stakeholders goal, 282–285 Incorporate Experiences into Organizational Establish an Organizational Policy practice Process Assets practice, 329–330 elaboration, 170 Incremental improvements, 338 generic goals and practices, 230 Industrial practices, improving, 107–111 Identify and Involve Relevant Stakeholders Information assurance, references and sources, 560 practice elaboration, 198 Information security, references and sources, 560 introductory notes, 267–268 Informative components Monitor and Control the Process practice defined, 583 elaboration, 206 in process areas, 20 Objectively Evaluate Adherence practice Initial maturity level, 42 elaboration, 212 Innovative improvements, 338–340. See also Plan the Process practice elaboration, 175 Organizational Performance Management Provide Resources practice elaboration, 179 (OPM) process area purpose, 267 Institutionalization related process areas, 268–269 ptg defined, 583 Train People practice elaboration, 187 process, 165–166 Use the Project’s Defined Process goal, 269–282 Institutionalize a Defined Process goal Integrity, Configuration Management, 253–255 Collect Process Related Experiences practice, Interface control 221–226 defined, 584 Establish a Defined Process practice, 220–221 Product Integration, 384–387 Institutionalize a Managed Process goal Technical Solution, 522–523 Assign Responsibility practice, 185 International standards for lifecycle processes, 88 Control Work Products practice, 191–196 Introduction to CMMI for Development, 107 Establish an Organizational Policy practice, Introductory notes for process areas, 22 169–173 IPD-CMM (Integrated Product Development Identify and Involve Relevant Stakeholders Capability Maturity Model), 10 practice, 196–204 ISO 9001 standard, 86–87 Monitor and Control the Process practice, ISO/IEC 15504 standard, 86 204–211 ISO/IEC/IEEE 15288 standard, 86, 88 Objectively Evaluate Adherence practice, ISO/IEC/IEEE/EIA 12207 standard, 86, 88 211–218 Issues analysis, 400–401 Plan the Process practice, 173–178 Iteration Provide Resources practice, 178–184 engineering processes, 74–75 Review Status with Higher Level Management in small company process improvement, practice, 218–220 101–103 Train People practice, 186–191 Instructors, developing, 372 Integrate Plans practice, 275–277 J Integrated Defense Systems (IDS), 53–54, 58 Juran, Joseph, 9


K process area selection, 132–133 Kaizen movement, 95–98 results and recommendations, 134–135 Keeni, Gargi synergistic effect, 130–131 biography, 615 tools, 133–134 PPT essay, 80–83 Learning from experience, 39 Key performance indicators (KPIs), ABB, 128 Learning step in CMMI, 6 KNORR Brake Corporation case study, 113–114 Legal and regulatory requirements. See Laws and lessons learned, 122–123 regulations ongoing work, 120–121 Lessons learned in small company process successful steps, 116–120 improvement, 103–104 wrong route, 114–115 Levels, 32–33 Knowledge needs, Project Planning, 418–419 capability, 34–37 Komai, Kenichiro, 96 designations, 41–46 Konrad, Mike Lifecycle processes and models acknowledgments, xxiii defined, 584, 591 biography, 608 international standards for, 88 CMMI framework essay, 16–18 Organizational Process Definition, 304, 307–308 Kraus, Peter Project Planning, 408–409 biography, 615 Quantitative Project Management, 441 models for success essay, 53–58 Requirements Development, 456, 461 Kugler, Hans-Jürgen standard processes for, 305 biography, 615 Technical Solution, 509, 517, 521 industrial practices essay, 107–111 Linear regression techniques with SLAM, 56 Lister, Tim, 94 LSS. See Lean Six Sigma (LSS) case study ptg L Laggards, 94 Laws and regulations, use in M basis for when to require a formal evaluation, 260 Machine factory, 95–98 process definition, 87, 88, 102, 110 Mackertich, Neal process selection, 441 biography, 616 requirements, 459, 461, 473, 580 models for success essay, 53–58 risk sources, 484 Maher, John, 160 stakeholders, 197 Maintain Bidirectional Traceability of Requirements supplier agreements, 502, 506 practice, 478–479 teams, 314 Maintain Business Objectives practice, 334–335 Lead appraisers Manage Business Performance goal, 334 hiring, 152–157 Analyze Process Performance Data practice, 336 importance, 123 Identify Potential Areas for Improvement practice, Leadership, improvements from, 147 336–337 Lean approaches, 110 Maintain Business Objectives practice, 334–335 Lean Six Sigma (LSS) case study, 130 Manage Corrective Action practice, 402 approach, 131–132 Manage Corrective Action to Closure goal, 400 background context, 131 Analyze Issues practice, 400–401 challenges, 131 Manage Corrective Action practice, 402 gap analysis, 133 Take Corrective Action practice, 401–402 lessons learned and critical success factors, Manage Dependencies practice, 283–284 135–137 Manage Interfaces practice, 385–387 mission critical risks, 132 Manage Project Performance practice, 449–451


Manage Requirements Changes practice, 477–478 Monitor and Control the Process practice Manage Requirements goal, 475 elaboration, 206 Ensure Alignment Between Project Work and Objectively Evaluate Adherence practice Requirements practice, 479–480 elaboration, 213 Maintain Bidirectional Traceability of Plan the Process practice elaboration, 175 Requirements practice, 478–479 Provide Measurement Results goal, 298–302 Manage Requirements Changes practice, 477–478 Provide Resources practice elaboration, 179 Obtain Commitment to Requirements practice, 477 purpose, 287 Understand Requirements practice, 476–477 related process areas, 288 Manage Stakeholder Involvement practice, 282–283 in small company process improvement, 103 Manage the Deployment practice, 347–349 support by, 78–79 Manage the Project Using Integrated Plans practice, Train People practice elaboration, 187 277–279 training for, 77 Managed capability level, 35 Measurement repository Managed maturity level, 42 Configuration Management, 272–273 Managed processes. See also Institutionalize a Measurement and Analysis, 288 Managed Process goal Organizational Innovation and Deployment, defined, 584 310–312 overview, 166 Measurement specialists, 76–77 Management Measurements in process improvement, 91–94 defined, 584 status reviews with, 218–220, 230 Quantitative Project Management, 445–447 Managers, 584 results, 584 Manifesto for Agile Development, 100 Measures, 584 Martin, Cecil, 160 Memoranda of agreement ptg Matsubara, Tomoo, 16–17 defined, 584 biography, 616 Supplier Agreement Management, 499 engineering culture essay, 95–98 Mentors, developing, 372 Maturity levels Method Definition Document (MDD), 105 defined, 584 Milestone reviews, 399–400 designations, 41–46 Mills, Harlan, 159 expertise for, 150–151 Mitigate Risks goal, 491–492 logic for, 6–7 Develop Risk Mitigation Plans practice, 492–494 overview, 31–34 Implement Risk Mitigation Plans practice, MDD (Method Definition Document), 105 494–495 Measurement Models in empirical software engineering, 38, 40 CMMI, 10–11, 99–100 for meaningful improvement, 75–77 limitations, 149 in small company process improvement, 104 for success, 53–58 Measurement and Analysis (MA) process area, 75 Mogilensky, Judah Align Measurement and Analysis Activities goal, biography, 616–617 289–298 process improvement essay, 137–141 Collect Process Related Experiences practice Monitor and Control the Process practice elaboration, 223 overview, 204–211 Control Work Products practice elaboration, 193 process area roles in, 229 Establish an Organizational Policy practice Monitor Commitments practice, 396 elaboration, 170 Monitor Data Management practice, 397–398 Identify and Involve Relevant Stakeholders Monitor Implementation practice, 328–329 practice elaboration, 199 Monitor Project Planning Parameters practice, introductory notes, 287–288 394–396


Monitor Project Risks practice, 397 Objectively Evaluate Work Products practice, 429 Monitor Stakeholder Involvement practice, 398 Objectives, business, 334–335, 435 Monitor the Performance of Selected Subprocesses Observation in empirical software engineering, 38 practice, 448–449 Obtain Commitment to Requirements practice, 477 Monitor the Project Against the Plan goal, 394 Obtain Commitment to the Plan goal, 422 Conduct Milestone Reviews practice, 399–400 Obtain Plan Commitment practice, 423 Conduct Progress Reviews practice, 399 Reconcile Work and Resource Levels practice, 422 Monitor Commitments practice, 396 Review Plans That Affect the Project practice, 422 Monitor Data Management practice, 397–398 Obtain Measurement Data practice, 298–299 Monitor Project Planning Parameters practice, Obtain Plan Commitment practice, 423 394–396 Operational concepts and scenarios Monitor Stakeholder Involvement practice, 398 defined, 585 Monitoring. See Project Monitoring and Control Requirements Development, 462, 467–468 (PMC) process area Technical Solution, 518 Moore, James W. Validation, 537 biography, 617 Oppenheimer, Heather process standards essay, 85–89 biography, 618 Morin, Joseph KNORR Brake case study, 113–123 biography, 618 Optimizing maturity level, 44 journey to CMMI essay, 157–161 Organizational business objectives defined, 586 managing, 435 N Organizational maturity, 586 Natural bounds Organizational measurement repository defined, 585 Configuration Management, 272–273 ptg estimating, 361 defined, 586 Noncompliance issues, 430–431 Measurement and Analysis, 288 Nondevelopmental items Organizational Innovation and Deployment, defined, 585 310–312 Project Planning, 406 Organizational Performance Management (OPM) Technical Solution, 525 process area Nontechnical requirements Collect Process Related Experiences practice defined, 585 elaboration, 223 managing, 473 Control Work Products practice elaboration, 193 Notes for process areas, 25 Deploy Improvements goal, 346–349 Numbering schemes for process areas, 26–27 Establish an Organizational Policy practice elaboration, 170–171 Identify and Involve Relevant Stakeholders O practice elaboration, 200 Objective evaluation, 425, 428–429 introductory notes, 331–333 Objective evidence, 298 Manage Business Performance goal, 334–337 Objective insights, 430–431 Monitor and Control the Process practice Objectively Evaluate Adherence practice, 211–218, elaboration, 207 229 Objectively Evaluate Adherence practice Objectively evaluate process, 585 elaboration, 214 Objectively Evaluate Processes and Work Products overview, 62 goal, 428 Plan the Process practice elaboration, 175–176 Objectively Evaluate Processes practice, 428–429 Provide Resources practice elaboration, 180 Objectively Evaluate Work Products practice, 429 purpose, 331 Objectively Evaluate Processes practice, 428–429 related process areas, 333


Organizational Performance Management (OPM) Deploy Organizational Process Assets and process area (continued) Incorporate Experiences goal, 325–330 Review Status with Higher Level Management Determine Process Improvement Opportunities practice elaboration, 219 goal, 318–323 Select Improvements goal, 337–346 Establish an Organizational Policy practice Train People practice elaboration, 188 elaboration, 170 Organizational policies Identify and Involve Relevant Stakeholders defined, 586 practice elaboration, 199 Process and Product Quality Assurance, 427 introductory notes, 317–318 in process improvement, 93 Monitor and Control the Process practice Verification, 546 elaboration, 207 Organizational process assets Objectively Evaluate Adherence practice defined, 586 elaboration, 213 Integrated Project Management, 272–273, Plan and Implement Process Actions goal, 280–282 323–325 libraries, 586 Plan the Process practice elaboration, 175 Organizational Process Definition, 304–315 Provide Resources practice elaboration, 180 Organizational Process Focus, 317, 325–330 purpose, 317 Organizational Process Performance, 352 related process areas, 318 Quantitative Project Management, 434 Review Status with Higher Level Management Technical Solution, 523 practice elaboration, 219 Organizational Process Definition (OPD) process Train People practice elaboration, 187 area Organizational Process Performance (OPP) process Collect Process Related Experiences practice area elaboration, 223 Collect Process Related Experiences practice ptg Control Work Products practice elaboration, 193 elaboration, 224 Establish an Organizational Policy practice Control Work Products practice elaboration, 194 elaboration, 170 Establish an Organizational Policy practice Establish Organizational Process Assets goal, elaboration, 171 304–315 Establish Performance Baselines and Models goal, generic goals and practices, 227 353–363 Identify and Involve Relevant Stakeholders Identify and Involve Relevant Stakeholders practice elaboration, 199 practice elaboration, 200 introductory notes, 303–304 introductory notes, 351–353 Monitor and Control the Process practice Monitor and Control the Process practice elaboration, 206 elaboration, 207 Objectively Evaluate Adherence practice Objectively Evaluate Adherence practice elaboration, 213 elaboration, 214 overview, 60, 102 overview, 62 Plan the Process practice elaboration, 175 Plan the Process practice elaboration, 176 Provide Resources practice elaboration, 179–180 Provide Resources practice elaboration, 180 purpose, 303 purpose, 351 related process areas, 304 related process areas, 353 Train People practice elaboration, 187 Train People practice elaboration, 188 Organizational Process Focus (OPF) process area, 60 Organizational set of standard processes Assign Responsibility practice elaboration, 185 defined, 586 Collect Process Related Experiences practice Organizational Process Performance, 351 elaboration, 223 Organizational Training, 366 Control Work Products practice elaboration, 193 Quantitative Project Management, 434


Organizational Training (OT) process area Conduct Peer Reviews practice, 549 Collect Process Related Experiences practice Prepare for Peer Reviews practice, 547–548 elaboration, 224 Perform Root Cause Analysis practice, 451–453 Control Work Products practice elaboration, 194 Perform Specific Practices practice, overview, 169 Establish an Organizational Policy practice Perform Validation practice, 538 elaboration, 171 Perform Verification practice, 551 Establish an Organizational Training Capability Performance parameters, 587 goal, 367–372 Performed capability level, 35 generic goals and practices, 228 Performed processes Identify and Involve Relevant Stakeholders defined, 587 practice elaboration, 200 description, 166 introductory notes, 365–366 Personal responsibility, 91 Monitor and Control the Process practice Phillips, Mike elaboration, 207 biography, 619 Objectively Evaluate Adherence practice constellations essay, 72–74 elaboration, 214 Pilots for improvement validation, 343 overview, 62 Pitfalls, 139–141 Plan the Process practice elaboration, 176 “improvement” not defined, 141–144 Provide Resources practice elaboration, 181 reverse exploitation, 144–145 Provide Training goal, 372–375 summary, 145–146 purpose, 365 Plan and Implement Process Actions goal, 323 related process areas, 366 Establish Process Action Plans practice, 323–324 Train People practice elaboration, 188 Implement Process Action Plans practice, Organizations, 585 324–325 O’Toole, Pat Plan for Data Management practice, 415–417 ptg biography, 618–619 Plan Needed Knowledge and Skills practice, process improvement pitfalls essay, 141–146 418–419 Plan Stakeholder Involvement practice, 419–420 Plan the Deployment practice, 346–347 P Plan the Process practice, 173–178, 228 Package and Deliver the Product or Product Plan the Project’s Resources practice, 417–418 Component practice, 390–391 Planned processes Parnas, Dave, 107 defined, 587 Pathological box-checking, 139 Organizational Process Focus, 323 Peer reviews Process and Product Quality Assurance, 425 defined, 587 Plans. See also Project Planning (PP) process area Integrated Project Management, 271 Integrated Project Management, 275–279 Measurement and Analysis, 292 Organizational Process Focus, 317–318, 323–325 Organizational Process Definition, 307 Organizational Training, 367 Process and Product Quality Assurance, 426 Requirements Management, 475 starting with, 138 Risk Management, 490, 494–495 Technical Solution, 528 Verification, 543, 545 Verification, 542, 547–550 Policies People in PPT triangle, 81 Process and Product Quality Assurance, 427 Perform Configuration Audits practice, 254–255 in process improvement, 93 Perform Make, Buy, or Reuse Analyses practice, Verification, 546 524–525 Policy, Procedures, Processes, Practices & Perform Peer Reviews goal, 547 Instructions (P4I) workbook, 132 Analyze Peer Review Data practice, 550 PPT (process, people, and technology) triangle, 80–83


Practices Process action plans, 587 generic. See Generic goals and practices Process action teams, 587 as means to ends, 149 Process and Product Quality Assurance (PPQA) for process areas, 23–25. See also specific practices process area by name Assign Responsibility practice elaboration, 185 Prem, Anne Collect Process Related Experiences practice biography, 619–620 elaboration, 225 LSS essay, 130–137 Control Work Products practice elaboration, 195 Prepare for Peer Reviews practice, 547–548 Establish an Organizational Policy practice Prepare for Product Integration goal, 379 elaboration, 171 Establish an Integration Strategy practice, generic goals and practices, 229 379–381 Identify and Involve Relevant Stakeholders Establish Product Integration Procedures and practice elaboration, 201 Criteria practice, 382–383 introductory notes, 425–427 Establish the Product Integration Environment Monitor and Control the Process practice practice, 381–382 elaboration, 208 Prepare for Quantitative Management goal, 436 Objectively Evaluate Adherence practice Compose the Defined Process practice, 440–442 elaboration, 216 Establish the Project’s Objectives practice, Objectively Evaluate Processes and Work 437–440 Products goal, 428–429 Select Measures and Analytic Techniques practice, overview, 80–83, 93 445–447 Plan the Process practice elaboration, 177 Select Subprocesses and Attributes practice, Provide Objective Insight goal, 430–431 443–444 Provide Resources practice elaboration, 182 Prepare for Risk Management goal, 482 purpose, 425 ptg Define Risk Parameters practice, 485–486 related process areas, 427–428 Determine Risk Sources and Categories practice, support by, 79 483–484 Train People practice elaboration, 189 Establish a Risk Management Strategy practice, Process and technology improvements, 587 486–487 Process architecture, 587–588 Prepare for Validation goal, 533 Process areas, 19–25, 46–49. See also specific process Establish the Validation Environment practice, areas by name 535–536 core, 20 Establish Validation Procedures and Criteria defined, 588 practice, 537 Engineering, 68–71 Select Products for Validation practice, 533–535 expected components in, 20 Prepare for Verification goal, 543 generic practices support by, 227–231 Establish the Verification Environment practice, informative components in, 20, 25–26 545–546 numbering schemes for, 26–27 Establish Verification Procedures and Criteria Process Management, 60–64 practice, 546 Project Management, 64–68 Select Work Products for Verification practice, required components in, 19–20 544–545 support, 77–80 Pressures in process improvement, 94 typographical conventions for, 27–30 Prioritizing risks, 490–491 Process assets Problem areas, focus on, 6 defined, 588 Problem-solving process in Kaizen movement, Integrated Project Management, 272–273, 95–98 280–282 Process, people, and technology (PPT) triangle, libraries, 588 80–83 Organizational Process Definition, 304–315


Organizational Process Focus, 317, 325–330 Process performance models Organizational Process Performance, 352 defined, 589–590 Quantitative Project Management, 434 for success, 53–58 Technical Solution, 523 Process Program, 157–158, 160 Process attributes, 588 Processes Process capability, 588 architecture, 303–304 Process definitions, 588 attributes, 444 Process descriptions, 220–221 as concept, 144 defined, 588 defined, 166–168, 587 Organizational Process Definition, 308 documentation, 103–104 Process and Product Quality Assurance, 425 in empirical software engineering, 38 Process documentation, 144 institutionalization, 165–166 Process elements managed, 166 defined, 588 performed, 166 Organizational Process Definition for, 303–304 planned, 323, 425 Process groups, 588 in PPT triangle, 81 Process improvement plan (PIP) relationships among, 168 ABB, 127–128 tailoring. See Tailoring defined, 589 Product baselines, 250, 590 Process improvements Product components defined, 589 defined, 590 engineering culture for, 95–98 requirements, 590 executive responsibilities for, 91–94 Product Integration (PI) process area in maturity levels, 45 Assemble Product Components and Deliver the measurement for, 75–77 Product goal, 387–391 ptg objectives, 589 Collect Process Related Experiences practice Organizational Process Definition, 303 elaboration, 224 Organizational Process Focus, 317–323 Control Work Products practice elaboration, 194 Organizational Training, 367 Ensure Interface Compatibility goal, 384–387 paths, 32 Establish an Organizational Policy practice selections for, 98–99 elaboration, 171 in small companies, 101–104 Identify and Involve Relevant Stakeholders Verification, 542 practice elaboration, 201 Process management. See Organizational introductory notes, 377–378 Performance Management (OPM) process area Monitor and Control the Process practice Process measurements, 589 elaboration, 208 Process models, 144 Objectively Evaluate Adherence practice basis for appraisal, 322 elaboration, 215 Process owners, 589 overview, 71 Process performance. See also Organizational Plan the Process practice elaboration, 176 Process Performance (OPP) process area Prepare for Product Integration goal, 379–383 defined, 589 Provide Resources practice elaboration, 181 measurement for, 77 purpose, 377 Organizational Process Performance, 351–352, related process areas, 378–379 358–359 Train People practice elaboration, 188 Quantitative Project Management, 433, 439 Product integration strategy. See Establish an Verification, 552 Integration Strategy practice Process performance baselines, 433 Product lines defined, 589 Configuration Management, 245 for success, 53–58 defined, 591


Product lines (continued) generic goals and practices, 228–229 Product Integration, 378 Identify and Involve Relevant Stakeholders Project Planning, 404 practice elaboration, 201 Requirements Development, 457-458 introductory notes, 403–404 organization’s standard set of processes, 305 Monitor and Control the Process practice Technical Solution, 510 elaboration, 208 Verification, 541 Objectively Evaluate Adherence practice Product related lifecycle processes, 591 elaboration, 215 Products Obtain Commitment to the Plan goal, 422–423 defined, 590 overview, 64 lifecycle. See Lifecycle processes and models Plan the Process practice elaboration, 176 requirements. See Requirements Provide Resources practice elaboration, 182 standard processes for, 305 purpose, 403 Program Management Schedule, 118 related process areas, 404 Programs, process improvement, 94 Train People practice elaboration, 189 Progress and performance indicators, 394, 399 Project plans Project Change Request (PCR) process, 119 defined, 592 Project Management process areas, 64–68 Measurement and Analysis, 290 Project Monitoring and Control (PMC) process area Project Monitoring and Control, 393 Collect Process Related Experiences practice Requirements Management, 475 elaboration, 224 Risk Management, 490 Control Work Products practice elaboration, 194 Verification, 545 Establish an Organizational Policy practice Project progress and performance, 592 elaboration, 171 Project startup, 592 generic goals and practices, 229–230 Projects ptg Identify and Involve Relevant Stakeholders defined, 591–592 practice elaboration, 201 lifecycle. See Lifecycle processes and models introductory notes, 393 meaning of, 17 Manage Corrective Action to Closure goal, 400 in process improvement, 91–92 Monitor and Control the Process practice Prototypes elaboration, 208 defined, 592 Monitor the Project Against the Plan goal, Product Integration, 378 394–400 Project Planning, 409 Objectively Evaluate Adherence practice Requirements Development, 471 elaboration, 215 Risk Management, 493 overview, 66 Validation, 531 Plan the Process practice elaboration, 176 Verification, 543 Provide Resources practice elaboration, 182 Provide Measurement Results goal, 298 purpose, 393 Analyze Measurement Data practice, 299–300 related process areas, 393–394 Communicate Results practice, 302 Train People practice elaboration, 188 Obtain Measurement Data practice, 298–299 Project Planning (PP) process area Store Data and Results practice, 300–301 Collect Process Related Experiences practice Provide Objective Insight goal, 430 elaboration, 224 Communicate and Resolve Noncompliance Issues Control Work Products practice elaboration, 194 practice, 430–431 Develop a Project Plan goal, 411–421 Establish Records practice, 431 Establish an Organizational Policy practice Provide Resources practice elaboration, 171 overview, 178–184 Establish Estimates goal, 405–411 process area roles in, 228


Provide Training goal, 372–373 Quantitative management, 593 Assess Training Effectiveness practice, 374–375 Quantitative objectives Deliver Training practice, 373–374 defined, 593 Establish Training Records practice, 374 Organizational Process Performance, 354–356 Purpose statements, 22 Quantitative Project Management (QPM) process area Collect Process Related Experiences practice Q elaboration, 225 QIP (Quality Improvement Paradigm), 40 Control Work Products practice elaboration, 195 Quality, defined, 592 Establish an Organizational Policy practice Quality and process performance objectives, 53–54 elaboration, 172 defined, 592 Identify and Involve Relevant Stakeholders Organizational Process Performance, 351, practice elaboration, 202 354–356 introductory notes, 433–435 Quantitative Project Management, 433 Monitor and Control the Process practice Quality assurance. See also Process and Product elaboration, 209 Quality Assurance (PPQA) process area Objectively Evaluate Adherence practice defined, 593 elaboration, 216 Organizational Process Focus, 325 overview, 68 Organizational Training, 368 Plan the Process practice elaboration, 177 Project Planning, 406 Prepare for Quantitative Management goal, Technical Solution, 521 436–447 Quality attributes. See also Definition of required Provide Resources practice elaboration, 182–183 functionality and quality attributes purpose, 433 allocated, 464, 516, 518 Quantitatively Manage the Project goal, 447–453 ptg analyzed, 469, 471 related process areas, 435–436 architecturally significant, 464 Train People practice elaboration, 189 architectures support achievement of, 512, Quantitatively Manage the Project goal, 447–448 517–518, 575 Manage Project Performance practice, 449–451 business objectives, 331 Monitor the Performance of Selected defined, 593 Subprocesses practice, 448–449 discussed, 457, 468 Perform Root Cause Analysis practice, 451–453 elicited, 460 Quantitatively Managed maturity level, 43–44 examples, 438, 444, 457, 464, 513, 519 Quantitative management, defined, 593 formal evaluation, basis for, 260 Queuing theory, 75 models, simulations, prototypes, or pilots, role of, 509-510 prioritized, 462, 476 R product integration, 377, 383 Radice, Ron, 9 risks, 471, 484, 489 Rapid prototypes, 378 scenarios used to elicit and express, 467-468, 518 RASCI (Responsibility, Accountable, Support, solutions that address, 513 Consulted, Informed) charts, 136 technical data package, 521 Rassa, Bob validation, 535 biography, 620 work environment, 274 CMMI history essay, 12–13 Quality control, 593 Raytheon IDS, 53–54 Quality Improvement Paradigm (QIP), 40 Readiness assessment at ABB, 126 Quality Management System at KNORR, 120 Readiness of product components, 387–388 Quality Team, 572 Reconcile Work and Resource Levels practice, 422


Record Causal Analysis Data practice, 241 traceability Records Requirements Management, 478 Configuration Management, 253–254 Technical Solution, 522 Process and Product Quality Assurance, 431 Validation, 534, 537 training, 374 Verification, 541, 544 Recursion of engineering processes, 74–75 Requirements analysis, 594 References and reference models Requirements Development (RD) process area defined, 593 Analyze and Validate Requirements goal, 466–472 Organizational Process Focus, 318 Collect Process Related Experiences practice for process areas, 26 elaboration, 225 sources, 555–560 Control Work Products practice elaboration, 195 Regression techniques with SLAM, 56 Develop Customer Requirements goal, 459–462 Related process areas sections, 22 Develop Product Requirements goal, 462–466 Relevant stakeholders Establish an Organizational Policy practice defined, 593, 599 elaboration, 172 in empirical software engineering, 38–39 expansions, 72–73 formal evaluation process for. See Decision Identify and Involve Relevant Stakeholders Analysis and Resolution (DAR) process area practice elaboration, 202 Identify and Involve Relevant Stakeholders introductory notes, 455–458 practice, 196–204, 229 Monitor and Control the Process practice Integrated Project Management, 268, 282–285 elaboration, 209 Organizational Process Focus, 324 Objectively Evaluate Adherence practice Organizational Process Performance, 356 elaboration, 216 Process and Product Quality Assurance, 431 overview, 68–70 Product Integration, 385 Plan the Process practice elaboration, 177 ptg Project Monitoring and Control, 398 Provide Resources practice elaboration, 183 Project Planning, 419–420 purpose, 455 Quantitative Project Management, 437 related process areas, 458 Requirements Development, 455, 461–462 Train People practice elaboration, 189 Requirements Management, 478 Requirements elicitation, 594 Risk Management, 481 Requirements management, 594 Supplier Agreement Management, 504, 507 Requirements Management (REQM) process area Validation, 531–532 Collect Process Related Experiences practice Verification, 549 elaboration, 225 Repository Control Work Products practice elaboration, 195 Configuration Management, 272–273 Establish an Organizational Policy practice Measurement and Analysis, 288 elaboration, 172 Organizational Innovation and Deployment, Identify and Involve Relevant Stakeholders 310–312 practice elaboration, 202 Representation, 593 introductory notes, 473–474 Required CMMI components, 594 Manage Requirements goal, 475–480 Requirements Monitor and Control the Process practice defined, 591, 594 elaboration, 209 Organizational Training, 368 Objectively Evaluate Adherence practice in process areas, 19–20 elaboration, 217 Product Integration, 382 overview, 66 Project Planning, 405, 409 Plan the Process practice elaboration, 177 in small company process improvement, 104 Provide Resources practice elaboration, 183 Technical Solution, 509–510 purpose, 473


Requirements Management (REQM) process area Provide Resources practice elaboration, 183 (continued) purpose, 481 related process areas, 474–475 related process areas, 482 Review Status with Higher Level Management Review Status with Higher Level Management practice elaboration, 220 practice elaboration, 220 Train People practice elaboration, 190 Train People practice elaboration, 190 Requirements traceability, 594 Risks, monitoring, 397 Resilience Management Model (RMM), 73 RMM (Resilience Management Model), 73 Resolution of conflicts, 461 Root cause analysis, 451–453 Resolve Coordination Issues practice, 284–285 Rules and guidelines for teams, 314–315 Resources Project Planning, 422 Provide Resources practice, 178–184, 228 S Responsibility, Accountable, Support, Consulted, SAS (SCAMPI Appraisal System), 74 Informed (RASCI) charts, 136 Satisfy Supplier Agreements goal, 504 Return on investment, 594 Accept the Acquired Product practice, 506–507 Reverse exploitation, 144–145 Ensure Transition of Products practice, 507–508 Review Interface Descriptions for Completeness Execute the Supplier Agreement practice, practice, 384–385 504–506 Review Plans That Affect the Project practice, 422 SCAMPI Appraisal System (SAS), 74 Review Status with Higher Level Management for appraisals, 105–107 practice in equivalent staging, 49–50 overview, 218–220 SCAMPI Upgrade Team, 570 process area roles in, 230 Scenarios Risk analysis and identification Requirements Development, 467–468 ptg defined, 594 Technical Solution, 518 Organizational Training, 367 Validation, 537 Project Management, 68 Schaff, Kevin Project Planning, 414–415 biography, 620–621 Requirements Development, 471 hiring process essay, 152–157 Risk Management, 487–491 Schedules, Project Planning, 412–414, 421 Risk factors in improvements, 342 Scope Risk management, 594–595 estimating, 406–407 Risk Management (RSKM) process area in small company process improvement, 103 Collect Process Related Experiences practice SECM (Systems Engineering Capability Method), 10 elaboration, 225 SEI (Software Engineering Institute), 4, 13 Control Work Products practice elaboration, 195 Select and Implement Improvements for Establish an Organizational Policy practice Deployment practice, 344–346 elaboration, 172 Select Evaluation Methods practice, 263–264 Identify and Analyze Risks goal, 487–491 Select Improvements goal, 337–338 Identify and Involve Relevant Stakeholders Analyze Suggested Improvements practice, practice elaboration, 202 341–343 introductory notes, 481–482 Elicit Suggested Improvements practice, 338–340 Mitigate Risks goal, 491–495 Select and Implement Improvements for Monitor and Control the Process practice Deployment practice, 344–346 elaboration, 209–210 Validate Improvements practice, 343–344 Objectively Evaluate Adherence practice Select Measures and Analytic Techniques practice, elaboration, 217 445–447 Plan the Process practice elaboration, 177 Select Outcomes for Analysis practice, 235–236 Prepare for Risk Management goal, 482–487 Select Processes practice, 356–358


Select Product Component Solutions goal, 512–513 Shaw, Michele A. Develop Alternative Solutions and Selection biography, 610 Criteria practice, 513–515 empirical thinking essay, 37–41 Select Product Component Solutions practice, Shewhart, Walter, 9 515–516 Shrum, Sandy Select Product Component Solutions practice, acknowledgments, xxiii-xxiiii 515–516 biography, 608 Select Products for Validation practice, 533–535 Six Sigma Select Solutions practice, 265–266 approaches, 110 Select Subprocesses and Attributes practice, LSS. See Lean Six Sigma (LSS) case study 443–444 Skills needs, 418–419 Select Suppliers practice, 500–502 SLAM (Systems Lifecycle Analysis Model), 56–58 Select Work Products for Verification practice, Small companies, process improvements in, 101–104 544–545 Snyder, Terry,159 Selection choices for process improvement, 98–99 Software baselines, 250 Senior managers Software CMM initiative, 124 defined, 595 Software development plans, 421 in Risk Management, 486 Software engineering Service agreements, 595 defined, 598 Service catalogs, 596 empirical, 37–41 Service Continuity (SCON) process area, 73 Technical Solution, 519, 527 Service incidents, 596 Verification, 544 Service level agreements, 596 Software Engineering Institute (SEI), 4, 13 Service level measures, 596 Software factory, 96 Service levels, 596 Software Requirements Traceability Matrix (SRTM), ptg Service lines, 596 119 Service requests, 597 Solicitation, 598 Service requirements, 597 Solicitation materials and requirements, 501 Service system components, 597–598 Solicitation packages, 598 Service system consumables, 598 Sources, risk, 483–484 Service System Development (SSD) process area, Space and Naval Warfare Systems Command 72–73 (SPAWAR) case study, 130–131 Service systems, 597 approach, 131–132 Service Systems Transition (SST) process area, 73 background context, 131 Services challenges, 131 defined, 595 gap analysis, 133 evaluating, 429 lessons learned and critical success factors, Integrated Project Management, 268 135–137 Services Advisory Group, 566–567 LSS tools, 133–134 Services Mini Team, 570 mission critical risks, 132 Set of standard processes process area selection, 132–133 Integrated Project Management, 267–269 results and recommendations, 134–135 Organizational Process Definition, 303–305 Special cause of variation, 598 Organizational Process Focus, 322 Specific goals Organizational Process Performance, 351 defined, 598 Organizational Training, 366, 368 process areas, 22–23 Quantitative Project Management, 434 Specific practices Technical Solution, 510 defined, 598 Shared vision, 598 process areas, 23–24


Specify Analysis Procedures practice, 296–298 Stress in process improvement, 94 Specify Data Collection and Storage Procedures Subpractices practice, 294–295 defined, 600 Specify Measures practice, 291–294 description, 24 Sponsors, 138–139 Subprocesses Stable processes, 599 defined, 600 Staged representation of CMMs Quantitative Project Management, 433, 448–449 defined, 599 Success, baselines for, 53–58 levels in, 32–34 Suggested improvements. See also Improvement process areas in, 46–48 suggestions Stakeholders. See Relevant stakeholders analyzing, 341–343 Stall, Alexander eliciting, 338–340 biography, 621 Supplier Agreement Management (SAM) process hiring process essay, 152–157 area Standard processes, 167 Collect Process Related Experiences practice defined, 599 elaboration, 226 Integrated Project Management, 267–269 Control Work Products practice elaboration, Organizational Process Definition, 303–305 196 Organizational Process Focus, 322 Establish an Organizational Policy practice Organizational Process Performance, 351 elaboration, 172 Organizational Training, 366, 368 Establish Supplier Agreements goal, 499–504 Quantitative Project Management, 434 Identify and Involve Relevant Stakeholders Technical Solution, 510 practice elaboration, 203 Standards introductory notes, 497–498 defined, 599 Monitor and Control the Process practice ptg Organizational Process Definition, 303 elaboration, 210 Process and Product Quality Assurance, 425 Objectively Evaluate Adherence practice in process definition, 85–89 elaboration, 217 Quantitative Project Management, 441 overview, 66, 69 Stars in process improvement, 14–16 Plan the Process practice elaboration, 177 Startup guidelines, 137–141 Provide Resources practice elaboration, 183 Statements of work purpose, 497 defined, 599 related process areas, 499 Project Monitoring and Control, 401 Satisfy Supplier Agreements goal, 504–508 Supplier Agreement Management, 503 Train People practice elaboration, 190 Statistical process control, 600 Supplier agreements, 600 Statistical relationships in SLAM, 56 Suppliers, 600 Statistical techniques Support activities, Integrated Project Management defined, 599–600 for, 268 Quantitative Project Management, 434, 446 Support process areas, 77–80 Status reviews Supporting informative components, 25–26 executive responsibilities, 93 Sustainment Review Status with Higher Level Management defined, 600 practice, 218–220, 230 Project Planning, 410 Steering Group, 565–566 Requirements Development, 467 Stockholm Syndrome, 108 Technical Solution, 510 Storage procedures, 294–295, 300–301 SVC Training Teams, 571–572 Store Data and Results practice, 300–301 SW-CMM (Capability Maturity Model for Software), Strategic Service Management (STSM) process area, 10 73 Synergistic effect with LSS, 130–131


Systems engineering Develop the Design goal, 517–525 defined, 601 Establish an Organizational Policy practice Verification, 544 elaboration, 172 Systems Engineering Capability Method (SECM), 10 Identify and Involve Relevant Stakeholders Systems Engineering Detailed Schedules, 421 practice elaboration, 203 Systems Engineering Management Plans, 421 Implement the Product Design goal, 526–529 Systems Engineering Master Schedules, 421 introductory notes, 509–510 Systems Lifecycle Analysis Model (SLAM), 56–58 Monitor and Control the Process practice Systems of systems, 600 elaboration, 210 Objectively Evaluate Adherence practice elaboration, 217–218 T overview, 68–70 Tailoring Plan the Process practice elaboration, 177 defined, 590, 601 Provide Resources practice elaboration, 184 guidelines, 601 purpose, 509 Integrated Project Management, 271 related process areas, 511 Organizational Process Definition, 303, 308–310 Select Product Component Solutions goal, Organizational Process Focus, 322 512–516 Organizational Process Performance, 360 Train People practice elaboration, 190 Quantitative Project Management, 434 Technology in PPT triangle, 82 Take Corrective Action practice, 401–402 Terminology in communication, 88 Target profiles Tests defined, 601 Requirements Development, 455 in equivalent staging, 50–51 Supplier Agreement Management, 502, 506 for process areas, 47–48 Technical Solution, 526, 528 ptg Target staging Tolstoy principle, 5–6 defined, 601 Traceability description, 51 defined, 603 Task attributes Measurement and Analysis, 291 Project Monitoring and Control, 393 Quantitative Project Management, 440 Project Planning, 407–408 Requirements Development, 462 Taxonomies, 151 Requirements Management, 478–479 Team Software Process (TSP), 8–9 Technical Solution, 510, 522 Teams Track and Control Changes goal, 250 defined, 601–602 Control Configuration Items practice, 252–253 establishing, 279–280 Track Change Requests practice, 251 Organizational Process Definition, 303, 314–315 Track Change Requests practice, 251 Technical data packages Trade studies defined, 602 defined, 603 developing, 509 Supplier Agreement Management, 501 Technical performance, 602 Train People practice Technical performance measures, 602 overview, 186–191 Technical requirements process area roles in, 228 defined, 602 Training. See also Organizational Training (OT) managing, 473 process area Technical Solution (TS) process area at ABB, 126–127 Assign Responsibility practice elaboration, 185 CMMI-related, 107 Collect Process Related Experiences practice defined, 603 elaboration, 226 Requirements Development, 467 Control Work Products practice elaboration, 196 Training Teams, 571


Transform Stakeholder Needs into Customer introductory notes, 531–532 Requirements practice, 461–462 Monitor and Control the Process practice Translation Team, 569 elaboration, 210 TSP (Team Software Process), 8–9 Objectively Evaluate Adherence practice Twain, Mark, 76 elaboration, 218 Typographical conventions, 27–30 overview, 71 Plan the Process practice elaboration, 178 Prepare for Validation goal, 533–537 U Provide Resources practice elaboration, 184 Understand Requirements practice, 476–477 purpose, 531 Unit testing related process areas, 532 defined, 603 Train People practice elaboration, 191 implementation process, 526, 528 Validate Product or Product Components goal, Use Organizational Process Assets for Planning 537–539 Project Activities practice, 272–273 van der Rohe, Mies, 108 Use the Project’s Defined Process goal, 269 Verband deutscher Automobilindustrie (VDA), 109 Contribute to Organizational Process Assets Verification practice, 280–282 defined, 603 Establish Teams practice, 279–280 Requirements Development, 461 Establish the Project’s Defined Process practice, Verification (VER) process area 269–272 Collect Process Related Experiences practice Establish the Project’s Work Environment elaboration, 226 practice, 273–275 Control Work Products practice elaboration, Integrate Plans practice, 275–277 196 Manage the Project Using Integrated Plans Establish an Organizational Policy practice ptg practice, 277–279 elaboration, 173 Use Organizational Process Assets for Planning Identify and Involve Relevant Stakeholders Project Activities practice, 272–273 practice elaboration, 204 introductory notes, 541–542 Monitor and Control the Process practice V elaboration, 211 V-Model at KNORR, 117–119, 121 Objectively Evaluate Adherence practice Validate Improvements practice, 343–344 elaboration, 218 Validate Product or Product Components goal, overview, 71 537–538 Perform Peer Reviews goal, 547–550 Analyze Validation Results practice, 538–539 Plan the Process practice elaboration, 178 Perform Validation practice, 538 Prepare for Verification goal, 543–546 Validate Requirements practice, 471–472 Provide Resources practice elaboration, 184 Validation purpose, 541 defined, 603 related process areas, 543 Requirements Development, 461, 466–472 Train People practice elaboration, 191 Validation (VAL) process area Verify Selected Work Products goal, 551–552 Collect Process Related Experiences practice Verify Selected Work Products goal, 551 elaboration, 226 Analyze Verification Results practice, 551–552 Control Work Products practice elaboration, 196 Perform Verification practice, 551 Establish an Organizational Policy practice Version control, 603 elaboration, 172 Vulnerabilities Identify and Involve Relevant Stakeholders risks to be identified, 414, 487 practice elaboration, 203 measured and analyzed, 316, 317


W evaluating, 429 Weissbrich, A., 109 Project Monitoring and Control, 393 Wide Band Delphi techniques, 118 Requirements Management for, 473 Work breakdown structure (WBS) Verification, 551–552 ABB, 127 Work startup, 604 defined, 603 Integrated Project Management, 277 KNORR Brake Corporation, 118 X Project Monitoring and Control, 393 X-Ray emissions project, 158–161 Project Planning, 406, 417 Risk Management, 489 Work environment Y Integrated Project Management, 273–275 Young, Rawdon Organizational Process Definition, 313 biography, 621 Work groups, 603 hiring process essay, 152–157 Work levels, Project Planning, 422 Work plans, 604 Work product and task attributes, 604 Z Work products Zealots, process, 141 defined, 604




The SEI Partner Network: Helping hands with a global reach

Do you need help getting started with adoption of SEI tools and methods in your organization? Or are you an experienced professional in the field who wants to join a global network of SEI service providers? The SEI Partner Network is a worldwide group of licensed organizations with individuals qualified by the SEI to deliver SEI services. SEI Partners can provide you with the training courses, adoption assistance, appraisal methods, and teamwork and management processes that you need to succeed.

To find an SEI Partner near you, or to learn more about this global network of professionals, please visit the SEI Partner Network website at http://www.sei.cmu.edu/partners

ESSENTIAL GUIDES TO CMMI®

CMMI® for Development: Guidelines for Process Integration and Product Improvement, Third Edition
Mary Beth Chrissis, Mike Konrad, and Sandy Shrum
ISBN-13: 978-0-321-71150-2
The definitive guide to CMMI—now updated for CMMI v1.3. Whether you are new to CMMI or already familiar with some version of it, this book is the essential resource for managers, practitioners, and process improvement team members who need to understand, evaluate, and/or implement a CMMI model.

CMMI® for Services: Guidelines for Superior Service, Second Edition
Eileen C. Forrester, Brandon L. Buteau, and Sandy Shrum
ISBN-13: 978-0-321-71152-6
The authoritative guide to CMMI for Services v1.3, a model designed to help service-provider organizations improve their processes and thereby gain business advantage. This book, which contains the complete model, also includes helpful commentary by the authors and case studies to illustrate how the model is being used.

CMMI® for Acquisition: Guidelines for Improving the Acquisition of Products and Services, Second Edition
Brian P. Gallagher, Mike Phillips, Karen J. Richter, and Sandy Shrum
ISBN-13: 978-0-321-71151-9
The official guide to CMMI-ACQ—an extended CMMI framework for improving product and service acquisition processes. In addition to the complete CMMI-ACQ itself, the book includes tips, hints, and case studies to enhance your understanding and to provide valuable, practical advice.

CMMI® and Six Sigma: Partners in Process Improvement
Jeannine M. Siviy, M. Lynn Penn, and Robert W. Stoddard
ISBN-13: 978-0-321-51608-4
Focuses on the synergistic, rather than competitive, implementation of CMMI and Six Sigma—with synergy translating to “faster, better, cheaper” achievement of mission success.

Integrating CMMI® and Agile Development: Case Studies and Proven Techniques for Faster Performance Improvement
Paul E. McMahon
ISBN-13: 978-0-321-71410-7
Explains how combining an Agile approach with the CMMI process improvement framework is the fastest, most effective way to achieve your business objectives.

CMMI® Distilled: A Practical Introduction to Integrated Process Improvement, Third Edition
Dennis M. Ahern, Aaron Clouse, and Richard Turner
ISBN-13: 978-0-321-46108-7
Updated for CMMI version 1.2, this third edition again provides a concise and readable introduction to the model, as well as straightforward, no-nonsense information on integrated, continuous process improvement.

For more information on these and other CMMI-related books, as well as on all titles in The SEI Series in Software Engineering, please visit informit.com/seiseries.

The SEI Series in Software Engineering


Please see our web site at informit.com/seiseries for more information on these titles.

Process Areas by Maturity Level

Maturity Level 2
CM Configuration Management
MA Measurement and Analysis
PPQA Process and Product Quality Assurance
PMC Project Monitoring and Control
PP Project Planning
REQM Requirements Management
SAM Supplier Agreement Management

Maturity Level 3
DAR Decision Analysis and Resolution
IPM Integrated Project Management
OPD Organizational Process Definition
OPF Organizational Process Focus
OT Organizational Training
PI Product Integration
RD Requirements Development
RSKM Risk Management
TS Technical Solution
VAL Validation
VER Verification

Maturity Level 4
OPP Organizational Process Performance
QPM Quantitative Project Management

Maturity Level 5
CAR Causal Analysis and Resolution
OPM Organizational Performance Management

Generic Goals and Practices
GG2 Institutionalize a Managed Process
GP 2.1 Establish an Organizational Policy
GP 2.2 Plan the Process
GP 2.3 Provide Resources
GP 2.4 Assign Responsibility
GP 2.5 Train People
GP 2.6 Control Work Products
GP 2.7 Identify and Involve Relevant Stakeholders
GP 2.8 Monitor and Control the Process
GP 2.9 Objectively Evaluate Adherence
GP 2.10 Review Status with Higher Level Management
GG3 Institutionalize a Defined Process
GP 3.1 Establish a Defined Process
GP 3.2 Collect Process Related Experiences

