AGILE PROJECT MANAGEMENT ADVISORY SERVICE WHITE PAPER
by Jim Highsmith
CUTTER CONSORTIUM PRACTICE AREAS:

• Distributed Architecture
• Business-IT Strategies
• Business Intelligence
• Business Technology Trends and Impacts
• Project Management
• IT Management
• Sourcing

CUTTER CONSORTIUM, 37 BROADWAY, SUITE 1, ARLINGTON, MA 02474-5552, USA; Tel: +1 781 648 8700; Fax: +1 781 648 8707; [email protected]; www.cutter.com CUTTER CONSORTIUM

ABOUT THE CUTTER CONSORTIUM Cutter Consortium’s mission is to help senior executives leverage technology for competitive advantage and business success. Cutter’s offerings are entirely unique in the research/analyst industry sector because they are produced and provided by the top thinkers in IT today — a distinguished group of internationally recognized experts committed to providing high-level, critical advice and guidance. These experts provide all of Cutter’s written deliverables and perform all of the consulting and training assignments. Cutter Consortium’s products and services include: high-level advisory/research services, online and print publications, benchmarking metrics, management and technical consulting, and advanced training. The content is aimed at both a technical and business audience with an emphasis on strategic processes and thinking. An independent, privately held entity that has no allegiance or connections to any computer vendors, Cutter has a well-earned reputation for its objectivity. Cutter’s more than 5,300 clients include CIOs, CEOs, CFOs, and senior IT managers in Fortune 500 companies and other businesses, national and state governments, and universities around the world. As a smaller information provider, the Consortium customizes its services to meet each client’s individual needs and ensure them access to the experts.

FOR MORE INFORMATION To learn more about the Cutter Consortium, call +1 800 964 5118 (toll-free in North America) or +1 781 648 8700, send e-mail to [email protected], or visit the Cutter Consortium Web site: www.cutter.com.

Extreme Programming

[This article originally appeared in the February 2000 edition of Cutter Consortium’s e-Business Application Delivery newsletter, now titled the Agile Project Management Advisory Service.] by Jim Highsmith

As we have explored in several issues of eAD, the two most pressing issues in information technology today are:

• How do we deliver functionality to business clients quickly?

• How do we keep up with near-continuous change?

Change is changing. Not only does the pace of change continue to accelerate, but, as the September issue of eAD pointed out, organizations are having to deal with different types of change — disruptive change and punctuated equilibrium. Disruptive technologies, like personal computers in the early 1980s, impact an industry (in the case of PCs, several related industries), while a punctuated equilibrium — a massive intervention into an ecosystem or an economy — impacts a very large number of species, or companies. The Internet, which has become the backbone for e-commerce and e-business, has disrupted a wide range of industries — more a punctuated equilibrium than a disruption. When whole business models are changing, when time-to-market becomes the mantra of companies, when flexibility and interconnectedness are demanded from even the most staid organization, it is then that we must examine every aspect of how business is managed, customers are delighted, and products are developed.

The Extreme Programming movement has been a subset of the object-oriented (OO) programming community for several years, but has recently attracted more attention, especially with the recent release of Kent Beck’s new book Extreme Programming Explained: Embrace Change. Don’t be put off by the somewhat “in-your-face” moniker of Extreme Programming (XP to practitioners). Although Beck doesn’t claim that practices such as pair programming and incremental planning originated with XP, there are some very interesting, and I think important, concepts articulated by XP. There’s a lot of talk today about change, but XP has some pretty good ideas about how to actually do it. Hence the subtitle, Embrace Change.

There is a tendency, particularly by rigorous methodologists, to dismiss anything less ponderous than the Capability Maturity Model (CMM), or maybe the International Organization for Standardization’s standards, as hacking. The connotation: hacking promotes doing rather than thinking and therefore results in low quality. This is an easy way to dismiss practices that conflict with one’s own assumptions about the world.

Looked at another way, XP may be a potential piece of a puzzle I’ve been writing about over the past 18 months. Turbulent times give rise to new problems that, in turn, give rise to new practices — new practices that often fly in the face of conventional wisdom but survive because they are better adapted to the new reality. There are at least four practices I would assign to this category:

• XP — the focus of this issue

• Lean development — discussed in the November 1998 issue of eAD

• Crystal Light methods — mentioned in the November 1999 issue of eAD and further discussed in this issue

• Adaptive software development — described in the August 1998 issue of eAD (then called Application Development Strategies — ADS)

Although there are differences in each of these practices, there are also similarities: they each describe variations from the conventional wisdom about how to approach software development. Whereas lean and adaptive development practices target strategic and project management, XP brings its differing world view to the realm of the developer and tester.

Much of XP is derived from good practices that have been around for a long time. “None of the ideas in XP are new. Most are as old as programming,” Beck offers to readers in the preface to his book. I might differ with Beck in one respect: although the practices XP uses aren’t new, the conceptual foundation and how they are melded together greatly enhance these “older” practices. I think there are four critical ideas to take away from XP (in addition to a number of other good ideas):

• The cost of change

• Refactoring

• Collaboration

• Simplicity

But first, I discuss some XP basics: the dozen practices that define XP.

XP — The Basics

I must admit that one thing I like about XP’s principal figures is their lack of pretension. XP proponents are careful to articulate where they think XP is appropriate and where it is not. While practitioners like Beck and Ron Jeffries may envision that XP has wider applicability, they are generally circumspect about their claims. For example, both are clear about XP’s applicability to small (less than 10 people), co-located teams (with which they have direct experience); they don’t try to convince people that the practices will work for teams of 200.

The Project

The most prominent XP project reported on to date is the Chrysler Comprehensive Compensation system (the C3 project) that was initiated in the mid-1990s and converted to an XP project in 1997. Jeffries, one of the “Three Extremoes” (with Beck and Ward Cunningham), and I spent several hours talking about the C3 project and other XP issues at the recent Miller Freeman Software Developer conference in Washington, DC, USA.

Originally, the C3 project was conceived as an OO programming project, specifically using Smalltalk. Beck, a well-known Smalltalk expert, was called in to consult on Smalltalk performance optimization, and the project was transformed into a pilot of OO (XP) practices after the original project was deemed unreclaimable. Beck brought in Jeffries to assist on a more full-time basis, and Jeffries worked with the C3 team until spring 1999. The initial requirements were to handle the monthly payroll of some 10,000 salaried employees. The system consists of approximately 2,000 classes and 30,000 methods and was ready within a reasonable tolerance period of the planned schedule.

As we talked, I asked Jeffries how success on the C3 project translated into XP use on other Chrysler IT projects. His grin told me all I needed to know. I’ve been involved in enough rapid application development (RAD) projects for large IT organizations over the years to understand why success does not consistently translate into acceptance. There are always at least a hundred very good reasons why success at RAD, or XP, or lean development, or other out-of-the-box approaches doesn’t translate into wider use — but more on this issue later.

Practices

One thing to keep in mind is that XP practices are intended for use with small, co-located teams. They therefore tend toward minimalism, at least as far as artifacts other than code and test cases are concerned. The presentation of XP’s practices has both positive and negative aspects. At one level, they sound like rules — do this, don’t do that. Beck explains that the practices are more like guidelines than rules, guidelines that are pliable depending on the situation. However, some, like the “40-hour week,” can come off as a little preachy. Jeffries makes the point that the practices also interact, counterbalance, and reinforce each other, such that picking and choosing which to use and which to discard can be tricky.

The planning game. XP’s planning approach mirrors that of most iterative RAD approaches to projects. Short, three-week cycles, frequent updates, splitting business and technical priorities, and assigning “stories” (a story defines a particular feature requirement and is displayed in a simple card format) all define XP’s approach to planning.
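To make the planning game concrete, here is a minimal sketch in Python, purely as an illustration — the field names, the capacity figure, and the fill-the-iteration rule are assumptions of this example, not XP canon or anything from the C3 project. It shows story cards carrying customer-assigned business value and developer-assigned estimates, and a short cycle being filled with the most valuable stories that fit.

    from dataclasses import dataclass, field

    # Hypothetical sketch of story cards and a short planning cycle; the
    # attribute names and the planning rule are illustrative only.

    @dataclass
    class Story:
        title: str
        description: str        # what the customer wants, in the customer's words
        business_value: int     # priority assigned by the customer
        estimate_days: float    # cost estimated by the developers

    @dataclass
    class Iteration:
        length_weeks: int = 3
        capacity_days: float = 15.0
        stories: list = field(default_factory=list)

        def plan(self, backlog):
            """Fill the iteration with the highest-value stories that fit."""
            remaining = self.capacity_days
            for story in sorted(backlog, key=lambda s: -s.business_value):
                if story.estimate_days <= remaining:
                    self.stories.append(story)
                    remaining -= story.estimate_days
            return self.stories

    backlog = [
        Story("Monthly payroll run", "Calculate pay for salaried employees", 10, 12),
        Story("Overtime report", "List overtime by department", 6, 5),
        Story("Audit log", "Record every pay calculation", 4, 8),
    ]
    print([s.title for s in Iteration().plan(backlog)])

The point of the sketch is the split of responsibilities the planning game insists on: the customer sets value, the developers set cost, and the cycle stays short enough that the plan can be redone frequently.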



Small releases. “Every release should be as small as possible, containing the most valuable business requirements,” states Beck. This mirrors two of Tom Gilb’s principles of evolutionary delivery from his book Principles of Software Engineering Management: “All large projects are capable of being divided into many useful partial result steps,” and “Evolutionary steps should be delivered on the principle of the juiciest one next.”

Small releases provide the sense of accomplishment that is often missing in long projects as well as more frequent (and more relevant) feedback. However, a development team needs to also consider the difference between “release” and “releasable.” The cost of each release — installation, training, conversions — needs to be factored into whether or not the product produced at the end of a cycle is actually released to the end user or is simply declared to be in a releasable state.

Metaphor. XP’s use of the terms “metaphor” and “story” takes a little wearing in to become comfortable. However, both terms help make the technology more understandable in human terms, especially to clients. At one level, metaphor and architecture are synonyms — they are both intended to provide a broad view of the project’s goal. But architectures often get bogged down in symbols and connections. XP uses “metaphor” in an attempt to define an overall coherent theme to which both developers and business clients can relate. The metaphor describes the broad sweep of the project, while stories are used to describe individual features.

Simple design. Simple design has two parts. One, design for the functionality that has been defined, not for potential future functionality. Two, create the best design that can deliver that functionality. In other words, don’t guess about the future: create the best (simple) design you can today. “If you believe that the future is uncertain, and you believe that you can cheaply change your mind, then putting in functionality on speculation is crazy,” writes Beck. “Put in what you need when you need it.”

In the early 1980s, I published an article in Datamation magazine titled “Synchronizing Data with Reality.” The gist of the article was that data quality is a function of use, not capture and storage. Furthermore, I said that data that was not systematically used would rapidly go bad. Data quality is a function of systematic usage, not anticipatory design. Trying to anticipate what data we will need in the future only leads us to design for data that we will probably never use; even the data we did guess correctly on won’t be correct anyway. XP’s simple design approach mirrors the same concepts. As described later in this article, this doesn’t mean that no anticipatory design ever happens; it does mean that the economics of anticipatory design changes dramatically.

Refactoring. If I had to pick one thing that sets XP apart from other approaches, it would be refactoring — the ongoing redesign of software to improve its responsiveness to change. RAD approaches have often been associated with little or no design; XP should be thought of as continuous design. In times of rapid, constant change, much more attention needs to be focused on refactoring. See the sections “Refactoring” and “Data Refactoring,” below.

Testing. XP is full of interesting twists that encourage one to think — for example, how about “Test and then code”? I’ve worked with software companies and a few IT organizations in which programmer performance was measured on lines of code delivered and testing was measured on defects found — neither side was motivated to reduce the number of defects prior to testing. XP uses two types of testing: unit and functional. However, the practice for unit testing involves developing the test for the feature prior to writing the code and further states that the tests should be automated. Once the code is written, it is immediately subjected to the test suite — instant feedback.

The most active discussion group on XP remains the Wiki exchange (XP is a piece of the overall discussion about patterns). One of the discussions centers around a lifecycle of Listen (requirements) → Test → Code → Design. Listen closely to customers while gathering their requirements. Develop test cases. Code the objects (using pair programming). Design (or refactor) as more objects are added to the system. This seemingly convoluted lifecycle begins to make sense only in an environment in which change dominates.
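The test-then-code rhythm is easier to see in code than in prose. The sketch below uses Python’s standard unittest module; the PayrollCalculator class and its rounding rule are hypothetical stand-ins invented for this illustration (they are not taken from the C3 system), but the shape is the practice itself: the automated tests exist first, and the code is run against them the moment it is written.

    import unittest

    # A minimal sketch of "test and then code"; the class and its rule are
    # hypothetical examples, not taken from the article or the C3 project.

    class PayrollCalculator:
        def __init__(self, annual_salary):
            self.annual_salary = annual_salary

        def monthly_pay(self):
            # Written only after the tests below existed and failed.
            return round(self.annual_salary / 12, 2)

    class PayrollTest(unittest.TestCase):
        def test_monthly_pay_is_one_twelfth_of_salary(self):
            self.assertEqual(PayrollCalculator(60000).monthly_pay(), 5000.00)

        def test_monthly_pay_rounds_to_cents(self):
            self.assertEqual(PayrollCalculator(50000).monthly_pay(), 4166.67)

    if __name__ == "__main__":
        unittest.main()   # instant feedback: run the suite as soon as code exists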

Pair programming. One of the few practices that enjoys near-universal acceptance (at least in theory) and has been well measured is software inspections (also referred to as reviews or walkthroughs). At their best, inspections are collaborative interactions that speed learning as much as they uncover defects. One of the lesser-known statistics about inspections is that although they are very cost effective in uncovering defects, they are even more effective at preventing defects in the first place through the team’s ongoing learning and incorporation of better programming practices.

One software company client I worked with cited an internal study that showed that the amount of time to isolate defects was 15 hours per defect with testing, 2-3 hours per defect using inspections, and 15 minutes per defect by finding the defect before it got to the inspection. The latter figure arises from the ongoing team learning engendered by regular inspections. Pair programming takes this to the next step — rather than the incremental learning using inspections, why not continuous learning using pair programming?

“Pair programming is a dialog between two people trying to simultaneously program and understand how to program better,” writes Beck. Having two people sitting in front of the same terminal (one entering code or test cases, one reviewing and thinking) creates a continuous, dynamic interchange. Research conducted by Laurie Williams for her doctoral dissertation at the University of Utah confirms that pair programming’s benefits aren’t just wishful thinking (see Resources and References).

Collective ownership. XP defines collective ownership as the practice that anyone on the project team can change any of the code at any time. For many programmers, and certainly for many managers, the prospect of communal code raises concerns, ranging from “I don’t want those bozos changing my code” to “Who do I blame when problems arise?” Collective ownership provides another level to the collaboration begun by pair programming.

Pair programming encourages two people to work closely together: each drives the other a little harder to excel. Collective ownership encourages the entire team to work more closely together: each individual and each pair strives a little harder to produce high-quality designs, code, and test cases. Granted, all this forced “togetherness” may not work for every project team.

Continuous integration. Daily builds have become the norm in many software companies — mimicking the published material on the “Microsoft” process (see, for example, Michael A. Cusumano and Richard Selby’s Microsoft Secrets). Whereas many companies set daily builds as a minimum, XP practitioners set the daily integration as the maximum — opting for frequent builds every couple of hours. XP’s feedback cycles are quick: develop the test case, code, integrate (build), and test (a minimal sketch of this loop appears after the last practice, below).

The perils of integration defects have been understood for many years, but we haven’t always had the tools and practices to put that knowledge to good use. XP not only reminds us of the potential for serious integration errors, but provides a revised perspective on practices and tools.

40-hour week. Some of XP’s 12 practices are principles, while others, such as the 40-hour practice, sound more like rules. I agree with XP’s sentiments here; I just don’t think work hours define the issue. I would prefer a statement like, “Don’t burn out the troops,” rather than a 40-hour rule. There are situations in which working 40 hours is pure drudgery and others in which the team has to be pried away from a 60-hour work week.

Jeffries provided additional thoughts on overtime. “What we say is that overtime is defined as time in the office when you don’t want to be there. And that you should work no more than one week of overtime. If you go beyond that, there’s something wrong — and you’re tiring out and probably doing worse than if you were on a normal schedule. I agree with you on the sentiment about the 60-hour work week. When we were young and eager, they were probably okay. It’s the dragging weeks to watch for.”

I don’t think the number of hours makes much difference. What defines the difference is volunteered commitment. Do people want to come to work? Do they anticipate each day with great relish? People have to come to work, but they perform great feats by being committed to the project, and commitment only arises from a sense of purpose.

On-site customer. This practice corresponds to one of the oldest cries in software development — user involvement. XP, as with every other rapid development approach, calls for ongoing, on-site user involvement with the project team.

Coding standards. XP practices are supportive of each other. For example, if you do pair programming and let anyone modify the communal code, then coding standards would seem to be a necessity.
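Here is the integration loop promised above, again as a hedged sketch rather than a prescription: the directory layout (a tests folder discovered by unittest) and the reject-on-red policy are assumptions of this example, but the point is the practice itself — every integration, however frequent, runs the full automated suite and fails loudly.

    import subprocess
    import sys

    # A minimal sketch of the integrate-then-test feedback loop: run the whole
    # automated suite on every integration and refuse the integration if any
    # test fails. The "tests" directory name is an assumption of this sketch.

    def integrate():
        result = subprocess.run(
            [sys.executable, "-m", "unittest", "discover", "-s", "tests"],
        )
        if result.returncode != 0:
            print("Tests failed -- integration rejected; fix before merging.")
            sys.exit(result.returncode)
        print("All tests pass -- integration accepted.")

    if __name__ == "__main__":
        integrate()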

Values and Principles

On Saturday, 1 January 2000, the Wall Street Journal (you know, the “Monday through Friday” newspaper) published a special 58-page millennial edition. The introduction to the Industry & Economics section, titled “So Long Supply and Demand: There’s a new economy out there — and it looks nothing like the old one,” was written by Tom Petzinger. “The bottom line: creativity is overtaking capital as the principal elixir of growth,” Petzinger states.

Petzinger isn’t talking about a handful of creative geniuses, but the creativity of groups — from teams to departments to companies. Once we leave the realm of the single creative genius, creativity becomes a function of the environment and how people interact and collaborate to produce results. If your company’s fundamental principles point to software development as a statistically repeatable, rigorous, engineering process, then XP is probably not for you. Although XP contains certain rigorous practices, its intent is to foster creativity and communication.

Environments are driven by values and principles. XP (or the other practices mentioned in this issue) may or may not work in your organization, but, ultimately, success won’t depend on using 40-hour work weeks or pair programming — it will depend on whether or not the values and principles of XP align with those of your organization.

Beck identifies four values, five fundamental principles, and ten secondary principles — but I’ll mention five that should provide enough background.

Communication. So, what’s new here? It depends on your perspective. XP focuses on building a person-to-person, mutual understanding of the problem environment through minimal formal documentation and maximum face-to-face interaction. “Problems with projects can invariably be traced back to somebody not talking to somebody else about something important,” Beck says. XP’s practices are designed to encourage interaction — developer to developer, developer to customer.

Simplicity. XP asks of each team member, “What is the simplest thing that could possibly work?” Make it simple today, and create an environment in which the cost of change tomorrow is low.

Feedback. “Optimism is an occupational hazard of programming,” says Beck. “Feedback is the treatment.” Whether it’s hourly builds or frequent functionality testing with customers, XP embraces change by constant feedback. Although every approach to software development advocates feedback — even the much-maligned waterfall — the difference is that XP practitioners understand that feedback is more important than feedforward. Whether it’s fixing an object that failed a test case or refactoring a design that is resisting a change, high-change environments require a much different understanding of feedback.

Courage. Whether it’s a CMM practice or an XP practice that defines your discipline, discipline requires courage. Many define courage as doing what’s right, even when pressured to do something else. Developers often cite the pressure to ship a buggy product and the courage to resist. However, the deeper issues can involve legitimate differences of opinion over what is right. Often, people don’t lack courage — they lack conviction, which puts us right back to other values. If a team’s values aren’t aligned, the team won’t be convinced that some practice is “right,” and, without conviction, courage doesn’t seem so important. It’s hard to work up the energy to fight for something you don’t believe in.

“Courage isn’t just about having the discipline,” says Jeffries. “It is also a resultant value. If you do the practices that are based on communication, simplicity, and feedback, you are given courage, the confidence to go ahead in a lightweight manner,” as opposed to being weighted down by the more cumbersome, design-heavy practices.

Quality work. Okay, all of you out there, please raise your hand if you advocate poor-quality work. Whether you are a proponent of the Rational Unified Process, CMM, or XP, the real issues are “How do you define quality?” and “What actions do you think deliver high quality?” Defining quality as “no defects” provides one perspective on the question; Jerry Weinberg’s definition, “Quality is value to some person,” provides another. I get weary of methodologists who use the “hacker” label to ward off the intrusion of approaches like XP and lean development. It seems unproductive to return the favor. Let’s concede that all these approaches are based on the fundamental principle that individuals want to do a good, high-quality job; what “quality” means and how to achieve it — now there’s the gist of the real debate!

Managing XP

One area in which XP (at least as articulated in Beck’s book) falls short is management, understandable for a practice oriented toward both small project teams and programming. As Beck puts it, “Perhaps the most important job for the coach is the acquisition of toys and food.” (Coaching is one of Beck’s components of management strategy.)

With many programmers, their recommended management strategy seems to be: get out of the way. The underlying assumption? Getting out of the way will create a collaborative environment. Steeped in the tradition of task-based project management, this assumption seems valid. However, in my experience, creating and maintaining highly functional collaborative environments challenges management far beyond making up task lists and checking off their completion.

The Cost of Change

Early on in Beck’s book, he challenges one of the oldest assumptions in software engineering. From the mid-1970s, structured methods and then more comprehensive methodologies were sold based on the “facts” shown in Figure 1. I should know; I developed, taught, sold, and installed several of these methodologies during the 1980s. Beck asks us to consider that perhaps the economics of Figure 1, probably valid in the 1970s and 1980s, now look like Figure 2 — that is, the cost of maintenance, or ongoing change, flattens out rather than escalates. Actually, whether Figure 2 shows today’s cost profile or not is irrelevant — we have to make it true! If Figure 1 remains true, then we are doomed because of today’s pace of change.

The vertical axis in Figure 1 usually depicts the cost of finding defects late in the development cycle. However, this assumes that all changes are the results of a mistake — i.e., a defect. Viewed from this perspective, traditional methods have concentrated on “defect prevention” in early lifecycle stages. But in today’s environment, we can’t prevent what we don’t know about — changes arise from iteratively gaining knowledge about the application, not from a defective process. So, although our practices need to be geared toward preventing some defects, they must also be geared toward reducing the cost of continuous change. Actually, as Alistair Cockburn points out, the high cost of removing defects shown by Figure 1 provides an economic justification for practices like pair programming.

Figure 1 — Historical lifecycle change costs. Figure 2 — Contemporary lifecycle change costs. (Both plot cost of change against time or lifecycle phase: requirements, design, maintenance. Source: Adapted from Beck.)

In this issue, I want to restrict the discussion to change at the project or application level — decisions about operating systems, development language, database, middleware, etc., are constraints outside the control of the development team. (For ideas on “architectural” flexibility, see the June and July 1999 issues of ADS.) Let’s simplify even further and assume, for now, that the business and operational requirements are known.

Our design goal is to balance the rapid delivery of functionality while we also create a design that can be easily modified. Even within the goal of rapid delivery, there remains another balance: proceed too hurriedly and bugs creep in; try to anticipate every eventuality and time flies. However, let’s again simplify our problem and assume we have reached a reasonable balance of design versus code and test time.

With all these simplifications, we are left with one question: how much anticipatory design work do we do? Current design produces the functionality we have already specified. Anticipatory design builds in extra facilities with the anticipation that future requirements will be faster to implement. Anticipatory design trades current time for future time, under the assumption that a little time now will save more time later. But under what conditions is that assumption true? Might it not be faster to redesign later, when we know exactly what the changes are, rather than guessing now?
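The question can be made concrete with a small, hypothetical example — nothing below comes from the article’s figures; the discount requirement and both versions are invented for illustration. The requirement today is only a flat 5% discount. The first version is current design: exactly what was specified. The second builds in speculative facilities on the bet that they will make future changes cheaper; that bet is precisely what the economics above ask us to re-examine.

    # Hypothetical illustration of current versus anticipatory design.

    # Current design: exactly the functionality that has been specified.
    def discounted_price(price):
        return price * 0.95

    # Anticipatory design: extra facilities (plug-in rules, currency, rounding)
    # added on speculation that future requirements will be faster to implement.
    class DiscountEngine:
        def __init__(self, rules=None, currency="USD", rounding=2):
            self.rules = rules or [lambda price, customer: price * 0.95]
            self.currency = currency
            self.rounding = rounding

        def price_for(self, price, customer=None):
            for rule in self.rules:
                price = rule(price, customer)
            return round(price, self.rounding)

    # Both satisfy today's requirement; the second trades current time (and
    # ongoing complexity) for future time that may never be needed.
    assert discounted_price(100) == 95.0
    assert DiscountEngine().price_for(100) == 95.0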


Figure 3 — Balancing design and refactoring, pre-Internet (anticipatory designing versus refactoring, with the balance point plotted against the rate of change, lower to higher).

Figure 4 — Balancing design and refactoring today (same axes).

This is where refactoring enters the equation. Refactoring, according to author Martin Fowler, is “the process of changing a software system in such a way that it does not alter the external behavior of the code yet improves its internal structure.” XP proponents practice continuous, incremental refactoring as a way to incorporate change. If changes are continuous, then we’ll never get an up-front design completed. Furthermore, as changes become more unpredictable — a great likelihood today — then much anticipatory design likely will be wasted.

I think the diagram in Figure 3 depicts the situation prior to the rapid-paced change of the Internet era. Since the rate of change (illustrated by the positioning of the balance point in the figure) was lower, more anticipatory designing versus refactoring may have been reasonable. As Figure 4 shows, however, as the rate of change increases, the viability of anticipatory design loses out to refactoring — a situation I think defines many systems today.

In the long run, the only way to test whether a design is flexible involves making changes and measuring how easy they are to implement. One of the biggest problems with the traditional up-front-design-then-maintain strategy has been that software systems exhibit tremendous entropy; they degrade over time as maintainers rush fixes, patches, and enhancements into production. The problem is worse today because of the accelerated pace of change, but current refactoring approaches aren’t the first to address the problem. Back in the “dark ages” (circa 1986), Dave Higgins wrote Data Structured Software Maintenance, a book that addressed the high cost of maintenance, due in large part to the cumulative effects of changes to systems over time. Although Higgins advocated a particular program-design approach (the Warnier/Orr Approach), one of his primary themes was to stop the degradation of systems over time by systematically redesigning programs during maintenance activities.

Higgins’s approach to program maintenance was first to develop a pattern (although the term pattern was not used then) for how the program “should be” designed, then to create a map from the “good” pattern to the “spaghetti” code. Programmers would then use the map to help understand the program and, further, to revise the program over time to look more like the pattern. Using Higgins’s approach, program maintenance counteracted the natural tendency of applications to degrade over time. “The objective was not to rewrite the entire application,” said Higgins in a recent conversation, “but to rewrite those portions for which enhancements had been requested.”

Although this older-style “refactoring” was not widely practiced, the ideas are the same as they are today — the need today is just greater. Two things enable, or drive, increased levels of refactoring: one is better languages and tools, and the other is rapid change.

Another approach to high change arose in the early days of RAD: the idea of throwaway code. The idea was that things were changing so rapidly that we could just code applications very quickly, then throw them away and start over when the time for change arose. This turned out to be a poor long-term strategy.

Refactoring

Refactoring is closely related to factoring, or what is now referred to as using design patterns. Design Patterns: Elements of Reusable Object-Oriented Software, by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides, provides the foundational work on design patterns. Design Patterns serves modern-day OO programmers much as Larry Constantine and Ed Yourdon’s Structured Design served a previous generation; it provides guidelines for program structures that are more effective than other program structures.

If Figure 4 shows the correct balance of designing versus refactoring for environments experiencing high rates of change, then the quality of initial design remains extremely important. Design patterns provide the means for improving the quality of initial designs by offering models that have proven effective in the past.

So, you might ask, why a separate refactoring book? Can’t we just use the design patterns in redesign? Yes and no. As all developers (and their managers) understand, messing with existing code can be a ticklish proposition. The cliché “if it ain’t broke, don’t fix it” lives on in the annals of development folklore. However, as Fowler comments, “The program may not be broken, but it does hurt.” Fear of breaking some part of the code base that’s “working” actually hastens the degradation of that code base. However, Fowler is well aware of the concern: “Before I do the refactoring, I need to figure out how to do it safely. I’ve written down the safe steps in the catalog.” Fowler’s book, Refactoring: Improving the Design of Existing Code, catalogs not only the before (poor code) and after (better code based on patterns), but also the steps required to migrate from one to the other. These migration steps reduce the chances of introducing defects during the refactoring effort.

Beck describes his “two-hat” approach to refactoring — namely that adding new functionality and refactoring are two different activities. Refactoring, per se, doesn’t change the observable behavior of the software; it enhances the internal structure. When new functionality needs to be added, the first step is often to refactor in order to simplify the addition of new functionality. This new functionality that is proposed, in fact, should provide the impetus to refactor.

Refactoring might be thought of as incremental, as opposed to monumental, redesign. “Without refactoring, the design of the program will decay,” Fowler writes. “Loss of structure has a cumulative effect.” Historically, our approach to maintenance has been “quick and dirty,” so even in those cases where good initial design work was done, it degraded over time.

Figure 5 shows the impact of neglected refactoring — at some point, the cost of enhancements becomes prohibitive because the software is so shaky. At this point, monumental redesign (or replacement) becomes the only option, and these are usually high-risk, or at least high-cost, projects. Figure 5 also shows that while in the 1980s software decay might have taken a decade, the rate of change today hastens the decay. For example, many client-server applications hurriedly built in the early 1990s are now more costly to maintain than mainframe legacy applications built in the 1980s.

Figure 5 — Software entropy over time (cost of change plotted against years, 2 to 14, with curves for the mid-1980s, today, and the refactoring goal).
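Before turning to Ken Orr’s comments on data, a small before-and-after sketch of the kind of behavior-preserving cleanup Fowler catalogs may help; the invoice example is invented for illustration and is not drawn from Fowler’s book.

    # Hypothetical before/after sketch of behavior-preserving refactoring.

    # Before: one tangled function mixing calculation and formatting.
    def invoice_report_v1(items):
        total = 0
        lines = []
        for name, qty, price in items:
            amount = qty * price
            if qty > 10:
                amount = amount * 0.9   # bulk discount buried in the loop
            total += amount
            lines.append("%-10s %8.2f" % (name, amount))
        lines.append("%-10s %8.2f" % ("TOTAL", total))
        return "\n".join(lines)

    # After: the same external behavior, with the calculation extracted so the
    # next change (a new discount rule, a new output format) has one obvious home.
    def line_amount(qty, price):
        amount = qty * price
        return amount * 0.9 if qty > 10 else amount

    def invoice_report_v2(items):
        amounts = [(name, line_amount(qty, price)) for name, qty, price in items]
        lines = ["%-10s %8.2f" % (name, amount) for name, amount in amounts]
        lines.append("%-10s %8.2f" % ("TOTAL", sum(a for _, a in amounts)))
        return "\n".join(lines)

    # The existing tests (here, a quick check) guard the refactoring step:
    sample = [("widgets", 12, 3.0), ("gizmos", 2, 7.5)]
    assert invoice_report_v1(sample) == invoice_report_v2(sample)

The discipline is the point Fowler and Beck both stress: the refactoring step changes structure only, and the automated tests confirm that the observable behavior did not move.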
Data Refactoring: Comments by Ken Orr

Editor’s Note: As I mentioned above, one thing I like about XP and refactoring proponents is that they are clear about the boundary conditions for which they consider their ideas applicable. For example, Fowler has an entire chapter titled “Problems with Refactoring.” Database refactoring tops Fowler’s list. Fowler’s target, as stated in the subtitle to his book, is to improve code. So, for data, I turn to someone who has been thinking about data refactoring for a long time (although not using that specific term). The following section on data refactoring was written by Ken Orr.

When Jim asked me to put together something on refactoring, I had to ask him what that really meant. It seemed to me to come down to a couple of very simple ideas:


1. Do what you know how to do.

2. Do it quickly.

3. When changes occur, go back and redesign them in.

4. Go to 1.

Over the years, Jim and I have worked together on a variety of systems methodologies, all of which were consistent with the refactoring philosophy. Back in the 1970s, we created a methodology built on data structures. The idea was that if you knew what people wanted, you could work backward and design a database that would give you just the data that you needed, and from there you could determine just what inputs you needed to update the database so that you could produce the output required.

Creating systems by working backward from outputs to database to inputs proved to be a very effective and efficient means of developing systems. This methodology was developed at about the same time that relational databases were coming into vogue, and we could show that our approach would always create a well-behaved, normalized database. More than that, however, was the idea that approaching systems this way created minimal systems. In fact, one of our customers actually used this methodology to rebuild a system that was already in place. The customer started with the outputs and worked backward to design a minimal database with minimal input requirements.

The new system had only about one-third the data elements of the system it was replacing. This was a major breakthrough. These developers came to understand that creating minimal systems had enormous advantages: they were much smaller and therefore much faster to implement, and they were also easier to understand and change, since everything had a purpose. Still, building minimal systems goes against the grain of many analysts and programmers, who pride themselves on thinking ahead and anticipating future needs, no matter how remote. I think this attitude stems from the difficulty that programmers have had with maintenance. Maintaining large systems has been so difficult and fraught with problems that many analysts and programmers would rather spend enormous effort at the front end of the systems development cycle, so they don’t have to maintain the system ever again. But as history shows, this approach of guessing about the future never works out. No matter how clever we are in thinking ahead, some new, unanticipated requirement comes up to bite us. (How many people included Internet-based e-business as one of their top requirements in systems they were building 10 years ago?)

Ultimately, one of the reasons that maintenance is so difficult revolves around the problem of changing the database design. In most developers’ eyes, once you design a database and start to program against it, it is almost impossible to change that database design. In a way, the database design is something like the foundation of the system: once you have poured concrete for the foundation, there is almost no way you can go back and change it. As it turns out, major changes to databases in large systems happen very infrequently, only when they are unavoidable. People simply do not think about redesigning a database as a normal part of systems maintenance, and, as a consequence, major changes are often unbelievably difficult.

Enter Data Refactoring

Jim and I had never been persuaded by the argument that the database design could never be changed once installed. We had the idea that if you wanted to have a minimal system, then it was necessary to take changes or new requirements to the system and repeat the basic system cycle over again, reintegrating these new requirements with the original requirements to create a new system. You could say that what we were doing was data refactoring, although we never called it that.

The advantages of this approach turned out to be significant. For one thing, there was no major difference between development of a new system and the maintenance or major modification of an existing one. This meant that training and project management could be simplified considerably. It also meant that our systems tended not to degrade over time, since we “built in” changes rather than “adding them on” to the existing system. Over a period of years, we built a methodology (Data Structured Systems Development or Warnier/Orr) and trained thousands of systems analysts and programmers. The process that we developed was largely manual, although we thought that if we built a detailed-enough methodology, it should be possible to automate large pieces of that methodology in CASE tools.

Automating Data Refactoring


To make the story short, a group of systems developers in South America finally accomplished the automation of our data refactoring approach in the late 1980s. A company led by Breogán Gonda and Nicolás Jodal created a tool called GeneXus¹ that accomplished what we had conceived in the 1970s.

They created an approach in which you could enter data structures for input screens; with those data structures, GeneXus automatically designed a normalized database and generated the code to navigate, update, and report against that database.

But that was the easy part. They designed their tool in such a way that when requirements changed or users came up with something new or different, they could restate their requirements, rerun (recompile), and GeneXus would redesign the database, convert the previous database automatically to the new design, and then regenerate just those programs that were affected by the changes in the database design. They created a closed-loop refactoring cycle based on data requirements.

GeneXus showed us what was really possible using a refactoring framework. For the first time in my experience, developers were freed from having to worry about future requirements. It allowed them to define just what they knew and then rapidly build a system that did just what they had defined. Then, when (not if) the requirements changed, they could simply reenter those changes, recompile the system, and they had a new, completely integrated, minimal system that incorporated the new requirements.

In the 1980s, CASE was a technology that was somehow going to revolutionize programming. In the 1990s, objects and OO development were going to do the same. Neither of these technologies lived up to their early expectations. But today, tools like GeneXus really do many of the things that the system gurus of the 1980s anticipated. It is possible, currently, to take a set of requirements, automatically design a database from those requirements, generate an operational database from among the number of commercially available relational databases (Oracle, DB2, Informix, MS SQL Server, and Access), and generate code (prototype and production) that will navigate, update, and report against those databases in a variety of different languages (COBOL, RPG, C, C++, and Java). Moreover, it will do this at very high speed.

What Does All This Mean?

Refactoring is becoming something of a buzzword. And like all buzzwords, there is some good news and some bad news. The good news is that, when implemented correctly, refactoring makes it possible for us to build very robust systems very rapidly. The bad news is that we have to rethink how we go about developing systems. Many of our most cherished project management and development strategies need to be rethought. We have to become very conscious of interactive, incremental design. We have to be much more willing to prototype our way to success and to use tools that will do complex parts of the systems development process (database design and code generation) for us.

This new approach to systems development allows us to spend much more time with users, exploring their requirements and giving them user interface choices that were never possible when we were building things at arm’s length. But not everybody appreciates this new world. For one thing, it takes a great deal of the mystery out of the process. For another, it puts much more stress on rapid development.

When people tell you that building simple, minimal systems is out of date in this Internet age, tell them that the Internet is all about speed and service. Tell them that refactoring is not just the best way to build the kind of systems that we need for the 21st century, it is the only way.

¹ Gonda and Jodal created a company called ARTech to market the GeneXus product. It currently has more than 3,000 customers worldwide and is marketed in the US by GeneXus, Inc.

CRYSTAL LIGHT METHODS: COMMENTS BY ALISTAIR COCKBURN

Editor’s note: In the early 1990s, Alistair Cockburn was hired by the IBM Consulting Group to construct and document a methodology for OO development. IBM had no preferences as to what the answer might look like, just that it work. Cockburn’s approach to the assignment was to interview as many project team members as possible, writing down whatever the teams said was important to their success (or failure). The results were surprising. The remainder of this section was written by Cockburn and is based on his “in-process” book on minimal methodologies.

In the IBM study, team after successful team “apologized” for not following a formal process, for not using a high-tech CASE tool, for “merely” sitting close to each other and discussing as they went. Meanwhile, a number of failing teams puzzled over why they failed despite using a formal process — maybe they hadn’t followed it well enough? I finally started encountering teams who asserted that they succeeded exactly because they did not get caught up in fancy processes and deliverables, but instead sat close together so they could talk easily and delivered tested software frequently.


These results have been consistent, from 1991 to 1999, from Hong Kong to the Americas, Norway, and South Africa, in COBOL, Smalltalk, Java, Visual Basic, Sapiens, and Synon. The shortest statement of the results is:

To the extent you can replace written documentation with face-to-face interactions, you can reduce reliance on written work products and improve the likelihood of delivering the system.

The more frequently you can deliver running, tested slices of the system, the more you can reduce reliance on written “promissory” notes and improve the likelihood of delivering the system.

People are communicating beings. Even introverted programmers do better with informal, face-to-face communication than with paper documents. From a cost and time perspective, writing takes longer and is less communicative than discussing at the whiteboard.

Written, reviewed requirements and design documents are “promises” for what will be built, serving as timed progress markers. There are times when creating them is good. However, a more accurate timed progress marker is running tested code. It is more accurate because it is not a timed promise, it is a timed accomplishment.

Recently, a bank’s IT group decided to take the above results at face value. They began a small project by simply putting three people into the same room and more or less leaving them alone. Surprisingly (to them), the team delivered the system in a fine, timely manner. The bank management team was a bit bemused. Surely it can’t be this simple?

It isn’t quite so simple. Another result of all those project interviews was that different projects have different needs. Terribly obvious, except (somehow) to methodologists. Sure, if your project only needs 3 to 6 people, just put them into a room together. But if you have 45 or 100 people, that won’t work. If you have to pass Food & Drug Administration process scrutiny, you can’t get away with this. If you are going to shoot me to Mars in a rocket, I’ll ask you not to try it. We must remember factors such as team size and demands on the project, such as:

• As the number of people involved grows, so does the need to coordinate communications.

• As the potential for damage increases, the need for public scrutiny increases, and the tolerance for personal stylistic variations decreases.

• Some projects depend on time-to-market and can tolerate defects (Web browsers being an example); other projects aim for traceability or legal liability protection.

The result of collecting those factors is shown in Figure 6. The figure shows three factors that influence the selection of methodology: communications load (as given by staff size), system criticality, and project priorities. Locate the segment of the X axis for the staff size (typically just the development team). For a distributed development project, move right one box to account for the loss of face-to-face communications.

On the Y axis, identify the damage effect of the system: loss of comfort, loss of “discretionary” monies, loss of “essential” monies (e.g., going bankrupt), or loss of life. The different planes behind the top layer reflect the different possible project priorities, whether it is time to market at all costs (such as in the first layer), productivity and tolerance (the hidden second layer), or legal liability (the hidden third layer). The box in the grid indicates the class of projects (for example, C6) with similar communications load and safety needs and can be used to select a methodology.

Figure 6 — The family of Crystal methods (a grid of project classes: the Y axis is system criticality — defects cause loss of comfort (C), discretionary money (D), essential money (E), or life (L); the X axis is the number of people involved, ±20% — 1-6, 7-20, 21-40, 41-100, 101-200, 201-500, 501-1,000 — yielding cells C6 through L1000; the planes behind the front layer are prioritized for productivity and tolerance and for legal liability).
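As a reading aid for Figure 6, the sketch below turns the grid lookup described above into a small function. The cell names (C6 through L1000) follow the figure; the helper itself, its band boundaries, and the distributed-team adjustment are an illustration of the selection rule in the text, not part of Cockburn’s method.

    # Illustrative helper for reading the Figure 6 grid; an assumption-laden
    # sketch, not Cockburn's tooling.

    STAFF_BANDS = [(6, "6"), (20, "20"), (40, "40"), (100, "100"),
                   (200, "200"), (500, "500"), (1000, "1000")]

    CRITICALITY = {"comfort": "C", "discretionary money": "D",
                   "essential money": "E", "life": "L"}

    def crystal_cell(staff_size, criticality, distributed=False):
        """Locate the project's cell: X axis = people involved, Y axis = damage."""
        labels = [label for _, label in STAFF_BANDS]
        index = next(i for i, (limit, _) in enumerate(STAFF_BANDS)
                     if staff_size <= limit)
        if distributed and index < len(labels) - 1:
            index += 1   # move right one box for the loss of face-to-face contact
        return CRITICALITY[criticality] + labels[index]

    # A co-located team of 5 whose defects would cause only loss of comfort:
    assert crystal_cell(5, "comfort") == "C6"
    # The same team, distributed, building software that risks essential money:
    assert crystal_cell(5, "essential money", distributed=True) == "E20"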


The grid characterizes projects fairly objectively, useful for choosing a methodology. I have used it myself to change methodologies on a project as it shifted in size and complexity. There are, of course, many other factors, but these three determine methodology selection quite well.

Suppose it is time to choose a methodology for the project. To benefit from the project interviews mentioned earlier, create the lightest methodology you can imagine working for the cell in the grid, one in which person-to-person communication is enhanced as much as possible, and running tested code is the basic timing marker. The result is a light, habitable (meaning rather pleasant, as opposed to oppressive), effective methodology. Assign this methodology to C6 on the grid.

Repeating this for all the boxes produces a family of lightweight methods, related by their reliance on people, communication, and frequent delivery of running code. I call this family the Crystal Light family of methodologies. The family is segmented into vertical stripes by color (not shown in figure): the methodology for 2-6 person projects is Crystal Clear, for 6-20 person projects is Crystal Yellow, for 20-40 person projects is Crystal Orange, then Red, Magenta, Blue, etc.

Shifts in the vertical axis can be thought of as “hardening” of the methodology. A life-critical 2-6-person project would use “hardened” Crystal Clear, and so on. What surprises me is that the project interviews are showing rather little difference in the hardness requirement, up to life-critical projects. Crystal Clear is documented in a forthcoming book, currently in draft form on the Web. Crystal Orange is outlined in the methodology chapter of Surviving Object-Oriented Projects (see Editor’s note below).

Having worked with the Crystal Light methods for several years now, I found a few more surprises.

The first surprise is just how little process and control a team actually needs to thrive (this is thrive, not merely survive). It seems that most people are interested in being good citizens and in producing a quality product, and they use their native cognitive and communications abilities to accomplish this. This matches Jim’s conclusions about adaptive software development (see Resources and References). You need one notch less control than you expect, and less is better when it comes to delivering quickly.

More specifically, when Jim and I traded notes on project management, we found we had both observed a critical success element of project management: that team members understand and communicate their work dependencies. They can do this in lots of simple, low-tech, low-overhead ways. It is often not necessary to introduce tool-intensive work products to manage it.

Oh, but it is necessary to introduce two more things into the project: trust and communication. A project that is short on trust is in trouble in more substantial ways than just the weight of the methodology. To the extent that you can enhance trust and communication, you can reap the benefits of Crystal Clear, XP, and the other lightweight methods.

The second surprise with defining the Crystal Light methods was XP. I had designed Crystal Clear to be the least bureaucratic methodology I could imagine. Then XP showed up in the same place on the grid and made Clear look heavy! What was going on?

It turns out that Beck had found another knob to twist on the methodology control panel: discipline. To the extent that a team can increase its internal discipline and consistency of action, it can lighten its methodology even more. The Crystal Light family is predicated on allowing developers the maximum individual preference. XP is predicated on having everyone follow tight, disciplined practices:

• Everyone follows a tight coding standard.

• The team forms a consensus on what is “better” code, so that changes converge and don’t just bounce around.

• Unit tests exist for all functions, and they always pass at 100%.

• All production code is written by two people working together.

• Tested function is delivered frequently, in the two- to four-week range.

In other words, Crystal Clear illustrates and XP magnifies the core principle of light methods:

Intermediate work products can be reduced and project delivery enhanced, to the extent that team communications are improved and frequency of delivery increased.

XP and Crystal Clear are related to each other in these ways:

• XP pursues greater productivity through increased discipline, but it is harder for a team to follow.


- Crystal Clear permits greater individuality within the team and more relaxed work habits in exchange for some loss in productivity.

- Crystal Clear may be easier for a team to adopt, but XP produces better results if the team can follow it.

- A team can start with Crystal Clear and move itself to XP. A team that falls off XP can back up to Crystal Clear.

Although there are differences in Crystal Clear and XP, the fundamental values are consistent — simplicity, communications, and minimal formality.

Editor’s note: For more information on the Crystal Clear methodology, see Alistair Cockburn’s Web site, listed in the References and Resources section. For more information on Crystal Orange, it is covered in the book Surviving Object-Oriented Projects, also listed in the References and Resources section.

CONCLUSIONS: GOING TO EXTREMES

Orr and Cockburn each describe their approaches and experience with lighter methodologies. But earlier, in describing Chrysler’s C3 project, I alluded to the difficulty in extending the use of approaches like XP or even RAD. In every survey we have done of eAD subscribers, and every survey conducted of software organizations in general, respondents rate reducing delivery time as a critical initiative. But it is not just initial delivery that is critical. Although Amazon.com may have garnered an advantage by its early entry in the online bookstore market, it has maintained leadership by continuous adaptation to market conditions — which means continuous changes to software.

Deliver quickly. Change quickly. Change often. These three driving forces, in addition to better software tools, compel us to rethink traditional software engineering practices — not abandon the practices, but rethink them. XP, for example, doesn’t ask us to abandon good software engineering practices. It does, however, ask us to consider closely the absolute minimum set of practices that enable a small, co-located team to function effectively in today’s software delivery environment.

Cockburn made the observation that implementation of XP (at least as Beck and Jeffries define it) requires three key environmental features: inexpensive interface changes, close communications, and automated regression testing. Rather than asking “How do I reduce the cost of change?” XP, in effect, postulates a low-change-cost environment and then says, “This is how we will work.” For example, rather than experience the delays of a traditional relational database environment (and dealing with multiple outside groups), the C3 project used GemStone, an OO database.

Some might argue that this approach is cheating, but that is the point. For example, Southwest Airlines created a powerhouse by reducing costs — using a single type of aircraft (Boeing 737s). If turbulence and change are the norm, then perhaps the right question may be: how do we create an environment in which the cost (and time) of change is minimized? Southwest got to expand without an inventory of “legacy” airplanes, so its answer might be different than American Airlines’ answer, but the question remains an important one.

There are five key ideas to take away from this discussion of XP and light methods:

- For projects that must be managed in high-speed, high-change environments, we need to reexamine software development practices and the assumptions behind them.

- Practices such as refactoring, simplicity, and collaboration (pair programming, metaphor, collective ownership) prompt us to think in new ways.

- We need to rethink both how to reduce the cost of change in our existing environments and how to create new environments that minimize the cost of change.

- In times of high change, the ability to refactor code, data, and whole applications becomes a critical skill.

- Matching methods to the project, relying on people first and documentation later, and minimizing formality are methods geared to change and speed.

EDITOR’S MUSINGS

Extreme rules! In the middle of writing this issue, I received the 20 December issue of BusinessWeek magazine, which contains the cover story, “Xtreme Retailing,” about “brick” stores fighting back against their “click” cousins. If we can have extreme retailing, why not Extreme Programming?

Refactoring, design patterns, comprehensive unit testing, pair programming — these are not the tools of hackers. These are the tools of developers who are exploring new ways to meet the difficult goals of rapid product delivery, low defect levels, and flexibility. Writing about quality, Beck says, “The only possible values are ‘excellent’ and ‘insanely excellent,’ depending on whether lives are at stake or not” and “runs the tests until they pass (100% correct).” You might accuse XP practitioners of being delusional, but not of being poor-quality-oriented hackers.
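The combination named above, refactoring backed by automated regression tests, is easier to picture with a small illustration. The sketch below is not from the article and is not C3 code; the Invoice and LineItem classes, the discount rule, and the hand-rolled check() helper are hypothetical, and in an XP shop the checks would live in a unit-testing framework such as JUnit. It simply shows an extract-method refactoring made safe because regression checks pin the existing behavior before the code is reshaped.

```java
// Hypothetical illustration only: refactoring under the protection of
// automated regression checks. Names and rules are invented for this sketch.

import java.util.ArrayList;
import java.util.List;

public class RefactoringUnderTest {

    static class LineItem {
        final int quantity;
        final double unitPrice;

        LineItem(int quantity, double unitPrice) {
            this.quantity = quantity;
            this.unitPrice = unitPrice;
        }
    }

    static class Invoice {
        private final List<LineItem> items = new ArrayList<>();

        void add(LineItem item) {
            items.add(item);
        }

        // total() once mixed summing and discounting in a single dense
        // expression. Extracting subtotal() and discount() keeps the
        // observable behavior identical while making the intent clearer.
        double total() {
            return subtotal() - discount();
        }

        private double subtotal() {
            double sum = 0.0;
            for (LineItem item : items) {
                sum += item.quantity * item.unitPrice;
            }
            return sum;
        }

        private double discount() {
            // Illustrative rule: 5% off orders of 100.00 or more.
            return subtotal() >= 100.0 ? subtotal() * 0.05 : 0.0;
        }
    }

    // A deliberately primitive regression suite. Because these checks pin
    // the behavior of total(), the extract-method refactoring above can be
    // made, and remade, with confidence.
    public static void main(String[] args) {
        Invoice small = new Invoice();
        small.add(new LineItem(2, 10.0));
        check("no discount below the threshold", small.total() == 20.0);

        Invoice large = new Invoice();
        large.add(new LineItem(10, 15.0)); // subtotal 150.00
        check("5% discount at or above 100.00", large.total() == 142.5);

        System.out.println("All regression checks passed.");
    }

    private static void check(String description, boolean condition) {
        if (!condition) {
            throw new AssertionError("Regression failed: " + description);
        }
    }
}
```

The order matters: the checks exist before total() is reworked, so “run the tests until they pass (100% correct)” becomes a practical safety net for change rather than an afterthought.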

To traditional methodology proponents, reducing time-to-market is considered the enemy of quality. However, I’ve seen some very slow development efforts produce some very poor-quality software, just as I’ve seen speedy efforts produce poor-quality software. Although there is obviously some relationship between time and quality, I think it is a much more complicated relationship than we would like to think.

Traditional methodologies were developed to build software in environments characterized by low to moderate levels of change and reasonably predictable desired outcomes. However, the business world is no longer very predictable, and software requirements change at rates that swamp traditional methods. “The bureaucracy and inflexibility of organizations like the Software Engineering Institute and practices such as CMM are making them less and less relevant to today’s software development issues,” remarks Bob Charette, who originated the practices of lean development for software.

As Beck points out in the introduction to his book, the individual practices of XP are drawn from well-known, well-tested, traditional practices. The principles driving the use of these practices, along with the integrative nature of using a specific minimal set of practices, make XP a novel solution to modern software development problems.

But I must end with a cautionary note. None of these new practices has much history. Their successes are anecdotal, rather than studied and measured. Nevertheless, I firmly believe that our turbulent e-business economy requires us to revisit how we develop and manage software delivery. While new, these approaches offer alternatives well worth considering.

In the coming year, we will no doubt see more in print on XP. Beck, Jeffries, Fowler, and Cunningham are working in various combinations with others to publish additional books on XP, so additional information on practices, management philosophy, and project examples will be available.


Resources and References

Books and Articles

Beck, Kent. Extreme Programming Explained: Embrace Change. Addison-Wesley, 1999.

Cockburn, Alistair. Surviving Object-Oriented Projects. Addison-Wesley, 1998.

Cusumano, Michael A. and Richard Selby. Microsoft Secrets. Free Press, 1995.

Fowler, Martin. Refactoring: Improving the Design of Existing Code. Addison-Wesley, 1999.

Gamma, Erich, Richard Helm, Ralph Johnson, and John Vlissides. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley, 1995.

Gilb, Tom. Principles of Software Engineering Management. Addison-Wesley, 1988.

Higgins, David. Data Structured Software Maintenance. Dorset House Publishing, 1986.

Yourdon, Edward and Larry L. Constantine. Structured Design: Fundamentals of a Discipline of Computer Program and Systems Design. Prentice Hall, 1986.

General Resources

Adaptive software development. See the article in the August 1998 issue of Application Development Strategies (now eAD), “Managing Complexity.”

ARTech. Montevideo, Uruguay. Web site: www.artech.com.uy. Developers of GeneXus.

Crystal Clear method. Web site: http://members.aol.com/humansandt/crystal/clear.

Alistair Cockburn. Web site: http://members.aol.com/acockburn.

Bob Charette. Lean Development. ITABHI Corporation, 11609 Stonewall Jackson Drive, Spotsylvania, VA 22553, USA. E-mail: [email protected].

Ward Cunningham’s Extreme Programming Roadmap. Web site: http://c2.com/cgi/wiki?ExtremeProgrammingRoadmap.

eXtreme Programming and Flexible Processes in Software Engineering — XP2000 Conference. 21-23 June 2000, Cagliari, Sardinia, Italy. Web site: http://numa.sern.enel.ucalgary.ca/extreme.

Martin Fowler. Web site: http://ourworld.compuserve.com/homepages/martin_fowler/.

Ron Jeffries. E-mail: [email protected]. Web site: www.XProgramming.com.

Lean Development. See the November 1998 ADS article “Time is of the Essence.”

Object Mentor, Green Oaks, IL, USA. Web site: www.objectmentor.com/.

Ken Orr, Ken Orr Institute, Topeka, KS, USA. Web site: www.kenorrinst.com.

Laurie Williams. Web site: www.cs.utah.edu/~lwilliam.


Are You Ready to Take Agile Beyond the Project Level?

Cutter Consortium’s Agile Project Management Resource Center provides you with information and guidance — from experienced, hands-on Senior Consultants — that will help you create a culture that embraces agility. Led by Practice Director Jim Highsmith, Cutter’s team of experts focuses on agile principles and traits — delivering customer value, embracing change, reflection, adaptation, etc. — to help you shorten your product development schedules and increase the quality of your resultant products.

Cutting edge ideas on collaboration, governance, and measurement/metrics are united with agile practices, such as iterative development, test-first design, project chartering, team collocation, onsite customers, sustainable work schedules, and others, to help your organization innovate and ultimately deliver high return-on-investment.

Subscribe now and your team will immediately benefit from access to Cutter’s comprehensive online resource library, including webinars, podcasts, white papers, reports, and articles authored by Cutter’s Agile Project Management Senior Consultants — worldwide agile thought leaders:

- Monthly Executive Reports and accompanying Executive Summaries, also delivered in hard copy, on topics such as organizational structure for agile projects, collaboration and leadership, adaptive performance management systems, refactoring databases, the compatibility of agile methods and enterprise architecture, responsible change, and more

- Executive Updates, containing Cutter benchmark information and case studies on agile implementation

- Weekly E-Mail Advisors, written by Jim Highsmith and other APM Senior Consultants

- Complimentary admittance to all Cutter Consortium webinars for the duration of your service, such as Ed Yourdon on Planning for, and Coping with, IT Litigation; Jim Watson on Breaking the Barriers between Enterprise Architecture and Agile; and Ken Collier on Agile Business Intelligence

Bring the best project management, collaboration, agile methods, XP, and software development practices to bear on your state-of-the-art projects. Start your subscription to Cutter Consortium’s Agile Project Management Resource Center today. For more information, contact us at +1 781 648 8700 or send e-mail to Jack Wainwright at [email protected].

Cutter Consortium, 37 Broadway, Suite 1, Arlington, MA 02474, USA; Phone: +1 781 648 8700; Fax: +1 781 648 1950; E-mail: [email protected]; Web: www.cutter.com